Sample records for geostatistics random sets

  1. Random vectors and spatial analysis by geostatistics for geotechnical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, D.S.

    1987-08-01

    Geostatistics is extended to the spatial analysis of vector variables by defining the estimation variance and vector variogram in terms of the magnitude of difference vectors. Many random variables in geotechnology are vectorial rather than scalar, and structural analysis requires interpolation of those sample variables to construct and characterize structural models. A better local estimator results in higher-quality input models, and geostatistics can provide such estimators: kriging estimators. The efficiency of geostatistics for vector variables is demonstrated in a case study of rock joint orientations in geological formations. The positive cross-validation encourages application of geostatistics to spatial analysis of random vectors in geoscience as well as various geotechnical fields, including optimum site characterization, rock mechanics for mining and civil structures, cavability analysis of block cavings, petroleum engineering, and hydrologic and hydraulic modeling.

  2. Applications of Geostatistics in Plant Nematology

    PubMed Central

    Wallace, M. K.; Hawkins, D. M.

    1994-01-01

    Geostatistics was applied to plant nematology by evaluating soil and nematode data acquired from 200 soil samples collected from the Ap horizon of a reed canary-grass field in northern Minnesota. Geostatistical concepts relevant to nematology include semi-variogram modelling, kriging, and change of support calculations. Soil and nematode data generally followed a spherical semi-variogram model, with little random variability associated with soil data and large inherent variability for nematode data. Block kriging of soil and nematode data provided useful contour maps of the data. Change of support calculations indicated that most of the random variation in nematode data was due to short-range spatial variability in the nematode population densities. PMID:19279938
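The semi-variogram modelling mentioned in this record is standard enough to sketch. The following is a minimal NumPy illustration (not the authors' code; the synthetic coordinates, values, lags, and model parameters are invented for demonstration) of the classical Matheron semivariogram estimator and the spherical model referred to above:

```python
import numpy as np

def empirical_semivariogram(coords, values, lags, tol):
    """Matheron estimator: half the mean squared difference over
    point pairs separated by approximately each lag distance h."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    diffs2 = (values[:, None] - values[None, :]) ** 2
    gamma = []
    for h in lags:
        mask = np.triu(np.abs(d - h) <= tol, k=1)  # each pair counted once
        gamma.append(diffs2[mask].mean() / 2.0 if mask.any() else np.nan)
    return np.array(gamma)

def spherical(h, nugget, sill, rng):
    """Spherical semivariogram model, as fitted in the study above."""
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3)
    return np.where(h >= rng, sill, g)

# invented synthetic field sampled at 200 locations
rngen = np.random.default_rng(0)
coords = rngen.uniform(0, 100, size=(200, 2))
values = np.sin(coords[:, 0] / 20) + 0.1 * rngen.standard_normal(200)
lags = np.arange(5, 50, 5)
gamma = empirical_semivariogram(coords, values, lags, tol=2.5)
```

A fitted curve would then be obtained by least-squares adjustment of `nugget`, `sill`, and `rng` against `gamma`.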

  3. Applications of geostatistics in plant nematology.

    PubMed

    Wallace, M K; Hawkins, D M

    1994-12-01

    Geostatistics was applied to plant nematology by evaluating soil and nematode data acquired from 200 soil samples collected from the A(p) horizon of a reed canary-grass field in northern Minnesota. Geostatistical concepts relevant to nematology include semi-variogram modelling, kriging, and change of support calculations. Soil and nematode data generally followed a spherical semi-variogram model, with little random variability associated with soil data and large inherent variability for nematode data. Block kriging of soil and nematode data provided useful contour maps of the data. Change of support calculations indicated that most of the random variation in nematode data was due to short-range spatial variability in the nematode population densities.

  4. Application of geostatistics to risk assessment.

    PubMed

    Thayer, William C; Griffith, Daniel A; Goodrum, Philip E; Diamond, Gary L; Hassett, James M

    2003-10-01

    Geostatistics offers two fundamental contributions to environmental contaminant exposure assessment: (1) a group of methods to quantitatively describe the spatial distribution of a pollutant and (2) the ability to improve estimates of the exposure point concentration by exploiting the geospatial information present in the data. The second contribution is particularly valuable when exposure estimates must be derived from small data sets, which is often the case in environmental risk assessment. This article addresses two topics related to the use of geostatistics in human and ecological risk assessments performed at hazardous waste sites: (1) the importance of assessing model assumptions when using geostatistics and (2) the use of geostatistics to improve estimates of the exposure point concentration (EPC) in the limited data scenario. The latter topic is approached here by comparing design-based estimators that are familiar to environmental risk assessors (e.g., Land's method) with geostatistics, a model-based estimator. In this report, we summarize the basics of spatial weighting of sample data, kriging, and geostatistical simulation. We then explore the two topics identified above in a case study, using soil lead concentration data from a Superfund site (a skeet and trap range). We also describe several areas where research is needed to advance the use of geostatistics in environmental risk assessment.

  5. Imprecise (fuzzy) information in geostatistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bardossy, A.; Bogardi, I.; Kelly, W.E.

    1988-05-01

    A methodology based on fuzzy set theory for the utilization of imprecise data in geostatistics is presented. A common problem preventing a broader use of geostatistics has been the insufficient amount of accurate measurement data. In certain cases, additional but uncertain (soft) information is available and can be encoded as subjective probabilities, and then the soft kriging method can be applied (Journel, 1986). In other cases, a fuzzy encoding of soft information may be more realistic and simplify the numerical calculations. Imprecise (fuzzy) spatial information on the possible variogram is integrated into a single variogram which is used in a fuzzy kriging procedure. The overall uncertainty of prediction is represented by the estimation variance and the calculated membership function for each kriged point. The methodology is applied to the permeability prediction of a soil liner for hazardous waste containment. The available number of hard measurement data (20) was not enough for a classical geostatistical analysis. An additional 20 soft data made it possible to prepare kriged contour maps using the fuzzy geostatistical procedure.

  6. Assessment of geostatistical features for object-based image classification of contrasted landscape vegetation cover

    NASA Astrophysics Data System (ADS)

    de Oliveira Silveira, Eduarda Martiniano; de Menezes, Michele Duarte; Acerbi Júnior, Fausto Weimar; Castro Nunes Santos Terra, Marcela; de Mello, José Márcio

    2017-07-01

    Accurate mapping and monitoring of savanna and semiarid woodland biomes are needed to support the selection of areas of conservation, to provide sustainable land use, and to improve the understanding of vegetation. The potential of geostatistical features, derived from medium spatial resolution satellite imagery, to characterize contrasted landscape vegetation cover and improve object-based image classification is studied. The study site in Brazil includes cerrado sensu stricto, deciduous forest, and palm swamp vegetation cover. Sentinel 2 and Landsat 8 images were acquired and divided into objects, for each of which a semivariogram was calculated using near-infrared (NIR) and normalized difference vegetation index (NDVI) to extract the set of geostatistical features. The features selected by principal component analysis were used as input data to train a random forest algorithm. Tests were conducted, combining spectral and geostatistical features. Change detection evaluation was performed using a confusion matrix and its accuracies. The semivariogram curves were efficient to characterize spatial heterogeneity, with similar results using NIR and NDVI from Sentinel 2 and Landsat 8. Accuracy was significantly greater when combining geostatistical features with spectral data, suggesting that this method can improve image classification results.
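The per-object semivariogram features described in this record can be sketched in simplified form. This NumPy example is an illustration under stated assumptions, not the authors' pipeline: it uses only horizontal pixel lags within one image patch, and the "sill" and "slope" summaries are ad hoc stand-ins for the features selected in the study:

```python
import numpy as np

def patch_semivariogram_features(patch, max_lag=10):
    """Geostatistical features for one image object: semivariance at
    horizontal pixel lags 1..max_lag, plus two scalar summaries."""
    gamma = []
    for h in range(1, max_lag + 1):
        diffs = patch[:, h:] - patch[:, :-h]   # pixel pairs at lag h
        gamma.append(0.5 * np.mean(diffs ** 2))
    gamma = np.array(gamma)
    return {
        "gamma": gamma,
        "sill_proxy": gamma[-3:].mean(),    # plateau level of the curve
        "slope_proxy": gamma[1] - gamma[0], # steepness near the origin
    }
```

Features like these, computed per object from NIR or NDVI values, would then feed a classifier alongside spectral statistics.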

  7. GEOSTATISTICAL SAMPLING DESIGNS FOR HAZARDOUS WASTE SITES

    EPA Science Inventory

    This chapter discusses field sampling design for environmental sites and hazardous waste sites with respect to random variable sampling theory, Gy's sampling theory, and geostatistical (kriging) sampling theory. The literature often presents these sampling methods as an adversari...

  8. Preferential sampling and Bayesian geostatistics: Statistical modeling and examples.

    PubMed

    Cecconi, Lorenzo; Grisotto, Laura; Catelan, Dolores; Lagazio, Corrado; Berrocal, Veronica; Biggeri, Annibale

    2016-08-01

    Preferential sampling refers to any situation in which the spatial process and the sampling locations are not stochastically independent. In this paper, we present two examples of geostatistical analysis in which the usual assumption of stochastic independence between the point process and the measurement process is violated. To account for preferential sampling, we specify a flexible and general Bayesian geostatistical model that includes a shared spatial random component. We apply the proposed model to two different case studies that allow us to highlight three different modeling and inferential aspects of geostatistical modeling under preferential sampling: (1) continuous or finite spatial sampling frame; (2) underlying causal model and relevant covariates; and (3) inferential goals related to mean prediction surface or prediction uncertainty. © The Author(s) 2016.

  9. Introduction to Geostatistics

    NASA Astrophysics Data System (ADS)

    Kitanidis, P. K.

    1997-05-01

    Introduction to Geostatistics presents practical techniques for engineers and earth scientists who routinely encounter interpolation and estimation problems when analyzing data from field observations. Requiring no background in statistics, and with a unique approach that synthesizes classic and geostatistical methods, this book offers linear estimation methods for practitioners and advanced students. Well illustrated with exercises and worked examples, Introduction to Geostatistics is designed for graduate-level courses in earth sciences and environmental engineering.

  10. Geostatistics and petroleum geology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hohn, M.E.

    1988-01-01

    This book examines the purpose and use of geostatistics in exploration and development of oil and gas, with an emphasis on appropriate and pertinent case studies. It presents an overview of geostatistics. Topics covered include: The semivariogram; Linear estimation; Multivariate geostatistics; Nonlinear estimation; From indicator variables to nonparametric estimation; and More detail, less certainty: conditional simulation.

  11. Fast Geostatistical Inversion using Randomized Matrix Decompositions and Sketchings for Heterogeneous Aquifer Characterization

    NASA Astrophysics Data System (ADS)

    O'Malley, D.; Le, E. B.; Vesselinov, V. V.

    2015-12-01

    We present a fast, scalable, and highly implementable stochastic inverse method for characterization of aquifer heterogeneity. The method utilizes recent advances in randomized matrix algebra and exploits the structure of the Quasi-Linear Geostatistical Approach (QLGA), without requiring a structured grid like Fast-Fourier Transform (FFT) methods. The QLGA framework is a more stable version of Gauss-Newton iterates for a large number of unknown model parameters, and provides unbiased estimates. The methods are matrix-free and do not require derivatives or adjoints, and are thus ideal for complex models and black-box implementation. We also incorporate randomized least-square solvers and data-reduction methods, which speed up computation and simulate missing data points. The new inverse methodology is coded in Julia and implemented in the MADS computational framework (http://mads.lanl.gov). Julia is an advanced high-level scientific programming language that allows for efficient memory management and utilization of high-performance computational resources. Inversion results based on a series of synthetic problems with steady-state and transient calibration data are presented.
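Randomized matrix decompositions of the kind this abstract invokes can be illustrated with a standard randomized SVD (random range finder followed by a small exact SVD). This NumPy sketch is a generic textbook construction, not the authors' Julia/MADS implementation:

```python
import numpy as np

def randomized_svd(A, k, oversample=10, seed=0):
    """Rank-k SVD via a randomized range finder (Halko-style sketching)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    Omega = rng.standard_normal((n, k + oversample))  # random test matrix
    Y = A @ Omega                        # sample the range of A
    Q, _ = np.linalg.qr(Y)               # orthonormal basis for the sketch
    B = Q.T @ A                          # small (k+p) x n problem
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :k], s[:k], Vt[:k]
```

For matrices that are numerically low-rank, as geostatistical covariance operators often are, this costs far less than a dense SVD.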

  12. Geostatistical applications in environmental remediation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, R.N.; Purucker, S.T.; Lyon, B.F.

    1995-02-01

    Geostatistical analysis refers to a collection of statistical methods for addressing data that vary in space. By incorporating spatial information into the analysis, geostatistics has advantages over traditional statistical analysis for problems with a spatial context. Geostatistics has a history of success in earth science applications, and its popularity is increasing in other areas, including environmental remediation. Due to recent advances in computer technology, geostatistical algorithms can be executed at a speed comparable to many standard statistical software packages. When used responsibly, geostatistics is a systematic and defensible tool that can be used in various decision frameworks, such as the Data Quality Objectives (DQO) process. At every point in the site, geostatistics can estimate both the concentration level and the probability or risk of exceeding a given value. These probability maps can assist in identifying clean-up zones. Given any decision threshold and an acceptable level of risk, the probability maps identify those areas that are estimated to be above or below the acceptable risk. Those areas that are above the threshold are of the most concern with regard to remediation. In addition to estimating clean-up zones, geostatistics can assist in designing cost-effective secondary sampling schemes. Those areas of the probability map with high levels of estimated uncertainty are areas where more secondary sampling should occur. In addition, geostatistics has the ability to incorporate soft data directly into the analysis. These data include historical records, a highly correlated secondary contaminant, or expert judgment. In environmental remediation, geostatistics is a tool that, in conjunction with other methods, can provide a common forum for building consensus.
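The probability maps described above are straightforward to compute once an ensemble of geostatistical simulations is available. A minimal NumPy sketch, in which the ensemble, grid size, threshold, and 5% risk level are all invented for illustration:

```python
import numpy as np

def exceedance_probability(simulations, threshold):
    """Probability map: fraction of simulated realizations exceeding a
    clean-up threshold at each grid cell."""
    return (simulations > threshold).mean(axis=0)

# hypothetical ensemble: 500 conditional realizations on a 20x20 grid
rng = np.random.default_rng(1)
sims = rng.lognormal(mean=3.0, sigma=0.5, size=(500, 20, 20))
pmap = exceedance_probability(sims, threshold=30.0)
remediate = pmap > 0.05   # cells above an acceptable 5% risk level
```

Cells where `remediate` is true would be flagged for clean-up; cells with `pmap` near 0.5 (highest uncertainty) are natural targets for secondary sampling.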

  13. Delineating Facies Spatial Distribution by Integrating Ensemble Data Assimilation and Indicator Geostatistics with Level Set Transformation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammond, Glenn Edward; Song, Xuehang; Ye, Ming

    A new approach is developed to delineate the spatial distribution of discrete facies (geological units that have unique distributions of hydraulic, physical, and/or chemical properties) conditioned not only on direct data (measurements directly related to facies properties, e.g., grain size distribution obtained from borehole samples) but also on indirect data (observations indirectly related to facies distribution, e.g., hydraulic head and tracer concentration). Our method integrates for the first time ensemble data assimilation with traditional transition probability-based geostatistics. The concept of level set is introduced to build shape parameterization that allows transformation between discrete facies indicators and continuous random variables. The spatial structure of different facies is simulated by indicator models using conditioning points selected adaptively during the iterative process of data assimilation. To evaluate the new method, a two-dimensional semi-synthetic example is designed to estimate the spatial distribution and permeability of two distinct facies from transient head data induced by pumping tests. The example demonstrates that our new method adequately captures the spatial pattern of facies distribution by imposing spatial continuity through conditioning points. The new method also reproduces the overall response in hydraulic head field with better accuracy compared to data assimilation with no constraints on spatial continuity on facies.

  14. Geostatistics and spatial analysis in biological anthropology.

    PubMed

    Relethford, John H

    2008-05-01

    A variety of methods have been used to make evolutionary inferences based on the spatial distribution of biological data, including reconstructing population history and detection of the geographic pattern of natural selection. This article provides an examination of geostatistical analysis, a method used widely in geology but which has not often been applied in biological anthropology. Geostatistical analysis begins with the examination of a variogram, a plot showing the relationship between a biological distance measure and the geographic distance between data points and which provides information on the extent and pattern of spatial correlation. The results of variogram analysis are used for interpolating values of unknown data points in order to construct a contour map, a process known as kriging. The methods of geostatistical analysis and discussion of potential problems are applied to a large data set of anthropometric measures for 197 populations in Ireland. The geostatistical analysis reveals two major sources of spatial variation. One pattern, seen for overall body and craniofacial size, shows an east-west cline most likely reflecting the combined effects of past population dispersal and settlement. The second pattern is seen for craniofacial height and shows an isolation by distance pattern reflecting rapid spatial changes in the midlands region of Ireland, perhaps attributable to the genetic impact of the Vikings. The correspondence of these results with other analyses of these data and the additional insights generated from variogram analysis and kriging illustrate the potential utility of geostatistical analysis in biological anthropology. (c) 2008 Wiley-Liss, Inc.

  15. Geostatistics and GIS: tools for characterizing environmental contamination.

    PubMed

    Henshaw, Shannon L; Curriero, Frank C; Shields, Timothy M; Glass, Gregory E; Strickland, Paul T; Breysse, Patrick N

    2004-08-01

    Geostatistics is a set of statistical techniques used in the analysis of georeferenced data that can be applied to environmental contamination and remediation studies. In this study, the 1,1-dichloro-2,2-bis(p-chlorophenyl)ethylene (DDE) contamination at a Superfund site in western Maryland is evaluated. Concern about the site and its future clean up has triggered interest within the community because residential development surrounds the area. Spatial statistical methods, of which geostatistics is a subset, are becoming increasingly popular, in part due to the availability of geographic information system (GIS) software in a variety of application packages. In this article, the joint use of ArcGIS software and the R statistical computing environment is demonstrated as an approach for comprehensive geostatistical analyses. The spatial regression method, kriging, is used to provide predictions of DDE levels at unsampled locations both within the site and the surrounding areas where residential development is ongoing.
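The kriging predictions at unsampled locations mentioned in this record can be illustrated with a bare-bones ordinary kriging solver. This NumPy sketch assumes a zero-nugget spherical covariance with invented sill and range parameters; it is a generic textbook construction, not the study's ArcGIS/R workflow:

```python
import numpy as np

def spherical_cov(h, sill=1.0, rng=30.0):
    """Covariance implied by a spherical variogram with zero nugget."""
    h = np.asarray(h, dtype=float)
    c = sill * (1.0 - 1.5 * h / rng + 0.5 * (h / rng) ** 3)
    return np.where(h >= rng, 0.0, c)

def ordinary_kriging(coords, values, target):
    """Solve the ordinary kriging system [[C, 1], [1^T, 0]] w = [c0, 1];
    returns the estimate and the kriging variance at `target`."""
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = spherical_cov(d)
    A[:n, n] = 1.0          # unbiasedness constraint (Lagrange column)
    A[n, :n] = 1.0
    b = np.zeros(n + 1)
    b[:n] = spherical_cov(np.linalg.norm(coords - target, axis=-1))
    b[n] = 1.0
    w = np.linalg.solve(A, b)
    est = w[:n] @ values
    var = float(spherical_cov(0.0)) - w[:n] @ b[:n] - w[n]
    return est, var
```

At a sampled location the estimate reproduces the data exactly with zero variance; away from samples the kriging variance quantifies prediction uncertainty.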

  16. Exploring prediction uncertainty of spatial data in geostatistical and machine learning approaches

    NASA Astrophysics Data System (ADS)

    Klump, J. F.; Fouedjio, F.

    2017-12-01

    Geostatistical methods such as kriging with external drift, as well as machine learning techniques such as quantile regression forest, have been intensively used for modelling spatial data. In addition to providing predictions for target variables, both approaches are able to deliver a quantification of the uncertainty associated with the prediction at a target location. Geostatistical approaches are, in essence, well suited to providing such prediction uncertainties, and their behaviour is well understood. However, they often require significant data pre-processing and rely on assumptions that are rarely met in practice. Machine learning algorithms such as random forest regression, on the other hand, require less data pre-processing and are non-parametric. This makes the application of machine learning algorithms to geostatistical problems an attractive proposition. The objective of this study is to compare kriging with external drift and quantile regression forest with respect to their ability to deliver reliable prediction uncertainties for spatial data. In our comparison we use both simulated and real-world datasets. Apart from classical performance indicators, the comparisons make use of accuracy plots, probability interval width plots, and visual examination of the uncertainty maps provided by the two approaches. By comparing random forest regression to kriging, we found that both methods produced comparable maps of estimated values for our variables of interest. However, the measure of uncertainty provided by random forest appears to be quite different from the measure of uncertainty provided by kriging. In particular, the lack of spatial context can give misleading results in areas without ground truth data. These preliminary results raise questions about assessing the risks associated with decisions based on the predictions from geostatistical and machine learning algorithms in a spatial context, e.g., in mineral exploration.
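One common surrogate for quantile-regression-forest intervals is to take quantiles across per-tree predictions; true QRF instead uses the weighted distribution of training responses, so this is only an approximation, and the synthetic per-tree predictions below are invented for illustration:

```python
import numpy as np

def prediction_interval(tree_predictions, alpha=0.1):
    """(1 - alpha) interval from the spread of per-tree predictions,
    a simple stand-in for quantile regression forest intervals."""
    lo = np.quantile(tree_predictions, alpha / 2, axis=0)
    hi = np.quantile(tree_predictions, 1 - alpha / 2, axis=0)
    return lo, hi

# hypothetical ensemble: 500 trees predicting at 20 target locations
rng = np.random.default_rng(0)
tree_preds = rng.normal(5.0, 1.0, size=(500, 20))
lo, hi = prediction_interval(tree_preds)
```

Unlike the kriging variance, such intervals carry no spatial context: two targets equally far inside and outside the sampled region can receive similar widths, which is the concern the abstract raises.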

  17. Using geostatistics to evaluate cleanup goals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marcon, M.F.; Hopkins, L.P.

    1995-12-01

    Geostatistical analysis is a powerful predictive tool typically used to define spatial variability in environmental data. The information from a geostatistical analysis using kriging, a geostatistical tool, can be taken a step further to optimize sampling location and frequency and to help quantify sampling uncertainty in both the remedial investigation and the remedial design at a hazardous waste site. Geostatistics was used to quantify sampling uncertainty in attainment of a risk-based cleanup goal and to determine the optimal sampling frequency necessary to delineate the horizontal extent of impacted soils at a Gulf Coast waste site.

  18. Local Geostatistical Models and Big Data in Hydrological and Ecological Applications

    NASA Astrophysics Data System (ADS)

    Hristopulos, Dionissios

    2015-04-01

    The advent of the big data era creates new opportunities for environmental and ecological modelling but also presents significant challenges. The availability of remote sensing images and low-cost wireless sensor networks means that spatiotemporal environmental data now cover larger spatial domains at higher spatial and temporal resolution for longer time windows. Handling such voluminous data presents several technical and scientific challenges. In particular, the geostatistical methods used to process spatiotemporal data need to overcome the dimensionality curse associated with the need to store and invert large covariance matrices. There are various mathematical approaches for addressing the dimensionality problem, including change of basis, dimensionality reduction, hierarchical schemes, and local approximations. We present a Stochastic Local Interaction (SLI) model that can be used to model local correlations in spatial data. SLI is a random field model suitable for data on discrete supports (i.e., regular lattices or irregular sampling grids). The degree of localization is determined by means of kernel functions and appropriate bandwidths. The strength of the correlations is determined by means of coefficients. In the "plain vanilla" version, the parameter set involves scale and rigidity coefficients as well as a characteristic length. The latter, together with the rigidity coefficient, determines the correlation length of the random field. The SLI model is based on statistical field theory and extends previous research on Spartan spatial random fields [2,3] from continuum spaces to explicitly discrete supports. The SLI kernel functions employ adaptive bandwidths learned from the sampling spatial distribution [1]. The SLI precision matrix is expressed explicitly in terms of the model parameters and the kernel function. Hence, covariance matrix inversion is not necessary for parameter inference, which is based on leave-one-out cross validation. This property

  19. Quantifying natural delta variability using a multiple-point geostatistics prior uncertainty model

    NASA Astrophysics Data System (ADS)

    Scheidt, Céline; Fernandes, Anjali M.; Paola, Chris; Caers, Jef

    2016-10-01

    We address the question of quantifying uncertainty associated with autogenic pattern variability in a channelized transport system by means of a modern geostatistical method. This question has considerable relevance for practical subsurface applications as well, particularly those related to uncertainty quantification relying on Bayesian approaches. Specifically, we show how the autogenic variability in a laboratory experiment can be represented and reproduced by a multiple-point geostatistical prior uncertainty model. The latter geostatistical method requires selection of a limited set of training images from which a possibly infinite set of geostatistical model realizations, mimicking the training image patterns, can be generated. To that end, we investigate two methods to determine how many training images and what training images should be provided to reproduce natural autogenic variability. The first method relies on distance-based clustering of overhead snapshots of the experiment; the second method relies on a rate of change quantification by means of a computer vision algorithm termed the demon algorithm. We show quantitatively that with either training image selection method, we can statistically reproduce the natural variability of the delta formed in the experiment. In addition, we study the nature of the patterns represented in the set of training images as a representation of the "eigenpatterns" of the natural system. The eigenpatterns in the training image sets display patterns consistent with previous physical interpretations of the fundamental modes of this type of delta system: a highly channelized, incisional mode; a poorly channelized, depositional mode; and an intermediate mode between the two.

  20. A practical primer on geostatistics

    USGS Publications Warehouse

    Olea, Ricardo A.

    2009-01-01

    The Challenge—Most geological phenomena are extraordinarily complex in their interrelationships and vast in their geographical extension. Ordinarily, engineers and geoscientists are faced with corporate or scientific requirements to properly prepare geological models with measurements involving a small fraction of the entire area or volume of interest. Exact description of a system such as an oil reservoir is neither feasible nor economically possible. The results are necessarily uncertain. Note that the uncertainty is not an intrinsic property of the systems; it is the result of incomplete knowledge by the observer. The Aim of Geostatistics—The main objective of geostatistics is the characterization of spatial systems that are incompletely known, systems that are common in geology. A key difference from classical statistics is that geostatistics uses the sampling location of every measurement. Unless the measurements show spatial correlation, the application of geostatistics is pointless. Ordinarily the need for additional knowledge goes beyond a few points, which explains the display of results graphically as fishnet plots, block diagrams, and maps. Geostatistical Methods—Geostatistics is a collection of numerical techniques for the characterization of spatial attributes using primarily two tools: probabilistic models, which are used for spatial data in a manner similar to the way in which time-series analysis characterizes temporal data, or pattern recognition techniques. The probabilistic models are used as a way to handle uncertainty in results away from sampling locations, making a radical departure from alternative approaches like inverse distance estimation methods. Differences with Time Series—When dealing with time-series analysis, users frequently concentrate their attention on extrapolations for making forecasts. Although users of geostatistics may be interested in extrapolation, the methods work at their best when interpolating. This simple difference
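The "inverse distance estimation methods" named above as the non-probabilistic alternative can be sketched in a few lines; unlike kriging, IDW carries no model of spatial correlation and yields no uncertainty measure. The coordinates, values, and power parameter below are illustrative assumptions:

```python
import numpy as np

def idw(coords, values, target, power=2.0):
    """Inverse-distance-weighted estimate at `target`: weights decay as
    1/d^power, with an exact return when the target hits a sample."""
    d = np.linalg.norm(coords - target, axis=-1)
    if np.any(d == 0.0):               # exact hit on a sample point
        return float(values[np.argmin(d)])
    w = 1.0 / d ** power
    return float(w @ values / w.sum())
```

Kriging departs from this by choosing weights from a fitted covariance model, which is what lets it also report an estimation variance.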

  1. Efficient geostatistical inversion of transient groundwater flow using preconditioned nonlinear conjugate gradients

    NASA Astrophysics Data System (ADS)

    Klein, Ole; Cirpka, Olaf A.; Bastian, Peter; Ippisch, Olaf

    2017-04-01

    In the geostatistical inverse problem of subsurface hydrology, continuous hydraulic parameter fields, in most cases hydraulic conductivity, are estimated from measurements of dependent variables, such as hydraulic heads, under the assumption that the parameter fields are autocorrelated random space functions. Upon discretization, the continuous fields become large parameter vectors with O(10^4-10^7) elements. While cokriging-like inversion methods have been shown to be efficient for highly resolved parameter fields when the number of measurements is small, they require the calculation of the sensitivity of each measurement with respect to all parameters, which may become prohibitive with large sets of measured data such as those arising from transient groundwater flow. We present a Preconditioned Conjugate Gradient method for the geostatistical inverse problem, in which a single adjoint equation needs to be solved to obtain the gradient of the objective function. Using the autocovariance matrix of the parameters as preconditioning matrix, expensive multiplications with its inverse can be avoided, and the number of iterations is significantly reduced. We use a randomized spectral decomposition of the posterior covariance matrix of the parameters to perform a linearized uncertainty quantification of the parameter estimate. The feasibility of the method is tested by virtual examples of head observations in steady-state and transient groundwater flow. These synthetic tests demonstrate that transient data can reduce both parameter uncertainty and time spent conducting experiments, while the presented methods are able to handle the resulting large number of measurements.
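The preconditioned conjugate gradient iteration at the core of this method can be sketched generically. The NumPy implementation below is a standard PCG for symmetric positive-definite systems; the preconditioner is passed as a callable, so a Jacobi (diagonal) preconditioner stands in here for the autocovariance-based preconditioning the authors use (an illustrative substitution, not their scheme):

```python
import numpy as np

def pcg(A, b, M_inv, x0=None, tol=1e-10, maxiter=200):
    """Preconditioned conjugate gradients for SPD A.
    M_inv(r) applies the preconditioner's inverse to a residual."""
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p      # conjugate search direction
        rz = rz_new
    return x
```

A good preconditioner, such as the parameter autocovariance in the abstract, clusters the spectrum of the preconditioned operator and cuts the iteration count sharply.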

  2. [Spatial distribution pattern of Chilo suppressalis analyzed by classical method and geostatistics].

    PubMed

    Yuan, Zheming; Fu, Wei; Li, Fangyi

    2004-04-01

    Two original samples of Chilo suppressalis and their grid, random, and sequence samples were analyzed by the classical method and by geostatistics to characterize the spatial distribution pattern of C. suppressalis. The limitations of spatial distribution analysis with the classical method, especially its sensitivity to the original position of the grid, were summarized rather completely. In contrast, geostatistics characterized well the spatial distribution pattern, aggregation intensity, and spatial heterogeneity of C. suppressalis. According to geostatistics, the population followed a Poisson distribution at low density. At higher density, the distribution was aggregated, with an aggregation intensity and dependence range of 0.1056 and 193 cm, respectively. Spatial heterogeneity was also found in the higher-density population. Its spatial correlation in the line direction was closer than that in the row direction, and the dependence ranges in the line and row directions were 115 and 264 cm, respectively.

  3. Benchmarking a geostatistical procedure for the homogenisation of annual precipitation series

    NASA Astrophysics Data System (ADS)

    Caineta, Júlio; Ribeiro, Sara; Henriques, Roberto; Soares, Amílcar; Costa, Ana Cristina

    2014-05-01

    The European project COST Action ES0601, Advances in homogenisation methods of climate series: an integrated approach (HOME), has brought to attention the importance of establishing reliable homogenisation methods for climate data. In order to achieve that, a benchmark data set, containing monthly and daily temperature and precipitation data, was created to be used as a comparison basis for the effectiveness of those methods. Several contributions were submitted and evaluated by a number of performance metrics, validating the results against realistic inhomogeneous data. HOME also led to the development of new homogenisation software packages, which included feedback and lessons learned during the project. Preliminary studies have suggested a geostatistical stochastic approach, which uses Direct Sequential Simulation (DSS), as a promising methodology for the homogenisation of precipitation data series. Based on the spatial and temporal correlation between the neighbouring stations, DSS calculates local probability density functions at a candidate station to detect inhomogeneities. The purpose of the current study is to test and compare this geostatistical approach with the methods previously presented in the HOME project, using surrogate precipitation series from the HOME benchmark data set. The benchmark data set contains monthly precipitation surrogate series, from which annual precipitation data series were derived. These annual precipitation series were subject to exploratory analysis and to a thorough variography study. The geostatistical approach was then applied to the data set, based on different scenarios for the spatial continuity. Implementing this procedure also promoted the development of a computer program that aims to assist on the homogenisation of climate data, while minimising user interaction. 
Finally, in order to compare the effectiveness of this methodology with the homogenisation methods submitted during the HOME project, the obtained results
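The detection step described above can be caricatured in a few lines: estimate a local distribution at the candidate station from its neighbours and flag years that fall outside a central probability interval. This is a hedged sketch of the idea only, not the DSS algorithm; the percentile rule, the synthetic data, and all names are illustrative assumptions.

```python
import numpy as np

def detect_inhomogeneities(candidate, neighbours, prob=0.95):
    """Flag years where the candidate value falls outside the central
    `prob` interval of the neighbouring stations' values.

    candidate  : (n_years,) series at the candidate station
    neighbours : (n_years, n_stations) series at reference stations
    Returns a boolean mask of suspect years.
    """
    lo = np.percentile(neighbours, (1 - prob) / 2 * 100, axis=1)
    hi = np.percentile(neighbours, (1 + prob) / 2 * 100, axis=1)
    return (candidate < lo) | (candidate > hi)

# synthetic example: a break (inhomogeneity) introduced after year 20
rng = np.random.default_rng(0)
neigh = rng.normal(100.0, 10.0, size=(50, 25))
cand = rng.normal(100.0, 10.0, size=50)
cand[20:] += 40.0
mask = detect_inhomogeneities(cand, neigh)
```

In the actual approach, the local probability density functions come from Direct Sequential Simulation conditioned on the spatial and temporal correlation model, not from raw neighbour percentiles.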

  4. Geostatistics and petroleum geology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hohn, M.E.

    1988-01-01

The book reviewed is designed as a practical guide to geostatistics, or kriging, for petroleum geologists. The author's aim in the book is to explain geostatistics as a working tool for petroleum geologists through extensive use of case-study material, mostly drawn from his own research in gas potential evaluation in West Virginia. Theory and mathematics are pared down to immediate needs.

  5. Reservoir property grids improve with geostatistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vogt, J.

    1993-09-01

Visualization software, reservoir simulators and many other E and P software applications need reservoir property grids as input. Using geostatistics rather than other gridding methods to produce these grids leads to the best output from the software programs. For the purpose stated herein, geostatistics is simply two types of gridding methods. Mathematically, these methods are based on minimizing or duplicating certain statistical properties of the input data. One geostatistical method, called kriging, is used when the highest possible point-by-point accuracy is desired. The other method, called conditional simulation, is used when one wants the statistics and texture of the resulting grid to be the same as for the input data. In the following discussion, each method is explained, compared to other gridding methods, and illustrated through example applications. Proper use of geostatistical data in flow simulations, use of geostatistical data for history matching, and situations where geostatistics has no significant advantage over other methods will also be covered.
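The distinction drawn above can be illustrated with a minimal ordinary-kriging sketch in one dimension. The exponential covariance model and the data values are assumptions for illustration, not the article's workflow.

```python
import numpy as np

def exp_cov(h, sill=1.0, rng_len=10.0):
    """Assumed exponential covariance model C(h) = sill * exp(-|h|/a)."""
    return sill * np.exp(-np.abs(h) / rng_len)

def ordinary_krige(x_obs, z_obs, x0, sill=1.0, rng_len=10.0):
    """Ordinary kriging estimate and variance at location x0.

    Solves the kriging system with a Lagrange multiplier enforcing
    that the weights sum to one (unbiasedness constraint).
    """
    n = len(x_obs)
    K = np.ones((n + 1, n + 1))
    K[:n, :n] = exp_cov(x_obs[:, None] - x_obs[None, :], sill, rng_len)
    K[n, n] = 0.0
    k = np.ones(n + 1)
    k[:n] = exp_cov(x_obs - x0, sill, rng_len)
    w = np.linalg.solve(K, k)
    est = w[:n] @ z_obs
    var = sill - w[:n] @ k[:n] - w[n]
    return est, var

x_obs = np.array([0.0, 5.0, 12.0, 20.0])
z_obs = np.array([1.0, 3.0, 2.0, 4.0])
est, var = ordinary_krige(x_obs, z_obs, 8.0)
```

Kriging honours the data exactly and minimizes the point-by-point error variance, which is why kriged grids look smoother than the true field; conditional simulation would add a simulated residual so that the grid reproduces the variance and texture of the input data.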

  6. An application of geostatistics and fractal geometry for reservoir characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aasum, Y.; Kelkar, M.G.; Gupta, S.P.

    1991-03-01

This paper presents an application of geostatistics and fractal geometry concepts for 2D characterization of rock properties (k and φ) in a dolomitic, layer-cake reservoir. The results indicate that a lack of closely spaced data yields effectively random distributions of properties. Further, incorporation of geology reduces uncertainties in fractal interpolation of wellbore properties.

  7. GEOSTATISTICS FOR WASTE MANAGEMENT: A USER'S MANUAL FOR THE GEOPACK (VERSION 1.0) GEOSTATISTICAL SOFTWARE SYSTEM

    EPA Science Inventory

    GEOPACK, a comprehensive user-friendly geostatistical software system, was developed to help in the analysis of spatially correlated data. The software system was developed to be used by scientists, engineers, regulators, etc., with little experience in geostatistical techniques...

  8. GEOSTATISTICS FOR WASTE MANAGEMENT: A USER'S MANUAL FOR THE GEOPACK (VERSION 1.0) GEOSTATISTICAL SOFTWARE SYSTEM

    EPA Science Inventory

A comprehensive, user-friendly geostatistical software system called GEOPACK has been developed. The purpose of this software is to make available the programs necessary to undertake a geostatistical analysis of spatially correlated data. The programs were written so that they ...

  9. A geostatistical approach to estimate mining efficiency indicators with flexible meshes

    NASA Astrophysics Data System (ADS)

    Freixas, Genis; Garriga, David; Fernàndez-Garcia, Daniel; Sanchez-Vila, Xavier

    2014-05-01

Geostatistics is a branch of statistics developed originally to predict probability distributions of ore grades for mining operations by considering the attributes of a geological formation at unknown locations as a set of correlated random variables. Mining exploitations typically aim to maintain acceptable ore grades to produce commercial products based upon demand. In this context, we present a new geostatistical methodology to estimate strategic efficiency maps that incorporate hydraulic test data, the evolution of concentrations with time obtained from chemical analysis (packer tests and production wells), as well as hydraulic head variations. The methodology is applied to a salt basin in South America. The exploitation is based on the extraction of brines through vertical and horizontal wells. Thereafter, brines are precipitated in evaporation ponds to obtain target potassium and magnesium salts of economic interest. Lithium carbonate is obtained as a byproduct of the production of potassium chloride. Aside from providing an assembly of traditional geostatistical methods, the strength of this study lies in the new methodology developed, which focuses on finding the best sites to exploit the brines while maintaining efficiency criteria. Thus, strategic efficiency indicator maps have been developed under the specific criteria imposed by exploitation standards to incorporate new extraction wells in new areas that would allow production to be maintained or improved. Results show that the uncertainty quantification of the efficiency plays a dominant role and that the use of flexible meshes, which properly describe the curvilinear features associated with vertical stratification, provides a more consistent estimation of the geological processes. 
Moreover, we demonstrate that the vertical correlation structure at the given salt basin is essentially linked to variations in the formation thickness, which calls for flexible meshes and non-stationary stochastic processes.

  10. Robust geostatistical analysis of spatial data

    NASA Astrophysics Data System (ADS)

    Papritz, Andreas; Künsch, Hans Rudolf; Schwierz, Cornelia; Stahel, Werner A.

    2013-04-01

Most of the geostatistical software tools rely on non-robust algorithms. This is unfortunate, because outlying observations are rather the rule than the exception, in particular in environmental data sets. Outliers affect the modelling of the large-scale spatial trend, the estimation of the spatial dependence of the residual variation and the predictions by kriging. Identifying outliers manually is cumbersome and requires expertise because one needs parameter estimates to decide which observation is a potential outlier. Moreover, inference after the rejection of some observations is problematic. A better approach is to use robust algorithms that automatically prevent outlying observations from having undue influence. Former studies on robust geostatistics focused on robust estimation of the sample variogram and ordinary kriging without external drift. Furthermore, Richardson and Welsh (1995) proposed a robustified version of (restricted) maximum likelihood ([RE]ML) estimation for the variance components of a linear mixed model, which was later used by Marchant and Lark (2007) for robust REML estimation of the variogram. We propose here a novel method for robust REML estimation of the variogram of a Gaussian random field that is possibly contaminated by independent errors from a long-tailed distribution. It is based on robustification of estimating equations for the Gaussian REML estimation (Welsh and Richardson, 1997). Besides robust estimates of the parameters of the external drift and of the variogram, the method also provides standard errors for the estimated parameters, robustified kriging predictions at both sampled and non-sampled locations and kriging variances. Apart from presenting our modelling framework, we shall present selected simulation results by which we explored the properties of the new method. This will be complemented by an analysis of a data set on heavy metal contamination of the soil in the vicinity of a metal smelter. Marchant, B.P. and Lark, R

  11. Robust geostatistical analysis of spatial data

    NASA Astrophysics Data System (ADS)

    Papritz, A.; Künsch, H. R.; Schwierz, C.; Stahel, W. A.

    2012-04-01

Most of the geostatistical software tools rely on non-robust algorithms. This is unfortunate, because outlying observations are rather the rule than the exception, in particular in environmental data sets. Outlying observations may result from errors (e.g. in data transcription) or from local perturbations in the processes that are responsible for a given pattern of spatial variation. As an example, the spatial distribution of some trace metal in the soils of a region may be distorted by emissions of local anthropogenic sources. Outliers affect the modelling of the large-scale spatial variation, the so-called external drift or trend, the estimation of the spatial dependence of the residual variation and the predictions by kriging. Identifying outliers manually is cumbersome and requires expertise because one needs parameter estimates to decide which observation is a potential outlier. Moreover, inference after the rejection of some observations is problematic. A better approach is to use robust algorithms that automatically prevent outlying observations from having undue influence. Former studies on robust geostatistics focused on robust estimation of the sample variogram and ordinary kriging without external drift. Furthermore, Richardson and Welsh (1995) [2] proposed a robustified version of (restricted) maximum likelihood ([RE]ML) estimation for the variance components of a linear mixed model, which was later used by Marchant and Lark (2007) [1] for robust REML estimation of the variogram. We propose here a novel method for robust REML estimation of the variogram of a Gaussian random field that is possibly contaminated by independent errors from a long-tailed distribution. It is based on robustification of estimating equations for the Gaussian REML estimation. 
Besides robust estimates of the parameters of the external drift and of the variogram, the method also provides standard errors for the estimated parameters, robustified kriging predictions at both sampled
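The contrast between classical and robust variogram estimation can be sketched for a single lag bin. The robust formula below is the well-known Cressie and Hawkins (1980) estimator, shown here for illustration; it is not necessarily the estimating-equation approach the authors propose.

```python
import numpy as np

def matheron(diffs):
    """Classical (Matheron) estimator for one lag bin:
    gamma(h) = mean of squared pair differences / 2."""
    return np.mean(np.asarray(diffs) ** 2) / 2.0

def cressie_hawkins(diffs):
    """Robust Cressie-Hawkins estimator for one lag bin, built on
    fourth powers of mean square-rooted absolute differences."""
    d = np.abs(np.asarray(diffs))
    n = d.size
    return np.mean(np.sqrt(d)) ** 4 / (2.0 * (0.457 + 0.494 / n))
```

On clean Gaussian pair differences the two estimators roughly agree; a single gross outlier inflates the classical estimate far more than the robust one, which is the motivation for robust variography.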

  12. Restricted spatial regression in practice: Geostatistical models, confounding, and robustness under model misspecification

    USGS Publications Warehouse

    Hanks, Ephraim M.; Schliep, Erin M.; Hooten, Mevin B.; Hoeting, Jennifer A.

    2015-01-01

    In spatial generalized linear mixed models (SGLMMs), covariates that are spatially smooth are often collinear with spatially smooth random effects. This phenomenon is known as spatial confounding and has been studied primarily in the case where the spatial support of the process being studied is discrete (e.g., areal spatial data). In this case, the most common approach suggested is restricted spatial regression (RSR) in which the spatial random effects are constrained to be orthogonal to the fixed effects. We consider spatial confounding and RSR in the geostatistical (continuous spatial support) setting. We show that RSR provides computational benefits relative to the confounded SGLMM, but that Bayesian credible intervals under RSR can be inappropriately narrow under model misspecification. We propose a posterior predictive approach to alleviating this potential problem and discuss the appropriateness of RSR in a variety of situations. We illustrate RSR and SGLMM approaches through simulation studies and an analysis of malaria frequencies in The Gambia, Africa.
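The orthogonality constraint at the heart of RSR can be written down directly: the spatial random effects are premultiplied by the projection onto the orthogonal complement of the fixed-effect design matrix. A minimal sketch:

```python
import numpy as np

def rsr_projection(X):
    """Projection onto the orthogonal complement of the columns of X:
    P = I - X (X'X)^{-1} X'. Spatial random effects premultiplied by P
    are constrained to be orthogonal to the fixed effects."""
    n = X.shape[0]
    XtX_inv = np.linalg.inv(X.T @ X)
    return np.eye(n) - X @ XtX_inv @ X.T
```

Because P annihilates the column space of X, the constrained random effects P @ w cannot be collinear with spatially smooth covariates, which is exactly the confounding RSR is designed to remove.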

  13. Can Geostatistical Models Represent Nature's Variability? An Analysis Using Flume Experiments

    NASA Astrophysics Data System (ADS)

    Scheidt, C.; Fernandes, A. M.; Paola, C.; Caers, J.

    2015-12-01

The lack of understanding of the Earth's geological and physical processes governing sediment deposition renders subsurface modeling subject to large uncertainty. Geostatistics is often used to model uncertainty because of its capability to stochastically generate spatially varying realizations of the subsurface. These methods can generate a range of realizations of a given pattern - but how representative are these of the full natural variability? And how can we identify the minimum set of images that represent this natural variability? Here we use this minimum set to define the geostatistical prior model: a set of training images that represent the range of patterns generated by autogenic variability in the sedimentary environment under study. The proper definition of the prior model is essential in capturing the variability of the depositional patterns. This work starts with a set of overhead images from an experimental basin that showed ongoing autogenic variability. We use the images to analyze the essential characteristics of this suite of patterns. In particular, our goal is to define a prior model (a minimal set of selected training images) such that geostatistical algorithms, when applied to this set, can reproduce the full measured variability. A necessary prerequisite is to define a measure of variability. In this study, we measure variability using a dissimilarity distance between the images. The distance indicates whether two snapshots contain similar depositional patterns. To reproduce the variability in the images, we apply an MPS algorithm to the set of selected snapshots of the sedimentary basin that serve as training images. The training images are chosen from among the initial set by using the distance measure to ensure that only dissimilar images are chosen. Preliminary investigations show that MPS can reproduce fairly accurately the natural variability of the experimental depositional system. Furthermore, the selected training images provide
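The selection logic can be sketched greedily: keep an image only if it is sufficiently dissimilar from every image already kept. The Hamming-type distance below is an illustrative stand-in for the study's dissimilarity measure, which is an assumption on our part.

```python
import numpy as np

def dissimilarity(a, b):
    """Fraction of pixels on which two binary pattern images disagree
    (a simple stand-in for the study's dissimilarity distance)."""
    return np.mean(a != b)

def select_training_images(images, threshold):
    """Greedily keep an image only if it lies at least `threshold`
    away from every image already selected."""
    selected = []
    for img in images:
        if all(dissimilarity(img, s) >= threshold for s in selected):
            selected.append(img)
    return selected
```

The kept subset then serves as the prior model: a minimal set of training images spanning the observed autogenic variability.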

  14. A Comparison of Traditional, Step-Path, and Geostatistical Techniques in the Stability Analysis of a Large Open Pit

    NASA Astrophysics Data System (ADS)

    Mayer, J. M.; Stead, D.

    2017-04-01

    With the increased drive towards deeper and more complex mine designs, geotechnical engineers are often forced to reconsider traditional deterministic design techniques in favour of probabilistic methods. These alternative techniques allow for the direct quantification of uncertainties within a risk and/or decision analysis framework. However, conventional probabilistic practices typically discretize geological materials into discrete, homogeneous domains, with attributes defined by spatially constant random variables, despite the fact that geological media display inherent heterogeneous spatial characteristics. This research directly simulates this phenomenon using a geostatistical approach, known as sequential Gaussian simulation. The method utilizes the variogram which imposes a degree of controlled spatial heterogeneity on the system. Simulations are constrained using data from the Ok Tedi mine site in Papua New Guinea and designed to randomly vary the geological strength index and uniaxial compressive strength using Monte Carlo techniques. Results suggest that conventional probabilistic techniques have a fundamental limitation compared to geostatistical approaches, as they fail to account for the spatial dependencies inherent to geotechnical datasets. This can result in erroneous model predictions, which are overly conservative when compared to the geostatistical results.
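Sequential Gaussian simulation itself can be sketched in one dimension. This is a didactic, unconditional version with an assumed exponential covariance and a zero-mean simple-kriging step; the paper's application conditions on site data and varies geotechnical properties at mine scale.

```python
import numpy as np

def exp_cov(h, sill=1.0, length=5.0):
    """Assumed exponential covariance model."""
    return sill * np.exp(-np.abs(h) / length)

def sgs_1d(grid_x, rng, sill=1.0, length=5.0):
    """Unconditional sequential Gaussian simulation on a 1-D grid.

    Nodes are visited in random order; each value is drawn from the
    simple-kriging (zero-mean) conditional distribution given all
    previously simulated nodes.
    """
    z = np.empty(len(grid_x))
    known_x, known_z = [], []
    for idx in rng.permutation(len(grid_x)):
        x0 = grid_x[idx]
        if known_x:
            xs = np.array(known_x)
            C = exp_cov(xs[:, None] - xs[None, :], sill, length)
            c0 = exp_cov(xs - x0, sill, length)
            w = np.linalg.solve(C, c0)
            mean = w @ np.array(known_z)
            var = max(sill - w @ c0, 0.0)
        else:
            mean, var = 0.0, sill
        z[idx] = rng.normal(mean, np.sqrt(var))
        known_x.append(x0)
        known_z.append(z[idx])
    return z

rng = np.random.default_rng(42)
field = sgs_1d(np.arange(40.0), rng)
```

Unlike a spatially constant random variable drawn once per domain, each realization exhibits controlled spatial heterogeneity with the correlation length imposed by the covariance (or, equivalently, the variogram).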

  15. Geostatistical risk estimation at waste disposal sites in the presence of hot spots.

    PubMed

    Komnitsas, Kostas; Modis, Kostas

    2009-05-30

    The present paper aims to estimate risk by using geostatistics at the wider coal mining/waste disposal site of Belkovskaya, Tula region, in Russia. In this area the presence of hot spots causes a spatial trend in the mean value of the random field and a non-Gaussian data distribution. Prior to application of geostatistics, subtraction of trend and appropriate smoothing and transformation of the data into a Gaussian form were carried out; risk maps were then generated for the wider study area in order to assess the probability of exceeding risk thresholds. Finally, the present paper discusses the need for homogenization of soil risk thresholds regarding hazardous elements that will enhance reliability of risk estimation and enable application of appropriate rehabilitation actions in contaminated areas.
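The transformation of skewed data into a Gaussian form, as mentioned above, is commonly done with a rank-based normal-score transform. The paper does not specify its exact transformation, so the following is an illustrative sketch using only the standard library and numpy.

```python
import numpy as np
from statistics import NormalDist

def normal_score_transform(values):
    """Rank-based transform of (possibly skewed) data to a standard
    Gaussian shape, a common preprocessing step before variography
    and kriging. Uses plotting positions (rank + 0.5)/n to avoid
    probabilities of exactly 0 or 1."""
    values = np.asarray(values)
    n = len(values)
    ranks = np.argsort(np.argsort(values))
    p = (ranks + 0.5) / n
    nd = NormalDist()
    return np.array([nd.inv_cdf(pi) for pi in p])
```

Trend subtraction would precede this step, and the back-transform (the inverse rank mapping) restores estimates to the original units after kriging or simulation.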

  16. Mine planning and emission control strategies using geostatistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martino, F.; Kim, Y.C.

    1983-03-01

This paper reviews the past four years' research efforts performed jointly by the University of Arizona and the Homer City Owners in which geostatistics was applied to solve various problems associated with coal characterization, mine planning, and development of emission control strategies. Because geostatistics is the only technique which can quantify the degree of confidence associated with a given estimate (or prediction), it played an important role throughout the research efforts. Through geostatistics, it was learned that there is an urgent need for closely spaced sample information if short-term coal quality predictions are to be made for mine planning purposes.

  17. Usability and potential of geostatistics for spatial discrimination of multiple sclerosis lesion patterns.

    PubMed

    Marschallinger, Robert; Golaszewski, Stefan M; Kunz, Alexander B; Kronbichler, Martin; Ladurner, Gunther; Hofmann, Peter; Trinka, Eugen; McCoy, Mark; Kraus, Jörg

    2014-01-01

In multiple sclerosis (MS) the individual disease courses are very heterogeneous among patients, and biomarkers for setting the diagnosis and estimating the prognosis for individual patients would be very helpful. For this purpose, we are developing a multidisciplinary method and workflow for the quantitative, spatial, and spatiotemporal analysis and characterization of MS lesion patterns from MRI with geostatistics. We worked on a small data set involving three synthetic and three real-world MS lesion patterns, covering a wide range of possible MS lesion configurations. After brain normalization, MS lesions were extracted and the resulting binary 3-dimensional models of MS lesion patterns were subject to geostatistical indicator variography in three orthogonal directions. By applying geostatistical indicator variography, we were able to describe the 3-dimensional spatial structure of MS lesion patterns in a standardized manner. Fitting a model function to the empirical variograms, spatial characteristics of the MS lesion patterns could be expressed and quantified by two parameters. An orthogonal plot of these parameters enabled a well-arranged comparison of the involved MS lesion patterns. This method in development is a promising candidate to complement standard image-based statistics by incorporating spatial quantification. The workflow is generic and not limited to analyzing MS lesion patterns. It can be completely automated for the screening of radiological archives.
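Indicator variography of a binary lesion mask reduces to computing, for each lag along an axis, half the mean squared difference of the indicator values. A sketch for one axis of a 3-D mask (a generic empirical estimator, not the authors' implementation):

```python
import numpy as np

def indicator_variogram(mask, axis, max_lag):
    """Empirical indicator variogram of a binary 3-D mask along one
    coordinate axis: gamma(h) = 0.5 * mean((I(x) - I(x+h))^2)."""
    m = np.moveaxis(mask.astype(float), axis, 0)
    gam = []
    for h in range(1, max_lag + 1):
        diff = m[:-h] - m[h:]
        gam.append(0.5 * np.mean(diff ** 2))
    return np.array(gam)
```

Running this along the three orthogonal directions and fitting a model function to each empirical curve yields the range and sill parameters used to compare lesion patterns.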

  18. Selective remediation of contaminated sites using a two-level multiphase strategy and geostatistics.

    PubMed

    Saito, Hirotaka; Goovaerts, Pierre

    2003-05-01

Selective soil remediation aims to reduce costs by cleaning only the fraction of an exposure unit (EU) necessary to lower the average concentration below the regulatory threshold. This approach requires a prior stratification of each EU into smaller remediation units (RUs), which are then selected according to various criteria. This paper presents a geostatistical framework to account for uncertainties attached to both RU and EU average concentrations in selective remediation. The selection of RUs is based on their impact on the postremediation probability for the EU average concentration to exceed the regulatory threshold, which is assessed using geostatistical stochastic simulation. Application of the technique to a set of 600 dioxin concentrations collected at the Piazza Road EPA Superfund site in Missouri shows a substantial decrease in the number of RUs remediated compared with single-phase remediation. The lower remediation costs achieved by the new strategy are obtained to the detriment of a higher risk of false negatives, yet for this data set this risk remains below the 5% rate set by EPA Region 7.
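The selection logic can be caricatured deterministically: remediate the dirtiest RUs until the EU average drops below the threshold. The paper's criterion is probabilistic (based on simulated postremediation exceedance probabilities), so this skeleton only shows the skeleton of the idea, with made-up numbers.

```python
import numpy as np

def select_rus(ru_conc, threshold):
    """Greedily remediate the highest-concentration remediation units
    (modelled crudely as setting them to 0) until the exposure-unit
    average falls to or below the regulatory threshold."""
    conc = np.array(ru_conc, dtype=float)
    order = np.argsort(conc)[::-1]          # dirtiest first
    remediated = []
    for idx in order:
        if conc.mean() <= threshold:
            break
        conc[idx] = 0.0
        remediated.append(int(idx))
    return remediated, conc.mean()
```

In the geostatistical version, each candidate RU's effect is evaluated on an ensemble of simulated concentration fields, so the stopping rule bounds the probability of exceeding the threshold rather than a single deterministic mean.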

  19. Regression and Geostatistical Techniques: Considerations and Observations from Experiences in NE-FIA

    Treesearch

    Rachel Riemann; Andrew Lister

    2005-01-01

    Maps of forest variables improve our understanding of the forest resource by allowing us to view and analyze it spatially. The USDA Forest Service's Northeastern Forest Inventory and Analysis unit (NE-FIA) has used geostatistical techniques, particularly stochastic simulation, to produce maps and spatial data sets of FIA variables. That work underscores the...

  20. GsTL: the geostatistical template library in C++

    NASA Astrophysics Data System (ADS)

    Remy, Nicolas; Shtuka, Arben; Levy, Bruno; Caers, Jef

    2002-10-01

The development of geostatistics has been mostly accomplished by application-oriented engineers in the past 20 years. The focus on concrete applications gave birth to many algorithms and computer programs designed to address different issues, such as estimating or simulating a variable while possibly accounting for secondary information such as seismic data, or integrating geological and geometrical data. At the core of any geostatistical data integration methodology is a well-designed algorithm. Yet, despite their obvious differences, all these algorithms share many commonalities on which a geostatistics programming library should build; a library that ignores them is poorly reusable and difficult to expand. Building on this observation, we design a comprehensive, yet flexible and easily reusable, library of geostatistics algorithms in C++. The recent advent of the generic programming paradigm allows us to express the commonalities of the geostatistical algorithms elegantly in computer code. Generic programming, also referred to as "programming with concepts", provides a high level of abstraction without loss of efficiency. This last point is a major gain over object-oriented programming, which often trades efficiency for abstraction. It is not enough for a numerical library to be reusable; it also has to be fast. Because generic programming is "programming with concepts", the essential step in the library design is the careful identification and thorough definition of the concepts shared by most geostatistical algorithms. Building on these definitions, a generic and expandable code can be developed. To show the advantages of such a generic library, we use GsTL to build two sequential simulation programs working on two different types of grids (a surface with faults and an unstructured grid) without requiring any change to the GsTL code.

  1. Reservoir studies with geostatistics to forecast performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, R.W.; Behrens, R.A.; Emanuel, A.S.

    1991-05-01

In this paper, example geostatistics and streamtube applications are presented for waterflood and CO₂ flood in two low-permeability sandstone reservoirs. The hybrid approach of combining fine vertical resolution in cross-sectional models with streamtubes resulted in models that showed water channeling and provided realistic performance estimates. Results indicate that the combination of detailed geostatistical cross sections and fine-grid streamtube models offers a systematic approach for realistic performance forecasts.

  2. Modelling Geomechanical Heterogeneity of Rock Masses Using Direct and Indirect Geostatistical Conditional Simulation Methods

    NASA Astrophysics Data System (ADS)

    Eivazy, Hesameddin; Esmaieli, Kamran; Jean, Raynald

    2017-12-01

An accurate characterization and modelling of rock mass geomechanical heterogeneity can lead to more efficient mine planning and design. Using deterministic approaches and random field methods for modelling rock mass heterogeneity is known to be limited in simulating the spatial variation and spatial pattern of the geomechanical properties. Although the applications of geostatistical techniques have demonstrated improvements in modelling the heterogeneity of geomechanical properties, geostatistical estimation methods such as kriging result in estimates of geomechanical variables that are not fully representative of field observations. This paper reports on the development of 3D models for spatial variability of rock mass geomechanical properties using a geostatistical conditional simulation method based on sequential Gaussian simulation. A methodology to simulate the heterogeneity of rock mass quality based on the rock mass rating is proposed and applied to a large open-pit mine in Canada. Using geomechanical core logging data collected from the mine site, a direct and an indirect approach were used to model the spatial variability of rock mass quality. The results of the two modelling approaches were validated against collected field data. The study aims to quantify the risks of pit slope failure and provides a measure of uncertainties in spatial variability of rock mass properties in different areas of the pit.

  3. Gstat: a program for geostatistical modelling, prediction and simulation

    NASA Astrophysics Data System (ADS)

    Pebesma, Edzer J.; Wesseling, Cees G.

    1998-01-01

Gstat is a computer program for variogram modelling, and geostatistical prediction and simulation. It provides a generic implementation of the multivariable linear model with trends modelled as a linear function of coordinate polynomials or of user-defined base functions, and independent or dependent, geostatistically modelled, residuals. Simulation in gstat comprises conditional or unconditional (multi-) Gaussian sequential simulation of point values or block averages, or (multi-) indicator sequential simulation. Besides many of the popular options found in other geostatistical software packages, gstat offers the unique combination of (i) an interactive user interface for modelling variograms and generalized covariances (residual variograms), that uses the device-independent plotting program gnuplot for graphical display, (ii) support for several ASCII and binary data and map file formats for input and output, (iii) a concise, intuitive and flexible command language, (iv) user customization of program defaults, (v) no built-in limits, and (vi) free, portable ANSI-C source code. This paper describes the class of problems gstat can solve, and addresses aspects of efficiency and implementation, managing geostatistical projects, and relevant technical details.
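As an example of the variogram models such software fits, the spherical model (a standard choice, also noted elsewhere in these records) can be written down directly. This sketch is generic, not gstat's implementation:

```python
import numpy as np

def spherical(h, nugget, sill, rng_len):
    """Spherical variogram model: gamma rises as 1.5(h/a) - 0.5(h/a)^3
    and levels off at nugget + sill once the lag h reaches the range a.
    By convention gamma(0) = 0; the nugget appears as a jump at h > 0."""
    h = np.asarray(h, dtype=float)
    s = np.minimum(h / rng_len, 1.0)
    gamma = nugget + sill * (1.5 * s - 0.5 * s ** 3)
    return np.where(h == 0.0, 0.0, gamma)
```

Fitting nugget, sill, and range to an empirical variogram is the "variogram modelling" step that precedes kriging or sequential simulation in programs like gstat.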

  4. Geostatistics applied to gas reservoirs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meunier, G.; Coulomb, C.; Laille, J.P.

    1989-09-01

The spatial distribution of many of the physical parameters connected with a gas reservoir is of primary interest to both engineers and geologists throughout the study, development, and operation of a field. It is therefore desirable for the distribution to be capable of statistical interpretation, to have a simple graphical representation, and to allow data to be entered from either two- or three-dimensional grids. To satisfy these needs while dealing with the geographical variables, new methods have been developed under the name geostatistics. This paper describes briefly the theory of geostatistics and its most recent improvements for the specific problem of subsurface description. The external-drift technique has been emphasized in particular, and in addition, four case studies related to gas reservoirs are presented.

  5. Application of geostatistics to coal-resource characterization and mine planning. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kauffman, P.W.; Walton, D.R.; Martuneac, L.

    1981-12-01

Geostatistics is a proven method of ore reserve estimation in many non-coal mining areas but little has been published concerning its application to coal resources. This report presents the case for using geostatistics for coal mining applications and describes how a coal mining concern can best utilize geostatistical techniques for coal resource characterization and mine planning. An overview of the theory of geostatistics is also presented. Many of the applications discussed are documented in case studies that are a part of the report. The results of an exhaustive literature search are presented and recommendations are made for needed future research and demonstration projects.

  6. Delineating Hydrofacies Spatial Distribution by Integrating Ensemble Data Assimilation and Indicator Geostatistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Xuehang; Chen, Xingyuan; Ye, Ming

    2015-07-01

This study develops a new framework of facies-based data assimilation for characterizing the spatial distribution of hydrofacies and estimating their associated hydraulic properties. This framework couples ensemble data assimilation with a transition probability-based geostatistical model via a parameterization based on a level set function. The nature of ensemble data assimilation makes the framework efficient and flexible to be integrated with various types of observation data. The transition probability-based geostatistical model keeps the updated hydrofacies distributions under geological constraints. The framework is illustrated by using a two-dimensional synthetic study that estimates hydrofacies spatial distribution and permeability in each hydrofacies from transient head data. Our results show that the proposed framework can characterize hydrofacies distribution and associated permeability with adequate accuracy even with limited direct measurements of hydrofacies. Our study provides a promising starting point for hydrofacies delineation in complex real problems.

  7. Incorporating reservoir heterogeneity with geostatistics to investigate waterflood recoveries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolcott, D.S.; Chopra, A.K.

    1993-03-01

    This paper presents an investigation of infill drilling performance and reservoir continuity with geostatistics and a reservoir simulator. The geostatistical technique provides many possible realizations and realistic descriptions of reservoir heterogeneity. Correlation between recovery efficiency and thickness of individual sand subunits is shown. Additional recovery from infill drilling results from thin, discontinuous subunits. The technique may be applied to variations in continuity for other sandstone reservoirs.

  8. Geostatistics as a validation tool for setting ozone standards for durum wheat.

    PubMed

    De Marco, Alessandra; Screpanti, Augusto; Paoletti, Elena

    2010-02-01

Which is the best standard for protecting plants from ozone? To answer this question, we must validate the standards by testing biological responses vs. ambient data in the field. A validation is missing for European and USA standards, because the networks for ozone, meteorology and plant responses are spatially independent. We proposed geostatistics as a validation tool, and used durum wheat in central Italy as a test. The standards summarized ozone impact on yield better than hourly averages. Although USA criteria explained ozone-induced yield losses better than European criteria, the USA legal level (75 ppb) protected only 39% of sites. European exposure-based standards protected ≥90%. Reducing the USA level to the Canadian 65 ppb or using W126 protected 91% and 97%, respectively. For a no-threshold accumulated stomatal flux, 22 mmol m⁻² was suggested to protect 97% of sites. In a multiple regression, precipitation explained 22% and ozone explained <0.9% of yield variability.
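The exposure indices compared above are easy to compute from hourly ozone series. The W126 sigmoidal weighting below follows the published US EPA definition (hourly concentrations in ppm); AOT40 accumulates hourly excesses over 40 ppb. Daylight-hour and seasonal windowing rules are omitted here for brevity.

```python
import numpy as np

def w126(ozone_ppm):
    """US EPA W126 exposure index: each hourly concentration (ppm) is
    weighted by a sigmoid that discounts low values and counts high
    values almost fully, then the weighted values are summed."""
    c = np.asarray(ozone_ppm, dtype=float)
    w = 1.0 / (1.0 + 4403.0 * np.exp(-126.0 * c))
    return float(np.sum(c * w))

def aot40(ozone_ppb):
    """European AOT40 exposure index: accumulated hourly excess over
    40 ppb, in ppb-hours."""
    c = np.asarray(ozone_ppb, dtype=float)
    return float(np.sum(np.maximum(c - 40.0, 0.0)))
```

The sigmoid makes W126 nearly insensitive to background concentrations (around 30 ppb the weight is under 1%), while AOT40 applies a hard threshold, which is one reason the two families of standards rank sites differently.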

  9. A MS-lesion pattern discrimination plot based on geostatistics.

    PubMed

    Marschallinger, Robert; Schmidt, Paul; Hofmann, Peter; Zimmer, Claus; Atkinson, Peter M; Sellner, Johann; Trinka, Eugen; Mühlau, Mark

    2016-03-01

    A geostatistical approach to characterize MS-lesion patterns based on their geometrical properties is presented. A dataset of 259 binary MS-lesion masks in MNI space was subjected to directional variography. A model function was fit to express the observed spatial variability in x, y, z directions by the geostatistical parameters Range and Sill. Parameters Range and Sill correlate with MS-lesion pattern surface complexity and total lesion volume. A scatter plot of ln(Range) versus ln(Sill), classified by pattern anisotropy, enables a consistent and clearly arranged presentation of MS-lesion patterns based on geometry: the so-called MS-Lesion Pattern Discrimination Plot. The geostatistical approach and the graphical representation of results are considered efficient exploratory data analysis tools for cross-sectional, follow-up, and medication impact analysis.
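    The directional variography step can be illustrated as follows: a minimal sketch that computes the empirical semivariance of a binary mask along one axis and reads off crude Sill and Range estimates (toy random data; the study's model-fitting procedure is more elaborate):

```python
import numpy as np

def directional_semivariogram(mask, axis=0, max_lag=20):
    """Empirical semivariance gamma(h) of a binary mask along one axis.

    gamma(h) = 0.5 * mean((z(x+h) - z(x))**2) over all voxel pairs
    separated by lag h in the chosen direction.
    """
    z = np.asarray(mask, dtype=float)
    gammas = []
    for h in range(1, max_lag + 1):
        a = np.take(z, np.arange(h, z.shape[axis]), axis=axis)
        b = np.take(z, np.arange(0, z.shape[axis] - h), axis=axis)
        gammas.append(0.5 * np.mean((a - b) ** 2))
    return np.array(gammas)

rng = np.random.default_rng(0)
mask = (rng.random((30, 30, 30)) < 0.2).astype(int)  # toy "lesion" mask
gamma = directional_semivariogram(mask, axis=0)
sill = gamma[-5:].mean()                             # plateau estimate (Sill)
rng_idx = int(np.argmax(gamma >= 0.95 * sill)) + 1   # crude Range estimate (lags are 1-based)
```

    Plotting ln(Range) against ln(Sill) for many masks then yields the discrimination plot described above.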

  10. Geostatistical regularization operators for geophysical inverse problems on irregular meshes

    NASA Astrophysics Data System (ADS)

    Jordi, C.; Doetsch, J.; Günther, T.; Schmelzbach, C.; Robertsson, J. O. A.

    2018-05-01

    Irregular meshes make it possible to include complicated subsurface structures in geophysical modelling and inverse problems. The non-uniqueness of these inverse problems requires appropriate regularization that can incorporate a priori information. However, defining regularization operators for irregular discretizations is not trivial. Different schemes for calculating smoothness operators on irregular meshes have been proposed. In contrast to classical regularization constraints that are defined using only the nearest neighbours of a cell, geostatistical operators include a larger neighbourhood around a particular cell. A correlation model defines the extent of the neighbourhood and allows information about geological structures to be incorporated. We propose an approach to calculate geostatistical operators for inverse problems on irregular meshes by eigendecomposition of a covariance matrix that contains the a priori geological information. Using our approach, the calculation of the operator matrix becomes tractable for 3-D inverse problems on irregular meshes. We tested the performance of the geostatistical regularization operators and compared them against the results of anisotropic smoothing in inversions of 2-D synthetic surface electrical resistivity tomography (ERT) data as well as in the inversion of a realistic 3-D synthetic cross-well ERT scenario. The inversions of 2-D ERT and seismic traveltime field data with geostatistical regularization provide results that are in good accordance with the expected geology and thus facilitate their interpretation. In particular, for layered structures the geostatistical regularization provides geologically more plausible results than the anisotropic smoothness constraints.
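    A sketch of the core idea: build a regularization operator from the eigendecomposition of a covariance matrix so that the penalty ||Wm||² equals mᵀC⁻¹m. The exponential correlation model and random cell centroids below are illustrative stand-ins, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
centroids = rng.random((50, 2)) * 100.0  # cell centres of an irregular mesh (toy)
d = np.linalg.norm(centroids[:, None, :] - centroids[None, :, :], axis=2)

# An exponential correlation model encodes the a priori geological correlation length.
corr_len = 25.0
C = np.exp(-d / corr_len) + 1e-8 * np.eye(len(centroids))  # jitter for stability

# Eigendecomposition C = Q diag(L) Q^T gives a square-root factor of C^{-1}:
# with W = diag(1/sqrt(L)) Q^T, the regularization term ||W m||^2 = m^T C^{-1} m.
L, Q = np.linalg.eigh(C)
W = (Q / np.sqrt(L)).T  # each row of Q^T scaled by 1 / sqrt(eigenvalue)
```

    The operator W then plugs into the inversion's model-penalty term in place of a nearest-neighbour smoothness matrix.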

  11. Geostatistical enhancement of european hydrological predictions

    NASA Astrophysics Data System (ADS)

    Pugliese, Alessio; Castellarin, Attilio; Parajka, Juraj; Arheimer, Berit; Bagli, Stefano; Mazzoli, Paolo; Montanari, Alberto; Blöschl, Günter

    2016-04-01

    Geostatistical Enhancement of European Hydrological Prediction (GEEHP) is a research experiment developed within the EU-funded SWITCH-ON project, which proposes to conduct comparative experiments in a virtual laboratory in order to share water-related information and tackle changes in the hydrosphere for operational needs (http://www.water-switch-on.eu). The main objective of GEEHP is the prediction of streamflow indices and signatures in ungauged basins at different spatial scales. In particular, among several possible hydrological signatures we focus in our experiment on the prediction of flow-duration curves (FDCs) along the stream network, which has attracted increasing scientific attention in recent decades due to the large number of practical and technical applications of the curves (e.g. hydropower potential estimation, riverine habitat suitability and ecological assessments, etc.). We apply a geostatistical procedure based on Top-kriging, which has recently been shown to be a particularly reliable and easy-to-use regionalization approach, employing two different types of streamflow data: pan-European E-HYPE simulations (http://hypeweb.smhi.se/europehype) and observed daily streamflow series collected in two pilot study regions, i.e. Tyrol (merging data from the Austrian and Italian stream gauging networks) and Sweden. The merger of the two study regions results in a rather large area (~450,000 km²) and might be considered a proxy for a pan-European application of the approach. In a first phase, we implement a bidirectional validation, i.e. E-HYPE catchments are set as training sites to predict FDCs at the same sites where observed data are available, and vice-versa. Such a validation procedure reveals (1) the usability of the proposed approach for predicting the FDCs over the entire river network of interest using alternatively observed data and E-HYPE simulations and (2) the accuracy of E-HYPE-based predictions of FDCs in ungauged sites. In a

  12. Geostatistical applications in ground-water modeling in south-central Kansas

    USGS Publications Warehouse

    Ma, T.-S.; Sophocleous, M.; Yu, Y.-S.

    1999-01-01

    This paper emphasizes the supportive role of geostatistics in applying ground-water models. Field data of 1994 ground-water level, bedrock, and saltwater-freshwater interface elevations in south-central Kansas were collected and analyzed using the geostatistical approach. Ordinary kriging was adopted to estimate initial conditions for ground-water levels and topography of the Permian bedrock at the nodes of a finite difference grid used in a three-dimensional numerical model. Cokriging was used to estimate initial conditions for the saltwater-freshwater interface. An assessment of uncertainties in the estimated data is presented. The kriged and cokriged estimation variances were analyzed to evaluate the adequacy of data employed in the modeling. Although water levels and bedrock elevations are well described by spherical semivariogram models, additional data are required for better cokriging estimation of the interface data. The geostatistically analyzed data were employed in a numerical model of the Siefkes site in the project area. Results indicate that the computed chloride concentrations and ground-water drawdowns reproduced the observed data satisfactorily.
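    Ordinary kriging, as used here for water levels and bedrock topography, can be sketched in a few lines with a spherical semivariogram (illustrative parameter values, not those of the study):

```python
import numpy as np

def spherical(h, sill=1.0, rng_a=500.0, nugget=0.0):
    """Spherical semivariogram model (the model class fitted in the study)."""
    h = np.asarray(h, dtype=float)
    g = nugget + sill * (1.5 * h / rng_a - 0.5 * (h / rng_a) ** 3)
    return np.where(h < rng_a, g, nugget + sill)

def ordinary_kriging(xy, z, x0, **vario):
    """Ordinary kriging estimate and kriging variance at one target point x0."""
    n = len(z)
    d = np.linalg.norm(xy[:, None] - xy[None, :], axis=2)
    A = np.empty((n + 1, n + 1))
    A[:n, :n] = spherical(d, **vario)      # semivariances between data points
    A[n, :n] = A[:n, n] = 1.0              # unbiasedness constraint (weights sum to 1)
    A[n, n] = 0.0
    b = np.append(spherical(np.linalg.norm(xy - x0, axis=1), **vario), 1.0)
    w = np.linalg.solve(A, b)              # kriging weights + Lagrange multiplier
    return w[:n] @ z, w @ b                # estimate, kriging variance

# four hypothetical water-level observations (m) and an estimate at a grid node
xy = np.array([[0.0, 0.0], [0.0, 400.0], [400.0, 0.0], [300.0, 300.0]])
z = np.array([1.0, 2.0, 3.0, 2.5])
est, var = ordinary_kriging(xy, z, np.array([150.0, 150.0]), sill=1.0, rng_a=500.0)
```

    Evaluating this at every node of the finite difference grid produces both the initial-condition surface and the estimation-variance map discussed above.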

  13. Hydrogeologic unit flow characterization using transition probability geostatistics.

    PubMed

    Jones, Norman L; Walker, Justin R; Carle, Steven F

    2005-01-01

    This paper describes a technique for applying the transition probability geostatistics method for stochastic simulation to a MODFLOW model. Transition probability geostatistics has some advantages over traditional indicator kriging methods including a simpler and more intuitive framework for interpreting geologic relationships and the ability to simulate juxtapositional tendencies such as fining upward sequences. The indicator arrays generated by the transition probability simulation are converted to layer elevation and thickness arrays for use with the new Hydrogeologic Unit Flow package in MODFLOW 2000. This makes it possible to preserve complex heterogeneity while using reasonably sized grids and/or grids with nonuniform cell thicknesses.
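    The conversion from indicator arrays to layer elevation and thickness arrays can be illustrated for a single vertical column (a simplified sketch; the HUF package bookkeeping in the paper is more involved):

```python
def column_to_huf(indicators, top, dz):
    """Convert one vertical column of material indicator codes into
    (unit, top_elevation, thickness) tuples, ordered top-down.

    `indicators` lists the material code of each model cell from the top of
    the column downwards; `top` is the elevation of the column top and `dz`
    the (uniform) cell thickness.
    """
    units = []
    elev = top
    i = 0
    while i < len(indicators):
        j = i
        while j < len(indicators) and indicators[j] == indicators[i]:
            j += 1  # extend the run of identical indicator codes
        thickness = (j - i) * dz
        units.append((indicators[i], elev, thickness))
        elev -= thickness
        i = j
    return units

# e.g. sand (1) over a clay lens (2) over sand again:
col = [1, 1, 2, 2, 2, 1]
units = column_to_huf(col, top=100.0, dz=0.5)  # -> [(1, 100.0, 1.0), (2, 99.0, 1.5), (1, 97.5, 0.5)]
```

    Repeating this per column, then gathering the elevations and thicknesses of each unit across columns, yields the nonuniform-thickness layer arrays the HUF package consumes.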

  14. ON THE GEOSTATISTICAL APPROACH TO THE INVERSE PROBLEM. (R825689C037)

    EPA Science Inventory

    Abstract

    The geostatistical approach to the inverse problem is discussed with emphasis on the importance of structural analysis. Although the geostatistical approach is occasionally misconstrued as mere cokriging, in fact it consists of two steps: estimation of statist...

  15. Integrated geostatistics for modeling fluid contacts and shales in Prudhoe Bay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perez, G.; Chopra, A.K.; Severson, C.D.

    1997-12-01

    Geostatistics techniques are being used increasingly to model reservoir heterogeneity at a wide range of scales. A variety of techniques is now available with differing underlying assumptions, complexity, and applications. This paper introduces a novel method of geostatistics to model dynamic gas-oil contacts and shales in the Prudhoe Bay reservoir. The method integrates reservoir description and surveillance data within the same geostatistical framework. Surveillance logs and shale data are transformed to indicator variables. These variables are used to evaluate vertical and horizontal spatial correlation and cross-correlation of gas and shale at different times and to develop variogram models. Conditional simulation techniques are used to generate multiple three-dimensional (3D) descriptions of gas and shales that provide a measure of uncertainty. These techniques capture the complex 3D distribution of gas-oil contacts through time. The authors compare results of the geostatistical method with conventional techniques as well as with infill wells drilled after the study. Predicted gas-oil contacts and shale distributions are in close agreement with gas-oil contacts observed at infill wells.

  16. Applications of geostatistics and Markov models for logo recognition

    NASA Astrophysics Data System (ADS)

    Pham, Tuan

    2003-01-01

    Spatial covariances based on geostatistics are extracted as representative features of logo or trademark images. These spatial covariances are different from other statistical features for image analysis in that the structural information of an image is independent of the pixel locations and represented in terms of spatial series. We then design a classifier in the sense of hidden Markov models to make use of these geostatistical sequential data to recognize the logos. High recognition rates are obtained from testing the method against a public-domain logo database.

  17. Spatial analysis of groundwater levels using Fuzzy Logic and geostatistical tools

    NASA Astrophysics Data System (ADS)

    Theodoridou, P. G.; Varouchakis, E. A.; Karatzas, G. P.

    2017-12-01

    The spatial variability evaluation of the water table of an aquifer provides useful information for water resources management plans. Geostatistical methods are often employed to map the free surface of an aquifer. In geostatistical analysis using Kriging techniques, the selection of the optimal variogram is very important for optimal method performance. This work compares three different criteria for assessing the theoretical variogram that fits the experimental one: the Least Squares Sum method, the Akaike Information Criterion and Cressie's Indicator. Moreover, variable distance metrics such as the Euclidean, Minkowski, Manhattan, Canberra and Bray-Curtis metrics are applied to calculate the distance between the observation and prediction points, which affects both the variogram calculation and the Kriging estimator. A Fuzzy Logic System is then applied to define the appropriate neighbors for each estimation point used in the Kriging algorithm. The two criteria used during the Fuzzy Logic process are the distance between observation and estimation points and the groundwater level value at each observation point. The proposed techniques are applied to a data set of 250 hydraulic head measurements distributed over an alluvial aquifer. The analysis showed that the Power-law variogram model and the Manhattan distance metric within ordinary kriging provide the best results when the comprehensive geostatistical analysis process is applied. On the other hand, the Fuzzy Logic approach leads to a Gaussian variogram model and significantly improves the estimation performance. The two different variogram models can be explained in terms of a fractional Brownian motion approach and of aquifer behavior at the local scale. Finally, maps of hydraulic head spatial variability and of prediction uncertainty are constructed for the area with the two different approaches, comparing their advantages and drawbacks.
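    Two of the three criteria, the Least Squares Sum and the AIC, can be sketched for a power-law variogram model fitted to a toy experimental variogram (illustrative only; the study's fitting procedure differs):

```python
import numpy as np

def aic(rss, n, k):
    """Akaike Information Criterion for a least-squares fit
    (n variogram points, k model parameters)."""
    return n * np.log(rss / n) + 2 * k

# toy experimental variogram: (lag, semivariance) pairs near gamma(h) = 0.8 * h^0.5
lags = np.arange(1.0, 11.0)
gexp = 0.8 * lags ** 0.5 + np.random.default_rng(2).normal(0, 0.02, lags.size)

def fit_power(lags, gexp):
    """Fit gamma(h) = c * h^alpha by linear regression in log-log space,
    then report the residual sum of squares (the Least Squares Sum criterion)."""
    alpha, logc = np.polyfit(np.log(lags), np.log(gexp), 1)
    c = np.exp(logc)
    rss = np.sum((gexp - c * lags ** alpha) ** 2)
    return c, alpha, rss

c, alpha, rss = fit_power(lags, gexp)
score = aic(rss, lags.size, 2)  # compare against the score of a competing model
```

    Repeating the fit for each candidate model (power-law, Gaussian, spherical, ...) and comparing RSS or AIC selects the theoretical variogram.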

  18. TiConverter: A training image converting tool for multiple-point geostatistics

    NASA Astrophysics Data System (ADS)

    Fadlelmula F., Mohamed M.; Killough, John; Fraim, Michael

    2016-11-01

    TiConverter is a tool developed to ease the application of multiple-point geostatistics whether by the open source Stanford Geostatistical Modeling Software (SGeMS) or other available commercial software. TiConverter has a user-friendly interface and it allows the conversion of 2D training images into numerical representations in four different file formats without the need for additional code writing. These are the ASCII (.txt), the geostatistical software library (GSLIB) (.txt), the Isatis (.dat), and the VTK formats. It performs the conversion based on the RGB color system. In addition, TiConverter offers several useful tools including image resizing, smoothing, and segmenting tools. The purpose of this study is to introduce the TiConverter, and to demonstrate its application and advantages with several examples from the literature.
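    The RGB-based conversion TiConverter performs can be approximated in a few lines: map each pixel to its nearest palette colour to obtain an integer facies grid, then write the simple GSLIB grid format (a sketch of the format as commonly written, not TiConverter's own code):

```python
import os
import tempfile

import numpy as np

def rgb_to_facies(img, palette):
    """Map each pixel of an RGB array (ny, nx, 3) to the index of the nearest
    palette colour, yielding an integer facies grid for MPS simulation."""
    img = np.asarray(img, dtype=float)
    palette = np.asarray(palette, dtype=float)
    dist = np.linalg.norm(img[:, :, None, :] - palette[None, None, :, :], axis=3)
    return np.argmin(dist, axis=2)

def write_gslib(path, grid, name="facies"):
    """Write a 2-D integer grid in the simple GSLIB grid format
    (title line, number of variables, variable name, one value per line)."""
    ny, nx = grid.shape
    with open(path, "w") as f:
        f.write(f"training image ({nx} x {ny})\n1\n{name}\n")
        for v in grid.ravel():
            f.write(f"{v}\n")

palette = [(0, 0, 0), (255, 255, 255)]           # black = channel, white = background
img = np.zeros((4, 4, 3)); img[1:3, :, :] = 255  # toy image: a white band
facies = rgb_to_facies(img, palette)

path = os.path.join(tempfile.gettempdir(), "ti_demo.gslib")
write_gslib(path, facies)
n_lines = sum(1 for _ in open(path))  # 3 header lines + one line per cell
```

    The resulting file can be read directly by SGeMS; the other formats differ only in header conventions.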

  19. EFFICIENT MODEL-FITTING AND MODEL-COMPARISON FOR HIGH-DIMENSIONAL BAYESIAN GEOSTATISTICAL MODELS. (R826887)

    EPA Science Inventory

    Geostatistical models are appropriate for spatially distributed data measured at irregularly spaced locations. We propose an efficient Markov chain Monte Carlo (MCMC) algorithm for fitting Bayesian geostatistical models with substantial numbers of unknown parameters to sizable...

  20. Geostatistical regularization of inverse models for the retrieval of vegetation biophysical variables

    NASA Astrophysics Data System (ADS)

    Atzberger, C.; Richter, K.

    2009-09-01

    The robust and accurate retrieval of vegetation biophysical variables using radiative transfer models (RTM) is seriously hampered by the ill-posedness of the inverse problem. With this research we further develop our previously published (object-based) inversion approach [Atzberger (2004)]. The object-based RTM inversion takes advantage of the geostatistical fact that the biophysical characteristics of nearby pixels are generally more similar than those at a larger distance. A two-step inversion based on PROSPECT+SAIL generated look-up tables is presented that can be easily implemented and adapted to other radiative transfer models. The approach takes into account the spectral signatures of neighboring pixels and optimizes a common value of the average leaf angle (ALA) for all pixels of a given image object, such as an agricultural field. Using a large set of leaf area index (LAI) measurements (n = 58) acquired over six different crops of the Barrax test site (Spain), we demonstrate that the proposed geostatistical regularization yields in most cases more accurate and spatially consistent results compared to the traditional (pixel-based) inversion. Pros and cons of the approach are discussed and possible future extensions presented.

  1. Analysis of dengue fever risk using geostatistics model in bone regency

    NASA Astrophysics Data System (ADS)

    Amran, Stang, Mallongi, Anwar

    2017-03-01

    This research aims to analyze dengue fever risk based on a geostatistical model in Bone Regency. Risk levels of dengue fever are denoted by the parameter of a Binomial distribution. The effects of temperature, rainfall, elevation, and larva abundance are investigated through the geostatistical model. A Bayesian hierarchical method is used in the estimation process. Using dengue fever data from eleven locations, this research shows that temperature and rainfall have a significant effect on dengue fever risk in Bone Regency.

  2. Hydrogeologic Unit Flow Characterization Using Transition Probability Geostatistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, N L; Walker, J R; Carle, S F

    2003-11-21

    This paper describes a technique for applying the transition probability geostatistics method for stochastic simulation to a MODFLOW model. Transition probability geostatistics has several advantages over traditional indicator kriging methods, including a simpler and more intuitive framework for interpreting geologic relationships and the ability to simulate juxtapositional tendencies such as fining-upwards sequences. The indicator arrays generated by the transition probability simulation are converted to layer elevation and thickness arrays for use with the new Hydrogeologic Unit Flow (HUF) package in MODFLOW 2000. This makes it possible to preserve complex heterogeneity while using reasonably sized grids. An application of the technique involving probabilistic capture zone delineation for the Aberjona Aquifer in Woburn, MA, is included.

  3. Model-Based Geostatistical Mapping of the Prevalence of Onchocerca volvulus in West Africa

    PubMed Central

    O’Hanlon, Simon J.; Slater, Hannah C.; Cheke, Robert A.; Boatin, Boakye A.; Coffeng, Luc E.; Pion, Sébastien D. S.; Boussinesq, Michel; Zouré, Honorat G. M.; Stolk, Wilma A.; Basáñez, María-Gloria

    2016-01-01

    Background The initial endemicity (pre-control prevalence) of onchocerciasis has been shown to be an important determinant of the feasibility of elimination by mass ivermectin distribution. We present the first geostatistical map of microfilarial prevalence in the former Onchocerciasis Control Programme in West Africa (OCP) before commencement of antivectorial and antiparasitic interventions. Methods and Findings Pre-control microfilarial prevalence data from 737 villages across the 11 constituent countries in the OCP epidemiological database were used as ground-truth data. These 737 data points, plus a set of statistically selected environmental covariates, were used in a Bayesian model-based geostatistical (B-MBG) approach to generate a continuous surface (at pixel resolution of 5 km × 5 km) of microfilarial prevalence in West Africa prior to the commencement of the OCP. Uncertainty in model predictions was measured using a suite of validation statistics, performed on bootstrap samples of held-out validation data. The mean Pearson’s correlation between observed and estimated prevalence at validation locations was 0.693; the mean prediction error (average difference between observed and estimated values) was 0.77%, and the mean absolute prediction error (average magnitude of difference between observed and estimated values) was 12.2%. Within OCP boundaries, 17.8 million people were deemed to have been at risk, 7.55 million to have been infected, and mean microfilarial prevalence to have been 45% (range: 2–90%) in 1975. Conclusions and Significance This is the first map of initial onchocerciasis prevalence in West Africa using B-MBG. Important environmental predictors of infection prevalence were identified and used in a model out-performing those without spatial random effects or environmental covariates. Results may be compared with recent epidemiological mapping efforts to find areas of persisting transmission. These methods may be extended to areas where
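    The validation statistics quoted (Pearson's correlation, mean prediction error, mean absolute prediction error) are straightforward to compute; a sketch with synthetic data standing in for the held-out prevalence values:

```python
import numpy as np

def validation_stats(obs, est):
    """Pearson's r, mean error (bias) and mean absolute error between
    observed and estimated prevalence at held-out validation locations."""
    obs, est = np.asarray(obs, float), np.asarray(est, float)
    r = np.corrcoef(obs, est)[0, 1]
    me = np.mean(est - obs)        # mean prediction error (bias)
    mae = np.mean(np.abs(est - obs))
    return r, me, mae

# synthetic stand-in: observed prevalence (%) at 737 villages and predictions
# carrying roughly the error magnitude reported above
rng = np.random.default_rng(6)
obs = rng.uniform(0, 90, 737)
est = obs + rng.normal(0, 12, obs.size)
r, me, mae = validation_stats(obs, est)
```

    In the study these statistics were computed on bootstrap samples of the held-out data, giving distributions rather than single values.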

  4. Monte Carlo Analysis of Reservoir Models Using Seismic Data and Geostatistical Models

    NASA Astrophysics Data System (ADS)

    Zunino, A.; Mosegaard, K.; Lange, K.; Melnikova, Y.; Hansen, T. M.

    2013-12-01

    We present a study on the analysis of petroleum reservoir models consistent with seismic data and geostatistical constraints, performed on a synthetic reservoir model. Our aim is to invert directly for the structure and rock bulk properties of the target reservoir zone. To infer the rock facies, porosity and oil saturation, seismology alone is not sufficient; a rock physics model, which links the unknown properties to the elastic parameters, must be taken into account. We then combine a rock physics model with a simple convolutional approach for seismic waves to invert the "measured" seismograms. To solve this inverse problem, we employ a Markov chain Monte Carlo (MCMC) method, because it offers the possibility to handle non-linearity and complex, multi-step forward models, and provides realistic estimates of uncertainties. However, for large data sets the MCMC method may be impractical because of a very high computational demand. One strategy to face this challenge is to feed the algorithm with realistic models, hence relying on proper prior information. To address this problem, we utilize an algorithm drawn from geostatistics to generate geologically plausible models which represent samples of the prior distribution. The geostatistical algorithm learns the multiple-point statistics from prototype models (in the form of training images), then generates thousands of different models which are accepted or rejected by a Metropolis sampler. To further reduce the computation time we parallelize the software and run it on multi-core machines. The solution of the inverse problem is then represented by a collection of reservoir models in terms of facies, porosity and oil saturation, which constitute samples of the posterior distribution. We are finally able to produce probability maps of the properties of interest by performing statistical analysis on the collection of solutions.
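    The accept/reject loop described, with a geostatistical prior sampler feeding a Metropolis step, can be sketched as follows (a toy stand-in: random binary columns instead of MPS realizations, and a smoothing kernel as the "convolutional" forward model):

```python
import numpy as np

rng = np.random.default_rng(3)

def forward(model):
    """Toy 'convolutional' forward model: smooth the reflectivity-like column."""
    kernel = np.array([0.25, 0.5, 0.25])
    return np.convolve(model, kernel, mode="same")

def log_likelihood(model, data, sigma=0.05):
    r = forward(model) - data
    return -0.5 * np.sum(r ** 2) / sigma ** 2

def draw_prior():
    """Stand-in for the MPS prior sampler: random binary facies columns."""
    return (rng.random(40) < 0.3).astype(float)

truth = draw_prior()
data = forward(truth) + rng.normal(0, 0.05, truth.size)

# independence Metropolis: propose from the prior, accept on the likelihood ratio
current = draw_prior()
ll = log_likelihood(current, data)
samples = []
for _ in range(2000):
    prop = draw_prior()
    ll_prop = log_likelihood(prop, data)
    if np.log(rng.random()) < ll_prop - ll:
        current, ll = prop, ll_prop
    samples.append(current)
post_mean = np.mean(samples, axis=0)  # per-cell posterior facies probability
```

    Averaging the retained models cell by cell is what produces the probability maps mentioned at the end of the abstract.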

  5. Geostatistics for environmental and geotechnical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rouhani, S.; Srivastava, R.M.; Desbarats, A.J.

    1996-12-31

    This conference was held January 26--27, 1995 in Phoenix, Arizona. The purpose of this conference was to provide a multidisciplinary forum for exchange of state-of-the-art information on the technology of geostatistics and its applicability for environmental studies, especially site characterization. Individual papers have been processed separately for inclusion in the appropriate data bases.

  6. Combining geostatistics with Moran's I analysis for mapping soil heavy metals in Beijing, China.

    PubMed

    Huo, Xiao-Ni; Li, Hong; Sun, Dan-Feng; Zhou, Lian-Di; Li, Bao-Guo

    2012-03-01

    Production of high quality interpolation maps of heavy metals is important for risk assessment of environmental pollution. In this paper, the spatial correlation information obtained from Moran's I analysis was used to supplement traditional geostatistics. According to the Moran's I analysis, four characteristic distances were obtained and used as the active lag distances to calculate the semivariance. Validation of the optimality of the semivariance demonstrated that using the two distances at which Moran's I and the standardized Moran's I, Z(I), reached a maximum as the active lag distance can improve the fitting accuracy of the semivariance. Then, spatial interpolation was produced based on the two distances and their nested model. The comparative analysis of estimation accuracy and of the measured and predicted pollution status showed that the method combining geostatistics with Moran's I analysis was better than traditional geostatistics. Thus, Moran's I analysis is a useful complement to geostatistics for improving the spatial interpolation accuracy of heavy metals.
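    Moran's I with distance-band weights, the quantity used above to pick the active lag distances, can be sketched as follows (synthetic trend data for illustration):

```python
import numpy as np

def morans_i(xy, z, dmin, dmax):
    """Moran's I with binary distance-band weights:
    w_ij = 1 if dmin < ||x_i - x_j|| <= dmax (i != j), else 0."""
    z = np.asarray(z, dtype=float)
    dev = z - z.mean()
    d = np.linalg.norm(xy[:, None] - xy[None, :], axis=2)
    w = ((d > dmin) & (d <= dmax)).astype(float)
    np.fill_diagonal(w, 0.0)
    n = len(z)
    return (n / w.sum()) * (dev @ w @ dev) / (dev @ dev)

rng = np.random.default_rng(4)
xy = rng.random((200, 2)) * 100.0
z = xy[:, 0] / 100.0 + rng.normal(0, 0.05, 200)  # strong west-east trend + noise

i_short = morans_i(xy, z, 0.0, 10.0)   # near pairs: similar values, high I
i_long = morans_i(xy, z, 60.0, 80.0)   # far pairs: weaker or negative autocorrelation
```

    Scanning distance bands and locating the maxima of I (and of its standardized form Z(I)) gives the characteristic distances fed into the semivariance calculation.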

  9. Geostatistical noise filtering of geophysical images: application to unexploded ordnance (UXO) sites.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saito, Hirotaka; McKenna, Sean Andrew; Coburn, Timothy C.

    2004-07-01

    Geostatistical and non-geostatistical noise filtering methodologies (factorial kriging and a low-pass filter, respectively) and a region-growing method are applied to analytic-signal magnetometer images at two UXO-contaminated sites to delineate UXO target areas. Overall delineation performance is improved by removing background noise. Factorial kriging slightly outperforms the low-pass filter, but there is no distinct difference between them in terms of finding anomalies of interest.

  8. Use of geostatistics for remediation planning to transcend urban political boundaries.

    PubMed

    Milillo, Tammy M; Sinha, Gaurav; Gardella, Joseph A

    2012-11-01

    Soil remediation plans are often dictated by areas of jurisdiction or property lines instead of scientific information. This study exemplifies how geostatistically interpolated surfaces can substantially improve remediation planning. Ordinary kriging, ordinary co-kriging, and inverse distance weighting spatial interpolation methods were compared for analyzing surface and sub-surface soil sample data originally collected by the US EPA and researchers at the University at Buffalo in Hickory Woods, an industrial-residential neighborhood in Buffalo, NY, where both lead and arsenic contamination is present. Past clean-up efforts estimated contamination levels from point samples, but parcel and agency jurisdiction boundaries were used to define remediation sites, rather than geostatistical models estimating the spatial behavior of the contaminants in the soil. Residents were understandably dissatisfied with the arbitrariness of the remediation plan. In this study we show how geostatistical mapping and participatory assessment can make soil remediation scientifically defensible, socially acceptable, and economically feasible. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Estimation of geotechnical parameters on the basis of geophysical methods and geostatistics

    NASA Astrophysics Data System (ADS)

    Brom, Aleksander; Natonik, Adrianna

    2017-12-01

    The paper presents a possible implementation of ordinary cokriging combined with geophysical investigation of humidity data acquired in geotechnical studies. The authors describe the concept of geostatistics, the terminology of geostatistical modelling, spatial correlation functions, the principles of solving cokriging systems, the advantages of (co-)kriging in comparison with other interpolation methods, and the obstacles in this type of analysis. Cross-validation and a discussion of the results are presented, with an indication of the prospects for applying similar procedures in other studies.

  10. Geostatistical Interpolation of Particle-Size Curves in Heterogeneous Aquifers

    NASA Astrophysics Data System (ADS)

    Guadagnini, A.; Menafoglio, A.; Secchi, P.

    2013-12-01

    We address the problem of predicting the spatial field of particle-size curves (PSCs) from measurements associated with soil samples collected at a discrete set of locations within an aquifer system. Proper estimates of the full PSC are relevant to applications related to groundwater hydrology, soil science and geochemistry and aimed at modeling physical and chemical processes occurring in heterogeneous earth systems. Hence, we focus on providing kriging estimates of the entire PSC at unsampled locations. To this end, we treat particle-size curves as cumulative distribution functions, model their densities as functional compositional data and analyze them by embedding these into the Hilbert space of compositional functions endowed with the Aitchison geometry. On this basis, we develop a new geostatistical methodology for the analysis of spatially dependent functional compositional data. Our functional compositional kriging (FCK) approach allows providing predictions at unsampled location of the entire particle-size curve, together with a quantification of the associated uncertainty, by fully exploiting both the functional form of the data and their compositional nature. This is a key advantage of our approach with respect to traditional methodologies, which treat only a set of selected features (e.g., quantiles) of PSCs. Embedding the full PSC into a geostatistical analysis enables one to provide a complete characterization of the spatial distribution of lithotypes in a reservoir, eventually leading to improved predictions of soil hydraulic attributes through pedotransfer functions as well as of soil geochemical parameters which are relevant in sorption/desorption and cation exchange processes. We test our new method on PSCs sampled along a borehole located within an alluvial aquifer near the city of Tuebingen, Germany. The quality of FCK predictions is assessed through leave-one-out cross-validation. 
A comparison between hydraulic conductivity estimates obtained
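    The compositional embedding behind functional compositional kriging can be illustrated with the centred log-ratio (clr) transform on a discretized particle-size density: predictions are made in clr space, where ordinary geostatistical tools apply, and closed back onto the simplex (a simplified finite-dimensional sketch of the Aitchison-geometry idea, not the authors' functional method):

```python
import numpy as np

def clr(p):
    """Centred log-ratio transform of a discretized particle-size density.
    Maps the simplex (positive components summing to 1) into a vector space."""
    p = np.asarray(p, dtype=float)
    g = np.exp(np.mean(np.log(p)))  # geometric mean of the parts
    return np.log(p / g)

def clr_inv(y):
    """Inverse clr: exponentiate and close back to a density summing to 1."""
    p = np.exp(y)
    return p / p.sum()

# two discretized particle-size densities (sand-rich vs clay-rich sample)
p1 = np.array([0.6, 0.3, 0.1])
p2 = np.array([0.1, 0.3, 0.6])

# a 'kriged' prediction in clr space, here simply an equal-weight average:
pred = clr_inv(0.5 * (clr(p1) + clr(p2)))
```

    Averaging in clr space rather than averaging the raw fractions is what guarantees the predicted curve is itself a valid density.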

  11. Combining Geostatistics with Moran’s I Analysis for Mapping Soil Heavy Metals in Beijing, China

    PubMed Central

    Huo, Xiao-Ni; Li, Hong; Sun, Dan-Feng; Zhou, Lian-Di; Li, Bao-Guo

    2012-01-01

    Production of high quality interpolation maps of heavy metals is important for risk assessment of environmental pollution. In this paper, the spatial correlation information obtained from Moran’s I analysis was used to supplement traditional geostatistics. According to the Moran’s I analysis, four characteristic distances were obtained and used as the active lag distances to calculate the semivariance. Validation of the optimality of the semivariance demonstrated that using the two distances at which Moran’s I and the standardized Moran’s I, Z(I), reached a maximum as the active lag distance can improve the fitting accuracy of the semivariance. Then, spatial interpolation was produced based on the two distances and their nested model. The comparative analysis of estimation accuracy and of the measured and predicted pollution status showed that the method combining geostatistics with Moran’s I analysis was better than traditional geostatistics. Thus, Moran’s I analysis is a useful complement to geostatistics for improving the spatial interpolation accuracy of heavy metals. PMID:22690179

  12. Assessment and modeling of the groundwater hydrogeochemical quality parameters via geostatistical approaches

    NASA Astrophysics Data System (ADS)

    Karami, Shawgar; Madani, Hassan; Katibeh, Homayoon; Fatehi Marj, Ahmad

    2018-03-01

    Geostatistical methods are one of the advanced techniques used for interpolation of groundwater quality data. The results obtained from geostatistics will be useful for decision makers to adopt suitable remedial measures to protect the quality of groundwater sources. Data used in this study were collected from 78 wells in the Varamin plain aquifer located in the southeast of Tehran, Iran, in 2013. The ordinary kriging method was used in this study to evaluate groundwater quality parameters. Seven main quality parameters, i.e. total dissolved solids (TDS), sodium adsorption ratio (SAR), electrical conductivity (EC), sodium (Na⁺), total hardness (TH), chloride (Cl⁻) and sulfate (SO₄²⁻), have been analyzed and interpreted by statistical and geostatistical methods. After data normalization by the Nscore method in WinGslib software, variography as a geostatistical tool to define spatial correlation was compiled and experimental variograms were plotted in GS+ software. Then, the best theoretical model was fitted to each variogram based on the minimum RSS. The cross-validation method was used to determine the accuracy of the estimated data. Eventually, estimation maps of groundwater quality were prepared in WinGslib software, and an estimation variance map and an estimation error map were presented to evaluate the quality of estimation at each estimated point. Results showed that the kriging method is more accurate than traditional interpolation methods.

  13. Breast carcinoma, intratumour heterogeneity and histological grading, using geostatistics.

    PubMed

    Sharifi-Salamatian, V; de Roquancourt, A; Rigaut, J P

    2000-01-01

    Tumour progression is currently believed to result from genetic instability. Chromosomal patterns specific to a type of cancer are frequent, even though phenotypic spatial heterogeneity is omnipresent. The latter is the usual cause of imprecision in histological grading, a well documented problem without any fully satisfactory solution up to now. The present article addresses this problem in breast carcinoma. The assessment of a genetic marker for human tumours requires quantifiable measures of intratumoral heterogeneity. If an invariance paradigm representing a stochastic or geostatistical function could be discovered, it might help in solving the grading problem. A novel methodological approach using geostatistics to measure heterogeneity is used. Twenty tumours from the three usual (Scarff-Bloom and Richardson) grades were obtained, and paraffin sections were stained for MIB-1 (Ki-67) with peroxidase. Whole two-dimensional sections were sampled. Morphometric grids of variable sizes allowed simple and fast recording of the positions of epithelial nuclei, marked or not by MIB-1. The geostatistical method is based here upon the asymptotic behaviour of the dispersion variance. The measured asymptotic exponent of the dispersion variance increases from grade 1 to grade 3. Preliminary results are encouraging: grades 1 and 3 on the one hand and 2 and 3 on the other are totally separated. Final proof of an improved grading using this measure will of course require comparison with the results of survival studies.

  14. A multiple-point geostatistical approach to quantifying uncertainty for flow and transport simulation in geologically complex environments

    NASA Astrophysics Data System (ADS)

    Cronkite-Ratcliff, C.; Phelps, G. A.; Boucher, A.

    2011-12-01

    In many geologic settings, the pathways of groundwater flow are controlled by geologic heterogeneities which have complex geometries. Models of these geologic heterogeneities, and consequently, their effects on the simulated pathways of groundwater flow, are characterized by uncertainty. Multiple-point geostatistics, which uses a training image to represent complex geometric descriptions of geologic heterogeneity, provides a stochastic approach to the analysis of geologic uncertainty. Incorporating multiple-point geostatistics into numerical models provides a way to extend this analysis to the effects of geologic uncertainty on the results of flow simulations. We present two case studies to demonstrate the application of multiple-point geostatistics to numerical flow simulation in complex geologic settings with both static and dynamic conditioning data. Both cases involve the development of a training image from a complex geometric description of the geologic environment. Geologic heterogeneity is modeled stochastically by generating multiple equally-probable realizations, all consistent with the training image. Numerical flow simulation for each stochastic realization provides the basis for analyzing the effects of geologic uncertainty on simulated hydraulic response. The first case study is a hypothetical geologic scenario developed using data from the alluvial deposits in Yucca Flat, Nevada. The SNESIM algorithm is used to stochastically model geologic heterogeneity conditioned to the mapped surface geology as well as vertical drill-hole data. Numerical simulation of groundwater flow and contaminant transport through geologic models produces a distribution of hydraulic responses and contaminant concentration results. From this distribution of results, the probability of exceeding a given contaminant concentration threshold can be used as an indicator of uncertainty about the location of the contaminant plume boundary. The second case study considers a

  15. Spatial distribution of Munida intermedia and M. sarsi (crustacea: Anomura) on the Galician continental shelf (NW Spain): Application of geostatistical analysis

    NASA Astrophysics Data System (ADS)

    Freire, J.; González-Gurriarán, E.; Olaso, I.

    1992-12-01

    Geostatistical methodology was used to analyse the spatial structure and distribution of the epibenthic crustaceans Munida intermedia and M. sarsi within sets of data collected during three survey cruises carried out on the Galician continental shelf (1983 and 1984). This study investigates the feasibility of using geostatistics for data collected according to traditional methods and of enhancing such methodology. Experimental variograms were calculated (pooled variance minus spatial covariance between samples taken one pair at a time vs. distance) and fitted to a 'spherical' model. The spatial structure model was used to estimate the abundance and distribution of the populations studied using the technique of kriging. The species display spatial structures, which are well marked during high-density periods and in some areas (especially the northern shelf). Geostatistical analysis allows identification of density gradients in space, as well as patches of 16-25 km diameter for M. intermedia and 12-20 km for M. sarsi along the continental shelf. Patches of both species have a consistent location throughout the different cruises. As in other geographical areas, M. intermedia and M. sarsi usually appear at depths ranging from 200 to 500 m, with the highest densities in the continental shelf area located between Fisterra and Estaca de Bares. Although sampling was not originally designed specifically for geostatistics, this analysis provides a measurement of spatial covariance and shows variograms whose structure varies with population density and geographical area. These ideas are useful in improving the design of future sampling cruises.
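    Fitting a spherical model to an experimental variogram, as described here, can be sketched with a weighted least-squares fit. This is a sketch only: the lag spacing, optional weights, starting values and bounds below are illustrative choices, not the survey's actual settings.

```python
import numpy as np
from scipy.optimize import least_squares

def spherical(h, nugget, sill, rng_):
    """Spherical semivariogram model."""
    h = np.asarray(h, dtype=float)
    return np.where(h < rng_,
                    nugget + (sill - nugget) * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3),
                    sill)

def fit_spherical(lags, gamma, weights=None):
    """Weighted least-squares fit of (nugget, sill, range) to experimental values."""
    w = np.ones_like(gamma) if weights is None else weights

    def residuals(p):
        return np.sqrt(w) * (spherical(lags, *p) - gamma)

    p0 = [0.1 * gamma.max(), gamma.max(), lags.max() / 2]  # rough starting point
    res = least_squares(residuals, p0, bounds=([0, 0, 1e-6], [np.inf, np.inf, np.inf]))
    return res.x  # nugget, sill, range
```

    Pair counts per lag are the usual choice of `weights`, so well-populated lags dominate the fit.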

  16. Introduction to This Special Issue on Geostatistics and Geospatial Techniques in Remote Sensing

    NASA Technical Reports Server (NTRS)

    Atkinson, Peter; Quattrochi, Dale A.; Goodman, H. Michael (Technical Monitor)

    2000-01-01

    The germination of this special Computers & Geosciences (C&G) issue began at the Royal Geographical Society (with the Institute of British Geographers) (RGS-IBG) annual meeting in January 1997 held at the University of Exeter, UK. The snow and cold of the English winter were tempered greatly by warm and cordial discussion of how to stimulate and enhance cooperation on geostatistical and geospatial research in remote sensing 'across the big pond' between UK and US researchers. It was decided that one way forward would be to hold parallel sessions in 1998 on geostatistical and geospatial research in remote sensing at appropriate venues in both the UK and the US. Selected papers given at these sessions would be published as special issues of C&G on the UK side and Photogrammetric Engineering and Remote Sensing (PE&RS) on the US side. These issues would highlight the commonality in research on geostatistical and geospatial research in remote sensing on both sides of the Atlantic Ocean. As a consequence, a session on "Geostatistics and Geospatial Techniques for Remote Sensing of Land Surface Processes" was held at the RGS-IBG annual meeting in Guildford, Surrey, UK in January 1998, organized by the Modeling and Advanced Techniques Special Interest Group (MAT SIG) of the Remote Sensing Society (RSS). A similar session was held at the Association of American Geographers (AAG) annual meeting in Boston, Massachusetts in March 1998, sponsored by the AAG's Remote Sensing Specialty Group (RSSG). The 10 papers that make up this issue of C&G comprise 7 papers from the UK and 3 papers from the US. We are both co-editors of each of the journal special issues, with the lead editor of each journal issue being from their respective side of the Atlantic. The special issue of PE&RS (vol. 65) that constitutes the other half of this co-edited journal series was published in early 1999, comprising 6 papers by US authors.
We are indebted to the International Association for Mathematical

  17. Recent advances in scalable non-Gaussian geostatistics: The generalized sub-Gaussian model

    NASA Astrophysics Data System (ADS)

    Guadagnini, Alberto; Riva, Monica; Neuman, Shlomo P.

    2018-07-01

    Geostatistical analysis was introduced over half a century ago to quantify seemingly random spatial variations in earth quantities such as rock mineral content or permeability. The traditional approach has been to view such quantities as multivariate Gaussian random functions characterized by one or a few well-defined spatial correlation scales. There is, however, mounting evidence that many spatially varying quantities exhibit non-Gaussian behavior over a multiplicity of scales. The purpose of this minireview is not to paint a broad picture of the subject and its treatment in the literature. Instead, we focus on very recent advances in the recognition and analysis of this ubiquitous phenomenon, which transcends hydrology and the Earth sciences, brought about largely by our own work. In particular, we use porosity data from a deep borehole to illustrate typical aspects of such scalable non-Gaussian behavior, describe a very recent theoretical model that (for the first time) captures all these behavioral aspects in a comprehensive manner, show how this allows generating random realizations of the quantity conditional on sampled values, point toward ways of incorporating scalable non-Gaussian behavior in hydrologic analysis, highlight the significance of doing so, and list open questions requiring further research.

  18. Breast Carcinoma, Intratumour Heterogeneity and Histological Grading, Using Geostatistics

    PubMed Central

    Sharifi‐Salamatian, Vénus; de Roquancourt, Anne; Rigaut, Jean Paul

    2000-01-01

    Tumour progression is currently believed to result from genetic instability. Chromosomal patterns specific to a type of cancer are frequent, even though phenotypic spatial heterogeneity is omnipresent. The latter is the usual cause of imprecision in histological grading, a well documented problem without any fully satisfactory solution up to now. The present article addresses this problem in breast carcinoma. The assessment of a genetic marker for human tumours requires quantifiable measures of intratumoral heterogeneity. If an invariance paradigm representing a stochastic or geostatistical function could be discovered, it might help in solving the grading problem. A novel methodological approach using geostatistics to measure heterogeneity is used. Twenty tumours from the three usual (Scarff‐Bloom and Richardson) grades were obtained, and paraffin sections were stained for MIB‐1 (Ki‐67) with peroxidase. Whole two‐dimensional sections were sampled. Morphometric grids of variable sizes allowed simple and fast recording of the positions of epithelial nuclei, marked or not by MIB‐1. The geostatistical method is based here upon the asymptotic behaviour of the dispersion variance. The measured asymptotic exponent of the dispersion variance increases from grade 1 to grade 3. Preliminary results are encouraging: grades 1 and 3 on the one hand and 2 and 3 on the other are totally separated. Final proof of an improved grading using this measure will of course require comparison with the results of survival studies. PMID:11153611

  19. A Nonparametric Geostatistical Method For Estimating Species Importance

    Treesearch

    Andrew J. Lister; Rachel Riemann; Michael Hoppus

    2001-01-01

    Parametric statistical methods are not always appropriate for conducting spatial analyses of forest inventory data. Parametric geostatistical methods such as variography and kriging are essentially averaging procedures, and thus can be affected by extreme values. Furthermore, non normal distributions violate the assumptions of analyses in which test statistics are...

  20. Geostatistics: a new tool for describing spatially-varied surface conditions from timber harvested and burned hillslopes

    Treesearch

    Peter R. Robichaud

    1997-01-01

    Geostatistics provides a method to describe the spatial continuity of many natural phenomena. Spatial models are based upon the concept of scaling, kriging and conditional simulation. These techniques were used to describe the spatially-varied surface conditions on timber harvest and burned hillslopes. Geostatistical techniques provided estimates of the ground cover (...

  1. A Random Finite Set Approach to Space Junk Tracking and Identification

    DTIC Science & Technology

    2014-09-03

    Final report covering 31 Jan 2013 – 29 Apr 2014. A Random Finite Set Approach to Space Junk Tracking and Identification. Contract number FA2386-13... Ba-Ngu Vo, Ba-Tuong Vo, Department of

  2. The Use of Geostatistics in the Study of Floral Phenology of Vulpia geniculata (L.) Link

    PubMed Central

    León Ruiz, Eduardo J.; García Mozo, Herminia; Domínguez Vilches, Eugenio; Galán, Carmen

    2012-01-01

    Traditionally phenology studies have been focused on changes through time, but there exist many instances in ecological research where it is necessary to interpolate among spatially stratified samples. The combined use of Geographical Information Systems (GIS) and Geostatistics can be an essential tool for spatial analysis in phenological studies. Geostatistics are a family of statistics that describe correlations through space/time and they can be used for both quantifying spatial correlation and interpolating unsampled points. In the present work, estimations based upon Geostatistics and GIS mapping have enabled the construction of spatial models that reflect phenological evolution of Vulpia geniculata (L.) Link throughout the study area during sampling season. Ten sampling points, scattered throughout the city and low mountains in the “Sierra de Córdoba” were chosen to carry out the weekly phenological monitoring during flowering season. The phenological data were interpolated by applying the traditional geostatistical method of Kriging, which was used to elaborate weekly estimations of V. geniculata phenology in unsampled areas. Finally, the application of Geostatistics and GIS to create phenological maps could be an essential complement in pollen aerobiological studies, given the increased interest in obtaining automatic aerobiological forecasting maps. PMID:22629169

  3. The use of geostatistics in the study of floral phenology of Vulpia geniculata (L.) link.

    PubMed

    León Ruiz, Eduardo J; García Mozo, Herminia; Domínguez Vilches, Eugenio; Galán, Carmen

    2012-01-01

    Traditionally phenology studies have been focused on changes through time, but there exist many instances in ecological research where it is necessary to interpolate among spatially stratified samples. The combined use of Geographical Information Systems (GIS) and Geostatistics can be an essential tool for spatial analysis in phenological studies. Geostatistics are a family of statistics that describe correlations through space/time and they can be used for both quantifying spatial correlation and interpolating unsampled points. In the present work, estimations based upon Geostatistics and GIS mapping have enabled the construction of spatial models that reflect phenological evolution of Vulpia geniculata (L.) Link throughout the study area during sampling season. Ten sampling points, scattered throughout the city and low mountains in the "Sierra de Córdoba" were chosen to carry out the weekly phenological monitoring during flowering season. The phenological data were interpolated by applying the traditional geostatistical method of Kriging, which was used to elaborate weekly estimations of V. geniculata phenology in unsampled areas. Finally, the application of Geostatistics and GIS to create phenological maps could be an essential complement in pollen aerobiological studies, given the increased interest in obtaining automatic aerobiological forecasting maps.

  4. Assessing the resolution-dependent utility of tomograms for geostatistics

    USGS Publications Warehouse

    Day-Lewis, F. D.; Lane, J.W.

    2004-01-01

    Geophysical tomograms are used increasingly as auxiliary data for geostatistical modeling of aquifer and reservoir properties. The correlation between tomographic estimates and hydrogeologic properties is commonly based on laboratory measurements, co-located measurements at boreholes, or petrophysical models. The inferred correlation is assumed uniform throughout the interwell region; however, tomographic resolution varies spatially due to acquisition geometry, regularization, data error, and the physics underlying the geophysical measurements. Blurring and inversion artifacts are expected in regions traversed by few or only low-angle raypaths. In the context of radar traveltime tomography, we derive analytical models for (1) the variance of tomographic estimates, (2) the spatially variable correlation with a hydrologic parameter of interest, and (3) the spatial covariance of tomographic estimates. Synthetic examples demonstrate that tomograms of qualitative value may have limited utility for geostatistics; moreover, the imprint of regularization may preclude inference of meaningful spatial statistics from tomograms.

  5. Geostatistical estimation of signal-to-noise ratios for spectral vegetation indices

    USGS Publications Warehouse

    Ji, Lei; Zhang, Li; Rover, Jennifer R.; Wylie, Bruce K.; Chen, Xuexia

    2014-01-01

    In the past 40 years, many spectral vegetation indices have been developed to quantify vegetation biophysical parameters. An ideal vegetation index should contain the maximum level of signal related to specific biophysical characteristics and the minimum level of noise such as background soil influences and atmospheric effects. However, accurate quantification of signal and noise in a vegetation index remains a challenge, because it requires a large number of field measurements or laboratory experiments. In this study, we applied a geostatistical method to estimate signal-to-noise ratio (S/N) for spectral vegetation indices. Based on the sample semivariogram of vegetation index images, we used the standardized noise to quantify the noise component of vegetation indices. In a case study in the grasslands and shrublands of the western United States, we demonstrated the geostatistical method for evaluating S/N for a series of soil-adjusted vegetation indices derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor. The soil-adjusted vegetation indices were found to have higher S/N values than the traditional normalized difference vegetation index (NDVI) and simple ratio (SR) in the sparsely vegetated areas. This study shows that the proposed geostatistical analysis can constitute an efficient technique for estimating signal and noise components in vegetation indices.
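    The idea of reading noise off the semivariogram near zero lag can be sketched in one dimension. This is a minimal sketch under stated assumptions, not the MODIS procedure used in the study: the linear extrapolation from the first two lags and the synthetic sine-plus-noise series are illustrative choices.

```python
import numpy as np

def semivariogram_1d(z, lags):
    """Sample semivariances of a 1-D series at the given integer lags."""
    return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2) for h in lags])

def signal_to_noise(z):
    """Estimate S/N by extrapolating the sample semivariogram to zero lag:
    the intercept (nugget) is read as noise variance, the remainder of the
    total variance as signal."""
    g1, g2 = semivariogram_1d(z, [1, 2])
    nugget = max(2.0 * g1 - g2, 1e-12)  # linear extrapolation to h = 0
    total = z.var()
    return (total - nugget) / nugget, nugget
```

    For a smooth signal contaminated by white noise, the nugget converges on the noise variance, so the ratio isolates the signal component.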

  6. Dissecting random and systematic differences between noisy composite data sets.

    PubMed

    Diederichs, Kay

    2017-04-01

    Composite data sets measured on different objects are usually affected by random errors, but may also be influenced by systematic (genuine) differences in the objects themselves, or the experimental conditions. If the individual measurements forming each data set are quantitative and approximately normally distributed, a correlation coefficient is often used to compare data sets. However, the relations between data sets are not obvious from the matrix of pairwise correlations since the numerical value of the correlation coefficient is lowered by both random and systematic differences between the data sets. This work presents a multidimensional scaling analysis of the pairwise correlation coefficients which places data sets into a unit sphere within low-dimensional space, at a position given by their CC* values [as defined by Karplus & Diederichs (2012), Science, 336, 1030-1033] in the radial direction and by their systematic differences in one or more angular directions. This dimensionality reduction can not only be used for classification purposes, but also to derive data-set relations on a continuous scale. Projecting the arrangement of data sets onto the subspace spanned by systematic differences (the surface of a unit sphere) allows, irrespective of the random-error levels, the identification of clusters of closely related data sets. The method gains power with increasing numbers of data sets. It is illustrated with an example from low signal-to-noise ratio image processing, and an application in macromolecular crystallography is shown, but the approach is completely general and thus should be widely applicable.
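    The multidimensional-scaling idea (placing data sets so that inner products reproduce their pairwise correlations) can be sketched with a classical eigendecomposition-based embedding. The block-structured toy correlation matrix in the test is an illustrative assumption, not data from the paper, and this sketch omits the CC*-based radial scaling of the actual method.

```python
import numpy as np

def embed_datasets(cc, ndim=2):
    """Classical MDS on a correlation matrix: coordinates whose dot products
    approximate cc in the top-ndim eigendirections."""
    vals, vecs = np.linalg.eigh(cc)
    order = np.argsort(vals)[::-1][:ndim]               # largest eigenvalues first
    return vecs[:, order] * np.sqrt(np.clip(vals[order], 0.0, None))
```

    Data sets that differ only by random error land close together in the embedding, while systematic differences push them apart in the angular directions.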

  7. Application of Bayesian geostatistics for evaluation of mass discharge uncertainty at contaminated sites

    NASA Astrophysics Data System (ADS)

    Troldborg, Mads; Nowak, Wolfgang; Lange, Ida V.; Santos, Marta C.; Binning, Philip J.; Bjerg, Poul L.

    2012-09-01

    Mass discharge estimates are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Such estimates are, however, rather uncertain as they integrate uncertain spatial distributions of both concentration and groundwater flow. Here a geostatistical simulation method for quantifying the uncertainty of the mass discharge across a multilevel control plane is presented. The method accounts for (1) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics, (2) measurement uncertainty, and (3) uncertain source zone and transport parameters. The method generates conditional realizations of the spatial flow and concentration distribution. An analytical macrodispersive transport solution is employed to simulate the mean concentration distribution, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed deviations from this mean solution. By combining the flow and concentration realizations, a mass discharge probability distribution is obtained. The method has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is demonstrated on a field site contaminated with chlorinated ethenes. For this site, we show that including a physically meaningful concentration trend and the cosimulation of hydraulic conductivity and hydraulic gradient across the transect helps constrain the mass discharge uncertainty. The number of sampling points required for accurate mass discharge estimation and the relative influence of different data types on mass discharge uncertainty is discussed.

  8. Comparing the performance of geostatistical models with additional information from covariates for sewage plume characterization.

    PubMed

    Del Monego, Maurici; Ribeiro, Paulo Justiniano; Ramos, Patrícia

    2015-04-01

    In this work, kriging with covariates is used to model and map the spatial distribution of salinity measurements gathered by an autonomous underwater vehicle in a sea outfall monitoring campaign, aiming to distinguish the effluent plume from the receiving waters and characterize its spatial variability in the vicinity of the discharge. Four different geostatistical linear models for salinity were assumed, where the distance to diffuser, the west-east positioning, and the south-north positioning were used as covariates. Sample variograms were fitted by Matérn models using both weighted least squares and maximum likelihood estimation methods, as a way to detect possible discrepancies. Typically, the maximum likelihood method estimated very low ranges, which limited the kriging process. So, at least for these data sets, weighted least squares proved to be the most appropriate estimation method for variogram fitting. The kriged maps show clearly the spatial variation of salinity, and it is possible to identify the effluent plume in the area studied. The results obtained provide some guidelines for sewage monitoring if a geostatistical analysis of the data is intended. It is important to treat properly the existence of anomalous values and to adopt a sampling strategy that includes transects parallel and perpendicular to the effluent dispersion.

  9. Topsoil moisture mapping using geostatistical techniques under different Mediterranean climatic conditions.

    PubMed

    Martínez-Murillo, J F; Hueso-González, P; Ruiz-Sinoga, J D

    2017-10-01

    Soil mapping has been considered an important factor in the widening of Soil Science and in giving responses to many different environmental questions. Geostatistical techniques, through kriging and co-kriging, have made it possible to improve the understanding of eco-geomorphologic variables, e.g., soil moisture. This study is focused on mapping topsoil moisture using geostatistical techniques under different Mediterranean climatic conditions (humid, dry and semiarid) in three small watersheds, considering topography and soil properties as key factors. A Digital Elevation Model (DEM) with a resolution of 1×1 m was derived from a topographical survey, and soils were sampled to analyze the properties controlling topsoil moisture, which was measured over four years. Afterwards, topographic attributes were derived from the DEM, the soil properties were analyzed in the laboratory, and topsoil moisture was modeled for the entire watersheds by applying three geostatistical techniques: i) ordinary kriging; ii) co-kriging with topographic attributes as co-variates; and iii) co-kriging with topographic attributes and gravel content as co-variates. The results indicated that topsoil moisture was more accurately mapped in the dry and semiarid watersheds when the co-kriging procedure was performed. The study is a contribution to improving the efficiency and accuracy of studies of the Mediterranean eco-geomorphologic system and soil hydrology in field conditions. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Geostatistical mapping of effluent-affected sediment distribution on the Palos Verdes shelf

    USGS Publications Warehouse

    Murray, C.J.; Lee, H.J.; Hampton, M.A.

    2002-01-01

    Geostatistical techniques were used to study the spatial continuity of the thickness of effluent-affected sediment in the offshore Palos Verdes Margin area. The thickness data were measured directly from cores and indirectly from high-frequency subbottom profiles collected over the Palos Verdes Margin. Strong spatial continuity of the sediment thickness data was identified, with a maximum range of correlation in excess of 1.4 km. The spatial correlation showed a marked anisotropy, and was more than twice as continuous in the alongshore direction as in the cross-shelf direction. Sequential indicator simulation employing models fit to the thickness data variograms was used to map the distribution of the sediment, and to quantify the uncertainty in those estimates. A strong correlation between sediment thickness data and measurements of the mass of the contaminant p,p′-DDE per unit area was identified. A calibration based on the bivariate distribution of the thickness and p,p′-DDE data was applied using Markov-Bayes indicator simulation to extend the geostatistical study and map the contamination levels in the sediment. Integrating the map grids produced by the geostatistical study of the two variables indicated that 7.8 million m3 of effluent-affected sediment exist in the map area, containing approximately 61-72 Mg (metric tons) of p,p′-DDE. Most of the contaminated sediment (about 85% of the sediment and 89% of the p,p′-DDE) occurs in water depths < 100 m. The geostatistical study also indicated that the samples available for mapping are well distributed and the uncertainty of the estimates of the thickness and contamination level of the sediments is lowest in areas where the contaminated sediment is most prevalent. © 2002 Elsevier Science Ltd. All rights reserved.

  11. Multiobjective design of aquifer monitoring networks for optimal spatial prediction and geostatistical parameter estimation

    NASA Astrophysics Data System (ADS)

    Alzraiee, Ayman H.; Bau, Domenico A.; Garcia, Luis A.

    2013-06-01

    Effective sampling of hydrogeological systems is essential in guiding groundwater management practices. Optimal sampling of groundwater systems has previously been formulated based on the assumption that heterogeneous subsurface properties can be modeled using a geostatistical approach. Therefore, the monitoring schemes have been developed to concurrently minimize the uncertainty in the spatial distribution of systems' states and parameters, such as the hydraulic conductivity K and the hydraulic head H, and the uncertainty in the geostatistical model of system parameters using a single objective function that aggregates all objectives. However, it has been shown that the aggregation of possibly conflicting objective functions is sensitive to the adopted aggregation scheme and may lead to distorted results. In addition, the uncertainties in geostatistical parameters affect the uncertainty in the spatial prediction of K and H according to a complex nonlinear relationship, which has often been ineffectively evaluated using a first-order approximation. In this study, we propose a multiobjective optimization framework to assist the design of monitoring networks of K and H with the goal of optimizing their spatial predictions and estimating the geostatistical parameters of the K field. The framework stems from the combination of a data assimilation (DA) algorithm and a multiobjective evolutionary algorithm (MOEA). The DA algorithm is based on the ensemble Kalman filter, a Monte-Carlo-based Bayesian update scheme for nonlinear systems, which is employed to approximate the posterior uncertainty in K, H, and the geostatistical parameters of K obtained by collecting new measurements. Multiple MOEA experiments are used to investigate the trade-off among design objectives and identify the corresponding monitoring schemes. The methodology is applied to design a sampling network for a shallow unconfined groundwater system located in Rocky Ford, Colorado. Results indicate that
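    The ensemble Kalman filter update at the heart of the DA step described above can be sketched as follows. This is a minimal sketch, not the authors' implementation: the stochastic (perturbed-observation) variant and the scalar toy problem in the test are illustrative choices.

```python
import numpy as np

def enkf_update(ensemble, H, y_obs, obs_err_std, rng):
    """Stochastic (perturbed-observation) ensemble Kalman filter analysis step.
    ensemble: (n_state, n_members); H: (n_obs, n_state); y_obs: (n_obs,)."""
    ne = ensemble.shape[1]
    A = ensemble - ensemble.mean(axis=1, keepdims=True)   # ensemble anomalies
    HA = H @ A
    R = (obs_err_std ** 2) * np.eye(len(y_obs))           # observation error covariance
    P_hh = HA @ HA.T / (ne - 1) + R
    K = (A @ HA.T / (ne - 1)) @ np.linalg.inv(P_hh)       # Kalman gain from sample covariances
    perturbed = y_obs[:, None] + rng.normal(0.0, obs_err_std, (len(y_obs), ne))
    return ensemble + K @ (perturbed - H @ ensemble)
```

    In the paper's setting, the state vector would stack K, H, and the geostatistical parameters, and the posterior ensemble spread would approximate the updated uncertainty.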

  12. Spatial analysis of hazardous waste data using geostatistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zirschky, J.H.

    1984-01-01

    The objective of this investigation was to determine if geostatistics could be a useful tool for evaluating hazardous waste sites. Three sites contaminated by dioxin (2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD)) were investigated. The first site evaluated was a creek into which TCDD-contaminated soil had eroded. The second site was a town in which TCDD-contaminated wastes had been sprayed onto the streets. Finally, the third site was a highway of which the shoulders were contaminated by dust deposition from a nearby hazardous waste site. The distributions of TCDD at the first and third sites were investigated using kriging, an optimal estimation technique. By using kriging, the areas of both sites requiring cleanup were successfully identified. At the second site, the town, satisfactory results were not obtained. The distribution of contamination in this town is believed to be very heterogeneous; thus, reasonable estimates could not be obtained. Additional sampling was therefore recommended at this site. Based upon this research, geostatistics appears to be a very useful tool for evaluating a hazardous waste site if the distribution of contaminants at the site is homogeneous, or can be divided into homogeneous areas.

  13. Three-dimensional geostatistical inversion of flowmeter and pumping test data.

    PubMed

    Li, Wei; Englert, Andreas; Cirpka, Olaf A; Vereecken, Harry

    2008-01-01

    We jointly invert field data of flowmeter and multiple pumping tests in fully screened wells to estimate hydraulic conductivity using a geostatistical method. We use the steady-state drawdowns of pumping tests and the discharge profiles of flowmeter tests as our data in the inference. The discharge profiles need not be converted to absolute hydraulic conductivities. Consequently, we do not need measurements of depth-averaged hydraulic conductivity at well locations. The flowmeter profiles contain information about relative vertical distributions of hydraulic conductivity, while drawdown measurements of pumping tests provide information about horizontal fluctuation of the depth-averaged hydraulic conductivity. We apply the method to data obtained at the Krauthausen test site of the Forschungszentrum Jülich, Germany. The resulting estimate of our joint three-dimensional (3D) geostatistical inversion shows an improved 3D structure in comparison to the inversion of pumping test data only.

  14. The use of a genetic algorithm-based search strategy in geostatistics: application to a set of anisotropic piezometric head data

    NASA Astrophysics Data System (ADS)

    Abedini, M. J.; Nasseri, M.; Burn, D. H.

    2012-04-01

    In any geostatistical study, an important consideration is the choice of an appropriate, repeatable, and objective search strategy that controls the nearby samples to be included in the location-specific estimation procedure. Almost all geostatistical software available in the market puts the onus on the user to supply search strategy parameters in a heuristic manner. These parameters are solely controlled by geographical coordinates that are defined for the entire area under study, and the user has no guidance as to how to choose these parameters. The main thesis of the current study is that the selection of search strategy parameters has to be driven by data—both the spatial coordinates and the sample values—and cannot be chosen beforehand. For this purpose, a genetic-algorithm-based ordinary kriging with moving neighborhood technique is proposed. The search capability of a genetic algorithm is exploited to search the feature space for appropriate, either local or global, search strategy parameters. Radius of circle/sphere and/or radii of standard or rotated ellipse/ellipsoid are considered as the decision variables to be optimized by GA. The superiority of GA-based ordinary kriging is demonstrated through application to the Wolfcamp Aquifer piezometric head data. Assessment of numerical results showed that definition of search strategy parameters based on both geographical coordinates and sample values improves cross-validation statistics when compared with that based on geographical coordinates alone. In the case of a variable search neighborhood for each estimation point, optimization of local search strategy parameters for an elliptical support domain—the orientation of which is dictated by anisotropic axes—via GA was able to capture the dynamics of piezometric head in west Texas/New Mexico in an efficient way.
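    The core loop of such a data-driven search can be illustrated with a deliberately small sketch: a (mu+lambda)-style genetic algorithm tuning a single circular search radius against leave-one-out cross-validation error. Inverse-distance weighting stands in for ordinary kriging to keep the example short; all names and parameter values are ours, not the authors'.

```python
import random

def loo_error(points, radius):
    """Leave-one-out cross-validation error of a moving-neighborhood
    estimator; IDW stands in here for the kriging estimator."""
    err = 0.0
    for i, (xi, yi, zi) in enumerate(points):
        num = den = 0.0
        for j, (xj, yj, zj) in enumerate(points):
            if i == j:
                continue
            d = ((xi - xj) ** 2 + (yi - yj) ** 2) ** 0.5
            if 0.0 < d <= radius:
                w = 1.0 / d ** 2
                num += w * zj
                den += w
        if den > 0.0:
            err += (num / den - zi) ** 2
        else:
            err += zi ** 2  # empty neighborhood: heavy penalty (our choice)
    return err / len(points)

def ga_radius(points, generations=30, pop_size=12, rng=random.Random(0)):
    """Evolve a population of candidate search radii; keep the best half,
    refill with Gaussian mutations of the survivors."""
    pop = [rng.uniform(0.1, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=lambda r: loo_error(points, r))[: pop_size // 2]
        pop = elite + [max(0.05, r + rng.gauss(0.0, 0.5)) for r in elite]
    return min(pop, key=lambda r: loo_error(points, r))
```

    Optimizing ellipse radii and orientation, as in the paper, only enlarges the chromosome; the fitness function stays a cross-validation statistic.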

  15. Randomization Methods in Emergency Setting Trials: A Descriptive Review

    ERIC Educational Resources Information Center

    Corbett, Mark Stephen; Moe-Byrne, Thirimon; Oddie, Sam; McGuire, William

    2016-01-01

    Background: Quasi-randomization might expedite recruitment into trials in emergency care settings but may also introduce selection bias. Methods: We searched the Cochrane Library and other databases for systematic reviews of interventions in emergency medicine or urgent care settings. We assessed selection bias (baseline imbalances) in prognostic…

  16. Constraining geostatistical models with hydrological data to improve prediction realism

    NASA Astrophysics Data System (ADS)

    Demyanov, V.; Rojas, T.; Christie, M.; Arnold, D.

    2012-04-01

    Geostatistical models reproduce spatial correlation based on the available on-site data and on more general concepts about the modelled patterns, e.g. training images. One of the problems of modelling natural systems with geostatistics is maintaining realistic spatial features so that they agree with the physical processes in nature. Tuning the model parameters to the data may lead to geostatistical realisations with unrealistic spatial patterns, which would still honour the data. Such a model would result in poor predictions, even though it fits the available data well. Conditioning the model to a wider range of relevant data provides a remedy that avoids producing unrealistic features in spatial models. For instance, there are vast amounts of information about the geometries of river channels that can be used in describing fluvial environments. Relations between the geometrical channel characteristics (width, depth, wavelength, amplitude, etc.) are complex and non-parametric and exhibit a great deal of uncertainty, which it is important to propagate rigorously into the predictive model. These relations can be described within a Bayesian approach as multi-dimensional prior probability distributions. We propose a way to constrain multiple-point statistics models with intelligent priors obtained from analysing a vast collection of contemporary river patterns based on previously published works. We applied machine learning techniques, namely neural networks and support vector machines, to extract multivariate non-parametric relations between geometrical characteristics of fluvial channels from the available data. An example demonstrates how ensuring geological realism helps to deliver more reliable predictions of a subsurface oil reservoir in a fluvial depositional environment.

  17. Assessment of spatial distribution of fallout radionuclides through geostatistics concept.

    PubMed

    Mabit, L; Bernard, C

    2007-01-01

    After introducing the geostatistics concept and its utility in environmental science, especially in fallout radionuclide (FRN) spatialisation, a case study of cesium-137 ((137)Cs) redistribution at the field scale using geostatistics is presented. On a Canadian agricultural field, geostatistics coupled with a Geographic Information System (GIS) was used to test three different interpolation techniques [Ordinary Kriging (OK), Inverse Distance Weighting of power one (IDW1) and two (IDW2)] to create a (137)Cs map and to establish a radioisotope budget. Following the optimization of variographic parameters, an experimental semivariogram was developed to determine the spatial dependence of (137)Cs. It was adjusted to a spherical isotropic model with a range of 30 m and a very small nugget effect. This (137)Cs semivariogram showed good autocorrelation (R(2)=0.91) and was well structured ('nugget-to-sill' ratio of 4%). It also revealed that the sampling strategy was adequate to capture the spatial correlation of (137)Cs. The spatial redistribution of (137)Cs was estimated by Ordinary Kriging and IDW to produce contour maps. A radioisotope budget was established for the 2.16 ha agricultural field under investigation. It was estimated that around 2 x 10(7) Bq of (137)Cs were missing (around 30% of the total initial fallout), exported by physical processes (runoff and erosion) from the area under investigation. The cross-validation analysis showed that, in the case of spatially structured data, OK is a better interpolation method than IDW1 or IDW2 for the assessment of potential radioactive contamination and/or pollution.
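    The fitted model has a standard closed form. A minimal sketch of the spherical semivariogram with nugget (argument names are ours), using the abstract's values of a 30 m range and a nugget of about 4% of the sill:

```python
def spherical_semivariogram(h, nugget, sill, range_):
    """Spherical semivariogram: rises as 1.5r - 0.5r^3, flattens at the range."""
    if h <= 0.0:
        return 0.0          # gamma(0) = 0 by definition
    if h >= range_:
        return sill         # beyond the range: no spatial correlation
    r = h / range_
    return nugget + (sill - nugget) * (1.5 * r - 0.5 * r ** 3)

# e.g. the fitted model of the study, with the sill normalized to 1:
gamma_15m = spherical_semivariogram(15.0, 0.04, 1.0, 30.0)
```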

  18. Geospatial interpolation and mapping of tropospheric ozone pollution using geostatistics.

    PubMed

    Kethireddy, Swatantra R; Tchounwou, Paul B; Ahmad, Hafiz A; Yerramilli, Anjaneyulu; Young, John H

    2014-01-10

    Tropospheric ozone (O3) pollution is a major problem worldwide, including in the United States of America (USA), particularly during the summer months. Ozone oxidative capacity and its impact on human health have attracted the attention of the scientific community. In the USA, sparse spatial observations for O3 may not provide a reliable source of data over a geo-environmental region. Geostatistical Analyst in ArcGIS has the capability to interpolate values in unmonitored geo-spaces of interest. In this study of eastern Texas O3 pollution, hourly episodes for spring and summer 2012 were selectively identified. To visualize the O3 distribution, geostatistical techniques were employed in ArcMap. Using ordinary Kriging, geostatistical layers of O3 for all the studied hours were predicted and mapped at a spatial resolution of 1 kilometer. A good level of prediction accuracy was achieved and was confirmed by cross-validation results. The mean prediction error was close to 0, the root-mean-square standardized prediction error was close to 1, and the root-mean-square and average standard errors were small. O3 pollution map data can be further used in analysis and modeling studies. Kriging results and O3 decadal trends indicate that the populace in Houston-Sugar Land-Baytown, Dallas-Fort Worth-Arlington, Beaumont-Port Arthur, San Antonio, and Longview are repeatedly exposed to high levels of O3-related pollution, and are prone to the corresponding respiratory and cardiovascular health effects. Optimization of the monitoring network proves to be an added advantage for the accurate prediction of exposure levels.
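    The cross-validation criteria quoted here (mean error near 0, root-mean-square standardized error near 1) are straightforward to compute once predictions and kriging standard errors are in hand; a small sketch with illustrative names:

```python
def cross_validation_stats(observed, predicted, pred_std):
    """Mean error, RMSE, and RMS standardized error of kriging predictions.

    A well-calibrated model has mean error near 0 (unbiased) and RMS
    standardized error near 1 (the kriging standard errors neither
    over- nor under-state the true prediction uncertainty).
    """
    n = len(observed)
    errors = [p - o for o, p in zip(observed, predicted)]
    mean_error = sum(errors) / n
    rmse = (sum(e * e for e in errors) / n) ** 0.5
    standardized = [e / s for e, s in zip(errors, pred_std)]
    rms_standardized = (sum(e * e for e in standardized) / n) ** 0.5
    return mean_error, rmse, rms_standardized
```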

  19. SRS 2010 Vegetation Inventory GeoStatistical Mapping Results for Custom Reaction Intensity and Total Dead Fuels.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, Lloyd A.; Paresol, Bernard

    This report presents the geostatistical analysis results for two fire-fuel response variables, custom reaction intensity and total dead fuels, and is one part of the SRS 2010 vegetation inventory project. For a detailed description of the project, theory, and background, including sample design, methods, and results, please refer to the USDA Forest Service Savannah River Site internal report "SRS 2010 Vegetation Inventory GeoStatistical Mapping Report" (Edwards & Parresol 2013).

  20. Mapping aboveground woody biomass using forest inventory, remote sensing and geostatistical techniques.

    PubMed

    Yadav, Bechu K V; Nandy, S

    2015-05-01

    Mapping forest biomass is fundamental for estimating CO₂ emissions, and planning and monitoring of forests and ecosystem productivity. The present study attempted to map aboveground woody biomass (AGWB) by integrating forest inventory, remote sensing and geostatistical techniques, viz., direct radiometric relationships (DRR), k-nearest neighbours (k-NN) and cokriging (CoK), and to evaluate their accuracy. A part of the Timli Forest Range of Kalsi Soil and Water Conservation Division, Uttarakhand, India was selected for the present study. Stratified random sampling was used to collect biophysical data from 36 sample plots of 0.1 ha (31.62 m × 31.62 m) size. Species-specific volumetric equations were used for calculating volume, which was multiplied by specific gravity to get biomass. Three forest-type density classes, viz. 10-40, 40-70 and >70% of Shorea robusta forest, and four non-forest classes were delineated using on-screen visual interpretation of IRS P6 LISS-III data of December 2012. The volume in different strata of forest-type density ranged from 189.84 to 484.36 m(3) ha(-1). The total growing stock of the forest was found to be 2,024,652.88 m(3). The AGWB ranged from 143 to 421 Mg ha(-1). Spectral bands and vegetation indices were used as independent variables and biomass as the dependent variable for DRR, k-NN and CoK. After validation and comparison, the k-NN method with Mahalanobis distance (root mean square error (RMSE) = 42.25 Mg ha(-1)) was found to be the best method, followed by fuzzy distance and Euclidean distance with RMSE of 44.23 and 45.13 Mg ha(-1), respectively. DRR was found to be the least accurate method, with an RMSE of 67.17 Mg ha(-1). The study highlighted the potential of integrating forest inventory, remote sensing and geostatistical techniques for forest biomass mapping.
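    The k-NN estimator at the heart of the best-performing method is compact; the sketch below uses Euclidean distance (Mahalanobis and fuzzy distances, also tried in the study, only change the distance function), and all names are illustrative:

```python
import math

def knn_estimate(train_features, train_targets, query, k=3):
    """k-nearest-neighbour estimate of a target (e.g. biomass in Mg/ha)
    from feature vectors (e.g. spectral bands and vegetation indices):
    average the targets of the k training plots nearest in feature space."""
    order = sorted(range(len(train_features)),
                   key=lambda i: math.dist(train_features[i], query))
    nearest = order[:k]
    return sum(train_targets[i] for i in nearest) / k
```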

  1. Validating spatial structure in canopy water content using geostatistics

    NASA Technical Reports Server (NTRS)

    Sanderson, E. W.; Zhang, M. H.; Ustin, S. L.; Rejmankova, E.; Haxo, R. S.

    1995-01-01

    Heterogeneity in ecological phenomena is scale dependent and affects the hierarchical structure of image data. AVIRIS pixels average the reflectance produced by complex absorption and scattering interactions among biogeochemical composition, canopy architecture, view and illumination angles, species distributions, and plant cover, as well as other factors. These scales affect validation of pixel reflectance, typically performed by relating pixel spectra to ground measurements acquired at scales of 1 m(exp 2) or less (e.g., field spectra, foliage and soil samples, etc.). As image analyses become more sophisticated, such as those for detection of canopy chemistry, better validation becomes a critical problem. This paper presents a methodology for bridging between point measurements and pixels using geostatistics. Geostatistics has been used extensively in geological and hydrogeological studies but has received little application in ecological studies. The key criterion for kriging estimation is that the phenomenon varies in space and that an underlying controlling process produces spatial correlation between the measured data points. Ecological variation meets this requirement because communities vary along environmental gradients such as soil moisture, nutrient availability, and topography.
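    The spatial correlation that kriging requires is checked with an experimental semivariogram; a minimal Matheron-estimator sketch (names are ours):

```python
import math

def empirical_semivariogram(points, values, lags, tol):
    """Matheron estimator: for each lag bin, half the mean squared
    difference of values over all point pairs separated by ~that lag."""
    gammas = []
    for h in lags:
        sq_sum, n_pairs = 0.0, 0
        for i in range(len(points)):
            for j in range(i + 1, len(points)):
                if abs(math.dist(points[i], points[j]) - h) <= tol:
                    sq_sum += (values[i] - values[j]) ** 2
                    n_pairs += 1
        gammas.append(sq_sum / (2 * n_pairs) if n_pairs else float("nan"))
    return gammas
```

    A semivariogram that rises with lag and levels off at a sill indicates the spatial autocorrelation that justifies kriging between ground points and pixels.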

  2. Unsupervised classification of multivariate geostatistical data: Two algorithms

    NASA Astrophysics Data System (ADS)

    Romary, Thomas; Ors, Fabien; Rivoirard, Jacques; Deraisme, Jacques

    2015-12-01

    With the increasing development of remote sensing platforms and the evolution of sampling facilities in the mining and oil industries, spatial datasets are becoming increasingly large, record a growing number of variables, and cover wider and wider areas. Therefore, it is often necessary to split the domain of study to account for radically different behaviors of the natural phenomenon over the domain and to simplify the subsequent modeling step. The definition of these areas can be seen as a problem of unsupervised classification, or clustering, where we try to divide the domain into homogeneous domains with respect to the values taken by the variables at hand. The application of classical clustering methods, designed for independent observations, does not ensure the spatial coherence of the resulting classes. Image segmentation methods, based on e.g. Markov random fields, are not adapted to irregularly sampled data. Other existing approaches, based on mixtures of Gaussian random functions estimated via the expectation-maximization algorithm, are limited to reasonable sample sizes and a small number of variables. In this work, we propose two algorithms based on adaptations of classical algorithms to multivariate geostatistical data. Both algorithms are model-free and can handle large volumes of multivariate, irregularly spaced data. The first one proceeds by agglomerative hierarchical clustering. The spatial coherence is ensured by a proximity condition imposed for two clusters to merge. This proximity condition relies on a graph organizing the data in the coordinate space. The hierarchical algorithm can then be seen as a graph-partitioning algorithm. Following this interpretation, a spatial version of the spectral clustering algorithm is also proposed. The performances of both algorithms are assessed on toy examples and a mining dataset.
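    The proximity-constrained merge rule of the first algorithm can be sketched as follows; this toy version (our names, centroid-gap dissimilarity, radius-based neighbour graph) merges the closest pair of clusters in attribute space only if they are spatial neighbours:

```python
import math

def spatial_agglomerative(coords, values, n_clusters, neighbor_radius):
    """Agglomerative clustering where two clusters may merge only if some
    pair of their members are spatial neighbours, so classes stay coherent."""
    clusters = [{i} for i in range(len(coords))]

    def value_gap(a, b):  # dissimilarity in attribute space (centroid gap)
        ma = sum(values[i] for i in a) / len(a)
        mb = sum(values[i] for i in b) / len(b)
        return abs(ma - mb)

    def adjacent(a, b):   # proximity condition from the spatial graph
        return any(math.dist(coords[i], coords[j]) <= neighbor_radius
                   for i in a for j in b)

    while len(clusters) > n_clusters:
        best = None
        for x in range(len(clusters)):
            for y in range(x + 1, len(clusters)):
                if adjacent(clusters[x], clusters[y]):
                    gap = value_gap(clusters[x], clusters[y])
                    if best is None or gap < best[0]:
                        best = (gap, x, y)
        if best is None:
            break  # no adjacent pair left to merge
        _, x, y = best
        clusters[x] |= clusters.pop(y)
    return clusters
```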

  3. Approaches in highly parameterized inversion: bgaPEST, a Bayesian geostatistical approach implementation with PEST: documentation and instructions

    USGS Publications Warehouse

    Fienen, Michael N.; D'Oria, Marco; Doherty, John E.; Hunt, Randall J.

    2013-01-01

    The application bgaPEST is a highly parameterized inversion software package implementing the Bayesian Geostatistical Approach in a framework compatible with the parameter estimation suite PEST. Highly parameterized inversion refers to cases in which parameters are distributed in space or time and are correlated with one another. The Bayesian aspect of bgaPEST is related to Bayesian probability theory, in which prior information about parameters is formally revised on the basis of the calibration dataset used for the inversion. Conceptually, this approach formalizes the conditionality of estimated parameters on the specific data and model available. The geostatistical component of the method refers to the way in which prior information about the parameters is used. A geostatistical autocorrelation function is used to enforce structure on the parameters to avoid overfitting and unrealistic results. The Bayesian Geostatistical Approach is designed to provide the smoothest solution that is consistent with the data. Optionally, users can specify a level of fit or estimate a balance between fit and model complexity informed by the data. Groundwater and surface-water applications are used as examples in this text, but the possible uses of bgaPEST extend to any distributed-parameter application.

  4. Estimating Solar PV Output Using Modern Space/Time Geostatistics (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, S. J.; George, R.; Bush, B.

    2009-04-29

    This presentation describes a project that uses mapping techniques to predict solar output at subhourly resolution at any spatial point, develop a methodology that is applicable to natural resources in general, and demonstrate capability of geostatistical techniques to predict the output of a potential solar plant.

  5. Geostatistical Borehole Image-Based Mapping of Karst-Carbonate Aquifer Pores.

    PubMed

    Sukop, Michael C; Cunningham, Kevin J

    2016-03-01

    Quantification of the character and spatial distribution of porosity in carbonate aquifers is important as input into computer models used in the calculation of intrinsic permeability and for next-generation, high-resolution groundwater flow simulations. Digital, optical, borehole-wall image data from three closely spaced boreholes in the karst-carbonate Biscayne aquifer in southeastern Florida are used in geostatistical experiments to assess the capabilities of various methods to create realistic two-dimensional models of vuggy megaporosity and matrix-porosity distribution in the limestone that composes the aquifer. When the borehole image data alone were used as the model training image, multiple-point geostatistics failed to detect the known spatial autocorrelation of vuggy megaporosity and matrix porosity among the three boreholes, which were only 10 m apart. Variogram analysis and subsequent Gaussian simulation produced results that showed a realistic conceptualization of horizontal continuity of strata dominated by vuggy megaporosity and matrix porosity among the three boreholes. © 2015, National Ground Water Association.

  6. Geostatistical borehole image-based mapping of karst-carbonate aquifer pores

    USGS Publications Warehouse

    Michael Sukop,; Cunningham, Kevin J.

    2016-01-01

    Quantification of the character and spatial distribution of porosity in carbonate aquifers is important as input into computer models used in the calculation of intrinsic permeability and for next-generation, high-resolution groundwater flow simulations. Digital, optical, borehole-wall image data from three closely spaced boreholes in the karst-carbonate Biscayne aquifer in southeastern Florida are used in geostatistical experiments to assess the capabilities of various methods to create realistic two-dimensional models of vuggy megaporosity and matrix-porosity distribution in the limestone that composes the aquifer. When the borehole image data alone were used as the model training image, multiple-point geostatistics failed to detect the known spatial autocorrelation of vuggy megaporosity and matrix porosity among the three boreholes, which were only 10 m apart. Variogram analysis and subsequent Gaussian simulation produced results that showed a realistic conceptualization of horizontal continuity of strata dominated by vuggy megaporosity and matrix porosity among the three boreholes.

  7. A geostatistics-informed hierarchical sensitivity analysis method for complex groundwater flow and transport modeling: GEOSTATISTICAL SENSITIVITY ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Chen, Xingyuan; Ye, Ming

    Sensitivity analysis is an important tool for quantifying uncertainty in the outputs of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of the model parameters. In this study we developed a hierarchical sensitivity analysis method that (1) constructs an uncertainty hierarchy by analyzing the input uncertainty sources, and (2) accounts for the spatial correlation among parameters at each level of the hierarchy using geostatistical tools. The contribution of the uncertainty source at each hierarchy level is measured by sensitivity indices calculated using the variance decomposition method. Using this methodology, we identified the most important uncertainty sources for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and the permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally, driven by the dynamic interaction between groundwater and river water at the site. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional, spatially distributed parameters.
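    The variance decomposition behind such sensitivity indices can be illustrated with a brute-force double-loop Monte Carlo estimate of a first-order index (real studies use far more efficient estimators; the model and names here are toys, not the Hanford model):

```python
import random

def first_order_index(model, sample_input, n_outer=200, n_inner=200,
                      rng=random.Random(0)):
    """Brute-force first-order index S_1 = Var(E[Y|X_1]) / Var(Y) for the
    first input, via double-loop Monte Carlo."""
    cond_means, all_y = [], []
    for _ in range(n_outer):
        x_fixed = sample_input(rng)[0]      # freeze the input of interest
        ys = []
        for _ in range(n_inner):
            x = sample_input(rng)           # resample the other inputs
            x[0] = x_fixed
            ys.append(model(x))
        cond_means.append(sum(ys) / n_inner)
        all_y.extend(ys)

    def variance(v):
        m = sum(v) / len(v)
        return sum((u - m) ** 2 for u in v) / (len(v) - 1)

    return variance(cond_means) / variance(all_y)
```

    An index near 1 means that input dominates the output variance, the situation the abstract reports for boundary conditions and the permeability field.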

  8. Geostatistical analysis of the flood risk perception queries in the village of Navaluenga (Central Spain)

    NASA Astrophysics Data System (ADS)

    Guardiola-Albert, Carolina; Díez-Herrero, Andrés; Amérigo, María; García, Juan Antonio; María Bodoque, José; Fernández-Naranjo, Nuria

    2017-04-01

    response actions, such as designing optimal evacuation routes during flood emergencies. Geostatistical tools also provide a set of interpolation techniques for predicting the variable's value at unstudied but similar locations, based on the sample point values and other variables related to the measured variable. We apply different geostatistical interpolation methods to obtain continuous surfaces of risk perception and level of awareness in the study area. The use of these maps for future extensions and updates of the Civil Protection Plan is evaluated. Reference: Bodoque, J. M., Amérigo, M., Díez-Herrero, A., García, J. A., Cortés, B., Ballesteros-Cánovas, J. A., & Olcina, J. (2016). Improvement of resilience of urban areas by integrating social perception in flash-flood risk management. Journal of Hydrology.

  9. Geostatistics - a tool applied to the distribution of Legionella pneumophila in a hospital water system.

    PubMed

    Laganà, Pasqualina; Moscato, Umberto; Poscia, Andrea; La Milia, Daniele Ignazio; Boccia, Stefania; Avventuroso, Emanuela; Delia, Santi

    2015-01-01

    Legionnaires' disease is normally acquired by inhalation of legionellae from a contaminated environmental source. Water systems of large buildings, such as hospitals, are often contaminated with legionellae and therefore represent a potential risk for the hospital population. The aim of this study was to evaluate the potential contamination by Legionella pneumophila (LP) in a large hospital in Italy through georeferenced statistical analysis, to assess the possible sources of dispersion and, consequently, the risk of exposure for both health care staff and patients. The distribution of LP serogroups 1 and 2-14 was considered in the wards housed on two consecutive floors of the hospital building. On the basis of information provided by 53 bacteriological analyses, a 'random' grid of points was chosen and spatial geostatistics (FAIk kriging) was applied and compared with the results of classical statistical analysis. Over 50% of the examined samples were positive for Legionella pneumophila. LP 1 was isolated in 69% of samples from the ground floor and in 60% of samples from the first floor; LP 2-14 in 36% of samples from the ground floor and 24% from the first. The iso-estimation maps clearly show the most contaminated pipe and the difference in the diffusion of the different L. pneumophila serogroups. Experimental work has demonstrated that geostatistical methods applied to the microbiological analysis of water matrices allow better modeling of the phenomenon under study, greater potential for risk management, and a greater choice of methods of prevention and environmental remediation than classical statistical analysis.

  10. Geostatistics as a tool to define various categories of resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sabourin, R.

    1983-02-01

    Definitions of 'measured' and 'indicated' resources tend to be vague. Yet the calculation of such categories of resources in a mineral deposit calls for specific technical criteria. The author discusses how a geostatistical methodology provides the technical criteria required to classify reasonably assured resources by level of assurance of their existence.

  11. A geostatistical approach to predicting sulfur content in the Pittsburgh coal bed

    USGS Publications Warehouse

    Watson, W.D.; Ruppert, L.F.; Bragg, L.J.; Tewalt, S.J.

    2001-01-01

    The US Geological Survey (USGS) is completing a national assessment of coal resources in the five top coal-producing regions in the US. Point-located data provide measurements on coal thickness and sulfur content. The sample data and their geologic interpretation represent the most regionally complete and up-to-date assessment of what is known about top-producing US coal beds. The sample data are analyzed using a combination of geologic and Geographic Information System (GIS) models to estimate tonnages and qualities of the coal beds. Traditionally, GIS practitioners use contouring to represent geographical patterns of "similar" data values. The tonnage and grade of coal resources are then assessed by using the contour lines as references for interpolation. An assessment taken to this point is only indicative of resource quantity and quality. Data users may benefit from a statistical approach that would allow them to better understand the uncertainty and limitations of the sample data. To develop a quantitative approach, geostatistics were applied to the data on coal sulfur content from samples taken in the Pittsburgh coal bed (located in the eastern US, in the southwestern part of the state of Pennsylvania, and in adjoining areas in the states of Ohio and West Virginia). Geostatistical methods that account for regional and local trends were applied to blocks 2.7 mi (4.3 km) on a side. The data and geostatistics support conclusions concerning the average sulfur content and its degree of reliability at regional- and economic-block scale over the large, contiguous part of the Pittsburgh outcrop, but not to a mine scale. To validate the method, a comparison was made with the sulfur contents in sample data taken from 53 coal mines located in the study area. The comparison showed a high degree of similarity between the sulfur content in the mine samples and the sulfur content represented by the geostatistically derived contours. Published by Elsevier Science B.V.

  12. Enhancing multiple-point geostatistical modeling: 1. Graph theory and pattern adjustment

    NASA Astrophysics Data System (ADS)

    Tahmasebi, Pejman; Sahimi, Muhammad

    2016-03-01

    In recent years, higher-order geostatistical methods have been used for modeling of a wide variety of large-scale porous media, such as groundwater aquifers and oil reservoirs. Their popularity stems from their ability to account for qualitative data and the great flexibility that they offer for conditioning the models to hard (quantitative) data, which endow them with the capability for generating realistic realizations of porous formations with very complex channels, as well as features that are mainly a barrier to fluid flow. One group of such models consists of pattern-based methods that use a set of data points for generating stochastic realizations by which the large-scale structure and highly-connected features are reproduced accurately. The cross correlation-based simulation (CCSIM) algorithm, proposed previously by the authors, is a member of this group that has been shown to be capable of simulating multimillion cell models in a matter of a few CPU seconds. The method is, however, sensitive to pattern's specifications, such as boundaries and the number of replicates. In this paper the original CCSIM algorithm is reconsidered and two significant improvements are proposed for accurately reproducing large-scale patterns of heterogeneities in porous media. First, an effective boundary-correction method based on the graph theory is presented by which one identifies the optimal cutting path/surface for removing the patchiness and discontinuities in the realization of a porous medium. Next, a new pattern adjustment method is proposed that automatically transfers the features in a pattern to one that seamlessly matches the surrounding patterns. The original CCSIM algorithm is then combined with the two methods and is tested using various complex two- and three-dimensional examples. It should, however, be emphasized that the methods that we propose in this paper are applicable to other pattern-based geostatistical simulation methods.
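    The graph-based boundary correction is in the same spirit as the minimum-error boundary cut long used in image quilting: find the cheapest top-to-bottom path through the overlap-error surface and stitch the patterns along it. A dynamic-programming sketch (the names and the 3-neighbour path rule are illustrative, not the paper's exact formulation):

```python
def min_error_cut(error):
    """Minimum-cost top-to-bottom cutting path through an overlap-error
    grid (rows x cols); each step moves to one of the 3 columns below."""
    rows, cols = len(error), len(error[0])
    cost = [error[0][:]]
    for r in range(1, rows):
        prev = cost[-1]
        row = []
        for c in range(cols):
            # cheapest way to reach (r, c) from the admissible predecessors
            row.append(error[r][c] + min(prev[max(0, c - 1): min(cols, c + 2)]))
        cost.append(row)
    # Backtrack the cheapest path from the bottom row upward
    path = [min(range(cols), key=lambda c: cost[-1][c])]
    for r in range(rows - 2, -1, -1):
        c = path[-1]
        cands = range(max(0, c - 1), min(cols, c + 2))
        path.append(min(cands, key=lambda cc: cost[r][cc]))
    path.reverse()
    return path
```

    Cutting along such a path, rather than at a fixed patch edge, is what removes the patchiness and discontinuities mentioned above.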

  13. Geological, geomechanical and geostatistical assessment of rockfall hazard in San Quirico Village (Abruzzo, Italy)

    NASA Astrophysics Data System (ADS)

    Chiessi, Vittorio; D'Orefice, Maurizio; Scarascia Mugnozza, Gabriele; Vitale, Valerio; Cannese, Christian

    2010-07-01

    This paper describes the results of a rockfall hazard assessment for the village of San Quirico (Abruzzo region, Italy) based on an engineering-geological model. After the collection of geological, geomechanical, and geomorphological data, the rockfall hazard assessment was performed using two separate approaches: i) simulation of the detachment of rock blocks and their downhill movement using a GIS; and ii) application of geostatistical techniques to the analysis of georeferenced observations of previously fallen blocks, in order to assess the probability of arrival of blocks from potential future collapses. The results show that the trajectographic analysis is significantly influenced by the input parameters, with particular reference to the coefficients of restitution. To address this problem, the model was calibrated against repeated field observations. The geostatistical approach is useful because it gives the best estimation of point-source phenomena such as rockfalls; however, the sensitivity of results to basic assumptions, e.g. assessment of variograms and choice of a threshold value, may be problematic. Consequently, interpolations derived from different variograms were compared, and those showing the lowest errors were adopted. The data sets that were statistically analysed comprise both kinetic energy and surveyed rock blocks in the accumulation area. The resulting maps highlight areas susceptible to rock block arrivals, and show that the area accommodating the new settlement of San Quirico Village has the highest level of hazard according to both the probabilistic and deterministic methods.

  14. Geostatistical three-dimensional modeling of oolite shoals, St. Louis Limestone, southwest Kansas

    USGS Publications Warehouse

    Qi, L.; Carr, T.R.; Goldstein, R.H.

    2007-01-01

    In the Hugoton embayment of southwestern Kansas, reservoirs composed of relatively thin (<4 m; <13.1 ft) oolitic deposits within the St. Louis Limestone have produced more than 300 million bbl of oil. The geometry and distribution of oolitic deposits control the heterogeneity of the reservoirs, resulting in exploration challenges and relatively low recovery. Geostatistical three-dimensional (3-D) models were constructed to quantify the geometry and spatial distribution of oolitic reservoirs and the continuity of flow units within the Big Bow and Sand Arroyo Creek fields. Lithofacies in uncored wells were predicted from digital logs using a neural network. The tilting effect of the Laramide orogeny was removed to construct restored structural surfaces at the time of deposition. Well data and structural maps were integrated to build 3-D models of oolitic reservoirs using stochastic simulations with geometry data. The three-dimensional models provide insights into the distribution, the external and internal geometry of the oolitic deposits, and the sedimentologic processes that generated the reservoir intervals. The structural highs and the general structural trend had a significant impact on the distribution and orientation of the oolitic complexes. The depositional pattern and connectivity analysis suggest an overall aggradation of shallow-marine deposits during pulses of relative sea level rise, followed by deepening near the top of the St. Louis Limestone. Cemented oolitic deposits were modeled as barriers and baffles and tend to concentrate at the edges of oolitic complexes. The spatial distribution of porous oolitic deposits controls the internal geometry of rock properties. The integrated geostatistical modeling methods are applicable to other complex carbonate or siliciclastic reservoirs in shallow-marine settings. Copyright © 2007. The American Association of Petroleum Geologists. All rights reserved.

  15. Using rank-order geostatistics for spatial interpolation of highly skewed data in a heavy-metal contaminated site.

    PubMed

    Juang, K W; Lee, D Y; Ellsworth, T R

    2001-01-01

    The spatial distribution of a pollutant in contaminated soils is usually highly skewed. As a result, the sample variogram often differs considerably from its regional counterpart, and geostatistical interpolation is hindered. In this study, rank-order geostatistics with a standardized rank transformation was used for the spatial interpolation of pollutants with a highly skewed distribution in contaminated soils, for cases where commonly used nonlinear methods, such as logarithmic and normal-score transformations, are not suitable. A real data set of soil Cd concentrations with great variation and high skewness, from a contaminated site in Taiwan, was used for illustration. The spatial dependence of the ranks transformed from Cd concentrations was identified, and kriging estimation was readily performed in the standardized-rank space. The estimated standardized rank was back-transformed into the concentration space using the middle-point model within a standardized-rank interval of the empirical distribution function (EDF). The spatial distribution of Cd concentrations was then obtained. The probability of the Cd concentration being higher than a given cutoff value can also be estimated by using the estimated distribution of standardized ranks. The contour maps of Cd concentrations and of the probabilities of Cd concentrations being higher than the cutoff value can be used together for the delineation of hazardous areas of contaminated soils.
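    The transform-and-back-transform workflow described in this abstract can be sketched briefly; the function names below are ours, ties are not averaged, and the kriging step in standardized-rank space is omitted for brevity:

```python
import numpy as np

def to_standardized_ranks(values):
    # Rank the observations, then standardize the ranks to the open
    # interval (0, 1) by dividing by n + 1 (no tie averaging in this toy).
    ranks = np.argsort(np.argsort(values)) + 1.0
    return ranks / (len(values) + 1.0)

def back_transform(est_rank, values):
    # Middle-point model: map an estimated standardized rank back to the
    # concentration space via the empirical distribution function (EDF),
    # taking the midpoint of the bracketing standardized-rank interval.
    sorted_v = np.sort(values)
    n = len(sorted_v)
    knots = np.arange(1, n + 1) / (n + 1.0)  # EDF knots at i / (n + 1)
    i = np.searchsorted(knots, est_rank)
    if i == 0:
        return sorted_v[0]
    if i >= n:
        return sorted_v[-1]
    return 0.5 * (sorted_v[i - 1] + sorted_v[i])
```

In practice the kriged estimate produced in rank space would be passed to `back_transform` at each grid node to build the concentration map.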

  16. Overview and technical and practical aspects for use of geostatistics in hazardous-, toxic-, and radioactive-waste-site investigations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bossong, C.R.; Karlinger, M.R.; Troutman, B.M.

    1999-10-01

    Technical and practical aspects of applying geostatistics are developed for individuals involved in investigations at hazardous-, toxic-, and radioactive-waste sites. Important geostatistical concepts, such as variograms and ordinary, universal, and indicator kriging, are described in general terms for introductory purposes and in more detail for practical applications. Variogram modeling using measured ground-water elevation data is described in detail to illustrate principles of stationarity, anisotropy, transformations, and cross validation. Several examples of kriging applications are described using ground-water-level elevations, bedrock elevations, and ground-water-quality data. A review of contemporary literature and selected public-domain software associated with geostatistics also is provided, as is a discussion of alternative methods for spatial modeling, including inverse distance weighting, triangulation, splines, trend-surface analysis, and simulation.

  17. Hierarchical Nearest-Neighbor Gaussian Process Models for Large Geostatistical Datasets.

    PubMed

    Datta, Abhirup; Banerjee, Sudipto; Finley, Andrew O; Gelfand, Alan E

    2016-01-01

    Spatial process models for analyzing geostatistical data entail computations that become prohibitive as the number of spatial locations becomes large. This article develops a class of highly scalable nearest-neighbor Gaussian process (NNGP) models to provide fully model-based inference for large geostatistical datasets. We establish that the NNGP is a well-defined spatial process providing legitimate finite-dimensional Gaussian densities with sparse precision matrices. We embed the NNGP as a sparsity-inducing prior within a rich hierarchical modeling framework and outline how computationally efficient Markov chain Monte Carlo (MCMC) algorithms can be executed without storing or decomposing large matrices. The number of floating point operations (flops) per iteration of this algorithm is linear in the number of spatial locations, thereby rendering substantial scalability. We illustrate the computational and inferential benefits of the NNGP over competing methods using simulation studies and also analyze forest biomass from a massive U.S. Forest Inventory dataset at a scale that precludes alternative dimension-reducing methods. Supplementary materials for this article are available online.
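    The core NNGP idea — condition each observation on a small set of nearest neighbors (among previously ordered points) rather than on everything — can be illustrated with a minimal 1-D Vecchia-style sketch. This is our own toy, not the authors' implementation: the exponential covariance and all names are assumptions.

```python
import numpy as np

def vecchia_loglik(y, x, m=5, sigma2=1.0, phi=1.0):
    # 1-D NNGP/Vecchia sketch with an exponential covariance
    # C(h) = sigma2 * exp(-phi * |h|): order the points, then condition
    # each observation only on its m nearest *preceding* neighbors.
    order = np.argsort(x)
    y, x = np.asarray(y)[order], np.asarray(x)[order]
    ll = 0.0
    for i in range(len(y)):
        nb = np.arange(max(0, i - m), i)  # nearest preceding neighbors in 1-D
        if len(nb):
            C_nn = sigma2 * np.exp(-phi * np.abs(x[nb, None] - x[None, nb]))
            c_in = sigma2 * np.exp(-phi * np.abs(x[i] - x[nb]))
            w = np.linalg.solve(C_nn, c_in)   # kriging weights on the neighbors
            mu, var = w @ y[nb], sigma2 - w @ c_in
        else:
            mu, var = 0.0, sigma2
        ll += -0.5 * (np.log(2 * np.pi * var) + (y[i] - mu) ** 2 / var)
    return ll
```

With `m` equal to the number of points the factorization is exact and matches the dense multivariate normal log-density; with small `m` the cost drops from O(n³) to roughly O(n·m³), which is the source of the scalability the abstract describes.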

  18. Hierarchical Nearest-Neighbor Gaussian Process Models for Large Geostatistical Datasets

    PubMed Central

    Datta, Abhirup; Banerjee, Sudipto; Finley, Andrew O.; Gelfand, Alan E.

    2018-01-01

    Spatial process models for analyzing geostatistical data entail computations that become prohibitive as the number of spatial locations becomes large. This article develops a class of highly scalable nearest-neighbor Gaussian process (NNGP) models to provide fully model-based inference for large geostatistical datasets. We establish that the NNGP is a well-defined spatial process providing legitimate finite-dimensional Gaussian densities with sparse precision matrices. We embed the NNGP as a sparsity-inducing prior within a rich hierarchical modeling framework and outline how computationally efficient Markov chain Monte Carlo (MCMC) algorithms can be executed without storing or decomposing large matrices. The number of floating point operations (flops) per iteration of this algorithm is linear in the number of spatial locations, thereby rendering substantial scalability. We illustrate the computational and inferential benefits of the NNGP over competing methods using simulation studies and also analyze forest biomass from a massive U.S. Forest Inventory dataset at a scale that precludes alternative dimension-reducing methods. Supplementary materials for this article are available online. PMID:29720777

  19. Large-scale inverse model analyses employing fast randomized data reduction

    NASA Astrophysics Data System (ADS)

    Lin, Youzuo; Le, Ellen B.; O'Malley, Daniel; Vesselinov, Velimir V.; Bui-Thanh, Tan

    2017-08-01

    When the number of observations is large, it is computationally challenging to apply classical inverse modeling techniques. We have developed a new computationally efficient technique for solving inverse problems with a large number of observations (e.g., on the order of 10^7 or greater). Our method, which we call the randomized geostatistical approach (RGA), is built upon the principal component geostatistical approach (PCGA). We employ a data reduction technique combined with the PCGA to improve the computational efficiency and reduce the memory usage. Specifically, we employ a randomized numerical linear algebra technique based on a so-called "sketching" matrix to effectively reduce the dimension of the observations without losing the information content needed for the inverse analysis. In this way, the computational and memory costs for RGA scale with the information content rather than the size of the calibration data. Our algorithm is coded in Julia and implemented in the MADS open-source high-performance computational framework (http://mads.lanl.gov). We apply our new inverse modeling method to invert for a synthetic transmissivity field. Compared to a standard geostatistical approach (GA), our method is more efficient when the number of observations is large. Most importantly, our method is capable of solving larger inverse problems than the standard GA and PCGA approaches. Therefore, our new model inversion method is a powerful tool for solving large-scale inverse problems. The method can be applied in any field and is not limited to hydrogeological applications such as the characterization of aquifer heterogeneity.
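    The "sketching" step can be illustrated on a toy linear inverse problem. This sketch is ours (the dimensions, names, and the plain least-squares solve are assumptions; the actual RGA embeds the reduction inside the PCGA machinery): a k × n Gaussian random projection compresses n observations to k ≪ n before the solve.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear(ized) forward model d = H m + noise with many observations.
n_obs, n_par = 5000, 50
H = rng.standard_normal((n_obs, n_par))
m_true = rng.standard_normal(n_par)
d = H @ m_true + 0.01 * rng.standard_normal(n_obs)

# "Sketching" matrix: a k x n_obs Gaussian random projection with k << n_obs.
k = 200
S = rng.standard_normal((k, n_obs)) / np.sqrt(k)

# Solve the reduced problem min ||S d - S H m|| instead of the full one;
# the cost now scales with k rather than with n_obs.
m_hat, *_ = np.linalg.lstsq(S @ H, S @ d, rcond=None)
```

Because k is still well above the number of parameters, the sketched solution recovers the full-data estimate closely while touching a 200-row system instead of a 5000-row one.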

  20. Spatial interpolation of forest conditions using co-conditional geostatistical simulation

    Treesearch

    H. Todd Mowrer

    2000-01-01

    In recent work the author used the geostatistical Monte Carlo technique of sequential Gaussian simulation (s.G.s.) to investigate uncertainty in a GIS analysis of potential old-growth forest areas. The current study compares this earlier technique to that of co-conditional simulation, wherein the spatial cross-correlations between variables are included. As in the...

  1. Spatial analysis of the distribution of Spodoptera frugiperda (J.E. Smith) (Lepidoptera: Noctuidae) and losses in maize crop productivity using geostatistics.

    PubMed

    Farias, Paulo R S; Barbosa, José C; Busoli, Antonio C; Overal, William L; Miranda, Vicente S; Ribeiro, Susane M

    2008-01-01

    The fall armyworm, Spodoptera frugiperda (J.E. Smith), is one of the chief pests of maize in the Americas. The study of its spatial distribution is fundamental for designing correct control strategies, improving sampling methods, determining actual and potential crop losses, and adopting precision-agriculture techniques. In São Paulo state, Brazil, a maize field was sampled at weekly intervals, from germination through harvest, for caterpillar densities, using quadrats. In each of 200 quadrats, 10 plants were sampled per week. Harvest weights were obtained in the field for each quadrat, and ear diameters and lengths were also sampled (15 ears per quadrat) and used to estimate the potential productivity of the quadrat. Geostatistical analyses of caterpillar densities showed the greatest ranges for small caterpillars when semivariograms were fitted to a spherical model, which showed the best fit. As the caterpillars developed in the field, their spatial distribution became increasingly random, as shown by a model adjusted to a straight line, indicating a lack of spatial dependence among samples. Harvest weight and ear length followed the spherical model, indicating the existence of spatial variability of the production parameters in the maize field. Geostatistics shows promise for the application of precision methods in the integrated control of pests.
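    The spherical semivariogram model mentioned in this and several other records has a standard closed form; a minimal sketch (the parameter names are ours):

```python
import numpy as np

def spherical_variogram(h, nugget, sill, range_):
    # Spherical semivariogram: gamma rises as 1.5(h/a) - 0.5(h/a)^3 up to
    # the range a and flattens at the sill beyond it. Returns the nugget at
    # h = 0 (the right-hand-limit convention used when plotting the model).
    hr = np.clip(np.asarray(h, dtype=float) / range_, 0.0, 1.0)
    return nugget + (sill - nugget) * (1.5 * hr - 0.5 * hr**3)
```

A flat ("straight line") fitted model, as reported here for the older caterpillars, corresponds to pure nugget: the semivariance does not grow with lag distance, i.e. no spatial dependence among samples.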

  2. Adapting geostatistics to analyze spatial and temporal trends in weed populations

    USDA-ARS?s Scientific Manuscript database

    Geostatistics were originally developed in mining to estimate the location, abundance and quality of ore over large areas from soil samples to optimize future mining efforts. Here, some of these methods were adapted to weeds to account for a limited distribution area (i.e., inside a field), variatio...

  3. Identification of high-permeability subsurface structures with multiple point geostatistics and normal score ensemble Kalman filter

    NASA Astrophysics Data System (ADS)

    Zovi, Francesco; Camporese, Matteo; Hendricks Franssen, Harrie-Jan; Huisman, Johan Alexander; Salandin, Paolo

    2017-05-01

    Alluvial aquifers are often characterized by the presence of braided, highly permeable paleo-riverbeds, which constitute an interconnected preferential flow network whose localization is of fundamental importance for predicting flow and transport dynamics. Classic geostatistical approaches based on two-point correlation (i.e., the variogram) cannot describe such particular shapes. In contrast, multiple point geostatistics can describe almost any kind of shape using the empirical probability distribution derived from a training image. However, even with a correct training image the exact positions of the channels are uncertain. State information such as groundwater levels can constrain the channel positions using inverse modeling or data assimilation, but the method should be able to handle the non-Gaussianity of the parameter distribution. Here the normal score ensemble Kalman filter (NS-EnKF) was chosen as the inverse conditioning algorithm to tackle this issue. Multiple point geostatistics and the NS-EnKF have already been tested in synthetic examples, but in this study they are used for the first time in a real-world case study. The test site is an alluvial unconfined aquifer in northeastern Italy with an extent of approximately 3 km². A satellite training image showing the braid shapes of the nearby river and electrical resistivity tomography (ERT) images were used as conditioning data to provide information on channel shape, size, and position. Measured groundwater levels were assimilated with the NS-EnKF to update the spatially distributed groundwater parameters (hydraulic conductivity and storage coefficients). Results from the study show that the inversion based on multiple point geostatistics does not outperform the one with a multi-Gaussian model and that the information from the ERT images did not improve site characterization. These results were further evaluated with a synthetic study that mimics the experimental site.
The synthetic results showed that only for a much

  4. Mercury emissions from coal combustion in Silesia, analysis using geostatistics

    NASA Astrophysics Data System (ADS)

    Zasina, Damian; Zawadzki, Jaroslaw

    2015-04-01

    Data provided by the UNEP report on mercury [1] show that solid-fuel combustion is a significant source of mercury emission to air. Silesia, located in southwestern Poland, is notably affected by mercury emission because it is one of the most industrialized Polish regions: a place of coal mining, metal production, stone mining, mineral quarrying, and chemical industry. Moreover, Silesia is a region with high population density. People are exposed to severe risk from mercury emitted by both industrial and domestic sources (i.e., small household furnaces). Small sources contribute significantly to the total emission of mercury. Official statistical analyses, including those prepared for international purposes [2], do not provide data on the spatial distribution of mercury emitted to air, although a number of analyses of the Polish public power and energy sector have been prepared so far [3; 4]. The distribution of locations exposed to mercury emission from small domestic sources is an interesting problem that merges information from various sources: statistical, economic, and environmental. This paper presents a geostatistical approach to the distribution of mercury emission from coal combustion. The analysed data are organized on two independent levels: an individual, bottom-up level derived from the national emission reporting system [5; 6], and a top-down, regional level calculated on the basis of official statistics [7]. The analysis presented will compare the spatial distributions of mercury emission using data derived from the sources mentioned above. The investigation covers three voivodeships of Poland: Lower Silesian, Opole, and Silesian, using selected geostatistical methodologies, including ordinary kriging [8]. References [1] UNEP. Global Mercury Assessment 2013: Sources, Emissions, Releases and Environmental Transport. UNEP Chemicals Branch, Geneva, Switzerland, 2013. [2] NCEM. Poland's Informative Inventory Report 2014. NCEM at the IEP-NRI, 2014. http

  5. Optimisation of groundwater level monitoring networks using geostatistical modelling based on the Spartan family variogram and a genetic algorithm method

    NASA Astrophysics Data System (ADS)

    Parasyris, Antonios E.; Spanoudaki, Katerina; Kampanis, Nikolaos A.

    2016-04-01

    Groundwater level monitoring networks provide essential information for water resources management, especially in areas with significant groundwater exploitation for agricultural and domestic use. Given the high maintenance costs of these networks, the development of tools that regulators can use for efficient network design is essential. In this work, a monitoring network optimisation tool is presented. The network optimisation tool couples geostatistical modelling based on the Spartan family variogram with a genetic algorithm method and is applied to the Mires basin in Crete, Greece, an area of high socioeconomic and agricultural interest that suffers from groundwater overexploitation leading to a dramatic decrease of groundwater levels. The purpose of the optimisation tool is to determine which wells to exclude from the monitoring network because they add little or no beneficial information to groundwater level mapping of the area. Unlike previous relevant investigations, the network optimisation tool presented here uses Ordinary Kriging with the recently established non-differentiable Spartan variogram for groundwater level mapping, which, based on a previous geostatistical study in the area, leads to optimal groundwater level mapping. Seventy boreholes operate in the area for groundwater abstraction and water level monitoring. The Spartan variogram gives overall the most accurate groundwater level estimates, followed closely by the power-law model. The geostatistical model is coupled to an integer genetic algorithm method programmed in MATLAB 2015a. The algorithm is used to find the set of wells whose removal leads to the minimum error between the original water level mapping using all the available wells in the network and the groundwater level mapping using the reduced well network (the error is defined as the 2-norm of the difference between the original mapping matrix with 70 wells and the mapping matrix of the reduced well network). The solution to the
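    The objective in this record — drop the wells whose removal least perturbs the map, measured by the 2-norm of the difference to the full-network map — can be sketched with a much simpler stand-in: a greedy search in place of the genetic algorithm, and inverse-distance weighting in place of Spartan-variogram Ordinary Kriging. All names and the choice of stand-ins are our assumptions.

```python
import numpy as np

def idw_map(wells_xy, values, grid_xy, power=2.0):
    # Inverse-distance-weighted mapping (a simple stand-in for kriging).
    d = np.linalg.norm(grid_xy[:, None, :] - wells_xy[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    return (w * values).sum(axis=1) / w.sum(axis=1)

def greedy_remove(wells_xy, values, grid_xy, n_remove):
    # Repeatedly drop the single well whose removal minimizes the 2-norm
    # of (full-network map - reduced-network map), as in the abstract's
    # error definition, but greedily rather than with a genetic algorithm.
    full = idw_map(wells_xy, values, grid_xy)
    keep = list(range(len(values)))
    for _ in range(n_remove):
        errs = []
        for i in keep:
            sub = [j for j in keep if j != i]
            errs.append(np.linalg.norm(full - idw_map(wells_xy[sub], values[sub], grid_xy)))
        keep.remove(keep[int(np.argmin(errs))])
    return keep
```

A genetic algorithm explores subsets jointly instead of one removal at a time, which matters when wells carry redundant information in combination.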

  6. Geostatistics, remote sensing and precision farming.

    PubMed

    Mulla, D J

    1997-01-01

    Precision farming is possible today because of advances in farming technology, procedures for mapping and interpolating spatial patterns, and geographic information systems for overlaying and interpreting several soil, landscape and crop attributes. The key component of precision farming is the map showing spatial patterns in field characteristics. Obtaining information for this map is often achieved by soil sampling. This approach, however, can be cost-prohibitive for grain crops. Soil sampling strategies can be simplified by use of auxiliary data provided by satellite or aerial photo imagery. This paper describes geostatistical methods for estimating spatial patterns in soil organic matter, soil test phosphorus and wheat grain yield from a combination of Thematic Mapper imaging and soil sampling.

  7. Integration of GIS, Geostatistics, and 3-D Technology to Assess the Spatial Distribution of Soil Moisture

    NASA Technical Reports Server (NTRS)

    Betts, M.; Tsegaye, T.; Tadesse, W.; Coleman, T. L.; Fahsi, A.

    1998-01-01

    The spatial and temporal distribution of near-surface soil moisture is of fundamental importance to many physical, biological, biogeochemical, and hydrological processes. However, knowledge of these space-time dynamics and the processes that control them remains limited. The integration of geographic information systems (GIS) and geostatistics promises a simple mechanism to evaluate and display the spatial and temporal distribution of this vital hydrologic and physical variable. Therefore, this research demonstrates the use of geostatistics and GIS to predict and display soil moisture distribution under vegetated and non-vegetated plots. The research was conducted at the Winfred Thomas Agricultural Experiment Station (WTAES), Hazel Green, Alabama. Soil moisture measurements were made on a 10 by 10 m grid in tall fescue grass (GR), alfalfa (AA), bare rough (BR), and bare smooth (BS) plots. Results indicated that the variance associated with soil moisture was higher for vegetated plots than for non-vegetated plots. The presence of vegetation in general contributed to the spatial variability of soil moisture. Integration of geostatistics and GIS can improve the productivity of farm lands and the precision of farming.

  8. Geostatistical Sampling Methods for Efficient Uncertainty Analysis in Flow and Transport Problems

    NASA Astrophysics Data System (ADS)

    Liodakis, Stylianos; Kyriakidis, Phaedon; Gaganis, Petros

    2015-04-01

    In hydrogeological applications involving flow and transport in heterogeneous porous media, the spatial distribution of hydraulic conductivity is often parameterized in terms of a lognormal random field based on a histogram and variogram model inferred from data and/or synthesized from relevant knowledge. Realizations of simulated conductivity fields are then generated using geostatistical simulation involving simple random (SR) sampling and are subsequently used as inputs to physically-based simulators of flow and transport in a Monte Carlo framework for evaluating the uncertainty in the spatial distribution of solute concentration due to the uncertainty in the spatial distribution of hydraulic conductivity [1]. Realistic uncertainty analysis, however, calls for a large number of simulated concentration fields and hence can become expensive in terms of both time and computer resources. A more efficient alternative to SR sampling is Latin hypercube (LH) sampling, a special case of stratified random sampling, which yields a more representative distribution of simulated attribute values with fewer realizations [2]. Here, the term representative implies realizations spanning efficiently the range of possible conductivity values corresponding to the lognormal random field. In this work we investigate the efficiency of alternative methods to classical LH sampling within the context of simulation of flow and transport in a heterogeneous porous medium. More precisely, we consider the stratified likelihood (SL) sampling method of [3], in which attribute realizations are generated using the polar simulation method by exploring the geometrical properties of the multivariate Gaussian distribution function. 
In addition, we propose a more efficient version of the above method, here termed minimum energy (ME) sampling, whereby a set of N representative conductivity realizations at M locations is constructed by: (i) generating a representative set of N points distributed on the
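    The contrast between simple random and Latin hypercube draws of a lognormal variable can be shown in a short sketch (the function name and parameters are ours; this is the classical one-dimensional LH recipe, not the SL or ME method of the abstract):

```python
import numpy as np
from statistics import NormalDist  # Python >= 3.8

def latin_hypercube_lognormal(n, mu=0.0, sigma=1.0, rng=None):
    # Stratify (0, 1) into n equal bins, draw exactly one uniform value in
    # each bin, shuffle, then map through the lognormal inverse CDF
    # exp(Phi^{-1}(u)); every stratum of the distribution is hit once.
    rng = rng or np.random.default_rng()
    u = (np.arange(n) + rng.uniform(1e-9, 1.0, size=n)) / n
    rng.shuffle(u)
    nd = NormalDist(mu, sigma)
    return np.exp([nd.inv_cdf(ui) for ui in u])
```

Because each of the n probability strata contributes exactly one draw, the empirical distribution of the log-values is nearly uniform over (0, 1), so far fewer realizations are needed to span the conductivity range than with simple random sampling.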

  9. A geostatistical approach to the change-of-support problem and variable-support data fusion in spatial analysis

    NASA Astrophysics Data System (ADS)

    Wang, Jun; Wang, Yang; Zeng, Hui

    2016-01-01

    A key issue to address in synthesizing spatial data with variable support in spatial analysis and modeling is the change-of-support problem. We present an approach for solving the change-of-support and variable-support data fusion problems. This approach is based on geostatistical inverse modeling that explicitly accounts for differences in spatial support. The inverse model is applied here to produce both the best predictions of a target support and the prediction uncertainties, based on one or more measurements, while honoring those measurements. Spatial data covering large geographic areas often exhibit spatial nonstationarity and can pose computational challenges due to the large data size. We developed a local-window geostatistical inverse modeling approach to accommodate spatial nonstationarity and alleviate the computational burden. We conducted experiments using synthetic and real-world raster data. Synthetic data were generated and aggregated to multiple supports and downscaled back to the original support to analyze the accuracy of spatial predictions and the correctness of prediction uncertainties. Similar experiments were conducted for real-world raster data. Real-world data with variable support were statistically fused to produce single-support predictions and associated uncertainties. The modeling results demonstrate that geostatistical inverse modeling can produce accurate predictions and associated prediction uncertainties. It is shown that the suggested local-window geostatistical inverse modeling approach offers a practical way to solve the well-known change-of-support and variable-support data fusion problems in spatial analysis and modeling.

  10. [Bayesian geostatistical prediction of soil organic carbon contents of solonchak soils in northern Tarim Basin, Xinjiang, China.

    PubMed

    Wu, Wei Mo; Wang, Jia Qiang; Cao, Qi; Wu, Jia Ping

    2017-02-01

    Accurate prediction of the soil organic carbon (SOC) distribution is crucial for soil resource utilization and conservation, climate change adaptation, and ecosystem health. In this study, we selected a 1300 m × 1700 m solonchak sampling area in the northern Tarim Basin, Xinjiang, China, and collected a total of 144 soil samples (5-10 cm). The objectives of this study were to build a Bayesian geostatistical model to predict SOC content, and to assess the performance of the Bayesian model for the prediction of SOC content by comparing it with three other geostatistical approaches [ordinary kriging (OK), sequential Gaussian simulation (SGS), and inverse distance weighting (IDW)]. In the study area, soil organic carbon contents ranged from 1.59 to 9.30 g·kg⁻¹ with a mean of 4.36 g·kg⁻¹ and a standard deviation of 1.62 g·kg⁻¹. The sample semivariogram was best fitted by an exponential model with a nugget-to-sill ratio of 0.57. Using the Bayesian geostatistical approach, we generated the SOC content map and obtained the prediction variance and the upper and lower 95% bounds of SOC content, which were then used to evaluate the prediction uncertainty. The Bayesian geostatistical approach performed better than OK, SGS, and IDW, demonstrating the advantages of the Bayesian approach in SOC prediction.

  11. Nonpoint Source Solute Transport Normal to Aquifer Bedding in Heterogeneous, Markov Chain Random Fields

    NASA Astrophysics Data System (ADS)

    Zhang, H.; Harter, T.; Sivakumar, B.

    2005-12-01

    Facies-based geostatistical models have become important tools for the stochastic analysis of flow and transport processes in heterogeneous aquifers. However, little is known about the dependency of these processes on the parameters of facies-based geostatistical models. This study examines nonpoint-source solute transport normal to the major bedding plane in the presence of interconnected high-conductivity (coarse-textured) facies in the aquifer medium, and the dependence of the transport behavior on the parameters of the constitutive facies model. A facies-based Markov chain geostatistical model is used to quantify the spatial variability of the aquifer system's hydrostratigraphy. It is integrated with a groundwater flow model and a random-walk particle transport model to estimate the solute travel time probability distribution functions (pdfs) for solute flux from the water table to the bottom boundary (production horizon) of the aquifer. The cases examined include two-, three-, and four-facies models with horizontal-to-vertical facies mean-length anisotropy ratios, e_k, from 25:1 to 300:1, and with a wide range of facies volume proportions (e.g., from 5% to 95% coarse-textured facies). Predictions of travel time pdfs are found to be significantly affected by the number of hydrostratigraphic facies identified in the aquifer, the proportions of coarse-textured sediments, the mean length of the facies (particularly the ratio of length to thickness of coarse materials), and - to a lesser degree - the juxtapositional preference among the hydrostratigraphic facies. In transport normal to the sedimentary bedding plane, travel time pdfs are not log-normally distributed, as is often assumed. Also, macrodispersive behavior (the variance of the travel time pdf) was found not to be a unique function of the conductivity variance. The skewness of the travel time pdf varied from negatively skewed to strongly positively skewed within the parameter range examined. 
We also show
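    A one-dimensional version of such a facies Markov chain can be sketched as follows. This is a toy of our own (names and the discretization are assumptions; production models of this kind work in 3-D with full transition-probability matrices): the chance of leaving the current facies over a cell of thickness dz is dz divided by that facies' mean length.

```python
import numpy as np

def simulate_facies_column(mean_lengths, trans, n_cells, dz=1.0, rng=None):
    # 1-D facies Markov chain: mean_lengths[i] is the mean thickness of
    # facies i (same units as dz); trans[i, j] is the probability of
    # jumping to facies j *given* a transition out of i (rows sum to 1,
    # zero on the diagonal).
    rng = rng or np.random.default_rng()
    k = len(mean_lengths)
    state = int(rng.integers(k))
    column = np.empty(n_cells, dtype=int)
    for c in range(n_cells):
        column[c] = state
        if rng.random() < dz / mean_lengths[state]:  # leave the current facies?
            state = int(rng.choice(k, p=trans[state]))
    return column
```

Long simulated columns reproduce both the mean facies lengths and the volume proportions implied by them, which is why mean length and proportion appear as the controlling parameters in the abstract's sensitivity results.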

  12. Geostatistics for spatial genetic structures: study of wild populations of perennial ryegrass.

    PubMed

    Monestiez, P; Goulard, M; Charmet, G

    1994-04-01

    Methods based on geostatistics were applied to quantitative traits of agricultural interest measured on a collection of 547 wild populations of perennial ryegrass in France. The mathematical background of these methods, which resembles spatial autocorrelation analysis, is briefly described. When a single variable is studied, the spatial structure analysis is similar to spatial autocorrelation analysis, and a spatial prediction method, called "kriging", gives a filtered map of the spatial pattern over all the sampled area. When complex interactions of agronomic traits with different evaluation sites define a multivariate structure for the spatial analysis, geostatistical methods allow the spatial variations to be broken down into two main spatial structures with ranges of 120 km and 300 km, respectively. The predicted maps that corresponded to each range were interpreted as a result of the isolation-by-distance model and as a consequence of selection by environmental factors. Practical collecting methodology for breeders may be derived from such spatial structures.

  13. Soil texture and organic carbon fractions predicted from near-infrared spectroscopy and geostatistics

    USDA-ARS?s Scientific Manuscript database

    Field-specific management could help achieve agricultural sustainability by increasing production and decreasing environmental impacts. Near-infrared spectroscopy (NIRS) and geostatistics are relatively unexplored tools that could reduce time, labor, and costs of soil analysis. Our objective was to ...

  14. A pilot cluster randomized controlled trial of structured goal-setting following stroke.

    PubMed

    Taylor, William J; Brown, Melanie; Levack, William; McPherson, Kathryn M; Reed, Kirk; Dean, Sarah G; Weatherall, Mark

    2012-04-01

    To determine the feasibility, the cluster design effect, and the variance and minimal clinically important difference in the primary outcome in a pilot study of a structured approach to goal-setting. A cluster randomized controlled trial. Inpatient rehabilitation facilities. People admitted to inpatient rehabilitation following stroke who had sufficient cognition to engage in structured goal-setting and complete the primary outcome measure. Structured goal elicitation using the Canadian Occupational Performance Measure. Quality of life at 12 weeks using the Schedule for Individualised Quality of Life (SEIQOL-DW), Functional Independence Measure, Short Form 36, and Patient Perception of Rehabilitation (measuring satisfaction with rehabilitation). Assessors were blinded to the intervention. Four rehabilitation services and 41 patients were randomized. We found high values of the intraclass correlation for the outcome measures (ranging from 0.03 to 0.40) and high variance of the SEIQOL-DW (SD 19.6) relative to the minimally important difference of 2.1, leading to impractically large sample size requirements for a cluster randomized design. A cluster randomized design is not a practical means of avoiding contamination effects in studies of inpatient rehabilitation goal-setting. Other techniques for coping with contamination effects are necessary.

  15. Testing geostatistical methods to combine radar and rain gauges for precipitation mapping in a mountainous region

    NASA Astrophysics Data System (ADS)

    Erdin, R.; Frei, C.; Sideris, I.; Kuensch, H.-R.

    2010-09-01

    There is an increasing demand for accurate mapping of precipitation at a spatial resolution of kilometers. Radar and rain gauges - the two main precipitation measurement systems - exhibit complementary strengths and weaknesses. Radar offers high spatial and temporal resolution but lacks accuracy of absolute values, whereas rain gauges provide accurate values at their specific point location but suffer from poor spatial representativeness. Methods of geostatistical mapping have been proposed to combine radar and rain gauge data for quantitative precipitation estimation (QPE). The aim is to combine the respective strengths and compensate for the respective weaknesses of the two observation platforms. Several studies have demonstrated the potential of these methods over topography of moderate complexity, but their performance remains unclear for high-mountain regions where rainfall patterns are complex, the representativeness of rain gauge measurements is limited and radar observations are obstructed. In this study we examine the potential and limitations of two frequently used geostatistical mapping methods for the territory of Switzerland, where the mountain chain of the Alps poses particular challenges to QPE. The two geostatistical methods explored are kriging with external drift (KED) using radar as drift variable and ordinary kriging of radar errors (OKRE). The radar data is a composite from three C-band radars using a constant Z-R relationship, advanced correction processing for visibility, ground clutter and beam shielding and a climatological bias adjustment. The rain gauge data originates from an automatic network with a typical inter-station distance of 25 km. Both combination methods are applied to a set of case examples representing typical rainfall situations in the Alps with their inherent challenges at daily and hourly time resolution. The quality of precipitation estimates is assessed by several skill scores calculated from cross validation errors at
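
    The record above centers on kriging with external drift (KED) with radar as the drift variable. As an illustration only, here is a minimal KED sketch assuming a Gaussian covariance model and entirely hypothetical gauge/radar values; the study's actual covariance estimation and data differ.

```python
import numpy as np

def ked_estimate(xy, z, drift, x0, drift0, sill=1.0, corr_len=20.0):
    """Kriging with external drift: an ordinary-kriging system extended
    with one drift constraint, using an assumed Gaussian covariance."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    C = sill * np.exp(-((d / corr_len) ** 2))        # gauge-gauge covariance
    c0 = sill * np.exp(-((np.linalg.norm(xy - x0, axis=1) / corr_len) ** 2))
    # bordered system: unbiasedness (sum lambda = 1) and the drift constraint
    A = np.zeros((n + 2, n + 2))
    A[:n, :n] = C
    A[:n, n] = A[n, :n] = 1.0
    A[:n, n + 1] = A[n + 1, :n] = drift
    b = np.concatenate([c0, [1.0, drift0]])
    lam = np.linalg.solve(A, b)[:n]                  # kriging weights
    return float(lam @ z)

# four hypothetical rain gauges on a 10 km square; radar as drift variable
xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
radar = np.array([2.0, 4.0, 6.0, 8.0])        # radar estimate at each gauge
gauge = 1.5 * radar + 1.0                     # gauge rain, linear in radar
est = ked_estimate(xy, gauge, radar, x0=np.array([5.0, 5.0]), drift0=5.0)
```

    Because the constraints force the weights to reproduce any mean that is linear in the drift, the estimate here equals the regression value 1.5 × 5 + 1 = 8.5.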

  16. Geostatistical analysis of fault and joint measurements in Austin Chalk, Superconducting Super Collider Site, Texas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mace, R.E.; Nance, H.S.; Laubach, S.E.

    1995-06-01

    Faults and joints are conduits for ground-water flow and targets for horizontal drilling in the petroleum industry. Spacing and size distribution are rarely predicted accurately by current structural models or documented adequately by conventional borehole or outcrop samples. Tunnel excavations present opportunities to measure fracture attributes in continuous subsurface exposures. These fracture measurements can be used to improve structural models, guide interpretation of conventional borehole and outcrop data, and geostatistically quantify spatial and spacing characteristics for comparison to outcrop data or for generating distributions of fractures for numerical flow and transport modeling. Structure maps of over 9 mi of nearly continuous tunnel excavations in Austin Chalk at the Superconducting Super Collider (SSC) site in Ellis County, Texas, provide a unique database of fault and joint populations for geostatistical analysis. Observationally, small faults (<10 ft. throw) occur in clusters or swarms that have as many as 24 faults, fault swarms are as much as 2,000 ft. wide and appear to be on average 1,000 ft. apart, and joints are in swarms spaced 500 to more than 21,000 ft. apart. Semi-variograms show varying degrees of spatial correlation. These variograms have structured sills that correlate directly to highs and lows in fracture frequency observed in the tunnel. Semi-variograms generated with respect to fracture spacing and number also have structured sills, but tend not to show any near-field correlation. The distribution of fault spacing can be described with a negative exponential, which suggests a random distribution. However, there is clearly some structure and clustering in the spacing data as shown by running averages and variograms, which implies that a number of different methods should be utilized to characterize fracture spacing.
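
    The analysis above rests on experimental semi-variograms of fracture measurements. A minimal sketch of the classical (Matheron) semivariogram estimator follows, applied to hypothetical fracture-frequency data, not the SSC tunnel data.

```python
import numpy as np

def experimental_semivariogram(x, z, lags, tol):
    """Classical estimator: gamma(h) = mean of 0.5 * (z_i - z_j)^2 over
    pairs whose separation |x_i - x_j| is within tol of lag h."""
    d = np.abs(x[:, None] - x[None, :])
    sq = (z[:, None] - z[None, :]) ** 2
    gamma = []
    for h in lags:
        # each unordered pair appears twice in the symmetric mask,
        # which cancels in the mean
        mask = (np.abs(d - h) <= tol) & (d > 0)
        gamma.append(0.5 * sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# hypothetical fracture-frequency log along a tunnel (stations every 5 ft)
rng = np.random.default_rng(0)
x = np.arange(0.0, 1000.0, 5.0)
z = np.sin(2.0 * np.pi * x / 400.0) + 0.3 * rng.standard_normal(x.size)
gam = experimental_semivariogram(x, z, lags=np.array([5.0, 20.0, 80.0, 160.0]), tol=2.5)
```

    For this synthetic log, the semivariogram rises from a small nugget-dominated value at short lags toward the sill as the lag approaches the scale of the underlying structure.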

  17. A Bayesian geostatistical approach for evaluating the uncertainty of contaminant mass discharges from point sources

    NASA Astrophysics Data System (ADS)

    Troldborg, M.; Nowak, W.; Binning, P. J.; Bjerg, P. L.

    2012-12-01

    Estimates of mass discharge (mass/time) are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Mass discharge estimates are, however, prone to rather large uncertainties as they integrate uncertain spatial distributions of both concentration and groundwater flow velocities. For risk assessments or any other decisions that are being based on mass discharge estimates, it is essential to address these uncertainties. We present a novel Bayesian geostatistical approach for quantifying the uncertainty of the mass discharge across a multilevel control plane. The method decouples the flow and transport simulation and has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is based on conditional geostatistical simulation and accounts for i) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics (including the uncertainty in covariance functions), ii) measurement uncertainty, and iii) uncertain source zone geometry and transport parameters. The method generates multiple equally likely realizations of the spatial flow and concentration distribution, which all honour the measured data at the control plane. The flow realizations are generated by analytical co-simulation of the hydraulic conductivity and the hydraulic gradient across the control plane. These realizations are made consistent with measurements of both hydraulic conductivity and head at the site. An analytical macro-dispersive transport solution is employed to simulate the mean concentration distribution across the control plane, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed

  18. Clustering of Multivariate Geostatistical Data

    NASA Astrophysics Data System (ADS)

    Fouedjio, Francky

    2017-04-01

    Multivariate data indexed by geographical coordinates have become omnipresent in the geosciences and pose substantial analysis challenges. One of them is the grouping of data locations into spatially contiguous clusters so that data locations belonging to the same cluster have a certain degree of homogeneity while data locations in different clusters are as different as possible. However, groups of data locations created through classical clustering techniques often show poor spatial contiguity, a feature obviously inconvenient for many geoscience applications. In this work, we develop a clustering method that overcomes this problem by accounting for the spatial dependence structure of the data, thus reinforcing the spatial contiguity of the resulting clusters. The capability of the proposed clustering method to provide spatially contiguous and meaningful clusters of data locations is assessed using both synthetic and real datasets. Keywords: clustering, geostatistics, spatial contiguity, spatial dependence.

  19. Regional flow duration curves: Geostatistical techniques versus multivariate regression

    USGS Publications Warehouse

    Pugliese, Alessio; Farmer, William H.; Castellarin, Attilio; Archfield, Stacey A.; Vogel, Richard M.

    2016-01-01

    A period-of-record flow duration curve (FDC) represents the relationship between the magnitude and frequency of daily streamflows. Prediction of FDCs is of great importance for locations characterized by sparse or missing streamflow observations. We present a detailed comparison of two methods capable of predicting an FDC at ungauged basins: (1) an adaptation of the geostatistical method Top-kriging, employing a linear weighted average of dimensionless empirical FDCs standardised with a reference streamflow value; and (2) regional multiple linear regression of streamflow quantiles, perhaps the most common method for the prediction of FDCs at ungauged sites. In particular, Top-kriging relies on a metric for expressing the similarity between catchments computed as the negative deviation of the FDC from a reference streamflow value, which we term total negative deviation (TND). Comparisons of these two methods are made in 182 largely unregulated river catchments in the southeastern U.S. using a three-fold cross-validation algorithm. Our results reveal that the two methods perform similarly across flow regimes, with average Nash-Sutcliffe Efficiencies of 0.566 and 0.662 (0.883 and 0.829 on log-transformed quantiles) for the geostatistical and the linear regression models, respectively. The differences in the reproduction of FDCs occurred mostly for low flows with exceedance probability (i.e. duration) above 0.98.
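
    The comparison above is scored with the Nash-Sutcliffe efficiency. The metric itself is simple; a sketch with hypothetical streamflow quantiles (not the study's data):

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
    NSE = 1 is a perfect fit; NSE = 0 means the model is no better than
    predicting the mean of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# hypothetical observed and predicted streamflow quantiles (m^3/s)
obs = np.array([120.0, 60.0, 30.0, 12.0, 4.0])
sim = np.array([110.0, 65.0, 28.0, 10.0, 5.0])
nse = nash_sutcliffe(obs, sim)
```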

  20. Geostatistical radar-raingauge combination with nonparametric correlograms: methodological considerations and application in Switzerland

    NASA Astrophysics Data System (ADS)

    Schiemann, R.; Erdin, R.; Willi, M.; Frei, C.; Berenguer, M.; Sempere-Torres, D.

    2011-05-01

    Modelling spatial covariance is an essential part of all geostatistical methods. Traditionally, parametric semivariogram models are fit from available data. More recently, it has been suggested to use nonparametric correlograms obtained from spatially complete data fields. Here, both estimation techniques are compared. Nonparametric correlograms are shown to have a substantial negative bias. Nonetheless, when combined with the sample variance of the spatial field under consideration, they yield an estimate of the semivariogram that is unbiased for small lag distances. This justifies the use of this estimation technique in geostatistical applications. Various formulations of geostatistical combination (Kriging) methods are used here for the construction of hourly precipitation grids for Switzerland based on data from a sparse realtime network of raingauges and from a spatially complete radar composite. Two variants of Ordinary Kriging (OK) are used to interpolate the sparse gauge observations. In both OK variants, the radar data are only used to determine the semivariogram model. One variant relies on a traditional parametric semivariogram estimate, whereas the other variant uses the nonparametric correlogram. The variants are tested for three cases and the impact of the semivariogram model on the Kriging prediction is illustrated. For the three test cases, the method using nonparametric correlograms performs equally well or better than the traditional method, and at the same time offers great practical advantages. Furthermore, two variants of Kriging with external drift (KED) are tested, both of which use the radar data to estimate nonparametric correlograms, and as the external drift variable. The first KED variant has been used previously for geostatistical radar-raingauge merging in Catalonia (Spain). The second variant is newly proposed here and is an extension of the first. Both variants are evaluated for the three test cases as well as an extended evaluation
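
    The key relation used above, recovering an approximately unbiased semivariogram from a nonparametric correlogram and the sample variance of the field, is gamma(h) = s^2 * (1 - rho(h)) under second-order stationarity. A sketch with hypothetical correlogram values:

```python
import numpy as np

def semivariogram_from_correlogram(sample_variance, rho):
    """Combine a (possibly nonparametric) correlogram rho(h) with the
    sample variance s^2 of the field: gamma(h) = s^2 * (1 - rho(h))."""
    return sample_variance * (1.0 - np.asarray(rho, float))

# hypothetical correlogram values at increasing lags; s^2 = 4 mm^2
rho = np.array([1.0, 0.8, 0.5, 0.2, 0.0])
gamma = semivariogram_from_correlogram(4.0, rho)
```

    The semivariogram starts at zero for zero lag and rises to the sample variance where the correlogram has decayed to zero.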

  1. Geostatistical radar-raingauge combination with nonparametric correlograms: methodological considerations and application in Switzerland

    NASA Astrophysics Data System (ADS)

    Schiemann, R.; Erdin, R.; Willi, M.; Frei, C.; Berenguer, M.; Sempere-Torres, D.

    2010-09-01

    Modelling spatial covariance is an essential part of all geostatistical methods. Traditionally, parametric semivariogram models are fit from available data. More recently, it has been suggested to use nonparametric correlograms obtained from spatially complete data fields. Here, both estimation techniques are compared. Nonparametric correlograms are shown to have a substantial negative bias. Nonetheless, when combined with the sample variance of the spatial field under consideration, they yield an estimate of the semivariogram that is unbiased for small lag distances. This justifies the use of this estimation technique in geostatistical applications. Various formulations of geostatistical combination (Kriging) methods are used here for the construction of hourly precipitation grids for Switzerland based on data from a sparse realtime network of raingauges and from a spatially complete radar composite. Two variants of Ordinary Kriging (OK) are used to interpolate the sparse gauge observations. In both OK variants, the radar data are only used to determine the semivariogram model. One variant relies on a traditional parametric semivariogram estimate, whereas the other variant uses the nonparametric correlogram. The variants are tested for three cases and the impact of the semivariogram model on the Kriging prediction is illustrated. For the three test cases, the method using nonparametric correlograms performs equally well or better than the traditional method, and at the same time offers great practical advantages. Furthermore, two variants of Kriging with external drift (KED) are tested, both of which use the radar data to estimate nonparametric correlograms, and as the external drift variable. The first KED variant has been used previously for geostatistical radar-raingauge merging in Catalonia (Spain). The second variant is newly proposed here and is an extension of the first. Both variants are evaluated for the three test cases as well as an extended evaluation

  2. Geostatistical simulations for radon indoor with a nested model including the housing factor.

    PubMed

    Cafaro, C; Giovani, C; Garavaglia, M

    2016-01-01

    The definition of radon-prone areas is the subject of much research in radioecology, since radon is considered a leading cause of lung tumours and the authorities therefore ask for support in developing an appropriate sanitary prevention strategy. In this paper, we use geostatistical tools to elaborate a definition accounting for some of the available information about the dwellings. Co-kriging is the interpolator used in geostatistics to refine predictions by means of external covariates. A priori, co-kriging is not guaranteed to improve significantly on the results obtained by common lognormal kriging. Here, however, the multivariate approach reduces the cross-validation residual variance to an extent that is deemed satisfying. Furthermore, with the application of Monte Carlo simulations, the paradigm provides a more conservative definition of radon-prone areas than the one previously made by lognormal kriging. Copyright © 2015 Elsevier Ltd. All rights reserved.
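
    A sketch of the Monte Carlo classification step described above, using hypothetical lognormal simulations and an assumed reference level of 300 Bq/m^3 with an assumed 10% prone-area criterion; the paper's actual thresholds and conditional simulations differ.

```python
import numpy as np

rng = np.random.default_rng(42)
# hypothetical conditional simulations of indoor radon (Bq/m^3) at four
# locations: 500 realizations drawn from assumed lognormal distributions
mu = np.array([4.5, 5.2, 5.8, 6.3])           # mean log-concentration
sims = rng.lognormal(mean=mu, sigma=0.6, size=(500, 4))

threshold = 300.0                              # assumed reference level
p_exceed = (sims > threshold).mean(axis=0)     # exceedance probability
prone = p_exceed > 0.1                         # assumed prone-area criterion
```

    Classifying on the exceedance probability of the simulated distribution, rather than on a kriged mean alone, is what makes the definition more conservative.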

  3. A Geostatistical Scaling Approach for the Generation of Non Gaussian Random Variables and Increments

    NASA Astrophysics Data System (ADS)

    Guadagnini, Alberto; Neuman, Shlomo P.; Riva, Monica; Panzeri, Marco

    2016-04-01

    We address manifestations of non-Gaussian statistical scaling displayed by many variables, Y, and their (spatial or temporal) increments. Evidence of such behavior includes symmetry of increment distributions at all separation distances (or lags) with sharp peaks and heavy tails which tend to decay asymptotically as lag increases. Variables reported to exhibit such distributions include quantities of direct relevance to hydrogeological sciences, e.g. porosity, log permeability, electrical resistivity, soil and sediment texture, sediment transport rate, rainfall, measured and simulated turbulent fluid velocity, and others. No model known to us captures all of the documented statistical scaling behaviors in a unique and consistent manner. We recently proposed a generalized sub-Gaussian model (GSG) which reconciles within a unique theoretical framework the probability distributions of a target variable and its increments. We presented an algorithm to generate unconditional random realizations of statistically isotropic or anisotropic GSG functions and illustrated it in two dimensions. In this context, we demonstrated the feasibility of estimating all key parameters of a GSG model underlying a single realization of Y by analyzing jointly spatial moments of Y data and corresponding increments. Here, we extend our GSG model to account for noisy measurements of Y at a discrete set of points in space (or time), present an algorithm to generate conditional realizations of the corresponding isotropic or anisotropic random field, and explore them on one- and two-dimensional synthetic test cases.

  4. PCTO-SIM: Multiple-point geostatistical modeling using parallel conditional texture optimization

    NASA Astrophysics Data System (ADS)

    Pourfard, Mohammadreza; Abdollahifard, Mohammad J.; Faez, Karim; Motamedi, Sayed Ahmad; Hosseinian, Tahmineh

    2017-05-01

    Multiple-point geostatistics is a well-known general statistical framework by which complex geological phenomena have been modeled efficiently. Pixel-based and patch-based methods are its two major categories. In this paper, the optimization-based category is used, which has a dual concept in texture synthesis known as texture optimization. Our extended version of texture optimization uses the energy concept to model geological phenomena. While honoring the hard data, the minimization of our proposed cost function forces simulation grid pixels to be as similar as possible to training images. Our algorithm has a self-enrichment capability and creates a richer training database from a sparser one by mixing the information of all patches surrounding the simulation nodes. Therefore, it preserves pattern continuity in both continuous and categorical variables very well. It also shows a fuzzy result in each realization, similar to the expected result of multiple realizations of other statistical models. While the main core of most previous multiple-point geostatistics methods is sequential, the parallel core of our algorithm enables it to use the GPU efficiently to reduce CPU time. One new validation method for MPS has also been proposed in this paper.

  5. Geostatistics: a common link between medical geography, mathematical geology, and medical geology

    PubMed Central

    Goovaerts, P.

    2015-01-01

    Synopsis Since its development in the mining industry, geostatistics has emerged as the primary tool for spatial data analysis in various fields, ranging from earth and atmospheric sciences to agriculture, soil science, remote sensing, and more recently environmental exposure assessment. In the last few years, these tools have been tailored to the field of medical geography or spatial epidemiology, which is concerned with the study of spatial patterns of disease incidence and mortality and the identification of potential ‘causes’ of disease, such as environmental exposure, diet and unhealthy behaviours, economic or socio-demographic factors. On the other hand, medical geology is an emerging interdisciplinary scientific field studying the relationship between natural geological factors and their effects on human and animal health. This paper provides an introduction to the field of medical geology with an overview of geostatistical methods available for the analysis of geological and health data. Key concepts are illustrated using the mapping of groundwater arsenic concentration across eleven Michigan counties and the exploration of its relationship to the incidence of prostate cancer at the township level. PMID:25722963

  6. Geostatistics: a common link between medical geography, mathematical geology, and medical geology.

    PubMed

    Goovaerts, P

    2014-08-01

    Since its development in the mining industry, geostatistics has emerged as the primary tool for spatial data analysis in various fields, ranging from earth and atmospheric sciences to agriculture, soil science, remote sensing, and more recently environmental exposure assessment. In the last few years, these tools have been tailored to the field of medical geography or spatial epidemiology, which is concerned with the study of spatial patterns of disease incidence and mortality and the identification of potential 'causes' of disease, such as environmental exposure, diet and unhealthy behaviours, economic or socio-demographic factors. On the other hand, medical geology is an emerging interdisciplinary scientific field studying the relationship between natural geological factors and their effects on human and animal health. This paper provides an introduction to the field of medical geology with an overview of geostatistical methods available for the analysis of geological and health data. Key concepts are illustrated using the mapping of groundwater arsenic concentration across eleven Michigan counties and the exploration of its relationship to the incidence of prostate cancer at the township level.

  7. Qualitative and quantitative comparison of geostatistical techniques of porosity prediction from the seismic and logging data: a case study from the Blackfoot Field, Alberta, Canada

    NASA Astrophysics Data System (ADS)

    Maurya, S. P.; Singh, K. H.; Singh, N. P.

    2018-05-01

    In the present study, three recently developed geostatistical methods, single attribute analysis, multi-attribute analysis and the probabilistic neural network algorithm, have been used to predict porosity in the inter-well region of the Blackfoot field, Alberta, Canada, an oil field. These techniques make use of seismic attributes generated by model based inversion and colored inversion techniques. The principal objective of the study is to find the suitable combination of seismic inversion and geostatistical techniques to predict porosity and to identify prospective zones in a 3D seismic volume. The porosity estimated from these geostatistical approaches is corroborated with the well log porosity. The results suggest that all three implemented geostatistical methods are efficient and reliable for predicting porosity, but the multi-attribute and probabilistic neural network analyses provide more accurate and higher resolution porosity sections. A low impedance (6000-8000 m/s g/cc) and high porosity (> 15%) zone is interpreted from the inverted impedance and porosity sections, respectively, in the 1060-1075 ms time interval and is characterized as a reservoir. The qualitative and quantitative results demonstrate that, of all the employed geostatistical methods, the probabilistic neural network along with model based inversion is the most efficient method for predicting porosity in the inter-well region.

  8. Analysis of alluvial hydrostratigraphy using indicator geostatistics, with examples from Santa Clara Valley, California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-03-01

    Current trends in hydrogeology seek to enlist sedimentary concepts in the interpretation of permeability structures. However, existing conceptual models of alluvial deposition tend to inadequately account for the heterogeneity caused by complex sedimentological and external factors. This dissertation presents three analyses of alluvial hydrostratigraphy using indicator geostatistics. This approach empirically acknowledges both the random and structured qualities of alluvial structures at scales relevant to site investigations. The first analysis introduces the indicator approach, whereby binary values are assigned to borehole-log intervals on the basis of inferred relative permeability; it presents a case study of indicator variography at a well-documented ground-water contamination site, and uses indicator kriging to interpolate an aquifer-aquitard sequence in three dimensions. The second analysis develops an alluvial-architecture context for interpreting semivariograms, and performs comparative variography for a suite of alluvial sites in Santa Clara Valley, California. The third analysis investigates the use of a water well perforation indicator for assessing large-scale hydrostratigraphic structures within relatively deep production zones.
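
    The indicator approach described above assigns binary values to borehole-log intervals based on inferred relative permeability. A minimal sketch of that coding step, with hypothetical texture ranks (the thresholding convention is an assumption for illustration):

```python
import numpy as np

def indicator_transform(values, threshold):
    """Binary indicator coding of borehole-log intervals: 1 where the
    texture rank is at or below the threshold (inferred low relative
    permeability, i.e. aquitard material), 0 otherwise."""
    return (np.asarray(values) <= threshold).astype(int)

# hypothetical texture ranks for eight log intervals (1 = clay ... 4 = gravel)
texture_rank = np.array([1, 1, 3, 4, 2, 1, 4, 4])
aquitard_indicator = indicator_transform(texture_rank, threshold=2)
```

    The resulting 0/1 series is what indicator variography and indicator kriging then operate on.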

  9. Nonpoint source solute transport normal to aquifer bedding in heterogeneous, Markov chain random fields

    NASA Astrophysics Data System (ADS)

    Zhang, Hua; Harter, Thomas; Sivakumar, Bellie

    2006-06-01

    Facies-based geostatistical models have become important tools for analyzing flow and mass transport processes in heterogeneous aquifers. Yet little is known about the relationship between these latter processes and the parameters of facies-based geostatistical models. In this study, we examine the transport of a nonpoint source solute normal (perpendicular) to the major bedding plane of an alluvial aquifer medium that contains multiple geologic facies, including interconnected, high-conductivity (coarse textured) facies. We also evaluate the dependence of the transport behavior on the parameters of the constitutive facies model. A facies-based Markov chain geostatistical model is used to quantify the spatial variability of the aquifer system's hydrostratigraphy. It is integrated with a groundwater flow model and a random walk particle transport model to estimate the solute traveltime probability density function (pdf) for solute flux from the water table to the bottom boundary (the production horizon) of the aquifer. The cases examined include two-, three-, and four-facies models, with mean length anisotropy ratios for horizontal to vertical facies, e_k, from 25:1 to 300:1 and with a wide range of facies volume proportions (e.g., from 5 to 95% coarse-textured facies). Predictions of traveltime pdfs are found to be significantly affected by the number of hydrostratigraphic facies identified in the aquifer. Those predictions of traveltime pdfs also are affected by the proportions of coarse-textured sediments, the mean length of the facies (particularly the ratio of length to thickness of coarse materials), and, to a lesser degree, the juxtapositional preference among the hydrostratigraphic facies. In transport normal to the sedimentary bedding plane, traveltime is not lognormally distributed as is often assumed. Also, macrodispersive behavior (variance of the traveltime) is found not to be a unique function of the conductivity variance. For the parameter range
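
    A minimal sketch of the sequential-simulation idea behind a Markov chain facies model, reduced to one dimension with a hypothetical two-facies transition matrix; the study's transition probabilities, facies counts, and 3-D implementation differ.

```python
import numpy as np

def simulate_facies_chain(P, start, n, rng):
    """Sequentially simulate a 1-D facies column from a transition-probability
    matrix P, where P[i, j] = Prob(next facies = j | current facies = i)."""
    states = [start]
    for _ in range(n - 1):
        states.append(rng.choice(len(P), p=P[states[-1]]))
    return np.array(states)

# hypothetical two-facies model: 0 = fine-grained, 1 = coarse-grained channel
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])
rng = np.random.default_rng(1)
column = simulate_facies_chain(P, start=0, n=2000, rng=rng)
# stationary proportion of coarse facies is 0.1 / (0.1 + 0.3) = 0.25
coarse_fraction = column.mean()
```

    The diagonal entries of P control mean facies lengths, which is how such models encode the length and juxtaposition statistics discussed above.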

  10. Performance prediction using geostatistics and window reservoir simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fontanilla, J.P.; Al-Khalawi, A.A.; Johnson, S.G.

    1995-11-01

    This paper is the first window model study in the northern area of a large carbonate reservoir in Saudi Arabia. It describes window reservoir simulation with geostatistics to model uneven water encroachment in the southwest producing area of the northern portion of the reservoir. In addition, this paper describes performance predictions that investigate the sweep efficiency of the current peripheral waterflood. A 50 x 50 x 549 (240 m. x 260 m. x 0.15 m. average grid block size) geological model was constructed with geostatistics software. Conditional simulation was used to obtain spatial distributions of porosity and volume of dolomite. Core data transforms were used to obtain horizontal and vertical permeability distributions. Simple averaging techniques were used to convert the 549-layer geological model to a 50 x 50 x 10 (240 m. x 260 m. x 8 m. average grid block size) window reservoir simulation model. Flux injectors and flux producers were assigned to the outermost grid blocks. Historical boundary flux rates were obtained from a coarsely-gridded full-field model. Pressure distribution, water cuts, GORs, and recent flowmeter data were history matched. Permeability correction factors and numerous parameter adjustments were required to obtain the final history match. The permeability correction factors were based on pressure transient permeability-thickness analyses. The prediction phase of the study evaluated the effects of infill drilling, the use of artificial lifts, workovers, horizontal wells, producing rate constraints, and tight zone development to formulate depletion strategies for the development of this area. The window model will also be used to investigate day-to-day reservoir management problems in this area.

  11. How to Fully Represent Expert Information about Imprecise Properties in a Computer System – Random Sets, Fuzzy Sets, and Beyond: An Overview

    PubMed Central

    Nguyen, Hung T.; Kreinovich, Vladik

    2014-01-01

    To help computers make better decisions, it is desirable to describe all our knowledge in computer-understandable terms. This is easy for knowledge described in terms of numerical values: we simply store the corresponding numbers in the computer. This is also easy for knowledge about precise (well-defined) properties which are either true or false for each object: we simply store the corresponding “true” and “false” values in the computer. The challenge is how to store information about imprecise properties. In this paper, we overview different ways to fully store the expert information about imprecise properties. We show that in the simplest case, when the only source of imprecision is disagreement between different experts, a natural way to store all the expert information is to use random sets; we also show how fuzzy sets naturally appear in such random-set representation. We then show how the random-set representation can be extended to the general (“fuzzy”) case when, in addition to disagreements, experts are also unsure whether some objects satisfy certain properties or not. PMID:25386045
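
    The random-set view sketched above, in which fuzzy membership emerges as the fraction of experts whose set contains a given object, can be illustrated directly. The height ranges below are hypothetical.

```python
def fuzzy_membership(expert_sets, x):
    """Membership degree of x: the fraction of experts whose set
    (their crisp interpretation of the imprecise property) contains x."""
    return sum(x in s for s in expert_sets) / len(expert_sets)

# hypothetical expert interpretations of "tall" as height ranges (cm)
experts = [range(175, 221), range(180, 221), range(178, 221), range(185, 221)]
degrees = [fuzzy_membership(experts, h) for h in (170, 179, 190)]
```

    Heights outside every expert's range get membership 0, heights inside every range get 1, and intermediate heights get graded values, which is exactly how a fuzzy set arises from the random-set representation.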

  12. Geostatistical Prediction of Microbial Water Quality Throughout a Stream Network Using Meteorology, Land Cover, and Spatiotemporal Autocorrelation.

    PubMed

    Holcomb, David A; Messier, Kyle P; Serre, Marc L; Rowny, Jakob G; Stewart, Jill R

    2018-06-25

    Predictive modeling is promising as an inexpensive tool to assess water quality. We developed geostatistical predictive models of microbial water quality that empirically modeled spatiotemporal autocorrelation in measured fecal coliform (FC) bacteria concentrations to improve prediction. We compared five geostatistical models featuring different autocorrelation structures, fit to 676 observations from 19 locations in North Carolina's Jordan Lake watershed using meteorological and land cover predictor variables. Though stream distance metrics (with and without flow-weighting) failed to improve prediction over the Euclidean distance metric, incorporating temporal autocorrelation substantially improved prediction over the space-only models. We predicted FC throughout the stream network daily for one year, designating locations "impaired", "unimpaired", or "unassessed" if the probability of exceeding the state standard was ≥90%, ≤10%, or >10% but <90%, respectively. We could assign impairment status to more of the stream network on days any FC were measured, suggesting frequent sample-based monitoring remains necessary, though implementing spatiotemporal predictive models may reduce the number of concurrent sampling locations required to adequately assess water quality. Together, these results suggest that prioritizing sampling at different times and conditions using geographically sparse monitoring networks is adequate to build robust and informative geostatistical models of water quality impairment.
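
    The impairment rule described above maps a location's probability of exceeding the standard to one of three statuses; a direct sketch of that decision rule:

```python
def impairment_status(p_exceed):
    """Classify a location by its probability of exceeding the state
    standard: >= 0.9 impaired, <= 0.1 unimpaired, otherwise unassessed."""
    if p_exceed >= 0.9:
        return "impaired"
    if p_exceed <= 0.1:
        return "unimpaired"
    return "unassessed"

statuses = [impairment_status(p) for p in (0.95, 0.5, 0.02)]
```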

  13. Geostatistical Analysis of Mesoscale Spatial Variability and Error in SeaWiFS and MODIS/Aqua Global Ocean Color Data

    NASA Astrophysics Data System (ADS)

    Glover, David M.; Doney, Scott C.; Oestreich, William K.; Tullo, Alisdair W.

    2018-01-01

    Mesoscale (10-300 km, weeks to months) physical variability strongly modulates the structure and dynamics of planktonic marine ecosystems via both turbulent advection and environmental impacts upon biological rates. Using structure function analysis (geostatistics), we quantify the mesoscale biological signals within global 13-year SeaWiFS (1998-2010) and 8-year MODIS/Aqua (2003-2010) chlorophyll a ocean color data (Level-3, 9 km resolution). We present geographical distributions, seasonality, and interannual variability of key geostatistical parameters: unresolved variability or noise, resolved variability, and spatial range. Resolved variability is nearly identical for both instruments, indicating that geostatistical techniques isolate a robust measure of biophysical mesoscale variability largely independent of measurement platform. In contrast, unresolved variability in MODIS/Aqua is substantially lower than in SeaWiFS, especially in oligotrophic waters where previous analysis identified a problem for the SeaWiFS instrument likely due to sensor noise characteristics. Both records exhibit a statistically significant relationship between resolved mesoscale variability and the low-pass filtered chlorophyll field horizontal gradient magnitude, consistent with physical stirring acting on the large-scale gradient as an important factor supporting observed mesoscale variability. Comparable horizontal length scales for variability are found from tracer-based scaling arguments and geostatistical decorrelation. Regional variations between these length scales may reflect scale dependence of biological mechanisms that also create variability directly at the mesoscale, for example, enhanced net phytoplankton growth in coastal and frontal upwelling and convective mixing regions. Global estimates of mesoscale biophysical variability provide an improved basis for evaluating higher-resolution coupled ecosystem-ocean general circulation models and data assimilation.
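Structure function analysis rests on the empirical semivariogram: the value extrapolated to zero lag estimates unresolved variability (noise), while the plateau gives the resolved variability. A minimal 1-D sketch (our illustration, run on synthetic white noise rather than chlorophyll data; the study works with 2-D Level-3 fields):

```python
import numpy as np

def empirical_semivariogram(z, max_lag):
    """1-D semivariogram estimate: gamma(h) = 0.5 * mean((z[i+h] - z[i])**2)."""
    return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2)
                     for h in range(1, max_lag + 1)])

rng = np.random.default_rng(0)
z = rng.normal(size=1000)              # pure noise: gamma(h) ~ variance at all lags,
gamma = empirical_semivariogram(z, 5)  # i.e. everything is "unresolved" variability
```

For a field with genuine mesoscale structure, gamma would instead rise from a nonzero nugget (noise) toward a sill (resolved variability) over a decorrelation range.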

  14. A multiple-point geostatistical method for characterizing uncertainty of subsurface alluvial units and its effects on flow and transport

    USGS Publications Warehouse

    Cronkite-Ratcliff, C.; Phelps, G.A.; Boucher, A.

    2012-01-01

    This report provides a proof-of-concept to demonstrate the potential application of multiple-point geostatistics for characterizing geologic heterogeneity and its effect on flow and transport simulation. The study presented in this report is the result of collaboration between the U.S. Geological Survey (USGS) and Stanford University. This collaboration focused on improving the characterization of alluvial deposits by incorporating prior knowledge of geologic structure and estimating the uncertainty of the modeled geologic units. In this study, geologic heterogeneity of alluvial units is characterized as a set of stochastic realizations, and uncertainty is indicated by variability in the results of flow and transport simulations for this set of realizations. This approach is tested on a hypothetical geologic scenario developed using data from the alluvial deposits in Yucca Flat, Nevada. Yucca Flat was chosen as a data source for this test case because it includes both complex geologic and hydrologic characteristics and also contains a substantial amount of both surface and subsurface geologic data. Multiple-point geostatistics is used to model geologic heterogeneity in the subsurface. A three-dimensional (3D) model of spatial variability is developed by integrating alluvial units mapped at the surface with vertical drill-hole data. The SNESIM (Single Normal Equation Simulation) algorithm is used to represent geologic heterogeneity stochastically by generating 20 realizations, each of which represents an equally probable geologic scenario. A 3D numerical model is used to simulate groundwater flow and contaminant transport for each realization, producing a distribution of flow and transport responses to the geologic heterogeneity. From this distribution of flow and transport responses, the frequency of exceeding a given contaminant concentration threshold can be used as an indicator of uncertainty about the location of the contaminant plume boundary.
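The exceedance-frequency indicator of uncertainty described above can be sketched as follows (synthetic concentration fields and threshold; our illustration, not the USGS code):

```python
import numpy as np

# Sketch: given simulated contaminant concentration fields for N equally
# probable geologic realizations, the per-cell frequency of exceeding a
# threshold indicates uncertainty about the plume boundary location.

rng = np.random.default_rng(42)
# 20 hypothetical realizations of a 50 x 50 concentration field
realizations = rng.lognormal(mean=0.0, sigma=1.0, size=(20, 50, 50))

threshold = 2.0
exceed_freq = (realizations > threshold).mean(axis=0)  # fraction per cell

# Cells with exceed_freq near 0 or 1 are confidently outside/inside the plume;
# intermediate values flag uncertain boundary regions.
```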

  15. Systematic evaluation of sequential geostatistical resampling within MCMC for posterior sampling of near-surface geophysical inverse problems

    NASA Astrophysics Data System (ADS)

    Ruggeri, Paolo; Irving, James; Holliger, Klaus

    2015-08-01

    We critically examine the performance of sequential geostatistical resampling (SGR) as a model proposal mechanism for Bayesian Markov-chain-Monte-Carlo (MCMC) solutions to near-surface geophysical inverse problems. Focusing on a series of simple yet realistic synthetic crosshole georadar tomographic examples characterized by different numbers of data, levels of data error and degrees of model parameter spatial correlation, we investigate the efficiency of three different resampling strategies with regard to their ability to generate statistically independent realizations from the Bayesian posterior distribution. Quite importantly, our results show that, no matter what resampling strategy is employed, many of the examined test cases require an unreasonably high number of forward model runs to produce independent posterior samples, meaning that the SGR approach as currently implemented will not be computationally feasible for a wide range of problems. Although use of a novel gradual-deformation-based proposal method can help to alleviate these issues, it does not offer a full solution. Further, we find that the nature of the SGR strongly influences MCMC performance; however, no clear rule exists as to what set of inversion parameters and/or overall proposal acceptance rate will allow for the most efficient implementation. We conclude that although the SGR methodology is highly attractive as it allows for the consideration of complex geostatistical priors as well as conditioning to hard and soft data, further developments are necessary in the context of novel or hybrid MCMC approaches for it to be considered generally suitable for near-surface geophysical inversions.
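The proposal/acceptance structure that SGR plugs into can be sketched with a toy Metropolis-Hastings loop. Here the resampling proposal is a simplified stand-in that redraws a fraction of model cells from an independent Gaussian prior, so (as with SGR) the prior terms cancel and acceptance depends only on the likelihood ratio; the forward model and all parameters are illustrative, not from the study:

```python
import numpy as np

# Minimal Metropolis-Hastings skeleton illustrating where a sequential
# geostatistical resampling (SGR) proposal plugs in. The "proposal" below is a
# toy stand-in: it resimulates a random fraction of cells from an independent
# Gaussian prior, which leaves the prior invariant, so the acceptance test
# reduces to the likelihood ratio.

def log_likelihood(m, data, sigma=0.1):
    return -0.5 * (m.mean() - data) ** 2 / sigma**2   # toy misfit

def resample_proposal(m, frac, rng):
    m_new = m.copy()
    idx = rng.random(m.size) < frac            # choose a fraction of cells
    m_new[idx] = rng.normal(size=idx.sum())    # redraw them from the prior
    return m_new

rng = np.random.default_rng(1)
m, data = rng.normal(size=100), 0.3
accepted = 0
for _ in range(2000):
    m_prop = resample_proposal(m, frac=0.1, rng=rng)
    if np.log(rng.random()) < log_likelihood(m_prop, data) - log_likelihood(m, data):
        m, accepted = m_prop, accepted + 1
# Smaller resampled fractions raise the acceptance rate but slow mixing --
# the efficiency trade-off the study examines.
```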

  16. Principal Component Geostatistical Approach for large-dimensional inverse problems.

    PubMed

    Kitanidis, P K; Lee, J

    2014-07-01

    The quasi-linear geostatistical approach is suited to weakly nonlinear underdetermined inverse problems, such as Hydraulic Tomography and Electrical Resistivity Tomography. It provides best estimates as well as measures for uncertainty quantification. However, in its textbook implementation, the approach involves iterations to reach an optimum and requires the determination of the Jacobian matrix, i.e., the derivative of the observation function with respect to the unknowns. Although there are elegant methods for the determination of the Jacobian, the cost is high when the number of unknowns, m, and the number of observations, n, are large. It is also wasteful to compute the Jacobian for points away from the optimum. Irrespective of the issue of computing derivatives, the computational cost of implementing the method is generally of the order of m²n, though there are methods to reduce the computational cost. In this work, we present an implementation that utilizes a Gauss-Newton method that is matrix-free in terms of the Jacobian matrix and improves the scalability of the geostatistical inverse problem. Each iteration requires K runs of the forward problem, where K is not only much smaller than m but can be smaller than n. The computational and storage cost of the inverse procedure scales roughly linearly with m instead of m² as in the textbook approach. For problems of very large m, this implementation constitutes a dramatic reduction in computational cost compared to the textbook approach. Results illustrate the validity of the approach and provide insight into the conditions under which this method performs best.
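The key ingredient of such a matrix-free scheme is a Jacobian-vector product obtained from forward runs alone, so the m × n Jacobian is never formed or stored. A minimal finite-difference sketch (toy forward model; our illustration, not the authors' implementation):

```python
import numpy as np

# Sketch of the matrix-free building block: approximate J(m) @ v with one
# extra forward-model run, so the Jacobian never needs to be assembled.

def jacobian_vector_product(h, m, v, eps=1e-6):
    """Finite-difference approximation: J(m) @ v ~ (h(m + eps*v) - h(m)) / eps."""
    return (h(m + eps * v) - h(m)) / eps

def h(m):
    # Toy nonlinear forward model mapping 2 unknowns to 2 observations
    return np.array([m[0] ** 2 + m[1], np.sin(m[1])])

m = np.array([1.0, 0.5])
v = np.array([1.0, 0.0])
jv = jacobian_vector_product(h, m, v)
# Analytic J @ v at this point is [2*m[0], 0] = [2.0, 0.0]
```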

  17. An unusual case of random fire-setting behavior associated with lacunar stroke.

    PubMed

    Bosshart, Herbert; Capek, Sonia

    2011-06-15

    A case of a 47-year-old man with a sudden onset of a bizarre and random fire-setting behavior is reported. The man, who had been arrested on felony arson charges, complained of difficulties concentrating and of recent memory impairment. Axial T1-weighted magnetic resonance imaging showed a low intensity lacunar lesion in the genu and anterior limb of the left internal capsule. A neuropsychological test battery revealed lower than normal scores for executive functions, attention and memory, consistent with frontal lobe dysfunction. The recent onset of fire-setting behavior and the chronic nature of the lacunar lesion, together with an unremarkable performance on tests measuring executive functions two years prior, suggested a causal relationship between this organic brain lesion and the fire-setting behavior. The present case describes a rare and as yet unreported association between random impulse-driven fire-setting behavior and damage to the left internal capsule and suggests a disconnection of frontal lobe structures as a possible pathogenic mechanism.

  18. Analysis of the spatio-temporal distribution of Eurygaster integriceps (Hemiptera: Scutelleridae) by using spatial analysis by distance indices and geostatistics.

    PubMed

    Karimzadeh, R; Hejazi, M J; Helali, H; Iranipour, S; Mohammadi, S A

    2011-10-01

    Eurygaster integriceps Puton (Hemiptera: Scutelleridae) is the most serious insect pest of wheat (Triticum aestivum L.) and barley (Hordeum vulgare L.) in Iran. In this study, spatio-temporal distribution of this pest was determined in wheat by using spatial analysis by distance indices (SADIE) and geostatistics. Global positioning and geographic information systems were used for spatial sampling and mapping the distribution of this insect. The study was conducted for three growing seasons in Gharamalek, an agricultural region to the west of Tabriz, Iran. Weekly sampling began when E. integriceps adults migrated to wheat fields from overwintering sites and ended when the new generation adults appeared at the end of season. The adults were sampled using 1- by 1-m quadrat and distance-walk methods. A sweep net was used for sampling the nymphs, and five 180° sweeps were considered as the sampling unit. The results of spatial analyses by using geostatistics and SADIE indicated that E. integriceps adults were clumped after migration to fields and had significant spatial dependency. The second- and third-instar nymphs showed aggregated spatial structure in the middle of growing season. At the end of the season, population distribution changed toward random or regular patterns; and fourth and fifth instars had weaker spatial structure compared with younger nymphs. In Iran, management measures for E. integriceps in wheat fields are mainly applied against overwintering adults, as well as second and third instars. Because of the aggregated distribution of these life stages, site-specific spraying of chemicals is feasible in managing E. integriceps.

  19. Spectral turning bands for efficient Gaussian random fields generation on GPUs and accelerators

    NASA Astrophysics Data System (ADS)

    Hunger, L.; Cosenza, B.; Kimeswenger, S.; Fahringer, T.

    2015-11-01

    A random field (RF) is a set of correlated random variables associated with different spatial locations. RF generation algorithms are of crucial importance for many scientific areas, such as astrophysics, geostatistics, computer graphics, and many others. Current approaches commonly make use of 3D fast Fourier transform (FFT), which does not scale well for RFs larger than the available memory; they are also limited to regular rectilinear meshes. We introduce random field generation with the turning band method (RAFT), an RF generation algorithm based on the turning band method that is optimized for massively parallel hardware such as GPUs and accelerators. Our algorithm replaces the 3D FFT with a lower-order, one-dimensional FFT followed by a projection step and is further optimized with loop unrolling and blocking. RAFT can easily generate RFs on non-regular (non-uniform) meshes and efficiently produce fields with mesh sizes larger than the available device memory by using a streaming, out-of-core approach. Our algorithm generates RFs with the correct statistical behavior and is tested on a variety of modern hardware, such as NVIDIA Tesla, AMD FirePro and Intel Xeon Phi. RAFT is faster than the traditional methods on regular meshes and has been successfully applied to two real case scenarios: planetary nebulae and cosmological simulations.
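Turning-band methods reduce 3-D simulation to many 1-D line simulations that are then projected onto the target mesh. A minimal sketch of the 1-D spectral (FFT-based) building block, using an illustrative exponential-covariance spectrum (our simplification, not the RAFT code):

```python
import numpy as np

# Sketch of the 1-D spectral simulation used as a building block by
# turning-band methods: white noise is filtered in Fourier space so the
# synthesized Gaussian field matches a target power spectrum (here, that of
# an exponential covariance on a unit-spaced grid; parameters illustrative).

def gaussian_field_1d(n, corr_len, rng):
    freqs = np.fft.rfftfreq(n)
    power = corr_len / (1.0 + (2 * np.pi * freqs * corr_len) ** 2)
    noise = rng.normal(size=freqs.size) + 1j * rng.normal(size=freqs.size)
    field = np.fft.irfft(noise * np.sqrt(power), n)
    return field / field.std()          # normalize to unit variance

rng = np.random.default_rng(7)
f = gaussian_field_1d(4096, corr_len=30.0, rng=rng)
# Neighboring samples are strongly correlated for corr_len >> 1.
```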

  20. Precipitation estimation in mountainous terrain using multivariate geostatistics. Part I: structural analysis

    USGS Publications Warehouse

    Hevesi, Joseph A.; Istok, Jonathan D.; Flint, Alan L.

    1992-01-01

    Values of average annual precipitation (AAP) are desired for hydrologic studies within a watershed containing Yucca Mountain, Nevada, a potential site for a high-level nuclear-waste repository. Reliable values of AAP are not yet available for most areas within this watershed because of a sparsity of precipitation measurements and the need to obtain measurements over a sufficient length of time. To estimate AAP over the entire watershed, historical precipitation data and station elevations were obtained from a network of 62 stations in southern Nevada and southeastern California. Multivariate geostatistics (cokriging) was selected as an estimation method because of a significant (p = 0.05) correlation of r = 0.75 between the natural log of AAP and station elevation. A sample direct variogram for the transformed variable, TAAP = ln[(AAP) × 1000], was fitted with an isotropic, spherical model defined by a small nugget value of 5000, a range of 190 000 ft, and a sill value equal to the sample variance of 163 151. Elevations for 1531 additional locations were obtained from topographic maps to improve the accuracy of cokriged estimates. A sample direct variogram for elevation was fitted with an isotropic model consisting of a nugget value of 5500 and three nested transition structures: a Gaussian structure with a range of 61 000 ft, a spherical structure with a range of 70 000 ft, and a quasi-stationary, linear structure. The use of an isotropic, stationary model for elevation was considered valid within a sliding-neighborhood radius of 120 000 ft. The problem of fitting a positive-definite, nonlinear model of coregionalization to an inconsistent sample cross variogram for TAAP and elevation was solved by a modified use of the Cauchy-Schwarz inequality. A selected cross-variogram model consisted of two nested structures: a Gaussian structure with a range of 61 000 ft and a spherical structure with a range of 190 000 ft. Cross validation was used for model selection and for
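The fitted spherical model quoted in this abstract has a standard closed form: gamma(h) = c0 + (c - c0)·(1.5 h/a - 0.5 (h/a)³) for 0 < h < a, and gamma(h) = c (the sill) beyond the range a. A sketch using the reported TAAP parameters (nugget 5000, sill 163 151, range 190 000 ft):

```python
import numpy as np

# Spherical variogram model with the TAAP parameters reported in the abstract.

def spherical_variogram(h, nugget, sill, rng_a):
    h = np.asarray(h, dtype=float)
    gamma = np.where(
        h < rng_a,
        nugget + (sill - nugget) * (1.5 * h / rng_a - 0.5 * (h / rng_a) ** 3),
        sill,                               # at and beyond the range: the sill
    )
    return np.where(h == 0, 0.0, gamma)     # gamma(0) = 0 by definition

g = spherical_variogram([0, 95_000, 190_000, 250_000],
                        nugget=5000, sill=163_151, rng_a=190_000)
# g rises from 0 through the nugget toward the sill, reached at the range.
```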

  1. Geo-statistical analysis of Culicoides spp. distribution and abundance in Sicily, Italy.

    PubMed

    Blanda, Valeria; Blanda, Marcellocalogero; La Russa, Francesco; Scimeca, Rossella; Scimeca, Salvatore; D'Agostino, Rosalia; Auteri, Michelangelo; Torina, Alessandra

    2018-02-01

    Biting midges belonging to Culicoides imicola, Culicoides obsoletus complex and Culicoides pulicaris complex (Diptera: Ceratopogonidae) are increasingly implicated as vectors of bluetongue virus in Palaearctic regions. Culicoides obsoletus complex includes C. obsoletus (sensu stricto), C. scoticus, C. dewulfi and C. chiopterus. Culicoides pulicaris and C. lupicaris belong to the Culicoides pulicaris complex. The aim of this study was a geo-statistical analysis of the abundance and spatial distribution of Culicoides spp. involved in bluetongue virus transmission. As part of the national bluetongue surveillance plan, 7081 catches were collected in 897 Sicilian farms from 2000 to 2013. Onderstepoort-type blacklight traps were used for sample collection and each catch was analysed for the presence of Culicoides spp. and for the presence and abundance of Culicoides vector species (C. imicola, C. pulicaris / C. obsoletus complexes). A geo-statistical analysis was carried out monthly via the interpolation of measured values based on the Inverse Distance Weighted method, using a GIS tool. Raster maps were reclassified into seven classes according to the presence and abundance of Culicoides, in order to obtain suitable maps for Map Algebra operations. Sicilian provinces showing a very high abundance of Culicoides vector species were Messina (80% of the whole area), Palermo (20%) and Catania (12%). A total of 5654 farms fell within the very high risk area for bluetongue (21% of the 26,676 farms active in Sicily); of these, 3483 farms were in Messina, 1567 in Palermo and 604 in Catania. Culicoides imicola was prevalent in Palermo, C. pulicaris in Messina and C. obsoletus complex was very abundant over the whole island with the highest abundance value in Messina. Our study reports the results of a geo-statistical analysis concerning the abundance and spatial distribution of Culicoides spp. in Sicily throughout the fourteen-year study. It provides useful decision support in the
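Inverse Distance Weighted (IDW) interpolation, the method used for the monthly maps above, is straightforward to sketch (the power parameter and toy trap locations are illustrative; the study used a GIS tool):

```python
import numpy as np

# Minimal IDW interpolation: the value at a target point is the average of the
# observations weighted by 1 / distance**power.

def idw(xy_obs, z_obs, xy_target, power=2.0):
    d = np.linalg.norm(xy_obs - xy_target, axis=1)
    if np.any(d == 0):                     # target coincides with a trap site
        return z_obs[np.argmin(d)]
    w = 1.0 / d ** power
    return np.sum(w * z_obs) / np.sum(w)

# Hypothetical trap coordinates and Culicoides counts
xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
z = np.array([10.0, 20.0, 30.0])
at_trap = idw(xy, z, np.array([0.0, 0.0]))    # -> 10.0 (exact hit)
between = idw(xy, z, np.array([0.5, 0.5]))    # a weighted average of the counts
```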

  2. Conditioning geostatistical simulations of a heterogeneous paleo-fluvial bedrock aquifer using lithologs and pumping tests

    NASA Astrophysics Data System (ADS)

    Niazi, A.; Bentley, L. R.; Hayashi, M.

    2016-12-01

    Geostatistical simulations are used to construct heterogeneous aquifer models. Optimally, such simulations should be conditioned with both lithologic and hydraulic data. We introduce an approach to condition lithologic geostatistical simulations of a paleo-fluvial bedrock aquifer, consisting of relatively high-permeability sandstone channels embedded in relatively low-permeability mudstone, using hydraulic data. The hydraulic data consist of two-hour single-well pumping tests extracted from the public water well database for a 250 km² watershed in Alberta, Canada. First, lithologic models of the entire watershed are simulated and conditioned with hard lithological data using transition-probability Markov chain geostatistics (TPROGS). Then, a segment of the simulation around a pumping well is used to populate a flow model (FEFLOW) with either sand or mudstone. The values of the hydraulic conductivity and specific storage of sand and mudstone are then adjusted to minimize the difference between simulated and actual pumping test data using the parameter estimation program PEST. If the simulated pumping test data do not adequately match the measured data, the lithologic model is updated by locally deforming the lithology distribution using the probability perturbation method and the model parameters are again updated with PEST. This procedure is repeated until the simulated and measured data agree within a pre-determined tolerance. The procedure is repeated for each well that has pumping test data. The method creates a local groundwater model that honors both the lithologic model and pumping test data and provides estimates of hydraulic conductivity and specific storage. Eventually, the simulations will be integrated into a watershed-scale groundwater model.
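The iterative conditioning workflow described above can be sketched as a loop. Every callable below is a hypothetical stand-in for an external program (TPROGS, FEFLOW, PEST) or for the probability perturbation step; only the loop structure follows the text:

```python
# Schematic of the conditioning loop: simulate lithology, calibrate flow
# parameters, check the pumping-test misfit, and perturb the lithology until
# simulated and measured data agree within tolerance.

def condition_realization(simulate_lithology, perturb_lithology, calibrate,
                          run_flow_model, misfit, observed, tol, max_iter=20):
    litho = simulate_lithology()                     # TPROGS-style realization
    for _ in range(max_iter):
        params = calibrate(litho, observed)          # PEST: fit K and Ss
        simulated = run_flow_model(litho, params)    # FEFLOW: pumping test
        if misfit(simulated, observed) < tol:
            return litho, params                     # honors lithology + test data
        litho = perturb_lithology(litho)             # probability perturbation
    raise RuntimeError("no conditioned realization within max_iter")

# Toy stand-ins exercise the loop (a perfect "calibration" converges at once):
litho, params = condition_realization(
    simulate_lithology=lambda: 1.0,
    perturb_lithology=lambda l: l + 1.0,
    calibrate=lambda l, obs: obs - l,
    run_flow_model=lambda l, p: l + p,
    misfit=lambda sim, obs: abs(sim - obs),
    observed=5.0, tol=1e-9)
```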

  3. Principal Component Geostatistical Approach for large-dimensional inverse problems

    PubMed Central

    Kitanidis, P K; Lee, J

    2014-01-01

    The quasi-linear geostatistical approach is suited to weakly nonlinear underdetermined inverse problems, such as Hydraulic Tomography and Electrical Resistivity Tomography. It provides best estimates as well as measures for uncertainty quantification. However, in its textbook implementation, the approach involves iterations to reach an optimum and requires the determination of the Jacobian matrix, i.e., the derivative of the observation function with respect to the unknowns. Although there are elegant methods for the determination of the Jacobian, the cost is high when the number of unknowns, m, and the number of observations, n, are large. It is also wasteful to compute the Jacobian for points away from the optimum. Irrespective of the issue of computing derivatives, the computational cost of implementing the method is generally of the order of m²n, though there are methods to reduce the computational cost. In this work, we present an implementation that utilizes a Gauss-Newton method that is matrix-free in terms of the Jacobian matrix and improves the scalability of the geostatistical inverse problem. Each iteration requires K runs of the forward problem, where K is not only much smaller than m but can be smaller than n. The computational and storage cost of the inverse procedure scales roughly linearly with m instead of m² as in the textbook approach. For problems of very large m, this implementation constitutes a dramatic reduction in computational cost compared to the textbook approach. Results illustrate the validity of the approach and provide insight into the conditions under which this method performs best. PMID:25558113

  4. Medical Geography: a Promising Field of Application for Geostatistics

    PubMed Central

    Goovaerts, P.

    2008-01-01

    The analysis of health data and putative covariates, such as environmental, socio-economic, behavioral or demographic factors, is a promising application for geostatistics. It presents, however, several methodological challenges that arise from the fact that data are typically aggregated over irregular spatial supports and consist of a numerator and a denominator (i.e. population size). This paper presents an overview of recent developments in the field of health geostatistics, with an emphasis on three main steps in the analysis of areal health data: estimation of the underlying disease risk, detection of areas with significantly higher risk, and analysis of relationships with putative risk factors. The analysis is illustrated using age-adjusted cervix cancer mortality rates recorded over the 1970–1994 period for 118 counties of four states in the Western USA. Poisson kriging allows the filtering of noisy mortality rates computed from small population sizes, enhancing the correlation with two putative explanatory variables: percentage of inhabitants living below the federally defined poverty line, and percentage of Hispanic females. Area-to-point kriging formulation creates continuous maps of mortality risk, reducing the visual bias associated with the interpretation of choropleth maps. Stochastic simulation is used to generate realizations of cancer mortality maps, which allows one to quantify numerically how the uncertainty about the spatial distribution of health outcomes translates into uncertainty about the location of clusters of high values or the correlation with covariates. Last, geographically-weighted regression highlights the non-stationarity in the explanatory power of covariates: the higher mortality values along the coast are better explained by the two covariates than the lower risk recorded in Utah. PMID:19412347

  5. Medical Geography: a Promising Field of Application for Geostatistics.

    PubMed

    Goovaerts, P

    2009-01-01

    The analysis of health data and putative covariates, such as environmental, socio-economic, behavioral or demographic factors, is a promising application for geostatistics. It presents, however, several methodological challenges that arise from the fact that data are typically aggregated over irregular spatial supports and consist of a numerator and a denominator (i.e. population size). This paper presents an overview of recent developments in the field of health geostatistics, with an emphasis on three main steps in the analysis of areal health data: estimation of the underlying disease risk, detection of areas with significantly higher risk, and analysis of relationships with putative risk factors. The analysis is illustrated using age-adjusted cervix cancer mortality rates recorded over the 1970-1994 period for 118 counties of four states in the Western USA. Poisson kriging allows the filtering of noisy mortality rates computed from small population sizes, enhancing the correlation with two putative explanatory variables: percentage of inhabitants living below the federally defined poverty line, and percentage of Hispanic females. Area-to-point kriging formulation creates continuous maps of mortality risk, reducing the visual bias associated with the interpretation of choropleth maps. Stochastic simulation is used to generate realizations of cancer mortality maps, which allows one to quantify numerically how the uncertainty about the spatial distribution of health outcomes translates into uncertainty about the location of clusters of high values or the correlation with covariates. Last, geographically-weighted regression highlights the non-stationarity in the explanatory power of covariates: the higher mortality values along the coast are better explained by the two covariates than the lower risk recorded in Utah.

  6. Randomized Trial of Infusion Set Function: Steel Versus Teflon

    PubMed Central

    Patel, Parul J.; Benasi, Kari; Ferrari, Gina; Evans, Mark G.; Shanmugham, Satya; Wilson, Darrell M.

    2014-01-01

    Abstract Background: This study compared infusion set function for up to 1 week using either a Teflon® (Dupont™, Wilmington, DE) catheter or a steel catheter for insulin pump therapy in type 1 diabetes mellitus. Subjects and Methods: Twenty subjects participating in a randomized, open-labeled, crossover study were asked to wear two Quick-Set® and two Sure-T® infusion sets (both from Medtronic Minimed, Northridge, CA) until the infusion set failed or was worn for 1 week. All subjects wore a MiniMed continuous glucose monitoring system for the duration of the study. Results: One subject withdrew from the study. There were 38 weeks of Sure-T wear and 39 weeks of Quick-Set wear with no difference in the survival curves of the infusion sets. There was, however, a 15% initial failure rate with the Teflon infusion set. After 7 days, both types of infusion sets had a 64% failure rate. Overall, 30% failed because of hyperglycemia and a failed correction dose, 13% were removed for pain, 10% were pulled out by accident, 10% had erythema and/or induration of >10 mm, 5% fell out because of loss of adhesion, and 4% were removed for infection. The main predictor of length of wear was the individual subject. There was no increase in hyperglycemia or daily insulin requirements when an infusion set was successfully used for 7 days (n=25 of 77 weeks). Conclusions: We found no difference between steel and Teflon infusion sets in their function over 7 days, although 15% of Teflon sets failed because of kinking on insertion. The strongest predictor of prolonged 7-day infusion set function was the individual subject, not the type of infusion set. PMID:24090124

  7. Connectivity ranking of heterogeneous random conductivity models

    NASA Astrophysics Data System (ADS)

    Rizzo, C. B.; de Barros, F.

    2017-12-01

    To overcome the challenges associated with hydrogeological data scarcity, the hydraulic conductivity (K) field is often represented by a spatial random process. The state-of-the-art provides several methods to generate 2D or 3D random K-fields, such as the classic multi-Gaussian fields or non-Gaussian fields, training image-based fields and object-based fields. We provide a systematic comparison of these models based on their connectivity. We use the minimum hydraulic resistance as a connectivity measure, which has been found to be strongly correlated with the early-time arrival of dissolved contaminants. A computationally efficient graph-based algorithm is employed, allowing a stochastic treatment of the minimum hydraulic resistance through a Monte-Carlo approach and therefore enabling the computation of its uncertainty. The results show the impact of geostatistical parameters on the connectivity for each group of random fields and allow the fields to be ranked according to their minimum hydraulic resistance.
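A simplified version of the graph-based computation: on a conductivity grid where traversing a cell costs 1/K, Dijkstra's algorithm yields the least-resistance path between two boundaries (our sketch; the paper's exact graph construction and weighting may differ):

```python
import heapq

# Sketch: minimum hydraulic resistance across a 2-D conductivity grid, where
# each cell contributes a resistance of 1/K, found with Dijkstra's algorithm
# from the left (inlet) column to the right (outlet) column.

def min_hydraulic_resistance(K):
    rows, cols = len(K), len(K[0])
    dist = {(r, 0): 1.0 / K[r][0] for r in range(rows)}     # inlet column
    pq = [(d, rc) for rc, d in dist.items()]
    heapq.heapify(pq)
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if c == cols - 1:
            return d                                        # first outlet pop is optimal
        if d > dist.get((r, c), float("inf")):
            continue                                        # stale heap entry
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + 1.0 / K[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")

K = [[1.0, 0.1, 1.0],
     [1.0, 1.0, 1.0]]   # a high-K channel along the bottom row
resistance = min_hydraulic_resistance(K)   # -> 3.0 (three cells at K = 1)
```

A low minimum resistance flags a well-connected field, consistent with early contaminant arrival.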

  8. Comparative soil CO2 flux measurements and geostatistical estimation methods on Masaya volcano, Nicaragua

    USGS Publications Warehouse

    Lewicki, Jennifer L.; Bergfeld, Deborah; Cardellini, Carlo; Chiodini, Giovanni; Granieri, Domenico; Varley, Nick; Werner, Cynthia A.

    2005-01-01

    We present a comparative study of soil CO2 flux (FCO2) measured by five groups (Groups 1–5) at the IAVCEI-CCVG Eighth Workshop on Volcanic Gases on Masaya volcano, Nicaragua. Groups 1–5 measured FCO2 using the accumulation chamber method at 5-m spacing within a 900 m2 grid during a morning (AM) period. These measurements were repeated by Groups 1–3 during an afternoon (PM) period. Measured FCO2 ranged from 218 to 14,719 g m−2 day−1. The variability of the five measurements made at each grid point ranged from ±5 to 167%. However, the arithmetic means of fluxes measured over the entire grid and associated total CO2 emission rate estimates varied between groups by only ±22%. All three groups that made PM measurements reported an 8–19% increase in total emissions over the AM results. Based on a comparison of measurements made during AM and PM times, we argue that this change is due in large part to natural temporal variability of gas flow, rather than to measurement error. In order to estimate the mean and associated CO2 emission rate of one data set and to map the spatial FCO2 distribution, we compared six geostatistical methods: arithmetic and minimum variance unbiased estimator means of uninterpolated data, and arithmetic means of data interpolated by the multiquadric radial basis function, ordinary kriging, multi-Gaussian kriging, and sequential Gaussian simulation methods. While the total CO2 emission rates estimated using the different techniques only varied by ±4.4%, the FCO2 maps showed important differences. We suggest that the sequential Gaussian simulation method yields the most realistic representation of the spatial distribution of FCO2, but a variety of geostatistical methods are appropriate to estimate the total CO2 emission rate from a study area, which is a primary goal in volcano monitoring research.

  9. Alcohol risk management in college settings: the safer California universities randomized trial.

    PubMed

    Saltz, Robert F; Paschall, Mallie J; McGaffigan, Richard P; Nygaard, Peter M O

    2010-12-01

    Potentially effective environmental strategies have been recommended to reduce heavy alcohol use among college students. However, studies to date on environmental prevention strategies are few in number and have been limited by their nonexperimental designs, inadequate sample sizes, and lack of attention to settings where the majority of heavy drinking events occur. To determine whether environmental prevention strategies targeting off-campus settings would reduce the likelihood and incidence of student intoxication at those settings. The Safer California Universities study involved 14 large public universities, half of which were assigned randomly to the Safer intervention condition after baseline data collection in 2003. Environmental interventions took place in 2005 and 2006 after 1 year of planning with seven Safer intervention universities. Random cross-sectional samples of undergraduates completed online surveys in four consecutive fall semesters (2003-2006). Campuses and communities surrounding eight campuses of the University of California and six in the California State University system were utilized. The study used random samples of undergraduates (∼500-1000 per campus per year) attending the 14 public California universities. Safer environmental interventions included nuisance party enforcement operations, minor decoy operations, driving-under-the-influence checkpoints, social host ordinances, and use of campus and local media to increase the visibility of environmental strategies. Proportion of drinking occasions in which students drank to intoxication at six different settings during the fall semester (residence hall party, campus event, fraternity or sorority party, party at off-campus apartment or house, bar/restaurant, outdoor setting), any intoxication at each setting during the semester, and whether students drank to intoxication the last time they went to each setting. Significant reductions in the incidence and likelihood of intoxication at

  10. Porous media flux sensitivity to pore-scale geostatistics: A bottom-up approach

    NASA Astrophysics Data System (ADS)

    Di Palma, P. R.; Guyennon, N.; Heße, F.; Romano, E.

    2017-04-01

    Macroscopic properties of flow through porous media can be directly computed by solving the Navier-Stokes equations at the scales related to the actual flow processes, while considering the porous structures in an explicit way. The aim of this paper is to investigate the effects of the pore-scale spatial distribution on seepage velocity through numerical simulations of 3D fluid flow performed by the lattice Boltzmann method. To this end, we generate multiple random Gaussian fields whose spatial correlation follows an assigned semi-variogram function. The Exponential and Gaussian semi-variograms are chosen as extreme cases of short-distance correlation, and the statistical properties of the resulting porous media (indicator field) are described using the Matérn covariance model, with characteristic lengths of spatial autocorrelation (pore size) varying from 2% to 13% of the linear domain. To assess the sensitivity of the modeling results to the geostatistical representativeness of the domain as well as to the adopted resolution, porous media have been generated repeatedly with re-initialized random seeds, and three different resolutions have been tested for each resulting realization. The main difference among results is observed between the two adopted semi-variograms, indicating that roughness (short-distance autocorrelation) is the property that mainly affects the flux. However, computed seepage velocities additionally show wide variability (about three orders of magnitude) for each semi-variogram model in relation to the assigned correlation length, corresponding to pore size. The spatial resolution affects the results more strongly for short correlation lengths (i.e., small pore sizes), resulting in an increasing underestimation of the seepage velocity with decreasing correlation length. On the other hand, results show increasing uncertainty as the correlation length approaches the domain size.
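    The first step of the workflow described above, generating a correlated Gaussian random field and thresholding it into a binary pore/solid indicator field, can be sketched as follows. This is a minimal 2D illustration in NumPy; the spectral field generator, grid size, correlation length, and porosity value are illustrative assumptions, not the authors' code.

    ```python
    import numpy as np

    def gaussian_random_field(n=64, corr_len=0.1, seed=0):
        """2D stationary Gaussian random field on the unit square, generated by
        spectral filtering of white noise with a Gaussian-shaped spectral density."""
        rng = np.random.default_rng(seed)
        k = np.fft.fftfreq(n, d=1.0 / n)          # frequencies in cycles per unit length
        kx, ky = np.meshgrid(k, k, indexing="ij")
        k2 = kx ** 2 + ky ** 2
        spectrum = np.exp(-0.5 * (np.pi * corr_len) ** 2 * k2)
        noise = np.fft.fft2(rng.standard_normal((n, n)))
        field = np.real(np.fft.ifft2(noise * np.sqrt(spectrum)))
        # Normalize to zero mean and unit variance
        return (field - field.mean()) / field.std()

    def to_indicator(field, porosity=0.3):
        """Threshold so that roughly `porosity` of the cells are pore space (1)."""
        threshold = np.quantile(field, 1.0 - porosity)
        return (field > threshold).astype(np.uint8)

    field = gaussian_random_field()
    pores = to_indicator(field, porosity=0.3)
    ```

    Re-running with different seeds, as the authors do with re-initialized random seeds, yields independent realizations with the same spatial statistics.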

  11. Geostatistical estimation of forest biomass in interior Alaska combining Landsat-derived tree cover, sampled airborne lidar and field observations

    NASA Astrophysics Data System (ADS)

    Babcock, Chad; Finley, Andrew O.; Andersen, Hans-Erik; Pattison, Robert; Cook, Bruce D.; Morton, Douglas C.; Alonzo, Michael; Nelson, Ross; Gregoire, Timothy; Ene, Liviu; Gobakken, Terje; Næsset, Erik

    2018-06-01

    The goal of this research was to develop and examine the performance of a geostatistical coregionalization modeling approach for combining field inventory measurements, strip samples of airborne lidar and Landsat-based remote sensing data products to predict aboveground biomass (AGB) in interior Alaska's Tanana Valley. The proposed modeling strategy facilitates pixel-level mapping of AGB density predictions across the entire spatial domain. Additionally, the coregionalization framework allows for statistically sound estimation of total AGB for arbitrary areal units within the study area, a key advance to support diverse management objectives in interior Alaska. This research focuses on appropriate characterization of prediction uncertainty in the form of posterior predictive coverage intervals and standard deviations. Using the framework detailed here, it is possible to quantify estimation uncertainty for any spatial extent, ranging from pixel-level predictions of AGB density to estimates of AGB stocks for the full domain. The lidar-informed coregionalization models consistently outperformed their counterpart lidar-free models in terms of point-level predictive performance and total AGB precision. Additionally, the inclusion of Landsat-derived forest cover as a covariate further improved estimation precision in regions with lower lidar sampling intensity. Our findings also demonstrate that model-based approaches that do not explicitly account for residual spatial dependence can grossly underestimate uncertainty, resulting in falsely precise estimates of AGB. On the other hand, in a geostatistical setting, residual spatial structure can be modeled within a Bayesian hierarchical framework to obtain statistically defensible assessments of uncertainty for AGB estimates.

  12. Geostatistical uncertainty of assessing air quality using high-spatial-resolution lichen data: A health study in the urban area of Sines, Portugal.

    PubMed

    Ribeiro, Manuel C; Pinho, P; Branquinho, C; Llop, Esteve; Pereira, Maria J

    2016-08-15

    In most studies correlating health outcomes with air pollution, personal exposure assignments are based on measurements collected at air-quality monitoring stations not coinciding with health data locations. In such cases, interpolators are needed to predict air quality in unsampled locations and to assign personal exposures. Moreover, a measure of the spatial uncertainty of exposures should be incorporated, especially in urban areas where concentrations vary at short distances due to changes in land use and pollution intensity. These studies are limited by the lack of literature comparing exposure uncertainty derived from distinct spatial interpolators. Here, we addressed these issues with two interpolation methods: regression kriging (RK) and ordinary kriging (OK). These methods were used to generate air-quality simulations with a geostatistical algorithm. For each method, the geostatistical uncertainty was drawn from generalized linear model (GLM) analysis. We analyzed the association between air quality and birth weight. Personal health data (n=227) and exposure data were collected in Sines (Portugal) during 2007-2010. Because air-quality monitoring stations in the city do not offer high-spatial-resolution measurements (n=1), we used lichen data as an ecological indicator of air quality (n=83). We found no significant difference in the fit of GLMs with any of the geostatistical methods. With RK, however, the models more often fit better and less often fit worse. Moreover, the geostatistical uncertainty results showed a marginally higher mean and precision with RK. Combined with lichen data and land-use data of high spatial resolution, RK is a more effective geostatistical method for relating health outcomes to air quality in urban areas. This is particularly important in small cities, which generally do not have expensive air-quality monitoring stations with high spatial resolution. Further, alternative ways of linking human activities with their

  13. An uncommon case of random fire-setting behavior associated with Todd paralysis: a case report.

    PubMed

    Kanehisa, Masayuki; Morinaga, Katsuhiko; Kohno, Hisae; Maruyama, Yoshihiro; Ninomiya, Taiga; Ishitobi, Yoshinobu; Tanaka, Yoshihiro; Tsuru, Jusen; Hanada, Hiroaki; Yoshikawa, Tomoya; Akiyoshi, Jotaro

    2012-08-31

    The association between fire-setting behavior and psychiatric or medical disorders remains poorly understood. Although a link between fire-setting behavior and various organic brain disorders has been established, associations between fire setting and focal brain lesions have not yet been reported. Here, we describe the case of a 24-year-old first-time arsonist who suffered Todd's paralysis prior to the onset of a bizarre and random fire-setting behavior. A case of a 24-year-old man with a sudden onset of bizarre and random fire-setting behavior is reported. The man, who had been arrested on felony arson charges, complained of difficulties concentrating and of recent memory disturbances with leg weakness. A video-EEG recording demonstrated a close relationship between the focal motor impairment and a clear-cut epileptic ictal discharge involving the bilateral motor cortical areas. The SPECT result was statistically analyzed by comparison with standard SPECT images obtained from our institute (easy Z-score imaging system; eZIS). eZIS revealed hypoperfusion in the cingulate cortex and basal ganglia, and hyperperfusion in the frontal cortex. A neuropsychological test battery revealed lower than normal scores for executive function, attention, and memory, consistent with frontal lobe dysfunction. The fire-setting behavior and Todd's paralysis, together with an unremarkable performance on tests measuring executive function fifteen months prior, suggested a causal relationship between this organic brain lesion and the fire-setting behavior. The case describes a rare and as yet unreported association between random, impulse-driven fire-setting behavior and damage to the brain, and suggests a disconnection of frontal lobe structures as a possible pathogenic mechanism.

  14. Downscaling remotely sensed imagery using area-to-point cokriging and multiple-point geostatistical simulation

    NASA Astrophysics Data System (ADS)

    Tang, Yunwei; Atkinson, Peter M.; Zhang, Jingxiong

    2015-03-01

    A cross-scale data integration method was developed and tested based on the theory of geostatistics and multiple-point geostatistics (MPG). The goal was to downscale remotely sensed images while retaining spatial structure by integrating images at different spatial resolutions. During the process of downscaling, a rich spatial correlation model in the form of a training image was incorporated to facilitate reproduction of similar local patterns in the simulated images. Area-to-point cokriging (ATPCK) was used to provide a locally varying mean (LVM) (i.e., soft data) that deals with the change of support problem (COSP) for cross-scale integration, which MPG cannot achieve alone. Several pairs of spectral bands of remotely sensed images were tested for integration within different cross-scale case studies. The experiments show that MPG can restore the spatial structure of the image at a fine spatial resolution given the training image and conditioning data. The super-resolution image can be predicted using the proposed method, which cannot be realised using most data integration methods. The results show that the ATPCK-MPG approach can achieve greater accuracy than methods which do not account for the change of support issue.

  15. Geostatistical analysis of ground-penetrating radar data: A means of describing spatial variation in the subsurface

    NASA Astrophysics Data System (ADS)

    Rea, Jane; Knight, Rosemary

    1998-03-01

    We have investigated the use of ground-penetrating radar (GPR) as a means of characterizing the heterogeneity of the subsurface. Radar data were collected at several sites in southwestern British Columbia underlain by glaciodeltaic sediments. A cliff face study was conducted in which geostatistical analysis of a digitized photograph of the face and the radar image of the face showed excellent agreement in the maximum correlation direction and the correlation length determined from these two data sets. Other two-dimensional (2-D) sections of radar data were divided into sedimentary architectural elements on the basis of the distinct radar appearance of these sedimentary units. Examples of four sedimentary units were used to obtain semivariograms from the radar data and resulted in maximum correlation lengths between 0.5 and 4.8 m. A 3-D radar survey, collected over a package of gravel and sand foresets, was analyzed to determine the paleoflow direction; a correlation length of 4 m was found in that direction.

  16. High Performance Geostatistical Modeling of Biospheric Resources

    NASA Astrophysics Data System (ADS)

    Pedelty, J. A.; Morisette, J. T.; Smith, J. A.; Schnase, J. L.; Crosier, C. S.; Stohlgren, T. J.

    2004-12-01

    We are using parallel geostatistical codes to study spatial relationships among biospheric resources in several study areas. For example, spatial statistical models based on large- and small-scale variability have been used to predict species richness of both native and exotic plants (hot spots of diversity) and patterns of exotic plant invasion. However, broader use of geostatistics in natural resource modeling, especially at regional and national scales, has been limited due to the large computing requirements of these applications. To address this problem, we implemented parallel versions of the kriging spatial interpolation algorithm. The first uses the Message Passing Interface (MPI) in a master/slave paradigm on an open source Linux Beowulf cluster, while the second is implemented with the new proprietary Xgrid distributed processing system on an Xserve G5 cluster from Apple Computer, Inc. These techniques are proving effective and provide the basis for a national decision support capability for invasive species management that is being jointly developed by NASA and the US Geological Survey.
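    The kriging interpolation at the core of such parallel codes can be illustrated with a minimal serial sketch: ordinary kriging with a spherical variogram, solving the kriging system with a Lagrange multiplier for the unbiasedness constraint. The variogram parameters and the four-point toy dataset are illustrative assumptions, not the record's implementation.

    ```python
    import numpy as np

    def spherical_variogram(h, nugget=0.0, sill=1.0, rng_=50.0):
        """Spherical variogram model: rises from the nugget to the sill at range rng_."""
        h = np.asarray(h, dtype=float)
        g = nugget + (sill - nugget) * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3)
        return np.where(h >= rng_, sill, np.where(h == 0.0, 0.0, g))

    def ordinary_kriging(xy, z, xy0, variogram):
        """Ordinary kriging prediction and kriging variance at a single point xy0."""
        n = len(z)
        # Pairwise distances among data points, and data-to-target distances
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
        d0 = np.linalg.norm(xy - xy0, axis=-1)
        # Kriging system [Gamma 1; 1' 0] w = [gamma0; 1]
        A = np.ones((n + 1, n + 1))
        A[:n, :n] = variogram(d)
        A[n, n] = 0.0
        b = np.append(variogram(d0), 1.0)
        w = np.linalg.solve(A, b)
        estimate = float(w[:n] @ z)
        variance = float(w @ b)  # kriging variance (includes Lagrange multiplier)
        return estimate, variance

    xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
    z = np.array([1.0, 2.0, 3.0, 4.0])
    est, var = ordinary_kriging(xy, z, np.array([5.0, 5.0]), spherical_variogram)
    ```

    In a master/slave setup of the kind described, the master would simply partition the grid of prediction points and hand each worker a chunk of targets to run through `ordinary_kriging`, since predictions at different points are independent.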

  17. Dancing for Parkinson Disease: A Randomized Trial of Irish Set Dancing Compared With Usual Care.

    PubMed

    Shanahan, Joanne; Morris, Meg E; Bhriain, Orfhlaith Ni; Volpe, Daniele; Lynch, Tim; Clifford, Amanda M

    2017-09-01

    To examine the feasibility of a randomized controlled study design and to explore the benefits of a set dancing intervention compared with usual care. Randomized controlled design, with participants randomized to Irish set dance classes or a usual care group. Community based. Individuals with idiopathic Parkinson disease (PD) (N=90). The dance group attended a 1.5-hour dancing class each week for 10 weeks and undertook a home dance program for 20 minutes, 3 times per week. The usual care group continued with their usual care and daily activities. The primary outcome was feasibility, determined by recruitment rates, success of randomization and allocation procedures, attrition, adherence, safety, willingness of participants to be randomized, resource availability, and cost. Secondary outcomes were motor function (motor section of the Unified Parkinson's Disease Rating Scale), quality of life (Parkinson's Disease Questionnaire-39), functional endurance (6-min walk test), and balance (mini-BESTest). Ninety participants were randomized (45 per group). There were no adverse effects or resource constraints. Although adherence to the dancing program was 93.5%, there was >40% attrition in each group. Postintervention, the dance group had greater, though nonsignificant, gains in quality of life than the usual care group. There was a meaningful deterioration in endurance in the usual care group. There were no meaningful changes in other outcomes. The exit questionnaire showed participants enjoyed the classes and would like to continue participation. For people with mild to moderately severe PD, set dancing is feasible and enjoyable and may improve quality of life. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  18. [Spatial variability of soil nutrients based on geostatistics combined with GIS--a case study in Zunghua City of Hebei Province].

    PubMed

    Guo, X; Fu, B; Ma, K; Chen, L

    2000-08-01

    Geostatistics combined with GIS was applied to analyze the spatial variability of soil nutrients in topsoil (0-20 cm) in Zunghua City of Hebei Province. GIS can integrate attribute data with geographical data of system variables, which makes the application of geostatistical techniques at large spatial scales more convenient. Soil nutrient data in this study included available N (alkaline hydrolyzing nitrogen), total N, available K, available P, and organic matter. The results showed that the semivariograms of soil nutrients were best described by a spherical model, except for that of available K, which was best fitted by a composite structure of an exponential model and a linear-with-sill model. The spatial variability of available K was mainly produced by structural factors, while that of available N, total N, available P, and organic matter was primarily caused by random factors. However, their degrees of spatial heterogeneity differed: those of total N and organic matter were higher, and those of available P and available N were lower. The results also indicated that the spatial correlation of the five tested soil nutrients at this large scale was moderately dependent. The ranges of available N and available P were almost the same, at 5 km and 5.5 km, respectively. The range of total N was up to 18 km, and that of organic matter was 8.5 km. For available K, the spatial variability followed an exponential model between 0 and 3.5 km and a linear-with-sill model between 3.5 and 25.5 km. In addition, the five soil nutrients exhibited different isotropic ranges. Available N and available P were isotropic through the whole research range (0-28 km). The isotropic range of available K was 0-8 km, and that of total N and organic matter was 0-10 km.
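    Semivariogram analyses like the one above start from an experimental semivariogram, to which models (spherical, exponential, linear-with-sill) are then fitted and whose nugget-to-sill ratio and range are read off. A minimal isotropic version with NumPy; the synthetic sample locations, trend, and bin edges are illustrative assumptions.

    ```python
    import numpy as np

    def experimental_semivariogram(xy, z, bins):
        """Isotropic experimental semivariogram: for each distance bin, half the
        mean squared difference over all point pairs falling in that bin."""
        n = len(z)
        i, j = np.triu_indices(n, k=1)                 # all unordered point pairs
        d = np.linalg.norm(xy[i] - xy[j], axis=-1)     # pair separation distances
        sq = 0.5 * (z[i] - z[j]) ** 2                  # semivariance contributions
        which = np.digitize(d, bins)
        h, gamma = [], []
        for b in range(1, len(bins)):
            mask = which == b
            if mask.any():
                h.append(d[mask].mean())
                gamma.append(sq[mask].mean())
        return np.array(h), np.array(gamma)

    # Synthetic soil-property field: smooth spatial trend plus measurement noise
    rng = np.random.default_rng(1)
    xy = rng.uniform(0, 100, size=(200, 2))
    z = np.sin(xy[:, 0] / 15.0) + 0.1 * rng.standard_normal(200)
    h, gamma = experimental_semivariogram(xy, z, bins=np.arange(0, 60, 10))
    ```

    For spatially structured data, `gamma` rises with lag distance toward a sill; the intercept near zero lag estimates the nugget, and their ratio indicates how much of the variability is random versus structural, as discussed in the abstract.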

  19. Geostatistical Study of Precipitation on the Island of Crete

    NASA Astrophysics Data System (ADS)

    Agou, Vasiliki D.; Varouchakis, Emmanouil A.; Hristopulos, Dionissios T.

    2015-04-01

    precipitation which are fitted locally to a three-parameter probability distribution, based on which a normalized index is derived. We use the Spartan variogram function to model space-time correlations, because it is more flexible than classical models [3]. The performance of the variogram model is tested by means of leave-one-out cross validation. The variogram model is then used in connection with ordinary kriging to generate precipitation maps for the entire island. In the future, we will explore the joint spatiotemporal evolution of precipitation patterns on Crete. References [1] P. Goovaerts. Geostatistical approaches for incorporating elevation into the spatial interpolation of precipitation. Journal of Hydrology, 228(1):113-129, 2000. [2] N. B. Guttman. Accepting the standardized precipitation index: a calculation algorithm. American Water Resource Association, 35(2):311-322, 1999. [3] D. T. Hristopulos. Spartan Gibbs random field models for geostatistical applications. SIAM Journal on Scientific Computing, 24(6):2125-2162, 2003. [4] A.G. Koutroulis, A.-E.K. Vrohidou, and I.K. Tsanis. Spatiotemporal characteristics of meteorological drought for the island of Crete. Journal of Hydrometeorology, 12(2):206-226, 2011. [5] T. B. McKee, N. J. Doesken, and J. Kleist. The relationship of drought frequency and duration to time scales. In Proceedings of the 8th Conference on Applied Climatology, pages 179-184, Anaheim, California, 1993.
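    The normalized index derived from a fitted precipitation distribution is in the spirit of the standardized precipitation index of McKee et al. [5]: map each monthly total through the fitted CDF, then through the standard normal inverse CDF. A rank-based (empirical-CDF) sketch of the idea using only NumPy and the Python standard library; the synthetic gamma-distributed record stands in for the authors' three-parameter local fits.

    ```python
    import numpy as np
    from statistics import NormalDist

    def normalized_index(values):
        """Empirical (rank-based) standardized index: empirical CDF via plotting
        positions, then the standard normal inverse CDF. A parametric SPI would
        fit a probability distribution instead of using ranks."""
        values = np.asarray(values, dtype=float)
        n = len(values)
        ranks = values.argsort().argsort() + 1        # ranks 1..n (no ties assumed)
        p = (ranks - 0.5) / n                         # plotting positions in (0, 1)
        nd = NormalDist()
        return np.array([nd.inv_cdf(pi) for pi in p])

    rng = np.random.default_rng(3)
    monthly_precip = rng.gamma(shape=2.0, scale=30.0, size=120)  # synthetic 10-year record
    spi = normalized_index(monthly_precip)
    ```

    The resulting index is approximately standard normal by construction, so droughts (strongly negative values) and wet spells are directly comparable across stations with very different precipitation climatologies.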

  20. Geostatistical and GIS analysis of the spatial variability of alluvial gold content in Ngoura-Colomines area, Eastern Cameroon: Implications for the exploration of primary gold deposit

    NASA Astrophysics Data System (ADS)

    Takodjou Wambo, Jonas Didero; Ganno, Sylvestre; Djonthu Lahe, Yannick Sthopira; Kouankap Nono, Gus Djibril; Fossi, Donald Hermann; Tchouatcha, Milan Stafford; Nzenti, Jean Paul

    2018-06-01

    Linear and nonlinear geostatistics are commonly used in ore grade estimation but seldom combined with Geographical Information System (GIS) technology. In this study, we suggest an approach based on linear geostatistics (ordinary kriging, OK) and GIS techniques to investigate the spatial distribution of alluvial gold content and of mineralized and gangue layer thicknesses from 73 pits in the Ngoura-Colomines area, with the aim of determining controlling factors for the spatial distribution of mineralization and delineating the most prospective areas for primary gold mineralization. Gold content varies between 0.1 and 4.6 g/m3 and has been broadly grouped into three statistical classes. These classes have been spatially subdivided into nine zones using an ordinary kriging model based on physical and topographical characteristics. Both mineralized and barren layer thicknesses show random spatial distributions, and there is no correlation between these parameters and the gold content. This approach has shown that the Ngoura-Colomines area is located in a large shear zone compatible with a Riedel fault system composed of P and P‧ fractures oriented NE-SW and NNE-SSW respectively, E-W trending R fractures, and R‧ fractures with NW-SE trends that could have contributed significantly to the establishment of this gold mineralization. The combined OK model and GIS analysis have led to the delineation of the Colomines, Tissongo, Madubal and Boutou villages as the most prospective areas for the exploration of primary gold deposits in the study area.

  1. Comparing different approaches - data mining, geostatistic, and deterministic pedology - to assess the frequency of WRB Reference Soil Groups in the Italian soil regions

    NASA Astrophysics Data System (ADS)

    Lorenzetti, Romina; Barbetti, Roberto; L'Abate, Giovanni; Fantappiè, Maria; Costantini, Edoardo A. C.

    2013-04-01

    Estimating the frequency of soil classes in a map unit is always affected by some degree of uncertainty, especially at small scales with larger generalization. The aim of this study was to compare different possible approaches - data mining, geostatistics, and deterministic pedology - to assess the frequency of WRB Reference Soil Groups (RSG) in the major Italian soil regions. In the soil map of Italy (Costantini et al., 2012), a list of the first five RSG was reported for each of the 10 major soil regions. The soil map was produced using the national soil geodatabase, which stored 22,015 analyzed and classified pedons, 1,413 soil typological units (STU), and a set of auxiliary variables (lithology, land use, DEM). Other variables were added to better consider the influence of soil forming factors (slope, soil aridity index, carbon stock, soil inorganic carbon content, clay, sand, geography of soil regions and soil systems), and a grid with a 1 km mesh was set up. The traditional deterministic pedology approach assessed STU frequency according to expert judgment of STU presence in every elementary landscape forming the mapping unit. Different data mining techniques were first compared in their ability to predict RSG from the auxiliary variables (neural networks, random forests, boosted trees, support vector machines (SVM)). We selected SVM according to its results on a testing set. An SVM model is a representation of the examples as points in space, mapped so that examples of separate categories are divided by a clear gap that is as wide as possible. The geostatistical algorithm we used was an indicator collocated cokriging. The class values of the auxiliary variables, available at all points of the grid, were transformed into indicator variables (values 0, 1). A principal component analysis allowed us to select the variables that explained the largest variability, and to correlate each RSG with the first principal component, which explained 51% of the total variability. 
The
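    The indicator transform and PCA step described in the record above can be sketched as follows. The class data are hypothetical, and the PCA is computed from the SVD of the centered indicator matrix rather than any particular package.

    ```python
    import numpy as np

    def indicator_transform(classes):
        """Turn a categorical variable into 0/1 indicator columns, one per class."""
        labels = np.unique(classes)
        return labels, (classes[:, None] == labels[None, :]).astype(float)

    def pca_explained_variance(X):
        """Fraction of total variance explained by each principal component,
        from the singular values of the centered data matrix."""
        Xc = X - X.mean(axis=0)
        s = np.linalg.svd(Xc, compute_uv=False)   # singular values, descending
        var = s ** 2
        return var / var.sum()

    # Hypothetical categorical auxiliary variable (e.g., a lithology class per grid cell)
    rng = np.random.default_rng(0)
    lith = rng.choice(["sand", "clay", "loam"], size=300)
    labels, ind = indicator_transform(lith)
    ratios = pca_explained_variance(ind)
    ```

    Because each row of the indicator matrix sums to one, the columns are linearly dependent and the last component carries essentially no variance; the leading ratios play the role of the "51% of the total variability" figure quoted in the abstract.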

  2. The role of geostatistics in medical geology

    NASA Astrophysics Data System (ADS)

    Goovaerts, Pierre

    2014-05-01

    Since its development in the mining industry, geostatistics has emerged as the primary tool for spatial data analysis in various fields, ranging from earth and atmospheric sciences, to agriculture, soil science, remote sensing, and more recently environmental exposure assessment. In the last few years, these tools have been tailored to the field of medical geography or spatial epidemiology, which is concerned with the study of spatial patterns of disease incidence and mortality and the identification of potential 'causes' of disease, such as environmental exposure, diet and unhealthy behaviors, economic or socio-demographic factors. On the other hand, medical geology is an emerging interdisciplinary scientific field studying the relationship between natural geological factors and their effects on human and animal health. This paper provides an introduction to the field of medical geology with an overview of geostatistical methods available for the analysis of geological and health data. Key concepts are illustrated using the mapping of groundwater arsenic concentrations across eleven Michigan counties and the exploration of its relationship to the incidence of prostate cancer at the township level. Arsenic in drinking-water is a major problem and has received much attention because of the large human population exposed and the extremely high concentrations (e.g. 600 to 700 μg/L) recorded in many instances. Few studies, however, have assessed the risks associated with exposure to low levels of arsenic (say < 50 μg/L) most commonly found in drinking water in the United States. In the Michigan thumb region, arsenopyrite (up to 7% As by weight) has been identified in the bedrock of the Marshall Sandstone aquifer, one of the region's most productive aquifers. Epidemiologic studies have suggested a possible association between exposure to inorganic arsenic and prostate cancer mortality, including a study of populations residing in Utah. The information available for the

  3. [Geostatistics analyzing to cause of formation of circle distribution of plant communities in Horqin Sandy Land].

    PubMed

    He, Xingdong; Gao, Yubao; Zhao, Wenzhi; Cong, Zili

    2004-09-01

    Investigation results in the present study showed that plant communities formed typical concentric-circle distribution patterns along the habitat gradient from top and slope to interdune on a few large fixed dunes in the middle part of Korqin Sandy Land. To explain this phenomenon, an analysis of water content and its spatial heterogeneity in sand layers at different locations on the dunes was conducted. In these dunes, water contents in sand layers at the tops were lower than those on the slopes, and both were lower than those in the interdunes. According to the results of the geostatistical analysis, for both shifting and fixed dunes, the spatial heterogeneity of water contents in sand layers changed regularly: the nugget-to-sill ratios and the ranges decreased gradually, while the fractal dimension increased gradually. These regular changes indicate that random spatial heterogeneity decreased gradually and autocorrelated spatial heterogeneity increased gradually from the top and the slope to the interdune. The regular changes of water contents in sand layers and of their spatial heterogeneity at different locations on the dunes might thus be an important cause of the formation of the concentric-circle patterns of the plant communities on these fixed dunes.

  4. Precipitation estimation in mountainous terrain using multivariate geostatistics. Part II: isohyetal maps

    USGS Publications Warehouse

    Hevesi, Joseph A.; Flint, Alan L.; Istok, Jonathan D.

    1992-01-01

    Values of average annual precipitation (AAP) may be important for hydrologic characterization of a potential high-level nuclear-waste repository site at Yucca Mountain, Nevada. Reliable measurements of AAP are sparse in the vicinity of Yucca Mountain, and estimates of AAP were needed for an isohyetal mapping over a 2600-square-mile watershed containing Yucca Mountain. Estimates were obtained with a multivariate geostatistical model developed using AAP and elevation data from a network of 42 precipitation stations in southern Nevada and southeastern California. An additional 1531 elevations were obtained to improve estimation accuracy. Isohyets representing estimates obtained using univariate geostatistics (kriging) defined a smooth and continuous surface. Isohyets representing estimates obtained using multivariate geostatistics (cokriging) defined an irregular surface that more accurately represented expected local orographic influences on AAP. Cokriging results included a maximum estimate within the study area of 335 mm at an elevation of 7400 ft, an average estimate of 157 mm for the study area, and an average estimate of 172 mm at eight locations in the vicinity of the potential repository site. Kriging estimates tended to be lower in comparison because the increased AAP expected for remote mountainous topography was not adequately represented by the available sample. Regression results between cokriging estimates and elevation were similar to regression results between measured AAP and elevation. The position of the cokriging 250-mm isohyet relative to the boundaries of pinyon pine and juniper woodlands provided indirect evidence of improved estimation accuracy because the cokriging result agreed well with investigations by others concerning the relationship between elevation, vegetation, and climate in the Great Basin. Calculated estimation variances were also mapped and compared to evaluate improvements in estimation accuracy. 
Cokriging estimation variances

  5. Introduction to this Special Issue on Geostatistics and Scaling of Remote Sensing

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale A.

    1999-01-01

    The germination of this special PE&RS issue began at the Royal Geographical Society (with the Institute of British Geographers) (RGS-IBG) annual meeting in January 1997, held at the University of Exeter in Exeter, England. The cold and snow of an England winter were greatly tempered by the friendly and cordial discussions that ensued at the meeting on possible ways to foster both dialog and research across "the Big Pond" between geographers in the US and the UK on the use of geostatistics and geospatial techniques for remote sensing of land surface processes. It was decided that one way to stimulate and enhance cooperation on the application of geostatistics and geospatial methods in remote sensing was to hold parallel sessions on these topics at appropriate meeting venues in 1998 in both the US and the UK. Selected papers given at these sessions would be published as a special issue of PE&RS on the US side, and as a special issue of Computers and Geosciences (C&G) on the UK side, to highlight the commonality in research on geostatistics and geospatial methods in remote sensing and spatial data analysis on both sides of the Atlantic Ocean. As a consequence, a session on "Geostatistics and Geospatial Techniques for Remote Sensing of Land Surface Processes" was held at the Association of American Geographers (AAG) annual meeting in Boston, Massachusetts in March 1998, sponsored by the AAG's Remote Sensing Specialty Group (RSSG). A similar session was held at the RGS-IBG annual meeting in Guildford, Surrey, England in January 1998, organized by the Modeling and Advanced Techniques Special Interest Group (MAT SIG) of the Remote Sensing Society (RSS). The six papers that, in part, comprise this issue of PE&RS are the US complement to this dual journal publication effort. Both of us are co-editors of each of the journal special issues, with the lead editor of each journal being from their respective side of the Atlantic where the journals are published. The special

  6. Geostatistical analysis of allele presence patterns among American black bears in eastern North Carolina

    USGS Publications Warehouse

    Thompson, L.M.; Van Manen, F.T.; King, T.L.

    2005-01-01

    Highways are one of the leading causes of wildlife habitat fragmentation and may particularly affect wide-ranging species, such as American black bears (Ursus americanus). We initiated a research project in 2000 to determine potential effects of a 4-lane highway on black bear ecology in Washington County, North Carolina. The research design included a treatment area (highway construction) and a control area and a pre- and post-construction phase. We used data from the pre-construction phase to determine whether we could detect scale dependency or directionality among allele occurrence patterns using geostatistics. Detection of such patterns could provide a powerful tool to measure the effects of landscape fragmentation on gene flow. We sampled DNA from roots of black bear hair at 70 hair-sampling sites on each study area for 7 weeks during fall of 2000. We used microsatellite analysis based on 10 loci to determine unique multi-locus genotypes. We examined all alleles sampled at ≥25 sites on each study area and mapped their presence or absence at each hair-sample site. We calculated semivariograms, which measure the strength of statistical correlation as a function of distance, and adjusted them for anisotropy to determine the maximum direction of spatial continuity. We then calculated the mean direction of spatial continuity for all examined alleles. The mean direction of allele frequency variation was 118.3° (SE = 8.5) on the treatment area and 172.3° (SE = 6.0) on the control area. Rayleigh's tests showed that these directions differed from random distributions (P = 0.028 and P < 0.001, respectively), indicating consistent directional patterns for the alleles we examined in each area. Despite the small spatial scale of our study (approximately 11,000 ha for each study area), we observed distinct and consistent patterns of allele occurrence, suggesting different directions of gene flow between the study areas. These directions seemed to coincide with the

  7. Improved Assimilation of Streamflow and Satellite Soil Moisture with the Evolutionary Particle Filter and Geostatistical Modeling

    NASA Astrophysics Data System (ADS)

    Yan, Hongxiang; Moradkhani, Hamid; Abbaszadeh, Peyman

    2017-04-01

    Assimilation of satellite soil moisture and streamflow data into hydrologic models has received increasing attention over the past few years. Currently, these observations are increasingly used to improve model streamflow and soil moisture predictions. However, the performance of this land data assimilation (DA) system still suffers from two limitations: 1) satellite data scarcity and quality; and 2) particle weight degeneration. In order to overcome these two limitations, we propose two possible solutions in this study. First, a general Gaussian geostatistical approach is proposed to overcome the limitation in the space/time resolution of satellite soil moisture products, thus improving their accuracy at uncovered/biased grid cells. Second, an evolutionary particle filter (PF) approach based on Genetic Algorithm (GA) and Markov Chain Monte Carlo (MCMC), the so-called EPF-MCMC, is developed to further reduce weight degeneration and improve the robustness of the land DA system. This study provides a detailed analysis of the joint and separate assimilation of streamflow and satellite soil moisture into a distributed Sacramento Soil Moisture Accounting (SAC-SMA) model, with the use of the recently developed EPF-MCMC and the general Gaussian geostatistical approach. Performance is assessed over several basins in the USA selected from the Model Parameter Estimation Experiment (MOPEX) and located in different climate regions. The results indicate that: 1) the general Gaussian approach can predict the soil moisture at uncovered grid cells within the expected satellite data quality threshold; 2) assimilation of satellite soil moisture inferred from the general Gaussian model can significantly improve the soil moisture predictions; and 3) in terms of both deterministic and probabilistic measures, the EPF-MCMC can achieve better streamflow predictions. These results suggest that the geostatistical model is a helpful tool to aid the remote sensing technique and the EPF-MCMC is a
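
    Particle weight degeneration, the second limitation discussed above, is conventionally diagnosed with the effective sample size and countered by resampling. The sketch below shows that generic mechanism on synthetic particles; it is not the EPF-MCMC algorithm itself (the GA and MCMC move steps are omitted), and all states, observations, and thresholds are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

def effective_sample_size(w):
    """N_eff = 1 / sum(w_i^2); values far below N signal weight degeneration."""
    return 1.0 / np.sum(w ** 2)

def systematic_resample(particles, w):
    """Systematic resampling: one uniform draw defines n stratified positions."""
    n = len(w)
    positions = (rng.uniform() + np.arange(n)) / n
    idx = np.searchsorted(np.cumsum(w), positions)
    return particles[np.minimum(idx, n - 1)]

particles = rng.normal(0.0, 1.0, size=1000)         # prior ensemble (synthetic)
obs, obs_sd = 0.8, 0.1                              # synthetic observation
logw = -0.5 * ((particles - obs) / obs_sd) ** 2     # Gaussian log-likelihood
w = np.exp(logw - logw.max())
w /= w.sum()
n_eff = effective_sample_size(w)
if n_eff < 0.5 * len(w):                            # common degeneracy threshold
    particles = systematic_resample(particles, w)
```

After resampling, the ensemble concentrates near the posterior mean implied by the observation; EPF-MCMC additionally perturbs the resampled particles to restore diversity.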

  8. Integration of geology, geostatistics, well logs and pressure data to model a heterogeneous supergiant field in Iran

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samimi, B.; Bagherpour, H.; Nioc, A.

    1995-08-01

    The geological reservoir study of the supergiant Ahwaz field significantly improved the history matching process in many aspects, particularly through the development of a geostatistical model which provided a sound basis for changes and delivered much-needed accurate estimates of grid block vertical permeabilities. The geostatistical reservoir evaluation was facilitated by using the Heresim package and litho-stratigraphic zonations for the entire field. For each of the geological zones, 3-dimensional electrolithofacies and petrophysical property distributions (realizations) were generated which captured the heterogeneities that significantly affected fluid flow. However, as this level of heterogeneity was at a significantly smaller scale than the flow simulation grid blocks, a scaling-up effort was needed to derive the effective flow properties of the blocks (porosity, horizontal and vertical permeability, and water saturation). The properties relating to the static reservoir description were accurately derived by using stream tube techniques developed in-house, whereas the relative permeabilities of the grid blocks were derived by dynamic pseudo relative permeability techniques. The prediction of vertical and lateral communication and water encroachment was facilitated by a close integration of pressure and saturation data, geostatistical modelling, and sedimentological studies of the depositional environments and paleocurrents. The nature of reservoir barriers and baffles varied both vertically and laterally in this heterogeneous reservoir. Maps showing differences in pressure between zones after years of production served as a guide to integrating the static geological studies with the dynamic behaviour of each of the 16 reservoir zones. The use of deep wells being drilled to a deeper reservoir provided data to better understand the sweep efficiency and the continuity of barriers and baffles.

  9. Characterization and geostatistical mapping of water salinity: A case study of terminal complex in the Oued Righ Valley (southern Algeria)

    NASA Astrophysics Data System (ADS)

    Belkesier, Mohamed Saleh; Zeddouri, Aziez; Halassa, Younes; Kechiched, Rabah

    2018-05-01

    The region of Oued Righ contains large quantities of groundwater hosted by three aquifers: the Terminal Complex (CT), the Continental Intercalary (CI) and the phreatic aquifer. The present study focuses on water from the CT aquifer in order to characterize its salinity using geostatistical tools for mapping. Indeed, water in this aquifer shows a high mineralization exceeding the WHO standards. The main hydro-chemical facies of this water are Chloride-Sodium and Sulfate-Sodium. Elementary statistics were performed on the physico-chemical analyses from 97 wells, whereas 766 wells were analyzed for salinity and used for the geostatistical mapping. The obtained results show a spatial evolution of the salinity from south to north. The salinity is locally strong in the central part of the Oued Righ valley. The non-parametric geostatistical method of indicator kriging was applied to the salinity data using a cut-off of 5230 mg/l, which represents the average value in the studied area. Indicator kriging allows the estimation of salinity probabilities I(5230 mg/l) in the water of the CT aquifer using a block model (500 x 500 m). Automatic mapping is used to visualize the distribution of the kriged probabilities of salinity. These results can help to ensure a rational and selective exploitation of groundwater according to salinity content.
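
    Indicator kriging as described above has two ingredients: an indicator transform at the cut-off, and ordinary-kriging weights applied to the indicators. A minimal sketch follows, with an assumed spherical variogram and invented well coordinates and salinities; only the 5230 mg/l cut-off comes from the record.

```python
import numpy as np

CUTOFF = 5230.0   # mg/l, the cut-off reported in the record

def indicator(z, cutoff=CUTOFF):
    # 1 where salinity exceeds the cut-off, else 0
    return (np.asarray(z, dtype=float) > cutoff).astype(float)

def spherical(h, a=3000.0, c=1.0):
    # spherical semivariogram with range a and sill c (assumed, illustrative)
    h = np.asarray(h, dtype=float)
    return np.where(h < a, c * (1.5 * h / a - 0.5 * (h / a) ** 3), c)

def ok_weights(sample_xy, target_xy, gamma):
    # ordinary-kriging weights: solve the (n+1)x(n+1) system that includes
    # the unbiasedness constraint (weights sum to one)
    n = len(sample_xy)
    A = np.ones((n + 1, n + 1))
    A[-1, -1] = 0.0
    d = np.linalg.norm(sample_xy[:, None, :] - sample_xy[None, :, :], axis=-1)
    A[:n, :n] = gamma(d)
    b = np.ones(n + 1)
    b[:n] = gamma(np.linalg.norm(sample_xy - target_xy, axis=-1))
    return np.linalg.solve(A, b)[:n]   # last entry is the Lagrange multiplier

wells = np.array([[0.0, 0.0], [1000.0, 0.0], [0.0, 1500.0], [2000.0, 2000.0]])
salinity = np.array([4200.0, 6100.0, 5900.0, 3800.0])   # mg/l, invented
w = ok_weights(wells, np.array([500.0, 500.0]), spherical)
prob_exceed = float(w @ indicator(salinity))   # kriged exceedance probability
```

Kriging the indicators rather than the raw salinities yields, at each block, an estimate of the probability that salinity exceeds the cut-off.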

  10. A geostatistical methodology for the optimal design of space-time hydraulic head monitoring networks and its application to the Valle de Querétaro aquifer.

    PubMed

    Júnez-Ferreira, H E; Herrera, G S

    2013-04-01

    This paper presents a new methodology for the optimal design of space-time hydraulic head monitoring networks and its application to the Valle de Querétaro aquifer in Mexico. The selection of the space-time monitoring points is done using a static Kalman filter combined with a sequential optimization method. The Kalman filter requires as input a space-time covariance matrix, which is derived from a geostatistical analysis. A sequential optimization method is used that, at each step, selects the space-time point that minimizes a function of the variance. We demonstrate the methodology by applying it to the redesign of the hydraulic head monitoring network of the Valle de Querétaro aquifer, with the objective of selecting, from a set of monitoring positions and times, those that minimize spatiotemporal redundancy. The database for the geostatistical space-time analysis corresponds to information from 273 wells located within the aquifer for the period 1970-2007. A total of 1,435 hydraulic head data were used to construct the experimental space-time variogram. The results show that of the existing monitoring program, which consists of 418 space-time monitoring points, only 178 are not redundant. The implied reduction of monitoring costs was possible because the proposed method is successful in propagating information in space and time.
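
    A sequential design of the kind described, repeatedly picking the monitoring point whose observation most reduces a variance criterion and then conditioning the covariance on it (a scalar Kalman update), can be sketched as below. The candidate locations, covariance model, and noise level are assumptions for illustration, not the paper's.

```python
import numpy as np

def greedy_select(cov, k, noise=1e-2):
    """Pick k monitoring sites; each step chooses the site whose observation
    most reduces the summed prediction variance, then conditions the
    covariance on that observation (scalar Kalman/Gaussian update)."""
    P = cov.copy()
    chosen = []
    for _ in range(k):
        # variance reduction from observing site j: sum_i P[i,j]^2 / (P[j,j]+noise)
        gain = (P ** 2).sum(axis=0) / (np.diag(P) + noise)
        gain[chosen] = -np.inf          # never pick a site twice
        j = int(np.argmax(gain))
        chosen.append(j)
        P = P - np.outer(P[:, j], P[:, j]) / (P[j, j] + noise)
    return chosen, P

rng = np.random.default_rng(1)
sites = rng.uniform(0, 10, size=(30, 2))              # 30 candidate wells (invented)
dist = np.linalg.norm(sites[:, None, :] - sites[None, :, :], axis=-1)
C = np.exp(-dist / 3.0)                               # assumed exponential covariance
picked, post = greedy_select(C, k=5)
```

The remaining sites' posterior variances quantify the redundancy of any point not selected, which is the basis for pruning an existing network.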

  11. Bayesian geostatistics in health cartography: the perspective of malaria.

    PubMed

    Patil, Anand P; Gething, Peter W; Piel, Frédéric B; Hay, Simon I

    2011-06-01

    Maps of parasite prevalences and other aspects of infectious diseases that vary in space are widely used in parasitology. However, spatial parasitological datasets rarely, if ever, have sufficient coverage to allow exact determination of such maps. Bayesian geostatistics (BG) is a method for finding a large sample of maps that can explain a dataset, in which maps that do a better job of explaining the data are more likely to be represented. This sample represents the knowledge that the analyst has gained from the data about the unknown true map. BG provides a conceptually simple way to convert these samples to predictions of features of the unknown map, for example regional averages. These predictions account for each map in the sample, yielding an appropriate level of predictive precision.

  12. Bayesian geostatistics in health cartography: the perspective of malaria

    PubMed Central

    Patil, Anand P.; Gething, Peter W.; Piel, Frédéric B.; Hay, Simon I.

    2011-01-01

    Maps of parasite prevalences and other aspects of infectious diseases that vary in space are widely used in parasitology. However, spatial parasitological datasets rarely, if ever, have sufficient coverage to allow exact determination of such maps. Bayesian geostatistics (BG) is a method for finding a large sample of maps that can explain a dataset, in which maps that do a better job of explaining the data are more likely to be represented. This sample represents the knowledge that the analyst has gained from the data about the unknown true map. BG provides a conceptually simple way to convert these samples to predictions of features of the unknown map, for example regional averages. These predictions account for each map in the sample, yielding an appropriate level of predictive precision. PMID:21420361

  13. Improving imperfect data from health management information systems in Africa using space-time geostatistics.

    PubMed

    Gething, Peter W; Noor, Abdisalan M; Gikandi, Priscilla W; Ogara, Esther A A; Hay, Simon I; Nixon, Mark S; Snow, Robert W; Atkinson, Peter M

    2006-06-01

    Reliable and timely information on disease-specific treatment burdens within a health system is critical for the planning and monitoring of service provision. Health management information systems (HMIS) exist to address this need at national scales across Africa but are failing to deliver adequate data because of widespread underreporting by health facilities. Faced with this inadequacy, vital public health decisions often rely on crudely adjusted regional and national estimates of treatment burdens. This study has taken the example of presumed malaria in outpatients within the largely incomplete Kenyan HMIS database and has defined a geostatistical modelling framework that can predict values for all data that are missing through space and time. The resulting complete set can then be used to define treatment burdens for presumed malaria at any level of spatial and temporal aggregation. Validation of the model has shown that these burdens are quantified to an acceptable level of accuracy at the district, provincial, and national scale. The modelling framework presented here provides, to our knowledge for the first time, reliable information from imperfect HMIS data to support evidence-based decision-making at national and sub-national levels.

  14. Improving Imperfect Data from Health Management Information Systems in Africa Using Space–Time Geostatistics

    PubMed Central

    Gething, Peter W; Noor, Abdisalan M; Gikandi, Priscilla W; Ogara, Esther A. A; Hay, Simon I; Nixon, Mark S; Snow, Robert W; Atkinson, Peter M

    2006-01-01

    Background Reliable and timely information on disease-specific treatment burdens within a health system is critical for the planning and monitoring of service provision. Health management information systems (HMIS) exist to address this need at national scales across Africa but are failing to deliver adequate data because of widespread underreporting by health facilities. Faced with this inadequacy, vital public health decisions often rely on crudely adjusted regional and national estimates of treatment burdens. Methods and Findings This study has taken the example of presumed malaria in outpatients within the largely incomplete Kenyan HMIS database and has defined a geostatistical modelling framework that can predict values for all data that are missing through space and time. The resulting complete set can then be used to define treatment burdens for presumed malaria at any level of spatial and temporal aggregation. Validation of the model has shown that these burdens are quantified to an acceptable level of accuracy at the district, provincial, and national scale. Conclusions The modelling framework presented here provides, to our knowledge for the first time, reliable information from imperfect HMIS data to support evidence-based decision-making at national and sub-national levels. PMID:16719557

  15. Multisource passive acoustic tracking: an application of random finite set data fusion

    NASA Astrophysics Data System (ADS)

    Ali, Andreas M.; Hudson, Ralph E.; Lorenzelli, Flavio; Yao, Kung

    2010-04-01

    Multisource passive acoustic tracking is useful in animal bio-behavioral studies by replacing or enhancing human involvement during and after field data collection. Multiple simultaneous vocalizations are a common occurrence in a forest or a jungle, where many species are encountered. Given a set of nodes that are capable of producing multiple direction-of-arrival (DOA) estimates, such data need to be combined into meaningful estimates. The random finite set framework provides a probabilistic model that is suitable for analysis and for the synthesis of optimal estimation algorithms. The proposed algorithm was verified using a simulation and a controlled test experiment.

  16. Examining the Missing Completely at Random Mechanism in Incomplete Data Sets: A Multiple Testing Approach

    ERIC Educational Resources Information Center

    Raykov, Tenko; Lichtenberg, Peter A.; Paulson, Daniel

    2012-01-01

    A multiple testing procedure for examining implications of the missing completely at random (MCAR) mechanism in incomplete data sets is discussed. The approach uses the false discovery rate concept and is concerned with testing group differences on a set of variables. The method can be used for ascertaining violations of MCAR and disproving this…

  17. Bayesian Geostatistical Analysis and Ecoclimatic Determinants of Corynebacterium pseudotuberculosis Infection among Horses

    PubMed Central

    Boysen, Courtney; Davis, Elizabeth G.; Beard, Laurie A.; Lubbers, Brian V.; Raghavan, Ram K.

    2015-01-01

    During fall 2012, Kansas witnessed an unprecedented outbreak of Corynebacterium pseudotuberculosis infection among horses, a disease commonly referred to as pigeon fever. Bayesian geostatistical models were developed to identify key environmental and climatic risk factors associated with C. pseudotuberculosis infection in horses. Positive infection status among horses (cases) was determined by positive test results for characteristic abscess formation, positive bacterial culture on purulent material obtained from a lanced abscess (n = 82), or positive serologic evidence of exposure to the organism (≥1:512) (n = 11). Horses negative for these tests (n = 172) (controls) were considered free of infection. Information pertaining to horse demographics and stabled location was obtained through review of medical records and/or contact with horse owners via telephone. Covariate information for environmental and climatic determinants was obtained from USDA (soil attributes), USGS (land use/land cover), and NASA MODIS and NASA Prediction of Worldwide Renewable Resources (climate). Candidate covariates were screened using univariate regression models followed by Bayesian geostatistical models with and without covariates. The best performing model indicated a protective effect for higher soil moisture content (OR = 0.53, 95% CrI = 0.25, 0.71), and detrimental effects for higher land surface temperature (≥35°C) (OR = 2.81, 95% CrI = 2.21, 3.85) and habitat fragmentation (OR = 1.31, 95% CrI = 1.27, 2.22) on C. pseudotuberculosis infection status in horses, while age, gender and breed had no effect. The preventative and ecoclimatic significance of these findings is discussed. PMID:26473728

  18. Bayesian Geostatistical Analysis and Ecoclimatic Determinants of Corynebacterium pseudotuberculosis Infection among Horses.

    PubMed

    Boysen, Courtney; Davis, Elizabeth G; Beard, Laurie A; Lubbers, Brian V; Raghavan, Ram K

    2015-01-01

    During fall 2012, Kansas witnessed an unprecedented outbreak of Corynebacterium pseudotuberculosis infection among horses, a disease commonly referred to as pigeon fever. Bayesian geostatistical models were developed to identify key environmental and climatic risk factors associated with C. pseudotuberculosis infection in horses. Positive infection status among horses (cases) was determined by positive test results for characteristic abscess formation, positive bacterial culture on purulent material obtained from a lanced abscess (n = 82), or positive serologic evidence of exposure to the organism (≥ 1:512) (n = 11). Horses negative for these tests (n = 172) (controls) were considered free of infection. Information pertaining to horse demographics and stabled location was obtained through review of medical records and/or contact with horse owners via telephone. Covariate information for environmental and climatic determinants was obtained from USDA (soil attributes), USGS (land use/land cover), and NASA MODIS and NASA Prediction of Worldwide Renewable Resources (climate). Candidate covariates were screened using univariate regression models followed by Bayesian geostatistical models with and without covariates. The best performing model indicated a protective effect for higher soil moisture content (OR = 0.53, 95% CrI = 0.25, 0.71), and detrimental effects for higher land surface temperature (≥ 35°C) (OR = 2.81, 95% CrI = 2.21, 3.85) and habitat fragmentation (OR = 1.31, 95% CrI = 1.27, 2.22) on C. pseudotuberculosis infection status in horses, while age, gender and breed had no effect. The preventative and ecoclimatic significance of these findings is discussed.

  19. Delineation of estuarine management areas using multivariate geostatistics: the case of Sado Estuary.

    PubMed

    Caeiro, Sandra; Goovaerts, Pierre; Painho, Marco; Costa, M Helena

    2003-09-15

    The Sado Estuary is a coastal zone located in the south of Portugal where conflicts between conservation and development exist because of its location near industrialized urban zones and its designation as a natural reserve. The aim of this paper is to evaluate a set of multivariate geostatistical approaches to delineate spatially contiguous regions of sediment structure for the Sado Estuary. These areas will be the supporting infrastructure of an environmental management system for this estuary. The boundaries of each homogeneous area were derived from three sediment characterization attributes through three different approaches: (1) cluster analysis of a dissimilarity matrix that is a function of geographical separation, followed by indicator kriging of the cluster data; (2) discriminant analysis of kriged values of the three sediment attributes; and (3) a combination of methods 1 and 2. Final maximum likelihood classification was integrated into a geographical information system. All methods generated fairly spatially contiguous management areas that reproduce well the environment of the estuary. Map comparison techniques based on kappa statistics showed that the resultant three maps are similar, supporting the choice of any of the methods as appropriate for management of the Sado Estuary. However, the results of method 1 seem to be in better agreement with estuary behavior, assessment of contamination sources, and previous work conducted at this site.

  20. Latin hypercube sampling and geostatistical modeling of spatial uncertainty in a spatially explicit forest landscape model simulation

    Treesearch

    Chonggang Xu; Hong S. He; Yuanman Hu; Yu Chang; Xiuzhen Li; Rencang Bu

    2005-01-01

    Geostatistical stochastic simulation is often combined with the Monte Carlo method to quantify the uncertainty in spatial model simulations. However, due to the relatively long running time of spatially explicit forest models, a result of their complexity, it is usually infeasible to generate hundreds or thousands of Monte Carlo simulations. Thus, it is of great...

  1. Indoor radon variations in central Iran and its geostatistical map

    NASA Astrophysics Data System (ADS)

    Hadad, Kamal; Mokhtari, Javad

    2015-02-01

    We present the results of a 2-year indoor radon survey in 10 cities of Yazd province in central Iran (covering an area of 80,000 km2). We used passive diffusive samplers with LATEX polycarbonate films as Solid State Nuclear Track Detectors (SSNTD). The study was carried out in central Iran, where there are major mineral and uranium mines. Our results indicate that, despite a few extraordinarily high concentrations, average annual concentrations of indoor radon are within ICRP guidelines. When the geostatistical spatial distribution of radon was mapped onto the geographical features of the province, it was observed that the risk of high radon concentration increases near the cities of Saqand, Bafq, Harat and Abarkooh, depending on elevation and proximity to the ores and mines.

  2. Supervised restoration of degraded medical images using multiple-point geostatistics.

    PubMed

    Pham, Tuan D

    2012-06-01

    Reducing noise in medical images has been an important issue of research and development for medical diagnosis, patient treatment, and validation of biomedical hypotheses. Noise inherently exists in medical and biological images due to the acquisition and transmission in any imaging devices. Being different from image enhancement, the purpose of image restoration is the process of removing noise from a degraded image in order to recover as much as possible its original version. This paper presents a statistically supervised approach for medical image restoration using the concept of multiple-point geostatistics. Experimental results have shown the effectiveness of the proposed technique which has potential as a new methodology for medical and biological image processing. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  3. Assessment of nitrate pollution in the Grand Morin aquifers (France): combined use of geostatistics and physically based modeling.

    PubMed

    Flipo, Nicolas; Jeannée, Nicolas; Poulin, Michel; Even, Stéphanie; Ledoux, Emmanuel

    2007-03-01

    The objective of this work is to combine several approaches to better understand nitrate fate in the Grand Morin aquifers (2700 km(2)), part of the Seine basin. cawaqs results from the coupling of the hydrogeological model newsam with the hydrodynamic and biogeochemical river model ProSe. cawaqs is coupled with the agronomic model Stics in order to simulate nitrate migration in basins. First, kriging provides a satisfactory representation of aquifer nitrate contamination from local observations, used to set initial conditions for the physically based model. Then the associated confidence intervals, derived from the data using geostatistics, are used to validate the cawaqs results. Results and evaluation obtained from the combination of these approaches are given (period 1977-1988). cawaqs is then used to simulate nitrate fate over a 20-year period (1977-1996). The mean increase in aquifer nitrate concentration is 0.09 mgN L(-1) yr(-1), resulting from an average infiltration flux of 3500 kgN km(-2) yr(-1).

  4. Online Early Resilience Intervention for Combat-Related PTSD in Military Primary Healthcare Settings: A Randomized Trial of DESTRESS-PC

    DTIC Science & Technology

    2009-08-01

    Principal Investigator: Charles Engel. Cites: Bryant, R., Engel, C.C. (2004). A therapist-assisted internet self-help program for traumatic stress. Professional Psychology: Research and Practice, 35.

  5. Characterizing gene sets using discriminative random walks with restart on heterogeneous biological networks.

    PubMed

    Blatti, Charles; Sinha, Saurabh

    2016-07-15

    Analysis of co-expressed gene sets typically involves testing for enrichment of different annotations or 'properties' such as biological processes, pathways, transcription factor binding sites, etc., one property at a time. This common approach ignores any known relationships among the properties or the genes themselves. It is believed that known biological relationships among genes and their many properties may be exploited to more accurately reveal commonalities of a gene set. Previous work has sought to achieve this by building biological networks that combine multiple types of gene-gene or gene-property relationships, and performing network analysis to identify other genes and properties most relevant to a given gene set. Most existing network-based approaches for recognizing genes or annotations relevant to a given gene set collapse information about different properties to simplify (homogenize) the networks. We present a network-based method for ranking genes or properties related to a given gene set. Such related genes or properties are identified from among the nodes of a large, heterogeneous network of biological information. Our method involves a random walk with restarts, performed on an initial network with multiple node and edge types that preserve more of the original, specific property information than current methods that operate on homogeneous networks. In this first stage of our algorithm, we find the properties that are the most relevant to the given gene set and extract a subnetwork of the original network, comprising only these relevant properties. We then re-rank genes by their similarity to the given gene set, based on a second random walk with restarts, performed on the above subnetwork. We demonstrate the effectiveness of this algorithm for ranking genes related to Drosophila embryonic development and aggressive responses in the brains of social animals. DRaWR was implemented as an R package available at veda.cs.illinois.edu/DRaWR. 
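
    The core random-walk-with-restart iteration behind DRaWR-style rankings can be sketched on a toy graph as follows. The heterogeneous node/edge types and the two-stage subnetwork step of the actual method are omitted, and the adjacency matrix and seed set below are invented.

```python
import numpy as np

def rwr(adj, seeds, restart=0.5, tol=1e-10):
    """Random walk with restart: iterate p = (1-c) * W^T p + c * s to a fixed
    point, where W is the row-normalized adjacency and s the seed distribution."""
    W = adj / adj.sum(axis=1, keepdims=True)
    s = np.zeros(len(adj))
    s[seeds] = 1.0 / len(seeds)
    p = s.copy()
    while True:
        p_new = (1 - restart) * W.T @ p + restart * s
        if np.abs(p_new - p).max() < tol:
            return p_new
        p = p_new

# toy 5-node network: nodes 0-2 form a clique, nodes 3-4 hang off node 2
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
p = rwr(A, seeds=[0, 1])
```

The stationary vector p ranks every node by relevance to the seed set: nodes near the seeds score higher than distant ones.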

  6. GIS, geostatistics, metadata banking, and tree-based models for data analysis and mapping in environmental monitoring and epidemiology.

    PubMed

    Schröder, Winfried

    2006-05-01

    By the example of environmental monitoring, some applications of geographic information systems (GIS), geostatistics, metadata banking, and Classification and Regression Trees (CART) are presented. These tools are recommended for mapping statistically estimated hot spots of vectors and pathogens. GIS were introduced as tools for spatially modelling the real world. The modelling can be done by mapping objects according to the spatial information content of data. Additionally, this can be supported by geostatistical and multivariate statistical modelling. This is demonstrated by the example of modelling marine habitats of benthic communities and of terrestrial ecoregions. Such ecoregionalisations may be used to predict phenomena based on the statistical relation between measurements of an interesting phenomenon, such as the incidence of medically relevant species, and correlated characteristics of the ecoregions. The combination of meteorological data and data on plant phenology can enhance the spatial resolution of the information on climate change. To this end, meteorological and phenological data have to be correlated. To enable this, both data sets, which come from disparate monitoring networks, have to be spatially connected by means of geostatistical estimation. This is demonstrated by the example of the transformation of site-specific data on plant phenology into surface data. The analysis allows for spatial comparison of the phenology during the two periods 1961-1990 and 1991-2002, covering the whole of Germany. The changes in both plant phenology and air temperature were shown to be statistically significant. Thus, they can be combined by GIS overlay techniques to enhance the spatial resolution of the information on climate change and to predict vector incidence at the regional scale. The localisation of such risk hot spots can be done by geometrically merging surface data on promoting factors. This is demonstrated by the example of the

  7. Uncertainty in Random Forests: What does it mean in a spatial context?

    NASA Astrophysics Data System (ADS)

    Klump, Jens; Fouedjio, Francky

    2017-04-01

    Geochemical surveys are an important part of exploration for mineral resources and of environmental studies. The samples and chemical analyses are often laborious and difficult to obtain and therefore come at a high cost. As a consequence, these surveys are characterised by datasets with large numbers of variables but relatively few data points when compared to conventional big data problems. With more remote sensing platforms and sensor networks being deployed, large volumes of auxiliary data of the surveyed areas are becoming available. The use of these auxiliary data has the potential to improve the prediction of chemical element concentrations over the whole study area. Kriging is a well-established geostatistical method for the prediction of spatial data but requires significant pre-processing and makes some basic assumptions about the underlying distribution of the data. Some machine learning algorithms, on the other hand, may require less data pre-processing and are non-parametric. In this study we used a dataset provided by Kirkwood et al. [1] to explore the potential use of Random Forest in geochemical mapping. We chose Random Forest because it is a well-understood machine learning method and has the advantage that it provides us with a measure of uncertainty. By comparing Random Forest to Kriging we found that both methods produced comparable maps of estimated values for our variables of interest. Kriging outperformed Random Forest for variables of interest with relatively strong spatial correlation. The measure of uncertainty provided by Random Forest seems to be quite different to the measure of uncertainty provided by Kriging. In particular, the lack of spatial context can give misleading results in areas without ground truth data. In conclusion, our preliminary results show that the model-driven approach in geostatistics gives us more reliable estimates for our target variables than Random Forest for variables with relatively strong spatial
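
    The per-tree spread that gives Random Forest its uncertainty measure can be illustrated with a deliberately tiny bootstrap ensemble of depth-1 trees on synthetic 1-D data; this is a toy stand-in for the idea, not the study's model or dataset.

```python
import numpy as np

rng = np.random.default_rng(42)

def fit_stump(x, y):
    # best single-split regression tree of depth 1 (1-D CART stump)
    best_sse, best = np.inf, None
    for s in np.unique(x)[:-1]:
        left, right = y[x <= s], y[x > s]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_sse, best = sse, (s, left.mean(), right.mean())
    return best

def forest_predict(x, y, xq, n_trees=200):
    # bootstrap ensemble; the spread across trees is the uncertainty measure
    preds = np.empty((n_trees, len(xq)))
    for t in range(n_trees):
        idx = rng.integers(0, len(x), size=len(x))    # bootstrap resample
        s, left_mean, right_mean = fit_stump(x[idx], y[idx])
        preds[t] = np.where(xq <= s, left_mean, right_mean)
    return preds.mean(axis=0), preds.std(axis=0)

x = np.linspace(0.0, 10.0, 80)
y = np.where(x < 5.0, 1.0, 3.0) + rng.normal(0.0, 0.2, size=80)  # noisy step
query = np.array([1.0, 5.0, 9.0])
mean, std = forest_predict(x, y, query)
```

Unlike the kriging variance, this spread reflects only disagreement among trees: a query far from any training data can still receive a small spread, which is the lack of spatial context noted above.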

  8. Characterizing the spatial structure of endangered species habitat using geostatistical analysis of IKONOS imagery

    USGS Publications Warehouse

    Wallace, C.S.A.; Marsh, S.E.

    2005-01-01

    Our study used geostatistics to extract measures that characterize the spatial structure of vegetated landscapes from satellite imagery for mapping endangered Sonoran pronghorn habitat. Fine spatial resolution IKONOS data provided information at the scale of individual trees or shrubs that permitted analysis of vegetation structure and pattern. We derived images of landscape structure by calculating local estimates of the nugget, sill, and range variogram parameters within 25 × 25-m image windows. These variogram parameters, which describe the spatial autocorrelation of the 1-m image pixels, are shown in previous studies to discriminate between different species-specific vegetation associations. We constructed two independent models of pronghorn landscape preference by coupling the derived measures with Sonoran pronghorn sighting data: a distribution-based model and a cluster-based model. The distribution-based model used the descriptive statistics for variogram measures at pronghorn sightings, whereas the cluster-based model used the distribution of pronghorn sightings within clusters of an unsupervised classification of derived images. Both models define similar landscapes, and validation results confirm they effectively predict the locations of an independent set of pronghorn sightings. Such information, although not a substitute for field-based knowledge of the landscape and associated ecological processes, can provide valuable reconnaissance information to guide natural resource management efforts. © 2005 Taylor & Francis Group Ltd.
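
    Extracting nugget, sill, and range values, as done per image window above, amounts to fitting a variogram model to empirical semivariance points. A minimal sketch using a spherical model and a crude grid search on synthetic lag/semivariance values (all numbers invented):

```python
import numpy as np

def spherical(h, nugget, sill, a):
    # spherical variogram: nugget at the origin, reaching the sill at range a
    h = np.asarray(h, dtype=float)
    rise = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h >= a, sill, rise)

def fit_spherical(lags, gamma):
    # crude grid search over (nugget, sill, range) minimizing squared misfit
    best_err, best = np.inf, None
    for nug in np.linspace(0.0, gamma.max() / 2, 11):
        for sill in np.linspace(gamma.max() / 2, 1.5 * gamma.max(), 11):
            for a in np.linspace(lags[1], lags[-1], 11):
                err = ((spherical(lags, nug, sill, a) - gamma) ** 2).sum()
                if err < best_err:
                    best_err, best = err, (nug, sill, a)
    return best

lags = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
gamma = spherical(lags, 0.1, 1.0, 20.0)       # synthetic "empirical" values
nug, sill, a = fit_spherical(lags, gamma)
```

In practice a least-squares or weighted fit replaces the grid search; repeating the fit inside each moving window yields the three per-window parameter images described above.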

  9. Chapter J: Issues and challenges in the application of geostatistics and spatial-data analysis to the characterization of sand-and-gravel resources

    USGS Publications Warehouse

    Hack, Daniel R.

    2005-01-01

    Sand-and-gravel (aggregate) resources are a critical component of the Nation's infrastructure, yet aggregate-mining technologies lag far behind those of metalliferous mining and other sectors. Deposit-evaluation and site-characterization methodologies are antiquated, and few serious studies of the potential applications of spatial-data analysis and geostatistics have been published. However, because of commodity usage and the necessary proximity of a mine to end use, aggregate-resource exploration and evaluation differ fundamentally from comparable activities for metalliferous ores. Acceptable practices, therefore, can reflect this cruder scale. The increasing use of computer technologies is colliding with the need for sand-and-gravel mines to modernize and improve their overall efficiency of exploration, mine planning, scheduling, automation, and other operations. The emergence of megaquarries in the 21st century will also be a contributing factor. Preliminary research into the practical applications of exploratory-data analysis (EDA) has been promising. For example, EDA was used to develop a linear-regression equation to forecast freeze-thaw durability from absorption values for Lower Paleozoic carbonate rocks mined for crushed aggregate from quarries in Oklahoma. Applications of EDA within a spatial context, a method of spatial-data analysis, have also been promising, as with the investigation of undeveloped sand-and-gravel resources in the sedimentary deposits of Pleistocene Lake Bonneville, Utah. Formal geostatistical investigations of sand-and-gravel deposits are quite rare, and the primary focus of those studies that have been completed is on the spatial characterization of deposit thickness and its subsequent effect on ore reserves. A thorough investigation of a gravel deposit in an active aggregate-mining area in central Essex, U.K., emphasized the problems inherent in the geostatistical characterization of particle-size-analysis data.
Beyond such factors…

  10. Bridges between multiple-point geostatistics and texture synthesis: Review and guidelines for future research

    NASA Astrophysics Data System (ADS)

    Mariethoz, Gregoire; Lefebvre, Sylvain

    2014-05-01

    Multiple-Point Simulations (MPS) is a family of geostatistical tools that has received a lot of attention in recent years for the characterization of spatial phenomena in geosciences. It relies on the definition of training images to represent a given type of spatial variability, or texture. We show that the algorithmic tools used are similar in many ways to techniques developed in computer graphics, where there is a need to generate large amounts of realistic textures for applications such as video games and animated movies. Similarly to MPS, these texture synthesis methods use training images, or exemplars, to generate realistic-looking graphical textures. Both domains of multiple-point geostatistics and example-based texture synthesis present similarities in their historic development and share similar concepts. These disciplines have, however, remained separate, and as a result significant algorithmic innovations in each discipline have not been universally adopted. Texture synthesis algorithms offer drastically better computational efficiency, pattern reproduction, and user control. At the same time, MPS developed ways to condition models to spatial data and to produce 3D stochastic realizations, which have not been thoroughly investigated in the field of texture synthesis. In this paper we review the possible links between these disciplines and show the potential and limitations of using concepts and approaches from texture synthesis in MPS. We also provide guidelines on how recent developments could benefit both fields of research, and what challenges remain open.

  11. Complementary nonparametric analysis of covariance for logistic regression in a randomized clinical trial setting.

    PubMed

    Tangen, C M; Koch, G G

    1999-03-01

    In the randomized clinical trial setting, controlling for covariates is expected to produce variance reduction for the treatment parameter estimate and to adjust for random imbalances of covariates between the treatment groups. However, for the logistic regression model, variance reduction is not obviously obtained. This can lead to concerns about the assumptions of the logistic model. We introduce a complementary nonparametric method for covariate adjustment. It provides results that are usually compatible with expectations for analysis of covariance. The only assumptions required are based on randomization and sampling arguments. The resulting treatment parameter is an (unconditional) population average log-odds ratio that has been adjusted for random imbalance of covariates. Data from a randomized clinical trial are used to compare results from the traditional maximum likelihood logistic method with those from the nonparametric logistic method. We examine treatment parameter estimates, corresponding standard errors, and significance levels in models with and without covariate adjustment. In addition, we discuss differences between unconditional population average treatment parameters and conditional subpopulation average treatment parameters. Additional features of the nonparametric method, including stratified (multicenter) and multivariate (multivisit) analyses, are illustrated. Extensions of this methodology to the proportional odds model are also made.

  12. Incorporating geologic information into hydraulic tomography: A general framework based on geostatistical approach

    NASA Astrophysics Data System (ADS)

    Zha, Yuanyuan; Yeh, Tian-Chyi J.; Illman, Walter A.; Onoe, Hironori; Mok, Chin Man W.; Wen, Jet-Chau; Huang, Shao-Yang; Wang, Wenke

    2017-04-01

    Hydraulic tomography (HT) has become a mature aquifer test technology over the last two decades. It collects nonredundant information of aquifer heterogeneity by sequentially stressing the aquifer at different wells and collecting aquifer responses at other wells during each stress. The collected information is then interpreted by inverse models. Among these models, the geostatistical approaches, built upon the Bayesian framework, first conceptualize hydraulic properties to be estimated as random fields, which are characterized by means and covariance functions. They then use the spatial statistics as prior information with the aquifer response data to estimate the spatial distribution of the hydraulic properties at a site. Since the spatial statistics describe the generic spatial structures of the geologic media at the site rather than site-specific ones (e.g., known spatial distributions of facies, faults, or paleochannels), the estimates are often not optimal. To improve the estimates, we introduce a general statistical framework, which allows the inclusion of site-specific spatial patterns of geologic features. Subsequently, we test this approach with synthetic numerical experiments. Results show that this approach, using conditional mean and covariance that reflect site-specific large-scale geologic features, indeed improves the HT estimates. Afterward, this approach is applied to HT surveys at a kilometer-scale fractured granite field site with a distinct fault zone. We find that by including fault information from outcrops and boreholes for HT analysis, the estimated hydraulic properties are improved. The improved estimates subsequently lead to better prediction of flow during a different pumping test at the site.

  13. Randomized Controlled Trial in Clinical Settings to Evaluate Effectiveness of Coping Skills Education Used with Progressive Tinnitus Management

    ERIC Educational Resources Information Center

    Henry, James A.; Thielman, Emily J.; Zaugg, Tara L.; Kaelin, Christine; Schmidt, Caroline J.; Griest, Susan; McMillan, Garnett P.; Myers, Paula; Rivera, Izel; Baldwin, Robert; Carlson, Kathleen

    2017-01-01

    Purpose: This randomized controlled trial evaluated, within clinical settings, the effectiveness of coping skills education that is provided with progressive tinnitus management (PTM). Method: At 2 Veterans Affairs medical centers, N = 300 veterans were randomized to either PTM intervention or 6-month wait-list control. The PTM intervention…

  14. Coupling geostatistics to detailed reservoir description allows better visualization and more accurate characterization/simulation of turbidite reservoirs: Elk Hills oil field, California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allan, M.E.; Wilson, M.L.; Wightman, J.

    1996-12-31

    The Elk Hills giant oilfield, located in the southern San Joaquin Valley of California, has produced 1.1 billion barrels of oil from Miocene and shallow Pliocene reservoirs. 65% of the current 64,000 BOPD production is from the pressure-supported, deeper Miocene turbidite sands. In the turbidite sands of the 31 S structure, large porosity & permeability variations in the Main Body B and Western 31 S sands cause problems with the efficiency of the waterflooding. These variations have now been quantified and visualized using geostatistics. The end result is a more detailed reservoir characterization for simulation. Traditional reservoir descriptions based on marker correlations, cross-sections and mapping do not provide enough detail to capture the short-scale stratigraphic heterogeneity needed for adequate reservoir simulation. These deterministic descriptions are inadequate to tie with production data as the thinly bedded sand/shale sequences blur into a falsely homogenous picture. By studying the variability of the geologic & petrophysical data vertically within each wellbore and spatially from well to well, a geostatistical reservoir description has been developed. It captures the natural variability of the sands and shales that was lacking from earlier work. These geostatistical studies allow the geologic and petrophysical characteristics to be considered in a probabilistic model. The end-product is a reservoir description that captures the variability of the reservoir sequences and can be used as a more realistic starting point for history matching and reservoir simulation.

  16. Identification and Simulation of Subsurface Soil patterns using hidden Markov random fields and remote sensing and geophysical EMI data sets

    NASA Astrophysics Data System (ADS)

    Wang, Hui; Wellmann, Florian; Verweij, Elizabeth; von Hebel, Christian; van der Kruk, Jan

    2017-04-01

    Lateral and vertical spatial heterogeneity of subsurface properties such as soil texture and structure influences the available water and resource supply for crop growth. High-resolution mapping of subsurface structures using non-invasive geo-referenced geophysical measurements, like electromagnetic induction (EMI), enables a characterization of 3D soil structures, which have shown correlations to remote sensing information of the crop states. The benefit of EMI is that it can return 3D subsurface information; however, its spatial coverage is limited by the labor-intensive measurement procedure. Active and passive sensors mounted on air- or space-borne platforms return only 2D images, but they cover much larger areas. Combining both approaches provides us with a potential pathway to extend the detailed 3D geophysical information to a larger area by using remote sensing information. In this study, we aim at extracting and providing insights into the spatial and statistical correlation of the geophysical and remote sensing observations of the soil/vegetation continuum system. To this end, two key points need to be addressed: 1) how to detect and recognize the geometric patterns (i.e., spatial heterogeneity) from multiple data sets, and 2) how to quantitatively describe the statistical correlation between remote sensing information and geophysical measurements. In the current study, the spatial domain is restricted to shallow depths up to 3 meters, and the geostatistical database contains normalized difference vegetation index (NDVI) derived from RapidEye satellite images and apparent electrical conductivities (ECa) measured from multi-receiver EMI sensors for nine depths of exploration ranging from 0 to 2.7 m. The integrated data sets are mapped into both the physical space (i.e. the spatial domain) and feature space (i.e. a two-dimensional space framed by the NDVI and the ECa data). Hidden Markov Random Fields (HMRF) are employed to model the…

  17. A geostatistical extreme-value framework for fast simulation of natural hazard events

    PubMed Central

    Stephenson, David B.

    2016-01-01

    We develop a statistical framework for simulating natural hazard events that combines extreme value theory and geostatistics. Robust generalized additive model forms represent generalized Pareto marginal distribution parameters while a Student’s t-process captures spatial dependence and gives a continuous-space framework for natural hazard event simulations. Efficiency of the simulation method allows many years of data (typically over 10 000) to be obtained at relatively little computational cost. This makes the model viable for forming the hazard module of a catastrophe model. We illustrate the framework by simulating maximum wind gusts for European windstorms, which are found to have realistic marginal and spatial properties, and validate well against wind gust measurements. PMID:27279768

  18. Comparison of geostatistical interpolation and remote sensing techniques for estimating long-term exposure to ambient PM2.5 concentrations across the continental United States.

    PubMed

    Lee, Seung-Jae; Serre, Marc L; van Donkelaar, Aaron; Martin, Randall V; Burnett, Richard T; Jerrett, Michael

    2012-12-01

    A better understanding of the adverse health effects of chronic exposure to fine particulate matter (PM2.5) requires accurate estimates of PM2.5 variation at fine spatial scales. Remote sensing has emerged as an important means of estimating PM2.5 exposures, but relatively few studies have compared remote-sensing estimates to those derived from monitor-based data. We evaluated and compared the predictive capabilities of remote sensing and geostatistical interpolation. We developed a space-time geostatistical kriging model to predict PM2.5 over the continental United States and compared resulting predictions to estimates derived from satellite retrievals. The kriging estimate was more accurate for locations within about 100 km of a monitoring station, whereas the remote sensing estimate was more accurate for locations that were > 100 km from a monitoring station. Based on this finding, we developed a hybrid map that combines the kriging and satellite-based PM2.5 estimates. We found that for most of the populated areas of the continental United States, geostatistical interpolation produced more accurate estimates than remote sensing. The differences between the estimates resulting from the two methods, however, were relatively small. In areas with extensive monitoring networks, the interpolation may provide more accurate estimates, but in the many areas of the world without such monitoring, remote sensing can provide useful exposure estimates that perform nearly as well.

  19. Using direct current resistivity sounding and geostatistics to aid in hydrogeological studies in the Choshuichi alluvial fan, Taiwan.

    PubMed

    Yang, Chieh-Hou; Lee, Wei-Feng

    2002-01-01

    Ground water reservoirs in the Choshuichi alluvial fan, central western Taiwan, were investigated using direct-current (DC) resistivity soundings at 190 locations, combined with hydrogeological measurements from 37 wells. In addition, attempts were made to calculate aquifer transmissivity from both surface DC resistivity measurements and geostatistically derived predictions of aquifer properties. DC resistivity sounding data are highly correlated to the hydraulic parameters in the Choshuichi alluvial fan. By estimating the spatial distribution of hydraulic conductivity from the kriged well data and the cokriged thickness of the correlative aquifer from both resistivity sounding data and well information, the transmissivity of the aquifer at each location can be obtained from the product of kriged hydraulic conductivity and computed thickness of the geoelectric layer. Thus, the spatial variation of the transmissivities in the study area is obtained. Our work is more comparable to Ahmed et al. (1988) than to the work of Niwas and Singhal (1981). The first "constraint" from Niwas and Singhal's work is a result of their use of linear regression. The geostatistical approach taken here (and by Ahmed et al. [1988]) is a natural improvement on the linear regression approach.

  20. Random forests-based differential analysis of gene sets for gene expression data.

    PubMed

    Hsueh, Huey-Miin; Zhou, Da-Wei; Tsai, Chen-An

    2013-04-10

    In DNA microarray studies, gene-set analysis (GSA) has become the focus of gene expression data analysis. GSA utilizes the gene expression profiles of functionally related gene sets in Gene Ontology (GO) categories or a priori defined biological classes to assess the significance of gene sets associated with clinical outcomes or phenotypes. Many statistical approaches have been proposed to determine whether such functionally related gene sets are differentially expressed (enriched and/or depleted) across variations of phenotypes. However, little attention has been given to the discriminatory power of gene sets and classification of patients. In this study, we propose a method of gene set analysis, in which gene sets are used to develop classifications of patients based on the Random Forest (RF) algorithm. The corresponding empirical p-value of an observed out-of-bag (OOB) error rate of the classifier is introduced to identify differentially expressed gene sets using an adequate resampling method. In addition, we discuss the impacts and correlations of genes within each gene set based on the measures of variable importance in the RF algorithm. Significant classifications are reported and visualized together with the underlying gene sets and their contribution to the phenotypes of interest. Numerical studies using both synthesized data and a series of publicly available gene expression data sets are conducted to evaluate the performance of the proposed methods. Compared with other hypothesis testing approaches, our proposed methods are reliable and successful in identifying enriched gene sets and in discovering the contributions of genes within a gene set. The classification results of identified gene sets can provide a valuable alternative to gene set testing to reveal the unknown, biologically relevant classes of samples or patients. In summary, our proposed method allows one to simultaneously assess the discriminatory ability of gene sets and the importance of genes for…
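
    The OOB-error-with-permutation idea in this abstract can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' exact resampling procedure; the permutation count is kept small for speed:

```python
# Sketch: score one gene set by the OOB error of an RF classifier on its genes,
# then compare against a null distribution from permuted phenotype labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n, p = 120, 10
X = rng.normal(size=(n, p))                       # expression of one gene set
y = (X[:, 0] + X[:, 1] + rng.normal(0, 0.5, n) > 0).astype(int)

def oob_error(X, y):
    rf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
    rf.fit(X, y)
    return 1.0 - rf.oob_score_

observed = oob_error(X, y)
null = [oob_error(X, rng.permutation(y)) for _ in range(20)]
# Empirical p-value: how often a permuted labeling classifies at least as well
p_value = (1 + sum(e <= observed for e in null)) / (1 + len(null))
```

    Gene-level contributions would then come from `feature_importances_` of the fitted forest, as the abstract describes.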

  1. A geostatistical approach for quantification of contaminant mass discharge uncertainty using multilevel sampler measurements

    NASA Astrophysics Data System (ADS)

    Li, K. Betty; Goovaerts, Pierre; Abriola, Linda M.

    2007-06-01

    Contaminant mass discharge across a control plane downstream of a dense nonaqueous phase liquid (DNAPL) source zone has great potential to serve as a metric for the assessment of the effectiveness of source zone treatment technologies and for the development of risk-based source-plume remediation strategies. However, too often the uncertainty of mass discharge estimated in the field is not accounted for in the analysis. In this paper, a geostatistical approach is proposed to estimate mass discharge and to quantify its associated uncertainty using multilevel transect measurements of contaminant concentration (C) and hydraulic conductivity (K). The approach adapts the p-field simulation algorithm to propagate and upscale the uncertainty of mass discharge from the local uncertainty models of C and K. Application of this methodology to numerically simulated transects shows that, with a regular sampling pattern, geostatistics can provide an accurate model of uncertainty for the transects that are associated with low levels of source mass removal (i.e., transects that have a large percentage of contaminated area). For high levels of mass removal (i.e., transects with a few hot spots and large areas of near-zero concentration), a total sampling area equivalent to 6-7% of the transect is required to achieve accurate uncertainty modeling. A comparison of the results for different measurement supports indicates that samples taken with longer screen lengths may lead to less accurate models of mass discharge uncertainty. The quantification of mass discharge uncertainty, in the form of a probability distribution, will facilitate risk assessment associated with various remediation strategies.
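
    The upscaling step, from local uncertainty in C and K to a distribution of total mass discharge, can be sketched with a simple Monte Carlo. A true p-field simulation draws spatially correlated probability fields; for brevity this sketch uses independent lognormal draws per transect cell, which is a simplifying assumption, not the paper's algorithm. All numbers are illustrative:

```python
# Sketch: mass discharge per realization is sum_i K_i * gradient * C_i * A_i
# over transect cells; repeating over realizations yields its distribution.
import numpy as np

rng = np.random.default_rng(0)
n_cells, n_real = 200, 2000
cell_area = 0.25           # m^2 per transect cell (assumed)
gradient = 0.01            # hydraulic gradient (assumed)

# Local uncertainty models: lognormal C (kg/m^3) and K (m/d) for each cell
C = rng.lognormal(mean=-4.0, sigma=1.0, size=(n_real, n_cells))
K = rng.lognormal(mean=0.0, sigma=0.5, size=(n_real, n_cells))

md = (K * gradient * C * cell_area).sum(axis=1)   # mass discharge, one per realization
lo, med, hi = np.percentile(md, [5, 50, 95])      # uncertainty summary
```

    The resulting percentile band is the "probability distribution" form of mass-discharge uncertainty the abstract argues for.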

  2. Prediction of sedimentary facies of x-oilfield in northwest of China by geostatistical inversion

    NASA Astrophysics Data System (ADS)

    Lei, Zhao; Ling, Ke; Tingting, He

    2017-03-01

    In the early stage of oilfield development there are only a few wells, and well spacing can reach several kilometers. For alluvial fans and other heterogeneous reservoirs, information from wells alone is not sufficient to derive detailed reservoir information. In this paper, a method of calculating sand thickness through geostatistical inversion is studied, and the quantitative relationships between sedimentary micro-facies are analyzed in combination with single-well sedimentary facies. Further, the plane distribution of sedimentary facies based on seismic inversion is obtained by combining it with a sedimentary model, providing a geological basis for further exploration and deployment.

  3. Comparison of regression and geostatistical methods for mapping Leaf Area Index (LAI) with Landsat ETM+ data over a boreal forest.

    Treesearch

    Mercedes Berterretche; Andrew T. Hudak; Warren B. Cohen; Thomas K. Maiersperger; Stith T. Gower; Jennifer Dungan

    2005-01-01

    This study compared aspatial and spatial methods of using remote sensing and field data to predict maximum growing season leaf area index (LAI) maps in a boreal forest in Manitoba, Canada. The methods tested were orthogonal regression analysis (reduced major axis, RMA) and two geostatistical techniques: kriging with an external drift (KED) and sequential Gaussian...

  4. The gradient boosting algorithm and random boosting for genome-assisted evaluation in large data sets.

    PubMed

    González-Recio, O; Jiménez-Montero, J A; Alenda, R

    2013-01-01

    In the next few years, with the advent of high-density single nucleotide polymorphism (SNP) arrays and genome sequencing, genomic evaluation methods will need to deal with a large number of genetic variants and an increasing sample size. The boosting algorithm is a machine-learning technique that may alleviate the drawbacks of dealing with such large data sets. This algorithm combines different predictors in a sequential manner with some shrinkage on them; each predictor is applied consecutively to the residuals from the committee formed by the previous ones to form a final prediction based on a subset of covariates. Here, a detailed description is provided and examples using a toy data set are included. A modification of the algorithm called "random boosting" was proposed to increase predictive ability and decrease computation time of genome-assisted evaluation in large data sets. Random boosting uses a random selection of markers to add a subsequent weak learner to the predictive model. These modifications were applied to a real data set composed of 1,797 bulls genotyped for 39,714 SNP. Deregressed proofs of 4 yield traits and 1 type trait from January 2009 routine evaluations were used as dependent variables. A 2-fold cross-validation scenario was implemented. Sires born before 2005 were used as a training sample (1,576 and 1,562 for production and type traits, respectively), whereas younger sires were used as a testing sample to evaluate predictive ability of the algorithm on yet-to-be-observed phenotypes. Comparison with the original algorithm was provided. The predictive ability of the algorithm was measured as Pearson correlations between observed and predicted responses. Further, estimated bias was computed as the average difference between observed and predicted phenotypes. 
The results showed that the modification of the original boosting algorithm could be run in 1% of the time used with the original algorithm and with negligible differences in accuracy.
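
    The random-boosting scheme described in this abstract, fitting each weak learner to the current residuals on a random subset of markers and adding it with shrinkage, can be sketched as follows. Regression stumps stand in for the paper's weak learners, and all data and parameter values are illustrative assumptions:

```python
# Sketch of random boosting: sequential stumps on residuals, each restricted
# to a random subset of SNP markers, combined with shrinkage.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n, p = 400, 50
X = rng.integers(0, 3, size=(n, p)).astype(float)   # genotypes coded 0/1/2
y = 1.5 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(0, 0.5, n)

shrinkage, n_rounds, m_sub = 0.1, 200, 10
pred = np.full(n, y.mean())
learners = []
for _ in range(n_rounds):
    cols = rng.choice(p, size=m_sub, replace=False)  # random marker subset
    stump = DecisionTreeRegressor(max_depth=1).fit(X[:, cols], y - pred)
    pred += shrinkage * stump.predict(X[:, cols])    # shrunken update
    learners.append((cols, stump))

corr = np.corrcoef(pred, y)[0, 1]                    # fit on training data
```

    Restricting each round to `m_sub` of the `p` markers is what cuts the per-round cost relative to ordinary boosting, at the price of occasionally fitting rounds on uninformative subsets.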

  5. Accounting for regional background and population size in the detection of spatial clusters and outliers using geostatistical filtering and spatial neutral models: the case of lung cancer in Long Island, New York

    PubMed Central

    Goovaerts, Pierre; Jacquez, Geoffrey M

    2004-01-01

    Background Complete Spatial Randomness (CSR) is the null hypothesis employed by many statistical tests for spatial pattern, such as local cluster or boundary analysis. CSR is however not a relevant null hypothesis for highly complex and organized systems such as those encountered in the environmental and health sciences in which underlying spatial pattern is present. This paper presents a geostatistical approach to filter the noise caused by spatially varying population size and to generate spatially correlated neutral models that account for regional background obtained by geostatistical smoothing of observed mortality rates. These neutral models were used in conjunction with the local Moran statistics to identify spatial clusters and outliers in the geographical distribution of male and female lung cancer in Nassau, Queens, and Suffolk counties, New York, USA. Results We developed a typology of neutral models that progressively relaxes the assumptions of null hypotheses, allowing for the presence of spatial autocorrelation, non-uniform risk, and incorporation of spatially heterogeneous population sizes. Incorporation of spatial autocorrelation led to fewer significant ZIP codes than found in previous studies, confirming earlier claims that CSR can lead to over-identification of the number of significant spatial clusters or outliers. Accounting for population size through geostatistical filtering increased the size of clusters while removing most of the spatial outliers. Integration of regional background into the neutral models yielded substantially different spatial clusters and outliers, leading to the identification of ZIP codes where SMR values significantly depart from their regional background. Conclusion The approach presented in this paper enables researchers to assess geographic relationships using appropriate null hypotheses that account for the background variation extant in real-world systems. 
In particular, this new methodology allows one to identify…
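
    The local Moran statistic that the study pairs with its neutral models is I_i = z_i * Σ_j w_ij z_j for standardized rates z and spatial weights w. A toy sketch on a 1-D chain of regions with row-standardized neighbor weights; the rates are invented, and significance testing against geostatistical neutral models (the paper's contribution) is omitted:

```python
# Sketch: local Moran statistic; positive values flag clusters (high-high or
# low-low), negative values flag spatial outliers.
import numpy as np

rates = np.array([1.0, 1.1, 0.9, 3.0, 3.2, 2.9, 1.0, 0.8])  # e.g., SMRs (assumed)
z = (rates - rates.mean()) / rates.std()

n = len(z)
W = np.zeros((n, n))
for i in range(n):                     # chain adjacency: left/right neighbors
    for j in (i - 1, i + 1):
        if 0 <= j < n:
            W[i, j] = 1.0
W /= W.sum(axis=1, keepdims=True)      # row-standardize the weights

local_moran = z * (W @ z)              # I_i for each region
```

    Region 4 (center of the high-rate run) gets a strongly positive I_i, while region 2 (a low rate adjacent to a high one) is negative, the cluster/outlier distinction the abstract relies on.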

  6. Relationship between RADARSAT-2 Derived Snow Thickness on Winter First Year Sea-Ice and Aerial Melt-Pond Distribution using Geostatistics and GLCM Texture

    NASA Astrophysics Data System (ADS)

    Ramjan, S.; Geldsetzer, T.; Yackel, J.

    2016-12-01

    A contemporary shift from primarily thicker, older multi-year sea ice (MYI) to thinner, smoother first-year sea ice (FYI) has been attributed to increased atmospheric and oceanic warming in the Arctic, with a steady diminishing of Arctic sea ice thickness due to a reduction of thick MYI compared to FYI. With an increase in FYI fraction, increased melting takes place during the summer months, exposing the sea ice to additional incoming solar radiation. With this change, an increase in melt pond fraction has been observed during the summer melt season. Prior research suggested that thin/thick snow leads to dominant surface flooding/snow patches in summer because of an enhanced ice-albedo feedback. For instance, thin snow cover areas form melt ponds first. Therefore, aerial measurements of melt pond fraction provide a proxy for relative snow thickness. RADARSAT-2 polarimetric SAR data can provide enhanced information about both surface scattering and volume scattering mechanisms, as well as the phase difference between polarizations. Polarimetric parameters that have a useful physical interpretation can be computed from these data. The principal research focus is to establish a methodology to determine the relationship between selected geostatistics and image texture measures of pre-melt RADARSAT-2 parameters and aerially-measured melt pond fraction. Overall, the aim of this study is to develop an algorithm to estimate relative snow thickness variability in winter through an integrated approach utilizing SAR polarimetric parameters, geostatistical analysis and texture measures. Results are validated with test sets of melt pond fractions, and in situ snow thickness measurements. Preliminary findings show significant correlations with pond fraction for the standard deviation of HH and HV parameters at small incidence angles, and for the mean of the co-pol phase difference parameter at large incidence angles.

  7. Evaluation of variable selection methods for random forests and omics data sets.

    PubMed

    Degenhardt, Frauke; Seifert, Stephan; Szymczak, Silke

    2017-10-16

    Machine learning methods and in particular random forests are promising approaches for prediction based on high dimensional omics data sets. They provide variable importance measures to rank predictors according to their predictive power. If building a prediction model is the main goal of a study, often a minimal set of variables with good prediction performance is selected. However, if the objective is the identification of involved variables to find active networks and pathways, approaches that aim to select all relevant variables should be preferred. We evaluated several variable selection procedures based on simulated data as well as publicly available experimental methylation and gene expression data. Our comparison included the Boruta algorithm, the Vita method, recurrent relative variable importance, a permutation approach and its parametric variant (Altmann) as well as recursive feature elimination (RFE). In our simulation studies, Boruta was the most powerful approach, followed closely by the Vita method. Both approaches demonstrated similar stability in variable selection, while Vita was the most robust approach under a pure null model without any predictor variables related to the outcome. In the analysis of the different experimental data sets, Vita demonstrated slightly better stability in variable selection and was less computationally intensive than Boruta. In conclusion, we recommend the Boruta and Vita approaches for the analysis of high-dimensional data sets. Vita is considerably faster than Boruta and thus more suitable for large data sets, but only Boruta can also be applied in low-dimensional settings. © The Author 2017. Published by Oxford University Press.
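
    The shadow-feature idea behind Boruta, one of the methods compared above, can be sketched in a single iteration: permuted copies of each predictor join the data, and a predictor is kept only if its random-forest importance beats the best shadow. The full algorithm iterates this with statistical testing; this one-shot version on synthetic data is only an illustration:

```python
# Sketch: one Boruta-style iteration. Shadows are column-wise permutations of
# X, so any importance they earn is attributable to chance.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n, p = 300, 8
X = rng.normal(size=(n, p))
y = (X[:, 0] - X[:, 1] + rng.normal(0, 0.3, n) > 0).astype(int)  # only 0, 1 matter

shadows = rng.permuted(X, axis=0)              # shuffle each column independently
rf = RandomForestClassifier(n_estimators=300, random_state=0)
rf.fit(np.hstack([X, shadows]), y)

imp = rf.feature_importances_
threshold = imp[p:].max()                      # best importance among shadows
selected = np.where(imp[:p] > threshold)[0]    # predictors beating all shadows
```

    The Vita method reaches a similar goal differently, via cross-validated permutation importance, which is why it is cheaper on large data sets.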

  8. An Efficient ERP-Based Brain-Computer Interface Using Random Set Presentation and Face Familiarity

    PubMed Central

    Müller, Klaus-Robert; Lee, Seong-Whan

    2014-01-01

    Event-related potential (ERP)-based P300 spellers are commonly used in the field of brain-computer interfaces as an alternative channel of communication for people with severe neuro-muscular diseases. This study introduces a novel P300-based brain-computer interface (BCI) stimulus paradigm using a random set presentation pattern and exploiting the effects of face familiarity. The effect of face familiarity is widely studied in the cognitive neurosciences and has recently been addressed for the purpose of BCI. In this study we compare the P300-based BCI performance of a conventional row-column (RC) paradigm with our approach that combines a random set presentation paradigm with (non-) self-face stimuli. Our experimental results indicate stronger deflections of the ERPs in response to face stimuli, which are further enhanced when using the self-face images, thereby improving P300-based spelling performance. This led to a significant reduction in the number of stimulus sequences required for correct character classification. These findings demonstrate a promising new approach for improving the speed and thus fluency of BCI-enhanced communication with the widely used P300-based BCI setup. PMID:25384045
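
    A minimal sketch of the random set presentation idea: instead of flashing fixed rows and columns, each stimulation block shuffles the character matrix into random groups, so every character still flashes exactly once per block. The 6x6 matrix and group size below are assumptions for illustration, not the authors' exact stimulus code.

```python
import random

def random_set_sequences(chars, set_size, n_blocks, seed=0):
    """Build stimulus blocks for a random-set paradigm: within each
    block the characters are shuffled and chunked into groups, so each
    character flashes exactly once per block (unlike fixed rows/columns,
    where group membership never changes)."""
    rng = random.Random(seed)
    blocks = []
    for _ in range(n_blocks):
        order = list(chars)
        rng.shuffle(order)
        blocks.append([order[i:i + set_size]
                       for i in range(0, len(order), set_size)])
    return blocks

# assumed 6x6 speller matrix: 26 letters plus 10 filler symbols
matrix = [chr(c) for c in range(ord("A"), ord("Z") + 1)] + list("123456789_")
blocks = random_set_sequences(matrix, set_size=6, n_blocks=10)
```

    Because group membership changes every block, adjacency errors of the RC paradigm (neighbouring characters flashing together repeatedly) are decorrelated across blocks.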

  9. An efficient ERP-based brain-computer interface using random set presentation and face familiarity.

    PubMed

    Yeom, Seul-Ki; Fazli, Siamac; Müller, Klaus-Robert; Lee, Seong-Whan

    2014-01-01

    Event-related potential (ERP)-based P300 spellers are commonly used in the field of brain-computer interfaces as an alternative channel of communication for people with severe neuro-muscular diseases. This study introduces a novel P300-based brain-computer interface (BCI) stimulus paradigm using a random set presentation pattern and exploiting the effects of face familiarity. The effect of face familiarity is widely studied in the cognitive neurosciences and has recently been addressed for the purpose of BCI. In this study we compare the P300-based BCI performance of a conventional row-column (RC) paradigm with our approach that combines a random set presentation paradigm with (non-) self-face stimuli. Our experimental results indicate stronger deflections of the ERPs in response to face stimuli, which are further enhanced when using the self-face images, thereby improving P300-based spelling performance. This led to a significant reduction in the number of stimulus sequences required for correct character classification. These findings demonstrate a promising new approach for improving the speed and thus fluency of BCI-enhanced communication with the widely used P300-based BCI setup.

  10. Involving patients in setting priorities for healthcare improvement: a cluster randomized trial

    PubMed Central

    2014-01-01

    Background Patients are increasingly seen as active partners in healthcare. While patient involvement in individual clinical decisions has been extensively studied, no trial has assessed how patients can effectively be involved in collective healthcare decisions affecting the population. The goal of this study was to test the impact of involving patients in setting healthcare improvement priorities for chronic care at the community level. Methods Design: Cluster randomized controlled trial. Local communities were randomized into intervention (priority setting with patient involvement) and control sites (no patient involvement). Setting: Communities in a Canadian region were required to set priorities for improving chronic disease management in primary care, from a list of 37 validated quality indicators. Intervention: Patients were consulted in writing, before participating in face-to-face deliberation with professionals. Control: Professionals established priorities among themselves, without patient involvement. Participants: A total of 172 individuals from six communities participated in the study, including 83 chronic disease patients and 89 health professionals. Outcomes: The primary outcome was the level of agreement between patients’ and professionals’ priorities. Secondary outcomes included professionals’ intention to use the selected quality indicators, and the costs of patient involvement. Results Priorities established with patients were more aligned with core generic components of the Medical Home and Chronic Care Model, including: access to primary care, self-care support, patient participation in clinical decisions, and partnership with community organizations (p < 0.01). Priorities established by professionals alone placed more emphasis on the technical quality of single disease management. The involvement intervention fostered mutual influence between patients and professionals, which resulted in a 41% increase in agreement on common

  11. Amorphous topological insulators constructed from random point sets

    NASA Astrophysics Data System (ADS)

    Mitchell, Noah P.; Nash, Lisa M.; Hexner, Daniel; Turner, Ari M.; Irvine, William T. M.

    2018-04-01

    The discovery that the band structure of electronic insulators may be topologically non-trivial has revealed distinct phases of electronic matter with novel properties [1,2]. Recently, mechanical lattices have been found to have similarly rich structure in their phononic excitations [3,4], giving rise to protected unidirectional edge modes [5-7]. In all of these cases, however, as well as in other topological metamaterials [3,8], the underlying structure was finely tuned, be it through periodicity, quasi-periodicity or isostaticity. Here we show that amorphous Chern insulators can be readily constructed from arbitrary underlying structures, including hyperuniform, jammed, quasi-crystalline and uniformly random point sets. While our findings apply to mechanical and electronic systems alike, we focus on networks of interacting gyroscopes as a model system. Local decorations control the topology of the vibrational spectrum, endowing amorphous structures with protected edge modes, with a chirality of choice. Using a real-space generalization of the Chern number, we investigate the topology of our structures numerically, analytically and experimentally. The robustness of our approach enables the topological design and self-assembly of non-crystalline topological metamaterials on the micro and macro scale.

  12. Predicting polycyclic aromatic hydrocarbons using a mass fraction approach in a geostatistical framework across North Carolina.

    PubMed

    Reyes, Jeanette M; Hubbard, Heidi F; Stiegel, Matthew A; Pleil, Joachim D; Serre, Marc L

    2018-01-09

    Currently in the United States there are no regulatory standards for ambient concentrations of polycyclic aromatic hydrocarbons (PAHs), a class of organic compounds with known carcinogenic species. As such, monitoring data are not routinely collected, resulting in limited exposure mapping and epidemiologic studies. This work develops the log-mass fraction (LMF) Bayesian maximum entropy (BME) geostatistical prediction method, used to predict the concentration of nine particle-bound PAHs across the US state of North Carolina. The LMF method develops a relationship between a relatively small number of collocated PAH and fine particulate matter (PM2.5) samples collected in 2005 and applies that relationship to a larger number of locations where PM2.5 is routinely monitored to more broadly estimate PAH concentrations across the state. Cross-validation and mapping results indicate that by incorporating both PAH and PM2.5 data, the LMF BME method reduces mean squared error by 28.4% and produces more realistic spatial gradients compared to the traditional kriging approach based solely on observed PAH data. The LMF BME method efficiently creates PAH predictions in a PAH data-sparse and PM2.5 data-rich setting, opening the door for more expansive epidemiologic exposure assessments of ambient PAH.
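
    The core of the LMF idea can be sketched as follows: learn a mass fraction from the collocated PAH/PM2.5 samples, then scale the routinely monitored PM2.5 values by it. The actual method treats the LMF as a spatial field within a BME framework; this sketch collapses it to a single global mean, and all numbers are invented.

```python
import math
import statistics

def fit_log_mass_fraction(pah_collocated, pm25_collocated):
    # mean log mass fraction learned at the collocated monitoring sites
    return statistics.fmean(math.log(p / m)
                            for p, m in zip(pah_collocated, pm25_collocated))

def predict_pah(pm25_values, lmf):
    # scale routinely monitored PM2.5 by the fitted mass fraction
    return [m * math.exp(lmf) for m in pm25_values]

# invented toy numbers: PAH is roughly 2% of PM2.5 mass
lmf = fit_log_mass_fraction([2.0, 4.1, 2.9], [100.0, 210.0, 140.0])
est = predict_pah([50.0, 120.0], lmf)
```

    Working in log space keeps the fraction strictly positive and makes the average robust to the multiplicative scatter typical of concentration data.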

  13. Obtaining parsimonious hydraulic conductivity fields using head and transport observations: A Bayesian geostatistical parameter estimation approach

    NASA Astrophysics Data System (ADS)

    Fienen, M.; Hunt, R.; Krabbenhoft, D.; Clemo, T.

    2009-08-01

    Flow path delineation is a valuable tool for interpreting the subsurface hydrogeochemical environment. Different types of data, such as groundwater flow and transport, inform different aspects of hydrogeologic parameter values (hydraulic conductivity in this case) which, in turn, determine flow paths. This work combines flow and transport information to estimate a unified set of hydrogeologic parameters using the Bayesian geostatistical inverse approach. Parameter flexibility is allowed by using a highly parameterized approach with the level of complexity informed by the data. Despite the effort to adhere to the ideal of minimal a priori structure imposed on the problem, extreme contrasts in parameters can result in the need to censor correlation across hydrostratigraphic bounding surfaces. These partitions segregate parameters into facies associations. With an iterative approach in which partitions are based on inspection of initial estimates, flow path interpretation is progressively refined through the inclusion of more types of data. Head observations, stable oxygen isotopes (18O/16O ratios), and tritium are all used to progressively refine flow path delineation on an isthmus between two lakes in the Trout Lake watershed, northern Wisconsin, United States. Despite allowing significant parameter freedom by estimating many distributed parameter values, a smooth field is obtained.

  14. Obtaining parsimonious hydraulic conductivity fields using head and transport observations: A Bayesian geostatistical parameter estimation approach

    USGS Publications Warehouse

    Fienen, M.; Hunt, R.; Krabbenhoft, D.; Clemo, T.

    2009-01-01

    Flow path delineation is a valuable tool for interpreting the subsurface hydrogeochemical environment. Different types of data, such as groundwater flow and transport, inform different aspects of hydrogeologic parameter values (hydraulic conductivity in this case) which, in turn, determine flow paths. This work combines flow and transport information to estimate a unified set of hydrogeologic parameters using the Bayesian geostatistical inverse approach. Parameter flexibility is allowed by using a highly parameterized approach with the level of complexity informed by the data. Despite the effort to adhere to the ideal of minimal a priori structure imposed on the problem, extreme contrasts in parameters can result in the need to censor correlation across hydrostratigraphic bounding surfaces. These partitions segregate parameters into facies associations. With an iterative approach in which partitions are based on inspection of initial estimates, flow path interpretation is progressively refined through the inclusion of more types of data. Head observations, stable oxygen isotopes (18O/16O ratios), and tritium are all used to progressively refine flow path delineation on an isthmus between two lakes in the Trout Lake watershed, northern Wisconsin, United States. Despite allowing significant parameter freedom by estimating many distributed parameter values, a smooth field is obtained.

  15. Fine-grained sediment spatial distribution on the basis of a geostatistical analysis: Example of the eastern Bay of the Seine (France)

    NASA Astrophysics Data System (ADS)

    Méar, Y.; Poizot, E.; Murat, A.; Lesueur, P.; Thomas, M.

    2006-12-01

    The eastern Bay of the Seine (English Channel) was the subject of a 1991 sampling survey of superficial sediments. Geostatistical tools were used to examine the complexity of the spatial distribution of the fine-grained fraction (<50 μm). A central depocentre of fine sediments (i.e. content up to 50%), oriented in a NW-SE direction in a muddy coastal strip, in a very high energy hydrodynamical situation due to storm swells and its megatidal setting, is recognised and discussed for the first time. Within this sedimentary unit, the distribution of the fine fraction is very heterogeneous, with mud patches less than 4000 m in diameter; the boundary between these mud patches and their substratum is very sharp. The distribution of this fine fraction appears to be controlled by an anticyclonic eddy located off the Pays de Caux. Under its influence, the suspended material expelled from the Seine estuary moves along the coast and swings off Antifer harbour towards the NW, where it is trapped within the eddy because of the settling of suspended particulate matter. Both at the general scale and the local scale, the morphology (whether inherited or due to modern processes) has a strong influence on the spatial distribution of the fine fraction. At the general scale, the basin-like shape of the area facilitates the silting, and the presence of the submarine dunes called "Ridins d'Antifer" clearly determines the northern limit of the muddy zone. At the local scale, the same influence is obvious: paleovalleys trap the fine sediments, whereas isolated sand dunes and ripples limit the silting. This dual role of the morphology is therefore one of the reasons why the muddy surface is extremely heterogeneous spatially. The presence of an important population of a suspension-feeding echinoderm, the brittle-star Ophiothrix fragilis Abildgaard, has led to a local increase in the silting, and to the modification of the physicochemical and sedimentological parameters. A complex
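
    The basic tool behind this kind of analysis, the empirical semivariogram, can be sketched directly from its definition; the lag bins and toy transect below are assumptions for illustration.

```python
import math

def empirical_semivariogram(points, values, lag_edges):
    """gamma(h) = average of 0.5 * (z_i - z_j)^2 over all point pairs
    whose separation distance falls inside each lag bin."""
    sums = [0.0] * (len(lag_edges) - 1)
    counts = [0] * (len(lag_edges) - 1)
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            h = math.dist(points[i], points[j])
            for k in range(len(lag_edges) - 1):
                if lag_edges[k] <= h < lag_edges[k + 1]:
                    sums[k] += 0.5 * (values[i] - values[j]) ** 2
                    counts[k] += 1
                    break
    return [s / c if c else float("nan") for s, c in zip(sums, counts)]

# toy transect with a linear trend: semivariance grows with lag
pts = [(float(x), 0.0) for x in range(5)]
vals = [0.0, 1.0, 2.0, 3.0, 4.0]
gamma = empirical_semivariogram(pts, vals, [0.5, 1.5, 2.5])
```

    In a study like the one above, a model (e.g. spherical) would then be fitted to these binned values before kriging the fine fraction.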

  16. Soil moisture estimation by assimilating L-band microwave brightness temperature with geostatistics and observation localization.

    PubMed

    Han, Xujun; Li, Xin; Rigon, Riccardo; Jin, Rui; Endrizzi, Stefano

    2015-01-01

    Observations can be used to reduce model uncertainties with data assimilation. If observations cannot cover the whole model area due to spatial availability or instrument capability, how can data assimilation be performed at locations not covered by observations? Two commonly used strategies are first described: one is covariance localization (CL); the other is observation localization (OL). Compared with CL, OL is easy to parallelize and more efficient for large-scale analysis. This paper evaluates OL in soil moisture profile characterization, in which a geostatistical semivariogram is used to fit the spatially correlated characteristics of synthetic L-band microwave brightness temperature measurements. The fitted semivariogram model and the local ensemble transform Kalman filter algorithm are combined to weight and assimilate the observations within a local region surrounding each grid cell of the land surface model to be analyzed. Six scenarios were compared: 1_Obs with the single nearest observation assimilated, 5_Obs with no more than the five nearest local observations assimilated, and 9_Obs with no more than the nine nearest local observations assimilated, along with scenarios assimilating no more than 16, 25, and 36 local observations. The results show that involving more local observations in the assimilation improves the estimates, with an upper bound of nine observations in this case. This study demonstrates the potential of geostatistical correlation representation in OL to improve data assimilation of catchment-scale soil moisture using synthetic L-band microwave brightness temperature, which cannot cover the study area fully in space due to vegetation effects.
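
    One way to read the combination of a fitted semivariogram with observation localization is as distance-dependent observation weights. A hedged sketch, assuming a spherical model and converting semivariance into an implied correlation (the LETKF machinery itself is omitted; all parameter values are invented):

```python
def spherical_semivariogram(h, nugget, sill, rng):
    # standard spherical model: rises from the nugget to the sill at range `rng`
    if h >= rng:
        return sill
    x = h / rng
    return nugget + (sill - nugget) * (1.5 * x - 0.5 * x ** 3)

def localization_weight(h, nugget, sill, rng):
    """Correlation implied by the fitted semivariogram, usable as the
    weight given to an observation at distance h from the analyzed
    grid cell: 1 at zero lag (if nugget = 0), 0 beyond the range."""
    return max(0.0, 1.0 - spherical_semivariogram(h, nugget, sill, rng) / sill)

# nearby observations get large weights, distant ones small weights
w_near = localization_weight(1.0, 0.0, 1.0, 10.0)
w_far = localization_weight(9.0, 0.0, 1.0, 10.0)
```

    The appeal of deriving the weight from the fitted semivariogram, rather than from an arbitrary taper, is that the cut-off distance comes from the data's own spatial correlation structure.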

  17. Soil Moisture Estimation by Assimilating L-Band Microwave Brightness Temperature with Geostatistics and Observation Localization

    PubMed Central

    Han, Xujun; Li, Xin; Rigon, Riccardo; Jin, Rui; Endrizzi, Stefano

    2015-01-01

    Observations can be used to reduce model uncertainties with data assimilation. If observations cannot cover the whole model area due to spatial availability or instrument capability, how can data assimilation be performed at locations not covered by observations? Two commonly used strategies are first described: one is covariance localization (CL); the other is observation localization (OL). Compared with CL, OL is easy to parallelize and more efficient for large-scale analysis. This paper evaluates OL in soil moisture profile characterization, in which a geostatistical semivariogram is used to fit the spatially correlated characteristics of synthetic L-band microwave brightness temperature measurements. The fitted semivariogram model and the local ensemble transform Kalman filter algorithm are combined to weight and assimilate the observations within a local region surrounding each grid cell of the land surface model to be analyzed. Six scenarios were compared: 1_Obs with the single nearest observation assimilated, 5_Obs with no more than the five nearest local observations assimilated, and 9_Obs with no more than the nine nearest local observations assimilated, along with scenarios assimilating no more than 16, 25, and 36 local observations. The results show that involving more local observations in the assimilation improves the estimates, with an upper bound of nine observations in this case. This study demonstrates the potential of geostatistical correlation representation in OL to improve data assimilation of catchment-scale soil moisture using synthetic L-band microwave brightness temperature, which cannot cover the study area fully in space due to vegetation effects. PMID:25635771

  18. Measuring CAMD technique performance. 2. How "druglike" are drugs? Implications of Random test set selection exemplified using druglikeness classification models.

    PubMed

    Good, Andrew C; Hermsmeier, Mark A

    2007-01-01

    Research into the advancement of computer-aided molecular design (CAMD) has a tendency to focus on the discipline of algorithm development. Such efforts often come at the expense of the data set selection and analysis used to validate said algorithms. Here we highlight the potential problems this can cause in the context of druglikeness classification. More rigorous efforts are applied to the selection of decoy (nondruglike) molecules from the ACD. Comparisons are made between model performance using the standard technique of random test set creation and test sets derived from explicit ontological separation by drug class. The dangers of viewing druglike space as sufficiently coherent to permit simple classification are highlighted. In addition, the issues inherent in applying unfiltered data and random test set selection to (Q)SAR models utilizing large and supposedly heterogeneous databases are discussed.
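
    The contrast between random test-set creation and explicit ontological separation can be sketched with two split functions; the molecule names and drug classes below are invented labels.

```python
import random

def random_split(items, test_frac, seed=0):
    # conventional random test-set creation
    rng = random.Random(seed)
    shuffled = items[:]
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_frac)
    return shuffled[n_test:], shuffled[:n_test]

def class_holdout_split(items, classes, held_out):
    """Ontological separation: hold out an entire drug class, so the
    model is tested on chemistry it has never seen during training."""
    train = [x for x, c in zip(items, classes) if c != held_out]
    test = [x for x, c in zip(items, classes) if c == held_out]
    return train, test

mols = ["m1", "m2", "m3", "m4", "m5", "m6"]
cls = ["antibiotic", "antibiotic", "cns", "cns", "cardio", "cardio"]
train, test = class_holdout_split(mols, cls, "cns")
```

    A random split leaks near-duplicates of the training chemistry into the test set, inflating apparent accuracy; the class-holdout split is the pessimistic counterpart the abstract argues for.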

  19. Geostatistical analysis of regional hydraulic conductivity variations in the Snake River Plain aquifer, eastern Idaho

    USGS Publications Warehouse

    Welhan, J.A.; Reed, M.F.

    1997-01-01

    The regional spatial correlation structure of bulk horizontal hydraulic conductivity (Kb), estimated from published transmissivity data from 79 open boreholes in the fractured basalt aquifer of the eastern Snake River Plain, was analyzed with geostatistical methods. The two-dimensional spatial correlation structure of ln Kb shows a pronounced 4:1 range anisotropy, with a maximum correlation range in the north-northwest to south-southeast direction of about 6 km. The maximum variogram range of ln Kb is similar to the mean length of flow groups exposed at the surface. The ln Kb range anisotropy is similar to the mean width/length ratio of late Quaternary and Holocene basalt lava flows and the orientations of the major volcanic structural features on the eastern Snake River Plain. The similarity between ln Kb correlation scales and basalt flow dimensions, and between basalt flow orientations and correlation range anisotropy, suggests that the spatial distribution of zones of high hydraulic conductivity may be controlled by the lateral dimensions, spatial distribution, and interconnection of the highly permeable zones which are known to occur between lava flows within flow groups. If hydraulic conductivity and lithology are eventually shown to be cross-correlative in this geologic setting, it may be possible to stochastically simulate hydraulic conductivity distributions conditional on a knowledge of volcanic stratigraphy.

  20. Improving parenting skills for families of young children in pediatric settings: a randomized clinical trial.

    PubMed

    Perrin, Ellen C; Sheldrick, R Christopher; McMenamy, Jannette M; Henson, Brandi S; Carter, Alice S

    2014-01-01

    Disruptive behavior disorders, such as attention-deficit/hyperactivity disorder and oppositional defiant disorder, are common and stable throughout childhood. These disorders cause long-term morbidity but benefit from early intervention. While symptoms are often evident before preschool, few children receive appropriate treatment during this period. Group parent training, such as the Incredible Years program, has been shown to be effective in improving parenting strategies and reducing children's disruptive behaviors. Because they already monitor young children's behavior and development, primary care pediatricians are in a good position to intervene early when indicated. To investigate the feasibility and effectiveness of parent-training groups delivered to parents of toddlers in pediatric primary care settings. This randomized clinical trial was conducted at 11 diverse pediatric practices in the Greater Boston area. A total of 273 parents of children between 2 and 4 years old who acknowledged disruptive behaviors on a 20-item checklist were included. A 10-week Incredible Years parent-training group co-led by a research clinician and a pediatric staff member. Self-reports and structured videotaped observations of parent and child behaviors conducted prior to, immediately after, and 12 months after the intervention. A total of 150 parents were randomly assigned to the intervention or the waiting-list group. An additional 123 parents were assigned to receive intervention without a randomly selected comparison group. Compared with the waiting-list group, greater improvement was observed in both intervention groups (P < .05). No differences were observed between the randomized and the nonrandomized intervention groups. Self-reports and structured observations provided evidence of improvements in parenting practices and child disruptive behaviors that were attributable to participation in the Incredible Years groups. This study demonstrated the feasibility and

  1. Geostatistical Investigations of Displacements on the Basis of Data from the Geodetic Monitoring of a Hydrotechnical Object

    NASA Astrophysics Data System (ADS)

    Namysłowska-Wilczyńska, Barbara; Wynalek, Janusz

    2017-12-01

    Geostatistical methods make the analysis of measurement data possible. This article presents problems directed towards the use of geostatistics in the spatial analysis of displacements based on geodetic monitoring. Using methods of applied (spatial) statistics, the research deals with interesting and current issues connected to space-time analysis, modeling displacements and deformations, as applied to any large-area objects on which geodetic monitoring is conducted (e.g., water dams, urban areas in the vicinity of deep excavations, areas at a macro-regional scale subject to anthropogenic influences caused by mining, etc.). These problems are crucial, especially for the safety assessment of important hydrotechnical constructions, as well as for modeling and estimating mining damage. Based on the geodetic monitoring data, a substantial basic empirical material was created, comprising many years of research results concerning displacements of controlled points situated on the crown and foreland of an exemplary earth dam, and used to assess the behaviour and safety of the object during its whole operating period. A research method at a macro-regional scale was applied to investigate some phenomena connected with the operation of the analysed large hydrotechnical construction. Applying a semivariogram function enabled the spatial variability analysis of displacements. Isotropic empirical semivariograms were calculated and then theoretical parameters of analytical functions were determined, which approximated the courses of the mentioned empirical variability measure. Using ordinary (block) kriging at the grid nodes of an elementary spatial grid covering the analysed object, the values of the estimated means Z* of displacements were calculated, together with the accompanying assessment of estimation uncertainty, a standard deviation of estimation σk. Raster maps of the distribution of estimated averages Z* and raster maps of deviations of estimation σk (in perspective
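
    The estimation step described above, a kriged mean Z* with its standard deviation σk, can be sketched for ordinary point kriging. Block kriging additionally averages covariances over cell supports, which is omitted here, and the exponential covariance model and toy data are assumptions.

```python
import math

def solve(A, b):
    # Gaussian elimination with partial pivoting for the small kriging system
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def ordinary_kriging(points, values, target, cov):
    """Return the kriged estimate Z* and the kriging standard deviation
    sigma_k at `target`, given a covariance function cov(h)."""
    n = len(points)
    # kriging system with a Lagrange multiplier enforcing unit weight sum
    A = [[cov(math.dist(points[i], points[j])) for j in range(n)] + [1.0]
         for i in range(n)]
    A.append([1.0] * n + [0.0])
    b = [cov(math.dist(p, target)) for p in points] + [1.0]
    sol = solve(A, b)
    w, mu = sol[:n], sol[n]
    z_star = sum(wi * zi for wi, zi in zip(w, values))
    var = cov(0.0) - sum(wi * bi for wi, bi in zip(w, b[:n])) - mu
    return z_star, math.sqrt(max(var, 0.0))

cov = lambda h: math.exp(-h)  # assumed exponential covariance model
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
z = [2.0, 3.0, 4.0]
```

    Evaluated on a grid of nodes, the two returned quantities are exactly what the raster maps of Z* and σk display; at a data point kriging reproduces the observation with zero variance, and σk grows away from the monitored points.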

  2. A geostatistics-informed hierarchical sensitivity analysis method for complex groundwater flow and transport modeling

    NASA Astrophysics Data System (ADS)

    Dai, Heng; Chen, Xingyuan; Ye, Ming; Song, Xuehang; Zachara, John M.

    2017-05-01

    Sensitivity analysis is an important tool for the development and improvement of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study, we developed a new sensitivity analysis method that integrates the concept of variance-based methods with a hierarchical uncertainty quantification framework. Different uncertain inputs are grouped and organized into a multilayer framework based on their characteristics and dependency relationships to reduce the dimensionality of the sensitivity analysis. A set of new sensitivity indices is defined for the grouped inputs using the variance decomposition method. Using this methodology, we identified the most important uncertainty sources for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that the boundary conditions and the permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional, spatially distributed input variables.
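
    The variance-based indices underlying such a method can be illustrated with the standard pick-freeze (Saltelli-type) Monte Carlo estimator on a toy additive model. This shows plain first-order Sobol indices only, not the paper's grouped, hierarchical variant; the test model and sample size are assumptions.

```python
import random

def sobol_first_order(f, n_inputs, n_samples, seed=0):
    """First-order Sobol indices via the pick-freeze estimator:
    V_i is approximated by mean(y_B * (y_ABi - y_A)), where AB_i is
    the A sample matrix with column i taken from B."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(n_inputs)] for _ in range(n_samples)]
    B = [[rng.random() for _ in range(n_inputs)] for _ in range(n_samples)]
    yA = [f(x) for x in A]
    yB = [f(x) for x in B]
    mean = sum(yA) / n_samples
    var = sum((y - mean) ** 2 for y in yA) / n_samples
    S = []
    for i in range(n_inputs):
        yABi = [f(a[:i] + [b[i]] + a[i + 1:]) for a, b in zip(A, B)]
        Vi = sum(yb * (yi - ya)
                 for yb, yi, ya in zip(yB, yABi, yA)) / n_samples
        S.append(Vi / var)
    return S

# additive test model y = 4*x1 + x2 with x ~ U(0,1):
# the analytic first-order indices are 16/17 and 1/17
S = sobol_first_order(lambda x: 4 * x[0] + x[1], 2, 20000)
```

    The hierarchical method in the abstract applies this decomposition to groups of inputs (e.g. "boundary conditions" versus "permeability field") rather than to individual scalars, which is what keeps the dimensionality manageable.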

  3. A training image evaluation and selection method based on minimum data event distance for multiple-point geostatistics

    NASA Astrophysics Data System (ADS)

    Feng, Wenjie; Wu, Shenghe; Yin, Yanshu; Zhang, Jiajia; Zhang, Ke

    2017-07-01

    A training image (TI) can be regarded as a database of spatial structures and their low- to higher-order statistics used in multiple-point geostatistics (MPS) simulation. Presently, there are a number of methods to construct a series of candidate TIs (CTIs) for MPS simulation based on a modeler's subjective criteria. The spatial structures of TIs often vary, meaning that different CTIs are not equally compatible with the conditioning data. Therefore, evaluation and optimal selection of CTIs before MPS simulation are essential. This paper proposes a CTI evaluation and optimal selection method based on minimum data event distance (MDevD). In the proposed method, a set of MDevD properties is established through calculation of the MDevD of conditioning data events in each CTI. CTIs are then evaluated and ranked according to the mean value and variance of the MDevD properties: the smaller the mean value and variance of an MDevD property, the more compatible the corresponding CTI is with the conditioning data. In addition, data events with low compatibility in the conditioning data grid can be located to help modelers select a set of complementary CTIs for MPS simulation. The MDevD property can also help to narrow the range of the distance threshold for MPS simulation. The proposed method was evaluated using three examples: a 2D categorical example, a 2D continuous example, and an actual 3D oil reservoir case study. To illustrate the method, a C++ implementation is attached to the paper.
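
    The MDevD computation can be sketched in one dimension: for each conditioning data event, scan the TI for its closest-matching event, then rank the CTIs by the mean of these minimum distances. The mismatch-fraction distance and the toy facies patterns below are assumptions, not the paper's exact definitions.

```python
def event_distance(e1, e2):
    # fraction of template nodes whose facies codes disagree
    return sum(a != b for a, b in zip(e1, e2)) / len(e1)

def min_data_event_distance(event, ti):
    """Scan every template position in a (1D) training image for the
    data event's closest match; 0 means the event occurs exactly."""
    n = len(event)
    return min(event_distance(event, ti[i:i + n])
               for i in range(len(ti) - n + 1))

def rank_ctis(conditioning_events, ctis):
    """Rank candidate TIs by mean MDevD over all conditioning data
    events; smaller means more compatible with the data."""
    scores = [sum(min_data_event_distance(e, ti)
                  for e in conditioning_events) / len(conditioning_events)
              for ti in ctis]
    return sorted(range(len(ctis)), key=lambda k: scores[k]), scores

events = [[0, 0, 1], [1, 1, 0]]
ti_a = [0, 0, 1, 1, 0, 0, 1, 1]   # contains both data events exactly
ti_b = [0, 0, 0, 0, 0, 0, 0, 0]   # contains neither pattern
order, scores = rank_ctis(events, [ti_a, ti_b])
```

    The per-event minima are also diagnostic on their own: events with a large minimum distance in every CTI flag the conditioning data the candidate set cannot reproduce.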

  4. The Ethics of Randomized Controlled Trials in Social Settings: Can Social Trials Be Scientifically Promising and Must There Be Equipoise?

    ERIC Educational Resources Information Center

    Fives, Allyn; Russell, Daniel W.; Canavan, John; Lyons, Rena; Eaton, Patricia; Devaney, Carmel; Kearns, Norean; O'Brien, Aoife

    2015-01-01

    In a randomized controlled trial (RCT), treatments are assigned randomly and treatments are withheld from participants. Is it ethically permissible to conduct an RCT in a social setting? This paper addresses two conditions for justifying RCTs: that there should be a state of equipoise and that the trial should be scientifically promising.…

  5. An interactive Bayesian geostatistical inverse protocol for hydraulic tomography

    USGS Publications Warehouse

    Fienen, Michael N.; Clemo, Tom; Kitanidis, Peter K.

    2008-01-01

    Hydraulic tomography is a powerful technique for characterizing heterogeneous hydrogeologic parameters. An explicit trade-off between characterization based on measurement misfit and subjective characterization using prior information is presented. We apply a Bayesian geostatistical inverse approach that is well suited to accommodate a flexible model with the level of complexity driven by the data and explicitly considering uncertainty. Prior information is incorporated through the selection of a parameter covariance model characterizing continuity and providing stability. Often, discontinuities in the parameter field, typically caused by geologic contacts between contrasting lithologic units, necessitate subdivision into zones across which there is no correlation among hydraulic parameters. We propose an interactive protocol in which zonation candidates are implied from the data and are evaluated using cross validation and expert knowledge. Uncertainty introduced by limited knowledge of dynamic regional conditions is mitigated by using drawdown rather than native head values. An adjoint state formulation of MODFLOW-2000 is used to calculate sensitivities which are used both for the solution to the inverse problem and to guide protocol decisions. The protocol is tested using synthetic two-dimensional steady state examples in which the wells are located at the edge of the region of interest.

  6. Kriging in the Shadows: Geostatistical Interpolation for Remote Sensing

    NASA Technical Reports Server (NTRS)

    Rossi, Richard E.; Dungan, Jennifer L.; Beck, Louisa R.

    1994-01-01

    It is often useful to estimate obscured or missing remotely sensed data. Traditional interpolation methods, such as nearest-neighbor or bilinear resampling, do not take full advantage of the spatial information in the image. An alternative method, a geostatistical technique known as indicator kriging, is described and demonstrated using a Landsat Thematic Mapper image in southern Chiapas, Mexico. The image was first classified into pasture and nonpasture land cover. For each pixel that was obscured by cloud or cloud shadow, the probability that it was pasture was assigned by the algorithm. An exponential omnidirectional variogram model was used to characterize the spatial continuity of the image for use in the kriging algorithm. Assuming a cutoff probability level of 50%, the error was shown to be 17% with no obvious spatial bias but with some tendency to categorize nonpasture as pasture (overestimation). While this is a promising result, the method's practical application in other missing data problems for remotely sensed images will depend on the amount and spatial pattern of the unobscured pixels and missing pixels and the success of the spatial continuity model used.

  7. Evaluating shrub-associated spatial patterns of soil properties in a shrub-steppe ecosystem using multiple-variable geostatistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halvorson, J.J.; Smith, J.L.; Bolton, H. Jr.

    1995-09-01

    Geostatistics are often calculated for a single variable at a time, even though many natural phenomena are functions of several variables. The objective of this work was to demonstrate a nonparametric approach for assessing the spatial characteristics of multiple-variable phenomena. Specifically, we analyzed the spatial characteristics of resource islands in the soil under big sagebrush (Artemisia tridentata Nutt.), a dominant shrub in the intermountain western USA. For our example, we defined resource islands as a function of six soil variables representing concentrations of soil resources, populations of microorganisms, and soil microbial physiological variables. By collectively evaluating the indicator transformations of these individual variables, we created a new data set, termed a multiple-variable indicator transform or MVIT. Alternate MVITs were obtained by varying the selection criteria. Each MVIT was analyzed with variography to characterize spatial continuity, and with indicator kriging to predict the combined probability of their occurrence at unsampled locations in the landscape. Simple graphical analysis and variography demonstrated spatial dependence for all individual soil variables. Maps derived from ordinary kriging of MVITs suggested that the combined probabilities for encountering zones of above-median resources were greatest near big sagebrush. 51 refs., 5 figs., 1 tab.
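    The indicator-transform step behind an MVIT can be written down directly. In this hedged sketch the variable names and values are invented (they are not the study's data): each soil variable is coded 1 where it exceeds its own median, and the multiple-variable indicator is 1 only where every variable does so at once.

```python
import numpy as np

# Hypothetical soil measurements at five sample locations (illustrative only)
soil = {
    "carbon":   np.array([2.1, 3.4, 1.8, 4.0, 2.9]),
    "nitrogen": np.array([0.11, 0.21, 0.09, 0.25, 0.18]),
    "microbes": np.array([5.0, 9.1, 4.2, 9.8, 7.5]),
}

# Indicator transform: 1 where the variable exceeds its median, else 0
indicators = {k: (v > np.median(v)).astype(int) for k, v in soil.items()}

# MVIT under an "all above median" selection criterion: a candidate
# resource island is flagged only where every indicator is 1
mvit = np.ones(5, dtype=int)
for ind in indicators.values():
    mvit &= ind
```

    Varying the selection criterion (e.g., requiring only four of six variables to exceed their medians) yields the alternate MVITs the abstract mentions; the resulting 0/1 field is what variography and indicator kriging are then applied to.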

  8. Analysis of photogenerated random telegraph signal in single electron detector (photo-SET).

    PubMed

    Troudi, M; Sghaier, Na; Kalboussi, A; Souifi, A

    2010-01-04

    In this paper, we analyzed slow single traps situated inside the tunnel oxide of a small-area single electron photo-detector (photo-SET or nanopixel). The relationship between the excitation signal (photons) and the random telegraph signal (RTS) was evidenced. We demonstrated that the photoinduced RTS observed on a photo-detector is due to the interaction between single photogenerated charges that tunnel from dot to dot and the current path. Based on RTS analysis at various temperatures, gate biases, and optical powers, we determined the characteristics of these single photogenerated traps: the energy position within the silicon bandgap, the capture cross section, and the position within the Si/SiOx (x = 1.5) interfaces.

  9. Geostatistics as a tool to improve the natural background level definition: An application in groundwater.

    PubMed

    Dalla Libera, Nico; Fabbri, Paolo; Mason, Leonardo; Piccinini, Leonardo; Pola, Marco

    2017-11-15

    The Natural Background Level (NBL), suggested by the EU BRIDGE project, is suited to spatially distributed datasets, providing a regional value that can be higher than the Threshold Value (TV) set by each country. In hydro-geochemically inhomogeneous areas, the use of a single regional NBL higher than the TV can make it difficult to distinguish between natural occurrences and anthropogenic contaminant sources. Hence, the goal of this study is to improve the NBL definition by employing a geostatistical approach, which reconstructs the contaminant spatial structure accounting for geochemical and hydrogeological relationships. This integrated mapping is fundamental to evaluate the impact of the contaminant's distribution on the NBL, giving indications for improving it. We tested this method on the Drainage Basin of the Venice Lagoon (DBVL, NE Italy), where the existing NBL is seven times higher than the TV. This area is notoriously affected by naturally occurring arsenic contamination. An available geochemical dataset collected from 50 piezometers was used to reconstruct the spatial distribution of arsenic in the densely populated area of the DBVL. A cokriging approach was applied exploiting the geochemical relationships among As, Fe and NH4+. The obtained spatial predictions of arsenic concentrations were divided into three zones: i) areas with an As concentration lower than the TV, ii) areas with an As concentration between the TV and the median of the values higher than the TV, and iii) areas with an As concentration higher than that median. Following the BRIDGE suggestions, where enough samples were available, the 90th percentile for each zone was calculated to obtain a local NBL (LNBL). Differently from the original NBL, this local value gives more detailed water quality information accounting for the hydrogeological and geochemical setting and the contaminant's spatial variation. Hence, the LNBL could give more indications about the distinction between natural occurrence and

  10. MoisturEC: an R application for geostatistical estimation of moisture content from electrical conductivity data

    NASA Astrophysics Data System (ADS)

    Terry, N.; Day-Lewis, F. D.; Werkema, D. D.; Lane, J. W., Jr.

    2017-12-01

    Soil moisture is a critical parameter for agriculture, water supply, and management of landfills. Whereas direct data (as from TDR or soil moisture probes) provide localized point scale information, it is often more desirable to produce 2D and/or 3D estimates of soil moisture from noninvasive measurements. To this end, geophysical methods for indirectly assessing soil moisture have great potential, yet are limited in terms of quantitative interpretation due to uncertainty in petrophysical transformations and inherent limitations in resolution. Simple tools to produce soil moisture estimates from geophysical data are lacking. We present a new standalone program, MoisturEC, for estimating moisture content distributions from electrical conductivity data. The program uses an indicator kriging method within a geostatistical framework to incorporate hard data (as from moisture probes) and soft data (as from electrical resistivity imaging or electromagnetic induction) to produce estimates of moisture content and uncertainty. The program features data visualization and output options as well as a module for calibrating electrical conductivity with moisture content to improve estimates. The user-friendly program is written in R - a widely used, cross-platform, open source programming language that lends itself to further development and customization. We demonstrate use of the program with a numerical experiment as well as a controlled field irrigation experiment. Results produced from the combined geostatistical framework of MoisturEC show improved estimates of moisture content compared to those generated from individual datasets. This application provides a convenient and efficient means for integrating various data types and has broad utility to soil moisture monitoring in landfills, agriculture, and other problems.

  11. Improving practice in community-based settings: a randomized trial of supervision - study protocol.

    PubMed

    Dorsey, Shannon; Pullmann, Michael D; Deblinger, Esther; Berliner, Lucy; Kerns, Suzanne E; Thompson, Kelly; Unützer, Jürgen; Weisz, John R; Garland, Ann F

    2013-08-10

    Evidence-based treatments for child mental health problems are not consistently available in public mental health settings. Expanding availability requires workforce training. However, research has demonstrated that training alone is not sufficient for changing provider behavior, suggesting that ongoing intervention-specific supervision or consultation is required. Supervision is notably under-investigated, particularly as provided in public mental health. The degree to which supervision in this setting includes 'gold standard' supervision elements from efficacy trials (e.g., session review, model fidelity, outcome monitoring, skill-building) is unknown. The current federally-funded investigation leverages the Washington State Trauma-focused Cognitive Behavioral Therapy Initiative to describe usual supervision practices and test the impact of systematic implementation of gold standard supervision strategies on treatment fidelity and clinical outcomes. The study has two phases. We will conduct an initial descriptive study (Phase I) of supervision practices within public mental health in Washington State followed by a randomized controlled trial of gold standard supervision strategies (Phase II), with randomization at the clinician level (i.e., supervisors provide both conditions). Study participants will be 35 supervisors and 130 clinicians in community mental health centers. We will enroll one child per clinician in Phase I (N = 130) and three children per clinician in Phase II (N = 390). We use a multi-level mixed within- and between-subjects longitudinal design. Audio recordings of supervision and therapy sessions will be collected and coded throughout both phases. Child outcome data will be collected at the beginning of treatment and at three and six months into treatment. This study will provide insight into how supervisors can optimally support clinicians delivering evidence-based treatments. Phase I will provide descriptive information, currently

  12. Generalized index for spatial data sets as a measure of complete spatial randomness

    NASA Astrophysics Data System (ADS)

    Hackett-Jones, Emily J.; Davies, Kale J.; Binder, Benjamin J.; Landman, Kerry A.

    2012-06-01

    Spatial data sets generated from a wide range of physical systems can be analyzed by counting the number of objects in a set of bins. Previous work has been limited to equal-sized bins, which are inappropriate for some domains (e.g., circular). We consider a nonequal-sized bin configuration whereby overlapping or nonoverlapping bins cover the domain. A generalized index, defined in terms of a variance between bin counts, is developed to indicate whether or not a spatial data set, generated from exclusion or nonexclusion processes, is at the complete spatial randomness (CSR) state. Limiting values of the index are determined. Using examples, we investigate trends in the generalized index as a function of density and compare the results with those using equal-sized bins. The smallest bin size must be much larger than the mean size of the objects. We can determine whether a spatial data set is at the CSR state or not by comparing the values of the generalized index for different bin configurations: the values will be approximately the same if the data set is at the CSR state, while they will differ if it is not. In general, the generalized index is lower than the limiting value of the index, since objects do not have access to the entire region due to blocking by other objects. These methods are applied to two applications: (i) spatial data sets generated from a cellular automata model of cell aggregation in the enteric nervous system and (ii) a known plant data distribution.
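    As a point of reference for this generalized index, the classical equal-bin version is a few lines of code: the variance-to-mean ratio of bin counts is approximately 1 under CSR and inflates under clustering. The sketch below uses invented points and equal bins only; it is the baseline the paper generalizes, not the nonequal-bin index itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def dispersion_index(points, edges_x, edges_y):
    """Variance-to-mean ratio of bin counts; ~1 for complete spatial randomness."""
    counts, _, _ = np.histogram2d(points[:, 0], points[:, 1],
                                  bins=[edges_x, edges_y])
    c = counts.ravel()
    return c.var() / c.mean()

edges = np.linspace(0.0, 1.0, 11)             # a 10 x 10 grid of equal bins

# CSR: uniformly random points on the unit square
pts = rng.random((2000, 2))
idx_csr = dispersion_index(pts, edges, edges)

# Clustered alternative: tight Gaussian blobs around 10 random centers
centers = rng.random((10, 2))
blob = centers[rng.integers(0, 10, 2000)] + 0.01 * rng.standard_normal((2000, 2))
idx_clustered = dispersion_index(np.clip(blob, 0.0, 1.0), edges, edges)
```

    Comparing the index across bin configurations, as the abstract proposes, works because only the CSR state gives approximately the same value for every configuration.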

  13. High-Dimensional Bayesian Geostatistics

    PubMed Central

    Banerjee, Sudipto

    2017-01-01

    With the growing capabilities of Geographic Information Systems (GIS) and user-friendly software, statisticians today routinely encounter geographically referenced data containing observations from a large number of spatial locations and time points. Over the last decade, hierarchical spatiotemporal process models have become widely deployed statistical tools for researchers to better understand the complex nature of spatial and temporal variability. However, fitting hierarchical spatiotemporal models often involves expensive matrix computations with complexity increasing in cubic order for the number of spatial locations and temporal points. This renders such models unfeasible for large data sets. This article offers a focused review of two methods for constructing well-defined highly scalable spatiotemporal stochastic processes. Both these processes can be used as “priors” for spatiotemporal random fields. The first approach constructs a low-rank process operating on a lower-dimensional subspace. The second approach constructs a Nearest-Neighbor Gaussian Process (NNGP) that ensures sparse precision matrices for its finite realizations. Both processes can be exploited as a scalable prior embedded within a rich hierarchical modeling framework to deliver full Bayesian inference. These approaches can be described as model-based solutions for big spatiotemporal datasets. The models ensure that the algorithmic complexity has ~ n floating point operations (flops), where n is the number of spatial locations (per iteration). We compare these methods and provide some insight into their methodological underpinnings. PMID:29391920

  14. High-Dimensional Bayesian Geostatistics.

    PubMed

    Banerjee, Sudipto

    2017-06-01

    With the growing capabilities of Geographic Information Systems (GIS) and user-friendly software, statisticians today routinely encounter geographically referenced data containing observations from a large number of spatial locations and time points. Over the last decade, hierarchical spatiotemporal process models have become widely deployed statistical tools for researchers to better understand the complex nature of spatial and temporal variability. However, fitting hierarchical spatiotemporal models often involves expensive matrix computations with complexity increasing in cubic order for the number of spatial locations and temporal points. This renders such models unfeasible for large data sets. This article offers a focused review of two methods for constructing well-defined highly scalable spatiotemporal stochastic processes. Both these processes can be used as "priors" for spatiotemporal random fields. The first approach constructs a low-rank process operating on a lower-dimensional subspace. The second approach constructs a Nearest-Neighbor Gaussian Process (NNGP) that ensures sparse precision matrices for its finite realizations. Both processes can be exploited as a scalable prior embedded within a rich hierarchical modeling framework to deliver full Bayesian inference. These approaches can be described as model-based solutions for big spatiotemporal datasets. The models ensure that the algorithmic complexity has ~ n floating point operations (flops), where n is the number of spatial locations (per iteration). We compare these methods and provide some insight into their methodological underpinnings.

  15. Implementation of the Iterative Proportion Fitting Algorithm for Geostatistical Facies Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li Yupeng, E-mail: yupeng@ualberta.ca; Deutsch, Clayton V.

    2012-06-15

    In geostatistics, most stochastic algorithms for the simulation of categorical variables, such as facies or rock types, require a conditional probability distribution. The multivariate probability distribution of all the grouped locations, including the unsampled location, permits calculation of the conditional probability directly from its definition. In this article, the iterative proportion fitting (IPF) algorithm is implemented to infer this multivariate probability. Using the IPF algorithm, the multivariate probability is obtained by iteratively modifying an initial estimated multivariate probability using lower-order bivariate probabilities as constraints. The imposed bivariate marginal probabilities are inferred from profiles along drill holes or wells. In the IPF process, a sparse matrix is used to calculate the marginal probabilities from the multivariate probability, which makes the iterative fitting more tractable and practical. This algorithm can be extended to higher-order marginal probability constraints as used in multiple-point statistics. The theoretical framework is developed and illustrated with estimation and simulation examples.
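    The iterative modification the abstract describes is easiest to see on a small two-way table. The following is a generic, hedged sketch with invented numbers (a bivariate table rather than the article's multivariate facies setting): the seed table is alternately rescaled so its row sums and then its column sums match the imposed marginal probabilities.

```python
import numpy as np

def ipf(seed, row_targets, col_targets, iters=100):
    """Iterative proportional fitting: rescale a seed table until its
    row and column sums match the target marginal probabilities."""
    p = seed.astype(float).copy()
    for _ in range(iters):
        p *= (row_targets / p.sum(axis=1))[:, None]   # enforce row marginals
        p *= (col_targets / p.sum(axis=0))[None, :]   # enforce column marginals
    return p

seed = np.array([[1.0, 1.0],
                 [1.0, 1.0]])                 # uninformative initial estimate
rows = np.array([0.7, 0.3])                   # imposed marginal of variable A
cols = np.array([0.6, 0.4])                   # imposed marginal of variable B
p = ipf(seed, rows, cols)
```

    Starting from a uniform seed, the fit converges to the independence table np.outer(rows, cols); an informative seed would preserve its interaction structure while still honoring the same marginals.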

  16. LSHSIM: A Locality Sensitive Hashing based method for multiple-point geostatistics

    NASA Astrophysics Data System (ADS)

    Moura, Pedro; Laber, Eduardo; Lopes, Hélio; Mesejo, Daniel; Pavanelli, Lucas; Jardim, João; Thiesen, Francisco; Pujol, Gabriel

    2017-10-01

    Reservoir modeling is a very important task that permits the representation of a geological region of interest, so as to generate a considerable number of possible scenarios. Since its inception, many methodologies have been proposed and, in the last two decades, multiple-point geostatistics (MPS) has been the dominant one. This methodology is strongly based on the concept of a training image (TI) and the use of its characteristics, which are called patterns. In this paper, we propose a new MPS method that combines the application of a technique called Locality Sensitive Hashing (LSH), which accelerates the search for patterns similar to a target one, with a Run-Length Encoding (RLE) compression technique that speeds up the calculation of the Hamming similarity. Experiments with both categorical and continuous images show that LSHSIM is computationally efficient and produces good-quality realizations. In particular, for categorical data, the results suggest that LSHSIM is faster than MS-CCSIM, one of the state-of-the-art methods.
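    The role LSH plays here can be illustrated with its simplest Hamming-distance family, bit sampling. This is a generic sketch with invented binary patterns (it is not the LSHSIM code): a signature is the pattern restricted to a fixed random subset of positions, so patterns that agree on those positions fall into the same bucket and become cheap candidates for a full similarity check.

```python
import numpy as np

rng = np.random.default_rng(1)

def lsh_signature(pattern, sample_idx):
    """Bit-sampling LSH for Hamming distance: keep only the sampled positions."""
    return tuple(pattern[sample_idx])

# A tiny pattern database extracted from a hypothetical training image
db = rng.integers(0, 2, (500, 64))
sample_idx = rng.choice(64, size=8, replace=False)

# Hash every database pattern into a bucket keyed by its signature
buckets = {}
for i, pattern in enumerate(db):
    buckets.setdefault(lsh_signature(pattern, sample_idx), []).append(i)

# Query: db[0] with three bits flipped outside the sampled positions still
# produces the same signature, so the true match is among the candidates
flip = [i for i in range(64) if i not in set(sample_idx)][:3]
query = db[0].copy()
query[flip] ^= 1
candidates = buckets.get(lsh_signature(query, sample_idx), [])
```

    In practice several independent hash tables are used so that near matches collide with high probability even when some sampled bits differ; the candidates are then ranked by exact Hamming similarity, which is where an RLE speedup like LSHSIM's would enter.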

  17. Site characterization methodology for aquifers in support of bioreclamation activities. Volume 2: Borehole flowmeter technique, tracer tests, geostatistics and geology. Final report, August 1987-September 1989

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, S.C.

    1993-08-01

    This report discusses a field demonstration of a methodology for characterizing an aquifer's geohydrology in the detail required to design an optimum network of wells and/or infiltration galleries for bioreclamation systems. The project work was conducted on a 1-hectare test site at Columbus AFB, Mississippi. The technical report is divided into two volumes. Volume I describes the test site and the well network, the assumptions, and the application of equations that define groundwater flow to a well, the results of three large-scale aquifer tests, and the results of 160 single-pump tests. Volume II describes the borehole flowmeter tests, the tracer tests, the geological investigations, the geostatistical analysis, and the guidelines for using groundwater models to design bioreclamation systems. Keywords: Site characterization, Hydraulic conductivity, Groundwater flow, Geostatistics, Geohydrology, Monitoring wells.

  18. Phase I/II randomized trial of aerobic exercise in Parkinson disease in a community setting.

    PubMed

    Uc, Ergun Y; Doerschug, Kevin C; Magnotta, Vincent; Dawson, Jeffrey D; Thomsen, Teri R; Kline, Joel N; Rizzo, Matthew; Newman, Sara R; Mehta, Sonya; Grabowski, Thomas J; Bruss, Joel; Blanchette, Derek R; Anderson, Steven W; Voss, Michelle W; Kramer, Arthur F; Darling, Warren G

    2014-07-29

    To (1) investigate effects of aerobic walking on motor function, cognition, and quality of life in Parkinson disease (PD), and (2) compare safety, tolerability, and fitness benefits of different forms of exercise intervention: continuous/moderate intensity vs interval/alternating between low and vigorous intensity, and individual/neighborhood vs group/facility setting. Initial design was a 6-month, 2 × 2 randomized trial of different exercise regimens in independently ambulatory patients with PD. All arms were required to exercise 3 times per week, 45 minutes per session. Randomization to group/facility setting was not feasible because of logistical factors. Over the first 2 years, we randomized 43 participants to continuous or interval training. Because preliminary analyses suggested higher musculoskeletal adverse events in the interval group and lack of difference between training methods in improving fitness, the next 17 participants were allocated only to continuous training. Eighty-one percent of 60 participants completed the study with a mean attendance of 83.3% (95% confidence interval: 77.5%-89.0%), exercising at 46.8% (44.0%-49.7%) of their heart rate reserve. There were no serious adverse events. Across all completers, we observed improvements in maximum oxygen consumption, gait speed, Unified Parkinson's Disease Rating Scale sections I and III scores (particularly axial functions and rigidity), fatigue, depression, quality of life (e.g., psychological outlook), and flanker task scores (p < 0.05 to p < 0.001). Increase in maximum oxygen consumption correlated with improvements on the flanker task and quality of life (p < 0.05). Our preliminary study suggests that aerobic walking in a community setting is safe, well tolerated, and improves aerobic fitness, motor function, fatigue, mood, executive control, and quality of life in mild to moderate PD. 
This study provides Class IV evidence that in patients with PD, an aerobic exercise program improves aerobic

  19. Geostatistics and Geographic Information Systems to Study the Spatial Distribution of Grapholita molesta (Busck) (Lepidoptera: Tortricidae) in Peach Fields.

    PubMed

    Duarte, F; Calvo, M V; Borges, A; Scatoni, I B

    2015-08-01

    The oriental fruit moth, Grapholita molesta (Busck), is the most serious pest in peach, and several insecticide applications are required to reduce crop damage to acceptable levels. Geostatistics and Geographic Information Systems (GIS) are employed to measure the range of spatial correlation of G. molesta in order to define the optimum sampling distance for performing spatial analysis and to determine the current distribution of the pest in peach orchards of southern Uruguay. From 2007 to 2010, 135 pheromone traps per season were installed and georeferenced in peach orchards distributed over 50,000 ha. Male adult captures were recorded weekly from September to April. Structural analysis of the captures was performed, yielding 14 semivariograms for the accumulated captures analyzed by generation and growing season. Two sets of maps were constructed to describe the pest distribution. Nine significant models were obtained in the 14 evaluated periods. The range estimated for the correlation was from 908 to 6884 m. Three hot spots of high population level and some areas with comparatively low populations were constant over the 3-year period, while there is a greater variation in the size of the population in different generations and years in other areas.
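    The structural analysis reported here begins with an empirical semivariogram. The sketch below uses invented coordinates and an invented spatially correlated variable (not the trap data); it computes the classical Matheron estimator over distance bins, and the lag at which the curve levels off estimates the range of spatial correlation.

```python
import numpy as np

rng = np.random.default_rng(3)

def empirical_semivariogram(coords, z, lag_edges):
    """Matheron estimator: half the mean squared increment over all point
    pairs whose separation distance falls in each lag bin."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = (z[:, None] - z[None, :]) ** 2
    iu = np.triu_indices(len(z), k=1)          # each pair counted once
    dist, sqd = d[iu], sq[iu]
    gamma = []
    for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
        mask = (dist >= lo) & (dist < hi)
        gamma.append(0.5 * sqd[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# Synthetic field: a smooth spatial trend plus noise, on a 1000 m x 1000 m area
coords = rng.random((300, 2)) * 1000.0
z = np.sin(coords[:, 0] / 300.0) + np.sin(coords[:, 1] / 300.0)
z += 0.1 * rng.standard_normal(300)
gamma = empirical_semivariogram(coords, z, np.linspace(0.0, 500.0, 6))
```

    Fitting a model (spherical, exponential, ...) to gamma then yields the range parameter that, in the study above, fell between 908 and 6884 m and set the optimum sampling distance.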

  20. [Multivariate geostatistics and GIS-based approach to study the spatial distribution and sources of heavy metals in agricultural soil in the Pearl River Delta, China].

    PubMed

    Cai, Li-mei; Ma, Jin; Zhou, Yong-zhang; Huang, Lan-chun; Dou, Lei; Zhang, Cheng-bo; Fu, Shan-ming

    2008-12-01

    One hundred and eighteen surface soil samples were collected from Dongguan City and analyzed for concentrations of Cu, Zn, Ni, Cr, Pb, Cd, As, and Hg, as well as pH and OM. The spatial distribution and sources of soil heavy metals were studied using multivariate geostatistical methods and GIS techniques. The results indicated that concentrations of Cu, Zn, Ni, Pb, Cd and Hg exceeded the soil background content of Guangdong province, with Pb, Cd and Hg exceeding it greatly. Factor analysis grouped Cu, Zn, Ni, Cr and As in Factor 1, Pb and Hg in Factor 2, and Cd in Factor 3. The spatial maps based on geostatistical analysis show a definite association of Factor 1 with the soil parent material, whereas Factor 2 was mainly affected by industry. The spatial distribution of Factor 3 was attributed to anthropogenic influence.

  1. Inversion using a new low-dimensional representation of complex binary geological media based on a deep neural network

    NASA Astrophysics Data System (ADS)

    Laloy, Eric; Hérault, Romain; Lee, John; Jacques, Diederik; Linde, Niklas

    2017-12-01

    Efficient and high-fidelity prior sampling and inversion for complex geological media is still a largely unsolved challenge. Here, we use a deep neural network of the variational autoencoder type to construct a parametric low-dimensional base model parameterization of complex binary geological media. For inversion purposes, it has the attractive feature that random draws from an uncorrelated standard normal distribution yield model realizations with spatial characteristics that are in agreement with the training set. In comparison with the most commonly used parametric representations in probabilistic inversion, we find that our dimensionality reduction (DR) approach outperforms principal component analysis (PCA), optimization-PCA (OPCA) and discrete cosine transform (DCT) DR techniques for unconditional geostatistical simulation of a channelized prior model. For the considered examples, substantial compression ratios (200-500) are achieved. Given that the construction of our parameterization requires a training set of several tens of thousands of prior model realizations, our DR approach is more suited for probabilistic (or deterministic) inversion than for unconditional (or point-conditioned) geostatistical simulation. Probabilistic inversions of 2D steady-state and 3D transient hydraulic tomography data are used to demonstrate the DR-based inversion. For the 2D case study, the performance is superior compared to current state-of-the-art multiple-point statistics inversion by sequential geostatistical resampling (SGR). Inversion results for the 3D application are also encouraging.

  2. Combining area-based and individual-level data in the geostatistical mapping of late-stage cancer incidence.

    PubMed

    Goovaerts, Pierre

    2009-01-01

    This paper presents a geostatistical approach to incorporate individual-level data (e.g. patient residences) and area-based data (e.g. rates recorded at census tract level) into the mapping of late-stage cancer incidence, with an application to breast cancer in three Michigan counties. Spatial trends in cancer incidence are first estimated from census data using area-to-point binomial kriging. This prior model is then updated using indicator kriging and individual-level data. Simulation studies demonstrate the benefits of this two-step approach over methods (kernel density estimation and indicator kriging) that process only residence data.

  3. Set statistics in conductive bridge random access memory device with Cu/HfO2/Pt structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Meiyun; Long, Shibing, E-mail: longshibing@ime.ac.cn; Wang, Guoming

    2014-11-10

    The switching parameter variation of resistive switching memory is one of the most important challenges in its application. In this letter, we have studied the set statistics of conductive bridge random access memory with a Cu/HfO2/Pt structure. The experimental distributions of the set parameters in several off-resistance ranges are shown to nicely fit a Weibull model. The Weibull slopes of the set voltage and current increase and decrease logarithmically with off resistance, respectively. This experimental behavior is perfectly captured by a Monte Carlo simulator based on the cell-based set voltage statistics model and the Quantum Point Contact electron transport model. Our work provides indications for the improvement of the switching uniformity.
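    A Weibull slope like the one analyzed above is typically read off a probability plot. The sketch below uses synthetic "set voltages" with invented parameters (not the authors' measurements): it linearizes the Weibull CDF, ln(-ln(1-F)) = beta*ln(v) - beta*ln(eta), and estimates the slope beta and scale eta by least squares.

```python
import numpy as np

rng = np.random.default_rng(7)

def weibull_fit(samples):
    """Estimate Weibull shape (slope) beta and scale eta from a probability
    plot: regress ln(-ln(1-F)) on ln(v) using median-rank plotting positions."""
    v = np.sort(samples)
    n = len(v)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # Benard's median ranks
    y = np.log(-np.log(1.0 - F))
    x = np.log(v)
    beta, intercept = np.polyfit(x, y, 1)
    eta = np.exp(-intercept / beta)               # y = 0 at v = eta
    return beta, eta

# Synthetic set voltages drawn from a known Weibull(beta = 4, eta = 2 V)
v_set = 2.0 * rng.weibull(4.0, 2000)
beta_hat, eta_hat = weibull_fit(v_set)
```

    Repeating the fit within each off-resistance range, as in the letter, would trace out the reported logarithmic trend of the Weibull slope with off resistance.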

  4. Association between expression of random gene sets and survival is evident in multiple cancer types and may be explained by sub-classification.

    PubMed

    Shimoni, Yishai

    2018-02-01

    One of the goals of cancer research is to identify a set of genes that cause or control disease progression. However, although multiple such gene sets were published, these are usually in very poor agreement with each other, and very few of the genes proved to be functional therapeutic targets. Furthermore, recent findings from a breast cancer gene-expression cohort showed that sets of genes selected randomly can be used to predict survival with a much higher probability than expected. These results imply that many of the genes identified in breast cancer gene expression analysis may not be causal of cancer progression, even though they can still be highly predictive of prognosis. We performed a similar analysis on all the cancer types available in The Cancer Genome Atlas (TCGA), namely, estimating the predictive power of random gene sets for survival. Our work shows that most cancer types exhibit the property that random selections of genes are more predictive of survival than expected. In contrast to previous work, this property is not removed by using a proliferation signature, which implies that proliferation may not always be the confounder that drives this property. We suggest one possible solution in the form of data-driven sub-classification to reduce this property significantly. Our results suggest that the predictive power of random gene sets may be used to identify the existence of sub-classes in the data, and thus may allow better understanding of patient stratification. Furthermore, by reducing the observed bias this may allow more direct identification of biologically relevant, and potentially causal, genes.

  5. Association between expression of random gene sets and survival is evident in multiple cancer types and may be explained by sub-classification

    PubMed Central

    2018-01-01

    One of the goals of cancer research is to identify a set of genes that cause or control disease progression. However, although multiple such gene sets were published, these are usually in very poor agreement with each other, and very few of the genes proved to be functional therapeutic targets. Furthermore, recent findings from a breast cancer gene-expression cohort showed that sets of genes selected randomly can be used to predict survival with a much higher probability than expected. These results imply that many of the genes identified in breast cancer gene expression analysis may not be causal of cancer progression, even though they can still be highly predictive of prognosis. We performed a similar analysis on all the cancer types available in The Cancer Genome Atlas (TCGA), namely, estimating the predictive power of random gene sets for survival. Our work shows that most cancer types exhibit the property that random selections of genes are more predictive of survival than expected. In contrast to previous work, this property is not removed by using a proliferation signature, which implies that proliferation may not always be the confounder that drives this property. We suggest one possible solution in the form of data-driven sub-classification to reduce this property significantly. Our results suggest that the predictive power of random gene sets may be used to identify the existence of sub-classes in the data, and thus may allow better understanding of patient stratification. Furthermore, by reducing the observed bias this may allow more direct identification of biologically relevant, and potentially causal, genes. PMID:29470520

  6. Use of geostatistics to determine the spatial distribution and infestation rate of leaf-cutting ant nests (Hymenoptera: Formicidae) in eucalyptus plantations.

    PubMed

    Lasmar, O; Zanetti, R; dos Santos, A; Fernandes, B V

    2012-08-01

    One of the fundamental steps in pest sampling is the assessment of the population distribution in the field. Several studies have investigated the distribution and appropriate sampling methods for leaf-cutting ants; however, more reliable methods are still required, such as those that use geostatistics. The objective of this study was to determine the spatial distribution and infestation rate of leaf-cutting ant nests in eucalyptus plantations by using geostatistics. The study was carried out in 2008 in two eucalyptus stands in Paraopeba, Minas Gerais, Brazil. All of the nests in the studied area were located and used for the generation of GIS maps, and the spatial pattern of distribution was determined considering the number and size of nests. Each analysis and map was made using the R statistics program and the geoR package. The nest spatial distribution in a savanna area of Minas Gerais was clustered to a certain extent. The models generated allowed the production of kriging maps of areas infested with leaf-cutting ants, where chemical intervention would be necessary, reducing control costs and the impact on humans and the environment.

  7. Accuracy and uncertainty analysis of soil Bbf spatial distribution estimation at a coking plant-contaminated site based on normalization geostatistical technologies.

    PubMed

    Liu, Geng; Niu, Junjie; Zhang, Chao; Guo, Guanlin

    2015-12-01

    Data distribution is usually skewed severely by the presence of hot spots in contaminated sites. This causes difficulties for accurate geostatistical data transformation. Three typical normal-distribution transformation methods, termed the normal score, Johnson, and Box-Cox transformations, were applied to compare the effects of spatial interpolation on normal-distribution-transformed benzo(b)fluoranthene data from a large-scale coking plant-contaminated site in north China. All three transformation methods decreased the skewness and kurtosis of the benzo(b)fluoranthene data, and all of the transformed data passed the Kolmogorov-Smirnov normality test. Cross-validation showed that Johnson ordinary kriging had a minimum root-mean-square error of 1.17 and a mean error of 0.19, making it more accurate than the other two models. The areas with fewer sampling points and those with high levels of contamination showed the largest prediction standard errors on the Johnson ordinary kriging prediction map. We introduce an ideal normal transformation method prior to geostatistical estimation for severely skewed data, which enhances the reliability of risk estimation and improves the accuracy of remediation-boundary determination.
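
    The effect of a normalizing transformation on skewed contaminant data can be sketched with a hand-rolled Box-Cox transform. The lognormal "concentration" sample below is synthetic, standing in for severely skewed site data, and the lambda grid search is a simple stand-in for maximum-likelihood selection:

```python
import numpy as np

rng = np.random.default_rng(0)
conc = rng.lognormal(mean=1.0, sigma=1.0, size=500)  # synthetic skewed "concentrations"

def skewness(x):
    """Sample skewness (population moments)."""
    x = np.asarray(x, float)
    m = x.mean()
    return np.mean((x - m) ** 3) / np.std(x) ** 3

def boxcox(x, lam):
    """Box-Cox power transform for positive data."""
    return np.log(x) if lam == 0 else (x ** lam - 1.0) / lam

# grid-search the lambda that minimizes |skewness| of the transformed data
lams = np.linspace(-2, 2, 81)
best = min(lams, key=lambda l: abs(skewness(boxcox(conc, l))))
print(round(skewness(conc), 2), round(skewness(boxcox(conc, best)), 2))
```

    A transform chosen this way brings the data close enough to normality for kriging; predictions are back-transformed afterwards.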

  8. Involving patients in setting priorities for healthcare improvement: a cluster randomized trial.

    PubMed

    Boivin, Antoine; Lehoux, Pascale; Lacombe, Réal; Burgers, Jako; Grol, Richard

    2014-02-20

    Patients are increasingly seen as active partners in healthcare. While patient involvement in individual clinical decisions has been extensively studied, no trial has assessed how patients can effectively be involved in collective healthcare decisions affecting the population. The goal of this study was to test the impact of involving patients in setting healthcare improvement priorities for chronic care at the community level. Cluster randomized controlled trial. Local communities were randomized to intervention (priority setting with patient involvement) and control sites (no patient involvement). Communities in a Canadian region were required to set priorities for improving chronic disease management in primary care, from a list of 37 validated quality indicators. Patients were consulted in writing, before participating in face-to-face deliberation with professionals. Professionals established priorities among themselves, without patient involvement. A total of 172 individuals from six communities participated in the study, including 83 chronic disease patients, and 89 health professionals. The primary outcome was the level of agreement between patients' and professionals' priorities. Secondary outcomes included professionals' intention to use the selected quality indicators, and the costs of patient involvement. Priorities established with patients were more aligned with core generic components of the Medical Home and Chronic Care Model, including: access to primary care, self-care support, patient participation in clinical decisions, and partnership with community organizations (p < 0.01). Priorities established by professionals alone placed more emphasis on the technical quality of single disease management. The involvement intervention fostered mutual influence between patients and professionals, which resulted in a 41% increase in agreement on common priorities (95%CI: +12% to +58%, p < 0.01). Professionals' intention to use the selected quality

  9. Use of geostatistics in planning optimum drilling program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghose S.

    1989-08-01

    Application of geostatistics in the natural resources industry is well established. In a typical estimation process, statistically dependent geological data are used to predict the characteristics of a deposit. The estimator used is the best linear unbiased estimator (BLUE), and a numerical factor of confidence is also provided. The natural inhomogeneity and anisotropy of a deposit are also quantified precisely. Drilling is the most reliable way of obtaining data for mining and related industries. However, it is often difficult to decide on the optimum number of drill holes necessary for evaluation. In this paper, sequential measures of the percent variation of a geological variable at the 95% confidence level have been used to determine the economically optimum drilling density. A coal reserve model is used to illustrate the method and findings. Fictitious drilling data were added (within the domain of the population characteristics) in stages to identify a point of stability, beyond which the gain was insignificant (diminishing marginal benefit). The final relations are established by graphically projecting and comparing two variables, cost and precision. By mapping the percent variation at each stage, localized areas of discrepancy can be identified; these are the locations where additional drilling is needed. The system can be controlled if the procedure is performed in progressive stages and the precision trend toward stability is monitored.
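
    The staged-drilling idea can be illustrated with the simplest possible precision measure: the half-width of the 95% confidence interval on the mean, expressed as a percentage. The seam-thickness data below are synthetic, standing in for staged drill-hole results:

```python
import numpy as np

rng = np.random.default_rng(42)
thickness = rng.normal(2.5, 0.6, size=200)  # hypothetical coal-seam thickness (m)

def percent_variation(sample):
    """Half-width of the 95% confidence interval on the mean, as a percent of the mean."""
    n = len(sample)
    return 100 * 1.96 * sample.std(ddof=1) / (np.sqrt(n) * sample.mean())

# add "drill holes" in stages and watch the precision measure stabilize
for n in (10, 20, 40, 80, 160):
    print(n, round(percent_variation(thickness[:n]), 1))
```

    The point where this curve flattens against drilling cost is the economic stopping point the abstract describes; a geostatistical version would use the kriging variance instead of the classical standard error.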

  10. Multivariate Analysis and Modeling of Sediment Pollution Using Neural Network Models and Geostatistics

    NASA Astrophysics Data System (ADS)

    Golay, Jean; Kanevski, Mikhaïl

    2013-04-01

    The present research deals with the exploration and modeling of a complex dataset of 200 measurement points of sediment pollution by heavy metals in Lake Geneva. The fundamental idea was to use multivariate Artificial Neural Networks (ANN) along with geostatistical models and tools in order to improve the accuracy and the interpretability of data modeling. The results obtained with ANN were compared to those of traditional geostatistical algorithms like ordinary (co)kriging and (co)kriging with an external drift. Exploratory data analysis highlighted a great variety of relationships (i.e. linear, non-linear, independence) between the 11 variables of the dataset (i.e. Cadmium, Mercury, Zinc, Copper, Titanium, Chromium, Vanadium and Nickel as well as the spatial coordinates of the measurement points and their depth). Then, exploratory spatial data analysis (i.e. anisotropic variography, local spatial correlations and moving window statistics) was carried out. It was shown that the different phenomena to be modeled were characterized by high spatial anisotropies, complex spatial correlation structures and heteroscedasticity. A feature selection procedure based on General Regression Neural Networks (GRNN) was also applied to create subsets of variables that improve the predictions during the modeling phase. The basic modeling was conducted using a Multilayer Perceptron (MLP), a workhorse of ANNs. MLP models are robust and highly flexible tools which can incorporate in a nonlinear manner different kinds of high-dimensional information. In the present research, the input layer was made of either two (spatial coordinates) or three neurons (when depth as auxiliary information could possibly capture an underlying trend) and the output layer was composed of one (univariate MLP) to eight neurons corresponding to the heavy metals of the dataset (multivariate MLP). MLP models with three input neurons can be referred to as Artificial Neural Networks with EXternal

  11. Demonstration and Validation of the Geostatistical Temporal-Spatial Algorithm (GTS) for Optimization of Long-Term Monitoring (LTM) of Groundwater at Military and Government Sites

    DTIC Science & Technology

    2010-08-01

    Report documentation fragment: Demonstration and Validation of the Geostatistical Temporal-Spatial Algorithm (GTS) for Optimization of Long-Term Monitoring (LTM) of Groundwater at Military and Government Sites. Acronyms defined include LTM (long-term monitoring), LTMO (long-term monitoring optimization), LWQR (locally weighted quadratic regression), LZ (Lower Zone), and MCL.

  12. Quantifying Groundwater Fluctuations in the Southern High Plains with GIS and Geostatistics

    NASA Astrophysics Data System (ADS)

    Whitehead, B.

    2008-12-01

    Groundwater, as a dwindling, non-renewable natural resource, has been an important research theme in agricultural studies coupled with human-environment interaction. This research incorporated contemporary Geographic Information System (GIS) methodologies and a universal kriging interpolator (geostatistics) to develop depth-to-groundwater surfaces for the southern portion of the High Plains, or Ogallala, aquifer. The variations in the interpolated surfaces were used to calculate the volume of water mined from the aquifer from 1980 to 2005. The findings suggest a nearly inverse relationship to the water withdrawal scenarios derived by the United States Geological Survey (USGS) during the Regional Aquifer System Analysis (RASA) performed in the early 1980s. These results motivate further research into regional climate change, groundwater-surface water interaction, and recharge mechanisms in the region, and provide a substantial contribution to the continuing and contentious issue concerning the environmental sustainability of the High Plains.

  13. Randomized trial of plastic bags to prevent term neonatal hypothermia in a resource-poor setting.

    PubMed

    Belsches, Theodore C; Tilly, Alyssa E; Miller, Tonya R; Kambeyanda, Rohan H; Leadford, Alicia; Manasyan, Albert; Chomba, Elwyn; Ramani, Manimaran; Ambalavanan, Namasivayam; Carlo, Waldemar A

    2013-09-01

    Term infants in resource-poor settings frequently develop hypothermia during the first hours after birth. Plastic bags or wraps are a low-cost intervention for the prevention of hypothermia in preterm and low birth weight infants that may also be effective in term infants. Our objective was to test the hypothesis that placement of term neonates in plastic bags at birth reduces hypothermia at 1 hour after birth in a resource-poor hospital. This parallel-group randomized controlled trial was conducted at University Teaching Hospital, the tertiary referral center in Zambia. Inborn neonates with both a gestational age ≥37 weeks and a birth weight ≥2500 g were randomized 1:1 to either a standard thermoregulation protocol or to a standard thermoregulation protocol with placement of the torso and lower extremities inside a plastic bag within 10 minutes after birth. The primary outcome was hypothermia (<36.5°C axillary temperature) at 1 hour after birth. Neonates randomized to plastic bag (n = 135) or to standard thermoregulation care (n = 136) had similar baseline characteristics (birth weight, gestational age, gender, and baseline temperature). Neonates in the plastic bag group had a lower rate of hypothermia (60% vs 73%, risk ratio 0.76, confidence interval 0.60-0.96, P = .026) and a higher axillary temperature (36.4 ± 0.5°C vs 36.2 ± 0.7°C, P < .001) at 1 hour after birth compared with infants receiving standard care. Placement in a plastic bag at birth reduced the incidence of hypothermia at 1 hour after birth in term neonates born in a resource-poor setting, but most neonates remained hypothermic.

  14. A Geostatistics-Informed Hierarchical Sensitivity Analysis Method for Complex Groundwater Flow and Transport Modeling

    NASA Astrophysics Data System (ADS)

    Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.

    2017-12-01

    Sensitivity analysis is an important tool for the development and improvement of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of the model parameters. In this study, we developed a new sensitivity analysis method that integrates the concept of variance-based methods with a hierarchical uncertainty quantification framework. Different uncertain inputs are grouped and organized into a multi-layer framework based on their characteristics and dependency relationships to reduce the dimensionality of the sensitivity analysis. A set of new sensitivity indices is defined for the grouped inputs using the variance decomposition method. Using this methodology, we identified the most important uncertainty sources for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that the boundary conditions and the permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional, spatially distributed input variables.
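
    The variance-based indices at the heart of this method can be demonstrated on a toy two-input model with the classic pick-freeze Monte Carlo estimator. The linear "flow model" and its inputs are purely illustrative stand-ins for the grouped boundary-condition and permeability uncertainties:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

def model(x1, x2):
    return x1 + 2.0 * x2  # toy model: Var contributions are 1 and 4, total 5

# pick-freeze estimator of first-order Sobol indices:
# S_i = Cov(Y, Y_i') / Var(Y), where Y_i' reuses only input i
x1, x2 = rng.standard_normal(N), rng.standard_normal(N)
x1b, x2b = rng.standard_normal(N), rng.standard_normal(N)
y = model(x1, x2)
s1 = np.cov(y, model(x1, x2b))[0, 1] / y.var()   # freeze x1, resample x2
s2 = np.cov(y, model(x1b, x2))[0, 1] / y.var()   # freeze x2, resample x1
print(round(s1, 2), round(s2, 2))  # analytic values: 0.2 and 0.8
```

    Grouping inputs, as the study does, amounts to freezing a whole block of inputs at once, which is what keeps the number of required model runs manageable.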

  15. Application of geostatistical simulation to compile seismotectonic provinces based on earthquake databases (case study: Iran)

    NASA Astrophysics Data System (ADS)

    Jalali, Mohammad; Ramazi, Hamidreza

    2018-04-01

    This article is devoted to the application of a simulation algorithm based on geostatistical methods to compile and update seismotectonic provinces, with Iran chosen as a case study. Traditionally, tectonic maps together with seismological data and information (e.g., earthquake catalogues, earthquake mechanisms, and microseismic data) have been used to update seismotectonic provinces. In many cases, incomplete earthquake catalogues are one of the important challenges in this procedure. To overcome this problem, a geostatistical simulation algorithm, turning band simulation (TBSIM), was applied to generate synthetic data that augment incomplete earthquake catalogues. Then, the synthetic data were added to the traditional information to study seismicity homogeneity and classify the areas according to tectonic and seismic properties to update seismotectonic provinces. In this paper, (i) different magnitude types in the studied catalogues have been homogenized to moment magnitude (Mw), and earthquake declustering was then carried out to remove aftershocks and foreshocks; (ii) a time normalization method was introduced to decrease the uncertainty in the temporal domain prior to starting the simulation procedure; (iii) variography was carried out in each subregion to study spatial regressions (e.g., the west-southwestern area showed a spatial regression from 0.4 to 1.4 decimal degrees; the maximum range was identified in the azimuth of 135 ± 10); (iv) the TBSIM algorithm was then applied, producing 68,800 synthetic events according to the spatial regression found in several directions; (v) the simulated events (i.e., magnitudes) were classified based on their intensity in ArcGIS packages and homogeneous seismic zones were determined. Finally, according to the synthetic data, tectonic features, and actual earthquake catalogues, 17 seismotectonic provinces were introduced in four major classes introduced as very high, high, moderate, and low
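
    Turning-bands simulation proper projects 1-D simulations along random lines; a compact spectral relative, summing random cosines, conveys the same idea of generating a stationary synthetic field from which events can be drawn. Grid size, correlation scale, and harmonic count below are illustrative, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(7)

def spectral_field(coords, n_harmonics=500, scale=1.0):
    """Approximate a stationary Gaussian random field as a sum of random cosines
    (a spectral cousin of turning-bands simulation, not TBSIM itself)."""
    omega = rng.normal(0.0, 1.0 / scale, size=(n_harmonics, coords.shape[1]))
    phi = rng.uniform(0, 2 * np.pi, n_harmonics)
    return np.sqrt(2.0 / n_harmonics) * np.cos(coords @ omega.T + phi).sum(axis=1)

# a synthetic field on a 2-D grid, e.g. as a basis for densifying a sparse catalogue
xy = np.column_stack([g.ravel() for g in np.meshgrid(np.linspace(0, 10, 30),
                                                     np.linspace(0, 10, 30))])
z = spectral_field(xy)
print(z.shape)
```

    Each harmonic contributes variance 2/N per cosine, so the sum approaches a unit-variance Gaussian field; thresholding or sampling such a field is one way to place synthetic events consistently with a fitted spatial structure.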

  16. Using river distance and existing hydrography data can improve the geostatistical estimation of fish tissue mercury at unsampled locations.

    PubMed

    Money, Eric S; Sackett, Dana K; Aday, D Derek; Serre, Marc L

    2011-09-15

    Mercury in fish tissue is a major human health concern. Consumption of mercury-contaminated fish poses risks to the general population, including potentially serious developmental defects and neurological damage in young children. Therefore, it is important to accurately identify areas that have the potential for high levels of bioaccumulated mercury. However, due to time and resource constraints, it is difficult to adequately assess fish tissue mercury on a basin wide scale. We hypothesized that, given the nature of fish movement along streams, an analytical approach that takes into account distance traveled along these streams would improve the estimation accuracy for fish tissue mercury in unsampled streams. Therefore, we used a river-based Bayesian Maximum Entropy framework (river-BME) for modern space/time geostatistics to estimate fish tissue mercury at unsampled locations in the Cape Fear and Lumber Basins in eastern North Carolina. We also compared the space/time geostatistical estimation using river-BME to the more traditional Euclidean-based BME approach, with and without the inclusion of a secondary variable. Results showed that this river-based approach reduced the estimation error of fish tissue mercury by more than 13% and that the median estimate of fish tissue mercury exceeded the EPA action level of 0.3 ppm in more than 90% of river miles for the study domain.
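
    The core distinction in this record, river distance versus straight-line distance, is easy to sketch with a shortest-path computation on a toy stream network. Node names, coordinates, and river-mile weights below are all hypothetical:

```python
import heapq
import math

# toy stream network: node -> [(neighbour, along-stream distance), ...]
network = {
    "A": [("B", 4.0)], "B": [("A", 4.0), ("C", 3.0), ("D", 5.0)],
    "C": [("B", 3.0)], "D": [("B", 5.0)],
}
coords = {"A": (0, 0), "B": (3, 1), "C": (5, 0), "D": (4, 4)}

def river_distance(src, dst):
    """Shortest along-stream distance via Dijkstra's algorithm."""
    dist, pq = {src: 0.0}, [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            return d
        if d > dist.get(u, math.inf):
            continue
        for v, w in network[u]:
            if d + w < dist.get(v, math.inf):
                dist[v] = d + w
                heapq.heappush(pq, (d + w, v))
    return math.inf

euclid = math.dist(coords["A"], coords["C"])
print(river_distance("A", "C"), round(euclid, 2))  # 7.0 vs 5.0
```

    A river-based covariance model (as in river-BME) uses the first distance, not the second, when weighting nearby fish-tissue observations, which is what drives the reported reduction in estimation error.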

  17. Spatial variation of dental caries in late holocene samples of Southern South America: A geostatistical study.

    PubMed

    Menéndez, Lumila Paula

    2016-11-01

    The spatial variation of dental caries in late Holocene southern South American populations will be analyzed using geostatistical methods. The existence of a continuous geographical pattern of dental caries variation will be tested. The author recorded dental caries in 400 individuals, collated this information with published caries data from 666 additional individuals, and calculated a Caries Index. The caries spatial distribution was evaluated by means of 2D maps and scatterplots. Geostatistical analyses were performed by calculating Moran's I, correlograms, and a Procrustes analysis. There is a relatively strong latitudinal continuous gradient of dental caries variation, especially in the extremes of the distribution. Moreover, the association between dental caries and geography was relatively high (m12 = 0.6). Although northern and southern samples had the highest and lowest frequencies of dental caries, respectively, the central ones had the largest variation and had lower rates of caries than expected. The large variation in frequencies of dental caries in populations located in the center of the distribution could be explained by their subsistence strategies, characterized either by the consumption of wild cariogenic plants or cultigens (obtained locally or by exchange), a reliance on fishing, or the incorporation of plants rich in starch rather than carbohydrates. It is suggested that dental caries must be considered a multifactorial disease which results from the interaction of cultural practices and environmental factors. This can change how we understand subsistence strategies as well as how we interpret dental caries rates. Am. J. Hum. Biol. 28:825-836, 2016. © 2016 Wiley Periodicals, Inc.
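
    Moran's I, the autocorrelation statistic used in this study, is compact enough to compute by hand. The five "sites" and their caries values below are hypothetical, arranged in a north-to-south gradient to mimic the latitudinal pattern the study reports:

```python
import numpy as np

def morans_i(values, W):
    """Moran's I spatial autocorrelation for values under weight matrix W."""
    z = values - values.mean()
    return (len(values) / W.sum()) * (z @ W @ z) / (z @ z)

# hypothetical Caries Index at 5 sites ordered north -> south; neighbours adjacent
caries_index = np.array([0.9, 0.7, 0.5, 0.3, 0.1])
W = np.array([[0, 1, 0, 0, 0],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
print(round(morans_i(caries_index, W), 2))  # 0.5: positive spatial autocorrelation
```

    Positive values indicate that neighbouring sites have similar caries rates, which is the signature of the continuous geographic gradient being tested; values near the expectation -1/(n-1) indicate spatial randomness.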

  18. Assessing the spatial distribution of Tuta absoluta (Lepidoptera: Gelechiidae) eggs in open-field tomato cultivation through geostatistical analysis.

    PubMed

    Martins, Júlio C; Picanço, Marcelo C; Silva, Ricardo S; Gonring, Alfredo Hr; Galdino, Tarcísio Vs; Guedes, Raul Nc

    2018-01-01

    The spatial distribution of insects is due to the interaction between individuals and the environment. Knowledge about the within-field pattern of spatial distribution of a pest is critical to planning control tactics, developing efficient sampling plans, and predicting pest damage. The leaf miner Tuta absoluta (Meyrick) (Lepidoptera: Gelechiidae) is the main pest of tomato crops in several regions of the world. Despite the importance of this pest, the pattern of spatial distribution of T. absoluta in open-field tomato cultivation remains unknown. Therefore, this study aimed to characterize the spatial distribution of T. absoluta in 22 commercial open-field tomato cultivations, with plants at three phenological development stages, by using geostatistical analysis. Geostatistical analysis revealed strong evidence for spatially dependent (aggregated) T. absoluta eggs in 19 of the 22 sampled tomato cultivations. The maps obtained demonstrated the aggregated structure of egg densities at the edges of the crops. Further, T. absoluta was found to disperse eggs along the rows more frequently than between rows. Our results indicate that the greatest egg densities of T. absoluta occur at the edges of tomato crops. These results are discussed in relation to the behavior of T. absoluta distribution within fields and in terms of their implications for improved sampling guidelines and precision targeting control methods that are essential for effective pest monitoring and management. © 2017 Society of Chemical Industry.

  19. Identification of hydraulic conductivity structure in sand and gravel aquifers: Cape Cod data set

    USGS Publications Warehouse

    Eggleston, J.R.; Rojstaczer, S.A.; Peirce, J.J.

    1996-01-01

    This study evaluates commonly used geostatistical methods to assess reproduction of hydraulic conductivity (K) structure and sensitivity under limiting amounts of data. Extensive conductivity measurements from the Cape Cod sand and gravel aquifer are used to evaluate two geostatistical estimation methods, conditional mean as an estimate and ordinary kriging, and two stochastic simulation methods, simulated annealing and sequential Gaussian simulation. Our results indicate that for relatively homogeneous sand and gravel aquifers such as the Cape Cod aquifer, neither estimation methods nor stochastic simulation methods give highly accurate point predictions of hydraulic conductivity despite the high density of collected data. Although the stochastic simulation methods yielded higher errors than the estimation methods, the stochastic simulation methods yielded better reproduction of the measured ln(K) distribution and better reproduction of local contrasts in ln(K). The inability of kriging to reproduce high ln(K) values, as reaffirmed by this study, provides a strong motivation for choosing stochastic simulation methods to generate conductivity fields when performing fine-scale contaminant transport modeling. Results also indicate that estimation error is relatively insensitive to the number of hydraulic conductivity measurements so long as more than a threshold number of data are used to condition the realizations. This threshold occurs for the Cape Cod site when there are approximately three conductivity measurements per integral volume. The lack of improvement with additional data suggests that although fine-scale hydraulic conductivity structure is evident in the variogram, it is not accurately reproduced by geostatistical estimation methods. If the Cape Cod aquifer spatial conductivity characteristics are indicative of other sand and gravel deposits, then the results on predictive error versus data collection obtained here have significant practical
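
    The smoothing behaviour behind kriging's failure to reproduce high ln(K) values can be seen in a 1-D simple-kriging sketch on synthetic data (not the Cape Cod measurements): the estimator honours the data exactly at sampled locations but averages between them, which suppresses extremes:

```python
import numpy as np

rng = np.random.default_rng(3)

def cov(h, a=2.0):
    """Exponential covariance with illustrative range parameter a."""
    return np.exp(-np.abs(h) / a)

# synthetic 1-D "ln(K)" data drawn from the covariance model itself
x_data = np.sort(rng.uniform(0, 10, 15))
C = cov(x_data[:, None] - x_data[None, :])
z_data = np.linalg.cholesky(C + 1e-9 * np.eye(15)) @ rng.standard_normal(15)

# simple-kriging estimate on a dense grid
x_grid = np.linspace(0, 10, 201)
w = np.linalg.solve(C, cov(x_grid[:, None] - x_data[None, :]).T)
z_hat = w.T @ z_data

# kriging reproduces the data exactly at sampled locations, smooths in between
exact = np.allclose(np.linalg.solve(C, C).T @ z_data, z_data)
print(exact, round(z_data.std(), 2), round(z_hat.std(), 2))
```

    Stochastic simulation (e.g. sequential Gaussian simulation) instead draws realizations that add the missing variance back, which is why it reproduces the measured ln(K) distribution better in this study.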

  20. Spatial distribution of soil organic carbon and total nitrogen based on GIS and geostatistics in a small watershed in a hilly area of northern China.

    PubMed

    Peng, Gao; Bing, Wang; Guangpo, Geng; Guangcan, Zhang

    2013-01-01

    The spatial variability of soil organic carbon (SOC) and total nitrogen (STN) levels is important in both global carbon-nitrogen cycle and climate change research. There has been little research on the spatial distribution of SOC and STN at the watershed scale based on geographic information systems (GIS) and geostatistics. Ninety-seven soil samples taken at depths of 0-20 cm were collected during October 2010 and 2011 from the Matiyu small watershed (4.2 km(2)) of a hilly area in Shandong Province, northern China. The impacts of different land use types, elevation, vegetation coverage and other factors on SOC and STN spatial distributions were examined using GIS and a geostatistical method, regression-kriging. The results show that the concentration variations of SOC and STN in the Matiyu small watershed were moderate, based on the mean, median, minimum, maximum, and coefficients of variation (CV). Residual values of SOC and STN had moderate spatial autocorrelations, and the Nugget/Sill were 0.2% and 0.1%, respectively. Distribution maps of regression-kriging revealed that both SOC and STN concentrations in the Matiyu watershed decreased from southeast to northwest. This result was similar to the watershed DEM trend and significantly correlated with land use type, elevation and aspect. SOC and STN predictions with the regression-kriging method were more accurate than those obtained using ordinary kriging. This research indicates that geostatistical characteristics of SOC and STN concentrations in the watershed were closely related to both land-use type and spatial topographic structure and that regression-kriging is suitable for investigating the spatial distributions of SOC and STN in the complex topography of the watershed.
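
    Regression-kriging, the method used here, fits a trend on covariates and then kriges the residuals. A minimal sketch with entirely synthetic data: the elevation covariate, the SOC relationship, and the covariance parameters are hypothetical, not the study's values:

```python
import numpy as np

rng = np.random.default_rng(5)

# synthetic sites: coordinates, elevation covariate, and SOC observations
xy = rng.uniform(0, 1, (60, 2))
elev = 100 + 50 * xy[:, 0]                       # hypothetical elevation (m)
soc = 20 - 0.05 * elev + rng.normal(0, 0.5, 60)  # hypothetical SOC (g/kg)

# step 1: fit the trend SOC ~ elevation by least squares
A = np.column_stack([np.ones(60), elev])
beta, *_ = np.linalg.lstsq(A, soc, rcond=None)
resid = soc - A @ beta

# step 2: simple kriging of the residuals with an exponential covariance
def cov(h, a=0.3):
    return np.exp(-h / a)

D = np.linalg.norm(xy[:, None] - xy[None, :], axis=2)
target = np.array([0.5, 0.5])
c0 = cov(np.linalg.norm(xy - target, axis=1))
w = np.linalg.solve(cov(D) + 1e-8 * np.eye(60), c0)

# step 3: prediction = trend at the target + kriged residual
elev_t = 100 + 50 * target[0]
pred = (beta[0] + beta[1] * elev_t) + w @ resid
print(round(pred, 1))
```

    Splitting prediction into trend plus residual is what lets regression-kriging exploit DEM-derived covariates (elevation, aspect), and is the reason it outperformed ordinary kriging in this study.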

  1. Spatial Distribution of Soil Organic Carbon and Total Nitrogen Based on GIS and Geostatistics in a Small Watershed in a Hilly Area of Northern China

    PubMed Central

    Peng, Gao; Bing, Wang; Guangpo, Geng; Guangcan, Zhang

    2013-01-01

    The spatial variability of soil organic carbon (SOC) and total nitrogen (STN) levels is important in both global carbon-nitrogen cycle and climate change research. There has been little research on the spatial distribution of SOC and STN at the watershed scale based on geographic information systems (GIS) and geostatistics. Ninety-seven soil samples taken at depths of 0–20 cm were collected during October 2010 and 2011 from the Matiyu small watershed (4.2 km2) of a hilly area in Shandong Province, northern China. The impacts of different land use types, elevation, vegetation coverage and other factors on SOC and STN spatial distributions were examined using GIS and a geostatistical method, regression-kriging. The results show that the concentration variations of SOC and STN in the Matiyu small watershed were moderate, based on the mean, median, minimum, maximum, and coefficients of variation (CV). Residual values of SOC and STN had moderate spatial autocorrelations, and the Nugget/Sill were 0.2% and 0.1%, respectively. Distribution maps of regression-kriging revealed that both SOC and STN concentrations in the Matiyu watershed decreased from southeast to northwest. This result was similar to the watershed DEM trend and significantly correlated with land use type, elevation and aspect. SOC and STN predictions with the regression-kriging method were more accurate than those obtained using ordinary kriging. This research indicates that geostatistical characteristics of SOC and STN concentrations in the watershed were closely related to both land-use type and spatial topographic structure and that regression-kriging is suitable for investigating the spatial distributions of SOC and STN in the complex topography of the watershed. PMID:24391791

  2. Geostatistical analysis of data on air temperature and plant phenology from Baden-Württemberg (Germany) as a basis for regional scaled models of climate change.

    PubMed

    Schröder, Winfried; Schmidt, Gunther; Hasenclever, Judith

    2006-09-01

    The rise in air temperature is an established part of global climate change, but there is still a lack of knowledge about its effects at the regional scale. This article addresses the correlation of air temperature with the phenology of selected plants, using Baden-Württemberg as an example, to provide a spatially valid database for regional climate change models. To this end, data on air temperature and plant phenology, gathered from measurement sites without congruent coverage, were correlated after geostatistical analysis and estimation. In addition, geostatistics is used to analyze and cartographically depict the spatial structure of plant phenology in spring and summer. The statistical analysis reveals a significant relationship between rising air temperature and the earlier onset of phenological phases such as blooming and fruit maturation: from 1991 to 1999, spring, as indicated by plant phenology, began up to 15 days earlier than from 1961 to 1990. As shown by the geostatistics, this holds true for the whole territory of Baden-Württemberg. The effects of the rise in air temperature should be investigated not only by monitoring biological individuals, such as plants, but at the ecosystem level as well. In Germany, environmental monitoring should be supplemented by studies of the effects of climate change on ecosystems. Because air temperature and humidity have a great influence on the temporal and spatial distribution of pathogen carriers (vectors) and pathogens, mapping of the environmental determinants of vector and pathogen distribution in space and time should be performed in order to identify hot spots for risk assessment and further detailed epidemiological studies.

  3. Interpolating Non-Parametric Distributions of Hourly Rainfall Intensities Using Random Mixing

    NASA Astrophysics Data System (ADS)

    Mosthaf, Tobias; Bárdossy, András; Hörning, Sebastian

    2015-04-01

    The correct spatial interpolation of hourly rainfall intensity distributions is of great importance for stochastic rainfall models. Poorly interpolated distributions may lead to over- or underestimation of rainfall and consequently to erroneous estimates in downstream applications such as hydrological or hydraulic models. By analyzing the spatial relation of empirical rainfall distribution functions, a persistent order of the quantile values over a wide range of non-exceedance probabilities is observed. As the order remains similar, the interpolation weights of quantile values for one particular non-exceedance probability can be applied to the other probabilities. This assumption enables the use of kernel-smoothed distribution functions for interpolation purposes. Comparing the order of hourly quantile values across gauges with the order of their daily quantile values for equal probabilities results in high correlations. The hourly quantile values also show high correlations with elevation. The incorporation of these two covariates into the interpolation is therefore tested. As only positive interpolation weights for the quantile values ensure a monotonically increasing distribution function, the use of geostatistical methods like kriging is problematic. Employing kriging with external drift to incorporate secondary information is not applicable. Nonetheless, it would be fruitful to make use of covariates. To overcome this shortcoming, a new random mixing approach for spatial random fields is applied. Within the mixing process, hourly quantile values are treated as equality constraints and correlations with elevation values are included as relationship constraints. To profit from the dependence on daily quantile values, distribution functions of daily gauges are used to set up lower-equal and greater-equal constraints at their locations. In this way the denser daily gauge network can be included in the interpolation of the hourly distribution functions. The
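
    The key observation, that one set of non-negative interpolation weights can be shared across all non-exceedance probabilities, is easy to sketch. The gauges, quantile values, and inverse-distance-style weights below are hypothetical:

```python
import numpy as np

# two neighbouring gauges: empirical hourly-intensity quantiles (mm/h, hypothetical)
p = np.array([0.5, 0.9, 0.99, 0.999])
q_g1 = np.array([0.1, 1.2, 6.0, 15.0])
q_g2 = np.array([0.2, 1.8, 8.5, 22.0])

# non-negative weights (e.g. inverse-distance) applied to ALL probabilities at once
# preserve the ordering of quantiles, so the result is a valid distribution function
w1, w2 = 0.6, 0.4
q_target = w1 * q_g1 + w2 * q_g2
print(np.all(np.diff(q_target) > 0), q_target)
```

    This is exactly why ordinary kriging is problematic here: kriging weights can be negative, which could break the monotonicity of the interpolated quantile function, motivating the random mixing approach instead.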

  4. Minimization of required model runs in the Random Mixing approach to inverse groundwater flow and transport modeling

    NASA Astrophysics Data System (ADS)

    Hoerning, Sebastian; Bardossy, Andras; du Plessis, Jaco

    2017-04-01

    Most geostatistical inverse groundwater flow and transport modelling approaches utilize a numerical solver to minimize the discrepancy between observed and simulated hydraulic heads and/or concentration values. The optimization procedure often requires many model runs, which for complex models leads to long run times. Random Mixing is a promising new geostatistical technique for inverse modelling. The method is an extension of the gradual deformation approach. It works by finding a field which preserves the covariance structure and maintains observed hydraulic conductivities. This field is perturbed by mixing it with new fields that fulfill the homogeneous conditions. This mixing is expressed as an optimization problem which aims to minimize the difference between the observed and simulated hydraulic heads and/or concentration values. To preserve the spatial structure, the mixing weights must lie on the unit hyper-sphere. We present a modification to the Random Mixing algorithm which significantly reduces the number of model runs required. The approach involves taking n equally spaced points on the unit circle as weights for mixing conditional random fields. Each of these mixtures provides a solution to the forward model at the conditioning locations. For each of the locations the solutions are then interpolated around the circle to provide solutions for additional mixing weights at very low computational cost. The interpolated solutions are used to search for a mixture which maximally reduces the objective function. This is in contrast to other approaches which evaluate the objective function for the n mixtures and then interpolate the obtained values. Keeping the mixture on the unit circle makes it easy to generate equidistant sampling points in the space; however, this means that only two fields are mixed at a time. Once the optimal mixture for two fields has been found, they are combined to form the input to the next iteration of the algorithm. This
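
    The unit-circle constraint at the heart of this modification can be demonstrated directly: mixing two independent fields with weights cos(theta) and sin(theta) satisfies cos² + sin² = 1, so the mixture keeps the original covariance structure. The 1-D exponential-covariance fields below are illustrative, not tied to any particular aquifer model:

```python
import numpy as np

rng = np.random.default_rng(11)

def gaussian_field(n=1000, a=5.0):
    """1-D stationary Gaussian field with exponential covariance (illustrative)."""
    x = np.arange(n, dtype=float)
    C = np.exp(-np.abs(x[:, None] - x[None, :]) / a)
    L = np.linalg.cholesky(C + 1e-10 * np.eye(n))
    return L @ rng.standard_normal(n)

z1, z2 = gaussian_field(), gaussian_field()

# any point (cos t, sin t) on the unit circle gives valid mixing weights
theta = 0.7
mix = np.cos(theta) * z1 + np.sin(theta) * z2
print(mix.shape)
```

    Sweeping theta around the circle then becomes a cheap one-dimensional search for the mixture that best matches the head or concentration observations, which is what the interpolation trick in this record exploits.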

  5. Geostatistical evaluation of integrated marsh management impact on mosquito vectors using before-after-control-impact (BACI) design

    PubMed Central

    Rochlin, Ilia; Iwanejko, Tom; Dempsey, Mary E; Ninivaggi, Dominick V

    2009-01-01

    Background In many parts of the world, salt marshes play a key ecological role as the interface between the marine and the terrestrial environments. Salt marshes are also exceedingly important for public health as larval habitat for mosquitoes that are vectors of disease and significant biting pests. Although grid ditching and pesticides have been effective in salt marsh mosquito control, marsh degradation and other environmental considerations compel a different approach. Targeted habitat modification and biological control methods known as Open Marsh Water Management (OMWM) had been proposed as a viable alternative to marsh-wide physical alterations and chemical control. However, traditional larval sampling techniques may not adequately assess the impacts of marsh management on mosquito larvae. To assess the effectiveness of integrated OMWM and marsh restoration techniques for mosquito control, we analyzed the results of a 5-year OMWM/marsh restoration project to determine changes in mosquito larval production using GIS and geostatistical methods. Methods The following parameters were evaluated using "Before-After-Control-Impact" (BACI) design: frequency and geographic extent of larval production, intensity of larval production, changes in larval habitat, and number of larvicide applications. The analyses were performed using Moran's I, Getis-Ord, and Spatial Scan statistics on aggregated before and after data as well as data collected over time. This allowed comparison of control and treatment areas to identify changes attributable to the OMWM/marsh restoration modifications. Results The frequency of finding mosquito larvae in the treatment areas was reduced by 70% resulting in a loss of spatial larval clusters compared to those found in the control areas. This effect was observed directly following OMWM treatment and remained significant throughout the study period. The greatly reduced frequency of finding larvae in the treatment areas led to a significant
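
The spatial cluster analyses named in this abstract rest on statistics such as global Moran's I. As a rough illustration only (not the study's code), a NumPy sketch on an invented four-site toy dataset with rook-contiguity weights:

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I; `weights` is an (n, n) spatial weight matrix
    with a zero diagonal."""
    x = np.asarray(values, dtype=float)
    z = x - x.mean()                    # deviations from the mean
    w = np.asarray(weights, dtype=float)
    return (x.size / w.sum()) * (z @ w @ z) / (z @ z)

# Toy example: 4 sampling sites on a line, neighbours share an edge.
vals = [1.0, 2.0, 3.0, 4.0]
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
I = morans_i(vals, W)   # positive: similar values cluster in space
```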

  6. Usage of multivariate geostatistics in interpolation processes for meteorological precipitation maps

    NASA Astrophysics Data System (ADS)

    Gundogdu, Ismail Bulent

    2017-01-01

    Long-term meteorological data are very important both for the evaluation of meteorological events and for the analysis of their effects on the environment. Prediction maps constructed by different interpolation techniques often provide explanatory information. Conventional techniques, such as surface spline fitting, global and local polynomial models, and inverse distance weighting, may not be adequate. Multivariate geostatistical methods can perform better, especially when secondary variables are available, because secondary variables can directly affect the precision of prediction. In this study, the mean annual and mean monthly precipitation from 1984 to 2014 for 268 meteorological stations in Turkey has been used to construct country-wide maps. In addition to linear regression, inverse square distance weighting and ordinary co-kriging (OCK) have been used and compared with each other. Elevation, slope, and aspect data for each station have been taken into account as secondary variables, whose use has reduced errors by up to a factor of three. OCK gave the smallest errors (1.002 cm) when aspect was included.
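
The claim that a secondary variable such as elevation sharpens precipitation prediction can be illustrated with a toy sketch (synthetic values, not the study's data): a primary-only baseline versus ordinary least squares on elevation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stations: precipitation driven by elevation plus noise
# (all values invented for illustration).
n = 200
elev = rng.uniform(0.0, 2500.0, n)                      # elevation, m
precip = 40.0 + 0.02 * elev + rng.normal(0.0, 5.0, n)   # cm/yr

train, test = np.arange(150), np.arange(150, n)

# Primary-only baseline: predict every test station with the train mean.
mae_base = np.abs(precip[test] - precip[train].mean()).mean()

# With the secondary variable: least-squares fit on elevation.
A = np.column_stack([np.ones(train.size), elev[train]])
coef, *_ = np.linalg.lstsq(A, precip[train], rcond=None)
mae_reg = np.abs(precip[test] - (coef[0] + coef[1] * elev[test])).mean()
# mae_reg comes out far below mae_base, mirroring the reported reduction.
```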

  7. The Applications of Model-Based Geostatistics in Helminth Epidemiology and Control

    PubMed Central

    Magalhães, Ricardo J. Soares; Clements, Archie C.A.; Patil, Anand P.; Gething, Peter W.; Brooker, Simon

    2011-01-01

    Funding agencies are dedicating substantial resources to tackle helminth infections. Reliable maps of the distribution of helminth infection can assist these efforts by targeting control resources to areas of greatest need. The ability to define the distribution of infection at regional, national and subnational levels has been enhanced greatly by the increased availability of good quality survey data and the use of model-based geostatistics (MBG), enabling spatial prediction in unsampled locations. A major advantage of MBG risk mapping approaches is that they provide a flexible statistical platform for handling and representing different sources of uncertainty, providing plausible and robust information on the spatial distribution of infections to inform the design and implementation of control programmes. Focussing on schistosomiasis and soil-transmitted helminthiasis, with additional examples for lymphatic filariasis and onchocerciasis, we review the progress made to date with the application of MBG tools in large-scale, real-world control programmes and propose a general framework for their application to inform integrative spatial planning of helminth disease control programmes. PMID:21295680

  9. Effectiveness of traditional bone setting in chronic neck pain: randomized clinical trial.

    PubMed

    Zaproudina, Nina; Hänninen, Osmo O P; Airaksinen, Olavi

    2007-01-01

    This study evaluates the effectiveness of traditional bone setting (TBS) in chronic neck pain (cNP) compared with conventional physiotherapy (PT) and massage (M). This was a randomized clinical trial. Working-aged employed subjects with cNP (n = 105; 37 men and 68 women; mean age, 41.5 years) were randomized into TBS, PT, and M groups. Follow-up times were 1, 6, and 12 months after the treatments. Neck pain intensity (visual analog scale), perceived disability (Neck Disability Index [NDI]), and neck spine mobility measurements were used as outcomes. Global assessment was evaluated by the subjects (scale from -1 to +10). Data were analyzed using time (pre and post) by group (TBS, PT, and M) 2-way analysis of variance for repeated measures. Neck pain decreased and NDI scores improved in all groups 1 month after the treatment (P < .001). Improvement in NDI scores and subject satisfaction were significantly greater after TBS. Neck spine mobility in rotation movements tended to improve more after TBS, as did the frons-knee distance. One year later, both NDI scores and neck pain were significantly better after TBS than in the reference groups. A significant improvement was reported by 40% to 45.5% of subjects in the PT and M groups and by 68.6% in the TBS group. Bone setters' ability to communicate and to interact with patients was rated significantly higher. In the TBS group, the number of sick days and the use of painkillers during the 1-year follow-up were minimal compared with the reference groups. Traditional bone setting, which is a soft manual mobilization technique focusing on the muscles, joints, and ligaments, appears to be effective in cNP. Two thirds of subjects experienced it as beneficial, and it seems able to improve disability and pain in patients with cNP. Subjective and partially objective benefits of TBS exceeded those of the other interventions, and the effects lasted at least 1 year.

  10. A unified development of several techniques for the representation of random vectors and data sets

    NASA Technical Reports Server (NTRS)

    Bundick, W. T.

    1973-01-01

    Linear vector space theory is used to develop a general representation of a set of data vectors or random vectors by linear combinations of orthonormal vectors such that the mean squared error of the representation is minimized. The orthonormal vectors are shown to be the eigenvectors of an operator. The general representation is applied to several specific problems involving the use of the Karhunen-Loeve expansion, principal component analysis, and empirical orthogonal functions; and the common properties of these representations are developed.
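
The representation described in this abstract can be sketched in NumPy: the optimal orthonormal basis consists of the eigenvectors of the sample covariance operator, and the mean squared error of a rank-k truncation equals the sum of the discarded eigenvalues. The data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 500 samples of a 5-dimensional random vector with
# one strongly correlated pair of components.
X = rng.standard_normal((500, 5))
X[:, 1] = 0.9 * X[:, 0] + 0.1 * X[:, 1]

Xc = X - X.mean(axis=0)
C = Xc.T @ Xc / (len(Xc) - 1)          # sample covariance operator

# The optimal orthonormal basis: eigenvectors of the covariance.
evals, evecs = np.linalg.eigh(C)
order = np.argsort(evals)[::-1]        # sort by decreasing variance
evals, evecs = evals[order], evecs[:, order]

k = 2
B = evecs[:, :k]                       # keep the top-k eigenvectors
X_hat = Xc @ B @ B.T                   # rank-k reconstruction

# MSE of the truncation equals the sum of the discarded eigenvalues.
mse = np.sum((Xc - X_hat) ** 2) / (len(Xc) - 1)
```

The same decomposition underlies the Karhunen-Loeve expansion, principal component analysis, and empirical orthogonal functions mentioned in the abstract; only the interpretation of the samples differs.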

  11. Geostatistical Characteristic of Space -Time Variation in Underground Water Selected Quality Parameters in Klodzko Water Intake Area (SW Part of Poland)

    NASA Astrophysics Data System (ADS)

    Namysłowska-Wilczyńska, Barbara

    2016-04-01

    This paper presents selected results of research connected with the development of a 3D geostatistical hydrogeochemical model of the Klodzko Drainage Basin, dedicated to the spatial and temporal variation in selected quality parameters of underground water in the Klodzko water intake area (SW part of Poland). The research covers the period 2011–2012. Spatial analyses of the variation in various quality parameters, i.e., contents of ammonium ion [gNH4+/m3], nitrate ion [gNO3-/m3], phosphate ion [gPO4-3/m3], total organic carbon (TOC) [gC/m3], pH, redox potential, and temperature [°C], were carried out on the basis of chemical determinations of the quality parameters of underground water samples taken from the wells in the water intake area. Spatial and temporal variation in the quality parameters was analyzed on the basis of archival data (period 1977–1999) for 22 (pump and siphon) wells with depths ranging from 9.5 to 38.0 m b.g.l., and later data (November 2011) from tests of water taken from 14 existing wells. The wells were built in the years 1954–1998. The water abstraction depth (the difference between the terrain elevation and the dynamic water table level) ranges from 276 to 286 m a.s.l., with an average of 282.05 m a.s.l. The dynamic water table level lies between 6.22 and 16.44 m b.g.l., with a mean value of 9.64 m b.g.l. The latest data (January 2012) were acquired from 3 new piezometers, with depths of 9–10 m, installed at other locations in the relevant area. Thematic databases were created containing original data on coordinates X, Y (latitude, longitude) and Z (terrain elevation and time in years) and on the regionalized variables, i.e. the underground water quality parameters in the Klodzko water intake area, determined for different analytical configurations (22 wells, 14 wells, 14 wells + 3 piezometers). Both archival data (acquired in the years 1977–1999) and the latest data (collected in 2011–2012) were analyzed.

  12. A new algorithm combining geostatistics with the surrogate data approach to increase the accuracy of comparisons of point radiation measurements with cloud measurements

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Lindau, R.; Varnai, T.; Simmer, C.

    2009-04-01

    Two main groups of statistical methods used in the Earth sciences are geostatistics and stochastic modelling. Geostatistical methods, such as various kriging algorithms, aim at estimating the mean value for every point as well as possible. In the case of sparse measurements, such fields have less variability at small scales and a narrower distribution than the true field. This can lead to biases if a nonlinear process is simulated on such a kriged field. Stochastic modelling aims at reproducing the structure of the data. One of the stochastic modelling methods, the so-called surrogate data approach, replicates the value distribution and power spectrum of a certain data set. However, while stochastic methods reproduce the statistical properties of the data, the locations of the measurements are not considered. Because radiative transfer through clouds is a highly nonlinear process, it is essential to model the distribution (e.g. of optical depth, extinction, liquid water content or liquid water path) accurately, as well as the correlations in the cloud field, because of horizontal photon transport. This explains the success of surrogate cloud fields for use in 3D radiative transfer studies. However, up to now we could only achieve good results for the radiative properties averaged over the field, but not for a radiation measurement located at a certain position. Therefore we have developed a new algorithm that combines the accuracy of stochastic (surrogate) modelling with the positioning capabilities of kriging. In this way, we can automatically profit from the large geostatistical literature and software. The algorithm is tested on cloud fields from large eddy simulations (LES). On these clouds a measurement is simulated. From the pseudo-measurement we estimated the distribution and power spectrum. Furthermore, the pseudo-measurement is kriged to a field the size of the final surrogate cloud. The distribution, spectrum and the kriged field are the inputs to the algorithm.
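
The surrogate-data idea of replicating a value distribution and power spectrum can be sketched in a few lines: one amplitude-adjustment iteration on a toy 1-D series, not the authors' full algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)

target = rng.gamma(2.0, 1.0, 256)          # "measured" value distribution
spectrum = np.abs(np.fft.rfft(target))     # power spectrum to reproduce

# One iteration of the surrogate scheme: impose the spectrum with
# random phases, then restore the exact value distribution by ranks.
phases = np.exp(1j * rng.uniform(0.0, 2 * np.pi, spectrum.size))
field = np.fft.irfft(spectrum * phases, n=target.size)
surrogate = np.sort(target)[np.argsort(np.argsort(field))]

# The surrogate contains exactly the target's values, reordered so that
# its spatial structure approximates the imposed spectrum.
```

Full surrogate algorithms (e.g. IAAFT) alternate the two adjustment steps until both constraints are met; positioning the surrogate relative to a measurement is what the kriging combination in this abstract adds.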

  13. An assessment of gas emanation hazard using a geographic information system and geostatistics.

    PubMed

    Astorri, F; Beaubien, S E; Ciotoli, G; Lombardi, S

    2002-03-01

    This paper describes the use of geostatistical analysis and GIS techniques to assess gas emanation hazards. The Mt. Vulsini volcanic district was selected for this study because of the wide range of natural phenomena locally present that affect gas migration in the near surface. In addition, soil gas samples collected in this area should allow for a calibration between the generated risk/hazard models and the measured distribution of toxic gas species at the surface. The approach used during this study consisted of three general stages. First, data were digitally organized into thematic layers; then software functions in the GIS program "ArcView" were used to compare and correlate these layers; finally, the resulting "potential-risk" map was compared with radon soil gas data in order to validate the model and/or to select zones for further, more detailed soil gas investigations.

  14. Estimation of the lower and upper bounds on the probability of failure using subset simulation and random set theory

    NASA Astrophysics Data System (ADS)

    Alvarez, Diego A.; Uribe, Felipe; Hurtado, Jorge E.

    2018-02-01

    Random set theory is a general framework which comprises uncertainty in the form of probability boxes, possibility distributions, cumulative distribution functions, Dempster-Shafer structures or intervals; in addition, the dependence between the input variables can be expressed using copulas. In this paper, the lower and upper bounds on the probability of failure are calculated by means of random set theory. In order to accelerate the calculation, a well-known and efficient probability-based reliability method known as subset simulation is employed. This method is especially useful for finding small failure probabilities in both low- and high-dimensional spaces, disjoint failure domains and nonlinear limit state functions. The proposed methodology represents a drastic reduction of the computational labor implied by plain Monte Carlo simulation for problems defined with a mixture of representations for the input variables, while delivering similar results. Numerical examples illustrate the efficiency of the proposed approach.
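
A toy sketch of the lower/upper failure-probability idea from this abstract: plain Monte Carlo over invented interval-valued inputs, without the subset simulation accelerator the paper actually uses.

```python
import numpy as np

rng = np.random.default_rng(3)

# Limit state: failure when x >= 5 (i.e. g(x) = 5 - x <= 0).
threshold = 5.0

# Random-set input: each draw is an interval [a, a + 1] instead of a
# point value, a crude stand-in for a probability box / DS structure.
a = rng.normal(4.0, 1.0, 100_000)
lo, hi = a, a + 1.0

# Lower probability (belief): the entire interval lies in the failure set.
pf_lower = np.mean(lo >= threshold)
# Upper probability (plausibility): the interval touches the failure set.
pf_upper = np.mean(hi >= threshold)
# The true failure probability of any point-valued selection is
# bracketed by [pf_lower, pf_upper].
```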

  15. Geostatistical conditional simulation for the assessment of contaminated land by abandoned heavy metal mining.

    PubMed

    Ersoy, Adem; Yunsel, Tayfun Yusuf; Atici, Umit

    2008-02-01

    Abandoned mine workings can cause varying degrees of soil contamination with heavy metals such as lead and zinc, and such contamination has occurred on a global scale. Exposure to these elements may harm human health and the environment. In this study, a total of 269 soil samples were collected at 1, 5, and 10 m regular grid intervals over a 100 x 100 m area of Carsington Pasture in the UK. A cell declustering technique was applied to the data set because the samples were not statistically representative. Directional experimental semivariograms of the elements for the transformed data showed that both geometric and zonal anisotropy exist in the data. The most evident spatial dependence structures in the directional experimental semivariograms of Pb and Zn were characterized by spherical and exponential models. This study reports the spatial distribution and uncertainty of Pb and Zn concentrations in soil at the study site using a probabilistic approach. The approach was based on geostatistical sequential Gaussian simulation (SGS), which was used to yield a series of conditional images characterized by equally probable spatial distributions of the heavy element concentrations across the area. Postprocessing of many simulations allowed the mapping of contaminated and uncontaminated areas, and provided a model of the uncertainty in the spatial distribution of element concentrations. Maps of the simulated Pb and Zn concentrations revealed the extent and severity of contamination. SGS was validated by statistics, histogram and variogram reproduction, and simulation errors. The maps of the elements may be used in remediation studies and may help decision-makers and others involved with abandoned heavy metal mining sites worldwide.
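
A bare-bones 1-D sequential Gaussian simulation sketch, just to show the visit-krige-draw loop: simple kriging with an assumed exponential covariance and no conditioning data, not the study's implementation.

```python
import numpy as np

rng = np.random.default_rng(4)

def cov(h, sill=1.0, a=10.0):
    """Exponential covariance model (assumed sill and range)."""
    return sill * np.exp(-np.abs(h) / a)

# 1-D grid; each node is visited in random order and simulated from
# the simple-kriging mean and variance given all earlier nodes.
xs = np.arange(50, dtype=float)
path = rng.permutation(len(xs))            # random visiting order
vals = np.full(len(xs), np.nan)

for idx in path:
    known = ~np.isnan(vals)
    if not known.any():
        vals[idx] = rng.standard_normal()  # first node: unconditional draw
        continue
    xk = xs[known]
    K = cov(xk[:, None] - xk[None, :])     # covariances among known nodes
    k = cov(xk - xs[idx])                  # covariances to the target node
    w = np.linalg.solve(K + 1e-10 * np.eye(len(xk)), k)
    mean = w @ vals[known]                 # simple-kriging mean (zero drift)
    var = max(cov(0.0) - w @ k, 0.0)       # simple-kriging variance
    vals[idx] = mean + np.sqrt(var) * rng.standard_normal()
```

Repeating the loop with different seeds yields the "equally probable" realizations whose postprocessing supports the contamination mapping described above.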

  16. Endurance Enhancement and High Speed Set/Reset of 50 nm Generation HfO2 Based Resistive Random Access Memory Cell by Intelligent Set/Reset Pulse Shape Optimization and Verify Scheme

    NASA Astrophysics Data System (ADS)

    Higuchi, Kazuhide; Miyaji, Kousuke; Johguchi, Koh; Takeuchi, Ken

    2012-02-01

    This paper proposes a verify-programming method for the resistive random access memory (ReRAM) cell which achieves a 50-times higher endurance and a fast set and reset compared with the conventional method. The proposed verify-programming method uses an incremental pulse width with turnback (IPWWT) for the reset and an incremental voltage with turnback (IVWT) for the set. With the combination of IPWWT reset and IVWT set, the endurance increases from 48×10³ to 2444×10³ cycles. Furthermore, the measured data retention time after 20×10³ set/reset cycles is estimated to be 10 years. Additionally, a filamentary physical model is proposed to explain the set/reset failure mechanism under various set/reset pulse shapes. The reset pulse width and set voltage correspond to the width and length of the conductive filament, respectively. Consequently, since the proposed IPWWT and IVWT recover set and reset failures of ReRAM cells, the endurance is improved.
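
The verify-programming loop can be caricatured in a few lines of Python; the pulse widths, retry limit, and cell response below are all hypothetical, invented for illustration rather than taken from the paper.

```python
# Invented sketch of an incremental-pulse-width-with-turnback (IPWWT)
# reset-verify loop; the device model is a stand-in, not real hardware.
def reset_with_verify(cell_resets, widths_ns=(10, 20, 40, 80), turnback=1):
    """Widen the reset pulse until verify passes, then step the level
    back by `turnback` to limit stress on later cycles."""
    level = 0
    attempt = 0
    for attempt in range(1, 2 * len(widths_ns) + 1):
        width = widths_ns[min(level, len(widths_ns) - 1)]
        if cell_resets(width):              # verify step
            return max(level - turnback, 0), attempt
        level += 1                          # incremental pulse width
    return None, attempt                    # verify never passed

# A hypothetical cell that needs at least a 40 ns reset pulse.
next_start_level, pulses_used = reset_with_verify(lambda w: w >= 40)
```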

  17. Channel characterization using multiple-point geostatistics, neural network, and modern analogy: A case study from a carbonate reservoir, southwest Iran

    NASA Astrophysics Data System (ADS)

    Hashemi, Seyyedhossein; Javaherian, Abdolrahim; Ataee-pour, Majid; Tahmasebi, Pejman; Khoshdel, Hossein

    2014-12-01

    In facies modeling, the ideal objective is to integrate different sources of data to generate a model that is maximally consistent with reality with respect to geological shapes and their facies architectures. Multiple-point (geo)statistics (MPS) is a tool that offers the opportunity to reach this goal by defining a training image (TI). A facies modeling workflow was conducted on a carbonate reservoir located in southwest Iran. Through a sequence stratigraphic correlation among the wells, it was revealed that the interval being modeled was deposited in a tidal flat environment. The Bahamas tidal flat environment, one of the best-studied modern carbonate tidal flats, was taken as the source of the information required for modeling a TI. In parallel, a neural network probability cube was generated, based on a set of attributes derived from the 3D seismic cube, to be applied in the MPS algorithm as soft conditioning data. Moreover, extracted channel bodies and drilled well log facies entered the modeling as hard data. The combination of these constraints resulted in a facies model highly consistent with the geological scenarios. This study showed how analogy with modern occurrences can be set as the foundation for generating a training image. Channel morphology and the facies types currently being deposited, which are crucial for modeling a training image, were inferred from modern occurrences. However, there were some practical considerations concerning the MPS algorithm used for facies simulation. The main limitation was the huge amount of RAM and CPU time needed to perform the simulations.

  18. Improving practice in community-based settings: a randomized trial of supervision – study protocol

    PubMed Central

    2013-01-01

    Background Evidence-based treatments for child mental health problems are not consistently available in public mental health settings. Expanding availability requires workforce training. However, research has demonstrated that training alone is not sufficient for changing provider behavior, suggesting that ongoing intervention-specific supervision or consultation is required. Supervision is notably under-investigated, particularly as provided in public mental health. The degree to which supervision in this setting includes ‘gold standard’ supervision elements from efficacy trials (e.g., session review, model fidelity, outcome monitoring, skill-building) is unknown. The current federally-funded investigation leverages the Washington State Trauma-focused Cognitive Behavioral Therapy Initiative to describe usual supervision practices and test the impact of systematic implementation of gold standard supervision strategies on treatment fidelity and clinical outcomes. Methods/Design The study has two phases. We will conduct an initial descriptive study (Phase I) of supervision practices within public mental health in Washington State followed by a randomized controlled trial of gold standard supervision strategies (Phase II), with randomization at the clinician level (i.e., supervisors provide both conditions). Study participants will be 35 supervisors and 130 clinicians in community mental health centers. We will enroll one child per clinician in Phase I (N = 130) and three children per clinician in Phase II (N = 390). We use a multi-level mixed within- and between-subjects longitudinal design. Audio recordings of supervision and therapy sessions will be collected and coded throughout both phases. Child outcome data will be collected at the beginning of treatment and at three and six months into treatment. Discussion This study will provide insight into how supervisors can optimally support clinicians delivering evidence-based treatments. Phase I will

  19. Randomized controlled clinical trial on the three-dimensional accuracy of fast-set impression materials.

    PubMed

    Rudolph, Heike; Quaas, Sebastian; Haim, Manuela; Preißler, Jörg; Walter, Michael H; Koch, Rainer; Luthardt, Ralph G

    2013-06-01

    The use of fast-setting impression materials with different viscosities for the one-stage impression technique demands precise working times when mixing. We examined the effect of varying working time on impression precision in a randomized clinical trial. Focusing on tooth 46, three impressions were made from each of 96 volunteers, using either a polyether (PE: Impregum Penta H/L DuoSoft Quick, 3M ESPE) or an addition-curing silicone (AS: Aquasil Ultra LV, Dentsply/DeTrey), one with the manufacturer's recommended working time (used as a reference) and two with altered working times. All stages of the impression-taking were subject to randomization. The three-dimensional precision of the non-standard working time impressions was digitally analyzed compared to the reference impression. Statistical analysis was performed using multivariate models. The mean difference in the position of the lower right first molar (vs. the reference impression) ranged from ±12 μm for PE to +19 and -14 μm for AS. Significantly higher mean values (+62 to -40 μm) were found for AS compared to PE (+21 to -26 μm) in the area of the distal adjacent tooth. Fast-set impression materials offer high precision when used for single tooth restorations as part of a one-stage impression technique, even when the working time (mixing plus application of the light- and heavy-body components) diverges significantly from the manufacturer's recommended protocol. Best accuracy was achieved with machine-mixed heavy-body/light-body polyether. Both materials examined met the clinical requirements regarding precision when the teeth were completely syringed with light material.

  20. Factors affecting paddy soil arsenic concentration in Bangladesh: prediction and uncertainty of geostatistical risk mapping.

    PubMed

    Ahmed, Zia U; Panaullah, Golam M; DeGloria, Stephen D; Duxbury, John M

    2011-12-15

    Knowledge of the spatial correlation of soil arsenic (As) concentrations with environmental variables is needed to assess the nature and extent of the risk of As contamination from irrigation water in Bangladesh. We analyzed 263 paired groundwater and paddy soil samples covering highland (HL) and medium highland-1 (MHL-1) land types for geostatistical mapping of soil As and delineation of As contaminated areas in Tala Upazilla, Satkhira district. We also collected 74 non-rice soil samples to assess the baseline concentration of soil As for this area. The mean soil As concentrations (mg/kg) for different land types under rice and non-rice crops were: rice-MHL-1 (21.2)>rice-HL (14.1)>non-rice-MHL-1 (11.9)>non-rice-HL (7.2). Multiple regression analyses showed that irrigation water As, Fe, land elevation and years of tubewell operation are the important factors affecting the concentrations of As in HL paddy soils. Only years of tubewell operation affected As concentration in the MHL-1 paddy soils. Quantitatively similar increases in soil As above the estimated baseline As concentration were observed for rice soils on HL and MHL-1 after 6-8 years of groundwater irrigation, implying strong retention of As added in irrigation water in both land types. Application of single geostatistical methods with secondary variables, such as regression kriging (RK) and ordinary co-kriging (OCK), gave little improvement in prediction of soil As over ordinary kriging (OK). Comparing single prediction methods, kriging within strata (KWS), the combination of RK for HL and OCK for MHL-1, gave more accurate soil As predictions and showed the lowest misclassification of declaring a location "contaminated" with respect to 14.8 mg As/kg, the highest value obtained for the baseline soil As concentration. Prediction of soil As buildup over time indicated that 75% of the soils cropped to rice would contain at least 30 mg As/kg by the year 2020. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. Whose data set is it anyway? Sharing raw data from randomized trials.

    PubMed

    Vickers, Andrew J

    2006-05-16

    Sharing of raw research data is common in many areas of medical research, genomics being perhaps the most well-known example. In the clinical trial community investigators routinely refuse to share raw data from a randomized trial without giving a reason. Data sharing benefits numerous research-related activities: reproducing analyses; testing secondary hypotheses; developing and evaluating novel statistical methods; teaching; aiding design of future trials; meta-analysis; and, possibly, preventing error, fraud and selective reporting. Clinical trialists, however, sometimes appear overly concerned with being scooped and with misrepresentation of their work. Both possibilities can be avoided with simple measures such as inclusion of the original trialists as co-authors on any publication resulting from data sharing. Moreover, if we treat any data set as belonging to the patients who comprise it, rather than the investigators, such concerns fall away. Technological developments, particularly the Internet, have made data sharing generally a trivial logistical problem. Data sharing should come to be seen as an inherent part of conducting a randomized trial, similar to the way in which we consider ethical review and publication of study results. Journals and funding bodies should insist that trialists make raw data available, for example, by publishing data on the Web. If the clinical trial community continues to fail with respect to data sharing, we will only strengthen the public perception that we do clinical trials to benefit ourselves, not our patients.

  2. Using geostatistical methods to estimate snow water equivalence distribution in a mountain watershed

    USGS Publications Warehouse

    Balk, B.; Elder, K.; Baron, Jill S.

    1998-01-01

    Knowledge of the spatial distribution of snow water equivalence (SWE) is necessary to adequately forecast the volume and timing of snowmelt runoff. In April 1997, peak accumulation snow depth and density measurements were independently taken in the Loch Vale watershed (6.6 km2), Rocky Mountain National Park, Colorado. Geostatistics and classical statistics were used to estimate SWE distribution across the watershed. Snow depths were spatially distributed across the watershed through kriging interpolation methods, which provide unbiased estimates with minimum variance. Snow densities were spatially modeled through regression analysis. Combining the modeled depth and density with snow-covered area (SCA) produced an estimate of the spatial distribution of SWE. The kriged estimates of snow depth explained 37-68% of the observed variance in the measured depths. Steep slopes, variably strong winds, and complex energy balance in the watershed contribute to a large degree of heterogeneity in snow depth.
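
Ordinary kriging, used above to interpolate snow depth, can be sketched compactly. The spherical variogram parameters and sample points below are invented, not taken from the Loch Vale data.

```python
import numpy as np

def spherical(h, a=300.0, c=1.0):
    """Spherical semivariogram with range a and sill c (assumed values)."""
    h = np.minimum(h, a)
    return c * (1.5 * h / a - 0.5 * (h / a) ** 3)

def ordinary_kriging(xy, z, x0, gamma):
    """BLUE of z(x0) from samples (xy, z) under semivariogram gamma."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.empty((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, :] = 1.0
    A[:, n] = 1.0
    A[n, n] = 0.0                   # Lagrange row/column (unbiasedness)
    b = np.append(gamma(np.linalg.norm(xy - x0, axis=1)), 1.0)
    w = np.linalg.solve(A, b)       # kriging weights + multiplier
    return w[:n] @ z

# Invented snow-depth samples (m) at surveyed points (coordinates in m).
pts = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])
depth = np.array([1.2, 1.6, 0.9])
est = ordinary_kriging(pts, depth, np.array([50.0, 50.0]), spherical)
```

The unit-sum constraint on the weights (enforced by the Lagrange row) is what makes the estimator unbiased regardless of the unknown mean.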

  3. The use of sequential indicator simulation to characterize geostatistical uncertainty; Yucca Mountain Site Characterization Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, K.M.

    1992-10-01

    Sequential indicator simulation (SIS) is a geostatistical technique designed to aid in the characterization of uncertainty about the structure or behavior of natural systems. This report discusses a simulation experiment designed to study the quality of uncertainty bounds generated using SIS. The results indicate that, while SIS may produce reasonable uncertainty bounds in many situations, factors like the number and location of available sample data, the quality of variogram models produced by the user, and the characteristics of the geologic region to be modeled, can all have substantial effects on the accuracy and precision of estimated confidence limits. It is recommended that users of SIS conduct validation studies for the technique on their particular regions of interest before accepting the output uncertainty bounds.

  4. A geostatistical approach to data harmonization - Application to radioactivity exposure data

    NASA Astrophysics Data System (ADS)

    Baume, O.; Skøien, J. O.; Heuvelink, G. B. M.; Pebesma, E. J.; Melles, S. J.

    2011-06-01

    Environmental issues such as air, groundwater pollution and climate change are frequently studied at spatial scales that cross boundaries between political and administrative regions. It is common for different administrations to employ different data collection methods. If these differences are not taken into account in spatial interpolation procedures then biases may appear and cause unrealistic results. The resulting maps may show misleading patterns and lead to wrong interpretations. Also, errors will propagate when these maps are used as input to environmental process models. In this paper we present and apply a geostatistical model that generalizes the universal kriging model such that it can handle heterogeneous data sources. The associated best linear unbiased estimation and prediction (BLUE and BLUP) equations are presented and it is shown that these lead to harmonized maps from which estimated biases are removed. The methodology is illustrated with an example of country bias removal in a radioactivity exposure assessment for four European countries. The application also addresses multicollinearity problems in data harmonization, which arise when both artificial bias factors and natural drifts are present and cannot easily be distinguished. Solutions for handling multicollinearity are suggested and directions for further investigations proposed.

  5. Analysis and simulation of wireless signal propagation applying geostatistical interpolation techniques

    NASA Astrophysics Data System (ADS)

    Kolyaie, S.; Yaghooti, M.; Majidi, G.

    2011-12-01

    This paper is part of ongoing research to examine the capability of geostatistical analysis for mobile network coverage prediction, simulation and tuning. Mobile network coverage predictions are used to find network coverage gaps and areas with poor serviceability. They are essential data for engineering and management in order to make better decisions regarding the rollout, planning and optimisation of mobile networks. The objective of this research is to evaluate different interpolation techniques in coverage prediction. In the method presented here, raw data collected from drive testing a sample of roads in the study area are analysed and various continuous surfaces are created using different interpolation methods. Two general interpolation methods are used in this paper with different variables: first, Inverse Distance Weighting (IDW) with various powers and numbers of neighbours and, second, ordinary kriging with Gaussian, spherical, circular and exponential semivariogram models with different numbers of neighbours. For the comparison of results, we have used check points coming from the same drive test data. Prediction values for the check points are extracted from each surface and the differences from the actual values are computed. The output of this research helps find an optimised and accurate model for coverage prediction.
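
As a rough illustration of the first family of methods evaluated, inverse distance weighting predicts a check point as a distance-weighted average of drive-test samples. This sketch assumes a fixed power and uses all samples as neighbours, which simplifies the power and neighbour-count variants compared in the paper:

```python
# Minimal IDW sketch (all-neighbour variant, power parameter as assumption).
import numpy as np

def idw(xy_known, values, xy_query, power=2.0, eps=1e-12):
    """Predict at xy_query as the inverse-distance-weighted mean of values."""
    d = np.linalg.norm(xy_known - xy_query, axis=1)
    if np.any(d < eps):              # query coincides with a sample point
        return values[np.argmin(d)]
    w = 1.0 / d**power
    return float(np.sum(w * values) / np.sum(w))

pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
sig = np.array([-70.0, -80.0, -90.0])   # e.g. measured signal strength (dBm)
print(idw(pts, sig, np.array([0.0, 0.0])))  # exact value at a sample point
```

Check-point validation then amounts to comparing `idw(...)` predictions against held-out drive-test values.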

  6. Comparison of the Predictive Performance and Interpretability of Random Forest and Linear Models on Benchmark Data Sets.

    PubMed

    Marchese Robinson, Richard L; Palczewska, Anna; Palczewski, Jan; Kidley, Nathan

    2017-08-28

    The ability to interpret the predictions made by quantitative structure-activity relationships (QSARs) offers a number of advantages. While QSARs built using nonlinear modeling approaches, such as the popular Random Forest algorithm, might sometimes be more predictive than those built using linear modeling approaches, their predictions have been perceived as difficult to interpret. However, a growing number of approaches have been proposed for interpreting nonlinear QSAR models in general and Random Forest in particular. In the current work, we compare the performance of Random Forest to those of two widely used linear modeling approaches: linear Support Vector Machines (SVMs) (or Support Vector Regression (SVR)) and partial least-squares (PLS). We compare their performance in terms of their predictivity as well as the chemical interpretability of the predictions using novel scoring schemes for assessing heat map images of substructural contributions. We critically assess different approaches for interpreting Random Forest models as well as for obtaining predictions from the forest. We assess the models on a large number of widely employed public-domain benchmark data sets corresponding to regression and binary classification problems of relevance to hit identification and toxicology. We conclude that Random Forest typically yields comparable or possibly better predictive performance than the linear modeling approaches and that its predictions may also be interpreted in a chemically and biologically meaningful way. In contrast to earlier work looking at interpretation of nonlinear QSAR models, we directly compare two methodologically distinct approaches for interpreting Random Forest models. The approaches for interpreting Random Forest assessed in our article were implemented using open-source programs that we have made available to the community. These programs are the rfFC package ( https://r-forge.r-project.org/R/?group_id=1725 ) for the R statistical

  7. Improving a spatial rainfall product using multiple-point geostatistical simulations and its effect on a national hydrological model.

    NASA Astrophysics Data System (ADS)

    Oriani, F.; Stisen, S.

    2016-12-01

    Rainfall amount is one of the most sensitive inputs to distributed hydrological models. Its spatial representation is of primary importance to correctly study the uncertainty of basin recharge and its propagation to the surface and underground circulation. We consider here the 10-km-grid rainfall product provided by the Danish Meteorological Institute as input to the National Water Resources Model of Denmark. Due to a drastic reduction in the rain gauge network in recent years (from approximately 500 stations in the period 1996-2006, to 250 in the period 2007-2014), the grid rainfall product, based on the interpolation of these data, is much less reliable. Consequently, the related hydrological model shows a significantly lower prediction power. To give a better estimation of spatial rainfall at the grid points far from ground measurements, we use the direct sampling technique (DS) [1], belonging to the family of multiple-point geostatistics. DS, already applied to rainfall and spatial variable estimation [2, 3], simulates a grid value by sampling a training data set where a similar data neighborhood occurs. In this way, complex statistical relations are preserved by generating spatial patterns similar to the ones found in the training data set. Using the reliable grid product from the period 1996-2006 as the training data set, we first test the technique by simulating part of this data set; then we apply the technique to the grid product of the period 2007-2014 and subsequently analyze the uncertainty propagation to the hydrological model. We show that DS can improve the reliability of the rainfall product by generating more realistic rainfall patterns, with a significant repercussion on the hydrological model. The reduction of rain gauge networks is a global phenomenon which has huge implications for hydrological model performance and the uncertainty assessment of water resources.
Therefore, the presented methodology can potentially be used in many regions where

  8. Efficient Graph-Based Resource Allocation Scheme Using Maximal Independent Set for Randomly-Deployed Small Star Networks

    PubMed Central

    Zhou, Jian; Wang, Lusheng; Wang, Weidong; Zhou, Qingfeng

    2017-01-01

    In future scenarios of heterogeneous and dense networks, randomly-deployed small star networks (SSNs) become a key paradigm, whose system performance is limited by inter-SSN interference and which requires an efficient resource allocation scheme for interference coordination. Traditional resource allocation schemes do not specifically focus on this paradigm and are usually too time consuming in dense networks. In this article, a very efficient graph-based scheme is proposed, which applies the maximal independent set (MIS) concept in graph theory to help divide SSNs into almost interference-free groups. We first construct an interference graph for the system based on a derived distance threshold indicating, for any pair of SSNs, whether there is intolerable inter-SSN interference. Then, SSNs are divided into MISs, and the same resource can be repetitively used by all the SSNs in each MIS. Empirical parameters and equations are set in the scheme to guarantee high performance. Finally, extensive scenarios, both dense and nondense, are randomly generated and simulated to demonstrate the performance of our scheme, indicating that it outperforms the classical max K-cut-based scheme in terms of system capacity, utility and especially time cost. Its achieved system capacity, utility and fairness can be close to the near-optimal strategy obtained by a time-consuming simulated annealing search. PMID:29113109
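
The grouping step can be illustrated by greedily extracting maximal independent sets from the interference graph; this is a generic sketch (the lowest-degree-first ordering is our assumption), not the paper's exact scheme:

```python
# Partition nodes of an interference graph (adjacency dict of sets) into
# maximal independent sets; SSNs in the same group can reuse a resource.
def mis_groups(adj):
    remaining = set(adj)
    groups = []
    while remaining:
        group = set()
        # lowest-degree-first greedy order, node id as tie-break (assumption)
        for v in sorted(remaining, key=lambda v: (len(adj[v] & remaining), v)):
            if adj[v].isdisjoint(group):
                group.add(v)
        groups.append(group)
        remaining -= group
    return groups

# Path graph 0-1-2-3: endpoints 0 and 3 are non-adjacent and share a group
adj = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(mis_groups(adj))
```

Each extracted set is maximal within the remaining nodes, so no SSN could be added to its group without creating intolerable interference.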

  9. Detection of masses in mammogram images using CNN, geostatistic functions and SVM.

    PubMed

    Sampaio, Wener Borges; Diniz, Edgar Moraes; Silva, Aristófanes Corrêa; de Paiva, Anselmo Cardoso; Gattass, Marcelo

    2011-08-01

    Breast cancer occurs with high frequency among the world's population and its effects impact the patients' perception of their own sexuality and their very personal image. This work presents a computational methodology that helps specialists detect breast masses in mammogram images. The first stage of the methodology aims to improve the mammogram image. This stage consists in removing objects outside the breast, reducing noise and highlighting the internal structures of the breast. Next, cellular neural networks are used to segment the regions that might contain masses. These regions have their shapes analyzed through shape descriptors (eccentricity, circularity, density, circular disproportion and circular density) and their textures analyzed through geostatistic functions (Ripley's K function and Moran's and Geary's indexes). Support vector machines are used to classify the candidate regions as masses or non-masses, with sensitivity of 80%, rates of 0.84 false positives per image and 0.2 false negatives per image, and an area under the ROC curve of 0.87. Copyright © 2011 Elsevier Ltd. All rights reserved.
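
Of the geostatistic texture descriptors mentioned, Moran's index is the simplest to sketch; the toy weight matrix below is an assumption for illustration, not the paper's construction over mammogram pixels:

```python
# Moran's I: positive for spatially clustered values, negative for
# alternating values, given a spatial weight matrix w (zero diagonal).
import numpy as np

def morans_i(x, w):
    x = np.asarray(x, dtype=float)
    z = x - x.mean()
    n = len(x)
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

# Four sites on a line with rook-adjacency weights
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
smooth = [1.0, 2.0, 3.0, 4.0]   # spatially clustered -> positive I
rough = [1.0, 4.0, 1.0, 4.0]    # alternating -> negative I
print(morans_i(smooth, w), morans_i(rough, w))
```

In a mass-detection pipeline, such indexes computed over candidate regions become texture features for the SVM classifier.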

  10. Soil risk assessment of As and Zn contamination in a coal mining region using geostatistics [corrected].

    PubMed

    Komnitsas, Kostas; Modis, Kostas

    2006-12-01

    The present paper aims to map As and Zn contamination and assess the risk for agricultural soils in a wider disposal site containing wastes derived from coal beneficiation. Geochemical data related to environmental studies show that the waste characteristics favor solubilisation and mobilization of inorganic contaminants and, in some cases, the generation of acidic leachates. 135 soil samples were collected from a 34 km2 area and analysed using geostatistics under the maximum entropy principle in order to produce risk assessment maps and estimate the probability of soil contamination. In addition, the present paper discusses the main issues related to risk assessment in wider mining and waste disposal sites in order to assist decision makers in selecting feasible rehabilitation schemes.

  11. A Novel Approach of Understanding and Incorporating Error of Chemical Transport Models into a Geostatistical Framework

    NASA Astrophysics Data System (ADS)

    Reyes, J.; Vizuete, W.; Serre, M. L.; Xu, Y.

    2015-12-01

    The EPA employs a vast monitoring network to measure ambient PM2.5 concentrations across the United States, with one of its goals being to quantify exposure within the population. However, there are several areas of the country with sparse monitoring, spatially and temporally. One means to fill in these monitoring gaps is to use PM2.5 modeled estimates from Chemical Transport Models (CTMs), specifically the Community Multi-scale Air Quality (CMAQ) model. CMAQ is able to provide complete spatial coverage but is subject to systematic and random error due to model uncertainty. Due to the deterministic nature of CMAQ, these uncertainties are often not quantified. Much effort is employed to quantify the efficacy of these models through different metrics of model performance. Currently, evaluation is specific only to locations with observed data. Multiyear studies across the United States are challenging because the error and model performance of CMAQ are not uniform over such large space/time domains. Error changes regionally and temporally. Because of the complex mix of species that constitute PM2.5, CMAQ error is also a function of increasing PM2.5 concentration. To address this issue we introduce a model performance evaluation for PM2.5 CMAQ that is regionalized and non-linear, leading to error quantification for each CMAQ grid cell, so that areas and time periods of error are better qualified. The regionalized error correction approach is non-linear and is therefore more flexible at characterizing model performance than approaches that rely on linearity assumptions and assume homoscedasticity of CMAQ prediction errors. Corrected CMAQ data are then incorporated into the modern geostatistical framework of Bayesian Maximum Entropy (BME). Through cross-validation it is shown that incorporating error-corrected CMAQ data leads to more accurate estimates than using observed data by themselves.

  12. Modeling dolomitized carbonate-ramp reservoirs: A case study of the Seminole San Andres unit. Part 2 -- Seismic modeling, reservoir geostatistics, and reservoir simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, F.P.; Dai, J.; Kerans, C.

    1998-11-01

    In part 1 of this paper, the authors discussed the rock-fabric/petrophysical classes for dolomitized carbonate-ramp rocks, the effects of rock fabric and pore type on petrophysical properties, petrophysical models for analyzing wireline logs, the critical scales for defining geologic framework, and 3-D geologic modeling. Part 2 focuses on geophysical and engineering characterizations, including seismic modeling, reservoir geostatistics, stochastic modeling, and reservoir simulation. Synthetic seismograms of 30 to 200 Hz were generated to study the level of seismic resolution required to capture the high-frequency geologic features in dolomitized carbonate-ramp reservoirs. Outcrop data were collected to investigate effects of sampling interval and scale-up of block size on geostatistical parameters. Semivariogram analysis of outcrop data showed that the sill of log permeability decreases and the correlation length increases with an increase of horizontal block size. Permeability models were generated using conventional linear interpolation, stochastic realizations without stratigraphic constraints, and stochastic realizations with stratigraphic constraints. Simulations of a fine-scale Lawyer Canyon outcrop model were used to study the factors affecting waterflooding performance. Simulation results show that waterflooding performance depends strongly on the geometry and stacking pattern of the rock-fabric units and on the location of production and injection wells.

  13. Mapping seabed sediments: Comparison of manual, geostatistical, object-based image analysis and machine learning approaches

    NASA Astrophysics Data System (ADS)

    Diesing, Markus; Green, Sophie L.; Stephens, David; Lark, R. Murray; Stewart, Heather A.; Dove, Dayton

    2014-08-01

    Marine spatial planning and conservation need underpinning with sufficiently detailed and accurate seabed substrate and habitat maps. Although multibeam echosounders enable us to map the seabed with high resolution and spatial accuracy, there is still a lack of fit-for-purpose seabed maps. This is due to the high costs involved in carrying out systematic seabed mapping programmes and the fact that the development of validated, repeatable, quantitative and objective methods of swath acoustic data interpretation is still in its infancy. We compared a wide spectrum of approaches, including manual interpretation, geostatistics, object-based image analysis and machine learning, to gain further insights into the accuracy and comparability of acoustic data interpretation approaches based on multibeam echosounder data (bathymetry, backscatter and derivatives) and seabed samples, with the aim of deriving seabed substrate maps. Sample data were split into a training and a validation data set to allow us to carry out an accuracy assessment. Overall thematic classification accuracy ranged from 67% to 76% and Cohen's kappa varied between 0.34 and 0.52. However, these differences were not statistically significant at the 5% level. Misclassifications were mainly associated with uncommon classes, which were rarely sampled. Map outputs were between 68% and 87% identical. To improve classification accuracy in seabed mapping, we suggest that more studies on the effects of factors affecting the classification performance, as well as comparative studies testing the performance of different approaches, need to be carried out with a view to developing guidelines for selecting an appropriate method for a given dataset. In the meantime, classification accuracy might be improved by combining different techniques into hybrid approaches and multi-method ensembles.
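
Cohen's kappa, the agreement statistic reported above, corrects raw map accuracy for the agreement expected by chance. A minimal sketch with a made-up confusion matrix (not the study's data):

```python
# Cohen's kappa from a confusion matrix of predicted vs. sampled classes.
import numpy as np

def cohens_kappa(confusion):
    c = np.asarray(confusion, dtype=float)
    n = c.sum()
    po = np.trace(c) / n                 # observed agreement
    pe = (c.sum(0) @ c.sum(1)) / n**2    # agreement expected by chance
    return (po - pe) / (1.0 - pe)

conf = np.array([[30, 5], [10, 15]])     # hypothetical two-class map check
print(round(cohens_kappa(conf), 3))
```

Kappa near 0 means the map is no better than chance, which is why it complements the raw 67-76% accuracies quoted above.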

  14. Canonical Naimark extension for generalized measurements involving sets of Pauli quantum observables chosen at random

    NASA Astrophysics Data System (ADS)

    Sparaciari, Carlo; Paris, Matteo G. A.

    2013-01-01

    We address measurement schemes where certain observables X_k are chosen at random within a set of nondegenerate isospectral observables and then measured on repeated preparations of a physical system. Each observable has a probability z_k of being measured, with ∑_k z_k = 1, and the statistics of this generalized measurement is described by a positive operator-valued measure (POVM). This kind of scheme is referred to as a quantum roulette, since each observable X_k is chosen at random, e.g., according to the fluctuating value of an external parameter. Here we focus on quantum roulettes for qubits involving the measurements of Pauli matrices, and we explicitly evaluate their canonical Naimark extensions, i.e., their implementation as indirect measurements involving an interaction scheme with a probe system. We thus provide a concrete model to realize the roulette without destroying the signal state, which can be measured again after the measurement or can be transmitted. Finally, we apply our results to the description of Stern-Gerlach-like experiments on a two-level system.
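
The roulette's POVM can be checked numerically: weighting the spectral projectors of each Pauli observable by its selection probability z_k yields effects that sum to the identity. A toy sketch (the choice of observables and z values is ours, for illustration):

```python
# Build the POVM of a qubit "quantum roulette": Pauli X or Z is measured
# with probability z_k; effects are z_k times the spectral projectors.
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def projectors(obs):
    """Spectral projectors of a 2x2 Hermitian observable."""
    vals, vecs = np.linalg.eigh(obs)
    return [np.outer(vecs[:, i], vecs[:, i].conj()) for i in range(2)]

z = [0.3, 0.7]   # P(measure X), P(measure Z); illustrative choice
effects = [z[0] * P for P in projectors(X)] + [z[1] * P for P in projectors(Z)]
# POVM completeness: the four effects sum to the identity
print(np.allclose(sum(effects), np.eye(2)))
```

The Naimark extension constructed in the paper realizes exactly this POVM as a projective measurement on a larger system-plus-probe space.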

  15. Geostatistics and Geographic Information System to Analyze the Spatial Distribution of the Diversity of Anastrepha Species (Diptera: Tephritidae): the Effect of Forest Fragments in an Urban Area.

    PubMed

    Garcia, A G; Araujo, M R; Uramoto, K; Walder, J M M; Zucchi, R A

    2017-12-08

    Fruit flies are among the most damaging insect pests of commercial fruit in Brazil. It is important to understand the landscape elements that may favor these flies. In the present study, spatial data from surveys of species of Anastrepha Schiner (Diptera: Tephritidae) in an urban area with forest fragments were analyzed, using geostatistics and Geographic Information System (GIS) to map the diversity of insects and evaluate how the forest fragments drive the spatial patterns. The results indicated a high diversity of species associated with large fragments, and a trend toward lower diversity in the more urbanized area, as the fragment sizes decreased. We concluded that the diversity of Anastrepha species is directly and positively related to large and continuous forest fragments in urbanized areas, and that combining geostatistics and GIS is a promising method for use in insect-pest management and sampling involving fruit flies. © The Authors 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  16. Framework for the mapping of the monthly average daily solar radiation using an advanced case-based reasoning and a geostatistical technique.

    PubMed

    Lee, Minhyun; Koo, Choongwan; Hong, Taehoon; Park, Hyo Seon

    2014-04-15

    For the effective photovoltaic (PV) system, it is necessary to accurately determine the monthly average daily solar radiation (MADSR) and to develop an accurate MADSR map, which can simplify the decision-making process for selecting the suitable location of the PV system installation. Therefore, this study aimed to develop a framework for the mapping of the MADSR using an advanced case-based reasoning (CBR) and a geostatistical technique. The proposed framework consists of the following procedures: (i) the geographic scope for the mapping of the MADSR is set, and the measured MADSR and meteorological data in the geographic scope are collected; (ii) using the collected data, the advanced CBR model is developed; (iii) using the advanced CBR model, the MADSR at unmeasured locations is estimated; and (iv) by applying the measured and estimated MADSR data to the geographic information system, the MADSR map is developed. A practical validation was conducted by applying the proposed framework to South Korea. It was determined that the MADSR map developed through the proposed framework has been improved in terms of accuracy. The developed MADSR map can be used for estimating the MADSR at unmeasured locations and for determining the optimal location for the PV system installation.

  17. Use of geostatistics to predict virus decay rates for determination of septic tank setback distances.

    PubMed Central

    Yates, M V; Yates, S R; Warrick, A W; Gerba, C P

    1986-01-01

    Water samples were collected from 71 public drinking-water supply wells in the Tucson, Ariz., basin. Virus decay rates in the water samples were determined with MS-2 coliphage as a model virus. The correlations between the virus decay rates and the sample locations were shown by fitting a spherical model to the experimental semivariogram. Kriging, a geostatistical technique, was used to calculate virus decay rates at unsampled locations by using the known values at nearby wells. Based on the regional characteristics of groundwater flow and the kriged estimates of virus decay rates, a contour map of the area was constructed. The map shows the variation in separation distances that would have to be maintained between wells and sources of contamination to afford similar degrees of protection from viral contamination of the drinking water in wells throughout the basin. PMID:3532954
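
The spherical semivariogram model fitted in this study has a standard closed form; a sketch with illustrative nugget, sill, and range values (not the study's fitted parameters):

```python
# Spherical semivariogram: rises from the nugget to the sill at lag = range.
import numpy as np

def spherical(h, nugget, sill, rng):
    """Semivariance at lag h; gamma(0) is returned as the nugget here
    for simplicity (the nugget is strictly a discontinuity at h -> 0)."""
    h = np.asarray(h, dtype=float)
    gamma = nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3)
    return np.where(h >= rng, sill, gamma)

lags = np.array([0.0, 500.0, 1000.0, 2000.0])   # hypothetical lags (m)
print(spherical(lags, nugget=0.1, sill=1.0, rng=1000.0))
```

Kriging then uses these modeled semivariances between wells to weight nearby decay-rate observations when predicting at unsampled locations.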

  18. Robust spatialization of soil water content at the scale of an agricultural field using geophysical and geostatistical methods

    NASA Astrophysics Data System (ADS)

    Henine, Hocine; Tournebize, Julien; Laurent, Gourdol; Christophe, Hissler; Cournede, Paul-Henry; Clement, Remi

    2017-04-01

    Research on the Critical Zone (CZ) is a prerequisite for addressing issues related to the ecosystemic services that human societies rely on (nutrient cycles, water supply and quality). However, while the upper part of the CZ (vegetation, soil, surface water) is readily accessible, knowledge of the subsurface remains limited, due to the point-scale character of conventional direct observations. While the potential for geophysical methods to overcome this limitation is recognized, the translation of the geophysical information into physical properties or states of interest remains a challenge (e.g. the translation of soil electrical resistivity into soil water content). In this study, we propose a geostatistical framework using the Bayesian Maximum Entropy (BME) approach to assimilate geophysical and point-scale data. We especially focus on the prediction of the spatial distribution of soil water content using (1) TDR point-scale measurements of soil water content, which are considered as accurate data, and (2) soil water content data derived from electrical resistivity measurements, which are uncertain but spatially dense data. We used a synthetic dataset obtained with a vertical 2D domain to evaluate the performance of this geostatistical approach. Spatio-temporal simulations of soil water content were carried out using Hydrus software for different scenarios: homogeneous or heterogeneous hydraulic conductivity distribution, and continuous or punctual infiltration pattern. From the simulations of soil water content, conceptual soil resistivity models were built using a forward modeling approach, and point sampling of vertically distributed water content values was carried out. These two datasets are similar to field measurements of soil electrical resistivity (using electrical resistivity tomography, ERT) and soil water content (using TDR probes) obtained at the Boissy-le-Chatel site, in the Orgeval catchment (east of Paris, France). We then integrated them into the spatialization

  19. Geostatistical interpolation model selection based on ArcGIS and spatio-temporal variability analysis of groundwater level in piedmont plains, northwest China.

    PubMed

    Xiao, Yong; Gu, Xiaomin; Yin, Shiyang; Shao, Jingli; Cui, Yali; Zhang, Qiulan; Niu, Yong

    2016-01-01

    Based on geostatistical theory and the ArcGIS geostatistical module, data from 30 groundwater level observation wells were used to estimate the decline of the groundwater level in the Beijing piedmont. Seven interpolation methods (inverse distance weighted interpolation, global polynomial interpolation, local polynomial interpolation, tension spline interpolation, ordinary Kriging interpolation, simple Kriging interpolation and universal Kriging interpolation) were used to interpolate the groundwater level between 2001 and 2013. Cross-validation, absolute error and the coefficient of determination (R2) were applied to evaluate the accuracy of the different methods. The results show that the simple Kriging method gave the best fit. The analysis of spatial and temporal variability suggests that the nugget effects from 2001 to 2013 were increasing, which means the spatial correlation weakened gradually under the influence of human activities. The spatial variability in the middle areas of the alluvial-proluvial fan is relatively higher than in the top and bottom areas. Owing to changes in land use, the groundwater level also varies temporally: the average decline rate between 2007 and 2013 increased compared with 2001-2006. Urban development and population growth cause over-exploitation in residential and industrial areas. The decline rate of the groundwater level in residential, industrial and river areas is relatively high, while the shrinking of farmland and the development of water-saving irrigation reduce agricultural water use, so the decline rate of the groundwater level in agricultural areas is not significant.
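
The cross-validation used to rank the interpolators can be sketched as leave-one-out prediction at each well. The two predictors below (nearest neighbour and an all-neighbour IDW) are simplified stand-ins for the seven methods compared, and the synthetic well field is our assumption:

```python
# Leave-one-out cross-validation: drop each well, predict it from the
# rest, and compare predictors by mean absolute error (MAE).
import numpy as np

def loo_mae(xy, z, predict):
    errs = []
    for i in range(len(z)):
        mask = np.arange(len(z)) != i
        errs.append(abs(predict(xy[mask], z[mask], xy[i]) - z[i]))
    return float(np.mean(errs))

def nearest(xy_k, z_k, q):
    return z_k[np.argmin(np.linalg.norm(xy_k - q, axis=1))]

def idw(xy_k, z_k, q, p=2.0):
    w = 1.0 / np.linalg.norm(xy_k - q, axis=1) ** p
    return float(np.sum(w * z_k) / np.sum(w))

gen = np.random.default_rng(0)
xy = gen.uniform(0, 10, size=(30, 2))            # synthetic well locations
z = xy[:, 0] + 0.1 * gen.normal(size=30)         # smooth synthetic field
print({"nearest": loo_mae(xy, z, nearest), "idw": loo_mae(xy, z, idw)})
```

The method with the lowest cross-validation error (simple Kriging, in the study) is then the one used to produce the final groundwater-level surfaces.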

  20. Effectiveness of Goal-Setting Telephone Follow-Up on Health Behaviors of Patients with Ischemic Stroke: A Randomized Controlled Trial.

    PubMed

    Wan, Li-Hong; Zhang, Xiao-Pei; Mo, Miao-Miao; Xiong, Xiao-Ni; Ou, Cui-Ling; You, Li-Ming; Chen, Shao-Xian; Zhang, Min

    2016-09-01

    Adopting healthy behaviors is critical for secondary stroke prevention, but many patients fail to follow national guidelines regarding diet, exercise, and abstinence from risk factors. Compliance often decreases with time after hospital discharge, yet few studies have examined programs promoting long-term adherence to health behaviors. Goal setting and telephone follow-up have been proven to be effective in other areas of medicine, so this study evaluated the effectiveness of a guideline-based, goal-setting telephone follow-up program for patients with ischemic stroke. This was a multicenter, assessor-blinded, parallel-group, randomized controlled trial. Ninety-one stroke patients were randomized to either a control group or an intervention group. The intervention consisted of predischarge education and 3 goal-setting follow-up sessions conducted by phone. Data were collected at baseline and during the third and sixth months after hospital discharge. Six months after discharge, patients in the intervention group exhibited significantly higher medication adherence than patients in the control group. There were no statistically significant differences in physical activity, nutrition, low-salt diet adherence, blood pressure monitoring, smoking abstinence, unhealthy use of alcohol, or modified Rankin Scale (mRS) scores between the 2 groups. Goal-setting telephone follow-up intervention for ischemic stroke patients is feasible and leads to improved medication adherence. However, the lack of group differences in other health behavior subcategories and in the mRS score indicates a need for more effective intervention strategies to help patients reach guideline-recommended targets. Copyright © 2016 National Stroke Association. Published by Elsevier Inc. All rights reserved.

  1. Comparing of goal setting strategy with group education method to increase physical activity level: A randomized trial.

    PubMed

    Jiryaee, Nasrin; Siadat, Zahra Dana; Zamani, Ahmadreza; Taleban, Roya

    2015-10-01

    An intervention to increase physical activity should be designed around the resources of the health care setting and be acceptable to the target group. This study was designed to assess and compare the effect of a goal-setting strategy and a group education method on increasing the physical activity of mothers of children aged 1 to 5 years. Mothers who had at least one child aged 1-5 years were randomized into two groups. The effects of 1) the goal-setting strategy and 2) the group education method on increasing physical activity were assessed and compared 1 month and 3 months after the intervention. Also, weight, height, body mass index (BMI), waist and hip circumference, and well-being were compared between the two groups before and after the intervention. Physical activity level increased significantly after the intervention in the goal-setting group and was significantly different between the two groups after the intervention (P < 0.05). BMI, waist circumference, hip circumference, and well-being score were significantly different in the goal-setting group after the intervention. In the group education method, only the well-being score improved significantly (P < 0.05). Our study demonstrated the effects of using the goal-setting strategy to boost physical activity, improve well-being, and decrease BMI and waist and hip circumference.

  2. Automated connectionist-geostatistical classification as an approach to identify sea ice and land ice types, properties and provinces

    NASA Astrophysics Data System (ADS)

    Goetz-Weiss, L. R.; Herzfeld, U. C.; Trantow, T.; Hunke, E. C.; Maslanik, J. A.; Crocker, R. I.

    2016-12-01

    An important problem in model-data comparison is the identification of parameters that can be extracted from observational data as well as used in numerical models, which are typically based on idealized physical processes. Here, we present a suite of approaches to the characterization and classification of sea ice and land ice types, properties and provinces based on several types of remote-sensing data. Applications are given not only to illustrate the approach but also to employ it in model evaluation and in understanding physical processes. (1) In a geostatistical characterization, spatial sea-ice properties in the Chukchi and Beaufort Seas and in Elson Lagoon are derived from analysis of RADARSAT and ERS-2 SAR data. (2) The analysis is taken further by utilizing multi-parameter feature vectors as inputs for unsupervised and supervised statistical classification, which facilitates classification of different sea-ice types. (3) Characteristic sea-ice parameters resulting from the classification can then be applied in model evaluation, as demonstrated for the ridging scheme of the Los Alamos sea ice model, CICE, using high-resolution altimeter and image data collected from unmanned aircraft over Fram Strait during the Characterization of Arctic Sea Ice Experiment (CASIE). The characteristic parameters chosen in this application are directly related to deformation processes, which also underlie the ridging scheme. (4) The method that is capable of the most complex classification tasks is the connectionist-geostatistical classification method. This approach has been developed to identify currently up to 18 different crevasse types in order to map the progression of the surge through the complex Bering-Bagley Glacier System, Alaska, in 2011-2014. The analysis utilizes airborne altimeter data, video image data and satellite image data. Results of the crevasse classification are compared to fracture modeling and found to match.

  3. Sequential Bayesian Geostatistical Inversion and Evaluation of Combined Data Worth for Aquifer Characterization at the Hanford 300 Area

    NASA Astrophysics Data System (ADS)

    Murakami, H.; Chen, X.; Hahn, M. S.; Over, M. W.; Rockhold, M. L.; Vermeul, V.; Hammond, G. E.; Zachara, J. M.; Rubin, Y.

    2010-12-01

Subsurface characterization for predicting groundwater flow and contaminant transport requires us to integrate large and diverse datasets in a consistent manner, and to quantify the associated uncertainty. In this study, we sequentially assimilated multiple types of datasets for characterizing a three-dimensional heterogeneous hydraulic conductivity field at the Hanford 300 Area. The datasets included constant-rate injection tests, electromagnetic borehole flowmeter tests, lithology profiles and tracer tests. We used the method of anchored distributions (MAD), which is a modular-structured Bayesian geostatistical inversion method. MAD has two major advantages over other inversion methods. First, it can directly infer a joint distribution of parameters, which can be used as an input in stochastic simulations for prediction. In MAD, in addition to typical geostatistical structural parameters, the parameter vector includes multiple point values of the heterogeneous field, called anchors, which capture local trends and reduce uncertainty in the prediction. Second, MAD allows us to integrate the datasets sequentially in a Bayesian framework such that it updates the posterior distribution as each new dataset is included. The sequential assimilation can decrease the computational burden significantly. We applied MAD to assimilate different combinations of the datasets, and then compared the inversion results. For the injection and tracer test assimilation, we calculated temporal moments of pressure build-up and breakthrough curves, respectively, to reduce the data dimension. The massively parallel flow and transport code PFLOTRAN was used for simulating the tracer test. For comparison, we used different metrics based on the breakthrough curves not used in the inversion, such as mean arrival time, peak concentration and early arrival time. This comparison is intended to yield the combined data worth, i.e., which combination of the datasets is the most effective for a certain metric, which
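The temporal-moment reduction mentioned above amounts to a couple of integrals; the breakthrough curve below is a synthetic pulse, not Hanford data:

```python
import numpy as np

def trapz(y, x):
    """Trapezoidal integral of y(x)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

# Hypothetical breakthrough curve: concentration observed at a well over time.
t = np.linspace(0.0, 100.0, 501)                 # time [h]
C = np.exp(-((t - 30.0) ** 2) / (2 * 8.0 ** 2))  # synthetic pulse

m0 = trapz(C, t)          # zeroth moment (recovered-mass proxy)
m1 = trapz(t * C, t)      # first moment
mean_arrival = m1 / m0    # normalized first moment = mean arrival time

print(round(mean_arrival, 2))  # ~30.0 for this nearly symmetric pulse
```

Reducing each curve to a few moments like this keeps the inversion's data vector small regardless of how finely the curve was sampled.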

  4. Geostatistical Methods For Determination of Roughness, Topography, And Changes of Antarctic Ice Streams From SAR And Radar Altimeter Data

    NASA Technical Reports Server (NTRS)

    Herzfeld, Ute C.

    2002-01-01

The central objective of this project has been the development of geostatistical methods for mapping elevation and ice surface characteristics from satellite radar altimeter (RA) and Synthetic Aperture Radar (SAR) data. The main results are an Atlas of elevation maps of Antarctica, from GEOSAT RA data and an Atlas from ERS-1 RA data, including a total of about 200 maps with 3 km grid resolution. Maps and digital terrain models are applied to monitor and study changes in Antarctic ice streams and glaciers, including Lambert Glacier/Amery Ice Shelf, Mertz and Ninnis Glaciers, Jutulstraumen Glacier, Fimbul Ice Shelf, Slessor Glacier, Williamson Glacier and others.

  5. Use of a goal setting intervention to increase adherence to low back pain rehabilitation: a randomized controlled trial.

    PubMed

    Coppack, Russell J; Kristensen, Jakob; Karageorghis, Costas I

    2012-11-01

To examine the effects of a goal setting intervention on self-efficacy, treatment efficacy, adherence and treatment outcome in patients undergoing low back pain rehabilitation. A mixed-model 2 (time) × 3 (group) randomized controlled trial. A residential rehabilitation centre for military personnel. UK military personnel volunteers (N = 48; mean age 32.9 years, SD 7.9) with a diagnosis of non-specific low back pain. Subjects were randomly assigned to either a goal setting experimental group (Exp, n = 16), therapist-led exercise therapy group (C1, n = 16) or non-therapist-led exercise therapy group (C2, n = 16). Treatment duration for all groups was three weeks. Self-efficacy, treatment efficacy and treatment outcome were recorded before and after the treatment period. Adherence was rated during regularly scheduled treatment sessions using the Sports Injury Rehabilitation Adherence Scale (SIRAS). The Biering-Sørensen test was used as the primary measure of treatment outcome. ANCOVA results showed that adherence scores were significantly higher in the experimental group (13.70 ± 1.58) compared with C2 (11.74 ± 1.35; P < 0.025). There was no significant difference for adherence between the experimental group and C1 (P = 0.13). Self-efficacy was significantly higher in the experimental group compared to both C1 and C2 (P < 0.05), whereas no significant difference was found for treatment efficacy. Treatment outcome did not differ significantly between the experimental and two control groups. The findings provide partial support for the use of goal setting to enhance adherence in clinical rehabilitation.

  6. Spatial epidemiology of bovine tuberculosis in Mexico.

    PubMed

    Martínez, Horacio Zendejas; Suazo, Feliciano Milián; Cuador Gil, José Quintín; Bello, Gustavo Cruz; Anaya Escalera, Ana María; Márquez, Gabriel Huitrón; Casanova, Leticia García

    2007-01-01

The purpose of this study was to use geographic information systems (GIS) and the geostatistical method of ordinary kriging to predict the prevalence and distribution of bovine tuberculosis (TB) in Jalisco, Mexico. A random sample of 2,287 herds selected from a set of 48,766 was used for the analysis. Spatial locations of herds were obtained from either a personal global positioning system (GPS), a database from the Instituto Nacional de Estadística, Geografía e Informática (INEGI) or Google Earth. Information on TB prevalence was provided by the Jalisco Commission for the Control and Eradication of Tuberculosis (COEETB). Prediction of TB was obtained using ordinary kriging in the Geostatistical Analyst module of ArcView 8. A predicted high-prevalence area of TB matching the distribution of dairy cattle was observed. This prediction was in agreement with the prevalence calculated on the total 48,766 herds. Validation was performed by taking estimated values of TB prevalence at each municipality, extracted from the kriging surface, and comparing them with the real prevalence values using a correlation test, which gave a value of 0.78, indicating that GIS and kriging are reliable tools for the estimation of TB distribution based on a random sample, resulting in significant savings of resources.
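As a rough illustration of the ordinary kriging step (a minimal numpy sketch, not the ArcView workflow used in the study), with a spherical semivariogram and invented herd locations and prevalence values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical herd locations (km) and prevalence values; illustrative only.
pts = rng.uniform(0.0, 100.0, size=(50, 2))
vals = 0.05 + 0.001 * pts[:, 0] + rng.normal(0.0, 0.005, size=50)

def spherical(h, nugget=1e-5, sill=1e-4, a=40.0):
    """Spherical semivariogram gamma(h) with range a; gamma(0) = 0."""
    h = np.asarray(h, dtype=float)
    g = nugget + sill * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h >= a, nugget + sill, np.where(h == 0.0, 0.0, g))

def ordinary_kriging(pts, vals, x0):
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical(d)       # semivariances between data points
    A[n, n] = 0.0                  # Lagrange-multiplier row/column
    b = np.ones(n + 1)
    b[:n] = spherical(np.linalg.norm(pts - x0, axis=1))
    w = np.linalg.solve(A, b)      # weights (summing to 1) plus multiplier
    return float(w[:n] @ vals), float(b @ w)  # estimate, kriging variance

est, kvar = ordinary_kriging(pts, vals, np.array([50.0, 50.0]))
print(est, kvar)
```

The kriging variance returned alongside the estimate is what makes kriging surfaces more informative than plain interpolation for validation exercises like the one described.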

  7. Closed loop ventilation mode in Intensive Care Unit: a randomized controlled clinical trial comparing the numbers of manual ventilator setting changes.

    PubMed

    Arnal, Jean-Michel; Garnero, Aude; Novotni, Dominik; Corno, Gaëlle; Donati, Stéphane-Yannis; Demory, Didier; Quintana, Gabrielle; Ducros, Laurent; Laubscher, Thomas; Durand-Gasselin, Jacques

    2018-01-01

There is equipoise regarding whether closed-loop ventilation modes reduce workload for providers: some settings are managed by the ventilator, but the automatic mode also introduces new settings for the user. This randomized controlled trial compared the number of manual ventilator setting changes between a full closed-loop ventilation and oxygenation mode (INTELLiVENT-ASV®) and conventional ventilation modes (volume assist control and pressure support) in Intensive Care Unit (ICU) patients. The secondary endpoints were to compare the number of arterial blood gas analyses, the sedation dose and the user acceptance. Sixty subjects with an expected duration of mechanical ventilation of at least 48 hours were randomized to be ventilated using INTELLiVENT-ASV® or conventional modes with a protocolized weaning. All manual ventilator setting changes were recorded continuously from inclusion to successful extubation or death. Arterial blood gases were performed upon decision of the clinician in charge. User acceptance was assessed for nurses and physicians once daily using a Likert scale. The number of manual ventilator setting changes per 24-h period per subject was lower in the INTELLiVENT-ASV® group as compared to the conventional ventilation group (5 [4-7] versus 10 [7-17] manual setting changes per subject per day; P<0.001). The number of arterial blood gas analyses and the sedation doses were not significantly different between the groups. Nurses and physicians reported that INTELLiVENT-ASV® was significantly easier to use as compared to conventional ventilation (P<0.001 for nurses and P<0.01 for physicians). For mechanically ventilated ICU patients, INTELLiVENT-ASV® significantly reduces the number of manual ventilator setting changes with the same number of arterial blood gas analyses and sedation dose, and is easier to use for the caregivers as compared to conventional ventilation modes.

  8. Quantifying aggregated uncertainty in Plasmodium falciparum malaria prevalence and populations at risk via efficient space-time geostatistical joint simulation.

    PubMed

    Gething, Peter W; Patil, Anand P; Hay, Simon I

    2010-04-01

Risk maps estimating the spatial distribution of infectious diseases are required to guide public health policy from local to global scales. The advent of model-based geostatistics (MBG) has allowed these maps to be generated in a formal statistical framework, providing robust metrics of map uncertainty that enhance their utility for decision-makers. In many settings, decision-makers require spatially aggregated measures over large regions, such as the mean prevalence within a country or administrative region, or national populations living under different levels of risk. Existing MBG mapping approaches provide suitable metrics of local uncertainty (the fidelity of predictions at each mapped pixel) but have not been adapted for measuring uncertainty over large areas, due largely to a series of fundamental computational constraints. Here the authors present a new efficient approximating algorithm that can generate for the first time the necessary joint simulation of prevalence values across the very large prediction spaces needed for global-scale mapping. This new approach is implemented in conjunction with an established model for P. falciparum, allowing robust estimates of mean prevalence at any specified level of spatial aggregation. The model is used to provide estimates of national populations at risk under three policy-relevant prevalence thresholds, along with accompanying model-based measures of uncertainty. By overcoming previously unchallenged computational barriers, this study illustrates how MBG approaches, already at the forefront of infectious disease mapping, can be extended to provide large-scale aggregate measures appropriate for decision-makers.
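Why aggregation requires joint (rather than per-pixel) simulation can be sketched as follows: whole-map realizations are drawn at once, so spatial correlation propagates into any aggregate statistic. The map, covariance and threshold below are invented for illustration, not the P. falciparum model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 1-D "map" of 200 pixels with populations and a spatially
# correlated posterior for prevalence on the logit scale.
n = 200
x = np.arange(n, dtype=float)
pop = rng.integers(100, 1000, size=n)

mean_logit = -2.0 + 0.5 * np.sin(x / 20.0)
cov = 0.3 * np.exp(-np.abs(x[:, None] - x[None, :]) / 15.0)  # exponential cov

# Joint simulation: 1000 whole-map draws via a Cholesky factor.
L = np.linalg.cholesky(cov + 1e-8 * np.eye(n))
sims = mean_logit + (L @ rng.normal(size=(n, 1000))).T       # (1000, n)
prev = 1.0 / (1.0 + np.exp(-sims))

# Aggregate per realization: population-weighted mean prevalence and
# population living above a 15% prevalence threshold.
mean_prev = (prev * pop).sum(axis=1) / pop.sum()
pop_at_risk = ((prev > 0.15) * pop).sum(axis=1)

lo, hi = np.percentile(mean_prev, [2.5, 97.5])
print(lo, hi, int(np.median(pop_at_risk)))
```

Simulating each pixel independently would understate the spread of these aggregates, since positively correlated pixels tend to be high or low together; the Cholesky approach shown here is the textbook method that the paper's approximating algorithm is designed to scale past.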

  9. Preliminary evaluation of a telephone-based smoking cessation intervention in the lung cancer screening setting: A randomized clinical trial.

    PubMed

    Taylor, Kathryn L; Hagerman, Charlotte J; Luta, George; Bellini, Paula G; Stanton, Cassandra; Abrams, David B; Kramer, Jenna A; Anderson, Eric; Regis, Shawn; McKee, Andrea; McKee, Brady; Niaura, Ray; Harper, Harry; Ramsaier, Michael

    2017-06-01

Incorporating effective smoking cessation interventions into lung cancer screening (LCS) programs will be essential to realizing the full benefit of screening. We conducted a pilot randomized trial to determine the feasibility and efficacy of a telephone-counseling (TC) smoking cessation intervention vs. usual care (UC) in the LCS setting. In collaboration with 3 geographically diverse LCS programs, we enrolled current smokers (61.5% participation rate) who were: registered to undergo LCS, 50-77 years old, and had a 20+ pack-year smoking history. Eligibility was not based on readiness to quit. Participants completed pre-LCS (T0) and post-LCS (T1) telephone assessments, were randomized to TC (N=46) vs. UC (N=46), and completed a final 3-month telephone assessment (T2). Both study arms received a list of evidence-based cessation resources. TC participants also received up to 6 brief counseling calls with a trained cessation counselor. Counseling calls incorporated motivational interviewing and utilized the screening result as a motivator for quitting. The outcome was biochemically verified 7-day point prevalence cessation at 3 months post-randomization. Participants (56.5% female) were 60.2 (SD=5.4) years old and reported 47.1 (SD=22.2) pack-years; 30% were ready to stop smoking in the next 30 days. TC participants completed an average of 4.4 (SD=2.3) sessions. Using intent-to-treat analyses, biochemically verified quit rates were 17.4% (TC) vs. 4.3% (UC), p<.05. This study provides preliminary evidence that telephone-based cessation counseling is feasible and efficacious in the LCS setting. As millions of current smokers are now eligible for lung cancer screening, this setting represents an important opportunity to exert a large public health impact on cessation among smokers who are at very high risk for multiple tobacco-related diseases. If this evidence-based, brief, and scalable intervention is replicated, TC could help to improve the overall cost

  10. Three nested randomized controlled trials of peer-only or multiple stakeholder group feedback within Delphi surveys during core outcome and information set development.

    PubMed

    Brookes, Sara T; Macefield, Rhiannon C; Williamson, Paula R; McNair, Angus G; Potter, Shelley; Blencowe, Natalie S; Strong, Sean; Blazeby, Jane M

    2016-08-17

    Methods for developing a core outcome or information set require involvement of key stakeholders to prioritise many items and achieve agreement as to the core set. The Delphi technique requires participants to rate the importance of items in sequential questionnaires (or rounds) with feedback provided in each subsequent round such that participants are able to consider the views of others. This study examines the impact of receiving feedback from different stakeholder groups, on the subsequent rating of items and the level of agreement between stakeholders. Randomized controlled trials were nested within the development of three core sets each including a Delphi process with two rounds of questionnaires, completed by patients and health professionals. Participants rated items from 1 (not essential) to 9 (absolutely essential). For round 2, participants were randomized to receive feedback from their peer stakeholder group only (peer) or both stakeholder groups separately (multiple). Decisions as to which items to retain following each round were determined by pre-specified criteria. Whilst type of feedback did not impact on the percentage of items for which a participant subsequently changed their rating, or the magnitude of change, it did impact on items retained at the end of round 2. Each core set contained discordant items retained by one feedback group but not the other (3-22 % discordant items). Consensus between patients and professionals in items to retain was greater amongst those receiving multiple group feedback in each core set (65-82 % agreement for peer-only feedback versus 74-94 % for multiple feedback). In addition, differences in round 2 scores were smaller between stakeholder groups receiving multiple feedback than between those receiving peer group feedback only. Variability in item scores across stakeholders was reduced following any feedback but this reduction was consistently greater amongst the multiple feedback group. In the development of

  11. A Permutation-Randomization Approach to Test the Spatial Distribution of Plant Diseases.

    PubMed

    Lione, G; Gonthier, P

    2016-01-01

The analysis of the spatial distribution of plant diseases requires the availability of trustworthy geostatistical methods. The mean distance tests (MDT) are here proposed as a series of permutation and randomization tests to assess the spatial distribution of plant diseases when the variable of phytopathological interest is categorical. User-friendly software to perform the tests is provided. Estimates of power and type I error, obtained with Monte Carlo simulations, showed the reliability of the MDT (power > 0.80; type I error < 0.05). A biological validation on the spatial distribution of spores of two fungal pathogens causing root rot on conifers was successfully performed by verifying the consistency between the MDT responses and previously published data. An application of the MDT was carried out to analyze the relation between plantation density and the distribution of infection by Gnomoniopsis castanea, an emerging fungal pathogen causing nut rot on sweet chestnut. Trees carrying nuts infected by the pathogen were randomly distributed in areas with different plantation densities, suggesting that the distribution of G. castanea was not related to plantation density. The MDT could be used to analyze the spatial distribution of plant diseases in both agricultural and natural ecosystems.
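The published MDT software is not reproduced here, but the underlying permutation logic can be sketched: compare the mean pairwise distance among infected trees with its distribution under random relabelling of the same coordinates (the stand layout and counts below are synthetic):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical stand: 120 tree coordinates, 30 of them labelled "infected".
pts = rng.uniform(0.0, 100.0, size=(120, 2))
infected = rng.choice(120, size=30, replace=False)

def mean_pairwise_dist(p):
    d = np.linalg.norm(p[:, None, :] - p[None, :, :], axis=-1)
    return d[np.triu_indices(len(p), k=1)].mean()

obs = mean_pairwise_dist(pts[infected])

# Permutation null: reassign the infected label at random, keeping counts.
null = np.array([
    mean_pairwise_dist(pts[rng.choice(120, size=30, replace=False)])
    for _ in range(999)
])

# One-sided p-value for clustering: a small observed mean distance means
# infected trees sit closer together than random labelling would produce.
p_low = (1 + np.sum(null <= obs)) / 1000.0
print(round(p_low, 3))
```

Because the labels here are themselves random, this run illustrates the null case; a genuinely clustered disease pattern would drive `p_low` toward small values.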

Comments on "Use of conditional simulation in nuclear waste site performance assessment" by Carol Gotway

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Downing, D.J.

    1993-10-01

This paper discusses Carol Gotway's paper, "The Use of Conditional Simulation in Nuclear Waste Site Performance Assessment." The paper centers on the use of conditional simulation and the use of geostatistical methods to simulate an entire field of values for subsequent use in a complex computer model. The issues of sampling designs for geostatistics, semivariogram estimation and anisotropy, the turning bands method for random field generation, and estimation of the cumulative distribution function are brought out.

  13. Psychosocial education improves low back pain beliefs: results from a cluster randomized clinical trial (NCT00373009) in a primary prevention setting.

    PubMed

    George, Steven Z; Teyhen, Deydre S; Wu, Samuel S; Wright, Alison C; Dugan, Jessica L; Yang, Guijun; Robinson, Michael E; Childs, John D

    2009-07-01

The general population has a pessimistic view of low back pain (LBP), and evidence-based information has been used to positively influence LBP beliefs in previously reported mass media studies. However, there is a lack of randomized trials investigating whether LBP beliefs can be modified in primary prevention settings. This cluster randomized clinical trial investigated the effect of an evidence-based psychosocial educational program (PSEP) on LBP beliefs for soldiers completing military training. A military setting was selected for this clinical trial, because LBP is a common cause of soldier disability. Companies of soldiers (n = 3,792) were recruited, and cluster randomized to receive a PSEP or no education (control group, CG). The PSEP consisted of an interactive seminar, and soldiers were issued the Back Book for reference material. The primary outcome measure was the back beliefs questionnaire (BBQ), which assesses inevitable consequences of and ability to cope with LBP. The BBQ was administered before randomization and 12 weeks later. A linear mixed model was fitted for the BBQ at the 12-week follow-up, and a generalized linear mixed model was fitted for the dichotomous outcomes on BBQ change of greater than two points. Sensitivity analyses were performed to account for dropout. BBQ scores (potential range: 9-45) improved significantly from baseline of 25.6 +/- 5.7 (mean +/- SD) to 26.9 +/- 6.2 for those receiving the PSEP, while there was a significant decline from 26.1 +/- 5.7 to 25.6 +/- 6.0 for those in the CG. The adjusted mean BBQ score at follow-up for those receiving the PSEP was 1.49 points higher than those in the CG (P < 0.0001). The adjusted odds ratio of BBQ improvement of greater than two points for those receiving the PSEP was 1.51 (95% CI = 1.22-1.86) times that of those in the CG. BBQ improvement was also mildly associated with race and college education. Sensitivity analyses suggested minimal influence of dropout. 
In conclusion, soldiers

  14. [Spatial structure analysis and distribution simulation of Therioaphis trifolii population based on geostatistics and GIS].

    PubMed

    Zhang, Rong; Leng, Yun-fa; Zhu, Meng-meng; Wang, Fang

    2007-11-01

Based on geographic information systems and geostatistics, the spatial structure of the Therioaphis trifolii population in different periods in Yuanzhou district of Guyuan City, southern Ningxia Province, was analyzed. The spatial distribution of the T. trifolii population was also simulated by ordinary kriging interpolation. The results showed that the T. trifolii population was spatially correlated in the study area in all periods. The semivariograms of T. trifolii could be described by an exponential model, indicating an aggregated spatial arrangement. The spatial variance ranged from 34.13% to 48.77%, and the range from 8.751 to 12.049 km. The degree and direction of aggregation increased gradually from southwest to northeast. The dynamic change of the T. trifolii population across periods could be analyzed intuitively on the simulated maps of the spatial distribution in both time and space, and the position and degree of T. trifolii occurrence at a given time could be determined easily.
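The semivariogram workflow behind such an analysis can be sketched as follows: compute an empirical semivariogram from georeferenced counts, then evaluate an exponential model against it. The plot locations and counts are synthetic, not the Guyuan survey data:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic survey: aphid counts at georeferenced plots (coordinates in km).
pts = rng.uniform(0.0, 30.0, size=(150, 2))
z = rng.poisson(20, size=150).astype(float)

def empirical_semivariogram(pts, z, nbins=10, max_lag=15.0):
    """Average 0.5*(z_i - z_j)^2 over point pairs binned by separation."""
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    g = 0.5 * (z[:, None] - z[None, :]) ** 2
    iu = np.triu_indices(len(z), k=1)
    d, g = d[iu], g[iu]
    edges = np.linspace(0.0, max_lag, nbins + 1)
    lags, gamma = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (d >= lo) & (d < hi)
        if m.any():
            lags.append(d[m].mean())
            gamma.append(g[m].mean())
    return np.array(lags), np.array(gamma)

def exponential_model(h, nugget, sill, a):
    """gamma(h) = nugget + sill * (1 - exp(-h/a)); effective range ~ 3a."""
    return nugget + sill * (1.0 - np.exp(-np.asarray(h) / a))

lags, gamma = empirical_semivariogram(pts, z)
print(np.round(gamma, 1))
print(np.round(exponential_model(lags, nugget=2.0, sill=18.0, a=4.0), 1))
```

In practice the nugget, sill and range parameters would be fitted (e.g. by weighted least squares) rather than guessed as here; a rising fitted curve indicates spatial correlation, while a flat one indicates pure nugget.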

  15. Spatial analysis of lettuce downy mildew using geostatistics and geographic information systems.

    PubMed

    Wu, B M; van Bruggen, A H; Subbarao, K V; Pennings, G G

    2001-02-01

The epidemiology of lettuce downy mildew has been investigated extensively in coastal California. However, the spatial patterns of the disease and the distance that Bremia lactucae spores can be transported have not been determined. During 1995 to 1998, we conducted several field- and valley-scale surveys to determine spatial patterns of this disease in the Salinas valley. Geostatistical analyses of the survey data at both scales showed that the influence range of downy mildew incidence at one location on incidence at other locations was between 80 and 3,000 m. A linear relationship was detected between semivariance and lag distance at the field scale, although no single statistical model could fit the semivariograms at the valley scale. Spatial interpolation by the inverse distance weighting method with a power of 2 resulted in plausible estimates of incidence throughout the valley. Cluster analysis in geographic information systems on the interpolated disease incidence from different dates demonstrated that the Salinas valley could be divided into two areas, north and south of Salinas City, with high and low disease pressure, respectively. Seasonal and spatial trends along the valley suggested that the distinction between the downy mildew conducive and nonconducive areas might be determined by environmental factors.
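Inverse distance weighting with a power of 2, the interpolator used for the valley-scale surface, reduces to a few lines; the sample points here are illustrative:

```python
import numpy as np

def idw(pts, vals, x0, p=2.0, eps=1e-12):
    """Inverse distance weighted estimate at x0 with power p."""
    d = np.linalg.norm(pts - x0, axis=1)
    if np.any(d < eps):            # exact hit: return the observation itself
        return float(vals[np.argmin(d)])
    w = 1.0 / d ** p
    return float(w @ vals / w.sum())

# Four observed incidence values at the corners of a unit square.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
vals = np.array([0.0, 10.0, 10.0, 20.0])
print(idw(pts, vals, np.array([0.5, 0.5])))  # ~10.0 by symmetry
```

Unlike kriging, IDW needs no fitted semivariogram, which is why it remains usable at the valley scale where, as noted above, no single model fit the semivariograms.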

  16. Geostatistics as a tool to study mite dispersion in physic nut plantations.

    PubMed

    Rosado, J F; Picanço, M C; Sarmento, R A; Pereira, R M; Pedro-Neto, M; Galdino, T V S; de Sousa Saraiva, A; Erasmo, E A L

    2015-08-01

Spatial distribution studies in pest management identify the locations where pest attacks on crops are most severe, enabling us to understand and predict the movement of such pests. Studies on the spatial distribution of mites, however, are rather scarce. The mites Polyphagotarsonemus latus and Tetranychus bastosi are the major pests affecting physic nut (Jatropha curcas) plantations. Therefore, the objective of this study was to measure the spatial distributions of P. latus and T. bastosi in physic nut plantations. Mite densities were monitored over 2 years in two different plantations. Sample locations were georeferenced. The experimental data were analyzed using geostatistical analyses. The total mite density was found to be higher when only one species was present (T. bastosi). When both mite species were found in the same plantation, their peak densities occurred at different times. These mites exhibited uniform spatial distribution when found at extreme densities (low or high), but an aggregated distribution at intermediate densities. Mite spatial distribution models were isotropic. Mite colonization commenced at the periphery of the areas under study, whereas the high-density patches extended until they reached 30 m in diameter. This has not been reported for J. curcas plants before.

  17. [Evaluation on environmental quality of heavy metals in soils and vegetables based on geostatistics and GIS].

    PubMed

    Xie, Zheng-miao; Li, Jing; Wang, Bi-ling; Chen, Jian-jun

    2006-10-01

Contents of heavy metals (Pb, Zn, Cd, Cu) in soils and vegetables from Dongguan town in Shangyu City, China, were studied using geostatistical analysis and GIS techniques to evaluate environmental quality. Based on the evaluation criteria, the distribution of the spatial variability of heavy metals in the soil-vegetable system was mapped and analyzed. The results showed that the distribution of soil heavy metals in a large number of soil samples in Dongguan town was asymmetric. The contents of Zn and Cu were lower than those of Cd and Pb. The concentration distributions of Pb, Zn, Cd and Cu in soils and vegetables differed in their spatial variability. There was a close relationship between total and available contents of heavy metals in soil. The contents of Pb and Cd in green vegetables were higher than those of Zn and Cu and exceeded the national sanitation standards for vegetables.

  18. Integrating Address Geocoding, Land Use Regression, and Spatiotemporal Geostatistical Estimation for Groundwater Tetrachloroethylene

    PubMed Central

    Messier, Kyle P.; Akita, Yasuyuki; Serre, Marc L.

    2012-01-01

Geographic Information Systems (GIS) based techniques are cost-effective and efficient methods used by state agencies and epidemiology researchers for estimating concentration and exposure. However, budget limitations have made statewide assessments of contamination difficult, especially in groundwater media. Many studies have implemented address geocoding, land use regression, and geostatistics independently, but this is the first to examine the benefits of integrating these GIS techniques to address the need for statewide exposure assessments. A novel framework for concentration exposure is introduced that integrates address geocoding, land use regression (LUR), below detect data modeling, and Bayesian Maximum Entropy (BME). A LUR model was developed for tetrachloroethylene (PCE) that accounts for point sources and flow direction. We then integrate the LUR model into the BME method as a mean trend while also modeling below detects data as a truncated Gaussian probability distribution function. We increase available PCE data 4.7 times from previously available databases through multistage geocoding. The LUR model shows significant influence of dry cleaners at short ranges. The integration of the LUR model as mean trend in BME results in a 7.5% decrease in cross validation mean square error compared to BME with a constant mean trend. PMID:22264162

  19. Geostatistical characterisation of geothermal parameters for a thermal aquifer storage site in Germany

    NASA Astrophysics Data System (ADS)

    Rodrigo-Ilarri, J.; Li, T.; Grathwohl, P.; Blum, P.; Bayer, P.

    2009-04-01

The design of geothermal systems such as aquifer thermal energy storage (ATES) systems must account for a comprehensive characterisation of all relevant parameters considered for the numerical design model. Hydraulic and thermal conductivities are the most relevant parameters, and their distributions determine not only the technical design but also the economic viability of such systems. Hence, knowledge of the spatial distribution of these parameters is essential for a successful design and operation of such systems. This work shows the first results obtained when applying geostatistical techniques to the characterisation of the Esseling Site in Germany, where a long-term thermal tracer test (> 1 year) was performed. In this open system, the spatial temperature distribution inside the aquifer was observed over time in order to obtain as much information as possible, yielding a detailed characterisation of both the relevant hydraulic and thermal parameters. This poster shows the preliminary results obtained for the Esseling Site. It has been observed that the common homogeneous approach is not sufficient to explain the observations obtained from the tracer test and that parameter heterogeneity must be taken into account.

  20. Integrating address geocoding, land use regression, and spatiotemporal geostatistical estimation for groundwater tetrachloroethylene.

    PubMed

    Messier, Kyle P; Akita, Yasuyuki; Serre, Marc L

    2012-03-06

Geographic information systems (GIS) based techniques are cost-effective and efficient methods used by state agencies and epidemiology researchers for estimating concentration and exposure. However, budget limitations have made statewide assessments of contamination difficult, especially in groundwater media. Many studies have implemented address geocoding, land use regression, and geostatistics independently, but this is the first to examine the benefits of integrating these GIS techniques to address the need for statewide exposure assessments. A novel framework for concentration exposure is introduced that integrates address geocoding, land use regression (LUR), below detect data modeling, and Bayesian Maximum Entropy (BME). A LUR model was developed for tetrachloroethylene (PCE) that accounts for point sources and flow direction. We then integrate the LUR model into the BME method as a mean trend while also modeling below detects data as a truncated Gaussian probability distribution function. We increase available PCE data 4.7 times from previously available databases through multistage geocoding. The LUR model shows significant influence of dry cleaners at short ranges. The integration of the LUR model as mean trend in BME results in a 7.5% decrease in cross validation mean square error compared to BME with a constant mean trend.

  1. Ranked set sampling: cost and optimal set size.

    PubMed

    Nahhas, Ramzi W; Wolfe, Douglas A; Chen, Haiying

    2002-12-01

    McIntyre (1952, Australian Journal of Agricultural Research 3, 385-390) introduced ranked set sampling (RSS) as a method for improving estimation of a population mean in settings where sampling and ranking of units from the population are inexpensive when compared with actual measurement of the units. Two of the major factors in the usefulness of RSS are the set size and the relative costs of the various operations of sampling, ranking, and measurement. In this article, we consider ranking error models and cost models that enable us to assess the effect of different cost structures on the optimal set size for RSS. For reasonable cost structures, we find that the optimal RSS set sizes are generally larger than had been anticipated previously. These results will provide a useful tool for determining whether RSS is likely to lead to an improvement over simple random sampling in a given setting and, if so, what RSS set size is best to use in this case.
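The basic RSS mechanics, and the variance comparison with simple random sampling that motivates the cost analysis, can be sketched under the assumption of perfect ranking (the set size, cycle count, and N(0,1) population below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)

def rss_estimate(set_size, cycles, rng):
    """One RSS estimate of a N(0,1) mean, assuming perfect ranking."""
    measured = []
    for _ in range(cycles):
        for r in range(set_size):
            # Draw a set of units, rank them (cheap), and actually "measure"
            # only the r-th order statistic (measurement is the costly step).
            ranked = np.sort(rng.normal(size=set_size))
            measured.append(ranked[r])
    return np.mean(measured)

reps, set_size, cycles = 2000, 3, 4       # 12 measured units per estimate
rss = np.array([rss_estimate(set_size, cycles, rng) for _ in range(reps)])
srs = np.array([rng.normal(size=set_size * cycles).mean() for _ in range(reps)])

print(rss.var(), srs.var())  # RSS variance should be noticeably smaller
```

Both estimators are unbiased here, but RSS spreads its measured units across order statistics, which reduces variance; imperfect ranking and the relative costs of sampling, ranking and measurement (the subject of this article) erode that advantage and determine the optimal set size.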

  2. A stochastic-geometric model of soil variation in Pleistocene patterned ground

    NASA Astrophysics Data System (ADS)

    Lark, Murray; Meerschman, Eef; Van Meirvenne, Marc

    2013-04-01

In this paper we examine the spatial variability of soil in parent material with complex spatial structure which arises from complex non-linear geomorphic processes. We show that this variability can be better modelled by a stochastic-geometric model than by a standard Gaussian random field. The benefits of the new model are seen in the reproduction of features of the target variable which influence processes like water movement and pollutant dispersal. Complex non-linear processes in the soil give rise to properties with non-Gaussian distributions. Even under a transformation to approximate marginal normality, such variables may have a more complex spatial structure than the Gaussian random field model of geostatistics can accommodate. In particular the extent to which extreme values of the variable are connected in spatially coherent regions may be misrepresented. As a result, for example, geostatistical simulation generally fails to reproduce the pathways for preferential flow in an environment where coarse infill of former fluvial channels or coarse alluvium of braided streams creates pathways for rapid movement of water. Multiple-point geostatistics has been developed to deal with this problem. Multiple-point methods proceed by sampling from a set of training images which can be assumed to reproduce the non-Gaussian behaviour of the target variable. The challenge is to identify appropriate sources of such images. In this paper we consider a mode of soil variation in which the soil varies continuously, exhibiting short-range lateral trends induced by local effects of the factors of soil formation which vary across the region of interest in an unpredictable way. The trends in soil variation are therefore only apparent locally, and the soil variation at regional scale appears random. We propose a stochastic-geometric model for this mode of soil variation called the Continuous Local Trend (CLT) model. We consider a case study of soil formed in relict patterned

  3. A geostatistical analysis of small-scale spatial variability in bacterial abundance and community structure in salt marsh creek bank sediments

    NASA Technical Reports Server (NTRS)

    Franklin, Rima B.; Blum, Linda K.; McComb, Alison C.; Mills, Aaron L.

    2002-01-01

    Small-scale variations in bacterial abundance and community structure were examined in salt marsh sediments from Virginia's eastern shore. Samples were collected at 5 cm intervals (horizontally) along a 50 cm elevation gradient, over a 215 cm horizontal transect. For each sample, bacterial abundance was determined using acridine orange direct counts and community structure was analyzed using randomly amplified polymorphic DNA fingerprinting of whole-community DNA extracts. A geostatistical analysis was used to determine the degree of spatial autocorrelation among the samples, for each variable and each direction (horizontal and vertical). The proportion of variance in bacterial abundance that could be accounted for by the spatial model was quite high (vertical: 60%, horizontal: 73%); significant autocorrelation was found among samples separated by 25 cm in the vertical direction and up to 115 cm horizontally. In contrast, most of the variability in community structure was not accounted for by simply considering the spatial separation of samples (vertical: 11%, horizontal: 22%), and must reflect variability from other parameters (e.g., variation at other spatial scales, experimental error, or environmental heterogeneity). Microbial community patch size based upon overall similarity in community structure varied between 17 cm (vertical) and 35 cm (horizontal). Overall, variability due to horizontal position (distance from the creek bank) was much smaller than that due to vertical position (elevation) for both community properties assayed. This suggests that processes more correlated with elevation (e.g., drainage and redox potential) vary at a smaller scale (therefore producing smaller patch sizes) than processes controlled by distance from the creek bank. © 2002 Federation of European Microbiological Societies. Published by Elsevier Science B.V. All rights reserved.
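    The directional autocorrelation analysis described above rests on the empirical semivariogram. As a hedged illustration (the sample coordinates, values, and lag width below are invented, not the study's data), a minimal estimator for a 1-D transect might look like:

```python
from collections import defaultdict

def empirical_semivariogram(coords, values, lag_width):
    """Average half-squared differences, binned by separation distance."""
    bins = defaultdict(list)
    n = len(coords)
    for i in range(n):
        for j in range(i + 1, n):
            h = abs(coords[i] - coords[j])            # 1-D separation
            gamma = 0.5 * (values[i] - values[j]) ** 2
            bins[round(h / lag_width)].append(gamma)
    return {k * lag_width: sum(v) / len(v) for k, v in sorted(bins.items())}

# Hypothetical 1-D transect sampled every 5 cm (mirroring the study design)
coords = [0, 5, 10, 15, 20, 25]
values = [2.0, 2.1, 2.4, 3.0, 3.1, 3.5]
sv = empirical_semivariogram(coords, values, lag_width=5)
```

Fitting a model curve (e.g., a spherical model) to these binned values is the usual next step before kriging.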

  4. Effects of gut-directed hypnotherapy on IBS in different clinical settings-results from two randomized, controlled trials.

    PubMed

    Lindfors, Perjohan; Unge, Peter; Arvidsson, Patrik; Nyhlin, Henry; Björnsson, Einar; Abrahamsson, Hasse; Simrén, Magnus

    2012-02-01

    Gut-directed hypnotherapy has been found to be effective in irritable bowel syndrome (IBS). However, randomized, controlled studies are rare and few have been performed outside highly specialized research centers. The objective of this study was to evaluate the effect of gut-directed hypnotherapy on IBS in different clinical settings outside the traditional research units. The study population included IBS patients refractory to standard management. In study 1, patients were randomized to receive gut-directed hypnotherapy (12 sessions, 1 h/week) in psychology private practices or supportive therapy, whereas in study 2 patients were randomized to receive gut-directed hypnotherapy in a small county hospital or to serve as waiting list controls. Gastrointestinal symptom severity and quality of life were evaluated at baseline, at 3 months follow-up and after 1 year. We randomized 138 IBS patients refractory to standard management, 90 in study 1 and 48 in study 2. In both studies, IBS-related symptoms were improved at 3 months in the gut-directed hypnotherapy groups (P<0.05), but not in the control groups (ns). In study 1, a significantly greater improvement of IBS-related symptom severity was detected in the gut-directed hypnotherapy group than in the control group (P<0.05), and a trend in the same direction was seen in study 2 (P=0.17). The results seen at 3 months were sustained up to 1 year. Gut-directed hypnotherapy is an effective treatment alternative for patients with refractory IBS, but its effectiveness is lower when the therapy is given outside highly specialized research centers.

  5. PRagmatic trial Of Video Education in Nursing homes: The design and rationale for a pragmatic cluster randomized trial in the nursing home setting.

    PubMed

    Mor, Vincent; Volandes, Angelo E; Gutman, Roee; Gatsonis, Constantine; Mitchell, Susan L

    2017-04-01

    Background/Aims Nursing homes are complex healthcare systems serving an increasingly sick population. Nursing homes must engage patients in advance care planning, but do so inconsistently. Video decision support tools improved advance care planning in small randomized controlled trials. Pragmatic trials are increasingly employed in health services research, although not commonly in the nursing home setting to which they are well-suited. This report presents the design and rationale for a pragmatic cluster randomized controlled trial that evaluated the "real world" application of an Advance Care Planning Video Program in two large US nursing home healthcare systems. Methods PRagmatic trial Of Video Education in Nursing homes was conducted in 360 nursing homes (N = 119 intervention/N = 241 control) owned by two healthcare systems. Over an 18-month implementation period, intervention facilities were instructed to offer the Advance Care Planning Video Program to all patients. Control facilities employed usual advance care planning practices. Patient characteristics and outcomes were ascertained from Medicare Claims, Minimum Data Set assessments, and facility electronic medical record data. Intervention adherence was measured using a Video Status Report embedded into electronic medical record systems. The primary outcome was the number of hospitalizations/person-day alive among long-stay patients with advanced dementia or cardiopulmonary disease. The rationale for the approaches to facility randomization and recruitment, intervention implementation, population selection, data acquisition, regulatory issues, and statistical analyses are discussed. Results The large number of well-characterized candidate facilities enabled several unique design features including stratification on historical hospitalization rates, randomization prior to recruitment, and 2:1 control to intervention facilities ratio. Strong endorsement from corporate leadership made randomization

  6. Spatial Analysis of Phytophthora infestans Genotypes and Late Blight Severity on Tomato and Potato in the Del Fuerte Valley Using Geostatistics and Geographic Information Systems.

    PubMed

    Jaime-Garcia, R; Orum, T V; Felix-Gastelum, R; Trinidad-Correa, R; Vanetten, H D; Nelson, M R

    2001-12-01

    Genetic structure of Phytophthora infestans, the causal agent of potato and tomato late blight, was analyzed spatially in a mixed potato and tomato production area in the Del Fuerte Valley, Sinaloa, Mexico. Isolates of P. infestans were characterized by mating type, allozyme analysis at the glucose-6-phosphate isomerase and peptidase loci, restriction fragment length polymorphism with probe RG57, metalaxyl sensitivity, and aggressiveness to tomato and potato. Spatial patterns of P. infestans genotypes were analyzed by geographic information systems and geostatistics during the 1994-95, 1995-96, and 1996-97 seasons. Spatial analysis of the genetic structure of P. infestans indicates that geographic substructuring of this pathogen occurs in this area. Maps displaying the probabilities of occurrence of mating types and genotypes of P. infestans, and of disease severity at a regional scale, are presented. Some genotypes that exhibited differences in epidemiologically important features such as metalaxyl sensitivity and aggressiveness to tomato and potato had a restricted spread and were localized in isolated areas. Analysis of late blight severity showed recurring patterns, such as the earliest onset of the disease in the area where both potato and tomato were growing, strengthening the hypothesis that infected potato tubers are the main source of primary inoculum. The information that geostatistical analysis provides might help improve management programs for late blight in the Del Fuerte Valley.

  7. True Randomness from Big Data.

    PubMed

    Papakonstantinou, Periklis A; Woodruff, David P; Yang, Guang

    2016-09-26

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.
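    The paper's extractor for big sources is far more involved than anything that fits here. For orientation only, the classic von Neumann trick illustrates the basic idea of distilling unbiased bits from a biased stream (this is not the authors' construction, and the input stream below is invented):

```python
def von_neumann_extract(bits):
    """Classic von Neumann debiasing: map pairs 01 -> 0, 10 -> 1, drop 00/11.
    Output bits are unbiased whenever input bits are i.i.d., however biased."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)   # for pair (1,0) emit 1; for (0,1) emit 0
    return out

# A short biased sample stream (illustrative only)
stream = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1]
extracted = von_neumann_extract(stream)
```

Note the cost: a heavily biased source yields very few output bits, which is one reason practical extractors for massive data sets need stronger constructions.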

  8. True Randomness from Big Data

    NASA Astrophysics Data System (ADS)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-09-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.

  9. True Randomness from Big Data

    PubMed Central

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-01-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests. PMID:27666514

  10. Integrating geophysical data for mapping the contamination of industrial sites by polycyclic aromatic hydrocarbons: A geostatistical approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colin, P.; Nicoletis, S.; Froidevaux, R.

    1996-12-31

    A case study is presented of building a map showing the probability that the concentration of polycyclic aromatic hydrocarbons (PAH) exceeds a critical threshold. This assessment is based on existing PAH sample data (direct information) and on an electrical resistivity survey (indirect information). Simulated annealing is used to build a model of the range of possible values for PAH concentrations and of the bivariate relationship between PAH concentrations and electrical resistivity. The geostatistical technique of simple indicator kriging is then used, together with the probabilistic model, to infer, at each node of a grid, the range of possible values which the PAH concentration can take. The risk map is then extracted from this characterization of the local uncertainty. The difference between this risk map and a traditional iso-concentration map is then discussed in terms of decision-making.
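    Indicator kriging starts from an indicator coding of the data against the critical threshold. A minimal sketch of that first step (the concentrations and threshold below are hypothetical; a real probability map comes from kriging the indicators, not from this naive frequency):

```python
def indicator_transform(concentrations, threshold):
    """Indicator coding for exceedance mapping: 1 if value exceeds threshold."""
    return [1 if c > threshold else 0 for c in concentrations]

# Hypothetical PAH concentrations (mg/kg) and a regulatory threshold
pah = [12.0, 55.0, 8.0, 130.0, 40.0]
ind = indicator_transform(pah, threshold=50.0)
prob_exceed = sum(ind) / len(ind)   # naive global frequency, pre-kriging
```

Kriging these 0/1 values at each grid node yields a local estimate of the exceedance probability, which is what the risk map displays.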

  11. Bayesian Geostatistical Modeling of Malaria Indicator Survey Data in Angola

    PubMed Central

    Gosoniu, Laura; Veta, Andre Mia; Vounatsou, Penelope

    2010-01-01

    The 2006–2007 Angola Malaria Indicator Survey (AMIS) is the first nationally representative household survey in the country assessing coverage of the key malaria control interventions and measuring malaria-related burden among children under 5 years of age. In this paper, the Angolan MIS data were analyzed to produce the first smooth map of parasitaemia prevalence based on contemporary nationwide empirical data in the country. Bayesian geostatistical models were fitted to assess the effect of interventions after adjusting for environmental, climatic and socio-economic factors. Non-linear relationships between parasitaemia risk and environmental predictors were modeled by categorizing the covariates and by employing two non-parametric approaches, the B-splines and the P-splines. The results of the model validation showed that the categorical model was better able to capture the relationship between parasitaemia prevalence and the environmental factors. Model fit and prediction were handled within a Bayesian framework using Markov chain Monte Carlo (MCMC) simulations. Combining estimates of parasitaemia prevalence with the number of children under 5 years of age, we obtained estimates of the number of infected children in the country. The population-adjusted prevalence ranges from in Namibe province to in Malanje province. The odds of parasitaemia in children living in a household with at least ITNs per person was 41% lower (CI: 14%, 60%) than in those with fewer ITNs. The estimates of the number of parasitaemic children produced in this paper are important for planning and implementing malaria control interventions and for monitoring the impact of prevention and control activities. PMID:20351775

  12. Patch-based iterative conditional geostatistical simulation using graph cuts

    NASA Astrophysics Data System (ADS)

    Li, Xue; Mariethoz, Gregoire; Lu, DeTang; Linde, Niklas

    2016-08-01

    Training image-based geostatistical methods are increasingly popular in groundwater hydrology even if existing algorithms present limitations that often make real-world applications difficult. These limitations include a computational cost that can be prohibitive for high-resolution 3-D applications, the presence of visual artifacts in the model realizations, and a low variability between model realizations due to the limited pool of patterns available in a finite-size training image. In this paper, we address these issues by proposing an iterative patch-based algorithm which adapts a graph cuts methodology that is widely used in computer graphics. Our adapted graph cuts method optimally cuts patches of pixel values borrowed from the training image and assembles them successively, each time accounting for the information of previously stitched patches. The initial simulation result might display artifacts, which are identified as regions of high cost. These artifacts are reduced by iteratively placing new patches in high-cost regions. In contrast to most patch-based algorithms, the proposed scheme can also efficiently address point conditioning. An advantage of the method is that the cut process results in the creation of new patterns that are not present in the training image, thereby increasing pattern variability. To quantify this effect, a new measure, the merging index, is developed to quantify the pattern variability in the realizations with respect to the training image. A series of sensitivity analyses demonstrates the stability of the proposed graph cuts approach, which produces satisfying simulations for a wide range of parameter values. Applications to 2-D and 3-D cases are compared to state-of-the-art multiple-point methods. The results show that the proposed approach obtains significant speedups and increases variability between realizations. Connectivity functions applied to 2-D models and transport simulations in 3-D models are used to

  13. Taking a statistical approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wild, M.; Rouhani, S.

    1995-02-01

    A typical site investigation entails extensive sampling and monitoring. In the past, sampling plans have been designed on purely ad hoc bases, leading to significant expenditures and, in some cases, collection of redundant information. In many instances, sampling costs exceed the true worth of the collected data. The US Environmental Protection Agency (EPA) therefore has advocated the use of geostatistics to provide a logical framework for sampling and analysis of environmental data. Geostatistical methodology uses statistical techniques for the spatial analysis of a variety of earth-related data. The use of geostatistics was developed by the mining industry to estimate ore concentrations. The same procedure is effective in quantifying environmental contaminants in soils for risk assessments. Unlike classical statistical techniques, geostatistics offers procedures to incorporate the underlying spatial structure of the investigated field. Sample points spaced close together tend to be more similar than samples spaced further apart. This can guide sampling strategies and help determine complex contaminant distributions. Geostatistical techniques can be used to evaluate site conditions on the basis of regular, irregular, random and even spatially biased samples. In most environmental investigations, it is desirable to concentrate sampling in areas of known or suspected contamination. The rigorous mathematical procedures of geostatistics allow for accurate estimates at unsampled locations, potentially reducing sampling requirements. The use of geostatistics serves as a decision-aiding and planning tool and can significantly reduce short-term site assessment costs and long-term sampling and monitoring needs, as well as lead to more accurate and realistic remedial design criteria.
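    The spatial structure that geostatistics exploits is usually summarized by a variogram model such as the spherical model mentioned throughout these records. A hedged sketch of its closed form (the nugget, sill, and range values below are invented for illustration):

```python
def spherical_variogram(h, nugget, sill, rng):
    """Spherical model: rises as 1.5(h/a) - 0.5(h/a)^3, flat beyond range a."""
    if h <= 0:
        return 0.0                      # zero at the origin by convention
    if h >= rng:
        return nugget + sill            # total sill beyond the range
    r = h / rng
    return nugget + sill * (1.5 * r - 0.5 * r ** 3)

# Semivariance at a few separation distances (metres, hypothetical parameters)
gammas = [spherical_variogram(h, nugget=0.1, sill=0.9, rng=100.0)
          for h in (0, 50, 100, 150)]
```

Samples closer together than the range are spatially correlated; beyond it, the semivariance plateaus and samples carry no mutual information, which is exactly what drives sampling-density decisions.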

  14. Random Dynamics

    NASA Astrophysics Data System (ADS)

    Bennett, D. L.; Brene, N.; Nielsen, H. B.

    1987-01-01

    The goal of random dynamics is the derivation of the laws of Nature as we know them (standard model) from inessential assumptions. The inessential assumptions made here are expressed as sets of general models at extremely high energies: gauge glass and spacetime foam. Both sets of models lead tentatively to the standard model.

  15. Forward modeling of gravity data using geostatistically generated subsurface density variations

    USGS Publications Warehouse

    Phelps, Geoffrey

    2016-01-01

    Using geostatistical models of density variations in the subsurface, constrained by geologic data, forward models of gravity anomalies can be generated by discretizing the subsurface and calculating the cumulative effect of each cell (pixel). The results of such stochastically generated forward gravity anomalies can be compared with the observed gravity anomalies to find density models that match the observed data. These models have an advantage over forward gravity anomalies generated using polygonal bodies of homogeneous density because generating numerous realizations explores a larger region of the solution space. The stochastic modeling can be thought of as dividing the forward model into two components: that due to the shape of each geologic unit and that due to the heterogeneous distribution of density within each geologic unit. The modeling demonstrates that the internally heterogeneous distribution of density within each geologic unit can contribute significantly to the resulting calculated forward gravity anomaly. Furthermore, the stochastic models match observed statistical properties of geologic units, the solution space is more broadly explored by producing a suite of successful models, and the likelihood of a particular conceptual geologic model can be compared. The Vaca Fault near Travis Air Force Base, California, can be successfully modeled as a normal or strike-slip fault, with the normal fault model being slightly more probable. It can also be modeled as a reverse fault, although this structural geologic configuration is highly unlikely given the realizations we explored.
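    The discretized forward calculation can be sketched by treating each density cell as a point mass and summing vertical attractions at an observation point. This is a crude approximation for illustration only (the USGS implementation uses proper prism formulas, and the geometry and densities below are invented):

```python
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def forward_gravity(cells, obs):
    """Vertical gravity at obs=(x, z) from point-mass cells.
    Each cell is (x, z, volume_m3, density_kg_m3); z is positive downward."""
    gz = 0.0
    for x, z, vol, rho in cells:
        dx, dz = x - obs[0], z - obs[1]
        r = (dx * dx + dz * dz) ** 0.5
        gz += G * rho * vol * dz / r ** 3    # vertical component only
    return gz

# Two equal-volume cells at 50 m depth; the denser one dominates the anomaly
cells = [(-10.0, 50.0, 1000.0, 2700.0), (10.0, 50.0, 1000.0, 2400.0)]
gz = forward_gravity(cells, obs=(0.0, 0.0))
```

In the stochastic workflow, the densities assigned to the cells come from geostatistical realizations, and the summation is repeated for each realization to build a suite of candidate anomalies.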

  16. Assessing TCE source bioremediation by geostatistical analysis of a flux fence.

    PubMed

    Cai, Zuansi; Wilson, Ryan D; Lerner, David N

    2012-01-01

    Mass discharge across transect planes is increasingly used as a metric for performance assessment of in situ groundwater remediation systems. Mass discharge estimates using concentrations measured in multilevel transects are often made by assuming a uniform flow field, and uncertainty contributions from spatial concentration and flow field variability are often overlooked. We extend our recently developed geostatistical approach to estimate mass discharge using transect data of concentration and hydraulic conductivity, thus accounting for the spatial variability of both datasets. The magnitude and uncertainty of mass discharge were quantified by conditional simulation. An important benefit of the approach is that uncertainty is quantified as an integral part of the mass discharge estimate. We use this approach for performance assessment of a bioremediation experiment in a trichloroethene (TCE) source zone. Analyses of dissolved parent and daughter compounds demonstrated that the engineered bioremediation has elevated the degradation rate of TCE, resulting in a two-thirds reduction in the TCE mass discharge from the source zone. The biologically enhanced dissolution of TCE was not significant (~5%), and was less than expected. However, the discharges of the daughter products cis-1,2-dichloroethene (cDCE) and vinyl chloride (VC) increased, probably because of the rapid transformation of TCE from the source zone to the measurement transect. This suggests that enhancing the biodegradation of cDCE and VC will be crucial to successful engineered bioremediation of TCE source zones. © 2012, The Author(s). Ground Water © 2012, National Ground Water Association.
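    The underlying metric is simple: mass discharge is the flux-weighted sum of concentration over the transect panels. A minimal sketch, assuming hypothetical panel values (the geostatistical version replaces these point values with conditionally simulated fields to carry the uncertainty through):

```python
def mass_discharge(panels):
    """Sum C_i * q_i * A_i over transect panels.
    C in kg/m^3, q (Darcy flux) in m/day, A in m^2 -> kg/day."""
    return sum(c * q * a for c, q, a in panels)

# Hypothetical transect: (concentration, flux, panel area) per panel
panels = [(1e-4, 0.05, 2.0), (3e-4, 0.04, 2.0), (5e-5, 0.06, 2.0)]
md = mass_discharge(panels)
```

Repeating this sum over many conditional simulations of concentration and conductivity yields a distribution of mass discharge rather than a single number, which is the benefit the abstract emphasizes.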

  17. Stochastic hydrogeology: what professionals really need?

    PubMed

    Renard, Philippe

    2007-01-01

    Quantitative hydrogeology celebrated its 150th anniversary in 2006. Geostatistics is younger but has had a very large impact in hydrogeology. Today, geostatistics is used routinely to interpolate deterministically most of the parameters that are required to analyze a problem or make a quantitative analysis. In a small number of cases, geostatistics is combined with deterministic approaches to forecast uncertainty. At a more academic level, geostatistics is used extensively to study physical processes in heterogeneous aquifers. Yet, there is an important gap between the academic use and the routine applications of geostatistics. The reasons for this gap are diverse. These include aspects related to the hydrogeology consulting market, technical reasons such as the lack of widely available software, but also a number of misconceptions. A change in this situation requires acting at different levels. First, regulators must be convinced of the benefit of using geostatistics. Second, the economic potential of the approach must be emphasized to customers. Third, the relevance of the theories needs to be increased. Last, but not least, software, data sets, and computing infrastructure such as grid computing need to be widely available.

  18. Mapping malaria risk among children in Côte d'Ivoire using Bayesian geo-statistical models.

    PubMed

    Raso, Giovanna; Schur, Nadine; Utzinger, Jürg; Koudou, Benjamin G; Tchicaya, Emile S; Rohner, Fabian; N'goran, Eliézer K; Silué, Kigbafori D; Matthys, Barbara; Assi, Serge; Tanner, Marcel; Vounatsou, Penelope

    2012-05-09

    In Côte d'Ivoire, an estimated 767,000 disability-adjusted life years are due to malaria, placing the country at position number 14 with regard to the global burden of malaria. Risk maps are important to guide control interventions, and hence, the aim of this study was to predict the geographical distribution of malaria infection risk in children aged <16 years in Côte d'Ivoire at high spatial resolution. Using different data sources, a systematic review was carried out to compile and geo-reference survey data on Plasmodium spp. infection prevalence in Côte d'Ivoire, focusing on children aged <16 years. The period from 1988 to 2007 was covered. A suite of Bayesian geo-statistical logistic regression models was fitted to analyse malaria risk. Non-spatial models with and without exchangeable random effect parameters were compared to stationary and non-stationary spatial models. Non-stationarity was modelled assuming that the underlying spatial process is a mixture of separate stationary processes in each ecological zone. The best fitting model based on the deviance information criterion was used to predict Plasmodium spp. infection risk for entire Côte d'Ivoire, including uncertainty. Overall, 235 data points at 170 unique survey locations with malaria prevalence data for individuals aged <16 years were extracted. Most data points (n = 182, 77.4%) were collected between 2000 and 2007. A Bayesian non-stationary regression model showed the best fit with annualized rainfall and maximum land surface temperature identified as significant environmental covariates. This model was used to predict malaria infection risk at non-sampled locations. High-risk areas were mainly found in the north-central and western area, while relatively low-risk areas were located in the north at the country border, in the north-east, in the south-east around Abidjan, and in the central-west between two high prevalence areas. The malaria risk map at high spatial resolution gives an

  19. Mapping malaria risk among children in Côte d’Ivoire using Bayesian geo-statistical models

    PubMed Central

    2012-01-01

    Background In Côte d’Ivoire, an estimated 767,000 disability-adjusted life years are due to malaria, placing the country at position number 14 with regard to the global burden of malaria. Risk maps are important to guide control interventions, and hence, the aim of this study was to predict the geographical distribution of malaria infection risk in children aged <16 years in Côte d’Ivoire at high spatial resolution. Methods Using different data sources, a systematic review was carried out to compile and geo-reference survey data on Plasmodium spp. infection prevalence in Côte d’Ivoire, focusing on children aged <16 years. The period from 1988 to 2007 was covered. A suite of Bayesian geo-statistical logistic regression models was fitted to analyse malaria risk. Non-spatial models with and without exchangeable random effect parameters were compared to stationary and non-stationary spatial models. Non-stationarity was modelled assuming that the underlying spatial process is a mixture of separate stationary processes in each ecological zone. The best fitting model based on the deviance information criterion was used to predict Plasmodium spp. infection risk for entire Côte d’Ivoire, including uncertainty. Results Overall, 235 data points at 170 unique survey locations with malaria prevalence data for individuals aged <16 years were extracted. Most data points (n = 182, 77.4%) were collected between 2000 and 2007. A Bayesian non-stationary regression model showed the best fit with annualized rainfall and maximum land surface temperature identified as significant environmental covariates. This model was used to predict malaria infection risk at non-sampled locations. High-risk areas were mainly found in the north-central and western area, while relatively low-risk areas were located in the north at the country border, in the north-east, in the south-east around Abidjan, and in the central-west between two high prevalence areas. Conclusion The

  20. Spatial and temporal variability in the R-5 infiltration data set: Déjà vu and rainfall-runoff simulations

    NASA Astrophysics Data System (ADS)

    Loague, Keith; Kyriakidis, Phaedon C.

    1997-12-01

    This paper is a continuation of the event-based rainfall-runoff model evaluation study reported by Loague and Freeze [1985]. Here we reevaluate the performance of a quasi-physically based rainfall-runoff model for three large events from the well-known R-5 catchment. Five different statistical criteria are used to judge model performance quantitatively. Temporal variability in the large R-5 infiltration data set [Loague and Gander, 1990] is filtered by working in terms of permeability. The transformed data set is reanalyzed via geostatistical methods to model the spatial distribution of permeability across the R-5 catchment. We present new estimates of the spatial distribution of infiltration that are in turn used in our rainfall-runoff simulations with the Horton rainfall-runoff model. The new rainfall-runoff simulations, complicated by reinfiltration impacts at the smaller scales of characterization, indicate that the near-surface hydrologic response of the R-5 catchment is most probably dominated by a combination of the Horton and Dunne overland flow mechanisms.
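    The abstract does not name the five statistical criteria. As one commonly used example of such a goodness-of-fit criterion for rainfall-runoff models (an assumption, not necessarily one of the paper's five), the Nash-Sutcliffe efficiency can be sketched as follows, with invented data:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observed about its mean.
    1.0 is a perfect fit; values <= 0 mean the model is no better than the mean."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    sst = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / sst

# Hypothetical observed vs. simulated peak flows for a few events
obs = [1.0, 2.0, 3.0, 4.0]
sim = [1.1, 1.9, 3.2, 3.8]
nse = nash_sutcliffe(obs, sim)
```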

  1. Antibiotic prophylaxis at elective cesarean section: a randomized controlled trial in a low resource setting.

    PubMed

    Kandil, Mohamed; Sanad, Zakaria; Gaber, Wael

    2014-04-01

    To determine the best time to administer prophylactic antibiotics at Cesarean delivery in order to reduce postoperative maternal infectious morbidity in a low resource setting. One hundred term primigravidae with singleton pregnancies were recruited and randomly allocated to two equal groups. Each woman received 2 g intravenous Cefazoline. Women in Group I received it prior to skin incision while those in Group II had it immediately after cord clamping. We measured the following outcome parameters: (1) surgical site wound infection; (2) endometritis; and (3) urinary tract infection. There was no significant difference in any of the patients' characteristics between the two groups. In Group I, three cases developed surgical site infections, compared with four in Group II (p > 0.05). In Group I, the infected cases had Cesareans because of malpresentations, while in Group II, two cases had Cesareans because of patients' request, one because of maternal heart disease and one due to intra-uterine growth restriction. Seven and nine cases had urinary tract infections in Groups I and II, respectively (p > 0.05). Prophylactic antibiotic administration either prior to surgery or after cord clamping is probably equally effective in reducing postoperative infectious morbidity after Cesarean in low resource settings.

  2. Building on crossvalidation for increasing the quality of geostatistical modeling

    USGS Publications Warehouse

    Olea, R.A.

    2012-01-01

    The random function is a mathematical model commonly used in the assessment of uncertainty associated with a spatially correlated attribute that has been partially sampled. There are multiple algorithms for modeling such random functions, all sharing the requirement of specifying various parameters that have critical influence on the results. The importance of finding ways to compare the methods and set parameters to obtain results that better model uncertainty has increased as these algorithms have grown in number and complexity. Crossvalidation has been used in spatial statistics, mostly in kriging, for the analysis of mean square errors. An appeal of this approach is its ability to work with the same empirical sample available for running the algorithms. This paper goes beyond checking estimates by formulating a function sensitive to conditional bias. Under ideal conditions, such a function turns into a straight line, which can be used as a reference for preparing measures of performance. Applied to kriging, deviations from the ideal line provide sensitivity to the semivariogram lacking in crossvalidation of kriging errors and are more sensitive to conditional bias than analyses of errors. In terms of stochastic simulation, in addition to finding better parameters, the deviations allow comparison of the realizations resulting from the applications of different methods. Examples show improvements of about 30% in the deviations and approximately 10% in the square root of mean square errors between reasonable starting models and the solutions selected according to the new criteria.
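
The ideal-line idea can be illustrated numerically. Under conditional unbiasedness, regressing the true values on their cross-validation estimates yields a slope of exactly 1; deviation of the slope from 1 flags the conditional bias that plain mean-square-error checks miss. A minimal sketch (all data synthetic, not from the paper, and the helper name is our own):

```python
import numpy as np

def conditional_bias_slope(estimates, actuals):
    """Slope of the regression of actual values on cross-validation
    estimates.  Under conditional unbiasedness E[Z | Z*] = Z*, the
    points fall on the 1:1 line and the slope is 1; deviations from 1
    indicate conditional bias."""
    z_star = np.asarray(estimates, float)
    z = np.asarray(actuals, float)
    cov = np.cov(z_star, z, bias=True)
    return cov[0, 1] / cov[0, 0]

# Synthetic cross-validation pairs: estimates that are too smooth
# (over-smoothing is the classic source of kriging conditional bias).
rng = np.random.default_rng(0)
true = rng.normal(10.0, 2.0, 500)
est = 10.0 + 0.6 * (true - 10.0) + rng.normal(0.0, 0.5, 500)
# Regressing actual on estimate gives a slope well above 1 here,
# signalling that high estimates understate and low ones overstate.
slope = conditional_bias_slope(est, true)
```

This is not the paper's exact criterion function, only the underlying diagnostic it builds on: a mean-square-error check alone would not distinguish this over-smoothed estimator from an unbiased one of equal error magnitude.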

  3. Geostatistical methods in the assessment of the spatial variability of the quality of river water

    NASA Astrophysics Data System (ADS)

    Krasowska, Małgorzata; Banaszuk, Piotr

    2017-11-01

    The research was conducted in an agricultural catchment in north-eastern Poland. The aim of this study was to check how geostatistical analysis can help detect the zones and pathways by which water from different sources supplies the stream. The work included hydrochemical profiles, made by measuring electrical conductivity (EC) and temperature along the river. On the basis of these results, the authors calculated Moran's I coefficient, computed a semivariogram, and found that the EC values are spatially correlated over a stretch of about 140 m. This means that the spatial correlation between water samples in the stream is detectable over a distance of about 140 meters. It is therefore believed that the degree of water mineralization on this section is shaped by water entering the river channel in different ways: through tributaries, drainage leachate and surface runoff. In the case of the analyzed catchment, the potential sources of pollution were drainage systems. The spatial analysis thus allowed the identification of pollution sources in a catchment, especially in drained agricultural catchments.
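
The autocorrelation statistic used here is standard. A sketch of global Moran's I for readings ordered along a stream, with binary adjacent-station weights (the EC values below are hypothetical, not the paper's data):

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I: values near +1 indicate strong positive
    spatial autocorrelation; the expectation under spatial
    randomness is -1/(n-1), close to zero."""
    x = np.asarray(values, float)
    w = np.asarray(weights, float)
    n = x.size
    dev = x - x.mean()
    num = n * (w * np.outer(dev, dev)).sum()
    den = w.sum() * (dev ** 2).sum()
    return num / den

# Hypothetical EC readings (uS/cm) at stations spaced along the stream;
# adjacent stations are neighbours (binary contiguity weights).
ec = np.array([410, 420, 435, 450, 520, 540, 555, 430, 425, 415.0])
n = ec.size
w = np.zeros((n, n))
for i in range(n - 1):
    w[i, i + 1] = w[i + 1, i] = 1.0

i_stat = morans_i(ec, w)  # positive: neighbouring readings are alike
```

A clearly positive value, as in the paper, justifies going on to model the correlation range with a semivariogram.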

  4. Effectiveness of a self-management program for dual sensory impaired seniors in aged care settings: study protocol for a cluster randomized controlled trial.

    PubMed

    Roets-Merken, Lieve M; Graff, Maud J L; Zuidema, Sytse U; Hermsen, Pieter G J M; Teerenstra, Steven; Kempen, Gertrudis I J M; Vernooij-Dassen, Myrra J F J

    2013-10-07

    Five to 25 percent of residents in aged care settings have a combined hearing and visual sensory impairment. Usual care is generally restricted to single sensory impairment, neglecting the consequences of dual sensory impairment on social participation and autonomy. The aim of this study is to evaluate the effectiveness of a self-management program for seniors who acquired dual sensory impairment at old age. In a cluster randomized, single-blind controlled trial, with aged care settings as the unit of randomization, the effectiveness of a self-management program will be compared to usual care. A minimum of 14 and maximum of 20 settings will be randomized to either the intervention cluster or the control cluster, aiming to include a total of 132 seniors with dual sensory impairment. Each senior will be linked to a licensed practical nurse working at the setting. During a five to six month intervention period, nurses at the intervention clusters will be trained in a self-management program to support and empower seniors to use self-management strategies. In two separate diaries, nurses keep track of the interviews with the seniors and their reflections on their own learning process. Nurses of the control clusters offer care as usual. At senior level, the primary outcome is the social participation of the seniors measured using the Hearing Handicap Questionnaire and the Activity Card Sort, and secondary outcomes are mood, autonomy and quality of life. At nurse level, the outcome is job satisfaction. Effectiveness will be evaluated using linear mixed model analysis. The results of this study will provide evidence for the effectiveness of the Self-Management Program for seniors with dual sensory impairment living in aged care settings. The findings are expected to contribute to the knowledge on the program's potential to enhance social participation and autonomy of the seniors, as well as increasing the job satisfaction of the licensed practical nurses. 

  5. Effectiveness of a self-management program for dual sensory impaired seniors in aged care settings: study protocol for a cluster randomized controlled trial

    PubMed Central

    2013-01-01

    Background Five to 25 percent of residents in aged care settings have a combined hearing and visual sensory impairment. Usual care is generally restricted to single sensory impairment, neglecting the consequences of dual sensory impairment on social participation and autonomy. The aim of this study is to evaluate the effectiveness of a self-management program for seniors who acquired dual sensory impairment at old age. Methods/Design In a cluster randomized, single-blind controlled trial, with aged care settings as the unit of randomization, the effectiveness of a self-management program will be compared to usual care. A minimum of 14 and maximum of 20 settings will be randomized to either the intervention cluster or the control cluster, aiming to include a total of 132 seniors with dual sensory impairment. Each senior will be linked to a licensed practical nurse working at the setting. During a five to six month intervention period, nurses at the intervention clusters will be trained in a self-management program to support and empower seniors to use self-management strategies. In two separate diaries, nurses keep track of the interviews with the seniors and their reflections on their own learning process. Nurses of the control clusters offer care as usual. At senior level, the primary outcome is the social participation of the seniors measured using the Hearing Handicap Questionnaire and the Activity Card Sort, and secondary outcomes are mood, autonomy and quality of life. At nurse level, the outcome is job satisfaction. Effectiveness will be evaluated using linear mixed model analysis. Discussion The results of this study will provide evidence for the effectiveness of the Self-Management Program for seniors with dual sensory impairment living in aged care settings. The findings are expected to contribute to the knowledge on the program’s potential to enhance social participation and autonomy of the seniors, as well as increasing the job satisfaction of the licensed practical nurses.

  6. The geostatistic-based spatial distribution variations of soil salts under long-term wastewater irrigation.

    PubMed

    Wu, Wenyong; Yin, Shiyang; Liu, Honglu; Niu, Yong; Bao, Zhe

    2014-10-01

    The purpose of this study was to determine and evaluate spatial changes in soil salinity by using geostatistical methods. The study focused on a suburban area of Beijing, where urban development has led to water shortage and accelerated wastewater reuse for farm irrigation for more than 30 years. The data were processed in GIS using three different interpolation techniques: ordinary kriging (OK), disjunctive kriging (DK), and universal kriging (UK). A normality test and overall trend analysis were applied for each interpolation technique to select the best-fitted model for each soil parameter. Results showed that OK was suitable for soil sodium adsorption ratio (SAR) and Na⁺ interpolation; UK was suitable for soil Cl⁻ and pH; DK was suitable for soil Ca²⁺. The nugget-to-sill ratio was applied to evaluate the effects of structural and stochastic factors. The maps showed that the areas of non-saline soil and slightly saline soil accounted for 6.39 and 93.61%, respectively. The spatial distribution and accumulation of soil salt were significantly affected by irrigation probability and drainage conditions under long-term wastewater irrigation.
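
The nugget-to-sill ratio mentioned above is a routine diagnostic: the nugget captures the random (stochastic) component of the semivariance and the sill the total, so their ratio grades how much of the variability is spatially structured. A small sketch using the thresholds commonly attributed to Cambardella et al.; the abstract does not state which cutoffs were applied, and the fitted parameters below are invented:

```python
def spatial_dependence(nugget, sill):
    """Classify spatial dependence from the nugget-to-sill ratio,
    using the widely cited Cambardella-style thresholds:
    <25% strong, 25-75% moderate, >75% weak (mostly random)."""
    ratio = 100.0 * nugget / sill
    if ratio < 25:
        return ratio, "strong"
    if ratio <= 75:
        return ratio, "moderate"
    return ratio, "weak"

# Hypothetical fitted semivariogram parameters for soil Na+:
ratio, cls = spatial_dependence(nugget=0.12, sill=0.80)
```

A "strong" result (structural factors dominate) would support interpreting the kriged salinity maps as driven by systematic controls such as irrigation and drainage rather than measurement noise.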

  7. Definition of radon prone areas in Friuli Venezia Giulia region, Italy, using geostatistical tools.

    PubMed

    Cafaro, C; Bossew, P; Giovani, C; Garavaglia, M

    2014-12-01

    Studying the geographical distribution of indoor radon concentration using geostatistical interpolation methods has become common for predicting and estimating the risk to the population. Here we analyse the case of Friuli Venezia Giulia (FVG), the north-easternmost region of Italy. The mean value and standard deviation are, respectively, 153 Bq/m³ and 183 Bq/m³; the geometric mean is 100 Bq/m³. Spatial datasets of indoor radon concentrations are usually affected by clustering and apparent non-stationarity issues, which can eventually yield arguable results. The clustering of the present dataset seems to be non-preferential, so the areal estimations are not expected to be affected. Conversely, nothing can be said about the non-stationarity issues and their effects. After discussing the correlation of geology with indoor radon concentration, it appears that these issues are created by the same geologic features influencing the mean and median values, and cannot be eliminated via a map-based approach. To tackle these problems, in this work we deal with multiple definitions of radon prone areas (RPA), but only in Quaternary areas of FVG, using extensive simulation techniques.
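
The gap between the arithmetic mean (153 Bq/m³) and the geometric mean (100 Bq/m³) is characteristic of the roughly lognormal distributions typical of indoor radon surveys: the heavy right tail pulls the arithmetic mean up while the geometric mean tracks the median. A quick check on synthetic data (the distribution parameters are chosen for illustration, not fitted to the FVG survey):

```python
import numpy as np

# Indoor radon is commonly modelled as near-lognormal.  Simulate a
# lognormal sample with median ~100 Bq/m3 and a broad spread:
rng = np.random.default_rng(42)
radon = rng.lognormal(mean=np.log(100.0), sigma=0.9, size=20000)

am = radon.mean()                  # arithmetic mean, pulled up by the tail
gm = np.exp(np.log(radon).mean())  # geometric mean, stays near the median
```

For this model the arithmetic mean lands near 150 Bq/m³ while the geometric mean stays near 100 Bq/m³, mirroring the split reported in the abstract.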

  8. A randomized controlled trial of aromatherapy massage in a hospice setting.

    PubMed

    Soden, Katie; Vincent, Karen; Craske, Stephen; Lucas, Caroline; Ashley, Sue

    2004-03-01

    Research suggests that patients with cancer, particularly in the palliative care setting, are increasingly using aromatherapy and massage. There is good evidence that these therapies may be helpful for anxiety reduction for short periods, but few studies have looked at the longer term effects. This study was designed to compare the effects of four-week courses of aromatherapy massage and massage alone on physical and psychological symptoms in patients with advanced cancer. Forty-two patients were randomly allocated to receive weekly massages with lavender essential oil and an inert carrier oil (aromatherapy group), an inert carrier oil only (massage group) or no intervention. Outcome measures included a Visual Analogue Scale (VAS) of pain intensity, the Verran and Snyder-Halpern (VSH) sleep scale, the Hospital Anxiety and Depression (HAD) scale and the Rotterdam Symptom Checklist (RSCL). We were unable to demonstrate any significant long-term benefits of aromatherapy or massage in terms of improving pain control, anxiety or quality of life. However, sleep scores improved significantly in both the massage and the combined massage (aromatherapy and massage) groups. There were also statistically significant reductions in depression scores in the massage group. In this study of patients with advanced cancer, the addition of lavender essential oil did not appear to increase the beneficial effects of massage. Our results do suggest, however, that patients with high levels of psychological distress respond best to these therapies.

  9. Hungarian contribution to the Global Soil Organic Carbon Map (GSOC17) using advanced machine learning algorithms and geostatistics

    NASA Astrophysics Data System (ADS)

    Szatmári, Gábor; Laborczi, Annamária; Takács, Katalin; Pásztor, László

    2017-04-01

    The knowledge about soil organic carbon (SOC) baselines and changes, and the detection of vulnerable hot spots for SOC losses and gains under climate change and changed land management, is still fairly limited. Thus the Global Soil Partnership (GSP) has been requested to develop a global SOC mapping campaign by 2017. GSP's concept builds on official national data sets; therefore, a bottom-up (country-driven) approach is pursued. The elaborated Hungarian methodology suits the general specifications of GSOC17 provided by GSP. The input data for the GSOC17@HU mapping approach have involved legacy soil data bases, as well as environmental covariates related to the main soil forming factors, such as climate, organisms, relief and parent material. Nowadays, digital soil mapping (DSM) highly relies on the assumption that soil properties of interest can be modelled as the sum of a deterministic and a stochastic component, which can be treated and modelled separately. We also adopted this assumption in our methodology. In practice, multiple regression techniques are commonly used to model the deterministic part. However, these global (and usually linear) models commonly oversimplify the often complex and non-linear relationships, which has a crucial effect on the resulting soil maps. Thus, we integrated machine learning algorithms (namely random forest and quantile regression forest) in the elaborated methodology, supposing them to be more suitable for the problem at hand. This approach has enabled us to model the GSOC17 soil properties in forms as complex and non-linear as the soil itself, and to model and assess the uncertainty of the results, which is highly relevant in decision making. The applied methodology used a geostatistical approach to model the stochastic part of the spatial variability of the soil properties of interest. We created the GSOC17@HU map with 1 km grid resolution according to the GSP's specifications. The map contributes to the GSP's global SOC map.

  10. Improved hydrological model parametrization for climate change impact assessment under data scarcity - The potential of field monitoring techniques and geostatistics.

    PubMed

    Meyer, Swen; Blaschek, Michael; Duttmann, Rainer; Ludwig, Ralf

    2016-02-01

    According to current climate projections, Mediterranean countries are at high risk for an even more pronounced susceptibility to changes in the hydrological budget and extremes. These changes are expected to have severe direct impacts on the management of water resources, agricultural productivity and drinking water supply. Current projections of future hydrological change, based on regional climate model results and subsequent hydrological modeling schemes, are very uncertain and poorly validated. The Rio Mannu di San Sperate Basin, located in Sardinia, Italy, is one test site of the CLIMB project. The Water Simulation Model (WaSiM) was set up to model current and future hydrological conditions. The availability of measured meteorological and hydrological data is poor, as is common for many Mediterranean catchments. In this study we conducted a soil sampling campaign in the Rio Mannu catchment. We tested different deterministic and hybrid geostatistical interpolation methods on soil textures and assessed the performance of the applied models. We calculated a new soil texture map based on the best prediction method, and the soil model in WaSiM was set up with the improved soil information. The simulation results were compared to a standard soil parametrization. WaSiM was validated with spatial evapotranspiration rates using the triangle method (Jiang and Islam, 1999). WaSiM was driven with the meteorological forcing taken from 4 different ENSEMBLES climate projections for a reference (1971-2000) and a future (2041-2070) time series. The climate change impact was assessed based on differences between the reference and future time series. The simulated results show a reduction of all hydrological quantities in the future spring season. Furthermore, simulation results reveal an earlier onset of dry conditions in the catchment. We show that a solid soil model setup based on short-term field measurements can improve long-term modeling results, which is especially important under data scarcity.

  11. Does rational selection of training and test sets improve the outcome of QSAR modeling?

    PubMed

    Martin, Todd M; Harten, Paul; Young, Douglas M; Muratov, Eugene N; Golbraikh, Alexander; Zhu, Hao; Tropsha, Alexander

    2012-10-22

    Prior to using a quantitative structure activity relationship (QSAR) model for external predictions, its predictive power should be established and validated. In the absence of a true external data set, the best way to validate the predictive ability of a model is to perform its statistical external validation. In statistical external validation, the overall data set is divided into training and test sets. Commonly, this splitting is performed using random division. Rational splitting methods can divide data sets into training and test sets in an intelligent fashion. The purpose of this study was to determine whether rational division methods lead to more predictive models compared to random division. A special data splitting procedure was used to facilitate the comparison between random and rational division methods. For each toxicity end point, the overall data set was divided into a modeling set (80% of the overall set) and an external evaluation set (20% of the overall set) using random division. The modeling set was then subdivided into a training set (80% of the modeling set) and a test set (20% of the modeling set) using rational division methods and by using random division. The Kennard-Stone, minimal test set dissimilarity, and sphere exclusion algorithms were used as the rational division methods. The hierarchical clustering, random forest, and k-nearest neighbor (kNN) methods were used to develop QSAR models based on the training sets. For kNN QSAR, multiple training and test sets were generated, and multiple QSAR models were built. The results of this study indicate that models based on rational division methods generate better statistical results for the test sets than models based on random division, but the predictive power of both types of models is comparable.
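
Of the rational division methods named above, Kennard-Stone is the most widely known: it seeds the training set with the two most distant samples, then repeatedly adds the candidate farthest from its nearest already-selected neighbour, so the training set covers descriptor space evenly. A self-contained sketch on a toy descriptor matrix (invented data, not from the study):

```python
import numpy as np

def kennard_stone(X, n_train):
    """Kennard-Stone rational split: returns (train, test) index lists.
    Starts from the two mutually most distant samples, then greedily
    adds the point whose nearest selected neighbour is farthest away."""
    X = np.asarray(X, float)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    i, j = np.unravel_index(np.argmax(d), d.shape)
    train = [i, j]
    rest = [k for k in range(len(X)) if k not in train]
    while len(train) < n_train:
        # distance from each remaining point to its closest chosen point
        near = d[np.ix_(rest, train)].min(axis=1)
        pick = rest.pop(int(np.argmax(near)))
        train.append(pick)
    return sorted(int(k) for k in train), sorted(int(k) for k in rest)

# Toy 2-D descriptors: extremes land in training, near-duplicates in test.
X = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 1.0], [5.2, 1.1], [2.0, 8.0]])
train, test = kennard_stone(X, n_train=3)
```

Because the extremes go to training, test-set points always sit inside the training domain, which is one reason rational splits score better on test statistics even when true external predictivity is comparable, as the study concludes.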

  12. Best strategies to implement clinical pathways in an emergency department setting: study protocol for a cluster randomized controlled trial.

    PubMed

    Jabbour, Mona; Curran, Janet; Scott, Shannon D; Guttman, Astrid; Rotter, Thomas; Ducharme, Francine M; Lougheed, M Diane; McNaughton-Filion, M Louise; Newton, Amanda; Shafir, Mark; Paprica, Alison; Klassen, Terry; Taljaard, Monica; Grimshaw, Jeremy; Johnson, David W

    2013-05-22

    The clinical pathway is a tool that operationalizes best evidence recommendations and clinical practice guidelines in an accessible format for 'point of care' management by multidisciplinary health teams in hospital settings. While high-quality, expert-developed clinical pathways have many potential benefits, their impact has been limited by variable implementation strategies and suboptimal research designs. Best strategies for implementing pathways into hospital settings remain unknown. This study will seek to develop and comprehensively evaluate best strategies for effective local implementation of externally developed expert clinical pathways. We will develop a theory-based and knowledge user-informed intervention strategy to implement two pediatric clinical pathways: asthma and gastroenteritis. Using a balanced incomplete block design, we will randomize 16 community emergency departments to receive the intervention for one clinical pathway and serve as control for the alternate clinical pathway, thus conducting two cluster randomized controlled trials to evaluate this implementation intervention. A minimization procedure will be used to randomize sites. Intervention sites will receive a tailored strategy to support full clinical pathway implementation. We will evaluate implementation strategy effectiveness through measurement of relevant process and clinical outcomes. The primary process outcome will be the presence of an appropriately completed clinical pathway on the chart for relevant patients. Primary clinical outcomes for each clinical pathway include the following: Asthma--the proportion of asthmatic patients treated appropriately with corticosteroids in the emergency department and at discharge; and Gastroenteritis--the proportion of relevant patients appropriately treated with oral rehydration therapy. Data sources include chart audits, administrative databases, environmental scans, and qualitative interviews. We will also conduct an overall process evaluation.

  13. Best strategies to implement clinical pathways in an emergency department setting: study protocol for a cluster randomized controlled trial

    PubMed Central

    2013-01-01

    Background The clinical pathway is a tool that operationalizes best evidence recommendations and clinical practice guidelines in an accessible format for ‘point of care’ management by multidisciplinary health teams in hospital settings. While high-quality, expert-developed clinical pathways have many potential benefits, their impact has been limited by variable implementation strategies and suboptimal research designs. Best strategies for implementing pathways into hospital settings remain unknown. This study will seek to develop and comprehensively evaluate best strategies for effective local implementation of externally developed expert clinical pathways. Design/methods We will develop a theory-based and knowledge user-informed intervention strategy to implement two pediatric clinical pathways: asthma and gastroenteritis. Using a balanced incomplete block design, we will randomize 16 community emergency departments to receive the intervention for one clinical pathway and serve as control for the alternate clinical pathway, thus conducting two cluster randomized controlled trials to evaluate this implementation intervention. A minimization procedure will be used to randomize sites. Intervention sites will receive a tailored strategy to support full clinical pathway implementation. We will evaluate implementation strategy effectiveness through measurement of relevant process and clinical outcomes. The primary process outcome will be the presence of an appropriately completed clinical pathway on the chart for relevant patients. Primary clinical outcomes for each clinical pathway include the following: Asthma—the proportion of asthmatic patients treated appropriately with corticosteroids in the emergency department and at discharge; and Gastroenteritis—the proportion of relevant patients appropriately treated with oral rehydration therapy. Data sources include chart audits, administrative databases, environmental scans, and qualitative interviews.

  14. The Efficacy of Goal Setting in Cardiac Rehabilitation: a Gender-Specific Randomized Controlled Trial.

    PubMed

    Stamm-Balderjahn, Sabine; Brünger, Martin; Michel, Anne; Bongarth, Christa; Spyra, Karla

    2016-08-08

    Patients with coronary heart disease undergo cardiac rehabilitation in order to reduce their cardiovascular risk factors. Often, however, the benefit of rehabilitation is lost over time. It is unclear whether this happens in the same way to men and women. We studied whether the setting of gender-specific behavior goals, agreed between the doctor and the patient at the end of rehabilitation, can prolong its positive effects. This study was performed with a mixed-method design. It consisted of qualitative interviews and group discussions with patients, doctors and other treating personnel, and researchers, as well as a quantitative, randomized, controlled intervention trial in which data were acquired at four time points (the beginning and end of rehabilitation and then 6 and 12 months later). 545 patients, 262 of them women (48.1%), were included. The patients were assigned to a goal checking group (n = 132), a goal setting group (n = 143), and a control group (n = 270). The primary endpoints were health-related behavior (exercise, diet, tobacco consumption), subjective state of health, and medication adherence. The secondary endpoints included physiological protection and risk factors such as blood pressure, cholesterol (HDL, LDL, and total), blood sugar, HbA1c, and body-mass index. The intervention had no demonstrable effect on the primary or secondary endpoints. The percentage of smokers declined to a similar extent in all groups from the beginning of rehabilitation to 12 months after its end (overall figures: 12.4% to 8.6%, p < 0.05). The patients' exercise behavior, diet, and subjective state of health also improved over the entire course of the study. Women had a healthier diet than men. Subgroup analyses indicated a possible effect of the intervention on exercise behavior in women who were employed and in men who were not (p < 0.01). The efficacy of goal setting was not demonstrated. Therefore, no indication for its routine provision can be derived from this trial.

  15. Estimating the volume and age of water stored in global lakes using a geo-statistical approach

    PubMed Central

    Messager, Mathis Loïc; Lehner, Bernhard; Grill, Günther; Nedeva, Irena; Schmitt, Oliver

    2016-01-01

    Lakes are key components of biogeochemical and ecological processes, thus knowledge about their distribution, volume and residence time is crucial in understanding their properties and interactions within the Earth system. However, global information is scarce and inconsistent across spatial scales and regions. Here we develop a geo-statistical model to estimate the volume of global lakes with a surface area of at least 10 ha based on the surrounding terrain information. Our spatially resolved database shows 1.42 million individual polygons of natural lakes with a total surface area of 2.67 × 10⁶ km² (1.8% of global land area), a total shoreline length of 7.2 × 10⁶ km (about four times longer than the world's ocean coastline) and a total volume of 181.9 × 10³ km³ (0.8% of total global non-frozen terrestrial water stocks). We also compute mean and median hydraulic residence times for all lakes to be 1,834 days and 456 days, respectively. PMID:27976671

  16. A Geostatistical Toolset for Reconstructing Louisiana's Coastal Stratigraphy using Subsurface Boring and Cone Penetrometer Test Data

    NASA Astrophysics Data System (ADS)

    Li, A.; Tsai, F. T. C.; Jafari, N.; Chen, Q. J.; Bentley, S. J.

    2017-12-01

    A vast area of river deltaic wetlands stretches across the southern Louisiana coast. The wetlands are suffering from a high rate of land loss, which increasingly threatens coastal communities and energy infrastructure. A regional stratigraphic framework of the delta plain is now imperative to answer scientific questions (such as how the delta plain grows and decays) and to provide information to coastal protection and restoration projects (such as marsh creation and construction of levees and floodwalls). Over the years, subsurface investigations in Louisiana have been conducted by state and federal agencies (Louisiana Department of Natural Resources, United States Geological Survey, United States Army Corps of Engineers, etc.), research institutes (Louisiana Geological Survey, LSU Coastal Studies Institute, etc.), engineering firms, and oil-gas companies. This has resulted in the availability of various types of data, including geological, geotechnical, and geophysical data. However, it is challenging to integrate the different types of data and construct three-dimensional stratigraphy models at regional scale. In this study, a set of geostatistical methods was used to tackle this problem. An ordinary kriging method was used to regionalize continuous data, such as grain size, water content, liquid limit, plasticity index, and cone penetrometer tests (CPTs). Indicator kriging and multiple indicator kriging methods were used to regionalize categorical data, such as soil classification. A compositional kriging method was used to regionalize compositional data, such as soil composition (fractions of sand, silt and clay). Stratigraphy models were constructed for three cases in the coastal zone: (1) Inner Harbor Navigation Canal (IHNC) area: soil classification and soil behavior type (SBT) stratigraphies were constructed using ordinary kriging; (2) Middle Barataria Bay area: a soil classification stratigraphy was constructed using multiple indicator kriging; (3) Lower Barataria
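
The ordinary kriging step used for the continuous variables can be sketched compactly: solve the standard OK system (variogram matrix plus a Lagrange multiplier forcing the weights to sum to one) and take the weighted average. The variogram parameters, boring coordinates, and water-content values below are all hypothetical, and a spherical model is assumed only for illustration:

```python
import numpy as np

def ordinary_kriging(xy, z, x0, nugget=0.0, sill=1.0, a=2000.0):
    """Ordinary kriging of one target location x0 from scattered data,
    with a spherical semivariogram.  Solves the OK system with a
    Lagrange multiplier so the weights sum to one; returns (weights,
    estimate)."""
    def gamma(h):
        g = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
        return np.where(h < a, g, sill)

    xy = np.asarray(xy, float)
    n = len(xy)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(np.linalg.norm(xy[:, None] - xy[None, :], axis=-1))
    A[n, n] = 0.0                       # Lagrange-multiplier corner
    b = np.ones(n + 1)
    b[:n] = gamma(np.linalg.norm(xy - np.asarray(x0, float), axis=-1))
    w = np.linalg.solve(A, b)[:n]
    return w, float(w @ np.asarray(z, float))

# Water content (%) at three hypothetical borings, estimated at a new point:
w, est = ordinary_kriging([(0, 0), (1000, 0), (0, 1000)],
                          [20.0, 30.0, 24.0], x0=(400, 300))
```

The nearest boring gets the largest weight, and the unbiasedness constraint keeps the estimate in a sensible range even when the variogram parameters are only roughly known.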

  17. A Random Variable Related to the Inversion Vector of a Partial Random Permutation

    ERIC Educational Resources Information Center

    Laghate, Kavita; Deshpande, M. N.

    2005-01-01

    In this article, we define the inversion vector of a permutation of the integers 1, 2,..., n. We set up a particular kind of permutation, called a partial random permutation. The sum of the elements of the inversion vector of such a permutation is a random variable of interest.
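
The inversion vector described above is easy to compute directly: entry k counts the elements larger than k appearing to the left of k, and the sum of the entries is the total number of inversions of the permutation. A short sketch (the example permutation is ours):

```python
def inversion_vector(perm):
    """Inversion vector of a permutation of 1..n: entry k counts the
    elements larger than k that appear to the left of k in perm."""
    pos = {v: i for i, v in enumerate(perm)}
    return [sum(1 for v in perm[:pos[k]] if v > k)
            for k in range(1, len(perm) + 1)]

v = inversion_vector([3, 1, 4, 2])   # entries for k = 1, 2, 3, 4
total_inversions = sum(v)            # the random variable of interest
```

For [3, 1, 4, 2] the inverted pairs are (3,1), (3,2), (4,2), so the sum of the inversion vector is 3; summing this statistic over a partial random permutation gives the random variable the article studies.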

  18. Application of Geostatistical Methods and Machine Learning for spatio-temporal Earthquake Cluster Analysis

    NASA Astrophysics Data System (ADS)

    Schaefer, A. M.; Daniell, J. E.; Wenzel, F.

    2014-12-01

    Earthquake clustering is an increasingly important part of general earthquake research, especially in terms of seismic hazard assessment and earthquake forecasting and prediction approaches. The distinct identification and definition of foreshocks, aftershocks, mainshocks and secondary mainshocks is handled using a point-based spatio-temporal clustering algorithm originating from the field of classic machine learning. This can be further applied for declustering purposes to separate background seismicity from triggered seismicity. The results are interpreted and processed to assemble 3D-(x,y,t) earthquake clustering maps which are based on smoothed seismicity records in space and time. In addition, multi-dimensional Gaussian functions are used to capture clustering parameters for spatial distribution and dominant orientations. Clusters are further processed using methodologies originating from geostatistics, which have mostly been applied and developed in mining projects during the last decades. A 2.5D variogram analysis is applied to identify spatio-temporal homogeneity in terms of earthquake density and energy output. The results are interpolated using kriging to provide an accurate mapping of clustering features. As a case study, seismic data of New Zealand and the United States are used, covering events since the 1950s, from which an earthquake cluster catalogue is assembled for most of the major events, including a detailed analysis of the Landers and Christchurch sequences.
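
The abstract does not name the specific algorithm, but point-based spatio-temporal clustering of this kind is typically a density-based (DBSCAN-style) scheme: two events are neighbours when both their epicentral distance and their time separation fall under thresholds, and dense groups become clusters while sparse events remain background. A minimal sketch of that idea (thresholds and events invented):

```python
import numpy as np

def st_dbscan(points, eps_space, eps_time, min_pts):
    """Minimal density-based spatio-temporal clustering sketch (a
    DBSCAN variant on (x, y, t) triples).  Returns one label per
    event; -1 marks background (unclustered) seismicity."""
    pts = np.asarray(points, float)
    n = len(pts)
    sdist = np.linalg.norm(pts[:, None, :2] - pts[None, :, :2], axis=-1)
    tdist = np.abs(pts[:, None, 2] - pts[None, :, 2])
    nbrs = [np.where((sdist[i] <= eps_space) & (tdist[i] <= eps_time))[0]
            for i in range(n)]
    labels = np.full(n, -1)
    cluster = 0
    for i in range(n):
        if labels[i] != -1 or len(nbrs[i]) < min_pts:
            continue                      # already labelled, or not a core
        labels[i] = cluster
        stack = list(nbrs[i])
        while stack:                      # grow the cluster from cores
            j = stack.pop()
            if labels[j] == -1:
                labels[j] = cluster
                if len(nbrs[j]) >= min_pts:
                    stack.extend(nbrs[j])
        cluster += 1
    return labels

# A tight mainshock-aftershock burst plus two isolated background events
# (coordinates in km, times in days; all values hypothetical):
events = [(0, 0, 0), (0.1, 0, 1), (0, 0.1, 2), (0.1, 0.1, 3),
          (50, 50, 100), (-40, 30, 200)]
labels = st_dbscan(events, eps_space=1.0, eps_time=10.0, min_pts=3)
```

Removing the labelled clusters from a catalogue is exactly the declustering step the abstract mentions: what stays labelled -1 approximates the background seismicity.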

  19. The geostatistical approach for structural and stratigraphic framework analysis of offshore NW Bonaparte Basin, Australia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wahid, Ali, E-mail: ali.wahid@live.com; Salim, Ahmed Mohamed Ahmed, E-mail: mohamed.salim@petronas.com.my; Yusoff, Wan Ismail Wan, E-mail: wanismail-wanyusoff@petronas.com.my

    2016-02-01

    Geostatistics is a statistical approach based on the study of temporal and spatial trends, which uses spatial relationships to model the values of variables at unsampled locations from known data. The statistical technique known as kriging was used for petrophysical and facies analysis; it assumes a spatial relationship in order to model the geological continuity between the known data and the unknown, producing a single best guess of the unknown. Kriging is also known as an optimal interpolation technique, which generates the best linear unbiased estimate of each horizon. The idea is to construct a numerical model of the lithofacies and rock properties that honors the available data, and to further integrate it with interpreted seismic sections, the tectonostratigraphy chart with the (short-term) sea level curve and the regional tectonics of the study area to find the structural and stratigraphic growth history of the NW Bonaparte Basin. Using the kriging technique, models were built to estimate different parameters such as horizons, facies, and porosities in the study area. Variograms were used to identify the spatial relationships within the data, which help to establish the depositional history of the North West (NW) Bonaparte Basin.

  20. A combined geostatistical-optimization model for the optimal design of a groundwater quality monitoring network

    NASA Astrophysics Data System (ADS)

    Kolosionis, Konstantinos; Papadopoulou, Maria P.

    2017-04-01

    Monitoring networks provide essential information for water resources management, especially in areas with significant groundwater exploitation due to extensive agricultural activities. In this work, a simulation-optimization framework is developed based on heuristic optimization methodologies and geostatistical modeling approaches to obtain an optimal design for a groundwater quality monitoring network. Groundwater quantity and quality data obtained from 43 existing observation locations at 3 different hydrological periods in the Mires basin in Crete, Greece, are used in the proposed framework with regression kriging to develop the spatial distribution of nitrate concentration in the aquifer of interest. Based on the existing groundwater quality mapping, the proposed optimization tool determines a cost-effective observation well network that contributes significant information to water managers and authorities. The elimination of observation wells that add little or no beneficial information to groundwater level and quality mapping of the area can be achieved using estimation uncertainty and statistical error metrics without affecting the assessment of groundwater quality. Given the high maintenance cost of groundwater monitoring networks, the proposed tool could be used by water regulators in the decision-making process to obtain an efficient network design.
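The elimination step can be illustrated with a greedy heuristic that repeatedly drops the observation well whose removal least inflates the mean ordinary-kriging variance over a prediction grid. This NumPy sketch is a simple stand-in for the paper's heuristic optimization framework; the spherical variogram, its range, and the well and grid coordinates are all assumed for illustration.

```python
import numpy as np

def mean_kriging_variance(wells, grid, sill=1.0, rng=2000.0):
    """Mean ordinary-kriging variance over grid cells (spherical model)."""
    def gamma(h):
        g = sill * (1.5 * h / rng - 0.5 * (h / rng) ** 3)
        return np.where(h >= rng, sill, np.where(h == 0, 0.0, g))
    n = len(wells)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(np.linalg.norm(wells[:, None] - wells[None, :], axis=-1))
    A[n, n] = 0.0
    B = np.ones((n + 1, len(grid)))
    B[:n] = gamma(np.linalg.norm(wells[:, None] - grid[None, :], axis=-1))
    W = np.linalg.solve(A, B)          # kriging weights for every grid cell
    return float(np.mean(np.sum(W * B, axis=0)))

def greedy_prune(wells, grid, keep):
    """Drop wells one at a time, each time removing the least informative one."""
    idx = list(range(len(wells)))
    while len(idx) > keep:
        costs = [mean_kriging_variance(wells[[j for j in idx if j != i]], grid)
                 for i in idx]
        idx.remove(idx[int(np.argmin(costs))])
    return idx
```

Adding a well can never increase the ordinary-kriging variance (putting zero weight on the new well is always feasible), so pruning can only hold the variance steady or raise it; the greedy search keeps that inflation as small as possible at each step.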

  1. Geostatistics and the representative elementary volume of gamma ray tomography attenuation in rock cores

    USGS Publications Warehouse

    Vogel, J.R.; Brown, G.O.

    2003-01-01

    Semivariograms of samples of Culebra Dolomite have been determined at two different resolutions for gamma ray computed tomography images. By fitting models to semivariograms, small-scale and large-scale correlation lengths are determined for four samples. Different semivariogram parameters were found for adjacent cores at both resolutions. Representative elementary volume (REV) concepts are related to the stationarity of the sample. A scale disparity factor is defined and is used to determine the sample size required for ergodic stationarity with a specified correlation length. This allows for comparison of geostatistical measures and representative elementary volumes. The modifiable areal unit problem is also addressed and used to determine resolution effects on correlation lengths. By changing resolution, a range of correlation lengths can be determined for the same sample. Comparison of voxel volume to the best-fit model correlation length of a single sample at different resolutions reveals a linear scaling effect. Using this relationship, the range of the point value semivariogram is determined. This is the range approached as the voxel size goes to zero. Finally, these results are compared to the regularization theory of point variables for borehole cores and are found to be a better fit for predicting the volume-averaged range.
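The empirical semivariograms that such models are fitted to are typically computed by the method of moments: half the mean squared difference between attenuation values, binned by separation distance. A minimal NumPy sketch, with a synthetic one-dimensional transect standing in for the tomography data:

```python
import numpy as np

def empirical_semivariogram(coords, values, lags, tol):
    """gamma(h): mean of 0.5*(z_i - z_j)^2 over pairs with |d_ij - h| < tol."""
    i, j = np.triu_indices(len(values), k=1)   # all unordered sample pairs
    d = np.linalg.norm(coords[i] - coords[j], axis=1)
    half_sq = 0.5 * (values[i] - values[j]) ** 2
    out = []
    for h in lags:
        mask = np.abs(d - h) < tol
        out.append(half_sq[mask].mean() if mask.any() else np.nan)
    return np.array(out)
```

Fitting a spherical (or other authorized) model to these binned values, for example by least squares, then yields the correlation length as the fitted range; repeating the fit at a coarser voxel resolution exposes the resolution dependence discussed above.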

  2. Provider training to screen and initiate evidence-based pediatric obesity treatment in routine practice settings: A randomized pilot trial

    PubMed Central

    Kolko, Rachel P.; Kass, Andrea E.; Hayes, Jacqueline F.; Levine, Michele D.; Garbutt, Jane M.; Proctor, Enola K.; Wilfley, Denise E.

    2016-01-01

    Introduction This randomized pilot trial evaluated two training modalities for first-line, evidence-based pediatric obesity services (screening and goal-setting) among nursing students. Method Participants (N=63) were randomized to Live Interactive Training (Live) or Web-facilitated Self-study Training (Web). Pre-training, post-training, and one-month follow-up assessments evaluated training feasibility, acceptability, and impact (knowledge and skill via simulation). Moderator (previous experience) and predictor (content engagement) analyses were conducted. Results Nearly all (98%) participants completed assessments. Both trainings were acceptable, with higher ratings for Live and for participants with previous experience (p’s<.05). Knowledge and skill improved from pre-training to post-training and follow-up in both conditions (p’s<.001). Live demonstrated greater content engagement (p’s<.01). Conclusions The training package was feasible, acceptable, and efficacious among nursing students. Given that Live had higher acceptability and engagement, and online training offers greater scalability, integrating interactive Live components within Web-based training may optimize outcomes, which may enhance practitioners’ delivery of pediatric obesity services. PMID:26873293

  3. The Effects of a Cluster Randomized Controlled Workplace Intervention on Sleep and Work-Family Conflict Outcomes in an Extended Care Setting

    PubMed Central

    Marino, Miguel; Killerby, Marie; Lee, Soomi; Klein, Laura Cousino; Moen, Phyllis; Olson, Ryan; Kossek, Ellen Ernst; King, Rosalind; Erickson, Leslie; Berkman, Lisa F.; Buxton, Orfeu M.

    2016-01-01

    Objectives To evaluate the effects of a workplace-based intervention on actigraphic and self-reported sleep outcomes in an extended care setting. Design Cluster randomized trial. Setting Extended-care (nursing) facilities. Participants US employees and managers at nursing homes. Nursing homes were randomly assigned to intervention or control conditions. Intervention The Work, Family and Health Study developed an intervention aimed at reducing work-family conflict within a 4-month work-family organizational change process. Employees participated in interactive sessions with facilitated discussions, role-playing, and games designed to increase control over work processes and work time. Managers completed training in family-supportive supervision. Measurements Primary actigraphic outcomes included: total sleep duration, wake after sleep onset, nighttime sleep, variation in nighttime sleep, nap duration, and number of naps. Secondary survey outcomes included work-to-family conflict, sleep insufficiency, insomnia symptoms and sleep quality. Measures were obtained at baseline, 6 months, and 12 months post-intervention. Results A total of 1,522 employees and 184 managers provided survey data at baseline. Managers and employees in the intervention arm showed no significant difference in sleep outcomes over time compared to control participants. Sleep outcomes were not moderated by work-to-family conflict or presence of children in the household for managers or employees. Age significantly moderated an intervention effect on nighttime sleep among employees (p=0.040), where younger employees benefited more from the intervention. Conclusion In the context of an extended-care nursing home workplace, the intervention did not significantly alter sleep outcomes in either managers or employees. Moderating effects of age were identified where younger employees’ sleep outcomes benefited more from the intervention. PMID:28239635

  4. Indoor terrestrial gamma dose rate mapping in France: a case study using two different geostatistical models.

    PubMed

    Warnery, E; Ielsch, G; Lajaunie, C; Cale, E; Wackernagel, H; Debayle, C; Guillevic, J

    2015-01-01

    Terrestrial gamma dose rates show important spatial variations in France. Previous studies resulted in maps of arithmetic means of indoor terrestrial gamma dose rates by "département" (French district). However, numerous areas could not be characterized due to the lack of data. The aim of our work was to obtain more precise estimates of the spatial variability of indoor terrestrial gamma dose rates in France by using a more recent and complete database and geostatistics. The study was based on 97,595 measurement results from 17,404 locations covering all of France. Measurements were made by the Institute for Radioprotection and Nuclear Safety (IRSN) using RPL (radio-photoluminescent) dosimeters, exposed for several months between 2011 and 2012 in French dentist surgeries and veterinary clinics. The data used came from dosimeters that were not exposed to anthropogenic sources. After removing the cosmic-ray contribution in order to study only the telluric gamma radiation, it was decided to work with the arithmetic means of the time-series measurements, weighted by the exposure time of the dosimeters, for each location. The values varied between 13 and 349 nSv/h, with an arithmetic mean of 76 nSv/h. The observed statistical distribution of the gamma dose rates was skewed to the right. Firstly, ordinary kriging was performed in order to predict the gamma dose rate on 1 × 1 km² cells over the whole domain. The second step of the study was to use an auxiliary variable in the estimates. In 2010, the IRSN completed a classification of French geological formations characterizing their uranium potential on the basis of geology and local measurements of rock uranium content. This information is georeferenced in a map at the scale 1:1,000,000. The geological uranium potential (GUP) was classified in 5 qualitative categories. As telluric gamma rays mostly come from the progenies of the ²³⁸U series present in rocks, this

  5. Determining the Significance of Item Order in Randomized Problem Sets

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Heffernan, Neil T.

    2009-01-01

    Researchers who make tutoring systems would like to know which sequences of educational content lead to the most effective learning by their students. The majority of data collected in many ITS systems consist of answers to a group of questions of a given skill often presented in a random sequence. Following work that identifies which items…

  6. A Streamlined Approach by a Combination of Bioindication and Geostatistical Methods for Assessing Air Contaminants and Their Effects on Human Health in Industrialized Areas: A Case Study in Southern Brazil

    PubMed Central

    Ferreira, Angélica B.; Ribeiro, Andreza P.; Ferreira, Maurício L.; Kniess, Cláudia T.; Quaresma, Cristiano C.; Lafortezza, Raffaele; Santos, José O.; Saiki, Mitiko; Saldiva, Paulo H.

    2017-01-01

    Industrialization in developing countries associated with urban growth results in a number of economic benefits, especially in small or medium-sized cities, but leads to a number of environmental and public health consequences. This problem is further aggravated when adequate infrastructure is lacking to monitor the environmental impacts left by industries and refineries. In this study, a new protocol was designed combining biomonitoring and geostatistics to evaluate the possible effects of shale industry emissions on human health and wellbeing. Furthermore, the traditional and expensive air quality method based on PM2.5 measurement was also used to validate the low-cost geostatistical approach. Chemical analysis was performed using an Energy Dispersive X-ray Fluorescence spectrometer (EDXRF) to measure inorganic elements in tree bark and retorted shale samples in São Mateus do Sul city, Southern Brazil. Fe, S, and Si were considered potential pollutants in the study area. Distribution maps of element concentrations were generated from the dataset and used to estimate the spatial behavior of Fe, S, and Si and the range from their hot spot(s), highlighting the regions surrounding the shale refinery. This evidence was also demonstrated in the measurements of PM2.5 concentrations, which are in agreement with the information obtained from the biomonitoring and geostatistical model. Factor and descriptive analyses performed on the concentrations of tree bark contaminants suggest that Fe, S, and Si might be used as indicators of industrial emissions. The number of cases of respiratory diseases obtained from the local basic health unit was used to assess a possible correlation between shale refinery emissions and cases of respiratory disease. These data are public and may be accessed on the website of the Brazilian Ministry of Health. Significant associations were found between the health data and refinery activities. The combination of the spatial characterization of air

  7. Spatial assessment of soil organic carbon and physicochemical properties in a horticultural orchard at arid zone of India using geostatistical approaches.

    PubMed

    Singh, Akath; Santra, Priyabrata; Kumar, Mahesh; Panwar, Navraten; Meghwal, P R

    2016-09-01

    Soil organic carbon (SOC) is a major indicator of the long-term sustenance of an agricultural production system. Apart from sustaining productivity, SOC plays a crucial role in the context of climate change. Keeping these potentials in mind, the spatial variation of SOC contents of a fruit orchard comprising several arid fruit plantations in the arid region of India is assessed in this study through geostatistical approaches. For this purpose, surface and subsurface soil samples from 175 locations in a fruit orchard spread over a 14.33-ha area were collected along with geographical coordinates. SOC content and soil physicochemical properties of the collected soil samples were determined, followed by geostatistical analysis for mapping purposes. Average SOC stock density of the orchard was 14.48 Mg ha⁻¹ for the 0- to 30-cm soil layer, ranging from 9.01 Mg ha⁻¹ in the Carissa carandas block to 19.52 Mg ha⁻¹ in the Prosopis cineraria block. The range of spatial variation of SOC content was found to be about 100 m, while two other soil physicochemical properties, pH and electrical conductivity (EC), also showed a similar spatial trend. This indicated that the minimum sampling distance for future SOC mapping programmes may be kept lower than 100 m for better accuracy. The ordinary kriging technique satisfactorily predicted SOC contents (in percent) at unsampled locations with a root-mean-squared residual (RMSR) of 0.35-0.37. The co-kriging approach was found slightly superior (RMSR = 0.26-0.28) to ordinary kriging for spatial prediction of SOC contents because of significant correlations of SOC contents with pH and EC. Uncertainty of SOC estimation was also presented in terms of a 90% confidence interval. Spatial estimates of SOC stock through the ordinary kriging or co-kriging approach also showed lower estimation uncertainty than non-spatial estimates, e.g., the arithmetic averaging approach. Among different fruit block plantations of the orchard, the block with Prosopis cineraria ('khejri') has

  8. Fitting parametric random effects models in very large data sets with application to VHA national data

    PubMed Central

    2012-01-01

    Background With the current focus on personalized medicine, patient/subject-level inference is often of key interest in translational research. As a result, random effects models (REM) are becoming popular for patient-level inference. However, for very large data sets characterized by large sample size, it can be difficult to fit REM using commonly available statistical software such as SAS, since they require inordinate amounts of computer time and memory beyond what is available, preventing model convergence. For example, in a retrospective cohort study of over 800,000 Veterans with type 2 diabetes with longitudinal data over 5 years, fitting REM via generalized linear mixed modeling using currently available standard procedures in SAS (e.g. PROC GLIMMIX) was very difficult, and the same problems exist in Stata’s gllamm or R’s lme packages. Thus, this study proposes and assesses the performance of a meta-regression approach and makes comparisons with methods based on sampling of the full data. Data We use both simulated and real data from a national cohort of Veterans with type 2 diabetes (n=890,394), created by linking multiple patient and administrative files into a cohort with longitudinal data collected over 5 years. Methods and results The outcome of interest was mean annual HbA1c measured over a 5-year period. Using this outcome, we compared parameter estimates from the proposed random effects meta regression (REMR) with estimates based on simple random sampling and VISN (Veterans Integrated Service Networks) based stratified sampling of the full data. Our results indicate that REMR provides parameter estimates that are less likely to be biased, with tighter confidence intervals, when the VISN-level estimates are homogeneous. Conclusion When the interest is to fit REM in repeated-measures data with a very large sample size, REMR can be used as a good alternative. It leads to reasonable inference for both Gaussian and non

  9. Spatial variability of isoproturon mineralizing activity within an agricultural field: geostatistical analysis of simple physicochemical and microbiological soil parameters.

    PubMed

    El Sebai, T; Lagacherie, B; Soulas, G; Martin-Laurent, F

    2007-02-01

    We assessed the spatial variability of isoproturon mineralization in relation to that of physicochemical and biological parameters in fifty soil samples regularly collected along a sampling grid delimited across a 0.36 ha field plot (40 x 90 m). Only faint relationships were observed between isoproturon mineralization and the soil pH, microbial C biomass, and organic nitrogen. Considerable spatial variability was observed for six of the nine parameters tested (isoproturon mineralization rates, organic nitrogen, genetic structure of the microbial communities, soil pH, microbial biomass and equivalent humidity). The map of isoproturon mineralization rates distribution was similar to that of soil pH, microbial biomass, and organic nitrogen but different from those of structure of the microbial communities and equivalent humidity. Geostatistics revealed that the spatial heterogeneity in the rate of degradation of isoproturon corresponded to that of soil pH and microbial biomass.

  10. Random complex dynamics and devil's coliseums

    NASA Astrophysics Data System (ADS)

    Sumi, Hiroki

    2015-04-01

    We investigate the random dynamics of polynomial maps on the Riemann sphere Ĉ and the dynamics of semigroups of polynomial maps on Ĉ. In particular, the dynamics of a semigroup G of polynomials whose planar postcritical set is bounded and the associated random dynamics are studied. In general, the Julia set of such a G may be disconnected. We show that if G is such a semigroup, then regarding the associated random dynamics, the chaos of the averaged system disappears in the C⁰ sense, and the function T∞ of the probability of tending to ∞ ∈ Ĉ is Hölder continuous on Ĉ and varies only on the Julia set of G. Moreover, the function T∞ has a kind of monotonicity. It turns out that T∞ is a complex analogue of the devil's staircase, and we call T∞ a 'devil's coliseum'. We investigate the details of T∞ when G is generated by two polynomials. In this case, T∞ varies precisely on the Julia set of G, which is a thin fractal set. Moreover, under this condition, we investigate the pointwise Hölder exponents of T∞.

  11. A geostatistical approach to identify and mitigate agricultural nitrous oxide emission hotspots.

    PubMed

    Turner, P A; Griffis, T J; Mulla, D J; Baker, J M; Venterea, R T

    2016-12-01

    Anthropogenic emissions of nitrous oxide (N2O), a trace gas with severe environmental costs, are greatest from agricultural soils amended with nitrogen (N) fertilizer. However, accurate N2O emission estimates at fine spatial scales are made difficult by their high variability, which represents a critical challenge for the management of N2O emissions. Here, static chamber measurements (n=60) and soil samples (n=129) were collected at approximately weekly intervals (n=6) for 42 days immediately following the application of N in a southern Minnesota cornfield (15.6 ha), typical of the systems prevalent throughout the U.S. Corn Belt. These data were integrated into a geostatistical model that resolved N2O emissions at a high spatial resolution (1 m). Field-scale N2O emissions exhibited a high degree of spatial variability and were partitioned into three classes of emission strength: hotspots, intermediate, and coldspots. Rates of emission from hotspots were 2-fold greater than from non-hotspot locations. Consequently, 36% of the field-scale emissions could be attributed to hotspots, despite representing only 21% of the total field area. Variations in elevation caused hotspots to develop in predictable locations, which were prone to nutrient and moisture accumulation caused by terrain focusing. Because these features are relatively static, our data and analyses indicate that targeted management of hotspots could efficiently reduce field-scale emissions by as much as 17%, a significant benefit considering the deleterious effects of atmospheric N2O. Copyright © 2016 Elsevier B.V. All rights reserved.
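The hotspot/intermediate/coldspot partition can be mimicked with a simple percentile rule on gridded emission estimates; the 80th/20th-percentile cutoffs below are illustrative assumptions, not the classification criteria used in the study.

```python
import numpy as np

def partition_emissions(flux, hot_q=80.0, cold_q=20.0):
    """Classify gridded emission estimates into cold/intermediate/hot cells
    and report the hotspots' share of total emissions and of total area."""
    hot_thr, cold_thr = np.percentile(flux, [hot_q, cold_q])
    classes = np.where(flux >= hot_thr, "hot",
                       np.where(flux <= cold_thr, "cold", "intermediate"))
    hot = classes == "hot"
    return {
        "classes": classes,
        "hot_emission_share": float(flux[hot].sum() / flux.sum()),
        "hot_area_share": float(hot.mean()),
    }
```

A hot_emission_share that is disproportionate to hot_area_share, as in the 36% of emissions from 21% of the area reported above, is what makes targeted management of hotspots attractive.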

  12. Simulation of Voltage SET Operation in Phase-Change Random Access Memories with Heater Addition and Ring-Type Contactor for Low-Power Consumption by Finite Element Modeling

    NASA Astrophysics Data System (ADS)

    Gong, Yue-Feng; Song, Zhi-Tang; Ling, Yun; Liu, Yan; Li, Yi-Jin

    2010-06-01

    A three-dimensional finite element model of phase-change random access memory is established to simulate the electric, thermal and phase state distribution during the SET operation. The model is applied to simulate the SET behaviors of the heater addition structure (HS) and the ring-type contact in the bottom electrode (RIB) structure. The simulation results indicate that a small bottom electrode contactor (BEC) is beneficial for heat efficiency and reliability in the HS cell, and a bottom electrode contactor with size Fx = 80 nm is a good choice for the RIB cell. Also shown is that the appropriate SET pulse time is 100 ns for low power consumption and fast operation.

  13. Comparing spatial regression to random forests for large environmental data sets

    EPA Science Inventory

    Environmental data may be “large” due to number of records, number of covariates, or both. Random forests has a reputation for good predictive performance when using many covariates, whereas spatial regression, when using reduced rank methods, has a reputatio...

  14. Using Predictions Based on Geostatistics to Monitor Trends in Aspergillus flavus Strain Composition.

    PubMed

    Orum, T V; Bigelow, D M; Cotty, P J; Nelson, M R

    1999-09-01

    Aspergillus flavus is a soil-inhabiting fungus that frequently produces aflatoxins, potent carcinogens, in cottonseed and other seed crops. A. flavus S strain isolates, characterized on the basis of sclerotial morphology, are highly toxigenic. Spatial and temporal characteristics of the percentage of the A. flavus isolates that are S strain (S strain incidence) were used to predict patterns across areas of more than 30 km². Spatial autocorrelation in S strain incidence in Yuma County, AZ, was shown to extend beyond field boundaries to adjacent fields. Variograms revealed both short-range (2 to 6 km) and long-range (20 to 30 km) spatial structure in S strain incidence. S strain incidence at 36 locations sampled in July 1997 was predicted with a high correlation between expected and observed values (R = 0.85, P = 0.0001) by kriging data from July 1995 and July 1996. S strain incidence at locations sampled in October 1997 and March 1998 was markedly less than predicted by kriging data from the same months in prior years. Temporal analysis of four locations repeatedly sampled from April 1995 through July 1998 also indicated a major reduction in S strain incidence in the Texas Hill area after July 1997. Surface maps generated by kriging point data indicated a similarity in the spatial pattern of S strain incidence among all sampling dates despite temporal changes in the overall S strain incidence. Geostatistics provided useful descriptions of variability in S strain incidence over space and time.

  15. Use of portable X-ray fluorescence spectroscopy and geostatistics for health risk assessment.

    PubMed

    Yang, Meng; Wang, Cheng; Yang, Zhao-Ping; Yan, Nan; Li, Feng-Ying; Diao, Yi-Wei; Chen, Min-Dong; Li, Hui-Ming; Wang, Jin-Hua; Qian, Xin

    2018-05-30

    Laboratory analysis of trace metals using inductively coupled plasma (ICP) spectroscopy is not cost effective, and the complex spatial distribution of soil trace metals makes their spatial analysis and prediction problematic. Thus, for the health risk assessment of exposure to trace metals in soils, portable X-ray fluorescence (PXRF) spectroscopy was used to replace ICP spectroscopy for metal analysis, and robust geostatistical methods were used to identify spatial outliers in trace metal concentrations and to map trace metal distributions. A case study was carried out around an industrial area in Nanjing, China. The results showed that PXRF spectroscopy provided results for trace metal (Cu, Ni, Pb and Zn) levels comparable to ICP spectroscopy. The results of the health risk assessment showed that Ni posed a higher non-carcinogenic risk than Cu, Pb and Zn, indicating a higher priority of concern than the other elements. Sampling locations associated with adverse health effects were identified as 'hotspots', and high-risk areas were delineated from risk maps. These 'hotspots' and high-risk areas were in close proximity to and downwind from petrochemical plants, indicating the dominant role of industrial activities as the major sources of trace metals in soils. The approach used in this study could be adopted as a cost-effective methodology for screening 'hotspots' and priority areas of concern for cost-efficient health risk management. Copyright © 2018 Elsevier Inc. All rights reserved.

  16. Use of geostatistics for assessing the concentration of heavy metals in a stretch of the River Apodi-Mossoro (Rio Grande do Norte State, Brazil).

    NASA Astrophysics Data System (ADS)

    Bezerra, J. M.; Siqueira, G. M.; Montenegro, A. A. A.; Silva, P. C. M.; Batista, R. O.

    2012-04-01

    The objective of this study was to assess environmental changes with respect to the concentration of heavy metals in the sediment contained in a stretch of the River Apodi-Mossoró (Rio Grande do Norte State, Brazil), considering changes in land use and soil. The sediment samples were collected at 30 points in the bed of the Apodi-Mossoró River, in a section with urban-rural features in the town of Mossoró. The concentration of heavy metals in the sediment was determined using composite samples of surface bottom sediments from a depth of 20 cm, according to the methodology of APHA-AWWA-WPCF (1998); the presence and total concentration of metals were subsequently determined by atomic absorption spectrometry for the following heavy metals: aluminium (Al), cadmium (Cd), chromium (Cr), copper (Cu), iron (Fe), manganese (Mn), nickel (Ni), lead (Pb) and zinc (Zn). Data were analyzed using statistical and geostatistical methods. The geostatistical analysis was performed by constructing experimental semivariograms, with assessment and adjustment of the fitted models using the jack-knifing technique. The element Cd was absent from the samples, which reduces the possibility of environmental contamination events. The average concentrations of the elements under study are within the limits proposed by the environmental legislation (National Environmental Council). However, there are no threshold values for Fe, Al and Mn, because these elements are associated with rock material of geochemical origin. The element Fe had the widest range of values, and all elements except Zn and Cd showed the presence of outliers, suggesting that these points may be liable to contributions from human activities. The presence of human influence was verified, because element concentrations increase from point 11 onward, which is located downstream of the consolidated urban area. The experimental

  17. Focused breastfeeding counselling improves short- and long-term success in an early-discharge setting: A cluster-randomized study.

    PubMed

    Nilsson, Ingrid M S; Strandberg-Larsen, Katrine; Knight, Christopher H; Hansen, Anne Vinkel; Kronborg, Hanne

    2017-10-01

    Length of postnatal hospitalization has decreased and has been shown to be associated with infant nutritional problems and an increase in readmissions. We aimed to evaluate whether guidelines for breastfeeding counselling in an early discharge hospital setting had an effect on maternal breastfeeding self-efficacy, infant readmission and breastfeeding duration. A cluster randomized trial was conducted that assigned nine maternity settings in Denmark to intervention or usual care. Women were eligible if they expected a single infant, intended to breastfeed, were able to read Danish, and expected to be discharged within 50 hr postnatally. Between April 2013 and August 2014, 2,065 mothers were recruited at intervention and 1,476 at reference settings. Results show that the intervention did not affect maternal breastfeeding self-efficacy (primary outcome). However, fewer infants were readmitted 1 week postnatally in the intervention compared to the reference group (adjusted OR 0.55, 95% CI 0.37-0.81), and 6 months following birth, more infants were exclusively breastfed in the intervention group (adjusted OR 1.36, 95% CI 1.02-1.81). Moreover, mothers in the intervention group compared to the reference group were breastfeeding more frequently (p < .001) and spent more hours skin to skin with their infants (p < .001). The infants were less often treated for jaundice (p = .003) and there was more paternal involvement (p = .037). In an early discharge hospital setting, a focused breastfeeding programme concentrating on increased skin-to-skin contact, frequent breastfeeding, good positioning of the mother-infant dyad, and enhanced involvement of the father improved short-term and long-term breastfeeding success. © 2017 John Wiley & Sons Ltd.

  18. Cognitive remediation for individuals with psychosis in a supported education setting: a randomized controlled trial.

    PubMed

    Kidd, Sean A; Kaur, Jaswant; Virdee, Gursharan; George, Tony P; McKenzie, Kwame; Herman, Yarissa

    2014-08-01

    Cognitive remediation (CR) has demonstrated good outcomes when paired with supported employment; however, little is known about its effectiveness when integrated into a supported education program. This randomized controlled trial examined the effectiveness of integrating CR within a supported education program compared with supported education without CR. Thirty-seven students with psychosis were recruited into the study in the 2012 academic year. Academic functioning, cognition, self-esteem, and symptomatology were assessed at baseline, at 4 months following the first academic term in which CR was provided, and at 8 months to assess maintenance of gains. The treatment group demonstrated better retention in the academic program and a trend of improvement across a range of academic functional domains. While both treatment and control groups showed improvement in cognitive measures, the outcomes were not augmented by CR training. CR was also associated with significant and sustained improvements in self-esteem. Further research investigating specific intervention components is required to clarify the mixed findings regarding the effectiveness of CR in an education setting. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. Groundwater-Quality Impacts from Natural-Gas Wellbore Leakage: Numerical Sensitivity Analysis of Hydrogeologic, Geostatistical, and Source-Term Parameterization at Varying Depths

    NASA Astrophysics Data System (ADS)

    Rice, A. K.; McCray, J. E.; Singha, K.

    2016-12-01

    The development of directional drilling and stimulation of reservoirs by hydraulic fracturing has transformed the energy landscape in the U.S. by making recovery of hydrocarbons from shale formations not only possible but economically viable. Activities associated with hydraulic fracturing present a set of water-quality challenges, including the potential for impaired groundwater quality. In this project, we use a three-dimensional, multiphase, multicomponent numerical model to investigate hydrogeologic conditions that could lead to groundwater contamination from natural gas wellbore leakage. This work explores the fate of methane that enters a well annulus, possibly from an intermediate formation or from the production zone via a flawed cement seal, and leaves the annulus at one of two depths: at the elevation of groundwater or below a freshwater aquifer. The latter leakage scenario is largely ignored in the current scientific literature, where focus has been on leakage directly into freshwater aquifers, despite modern regulations requiring steel casings and cement sheaths at these depths. We perform a three-stage sensitivity analysis, examining (1) hydrogeologic parameters of media surrounding a methane leakage source zone, (2) geostatistical variations in intrinsic permeability, and (3) methane source zone pressurization. Results indicate that in all cases methane reaches groundwater within the first year of leakage. To our knowledge, this is the first study to consider natural gas wellbore leakage in the context of multiphase flow through heterogeneous permeable media; advantages of multiphase modeling include more realistic analysis of methane vapor-phase relative permeability as compared to single-phase models. These results can be used to inform assessment of aquifer vulnerability to hydrocarbon wellbore leakage at varying depths.

  20. Provider Training to Screen and Initiate Evidence-Based Pediatric Obesity Treatment in Routine Practice Settings: A Randomized Pilot Trial.

    PubMed

    Kolko, Rachel P; Kass, Andrea E; Hayes, Jacqueline F; Levine, Michele D; Garbutt, Jane M; Proctor, Enola K; Wilfley, Denise E

    This randomized pilot trial evaluated two training modalities for first-line, evidence-based pediatric obesity services (screening and goal setting) among nursing students. Participants (N = 63) were randomized to live interactive training or Web-facilitated self-study training. Pretraining, post-training, and 1-month follow-up assessments evaluated training feasibility, acceptability, and impact (knowledge and skill via simulation). Moderator (previous experience) and predictor (content engagement) analyses were conducted. Nearly all participants (98%) completed assessments. Both types of training were acceptable, with higher ratings for live training and participants with previous experience (ps < .05). Knowledge and skill improved from pretraining to post-training and follow-up in both conditions (ps < .001). Live training demonstrated greater content engagement (p < .01). The training package was feasible, acceptable, and efficacious among nursing students. Given that live training had higher acceptability and engagement and online training offers greater scalability, integrating interactive live training components within Web-based training may optimize outcomes, which may enhance practitioners' delivery of pediatric obesity services. Copyright © 2016 National Association of Pediatric Nurse Practitioners. Published by Elsevier Inc. All rights reserved.

  1. Epidemiological study of hazelnut bacterial blight in central Italy by using laboratory analysis and geostatistics.

    PubMed

    Lamichhane, Jay Ram; Fabi, Alfredo; Ridolfi, Roberto; Varvaro, Leonardo

    2013-01-01

    Incidence of Xanthomonas arboricola pv. corylina, the causal agent of hazelnut bacterial blight, was analyzed spatially in relation to pedoclimatic factors. Hazelnut grown in twelve municipalities situated in the province of Viterbo, central Italy, was studied. A considerable number of bacterial isolates were obtained from infected hazelnut tissues collected over three years (2010-2012). The isolates, characterized by phenotypic tests, did not show any difference among them. Spatial patterns of pedoclimatic data, analyzed by geostatistics, showed a strong positive correlation of disease incidence with higher values of rainfall, thermal shock and soil nitrogen; a weak positive correlation with soil aluminium content; and a strong negative correlation with the values of the Mg/K ratio. No correlation of disease incidence was found with soil pH. Disease incidence ranged from very low (<1%) to very high (almost 75%) across the orchards. Young plants (4 years old) were the most affected by the disease, confirming a weak negative correlation of disease incidence with plant age. Plant cultivars did not show any difference in susceptibility to the pathogen. The possible role of climate change in the epidemiology of the disease is discussed. Improved management practices are recommended for effective control of the disease.

  2. Epidemiological Study of Hazelnut Bacterial Blight in Central Italy by Using Laboratory Analysis and Geostatistics

    PubMed Central

    Lamichhane, Jay Ram; Fabi, Alfredo; Ridolfi, Roberto; Varvaro, Leonardo

    2013-01-01

    Incidence of Xanthomonas arboricola pv. corylina, the causal agent of hazelnut bacterial blight, was analyzed spatially in relation to pedoclimatic factors. Hazelnut grown in twelve municipalities situated in the province of Viterbo, central Italy, was studied. A considerable number of bacterial isolates were obtained from infected hazelnut tissues collected over three years (2010–2012). The isolates, characterized by phenotypic tests, did not show any difference among them. Spatial patterns of pedoclimatic data, analyzed by geostatistics, showed a strong positive correlation of disease incidence with higher values of rainfall, thermal shock and soil nitrogen; a weak positive correlation with soil aluminium content; and a strong negative correlation with the values of the Mg/K ratio. No correlation of disease incidence was found with soil pH. Disease incidence ranged from very low (<1%) to very high (almost 75%) across the orchards. Young plants (4 years old) were the most affected by the disease, confirming a weak negative correlation of disease incidence with plant age. Plant cultivars did not show any difference in susceptibility to the pathogen. The possible role of climate change in the epidemiology of the disease is discussed. Improved management practices are recommended for effective control of the disease. PMID:23424654

  3. A multivariate geostatistical methodology to delineate areas of potential interest for future sedimentary gold exploration.

    PubMed

    Goovaerts, P; Albuquerque, Teresa; Antunes, Margarida

    2016-11-01

    This paper describes a multivariate geostatistical methodology to delineate areas of potential interest for future sedimentary gold exploration, with an application to an abandoned sedimentary gold mining region in Portugal. The main challenge was the existence of only a dozen gold measurements confined to the grounds of the old gold mines, which precluded the application of traditional interpolation techniques, such as cokriging. The analysis could, however, capitalize on 376 stream sediment samples that were analyzed for twenty-two elements. Gold (Au) was first predicted at all 376 locations using linear regression (R² = 0.798) and four metals (Fe, As, Sn and W), which are known to be mostly associated with the local gold's paragenesis. One hundred realizations of the spatial distribution of gold content were generated using sequential indicator simulation and a soft indicator coding of regression estimates, to supplement the hard indicator coding of gold measurements. Each simulated map then underwent a local cluster analysis to identify significant aggregates of low or high values. The one hundred classified maps were processed to derive the most likely classification of each simulated node and the associated probability of occurrence. Examining the distribution of the hot-spots and cold-spots reveals a clear enrichment in Au along the Erges River downstream from the old sedimentary mineralization.
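    The regression step can be sketched as follows. Only the method (ordinary least squares of Au on the four pathfinder metals, with an R² around 0.8) comes from the abstract; the geochemical data below are simulated stand-ins:

```python
import numpy as np

# Hypothetical stream-sediment geochemistry: columns Fe, As, Sn, W (ppm)
rng = np.random.default_rng(0)
X = rng.lognormal(mean=2.0, sigma=0.5, size=(376, 4))
# Synthetic Au response loosely tied to the pathfinder metals, plus noise
true_coef = np.array([0.8, 1.5, 0.6, 1.1])
au = X @ true_coef + rng.normal(scale=2.0, size=376)

# Fit the linear regression Au ~ Fe + As + Sn + W by ordinary least squares
X1 = np.column_stack([np.ones(len(X)), X])          # add intercept column
coef, *_ = np.linalg.lstsq(X1, au, rcond=None)
au_hat = X1 @ coef

# Coefficient of determination, analogous to the paper's R^2 = 0.798
ss_res = np.sum((au - au_hat) ** 2)
ss_tot = np.sum((au - au.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"R^2 = {r2:.3f}")
```

    In the paper these regression estimates are then soft-coded as indicator probabilities and fed into sequential indicator simulation, which is not reproduced here.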

  4. Kangaroo mother care for low birthweight infants: a randomized controlled trial in different settings.

    PubMed

    Cattaneo, A; Davanzo, R; Worku, B; Surjono, A; Echeverria, M; Bedri, A; Haksari, E; Osorno, L; Gudetta, B; Setyowireni, D; Quintero, S; Tamburlini, G

    1998-09-01

    A randomized controlled trial was carried out for 1 y in three tertiary and teaching hospitals, in Addis Ababa (Ethiopia), Yogyakarta (Indonesia) and Merida (Mexico), to study the effectiveness, feasibility, acceptability and cost of kangaroo mother care (KMC) when compared to conventional methods of care (CMC). About 29% of 649 low birthweight infants (LBWI; 1000-1999 g) died before eligibility. Of the survivors, 38% were excluded for various reasons, 149 were randomly assigned to KMC (almost exclusive skin-to-skin care after stabilization), and 136 to CMC (warm room or incubator care). There were three deaths in each group and no difference in the incidence of severe disease. Hypothermia was significantly less common in KMC infants in Merida (13.5 vs 31.5 episodes/100 infants/d) and overall (10.8 vs 14.6). Exclusive breastfeeding at discharge was more common in KMC infants in Merida (80% vs 16%) and overall (88% vs 70%). KMC infants had a higher mean daily weight gain (21.3 g vs 17.7 g) and were discharged earlier (13.4 vs 16.3 d after enrolment). KMC was considered feasible and presented advantages over CMC in terms of maintenance of equipment. Mothers expressed a clear preference for KMC and health workers found it safe and convenient. KMC was cheaper than CMC in terms of salaries (US$ 11,788 vs US$ 29,888) and other running costs (US$ 7501 vs US$ 9876). This study confirms that hospital KMC for stabilized LBWI 1000-1999 g is at least as effective and safe as CMC, and shows that it is feasible in different settings, acceptable to mothers of different cultures, and less expensive. Where exclusive breastfeeding is uncommon among LBWI, KMC may bring about an increase in its prevalence and duration, with consequent benefits for health and growth. For hospitals in low-income countries KMC may represent an appropriate use of scarce resources.

  5. Uncovering hidden heterogeneity: Geo-statistical models illuminate the fine scale effects of boating infrastructure on sediment characteristics and contaminants.

    PubMed

    Hedge, L H; Dafforn, K A; Simpson, S L; Johnston, E L

    2017-06-30

    Infrastructure associated with coastal communities is likely to not only directly displace natural systems, but also leave environmental 'footprints' that stretch over multiple scales. Some coastal infrastructure will, therefore, generate a hidden layer of habitat heterogeneity in sediment systems that is not immediately observable in classical impact assessment frameworks. We examine the hidden heterogeneity associated with one of the most ubiquitous coastal modifications: dense swing mooring fields. Using a model-based geo-statistical framework we highlight the variation in sedimentology throughout mooring fields and reference locations. Moorings were correlated with patches of sediment with larger particle sizes, and associated metal(loid) concentrations in these patches were depressed. Our work highlights two important ideas: i) mooring fields create a mosaic of habitat in which contamination decreases and grain sizes increase close to moorings, and ii) model-based frameworks provide an information-rich, easy-to-interpret way to communicate complex analyses to stakeholders. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  6. Discretisation Schemes for Level Sets of Planar Gaussian Fields

    NASA Astrophysics Data System (ADS)

    Beliaev, D.; Muirhead, S.

    2018-01-01

    Smooth random Gaussian functions play an important role in mathematical physics, a main example being the random plane wave model conjectured by Berry to give a universal description of high-energy eigenfunctions of the Laplacian on generic compact manifolds. Our work is motivated by questions about the geometry of such random functions, in particular relating to the structure of their nodal and level sets. We study four discretisation schemes that extract information about level sets of planar Gaussian fields. Each scheme recovers information up to a different level of precision, and each requires a maximum mesh-size in order to be valid with high probability. The first two schemes are generalisations and enhancements of similar schemes that have appeared in the literature (Beffara and Gayet in Publ Math IHES, 2017. https://doi.org/10.1007/s10240-017-0093-0; Mischaikow and Wanner in Ann Appl Probab 17:980-1018, 2007); these give complete topological information about the level sets on either a local or global scale. As an application, we improve the results in Beffara and Gayet (2017) on Russo-Seymour-Welsh estimates for the nodal set of positively-correlated planar Gaussian fields. The third and fourth schemes are, to the best of our knowledge, completely new. The third scheme is specific to the nodal set of the random plane wave, and provides global topological information about the nodal set up to 'visible ambiguities'. The fourth scheme gives a way to approximate the mean number of excursion domains of planar Gaussian fields.
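    As a rough illustration of working with discretised level sets (a toy exercise, not one of the paper's four schemes), the sketch below generates a smooth stationary planar Gaussian field on a mesh by low-pass filtering white noise and counts the connected excursion domains of {f > 0}; the grid size and filter width are arbitrary assumptions:

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(1)
n = 64
noise = rng.normal(size=(n, n))

# Smooth white noise with a Gaussian filter in Fourier space to obtain a
# stationary, smooth Gaussian field (a crude stand-in for, e.g., the
# random plane wave -- an illustrative assumption only).
kx = np.fft.fftfreq(n)[:, None]
ky = np.fft.fftfreq(n)[None, :]
kernel = np.exp(-(kx**2 + ky**2) / (2 * 0.05**2))
field = np.real(np.fft.ifft2(np.fft.fft2(noise) * kernel))
field -= field.mean()

# Excursion set {f > 0} on the mesh; count its connected components
# (4-neighbour flood fill), i.e. the positive excursion domains.
mask = field > 0
seen = np.zeros_like(mask, dtype=bool)

def flood(i0, j0):
    q = deque([(i0, j0)])
    seen[i0, j0] = True
    while q:
        i, j = q.popleft()
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            a, b = i + di, j + dj
            if 0 <= a < n and 0 <= b < n and mask[a, b] and not seen[a, b]:
                seen[a, b] = True
                q.append((a, b))

components = 0
for i in range(n):
    for j in range(n):
        if mask[i, j] and not seen[i, j]:
            components += 1
            flood(i, j)
print("positive excursion domains:", components)
```

    The paper's point is precisely that such mesh-based counts are only reliable below a maximum mesh-size; refining n here changes the count until the discretisation resolves the field's correlation length.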

  7. National-scale aboveground biomass geostatistical mapping with FIA inventory and GLAS data: Preparation for sparsely sampled lidar assisted forest inventory

    NASA Astrophysics Data System (ADS)

    Babcock, C. R.; Finley, A. O.; Andersen, H. E.; Moskal, L. M.; Morton, D. C.; Cook, B.; Nelson, R.

    2017-12-01

    Upcoming satellite lidar missions, such as GEDI and IceSat-2, are designed to collect laser altimetry data from space for narrow bands along orbital tracts. As a result lidar metric sets derived from these sources will not be of complete spatial coverage. This lack of complete coverage, or sparsity, means traditional regression approaches that consider lidar metrics as explanatory variables (without error) cannot be used to generate wall-to-wall maps of forest inventory variables. We implement a coregionalization framework to jointly model sparsely sampled lidar information and point-referenced forest variable measurements to create wall-to-wall maps with full probabilistic uncertainty quantification of all inputs. We inform the model with USFS Forest Inventory and Analysis (FIA) in-situ forest measurements and GLAS lidar data to spatially predict aboveground forest biomass (AGB) across the contiguous US. We cast our model within a Bayesian hierarchical framework to better model complex space-varying correlation structures among the lidar metrics and FIA data, which yields improved prediction and uncertainty assessment. To circumvent computational difficulties that arise when fitting complex geostatistical models to massive datasets, we use a Nearest Neighbor Gaussian process (NNGP) prior. Results indicate that a coregionalization modeling approach to leveraging sampled lidar data to improve AGB estimation is effective. Further, fitting the coregionalization model within a Bayesian mode of inference allows for AGB quantification across scales ranging from individual pixel estimates of AGB density to total AGB for the continental US with uncertainty. The coregionalization framework examined here is directly applicable to future spaceborne lidar acquisitions from GEDI and IceSat-2. Pairing these lidar sources with the extensive FIA forest monitoring plot network using a joint prediction framework, such as the coregionalization model explored here, offers the

  8. Accounting for geophysical information in geostatistical characterization of unexploded ordnance (UXO) sites.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saito, Hirotaka; Goovaerts, Pierre; McKenna, Sean Andrew

    2003-06-01

    Efficient and reliable unexploded ordnance (UXO) site characterization is needed for decisions regarding future land use. There are several types of data available at UXO sites and geophysical signal maps are one of the most valuable sources of information. Incorporation of such information into site characterization requires a flexible and reliable methodology. Geostatistics allows one to account for exhaustive secondary information (i.e., known at every location within the field) in many different ways. Kriging and logistic regression were combined to map the probability of occurrence of at least one geophysical anomaly of interest, such as UXO, from a limited number of indicator data. Logistic regression is used to derive the trend from a geophysical signal map, and kriged residuals are added to the trend to estimate the probabilities of the presence of UXO at unsampled locations (simple kriging with varying local means or SKlm). Each location is identified for further remedial action if the estimated probability is greater than a given threshold. The technique is illustrated using a hypothetical UXO site generated by a UXO simulator, and a corresponding geophysical signal map. Indicator data are collected along two transects located within the site. Classification performances are then assessed by computing proportions of correct classification, false positive, false negative, and Kappa statistics. Two common approaches, one of which does not take any secondary information into account (ordinary indicator kriging) and a variant of common cokriging (collocated cokriging), were used for comparison purposes. Results indicate that accounting for exhaustive secondary information improves the overall characterization of UXO sites if an appropriate methodology, SKlm in this case, is used.
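    The SKlm estimator described above can be sketched on a toy 1-D transect. The logistic-regression coefficients, covariance parameters, and indicator data below are all hypothetical, chosen only to illustrate adding simple-kriged residuals back onto a regression-derived local mean:

```python
import numpy as np

# Sampled transect: locations, binary UXO indicators, and a geophysical
# signal observable everywhere (the exhaustive secondary variable).
x_data = np.array([2.0, 5.0, 9.0, 14.0, 20.0, 27.0, 33.0, 40.0])
ind = np.array([0, 1, 1, 0, 1, 0, 0, 1], dtype=float)
signal = lambda x: 0.1 * x  # hypothetical geophysical signal map

# Step 1: trend = logistic regression of the indicator on the signal.
# Coefficients are assumed here rather than fitted, for brevity.
b0, b1 = -1.0, 0.4
trend = lambda x: 1.0 / (1.0 + np.exp(-(b0 + b1 * signal(x))))

# Step 2: simple kriging of the residuals with an exponential covariance,
# then add the kriged residual back onto the local mean (SKlm).
sill, corr_range = 0.15, 8.0
cov = lambda h: sill * np.exp(-np.abs(h) / corr_range)

resid = ind - trend(x_data)
C = cov(x_data[:, None] - x_data[None, :])   # data-to-data covariances
x_new = np.linspace(0, 45, 10)
k = cov(x_new[:, None] - x_data[None, :])    # prediction-to-data covariances
weights = np.linalg.solve(C, k.T)            # simple-kriging weights
prob = trend(x_new) + weights.T @ resid
prob = np.clip(prob, 0.0, 1.0)               # keep estimates in [0, 1]
print(np.round(prob, 3))
```

    Locations whose estimated probability exceeds a chosen threshold would then be flagged for remedial action, as in the paper's classification step.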

  9. Adjusting for sampling variability in sparse data: geostatistical approaches to disease mapping

    PubMed Central

    2011-01-01

    Background Disease maps of crude rates from routinely collected health data indexed at a small geographical resolution pose specific statistical problems due to the sparse nature of the data. Spatial smoothers allow areas to borrow strength from neighboring regions to produce a more stable estimate of the areal value. Geostatistical smoothers are able to quantify the uncertainty in smoothed rate estimates without a high computational burden. In this paper, we introduce a uniform model extension of Bayesian Maximum Entropy (UMBME) and compare its performance to that of Poisson kriging in measures of smoothing strength and estimation accuracy as applied to simulated data and the real data example of HIV infection in North Carolina. The aim is to produce more reliable maps of disease rates in small areas to improve identification of spatial trends at the local level. Results In all data environments, Poisson kriging exhibited greater smoothing strength than UMBME. With the simulated data where the true latent rate of infection was known, Poisson kriging resulted in greater estimation accuracy with data that displayed low spatial autocorrelation, while UMBME provided more accurate estimators with data that displayed higher spatial autocorrelation. With the HIV data, UMBME performed slightly better than Poisson kriging in cross-validatory predictive checks, with both models performing better than the observed data model with no smoothing. Conclusions Smoothing methods have different advantages depending upon both internal model assumptions that affect smoothing strength and external data environments, such as spatial correlation of the observed data. Further model comparisons in different data environments are required to provide public health practitioners with guidelines needed in choosing the most appropriate smoothing method for their particular health dataset. PMID:21978359

  10. Adjusting for sampling variability in sparse data: geostatistical approaches to disease mapping.

    PubMed

    Hampton, Kristen H; Serre, Marc L; Gesink, Dionne C; Pilcher, Christopher D; Miller, William C

    2011-10-06

    Disease maps of crude rates from routinely collected health data indexed at a small geographical resolution pose specific statistical problems due to the sparse nature of the data. Spatial smoothers allow areas to borrow strength from neighboring regions to produce a more stable estimate of the areal value. Geostatistical smoothers are able to quantify the uncertainty in smoothed rate estimates without a high computational burden. In this paper, we introduce a uniform model extension of Bayesian Maximum Entropy (UMBME) and compare its performance to that of Poisson kriging in measures of smoothing strength and estimation accuracy as applied to simulated data and the real data example of HIV infection in North Carolina. The aim is to produce more reliable maps of disease rates in small areas to improve identification of spatial trends at the local level. In all data environments, Poisson kriging exhibited greater smoothing strength than UMBME. With the simulated data where the true latent rate of infection was known, Poisson kriging resulted in greater estimation accuracy with data that displayed low spatial autocorrelation, while UMBME provided more accurate estimators with data that displayed higher spatial autocorrelation. With the HIV data, UMBME performed slightly better than Poisson kriging in cross-validatory predictive checks, with both models performing better than the observed data model with no smoothing. Smoothing methods have different advantages depending upon both internal model assumptions that affect smoothing strength and external data environments, such as spatial correlation of the observed data. Further model comparisons in different data environments are required to provide public health practitioners with guidelines needed in choosing the most appropriate smoothing method for their particular health dataset.
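    Poisson kriging itself requires population-weighted semivariogram estimation and is beyond a short sketch; the toy smoother below instead shows the shared underlying idea, shrinking noisy small-area crude rates toward a neighbourhood rate. It is a simple local shrinkage, not Poisson kriging or UMBME, and all counts and populations are invented:

```python
import numpy as np

# Toy disease-rate smoothing on a 1-D chain of small areas.
cases = np.array([0, 2, 1, 9, 3, 0, 1, 12, 2, 1])
pop = np.array([50, 400, 120, 900, 300, 40, 150, 1000, 250, 100])
crude = cases / pop

smoothed = np.empty_like(crude)
for i in range(len(cases)):
    lo, hi = max(0, i - 1), min(len(cases), i + 2)
    # Neighbourhood rate pooled over adjacent areas ("borrowing strength")
    local = cases[lo:hi].sum() / pop[lo:hi].sum()
    # Shrink the crude rate toward the local rate; small populations
    # (noisy crude rates) are shrunk harder.
    w = pop[i] / (pop[i] + pop[lo:hi].mean())
    smoothed[i] = w * crude[i] + (1 - w) * local
print(np.round(smoothed * 1000, 2))  # rates per 1,000
```

    The paper's comparison turns on exactly this trade-off: how strongly a method pulls sparse-area estimates toward their neighbours, and how that interacts with the spatial autocorrelation of the true rates.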

  11. Training-Image Based Geostatistical Inversion Using a Spatial Generative Adversarial Neural Network

    NASA Astrophysics Data System (ADS)

    Laloy, Eric; Hérault, Romain; Jacques, Diederik; Linde, Niklas

    2018-01-01

    Probabilistic inversion within a multiple-point statistics framework is often computationally prohibitive for high-dimensional problems. To partly address this, we introduce and evaluate a new training-image based inversion approach for complex geologic media. Our approach relies on a deep neural network of the generative adversarial network (GAN) type. After training using a training image (TI), our proposed spatial GAN (SGAN) can quickly generate 2-D and 3-D unconditional realizations. A key characteristic of our SGAN is that it defines a (very) low-dimensional parameterization, thereby allowing for efficient probabilistic inversion using state-of-the-art Markov chain Monte Carlo (MCMC) methods. In addition, available direct conditioning data can be incorporated within the inversion. Several 2-D and 3-D categorical TIs are first used to analyze the performance of our SGAN for unconditional geostatistical simulation. Training our deep network can take several hours. After training, realizations containing a few million pixels/voxels can be produced in a matter of seconds. This makes it especially useful for simulating many thousands of realizations (e.g., for MCMC inversion) as the relative cost of the training per realization diminishes with the considered number of realizations. Synthetic inversion case studies involving 2-D steady state flow and 3-D transient hydraulic tomography with and without direct conditioning data are used to illustrate the effectiveness of our proposed SGAN-based inversion. For the 2-D case, the inversion rapidly explores the posterior model distribution. For the 3-D case, the inversion recovers model realizations that fit the data close to the target level and visually resemble the true model well.
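    The key idea, running MCMC in a low-dimensional latent space that feeds a generator, can be sketched with a toy stand-in for the trained network. The generator, forward operator, noise level, and prior below are invented for illustration and bear no relation to the paper's actual SGAN:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stand-in for a trained generator: maps a 2-D latent vector to a
# 1-D "property profile" (the real SGAN maps latents to 2-D/3-D fields).
x = np.linspace(0, 1, 50)
def generator(z):
    return np.exp(z[0] * np.sin(2 * np.pi * x) + z[1] * x)

# Synthetic truth and noisy datum from a trivial forward operator (averaging).
z_true = np.array([0.8, -0.5])
data = generator(z_true).mean() + rng.normal(scale=0.01)

def log_post(z):
    # Gaussian prior on the latents + Gaussian likelihood on the data misfit
    misfit = generator(z).mean() - data
    return -0.5 * z @ z - 0.5 * (misfit / 0.01) ** 2

# Random-walk Metropolis in the low-dimensional latent space
z = np.zeros(2)
lp = log_post(z)
samples = []
for _ in range(5000):
    z_prop = z + rng.normal(scale=0.2, size=2)
    lp_prop = log_post(z_prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        z, lp = z_prop, lp_prop
    samples.append(z.copy())
samples = np.array(samples[1000:])  # discard burn-in
print("posterior mean latent:", np.round(samples.mean(axis=0), 2))
```

    The payoff of the SGAN parameterization is that the chain walks in a handful of latent dimensions while every accepted state still decodes to a geologically plausible realization.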

  12. Utilizing Maximal Independent Sets as Dominating Sets in Scale-Free Networks

    NASA Astrophysics Data System (ADS)

    Derzsy, N.; Molnar, F., Jr.; Szymanski, B. K.; Korniss, G.

    Dominating sets provide key solution to various critical problems in networked systems, such as detecting, monitoring, or controlling the behavior of nodes. Motivated by graph theory literature [Erdos, Israel J. Math. 4, 233 (1966)], we studied maximal independent sets (MIS) as dominating sets in scale-free networks. We investigated the scaling behavior of the size of MIS in artificial scale-free networks with respect to multiple topological properties (size, average degree, power-law exponent, assortativity), evaluated its resilience to network damage resulting from random failure or targeted attack [Molnar et al., Sci. Rep. 5, 8321 (2015)], and compared its efficiency to previously proposed dominating set selection strategies. We showed that, despite its small set size, MIS provides very high resilience against network damage. Using extensive numerical analysis on both synthetic and real-world (social, biological, technological) network samples, we demonstrate that our method effectively satisfies four essential requirements of dominating sets for their practical applicability on large-scale real-world systems: 1.) small set size, 2.) minimal network information required for their construction scheme, 3.) fast and easy computational implementation, and 4.) resiliency to network damage. Supported by DARPA, DTRA, and NSF.
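    A minimal sketch of the construction, using the standard greedy sequential MIS scheme on a small hand-built graph (the graph itself is invented for illustration), together with checks of the independence and domination properties:

```python
import random

def maximal_independent_set(adj, seed=0):
    """Greedy maximal independent set: scan nodes in random order,
    adding a node whenever none of its neighbours is already in the set."""
    nodes = list(adj)
    random.Random(seed).shuffle(nodes)
    mis = set()
    for v in nodes:
        if not adj[v] & mis:
            mis.add(v)
    return mis

# Small graph as adjacency sets. Any MIS is also a dominating set: by
# maximality, every node outside the set has a neighbour inside it.
adj = {
    0: {1, 2}, 1: {0, 3}, 2: {0, 3, 4},
    3: {1, 2, 5}, 4: {2, 5}, 5: {3, 4},
}
mis = maximal_independent_set(adj)
assert all(not (adj[u] & mis) for u in mis)            # independence
assert all(u in mis or adj[u] & mis for u in adj)      # domination
print("MIS:", sorted(mis))
```

    The scheme needs only local neighbourhood information, which is why the abstract can claim minimal network information and easy implementation at scale.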

  13. A Pilot Randomized Controlled Trial of an Internet-Based Alcohol Intervention in a Workplace Setting.

    PubMed

    Brendryen, Håvar; Johansen, Ayna; Duckert, Fanny; Nesvåg, Sverre

    2017-10-01

    The aim of this study was to compare the effectiveness of a brief and an intensive self-help alcohol intervention and to assess the feasibility of recruiting to such interventions in a workplace setting. Employees who screened positive for hazardous drinking (n = 85) received online personalized normative feedback and were randomly assigned to one out of two conditions: either they received an e-booklet about the effects of alcohol or they received a self-help intervention comprising 62 web-based, fully automated, and interactive sessions, plus reminder e-mails, and mobile phone text messages (Short Message Service). Two months after baseline, the responders in the intensive condition drank an average of five to six drinks less per week compared to the responders in the brief condition (B = 5.68, 95% CI = 0.48-10.87, P = .03). There was no significant difference between conditions, using baseline observation carried forward imputation (B = 2.96, 95% CI = -0.50-6.42, P = .09). Six months after baseline, no significant difference was found, neither based on complete cases nor intent-to-treat (B = 1.07, 95% CI = -1.29-3.44, P = .37). Challenges with recruitment are thoroughly reported. The study supports the feasibility and the safety of use for both brief and intensive Internet-based self-help in an occupational setting. The study may inform future trials, but due to recruitment problems and low statistical power, the findings are inconclusive in terms of the intensive program being more effective than brief intervention alone. ClinicalTrials.gov Identifier: NCT01931618.

  14. Identifying and closing gaps in environmental monitoring by means of metadata, ecological regionalization and geostatistics using the UNESCO biosphere reserve Rhoen (Germany) as an example.

    PubMed

    Schröder, Winfried; Pesch, Roland; Schmidt, Gunther

    2006-03-01

    In Germany, environmental monitoring is intended to provide a holistic view of the environmental condition. To this end the monitoring operated by the federal states must use harmonized or standardized methods. In addition, the monitoring sites should cover the ecoregions without any geographical gaps, the monitoring design should have no gaps in terms of ecologically relevant measurement parameters, and the sample data should be spatially without any gaps. This article outlines the extent to which the Rhoen Biosphere Reserve, occupying a part of the German federal states of Bavaria, Hesse and Thuringia, fulfills the listed requirements. The investigation covered the collection, data banking and analysis of monitoring data and metadata, ecological regionalization and geostatistics. Metadata on the monitoring networks were collected by questionnaires and provided a complete inventory and description of the monitoring activities in the reserve and its surroundings. The analysis of these metadata reveals that most of the monitoring methods are harmonized across the boundaries of the three federal states the Rhoen is part of. The monitoring networks that measure precipitation, surface water levels, and groundwater quality are particularly overrepresented in the central ecoregions of the biosphere reserve. Soil monitoring sites are more equally distributed within the ecoregions of the Rhoen. The number of sites for the monitoring of air pollutants is not sufficient to draw spatially valid conclusions. To fill these spatial gaps, additional data on the annual average values of the concentrations of air pollutants from monitoring sites outside of the biosphere reserve had therefore been subject to geostatistical analysis and estimation. This yields valid information on the spatial patterns and temporal trends of air quality. The approach illustrated is applicable to similar cases, as, for example, the harmonization of international monitoring networks.

  15. Spatial heterogeneity and risk factors for stunting among children under age five in Ethiopia: A Bayesian geo-statistical model.

    PubMed

    Hagos, Seifu; Hailemariam, Damen; WoldeHanna, Tasew; Lindtjørn, Bernt

    2017-01-01

    Understanding the spatial distribution of stunting and the underlying factors operating at meso-scale is of paramount importance for intervention design and implementation. Yet, little is known about the spatial distribution of stunting, and some discrepancies are documented on the relative importance of reported risk factors. Therefore, the present study aims at exploring the spatial distribution of stunting at meso- (district) scale, and evaluates the effect of spatial dependency on the identification of risk factors and their relative contribution to the occurrence of stunting and severe stunting in a rural area of Ethiopia. A community-based cross-sectional study was conducted to measure the occurrence of stunting and severe stunting among children aged 0-59 months. Additionally, we collected relevant information on anthropometric measures, dietary habits, parent and child-related demographic and socio-economic status. Latitude and longitude of surveyed households were also recorded. Local Anselin Moran's I was calculated to investigate the spatial variation of stunting prevalence and identify potential local pockets (hotspots) of high prevalence. Finally, we employed a Bayesian geo-statistical model, which accounted for spatial dependency structure in the data, to identify potential risk factors for stunting in the study area. Overall, the prevalence of stunting and severe stunting in the district was 43.7% [95%CI: 40.9, 46.4] and 21.3% [95%CI: 19.5, 23.3] respectively. We identified statistically significant clusters of high prevalence of stunting (hotspots) in the eastern part of the district and clusters of low prevalence (cold spots) in the western. We found out that the inclusion of spatial structure of the data into the Bayesian model has shown to improve the fit for stunting model. The Bayesian geo-statistical model indicated that the risk of stunting increased as the child's age increased (OR 4.74; 95% Bayesian credible interval [BCI]: 3.35-6.58) and
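    The hotspot-detection step (Local Moran's I) can be illustrated on a toy lattice. The prevalence values, lattice layout, and rook-contiguity weights below are assumptions for illustration, not the study's data:

```python
import numpy as np

# Hypothetical district-level stunting prevalence (%) on a 5x5 lattice,
# with a cluster of high values in the east (right-hand columns).
rates = np.array([
    [30, 32, 35, 55, 60],
    [28, 31, 36, 58, 62],
    [27, 30, 34, 54, 59],
    [29, 33, 35, 52, 57],
    [26, 29, 33, 50, 56],
], dtype=float)

z = (rates - rates.mean()) / rates.std()
n_rows, n_cols = z.shape

# Local Moran's I_i = z_i * mean(z_j over rook neighbours j of i)
local_i = np.zeros_like(z)
for i in range(n_rows):
    for j in range(n_cols):
        nbrs = [z[a, b] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                if 0 <= a < n_rows and 0 <= b < n_cols]
        local_i[i, j] = z[i, j] * np.mean(nbrs)

# Positive I with z > 0: high values surrounded by high values (hotspot),
# analogous to the eastern stunting clusters reported in the study.
hotspots = (local_i > 0) & (z > 0)
print("hotspot cells:", int(hotspots.sum()))
```

    A full analysis would also attach significance to each local statistic via conditional permutation, which is omitted here.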

  16. Estimating the Depth of Stratigraphic Units from Marine Seismic Profiles Using Nonstationary Geostatistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chihi, Hayet; Galli, Alain; Ravenne, Christian

    2000-03-15

    The object of this study is to build a three-dimensional (3D) geometric model of the stratigraphic units of the margin of the Rhone River on the basis of geophysical investigations by a network of seismic profiles at sea. The geometry of these units is described by depth charts of each surface identified by seismic profiling, constructed using geostatistics. The modeling starts with a statistical analysis in which we determine the parameters that enable us to calculate the variograms of the identified surfaces. After determining the statistical parameters, we calculate the variograms of the variable Depth. By analyzing the behavior of the variogram we can then deduce whether the situation is stationary and whether the variable behaves anisotropically. We tried the following two nonstationary methods to obtain our estimates: (a) the method of universal kriging, if the underlying variogram was directly accessible; (b) the method of increments, if the underlying variogram was not directly accessible. After modeling the variograms of the increments and of the variable itself, we calculated the surfaces by kriging the variable Depth on a small-mesh estimation grid. The two methods are then compared and their respective advantages and disadvantages discussed, as well as their fields of application. These methods can be used widely in the earth sciences for automatic mapping of geometric surfaces, or for variables such as a piezometric surface or a concentration, which are not 'stationary,' that is, which possess a gradient or a tendency to develop systematically in space.
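
    The structural-analysis step, computing an experimental variogram of the variable Depth, can be sketched as follows (the coordinates and depths are synthetic, with a deliberate trend; the lag spacing and tolerance are arbitrary choices):

```python
import numpy as np

def experimental_variogram(coords, values, lags, tol):
    """Isotropic experimental semivariogram:
    gamma(h) = 0.5 * mean((Z(x_i) - Z(x_j))^2) over pairs with |x_i - x_j| ~ h.
    """
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    iu = np.triu_indices(len(values), k=1)           # each pair once
    dist = d[iu]
    sq = 0.5 * (values[:, None] - values[None, :])[iu] ** 2
    gamma = []
    for h in lags:
        mask = np.abs(dist - h) <= tol
        gamma.append(sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# synthetic 'Depth' with a smooth spatial gradient plus noise
rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(200, 2))
depth = 0.05 * coords[:, 0] + rng.normal(0, 0.5, 200)

lags = np.array([5.0, 20.0, 60.0])
g = experimental_variogram(coords, depth, lags, tol=5.0)
```

A variogram that keeps growing with lag instead of levelling off at a sill, as here, is the signature of the nonstationarity (a spatial gradient) that motivates universal kriging or the method of increments.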

  17. Modelling wildland fire propagation by tracking random fronts

    NASA Astrophysics Data System (ADS)

    Pagnini, G.; Mentrelli, A.

    2013-11-01

    Wildland fire propagation is studied in the literature by two alternative approaches, namely the reaction-diffusion equation and the level-set method. These two approaches are considered alternatives to each other because the solution of the reaction-diffusion equation is generally a continuous smooth function with an exponential decay and an infinite support, while the level-set method, which is a front-tracking technique, generates a sharp function with a finite support. However, these two approaches can in fact be considered complementary and reconciled. Turbulent hot-air transport and fire spotting are phenomena with a random character that are extremely important in wildland fire propagation; as a consequence the fire front acquires a random character too, and a tracking method for random fronts is needed. In particular, the level-set contour is here randomized according to the probability density function of the interface particle displacement. When the level-set method is developed for tracking a front interface with a random motion, the resulting averaged process turns out to be governed by an evolution equation of the reaction-diffusion type. In this reconciled approach, the rate of spread of the fire keeps the same key, characterizing role it has in the level-set approach. The resulting model is suitable for simulating effects due to turbulent convection, such as flanking and backing fire, the faster fire spread caused by hot-air pre-heating and ember landing, and also the fire overcoming a firebreak zone, a case not resolved by models based on the level-set method. Moreover, the proposed formulation yields a correction to the rate-of-spread formula due to the mean jump length of firebrands in the downwind direction for the leeward sector of the fireline contour.

  18. Geostatistical analysis and isoscape of ice core derived water stable isotope records in an Antarctic macro region

    NASA Astrophysics Data System (ADS)

    Hatvani, István Gábor; Leuenberger, Markus; Kohán, Balázs; Kern, Zoltán

    2017-09-01

    Water stable isotopes preserved in ice cores provide essential information about polar precipitation. In the present study, multivariate regression and variogram analyses were conducted on 22 δ2H and 53 δ18O records from 60 ice cores covering the second half of the 20th century. Taking into account the multicollinearity of the explanatory variables, as well as the model's adjusted R2 and its mean absolute error, longitude, elevation and distance from the coast were found to be the main independent geographical driving factors governing the spatial δ18O variability of firn/ice in the chosen Antarctic macro region. After removing the effects of these factors, variography was used to obtain the interpolation weights for kriging and to reveal the spatial autocorrelation structure of the dataset, which indicates an average area of influence with a radius of 350 km. This allows the determination of areas not yet covered by the spatial variability of the existing network of ice cores. Finally, the regional isoscape was obtained for the study area; this may be considered a first step towards a geostatistically improved isoscape for Antarctica.

  19. The effect of automated text messaging and goal setting on pedometer adherence and physical activity in patients with diabetes: A randomized controlled trial.

    PubMed

    Polgreen, Linnea A; Anthony, Christopher; Carr, Lucas; Simmering, Jacob E; Evans, Nicholas J; Foster, Eric D; Segre, Alberto M; Cremer, James F; Polgreen, Philip M

    2018-01-01

    Activity-monitoring devices may increase activity, but their effectiveness in sedentary, diseased, and less-motivated populations is unknown. Subjects with diabetes or pre-diabetes were given a Fitbit and randomized into three groups: Fitbit only, Fitbit with reminders, and Fitbit with both reminders and goal setting. Subjects in the reminders group were sent text-message reminders to wear their Fitbit. The goal-setting group was sent a daily text message asking for a step goal. All subjects had three in-person visits (baseline, 3 and 6 months). We modelled daily steps and goal setting using linear mixed-effects models. 138 subjects participated with 48 in the Fitbit-only, 44 in the reminders, and 46 in the goal-setting groups. Daily steps decreased for all groups during the study. Average daily steps were 7123, 6906, and 6854 for the Fitbit-only, the goal-setting, and the reminders groups, respectively. The reminders group was 17.2 percentage points more likely to wear their Fitbit than the Fitbit-only group. Setting a goal was associated with a significant increase of 791 daily steps, but setting more goals did not lead to step increases. In a population of patients with diabetes or pre-diabetes, individualized reminders to wear their Fitbit and elicit personal step goals did not lead to increases in daily steps, although daily steps were higher on days when goals were set. Our intervention improved engagement and data collection, important goals for activity surveillance. This study demonstrates that new, more-effective interventions for increasing activity in patients with pre-diabetes and diabetes are needed.

  20. Short-term effects of goal-setting focusing on the life goal concept on subjective well-being and treatment engagement in subacute inpatients: a quasi-randomized controlled trial

    PubMed Central

    Ogawa, Tatsuya; Omon, Kyohei; Yuda, Tomohisa; Ishigaki, Tomoya; Imai, Ryota; Ohmatsu, Satoko; Morioka, Shu

    2016-01-01

    Objective: To investigate the short-term effects of the life goal concept on subjective well-being and treatment engagement, and to determine the sample size required for a larger trial. Design: A quasi-randomized controlled trial that was not blinded. Setting: A subacute rehabilitation ward. Subjects: A total of 66 patients were randomized to a goal-setting intervention group with the life goal concept (Life Goal), a standard rehabilitation group with no goal-setting intervention (Control 1), or a goal-setting intervention group without the life goal concept (Control 2). Interventions: The goal-setting intervention in the Life Goal and Control 2 was Goal Attainment Scaling. The Life Goal patients were assessed in terms of their life goals, and the hierarchy of goals was explained. The intervention duration was four weeks. Main measures: Patients were assessed pre- and post-intervention. The outcome measures were the Hospital Anxiety and Depression Scale, 12-item General Health Questionnaire, Pittsburgh Rehabilitation Participation Scale, and Functional Independence Measure. Results: Of the 296 potential participants, 66 were enrolled; Life Goal (n = 22), Control 1 (n = 22) and Control 2 (n = 22). Anxiety was significantly lower in the Life Goal (4.1 ±3.0) than in Control 1 (6.7 ±3.4), but treatment engagement was significantly higher in the Life Goal (5.3 ±0.4) compared with both the Control 1 (4.8 ±0.6) and Control 2 (4.9 ±0.5). Conclusions: The life goal concept had a short-term effect on treatment engagement. A sample of 31 patients per group would be required for a fully powered clinical trial. PMID:27496700

  1. A Reduced-Order Successive Linear Estimator for Geostatistical Inversion and its Application in Hydraulic Tomography

    NASA Astrophysics Data System (ADS)

    Zha, Yuanyuan; Yeh, Tian-Chyi J.; Illman, Walter A.; Zeng, Wenzhi; Zhang, Yonggen; Sun, Fangqiang; Shi, Liangsheng

    2018-03-01

    Hydraulic tomography (HT) is a recently developed technology for characterizing high-resolution, site-specific heterogeneity using hydraulic data (nd) from a series of cross-hole pumping tests. To properly account for the subsurface heterogeneity and to flexibly incorporate additional information, geostatistical inverse models, which permit a large number of spatially correlated unknowns (ny), are frequently used to interpret the collected data. However, the memory storage requirements for the covariance of the unknowns (ny × ny) in these models are prodigious for large-scale 3-D problems. Moreover, the sensitivity evaluation is often computationally intensive using the traditional difference method (ny forward runs). Although the adjoint method can reduce the cost to nd forward runs, the adjoint model requires intrusive coding effort. To resolve these issues, this paper presents a Reduced-Order Successive Linear Estimator (ROSLE) for analyzing HT data. This new estimator approximates the covariance of the unknowns using a Karhunen-Loeve Expansion (KLE) truncated at order nkl, and it calculates the directional sensitivities (in the directions of the nkl eigenvectors) to form the covariance and cross-covariance used in the Successive Linear Estimator (SLE). In addition, the covariance of the unknowns is updated at every iteration by updating the eigenvalues and eigenfunctions. The computational advantages of the proposed algorithm are demonstrated through numerical experiments and a 3-D transient HT analysis of data from a highly heterogeneous field site.
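
    The storage argument can be made concrete: a rank-nkl Karhunen-Loeve truncation keeps only nkl eigenpairs of the ny × ny covariance. The sketch below uses an exponential covariance on a 1-D grid purely for illustration; the grid size, correlation length, and truncation order are arbitrary assumptions, not values from the paper:

```python
import numpy as np

# stand-in for the ny x ny covariance: exponential model on a 1-D grid
n = 400                               # number of unknowns ny
x = np.linspace(0.0, 10.0, n)
cov = np.exp(-np.abs(x[:, None] - x[None, :]) / 2.0)

# Karhunen-Loeve truncation: keep the nkl leading eigenpairs
nkl = 20
vals, vecs = np.linalg.eigh(cov)      # eigh returns ascending eigenvalues
vals = vals[::-1][:nkl]
vecs = vecs[:, ::-1][:, :nkl]
cov_kl = (vecs * vals) @ vecs.T       # rank-nkl reconstruction

energy = vals.sum() / np.trace(cov)   # fraction of total variance captured
rel_err = np.linalg.norm(cov - cov_kl) / np.linalg.norm(cov)
```

Storage drops from ny*ny entries to roughly ny*nkl, and re-updating the truncated eigenpairs at each iteration, as the ROSLE does, keeps the low-rank approximation aligned with the evolving estimate.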

  2. Assessment of groundwater and soil quality degradation using multivariate and geostatistical analyses, Dakhla Oasis, Egypt

    NASA Astrophysics Data System (ADS)

    Masoud, Alaa A.; El-Horiny, Mohamed M.; Atwia, Mohamed G.; Gemail, Khaled S.; Koike, Katsuaki

    2018-06-01

    Salinization of groundwater and soil resources has long been a serious environmental hazard in arid regions. This study was conducted to investigate and document the factors controlling such salinization and their inter-relationships in the Dakhla Oasis (Egypt). To accomplish this, 60 groundwater samples and 31 soil samples were collected in February 2014. Factor analysis (FA) and hierarchical cluster analysis (HCA) were integrated with geostatistical analyses to characterize the chemical properties of groundwater and soil and their spatial patterns, identify the factors controlling the pattern variability, and clarify the salinization mechanism. Groundwater quality standards revealed emerging salinization (av. 885.8 mg/L) and extreme concentrations of Fe2+ (av. 17.22 mg/L) and Mn2+ (av. 2.38 mg/L). Soils were highly salt-affected (av. 15.2 dS m-1) and slightly alkaline (av. pH = 7.7). Evaporation and ion-exchange processes governed the evolution of the two main water types: Na-Cl (52%) and Ca-Mg-Cl (47%), respectively. Salinization drives the chemical variability of both resources. Distinctive patterns of slight salinization marked the northern part, while intense salinization marked the middle and southern parts. Congruence between the resource clusters confirmed common geology, soil types, and urban and agricultural practices. Minimizing the environmental and socioeconomic impacts of salinization calls for a better understanding of the hydrochemical characteristics and prediction of quality changes.

  3. Combining binary decision tree and geostatistical methods to estimate snow distribution in a mountain watershed

    USGS Publications Warehouse

    Balk, Benjamin; Elder, Kelly

    2000-01-01

    We model the spatial distribution of snow across a mountain basin using an approach that combines binary decision tree and geostatistical techniques. In April 1997 and 1998, intensive snow surveys were conducted in the 6.9‐km2 Loch Vale watershed (LVWS), Rocky Mountain National Park, Colorado. Binary decision trees were used to model the large‐scale variations in snow depth, while the small‐scale variations were modeled through kriging interpolation methods. Binary decision trees related depth to the physically based independent variables of net solar radiation, elevation, slope, and vegetation cover type. These decision tree models explained 54–65% of the observed variance in the depth measurements. The tree‐based modeled depths were then subtracted from the measured depths, and the resulting residuals were spatially distributed across LVWS through kriging techniques. The kriged estimates of the residuals were added to the tree‐based modeled depths to produce a combined depth model. The combined depth estimates explained 60–85% of the variance in the measured depths. Snow densities were mapped across LVWS using regression analysis. Snow‐covered area was determined from high‐resolution aerial photographs. Combining the modeled depths and densities with a snow cover map produced estimates of the spatial distribution of snow water equivalence (SWE). This modeling approach offers improvement over previous methods of estimating SWE distribution in mountain basins.
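
    The three-step combination (fit a coarse large-scale model, interpolate its residuals, add them back) can be sketched as below. To stay self-contained, a single elevation split stands in for the decision tree and inverse-distance weighting stands in for kriging; the synthetic depths, the 0.5 split, and the train/test sizes are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic snow depths: a large-scale elevation effect plus a smooth residual
x = rng.uniform(0, 1, (150, 2))                 # site coordinates
elev = x[:, 0]                                  # stand-in covariate
depth = 2.0 + 1.5 * (elev > 0.5) + 0.3 * np.sin(4 * x[:, 1])

def idw(coords, values, targets, p=2.0):
    """Inverse-distance weighting (a simple stand-in for kriging residuals)."""
    d = np.linalg.norm(targets[:, None, :] - coords[None, :, :], axis=-1)
    w = 1.0 / np.maximum(d, 1e-9) ** p
    return (w * values[None, :]).sum(axis=1) / w.sum(axis=1)

tr, te = slice(0, 120), slice(120, None)        # train / held-out sites

# step 1: coarse model of the large-scale variation (one split = toy 'tree')
hi = depth[tr][elev[tr] > 0.5].mean()
lo = depth[tr][elev[tr] <= 0.5].mean()
tree_te = np.where(elev[te] > 0.5, hi, lo)
resid_tr = depth[tr] - np.where(elev[tr] > 0.5, hi, lo)

# steps 2 + 3: interpolate the residuals and add them back
combined_te = tree_te + idw(x[tr], resid_tr, x[te])

sse_tree = ((depth[te] - tree_te) ** 2).sum()
sse_comb = ((depth[te] - combined_te) ** 2).sum()
```

On the held-out sites the combined estimate recovers the smooth residual field that the coarse model misses, mirroring the gain in explained variance (54-65% for the trees alone versus 60-85% combined) reported above.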

  4. A multivariate geostatistical methodology to delineate areas of potential interest for future sedimentary gold exploration

    PubMed Central

    Goovaerts, P.; Albuquerque, Teresa; Antunes, Margarida

    2015-01-01

    This paper describes a multivariate geostatistical methodology to delineate areas of potential interest for future sedimentary gold exploration, with an application to an abandoned sedimentary gold mining region in Portugal. The main challenge was the existence of only a dozen gold measurements confined to the grounds of the old gold mines, which precluded the application of traditional interpolation techniques, such as cokriging. The analysis could, however, capitalize on 376 stream sediment samples that were analyzed for twenty-two elements. Gold (Au) was first predicted at all 376 locations using linear regression (R2=0.798) and four metals (Fe, As, Sn and W), which are known to be mostly associated with the local gold's paragenesis. One hundred realizations of the spatial distribution of gold content were generated using sequential indicator simulation and a soft indicator coding of regression estimates, to supplement the hard indicator coding of gold measurements. Each simulated map then underwent a local cluster analysis to identify significant aggregates of low or high values. The one hundred classified maps were processed to derive the most likely classification of each simulated node and the associated probability of occurrence. Examining the distribution of the hot-spots and cold-spots reveals a clear enrichment in Au along the Erges River downstream from the old sedimentary mineralization. PMID:27777638
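
    The final post-processing step, deriving the most likely classification of each node and its probability of occurrence from the stack of classified maps, reduces to counting class votes. A sketch (the realization array is random filler standing in for the one hundred classified maps; 0/1/2 denote cold-spot, not significant, and hot-spot):

```python
import numpy as np

rng = np.random.default_rng(3)
n_real, n_nodes = 100, 50

# stand-in for 100 classified realizations at 50 grid nodes
cls = rng.choice(3, size=(n_real, n_nodes), p=[0.2, 0.5, 0.3])

# count the votes for each class at each node
counts = np.stack([(cls == k).sum(axis=0) for k in range(3)])  # (3, n_nodes)
most_likely = counts.argmax(axis=0)            # most likely class per node
prob = counts.max(axis=0) / n_real             # its probability of occurrence
```

Nodes whose most likely class is 2 with a high probability are the delineated targets for future exploration.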

  5. The Effect of Distributed Practice in Undergraduate Statistics Homework Sets: A Randomized Trial

    ERIC Educational Resources Information Center

    Crissinger, Bryan R.

    2015-01-01

    Most homework sets in statistics courses are constructed so that students concentrate or "mass" their practice on a certain topic in one problem set. Distributed practice homework sets include review problems in each set so that practice on a topic is distributed across problem sets. There is a body of research that points to the…

  6. A connectionist-geostatistical approach for classification of deformation types in ice surfaces

    NASA Astrophysics Data System (ADS)

    Goetz-Weiss, L. R.; Herzfeld, U. C.; Hale, R. G.; Hunke, E. C.; Bobeck, J.

    2014-12-01

    Deformation is a class of highly non-linear geophysical processes from which one can infer other geophysical variables in a dynamical system. For example, in an ice-dynamic model, deformation is related to velocity, basal sliding, surface elevation changes, and the stress field at the surface as well as internal to a glacier. While many of these variables cannot be observed, the deformation state can be an observable variable, because deformation in glaciers (once a viscosity threshold is exceeded) manifests itself in crevasses. Given the amount of information that can be inferred from observing surface deformation, an automated method for classifying surface imagery becomes increasingly desirable. In this paper a Neural Network is used to recognize classes of crevasse types over the Bering Bagley Glacier System (BBGS) during a surge (2011-2013-?). A surge is a spatially and temporally highly variable and rapid acceleration of the glacier. Therefore, many different crevasse types occur in a short time frame and in close proximity, and these crevasse fields hold information on the geophysical processes of the surge. The connectionist-geostatistical approach uses directional experimental (discrete) variograms to parameterize images into a form that the Neural Network can recognize. Recognizing that each surge wave results in different crevasse types and that environmental conditions affect their appearance in imagery, we have developed semi-automated pre-training software to adapt the Neural Network to changing conditions. The method is applied to airborne and satellite imagery to classify surge crevasses from the BBGS surge. It works well for classifying spatially repetitive images such as the crevasses over Bering Glacier. We expand the network to less repetitive images in order to analyze imagery collected over Arctic sea ice and to assess the percentage of deformed ice for model calibration.
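
    The parameterization step, directional experimental variograms as texture features, can be sketched for a 2-D image. The toy 'crevasse field' below is a synthetic ridge pattern, chosen only to show how the two directions separate:

```python
import numpy as np

def directional_variogram(img, lags, axis):
    """Experimental semivariogram of a 2-D array along one axis:
    gamma(h) = 0.5 * mean((img[s] - img[s + h*e_axis])^2).
    """
    a = img if axis == 0 else img.T
    out = []
    for h in lags:
        diff = a[h:, :] - a[:-h, :]          # pairs separated by h along axis
        out.append(0.5 * np.mean(diff ** 2))
    return np.array(out)

# toy 'crevasse field': ridges running along axis 1
y = np.arange(64)
img = np.sin(2 * np.pi * y / 8)[:, None] * np.ones((1, 64))

lags = [1, 2, 4]
g_across = directional_variogram(img, lags, axis=0)  # across the ridges
g_along = directional_variogram(img, lags, axis=1)   # along the ridges
```

For this anisotropic pattern the across-ridge variogram rises with lag while the along-ridge variogram stays at zero; vectors of such values over several lags and directions are the kind of input a neural network can be trained on.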

  7. Sampling design optimisation for rainfall prediction using a non-stationary geostatistical model

    NASA Astrophysics Data System (ADS)

    Wadoux, Alexandre M. J.-C.; Brus, Dick J.; Rico-Ramirez, Miguel A.; Heuvelink, Gerard B. M.

    2017-09-01

    The accuracy of spatial predictions of rainfall by merging rain-gauge and radar data is partly determined by the sampling design of the rain-gauge network. Optimising the locations of the rain-gauges may increase the accuracy of the predictions. Existing spatial sampling design optimisation methods are based on minimisation of the spatially averaged prediction error variance under the assumption of intrinsic stationarity. Over the past years, substantial progress has been made to deal with non-stationary spatial processes in kriging. Various well-documented geostatistical models relax the assumption of stationarity in the mean, while recent studies show the importance of considering non-stationarity in the variance for environmental processes occurring in complex landscapes. We optimised the sampling locations of rain-gauges using an extension of the Kriging with External Drift (KED) model for prediction of rainfall fields. The model incorporates both non-stationarity in the mean and in the variance, which are modelled as functions of external covariates such as radar imagery, distance to radar station and radar beam blockage. Spatial predictions are made repeatedly over time, each time recalibrating the model. The space-time averaged KED variance was minimised by Spatial Simulated Annealing (SSA). The methodology was tested using a case study predicting daily rainfall in the north of England for a one-year period. Results show that (i) the proposed non-stationary variance model outperforms the stationary variance model, and (ii) a small but significant decrease of the rainfall prediction error variance is obtained with the optimised rain-gauge network. In particular, it pays off to place rain-gauges at locations where the radar imagery is inaccurate, while keeping the distribution over the study area sufficiently uniform.

  8. Randomized central limit theorems: A unified theory.

    PubMed

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles' aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles' extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic: all ensemble components are scaled by a common deterministic scale. However, there are "random environment" settings in which the underlying scaling schemes are stochastic, with the ensemble components scaled by different random scales. Examples of such settings include Holtsmark's law for gravitational fields and the stretched exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs), in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes, and present "randomized counterparts" to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.
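
    The deterministic scaling scheme of the classical CLT, every component scaled by the same 1/sqrt(n), is easy to check by simulation. The sketch below covers only the classical Gaussian case, not the randomized schemes the paper introduces (sample sizes and the uniform components are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(42)
n, reps = 400, 5000

# deterministic scaling: every component shares the common scale 1/sqrt(n)
u = rng.uniform(-1.0, 1.0, size=(reps, n))   # iid components, mean 0, var 1/3
s = u.sum(axis=1) / np.sqrt(n)               # scaled aggregate statistic

# Gaussian limit: variance ~ 1/3 and vanishing skewness
var = s.var()
skew = ((s - s.mean()) ** 3).mean() / s.std() ** 3
```

Under the paper's randomized schemes this common 1/sqrt(n) is replaced by component-wise random scales, which shifts the limit laws to the Lévy, Fréchet, and Weibull families quoted above.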

  9. Increasing Water Intake of Children and Parents in the Family Setting: A Randomized, Controlled Intervention Using Installation Theory.

    PubMed

    Lahlou, Saadi; Boesen-Mariani, Sabine; Franks, Bradley; Guelinckx, Isabelle

    2015-01-01

    On average, children and adults in developed countries consume too little water, which can lead to negative health consequences. In a one-year longitudinal field experiment in Poland, we compared the impact of three home-based interventions on helping children and their parents/caregivers develop sustainable habits of increased plain-water consumption. Fluid consumption of 334 children and their caregivers was recorded over one year using an online fluid-specific dietary record. Participants were initially randomly allocated to one of three conditions: Control; Information (child and carer received information on the health benefits of water); or Placement (in addition to information, free small bottles of still water were delivered at home for a limited period). After three months, half of the non-controls were randomly assigned to Community (child and caregiver engaged in an online community forum providing support on water consumption). All conditions significantly increased the water consumption of children (by 21.9-56.7%) and of adults (by 22-89%). Placement + Community generated the largest effects. Community enhanced the impact of Placement for children and parents, as well as the impact of Information for parents but not children. The results suggest that the family setting offers considerable scope for successful installation of interventions encouraging children and caregivers to develop healthier consumption habits, in mutually reinforcing ways. Combining information, affordances, and social influence gives the best, and most sustainable, results. © 2015 S. Karger AG, Basel.

  10. Early initiation of night-time NIV in an outpatient setting: a randomized non-inferiority study in ALS patients.

    PubMed

    Bertella, Enrica; Banfi, Paolo; Paneroni, Mara; Grilli, Silvia; Bianchi, Luca; Volpato, Eleonora; Vitacca, Michele

    2017-12-01

    In patients with amyotrophic lateral sclerosis (ALS), non-invasive ventilation (NIV) is usually initiated in an in-hospital regime. We investigated whether NIV initiated in an outpatient setting can be as effective in terms of patients' acceptance and adherence. We also evaluated factors predicting NIV acceptance and adherence, and disease progression. Prospective randomized study. Outpatient versus inpatient rehabilitation. ALS patients. ALS patients were randomized to two groups for NIV initiation: outpatients versus inpatients. At baseline (T0), at the end of the NIV trial program (T1), and 3 months after T1 (T2), respiratory function tests, blood gas analysis, and a sleep study were performed. At T1, we assessed NIV acceptance (>4 h/night), dyspnea symptoms (day/night) by visual analogue scale (VAS), and staff and patients' experience (how difficult NIV was to accept, how difficult the ventilator was to manage, satisfaction); at T2, NIV adherence (>120 h/month) and patients' experience. Fifty patients participated. There were no differences in acceptance failure (P=0.733) or adherence failure (P=0.529). At T1, outpatients had longer hours of nocturnal ventilation (P<0.02); at T2 this was similar (P=0.34). Female gender and spinal onset of the disease were predictors of NIV acceptance/adherence failure. There were no between-group differences in the progression of respiratory impairment, symptoms, or sleep quality. Early outpatient initiation of NIV in ALS is as effective as inpatient initiation.

  11. Screen-and-treat approaches for cervical cancer prevention in low-resource settings: a randomized controlled trial.

    PubMed

    Denny, Lynette; Kuhn, Louise; De Souza, Michelle; Pollack, Amy E; Dupree, William; Wright, Thomas C

    2005-11-02

    Non-cytology-based screen-and-treat approaches for cervical cancer prevention have been developed for low-resource settings, but few have directly addressed efficacy. To determine the safety and efficacy of 2 screen-and-treat approaches for cervical cancer prevention that were designed to be more resource-appropriate than conventional cytology-based screening programs. Randomized clinical trial of 6555 nonpregnant women, aged 35 to 65 years, recruited through community outreach and conducted between June 2000 and December 2002 at ambulatory women's health clinics in Khayelitsha, South Africa. All patients were screened using human papillomavirus (HPV) DNA testing and visual inspection with acetic acid (VIA). Women were subsequently randomized to 1 of 3 groups: cryotherapy if she had a positive HPV DNA test result; cryotherapy if she had a positive VIA test result; or to delayed evaluation. Biopsy-confirmed high-grade cervical cancer precursor lesions and cancer at 6 and 12 months in the HPV DNA and VIA groups compared with the delayed evaluation (control) group; complications after cryotherapy. The prevalence of high-grade cervical intraepithelial neoplasia and cancer (CIN 2+) was significantly lower in the 2 screen-and-treat groups at 6 months after randomization than in the delayed evaluation group. At 6 months, CIN 2+ was diagnosed in 0.80% (95% confidence interval [CI], 0.40%-1.20%) of the women in the HPV DNA group and 2.23% (95% CI, 1.57%-2.89%) in the VIA group compared with 3.55% (95% CI, 2.71%-4.39%) in the delayed evaluation group (P<.001 and P = .02 for the HPV DNA and VIA groups, respectively). A subset of women underwent a second colposcopy 12 months after enrollment. At 12 months the cumulative detection of CIN 2+ among women in the HPV DNA group was 1.42% (95% CI, 0.88%-1.97%), 2.91% (95% CI, 2.12%-3.69%) in the VIA group, and 5.41% (95% CI, 4.32%-6.50%) in the delayed evaluation group. 
Although minor complaints, such as discharge and bleeding, were

  12. The Effects of a Cluster Randomized Controlled Workplace Intervention on Sleep and Work-Family Conflict Outcomes in an Extended Care Setting.

    PubMed

    Marino, Miguel; Killerby, Marie; Lee, Soomi; Klein, Laura Cousino; Moen, Phyllis; Olson, Ryan; Kossek, Ellen Ernst; King, Rosalind; Erickson, Leslie; Berkman, Lisa F; Buxton, Orfeu M

    2016-12-01

    To evaluate the effects of a workplace-based intervention on actigraphic and self-reported sleep outcomes in an extended care setting. Cluster randomized trial. Extended-care (nursing) facilities. US employees and managers at nursing homes. Nursing homes were randomly selected to intervention or control settings. The Work, Family and Health Study developed an intervention aimed at reducing work-family conflict within a 4-month work-family organizational change process. Employees participated in interactive sessions with facilitated discussions, role-playing, and games designed to increase control over work processes and work time. Managers completed training in family-supportive supervision. Primary actigraphic outcomes included: total sleep duration, wake after sleep onset, nighttime sleep, variation in nighttime sleep, nap duration, and number of naps. Secondary survey outcomes included work-to-family conflict, sleep insufficiency, insomnia symptoms and sleep quality. Measures were obtained at baseline, 6-months and 12-months post-intervention. A total of 1,522 employees and 184 managers provided survey data at baseline. Managers and employees in the intervention arm showed no significant difference in sleep outcomes over time compared to control participants. Sleep outcomes were not moderated by work-to-family conflict or presence of children in the household for managers or employees. Age significantly moderated an intervention effect on nighttime sleep among employees (p=0.040), where younger employees benefited more from the intervention. In the context of an extended-care nursing home workplace, the intervention did not significantly alter sleep outcomes in either managers or employees. Moderating effects of age were identified where younger employees' sleep outcomes benefited more from the intervention.

  13. Effectiveness of active-online, an individually tailored physical activity intervention, in a real-life setting: randomized controlled trial.

    PubMed

    Wanner, Miriam; Martin-Diener, Eva; Braun-Fahrländer, Charlotte; Bauer, Georg; Martin, Brian W

    2009-07-28

    Effective interventions are needed to reduce the chronic disease epidemic. The Internet has the potential to provide large populations with individual advice at relatively low cost. The focus of the study was the Web-based tailored physical activity intervention Active-online. The main research questions were (1) How effective is Active-online, compared to a nontailored website, in increasing self-reported and objectively measured physical activity levels in the general population when delivered in a real-life setting? (2) Do respondents recruited for the randomized study differ from spontaneous users of Active-online, and how does effectiveness differ between these groups? (3) What is the impact of frequency and duration of use of Active-online on changes in physical activity behavior? Volunteers recruited via different media channels completed a Web-based baseline survey and were randomized to Active-online (intervention group) or a nontailored website (control group). In addition, spontaneous users were recruited directly from the Active-online website. In a subgroup of participants, physical activity was measured objectively using accelerometers. Follow-up assessments took place 6 weeks (FU1), 6 months (FU2), and 13 months (FU3) after baseline. A total of 1531 respondents completed the baseline questionnaire (intervention group n = 681, control group n = 688, spontaneous users n = 162); 133 individuals had valid accelerometer data at baseline. Mean age of the total sample was 43.7 years, and 1146 (74.9%) were women. Mixed linear models (adjusted for sex, age, BMI category, and stage of change) showed a significant increase in self-reported mean minutes spent in moderate- and vigorous-intensity activity from baseline to FU1 (coefficient = 0.14, P = .001) and to FU3 (coefficient = 0.19, P < .001) in all participants with no significant differences between groups. A significant increase in the proportion of individuals meeting the HEPA recommendations (self

  14. The effect of automated text messaging and goal setting on pedometer adherence and physical activity in patients with diabetes: A randomized controlled trial

    PubMed Central

    Anthony, Christopher; Carr, Lucas; Simmering, Jacob E.; Evans, Nicholas J.; Foster, Eric D.; Segre, Alberto M.; Cremer, James F.; Polgreen, Philip M.

    2018-01-01

Introduction Activity-monitoring devices may increase activity, but their effectiveness in sedentary, diseased, and less-motivated populations is unknown. Methods Subjects with diabetes or pre-diabetes were given a Fitbit and randomized into three groups: Fitbit only, Fitbit with reminders, and Fitbit with both reminders and goal setting. Subjects in the reminders group were sent text-message reminders to wear their Fitbit. The goal-setting group was sent a daily text message asking for a step goal. All subjects had three in-person visits (baseline, 3 and 6 months). We modelled daily steps and goal setting using linear mixed-effects models. Results 138 subjects participated, with 48 in the Fitbit-only, 44 in the reminders, and 46 in the goal-setting groups. Daily steps decreased for all groups during the study. Average daily steps were 7123, 6906, and 6854 for the Fitbit-only, the goal-setting, and the reminders groups, respectively. The reminders group was 17.2 percentage points more likely to wear their Fitbit than the Fitbit-only group. Setting a goal was associated with a significant increase of 791 daily steps, but setting more goals did not lead to further step increases. Conclusion In a population of patients with diabetes or pre-diabetes, individualized reminders to wear their Fitbit and elicitation of personal step goals did not lead to increases in daily steps, although daily steps were higher on days when goals were set. Our intervention improved engagement and data collection, important goals for activity surveillance. This study demonstrates that new, more-effective interventions for increasing activity in patients with pre-diabetes and diabetes are needed. PMID:29718931

  15. Estimating the signal-to-noise ratio of AVIRIS data

    NASA Technical Reports Server (NTRS)

    Curran, Paul J.; Dungan, Jennifer L.

    1988-01-01

To make the best use of narrowband airborne visible/infrared imaging spectrometer (AVIRIS) data, an investigator needs to know the ratio of signal to random variability or noise (signal-to-noise ratio, or SNR). The signal is land-cover dependent and varies with both wavelength and atmospheric absorption; random noise comprises sensor noise and intrapixel variability (i.e., variability within a pixel). The three existing methods for estimating the SNR are inadequate, since typical laboratory methods inflate the SNR while dark-current and image methods deflate it. A new procedure, called the geostatistical method, is proposed. It is based on the removal of periodic noise by notch filtering in the frequency domain and the isolation of sensor noise and intrapixel variability using the semi-variogram. This procedure was applied easily and successfully to five sets of AVIRIS data from the 1987 flying season and could be applied to remotely sensed data from broadband sensors.
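    The nugget idea behind this kind of geostatistical noise estimation can be sketched in a few lines. This is a toy illustration only, not the authors' notch-filtering procedure: the transect, the noise variance, and the function names are all invented. The empirical semivariogram, extrapolated back to lag zero, estimates the variance of the spatially uncorrelated noise, from which an SNR follows.

```python
import numpy as np

def empirical_semivariogram(z, max_lag):
    """Empirical semivariogram of a 1-D transect:
    gamma(h) = half the mean squared difference at lag h."""
    return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2)
                     for h in range(1, max_lag + 1)])

# Synthetic transect: a smooth spatial signal plus white "sensor" noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 500)
signal = np.sin(x)                      # spatially correlated component
noise = rng.normal(0.0, 0.1, x.size)    # uncorrelated noise, variance 0.01
z = signal + noise

gamma = empirical_semivariogram(z, max_lag=5)
# The semivariogram extrapolated to lag 0 (the "nugget") estimates the
# variance of the spatially uncorrelated noise; gamma[0] is ~0.01 here.
nugget = gamma[0]
snr = np.std(signal) / np.sqrt(nugget)  # signal-to-noise ratio estimate
```

    With a finely sampled transect, the lag-1 semivariance is dominated by the noise term, which is why it serves as a nugget estimate.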

  16. Using Geostatistical Data Fusion Techniques and MODIS Data to Upscale Simulated Wheat Yield

    NASA Astrophysics Data System (ADS)

    Castrignano, A.; Buttafuoco, G.; Matese, A.; Toscano, P.

    2014-12-01

Population growth increases food demand. Assessing food demand and predicting the actual supply for a given location are critical components of strategic food security planning at the regional scale. Crop yield can be simulated using crop models because it is site-specific and determined by weather, management, length of growing season, and soil properties. Crop models require reliable location-specific data that are not generally available. Obtaining these data at a large number of locations is time-consuming, costly, and sometimes simply not feasible. An upscaling method to extend coverage of sparse estimates of crop yield to an appropriate extrapolation domain is therefore required. This work aims to investigate the applicability of a geostatistical data fusion approach for merging remote sensing data with the predictions of a simulation model of wheat growth and production using ground-based data. The study area is the Capitanata plain (4000 km2) located in Apulia Region, mostly cropped with durum wheat. The MODIS EVI/NDVI data products for the Capitanata plain were downloaded from the Land Processes Distributed Active Archive Center (LPDAAC) for the whole crop cycle of durum wheat. Phenological development, biomass growth, and grain quantity of durum wheat were simulated by the Delphi system, based on a crop simulation model linked to a database including soil properties and agronomical and meteorological data. Multicollocated cokriging was used to integrate secondary exhaustive information (multi-spectral MODIS data) with the primary variable (sparsely distributed biomass/yield model predictions of durum wheat). The model estimates were strongly spatially correlated with the radiance data (red and NIR bands), and the data fusion approach proved to be suitable and flexible for integrating data of different types and supports.

  17. Subjective randomness as statistical inference.

    PubMed

    Griffiths, Thomas L; Daniels, Dylan; Austerweil, Joseph L; Tenenbaum, Joshua B

    2018-06-01

    Some events seem more random than others. For example, when tossing a coin, a sequence of eight heads in a row does not seem very random. Where do these intuitions about randomness come from? We argue that subjective randomness can be understood as the result of a statistical inference assessing the evidence that an event provides for having been produced by a random generating process. We show how this account provides a link to previous work relating randomness to algorithmic complexity, in which random events are those that cannot be described by short computer programs. Algorithmic complexity is both incomputable and too general to capture the regularities that people can recognize, but viewing randomness as statistical inference provides two paths to addressing these problems: considering regularities generated by simpler computing machines, and restricting the set of probability distributions that characterize regularity. Building on previous work exploring these different routes to a more restricted notion of randomness, we define strong quantitative models of human randomness judgments that apply not just to binary sequences - which have been the focus of much of the previous work on subjective randomness - but also to binary matrices and spatial clustering. Copyright © 2018 Elsevier Inc. All rights reserved.
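    The statistical inference described above can be sketched numerically. This is a minimal toy model, not the paper's actual formulation: the "regular" generator, its repeat probability, and the function names are all assumptions. Subjective randomness is scored as the log-evidence for a fair-coin process over a simple repetition process.

```python
from math import log2

def log_prob_random(seq):
    # Fair-coin process: every binary symbol has probability 1/2.
    return -float(len(seq))

def log_prob_regular(seq, p_repeat=0.8):
    # Toy "regular" generator: uniform first symbol, then each symbol
    # repeats the previous one with probability p_repeat.
    lp = -1.0  # log2(1/2) for the first symbol
    for prev, cur in zip(seq, seq[1:]):
        lp += log2(p_repeat if cur == prev else 1.0 - p_repeat)
    return lp

def randomness(seq):
    # Log-evidence for the random process: positive values read as
    # "random", negative values as "regular".
    return log_prob_random(seq) - log_prob_regular(seq)

print(randomness("HHHHHHHH"))  # negative: eight heads in a row look regular
print(randomness("HHTHTTHT"))  # positive: this sequence looks random
```

    Richer regularity models (simple computing machines, restricted families of distributions) slot into `log_prob_regular` without changing the inference.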

  18. Dose relations between goal setting, theory-based correlates of goal setting and increases in physical activity during a workplace trial.

    PubMed

    Dishman, Rod K; Vandenberg, Robert J; Motl, Robert W; Wilson, Mark G; DeJoy, David M

    2010-08-01

    The effectiveness of an intervention depends on its dose and on moderators of dose, which usually are not studied. The purpose of the study is to determine whether goal setting and theory-based moderators of goal setting had dose relations with increases in goal-related physical activity during a successful workplace intervention. A group-randomized 12-week intervention that included personal goal setting was implemented in fall 2005, with a multiracial/ethnic sample of employees at 16 geographically diverse worksites. Here, we examined dose-related variables in the cohort of participants (N = 664) from the 8 worksites randomized to the intervention. Participants in the intervention exceeded 9000 daily pedometer steps and 300 weekly minutes of moderate-to-vigorous physical activity (MVPA) during the last 6 weeks of the study, which approximated or exceeded current public health guidelines. Linear growth modeling indicated that participants who set higher goals and sustained higher levels of self-efficacy, commitment and intention about attaining their goals had greater increases in pedometer steps and MVPA. The relation between change in participants' satisfaction with current physical activity and increases in physical activity was mediated by increases in self-set goals. The results show a dose relation of increased physical activity with changes in goal setting, satisfaction, self-efficacy, commitment and intention, consistent with goal-setting theory.

  19. Optimal partitioning of random programs across two processors

    NASA Technical Reports Server (NTRS)

    Nicol, D. M.

    1986-01-01

    The optimal partitioning of random distributed programs is discussed. It is concluded that the optimal partitioning of a homogeneous random program over a homogeneous distributed system either assigns all modules to a single processor, or distributes the modules as evenly as possible among all processors. The analysis rests heavily on the approximation which equates the expected maximum of a set of independent random variables with the set's maximum expectation. The results are strengthened by providing an approximation-free proof of this result for two processors under general conditions on the module execution time distribution. It is also shown that use of this approximation causes two of the previous central results to be false.
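    The gap opened by the approximation mentioned above, equating the expected maximum of independent random variables with the maximum of their expectations, is easy to see by simulation. The distribution, module count, and trial count below are arbitrary illustrative choices, not values from the paper.

```python
import random

random.seed(1)
k, n_trials = 8, 20000  # k modules, Monte Carlo trials

# I.i.d. Exponential(1) module execution times (mean 1 each).
e_of_max = sum(max(random.expovariate(1.0) for _ in range(k))
               for _ in range(n_trials)) / n_trials
max_of_e = 1.0  # max over modules of the expected execution time

# For Exponential(1), E[max of k] equals the harmonic number
# H_k = 1 + 1/2 + ... + 1/k, about 2.72 for k = 8, so equating
# E[max] with max E can be off by a large factor.
print(e_of_max, max_of_e)
```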

  20. School-based surveys of malaria in Oromia Regional State, Ethiopia: a rapid survey method for malaria in low transmission settings

    PubMed Central

    2011-01-01

Background In Ethiopia, malaria transmission is seasonal and unstable, with both Plasmodium falciparum and Plasmodium vivax endemic. Such spatial and temporal clustering of malaria only serves to underscore the importance of regularly collecting up-to-date malaria surveillance data to inform decision-making in malaria control. Cross-sectional school-based malaria surveys were conducted across Oromia Regional State to generate up-to-date data for planning malaria control interventions, as well as for monitoring and evaluation of operational programme implementation. Methods Two hundred primary schools were randomly selected using a stratified and weighted sampling frame; 100 children aged five to 18 years were then randomly chosen within each school. Surveys were carried out in May 2009 and from October to December 2009, to coincide with the peak of malaria transmission in different parts of Oromia. Each child was tested for malaria by expert microscopy, their haemoglobin was measured, and a simple questionnaire was completed. Satellite-derived environmental data were used to assess ecological correlates of Plasmodium infection; Bayesian geostatistical methods and Kulldorff's spatial scan statistic were employed to investigate spatial heterogeneity. Results A total of 20,899 children from 197 schools provided blood samples; two selected schools were inaccessible and one school refused to participate. The overall prevalence of Plasmodium infection was found to be 0.56% (95% CI: 0.46-0.67%), with 53% of infections due to P. falciparum and 47% due to P. vivax. Of children surveyed, 17.6% (95% CI: 17.0-18.1%) were anaemic, while 46% reported sleeping under a mosquito net the previous night. Malaria was found at 30 (15%) schools to a maximum elevation of 2,187 metres, with school-level Plasmodium prevalence ranging between 0% and 14.5%. Although environmental variables were only weakly associated with P. falciparum and P. vivax infection, clusters of infection were identified within

  1. Environmental footprints of brick kiln bottom ashes: Geostatistical approach for assessment of metal toxicity.

    PubMed

    Mondal, Ananya; Das, Subhasish; Sah, Rajesh Kumar; Bhattacharyya, Pradip; Bhattacharya, Satya Sundar

    2017-12-31

Coal-fired brick kiln factories generate significant quantities of brick kiln bottom ash (BKBA) that contaminate soil and water environments near the dumping sites through leaching of toxic metals (Pb, Cr, Cd, Zn, Mn, and Cu). However, the characteristics and environmental effects of BKBAs remain largely unknown. We collected BKBA samples from 32 strategic locations in two rapidly developing States (West Bengal and Assam) of India. Scanning electron microscope images indicated spherical and granular structures of BKBAs produced in West Bengal (WBKBA) and Assam (ABKBA), respectively, while energy dispersive spectroscopy and analytical assessments confirmed substantial occurrence of total organic C and nutrient elements (N, P, K, Ca, Mg, and S) in both BKBAs. FTIR analysis revealed a greater predominance of organic matter in ABKBAs than in WBKBAs. Occurrence of toxic metals (Cd, Cr, Pb, Zn, Mn, and Cu) was higher in ABKBAs than in WBKBAs, while organic and residual fractions of metals predominated in most of the BKBAs. Principal component analysis showed that metal contents and pH were the major distinguishing characteristics of the BKBAs generated in the two different environmental locations. Human health risk associated with BKBAs generated in Assam is of significant concern. Finally, geostatistical tools enabled prediction of the spatial distribution patterns of toxic metals contributed by the BKBAs in Assam and West Bengal. Assessment of the contamination index, geo-accumulation index, and ecological risk index revealed some BKBAs to be more toxic than others. Copyright © 2017. Published by Elsevier B.V.

  2. Fourier dimension of random images

    NASA Astrophysics Data System (ADS)

    Ekström, Fredrik

    2016-10-01

Given a compact set of real numbers, a random C^{m + α}-diffeomorphism is constructed such that the image of any measure concentrated on the set and satisfying a certain condition involving a real number s almost surely has Fourier dimension greater than or equal to s / (m + α). This is used to show that every Borel subset of the real numbers of Hausdorff dimension s is C^{m + α}-equivalent to a set of Fourier dimension greater than or equal to s / (m + α). In particular, every Borel set is diffeomorphic to a Salem set, and the Fourier dimension is not invariant under C^m-diffeomorphisms for any m.

  3. A Randomized Controlled Trial of an Activity Specific Exercise Program for Individuals With Alzheimer Disease in Long-term Care Settings

    PubMed Central

    Roach, Kathryn E.; Tappen, Ruth M.; Kirk-Sanchez, Neva; Williams, Christine L.; Loewenstein, David

    2011-01-01

    Objective To determine whether an activity specific exercise program could improve ability to perform basic mobility activities in long-term care residents with Alzheimer disease (AD). Design Randomized, controlled, single-blinded clinical trial. Setting Residents of 7 long-term care facilities. Participants Eighty-two long-term care residents with mild to severe AD. Intervention An activity specific exercise program was compared to a walking program and to an attention control. Measurements Ability to perform bed mobility and transfers were assessed using the subscales of the Acute Care Index of Function; functional mobility was measured using the 6-Minute Walk test. Results Subjects receiving the activity specific exercise program improved in ability to perform transfers, whereas subjects in the other 2 groups declined. PMID:21937893

  4. Field-scale soil moisture space-time geostatistical modeling for complex Palouse landscapes in the inland Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Chahal, M. K.; Brown, D. J.; Brooks, E. S.; Campbell, C.; Cobos, D. R.; Vierling, L. A.

    2012-12-01

    Estimating soil moisture content continuously over space and time using geo-statistical techniques supports the refinement of process-based watershed hydrology models and the application of soil process models (e.g. biogeochemical models predicting greenhouse gas fluxes) to complex landscapes. In this study, we model soil profile volumetric moisture content for five agricultural fields with loess soils in the Palouse region of Eastern Washington and Northern Idaho. Using a combination of stratification and space-filling techniques, we selected 42 representative and distributed measurement locations in the Cook Agronomy Farm (Pullman, WA) and 12 locations each in four additional grower fields that span the precipitation gradient across the Palouse. At each measurement location, soil moisture was measured on an hourly basis at five different depths (30, 60, 90, 120, and 150 cm) using Decagon 5-TE/5-TM soil moisture sensors (Decagon Devices, Pullman, WA, USA). This data was collected over three years for the Cook Agronomy Farm and one year for each of the grower fields. In addition to ordinary kriging, we explored the correlation of volumetric water content with external, spatially exhaustive indices derived from terrain models, optical remote sensing imagery, and proximal soil sensing data (electromagnetic induction and VisNIR penetrometer)
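    Ordinary kriging, mentioned above as a baseline, can be sketched compactly. This is an illustrative sketch only: the spherical variogram parameters and the synthetic observations are invented, not the study's calibrated values, and `ordinary_krige` is a hypothetical helper name.

```python
import numpy as np

def spherical(h, nugget, sill, a):
    """Spherical semivariogram with range a; gamma(0) = 0."""
    h = np.asarray(h, dtype=float)
    g = np.where(h < a,
                 nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3),
                 sill)
    return np.where(h == 0.0, 0.0, g)

def ordinary_krige(xy, z, x0, nugget=0.0, sill=1.0, a=2.0):
    """Ordinary kriging prediction at point x0 from observations (xy, z)."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.empty((n + 1, n + 1))
    A[:n, :n] = spherical(d, nugget, sill, a)
    A[n, :] = 1.0
    A[:, n] = 1.0
    A[n, n] = 0.0  # Lagrange-multiplier row/column enforcing sum(w) = 1
    b = np.append(spherical(np.linalg.norm(xy - x0, axis=1), nugget, sill, a), 1.0)
    w = np.linalg.solve(A, b)
    return float(w[:n] @ z)  # weighted sum of the observations

# Four observations at the corners of a unit square; predict at the centre.
xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
z = np.array([1.0, 2.0, 3.0, 4.0])
est = ordinary_krige(xy, z, np.array([0.5, 0.5]))
print(est)  # symmetry forces equal weights, so the estimate is the mean, 2.5
```

    Covariates such as terrain indices or remote sensing bands enter through extensions of this system (e.g. kriging with external drift), which is the kind of comparison the study explores.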

  5. Arsenic in groundwater in Bangladesh: A geostatistical and epidemiological framework for evaluating health effects and potential remedies

    NASA Astrophysics Data System (ADS)

    Yu, Winston H.; Harvey, Charles M.; Harvey, Charles F.

    2003-06-01

    This paper examines the health crisis in Bangladesh due to dissolved arsenic in groundwater. First, we use geostatistical methods to construct a map of arsenic concentrations that divides Bangladesh into regions and estimate vertical concentration trends in these regions. Then, we use census data to estimate exposure distributions in the regions; we use epidemiological data from West Bengal and Taiwan to estimate dose response functions for arsenicosis and arsenic-induced cancers; and we combine the regional exposure distributions and the dose response models to estimate the health effects of groundwater arsenic in Bangladesh. We predict that long-term exposure to present arsenic concentrations will result in approximately 1,200,000 cases of hyperpigmentation, 600,000 cases of keratosis, 125,000 cases of skin cancer, and 3000 fatalities per year from internal cancers. Although these estimates are very uncertain, the method provides a framework for incorporating better data as it becomes available. Moreover, we examine the remedy of drilling deeper wells in selected regions of Bangladesh. By replacing 31% of the wells in the country with deeper wells the health effects of drinking groundwater arsenic could be reduced by approximately 70% provided that arsenic concentrations in deep wells remain relatively low.

  6. Feasibility intervention trial of two types of improved cookstoves in three resource-limited settings: study protocol for a randomized controlled trial.

    PubMed

    Klasen, Elizabeth; Miranda, J Jaime; Khatry, Subarna; Menya, Diana; Gilman, Robert H; Tielsch, James M; Kennedy, Caitlin; Dreibelbis, Robert; Naithani, Neha; Kimaiyo, Sylvester; Chiang, Marilu; Carter, E Jane; Sherman, Charles B; Breysse, Patrick N; Checkley, William

    2013-10-10

    Exposure to biomass fuel smoke is one of the leading risk factors for disease burden worldwide. International campaigns are currently promoting the widespread adoption of improved cookstoves in resource-limited settings, yet little is known about the cultural and social barriers to successful improved cookstove adoption and how these barriers affect environmental exposures and health outcomes. We plan to conduct a one-year crossover, feasibility intervention trial in three resource-limited settings (Kenya, Nepal and Peru). We will enroll 40 to 46 female primary cooks aged 20 to 49 years in each site (total 120 to 138). At baseline, we will collect information on sociodemographic characteristics and cooking practices, and measure respiratory health and blood pressure for all participating women. An initial observational period of four months while households use their traditional, open-fire design cookstoves will take place prior to randomization. All participants will then be randomized to receive one of two types of improved, ventilated cookstoves with a chimney: a commercially-constructed cookstove (Envirofit G3300/G3355) or a locally-constructed cookstove. After four months of observation, participants will crossover and receive the other improved cookstove design and be followed for another four months. During each of the three four-month study periods, we will collect monthly information on self-reported respiratory symptoms, cooking practices, compliance with cookstove use (intervention periods only), and measure peak expiratory flow, forced expiratory volume at 1 second, exhaled carbon monoxide and blood pressure. We will also measure pulmonary function testing in the women participants and 24-hour kitchen particulate matter and carbon monoxide levels at least once per period. Findings from this study will help us better understand the behavioral, biological, and environmental changes that occur with a cookstove intervention. If this trial indicates that

  7. Feasibility intervention trial of two types of improved cookstoves in three resource-limited settings: study protocol for a randomized controlled trial

    PubMed Central

    2013-01-01

    Background Exposure to biomass fuel smoke is one of the leading risk factors for disease burden worldwide. International campaigns are currently promoting the widespread adoption of improved cookstoves in resource-limited settings, yet little is known about the cultural and social barriers to successful improved cookstove adoption and how these barriers affect environmental exposures and health outcomes. Design We plan to conduct a one-year crossover, feasibility intervention trial in three resource-limited settings (Kenya, Nepal and Peru). We will enroll 40 to 46 female primary cooks aged 20 to 49 years in each site (total 120 to 138). Methods At baseline, we will collect information on sociodemographic characteristics and cooking practices, and measure respiratory health and blood pressure for all participating women. An initial observational period of four months while households use their traditional, open-fire design cookstoves will take place prior to randomization. All participants will then be randomized to receive one of two types of improved, ventilated cookstoves with a chimney: a commercially-constructed cookstove (Envirofit G3300/G3355) or a locally-constructed cookstove. After four months of observation, participants will crossover and receive the other improved cookstove design and be followed for another four months. During each of the three four-month study periods, we will collect monthly information on self-reported respiratory symptoms, cooking practices, compliance with cookstove use (intervention periods only), and measure peak expiratory flow, forced expiratory volume at 1 second, exhaled carbon monoxide and blood pressure. We will also measure pulmonary function testing in the women participants and 24-hour kitchen particulate matter and carbon monoxide levels at least once per period. Discussion Findings from this study will help us better understand the behavioral, biological, and environmental changes that occur with a cookstove

  8. Randomized central limit theorems: A unified theory

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles’ aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles’ extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic—scaling all ensemble components by a common deterministic scale. However, there are “random environment” settings in which the underlying scaling schemes are stochastic—scaling the ensemble components by different random scales. Examples of such settings include Holtsmark’s law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs)—in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes—and present “randomized counterparts” to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.
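    The deterministic-scaling extreme-value limit that the RCLTs generalize can be checked by simulation. This is a toy check of the classical Fréchet case, not of the randomized (Poisson power-law) scaling scheme itself; the Pareto tail index and sample sizes are arbitrary choices.

```python
import math
import random

random.seed(2)
alpha, n, trials = 2.0, 500, 2000  # tail index, sample size, repetitions

def pareto(alpha):
    # Pareto variate with survival function P(X > x) = x**(-alpha), x >= 1.
    return (1.0 - random.random()) ** (-1.0 / alpha)

# Maxima of heavy-tailed samples, rescaled by the deterministic
# factor n**(1/alpha).
maxima = [max(pareto(alpha) for _ in range(n)) / n ** (1.0 / alpha)
          for _ in range(trials)]

# The rescaled maxima should follow the Frechet law F(x) = exp(-x**(-alpha)).
x = 1.0
empirical = sum(m <= x for m in maxima) / trials
frechet = math.exp(-x ** (-alpha))
print(empirical, frechet)  # both near exp(-1), about 0.37
```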

  9. Training set selection for the prediction of essential genes.

    PubMed

    Cheng, Jian; Xu, Zhao; Wu, Wenwu; Zhao, Li; Li, Xiangchen; Liu, Yanlin; Tao, Shiheng

    2014-01-01

Various computational models have been developed to transfer annotations of gene essentiality between organisms. However, despite the increasing number of microorganisms with well-characterized sets of essential genes, selection of appropriate training sets for predicting the essential genes of poorly-studied or newly sequenced organisms remains challenging. In this study, a machine learning approach was applied reciprocally to predict the essential genes in 21 microorganisms. Results showed that training set selection greatly influenced predictive accuracy. We determined four criteria for training set selection: (1) essential genes in the selected training set should be reliable; (2) the growth conditions in which essential genes are defined should be consistent in training and prediction sets; (3) species used as training sets should be closely related to the target organism; and (4) organisms used as training and prediction sets should exhibit similar phenotypes or lifestyles. We then analyzed the performance of an incomplete training set and an integrated training set with multiple organisms. We found that the size of the training set should be at least 10% of the total genes to yield accurate predictions. Additionally, the integrated training sets exhibited a remarkable increase in stability and accuracy compared with single sets. Finally, we compared the performance of the integrated training sets with the four criteria and with random selection. The results revealed that a rational selection of training sets based on our criteria yields better performance than random selection. Thus, our results provide empirical guidance on training set selection for the identification of essential genes on a genome-wide scale.

  10. Rtop - an R package for interpolation along the stream network

    NASA Astrophysics Data System (ADS)

    Skøien, J. O.

    2009-04-01

Geostatistical methods have been used only to a limited extent for estimation along stream networks, with a few exceptions (Gottschalk, 1993; Gottschalk et al., 2006; Sauquet et al., 2000; Skøien et al., 2006). Interpolation of runoff characteristics is more complicated than for the traditional random variables estimated by geostatistical methods, as the measurements have a more complicated support and many catchments are nested. Skøien et al. (2006) presented the model Top-kriging, which takes these effects into account for interpolation of stream flow characteristics (exemplified by the 100-year flood). The method has here been implemented as a package in the statistical environment R (R Development Core Team, 2004). Taking advantage of the existing methods in R for working with spatial objects, and of the extensive possibilities for visualizing results, the package makes it considerably easier to apply the method to new data sets than earlier implementations did. References: Gottschalk, L. 1993. Interpolation of runoff applying objective methods. Stochastic Hydrology and Hydraulics, 7, 269-281. Gottschalk, L., I. Krasovskaia, E. Leblois, and E. Sauquet. 2006. Mapping mean and variance of runoff in a river basin. Hydrology and Earth System Sciences, 10, 469-484. R Development Core Team. 2004. R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing. Sauquet, E., L. Gottschalk, and E. Leblois. 2000. Mapping average annual runoff: a hierarchical approach applying a stochastic interpolation scheme. Hydrological Sciences Journal, 45 (6), 799-815. Skøien, J. O., R. Merz, and G. Blöschl. 2006. Top-kriging - geostatistics on stream networks. Hydrology and Earth System Sciences, 10, 277-287.

  11. Inverse modeling of hydraulic tests in fractured crystalline rock based on a transition probability geostatistical approach

    NASA Astrophysics Data System (ADS)

    Blessent, Daniela; Therrien, René; Lemieux, Jean-Michel

    2011-12-01

This paper presents numerical simulations of a series of hydraulic interference tests conducted in crystalline bedrock at Olkiluoto (Finland), a potential site for the disposal of the Finnish high-level nuclear waste. The tests are in a block of crystalline bedrock of about 0.03 km3 that contains low-transmissivity fractures. Fracture density, orientation, and fracture transmissivity are estimated from Posiva Flow Log (PFL) measurements in boreholes drilled in the rock block. On the basis of those data, a geostatistical approach relying on transition probability and Markov chain models is used to define a conceptual model based on stochastic fractured rock facies. Four facies are defined, from sparsely fractured bedrock to highly fractured bedrock. Using this conceptual model, three-dimensional groundwater flow is then simulated to reproduce interference pumping tests in either open or packed-off boreholes. Hydraulic conductivities of the fracture facies are estimated through automatic calibration using either hydraulic heads or both hydraulic heads and PFL flow rates as targets for calibration. The latter option produces a narrower confidence interval for the calibrated hydraulic conductivities, therefore reducing the associated uncertainty and demonstrating the usefulness of the measured PFL flow rates. Furthermore, the stochastic facies conceptual model is a suitable alternative to discrete fracture network models to simulate fluid flow in fractured geological media.
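    The transition-probability idea can be sketched as a one-dimensional Markov chain over facies. The four facies labels echo the abstract's sparsely-to-highly-fractured ordering, but the transition matrix and function name are invented for illustration; real transiogram fitting works in continuous space and three dimensions.

```python
import random

random.seed(0)

# Hypothetical transition probabilities between four fracture facies,
# ordered from sparsely fractured (F1) to highly fractured (F4) bedrock.
facies = ["F1", "F2", "F3", "F4"]
T = {
    "F1": [0.80, 0.15, 0.04, 0.01],
    "F2": [0.20, 0.60, 0.15, 0.05],
    "F3": [0.05, 0.20, 0.60, 0.15],
    "F4": [0.02, 0.08, 0.30, 0.60],
}

def simulate_column(n, start="F1"):
    """Draw a 1-D vertical facies sequence from the Markov chain."""
    seq = [start]
    for _ in range(n - 1):
        seq.append(random.choices(facies, weights=T[seq[-1]])[0])
    return seq

column = simulate_column(200)
print(column[:10])
```

    Because each facies tends to persist (large diagonal entries), simulated columns show the clustered, correlated structure that makes such models an alternative to discrete fracture networks.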

  12. Comparison of Oral Reading Errors between Contextual Sentences and Random Words among Schoolchildren

    ERIC Educational Resources Information Center

    Khalid, Nursyairah Mohd; Buari, Noor Halilah; Chen, Ai-Hong

    2017-01-01

    This paper compares the oral reading errors between the contextual sentences and random words among schoolchildren. Two sets of reading materials were developed to test the oral reading errors in 30 schoolchildren (10.00±1.44 years). Set A was comprised contextual sentences while Set B encompassed random words. The schoolchildren were asked to…

  13. Alternatives for randomization in lifestyle intervention studies in cancer patients were not better than conventional randomization.

    PubMed

    Velthuis, Miranda J; May, Anne M; Monninkhof, Evelyn M; van der Wall, Elsken; Peeters, Petra H M

    2012-03-01

Assessing effects of lifestyle interventions in cancer patients has some specific challenges. Although randomization is urgently needed for evidence-based knowledge, sometimes it is difficult to apply conventional randomization (i.e., consent preceding randomization and intervention) in daily settings. Randomization before seeking consent was proposed by Zelen, and additional modifications have been proposed since. We discuss four alternatives to conventional randomization: the single and double randomized consent designs, the two-stage randomized consent design, and the design with consent to postponed information. We considered these designs when designing a study to assess the impact of physical activity on cancer-related fatigue and quality of life. We tested the modified Zelen design with consent to postponed information in a pilot. The design was chosen to prevent dropout of participants in the control group because of disappointment about the allocation. The result was a low overall participation rate, most likely because of a perceived lack of information among eligible patients, and a relatively high dropout in the intervention group. We conclude that the alternatives were not better than conventional randomization. Copyright © 2012 Elsevier Inc. All rights reserved.

  14. Statistical Estimation of Heterogeneities: A New Frontier in Well Testing

    NASA Astrophysics Data System (ADS)

    Neuman, S. P.; Guadagnini, A.; Illman, W. A.; Riva, M.; Vesselinov, V. V.

    2001-12-01

    Well-testing methods have traditionally relied on analytical solutions of groundwater flow equations in relatively simple domains, consisting of one or at most a few units having uniform hydraulic properties. Recently, attention has been shifting toward methods and solutions that would allow one to characterize subsurface heterogeneities in greater detail. On one hand, geostatistical inverse methods are being used to assess the spatial variability of parameters, such as permeability and porosity, on the basis of multiple cross-hole pressure interference tests. On the other hand, analytical solutions are being developed to describe the mean and variance (first and second statistical moments) of flow to a well in a randomly heterogeneous medium. Geostatistical inverse interpretation of cross-hole tests yields a smoothed but detailed "tomographic" image of how parameters actually vary in three-dimensional space, together with corresponding measures of estimation uncertainty. Moment solutions may soon allow one to interpret well tests in terms of statistical parameters such as the mean and variance of log permeability, its spatial autocorrelation and statistical anisotropy. The idea of geostatistical cross-hole tomography is illustrated through pneumatic injection tests conducted in unsaturated fractured tuff at the Apache Leap Research Site near Superior, Arizona. The idea of using moment equations to interpret well-tests statistically is illustrated through a recently developed three-dimensional solution for steady state flow to a well in a bounded, randomly heterogeneous, statistically anisotropic aquifer.

  15. Geostatistical modeling of the gas emission zone and its in-place gas content for Pittsburgh-seam mines using sequential Gaussian simulation

    USGS Publications Warehouse

    Karacan, C.O.; Olea, R.A.; Goodman, G.

    2012-01-01

Determination of the size of the gas emission zone, the locations of gas sources within, and especially the amount of gas retained in those zones is one of the most important steps for designing a successful methane control strategy and an efficient ventilation system in longwall coal mining. The formation of the gas emission zone and the potential amount of gas-in-place (GIP) that might be available for migration into a mine are factors of local geology and rock properties that usually show spatial variability in continuity and may also show geometric anisotropy. Geostatistical methods are used here for modeling and prediction of gas amounts and for assessing their associated uncertainty in gas emission zones of longwall mines for methane control. This study used core data obtained from 276 vertical exploration boreholes drilled from the surface to the bottom of the Pittsburgh coal seam in a mining district in the Northern Appalachian basin. After identifying important coal and non-coal layers for the gas emission zone, univariate statistical and semivariogram analyses were conducted for data from different formations to define the distribution and continuity of various attributes. Sequential simulations performed stochastic assessment of these attributes, such as gas content, strata thickness, and strata displacement. These analyses were followed by calculations of gas-in-place and their uncertainties in the Pittsburgh seam caved zone and fractured zone of longwall mines in this mining district. Grid blanking was used to isolate the volume over the actual panels from the entire modeled district and to calculate gas amounts that were directly related to the emissions in longwall mines. Results indicated that gas-in-place in the Pittsburgh seam, in the caved zone and in the fractured zone, as well as displacements in major rock units, showed spatial correlations that could be modeled and estimated using geostatistical methods.
This study showed that GIP volumes may

  16. Geostatistical modeling of the gas emission zone and its in-place gas content for Pittsburgh-seam mines using sequential Gaussian simulation

    PubMed Central

    Karacan, C. Özgen; Olea, Ricardo A.; Goodman, Gerrit

    2015-01-01

    Determination of the size of the gas emission zone, the locations of gas sources within, and especially the amount of gas retained in those zones is one of the most important steps for designing a successful methane control strategy and an efficient ventilation system in longwall coal mining. The formation of the gas emission zone and the potential amount of gas-in-place (GIP) that might be available for migration into a mine are factors of local geology and rock properties that usually show spatial variability in continuity and may also show geometric anisotropy. Geostatistical methods are used here for modeling and prediction of gas amounts and for assessing their associated uncertainty in gas emission zones of longwall mines for methane control. This study used core data obtained from 276 vertical exploration boreholes drilled from the surface to the bottom of the Pittsburgh coal seam in a mining district in the Northern Appalachian basin. After identifying important coal and non-coal layers for the gas emission zone, univariate statistical and semivariogram analyses were conducted for data from different formations to define the distribution and continuity of various attributes. Sequential simulations performed stochastic assessment of these attributes, such as gas content, strata thickness, and strata displacement. These analyses were followed by calculations of gas-in-place and their uncertainties in the Pittsburgh seam caved zone and fractured zone of longwall mines in this mining district. Grid blanking was used to isolate the volume over the actual panels from the entire modeled district and to calculate gas amounts that were directly related to the emissions in longwall mines. Results indicated that gas-in-place in the Pittsburgh seam, in the caved zone and in the fractured zone, as well as displacements in major rock units, showed spatial correlations that could be modeled and estimated using geostatistical methods. 
This study showed that GIP volumes may
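The sequential Gaussian simulation workflow of records 15-16 (model a semivariogram, then simulate node by node along a random path, kriging from already-informed points and drawing from the conditional normal) can be sketched in one dimension. Everything here is an invented assumption for illustration (grid size, exponential covariance model, range, conditioning values); it is not the Pittsburgh-seam data or the authors' implementation.

```python
# Minimal 1-D sequential Gaussian simulation sketch (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def cov(h, sill=1.0, rng_a=10.0):
    # Exponential covariance model (an assumed choice, not the study's model).
    return sill * np.exp(-3.0 * np.abs(h) / rng_a)

def sgs_1d(n=50, cond_idx=(0, 49), cond_val=(0.5, -0.8)):
    z = np.full(n, np.nan)
    for i, v in zip(cond_idx, cond_val):
        z[i] = v                              # honor conditioning data exactly
    path = rng.permutation([i for i in range(n) if np.isnan(z[i])])
    for i in path:
        known = np.where(~np.isnan(z))[0]
        # Nearest informed nodes form the kriging neighbourhood.
        near = known[np.argsort(np.abs(known - i))[:8]]
        C = cov(near[:, None] - near[None, :])
        c = cov(near - i)
        w = np.linalg.solve(C, c)             # simple-kriging weights
        mean = w @ z[near]
        var = max(cov(0.0) - w @ c, 0.0)      # simple-kriging variance
        z[i] = rng.normal(mean, np.sqrt(var)) # draw one realization value
    return z

real = sgs_1d()
print(real.shape)  # (50,)
```

Repeating the call with different random paths yields multiple equally probable realizations, whose spread is what the uncertainty assessment in the abstract is based on.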

  17. Video-games used in a group setting is feasible and effective to improve indicators of physical activity in individuals with chronic stroke: a randomized controlled trial.

    PubMed

    Givon, Noa; Zeilig, Gabi; Weingarden, Harold; Rand, Debbie

    2016-04-01

To investigate the feasibility of using video-games in a group setting and to compare the effectiveness of video-games as a group intervention to a traditional group intervention for improving physical activity in individuals with chronic stroke. A single-blind randomized controlled trial with evaluations pre and post a 3-month intervention, and at 3-month follow-up. Compliance (session attendance), satisfaction and adverse effects were feasibility measures. Grip strength and gait speed were measures of physical activity. Hip accelerometers quantified steps/day and the Action Research Arm Test assessed the functional ability of the upper extremity. Forty-seven community-dwelling individuals with chronic stroke (29-78 years) were randomly allocated to receive video-game (N=24) or traditional therapy (N=23) in a group setting. There was high treatment compliance for both interventions (video-games 78%, traditional therapy 66%), but satisfaction was rated higher for the video-game (93%) than the traditional therapy (71%) (χ(2)=4.98, P=0.026). Adverse effects were not reported in either group. Significant improvements were demonstrated in both groups for gait speed (F=3.9, P=0.02), grip strength of the weaker (F=6.67, P=0.002) and stronger hands (F=7.5, P=0.001). Daily steps and functional ability of the weaker hand did not increase in either group. Using video-games in a small group setting is feasible, safe and satisfying. Video-games improve indicators of physical activity of individuals with chronic stroke. © The Author(s) 2015.

  18. Reservoir Characterization using geostatistical and numerical modeling in GIS with noble gas geochemistry

    NASA Astrophysics Data System (ADS)

    Vasquez, D. A.; Swift, J. N.; Tan, S.; Darrah, T. H.

    2013-12-01

The integration of precise geochemical analyses with quantitative engineering modeling into an interactive GIS system allows for a sophisticated and efficient method of reservoir engineering and characterization. Geographic Information Systems (GIS) are utilized as an advanced technique for oil field reservoir analysis by combining field engineering and geological/geochemical spatial datasets with the available systematic modeling and mapping methods to integrate the information into a spatially correlated first-hand approach to defining surface and subsurface characteristics. Three key methods of analysis include: 1) geostatistical modeling to create a static and volumetric 3-dimensional representation of the geological body, 2) numerical modeling to develop a dynamic and interactive 2-dimensional model of fluid flow across the reservoir, and 3) noble gas geochemistry to further define the physical conditions, components, and history of the geologic system. Results thus far include using engineering algorithms for interpolating electrical well log properties across the field (spontaneous potential, resistivity), yielding a highly accurate and high-resolution 3D model of rock properties. Results so far also include using numerical finite difference methods (Crank-Nicolson) to solve equations describing the distribution of pressure across the field, yielding a 2D simulation model of fluid flow across the reservoir. Ongoing noble gas geochemistry results will also include determination of the source, thermal maturity, and the extent/style of fluid migration (connectivity, continuity and directionality). Future work will include developing an inverse engineering algorithm to model permeability, porosity and water saturation. This combination of new and efficient technological and analytical capabilities is geared to provide a better understanding of the field geology and hydrocarbon dynamics system with applications to determine the presence of hydrocarbon pay zones (or
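The Crank-Nicolson pressure calculation mentioned above can be sketched in one dimension for the diffusion equation p_t = D p_xx: average the explicit and implicit second-difference operators and solve a linear system per time step. The grid size, diffusivity, time step, and boundary values below are invented placeholders for illustration, not the field model from the abstract.

```python
# Hedged sketch of 1-D Crank-Nicolson time stepping for p_t = D * p_xx.
import numpy as np

def crank_nicolson(p0, D=1.0, dx=1.0, dt=0.1, steps=100):
    n = len(p0)
    r = D * dt / (2.0 * dx**2)
    # Second-difference operator L, then (I - rL) p^{n+1} = (I + rL) p^{n}.
    L = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
         + np.diag(np.ones(n - 1), -1))
    A = np.eye(n) - r * L
    B = np.eye(n) + r * L
    # Dirichlet boundaries: hold the two end pressures fixed.
    for M in (A, B):
        M[0, :] = 0.0
        M[-1, :] = 0.0
        M[0, 0] = 1.0
        M[-1, -1] = 1.0
    p = p0.astype(float).copy()
    for _ in range(steps):
        p = np.linalg.solve(A, B @ p)   # one Crank-Nicolson step
    return p

p0 = np.zeros(21)
p0[0] = 1.0        # e.g. an injection point held at unit pressure at one end
p = crank_nicolson(p0)
print(p[0], p[-1])  # boundary pressures stay at 1.0 and 0.0
```

A dense solve is used for brevity; a production code would exploit the tridiagonal structure (e.g. the Thomas algorithm) and extend the stencil to 2D.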

  19. Screening for intimate partner violence in health care settings: a randomized trial.

    PubMed

    MacMillan, Harriet L; Wathen, C Nadine; Jamieson, Ellen; Boyle, Michael H; Shannon, Harry S; Ford-Gilboe, Marilyn; Worster, Andrew; Lent, Barbara; Coben, Jeffrey H; Campbell, Jacquelyn C; McNutt, Louise-Anne

    2009-08-05

    Whether intimate partner violence (IPV) screening reduces violence or improves health outcomes for women is unknown. To determine the effectiveness of IPV screening and communication of positive results to clinicians. Randomized controlled trial conducted in 11 emergency departments, 12 family practices, and 3 obstetrics/gynecology clinics in Ontario, Canada, among 6743 English-speaking female patients aged 18 to 64 years who presented between July 2005 and December 2006, could be seen individually, and were well enough to participate. Women in the screened group (n=3271) self-completed the Woman Abuse Screening Tool (WAST); if a woman screened positive, this information was given to her clinician before the health care visit. Subsequent discussions and/or referrals were at the discretion of the treating clinician. The nonscreened group (n=3472) self-completed the WAST and other measures after their visit. Women disclosing past-year IPV were interviewed at baseline and every 6 months until 18 months regarding IPV reexposure and quality of life (primary outcomes), as well as several health outcomes and potential harms of screening. Participant loss to follow-up was high: 43% (148/347) of screened women and 41% (148/360) of nonscreened women. At 18 months (n = 411), observed recurrence of IPV among screened vs nonscreened women was 46% vs 53% (modeled odds ratio, 0.82; 95% confidence interval, 0.32-2.12). Screened vs nonscreened women exhibited about a 0.2-SD greater improvement in quality-of-life scores (modeled score difference at 18 months, 3.74; 95% confidence interval, 0.47-7.00). When multiple imputation was used to account for sample loss, differences between groups were reduced and quality-of-life differences were no longer significant. Screened women reported no harms of screening. Although sample attrition urges cautious interpretation, the results of this trial do not provide sufficient evidence to support IPV screening in health care settings. 
Evaluation

  20. Exact distribution of a pattern in a set of random sequences generated by a Markov source: applications to biological data

    PubMed Central

    2010-01-01

Background In bioinformatics it is common to search for a pattern of interest in a potentially large set of rather short sequences (upstream gene regions, proteins, exons, etc.). Although many methodological approaches allow practitioners to compute the distribution of a pattern count in a random sequence generated by a Markov source, no specific developments have taken into account the counting of occurrences in a set of independent sequences. We aim to address this problem by deriving efficient approaches and algorithms to perform these computations both for low and high complexity patterns in the framework of homogeneous or heterogeneous Markov models. Results The latest advances in the field allowed us to use a technique of optimal Markov chain embedding based on deterministic finite automata to introduce three innovative algorithms. Algorithm 1 is the only one able to deal with heterogeneous models. It also avoids any product of convolutions of the pattern distributions of individual sequences. When working with homogeneous models, Algorithm 2 yields a dramatic reduction in complexity by taking advantage of previous computations to obtain moment generating functions efficiently. In the particular case of low or moderate complexity patterns, Algorithm 3 exploits power computation and binary decomposition to further reduce the time complexity to a logarithmic scale. All these algorithms and their relative interest in comparison with existing ones were then tested and discussed on a toy example and three biological data sets: structural patterns in protein loop structures, PROSITE signatures in a bacterial proteome, and transcription factors in upstream gene regions. On these data sets, we also compared our exact approaches to the tempting approximation that consists in concatenating the sequences in the data set into a single sequence. 
Conclusions Our algorithms prove to be effective and able to handle real data sets with multiple sequences, as well

  1. Exact distribution of a pattern in a set of random sequences generated by a Markov source: applications to biological data.

    PubMed

    Nuel, Gregory; Regad, Leslie; Martin, Juliette; Camproux, Anne-Claude

    2010-01-26

In bioinformatics it is common to search for a pattern of interest in a potentially large set of rather short sequences (upstream gene regions, proteins, exons, etc.). Although many methodological approaches allow practitioners to compute the distribution of a pattern count in a random sequence generated by a Markov source, no specific developments have taken into account the counting of occurrences in a set of independent sequences. We aim to address this problem by deriving efficient approaches and algorithms to perform these computations both for low and high complexity patterns in the framework of homogeneous or heterogeneous Markov models. The latest advances in the field allowed us to use a technique of optimal Markov chain embedding based on deterministic finite automata to introduce three innovative algorithms. Algorithm 1 is the only one able to deal with heterogeneous models. It also avoids any product of convolutions of the pattern distributions of individual sequences. When working with homogeneous models, Algorithm 2 yields a dramatic reduction in complexity by taking advantage of previous computations to obtain moment generating functions efficiently. In the particular case of low or moderate complexity patterns, Algorithm 3 exploits power computation and binary decomposition to further reduce the time complexity to a logarithmic scale. All these algorithms and their relative interest in comparison with existing ones were then tested and discussed on a toy example and three biological data sets: structural patterns in protein loop structures, PROSITE signatures in a bacterial proteome, and transcription factors in upstream gene regions. On these data sets, we also compared our exact approaches to the tempting approximation that consists in concatenating the sequences in the data set into a single sequence. Our algorithms prove to be effective and able to handle real data sets with multiple sequences, as well as biological patterns of
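The Markov chain embedding idea in records 20 and 1 can be illustrated in a deliberately tiny setting: a KMP-style automaton for a single word, i.i.d. letters (a zero-order model), and an exact count distribution obtained by dynamic programming over (automaton state, occurrences so far). The paper's Algorithms 1-3 go far beyond this sketch (pattern sets, heterogeneous Markov models, moment generating functions); the word and probabilities below are toy choices.

```python
# Toy exact pattern-count distribution via automaton embedding (i.i.d. model).
from collections import defaultdict

def kmp_automaton(word, alphabet):
    # State q = length of the longest prefix of `word` matched so far.
    m = len(word)
    delta = {}
    for q in range(m + 1):
        for a in alphabet:
            # Context after reading `a`; drop the first letter on a full
            # match so overlapping occurrences are counted.
            s = (word[:q] if q < m else word[1:]) + a
            t = min(len(s), m)
            while t > 0 and s[-t:] != word[:t]:
                t -= 1
            delta[(q, a)] = t
    return delta

def count_distribution(word, n, probs):
    # Exact law of the number of occurrences of `word` in a length-n
    # sequence with i.i.d. letter probabilities `probs`.
    m = len(word)
    delta = kmp_automaton(word, list(probs))
    dist = defaultdict(float)
    dist[(0, 0)] = 1.0            # (automaton state, occurrences so far)
    for _ in range(n):
        new = defaultdict(float)
        for (q, c), p in dist.items():
            for a, pa in probs.items():
                q2 = delta[(q, a)]
                new[(q2, c + (q2 == m))] += p * pa
        dist = new
    out = defaultdict(float)
    for (q, c), p in dist.items():
        out[c] += p               # marginalize out the automaton state
    return dict(out)

d = count_distribution("AT", 3, {"A": 0.5, "T": 0.5})
print(d)  # probabilities of 0 and 1 occurrences (0.5 each here)
```

For a set of independent sequences, the naive route convolves per-sequence distributions; avoiding that product of convolutions is exactly the point of the paper's Algorithm 1.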

  2. 78 FR 4855 - Random Drug Testing Rate for Covered Crewmembers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-23

    ... DEPARTMENT OF HOMELAND SECURITY Coast Guard [Docket No. USCG-2009-0973] Random Drug Testing Rate for Covered Crewmembers AGENCY: Coast Guard, DHS. ACTION: Notice of minimum random drug testing rate. SUMMARY: The Coast Guard has set the calendar year 2013 minimum random drug testing rate at 25 percent of...

  3. 76 FR 79204 - Random Drug Testing Rate for Covered Crewmembers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-21

    ... DEPARTMENT OF HOMELAND SECURITY Coast Guard [Docket No. USCG-2009-0973] Random Drug Testing Rate for Covered Crewmembers AGENCY: Coast Guard, DHS. ACTION: Notice of minimum random drug testing rate. SUMMARY: The Coast Guard has set the calendar year 2012 minimum random drug testing rate at 50 percent of...

  4. 76 FR 1448 - Random Drug Testing Rate for Covered Crewmembers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-10

    ... DEPARTMENT OF HOMELAND SECURITY Coast Guard [Docket No. USCG-2009-0973] Random Drug Testing Rate for Covered Crewmembers AGENCY: Coast Guard, DHS. ACTION: Notice of minimum random drug testing rate. SUMMARY: The Coast Guard has set the calendar year 2011 minimum random drug testing rate at 50 percent of...

  5. Random catalytic reaction networks

    NASA Astrophysics Data System (ADS)

    Stadler, Peter F.; Fontana, Walter; Miller, John H.

    1993-03-01

    We study networks that are a generalization of replicator (or Lotka-Volterra) equations. They model the dynamics of a population of object types whose binary interactions determine the specific type of interaction product. Such a system always reduces its dimension to a subset that contains production pathways for all of its members. The network equation can be rewritten at a level of collectives in terms of two basic interaction patterns: replicator sets and cyclic transformation pathways among sets. Although the system contains well-known cases that exhibit very complicated dynamics, the generic behavior of randomly generated systems is found (numerically) to be extremely robust: convergence to a globally stable rest point. It is easy to tailor networks that display replicator interactions where the replicators are entire self-sustaining subsystems, rather than structureless units. A numerical scan of random systems highlights the special properties of elementary replicators: they reduce the effective interconnectedness of the system, resulting in enhanced competition, and strong correlations between the concentrations.
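The replicator dynamics underlying such networks can be integrated numerically in a few lines: each concentration grows in proportion to its fitness advantage over the population mean, which keeps the state on the simplex. The matrix size, random interaction matrix, and step size below are invented for illustration; this is a generic replicator sketch, not the authors' full catalytic-network model.

```python
# Minimal replicator-dynamics sketch with a random interaction matrix.
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.random((n, n))               # assumed random catalytic efficiencies

def replicator_step(x, dt=0.01):
    fitness = A @ x                  # (A x)_k: fitness of type k
    phi = x @ fitness                # mean fitness; subtracting it keeps sum(x)=1
    return x + dt * x * (fitness - phi)

x = np.full(n, 1.0 / n)              # start at the barycenter of the simplex
for _ in range(20000):
    x = replicator_step(x)
    x = np.clip(x, 0.0, None)
    x /= x.sum()                     # guard against numerical drift
print(x.sum())
```

Running many such integrations from random matrices is one way to observe the generic behavior the abstract reports: convergence to a globally stable rest point.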

  6. Sediment distribution pattern mapped from the combination of objective analysis and geostatistics in the large shallow Taihu Lake, China.

    PubMed

    Luo, Lian-Cong; Qin, Bo-Qiang; Zhu, Guang-Wei

    2004-01-01

Investigation was made into sediment depth at 723 irregularly scattered measurement points covering all regions of Taihu Lake, China. A combination of a successive correction scheme and a geostatistical method was used to obtain values of recent sediment thickness at 69 x 69 grid points across the whole lake. The results showed a significant difference in sediment depth between the eastern area and the western region: most of the sediments are located along the western shoreline and in the northern regions, with little in the center and eastern parts. A notable exception is the patch between the center and Xishan Island, where the maximum sediment depth is more than 4.0 m. This sediment distribution pattern is most likely related to the circulation pattern induced by the prevailing wind-forcing in Taihu Lake; numerical simulation of the hydrodynamics strongly supports this conclusion. Sediment effects on water quality were also studied, and the results showed that the concentrations of TP, TN and SS in the western part are markedly larger than those in the eastern region, suggesting that more nutrients can be released from thicker sediment areas.

  7. Detection of terrain indices related to soil salinity and mapping salt-affected soils using remote sensing and geostatistical techniques.

    PubMed

    Triki Fourati, Hela; Bouaziz, Moncef; Benzina, Mourad; Bouaziz, Samir

    2017-04-01

Traditional surveying methods for soil properties over landscapes are dramatically costly and time-consuming; remote sensing is thus a proper choice for monitoring environmental problems. This research aims to study the effect of environmental factors on soil salinity and to map the spatial distribution of this salinity over the south-eastern part of Tunisia by means of remote sensing and geostatistical techniques. For this purpose, we used Advanced Spaceborne Thermal Emission and Reflection Radiometer data to depict geomorphological parameters: elevation, slope, plan curvature (PLC), profile curvature (PRC), and aspect. Pearson correlation between these parameters and soil electrical conductivity (ECsoil) showed that mainly slope and elevation affect the concentration of salt in soil. Moreover, spectral analysis illustrated the high potential of short-wave infrared (SWIR) bands to identify saline soils. To map soil salinity in southern Tunisia, ordinary kriging (OK), minimum distance (MD) classification, and simple regression (SR) were used. The findings showed that the ordinary kriging technique provides the most reliable performance in identifying and classifying saline soils over the study area, with a root mean square error of 1.83 and a mean error of 0.018.

  8. Prospective, randomized comparison of 3 different hemoclips for the treatment of acute upper GI hemorrhage in an established experimental setting.

    PubMed

    Kato, Masayuki; Jung, Yunho; Gromski, Mark A; Chuttani, Ram; Matthes, Kai

    2012-01-01

    Recently, endoscopic clip application devices have undergone redesign and improvements to optimize their clinical use and effectiveness. Initially designed for the treatment of bleeding nonvariceal lesions, these devices are also increasingly used for the closure of perforations, fistulas, and anastomotic leaks. Several clinical studies, both randomized and nonrandomized, have used endoscopic hemoclips for hemostasis. However, no comparative studies have yet been reported in the literature comparing the latest endoscopic clip devices for usability and effectiveness for hemostasis of acute upper GI hemorrhage. We aimed to compare the usability and efficacy of 3 different types of endoscopic clip application devices in an established experimental setting by using a porcine ex-vivo simulator of upper GI hemorrhage. Randomized, controlled, ex-vivo study. Academic medical center. Spurting vessels were created within ex-vivo porcine stomachs as published in prior studies. The vessels were attached to a pressure transducer to record the pressure of the circulating blood replacement. Before the initiation of bleeding, each vessel was randomized to 1 of 3 endoscopic clipping devices: 2 different commonly used hemoclips deployed through the working channel and 1 novel clip deployed via an over-the-scope applications device. Two investigators treated 45 bleeding sites (15 bleeding sites for each device at various randomized locations in the stomach: fundus, body, and antrum). Usability was measured via the endpoints of procedure time and quantity of clips required to achieve hemostasis. Efficacy was measured via the endpoint of pressure increase (Δp) from baseline to after treatment. All of the 45 hemostasis treatments were carried out successfully. The mean procedure times were significantly different among the hemoclips, with the clip deployed in an over-the-scope fashion requiring significantly less time to attain hemostasis compared with the other 2 clips. 
For number of

  9. Determination of geostatistically representative sampling locations in Porsuk Dam Reservoir (Turkey)

    NASA Astrophysics Data System (ADS)

    Aksoy, A.; Yenilmez, F.; Duzgun, S.

    2013-12-01

Several factors, such as wind action, the bathymetry and shape of a lake/reservoir, inflows, outflows, and point and diffuse pollution sources, result in spatial and temporal variations in the water quality of lakes and reservoirs. Guides by the United Nations Environment Programme and the World Health Organization on designing and implementing water quality monitoring programs suggest that even a single monitoring station near the center or at the deepest part of a lake will be sufficient to observe long-term trends if there is good horizontal mixing. In stratified water bodies, several samples can be required. According to the sampling and analysis guide under the Turkish Water Pollution Control Regulation, a minimum of five sampling locations should be employed to characterize the water quality in a reservoir or a lake. The European Union Water Framework Directive (2000/60/EC) requires the selection of a sufficient number of monitoring sites to assess the magnitude and impact of point and diffuse sources and hydromorphological pressures when designing a monitoring program. Although existing regulations and guidelines include frameworks for determining sampling locations in surface waters, most do not specify a procedure for establishing representative sampling locations in lakes and reservoirs. In this study, geostatistical tools are used to determine representative sampling locations in the Porsuk Dam Reservoir (PDR). Kernel density estimation and kriging were used in combination to select the representative sampling locations. Dissolved oxygen and specific conductivity were measured at 81 points; sixteen of them were used for validation. In selecting the representative sampling locations, care was taken to preserve the spatial structure of the distributions of the measured parameters, and a procedure was proposed for that purpose. Results indicated that the spatial structure was lost below 30 sampling points. This was as a result of varying water
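One ingredient of that workflow, kernel density estimation over the measurement locations, can be sketched directly. The coordinates, bandwidth, and the "keep the densest points" selection rule below are invented illustrations (the study combines KDE with kriging and a spatial-structure criterion, which this sketch does not reproduce).

```python
# Hedged sketch: Gaussian kernel density over measurement locations.
import numpy as np

rng = np.random.default_rng(3)
pts = rng.random((81, 2))            # 81 synthetic measurement locations

def kde(query, pts, h=0.1):
    # Sum of isotropic 2-D Gaussian kernels of bandwidth h.
    d2 = ((query[:, None, :] - pts[None, :, :])**2).sum(-1)
    return np.exp(-d2 / (2 * h**2)).sum(1) / (len(pts) * 2 * np.pi * h**2)

dens = kde(pts, pts)
# Naive surrogate for site selection: retain the 16 densest locations.
keep = pts[np.argsort(dens)[-16:]]
print(keep.shape)  # (16, 2)
```

A real reduction from 81 to a smaller representative set would, as in the abstract, check that kriged maps from the subset preserve the spatial structure of the full data.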

  10. Geostatistics applied to cross-well reflection seismic for imaging carbonate aquifers

    NASA Astrophysics Data System (ADS)

    Parra, Jorge; Emery, Xavier

    2013-05-01

    Cross-well seismic reflection data, acquired from a carbonate aquifer at Port Mayaca test site near the eastern boundary of Lake Okeechobee in Martin County, Florida, are used to delineate flow units in the region intercepted by two wells. The interwell impedance determined by inversion from the seismic reflection data allows us to visualize the major boundaries between the hydraulic units. The hydraulic (flow) unit properties are based on the integration of well logs and the carbonate structure, which consists of isolated vuggy carbonate units and interconnected vug systems within the carbonate matrix. The vuggy and matrix porosity logs based on Formation Micro-Imager (FMI) data provide information about highly permeable conduits at well locations. The integration of the inverted impedance and well logs using geostatistics helps us to assess the resolution of the cross-well seismic method for detecting conduits and to determine whether these conduits are continuous or discontinuous between wells. A productive water zone of the aquifer outlined by the well logs was selected for analysis and interpretation. The ELAN (Elemental Log Analysis) porosity from two wells was selected as primary data and the reflection seismic-based impedance as secondary data. The direct and cross variograms along the vertical wells capture nested structures associated with periodic carbonate units, which correspond to connected flow units between the wells. Alternatively, the horizontal variogram of impedance (secondary data) provides scale lengths that correspond to irregular boundary shapes of flow units. The ELAN porosity image obtained by cokriging exhibits three similar flow units at different depths. These units are thin conduits developed in the first well and, at about the middle of the interwell separation region, these conduits connect to thicker flow units that are intercepted by the second well. 
In addition, a high impedance zone (low porosity) at a depth of about 275 m, after

  11. Measuring Constraint-Set Utility for Partitional Clustering Algorithms

    NASA Technical Reports Server (NTRS)

    Davidson, Ian; Wagstaff, Kiri L.; Basu, Sugato

    2006-01-01

    Clustering with constraints is an active area of machine learning and data mining research. Previous empirical work has convincingly shown that adding constraints to clustering improves the performance of a variety of algorithms. However, in most of these experiments, results are averaged over different randomly chosen constraint sets from a given set of labels, thereby masking interesting properties of individual sets. We demonstrate that constraint sets vary significantly in how useful they are for constrained clustering; some constraint sets can actually decrease algorithm performance. We create two quantitative measures, informativeness and coherence, that can be used to identify useful constraint sets. We show that these measures can also help explain differences in performance for four particular constrained clustering algorithms.
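The informativeness idea from the abstract, roughly, how much a constraint set tells the algorithm that its unconstrained clustering does not already know, can be sketched as the fraction of constraints violated by the unconstrained result. The toy labels and constraint pairs below are invented; the paper defines the measure per algorithm and also introduces a separate coherence measure not shown here.

```python
# Sketch of an informativeness-style measure for a constraint set.
def informativeness(labels, must_link, cannot_link):
    # Fraction of constraints violated by a clustering produced
    # WITHOUT using the constraints.
    unsat = 0
    for i, j in must_link:
        unsat += labels[i] != labels[j]    # must-link violated if split
    for i, j in cannot_link:
        unsat += labels[i] == labels[j]    # cannot-link violated if merged
    return unsat / (len(must_link) + len(cannot_link))

labels = [0, 0, 1, 1, 2]       # hypothetical unconstrained clustering
ml = [(0, 1), (1, 4)]          # pairs that should share a cluster
cl = [(2, 3)]                  # pair that should be separated
iv = informativeness(labels, ml, cl)
print(iv)  # 2/3: (1, 4) and (2, 3) are violated, (0, 1) is satisfied
```

A constraint set with informativeness near zero is already satisfied by the algorithm's default bias and is unlikely to change its output much.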

  12. APPLICATION OF BAYESIAN AND GEOSTATISTICAL MODELING TO THE ENVIRONMENTAL MONITORING OF CS-137 AT THE IDAHO NATIONAL LABORATORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kara G. Eby

    2010-08-01

At the Idaho National Laboratory (INL) Cs-137 concentrations above the U.S. Environmental Protection Agency risk-based threshold of 0.23 pCi/g may increase the risk of human mortality due to cancer. As a leader in nuclear research, the INL has been conducting nuclear activities for decades. Elevated anthropogenic radionuclide levels including Cs-137 are a result of atmospheric weapons testing, the Chernobyl accident, and nuclear activities occurring at the INL site. Therefore environmental monitoring and long-term surveillance of Cs-137 is required to evaluate risk. However, due to the large land area involved, frequent and comprehensive monitoring is limited. Developing a spatial model that predicts Cs-137 concentrations at unsampled locations will enhance the spatial characterization of Cs-137 in surface soils, provide guidance for an efficient monitoring program, and pinpoint areas requiring mitigation strategies. The predictive model presented herein is based on applied geostatistics using a Bayesian analysis of environmental characteristics across the INL site, which provides kriging spatial maps of both Cs-137 estimates and prediction errors. Comparisons are presented of two different kriging methods, showing that the use of secondary information (i.e., environmental characteristics) can provide improved prediction performance in some areas of the INL site.

  13. Evaluation of spatial variability of soil arsenic adjacent to a disused cattle-dip site, using model-based geostatistics.

    PubMed

    Niazi, Nabeel K; Bishop, Thomas F A; Singh, Balwant

    2011-12-15

This study investigated the spatial variability of total and phosphate-extractable arsenic (As) concentrations in soil adjacent to a cattle-dip site, employing a linear mixed model-based geostatistical approach. The soil samples in the study area (n = 102 in 8.1 m²) were taken at the nodes of a 0.30 × 0.35 m grid. The results showed that the total As concentration (0-0.2 m depth) and the phosphate-extractable As concentration (at depths of 0-0.2, 0.2-0.4, and 0.4-0.6 m) in soil adjacent to the dip varied greatly. Both total and phosphate-extractable soil As concentrations increased significantly (p = 0.004-0.048) toward the cattle-dip. Using the linear mixed model, we suggest that 5 samples are sufficient to assess a dip site for soil As contamination (95% confidence interval of ±475.9 mg kg⁻¹), but 15 samples (95% confidence interval of ±212.3 mg kg⁻¹) are a desirable baseline when the ultimate goal is to evaluate the effects of phytoremediation. Such guidelines on sampling requirements are crucial for assessing As contamination levels at other cattle-dip sites and for determining the effect of phytoremediation on soil As.
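Model-based geostatistical analyses such as this one typically begin from the classical (Matheron) estimator of the empirical semivariogram over the sampling grid. A minimal sketch with synthetic coordinates and values; the function and variable names are illustrative, not taken from the paper:

```python
import numpy as np

def empirical_semivariogram(coords, values, bin_edges):
    """Classical Matheron estimator:
    gamma(h) = (1 / 2N(h)) * sum over pairs at lag h of (z_i - z_j)^2."""
    dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    half_sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)      # each pair counted once
    dist, half_sq = dist[iu], half_sq[iu]
    gamma = np.full(len(bin_edges) - 1, np.nan)
    for i in range(len(bin_edges) - 1):
        m = (dist >= bin_edges[i]) & (dist < bin_edges[i + 1])
        if m.any():
            gamma[i] = half_sq[m].mean()
    return gamma

# Synthetic sample locations and values on a 10 m x 10 m plot.
rng = np.random.default_rng(0)
coords = rng.uniform(0.0, 10.0, size=(60, 2))
values = np.sin(coords[:, 0]) + 0.1 * rng.standard_normal(60)
gamma = empirical_semivariogram(coords, values, np.linspace(0.0, 5.0, 6))
```

A parametric model (spherical, exponential, or a linear mixed model as in the paper) would then be fitted to the binned values of `gamma`.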

  14. Geostatistical approach for assessing soil volumes requiring remediation: validation using lead-polluted soils underlying a former smelting works.

    PubMed

    Demougeot-Renard, Helene; De Fouquet, Chantal

    2004-10-01

Assessing the volume of soil requiring remediation and the accuracy of this assessment constitutes an essential step in polluted site management. If this remediation volume is not properly assessed, misclassification may lead both to environmental risks (polluted soils may not be remediated) and financial risks (unexpected discovery of polluted soils may generate additional remediation costs). To minimize such risks, this paper proposes a geostatistical methodology based on stochastic simulations that allows the remediation volume and its uncertainty to be assessed using investigation data. The methodology thoroughly reproduces the conditions in which the soils are classified and extracted at the remediation stage. The validity of the approach is tested by applying it to the data collected during the investigation phase of a former lead smelting works and by comparing the results with the volume that was actually remediated. This real remediated volume comprised all the remediation units that were classified as polluted after systematic sampling and analysis during the clean-up stage. The volume estimated from the 75 samples collected during site investigation slightly overestimates (5.3% relative error) the remediated volume deduced from the 212 remediation units. Furthermore, the real volume falls within the range of uncertainty predicted by the proposed methodology.
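The simulation-based volume assessment can be caricatured in a few lines: classify each remediation unit in each equiprobable realization, exactly as units would be classified at the clean-up stage, and read off the distribution of the total polluted volume. All numbers below are invented stand-ins, and true conditional geostatistical simulation is replaced by simple noise around a trend:

```python
import numpy as np

rng = np.random.default_rng(0)
n_units, n_real = 212, 500      # remediation units, equiprobable realizations
unit_vol = 10.0                 # hypothetical volume per unit (m^3)
threshold = 300.0               # hypothetical action level (mg/kg lead)

# Stand-in for conditional geostatistical simulations: each column is one
# equiprobable realization of the concentration over all units.
mean = rng.uniform(100.0, 500.0, n_units)
realizations = mean[:, None] + 80.0 * rng.standard_normal((n_units, n_real))

# Classify units in every realization and read off the distribution of the
# total polluted volume; the spread quantifies the assessment uncertainty.
polluted_vol = (realizations > threshold).sum(axis=0) * unit_vol
q05, q50, q95 = np.percentile(polluted_vol, [5, 50, 95])
```

The interval [q05, q95] plays the role of the "range of uncertainty" against which the real remediated volume is compared in the paper.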

  15. Quantitative comparison of randomization designs in sequential clinical trials based on treatment balance and allocation randomness.

    PubMed

    Zhao, Wenle; Weng, Yanqiu; Wu, Qi; Palesch, Yuko

    2012-01-01

To evaluate the performance of randomization designs under various parameter settings and trial sample sizes, and to identify optimal designs with respect to both treatment imbalance and allocation randomness, we evaluate 260 design scenarios from 14 randomization designs under 15 sample sizes ranging from 10 to 300, using three measures of imbalance and three measures of randomness. The maximum absolute imbalance and the correct guess (CG) probability are selected to assess the trade-off performance of each randomization design. As measured by the maximum absolute imbalance and the CG probability, the performances of the 14 randomization designs are located in a closed region whose upper boundary (worst case) is given by Efron's biased coin design (EBCD) and whose lower boundary (best case) by Soares and Wu's big stick design (BSD). Designs close to the lower boundary provide smaller imbalance and higher randomness than designs close to the upper boundary. Our research suggests that optimization of randomization designs is possible based on a quantified evaluation of imbalance and randomness. Based on the maximum imbalance and CG probability, the BSD, Chen's biased coin design with imbalance tolerance, and Chen's Ehrenfest urn design perform better than the popularly used permuted block design, the EBCD, and Wei's urn design. Copyright © 2011 John Wiley & Sons, Ltd.
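The two boundary designs named above are simple to state in code. A hedged sketch (True denotes arm A; the values of p and b are chosen for illustration only) of Efron's biased coin, the Soares-Wu big stick design, and the maximum-absolute-imbalance measure used in the comparison:

```python
import random

def efron_bcd(n, p=2/3, seed=1):
    """Efron's biased coin: favour the under-represented arm with prob p."""
    rng, diff, seq = random.Random(seed), 0, []
    for _ in range(n):
        if diff == 0:
            a = rng.random() < 0.5
        elif diff > 0:                 # arm A is ahead, favour arm B
            a = rng.random() < 1 - p
        else:                          # arm B is ahead, favour arm A
            a = rng.random() < p
        seq.append(a)
        diff += 1 if a else -1
    return seq

def big_stick(n, b=3, seed=1):
    """Soares and Wu's big stick design: toss a fair coin until the absolute
    imbalance reaches the tolerance b, then assign the lagging arm."""
    rng, diff, seq = random.Random(seed), 0, []
    for _ in range(n):
        if diff >= b:
            a = False                  # force arm B
        elif diff <= -b:
            a = True                   # force arm A
        else:
            a = rng.random() < 0.5
        seq.append(a)
        diff += 1 if a else -1
    return seq

def max_abs_imbalance(seq):
    """Maximum absolute treatment imbalance over the allocation sequence."""
    diff, worst = 0, 0
    for a in seq:
        diff += 1 if a else -1
        worst = max(worst, abs(diff))
    return worst

seq_bcd = efron_bcd(100)
seq_bsd = big_stick(100)
```

By construction the big stick design caps the imbalance at b, while the biased coin only pushes it back probabilistically; the CG probability would be estimated by replaying each sequence against the convergence-strategy guesser.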

  16. Spatial analysis and risk mapping of soil-transmitted helminth infections in Brazil, using Bayesian geostatistical models.

    PubMed

    Scholte, Ronaldo G C; Schur, Nadine; Bavia, Maria E; Carvalho, Edgar M; Chammartin, Frédérique; Utzinger, Jürg; Vounatsou, Penelope

    2013-11-01

Soil-transmitted helminths (Ascaris lumbricoides, Trichuris trichiura and hookworm) negatively impact the health and wellbeing of hundreds of millions of people, particularly in tropical and subtropical countries, including Brazil. Reliable maps of the spatial distribution and estimates of the number of infected people are required for the control and eventual elimination of soil-transmitted helminthiasis. We used advanced Bayesian geostatistical modelling, coupled with geographical information systems and remote sensing, to visualize the distribution of the three soil-transmitted helminth species in Brazil. Remotely sensed climatic and environmental data, along with socioeconomic variables from readily available databases, were employed as predictors. Our models provided mean prevalence estimates for A. lumbricoides, T. trichiura and hookworm of 15.6%, 10.1% and 2.5%, respectively. By considering infection risk and population numbers at the unit of the municipality, we estimate that 29.7 million Brazilians are infected with A. lumbricoides, 19.2 million with T. trichiura and 4.7 million with hookworm. Our model-based maps identify important risk factors related to the transmission of soil-transmitted helminths and confirm that environmental variables are closely associated with indices of poverty. Our smoothed risk maps, including uncertainty, highlight areas where soil-transmitted helminthiasis control interventions are most urgently required, namely in the North and along most of the coastal areas of Brazil. We believe that our predictive risk maps are useful for disease control managers for prioritising control interventions and for providing a tool for more efficient surveillance-response mechanisms.

  17. Instructional Set, Deep Relaxation and Growth Enhancement: A Pilot Study

    ERIC Educational Resources Information Center

    Leeb, Charles; And Others

    1976-01-01

    This study provides experimental evidence that instructional set can influence access to altered states of consciousness. Fifteen male subjects were randomly assigned to three groups, each of which received the same autogenic biofeedback training in hand temperature control, but each group received a different attitudinal set. (Editor)

  18. Modelling wildland fire propagation by tracking random fronts

    NASA Astrophysics Data System (ADS)

    Pagnini, G.; Mentrelli, A.

    2014-08-01

Wildland fire propagation is studied in the literature by two alternative approaches: the reaction-diffusion equation and the level-set method. These two approaches are considered alternatives to each other because the solution of the reaction-diffusion equation is generally a continuous smooth function with an exponential decay that is nonzero over an infinite domain, while the level-set method, a front-tracking technique, generates a sharp function that is nonzero only inside a compact domain. However, these two approaches can in fact be considered complementary and reconciled. Turbulent hot-air transport and fire spotting are phenomena with a random nature, and they are extremely important in wildland fire propagation. Consequently, the fire front acquires a random character too; hence, a tracking method for random fronts is needed. In particular, the level-set contour is randomised here according to the probability density function of the interface particle displacement. Indeed, when the level-set method is extended to track a front interface with random motion, the resulting averaged process turns out to be governed by an evolution equation of the reaction-diffusion type. In this reconciled approach, the rate of spread of the fire keeps the same key, characterising role that it has in the level-set approach. The resulting model proves suitable for simulating effects due to turbulent convection, such as fire flank and backing fire, the faster fire spread caused by hot-air pre-heating and ember landing, and the fire overcoming a fire-break zone, a case not resolved by models based on the level-set method. Moreover, the proposed formulation yields a correction to the rate-of-spread formula due to the mean jump length of firebrands in the downwind direction for the leeward sector of the fireline contour.
The presented study constitutes a proof of concept, and it
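The central reconciliation step, averaging the sharp level-set indicator over the displacement PDF to obtain a smooth, reaction-diffusion-like profile, can be demonstrated numerically in one dimension. The Gaussian displacement law and all parameter values below are illustrative assumptions, not the paper's calibrated model:

```python
import numpy as np

rng = np.random.default_rng(0)

x = np.linspace(-10.0, 10.0, 401)
front = 2.0      # mean front position from the level-set rate of spread
sigma = 1.5      # spread of the random interface-particle displacements

# One realization is a sharp indicator of the burned region shifted by a
# random displacement; averaging many realizations over the displacement
# PDF produces a smooth effective field with diffuse, nonzero tails.
shifts = rng.normal(0.0, sigma, size=2000)
phi_eff = np.mean([(x < front + s).astype(float) for s in shifts], axis=0)
```

The averaged field `phi_eff` decays smoothly from 1 (burned) to 0 (unburned) instead of jumping, which is exactly the qualitative behaviour of a reaction-diffusion solution.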

  19. Multipartite nonlocality and random measurements

    NASA Astrophysics Data System (ADS)

    de Rosier, Anna; Gruca, Jacek; Parisio, Fernando; Vértesi, Tamás; Laskowski, Wiesław

    2017-07-01

We present an exhaustive numerical analysis of violations of local realism by families of multipartite quantum states. As an indicator of nonclassicality we employ the probability of violation for randomly sampled observables. Surprisingly, it rapidly increases with the number of parties or settings, and even for relatively small numbers local realism is violated for almost all observables. We observed this effect to be typical in the sense that it emerged for all investigated states, including some with randomly drawn coefficients. We also present the probability of violation as a witness of genuine multipartite entanglement.

  20. Systematic versus random sampling in stereological studies.

    PubMed

    West, Mark J

    2012-12-01

The sampling that takes place at all levels of an experimental design must be random if the estimate is to be unbiased in a statistical sense. There are two fundamental ways to draw a random sample of the sections and positions to be probed on the sections. Using a card-sampling analogy, one can pick any card at all out of a deck of cards. This is referred to as independent random sampling, because each card is sampled without reference to the positions of the other cards. The other approach is to pick one card at random from within a fixed interval and then take the remaining cards at equal intervals through the deck. Systematic sampling along one axis of many biological structures is more efficient than independent random sampling, because most biological structures are not randomly organized. This article discusses the merits of systematic versus random sampling in stereological studies.
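The card analogy translates directly into code. A sketch of the two schemes over a 52-item "deck" (function names and the seed are illustrative):

```python
import random

def independent_sample(n_items, k, seed=7):
    """Independent random sampling: each pick is made without reference to
    the positions of the others ('any card at all from the deck')."""
    return sorted(random.Random(seed).sample(range(n_items), k))

def systematic_sample(n_items, k, seed=7):
    """Systematic uniform random sampling: one random start inside the
    first interval, then every (n_items // k)-th item thereafter."""
    step = n_items // k
    start = random.Random(seed).randrange(step)
    return list(range(start, start + step * k, step))

idx_ind = independent_sample(52, 4)
idx_sys = systematic_sample(52, 4)
```

Only the starting position of the systematic sample is random; the equal spacing is what makes it more efficient when the structure being sampled has its own spatial organization.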

  1. Tensor Minkowski Functionals for random fields on the sphere

    NASA Astrophysics Data System (ADS)

    Chingangbam, Pravabati; Yogendran, K. P.; Joby, P. K.; Ganesan, Vidhya; Appleby, Stephen; Park, Changbom

    2017-12-01

    We generalize the translation invariant tensor-valued Minkowski Functionals which are defined on two-dimensional flat space to the unit sphere. We apply them to level sets of random fields. The contours enclosing boundaries of level sets of random fields give a spatial distribution of random smooth closed curves. We outline a method to compute the tensor-valued Minkowski Functionals numerically for any random field on the sphere. Then we obtain analytic expressions for the ensemble expectation values of the matrix elements for isotropic Gaussian and Rayleigh fields. The results hold on flat as well as any curved space with affine connection. We elucidate the way in which the matrix elements encode information about the Gaussian nature and statistical isotropy (or departure from isotropy) of the field. Finally, we apply the method to maps of the Galactic foreground emissions from the 2015 PLANCK data and demonstrate their high level of statistical anisotropy and departure from Gaussianity.
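The simplest scalar companion of these functionals, the area fraction of the excursion set above a threshold ν, has the closed Gaussian form ½ erfc(ν/√2). A flat-space toy check using white noise rather than a sky map, which suffices for this one-point functional:

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(1)

# Toy stand-in for a Gaussian random field: white noise on a flat grid.
field = rng.standard_normal((512, 512))
nu = 1.0                                  # excursion-set threshold

measured = float(np.mean(field > nu))     # area fraction of the level set
analytic = 0.5 * erfc(nu / sqrt(2.0))     # Gaussian prediction
```

Departures of `measured` from `analytic` across thresholds are one of the standard signatures of non-Gaussianity that the tensor-valued functionals refine with directional (anisotropy) information.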

  2. An instrumental variable random-coefficients model for binary outcomes

    PubMed Central

    Chesher, Andrew; Rosen, Adam M

    2014-01-01

    In this paper, we study a random-coefficients model for a binary outcome. We allow for the possibility that some or even all of the explanatory variables are arbitrarily correlated with the random coefficients, thus permitting endogeneity. We assume the existence of observed instrumental variables Z that are jointly independent with the random coefficients, although we place no structure on the joint determination of the endogenous variable X and instruments Z, as would be required for a control function approach. The model fits within the spectrum of generalized instrumental variable models, and we thus apply identification results from our previous studies of such models to the present context, demonstrating their use. Specifically, we characterize the identified set for the distribution of random coefficients in the binary response model with endogeneity via a collection of conditional moment inequalities, and we investigate the structure of these sets by way of numerical illustration. PMID:25798048

  3. Identification of the Hydrogeochemical Processes in Groundwater Using Classic Integrated Geochemical Methods and Geostatistical Techniques, in Amol-Babol Plain, Iran

    PubMed Central

    Sheikhy Narany, Tahoora; Ramli, Mohammad Firuz; Aris, Ahmad Zaharin; Sulaiman, Wan Nor Azmin; Juahir, Hafizan; Fakharian, Kazem

    2014-01-01

Hydrogeochemical investigations were carried out at the Amol-Babol Plain in the north of Iran. Geochemical processes and factors controlling the groundwater chemistry were identified by combining classic geochemical methods with geographic information system (GIS) and geostatistical techniques. The results of the ionic ratios and Gibbs plots show that water-rock interaction mechanisms, followed by cation exchange and the dissolution of carbonate and silicate minerals, have influenced the groundwater chemistry in the study area. The hydrogeochemical characteristics of the groundwater show a shift from low-mineralized Ca-HCO₃, Ca-Na-HCO₃, and Ca-Cl water types to a high-mineralized Na-Cl water type. Three classes, namely C1, C2, and C3, were identified using cluster analysis. The spatial distribution maps of the Na⁺/Cl⁻, Mg²⁺/Ca²⁺, and Cl⁻/HCO₃⁻ ratios and of electrical conductivity values indicate that the dissolution of carbonate and the weathering of silicate minerals played a significant role in the groundwater chemistry on the southern and western sides of the plain. However, the salinization process has increased towards the north-eastern side of the study area under the influence of the evaporation-precipitation process. PMID:24523640

  4. Acceptability of Home-Assessment Post Medical Abortion and Medical Abortion in a Low-Resource Setting in Rajasthan, India. Secondary Outcome Analysis of a Non-Inferiority Randomized Controlled Trial

    PubMed Central

    Paul, Mandira; Iyengar, Kirti; Essén, Birgitta; Gemzell-Danielsson, Kristina; Iyengar, Sharad D.; Bring, Johan; Soni, Sunita; Klingberg-Allvin, Marie

    2015-01-01

Background Studies evaluating acceptability of simplified follow-up after medical abortion have focused on high-resource or urban settings where telephones, road connections, and modes of transport are available and where women have formal education. Objective To investigate women’s acceptability of home-assessment of abortion and whether acceptability of medical abortion differs by in-clinic or home-assessment of abortion outcome in a low-resource setting in India. Design Secondary outcome of a randomised, controlled, non-inferiority trial. Setting Outpatient primary health care clinics in rural and urban Rajasthan, India. Population Women were eligible if they sought abortion with a gestation up to 9 weeks, lived within the defined study area and agreed to follow-up. Women were ineligible if they had known contraindications to medical abortion, haemoglobin < 85 mg/l or were below 18 years of age. Methods Abortion outcome assessment through routine clinic follow-up by a doctor was compared with home-assessment using a low-sensitivity pregnancy test and a pictorial instruction sheet. A computerized random number generator generated the randomisation sequence (1:1) in blocks of six. Research assistants randomly allocated eligible women who opted for medical abortion (mifepristone and misoprostol), using opaque sealed envelopes. Blinding during outcome assessment was not possible. Main Outcome Measures Women’s acceptability of home-assessment was measured as future preference of follow-up. Overall satisfaction, expectations, and comparison with previous abortion experiences were compared between study groups. Results 731 women were randomized to the clinic follow-up group (n = 353) or home-assessment group (n = 378). 623 (85%) women were successfully followed up, of whom 597 (96%) were satisfied and 592 (95%) found the abortion better than or as expected, with no difference between study groups. The majority, 355 (57%) women, preferred home-assessment in the event of a future

  5. Training With Curved Laparoscopic Instruments in Single-Port Setting Improves Performance Using Straight Instruments: A Prospective Randomized Simulation Study.

    PubMed

    Lukovich, Peter; Sionov, Valery Ben; Kakucs, Timea

    2016-01-01

Single-port surgery is lately becoming a widespread procedure, but it is more difficult than conventional laparoscopy owing to the lack of triangulation. Although these operations are also possible with standard laparoscopic instruments, curved instruments are being developed. The aims of the study were to identify the effect of training on a box trainer in a single-port setting on the quality of the acquired skills and their transfer between straight and curved instruments for basic laparoscopic tasks, and to highlight the importance of a special laparoscopic training curriculum. A prospective study on a box trainer in a single-port setting was conducted using 2 groups. Each group performed 2 tasks on the box trainer in a single-port setting. Group-S used conventional straight laparoscopic instruments, and Group-C used curved laparoscopic instruments. Learning curves were obtained from daily measurements recorded in 7-day sessions. On the last day, the 2 groups exchanged instruments. The study took place at the 1st Department of Surgery, Semmelweis University of Medicine, Budapest, Hungary, a university teaching hospital. In all, 20 fifth-year medical students were randomized into 2 groups. None of them had any laparoscopic or endoscopic experience, and participation was voluntary. Although Group-S performed all tasks significantly faster than Group-C on the first day, the difference proved to be nonsignificant by the last day. All participants achieved significantly shorter task completion times on the last day than on the first day, regardless of the instrument they used. Group-S showed an improvement of 63.5%, and Group-C of 69.0%, by the end of the session. After swapping instruments, Group-S reached significantly longer task completion times with the curved instruments, whereas Group-C showed a further improvement of 8.9% with the straight instruments. Training with curved instruments in a single-port setting allows for a better acquisition of skills in a shorter period. For this

  6. Academic Performance of Subsequent Schools and Impacts of Early Interventions: Evidence from a Randomized Controlled Trial in Head Start Settings

    PubMed Central

    Zhai, Fuhua; Raver, C. Cybele; Jones, Stephanie M.

    2012-01-01

    The role of subsequent school contexts in the long-term effects of early childhood interventions has received increasing attention, but has been understudied in the literature. Using data from the Chicago School Readiness Project (CSRP), a cluster-randomized controlled trial conducted in Head Start programs, we investigate whether the intervention had differential effects on academic and behavioral outcomes in kindergarten if children attended high- or low-performing schools subsequent to the preschool intervention year. To address the issue of selection bias, we adopt an innovative method, principal score matching, and control for a set of child, mother, and classroom covariates. We find that exposure to the CSRP intervention in the Head Start year had significant effects on academic and behavioral outcomes in kindergarten for children who subsequently attended high-performing schools, but no significant effects on children attending low-performing schools. Policy implications of the findings are discussed. PMID:22773872

  7. Quantifying randomness in real networks

    NASA Astrophysics Data System (ADS)

    Orsini, Chiara; Dankulov, Marija M.; Colomer-de-Simón, Pol; Jamakovic, Almerima; Mahadevan, Priya; Vahdat, Amin; Bassler, Kevin E.; Toroczkai, Zoltán; Boguñá, Marián; Caldarelli, Guido; Fortunato, Santo; Krioukov, Dmitri

    2015-10-01

    Represented as graphs, real networks are intricate combinations of order and disorder. Fixing some of the structural properties of network models to their values observed in real networks, many other properties appear as statistical consequences of these fixed observables, plus randomness in other respects. Here we employ the dk-series, a complete set of basic characteristics of the network structure, to study the statistical dependencies between different network properties. We consider six real networks--the Internet, US airport network, human protein interactions, technosocial web of trust, English word network, and an fMRI map of the human brain--and find that many important local and global structural properties of these networks are closely reproduced by dk-random graphs whose degree distributions, degree correlations and clustering are as in the corresponding real network. We discuss important conceptual, methodological, and practical implications of this evaluation of network randomness, and release software to generate dk-random graphs.
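The lowest rung of the dk-series, matching only the degree sequence ("1k"), is commonly realized by double-edge swaps: repeatedly rewiring pairs of edges so that every node keeps its degree while everything else randomizes. A hedged, dependency-free sketch (all names illustrative; degree correlations and clustering, fixed at the higher 2k and 2.5k rungs, are deliberately not preserved here):

```python
import random

def rewire(edges, n_swaps, seed=3):
    """Double-edge swaps: randomize a graph while preserving every node's
    degree, i.e. the '1k' (degree-sequence) level of the dk-series."""
    rng = random.Random(seed)
    edges = [tuple(e) for e in edges]
    edge_set = set(edges)
    for _ in range(n_swaps):
        (a, b), (c, d) = rng.sample(edges, 2)
        if len({a, b, c, d}) < 4:      # swap would create a self-loop
            continue
        if (a, d) in edge_set or (d, a) in edge_set or \
           (c, b) in edge_set or (b, c) in edge_set:
            continue                   # swap would create a multi-edge
        i, j = edges.index((a, b)), edges.index((c, d))
        edge_set -= {(a, b), (c, d)}
        edges[i], edges[j] = (a, d), (c, b)
        edge_set |= {(a, d), (c, b)}
    return edges

def degrees(edge_list):
    """Degree of every node appearing in the edge list."""
    deg = {}
    for u, v in edge_list:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    return deg

# A small toy graph: a 10-cycle plus two chords.
edges = [(i, (i + 1) % 10) for i in range(10)] + [(0, 5), (2, 7)]
randomized = rewire(edges, 200)
```

Comparing a real network against an ensemble of such rewired graphs shows which of its properties follow from the degree sequence alone and which require the higher-order constraints the paper studies.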

  8. Family, Community and Clinic Collaboration to Treat Overweight and Obese Children: Stanford GOALS -- a Randomized Controlled Trial of a Three-Year, Multi-Component, Multi-Level, Multi-Setting Intervention

    PubMed Central

    Robinson, Thomas N.; Matheson, Donna; Desai, Manisha; Wilson, Darrell M.; Weintraub, Dana L.; Haskell, William L.; McClain, Arianna; McClure, Samuel; Banda, Jorge; Sanders, Lee M.; Haydel, K. Farish; Killen, Joel D.

    2013-01-01

    Objective To test the effects of a three-year, community-based, multi-component, multi-level, multi-setting (MMM) approach for treating overweight and obese children. Design Two-arm, parallel group, randomized controlled trial with measures at baseline, 12, 24, and 36 months after randomization. Participants Seven through eleven year old, overweight and obese children (BMI ≥ 85th percentile) and their parents/caregivers recruited from community locations in low-income, primarily Latino neighborhoods in Northern California. Interventions Families are randomized to the MMM intervention versus a community health education active-placebo comparison intervention. Interventions last for three years for each participant. The MMM intervention includes a community-based after school team sports program designed specifically for overweight and obese children, a home-based family intervention to reduce screen time, alter the home food/eating environment, and promote self-regulatory skills for eating and activity behavior change, and a primary care behavioral counseling intervention linked to the community and home interventions. The active-placebo comparison intervention includes semi-annual health education home visits, monthly health education newsletters for children and for parents/guardians, and a series of community-based health education events for families. Main Outcome Measure Body mass index trajectory over the three-year study. Secondary outcome measures include waist circumference, triceps skinfold thickness, accelerometer-measured physical activity, 24-hour dietary recalls, screen time and other sedentary behaviors, blood pressure, fasting lipids, glucose, insulin, hemoglobin A1c, C-reactive protein, alanine aminotransferase, and psychosocial measures. Conclusions The Stanford GOALS trial is testing the efficacy of a novel community-based multi-component, multi-level, multi-setting treatment for childhood overweight and obesity in low-income, Latino families

  9. Family, community and clinic collaboration to treat overweight and obese children: Stanford GOALS-A randomized controlled trial of a three-year, multi-component, multi-level, multi-setting intervention.

    PubMed

    Robinson, Thomas N; Matheson, Donna; Desai, Manisha; Wilson, Darrell M; Weintraub, Dana L; Haskell, William L; McClain, Arianna; McClure, Samuel; Banda, Jorge A; Sanders, Lee M; Haydel, K Farish; Killen, Joel D

    2013-11-01

    To test the effects of a three-year, community-based, multi-component, multi-level, multi-setting (MMM) approach for treating overweight and obese children. Two-arm, parallel group, randomized controlled trial with measures at baseline, 12, 24, and 36 months after randomization. Seven through eleven year old, overweight and obese children (BMI ≥ 85th percentile) and their parents/caregivers recruited from community locations in low-income, primarily Latino neighborhoods in Northern California. Families are randomized to the MMM intervention versus a community health education active-placebo comparison intervention. Interventions last for three years for each participant. The MMM intervention includes a community-based after school team sports program designed specifically for overweight and obese children, a home-based family intervention to reduce screen time, alter the home food/eating environment, and promote self-regulatory skills for eating and activity behavior change, and a primary care behavioral counseling intervention linked to the community and home interventions. The active-placebo comparison intervention includes semi-annual health education home visits, monthly health education newsletters for children and for parents/guardians, and a series of community-based health education events for families. Body mass index trajectory over the three-year study. Secondary outcome measures include waist circumference, triceps skinfold thickness, accelerometer-measured physical activity, 24-hour dietary recalls, screen time and other sedentary behaviors, blood pressure, fasting lipids, glucose, insulin, hemoglobin A1c, C-reactive protein, alanine aminotransferase, and psychosocial measures. The Stanford GOALS trial is testing the efficacy of a novel community-based multi-component, multi-level, multi-setting treatment for childhood overweight and obesity in low-income, Latino families. © 2013 Elsevier Inc. All rights reserved.

  10. Evaluation of Effectiveness and Cost‐Effectiveness of a Clinical Decision Support System in Managing Hypertension in Resource Constrained Primary Health Care Settings: Results From a Cluster Randomized Trial

    PubMed Central

    Anchala, Raghupathy; Kaptoge, Stephen; Pant, Hira; Di Angelantonio, Emanuele; Franco, Oscar H.; Prabhakaran, D.

    2015-01-01

Background Randomized controlled trials from the developed world report that clinical decision support systems (DSS) could provide an effective means to improve the management of hypertension (HTN). However, evidence from developing countries in this regard is rather limited, and there is a need to assess the impact of a clinical DSS on managing HTN in primary health care center (PHC) settings. Methods and Results We performed a cluster randomized trial to test the effectiveness and cost-effectiveness of a clinical DSS among Indian adult hypertensive patients (between 35 and 64 years of age), wherein 16 PHC clusters from a district of Telangana state, India, were randomized to receive either a DSS or a chart-based support (CBS) system. Each intervention arm had 8 PHC clusters, with a mean of 102 hypertensive patients per cluster (n=845 in DSS and 783 in CBS groups). Mean change in systolic blood pressure (SBP) from baseline to 12 months was the primary endpoint. The mean difference in SBP change from baseline between the DSS and CBS at the 12th month of follow-up, adjusted for age, sex, height, waist, body mass index, alcohol consumption, vegetable intake, pickle intake, and baseline differences in blood pressure, was −6.59 mm Hg (95% confidence interval: −12.18 to −1.42; P=0.021). The cost-effectiveness ratio for the CBS and DSS groups was $96.01 and $36.57 per mm Hg of SBP reduction, respectively. Conclusion Clinical DSS are effective and cost-effective in the management of HTN in resource-constrained PHC settings. Clinical Trial Registration URL: http://www.ctri.nic.in. Unique identifier: CTRI/2012/03/002476. PMID:25559011

  11. Common Randomness Principles of Secrecy

    ERIC Educational Resources Information Center

    Tyagi, Himanshu

    2013-01-01

    This dissertation concerns the secure processing of distributed data by multiple terminals, using interactive public communication among themselves, in order to accomplish a given computational task. In the setting of a probabilistic multiterminal source model in which several terminals observe correlated random signals, we analyze secure…

  12. Pilot randomized trial of therapeutic hypothermia with serial cranial ultrasound and 18-22 month follow-up for neonatal encephalopathy in a low resource hospital setting in Uganda: study protocol.

    PubMed

    Robertson, Nicola J; Hagmann, Cornelia F; Acolet, Dominique; Allen, Elizabeth; Nyombi, Natasha; Elbourne, Diana; Costello, Anthony; Jacobs, Ian; Nakakeeto, Margaret; Cowan, Frances

    2011-06-04

There is now convincing evidence that in industrialized countries therapeutic hypothermia for perinatal asphyxial encephalopathy increases survival with normal neurological function. However, the greatest burden of perinatal asphyxia falls in low and mid-resource settings where it is unclear whether therapeutic hypothermia is safe and effective. Under the UCL Uganda Women's Health Initiative, a pilot randomized controlled trial in infants with perinatal asphyxia was set up in the special care baby unit in Mulago Hospital, a large public hospital with ~20,000 births in Kampala, Uganda, to determine: (i) the feasibility of achieving consent, neurological assessment, randomization and whole-body cooling to a core temperature of 33-34°C using water bottles; (ii) the temperature profile of encephalopathic infants with standard care; (iii) the pattern, severity and evolution of brain tissue injury as seen on cranial ultrasound and its relation to outcome; and (iv) the feasibility of neurodevelopmental follow-up at 18-22 months of age. Ethical approval was obtained from Makerere University and Mulago Hospital. All infants were in-born. Parental consent for entry into the trial was obtained. Thirty-six infants were randomized either to standard care plus cooling (target rectal temperature of 33-34°C for 72 hrs, started within 3 h of birth) or standard care alone. All other aspects of management were the same. Cooling was performed using water bottles filled with tepid tap water (25°C). Rectal, axillary, ambient and surface water bottle temperatures were monitored continuously for the first 80 h. Encephalopathy scoring was performed on days 1-4, and a structured, scorable neurological examination and head circumference measurement were performed on days 7 and 17. Cranial ultrasound was performed on days 1, 3 and 7 and scored. Griffiths developmental quotient, head circumference, neurological examination and assessment of gross motor function were obtained at 18-22 months. We will highlight differences in

  13. Comparative Geostatistical Analysis of Flowmeter and Direct-Push Hydraulic Conductivity Profiles at the MADE Site

    NASA Astrophysics Data System (ADS)

    Bohling, G.; Liu, G.; Knobbe, S. J.; Reboulet, E. C.; Hyndman, D. W.; Dietrich, P.; Butler, J. J.

    2010-12-01

    Spatial variations in hydraulic conductivity (K) are a critical control on subsurface solute transport. Characterization of such variations at the resolution (cm to dm) required for transport investigations, however, has proven to be a formidable challenge. A new generation of direct-push (DP) tools has now been developed for the characterization of vertical K variations at this resolution. These tools, which can be run in high- (0.015 m) and low- (0.4 m) resolution modes, were recently applied to the extensively studied and highly heterogeneous MADE site. Results from a geostatistical analysis of 64 DP K profiles compare favorably with the flowmeter K data that have served as the primary basis for previous MADE studies. The global statistics of the low-resolution DP and flowmeter K data are in excellent agreement. The correlation structures for the high-resolution DP data show excellent agreement with those computed from the flowmeter data. However, the geometric mean DP K value for high-resolution profiling is roughly one order of magnitude lower than the geometric mean flowmeter K value, possibly as a result of the biases inherent in each approach compounded with differences in the areal distribution of flowmeter and DP profile locations. A DP profile through the MADE aquifer to a depth of 12 m can be completed in as little as 1.5-2 hours, a small fraction of the time required to obtain a single flowmeter profile when well drilling, installation, and development are considered. The results of this study demonstrate that DP profiling is a practically feasible approach for characterization of spatial variations in K at the resolution required for transport investigations in highly heterogeneous systems.
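
    The comparison reported above rests on two quantities: the geometric mean of K and the correlation (variogram) structure of the profiles. A minimal sketch of both computations on synthetic log10-K profiles (all values invented; only the 0.015 m DP sample spacing is taken from the abstract):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical log10-K profiles, synthetic stand-ins for DP and flowmeter data.
# The one-order-of-magnitude offset between them is built in for illustration.
logk_dp = rng.normal(loc=-4.0, scale=1.0, size=500)
logk_fm = rng.normal(loc=-3.0, scale=1.0, size=500)

def geometric_mean_K(log10_k):
    """Geometric mean of K equals 10 ** (arithmetic mean of log10-K)."""
    return 10.0 ** np.mean(log10_k)

def experimental_variogram(z, dz, max_lag):
    """Classical semivariogram estimator for a regularly spaced profile:
    gamma(h) = 0.5 * mean((z(x) - z(x + h)) ** 2)."""
    lags, gammas = [], []
    n = 1
    while n * dz <= max_lag:
        diffs = z[n:] - z[:-n]
        lags.append(n * dz)
        gammas.append(0.5 * np.mean(diffs ** 2))
        n += 1
    return np.array(lags), np.array(gammas)

ratio = geometric_mean_K(logk_fm) / geometric_mean_K(logk_dp)
lags, gammas = experimental_variogram(logk_dp, dz=0.015, max_lag=0.3)
print(f"geometric-mean ratio (flowmeter / DP): {ratio:.1f}")
```

    With uncorrelated synthetic values the variogram is essentially flat (pure nugget); on real profiles the rise of gamma(h) with lag is what the correlation-structure comparison examines.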

  14. Effects of Animation on Naming and Identification across Two Graphic Symbol Sets Representing Verbs and Prepositions

    ERIC Educational Resources Information Center

    Schlosser, Ralf W.; Koul, Rajinder; Shane, Howard; Sorce, James; Brock, Kristofer; Harmon, Ashley; Moerlein, Dorothy; Hearn, Emilia

    2014-01-01

    Purpose: The effects of animation on naming and identification of graphic symbols for verbs and prepositions were studied in 2 graphic symbol sets in preschoolers. Method: Using a 2 × 2 × 2 × 3 completely randomized block design, preschoolers across three age groups were randomly assigned to combinations of symbol set (Autism Language Program…

  15. Ensemble Pruning for Glaucoma Detection in an Unbalanced Data Set.

    PubMed

    Adler, Werner; Gefeller, Olaf; Gul, Asma; Horn, Folkert K; Khan, Zardad; Lausen, Berthold

    2016-12-07

    Random forests are successful classifier ensemble methods consisting of typically 100 to 1000 classification trees. Ensemble pruning techniques reduce the computational cost, especially the memory demand, of random forests by reducing the number of trees without relevant loss of performance, or even with increased performance of the sub-ensemble. The application to the problem of early detection of glaucoma, a severe eye disease with low prevalence, based on topographical measurements of the eye background, faces specific challenges. We examine the performance of ensemble pruning strategies for glaucoma detection in an unbalanced data situation. The data set consists of 102 topographical features of the eye background of 254 healthy controls and 55 glaucoma patients. We compare the area under the receiver operating characteristic curve (AUC) and the Brier score, on the total data set, in the majority class, and in the minority class, of pruned random forest ensembles obtained with strategies based on the prediction accuracy of greedily grown sub-ensembles, the uncertainty-weighted accuracy, and the similarity between single trees. To validate the findings and to examine the influence of the prevalence of glaucoma in the data set, we additionally perform a simulation study with lower prevalences of glaucoma. In glaucoma classification, all three pruning strategies lead to improved AUC and smaller Brier scores on the total data set with sub-ensembles as small as 30 to 80 trees, compared to the classification results obtained with the full ensemble of 1000 trees. In the simulation study, we show that the prevalence of glaucoma is a critical factor: lower prevalence decreases the performance of our pruning strategies. The memory demand for glaucoma classification in an unbalanced data situation based on random forests could effectively be reduced by the application of pruning strategies without loss of performance in a population with increased
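
    The first pruning strategy named above, greedy growth of a sub-ensemble by prediction accuracy, can be sketched on a toy unbalanced problem. Decision stumps stand in for the forest's trees and validation AUC for the selection criterion; all data, thresholds, and sizes are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy unbalanced binary problem: one informative feature, ~20% positives.
n = 600
y = (rng.random(n) < 0.2).astype(int)
x = rng.normal(loc=y * 1.5, scale=1.0)

# "Forest" of 50 decision stumps: stump k votes 1 if x > thresholds[k].
thresholds = rng.uniform(-1.0, 2.5, size=50)
votes = (x[:, None] > thresholds[None, :]).astype(int)   # shape (n, 50)

train, val = np.arange(0, 400), np.arange(400, 600)

def ensemble_auc(members, idx):
    """AUC of the mean vote of the selected stumps on samples idx
    (Mann-Whitney formulation; ties count as 0.5)."""
    score = votes[np.ix_(idx, members)].mean(axis=1)
    pos, neg = score[y[idx] == 1], score[y[idx] == 0]
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties

# Greedy forward selection: repeatedly add the stump that most improves
# validation AUC; stop as soon as no candidate gives a strict improvement.
selected, remaining = [], list(range(50))
best_auc = 0.0
while remaining:
    auc, m = max((ensemble_auc(selected + [m], val), m) for m in remaining)
    if auc <= best_auc:
        break
    best_auc = auc
    selected.append(m)
    remaining.remove(m)

print(f"pruned ensemble: {len(selected)} of 50 stumps, val AUC = {best_auc:.3f}")
```

    The same loop applies unchanged to real trees: only `votes` (the per-tree predictions) changes. The uncertainty-weighted and tree-similarity strategies from the abstract would replace the `ensemble_auc` criterion.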

  16. Random blood glucose may be used to assess long-term glycaemic control among patients with type 2 diabetes mellitus in a rural African clinical setting.

    PubMed

    Rasmussen, Jon B; Nordin, Lovisa S; Rasmussen, Niclas S; Thomsen, Jakúp A; Street, Laura A; Bygbjerg, Ib C; Christensen, Dirk L

    2014-12-01

    To investigate the diagnostic accuracy of random blood glucose (RBG) on good glycaemic control among patients with diabetes mellitus (DM) in a rural African setting. Cross-sectional study at St. Francis' Hospital in eastern Zambia. RBG and HbA1c were measured during one clinical review only. Other information obtained included age, sex, body mass index, waist circumference, blood pressure, urine albumin-creatinine ratio, duration since diagnosis and medication. One hundred and one patients with DM (type 1 DM = 23, type 2 DM = 78) were included. Spearman's rank correlation coefficient revealed a significant correlation between RBG and HbA1c among the patients with type 2 DM (r = 0.73, P < 0.001) but not among patients with type 1 DM (r = 0.17, P = 0.44). Furthermore, in a multivariate linear regression model (R² = 0.71), RBG (per mmol/l increment) (B = 0.28, 95% CI: 0.24-0.32, P < 0.001) was significantly associated with HbA1c among the patients with type 2 DM. Based on ROC analysis (AUC = 0.80, SE = 0.05), RBG ≤7.5 mmol/l was determined as the optimal cut-off value for good glycaemic control (HbA1c <7.0% [53 mmol/mol]) among patients with type 2 DM (sensitivity = 76.7%; specificity = 70.8%; positive predictive value = 62.2%; negative predictive value = 82.9%). Random blood glucose could possibly be used to assess glycaemic control among patients with type 2 DM in rural settings of sub-Saharan Africa. © 2014 John Wiley & Sons Ltd.
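
    An optimal cut-off such as the reported RBG ≤7.5 mmol/l is read off the ROC curve. The sketch below uses the Youden index (sensitivity + specificity − 1) as the selection rule, which is one common convention and an assumption here; the glucose data are invented, not the study's:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical RBG values (mmol/l) for patients with (1) and without (0)
# good glycaemic control; controlled patients tend to run lower.
good = rng.normal(6.5, 1.5, size=60)    # HbA1c < 7.0% in this toy example
poor = rng.normal(10.0, 2.5, size=40)
rbg = np.concatenate([good, poor])
controlled = np.concatenate([np.ones(60), np.zeros(40)])

def youden_cutoff(values, positive):
    """Pick the cut-off c (value <= c predicts 'controlled') that maximizes
    sensitivity + specificity - 1 over all observed candidate cut-offs."""
    best_c, best_j = None, -1.0
    for c in np.unique(values):
        pred = values <= c
        sens = np.mean(pred[positive == 1])     # true-positive rate
        spec = np.mean(~pred[positive == 0])    # true-negative rate
        j = sens + spec - 1.0
        if j > best_j:
            best_c, best_j = c, j
    return best_c, best_j

cutoff, j = youden_cutoff(rbg, controlled)
print(f"optimal RBG cut-off: {cutoff:.1f} mmol/l (Youden J = {j:.2f})")
```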

  17. Choosing HIV Counseling and Testing Strategies for Outreach Settings: A Randomized Trial.

    PubMed

    Spielberg, Freya; Branson, Bernard M; Goldbaum, Gary M; Lockhart, David; Kurth, Ann; Rossini, Anthony; Wood, Robert W

    2005-03-01

    In surveys, clients have expressed preferences for alternatives to traditional HIV counseling and testing. Few data exist to document how offering such alternatives affects acceptance of HIV testing and receipt of test results. This randomized controlled trial compared types of HIV tests and counseling at a needle exchange and 2 bathhouses to determine which types most effectively ensured that clients received test results. Four alternatives were offered on randomly determined days: (1) traditional test with standard counseling, (2) rapid test with standard counseling, (3) oral fluid test with standard counseling, and (4) traditional test with choice of written pretest materials or standard counseling. Of 17,010 clients offered testing, 7014 (41%) were eligible; of those eligible, 761 (11%) were tested: 324 at the needle exchange and 437 at the bathhouses. At the needle exchange, more clients accepted testing (odds ratio [OR] = 2.3; P < 0.001) and received results (OR = 2.6; P < 0.001) on days when the oral fluid test was offered compared with the traditional test. At the bathhouses, more clients accepted oral fluid testing (OR = 1.6; P < 0.001), but more clients overall received results on days when the rapid test was offered (OR = 1.9; P = 0.01). Oral fluid testing and rapid blood testing at both outreach venues resulted in significantly more people receiving test results compared with traditional HIV testing. Making counseling optional increased testing at the needle exchange but not at the bathhouses.
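
    The odds ratios quoted above compare acceptance on days when each test type was offered. A minimal sketch of the computation with a Wald 95% confidence interval; the 2×2 counts are invented for illustration and are not the trial's data:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio and 95% Wald CI for a 2x2 table:
               outcome+   outcome-
    exposed       a          b
    control       c          d
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical counts: clients receiving / not receiving results on
# alternative-test days vs. traditional-test days.
or_, (lo, hi) = odds_ratio(45, 15, 30, 25)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```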

  18. Integrating indicator-based geostatistical estimation and aquifer vulnerability of nitrate-N for establishing groundwater protection zones

    NASA Astrophysics Data System (ADS)

    Jang, Cheng-Shin; Chen, Shih-Kai

    2015-04-01

    Groundwater nitrate-N contamination occurs frequently in agricultural regions, primarily resulting from surface agricultural activities. The focus of this study is to establish groundwater protection zones based on indicator-based geostatistical estimation and aquifer vulnerability of nitrate-N in the Choushui River alluvial fan in Taiwan. The groundwater protection zones are determined by univariate indicator kriging (IK) estimation, aquifer vulnerability assessment using logistic regression (LR), and integration of the IK estimation and aquifer vulnerability using simple IK with local prior means (sIKlpm). First, according to the statistical significance of source, transport, and attenuation factors dominating the occurrence of nitrate-N pollution, a LR model was adopted to evaluate aquifer vulnerability and to characterize occurrence probability of nitrate-N exceeding 0.5 mg/L. Moreover, the probabilities estimated using LR were regarded as local prior means. IK was then used to estimate the actual extent of nitrate-N pollution. The integration of the IK estimation and aquifer vulnerability was obtained using sIKlpm. Finally, groundwater protection zones were probabilistically determined using the three aforementioned methods, and the estimated accuracy of the delineated groundwater protection zones was gauged using a cross-validation procedure based on observed nitrate-N data. The results reveal that the integration of the IK estimation and aquifer vulnerability using sIKlpm is more robust than univariate IK estimation and aquifer vulnerability assessment using LR for establishing groundwater protection zones. Rigorous management practices for fertilizer use should be implemented in orchards situated in the determined groundwater protection zones.
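
    The sIKlpm step described above combines a locally varying prior mean (from logistic regression) with simple kriging of the indicator residuals. A 1D toy sketch under assumed parameters (exponential covariance, fixed logit coefficients, a single farmland covariate); nothing here is fitted to the study's data:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy 1D setting: indicator of nitrate-N exceeding 0.5 mg/L at sampled wells,
# plus a covariate (fraction of surrounding farmland) driving vulnerability.
x = np.sort(rng.uniform(0, 10, 30))                  # well locations (km)
farm = np.clip(x / 10 + rng.normal(0, 0.1, 30), 0, 1)
ind = (rng.random(30) < farm).astype(float)          # exceedance indicators

# Step 1: logistic-regression-style prior probability (fixed coefficients
# here; the study fits them from source/transport/attenuation factors).
prior = 1.0 / (1.0 + np.exp(-(4.0 * farm - 2.0)))

def sik_local_means(x0, p0, xs, data, priors, corr_len=2.0, sill=0.25):
    """Simple kriging of indicator residuals with locally varying means:
    estimate = local prior + sum_i w_i * (I_i - prior_i)."""
    cov = lambda h: sill * np.exp(-np.abs(h) / corr_len)  # exponential model
    K = cov(xs[:, None] - xs[None, :])
    k0 = cov(xs - x0)
    w = np.linalg.solve(K + 1e-9 * np.eye(len(xs)), k0)   # kriging weights
    return float(np.clip(p0 + w @ (data - priors), 0.0, 1.0))

# Step 2: estimate the exceedance probability at an unsampled location.
x0 = 5.0
p0 = 1.0 / (1.0 + np.exp(-(4.0 * (x0 / 10) - 2.0)))
p_est = sik_local_means(x0, p0, x, ind, prior)
print(f"prior = {p0:.2f}, sIKlpm estimate = {p_est:.2f}")
```

    Thresholding the resulting probability map (rather than the prior or the kriged indicators alone) is what delineates the protection zones.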

  19. Geostatistics-based groundwater-level monitoring network design and its application to the Upper Floridan aquifer, USA.

    PubMed

    Bhat, Shirish; Motz, Louis H; Pathak, Chandra; Kuebler, Laura

    2015-01-01

    A geostatistical method was applied to optimize an existing groundwater-level monitoring network in the Upper Floridan aquifer for the South Florida Water Management District in the southeastern United States. Analyses were performed to determine suitable numbers and locations of monitoring wells that will provide equivalent or better quality groundwater-level data compared to an existing monitoring network. Ambient, unadjusted groundwater heads were expressed as salinity-adjusted heads based on the density of freshwater, well screen elevations, and temperature-dependent saline groundwater density. The optimization of the numbers and locations of monitoring wells is based on a pre-defined groundwater-level prediction error. The newly developed network combines an existing network with the addition of new wells that will result in a spatial distribution of groundwater monitoring wells that better defines the regional potentiometric surface of the Upper Floridan aquifer in the study area. The network yields groundwater-level predictions that differ significantly from those produced using the existing network. The newly designed network will reduce the mean prediction standard error by 43% compared to the existing network. The adoption of a hexagonal grid network for the South Florida Water Management District is recommended to achieve both a uniform level of information about groundwater levels and the minimum required accuracy. It is customary to install more monitoring wells for observing groundwater levels and groundwater quality as groundwater development progresses. However, budget constraints often force water managers to implement cost-effective monitoring networks. In this regard, this study provides guidelines to water managers concerned with groundwater planning and monitoring.

  20. Merging parallel tempering with sequential geostatistical resampling for improved posterior exploration of high-dimensional subsurface categorical fields

    NASA Astrophysics Data System (ADS)

    Laloy, Eric; Linde, Niklas; Jacques, Diederik; Mariethoz, Grégoire

    2016-04-01

    The sequential geostatistical resampling (SGR) algorithm is a Markov chain Monte Carlo (MCMC) scheme for sampling from possibly non-Gaussian, complex spatially-distributed prior models such as geologic facies or categorical fields. In this work, we highlight the limits of standard SGR for posterior inference of high-dimensional categorical fields with realistically complex likelihood landscapes and benchmark a parallel tempering implementation (PT-SGR). Our proposed PT-SGR approach is demonstrated using synthetic (error corrupted) data from steady-state flow and transport experiments in 7575- and 10,000-dimensional 2D categorical conductivity fields. In both case studies, every SGR trial gets trapped in a local optimum while PT-SGR maintains a higher diversity in the sampled model states. The advantage of PT-SGR is most apparent in an inverse transport problem where the posterior distribution is made bimodal by construction. PT-SGR then converges towards the appropriate data misfit much faster than SGR and partly recovers the two modes. In contrast, for the same computational resources SGR does not fit the data to the appropriate error level and hardly produces a locally optimal solution that looks visually similar to one of the two reference modes. Although PT-SGR clearly surpasses SGR in performance, our results also indicate that using a small number (16-24) of temperatures (and thus parallel cores) may not permit complete sampling of the posterior distribution by PT-SGR within a reasonable computational time (less than 1-2 weeks).
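
    The parallel tempering mechanism behind PT-SGR can be illustrated on a deliberately bimodal toy posterior: each chain samples a tempered version of the target, and adjacent temperatures occasionally swap states, which lets the cold chain visit both modes. A minimal sketch with invented settings (temperature ladder, proposal width, iteration count):

```python
import math
import random

random.seed(3)

def log_post(m):
    """Deliberately bimodal toy log-posterior (modes near -2 and +2),
    standing in for the bimodal transport inversion in the abstract."""
    return math.log(math.exp(-(m - 2) ** 2) + math.exp(-(m + 2) ** 2))

temps = [1.0, 2.0, 4.0, 8.0]    # ladder: chain 0 targets the true posterior
chains = [random.uniform(-3, 3) for _ in temps]
samples = []

for it in range(20000):
    # Within-chain Metropolis step at each temperature (tempered target).
    for i, T in enumerate(temps):
        prop = chains[i] + random.gauss(0, 0.5)
        if math.log(random.random()) < (log_post(prop) - log_post(chains[i])) / T:
            chains[i] = prop
    # Swap proposal between a random adjacent pair of temperatures.
    j = random.randrange(len(temps) - 1)
    log_alpha = (1 / temps[j] - 1 / temps[j + 1]) * (
        log_post(chains[j + 1]) - log_post(chains[j]))
    if math.log(random.random()) < log_alpha:
        chains[j], chains[j + 1] = chains[j + 1], chains[j]
    samples.append(chains[0])

# After burn-in, the cold chain should spend time in both modes.
frac_left = sum(s < 0 for s in samples[5000:]) / len(samples[5000:])
print(f"fraction of cold-chain samples in the left mode: {frac_left:.2f}")
```

    In PT-SGR the scalar state is replaced by a categorical field and the Metropolis step by a sequential geostatistical resampling update, but the swap rule between temperatures is exactly this one.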