Science.gov

Sample records for geostatistics random sets

  1. Fast Geostatistical Inversion using Randomized Matrix Decompositions and Sketchings for Heterogeneous Aquifer Characterization

    NASA Astrophysics Data System (ADS)

    O'Malley, D.; Le, E. B.; Vesselinov, V. V.

    2015-12-01

    We present a fast, scalable, and easily implementable stochastic inverse method for characterization of aquifer heterogeneity. The method utilizes recent advances in randomized matrix algebra and exploits the structure of the Quasi-Linear Geostatistical Approach (QLGA), without requiring a structured grid like Fast-Fourier Transform (FFT) methods. The QLGA framework is a more stable version of Gauss-Newton iterates for a large number of unknown model parameters, and it provides unbiased estimates. The methods are matrix-free and do not require derivatives or adjoints, and are thus ideal for complex models and black-box implementation. We also incorporate randomized least-squares solvers and data-reduction methods, which speed up computation and simulate missing data points. The new inverse methodology is coded in Julia and implemented in the MADS computational framework (http://mads.lanl.gov). Julia is an advanced high-level scientific programming language that allows for efficient memory management and utilization of high-performance computational resources. Inversion results based on a series of synthetic problems with steady-state and transient calibration data are presented.
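
    The randomized linear algebra at the core of such methods can be illustrated with a generic randomized SVD (range-finder) sketch. This is an illustration of the general technique only, not the MADS/Julia implementation; matrix sizes and the test problem are arbitrary:

```python
import numpy as np

def randomized_svd(A, rank, n_oversample=10, seed=0):
    """Range-finder randomized SVD: sketch col(A), then SVD the small matrix."""
    rng = np.random.default_rng(seed)
    # Gaussian test matrix sketches the column space of A.
    Omega = rng.standard_normal((A.shape[1], rank + n_oversample))
    Q, _ = np.linalg.qr(A @ Omega)       # orthonormal basis for the sketched range
    # SVD of the small projected matrix lifts back to an approximate SVD of A.
    Ub, s, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)
    return Q @ Ub[:, :rank], s[:rank], Vt[:rank]

# An exactly low-rank test matrix standing in for a dense Jacobian/covariance block.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 15)) @ rng.standard_normal((15, 180))
U, s, Vt = randomized_svd(A, rank=15)
err = np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A)
```

Because the sketch dimension (rank plus oversampling) exceeds the true rank here, the reconstruction error is at machine-precision level; on genuinely full-rank covariances the error is governed by the decay of the trailing singular values.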

  2. Distribution of random Cantor sets on tubes

    NASA Astrophysics Data System (ADS)

    Chen, Changhao

    2016-04-01

    We show that there exist (d-1)-Ahlfors regular compact sets E ⊂ R^d, d ≥ 2, such that for any t < d-1 we have sup_T H^{d-1}(E ∩ T)/w(T)^t < ∞, where the supremum is over all tubes T with width w(T) > 0. This settles a question of T. Orponen. The sets we construct are random Cantor sets, and the method combines geometric and probabilistic estimates on the intersections of these random Cantor sets with affine subspaces.

  3. Scaling random walks on arbitrary sets

    NASA Astrophysics Data System (ADS)

    Harris, Simon C.; Williams, David; Sibson, Robin

    1999-01-01

    Let I be a countably infinite set of points in R which we can write as I = {u_i : i ∈ Z}, with u_i < u_{i+1} for every i. It is a classical result that a simple random walk, when repeatedly rescaled suitably in space and time, looks more and more like a Brownian motion. In this paper we explore the convergence properties of the Markov chain Y on the set I under suitable space-time scalings. Later, we consider some cases when the set I consists of the points of a renewal process and the jump rates assigned to each state in I are perhaps also randomly chosen. This work sprang from a question asked by one of us (Sibson) about ‘driftless nearest-neighbour’ Markov chains on countable subsets I of R^d, work of Sibson [7] and of Christ, Friedberg and Lee [2] having identified examples of such chains in terms of the Dirichlet tessellation associated with I. Amongst the methods which can be brought to bear on this d-dimensional problem is the theory of Dirichlet forms. There are potential problems in doing this because we wish I to be random (for example, a realization of a Poisson point process) and we do not wish to impose artificial boundedness conditions which would clearly make things work for certain deterministic sets I. In the 1-dimensional case discussed here and in the following paper by Harris, much simpler techniques (where we embed the Markov chain in a Brownian motion using local time) work very effectively; and it is these, rather than the theory of Dirichlet forms, that we use.
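
    The space-time rescaling referred to above can be checked numerically in the simplest setting, a ±1 random walk on Z: speeding time up by n and shrinking space by sqrt(n) makes the endpoint approximately standard normal. The sample sizes below are arbitrary:

```python
import numpy as np

# Donsker-type rescaling: n steps, space shrunk by sqrt(n).
rng = np.random.default_rng(0)
n, n_paths = 400, 20000
steps = rng.choice([-1.0, 1.0], size=(n_paths, n))
endpoints = steps.sum(axis=1) / np.sqrt(n)   # rescaled walk at time 1

# The endpoint should have mean ~0 and variance ~1, like B(1).
mean_hat, var_hat = endpoints.mean(), endpoints.var()
```

The same limit holds for the Markov chains on irregular point sets studied in the paper, but there the embedding in Brownian motion via local time does the work that the central limit theorem does here.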

  4. GEOSTATISTICAL SAMPLING DESIGNS FOR HAZARDOUS WASTE SITES

    EPA Science Inventory

    This chapter discusses field sampling design for environmental sites and hazardous waste sites with respect to random variable sampling theory, Gy's sampling theory, and geostatistical (kriging) sampling theory. The literature often presents these sampling methods as an adversari...

  5. Imprecise (fuzzy) information in geostatistics

    SciTech Connect

    Bardossy, A.; Bogardi, I.; Kelly, W.E.

    1988-05-01

    A methodology based on fuzzy set theory for the utilization of imprecise data in geostatistics is presented. A common problem preventing a broader use of geostatistics has been the insufficient amount of accurate measurement data. In certain cases, additional but uncertain (soft) information is available and can be encoded as subjective probabilities, and then the soft kriging method can be applied (Journel, 1986). In other cases, a fuzzy encoding of soft information may be more realistic and simplify the numerical calculations. Imprecise (fuzzy) spatial information on the possible variogram is integrated into a single variogram which is used in a fuzzy kriging procedure. The overall uncertainty of prediction is represented by the estimation variance and the calculated membership function for each kriged point. The methodology is applied to the permeability prediction of a soil liner for hazardous waste containment. The available number of hard measurement data (20) was not enough for a classical geostatistical analysis. An additional 20 soft data made it possible to prepare kriged contour maps using the fuzzy geostatistical procedure.
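
    The fuzzy-encoding step can be sketched with a triangular membership function, a common minimal choice for imprecise readings. The permeability numbers below are hypothetical, not from the paper:

```python
def triangular_membership(x, low, mode, high):
    """Membership of x in the triangular fuzzy number (low, mode, high)."""
    if x <= low or x >= high:
        return 0.0
    if x <= mode:
        return (x - low) / (mode - low)   # rising edge
    return (high - x) / (high - mode)     # falling edge

# Encode a soft reading "around 1e-7 cm/s, surely within [5e-8, 2e-7]"
# (hypothetical values) as a triangular fuzzy number.
mu_mode = triangular_membership(1.0e-7, 5e-8, 1e-7, 2e-7)   # = 1 at the mode
mu_off = triangular_membership(1.2e-7, 5e-8, 1e-7, 2e-7)    # partial membership
```

A fuzzy kriging procedure then propagates such membership functions through the kriging weights instead of treating each soft datum as a single hard number.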

  6. The non-integer operation associated to random variation sets of the self-similar set

    NASA Astrophysics Data System (ADS)

    Ren, Fu-Yao; Liang, Jin-Rong

    2000-10-01

    When memory sets are random variation sets of the self-similar set and the total number of remaining states in each stage of the division of this set is normalized to unity, the corresponding flux and fractional integral are “robust” and stable under some conditions. This answers an open problem proposed by Alain Le Méhauté et al. in their book entitled Irréversibilité Temporelle et Géométrie Fractale.

  7. Random set particle filter for bearings-only multitarget tracking

    NASA Astrophysics Data System (ADS)

    Vihola, Matti

    2005-05-01

    The random set approach to multitarget tracking is a theoretically sound framework that covers joint estimation of the number of targets and the state of the targets. This paper describes a particle filter implementation of the random set multitarget filter. The contribution of this paper to the random set tracking framework is the formulation of a measurement model where each sensor report is assumed to contain at most one measurement. The implemented filter was tested in synthetic bearings-only tracking scenarios containing up to two targets in the presence of false alarms and missed measurements. The estimated target state consisted of 2D position and velocity components. The filter was capable of tracking the targets fairly well despite the missing measurements and the relatively high false alarm rates. In addition, the filter showed robustness against misspecified false alarm rates. The results that were obtained during the limited tests of the filter show that the random set framework has potential for challenging tracking situations. On the other hand, the computational burden of the described implementation is quite high and increases approximately linearly with the expected number of targets.

  8. Fractal and geostatistical methods for modeling of a fracture network

    SciTech Connect

    Chiles, J.P.

    1988-08-01

    The modeling of fracture networks is useful for fluid flow and rock mechanics studies. About 6600 fracture traces were recorded on drifts of a uranium mine in a granite massif. The traces have an extension of 0.20-20 m. The network was studied by fractal and by geostatistical methods but can be considered neither as a fractal with a constant dimension nor as a set of purely randomly located fractures. Two kinds of generalization of conventional models can still provide more flexibility for the characterization of the network: (a) a nonscaling fractal model with variable similarity dimension (for a 2-D network of traces, the dimension varying from 2 at the 10-m scale to 1 at the centimeter scale); (b) a parent-daughter model with a regionalized density, where the geostatistical study allows a 3-D model to be established in which fractures are assumed to be discs, fractures are grouped in clusters or swarms, and fracturation density is regionalized (with two ranges, at about 30 and 300 m). The fractal model is easy to fit and to simulate along a line, but 2-D and 3-D simulations are more difficult. The geostatistical model is more complex, but easy to simulate, even in 3-D.
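
    The fractal side of such an analysis rests on box counting. A minimal sketch, using a uniformly filled unit square as test data so the expected dimension is 2 (the sample size and box scales are arbitrary):

```python
import numpy as np

def box_count_dimension(points, eps_list):
    """Box-counting dimension: slope of log N(eps) versus log(1/eps)."""
    counts = []
    for eps in eps_list:
        # Index each point by the grid box of side eps that contains it.
        occupied = set(map(tuple, np.floor(points / eps).astype(int)))
        counts.append(len(occupied))
    return np.polyfit(np.log(1.0 / np.asarray(eps_list)), np.log(counts), 1)[0]

rng = np.random.default_rng(0)
square = rng.random((20000, 2))            # filled square: dimension ~2
dim = box_count_dimension(square, [0.2, 0.1, 0.05, 0.025])
```

A scale-dependent slope in the log-log plot, rather than a single straight line, is exactly the signature of the "nonscaling fractal" behaviour the abstract describes.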

  9. GEO-EAS (GEOSTATISTICAL ENVIRONMENTAL ASSESSMENT SOFTWARE) USER'S GUIDE

    EPA Science Inventory

    The report describes how to install and use the Geo-EAS (Geostatistical Environmental Assessment Software) software package on an IBM-PC compatible computer system. A detailed example is provided showing how to use the software to conduct a geostatistical analysis of a data set. ...

  10. Geostatistics and spatial analysis in biological anthropology.

    PubMed

    Relethford, John H

    2008-05-01

    A variety of methods have been used to make evolutionary inferences based on the spatial distribution of biological data, including reconstructing population history and detection of the geographic pattern of natural selection. This article provides an examination of geostatistical analysis, a method used widely in geology but which has not often been applied in biological anthropology. Geostatistical analysis begins with the examination of a variogram, a plot showing the relationship between a biological distance measure and the geographic distance between data points and which provides information on the extent and pattern of spatial correlation. The results of variogram analysis are used for interpolating values of unknown data points in order to construct a contour map, a process known as kriging. The methods of geostatistical analysis and discussion of potential problems are applied to a large data set of anthropometric measures for 197 populations in Ireland. The geostatistical analysis reveals two major sources of spatial variation. One pattern, seen for overall body and craniofacial size, shows an east-west cline most likely reflecting the combined effects of past population dispersal and settlement. The second pattern is seen for craniofacial height and shows an isolation by distance pattern reflecting rapid spatial changes in the midlands region of Ireland, perhaps attributable to the genetic impact of the Vikings. The correspondence of these results with other analyses of these data and the additional insights generated from variogram analysis and kriging illustrate the potential utility of geostatistical analysis in biological anthropology. PMID:18257009
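
    The variogram computation described above can be sketched as follows (classical Matheron estimator, with a toy 1-D transect laid out in the plane; a real analysis would bin biological distances against geographic distances in the same way):

```python
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    """Classical (Matheron) semivariance per distance bin."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)       # count each pair once
    d, sq = d[iu], sq[iu]
    return np.array([sq[(d >= lo) & (d < hi)].mean()
                     for lo, hi in zip(bin_edges[:-1], bin_edges[1:])])

# Toy transect with a perfect linear trend, so gamma(h) = 0.5 * h**2.
coords = np.column_stack([np.arange(20.0), np.zeros(20)])
values = np.arange(20.0)
gamma = empirical_variogram(coords, values, np.array([0.5, 1.5, 2.5, 3.5]))
```

A semivariance that keeps growing with distance, as here, indicates a spatial trend (cline); a variogram that levels off at a sill indicates isolation by distance, the two patterns the study distinguishes.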

  11. The excursion set approach in non-Gaussian random fields

    NASA Astrophysics Data System (ADS)

    Musso, Marcello; Sheth, Ravi K.

    2014-04-01

    Insight into a number of interesting questions in cosmology can be obtained by studying the first crossing distributions of physically motivated barriers by random walks with correlated steps: higher mass objects are associated with walks that cross the barrier in fewer steps. We write the first crossing distribution as a formal series, ordered by the number of times a walk upcrosses the barrier. Since the fraction of walks with many upcrossings is negligible if the walk has not taken many steps, the leading order term in this series is the most relevant for understanding the massive objects of most interest in cosmology. For walks associated with Gaussian random fields, this first term only requires knowledge of the bivariate distribution of the walk height and slope, and provides an excellent approximation to the first crossing distribution for all barriers and smoothing filters of current interest. We show that this simplicity survives when extending the approach to the case of non-Gaussian random fields. For non-Gaussian fields which are obtained by deterministic transformations of a Gaussian, the first crossing distribution is simply related to that for Gaussian walks crossing a suitably rescaled barrier. Our analysis shows that this is a useful way to think of the generic case as well. Although our study is motivated by the possibility that the primordial fluctuation field was non-Gaussian, our results are general. In particular, they do not assume the non-Gaussianity is small, so they may be viewed as the solution to an excursion set analysis of the late-time, non-linear fluctuation field rather than the initial one. They are also useful for models in which the barrier height is determined by quantities other than the initial density, since most other physically motivated variables (such as the shear) are usually stochastic and non-Gaussian. We use the Lognormal transformation to illustrate some of our arguments.

  12. The use of a genetic algorithm-based search strategy in geostatistics: application to a set of anisotropic piezometric head data

    NASA Astrophysics Data System (ADS)

    Abedini, M. J.; Nasseri, M.; Burn, D. H.

    2012-04-01

    In any geostatistical study, an important consideration is the choice of an appropriate, repeatable, and objective search strategy that controls the nearby samples to be included in the location-specific estimation procedure. Almost all geostatistical software available in the market puts the onus on the user to supply search strategy parameters in a heuristic manner. These parameters are solely controlled by geographical coordinates that are defined for the entire area under study, and the user has no guidance as to how to choose these parameters. The main thesis of the current study is that the selection of search strategy parameters has to be driven by data—both the spatial coordinates and the sample values—and cannot be chosen beforehand. For this purpose, a genetic-algorithm-based ordinary kriging with moving neighborhood technique is proposed. The search capability of a genetic algorithm is exploited to search the feature space for appropriate, either local or global, search strategy parameters. Radius of circle/sphere and/or radii of standard or rotated ellipse/ellipsoid are considered as the decision variables to be optimized by GA. The superiority of GA-based ordinary kriging is demonstrated through application to the Wolfcamp Aquifer piezometric head data. Assessment of numerical results showed that definition of search strategy parameters based on both geographical coordinates and sample values improves cross-validation statistics when compared with that based on geographical coordinates alone. In the case of a variable search neighborhood for each estimation point, optimization of local search strategy parameters for an elliptical support domain—the orientation of which is dictated by anisotropic axes—via GA was able to capture the dynamics of piezometric head in west Texas/New Mexico in an efficient way.

  13. Multiple Extended Target Tracking With Labeled Random Finite Sets

    NASA Astrophysics Data System (ADS)

    Beard, Michael; Reuter, Stephan; Granstrom, Karl; Vo, Ba-Tuong; Vo, Ba-Ngu; Scheel, Alexander

    2016-04-01

    Targets that generate multiple measurements at a given instant in time are commonly known as extended targets. These present a challenge for many tracking algorithms, as they violate one of the key assumptions of the standard measurement model. In this paper, a new algorithm is proposed for tracking multiple extended targets in clutter that is capable of estimating the number of targets, as well as the trajectories of their states, comprising the kinematics, measurement rates and extents. The proposed technique is based on modelling the multi-target state as a generalised labelled multi-Bernoulli (GLMB) random finite set (RFS), within which the extended targets are modelled using gamma Gaussian inverse Wishart (GGIW) distributions. A cheaper variant of the algorithm is also proposed, based on the labelled multi-Bernoulli (LMB) filter. The proposed GLMB/LMB-based algorithms are compared with an extended target version of the cardinalised probability hypothesis density (CPHD) filter, and simulation results show that the (G)LMB has improved estimation and tracking performance.

  14. Robust geostatistical analysis of spatial data

    NASA Astrophysics Data System (ADS)

    Papritz, Andreas; Künsch, Hans Rudolf; Schwierz, Cornelia; Stahel, Werner A.

    2013-04-01

    Most geostatistical software tools rely on non-robust algorithms. This is unfortunate, because outlying observations are the rule rather than the exception, in particular in environmental data sets. Outliers affect the modelling of the large-scale spatial trend, the estimation of the spatial dependence of the residual variation and the predictions by kriging. Identifying outliers manually is cumbersome and requires expertise, because one needs parameter estimates to decide which observation is a potential outlier. Moreover, inference after the rejection of some observations is problematic. A better approach is to use robust algorithms that automatically prevent outlying observations from having undue influence. Former studies on robust geostatistics focused on robust estimation of the sample variogram and on ordinary kriging without external drift. Furthermore, Richardson and Welsh (1995) proposed a robustified version of (restricted) maximum likelihood ([RE]ML) estimation for the variance components of a linear mixed model, which was later used by Marchant and Lark (2007) for robust REML estimation of the variogram. We propose here a novel method for robust REML estimation of the variogram of a Gaussian random field that is possibly contaminated by independent errors from a long-tailed distribution. It is based on robustification of the estimating equations for Gaussian REML estimation (Welsh and Richardson, 1997). Besides robust estimates of the parameters of the external drift and of the variogram, the method also provides standard errors for the estimated parameters, robustified kriging predictions at both sampled and non-sampled locations, and kriging variances. Apart from presenting our modelling framework, we shall present selected simulation results by which we explored the properties of the new method. This will be complemented by an analysis of a data set on heavy metal contamination of the soil in the vicinity of a metal smelter.

  15. In School Settings, Are All RCTs (Randomized Control Trials) Exploratory?

    ERIC Educational Resources Information Center

    Newman, Denis; Jaciw, Andrew P.

    2012-01-01

    The motivation for this paper is the authors' recent work on several randomized control trials in which they found the primary result, which averaged across subgroups or sites, to be moderated by demographic or site characteristics. They are led to examine a distinction that the Institute of Education Sciences (IES) makes between "confirmatory"…

  16. Determining the Significance of Item Order in Randomized Problem Sets

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Heffernan, Neil T.

    2009-01-01

    Researchers who make tutoring systems would like to know which sequences of educational content lead to the most effective learning by their students. The majority of data collected in many ITS systems consist of answers to a group of questions of a given skill often presented in a random sequence. Following work that identifies which items…

  17. Preferential sampling and Bayesian geostatistics: Statistical modeling and examples.

    PubMed

    Cecconi, Lorenzo; Grisotto, Laura; Catelan, Dolores; Lagazio, Corrado; Berrocal, Veronica; Biggeri, Annibale

    2016-08-01

    Preferential sampling refers to any situation in which the spatial process and the sampling locations are not stochastically independent. In this paper, we present two examples of geostatistical analysis in which the usual assumption of stochastic independence between the point process and the measurement process is violated. To account for preferential sampling, we specify a flexible and general Bayesian geostatistical model that includes a shared spatial random component. We apply the proposed model to two different case studies that allow us to highlight three different modeling and inferential aspects of geostatistical modeling under preferential sampling: (1) continuous or finite spatial sampling frame; (2) underlying causal model and relevant covariates; and (3) inferential goals related to mean prediction surface or prediction uncertainty. PMID:27566774

  18. The geostatistical characteristics of the Borden aquifer

    NASA Astrophysics Data System (ADS)

    Woodbury, Allan D.; Sudicky, E. A.

    1991-04-01

    A complete reexamination of Sudicky's (1986) field experiment for the geostatistical characterization of hydraulic conductivity at the Borden aquifer in Ontario, Canada, is performed. The sampled data reveal that a number of outliers (low ln(K) values) are present in the data base. These low values cause difficulties in both variogram estimation and determining population statistics. The analysis shows that assuming either a normal or an exponential distribution for log conductivity is appropriate. The classical, Cressie/Hawkins and squared median of the absolute deviations (SMAD) estimators are used to compute experimental variograms. None of these estimators provides completely satisfactory variograms for the Borden data, with the exception of the classical estimator with outliers removed from the data set. Theoretical exponential variogram parameters are determined from nonlinear (NL) estimation. Differences are obtained between NL fits and those of Sudicky (1986). For the classical-screened estimated variogram, NL fits produce an ln(K) variance of 0.24, a nugget of 0.07, and integral scales of 5.1 m horizontal and 0.21 m vertical along A-A'. For B-B' these values are 0.37, 0.11, 8.3 and 0.34. The fitted parameter set for the B-B' data (horizontal and vertical) is statistically different from the parameter set determined for A-A'. We also evaluate a probabilistic form of Dagan's (1982, 1987) equations relating geostatistical parameters to a tracer cloud's spreading moments. The equations are evaluated using the parameter estimates and covariances determined from line A-A' as input, with a velocity equal to 9.0 cm/day. The results are compared with actual values determined from the field test, but evaluated by both Freyberg (1986) and Rajaram and Gelhar (1988). The geostatistical parameters developed from this study produce an excellent fit to both sets of calculated plume moments when combined with Dagan's stochastic theory for predicting the spread of a
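
    The contrast between the classical and Cressie-Hawkins estimators mentioned above can be demonstrated on synthetic increments at a single lag; the numbers below are illustrative, not the Borden data:

```python
import numpy as np

def classical_semivar(diffs):
    """Matheron estimator: half the mean squared increment."""
    return np.mean(diffs ** 2) / 2.0

def cressie_hawkins_semivar(diffs):
    """Cressie-Hawkins robust estimator based on |diff|**0.5."""
    n = len(diffs)
    return np.mean(np.sqrt(np.abs(diffs))) ** 4 / (2 * (0.457 + 0.494 / n))

rng = np.random.default_rng(0)
diffs = rng.standard_normal(500)        # increments at one lag
diffs_out = diffs.copy()
diffs_out[0] = 50.0                      # a single gross outlier (low-K spike)

ratio_classical = classical_semivar(diffs_out) / classical_semivar(diffs)
ratio_ch = cressie_hawkins_semivar(diffs_out) / cressie_hawkins_semivar(diffs)
```

One outlier inflates the classical estimate several-fold while barely moving the Cressie-Hawkins value, which is why screening outliers (or using a robust estimator) matters so much for the Borden analysis.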

  19. Self-similar continuous cascades supported by random Cantor sets: Application to rainfall data

    NASA Astrophysics Data System (ADS)

    Muzy, Jean-François; Baïle, Rachel

    2016-05-01

    We introduce a variant of continuous random cascade models that extends former constructions introduced by Barral-Mandelbrot and Bacry-Muzy in the sense that they can be supported by sets of arbitrary fractal dimension. The so-introduced sets are exactly self-similar stationary versions of random Cantor sets formerly introduced by Mandelbrot as "random cutouts." We discuss the main mathematical properties of our construction and compute its scaling properties. We then illustrate our purpose on several numerical examples and we consider a possible application to rainfall data. We notably show that our model allows us to reproduce remarkably the distribution of dry period durations.
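
    A discrete random Cantor construction in the spirit of Mandelbrot's random cutouts can be sketched as follows; the keep/split parameters are arbitrary, and the expected fractal dimension is log(keep)/log(split):

```python
import numpy as np

def random_cantor(levels, keep=2, split=3, seed=0):
    """Split every interval into `split` equal pieces at each level and
    keep `keep` of them uniformly at random."""
    rng = np.random.default_rng(seed)
    intervals = [(0.0, 1.0)]
    for _ in range(levels):
        nxt = []
        for a, b in intervals:
            w = (b - a) / split
            for j in sorted(rng.choice(split, size=keep, replace=False)):
                nxt.append((a + j * w, a + (j + 1) * w))
        intervals = nxt
    return intervals

ivs = random_cantor(6)                       # 2**6 intervals survive
total_length = sum(b - a for a, b in ivs)    # (2/3)**6 of the unit interval
```

The continuous "random cutout" sets of the paper are stationary, exactly self-similar analogues of this discrete construction; here the dimension would be log 2 / log 3, as for the deterministic middle-thirds set.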

  20. Self-similar continuous cascades supported by random Cantor sets: Application to rainfall data.

    PubMed

    Muzy, Jean-François; Baïle, Rachel

    2016-05-01

    We introduce a variant of continuous random cascade models that extends former constructions introduced by Barral-Mandelbrot and Bacry-Muzy in the sense that they can be supported by sets of arbitrary fractal dimension. The so-introduced sets are exactly self-similar stationary versions of random Cantor sets formerly introduced by Mandelbrot as "random cutouts." We discuss the main mathematical properties of our construction and compute its scaling properties. We then illustrate our purpose on several numerical examples and we consider a possible application to rainfall data. We notably show that our model allows us to reproduce remarkably the distribution of dry period durations. PMID:27300908

  1. A Practical Primer on Geostatistics

    USGS Publications Warehouse

    Olea, Ricardo A.

    2009-01-01

    THE CHALLENGE: Most geological phenomena are extraordinarily complex in their interrelationships and vast in their geographical extension. Ordinarily, engineers and geoscientists are faced with corporate or scientific requirements to properly prepare geological models with measurements involving a small fraction of the entire area or volume of interest. Exact description of a system such as an oil reservoir is neither feasible nor economically possible. The results are necessarily uncertain. Note that the uncertainty is not an intrinsic property of the systems; it is the result of incomplete knowledge by the observer.

    THE AIM OF GEOSTATISTICS: The main objective of geostatistics is the characterization of spatial systems that are incompletely known, systems that are common in geology. A key difference from classical statistics is that geostatistics uses the sampling location of every measurement. Unless the measurements show spatial correlation, the application of geostatistics is pointless. Ordinarily the need for additional knowledge goes beyond a few points, which explains the display of results graphically as fishnet plots, block diagrams, and maps.

    GEOSTATISTICAL METHODS: Geostatistics is a collection of numerical techniques for the characterization of spatial attributes using primarily two tools: probabilistic models, which are used for spatial data in a manner similar to the way in which time-series analysis characterizes temporal data, and pattern recognition techniques. The probabilistic models are used as a way to handle uncertainty in results away from sampling locations, making a radical departure from alternative approaches like inverse distance estimation methods.

    DIFFERENCES WITH TIME SERIES: When dealing with time-series analysis, users frequently concentrate their attention on extrapolations for making forecasts. Although users of geostatistics may be interested in extrapolation, the methods work at their best interpolating. This simple difference has

  2. Model Selection for Geostatistical Models

    SciTech Connect

    Hoeting, Jennifer A.; Davis, Richard A.; Merton, Andrew A.; Thompson, Sandra E.

    2006-02-01

    We consider the problem of model selection for geospatial data. Spatial correlation is typically ignored in the selection of explanatory variables and this can influence model selection results. For example, the inclusion or exclusion of particular explanatory variables may not be apparent when spatial correlation is ignored. To address this problem, we consider the Akaike Information Criterion (AIC) as applied to a geostatistical model. We offer a heuristic derivation of the AIC in this context and provide simulation results that show that using AIC for a geostatistical model is superior to the often used approach of ignoring spatial correlation in the selection of explanatory variables. These ideas are further demonstrated via a model for lizard abundance. We also employ the principle of minimum description length (MDL) to variable selection for the geostatistical model. The effect of sampling design on the selection of explanatory covariates is also explored.
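
    The flavor of AIC-based variable selection can be sketched for ordinary least squares. Note that this toy deliberately ignores spatial correlation, which is exactly the complication the paper addresses (a geostatistical AIC would instead use the likelihood of a model with a fitted covariance function); the data are invented:

```python
import numpy as np

def aic_ols(X, y):
    """Gaussian AIC for an OLS fit: n*log(RSS/n) + 2*(p + 1)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    n, p = X.shape
    return n * np.log(rss / n) + 2 * (p + 1)   # +1 for the error variance

rng = np.random.default_rng(0)
n = 200
x1, x2 = rng.standard_normal((2, n))
y = 1.0 + 2.0 * x1 + 1.5 * x2 + rng.standard_normal(n)   # both covariates matter

X_full = np.column_stack([np.ones(n), x1, x2])
X_red = np.column_stack([np.ones(n), x1])
aic_full, aic_red = aic_ols(X_full, y), aic_ols(X_red, y)
```

The model with the smaller AIC is preferred; here the full model wins because x2 truly enters the data-generating process. Under spatial correlation the naive RSS-based likelihood above becomes misleading, which is the paper's motivation.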

  3. Examining the Missing Completely at Random Mechanism in Incomplete Data Sets: A Multiple Testing Approach

    ERIC Educational Resources Information Center

    Raykov, Tenko; Lichtenberg, Peter A.; Paulson, Daniel

    2012-01-01

    A multiple testing procedure for examining implications of the missing completely at random (MCAR) mechanism in incomplete data sets is discussed. The approach uses the false discovery rate concept and is concerned with testing group differences on a set of variables. The method can be used for ascertaining violations of MCAR and disproving this…
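
    A minimal sketch of the general idea, assuming large-sample z-tests for group differences and Benjamini-Hochberg false-discovery-rate control; the missingness mechanism and sample sizes are invented for illustration and this is not the authors' exact procedure:

```python
import math
import numpy as np

def z_test_p(a, b):
    """Two-sided large-sample p-value for a difference in group means."""
    se = math.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    return math.erfc(abs(a.mean() - b.mean()) / se / math.sqrt(2))

def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg step-up: indices of hypotheses rejected at FDR q."""
    p = np.asarray(pvals)
    order = np.argsort(p)
    thresh = q * np.arange(1, len(p) + 1) / len(p)
    below = np.nonzero(np.sort(p) <= thresh)[0]
    return order[: below[-1] + 1] if len(below) else np.array([], dtype=int)

# Synthetic check: the outcome is missing mostly when x1 is large (so MCAR is
# violated through x1), while x2 is unrelated noise. Under MCAR, complete and
# incomplete cases should not differ on any observed variable.
rng = np.random.default_rng(0)
n = 2000
x1, x2 = rng.standard_normal((2, n))
missing = rng.random(n) < 1 / (1 + np.exp(-3 * x1))

pvals = [z_test_p(v[missing], v[~missing]) for v in (x1, x2)]
rejected = benjamini_hochberg(pvals)
```

Rejecting the group-difference null for any variable is evidence against MCAR; here the x1 comparison is rejected, correctly flagging the violation.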

  4. Restricted spatial regression in practice: Geostatistical models, confounding, and robustness under model misspecification

    USGS Publications Warehouse

    Hanks, Ephraim M.; Schliep, Erin M.; Hooten, Mevin B.; Hoeting, Jennifer A.

    2015-01-01

    In spatial generalized linear mixed models (SGLMMs), covariates that are spatially smooth are often collinear with spatially smooth random effects. This phenomenon is known as spatial confounding and has been studied primarily in the case where the spatial support of the process being studied is discrete (e.g., areal spatial data). In this case, the most common approach suggested is restricted spatial regression (RSR) in which the spatial random effects are constrained to be orthogonal to the fixed effects. We consider spatial confounding and RSR in the geostatistical (continuous spatial support) setting. We show that RSR provides computational benefits relative to the confounded SGLMM, but that Bayesian credible intervals under RSR can be inappropriately narrow under model misspecification. We propose a posterior predictive approach to alleviating this potential problem and discuss the appropriateness of RSR in a variety of situations. We illustrate RSR and SGLMM approaches through simulation studies and an analysis of malaria frequencies in The Gambia, Africa.
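
    The RSR constraint amounts to projecting the random-effect basis onto the orthogonal complement of the fixed-effect design. A minimal sketch with arbitrary stand-in matrices (a real SGLMM would use a spatial basis such as low-rank kernel columns for W):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 3
X = rng.standard_normal((n, p))      # fixed-effect design matrix
W = rng.standard_normal((n, 8))      # stand-in spatial random-effect basis

# Projector onto the orthogonal complement of col(X):
P_perp = np.eye(n) - X @ np.linalg.solve(X.T @ X, X.T)
W_rsr = P_perp @ W                    # restricted (de-confounded) basis

max_overlap = np.abs(X.T @ W_rsr).max()   # ~0: no confounding left
```

Because the restricted random effects cannot absorb variation that lies in the span of the covariates, the fixed-effect estimates are identified, at the cost of the credible-interval issues the paper analyzes.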

  5. Application of Frechet and other random-set averaging techniques to fusion of information

    NASA Astrophysics Data System (ADS)

    Goodman, I. R.

    1998-07-01

    An obviously important aspect of target tracking, and more generally, data fusion, is the combination of those pieces of multi-source information deemed to belong together. Recently, it has been pointed out that a random set approach to target tracking and data fusion may be more appropriate than the standard point-vector estimate approach -- especially in the case of large inherent parameter errors. In addition, since many data fusion problems involve non-numerical linguistic descriptions, in the same spirit it is also desirable to be able to have a method which averages in some qualitative sense random sets which are non-numerically-valued, i.e., which take on propositions or events, such as 'the target appears in area A or C, given the weather conditions of yesterday and source 1' and 'the target appears in area A or B, given the weather conditions of today and source 2.' This leads to the fundamental problem of how best to define the expectation of a random set. To date, this open issue has only been considered for numerically-based random sets. This paper addresses this issue in part by proposing an approach which is actually algebraically-based, but also applicable to numerical-based random sets, and directly related to both the Frechet and the Aumann-Artstein-Vitale random set averaging procedures. The technique employs the concept of 'constant probability events,' which has also played a key role in the recent development of 'relational event algebra,' a new mathematical tool for representing various models in the form of various functions of probabilities.

  6. A geostatistical approach to estimate mining efficiency indicators with flexible meshes

    NASA Astrophysics Data System (ADS)

    Freixas, Genis; Garriga, David; Fernàndez-Garcia, Daniel; Sanchez-Vila, Xavier

    2014-05-01

    Geostatistics is a branch of statistics developed originally to predict probability distributions of ore grades for mining operations, by considering the attributes of a geological formation at unknown locations as a set of correlated random variables. Mining exploitations typically aim to maintain acceptable mineral grades to produce commercial products based upon demand. In this context, we present a new geostatistical methodology to estimate strategic efficiency maps that incorporate hydraulic test data, the evolution of concentrations with time obtained from chemical analysis (packer tests and production wells), as well as hydraulic head variations. The methodology is applied to a salt basin in South America. The exploitation is based on the extraction of brines through vertical and horizontal wells. Thereafter, brines are precipitated in evaporation ponds to obtain target potassium and magnesium salts of economic interest. Lithium carbonate is obtained as a byproduct of the production of potassium chloride. Aside from providing an ensemble of traditional geostatistical methods, the strength of this study lies in the new methodology developed, which focuses on finding the best sites to exploit the brines while maintaining efficiency criteria. Thus, strategic efficiency indicator maps have been developed, under the specific criteria imposed by exploitation standards, to incorporate new extraction wells in new areas that would allow production to be maintained or improved. Results show that the uncertainty quantification of the efficiency plays a dominant role and that the use of flexible meshes, which properly describe the curvilinear features associated with vertical stratification, provides a more consistent estimation of the geological processes. Moreover, we demonstrate that the vertical correlation structure at the given salt basin is essentially linked to variations in the formation thickness, which calls for flexible meshes and non-stationary stochastic processes.

  7. Statistical analysis of sets of random walks: how to resolve their generating mechanism.

    PubMed

    Coscoy, Sylvie; Huguet, Etienne; Amblard, François

    2007-11-01

    The analysis of experimental random walks aims at identifying the process(es) that generate(s) them. It is in general a difficult task, because statistical dispersion within an experimental set of random walks is a complex combination of the stochastic nature of the generating process and the possibility that more than one simple process is present. In this paper, we study by numerical simulations how the statistical distribution of various geometric descriptors, such as the second, third and fourth order moments of two-dimensional random walks, depends on the stochastic process that generates the set. From these observations, we derive a method to classify complex sets of random walks and resolve the generating process(es) by systematic comparison of experimental moment distributions with those numerically obtained for candidate processes. In particular, various processes such as Brownian diffusion combined with convection, noise, confinement, anisotropy, or intermittency can be resolved by using high order moment distributions. In addition, finite-size effects are observed that are useful for treating short random walks. As an illustration, we describe how the present method can be used to study the motile behavior of epithelial microvilli. The present work should be of interest in biology for all possible types of single particle tracking experiments. PMID:17896161
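    As a minimal illustration of this moment-based approach (not the authors' code; the simulation parameters are invented), one can compute the second and fourth moments of simulated Brownian walks and compare against the Gaussian benchmark, for which the ratio E[r^4]/E[r^2]^2 equals 2 in two dimensions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 2-D Brownian random walks with unit-variance increments.
n_walks, n_steps = 20000, 100
steps = rng.normal(size=(n_walks, n_steps, 2))
endpoints = steps.sum(axis=1)            # net displacement of each walk
r2 = (endpoints ** 2).sum(axis=1)        # squared end-to-end distance

m2 = r2.mean()                           # second moment: ~2 * n_steps here
m4 = (r2 ** 2).mean()                    # fourth moment
ratio = m4 / m2 ** 2                     # ~2 for a Gaussian endpoint cloud
```

    Processes such as intermittent or confined motion shift this ratio away from 2, which is what makes high-order moment distributions discriminative.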

  8. Variation analysis of lake extension in space and time from MODIS images using random sets

    NASA Astrophysics Data System (ADS)

    Zhao, Xi; Stein, Alfred; Zhang, Xiang; Feng, Lian; Chen, Xiaoling

    2014-08-01

    Understanding inundation in wetlands may benefit from a joint variation analysis of changes in the size, shape, position and extent of water bodies. In this study, we modeled wetland inundation as a random spread process and used random sets to characterize stochastic properties of water body extents. Periodicity, trend and random components were captured by monthly and yearly random sets derived from multitemporal images. The Covering-Distance matrix and related operators summarized and visualized the spatial pattern and quantified the similarity of different inundation stages. The study was carried out on the Poyang Lake wetland area in China, using MODIS images for a period of eleven years. Results revealed a substantial seasonal dynamic pattern of the inundation and a subtle interannual change in its extension from 2000 to 2010. Various spatial properties, including size, shape, position and extent, are visible: areas of high flooding risk are very elongated and located along the water channel; a few inundation areas tend to be more circular and spread extensively; the majority of inundation areas vary in extent and size across months and years. Large differences in the spatial distribution of inundation extents were shown to exist between months from different seasons. A unique spatial pattern occurred during months in which dramatic flooding or recession happened. Yearly random sets gave detailed information on the spatial distributions of inundation frequency and showed a shrinking trend from 2000 to 2009; 2003 is the partition year in the declining trend, and 2010 breaks the trend as an abnormal year. In addition, probability bounds were derived from the model for a region affected by flooding. This ability to support decision making is shown in a simple management scenario.
We conclude that a random sets analysis is a valuable addition to a frequency analysis that quantifies inundation variation in space and
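    The per-pixel inundation frequency behind such yearly random-set summaries amounts to the empirical coverage function of the random set. A minimal sketch with synthetic monthly water masks (grid size and risk threshold are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stack of monthly water masks (True = inundated pixel).
masks = rng.random((12, 50, 50)) < 0.3

# Empirical coverage function: per-pixel inundation frequency over time.
frequency = masks.mean(axis=0)

# Simple risk zoning: pixels inundated in more than half of the months.
high_risk = frequency > 0.5
```

    On real data the masks would come from water-classified MODIS scenes, and the frequency map directly supports the kind of probability bounds mentioned in the abstract.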

  9. Geostatistical Modeling of Pore Velocity

    SciTech Connect

    Devary, J.L.; Doctor, P.G.

    1981-06-01

    A significant part of evaluating a geologic formation as a nuclear waste repository involves the modeling of contaminant transport in the surrounding media in the event the repository is breached. The commonly used contaminant transport models are deterministic. However, the spatial variability of hydrologic field parameters introduces uncertainties into contaminant transport predictions. This paper discusses the application of geostatistical techniques to the modeling of spatially varying hydrologic field parameters required as input to contaminant transport analyses. Kriging estimation techniques were applied to Hanford Reservation field data to calculate hydraulic conductivity and the ground-water potential gradients. These quantities were statistically combined to estimate the groundwater pore velocity and to characterize the pore velocity estimation error. Combining geostatistical modeling techniques with product error propagation techniques results in an effective stochastic characterization of groundwater pore velocity, a hydrologic parameter required for contaminant transport analyses.
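    The product error propagation step can be sketched as follows: for independent estimates K (conductivity) and i (head gradient), the mean and variance of v = K * i have an exact closed form (the numerical values below are invented, not Hanford data):

```python
import numpy as np

def product_mean_var(mu_k, var_k, mu_i, var_i):
    """Mean and variance of v = K * i for independent estimates K and i."""
    mean_v = mu_k * mu_i
    var_v = mu_k**2 * var_i + mu_i**2 * var_k + var_k * var_i
    return mean_v, var_v

# Kriged hydraulic conductivity and head gradient with estimation variances.
mean_v, var_v = product_mean_var(mu_k=5.0, var_k=0.5, mu_i=0.01, var_i=1e-6)

# Monte Carlo check of the closed-form propagation.
rng = np.random.default_rng(2)
k = rng.normal(5.0, np.sqrt(0.5), 200000)
i = rng.normal(0.01, 1e-3, 200000)
```

    The closed form is exact for independent inputs; kriging additionally supplies the estimation variances var_k and var_i at each location.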

  10. Geostatistics for high resolution geomorphometry: from spatial continuity to surface texture

    NASA Astrophysics Data System (ADS)

    Trevisani, Sebastiano

    2015-04-01

    This presentation introduces the use of geostatistics in the context of high-resolution geomorphometry. The application of geostatistics to geomorphometry permits a shift in perspective, moving our attention more toward spatial continuity description than toward the inference of a spatial continuity model. This change in perspective opens interesting directions in the application of geostatistical methods in geomorphometry. Geostatistical methodologies have been extensively applied and adapted in the context of remote sensing, leading to many interesting applications aimed at the analysis of the complex patterns characterizing imagery. Among these applications the analysis of image texture has to be mentioned. In fact, the analysis of image texture reverts to the analysis of surface texture when the analyzed image is a raster representation of a digital terrain model. The main idea is to use spatial-continuity indices as multiscale and directional descriptors of surface texture, including the important aspect related to surface roughness. In this context we introduce some examples regarding the application of geostatistics for image analysis and surface texture characterization. We also show that, in the presence of complex morphological settings, there is a need for alternative indices of spatial continuity that are less sensitive to hotspots and to the non-stationarity that often characterizes surface morphology. This introduction is mainly dedicated to univariate geostatistics; however, the same concepts could be exploited by means of multivariate as well as multipoint geostatistics.
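    The spatial-continuity indices in question are typically directional experimental semivariograms of the elevation raster. A minimal sketch (the DEM here is a synthetic planar ramp, chosen so the expected values are known):

```python
import numpy as np

def semivariogram(z, lags, axis=0):
    """Directional experimental semivariogram of a raster along one grid axis."""
    gamma = []
    for h in lags:
        a = np.take(z, range(h, z.shape[axis]), axis=axis)
        b = np.take(z, range(0, z.shape[axis] - h), axis=axis)
        gamma.append(0.5 * np.mean((a - b) ** 2))
    return np.array(gamma)

# For a planar ramp with unit slope, gamma(h) = 0.5 * h**2 along the dip.
dem = np.tile(np.arange(30.0)[:, None], (1, 30))
g = semivariogram(dem, lags=[1, 2, 3], axis=0)
```

    Evaluating the same function with axis=1 (perpendicular to the dip) returns zeros, which is how directional variograms expose anisotropy in surface texture.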

  11. Local Geostatistical Models and Big Data in Hydrological and Ecological Applications

    NASA Astrophysics Data System (ADS)

    Hristopulos, Dionissios

    2015-04-01

    The advent of the big data era creates new opportunities for environmental and ecological modelling but also presents significant challenges. The availability of remote sensing images and low-cost wireless sensor networks means that spatiotemporal environmental data now cover larger spatial domains at higher spatial and temporal resolution and for longer time windows. Handling such voluminous data presents several technical and scientific challenges. In particular, the geostatistical methods used to process spatiotemporal data need to overcome the dimensionality curse associated with the need to store and invert large covariance matrices. There are various mathematical approaches for addressing the dimensionality problem, including change of basis, dimensionality reduction, hierarchical schemes, and local approximations. We present a Stochastic Local Interaction (SLI) model that can be used to model local correlations in spatial data. SLI is a random field model suitable for data on discrete supports (i.e., regular lattices or irregular sampling grids). The degree of localization is determined by means of kernel functions and appropriate bandwidths. The strength of the correlations is determined by means of coefficients. In the "plain vanilla" version the parameter set involves scale and rigidity coefficients as well as a characteristic length. The latter, in connection with the rigidity coefficient, determines the correlation length of the random field. The SLI model is based on statistical field theory and extends previous research on Spartan spatial random fields [2,3] from continuum spaces to explicitly discrete supports. The SLI kernel functions employ adaptive bandwidths learned from the sampling spatial distribution [1]. The SLI precision matrix is expressed explicitly in terms of the model parameters and the kernel function. Hence, covariance matrix inversion is not necessary for parameter inference, which is based on leave-one-out cross validation.
This property

  12. Building Damage-Resilient Dominating Sets in Complex Networks against Random and Targeted Attacks

    PubMed Central

    Molnár, F.; Derzsy, N.; Szymanski, B. K.; Korniss, G.

    2015-01-01

    We study the vulnerability of dominating sets against random and targeted node removals in complex networks. While small, cost-efficient dominating sets play a significant role in controllability and observability of these networks, a fixed and intact network structure is always implicitly assumed. We find that cost-efficiency of dominating sets optimized for small size alone comes at a price of being vulnerable to damage; domination in the remaining network can be severely disrupted, even if a small fraction of dominator nodes are lost. We develop two new methods for finding flexible dominating sets, allowing either adjustable overall resilience, or dominating set size, while maximizing the dominated fraction of the remaining network after the attack. We analyze the efficiency of each method on synthetic scale-free networks, as well as real complex networks. PMID:25662371

  13. Importance of stationarity for geostatistical assessment of environmental contamination

    SciTech Connect

    Dagdelen, K.; Turner, A.K.

    1996-12-31

    This paper describes a geostatistical case study to assess TCE contamination from multiple point sources migrating through geologically complex conditions involving several aquifers. The paper highlights the importance of the stationarity assumption by demonstrating how biased assessments of TCE contamination result when ordinary kriging is applied to data that violate stationarity assumptions. Division of the data set into more homogeneous geologic and hydrologic zones improved the accuracy of the estimates. Indicator kriging offers an alternative method for providing a stochastic model that is more appropriate for the data. Further improvement in the estimates results when indicator kriging is applied to individual subregional data sets that are based on geological considerations. This further enhances data homogeneity and makes the use of a stationary model more appropriate. By combining geological and geostatistical evaluations, more realistic maps may be produced that reflect the hydrogeological environment and provide a sound basis for future investigations and remediation.

  14. Geostatistical enhancement of european hydrological predictions

    NASA Astrophysics Data System (ADS)

    Pugliese, Alessio; Castellarin, Attilio; Parajka, Juraj; Arheimer, Berit; Bagli, Stefano; Mazzoli, Paolo; Montanari, Alberto; Blöschl, Günter

    2016-04-01

    Geostatistical Enhancement of European Hydrological Prediction (GEEHP) is a research experiment developed within the EU funded SWITCH-ON project, which proposes to conduct comparative experiments in a virtual laboratory in order to share water-related information and tackle changes in the hydrosphere for operational needs (http://www.water-switch-on.eu). The main objective of GEEHP deals with the prediction of streamflow indices and signatures in ungauged basins at different spatial scales. In particular, among several possible hydrological signatures we focus in our experiment on the prediction of flow-duration curves (FDCs) along the stream-network, which has attracted increasing scientific attention in recent decades due to the large number of practical and technical applications of the curves (e.g. hydropower potential estimation, riverine habitat suitability and ecological assessments, etc.). We apply a geostatistical procedure based on Top-kriging, which has recently been shown to be a particularly reliable and easy-to-use regionalization approach, employing two different types of streamflow data: pan-European E-HYPE simulations (http://hypeweb.smhi.se/europehype) and observed daily streamflow series collected in two pilot study regions, i.e. Tyrol (merging data from the Austrian and Italian stream gauging networks) and Sweden. The merger of the two study regions results in a rather large area (~450000 km2) and might be considered a proxy for a pan-European application of the approach. In a first phase, we implement a bidirectional validation, i.e. E-HYPE catchments are set as training sites to predict FDCs at the same sites where observed data are available, and vice-versa. Such a validation procedure reveals (1) the usability of the proposed approach for predicting the FDCs over the entire river network of interest using alternatively observed data and E-HYPE simulations and (2) the accuracy of E-HYPE-based predictions of FDCs in ungauged sites. In a
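    The signature being regionalized here, the flow-duration curve, is simply the sorted streamflow record plotted against exceedance probability. A minimal empirical construction (the streamflow values are invented; the plotting position follows the common Weibull formula):

```python
import numpy as np

def flow_duration_curve(q):
    """Empirical FDC: flows in descending order vs. exceedance probability."""
    q_sorted = np.sort(np.asarray(q, dtype=float))[::-1]
    n = len(q_sorted)
    exceedance = np.arange(1, n + 1) / (n + 1)   # Weibull plotting position
    return exceedance, q_sorted

# Daily streamflows (m^3/s); in Top-kriging the curves themselves become
# the regionalized variable along the stream network.
p, fdc = flow_duration_curve([1.0, 5.0, 2.0, 8.0, 3.0])
```

    A point (p, q) on the curve reads: flow q is equalled or exceeded a fraction p of the time, which is the quantity of interest for hydropower and habitat assessments.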

  15. A unified development of several techniques for the representation of random vectors and data sets

    NASA Technical Reports Server (NTRS)

    Bundick, W. T.

    1973-01-01

    Linear vector space theory is used to develop a general representation of a set of data vectors or random vectors by linear combinations of orthonormal vectors such that the mean squared error of the representation is minimized. The orthonormal vectors are shown to be the eigenvectors of an operator. The general representation is applied to several specific problems involving the use of the Karhunen-Loeve expansion, principal component analysis, and empirical orthogonal functions; and the common properties of these representations are developed.
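    A minimal numerical illustration of this optimality result (synthetic data, not from the report): the rank-1 representation along the dominant eigenvector of the sample covariance minimizes mean squared error, and the residual energy equals the discarded eigenvalue times (n-1):

```python
import numpy as np

rng = np.random.default_rng(3)

# Correlated 2-D data vectors, centered so the covariance operator applies.
x = rng.normal(size=(1000, 1)) @ np.array([[2.0, 1.0]]) \
    + 0.1 * rng.normal(size=(1000, 2))
x -= x.mean(axis=0)

c = np.cov(x, rowvar=False)              # sample covariance operator
eigvals, eigvecs = np.linalg.eigh(c)     # eigenvalues in ascending order
u = eigvecs[:, -1]                       # dominant eigenvector

# Rank-1 (Karhunen-Loeve / PCA) reconstruction of every data vector.
x_hat = (x @ u)[:, None] * u[None, :]
rss = np.sum((x - x_hat) ** 2)           # residual sum of squares
```

    The identity rss = (n - 1) * lambda_min holds exactly for centered data, which is the common property shared by the Karhunen-Loeve expansion, principal components, and empirical orthogonal functions.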

  16. Geostatistics software user's manual for the Geosciences Research and Engineering Department PDP 11/70 computer

    SciTech Connect

    Devary, J.L.; Mitchell, P.J.; Rice, W.A.; Damschen, D.W.

    1983-12-01

    Geostatistics and kriging are statistical techniques that can be used to estimate a hydrologic parameter surface from spatially distributed data. The techniques produce statistically optimum estimates of a surface and can aid in pinpointing areas of greatest need for data. This document describes the use of geostatistical software routines implemented on the Geosciences Research and Engineering Department's PDP 11/70 computer at Pacific Northwest Laboratory (PNL) for use with the Hanford ground-water monitoring data base of the Ground-Water Surveillance Program sponsored by the U.S. Department of Energy. The geostatistical routines were modified to improve input/output file efficiency and to accept larger data sets. Two data processing routines were also developed for retrieving and sorting data into a form compatible with the geostatistical routines. 6 references, 9 figures, 1 table.

  17. Stability of Dominating Sets in Complex Networks against Random and Targeted Attacks

    NASA Astrophysics Data System (ADS)

    Molnar, F.; Derzsy, N.; Szymanski, B. K.; Korniss, G.

    2014-03-01

    Minimum dominating sets (MDS) are involved in efficiently controlling and monitoring many social and technological networks. However, MDS influence over the entire network may be significantly reduced when some MDS nodes are disabled due to random breakdowns or targeted attacks against nodes in the network. We investigate the stability of domination in scale-free networks in such scenarios. We define stability as the fraction of nodes in the network that are still dominated after some nodes have been removed, either randomly, or by targeting the highest-degree nodes. We find that although the MDS is the most cost-efficient solution (requiring the least number of nodes) for reaching every node in an undamaged network, it is also very sensitive to damage. Further, we investigate alternative methods for finding dominating sets that are less efficient (more costly) than MDS but provide better stability. Finally we construct an algorithm based on greedy node selection that allows us to precisely control the balance between domination stability and cost, to achieve any desired stability at minimum cost, or the best possible stability at any given cost. Analysis of our method shows moderate improvement of domination cost efficiency against random breakdowns, but substantial improvements against targeted attacks. Supported by DARPA, DTRA, ARL NS-CTA, ARO, and ONR.
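    A greedy construction of this kind, picking at each step the node that dominates the most not-yet-dominated nodes, can be sketched as follows (a generic greedy MDS approximation, not the authors' stability-controlled variant):

```python
def greedy_dominating_set(adj):
    """Greedy MDS approximation on an undirected graph given as an
    adjacency dict {node: set of neighbors}."""
    undominated = set(adj)
    ds = set()
    while undominated:
        # Node whose closed neighborhood covers the most undominated nodes.
        best = max(adj, key=lambda v: len(({v} | adj[v]) & undominated))
        ds.add(best)
        undominated -= {best} | adj[best]
    return ds

# Star graph: the hub alone dominates everything.
star = {0: {1, 2, 3}, 1: {0}, 2: {0}, 3: {0}}
print(greedy_dominating_set(star))   # {0}
```

    Because the greedy rule always targets maximal coverage, the result tends to concentrate domination on hubs, which is exactly why such sets are cheap but fragile under targeted attacks.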

  18. Spin-glass phase transitions and minimum energy of the random feedback vertex set problem.

    PubMed

    Qin, Shao-Meng; Zeng, Ying; Zhou, Hai-Jun

    2016-08-01

    A feedback vertex set (FVS) of an undirected graph contains vertices from every cycle of the graph. Constructing an FVS of sufficiently small cardinality is very difficult in the worst case, but for random graphs this problem can be efficiently solved by converting it into an appropriate spin-glass model [H.-J. Zhou, Eur. Phys. J. B 86, 455 (2013)]. In the present work we study the spin-glass phase transitions and the minimum energy density of the random FVS problem by the first-step replica-symmetry-breaking (1RSB) mean-field theory. For both regular random graphs and Erdös-Rényi graphs, we determine the inverse temperature β_{l} at which the replica-symmetric mean-field theory loses its local stability, the inverse temperature β_{d} of the dynamical (clustering) phase transition, and the inverse temperature β_{s} of the static (condensation) phase transition. These critical inverse temperatures all change with the mean vertex degree in a nonmonotonic way; β_{d} is distinct from β_{s} for regular random graphs of vertex degree K>60, while β_{d} is identical to β_{s} for Erdös-Rényi graphs at least up to mean vertex degree c=512. We then derive the zero-temperature limit of the 1RSB theory and use it to compute the minimum FVS cardinality. PMID:27627285
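    The defining property is easy to verify directly: deleting a feedback vertex set must leave a forest. A small union-find check (illustrative only, unrelated to the 1RSB machinery):

```python
def is_feedback_vertex_set(n, edges, fvs):
    """Check that deleting `fvs` leaves an acyclic subgraph, using
    union-find cycle detection on the surviving undirected edges."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    for u, v in edges:
        if u in fvs or v in fvs:
            continue                        # edge removed with the FVS
        ru, rv = find(u), find(v)
        if ru == rv:
            return False                    # a cycle survives outside the FVS
        parent[ru] = rv
    return True

# A 4-cycle: any single vertex is a valid FVS, the empty set is not.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(is_feedback_vertex_set(4, edges, {0}),
      is_feedback_vertex_set(4, edges, set()))   # True False
```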

  19. The gradient boosting algorithm and random boosting for genome-assisted evaluation in large data sets.

    PubMed

    González-Recio, O; Jiménez-Montero, J A; Alenda, R

    2013-01-01

    In the next few years, with the advent of high-density single nucleotide polymorphism (SNP) arrays and genome sequencing, genomic evaluation methods will need to deal with a large number of genetic variants and an increasing sample size. The boosting algorithm is a machine-learning technique that may alleviate the drawbacks of dealing with such large data sets. This algorithm combines different predictors in a sequential manner with some shrinkage on them; each predictor is applied consecutively to the residuals from the committee formed by the previous ones to form a final prediction based on a subset of covariates. Here, a detailed description is provided and examples using a toy data set are included. A modification of the algorithm called "random boosting" was proposed to increase predictive ability and decrease computation time of genome-assisted evaluation in large data sets. Random boosting uses a random selection of markers to add a subsequent weak learner to the predictive model. These modifications were applied to a real data set composed of 1,797 bulls genotyped for 39,714 SNP. Deregressed proofs of 4 yield traits and 1 type trait from January 2009 routine evaluations were used as dependent variables. A 2-fold cross-validation scenario was implemented. Sires born before 2005 were used as a training sample (1,576 and 1,562 for production and type traits, respectively), whereas younger sires were used as a testing sample to evaluate predictive ability of the algorithm on yet-to-be-observed phenotypes. Comparison with the original algorithm was provided. The predictive ability of the algorithm was measured as Pearson correlations between observed and predicted responses. Further, estimated bias was computed as the average difference between observed and predicted phenotypes. 
The results showed that the modification of the original boosting algorithm could be run in 1% of the time used by the original algorithm, with negligible differences in accuracy.
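    The random-boosting idea, fitting each weak learner on the current residuals while restricting the candidate markers to a random subset, can be sketched as follows (synthetic genotypes and invented parameters; the study's own weak learners and data differ):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic genotypes: n animals x p markers coded 0/1/2; the phenotype
# depends on two causal markers plus noise.
n, p = 300, 50
x = rng.integers(0, 3, size=(n, p)).astype(float)
y = 2.0 * x[:, 0] - 1.5 * x[:, 3] + rng.normal(0, 0.5, n)

def random_boost(x, y, n_iter=200, shrink=0.1, m_frac=0.2):
    """Each iteration samples a random subset of markers, fits a simple
    single-marker regression to the residuals, and adds it with shrinkage."""
    resid = y - y.mean()
    pred = np.full(len(y), y.mean())
    m = max(1, int(m_frac * x.shape[1]))
    for _ in range(n_iter):
        cols = rng.choice(x.shape[1], size=m, replace=False)
        best = cols[np.argmax([abs(np.corrcoef(x[:, c], resid)[0, 1])
                               for c in cols])]
        b = np.polyfit(x[:, best], resid, 1)
        update = shrink * np.polyval(b, x[:, best])
        pred += update
        resid -= update
    return pred

pred = random_boost(x, y)
corr = np.corrcoef(pred, y)[0, 1]
```

    Sampling only a fraction of the markers per iteration is what cuts computation time; with enough iterations the causal markers are still selected often enough to recover the signal.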

  20. Accurate Complete Basis Set Extrapolation of Direct Random Phase Correlation Energies.

    PubMed

    Mezei, Pál D; Csonka, Gábor I; Ruzsinszky, Adrienn

    2015-08-11

    The direct random phase approximation (dRPA) is a promising way to obtain improvements upon the standard semilocal density functional results in many aspects of computational chemistry. In this paper, we address the slow convergence of the calculated dRPA correlation energy with the increase of the quality and size of the popular Gaussian-type Dunning's correlation consistent aug-cc-pVXZ split valence atomic basis set family. The cardinal number X controls the size of the basis set, and we use X = 3-6 in this study. It is known that even the very expensive X = 6 basis sets lead to large errors for the dRPA correlation energy, and thus complete basis set extrapolation is necessary. We study the basis set convergence of the dRPA correlation energies on a set of 65 hydrocarbon isomers from CH4 to C6H6. We calculate the iterative density fitted dRPA correlation energies using an efficient algorithm based on the CC-like form of the equations using the self-consistent HF orbitals. We test the popular inverse cubic, the optimized exponential, and inverse power formulas for complete basis set extrapolation. We have found that the optimized inverse power based extrapolation delivers the best energies. Further analysis showed that the optimal exponent depends on the molecular structure, and the most efficient two-point energy extrapolations that use X = 3 and 4 can be improved considerably by considering the atomic composition and hybridization states of the atoms in the molecules. Our results also show that the optimized exponents that yield accurate X = 3 and 4 extrapolated dRPA energies for atoms or small molecules might be inaccurate for larger molecules. PMID:26574475
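    The two-point inverse-power extrapolation used here has a simple closed form: assuming E(X) = E_CBS + A / X**alpha, eliminating A between two cardinal numbers gives the expression below (alpha = 3 recovers the popular inverse-cubic formula; the energies are invented for a self-check, not dRPA results):

```python
def cbs_two_point(e_x, e_y, x, y, alpha=3.0):
    """Two-point complete-basis-set extrapolation assuming the
    inverse-power model E(X) = E_CBS + A / X**alpha."""
    return (x**alpha * e_x - y**alpha * e_y) / (x**alpha - y**alpha)

# Synthetic check: energies generated exactly from the model are recovered.
e_cbs, a = -76.40, 0.9
e3 = e_cbs + a / 3**3
e4 = e_cbs + a / 4**3
print(round(cbs_two_point(e3, e4, 3, 4), 6))   # -76.4
```

    The paper's point is that the best alpha for X = 3, 4 pairs is not universal: it varies with atomic composition and hybridization, so a fixed exponent tuned on small systems can misextrapolate larger molecules.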

  1. Characterizing gene sets using discriminative random walks with restart on heterogeneous biological networks

    PubMed Central

    Blatti, Charles; Sinha, Saurabh

    2016-01-01

    Motivation: Analysis of co-expressed gene sets typically involves testing for enrichment of different annotations or ‘properties’ such as biological processes, pathways, transcription factor binding sites, etc., one property at a time. This common approach ignores any known relationships among the properties or the genes themselves. It is believed that known biological relationships among genes and their many properties may be exploited to more accurately reveal commonalities of a gene set. Previous work has sought to achieve this by building biological networks that combine multiple types of gene–gene or gene–property relationships, and performing network analysis to identify other genes and properties most relevant to a given gene set. Most existing network-based approaches for recognizing genes or annotations relevant to a given gene set collapse information about different properties to simplify (homogenize) the networks. Results: We present a network-based method for ranking genes or properties related to a given gene set. Such related genes or properties are identified from among the nodes of a large, heterogeneous network of biological information. Our method involves a random walk with restarts, performed on an initial network with multiple node and edge types that preserve more of the original, specific property information than current methods that operate on homogeneous networks. In this first stage of our algorithm, we find the properties that are the most relevant to the given gene set and extract a subnetwork of the original network, comprising only these relevant properties. We then re-rank genes by their similarity to the given gene set, based on a second random walk with restarts, performed on the above subnetwork. We demonstrate the effectiveness of this algorithm for ranking genes related to Drosophila embryonic development and aggressive responses in the brains of social animals. Availability and Implementation: DRaWR was implemented as
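    The core iteration of a random walk with restarts is short. A minimal dense-matrix sketch on a toy network (DRaWR itself operates on large heterogeneous networks with typed nodes and edges; the graph and restart value below are invented):

```python
import numpy as np

def rwr(w, seed, restart=0.3, tol=1e-10):
    """Random walk with restart on a column-stochastic transition matrix."""
    p0 = seed / seed.sum()
    p = p0.copy()
    while True:
        p_new = (1 - restart) * w @ p + restart * p0
        if np.abs(p_new - p).sum() < tol:
            return p_new
        p = p_new

# Tiny network; node 0 plays the role of the query gene set.
a = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], float)
w = a / a.sum(axis=0)                    # column-normalize the adjacency
scores = rwr(w, seed=np.array([1.0, 0, 0, 0]))
print(np.round(scores, 3))
```

    The stationary scores rank all other nodes by proximity to the seed set; DRaWR runs this twice, first to select relevant property nodes and then to re-rank genes on the extracted subnetwork.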

  2. Random sets technique for information fusion applied to estimation of brain functional images

    NASA Astrophysics Data System (ADS)

    Smith, Therese M.; Kelly, Patrick A.

    1999-05-01

    A new mathematical technique for information fusion based on random sets, developed and described by Goodman, Mahler and Nguyen (The Mathematics of Data Fusion, Kluwer, 1997), can be useful for estimation of functional brain images. Many image estimation algorithms employ prior models that incorporate general knowledge about sizes, shapes and locations of brain regions. Recently, algorithms have been proposed using specific prior knowledge obtained from other imaging modalities (for example, Bowsher, et al., IEEE Trans. Medical Imaging, 1996). However, there is more relevant information than is presently used. A technique that permits use of additional prior information about activity levels would improve the quality of prior models and, hence, of the resulting image estimate. The use of random sets provides this capability because it allows seemingly non-statistical (or ambiguous) information, such as that contained in inference rules, to be represented and combined with observations in a single statistical model corresponding to a global joint density. This paper illustrates the use of this approach by constructing an example global joint density function for brain functional activity from measurements of functional activity, anatomical information, clinical observations and inference rules. The estimation procedure is tested on a data phantom with Poisson noise.

  3. Generalized index for spatial data sets as a measure of complete spatial randomness

    NASA Astrophysics Data System (ADS)

    Hackett-Jones, Emily J.; Davies, Kale J.; Binder, Benjamin J.; Landman, Kerry A.

    2012-06-01

    Spatial data sets generated from a wide range of physical systems can be analyzed by counting the number of objects in a set of bins. Previous work has been limited to equal-sized bins, which are inappropriate for some domains (e.g., circular). We consider a nonequal-size bin configuration whereby overlapping or nonoverlapping bins cover the domain. A generalized index, defined in terms of a variance between bin counts, is developed to indicate whether or not a spatial data set, generated from exclusion or nonexclusion processes, is at the complete spatial randomness (CSR) state. Limiting values of the index are determined. Using examples, we investigate trends in the generalized index as a function of density and compare the results with those using equal-size bins. The smallest bin size must be much larger than the mean size of the objects. We can determine whether a spatial data set is at the CSR state by comparing the values of the generalized index for different bin configurations: the values will be approximately the same if the data set is at the CSR state, while they will differ if it is not. In general, the generalized index is lower than its limiting value, since objects do not have access to the entire region due to blocking by other objects. These methods are applied to two applications: (i) spatial data sets generated from a cellular automata model of cell aggregation in the enteric nervous system and (ii) a known plant data distribution.
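    A classical one-dimensional relative of such a bin-count index (not the paper's generalized index, which also handles unequal and overlapping bins) is the variance-to-mean ratio of equal-bin counts, which is close to 1 at CSR and inflated for clustered data. A sketch with invented point sets:

```python
import numpy as np

rng = np.random.default_rng(5)

def dispersion_index(points, n_bins=20):
    """Variance-to-mean ratio of equal-bin counts on [0, 1]:
    ~1 under complete spatial randomness, >>1 for clustered data."""
    counts, _ = np.histogram(points, bins=np.linspace(0, 1, n_bins + 1))
    return counts.var() / counts.mean()

csr = rng.uniform(0, 1, 2000)                      # CSR point set
centers = rng.uniform(0, 1, 10)
clustered = np.clip(rng.choice(centers, 2000)      # strongly clustered set
                    + rng.normal(0, 0.01, 2000), 0, 1)
```

    The generalized index in the paper plays the same diagnostic role while additionally correcting for finite object size and irregular bin configurations.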

  4. Context-free pairs of groups II — Cuts, tree sets, and random walks

    PubMed Central

    Woess, Wolfgang

    2012-01-01

    This is a continuation of the study, begun by Ceccherini-Silberstein and Woess (2009) [5], of context-free pairs of groups and the related context-free graphs in the sense of Muller and Schupp (1985) [22]. The graphs under consideration are Schreier graphs of a subgroup of some finitely generated group, and context-freeness relates to a tree-like structure of those graphs. Instead of the cones of Muller and Schupp (1985) [22] (connected components resulting from deletion of finite balls with respect to the graph metric), a more general approach to context-free graphs is proposed via tree sets consisting of cuts of the graph, and associated structure trees. The existence of tree sets with certain “good” properties is studied. With a tree set, a natural context-free grammar is associated. These investigations of the structure of context-free pairs, resp. graphs, are then applied to study random walk asymptotics via complex analysis. In particular, a complete proof of the local limit theorem for return probabilities on any virtually free group is given, as well as on Schreier graphs of a finitely generated subgroup of a free group. This extends, respectively completes, the significant work of Lalley (1993, 2001) [18,20]. PMID:22267873

  5. The Ethics of Randomized Controlled Trials in Social Settings: Can Social Trials Be Scientifically Promising and Must There Be Equipoise?

    ERIC Educational Resources Information Center

    Fives, Allyn; Russell, Daniel W.; Canavan, John; Lyons, Rena; Eaton, Patricia; Devaney, Carmel; Kearns, Norean; O'Brien, Aoife

    2015-01-01

    In a randomized controlled trial (RCT), treatments are assigned randomly and treatments are withheld from participants. Is it ethically permissible to conduct an RCT in a social setting? This paper addresses two conditions for justifying RCTs: that there should be a state of equipoise and that the trial should be scientifically promising.…

  6. Implementing Randomized Controlled Trials in Preschool Settings That Include Young Children with Disabilities: Considering the Context of Strain and Bovey

    ERIC Educational Resources Information Center

    Snyder, Patricia

    2011-01-01

    In this commentary, developments related to conducting randomized controlled trials in authentic preschool settings that include young children with disabilities are discussed in relation to the Strain and Bovey study.

  7. Reducing spatial uncertainty in climatic maps through geostatistical analysis

    NASA Astrophysics Data System (ADS)

    Pesquer, Lluís; Ninyerola, Miquel; Pons, Xavier

    2014-05-01

    …applying different interpolation methods/parameters. RMS (mm) error values obtained from the independent test set (20% of the samples) follow, in this order: IDW (exponent = 1.5, 2, 2.5, 3) / SPT (tension = 100, 125, 150, 175, 200) / OK.
    LOOCV: 92.5; 80.2; 74.2; 72.3 / 181.6; 90.6; 75.7; 71.1; 69.4; 68.8
    RS: 101.2; 89.6; 83.9; 81.9 / 115.1; 92.4; 84.0; 81.4; 80.9; 81.1 / 81.1
    EU: 57.4; 51.3; 53.1; 55.5 / 59.1; 57.1; 55.9; 55.0; 54.3 / 51.8
    A3D: 48.3; 49.8; 52.5; 62.2 / 57.1; 54.4; 52.5; 51.2; 50.2 / 49.7
    To study these results, a geostatistical analysis of uncertainty has been carried out. Main results: a variogram analysis of the error (using the test set) shows that the total sill is reduced (by 50% for EU and by 60% for A3D) when the two new approaches are used, while the spatialized standard-deviation model calculated from the OK shows significantly lower values when compared to the RS. In conclusion, A3D and EU greatly improve on LOOCV and RS, whereas A3D slightly improves on EU. Also, LOOCV shows only slightly better results than RS, suggesting that a non-random split increases the power of both the fitting and test steps. * Ninyerola, Pons, Roure. A methodological approach of climatological modelling of air temperature and precipitation through GIS techniques. IJC, 2000; 20:1823-1841.

  8. Canonical Naimark extension for generalized measurements involving sets of Pauli quantum observables chosen at random

    NASA Astrophysics Data System (ADS)

    Sparaciari, Carlo; Paris, Matteo G. A.

    2013-01-01

    We address measurement schemes where certain observables Xk are chosen at random within a set of nondegenerate isospectral observables and then measured on repeated preparations of a physical system. Each observable has a probability zk to be measured, with ∑kzk=1, and the statistics of this generalized measurement is described by a positive operator-valued measure. This kind of scheme is referred to as quantum roulettes, since each observable Xk is chosen at random, e.g., according to the fluctuating value of an external parameter. Here we focus on quantum roulettes for qubits involving the measurements of Pauli matrices, and we explicitly evaluate their canonical Naimark extensions, i.e., their implementation as indirect measurements involving an interaction scheme with a probe system. We thus provide a concrete model to realize the roulette without destroying the signal state, which can be measured again after the measurement or can be transmitted. Finally, we apply our results to the description of Stern-Gerlach-like experiments on a two-level system.
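The completeness of such a quantum roulette can be checked directly: since each Pauli observable Pk has eigenprojectors (I ± Pk)/2, the outcome elements E± = ∑k zk (I ± Pk)/2 form a POVM. A minimal stdlib-only sketch (the probabilities zk below are illustrative, not taken from the paper):

```python
# 2x2 complex matrices as nested lists; the Pauli observables of the roulette
I = [[1, 0], [0, 1]]
X = [[0, 1], [1, 0]]
Y = [[0, -1j], [1j, 0]]
Z = [[1, 0], [0, -1]]

def mat_add(A, B, sa=1, sb=1):
    return [[sa * A[i][j] + sb * B[i][j] for j in range(2)] for i in range(2)]

def scale(A, s):
    return [[s * A[i][j] for j in range(2)] for i in range(2)]

def roulette_povm(probs, paulis):
    """POVM of a Pauli roulette: observable k is measured with probability z_k.
    For a Pauli P the +/-1 eigenprojectors are (I +/- P)/2, so the outcome
    elements are E_+/- = sum_k z_k (I +/- P_k)/2."""
    E_plus = [[0, 0], [0, 0]]
    E_minus = [[0, 0], [0, 0]]
    for z, P in zip(probs, paulis):
        E_plus = mat_add(E_plus, scale(mat_add(I, P), z / 2))
        E_minus = mat_add(E_minus, scale(mat_add(I, P, sb=-1), z / 2))
    return E_plus, E_minus

E_plus, E_minus = roulette_povm([0.5, 0.25, 0.25], [X, Y, Z])
S = mat_add(E_plus, E_minus)
# POVM completeness: the two outcome elements sum to the identity
assert all(abs(S[i][j] - I[i][j]) < 1e-12 for i in range(2) for j in range(2))
```

Note that each element has unit trace here because every (I ± Pk)/2 is a rank-one projector, consistent with the nondegenerate isospectral assumption.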

  9. Reverse engineering discrete dynamical systems from data sets with random input vectors.

    PubMed

    Just, Winfried

    2006-10-01

    Recently a new algorithm for reverse engineering of biochemical networks was developed by Laubenbacher and Stigler. It is based on methods from computational algebra and finds most parsimonious models for a given data set. We derive mathematically rigorous estimates for the expected amount of data needed by this algorithm to find the correct model. In particular, we demonstrate that for one type of input parameter (graded term orders), the expected data requirements scale polynomially with the number n of chemicals in the network, while for another type of input parameters (randomly chosen lex orders) this number scales exponentially in n. We also show that, for a modification of the algorithm, the expected data requirements scale as the logarithm of n. PMID:17061920

  10. A Particle Multi-Target Tracker for Superpositional Measurements Using Labeled Random Finite Sets

    NASA Astrophysics Data System (ADS)

    Papi, Francesco; Kim, Du Yong

    2015-08-01

    In this paper we present a general solution for multi-target tracking with superpositional measurements. Measurements that are functions of the sum of the contributions of the targets present in the surveillance area are called superpositional measurements. We base our modelling on Labeled Random Finite Set (RFS) in order to jointly estimate the number of targets and their trajectories. This modelling leads to a labeled version of Mahler's multi-target Bayes filter. However, a straightforward implementation of this tracker using Sequential Monte Carlo (SMC) methods is not feasible due to the difficulties of sampling in high dimensional spaces. We propose an efficient multi-target sampling strategy based on Superpositional Approximate CPHD (SA-CPHD) filter and the recently introduced Labeled Multi-Bernoulli (LMB) and Vo-Vo densities. The applicability of the proposed approach is verified through simulation in a challenging radar application with closely spaced targets and low signal-to-noise ratio.

  11. Random finite set multi-target trackers: stochastic geometry for space situational awareness

    NASA Astrophysics Data System (ADS)

    Vo, Ba-Ngu; Vo, Ba-Tuong

    2015-05-01

    This paper describes recent developments in the random finite set (RFS) paradigm in multi-target tracking. Over the last decade the Probability Hypothesis Density (PHD) filter has become synonymous with the RFS approach. As a result, the PHD filter is often wrongly used as a performance benchmark for the RFS approach. Since there is a suite of RFS-based multi-target tracking algorithms, benchmarking the tracking performance of the RFS approach using the PHD filter, the cheapest of these, is misleading. Such benchmarking should be performed with more sophisticated RFS algorithms. In this paper we outline high-performance RFS-based multi-target trackers, such as the Generalized Labeled Multi-Bernoulli filter and a number of efficient approximations, and discuss extensions and applications of these filters. Applications to space situational awareness are discussed.

  12. Recent Advances in Geostatistical Inversion

    NASA Astrophysics Data System (ADS)

    Kitanidis, P. K.

    2011-12-01

    Inverse problems are common in hydrologic applications, such as in subsurface imaging which is the identification of parameters characterizing geologic formations from hydrologic and geophysical observations. In such problems, the data do not suffice to constrain the solution to be unique. The geostatistical approach utilizes probability theory and statistical inference to assimilate data and information about structure and to explore the range of possible solutions in a systematic way. This is a progress report on recent advances in terms of formulation and computational methods. The standard implementation of the geostatistical approach to the inverse method is computationally very expensive when there are millions of unknowns and hundreds of thousands of observations, as is the case in fusing data from many sources in hydrogeology. However, depending on the specific problem, alternative formulations and numerical methods can reduce the computational problem quite dramatically. One approach can utilize formulations that involve matrices with a very high degree of sparsity combined with indirect methods of solution and strategies that minimize the number of required forward runs. The potential for this method is illustrated with an application to transient hydraulic tomography. Another approach speeds up matrix-vector multiplications by utilizing hierarchical sparsity in commonly encountered matrices, particularly prior covariance matrices. A couple of examples show how the computational cost scales with the size of the problem (number of observations and unknowns) in conventional and newer methods. Yet another fruitful approach is to rely on a large number of realizations to represent ensembles of solutions and we illustrate this approach in fusing data from two different geophysical methods. In all of these approaches, utilizing parallel processors and mixed CPU/GPU programming can significantly reduce the computational cost and make it possible to solve very

  13. Improving practice in community-based settings: a randomized trial of supervision – study protocol

    PubMed Central

    2013-01-01

    Background Evidence-based treatments for child mental health problems are not consistently available in public mental health settings. Expanding availability requires workforce training. However, research has demonstrated that training alone is not sufficient for changing provider behavior, suggesting that ongoing intervention-specific supervision or consultation is required. Supervision is notably under-investigated, particularly as provided in public mental health. The degree to which supervision in this setting includes ‘gold standard’ supervision elements from efficacy trials (e.g., session review, model fidelity, outcome monitoring, skill-building) is unknown. The current federally-funded investigation leverages the Washington State Trauma-focused Cognitive Behavioral Therapy Initiative to describe usual supervision practices and test the impact of systematic implementation of gold standard supervision strategies on treatment fidelity and clinical outcomes. Methods/Design The study has two phases. We will conduct an initial descriptive study (Phase I) of supervision practices within public mental health in Washington State followed by a randomized controlled trial of gold standard supervision strategies (Phase II), with randomization at the clinician level (i.e., supervisors provide both conditions). Study participants will be 35 supervisors and 130 clinicians in community mental health centers. We will enroll one child per clinician in Phase I (N = 130) and three children per clinician in Phase II (N = 390). We use a multi-level mixed within- and between-subjects longitudinal design. Audio recordings of supervision and therapy sessions will be collected and coded throughout both phases. Child outcome data will be collected at the beginning of treatment and at three and six months into treatment. Discussion This study will provide insight into how supervisors can optimally support clinicians delivering evidence-based treatments. Phase I will

  14. Model-Based Geostatistical Mapping of the Prevalence of Onchocerca volvulus in West Africa

    PubMed Central

    O’Hanlon, Simon J.; Slater, Hannah C.; Cheke, Robert A.; Boatin, Boakye A.; Coffeng, Luc E.; Pion, Sébastien D. S.; Boussinesq, Michel; Zouré, Honorat G. M.; Stolk, Wilma A.; Basáñez, María-Gloria

    2016-01-01

    Background The initial endemicity (pre-control prevalence) of onchocerciasis has been shown to be an important determinant of the feasibility of elimination by mass ivermectin distribution. We present the first geostatistical map of microfilarial prevalence in the former Onchocerciasis Control Programme in West Africa (OCP) before commencement of antivectorial and antiparasitic interventions. Methods and Findings Pre-control microfilarial prevalence data from 737 villages across the 11 constituent countries in the OCP epidemiological database were used as ground-truth data. These 737 data points, plus a set of statistically selected environmental covariates, were used in a Bayesian model-based geostatistical (B-MBG) approach to generate a continuous surface (at pixel resolution of 5 km x 5km) of microfilarial prevalence in West Africa prior to the commencement of the OCP. Uncertainty in model predictions was measured using a suite of validation statistics, performed on bootstrap samples of held-out validation data. The mean Pearson’s correlation between observed and estimated prevalence at validation locations was 0.693; the mean prediction error (average difference between observed and estimated values) was 0.77%, and the mean absolute prediction error (average magnitude of difference between observed and estimated values) was 12.2%. Within OCP boundaries, 17.8 million people were deemed to have been at risk, 7.55 million to have been infected, and mean microfilarial prevalence to have been 45% (range: 2–90%) in 1975. Conclusions and Significance This is the first map of initial onchocerciasis prevalence in West Africa using B-MBG. Important environmental predictors of infection prevalence were identified and used in a model out-performing those without spatial random effects or environmental covariates. Results may be compared with recent epidemiological mapping efforts to find areas of persisting transmission. These methods may be extended to areas where

  15. Cognitive remediation for individuals with psychosis in a supported education setting: a randomized controlled trial.

    PubMed

    Kidd, Sean A; Kaur, Jaswant; Virdee, Gursharan; George, Tony P; McKenzie, Kwame; Herman, Yarissa

    2014-08-01

    Cognitive remediation (CR) has demonstrated good outcomes when paired with supported employment; however, little is known about its effectiveness when integrated into a supported education program. This randomized controlled trial examined the effectiveness of integrating CR within a supported education program compared with supported education without CR. Thirty-seven students with psychosis were recruited into the study in the 2012 academic year. Academic functioning, cognition, self-esteem, and symptomatology were assessed at baseline, at 4 months following the first academic term in which CR was provided, and at 8 months, assessing maintenance of gains. The treatment group demonstrated better retention in the academic program and a trend of improvement across a range of academic functional domains. While both treatment and control groups showed improvement in cognitive measures, the outcomes were not augmented by CR training. CR was also associated with significant and sustained improvements in self-esteem. Further research investigating specific intervention components is required to clarify the mixed findings regarding the effectiveness of CR in an education setting. PMID:24893903

  16. GEOSTATISTICS FOR WASTE MANAGEMENT: A USER'S MANUAL FOR THE GEOPACK (VERSION 1.0) GEOSTATISTICAL SOFTWARE SYSTEM

    EPA Science Inventory

    GEOPACK, a comprehensive user-friendly geostatistical software system, was developed to help in the analysis of spatially correlated data. The software system was developed to be used by scientists, engineers, regulators, etc., with little experience in geostatistical techniques...

  17. H-Area/ITP Geostatistical Assessment of In-Situ and Engineering Properties

    SciTech Connect

    Wyatt, D.; Bartlett, S.F.; Rouhani, S.; Lin, Y.

    1995-09-05

    This report describes the results of a geostatistical investigation conducted in response to tasks and requirements set out in the SRS Request for Proposal Number 93044EQ, dated January 4, 1994. The project tasks were defined in the Statement of Scope (SOS) Number 93044, Revision 0, dated December 1, 1993, developed by the Savannah River Site Geotechnical Services Department.

  18. Maximizing device-independent randomness from a Bell experiment by optimizing the measurement settings

    NASA Astrophysics Data System (ADS)

    Assad, S. M.; Thearle, O.; Lam, P. K.

    2016-07-01

    The rates at which a user can generate device-independent quantum random numbers from a Bell-type experiment depend on the measurements that the user performs. By numerically optimizing over these measurements, we present lower bounds on the randomness generation rates for a family of two-qubit states composed from a mixture of partially entangled states and the completely mixed state. We also report on the randomness generation rates from a tomographic measurement. Interestingly in this case, the randomness generation rates are not monotonic functions of entanglement.

  19. The Effect of Distributed Practice in Undergraduate Statistics Homework Sets: A Randomized Trial

    ERIC Educational Resources Information Center

    Crissinger, Bryan R.

    2015-01-01

    Most homework sets in statistics courses are constructed so that students concentrate or "mass" their practice on a certain topic in one problem set. Distributed practice homework sets include review problems in each set so that practice on a topic is distributed across problem sets. There is a body of research that points to the…

  20. Spatial continuity measures for probabilistic and deterministic geostatistics

    SciTech Connect

    Isaaks, E.H.; Srivastava, R.M.

    1988-05-01

    Geostatistics has traditionally used a probabilistic framework, one in which expected values or ensemble averages are of primary importance. The less familiar deterministic framework views geostatistical problems in terms of spatial integrals. This paper outlines the two frameworks and examines the issue of which spatial continuity measure, the covariance C(h) or the variogram σ(h), is appropriate for each framework. Although C(h) and σ(h) were defined originally in terms of spatial integrals, the convenience of probabilistic notation made the expected value definitions more common. These now classical expected value definitions entail a linear relationship between C(h) and σ(h); the spatial integral definitions do not. In a probabilistic framework, where available sample information is extrapolated to domains other than the one which was sampled, the expected value definitions are appropriate; furthermore, within a probabilistic framework, reasons exist for preferring the variogram to the covariance function. In a deterministic framework, where available sample information is interpolated within the same domain, the spatial integral definitions are appropriate and no reasons are known for preferring the variogram. A case study on a Wiener-Levy process demonstrates differences between the two frameworks and shows that, for most estimation problems, the deterministic viewpoint is more appropriate. Several case studies on real data sets reveal that the sample covariance function reflects the character of spatial continuity better than the sample variogram. From both theoretical and practical considerations, clearly for most geostatistical problems, direct estimation of the covariance is better than the traditional variogram approach.
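Under the classical expected-value definitions, the linear relationship between the two continuity measures is variogram(h) = C(0) - C(h), which can be checked numerically with the standard estimators. A small illustrative stdlib-only sketch on a synthetic 1-D transect (white noise, so C(h > 0) ≈ 0 and the variogram plateaus at the variance):

```python
import random

def sample_covariance(z, h):
    """Classical (expected-value) covariance estimator on a regular 1-D transect."""
    n = len(z) - h
    head, tail = z[:n], z[h:]
    m1 = sum(head) / n
    m2 = sum(tail) / n
    return sum((a - m1) * (b - m2) for a, b in zip(head, tail)) / n

def sample_variogram(z, h):
    """Classical semivariogram estimator: half the mean squared increment at lag h."""
    n = len(z) - h
    return sum((z[i + h] - z[i]) ** 2 for i in range(n)) / (2 * n)

random.seed(0)
z = [random.gauss(0, 1) for _ in range(5000)]  # synthetic white-noise transect

g1 = sample_variogram(z, 1)
c0 = sample_covariance(z, 0)
c1 = sample_covariance(z, 1)
# Expected-value definitions: variogram(h) ~ C(0) - C(h) up to sampling noise
assert abs(g1 - (c0 - c1)) < 0.1
```

With the spatial-integral definitions the two estimators are not tied by this identity, which is the paper's central distinction between the frameworks.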

  1. Preventing Depression among Early Adolescents in the Primary Care Setting: A Randomized Controlled Study of the Penn Resiliency Program

    ERIC Educational Resources Information Center

    Gillham, Jane E.; Hamilton, John; Freres, Derek R.; Patton, Ken; Gallop, Robert

    2006-01-01

    This study evaluated the Penn Resiliency Program's effectiveness in preventing depression when delivered by therapists in a primary care setting. Two-hundred and seventy-one 11- and 12-year-olds, with elevated depressive symptoms, were randomized to PRP or usual care. Over the 2-year follow-up, PRP improved explanatory style for positive events.…

  2. Observation of Lévy distribution and replica symmetry breaking in random lasers from a single set of measurements.

    PubMed

    Gomes, Anderson S L; Raposo, Ernesto P; Moura, André L; Fewo, Serge I; Pincheira, Pablo I R; Jerez, Vladimir; Maia, Lauro J Q; de Araújo, Cid B

    2016-01-01

    Random lasers have been recently exploited as a photonic platform for studies of complex systems. This cross-disciplinary approach opened up new important avenues for the understanding of random-laser behavior, including Lévy-type distributions of strong intensity fluctuations and phase transitions to a photonic spin-glass phase. In this work, we employ the Nd:YBO random laser system to unveil, from a single set of measurements, the physical origin of the complex correspondence between the Lévy fluctuation regime and the replica-symmetry-breaking transition to the spin-glass phase. A novel unexpected finding is also reported: the trend to suppress the spin-glass behavior for high excitation pulse energies. The present description from first principles of this correspondence unfolds new possibilities to characterize other random lasers, such as random fiber lasers, nanolasers and small lasers, which include plasmonic-based, photonic-crystal and bio-derived nanodevices. The statistical nature of the emission provided by random lasers can also impact on their prominent use as sources for speckle-free laser imaging, which nowadays represents one of the most promising applications of random lasers, with expected progress even in cancer research. PMID:27292095

  3. Observation of Lévy distribution and replica symmetry breaking in random lasers from a single set of measurements

    NASA Astrophysics Data System (ADS)

    Gomes, Anderson S. L.; Raposo, Ernesto P.; Moura, André L.; Fewo, Serge I.; Pincheira, Pablo I. R.; Jerez, Vladimir; Maia, Lauro J. Q.; de Araújo, Cid B.

    2016-06-01

    Random lasers have been recently exploited as a photonic platform for studies of complex systems. This cross-disciplinary approach opened up new important avenues for the understanding of random-laser behavior, including Lévy-type distributions of strong intensity fluctuations and phase transitions to a photonic spin-glass phase. In this work, we employ the Nd:YBO random laser system to unveil, from a single set of measurements, the physical origin of the complex correspondence between the Lévy fluctuation regime and the replica-symmetry-breaking transition to the spin-glass phase. A novel unexpected finding is also reported: the trend to suppress the spin-glass behavior for high excitation pulse energies. The present description from first principles of this correspondence unfolds new possibilities to characterize other random lasers, such as random fiber lasers, nanolasers and small lasers, which include plasmonic-based, photonic-crystal and bio-derived nanodevices. The statistical nature of the emission provided by random lasers can also impact on their prominent use as sources for speckle-free laser imaging, which nowadays represents one of the most promising applications of random lasers, with expected progress even in cancer research.

  4. Observation of Lévy distribution and replica symmetry breaking in random lasers from a single set of measurements

    PubMed Central

    Gomes, Anderson S. L.; Raposo, Ernesto P.; Moura, André L.; Fewo, Serge I.; Pincheira, Pablo I. R.; Jerez, Vladimir; Maia, Lauro J. Q.; de Araújo, Cid B.

    2016-01-01

    Random lasers have been recently exploited as a photonic platform for studies of complex systems. This cross-disciplinary approach opened up new important avenues for the understanding of random-laser behavior, including Lévy-type distributions of strong intensity fluctuations and phase transitions to a photonic spin-glass phase. In this work, we employ the Nd:YBO random laser system to unveil, from a single set of measurements, the physical origin of the complex correspondence between the Lévy fluctuation regime and the replica-symmetry-breaking transition to the spin-glass phase. A novel unexpected finding is also reported: the trend to suppress the spin-glass behavior for high excitation pulse energies. The present description from first principles of this correspondence unfolds new possibilities to characterize other random lasers, such as random fiber lasers, nanolasers and small lasers, which include plasmonic-based, photonic-crystal and bio-derived nanodevices. The statistical nature of the emission provided by random lasers can also impact on their prominent use as sources for speckle-free laser imaging, which nowadays represents one of the most promising applications of random lasers, with expected progress even in cancer research. PMID:27292095

  5. A geostatistical algorithm to reproduce lateral gradual facies transitions: Description and implementation

    NASA Astrophysics Data System (ADS)

    Falivene, Oriol; Cabello, Patricia; Arbués, Pau; Muñoz, Josep Anton; Cabrera, Lluís

    2009-08-01

    Valid representations of geological heterogeneity are fundamental inputs for quantitative models used in managing subsurface activities. Consequently, the simulation of realistic facies distributions is a significant aim. Realistic facies distributions are typically obtained by pixel-based, object-based or process-based methods. This work presents a pixel-based geostatistical algorithm suitable for reproducing lateral gradual facies transitions (LGFT) between two adjacent sedimentary bodies. Lateral contact (i.e. interfingering) between distinct depositional facies is a widespread geometric relationship that occurs at different scales in any depositional system. The algorithm is based on the truncation of the sum of a linear expectation trend and a random Gaussian field, and can be conditioned to well data. The implementation introduced herein also includes subroutines to clean and geometrically characterize the obtained LGFT. The cleaned sedimentary body transition provides a more appropriate and realistic facies distribution for some depositional settings. The geometric measures of the LGFT yield an intuitive measure of the morphology of the sedimentary body boundary, which can be compared to analogue data. An example of a LGFT obtained by the algorithm presented herein is also flow simulated, quantitatively demonstrating the importance of realistically reproducing them in subsurface models, if further flow-related accurate predictions are to be made.
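A drastically simplified 1-D sketch of the truncation idea described above (truncating the sum of a linear expectation trend and a correlated Gaussian field); the trend slope, field scale, and moving-average correlation model are hypothetical choices for illustration, not those of the published algorithm:

```python
import math
import random

random.seed(42)
nx = 200

# Hypothetical linear expectation trend: facies A favored on the left,
# facies B on the right (+2 down to -2 across the transect)
trend = [2.0 - 4.0 * i / (nx - 1) for i in range(nx)]

# Correlated Gaussian field from a moving average of white noise
noise = [random.gauss(0, 1) for _ in range(nx + 10)]
field = [sum(noise[i:i + 10]) / math.sqrt(10) for i in range(nx)]

# Truncation rule: the sign of trend + (scaled) field picks the facies,
# producing an interfingered transition zone in the middle of the transect
facies = ['A' if t + 0.5 * f > 0 else 'B' for t, f in zip(trend, field)]

# Far from the transition the trend dominates; interfingering occurs in between
assert facies[:20].count('A') > 15 and facies[-20:].count('B') > 15
```

Conditioning to well data and the cleaning/geometric-characterization subroutines of the actual algorithm are beyond this toy example.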

  6. A Comparison of Hourly Typhoon Rainfall Forecasting Models Based on Support Vector Machines and Random Forests with Different Predictor Sets

    NASA Astrophysics Data System (ADS)

    Lin, Kun-Hsiang; Tseng, Hung-Wei; Kuo, Chen-Min; Yang, Tao-Chang; Yu, Pao-Shan

    2016-04-01

    Typhoons with heavy rainfall and strong wind often cause severe floods and losses in Taiwan, which motivates the development of rainfall forecasting models as part of an early warning system. Thus, this study aims to develop rainfall forecasting models based on two machine learning methods, support vector machines (SVMs) and random forests (RFs), and to investigate the performances of the models with different predictor sets in search of the optimal predictor set for forecasting. Four predictor sets were used to construct models for 1- to 6-hour-ahead rainfall forecasting: (1) antecedent rainfalls, (2) antecedent rainfalls and typhoon characteristics, (3) antecedent rainfalls and meteorological factors, and (4) antecedent rainfalls, typhoon characteristics and meteorological factors. An application to three rainfall stations in the Yilan River basin, northeastern Taiwan, was conducted. Firstly, the performance of the SVMs-based forecasting model with predictor set #1 was analyzed. The results show that the accuracy of the models for 2- to 6-hour-ahead forecasting decreases rapidly compared with the accuracy of the 1-hour-ahead model, which is acceptable. To improve model performance, each predictor set was further examined in the SVMs-based forecasting model. The results reveal that the SVMs-based model using predictor set #4 as input variables performs better than the other sets, and a significant improvement of model performance is found, especially for long lead-time forecasting. Lastly, the performance of the SVMs-based model using predictor set #4 was compared with that of the RFs-based model using the same predictor set. It is found that the RFs-based model is superior to the SVMs-based model in hourly typhoon rainfall forecasting. Keywords: hourly typhoon rainfall forecasting, predictor selection, support vector machines, random forests
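The mapping from antecedent rainfall to supervised learning pairs for a given lead time can be sketched as follows; the helper name and toy series are hypothetical, and typhoon characteristics or meteorological factors (predictor sets #2-#4) would simply be appended as extra feature columns:

```python
def build_predictor_set(rain, lags, lead):
    """Build (X, y) pairs for `lead`-hour-ahead forecasting using antecedent
    rainfall only (predictor set #1 in the study's terminology):
    X[t] = rainfall at lags behind time t, y[t] = rainfall at t + lead."""
    X, y = [], []
    for t in range(max(lags), len(rain) - lead):
        X.append([rain[t - lag] for lag in lags])
        y.append(rain[t + lead])
    return X, y

# Toy hourly rainfall series (mm); lags 0-2 hours, 1-hour-ahead target
rain = [0, 1, 3, 7, 12, 9, 5, 2, 1, 0, 0, 4]
X, y = build_predictor_set(rain, lags=[0, 1, 2], lead=1)
assert X[0] == [3, 1, 0] and y[0] == 7
```

The same (X, y) pairs would then be fed to an SVM or random-forest regressor, one model per lead time, which is how multi-hour-ahead forecasts are commonly organized.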

  7. Proposed sets of critical exponents for randomly branched polymers, using a known string theory model

    NASA Astrophysics Data System (ADS)

    March, N. H.; Moreno, A. J.

    2016-06-01

    The critical exponent ν for randomly branched polymers with dimensionality d equal to 3 is known exactly as 1/2. Here, we invoke an already available string theory model to predict the remaining static critical exponents. Utilizing the results of Hsu et al. (Comput Phys Commun. 2005;169:114-116), results are added for d = 8. Experiment plus simulation would now be important to confirm, or if necessary refine, the proposed values.

  8. Set statistics in conductive bridge random access memory device with Cu/HfO2/Pt structure

    SciTech Connect

    Zhang, Meiyun; Long, Shibing; Wang, Guoming; Xu, Xiaoxin; Li, Yang; Liu, Qi; Lv, Hangbing; Liu, Ming; Lian, Xiaojuan; Miranda, Enrique; Suñé, Jordi

    2014-11-10

    The switching parameter variation of resistive switching memory is one of the most important challenges in its application. In this letter, we have studied the set statistics of conductive bridge random access memory with a Cu/HfO2/Pt structure. The experimental distributions of the set parameters in several off-resistance ranges are shown to nicely fit a Weibull model. The Weibull slopes of the set voltage and current increase and decrease logarithmically with off resistance, respectively. This experimental behavior is perfectly captured by a Monte Carlo simulator based on the cell-based set voltage statistics model and the Quantum Point Contact electron transport model. Our work provides indications for the improvement of the switching uniformity.
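The Weibull slope reported in such set statistics is typically read off a Weibull probability plot, where ln(-ln(1-F)) is linear in ln V with slope β (the shape parameter). A stdlib-only sketch on synthetic set voltages; the shape and scale values are illustrative, not measured device parameters:

```python
import math
import random

def weibull_slope(samples):
    """Estimate the Weibull slope (shape beta) from a Weibull probability plot:
    regress ln(-ln(1-F)) on ln(V), with F from median ranks (i-0.3)/(n+0.4)."""
    xs = sorted(samples)
    n = len(xs)
    pts = [(math.log(v), math.log(-math.log(1 - (i + 0.7) / (n + 0.4))))
           for i, v in enumerate(xs)]
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    sxy = sum((x - mx) * (y - my) for x, y in pts)
    sxx = sum((x - mx) ** 2 for x, _ in pts)
    return sxy / sxx  # slope of the Weibull plot

random.seed(1)
beta_true, alpha = 3.0, 1.2  # hypothetical shape and scale (set voltage, V)
# Inverse-CDF sampling: V = alpha * (-ln(1-U))^(1/beta)
vset = [alpha * (-math.log(1 - random.random())) ** (1 / beta_true)
        for _ in range(2000)]
beta_hat = weibull_slope(vset)
assert abs(beta_hat - beta_true) < 0.3
```

Fitting the slope in several off-resistance bins, as the paper does, would then reveal how β trends with off resistance.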

  9. Reducing HIV-Related Stigma in Health Care Settings: A Randomized Controlled Trial in China

    PubMed Central

    Wu, Zunyou; Liang, Li-Jung; Guan, Jihui; Jia, Manhong; Rou, Keming; Yan, Zhihua

    2013-01-01

    Objectives. The objective of the intervention was to reduce service providers’ stigmatizing attitudes and behaviors toward people living with HIV. Methods. The randomized controlled trial was conducted in 40 county-level hospitals in 2 provinces of China between October 2008 and February 2010. Forty-four service providers were randomly selected from each hospital, yielding a total of 1760 study participants. We randomized the hospitals to either an intervention condition or a control condition. In the intervention hospitals, about 15% of the popular opinion leaders were identified and trained to disseminate stigma reduction messages. Results. We observed significant improvements for the intervention group in reducing prejudicial attitudes (P < .001), reducing avoidance intent towards people living with HIV (P < .001), and increasing institutional support in the hospitals (P = .003) at 6 months after controlling for service providers’ background factors and clinic-level characteristics. The intervention effects were sustained and strengthened at 12 months. Conclusions. The intervention reduced stigmatizing attitudes and behaviors among service providers. It has the potential to be integrated into the health care systems in China and other countries. PMID:23237175

  10. Mental health first aid training in a workplace setting: A randomized controlled trial [ISRCTN13249129

    PubMed Central

    Kitchener, Betty A; Jorm, Anthony F

    2004-01-01

    Background The Mental Health First Aid training course was favorably evaluated in an uncontrolled trial in 2002 showing improvements in participants' mental health literacy, including knowledge, stigmatizing attitudes, confidence and help provided to others. This article reports the first randomized controlled trial of this course. Methods Data are reported on 301 participants randomized to either participate immediately in a course or to be wait-listed for 5 months before undertaking the training. The participants were employees in two large government departments in Canberra, Australia, where the courses were conducted during participants' work time. Data were analyzed according to an intention-to-treat approach. Results The trial found a number of benefits from this training course, including greater confidence in providing help to others, greater likelihood of advising people to seek professional help, improved concordance with health professionals about treatments, and decreased stigmatizing attitudes. An additional unexpected but exciting finding was an improvement in the mental health of the participants themselves. Conclusions The Mental Health First Aid training has shown itself to be not only an effective way to improve participants' mental health literacy but also to improve their own mental health. It is a course that has high applicability across the community. PMID:15310395

  11. Geostatistical Interpolation of Particle-Size Curves in Heterogeneous Aquifers

    NASA Astrophysics Data System (ADS)

    Guadagnini, A.; Menafoglio, A.; Secchi, P.

    2013-12-01

We address the problem of predicting the spatial field of particle-size curves (PSCs) from measurements associated with soil samples collected at a discrete set of locations within an aquifer system. Proper estimates of the full PSC are relevant to applications related to groundwater hydrology, soil science and geochemistry and aimed at modeling physical and chemical processes occurring in heterogeneous earth systems. Hence, we focus on providing kriging estimates of the entire PSC at unsampled locations. To this end, we treat particle-size curves as cumulative distribution functions, model their densities as functional compositional data and analyze them by embedding these into the Hilbert space of compositional functions endowed with the Aitchison geometry. On this basis, we develop a new geostatistical methodology for the analysis of spatially dependent functional compositional data. Our functional compositional kriging (FCK) approach provides predictions of the entire particle-size curve at unsampled locations, together with a quantification of the associated uncertainty, by fully exploiting both the functional form of the data and their compositional nature. This is a key advantage of our approach over traditional methodologies, which treat only a set of selected features (e.g., quantiles) of PSCs. Embedding the full PSC into a geostatistical analysis enables one to provide a complete characterization of the spatial distribution of lithotypes in a reservoir, ultimately leading to improved predictions of soil hydraulic attributes through pedotransfer functions as well as of soil geochemical parameters which are relevant in sorption/desorption and cation exchange processes. We test our new method on PSCs sampled along a borehole located within an alluvial aquifer near the city of Tuebingen, Germany. The quality of FCK predictions is assessed through leave-one-out cross-validation. A comparison between hydraulic conductivity estimates obtained
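The clr-based kriging idea in this record can be sketched numerically. The toy example below (all locations, compositions, and the exponential covariance model are invented for illustration, not taken from the paper) kriges discretized particle-size densities in centred log-ratio (Aitchison) space and maps the prediction back to a valid composition:

```python
import numpy as np

# Toy setup: particle-size densities at 4 boreholes, discretized into
# 5 grain-size bins (rows sum to 1). Locations are 2-D coordinates.
locs = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
comps = np.array([
    [0.40, 0.30, 0.15, 0.10, 0.05],
    [0.35, 0.30, 0.20, 0.10, 0.05],
    [0.20, 0.25, 0.25, 0.20, 0.10],
    [0.10, 0.20, 0.30, 0.25, 0.15],
])

def clr(p):
    """Centred log-ratio transform (Aitchison geometry)."""
    g = np.exp(np.mean(np.log(p), axis=-1, keepdims=True))
    return np.log(p / g)

def clr_inv(z):
    """Back-transform to a positive composition summing to one."""
    e = np.exp(z - z.max())
    return e / e.sum()

def cov(h, sill=1.0, corr_len=1.5):
    """Exponential covariance model (an assumed choice)."""
    return sill * np.exp(-h / corr_len)

# Ordinary-kriging system at a target location x0.
x0 = np.array([0.5, 0.5])
n = len(locs)
d = np.linalg.norm(locs[:, None] - locs[None, :], axis=-1)
A = np.zeros((n + 1, n + 1))
A[:n, :n] = cov(d)
A[:n, n] = 1.0            # unbiasedness constraint
A[n, :n] = 1.0
b = np.append(cov(np.linalg.norm(locs - x0, axis=1)), 1.0)
w = np.linalg.solve(A, b)[:n]   # kriging weights, sum to 1

# Krige in clr space, then map back to a valid composition.
z_hat = w @ clr(comps)
p_hat = clr_inv(z_hat)
print(p_hat, p_hat.sum())
```

The back-transform guarantees the prediction is itself a valid (positive, unit-sum) particle-size density, which plain component-wise kriging would not.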

  12. Geostatistical Analysis of Spatial Variability of Mineral Abundance and Kd in Frenchman Flat, NTS, Alluvium

    SciTech Connect

    Carle, S F; Zavarin, M; Pawloski, G A

    2002-11-01

LLNL hydrologic source term modeling at the Cambric site (Pawloski et al., 2000) showed that retardation of radionuclide transport is sensitive to the distribution and amount of radionuclide-sorbing minerals. While all mineralogic information available near the Cambric site was used in these early simulations (11 mineral abundance analyses from UE-5n and 9 from RNM-1), these older data sets were qualitative in nature, with detection limits too high to accurately measure many of the important radionuclide-sorbing minerals (e.g. iron oxide). Also, the sparse nature of the mineral abundance data permitted only a hypothetical description of the spatial distribution of radionuclide-sorbing minerals. Yet, the modeling results predicted that the spatial distribution of sorbing minerals would strongly affect radionuclide transport. Clearly, additional data are needed to improve understanding of mineral abundances and their spatial distributions if model predictions in Frenchman Flat are to be defensible. This report evaluates new high-resolution quantitative X-Ray Diffraction (XRD) data on mineral distributions and their abundances from core samples recently collected from drill hole ER-5-4. A total of 94 samples from ER-5-4 were collected at various spacings to enable evaluation of spatial variability at a variety of spatial scales, from as small as 0.3 meters up to hundreds of meters. Additional XRD analyses obtained from drill holes UE-5n, ER-5-3, and U-11g-1 are used to augment evaluation of vertical spatial variability and permit some evaluation of lateral spatial variability. A total of 163 samples are evaluated. The overall goal of this study is to understand and characterize the spatial variation of sorbing minerals in Frenchman Flat alluvium using geostatistical techniques, with consideration for the potential impact on reactive transport of radionuclides. To achieve this goal requires an effort to ensure that plausible geostatistical models are used to
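Geostatistical characterization of this kind typically starts from an empirical semivariogram of abundance versus separation distance. A minimal sketch (synthetic depth profile and abundances, not the ER-5-4 data):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic vertical profile: a sorbing-mineral abundance (wt%) at
# irregular sample depths, with smooth spatial structure plus noise.
depth = np.sort(rng.uniform(0, 100, 120))
signal = 2.0 + np.sin(depth / 8.0)
abund = signal + 0.3 * rng.standard_normal(depth.size)

def empirical_semivariogram(x, v, lag_edges):
    """Classical (Matheron) estimator: gamma(h) is the mean of
    0.5 * (v_i - v_j)^2 over pairs whose separation falls in each lag bin."""
    h = np.abs(x[:, None] - x[None, :])
    sq = 0.5 * (v[:, None] - v[None, :]) ** 2
    gam = []
    for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
        mask = (h > lo) & (h <= hi)   # h > 0 excludes self-pairs
        gam.append(sq[mask].mean())
    return np.array(gam)

lag_edges = np.array([0.0, 2.0, 5.0, 10.0, 20.0])
gamma = empirical_semivariogram(depth, abund, lag_edges)
print(gamma)
```

For a spatially correlated field, the semivariogram rises from a nugget at short lags toward a sill, which is what motivates sampling at a range of spacings as done in the record above.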

  13. Does Pedometer Goal Setting Improve Physical Activity among Native Elders? Results from a Randomized Pilot Study

    ERIC Educational Resources Information Center

    Sawchuk, Craig N.; Russo, Joan E.; Charles, Steve; Goldberg, Jack; Forquera, Ralph; Roy-Byrne, Peter; Buchwald, Dedra

    2011-01-01

    We examined if step-count goal setting resulted in increases in physical activity and walking compared to only monitoring step counts with pedometers among American Indian/Alaska Native elders. Outcomes included step counts, self-reported physical activity and well-being, and performance on the 6-minute walk test. Although no significant…

  14. Evaluating the Bookmark Standard Setting Method: The Impact of Random Item Ordering

    ERIC Educational Resources Information Center

    Davis-Becker, Susan L.; Buckendahl, Chad W.; Gerrow, Jack

    2011-01-01

    Throughout the world, cut scores are an important aspect of a high-stakes testing program because they are a key operational component of the interpretation of test scores. One method for setting standards that is prevalent in educational testing programs--the Bookmark method--is intended to be a less cognitively complex alternative to methods…

  15. Comparing of goal setting strategy with group education method to increase physical activity level: A randomized trial

    PubMed Central

    Jiryaee, Nasrin; Siadat, Zahra Dana; Zamani, Ahmadreza; Taleban, Roya

    2015-01-01

Background: An intervention to increase physical activity should be designed around the resources of the health care setting and be acceptable to the target group. This study was designed to assess and compare the effect of a goal-setting strategy with a group education method on increasing the physical activity of mothers of children aged 1 to 5 years. Materials and Methods: Mothers who had at least one child aged 1-5 years were randomized into two groups. The effect of 1) the goal-setting strategy and 2) the group education method on increasing physical activity was assessed and compared 1 month and 3 months after the intervention. Also, the weight, height, body mass index (BMI), waist and hip circumference, and well-being were compared between the two groups before and after the intervention. Results: Physical activity level increased significantly after the intervention in the goal-setting group and was significantly different between the two groups after the intervention (P < 0.05). BMI, waist circumference, hip circumference, and well-being score were significantly different in the goal-setting group after the intervention. In the group education method, only the well-being score improved significantly (P < 0.05). Conclusion: Our study demonstrated the effects of using the goal-setting strategy to boost physical activity, improving the state of well-being and decreasing BMI, waist, and hip circumference. PMID:26929765

  16. PREDICTION INTERVALS FOR INTEGRALS OF GAUSSIAN RANDOM FIELDS.

    PubMed

    De Oliveira, Victor; Kone, Bazoumana

    2015-03-01

    Methodology is proposed for the construction of prediction intervals for integrals of Gaussian random fields over bounded regions (called block averages in the geostatistical literature) based on observations at a finite set of sampling locations. Two bootstrap calibration algorithms are proposed, termed indirect and direct, aimed at improving upon plug-in prediction intervals in terms of coverage probability. A simulation study is carried out that illustrates the effectiveness of both procedures, and these procedures are applied to estimate block averages of chromium traces in a potentially contaminated region in Switzerland. PMID:25431507
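The plug-in prediction interval that the bootstrap calibration aims to improve upon can be illustrated directly. A sketch with an assumed exponential covariance, known parameters, and synthetic observations (not the Swiss chromium data): the block average over a region is approximated by the mean of the Gaussian field over a fine grid, and its conditional distribution given the observations yields the interval.

```python
import numpy as np

rng = np.random.default_rng(1)

def cov(h, sill=1.0, rho=0.3):
    # Exponential covariance; parameters are treated as known ("plug-in").
    return sill * np.exp(-h / rho)

def cdist(a, b):
    return np.linalg.norm(a[:, None] - b[None, :], axis=-1)

# Sampling locations and a synthetic realization observed there.
S = rng.uniform(0, 1, size=(25, 2))
C_SS = cov(cdist(S, S)) + 1e-10 * np.eye(25)
z = np.linalg.cholesky(C_SS) @ rng.standard_normal(25)

# Discretize [0,1]^2; the block average is approximated by the mean of
# the field over these grid nodes.
g = (np.arange(20) + 0.5) / 20
G = np.array([(x, y) for x in g for y in g])

C_GS = cov(cdist(G, S))
C_GG = cov(cdist(G, G))
K = np.linalg.solve(C_SS, C_GS.T)   # simple-kriging weights per grid node

cond_mean = (K.T @ z).mean()        # conditional mean of the block average
cond_cov = C_GG - C_GS @ K          # conditional covariance of grid nodes
block_var = cond_cov.mean()         # variance of the spatial mean
lo, hi = cond_mean + np.array([-1.96, 1.96]) * np.sqrt(block_var)
print(lo, hi)
```

Because the covariance parameters are plugged in rather than estimated, such intervals tend to undercover, which is precisely the deficiency the paper's indirect and direct bootstrap calibrations target.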

  18. Continuous quality improvement (CQI) in addiction treatment settings: design and intervention protocol of a group randomized pilot study

    PubMed Central

    2014-01-01

    Background Few studies have designed and tested the use of continuous quality improvement approaches in community based substance use treatment settings. Little is known about the feasibility, costs, efficacy, and sustainment of such approaches in these settings. Methods/Design A group-randomized trial using a modified stepped wedge design is being used. In the first phase of the study, eight programs, stratified by modality (residential, outpatient) are being randomly assigned to the intervention or control condition. In the second phase, the initially assigned control programs are receiving the intervention to gain additional information about feasibility while sustainment is being studied among the programs initially assigned to the intervention. Discussion By using this design in a pilot study, we help inform the field about the feasibility, costs, efficacy and sustainment of the intervention. Determining information at the pilot stage about costs and sustainment provides value for designing future studies and implementation strategies with the goal to reduce the time between intervention development and translation to real world practice settings. PMID:24467770

  19. Randomized Trial of Two Dissemination Strategies for a Skin Cancer Prevention Program in Aquatic Settings

    PubMed Central

    Escoffery, Cam; Elliott, Tom; Nehl, Eric J.

    2015-01-01

    Objectives. We compared 2 strategies for disseminating an evidence-based skin cancer prevention program. Methods. We evaluated the effects of 2 strategies (basic vs enhanced) for dissemination of the Pool Cool skin cancer prevention program in outdoor swimming pools on (1) program implementation, maintenance, and sustainability and (2) improvements in organizational and environmental supports for sun protection. The trial used a cluster-randomized design with pools as the unit of intervention and outcome. The enhanced group received extra incentives, reinforcement, feedback, and skill-building guidance. Surveys were collected in successive years (2003–2006) from managers of 435 pools in 33 metropolitan areas across the United States participating in the Pool Cool Diffusion Trial. Results. Both treatment groups improved their implementation of the program, but pools in the enhanced condition had significantly greater overall maintenance of the program over 3 summers of participation. Furthermore, pools in the enhanced condition established and maintained significantly greater sun-safety policies and supportive environments over time. Conclusions. This study found that more intensive, theory-driven dissemination strategies can significantly enhance program implementation and maintenance of health-promoting environmental and policy changes. Future research is warranted through longitudinal follow-up to examine sustainability. PMID:25521872

  20. Wave scattering from random sets of closely spaced objects through linear embedding via Green's operators

    NASA Astrophysics Data System (ADS)

    Lancellotti, V.; de Hon, B. P.; Tijhuis, A. G.

    2011-08-01

In this paper we present the application of linear embedding via Green's operators (LEGO) to the solution of the electromagnetic scattering from clusters of arbitrary (both conducting and penetrable) bodies randomly placed in a homogeneous background medium. In the LEGO method the objects are enclosed within simple-shaped bricks described in turn via scattering operators of equivalent surface current densities. Such operators have to be computed only once for a given frequency, and hence they can be re-used to study many distributions comprising the same objects located in different positions. The surface integral equations of LEGO are solved via the Method of Moments combined with Adaptive Cross Approximation (to save memory) and Arnoldi basis functions (to compress the system). By means of purposefully selected numerical experiments we discuss the time requirements with respect to the geometry of a given distribution. In addition, we derive an approximate relationship between the (near-field) accuracy of the computed solution and the number of Arnoldi basis functions used to obtain it. This result endows LEGO with a handy practical criterion for both estimating the error and keeping it in check.

  1. Relevant feature set estimation with a knock-out strategy and random forests.

    PubMed

    Ganz, Melanie; Greve, Douglas N; Fischl, Bruce; Konukoglu, Ender

    2015-11-15

Group analysis of neuroimaging data is a vital tool for identifying anatomical and functional variations related to diseases as well as normal biological processes. The analyses are often performed on a large number of highly correlated measurements using a relatively smaller number of samples. Despite the correlation structure, the most widely used approach is to analyze the data using univariate methods followed by post-hoc corrections that try to account for the data's multivariate nature. Although widely used, this approach may fail to recover from the adverse effects of the initial analysis when local effects are not strong. Multivariate pattern analysis (MVPA) is a powerful alternative to the univariate approach for identifying relevant variations. By jointly analyzing all the measures, MVPA techniques can detect global effects even when individual local effects are too weak to detect with univariate analysis. Current approaches are successful in identifying variations that yield highly predictive and compact models. However, they suffer from reduced sensitivity and instabilities in identification of relevant variations. Furthermore, current methods' user-defined parameters are often unintuitive and difficult to determine. In this article, we propose a novel MVPA method for group analysis of high-dimensional data that overcomes the drawbacks of the current techniques. Our approach explicitly aims to identify all relevant variations using a "knock-out" strategy and the Random Forest algorithm. In evaluations with synthetic datasets the proposed method achieved substantially higher sensitivity and accuracy than the state-of-the-art MVPA methods, and outperformed the univariate approach when the effect size is low. In experiments with real datasets the proposed method identified regions beyond the univariate approach, while other MVPA methods failed to replicate the univariate results. More importantly, in a reproducibility study with the well-known ADNI dataset
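A simplified stand-in for the knock-out idea can be sketched as follows. This is not the authors' algorithm, only the general strategy: repeatedly fit a random forest, accept the most important feature, remove ("knock out") it, and refit until the top importance is no longer clearly above chance. The data, stopping rule, and the use of scikit-learn's RandomForestClassifier are all illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)

# Synthetic group-analysis data: 200 samples, 20 measures, of which only
# features 0 and 1 carry the group effect.
n, p = 200, 20
X = rng.standard_normal((n, p))
y = (X[:, 0] + X[:, 1] + 0.5 * rng.standard_normal(n) > 0).astype(int)

def knock_out_selection(X, y, n_rounds=5, seed=0):
    """Sketch of a knock-out loop: accept the top-importance feature,
    remove it, refit, and stop when importances flatten toward chance."""
    remaining = list(range(X.shape[1]))
    selected = []
    for _ in range(n_rounds):
        rf = RandomForestClassifier(n_estimators=200, random_state=seed)
        rf.fit(X[:, remaining], y)
        imp = rf.feature_importances_
        best = int(np.argmax(imp))
        # Heuristic stop: top importance below twice the uniform share.
        if imp[best] < 2.0 / len(remaining):
            break
        selected.append(remaining.pop(best))
    return selected

relevant = knock_out_selection(X, y)
print(relevant)
```

Removing an accepted feature forces the next fit to surface weaker but still relevant variations, which is what gives the strategy its higher sensitivity relative to a single importance ranking.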

  2. Probiotics reduce symptoms of antibiotic use in a hospital setting: a randomized dose response study.

    PubMed

    Ouwehand, Arthur C; DongLian, Cai; Weijian, Xu; Stewart, Morgan; Ni, Jiayi; Stewart, Tad; Miller, Larry E

    2014-01-16

Probiotics are known to reduce antibiotic associated diarrhea (AAD) and Clostridium difficile associated diarrhea (CDAD) risk in a strain-specific manner. The aim of this study was to determine the dose-response effect of a four-strain probiotic combination (HOWARU® Restore) on the incidence of AAD and CDAD and the severity of gastrointestinal symptoms in adult in-patients requiring antibiotic therapy. Patients (n=503) were randomized among three study groups: HOWARU® Restore probiotic 1.70×10^10 CFU (high-dose, n=168), HOWARU® Restore probiotic 4.17×10^9 CFU (low-dose, n=168), or placebo (n=167). Subjects were stratified by gender, age, and duration of antibiotic treatment. Study products were administered daily up to 7 days after the final antibiotic dose. The primary endpoint of the study was the incidence of AAD. Secondary endpoints included incidence of CDAD, diarrhea duration, stools per day, bloody stools, fever, abdominal cramping, and bloating. A significant dose-response effect on AAD was observed, with incidences of 12.5, 19.6, and 24.6% with high-dose, low-dose, and placebo, respectively (p=0.02). CDAD incidence was the same in both probiotic groups (1.8%) but differed from the placebo group (4.8%; p=0.04). Incidences of fever, abdominal pain, and bloating were lower with increasing probiotic dose. The number of daily liquid stools and the average duration of diarrhea decreased with higher probiotic dosage. The tested four-strain probiotic combination appears to lower the risk of AAD, CDAD, and gastrointestinal symptoms in a dose-dependent manner in adult in-patients. PMID:24291194

  3. GEOSTATISTICS FOR WASTE MANAGEMENT: A USER'S MANUAL FOR THE GEOPAK (VERSION 1.0) GEOSTATISTICAL SOFTWARE SYSTEM

    EPA Science Inventory

A comprehensive, user-friendly geostatistical software system called GEOPACk has been developed. The purpose of this software is to make available the programs necessary to undertake a geostatistical analysis of spatially correlated data. The programs were written so that they can ...

4. GEOSTATISTICS FOR WASTE MANAGEMENT: A USER'S MANUAL FOR THE GEOPACK (VERSION 1.0) GEOSTATISTICAL SOFTWARE SYSTEM

    EPA Science Inventory

    A comprehensive, user-friendly geostatistical software system called GEOPACk has been developed. The purpose of this software is to make available the programs necessary to undertake a geostatistical analysis of spatially correlated data. The programs were written so that they ...

  5. A pilot randomized controlled trial of a brief parenting intervention in low-resource settings in Panama.

    PubMed

    Mejia, Anilena; Calam, Rachel; Sanders, Matthew R

    2015-07-01

    The aim of this study was to determine whether an intervention from the Triple P Positive Parenting Program system was effective in reducing parental reports of child behavioral difficulties in urban low-income settings in Panama City. A pilot parallel-group randomized controlled trial was carried out. A total of 108 parents of children 3 to 12 years old with some level of parent-rated behavioral difficulties were randomly assigned to a discussion group on "dealing with disobedience" or to a no intervention control. Blinded assessments were carried out prior to the intervention, 2 weeks, 3 months, and 6 months later. Results indicated that parental reports of child behavioral difficulties changed over time and decreased more steeply in the intervention than in the control group. The effects of the intervention on parental reports of behavioral difficulties were moderate at post-intervention and 3-month follow-up, and large at 6-month follow-up. Parents who participated in the discussion group reported fewer behavioral difficulties in their children after the intervention than those in the control condition. They also reported reduced parental stress and less use of dysfunctional parenting practices. There is a limited amount of evidence on the efficacy of parenting interventions in low-resource settings. This pilot trial was carried out using a small convenience sample living in low-income urban communities in Panama City, and therefore, the findings are of reduced generalizability to other settings. However, the methodology employed in this trial represents an example for future work in other low-resource settings. PMID:25703382

  6. Geostatistical integration and uncertainty in pollutant concentration surface under preferential sampling.

    PubMed

    Grisotto, Laura; Consonni, Dario; Cecconi, Lorenzo; Catelan, Dolores; Lagazio, Corrado; Bertazzi, Pier Alberto; Baccini, Michela; Biggeri, Annibale

    2016-01-01

In this paper the focus is on environmental statistics, with the aim of estimating the concentration surface and related uncertainty of an air pollutant. We used air quality data recorded by a network of monitoring stations within a Bayesian framework to overcome difficulties in accounting for prediction uncertainty and to integrate information provided by deterministic models based on emissions, meteorology and chemico-physical characteristics of the atmosphere. Several authors have proposed such integration, but all the proposed approaches rely on the representativeness and completeness of existing air pollution monitoring networks. We considered the situation in which the spatial process of interest and the sampling locations are not independent. This is known in the literature as the preferential sampling problem, which, if ignored in the analysis, can bias geostatistical inferences. We developed a Bayesian geostatistical model to account for preferential sampling, with the main interest in statistical integration and uncertainty. We used PM10 data arising from the air quality network of the Environmental Protection Agency of Lombardy Region (Italy) and numerical outputs from the deterministic model. We specified an inhomogeneous Poisson process for the sampling location intensities and a shared spatial random component model for the dependence between the spatial location of monitors and the pollution surface. We found greater predicted standard deviation differences in areas not properly covered by the air quality network. In conclusion, in this context inferences on prediction uncertainty may be misleading when geostatistical modelling does not take preferential sampling into account. PMID:27087040
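The bias that preferential sampling induces can be demonstrated in a few lines. In this invented one-dimensional example (not the Lombardy PM10 data), monitors are preferentially placed where a simulated field is high, mimicking stations sited near pollution sources, and the naive sample mean overstates the true field average:

```python
import numpy as np

rng = np.random.default_rng(4)

# A 1-D Gaussian field on a fine grid (exponential covariance).
x = np.linspace(0, 10, 400)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 1.0)
field = np.linalg.cholesky(C + 1e-8 * np.eye(x.size)) @ rng.standard_normal(x.size)

# Preferential design: selection probability grows with the field value.
prob = np.exp(1.5 * field)
prob /= prob.sum()
pref_idx = rng.choice(x.size, size=40, replace=False, p=prob)

# Non-preferential (uniform) design for comparison.
unif_idx = rng.choice(x.size, size=40, replace=False)

naive_pref = field[pref_idx].mean()
naive_unif = field[unif_idx].mean()
truth = field.mean()
print(truth, naive_unif, naive_pref)
```

Any geostatistical analysis that treats the preferential design as if it were uniform inherits this upward bias, which is what the paper's shared spatial random component model corrects for.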

  7. Geostatistical prediction of flow-duration curves

    NASA Astrophysics Data System (ADS)

    Pugliese, A.; Castellarin, A.; Brath, A.

    2013-11-01

    We present in this study an adaptation of Topological kriging (or Top-kriging), which makes the geostatistical procedure capable of predicting flow-duration curves (FDCs) in ungauged catchments. Previous applications of Top-kriging mainly focused on the prediction of point streamflow indices (e.g. flood quantiles, low-flow indices, etc.). In this study Top-kriging is used to predict FDCs in ungauged sites as a weighted average of standardised empirical FDCs through the traditional linear-weighting scheme of kriging methods. Our study focuses on the prediction of period-of-record FDCs for 18 unregulated catchments located in Central Italy, for which daily streamflow series with length from 5 to 40 yr are available, together with information on climate referring to the same time-span of each daily streamflow sequence. Empirical FDCs are standardised by a reference streamflow value (i.e. mean annual flow, or mean annual precipitation times the catchment drainage area) and the overall deviation of the curves from this reference value is then used for expressing the hydrological similarity between catchments and for deriving the geostatistical weights. We performed an extensive leave-one-out cross-validation to quantify the accuracy of the proposed technique, and to compare it to traditional regionalisation models that were recently developed for the same study region. The cross-validation points out that Top-kriging is a reliable approach for predicting FDCs, which can significantly outperform traditional regional models in ungauged basins.
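The prediction step described above reduces to a linear combination of standardised curves. A minimal sketch with invented catchments, weights, and reference streamflow (in Top-kriging the weights come from solving a kriging system built on catchment similarity, not set by hand):

```python
import numpy as np

# Standardised empirical FDCs of three gauged catchments, evaluated at
# common duration points (fraction of time a flow is exceeded).
durations = np.array([0.05, 0.3, 0.5, 0.7, 0.95])
fdcs = np.array([
    [3.1, 1.4, 1.0, 0.6, 0.2],   # catchment A, Q / mean annual flow
    [2.6, 1.3, 1.0, 0.7, 0.3],   # catchment B
    [3.8, 1.5, 0.9, 0.5, 0.1],   # catchment C
])
weights = np.array([0.5, 0.3, 0.2])   # kriging weights, sum to 1
mean_annual_flow_ungauged = 12.0      # m^3/s, assumed reference value

# Predicted FDC at the ungauged site: weighted average of standardised
# curves, rescaled by the site's reference streamflow.
fdc_std = weights @ fdcs
fdc_pred = mean_annual_flow_ungauged * fdc_std
print(fdc_pred)
```

Because the weights sum to one and each input curve is non-increasing in duration, the predicted curve is again a valid non-increasing FDC.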

  8. A pilot randomized trial of technology-assisted goal setting to improve physical activity among primary care patients with prediabetes.

    PubMed

    Mann, Devin M; Palmisano, Joseph; Lin, Jenny J

    2016-12-01

    Lifestyle behavior changes can prevent progression of prediabetes to diabetes but providers often are not able to effectively counsel about preventive lifestyle changes. We developed and pilot tested the Avoiding Diabetes Thru Action Plan Targeting (ADAPT) program to enhance primary care providers' counseling about behavior change for patients with prediabetes. Primary care providers in two urban academic practices and their patients with prediabetes were recruited to participate in the ADAPT study, an unblinded randomized pragmatic trial to test the effectiveness of the ADAPT program, including a streamlined electronic medical record-based goal setting tool. Providers were randomized to intervention or control arms; eligible patients whose providers were in the intervention arm received the ADAPT program. Physical activity (the primary outcome) was measured using pedometers, and data were gathered about patients' diet, weight and glycemic control. A total of 54 patients were randomized and analyzed as part of the 6-month ADAPT study (2010-2012, New York, NY). Those in the intervention group showed an increase total daily steps compared to those in the control group (+ 1418 vs - 598, p = 0.007) at 6 months. There was also a trend towards weight loss in the intervention compared to the control group (- 1.0 lbs. vs. 3.0 lbs., p = 0.11), although no change in glycemic control. The ADAPT study is among the first to use standard electronic medical record tools to embed goal setting into realistic primary care workflows and to demonstrate a significant improvement in prediabetes patients' physical activity. PMID:27413670

  9. Can Geostatistical Models Represent Nature's Variability? An Analysis Using Flume Experiments

    NASA Astrophysics Data System (ADS)

    Scheidt, C.; Fernandes, A. M.; Paola, C.; Caers, J.

    2015-12-01

The lack of understanding of the Earth's geological and physical processes governing sediment deposition renders subsurface modeling subject to large uncertainty. Geostatistics is often used to model uncertainty because of its capability to stochastically generate spatially varying realizations of the subsurface. These methods can generate a range of realizations of a given pattern - but how representative are these of the full natural variability? And how can we identify the minimum set of images that represent this natural variability? Here we use this minimum set to define the geostatistical prior model: a set of training images that represent the range of patterns generated by autogenic variability in the sedimentary environment under study. The proper definition of the prior model is essential in capturing the variability of the depositional patterns. This work starts with a set of overhead images from an experimental basin that showed ongoing autogenic variability. We use the images to analyze the essential characteristics of this suite of patterns. In particular, our goal is to define a prior model (a minimal set of selected training images) such that geostatistical algorithms, when applied to this set, can reproduce the full measured variability. A necessary prerequisite is to define a measure of variability. In this study, we measure variability using a dissimilarity distance between the images. The distance indicates whether two snapshots contain similar depositional patterns. To reproduce the variability in the images, we apply an MPS algorithm to the set of selected snapshots of the sedimentary basin that serve as training images. The training images are chosen from among the initial set by using the distance measure to ensure that only dissimilar images are chosen. Preliminary investigations show that MPS can reproduce fairly accurately the natural variability of the experimental depositional system. Furthermore, the selected training images provide

  10. Benchmarking a geostatistical procedure for the homogenisation of annual precipitation series

    NASA Astrophysics Data System (ADS)

    Caineta, Júlio; Ribeiro, Sara; Henriques, Roberto; Soares, Amílcar; Costa, Ana Cristina

    2014-05-01

The European project COST Action ES0601, Advances in homogenisation methods of climate series: an integrated approach (HOME), has brought to attention the importance of establishing reliable homogenisation methods for climate data. In order to achieve that, a benchmark data set, containing monthly and daily temperature and precipitation data, was created to be used as a comparison basis for the effectiveness of those methods. Several contributions were submitted and evaluated by a number of performance metrics, validating the results against realistic inhomogeneous data. HOME also led to the development of new homogenisation software packages, which included feedback and lessons learned during the project. Preliminary studies have suggested a geostatistical stochastic approach, which uses Direct Sequential Simulation (DSS), as a promising methodology for the homogenisation of precipitation data series. Based on the spatial and temporal correlation between the neighbouring stations, DSS calculates local probability density functions at a candidate station to detect inhomogeneities. The purpose of the current study is to test and compare this geostatistical approach with the methods previously presented in the HOME project, using surrogate precipitation series from the HOME benchmark data set. The benchmark data set contains monthly precipitation surrogate series, from which annual precipitation data series were derived. These annual precipitation series were subject to exploratory analysis and to a thorough variography study. The geostatistical approach was then applied to the data set, based on different scenarios for the spatial continuity. Implementing this procedure also promoted the development of a computer program that aims to assist on the homogenisation of climate data, while minimising user interaction. Finally, in order to compare the effectiveness of this methodology with the homogenisation methods submitted during the HOME project, the obtained results
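The core detection idea, comparing a candidate station against a locally derived reference, can be sketched in a heavily simplified form. Here the neighbour mean plays the role of the local pdf that DSS would build by simulation, and the break is located as the split maximizing the mean shift in the residuals; all series are synthetic:

```python
import numpy as np

rng = np.random.default_rng(5)

# Annual precipitation (mm) at a candidate station and 4 neighbours over
# 40 years; the candidate contains an artificial inhomogeneity (+150 mm
# from year 20, e.g. a relocated gauge). All numbers are illustrative.
years = 40
regional = 800 + 80 * rng.standard_normal((years, 1))
neighbours = regional + 30 * rng.standard_normal((years, 4))
candidate = regional[:, 0] + 30 * rng.standard_normal(years)
candidate[20:] += 150.0

# Per-year reference from the neighbours (DSS would instead build a full
# local probability density function by direct sequential simulation).
ref = neighbours.mean(axis=1)

# Locate the breakpoint as the split that maximizes the shift in the mean
# of the candidate-minus-reference residuals.
resid = candidate - ref
splits = np.arange(5, years - 5)
shifts = np.array([abs(resid[k:].mean() - resid[:k].mean()) for k in splits])
break_year = int(splits[np.argmax(shifts)])
print(break_year)
```

Once a break is detected, homogenisation would adjust the segment after the break toward the reference before the series is used in climate analysis.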

  11. A New 2.5D Representation for Lymph Node Detection using Random Sets of Deep Convolutional Neural Network Observations

    PubMed Central

    Lu, Le; Seff, Ari; Cherry, Kevin M.; Hoffman, Joanne; Wang, Shijun; Liu, Jiamin; Turkbey, Evrim; Summers, Ronald M.

    2015-01-01

Automated Lymph Node (LN) detection is an important clinical diagnostic task but very challenging due to the low contrast of surrounding structures in Computed Tomography (CT) and to their varying sizes, poses, shapes and sparsely distributed locations. State-of-the-art studies show the performance range of 52.9% sensitivity at 3.1 false-positives per volume (FP/vol.), or 60.9% at 6.1 FP/vol. for mediastinal LN, by one-shot boosting on 3D HAAR features. In this paper, we first run a preliminary candidate generation stage, towards ~100% sensitivity at the cost of high FP levels (~40 per patient), to harvest volumes of interest (VOI). Our 2.5D approach then decomposes any 3D VOI by resampling 2D reformatted orthogonal views N times, via scale, random translations, and rotations with respect to the VOI centroid coordinates. These random views are then used to train a deep Convolutional Neural Network (CNN) classifier. In testing, the CNN is employed to assign LN probabilities for all N random views, which can be simply averaged (as a set) to compute the final classification probability per VOI. We validate the approach on two datasets: 90 CT volumes with 388 mediastinal LNs and 86 patients with 595 abdominal LNs. We achieve sensitivities of 70%/83% at 3 FP/vol. and 84%/90% at 6 FP/vol. in mediastinum and abdomen respectively, which drastically improves over the previous state-of-the-art work. PMID:25333158
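The 2.5D decomposition and set-averaging step can be sketched without a trained network. In this toy version (random volume, a placeholder scorer standing in for the CNN, and rotation omitted for brevity; none of it is the authors' pipeline), random orthogonal views are sampled from a VOI and their per-view probabilities are averaged into one probability per VOI:

```python
import numpy as np

rng = np.random.default_rng(6)

# A toy 3-D volume of interest (VOI).
voi = rng.random((32, 32, 32))

def random_views(vol, n_views=10, size=16):
    """Sample N random orthogonal 2-D views: random axis, random slice
    offset near the centroid, random in-plane crop."""
    views = []
    c = vol.shape[0] // 2
    for _ in range(n_views):
        axis = rng.integers(3)
        idx = c + rng.integers(-4, 5)          # random translation
        sl = np.take(vol, idx, axis=axis)      # orthogonal reformat
        r0, c0 = rng.integers(0, sl.shape[0] - size, 2)
        views.append(sl[r0:r0 + size, c0:c0 + size])
    return np.stack(views)

def toy_classifier(view):
    # Placeholder for the trained CNN: returns a pseudo-probability.
    return 1.0 / (1.0 + np.exp(-(view.mean() - 0.5) * 10))

views = random_views(voi)
p_per_view = np.array([toy_classifier(v) for v in views])
p_voi = p_per_view.mean()   # final probability = average over the view set
print(p_voi)
```

Averaging over random views acts as test-time augmentation: individual view scores are noisy, but their mean is a more stable per-VOI probability.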

  12. Increasing Water Intake of Children and Parents in the Family Setting: A Randomized, Controlled Intervention Using Installation Theory.

    PubMed

    Lahlou, Saadi; Boesen-Mariani, Sabine; Franks, Bradley; Guelinckx, Isabelle

    2015-01-01

On average, children and adults in developed countries consume too little water, which can lead to negative health consequences. In a one-year longitudinal field experiment in Poland, we compared the impact of three home-based interventions on helping children and their parents/caregivers to develop sustainable increased plain water consumption habits. Fluid consumption of 334 children and their caregivers was recorded over one year using a specific online fluid dietary record. They were initially randomly allocated to one of the three following conditions: Control, Information (child and carer received information on the health benefits of water), or Placement (in addition to information, free small bottles of still water were delivered at home for a limited time period). After three months, half of the non-controls were randomly assigned to Community (child and caregiver engaged in an online community forum providing support on water consumption). All conditions significantly increased the water consumption of children (by 21.9-56.7%) and of adults (by 22-89%). Placement + Community generated the largest effects. Community enhanced the impact of Placement for children and parents, as well as the impact of Information for parents but not children. The results suggest that the family setting offers considerable scope for successful installation of interventions encouraging children and caregivers to develop healthier consumption habits, in mutually reinforcing ways. Combining information, affordances, and social influence gives the best, and most sustainable, results. PMID:26088044

  13. Preventing depression among early adolescents in the primary care setting: a randomized controlled study of the Penn Resiliency Program.

    PubMed

    Gillham, Jane E; Hamilton, John; Freres, Derek R; Patton, Ken; Gallop, Robert

    2006-04-01

    This study evaluated the Penn Resiliency Program's effectiveness in preventing depression when delivered by therapists in a primary care setting. Two-hundred and seventy-one 11- and 12-year-olds, with elevated depressive symptoms, were randomized to PRP or usual care. Over the 2-year follow-up, PRP improved explanatory style for positive events. PRP's effects on depressive symptoms and explanatory style for negative events were moderated by sex, with girls benefiting more than boys. Stronger effects were seen in high-fidelity groups than low-fidelity groups. PRP did not significantly prevent depressive disorders but significantly prevented depression, anxiety, and adjustment disorders (when combined) among high-symptom participants. Findings are discussed in relation to previous PRP studies and research on the dissemination of psychological interventions. PMID:16741684

  14. First look: a cluster-randomized trial of ultrasound to improve pregnancy outcomes in low income country settings

    PubMed Central

    2014-01-01

    Background In high-resource settings, obstetric ultrasound is a standard component of prenatal care used to identify pregnancy complications and to establish an accurate gestational age in order to improve obstetric care. Whether or not ultrasound use will improve care and ultimately pregnancy outcomes in low-resource settings is unknown. Methods/Design This multi-country cluster randomized trial will assess the impact of antenatal ultrasound screening performed by health care staff on a composite outcome consisting of maternal mortality and maternal near-miss, stillbirth and neonatal mortality in low-resource community settings. The trial will utilize an existing research infrastructure, the Global Network for Women’s and Children’s Health Research with sites in Pakistan, Kenya, Zambia, Democratic Republic of Congo and Guatemala. A maternal and newborn health registry in defined geographic areas which documents all pregnancies and their outcomes to 6 weeks post-delivery will provide population-based rates of maternal mortality and morbidity, stillbirth, neonatal mortality and morbidity, and health care utilization for study clusters. A total of 58 study clusters each with a health center and about 500 births per year will be randomized (29 intervention and 29 control). The intervention includes training of health workers (e.g., nurses, midwives, clinical officers) to perform ultrasound examinations during antenatal care, generally at 18–22 and at 32–36 weeks for each subject. Women who are identified as having a complication of pregnancy will be referred to a hospital for appropriate care. Finally, the intervention includes community sensitization activities to inform women and their families of the availability of ultrasound at the antenatal care clinic and training in emergency obstetric and neonatal care at referral facilities. Discussion In summary, our trial will evaluate whether introduction of ultrasound during antenatal care improves pregnancy

  15. Best strategies to implement clinical pathways in an emergency department setting: study protocol for a cluster randomized controlled trial

    PubMed Central

    2013-01-01

    Background The clinical pathway is a tool that operationalizes best evidence recommendations and clinical practice guidelines in an accessible format for ‘point of care’ management by multidisciplinary health teams in hospital settings. While high-quality, expert-developed clinical pathways have many potential benefits, their impact has been limited by variable implementation strategies and suboptimal research designs. Best strategies for implementing pathways into hospital settings remain unknown. This study will seek to develop and comprehensively evaluate best strategies for effective local implementation of externally developed expert clinical pathways. Design/methods We will develop a theory-based and knowledge user-informed intervention strategy to implement two pediatric clinical pathways: asthma and gastroenteritis. Using a balanced incomplete block design, we will randomize 16 community emergency departments to receive the intervention for one clinical pathway and serve as control for the alternate clinical pathway, thus conducting two cluster randomized controlled trials to evaluate this implementation intervention. A minimization procedure will be used to randomize sites. Intervention sites will receive a tailored strategy to support full clinical pathway implementation. We will evaluate implementation strategy effectiveness through measurement of relevant process and clinical outcomes. The primary process outcome will be the presence of an appropriately completed clinical pathway on the chart for relevant patients. Primary clinical outcomes for each clinical pathway include the following: Asthma—the proportion of asthmatic patients treated appropriately with corticosteroids in the emergency department and at discharge; and Gastroenteritis—the proportion of relevant patients appropriately treated with oral rehydration therapy. Data sources include chart audits, administrative databases, environmental scans, and qualitative interviews. We will

  16. A space and time scale-dependent nonlinear geostatistical approach for downscaling daily precipitation and temperature

    NASA Astrophysics Data System (ADS)

    Jha, Sanjeev Kumar; Mariethoz, Gregoire; Evans, Jason; McCabe, Matthew F.; Sharma, Ashish

    2015-08-01

    A geostatistical framework is proposed to downscale daily precipitation and temperature. The methodology is based on multiple-point geostatistics (MPS), where a multivariate training image is used to represent the spatial relationship between daily precipitation and daily temperature over several years. Here the training image consists of daily rainfall and temperature outputs from the Weather Research and Forecasting (WRF) model at 50 and 10 km resolution for a 20 year period ranging from 1985 to 2004. The data are used to predict downscaled climate variables for the year 2005. The result, for each downscaled pixel, is a daily time series of precipitation and temperature that are spatially dependent. Comparison of predicted precipitation and temperature against a reference data set indicates that both the seasonal average climate response and the temporal variability are well reproduced. The explicit inclusion of time dependence is explored by considering the climate properties of the previous day as an additional variable. Comparison of simulations with and without inclusion of time dependence shows that the temporal dependence only slightly improves the daily prediction because the temporal variability is already well represented in the conditioning data. Overall, the study shows that the multiple-point geostatistics approach is an efficient tool for statistical downscaling to obtain local-scale estimates of precipitation and temperature from General Circulation Models.
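    The pattern-matching idea behind MPS can be illustrated with a toy sketch: each coarse value is matched against a training catalogue of (coarse value, fine-scale block) pairs, and the best match's fine-scale block is reused. The catalogue values and the 2x2 refinement below are illustrative assumptions, not the WRF-based setup described above.

```python
# Toy sketch of pattern-based downscaling in the spirit of multiple-point
# geostatistics: reuse the fine-scale block of the closest training pattern.
# The training pairs and the 2x2 refinement factor are illustrative assumptions.

training = [
    (1.0, [[0.8, 1.1], [1.0, 1.2]]),
    (5.0, [[4.5, 5.2], [5.0, 5.5]]),
    (9.0, [[8.7, 9.3], [9.1, 9.0]]),
]

def downscale(coarse_value, training):
    """Return the fine-scale block of the best-matching training pattern."""
    _, block = min(training, key=lambda pair: abs(pair[0] - coarse_value))
    return block

fine = downscale(4.8, training)  # picks the block attached to 5.0
```

    In a real MPS implementation the match is over a multivariate neighbourhood pattern rather than a single value; this sketch only shows the lookup-and-paste principle.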

  17. Technology demonstration: geostatistical and hydrologic analysis of salt areas. Assessment of effectiveness of geologic isolation systems

    SciTech Connect

    Doctor, P.G.; Oberlander, P.L.; Rice, W.A.; Devary, J.L.; Nelson, R.W.; Tucker, P.E.

    1982-09-01

    The Office of Nuclear Waste Isolation (ONWI) requested Pacific Northwest Laboratory (PNL) to: (1) use geostatistical analyses to evaluate the adequacy of hydrologic data from three salt regions, each of which contains a potential nuclear waste repository site; and (2) demonstrate a methodology that allows quantification of the value of additional data collection. The three regions examined are the Paradox Basin in Utah, the Permian Basin in Texas, and the Mississippi Study Area. Additional and new data became available to ONWI during and following these analyses; therefore, this report must be considered a demonstration of a methodology that would apply as illustrated had the complete data sets been available. A combination of geostatistical and hydrologic analyses was used for this demonstration. Geostatistical analyses provided an optimal estimate of the potentiometric surface from the available data, a measure of the uncertainty of that estimate, and a means for selecting and evaluating the location of future data. The hydrologic analyses included the calculation of transmissivities, flow paths, travel times, and ground-water flow rates from hypothetical repository sites. Simulation techniques were used to evaluate the effect of optimally located future data on the potentiometric surface, flow lines, travel times, and flow rates. Data availability, quality, quantity, and conformance with model assumptions differed in each of the salt areas. Report highlights for the three locations are given.

  18. Unsupervised classification of multivariate geostatistical data: Two algorithms

    NASA Astrophysics Data System (ADS)

    Romary, Thomas; Ors, Fabien; Rivoirard, Jacques; Deraisme, Jacques

    2015-12-01

    With the increasing development of remote sensing platforms and the evolution of sampling facilities in the mining and oil industries, spatial datasets are becoming increasingly large, involve a growing number of variables and cover wider and wider areas. Therefore, it is often necessary to split the domain of study to account for radically different behaviors of the natural phenomenon over the domain and to simplify the subsequent modeling step. The definition of these areas can be seen as a problem of unsupervised classification, or clustering, where we try to divide the domain into homogeneous domains with respect to the values taken by the variables at hand. The application of classical clustering methods, designed for independent observations, does not ensure the spatial coherence of the resulting classes. Image segmentation methods, based on e.g. Markov random fields, are not adapted to irregularly sampled data. Other existing approaches, based on mixtures of Gaussian random functions estimated via the expectation-maximization algorithm, are limited to reasonable sample sizes and a small number of variables. In this work, we propose two algorithms based on adaptations of classical algorithms to multivariate geostatistical data. Both algorithms are model free and can handle large volumes of multivariate, irregularly spaced data. The first one proceeds by agglomerative hierarchical clustering. The spatial coherence is ensured by a proximity condition imposed for two clusters to merge. This proximity condition relies on a graph organizing the data in the coordinates space. The hierarchical algorithm can then be seen as a graph-partitioning algorithm. Following this interpretation, a spatial version of the spectral clustering algorithm is also proposed. The performances of both algorithms are assessed on toy examples and a mining dataset.
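    The first algorithm's core idea, hierarchical merging restricted to spatially adjacent clusters, can be sketched in a few lines. The neighbour radius, the attribute dissimilarity, and the toy data below are illustrative assumptions, not the authors' implementation.

```python
import math

def cluster(coords, values, n_clusters, radius=1.5):
    """Agglomerative clustering; merges are allowed only between clusters
    that contain at least one pair of spatial neighbours."""
    clusters = [[i] for i in range(len(coords))]

    def adjacent(a, b):  # proximity condition on the coordinates graph
        return any(math.dist(coords[i], coords[j]) <= radius
                   for i in a for j in b)

    def dissimilarity(a, b):  # average attribute difference
        return sum(abs(values[i] - values[j])
                   for i in a for j in b) / (len(a) * len(b))

    while len(clusters) > n_clusters:
        pairs = [(dissimilarity(a, b), ai, bi)
                 for ai, a in enumerate(clusters)
                 for bi, b in enumerate(clusters)
                 if ai < bi and adjacent(a, b)]
        if not pairs:
            break  # no spatially admissible merge remains
        _, ai, bi = min(pairs)
        clusters[ai] += clusters.pop(bi)
    return clusters

coords = [(0, 0), (1, 0), (2, 0), (10, 0), (11, 0)]
values = [1.0, 1.1, 0.9, 5.0, 5.2]
groups = cluster(coords, values, 2)  # the two spatial groups never mix
```

    With an unconstrained dissimilarity the two distant groups could merge whenever their attribute values happened to be similar; the adjacency test is what guarantees spatially coherent classes.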

  19. Optimisation of groundwater level monitoring networks using geostatistical modelling based on the Spartan family variogram and a genetic algorithm method

    NASA Astrophysics Data System (ADS)

    Parasyris, Antonios E.; Spanoudaki, Katerina; Kampanis, Nikolaos A.

    2016-04-01

    Groundwater level monitoring networks provide essential information for water resources management, especially in areas with significant groundwater exploitation for agricultural and domestic use. Given the high maintenance costs of these networks, development of tools which can be used by regulators for efficient network design is essential. In this work, a monitoring network optimisation tool is presented. The network optimisation tool couples geostatistical modelling based on the Spartan family variogram with a genetic algorithm method and is applied to the Mires basin in Crete, Greece, an area of high socioeconomic and agricultural interest, which suffers from groundwater overexploitation leading to a dramatic decrease of groundwater levels. The purpose of the optimisation tool is to determine which wells to exclude from the monitoring network because they add little or no beneficial information to groundwater level mapping of the area. Unlike previous relevant investigations, the network optimisation tool presented here uses Ordinary Kriging with the recently-established non-differentiable Spartan variogram for groundwater level mapping, which, based on a previous geostatistical study in the area, leads to optimal groundwater level mapping. Seventy boreholes operate in the area for groundwater abstraction and water level monitoring. The Spartan variogram gives overall the most accurate groundwater level estimates, followed closely by the power-law model. The geostatistical model is coupled to an integer genetic algorithm method programmed in MATLAB 2015a. The algorithm is used to find the set of wells whose removal leads to the minimum error between the original water level mapping using all the available wells in the network and the groundwater level mapping using the reduced well network (error is defined as the 2-norm of the difference between the original mapping matrix with 70 wells and the mapping matrix of the reduced well network). The solution to the

  20. The role of geostatistics in medical geology

    NASA Astrophysics Data System (ADS)

    Goovaerts, Pierre

    2014-05-01

    Since its development in the mining industry, geostatistics has emerged as the primary tool for spatial data analysis in various fields, ranging from earth and atmospheric sciences to agriculture, soil science, remote sensing, and more recently environmental exposure assessment. In the last few years, these tools have been tailored to the field of medical geography or spatial epidemiology, which is concerned with the study of spatial patterns of disease incidence and mortality and the identification of potential 'causes' of disease, such as environmental exposure, diet and unhealthy behaviors, economic or socio-demographic factors. On the other hand, medical geology is an emerging interdisciplinary scientific field studying the relationship between natural geological factors and their effects on human and animal health. This paper provides an introduction to the field of medical geology with an overview of geostatistical methods available for the analysis of geological and health data. Key concepts are illustrated using the mapping of groundwater arsenic concentrations across eleven Michigan counties and the exploration of its relationship to the incidence of prostate cancer at the township level. Arsenic in drinking-water is a major problem and has received much attention because of the large human population exposed and the extremely high concentrations (e.g. 600 to 700 μg/L) recorded in many instances. Few studies, however, have assessed the risks associated with exposure to low levels of arsenic (say < 50 μg/L) most commonly found in drinking water in the United States. In the Michigan thumb region, arsenopyrite (up to 7% As by weight) has been identified in the bedrock of the Marshall Sandstone aquifer, one of the region's most productive aquifers. Epidemiologic studies have suggested a possible association between exposure to inorganic arsenic and prostate cancer mortality, including a study of populations residing in Utah. The information available for the

  1. Tuberculosis case-finding through a village outreach programme in a rural setting in southern Ethiopia: community randomized trial.

    PubMed Central

    Shargie, Estifanos Biru; Mørkve, Odd; Lindtjørn, Bernt

    2006-01-01

    OBJECTIVE: To ascertain whether case-finding through community outreach in a rural setting has an effect on case-notification rate, symptom duration, and treatment outcome of smear-positive tuberculosis (TB). METHODS: We randomly allocated 32 rural communities to intervention or control groups. In intervention communities, health workers from seven health centres held monthly diagnostic outreach clinics at which they obtained sputum samples for sputum microscopy from symptomatic TB suspects. In addition, trained community promoters distributed leaflets and discussed symptoms of TB during house visits and at popular gatherings. Symptomatic individuals were encouraged to visit the outreach team or a nearby health facility. In control communities, cases were detected through passive case-finding among symptomatic suspects reporting to health facilities. Smear-positive TB patients from the intervention and control communities diagnosed during the study period were prospectively enrolled. FINDINGS: In the 1-year study period, 159 and 221 cases of smear-positive TB were detected in the intervention and control groups, respectively. Case-notification rates in all age groups were 124.6/10^5 and 98.1/10^5 person-years, respectively (P = 0.12). The corresponding rates in adults older than 14 years were 207/10^5 and 158/10^5 person-years, respectively (P = 0.09). The proportion of patients with >3 months' symptom duration was 41% in the intervention group compared with 63% in the control group (P<0.001). Pre-treatment symptom duration in the intervention group fell by 55-60% compared with 3-20% in the control group. In the intervention and control groups, 81% and 75%, respectively of patients successfully completed treatment (P = 0.12). CONCLUSION: The intervention was effective in improving the speed but not the extent of case finding for smear-positive TB in this setting. Both groups had comparable treatment outcomes. PMID:16501728

  2. On the comparison of the interval estimation of the Pareto parameter under simple random sampling and ranked set sampling techniques

    NASA Astrophysics Data System (ADS)

    Aissa, Aissa Omar; Ibrahim, Kamarulzaman; Dayyeh, Walid Abu; Zin, Wan Zawiah Wan

    2015-02-01

    Ranked set sampling (RSS) is recognized as a useful sampling scheme for improving the precision of parameter estimates and increasing the efficiency of estimation. This type of scheme is appropriate when the variable of interest is expensive or time-consuming to quantify, but easy and cheap to rank. In this study, the estimation of the shape parameter of the Pareto distribution of the first type when the scale is known is studied for data gathered under simple random sampling (SRS), RSS, and selective order statistics based on the maximum (SORSS(max)). The confidence intervals for the shape parameter of the Pareto distribution under the sampling techniques considered are determined. A simulation study is carried out to compare the confidence intervals in terms of coverage probabilities (CPs) and expected lengths (ELs). When the coverage probabilities and expected lengths for the confidence intervals of the shape parameter determined under the different sampling methods are compared, they are found to be more precise under RSS than under SRS. In particular, the coverage probability under SORSS(max) is found to be closest to the nominal value of 0.95.
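    The contrast between the two sampling schemes can be sketched as follows, assuming perfect judgement ranking; the Pareto shape value and set size are illustrative, not those of the study.

```python
import random

def pareto(shape):
    # inverse-CDF draw from a Pareto type I with scale 1: F(x) = 1 - x**(-shape)
    return (1.0 - random.random()) ** (-1.0 / shape)

def srs_sample(n, shape):
    """Simple random sample of size n."""
    return [pareto(shape) for _ in range(n)]

def rss_sample(set_size, shape):
    """One RSS cycle: draw set_size sets of set_size units and keep the
    i-th order statistic of the i-th set (perfect ranking assumed)."""
    return [sorted(pareto(shape) for _ in range(set_size))[i]
            for i in range(set_size)]

random.seed(0)
srs = srs_sample(5, shape=3.0)
rss = rss_sample(5, shape=3.0)
```

    Repeating such cycles, estimating the shape parameter from each sample, and computing confidence intervals under each scheme would reproduce the kind of coverage-probability and expected-length comparison reported above.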

  3. Testing Allele Transmission of an SNP Set Using a Family-Based Generalized Genetic Random Field Method.

    PubMed

    Li, Ming; Li, Jingyun; He, Zihuai; Lu, Qing; Witte, John S; Macleod, Stewart L; Hobbs, Charlotte A; Cleves, Mario A

    2016-05-01

    Family-based association studies are commonly used in genetic research because they can be robust to population stratification (PS). Recent advances in high-throughput genotyping technologies have produced a massive amount of genomic data in family-based studies. However, current family-based association tests are mainly focused on evaluating individual variants one at a time. In this article, we introduce a family-based generalized genetic random field (FB-GGRF) method to test the joint association between a set of autosomal SNPs (i.e., single-nucleotide polymorphisms) and disease phenotypes. The proposed method is a natural extension of a recently developed GGRF method for population-based case-control studies. It models offspring genotypes conditional on parental genotypes, and, thus, is robust to PS. Through simulations, we demonstrate that under various disease scenarios the FB-GGRF has improved power over a commonly used family-based sequence kernel association test (FB-SKAT). Further, similar to GGRF, the proposed FB-GGRF method is asymptotically well-behaved, and does not require empirical adjustment of the type I error rates. We illustrate the proposed method using a study of congenital heart defects with family trios from the National Birth Defects Prevention Study (NBDPS). PMID:27061818

  4. Estimation of extreme daily precipitation: comparison between regional and geostatistical approaches.

    NASA Astrophysics Data System (ADS)

    Hellies, Matteo; Deidda, Roberto; Langousis, Andreas

    2016-04-01

    We study the extreme rainfall regime of the Island of Sardinia in Italy, based on annual maxima of daily precipitation. The statistical analysis is conducted using 229 daily rainfall records with at least 50 complete years of observations, collected at different sites by the Hydrological Survey of the Sardinia Region. Preliminary analysis, and the L-skewness and L-kurtosis diagrams, show that the Generalized Extreme Value (GEV) distribution model performs best in describing daily rainfall extremes. The GEV distribution parameters are estimated using the method of Probability Weighted Moments (PWM). To obtain extreme rainfall estimates at ungauged sites, while minimizing uncertainties due to sampling variability, a regional and a geostatistical approach are compared. The regional approach merges information from different gauged sites, within homogeneous regions, to obtain GEV parameter estimates at ungauged locations. The geostatistical approach infers the parameters of the GEV distribution model at locations where measurements are available, and then spatially interpolates them over the study region. In both approaches we use local rainfall means as index-rainfall. In the regional approach we define homogeneous regions by applying a hierarchical cluster analysis based on Ward's method, with L-moment ratios (i.e. L-CV and L-Skewness) as metrics. The analysis results in four contiguous regions, which satisfy the Hosking and Wallis (1997) homogeneity tests. The latter have been conducted using a Monte-Carlo approach based on a 4-parameter Kappa distribution model, fitted to each station cluster. Note that the 4-parameter Kappa model includes the GEV distribution as a sub-case, when the fourth parameter h is set to 0. 
In the geostatistical approach we apply kriging for uncertain data (KUD), which accounts for the error variance in local parameter estimation and, therefore, may serve as a useful tool for spatial interpolation of metrics affected by high uncertainty. In
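    The PWM estimation step for the GEV parameters can be sketched with the classical Hosking-Wallis-Wood approximation; k > 0 corresponds to an upper-bounded tail in this sign convention. The synthetic Gumbel sample (a GEV with k = 0) is an illustrative check, not the Sardinian data.

```python
import math
import random

def gev_pwm(sample):
    """GEV (xi, alpha, k) estimated from sample probability weighted moments."""
    x = sorted(sample)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum(j * x[j] for j in range(n)) / (n * (n - 1))
    b2 = sum(j * (j - 1) * x[j] for j in range(n)) / (n * (n - 1) * (n - 2))
    # Hosking-Wallis-Wood rational approximation for the shape parameter
    c = (2 * b1 - b0) / (3 * b2 - b0) - math.log(2) / math.log(3)
    k = 7.8590 * c + 2.9554 * c * c
    g = math.gamma(1 + k)
    alpha = (2 * b1 - b0) * k / (g * (1 - 2.0 ** (-k)))
    xi = b0 + alpha * (g - 1) / k
    return xi, alpha, k

random.seed(42)
# Gumbel(location 10, scale 2) sample via the inverse CDF
gumbel = [10 - 2 * math.log(-math.log(random.random())) for _ in range(2000)]
xi, alpha, k = gev_pwm(gumbel)  # expect roughly (10, 2, 0)
```

    With the parameters in hand at each gauged site, the regional approach pools them within homogeneous regions while the geostatistical approach interpolates them spatially, as the abstract describes.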

  5. High Performance Geostatistical Modeling of Biospheric Resources

    NASA Astrophysics Data System (ADS)

    Pedelty, J. A.; Morisette, J. T.; Smith, J. A.; Schnase, J. L.; Crosier, C. S.; Stohlgren, T. J.

    2004-12-01

    We are using parallel geostatistical codes to study spatial relationships among biospheric resources in several study areas. For example, spatial statistical models based on large- and small-scale variability have been used to predict species richness of both native and exotic plants (hot spots of diversity) and patterns of exotic plant invasion. However, broader use of geostatistics in natural resource modeling, especially at regional and national scales, has been limited due to the large computing requirements of these applications. To address this problem, we implemented parallel versions of the kriging spatial interpolation algorithm. The first uses the Message Passing Interface (MPI) in a master/slave paradigm on an open source Linux Beowulf cluster, while the second is implemented with the new proprietary Xgrid distributed processing system on an Xserve G5 cluster from Apple Computer, Inc. These techniques are proving effective and provide the basis for a national decision support capability for invasive species management that is being jointly developed by NASA and the US Geological Survey.
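    The master/slave decomposition can be sketched with a thread pool standing in for MPI and a nearest-neighbour predictor standing in for the kriging kernel; both are simplifying assumptions, since kriging interpolation parallelises the same way, one grid chunk per worker.

```python
from concurrent.futures import ThreadPoolExecutor

def predict_chunk(chunk, data):
    """Nearest-neighbour value at each grid point (kriging stand-in)."""
    out = []
    for gx, gy in chunk:
        _, v = min(((gx - x) ** 2 + (gy - y) ** 2, v) for x, y, v in data)
        out.append(v)
    return out

def parallel_map(grid, data, n_workers=4):
    # master: partition the prediction grid round-robin into chunks
    chunks = [grid[i::n_workers] for i in range(n_workers)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        results = list(pool.map(lambda c: predict_chunk(c, data), chunks))
    # master: reassemble worker results in original grid order
    out = [None] * len(grid)
    for w in range(len(chunks)):
        for k, idx in enumerate(range(w, len(grid), n_workers)):
            out[idx] = results[w][k]
    return out

data = [(0.0, 0.0, 1.0), (10.0, 10.0, 2.0)]
grid = [(1.0, 1.0), (9.0, 9.0), (0.0, 10.0), (10.0, 0.0)]
levels = parallel_map(grid, data)  # → [1.0, 2.0, 1.0, 1.0]
```

    Each chunk is independent of the others, which is why the pattern scales to MPI clusters: only the data and the chunk boundaries need to be communicated.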

  6. Geostatistical Sampling Methods for Efficient Uncertainty Analysis in Flow and Transport Problems

    NASA Astrophysics Data System (ADS)

    Liodakis, Stylianos; Kyriakidis, Phaedon; Gaganis, Petros

    2015-04-01

    In hydrogeological applications involving flow and transport in heterogeneous porous media the spatial distribution of hydraulic conductivity is often parameterized in terms of a lognormal random field based on a histogram and variogram model inferred from data and/or synthesized from relevant knowledge. Realizations of simulated conductivity fields are then generated using geostatistical simulation involving simple random (SR) sampling and are subsequently used as inputs to physically-based simulators of flow and transport in a Monte Carlo framework for evaluating the uncertainty in the spatial distribution of solute concentration due to the uncertainty in the spatial distribution of hydraulic conductivity [1]. Realistic uncertainty analysis, however, calls for a large number of simulated concentration fields; hence, it can become expensive in terms of both time and computer resources. A more efficient alternative to SR sampling is Latin hypercube (LH) sampling, a special case of stratified random sampling, which yields a more representative distribution of simulated attribute values with fewer realizations [2]. Here, the term representative implies realizations spanning efficiently the range of possible conductivity values corresponding to the lognormal random field. In this work we investigate the efficiency of alternative methods to classical LH sampling within the context of simulation of flow and transport in a heterogeneous porous medium. More precisely, we consider the stratified likelihood (SL) sampling method of [3], in which attribute realizations are generated using the polar simulation method by exploring the geometrical properties of the multivariate Gaussian distribution function.
In addition, we propose a more efficient version of the above method, here termed minimum energy (ME) sampling, whereby a set of N representative conductivity realizations at M locations is constructed by: (i) generating a representative set of N points distributed on the
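    The difference between SR and LH sampling is easiest to see in one dimension: LH sampling places exactly one draw in each of N equal-probability strata of the standard normal, so the realizations span the distribution more evenly, and exponentiation then yields lognormal conductivities. The stratum count and seed below are illustrative assumptions.

```python
import random
from statistics import NormalDist

def lh_sample(n):
    """Latin hypercube sample: one uniform draw per probability stratum."""
    u = [(i + random.random()) / n for i in range(n)]
    random.shuffle(u)  # remove the ordering induced by stratification
    return [NormalDist().inv_cdf(p) for p in u]

def sr_sample(n):
    """Simple random sample for comparison."""
    return [NormalDist().inv_cdf(random.random()) for _ in range(n)]

random.seed(7)
z = lh_sample(10)
strata = sorted(NormalDist().cdf(v) for v in z)  # exactly one value per decile
```

    An SR sample of the same size can easily leave whole deciles empty; the LH sample cannot, which is what "representative with fewer realizations" means in the abstract.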

  7. Geostatistical Study of Precipitation on the Island of Crete

    NASA Astrophysics Data System (ADS)

    Agou, Vasiliki D.; Varouchakis, Emmanouil A.; Hristopulos, Dionissios T.

    2015-04-01

    precipitation which are fitted locally to a three-parameter probability distribution, based on which a normalized index is derived. We use the Spartan variogram function to model space-time correlations, because it is more flexible than classical models [3]. The performance of the variogram model is tested by means of leave-one-out cross validation. The variogram model is then used in connection with ordinary kriging to generate precipitation maps for the entire island. In the future, we will explore the joint spatiotemporal evolution of precipitation patterns on Crete. References [1] P. Goovaerts. Geostatistical approaches for incorporating elevation into the spatial interpolation of precipitation. Journal of Hydrology, 228(1):113-129, 2000. [2] N. B. Guttman. Accepting the standardized precipitation index: a calculation algorithm. American Water Resource Association, 35(2):311-322, 1999. [3] D. T Hristopulos. Spartan Gibbs random field models for geostatistical applications. SIAM Journal on Scientific Computing, 24(6):2125-2162, 2003. [4] A.G. Koutroulis, A.-E.K. Vrohidou, and I.K. Tsanis. Spatiotemporal characteristics of meteorological drought for the island of Crete. Journal of Hydrometeorology, 12(2):206-226, 2011. [5] T. B. McKee, N. J. Doesken, and J. Kleist. The relationship of drought frequency and duration to time scales. In Proceedings of the 8th Conference on Applied Climatology, page 179-184, Anaheim, California, 1993.
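    The leave-one-out cross-validation used above to test the variogram model can be sketched as follows; inverse-distance weighting stands in for ordinary kriging, an intentional simplification, and the toy stations are illustrative.

```python
import math

def idw(point, stations):
    """Inverse-distance-weighted estimate at point (kriging stand-in)."""
    w = [(1.0 / (math.dist(point, (x, y)) ** 2 + 1e-9), v)
         for x, y, v in stations]
    s = sum(wi for wi, _ in w)
    return sum(wi * vi for wi, vi in w) / s

def loo_rmse(stations):
    """Remove each station in turn, re-estimate it from the rest,
    and aggregate the prediction errors."""
    errs = []
    for i, (x, y, v) in enumerate(stations):
        rest = stations[:i] + stations[i + 1:]
        errs.append(idw((x, y), rest) - v)
    return math.sqrt(sum(e * e for e in errs) / len(errs))

stations = [(0.0, 0.0, 5.0), (1.0, 0.0, 5.0), (0.0, 1.0, 5.0), (1.0, 1.0, 5.0)]
rmse = loo_rmse(stations)  # ~0 for a constant field
```

    Running the same loop with competing variogram models inside the kriging estimator, and comparing the resulting error statistics, is how one model is judged better than another.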

  8. Geostatistics from Digital Outcrop Models of Outcrop Analogues for Hydrocarbon Reservoir Characterisation.

    NASA Astrophysics Data System (ADS)

    Hodgetts, David; Burnham, Brian; Head, William; Jonathan, Atunima; Rarity, Franklin; Seers, Thomas; Spence, Guy

    2013-04-01

    In the hydrocarbon industry stochastic approaches are the main method by which reservoirs are modelled. These stochastic modelling approaches require geostatistical information on the geometry and distribution of the geological elements of the reservoir. As the reservoir itself cannot be viewed directly (only indirectly via seismic and/or well log data), this leads to a great deal of uncertainty in the geostatistics used; therefore outcrop analogues are characterised to help obtain the geostatistical information required to model the reservoir. Lidar-derived Digital Outcrop Models (DOMs) provide the ability to collect large quantities of statistical information on the geological architecture of the outcrop, far more than is possible by field work alone, as the DOM allows accurate measurements to be made in normally inaccessible parts of the exposure. This increases the size of the measured statistical dataset, which in turn results in an increase in statistical significance. There are, however, many problems and biases in the data which cannot be overcome by sample size alone. These biases, for example, may relate to the orientation, size and quality of exposure, as well as the resolution of the DOM itself. Stochastic modelling approaches used in the hydrocarbon industry fall mainly into 4 generic approaches: 1) Object Modelling, where the geology is defined by a set of simplistic shapes (such as channels), and parameters such as width, height and orientation, among others, can be defined. 2) Sequential Indicator Simulations, where geological shapes are less well defined and the size and distribution are defined using variograms. 3) Multipoint statistics, where training images are used to define shapes and relationships between geological elements, and 4) Discrete Fracture Networks for fractured reservoirs, where information on fracture size and distribution is required. Examples of using DOMs to assist with each of these modelling approaches are presented, highlighting the

  9. Reducing uncertainty in geostatistical description with well testing pressure data

    SciTech Connect

    Reynolds, A.C.; He, Nanqun; Oliver, D.S.

    1997-08-01

    Geostatistics has proven to be an effective tool for generating realizations of reservoir properties conditioned to static data, e.g., core and log data and geologic knowledge. Due to the lack of closely spaced data in the lateral directions, there will be significant variability in reservoir descriptions generated by geostatistical simulation, i.e., significant uncertainty in the reservoir descriptions. In past work, we have presented procedures based on inverse problem theory for generating reservoir descriptions (rock property fields) conditioned to pressure data and geostatistical information represented as prior means for log-permeability and porosity and variograms. Although we have shown that the incorporation of pressure data reduces the uncertainty below the level contained in the geostatistical model based only on static information (the prior model), our previous results did not explicitly account for uncertainties in the prior means and the parameters defining the variogram model. In this work, we investigate how pressure data can help detect errors in the prior means. If errors in the prior means are large and are not taken into account, realizations conditioned to pressure data represent incorrect samples of the a posteriori probability density function for the rock property fields, whereas, if the uncertainty in the prior mean is incorporated properly into the model, one obtains realistic realizations of the rock property fields.

  10. Extending an operational meteorological monitoring network through machine learning and classical geo-statistical approaches

    NASA Astrophysics Data System (ADS)

    Appelhans, Tim; Mwangomo, Ephraim; Otte, Insa; Detsch, Florian; Nauss, Thomas; Hemp, Andreas; Ndyamkama, Jimmy

    2015-04-01

    This study introduces the set-up and characteristics of a meteorological station network on the southern slopes of Mt. Kilimanjaro, Tanzania. The set-up follows a hierarchical approach covering an elevational as well as a land-use disturbance gradient. The network consists of 52 basic stations measuring ambient air temperature and above-ground air humidity and 11 precipitation measurement sites. We provide in-depth descriptions of various machine learning and classical geo-statistical methods used to fill observation gaps and extend the spatial coverage of the network to a total of 60 research sites. Performance statistics for these methods indicate that the presented data sets provide reliable measurements of the meteorological reality at Mt. Kilimanjaro. These data provide an excellent basis for ecological studies and are also of great value for regional atmospheric numerical modelling studies, for which such comprehensive in-situ validation observations are rare, especially in tropical regions of complex terrain.
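    Gap-filling of the kind described can be sketched as follows: a missing reading at one station is predicted from the k nearest stations that did report. Inverse-distance weighting is used here as a simple stand-in for the machine-learning and geo-statistical models the study compares, and the station data are illustrative.

```python
import math

def fill_gap(target_xy, observed, k=3):
    """Predict a missing value at target_xy from the k nearest stations.
    observed: list of ((x, y), value) pairs for stations with data."""
    nearest = sorted(observed, key=lambda s: math.dist(target_xy, s[0]))[:k]
    w = [1.0 / (math.dist(target_xy, xy) + 1e-9) for xy, _ in nearest]
    return sum(wi * v for wi, (_, v) in zip(w, nearest)) / sum(w)

observed = [((0.0, 0.0), 10.0), ((2.0, 0.0), 12.0),
            ((0.0, 2.0), 14.0), ((50.0, 50.0), 99.0)]
estimate = fill_gap((1.0, 1.0), observed)  # distant station is ignored, ~12.0
```

    A machine-learning variant would replace the weighted average with a model trained on periods when the target station did report, using neighbouring stations' readings (and covariates such as elevation) as predictors.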

  11. Geostatistical analysis of soil properties at field scale using standardized data

    NASA Astrophysics Data System (ADS)

    Millan, H.; Tarquis, A. M.; Pérez, L. D.; Matos, J.; González-Posada, M.

    2012-04-01

    Identifying areas with physical degradation is a crucial step toward ameliorating the effects of soil erosion. The quantification and interpretation of spatial variability is a key issue for site-specific soil management. Geostatistics has been the main methodological tool for implementing precision agriculture using field data collected at different spatial resolutions. Even though many works have made significant contributions to the body of knowledge on spatial statistics and its applications, some other key points need to be addressed for conducting precise comparisons between soil properties using geostatistical parameters. The objectives of the present work were (i) to quantify the spatial structure of different physical properties collected from a Vertisol, (ii) to search for potential correlations between different spatial patterns and (iii) to identify relevant components through multivariate spatial analysis. The study was conducted on a Vertisol (Typic Hapludert) dedicated to sugarcane (Saccharum officinarum L.) production during the last sixty years. We used six soil properties collected from a square grid (225 points): penetrometer resistance (PR), total porosity, fragmentation dimension (Df), vertical electrical conductivity (ECv), horizontal electrical conductivity (ECh) and soil water content (WC). All the original data sets were z-transformed before geostatistical analysis. Three different types of semivariogram models were necessary for fitting the individual experimental semivariograms, which suggests that the spatial variability patterns differ in nature. Soil water content rendered the largest nugget effect (C0 = 0.933) while soil total porosity showed the largest range of spatial correlation (A = 43.92 m). The bivariate geostatistical analysis also rendered significant cross-semivariance between different paired soil properties. However, four different semivariogram models were required in that case. This indicates an underlying co
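    The preprocessing step the abstract describes — z-transforming each property so that semivariograms of different properties share a common scale — can be sketched as follows (synthetic data on a hypothetical 15 x 15 grid; the field values and lag binning are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 15x15 grid (225 points, as in the study) of one soil property.
n = 15
x, y = np.meshgrid(np.arange(n), np.arange(n))
coords = np.column_stack([x.ravel(), y.ravel()]).astype(float)
values = np.sin(coords[:, 0] / 4.0) + 0.3 * rng.normal(size=coords.shape[0])

# z-transform so properties with different units become directly comparable.
z = (values - values.mean()) / values.std()

# Empirical (isotropic) semivariogram: gamma(h) = mean of 0.5*(z_i - z_j)^2
# over point pairs whose separation falls in the lag bin around h.
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
sq = 0.5 * (z[:, None] - z[None, :]) ** 2
lags = np.arange(1.0, 8.0)
gamma = np.array([sq[(d >= h - 0.5) & (d < h + 0.5)].mean() for h in lags])
print(dict(zip(lags.round(1), gamma.round(2))))
```

The semivariance should rise with lag for a spatially structured variable; a model (spherical, exponential, ...) would then be fitted to these points.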

  12. A Parenting Intervention for Childhood Behavioral Problems: A Randomized Controlled Trial in Disadvantaged Community-Based Settings

    ERIC Educational Resources Information Center

    McGilloway, Sinead; Mhaille, Grainne Ni; Bywater, Tracey; Furlong, Mairead; Leckey, Yvonne; Kelly, Paul; Comiskey, Catherine; Donnelly, Michael

    2012-01-01

    Objective: A community-based randomized controlled trial (RCT) was conducted in urban areas characterized by high levels of disadvantage to test the effectiveness of the Incredible Years BASIC parent training program (IYBP) for children with behavioral problems. Potential moderators of intervention effects on child behavioral outcomes were also…

  13. [Geostatistical analysis on distribution dynamics of Myzus persicae (Sulzer) in flue-cured tobacco field].

    PubMed

    Xia, Peng-liang; Liu, Ying-hong; Fan, Jun; Tan, Jun

    2015-02-01

    Myzus persicae (Sulzer), belonging to Aphididae, Hemiptera, is an important migratory pest in tobacco fields. As nymph and adult it sucks plant juice, promotes sooty mold (mildew stain) disease, spreads tobacco virus diseases and causes huge losses of yield and quality. The distribution pattern and dynamics of winged and wingless aphids in the field were investigated from the transplanting of tobacco to the harvesting stage of mid-position tobacco leaves in Enshi, Hubei. The semivariogram characteristics were analyzed by geostatistical methods, and the field migration pattern was simulated. The results showed that the population dynamics of winged aphids in Enshi followed a bimodal curve, with peaks at 3 weeks after transplanting and 2 weeks after topping of the tobacco, passing through a five-stage sequence of random, aggregated, random, aggregated and random distributions. The population dynamics of wingless peach aphids followed a single-peak curve, peaking before topping, and passed through a three-stage sequence of random, aggregated and random distributions. Human factors and the hosts had considerable effects on population density. Spatial distribution maps produced by simulation and interpolation clearly reflected the dynamics of tobacco aphids. Combined with Pearson correlation analysis, we found that the population density was low and highly aggregated, dominated by winged aphids, during the immigration period, which is therefore the key period for the management of peach aphids. PMID:26094473

  14. Geostatistical regularization of inverse models for the retrieval of vegetation biophysical variables

    NASA Astrophysics Data System (ADS)

    Atzberger, C.; Richter, K.

    2009-09-01

    The robust and accurate retrieval of vegetation biophysical variables using radiative transfer models (RTM) is seriously hampered by the ill-posedness of the inverse problem. With this research we further develop our previously published (object-based) inversion approach [Atzberger (2004)]. The object-based RTM inversion takes advantage of the geostatistical fact that the biophysical characteristics of nearby pixels are generally more similar than those at a larger distance. A two-step inversion based on PROSPECT+SAIL generated look-up tables is presented that can be easily implemented and adapted to other radiative transfer models. The approach takes into account the spectral signatures of neighboring pixels and optimizes a common value of the average leaf angle (ALA) for all pixels of a given image object, such as an agricultural field. Using a large set of leaf area index (LAI) measurements (n = 58) acquired over six different crops at the Barrax test site, Spain, we demonstrate that the proposed geostatistical regularization yields in most cases more accurate and spatially consistent results compared to the traditional (pixel-based) inversion. Pros and cons of the approach are discussed and possible future extensions presented.
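    A look-up-table (LUT) inversion of this kind can be sketched with a toy forward model standing in for PROSPECT+SAIL (the function `toy_rtm`, the band set, the noise level and the object-median regularization below are illustrative assumptions, not the published two-step algorithm):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-in for a radiative transfer model: reflectance in 5 bands as a
# smooth function of LAI (NOT the real PROSPECT+SAIL, just an illustration).
bands = np.linspace(0.5, 2.0, 5)
def toy_rtm(lai):
    return 0.4 * np.exp(-0.3 * lai * bands) + 0.1 * lai / (1 + lai)

# Build the look-up table over a grid of candidate LAI values.
lai_grid = np.linspace(0.0, 6.0, 301)
lut = np.array([toy_rtm(v) for v in lai_grid])

# "Observed" spectra for 25 pixels of one field object, with sensor noise.
true_lai = 3.2
pixels = np.array([toy_rtm(true_lai) + rng.normal(0, 0.005, bands.size)
                   for _ in range(25)])

# Pixel-wise inversion: nearest LUT entry in the least-squares sense; then
# regularize by taking the object-level median, mimicking a per-object constraint.
idx = np.argmin(((lut[None, :, :] - pixels[:, None, :]) ** 2).sum(-1), axis=1)
lai_pix = lai_grid[idx]
lai_object = np.median(lai_pix)
print(f"object LAI estimate: {lai_object:.2f} (true {true_lai})")
```

The paper's approach constrains ALA (not LAI) at the object level; the point here is only the mechanics of LUT matching plus an object-level constraint.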

  15. Comparing the performance of geostatistical models with additional information from covariates for sewage plume characterization.

    PubMed

    Del Monego, Maurici; Ribeiro, Paulo Justiniano; Ramos, Patrícia

    2015-04-01

    In this work, kriging with covariates is used to model and map the spatial distribution of salinity measurements gathered by an autonomous underwater vehicle in a sea outfall monitoring campaign, aiming to distinguish the effluent plume from the receiving waters and characterize its spatial variability in the vicinity of the discharge. Four different geostatistical linear models for salinity were assumed, where the distance to diffuser, the west-east positioning, and the south-north positioning were used as covariates. Sample variograms were fitted by Matérn models using both weighted least squares and maximum likelihood estimation methods, as a way to detect possible discrepancies. Typically, the maximum likelihood method estimated very low ranges, which limited the kriging process. So, at least for these data sets, weighted least squares proved to be the most appropriate estimation method for variogram fitting. The kriged maps show clearly the spatial variation of salinity, and it is possible to identify the effluent plume in the area studied. The results obtained provide some guidelines for sewage monitoring when a geostatistical analysis of the data is intended. It is important to treat properly the existence of anomalous values and to adopt a sampling strategy that includes transects parallel and perpendicular to the effluent dispersion. PMID:25345922
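    Weighted-least-squares variogram fitting of the kind compared here can be sketched as follows (the sample variogram values, the exponential model in place of the Matérn, and the pair-count weighting are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical sample (experimental) variogram: lag, semivariance, pair count.
lags   = np.array([ 5., 10., 15., 20., 25., 30., 40., 50.])
gammas = np.array([0.22, 0.38, 0.50, 0.58, 0.63, 0.66, 0.69, 0.70])
npairs = np.array([ 400, 380, 350, 320, 300, 280, 240, 200])

def exp_variogram(h, nugget, sill, prange):
    """Exponential model: nugget + sill * (1 - exp(-3h/range))."""
    return nugget + sill * (1.0 - np.exp(-3.0 * h / prange))

# Weighted least squares: weight each lag by its number of point pairs,
# a common choice when fitting sample variograms.
popt, _ = curve_fit(exp_variogram, lags, gammas,
                    p0=[0.1, 0.6, 20.0], sigma=1.0 / np.sqrt(npairs),
                    bounds=([0.0, 0.0, 1.0], [1.0, 2.0, 200.0]))
nugget, sill, prange = popt
print(f"nugget={nugget:.3f} sill={sill:.3f} range={prange:.1f}")
```

A maximum-likelihood fit would instead maximize the Gaussian likelihood of the raw data, which is where the two methods can disagree on the range, as the abstract reports.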

  16. Large-scale hydraulic tomography and joint inversion of head and tracer data using the Principal Component Geostatistical Approach (PCGA)

    NASA Astrophysics Data System (ADS)

    Lee, J.; Kitanidis, P. K.

    2014-07-01

    The stochastic geostatistical inversion approach is widely used in subsurface inverse problems to estimate unknown parameter fields and corresponding uncertainty from noisy observations. However, the approach requires a large number of forward model runs to determine the Jacobian or sensitivity matrix, so the computational and storage costs become prohibitive as the number of unknowns, m, and the number of observations, n, increase. To overcome this challenge in large-scale geostatistical inversion, the Principal Component Geostatistical Approach (PCGA) has recently been developed as a "matrix-free" geostatistical inversion strategy that avoids the direct evaluation of the Jacobian matrix through the principal components (low-rank approximation) of the prior covariance and the drift matrix with a finite difference approximation. As a result, the proposed method requires about K runs of the forward problem in each iteration independently of m and n, where K is the number of principal components and can be much less than m and n for large-scale inverse problems. Furthermore, the PCGA is easily adaptable to different forward simulation models and various data types for which the adjoint-state method may not be implemented suitably. In this paper, we apply the PCGA to representative subsurface inverse problems to illustrate its efficiency and scalability. The low-rank approximation of the large-dimensional dense prior covariance matrix is computed through a randomized eigendecomposition. A hydraulic tomography problem in which the number of observations is typically large is investigated first to validate the accuracy of the PCGA compared with the conventional geostatistical approach. Then the method is applied to a large-scale hydraulic tomography with 3 million unknowns, and it is shown that underlying subsurface structures are characterized successfully through an inversion that involves an affordable number of forward simulation runs.
Lastly, we present a joint
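    The randomized eigendecomposition used for the low-rank approximation of the prior covariance can be sketched as follows (a Halko-style range finder on a small synthetic exponential covariance; the grid size, kernel, rank and oversampling values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)

# Dense prior covariance on a 1-D grid (exponential kernel), m unknowns.
m = 500
xs = np.arange(m)
Q = np.exp(-np.abs(xs[:, None] - xs[None, :]) / 50.0)

# Randomized eigendecomposition: probe the range of Q with a Gaussian test
# matrix, orthogonalize, then solve the small projected eigenproblem.
K, p = 20, 10                      # target rank K plus oversampling p
Y = Q @ rng.standard_normal((m, K + p))
Qb, _ = np.linalg.qr(Y)
B = Qb.T @ Q @ Qb
w, V = np.linalg.eigh(B)
order = np.argsort(w)[::-1][:K]
eigvals = w[order]
eigvecs = Qb @ V[:, order]

# Low-rank reconstruction error (relative, Frobenius norm).
Q_k = eigvecs @ np.diag(eigvals) @ eigvecs.T
rel_err = np.linalg.norm(Q - Q_k) / np.linalg.norm(Q)
print(f"relative error of rank-{K} approximation: {rel_err:.3f}")
```

In the PCGA setting the products `Q @ ...` would themselves be computed matrix-free (e.g., via FFT or hierarchical matrices), so the dense `Q` here is only for illustration.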

  17. Hydrogeologic Unit Flow Characterization Using Transition Probability Geostatistics

    SciTech Connect

    Jones, N L; Walker, J R; Carle, S F

    2003-11-21

    This paper describes a technique for applying the transition probability geostatistics method for stochastic simulation to a MODFLOW model. Transition probability geostatistics has several advantages over traditional indicator kriging methods including a simpler and more intuitive framework for interpreting geologic relationships and the ability to simulate juxtapositional tendencies such as fining upwards sequences. The indicator arrays generated by the transition probability simulation are converted to layer elevation and thickness arrays for use with the new Hydrogeologic Unit Flow (HUF) package in MODFLOW 2000. This makes it possible to preserve complex heterogeneity while using reasonably sized grids. An application of the technique involving probabilistic capture zone delineation for the Aberjona Aquifer in Woburn, MA, is included.
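    The conversion from an indicator realization to layer elevation/thickness arrays can be sketched for a single model column (the material IDs, cell size and datum below are illustrative assumptions, not the actual HUF-package data layout):

```python
import numpy as np

# Hypothetical indicator column from a transition-probability realization:
# material ID for each model cell, top of column first, 1 m cells (assumed).
column = np.array([2, 2, 2, 1, 1, 1, 1, 3, 3, 1, 1, 2])
cell_dz, top_elev = 1.0, 100.0

# Collapse contiguous runs of the same material into (material, top, thickness)
# tuples -- the kind of layer elevation/thickness data a HUF-style package needs.
layers = []
elev = top_elev
start = 0
for i in range(1, len(column) + 1):
    if i == len(column) or column[i] != column[start]:
        thick = (i - start) * cell_dz
        layers.append((int(column[start]), elev, thick))
        elev -= thick
        start = i
for mat, top, thick in layers:
    print(f"material {mat}: top={top:.1f} m, thickness={thick:.1f} m")
```

Repeating this per column (and handling pinch-outs where a unit is absent) yields the elevation and thickness arrays while keeping the simulated heterogeneity.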

  18. Stochastic Local Interaction (SLI) model: Bridging machine learning and geostatistics

    NASA Astrophysics Data System (ADS)

    Hristopulos, Dionissios T.

    2015-12-01

    Machine learning and geostatistics are powerful mathematical frameworks for modeling spatial data. Both approaches, however, suffer from poor scaling of the required computational resources for large data applications. We present the Stochastic Local Interaction (SLI) model, which employs a local representation to improve computational efficiency. SLI combines geostatistics and machine learning with ideas from statistical physics and computational geometry. It is based on a joint probability density function defined by an energy functional which involves local interactions implemented by means of kernel functions with adaptive local kernel bandwidths. SLI is expressed in terms of an explicit, typically sparse, precision (inverse covariance) matrix. This representation leads to a semi-analytical expression for interpolation (prediction), which is valid in any number of dimensions and avoids the computationally costly covariance matrix inversion.
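    The computational advantage of an explicit sparse precision matrix can be sketched with a one-dimensional nearest-neighbour example (the chain-graph precision matrix and the observed/unobserved split below are illustrative assumptions, not the SLI energy functional itself):

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import spsolve

rng = np.random.default_rng(4)

# Sparse precision (inverse covariance) matrix for a 1-D chain of n sites:
# nearest-neighbour interactions only, in the spirit of a local-interaction model.
n = 100
Q = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")

# Smooth synthetic "truth"; observe 30 sites (including both ends), predict the rest.
x = np.cumsum(rng.normal(0.0, 0.1, n))
obs_idx = np.sort(np.concatenate(([0, n - 1],
                                  rng.choice(np.arange(1, n - 1), 28, replace=False))))
unk_idx = np.setdiff1d(np.arange(n), obs_idx)

# Conditional mean of the unobserved sites given the observed ones:
#   Q_uu x_u = -Q_uo x_o   (a standard Gaussian Markov random field identity),
# solved sparsely -- no dense covariance matrix is ever formed or inverted.
Q_uu = Q[unk_idx][:, unk_idx]
Q_uo = Q[unk_idx][:, obs_idx]
x_u = spsolve(Q_uu.tocsc(), -Q_uo @ x[obs_idx])

rmse = np.sqrt(np.mean((x_u - x[unk_idx]) ** 2))
print(f"interpolation RMSE at {unk_idx.size} unobserved sites: {rmse:.3f}")
```

The sparse solve is what replaces the covariance matrix inversion that makes classical kriging costly for large data sets.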

  19. Assessing the resolution-dependent utility of tomograms for geostatistics

    USGS Publications Warehouse

    Day-Lewis, F. D.; Lane, J.W., Jr.

    2004-01-01

    Geophysical tomograms are used increasingly as auxiliary data for geostatistical modeling of aquifer and reservoir properties. The correlation between tomographic estimates and hydrogeologic properties is commonly based on laboratory measurements, co-located measurements at boreholes, or petrophysical models. The inferred correlation is assumed uniform throughout the interwell region; however, tomographic resolution varies spatially due to acquisition geometry, regularization, data error, and the physics underlying the geophysical measurements. Blurring and inversion artifacts are expected in regions traversed by few or only low-angle raypaths. In the context of radar traveltime tomography, we derive analytical models for (1) the variance of tomographic estimates, (2) the spatially variable correlation with a hydrologic parameter of interest, and (3) the spatial covariance of tomographic estimates. Synthetic examples demonstrate that tomograms of qualitative value may have limited utility for geostatistics; moreover, the imprint of regularization may preclude inference of meaningful spatial statistics from tomograms.

  20. A randomized controlled trial of a nurse-administered educational intervention for improving cancer pain management in ambulatory settings.

    PubMed

    Yates, Patsy; Edwards, Helen; Nash, Robyn; Aranda, Sanchia; Purdie, David; Najman, Jake; Skerman, Helen; Walsh, Anne

    2004-05-01

    The persistence of negative attitudes towards cancer pain and its treatment suggests there is scope for identifying more effective pain education strategies. This randomized controlled trial involving 189 ambulatory cancer patients evaluated an educational intervention that aimed to optimize patients' ability to manage pain. One week post-intervention, patients receiving the pain management intervention (PMI) had a significantly greater increase in self-reported pain knowledge, perceived control over pain, and number of pain treatments recommended. Intervention group patients also demonstrated a greater reduction in willingness to tolerate pain, concerns about addiction and side effects, being a "good" patient, and tolerance to pain relieving medication. The results suggest that targeted educational interventions that utilize individualized instructional techniques may alter cancer patient attitudes, which can potentially act as barriers to effective pain management. PMID:15140463

  1. Addressing uncertainty in rock properties through geostatistical simulation

    SciTech Connect

    McKenna, S.A.; Rautman, A.; Cromer, M.V.; Zelinski, W.P.

    1996-09-01

    Fracture and matrix properties in a sequence of unsaturated, welded tuffs at Yucca Mountain, Nevada, are modeled in two-dimensional cross-sections through geostatistical simulation. In the absence of large amounts of sample data, an interpretive, deterministic, stratigraphic model is coupled with a Gaussian simulation algorithm to constrain realizations of both matrix porosity and fracture frequency. Use of the deterministic stratigraphic model imposes scientific judgment, in the form of a conceptual geologic model, onto the property realizations. Linear coregionalization and a regression relationship between matrix porosity and matrix hydraulic conductivity are used to generate realizations of matrix hydraulic conductivity. Fracture-frequency simulations conditioned on the stratigraphic model represent one class of fractures (cooling fractures) in the conceptual model of the geology. A second class of fractures (tectonic fractures) is conceptualized as fractures that cut across strata vertically and includes discrete features such as fault zones. Indicator geostatistical simulation provides locations of this second class of fractures. The indicator realizations are combined with the realizations of fracture spacing to create realizations of fracture frequency that are a combination of both classes of fractures. Evaluations of the resulting realizations include comparing vertical profiles of rock properties within the model to those observed in boreholes and checking intra-unit property distributions against collected data. Geostatistical simulation provides an efficient means of addressing spatial uncertainty in dual-continuum rock properties.

  2. An improved algorithm of a priori based on geostatistics

    NASA Astrophysics Data System (ADS)

    Chen, Jiangping; Wang, Rong; Tang, Xuehua

    2008-12-01

    In data mining, one of the classical algorithms is Apriori, developed for association rule mining in large transaction databases; it cannot be used directly for spatial association rule mining. The main difference between mining relational and spatial databases is that attributes of the neighbors of an object of interest may influence the object and therefore have to be considered as well. The explicit location and extension of spatial objects define implicit relations of spatial neighborhood (such as topological, distance and direction relations) which are used by spatial data mining algorithms. Therefore, new techniques are required for effective and efficient spatial data mining. Geostatistics comprises statistical methods used to describe spatial relationships among sample data and to apply this analysis to the prediction of spatial and temporal phenomena; it is used to explain spatial patterns and to interpolate values at unsampled locations. This paper puts forward an improved Apriori algorithm for mining association rules with geostatistics. First, the spatial autocorrelation of attributes with location is estimated with geostatistical methods such as kriging and the Spatial Autoregressive Model (SAR). Then a spatial autocorrelation model of the attributes is built. Next, an improved Apriori algorithm combined with the spatial autocorrelation model is offered to mine spatial association rules. Finally, an experiment with the new algorithm was carried out on hayfever incidence and climate factors in the UK. The results show that the output rules match the reference data.
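    The non-spatial starting point, a plain Apriori pass over transactions, can be sketched as follows (the toy transactions echo the hayfever/climate example; the spatial weighting via kriging or SAR that the paper adds on top is not shown):

```python
from itertools import combinations

# Minimal Apriori: find all itemsets occurring in at least min_support transactions.
transactions = [
    {"high_pollen", "warm", "hayfever"},
    {"high_pollen", "warm", "hayfever"},
    {"high_pollen", "cool"},
    {"low_pollen", "warm"},
    {"high_pollen", "warm", "hayfever"},
]
min_support = 3  # absolute count

def frequent_itemsets(transactions, min_support):
    items = sorted({i for t in transactions for i in t})
    freq, k = {}, 1
    candidates = [frozenset([i]) for i in items]
    while candidates:
        # Count support of each candidate; keep only the frequent ones.
        counts = {c: sum(c <= t for t in transactions) for c in candidates}
        level = {c: n for c, n in counts.items() if n >= min_support}
        freq.update(level)
        # Candidate generation: unions of frequent k-sets that give (k+1)-sets.
        candidates = list({a | b for a, b in combinations(level, 2)
                           if len(a | b) == k + 1})
        k += 1
    return freq

for itemset, n in sorted(frequent_itemsets(transactions, min_support).items(),
                         key=lambda kv: (len(kv[0]), -kv[1])):
    print(sorted(itemset), n)
```

In the spatial variant, an item's "presence" in a transaction would additionally reflect the kriged/SAR-modeled values at neighboring locations rather than the raw record alone.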

  3. Psychosocial education improves low back pain beliefs: results from a cluster randomized clinical trial (NCT00373009) in a primary prevention setting.

    PubMed

    George, Steven Z; Teyhen, Deydre S; Wu, Samuel S; Wright, Alison C; Dugan, Jessica L; Yang, Guijun; Robinson, Michael E; Childs, John D

    2009-07-01

    The general population has a pessimistic view of low back pain (LBP), and evidence-based information has been used to positively influence LBP beliefs in previously reported mass media studies. However, there is a lack of randomized trials investigating whether LBP beliefs can be modified in primary prevention settings. This cluster randomized clinical trial investigated the effect of an evidence-based psychosocial educational program (PSEP) on LBP beliefs for soldiers completing military training. A military setting was selected for this clinical trial, because LBP is a common cause of soldier disability. Companies of soldiers (n = 3,792) were recruited, and cluster randomized to receive a PSEP or no education (control group, CG). The PSEP consisted of an interactive seminar, and soldiers were issued the Back Book for reference material. The primary outcome measure was the back beliefs questionnaire (BBQ), which assesses inevitable consequences of and ability to cope with LBP. The BBQ was administered before randomization and 12 weeks later. A linear mixed model was fitted for the BBQ at the 12-week follow-up, and a generalized linear mixed model was fitted for the dichotomous outcomes on BBQ change of greater than two points. Sensitivity analyses were performed to account for drop out. BBQ scores (potential range: 9-45) improved significantly from baseline of 25.6 +/- 5.7 (mean +/- SD) to 26.9 +/- 6.2 for those receiving the PSEP, while there was a significant decline from 26.1 +/- 5.7 to 25.6 +/- 6.0 for those in the CG. The adjusted mean BBQ score at follow-up for those receiving the PSEP was 1.49 points higher than those in the CG (P < 0.0001). The adjusted odds ratio of BBQ improvement of greater than two points for those receiving the PSEP was 1.51 (95% CI = 1.22-1.86) times that of those in the CG. BBQ improvement was also mildly associated with race and college education. Sensitivity analyses suggested minimal influence of drop out. 
In conclusion, soldiers

  4. Classroom-based Interventions and Teachers’ Perceived Job Stressors and Confidence: Evidence from a Randomized Trial in Head Start Settings

    PubMed Central

    Zhai, Fuhua; Raver, C. Cybele; Li-Grining, Christine

    2011-01-01

    Preschool teachers’ job stressors have received increasing attention but have been understudied in the literature. We investigated the impacts of a classroom-based intervention, the Chicago School Readiness Project (CSRP), on teachers’ perceived job stressors and confidence, as indexed by their perceptions of job control, job resources, job demands, and confidence in behavior management. Using a clustered randomized controlled trial (RCT) design, the CSRP provided multifaceted services to the treatment group, including teacher training and mental health consultation, which were accompanied by stress-reduction services and workshops. Overall, 90 teachers in 35 classrooms at 18 Head Start sites participated in the study. After adjusting for teacher and classroom factors and site fixed effects, we found that the CSRP had significant effects on the improvement of teachers’ perceived job control and work-related resources. We also found that the CSRP decreased teachers’ confidence in behavior management and had no statistically significant effects on job demands. Overall, we did not find significant moderation effects of teacher race/ethnicity, education, teaching experience, or teacher type. The implications for research and policy are discussed. PMID:21927538

  5. Geostatistical analysis of fault and joint measurements in Austin Chalk, Superconducting Super Collider Site, Texas

    SciTech Connect

    Mace, R.E.; Nance, H.S.; Laubach, S.E.

    1995-06-01

    Faults and joints are conduits for ground-water flow and targets for horizontal drilling in the petroleum industry. Spacing and size distribution are rarely predicted accurately by current structural models or documented adequately by conventional borehole or outcrop samples. Tunnel excavations present opportunities to measure fracture attributes in continuous subsurface exposures. These fracture measurements can be used to improve structural models, guide interpretation of conventional borehole and outcrop data, and geostatistically quantify spatial and spacing characteristics for comparison to outcrop data or for generating distributions of fractures for numerical flow and transport modeling. Structure maps of over 9 mi of nearly continuous tunnel excavations in Austin Chalk at the Superconducting Super Collider (SSC) site in Ellis County, Texas, provide a unique database of fault and joint populations for geostatistical analysis. Observationally, small faults (<10 ft. throw) occur in clusters or swarms that have as many as 24 faults; fault swarms are as much as 2,000 ft. wide and appear to be on average 1,000 ft. apart, and joints are in swarms spaced 500 to more than 21,000 ft. apart. Semi-variograms show varying degrees of spatial correlation. These variograms have structured sills that correlate directly to highs and lows in fracture frequency observed in the tunnel. Semi-variograms generated with respect to fracture spacing and number also have structured sills, but tend not to show any near-field correlation. The distribution of fault spacing can be described with a negative exponential, which suggests a random distribution. However, there is clearly some structure and clustering in the spacing data as shown by running averages and variograms, which implies that a number of different methods should be utilized to characterize fracture spacing.
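    The negative-exponential spacing model mentioned above can be checked on synthetic data (the fault count and tunnel length below are rough figures suggested by the abstract; a Poisson placement of faults, whose spacings are exponential, is the assumption being illustrated):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Hypothetical fault positions along a ~9-mile (~47,500 ft) tunnel: a Poisson
# (completely random) process, whose spacings are negative-exponentially distributed.
positions = np.sort(rng.uniform(0, 47_500, 48))
spacings = np.diff(positions)

# Fit the exponential scale (mean spacing) and test the fit (Kolmogorov-Smirnov).
mean_spacing = spacings.mean()
ks_stat, p_value = stats.kstest(spacings, "expon", args=(0, mean_spacing))
print(f"mean spacing {mean_spacing:.0f} ft, KS p-value {p_value:.2f}")
```

For real data, a good exponential fit alone does not rule out clustering, which is why the authors also examine running averages and variograms of the spacings.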

  6. Psychoneuroendocrine effects of cognitive-behavioral stress management in a naturalistic setting--a randomized controlled trial.

    PubMed

    Gaab, J; Sonderegger, L; Scherrer, S; Ehlert, U

    2006-05-01

    It is assumed that chronic or extensive release of cortisol due to stress has deleterious effects on somatic and psychological health, making interventions that aim to reduce and/or normalize cortisol responses to stress of interest. Cognitive-behavioral stress management (CBSM) has repeatedly been shown to effectively reduce cortisol responses to acute psychosocial stress. However, the effects of CBSM on psychoneuroendocrine responses during "real-life" stress have not yet been examined in healthy subjects. Eight weeks before all subjects took an important academic exam, 28 healthy economics students were randomly assigned to four weekly sessions of cognitive-behavioral stress management (CBSM) training or a waiting control condition. Psychological and somatic symptoms were repeatedly assessed throughout the preparation period. Salivary cortisol (cortisol awakening response and short circadian cortisol profile) was repeatedly measured at baseline and on the day of the exam. In addition, cognitive appraisal was assessed on the day of the exam. Subjects in the CBSM group showed significantly lower anxiety and somatic symptom levels throughout the period prior to the exam. On the day of the exam, groups differed in their cortisol awakening stress responses, with significantly attenuated cortisol levels in controls. Short circadian cortisol levels did not differ between groups. Interestingly, groups differed in their associations between cortisol responses before the exam and cognitive stress appraisal, with dissociation in controls but not in the CBSM group. The results show that CBSM reduces psychological and somatic symptoms and influences the ability to show a cortisol response corresponding to subjectively perceived stress. In line with current psychoneuroendocrine models, the inability to mount a cortisol response corresponding to the cognitive appraisal in controls could be a result of a dysregulated HPA axis, probably as a consequence of long-lasting stress.
PMID

  7. A randomized controlled trial of a computer-based physician workstation in an outpatient setting: implementation barriers to outcome evaluation.

    PubMed Central

    Rotman, B L; Sullivan, A N; McDonald, T W; Brown, B W; DeSmedt, P; Goodnature, D; Higgins, M C; Suermondt, H J; Young, C; Owens, D K

    1996-01-01

    OBJECTIVE: A research prototype Physician Workstation (PWS) incorporating a graphical user interface and a drug ordering module was compared with the existing hospital information system in an academic Veterans Administration General Medical Clinic. Physicians in the intervention group received recommendations for drug substitutions to reduce costs and were alerted to potential drug interactions. The objective was to evaluate the effect of the PWS on user satisfaction, on health-related outcomes, and on costs. DESIGN: A one-year, two-period, randomized controlled trial with 37 subjects. MEASUREMENTS: Differences in the reliance on noncomputer sources of information, in user satisfaction, in the cost of prescribed medications, and in the rate of clinically relevant drug interactions were assessed. RESULTS: The study subjects logged onto the workstation an average of 6.53 times per provider and used it to generate 2.8% of prescriptions during the intervention period. On a five-point scale (5 = very satisfied, 1 = very dissatisfied), user satisfaction declined in the PWS group (3.44 to 2.98, p = 0.008), and increased in the control group (3.23 to 3.72, p < 0.0001). CONCLUSION: The intervention physicians did not use the PWS frequently enough to influence information-seeking behavior, health outcomes, or cost. The study design did not determine whether the poor usage resulted from satisfaction with the control system, problems using the PWS intervention, or the functions provided by the PWS intervention. Evaluative studies should include provisions to improve the chance of successful implementation as well as to yield maximum information if a negative study occurs. PMID:8880681

  8. A Controlled Design of Aligned and Random Nanofibers for 3D Bi-functionalized Nerve Conduits Fabricated via a Novel Electrospinning Set-up.

    PubMed

    Kim, Jeong In; Hwang, Tae In; Aguilar, Ludwig Erik; Park, Chan Hee; Kim, Cheol Sang

    2016-01-01

    Scaffolds made of aligned nanofibers are favorable for nerve regeneration due to their superior nerve cell attachment and proliferation. However, it is challenging not only to produce a neat mat or a conduit form with aligned nanofibers but also to use these for surgical applications as a nerve guide conduit due to their insufficient mechanical strength. Furthermore, no studies have been reported on the fabrication of aligned nanofibers and randomly-oriented nanofibers on the same mat. In this study, we have successfully produced a mat with both aligned and randomly-oriented nanofibers by using a novel electrospinning set-up. A new conduit with a highly-aligned electrospun mat is produced with this modified electrospinning method, and this proposed conduit with favorable features, such as selective permeability, hydrophilicity and nerve growth directional steering, were fabricated as nerve guide conduits (NGCs). The inner surface of the nerve conduit is covered with highly aligned electrospun nanofibers and is able to enhance the proliferation of neural cells. The central part of the tube is double-coated with randomly-oriented nanofibers over the aligned nanofibers, strengthening the weak mechanical strength of the aligned nanofibers. PMID:27021221

  10. Analysis of vadose zone tritium transport from an underground storage tank release using numerical modeling and geostatistics

    SciTech Connect

    Lee, K.H.

    1997-09-01

    Numerical and geostatistical analyses show that the artificial smoothing effect of kriging removes high-permeability flow paths from hydrogeologic data sets, reducing simulated contaminant transport rates in heterogeneous vadose zone systems. Therefore, kriging alone is not recommended for estimating the spatial distribution of soil hydraulic properties for contaminant transport analysis at vadose zone sites. Vadose zone transport is modeled more effectively by combining kriging with stochastic simulation to better represent the high degree of spatial variability usually found in the hydraulic properties of field soils. However, kriging is a viable technique for estimating the initial mass distribution of contaminants in the subsurface.
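    The smoothing effect described here can be demonstrated in a few lines (a synthetic one-dimensional field with an assumed exponential covariance and simple kriging; not the study's actual data or workflow):

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic 1-D "permeability" field drawn from an exponential covariance model.
n = 200
xs = np.arange(n)
C = np.exp(-np.abs(xs[:, None] - xs[None, :]) / 10.0)
field = np.linalg.cholesky(C + 1e-10 * np.eye(n)) @ rng.standard_normal(n)

# Sparse sampling (as with widely spaced wells), then simple kriging.
obs = np.arange(0, n, 20)
C_oo = C[np.ix_(obs, obs)]
C_po = C[:, obs]
kriged = C_po @ np.linalg.solve(C_oo, field[obs])

# Kriging is a smoother: the estimated field has less variance than the truth,
# which is how extreme high-permeability flow paths get averaged away.
print(f"std(truth)  = {field.std():.2f}")
print(f"std(kriged) = {kriged.std():.2f}")
```

Conditional stochastic simulation restores the missing short-scale variance by adding appropriately correlated residuals to realizations that still honor the data.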

  11. Geostatistical modeling of uncertainty of the spatial distribution of available phosphorus in soil in a sugarcane field

    NASA Astrophysics Data System (ADS)

    Tadeu Pereira, Gener; Ribeiro de Oliveira, Ismênia; De Bortoli Teixeira, Daniel; Arantes Camargo, Livia; Rodrigo Panosso, Alan; Marques, José, Jr.

    2015-04-01

Phosphorus is one of the limiting nutrients for sugarcane development in Brazilian soils. Its spatial variability is great, defined by the properties that control its adsorption and desorption reactions. Spatial estimates to characterize this variability are based on geostatistical interpolation; thus, assessing the uncertainty of estimates associated with the spatial distribution of available P (P-labile) is decisive for optimizing the use of phosphate fertilizers. The purpose of this study was to evaluate the performance of sequential Gaussian simulation (sGs) and ordinary kriging (OK) in modeling the uncertainty of available P estimates. A sampling grid with 626 points was established in a 200-ha experimental sugarcane field in Tabapuã, São Paulo State, Brazil; the soil was sampled at the crossover points of a regular grid with intervals of 50 m. From the observations, 63 points (approximately 10% of the sampled points) were randomly selected before the geostatistical modeling to compose a data set used for validation, while the remaining 563 points were used to predict the variable at unsampled locations. The sGs generated 200 realizations, from which different measures of estimation and uncertainty were obtained. The standard deviation, calculated point by point across all simulated maps, provided a deviation map used to assess local uncertainty. Visual analysis of the E-type and OK maps showed that the spatial patterns produced by both methods were similar; however, the characteristic smoothing effect of OK was apparent, especially in regions with extreme values. The standardized variograms of selected sGs realizations showed both a range and a model similar to the variogram of the observed P-labile data. The OK variogram showed a structure distinct from the observed data, underestimating the variability over short distances and presenting parabolic behavior near
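The post-processing of sGs realizations described above (an E-type estimate plus a point-wise deviation map for local uncertainty) can be sketched in a few lines. The array sizes echo the study (200 realizations, 563 prediction points), but the values are synthetic stand-ins, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical stand-in for 200 sGs realizations over 563 prediction points
realizations = rng.lognormal(mean=3.0, sigma=0.4, size=(200, 563))

# E-type estimate: point-wise mean across all simulated maps
e_type = realizations.mean(axis=0)

# Point-wise standard deviation across realizations -> local uncertainty map
deviation = realizations.std(axis=0)

print(e_type.shape, deviation.shape)  # one value per prediction point
```

Unlike a kriged map, the E-type map averages many equally plausible fields, so the companion deviation map quantifies where the realizations disagree most.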

  12. ON THE GEOSTATISTICAL APPROACH TO THE INVERSE PROBLEM. (R825689C037)

    EPA Science Inventory

    Abstract

    The geostatistical approach to the inverse problem is discussed with emphasis on the importance of structural analysis. Although the geostatistical approach is occasionally misconstrued as mere cokriging, in fact it consists of two steps: estimation of statist...

  13. Geospatial interpolation and mapping of tropospheric ozone pollution using geostatistics.

    PubMed

    Kethireddy, Swatantra R; Tchounwou, Paul B; Ahmad, Hafiz A; Yerramilli, Anjaneyulu; Young, John H

    2014-01-01

    Tropospheric ozone (O3) pollution is a major problem worldwide, including in the United States of America (USA), particularly during the summer months. Ozone oxidative capacity and its impact on human health have attracted the attention of the scientific community. In the USA, sparse spatial observations for O3 may not provide a reliable source of data over a geo-environmental region. Geostatistical Analyst in ArcGIS has the capability to interpolate values in unmonitored geo-spaces of interest. In this study of eastern Texas O3 pollution, hourly episodes for spring and summer 2012 were selectively identified. To visualize the O3 distribution, geostatistical techniques were employed in ArcMap. Using ordinary Kriging, geostatistical layers of O3 for all the studied hours were predicted and mapped at a spatial resolution of 1 kilometer. A decent level of prediction accuracy was achieved and was confirmed from cross-validation results. The mean prediction error was close to 0, the root mean-standardized-prediction error was close to 1, and the root mean square and average standard errors were small. O3 pollution map data can be further used in analysis and modeling studies. Kriging results and O3 decadal trends indicate that the populace in Houston-Sugar Land-Baytown, Dallas-Fort Worth-Arlington, Beaumont-Port Arthur, San Antonio, and Longview are repeatedly exposed to high levels of O3-related pollution, and are prone to the corresponding respiratory and cardiovascular health effects. Optimization of the monitoring network proves to be an added advantage for the accurate prediction of exposure levels. PMID:24434594
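The cross-validation summaries quoted above (mean prediction error near 0, root-mean-standardized error near 1, small RMS and average standard errors) follow directly from the prediction errors and kriging standard errors. A minimal sketch, with made-up numbers standing in for the study's O3 data; the function name is illustrative, not an ArcGIS API:

```python
import numpy as np

def cv_diagnostics(observed, predicted, kriging_se):
    """Summary statistics typically reported by kriging cross-validation."""
    err = predicted - observed
    return {
        "mean_error": err.mean(),                # ~0 for an unbiased model
        "rmse": np.sqrt(np.mean(err ** 2)),      # small is good
        # ~1 if the kriging standard errors are well calibrated
        "rms_standardized": np.sqrt(np.mean((err / kriging_se) ** 2)),
    }

obs = np.array([41.0, 38.5, 44.2, 40.1])   # hypothetical observed O3, ppb
pred = np.array([40.6, 39.0, 43.8, 40.5])  # cross-validation predictions
se = np.array([0.5, 0.6, 0.5, 0.55])       # kriging standard errors
print(cv_diagnostics(obs, pred, se))
```

A root-mean-standardized error well above 1 would indicate the kriging standard errors underestimate the true prediction uncertainty; well below 1, that they overestimate it.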

  14. Assessment of spatial distribution of fallout radionuclides through geostatistics concept.

    PubMed

    Mabit, L; Bernard, C

    2007-01-01

After introducing the concept of geostatistics and its utility in environmental science, and especially in Fallout Radionuclide (FRN) spatialisation, a case study of cesium-137 ((137)Cs) redistribution at the field scale using geostatistics is presented. On a Canadian agricultural field, geostatistics coupled with a Geographic Information System (GIS) was used to test three different techniques of interpolation [Ordinary Kriging (OK), Inverse Distance Weighting power one (IDW1) and two (IDW2)] to create a (137)Cs map and to establish a radioisotope budget. Following the optimization of variographic parameters, an experimental semivariogram was developed to determine the spatial dependence of (137)Cs. It was adjusted to a spherical isotropic model with a range of 30 m and a very small nugget effect. This (137)Cs semivariogram showed a good autocorrelation (R(2)=0.91) and was well structured ('nugget-to-sill' ratio of 4%). It also revealed that the sampling strategy was adequate to reveal the spatial correlation of (137)Cs. The spatial redistribution of (137)Cs was estimated by Ordinary Kriging and IDW to produce contour maps. A radioisotope budget was established for the 2.16 ha agricultural field under investigation. It was estimated that around 2 x 10(7)Bq of (137)Cs were missing (around 30% of the total initial fallout) and were exported by physical processes (runoff and erosion processes) from the area under investigation. The cross-validation analysis showed that in the case of spatially structured data, OK is a better interpolation method than IDW1 or IDW2 for the assessment of potential radioactive contamination and/or pollution. PMID:17673340
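The fitted spherical semivariogram above (range 30 m, 'nugget-to-sill' ratio of 4%) has a simple closed form: it rises from the nugget at the origin to the sill at the range and stays flat beyond. A sketch with the sill normalized to 1, since the abstract does not quote its absolute value:

```python
import numpy as np

def spherical(h, nugget, sill, rng_m):
    """Spherical semivariogram: gamma(0)=0, reaches the sill at the range, flat beyond."""
    h = np.asarray(h, dtype=float)
    inside = nugget + (sill - nugget) * (1.5 * h / rng_m - 0.5 * (h / rng_m) ** 3)
    gamma = np.where(h < rng_m, inside, sill)
    return np.where(h == 0.0, 0.0, gamma)

# Parameters echoing the study: range 30 m, nugget-to-sill ratio of 4%
print(spherical([0.0, 15.0, 30.0, 60.0], nugget=0.04, sill=1.0, rng_m=30.0))
```

The small nugget relative to the sill is what makes the data "well structured": most of the variance is spatially correlated rather than measurement noise or microscale variation.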

  15. Geospatial Interpolation and Mapping of Tropospheric Ozone Pollution Using Geostatistics

    PubMed Central

    Kethireddy, Swatantra R.; Tchounwou, Paul B.; Ahmad, Hafiz A.; Yerramilli, Anjaneyulu; Young, John H.

    2014-01-01

    Tropospheric ozone (O3) pollution is a major problem worldwide, including in the United States of America (USA), particularly during the summer months. Ozone oxidative capacity and its impact on human health have attracted the attention of the scientific community. In the USA, sparse spatial observations for O3 may not provide a reliable source of data over a geo-environmental region. Geostatistical Analyst in ArcGIS has the capability to interpolate values in unmonitored geo-spaces of interest. In this study of eastern Texas O3 pollution, hourly episodes for spring and summer 2012 were selectively identified. To visualize the O3 distribution, geostatistical techniques were employed in ArcMap. Using ordinary Kriging, geostatistical layers of O3 for all the studied hours were predicted and mapped at a spatial resolution of 1 kilometer. A decent level of prediction accuracy was achieved and was confirmed from cross-validation results. The mean prediction error was close to 0, the root mean-standardized-prediction error was close to 1, and the root mean square and average standard errors were small. O3 pollution map data can be further used in analysis and modeling studies. Kriging results and O3 decadal trends indicate that the populace in Houston-Sugar Land-Baytown, Dallas-Fort Worth-Arlington, Beaumont-Port Arthur, San Antonio, and Longview are repeatedly exposed to high levels of O3-related pollution, and are prone to the corresponding respiratory and cardiovascular health effects. Optimization of the monitoring network proves to be an added advantage for the accurate prediction of exposure levels. PMID:24434594

  16. Mercury emissions from coal combustion in Silesia, analysis using geostatistics

    NASA Astrophysics Data System (ADS)

    Zasina, Damian; Zawadzki, Jaroslaw

    2015-04-01

Data provided by the UNEP report on mercury [1] show that solid fuel combustion is a significant source of mercury emission to air. Silesia, located in southwestern Poland, is notably affected by mercury emission because it is one of the most industrialized Polish regions: a place of coal mining, metal production, stone mining, mineral quarrying, and chemical industry. Moreover, Silesia is a region with high population density. People are exposed to severe risk of mercury emitted from both industrial and domestic sources (i.e., small household furnaces). Small sources contribute significantly to the total emission of mercury. Official and statistical analyses, including those prepared for international purposes [2], did not provide data on the spatial distribution of mercury emitted to air, although a number of analyses of the Polish public power and energy sector have been prepared so far [3; 4]. The distribution of locations exposed to mercury emission from small domestic sources is an interesting matter, merging information from various sources: statistical, economic, and environmental. This paper presents a geostatistical approach to the distribution of mercury emission from coal combustion. The analysed data are organized on two independent levels: individual data, in a bottom-up approach derived from the national emission reporting system [5; 6], and top-down regional data calculated on the basis of official statistics [7]. The analysis to be presented will compare the spatial distributions of mercury emission using data derived from the sources mentioned above. The investigation will cover three voivodeships of Poland: Lower Silesian, Opole, and Silesian, using selected geostatistical methodologies including ordinary kriging [8]. References [1] UNEP. Global Mercury Assessment 2013: Sources, Emissions, Releases and Environmental Transport. UNEP Chemicals Branch, Geneva, Switzerland, 2013. [2] NCEM. Poland's Informative Inventory Report 2014. NCEM at the IEP-NRI, 2014. http

  17. Examples of improved reservoir modeling through geostatistical data integration

    SciTech Connect

    Bashore, W.M.; Araktingi, U.G.

    1994-12-31

Results from four case studies are presented to demonstrate improvements in reservoir modeling and subsequent flow predictions through various uses of geostatistical integration methods. Specifically, these cases highlight improvements gained from (1) better understanding of reservoir geometries through 3D visualization, (2) forward modeling to assess the value of new data prior to acquisition and integration, (3) assessment of reduced uncertainty in porosity prediction through integration of seismic acoustic impedance, and (4) integration of crosswell tomographic and reflection data. The intent of each of these examples is to quantify the added value of geological and geophysical data integration in engineering terms such as fluid-flow results and reservoir property predictions.

  18. Enhancing multiple-point geostatistical modeling: 1. Graph theory and pattern adjustment

    NASA Astrophysics Data System (ADS)

    Tahmasebi, Pejman; Sahimi, Muhammad

    2016-03-01

In recent years, higher-order geostatistical methods have been used for modeling of a wide variety of large-scale porous media, such as groundwater aquifers and oil reservoirs. Their popularity stems from their ability to account for qualitative data and the great flexibility that they offer for conditioning the models to hard (quantitative) data, which endow them with the capability for generating realistic realizations of porous formations with very complex channels, as well as features that are mainly a barrier to fluid flow. One group of such models consists of pattern-based methods that use a set of data points for generating stochastic realizations by which the large-scale structure and highly-connected features are reproduced accurately. The cross correlation-based simulation (CCSIM) algorithm, proposed previously by the authors, is a member of this group that has been shown to be capable of simulating multimillion-cell models in a matter of a few CPU seconds. The method is, however, sensitive to the patterns' specifications, such as boundaries and the number of replicates. In this paper the original CCSIM algorithm is reconsidered and two significant improvements are proposed for accurately reproducing large-scale patterns of heterogeneities in porous media. First, an effective boundary-correction method based on graph theory is presented, by which one identifies the optimal cutting path/surface for removing the patchiness and discontinuities in the realization of a porous medium. Next, a new pattern adjustment method is proposed that automatically transfers the features in a pattern to one that seamlessly matches the surrounding patterns. The original CCSIM algorithm is then combined with the two methods and is tested using various complex two- and three-dimensional examples. It should, however, be emphasized that the methods that we propose in this paper are applicable to other pattern-based geostatistical simulation methods.

  19. Geostatistical three-dimensional modeling of oolite shoals, St. Louis Limestone, southwest Kansas

    USGS Publications Warehouse

    Qi, L.; Carr, T.R.; Goldstein, R.H.

    2007-01-01

In the Hugoton embayment of southwestern Kansas, reservoirs composed of relatively thin (<4 m; <13.1 ft) oolitic deposits within the St. Louis Limestone have produced more than 300 million bbl of oil. The geometry and distribution of oolitic deposits control the heterogeneity of the reservoirs, resulting in exploration challenges and relatively low recovery. Geostatistical three-dimensional (3-D) models were constructed to quantify the geometry and spatial distribution of oolitic reservoirs, and the continuity of flow units within Big Bow and Sand Arroyo Creek fields. Lithofacies in uncored wells were predicted from digital logs using a neural network. The tilting effect from the Laramide orogeny was removed to construct restored structural surfaces at the time of deposition. Well data and structural maps were integrated to build 3-D models of oolitic reservoirs using stochastic simulations with geometry data. Three-dimensional models provide insights into the distribution, the external and internal geometry of oolitic deposits, and the sedimentologic processes that generated reservoir intervals. The structural highs and general structural trend had a significant impact on the distribution and orientation of the oolitic complexes. The depositional pattern and connectivity analysis suggest an overall aggradation of shallow-marine deposits during pulses of relative sea level rise followed by deepening near the top of the St. Louis Limestone. Cemented oolitic deposits were modeled as barriers and baffles and tend to concentrate at the edge of oolitic complexes. Spatial distribution of porous oolitic deposits controls the internal geometry of rock properties. Integrated geostatistical modeling methods can be applicable to other complex carbonate or siliciclastic reservoirs in shallow-marine settings. Copyright © 2007. The American Association of Petroleum Geologists. All rights reserved.

  20. Geostatistical inference of main Y-STR-haplotype groups in Europe.

    PubMed

    Diaz-Lacava, Amalia; Walier, Maja; Willuweit, Sascha; Wienker, Thomas F; Fimmers, Rolf; Baur, Max P; Roewer, Lutz

    2011-03-01

We examined the multifarious genetic heterogeneity of Europe and neighboring regions from a geographical perspective. We created composite maps outlining the estimated geographical distribution of major groups of genetically similar individuals on the basis of forensic Y-chromosomal markers. We analyzed Y-chromosomal haplotypes composed of 7 highly polymorphic STR loci, genotyped for 33,010 samples collected at 249 sites in Europe, Western Asia and North Africa and deposited in the YHRD database (www.yhrd.org). The data set comprised 4176 different haplotypes, which we grouped into 20 clusters. For each cluster, the frequency per site was calculated. All geostatistical analysis was performed with the geographic information system GRASS-GIS. We interpolated frequency values across the study area separately for each cluster. Juxtaposing all 20 interpolated surfaces, we screened each point for the highest cluster frequency and stored it together with the respective cluster label. We combined these two types of data in a composite map. We repeated this procedure for the second highest frequencies in Europe. Major groups were assigned to Northern, Western and Eastern Europe. North Africa formed a separate region, while Southeastern Europe, Turkey and the Near East were divided into several regions. The spatial distribution of the groups accounting for the second highest frequencies in Europe overlapped with the territories of the largest countries. The genetic structure presented in the composite maps fits major historical geopolitical regions and is in agreement with previous studies of genetic frequencies, validating our approach. Our genetic geostatistical approach provides, on the basis of two composite maps, detailed evidence of the geographical distribution and relative frequencies of the most predominant groups of the extant male European population, examined on the basis of forensic Y-STR haplotypes. 
The existence of considerable genetic differences among geographic

  1. Comparative soil CO2 flux measurements and geostatistical estimation methods on Masaya volcano, Nicaragua

    USGS Publications Warehouse

    Lewicki, J.L.; Bergfeld, D.; Cardellini, C.; Chiodini, G.; Granieri, D.; Varley, N.; Werner, C.

    2005-01-01

We present a comparative study of soil CO2 flux (FCO2) measured by five groups (Groups 1-5) at the IAVCEI-CCVG Eighth Workshop on Volcanic Gases on Masaya volcano, Nicaragua. Groups 1-5 measured FCO2 using the accumulation chamber method at 5-m spacing within a 900 m2 grid during a morning (AM) period. These measurements were repeated by Groups 1-3 during an afternoon (PM) period. Measured FCO2 ranged from 218 to 14,719 g m-2 day-1. The variability of the five measurements made at each grid point ranged from ±5 to 167%. However, the arithmetic means of fluxes measured over the entire grid and associated total CO2 emission rate estimates varied between groups by only ±22%. All three groups that made PM measurements reported an 8-19% increase in total emissions over the AM results. Based on a comparison of measurements made during AM and PM times, we argue that this change is due in large part to natural temporal variability of gas flow, rather than to measurement error. In order to estimate the mean and associated CO2 emission rate of one data set and to map the spatial FCO2 distribution, we compared six geostatistical methods: arithmetic and minimum variance unbiased estimator means of uninterpolated data, and arithmetic means of data interpolated by the multiquadric radial basis function, ordinary kriging, multi-Gaussian kriging, and sequential Gaussian simulation methods. While the total CO2 emission rates estimated using the different techniques only varied by ±4.4%, the FCO2 maps showed important differences. We suggest that the sequential Gaussian simulation method yields the most realistic representation of the spatial distribution of FCO2, but a variety of geostatistical methods are appropriate to estimate the total CO2 emission rate from a study area, which is a primary goal in volcano monitoring research. © Springer-Verlag 2005.
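The simplest of the six estimators above, the arithmetic mean of the uninterpolated grid fluxes, converts to a total emission rate by multiplying by the grid area. A toy sketch with invented flux values; only the 900 m2 grid area is taken from the abstract:

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical soil CO2 fluxes at the grid points, g m^-2 day^-1
fco2 = rng.lognormal(mean=7.0, sigma=0.8, size=37)

grid_area = 900.0  # m^2, from the study grid

# Arithmetic-mean estimator of the total emission rate, g day^-1
total_emission = fco2.mean() * grid_area
print(f"total emission: {total_emission:.0f} g CO2/day")
```

The interpolation-based estimators (radial basis function, kriging variants, sequential Gaussian simulation) replace the raw mean with the mean of an interpolated surface, which matters more for mapping the spatial pattern than for the area total, consistent with the ±4.4% spread reported.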

  2. Spatially explicit Schistosoma infection risk in eastern Africa using Bayesian geostatistical modelling.

    PubMed

    Schur, Nadine; Hürlimann, Eveline; Stensgaard, Anna-Sofie; Chimfwembe, Kingford; Mushinge, Gabriel; Simoonga, Christopher; Kabatereine, Narcis B; Kristensen, Thomas K; Utzinger, Jürg; Vounatsou, Penelope

    2013-11-01

    Schistosomiasis remains one of the most prevalent parasitic diseases in the tropics and subtropics, but current statistics are outdated due to demographic and ecological transformations and ongoing control efforts. Reliable risk estimates are important to plan and evaluate interventions in a spatially explicit and cost-effective manner. We analysed a large ensemble of georeferenced survey data derived from an open-access neglected tropical diseases database to create smooth empirical prevalence maps for Schistosoma mansoni and Schistosoma haematobium for a total of 13 countries of eastern Africa. Bayesian geostatistical models based on climatic and other environmental data were used to account for potential spatial clustering in spatially structured exposures. Geostatistical variable selection was employed to reduce the set of covariates. Alignment factors were implemented to combine surveys on different age-groups and to acquire separate estimates for individuals aged ≤20 years and entire communities. Prevalence estimates were combined with population statistics to obtain country-specific numbers of Schistosoma infections. We estimate that 122 million individuals in eastern Africa are currently infected with either S. mansoni, or S. haematobium, or both species concurrently. Country-specific population-adjusted prevalence estimates range between 12.9% (Uganda) and 34.5% (Mozambique) for S. mansoni and between 11.9% (Djibouti) and 40.9% (Mozambique) for S. haematobium. Our models revealed that infection risk in Burundi, Eritrea, Ethiopia, Kenya, Rwanda, Somalia and Sudan might be considerably higher than previously reported, while in Mozambique and Tanzania, the risk might be lower than current estimates suggest. Our empirical, large-scale, high-resolution infection risk estimates for S. mansoni and S. haematobium in eastern Africa can guide future control interventions and provide a benchmark for subsequent monitoring and evaluation activities. PMID:22019933

  3. The effectiveness of physical activity monitoring and distance counseling in an occupational setting – Results from a randomized controlled trial (CoAct)

    PubMed Central

    2012-01-01

    Background Lack of physical activity (PA) is a known risk factor for many health conditions. The workplace is a setting often used to promote activity and health. We investigated the effectiveness of an intervention on PA and productivity-related outcomes in an occupational setting. Methods We conducted a randomized controlled trial of 12 months duration with two 1:1 allocated parallel groups of insurance company employees. Eligibility criteria included permanent employment and absence of any condition that risked the participant’s health during PA. Subjects in the intervention group monitored their daily PA with an accelerometer, set goals, had access to an online service to help them track their activity levels, and received counseling via telephone or web messages for 12 months. The control group received the results of a fitness test and an information leaflet on PA at the beginning of the study. The intervention’s aim was to increase PA, improve work productivity, and decrease sickness absence. Primary outcomes were PA (measured as MET minutes per week), work productivity (quantity and quality of work; QQ index), and sickness absence (SA) days at 12 months. Participants were assigned to groups using block randomization with a computer-generated scheme. The study was not blinded. Results There were 544 randomized participants, of which 521 were included in the analysis (64% female, mean age 43 years). At 12 months, there was no significant difference in physical activity levels between the intervention group (n = 264) and the control group (n = 257). The adjusted mean difference was −206 MET min/week [95% Bayesian credible interval −540 to 128; negative values favor control group]. There was also no significant difference in the QQ index (−0.5 [−4.4 to 3.3]) or SA days (0.0 [−1.2 to 0.9]). Of secondary outcomes, body weight (0.5 kg [0.0 to 1.0]) and percentage of body fat (0.6% [0.2% to 1.1%]) were slightly higher in the

  4. A multiple-point geostatistical approach to quantifying uncertainty for flow and transport simulation in geologically complex environments

    NASA Astrophysics Data System (ADS)

    Cronkite-Ratcliff, C.; Phelps, G. A.; Boucher, A.

    2011-12-01

    In many geologic settings, the pathways of groundwater flow are controlled by geologic heterogeneities which have complex geometries. Models of these geologic heterogeneities, and consequently, their effects on the simulated pathways of groundwater flow, are characterized by uncertainty. Multiple-point geostatistics, which uses a training image to represent complex geometric descriptions of geologic heterogeneity, provides a stochastic approach to the analysis of geologic uncertainty. Incorporating multiple-point geostatistics into numerical models provides a way to extend this analysis to the effects of geologic uncertainty on the results of flow simulations. We present two case studies to demonstrate the application of multiple-point geostatistics to numerical flow simulation in complex geologic settings with both static and dynamic conditioning data. Both cases involve the development of a training image from a complex geometric description of the geologic environment. Geologic heterogeneity is modeled stochastically by generating multiple equally-probable realizations, all consistent with the training image. Numerical flow simulation for each stochastic realization provides the basis for analyzing the effects of geologic uncertainty on simulated hydraulic response. The first case study is a hypothetical geologic scenario developed using data from the alluvial deposits in Yucca Flat, Nevada. The SNESIM algorithm is used to stochastically model geologic heterogeneity conditioned to the mapped surface geology as well as vertical drill-hole data. Numerical simulation of groundwater flow and contaminant transport through geologic models produces a distribution of hydraulic responses and contaminant concentration results. From this distribution of results, the probability of exceeding a given contaminant concentration threshold can be used as an indicator of uncertainty about the location of the contaminant plume boundary. The second case study considers a

  5. Testing geostatistical methods to combine radar and rain gauges for precipitation mapping in a mountainous region

    NASA Astrophysics Data System (ADS)

    Erdin, R.; Frei, C.; Sideris, I.; Kuensch, H.-R.

    2010-09-01

    There is an increasing demand for accurate mapping of precipitation at a spatial resolution of kilometers. Radar and rain gauges - the two main precipitation measurement systems - exhibit complementary strengths and weaknesses. Radar offers high spatial and temporal resolution but lacks accuracy of absolute values, whereas rain gauges provide accurate values at their specific point location but suffer from poor spatial representativeness. Methods of geostatistical mapping have been proposed to combine radar and rain gauge data for quantitative precipitation estimation (QPE). The aim is to combine the respective strengths and compensate for the respective weaknesses of the two observation platforms. Several studies have demonstrated the potential of these methods over topography of moderate complexity, but their performance remains unclear for high-mountain regions where rainfall patterns are complex, the representativeness of rain gauge measurements is limited and radar observations are obstructed. In this study we examine the potential and limitations of two frequently used geostatistical mapping methods for the territory of Switzerland, where the mountain chain of the Alps poses particular challenges to QPE. The two geostatistical methods explored are kriging with external drift (KED) using radar as drift variable and ordinary kriging of radar errors (OKRE). The radar data is a composite from three C-band radars using a constant Z-R relationship, advanced correction processings for visibility, ground clutter and beam shielding and a climatological bias adjustment. The rain gauge data originates from an automatic network with a typical inter-station distance of 25 km. Both combination methods are applied to a set of case examples representing typical rainfall situations in the Alps with their inherent challenges at daily and hourly time resolution. 
The quality of precipitation estimates is assessed by several skill scores calculated from cross validation errors at

  6. Probabilistic assessment of ground-water contamination. 1: Geostatistical framework

    SciTech Connect

    Rautman, C.A.; Istok, J.D.

    1996-09-01

    Characterizing the extent and severity of ground-water contamination at waste sites is expensive and time-consuming. A probabilistic approach, based on the acceptance of uncertainty and a finite probability of making classification errors (contaminated relative to a regulatory threshold vs. uncontaminated), is presented as an alternative to traditional site characterization methodology. The approach utilizes geostatistical techniques to identify and model the spatial continuity of contamination at a site (variography) and to develop alternate plausible simulations of contamination fields (conditional simulation). Probabilistic summaries of many simulations provide tools for (a) estimating the range of plausible contaminant concentrations at unsampled locations, (b) identifying the locations of boundaries between contaminated and uncontaminated portions of the site and the degree of certainty in those locations, and (c) estimating the range of plausible values for total contaminant mass. The first paper in the series presents the geostatistical framework and illustrates the approach using synthetic data for a hypothetical site. The second paper presents an application of the proposed methodology to the probabilistic assessment of ground-water contamination at a site involving ground-water contamination by nitrate and herbicide in a shallow, unconfined alluvial aquifer in an agricultural area in eastern Oregon.
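The probabilistic summaries described above, such as the probability that a location exceeds a regulatory threshold, come from counting across the ensemble of conditional simulations. A minimal sketch with synthetic realizations standing in for conditional-simulation output (the threshold and units are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in: 500 equally plausible concentration fields over 1000 locations
simulations = rng.lognormal(mean=1.0, sigma=0.6, size=(500, 1000))

threshold = 10.0  # hypothetical regulatory threshold, mg/L

# Per-location probability of exceeding the threshold: fraction of
# realizations in which that location is above it
p_exceed = (simulations > threshold).mean(axis=0)

# Locations classified 'contaminated' with at least 95% confidence
contaminated = p_exceed >= 0.95
print(p_exceed.min(), p_exceed.max(), int(contaminated.sum()))
```

The same ensemble supports the other summaries in the abstract: per-location concentration ranges come from quantiles across realizations, and the plausible range of total contaminant mass from summing each realization before taking quantiles.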

  7. Validating spatial structure in canopy water content using geostatistics

    NASA Technical Reports Server (NTRS)

    Sanderson, E. W.; Zhang, M. H.; Ustin, S. L.; Rejmankova, E.; Haxo, R. S.

    1995-01-01

Heterogeneity in ecological phenomena is scale dependent and affects the hierarchical structure of image data. AVIRIS pixels average the reflectance produced by complex absorption and scattering interactions between biogeochemical composition, canopy architecture, view and illumination angles, species distributions, and plant cover, as well as other factors. These scales affect validation of pixel reflectance, typically performed by relating pixel spectra to ground measurements acquired at scales of 1 m(exp 2) or less (e.g., field spectra, foliage and soil samples, etc.). As image analyses become more sophisticated, such as those for detection of canopy chemistry, better validation becomes a critical problem. This paper presents a methodology for bridging between point measurements and pixels using geostatistics. Geostatistics has been extensively used in geological or hydrogeological studies but has received little application in ecological studies. The key criterion for kriging estimation is that the phenomenon varies in space and that an underlying controlling process produces spatial correlation between the measured data points. Ecological variation meets this requirement because communities vary along environmental gradients like soil moisture, nutrient availability, or topography.

  8. A geostatistical analysis of the geographic distribution of lymphatic filariasis prevalence in southern India.

    PubMed

    Srividya, A; Michael, E; Palaniyandi, M; Pani, S P; Das, P K

    2002-11-01

    Gaining a better understanding of the spatial population structure of infectious agents is increasingly recognized as being key to their more effective mapping and to improving knowledge of their overall population dynamics and control. Here, we investigate the spatial structure of bancroftian filariasis distribution using geostatistical methods in an endemic region in Southern India. Analysis of a parasite antigenemia prevalence dataset, assembled by sampling 79 villages selected using a World Health Organization (WHO) proposed 25 x 25 km grid sampling procedure in a 225 x 225 km area within this region, was compared with that of a corresponding microfilaraemia prevalence dataset assembled by sampling 119 randomly selected villages from a smaller subregion located within the main study area. A major finding from the analysis was that once large-scale spatial trends were removed, the antigenemia data did not show evidence for the existence of any small-scale dependency at the study sampling interval of 25 km. By contrast, analysis of the randomly sampled microfilaraemia data indicated strong spatial contagion in prevalence up to a distance of approximately 6.6 km, suggesting the likely existence of small spatial patches or foci of transmission in the study area occurring below the scale used for sampling the antigenemia data. While this could indicate differences in parasite spatial population dynamics based on antigenemia versus microfilaraemia data, the result may also suggest that the WHO recommended 25 x 25 km sampling grid for rapid filariasis mapping could have been too coarse a scale to capture and describe the likely local variation in filariasis infection in this endemic location. This highlights the need for caution when applying uniform sampling schemes in diverse endemic regions for investigating the spatial pattern of this parasitic infection. The present results, on the other hand, imply that both small-scale spatial processes and large
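A range of spatial dependence such as the ~6.6 km reported above is typically read off a bounded variogram model fitted to the empirical semivariogram. A sketch of the spherical model with made-up nugget and sill parameters (only the 6.6 km range echoes the abstract):

```python
def spherical_variogram(h, nugget, partial_sill, range_):
    """Spherical model: semivariance rises until h equals the range,
    then flattens at the sill, i.e. no spatial correlation beyond it."""
    if h <= 0:
        return 0.0
    if h >= range_:
        return nugget + partial_sill
    r = h / range_
    return nugget + partial_sill * (1.5 * r - 0.5 * r ** 3)

# Hypothetical fit echoing a ~6.6 km range of spatial dependence:
half_range = spherical_variogram(3.3, nugget=0.01, partial_sill=0.09, range_=6.6)
sill_at_range = spherical_variogram(6.6, nugget=0.01, partial_sill=0.09, range_=6.6)
beyond = spherical_variogram(20.0, nugget=0.01, partial_sill=0.09, range_=6.6)
```

Because the model is flat beyond the range, a 25 km sampling grid cannot detect structure whose range is 6.6 km, which is the methodological point of the abstract.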

  9. A geostatistical modeling study of the effect of heterogeneity on radionuclide transport in the unsaturated zone, Yucca Mountain.

    PubMed

    Viswanathan, Hari S; Robinson, Bruce A; Gable, Carl W; Carey, James W

    2003-01-01

    Retardation of certain radionuclides due to sorption to zeolitic minerals is considered one of the major barriers to contaminant transport in the unsaturated zone of Yucca Mountain. However, zeolitically altered areas are lower in permeability than unaltered regions, which raises the possibility that contaminants might bypass the sorptive zeolites. The relationship between hydrologic and chemical properties must be understood to predict the transport of radionuclides through zeolitically altered areas. In this study, we incorporate mineralogical information into an unsaturated zone transport model using geostatistical techniques to correlate zeolitic abundance to hydrologic and chemical properties. Geostatistical methods are used to develop variograms, kriging maps, and conditional simulations of zeolitic abundance. We then investigate, using flow and transport modeling on a heterogeneous field, the relationship between percent zeolitic alteration, permeability changes due to alteration, sorption due to alteration, and their overall effect on radionuclide transport. We compare these geostatistical simulations to a simplified threshold method in which each spatial location in the model is assigned either zeolitic or vitric properties based on the zeolitic abundance at that location. A key conclusion is that retardation due to sorption predicted by using the continuous distribution is larger than the retardation predicted by the threshold method. The reason for larger retardation when using the continuous distribution is a small but significant sorption at locations with low zeolitic abundance. If, for practical reasons, models with homogeneous properties within each layer are used, we recommend setting nonzero K(d)s in the vitric tuffs to mimic the more rigorous continuous distribution simulations. Regions with high zeolitic abundance may not be as effective in retarding radionuclides such as Neptunium since these rocks are lower in permeability and contaminants can
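The key conclusion above (continuous K(d) distribution predicts more retardation than the threshold method) can be illustrated with the standard linear-sorption retardation factor. All parameter values below are hypothetical, not the study's:

```python
def retardation_factor(kd, bulk_density=1.5, porosity=0.3):
    """Linear-sorption retardation: R = 1 + rho_b * Kd / theta.
    Hypothetical bulk density (g/cm^3) and porosity values."""
    return 1.0 + bulk_density * kd / porosity

def kd_threshold(zeolite_fraction, kd_zeolitic=10.0, cutoff=0.4):
    # Threshold method: a cell is classified either fully zeolitic
    # or fully vitric (Kd = 0) based on a cutoff abundance.
    return kd_zeolitic if zeolite_fraction >= cutoff else 0.0

def kd_continuous(zeolite_fraction, kd_zeolitic=10.0):
    # Continuous method: Kd scales with zeolitic abundance, so
    # low-abundance cells keep a small but nonzero Kd.
    return kd_zeolitic * zeolite_fraction

# At 10% zeolitic abundance, the threshold method predicts no sorption,
# while the continuous method still retards transport:
r_thresh = retardation_factor(kd_threshold(0.1))
r_cont = retardation_factor(kd_continuous(0.1))
```

The small nonzero K(d) at low-abundance locations is exactly why the abstract recommends nonzero K(d)s in the vitric tuffs for layer-homogeneous models.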

  10. Large to intermediate-scale aquifer heterogeneity in fine-grain dominated alluvial fans (Cenozoic As Pontes Basin, northwestern Spain): insight based on three-dimensional geostatistical reconstruction

    NASA Astrophysics Data System (ADS)

    Falivene, O.; Cabrera, L.; Sáez, A.

    2007-08-01

    Facies reconstructions are used in hydrogeology to improve the interpretation of aquifer permeability distribution. In the absence of sufficient data to define the heterogeneity due to geological processes, uncertainties in the distribution of aquifer hydrofacies and characteristics may appear. Geometric and geostatistical methods are used to understand and model aquifer hydrofacies distribution, providing models to improve comprehension and development of aquifers. However, these models require some input statistical parameters that can be difficult to infer from the study site. A three-dimensional reconstruction of a kilometer scale fine-grain dominated Cenozoic alluvial fan derived from more than 200 continuously cored, closely spaced, and regularly distributed wells is presented. The facies distributions were reconstructed using a genetic stratigraphic subdivision and a deterministic geostatistical algorithm. The reconstruction is only slightly affected by variations in the geostatistical input parameters because of the high-density data set. Analysis of the reconstruction allowed identification in the proximal to medial alluvial fan zones of several laterally extensive sand bodies with relatively higher permeability; these sand bodies were quantified in terms of volume, mean thickness, maximum area, and maximum equivalent diameter. These quantifications provide trends and geological scenarios for input statistical parameters to model aquifer systems in similar alluvial fan depositional settings.

  11. Feasibility intervention trial of two types of improved cookstoves in three resource-limited settings: study protocol for a randomized controlled trial

    PubMed Central

    2013-01-01

    Background Exposure to biomass fuel smoke is one of the leading risk factors for disease burden worldwide. International campaigns are currently promoting the widespread adoption of improved cookstoves in resource-limited settings, yet little is known about the cultural and social barriers to successful improved cookstove adoption and how these barriers affect environmental exposures and health outcomes. Design We plan to conduct a one-year crossover, feasibility intervention trial in three resource-limited settings (Kenya, Nepal and Peru). We will enroll 40 to 46 female primary cooks aged 20 to 49 years in each site (total 120 to 138). Methods At baseline, we will collect information on sociodemographic characteristics and cooking practices, and measure respiratory health and blood pressure for all participating women. An initial observational period of four months while households use their traditional, open-fire design cookstoves will take place prior to randomization. All participants will then be randomized to receive one of two types of improved, ventilated cookstoves with a chimney: a commercially-constructed cookstove (Envirofit G3300/G3355) or a locally-constructed cookstove. After four months of observation, participants will cross over to the other improved cookstove design and be followed for another four months. During each of the three four-month study periods, we will collect monthly information on self-reported respiratory symptoms, cooking practices, and compliance with cookstove use (intervention periods only), and measure peak expiratory flow, forced expiratory volume in 1 second, exhaled carbon monoxide and blood pressure. We will also perform pulmonary function testing in the women participants and measure 24-hour kitchen particulate matter and carbon monoxide levels at least once per period. Discussion Findings from this study will help us better understand the behavioral, biological, and environmental changes that occur with a cookstove

  12. No Effect of Insecticide Treated Curtain Deployment on Aedes Infestation in a Cluster Randomized Trial in a Setting of Low Dengue Transmission in Guantanamo, Cuba

    PubMed Central

    Lambert, Isora; Montada, Domingo; Baly, Alberto; Van der Stuyft, Patrick

    2015-01-01

    Objective & Methodology The current study evaluated the effectiveness and cost-effectiveness of Insecticide Treated Curtain (ITC) deployment for reducing dengue vector infestation levels in the Cuban context with intensive routine control activities. A cluster randomized controlled trial took place in Guantanamo city, eastern Cuba. Twelve neighborhoods (about 500 households each) were selected among the ones with the highest Aedes infestation levels in the previous two years, and were randomly allocated to the intervention and control arms. Long-lasting ITC (PermaNet) were distributed in the intervention clusters in March 2009. Routine control activities were continued in the whole study area. In both study arms, we monitored monthly pre- and post-intervention House Index (HI, number of houses with at least 1 container with Aedes immature stages/100 houses inspected), during 12 and 18 months respectively. We evaluated the effect of ITC deployment on HI by fitting a generalized linear regression model with a negative binomial link function to these data. Principal Findings At distribution, the ITC coverage (% of households using ≥1 ITC) reached 98.4%, with a median of 3 ITC distributed/household. After 18 months, the coverage remained 97.4%. The local Aedes species was susceptible to deltamethrin (mosquito mortality rate of 99.7%) and the residual deltamethrin activity in the ITC was within acceptable levels (mosquito mortality rate of 73.1%) after one year of curtain use. Over the 18 month observation period after ITC distribution, the adjusted HI rate ratio, intervention versus control clusters, was 1.15 (95% CI 0.57 to 2.34). The annualized cost per household of ITC implementation was 3.8 USD, against 16.8 USD for all routine ACP activities. Conclusion Deployment of ITC in a setting with already intensive routine Aedes control actions does not lead to reductions in Aedes infestation levels. PMID:25794192

  13. Mobile phone technologies improve adherence to antiretroviral treatment in a resource-limited setting: a randomized controlled trial of text message reminders

    PubMed Central

    Pop-Eleches, Cristian; Thirumurthy, Harsha; Habyarimana, James P.; Zivin, Joshua G.; Goldstein, Markus P.; de Walque, Damien; MacKeen, Leslie; Haberer, Jessica; Kimaiyo, Sylvester; Sidle, John; Ngare, Duncan; Bangsberg, David R.

    2013-01-01

    Objective There is limited evidence on whether growing mobile phone availability in sub-Saharan Africa can be used to promote high adherence to antiretroviral therapy (ART). This study tested the efficacy of short message service (SMS) reminders on adherence to ART among patients attending a rural clinic in Kenya. Design A randomized controlled trial of four SMS reminder interventions with 48 weeks of follow-up. Methods Four hundred and thirty-one adult patients who had initiated ART within 3 months were enrolled and randomly assigned to a control group or one of the four intervention groups. Participants in the intervention groups received SMS reminders that were either short or long and sent at a daily or weekly frequency. Adherence was measured using the medication event monitoring system. The primary outcome was whether adherence exceeded 90% during each 12-week period of analysis and the 48-week study period. The secondary outcome was whether there were treatment interruptions lasting at least 48 h. Results In intention-to-treat analysis, 53% of participants receiving weekly SMS reminders achieved adherence of at least 90% during the 48 weeks of the study, compared with 40% of participants in the control group (P=0.03). Participants in groups receiving weekly reminders were also significantly less likely to experience treatment interruptions exceeding 48 h during the 48-week follow-up period than participants in the control group (81 vs. 90%, P = 0.03). Conclusion These results suggest that SMS reminders may be an important tool to achieve optimal treatment response in resource-limited settings. PMID:21252632
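The 53% vs. 40% adherence comparison above is a two-proportion comparison. A minimal sketch of a pooled two-proportion z-test with the standard library; the group sizes and counts below are illustrative, not the trial's exact per-arm numbers:

```python
import math

def two_proportion_z(successes1, n1, successes2, n2):
    """Pooled two-proportion z-test; returns (z, two-sided p-value)."""
    p1, p2 = successes1 / n1, successes2 / n2
    pooled = (successes1 + successes2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Normal CDF via the error function:
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Roughly 53% adherence among 143 weekly-reminder participants
# vs. 40% among 139 controls (hypothetical group sizes):
z, p = two_proportion_z(76, 143, 56, 139)
```

With proportions and sample sizes of this order, the test returns p just above 0.03, consistent in magnitude with the P=0.03 reported in the abstract.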

  14. A group randomized trial of a complexity-based organizational intervention to improve risk factors for diabetes complications in primary care settings: study protocol

    PubMed Central

    Parchman, Michael L; Pugh, Jacqueline A; Culler, Steven D; Noel, Polly H; Arar, Nedal H; Romero, Raquel L; Palmer, Raymond F

    2008-01-01

    Background Most patients with type 2 diabetes have suboptimal control of their glucose, blood pressure (BP), and lipids – three risk factors for diabetes complications. Although the chronic care model (CCM) provides a roadmap for improving these outcomes, developing theoretically sound implementation strategies that will work across diverse primary care settings has been challenging. One explanation for this difficulty may be that most strategies do not account for the complex adaptive system (CAS) characteristics of the primary care setting. A CAS is comprised of individuals who can learn, interconnect, self-organize, and interact with their environment in a way that demonstrates non-linear dynamic behavior. One implementation strategy that may be used to leverage these properties is practice facilitation (PF). PF creates time for learning and reflection by members of the team in each clinic, improves their communication, and promotes an individualized approach to implement a strategy to improve patient outcomes. Specific objectives The specific objectives of this protocol are to: evaluate the effectiveness and sustainability of PF to improve risk factor control in patients with type 2 diabetes across a variety of primary care settings; assess the implementation of the CCM in response to the intervention; examine the relationship between communication within the practice team and the implementation of the CCM; and determine the cost of the intervention both from the perspective of the organization conducting the PF intervention and from the perspective of the primary care practice. Intervention The study will be a group randomized trial conducted in 40 primary care clinics. Data will be collected on all clinics, with 60 patients in each clinic, using a multi-method assessment process at baseline, 12, and 24 months. The intervention, PF, will consist of a series of practice improvement team meetings led by trained facilitators over 12 months. Primary hypotheses

  15. A combined community- and facility-based approach to improve pregnancy outcomes in low-resource settings: a Global Network cluster randomized trial

    PubMed Central

    2013-01-01

    Background Fetal and neonatal mortality rates in low-income countries are at least 10-fold greater than in high-income countries. These differences have been related to poor access to and poor quality of obstetric and neonatal care. Methods This trial tested the hypothesis that teams of health care providers, administrators and local residents can address the problem of limited access to quality obstetric and neonatal care and lead to a reduction in perinatal mortality in intervention compared to control locations. In seven geographic areas in five low-income and one middle-income country, most with high perinatal mortality rates and substantial numbers of home deliveries, we performed a cluster randomized non-masked trial of a package of interventions that included community mobilization focusing on birth planning and hospital transport, community birth attendant training in problem recognition, and facility staff training in the management of obstetric and neonatal emergencies. The primary outcome was perinatal mortality at ≥28 weeks gestation or birth weight ≥1000 g. Results Despite extensive effort in all sites in each of the three intervention areas, no differences emerged in the primary or any secondary outcome between the intervention and control clusters. In both groups, the mean perinatal mortality was 40.1/1,000 births (P = 0.9996). Neither were there differences between the two groups in outcomes in the last six months of the project, in the year following intervention cessation, nor in the clusters that best implemented the intervention. Conclusions This cluster randomized comprehensive, large-scale, multi-sector intervention did not result in detectable impact on the proposed outcomes. While this does not negate the importance of these interventions, we expect that achieving improvement in pregnancy outcomes in these settings will require substantially more obstetric and neonatal care infrastructure than was available at the sites during this trial

  16. Bayesian geostatistics in health cartography: the perspective of malaria.

    PubMed

    Patil, Anand P; Gething, Peter W; Piel, Frédéric B; Hay, Simon I

    2011-06-01

    Maps of parasite prevalences and other aspects of infectious diseases that vary in space are widely used in parasitology. However, spatial parasitological datasets rarely, if ever, have sufficient coverage to allow exact determination of such maps. Bayesian geostatistics (BG) is a method for finding a large sample of maps that can explain a dataset, in which maps that do a better job of explaining the data are more likely to be represented. This sample represents the knowledge that the analyst has gained from the data about the unknown true map. BG provides a conceptually simple way to convert these samples to predictions of features of the unknown map, for example regional averages. These predictions account for each map in the sample, yielding an appropriate level of predictive precision. PMID:21420361
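The conversion from a sample of plausible maps to a prediction with uncertainty, as described above, can be sketched for a regional average. The posterior maps below are hypothetical prevalence surfaces over four cells:

```python
def regional_average_prediction(map_samples, region_cells):
    """Summarize a posterior sample of maps into a predictive mean and
    an approximate 95% credible interval for a regional average."""
    averages = sorted(
        sum(m[c] for c in region_cells) / len(region_cells)
        for m in map_samples
    )
    n = len(averages)
    mean = sum(averages) / n
    lo = averages[max(0, int(0.025 * n))]
    hi = averages[min(n - 1, int(0.975 * n))]
    return mean, (lo, hi)

# Five hypothetical posterior maps over 4 cells; region = cells 0 and 1:
maps = [
    [0.10, 0.20, 0.40, 0.05],
    [0.14, 0.20, 0.42, 0.06],
    [0.08, 0.18, 0.38, 0.04],
    [0.12, 0.22, 0.41, 0.05],
    [0.10, 0.16, 0.39, 0.05],
]
mean, ci = regional_average_prediction(maps, region_cells=[0, 1])
```

Because each map in the sample contributes to the summary, the interval width reflects how strongly the data constrain the unknown true map, which is the "appropriate level of predictive precision" the abstract describes.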

  17. Indoor radon variations in central Iran and its geostatistical map

    NASA Astrophysics Data System (ADS)

    Hadad, Kamal; Mokhtari, Javad

    2015-02-01

    We present the results of a 2-year indoor radon survey in 10 cities of Yazd province in Central Iran (covering an area of 80,000 km2). We used passive diffusive samplers with LATEX polycarbonate films as Solid State Nuclear Track Detectors (SSNTD). The study was carried out in central Iran, where there are major mineral and uranium mines. Our results indicate that, despite a few extraordinarily high concentrations, average annual concentrations of indoor radon are within ICRP guidelines. When the geostatistical spatial distribution of radon was mapped onto the geographical features of the province, it was observed that the risk of high radon concentration increases near the cities of Saqand, Bafq, Harat and Abarkooh, depending on elevation and proximity to the ores and mines.

  18. Geostatistical analysis of allele presence patterns among American black bears in eastern North Carolina

    USGS Publications Warehouse

    Thompson, L.M.; Van Manen, F.T.; King, T.L.

    2005-01-01

    Highways are one of the leading causes of wildlife habitat fragmentation and may particularly affect wide-ranging species, such as American black bears (Ursus americanus). We initiated a research project in 2000 to determine potential effects of a 4-lane highway on black bear ecology in Washington County, North Carolina. The research design included a treatment area (highway construction) and a control area and a pre- and post-construction phase. We used data from the pre-construction phase to determine whether we could detect scale dependency or directionality among allele occurrence patterns using geostatistics. Detection of such patterns could provide a powerful tool to measure the effects of landscape fragmentation on gene flow. We sampled DNA from roots of black bear hair at 70 hair-sampling sites on each study area for 7 weeks during fall of 2000. We used microsatellite analysis based on 10 loci to determine unique multi-locus genotypes. We examined all alleles sampled at ≥25 sites on each study area and mapped their presence or absence at each hair-sample site. We calculated semivariograms, which measure the strength of statistical correlation as a function of distance, and adjusted them for anisotropy to determine the maximum direction of spatial continuity. We then calculated the mean direction of spatial continuity for all examined alleles. The mean direction of allele frequency variation was 118.3° (SE = 8.5) on the treatment area and 172.3° (SE = 6.0) on the control area. Rayleigh's tests showed that these directions differed from random distributions (P = 0.028 and P < 0.001, respectively), indicating consistent directional patterns for the alleles we examined in each area. Despite the small spatial scale of our study (approximately 11,000 ha for each study area), we observed distinct and consistent patterns of allele occurrence, suggesting different directions of gene flow between the study areas. These directions seemed to coincide with the
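The mean-direction and Rayleigh-test summaries above come from circular statistics. A minimal sketch with hypothetical per-allele directions (note: directions of spatial continuity are properly axial data, modulo 180 degrees; this sketch ignores that refinement, and the Rayleigh p-value uses a standard series approximation):

```python
import math

def mean_direction(angles_deg):
    """Vector (circular) mean of directions given in degrees."""
    s = sum(math.sin(math.radians(a)) for a in angles_deg)
    c = sum(math.cos(math.radians(a)) for a in angles_deg)
    return math.degrees(math.atan2(s, c)) % 360.0

def rayleigh_p(angles_deg):
    """Approximate p-value of the Rayleigh test of uniformity;
    a small p indicates a consistent (non-random) direction."""
    n = len(angles_deg)
    s = sum(math.sin(math.radians(a)) for a in angles_deg)
    c = sum(math.cos(math.radians(a)) for a in angles_deg)
    r_bar = math.hypot(s, c) / n
    z = n * r_bar ** 2
    return math.exp(-z) * (1 + (2 * z - z ** 2) / (4 * n))

# Hypothetical per-allele continuity directions on one study area:
angles = [110, 115, 120, 125, 118, 122]
mu = mean_direction(angles)
p = rayleigh_p(angles)
```

Tightly clustered directions give a mean near the cluster center and a very small Rayleigh p, matching the abstract's finding of consistent directional patterns.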

  19. Mapping aboveground woody biomass using forest inventory, remote sensing and geostatistical techniques.

    PubMed

    Yadav, Bechu K V; Nandy, S

    2015-05-01

    Mapping forest biomass is fundamental for estimating CO₂ emissions, and planning and monitoring of forests and ecosystem productivity. The present study attempted to map aboveground woody biomass (AGWB) integrating forest inventory, remote sensing and geostatistical techniques, viz., direct radiometric relationships (DRR), k-nearest neighbours (k-NN) and cokriging (CoK), and to evaluate their accuracy. A part of the Timli Forest Range of Kalsi Soil and Water Conservation Division, Uttarakhand, India was selected for the present study. Stratified random sampling was used to collect biophysical data from 36 sample plots of 0.1 ha (31.62 m × 31.62 m) size. Species-specific volumetric equations were used for calculating volume, which was multiplied by specific gravity to get biomass. Three forest-type density classes, viz. 10-40, 40-70 and >70% of Shorea robusta forest, and four non-forest classes were delineated using on-screen visual interpretation of IRS P6 LISS-III data of December 2012. The volume in different strata of forest-type density ranged from 189.84 to 484.36 m(3) ha(-1). The total growing stock of the forest was found to be 2,024,652.88 m(3). The AGWB ranged from 143 to 421 Mgha(-1). Spectral bands and vegetation indices were used as independent variables and biomass as the dependent variable for DRR, k-NN and CoK. After validation and comparison, the k-NN method with Mahalanobis distance (root mean square error (RMSE) = 42.25 Mgha(-1)) was found to be the best method, followed by fuzzy distance and Euclidean distance with RMSE of 44.23 and 45.13 Mgha(-1) respectively. DRR was found to be the least accurate method with RMSE of 67.17 Mgha(-1). The study highlighted the potential of integrating forest inventory, remote sensing and geostatistical techniques for forest biomass mapping. PMID:25930205
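A minimal sketch of the k-NN prediction and RMSE validation workflow compared above, using toy reflectance and biomass numbers (the study's best variant used Mahalanobis distance; plain Euclidean distance is shown here for brevity):

```python
import math

def knn_predict(train_features, train_biomass, query, k=3):
    """Predict biomass as the mean of the k spectrally nearest plots
    (Euclidean distance in feature space)."""
    order = sorted(range(len(train_features)),
                   key=lambda i: math.dist(train_features[i], query))
    return sum(train_biomass[i] for i in order[:k]) / k

def rmse(observed, predicted):
    """Root mean square error between observed and predicted values."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted))
                     / len(observed))

# Toy training plots: (band1, band2) reflectance -> biomass (Mg/ha)
X = [(0.10, 0.30), (0.12, 0.32), (0.30, 0.10), (0.32, 0.12), (0.20, 0.20)]
y = [400.0, 380.0, 150.0, 160.0, 270.0]
pred = knn_predict(X, y, query=(0.11, 0.31), k=2)  # mean of two nearest plots
```

Validation then consists of predicting each held-out plot and comparing the resulting RMSE across DRR, k-NN variants, and cokriging, as the abstract reports.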

  20. Principal Component Geostatistical Approach for large-dimensional inverse problems

    NASA Astrophysics Data System (ADS)

    Kitanidis, P. K.; Lee, J.

    2014-07-01

    The quasi-linear geostatistical approach is designed for weakly nonlinear underdetermined inverse problems, such as Hydraulic Tomography and Electrical Resistivity Tomography. It provides best estimates as well as measures for uncertainty quantification. However, for its textbook implementation, the approach involves iterations to reach an optimum and requires the determination of the Jacobian matrix, i.e., the derivative of the observation function with respect to the unknown. Although there are elegant methods for the determination of the Jacobian, the cost is high when the number of unknowns, m, and the number of observations, n, are high. It is also wasteful to compute the Jacobian for points away from the optimum. Irrespective of the issue of computing derivatives, the computational cost of implementing the method is generally of the order of m²n, though there are methods to reduce it. In this work, we present an implementation that utilizes a Gauss-Newton method that is matrix-free with respect to the Jacobian and improves the scalability of the geostatistical inverse problem. For each iteration, K runs of the forward problem are required, where K is not just much smaller than m but can be smaller than n. The computational and storage cost of the inverse procedure scales roughly linearly with m instead of m² as in the textbook approach. For problems of very large m, this implementation constitutes a dramatic reduction in computational cost compared to the textbook approach. Results illustrate the validity of the approach and provide insight into the conditions under which this method performs best.
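The matrix-free idea hinges on evaluating Jacobian-vector products with forward-model runs only, never forming the Jacobian explicitly. A finite-difference sketch with a hypothetical toy forward model (not the paper's implementation, which is more sophisticated):

```python
def jacobian_matvec(forward, x, v, eps=1e-6):
    """Approximate J(x) @ v with one extra forward-model run,
    avoiding explicit construction of the m x n Jacobian."""
    fx = forward(x)
    x_pert = [xi + eps * vi for xi, vi in zip(x, v)]
    fx_pert = forward(x_pert)
    return [(a - b) / eps for a, b in zip(fx_pert, fx)]

# Hypothetical linear "forward model" with known Jacobian [[2, 1], [0, 3]]:
def forward(x):
    return [2 * x[0] + x[1], 3 * x[1]]

jv = jacobian_matvec(forward, x=[1.0, 1.0], v=[1.0, 0.0])
```

Each product costs one forward run, so K products per Gauss-Newton iteration cost K runs, which is the scaling the abstract describes.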

  1. Medical Geography: a Promising Field of Application for Geostatistics

    PubMed Central

    Goovaerts, P.

    2008-01-01

    The analysis of health data and putative covariates, such as environmental, socio-economic, behavioral or demographic factors, is a promising application for geostatistics. It presents, however, several methodological challenges that arise from the fact that data are typically aggregated over irregular spatial supports and consist of a numerator and a denominator (i.e. population size). This paper presents an overview of recent developments in the field of health geostatistics, with an emphasis on three main steps in the analysis of areal health data: estimation of the underlying disease risk, detection of areas with significantly higher risk, and analysis of relationships with putative risk factors. The analysis is illustrated using age-adjusted cervix cancer mortality rates recorded over the 1970–1994 period for 118 counties of four states in the Western USA. Poisson kriging allows the filtering of noisy mortality rates computed from small population sizes, enhancing the correlation with two putative explanatory variables: percentage of inhabitants living below the federally defined poverty line, and percentage of Hispanic females. Area-to-point kriging formulation creates continuous maps of mortality risk, reducing the visual bias associated with the interpretation of choropleth maps. Stochastic simulation is used to generate realizations of cancer mortality maps, which allows one to quantify numerically how the uncertainty about the spatial distribution of health outcomes translates into uncertainty about the location of clusters of high values or the correlation with covariates. Lastly, geographically-weighted regression highlights the non-stationarity in the explanatory power of covariates: the higher mortality values along the coast are better explained by the two covariates than the lower risk recorded in Utah. PMID:19412347
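The noise-filtering role of Poisson kriging rests on shrinking rates computed from small populations toward more reliable values. The intuition can be sketched with a simple population-weighted smoother; note this stand-in ignores the spatial correlation between neighboring areas that Poisson kriging also exploits, and all counts and the prior strength below are made up:

```python
def smooth_rates(cases, populations, prior_strength=10000.0):
    """Shrink small-population rates toward the global rate; the weight
    grows with population, so reliable rates are barely changed.
    (Illustrates the filtering intuition only; Poisson kriging derives
    its weights from a semivariogram of the risk instead.)"""
    global_rate = sum(cases) / sum(populations)
    smoothed = []
    for c, p in zip(cases, populations):
        w = p / (p + prior_strength)
        smoothed.append(w * (c / p) + (1 - w) * global_rate)
    return smoothed

# County A: 2 deaths in 1,000 people (noisy rate);
# County B: 300 deaths in 100,000 people (stable rate).
rates = smooth_rates([2, 300], [1000, 100000])
```

The small county's rate moves most of the way toward the global rate, while the large county's rate is nearly unchanged, which is why filtering enhances correlations with covariates.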

  2. Principal Component Geostatistical Approach for large-dimensional inverse problems

    PubMed Central

    Kitanidis, P K; Lee, J

    2014-01-01

    The quasi-linear geostatistical approach is designed for weakly nonlinear underdetermined inverse problems, such as Hydraulic Tomography and Electrical Resistivity Tomography. It provides best estimates as well as measures for uncertainty quantification. However, for its textbook implementation, the approach involves iterations to reach an optimum and requires the determination of the Jacobian matrix, i.e., the derivative of the observation function with respect to the unknown. Although there are elegant methods for the determination of the Jacobian, the cost is high when the number of unknowns, m, and the number of observations, n, are high. It is also wasteful to compute the Jacobian for points away from the optimum. Irrespective of the issue of computing derivatives, the computational cost of implementing the method is generally of the order of m²n, though there are methods to reduce it. In this work, we present an implementation that utilizes a Gauss-Newton method that is matrix-free with respect to the Jacobian and improves the scalability of the geostatistical inverse problem. For each iteration, K runs of the forward problem are required, where K is not just much smaller than m but can be smaller than n. The computational and storage cost of the inverse procedure scales roughly linearly with m instead of m² as in the textbook approach. For problems of very large m, this implementation constitutes a dramatic reduction in computational cost compared to the textbook approach. Results illustrate the validity of the approach and provide insight into the conditions under which this method performs best. PMID:25558113

  3. DEM-based delineation for improving geostatistical interpolation of rainfall in mountainous region of Central Himalayas, India

    NASA Astrophysics Data System (ADS)

    Kumari, Madhuri; Singh, Chander Kumar; Bakimchandra, Oinam; Basistha, Ashoke

    2016-07-01

    In mountainous regions with heterogeneous topography, geostatistical modeling of rainfall using a global data set may not conform to the intrinsic hypothesis of stationarity. This study was focused on improving the precision of the interpolated rainfall maps by spatial stratification in complex terrain. Predictions of the normal annual rainfall data were carried out by ordinary kriging, universal kriging, and co-kriging, using 80-point observations in the Indian Himalayas extending over an area of 53,484 km2. A two-step spatial clustering approach is proposed. In the first step, the study area was delineated into two regions, namely lowland and upland, based on the elevation derived from the digital elevation model. The delineation was based on the natural break classification method. In the next step, the rainfall data was clustered into two groups based on its spatial location in lowland or upland. The terrain ruggedness index (TRI) was incorporated as a co-variable in the co-kriging interpolation algorithm. The precision of the kriged and co-kriged maps was assessed by two accuracy measures, root mean square error and Chatfield's percent better. It was observed that the stratification of rainfall data resulted in a 5-20 % increase in the performance efficiency of interpolation methods. Co-kriging outperformed the kriging models at annual and seasonal scale. The result illustrates that the stratification of the study area improves the stationarity characteristic of the point data, thus enhancing the precision of the interpolated rainfall maps derived using geostatistical methods.
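The terrain ruggedness index used above as the co-kriging covariate can be computed directly from a DEM. A sketch of Riley's TRI for one interior cell, on a small made-up elevation grid:

```python
import math

def tri(dem, row, col):
    """Riley's terrain ruggedness index: square root of the summed
    squared elevation differences between a cell and its 8 neighbors."""
    center = dem[row][col]
    total = 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            total += (dem[row + dr][col + dc] - center) ** 2
    return math.sqrt(total)

dem = [  # hypothetical elevations (m)
    [100.0, 101.0, 102.0],
    [100.0, 101.0, 103.0],
    [99.0, 100.0, 104.0],
]
ruggedness = tri(dem, 1, 1)
```

Computed over the whole DEM, this produces the secondary variable whose cross-correlation with rainfall the co-kriging system exploits.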

  4. Spatial distribution of Biomphalaria mollusks at São Francisco River Basin, Minas Gerais, Brazil, using geostatistical procedures.

    PubMed

    Guimarães, Ricardo J P S; Freitas, Corina C; Dutra, Luciano V; Felgueiras, Carlos A; Moura, Ana C M; Amaral, Ronaldo S; Drummond, Sandra C; Scholte, Ronaldo G C; Oliveira, Guilherme; Carvalho, Omar S

    2009-03-01

    Geostatistics is used in this work to make inferences about the presence of the species of Biomphalaria (B. glabrata, B. tenagophila and/or B. straminea), intermediate hosts of Schistosoma mansoni, at the São Francisco River Basin, in Minas Gerais, Brazil. One of these geostatistical procedures, known as indicator kriging, allows the classification of categorical data, in areas where the data are not available, using a punctual sample set. The result is a map of species and risk area definition. More than a single map of the categorical attribute, the procedure also permits the association of uncertainties of the stochastic model, which can be used to qualify the inferences. In order to validate the estimated data of the risk map, fieldwork in five municipalities was carried out. The obtained results showed that indicator kriging is a rather robust tool, since it presented very good agreement with the field findings. The obtained risk map can be thought of as an auxiliary tool to formulate proper public health strategies, and to guide other fieldwork, considering the places with higher occurrence probability of the most important snail species. Also, the risk map will enable better resource distribution and adequate policies for mollusk control. This methodology will be applied to other river basins to generate a predictive map for Biomphalaria species distribution for the entire state of Minas Gerais. PMID:19046937
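The indicator approach codes each categorical observation as a 0/1 indicator per species and interpolates the indicators to obtain occurrence probabilities at unsampled locations. A sketch using inverse-distance weighting in place of kriging (indicator kriging would instead derive the weights from an indicator variogram and also attach the uncertainty model the abstract mentions); the sites and species below are made up:

```python
def indicator(values, category):
    """Indicator coding: 1 where the category was observed, else 0."""
    return [1.0 if v == category else 0.0 for v in values]

def idw_probability(points, indicators, query, power=2.0):
    """Estimate occurrence probability at an unsampled location as an
    inverse-distance weighted average of the 0/1 indicators."""
    num = den = 0.0
    for (x, y), ind in zip(points, indicators):
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0.0:
            return ind  # query coincides with a sampled site
        w = 1.0 / d2 ** (power / 2.0)
        num += w * ind
        den += w
    return num / den

sites = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
species = ["B. glabrata", "B. glabrata", "B. straminea", "B. glabrata"]
p = idw_probability(sites, indicator(species, "B. glabrata"), query=(0.5, 0.5))
```

Mapping such probabilities for each species, and flagging where they exceed a chosen cutoff, is what yields the species and risk-area map.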

  5. Revised Geostatistical Analysis of the Inventory of Carbon Tetrachloride in the Unconfined Aquifer in the 200 West Area of the Hanford Site

    SciTech Connect

    Murray, Christopher J.; Bott, Yi-Ju

    2008-12-30

    This report provides an updated estimate of the inventory of carbon tetrachloride (CTET) in the unconfined aquifer in the 200 West Area of the Hanford Site. The contaminant plumes of interest extend within the 200-ZP-1 and 200-UP-1 operable units. CH2M HILL Plateau Remediation Company (CHPRC) currently is preparing a plan identifying locations for groundwater extraction wells, injection wells, transfer stations, and one or more treatment facilities to address contaminants of concern identified in the 200-ZP-1 CERCLA Record of Decision. To accomplish this, a current understanding of the inventory of CTET is needed throughout the unconfined aquifer in the 200 West Area. Pacific Northwest National Laboratory (PNNL) previously developed an estimate of the CTET inventory in the area using a Monte Carlo approach based on geostatistical simulation of the three-dimensional (3D) distribution of CTET and chloroform in the aquifer. Fluor Hanford, Inc. (FH) (the previous site contractor) requested PNNL to update that inventory estimate using as input a set of geostatistical realizations of CTET and chloroform recently created for a related but separate project, referred to as the mapping project. The scope of work for the inventory revision complemented the scope of work for the mapping project, performed for FH by PNNL. This report briefly describes the spatial and univariate distribution of the CTET and chloroform data, along with the results of the geostatistical analysis and simulation performed for the mapping project.

  6. An in vitro systematic spectroscopic examination of the photostabilities of a random set of commercial sunscreen lotions and their chemical UVB/UVA active agents.

    PubMed

    Serpone, Nick; Salinaro, Angela; Emeline, Alexei V; Horikoshi, Satoshi; Hidaka, Hisao; Zhao, Jincai

    2002-12-01

    The photostabilities of a random set of commercially available sunscreen lotions and their active ingredients are examined spectroscopically subsequent to simulated sunlight UV exposure. Loss of filtering efficacy can occur because of possible photochemical modifications of the sunscreen active agents. Changes in absorption of UVA/UVB sunlight by agents in sunscreen lotions also lead to a reduction of the expected photoprotection of human skin and DNA against harmful UV radiation. The active ingredients were investigated in aqueous media and in organic solvents of various polarities (methanol, acetonitrile, and n-hexane) under aerobic and anaerobic conditions. The UV absorption features are affected by the nature of the solvents, whose properties are closely related to the oil-in-water (o/w) or water-in-oil (w/o) emulsions actually used in sunscreen formulations, and by the presence of molecular oxygen. The photostabilities of two combined chemical ingredients (oxybenzone and octyl methoxycinnamate) and the combination oxybenzone/titanium dioxide were also explored. In the latter case, oxybenzone undergoes significant photodegradation in the presence of the physical filter TiO2. PMID:12661594

  7. Building on crossvalidation for increasing the quality of geostatistical modeling

    USGS Publications Warehouse

    Olea, R.A.

    2012-01-01

    The random function is a mathematical model commonly used in the assessment of uncertainty associated with a spatially correlated attribute that has been partially sampled. There are multiple algorithms for modeling such random functions, all sharing the requirement of specifying various parameters that have critical influence on the results. The importance of finding ways to compare the methods and set parameters to obtain results that better model uncertainty has increased as these algorithms have grown in number and complexity. Crossvalidation has been used in spatial statistics, mostly in kriging, for the analysis of mean square errors. An appeal of this approach is its ability to work with the same empirical sample available for running the algorithms. This paper goes beyond checking estimates by formulating a function sensitive to conditional bias. Under ideal conditions, such a function turns into a straight line, which can be used as a reference for preparing measures of performance. Applied to kriging, deviations from the ideal line provide sensitivity to the semivariogram lacking in crossvalidation of kriging errors and are more sensitive to conditional bias than analyses of errors. In terms of stochastic simulation, in addition to finding better parameters, the deviations allow comparison of the realizations resulting from the application of different methods. Examples show improvements of about 30% in the deviations and approximately 10% in the square root of mean square errors between reasonable starting models and the solutions according to the new criteria. © 2011 US Government.
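Leave-one-out crossvalidation, the backbone of the approach above, re-estimates each sample from the remaining ones and compares estimates with true values; since the paper's conditional-bias function becomes a straight line under ideal conditions, the regression slope of truth on estimate is a natural summary (slope 1 means no conditional bias). A sketch using a simple inverse-distance estimator as a stand-in for the kriging estimator actually analyzed:

```python
import numpy as np

def idw(xy, z, x0, p=2.0):
    """Inverse-distance weighting; a simple stand-in for kriging."""
    d = np.linalg.norm(xy - x0, axis=1)
    w = 1.0 / np.maximum(d, 1e-12) ** p
    return float(w @ z / w.sum())

def loo_cross_validate(xy, z, predict):
    """Leave-one-out crossvalidation: re-estimate each sample from the
    others, then summarise RMSE and the estimate-vs-truth slope."""
    est = np.empty(len(z))
    for i in range(len(z)):
        keep = np.ones(len(z), dtype=bool)
        keep[i] = False
        est[i] = predict(xy[keep], z[keep], xy[i])
    rmse = float(np.sqrt(np.mean((est - z) ** 2)))
    slope = float(np.polyfit(est, z, 1)[0])  # 1.0 on the ideal line
    return est, rmse, slope
```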

  8. Geostatistical noise filtering of geophysical images : application to unexploded ordnance (UXO) sites.

    SciTech Connect

    Saito, Hirotaka; McKenna, Sean Andrew; Coburn, Timothy C.

    2004-07-01

    Geostatistical and non-geostatistical noise-filtering methodologies (factorial kriging and a low-pass filter, respectively), together with a region-growing method, are applied to analytic-signal magnetometer images at two UXO-contaminated sites to delineate UXO target areas. Overall delineation performance is improved by removing background noise. Factorial kriging slightly outperforms the low-pass filter, but there is no distinct difference between them in terms of finding anomalies of interest.
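A low-pass filter of the kind compared above can be sketched as a moving-average background estimate, with anomalies flagged where the residual stands out (a generic illustration; the window size and threshold are assumptions, not the parameters used at the UXO sites):

```python
import numpy as np

def lowpass(img, k=3):
    """k-by-k moving-average low-pass filter with reflect padding."""
    pad = k // 2
    p = np.pad(img, pad, mode="reflect")
    out = np.zeros(img.shape)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def anomalies(img, k=3, nsigma=2.0):
    """Flag pixels whose residual (signal minus low-pass background)
    stands out from the residual's spread."""
    resid = img - lowpass(img, k)
    return resid > nsigma * resid.std()
```

Factorial kriging plays the same role but decomposes the image by nested variogram structures instead of a fixed spatial window.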

  9. Should hydraulic tomography data be interpreted using geostatistical inverse modeling? A laboratory sandbox investigation

    NASA Astrophysics Data System (ADS)

    Illman, Walter A.; Berg, Steven J.; Zhao, Zhanfeng

    2015-05-01

    The robust performance of hydraulic tomography (HT) based on geostatistics has been demonstrated through numerous synthetic, laboratory, and field studies. While geostatistical inverse methods offer many advantages, one key disadvantage is their highly parameterized nature, which renders them computationally intensive for large-scale problems. Another issue is that geostatistics-based HT may produce overly smooth images of subsurface heterogeneity when there are few monitoring interval data. Therefore, some may question the utility of the geostatistical inversion approach in certain situations and seek alternative approaches. To investigate these issues, we simultaneously calibrated different groundwater models with varying subsurface conceptualizations and parameter resolutions using a laboratory sandbox aquifer. The compared models included: (1) isotropic and anisotropic effective parameter models; (2) a heterogeneous model that faithfully represents the geological features; and (3) a heterogeneous model based on geostatistical inverse modeling. The performance of these models was assessed by quantitatively examining the results from model calibration and validation. Calibration data consisted of steady state drawdown data from eight pumping tests, and validation data consisted of data from 16 separate pumping tests not used in the calibration effort. Results revealed that the geostatistical inversion approach performed the best among the approaches compared, although the geological model that faithfully represented stratigraphy came a close second. In addition, when the number of pumping tests available for inverse modeling was small, the geological modeling approach yielded more robust validation results. This suggests that better knowledge of stratigraphy obtained via geophysics or other means may contribute to improved results for HT.

  10. Geostatistical prediction of flow-duration curves in an index-flow framework

    NASA Astrophysics Data System (ADS)

    Pugliese, Alessio; Castellarin, Attilio; Brath, Armando

    2014-05-01

    An empirical period-of-record Flow-Duration Curve (FDC) describes the percentage of time (duration) in which a given streamflow was equaled or exceeded over a historical period of time. FDCs have always attracted a great deal of interest in engineering applications because of their ability to provide a simple yet comprehensive graphical view of the overall historical variability of streamflows in a river basin, from floods to low flows. Nevertheless, in many practical applications one has to construct FDCs in basins that are ungauged or where very few observations are available. We present in this study an application strategy of Topological kriging (or Top-kriging), which makes the geostatistical procedure capable of predicting FDCs in ungauged catchments. Previous applications of Top-kriging mainly focused on the prediction of point streamflow indices (e.g. flood quantiles, low-flow indices, etc.). In this study Top-kriging is used to predict FDCs in ungauged sites as a weighted average of standardised empirical FDCs through the traditional linear-weighting scheme of kriging methods. Our study focuses on the prediction of FDCs for 18 unregulated catchments located in Central Italy, for which daily streamflow series with lengths from 5 to 40 years are available, together with information on climate referring to the same time-span as each daily streamflow sequence. Empirical FDCs are standardised by a reference index-flow value (i.e. mean annual flow, or mean annual precipitation times the catchment drainage area) and the overall deviation of the curves from this reference value is then used for expressing the hydrological similarity between catchments and for deriving the geostatistical weights. We performed an extensive leave-one-out cross-validation to quantify the accuracy of the proposed technique, and to compare it to traditional regionalisation models that were recently developed for the same study region. The cross-validation points
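The standardised empirical FDCs that Top-kriging averages can be computed directly from a streamflow series; a minimal sketch using the Weibull plotting position (an assumption here, since the abstract does not specify its plotting position):

```python
import numpy as np

def empirical_fdc(q):
    """Empirical flow-duration curve: flows sorted in decreasing order
    against exceedance duration (Weibull plotting position i/(n+1))."""
    q = np.sort(np.asarray(q, dtype=float))[::-1]
    d = np.arange(1, len(q) + 1) / (len(q) + 1)
    return d, q

def standardised_fdc(q):
    """FDC standardised by an index flow (here the mean flow, as one of
    the reference index-flow choices mentioned in the study)."""
    q = np.asarray(q, dtype=float)
    d, qs = empirical_fdc(q)
    return d, qs / q.mean()
```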

  11. Optimized Field Sampling and Monitoring of Airborne Hazardous Transport Plumes; A Geostatistical Simulation Approach

    SciTech Connect

    Chen, DI-WEN

    2001-11-21

    Airborne hazardous plumes inadvertently released during nuclear/chemical/biological incidents are mostly of unknown composition and concentration until measurements are taken of post-accident ground concentrations from plume-ground deposition of constituents. Unfortunately, measurements often come days post-incident and rely on hazardous manned air-vehicle flights. Before this happens, computational plume migration models are the only source of information on the plume characteristics, constituents, concentrations, directions of travel, ground deposition, etc. A mobile "lighter than air" (LTA) system is being developed at Oak Ridge National Laboratory that will be part of the first response in emergency conditions. These interactive and remote unmanned air vehicles will carry light-weight detectors and weather instrumentation to measure the conditions during and after plume release. This requires a cooperative, computationally organized, GPS-controlled set of LTAs that self-coordinate around the objectives in an emergency situation in restricted time frames. A critical step before an optimum and cost-effective field sampling and monitoring program proceeds is the collection of data that provides statistically significant information, collected in a reliable and expeditious manner. Efficient aerial arrangements of the detectors taking the data (for active airborne release conditions) are necessary for plume identification, computational 3-dimensional reconstruction, and source distribution functions. This report describes the application of stochastic or geostatistical simulations to delineate the plume for guiding subsequent sampling and monitoring designs. A case study is presented of building digital plume images, based on existing "hard" experimental data and "soft" preliminary transport modeling results from the Prairie Grass Trials Site. Markov Bayes Simulation, a coupled Bayesian/geostatistical methodology, quantitatively combines soft information

  12. Estimating transmissivity in the Edwards Aquifer using upscaling, geostatistics, and Bayesian updating

    NASA Astrophysics Data System (ADS)

    Painter, S. L.; Jiang, Y.; Woodbury, A. D.

    2002-12-01

    The Edwards Aquifer, a highly heterogeneous karst aquifer located in south central Texas, is the sole source of drinking water for more than one million people. Hydraulic conductivity (K) measurements in the Edwards Aquifer are sparse, highly variable (log-K variance of 6.4), and are mostly from single-well drawdown tests that are appropriate for a spatial scale of a few meters. To support ongoing efforts to develop a groundwater management (MODFLOW) model of the San Antonio segment of the Edwards Aquifer, a multistep procedure was developed to assign hydraulic parameters to the 402 m x 402 m computational cells intended for the management model. The approach used a combination of nonparametric geostatistical analysis, stochastic simulation, numerical upscaling, and automatic model calibration based on Bayesian updating [1,2]. Indicator correlograms reveal a nested spatial structure in the well-test K of the confined zone, with practical correlation ranges of 3,600 and 15,000 meters and a large nugget effect. The fitted geostatistical model was used in unconditional stochastic simulations by the sequential indicator simulation method. The resulting realizations of K, defined at the scale of the well tests, were then numerically upscaled to the block scale. A new geostatistical model was fitted to the upscaled values. The upscaled model was then used to cokrige the block-scale K based on the well-test K. The resulting K map was then converted to transmissivity (T) using deterministically mapped aquifer thickness. When tested in a forward groundwater model, the upscaled T reproduced hydraulic heads better than a simple kriging of the well-test values (mean error of -3.9 meters and mean absolute error of 12 meters, as compared with -13 and 17 meters for the simple kriging). As the final step in the study, the upscaled T map was used as the prior distribution in an inverse procedure based on Bayesian updating [1,2]. When input to the forward groundwater model, the
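The numerical upscaling step above maps well-test-scale K realizations onto the 402 m blocks. As a hedged stand-in for the flow-based upscaling actually used in the study, a geometric-mean block average (a common heuristic for 2-D isotropic media) looks like:

```python
import numpy as np

def upscale_geomean(k_fine, block):
    """Upscale a fine grid of K to coarse blocks with the geometric mean.
    This is an illustrative heuristic, not the study's numerical
    (flow-based) upscaling; any trailing partial blocks are dropped."""
    ny, nx = k_fine.shape
    by, bx = ny // block, nx // block
    k = k_fine[:by * block, :bx * block].reshape(by, block, bx, block)
    return np.exp(np.log(k).mean(axis=(1, 3)))
```

The geometric mean is exact for lognormal, isotropic 2-D media in a statistical sense, which is why it is the usual first approximation before resorting to flow simulation.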

  13. Optimization of ventilator setting by flow and pressure waveforms analysis during noninvasive ventilation for acute exacerbations of COPD: a multicentric randomized controlled trial

    PubMed Central

    2011-01-01

    Introduction: The analysis of flow and pressure waveforms generated by ventilators can be useful in the optimization of patient-ventilator interactions, notably in chronic obstructive pulmonary disease (COPD) patients. To date, however, a real clinical benefit of this approach has not been proven. Methods: The aim of the present randomized, multi-centric, controlled study was to compare optimized ventilation, driven by the analysis of flow and pressure waveforms, to standard ventilation (same physician, same initial ventilator setting, same time spent at the bedside while the ventilator screen was obscured with numerical data always available). The primary aim was the rate of pH normalization at two hours, while secondary aims were changes in PaCO2, respiratory rate and the patient's tolerance to ventilation (all parameters evaluated at baseline, 30, 120, 360 minutes and 24 hours after the beginning of ventilation). Seventy patients (35 for each group) with acute exacerbation of COPD were enrolled. Results: Optimized ventilation led to a more rapid normalization of pH at two hours (51 vs. 26% of patients), to a significant improvement of the patient's tolerance to ventilation at two hours, and to a higher decrease of PaCO2 at two and six hours. Optimized ventilation induced physicians to use higher levels of external positive end-expiratory pressure, more sensitive inspiratory triggers and a faster speed of pressurization. Conclusions: The analysis of the waveforms generated by ventilators has a significant positive effect on physiological and patient-centered outcomes during acute exacerbation of COPD. The acquisition of specific skills in this field should be encouraged. Trial registration: ClinicalTrials.gov NCT01291303. PMID:22115190

  14. Hand-held echocardiography in the setting of pre-operative cardiac evaluation of patients undergoing non-cardiac surgery: results from a randomized pilot study.

    PubMed

    Cavallari, Ilaria; Mega, Simona; Goffredo, Costanza; Patti, Giuseppe; Chello, Massimo; Di Sciascio, Germano

    2015-06-01

    Transthoracic echocardiography is not a routine test in the pre-operative cardiac evaluation of patients undergoing non-cardiac surgery but may be considered in those with known heart failure and valvular heart disease or complaining of cardiac symptoms. In this setting, hand-held echocardiography (HHE) could find a potential application as an alternative to standard echocardiography in selected patients; however, its utility in this context has not been investigated. The aim of this pilot study was to evaluate the conclusiveness of HHE compared to standard echocardiography in this subset of patients. 100 patients scheduled for non-cardiac surgery were randomized to receive a standard exam with a Philips Ie33 or a bedside evaluation with a pocket-size imaging device (Opti-Go, Philips Medical System). The primary endpoint was the percentage of satisfactory diagnoses at the end of the examination, referred to as conclusiveness. Secondary endpoints were the mean duration time and the mean waiting time to perform the exams. No significant difference in conclusiveness between HHE and standard echo was found (86 vs 96%; P = 0.08). Mean duration time of the examinations was 6.1 ± 1.2 min with HHE and 13.1 ± 2.6 min with standard echocardiography (P < 0.001). HHE resulted in a considerable saving of waiting time because it was performed the same day as the clinical evaluation, whereas patients waited 10.1 ± 6.1 days for a standard echocardiography (P < 0.001). This study suggests a potential role for HHE in the pre-operative evaluation of selected patients undergoing non-cardiac surgery, since it provided similar information but was performed faster and earlier than standard echocardiography. PMID:25985940

  15. Prospective Randomized Double-Blind Pilot Study of Site-Specific Consensus Atlas Implementation for Rectal Cancer Target Volume Delineation in the Cooperative Group Setting

    SciTech Connect

    Fuller, Clifton D.; Nijkamp, Jasper; Duppen, Joop C.; Rasch, Coen R.N.; Thomas, Charles R.; Wang, Samuel J.; Okunieff, Paul; Jones, William E.; Baseman, Daniel; Patel, Shilpen; Demandante, Carlo G.N.; Harris, Anna M.; Smith, Benjamin D.; Katz, Alan W.; McGann, Camille

    2011-02-01

    Purpose: Variations in target volume delineation represent a significant hurdle in clinical trials involving conformal radiotherapy. We sought to determine the effect of a consensus guideline-based visual atlas on contouring the target volumes. Methods and Materials: A representative case was contoured (Scan 1) by 14 physician observers and a reference expert with and without target volume delineation instructions derived from a proposed rectal cancer clinical trial involving conformal radiotherapy. The gross tumor volume (GTV), and two clinical target volumes (CTVA, including the internal iliac, presacral, and perirectal nodes, and CTVB, which included the external iliac nodes) were contoured. The observers were randomly assigned to receipt (Group A) or nonreceipt (Group B) of a consensus guideline and atlas for anorectal cancers and then instructed to recontour the same case/images (Scan 2). Observer variation was analyzed volumetrically using the conformation number (CN, where CN = 1 equals total agreement). Results: Of 14 evaluable contour sets (1 expert and 7 Group A and 6 Group B observers), greater agreement was found for the GTV (mean CN, 0.75) than for the CTVs (mean CN, 0.46-0.65). Atlas exposure for Group A led to significantly increased interobserver agreement for CTVA (mean initial CN, 0.68, after atlas use, 0.76; p = .03) and increased agreement with the expert reference (initial mean CN, 0.58; after atlas use, 0.69; p = .02). For the GTV and CTVB, neither the interobserver nor the expert agreement was altered after atlas exposure. Conclusion: Consensus guideline atlas implementation resulted in a detectable difference in interobserver agreement and a greater approximation of expert volumes for the CTVA but not for the GTV or CTVB in the specified case. Visual atlas inclusion should be considered as a feature in future clinical trials incorporating conformal RT.

  16. Randomized controlled trial of meat compared with multimicronutrient-fortified cereal in infants and toddlers with high stunting rates in diverse settings123

    PubMed Central

    Krebs, Nancy F; Mazariegos, Manolo; Chomba, Elwyn; Sami, Neelofar; Pasha, Omrana; Tshefu, Antoinette; Carlo, Waldemar A; Goldenberg, Robert L; Bose, Carl L; Wright, Linda L; Koso-Thomas, Marion; Goco, Norman; Kindem, Mark; McClure, Elizabeth M; Westcott, Jamie; Garces, Ana; Lokangaka, Adrien; Manasyan, Albert; Imenda, Edna; Hartwell, Tyler D; Hambidge, K Michael

    2012-01-01

    Background: Improved complementary feeding is cited as a critical factor for reducing stunting. Consumption of meats has been advocated, but its efficacy in low-resource settings has not been tested. Objective: The objective was to test the hypothesis that daily intake of 30 to 45 g meat from 6 to 18 mo of age would result in greater linear growth velocity and improved micronutrient status in comparison with an equicaloric multimicronutrient-fortified cereal. Design: This was a cluster randomized efficacy trial conducted in the Democratic Republic of Congo, Zambia, Guatemala, and Pakistan. Individual daily portions of study foods and education messages to enhance complementary feeding were delivered to participants. Blood tests were obtained at trial completion. Results: A total of 532 (86.1%) and 530 (85.8%) participants from the meat and cereal arms, respectively, completed the study. Linear growth velocity did not differ between treatment groups: 1.00 (95% CI: 0.99, 1.02) and 1.02 (95% CI: 1.00, 1.04) cm/mo for the meat and cereal groups, respectively (P = 0.39). From baseline to 18 mo, stunting [length-for-age z score (LAZ) <−2.0] rates increased from ∼33% to nearly 50%. Years of maternal education and maternal height were positively associated with linear growth velocity (P = 0.0006 and 0.003, respectively); LAZ at 6 mo was negatively associated (P < 0.0001). Anemia rates did not differ by group; iron deficiency was significantly lower in the cereal group. Conclusion: The high rate of stunting at baseline and the lack of effect of either the meat or multiple micronutrient-fortified cereal intervention to reverse its progression argue for multifaceted interventions beginning in the pre- and early postnatal periods. This trial was registered at clinicaltrials.gov as NCT01084109. PMID:22952176

  17. A multicenter randomized controlled trial of a plant-based nutrition program to reduce body weight and cardiovascular risk in the corporate setting: the GEICO study

    PubMed Central

    Mishra, S; Xu, J; Agarwal, U; Gonzales, J; Levin, S; Barnard, N D

    2013-01-01

    Background/objectives: To determine the effects of a low-fat plant-based diet program on anthropometric and biochemical measures in a multicenter corporate setting. Subjects/methods: Employees from 10 sites of a major US company with body mass index ⩾25 kg/m2 and/or previous diagnosis of type 2 diabetes were randomized to either follow a low-fat vegan diet, with weekly group support and work cafeteria options available, or make no diet changes for 18 weeks. Dietary intake, body weight, plasma lipid concentrations, blood pressure and glycated hemoglobin (HbA1C) were determined at baseline and 18 weeks. Results: Mean body weight fell 2.9 kg and 0.06 kg in the intervention and control groups, respectively (P<0.001). Total and low-density lipoprotein (LDL) cholesterol fell 8.0 and 8.1 mg/dl in the intervention group and 0.01 and 0.9 mg/dl in the control group (P<0.01). HbA1C fell 0.6 percentage point and 0.08 percentage point in the intervention and control group, respectively (P<0.01). Among study completers, mean changes in body weight were −4.3 kg and −0.08 kg in the intervention and control groups, respectively (P<0.001). Total and LDL cholesterol fell 13.7 and 13.0 mg/dl in the intervention group and 1.3 and 1.7 mg/dl in the control group (P<0.001). HbA1C levels decreased 0.7 percentage point and 0.1 percentage point in the intervention and control group, respectively (P<0.01). Conclusions: An 18-week dietary intervention using a low-fat plant-based diet in a corporate setting improves body weight, plasma lipids, and, in individuals with diabetes, glycemic control. PMID:23695207

  18. Working with men to prevent intimate partner violence in a conflict-affected setting: a pilot cluster randomized controlled trial in rural Côte d’Ivoire

    PubMed Central

    2014-01-01

    Background: Evidence from armed conflict settings points to high levels of intimate partner violence (IPV) against women. Current knowledge on how to prevent IPV is limited—especially within war-affected settings. To inform prevention programming on gender-based violence in settings affected by conflict, we evaluated the impact of adding a targeted men’s intervention to a community-based prevention programme in Côte d’Ivoire. Methods: We conducted a two-armed, non-blinded cluster randomized trial in Côte d’Ivoire among 12 pair-matched communities spanning government-controlled, UN buffer, and rebel-controlled zones. The intervention communities received a 16-week IPV prevention intervention using a men’s discussion group format. All communities received community-based prevention programmes. Baseline data were collected from couples in September 2010 (pre-intervention) and follow-up in March 2012 (one year post-intervention). The primary trial outcome was women’s reported experiences of physical and/or sexual IPV in the last 12 months. We also assessed men’s reported intention to use physical IPV, attitudes towards sexual IPV, use of hostility and conflict management skills, and participation in gendered household tasks. An adjusted cluster-level intention to treat analysis was used to compare outcomes between intervention and control communities at follow-up. Results: At follow-up, reported levels of physical and/or sexual IPV in the intervention arm had decreased compared to the control arm (ARR 0.52, 95% CI 0.18-1.51, not significant). Men participating in the intervention reported decreased intentions to use physical IPV (ARR 0.83, 95% CI 0.66-1.06) and improved attitudes toward sexual IPV (ARR 1.21, 95% CI 0.77-1.91). Significant differences were found between men in the intervention and control arms’ reported ability to control their hostility and manage conflict (ARR 1.3, 95% CI 1.06-1.58), and participation in gendered household tasks (ARR

  19. Geostatistical investigations for suitable mapping of the water table: the Bordeaux case (France)

    NASA Astrophysics Data System (ADS)

    Guekie simo, Aubin Thibaut; Marache, Antoine; Lastennet, Roland; Breysse, Denys

    2016-02-01

    Methodologies have been developed to establish realistic water-table maps using geostatistical methods: ordinary kriging (OK), cokriging (CoK), collocated cokriging (CoCoK), and kriging with external drift (KED). In hilly terrain, when piezometric data are sparsely distributed over large areas, the water-table maps obtained by these methods provide exact water levels at monitoring wells but can fail to represent the groundwater flow system, manifested through an interpolated water table above the topography. A methodology is developed in order to rebuild water-table maps for urban areas at the city scale. The interpolation methodology is presented and applied in a case study where water levels are monitored at a set of 47 points for a partly urban domain covering 25.6 km2 close to Bordeaux city, France. To select the best method, a geographic information system was used to visualize surfaces reconstructed with each method. A cross-validation was carried out to evaluate the predictive performances of each kriging method. KED proves to be the most accurate and yields a better description of the local fluctuations induced by the topography (natural occurrence of ridges and valleys).

  20. Characterizing the spatial structure of endangered species habitat using geostatistical analysis of IKONOS imagery

    USGS Publications Warehouse

    Wallace, C.S.A.; Marsh, S.E.

    2005-01-01

    Our study used geostatistics to extract measures that characterize the spatial structure of vegetated landscapes from satellite imagery for mapping endangered Sonoran pronghorn habitat. Fine spatial resolution IKONOS data provided information at the scale of individual trees or shrubs that permitted analysis of vegetation structure and pattern. We derived images of landscape structure by calculating local estimates of the nugget, sill, and range variogram parameters within 25 × 25-m image windows. These variogram parameters, which describe the spatial autocorrelation of the 1-m image pixels, are shown in previous studies to discriminate between different species-specific vegetation associations. We constructed two independent models of pronghorn landscape preference by coupling the derived measures with Sonoran pronghorn sighting data: a distribution-based model and a cluster-based model. The distribution-based model used the descriptive statistics for variogram measures at pronghorn sightings, whereas the cluster-based model used the distribution of pronghorn sightings within clusters of an unsupervised classification of derived images. Both models define similar landscapes, and validation results confirm they effectively predict the locations of an independent set of pronghorn sightings. Such information, although not a substitute for field-based knowledge of the landscape and associated ecological processes, can provide valuable reconnaissance information to guide natural resource management efforts. © 2005 Taylor & Francis Group Ltd.

  1. Geostatistical analysis of tritium, groundwater age and other noble gas derived parameters in California.

    PubMed

    Visser, A; Moran, J E; Hillegonds, Darren; Singleton, M J; Kulongoski, Justin T; Belitz, Kenneth; Esser, B K

    2016-03-15

    Key characteristics of California groundwater systems related to aquifer vulnerability, sustainability, recharge locations and mechanisms, and anthropogenic impact on recharge are revealed in a spatial geostatistical analysis of a unique data set of tritium, noble gases and other isotopic analyses unprecedented in size at nearly 4000 samples. The correlation length of key groundwater residence time parameters varies from tens of kilometers (³H; age) to the order of a hundred kilometers (terrigenic ⁴He; ¹⁴C; tritiogenic ³He). The correlation length of parameters related to climate, topography and atmospheric processes is on the order of several hundred kilometers (recharge temperature; δ¹⁸O). Young groundwater ages that highlight regional recharge areas are located in the eastern San Joaquin Valley, in the southern Santa Clara Valley Basin, in the upper LA basin and along unlined canals carrying Colorado River water, showing that much of the recent recharge in central and southern California is dominated by river recharge and managed aquifer recharge. Modern groundwater is found in wells with the top open intervals below 60 m depth in the southeastern San Joaquin Valley, Santa Clara Valley and Los Angeles basin, as the result of intensive pumping and/or managed aquifer recharge operations. PMID:26803267
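Correlation lengths like those quoted above are read off variogram or correlogram models fitted to the data; a minimal sketch of the empirical semivariogram such fits start from (illustrative, not the study's code):

```python
import numpy as np

def empirical_variogram(xy, z, nbins=10, maxdist=None):
    """Isotropic empirical semivariogram: half the mean squared difference
    of all sample pairs, binned by separation distance."""
    i, j = np.triu_indices(len(z), 1)
    h = np.linalg.norm(xy[i] - xy[j], axis=1)
    sq = 0.5 * (z[i] - z[j]) ** 2
    if maxdist is None:
        maxdist = h.max()
    sel = h <= maxdist
    h, sq = h[sel], sq[sel]
    edges = np.linspace(0.0, maxdist, nbins + 1)
    which = np.clip(np.digitize(h, edges) - 1, 0, nbins - 1)
    lags, gamma = [], []
    for b in range(nbins):
        m = which == b
        if m.any():
            lags.append(h[m].mean())
            gamma.append(sq[m].mean())
    return np.array(lags), np.array(gamma)
```

The lag at which gamma levels off at the sill is the practical correlation range; fitting a model (spherical, exponential, etc.) to these binned values gives the correlation lengths reported per parameter.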

  2. Spatial analysis of heavy metals in surface soils based on GeoStatistics

    NASA Astrophysics Data System (ADS)

    Sun, Yingjun; Ding, Ning; Cai, Fei; Meng, Fei

    2008-10-01

    Heavy-metal pollution of surface soils has become a widely discussed problem. Rather than seeking the single "best" estimate at unsampled points, the authors focused on assessing the spatial uncertainty of unsampled values. Geostatistical simulation, which reproduces the target statistics (histogram, variogram), can generate a set of equally probable realizations that is particularly useful for assessing the uncertainty in the spatial distribution of attribute values. The case study was from an urban-rural transition zone of Shanghai, China. Six heavy metals (Cu, Pb, Cd, Cr, Hg and As) in agricultural surface soils were analyzed in the paper. After studying the spatial variation of each heavy metal, the authors generated realizations of the six metals using sequential simulation methods. They concluded that Cu, Cd and Cr were the dominant elements influencing soil quality in the study area, and they produced an uncertainty map for each of the six heavy metals.
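
    The sequential simulation used in the paper is involved, but its core idea, generating equally probable realizations that honor a target covariance, can be sketched with the simpler LU (Cholesky) simulation method. The exponential covariance model, grid size, and number of realizations below are illustrative assumptions, not the paper's setup:

```python
import math
import random

def cov(h, sill=1.0, rng=5.0):
    """Assumed exponential covariance model C(h) = sill * exp(-h / range)."""
    return sill * math.exp(-h / rng)

def cholesky(a):
    """Lower-triangular L with L L^T = a, for a small SPD matrix."""
    m = len(a)
    L = [[0.0] * m for _ in range(m)]
    for i in range(m):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(a[i][i] - s)
            else:
                L[i][j] = (a[i][j] - s) / L[j][j]
    return L

random.seed(1)
n = 20                                   # grid nodes along a transect
C = [[cov(abs(i - j)) for j in range(n)] for i in range(n)]
L = cholesky(C)
# each realization z = L w with w ~ N(0, I) has covariance C;
# the ensemble spread maps the spatial uncertainty
realizations = []
for _ in range(50):
    w = [random.gauss(0, 1) for _ in range(n)]
    realizations.append([sum(L[i][k] * w[k] for k in range(i + 1))
                         for i in range(n)])
```

    Sequential simulation achieves the same goal node by node (and conditions on data), which scales better than the dense Cholesky factorization shown here.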

  3. Obtaining parsimonious hydraulic conductivity fields using head and transport observations: A bayesian geostatistical parameter estimation approach

    USGS Publications Warehouse

    Fienen, M.; Hunt, R.; Krabbenhoft, D.; Clemo, T.

    2009-01-01

    Flow path delineation is a valuable tool for interpreting the subsurface hydrogeochemical environment. Different types of data, such as groundwater flow and transport, inform different aspects of hydrogeologic parameter values (hydraulic conductivity in this case) which, in turn, determine flow paths. This work combines flow and transport information to estimate a unified set of hydrogeologic parameters using the Bayesian geostatistical inverse approach. Parameter flexibility is allowed by using a highly parameterized approach with the level of complexity informed by the data. Despite the effort to adhere to the ideal of minimal a priori structure imposed on the problem, extreme contrasts in parameters can result in the need to censor correlation across hydrostratigraphic bounding surfaces. These partitions segregate parameters into facies associations. With an iterative approach in which partitions are based on inspection of initial estimates, flow path interpretation is progressively refined through the inclusion of more types of data. Head observations, stable oxygen isotope ratios (¹⁸O/¹⁶O), and tritium are all used to progressively refine flow path delineation on an isthmus between two lakes in the Trout Lake watershed, northern Wisconsin, United States. Despite allowing significant parameter freedom by estimating many distributed parameter values, a smooth field is obtained. Copyright 2009 by the American Geophysical Union.

  4. Geostatistical Analysis of Spatio-Temporal Forest Fire Data

    NASA Astrophysics Data System (ADS)

    Vega Orozco, Carmen D.; Kanevski, Mikhail; Tonini, Marj; Conedera, Marc

    2010-05-01

    Forest fire is one of the major phenomena causing degradation of the environment, landscape, natural ecosystems, human health and the economy. One of the main topics in forest fire studies is the detection, analysis and modelling of spatio-temporal patterns of clustering. Spatial patterns of forest fire locations, their sizes and their sequence in time are of great interest for fire prediction, for forest fire management planning and for distributing the necessary resources in an optimal way. Currently, fires can be analyzed and monitored by using different statistical tools, for example, Ripley's K-function, fractals, the Allan factor, scan statistics, etc. Some of them are adapted to temporal or spatial data and are either local or global. The present study focuses on the application of geostatistical tools - variography and methods for the analysis of monitoring network (MN) clustering (topological, statistical and fractal measures) - in order to detect and characterize spatio-temporal forest fire patterns. The main studies performed include: a) analysis of forest fire temporal sequences; b) spatial clustering of forest fires; c) geostatistical spatial analysis of burnt areas. Variography was carried out for both temporal and spatial data. The real case study is based on forest-fire event data from the Canton of Ticino (Switzerland) for the period 1969 to 2008. The results of the temporal analysis show the presence of clustering and seasonal periodicities. Comprehensive analysis of the variograms shows an anisotropy in the direction 30° East-North, where smooth changes are detected, while in the direction 30° North-West a greater variability was identified. The research was completed with an application of different MN analysis techniques including analysis of the distributions of distances between events, the Morisita Index (MI), fractal dimensions (sandbox counting and box counting methods) and functional fractal dimensions, adapted and
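
    Of the clustering measures mentioned, the box-counting fractal dimension is the easiest to sketch. The point set below is synthetic (uniform random points, whose box-counting dimension approaches 2), standing in for the fire-event coordinates; clustered events would yield a dimension noticeably below 2:

```python
import math
import random

def box_count_dimension(points, sizes):
    """Least-squares slope of log N(s) versus log(1/s), where N(s) is
    the number of boxes of side s containing at least one point."""
    xs, ys = [], []
    for s in sizes:
        boxes = {(int(px / s), int(py / s)) for px, py in points}
        xs.append(math.log(1.0 / s))
        ys.append(math.log(len(boxes)))
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

random.seed(2)
pts = [(random.random(), random.random()) for _ in range(20000)]
d = box_count_dimension(pts, sizes=[0.5, 0.25, 0.125, 0.0625])
```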

  5. Geostatistical Analyses of the Persistence and Inventory of Carbon Tetrachloride in the 200 West Area of the Hanford Site

    SciTech Connect

    Murray, Christopher J.; Bott, Yi-Ju; Truex, Michael J.

    2007-04-30

    This report documents two separate geostatistical studies performed by researchers from Pacific Northwest National Laboratory to evaluate the carbon tetrachloride plume in the groundwater on the Hanford Site.

  6. Bootstrapped models for intrinsic random functions

    SciTech Connect

    Campbell, K.

    1987-01-01

    The use of intrinsic random function stochastic models as a basis for estimation in geostatistical work requires the identification of the generalized covariance function of the underlying process, and the fact that this function has to be estimated from the data introduces an additional source of error into predictions based on the model. This paper develops the sample reuse procedure called the "bootstrap" in the context of intrinsic random functions to obtain realistic estimates of these errors. Simulation results support the conclusion that bootstrap distributions of functionals of the process, as well as of their "kriging variance," provide a reasonable picture of the variability introduced by imperfect estimation of the generalized covariance function.

  7. Bootstrapped models for intrinsic random functions

    SciTech Connect

    Campbell, K.

    1988-08-01

    Use of intrinsic random function stochastic models as a basis for estimation in geostatistical work requires the identification of the generalized covariance function of the underlying process. The fact that this function has to be estimated from data introduces an additional source of error into predictions based on the model. This paper develops the sample reuse procedure called the bootstrap in the context of intrinsic random functions to obtain realistic estimates of these errors. Simulation results support the conclusion that bootstrap distributions of functionals of the process, as well as their kriging variance, provide a reasonable picture of variability introduced by imperfect estimation of the generalized covariance function.
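
    The bootstrap procedure described in both abstracts can be sketched generically. Here the resampled statistic is the sample variance, a deliberately simple stand-in for a fitted generalized covariance parameter, and the data are synthetic:

```python
import random
import statistics

def bootstrap(stat, sample, n_boot=500):
    """Distribution of `stat` over n_boot resamples drawn with replacement."""
    n = len(sample)
    return [stat([sample[random.randrange(n)] for _ in range(n)])
            for _ in range(n_boot)]

random.seed(3)
data = [random.gauss(10, 2) for _ in range(100)]
boot = bootstrap(statistics.variance, data)
# the spread of the bootstrap distribution gives a realistic picture of the
# error introduced by estimating the parameter from limited data
spread = statistics.stdev(boot)
```

    In the papers' setting, `stat` would be the generalized-covariance fit (and the resampling would respect the spatial configuration), but the logic of resample-refit-summarize is the same.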

  8. Priority Setting and Influential Factors on Acceptance of Pharmaceutical Recommendations in Collaborative Medication Reviews in an Ambulatory Care Setting – Analysis of a Cluster Randomized Controlled Trial (WestGem-Study)

    PubMed Central

    Rose, Olaf; Mennemann, Hugo; John, Carina; Lautenschläger, Marcus; Mertens-Keller, Damaris; Richling, Katharina; Waltering, Isabel; Hamacher, Stefanie; Felsch, Moritz; Herich, Lena; Czarnecki, Kathrin; Schaffert, Corinna; Jaehde, Ulrich; Köberlein-Neu, Juliane

    2016-01-01

    Background Medication reviews are recognized services to increase quality of therapy and reduce medication risks. The selection of eligible patients with potential to receive a major benefit is based on assumptions rather than on factual data. Acceptance of interprofessional collaboration is crucial to increase the quality of medication therapy. Objective The research question was to identify and prioritize eligible patients for a medication review and to provide evidence-based criteria for patient selection. Acceptance of the prescribing general practitioner to implement pharmaceutical recommendations was measured and factors influencing physicians’ acceptance were explored to obtain an impression on the extent of collaboration in medication review in an ambulatory care setting. Methods Based on data of a cluster-randomized controlled study (WestGem-study), the correlation between patient parameters and the individual performance in a medication review was calculated in a multiple logistic regression model. Physician’s acceptance of the suggested intervention was assessed using feedback forms. Influential factors were analyzed. Results The number of drugs in use (p = 0.001), discrepancies between prescribed and used medicines (p = 0.014), the baseline Medication Appropriateness Index score (p<0.001) and the duration of the intervention (p = 0.006) could be identified as influential factors for a major benefit from a medication review, whereas morbidity (p>0.05) and a low kidney function (p>0.05) do not predetermine the outcome. Longitudinal patient care with repeated reviews showed higher interprofessional acceptance and superior patient benefit. A total of 54.9% of the recommendations in a medication review on drug therapy were accepted for implementation. Conclusions The number of drugs in use and medication reconciliation could be a first rational step in patient selection for a medication review. Most elderly, multimorbid patients with polymedication

  9. Kriging in the Shadows: Geostatistical Interpolation for Remote Sensing

    NASA Technical Reports Server (NTRS)

    Rossi, Richard E.; Dungan, Jennifer L.; Beck, Louisa R.

    1994-01-01

    It is often useful to estimate obscured or missing remotely sensed data. Traditional interpolation methods, such as nearest-neighbor or bilinear resampling, do not take full advantage of the spatial information in the image. An alternative method, a geostatistical technique known as indicator kriging, is described and demonstrated using a Landsat Thematic Mapper image in southern Chiapas, Mexico. The image was first classified into pasture and nonpasture land cover. For each pixel that was obscured by cloud or cloud shadow, the probability that it was pasture was assigned by the algorithm. An exponential omnidirectional variogram model was used to characterize the spatial continuity of the image for use in the kriging algorithm. Assuming a cutoff probability level of 50%, the error was shown to be 17% with no obvious spatial bias but with some tendency to categorize nonpasture as pasture (overestimation). While this is a promising result, the method's practical application in other missing data problems for remotely sensed images will depend on the amount and spatial pattern of the unobscured pixels and missing pixels and the success of the spatial continuity model used.
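
    A compact sketch of the indicator approach: classify pixels to 0/1 indicators, then krige the indicators so the estimate at an obscured pixel reads as a probability of pasture. For brevity this uses simple kriging on a 1-D transect with an assumed exponential covariance, rather than the ordinary kriging and fitted omnidirectional variogram of the study; all values are hypothetical:

```python
import math

def exp_cov(h, sill=0.25, rng=3.0):
    """Assumed exponential covariance for the 0/1 pasture indicator."""
    return sill * math.exp(-h / rng)

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small system."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k]
                              for k in range(r + 1, n))) / M[r][r]
    return x

locs = [0.0, 2.0, 5.0, 9.0]       # unobscured pixel locations on a transect
ind = [1, 1, 0, 0]                # 1 = pasture, 0 = nonpasture
p_pasture = 0.5                   # assumed global proportion of pasture
target = 3.0                      # obscured pixel to estimate

A = [[exp_cov(abs(u - v)) for v in locs] for u in locs]
b = [exp_cov(abs(u - target)) for u in locs]
w = solve(A, b)
# simple (indicator) kriging estimate of the probability of pasture
prob = p_pasture + sum(wi * (zi - p_pasture) for wi, zi in zip(w, ind))
```

    Applying a 50% cutoff to `prob`, as in the abstract, converts the probability back into a hard pasture/nonpasture label.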

  10. Spatiotemporal analysis of olive flowering using geostatistical techniques.

    PubMed

    Rojo, Jesús; Pérez-Badia, Rosa

    2015-02-01

    Analysis of flowering patterns in the olive (Olea europaea L.) is of considerable agricultural and ecological interest, and also provides valuable information for allergy-sufferers, enabling identification of the major sources of airborne pollen at any given moment by interpreting the aerobiological data recorded in pollen traps. The present spatiotemporal analysis of olive flowering in central Spain combined geostatistical techniques with a Geographic Information System, and compared results for flowering intensity with airborne pollen records. The results were used to obtain continuous phenological maps which show the pattern of succession of olive flowering. The results also show that, although the highest airborne olive-pollen counts were recorded during the greatest flowering intensity of the groves closest to the pollen trap, the counts recorded at the start of the pollen season were not linked to local olive groves, which had not yet begun to flower. To detect remote sources of olive pollen, several episodes of pollen recorded before the local flowering season were analysed using the HYSPLIT trajectory model, and the findings showed that western, southern and southwestern winds transported pollen grains into the study area from earlier-flowering groves located outside the territory. PMID:25461089

  11. An interactive Bayesian geostatistical inverse protocol for hydraulic tomography

    USGS Publications Warehouse

    Fienen, Michael N.; Clemo, Tom; Kitanidis, Peter K.

    2008-01-01

    Hydraulic tomography is a powerful technique for characterizing heterogeneous hydrogeologic parameters. An explicit trade-off between characterization based on measurement misfit and subjective characterization using prior information is presented. We apply a Bayesian geostatistical inverse approach that is well suited to accommodate a flexible model with the level of complexity driven by the data and explicitly considering uncertainty. Prior information is incorporated through the selection of a parameter covariance model characterizing continuity and providing stability. Often, discontinuities in the parameter field, typically caused by geologic contacts between contrasting lithologic units, necessitate subdivision into zones across which there is no correlation among hydraulic parameters. We propose an interactive protocol in which zonation candidates are implied from the data and are evaluated using cross validation and expert knowledge. Uncertainty introduced by limited knowledge of dynamic regional conditions is mitigated by using drawdown rather than native head values. An adjoint state formulation of MODFLOW-2000 is used to calculate sensitivities which are used both for the solution to the inverse problem and to guide protocol decisions. The protocol is tested using synthetic two-dimensional steady state examples in which the wells are located at the edge of the region of interest.

  12. Geostatistical modeling of uncertainty, simulation, and proposed applications in GIScience

    NASA Astrophysics Data System (ADS)

    Doucette, Peter; Dolloff, John; Lenihan, Michael

    2015-05-01

    Geostatistical modeling of spatial uncertainty has its roots in the mining, water and oil reservoir exploration communities, and has great potential for broader applications as proposed in this paper. This paper describes the underlying statistical models and their use in both the estimation of quantities of interest and the Monte-Carlo simulation of their uncertainty or errors, including their variance or expected magnitude and their spatial correlations or inter-relationships. These quantities can include 2D or 3D terrain locations, feature vertex locations, or any specified attributes whose statistical properties vary spatially. The simulation of spatial uncertainty or errors is a practical and powerful tool for understanding the effects of error propagation in complex systems. This paper describes various simulation techniques and trades off their generality against complexity and speed. One technique recently proposed by the authors, Fast Sequential Simulation, has the ability to simulate tens of millions of errors with specifiable variance and spatial correlations in a few seconds on a laptop computer. This ability allows for the timely evaluation of resultant output errors or the performance of a "down-stream" module or application. It also allows for near-real time evaluation when such a simulation capability is built into the application itself.

  13. Soil Organic Carbon Mapping by Geostatistics in Europe Scale

    NASA Astrophysics Data System (ADS)

    Aksoy, E.; Panagos, P.; Montanarella, L.

    2013-12-01

    Accuracy in assessing the distribution of soil organic carbon (SOC) is an important issue because SOC is an important soil component that plays key roles in the functions of both natural ecosystems and agricultural systems. The SOC content varies from place to place and is strongly related to climate variables (temperature and rainfall), terrain features, soil texture, parent material, vegetation, land-use types, and human management (management and degradation) at different spatial scales. Geostatistical techniques allow for the prediction of soil properties using soil information and environmental covariates. In this study, the SOC distribution has been predicted with the regression-kriging method at the European scale. The prediction combined soil samples collected in the LUCAS (European Land Use/Cover Area frame statistical Survey) and BioSoil projects with local soil data collected from six different CZOs in Europe and ten spatial predictors (slope, aspect, elevation, CTI, CORINE land-cover classification, parent material, texture, WRB soil classification, annual average temperature and precipitation). Significant correlation between the covariates and the organic carbon dependent variable was found. Moreover, investigating the contribution of the local, watershed-scale dataset to the regional, European-scale dataset was an important challenge.
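
    The regression-kriging workflow (a trend fitted from covariates, plus spatially interpolated residuals) can be sketched as follows. The data are synthetic, elevation is the only covariate, and inverse-distance weighting stands in for kriging of the residuals, so this shows only the shape of the method, not the study's implementation:

```python
import random

random.seed(4)
# synthetic sites: (x, y, elevation), with SOC driven by elevation + noise
sites = [(random.random() * 10, random.random() * 10, random.uniform(0, 500))
         for _ in range(50)]
soc = [20.0 - 0.02 * e + random.gauss(0, 1) for _, _, e in sites]

# step 1: ordinary least squares for the trend soc ~ a + b * elevation
n = len(sites)
me = sum(e for _, _, e in sites) / n
ms = sum(soc) / n
slope = (sum((e - me) * (s - ms) for (_, _, e), s in zip(sites, soc))
         / sum((e - me) ** 2 for _, _, e in sites))
intercept = ms - slope * me
resid = [s - (intercept + slope * e) for (_, _, e), s in zip(sites, soc)]

# step 2: interpolate the residuals spatially (IDW as a kriging stand-in)
# and add them back onto the regression trend at the prediction location
def interp_resid(tx, ty, power=2.0):
    ws = [1.0 / (((x - tx) ** 2 + (y - ty) ** 2) ** (power / 2) + 1e-9)
          for x, y, _ in sites]
    return sum(wgt * r for wgt, r in zip(ws, resid)) / sum(ws)

pred = intercept + slope * 250.0 + interp_resid(5.0, 5.0)
```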

  14. Effect of mixing on oxygen reduction, denitrification, and isotopic fractionation in a heterogeneous aquifer in an agricultural setting.

    NASA Astrophysics Data System (ADS)

    Green, C. T.; Bekins, B. A.

    2007-12-01

    Large fluxes of nitrate in agricultural settings threaten water quality worldwide. In groundwater, the transport and fate of anthropogenic nitrate can be influenced by denitrification, which occurs under anaerobic conditions. It is difficult to characterize denitrification because (1) waters from multiple flow paths commingle in the aquifer due to dispersion and (2) a sample typically includes water from multiple lithologies. For a site near Merced, California, geostatistical simulations, flow modeling, and backward random walk particle tracking were applied to simulate the influence of mixing on apparent reaction rates, redox boundaries, and fractionation of stable isotopes of nitrogen. Results show that dispersion has a strong effect on the distribution of redox indicators. When reactions are rapid, mixing can create the appearance of gradual redox transitions that mask the actual reaction rates and the degree of isotopic fractionation.

  15. A multiple-point geostatistical method for characterizing uncertainty of subsurface alluvial units and its effects on flow and transport

    USGS Publications Warehouse

    Cronkite-Ratcliff, C.; Phelps, G.A.; Boucher, A.

    2012-01-01

    This report provides a proof-of-concept to demonstrate the potential application of multiple-point geostatistics for characterizing geologic heterogeneity and its effect on flow and transport simulation. The study presented in this report is the result of collaboration between the U.S. Geological Survey (USGS) and Stanford University. This collaboration focused on improving the characterization of alluvial deposits by incorporating prior knowledge of geologic structure and estimating the uncertainty of the modeled geologic units. In this study, geologic heterogeneity of alluvial units is characterized as a set of stochastic realizations, and uncertainty is indicated by variability in the results of flow and transport simulations for this set of realizations. This approach is tested on a hypothetical geologic scenario developed using data from the alluvial deposits in Yucca Flat, Nevada. Yucca Flat was chosen as a data source for this test case because it includes both complex geologic and hydrologic characteristics and also contains a substantial amount of both surface and subsurface geologic data. Multiple-point geostatistics is used to model geologic heterogeneity in the subsurface. A three-dimensional (3D) model of spatial variability is developed by integrating alluvial units mapped at the surface with vertical drill-hole data. The SNESIM (Single Normal Equation Simulation) algorithm is used to represent geologic heterogeneity stochastically by generating 20 realizations, each of which represents an equally probable geologic scenario. A 3D numerical model is used to simulate groundwater flow and contaminant transport for each realization, producing a distribution of flow and transport responses to the geologic heterogeneity. From this distribution of flow and transport responses, the frequency of exceeding a given contaminant concentration threshold can be used as an indicator of uncertainty about the location of the contaminant plume boundary.

  16. How effective is an in-hospital heart failure self-care program in a Japanese setting? Lessons from a randomized controlled pilot study

    PubMed Central

    Kato, Naoko P; Kinugawa, Koichiro; Sano, Miho; Kogure, Asuka; Sakuragi, Fumika; Kobukata, Kihoko; Ohtsu, Hiroshi; Wakita, Sanae; Jaarsma, Tiny; Kazuma, Keiko

    2016-01-01

    Background Although the effectiveness of heart failure (HF) disease management programs has been established in Western countries, to date there have been no such programs in Japan. These programs may have different effectiveness due to differences in health care organization and possible cultural differences with regard to self-care. Therefore, the purpose of this study was to evaluate the effectiveness of a pilot HF program in a Japanese setting. Methods We developed an HF program focused on enhancing patient self-care before hospital discharge. Patients were randomized 1:1 to receive the new HF program or usual care. The primary outcome was self-care behavior as assessed by the European Heart Failure Self-Care Behavior Scale (EHFScBS). Secondary outcomes included HF knowledge and the 2-year rate of HF hospitalization and/or cardiac death. Results A total of 32 patients were enrolled (mean age, 63 years; 31% female). There was no difference in the total score of the EHFScBS between the two groups. One specific behavior score regarding a low-salt diet significantly improved compared with baseline in the intervention group. HF knowledge in the intervention group tended to improve more over 6 months than in the control group (a group-by-time effect, F=2.47, P=0.098). During a 2-year follow-up, the HF program was related to better outcomes regarding HF hospitalization and/or cardiac death (14% vs 48%, log-rank test P=0.04). In Cox regression analysis after adjustment for age, sex, and logarithmic of B-type natriuretic peptide, the program was associated with a reduction in HF hospitalization and/or cardiac death (hazard ratio, 0.17; 95% confidence interval, 0.03–0.90; P=0.04). Conclusion The HF program was likely to increase patients’ HF knowledge, change their behavior regarding a low-salt diet, and reduce HF hospitalization and/or cardiac events. Further improvement focused on the transition of knowledge to self-care behavior is necessary. PMID:26937177

  17. Massively Parallel Geostatistical Inversion of Coupled Processes in Heterogeneous Porous Media

    NASA Astrophysics Data System (ADS)

    Ngo, A.; Schwede, R. L.; Li, W.; Bastian, P.; Ippisch, O.; Cirpka, O. A.

    2012-04-01

    The quasi-linear geostatistical approach is an inversion scheme that can be used to estimate the spatial distribution of a heterogeneous hydraulic conductivity field. The estimated parameter field is considered to be a random variable that varies continuously in space, meets the measurements of dependent quantities (such as the hydraulic head, the concentration of a transported solute or its arrival time) and shows the required spatial correlation (described by certain variogram models). This is a method of conditioning a parameter field to observations. Upon discretization, this results in as many parameters as elements of the computational grid. For a full three-dimensional representation of the heterogeneous subsurface, the model-domain resolutions achievable on a serial computer (up to one million parameters) are hardly sufficient. The forward problems to be solved within the inversion procedure consist of the elliptic steady-state groundwater flow equation and the formally elliptic but nearly hyperbolic steady-state advection-dominated solute transport equation in a heterogeneous porous medium. Both equations are discretized by Finite Element Methods (FEM) using fully scalable domain decomposition techniques. Whereas standard conforming FEM is sufficient for the flow equation, for the advection-dominated transport equation, which raises well-known numerical difficulties at sharp fronts or boundary layers, we use the streamline diffusion approach. The arising linear systems are solved using efficient iterative solvers with an AMG (algebraic multigrid) pre-conditioner. During each iteration step of the inversion scheme one needs to solve a multitude of forward and adjoint problems in order to calculate the sensitivities of each measurement and the related cross-covariance matrix of the unknown parameters and the observations. In order to reduce interprocess communications and to improve the scalability of the code on larger clusters

  18. Examining the spatial distribution of flower thrips in southern highbush blueberries by utilizing geostatistical methods.

    PubMed

    Rhodes, Elena M; Liburd, Oscar E; Grunwald, Sabine

    2011-08-01

    Flower thrips (Frankliniella spp.) are one of the key pests of southern highbush blueberries (Vaccinium corymbosum L. x V. darrowii Camp), a high-value crop in Florida. Thrips' feeding and oviposition injury to flowers can result in fruit scarring that renders the fruit unmarketable. Flower thrips often form areas of high population, termed "hot spots", in blueberry plantings. The objective of this study was to model thrips spatial distribution patterns with geostatistical techniques. Semivariogram models were used to determine optimum trap spacing and two commonly used interpolation methods, inverse distance weighting (IDW) and ordinary kriging (OK), were compared for their ability to model thrips spatial patterns. The experimental design consisted of a grid of 100 white sticky traps spaced at 15.24-m and 7.61-m intervals in 2008 and 2009, respectively. Thirty additional traps were placed randomly throughout the sampling area to collect information on distances shorter than the grid spacing. The semivariogram analysis indicated that, in most cases, spacing traps at least 28.8 m apart would result in spatially independent samples. Also, the 7.61-m grid spacing captured more of the thrips spatial variability than the 15.24-m grid spacing. IDW and OK produced maps with similar accuracy in both years, which indicates that thrips spatial distribution patterns, including "hot spots," can be modeled using either interpolation method. Future studies can use this information to determine if the formation of "hot spots" can be predicted using flower density, temperature, and other environmental factors. If so, this development would allow growers to spot treat the "hot spots" rather than their entire field. PMID:22251691
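
    The way a trap-spacing recommendation falls out of the semivariogram can be sketched with a grid-search fit of a spherical model: samples separated by more than the fitted range are treated as approximately spatially independent. The empirical semivariances below are made up for illustration, not taken from the study:

```python
def spherical(h, nugget, sill, rng):
    """Spherical semivariogram model, reaching nugget + sill at h = rng."""
    if h >= rng:
        return nugget + sill
    r = h / rng
    return nugget + sill * (1.5 * r - 0.5 * r ** 3)

# hypothetical empirical semivariogram: (lag in meters, semivariance)
emp = [(7.6, 0.55), (15.2, 0.78), (22.9, 0.93), (30.5, 1.00),
       (38.1, 1.01), (45.7, 0.99)]

# brute-force least-squares fit over a small parameter grid
best = None
for rng in range(10, 61):
    for nugget in (0.0, 0.1, 0.2, 0.3, 0.4):
        sill = 1.0 - nugget
        sse = sum((g - spherical(h, nugget, sill, rng)) ** 2 for h, g in emp)
        if best is None or sse < best[0]:
            best = (sse, nugget, sill, rng)

_, nugget, sill, fitted_range = best
# traps spaced more than `fitted_range` meters apart give roughly
# independent counts, analogous to the study's 28.8-m recommendation
```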

  19. Geostatistical inspired metamodeling and optimization of nanoscale analog circuits

    NASA Astrophysics Data System (ADS)

    Okobiah, Oghenekarho

    The current trend towards miniaturization of modern consumer electronic devices significantly affects their design. The demand for efficient all-in-one appliances leads to smaller, yet more complex and powerful nanoelectronic devices. The increasing complexity in the design of such nanoscale Analog/Mixed-Signal Systems-on-Chip (AMS-SoCs) presents difficult challenges to designers. One promising design method used to mitigate the burden of this design effort is the use of metamodeling (surrogate modeling) techniques. Their use significantly reduces the time for computer simulation and design space exploration and optimization. This dissertation addresses several issues of metamodel-based nanoelectronic AMS design exploration. A surrogate modeling technique which uses geostatistical Kriging prediction methods in creating metamodels is proposed. Kriging prediction techniques take into account the correlation effects between input parameters for performance point prediction. We propose the use of Kriging to utilize this property for the accurate modeling of process variation effects of designs in the deep nanometer region. Different Kriging methods have been explored for this work, such as simple and ordinary Kriging. We also propose another metamodeling technique, Kriging-Bootstrapped Neural Network, which combines the accuracy and process variation awareness of Kriging with artificial neural network models for ultra-fast and accurate process-aware metamodeling design. The proposed methodologies combine Kriging metamodels with selected algorithms for ultra-fast layout optimization. The selected algorithms explored are: Gravitational Search Algorithm (GSA), Simulated Annealing Optimization (SAO), and Ant Colony Optimization (ACO). Experimental results demonstrate that the proposed Kriging metamodel based methodologies can perform the optimizations with minimal computational burden compared to traditional (SPICE-based) design flows.

  20. Bayesian geostatistical modeling of Malaria Indicator Survey data in Angola.

    PubMed

    Gosoniu, Laura; Veta, Andre Mia; Vounatsou, Penelope

    2010-01-01

    The 2006-2007 Angola Malaria Indicator Survey (AMIS) is the first nationally representative household survey in the country assessing coverage of the key malaria control interventions and measuring malaria-related burden among children under 5 years of age. In this paper, the Angolan MIS data were analyzed to produce the first smooth map of parasitaemia prevalence based on contemporary nationwide empirical data in the country. Bayesian geostatistical models were fitted to assess the effect of interventions after adjusting for environmental, climatic and socio-economic factors. Non-linear relationships between parasitaemia risk and environmental predictors were modeled by categorizing the covariates and by employing two non-parametric approaches, the B-splines and the P-splines. The results of the model validation showed that the categorical model was able to better capture the relationship between parasitaemia prevalence and the environmental factors. Model fit and prediction were handled within a Bayesian framework using Markov chain Monte Carlo (MCMC) simulations. Combining estimates of parasitaemia prevalence with the number of children under we obtained estimates of the number of infected children in the country. The population-adjusted prevalence ranges from in Namibe province to in Malanje province. The odds of parasitaemia in children living in a household with at least ITNs per person was by 41% lower (CI: 14%, 60%) than in those with fewer ITNs. The estimates of the number of parasitaemic children produced in this paper are important for planning and implementing malaria control interventions and for monitoring the impact of prevention and control activities. PMID:20351775

  2. A geostatistical approach to mapping site response spectral amplifications

    USGS Publications Warehouse

    Thompson, E.M.; Baise, L.G.; Kayen, R.E.; Tanaka, Y.; Tanaka, H.

    2010-01-01

    If quantitative estimates of the seismic properties do not exist at a location of interest, then the site response spectral amplifications must be estimated from data collected at other locations. Currently, the most common approach employs correlations of site class with maps of surficial geology. Analogously, correlations of site class with topographic slope can be employed where the surficial geology is unknown. Our goal is to identify and validate a method to estimate site response with greater spatial resolution and accuracy for regions where additional effort is warranted. This method consists of three components: region-specific data collection, a spatial model for interpolating seismic properties, and a theoretical method for computing spectral amplifications from the interpolated seismic properties. We consider three spatial interpolation schemes: correlations with surficial geology, termed the geologic trend (GT), ordinary kriging (OK), and kriging with a trend (KT). We estimate the spectral amplifications from seismic properties using the square root of impedance method, thereby linking the frequency-dependent spectral amplifications to the depth-dependent seismic properties. Thus, the range of periods for which this method is applicable is limited by the depth of exploration. A dense survey of near-surface S-wave slowness (Ss) throughout Kobe, Japan shows that the geostatistical methods give more accurate estimates of Ss than the topographic slope and GT methods, and the OK and KT methods perform equally well. We prefer the KT model because it can be seamlessly integrated with geologic maps that cover larger regions. Empirical spectral amplifications show that the region-specific data achieve more accurate estimates of observed median short-period amplifications than the topographic slope method. © 2010 Elsevier B.V.
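
The square-root-of-impedance method links amplification to the travel-time-averaged velocity of the explored profile, which is why the usable period range is limited by exploration depth. A minimal sketch of the two ingredients (an illustration under simplified assumptions, not the authors' implementation; all numbers are hypothetical):

```python
def travel_time_average_vs(thicknesses, velocities, depth):
    """Travel-time (slowness) averaged S-wave velocity over the top `depth`
    meters of a layered profile; the averaging depth bounds the lowest
    usable period of the amplification estimate."""
    t, z = 0.0, 0.0
    for h, v in zip(thicknesses, velocities):
        dz = min(h, depth - z)
        if dz <= 0:
            break
        t += dz / v   # travel time through the (partial) layer
        z += dz
    return z / t

def sqrt_impedance_amplification(rho_ref, vs_ref, rho_site, vs_site):
    """Amplification as the square root of the impedance ratio between
    reference rock (rho_ref, vs_ref) and the averaged site profile."""
    return ((rho_ref * vs_ref) / (rho_site * vs_site)) ** 0.5
```

For example, a soft 30 m profile averaged by travel time can be fed directly into the impedance ratio; deeper exploration extends the average, and hence the amplification estimate, to longer periods.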

  3. Use of geostatistical modeling to capture complex geology in finite-element analyses

    SciTech Connect

    Rautman, C.A.; Longenbaugh, R.S.; Ryder, E.E.

    1995-12-01

    This paper summarizes a number of transient thermal analyses performed for a representative two-dimensional cross section of volcanic tuffs at Yucca Mountain using the finite element, nonlinear heat-conduction code COYOTE-II. In addition to conventional design analyses, in which material properties are formulated as a single uniform material and as horizontally layered, internally uniform materials, an attempt was made to increase the resemblance of the thermal property field to the actual geology by creating two fairly complex, geologically realistic models. The first model was created by digitizing an existing two-dimensional geologic cross section of Yucca Mountain. The second model was created using conditional geostatistical simulation. Direct mapping of geostatistically generated material property fields onto finite element computational meshes was demonstrated to yield temperature fields approximately equivalent to those generated through more conventional procedures. However, the ability to use the geostatistical models offers a means of simplifying the physical-process analyses.

  4. Use of geostatistics for remediation planning to transcend urban political boundaries.

    PubMed

    Milillo, Tammy M; Sinha, Gaurav; Gardella, Joseph A

    2012-11-01

    Soil remediation plans are often dictated by areas of jurisdiction or property lines instead of scientific information. This study exemplifies how geostatistically interpolated surfaces can substantially improve remediation planning. Ordinary kriging, ordinary co-kriging, and inverse distance weighting spatial interpolation methods were compared for analyzing surface and sub-surface soil sample data originally collected by the US EPA and researchers at the University at Buffalo in Hickory Woods, an industrial-residential neighborhood in Buffalo, NY, where both lead and arsenic contamination is present. Past clean-up efforts estimated contamination levels from point samples, but parcel and agency jurisdiction boundaries were used to define remediation sites, rather than geostatistical models estimating the spatial behavior of the contaminants in the soil. Residents were understandably dissatisfied with the arbitrariness of the remediation plan. In this study we show how geostatistical mapping and participatory assessment can make soil remediation scientifically defensible, socially acceptable, and economically feasible. PMID:22771352
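
Of the three interpolators compared in the study, inverse distance weighting is the simplest; a minimal sketch (illustrative only, not the study's GIS workflow):

```python
def idw(sample_xy, sample_z, query_xy, power=2.0):
    """Inverse distance weighted estimate at query_xy from scattered 2-D
    samples. A sample coinciding with the query point is returned exactly."""
    num = den = 0.0
    for (x, y), z in zip(sample_xy, sample_z):
        d2 = (x - query_xy[0]) ** 2 + (y - query_xy[1]) ** 2
        if d2 == 0.0:
            return z                     # exact hit: no interpolation needed
        w = d2 ** (-power / 2.0)         # weight = 1 / distance**power
        num += w * z
        den += w
    return num / den
```

Unlike IDW, kriging and co-kriging also yield an estimation variance at each location, which is part of what makes geostatistically derived remediation boundaries scientifically defensible.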

  5. Geostatistical analysis of the spatial variation of the ash reserves in the litter of bog birch forests in Western Siberia

    NASA Astrophysics Data System (ADS)

    Efremova, T. T.; Sekretenko, O. P.; Avrova, A. F.; Efremov, S. P.

    2013-01-01

    A typological series of native Betula pubescens Ehrh. dendrocenoses along the channel of a river crossing a bog was studied. The variability of the mineral element reserves is described by geostatistical methods as the sum of trend, autocorrelation, and random components. The contribution of deterministic and random components has been assessed in years with average precipitation and in 2007, a year with high and prolonged flooding. The empirical variograms and the parameters of the model variograms are presented. The class of the spatial correlation of the ash reserves is described. A primary cause of the ash content's variability is the specific water regime, which is determined by the following: (i) the abundance and duration of the spring floods responsible for the silt mass brought by the river and (ii) the draining effect of the intrabog river, the distance from which determined the formation of ground cover with a specific species composition and ash content. Litter fall from the arboreal layer in the bog birch forests formed the basic mineral background of the litter.

  6. Geostatistical analysis of groundwater level using Euclidean and non-Euclidean distance metrics and variable variogram fitting criteria

    NASA Astrophysics Data System (ADS)

    Theodoridou, Panagiota G.; Karatzas, George P.; Varouchakis, Emmanouil A.; Corzo Perez, Gerald A.

    2015-04-01

    Groundwater level is an important input in hydrological modelling. Geostatistical methods are often employed to map the free surface of an aquifer. In geostatistical analysis using Kriging techniques, the selection of the optimal variogram model is very important for method performance. This work compares three different criteria, the least squares sum method, the Akaike Information Criterion and Cressie's indicator, for assessing how well a theoretical variogram fits the experimental one, and investigates the impact on the prediction results. Moreover, five different distance functions (Euclidean, Minkowski, Manhattan, Canberra, and Bray-Curtis) are applied to calculate the distance between observations, which affects both the variogram calculation and the Kriging estimator. Cross-validation analysis in terms of Ordinary Kriging is applied by sequentially using each distance metric with each of the three variogram fitting criteria. The spatial dependence of the observations in the tested dataset is studied by fitting classical variogram models and the Matérn model. The proposed comparison analysis was performed for a data set of two hundred fifty hydraulic head measurements distributed over an alluvial aquifer that covers an area of 210 km2. The study area is located in the Prefecture of Drama, which belongs to the Water District of East Macedonia (Greece). This area was selected in terms of hydro-geological data availability and geological homogeneity. The analysis showed that a combination of the Akaike Information Criterion for the variogram fitting assessment and the Bray-Curtis distance metric provided the most accurate cross-validation results. The power-law variogram model provided the best fit to the experimental data. For this dataset, the aforementioned approach improves the prediction efficiency of Ordinary Kriging in comparison to the classical Euclidean distance metric. Therefore, maps of the spatial
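
The object at the center of this comparison, an experimental semivariogram computed under different distance metrics, can be sketched as follows (a simplified illustration; the study's fitting criteria such as AIC are not reproduced here):

```python
import numpy as np

def experimental_variogram(coords, values, lag_edges, metric="euclidean", p=2.0):
    """Experimental semivariogram binned by lag distance, with a pluggable
    distance metric: 'euclidean', 'manhattan', or 'minkowski' with exponent p."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    n = len(values)
    dists, gammas = [], []
    for i in range(n):
        for j in range(i + 1, n):
            diff = np.abs(coords[i] - coords[j])
            if metric == "manhattan":
                d = diff.sum()
            elif metric == "minkowski":
                d = (diff ** p).sum() ** (1.0 / p)
            else:  # euclidean
                d = np.sqrt((diff ** 2).sum())
            dists.append(d)
            gammas.append(0.5 * (values[i] - values[j]) ** 2)  # semivariance of the pair
    dists = np.array(dists)
    gammas = np.array(gammas)
    semiv = []
    for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
        in_bin = (dists >= lo) & (dists < hi)
        semiv.append(gammas[in_bin].mean() if in_bin.any() else float("nan"))
    return np.array(semiv)
```

Swapping the metric changes the pair distances, and hence which lag bin each pair falls into, which is exactly why the choice of distance function propagates into both the fitted variogram and the kriging estimator.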

  7. Systematic evaluation of sequential geostatistical resampling within MCMC for posterior sampling of near-surface geophysical inverse problems

    NASA Astrophysics Data System (ADS)

    Ruggeri, Paolo; Irving, James; Holliger, Klaus

    2015-08-01

    We critically examine the performance of sequential geostatistical resampling (SGR) as a model proposal mechanism for Bayesian Markov-chain-Monte-Carlo (MCMC) solutions to near-surface geophysical inverse problems. Focusing on a series of simple yet realistic synthetic crosshole georadar tomographic examples characterized by different numbers of data, levels of data error and degrees of model parameter spatial correlation, we investigate the efficiency of three different resampling strategies with regard to their ability to generate statistically independent realizations from the Bayesian posterior distribution. Quite importantly, our results show that, no matter what resampling strategy is employed, many of the examined test cases require an unreasonably high number of forward model runs to produce independent posterior samples, meaning that the SGR approach as currently implemented will not be computationally feasible for a wide range of problems. Although use of a novel gradual-deformation-based proposal method can help to alleviate these issues, it does not offer a full solution. Further, we find that the nature of the SGR proposal strongly influences MCMC performance; however, no clear rule exists as to what set of inversion parameters and/or overall proposal acceptance rate will allow for the most efficient implementation. We conclude that although the SGR methodology is highly attractive as it allows for the consideration of complex geostatistical priors as well as conditioning to hard and soft data, further developments are necessary in the context of novel or hybrid MCMC approaches for it to be considered generally suitable for near-surface geophysical inversions.

  8. Error modeling based on geostatistics for uncertainty analysis in crop mapping using Gaofen-1 multispectral imagery

    NASA Astrophysics Data System (ADS)

    You, Jiong; Pei, Zhiyuan

    2015-01-01

    With the development of remote sensing technology, its applications in agricultural monitoring systems, crop mapping accuracy, and crop spatial distribution are increasingly being explored by administrators and users. Uncertainty in crop mapping is profoundly affected by the spatial pattern of spectral reflectance values obtained from the applied remote sensing data. Errors in remotely sensed crop cover information and their propagation into derivative products need to be quantified and handled correctly. Therefore, this study discusses methods of error modeling for uncertainty characterization in crop mapping using GF-1 multispectral imagery. An error modeling framework based on geostatistics is proposed, which introduces the sequential Gaussian simulation algorithm to explore the relationship between classification errors and the spectral signature of the remote sensing data source. On this basis, a misclassification probability model is developed to produce a spatially explicit classification error probability surface for the map of a crop, which realizes the uncertainty characterization for crop mapping. In this process, trend surface analysis was carried out to generate a spatially varying mean response and the corresponding residual response with spatial variation for the spectral bands of GF-1 multispectral imagery. Variogram models were employed to measure the spatial dependence in the spectral bands and the derived misclassification probability surfaces. Simulated spectral data and classification results were quantitatively analyzed. Through experiments using data sets from a region of low rolling country located in the Yangtze River valley, it was found that GF-1 multispectral imagery can be used for crop mapping with a good overall performance, the proposed error modeling framework can be used to quantify the uncertainty in crop mapping, and the misclassification probability model can summarize the spatial variation in map accuracy and is helpful for
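
The sequential Gaussian simulation algorithm at the core of such frameworks visits locations in random order and draws each value from a Gaussian conditioned on everything simulated so far. A minimal 1-D, unconditional sketch assuming an exponential covariance (the paper's fitted variograms and conditioning data are not reproduced):

```python
import numpy as np

def sgs_1d(grid_x, sill=1.0, range_=1.0, rng=None):
    """Unconditional sequential Gaussian simulation on a 1-D grid with an
    assumed exponential covariance C(h) = sill * exp(-3|h|/range_). Nodes
    are visited in random order; each value is drawn from the Gaussian
    conditional (simple kriging mean/variance) on previously simulated nodes."""
    rng = np.random.default_rng(rng)
    grid_x = np.asarray(grid_x, dtype=float)
    cov = lambda h: sill * np.exp(-3.0 * np.abs(h) / range_)
    out = np.empty(len(grid_x))
    known_x, known_z = [], []
    for idx in rng.permutation(len(grid_x)):
        if not known_x:
            mu, var = 0.0, sill          # first node: draw from the prior
        else:
            xs = np.array(known_x)
            k = cov(xs - grid_x[idx])
            w = np.linalg.solve(cov(xs[:, None] - xs[None, :]), k)
            mu = float(w @ np.array(known_z))
            var = max(sill - float(w @ k), 1e-12)
        out[idx] = rng.normal(mu, var ** 0.5)
        known_x.append(grid_x[idx])
        known_z.append(out[idx])
    return out
```

Conditioning to observed data amounts to seeding `known_x`/`known_z` with the data before the random path begins; repeated runs give the ensemble of equally probable realizations from which error probability surfaces are built.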

  9. UNCERT: geostatistics, uncertainty analysis and visualization software applied to groundwater flow and contaminant transport modeling

    NASA Astrophysics Data System (ADS)

    Wingle, William L.; Poeter, Eileen P.; McKenna, Sean A.

    1999-05-01

    UNCERT is a 2D and 3D geostatistics, uncertainty analysis and visualization software package applied to ground water flow and contaminant transport modeling. It is a collection of modules that provides tools for linear regression, univariate statistics, semivariogram analysis, inverse-distance gridding, trend-surface analysis, simple and ordinary kriging and discrete conditional indicator simulation. Graphical user interfaces for MODFLOW and MT3D, ground water flow and contaminant transport models, are provided for streamlined data input and result analysis. Visualization tools are included for displaying data input and output. These include, but are not limited to, 2D and 3D scatter plots, histograms, box and whisker plots, 2D contour maps, surface renderings of 2D gridded data and 3D views of gridded data. By design, UNCERT's graphical user interface and visualization tools facilitate model design and analysis. There are few built-in restrictions on data set sizes and each module (with two exceptions) can be run in either graphical or batch mode. UNCERT is in the public domain and is available from the World Wide Web with complete on-line and printable (PDF) documentation. UNCERT is written in ANSI-C with a small amount of FORTRAN77, for UNIX workstations running X-Windows and Motif (or Lesstif). This article discusses the features of each module and demonstrates how they can be used individually and in combination. The tools are applicable to a wide range of fields and are currently used by researchers in ground water, mining, mathematics, chemistry and geophysics, to name a few disciplines.

  10. Geostatistical analysis of soil geochemical data from an industrial area (Puertollano, South-Central Spain).

    NASA Astrophysics Data System (ADS)

    Esbrí, José M.; Higueras, Pablo; López-Berdonces, Miguel A.; García-Noguero, Eva M.; González-Corrochano, Beatriz; Fernández-Calderón, Sergio; Martínez-Coronado, Alba

    2015-04-01

    Puertollano is the biggest industrial city of Castilla-La Mancha, with 48,086 inhabitants. It is located 250 km south of Madrid, on the northern border of the Ojailén River valley. The industrial area includes a big coal open pit (ENCASUR), two power plants (EON and ELCOGAS), a petrochemical complex (REPSOL) and a fertiliser factory (ENFERSA), all located in the proximity of the town. Together these industries pose a complex scenario of metal and metalloid emissions. For instance, mercury emissions declared to the PRTR inventory during 2010 were 210 kg year-1 (REPSOL), 130 kg year-1 (ELCOGAS) and 11.9 kg year-1 (EON). In addition, there may remain unaccounted diffuse sources of other potentially toxic elements from the different industrial sites. Multielemental analyses of soils from two different depths covering the whole valley were carried out by means of XRF with a portable Oxford Instruments device. Geostatistical data treatment was performed using SURFER software, applying block kriging to obtain interpolation maps for the study area. Semivariograms of elemental concentrations make a clear distinction between volatile (Hg, Se) and non-volatile elements (Cu, Ni), with differences in scales and variances between the two soil horizons considered. Semivariograms also show different models for elements emitted by combustion processes (Ni) and for anomalous elements from the geological substrate (Pb, Zn). In addition to differences in anisotropy of the data, these models reflect different forms of elemental dispersion; despite this, identification of particular sources for the different elements is not possible for this geochemical data set.

  11. Quantifying the Relationship between Dynamical Cores and Physical Parameterizations by Geostatistical Methods

    NASA Astrophysics Data System (ADS)

    Yorgun, M. S.; Rood, R. B.

    2010-12-01

    The behavior of atmospheric models is sensitive to the algorithms that are used to represent the equations of motion. Typically, comprehensive models are conceived in terms of the resolved fluid dynamics (i.e. the dynamical core) and subgrid, unresolved physics represented by parameterizations. Deterministic weather predictions are often validated with feature-by-feature comparison. Probabilistic weather forecasts and climate projections are evaluated with statistical methods. We seek to develop model evaluation strategies that identify like “objects” - coherent systems with an associated set of measurable parameters. This makes it possible to evaluate processes in models without needing to reproduce the time and location of, for example, a particular observed cloud system. Process- and object-based evaluation preserves information in the observations by avoiding the need for extensive spatial and temporal averaging. As a concrete example, we focus on analyzing how the choice of dynamical core impacts the representation of precipitation in the Pacific Northwest of the United States, Western Canada, and Alaska; this brings attention to the interaction of the resolved and the parameterized components of the model. Two dynamical cores are considered within the Community Atmosphere Model. These are the Spectral (Eulerian), which relies on global basis functions, and the Finite Volume (FV), which uses only local information. We introduce the concept of "meteorological realism": that is, do local representations of large-scale phenomena, for example, fronts and orographic precipitation, look like the observations? A follow-on question is, does the representation of these phenomena improve with resolution? Our approach to quantifying meteorological realism starts with methods of geospatial statistics. Specifically, we employ variography, a geostatistical method used to measure the spatial continuity of a regionalized variable, and principal component

  12. Bayesian Geostatistical Analysis and Prediction of Rhodesian Human African Trypanosomiasis

    PubMed Central

    Wardrop, Nicola A.; Atkinson, Peter M.; Gething, Peter W.; Fèvre, Eric M.; Picozzi, Kim; Kakembo, Abbas S. L.; Welburn, Susan C.

    2010-01-01

    Background The persistent spread of Rhodesian human African trypanosomiasis (HAT) in Uganda in recent years has increased concerns of a potential overlap with the Gambian form of the disease. Recent research has aimed to increase the evidence base for targeting control measures by focusing on the environmental and climatic factors that control the spatial distribution of the disease. Objectives One recent study used simple logistic regression methods to explore the relationship between prevalence of Rhodesian HAT and several social, environmental and climatic variables in two of the most recently affected districts of Uganda, and suggested the disease had spread into the study area due to the movement of infected, untreated livestock. Here we extend this study to account for spatial autocorrelation, incorporate uncertainty in input data and model parameters and undertake predictive mapping for risk of high HAT prevalence in future. Materials and Methods We use a spatial analysis in which a generalised linear geostatistical model is fitted in a Bayesian framework to account explicitly for spatial autocorrelation and to incorporate uncertainty in input data and model parameters. This more rigorous analytical approach potentially yields more accurate parameter and significance estimates and increased predictive accuracy, allowing the validity of the livestock movement hypothesis to be assessed on the basis of more robust parameter estimation and appropriate assessment of covariate effects. Results The analysis strongly supports the theory that Rhodesian HAT was imported to the study area via the movement of untreated, infected livestock from endemic areas. The confounding effect of health care accessibility on the spatial distribution of Rhodesian HAT and the linkages between the disease's distribution and minimum land surface temperature have also been confirmed via the application of these methods. Conclusions Predictive mapping indicates an

  13. Spectral turning bands for efficient Gaussian random fields generation on GPUs and accelerators

    NASA Astrophysics Data System (ADS)

    Hunger, L.; Cosenza, B.; Kimeswenger, S.; Fahringer, T.

    2015-11-01

    A random field (RF) is a set of correlated random variables associated with different spatial locations. RF generation algorithms are of crucial importance for many scientific areas, such as astrophysics, geostatistics, computer graphics, and many others. Current approaches commonly make use of 3D fast Fourier transform (FFT), which does not scale well for RF bigger than the available memory; they are also limited to regular rectilinear meshes. We introduce random field generation with the turning band method (RAFT), an RF generation algorithm based on the turning band method that is optimized for massively parallel hardware such as GPUs and accelerators. Our algorithm replaces the 3D FFT with a lower-order, one-dimensional FFT followed by a projection step and is further optimized with loop unrolling and blocking. RAFT can easily generate RF on non-regular (non-uniform) meshes and efficiently produce fields with mesh sizes bigger than the available device memory by using a streaming, out-of-core approach. Our algorithm generates RF with the correct statistical behavior and is tested on a variety of modern hardware, such as NVIDIA Tesla, AMD FirePro and Intel Phi. RAFT is faster than the traditional methods on regular meshes and has been successfully applied to two real case scenarios: planetary nebulae and cosmological simulations.
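
The replacement of a full 3D FFT by superposed one-dimensional processes can be illustrated with the randomized spectral (cosine) construction that spectral turning-band generators build on. This sketch assumes a squared-exponential covariance and is not the RAFT implementation:

```python
import numpy as np

def spectral_gaussian_field(points, n_bands=1000, length_scale=1.0, rng=None):
    """Approximate a stationary Gaussian random field by superposing cosine
    waves ('bands') with random frequencies and phases. Frequencies are drawn
    from the spectral measure of a squared-exponential covariance with the
    given length scale, so the field has (approximately) that covariance."""
    rng = np.random.default_rng(rng)
    points = np.asarray(points, dtype=float)          # shape (n_points, dim)
    # spectral density of exp(-r^2 / (2 l^2)) is Gaussian with std 1/l per axis
    freqs = rng.normal(0.0, 1.0 / length_scale,
                       size=(n_bands, points.shape[1]))
    phases = rng.uniform(0.0, 2.0 * np.pi, n_bands)
    # each band is a 1-D cosine evaluated along its random direction
    return np.sqrt(2.0 / n_bands) * np.cos(points @ freqs.T + phases).sum(axis=1)
```

Because each point is evaluated independently against the bands, the method works on arbitrary (non-rectilinear) point sets and can stream points through memory, the two properties the abstract highlights over FFT-based generation.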

  14. Delivering Prevention Interventions to People Living with HIV in Clinical Care Settings: Results of a Cluster Randomized Trial in Kenya, Namibia, and Tanzania

    PubMed Central

    Kidder, Daniel; Medley, Amy; Pals, Sherri L.; Carpenter, Deborah; Howard, Andrea; Antelman, Gretchen; DeLuca, Nicolas; Muhenje, Odylia; Sheriff, Muhsin; Somi, Geoffrey; Katuta, Frieda; Cherutich, Peter; Moore, Janet

    2016-01-01

    We conducted a group randomized trial to assess the feasibility and effectiveness of a multi-component, clinic-based HIV prevention intervention for HIV-positive patients attending clinical care in Namibia, Kenya, and Tanzania. Eighteen HIV care and treatment clinics (six per country) were randomly assigned to intervention or control arms. Approximately 200 sexually active clients from each clinic were enrolled and interviewed at baseline and 6- and 12-months post-intervention. Mixed model logistic regression with random effects for clinic and participant was used to assess the effectiveness of the intervention. Of 3522 HIV-positive patients enrolled, 3034 (86 %) completed a 12-month follow-up interview. Intervention participants were significantly more likely to report receiving provider-delivered messages on disclosure, partner testing, family planning, alcohol reduction, and consistent condom use compared to participants in comparison clinics. Participants in intervention clinics were less likely to report unprotected sex in the past 2 weeks (OR = 0.56, 95 % CI 0.32, 0.99) compared to participants in comparison clinics. In Tanzania, a higher percentage of participants in intervention clinics (17 %) reported using a highly effective method of contraception compared to participants in comparison clinics (10 %, OR = 2.25, 95 % CI 1.24, 4.10). This effect was not observed in Kenya or Namibia. HIV prevention services are feasible to implement as part of routine care and are associated with a self-reported decrease in unprotected sex. Further operational research is needed to identify strategies to address common operational challenges including staff turnover and large patient volumes. PMID:26995678

  15. Delivering Prevention Interventions to People Living with HIV in Clinical Care Settings: Results of a Cluster Randomized Trial in Kenya, Namibia, and Tanzania.

    PubMed

    Bachanas, Pamela; Kidder, Daniel; Medley, Amy; Pals, Sherri L; Carpenter, Deborah; Howard, Andrea; Antelman, Gretchen; DeLuca, Nicolas; Muhenje, Odylia; Sheriff, Muhsin; Somi, Geoffrey; Katuta, Frieda; Cherutich, Peter; Moore, Janet

    2016-09-01

    We conducted a group randomized trial to assess the feasibility and effectiveness of a multi-component, clinic-based HIV prevention intervention for HIV-positive patients attending clinical care in Namibia, Kenya, and Tanzania. Eighteen HIV care and treatment clinics (six per country) were randomly assigned to intervention or control arms. Approximately 200 sexually active clients from each clinic were enrolled and interviewed at baseline and 6- and 12-months post-intervention. Mixed model logistic regression with random effects for clinic and participant was used to assess the effectiveness of the intervention. Of 3522 HIV-positive patients enrolled, 3034 (86 %) completed a 12-month follow-up interview. Intervention participants were significantly more likely to report receiving provider-delivered messages on disclosure, partner testing, family planning, alcohol reduction, and consistent condom use compared to participants in comparison clinics. Participants in intervention clinics were less likely to report unprotected sex in the past 2 weeks (OR = 0.56, 95 % CI 0.32, 0.99) compared to participants in comparison clinics. In Tanzania, a higher percentage of participants in intervention clinics (17 %) reported using a highly effective method of contraception compared to participants in comparison clinics (10 %, OR = 2.25, 95 % CI 1.24, 4.10). This effect was not observed in Kenya or Namibia. HIV prevention services are feasible to implement as part of routine care and are associated with a self-reported decrease in unprotected sex. Further operational research is needed to identify strategies to address common operational challenges including staff turnover and large patient volumes. PMID:26995678

  16. Combined assimilation of streamflow and satellite soil moisture with the particle filter and geostatistical modeling

    NASA Astrophysics Data System (ADS)

    Yan, Hongxiang; Moradkhani, Hamid

    2016-08-01

    Assimilation of satellite soil moisture and streamflow data into a distributed hydrologic model has received increasing attention over the past few years. This study provides a detailed analysis of the joint and separate assimilation of streamflow and Advanced Scatterometer (ASCAT) surface soil moisture into a distributed Sacramento Soil Moisture Accounting (SAC-SMA) model, with the use of the recently developed particle filter-Markov chain Monte Carlo (PF-MCMC) method. Performance is assessed over the Salt River Watershed in Arizona, one of the watersheds without anthropogenic effects in the Model Parameter Estimation Experiment (MOPEX). A total of five data assimilation (DA) scenarios are designed and the effects of the locations of streamflow gauges and the ASCAT soil moisture on the predictions of soil moisture and streamflow are assessed. In addition, a geostatistical model is introduced to overcome the significant bias and spatial discontinuity of the satellite soil moisture data. The results indicate that: (1) solely assimilating outlet streamflow can lead to biased soil moisture estimation; (2) when the study area can only be partially covered by the satellite data, the geostatistical approach can estimate the soil moisture for those uncovered grid cells; (3) joint assimilation of streamflow and soil moisture from geostatistical modeling can further improve the surface soil moisture prediction. This study recommends the geostatistical model as a helpful tool to complement remote sensing techniques in hydrologic DA studies.
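
The particle filter component can be sketched with a minimal bootstrap (sequential importance resampling) filter for a scalar random-walk state. This illustrates the PF idea only; the study's PF-MCMC scheme and the SAC-SMA model are not reproduced, and all parameters are hypothetical:

```python
import numpy as np

def bootstrap_particle_filter(observations, n_particles=1000,
                              process_std=1.0, obs_std=1.0, rng=None):
    """Bootstrap particle filter for a random-walk state observed with
    Gaussian noise; returns the posterior-mean state estimate per step."""
    rng = np.random.default_rng(rng)
    particles = rng.normal(0.0, 1.0, n_particles)   # prior ensemble
    estimates = []
    for y in observations:
        # propagate through the (assumed) random-walk process model
        particles = particles + rng.normal(0.0, process_std, n_particles)
        # weight by the Gaussian observation likelihood
        w = np.exp(-0.5 * ((y - particles) / obs_std) ** 2)
        w /= w.sum()
        estimates.append(float(np.sum(w * particles)))
        # multinomial resampling to avoid weight degeneracy
        particles = rng.choice(particles, size=n_particles, p=w)
    return estimates
```

In a hydrologic DA setting, the state would be the model's soil moisture storages, and the observation step would weight particles against streamflow and (bias-corrected) satellite soil moisture.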

  17. Adapting geostatistics to analyze spatial and temporal trends in weed populations

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Geostatistics were originally developed in mining to estimate the location, abundance and quality of ore over large areas from soil samples to optimize future mining efforts. Here, some of these methods were adapted to weeds to account for a limited distribution area (i.e., inside a field), variatio...

  18. The use of geostatistics in the study of floral phenology of Vulpia geniculata (L.) link.

    PubMed

    León Ruiz, Eduardo J; García Mozo, Herminia; Domínguez Vilches, Eugenio; Galán, Carmen

    2012-01-01

    Traditionally, phenology studies have been focused on changes through time, but there exist many instances in ecological research where it is necessary to interpolate among spatially stratified samples. The combined use of Geographical Information Systems (GIS) and Geostatistics can be an essential tool for spatial analysis in phenological studies. Geostatistics are a family of statistics that describe correlations through space/time, and they can be used both for quantifying spatial correlation and for interpolating unsampled points. In the present work, estimations based upon Geostatistics and GIS mapping have enabled the construction of spatial models that reflect the phenological evolution of Vulpia geniculata (L.) Link throughout the study area during the sampling season. Ten sampling points, scattered throughout the city and low mountains in the "Sierra de Córdoba", were chosen to carry out weekly phenological monitoring during the flowering season. The phenological data were interpolated by applying the traditional geostatistical method of Kriging, which was used to elaborate weekly estimations of V. geniculata phenology in unsampled areas. Finally, the application of Geostatistics and GIS to create phenological maps could be an essential complement in pollen aerobiological studies, given the increased interest in obtaining automatic aerobiological forecasting maps. PMID:22629169
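
Ordinary kriging, the interpolator used here for the weekly phenology maps, solves a small linear system whose weights are constrained to sum to one. A minimal sketch assuming an exponential covariance model (the study's fitted variograms are not reproduced):

```python
import numpy as np

def ordinary_kriging(coords, values, query, sill=1.0, range_=1.0):
    """Ordinary kriging estimate at `query` under an assumed exponential
    covariance C(h) = sill * exp(-3h / range_). The unknown constant mean is
    handled by a Lagrange multiplier enforcing sum(weights) == 1."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    n = len(values)
    cov = lambda h: sill * np.exp(-3.0 * h / range_)
    # kriging system: [C 1; 1' 0] [w; mu] = [c0; 1]
    A = np.ones((n + 1, n + 1))
    A[n, n] = 0.0
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    A[:n, :n] = cov(d)
    b = np.ones(n + 1)
    b[:n] = cov(np.linalg.norm(coords - np.asarray(query, dtype=float), axis=1))
    w = np.linalg.solve(A, b)
    return float(w[:n] @ values)
```

With no nugget effect, the estimator honors the data exactly at sampled locations, which is why kriged phenology surfaces reproduce the ten monitoring points while smoothing between them.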

  19. Integrated geostatistics for modeling fluid contacts and shales in Prudhoe Bay

    SciTech Connect

    Perez, G.; Chopra, A.K.; Severson, C.D.

    1997-12-01

    Geostatistics techniques are being used increasingly to model reservoir heterogeneity at a wide range of scales. A variety of techniques is now available with differing underlying assumptions, complexity, and applications. This paper introduces a novel geostatistical method for modeling dynamic gas-oil contacts and shales in the Prudhoe Bay reservoir. The method integrates reservoir description and surveillance data within the same geostatistical framework. Surveillance logs and shale data are transformed to indicator variables. These variables are used to evaluate vertical and horizontal spatial correlation and cross-correlation of gas and shale at different times and to develop variogram models. Conditional simulation techniques are used to generate multiple three-dimensional (3D) descriptions of gas and shales that provide a measure of uncertainty. These techniques capture the complex 3D distribution of gas-oil contacts through time. The authors compare results of the geostatistical method with conventional techniques as well as with infill wells drilled after the study. Predicted gas-oil contacts and shale distributions are in close agreement with gas-oil contacts observed at infill wells.
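
The indicator transform mentioned above recodes observations as 0/1 variables so that variograms and conditional simulation can be applied to categories (e.g. shale vs. non-shale) or threshold exceedances. A minimal sketch (the category and threshold values are illustrative):

```python
def threshold_indicator(values, threshold):
    """Continuous-to-indicator transform: 1 where value <= threshold, else 0."""
    return [1 if v <= threshold else 0 for v in values]

def category_indicator(labels, category):
    """Categorical indicator: 1 where the log records `category`, else 0."""
    return [1 if lab == category else 0 for lab in labels]
```

Variograms of these indicators then describe the spatial continuity of the category itself, and averaging simulated indicators over many realizations yields the probability of, say, gas or shale at each grid cell.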

  20. USING GEOSTATISTICS TO UNDERSTAND THE SPATIAL DISTRIBUTION OF SOIL PROPERTIES AND THE FIELD DISSIPATION OF HERBICIDES

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Current research in precision agriculture has focused on solving problems dealing with soil fertility and crop yield. Specifically, geostatistical methods are being used to link the spatial distribution in soil physical and chemical properties to spatial patterns in crop yield. Because soil proper...

  1. Estimating Solar PV Output Using Modern Space/Time Geostatistics (Presentation)

    SciTech Connect

    Lee, S. J.; George, R.; Bush, B.

    2009-04-29

    This presentation describes a project that uses mapping techniques to predict solar output at subhourly resolution at any spatial point, develops a methodology that is applicable to natural resources in general, and demonstrates the capability of geostatistical techniques to predict the output of a potential solar plant.

  2. Combining Geostatistics with Moran’s I Analysis for Mapping Soil Heavy Metals in Beijing, China

    PubMed Central

    Huo, Xiao-Ni; Li, Hong; Sun, Dan-Feng; Zhou, Lian-Di; Li, Bao-Guo

    2012-01-01

    Production of high-quality interpolation maps of heavy metals is important for risk assessment of environmental pollution. In this paper, the spatial correlation information obtained from Moran’s I analysis was used to supplement traditional geostatistics. From the Moran’s I analysis, four characteristic distances were obtained and used as the active lag distance for calculating the semivariance. Validation of the optimality of the semivariance demonstrated that using, as the active lag distance, the two distances at which Moran’s I and the standardized Moran’s I, Z(I), reached their maxima can improve the fitting accuracy of the semivariance. Spatial interpolation was then produced based on these two distances and their nested model. A comparative analysis of estimation accuracy, together with the measured and predicted pollution status, showed that the method combining geostatistics with Moran’s I analysis performed better than traditional geostatistics. Thus, Moran’s I analysis is a useful complement to geostatistics for improving the spatial interpolation accuracy of heavy metals. PMID:22690179
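A minimal sketch of the global Moran's I statistic used above, computed with binary distance-band weights; the weighting scheme is an illustrative assumption, since the abstract does not give the paper's exact weight matrix. Evaluating I over a range of band thresholds is one simple way to locate the characteristic distances the paper describes.

```python
import numpy as np

def morans_i(values, coords, threshold):
    """Global Moran's I with binary distance-band weights:
    w_ij = 1 if 0 < d_ij <= threshold, else 0."""
    values = np.asarray(values, dtype=float)
    z = values - values.mean()                 # deviations from the mean
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    w = ((d > 0) & (d <= threshold)).astype(float)
    n = len(values)
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

# demo: a linear trend along a transect is strongly positively autocorrelated
coords = np.column_stack([np.arange(10.0), np.zeros(10)])
trend = np.arange(10.0)
i_stat = morans_i(trend, coords, threshold=1.5)
```

Values near +1 indicate clustering of similar values, values near the expectation -1/(n-1) indicate no spatial autocorrelation, and strongly negative values indicate a checkerboard-like pattern.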

  3. Geostatistical prediction of stream-flow regime in southeastern United States

    NASA Astrophysics Data System (ADS)

    Pugliese, Alessio; Castellarin, Attilio; Archfield, Stacey; Farmer, William

    2015-04-01

    similar performances independently of the interpretation of the curves (i.e. period-of-record/annual, or complete/seasonal) or Q* (MAF or MAP*); at-site performances are satisfactory or good (i.e. Nash-Sutcliffe Efficiency NSE ranges from 0.60 to 0.90 for cross-validated FDCs, depending on the model setting), while the overall performance at regional scale indicates that OK and TK are associated with smaller BIAS and RMSE relative to the six benchmark procedures. Acknowledgements: We thankfully acknowledge Alessia Bononi and Antonio Liguori for their preliminary analyses and Jon O. Skøien and Edzer Pebesma for their helpful assistance with the R packages rtop and gstat. The study is part of the research activities carried out by the working group: Anthropogenic and Climatic Controls on WateR AvailabilitY (ACCuRAcY) of Panta Rhei - Everything Flows: Change in Hydrology and Society (IAHS Scientific Decade 2013-2022). References: Pugliese, A., A. Castellarin, and A. Brath (2014): Geostatistical prediction of flow-duration curves in an index-flow framework, Hydrol. Earth Syst. Sci., 18, 3801-3816, doi:10.5194/hess-18-3801-2014. Castiglioni, S., A. Castellarin, and A. Montanari (2009): Prediction of low-flow indices in ungauged basins through physiographical space-based interpolation, Journal of Hydrology, 378, 272-280.

  4. Introduction to This Special Issue on Geostatistics and Geospatial Techniques in Remote Sensing

    NASA Technical Reports Server (NTRS)

    Atkinson, Peter; Quattrochi, Dale A.; Goodman, H. Michael (Technical Monitor)

    2000-01-01

    The germination of this special Computers & Geosciences (C&G) issue began at the Royal Geographical Society (with the Institute of British Geographers) (RGS-IBG) annual meeting in January 1997 held at the University of Exeter, UK. The snow and cold of the English winter were tempered greatly by warm and cordial discussion of how to stimulate and enhance cooperation on geostatistical and geospatial research in remote sensing 'across the big pond' between UK and US researchers. It was decided that one way forward would be to hold parallel sessions in 1998 on geostatistical and geospatial research in remote sensing at appropriate venues in both the UK and the US. Selected papers given at these sessions would be published as special issues of C&G on the UK side and Photogrammetric Engineering and Remote Sensing (PE&RS) on the US side. These issues would highlight the commonality in research on geostatistical and geospatial research in remote sensing on both sides of the Atlantic Ocean. As a consequence, a session on "Geostatistics and Geospatial Techniques for Remote Sensing of Land Surface Processes" was held at the RGS-IBG annual meeting in Guildford, Surrey, UK in January 1998, organized by the Modeling and Advanced Techniques Special Interest Group (MAT SIG) of the Remote Sensing Society (RSS). A similar session was held at the Association of American Geographers (AAG) annual meeting in Boston, Massachusetts in March 1998, sponsored by the AAG's Remote Sensing Specialty Group (RSSG). The 10 papers that make up this issue of C&G comprise 7 papers from the UK and 3 papers from the US. We are both co-editors of each of the journal special issues, with the lead editor of each journal issue being from their respective side of the Atlantic. The special issue of PE&RS (vol. 65) that constitutes the other half of this co-edited journal series was published in early 1999, comprising 6 papers by US authors. 
We are indebted to the International Association for Mathematical

  5. Evaluation of the supply of antifungal medication for the treatment of vaginal thrush in the community pharmacy setting: a randomized controlled trial

    PubMed Central

    Schneider, Carl R.; Emery, Lyndal; Brostek, Raisa; Clifford, Rhonda M.

    Background The Pharmaceutical Society of Australia has developed "guidance" for the supply of several medicines available without prescription to the general public. Limited research has been published assessing the effect of these guidelines on the provision of medication within the practice of pharmacy. Objective To assess appropriate supply of non-prescription antifungal medications for the treatment of vaginal thrush in community pharmacies, with and without a guideline. A secondary aim was to describe the assessment and counseling provided to patients when requesting this medication. Methods A randomized controlled trial was undertaken whereby two simulated patients conducted visits to 100 randomly selected community pharmacies in a metropolitan region. A product-based request for fluconazole (an oral antifungal that has a guideline) was compared to a product-based request for clotrimazole (a topical antifungal without a guideline). The same patient details were used for both requests. Outcome measures of the visits were the appropriateness of supply and referral to a medical practitioner. Results Overall, 16% (n=16) of visits resulted in an appropriate outcome: 10% (n=5) of fluconazole requests compared with 22% (n=11) of clotrimazole requests (chi-square=2.68, p=0.10). There was a difference in the type of assessment performed by pharmacy staff between visits for fluconazole and clotrimazole. A request for clotrimazole resulted in a significantly higher frequency of assessment of the reason for the request (chi-square=8.57, p=0.003), symptom location (chi-square=8.27, p=0.004), and prior history (chi-square=5.09, p=0.02). Conclusions Overall practice was poor, with the majority of pharmacies inappropriately supplying antifungal medication. New strategies are required to improve the current practice of community pharmacies in the provision of non-prescription antifungals for the treatment of vaginal thrush. PMID:24223077

  6. Evaluation and implementation of graded in vivo exposure for chronic low back pain in a German outpatient setting: a study protocol of a randomized controlled trial

    PubMed Central

    2013-01-01

    Background The purpose of the present study is to introduce an adapted protocol of in vivo exposure for fear avoidant back pain patients and its implementation in the German health care system without multidisciplinary teams. Case studies demonstrated promising effects but three preceding randomized controlled trials (RCTs) could not support the former results. More empirical support is necessary to further substantiate the effectiveness of in vivo exposure. Methods A total of 108 chronic low back pain patients are randomly assigned to one out of three conditions (A: exposure_long (15 sessions), B: exposure_short (10 sessions) or C: control condition cognitive behavioral therapy (15 sessions)). The inclusion criteria are: back pain ≥3 months and a sufficient level of fear-avoidance. An effect evaluation, a process evaluation and an economic evaluation are conducted. Primary outcomes are pain-related disability and pain intensity. Secondary outcomes are: emotional distress, fear avoidance, catastrophizing and costs. Data are collected at baseline, upon completion of the intervention, after 10 sessions, and at six months following completion of treatment. Besides the comparison of exposure in vivo and cognitive behavioral therapy (CBT), we additionally compare a short and a long version of exposure to analyze dose response effects. Discussion This is, to our knowledge, the first RCT comparing in vivo exposure to psychological treatment as usual in terms of cognitive behavioral therapy. Results will help to find out whether a tailored treatment for fear avoidant back pain patients is more effective than a general pain management treatment. Trial registration The trial has been registered to ClinicalTrial.gov. The trial registration number is NCT01484418 PMID:23837551

  7. Computer generation and application of 3-D model porous media: From pore-level geostatistics to the estimation of formation factor

    SciTech Connect

    Ioannidis, M.; Kwiecien, M.; Chatzis, I.

    1995-12-31

    This paper describes a new method for the computer generation of 3-D stochastic realizations of porous media using geostatistical information obtained from high-contrast 2-D images of pore casts. The stochastic method yields model porous media with statistical properties identical to those of their real counterparts. Synthetic media obtained in this manner can form the basis for a number of studies related to the detailed characterization of the porous microstructure and, ultimately, the prediction of important petrophysical and reservoir engineering properties. In this context, direct computer estimation of the formation resistivity factor is examined using a discrete random walk algorithm. The dependence of the formation factor on measurable statistical properties of the pore space is also investigated.

  8. Considerations in Applying the Results of Randomized Controlled Clinical Trials to the Care of Older Adults With Kidney Disease in the Clinical Setting: The SHARP Trial.

    PubMed

    Butler, Catherine R; O'Hare, Ann M

    2016-01-01

    The Study of Heart and Renal Protection (SHARP) found that treatment with ezetimibe and low-dose simvastatin reduced the incidence of major atherosclerotic events in patients with kidney disease. Due to the paucity of evidence-based interventions that lower cardiovascular morbidity in this high-risk population, the SHARP trial will likely have a large impact on clinical practice. However, applying the results of clinical trials conducted in select populations to the care of individual patients in real-world settings can be fraught with difficulty. This is especially true when caring for older adults with complex comorbidity and limited life expectancy. These patients are often excluded from clinical trials, frequently have competing health priorities, and may be less likely to benefit and more likely to be harmed by medications. We discuss key considerations in applying the results of the SHARP trial to the care of older adults with CKD in real-world clinical settings using guiding principles set forth by the American Geriatrics Society's Expert Panel on the Care of Older Adults with Multimorbidity. Using this schema, we emphasize the importance of evaluating trial results in the unique context of each patient's goals, values, priorities, and circumstances. PMID:26709060

  9. RANDOM LASSO.

    PubMed

    Wang, Sijian; Nan, Bin; Rosset, Saharon; Zhu, Ji

    2011-03-01

    We propose a computationally intensive method, the random lasso method, for variable selection in linear models. The method consists of two major steps. In step 1, the lasso method is applied to many bootstrap samples, each using a set of randomly selected covariates; this step yields a measure of importance for each covariate. In step 2, a similar procedure is implemented, with the exception that for each bootstrap sample a subset of covariates is randomly selected with unequal selection probabilities determined by the covariates' importance. Adaptive lasso may be used in the second step, with weights determined by the importance measures. The final set of covariates and their coefficients are determined by averaging the bootstrap results obtained in step 2. The proposed method alleviates some of the limitations of the lasso, elastic-net, and related methods noted especially in the context of microarray data analysis: whereas those methods tend to remove highly correlated variables altogether or select them all, the proposed method maintains maximal flexibility in estimating their coefficients, particularly with different signs; the number of selected variables is no longer limited by the sample size; and the resulting prediction accuracy is competitive with or superior to that of the alternatives. We illustrate the proposed method by extensive simulation studies. The proposed method is also applied to a Glioblastoma microarray data analysis. PMID:22997542
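The two-step procedure described above can be sketched compactly. This is a simplified illustration, not the published algorithm's exact tuning: the penalty value, number of bootstrap draws, subset size q, and the plain coordinate-descent lasso solver are all assumptions (the paper would typically select q and the penalty by cross-validation, and may use the adaptive lasso in step 2).

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Plain coordinate-descent lasso: min (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]       # partial residual excluding j
            rho = X[:, j] @ r
            beta[j] = np.sign(rho) * max(abs(rho) - lam * n, 0.0) / col_sq[j]
    return beta

def random_lasso(X, y, lam=0.05, n_boot=50, q=None, seed=0):
    """Two-step random-lasso sketch with illustrative tuning values."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    q = q or max(1, p // 2)
    importance = np.zeros(p)
    for _ in range(n_boot):                            # step 1: equal-probability subsets
        rows = rng.integers(0, n, n)                   # bootstrap rows
        cols = rng.choice(p, q, replace=False)         # random covariate subset
        importance[cols] += np.abs(lasso_cd(X[np.ix_(rows, cols)], y[rows], lam))
    probs = importance + 1e-12                         # avoid zero-probability features
    probs /= probs.sum()
    coef = np.zeros(p)
    for _ in range(n_boot):                            # step 2: importance-weighted subsets
        rows = rng.integers(0, n, n)
        cols = rng.choice(p, q, replace=False, p=probs)
        coef[cols] += lasso_cd(X[np.ix_(rows, cols)], y[rows], lam)
    return coef / n_boot                               # average over bootstrap fits

# demo: two true signals among six covariates
rng = np.random.default_rng(1)
X = rng.normal(size=(120, 6))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=120)
coef = random_lasso(X, y, lam=0.05, n_boot=40, q=3, seed=2)
```

Averaging over bootstrap fits in which each covariate appears only part of the time shrinks the final coefficients somewhat, but the true signals dominate the noise covariates.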

  10. Systematic Review and Pooled Analyses of Recent Neurointerventional Randomized Controlled Trials: Setting a New Standard of Care for Acute Ischemic Stroke Treatment after 20 Years

    PubMed Central

    Hussain, Mohammed; Moussavi, Mohammad; Korya, Daniel; Mehta, Siddhart; Brar, Jaskiran; Chahal, Harina; Qureshi, Ihtesham; Mehta, Tapan; Ahmad, Javaad; Zaidat, Osama O.; Kirmani, Jawad F.

    2016-01-01

    Background Recent advances in the treatment of ischemic stroke have focused on revascularization and led to better clinical and functional outcomes. A systematic review and pooled analyses of 6 recent multicentered prospective randomized controlled trials (MPRCT) were performed to compare intravenous tissue plasminogen activator (IV tPA) and endovascular therapy (intervention) with IV tPA alone (control) for anterior circulation ischemic stroke (AIS) secondary to large vessel occlusion (LVO). Objectives Six MPRCTs (MR CLEAN, ESCAPE, EXTEND IA, SWIFT PRIME, REVASCAT and THERAPY) incorporating image-based LVO AIS were selected for assessing the following: (1) prespecified primary clinical outcomes of AIS patients in intervention and control arms: good outcomes were defined by a modified Rankin Scale score of 0-2 at 90 days; (2) secondary clinical outcomes were: (a) revascularization rates [favorable outcomes defined as modified Thrombolysis in Cerebral Infarction scale (mTICI) score of 2b/3]; (b) symptomatic intracranial hemorrhage (sICH) rates and mortality; (c) derivation of number needed to harm (NNH), number needed to treat (NNT), and relative percent difference (RPD) between intervention and control groups, and (d) random effects model to determine overall significance (forest and funnel plots). Results A total of 1,386 patients were included. Good outcomes at 90 days were seen in 46% of patients in the intervention (p < 0.00001) and in 27% of patients in the control groups (p < 0.00002). An mTICI score of 2b/3 was achieved in 70.2% of patients in the intervention arm. The sICH and mortality in the intervention arm compared with the control arm were 4.7 and 14.3% versus 7.9 and 17.8%, respectively. The NNT and NNH in the intervention and control groups were 5.3 and 9.1, respectively. Patients in the intervention arm had a 50.1% (RPD) better chance of achieving a good 90-day outcome as compared to controls. Conclusions Endovascular therapy combined with IV t

  11. Short-term effects of goal-setting focusing on the life goal concept on subjective well-being and treatment engagement in subacute inpatients: a quasi-randomized controlled trial

    PubMed Central

    Ogawa, Tatsuya; Omon, Kyohei; Yuda, Tomohisa; Ishigaki, Tomoya; Imai, Ryota; Ohmatsu, Satoko; Morioka, Shu

    2016-01-01

    Objective: To investigate the short-term effects of the life goal concept on subjective well-being and treatment engagement, and to determine the sample size required for a larger trial. Design: A quasi-randomized controlled trial that was not blinded. Setting: A subacute rehabilitation ward. Subjects: A total of 66 patients were randomized to a goal-setting intervention group with the life goal concept (Life Goal), a standard rehabilitation group with no goal-setting intervention (Control 1), or a goal-setting intervention group without the life goal concept (Control 2). Interventions: The goal-setting intervention in the Life Goal and Control 2 was Goal Attainment Scaling. The Life Goal patients were assessed in terms of their life goals, and the hierarchy of goals was explained. The intervention duration was four weeks. Main measures: Patients were assessed pre- and post-intervention. The outcome measures were the Hospital Anxiety and Depression Scale, 12-item General Health Questionnaire, Pittsburgh Rehabilitation Participation Scale, and Functional Independence Measure. Results: Of the 296 potential participants, 66 were enrolled; Life Goal (n = 22), Control 1 (n = 22) and Control 2 (n = 22). Anxiety was significantly lower in the Life Goal (4.1 ±3.0) than in Control 1 (6.7 ±3.4), but treatment engagement was significantly higher in the Life Goal (5.3 ±0.4) compared with both the Control 1 (4.8 ±0.6) and Control 2 (4.9 ±0.5). Conclusions: The life goal concept had a short-term effect on treatment engagement. A sample of 31 patients per group would be required for a fully powered clinical trial. PMID:27496700

  12. Geostatistical and Stochastic Study of Flow and Transport in the Unsaturated Zone at Yucca Mountain

    SciTech Connect

    Ye, Ming; Pan, Feng; Hu, Xiaolong; Zhu, Jianting

    2007-08-14

    Yucca Mountain has been proposed by the U.S. Department of Energy as the nation’s long-term, permanent geologic repository for spent nuclear fuel or high-level radioactive waste. The potential repository would be located in Yucca Mountain’s unsaturated zone (UZ), which acts as a critical natural barrier delaying arrival of radionuclides to the water table. Since radionuclide transport in groundwater can pose serious threats to human health and the environment, it is important to understand how much and how fast water and radionuclides travel through the UZ to groundwater. The UZ system consists of multiple hydrogeologic units whose hydraulic and geochemical properties exhibit systematic and random spatial variation, or heterogeneity, at multiple scales. Predictions of radionuclide transport under such complicated conditions are uncertain, and the uncertainty complicates decision making and risk analysis. This project aims at using geostatistical and stochastic methods to assess the uncertainty of unsaturated flow and radionuclide transport in the UZ at Yucca Mountain. The focus of this study is the parameter uncertainty of the hydraulic and transport properties of the UZ. This parametric uncertainty arises because limited parameter measurements cannot deterministically describe the spatial variability of the parameters. In this project, the matrix porosity, permeability, and sorption coefficient of the reactive tracer (neptunium) in the UZ are treated as random variables. The corresponding propagation of parametric uncertainty is quantified using the mean, variance, and 5th and 95th percentiles of simulated state variables (e.g., saturation, capillary pressure, percolation flux, and travel time). These statistics are evaluated using a Monte Carlo method, in which a three-dimensional flow and transport model, implemented with the TOUGH2 code, is executed for multiple realizations of the random model parameters. The project specifically studies uncertainty of unsaturated
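The Monte Carlo uncertainty-propagation loop described above has a simple generic shape: sample the random parameters, run the forward model for each realization, and summarize the outputs. The toy travel-time function below merely stands in for the TOUGH2 flow-and-transport model, and the parameter distributions are invented for illustration.

```python
import numpy as np

def travel_time(perm, porosity, gradient=0.01, length=100.0):
    """Toy stand-in for a forward flow model: Darcy-type seepage
    velocity v = K*i/phi, travel time t = L/v (units illustrative)."""
    v = perm * gradient / porosity
    return length / v

rng = np.random.default_rng(0)
n = 10_000

# treat permeability (lognormal) and porosity (normal, clipped to be
# physical) as the random input parameters, per the study's approach
perm = rng.lognormal(mean=np.log(1e-5), sigma=0.5, size=n)
poro = np.clip(rng.normal(0.15, 0.03, size=n), 0.01, None)

# propagate: one forward-model run per parameter realization
t = travel_time(perm, poro)

# summarize with the same statistics named in the abstract
stats = {
    "mean": t.mean(),
    "var": t.var(),
    "p5": np.percentile(t, 5),
    "p95": np.percentile(t, 95),
}
```

In the real study each "forward-model run" is a full 3-D TOUGH2 simulation, so the practical cost of the loop, not its structure, is what makes such analyses demanding.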

  13. Consequences of anorectal cancer atlas implementation in the cooperative group setting: Radiobiologic analysis of a prospective randomized in silico target delineation study

    PubMed Central

    Mavroidis, Panayiotis; Giantsoudis, Drosoula; Awan, Musaddiq J.; Nijkamp, Jasper; Rasch, Coen R. N.; Duppen, Joop C.; Thomas, Charles R.; Okunieff, Paul; Jones, William E.; Kachnic, Lisa A.; Papanikolaou, Niko; Fuller, Clifton D.

    2014-01-01

    Purpose The aim of this study is to ascertain the subsequent radiobiological impact of using a consensus guideline target volume delineation atlas. Materials and methods Using a representative case and target volume delineation instructions derived from a proposed IMRT rectal cancer clinical trial, gross tumor volume (GTV) and clinical/planning target volumes (CTV/PTV) were contoured by 13 physician observers (Phase 1). The observers were then randomly assigned to follow (atlas) or not-follow (control) a consensus guideline/atlas for anorectal cancers, and instructed to re-contour the same case (Phase 2). Results The atlas group was found to have increased tumor control probability (TCP) after the atlas intervention for both the CTV (p < 0.0001) and PTV1 (p = 0.0011) with decreasing normal tissue complication probability (NTCP) for small intestine, while the control group did not. Additionally, the atlas group had reduced variance in TCP for all target volumes and reduced variance in NTCP for the bowel. In Phase 2, the atlas group had increased TCP relative to the control for CTV (p = 0.03). Conclusions Visual atlas and consensus treatment guidelines usage in the development of rectal cancer IMRT treatment plans reduced the inter-observer radiobiological variation, with clinically relevant TCP alteration for CTV and PTV volumes. PMID:24996454

  14. The Impact of Brief Messages on HSV-2 Screening Uptake Among Female Defendants in a Court Setting: A Randomized Controlled Trial Utilizing Prospect Theory

    PubMed Central

    ROTH, ALEXIS M.; VAN DER POL, BARBARA; FORTENBERRY, J. DENNIS; DODGE, BRIAN; REECE, MICHAEL; CERTO, DAVID; ZIMET, GREGORY D.

    2015-01-01

    Epidemiologic data demonstrate that women involved with the criminal justice system in the United States are at high risk for sexually transmitted infections, including herpes simplex virus type 2 (HSV-2). Female defendants were recruited from a misdemeanor court to assess whether brief framed messages utilizing prospect theory could encourage testing for HSV-2. Participants were randomly assigned to a message condition (gain, loss, or control), completed an interviewer-administered survey assessing factors associated with antibody test uptake/refusal, and were offered free point-of-care HSV-2 serologic testing. Although individuals in the loss-frame group accepted testing at the highest rate, an overall statistical difference in HSV-2 testing behavior by group (p ≤ .43) was not detected. The majority of the sample (74.6%) characterized receiving a serological test for HSV-2 as health affirming. However, this did not moderate the effect of the intervention, nor was it significantly associated with test acceptance (p ≤ .82). Although the effects of message framing are subtle, the findings have important theoretical implications given the participants’ characterization of HSV-2 screening as health affirming despite being a detection behavior. Implications of study results for health care providers interested in brief, low-cost interventions are also explored. PMID:25494832

  15. The PRESLO study: evaluation of a global secondary low back pain prevention program for health care personnel in a hospital setting. Multicenter, randomized intervention trial

    PubMed Central

    2012-01-01

    Background Common low back pain represents a major public health problem in terms of its direct cost to health care and its socio-economic repercussions. Ten percent of individuals who suffer from low back pain evolve toward a chronic case and as such are responsible for 75 to 80% of the direct cost of low back pain. It is therefore imperative to highlight the predictive factors of low back pain chronification in order to lighten the economic burden of low back pain-related invalidity. Despite being particularly affected by low back pain, Hospices Civils de Lyon (HCL) personnel have never been offered a specific, tailor-made treatment plan. The PRESLO study (with PRESLO referring to Secondary Low Back Pain Prevention, or in French, PREvention Secondaire de la LOmbalgie), proposed by HCL occupational health services and the Centre Médico-Chirurgical et de Réadaptation des Massues – Croix Rouge Française, is a randomized trial that aims to evaluate the feasibility and efficiency of a global secondary low back pain prevention program for the low back pain sufferers among HCL hospital personnel, a population at risk for recurrence and chronification. This program, which is based on the concept of physical retraining, employs a multidisciplinary approach uniting physical activity, cognitive education about low back pain and lumbopelvic morphotype analysis. No study targeting populations at risk for low back pain chronification has as yet evaluated the efficiency of lighter secondary prevention programs. Methods/Design This study is a two-arm parallel randomized controlled trial proposed to all low back pain sufferers among HCL workers, included between October 2008 and July 2011 and followed over two years. The personnel following their usual treatment (control group) and those following the global prevention program in addition to their usual treatment (intervention group) are compared in terms of low back pain recurrence and the impairments measured at the

  16. Predictive modeling of groundwater nitrate pollution using Random Forest and multisource variables related to intrinsic and specific vulnerability: a case study in an agricultural setting (Southern Spain).

    PubMed

    Rodriguez-Galiano, Victor; Mendes, Maria Paula; Garcia-Soldado, Maria Jose; Chica-Olmo, Mario; Ribeiro, Luis

    2014-04-01

    Watershed management decisions need robust methods that allow accurate predictive modeling of pollutant occurrences. Random Forest (RF) is a powerful machine-learning, data-driven method that is rarely used in water resources studies and thus has not been evaluated thoroughly in this field compared to more conventional pattern recognition techniques. Key advantages of RF include its non-parametric nature, high predictive accuracy, and capability to determine variable importance. This last characteristic can be used to better understand the individual role and the combined effect of explanatory variables in both protecting and exposing groundwater from and to a pollutant. In this paper, the performance of RF regression for predictive modeling of nitrate pollution is explored, based on intrinsic and specific vulnerability assessment of the Vega de Granada aquifer. The applicability of this machine learning technique is demonstrated in an agriculture-dominated area where nitrate concentrations in groundwater can exceed the trigger value of 50 mg/L at many locations. A comprehensive GIS database of twenty-four parameters related to intrinsic hydrogeologic properties, driving forces, remotely sensed variables, and physical-chemical variables measured in situ was used as input to build different predictive models of nitrate pollution. RF measures of importance were also used to define the most significant predictors of nitrate pollution in groundwater, allowing the establishment of the pollution sources (pressures). The potential of RF for generating a vulnerability map to nitrate pollution is assessed considering multiple criteria related to variations in the algorithm parameters and the accuracy of the maps. The performance of the RF is also evaluated in comparison to the logistic regression (LR) method using different efficiency measures to ensure their generalization ability. 
Prediction results show the ability of RF to build accurate models
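The general RF-regression workflow the abstract describes, fit a forest and read off variable importances, can be sketched with scikit-learn. The synthetic predictors and nitrate response below are invented stand-ins for the paper's 24-parameter GIS database, and the specific column meanings are assumptions for illustration only.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n = 500

# hypothetical predictors: depth to water (0), fertilizer load (1),
# clay fraction (2), and a pure-noise column (3)
X = rng.uniform(0, 1, size=(n, 4))

# synthetic nitrate response: driven by fertilizer load, damped by
# depth to water, weakly by clay, plus measurement noise
nitrate = 60 * X[:, 1] - 30 * X[:, 0] + 5 * X[:, 2] + rng.normal(0, 2, n)

# out-of-bag score gives an internal estimate of generalization ability,
# analogous to the validation the paper performs against logistic regression
rf = RandomForestRegressor(n_estimators=200, oob_score=True, random_state=0)
rf.fit(X, nitrate)

importance = rf.feature_importances_   # impurity-based variable importance
```

In this synthetic setup the importance ranking recovers the construction: the fertilizer column dominates, the noise column ranks last, which is exactly how the paper uses importances to identify pollution pressures.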

  17. Randomized comparative study of left versus right radial approach in the setting of primary percutaneous coronary intervention for ST-elevation myocardial infarction

    PubMed Central

    Fu, Qiang; Hu, Hongyu; Wang, Dezhao; Chen, Wei; Tan, Zhixu; Li, Qun; Chen, Buxing

    2015-01-01

    Background Growing evidence suggests that the left radial approach (LRA) is related to decreased coronary procedure duration and fewer cerebrovascular complications as compared to the right radial approach (RRA) in elective percutaneous coronary intervention (PCI). However, the feasibility of LRA in primary PCI has yet to be studied further. Therefore, the aim of this study was to investigate the efficacy of LRA compared with RRA for primary PCI in ST-elevation myocardial infarction (STEMI) patients. Materials and methods A total of 200 consecutive patients with STEMI who received primary PCI were randomized to LRA (number [n]=100) or RRA (n=100). The study endpoint was needle-to-balloon time, defined as the time from local anesthesia infiltration to the first balloon inflation. Radiation dose by measuring cumulative air kerma (CAK) and CAK dose area product, as well as fluoroscopy time and contrast volume were also investigated. Results There were no significant differences in the baseline characteristics between the two groups. The coronary procedural success rate was similar between both radial approaches (98% for left versus 94% for right; P=0.28). Compared with RRA, LRA had significantly shorter needle-to-balloon time (16.0±4.8 minutes versus 18.0±6.5 minutes, respectively; P=0.02). Additionally, fluoroscopy time (7.4±3.4 minutes versus 8.8±3.5 minutes, respectively; P=0.01) and CAK dose area product (51.9±30.4 Gy cm2 versus 65.3±49.1 Gy cm2, respectively; P=0.04) were significantly lower with LRA than with RRA. Conclusion Primary PCI can be performed via LRA with earlier blood flow restoration in the infarct-related artery and lower radiation exposure when compared with RRA; therefore, the LRA may become a feasible and attractive alternative to perform primary PCI for STEMI patients. PMID:26150704

  18. SRS 2010 Vegetation Inventory GeoStatistical Mapping Results for Custom Reaction Intensity and Total Dead Fuels.

    SciTech Connect

    Edwards, Lloyd A.; Paresol, Bernard

    2014-09-01

    This report presents the geostatistical analysis results for the fire fuels response variables, custom reaction intensity and total dead fuels, and is part of the SRS 2010 vegetation inventory project. For a detailed description of the project, theory, and background, including sample design, methods, and results, please refer to the USDA Forest Service Savannah River Site internal report “SRS 2010 Vegetation Inventory GeoStatistical Mapping Report” (Edwards & Parresol 2013).

  19. A geostatistical approach for describing spatial pattern in stream networks

    USGS Publications Warehouse

    Ganio, L.M.; Torgersen, C.E.; Gresswell, R.E.

    2005-01-01

The shape and configuration of branched networks influence ecological patterns and processes. Recent investigations of network influences in riverine ecology stress the need to quantify spatial structure not only in a two-dimensional plane, but also in networks. An initial step in understanding data from stream networks is discerning non-random patterns along the network. However, data collected in the network may be spatially autocorrelated and thus not suitable for traditional statistical analyses. Here we provide a method that uses commercially available software to construct an empirical variogram to describe spatial pattern in the relative abundance of coastal cutthroat trout in headwater stream networks. We describe the mathematical and practical considerations involved in calculating a variogram using a non-Euclidean distance metric to incorporate the network pathway structure in the analysis of spatial variability, and use a non-parametric technique to ascertain whether the pattern in the empirical variogram is non-random.
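The network-variogram idea in this abstract reduces to a small computation: the semivariogram estimator itself is unchanged, and only the distance matrix changes from Euclidean distance to along-network path distance. A minimal NumPy sketch (the distance matrix, bin edges, and abundance values below are hypothetical toy data, not from the study):

```python
import numpy as np

def empirical_variogram(dist, values, bin_edges):
    """Empirical semivariogram from a pairwise distance matrix.

    `dist` may hold network (along-stream) path distances rather than
    Euclidean distances; the estimator is identical in either case.
    """
    n = len(values)
    iu = np.triu_indices(n, k=1)                # count each pair once
    d = dist[iu]
    sq = 0.5 * (values[iu[0]] - values[iu[1]]) ** 2
    lags, gamma = [], []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (d >= lo) & (d < hi)
        if mask.any():
            lags.append(d[mask].mean())         # mean lag in the bin
            gamma.append(sq[mask].mean())       # mean semivariance in the bin
    return np.array(lags), np.array(gamma)

# Toy example: four sites along a stream network; entries are path lengths.
dist = np.array([[0., 1., 2., 3.],
                 [1., 0., 1., 2.],
                 [2., 1., 0., 1.],
                 [3., 2., 1., 0.]])
vals = np.array([1.0, 1.2, 2.0, 2.5])
lags, gamma = empirical_variogram(dist, vals, np.array([0.0, 1.5, 3.5]))
```

With these toy values the semivariance rises with lag, the signature of spatial autocorrelation the paper tests against a random-pattern null.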

  20. Performance of a Culturally Tailored Cognitive Behavioral Intervention (CBI) Integrated in a Public Health Setting to Reduce Risk of Antepartum Depression: A Randomized Clinical Trial

    PubMed Central

    Jesse, D. Elizabeth; Gaynes, Bradley N.; Feldhousen, Elizabeth; Newton, Edward R.; Bunch, Shelia; Hollon, Steven D.

    2016-01-01

    Introduction Cognitive behavioral group interventions have been shown to improve depressive symptoms in adult populations. This article details the feasibility and efficacy of a 6-week culturally tailored cognitive behavioral intervention offered to rural, minority, low-income women at risk for antepartum depression. Methods 146 pregnant women were stratified by high-risk for antepartum depression (Edinburgh Postnatal Depression Scale (EPDS) score of 10 or higher) or low-moderate risk (EPDS score of 4-9) and randomized to a cognitive behavioral intervention or treatment-as-usual. Differences in mean change of EPDS and BDI-II scores for low-moderate and high-risk women in the cognitive behavioral intervention and treatment-as-usual for the full sample were assessed from baseline (T1), post-treatment (T2) and 1-month follow-up (T3) and for African-American women in the subsample. Results Both the cognitive behavioral intervention and treatment-as-usual groups had significant reductions in the EPDS scores from T1 to T2 and T1 to T3. In women at high-risk for depression (n=62), there was no significant treatment effect from T1 to T2 or T3 for the Edinburgh Postnatal Depression Scale. However, in low-moderate risk women, there was a significant decrease in the BDI-II scores from T1 to T2 (4.92 vs. 0.59, P=.018) and T1 to T3 (5.67 vs. 1.51, P=.04). Also, the cognitive behavioral intervention significantly reduced EPDS scores for African-American women at high-risk (n=43) from T1 to T2 (5.59 vs. 2.18, P=.02) and from T1 to T3 (6.32 vs. 3.14, P= .04). Discussion A cognitive behavioral intervention integrated within prenatal clinics is feasible in this sample, although attrition rates were high. Compared to treatment-as-usual, the cognitive behavioral intervention reduced depressive symptoms for African-American women at high-risk for antepartum depression and for the full sample of women at low-moderate risk for antepartum depression. These promising findings need to be

  1. Three-dimensional geostatistical inversion of flowmeter and pumping test data.

    PubMed

    Li, Wei; Englert, Andreas; Cirpka, Olaf A; Vereecken, Harry

    2008-01-01

    We jointly invert field data of flowmeter and multiple pumping tests in fully screened wells to estimate hydraulic conductivity using a geostatistical method. We use the steady-state drawdowns of pumping tests and the discharge profiles of flowmeter tests as our data in the inference. The discharge profiles need not be converted to absolute hydraulic conductivities. Consequently, we do not need measurements of depth-averaged hydraulic conductivity at well locations. The flowmeter profiles contain information about relative vertical distributions of hydraulic conductivity, while drawdown measurements of pumping tests provide information about horizontal fluctuation of the depth-averaged hydraulic conductivity. We apply the method to data obtained at the Krauthausen test site of the Forschungszentrum Jülich, Germany. The resulting estimate of our joint three-dimensional (3D) geostatistical inversion shows an improved 3D structure in comparison to the inversion of pumping test data only. PMID:18266734

  2. Geochemical and geostatistical evaluation, Arkansas Canyon Planning Unit, Fremont and Custer Counties, Colorado

    USGS Publications Warehouse

    Weiland, E.F.; Connors, R.A.; Robinson, M.L.; Lindemann, J.W.; Meyer, W.T.

    1982-01-01

    A mineral assessment of the Arkansas Canyon Planning Unit was undertaken by Barringer Resources Inc., under the terms of contract YA-553-CTO-100 with the Bureau of Land Management, Colorado State Office. The study was based on a geochemical-geostatistical survey in which 700 stream sediment samples were collected and analyzed for 25 elements. Geochemical results were interpreted by statistical processing which included factor, discriminant, multiple regression and characteristic analysis. The major deposit types evaluated were massive sulfide-base metal, sedimentary and magmatic uranium, thorium vein, magmatic segregation, and carbonatite related deposits. Results of the single element data and multivariate geostatistical analysis indicate that limited potential exists for base metal mineralization near the Horseshoe, El Plomo, and Green Mountain Mines. Thirty areas are considered to be anomalous with regard to one or more of the geochemical parameters evaluated during this study. The evaluation of carbonatite related mineralization was restricted due to the lack of geochemical data specific to this environment.

  3. Geostatistical simulations for radon indoor with a nested model including the housing factor.

    PubMed

    Cafaro, C; Giovani, C; Garavaglia, M

    2016-01-01

The definition of radon-prone areas is the subject of much research in radioecology, since radon is considered a leading cause of lung tumours; the authorities therefore ask for support to develop an appropriate sanitary prevention strategy. In this paper, we use geostatistical tools to elaborate a definition accounting for some of the available information about the dwellings. Co-kriging is the interpolator used in geostatistics to refine predictions by means of external covariates. A priori, co-kriging is not guaranteed to improve significantly on the results obtained by applying the common lognormal kriging. Here, however, the multivariate approach reduces the cross-validation residual variance to a satisfactory extent. Furthermore, with the application of Monte Carlo simulations, the paradigm provides a more conservative radon-prone areas definition than the one previously made by lognormal kriging. PMID:26547362

  4. Family, Community and Clinic Collaboration to Treat Overweight and Obese Children: Stanford GOALS -- a Randomized Controlled Trial of a Three-Year, Multi-Component, Multi-Level, Multi-Setting Intervention

    PubMed Central

    Robinson, Thomas N.; Matheson, Donna; Desai, Manisha; Wilson, Darrell M.; Weintraub, Dana L.; Haskell, William L.; McClain, Arianna; McClure, Samuel; Banda, Jorge; Sanders, Lee M.; Haydel, K. Farish; Killen, Joel D.

    2013-01-01

    Objective To test the effects of a three-year, community-based, multi-component, multi-level, multi-setting (MMM) approach for treating overweight and obese children. Design Two-arm, parallel group, randomized controlled trial with measures at baseline, 12, 24, and 36 months after randomization. Participants Seven through eleven year old, overweight and obese children (BMI ≥ 85th percentile) and their parents/caregivers recruited from community locations in low-income, primarily Latino neighborhoods in Northern California. Interventions Families are randomized to the MMM intervention versus a community health education active-placebo comparison intervention. Interventions last for three years for each participant. The MMM intervention includes a community-based after school team sports program designed specifically for overweight and obese children, a home-based family intervention to reduce screen time, alter the home food/eating environment, and promote self-regulatory skills for eating and activity behavior change, and a primary care behavioral counseling intervention linked to the community and home interventions. The active-placebo comparison intervention includes semi-annual health education home visits, monthly health education newsletters for children and for parents/guardians, and a series of community-based health education events for families. Main Outcome Measure Body mass index trajectory over the three-year study. Secondary outcome measures include waist circumference, triceps skinfold thickness, accelerometer-measured physical activity, 24-hour dietary recalls, screen time and other sedentary behaviors, blood pressure, fasting lipids, glucose, insulin, hemoglobin A1c, C-reactive protein, alanine aminotransferase, and psychosocial measures. Conclusions The Stanford GOALS trial is testing the efficacy of a novel community-based multi-component, multi-level, multi-setting treatment for childhood overweight and obesity in low-income, Latino families

  5. Evaluation of Effectiveness and Cost‐Effectiveness of a Clinical Decision Support System in Managing Hypertension in Resource Constrained Primary Health Care Settings: Results From a Cluster Randomized Trial

    PubMed Central

    Anchala, Raghupathy; Kaptoge, Stephen; Pant, Hira; Di Angelantonio, Emanuele; Franco, Oscar H.; Prabhakaran, D.

    2015-01-01

    Background Randomized control trials from the developed world report that clinical decision support systems (DSS) could provide an effective means to improve the management of hypertension (HTN). However, evidence from developing countries in this regard is rather limited, and there is a need to assess the impact of a clinical DSS on managing HTN in primary health care center (PHC) settings. Methods and Results We performed a cluster randomized trial to test the effectiveness and cost‐effectiveness of a clinical DSS among Indian adult hypertensive patients (between 35 and 64 years of age), wherein 16 PHC clusters from a district of Telangana state, India, were randomized to receive either a DSS or a chart‐based support (CBS) system. Each intervention arm had 8 PHC clusters, with a mean of 102 hypertensive patients per cluster (n=845 in DSS and 783 in CBS groups). Mean change in systolic blood pressure (SBP) from baseline to 12 months was the primary endpoint. The mean difference in SBP change from baseline between the DSS and CBS at the 12th month of follow‐up, adjusted for age, sex, height, waist, body mass index, alcohol consumption, vegetable intake, pickle intake, and baseline differences in blood pressure, was −6.59 mm Hg (95% confidence interval: −12.18 to −1.42; P=0.021). The cost‐effective ratio for CBS and DSS groups was $96.01 and $36.57 per mm of SBP reduction, respectively. Conclusion Clinical DSS are effective and cost‐effective in the management of HTN in resource‐constrained PHC settings. Clinical Trial Registration URL: http://www.ctri.nic.in. Unique identifier: CTRI/2012/03/002476. PMID:25559011

  6. Geostatistical mapping of effluent-affected sediment distribution on the Palos Verdes Shelf

    SciTech Connect

    Murray, Christopher J. ); Lee, H J.; Hampton, M A.

    2001-12-01

Geostatistical techniques were used to study the spatial continuity of the thickness of effluent-affected sediment in the offshore Palos Verdes margin area. The thickness data were measured directly from cores and indirectly from high-frequency subbottom profiles collected over the Palos Verdes Margin. Strong spatial continuity of the sediment thickness data was identified, with a maximum range of correlation in excess of 1.4 km. The spatial correlation showed a marked anisotropy, and was more than twice as continuous in the alongshore direction as in the cross-shelf direction. Sequential indicator simulation employing models fit to the thickness data variograms was used to map the distribution of the sediment, and to quantify the uncertainty in those estimates. A strong correlation between sediment thickness data and measurements of the mass of the contaminant p,p′-DDE per unit area was identified. A calibration based on the bivariate distribution of the thickness and p,p′-DDE data was applied using Markov-Bayes indicator simulation to extend the geostatistical study and map the contamination levels in the sediment. Integrating the map grids produced by the geostatistical study of the two variables indicated that 7.8 million cubic meters of effluent-affected sediment exist in the map area, containing approximately 61 to 72 Mg (metric tons) of p,p′-DDE. Most of the contaminated sediment (about 85% of the sediment and 89% of the p,p′-DDE) occurs in water depths less than 100 m. The geostatistical study also indicated that the samples available for mapping are well distributed and the uncertainty of the estimates of the thickness and contamination level of the sediments is lowest in areas where the contaminated sediment is most prevalent.
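The final integration step described in the abstract, summing map grids into a total sediment volume and contaminant mass, is simple arithmetic over the simulated grids. A sketch with randomly generated stand-in grids and a made-up linear thickness-to-contaminant calibration (the study's actual calibration is bivariate and Markov-Bayes based; none of the numbers below are from the paper):

```python
import numpy as np

# Hypothetical simulated grids on a regular 100 m x 100 m grid:
# sediment thickness in m, contaminant areal density in g/m^2.
rng = np.random.default_rng(0)
cell_area = 100.0 * 100.0                        # m^2 per grid cell
thickness = rng.uniform(0.0, 0.6, size=(50, 40))
pp_dde = 8.0 * thickness                         # toy calibration: mass tracks thickness

# Integrate both grids over the map area.
volume_m3 = thickness.sum() * cell_area          # total sediment volume, m^3
mass_Mg = (pp_dde * cell_area).sum() / 1e6       # g -> Mg (metric tons)
```

The same two sums, run on the real simulated grids, yield the paper's 7.8 million m³ and 61–72 Mg figures.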

  7. Application of a computationally efficient geostatistical approach to characterizing variably spaced water-table data

    SciTech Connect

    Quinn, J.J.

    1996-02-01

Geostatistical analysis of hydraulic head data is useful in producing unbiased contour plots of head estimates and relative errors. However, at most sites being characterized, monitoring wells are generally present at different densities, with clusters of wells in some areas and few wells elsewhere. The problem that arises when kriging data at different densities is in achieving adequate resolution of the grid while maintaining computational efficiency and working within software limitations. For the site considered, 113 data points were available over a 14-mi² study area, including 57 monitoring wells within an area of concern of 1.5 mi². Variogram analyses of the data indicate a linear model with a negligible nugget effect. The geostatistical package used in the study allows a maximum grid of 100 by 100 cells. Two-dimensional kriging was performed for the entire study area with a 500-ft grid spacing, while the smaller zone was modeled separately with a 100-ft spacing. In this manner, grid cells for the dense area and the sparse area remained small relative to the well separation distances, and the maximum dimensions of the program were not exceeded. The spatial head results for the detailed zone were then nested into the regional output by use of a graphical, object-oriented database that performed the contouring of the geostatistical output. This study benefitted from the two-scale approach and from very fine geostatistical grid spacings relative to typical data separation distances. The combining of the sparse, regional results with those from the finer-resolution area of concern yielded contours that honored the actual data at every measurement location. The method applied in this study can also be used to generate reproducible, unbiased representations of other types of spatial data.
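The two-scale nesting amounts to block-averaging the fine-grid estimates up to coarse cells and overwriting the corresponding region of the regional grid. A toy sketch with hypothetical head values (the original study performed the merge in a contouring database rather than on the grids themselves; the grid sizes and values here are illustrative only):

```python
import numpy as np

# Coarse regional estimate (500-ft cells) and a finer nested estimate
# (100-ft cells) covering a 5x5-coarse-cell area of concern.
coarse = np.full((20, 20), 100.0)   # regional head, 500-ft grid
fine = np.full((25, 25), 98.0)      # detailed head, 100-ft grid

# Block-average 5x5 fine cells into each coarse cell, then overwrite the
# coarse cells that the area of concern covers (rows 0-4, cols 0-4 here).
blocks = fine.reshape(5, 5, 5, 5).mean(axis=(1, 3))
nested = coarse.copy()
nested[0:5, 0:5] = blocks
```

Keeping both grids at their native spacing before the merge is what preserves resolution in the dense area without exceeding the 100-by-100-cell program limit.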

  8. Tomogram-based comparison of geostatistical models: Application to the Macrodispersion Experiment (MADE) site

    NASA Astrophysics Data System (ADS)

    Linde, Niklas; Lochbühler, Tobias; Dogan, Mine; Van Dam, Remke L.

    2015-12-01

    We propose a new framework to compare alternative geostatistical descriptions of a given site. Multiple realizations of each of the considered geostatistical models and their corresponding tomograms (based on inversion of noise-contaminated simulated data) are used as a multivariate training image. The training image is scanned with a direct sampling algorithm to obtain conditional realizations of hydraulic conductivity that are not only in agreement with the geostatistical model, but also honor the spatially varying resolution of the site-specific tomogram. Model comparison is based on the quality of the simulated geophysical data from the ensemble of conditional realizations. The tomogram in this study is obtained by inversion of cross-hole ground-penetrating radar (GPR) first-arrival travel time data acquired at the MAcro-Dispersion Experiment (MADE) site in Mississippi (USA). Various heterogeneity descriptions ranging from multi-Gaussian fields to fields with complex multiple-point statistics inferred from outcrops are considered. Under the assumption that the relationship between porosity and hydraulic conductivity inferred from local measurements is valid, we find that conditioned multi-Gaussian realizations and derivatives thereof can explain the crosshole geophysical data. A training image based on an aquifer analog from Germany was found to be in better agreement with the geophysical data than the one based on the local outcrop, which appears to under-represent high hydraulic conductivity zones. These findings are only based on the information content in a single resolution-limited tomogram and extending the analysis to tracer or higher resolution surface GPR data might lead to different conclusions (e.g., that discrete facies boundaries are necessary). Our framework makes it possible to identify inadequate geostatistical models and petrophysical relationships, effectively narrowing the space of possible heterogeneity representations.

  9. A geostatistical approach to predicting sulfur content in the Pittsburgh coal bed

    USGS Publications Warehouse

    Watson, W.D.; Ruppert, L.F.; Bragg, L.J.; Tewalt, S.J.

    2001-01-01

    The US Geological Survey (USGS) is completing a national assessment of coal resources in the five top coal-producing regions in the US. Point-located data provide measurements on coal thickness and sulfur content. The sample data and their geologic interpretation represent the most regionally complete and up-to-date assessment of what is known about top-producing US coal beds. The sample data are analyzed using a combination of geologic and Geographic Information System (GIS) models to estimate tonnages and qualities of the coal beds. Traditionally, GIS practitioners use contouring to represent geographical patterns of "similar" data values. The tonnage and grade of coal resources are then assessed by using the contour lines as references for interpolation. An assessment taken to this point is only indicative of resource quantity and quality. Data users may benefit from a statistical approach that would allow them to better understand the uncertainty and limitations of the sample data. To develop a quantitative approach, geostatistics were applied to the data on coal sulfur content from samples taken in the Pittsburgh coal bed (located in the eastern US, in the southwestern part of the state of Pennsylvania, and in adjoining areas in the states of Ohio and West Virginia). Geostatistical methods that account for regional and local trends were applied to blocks 2.7 mi (4.3 km) on a side. The data and geostatistics support conclusions concerning the average sulfur content and its degree of reliability at regional- and economic-block scale over the large, contiguous part of the Pittsburgh outcrop, but not to a mine scale. To validate the method, a comparison was made with the sulfur contents in sample data taken from 53 coal mines located in the study area. The comparison showed a high degree of similarity between the sulfur content in the mine samples and the sulfur content represented by the geostatistically derived contours. Published by Elsevier Science B.V.

  10. Hierarchical probabilistic regionalization of volcanism for Sengan region in Japan using multivariate statistical techniques and geostatistical interpolation techniques

    SciTech Connect

    Park, Jinyong; Balasingham, P; McKenna, Sean Andrew; Pinnaduwa H.S.W. Kulatilake

    2004-09-01

    Sandia National Laboratories, under contract to Nuclear Waste Management Organization of Japan (NUMO), is performing research on regional classification of given sites in Japan with respect to potential volcanic disruption using multivariate statistics and geo-statistical interpolation techniques. This report provides results obtained for hierarchical probabilistic regionalization of volcanism for the Sengan region in Japan by applying multivariate statistical techniques and geostatistical interpolation techniques on the geologic data provided by NUMO. A workshop report produced in September 2003 by Sandia National Laboratories (Arnold et al., 2003) on volcanism lists a set of most important geologic variables as well as some secondary information related to volcanism. Geologic data extracted for the Sengan region in Japan from the data provided by NUMO revealed that data are not available at the same locations for all the important geologic variables. In other words, the geologic variable vectors were found to be incomplete spatially. However, it is necessary to have complete geologic variable vectors to perform multivariate statistical analyses. As a first step towards constructing complete geologic variable vectors, the Universal Transverse Mercator (UTM) zone 54 projected coordinate system and a 1 km square regular grid system were selected. The data available for each geologic variable on a geographic coordinate system were transferred to the aforementioned grid system. Also the recorded data on volcanic activity for Sengan region were produced on the same grid system. Each geologic variable map was compared with the recorded volcanic activity map to determine the geologic variables that are most important for volcanism. In the regionalized classification procedure, this step is known as the variable selection step. 
The following variables were determined as most important for volcanism: geothermal gradient, groundwater temperature, heat discharge, groundwater

  11. Efficacy and Safety of Three Antiretroviral Regimens for Initial Treatment of HIV-1: A Randomized Clinical Trial in Diverse Multinational Settings

    PubMed Central

    Campbell, Thomas B.; Smeaton, Laura M.; Kumarasamy, N.; Flanigan, Timothy; Klingman, Karin L.; Firnhaber, Cynthia; Grinsztejn, Beatriz; Hosseinipour, Mina C.; Kumwenda, Johnstone; Lalloo, Umesh; Riviere, Cynthia; Sanchez, Jorge; Melo, Marineide; Supparatpinyo, Khuanchai; Tripathy, Srikanth; Martinez, Ana I.; Nair, Apsara; Walawander, Ann; Moran, Laura; Chen, Yun; Snowden, Wendy; Rooney, James F.; Uy, Jonathan; Schooley, Robert T.; De Gruttola, Victor; Hakim, James Gita; Swann, Edith; Barnett, Ronald L.; Brizz, Barbara; Delph, Yvette; Gettinger, Nikki; Mitsuyasu, Ronald T.; Eshleman, Susan; Safren, Steven; Fiscus, Susan A.; Andrade, Adriana; Haas, David W.; Amod, Farida; Berthaud, Vladimir; Bollinger, Robert C.; Bryson, Yvonne; Celentano, David; Chilongozi, David; Cohen, Myron; Collier, Ann C.; Currier, Judith Silverstein; Cu-Uvin, Susan; Eron, Joseph; Flexner, Charles; Gallant, Joel E.; Gulick, Roy M.; Hammer, Scott M.; Hoffman, Irving; Kazembe, Peter; Kumwenda, Newton; Lama, Javier R.; Lawrence, Jody; Maponga, Chiedza; Martinson, Francis; Mayer, Kenneth; Nielsen, Karin; Pendame, Richard B.; Ramratnam, Bharat; Sanne, Ian; Severe, Patrice; Sirisanthana, Thira; Solomon, Suniti; Tabet, Steve; Taha, Taha; van der Horst, Charles; Wanke, Christine; Gormley, Joan; Marcus, Cheryl J.; Putnam, Beverly; Loeliger, Edde; Pappa, Keith A.; Webb, Nancy; Shugarts, David L.; Winters, Mark A.; Descallar, Renard S.; Steele, Joseph; Wulfsohn, Michael; Said, Farideh; Chen, Yue; Martin, John C; Bischofberger, Norbert; Cheng, Andrew; Jaffe, Howard; Sharma, Jabin; Poongulali, S.; Cardoso, Sandra Wagner; Faria, Deise Lucia; Berendes, Sima; Burke, Kelly; Mngqibisa, Rosie; Kanyama, Cecelia; Kayoyo, Virginia; Samaneka, Wadzanai P.; Chisada, Anthony; Faesen, Sharla; Chariyalertsak, Suwat; Santos, Breno; Lira, Rita Alves; Joglekar, Anjali A.; Rosa, Alberto La; Infante, Rosa; Jain, Mamta; Petersen, Tianna; Godbole, Sheela; Dhayarkar, Sampada; Feinberg, Judith; Baer, Jenifer; Pollard, Richard 
B.; Asmuth, David; Gangakhedkar, Raman R; Gaikwad, Asmita; Ray, M. Graham; Basler, Cathi; Para, Michael F.; Watson, Kathy J.; Taiwo, Babafemi; McGregor, Donna; Balfour, Henry H.; Mullan, Beth; Kim, Ge-Youl; Klebert, Michael K.; Cox, Gary Matthew; Silberman, Martha; Mildvan, Donna; Revuelta, Manuel; Tashima, Karen T.; Patterson, Helen; Geiseler, P. Jan; Santos, Bartolo; Daar, Eric S; Lopez, Ruben; Frarey, Laurie; Currin, David; Haas, David H.; Bailey, Vicki L.; Tebas, Pablo; Zifchak, Larisa; Noel-Connor, Jolene; Torres, Madeline; Sha, Beverly E.; Fritsche, Janice M.; Cespedes, Michelle; Forcht, Janet; O'Brien, William A.; Mogridge, Cheryl; Hurley, Christine; Corales, Roberto; Palmer, Maria; Adams, Mary; Luque, Amneris; Lopez-Detres, Luis; Stroberg, Todd

    2012-01-01

    Background Antiretroviral regimens with simplified dosing and better safety are needed to maximize the efficiency of antiretroviral delivery in resource-limited settings. We investigated the efficacy and safety of antiretroviral regimens with once-daily compared to twice-daily dosing in diverse areas of the world. Methods and Findings 1,571 HIV-1-infected persons (47% women) from nine countries in four continents were assigned with equal probability to open-label antiretroviral therapy with efavirenz plus lamivudine-zidovudine (EFV+3TC-ZDV), atazanavir plus didanosine-EC plus emtricitabine (ATV+DDI+FTC), or efavirenz plus emtricitabine-tenofovir-disoproxil fumarate (DF) (EFV+FTC-TDF). ATV+DDI+FTC and EFV+FTC-TDF were hypothesized to be non-inferior to EFV+3TC-ZDV if the upper one-sided 95% confidence bound for the hazard ratio (HR) was ≤1.35 when 30% of participants had treatment failure. An independent monitoring board recommended stopping study follow-up prior to accumulation of 472 treatment failures. Comparing EFV+FTC-TDF to EFV+3TC-ZDV, during a median 184 wk of follow-up there were 95 treatment failures (18%) among 526 participants versus 98 failures among 519 participants (19%; HR 0.95, 95% CI 0.72–1.27; p = 0.74). Safety endpoints occurred in 243 (46%) participants assigned to EFV+FTC-TDF versus 313 (60%) assigned to EFV+3TC-ZDV (HR 0.64, CI 0.54–0.76; p<0.001) and there was a significant interaction between sex and regimen safety (HR 0.50, CI 0.39–0.64 for women; HR 0.79, CI 0.62–1.00 for men; p = 0.01). Comparing ATV+DDI+FTC to EFV+3TC-ZDV, during a median follow-up of 81 wk there were 108 failures (21%) among 526 participants assigned to ATV+DDI+FTC and 76 (15%) among 519 participants assigned to EFV+3TC-ZDV (HR 1.51, CI 1.12–2.04; p = 0.007). Conclusion EFV+FTC-TDF had similar high efficacy compared to EFV+3TC-ZDV in this trial population, recruited in diverse multinational settings. Superior safety, especially in HIV-1-infected

  12. Effect of a Simple Information Booklet on Pain Persistence after an Acute Episode of Low Back Pain: A Non-Randomized Trial in a Primary Care Setting

    PubMed Central

    Coudeyre, Emmanuel; Baron, Gabriel; Coriat, Fernand; Brin, Sylvie; Revel, Michel; Poiraudeau, Serge

    2007-01-01

    Objective Mass-media campaigns have been known to modify the outcome of low back pain (LBP). We assessed the impact on outcome of standardized written information on LBP given to patients with acute LBP. Methods Design: A 3-month pragmatic, multicenter controlled trial with geographic stratification. Setting: Primary care practice in France. Participants: 2752 patients with acute LBP. Intervention: An advice book on LBP (the “back book”). Main outcome measures: The main outcome measure was persistence of LBP three months after baseline evaluation. Results 2337 (85%) patients were assessed at follow-up and 12.4% of participants reported persistent LBP. The absolute risk reduction of reporting persistent back pain in the intervention group was 3.6% lower than in the control group (10.5% vs. 14.1%; 95% confidence interval [−6.3% ; −1.0%]; p value adjusted for cluster effect = 0.01). Patients in the intervention group were more satisfied than those in the control group with the information they received about physical activities, when to consult their physician, and how to prevent a new episode of LBP. However, the number of patients who had taken sick leave was similar, as was the mean sick-leave duration, in both arms, and, among patients with persistent pain at follow-up, the intervention and control groups did not differ in disability or fear-avoidance beliefs. Conclusions The level of improvement of an information booklet is modest, but the cost and complexity of the intervention is minimal. Therefore, the implications and generalizability of this intervention are substantial. Trial Registration ClinicalTrials.gov NCT00343057 PMID:17684553

  13. Preliminary geostatistical modeling of thermal conductivity for a cross section of Yucca Mountain, Nevada

    SciTech Connect

    Rautman, C.A.

    1995-09-01

    Two-dimensional, heterogeneous, spatially correlated models of thermal conductivity and bulk density have been created for a representative, east-west cross section of Yucca Mountain, Nevada, using geostatistical simulation. The thermal conductivity models are derived from spatially correlated, surrogate material-property models of porosity, through a multiple linear-regression equation, which expresses thermal conductivity as a function of porosity and initial temperature and saturation. Bulk-density values were obtained through a similar, linear-regression relationship with porosity. The use of a surrogate-property allows the use of spatially much-more-abundant porosity measurements to condition the simulations. Modeling was conducted in stratigraphic coordinates to represent original depositional continuity of material properties and the completed models were transformed to real-world coordinates to capture present-day tectonic tilting and faulting of the material-property units. Spatial correlation lengths required for geostatistical modeling were assumed, but are based on the results of previous transect-sampling and geostatistical-modeling work.
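The surrogate-property step, predicting thermal conductivity from porosity plus initial temperature and saturation through a multiple linear regression, can be sketched with synthetic calibration data. All coefficients and noise levels below are invented for illustration and are not the Yucca Mountain regression:

```python
import numpy as np

# Synthetic calibration data (hypothetical coefficients and noise).
rng = np.random.default_rng(1)
n = 200
porosity = rng.uniform(0.05, 0.40, n)
temp = rng.uniform(20.0, 100.0, n)       # initial temperature, deg C
sat = rng.uniform(0.0, 1.0, n)           # initial saturation
K = 2.2 - 3.0 * porosity + 0.002 * temp + 0.5 * sat + rng.normal(0, 0.02, n)

# Fit the multiple linear regression K ~ 1 + porosity + temp + sat;
# applied to a simulated porosity field, it yields a conductivity field.
X = np.column_stack([np.ones(n), porosity, temp, sat])
beta, *_ = np.linalg.lstsq(X, K, rcond=None)
```

Because porosity measurements are far more abundant than direct conductivity measurements, conditioning the geostatistical simulation on porosity and mapping it through such a regression is what makes the approach practical.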

  14. A geostatistical methodology to assess the accuracy of unsaturated flow models

    SciTech Connect

    Smoot, J.L.; Williams, R.E.

    1996-04-01

The Pacific Northwest National Laboratory (PNNL) has developed a Hydrologic Evaluation Methodology (HEM) to assist the U.S. Nuclear Regulatory Commission in evaluating the potential that infiltrating meteoric water will produce leachate at commercial low-level radioactive waste disposal sites. Two key issues are raised in the HEM: (1) evaluation of mathematical models that predict facility performance, and (2) estimation of the uncertainty associated with these mathematical model predictions. The technical objective of this research is to adapt geostatistical tools commonly used for model parameter estimation to the problem of estimating the spatial distribution of the dependent variable to be calculated by the model. To fulfill this objective, a database describing the spatiotemporal movement of water injected into unsaturated sediments at the Hanford Site in Washington State was used to develop a new method for evaluating mathematical model predictions. Measured water content data were interpolated geostatistically to a 16 x 16 x 36 grid at several time intervals. Then a mathematical model was used to predict water content at the same grid locations at the selected times. Node-by-node comparison of the mathematical model predictions with the geostatistically interpolated values was conducted. The method facilitates a complete accounting and categorization of model error at every node. The comparison suggests that model results generally are within measurement error. The worst model error occurs in silt lenses and is in excess of measurement error.
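The node-by-node comparison can be expressed directly as array arithmetic: subtract the geostatistically interpolated field from the model prediction at every node and flag residuals exceeding measurement error. A sketch with synthetic 16 x 16 x 36 water-content grids (the noise level and error threshold are hypothetical):

```python
import numpy as np

def categorize_error(predicted, interpolated, meas_err):
    """Node-by-node comparison: return residuals and a mask of nodes whose
    model-minus-interpolation residual is within measurement error."""
    resid = predicted - interpolated
    within = np.abs(resid) <= meas_err
    return resid, within

# Synthetic grids standing in for interpolated and modeled water content.
rng = np.random.default_rng(2)
interp = rng.uniform(0.05, 0.30, size=(16, 16, 36))
pred = interp + rng.normal(0.0, 0.005, size=interp.shape)

resid, within = categorize_error(pred, interp, meas_err=0.02)
frac_within = within.mean()       # fraction of nodes within measurement error
```

The `within` mask is what enables the "complete accounting and categorization of model error at every node": nodes where it is false (the silt lenses, in the study) localize where the model fails.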

  15. A conceptual sedimentological-geostatistical model of aquifer heterogeneity based on outcrop studies

    SciTech Connect

    Davis, J.M.

    1994-01-01

    Three outcrop studies were conducted in deposits of different depositional environments. At each site, permeability measurements were obtained with an air-minipermeameter developed as part of this study. In addition, the geological units were mapped with either surveying, photographs, or both. Geostatistical analysis of the permeability data was performed to estimate the characteristics of the probability distribution function and the spatial correlation structure. The information obtained from the geological mapping was then compared with the results of the geostatistical analysis for any relationships that may exist. The main field site was located in the Albuquerque Basin of central New Mexico at an outcrop of the Pliocene-Pleistocene Sierra Ladrones Formation. The second study was conducted on the walls of waste pits in alluvial fan deposits at the Nevada Test Site. The third study was conducted on an outcrop of an eolian deposit (Miocene) south of Socorro, New Mexico. The results of the three studies were then used to construct a conceptual model relating depositional environment to geostatistical models of heterogeneity. The model presented is largely qualitative but provides a basis for further hypothesis formulation and testing.
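
The geostatistical analysis step in such studies typically starts from the classical (Matheron) empirical semivariogram, which can be sketched as follows; the transect coordinates and values are simulated, not the outcrop permeability data.

```python
import numpy as np

def empirical_semivariogram(coords, values, lags, tol):
    """Matheron estimator: gamma(h) = mean of half squared increments
    over all pairs whose separation falls within tol of each lag h."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    i, j = np.triu_indices(len(values), k=1)
    gamma = []
    for h in lags:
        mask = np.abs(d[i, j] - h) <= tol
        gamma.append(sq[i, j][mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# Hypothetical 1-D transect of log-permeability readings, one per metre;
# a moving average of white noise gives a spatially correlated field.
rng = np.random.default_rng(1)
coords = np.arange(100.0)[:, None]
values = np.convolve(rng.normal(size=120), np.ones(21) / 21, mode="valid")

lags = np.arange(1.0, 16.0)
gamma = empirical_semivariogram(coords, values, lags, tol=0.5)
```

For a correlated field like this one, gamma rises from near zero toward the sill as the lag grows, which is the structure a model variogram is then fitted to.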

  16. Geostatistical estimation of signal-to-noise ratios for spectral vegetation indices

    NASA Astrophysics Data System (ADS)

    Ji, Lei; Zhang, Li; Rover, Jennifer; Wylie, Bruce K.; Chen, Xuexia

    2014-10-01

    In the past 40 years, many spectral vegetation indices have been developed to quantify vegetation biophysical parameters. An ideal vegetation index should contain the maximum level of signal related to specific biophysical characteristics and the minimum level of noise such as background soil influences and atmospheric effects. However, accurate quantification of signal and noise in a vegetation index remains a challenge, because it requires a large number of field measurements or laboratory experiments. In this study, we applied a geostatistical method to estimate signal-to-noise ratio (S/N) for spectral vegetation indices. Based on the sample semivariogram of vegetation index images, we used the standardized noise to quantify the noise component of vegetation indices. In a case study in the grasslands and shrublands of the western United States, we demonstrated the geostatistical method for evaluating S/N for a series of soil-adjusted vegetation indices derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor. The soil-adjusted vegetation indices were found to have higher S/N values than the traditional normalized difference vegetation index (NDVI) and simple ratio (SR) in the sparsely vegetated areas. This study shows that the proposed geostatistical analysis can constitute an efficient technique for estimating signal and noise components in vegetation indices.
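
The variance decomposition behind a semivariogram-based S/N estimate can be sketched as follows: the nugget of a fitted semivariogram approximates the spatially uncorrelated (noise) variance, and the remainder of the sill the signal variance. The parameter values are illustrative, and the paper's exact definition of standardized noise may differ from the normalization used here.

```python
# Hypothetical fitted semivariogram parameters for a vegetation-index image.
nugget = 0.0004   # spatially uncorrelated variance, taken as noise
sill = 0.0052     # total variance reached at the correlation range

signal_var = sill - nugget      # spatially structured (signal) variance
snr = signal_var / nugget       # signal-to-noise ratio
noise_fraction = nugget / sill  # one plausible standardized-noise measure

print(f"S/N = {snr:.1f}, noise fraction = {noise_fraction:.2f}")
```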

  17. Geostatistical estimation of signal-to-noise ratios for spectral vegetation indices

    USGS Publications Warehouse

    Ji, Lei; Zhang, Li; Rover, Jennifer R.; Wylie, Bruce K.; Chen, Xuexia

    2014-01-01

    In the past 40 years, many spectral vegetation indices have been developed to quantify vegetation biophysical parameters. An ideal vegetation index should contain the maximum level of signal related to specific biophysical characteristics and the minimum level of noise such as background soil influences and atmospheric effects. However, accurate quantification of signal and noise in a vegetation index remains a challenge, because it requires a large number of field measurements or laboratory experiments. In this study, we applied a geostatistical method to estimate signal-to-noise ratio (S/N) for spectral vegetation indices. Based on the sample semivariogram of vegetation index images, we used the standardized noise to quantify the noise component of vegetation indices. In a case study in the grasslands and shrublands of the western United States, we demonstrated the geostatistical method for evaluating S/N for a series of soil-adjusted vegetation indices derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor. The soil-adjusted vegetation indices were found to have higher S/N values than the traditional normalized difference vegetation index (NDVI) and simple ratio (SR) in the sparsely vegetated areas. This study shows that the proposed geostatistical analysis can constitute an efficient technique for estimating signal and noise components in vegetation indices.

  18. Integration of GIS, Geostatistics, and 3-D Technology to Assess the Spatial Distribution of Soil Moisture

    NASA Technical Reports Server (NTRS)

    Betts, M.; Tsegaye, T.; Tadesse, W.; Coleman, T. L.; Fahsi, A.

    1998-01-01

    The spatial and temporal distribution of near surface soil moisture is of fundamental importance to many physical, biological, biogeochemical, and hydrological processes. However, knowledge of these space-time dynamics and the processes which control them remains incomplete. The integration of geographic information systems (GIS) and geostatistics promises a simple mechanism to evaluate and display the spatial and temporal distribution of this vital hydrologic and physical variable. Therefore, this research demonstrates the use of geostatistics and GIS to predict and display soil moisture distribution under vegetated and non-vegetated plots. The research was conducted at the Winfred Thomas Agricultural Experiment Station (WTAES), Hazel Green, Alabama. Soil moisture measurements were made on a 10 by 10 m grid in tall fescue grass (GR), alfalfa (AA), bare rough (BR), and bare smooth (BS) plots. Results indicated that the variance associated with soil moisture was higher for vegetated plots than non-vegetated plots. The presence of vegetation in general contributed to the spatial variability of soil moisture. Integration of geostatistics and GIS can improve the productivity of farm lands and the precision of farming.

  19. Acceptability of Home-Assessment Post Medical Abortion and Medical Abortion in a Low-Resource Setting in Rajasthan, India. Secondary Outcome Analysis of a Non-Inferiority Randomized Controlled Trial

    PubMed Central

    Paul, Mandira; Iyengar, Kirti; Essén, Birgitta; Gemzell-Danielsson, Kristina; Iyengar, Sharad D.; Bring, Johan; Soni, Sunita; Klingberg-Allvin, Marie

    2015-01-01

    Background Studies evaluating acceptability of simplified follow-up after medical abortion have focused on high-resource or urban settings where telephones, road connections, and modes of transport are available and where women have formal education. Objective To investigate women’s acceptability of home-assessment of abortion and whether acceptability of medical abortion differs by in-clinic or home-assessment of abortion outcome in a low-resource setting in India. Design Secondary outcome of a randomised, controlled, non-inferiority trial. Setting Outpatient primary health care clinics in rural and urban Rajasthan, India. Population Women were eligible if they sought abortion with a gestation up to 9 weeks, lived within the defined study area, and agreed to follow-up. Women were ineligible if they had known contraindications to medical abortion, haemoglobin < 85 mg/l, or were below 18 years of age. Methods Abortion outcome assessment through routine clinic follow-up by a doctor was compared with home-assessment using a low-sensitivity pregnancy test and a pictorial instruction sheet. A computerized random number generator generated the randomisation sequence (1:1) in blocks of six. Research assistants randomly allocated eligible women who opted for medical abortion (mifepristone and misoprostol), using opaque sealed envelopes. Blinding during outcome assessment was not possible. Main Outcome Measures Women’s acceptability of home-assessment was measured as future preference of follow-up. Overall satisfaction, expectations, and comparison with previous abortion experiences were compared between study groups. Results 731 women were randomized to the clinic follow-up group (n = 353) or home-assessment group (n = 378). 623 (85%) women were successfully followed up, of those 597 (96%) were satisfied and 592 (95%) found the abortion better or as expected, with no difference between study groups. The majority, 355 (57%) women, preferred home-assessment in the event of a future

  20. Quantifying Aggregated Uncertainty in Plasmodium falciparum Malaria Prevalence and Populations at Risk via Efficient Space-Time Geostatistical Joint Simulation

    PubMed Central

    Gething, Peter W.; Patil, Anand P.; Hay, Simon I.

    2010-01-01

    Risk maps estimating the spatial distribution of infectious diseases are required to guide public health policy from local to global scales. The advent of model-based geostatistics (MBG) has allowed these maps to be generated in a formal statistical framework, providing robust metrics of map uncertainty that enhances their utility for decision-makers. In many settings, decision-makers require spatially aggregated measures over large regions such as the mean prevalence within a country or administrative region, or national populations living under different levels of risk. Existing MBG mapping approaches provide suitable metrics of local uncertainty—the fidelity of predictions at each mapped pixel—but have not been adapted for measuring uncertainty over large areas, due largely to a series of fundamental computational constraints. Here the authors present a new efficient approximating algorithm that can generate for the first time the necessary joint simulation of prevalence values across the very large prediction spaces needed for global scale mapping. This new approach is implemented in conjunction with an established model for P. falciparum allowing robust estimates of mean prevalence at any specified level of spatial aggregation. The model is used to provide estimates of national populations at risk under three policy-relevant prevalence thresholds, along with accompanying model-based measures of uncertainty. By overcoming previously unchallenged computational barriers, this study illustrates how MBG approaches, already at the forefront of infectious disease mapping, can be extended to provide large-scale aggregate measures appropriate for decision-makers. PMID:20369009

  1. Groundwater levels time series sensitivity to pluviometry and air temperature: a geostatistical approach to Sfax region, Tunisia.

    PubMed

    Triki, Ibtissem; Trabelsi, Nadia; Hentati, Imen; Zairi, Moncef

    2014-03-01

    In this paper, the pattern of groundwater level fluctuations is investigated by statistical techniques for 24 monitoring wells located in an unconfined coastal aquifer in Sfax (Tunisia) for a time period from 1997 to 2006. Firstly, a geostatistical study is performed to characterize the temporal behaviors of the data sets in terms of variograms and to make predictions about the value of the groundwater level at unsampled times. Secondly, multivariate statistical methods, i.e., principal component analysis (PCA) and cluster analysis (CA) of time series of groundwater levels, are used to classify groundwater hydrographs by fluctuation pattern. Three groundwater groups (A, B, and C) were identified. In group "A," the water level decreases continuously throughout the study period with rapid annual cyclic variation, whereas in group "B," the water level contains much less high-frequency variation. The wells of group "C" show a steady and gradual increase of groundwater levels caused by artificial recharge of the aquifer. Furthermore, a cross-correlation analysis is used to investigate the aquifer response to local rainfall and temperature records. The results revealed that temperature affects the variation of the groundwater level of group A wells more than rainfall does. The second and third groups, however, are less affected by rainfall or temperature. PMID:24141484
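
A lagged cross-correlation analysis of this kind can be sketched as below; the monthly rainfall and groundwater-level series, and the 3-month response delay, are simulated assumptions rather than the Sfax records.

```python
import numpy as np

def lagged_cross_correlation(x, y, max_lag):
    """Pearson correlation between x(t) and y(t + lag) for lag = 0..max_lag."""
    out = []
    for lag in range(max_lag + 1):
        a = x[: len(x) - lag] if lag else x
        b = y[lag:]
        out.append(np.corrcoef(a, b)[0, 1])
    return np.array(out)

# Hypothetical monthly series: groundwater level responding to rainfall
# with a 3-month delay plus measurement noise.
rng = np.random.default_rng(2)
delay = 3
rain = rng.gamma(2.0, 20.0, size=120)
level = rng.normal(0.0, 0.05, size=120)
level[delay:] += 0.01 * rain[:-delay]

cc = lagged_cross_correlation(rain, level, max_lag=6)
best_lag = int(np.argmax(cc))  # lag of strongest aquifer response
```

The lag at which the correlation peaks is read as the aquifer's response time to the climatic driver.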

  2. Rainfall interpolation using remote sensing CCD data in a tropical basin: A GIS and geostatistical application

    NASA Astrophysics Data System (ADS)

    Moges, S. A.; Alemaw, B. F.; Chaoka, T. R.; Kachroo, R. K.

    This paper aims to develop a geostatistical model to improve interpolated annual and monthly rainfall variation using remotely sensed cold cloud duration (CCD) data as a background image. The data set consists of rainfall data from a network of 704 rain gauges in the Rufiji drainage basin in Tanzania. We found ordinary kriging to be a robust estimator, due mainly to its inherent inclusion of the non-stationary local mean during estimation. Parameter sensitivity analysis and examination of the residuals revealed that the variogram parameters (the nugget effect, range, sill, and maximum direction of continuity) have little effect on model efficiency and accuracy in any combination, as long as they lie within acceptable ranges. Rather, the use of remotely sensed CCD data as a background image is found to improve the interpolation compared with estimation based on observed point rainfall data alone. In terms of the Nash-Sutcliffe model performance index (R2), kriging with CCD as an external drift achieved an R2 of 64.5%, compared with efficiencies of 60.0% and 61.4% for simple kriging and ordinary kriging, respectively. For each case, parameter sensitivity analysis was conducted to investigate the effect of changes in the parameters on the model performance and the spatio-temporal interpolation results.
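
The Nash-Sutcliffe index used as the performance measure here is a standard formula and can be computed as follows; the observation and simulation values are made up.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
    1.0 is a perfect fit; 0.0 is no better than the observed mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - np.mean(observed)) ** 2
    )

obs = np.array([10.0, 12.0, 18.0, 25.0, 16.0])
print(nash_sutcliffe(obs, obs))  # perfect agreement gives 1.0
```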

  3. Geostatistical Relationships Between Freeboard and Thickness of Antarctic Sea Ice Floes Derived from Terrestrial LiDAR Surveys

    NASA Astrophysics Data System (ADS)

    Lewis, M. J.; Parra, J.; Weissling, B.; Ackley, S. F.; Maksym, T. L.; Wilkinson, J.; Wagner, T.

    2011-12-01

    Sea ice is a critical component of the Earth's climate system and a highly complex medium. Its physical characteristics are important in the interpretation of remote sensing data. Sea ice characteristics such as snow surface topography, snow depth, and ice thickness were derived from in situ measurements obtained during the J.C. Ross (ICEBell) and Oden Southern Ocean (OSO) expeditions during the austral summer of 2010-11. Select areas of sea ice floes in the Bellingshausen, Weddell, and Amundsen Seas were measured using terrestrial scanning LiDAR (TSL) and also by conventional gridded and transect surveys. Snow depths were obtained at 2-5 m sampling intervals, and ice thickness was estimated by both electromagnetic induction (EMI) and auger drilling at 2-5 m intervals. The LiDAR data are gridded to a 10-cm raster data set. The field data from multiple floes in different regions provide a unique three-dimensional perspective of sea ice elevation, snow depth, and derived freeboard. These floes are visualized in both space and spectral domains and analyzed using classic statistical and geostatistical methods to assess surface roughness, snow depth, and the effects of differing scales on data resolution. The correlation lengths needed for isostatic equilibrium of freeboard were determined. These relationships are useful in assessing radar and laser altimetry data from airborne and satellite sources.
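
The isostatic balance that links freeboard and snow depth to ice thickness can be sketched as follows; the density values are typical textbook assumptions, not figures from these surveys.

```python
# Hydrostatic (isostatic) balance for a floating ice floe:
#   rho_w * (T - F) = rho_i * T + rho_s * Z_s
# solved for total ice thickness T given ice freeboard F and snow depth Z_s.
RHO_W = 1024.0  # sea water density, kg/m^3 (assumed)
RHO_I = 915.0   # sea ice density, kg/m^3 (assumed)
RHO_S = 320.0   # snow density, kg/m^3 (assumed)

def thickness_from_freeboard(freeboard_m, snow_depth_m):
    """Ice thickness implied by isostatic equilibrium."""
    return (RHO_W * freeboard_m + RHO_S * snow_depth_m) / (RHO_W - RHO_I)

t = thickness_from_freeboard(0.10, 0.20)  # roughly 1.5 m of ice
```

Because the relation is so sensitive to freeboard, spatially averaging freeboard over an adequate correlation length before inverting is what makes altimetry-derived thickness estimates stable.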

  4. Geostatistics and Geographic Information Systems to Study the Spatial Distribution of Grapholita molesta (Busck) (Lepidoptera: Tortricidae) in Peach Fields.

    PubMed

    Duarte, F; Calvo, M V; Borges, A; Scatoni, I B

    2015-08-01

    The oriental fruit moth, Grapholita molesta (Busck), is the most serious pest in peach, and several insecticide applications are required to reduce crop damage to acceptable levels. Geostatistics and Geographic Information Systems (GIS) are employed to measure the range of spatial correlation of G. molesta in order to define the optimum sampling distance for performing spatial analysis and to determine the current distribution of the pest in peach orchards of southern Uruguay. From 2007 to 2010, 135 pheromone traps per season were installed and georeferenced in peach orchards distributed over 50,000 ha. Male adult captures were recorded weekly from September to April. Structural analysis of the captures was performed, yielding 14 semivariograms for the accumulated captures analyzed by generation and growing season. Two sets of maps were constructed to describe the pest distribution. Nine significant models were obtained in the 14 evaluated periods. The range estimated for the correlation was from 908 to 6884 m. Three hot spots of high population level and some areas with comparatively low populations were constant over the 3-year period, while there is a greater variation in the size of the population in different generations and years in other areas. PMID:26174957

  5. A Bayesian geostatistical approach for evaluating the uncertainty of contaminant mass discharges from point sources

    NASA Astrophysics Data System (ADS)

    Troldborg, M.; Nowak, W.; Binning, P. J.; Bjerg, P. L.

    2012-12-01

    Estimates of mass discharge (mass/time) are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Mass discharge estimates are, however, prone to rather large uncertainties as they integrate uncertain spatial distributions of both concentration and groundwater flow velocities. For risk assessments or any other decisions that are being based on mass discharge estimates, it is essential to address these uncertainties. We present a novel Bayesian geostatistical approach for quantifying the uncertainty of the mass discharge across a multilevel control plane. The method decouples the flow and transport simulation and has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is based on conditional geostatistical simulation and accounts for i) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics (including the uncertainty in covariance functions), ii) measurement uncertainty, and iii) uncertain source zone geometry and transport parameters. The method generates multiple equally likely realizations of the spatial flow and concentration distribution, which all honour the measured data at the control plane. The flow realizations are generated by analytical co-simulation of the hydraulic conductivity and the hydraulic gradient across the control plane. These realizations are made consistent with measurements of both hydraulic conductivity and head at the site. An analytical macro-dispersive transport solution is employed to simulate the mean concentration distribution across the control plane, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed
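
The mass-discharge integration across a control plane can be sketched as a Monte Carlo sum over realizations; the grid size, units, and lognormal stand-in fields below are illustrative assumptions, not the paper's conditional geostatistical simulations.

```python
import numpy as np

rng = np.random.default_rng(4)

n_real, n_cells = 1000, 200  # realizations, control-plane cells
cell_area = 0.5 * 0.25       # m^2 per cell (assumed)

# Stand-ins for equally likely realizations of Darcy flux (m/d) and
# concentration (g/m^3) on the multilevel control plane.
q = rng.lognormal(mean=-3.0, sigma=0.5, size=(n_real, n_cells))
c = rng.lognormal(mean=0.0, sigma=1.0, size=(n_real, n_cells))

# Mass discharge (g/d) per realization: sum of c * q * area over the plane.
md = (c * q * cell_area).sum(axis=1)

p05, p50, p95 = np.percentile(md, [5, 50, 95])
print(f"mass discharge: median {p50:.2f} g/d, 90% interval [{p05:.2f}, {p95:.2f}]")
```

The spread of the realization ensemble, not a single best estimate, is what feeds the risk assessment.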

  6. Pilot randomized trial of therapeutic hypothermia with serial cranial ultrasound and 18-22 month follow-up for neonatal encephalopathy in a low resource hospital setting in Uganda: study protocol

    PubMed Central

    2011-01-01

    Background There is now convincing evidence that in industrialized countries therapeutic hypothermia for perinatal asphyxial encephalopathy increases survival with normal neurological function. However, the greatest burden of perinatal asphyxia falls in low and mid-resource settings where it is unclear whether therapeutic hypothermia is safe and effective. Aims Under the UCL Uganda Women's Health Initiative, a pilot randomized controlled trial in infants with perinatal asphyxia was set up in the special care baby unit in Mulago Hospital, a large public hospital with ~20,000 births in Kampala, Uganda to determine: (i) The feasibility of achieving consent, neurological assessment, randomization and whole body cooling to a core temperature 33-34°C using water bottles (ii) The temperature profile of encephalopathic infants with standard care (iii) The pattern, severity and evolution of brain tissue injury as seen on cranial ultrasound and relation with outcome (iv) The feasibility of neurodevelopmental follow-up at 18-22 months of age Methods/Design Ethical approval was obtained from Makerere University and Mulago Hospital. All infants were in-born. Parental consent for entry into the trial was obtained. Thirty-six infants were randomized either to standard care plus cooling (target rectal temperature of 33-34°C for 72 hrs, started within 3 h of birth) or standard care alone. All other aspects of management were the same. Cooling was performed using water bottles filled with tepid tap water (25°C). Rectal, axillary, ambient and surface water bottle temperatures were monitored continuously for the first 80 h. Encephalopathy scoring was performed on days 1-4, a structured, scorable neurological examination and head circumference were performed on days 7 and 17. Cranial ultrasound was performed on days 1, 3 and 7 and scored. Griffiths developmental quotient, head circumference, neurological examination and assessment of gross motor function were obtained at 18

  7. Geostatistical independent simulation of spatially correlated soil variables

    NASA Astrophysics Data System (ADS)

    Boluwade, Alaba; Madramootoo, Chandra A.

    2015-12-01

    The selection of best management practices to reduce soil and water pollution often requires estimation of soil properties. It is important to find an efficient and robust technique to simulate spatially correlated soil parameters. Co-kriging and co-simulation are techniques that can be used. These methods are limited in terms of computer simulation due to the problem of solving large co-kriging systems and difficulties in fitting a valid model of coregionalization. The order of complexity increases as the number of covariables increases. This paper presents a technique for the conditional simulation of a non-Gaussian vector random field on a point support scale. The technique is termed Independent Component Analysis (ICA). The basic principle underlying ICA is the determination of a linear representation of non-Gaussian data such that the components are statistically independent. With such a representation, it is easier and more computationally efficient to develop direct variograms for the components. The process is presented in two stages. The first stage involves the ICA decomposition. The second stage involves sequential Gaussian simulation of the components generated in the first stage. This technique was applied to spatially correlated extractable cations such as magnesium (Mg) and iron (Fe) in a Canadian watershed. This paper has a strong application in the stochastic quantification of uncertainties of soil attributes in soil remediation and soil rehabilitation.
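
The two-stage idea, decompose into independent components, simulate each one on its own, then transform back, can be sketched as below. PCA whitening stands in here for the ICA decomposition (for near-Gaussian data, decorrelation already yields components that can be treated independently; full ICA adds a rotation maximizing non-Gaussianity), and white noise stands in for sequential Gaussian simulation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical correlated soil variables (e.g. extractable Mg and Fe)
# at 500 sample points, with correlation of about 0.8.
n = 500
z = rng.normal(size=(n, 2))
L = np.linalg.cholesky(np.array([[1.0, 0.8], [0.8, 1.0]]))
data = z @ L.T

# Stage 1: decompose into uncorrelated, unit-variance components.
cov = np.cov(data, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)
W = eigvec @ np.diag(eigval ** -0.5)  # whitening matrix
components = data @ W                 # empirically uncorrelated

# Stage 2: simulate each component independently, then transform back;
# the back-transform restores the cross-correlation between variables.
sim_components = rng.normal(size=(n, 2))
sim = sim_components @ np.linalg.inv(W)

restored = np.corrcoef(sim, rowvar=False)[0, 1]
print(f"restored cross-correlation: {restored:.2f}")
```

This is why the approach scales: only univariate variograms and simulations are needed per component, yet the joint correlation structure survives the back-transform.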

  8. Geostatistical analysis of field hydraulic conductivity in compacted clay

    SciTech Connect

    Rogowski, A.S.; Simmons, D.E.

    1988-05-01

    Hydraulic conductivity (K) of fractured or porous materials is associated intimately with water flow and chemical transport. Basic concepts imply uniform flux through a homogeneous cross-sectional area. If flow were to occur only through part of the area, actual rates could be considerably different. Because laboratory values of K in compacted clays seldom agree with field estimates, questions arise as to what the true values of K are and how they should be estimated. Hydraulic conductivity values were measured on a 10 x 25 m elevated bridge-like platform. A constant water level was maintained for 1 yr over a 0.3-m thick layer of compacted clay, and inflow and outflow rates were monitored using 10 x 25 grids of 0.3-m diameter infiltration rings and outflow drains subtending approximately 1 x 1 m blocks of compacted clay. Variography of inflow and outflow data established relationships between cores and blocks of clay, respectively. Because the distributions of outflow rates were much lower than, and bore little resemblance to, the distributions of breakthrough rates based on tracer studies, the presence of macropores and preferential flow through the macropores was suspected. Subsequently, probability kriging was applied to reevaluate the distribution of flux rates and the possible location of macropores. Sites exceeding a threshold outflow of 100 x 10^-9 m/s were classified as outliers and were assumed to probably contain a significant population of macropores. Different sampling schemes were examined. Variogram analysis of outflows with and without outliers suggested the adequacy of sampling the site at 50 randomly chosen locations. Because of the potential contribution of macropores to pollutant transport and the practical necessity of extrapolating small plot values to larger areas, conditional simulations with and without outliers were carried out.

  9. Improved hydrological model parametrization for climate change impact assessment under data scarcity - The potential of field monitoring techniques and geostatistics.

    PubMed

    Meyer, Swen; Blaschek, Michael; Duttmann, Rainer; Ludwig, Ralf

    2016-02-01

    According to current climate projections, Mediterranean countries are at high risk for an even more pronounced susceptibility to changes in the hydrological budget and extremes. These changes are expected to have severe direct impacts on the management of water resources, agricultural productivity, and drinking water supply. Current projections of future hydrological change, based on regional climate model results and subsequent hydrological modeling schemes, are very uncertain and poorly validated. The Rio Mannu di San Sperate Basin, located in Sardinia, Italy, is one test site of the CLIMB project. The Water Simulation Model (WaSiM) was set up to model current and future hydrological conditions. The availability of measured meteorological and hydrological data is poor, as is common for many Mediterranean catchments. In this study we conducted a soil sampling campaign in the Rio Mannu catchment. We tested different deterministic and hybrid geostatistical interpolation methods on soil textures and evaluated the performance of the applied models. We calculated a new soil texture map based on the best prediction method. The soil model in WaSiM was set up with the improved new soil information, and the simulation results were compared to a standard soil parametrization. WaSiM was validated with spatial evapotranspiration rates using the triangle method (Jiang and Islam, 1999). WaSiM was driven with the meteorological forcing taken from 4 different ENSEMBLES climate projections for a reference (1971-2000) and a future (2041-2070) time series. The climate change impact was assessed based on differences between the reference and future time series. The simulated results show a reduction of all hydrological quantities in the future in the spring season. Furthermore, simulation results reveal an earlier onset of dry conditions in the catchment.
We show that a solid soil model setup based on short-term field measurements can improve long-term modeling results, which is especially important

  10. Overview and technical and practical aspects for use of geostatistics in hazardous-, toxic-, and radioactive-waste-site investigations

    SciTech Connect

    Bossong, C.R.; Karlinger, M.R.; Troutman, B.M.; Vecchia, A.V.

    1999-10-01

    Technical and practical aspects of applying geostatistics are developed for individuals involved in investigation at hazardous-, toxic-, and radioactive-waste sites. Important geostatistical concepts, such as variograms and ordinary, universal, and indicator kriging, are described in general terms for introductory purposes and in more detail for practical applications. Variogram modeling using measured ground-water elevation data is described in detail to illustrate principles of stationarity, anisotropy, transformations, and cross validation. Several examples of kriging applications are described using ground-water-level elevations, bedrock elevations, and ground-water-quality data. A review of contemporary literature and selected public domain software associated with geostatistics also is provided, as is a discussion of alternative methods for spatial modeling, including inverse distance weighting, triangulation, splines, trend-surface analysis, and simulation.
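
The ordinary kriging described in such overviews can be sketched, under a zero-nugget spherical model, as follows; the well coordinates, heads, and variogram parameters are made-up values for illustration.

```python
import numpy as np

def spherical_cov(h, sill=1.0, rng_=10.0):
    """Covariance implied by a spherical variogram model with zero nugget."""
    h = np.asarray(h, dtype=float)
    return np.where(h < rng_, sill * (1 - 1.5 * h / rng_ + 0.5 * (h / rng_) ** 3), 0.0)

def ordinary_kriging(coords, values, target):
    """Solve the OK system [C 1; 1' 0][w; mu] = [c0; 1] and return the
    estimate and the ordinary kriging variance at the target point."""
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical_cov(d)
    A[n, n] = 0.0                      # Lagrange-multiplier row/column
    b = np.ones(n + 1)
    b[:n] = spherical_cov(np.linalg.norm(coords - target, axis=-1))
    w = np.linalg.solve(A, b)
    estimate = float(w[:n] @ values)
    variance = float(spherical_cov(0.0) - w @ b)  # C(0) - sum(w*c0) - mu
    return estimate, variance

# Hypothetical ground-water head elevations at five wells (x, y in the
# same length units as the variogram range).
coords = np.array([[0.0, 0.0], [3.0, 1.0], [1.0, 4.0], [5.0, 5.0], [2.0, 2.0]])
heads = np.array([100.0, 101.5, 99.0, 102.0, 100.8])

est, var = ordinary_kriging(coords, heads, np.array([2.5, 2.5]))
```

A useful sanity check is exactness: kriging at a sampled location returns the datum itself with zero kriging variance.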

  11. Separation of potential data as regional and residuals by geostatistical filtering techniques

    NASA Astrophysics Data System (ADS)

    Rim, H.

    2009-12-01

    I propose a spatial filtering scheme using factorial kriging, a geostatistical filtering method, in order to separate potential-field data into regional and residual anomalies. Spatial filtering assumes that regional anomalies are correlated over longer distances, while residual anomalies act over smaller ranges. The gravity anomaly was decomposed into two variogram models with long and short effective ranges, and the best-fitted variogram models produced the separated regional and residual anomalies by means of factorial kriging. The algorithm was examined with synthetic gravity data and also applied to real microgravity data to locate an abandoned mineshaft.
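
The nested variogram underlying this decomposition can be sketched as two spherical structures; the sills and ranges below are invented for illustration.

```python
import numpy as np

def spherical(h, sill, rng_):
    """Spherical variogram: rises from 0 and levels off at the sill."""
    h = np.asarray(h, dtype=float)
    return np.where(h < rng_, sill * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3), sill)

# Hypothetical nested model: a short-range "residual" structure plus a
# long-range "regional" structure, as assumed in factorial kriging.
h = np.linspace(0.0, 5000.0, 101)      # lag distance, m
residual = spherical(h, sill=0.2, rng_=300.0)    # shallow, local anomalies
regional = spherical(h, sill=0.8, rng_=4000.0)   # deep-seated trend
total = residual + regional

# Factorial kriging filters the data by kriging each component of this
# nested variogram separately, yielding a regional and a residual map.
```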

  12. Determination of 137Cs contamination depth distribution in building structures using geostatistical modeling of ISOCS measurements.

    PubMed

    Boden, Sven; Rogiers, Bart; Jacques, Diederik

    2013-09-01

    Decommissioning of nuclear building structures usually leads to large amounts of low level radioactive waste. Using a reliable method to determine the contamination depth is indispensable prior to the start of decontamination works and also for minimizing the radioactive waste volume and the total workload. The method described in this paper is based on geostatistical modeling of in situ gamma-ray spectroscopy measurements using the multiple photo peak method. The method has been tested on the floor of the waste gas surge tank room within the BR3 (Belgian Reactor 3) decommissioning project and has delivered adequate results. PMID:23722072

  13. New advances in methodology for statistical tests useful in geostatistical studies

    SciTech Connect

    Borgman, L.E.

    1988-05-01

    Methodology for statistical procedures to perform tests of hypothesis pertaining to various aspects of geostatistical investigations has been slow in developing. The correlated nature of the data precludes most classical tests and makes the design of new tests difficult. Recent studies have led to modifications of the classical t test which allow for the intercorrelation. In addition, results for certain nonparametric tests have been obtained. The conclusions of these studies provide a variety of new tools for the geostatistician in deciding questions on significant differences and magnitudes.
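
One concrete form of a correlation-adjusted test is Bartlett's effective-sample-size correction based on lag-1 autocorrelation; this is a standard textbook adjustment, sketched here as an illustration rather than the specific modified t test developed in the studies above.

```python
import numpy as np

def effective_sample_size(x):
    """Bartlett's approximation n_eff = n * (1 - r1) / (1 + r1), where r1
    is the lag-1 autocorrelation; it shrinks the sample size used in a
    t test to account for positive serial correlation."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    r1 = (x[:-1] @ x[1:]) / (x @ x)
    return len(x) * (1.0 - r1) / (1.0 + r1)

# A smooth, strongly autocorrelated series carries far fewer independent
# observations than its raw length suggests.
series = np.sin(np.arange(200) / 10.0)
n_eff = effective_sample_size(series)

# Standard error of the mean with the correlation-adjusted sample size.
se_adjusted = series.std(ddof=1) / np.sqrt(n_eff)
```

Substituting n_eff for n widens confidence intervals and weakens apparent significance, which is the qualitative effect the modified tests aim for.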

  14. Geostatistical analysis of the soil and crop parameters in a field experiment on precision agriculture

    NASA Astrophysics Data System (ADS)

    Sidorova, V. A.; Zhukovskii, E. E.; Lekomtsev, P. V.; Yakushev, V. V.

    2012-08-01

    A thorough geostatistical analysis was performed of the spatial variability of the soil properties, the sowing parameters, and the wheat yield in a field experiment under precision agriculture conditions. It was found that most of the soil parameters are significantly correlated and can be successfully mapped using kriging procedures, which ensure the optimum development of agrochemical cartograms for agricultural fields. It was also shown that the sowing parameters had a significantly lower spatial correlation; their cartograms could be drawn, although with worse accuracy. The quality parameters of the wheat grain showed no spatial correlation.

  15. Uncertainty Estimate in Resources Assessment: A Geostatistical Contribution

    SciTech Connect

    Souza, Luis Eduardo de; Costa, Joao Felipe C. L.; Koppe, Jair C.

    2004-03-15

    For many decades the mining industry regarded resources/reserves estimation and classification as a mere calculation requiring basic mathematical and geological knowledge. Most methods were based on geometrical procedures and spatial data distribution. Therefore, uncertainty associated with tonnages and grades was either ignored or mishandled, although various mining codes require a measure of confidence in the values reported. Traditional methods fail to report the level of confidence in the quantities and grades. Conversely, kriging is known to provide the best estimate and its associated variance. Among kriging methods, Ordinary Kriging (OK) is probably the most widely used one for mineral resource/reserve estimation, mainly because of its robustness and the facility of uncertainty assessment via the kriging variance. It also is known that the OK variance is unable to recognize local data variability, an important issue when heterogeneous mineral deposits with higher- and poorer-grade zones are being evaluated. Alternatively, stochastic simulation is used to build local or global uncertainty about a geological attribute while respecting its statistical moments. This study investigates methods capable of incorporating uncertainty into the estimates of resources and reserves via OK and sequential Gaussian and sequential indicator simulation. The results showed that for the type of mineralization studied all methods classified the tonnages similarly. The methods are illustrated using an exploration drill hole data set from a large Brazilian coal deposit.
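The claim that OK delivers both the best estimate and its variance can be sketched in a few lines: the same linear system yields the kriging weights, the estimate, and the kriging variance. A minimal illustration with a spherical variogram (the variogram parameters and sample data are made up, not the coal-deposit model from the paper):

```python
import numpy as np

def spherical(h, sill=1.0, rng=1.0):
    """Spherical semivariogram model with given sill and range."""
    h = np.asarray(h, dtype=float)
    g = sill * (1.5 * h / rng - 0.5 * (h / rng) ** 3)
    return np.where(h < rng, g, sill)

def ordinary_kriging(xy, z, x0, sill=1.0, rng=1.0):
    """Ordinary kriging estimate and kriging variance at x0.

    Solves the (n+1)x(n+1) OK system with a Lagrange multiplier enforcing
    that the weights sum to one."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical(d, sill, rng)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = spherical(np.linalg.norm(xy - x0, axis=1), sill, rng)
    w = np.linalg.solve(A, b)
    est = float(w[:n] @ z)
    var = float(w @ b)  # sum_i w_i * gamma(x_i, x0) + Lagrange multiplier
    return est, var

xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
z = np.array([1.0, 2.0, 3.0])
# Estimating at a sample location reproduces the sample (exact interpolation),
# and the kriging variance there collapses to zero.
est, var = ordinary_kriging(xy, z, np.array([0.0, 0.0]), sill=1.0, rng=2.0)
```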

  16. Tomography of the ionospheric electron density with geostatistical inversion

    NASA Astrophysics Data System (ADS)

    Minkwitz, D.; van den Boogaart, K. G.; Gerzen, T.; Hoque, M.

    2015-08-01

    In relation to satellite applications like global navigation satellite systems (GNSS) and remote sensing, the electron density distribution of the ionosphere has a significant influence on trans-ionospheric radio signal propagation. In this paper, we develop a novel ionospheric tomography approach that estimates the electron density's spatial covariance and is based on a best linear unbiased estimator of the 3-D electron density. To this end, a non-stationary and anisotropic covariance model is set up and its parameters are determined within a maximum-likelihood approach incorporating GNSS total electron content measurements and the NeQuick model as background. As a first assessment, this 3-D simple kriging approach is applied to a part of Europe. We illustrate the estimated covariance model, revealing the different correlation lengths in the latitude and longitude directions and its non-stationarity. Furthermore, we show promising improvements of the reconstructed electron densities compared to the background model through validation against 1 day of electron density profiles from the ionosondes at Rome, Italy (RO041), and Dourbes, Belgium (DB049).

  17. Assessing Landscape-Scale Soil Moisture Distribution Using Auxiliary Sensing Technologies and Multivariate Geostatistics

    NASA Astrophysics Data System (ADS)

    Landrum, C.; Castrignanò, A.; Mueller, T.; Zourarakis, D.; Zhu, J.

    2013-12-01

    It is important to assess soil moisture to develop strategies to better manage its availability and use. At the landscape scale, soil moisture distribution derives from an integration of hydrologic, pedologic and geomorphic processes that cause soil moisture variability (SMV) to be time, space, and scale-dependent. Traditional methods to assess SMV at this scale are often costly, labor intensive, and invasive, which can lead to inadequate sampling density and spatial coverage. Fusing traditional sampling techniques with georeferenced auxiliary sensing technologies, such as geoelectric sensing and LiDAR, provides an alternative approach. Because geoelectric and LiDAR measurements are sensitive to soil properties and terrain features that affect soil moisture variation, they are often employed as auxiliary measures to support less dense direct sampling. Georeferenced proximal sensing acquires rapid, real-time, high resolution data over large spatial extents that is enriched with spatial, temporal and scale-dependent information. Data fusion becomes important when proximal sensing is used in tandem with more sparse direct sampling. Multicollocated factorial cokriging (MFC) is one technique of multivariate geostatistics to fuse multiple data sources collected at different sampling scales to study the spatial characteristics of environmental properties. With MFC, sparse soil observations are supported by more densely sampled auxiliary attributes to produce more consistent spatial descriptions of scale-dependent parameters affecting SMV. This study uses high resolution geoelectric and LiDAR data as auxiliary measures to support direct soil sampling (n=127) over a 40 hectare Central Kentucky (USA) landscape. Shallow and deep apparent electrical resistivity (ERa) were measured using a Veris 3100 in tandem with soil moisture sampling on three separate dates with ascending soil moisture contents ranging from plant wilting point to field capacity. Terrain features were produced

  18. Treatment of children with attention-deficit/hyperactivity disorder: results of a randomized, multicenter, double-blind, crossover study of extended-release dexmethylphenidate and D,L-methylphenidate and placebo in a laboratory classroom setting.

    PubMed

    Silva, Raul; Muniz, Rafael; McCague, Kevin; Childress, Ann; Brams, Matthew; Mao, Alice

    2008-01-01

    The purpose of this study was to compare the efficacy and safety of extended-release dexmethylphenidate (d-MPH-ER) to that of d,l-MPH-ER and placebo in children with attention-deficit/hyperactivity disorder (ADHD) in a laboratory classroom setting. This multicenter, double-blind, crossover study randomized 82 children, 6 to 12 years of age, stabilized on a total daily dose to the nearest equivalent of 40 to 60 mg of d,l-MPH or 20 or 30 mg/day of d-MPH. Patients participated in a screening day and practice day, and were randomized to 1 of 10 sequences of all five treatments in five separate periods. Treatments included d-MPH-ER (20 mg/day), d-MPH-ER (30 mg/day), d,l-MPH-ER (36 mg/day), d,l-MPH-ER (54 mg/day), and placebo. Primary efficacy was measured by the change from predose on the Swanson, Kotkin, Agler, M-Flynn, and Pelham (SKAMP) Rating Scale-Combined scores at 2-h postdose during the 12-h laboratory assessment (d-MPH-ER 20 mg/day vs. d,l-MPH-ER 36 mg/day). Adverse events were monitored throughout the study period. d-MPH-ER (20 mg/day) was significantly more effective than d,l-MPH-ER (36 mg/day) in the primary efficacy variable, change from predose to 2-h postdose in SKAMP-combined score. In general, d-MPH-ER had an earlier onset of action than d,l-MPH-ER, while d,l-MPH-ER had a stronger effect at 12-h postdose. No serious adverse events were reported. Treatment with either agent was associated with significant improvements in ADHD symptoms. d-MPH-ER and d,l-MPH-ER can be differentiated on what part of the day each is more effective. PMID:18362868

  19. Random Walks on Random Graphs

    NASA Astrophysics Data System (ADS)

    Cooper, Colin; Frieze, Alan

    The aim of this article is to discuss some of the notions and applications of random walks on finite graphs, especially as they apply to random graphs. In this section we give some basic definitions, in Section 2 we review applications of random walks in computer science, and in Section 3 we focus on walks in random graphs.
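As a minimal illustration of the basic definitions, the following sketch simulates a simple random walk on a small undirected graph; for a connected non-bipartite graph, the walk's stationary distribution is proportional to vertex degree, so the highest-degree vertex is visited most often (the toy graph below is ours, not from the article):

```python
import random

def random_walk_visits(adj, start, steps, seed=0):
    """Count visits of a simple random walk on a graph given as an adjacency dict:
    at each step, move to a uniformly chosen neighbour of the current vertex."""
    rng = random.Random(seed)
    visits = {v: 0 for v in adj}
    v = start
    for _ in range(steps):
        v = rng.choice(adj[v])
        visits[v] += 1
    return visits

# Toy graph: a triangle (0, 1, 2) with a pendant vertex 3 attached at node 0.
graph = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1], 3: [0]}
counts = random_walk_visits(graph, 0, 10000)
```

With degrees (3, 2, 2, 1) the stationary probabilities are deg(v)/2m = (3/8, 2/8, 2/8, 1/8), so node 0 should collect roughly three times as many visits as node 3 over a long run.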

  20. Stochastic Estimates of the Permeability Field of the Soultz-sous-Forêts Geothermal Reservoir - Comparison of Bayesian Inversion, MC Geostatistics, and EnKF Assimilation

    NASA Astrophysics Data System (ADS)

    Kosack, Christian; Vogt, Christian; Rath, Volker; Marquart, Gabriele

    2010-05-01

    The knowledge of the permeability distribution at depth is of primary concern for any geothermal reservoir engineering. However, permeability can change over orders of magnitude even for a single rock type and is additionally controlled by tectonic or engineered fracturing of the rocks. During reservoir exploration, pumping tests are regularly performed in which tracer-marked water is pumped into one borehole and retrieved at one or a few others. At the European Enhanced Geothermal System (EGS) test site at Soultz-sous-Forêts, three wells had been drilled in the granitic bedrock down to 4 to 5 km and were hydraulically stimulated to enhance the hydraulic connectivity between the wells. In July 2005, a tracer circulation test was carried out in order to estimate the changes of the hydraulic properties. A tracer was injected into the well GPK3 for 19 hours at a rate of 0.015 m³ s⁻¹ and a concentration of 0.389 mol m⁻³. Tracer concentration was measured in the production wells over the following 5 months, while the produced water was re-injected into GPK3. This experiment demonstrated a good hydraulic connection between GPK3 and one of the production wells, GPK2, while a very low connectivity was observed in the other one, GPK4. We tested three different approaches, simulating the pumping experiment with the numerical simulator shemat_suite in a simplified 3D model of the site, in order to study their respective potential to estimate a reliable permeability distribution for the Soultz reservoir: a full-physics gradient-based Bayesian inversion, a massive Monte Carlo approach with geostatistical analysis, and an Ensemble Kalman Filter (EnKF) assimilation. A common feature of all models is a high permeability zone which acts as the main flow area and transports most of the tracer. It is assumed to be associated with the fault zone cutting through the boreholes GPK2 and GPK3. With the Bayesian inversion we were able to estimate a parameter set consisting of porosity

  1. Geostatistical Analysis of Tritium, 3H/3He Age and Noble Gas Derived Parameters in California Groundwater

    NASA Astrophysics Data System (ADS)

    Visser, A.; Singleton, M. J.; Moran, J. E.; Fram, M. S.; Kulongoski, J. T.; Esser, B. K.

    2014-12-01

    Key characteristics of California groundwater systems related to aquifer vulnerability, sustainability, recharge locations and mechanisms, and anthropogenic impact on recharge are revealed in a spatial geostatistical analysis of the data set of tritium, dissolved noble gas, and helium isotope analyses collected for the California State Water Resources Control Board's Groundwater Ambient Monitoring and Assessment (GAMA) and California Aquifer Susceptibility (CAS) programs. Over 4,000 tritium and noble gas analyses are available from wells across California. 25% of the analyzed samples contained less than 1 pCi/L, indicating that recharge occurred before 1950. The correlation length of tritium concentration is 120 km. Nearly 50% of the wells show a significant component of terrigenic helium. Over 50% of these samples show a terrigenic helium isotope ratio (Rter) that is significantly higher than the radiogenic helium isotope ratio (Rrad = 2×10-8). Rter values of more than three times the atmospheric isotope ratio (Ra = 1.384×10-6) are associated with known faults and volcanic provinces in Northern California. In the Central Valley, Rter varies from radiogenic to 2.25 Ra, complicating 3H/3He dating. Rter was mapped by kriging, showing a correlation length of less than 50 km. The locally predicted Rter was used to separate tritiogenic from atmospheric and terrigenic 3He. Regional groundwater recharge areas, indicated by young groundwater ages, are located in the southern Santa Clara Basin, the upper LA basin, and the eastern San Joaquin Valley, and along unlined canals carrying Colorado River water. Recharge in California is dominated by agricultural return flows, river recharge, and managed aquifer recharge rather than precipitation excess. The combined application of noble gases and other groundwater tracers reveals the impact of engineered groundwater recharge and proves invaluable for the study of complex groundwater systems. This work was performed under the

  2. Multiscale organization of joints and faults in a fractured reservoir revealed by geostatistical, multifractal and wavelet techniques

    SciTech Connect

    Castaing, C.; Genter, A.; Ouillon, G.

    1995-08-01

    Data sets of the geometry of fracture systems were analysed at various scales in the western Arabian sedimentary platform by means of geostatistical, multifractal, and anisotropic-wavelet techniques. The investigations covered a wide range of scales, from regional to outcrop in a well-exposed area, and were based on field mapping of fractures and on the interpretation and digitizing of fracture patterns on aerial photographs and satellite images. As a first step, the fracture data sets were used to examine the direction, size, spacing, and density systematics, and the variability of these quantities with space and scale. Secondly, a multifractal analysis was carried out, which consists of estimating the moments of the spatial distribution of fractures at different resolutions. This global multifractal method was complemented by a local wavelet analysis, using a new anisotropic technique tailored to linear structures. For a map with a given scale of detail, this procedure makes it possible to define integrated fracture patterns and their associated directions at a more regional scale. The main result of this combined approach is that fracturing is not a self-similar process from the centimeter scale up to the one-million-kilometer scale. The spatial distribution of faults appears to be highly controlled by the thickness of the different rheological layers that constitute the crust. A procedure for upscaling fracture systems in sedimentary reservoirs can be proposed, based on (i) a power law for the joint-length distribution, (ii) characteristic joint spacing depending on the critical sedimentary units, and (iii) fractal fault geometry for faults larger than the whole thickness of the sedimentary basin.
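The power law for the joint-length distribution in (i) can be fitted by maximum likelihood rather than by regressing on a histogram. A sketch with a synthetic sample (the exponent, cutoff, and sample size are illustrative, not values from the study):

```python
import math
import random

def powerlaw_alpha_mle(lengths, lmin):
    """Continuous maximum-likelihood (Hill-type) estimate of the exponent alpha
    in p(l) ~ l**(-alpha) for l >= lmin."""
    tail = [l for l in lengths if l >= lmin]
    return 1.0 + len(tail) / sum(math.log(l / lmin) for l in tail)

# Synthetic check: draw from a known power law via inverse-CDF sampling,
# l = lmin * (1 - u)**(-1 / (alpha - 1)) with u uniform on [0, 1).
rng = random.Random(1)
alpha_true, lmin = 2.5, 1.0
sample = [lmin * (1.0 - rng.random()) ** (-1.0 / (alpha_true - 1.0))
          for _ in range(20000)]
alpha_hat = powerlaw_alpha_mle(sample, lmin)
```

With 20,000 samples the estimator recovers the true exponent to within a few percent; the MLE avoids the binning bias of log-log histogram fits.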

  3. Geostatistical modelling of arsenic in drinking water wells and related toenail arsenic concentrations across Nova Scotia, Canada.

    PubMed

    Dummer, T J B; Yu, Z M; Nauta, L; Murimboh, J D; Parker, L

    2015-02-01

    Arsenic is a naturally occurring class 1 human carcinogen that is widespread in private drinking water wells throughout the province of Nova Scotia in Canada. In this paper we explore the spatial variation in toenail arsenic concentrations (arsenic body burden) in Nova Scotia. We describe the regional distribution of arsenic concentrations in private well water supplies in the province, and evaluate the geological and environmental features associated with higher levels of arsenic in well water. We develop geostatistical process models to predict high toenail arsenic concentrations and high well water arsenic concentrations, which have utility for studies where no direct measurements of arsenic body burden or arsenic exposure are available. 892 men and women who participated in the Atlantic Partnership for Tomorrow's Health Project provided both drinking water and toenail clipping samples. Information on socio-demographic, lifestyle and health factors was obtained with a set of standardized questionnaires. Anthropometric indices and arsenic concentrations in drinking water and toenails were measured. In addition, data on arsenic concentrations in 10,498 private wells were provided by the Nova Scotia Department of Environment. We utilised stepwise multivariable logistic regression modelling to develop separate statistical models to: a) predict high toenail arsenic concentrations (defined as toenail arsenic levels ≥0.12 μg g(-1)) and b) predict high well water arsenic concentrations (defined as well water arsenic levels ≥5.0 μg L(-1)). We found that the geological and environmental information that predicted well water arsenic concentrations can also be used to accurately predict toenail arsenic concentrations. We conclude that geological and environmental factors contributing to arsenic contamination in well water are the major contributing influences on arsenic body burden among Nova Scotia residents. Further studies are warranted to assess appropriate
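Applying such a fitted logistic model reduces to the logistic function of a linear predictor. A minimal sketch (the coefficients and features below are hypothetical placeholders, not the fitted Nova Scotia model):

```python
import math

def logistic(x):
    """Standard logistic function."""
    return 1.0 / (1.0 + math.exp(-x))

def p_high(features, coefs, intercept):
    """Predicted probability of a 'high' well (or toenail) classification
    from a fitted logistic regression model."""
    return logistic(intercept + sum(c * f for c, f in zip(coefs, features)))

# Hypothetical predictors, e.g. a bedrock-type indicator and a scaled well depth.
p = p_high([1.0, 0.5], [1.2, -0.4], -2.0)
```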

  4. Three-dimensional geostatistical inversion of synthetic tomographic pumping and heat-tracer tests in a nested-cell setup

    NASA Astrophysics Data System (ADS)

    Schwede, Ronnie L.; Li, Wei; Leven, Carsten; Cirpka, Olaf A.

    2014-01-01

    A main purpose of groundwater inverse modeling lies in estimating the hydraulic conductivity field of an aquifer. Traditionally, hydraulic head measurements, possibly obtained in tomographic setups, are used as data. Because the groundwater flow equation is diffusive, many pumping and observation wells would be necessary to obtain a high resolution of hydraulic conductivity, which is typically not possible. We suggest performing heat tracer tests using the same already installed pumping wells and thermometers in observation planes to amend the hydraulic head data set by the arrival times of the heat signals. For each tomographic combination of wells, we recommend installing an outer pair of pumping wells, generating artificial ambient flow, and an inner well pair in which the tests are performed. We jointly invert heads and thermal arrival times in 3-D by the quasi-linear geostatistical approach using an efficiently parallelized code running on a mid-range cluster. In the present study, we evaluate the value of heat tracer versus head data in a synthetic test case, where the estimated fields can be compared to the synthetic truth. Because the sensitivity patterns of the thermal arrival times differ from those of head measurements, the resolved variance in the estimated field is 6 to 10 times higher in the joint inversion in comparison to inverting head data only. Also, in contrast to head measurements, reversing the flow field and repeating the heat-tracer test improves the estimate by reducing its estimation variance. Based on the synthetic test case, we recommend performing the tests in four principal directions, requiring in total eight pumping wells and four intersecting observation planes for heads and temperature in each direction.

  5. Geostatistics: a common link between medical geography, mathematical geology, and medical geology

    PubMed Central

    Goovaerts, P.

    2015-01-01

    Synopsis Since its development in the mining industry, geostatistics has emerged as the primary tool for spatial data analysis in various fields, ranging from earth and atmospheric sciences to agriculture, soil science, remote sensing, and more recently environmental exposure assessment. In the last few years, these tools have been tailored to the field of medical geography or spatial epidemiology, which is concerned with the study of spatial patterns of disease incidence and mortality and the identification of potential ‘causes’ of disease, such as environmental exposure, diet and unhealthy behaviours, economic or socio-demographic factors. On the other hand, medical geology is an emerging interdisciplinary scientific field studying the relationship between natural geological factors and their effects on human and animal health. This paper provides an introduction to the field of medical geology with an overview of geostatistical methods available for the analysis of geological and health data. Key concepts are illustrated using the mapping of groundwater arsenic concentration across eleven Michigan counties and the exploration of its relationship to the incidence of prostate cancer at the township level. PMID:25722963

  6. Redesigning rain gauges network in Johor using geostatistics and simulated annealing

    NASA Astrophysics Data System (ADS)

    Aziz, Mohd Khairul Bazli Mohd; Yusof, Fadhilah; Daud, Zalina Mohd; Yusop, Zulkifli; Kasno, Mohammad Afif

    2015-02-01

    Recently, many rainfall network design techniques have been developed, discussed, and compared by many researchers. Present-day hydrological studies require higher levels of accuracy from collected data. In numerous basins, the rain gauge stations are located without a clear scientific basis. In this study, an attempt is made to redesign the rain gauge network for Johor, Malaysia in order to meet the required level of accuracy preset by rainfall data users. The existing network of 84 rain gauges in Johor is optimized and redesigned into new locations by using rainfall, humidity, solar radiation, temperature and wind speed data collected during the monsoon season (November - February) of 1975 until 2008. This study used a combination of a geostatistical method (the variance-reduction method) and simulated annealing as the optimization algorithm during the redesign process. The results show that the new rain gauge locations provide the minimum estimated variance. This shows that the combination of the geostatistical method (variance-reduction method) and simulated annealing is successful in the development of the new optimum rain gauge network.
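The optimization loop can be sketched as follows. For simplicity this stand-in maximizes spatial coverage (sum of nearest-neighbour distances) instead of evaluating the kriging estimation variance, but the annealing structure (random swaps, temperature schedule, acceptance rule) is the same; all names and parameters are illustrative:

```python
import math
import random

def anneal_network(stations, k, steps=3000, t0=1.0, cooling=0.995, seed=0):
    """Simulated-annealing selection of k gauge sites maximizing spatial
    coverage, a stand-in for the variance-reduction criterion of the paper."""
    rng = random.Random(seed)

    def coverage(sel):
        pts = [stations[i] for i in sel]
        return sum(min(math.dist(p, q) for j, q in enumerate(pts) if j != i)
                   for i, p in enumerate(pts))

    current = rng.sample(range(len(stations)), k)
    best, best_cov = list(current), coverage(current)
    t = t0
    for _ in range(steps):
        cand = list(current)
        cand[rng.randrange(k)] = rng.randrange(len(stations))
        if len(set(cand)) < k:  # reject duplicate sites
            continue
        delta = coverage(cand) - coverage(current)
        # Accept improvements always, worsenings with Boltzmann probability.
        if delta > 0 or rng.random() < math.exp(delta / t):
            current = cand
        if coverage(current) > best_cov:
            best, best_cov = list(current), coverage(current)
        t *= cooling
    return sorted(best), best_cov

# Candidate sites on a transect; the best pair of 2 sites spreads to the ends.
sites = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0), (4.0, 0.0)]
chosen, cov = anneal_network(sites, 2)
```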

  7. Geostatistical Borehole Image-Based Mapping of Karst-Carbonate Aquifer Pores.

    PubMed

    Sukop, Michael C; Cunningham, Kevin J

    2016-03-01

    Quantification of the character and spatial distribution of porosity in carbonate aquifers is important as input into computer models used in the calculation of intrinsic permeability and for next-generation, high-resolution groundwater flow simulations. Digital, optical, borehole-wall image data from three closely spaced boreholes in the karst-carbonate Biscayne aquifer in southeastern Florida are used in geostatistical experiments to assess the capabilities of various methods to create realistic two-dimensional models of vuggy megaporosity and matrix-porosity distribution in the limestone that composes the aquifer. When the borehole image data alone were used as the model training image, multiple-point geostatistics failed to detect the known spatial autocorrelation of vuggy megaporosity and matrix porosity among the three boreholes, which were only 10 m apart. Variogram analysis and subsequent Gaussian simulation produced results that showed a realistic conceptualization of horizontal continuity of strata dominated by vuggy megaporosity and matrix porosity among the three boreholes. PMID:26174850

  8. Analysis of large scale spatial variability of soil moisture using a geostatistical method.

    PubMed

    Lakhankar, Tarendra; Jones, Andrew S; Combs, Cynthia L; Sengupta, Manajit; Vonder Haar, Thomas H; Khanbilvardi, Reza

    2010-01-01

    Spatial and temporal soil moisture dynamics are critically needed to improve the parameterization of hydrological and meteorological modeling processes. This study evaluates the statistical spatial structure of large-scale observed and simulated estimates of soil moisture under pre- and post-precipitation event conditions. This large-scale variability is crucial in the calibration and validation of large-scale satellite-based data assimilation systems. Spatial analysis using geostatistical approaches was used to validate soil moisture modeled by the Agriculture Meteorological (AGRMET) model against in situ measurements of soil moisture from a state-wide environmental monitoring network (Oklahoma Mesonet). The results show that the AGRMET data produce larger spatial decorrelation compared to the in situ soil moisture data. Precipitation storms drive the soil moisture spatial structure at large scales, with smaller decorrelation lengths found after precipitation. This study also evaluates the geostatistical approach for mitigating quality control issues within the in situ soil moisture network by estimating soil moisture at unsampled stations. PMID:22315576
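The spatial structure referred to above is typically summarized by an empirical semivariogram, from which a decorrelation length can be read off. A minimal sketch of the classical (Matheron) estimator on a synthetic field (the grid and values are illustrative, not the Mesonet data):

```python
import math

def empirical_semivariogram(pts, vals, bin_edges):
    """Classical (Matheron) semivariogram estimator: half the mean squared
    difference between all point pairs falling in each lag-distance bin."""
    nb = len(bin_edges) - 1
    sums, counts = [0.0] * nb, [0] * nb
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            h = math.dist(pts[i], pts[j])
            for b in range(nb):
                if bin_edges[b] <= h < bin_edges[b + 1]:
                    sums[b] += 0.5 * (vals[i] - vals[j]) ** 2
                    counts[b] += 1
                    break
    return [s / c if c else float("nan") for s, c in zip(sums, counts)]

# A field varying linearly in x on a 5x5 grid: semivariance grows with lag.
pts = [(float(x), float(y)) for x in range(5) for y in range(5)]
vals = [p[0] for p in pts]
gamma = empirical_semivariogram(pts, vals, [0.0, 1.5, 3.0, 4.5])
```

In practice the decorrelation length is the lag at which the fitted variogram reaches its sill; shorter lengths after precipitation correspond to a semivariogram that rises faster.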

  9. Acceleration of the Geostatistical Software Library (GSLIB) by code optimization and hybrid parallel programming

    NASA Astrophysics Data System (ADS)

    Peredo, Oscar; Ortiz, Julián M.; Herrero, José R.

    2015-12-01

    The Geostatistical Software Library (GSLIB) has been used in the geostatistical community for more than thirty years. It was designed as a bundle of sequential Fortran codes, and today it is still in use by many practitioners and researchers. Despite its widespread use, few attempts have been reported to bring this package into the multi-core era. Using all CPU resources, GSLIB algorithms can handle large datasets and grids, where the tasks are compute- and memory-intensive. In this work, a methodology is presented to accelerate GSLIB applications using code optimization and hybrid parallel processing, specifically for compute-intensive applications. Minimal code modifications are added, decreasing as much as possible the elapsed execution time of the studied routines. If multi-core processing is available, the user can activate OpenMP directives to speed up execution using all resources of the CPU. If multi-node processing is available, the execution is further enhanced using MPI messages between the compute nodes. Four case studies are presented: experimental variogram calculation, kriging estimation, and sequential Gaussian and indicator simulation. For each application, three scenarios (small, large and extra large) are tested using a desktop environment with 4 CPU-cores and a multi-node server with 128 CPU-nodes. Elapsed times, speedup and efficiency results are shown.

  10. Accounting for Transport Parameter Uncertainty in Geostatistical Groundwater Contaminant Release History Estimation

    NASA Astrophysics Data System (ADS)

    Ostrowski, J.; Shlomi, S.; Michalak, A.

    2007-12-01

    The process of estimating the release history of a contaminant in groundwater relies on coupling a limited number of concentration measurements with a groundwater flow and transport model in an inverse modeling framework. The information provided by available measurements is generally not sufficient to fully characterize the unknown release history; therefore, an accurate assessment of the estimation uncertainty is required. The modeler's level of confidence in the transport parameters, expressed as pdfs, can be incorporated into the inverse model to improve the accuracy of the release estimates. In this work, geostatistical inverse modeling is used in conjunction with Monte Carlo sampling of transport parameters to estimate groundwater contaminant release histories. Concentration non-negativity is enforced using a Gibbs sampling algorithm based on a truncated normal distribution. The method is applied to two one-dimensional test cases: a hypothetical dataset commonly used in validating contaminant source identification methods, and data collected from a tetrachloroethylene and trichloroethylene plume at the Dover Air Force Base in Delaware. The estimated release histories and associated uncertainties are compared to results from a geostatistical inverse model where uncertainty in transport parameters is ignored. Results show that the a posteriori uncertainty associated with the model that accounts for parameter uncertainty is higher, but that this model provides a more realistic representation of the release history based on available data. This modified inverse modeling technique has many applications, including assignment of liability in groundwater contamination cases, characterization of groundwater contamination, and model calibration.
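Non-negativity enforcement via a truncated normal can be illustrated with a one-dimensional rejection sampler; the paper's Gibbs algorithm cycles such draws over many correlated components, which is beyond this sketch (all parameters below are illustrative):

```python
import random

def truncated_normal(mu, sigma, low, rng):
    """Rejection sampling from N(mu, sigma**2) conditioned on x >= low;
    adequate when `low` is not deep in the upper tail of the distribution."""
    while True:
        x = rng.gauss(mu, sigma)
        if x >= low:
            return x

# Draws from N(1, 1) truncated at zero: all samples are non-negative and the
# sample mean exceeds mu because the lower tail is cut off.
rng = random.Random(0)
samples = [truncated_normal(1.0, 1.0, 0.0, rng) for _ in range(2000)]
```

For truncation points far in the upper tail, rejection sampling becomes inefficient and dedicated truncated-normal samplers are preferred.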

  11. Spatial and temporal groundwater level variation geostatistical modeling in the city of Konya, Turkey.

    PubMed

    Cay, Tayfun; Uyan, Mevlut

    2009-12-01

    Groundwater is one of the most important resources used for drinking and utility and irrigation purposes in the city of Konya, Turkey, as in many areas. The purpose of this study is to evaluate spatial and temporal changes in the level of groundwater by using geostatistical methods based on data from 91 groundwater wells during the period 1999 to 2003. Geostatistical methods have been used widely as a convenient tool to make decisions on the management of groundwater levels. To evaluate the spatial and temporal changes in the level of the groundwater, a vector-based geographic information system software package, ArcGIS 9.1 (Environmental Systems Research Institute, Redlands, California), was used for the application of an ordinary kriging method, with cross-validation leading to the estimation of groundwater levels. The average value of the variogram (spherical model) for the spatial analysis was approximately 2150 m. Ordinary kriging underestimated groundwater level drops by 17%. Cross-validation errors were within an acceptable level. The kriging model also helps to detect risk-prone areas for groundwater abstraction. PMID:20099631

  12. Geostatistical analysis of the temporal variability of ozone concentrations. Comparison between CHIMERE model and surface observations

    NASA Astrophysics Data System (ADS)

    de Fouquet, Chantal; Malherbe, Laure; Ung, Anthony

    2011-07-01

    Deterministic models have become essential tools to forecast and map concentration fields of atmospheric pollutants like ozone. Those models are regularly updated and improved by incorporating recent theoretical developments and using more precise input data. Unavoidable differences with in situ measurements still remain, which need to be better understood. This study investigates those discrepancies in a geostatistical framework by comparing the temporal variability of ozone hourly surface concentrations simulated by a chemistry-transport model, CHIMERE, and measured across France. More than 200 rural and urban background monitoring sites are considered. The relationship between modelled and observed data is complex. Ozone concentrations evolve according to various time scales. CHIMERE correctly accounts for those different scales of variability but is usually unable to reproduce the exact magnitude of each temporal component. Such difficulty cannot be entirely attributed to the difference in spatial support between grid cell averages and punctual observations. As a result of this exploratory analysis, the common multivariate geostatistical model, known as the linear model of coregionalization, is used to describe the temporal variability of ozone hourly concentrations and the relationship between simulated and observed values at each observation point. The fitted parameters of the model can then be interpreted. Their distribution in space provides objective criteria to delimitate the areas where the chemistry-transport model is more or less reliable.

  13. Geostatistical analysis of soil moisture distribution in a part of Solani River catchment

    NASA Astrophysics Data System (ADS)

    Kumar, Kamal; Arora, M. K.; Hariprasad, K. S.

    2016-03-01

    The aim of this paper is to estimate soil moisture at the spatial level by applying geostatistical techniques to point observations of soil moisture in parts of the Solani River catchment in Haridwar district of India. Undisturbed soil samples were collected at 69 locations with a soil core sampler at a depth of 0-10 cm from the soil surface. Of these, discrete soil moisture observations at 49 locations were used to generate a spatial soil moisture distribution map of the region. Two geostatistical techniques, namely moving average and kriging, were adopted. The root mean square error (RMSE) between observed and estimated soil moisture at the remaining 20 locations was determined to assess the accuracy of the estimated soil moisture. Both techniques resulted in low RMSE at a small limiting distance, which increased with the increase in the limiting distance. The RMSE varied from 7.42 to 9.77 for the moving average method, while for kriging it varied from 7.33 to 9.99, indicating similar performance of the two techniques.
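The two estimators compared above can be sketched compactly; here the moving average is taken as a distance-weighted mean within the limiting distance, with RMSE computed against held-out observations (an assumed inverse-distance weighting, since the paper does not specify one; all data are illustrative):

```python
import math

def moving_average(pts, vals, x0, radius):
    """Inverse-distance-weighted moving average of the observations that lie
    within the limiting distance `radius` of the target location x0."""
    num = den = 0.0
    for p, v in zip(pts, vals):
        d = math.dist(p, x0)
        if d <= radius:
            w = 1.0 / (d + 1e-12)  # small offset avoids division by zero
            num += w * v
            den += w
    return num / den if den else float("nan")

def rmse(obs, est):
    """Root mean square error between observed and estimated values."""
    return math.sqrt(sum((o - e) ** 2 for o, e in zip(obs, est)) / len(obs))

# At an observation location the estimate collapses to the observed value.
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
vals = [2.0, 4.0, 6.0]
est = moving_average(pts, vals, (0.0, 0.0), radius=2.0)
```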

  14. Geostatistical borehole image-based mapping of karst-carbonate aquifer pores

    USGS Publications Warehouse

    Michael Sukop; Cunningham, Kevin J.

    2016-01-01

    Quantification of the character and spatial distribution of porosity in carbonate aquifers is important as input into computer models used in the calculation of intrinsic permeability and for next-generation, high-resolution groundwater flow simulations. Digital, optical, borehole-wall image data from three closely spaced boreholes in the karst-carbonate Biscayne aquifer in southeastern Florida are used in geostatistical experiments to assess the capabilities of various methods to create realistic two-dimensional models of vuggy megaporosity and matrix-porosity distribution in the limestone that composes the aquifer. When the borehole image data alone were used as the model training image, multiple-point geostatistics failed to detect the known spatial autocorrelation of vuggy megaporosity and matrix porosity among the three boreholes, which were only 10 m apart. Variogram analysis and subsequent Gaussian simulation produced results that showed a realistic conceptualization of horizontal continuity of strata dominated by vuggy megaporosity and matrix porosity among the three boreholes.

  15. Bridges between multiple-point geostatistics and texture synthesis: Review and guidelines for future research

    NASA Astrophysics Data System (ADS)

    Mariethoz, Gregoire; Lefebvre, Sylvain

    2014-05-01

    Multiple-Point Simulation (MPS) is a family of geostatistical tools that has received a lot of attention in recent years for the characterization of spatial phenomena in geosciences. It relies on the definition of training images to represent a given type of spatial variability, or texture. We show that the algorithmic tools used are similar in many ways to techniques developed in computer graphics, where there is a need to generate large amounts of realistic textures for applications such as video games and animated movies. Similarly to MPS, these texture synthesis methods use training images, or exemplars, to generate realistic-looking graphical textures. The domains of multiple-point geostatistics and example-based texture synthesis present similarities in their historical development and share similar concepts. These disciplines have however remained separate, and as a result significant algorithmic innovations in each discipline have not been universally adopted. Texture synthesis algorithms offer drastically better computational efficiency, pattern reproduction, and user control. At the same time, MPS has developed ways to condition models to spatial data and to produce 3D stochastic realizations, which have not been thoroughly investigated in the field of texture synthesis. In this paper we review the possible links between these disciplines and show the potential and limitations of using concepts and approaches from texture synthesis in MPS. We also provide guidelines on how recent developments could benefit both fields of research, and what challenges remain open.
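    The kernel shared by MPS and exemplar-based texture synthesis is pattern matching against a training image. The sketch below is a deliberately naive, hypothetical version of that step (in the spirit of direct-sampling MPS, not any specific published algorithm): given a neighbourhood with some cells already simulated and the centre unknown, scan the training image exhaustively for the closest patch and copy its centre value.

```python
import numpy as np

def best_match_value(ti, pattern):
    """Scan a 2-D training image `ti` exhaustively for the patch closest
    (squared distance over the known cells) to `pattern`, where NaN marks
    not-yet-simulated cells; return the training value at the patch centre."""
    ph, pw = pattern.shape
    known = ~np.isnan(pattern)
    best_dist, best_val = np.inf, np.nan
    for i in range(ti.shape[0] - ph + 1):
        for j in range(ti.shape[1] - pw + 1):
            patch = ti[i:i + ph, j:j + pw]
            dist = np.sum((patch[known] - pattern[known]) ** 2)
            if dist < best_dist:
                best_dist, best_val = dist, patch[ph // 2, pw // 2]
    return best_val

# A neighbourhood cut from the training image itself (centre removed)
# is matched exactly, so the centre value is recovered.
ti = np.arange(25, dtype=float).reshape(5, 5)
pattern = ti[1:4, 1:4].copy()
pattern[1, 1] = np.nan
print(best_match_value(ti, pattern))   # 12.0
```

The efficiency gap mentioned in the abstract lives precisely here: graphics methods replace this exhaustive scan with approximate nearest-neighbour searches and coherence heuristics.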

  16. Geostatistical models of secondary oil migration within heterogeneous carrier beds: A theoretical example

    SciTech Connect

    Rhea, L.; Person, M.; Marsily, G. de; Ledoux, E.; Galli, A.

    1994-11-01

    This paper critically evaluates the utility of two different geostatistical methods in tracing long-distance oil migration through sedimentary basins. Geostatistical models of petroleum migration based on kriging and the conditional simulation method are assessed by comparing them to "known" oil migration rates and directions through a numerical carrier bed. In this example, the numerical carrier bed, which serves as "ground truth" in the study, incorporates a synthetic permeability field generated using the method of turning bands. Different representations of lateral permeability heterogeneity of the carrier bed are incorporated into a quasi-three-dimensional model of secondary oil migration. The geometric configuration of the carrier bed is intended to represent migration conditions within the center of a saucer-shaped intracratonic sag basin. In all of the numerical experiments, oil is sourced in the lowest 10% of a saucer-shaped carrier bed and migrates 10-14 km outward in a radial fashion by buoyancy. The effects of vertical permeability variations on secondary oil migration were not considered in the study.

  17. Redesigning rain gauges network in Johor using geostatistics and simulated annealing

    SciTech Connect

    Aziz, Mohd Khairul Bazli Mohd; Yusof, Fadhilah; Daud, Zalina Mohd; Yusop, Zulkifli; Kasno, Mohammad Afif

    2015-02-03

    Recently, many rainfall network design techniques have been developed, discussed and compared by many researchers. Present-day hydrological studies require higher levels of accuracy from collected data. In numerous basins, the rain gauge stations are located without clear scientific justification. In this study, an attempt is made to redesign the rain gauge network for Johor, Malaysia in order to meet the required level of accuracy preset by rainfall data users. The existing network of 84 rain gauges in Johor is optimized and redesigned into new locations using rainfall, humidity, solar radiation, temperature and wind speed data collected during the monsoon season (November - February) of 1975 until 2008. This study used a combination of a geostatistical method (the variance-reduction method) and simulated annealing as the optimization algorithm during the redesign process. The result shows that the new rain gauge locations provide the minimum value of estimated variance. This shows that the combination of the geostatistical variance-reduction method and simulated annealing is successful in the development of the new optimum rain gauge network.
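    The annealing loop described above can be sketched generically. This is a hypothetical toy, not the study's implementation: the true objective there is the kriging estimation variance, which is replaced here by a simple geometric proxy (mean distance from grid points to the nearest gauge), and all sites and parameters are invented.

```python
import numpy as np

def coverage_cost(gauges, grid):
    """Proxy objective: mean distance from each grid point to its nearest
    gauge (the study minimises the kriged estimation variance instead)."""
    d = np.linalg.norm(grid[:, None, :] - gauges[None, :, :], axis=2)
    return float(d.min(axis=1).mean())

def anneal_network(gauges, grid, candidates, steps=3000, t0=0.5, seed=1):
    """Simulated annealing: repeatedly relocate one random gauge to a random
    candidate site, accept worse layouts with a temperature-dependent
    probability, and return the best layout seen."""
    rng = np.random.default_rng(seed)
    cur, cur_cost = gauges.copy(), coverage_cost(gauges, grid)
    best, best_cost = cur.copy(), cur_cost
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-12          # cooling schedule
        trial = cur.copy()
        trial[rng.integers(len(trial))] = candidates[rng.integers(len(candidates))]
        c = coverage_cost(trial, grid)
        if c < cur_cost or rng.random() < np.exp((cur_cost - c) / t):
            cur, cur_cost = trial, c
            if c < best_cost:
                best, best_cost = trial.copy(), c
    return best, best_cost

# Hypothetical example: start with all gauges piled in one corner of a
# unit square; annealing should spread them out and lower the cost.
g = np.linspace(0, 1, 6)
grid = np.array([[x, y] for x in g for y in g])
start = np.zeros((4, 2))
_, final_cost = anneal_network(start, grid, candidates=grid)
print(final_cost, coverage_cost(start, grid))
```

Swapping `coverage_cost` for an estimation-variance routine recovers the variance-reduction formulation used in the study.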

  18. Downscaling remotely sensed imagery using area-to-point cokriging and multiple-point geostatistical simulation

    NASA Astrophysics Data System (ADS)

    Tang, Yunwei; Atkinson, Peter M.; Zhang, Jingxiong

    2015-03-01

    A cross-scale data integration method was developed and tested based on the theory of geostatistics and multiple-point geostatistics (MPG). The goal was to downscale remotely sensed images while retaining spatial structure by integrating images at different spatial resolutions. During the process of downscaling, a rich spatial correlation model in the form of a training image was incorporated to facilitate reproduction of similar local patterns in the simulated images. Area-to-point cokriging (ATPCK) was used to provide a locally varying mean (LVM) (i.e., soft data) to deal with the change-of-support problem (COSP) for cross-scale integration, which MPG cannot achieve alone. Several pairs of spectral bands of remotely sensed images were tested for integration within different cross-scale case studies. The experiment shows that MPG can restore the spatial structure of the image at a fine spatial resolution given the training image and conditioning data. The super-resolution image can be predicted using the proposed method, which cannot be realised using most data integration methods. The results show that the ATPCK-MPG approach can achieve greater accuracy than methods which do not account for the change-of-support issue.

  19. Geostatistical analysis of Landsat-TM lossy compression images in a high-performance computing environment

    NASA Astrophysics Data System (ADS)

    Pesquer, Lluís; Cortés, Ana; Serral, Ivette; Pons, Xavier

    2011-11-01

    The main goal of this study is to characterize the effects of lossy image compression procedures on the spatial patterns of remotely sensed images, as well as to test the performance of job distribution tools specifically designed for obtaining geostatistical parameters (variogram) in a High Performance Computing (HPC) environment. To this purpose, radiometrically and geometrically corrected Landsat-5 TM images from April, July, August and September 2006 were compressed using two different methods: Band-Independent Fixed-Rate (BIFR) and three-dimensional Discrete Wavelet Transform (3d-DWT) applied to the JPEG 2000 standard. For both methods, a wide range of compression ratios (2.5:1, 5:1, 10:1, 50:1, 100:1, 200:1 and 400:1, from soft to hard compression) were compared. Variogram analyses conclude that all compression ratios maintain the variogram shapes and that the higher ratios (more than 100:1) reduce the variance of the sill parameter by about 5%. Moreover, the parallel solution in a distributed environment demonstrates that HPC offers a suitable scientific test bed for time-demanding execution processes, as in geostatistical analyses of remote sensing images.
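    The variogram comparison underlying this study reduces to computing an empirical semivariogram per image and checking its shape and sill. A minimal sketch, assuming hypothetical image data and restricting pairs to row-wise lags for brevity:

```python
import numpy as np

def row_semivariogram(img, max_lag):
    """Empirical semivariogram of a 2-D image along rows:
    gamma(h) = 0.5 * mean((z(x) - z(x+h))^2) over all row-wise pairs at lag h."""
    img = np.asarray(img, dtype=float)
    return np.array([0.5 * np.mean((img[:, h:] - img[:, :-h]) ** 2)
                     for h in range(1, max_lag + 1)])

# A constant band has no spatial variance (gamma = 0 at every lag), while a
# 0/1 checkerboard alternates: squared difference 1 at lag 1, 0 at lag 2.
flat = np.zeros((4, 8))
checker = np.indices((4, 8)).sum(axis=0) % 2
print(row_semivariogram(flat, 2))      # [0. 0.]
print(row_semivariogram(checker, 2))   # [0.5 0. ]
```

Comparing such curves for the original and compressed bands shows whether a compression ratio preserves the shape and how much it shrinks the sill, which is what the study distributes across HPC nodes for full scenes.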

  20. Geostatistical Analysis of County-Level Lung Cancer Mortality Rates in the Southeastern United States

    PubMed Central

    Goovaerts, Pierre

    2009-01-01

    The analysis of health data and putative covariates, such as environmental, socioeconomic, demographic, behavioral, or occupational factors, is a promising application for geostatistics. Transferring methods originally developed for the analysis of earth properties to health science, however, presents several methodological and technical challenges. These arise because health data are typically aggregated over irregular spatial supports (e.g., counties) and consist of a numerator and a denominator (i.e., rates). This article provides an overview of geostatistical methods tailored specifically to the characteristics of areal health data, with an application to lung cancer mortality rates in 688 U.S. counties of the southeast (1970–1994). Factorial Poisson kriging can filter short-scale variation and noise, which can be large in sparsely populated counties, to reveal similar regional patterns for male and female cancer mortality that correlate well with proximity to shipyards. Rate uncertainty was transferred through local cluster analysis using stochastic simulation, allowing the computation of the likelihood of clusters of low or high cancer mortality. Accounting for population size and rate uncertainty led to the detection of new clusters of high mortality around Oak Ridge National Laboratory for both sexes, in counties with high concentrations of pig farms and paper mill industries for males (occupational exposure) and in the vicinity of Atlanta for females. PMID:20445829

  1. Multidimensional set switching.

    PubMed

    Hahn, Sowon; Andersen, George J; Kramer, Arthur F

    2003-06-01

    The present study examined the organization of preparatory processes that underlie set switching and, more specifically, switch costs. On each trial, subjects performed one of two perceptual judgment tasks, color or shape discrimination. Subjects also responded with one of two different response sets. The task set and/or the response set switched from one to the other after 2-6 repeated trials. Response set, task set, and double set switches were performed in both blocked and randomized conditions. Subjects performed with short (100-msec) and long (800-msec) preparatory intervals. Task and response set switches had an additive effect on reaction times (RTs) in the blocked condition. Such a pattern of results suggests a serial organization of preparatory processes when the nature of switches is predictable. However, task and response set switches had an underadditive effect on RTs in the random condition when subjects performed with a brief cue-to-target interval. This pattern of results suggests overlapping task and response set preparation. These findings are discussed in terms of strategic control of preparatory processes in set switching. PMID:12921431

  2. 0.6-1.0 V operation set/reset voltage (3 V) generator for three-dimensional integrated resistive random access memory and NAND flash hybrid solid-state drive

    NASA Astrophysics Data System (ADS)

    Tanaka, Masahiro; Hachiya, Shogo; Ishii, Tomoya; Ning, Sheyang; Tsurumi, Kota; Takeuchi, Ken

    2016-04-01

    A 0.6-1.0 V, 25.9 mm2 boost converter is proposed to generate the resistive random access memory (ReRAM) write (set/reset) voltage for a three-dimensional (3D) integrated ReRAM and NAND flash hybrid solid-state drive (SSD). The proposed boost converter uses an integrated area-efficient V BUF generation circuit to obtain short ReRAM sector write time, small circuit size, and small energy consumption simultaneously. Specifically, the proposed boost converter reduces ReRAM sector write time by 65% compared with a conventional one-stage boost converter (Conventional 1), which uses a 1.0 V operating voltage. On the other hand, at the same ReRAM sector write time, the proposed boost converter reduces circuit area by 49% and energy consumption by 46% compared with a conventional two-stage boost converter (Conventional 2). In addition, by using the proposed boost converter, the operating voltage, V DD, can be reduced to 0.6 V. The lowest energy consumption, 159 nJ, is obtained when V DD is 0.7 V.

  3. A multi-site single-blind clinical study to compare the effects of STAIR Narrative Therapy to treatment as usual among women with PTSD in public sector mental health settings: study protocol for a randomized controlled trial

    PubMed Central

    2014-01-01

    Background This article provides a description of the rationale, design, and methods of a multisite clinical trial which evaluates the potential benefits of an evidence-based psychosocial treatment, STAIR Narrative Therapy, among women with posttraumatic stress disorder (PTSD) related to interpersonal violence who are seeking services in public sector community mental health clinics. This is the first large multisite trial of an evidence-based treatment for PTSD provided in the context of community settings that are dedicated to the treatment of poverty-level patient populations. Methods The study is enrolling 352 participants in a minimum of 4 community clinics. Participants are randomized into either STAIR Narrative Therapy or Treatment As Usual (TAU). Primary outcomes are PTSD, emotion management and interpersonal problems. The study will allow a flexible application of the protocol determined by patient need and preferences. Secondary analyses will assess the relationship of outcomes to different patterns of treatment implementation for different levels of baseline symptom severity. Discussion The article discusses the rationale and study issues related to the use of a flexible delivery of a protocol treatment and of the selection of treatment as it is actually practiced in the community as the comparator. Trial registration Clinicaltrials.gov identifier: NCT01488539. PMID:24886235

  4. Interventions for physical activity promotion applied to the primary healthcare settings for people living in regions of low socioeconomic level: study protocol for a non-randomized controlled trial

    PubMed Central

    2014-01-01

    Background Regular physical activity has been widely recommended for promoting health, but physical activity levels remain low in the population. Therefore, the study of interventions to promote physical activity is essential. Objective: To present the methodology of two physical activity interventions from the “Ambiente Ativo” (“Active Environment”) project. Methods 12-month non-randomized controlled intervention trial. 157 healthy and physically inactive individuals were selected: health education (n = 54), supervised exercise (n = 54) and control (n = 49). Intervention based on health education: a multidisciplinary team of health professionals organized the intervention in group discussions, phone calls, SMS and educational material. Intervention based on a supervised exercise program: an exercise program in groups supervised by physical education professionals involving strength, endurance and flexibility exercises. Physical activity level was assessed by the International Physical Activity Questionnaire (long version), physical activity recalls, pedometers and accelerometers over a seven-day period. Results This study described two different proposals for promoting physical activity applied to adults served by public healthcare settings. The participants were living in a region of low socioeconomic level; the interventions respected the characteristics and organization of the system and its professionals and were adapted to the realities of the individuals served. Conclusion Both interventions are applicable in regions of low socioeconomic level, while respecting the social and economic characteristics of each region. Trial registration ClinicalTrials.gov NCT01852981 PMID:24624930

  5. Approaches in highly parameterized inversion: bgaPEST, a Bayesian geostatistical approach implementation with PEST: documentation and instructions

    USGS Publications Warehouse

    Fienen, Michael N.; D'Oria, Marco; Doherty, John E.; Hunt, Randall J.

    2013-01-01

    The application bgaPEST is a highly parameterized inversion software package implementing the Bayesian Geostatistical Approach in a framework compatible with the parameter estimation suite PEST. Highly parameterized inversion refers to cases in which parameters are distributed in space or time and are correlated with one another. The Bayesian aspect of bgaPEST is related to Bayesian probability theory, in which prior information about parameters is formally revised on the basis of the calibration dataset used for the inversion. Conceptually, this approach formalizes the conditionality of estimated parameters on the specific data and model available. The geostatistical component of the method refers to the way in which prior information about the parameters is used. A geostatistical autocorrelation function is used to enforce structure on the parameters to avoid overfitting and unrealistic results. The Bayesian Geostatistical Approach is designed to provide the smoothest solution that is consistent with the data. Optionally, users can specify a level of fit or estimate a balance between fit and model complexity informed by the data. Groundwater and surface-water applications are used as examples in this text, but the possible uses of bgaPEST extend to any distributed-parameter application.

  6. Mapping in random-structures

    SciTech Connect

    Reidys, C.M.

    1996-06-01

    A mapping in random-structures is defined on the vertices of a generalized hypercube Q_α^n. A random-structure will consist of (1) a random contact graph and (2) a family of relations imposed on adjacent vertices. The vertex set of a random contact graph will be the set of all coordinates of a vertex P ∈ Q_α^n. Its edge set will be the union of the edge sets of two random graphs. The first is a random 1-regular graph on 2m vertices (coordinates) and the second is a random graph G_p with p = c_2/n on all n vertices (coordinates). The structure of the random contact graphs will be investigated and it will be shown that for certain values of m and c_2 the mapping in random-structures allows searching the set of random-structures. This is applied to mappings in RNA secondary structures. Also, the results on random-structures might be helpful for designing 3D-folding algorithms for RNA.
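    The edge-set construction described above is concrete enough to sketch. A minimal, hypothetical illustration (function name and parameters are mine): the union of a random perfect matching on the first 2m coordinates with an Erdős–Rényi graph G(n, p), p = c_2/n, on all n coordinates.

```python
import random

def random_contact_edges(n, m, c2, seed=0):
    """Edge set of a random contact graph on n coordinates: a random
    perfect matching (1-regular graph) on the first 2m coordinates,
    unioned with an Erdos-Renyi graph G(n, p), p = c2 / n, on all n."""
    rng = random.Random(seed)
    verts = list(range(2 * m))
    rng.shuffle(verts)                     # random pairing of the 2m coordinates
    edges = {tuple(sorted((verts[2 * i], verts[2 * i + 1]))) for i in range(m)}
    p = c2 / n
    for i in range(n):                     # independent Bernoulli(p) edges
        for j in range(i + 1, n):
            if rng.random() < p:
                edges.add((i, j))
    return edges

# With c2 = 0 only the matching survives: m disjoint edges covering the
# first 2m coordinates exactly once each.
e = random_contact_edges(n=10, m=3, c2=0.0)
print(len(e))   # 3
```

The values of m and c_2 then control whether the contact graph is sparse enough for the search property established in the paper.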

  7. Introduction to this Special Issue on Geostatistics and Scaling of Remote Sensing

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale A.

    1999-01-01

    The germination of this special PE&RS issue began at the Royal Geographical Society (with the Institute of British Geographers) (RGS-IBG) annual meeting in January 1997, held at the University of Exeter in Exeter, England. The cold and snow of an English winter were greatly tempered by the friendly and cordial discussions that ensued at the meeting on possible ways to foster both dialog and research across "the Big Pond" between geographers in the US and the UK on the use of geostatistics and geospatial techniques for remote sensing of land surface processes. It was decided that one way to stimulate and enhance cooperation on the application of geostatistics and geospatial methods in remote sensing was to hold parallel sessions on these topics at appropriate meeting venues in 1998 in both the US and the UK. Selected papers given at these sessions would be published as a special issue of PE&RS on the US side, and as a special issue of Computers and Geosciences (C&G) on the UK side, to highlight the commonality in research on geostatistics and geospatial methods in remote sensing and spatial data analysis on both sides of the Atlantic Ocean. As a consequence, a session on "Geostatistics and Geospatial Techniques for Remote Sensing of Land Surface Processes" was held at the Association of American Geographers (AAG) annual meeting in Boston, Massachusetts in March 1998, sponsored by the AAG's Remote Sensing Specialty Group (RSSG). A similar session was held at the RGS-IBG annual meeting in Guildford, Surrey, England in January 1998, organized by the Modeling and Advanced Techniques Special Interest Group (MAT SIG) of the Remote Sensing Society (RSS). The six papers that, in part, comprise this issue of PE&RS are the US complement to this dual journal publication effort. Both of us are co-editors of each of the journal special issues, with the lead editor of each journal being from the respective side of the Atlantic where the journal is published. The special

  8. Geostatistical Evaluation of Spring Water Quality in an Urbanizing Carbonate Aquifer

    NASA Astrophysics Data System (ADS)

    McGinty, A.; Welty, C.

    2003-04-01

    As part of an investigation of the impacts of urbanization on the hydrology and ecology of Valley Creek watershed near Philadelphia, Pennsylvania, we have analyzed the chemical composition of 110 springs to assess the relative influence of geology and anthropogenic activities on water quality. The 60 km^2 watershed is underlain by productive fractured rock aquifers composed of Cambrian and Ordovician carbonate rocks in the central valley and Cambrian crystalline and siliciclastic rocks (quartzite and phyllite) in the north and south hills that border the valley. All tributaries of the surface water system originate in the crystalline and siliciclastic hills. The watershed is covered by 17% impervious area and contains 6 major hazardous waste sites, one active quarrying operation and one golf course; 25% of the area utilizes septic systems for sewage disposal. We identified 172 springs, 110 of which had measurable flow rates ranging from 0.002 to 5 l/s. The mapped surficial geology appears as an anisotropic pattern, with long bands of rock formations paralleling the geographic orientation of the valley. Mapped development appears as a more isotropic pattern, characterized by isolated patches of land use that are not coincident with the evident geologic pattern. Superimposed upon these characteristics is a dense array of depressions and shallow sinkholes in the carbonate rocks, and a system of major faults at several formation contacts. We used indicator geostatistics to quantitatively characterize the spatial extent of the major geologic formations and patterns of land use. Maximum correlation scales for the rock types corresponded with strike direction and ranged from 1000 to 3000 m. Anisotropy ratios ranged from 2 to 4. Land-use correlation scales were generally smaller (200 to 500 m) with anisotropy ratios of around 1.2, i.e., nearly isotropic as predicted. Geostatistical analysis of spring water quality parameters related to geology (pH, specific conductance

  9. Precipitation estimation in mountainous terrain using multivariate geostatistics. Part I: structural analysis

    USGS Publications Warehouse

    Hevesi, Joseph A.; Istok, Jonathan D.; Flint, Alan L.

    1992-01-01

    Values of average annual precipitation (AAP) are desired for hydrologic studies within a watershed containing Yucca Mountain, Nevada, a potential site for a high-level nuclear-waste repository. Reliable values of AAP are not yet available for most areas within this watershed because of a sparsity of precipitation measurements and the need to obtain measurements over a sufficient length of time. To estimate AAP over the entire watershed, historical precipitation data and station elevations were obtained from a network of 62 stations in southern Nevada and southeastern California. Multivariate geostatistics (cokriging) was selected as an estimation method because of a significant (p = 0.05) correlation of r = .75 between the natural log of AAP and station elevation. A sample direct variogram for the transformed variable, TAAP = ln [(AAP) 1000], was fitted with an isotropic, spherical model defined by a small nugget value of 5000, a range of 190 000 ft, and a sill value equal to the sample variance of 163 151. Elevations for 1531 additional locations were obtained from topographic maps to improve the accuracy of cokriged estimates. A sample direct variogram for elevation was fitted with an isotropic model consisting of a nugget value of 5500 and three nested transition structures: a Gaussian structure with a range of 61 000 ft, a spherical structure with a range of 70 000 ft, and a quasi-stationary, linear structure. The use of an isotropic, stationary model for elevation was considered valid within a sliding-neighborhood radius of 120 000 ft. The problem of fitting a positive-definite, nonlinear model of coregionalization to an inconsistent sample cross variogram for TAAP and elevation was solved by a modified use of the Cauchy-Schwarz inequality. A selected cross-variogram model consisted of two nested structures: a Gaussian structure with a range of 61 000 ft and a spherical structure with a range of 190 000 ft. 
Cross validation was used for model selection and for
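    The fitted direct-variogram model for TAAP reported above (nugget 5000, range 190,000 ft, sill equal to the sample variance of 163,151) is a standard isotropic spherical model, which can be written out directly. A minimal sketch, using only those reported parameters:

```python
import numpy as np

def spherical_variogram(h, nugget, sill, a):
    """Isotropic spherical semivariogram: 0 at h = 0, then
    nugget + (sill - nugget) * (1.5*h/a - 0.5*(h/a)**3) up to the range a,
    and the sill beyond it."""
    h = np.asarray(h, dtype=float)
    r = np.minimum(h / a, 1.0)
    g = nugget + (sill - nugget) * (1.5 * r - 0.5 * r ** 3)
    return np.where(h == 0.0, 0.0, g)        # nugget is a discontinuity at the origin

# Parameters reported for TAAP in the study: nugget 5000, range 190,000 ft,
# sill equal to the sample variance 163,151.
g = spherical_variogram([0, 190_000, 400_000], 5_000, 163_151, 190_000)
print(g)   # zero at the origin, the sill at and beyond the range
```

The nested elevation and cross-variogram models in the abstract are sums of such structures (plus Gaussian and linear terms) evaluated the same way.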

  10. Precipitation estimation in mountainous terrain using multivariate geostatistics. Part II: isohyetal maps

    USGS Publications Warehouse

    Hevesi, Joseph A.; Flint, Alan L.; Istok, Jonathan D.

    1992-01-01

    Values of average annual precipitation (AAP) may be important for hydrologic characterization of a potential high-level nuclear-waste repository site at Yucca Mountain, Nevada. Reliable measurements of AAP are sparse in the vicinity of Yucca Mountain, and estimates of AAP were needed for an isohyetal mapping over a 2600-square-mile watershed containing Yucca Mountain. Estimates were obtained with a multivariate geostatistical model developed using AAP and elevation data from a network of 42 precipitation stations in southern Nevada and southeastern California. An additional 1531 elevations were obtained to improve estimation accuracy. Isohyets representing estimates obtained using univariate geostatistics (kriging) defined a smooth and continuous surface. Isohyets representing estimates obtained using multivariate geostatistics (cokriging) defined an irregular surface that more accurately represented expected local orographic influences on AAP. Cokriging results included a maximum estimate within the study area of 335 mm at an elevation of 7400 ft, an average estimate of 157 mm for the study area, and an average estimate of 172 mm at eight locations in the vicinity of the potential repository site. Kriging estimates tended to be lower in comparison because the increased AAP expected for remote mountainous topography was not adequately represented by the available sample. Regression results between cokriging estimates and elevation were similar to regression results between measured AAP and elevation. The position of the cokriging 250-mm isohyet relative to the boundaries of pinyon pine and juniper woodlands provided indirect evidence of improved estimation accuracy because the cokriging result agreed well with investigations by others concerning the relationship between elevation, vegetation, and climate in the Great Basin. Calculated estimation variances were also mapped and compared to evaluate improvements in estimation accuracy. 
Cokriging estimation variances

  11. A geostatistical method applied to the geochemical study of the Chichinautzin Volcanic Field in Mexico

    NASA Astrophysics Data System (ADS)

    Robidoux, P.; Roberge, J.; Urbina Oviedo, C. A.

    2011-12-01

    The origin of magmatism and the role of the subducted Cocos Plate in the Chichinautzin volcanic field (CVF), Mexico, is still a subject of debate. It has been established that mafic magmas of calc-alkali (subduction) type and alkali (OIB) type are produced in the CVF and that the two groups cannot be related by simple fractional crystallization. Therefore, many geochemical studies have been done, and many models have been proposed. The main goal of the work presented here is to provide a new tool for the visualization and interpretation of geochemical data using geostatistics and geospatial analysis techniques. It contains a complete geodatabase built from published samples over the 2500 km2 area of the CVF and its neighbouring stratovolcanoes (Popocatepetl, Iztaccihuatl and Nevado de Toluca). From this database, maps of different geochemical markers were produced to visualize the geochemical signatures geographically, to test their statistical distribution with a cartographic technique, and to highlight any spatial correlations. The distribution and regionalization of the geochemical signatures can be viewed in two-dimensional space using specific spatial analysis tools from a Geographic Information System (GIS). The model of spatial distribution is tested with the Linear Decrease (LD) and Inverse Distance Weighted (IDW) interpolation techniques because they best represent the geostatistical characteristics of the geodatabase. We found that the ratios Ba/Nb, Nb/Ta and Th/Nb show a first-order tendency, that is, visible spatial variation over a large-scale area. Monogenetic volcanoes in the center of the CVF have distinct values compared to those of the Popocatepetl-Iztaccihuatl polygenetic complex, which are spatially well defined. Inside the Valley of Mexico, a large number of monogenetic cones in the eastern portion of the CVF have ratios similar to those of the Iztaccihuatl and Popocatepetl complex. Other ratios, like alkalis vs SiO2, V/Ti, La/Yb and Zr/Y, show different spatial tendencies.
In that case, second

  12. Multicenter prospective randomized study comparing the technique of using a bovine pericardium biological prosthesis reinforcement in parietal herniorrhaphy (Tutomesh TUTOGEN) with simple parietal herniorrhaphy, in a potentially contaminated setting.

    PubMed

    Nedelcu, Marius; Verhaeghe, Pierre; Skalli, Mehdi; Champault, Gerard; Barrat, Christophe; Sebbag, Hugues; Reche, Fabian; Passebois, Laurent; Beyrne, Daniel; Gugenheim, Jean; Berdah, Stephane; Bouayed, Amine; Michel Fabre, Jean; Nocca, David

    2016-03-01

    The use of parietal synthetic prosthetic reinforcement material in potentially contaminated settings is not recommended, as there is a risk that the prosthesis may become infected. Thus, simple parietal herniorrhaphy is the conventional treatment, even though there is a significant risk that the hernia may recur. New biomaterials of animal origin now appear to offer a therapeutic alternative, but their effectiveness has yet to be demonstrated. The purpose of this multicenter prospective randomized single-blind study was to compare the surgical treatment of inguinal hernia or abdominal incisional hernia by simple parietal herniorrhaphy without prosthetic reinforcement (Group A) with Tutomesh TUTOGEN biological prosthesis reinforcement herniorrhaphy (Group B), in a potentially contaminated setting. We examined early postoperative complications in the first month after the operation, performed an assessment of one-year recurrence-free survival, and analyzed the quality of life and pain of the patients (using the SF-12 health status questionnaire and a Visual Analog Pain Scale) at 1, 6, and 12 months, together with an economic impact study. One hundred and thirty-four patients were enrolled between January 2009 and October 2010 in 20 French hospitals. The groups were comparable with respect to their enrollment characteristics, their history, and the types of operative indications and procedures carried out. At one month post-op, the rate of infectious complications (n(A) = 11 (18.33%) vs. n(B) = 12 (19.05%), p = 0.919) was not significantly different between the two groups. The assessment of one-year recurrence-free survival revealed that survival was significantly greater in Group B (recurrences in Group A: 10, Group B: 3; p = 0.0475). No difference in the patients' quality of life was demonstrated at 1, 6, or 12 months. However, at the 1-month follow-up, the "perceived health" rating seemed better in the group with Tutomesh (p

  13. Geostatistical analysis of potentiometric data in the Pennsylvanian aquifer of the Palo Duro Basin, Texas

    SciTech Connect

    Harper, W.V.; Basinger, K.L.; Furr, J.M.

    1988-01-01

This report details a geostatistical analysis of potentiometric data from the Pennsylvanian aquifer in the Palo Duro Basin, Texas. Such an analysis is part of an overall uncertainty analysis for a high-level waste repository in salt. Both an expected potentiometric surface and the associated standard error surface are produced. The Pennsylvanian data are found to be well explained by a linear trend with a superimposed spherical semivariogram. A cross-validation of the analysis confirms this. In addition, the cross-validation provides a point-by-point check to test for possible anomalous data. The analysis is restricted to the portion of the Pennsylvanian aquifer that lies to the southwest of the Amarillo Uplift. The Pennsylvanian is absent in some areas across the uplift, and data to the northeast were not used in this analysis. The surfaces produced in that analysis are included for comparison. 9 refs., 15 figs.
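A spherical semivariogram of the kind fitted here can be sketched in a few lines; the nugget, sill, and range values below are illustrative assumptions, not the report's fitted parameters (in the report, this model is superimposed on a linear trend in the potentiometric surface).

```python
import numpy as np

def spherical_variogram(h, nugget=0.0, sill=1.0, rng=50.0):
    """Spherical semivariogram: gamma(0) = 0, rises from the nugget,
    and levels off at the sill once the lag h reaches the range."""
    h = np.asarray(h, dtype=float)
    ratio = np.minimum(h / rng, 1.0)
    gamma = nugget + (sill - nugget) * (1.5 * ratio - 0.5 * ratio ** 3)
    return np.where(h == 0.0, 0.0, gamma)

# Beyond the range, spatial correlation vanishes and gamma equals the sill.
print(spherical_variogram([0.0, 25.0, 50.0, 100.0]))
```

At half the range the spherical model reaches 68.75% of the sill, which is why the range is readable directly off an empirical variogram plot.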

  14. Assessment of soil organic carbon distribution in Europe scale by spatio-temporal data and geostatistics

    NASA Astrophysics Data System (ADS)

    Aksoy, Ece; Panagos, Panos; Montanarella, Luca

    2013-04-01

Accuracy in assessing the distribution of soil organic carbon (SOC) is an important issue because SOC is an important soil component that plays key roles in the functions of both natural ecosystems and agricultural systems. The SOC content varies from place to place and is strongly related to climate variables (temperature and rainfall), terrain features, soil texture, parent material, vegetation, land-use types, and human management (management and degradation) at different spatial scales. Geostatistical techniques allow for the prediction of soil properties using soil information and environmental covariates. In this study, the SOC distribution was predicted at the European scale using a combination of LUCAS soil samples, local soil data and ten spatio-temporal predictors (slope, aspect, elevation, CTI, CORINE land-cover classification, parent material, texture, WRB soil classification, average temperature and precipitation) with the regression-kriging method. A significant correlation between the covariates and the organic carbon dependent variable was found.

  15. Implementation of the Iterative Proportion Fitting Algorithm for Geostatistical Facies Modeling

    SciTech Connect

    Li Yupeng Deutsch, Clayton V.

    2012-06-15

In geostatistics, most stochastic algorithms for the simulation of categorical variables such as facies or rock types require a conditional probability distribution. The multivariate probability distribution of all the grouped locations, including the unsampled location, permits calculation of the conditional probability directly from its definition. In this article, the iterative proportion fitting (IPF) algorithm is implemented to infer this multivariate probability. Using the IPF algorithm, the multivariate probability is obtained by iteratively modifying an initial estimate of the multivariate probability, using lower order bivariate probabilities as constraints. The imposed bivariate marginal probabilities are inferred from profiles along drill holes or wells. In the IPF process, a sparse matrix is used to calculate the marginal probabilities from the multivariate probability, which makes the iterative fitting more tractable and practical. This algorithm can be extended to higher order marginal probability constraints, as used in multiple-point statistics. The theoretical framework is developed and illustrated with estimation and simulation examples.
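The fitting idea can be illustrated with a minimal two-dimensional IPF that rescales an initial joint probability table until it honors imposed marginals. This is a hedged sketch, not the authors' implementation; the facies proportions below are invented for the example.

```python
import numpy as np

def ipf(joint, row_marginal, col_marginal, iters=100, tol=1e-10):
    """Iterative proportional fitting: alternately rescale an initial
    joint probability table so its row sums, then its column sums,
    match the imposed lower-order marginal probabilities."""
    p = joint.copy()
    for _ in range(iters):
        p *= (row_marginal / p.sum(axis=1))[:, None]   # match row sums
        p *= (col_marginal / p.sum(axis=0))[None, :]   # match column sums
        if np.allclose(p.sum(axis=1), row_marginal, atol=tol):
            break
    return p

# Start from a uniform joint over 3 x 3 facies pairs and impose marginals.
init = np.full((3, 3), 1 / 9)
rows = np.array([0.5, 0.3, 0.2])
cols = np.array([0.6, 0.3, 0.1])
fitted = ipf(init, rows, cols)
print(fitted.sum(axis=0))  # approximately equals cols
```

From a uniform start IPF converges to the independent product of the marginals; a more informative initial table (e.g. from a training image) would pull the fit toward that prior while still honoring the constraints.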

  16. Bayesian Geostatistical Analysis and Ecoclimatic Determinants of Corynebacterium pseudotuberculosis Infection among Horses.

    PubMed

    Boysen, Courtney; Davis, Elizabeth G; Beard, Laurie A; Lubbers, Brian V; Raghavan, Ram K

    2015-01-01

Kansas witnessed an unprecedented outbreak of Corynebacterium pseudotuberculosis infection among horses, a disease commonly referred to as pigeon fever, during fall 2012. Bayesian geostatistical models were developed to identify key environmental and climatic risk factors associated with C. pseudotuberculosis infection in horses. Positive infection status among horses (cases) was determined by positive test results for characteristic abscess formation, positive bacterial culture on purulent material obtained from a lanced abscess (n = 82), or positive serologic evidence of exposure to the organism (≥ 1:512) (n = 11). Horses negative for these tests (n = 172) (controls) were considered free of infection. Information pertaining to horse demographics and stabled location was obtained through review of medical records and/or contact with horse owners via telephone. Covariate information for environmental and climatic determinants was obtained from USDA (soil attributes), USGS (land use/land cover), and NASA MODIS and NASA Prediction of Worldwide Renewable Resources (climate). Candidate covariates were screened using univariate regression models, followed by Bayesian geostatistical models with and without covariates. The best performing model indicated a protective effect of higher soil moisture content (OR = 0.53, 95% CrI = 0.25, 0.71), and detrimental effects of higher land surface temperature (≥ 35°C) (OR = 2.81, 95% CrI = 2.21, 3.85) and habitat fragmentation (OR = 1.31, 95% CrI = 1.27, 2.22) on C. pseudotuberculosis infection status in horses, while age, gender and breed had no effect. The preventative and ecoclimatic significance of these findings is discussed. PMID:26473728

  17. Soil moisture estimation by assimilating L-band microwave brightness temperature with geostatistics and observation localization.

    PubMed

    Han, Xujun; Li, Xin; Rigon, Riccardo; Jin, Rui; Endrizzi, Stefano

    2015-01-01

Observations can be used to reduce model uncertainties through data assimilation. But if observations cannot cover the whole model area, owing to limited spatial availability or instrument capability, how can data assimilation be performed at locations the observations do not cover? Two commonly used strategies are first described: one is covariance localization (CL); the other is observation localization (OL). Compared with CL, OL is easy to parallelize and more efficient for large-scale analysis. This paper evaluated OL in soil moisture profile characterization, in which a geostatistical semivariogram was used to fit the spatially correlated characteristics of synthetic L-band microwave brightness temperature measurements. The fitted semivariogram model and the local ensemble transform Kalman filter algorithm are combined to weight and assimilate the observations within a local region surrounding each grid cell of the land surface model to be analyzed. Six scenarios were compared: 1_Obs with one nearest observation assimilated, 5_Obs with no more than five nearest local observations assimilated, and 9_Obs with no more than nine nearest local observations assimilated; scenarios with no more than 16, 25, and 36 local observations were also compared. From the results we can conclude that involving more local observations in the assimilation improves the estimates, with an upper bound of nine observations in this case. This study demonstrates the potential of geostatistical correlation representation in OL to improve data assimilation of catchment-scale soil moisture using synthetic L-band microwave brightness temperature, which cannot cover the study area fully in space due to vegetation effects. PMID:25635771
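The weighting step can be sketched as follows: a unit-sill spherical semivariogram implies a spatial correlation that decays with distance and vanishes beyond the range, which is exactly the cutoff behavior observation localization needs. This is an illustrative sketch, not the paper's implementation; the range and coordinates are invented.

```python
import math

def spherical_correlation(h, rng=10.0):
    """Spatial correlation implied by a unit-sill spherical
    semivariogram: rho(h) = 1 - gamma(h), zero beyond the range."""
    if h >= rng:
        return 0.0
    return 1.0 - (1.5 * h / rng - 0.5 * (h / rng) ** 3)

def localized_weights(grid_cell, obs_locations, rng=10.0):
    """Observation-localization weights for one analysis grid cell:
    nearby observations get weight near 1, distant ones are cut off."""
    return [spherical_correlation(math.dist(grid_cell, obs), rng)
            for obs in obs_locations]

print(localized_weights((0.0, 0.0), [(1.0, 0.0), (5.0, 0.0), (20.0, 0.0)]))
```

In an OL scheme these weights taper the influence of each observation on the local analysis (e.g. by inflating its error variance), so each grid cell can be updated independently and in parallel.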

  18. Geostatistical and Fractal Characteristics of Soil Moisture Patterns from Plot to Catchment Scale Datasets

    NASA Astrophysics Data System (ADS)

    Korres, Wolfgang; Reichenau, Tim G.; Fiener, Peter; Koyama, Christian N.; Bogena, Heye R.; Cornelissen, Thomas; Baatz, Roland; Herbst, Michael; Diekkrüger, Bernd; Vereecken, Harry; Schneider, Karl

    2016-04-01

Soil moisture and its spatio-temporal pattern is a key variable in hydrology, meteorology and agriculture. The aim of the current study is to analyze the spatio-temporal soil moisture patterns of 9 datasets from the Rur catchment (Western Germany), with a total area of 2364 km², consisting of a low mountain range (forest and grassland) and a loess plain dominated by arable land. Data were acquired across a variety of land use types, on different spatial scales (plot to mesoscale catchment) and with different methods (field measurements, remote sensing, and modelling). All datasets were analyzed using the same methodology. In a geostatistical analysis, the sill and range of the theoretical variogram were inferred. Based on this analysis, three groups of datasets with similar characteristics in the autocorrelation structure were identified: (i) modelled and measured datasets from a forest sub-catchment (influenced by soil properties and topography), (ii) remotely sensed datasets from the cropped part of the total catchment (influenced by the land-use structure of the cropped area), and (iii) modelled datasets from the cropped part of the Rur catchment (influenced by large-scale variability of soil properties). A fractal analysis revealed that the soil moisture patterns of all datasets show multi-fractal behavior (varying fractal dimensions; patterns are only self-similar over certain ranges of scales), with at least one scale break and generally high fractal dimensions (high spatial variability). Corresponding scale breaks were found in various datasets, and the factors explaining these scale breaks are consistent with the findings of the geostatistical analysis. The joint analysis of the different datasets showed that small differences in soil moisture dynamics, especially at maximum porosity and wilting point in the soils, can have a large influence on the soil moisture patterns and their autocorrelation structure.
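The sill-and-range inference above can be illustrated with a minimal method-of-moments semivariogram on a synthetic 1-D transect; the moving-average field and its correlation length are assumptions for the example, not the study's data.

```python
import numpy as np

def empirical_variogram(values, max_lag):
    """Method-of-moments semivariogram for a 1-D transect of equally
    spaced samples: gamma(h) = half the mean squared difference at lag h."""
    gammas = []
    for h in range(1, max_lag + 1):
        d = values[h:] - values[:-h]
        gammas.append(0.5 * np.mean(d ** 2))
    return np.array(gammas)

rng = np.random.default_rng(0)
# Synthetic correlated transect: white noise smoothed by a 10-sample
# moving average, giving a correlation length of about 10 samples.
noise = rng.normal(size=500)
field = np.convolve(noise, np.ones(10) / 10, mode="valid")
gam = empirical_variogram(field, 20)
print(gam[:3], gam[-3:])
```

The estimated gamma rises roughly linearly with lag until the correlation length (the range, here about 10 samples) and then flattens at the sill; a break in that scaling is what the fractal analysis in the study interprets as a scale break.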

  19. Bayesian Geostatistical Analysis and Ecoclimatic Determinants of Corynebacterium pseudotuberculosis Infection among Horses

    PubMed Central

    Boysen, Courtney; Davis, Elizabeth G.; Beard, Laurie A.; Lubbers, Brian V.; Raghavan, Ram K.

    2015-01-01

Kansas witnessed an unprecedented outbreak of Corynebacterium pseudotuberculosis infection among horses, a disease commonly referred to as pigeon fever, during fall 2012. Bayesian geostatistical models were developed to identify key environmental and climatic risk factors associated with C. pseudotuberculosis infection in horses. Positive infection status among horses (cases) was determined by positive test results for characteristic abscess formation, positive bacterial culture on purulent material obtained from a lanced abscess (n = 82), or positive serologic evidence of exposure to the organism (≥1:512) (n = 11). Horses negative for these tests (n = 172) (controls) were considered free of infection. Information pertaining to horse demographics and stabled location was obtained through review of medical records and/or contact with horse owners via telephone. Covariate information for environmental and climatic determinants was obtained from USDA (soil attributes), USGS (land use/land cover), and NASA MODIS and NASA Prediction of Worldwide Renewable Resources (climate). Candidate covariates were screened using univariate regression models, followed by Bayesian geostatistical models with and without covariates. The best performing model indicated a protective effect of higher soil moisture content (OR = 0.53, 95% CrI = 0.25, 0.71), and detrimental effects of higher land surface temperature (≥35°C) (OR = 2.81, 95% CrI = 2.21, 3.85) and habitat fragmentation (OR = 1.31, 95% CrI = 1.27, 2.22) on C. pseudotuberculosis infection status in horses, while age, gender and breed had no effect. The preventative and ecoclimatic significance of these findings is discussed. PMID:26473728

  20. Soil Moisture Estimation by Assimilating L-Band Microwave Brightness Temperature with Geostatistics and Observation Localization

    PubMed Central

    Han, Xujun; Li, Xin; Rigon, Riccardo; Jin, Rui; Endrizzi, Stefano

    2015-01-01

Observations can be used to reduce model uncertainties through data assimilation. But if observations cannot cover the whole model area, owing to limited spatial availability or instrument capability, how can data assimilation be performed at locations the observations do not cover? Two commonly used strategies are first described: one is covariance localization (CL); the other is observation localization (OL). Compared with CL, OL is easy to parallelize and more efficient for large-scale analysis. This paper evaluated OL in soil moisture profile characterization, in which a geostatistical semivariogram was used to fit the spatially correlated characteristics of synthetic L-band microwave brightness temperature measurements. The fitted semivariogram model and the local ensemble transform Kalman filter algorithm are combined to weight and assimilate the observations within a local region surrounding each grid cell of the land surface model to be analyzed. Six scenarios were compared: 1_Obs with one nearest observation assimilated, 5_Obs with no more than five nearest local observations assimilated, and 9_Obs with no more than nine nearest local observations assimilated; scenarios with no more than 16, 25, and 36 local observations were also compared. From the results we can conclude that involving more local observations in the assimilation improves the estimates, with an upper bound of nine observations in this case. This study demonstrates the potential of geostatistical correlation representation in OL to improve data assimilation of catchment-scale soil moisture using synthetic L-band microwave brightness temperature, which cannot cover the study area fully in space due to vegetation effects. PMID:25635771

  1. A comparison of geostatistically based inverse techniques for use in performance assessment analysis at the Waste Isolation Pilot Plant Site: Results from Test Case No. 1

    SciTech Connect

    Zimmerman, D.A.; Gallegos, D.P.

    1993-10-01

The groundwater flow pathway in the Culebra Dolomite aquifer at the Waste Isolation Pilot Plant (WIPP) has been identified as a potentially important pathway for radionuclide migration to the accessible environment. Consequently, uncertainties in the models used to describe flow and transport in the Culebra need to be addressed. A "Geostatistics Test Problem" is being developed to evaluate a number of inverse techniques that may be used for flow calculations in the WIPP performance assessment (PA). The Test Problem is actually a series of test cases, each being developed as a highly complex synthetic data set; the intent is for the ensemble of these data sets to span the range of possible conceptual models of groundwater flow at the WIPP site. The Test Problem analysis approach is to use a comparison of the probabilistic groundwater travel time (GWTT) estimates produced by each technique as the basis for the evaluation. Participants are given observations of head and transmissivity (possibly including measurement error) or other information such as drawdowns from pumping wells, and are asked to develop stochastic models of groundwater flow for the synthetic system. Cumulative distribution functions (CDFs) of groundwater flow (computed via particle tracking) are constructed using the head and transmissivity data generated through the application of each technique; one semi-analytical method generates the CDFs of groundwater flow directly. This paper describes the results from Test Case No. 1.

  2. The dynamics and interaction of compaction bands in Valley of Fire State Park, Nevada (USA): Implications for their growth, evolution, and geostatistical property

    NASA Astrophysics Data System (ADS)

    Torabi, A.; Aydin, A.; Cilona, A.; Jarstø, B. E.; Deng, S.

    2015-08-01

Geometry, geostatistics and microstructures of two sets of high-angle compaction bands within the Aztec Sandstone are investigated for the purpose of characterizing their dimensions, distributions, and growth dynamics. Both sets generally terminate at the dune boundaries. The first set of compaction bands is longer than the second set, which is thought to be controlled by the depositional architecture. The macroscopic thickness of single compaction bands varies from zero at their tips to maximum values, commonly 3 to 10 mm, in the central parts. The compaction bands often cluster into zones, the thickness of which increases by addition of a new band sub-parallel to the previously formed bands. The average thickness of a given zone does not change significantly, except in a limited number of places along its length, with a high thickness/distance gradient at the ends. The high anomalies in the thickness distribution plots along the zones correspond to local complexities such as eye structures, steps, and intersections of converging or diverging bands, and may reach about 35 cm. Microstructural analysis of single compaction bands highlights the presence of two typical portions, a thin portion and a thick portion. The thin portion of the band contains pockets of tightly packed grains surrounded by large pores. The thick portion is characterized by an almost homogeneously distributed compaction throughout the band. The thin portions are found close to the tips, whereas the thick portions occur primarily in the central parts of the bands. In high-resolution images, both single bands and zones of bands appear thinner and their lateral boundaries unexpectedly rougher than in outcrops and hand samples; the latter roughness may play an important role in their lateral growth.

  3. Multivariate Analysis and Modeling of Sediment Pollution Using Neural Network Models and Geostatistics

    NASA Astrophysics Data System (ADS)

    Golay, Jean; Kanevski, Mikhaïl

    2013-04-01

The present research deals with the exploration and modeling of a complex dataset of 200 measurement points of sediment pollution by heavy metals in Lake Geneva. The fundamental idea was to use multivariate Artificial Neural Networks (ANN) along with geostatistical models and tools in order to improve the accuracy and the interpretability of data modeling. The results obtained with ANN were compared to those of traditional geostatistical algorithms like ordinary (co)kriging and (co)kriging with an external drift. Exploratory data analysis highlighted a great variety of relationships (i.e. linear, non-linear, independence) between the 11 variables of the dataset (i.e. Cadmium, Mercury, Zinc, Copper, Titanium, Chromium, Vanadium and Nickel, as well as the spatial coordinates of the measurement points and their depth). Then, exploratory spatial data analysis (i.e. anisotropic variography, local spatial correlations and moving window statistics) was carried out. It was shown that the different phenomena to be modeled were characterized by high spatial anisotropies, complex spatial correlation structures and heteroscedasticity. A feature selection procedure based on General Regression Neural Networks (GRNN) was also applied to create subsets of variables to improve the predictions during the modeling phase. The basic modeling was conducted using a Multilayer Perceptron (MLP), which is a workhorse of ANN. MLP models are robust and highly flexible tools which can incorporate different kinds of high-dimensional information in a nonlinear manner. In the present research, the input layer was made of either two (spatial coordinates) or three neurons (when depth as auxiliary information could possibly capture an underlying trend) and the output layer was composed of one (univariate MLP) to eight neurons corresponding to the heavy metals of the dataset (multivariate MLP). MLP models with three input neurons can be referred to as Artificial Neural Networks with EXternal

  4. Random thoughts

    NASA Astrophysics Data System (ADS)

    ajansen; kwhitefoot; panteltje1; edprochak; sudhakar, the

    2014-07-01

    In reply to the physicsworld.com news story “How to make a quantum random-number generator from a mobile phone” (16 May, http://ow.ly/xFiYc, see also p5), which describes a way of delivering random numbers by counting the number of photons that impinge on each of the individual pixels in the camera of a Nokia N9 smartphone.

  5. Accounting for regional background and population size in the detection of spatial clusters and outliers using geostatistical filtering and spatial neutral models: the case of lung cancer in Long Island, New York

    PubMed Central

    Goovaerts, Pierre; Jacquez, Geoffrey M

    2004-01-01

    Background Complete Spatial Randomness (CSR) is the null hypothesis employed by many statistical tests for spatial pattern, such as local cluster or boundary analysis. CSR is however not a relevant null hypothesis for highly complex and organized systems such as those encountered in the environmental and health sciences in which underlying spatial pattern is present. This paper presents a geostatistical approach to filter the noise caused by spatially varying population size and to generate spatially correlated neutral models that account for regional background obtained by geostatistical smoothing of observed mortality rates. These neutral models were used in conjunction with the local Moran statistics to identify spatial clusters and outliers in the geographical distribution of male and female lung cancer in Nassau, Queens, and Suffolk counties, New York, USA. Results We developed a typology of neutral models that progressively relaxes the assumptions of null hypotheses, allowing for the presence of spatial autocorrelation, non-uniform risk, and incorporation of spatially heterogeneous population sizes. Incorporation of spatial autocorrelation led to fewer significant ZIP codes than found in previous studies, confirming earlier claims that CSR can lead to over-identification of the number of significant spatial clusters or outliers. Accounting for population size through geostatistical filtering increased the size of clusters while removing most of the spatial outliers. Integration of regional background into the neutral models yielded substantially different spatial clusters and outliers, leading to the identification of ZIP codes where SMR values significantly depart from their regional background. Conclusion The approach presented in this paper enables researchers to assess geographic relationships using appropriate null hypotheses that account for the background variation extant in real-world systems. 
In particular, this new methodology allows one to identify

  6. Framework for the mapping of the monthly average daily solar radiation using an advanced case-based reasoning and a geostatistical technique.

    PubMed

    Lee, Minhyun; Koo, Choongwan; Hong, Taehoon; Park, Hyo Seon

    2014-04-15

For an effective photovoltaic (PV) system, it is necessary to accurately determine the monthly average daily solar radiation (MADSR) and to develop an accurate MADSR map, which can simplify the decision-making process for selecting a suitable location for PV system installation. This study therefore aimed to develop a framework for mapping the MADSR using advanced case-based reasoning (CBR) and a geostatistical technique. The proposed framework consists of the following procedures: (i) the geographic scope for the mapping of the MADSR is set, and the measured MADSR and meteorological data in the geographic scope are collected; (ii) using the collected data, the advanced CBR model is developed; (iii) using the advanced CBR model, the MADSR at unmeasured locations is estimated; and (iv) by applying the measured and estimated MADSR data to a geographic information system, the MADSR map is developed. A practical validation was conducted by applying the proposed framework to South Korea. The MADSR map developed through the proposed framework was found to be improved in terms of accuracy. The developed MADSR map can be used for estimating the MADSR at unmeasured locations and for determining the optimal location for PV system installation. PMID:24635702

  7. Factors affecting paddy soil arsenic concentration in Bangladesh: prediction and uncertainty of geostatistical risk mapping.

    PubMed

    Ahmed, Zia U; Panaullah, Golam M; DeGloria, Stephen D; Duxbury, John M

    2011-12-15

Knowledge of the spatial correlation of soil arsenic (As) concentrations with environmental variables is needed to assess the nature and extent of the risk of As contamination from irrigation water in Bangladesh. We analyzed 263 paired groundwater and paddy soil samples covering highland (HL) and medium highland-1 (MHL-1) land types for geostatistical mapping of soil As and delineation of As-contaminated areas in Tala Upazilla, Satkhira district. We also collected 74 non-rice soil samples to assess the baseline concentration of soil As for this area. The mean soil As concentrations (mg/kg) for different land types under rice and non-rice crops were: rice-MHL-1 (21.2)>rice-HL (14.1)>non-rice-MHL-1 (11.9)>non-rice-HL (7.2). Multiple regression analyses showed that irrigation water As, Fe, land elevation and years of tubewell operation are the important factors affecting the concentrations of As in HL paddy soils. Only years of tubewell operation affected As concentration in the MHL-1 paddy soils. Quantitatively similar increases in soil As above the estimated baseline As concentration were observed for rice soils on HL and MHL-1 after 6-8 years of groundwater irrigation, implying strong retention of As added in irrigation water in both land types. Application of single geostatistical methods with secondary variables, such as regression kriging (RK) and ordinary co-kriging (OCK), gave little improvement in prediction of soil As over ordinary kriging (OK). Comparing single prediction methods, kriging within strata (KWS), the combination of RK for HL and OCK for MHL-1, gave more accurate soil As predictions and showed the lowest misclassification rate for declaring a location "contaminated" with respect to 14.8 mg As/kg, the highest value obtained for the baseline soil As concentration. Prediction of soil As buildup over time indicated that 75% of the soils cropped to rice would contain at least 30 mg/kg As by the year 2020. PMID:22055452

  8. Indoor terrestrial gamma dose rate mapping in France: a case study using two different geostatistical models.

    PubMed

    Warnery, E; Ielsch, G; Lajaunie, C; Cale, E; Wackernagel, H; Debayle, C; Guillevic, J

    2015-01-01

Terrestrial gamma dose rates show important spatial variations in France. Previous studies resulted in maps of arithmetic means of indoor terrestrial gamma dose rates by "département" (French district). However, numerous areas could not be characterized due to the lack of data. The aim of our work was to obtain more precise estimates of the spatial variability of indoor terrestrial gamma dose rates in France by using a more recent and complete database and geostatistics. The study was based on 97,595 measurement results distributed over 17,404 locations covering all of France. Measurements were made by the Institute for Radioprotection and Nuclear Safety (IRSN) using RPL (Radio Photo Luminescent) dosimeters, exposed for several months between 2011 and 2012 in French dentist surgeries and veterinary clinics. The data used came from dosimeters which were not exposed to anthropic sources. After removing the cosmic-ray contribution in order to study only the telluric gamma radiation, it was decided to work with the arithmetic means of the time-series measurements, weighted by the exposure time of the dosimeters, for each location. The values varied between 13 and 349 nSv/h, with an arithmetic mean of 76 nSv/h. The observed statistical distribution of the gamma dose rates was skewed to the right. First, ordinary kriging was performed to predict the gamma dose rate on 1 × 1 km² cells over the whole domain. The second step of the study was to use an auxiliary variable in the estimates. In 2010, the IRSN completed a classification of the French geological formations characterizing their uranium potential, on the basis of geology and local measurements of the uranium content of rocks. This information is georeferenced in a map at the 1:1,000,000 scale. The geological uranium potential (GUP) was classified in 5 qualitative categories. As telluric gamma rays mostly come from the progeny of the ²³⁸U decay series present in rocks, this
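A bare-bones ordinary-kriging sketch of the kind of gridded prediction described here; the spherical covariance parameters and dose-rate values are invented for illustration, whereas the real analysis works on a national 1 × 1 km² grid.

```python
import numpy as np

def ordinary_kriging(coords, values, target, sill=1.0, rng=50.0):
    """Ordinary kriging of a dose rate at one unmeasured cell, using
    the covariance C(h) = sill - gamma(h) of a spherical variogram."""
    def cov(h):
        h = np.asarray(h, dtype=float)
        ratio = np.minimum(h / rng, 1.0)
        return sill - sill * (1.5 * ratio - 0.5 * ratio ** 3)

    n = len(coords)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    # Kriging system with a Lagrange multiplier enforcing unbiasedness
    # (weights sum to one).
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = cov(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = cov(np.linalg.norm(coords - target, axis=1))
    w = np.linalg.solve(A, b)
    return w[:n] @ values

coords = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
values = np.array([70.0, 80.0, 90.0])  # nSv/h, invented
print(ordinary_kriging(coords, values, np.array([5.0, 5.0])))
```

The second step in the study, adding the GUP classes as an auxiliary variable, would replace this plain kriging with a method such as kriging with an external drift, but the system solved per grid cell has the same structure.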

  9. Geostatistical techniques to assess the influence of soil density on sugarcane productivity

    NASA Astrophysics Data System (ADS)

    Marques, Karina; Silva, Wellington; Almeida, Ceres; Bezerra, Joel; Almeida, Brivaldo; Siqueira, Glecio

    2013-04-01

Spatial variation in some soil properties occurs over short distances even in homogeneous areas of the same soil class, and it can influence crop productivity. This variability must be incorporated into the procedures and techniques used in agriculture, so it must be characterized in order to optimize agricultural practices. This study aimed to evaluate the influence of soil density on sugarcane productivity using geostatistical techniques. The area is located in the city of Rio Formoso, Pernambuco (Brazil), at latitude 08°38'91"S and longitude 35°16'08"W, where the climate is rainy tropical. A total of 243 georeferenced undisturbed soil samples (clods) were collected in a lowland area at three depths (0-20, 20-40 and 40-60 cm) on a 15 x 30 m grid. The total area of 7.5 ha was divided equally into three subareas. Statistical and geostatistical analyses were performed. Soil bulk density was found to increase with depth and can be used as an index of relative compaction. Machine weight, track or tire design and soil water content at the time of traffic are some of the factors that determine the amount of soil compaction and the resulting changes in the plant root environment. These factors may explain the higher soil density found in subarea 1, which was intensively mechanized and presents poor drainage and seasonal flooding. Based on the fitted semivariogram models, soil density showed spatial dependence in subarea 1 at all depths (Gaussian at 0-20 cm and spherical at both 20-40 and 40-60 cm). In subarea 2, exponential models were fitted at the 0-20 and 40-60 cm depths, and in subarea 3 a Gaussian model at 0-20 cm. A pure nugget effect was found at the 20-40 cm depth in subareas 2 and 3, and at 40-60 cm in subarea 3. Subarea 1 had higher soil density and lower sugarcane productivity; root development and nutrient uptake are directly influenced by soil density.

  10. Text Sets.

    ERIC Educational Resources Information Center

    Giorgis, Cyndi; Johnson, Nancy J.

    2002-01-01

    Presents annotations of approximately 30 titles grouped in text sets. Defines a text set as five to ten books on a particular topic or theme. Discusses books on the following topics: living creatures; pirates; physical appearance; natural disasters; and the Irish potato famine. (SG)

  11. Electrical geophysical characterization of the Hanford 300 Area Integrated Field Research Challenge using high performance DC resistivity inversion geostatistically constrained by borehole conductivity logs

    NASA Astrophysics Data System (ADS)

    Johnson, T. C.; Versteeg, R. J.; Ward, A. L.; Strickland, C. E.; Greenwood, J.

    2009-12-01

The Hanford 300 Area Integrated Field Research Challenge is a DOE-funded multi-institution field study designed to better understand the field-scale subsurface processes governing uranium transport in the Hanford 300 Area and other DOE sites. The primary focus area includes 28 monitoring wells spaced approximately 10 meters apart and extending downward 18 meters through the unconfined aquifer to a lower fine-grained confining unit. Each well is equipped with 15 permanent electrodes in the vadose zone and 15 removable electrodes in the saturated zone, for a total of 840 electrodes. To characterize the electrical conductivity structure of the site, a set of 32 overlapping 3D DC resistivity surveys were conducted, where each survey included 4 adjacent wells. The complete set of surveys comprises approximately 150,000 resistivity measurements. We demonstrate the characterization by inverting the complete data set using a parallel resistivity modeling/inversion code developed at the Idaho National Laboratory. We compare two approaches: (1) an Occam minimum-structure type inversion, and (2) an inversion using geostatistical and borehole conductivity constraints. The first inversion shows the larger scale conductivity features of the site as resolved by the resistivity data only. The second inversion provides solutions which honor the resistivity data, the directional and zonal semivariograms derived from borehole conductivity logs, and the corresponding borehole conductivities. This approach allows an ensemble of solutions to be generated which incorporate uncertainty in both the semivariograms and the resistivity data. Assuming electrical conductivity is well correlated with hydrogeologic properties at the site, this ensemble of models can be used, along with appropriate petrophysical transforms, to generate an ensemble of hydrogeologic models which can ultimately be used to investigate uncertainty in flow and transport processes.

  12. Characterization of an unregulated landfill using surface-based geophysics and geostatistics

    SciTech Connect

    Woldt, W.E.; Jones, D.D.; Hagemeister, M.E.

    1998-11-01

    This paper develops a geoelectrical and geostatistical methodology that can be used to screen unregulated landfills for the presence of leachate and obtain an approximation of the vertical/spatial extent of waste. The methodology uses a surface electromagnetic (EM) survey combined with indicator kriging. Indicator kriging allows for the use of EM data that have been collected over highly electrically conductive material such as that occurring within landfills. Indicator kriging maps were generated for several vertical sections cut through a landfill to establish the landfill strata. Similarly, horizontal sections were generated to evaluate the areal extent of waste and leachate in the vadose and saturated zones, respectively. The horizontal and vertical maps were combined to estimate the volume of solid waste and liquid within three zones: (1) waste, (2) waste and/or leachate, and (3) potential leachate. The methodology appears to hold promise in providing results that can be used as part of a hazard assessment, or to assist in the placement of monitoring wells at sites requiring additional study. A case study is used to demonstrate the methodology at an unregulated landfill in eastern Nebraska.

  13. Stochastic simulation of geological data using isometric mapping and multiple-point geostatistics with data incorporation

    NASA Astrophysics Data System (ADS)

    Zhang, Ting; Du, Yi; Huang, Tao; Li, Xue

    2016-02-01

    Constrained by current hardware and techniques, acquisition of geological data is sometimes difficult or even impossible. Stochastic simulation of geological data helps address this issue, providing multiple possible realizations of geological data for resource prediction and risk evaluation. Multiple-point geostatistics (MPS), one of the main branches of stochastic simulation, extracts the intrinsic features of patterns from training images (TIs), which provide prior information to constrain the under-determined simulated results, and then copies those patterns to the simulated regions. Because the models generated from TIs are not always linear, MPS methods that use linear dimensionality reduction are not suitable for nonlinear TI models. A new MPS method named ISOMAPSIM was proposed to resolve this issue; it reduces the dimensionality of patterns from TIs using isometric mapping (ISOMAP) and then classifies these low-dimensional patterns for simulation. Since conditioning data, including hard data and soft data, strongly influence the simulated results, this paper further studies ISOMAPSIM using hard and soft data to obtain more accurate simulations for geological modeling. Stochastic simulations of geological data are performed under several conditions corresponding to different configurations of the conditioning data. The tests show that the proposed method reproduces the structural characteristics of the TIs under all conditions, but the condition using soft data and hard data together performs best in simulation quality; moreover, the proposed method shows its advantages over other MPS methods that use linear dimensionality reduction.

  14. Definition of radon prone areas in Friuli Venezia Giulia region, Italy, using geostatistical tools.

    PubMed

    Cafaro, C; Bossew, P; Giovani, C; Garavaglia, M

    2014-12-01

    Studying the geographical distribution of indoor radon concentration using geostatistical interpolation methods has become common for predicting and estimating the risk to the population. Here we analyse the case of Friuli Venezia Giulia (FVG), the north-easternmost region of Italy. The mean value and standard deviation are, respectively, 153 Bq/m³ and 183 Bq/m³. The geometric mean is 100 Bq/m³. Spatial datasets of indoor radon concentrations are usually affected by clustering and apparent non-stationarity issues, which can eventually yield arguable results. The clustering of the present dataset appears to be non-preferential, so the areal estimations are not expected to be affected. Conversely, nothing can be said about the non-stationarity issues and their effects. After discussing the correlation of geology with indoor radon concentration, it appears that these issues arise from the same geologic features that influence the mean and median values, and cannot be eliminated via a map-based approach. To tackle these problems, in this work we consider multiple definitions of radon-prone areas (RPA), restricted to the Quaternary areas of FVG, using extensive simulation techniques. PMID:25261867
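    The gap between the arithmetic mean (153 Bq/m³) and the geometric mean (100 Bq/m³) reflects the right-skewed, roughly lognormal shape typical of indoor radon data. A small sketch of the two statistics (the concentrations below are hypothetical):

```python
import math

def arithmetic_mean(xs):
    return sum(xs) / len(xs)

def geometric_mean(xs):
    # exp of the mean log: a robust summary for right-skewed, lognormal-like data
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

# Hypothetical indoor radon concentrations (Bq/m^3); one high value skews the mean
radon = [40, 60, 80, 120, 150, 900]
print(arithmetic_mean(radon))  # 225.0
print(geometric_mean(radon))   # about 121, much closer to the typical dwelling
```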

  15. Geostatistical analysis of hydraulic conductivity of the Upper Santa Fe Group, Albuquerque Basin, New Mexico

    SciTech Connect

    Ruskauff, G.J.

    1996-12-31

    A regional groundwater flow model has been developed to be used as a management tool for the Albuquerque Basin. It is crucial to recognize the impact of the inherent uncertainty in aquifer hydrogeology when applying the model, and an understanding of the effects of uncertainty can be achieved using a probabilistic approach to address the role of natural variability of aquifer properties in management strategies. Statistical analysis shows that the hydraulic conductivity data are moderately skewed to the right, but are not lognormal. Geostatistical analysis revealed zonal anisotropy oriented due north-south, which is directly related to the flow direction of the ancestral Rio Grande, which laid down the Upper Santa Fe Group deposits. The presence of multiple depositional environments within the Upper Santa Fe Group violates the assumption of stationarity. This can be circumvented by choosing the simulation search radius so that local stationarity holds, or by separating the basin into two portions to be simulated separately and then combined for flow model analysis.

  16. Epidemiological Study of Hazelnut Bacterial Blight in Central Italy by Using Laboratory Analysis and Geostatistics

    PubMed Central

    Lamichhane, Jay Ram; Fabi, Alfredo; Ridolfi, Roberto; Varvaro, Leonardo

    2013-01-01

    The incidence of Xanthomonas arboricola pv. corylina, the causal agent of hazelnut bacterial blight, was analyzed spatially in relation to pedoclimatic factors. Hazelnut grown in twelve municipalities in the province of Viterbo, central Italy, was studied. A large number of bacterial isolates were obtained from infected hazelnut tissues collected over three years (2010–2012). The isolates, characterized by phenotypic tests, showed no differences among them. Spatial patterns of pedoclimatic data, analyzed by geostatistics, showed a strong positive correlation of disease incidence with higher values of rainfall, thermal shock, and soil nitrogen; a weak positive correlation with soil aluminium content; and a strong negative correlation with the values of the Mg/K ratio. No correlation of disease incidence was found with soil pH. Disease incidence ranged from very low (<1%) to very high (almost 75%) across the orchards. Young plants (4 years old) were the most affected by the disease, confirming a weak negative correlation of disease incidence with plant age. Plant cultivars did not show any difference in susceptibility to the pathogen. The possible role of climate change in the epidemiology of the disease is discussed. Improved management practices are recommended for effective control of the disease. PMID:23424654

  17. Usage of multivariate geostatistics in interpolation processes for meteorological precipitation maps

    NASA Astrophysics Data System (ADS)

    Gundogdu, Ismail Bulent

    2015-09-01

    Long-term meteorological data are very important both for the evaluation of meteorological events and for the analysis of their effects on the environment. Prediction maps constructed by different interpolation techniques often provide explanatory information. Conventional techniques, such as surface spline fitting, global and local polynomial models, and inverse distance weighting, may not be adequate. Multivariate geostatistical methods can be more significant, especially when studying secondary variables, because secondary variables might directly affect the precision of prediction. In this study, the mean annual and mean monthly precipitations from 1984 to 2014 for 268 meteorological stations in Turkey have been used to construct country-wide maps. Besides linear regression, inverse square distance and ordinary co-kriging (OCK) have been used and compared with each other. Elevation, slope, and aspect data for each station have also been taken into account as secondary variables, whose use has reduced errors by up to a factor of three. OCK gave the smallest errors (1.002 cm) when aspect was included.
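    Of the univariate methods compared above, inverse square distance is the simplest to sketch: each station's value is weighted by the inverse of its squared distance to the target point. A minimal Python version (station coordinates and precipitation values are hypothetical):

```python
def idw_estimate(points, target, power=2):
    """Inverse distance weighting. points: [(x, y, value)]; target: (x, y)."""
    num = den = 0.0
    for x, y, v in points:
        d = ((x - target[0]) ** 2 + (y - target[1]) ** 2) ** 0.5
        if d == 0.0:
            return v  # target coincides with a station: return its value
        w = d ** -power  # power=2 gives the inverse *square* distance scheme
        num += w * v
        den += w
    return num / den

# Hypothetical station precipitation (cm) at (x, y) coordinates in km
stations = [(0, 0, 50.0), (10, 0, 70.0), (0, 10, 60.0)]
# All three stations are equidistant from (5, 5), so the estimate is their plain average
print(idw_estimate(stations, (5, 5)))
```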

  18. The geostatistical approach for structural and stratigraphic framework analysis of offshore NW Bonaparte Basin, Australia

    NASA Astrophysics Data System (ADS)

    Wahid, Ali; Salim, Ahmed Mohamed Ahmed; Gaafar, Gamal Ragab; Yusoff, Wan Ismail Wan

    2016-02-01

    Geostatistics, or the statistical approach, is based on studies of temporal and spatial trends and uses spatial relationships to model known information of variables at unsampled locations. The statistical technique known as kriging was used for petrophysical and facies analysis; kriging assumes a spatial relationship to model the geological continuity between the known data and the unknown, producing a single best guess of the unknown. Kriging is also known as an optimal interpolation technique, generating the best linear unbiased estimate of each horizon. The idea is to construct a numerical model of the lithofacies and rock properties that honors the available data, and to integrate it with interpreted seismic sections, the tectonostratigraphic chart with a short-term sea-level curve, and the regional tectonics of the study area to reconstruct the structural and stratigraphic growth history of the NW Bonaparte Basin. Using the kriging technique, models were built to estimate parameters such as horizons, facies, and porosities in the study area. Variograms were used to identify spatial relationships in the data, helping to determine the depositional history of the North West (NW) Bonaparte Basin.
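    The variogram analysis mentioned above rests on the empirical semivariogram: half the average squared difference between pairs of samples separated by a given lag. A minimal sketch for a 1-D transect (the porosity readings below are hypothetical):

```python
from itertools import combinations

def empirical_semivariogram(samples, lag, tol):
    """Half the mean squared difference over pairs separated by lag +/- tol.
    samples: [(position, value)] along a 1-D transect."""
    sq_diffs = [(v1 - v2) ** 2
                for (x1, v1), (x2, v2) in combinations(samples, 2)
                if abs(abs(x1 - x2) - lag) <= tol]
    return sum(sq_diffs) / (2 * len(sq_diffs))

# Hypothetical porosity readings every 10 m along a well section
obs = [(0, 0.10), (10, 0.12), (20, 0.15), (30, 0.16), (40, 0.21)]
print(empirical_semivariogram(obs, lag=10, tol=1))  # semivariance at 10 m lag
```

    Repeating the computation over a set of lags produces the experimental variogram, to which a model (spherical, exponential, ...) is then fitted.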

  19. Geostatistical methods for rock mass quality prediction using borehole and geophysical survey data

    NASA Astrophysics Data System (ADS)

    Chen, J.; Rubin, Y.; Sege, J. E.; Li, X.; Hehua, Z.

    2015-12-01

    For long, deep tunnels, the number of geotechnical borehole investigations during the preconstruction stage is generally limited. Yet tunnels are often constructed in geological structures with complex geometries, in which the rock mass has been fragmented by past structural deformations. Tunnel Geology Prediction (TGP) is a geophysical technique widely used during tunnel construction in China to ensure safety during construction and to prevent geological disasters. In this paper, geostatistical techniques were applied to integrate seismic velocity from TGP and borehole information into spatial predictions of RMR (Rock Mass Rating) in unexcavated areas. The approach applies conditional probability methods to relate seismic velocities to directly observed RMR values. The initial spatial distribution of RMR, inferred from the boreholes, was updated by including geophysical survey data in a co-kriging approach. Applied to a real tunnel project, the method shows significant improvements in rock mass quality predictions after including geophysical survey data, leading to better decision-making for construction safety design.

  20. Integration of geostatistical techniques and intuitive geology in the 3-D modeling process

    SciTech Connect

    Heine, C.J.; Cooper, D.H.

    1995-08-01

    The development of 3-D geologic models for reservoir description and simulation has traditionally relied on the computer-derived interpolation of well data in a geocellular stratigraphic framework. The quality of the interpolation has been directly dependent on the nature of the interpolation method, and on the ability of the interpolation scheme to accurately predict the value of geologic attributes away from the well. Typically, interpolation methods employ deterministic or geostatistical algorithms which offer limited capacity for integrating data derived from secondary analyses. These secondary analyses, which might include the results from 3-D seismic inversion, borehole imagery studies, or deductive reasoning, introduce a subjective component into what would otherwise be restricted to a purely mathematical treatment of geologic data. At Saudi ARAMCO an increased emphasis is being placed on the role of the reservoir geologist in the development of 3-D geologic models. Quantitative results, based on numerical computations, are being enhanced with intuitive geology, derived from years of cumulative professional experience and expertise. Techniques such as template modeling and modified conditional simulation are yielding 3-D geologic models which not only more accurately reflect the geology of the reservoir, but also preserve geologic detail throughout the simulation process. This incorporation of secondary data sources and qualitative analysis has been successfully demonstrated in a clastic reservoir environment in Central Saudi Arabia, and serves as a prototype for future 3-D geologic model development.

  3. Geostatistics and the representative elementary volume of gamma ray tomography attenuation in rock cores

    USGS Publications Warehouse

    Vogel, J.R.; Brown, G.O.

    2003-01-01

    Semivariograms of samples of Culebra Dolomite have been determined at two different resolutions for gamma ray computed tomography images. By fitting models to the semivariograms, small-scale and large-scale correlation lengths are determined for four samples. Different semivariogram parameters were found for adjacent cores at both resolutions. Representative elementary volume (REV) concepts are related to the stationarity of the sample. A scale disparity factor is defined and is used to determine the sample size required for ergodic stationarity with a specified correlation length. This allows for comparison of geostatistical measures and representative elementary volumes. The modifiable areal unit problem is also addressed and used to determine resolution effects on correlation lengths. By changing resolution, a range of correlation lengths can be determined for the same sample. Comparison of voxel volume to the best-fit model correlation length of a single sample at different resolutions reveals a linear scaling effect. Using this relationship, the range of the point-value semivariogram is determined; this is the range approached as the voxel size goes to zero. Finally, these results are compared to the regularization theory of point variables for borehole cores and are found to be a better fit for predicting the volume-averaged range.
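    The correlation lengths discussed above are the ranges of fitted semivariogram models. A common choice is the spherical model, which rises from the nugget and flattens at the total sill (nugget plus partial sill) once the lag reaches the range; a minimal sketch with hypothetical parameter values:

```python
def spherical_model(h, nugget, psill, a):
    """Spherical semivariogram model; 'a' is the range (the correlation length)."""
    if h >= a:
        return nugget + psill  # flat at the total sill beyond the range
    r = h / a
    return nugget + psill * (1.5 * r - 0.5 * r ** 3)

# Hypothetical parameters: nugget 0.1, partial sill 0.9, range 20 voxels
print(spherical_model(10.0, 0.1, 0.9, 20.0))  # rising limb inside the range
print(spherical_model(50.0, 0.1, 0.9, 20.0))  # the total sill (nugget + partial sill)
```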

  4. Predictive risk mapping of schistosomiasis in Brazil using Bayesian geostatistical models.

    PubMed

    Scholte, Ronaldo G C; Gosoniu, Laura; Malone, John B; Chammartin, Frédérique; Utzinger, Jürg; Vounatsou, Penelope

    2014-04-01

    Schistosomiasis is one of the most common parasitic diseases in tropical and subtropical areas, including Brazil. A national control programme was initiated in Brazil in the mid-1970s and proved successful in terms of morbidity control, as the number of cases with hepato-splenic involvement was reduced significantly. To consolidate control and move towards elimination, there is a need for reliable maps of the spatial distribution of schistosomiasis, so that interventions can target communities at highest risk. The purpose of this study was to map the distribution of Schistosoma mansoni in Brazil. We utilized readily available prevalence data from the national schistosomiasis control programme for the years 2005-2009, derived remotely sensed climatic and environmental data and obtained socioeconomic data from various sources. Data were collated into a geographical information system and Bayesian geostatistical models were developed. Model-based maps identified important risk factors related to the transmission of S. mansoni and confirmed that environmental variables are closely associated with indices of poverty. Our smoothed predictive risk map, including uncertainty, highlights priority areas for intervention, namely the northern parts of North and Southeast regions and the eastern part of Northeast region. Our predictive risk map provides a useful tool to strengthen existing surveillance-response mechanisms. PMID:24361640

  5. Integrating Address Geocoding, Land Use Regression, and Spatiotemporal Geostatistical Estimation for Groundwater Tetrachloroethylene

    PubMed Central

    Messier, Kyle P.; Akita, Yasuyuki; Serre, Marc L.

    2012-01-01

    Geographic Information Systems (GIS) based techniques are cost-effective and efficient methods used by state agencies and epidemiology researchers for estimating concentration and exposure. However, budget limitations have made statewide assessments of contamination difficult, especially in groundwater media. Many studies have implemented address geocoding, land use regression, and geostatistics independently, but this is the first to examine the benefits of integrating these GIS techniques to address the need for statewide exposure assessments. A novel framework for concentration exposure is introduced that integrates address geocoding, land use regression (LUR), below-detect data modeling, and Bayesian Maximum Entropy (BME). A LUR model was developed for tetrachloroethylene (PCE) that accounts for point sources and flow direction. We then integrate the LUR model into the BME method as a mean trend while also modeling below-detect data as a truncated Gaussian probability distribution function. We increase available PCE data 4.7 times from previously available databases through multistage geocoding. The LUR model shows significant influence of dry cleaners at short ranges. The integration of the LUR model as mean trend in BME results in a 7.5% decrease in cross validation mean square error compared to BME with a constant mean trend. PMID:22264162
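    The below-detect modeling step can be sketched as a Gaussian density truncated at the detection limit and renormalised so the retained mass integrates to one. A minimal Python version (mean, standard deviation, and detection limit are hypothetical, and this is only the density itself, not the full BME machinery):

```python
import math

def norm_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def norm_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def below_detect_pdf(x, mu, sigma, detection_limit):
    """Gaussian truncated above at the detection limit and renormalised,
    giving a proper density over the unobservable below-detect range."""
    if x > detection_limit:
        return 0.0
    return norm_pdf(x, mu, sigma) / norm_cdf(detection_limit, mu, sigma)

# Hypothetical: mean 2.0, sd 1.0 (log-concentration units), detection limit 1.5
print(below_detect_pdf(1.0, 2.0, 1.0, 1.5))  # inflated relative to the full Gaussian
print(below_detect_pdf(2.0, 2.0, 1.0, 1.5))  # 0.0: above the detection limit
```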

  6. Inverse modeling of hydraulic tests in fractured crystalline rock based on a transition probability geostatistical approach

    NASA Astrophysics Data System (ADS)

    Blessent, Daniela; Therrien, René; Lemieux, Jean-Michel

    2011-12-01

    This paper presents numerical simulations of a series of hydraulic interference tests conducted in crystalline bedrock at Olkiluoto (Finland), a potential site for the disposal of the Finnish high-level nuclear waste. The tests are in a block of crystalline bedrock of about 0.03 km3 that contains low-transmissivity fractures. Fracture density, orientation, and fracture transmissivity are estimated from Posiva Flow Log (PFL) measurements in boreholes drilled in the rock block. On the basis of those data, a geostatistical approach relying on transition probability and Markov chain models is used to define a conceptual model based on stochastic fractured rock facies. Four facies are defined, from sparsely fractured bedrock to highly fractured bedrock. Using this conceptual model, three-dimensional groundwater flow is then simulated to reproduce interference pumping tests in either open or packed-off boreholes. Hydraulic conductivities of the fracture facies are estimated through automatic calibration using either hydraulic heads or both hydraulic heads and PFL flow rates as targets for calibration. The latter option produces a narrower confidence interval for the calibrated hydraulic conductivities, therefore reducing the associated uncertainty and demonstrating the usefulness of the measured PFL flow rates. Furthermore, the stochastic facies conceptual model is a suitable alternative to discrete fracture network models to simulate fluid flow in fractured geological media.
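    The transition-probability idea can be sketched as a Markov chain over facies: each cell's facies depends on the previous cell through a matrix of transition probabilities. A toy 1-D version (the facies names and probabilities below are hypothetical, not the calibrated Olkiluoto values):

```python
import random

# Hypothetical vertical transition probabilities between fracture facies,
# from sparsely fractured (F1) to highly fractured (F4); each row sums to 1.
FACIES = ["F1", "F2", "F3", "F4"]
P = [
    [0.70, 0.20, 0.08, 0.02],
    [0.25, 0.55, 0.15, 0.05],
    [0.05, 0.20, 0.60, 0.15],
    [0.02, 0.08, 0.30, 0.60],
]

def simulate_column(n_cells, start=0, seed=0):
    """Walk the Markov chain downward, drawing one facies per cell."""
    rng = random.Random(seed)
    state, column = start, []
    for _ in range(n_cells):
        column.append(FACIES[state])
        state = rng.choices(range(len(P)), weights=P[state])[0]
    return column

print(simulate_column(20))  # one stochastic facies column
```

    Re-running with different seeds produces the ensemble of equally plausible facies columns that stochastic facies modeling relies on.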

  7. Geostatistics as a tool to study mite dispersion in physic nut plantations.

    PubMed

    Rosado, J F; Picanço, M C; Sarmento, R A; Pereira, R M; Pedro-Neto, M; Galdino, T V S; de Sousa Saraiva, A; Erasmo, E A L

    2015-08-01

    Spatial distribution studies in pest management identify the locations where pest attacks on crops are most severe, enabling us to understand and predict the movement of such pests. Studies on the spatial distribution of these two mite species, however, are rather scarce. The mites Polyphagotarsonemus latus and Tetranychus bastosi are the major pests affecting physic nut (Jatropha curcas) plantations. Therefore, the objective of this study was to measure the spatial distributions of P. latus and T. bastosi in physic nut plantations. Mite densities were monitored over 2 years in two different plantations. Sample locations were georeferenced. The experimental data were analyzed geostatistically. The total mite density was found to be higher when only one species was present (T. bastosi). When both mite species were found in the same plantation, their peak densities occurred at different times. The mites exhibited a uniform spatial distribution at extreme densities (low or high), but an aggregated distribution at intermediate densities. Mite spatial distribution models were isotropic. Mite colonization commenced at the periphery of the areas under study, with high-density patches extending until they reached 30 m in diameter. This has not been reported for J. curcas plants before. PMID:25895655

  8. Analysis and simulation of wireless signal propagation applying geostatistical interpolation techniques

    NASA Astrophysics Data System (ADS)

    Kolyaie, S.; Yaghooti, M.; Majidi, G.

    2011-12-01

    This paper is part of an ongoing research effort to examine the capability of geostatistical analysis for mobile network coverage prediction, simulation, and tuning. Mobile network coverage predictions are used to find network coverage gaps and areas with poor serviceability. They are essential data for engineering and management in order to make better decisions regarding rollout, planning, and optimisation of mobile networks. The objective of this research is to evaluate different interpolation techniques in coverage prediction. In the method presented here, raw data collected by drive testing a sample of roads in the study area are analysed and various continuous surfaces are created using different interpolation methods. Two general interpolation methods are used in this paper with different variables: first, Inverse Distance Weighting (IDW) with various powers and numbers of neighbours; and second, ordinary kriging with Gaussian, spherical, circular, and exponential semivariogram models with different numbers of neighbours. For the comparison of results, we have used check points coming from the same drive test data. Prediction values for the check points are extracted from each surface and the differences from the actual values are computed. The output of this research helps in finding an optimised and accurate model for coverage prediction.

  9. The geostatistic-based spatial distribution variations of soil salts under long-term wastewater irrigation.

    PubMed

    Wu, Wenyong; Yin, Shiyang; Liu, Honglu; Niu, Yong; Bao, Zhe

    2014-10-01

    The purpose of this study was to determine and evaluate the spatial changes in soil salinity by using geostatistical methods. The study focused on the suburb area of Beijing, where urban development led to water shortage and accelerated wastewater reuse for farm irrigation for more than 30 years. The data were then processed by GIS using three different interpolation techniques: ordinary kriging (OK), disjunctive kriging (DK), and universal kriging (UK). The normality test and overall trend analysis were applied for each interpolation technique to select the best-fitted model for soil parameters. Results showed that OK was suitable for soil sodium adsorption ratio (SAR) and Na⁺ interpolation; UK was suitable for soil Cl⁻ and pH; DK was suitable for soil Ca²⁺. The nugget-to-sill ratio was applied to evaluate the effects of structural and stochastic factors. The maps showed that the areas of non-saline soil and slight-salinity soil accounted for 6.39 and 93.61%, respectively. The spatial distribution and accumulation of soil salt were significantly affected by the irrigation probabilities and drainage situation under long-term wastewater irrigation. PMID:25127658
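    The nugget-to-sill ratio used above compares short-range noise (nugget) to total variance (sill); a common rule of thumb classifies ratios below 0.25 as strong spatial dependence (structural factors dominate) and above 0.75 as weak (stochastic factors dominate). A minimal sketch of that classification:

```python
def spatial_dependence(nugget, sill):
    """Classify spatial dependence from the nugget-to-sill ratio.
    Common rule of thumb: < 0.25 strong, 0.25-0.75 moderate, > 0.75 weak."""
    ratio = nugget / sill
    if ratio < 0.25:
        return ratio, "strong"
    if ratio <= 0.75:
        return ratio, "moderate"
    return ratio, "weak"

print(spatial_dependence(0.1, 1.0))  # structural factors dominate
print(spatial_dependence(0.8, 1.0))  # stochastic factors dominate
```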

  10. Modelling ambient ozone in an urban area using an objective model and geostatistical algorithms

    NASA Astrophysics Data System (ADS)

    Moral, Francisco J.; Rebollo, Francisco J.; Valiente, Pablo; López, Fernando; Muñoz de la Peña, Arsenio

    2012-12-01

    Ground-level tropospheric ozone is one of the air pollutants of most concern. Ozone levels continue to exceed both target values and the long-term objectives established in EU legislation to protect human health and prevent damage to ecosystems, agricultural crops and materials. Researchers and decision-makers frequently need information about atmospheric pollution patterns in urbanized areas. The preparation of this type of information is a complex task, due to the influence of several factors and their variability over time. In this work, some results on urban ozone distribution patterns in the city of Badajoz, the largest (140,000 inhabitants) and most industrialized city in the Extremadura region (southwest Spain), are shown. Twelve sampling campaigns, one per month, were carried out with an automatic portable analyzer to measure ambient air ozone concentrations, during periods selected for conditions favourable to ozone production. To evaluate the overall ozone level at each sampling location during the time interval considered, the measured ozone data were analysed using a new methodology based on the formulation of the Rasch model. As a result, a measure of the overall ozone level that consolidates the monthly ground-level ozone measurements was obtained, together with information about the influence of each monthly measurement on the overall level. Finally, the overall ozone level at locations where no measurements were available was estimated with geostatistical techniques, and hazard assessment maps based on the spatial distribution of ozone were also generated.

  11. Spatial analysis of lettuce downy mildew using geostatistics and geographic information systems.

    PubMed

    Wu, B M; van Bruggen, A H; Subbarao, K V; Pennings, G G

    2001-02-01

    The epidemiology of lettuce downy mildew has been investigated extensively in coastal California. However, the spatial patterns of the disease and the distance that Bremia lactucae spores can be transported have not been determined. During 1995 to 1998, we conducted several field- and valley-scale surveys to determine spatial patterns of this disease in the Salinas valley. Geostatistical analyses of the survey data at both scales showed that the influence range of downy mildew incidence at one location on incidence at other locations was between 80 and 3,000 m. A linear relationship was detected between semivariance and lag distance at the field scale, although no single statistical model could fit the semivariograms at the valley scale. Spatial interpolation by the inverse distance weighting method with a power of 2 resulted in plausible estimates of incidence throughout the valley. Cluster analysis in geographic information systems on the interpolated disease incidence from different dates demonstrated that the Salinas valley could be divided into two areas, north and south of Salinas City, with high and low disease pressure, respectively. Seasonal and spatial trends along the valley suggested that the distinction between the downy mildew conducive and nonconducive areas might be determined by environmental factors. PMID:18944386
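    The field-scale finding of a linear semivariance-lag relationship can be checked with an ordinary least-squares fit of gamma = a + b*h to the empirical (lag, semivariance) pairs. A minimal sketch (the pairs below are hypothetical):

```python
def linear_fit(pairs):
    """Ordinary least squares for gamma = a + b * h over (lag, semivariance) pairs."""
    n = len(pairs)
    sh = sum(h for h, _ in pairs)
    sg = sum(g for _, g in pairs)
    shh = sum(h * h for h, _ in pairs)
    shg = sum(h * g for h, g in pairs)
    b = (n * shg - sh * sg) / (n * shh - sh ** 2)
    a = (sg - b * sh) / n
    return a, b

# Hypothetical field-scale pairs: semivariance rising linearly with lag distance
pairs = [(10, 1.0), (20, 2.0), (30, 3.0), (40, 4.0)]
a, b = linear_fit(pairs)
print(a, b)  # intercept near 0, slope 0.1 per metre of lag
```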

  12. Integrating address geocoding, land use regression, and spatiotemporal geostatistical estimation for groundwater tetrachloroethylene.

    PubMed

    Messier, Kyle P; Akita, Yasuyuki; Serre, Marc L

    2012-03-01

    Geographic information systems (GIS) based techniques are cost-effective and efficient methods used by state agencies and epidemiology researchers for estimating concentration and exposure. However, budget limitations have made statewide assessments of contamination difficult, especially in groundwater media. Many studies have implemented address geocoding, land use regression, and geostatistics independently, but this is the first to examine the benefits of integrating these GIS techniques for statewide exposure assessments. A novel framework for concentration exposure is introduced that integrates address geocoding, land use regression (LUR), below-detect data modeling, and Bayesian Maximum Entropy (BME). A LUR model was developed for tetrachloroethylene (PCE) that accounts for point sources and flow direction. We then integrate the LUR model into the BME method as a mean trend while also modeling below-detect data with a truncated Gaussian probability density function. We increase the available PCE data 4.7-fold over previously available databases through multistage geocoding. The LUR model shows significant influence of dry cleaners at short ranges. The integration of the LUR model as mean trend in BME results in a 7.5% decrease in cross-validation mean square error compared to BME with a constant mean trend. PMID:22264162
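Modeling a non-detect as a Gaussian truncated above at the detection limit gives a closed-form expected concentration. The stdlib-only sketch below is a simplified stand-in for the paper's truncated-Gaussian treatment of below-detect PCE data; all numbers are hypothetical:

```python
import math

def phi(x):   # standard normal pdf
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def Phi(x):   # standard normal cdf
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def censored_mean(mu, sigma, limit):
    """Expected value of a Gaussian(mu, sigma) truncated above at `limit`:
    E[X | X < limit] = mu - sigma * phi(a) / Phi(a), with a = (limit - mu)/sigma.
    Used here as the plug-in value for a below-detection-limit observation."""
    a = (limit - mu) / sigma
    return mu - sigma * phi(a) / Phi(a)

# hypothetical local trend mean 1.0, sd 0.5, detection limit 0.7
est = censored_mean(1.0, 0.5, 0.7)
```

The estimate is always below the detection limit, as it must be for a value known only to be a non-detect.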

  13. Application of Geostatistical Methods and Machine Learning for spatio-temporal Earthquake Cluster Analysis

    NASA Astrophysics Data System (ADS)

    Schaefer, A. M.; Daniell, J. E.; Wenzel, F.

    2014-12-01

    Earthquake clustering is an increasingly important part of general earthquake research, especially in terms of seismic hazard assessment and earthquake forecasting and prediction. Foreshocks, aftershocks, mainshocks and secondary mainshocks are distinctly identified and defined using a point-based spatio-temporal clustering algorithm originating from the field of classic machine learning. This can further be applied for declustering purposes, to separate background seismicity from triggered seismicity. The results are interpreted and processed to assemble 3D (x, y, t) earthquake clustering maps based on smoothed seismicity records in space and time. In addition, multi-dimensional Gaussian functions are used to capture clustering parameters for spatial distribution and dominant orientations. Clusters are further processed using methodologies originating from geostatistics, which have mostly been applied and developed in mining projects during recent decades. A 2.5D variogram analysis is applied to identify spatio-temporal homogeneity in terms of earthquake density and energy output. The results are then interpolated using kriging to provide an accurate mapping of clustering features. As a case study, seismic data of New Zealand and the United States covering events since the 1950s are used, from which an earthquake cluster catalogue is assembled for most of the major events, including a detailed analysis of the Landers and Christchurch sequences.
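The smoothed seismicity records underlying the cluster maps can be approximated by Gaussian-kernel smoothing of epicenters onto a regular grid. A minimal 2-D sketch with toy events (coordinates in km); the bandwidth and grid are assumptions, not the paper's settings:

```python
import numpy as np

def smoothed_seismicity(epicenters, grid_x, grid_y, bandwidth=10.0):
    """Gaussian-kernel smoothed event density on a regular grid."""
    gx, gy = np.meshgrid(grid_x, grid_y, indexing="ij")
    density = np.zeros_like(gx)
    for ex, ey in epicenters:
        d2 = (gx - ex) ** 2 + (gy - ey) ** 2
        density += np.exp(-d2 / (2.0 * bandwidth**2))
    # normalize so the field integrates to (roughly) one per event
    return density / (2.0 * np.pi * bandwidth**2 * len(epicenters))

events = [(50.0, 50.0), (52.0, 48.0), (90.0, 10.0)]  # toy epicenters
x = np.linspace(0, 100, 101)
y = np.linspace(0, 100, 101)
dens = smoothed_seismicity(events, x, y, bandwidth=5.0)
```

The two nearby events reinforce each other, so the smoothed field peaks at their cluster rather than at the isolated event.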

  14. Efficient methods for large-scale linear inversion using a geostatistical approach

    NASA Astrophysics Data System (ADS)

    Saibaba, Arvind K.; Kitanidis, Peter K.

    2012-05-01

    In geophysical inverse problems, such as estimating an unknown parameter field from noisy observations of dependent quantities (e.g., hydraulic conductivity from head observations), stochastic Bayesian and geostatistical approaches are frequently used. Obtaining best estimates and conditional realizations requires several matrix-matrix computations involving the covariance matrix of the discretized parameter field. For realistic, finely discretized three-dimensional fields, these operations as performed in conventional algorithms become extremely expensive, and even prohibitive, in terms of memory and computational requirements. Using hierarchical matrices, we show how to form approximate matrix-vector products involving the covariance matrix in log-linear complexity for an arbitrary distribution of points and a wide variety of generalized covariance functions. The resulting system of equations is solved iteratively using a matrix-free Krylov subspace approach. Furthermore, we show how to generate unconditional realizations using a Chebyshev matrix polynomial approximation to the square root of the covariance matrix, and we use the above to generate conditional realizations. We demonstrate the efficiency of our method on standard test problems, such as interpolation from noisy observations and contaminant source identification.
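The matrix-free idea can be illustrated without hierarchical matrices: the Krylov solver below only ever calls a function that applies a covariance-plus-noise operator to a vector, never forming or factorizing the matrix. The exponential covariance, nugget value, and random point set are assumptions for this sketch; in the paper the matvec itself would be compressed with hierarchical matrices:

```python
import numpy as np

def expcov_matvec(pts, v, sill=1.0, corr_len=0.3):
    """Apply an exponential covariance matrix to v, one row at a time
    (a stand-in for the compressed hierarchical-matrix matvec)."""
    out = np.empty_like(v)
    for i, p in enumerate(pts):
        r = np.linalg.norm(pts - p, axis=1)
        out[i] = (sill * np.exp(-r / corr_len)) @ v
    return out

def cg(matvec, b, tol=1e-10, maxit=500):
    """Matrix-free conjugate gradients for a symmetric positive definite system."""
    x = np.zeros_like(b)
    r = b - matvec(x)
    p = r.copy()
    rs = r @ r
    for _ in range(maxit):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

rng = np.random.default_rng(0)
pts = rng.random((60, 2))                 # 60 random observation points
noise_var = 0.1
A = lambda v: expcov_matvec(pts, v) + noise_var * v   # C + sigma^2 I
b = rng.standard_normal(60)
x = cg(A, b)
```

The nugget term keeps the system well conditioned, so CG converges in far fewer iterations than the matrix dimension.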

  15. Nonpoint source solute transport normal to aquifer bedding in heterogeneous, Markov chain random fields

    NASA Astrophysics Data System (ADS)

    Zhang, Hua; Harter, Thomas; Sivakumar, Bellie

    2006-06-01

    Facies-based geostatistical models have become important tools for analyzing flow and mass transport processes in heterogeneous aquifers. Yet little is known about the relationship between these latter processes and the parameters of facies-based geostatistical models. In this study, we examine the transport of a nonpoint source solute normal (perpendicular) to the major bedding plane of an alluvial aquifer medium that contains multiple geologic facies, including interconnected, high-conductivity (coarse textured) facies. We also evaluate the dependence of the transport behavior on the parameters of the constitutive facies model. A facies-based Markov chain geostatistical model is used to quantify the spatial variability of the aquifer system's hydrostratigraphy. It is integrated with a groundwater flow model and a random walk particle transport model to estimate the solute traveltime probability density function (pdf) for solute flux from the water table to the bottom boundary (the production horizon) of the aquifer. The cases examined include two-, three-, and four-facies models, with mean length anisotropy ratios for horizontal to vertical facies, ek, from 25:1 to 300:1 and with a wide range of facies volume proportions (e.g., from 5 to 95% coarse-textured facies). Predictions of traveltime pdfs are found to be significantly affected by the number of hydrostratigraphic facies identified in the aquifer. Those predictions of traveltime pdfs also are affected by the proportions of coarse-textured sediments, the mean length of the facies (particularly the ratio of length to thickness of coarse materials), and, to a lesser degree, the juxtapositional preference among the hydrostratigraphic facies. In transport normal to the sedimentary bedding plane, traveltime is not lognormally distributed as is often assumed. Also, macrodispersive behavior (variance of the traveltime) is found not to be a unique function of the conductivity variance. For the parameter range
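A first-order Markov chain facies simulation in one (vertical) dimension can be sketched as follows. The three facies and the transition probabilities are illustrative stand-ins, not the paper's calibrated transition-probability model:

```python
import numpy as np

# Transition probabilities between three hypothetical facies:
# 0 = coarse channel, 1 = sand, 2 = mud (rows sum to 1).
P = np.array([[0.80, 0.15, 0.05],
              [0.10, 0.70, 0.20],
              [0.05, 0.25, 0.70]])

def simulate_column(P, n, start=0, seed=0):
    """Simulate a vertical facies column as a first-order Markov chain."""
    rng = np.random.default_rng(seed)
    states = [start]
    for _ in range(n - 1):
        states.append(rng.choice(len(P), p=P[states[-1]]))
    return np.array(states)

col = simulate_column(P, 2000)
coarse_frac = np.mean(col == 0)   # compare with the stationary proportion
```

The diagonal entries control mean facies lengths, which is how mean-length anisotropy ratios like those in the study enter a Markov chain model.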

  16. A geostatistical analysis of small-scale spatial variability in bacterial abundance and community structure in salt marsh creek bank sediments

    NASA Technical Reports Server (NTRS)

    Franklin, Rima B.; Blum, Linda K.; McComb, Alison C.; Mills, Aaron L.

    2002-01-01

    Small-scale variations in bacterial abundance and community structure were examined in salt marsh sediments from Virginia's eastern shore. Samples were collected at 5 cm intervals (horizontally) along a 50 cm elevation gradient, over a 215 cm horizontal transect. For each sample, bacterial abundance was determined using acridine orange direct counts and community structure was analyzed using randomly amplified polymorphic DNA fingerprinting of whole-community DNA extracts. A geostatistical analysis was used to determine the degree of spatial autocorrelation among the samples, for each variable and each direction (horizontal and vertical). The proportion of variance in bacterial abundance that could be accounted for by the spatial model was quite high (vertical: 60%, horizontal: 73%); significant autocorrelation was found among samples separated by 25 cm in the vertical direction and up to 115 cm horizontally. In contrast, most of the variability in community structure was not accounted for by simply considering the spatial separation of samples (vertical: 11%, horizontal: 22%), and must reflect variability from other parameters (e.g., variation at other spatial scales, experimental error, or environmental heterogeneity). Microbial community patch size based upon overall similarity in community structure varied between 17 cm (vertical) and 35 cm (horizontal). Overall, variability due to horizontal position (distance from the creek bank) was much smaller than that due to vertical position (elevation) for both community properties assayed. This suggests that processes more correlated with elevation (e.g., drainage and redox potential) vary at a smaller scale (therefore producing smaller patch sizes) than processes controlled by distance from the creek bank.
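The spatial-autocorrelation analysis rests on the empirical semivariogram. A minimal Matheron-type estimator is sketched below, with a synthetic 5 cm transect standing in for the salt-marsh samples (the sinusoid-plus-noise data are invented for illustration):

```python
import numpy as np

def semivariogram(coords, values, lags, tol):
    """Classical (Matheron) empirical semivariogram: half the mean squared
    difference over all pairs whose separation is within tol of each lag."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    iu = np.triu_indices(len(values), k=1)          # each pair once
    d, sq = d[iu], (values[:, None] - values[None, :])[iu] ** 2
    return np.array([sq[np.abs(d - h) <= tol].mean() / 2.0 for h in lags])

# toy 1-D transect sampled every 5 cm, like the 215 cm horizontal design
rng = np.random.default_rng(1)
x = np.arange(0, 215, 5.0)
coords = np.column_stack([x, np.zeros_like(x)])
vals = np.sin(x / 30.0) + 0.1 * rng.standard_normal(x.size)
gamma = semivariogram(coords, vals, lags=[5, 20, 50], tol=2.5)
```

For spatially structured data the semivariance rises with lag distance until the range is reached, which is how the 25 cm and 115 cm autocorrelation ranges in the study are read off.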

  17. Geostatistical Modeling of Malaria Endemicity using Serological Indicators of Exposure Collected through School Surveys

    PubMed Central

    Ashton, Ruth A.; Kefyalew, Takele; Rand, Alison; Sime, Heven; Assefa, Ashenafi; Mekasha, Addis; Edosa, Wasihun; Tesfaye, Gezahegn; Cano, Jorge; Teka, Hiwot; Reithinger, Richard; Pullan, Rachel L.; Drakeley, Chris J.; Brooker, Simon J.

    2015-01-01

    Ethiopia has a diverse ecology and geography resulting in spatial and temporal variation in malaria transmission. Evidence-based strategies are thus needed to monitor transmission intensity and target interventions. A purposive selection of dried blood spots collected during cross-sectional school-based surveys in Oromia Regional State, Ethiopia, were tested for presence of antibodies against Plasmodium falciparum and P. vivax antigens. Spatially explicit binomial models of seroprevalence were created for each species using a Bayesian framework, and used to predict seroprevalence at 5 km resolution across Oromia. School seroprevalence showed a wider prevalence range than microscopy for both P. falciparum (0–50% versus 0–12.7%) and P. vivax (0–53.7% versus 0–4.5%). The P. falciparum model incorporated environmental predictors and spatial random effects, while P. vivax seroprevalence first-order trends were not adequately explained by environmental variables, and a spatial smoothing model was developed. This is the first demonstration of serological indicators being used to detect large-scale heterogeneity in malaria transmission using samples from cross-sectional school-based surveys. The findings support the incorporation of serological indicators into periodic large-scale surveillance such as Malaria Indicator Surveys, with particular utility for low-transmission and elimination settings. PMID:25962770

  18. Random Vibrations

    NASA Technical Reports Server (NTRS)

    Messaro, Semma; Harrison, Phillip

    2010-01-01

    Ares I zonal random vibration environments due to acoustic impingement and combustion processes are developed for liftoff, ascent and reentry. Random vibration test criteria for Ares I Upper Stage pyrotechnic components are developed by enveloping the applicable zonal environments where each component is located. Random vibration tests will be conducted to assure that these components will survive and function appropriately after exposure to the expected vibration environments. Methodology: Random vibration test criteria for Ares I Upper Stage pyrotechnic components were desired that would envelope all the applicable environments where each component was located. Applicable Ares I vehicle drawings and design information needed to be assessed to determine the location(s) for each component on the Ares I Upper Stage. Design and test criteria needed to be developed by plotting and enveloping the applicable environments using Microsoft Excel spreadsheet software and documenting them in a report using Microsoft Word processing software. Conclusion: Random vibration liftoff, ascent, and green run design and test criteria for the Upper Stage pyrotechnic components were developed by using Microsoft Excel to envelope the zonal environments applicable to each component. Results were transferred from Excel into a report using Microsoft Word. After the report is reviewed and edited by my mentor, it will be submitted for publication as an attachment to a memorandum. Pyrotechnic component designers will extract criteria from my report for incorporation into the design and test specifications for components. Eventually the hardware will be tested to the environments I developed, to assure that the components will survive and function appropriately after exposure to the expected vibration environments.
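The enveloping step described above, done in the report with Excel, amounts to a point-wise maximum over the zonal spectra a component is exposed to. A minimal sketch with hypothetical PSD values (the frequencies and levels are invented, not Ares I data):

```python
import numpy as np

freqs = np.array([20.0, 100.0, 500.0, 2000.0])   # Hz
# hypothetical acceleration PSD levels (g^2/Hz) for two zones spanned
# by one component
zone_a = np.array([0.01, 0.04, 0.08, 0.02])
zone_b = np.array([0.02, 0.03, 0.10, 0.01])

# the test criterion envelopes both zones: point-wise maximum at each frequency
envelope = np.maximum(zone_a, zone_b)
```

A real criterion would also add margin and smooth the envelope into straight-line segments on a log-log plot, but the core operation is this maximum.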

  19. Channel characterization using multiple-point geostatistics, neural network, and modern analogy: A case study from a carbonate reservoir, southwest Iran

    NASA Astrophysics Data System (ADS)

    Hashemi, Seyyedhossein; Javaherian, Abdolrahim; Ataee-pour, Majid; Tahmasebi, Pejman; Khoshdel, Hossein

    2014-12-01

    In facies modeling, the ideal objective is to integrate different sources of data to generate a model that is maximally consistent with reality with respect to geological shapes and their facies architectures. Multiple-point (geo)statistics (MPS) is a tool that offers the opportunity of reaching this goal via the definition of a training image (TI). A facies modeling workflow was conducted on a carbonate reservoir located in southwest Iran. Through a sequence stratigraphic correlation among the wells, it was revealed that the interval under modeling was deposited in a tidal flat environment. The Bahamas tidal flat environment, one of the best-studied modern carbonate tidal flats, was considered to be the source of the information required for modeling a TI. In parallel, a neural network probability cube was generated based on a set of attributes derived from the 3D seismic cube, to be applied in the MPS algorithm as soft conditioning data. Moreover, extracted channel bodies and drilled well log facies entered the modeling as hard data. The combination of these constraints resulted in a facies model highly consistent with the geological scenarios. This study showed how analogy with modern occurrences can be set as the foundation for generating a training image. Channel morphology and facies types currently being deposited, which are crucial for modeling a training image, were inferred from modern occurrences. However, there were some practical considerations concerning the MPS algorithm used for facies simulation. The main limitation was the large amount of RAM and CPU time needed to perform simulations.

  20. Two-point versus multiple-point geostatistics: the ability of geostatistical methods to capture complex geobodies and their facies associations—an application to a channelized carbonate reservoir, southwest Iran

    NASA Astrophysics Data System (ADS)

    Hashemi, Seyyedhossein; Javaherian, Abdolrahim; Ataee-pour, Majid; Khoshdel, Hossein

    2014-12-01

    Facies models try to explain facies architectures, which have a primary control on the subsurface heterogeneities and the fluid flow characteristics of a given reservoir. In the process of facies modeling, geostatistical methods are implemented to integrate different sources of data into a consistent model. The facies models should describe facies interactions and the shape and geometry of the geobodies as they occur in reality. Two distinct categories of geostatistical techniques are two-point and multiple-point (geo)statistics (MPS). In this study, both of the aforementioned categories were applied to generate facies models. Sequential indicator simulation (SIS) and truncated Gaussian simulation (TGS) represented two-point geostatistical methods, and single normal equation simulation (SNESIM) was selected as the MPS representative. The dataset from an extremely channelized carbonate reservoir located in southwest Iran was applied to these algorithms to analyze their performance in reproducing complex curvilinear geobodies. The SNESIM algorithm needs consistent training images (TIs) in which all possible facies architectures present in the area are included. The TI model was based on data acquired from modern occurrences. These analogues delivered vital information about the possible channel geometries and facies classes that are typically present in such environments. The MPS results were conditioned to both soft and hard data. Soft facies probabilities were acquired from a neural network workflow in which seismic-derived attributes were used as input data. Furthermore, MPS realizations were conditioned to hard data to guarantee the exact positioning and continuity of the channel bodies. A geobody extraction workflow was implemented to extract the most certain parts of the channel bodies from the seismic data. These extracted parts of the channel bodies were applied to the simulation workflow as hard data.

  1. Geostatistical investigation into the temporal evolution of spatial structure in a shallow water table

    NASA Astrophysics Data System (ADS)

    Lyon, S. W.; Seibert, J.; Lembo, A. J.; Walter, M. T.; Steenhuis, T. S.

    2005-08-01

    Shallow water tables in the near-stream region often lead to saturated areas in catchments in humid climates. While these saturated areas are assumed to be of importance for issues such as non-point pollution sources, little is known about the spatial and temporal behavior of shallow water tables and the resulting saturated areas. In this study, geostatistical methods are employed to demonstrate their utility in investigating the spatial and temporal variation of the shallow water table in the near-stream region. Event-based and seasonal changes in the spatial structure of the shallow water table, which directly influences surface saturation and runoff generation, can be identified and used in conjunction to characterize the hydrology of an area. This is accomplished through semivariogram analysis and indicator kriging to produce maps combining supplemental soft data (i.e., proxy information for the variable of interest) representing seasonal trends in the shallow water table with hard data (i.e., the actual measurements) that represent variation in the spatial structure of the shallow water table per rainfall event. The area used was a hillslope located in the Catskill Mountains region of New York State. The shallow water table was monitored for a 120 m×180 m near-stream region at 44 sampling locations on 15-min intervals. Outflow of the area was measured at the same time interval. These data were analyzed at a short time interval (15 min) and at a long time interval (months) to characterize the changes in the hydrology of the region. Indicator semivariograms based on transforming the depth to ground water table data into binary values (i.e., 1 if exceeding the time-variable median depth to water table and 0 if not) were created for both time interval lengths. When considering only the short time interval, the indicator semivariograms for spring when there is excess rainfall show high spatial structure with increased ranges during rain events with surface
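The indicator transform behind these indicator semivariograms is a simple binary coding of each snapshot against its (time-variable) median depth. A toy snapshot with invented depths at a handful of the wells:

```python
import numpy as np

def indicator_transform(depths):
    """Binary-code depth-to-water-table: 1 where deeper than the snapshot
    median, 0 otherwise, as done before building indicator semivariograms."""
    return (depths > np.median(depths)).astype(int)

# one hypothetical 15-min snapshot at 8 of the 44 wells (depths in cm)
snapshot = np.array([5.0, 12.0, 30.0, 45.0, 8.0, 60.0, 22.0, 18.0])
ind = indicator_transform(snapshot)
```

Using the median of each snapshot (rather than a fixed depth) keeps the indicator proportions balanced across wet and dry periods, so that changes in the semivariogram reflect spatial structure rather than the overall wetness.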

  2. Geostatistical investigation into the temporal evolution of spatial structure in a shallow water table

    NASA Astrophysics Data System (ADS)

    Lyon, S. W.; Seibert, J.; Lembo, A. J.; Walter, M. T.; Steenhuis, T. S.

    2006-02-01

    Shallow water tables near-streams often lead to saturated, overland flow generating areas in catchments in humid climates. While these saturated areas are assumed to be principal biogeochemical hot-spots and important for issues such as non-point pollution sources, the spatial and temporal behavior of shallow water tables, and associated saturated areas, is not completely understood. This study demonstrates how geostatistical methods can be used to characterize the spatial and temporal variation of the shallow water table for the near-stream region. Event-based and seasonal changes in the spatial structure of the shallow water table, which influences the spatial pattern of surface saturation and related runoff generation, can be identified and used in conjunction to characterize the hydrology of an area. This is accomplished through semivariogram analysis and indicator kriging to produce maps combining soft data (i.e., proxy information to the variable of interest) representing general shallow water table patterns with hard data (i.e., actual measurements) that represent variation in the spatial structure of the shallow water table per rainfall event. The area used was a hillslope in the Catskill Mountains region of New York State. The shallow water table was monitored for a 120 m×180 m near-stream region at 44 sampling locations on 15-min intervals. Outflow of the area was measured at the same time interval. These data were analyzed at a short time interval (15 min) and at a long time interval (months) to characterize the changes in the hydrologic behavior of the hillslope. Indicator semivariograms based on binary-transformed ground water table data (i.e., 1 if exceeding the time-variable median depth to water table and 0 if not) were created for both short and long time intervals. For the short time interval, the indicator semivariograms showed a high degree of spatial structure in the shallow water table for the spring, with increased range during many rain

  3. Spatial Pattern of Great Lakes Estuary Processes from Water Quality Sensing and Geostatistical Methods

    NASA Astrophysics Data System (ADS)

    Xu, W.; Minsker, B. S.; Bailey, B.; Collingsworth, P.

    2014-12-01

    Mixing of river and lake water can alter water temperature, conductivity, and other properties that influence ecological processes in freshwater estuaries of the Great Lakes. This study uses geostatistical methods to rapidly visualize and understand water quality sampling results and to enable adaptive sampling that removes anomalies and explores interesting phenomena in more detail. Triaxus, a towed undulating sensor package, was used to collect various physical and biological water quality parameters in three estuary areas of Lake Michigan in summer 2011. Given the particular sampling pattern, data quality assurance and quality control (QA/QC) processes, including sensor synchronization, upcast and downcast separation, and spatial outlier removal, are first applied. An automated kriging interpolation approach that accounts for trend and anisotropy is then proposed to estimate data on a gridded map for direct visualization. Other methods are explored with the data to gain further insight into water quality processes. Local G statistics serve as a supplementary tool to direct visualization. The method identifies statistically high-value zones (hot spots) and low-value zones (cold spots) in water chemistry across the estuaries, including locations of water sources and intrusions. In addition, chlorophyll concentration distributions differ among sites. To further understand the interactions and differences between river and lake water, the K-means clustering algorithm is used to spatially cluster the water based on temperature and specific conductivity. Statistical analysis indicates that clusters dominated by river water can be identified from higher turbidity, specific conductivity, and chlorophyll concentrations. Different ratios between zooplankton biomass and density indicate different zooplankton structure across clusters. All of these methods can contribute to improved near real-time analysis of future sampling activity.
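K-means clustering on temperature and specific conductivity can be sketched with plain Lloyd iterations. The two synthetic "water masses" below are invented stand-ins for lake and river water, and standardization before clustering is a choice of this sketch:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's algorithm; features should be standardized first."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

# synthetic samples: cool/low-conductivity lake vs warm/high-conductivity river
rng = np.random.default_rng(3)
lake = rng.normal([10.0, 250.0], [0.5, 10.0], (100, 2))    # (temp C, uS/cm)
river = rng.normal([18.0, 600.0], [0.5, 10.0], (100, 2))
X = np.vstack([lake, river])
Xs = (X - X.mean(0)) / X.std(0)     # standardize so both features count equally
labels, _ = kmeans(Xs, 2)
```

With well-separated water masses the two clusters recover the lake/river split, which is the behavior the study exploits to map intrusions.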

  4. Accounting for geophysical information in geostatistical characterization of unexploded ordnance (UXO) sites.

    SciTech Connect

    Saito, Hirotaka; Goovaerts, Pierre; McKenna, Sean Andrew

    2003-06-01

    Efficient and reliable unexploded ordnance (UXO) site characterization is needed for decisions regarding future land use. There are several types of data available at UXO sites, and geophysical signal maps are one of the most valuable sources of information. Incorporation of such information into site characterization requires a flexible and reliable methodology. Geostatistics allows one to account for exhaustive secondary information (i.e., known at every location within the field) in many different ways. Kriging and logistic regression were combined to map the probability of occurrence of at least one geophysical anomaly of interest, such as UXO, from a limited number of indicator data. Logistic regression is used to derive the trend from a geophysical signal map, and kriged residuals are added to the trend to estimate the probabilities of the presence of UXO at unsampled locations (simple kriging with varying local means, or SKlm). Each location is identified for further remedial action if the estimated probability is greater than a given threshold. The technique is illustrated using a hypothetical UXO site generated by a UXO simulator and a corresponding geophysical signal map. Indicator data are collected along two transects located within the site. Classification performance is then assessed by computing the proportions of correct classifications, false positives, and false negatives, and the kappa statistic. Two common approaches, one of which does not take any secondary information into account (ordinary indicator kriging) and a variant of common cokriging (collocated cokriging), were used for comparison purposes. Results indicate that accounting for exhaustive secondary information improves the overall characterization of UXO sites if an appropriate methodology, SKlm in this case, is used.
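The classification scores used here (proportion correct, false positives/negatives, and the kappa statistic) are straightforward to compute from paired truth/prediction indicators. The data below are made up for illustration:

```python
import numpy as np

def classification_stats(truth, pred):
    """Proportion correct, false-positive and false-negative rates, and
    Cohen's kappa (agreement corrected for chance, from marginal frequencies)."""
    truth, pred = np.asarray(truth), np.asarray(pred)
    correct = np.mean(truth == pred)
    fp = np.mean((pred == 1) & (truth == 0))
    fn = np.mean((pred == 0) & (truth == 1))
    pe = (np.mean(pred == 1) * np.mean(truth == 1)
          + np.mean(pred == 0) * np.mean(truth == 0))  # chance agreement
    kappa = (correct - pe) / (1.0 - pe)
    return correct, fp, fn, kappa

truth = [1, 1, 0, 0, 1, 0, 0, 0]   # 1 = UXO present at the location
pred  = [1, 0, 0, 0, 1, 1, 0, 0]   # classified map
acc, fp, fn, kappa = classification_stats(truth, pred)
```

Kappa discounts the agreement expected by chance, which matters at UXO sites where most locations are anomaly-free and raw accuracy alone is misleading.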

  5. Estimating the Depth of Stratigraphic Units from Marine Seismic Profiles Using Nonstationary Geostatistics

    SciTech Connect

    Chihi, Hayet; Galli, Alain; Ravenne, Christian; Tesson, Michel; Marsily, Ghislain de

    2000-03-15

    The object of this study is to build a three-dimensional (3D) geometric model of the stratigraphic units of the margin of the Rhone River on the basis of geophysical investigation by a network of seismic profiles at sea. The geometry of these units is described by depth charts of each surface identified by seismic profiling, constructed using geostatistics. The modeling starts with a statistical analysis in which we determine the parameters needed to calculate the variograms of the identified surfaces. Having determined the statistical parameters, we calculate the variograms of the variable Depth. By analyzing the behavior of the variogram we can then deduce whether the variable is stationary and whether it behaves anisotropically. We tried the following two nonstationary methods to obtain our estimates: (a) the method of universal kriging, if the underlying variogram was directly accessible; (b) the method of increments, if the underlying variogram was not directly accessible. After having modeled the variograms of the increments and of the variable itself, we calculated the surfaces by kriging the variable Depth on a fine-mesh estimation grid. The two methods are then compared and their respective advantages and disadvantages are discussed, as well as their fields of application. These methods can be used widely in the earth sciences for automatic mapping of geometric surfaces, or of variables such as a piezometric surface or a concentration, that are not 'stationary,' that is, that essentially possess a gradient or a tendency to develop systematically in space.

  6. Geostatistical modeling of the spatial distribution of sediment oxygen demand within a Coastal Plain blackwater watershed.

    PubMed

    Todd, M Jason; Lowrance, R Richard; Goovaerts, Pierre; Vellidis, George; Pringle, Catherine M

    2010-10-15

    Blackwater streams are found throughout the Coastal Plain of the southeastern United States and are characterized by a series of instream floodplain swamps that play a critical role in determining the water quality of these systems. Within the state of Georgia, many of these streams are listed in violation of the state's dissolved oxygen (DO) standard. Previous work has shown that sediment oxygen demand (SOD) is elevated in instream floodplain swamps; because of these areas of intense oxygen demand, such locations play a major role in determining the oxygen balance of the watershed as a whole. This work also showed SOD rates to be positively correlated with the concentration of total organic carbon (TOC). This study builds on previous work by using geostatistics and sequential Gaussian simulation to investigate the patchiness and distribution of TOC at the reach scale. This was achieved by interpolating TOC observations and simulating SOD rates based on a linear regression. Additionally, this study identifies areas within the stream system prone to high SOD at representative 3rd- and 5th-order locations. Results show that SOD was spatially correlated with the differences in the distribution of TOC at both locations and that these differences in distribution are likely a result of the differing hydrologic regime and watershed position. Mapping of floodplain soils at the watershed scale shows that areas of organic sediment are widespread and become more prevalent in higher-order streams. DO dynamics within blackwater systems are a complicated mix of natural and anthropogenic influences, but this paper illustrates the importance of instream swamps in enhancing SOD at the watershed scale. Moreover, our study illustrates the influence of instream swamps on oxygen demand while providing support that many of these systems are naturally low in DO. PMID:20938491
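The regression-based simulation of SOD from TOC can be sketched as a linear fit plus Gaussian residual noise. This is a simplified stand-in: the paper simulates the TOC field itself with sequential Gaussian simulation, which is not reproduced here, and all calibration numbers below are invented:

```python
import numpy as np

# hypothetical calibration data: SOD (g O2 m^-2 d^-1) vs TOC (%)
toc = np.array([1.0, 2.5, 4.0, 6.0, 8.5, 10.0])
sod = np.array([0.5, 1.1, 1.8, 2.6, 3.9, 4.4])
slope, intercept = np.polyfit(toc, sod, 1)   # ordinary least squares line

def simulate_sod(toc_map, resid_sd, seed=0):
    """Regression estimate plus Gaussian residual noise, one realization."""
    rng = np.random.default_rng(seed)
    return intercept + slope * toc_map + rng.normal(0.0, resid_sd, toc_map.shape)

field_toc = np.full((20, 20), 5.0)           # stand-in for a simulated TOC field
sod_real = simulate_sod(field_toc, resid_sd=0.2)
```

Running this on each TOC realization propagates the TOC uncertainty into SOD maps, which is the spirit of the paper's workflow.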

  7. A space-time geostatistical framework for ensemble nowcasting using rainfall radar fields and gauge data

    NASA Astrophysics Data System (ADS)

    Caseri, Angelica; Ramos, Maria Helena; Javelle, Pierre; Leblois, Etienne

    2016-04-01

    Floods are responsible for a major part of the total damage caused by natural disasters. Nowcasting systems providing public alerts to flash floods are very important to prevent damage from extreme events and reduce their socio-economic impacts. The major challenge of these systems is to capture high-risk situations in advance, with good accuracy in the intensity, location and timing of future intense precipitation events. Flash flood forecasting has been studied by several authors in different affected areas. The majority of the studies combine rain gauge data with radar imagery advection to improve prediction for the next few hours. Outputs of Numerical Weather Prediction (NWP) models have also been increasingly used to predict ensembles of extreme precipitation events that might trigger flash floods. One of the challenges of the use of NWP for ensemble nowcasting is to generate ensemble forecasts of precipitation within a short calculation time, to enable the production of flood forecasts sufficiently in advance to issue flash flood alerts. In this study, we investigate an alternative space-time geostatistical framework to generate multiple scenarios of future rainfall for flash flood nowcasting. The approach is based on conditional simulation and an advection method applied within the Turning Bands Method (TBM). Ensemble forecasts of precipitation fields are generated based on space-time properties given by radar images and precipitation data collected from rain gauges during the development of the rainfall event. The results show that the approach can be an interesting alternative for capturing precipitation uncertainties in location and intensity and for generating ensemble forecasts of rainfall that can help improve alerts for flash floods, especially in small areas.

  8. Applying simple geostatistical techniques to a routine production geology problem - a case study

    SciTech Connect

    Norris, R.J.; Hewitt, A.; Massonnat, G.J.

    1994-07-01

    A production geology reservoir description was shown to represent poorly the known dynamic reservoir behavior. The permeability field was originally generated from a general porosity-permeability law applied to a contoured porosity field. This resulted in unrealistically high permeability values in some layers and early water breakthrough in flow simulations. Furthermore, known well values of permeability were not honored. To improve the model, a geostatistical approach was used. The first step involved the use of a porosity-permeability law based on total layer porosity values, that is, with no cutoff applied, allowing values of k closer to the known reservoir values to be obtained. This alone produces a permeability field which, although it better represents the absolute permeability values, is still too smooth: preferential pathways occur as artifacts of contouring. A second step, therefore, involves the "unsmoothing" of the permeability field. At each point of the permeability field, the permeability values are resampled from a distribution centered around the original value. This is a form of Monte Carlo replacement, conditioned to well data and honoring the original trend in the data. In this case no correlation was included due to lack of information. This simple Gaussian simulation approach provides multiple realizations based on deterministic information. The advantages are clear: more realistic images (no artificial pathways), improved match of water breakthrough, and the honoring of all deterministic data (wells and trend). A subsequent step was the incorporation of further "soft" information. Geological analysis suggested that there was a degradation of reservoir properties along the east-west axis. This new information was rapidly assimilated into the model to produce final images of the reservoir.

  9. Using GIS geostatistical techniques to study the temporal and spatial rainfall characteristics in Romania

    NASA Astrophysics Data System (ADS)

    Nertan, A. T.; Panaitescu, V.; Stancalie, Gh.

    2009-09-01

    Floods are among the most devastating natural hazards in the world, claiming more lives and causing more property damage than any other natural phenomena. In the last 15 years floods and accompanying landslides have occurred quite frequently in Romania, some isolated, others affecting wide areas of the country's territory. When rainfall has significant spatial variation, complete rainfall data for each region with different rainfall characteristics are very important. This study aimed at using GIS statistical interpolation techniques to study the temporal and spatial rainfall characteristics and map the spatiotemporal variation in rainfall in Romania. Two spatial interpolation methods were compared for this research work: the deterministic Inverse Distance Weighting (IDW) method and Ordinary Kriging as a stochastic, geostatistical method. The two interpolation methods are applied to estimate missing values on the basis of the remaining observed ones. The results from this study demonstrate the usefulness of GIS techniques in assessing rainfall variability. The paper also describes the application software, based on GIS technology, that automatically creates maps of the distribution of total precipitation amounts for intervals of interest (i.e. monthly, annual, for critical vegetation periods, seasonal, for agricultural years, etc.). The application automatically accesses the corresponding database and offers the possibility to save these maps in different file formats (JPEG, PDF) which can be sent to decision makers in order to carry out strategic decisions appropriate to each situation. It also offers users the possibility to interrogate databases and create graphs to analyze the temporal evolution of the selected meteorological parameter.
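    As a concrete illustration of the IDW method compared above, a minimal implementation is sketched below. The station coordinates and rainfall values are invented for illustration and are not the study's data:

```python
import numpy as np

def idw(xy_obs, z_obs, xy_new, power=2.0):
    """Inverse Distance Weighting: weight each observation by 1/d**power."""
    d = np.linalg.norm(xy_obs[None, :, :] - xy_new[:, None, :], axis=2)
    d = np.where(d == 0, 1e-12, d)          # avoid division by zero at a gauge
    w = 1.0 / d**power
    return (w * z_obs).sum(axis=1) / w.sum(axis=1)

# Toy monthly rainfall (mm) at four hypothetical stations on a 10x10 km square
stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
rain = np.array([40.0, 60.0, 50.0, 70.0])
grid = np.array([[5.0, 5.0]])               # centre of the square
print(idw(stations, rain, grid))            # equidistant, so the plain mean: [55.]
```

Cross-validation of IDW against kriging, as done in the paper, would simply hold out one station at a time and compare the prediction at its location with the observed value.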

  10. Using Geostatistical Data Fusion Techniques and MODIS Data to Upscale Simulated Wheat Yield

    NASA Astrophysics Data System (ADS)

    Castrignano, A.; Buttafuoco, G.; Matese, A.; Toscano, P.

    2014-12-01

    Population growth increases food demand. Assessing food demand and predicting the actual supply for a given location are critical components of strategic food security planning at regional scale. Crop yield can be simulated using crop models because it is site-specific and determined by weather, management, length of growing season and soil properties. Crop models require reliable location-specific data that are not generally available. Obtaining these data at a large number of locations is time-consuming, costly and sometimes simply not feasible. An upscaling method to extend coverage of sparse estimates of crop yield to an appropriate extrapolation domain is required. This work aims to investigate the applicability of a geostatistical data fusion approach for merging remote sensing data with the predictions of a simulation model of wheat growth and production using ground-based data. The study area is the Capitanata plain (4000 km2) located in Apulia Region, mostly cropped with durum wheat. The MODIS EVI/NDVI data products for the Capitanata plain were downloaded from the Land Processes Distributed Active Archive Center (LPDAAC) for the whole crop cycle of durum wheat. Phenological development, biomass growth and grain quantity of durum wheat were simulated by the Delphi system, based on a crop simulation model linked to a database including soil properties, agronomical and meteorological data. Multicollocated cokriging was used to integrate secondary exhaustive information (multi-spectral MODIS data) with the primary variable (sparsely distributed biomass/yield model predictions of durum wheat). The model estimates were strongly spatially correlated with the radiance data (red and NIR bands), and the data fusion approach proved to be quite suitable and flexible for integrating data of different type and support.

  11. Geostatistical modeling of the spatial distribution of sediment oxygen demand within a Coastal Plain blackwater watershed

    PubMed Central

    Todd, M. Jason; Lowrance, R. Richard; Goovaerts, Pierre; Vellidis, George; Pringle, Catherine M.

    2010-01-01

    Blackwater streams are found throughout the Coastal Plain of the southeastern United States and are characterized by a series of instream floodplain swamps that play a critical role in determining the water quality of these systems. Within the state of Georgia, many of these streams are listed in violation of the state's dissolved oxygen (DO) standard. Previous work has shown that sediment oxygen demand (SOD) is elevated in instream floodplain swamps and that, because of this intense oxygen demand, these locations play a major role in determining the oxygen balance of the watershed as a whole. This work also showed SOD rates to be positively correlated with the concentration of total organic carbon (TOC). This study builds on previous work by using geostatistics and Sequential Gaussian Simulation to investigate the patchiness and distribution of TOC at the reach scale. This was achieved by interpolating TOC observations and simulating SOD rates based on a linear regression. Additionally, this study identifies areas within the stream system prone to high SOD at representative 3rd- and 5th-order locations. Results show that SOD was spatially correlated with the differences in distribution of TOC at both locations and that these differences in distribution are likely a result of the differing hydrologic regime and watershed position. Mapping of floodplain soils at the watershed scale shows that areas of organic sediment are widespread and become more prevalent in higher order streams. DO dynamics within blackwater systems are a complicated mix of natural and anthropogenic influences, but this paper illustrates the importance of instream swamps in enhancing SOD at the watershed scale. Moreover, our study illustrates the influence of instream swamps on oxygen demand while providing support that many of these systems are naturally low in DO. PMID:20938491

  12. Geostatistical radar-raingauge merging: A novel method for the quantification of rain estimation accuracy

    NASA Astrophysics Data System (ADS)

    Delrieu, Guy; Wijbrans, Annette; Boudevillain, Brice; Faure, Dominique; Bonnifait, Laurent; Kirstetter, Pierre-Emmanuel

    2014-09-01

    Compared to other estimation techniques, one advantage of geostatistical techniques is that they provide an index of the estimation accuracy of the variable of interest: the kriging estimation standard deviation (ESD). In the context of radar-raingauge quantitative precipitation estimation (QPE), we address in this article the question of how the kriging ESD can be transformed into a local spread of error by using the dependency of radar errors on the rain amount analyzed in previous work. The proposed approach is implemented for the most significant rain events observed in 2008 in the Cévennes-Vivarais region, France, by considering both the kriging with external drift (KED) and the ordinary kriging (OK) methods. A two-step procedure is implemented for estimating the rain estimation accuracy: (i) first, kriging normalized ESDs are computed by using normalized variograms (sill equal to 1) to account for the observation system configuration and the spatial structure of the variable of interest (rainfall amount, residuals to the drift); (ii) based on the assumption of a linear relationship between the standard deviation and the mean of the variable of interest, a denormalization of the kriging ESDs is performed globally for a given rain event by using a cross-validation procedure. Despite the fact that the KED normalized ESDs are usually greater than the OK ones (due to an additional constraint in the kriging system and a weaker spatial structure of the residuals to the drift), the KED denormalized ESDs are generally smaller than the OK ones, a result consistent with the better performance observed for the KED technique. The evolution of the mean and the standard deviation of the rainfall-scaled ESDs over a range of spatial (5-300 km2) and temporal (1-6 h) scales demonstrates that there is clear added value of the radar with respect to the raingauge network for the shortest scales, which are those of interest for flash-flood prediction in the considered region.
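    The two-step procedure can be illustrated with a minimal ordinary-kriging sketch: step (i) computes a normalized ESD from a unit-sill variogram, step (ii) rescales it with an assumed linear sd-mean relation. The variogram model, range, gauge layout and scaling coefficient below are all hypothetical, not values from the Cévennes-Vivarais study:

```python
import numpy as np

def ok_normalized_esd(xy_obs, xy_tgt, vrange=50.0):
    """Ordinary-kriging estimation std. dev. with a unit-sill spherical variogram.

    With sill = 1, the ESD reflects only the gauge geometry and spatial
    structure, not the rain magnitude (step (i) of the procedure)."""
    def gamma(h):
        h = np.minimum(h / vrange, 1.0)
        return 1.5 * h - 0.5 * h**3            # spherical model, sill 1
    n = len(xy_obs)
    d = np.linalg.norm(xy_obs[:, None] - xy_obs[None, :], axis=2)
    A = np.ones((n + 1, n + 1)); A[:n, :n] = gamma(d); A[n, n] = 0.0
    d0 = np.linalg.norm(xy_obs - xy_tgt, axis=1)
    b = np.append(gamma(d0), 1.0)
    lam = np.linalg.solve(A, b)                # kriging weights + Lagrange mult.
    var = lam[:n] @ gamma(d0) + lam[n]         # normalized kriging variance
    return np.sqrt(max(var, 0.0))

gauges = np.array([[0.0, 0.0], [20.0, 0.0], [0.0, 20.0]])  # km, hypothetical
esd_n = ok_normalized_esd(gauges, np.array([10.0, 10.0]))
# Step (ii): denormalize with an assumed linear sd-mean relation sigma = c * m,
# where c would come from event-wise cross-validation (c = 0.4 is hypothetical)
print(0.4 * 25.0 * esd_n)  # local spread of error for a 25 mm rain estimate
```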

  13. Reservoir Characterization using geostatistical and numerical modeling in GIS with noble gas geochemistry

    NASA Astrophysics Data System (ADS)

    Vasquez, D. A.; Swift, J. N.; Tan, S.; Darrah, T. H.

    2013-12-01

    The integration of precise geochemical analyses with quantitative engineering modeling into an interactive GIS system allows for a sophisticated and efficient method of reservoir engineering and characterization. Geographic Information Systems (GIS) are utilized as an advanced technique for oil field reservoir analysis by combining field engineering and geological/geochemical spatial datasets with the available systematic modeling and mapping methods to integrate the information into a spatially correlated first-hand approach in defining surface and subsurface characteristics. Three key methods of analysis include: 1) geostatistical modeling to create a static and volumetric 3-dimensional representation of the geological body, 2) numerical modeling to develop a dynamic and interactive 2-dimensional model of fluid flow across the reservoir and 3) noble gas geochemistry to further define the physical conditions, components and history of the geologic system. Results thus far include using engineering algorithms for interpolating electrical well log properties across the field (spontaneous potential, resistivity), yielding a highly accurate and high-resolution 3D model of rock properties. Results so far also include using numerical finite difference methods (Crank-Nicolson) to solve the equations describing the distribution of pressure across the field, yielding a 2D simulation model of fluid flow across the reservoir. Ongoing noble gas geochemistry results will also include determination of the source, thermal maturity and the extent/style of fluid migration (connectivity, continuity and directionality). Future work will include developing an inverse engineering algorithm to model permeability, porosity and water saturation. This combination of new and efficient technological and analytical capabilities is geared to provide a better understanding of the field geology and hydrocarbon dynamics system with applications to determine the presence of hydrocarbon pay zones (or
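    The Crank-Nicolson scheme mentioned above can be sketched for a 1D pressure-diffusion analogue. The grid size, diffusivity and boundary pressures below are illustrative values, not field data:

```python
import numpy as np

def crank_nicolson_pressure(p0, D, dx, dt, steps):
    """Solve dp/dt = D * d2p/dx2 with fixed-pressure (Dirichlet) boundaries.

    Crank-Nicolson averages the explicit and implicit stencils, giving an
    unconditionally stable, second-order-accurate scheme."""
    n = len(p0)
    r = D * dt / (2 * dx**2)
    A = np.eye(n) * (1 + 2 * r); B = np.eye(n) * (1 - 2 * r)
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = -r
        B[i, i + 1] = B[i + 1, i] = r
    # Dirichlet boundaries: keep the end values fixed at every time step
    A[0] = A[-1] = 0; A[0, 0] = A[-1, -1] = 1
    B[0] = B[-1] = 0; B[0, 0] = B[-1, -1] = 1
    p = p0.copy()
    for _ in range(steps):
        p = np.linalg.solve(A, B @ p)          # implicit solve per time step
    return p

# Toy case: pressure relaxing between two wells held at 100 and 50 (arbitrary units)
p0 = np.full(11, 100.0); p0[-1] = 50.0
p = crank_nicolson_pressure(p0, D=1.0, dx=1.0, dt=0.5, steps=2000)
print(p.round(1))  # approaches the linear steady-state profile from 100 to 50
```

In 2D the same idea applies, with the tridiagonal solve replaced by a banded or ADI solve over the field grid.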

  14. A Geostatistical Framework for Estimating Rain Intensity Fields Using Dense Rain Gauge Networks

    NASA Astrophysics Data System (ADS)

    Benoit, L.; Mariethoz, G.

    2015-12-01

    Rain gauges provide direct and continuous observations of rain accumulation with a high time resolution (up to 1 min). However, the representativeness of these measurements is restricted to the funnel where rainwater is collected. Due to the high spatial heterogeneity of rainfall, this poor spatial representativeness is a strong limitation for the detailed reconstruction of rain intensity fields. Here we propose a geostatistical framework that is able to generate an ensemble of simulated rain fields based on data from a dense rain gauge network. When the density of rain gauges is high (sensor spacing in the range 500 m to 1 km), the spatial correlation between precipitation time series becomes sufficient to identify and track the rain patterns observed at the rain gauge sampling rate. Rain observations derived from such networks can thus be used to reconstruct the rain field with a high resolution in both space and time (i.e. 1 min in time, 100 m in space). Our method produces an ensemble of realizations that honor the rain intensities measured throughout the rain gauge network and preserve the main features of the rain intensity field at the considered scale, i.e. the advection and morphing properties of rain cells over time, the intermittency and the skewed distribution of rainfall, and the decrease of the rain rate near the rain cell borders (dry drift). This makes it possible to image the observed rain field and characterize its main features, as well as to quantify the related uncertainty. The obtained reconstructions of the rainfall field are continuous in time and can therefore complement weather radar observations, which are snapshots of the rain field. In addition, applying this method to networks with a spatial extent comparable to that of a radar pixel (i.e. around 1 km2) could allow exploration of the rain field within a single radar pixel.

  15. Setting Objectives

    ERIC Educational Resources Information Center

    Elkins, Aaron J.

    1977-01-01

    The author questions the extent to which educators have relied on "relevance" and learner participation in objective-setting in the past decade. He describes a useful approach to learner-oriented evaluation in which content relevance was not judged by participants until after they had been exposed to it. (MF)

  16. Mapping of soil salinity: a comparative study between deterministic and geostatistical methods, case of the Tadla plain (Morocco)

    NASA Astrophysics Data System (ADS)

    Barbouchi, Meriem; Chokmani, Karem; Ben Aissa, Nadhira; Lhissou, Rachid; El Harti, Abderrazak; Abdelfattah, Riadh

    2013-04-01

    Soil salinization hazard in semi-arid regions such as central Morocco is increasingly affecting arable lands, due to the combined effects of anthropogenic activities (development of irrigation) and climate change (multiplying drought episodes). In a rational strategy to fight this hazard, salinity mapping is a key step to ensure effective spatiotemporal monitoring. The objective of this study is to test the effectiveness of the geostatistical approach in mapping soil salinity compared to more straightforward deterministic interpolation methods. Three soil salinity sampling campaigns (27 September, 24 October and 19 November 2011) were conducted over the irrigated area of the Tadla plain, situated between the High and Middle Atlas in central Morocco. Each campaign consisted of 38 surface soil samples (upper 5 cm). From each sample the electrical conductivity (EC) was determined in saturated paste extract and subsequently used as a proxy of soil salinity. The potential of a deterministic interpolation method (IDW) and a geostatistical technique (Ordinary Kriging) in mapping surface soil salinity was evaluated in a GIS environment through a cross-validation technique. Field measurements showed that soil salinity was generally low except during the second campaign, when a significant increase in EC values was recorded. Interpolation results showed a better performance with the geostatistical approach compared to the deterministic one. Indeed, for all the campaigns, cross-validation yielded lower RMSE and bias for Kriging than for IDW. However, the performance of the two methods was dependent on the range and the structure of the spatial variability of salinity. Indeed, Kriging showed better accuracy for the second campaign in comparison with the two others. This could be explained by the wider range of soil salinity values during this campaign, which resulted in a greater range of spatial dependence and better modeling of the spatial variability of salinity.

  17. Integration of geology, geostatistics, well logs and pressure data to model a heterogeneous supergiant field in Iran

    SciTech Connect

    Samimi, B.; Bagherpour, H.; Nioc, A.

    1995-08-01

    The geological reservoir study of the supergiant Ahwaz field significantly improved the history matching process in many aspects, particularly through the development of a geostatistical model which provided a sound basis for changes and delivered much-needed accurate estimates of grid block vertical permeabilities. The geostatistical reservoir evaluation was facilitated by using the Heresim package and litho-stratigraphic zonations for the entire field. For each of the geological zones, 3-dimensional electrolithofacies and petrophysical property distributions (realizations) were created, capturing the heterogeneities which significantly affected fluid flow. However, as this level of heterogeneity was at a significantly smaller scale than the flow simulation grid blocks, a scaling-up effort was needed to derive the effective flow properties of the blocks (porosity, horizontal and vertical permeability, and water saturation). The properties relating to the static reservoir description were accurately derived by using stream tube techniques developed in-house, whereas the relative permeabilities of the grid blocks were derived by dynamic pseudo relative permeability techniques. The prediction of vertical and lateral communication and water encroachment was facilitated by a close integration of pressure and saturation data, geostatistical modelling and sedimentological studies of the depositional environments and paleocurrents. The nature of reservoir barriers and baffles varied both vertically and laterally in this heterogeneous reservoir. Maps showing differences in pressure between zones after years of production served as a guide to integrating the static geological studies with the dynamic behaviour of each of the 16 reservoir zones. The use of deep wells being drilled to a deeper reservoir provided data to better understand the sweep efficiency and the continuity of barriers and baffles.
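    The scaling-up step can be illustrated with the classical thickness-weighted averages for a layered grid block. The in-house stream-tube technique itself is not described in the abstract, so this is only the textbook approximation, with invented layer values; it shows why vertical permeability needs separate treatment:

```python
import numpy as np

# Layer permeabilities (mD) and thicknesses (m) within one simulation grid block
k = np.array([500.0, 5.0, 200.0])   # hypothetical: a thin tight layer in the middle
h = np.array([2.0, 1.0, 3.0])

# Flow parallel to layers: thickness-weighted arithmetic mean
kh = (h * k).sum() / h.sum()
# Flow across layers: thickness-weighted harmonic mean (the tight layer dominates)
kv = h.sum() / (h / k).sum()

print(round(kh, 1), round(kv, 1))   # kh = 267.5, kv = 27.4
```

The large kh/kv contrast produced by a single thin barrier is exactly why accurate grid block vertical permeabilities mattered so much for the history match.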

  18. Geostatistical Methods For Determination of Roughness, Topography, And Changes of Antarctic Ice Streams From SAR And Radar Altimeter Data

    NASA Technical Reports Server (NTRS)

    Herzfeld, Ute C.

    2002-01-01

    The central objective of this project has been the development of geostatistical methods for mapping elevation and ice surface characteristics from satellite radar altimeter (RA) and Synthetic Aperture Radar (SAR) data. The main results are an atlas of elevation maps of Antarctica from GEOSAT RA data and an atlas from ERS-1 RA data, including a total of about 200 maps with 3 km grid resolution. Maps and digital terrain models are applied to monitor and study changes in Antarctic ice streams and glaciers, including Lambert Glacier/Amery Ice Shelf, Mertz and Ninnis Glaciers, Jutulstraumen Glacier, Fimbul Ice Shelf, Slessor Glacier, Williamson Glacier and others.

  19. SETS. Set Equation Transformation System

    SciTech Connect

    Worrel, R.B.

    1992-01-13

    SETS is used for symbolic manipulation of Boolean equations, particularly the reduction of equations by the application of Boolean identities. It is a flexible and efficient tool for performing probabilistic risk analysis (PRA), vital area analysis, and common cause analysis. The equation manipulation capabilities of SETS can also be used to analyze noncoherent fault trees and determine prime implicants of Boolean functions, to verify circuit design implementation, to determine minimum cost fire protection requirements for nuclear reactor plants, to obtain solutions to combinatorial optimization problems with Boolean constraints, and to determine the susceptibility of a facility to unauthorized access through nullification of sensors in its protection system.
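    The kind of Boolean reduction described above rests on identities such as absorption, A + A·B = A: any cut set containing another cut set is redundant. A generic few-line illustration (not SETS's actual implementation) is:

```python
def minimize_cut_sets(cut_sets):
    """Apply the absorption identity A + A*B = A: drop any cut set that
    is a strict superset of another cut set."""
    sets = [frozenset(c) for c in cut_sets]
    return sorted(
        (set(c) for c in sets
         if not any(other < c for other in sets)),   # strict-subset test
        key=sorted)

# Fault-tree products {A}, {A,B}, {B,C}: A + AB + BC reduces to A + BC
print(minimize_cut_sets([{"A"}, {"A", "B"}, {"B", "C"}]))
```

Repeated application of such identities is what turns a raw fault-tree expansion into its minimal cut sets (prime implicants in the coherent case).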

  20. Potential of deterministic and geostatistical rainfall interpolation under high rainfall variability and dry spells: case of Kenya's Central Highlands

    NASA Astrophysics Data System (ADS)

    Kisaka, M. Oscar; Mucheru-Muna, M.; Ngetich, F. K.; Mugwe, J.; Mugendi, D.; Mairura, F.; Shisanya, C.; Makokha, G. L.

    2016-04-01

    Drier parts of Kenya's Central Highlands endure persistent crop failure and declining agricultural productivity. These have, in part, been attributed to high temperatures, prolonged dry spells and erratic rainfall. Understanding spatial-temporal variability of climatic indices such as rainfall at the seasonal level is critical for optimal rain-fed agricultural productivity and natural resource management in the study area. However, the predominant setbacks in analysing hydro-meteorological events are occasioned by lacking, inadequate or inconsistent meteorological data. As in most other places, the sole sources of climatic data in the study region are scarce and limited to single stations, yet with persistent missing/unrecorded data making their utilization a challenge. This study examined seasonal anomalies and variability in rainfall, drought occurrence and the efficacy of interpolation techniques in the drier regions of eastern Kenya. Rainfall data from five stations (Machang'a, Kiritiri, Kiambere, Kindaruma and Embu) were sourced from both the Kenya Meteorology Department and on-site primary recording. Owing to ongoing experimental work, automated recording of primary dailies in Machang'a has been ongoing since the year 2000; thus, Machang'a was treated as the reference (for period of record) station for selection of other stations in the region. The other stations had data sets of over 15 years with missing data of less than 10 %, as required by the World Meteorological Organization, whose quality check is subject to the Centre for Climate Systems Modeling (C2SM) through the MeteoSwiss and EMPA bodies. The dailies were also subjected to homogeneity testing to evaluate whether they came from the same population. Rainfall anomaly index, coefficients of variance and probability were utilized in the analyses of rainfall variability. Spline, kriging and inverse distance weighting interpolation techniques were assessed using daily rainfall data and
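    The variability statistics named above can be sketched as follows. The anomaly index here is a simplified standardized-anomaly variant (not necessarily the exact index used in the study), and the seasonal totals are invented for illustration:

```python
import numpy as np

def rainfall_anomaly_index(seasonal_totals):
    """Standardized anomaly of each season relative to the record mean
    (a simplified variant of a rainfall anomaly index)."""
    r = np.asarray(seasonal_totals, dtype=float)
    return (r - r.mean()) / r.std(ddof=1)

def coefficient_of_variation(seasonal_totals):
    """CV (%): values above roughly 30 % are commonly read as high variability."""
    r = np.asarray(seasonal_totals, dtype=float)
    return 100.0 * r.std(ddof=1) / r.mean()

# Hypothetical seasonal rainfall totals (mm) for one station
totals = [320, 180, 450, 90, 260, 310]
print(coefficient_of_variation(totals).round(1))   # well above 30 %: erratic rainfall
print(rainfall_anomaly_index(totals).round(2))     # wet seasons > 0, droughts < 0
```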

  1. Interpolating Non-Parametric Distributions of Hourly Rainfall Intensities Using Random Mixing

    NASA Astrophysics Data System (ADS)

    Mosthaf, Tobias; Bárdossy, András; Hörning, Sebastian

    2015-04-01

    The correct spatial interpolation of hourly rainfall intensity distributions is of great importance for stochastic rainfall models. Poorly interpolated distributions may lead to over- or underestimation of rainfall and consequently to erroneous estimates in subsequent applications, such as hydrological or hydraulic models. By analyzing the spatial relation of empirical rainfall distribution functions, a persistent order of the quantile values over a wide range of non-exceedance probabilities is observed. As the order remains similar, the interpolation weights of quantile values for one particular non-exceedance probability can be applied to the other probabilities. This assumption enables the use of kernel-smoothed distribution functions for interpolation purposes. Comparing the order of hourly quantile values over different gauges with the order of their daily quantile values for equal probabilities results in high correlations. The hourly quantile values also show high correlations with elevation. The incorporation of these two covariates into the interpolation is therefore tested. As only positive interpolation weights for the quantile values assure a monotonically increasing distribution function, the use of geostatistical methods like kriging is problematic: employing kriging with external drift to incorporate secondary information is not applicable. Nonetheless, it would be fruitful to make use of covariates. To overcome this shortcoming, a new random mixing approach of spatial random fields is applied. Within the mixing process hourly quantile values are considered as equality constraints and correlations with elevation values are included as relationship constraints. To profit from the dependence of daily quantile values, distribution functions of daily gauges are used to set up lower-equal and greater-equal constraints at their locations. In this way the denser daily gauge network can be included in the interpolation of the hourly distribution functions. The

  2. Assimilation of Satellite Soil Moisture observation with the Particle Filter-Markov Chain Monte Carlo and Geostatistical Modeling

    NASA Astrophysics Data System (ADS)

    Moradkhani, Hamid; Yan, Hongxiang

    2016-04-01

    Soil moisture simulation and prediction are increasingly used to characterize agricultural droughts, but the process suffers from data scarcity and quality issues. Satellite soil moisture observations can be used to improve model predictions through data assimilation. Remote sensing products, however, are typically discontinuous in spatial-temporal coverage, while simulated soil moisture products are potentially biased due to errors in forcing data and parameters, and deficiencies of model physics. This study attempts to provide a detailed analysis of the joint and separate assimilation of streamflow and Advanced Scatterometer (ASCAT) surface soil moisture into a fully distributed hydrologic model, with the use of the recently developed particle filter-Markov chain Monte Carlo (PF-MCMC) method. A geostatistical model is introduced to overcome the satellite soil moisture discontinuity issue where satellite data do not cover the whole study region or are significantly biased, and the dominant land cover is dense vegetation. The results indicate that joint assimilation of soil moisture and streamflow has minimal effect in improving the streamflow prediction; however, the surface soil moisture field is significantly improved. The combination of DA and the geostatistical approach can further improve the surface soil moisture prediction.
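    The core particle-filter cycle that PF-MCMC extends (forecast, likelihood weighting, resampling; PF-MCMC adds a Metropolis move to fight sample degeneracy) can be sketched as follows. The state model, observation error and all numbers are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, obs, obs_std, forecast):
    """One bootstrap particle filter cycle: propagate the ensemble, weight each
    particle by the observation likelihood, then resample in proportion."""
    particles = forecast(particles)                        # model forecast
    w = np.exp(-0.5 * ((particles - obs) / obs_std)**2)    # Gaussian likelihood
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]                                  # resampled ensemble

# Toy soil-moisture-like state: the model ensemble sits near 0.20, the
# (hypothetical) satellite observation says 0.30 with small error
ens = rng.normal(0.20, 0.05, size=1000)
ens = particle_filter_step(ens, obs=0.30, obs_std=0.02,
                           forecast=lambda x: x + rng.normal(0, 0.01, x.size))
print(ens.mean())  # pulled from 0.20 toward the observation
```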

  3. Genetic Geostatistical Framework for Spatial Analysis of Fine-Scale Genetic Heterogeneity in Modern Populations: Results from the KORA Study.

    PubMed

    Diaz-Lacava, A N; Walier, M; Holler, D; Steffens, M; Gieger, C; Furlanello, C; Lamina, C; Wichmann, H E; Becker, T

    2015-01-01

    Aiming to investigate fine-scale patterns of genetic heterogeneity in modern humans from a geographic perspective, a genetic geostatistical approach framed within a geographic information system is presented. A sample collected for prospective studies in a small area of southern Germany was analyzed. No indication of genetic heterogeneity had been detected in previous analyses. Socio-demographic and genotypic data of German citizens were analyzed (212 SNPs; n = 728). Genetic heterogeneity was evaluated with observed heterozygosity (H_O). Best-fitting spatial autoregressive models were identified, using socio-demographic variables as covariates. Spatial analysis included surface interpolation and geostatistics of observed and predicted patterns. Prediction accuracy was quantified. Spatial autocorrelation was detected for both socio-demographic and genetic variables. Augsburg City and eastern suburban areas showed higher H_O values. The selected model gave best predictions in suburban areas. Fine-scale patterns of genetic heterogeneity were observed. In accordance with the literature, more urbanized areas showed higher levels of admixture. This approach showed efficacy for detecting and analyzing subtle patterns of genetic heterogeneity within small areas. It is scalable in the number of loci, even up to whole-genome analysis. It may be suggested that this approach is applicable to investigating the underlying genetic history that is, at least partially, embedded in geographic data. PMID:26258132
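    Observed heterozygosity, the summary statistic used above, is simply the per-locus fraction of individuals carrying two different alleles. A minimal sketch with invented genotypes:

```python
def observed_heterozygosity(genotypes):
    """H_O for one SNP: fraction of individuals whose two alleles differ."""
    het = sum(1 for a, b in genotypes if a != b)
    return het / len(genotypes)

# Allele pairs for one SNP across 8 hypothetical individuals
sample = [("A", "G"), ("A", "A"), ("G", "G"), ("A", "G"),
          ("A", "G"), ("G", "G"), ("A", "A"), ("A", "G")]
print(observed_heterozygosity(sample))  # 4 heterozygotes / 8 = 0.5
```

In the study this value would be computed per individual over all 212 SNPs and then interpolated as a spatial variable.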

  5. Determination of geostatistically representative sampling locations in Porsuk Dam Reservoir (Turkey)

    NASA Astrophysics Data System (ADS)

    Aksoy, A.; Yenilmez, F.; Duzgun, S.

    2013-12-01

    Several factors, such as wind action, bathymetry and shape of a lake/reservoir, inflows, outflows, and point and diffuse pollution sources, result in spatial and temporal variations in the water quality of lakes and reservoirs. The guides by the United Nations Environment Programme and the World Health Organization for designing and implementing water quality monitoring programs suggest that even a single monitoring station near the center or at the deepest part of a lake will be sufficient to observe long-term trends if there is good horizontal mixing. In stratified water bodies, several samples can be required. According to the guide of sampling and analysis under the Turkish Water Pollution Control Regulation, a minimum of five sampling locations should be employed to characterize the water quality in a reservoir or a lake. The European Union Water Framework Directive (2000/60/EC) calls for selecting a sufficient number of monitoring sites to assess the magnitude and impact of point and diffuse sources and hydromorphological pressures when designing a monitoring program. Although existing regulations and guidelines include frameworks for the determination of sampling locations in surface waters, most of them do not specify a procedure for establishing monitoring aims with representative sampling locations in lakes and reservoirs. In this study, geostatistical tools are used to determine representative sampling locations in the Porsuk Dam Reservoir (PDR). Kernel density estimation and kriging were used in combination to select the representative sampling locations. Dissolved oxygen and specific conductivity were measured at 81 points, sixteen of which were used for validation. In selecting the representative sampling locations, care was given to preserving the spatial structure in the distributions of the measured parameters, and a procedure was proposed for that purpose. Results indicated that the spatial structure was lost below 30 sampling points. This was as a result of varying water

  6. Adaptive Conditioning of Multiple-Point Geostatistical Facies Simulation to Flow Data with Facies Probability Maps

    NASA Astrophysics Data System (ADS)

    Khodabakhshi, M.; Jafarpour, B.

    2013-12-01

    Characterization of complex geologic patterns that create preferential flow paths in certain reservoir systems requires higher-order geostatistical modeling techniques. Multipoint statistics (MPS) provides a flexible grid-based approach for simulating such complex geologic patterns from a conceptual prior model known as a training image (TI). In this approach, a stationary TI that encodes the higher-order spatial statistics of the expected geologic patterns is used to represent the shape and connectivity of the underlying lithofacies. While MPS is quite powerful for describing complex geologic facies connectivity, the nonlinear and complex relation between the flow data and the facies distribution makes flow data conditioning quite challenging. We propose an adaptive technique for conditioning facies simulation from a prior TI to nonlinear flow data. Non-adaptive strategies for conditioning facies simulation to flow data can involve many forward flow model solutions, which can be computationally very demanding. To improve the conditioning efficiency, we develop an adaptive sampling approach with a data feedback mechanism based on the sampling history. In this approach, after a short burn-in period in which unconditional samples are generated and passed through an acceptance/rejection test, an ensemble of accepted samples is identified and used to generate a facies probability map. This facies probability map contains the common features of the accepted samples and provides conditioning information about facies occurrence in each grid block, which is used to guide the conditional facies simulation process. As the sampling progresses, the initial probability map is updated according to the collective information about the facies distribution in the chain of accepted samples, increasing the acceptance rate and efficiency of the conditioning. 
This conditioning process can be viewed as an optimization approach where each new sample is proposed based on the
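
    The probability-map feedback can be sketched with a toy acceptance/rejection loop; the grid size, facies proportion, and misfit function below are hypothetical stand-ins for a real forward flow simulation:

```python
import numpy as np

rng = np.random.default_rng(42)
nx, ny = 20, 20

def flow_mismatch(facies):
    # Stand-in for a forward flow simulation plus data misfit; here we
    # simply score the channel-facies fraction against an assumed target.
    return abs(facies.mean() - 0.3)

# Burn-in: generate unconditional binary facies realizations and
# accept/reject them on the (cheap stand-in) flow misfit.
accepted = []
while len(accepted) < 25:
    realization = (rng.random((nx, ny)) < 0.3).astype(float)
    if flow_mismatch(realization) < 0.02:
        accepted.append(realization)

# The probability map is the cell-wise facies frequency among accepted
# samples; it then soft-conditions subsequent facies simulations.
prob_map = np.mean(accepted, axis=0)
print(prob_map.shape)
```

    In the actual method, each accepted sample comes from an MPS simulation and the misfit from a flow solver; the map is refreshed as the chain of accepted samples grows.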

  7. Geostatistics applied to cross-well reflection seismic for imaging carbonate aquifers

    NASA Astrophysics Data System (ADS)

    Parra, Jorge; Emery, Xavier

    2013-05-01

    Cross-well seismic reflection data, acquired from a carbonate aquifer at Port Mayaca test site near the eastern boundary of Lake Okeechobee in Martin County, Florida, are used to delineate flow units in the region intercepted by two wells. The interwell impedance determined by inversion from the seismic reflection data allows us to visualize the major boundaries between the hydraulic units. The hydraulic (flow) unit properties are based on the integration of well logs and the carbonate structure, which consists of isolated vuggy carbonate units and interconnected vug systems within the carbonate matrix. The vuggy and matrix porosity logs based on Formation Micro-Imager (FMI) data provide information about highly permeable conduits at well locations. The integration of the inverted impedance and well logs using geostatistics helps us to assess the resolution of the cross-well seismic method for detecting conduits and to determine whether these conduits are continuous or discontinuous between wells. A productive water zone of the aquifer outlined by the well logs was selected for analysis and interpretation. The ELAN (Elemental Log Analysis) porosity from two wells was selected as primary data and the reflection seismic-based impedance as secondary data. The direct and cross variograms along the vertical wells capture nested structures associated with periodic carbonate units, which correspond to connected flow units between the wells. Alternatively, the horizontal variogram of impedance (secondary data) provides scale lengths that correspond to irregular boundary shapes of flow units. The ELAN porosity image obtained by cokriging exhibits three similar flow units at different depths. These units are thin conduits developed in the first well and, at about the middle of the interwell separation region, these conduits connect to thicker flow units that are intercepted by the second well. 
In addition, a high impedance zone (low porosity) at a depth of about 275 m, after
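
    The nested, periodic carbonate structures mentioned above appear in the variogram as a "hole effect". A minimal sketch with a synthetic porosity log (the data and period are invented for illustration):

```python
import numpy as np

def empirical_variogram(z, max_lag):
    """Empirical semivariogram of a regularly sampled 1-D log (e.g. an
    ELAN porosity log): gamma(h) = mean((z[i+h] - z[i])**2) / 2."""
    return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2)
                     for h in range(1, max_lag + 1)])

# Synthetic periodic "porosity log" standing in for cyclic carbonate
# units; period of 25 samples plus small noise.
depth = np.arange(200)
z = np.sin(2 * np.pi * depth / 25) + 0.1 * np.random.default_rng(1).normal(size=200)

gamma = empirical_variogram(z, max_lag=50)
# A periodic signal produces a hole-effect variogram: gamma dips again
# near the period (h ~ 25) instead of rising monotonically.
print(round(float(gamma[12]), 3), round(float(gamma[24]), 3))
```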

  8. Geostatistical modelling of soil-transmitted helminth infection in Cambodia: do socioeconomic factors improve predictions?

    PubMed

    Karagiannis-Voules, Dimitrios-Alexios; Odermatt, Peter; Biedermann, Patricia; Khieu, Virak; Schär, Fabian; Muth, Sinuon; Utzinger, Jürg; Vounatsou, Penelope

    2015-01-01

    Soil-transmitted helminth infections are intimately connected with poverty. Yet socioeconomic proxies are rarely used in spatially explicit risk profiling. We compiled household-level socioeconomic data pertaining to sanitation, drinking water, education and nutrition from readily available Demographic and Health Surveys, Multiple Indicator Cluster Surveys and World Health Surveys for Cambodia and aggregated the data at village level. We conducted a systematic review to identify parasitological surveys and made every effort possible to extract, georeference and upload the data into the open-source Global Neglected Tropical Diseases database. Bayesian geostatistical models were employed to spatially align the village-aggregated socioeconomic predictors with the soil-transmitted helminth infection data. The risk of soil-transmitted helminth infection was predicted on a 1×1 km grid covering Cambodia. Additionally, two separate individual-level spatial analyses were carried out, for Takeo and Preah Vihear provinces, to assess and quantify the association between soil-transmitted helminth infection and socioeconomic indicators at an individual level. Overall, we obtained socioeconomic proxies from 1624 locations across the country. Surveys focussing on soil-transmitted helminth infections were extracted from 16 sources reporting data from 238 unique locations. We found that the risk of soil-transmitted helminth infection from 2000 onwards was considerably lower than in surveys conducted earlier. Population-adjusted prevalences for school-aged children from 2000 onwards were 28.7% for hookworm, 1.5% for Ascaris lumbricoides and 0.9% for Trichuris trichiura. Surprisingly, in the country-wide analyses, we did not find any significant association between soil-transmitted helminth infection and village-aggregated socioeconomic proxies. Based also on the individual-level analyses, we conclude that socioeconomic proxies might not be good predictors at an

  9. Estimation of water table level and nitrate pollution based on geostatistical and multiple mass transport models

    NASA Astrophysics Data System (ADS)

    Matiatos, Ioannis; Varouhakis, Emmanouil A.; Papadopoulou, Maria P.

    2015-04-01

    level and nitrate concentrations were produced and compared with those obtained from groundwater and mass transport numerical models. Preliminary results showed similar efficiency of the spatiotemporal geostatistical method and the numerical models; however, the data requirements of the former were significantly lower. The advantages and disadvantages of each method's performance were analysed and discussed, indicating the characteristics of the different approaches.

  10. Multivariate and geostatistical analyses of metals in urban soil of Weinan industrial areas, Northwest of China

    NASA Astrophysics Data System (ADS)

    Li, Xiaoping; Feng, Linna

    2012-02-01

    An investigation of the spatial distribution, possible pollution sources, and affecting factors of heavy metals in the urban soils of Weinan (China) was conducted using geographic information system (GIS) techniques and multivariate statistics. The concentrations of the 10 heavy metals (Pb, Cr, Ba, Zn, V, Mn, Co, Cu, Ni and As) measured by wavelength dispersive X-ray fluorescence spectrometry (WDXRF) in the urban soil of Weinan were: As, 2.60-11.50 mg kg-1; Co, 8.00-14.50 mg kg-1; Cr, 74.90-157.00 mg kg-1; Cu, 14.60-34.70 mg kg-1; Ba, 413.00-1137.60 mg kg-1; Mn, 431.30-653.70 mg kg-1; Ni, 20.80-35.80 mg kg-1; Pb, 19.00-89.50 mg kg-1; V, 63.80-89.50 mg kg-1; and Zn, 44.50-196.80 mg kg-1. The mean enrichment factors (EFs) decreased in the order Pb > Cr > Ba > Cu > Zn > Ni > V > Mn > Co > As, indicating that Ba, Cu, Pb, Cr and Zn were significantly enriched in the study areas. Spatial distribution maps of heavy metal contents, based on geostatistical analysis and GIS mapping, indicated that Pb and Cr had similar patterns of spatial distribution, as did Ba, Cu and Zn, and likewise As, Co, Mn and V. Multivariate statistical analysis (correlation analysis, principal component analysis, and clustering analysis) showed distinctly different associations among the studied metals, suggesting that Ba, Cu, Pb, Cr and Zn were associated with anthropogenic activities (industrial sources, combined with coal combustion and traffic), whereas Mn, V, Co, As and Ni in the study area were mainly controlled by parent materials and therefore had natural sources. A comprehensive environmental management strategy should be considered by the local government to address soil pollution in urban areas.
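
    Enrichment factors of this kind are computed by double-normalising to a conservative reference element; the background values and the choice of Mn as reference below are illustrative assumptions, not the study's:

```python
# Hypothetical soil concentrations (mg/kg) and local background values;
# Mn is used here as the conservative reference element (a common choice,
# not necessarily the one used in the study).
sample = {"Pb": 89.5, "Cr": 157.0, "Zn": 196.8, "Mn": 653.7}
background = {"Pb": 21.4, "Cr": 61.0, "Zn": 69.4, "Mn": 557.0}

def enrichment_factor(metal, ref="Mn"):
    # EF = (C_metal/C_ref)_sample / (C_metal/C_ref)_background;
    # EF well above ~2 is commonly read as anthropogenic enrichment.
    return (sample[metal] / sample[ref]) / (background[metal] / background[ref])

for m in ("Pb", "Cr", "Zn"):
    print(m, round(enrichment_factor(m), 2))
```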

  11. Including stratigraphic hierarchy information in geostatistical simulation: a demonstration study on analogs of alluvial sediments

    NASA Astrophysics Data System (ADS)

    Comunian, Alessandro; Felletti, Fabrizio; Giacobbo, Francesca; Giudici, Mauro; Bersezio, Riccardo

    2015-04-01

    When building a geostatistical model of the hydrofacies distribution in a volume block it is important to include all the relevant available information. Localised information about the observed hydrofacies (hard data) are routinely included in the simulation procedures. Non stationarities in the hydrofacies distribution can be handled by considering auxiliary (soft) data extracted, for example, from the results of geophysical surveys. This piece of information can be included as auxiliary data both in variogram based methods (i.e. co-Kriging) and in multiple-point statistics (MPS) methods. The latter methods allow to formalise some soft knowledge about the considered model of heterogeneity using a training image. However, including information related to the stratigraphic hierarchy in the training image is rarely straightforward. In this work, a methodology to include the information about the stratigraphic hierarchy in the simulation process is formalised and implemented in a MPS framework. The methodology is applied and tested by reconstructing two model blocks of alluvial sediments with an approximate volume of few cubic meters. The external faces of the blocks, exposed in a quarry, were thoroughly mapped and their stratigraphic hierarchy was interpreted in a previous study. The bi-dimensional (2D) maps extracted from the faces, which are used as training images and as hard data, present a vertical trend and complex stratigraphic architectures. The training images and the conditioning data are classified according to the proposed stratigraphic hierarchy, and the hydrofacies codes are grouped to allow a sequence of interleaved binary MPS simulation. Every step of the simulation sequence corresponds to a group of hydrofacies defined in the stratigraphic hierarchy. The blocks simulated with the proposed methodology are compared with blocks simulated with a standard MPS approach. 
The comparisons are performed on many realisations using connectivity indicators and

  12. Is random access memory random?

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    Most software is constructed on the assumption that the programs and data are stored in random access memory (RAM). Physical limitations on the relative speeds of processor and memory elements lead to a variety of memory organizations that match the processor addressing rate with the memory service rate. These include interleaved and cached memory. A very high fraction of a processor's address requests can be satisfied from the cache without reference to the main memory. The cache requests information from main memory in blocks that can be transferred at the full memory speed. Programmers who organize algorithms for locality can realize the highest performance from these computers.
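
    The locality effect is easy to observe: the two loops below do identical arithmetic, but the row-order traversal matches the array's memory layout while the column-order one strides across it (timings vary by machine, so only the equality of the results is guaranteed):

```python
import time
import numpy as np

a = np.ones((2000, 2000))  # C (row-major) layout: rows are contiguous

t0 = time.perf_counter()
s_rows = sum(float(a[i, :].sum()) for i in range(2000))  # cache-friendly
t_rows = time.perf_counter() - t0

t0 = time.perf_counter()
s_cols = sum(float(a[:, j].sum()) for j in range(2000))  # strided access
t_cols = time.perf_counter() - t0

print(s_rows == s_cols)                     # identical results
print(round(t_rows, 4), round(t_cols, 4))   # column order is typically slower
```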

  13. Digital random-number generator

    NASA Technical Reports Server (NTRS)

    Brocker, D. H.

    1973-01-01

    For a binary digit array of N bits, N noise sources feed N nonlinear operators; each flip-flop in the digit array is set by its nonlinear operator to reflect whether the amplitude of the generator that feeds it is above or below the mean value of the generated noise. The fixed-point uniform-distribution method can also be used to generate random numbers with distributions other than uniform.
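
    A software analogue of the scheme, with Gaussian samples standing in for the N hardware noise sources (the word width and distributions are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)

def random_word(n_bits=16):
    # One "noise source" amplitude per bit position; a bit is set when
    # its amplitude exceeds the noise mean (0), mimicking the flip-flop
    # threshold scheme described above.
    amplitudes = rng.normal(loc=0.0, scale=1.0, size=n_bits)
    bits = (amplitudes > 0.0).astype(int)
    return int("".join(map(str, bits)), 2)

# Uniform words can then be mapped to other distributions, e.g. an
# exponential variate via the inverse-CDF method.
u = (random_word() + 0.5) / 2**16
exp_variate = -np.log(1.0 - u)
print(exp_variate > 0.0)
```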

  14. Quasi-Random Sequence Generators.

    Energy Science and Technology Software Center (ESTSC)

    1994-03-01

    Version 00 of LPTAU generates quasi-random sequences. The sequences are uniformly distributed sets of L = 2**30 points in the N-dimensional unit cube I**N = [0,1]**N. The sequences are used as nodes for multidimensional integration, as search points in global optimization, as trial points in multicriteria decision making, and as quasi-random points for quasi Monte Carlo algorithms.
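
    LPTAU itself produces Sobol points; a minimal example of the same idea (a low-discrepancy point set used as quasi Monte Carlo nodes) is the Halton sequence, implemented from scratch here for illustration:

```python
def van_der_corput(n, base=2):
    """n-th element of the van der Corput sequence: reflect the base-b
    digits of n about the radix point."""
    q, denom = 0.0, 1.0
    while n:
        n, digit = divmod(n, base)
        denom *= base
        q += digit / denom
    return q

def halton(n_points, dims, bases=(2, 3, 5, 7, 11)):
    """First n_points of the Halton sequence in [0,1]**dims -- a classic
    quasi-random (low-discrepancy) set, in the same family as Sobol points."""
    return [[van_der_corput(i, bases[d]) for d in range(dims)]
            for i in range(1, n_points + 1)]

# Quasi Monte Carlo estimate of the unit-square integral of x*y (= 0.25):
# low-discrepancy nodes converge faster than plain random sampling.
pts = halton(2048, 2)
estimate = sum(x * y for x, y in pts) / len(pts)
print(round(estimate, 3))
```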

  15. Random sequential adsorption on fractals

    NASA Astrophysics Data System (ADS)

    Ciesla, Michal; Barbasz, Jakub

    2012-07-01

    Irreversible adsorption of spheres on flat collectors having dimension d < 2 is studied. Molecules are adsorbed on the Sierpinski triangle and carpet-like fractals (1 < d < 2), and on a general Cantor set (d < 1). The adsorption process is modeled numerically using the random sequential adsorption (RSA) algorithm. The paper concentrates on measuring fundamental properties of the coverages, i.e., the maximal random coverage ratio and the density autocorrelation function, as well as the RSA kinetics. The results obtained allow us to improve the phenomenological relation between the maximal random coverage ratio and the collector dimension. Moreover, the simulations show that, in general, most of the known dimensional properties of adsorbed monolayers are valid for non-integer dimensions.
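
    The RSA algorithm itself is simple; below is a sketch for equal disks on a flat (d = 2) unit square, ignoring edge effects. The parameters are illustrative, and the fractal collectors of the paper would replace the uniform proposal domain:

```python
import numpy as np

def rsa_disks(radius=0.02, attempts=5000, seed=3):
    """Random sequential adsorption: propose uniform positions, accept a
    disk only if it overlaps no previously adsorbed disk, never remove."""
    rng = np.random.default_rng(seed)
    centers = np.empty((0, 2))
    for _ in range(attempts):
        p = rng.random(2)
        if centers.size == 0 or np.linalg.norm(centers - p, axis=1).min() >= 2 * radius:
            centers = np.vstack([centers, p])
    return centers

centers = rsa_disks()
coverage = len(centers) * np.pi * 0.02**2  # fraction of the square covered
# The d = 2 jamming limit for disks is ~0.547; a finite run stays below it.
print(len(centers), round(coverage, 3))
```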

  16. Random sequential adsorption on fractals.

    PubMed

    Ciesla, Michal; Barbasz, Jakub

    2012-07-28

    Irreversible adsorption of spheres on flat collectors having dimension d < 2 is studied. Molecules are adsorbed on the Sierpinski triangle and carpet-like fractals (1 < d < 2), and on a general Cantor set (d < 1). The adsorption process is modeled numerically using the random sequential adsorption (RSA) algorithm. The paper concentrates on measuring fundamental properties of the coverages, i.e., the maximal random coverage ratio and the density autocorrelation function, as well as the RSA kinetics. The results obtained allow us to improve the phenomenological relation between the maximal random coverage ratio and the collector dimension. Moreover, the simulations show that, in general, most of the known dimensional properties of adsorbed monolayers are valid for non-integer dimensions. PMID:22852643

  17. Geostatistical uncertainty of assessing air quality using high-spatial-resolution lichen data: A health study in the urban area of Sines, Portugal.

    PubMed

    Ribeiro, Manuel C; Pinho, P; Branquinho, C; Llop, Esteve; Pereira, Maria J

    2016-08-15

    In most studies correlating health outcomes with air pollution, personal exposure assignments are based on measurements collected at air-quality monitoring stations that do not coincide with the health data locations. In such cases, interpolators are needed to predict air quality in unsampled locations and to assign personal exposures. Moreover, a measure of the spatial uncertainty of exposures should be incorporated, especially in urban areas where concentrations vary over short distances due to changes in land use and pollution intensity. These studies are limited by the lack of literature comparing exposure uncertainty derived from distinct spatial interpolators. Here, we addressed these issues with two interpolation methods: regression kriging (RK) and ordinary kriging (OK). These methods were used to generate air-quality simulations with a geostatistical algorithm. For each method, the geostatistical uncertainty was drawn from generalized linear model (GLM) analysis. We analyzed the association between air quality and birth weight. Personal health data (n=227) and exposure data were collected in Sines (Portugal) during 2007-2010. Because the air-quality monitoring stations in the city do not offer high-spatial-resolution measurements (n=1), we used lichen data as an ecological indicator of air quality (n=83). We found no significant difference in the fit of the GLMs with either geostatistical method. With RK, however, the models tended to fit better more often and worse less often than with OK. Moreover, the geostatistical uncertainty results showed a marginally higher mean and precision with RK. Combined with lichen data and land-use data of high spatial resolution, RK is a more effective geostatistical method for relating health outcomes to air quality in urban areas. This is particularly important in small cities, which generally do not have expensive air-quality monitoring stations with high spatial resolution. Further, alternative ways of linking human activities with their
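
    For reference, the ordinary kriging predictor used as one of the two interpolators solves a small linear system per prediction point. A self-contained sketch with a spherical variogram (the variogram model and toy data are assumptions, not those fitted in the study):

```python
import numpy as np

def ok_predict(xy, z, x0, sill=1.0, rng_a=50.0, nugget=0.0):
    """Ordinary kriging at a single point x0 with a spherical variogram."""
    def gamma(h):
        h = np.asarray(h, dtype=float)
        g = nugget + sill * (1.5 * h / rng_a - 0.5 * (h / rng_a) ** 3)
        return np.where(h >= rng_a, nugget + sill, g)

    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    # Ordinary kriging system with a Lagrange multiplier enforcing
    # unbiasedness (weights sum to 1).
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = gamma(np.linalg.norm(xy - x0, axis=1))
    w = np.linalg.solve(A, b)
    return w[:n] @ z

xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
z = np.array([1.0, 2.0, 3.0, 4.0])
print(round(ok_predict(xy, z, np.array([5.0, 5.0])), 2))  # symmetric -> 2.5
```

    Regression kriging differs only in first fitting a trend on covariates (here, land-use and lichen-derived predictors) and kriging the residuals.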

  18. Geostatistical modelling of the malaria risk in Mozambique: effect of the spatial resolution when using remotely-sensed imagery.

    PubMed

    Giardina, Federica; Franke, Jonas; Vounatsou, Penelope

    2015-01-01

    The study of malaria spatial epidemiology has benefited from recent advances in geographic information systems and geostatistical modelling. Significant progress in earth observation technologies has led to the development of moderate, high and very high resolution imagery. Extensive literature exists on the relationship between malaria and environmental/climatic factors in different geographical areas, but few studies have linked human malaria parasitemia survey data with remote sensing-derived land cover/land use variables and very few have used Earth Observation products. A comparison among the different resolution products for modelling parasitemia has not yet been carried out. In this study, we probe a proximity measure to incorporate different land cover classes and assess the effect of the spatial resolution of remotely sensed land cover and elevation on malaria risk estimation in Mozambique after adjusting for other environmental factors at a fixed spatial resolution. We used data from the Demographic and Health Survey carried out in 2011, which collected malaria parasitemia data on children from 0 to 5 years old, analysing them with a Bayesian geostatistical model. We compared the risk predicted using land cover and elevation at moderate resolution with the risk obtained employing the same variables at high resolution. We used elevation data at moderate and high resolution and the land cover layer from the Moderate Resolution Imaging Spectroradiometer as well as the one produced by MALAREO, a project covering part of Mozambique during 2010-2012 that was funded by the European Union's 7th Framework Program. Moreover, the number of infected children was predicted at different spatial resolutions using AFRIPOP population data and the enhanced population data generated by the MALAREO project for comparison of estimates. The Bayesian geostatistical model showed that the main determinants of malaria presence are precipitation and day temperature. 
However, the presence

  19. Detection of Local Anomalies in High Resolution Hyperspectral Imagery Using Geostatistical Filtering and Local Spatial Statistics

    NASA Astrophysics Data System (ADS)

    Goovaerts, P.; Jacquez, G. M.; Marcus, A. W.

    2004-12-01

    Spatial data are periodically collected and processed to monitor, analyze and interpret developments in our changing environment. Remote sensing is a modern way of collecting data and has seen enormous growth since the launch of modern satellites and the development of airborne sensors. In particular, the recent availability of high-spatial-resolution hyperspectral imagery (spatial resolution of less than 5 meters, with data collected over 64 or more bands of electromagnetic radiation for each pixel) offers great potential to significantly enhance environmental mapping and our ability to model spatial systems. High-spatial-resolution imagery contains a remarkable quantity of information that could be used to analyze spatial breaks (boundaries), areas of similarity (clusters), and spatial autocorrelation (associations) across the landscape. This paper addresses the specific issue of soil disturbance detection, which could indicate the presence of land mines or recent movements of troops and heavy equipment. A challenge presented by soil detection is to retain the measurement of fine-scale features (i.e. mineral soil changes, organic content changes, vegetation disturbance related changes, aspect changes) while still covering proportionally large spatial areas. An additional difficulty is that no ground data might be available for the calibration of spectral signatures, and little might be known about the size of the patches of disturbed soil to be detected. This paper describes a new technique for automatic target detection which capitalizes on both spatial correlation and correlation across spectral bands, and which does not require any a priori information on the target spectral signature but does not allow discrimination between targets. 
This approach involves successively a multivariate statistical analysis (principal component analysis) of all spectral bands, a geostatistical filtering of noise and regional background in the first principal components using factorial kriging, and
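
    The first step, PCA across spectral bands, can be sketched with an SVD on a synthetic cube; the scene size, band count, and noise level below are invented, and the subsequent factorial-kriging filter is not shown:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical hyperspectral scene: 64 bands over a 30x30-pixel area,
# flattened to (pixels, bands) for PCA across spectral bands. The true
# signal is low-rank (3 "materials") plus small sensor noise.
n_pix, n_bands = 900, 64
signal = rng.normal(size=(n_pix, 3)) @ rng.normal(size=(3, n_bands))
cube = signal + 0.05 * rng.normal(size=(n_pix, n_bands))

# PCA: center, then SVD; principal components are the projections on the
# right singular vectors. The leading PCs would then be filtered
# geostatistically (factorial kriging) to remove noise and background.
X = cube - cube.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
explained = s**2 / np.sum(s**2)
pcs = X @ Vt.T

print(pcs.shape)  # (900, 64); the first 3 PCs carry most of the variance
```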

  20. Building a geological reference platform using sequence stratigraphy combined with geostatistical tools

    NASA Astrophysics Data System (ADS)

    Bourgine, Bernard; Lasseur, Éric; Leynet, Aurélien; Badinier, Guillaume; Ortega, Carole; Issautier, Benoit; Bouchet, Valentin

    2015-04-01

    In 2012 BRGM launched an extensive program to build the new French Geological Reference platform (RGF). Among the objectives of this program is to provide the public with validated, reliable and 3D-consistent geological data, with an estimation of uncertainty. Approximately 100,000 boreholes over the whole French national territory have a preliminary interpretation in terms of the depths of the main geological interfaces, but with an unchecked, unknown and often low reliability. The aim of this paper is to present the procedure that has been tested on two areas in France in order to validate (or not) these boreholes, with the aim of being generalized as much as possible to the nearly 100,000 boreholes awaiting validation. The approach is based on the following steps, and includes the management of uncertainty at each of them: (a) Selection of a loose network of boreholes with logging or coring information enabling a reliable interpretation. This first interpretation is based on the correlation of well-log data and allows a 3D sequence stratigraphic framework identifying isochronous surfaces to be defined. A litho-stratigraphic interpretation is also performed. Let "A" be the collection of all boreholes used for this step (typically 3% of the total number of holes to be validated) and "B" the other boreholes to validate. (b) Geostatistical analysis of characteristic geological interfaces. The analysis is carried out first on the "A" data (to validate the variogram model), then on the "B" data, and finally on "B" given "A". It is based on cross-validation tests and evaluation of the uncertainty associated with each geological interface. In this step, we take into account inequality constraints provided by boreholes that do not intersect all interfaces, as well as the "litho-stratigraphic pile" defining the formations and their relationships (depositional surfaces or erosion). The goal is to identify quickly and semi-automatically potential errors among the data, up to

  1. IN SITU NON-INVASIVE SOIL CARBON ANALYSIS: SAMPLE SIZE AND GEOSTATISTICAL CONSIDERATIONS.

    SciTech Connect

    WIELOPOLSKI, L.

    2005-04-01

    I discuss a new approach for quantitative carbon analysis in soil based on inelastic neutron scattering (INS). Although this INS method is not simple, it offers critical advantages not available with other newly emerging modalities. The key advantages of the INS system include the following: (1) It is a non-destructive method, i.e., no samples of any kind are taken. A neutron generator placed above the ground irradiates the soil, stimulating carbon characteristic gamma-ray emission that is counted by a detection system also placed above the ground. (2) The INS system can undertake multielemental analysis, expanding its usefulness. (3) It can be used in either static or scanning modes. (4) The volume sampled by the INS method is large, with a large footprint; when operating in a scanning mode, the sampled volume is continuous. (5) Except for a moderate initial cost of about $100,000 for the system, no additional expenses are required for its operation over two to three years, after which the neutron generator has to be replenished with a new tube at an approximate cost of $10,000, regardless of the number of sites analyzed. In light of these characteristics, the INS system appears invaluable for monitoring changes in the carbon content in the field. For this purpose no calibration is required; by establishing a carbon index, changes in carbon yield can be followed over time in exactly the same location, thus giving a percent change. On the other hand, with calibration, it can be used to determine the carbon stock in the ground, thus estimating the soil's carbon inventory. However, this requires revising the standard practices for deciding upon the number of sites required to attain a given confidence level, in particular for the purposes of upward scaling. 
Geostatistical considerations should then be incorporated to properly account for the averaging effects of the large volumes sampled by the INS system, which would require revising standard practices in the field for determining the number of spots to be

  2. A novel geotechnical/geostatistical approach for exploration and production of natural gas from multiple geologic strata, Phase 1

    SciTech Connect

    Overbey, W.K. Jr.; Reeves, T.K.; Salamy, S.P.; Locke, C.D.; Johnson, H.R.; Brunk, R.; Hawkins, L. )

    1991-05-01

    This research program has been designed to develop and verify a unique geostatistical approach for finding natural gas resources. The project has been conducted by Beckley College, Inc., and BDM Engineering Services Company (BDMESC) under contract to the US Department of Energy (DOE), Morgantown Energy Technology Center (METC). This section, Volume II, contains a detailed discussion of the methodology used and the geological and production information collected and analyzed for this study. A companion document, Volume I, provides an overview of the program, technique and results of the study. In combination, Volumes I and II cover the completion of the research undertaken under Phase I of this DOE project, which included the identification of five high-potential sites for natural gas production on the Eccles Quadrangle, Raleigh County, West Virginia. Each of these sites was selected for its excellent potential for gas production from both relatively shallow coalbeds and the deeper, conventional reservoir formations.

  3. Random distributed feedback fibre lasers

    NASA Astrophysics Data System (ADS)

    Turitsyn, Sergei K.; Babin, Sergey A.; Churkin, Dmitry V.; Vatnik, Ilya D.; Nikulin, Maxim; Podivilov, Evgenii V.

    2014-09-01

    of a stationary near-Gaussian beam with a narrow spectrum. A random distributed feedback fibre laser has efficiency and performance that are comparable to and even exceed those of similar conventional fibre lasers. The key features of the generated radiation of random distributed feedback fibre lasers include: a stationary narrow-band continuous modeless spectrum that is free of mode competition, nonlinear power broadening, and an output beam with a Gaussian profile in the fundamental transverse mode (generated both in single mode and multi-mode fibres). This review presents the current status of research in the field of random fibre lasers and shows their potential and perspectives. We start with an introductory overview of conventional distributed feedback lasers and traditional random lasers to set the stage for discussion of random fibre lasers. We then present a theoretical analysis and experimental studies of various random fibre laser configurations, including widely tunable, multi-wavelength, narrow-band generation, and random fibre lasers operating in different spectral bands in the 1-1.6 μm range. Then we discuss existing and future applications of random fibre lasers, including telecommunication and distributed long reach sensor systems. A theoretical description of random lasers is very challenging and is strongly linked with the theory of disordered systems and kinetic theory. We outline two key models governing the generation of random fibre lasers: the average power balance model and the nonlinear Schrödinger equation based model. Recently invented random distributed feedback fibre lasers represent a new and exciting field of research that brings together such diverse areas of science as laser physics, the theory of disordered systems, fibre optics and nonlinear science. Stable random generation in optical fibre opens up new possibilities for research on wave transport and localization in disordered media. We hope that this review will provide

  4. Interannual Changes in Biomass Affect the Spatial Aggregations of Anchovy and Sardine as Evidenced by Geostatistical and Spatial Indicators.

    PubMed

    Barra, Marco; Petitgas, Pierre; Bonanno, Angelo; Somarakis, Stylianos; Woillez, Mathieu; Machias, Athanasios; Mazzola, Salvatore; Basilone, Gualtiero; Giannoulaki, Marianna

    2015-01-01

    Geostatistical techniques were applied and a series of spatial indicators were calculated (occupation, aggregation, location, dispersion, spatial autocorrelation and overlap) to characterize the spatial distributions of European anchovy and sardine during summer. Two ecosystems were compared for this purpose, both located in the Mediterranean Sea: the Strait of Sicily (an upwelling area) and the North Aegean Sea (a continental shelf area influenced by freshwater). Although the biomass of anchovy and sardine presented high interannual variability in both areas, the locations of the centres of gravity and the main spatial patches of their populations were very similar between years. The size of the patches representing the dominant part of the abundance (80%) was mostly ecosystem- and species-specific. Occupation (area of presence) appears to be shaped by the extent of suitable habitats in each ecosystem, whereas aggregation patterns (how the populations are distributed within the area of presence) were species-specific and related to levels of population biomass. In the upwelling area, both species showed consistently higher occupation values compared to the continental shelf area. Certain characteristics of the spatial distribution of sardine (e.g. spreading area, overlap with anchovy) differed substantially between the two ecosystems. Principal component analysis of the geostatistical and spatial indicators revealed that biomass was significantly related to a suite of spatial indicators rather than to any single one. At the spatial scale of our study, strong correlations emerged between biomass and the first principal component axis, with highly positive loadings for occupation, aggregation and patchiness, independently of species and ecosystem. Overlap between anchovy and sardine increased with increasing sardine biomass but decreased with increasing anchovy biomass. 
This contrasting pattern was attributed to the location of the respective major patches combined with the

  5. An integrated assessment of seawater intrusion in a small tropical island using geophysical, geochemical, and geostatistical techniques.

    PubMed

    Kura, Nura Umar; Ramli, Mohammad Firuz; Ibrahim, Shaharin; Sulaiman, Wan Nur Azmin; Aris, Ahmad Zaharin

    2014-01-01

    In this study, geophysics, geochemistry, and geostatistical techniques were integrated to assess seawater intrusion in Kapas Island due to its geological complexity and multiple contamination sources. Five resistivity profiles were measured using an electric resistivity technique. The results reveal very low resistivity (<1 Ωm), suggesting either marine clay deposits or seawater intrusion, or both, along the majority of the resistivity images. Geochemistry was therefore employed to verify the resistivity evidence. The Chadha and Stiff diagrams classify the island groundwater into Ca-HCO3, Ca-Na-HCO3, Na-HCO3, and Na-Cl water types, with Ca-HCO3 dominant. The Mg(2+)/(Mg(2+)+Ca(2+)), HCO3(-)/anion, Cl(-)/HCO3(-), Na(+)/Cl(-), and SO4(2-)/Cl(-) ratios show that some sampling sites are affected by seawater intrusion; these sites fall within the same areas that show low resistivity values. The resulting ratios and resistivity values were then used in a geographical information system (GIS) environment to create geostatistical maps of the individual indicators. These maps were overlaid to create the final map showing seawater-affected areas, which successfully delineates the area actually undergoing seawater intrusion. The proposed technique is not area specific and can therefore be applied in any place with similarly complex characteristics, or under the influence of multiple contaminants, to distinguish the areas truly affected by the targeted pollutants from the rest. This information provides managers and policy makers with knowledge of the current situation and can serve as a guide and standard in water research for sustainable management planning. PMID:24532282
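    The ionic-ratio screening described above can be sketched in code. This is a hedged illustration only: the thresholds follow the classic Revelle Cl-/HCO3- criterion, and the wells and concentrations are hypothetical, not data from the Kapas Island study.

```python
# Illustrative sketch: flagging possible seawater intrusion from the
# Cl-/HCO3- ratio (Revelle criterion). Sample data are hypothetical.

def revelle_class(cl_meq, hco3_meq):
    """Classify a groundwater sample by its Cl-/HCO3- ratio (meq/l)."""
    r = cl_meq / hco3_meq
    if r < 0.5:
        return "unaffected"
    elif r <= 6.6:
        return "slightly/moderately affected"
    return "strongly affected"

# Hypothetical wells: (Cl- meq/l, HCO3- meq/l)
samples = {"W1": (0.8, 4.0), "W2": (3.5, 1.2), "W3": (15.0, 1.0)}
for well, (cl, hco3) in samples.items():
    print(well, revelle_class(cl, hco3))
```

Classified values like these, one per indicator ratio, are what would then be interpolated and overlaid in a GIS to outline the affected area.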

  6. Interannual Changes in Biomass Affect the Spatial Aggregations of Anchovy and Sardine as Evidenced by Geostatistical and Spatial Indicators

    PubMed Central

    Barra, Marco; Petitgas, Pierre; Bonanno, Angelo; Somarakis, Stylianos; Woillez, Mathieu; Machias, Athanasios; Mazzola, Salvatore; Basilone, Gualtiero; Giannoulaki, Marianna

    2015-01-01

    Geostatistical techniques were applied and a series of spatial indicators were calculated (occupation, aggregation, location, dispersion, spatial autocorrelation and overlap) to characterize the spatial distributions of European anchovy and sardine during summer. Two ecosystems were compared for this purpose, both located in the Mediterranean Sea: the Strait of Sicily (upwelling area) and the North Aegean Sea (continental shelf area, influenced by freshwater). Although the biomass of anchovy and sardine presented high interannual variability in both areas, the location of the centres of gravity and the main spatial patches of their populations were very similar between years. The size of the patches representing the dominant part of the abundance (80%) was mostly ecosystem- and species-specific. Occupation (area of presence) appears to be shaped by the extent of suitable habitats in each ecosystem whereas aggregation patterns (how the populations are distributed within the area of presence) were species-specific and related to levels of population biomass. In the upwelling area, both species showed consistently higher occupation values compared to the continental shelf area. Certain characteristics of the spatial distribution of sardine (e.g. spreading area, overlapping with anchovy) differed substantially between the two ecosystems. Principal component analysis of geostatistical and spatial indicators revealed that biomass was significantly related to a suite of spatial indicators rather than to any single one. At the spatial scale of our study, strong correlations emerged between biomass and the first principal component axis with highly positive loadings for occupation, aggregation and patchiness, independently of species and ecosystem. Overlap between anchovy and sardine increased with increasing sardine biomass but decreased with increasing anchovy biomass.
This contrasting pattern was attributed to the location of the respective major patches combined with the

  7. Characterization of ambient air pollution measurement error in a time-series health study using a geostatistical simulation approach

    NASA Astrophysics Data System (ADS)

    Goldman, Gretchen T.; Mulholland, James A.; Russell, Armistead G.; Gass, Katherine; Strickland, Matthew J.; Tolbert, Paige E.

    2012-09-01

    In recent years, geostatistical modeling has been used to inform air pollution health studies. In this study, distributions of daily ambient concentrations were modeled over space and time for 12 air pollutants. Simulated pollutant fields were produced for a 6-year time period over the 20-county metropolitan Atlanta area using the Stanford Geostatistical Modeling Software (SGeMS). These simulations incorporate the temporal and spatial autocorrelation structure of ambient pollutants, as well as season and day-of-week temporal and spatial trends; these fields were considered to be the true ambient pollutant fields for the purposes of the simulations that followed. Simulated monitor data at the locations of actual monitors were then generated that contain error representative of instrument imprecision. From the simulated monitor data, four exposure metrics were calculated: central monitor and unweighted, population-weighted, and area-weighted averages. For each metric, the amount and type of error relative to the simulated pollutant fields are characterized and the impact of error on an epidemiologic time-series analysis is predicted. The amount of error, as indicated by a lack of spatial autocorrelation, is greater for primary pollutants than for secondary pollutants and is only moderately reduced by averaging across monitors; more error will result in less statistical power in the epidemiologic analysis. The type of error, as indicated by the correlations of error with the monitor data and with the true ambient concentration, varies with exposure metric, with error in the central monitor metric more of the classical type (i.e., independent of the monitor data) and error in the spatial average metrics more of the Berkson type (i.e., independent of the true ambient concentration). Error type will affect the bias in the health risk estimate, with bias toward the null and away from the null predicted depending on the exposure metric; population-weighting yielded the
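    The classical-vs-Berkson distinction drawn above can be illustrated with a small simulation (a hedged sketch, not the paper's SGeMS workflow): classical error attenuates a regression slope toward the null, while Berkson error leaves it approximately unbiased.

```python
# Sketch: effect of classical vs. Berkson measurement error on an OLS slope.
import random

random.seed(0)

def ols_slope(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

n, beta = 20000, 1.0
true_x = [random.gauss(0, 1) for _ in range(n)]
y = [beta * t + random.gauss(0, 0.5) for t in true_x]

# Classical error: observed = truth + noise -> slope attenuated toward 0,
# here by var(truth)/(var(truth)+var(noise)) = 1/2.
x_classical = [t + random.gauss(0, 1) for t in true_x]

# Berkson error: truth = observed + noise -> slope remains ~unbiased.
x_berkson = [random.gauss(0, 1) for _ in range(n)]
y_berkson = [beta * (xb + random.gauss(0, 1)) + random.gauss(0, 0.5)
             for xb in x_berkson]

print(round(ols_slope(x_classical, y), 2))        # attenuated, ~0.5
print(round(ols_slope(x_berkson, y_berkson), 2))  # ~1.0
```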

  8. Geostatistical validation and cross-validation of magnetometric measurements of soil pollution with Potentially Toxic Elements in problematic areas

    NASA Astrophysics Data System (ADS)

    Fabijańczyk, Piotr; Zawadzki, Jarosław

    2016-04-01

    Field magnetometry is a fast method that has previously been used effectively to assess potential soil pollution. One of the most popular devices for measuring magnetic susceptibility at the soil surface is the Bartington MS2D. A single MS2D reading of soil magnetic susceptibility takes little time but is often subject to considerable errors related to the instrument or to environmental and lithogenic factors. Consequently, measured values of soil magnetic susceptibility usually have to be validated using more precise, but also much more expensive, chemical measurements. The goal of this study was to analyze methods of validating magnetometric measurements against chemical analyses of element concentrations in soil. Additionally, surface measurements of soil magnetic susceptibility were validated using selected parameters of the distribution of magnetic susceptibility in the soil profile. Validation was performed using selected geostatistical measures of cross-correlation, and the geostatistical approach was compared with validation based on classic statistics. Measurements were performed at selected areas located in the Upper Silesian Industrial Area in Poland and in selected parts of Norway. In these areas soil magnetic susceptibility was measured at the soil surface using a Bartington MS2D device and in the soil profile using a Bartington MS2C device. Additionally, soil samples were taken in order to perform chemical measurements. Acknowledgment: The research leading to these results has received funding from the Polish-Norwegian Research Programme operated by the National Centre for Research and Development under the Norwegian Financial Mechanism 2009-2014 in the frame of Project IMPACT - Contract No Pol-Nor/199338/45/2013.

  9. The distribution of arsenic in shallow alluvial groundwater under agricultural land in central Portugal: insights from multivariate geostatistical modeling.

    PubMed

    Andrade, A I A S S; Stigter, T Y

    2013-04-01

    In this study, multivariate and geostatistical methods are jointly applied to model the spatial and temporal distribution of arsenic (As) concentrations in shallow groundwater as a function of physicochemical, hydrogeological and land use parameters, as well as to assess the related uncertainty. The study site is located in the Mondego River alluvial body in Central Portugal, where maize, rice and some vegetable crops dominate. In a first analysis, scatter plots are used, followed by the application of principal component analysis to two different data matrices, of 112 and 200 samples, with the aim of detecting associations between As levels and other quantitative parameters. In the following phase, explanatory models of As are created through factorial regression based on correspondence analysis, integrating both quantitative and qualitative parameters. Finally, these are combined with indicator-geostatistical techniques to create maps indicating the predicted probability of As concentrations in groundwater exceeding the current global drinking water guideline of 10 μg/l. These maps further allow assessing the uncertainty and representativeness of the monitoring network. A clear effect of the redox state on the presence of As is observed, which, together with significant correlations with dissolved oxygen, nitrate, sulfate, iron, manganese and alkalinity, points towards the reductive dissolution of Fe (hydr)oxides as the essential mechanism of As release. The association of high As values with the rice crop, known to promote reduced environments due to ponding, further corroborates this hypothesis. An additional source of As from fertilizers cannot be excluded, as the correlation with As is higher where rice is associated with vegetables, which normally receive higher fertilization rates.
The best explanatory model of As occurrence integrates the parameters season, crop type, well and water depth, nitrate and Eh, though a model without the last two parameters also gives
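    The exceedance-probability mapping described above rests on the indicator transform; a minimal sketch with hypothetical As concentrations (only the 10 μg/l guideline comes from the abstract). In indicator kriging, the kriged mean of these 0/1 codes at a location estimates the probability of exceeding the threshold.

```python
# Sketch of the indicator transform behind indicator kriging.
THRESHOLD = 10.0  # ug/l, drinking-water guideline for arsenic (from abstract)

def indicator(as_ugl, threshold=THRESHOLD):
    """1 if the sample exceeds the guideline, else 0."""
    return 1 if as_ugl > threshold else 0

# Hypothetical As concentrations (ug/l) at five wells.
samples = [3.2, 15.7, 8.9, 42.0, 11.1]
codes = [indicator(v) for v in samples]
print(codes, sum(codes) / len(codes))  # → [0, 1, 0, 1, 1] 0.6
```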

  10. Reconciling bottom-up and top-down estimates of regional scale carbon budgets through geostatistical inverse modeling

    NASA Astrophysics Data System (ADS)

    Goeckede, M.; Yadav, V.; Mueller, K. L.; Gourdji, S. M.; Michalak, A. M.; Law, B. E.

    2011-12-01

    We designed a framework to train biogeophysics-biogeochemistry process models using atmospheric inverse modeling, multiple databases characterizing biosphere-atmosphere exchange, and advanced geostatistics. Our main objective is to reduce uncertainties in carbon cycle and climate projections by exploring the full spectrum of process representation, data assimilation and statistical tools currently available. Incorporating multiple high-quality data sources like eddy-covariance flux databases or biometric inventories has the potential to produce a rigorous data-constrained process model implementation. However, representation errors may bias spatially explicit model output when upscaling to regional to global scales. Atmospheric inverse modeling can be used to validate the regional representativeness of the fluxes, but each piece of prior information from the surface databases limits the ability of the inverse model to characterize the carbon cycle from the perspective of the atmospheric observations themselves. The use of geostatistical inverse modeling (GIM) holds the potential to overcome these limitations, replacing rigid prior patterns with information on how flux fields are correlated across time and space, as well as ancillary environmental data related to the carbon fluxes. We present results from a regional scale data assimilation study that focuses on generating terrestrial CO2 fluxes at high spatial and temporal resolution in the Pacific Northwest United States. Our framework couples surface fluxes from different biogeochemistry process models to very high resolution atmospheric transport using mesoscale modeling (WRF) and Lagrangian Particle dispersion (STILT). We use GIM to interpret the spatiotemporal differences between bottom-up and top-down flux fields. GIM results make it possible to link those differences to input parameters and processes, strengthening model parameterization and process understanding. Results are compared against independent

  11. What have we learned from deterministic geostatistics at highly resolved field sites, as relevant to mass transport processes in sedimentary aquifers?

    NASA Astrophysics Data System (ADS)

    Ritzi, Robert W.; Soltanian, Mohamad Reza

    2015-12-01

    In the method of deterministic geostatistics (sensu Isaaks and Srivastava, 1988), highly-resolved data sets are used to compute sample spatial-bivariate statistics within a deterministic framework. The general goal is to observe what real, highly resolved, sample spatial-bivariate correlation looks like when it is well-quantified in naturally-occurring sedimentary aquifers. Furthermore, the goal is to understand how this correlation structure (i.e. shape and correlation range) is related to independent and physically quantifiable attributes of the sedimentary architecture. The approach has evolved through the work of Rubin (1995, 2003), Barrash and Clemo (2002), Ritzi et al. (2004, 2007, 2013), Dai et al. (2005), and Ramanathan et al. (2010). In this evolution, equations for sample statistics have been developed which allow tracking the facies types at the heads and tails of lag vectors. The goal is to observe and thereby understand how aspects of the sedimentary architecture affect the well-supported sample statistics. The approach has been used to study heterogeneity at a number of sites, representing a variety of depositional environments, with highly resolved data sets. What have we learned? We offer and support an opinion that the single most important insight derived from these studies is that the structure of spatial-bivariate correlation is essentially the cross-transition probability structure, determined by the sedimentary architecture. More than one scale of hierarchical sedimentary architecture has been represented in these studies, and a hierarchy of cross-transition probability structures was found to define the correlation structure in all cases. This insight allows decomposing contributions from different scales of the sedimentary architecture, and has led to a more fundamental understanding of mass transport processes including mechanical dispersion of solutes within aquifers, and the time-dependent retardation of reactive solutes.
These processes can now be
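    The cross-transition probability structure discussed above can be estimated empirically from a logged facies sequence; a toy sketch, with a hypothetical vertical sequence of facies codes.

```python
# Toy sketch: transition probabilities between facies in a vertical sequence.
from collections import Counter

def transition_probs(sequence):
    """Estimate P(next facies | current facies) from an ordered sequence."""
    pairs = Counter(zip(sequence, sequence[1:]))
    totals = Counter(sequence[:-1])
    return {(a, b): c / totals[a] for (a, b), c in pairs.items()}

# Hypothetical facies log, bottom to top.
facies = ["sand", "sand", "mud", "sand", "mud", "mud", "sand", "sand"]
for pair, p in sorted(transition_probs(facies).items()):
    print(pair, round(p, 2))
```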

  12. 76 FR 79204 - Random Drug Testing Rate for Covered Crewmembers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-21

    ... SECURITY Coast Guard Random Drug Testing Rate for Covered Crewmembers AGENCY: Coast Guard, DHS. ACTION: Notice of minimum random drug testing rate. SUMMARY: The Coast Guard has set the calendar year 2012 minimum random drug testing rate at 50 percent of covered crewmembers. DATES: The minimum random...

  13. 76 FR 1448 - Random Drug Testing Rate for Covered Crewmembers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-10

    ... SECURITY Coast Guard Random Drug Testing Rate for Covered Crewmembers AGENCY: Coast Guard, DHS. ACTION: Notice of minimum random drug testing rate. SUMMARY: The Coast Guard has set the calendar year 2011 minimum random drug testing rate at 50 percent of covered crewmembers. DATES: The minimum random...

  14. Quantifying randomness in real networks

    PubMed Central

    Orsini, Chiara; Dankulov, Marija M.; Colomer-de-Simón, Pol; Jamakovic, Almerima; Mahadevan, Priya; Vahdat, Amin; Bassler, Kevin E.; Toroczkai, Zoltán; Boguñá, Marián; Caldarelli, Guido; Fortunato, Santo; Krioukov, Dmitri

    2015-01-01

    Represented as graphs, real networks are intricate combinations of order and disorder. Fixing some of the structural properties of network models to their values observed in real networks, many other properties appear as statistical consequences of these fixed observables, plus randomness in other respects. Here we employ the dk-series, a complete set of basic characteristics of the network structure, to study the statistical dependencies between different network properties. We consider six real networks—the Internet, US airport network, human protein interactions, technosocial web of trust, English word network, and an fMRI map of the human brain—and find that many important local and global structural properties of these networks are closely reproduced by dk-random graphs whose degree distributions, degree correlations and clustering are as in the corresponding real network. We discuss important conceptual, methodological, and practical implications of this evaluation of network randomness, and release software to generate dk-random graphs. PMID:26482121
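    The idea of preserving a real network's degree properties while randomizing everything else can be sketched with degree-preserving double-edge swaps, roughly the "1k" level of the dk-series (higher dk levels also preserve degree correlations and clustering). The graph and swap count below are toy assumptions; the software released with the paper is far more general.

```python
# Toy sketch of "1k" randomization: double-edge swaps preserve every
# node's degree while otherwise shuffling the wiring.
import random

random.seed(1)

def double_edge_swap(edges, n_swaps=100):
    edges = [tuple(e) for e in edges]
    edge_set = set(edges)
    for _ in range(n_swaps):
        (a, b), (c, d) = random.sample(edges, 2)
        # Propose rewiring (a,b),(c,d) -> (a,d),(c,b);
        # reject swaps that would create self-loops or multi-edges.
        if len({a, b, c, d}) < 4:
            continue
        if (a, d) in edge_set or (d, a) in edge_set \
           or (c, b) in edge_set or (b, c) in edge_set:
            continue
        edge_set -= {(a, b), (c, d)}
        edge_set |= {(a, d), (c, b)}
        edges = list(edge_set)
    return edges

def degrees(edges):
    d = {}
    for u, v in edges:
        d[u] = d.get(u, 0) + 1
        d[v] = d.get(v, 0) + 1
    return d

g = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
rewired = double_edge_swap(g)
print(sorted(degrees(rewired).items()))  # same degree sequence as g
```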

  15. Quantifying randomness in real networks

    NASA Astrophysics Data System (ADS)

    Orsini, Chiara; Dankulov, Marija M.; Colomer-de-Simón, Pol; Jamakovic, Almerima; Mahadevan, Priya; Vahdat, Amin; Bassler, Kevin E.; Toroczkai, Zoltán; Boguñá, Marián; Caldarelli, Guido; Fortunato, Santo; Krioukov, Dmitri

    2015-10-01

    Represented as graphs, real networks are intricate combinations of order and disorder. Fixing some of the structural properties of network models to their values observed in real networks, many other properties appear as statistical consequences of these fixed observables, plus randomness in other respects. Here we employ the dk-series, a complete set of basic characteristics of the network structure, to study the statistical dependencies between different network properties. We consider six real networks--the Internet, US airport network, human protein interactions, technosocial web of trust, English word network, and an fMRI map of the human brain--and find that many important local and global structural properties of these networks are closely reproduced by dk-random graphs whose degree distributions, degree correlations and clustering are as in the corresponding real network. We discuss important conceptual, methodological, and practical implications of this evaluation of network randomness, and release software to generate dk-random graphs.

  16. Quantifying randomness in real networks.

    PubMed

    Orsini, Chiara; Dankulov, Marija M; Colomer-de-Simón, Pol; Jamakovic, Almerima; Mahadevan, Priya; Vahdat, Amin; Bassler, Kevin E; Toroczkai, Zoltán; Boguñá, Marián; Caldarelli, Guido; Fortunato, Santo; Krioukov, Dmitri

    2015-01-01

    Represented as graphs, real networks are intricate combinations of order and disorder. Fixing some of the structural properties of network models to their values observed in real networks, many other properties appear as statistical consequences of these fixed observables, plus randomness in other respects. Here we employ the dk-series, a complete set of basic characteristics of the network structure, to study the statistical dependencies between different network properties. We consider six real networks--the Internet, US airport network, human protein interactions, technosocial web of trust, English word network, and an fMRI map of the human brain--and find that many important local and global structural properties of these networks are closely reproduced by dk-random graphs whose degree distributions, degree correlations and clustering are as in the corresponding real network. We discuss important conceptual, methodological, and practical implications of this evaluation of network randomness, and release software to generate dk-random graphs. PMID:26482121

  17. Common Randomness Principles of Secrecy

    ERIC Educational Resources Information Center

    Tyagi, Himanshu

    2013-01-01

    This dissertation concerns the secure processing of distributed data by multiple terminals, using interactive public communication among themselves, in order to accomplish a given computational task. In the setting of a probabilistic multiterminal source model in which several terminals observe correlated random signals, we analyze secure…

  18. BIFURCATIONS OF RANDOM DIFFERENTIAL EQUATIONS WITH BOUNDED NOISE ON SURFACES.

    PubMed

    Homburg, Ale Jan; Young, Todd R

    2010-03-01

    In random differential equations with bounded noise, minimal forward invariant (MFI) sets play a central role since they support stationary measures. We study the stability and possible bifurcations of MFI sets. In dimensions 1 and 2 we classify all minimal forward invariant sets and their codimension-one bifurcations in bounded-noise random differential equations. PMID:22211081

  19. Randomness and degrees of irregularity.

    PubMed Central

    Pincus, S; Singer, B H

    1996-01-01

    The fundamental question "Are sequential data random?" arises in myriad contexts, often with severe data length constraints. Furthermore, there is frequently a critical need to delineate nonrandom sequences in terms of closeness to randomness--e.g., to evaluate the efficacy of therapy in medicine. We address both these issues from a computable framework via a quantification of regularity, ApEn (approximate entropy), which defines maximal randomness for sequences of arbitrary length and applies to sequences as short as N = 5 points. An infinite sequence formulation of randomness is introduced that retains the operational (and computable) features of the finite case. In the infinite sequence setting, we indicate how the "foundational" definition of independence in probability theory, and the definition of normality in number theory, reduce to limit theorems without rates of convergence, from which we utilize ApEn to address rates of convergence (of a deficit from maximal randomness), refining the aforementioned concepts in a computationally essential manner. Representative applications among many are indicated to assess (i) random number generation output; (ii) well-shuffled arrangements; and (iii) (the quality of) bootstrap replicates. PMID:11607637
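    A minimal sketch of ApEn in the standard template-matching formulation, ApEn(m, r) = phi(m) - phi(m + 1); the parameter choices m = 2, r = 0.2 and the test sequence are illustrative assumptions, not values from the paper.

```python
# Sketch of approximate entropy (ApEn): higher values = more irregular.
import math

def apen(u, m=2, r=0.2):
    n = len(u)
    def phi(m):
        templates = [u[i:i + m] for i in range(n - m + 1)]
        fractions = []
        for t in templates:
            # Fraction of templates within Chebyshev distance r of t
            # (each template matches itself, so the log below is defined).
            c = sum(1 for s in templates
                    if max(abs(a - b) for a, b in zip(t, s)) <= r)
            fractions.append(c / len(templates))
        return sum(math.log(c) for c in fractions) / len(templates)
    return phi(m) - phi(m + 1)

regular = [0, 1] * 50   # perfectly alternating sequence
print(apen(regular))     # near 0: highly regular
```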

  20. Conducting Randomized Controlled Trials with Offenders in an Administrative Setting

    ERIC Educational Resources Information Center

    Ahlin, Eileen M.

    2015-01-01

    Evaluation research conducted in agencies that sanction law violators is often challenging and due process may preclude evaluators from using experimental methods in traditional criminal justice agencies such as police, courts, and corrections. However, administrative agencies often deal with the same population but are not bound by due process…

  1. A geostatistical framework for quantifying the reach-scale spatial structure of river morphology: 2. Application to restored and natural channels

    NASA Astrophysics Data System (ADS)

    Legleiter, Carl J.

    2014-01-01

    Alluvial rivers are shaped by interactions between the bed topography, the flow field, and the movement of sediment. To help refine our understanding of these connections between form and process, I developed a geostatistical framework for quantifying the reach-scale spatial structure of river morphology, described in a companion paper. In this study, I applied this approach to a restored channel and three disparate reaches of a dynamic, natural stream. Repeat topographic surveys of each site were used to examine relationships between channel change and the variability and organization of the topography. For the restored river, the development of point bars increased overall morphologic diversity, primarily because of greater cross-sectional asymmetry. The three natural reaches experienced a variety of adjustments ranging from (1) gradual bar growth and bank erosion, to (2) extensive deposition followed by channel abandonment, and (3) chute cutoff and incision of a new channel. In both the restored and natural streams, geostatistical analysis, which involved variogram modeling, calculation of integral metrics, and inspection of variogram maps, provided an effective, informative summary of the observed channel changes. The use of dimensionless variables accounted for channel size, highlighted differences in spatial structure, and enabled a comparison among sites: the restored reach had not yet achieved the same degree of heterogeneity as the more pristine channels. Emphasizing variability and spatial pattern via this geostatistical framework could yield insight on form-process interactions and help to quantify geomorphic complexity and habitat heterogeneity in the applied context of river restoration.

  2. Geostatistical Analysis of Population Density and the Change of Land Cover and Land Use in the Komadugu-Yobe River Basin in Nigeria

    NASA Astrophysics Data System (ADS)

    Tobar, I.; Lee, J.; Black, F. W.; Babamaaji, R. A.

    2014-12-01

    The Komadugu-Yobe River Basin in northeastern Nigeria is an important tributary of Lake Chad and has experienced significant changes in population density and land cover in recent decades. The present study focuses on the application of geostatistical methods to examine the land cover and population density dynamics in the river basin. The geostatistical methods include spatial autocorrelation, overlapping neighborhood statistics with Pearson's correlation coefficient, Moran's I index analysis, and indicator variogram analysis with rose diagram. The land cover and land use maps were constructed from USGS Landsat images and Globcover images from the European Space Agency. The target years of the analysis are 1970, 1986, 2000, 2005, and 2009. The calculation of net changes in land cover indicates significant variation in the changes of rainfed cropland, mosaic cropland, and grassland. Spatial autocorrelation analysis and Moran's I index analysis showed that the distribution of land cover is highly clustered. A new GIS geostatistical tool was designed to calculate the overlapping neighborhood statistics with Pearson's correlation coefficient between the land use/land cover and population density datasets. The 10x10 neighborhood cell unit showed a clear correlation between the variables in certain zones of the study area. The ranges calculated from the indicator variograms of land use and land cover and population density showed that the cropland and sparse vegetation are most closely related to the spatial change of population density.
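    Moran's I, used above to assess clustering, can be computed directly from values and a spatial weights matrix; the 1-D cell arrangement and rook-adjacency weights below are a toy illustration, not the study's GIS tool.

```python
# Sketch of Moran's I: I = (n / S0) * sum_ij w_ij d_i d_j / sum_i d_i^2,
# where d_i are deviations from the mean and S0 is the sum of weights.
def morans_i(z, w):
    """z: list of values; w: dict mapping (i, j) -> weight."""
    n = len(z)
    mean = sum(z) / n
    dev = [v - mean for v in z]
    num = sum(wij * dev[i] * dev[j] for (i, j), wij in w.items())
    den = sum(d * d for d in dev)
    s0 = sum(w.values())
    return (n / s0) * (num / den)

z = [1, 2, 3, 4, 5, 6]  # smooth trend -> positive spatial autocorrelation
w = {(i, j): 1.0 for i in range(6) for j in range(6) if abs(i - j) == 1}
print(round(morans_i(z, w), 3))  # → 0.6
```

Values near +1 indicate clustering, near 0 randomness, and negative values dispersion.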

  3. Random broadcast on random geometric graphs

    SciTech Connect

    Bradonjic, Milan; Elsasser, Robert; Friedrich, Tobias

    2009-01-01

    In this work, we consider the random broadcast time on random geometric graphs (RGGs). The classic random broadcast model, also known as the push algorithm, is defined as: starting with one informed node, in each succeeding round every informed node chooses one of its neighbors uniformly at random and informs it. We consider the random broadcast time on RGGs, when with high probability: (i) the RGG is connected, (ii) there exists a giant component in the RGG. We show that the random broadcast time is bounded by O(√n + diam(component)), where diam(component) is the diameter of the entire graph, or of the giant component, for regimes (i) or (ii), respectively. In other words, for both regimes, we derive the broadcast time to be Θ(diam(G)), which is asymptotically optimal.
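    The push model defined above is easy to simulate; the cycle graph below is a toy stand-in (not an RGG) where the diameter bound is visible: on an n-cycle, information travels at most one step in each direction per round, so broadcast needs at least n/2 = diam(G) rounds.

```python
# Sketch of the push broadcast model: each round, every informed node
# informs one uniformly random neighbor.
import random

random.seed(2)

def push_broadcast_rounds(adj, start=0):
    """adj: dict node -> list of neighbors. Returns rounds to inform all."""
    informed = {start}
    rounds = 0
    while len(informed) < len(adj):
        newly = set()
        for u in informed:
            newly.add(random.choice(adj[u]))
        informed |= newly
        rounds += 1
    return rounds

n = 16
cycle = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
r = push_broadcast_rounds(cycle)
print(r)  # at least n/2 = diam(G) rounds on a cycle
```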

  4. Quantumness, Randomness and Computability

    NASA Astrophysics Data System (ADS)

    Solis, Aldo; Hirsch, Jorge G.

    2015-06-01

    Randomness plays a central role in the quantum mechanical description of our interactions. We review the relationship between the violation of Bell inequalities, non signaling and randomness. We discuss the challenge in defining a random string, and show that algorithmic information theory provides a necessary condition for randomness using Borel normality. We close with a view on incomputablity and its implications in physics.

  5. How random is a random vector?

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo

    2015-12-01

    Over 80 years ago Samuel Wilks proposed that the "generalized variance" of a random vector is the determinant of its covariance matrix. To date, the notion and use of the generalized variance is confined only to very specific niches in statistics. In this paper we establish that the "Wilks standard deviation" (the square root of the generalized variance) is indeed the standard deviation of a random vector. We further establish that the "uncorrelation index" (a derivative of the Wilks standard deviation) is a measure of the overall correlation between the components of a random vector. Both the Wilks standard deviation and the uncorrelation index are, respectively, special cases of two general notions that we introduce: "randomness measures" and "independence indices" of random vectors. In turn, these general notions give rise to "randomness diagrams": tangible planar visualizations that answer the question: How random is a random vector? The notion of "independence indices" yields a novel measure of correlation for Lévy laws. In general, the concepts and results presented in this paper are applicable to any field of science and engineering with random-vectors empirical data.
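    The generalized variance is just the determinant of the covariance matrix, so its square root is directly computable; a 2-D sketch with hypothetical data (fully correlated components make the determinant, and hence the Wilks standard deviation, collapse to zero).

```python
# Sketch of Wilks' generalized variance for a 2-D random vector:
# det of the 2x2 covariance matrix, and its square root.
import math

def cov2(xs, ys):
    """Population covariance matrix entries (sxx, sxy, syy)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / n
    syy = sum((y - my) ** 2 for y in ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    return sxx, sxy, syy

def wilks_sd(xs, ys):
    sxx, sxy, syy = cov2(xs, ys)
    gen_var = sxx * syy - sxy ** 2  # determinant of the covariance matrix
    return math.sqrt(gen_var)

xs = [1.0, 2.0, 3.0, 4.0]
print(wilks_sd(xs, [2 * x for x in xs]))  # → 0.0 (perfectly correlated)
```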

  6. Geostatistical interpolation of daily rainfall at catchment scale: the use of several variogram models in the Ourthe and Ambleve catchments, Belgium

    NASA Astrophysics Data System (ADS)

    Ly, S.; Charles, C.; Degré, A.

    2011-07-01

    Spatial interpolation of precipitation data is of great importance for hydrological modelling. Geostatistical methods (kriging) are widely applied in spatial interpolation from point measurements to continuous surfaces. The first step in kriging computation is semi-variogram modelling, which usually uses only one variogram model for all data. The objective of this paper was to develop different algorithms of spatial interpolation for daily rainfall on 1 km2 regular grids in the catchment area and to compare the results of geostatistical and deterministic approaches. This study relied on 30 years of daily rainfall data from 70 raingages in the hilly landscape of the Ourthe and Ambleve catchments in Belgium (2908 km2). This area lies between 35 and 693 m in elevation and consists of river networks, which are tributaries of the Meuse River. For the geostatistical algorithms, seven semi-variogram models (logarithmic, power, exponential, Gaussian, rational quadratic, spherical and penta-spherical) were fitted to the sample semi-variograms on a daily basis. These seven variogram models were also adopted to avoid negative interpolated rainfall. The elevation, extracted from a digital elevation model, was incorporated into multivariate geostatistics. Seven validation raingages and cross validation were used to compare the interpolation performance of these algorithms applied to different densities of raingages. We found that, among the seven variogram models used, the Gaussian model most frequently provided the best fit. Using seven variogram models can avoid negative daily rainfall estimates in ordinary kriging. Negative kriging estimates were observed more for convective than for stratiform rain. The performance of the different methods varied slightly according to the density of raingages, particularly between 8 and 70 raingages, but it differed markedly for interpolation using 4 raingages.
Spatial interpolation with the geostatistical and Inverse Distance Weighting (IDW) algorithms
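
    The record above fits parametric variogram models to daily sample semi-variograms. As an illustration of that first step, here is a minimal sketch of fitting a Gaussian variogram model to a hypothetical empirical semi-variogram with SciPy; the lag and semivariance values are invented, not taken from the study:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gaussian_variogram(h, nugget, psill, vrange):
        # Gaussian model: nugget + partial sill * (1 - exp(-(h / range)^2))
        return nugget + psill * (1.0 - np.exp(-((h / vrange) ** 2)))

    # Hypothetical empirical semi-variogram (lags in km, semivariance in mm^2)
    lags = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
    gamma = np.array([0.5, 1.4, 3.8, 7.9, 9.6, 10.1])

    # Least-squares fit of the three model parameters
    params, _ = curve_fit(gaussian_variogram, lags, gamma, p0=[0.1, 10.0, 10.0])
    nugget, psill, vrange = params
    ```

    In a daily workflow like the one described, each of the seven candidate models would be fitted this way and the best fit selected per day.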

  7. GY SAMPLING THEORY AND GEOSTATISTICS: ALTERNATE MODELS OF VARIABILITY IN CONTINUOUS MEDIA

    EPA Science Inventory



    In the sampling theory developed by Pierre Gy, sample variability is modeled as the sum of a set of seven discrete error components. The variogram used in geostatistics provides an alternate model in which several of Gy's error components are combined in a continuous mode...

  8. A novel geotechnical/geostatistical approach for exploration and production of natural gas from multiple geologic strata, Phase 1

    SciTech Connect

    Overbey, W.K. Jr.; Reeves, T.K.; Salamy, S.P.; Locke, C.D.; Johnson, H.R.; Brunk, R.; Hawkins, L. )

    1991-05-01

    This research program has been designed to develop and verify a unique geostatistical approach for finding natural gas resources. The research has been conducted by Beckley College, Inc. (Beckley) and BDM Engineering Services Company (BDMESC) under contract to the US Department of Energy (DOE), Morgantown Energy Technology Center. Phase 1 of the project consisted of compiling and analyzing relevant geological and gas production information in selected areas of Raleigh County, West Virginia, ultimately narrowed to the Eccles, West Virginia, 7-1/2-minute Quadrangle. The Phase 1 analysis identified key parameters contributing to the accumulation and production of natural gas in Raleigh County, developed analog models relating geological factors to gas production, and identified specific sites to test and verify the analysis methodologies by drilling. Based on the Phase 1 analysis, five sites have been identified with high potential for economic gas production. Phase 2 will consist of drilling, completing, and producing one or more wells at the sites identified in the Phase 1 analyses. The initial well was scheduled to be drilled in April 1991. This report summarizes the results of the Phase 1 investigations. For clarity, the report has been prepared in two volumes. Volume 1 presents the Phase 1 overview; Volume 2 contains the detailed geological and production information collected and analyzed for this study.

  9. Field-scale soil moisture space-time geostatistical modeling for complex Palouse landscapes in the inland Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Chahal, M. K.; Brown, D. J.; Brooks, E. S.; Campbell, C.; Cobos, D. R.; Vierling, L. A.

    2012-12-01

    Estimating soil moisture content continuously over space and time using geo-statistical techniques supports the refinement of process-based watershed hydrology models and the application of soil process models (e.g. biogeochemical models predicting greenhouse gas fluxes) to complex landscapes. In this study, we model soil profile volumetric moisture content for five agricultural fields with loess soils in the Palouse region of Eastern Washington and Northern Idaho. Using a combination of stratification and space-filling techniques, we selected 42 representative and distributed measurement locations in the Cook Agronomy Farm (Pullman, WA) and 12 locations each in four additional grower fields that span the precipitation gradient across the Palouse. At each measurement location, soil moisture was measured on an hourly basis at five different depths (30, 60, 90, 120, and 150 cm) using Decagon 5-TE/5-TM soil moisture sensors (Decagon Devices, Pullman, WA, USA). This data was collected over three years for the Cook Agronomy Farm and one year for each of the grower fields. In addition to ordinary kriging, we explored the correlation of volumetric water content with external, spatially exhaustive indices derived from terrain models, optical remote sensing imagery, and proximal soil sensing data (electromagnetic induction and VisNIR penetrometer)

  10. Geostatistical assessment and validation of uncertainty for three-dimensional dioxin data from sediments in an estuarine river.

    PubMed

    Barabás, N; Goovaerts, P; Adriaens, P

    2001-08-15

    Contaminated sediment management is an urgent environmental and regulatory issue worldwide. Because remediation is expensive, sound quantitative assessments of uncertainty about the spatial distribution of contaminants are critical, but they are hampered by the physical complexity of sediment environments. This paper describes the use of geostatistical modeling approaches to quantify uncertainty of 2,3,7,8-tetrachlorodibenzo-p-dioxin concentrations in Passaic River (New Jersey) sediments and to incorporate this information in decision-making processes, such as delineation of contaminated areas and additional sampling needs. First, coordinate transformation and analysis of three-dimensional semivariograms were used to describe and model the directional variability accounting for the meandering course of the river. Then, indicator kriging was employed to provide models of local uncertainty at unsampled locations without requiring a prior transform (e.g. log-normal) of concentrations. Cross-validation results show that the use of probability thresholds leads to more efficient delineation of contaminated areas than a classification based on the exceedence of regulatory thresholds by concentration estimates. Depending on whether additional sampling aims at reducing prediction errors or misclassification rates, the variance of local probability distributions or a measure of the expected closeness to the regulatory threshold can be used to locate candidate locations. PMID:11529567
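
    Indicator kriging, as used in the record above, starts by recoding concentrations as threshold exceedances. A toy sketch of that indicator transform, with inverse-distance weights standing in for kriging weights (all values hypothetical; real indicator kriging solves a kriging system for the weights):

    ```python
    import numpy as np

    # Hypothetical dioxin concentrations at sampled locations (units arbitrary)
    conc = np.array([3.2, 45.0, 120.0, 8.5, 260.0, 15.0, 75.0])
    threshold = 50.0  # hypothetical regulatory threshold

    # Indicator transform: 1 where the threshold is exceeded, else 0
    indicators = (conc > threshold).astype(float)

    # Toy stand-in for kriging weights: inverse-distance weights from one
    # unsampled location to the seven samples (distances invented)
    dists = np.array([2.0, 1.0, 4.0, 3.0, 5.0, 2.5, 1.5])
    w = 1.0 / dists
    w /= w.sum()

    # Weighted mean of indicators estimates the local exceedance probability
    prob_exceed = float(w @ indicators)
    ```

    The estimated probability, rather than a single concentration estimate, is what supports the probability-threshold classification the abstract describes.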

  11. Geographical distribution of the annual mean radon concentrations in primary schools of Southern Serbia - application of geostatistical methods.

    PubMed

    Bossew, P; Žunić, Z S; Stojanovska, Z; Tollefsen, T; Carpentieri, C; Veselinović, N; Komatina, S; Vaupotič, J; Simović, R D; Antignani, S; Bochicchio, F

    2014-01-01

    Between 2008 and 2011 a survey of radon ((222)Rn) was performed in schools of several districts of Southern Serbia. Some results have been published previously (Žunić et al., 2010; Carpentieri et al., 2011; Žunić et al., 2013). This article concentrates on the geographical distribution of the measured Rn concentrations. Applying geostatistical methods, we generate "school radon maps" of expected concentrations and of estimated probabilities that a concentration threshold is exceeded. The resulting maps show a clearly structured spatial pattern which appears related to the geological background. In particular, in areas with vulcanite and granitoid rocks, elevated radon (Rn) concentrations can be expected. The "school radon map" can therefore be considered a proxy for a map of the geogenic radon potential, and allows identification of radon-prone areas, i.e. areas in which higher Rn concentrations can be expected for natural reasons. It must be stressed that the "radon hazard", or potential risk, estimated this way has to be distinguished from the actual radon risk, which is a function of exposure. This in turn may require (depending on the target variable which is supposed to measure risk) considering demographic and sociological reality, i.e. population density, distribution of building styles and living habits. PMID:24231373

  12. Merging parallel tempering with sequential geostatistical resampling for improved posterior exploration of high-dimensional subsurface categorical fields

    NASA Astrophysics Data System (ADS)

    Laloy, Eric; Linde, Niklas; Jacques, Diederik; Mariethoz, Grégoire

    2016-04-01

    The sequential geostatistical resampling (SGR) algorithm is a Markov chain Monte Carlo (MCMC) scheme for sampling from possibly non-Gaussian, complex spatially-distributed prior models such as geologic facies or categorical fields. In this work, we highlight the limits of standard SGR for posterior inference of high-dimensional categorical fields with realistically complex likelihood landscapes and benchmark a parallel tempering implementation (PT-SGR). Our proposed PT-SGR approach is demonstrated using synthetic (error corrupted) data from steady-state flow and transport experiments in categorical 7575- and 10,000-dimensional 2D conductivity fields. In both case studies, every SGR trial becomes trapped in a local optimum while PT-SGR maintains a higher diversity in the sampled model states. The advantage of PT-SGR is most apparent in an inverse transport problem where the posterior distribution is made bimodal by construction. PT-SGR then converges towards the appropriate data misfit much faster than SGR and partly recovers the two modes. In contrast, for the same computational resources SGR does not fit the data to the appropriate error level and hardly produces a locally optimal solution that looks visually similar to one of the two reference modes. Although PT-SGR clearly surpasses SGR in performance, our results also indicate that using a small number (16-24) of temperatures (and thus parallel cores) may not permit complete sampling of the posterior distribution by PT-SGR within a reasonable computational time (less than 1-2 weeks).
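
    The move that distinguishes parallel tempering from a single-chain sampler is the temperature-swap step between chains. A minimal sketch of the Metropolis swap criterion (the energies and inverse temperatures below are invented; E denotes a negative log-likelihood, not anything from the study):

    ```python
    import math
    import random

    def swap_accept(E_i, E_j, beta_i, beta_j, rng):
        # Metropolis criterion for exchanging the states of two chains at
        # inverse temperatures beta_i and beta_j: accept the swap with
        # probability min(1, exp((beta_i - beta_j) * (E_i - E_j))).
        log_alpha = (beta_i - beta_j) * (E_i - E_j)
        return rng.random() < math.exp(min(0.0, log_alpha))

    rng = random.Random(42)
    # Cold chain (beta = 1.0) holds a low-energy state; the hot chain
    # (beta = 0.5) holds a higher-energy one, so the expected acceptance
    # rate is exp((1.0 - 0.5) * (10 - 12)) = exp(-1) ~= 0.37.
    n_trials = 10000
    accepted = sum(swap_accept(10.0, 12.0, 1.0, 0.5, rng) for _ in range(n_trials))
    rate = accepted / n_trials
    ```

    Swaps let states that escape local optima at high temperature migrate down to the cold chain, which is why PT-SGR maintains higher diversity than plain SGR.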

  13. Geostatistical study of spatial correlations of lead and zinc concentration in urban reservoir. Study case Czerniakowskie Lake, Warsaw, Poland

    NASA Astrophysics Data System (ADS)

    Fabijańczyk, Piotr; Zawadzki, Jarosław; Wojtkowska, Małgorzata

    2016-07-01

    The article presents detailed geostatistical analysis of spatial distribution of lead and zinc concentration in water, suspension and bottom sediments of large, urban lake exposed to intensive anthropogenic pressure within a large city. Systematic chemical measurements were performed at eleven cross-sections located along Czerniakowskie Lake, the largest lake in Warsaw, the capital of Poland. During the summer, the lake is used as a public bathing area, therefore, to better evaluate human impacts, field measurements were carried out in high-use seasons. It was found that the spatial distributions of aqueous lead and zinc differ during the summer and autumn. In summer several Pb and Zn hot-spots were observed, while during autumn spatial distributions of Pb and Zn were rather homogenous throughout the entire lake. Large seasonal differences in spatial distributions of Pb and Zn were found in bottom sediments. Autumn concentrations of both heavy metals were ten times higher in comparison with summer values. Clear cross-correlations of Pb and Zn concentrations in water, suspension and bottom sediments suggest that both Pb and Zn came to Czerniakowskie Lake from the same source.

  14. Multivariate and geostatistical analyses of the spatial distribution and sources of heavy metals in agricultural soil in Dehui, Northeast China.

    PubMed

    Sun, Chongyu; Liu, Jingshuang; Wang, Yang; Sun, Liqiang; Yu, Hongwen

    2013-07-01

    The characterization of the content and sources of heavy metals in soils is necessary to establish quality standards on a regional level and to assess the potential threat of metals to food safety and human health. The surface horizons of 114 agricultural soils in Dehui, a representative agricultural area in the black soil region, Northeast China, were collected and the concentrations of Cr, Ni, Cu, Zn, and Pb were analyzed. The mean values of the heavy metals were 49.7 ± 7.04, 20.8 ± 3.06, 18.9 ± 8.51, 58.9 ± 7.16, and 35.4 ± 9.18 mg kg(-1) for Cr, Ni, Cu, Zn, and Pb, respectively. Anthropic activities caused an enrichment of Cu and Pb in soils. However, metal concentrations in all samples did not exceed the guideline values of the Chinese Environmental Quality Standard for Soils. Multivariate and geostatistical analyses suggested that soil Cr, Ni, and Zn had a lithogenic origin, whereas the elevated Cu concentrations in the study area were associated with industrial and agronomic practices, and the main sources of Pb were industrial fume, coal burning exhausts, and domestic waste. Source identification of heavy metals in agricultural soil is a basis for undertaking appropriate action to reduce metal inputs. PMID:23608467

  15. APPLICATION OF BAYESIAN AND GEOSTATISTICAL MODELING TO THE ENVIRONMENTAL MONITORING OF CS-137 AT THE IDAHO NATIONAL LABORATORY

    SciTech Connect

    Kara G. Eby

    2010-08-01

    At the Idaho National Laboratory (INL) Cs-137 concentrations above the U.S. Environmental Protection Agency risk-based threshold of 0.23 pCi/g may increase the risk of human mortality due to cancer. As a leader in nuclear research, the INL has been conducting nuclear activities for decades. Elevated anthropogenic radionuclide levels including Cs-137 are a result of atmospheric weapons testing, the Chernobyl accident, and nuclear activities occurring at the INL site. Therefore environmental monitoring and long-term surveillance of Cs-137 is required to evaluate risk. However, due to the large land area involved, frequent and comprehensive monitoring is limited. Developing a spatial model that predicts Cs-137 concentrations at unsampled locations will enhance the spatial characterization of Cs-137 in surface soils, provide guidance for an efficient monitoring program, and pinpoint areas requiring mitigation strategies. The predictive model presented herein is based on applied geostatistics using a Bayesian analysis of environmental characteristics across the INL site, which provides kriging spatial maps of both Cs-137 estimates and prediction errors. Comparisons are presented of two different kriging methods, showing that the use of secondary information (i.e., environmental characteristics) can provide improved prediction performance in some areas of the INL site.

  16. Diffusion in random networks

    NASA Astrophysics Data System (ADS)

    Padrino, Juan C.; Zhang, Duan Z.

    2015-11-01

    The ensemble phase averaging technique is applied to model mass transport in a porous medium. The porous material is idealized as an ensemble of random networks, where each network consists of a set of junction points representing the pores and tortuous channels connecting them. Inside a channel, fluid transport is assumed to be governed by the one-dimensional diffusion equation. Mass balance leads to an integro-differential equation for the pore mass density. Instead of attempting to solve this equation, an equivalent set of partial differential equations is derived whose solution is sought numerically. As a test problem, we consider the one-dimensional diffusion of a substance from one end to the other in a bounded domain. For a statistically homogeneous and isotropic material, results show that for relatively large times the pore mass density evolution from the new theory is significantly delayed in comparison with the solution from the classical diffusion equation. In the short-time case, when the solution evolves with time as if the domain were semi-infinite, numerical results indicate that the pore mass density becomes a function of the similarity variable x·t^(-1/4) rather than the x·t^(-1/2) characteristic of classical diffusion. This result was verified analytically. Possible applications of this framework include flow in gas shales. Work supported by the LDRD project of LANL.

  17. Comparing MTI randomization procedures to blocked randomization.

    PubMed

    Berger, Vance W; Bejleri, Klejda; Agnor, Rebecca

    2016-02-28

    Randomization is one of the cornerstones of the randomized clinical trial, and there is no shortage of methods one can use to randomize patients to treatment groups. When deciding which one to use, researchers must bear in mind that not all randomization procedures are equally adept at achieving the objective of randomization, namely, balanced treatment groups. One threat is chronological bias, and permuted blocks randomization does such a good job at controlling chronological bias that it has become the standard randomization procedure in clinical trials. But permuted blocks randomization is especially vulnerable to selection bias, so as a result, the maximum tolerated imbalance (MTI) procedures were proposed as better alternatives. In comparing the procedures, we have somewhat of a false controversy, in that actual practice goes uniformly one way (permuted blocks), whereas scientific arguments go uniformly the other way (MTI procedures). There is no argument in the literature to suggest that the permuted block design is better than or even as good as the MTI procedures, but this dearth is matched by an equivalent one regarding actual trials using the MTI procedures. So the 'controversy', if we are to call it that, pits misguided precedent against sound advice that tends to be ignored in practice. We shall review the issues to determine scientifically which of the procedures is better and, therefore, should be used. PMID:26337607
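
    One of the simplest MTI procedures discussed in this literature is the big stick design: toss a fair coin unless the group imbalance has reached the maximum tolerated imbalance, in which case the lagging arm is assigned deterministically. A minimal sketch (trial size and MTI value are illustrative, not from the article):

    ```python
    import random

    def big_stick(n, mti, rng):
        # Big stick design (an MTI procedure): fair coin toss unless the
        # A-minus-B imbalance has hit the maximum tolerated imbalance, in
        # which case the lagging arm is assigned deterministically.
        assignments = []
        imbalance = 0  # (#A - #B) so far
        for _ in range(n):
            if imbalance >= mti:
                arm = "B"
            elif imbalance <= -mti:
                arm = "A"
            else:
                arm = "A" if rng.random() < 0.5 else "B"
            imbalance += 1 if arm == "A" else -1
            assignments.append(arm)
        return assignments

    seq = big_stick(200, 3, random.Random(0))
    ```

    Unlike permuted blocks, no allocation is ever fully predictable unless the imbalance boundary has been reached, which is the selection-bias advantage the abstract refers to.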

  18. The MCNP5 Random number generator

    SciTech Connect

    Brown, F. B.; Nagaya, Y.

    2002-01-01

    MCNP and other Monte Carlo particle transport codes use random number generators to produce random variates from a uniform distribution on the unit interval. These random variates are then used in subsequent sampling from probability distributions to simulate the physical behavior of particles during the transport process. This paper describes the new random number generator developed for MCNP Version 5. The new generator will optionally preserve the exact random sequence of previous versions and is entirely conformant to the Fortran-90 standard, hence completely portable. In addition, skip-ahead algorithms have been implemented to efficiently initialize the generator for new histories, a capability that greatly simplifies parallel algorithms. Further, the precision of the generator has been increased, extending the period by a factor of 10^5. Finally, the new generator has been subjected to 3 different sets of rigorous and extensive statistical tests to verify that it produces a sufficiently random sequence.
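
    The skip-ahead idea can be illustrated for a generic linear congruential generator: a k-step jump is computed in O(log k) multiplications by binary composition of the affine update map. The constants below are illustrative only, not MCNP's:

    ```python
    def lcg_skip(seed, k, a, c, m):
        # Jump k steps ahead of x_{n+1} = (a * x_n + c) mod m in O(log k)
        # by composing the affine map with itself (binary exponentiation).
        acc_a, acc_c = 1, 0            # accumulated jump (identity map)
        step_a, step_c = a % m, c % m  # current 2^j-step jump
        while k:
            if k & 1:
                acc_a = (step_a * acc_a) % m
                acc_c = (step_a * acc_c + step_c) % m
            step_a, step_c = (step_a * step_a) % m, (step_a * step_c + step_c) % m
            k >>= 1
        return (acc_a * seed + acc_c) % m
    ```

    Jumping a fixed stride per particle history lets each history start from a reproducible point in the sequence without generating the intervening numbers, which is what makes parallel initialization cheap.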

  19. A Random Variable Related to the Inversion Vector of a Partial Random Permutation

    ERIC Educational Resources Information Center

    Laghate, Kavita; Deshpande, M. N.

    2005-01-01

    In this article, we define the inversion vector of a permutation of the integers 1, 2,..., n. We set up a particular kind of permutation, called a partial random permutation. The sum of the elements of the inversion vector of such a permutation is a random variable of interest.
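
    For a concrete reading of the quantity studied, here is a sketch that computes the inversion vector of a permutation under one common convention (entry i counts the elements preceding the value i that exceed it; the article's exact indexing may differ) and sums its entries:

    ```python
    def inversion_vector(perm):
        # v[i-1] counts elements that precede the value i in perm and exceed it
        # (one common convention; definitions vary in the literature).
        pos = {v: idx for idx, v in enumerate(perm)}
        return [sum(1 for j in range(pos[v]) if perm[j] > v)
                for v in range(1, len(perm) + 1)]

    perm = [3, 1, 4, 5, 2]
    vec = inversion_vector(perm)
    total_inversions = sum(vec)  # equals the number of inverted pairs
    ```

    Under this convention the sum of the vector equals the permutation's total inversion count, the random variable whose distribution the article studies.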

  20. "Geo-statistics methods and neural networks in geophysical applications: A case study"

    NASA Astrophysics Data System (ADS)

    Rodriguez Sandoval, R.; Urrutia Fucugauchi, J.; Ramirez Cruz, L. C.

    2008-12-01

    The study focuses on the Ebano-Panuco basin of northeastern Mexico, which is being explored for hydrocarbon reservoirs. These reservoirs are in limestones, and there is interest in determining porosity and permeability in the carbonate sequences. The porosity maps presented in this study are estimated by applying multiattribute and neural-network techniques, which combine geophysical logs and 3-D seismic data by means of statistical relationships. Multiattribute analysis is a process to predict a volume of an underground petrophysical measurement from well-log and seismic data. The data consist of a series of target logs from wells that tie a 3-D seismic volume. The target logs are neutron porosity logs. From the 3-D seismic volume a series of sample attributes is calculated. The objective is to derive a relationship between a set of attributes and the target log values; the selected set is determined by a process of forward stepwise regression. The analysis can be linear or nonlinear. In the linear mode the method consists of a series of weights derived by least-squares minimization. In the nonlinear mode, a neural network is trained using the selected attributes as inputs; in this case we used a probabilistic neural network (PNN). The method is applied to a real data set from PEMEX. For better reservoir characterization the porosity distribution was estimated using both techniques. The case shows a continuous improvement in the prediction of porosity from the multiattribute to the neural-network analysis, in both training and validation, which are important indicators of the reliability of the results. The neural network showed an improvement in resolution over the multiattribute analysis. The final maps provide more realistic estimates of the porosity distribution.
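
    A PNN/GRNN-style predictor is, at its core, a kernel-weighted average of training targets. A minimal sketch in that spirit (Nadaraya-Watson form; the attribute values and porosities below are invented, not PEMEX data, and the study's actual network and attribute set are not reproduced here):

    ```python
    import numpy as np

    def kernel_predict(X_train, y_train, X_query, sigma=1.0):
        # Gaussian-kernel-weighted average of training targets
        # (Nadaraya-Watson regression, the GRNN/PNN family's basic form).
        d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        return (w @ y_train) / w.sum(axis=1)

    # Invented seismic attributes (2 per trace) and neutron-porosity targets
    X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    y = np.array([0.10, 0.20, 0.15, 0.30])
    pred = kernel_predict(X, y, np.array([[0.5, 0.5]]), sigma=0.5)
    ```

    The kernel width sigma plays the role of the PNN smoothing parameter: small values reproduce training targets near the wells, larger values smooth the porosity map.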

  1. 78 FR 4855 - Random Drug Testing Rate for Covered Crewmembers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-23

    ... SECURITY Coast Guard Random Drug Testing Rate for Covered Crewmembers AGENCY: Coast Guard, DHS. ACTION: Notice of minimum random drug testing rate. SUMMARY: The Coast Guard has set the calendar year 2013 minimum random drug testing rate at 25 percent of covered crewmembers. The Coast Guard will continue...

  2. Enhancing multiple-point geostatistical modeling: 2. Iterative simulation and multiple distance function

    NASA Astrophysics Data System (ADS)

    Tahmasebi, Pejman; Sahimi, Muhammad

    2016-03-01

    This series addresses a fundamental issue in multiple-point statistical (MPS) simulation for generation of realizations of large-scale porous media. Past methods suffer from the fact that they generate discontinuities and patchiness in the realizations that, in turn, affect their flow and transport properties. Part I of this series addressed certain aspects of this fundamental issue, and proposed two ways of improving one such MPS method, namely, the cross correlation-based simulation (CCSIM) method that was proposed by the authors. In the present paper, a new algorithm is proposed to further improve the quality of the realizations. The method utilizes the realizations generated by the algorithm introduced in Part I, iteratively removes any possible remaining discontinuities in them, and addresses the problem of honoring hard (quantitative) data, using an error map. The map represents the differences between the patterns in the training image (TI) and the current iteration of a realization. The resulting iterative CCSIM—the iCCSIM algorithm—utilizes a random path and the error map to identify the locations in the current realization in the iteration process that need further "repairing;" that is, those locations at which discontinuities may still exist. The computational time of the new iterative algorithm is considerably lower than that of an approach in which every cell of the simulation grid is visited in order to repair the discontinuities. Furthermore, several efficient distance functions are introduced by which one extracts effectively key information from the TIs. To increase the quality of the realizations and extract the maximum amount of information from the TIs, the distance functions can be used simultaneously. The performance of the iCCSIM algorithm is studied using very complex 2-D and 3-D examples, including those that are process-based. Comparison is made between the quality and accuracy of the results with those generated by the original CCSIM

  3. UpSet: Visualization of Intersecting Sets

    PubMed Central

    Lex, Alexander; Gehlenborg, Nils; Strobelt, Hendrik; Vuillemot, Romain; Pfister, Hanspeter

    2016-01-01

    Understanding relationships between sets is an important analysis task that has received widespread attention in the visualization community. The major challenge in this context is the combinatorial explosion of the number of set intersections if the number of sets exceeds a trivial threshold. In this paper we introduce UpSet, a novel visualization technique for the quantitative analysis of sets, their intersections, and aggregates of intersections. UpSet is focused on creating task-driven aggregates, communicating the size and properties of aggregates and intersections, and a duality between the visualization of the elements in a dataset and their set membership. UpSet visualizes set intersections in a matrix layout and introduces aggregates based on groupings and queries. The matrix layout enables the effective representation of associated data, such as the number of elements in the aggregates and intersections, as well as additional summary statistics derived from subset or element attributes. Sorting according to various measures enables a task-driven analysis of relevant intersections and aggregates. The elements represented in the sets and their associated attributes are visualized in a separate view. Queries based on containment in specific intersections, aggregates or driven by attribute filters are propagated between both views. We also introduce several advanced visual encodings and interaction methods to overcome the problems of varying scales and to address scalability. UpSet is web-based and open source. We demonstrate its general utility in multiple use cases from various domains. PMID:26356912
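
    UpSet's matrix layout is built on exclusive set intersections: each element belongs to exactly one membership signature. A sketch of computing those signature counts (toy sets, not the UpSet implementation itself):

    ```python
    from itertools import chain

    def exclusive_intersections(named_sets):
        # Assign every element to its membership signature (the exact set of
        # input sets containing it); the per-signature counts are the
        # exclusive intersection sizes an UpSet-style matrix displays.
        sizes = {}
        for elem in set(chain.from_iterable(named_sets.values())):
            sig = frozenset(n for n, s in named_sets.items() if elem in s)
            sizes[sig] = sizes.get(sig, 0) + 1
        return sizes

    named_sets = {"A": {1, 2, 3, 4}, "B": {3, 4, 5}, "C": {4, 5, 6}}
    sizes = exclusive_intersections(named_sets)
    ```

    Because every element lands in exactly one signature, the counts partition the universe, which is what keeps the combinatorial explosion of 2^n possible intersections manageable: only non-empty signatures appear.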

  4. Random Item IRT Models

    ERIC Educational Resources Information Center

    De Boeck, Paul

    2008-01-01

    It is common practice in IRT to consider items as fixed and persons as random. Both continuous and categorical person parameters are most often random variables, whereas for items only continuous parameters are used, and they are commonly of the fixed type, although exceptions occur. It is shown in the present article that random item parameters…

  5. Randomization in robot tasks

    NASA Technical Reports Server (NTRS)

    Erdmann, Michael

    1992-01-01

    This paper investigates the role of randomization in the solution of robot manipulation tasks. One example of randomization is shown by the strategy of shaking a bin holding a part in order to orient the part in a desired stable state with some high probability. Randomization can be useful for mobile robot navigation and as a means of guiding the design process.

  6. Geostatistical Simulation of Hydrofacies Heterogeneity of the West Thessaly Aquifer Systems in Greece

    SciTech Connect

    Modis, K.; Sideri, D.

    2013-06-15

    Integrating geological properties, such as relative positions and proportions of different hydrofacies, is of highest importance in order to render realistic geological patterns. Sequential indicator simulation (SIS) and Plurigaussian simulation (PS) are alternative methods for conceptual and deterministic modeling for the characterization of hydrofacies distribution. In this work, we studied the spatial differentiation of hydrofacies in the alluvial aquifer system of West Thessaly basin in Greece. For this, we applied both SIS and PS techniques to an extensive set of borehole data from that basin. Histograms of model versus experimental hydrofacies proportions and indicative cross sections were plotted in order to validate the results. The PS technique was shown to be more effective in reproducing the spatial characteristics of the different hydrofacies and their distribution across the study area. In addition, the permeability differentiations reflected in the PS model are in accordance with known heterogeneities of the aquifer capacity.

  7. Geostatistical analysis of 3D microCT images of porous media for stochastic upscaling of spatially variable reactive surfaces

    NASA Astrophysics Data System (ADS)

    De Lucia, Marco; Kühn, Michael

    2015-04-01

    The 3D imaging of porous media through micro tomography allows the characterization of porous space and mineral abundances with unprecedented resolution. Such images can be used to perform computational determination of permeability and to obtain a realistic measure of the mineral surfaces exposed to fluid flow and thus to chemical interactions. However, the volume of the plugs that can be analysed with such detail is on the order of 1 cm3, so that their representativity at a larger scale, i.e. as needed for reactive transport modelling at Darcy scale, is questionable at best. In fact, the fine scale heterogeneity (from plug to plug at few cm distance within the same core) would yield substantially different readings of the investigated properties. Therefore, a comprehensive approach including the spatial variability and heterogeneity at the micro- and plug scale needs to be adopted to gain full advantage from the high resolution images in view of the upscaling to Darcy scale. In the framework of the collaborative project H2STORE, micro-CT imaging of different core samples from potential H2-storage sites has been performed by partners at TU Clausthal and Jena University before and after treatment with H2/CO2 mixtures in pressurized autoclaves. We present here the workflow which has been implemented to extract the relevant features from the available data concerning the heterogeneity of the medium at the microscopic and plug scale and to correlate the observed chemical reactions and changes in the porous structure with the geometrical features of the medium. First, a multivariate indicator-based geostatistical model for the microscopic structure of the plugs has been built and fitted to the available images. This involved the implementation of exploratory analysis algorithms such as experimental indicator variograms and cross-variograms. The implemented methods are able to efficiently deal with images on the order of 1000^3 voxels making use of parallelization
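
    The exploratory step described above relies on experimental indicator variograms computed from segmented images. A minimal sketch along one axis of a hypothetical binary (pore/mineral) slice; the image here is spatially uncorrelated random noise, so the variogram sits flat near p(1 - p):

    ```python
    import numpy as np

    def indicator_variogram_1d(img, max_lag):
        # Experimental indicator variogram along the second image axis:
        # gamma(h) = 0.5 * mean((I(x) - I(x + h))^2) over all pixel pairs
        return [0.5 * float(np.mean((img[:, h:] - img[:, :-h]) ** 2))
                for h in range(1, max_lag + 1)]

    # Invented binary slice (pore = 1 / mineral = 0) with porosity ~0.3;
    # uncorrelated, so gamma(h) should hover near 0.3 * 0.7 = 0.21
    rng = np.random.default_rng(1)
    img = (rng.random((64, 64)) < 0.3).astype(float)
    gamma = indicator_variogram_1d(img, 5)
    ```

    On a real micro-CT slice the variogram rises from near zero toward the sill over the characteristic pore-structure length, which is the spatial signature the geostatistical model is fitted to.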

  8. Chapter J: Issues and challenges in the application of geostatistics and spatial-data analysis to the characterization of sand-and-gravel resources

    USGS Publications Warehouse

    Hack, Daniel R.

    2005-01-01

    Sand-and-gravel (aggregate) resources are a critical component of the Nation's infrastructure, yet aggregate-mining technologies lag far behind those of metalliferous mining and other sectors. Deposit-evaluation and site-characterization methodologies are antiquated, and few serious studies of the potential applications of spatial-data analysis and geostatistics have been published. However, because of commodity usage and the necessary proximity of a mine to end use, aggregate-resource exploration and evaluation differ fundamentally from comparable activities for metalliferous ores. Acceptable practices, therefore, can reflect this cruder scale. The increasing use of computer technologies is colliding with the need for sand-and-gravel mines to modernize and improve their overall efficiency of exploration, mine planning, scheduling, automation, and other operations. The emergence of megaquarries in the 21st century will also be a contributing factor. Preliminary research into the practical applications of exploratory-data analysis (EDA) have been promising. For example, EDA was used to develop a linear-regression equation to forecast freeze-thaw durability from absorption values for Lower Paleozoic carbonate rocks mined for crushed aggregate from quarries in Oklahoma. Applications of EDA within a spatial context, a method of spatial-data analysis, have also been promising, as with the investigation of undeveloped sand-and-gravel resources in the sedimentary deposits of Pleistocene Lake Bonneville, Utah. Formal geostatistical investigations of sand-and-gravel deposits are quite rare, and the primary focus of those studies that have been completed is on the spatial characterization of deposit thickness and its subsequent effect on ore reserves. A thorough investigation of a gravel deposit in an active aggregate-mining area in central Essex, U.K., emphasized the problems inherent in the geostatistical characterization of particle-size-analysis data. 
Beyond such factors

  9. Carbon Tetrachloride Emissions from the US during 2008 - 2012 Derived from Atmospheric Data Using Bayesian and Geostatistical Inversions

    NASA Astrophysics Data System (ADS)

    Hu, L.; Montzka, S. A.; Miller, B.; Andrews, A. E.; Miller, J. B.; Lehman, S.; Sweeney, C.; Miller, S. M.; Thoning, K. W.; Siso, C.; Atlas, E. L.; Blake, D. R.; De Gouw, J. A.; Gilman, J.; Dutton, G. S.; Elkins, J. W.; Hall, B. D.; Chen, H.; Fischer, M. L.; Mountain, M. E.; Nehrkorn, T.; Biraud, S.; Tans, P. P.

    2015-12-01

    Global atmospheric observations suggest substantial ongoing emissions of carbon tetrachloride (CCl4) despite a 100% phase-out of production for dispersive uses since 1996 in developed countries and 2010 in other countries. Little progress has been made in understanding the causes of these ongoing emissions or identifying their contributing sources. In this study, we employed multiple inverse modeling techniques (i.e. Bayesian and geostatistical inversions) to assimilate CCl4 mole fractions observed from the National Oceanic and Atmospheric Administration (NOAA) flask-air sampling network over the US, and quantify its national and regional emissions during 2008 - 2012. Average national total emissions of CCl4 between 2008 and 2012 determined from these observations and an ensemble of inversions range between 2.1 and 6.1 Gg yr-1. This emission is substantially larger than the mean of 0.06 Gg yr-1 reported to the US EPA Toxics Release Inventory over these years, suggesting that under-reported emissions or non-reporting sources make up the bulk of CCl4 emissions from the US. While the inventory does not account for the magnitude of observationally derived CCl4 emissions, the regional distribution of derived and inventory emissions is similar. Furthermore, when considered relative to the distribution of uncapped landfills or population, the variability in measured mole fractions was most consistent with the distribution of industrial sources (i.e., those from the Toxics Release Inventory). Our results suggest that emissions from the US account for only a small fraction of the global ongoing emissions of CCl4 (30 - 80 Gg yr-1 over this period). Finally, to ascertain the importance of the US emissions relative to the unaccounted global emission rate, we considered multiple approaches to extrapolate our results to other countries and the globe.
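The Bayesian inversion idea described above can be sketched in miniature: given a transport-model sensitivity ("footprint") matrix, prior emissions, and observed mole-fraction enhancements, the posterior emissions minimize a weighted observation-misfit-plus-prior-departure cost. The following toy illustration uses two source regions, three observation sites, and entirely made-up numbers; it is not the study's actual setup.

```python
# Toy Bayesian (regularized least-squares) inversion for regional emissions.
# All numbers are illustrative, not from the study.

def solve2(a, b):
    """Solve a 2x2 linear system a @ x = b by Cramer's rule."""
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    return [(b[0] * a[1][1] - b[1] * a[0][1]) / det,
            (b[1] * a[0][0] - b[0] * a[1][0]) / det]

# H[i][j]: sensitivity of observation i to emissions from region j.
H = [[0.8, 0.1],
     [0.3, 0.6],
     [0.1, 0.9]]
y = [2.0, 2.1, 2.4]        # observed mole-fraction enhancements
s_prior = [1.0, 1.0]       # prior (e.g. inventory-based) emissions
sigma_obs = 0.1            # observation error (assumed uncorrelated)
sigma_pr = 1.0             # prior uncertainty

# The posterior mean minimizes ||y - H s||^2/so2 + ||s - s_prior||^2/sp2,
# i.e. solves (H^T H / so2 + I / sp2) s = H^T y / so2 + s_prior / sp2.
so2, sp2 = sigma_obs ** 2, sigma_pr ** 2
A = [[sum(H[i][r] * H[i][c] for i in range(3)) / so2 + (1 / sp2 if r == c else 0.0)
      for c in range(2)] for r in range(2)]
b = [sum(H[i][r] * y[i] for i in range(3)) / so2 + s_prior[r] / sp2
     for r in range(2)]
s_post = solve2(A, b)
print([round(v, 2) for v in s_post])
```

The geostatistical inversion variant replaces the explicit prior vector with a trend model plus a spatial covariance of the emission deviations, but the linear-algebra skeleton is the same.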

  10. Evaluation of geostatistical estimators and their applicability to characterise the spatial patterns of recreational fishing catch rates

    PubMed Central

    Aidoo, Eric N.; Mueller, Ute; Goovaerts, Pierre; Hyndes, Glenn A.

    2015-01-01

    Western Australians are heavily engaged in recreational fishing activities, with a participation rate of approximately 30%. An accurate estimation of the spatial distribution of recreational catch per unit effort (catch rates) is an integral component for monitoring fish population changes and for developing strategies for ecosystem-based marine management. Geostatistical techniques such as kriging can provide useful tools for characterising the spatial distributions of recreational catch rates. However, most recreational fishery data are highly skewed, zero-inflated and, when expressed as ratios, impacted by the small number problem, which can influence the estimates obtained from traditional kriging. The applicability of ordinary, indicator and Poisson kriging to recreational catch rate data was evaluated for three aquatic species with different behaviours and distribution patterns. The prediction performance of each estimator was assessed based on cross-validation. For all three species, the accuracy plot of the indicator kriging (IK) showed a better agreement between expected and empirical proportions of catch rate data falling within probability intervals of increasing size, as measured by the goodness statistic. Indicator kriging was also found to be better at predicting the latent catch rate for the three species than ordinary and Poisson kriging. For each species, the spatial maps from the three estimators displayed similar patterns, but Poisson kriging produced smoother spatial distributions. We show that the IK estimator may be preferable for the spatial modelling of catch rate data exhibiting these characteristics, and has the best prediction performance regardless of the life history and distribution patterns of those three species. PMID:26120221
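As a rough illustration of the indicator approach that underpins indicator kriging, skewed and zero-inflated catch rates can be coded as 0/1 threshold non-exceedances; kriging those indicators then yields, at each location, estimates of the conditional probability of not exceeding each threshold (a conditional cdf). The catch rates and thresholds below are invented for illustration, and the probability estimate shown is a naive unweighted mean standing in for the kriging step.

```python
# Indicator transform of skewed, zero-inflated catch rates (illustrative data).

catch_rates = [0.0, 0.0, 0.4, 1.2, 0.0, 3.5, 0.8, 0.0, 7.9, 2.2]
thresholds = [0.0, 1.0, 5.0]

def indicator(z, zk):
    # i(u; zk) = 1 if z(u) <= zk else 0
    return 1 if z <= zk else 0

indicators = {zk: [indicator(z, zk) for z in catch_rates] for zk in thresholds}

# Naive (unweighted) estimate of F(zk) = Prob[Z <= zk]; in IK proper, these
# equal weights are replaced by kriging weights from the indicator variograms.
for zk in thresholds:
    p = sum(indicators[zk]) / len(catch_rates)
    print(zk, p)
```

Note that the zero-inflation is handled naturally: the indicator at threshold 0.0 directly estimates the probability of a zero catch.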

  11. Geostatistics-based groundwater-level monitoring network design and its application to the Upper Floridan aquifer, USA.

    PubMed

    Bhat, Shirish; Motz, Louis H; Pathak, Chandra; Kuebler, Laura

    2015-01-01

    A geostatistical method was applied to optimize an existing groundwater-level monitoring network in the Upper Floridan aquifer for the South Florida Water Management District in the southeastern United States. Analyses were performed to determine suitable numbers and locations of monitoring wells that will provide equivalent or better quality groundwater-level data compared to an existing monitoring network. Ambient, unadjusted groundwater heads were expressed as salinity-adjusted heads based on the density of freshwater, well screen elevations, and temperature-dependent saline groundwater density. The optimization of the numbers and locations of monitoring wells is based on a pre-defined groundwater-level prediction error. The newly developed network combines an existing network with the addition of new wells that will result in a spatial distribution of groundwater monitoring wells that better defines the regional potentiometric surface of the Upper Floridan aquifer in the study area. The network yields groundwater-level predictions that differ significantly from those produced using the existing network. The newly designed network will reduce the mean prediction standard error by 43% compared to the existing network. The adoption of a hexagonal grid network for the South Florida Water Management District is recommended to achieve both a uniform level of information about groundwater levels and the minimum required accuracy. It is customary to install more monitoring wells for observing groundwater levels and groundwater quality as groundwater development progresses. However, budget constraints often force water managers to implement cost-effective monitoring networks. In this regard, this study provides guidelines to water managers concerned with groundwater planning and monitoring. PMID:25433546
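The salinity adjustment mentioned above typically converts measured (point-water) heads to equivalent freshwater heads using the well screen elevation and the local groundwater density. A minimal sketch, assuming the classic point-water-to-freshwater-head conversion and illustrative densities (not the study's data):

```python
# Point-water to freshwater (salinity-adjusted) head conversion.
# h_f = z + (rho / rho_f) * (h_p - z), where z is the well screen elevation,
# h_p the measured head, rho the (possibly saline) groundwater density.

RHO_FRESH = 1000.0  # kg/m^3

def freshwater_head(measured_head, screen_elev, water_density):
    """Return the equivalent freshwater head (meters)."""
    return screen_elev + (water_density / RHO_FRESH) * (measured_head - screen_elev)

# A brackish well: measured head 10 m, screen at -50 m, density 1010 kg/m^3.
h_f = freshwater_head(10.0, -50.0, 1010.0)
print(round(h_f, 2))  # → 10.6
```

For fresh water the adjustment is the identity, so only wells in the saline parts of the aquifer see a corrected head.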

  12. Integrating indicator-based geostatistical estimation and aquifer vulnerability of nitrate-N for establishing groundwater protection zones

    NASA Astrophysics Data System (ADS)

    Jang, Cheng-Shin; Chen, Shih-Kai

    2015-04-01

    Groundwater nitrate-N contamination occurs frequently in agricultural regions, primarily resulting from surface agricultural activities. The focus of this study is to establish groundwater protection zones based on indicator-based geostatistical estimation and aquifer vulnerability of nitrate-N in the Choushui River alluvial fan in Taiwan. The groundwater protection zones are determined by univariate indicator kriging (IK) estimation, aquifer vulnerability assessment using logistic regression (LR), and integration of the IK estimation and aquifer vulnerability using simple IK with local prior means (sIKlpm). First, according to the statistical significance of source, transport, and attenuation factors dominating the occurrence of nitrate-N pollution, a LR model was adopted to evaluate aquifer vulnerability and to characterize occurrence probability of nitrate-N exceeding 0.5 mg/L. Moreover, the probabilities estimated using LR were regarded as local prior means. IK was then used to estimate the actual extent of nitrate-N pollution. The integration of the IK estimation and aquifer vulnerability was obtained using sIKlpm. Finally, groundwater protection zones were probabilistically determined using the three aforementioned methods, and the estimated accuracy of the delineated groundwater protection zones was gauged using a cross-validation procedure based on observed nitrate-N data. The results reveal that the integration of the IK estimation and aquifer vulnerability using sIKlpm is more robust than univariate IK estimation and aquifer vulnerability assessment using LR for establishing groundwater protection zones. Rigorous management practices for fertilizer use should be implemented in orchards situated in the determined groundwater protection zones.
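The sIKlpm estimator described above can be sketched as simple kriging of indicator residuals around the logistic-regression probabilities, which act as locally varying prior means. The covariance model, well data, distances, and probabilities below are all invented placeholders, not the study's values.

```python
# Simple indicator kriging with local prior means (sketch, illustrative data).

import math

def cov(h, sill=0.25, rng=2000.0):
    """Exponential covariance of the indicator residuals (assumed model)."""
    return sill * math.exp(-3.0 * h / rng)

def solve2(a, b):
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    return [(b[0] * a[1][1] - b[1] * a[0][1]) / det,
            (b[1] * a[0][0] - b[0] * a[1][0]) / det]

# Two wells: indicator i = 1 if nitrate-N exceeds 0.5 mg/L, with an LR-derived
# prior probability p at each; dist = distance (m) to the target location.
wells = [
    {"dist": 500.0,  "i": 1, "p": 0.6},
    {"dist": 1200.0, "i": 0, "p": 0.4},
]
dist_between = 1000.0          # distance between the two wells
p_target = 0.5                 # LR prior mean at the target location

A = [[cov(0.0),          cov(dist_between)],
     [cov(dist_between), cov(0.0)]]
b = [cov(w["dist"]) for w in wells]
lam = solve2(A, b)             # simple-kriging weights

# Estimate = local prior mean + weighted sum of indicator residuals.
prob = p_target + sum(l * (w["i"] - w["p"]) for l, w in zip(lam, wells))
prob = min(1.0, max(0.0, prob))  # clamp to a valid probability
print(round(prob, 3))
```

The closer well receives the larger weight, so the estimate is pulled toward its observed exceedance while still anchored to the vulnerability-model prior.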

  13. Spatial Distribution of Soil Organic Carbon and Total Nitrogen Based on GIS and Geostatistics in a Small Watershed in a Hilly Area of Northern China

    PubMed Central

    Peng, Gao; Bing, Wang; Guangpo, Geng; Guangcan, Zhang

    2013-01-01

    The spatial variability of soil organic carbon (SOC) and total nitrogen (STN) levels is important in both global carbon-nitrogen cycle and climate change research. There has been little research on the spatial distribution of SOC and STN at the watershed scale based on geographic information systems (GIS) and geostatistics. Ninety-seven soil samples taken at depths of 0–20 cm were collected during October 2010 and 2011 from the Matiyu small watershed (4.2 km2) of a hilly area in Shandong Province, northern China. The impacts of different land use types, elevation, vegetation coverage and other factors on SOC and STN spatial distributions were examined using GIS and a geostatistical method, regression-kriging. The results show that the SOC and STN concentrations in the Matiyu small watershed exhibited moderate variability, based on the mean, median, minimum, maximum, and coefficients of variation (CV). Residual values of SOC and STN had moderate spatial autocorrelations, and the nugget/sill ratios were 0.2% and 0.1%, respectively. Distribution maps from regression-kriging revealed that both SOC and STN concentrations in the Matiyu watershed decreased from southeast to northwest. This result was similar to the watershed DEM trend and significantly correlated with land use type, elevation and aspect. SOC and STN predictions with the regression-kriging method were more accurate than those obtained using ordinary kriging. This research indicates that the geostatistical characteristics of SOC and STN concentrations in the watershed were closely related to both land-use type and spatial topographic structure, and that regression-kriging is suitable for investigating the spatial distributions of SOC and STN in the complex topography of the watershed. PMID:24391791
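Regression-kriging as used above combines a regression trend on covariates (land use, elevation, aspect) with an interpolation of the trend residuals. In this toy one-dimensional sketch, the residual kriging step is replaced by inverse-distance weighting for brevity, and all data are invented.

```python
# Regression-kriging sketch: linear trend on a covariate plus interpolated
# residuals. IDW stands in for residual kriging; data are illustrative.

xs   = [0.0, 1.0, 2.0, 3.0, 4.0]      # sample locations (1-D for simplicity)
elev = [100, 120, 140, 160, 180]      # covariate (elevation) at the samples
soc  = [12.0, 11.0, 9.5, 8.0, 7.5]    # observed SOC (g/kg)

# 1) Ordinary least squares: soc ~ a + b * elev
n = len(xs)
me, ms = sum(elev) / n, sum(soc) / n
b = (sum((e - me) * (s - ms) for e, s in zip(elev, soc))
     / sum((e - me) ** 2 for e in elev))
a = ms - b * me
resid = [s - (a + b * e) for e, s in zip(elev, soc)]

# 2) Trend prediction plus interpolated residual.
def predict(x, e, power=2.0):
    """SOC at location x with covariate value e."""
    w = [1.0 / abs(x - xi) ** power if x != xi else None for xi in xs]
    if None in w:                      # exactly at a data location
        r = resid[w.index(None)]
    else:
        r = sum(wi * ri for wi, ri in zip(w, resid)) / sum(w)
    return a + b * e + r

print(round(predict(1.5, 130), 2))
```

At sampled locations the predictor honors the data exactly, because the interpolated residual cancels the trend error there.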

  14. Models of random graph hierarchies

    NASA Astrophysics Data System (ADS)

    Paluch, Robert; Suchecki, Krzysztof; Hołyst, Janusz A.

    2015-10-01

    We introduce two models of inclusion hierarchies: random graph hierarchy (RGH) and limited random graph hierarchy (LRGH). In both models a set of nodes at a given hierarchy level is connected randomly, as in the Erdős-Rényi random graph, with a fixed average degree equal to a system parameter c. Clusters of the resulting network are treated as nodes at the next hierarchy level and they are connected again at this level and so on, until the process cannot continue. In the RGH model we use all clusters, including those of size 1, when building the next hierarchy level, while in the LRGH model clusters of size 1 stop participating in further steps. We find that in both models the number of nodes at a given hierarchy level h decreases approximately exponentially with h. The height of the hierarchy H, i.e. the number of all hierarchy levels, increases logarithmically with the system size N, i.e. with the number of nodes at the first level. The height H decreases monotonically with the connectivity parameter c in the RGH model and it reaches a maximum for a certain c_max in the LRGH model. The distribution of separate cluster sizes in the LRGH model is a power law with an exponent of about -1.25. The above results follow from approximate analytical calculations and have been confirmed by numerical simulations.
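The RGH construction lends itself to a compact simulation: wire n nodes as an Erdős-Rényi graph with mean degree c, collapse the connected components (all clusters, including singletons) into the nodes of the next level, and repeat until no merging occurs. A minimal sketch, not the authors' code:

```python
# Toy simulation of the random graph hierarchy (RGH) model.

import random

def components(n, c, rng):
    """Number of connected components of G(n, p) with p = c / (n - 1)."""
    p = min(1.0, c / (n - 1)) if n > 1 else 0.0
    parent = list(range(n))
    def find(i):                      # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                parent[find(i)] = find(j)
    roots = {find(i) for i in range(n)}
    return len(roots)

def hierarchy_height(n, c, seed=0):
    """Number of levels built before the process cannot continue."""
    rng = random.Random(seed)
    h = 0
    while n > 1:
        m = components(n, c, rng)
        if m == n:                    # no clusters merged; stop
            break
        n, h = m, h + 1
    return h

print(hierarchy_height(1000, 1.5))
```

With c well above 1 the giant component collapses each level quickly, which is consistent with the logarithmic growth of H in N reported above.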

  15. Mapping soil organic carbon stocks by robust geostatistical and boosted regression models

    NASA Astrophysics Data System (ADS)

    Nussbaum, Madlene; Papritz, Andreas; Baltensweiler, Andri; Walthert, Lorenz

    2013-04-01

    Carbon (C) sequestration in forests offsets greenhouse gas emissions. Therefore, quantifying C stocks and fluxes in forest ecosystems is of interest for greenhouse gas reporting according to the Kyoto protocol. In Switzerland, the National Forest Inventory offers comprehensive data to quantify the aboveground forest biomass and its change in time. Estimating stocks of soil organic C (SOC) in forests is more difficult because the variables needed to quantify stocks vary strongly in space and precise quantification of some of them is very costly. Based on data from 1'033 plots we modeled SOC stocks of the organic layer and the mineral soil to depths of 30 cm and 100 cm for the Swiss forested area. For the statistical modeling a broad range of covariates were available: climate data (e. g. precipitation, temperature), two elevation models (resolutions 25 and 2 m) with respective terrain attributes, and spectral reflectance data representing vegetation. Furthermore, the main mapping units of an overview soil map and a coarse scale geological map were used to coarsely represent the parent material of the soils. The selection of important covariates for SOC stocks modeling out of a large set was a major challenge for the statistical modeling. We used two approaches to deal with this problem: 1) A robust restricted maximum likelihood method to fit a linear regression model with spatially correlated errors. The large number of covariates was first reduced by LASSO (Least Absolute Shrinkage and Selection Operator) and then further narrowed down to a parsimonious set of important covariates by cross-validation of the robustly fitted model. To account for nonlinear dependencies of the response on the covariates, interaction terms of the latter were included in the model if this improved the fit. 2) A boosted structured regression model with componentwise linear least squares or componentwise smoothing splines as base procedures.
The selection of important covariates was done by the
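The LASSO screening step described for approach 1) can be sketched with cyclic coordinate descent and soft-thresholding. The toy data below are invented; the second covariate is pure noise and is shrunk exactly to zero, which is the covariate-selection effect the abstract relies on.

```python
# LASSO via cyclic coordinate descent with soft-thresholding (toy sketch).

def soft(z, g):
    """Soft-thresholding operator."""
    return (z - g) if z > g else (z + g) if z < -g else 0.0

def lasso(X, y, lam, iters=200):
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # Correlation of covariate j with the partial residual
            # (response minus the fit from all other covariates).
            r = sum((y[i] - sum(X[i][k] * beta[k] for k in range(p) if k != j))
                    * X[i][j] for i in range(n))
            denom = sum(X[i][j] ** 2 for i in range(n)) / n
            beta[j] = soft(r / n, lam) / denom
    return beta

# y depends (roughly y = 2 * x1) on the first covariate only; the second is noise.
X = [[1.0, 0.3], [2.0, -0.1], [3.0, 0.2], [4.0, -0.4], [5.0, 0.1]]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
beta = lasso(X, y, lam=0.5)
print([round(b1, 2) for b1 in beta])
```

In the study this screening would be followed by cross-validation of the robustly fitted spatial model to prune the surviving covariates further.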

  16. Quantum random number generation

    SciTech Connect

    Ma, Xiongfeng; Yuan, Xiao; Cao, Zhu; Zhang, Zhen; Qi, Bing

    2016-01-01

    Here, quantum physics can be exploited to generate true random numbers, which play important roles in many applications, especially in cryptography. Genuine randomness from the measurement of a quantum system reveals the inherent nature of quantumness — coherence, an important feature that differentiates quantum mechanics from classical physics. The generation of genuine randomness is generally considered impossible with only classical means. Based on the degree of trustworthiness on devices, quantum random number generators (QRNGs) can be grouped into three categories. The first category, practical QRNG, is built on fully trusted and calibrated devices and typically can generate randomness at a high speed by properly modeling the devices. The second category is self-testing QRNG, where verifiable randomness can be generated without trusting the actual implementation. The third category, semi-self-testing QRNG, is an intermediate category which provides a tradeoff between the trustworthiness on the device and the random number generation speed.

  17. Quantum random number generation

    DOE PAGESBeta

    Ma, Xiongfeng; Yuan, Xiao; Cao, Zhu; Zhang, Zhen; Qi, Bing

    2016-06-28

    Here, quantum physics can be exploited to generate true random numbers, which play important roles in many applications, especially in cryptography. Genuine randomness from the measurement of a quantum system reveals the inherent nature of quantumness -- coherence, an important feature that differentiates quantum mechanics from classical physics. The generation of genuine randomness is generally considered impossible with only classical means. Based on the degree of trustworthiness on devices, quantum random number generators (QRNGs) can be grouped into three categories. The first category, practical QRNG, is built on fully trusted and calibrated devices and typically can generate randomness at a high speed by properly modeling the devices. The second category is self-testing QRNG, where verifiable randomness can be generated without trusting the actual implementation. The third category, semi-self-testing QRNG, is an intermediate category which provides a tradeoff between the trustworthiness on the device and the random number generation speed.

  18. Geo-Statistics and its Application for Creating Iso-Maps in Hydro-Geology Electric Conductivity contours of Atrak sub basin, Iran

    NASA Astrophysics Data System (ADS)

    Allahdadi, M.; Partani, S.; Ahmadi, M.

    2012-12-01

    Taking discrete samples from a water resource, measuring quantity and quality parameters, and converting the discrete points into an integrated surface allows surface variations to be analysed. Several procedures exist for converting scattered discrete points into an integrated surface, including the geostatistical methods of kriging, inverse distance weighting (IDW), radial basis functions (RBF), local polynomial interpolation (LPI), global polynomial interpolation (GPI), and co-kriging. This research explains applications of geostatistics for creating iso-maps such as groundwater-table contours, iso-quality contours (for example, EC and pH), and the spatial variation of many other hydro-geological parameters. Statistical indices such as the mean absolute error (MAE), mean absolute relative error (MARE), root mean square error (RMSE), MBE, and the coefficient of correlation were employed to select the best geostatistical model in a GIS framework. The power parameter of the IDW method was optimized, and the best geostatistical method for predicting the electric conductivity (EC) of groundwater in the Atrak plain was identified.

    Table No. 1: Comparison of statistical indices for best-estimator selection
    G.S. Method          r      RMSE   MBE    MAE
    Kriging              0.81   2200    485   1740
    RBF                  0.79   2263    427   1662
    GPI                  0.63   2981    347   2031
    LPI                  0.64   2942    349   2002
    Power-optimized IDW  0.459  4816   -183   2845

    The MAE measures the absolute estimation error; with regard to Table No. 1, the kriging and RBF models give the best estimates, having the smallest absolute estimation errors. As one of the results, Figure No. 2 shows the iso-electric-conductivity map created by the kriging method in a GIS framework.
    Figure No. 1: Power optimization in the IDW method and the minimum of RMSE.
    Figure No. 2: Iso-electric-conductivity map created by the kriging method in a GIS framework.
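The IDW power optimization reported above can be reproduced in outline by choosing the exponent that minimizes the leave-one-out cross-validation RMSE. The coordinates and EC values below are illustrative, not the Atrak plain data.

```python
# IDW interpolation with leave-one-out optimization of the power exponent.

def idw(x, y, pts, power):
    """Inverse-distance-weighted estimate at (x, y) from (xi, yi, zi) points."""
    num = den = 0.0
    for xi, yi, zi in pts:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return zi                  # exact data location
        w = d2 ** (-power / 2.0)
        num += w * zi
        den += w
    return num / den

def loo_rmse(pts, power):
    """Leave-one-out cross-validation RMSE for a given power."""
    err2 = 0.0
    for k, (x, y, z) in enumerate(pts):
        others = pts[:k] + pts[k + 1:]
        err2 += (idw(x, y, others, power) - z) ** 2
    return (err2 / len(pts)) ** 0.5

# Illustrative EC samples: (x, y, EC in microsiemens/cm).
pts = [(0, 0, 900), (1, 0, 1100), (0, 1, 1000), (2, 1, 1500),
       (1, 2, 1300), (3, 0, 1700), (2, 3, 1600), (3, 3, 1900)]

# Scan candidate powers 0.5, 1.0, ..., 4.0 and keep the RMSE-minimizing one.
best = min((loo_rmse(pts, p / 2), p / 2) for p in range(1, 9))
print("optimal power:", best[1], "RMSE:", round(best[0], 1))
```

The same leave-one-out machinery yields the MAE, MBE, and correlation indices used in the comparison table, simply by accumulating different error statistics.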

  19. Blocked randomization with randomly selected block sizes.

    PubMed

    Efird, Jimmy

    2011-01-01

    When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes. PMID:21318011
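The scheme described above is straightforward to implement: draw a block size at random, fill the block with a balanced, shuffled mix of the two arms, and repeat until enough assignments exist. A minimal sketch for a 1:1 two-arm trial:

```python
# Blocked randomization with randomly selected block sizes.

import random

def blocked_randomization(n, block_sizes=(2, 4, 6), seed=None):
    """Return a 1:1 allocation list of 'A'/'B' of length n.

    Block sizes must be even for a 1:1 ratio. Truncating at n may leave a
    small imbalance of at most half the final block's size.
    """
    rng = random.Random(seed)
    seq = []
    while len(seq) < n:
        size = rng.choice(block_sizes)           # random block size
        block = ['A'] * (size // 2) + ['B'] * (size // 2)
        rng.shuffle(block)                       # balanced within each block
        seq.extend(block)
    return seq[:n]

alloc = blocked_randomization(24, seed=42)
print(alloc.count('A'), alloc.count('B'))
```

Because the block length varies, an unblinded investigator cannot deduce the final assignment of a block from the preceding ones, which is the selection-bias protection the paper advocates.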

  20. An innovative geostatistical approach to oil volumetric calculations: Rock Creek Field, West Virginia

    SciTech Connect

    McDowell, R.R.; Matchen, D.L.; Hohn, M.E.; Vargo, A.G. )

    1994-08-01

    Detailed analysis of production trends in heterogeneous reservoirs requires local estimates of production, original oil in place (OOIP), and recovery efficiency. In older fields, calculating these values is hampered by incomplete well records, inconsistent reporting of production (well by well vs. lease by lease), unknown effective drainage radius, and poorly constrained completion interval. Accepted methods of estimation rely heavily on the use of average values for reservoir properties. The authors have developed an alternative to the use of average values for calculating local and field-wide estimates, and have compared their results to published values. The study of the Lower Mississippian Big Injun sandstone reservoir in Rock Creek field, central West Virginia, used production data obtained from operators. Production for the first 10 yr was reconstructed, when necessary, by comparison to decline curves for 70 wells with complete production records. Similarly, curve-fitting techniques were used to interpolate data for missing years. Cumulative production values for 667 producing wells were kriged over the extent of the field; the resulting grid was sampled to provide an estimate of cumulative production at each well location. Kriged estimates of pay thickness, porosity, and water saturation were used to calculate OOIP and recovery efficiency (cumulative production/OOIP), but not the geographic distribution of these two parameters. An optimal radius of 270 ft gave recovery efficiencies ranging between 18.75% and 21.9%, comparing favorably with a published value of 22.3%. Summing the OOIP value for all producing wells in the field yields a value of 139.6 million bbl, significantly higher than the published value of 37.8 million bbl. The estimate reflects a more complete data set and revised values for reservoir parameters. Discussions with the principal operator in the field suggest that the higher figure is more correct.
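The volumetric calculation underlying these estimates follows the standard oil-field formula OOIP = 7758 * A * h * phi * (1 - Sw) / Bo applied to each well's drainage area. The reservoir values below (pay, porosity, saturation, formation volume factor, cumulative production) are illustrative placeholders, not the Rock Creek figures; only the 270-ft drainage radius comes from the abstract.

```python
# Volumetric OOIP and recovery efficiency for one well drainage area.

import math

BBL_PER_ACRE_FT = 7758.0   # barrels of oil per acre-foot of pore space

def ooip_bbl(radius_ft, pay_ft, porosity, water_sat, bo=1.1):
    """OOIP = 7758 * A(acres) * h * phi * (1 - Sw) / Bo."""
    area_acres = math.pi * radius_ft ** 2 / 43560.0
    return (BBL_PER_ACRE_FT * area_acres * pay_ft * porosity
            * (1.0 - water_sat) / bo)

# A well with the 270-ft optimal drainage radius cited above, and assumed
# 20 ft of pay, 15% porosity, 35% water saturation:
ooip = ooip_bbl(270.0, 20.0, 0.15, 0.35)
cum_production = 15000.0                      # assumed bbl produced
recovery_efficiency = cum_production / ooip   # cumulative production / OOIP
print(round(ooip), round(100 * recovery_efficiency, 1))
```

In the study the per-well inputs (pay thickness, porosity, water saturation, cumulative production) come from kriged grids rather than assumed constants, and the field-wide OOIP is the sum over all producing wells.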

  1. Heavy metals relationship in arable and greenhouse soils of SE Spain using a geostatistical analysis

    NASA Astrophysics Data System (ADS)

    Gil, Carlos; Joaquin Ramos-Miras, Jose; Rodríguez Martín, Jose Antonio; Boluda, Rafael; Roca, Núria; Bech, Jaume

    2013-04-01

    This study compares heavy metals contents and the main edaphic parameters in greenhouse soils from the W Almería region, one of the most productive agricultural systems in Europe, with agricultural soils (arable soils) in western Andalusia, SW Spain. Heavy metal inputs in agricultural soils mainly occur through pesticides and phytosanitary control products. The hazardousness of the studied elements (Cr, Ni, Pb, Cu, Zn and Cd) is particularly relevant in soils used for intensive greenhouse farming, where such agricultural practices, which centre on maximising production, end up with products that finally enter the human food chain directly. Here we explore a total of 199 greenhouse soils and 142 arable soils, representing two scales of variation in this Mediterranean area. Despite their similar edaphic characteristics, the main differences between arable soils and greenhouse soils lie in nutrients contents (P and K) and in certain heavy metals (Cd, Pb and Zn), which reflect widespread use of pesticides in greenhouse farming. Cd, given its mobility, is one of the most toxic metals; its concentration triples in greenhouse soils, although it does not exceed the limits set by Spanish legislation. We conclude that, despite anthropic heavy metal inputs, the association patterns of these elements were similar on the two spatial variability scales. Cd, Pb and Zn contents, and partly those of Cu, are related with agricultural practices. On the short spatial scale, grouping these heavy metals shows very high contents in greenhouse soils in the central northern area of the W Almería region. On the other hand, the associations of Cr and Ni suggest a lithogenic influence combined with a pedogenic effect on spatial maps. This natural-origin input becomes more marked on the long spatial scale (arable soils), where the main Cr and Ni contents are found in the vicinity of the Gádor Mountain Range.

  2. Component Evolution in General Random Intersection Graphs