Science.gov

Sample records for geostatistics random sets

  1. Fast Geostatistical Inversion using Randomized Matrix Decompositions and Sketchings for Heterogeneous Aquifer Characterization

    NASA Astrophysics Data System (ADS)

    O'Malley, D.; Le, E. B.; Vesselinov, V. V.

    2015-12-01

    We present a fast, scalable, and easily implementable stochastic inverse method for characterization of aquifer heterogeneity. The method utilizes recent advances in randomized matrix algebra and exploits the structure of the Quasi-Linear Geostatistical Approach (QLGA), without requiring a structured grid like Fast-Fourier Transform (FFT) methods. The QLGA framework is a more stable version of Gauss-Newton iterates for a large number of unknown model parameters, and provides unbiased estimates. The methods are matrix-free and do not require derivatives or adjoints, and are thus ideal for complex models and black-box implementation. We also incorporate randomized least-squares solvers and data-reduction methods, which speed up computation and simulate missing data points. The new inverse methodology is coded in Julia and implemented in the MADS computational framework (http://mads.lanl.gov). Julia is an advanced high-level scientific programming language that allows for efficient memory management and utilization of high-performance computational resources. Inversion results based on a series of synthetic problems with steady-state and transient calibration data are presented.
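    The record gives no code; as a rough Python sketch of the randomized-sketching idea it builds on (a Gaussian range-finder in the style of randomized SVD, not the actual MADS/Julia implementation), a low-rank matrix can be factorized from a handful of random matrix products:

```python
import numpy as np

def randomized_svd(A, rank, n_oversample=10, rng=None):
    """Randomized range-finder + SVD via Gaussian sketching."""
    rng = np.random.default_rng(rng)
    m, n = A.shape
    Omega = rng.standard_normal((n, rank + n_oversample))  # Gaussian test matrix
    Y = A @ Omega                       # sample the range of A
    Q, _ = np.linalg.qr(Y)              # orthonormal basis for that range
    B = Q.T @ A                         # small projected problem
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return Q @ Ub[:, :rank], s[:rank], Vt[:rank]

# A rank-5 covariance-like matrix is recovered essentially exactly
rng = np.random.default_rng(0)
L = rng.standard_normal((200, 5))
A = L @ L.T
U, s, Vt = randomized_svd(A, rank=5, rng=1)
assert np.allclose(A, (U * s) @ Vt, atol=1e-6)
```

    Because the sketch touches A only through products like A @ Omega, the same idea applies matrix-free when A is available only as a black-box operator, which is the setting the abstract describes.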

  2. Distribution of random Cantor sets on tubes

    NASA Astrophysics Data System (ADS)

    Chen, Changhao

    2016-04-01

    We show that there exist (d−1)-Ahlfors regular compact sets E ⊂ ℝ^d, d ≥ 2, such that for any t < d−1 we have sup_T H^{d−1}(E ∩ T)/w(T)^t < ∞, where the supremum is over all tubes T with width w(T) > 0. This settles a question of T. Orponen. The sets we construct are random Cantor sets, and the method combines geometric and probabilistic estimates on the intersections of these random Cantor sets with affine subspaces.

  3. Scaling random walks on arbitrary sets

    NASA Astrophysics Data System (ADS)

    Harris, Simon C.; Williams, David; Sibson, Robin

    1999-01-01

    Let I be a countably infinite set of points in ℝ which we can write as I = {u_i : i ∈ ℤ}, with u_i < u_{i+1}. A simple random walk, when repeatedly rescaled suitably in space and time, looks more and more like a Brownian motion. In this paper we explore the convergence properties of the Markov chain Y on the set I under suitable space-time scalings. Later, we consider some cases when the set I consists of the points of a renewal process and the jump rates assigned to each state in I are perhaps also randomly chosen. This work sprang from a question asked by one of us (Sibson) about ‘driftless nearest-neighbour’ Markov chains on countable subsets I of ℝ^d, work of Sibson [7] and of Christ, Friedberg and Lee [2] having identified examples of such chains in terms of the Dirichlet tessellation associated with I. Amongst methods which can be brought to bear on this d-dimensional problem is the theory of Dirichlet forms. There are potential problems in doing this because we wish I to be random (for example, a realization of a Poisson point process), and we do not wish to impose artificial boundedness conditions which would clearly make things work for certain deterministic sets I. In the 1-dimensional case discussed here and in the following paper by Harris, much simpler techniques (where we embed the Markov chain in a Brownian motion using local time) work very effectively; and it is these, rather than the theory of Dirichlet forms, that we use.
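    The rescaling alluded to above can be illustrated with a minimal simulation (an assumed simple ±1 walk, not the local-time embedding used in the paper): after dividing space by √n, the walk's endpoint matches a standard Brownian motion at time 1 in mean and variance.

```python
import numpy as np

# Rescale a simple +/-1 random walk: S_n / sqrt(n) should have mean ~0 and
# variance ~1, matching a standard Brownian motion observed at time 1.
rng = np.random.default_rng(42)
n, n_paths = 10_000, 2_000
steps = rng.choice([-1.0, 1.0], size=(n_paths, n))
endpoints = steps.sum(axis=1) / np.sqrt(n)

assert abs(endpoints.mean()) < 0.1
assert abs(endpoints.var() - 1.0) < 0.1
```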

  4. GEOSTATISTICAL SAMPLING DESIGNS FOR HAZARDOUS WASTE SITES

    EPA Science Inventory

    This chapter discusses field sampling design for environmental sites and hazardous waste sites with respect to random variable sampling theory, Gy's sampling theory, and geostatistical (kriging) sampling theory. The literature often presents these sampling methods as an adversari...

  5. Imprecise (fuzzy) information in geostatistics

    SciTech Connect

    Bardossy, A.; Bogardi, I.; Kelly, W.E.

    1988-05-01

    A methodology based on fuzzy set theory for the utilization of imprecise data in geostatistics is presented. A common problem preventing a broader use of geostatistics has been the insufficient amount of accurate measurement data. In certain cases, additional but uncertain (soft) information is available and can be encoded as subjective probabilities, and then the soft kriging method can be applied (Journel, 1986). In other cases, a fuzzy encoding of soft information may be more realistic and simplifies the numerical calculations. Imprecise (fuzzy) spatial information on the possible variogram is integrated into a single variogram which is used in a fuzzy kriging procedure. The overall uncertainty of prediction is represented by the estimation variance and the calculated membership function for each kriged point. The methodology is applied to the permeability prediction of a soil liner for hazardous waste containment. The available number of hard measurement data (20) was not enough for a classical geostatistical analysis. An additional 20 soft data made it possible to prepare kriged contour maps using the fuzzy geostatistical procedure.

  6. The non-integer operation associated to random variation sets of the self-similar set

    NASA Astrophysics Data System (ADS)

    Ren, Fu-Yao; Liang, Jin-Rong

    2000-10-01

    When memory sets are random variation sets of the self-similar set and the total number of remaining states in each stage of the division of this set is normalized to unity, the corresponding flux and fractional integral are “robust” and stable under some conditions. This answers an open problem proposed by Alain Le Méhauté et al. in their book entitled Irréversibilité Temporelle et Géométrie Fractale.

  7. Random set particle filter for bearings-only multitarget tracking

    NASA Astrophysics Data System (ADS)

    Vihola, Matti

    2005-05-01

    The random set approach to multitarget tracking is a theoretically sound framework that covers joint estimation of the number of targets and the state of the targets. This paper describes a particle filter implementation of the random set multitarget filter. The contribution of this paper to the random set tracking framework is the formulation of a measurement model where each sensor report is assumed to contain at most one measurement. The implemented filter was tested in synthetic bearings-only tracking scenarios containing up to two targets in the presence of false alarms and missed measurements. The estimated target state consisted of 2D position and velocity components. The filter was capable of tracking the targets fairly well despite the missing measurements and the relatively high false alarm rates. In addition, the filter showed robustness against wrong parameter values of false alarm rates. The results that were obtained during the limited tests of the filter show that the random set framework has potential for challenging tracking situations. On the other hand, the computational burden of the described implementation is quite high and increases approximately linearly with respect to the expected number of targets.
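    As a loose illustration of why bearings-only measurements are informative at all (a hypothetical static single target seen by two sensors; the paper's random-set filter additionally handles target number, motion, clutter and missed detections), plain importance sampling already localizes an emitter from two noisy bearings:

```python
import numpy as np

# Hypothetical setting: one static emitter, two bearings-only sensors at known
# positions, one noisy bearing from each. Importance sampling over a uniform
# particle cloud recovers the emitter position.
rng = np.random.default_rng(7)
sensors = np.array([[0.0, 0.0], [10.0, 0.0]])
truth = np.array([4.0, 6.0])
sigma = 0.02                                   # bearing noise, radians (assumed)

d = truth - sensors
z = np.arctan2(d[:, 1], d[:, 0]) + rng.normal(0, sigma, 2)

particles = rng.uniform(0, 10, size=(20_000, 2))
logw = np.zeros(len(particles))
for s_idx in range(len(sensors)):
    dp = particles - sensors[s_idx]
    pred = np.arctan2(dp[:, 1], dp[:, 0])
    err = np.angle(np.exp(1j * (z[s_idx] - pred)))   # wrapped angular error
    logw -= 0.5 * (err / sigma) ** 2
w = np.exp(logw - logw.max())
w /= w.sum()
estimate = w @ particles

assert np.linalg.norm(estimate - truth) < 1.0
```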

  8. Fractal and geostatistical methods for modeling of a fracture network

    SciTech Connect

    Chiles, J.P.

    1988-08-01

    The modeling of fracture networks is useful for fluid flow and rock mechanics studies. About 6600 fracture traces were recorded on drifts of a uranium mine in a granite massif. The traces have an extension of 0.20-20 m. The network was studied by fractal and by geostatistical methods but can be considered neither as a fractal with a constant dimension nor as a set of purely randomly located fractures. Two kinds of generalization of conventional models can still provide more flexibility for the characterization of the network: (a) a nonscaling fractal model with variable similarity dimension (for a 2-D network of traces, the dimension varies from 2 at the 10-m scale to 1 at the centimeter scale); (b) a parent-daughter model with a regionalized density, where the geostatistical study allows a 3-D model to be established in which fractures are assumed to be discs, fractures are grouped in clusters or swarms, and fracturation density is regionalized (with two ranges at about 30 and 300 m). The fractal model is easy to fit and to simulate along a line, but 2-D and 3-D simulations are more difficult. The geostatistical model is more complex, but easy to simulate, even in 3-D.
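    The parent-daughter idea can be sketched as a Thomas-type cluster process (all rates below are invented illustration values, not the mine's fitted parameters): Poisson parents, each with a Poisson number of daughters scattered around it.

```python
import numpy as np

# Parent-daughter cluster sketch: Poisson parents in a 10 x 10 domain, each with
# a Poisson number of daughters scattered around it (all rates invented).
rng = np.random.default_rng(5)
lam_parent, mean_daughters, spread = 0.5, 8, 0.3

n_parents = rng.poisson(lam_parent * 100.0)           # intensity x area
parents = rng.uniform(0, 10, size=(n_parents, 2))
counts = rng.poisson(mean_daughters, n_parents)
daughters = np.vstack([p + rng.normal(0, spread, (c, 2))
                       for p, c in zip(parents, counts)])

# expected total: 0.5 * 100 * 8 = 400 daughters
assert 150 < len(daughters) < 700
```

    A regionalized density, as in the abstract, would replace the constant parent intensity with a spatially varying one.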

  9. GEO-EAS (GEOSTATISTICAL ENVIRONMENTAL ASSESSMENT SOFTWARE) USER'S GUIDE

    EPA Science Inventory

    The report describes how to install and use the Geo-EAS (Geostatistical Environmental Assessment Software) software package on an IBM-PC compatible computer system. A detailed example is provided showing how to use the software to conduct a geostatistical analysis of a data set. ...

  10. Geostatistics and spatial analysis in biological anthropology.

    PubMed

    Relethford, John H

    2008-05-01

    A variety of methods have been used to make evolutionary inferences based on the spatial distribution of biological data, including reconstructing population history and detection of the geographic pattern of natural selection. This article provides an examination of geostatistical analysis, a method used widely in geology but which has not often been applied in biological anthropology. Geostatistical analysis begins with the examination of a variogram, a plot showing the relationship between a biological distance measure and the geographic distance between data points and which provides information on the extent and pattern of spatial correlation. The results of variogram analysis are used for interpolating values of unknown data points in order to construct a contour map, a process known as kriging. The methods of geostatistical analysis and discussion of potential problems are applied to a large data set of anthropometric measures for 197 populations in Ireland. The geostatistical analysis reveals two major sources of spatial variation. One pattern, seen for overall body and craniofacial size, shows an east-west cline most likely reflecting the combined effects of past population dispersal and settlement. The second pattern is seen for craniofacial height and shows an isolation by distance pattern reflecting rapid spatial changes in the midlands region of Ireland, perhaps attributable to the genetic impact of the Vikings. The correspondence of these results with other analyses of these data and the additional insights generated from variogram analysis and kriging illustrate the potential utility of geostatistical analysis in biological anthropology. PMID:18257009
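    A minimal version of the variogram computation described above (the classical Matheron estimator on synthetic data; the bin edges and test surface are arbitrary choices, not the Irish anthropometric data) can be written as:

```python
import numpy as np

def empirical_variogram(coords, values, bins):
    """Classical (Matheron) estimator: half the mean squared difference per lag bin."""
    dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    semisq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)           # count each pair once
    dists, semisq = dists[iu], semisq[iu]
    idx = np.digitize(dists, bins)
    gamma = np.array([semisq[idx == k].mean() if np.any(idx == k) else np.nan
                      for k in range(1, len(bins))])
    return 0.5 * (bins[:-1] + bins[1:]), gamma

# smooth synthetic surface: semivariance should rise with lag distance
rng = np.random.default_rng(0)
coords = rng.uniform(0, 1, size=(200, 2))
values = np.sin(3.0 * coords[:, 0]) + rng.normal(0, 0.05, 200)
centers, gamma = empirical_variogram(coords, values, np.linspace(0.0, 0.8, 9))
assert gamma[0] < gamma[-1]
```

    The rising gamma-versus-lag curve is exactly the spatial-correlation signal the abstract says kriging interpolation builds on.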

  11. The excursion set approach in non-Gaussian random fields

    NASA Astrophysics Data System (ADS)

    Musso, Marcello; Sheth, Ravi K.

    2014-04-01

    Insight into a number of interesting questions in cosmology can be obtained by studying the first crossing distributions of physically motivated barriers by random walks with correlated steps: higher mass objects are associated with walks that cross the barrier in fewer steps. We write the first crossing distribution as a formal series, ordered by the number of times a walk upcrosses the barrier. Since the fraction of walks with many upcrossings is negligible if the walk has not taken many steps, the leading order term in this series is the most relevant for understanding the massive objects of most interest in cosmology. For walks associated with Gaussian random fields, this first term only requires knowledge of the bivariate distribution of the walk height and slope, and provides an excellent approximation to the first crossing distribution for all barriers and smoothing filters of current interest. We show that this simplicity survives when extending the approach to the case of non-Gaussian random fields. For non-Gaussian fields which are obtained by deterministic transformations of a Gaussian, the first crossing distribution is simply related to that for Gaussian walks crossing a suitably rescaled barrier. Our analysis shows that this is a useful way to think of the generic case as well. Although our study is motivated by the possibility that the primordial fluctuation field was non-Gaussian, our results are general. In particular, they do not assume the non-Gaussianity is small, so they may be viewed as the solution to an excursion set analysis of the late-time, non-linear fluctuation field rather than the initial one. They are also useful for models in which the barrier height is determined by quantities other than the initial density, since most other physically motivated variables (such as the shear) are usually stochastic and non-Gaussian. We use the Lognormal transformation to illustrate some of our arguments.
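    The counting of first barrier upcrossings can be sketched by Monte Carlo for the simplest case of uncorrelated steps (a sharp-k-filter-like simplification, not the correlated walks that are the paper's subject):

```python
import numpy as np

# First upcrossings of a constant barrier by Gaussian walks with uncorrelated
# steps. For Brownian motion the reflection principle gives
# P(barrier reached by step t) ~ 2 * P(S_t >= barrier), so most walks cross early.
rng = np.random.default_rng(0)
n_walks, n_steps, barrier = 20_000, 100, 1.0
walks = np.cumsum(rng.standard_normal((n_walks, n_steps)), axis=1)
crossed = walks >= barrier
first = np.where(crossed.any(axis=1), crossed.argmax(axis=1), -1)  # first-crossing step
frac = (first >= 0).mean()

assert 0.85 < frac < 0.99      # roughly 0.92 from the reflection-principle estimate
```

    The histogram of `first` is the first-crossing distribution; walks with several upcrossings are the higher-order terms in the paper's series.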

  12. The use of a genetic algorithm-based search strategy in geostatistics: application to a set of anisotropic piezometric head data

    NASA Astrophysics Data System (ADS)

    Abedini, M. J.; Nasseri, M.; Burn, D. H.

    2012-04-01

    In any geostatistical study, an important consideration is the choice of an appropriate, repeatable, and objective search strategy that controls the nearby samples to be included in the location-specific estimation procedure. Almost all geostatistical software available in the market puts the onus on the user to supply search strategy parameters in a heuristic manner. These parameters are solely controlled by geographical coordinates that are defined for the entire area under study, and the user has no guidance as to how to choose these parameters. The main thesis of the current study is that the selection of search strategy parameters has to be driven by data—both the spatial coordinates and the sample values—and cannot be chosen beforehand. For this purpose, a genetic-algorithm-based ordinary kriging with moving neighborhood technique is proposed. The search capability of a genetic algorithm is exploited to search the feature space for appropriate, either local or global, search strategy parameters. Radius of circle/sphere and/or radii of standard or rotated ellipse/ellipsoid are considered as the decision variables to be optimized by GA. The superiority of GA-based ordinary kriging is demonstrated through application to the Wolfcamp Aquifer piezometric head data. Assessment of numerical results showed that definition of search strategy parameters based on both geographical coordinates and sample values improves cross-validation statistics when compared with that based on geographical coordinates alone. In the case of a variable search neighborhood for each estimation point, optimization of local search strategy parameters for an elliptical support domain—the orientation of which is dictated by anisotropic axes—via GA was able to capture the dynamics of piezometric head in west Texas/New Mexico in an efficient way.
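    Stripped of the kriging machinery, the optimization at the core of the approach is a GA searching for search-strategy parameters that minimize a cross-validation score. A toy sketch (with a made-up quadratic score standing in for the kriging cross-validation statistic, and a single radius as the decision variable):

```python
import numpy as np

def genetic_minimize(score, bounds, pop=40, gens=60, rng=None):
    """Tiny real-coded GA: tournament selection, blend crossover, Gaussian mutation."""
    rng = np.random.default_rng(rng)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=pop)
    for _ in range(gens):
        f = score(x)
        i, j = rng.integers(0, pop, (2, pop))
        winners = np.where(f[i] < f[j], x[i], x[j])       # tournament selection
        partners = rng.permutation(winners)
        alpha = rng.uniform(0, 1, pop)
        x = alpha * winners + (1 - alpha) * partners      # blend crossover
        x = np.clip(x + rng.normal(0, 0.05 * (hi - lo), pop), lo, hi)
    return x[score(x).argmin()]

# made-up score with optimum "radius" 2.5, standing in for a kriging
# cross-validation statistic evaluated at each candidate search radius
best = genetic_minimize(lambda r: (r - 2.5) ** 2, bounds=(0.1, 10.0), rng=0)
assert abs(best - 2.5) < 0.3
```

    In the paper's setting the score would be a cross-validation statistic of ordinary kriging with the candidate neighborhood, and the decision vector would include ellipse radii and orientation.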

  13. Multiple Extended Target Tracking With Labeled Random Finite Sets

    NASA Astrophysics Data System (ADS)

    Beard, Michael; Reuter, Stephan; Granstrom, Karl; Vo, Ba-Tuong; Vo, Ba-Ngu; Scheel, Alexander

    2016-04-01

    Targets that generate multiple measurements at a given instant in time are commonly known as extended targets. These present a challenge for many tracking algorithms, as they violate one of the key assumptions of the standard measurement model. In this paper, a new algorithm is proposed for tracking multiple extended targets in clutter that is capable of estimating the number of targets, as well as the trajectories of their states, comprising the kinematics, measurement rates and extents. The proposed technique is based on modelling the multi-target state as a generalised labelled multi-Bernoulli (GLMB) random finite set (RFS), within which the extended targets are modelled using gamma Gaussian inverse Wishart (GGIW) distributions. A cheaper variant of the algorithm is also proposed, based on the labelled multi-Bernoulli (LMB) filter. The proposed GLMB/LMB-based algorithms are compared with an extended target version of the cardinalised probability hypothesis density (CPHD) filter, and simulation results show that the (G)LMB has improved estimation and tracking performance.

  14. Robust geostatistical analysis of spatial data

    NASA Astrophysics Data System (ADS)

    Papritz, Andreas; Künsch, Hans Rudolf; Schwierz, Cornelia; Stahel, Werner A.

    2013-04-01

    Most of the geostatistical software tools rely on non-robust algorithms. This is unfortunate, because outlying observations are rather the rule than the exception, in particular in environmental data sets. Outliers affect the modelling of the large-scale spatial trend, the estimation of the spatial dependence of the residual variation and the predictions by kriging. Identifying outliers manually is cumbersome and requires expertise because one needs parameter estimates to decide which observation is a potential outlier. Moreover, inference after the rejection of some observations is problematic. A better approach is to use robust algorithms that automatically prevent outlying observations from having undue influence. Former studies on robust geostatistics focused on robust estimation of the sample variogram and ordinary kriging without external drift. Furthermore, Richardson and Welsh (1995) proposed a robustified version of (restricted) maximum likelihood ([RE]ML) estimation for the variance components of a linear mixed model, which was later used by Marchant and Lark (2007) for robust REML estimation of the variogram. We propose here a novel method for robust REML estimation of the variogram of a Gaussian random field that is possibly contaminated by independent errors from a long-tailed distribution. It is based on robustification of estimating equations for the Gaussian REML estimation (Welsh and Richardson, 1997). Besides robust estimates of the parameters of the external drift and of the variogram, the method also provides standard errors for the estimated parameters, robustified kriging predictions at both sampled and non-sampled locations, and kriging variances. Apart from presenting our modelling framework, we shall present selected simulation results by which we explored the properties of the new method. This will be complemented by an analysis of a data set on heavy metal contamination of the soil in the vicinity of a metal smelter. Marchant, B.P. and Lark, R.M. (2007).

  15. Determining the Significance of Item Order in Randomized Problem Sets

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Heffernan, Neil T.

    2009-01-01

    Researchers who make tutoring systems would like to know which sequences of educational content lead to the most effective learning by their students. The majority of data collected in many ITS systems consist of answers to a group of questions of a given skill often presented in a random sequence. Following work that identifies which items…

  16. In School Settings, Are All RCTs (Randomized Control Trials) Exploratory?

    ERIC Educational Resources Information Center

    Newman, Denis; Jaciw, Andrew P.

    2012-01-01

    The motivation for this paper is the authors' recent work on several randomized control trials in which they found the primary result, which averaged across subgroups or sites, to be moderated by demographic or site characteristics. They are led to examine a distinction that the Institute of Education Sciences (IES) makes between "confirmatory"…

  17. Preferential sampling and Bayesian geostatistics: Statistical modeling and examples.

    PubMed

    Cecconi, Lorenzo; Grisotto, Laura; Catelan, Dolores; Lagazio, Corrado; Berrocal, Veronica; Biggeri, Annibale

    2016-08-01

    Preferential sampling refers to any situation in which the spatial process and the sampling locations are not stochastically independent. In this paper, we present two examples of geostatistical analysis in which the usual assumption of stochastic independence between the point process and the measurement process is violated. To account for preferential sampling, we specify a flexible and general Bayesian geostatistical model that includes a shared spatial random component. We apply the proposed model to two different case studies that allow us to highlight three different modeling and inferential aspects of geostatistical modeling under preferential sampling: (1) continuous or finite spatial sampling frame; (2) underlying causal model and relevant covariates; and (3) inferential goals related to mean prediction surface or prediction uncertainty. PMID:27566774

  18. The geostatistical characteristics of the Borden aquifer

    NASA Astrophysics Data System (ADS)

    Woodbury, Allan D.; Sudicky, E. A.

    1991-04-01

    A complete reexamination of Sudicky's (1986) field experiment for the geostatistical characterization of hydraulic conductivity at the Borden aquifer in Ontario, Canada is performed. The sampled data reveal that a number of outliers (low ln (K) values) are present in the data base. These low values cause difficulties in both variogram estimation and determining population statistics. The analysis shows that assuming either a normal or an exponential distribution for log conductivity is appropriate. The classical, Cressie/Hawkins, and squared median of the absolute deviations (SMAD) estimators are used to compute experimental variograms. None of these estimators provides completely satisfactory variograms for the Borden data, with the exception of the classical estimator with outliers removed from the data set. Theoretical exponential variogram parameters are determined from nonlinear (NL) estimation. Differences are obtained between NL fits and those of Sudicky (1986). For the classical-screened estimated variogram, NL fits produce an ln (K) variance of 0.24, nugget of 0.07, and integral scales of 5.1 m horizontal and 0.21 m vertical along A-A'. For B-B' these values are 0.37, 0.11, 8.3 and 0.34. The fitted parameter set for the B-B' data (horizontal and vertical) is statistically different from the parameter set determined for A-A'. We also evaluate a probabilistic form of Dagan's (1982, 1987) equations relating geostatistical parameters to a tracer cloud's spreading moments. The equations are evaluated using the parameter estimates and covariances determined from line A-A' as input, with a velocity equal to 9.0 cm/day. The results are compared with actual values determined from the field test, but evaluated by both Freyberg (1986) and Rajaram and Gelhar (1988). The geostatistical parameters developed from this study produce an excellent fit to both sets of calculated plume moments when combined with Dagan's stochastic theory for predicting the spread of a…

  19. Self-similar continuous cascades supported by random Cantor sets: Application to rainfall data.

    PubMed

    Muzy, Jean-François; Baïle, Rachel

    2016-05-01

    We introduce a variant of continuous random cascade models that extends former constructions introduced by Barral-Mandelbrot and Bacry-Muzy in the sense that they can be supported by sets of arbitrary fractal dimension. The so-introduced sets are exactly self-similar stationary versions of random Cantor sets formerly introduced by Mandelbrot as "random cutouts." We discuss the main mathematical properties of our construction and compute its scaling properties. We then illustrate our purpose on several numerical examples and we consider a possible application to rainfall data. We notably show that our model allows us to reproduce remarkably the distribution of dry period durations. PMID:27300908

  1. A Practical Primer on Geostatistics

    USGS Publications Warehouse

    Olea, Ricardo A.

    2009-01-01

    THE CHALLENGE Most geological phenomena are extraordinarily complex in their interrelationships and vast in their geographical extension. Ordinarily, engineers and geoscientists are faced with corporate or scientific requirements to properly prepare geological models with measurements involving a small fraction of the entire area or volume of interest. Exact description of a system such as an oil reservoir is neither feasible nor economically possible. The results are necessarily uncertain. Note that the uncertainty is not an intrinsic property of the systems; it is the result of incomplete knowledge by the observer. THE AIM OF GEOSTATISTICS The main objective of geostatistics is the characterization of spatial systems that are incompletely known, systems that are common in geology. A key difference from classical statistics is that geostatistics uses the sampling location of every measurement. Unless the measurements show spatial correlation, the application of geostatistics is pointless. Ordinarily the need for additional knowledge goes beyond a few points, which explains the display of results graphically as fishnet plots, block diagrams, and maps. GEOSTATISTICAL METHODS Geostatistics is a collection of numerical techniques for the characterization of spatial attributes using primarily two tools: probabilistic models, which are used for spatial data in a manner similar to the way in which time-series analysis characterizes temporal data, or pattern recognition techniques. The probabilistic models are used as a way to handle uncertainty in results away from sampling locations, making a radical departure from alternative approaches like inverse distance estimation methods. DIFFERENCES WITH TIME SERIES On dealing with time-series analysis, users frequently concentrate their attention on extrapolations for making forecasts. Although users of geostatistics may be interested in extrapolation, the methods work at their best interpolating. This simple difference has
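    The inverse-distance approach mentioned above as the non-probabilistic alternative takes one line of algebra: weight each datum by d^(-p). A small sketch (the power p = 2 is a conventional default, not something the primer prescribes):

```python
import numpy as np

def idw(coords, values, query, power=2.0):
    """Inverse-distance-weighted estimate at a single query point."""
    d = np.linalg.norm(coords - query, axis=1)
    if np.any(d == 0):
        return values[d.argmin()]          # exact hit: return the datum itself
    w = d ** -power
    return w @ values / w.sum()

coords = np.array([[0.0, 0.0], [1.0, 0.0]])
values = np.array([0.0, 1.0])
assert idw(coords, values, np.array([0.5, 0.0])) == 0.5
```

    Unlike kriging, this carries no model of spatial correlation and hence no measure of uncertainty away from the sampling locations, which is the "radical departure" the primer refers to.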

  2. Model Selection for Geostatistical Models

    SciTech Connect

    Hoeting, Jennifer A.; Davis, Richard A.; Merton, Andrew A.; Thompson, Sandra E.

    2006-02-01

    We consider the problem of model selection for geospatial data. Spatial correlation is typically ignored in the selection of explanatory variables and this can influence model selection results. For example, the inclusion or exclusion of particular explanatory variables may not be apparent when spatial correlation is ignored. To address this problem, we consider the Akaike Information Criterion (AIC) as applied to a geostatistical model. We offer a heuristic derivation of the AIC in this context and provide simulation results that show that using AIC for a geostatistical model is superior to the often used approach of ignoring spatial correlation in the selection of explanatory variables. These ideas are further demonstrated via a model for lizard abundance. We also employ the principle of minimum description length (MDL) to variable selection for the geostatistical model. The effect of sampling design on the selection of explanatory covariates is also explored.
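    The AIC itself is elementary to compute; a sketch for nested Gaussian (OLS) models follows, noting that it ignores spatial correlation, which is precisely the simplification the paper argues against for geostatistical data:

```python
import numpy as np

def gaussian_aic(y, X):
    """AIC = 2k - 2 log L for an OLS fit with Gaussian errors (k counts the variance too)."""
    n = len(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / n
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return 2 * (X.shape[1] + 1) - 2 * loglik

rng = np.random.default_rng(0)
n = 200
x1, x2 = rng.standard_normal(n), rng.standard_normal(n)   # x2 is irrelevant
y = 1.0 + 2.0 * x1 + rng.standard_normal(n)

aic_true = gaussian_aic(y, np.column_stack([np.ones(n), x1]))
aic_over = gaussian_aic(y, np.column_stack([np.ones(n), x1, x2]))
# adding a covariate improves the fit, but AIC rises by at most its penalty of 2
assert aic_over <= aic_true + 2 + 1e-9
```

    In the paper's geostatistical version, the likelihood would come from a model with a spatial covariance, so the covariate comparison changes.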

  3. Examining the Missing Completely at Random Mechanism in Incomplete Data Sets: A Multiple Testing Approach

    ERIC Educational Resources Information Center

    Raykov, Tenko; Lichtenberg, Peter A.; Paulson, Daniel

    2012-01-01

    A multiple testing procedure for examining implications of the missing completely at random (MCAR) mechanism in incomplete data sets is discussed. The approach uses the false discovery rate concept and is concerned with testing group differences on a set of variables. The method can be used for ascertaining violations of MCAR and disproving this…
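    The false-discovery-rate step of such a procedure is the standard Benjamini-Hochberg rule; a minimal sketch (the p-values are invented for illustration):

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Boolean mask of hypotheses rejected at false discovery rate level q."""
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)
    below = p[order] <= q * np.arange(1, m + 1) / m
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()      # largest k with p_(k) <= k q / m
        reject[order[: k + 1]] = True
    return reject

# invented p-values: only the two smallest survive the FDR correction here
pvals = [0.001, 0.008, 0.039, 0.041, 0.20, 0.74]
assert benjamini_hochberg(pvals).tolist() == [True, True, False, False, False, False]
```

    In the MCAR context, each p-value would come from a group-difference test between cases with and without missing data on one of the variables.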

  4. Restricted spatial regression in practice: Geostatistical models, confounding, and robustness under model misspecification

    USGS Publications Warehouse

    Hanks, Ephraim M.; Schliep, Erin M.; Hooten, Mevin B.; Hoeting, Jennifer A.

    2015-01-01

    In spatial generalized linear mixed models (SGLMMs), covariates that are spatially smooth are often collinear with spatially smooth random effects. This phenomenon is known as spatial confounding and has been studied primarily in the case where the spatial support of the process being studied is discrete (e.g., areal spatial data). In this case, the most common approach suggested is restricted spatial regression (RSR) in which the spatial random effects are constrained to be orthogonal to the fixed effects. We consider spatial confounding and RSR in the geostatistical (continuous spatial support) setting. We show that RSR provides computational benefits relative to the confounded SGLMM, but that Bayesian credible intervals under RSR can be inappropriately narrow under model misspecification. We propose a posterior predictive approach to alleviating this potential problem and discuss the appropriateness of RSR in a variety of situations. We illustrate RSR and SGLMM approaches through simulation studies and an analysis of malaria frequencies in The Gambia, Africa.
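    The RSR constraint itself is a linear-algebra operation: project the spatial random effect onto the orthogonal complement of the fixed-effects design. A toy sketch (random numbers stand in for a real spatial effect):

```python
import numpy as np

# Constrain a (toy) spatial random effect to be orthogonal to the fixed effects:
# w_rsr = (I - X (X'X)^{-1} X') w, the projection at the heart of RSR.
rng = np.random.default_rng(3)
n, p = 120, 3
X = np.column_stack([np.ones(n), rng.standard_normal((n, p - 1))])
w = rng.standard_normal(n)                      # stand-in for a spatial effect

P = X @ np.linalg.solve(X.T @ X, X.T)           # hat matrix
w_rsr = w - P @ w

assert np.allclose(X.T @ w_rsr, 0.0, atol=1e-9)
```

    The orthogonality removes the collinearity between covariates and random effects, which is the source of the computational benefit the abstract describes.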

  5. Application of Frechet and other random-set averaging techniques to fusion of information

    NASA Astrophysics Data System (ADS)

    Goodman, I. R.

    1998-07-01

    An obviously important aspect of target tracking, and more generally, data fusion, is the combination of those pieces of multi-source information deemed to belong together. Recently, it has been pointed out that a random set approach to target tracking and data fusion may be more appropriate than the standard point-vector estimate approach -- especially in the case of large inherent parameter errors. In addition, since many data fusion problems involve non-numerical linguistic descriptions, in the same spirit it is also desirable to have a method which averages, in some qualitative sense, random sets which are non-numerically valued, i.e., which take on propositions or events, such as 'the target appears in area A or C, given the weather conditions of yesterday and source 1' and 'the target appears in area A or B, given the weather conditions of today and source 2.' This leads to the fundamental problem of how best to define the expectation of a random set. To date, this open issue has only been considered for numerically-based random sets. This paper addresses the issue in part by proposing an approach which is algebraically based but also applicable to numerically-based random sets, and directly related to both the Frechet and the Aumann-Artstein-Vitale random set averaging procedures. The technique employs the concept of 'constant probability events,' which has also played a key role in the recent development of 'relational event algebra,' a new mathematical tool for representing various models in the form of various functions of probabilities.

  6. A geostatistical approach to estimate mining efficiency indicators with flexible meshes

    NASA Astrophysics Data System (ADS)

    Freixas, Genis; Garriga, David; Fernàndez-Garcia, Daniel; Sanchez-Vila, Xavier

    2014-05-01

    Geostatistics is a branch of statistics developed originally to predict probability distributions of ore grades for mining operations by considering the attributes of a geological formation at unknown locations as a set of correlated random variables. Mining exploitations typically aim to maintain acceptable mineral laws to produce commercial products based upon demand. In this context, we present a new geostatistical methodology to estimate strategic efficiency maps that incorporate hydraulic test data, the evolution of concentrations with time obtained from chemical analysis (packer tests and production wells), as well as hydraulic head variations. The methodology is applied to a salt basin in South America. The exploitation is based on the extraction of brines through vertical and horizontal wells. Thereafter, brines are precipitated in evaporation ponds to obtain target potassium and magnesium salts of economic interest. Lithium carbonate is obtained as a byproduct of the production of potassium chloride. Aside from providing an assembly of traditional geostatistical methods, the strength of this study lies in the new methodology developed, which focuses on finding the best sites to exploit the brines while maintaining efficiency criteria. Thus, strategic efficiency indicator maps have been developed under the specific criteria imposed by exploitation standards to incorporate new extraction wells in areas that would allow production to be maintained or improved. Results show that the uncertainty quantification of the efficiency plays a dominant role and that the use of flexible meshes, which properly describe the curvilinear features associated with vertical stratification, provides a more consistent estimation of the geological processes. Moreover, we demonstrate that the vertical correlation structure at the given salt basin is essentially linked to variations in the formation thickness, which calls for flexible meshes and non-stationary stochastic processes.

  7. Statistical analysis of sets of random walks: how to resolve their generating mechanism.

    PubMed

    Coscoy, Sylvie; Huguet, Etienne; Amblard, François

    2007-11-01

The analysis of experimental random walks aims at identifying the process(es) that generate(s) them. This is in general a difficult task, because the statistical dispersion within an experimental set of random walks is a complex combination of the stochastic nature of the generating process and the possibility that more than one simple process is involved. In this paper, we study by numerical simulations how the statistical distributions of various geometric descriptors, such as the second-, third- and fourth-order moments of two-dimensional random walks, depend on the stochastic process that generates the set. From these observations, we derive a method to classify complex sets of random walks and to resolve the generating process(es) by systematic comparison of experimental moment distributions with those obtained numerically for candidate processes. In particular, processes such as Brownian diffusion combined with convection, noise, confinement, anisotropy, or intermittency can be resolved by using high-order moment distributions. In addition, finite-size effects are observed that are useful for treating short random walks. As an illustration, we describe how the present method can be used to study the motile behavior of epithelial microvilli. The present work should be of interest in biology for all possible types of single-particle tracking experiments. PMID:17896161
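A minimal sketch of the idea, assuming Gaussian steps and a constant drift as the "convection" component (parameters and sample sizes are illustrative, not the paper's): the distribution of per-walk second-order moments separates the two generating processes.

```python
import numpy as np

rng = np.random.default_rng(0)

def walk_moments(drift, n_steps=200, n_walks=500):
    """Simulate 2D walks (Gaussian steps + optional drift) and return the
    distribution of their second-order moments: the mean squared deviation
    of each walk about its own centroid."""
    steps = rng.normal(size=(n_walks, n_steps, 2)) + drift
    paths = np.cumsum(steps, axis=1)
    centered = paths - paths.mean(axis=1, keepdims=True)
    return (centered ** 2).sum(axis=2).mean(axis=1)  # one moment per walk

m_brownian = walk_moments(drift=0.0)   # pure Brownian diffusion
m_convect = walk_moments(drift=0.3)    # Brownian diffusion + convection

# Drift stretches the walks, shifting the whole moment distribution upward,
# which is what allows the two processes to be told apart.
print(m_brownian.mean(), m_convect.mean())
```

Comparing full moment distributions (not just their means) is what permits classification of mixed experimental sets.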

  8. Variation analysis of lake extension in space and time from MODIS images using random sets

    NASA Astrophysics Data System (ADS)

    Zhao, Xi; Stein, Alfred; Zhang, Xiang; Feng, Lian; Chen, Xiaoling

    2014-08-01

Understanding inundation in wetlands may benefit from a joint variation analysis of changes in the size, shape, position and extent of water bodies. In this study, we modeled wetland inundation as a random spread process and used random sets to characterize the stochastic properties of water body extents. Periodicity, trend and random components were captured by monthly and yearly random sets derived from multitemporal images. The Covering-Distance matrix and related operators summarized and visualized the spatial pattern and quantified the similarity of different inundation stages. The study was carried out on the Poyang Lake wetland area in China, using MODIS images for a period of eleven years. Results revealed a substantial seasonal dynamic pattern in the inundation and a subtle interannual change in its extension from 2000 to 2010. Various spatial properties, including size, shape, position and extent, are visible: areas of high flooding risk are strongly elongated and located along the water channel; a few inundation areas tend to be more circular and spread extensively; and the majority of inundation areas vary in extent and size across months and years. Large differences in the spatial distribution of inundation extents were shown to exist between months from different seasons, and a unique spatial pattern occurred in months with dramatic flooding or recession. Yearly random sets gave detailed information on the spatial distribution of inundation frequency and showed a shrinking trend from 2000 to 2009; 2003 marks the turning point in the declining trend, while 2010 breaks the trend as an abnormal year. In addition, probability bounds were derived from the model for a region affected by flooding; this ability to support decision making is shown in a simple management scenario. We conclude that a random sets analysis is a valuable addition to a frequency analysis that quantifies inundation variation in space and time.
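A simplified sketch of the frequency and probability-bound ideas, assuming a stack of monthly binary water masks stands in for the classified MODIS extents (all data below are synthetic, not Poyang Lake observations):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stack of 12 monthly binary water masks (1 = inundated),
# standing in for water extents classified from MODIS images.
masks = rng.random((12, 50, 50)) < 0.4

# A yearly "random set" summary: per-pixel inundation frequency.
frequency = masks.mean(axis=0)

# Crude probability bounds for a region of interest: a pixel is certainly
# inundated during the year if every month covers it, and possibly
# inundated if any month does.
region = masks[:, 10:20, 10:20]
lower = region.all(axis=0).mean()   # fraction of pixels always inundated
upper = region.any(axis=0).mean()   # fraction inundated at least once
print(lower, upper)
```

The gap between the lower and upper fractions is one simple way such bounds can feed a management scenario.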

  9. Geostatistical Modeling of Pore Velocity

    SciTech Connect

    Devary, J.L.; Doctor, P.G.

    1981-06-01

    A significant part of evaluating a geologic formation as a nuclear waste repository involves the modeling of contaminant transport in the surrounding media in the event the repository is breached. The commonly used contaminant transport models are deterministic. However, the spatial variability of hydrologic field parameters introduces uncertainties into contaminant transport predictions. This paper discusses the application of geostatistical techniques to the modeling of spatially varying hydrologic field parameters required as input to contaminant transport analyses. Kriging estimation techniques were applied to Hanford Reservation field data to calculate hydraulic conductivity and the ground-water potential gradients. These quantities were statistically combined to estimate the groundwater pore velocity and to characterize the pore velocity estimation error. Combining geostatistical modeling techniques with product error propagation techniques results in an effective stochastic characterization of groundwater pore velocity, a hydrologic parameter required for contaminant transport analyses.
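The product error propagation step can be sketched with the first-order (delta-method) rule for a product of independent estimates; all numbers below are hypothetical placeholders, not Hanford values:

```python
import math

# Pore velocity v = K * i / n (hydraulic conductivity x head gradient /
# effective porosity). K and i are imagined as kriging estimates with
# kriging variances; n is taken as known for simplicity.
K, var_K = 5.0e-4, (1.0e-4) ** 2   # hydraulic conductivity (m/s)
i, var_i = 2.0e-3, (4.0e-4) ** 2   # hydraulic head gradient (-)
n = 0.25                           # effective porosity

v = K * i / n
# First-order propagation for a product of independent factors:
# relative variances add.
rel_var = var_K / K**2 + var_i / i**2
sd_v = v * math.sqrt(rel_var)
print(v, sd_v)
```

This combination of kriging variances is what yields the stochastic characterization of pore velocity described in the abstract.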

  10. Geostatistics for high resolution geomorphometry: from spatial continuity to surface texture

    NASA Astrophysics Data System (ADS)

    Trevisani, Sebastiano

    2015-04-01

This presentation introduces the use of geostatistics in the context of high-resolution geomorphometry. Applying geostatistics to geomorphometry permits a shift in perspective, moving our attention more toward the description of spatial continuity than toward the inference of a spatial continuity model. This change in perspective opens interesting directions for the application of geostatistical methods in geomorphometry. Geostatistical methodologies have been extensively applied and adapted in the context of remote sensing, leading to many interesting applications aimed at the analysis of the complex patterns characterizing imagery. Among these applications, the analysis of image texture deserves particular mention. In fact, the analysis of image texture becomes the analysis of surface texture when the analyzed image is a raster representation of a digital terrain model. The main idea is to use spatial-continuity indices as multiscale and directional descriptors of surface texture, including the important aspect of surface roughness. In this context we introduce some examples of the application of geostatistics to image analysis and surface texture characterization. We also show that, in the presence of complex morphological settings, there is a need for alternative indices of spatial continuity that are less sensitive to hotspots and to the non-stationarity that often characterizes surface morphology. This introduction is mainly dedicated to univariate geostatistics; however, the same concepts could be exploited by means of multivariate as well as multiple-point geostatistics.
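The "spatial-continuity indices as directional texture descriptors" idea can be sketched with an experimental semivariogram computed along two directions of a synthetic DEM-like grid (the surface below is illustrative, not from the presentation):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical DEM-like grid: a smooth E-W trend plus small-scale noise.
x = np.tile(np.arange(64.0), (64, 1))        # column index varies E-W
z = np.sin(x / 8.0) + 0.1 * rng.standard_normal((64, 64))

def directional_variogram(z, lag, axis):
    """Experimental semivariance at one lag along rows (axis=1, E-W)
    or columns (axis=0, N-S)."""
    if axis == 1:
        d = z[:, lag:] - z[:, :-lag]
    else:
        d = z[lag:, :] - z[:-lag, :]
    return 0.5 * np.mean(d ** 2)

# Directional spatial-continuity descriptors as surface-texture measures:
gamma_ew = [directional_variogram(z, h, axis=1) for h in (1, 2, 4, 8)]
gamma_ns = [directional_variogram(z, h, axis=0) for h in (1, 2, 4, 8)]
print(gamma_ew, gamma_ns)
```

The E-W semivariance grows with lag while the N-S one stays near the noise level, so the variogram acts as a multiscale, directional (anisotropy-sensitive) roughness descriptor.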

  11. Local Geostatistical Models and Big Data in Hydrological and Ecological Applications

    NASA Astrophysics Data System (ADS)

    Hristopulos, Dionissios

    2015-04-01

The advent of the big data era creates new opportunities for environmental and ecological modelling but also presents significant challenges. The availability of remote sensing images and low-cost wireless sensor networks means that spatiotemporal environmental data now cover larger spatial domains at higher spatial and temporal resolution, and for longer time windows. Handling such voluminous data presents several technical and scientific challenges. In particular, the geostatistical methods used to process spatiotemporal data need to overcome the dimensionality curse associated with the need to store and invert large covariance matrices. There are various mathematical approaches for addressing the dimensionality problem, including change of basis, dimensionality reduction, hierarchical schemes, and local approximations. We present a Stochastic Local Interaction (SLI) model that can be used to model local correlations in spatial data. SLI is a random field model suitable for data on discrete supports (i.e., regular lattices or irregular sampling grids). The degree of localization is determined by means of kernel functions and appropriate bandwidths; the strength of the correlations is determined by means of coefficients. In the "plain vanilla" version, the parameter set involves scale and rigidity coefficients as well as a characteristic length; the latter, in connection with the rigidity coefficient, determines the correlation length of the random field. The SLI model is based on statistical field theory and extends previous research on Spartan spatial random fields [2,3] from continuum spaces to explicitly discrete supports. The SLI kernel functions employ adaptive bandwidths learned from the sampling spatial distribution [1]. The SLI precision matrix is expressed explicitly in terms of the model parameters and the kernel function. Hence, covariance matrix inversion is not necessary for parameter inference, which is based on leave-one-out cross validation.
This property
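The inference-without-inversion property rests on a standard Gaussian Markov random field identity: with precision matrix Q, the leave-one-out predictor of x_i from all other sites is x_i - (Qx)_i / Q_ii. The sketch below uses a generic sparse chain-graph precision matrix, not the actual SLI kernel construction:

```python
import numpy as np

rng = np.random.default_rng(3)

n = 30
# A simple chain-graph precision matrix standing in for a local-interaction
# model (diagonally dominant, hence positive definite).
Q = (np.diag(np.full(n, 2.0))
     + np.diag(np.full(n - 1, -0.9), 1)
     + np.diag(np.full(n - 1, -0.9), -1))

x = rng.standard_normal(n)

# All n leave-one-out predictions at once, with no matrix inversion:
loo_pred = x - (Q @ x) / np.diag(Q)

# Brute-force check at one site: the GMRF conditional mean of x_i given
# the rest is -(sum_{j != i} Q_ij x_j) / Q_ii.
i = 7
cond_mean = -(Q[i, :] @ x - Q[i, i] * x[i]) / Q[i, i]
print(abs(loo_pred[i] - cond_mean))  # agrees to machine precision
```

This is why a model parameterized directly through its precision matrix makes leave-one-out cross validation cheap.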

  12. Building Damage-Resilient Dominating Sets in Complex Networks against Random and Targeted Attacks

    PubMed Central

    Molnár, F.; Derzsy, N.; Szymanski, B. K.; Korniss, G.

    2015-01-01

    We study the vulnerability of dominating sets against random and targeted node removals in complex networks. While small, cost-efficient dominating sets play a significant role in controllability and observability of these networks, a fixed and intact network structure is always implicitly assumed. We find that cost-efficiency of dominating sets optimized for small size alone comes at a price of being vulnerable to damage; domination in the remaining network can be severely disrupted, even if a small fraction of dominator nodes are lost. We develop two new methods for finding flexible dominating sets, allowing either adjustable overall resilience, or dominating set size, while maximizing the dominated fraction of the remaining network after the attack. We analyze the efficiency of each method on synthetic scale-free networks, as well as real complex networks. PMID:25662371

  13. Importance of stationarity for geostatistical assessment of environmental contamination

    SciTech Connect

    Dagdelen, K.; Turner, A.K.

    1996-12-31

This paper describes a geostatistical case study to assess TCE contamination from multiple point sources migrating through geologically complex conditions involving several aquifers. The paper highlights the importance of the stationarity assumption by demonstrating how biased assessments of TCE contamination result when ordinary kriging is applied to data that violate stationarity assumptions. Dividing the data set into more homogeneous geologic and hydrologic zones improved the accuracy of the estimates. Indicator kriging offers an alternative method for providing a stochastic model that is more appropriate for the data. Further improvement in the estimates results when indicator kriging is applied to individual subregional data sets defined on geological grounds; this further enhances data homogeneity and makes the use of a stationary model more appropriate. By combining geological and geostatistical evaluations, more realistic maps can be produced that reflect the hydrogeological environment and provide a sound basis for future investigations and remediation.
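A minimal sketch of the ordinary kriging estimator referred to above, with a hypothetical exponential variogram and made-up sample points (not the paper's TCE data):

```python
import numpy as np

# Sample locations and values (illustrative).
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.5, 1.5]])
vals = np.array([1.2, 2.0, 1.5, 3.1])
target = np.array([0.5, 0.5])

def gamma(h, sill=1.0, rng_=2.0):
    """Exponential variogram model (assumed, for illustration)."""
    return sill * (1.0 - np.exp(-3.0 * h / rng_))

n = len(pts)
D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)

# Ordinary kriging system: variogram matrix bordered by the
# unbiasedness (weights-sum-to-one) constraint.
A = np.ones((n + 1, n + 1))
A[:n, :n] = gamma(D)
A[n, n] = 0.0
b = np.ones(n + 1)
b[:n] = gamma(np.linalg.norm(pts - target, axis=1))

w = np.linalg.solve(A, b)[:n]
estimate = w @ vals
print(w.sum(), estimate)  # weights sum to 1 by construction
```

When the data violate stationarity, these weights are computed from a variogram that does not describe the local behavior, which is exactly the source of the bias the paper discusses; zonation amounts to solving such systems within more homogeneous subsets.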

  14. Geostatistical enhancement of european hydrological predictions

    NASA Astrophysics Data System (ADS)

    Pugliese, Alessio; Castellarin, Attilio; Parajka, Juraj; Arheimer, Berit; Bagli, Stefano; Mazzoli, Paolo; Montanari, Alberto; Blöschl, Günter

    2016-04-01

Geostatistical Enhancement of European Hydrological Prediction (GEEHP) is a research experiment developed within the EU-funded SWITCH-ON project, which proposes to conduct comparative experiments in a virtual laboratory in order to share water-related information and tackle changes in the hydrosphere for operational needs (http://www.water-switch-on.eu). The main objective of GEEHP is the prediction of streamflow indices and signatures in ungauged basins at different spatial scales. In particular, among several possible hydrological signatures, our experiment focuses on the prediction of flow-duration curves (FDCs) along the stream network, which have attracted increasing scientific attention in recent decades due to the large number of practical and technical applications of the curves (e.g. hydropower potential estimation, riverine habitat suitability and ecological assessments, etc.). We apply a geostatistical procedure based on Top-kriging, which has recently been shown to be a particularly reliable and easy-to-use regionalization approach, employing two different types of streamflow data: pan-European E-HYPE simulations (http://hypeweb.smhi.se/europehype) and observed daily streamflow series collected in two pilot study regions, i.e. Tyrol (merging data from the Austrian and Italian stream-gauging networks) and Sweden. The merger of the two study regions results in a rather large area (~450000 km2) and might be considered a proxy for a pan-European application of the approach. In a first phase, we implement a bidirectional validation, i.e. E-HYPE catchments are set as training sites to predict FDCs at sites where observed data are available, and vice-versa. Such a validation procedure reveals (1) the usability of the proposed approach for predicting FDCs over the entire river network of interest using alternatively observed data and E-HYPE simulations and (2) the accuracy of E-HYPE-based predictions of FDCs in ungauged sites. In a
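The target signature itself is straightforward to compute; a sketch of an empirical flow-duration curve from a synthetic daily streamflow series (not Top-kriging, and not E-HYPE data):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic daily flows standing in for a 10-year gauged record.
q = rng.lognormal(mean=1.0, sigma=0.8, size=3650)

def flow_duration_curve(q):
    """Empirical FDC: discharge against exceedance probability."""
    q_sorted = np.sort(q)[::-1]                             # descending discharge
    exceedance = np.arange(1, len(q) + 1) / (len(q) + 1)    # Weibull plotting position
    return exceedance, q_sorted

p, qd = flow_duration_curve(q)
q95 = qd[np.searchsorted(p, 0.95)]   # low-flow index Q95
print(q95)
```

Regionalization approaches such as Top-kriging then interpolate curves (or quantiles like Q95) of this kind along the stream network to ungauged sites.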

  15. A unified development of several techniques for the representation of random vectors and data sets

    NASA Technical Reports Server (NTRS)

    Bundick, W. T.

    1973-01-01

    Linear vector space theory is used to develop a general representation of a set of data vectors or random vectors by linear combinations of orthonormal vectors such that the mean squared error of the representation is minimized. The orthonormal vectors are shown to be the eigenvectors of an operator. The general representation is applied to several specific problems involving the use of the Karhunen-Loeve expansion, principal component analysis, and empirical orthogonal functions; and the common properties of these representations are developed.
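The shared core of the Karhunen-Loeve expansion, principal component analysis and empirical orthogonal functions can be sketched numerically: represent centered data vectors by the leading eigenvectors of their covariance operator, which minimizes mean squared reconstruction error for a given rank (the data below are synthetic):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic correlated data: 200 samples in 10 dimensions.
X = rng.standard_normal((200, 10)) @ rng.standard_normal((10, 10))
Xc = X - X.mean(axis=0)

cov = Xc.T @ Xc / len(Xc)
eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
basis = eigvecs[:, ::-1][:, :3]             # top-3 orthonormal eigenvectors

X_hat = (Xc @ basis) @ basis.T              # rank-3 reconstruction
mse = np.mean((Xc - X_hat) ** 2)

# The residual MSE (per coordinate) equals the mean of the discarded
# eigenvalues, which is the optimality property the paper unifies.
print(mse, eigvals[:-3].sum() / X.shape[1])
```

No rank-3 linear representation built from any other orthonormal vectors can achieve a lower mean squared error, which is the common property of the three techniques.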

  16. Geostatistics software user's manual for the Geosciences Research and Engineering Department PDP 11/70 computer

    SciTech Connect

    Devary, J.L.; Mitchell, P.J.; Rice, W.A.; Damschen, D.W.

    1983-12-01

    Geostatistics and kriging are statistical techniques that can be used to estimate a hydrologic parameter surface from spatially distributed data. The techniques produce statistically optimum estimates of a surface and can aid in pinpointing areas of greatest need for data. This document describes the use of geostatistical software routines that have been implemented at Pacific Northwest Laboratory's (PNL's) PDP 11/70 computer of the Geosciences Research and Engineering Department for use with the Hanford ground water monitoring data base of the Ground-Water Surveillance Program sponsored by U.S. Department of Energy. The geostatistical routines were modified to improve input/output file efficiency and to accept larger data sets. Two data processing routines were also developed for retrieving and sorting data into a form compatible with the geostatistical routines. 6 references, 9 figures, 1 table.

  17. Stability of Dominating Sets in Complex Networks against Random and Targeted Attacks

    NASA Astrophysics Data System (ADS)

    Molnar, F.; Derzsy, N.; Szymanski, B. K.; Korniss, G.

    2014-03-01

    Minimum dominating sets (MDS) are involved in efficiently controlling and monitoring many social and technological networks. However, MDS influence over the entire network may be significantly reduced when some MDS nodes are disabled due to random breakdowns or targeted attacks against nodes in the network. We investigate the stability of domination in scale-free networks in such scenarios. We define stability as the fraction of nodes in the network that are still dominated after some nodes have been removed, either randomly, or by targeting the highest-degree nodes. We find that although the MDS is the most cost-efficient solution (requiring the least number of nodes) for reaching every node in an undamaged network, it is also very sensitive to damage. Further, we investigate alternative methods for finding dominating sets that are less efficient (more costly) than MDS but provide better stability. Finally we construct an algorithm based on greedy node selection that allows us to precisely control the balance between domination stability and cost, to achieve any desired stability at minimum cost, or the best possible stability at any given cost. Analysis of our method shows moderate improvement of domination cost efficiency against random breakdowns, but substantial improvements against targeted attacks. Supported by DARPA, DTRA, ARL NS-CTA, ARO, and ONR.
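A minimal sketch of greedy domination and the stability measure defined above (dominated fraction after targeted removal of the highest-degree nodes); the graph model and numbers are illustrative, not the paper's networks:

```python
import random

random.seed(6)

# Random-attachment graph as a rough scale-free stand-in.
n = 200
adj = {v: set() for v in range(n)}
for v in range(1, n):
    u = random.choice(range(v))
    adj[v].add(u)
    adj[u].add(v)

def greedy_dominating_set(adj):
    """Greedy node selection: repeatedly pick the node whose closed
    neighborhood covers the most currently undominated nodes."""
    undominated = set(adj)
    ds = set()
    while undominated:
        v = max(adj, key=lambda u: len(({u} | adj[u]) & undominated))
        ds.add(v)
        undominated -= {v} | adj[v]
    return ds

ds = greedy_dominating_set(adj)

# Stability: fraction of surviving nodes still dominated after removing
# the 10 highest-degree nodes (a targeted attack).
removed = set(sorted(adj, key=lambda v: len(adj[v]), reverse=True)[:10])
alive_ds = ds - removed
dominated = {v for d in alive_ds for v in ({d} | adj[d])} - removed
stability = len(dominated) / (n - len(removed))
print(len(ds), stability)
```

Variants of this greedy step (e.g. biasing selection away from attack-prone hubs) are the kind of knob that trades domination cost against stability.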

  18. Spin-glass phase transitions and minimum energy of the random feedback vertex set problem.

    PubMed

    Qin, Shao-Meng; Zeng, Ying; Zhou, Hai-Jun

    2016-08-01

A feedback vertex set (FVS) of an undirected graph contains vertices from every cycle of the graph. Constructing an FVS of sufficiently small cardinality is very difficult in the worst case, but for random graphs this problem can be efficiently solved by converting it into an appropriate spin-glass model [H.-J. Zhou, Eur. Phys. J. B 86, 455 (2013)]. In the present work we study the spin-glass phase transitions and the minimum energy density of the random FVS problem by the first-step replica-symmetry-breaking (1RSB) mean-field theory. For both regular random graphs and Erdős-Rényi graphs, we determine the inverse temperature β_{l} at which the replica-symmetric mean-field theory loses its local stability, the inverse temperature β_{d} of the dynamical (clustering) phase transition, and the inverse temperature β_{s} of the static (condensation) phase transition. These critical inverse temperatures all change with the mean vertex degree in a nonmonotonic way; β_{d} is distinct from β_{s} for regular random graphs of vertex degree K>60, while β_{d} is identical to β_{s} for Erdős-Rényi graphs at least up to mean vertex degree c=512. We then derive the zero-temperature limit of the 1RSB theory and use it to compute the minimum FVS cardinality. PMID:27627285
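The defining property is easy to check even though finding a small FVS is hard: after deleting the FVS, the remaining undirected graph must be acyclic. A sketch using union-find (the example graph is illustrative):

```python
def is_acyclic_after_removal(n, edges, fvs):
    """Return True iff the graph minus the vertices in `fvs` is a forest,
    i.e. iff `fvs` is a feedback vertex set."""
    fvs = set(fvs)
    parent = {v: v for v in range(n) if v not in fvs}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v

    for u, v in edges:
        if u in fvs or v in fvs:
            continue
        ru, rv = find(u), find(v)
        if ru == rv:          # this edge closes a cycle among kept vertices
            return False
        parent[ru] = rv
    return True

# A 4-cycle plus a chord: removing vertex 0 breaks every cycle.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
print(is_acyclic_after_removal(4, edges, fvs={0}),
      is_acyclic_after_removal(4, edges, fvs=set()))
```

The spin-glass formulation searches over assignments whose "kept" vertices pass exactly this acyclicity test while minimizing the number of deleted vertices.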

  19. The gradient boosting algorithm and random boosting for genome-assisted evaluation in large data sets.

    PubMed

    González-Recio, O; Jiménez-Montero, J A; Alenda, R

    2013-01-01

    In the next few years, with the advent of high-density single nucleotide polymorphism (SNP) arrays and genome sequencing, genomic evaluation methods will need to deal with a large number of genetic variants and an increasing sample size. The boosting algorithm is a machine-learning technique that may alleviate the drawbacks of dealing with such large data sets. This algorithm combines different predictors in a sequential manner with some shrinkage on them; each predictor is applied consecutively to the residuals from the committee formed by the previous ones to form a final prediction based on a subset of covariates. Here, a detailed description is provided and examples using a toy data set are included. A modification of the algorithm called "random boosting" was proposed to increase predictive ability and decrease computation time of genome-assisted evaluation in large data sets. Random boosting uses a random selection of markers to add a subsequent weak learner to the predictive model. These modifications were applied to a real data set composed of 1,797 bulls genotyped for 39,714 SNP. Deregressed proofs of 4 yield traits and 1 type trait from January 2009 routine evaluations were used as dependent variables. A 2-fold cross-validation scenario was implemented. Sires born before 2005 were used as a training sample (1,576 and 1,562 for production and type traits, respectively), whereas younger sires were used as a testing sample to evaluate predictive ability of the algorithm on yet-to-be-observed phenotypes. Comparison with the original algorithm was provided. The predictive ability of the algorithm was measured as Pearson correlations between observed and predicted responses. Further, estimated bias was computed as the average difference between observed and predicted phenotypes. 
The results showed that the modified algorithm could be run in 1% of the time used by the original algorithm, with negligible differences in accuracy.
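A toy sketch of the "random boosting" idea: each round fits a weak learner to the current residuals using only a random subset of markers, with shrinkage on its contribution. The weak learner here is a one-variable least-squares fit, chosen for brevity; it is not the paper's exact learner, and all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic SNP-like genotypes (0/1/2) and a phenotype with 5 causal markers.
n, p = 300, 50
X = rng.choice([0.0, 1.0, 2.0], size=(n, p))
beta = np.zeros(p)
beta[:5] = [1.0, -0.8, 0.6, 0.5, -0.4]
y = X @ beta + rng.standard_normal(n)

shrink, n_rounds, subset = 0.1, 200, 10
pred = np.full(n, y.mean())
model = []                                        # (marker, coefficient) pairs
for _ in range(n_rounds):
    resid = y - pred
    cols = rng.choice(p, size=subset, replace=False)   # random marker subset
    # within the subset, pick the marker best correlated with the residuals
    j = cols[np.argmax([abs(np.corrcoef(X[:, c], resid)[0, 1]) for c in cols])]
    xc = X[:, j] - X[:, j].mean()
    coef = (xc @ resid) / (xc @ xc)
    pred += shrink * coef * xc                    # shrunken sequential update
    model.append((j, coef))

r = np.corrcoef(pred, y)[0, 1]
print(round(r, 3))
```

Scanning only `subset` of the `p` markers per round is what cuts the computation time relative to a full boosting sweep, at little cost in fit.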

  20. Accurate Complete Basis Set Extrapolation of Direct Random Phase Correlation Energies.

    PubMed

    Mezei, Pál D; Csonka, Gábor I; Ruzsinszky, Adrienn

    2015-08-11

The direct random phase approximation (dRPA) is a promising way to obtain improvements upon standard semilocal density functional results in many aspects of computational chemistry. In this paper, we address the slow convergence of the calculated dRPA correlation energy with the increase of the quality and size of Dunning's popular Gaussian-type correlation-consistent aug-cc-pVXZ split-valence atomic basis set family. The cardinal number X controls the size of the basis set, and we use X = 3-6 in this study. It is known that even the very expensive X = 6 basis sets lead to large errors in the dRPA correlation energy, and thus complete-basis-set extrapolation is necessary. We study the basis set convergence of the dRPA correlation energies on a set of 65 hydrocarbon isomers from CH4 to C6H6. We calculate the iterative density-fitted dRPA correlation energies using an efficient algorithm based on the CC-like form of the equations, using the self-consistent HF orbitals. We test the popular inverse cubic, the optimized exponential, and inverse-power formulas for complete-basis-set extrapolation. We found that the optimized inverse-power-based extrapolation delivers the best energies. Further analysis showed that the optimal exponent depends on the molecular structure, and the most efficient two-point energy extrapolations that use X = 3 and 4 can be improved considerably by considering the atomic composition and hybridization states of the atoms in the molecules. Our results also show that optimized exponents that yield accurate X = 3 and 4 extrapolated dRPA energies for atoms or small molecules might be inaccurate for larger molecules. PMID:26574475
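A two-point inverse-power extrapolation, E(X) = E_CBS + A·X^(-alpha), can be solved in closed form from two cardinal numbers. The energies and exponent below are made-up illustrative values, not the paper's fitted parameters:

```python
def cbs_two_point(e_x, e_y, x, y, alpha):
    """Solve E(X) = E_CBS + A * X**(-alpha) from energies at two
    cardinal numbers x < y; returns the extrapolated E_CBS."""
    wx, wy = x ** (-alpha), y ** (-alpha)
    return (e_y * wx - e_x * wy) / (wx - wy)

# Hypothetical dRPA correlation energies (hartree) at X = 3 and X = 4.
e3, e4 = -0.3450, -0.3550
e_cbs = cbs_two_point(e3, e4, 3, 4, alpha=3.0)  # alpha=3 is the inverse-cubic case
print(round(e_cbs, 4))  # → -0.3623
```

Optimizing `alpha` per system (rather than fixing alpha = 3) is the "optimized inverse power" variant the abstract finds most accurate, and the sensitivity of E_CBS to `alpha` is why the optimal exponent matters.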

  1. Characterizing gene sets using discriminative random walks with restart on heterogeneous biological networks

    PubMed Central

    Blatti, Charles; Sinha, Saurabh

    2016-01-01

    Motivation: Analysis of co-expressed gene sets typically involves testing for enrichment of different annotations or ‘properties’ such as biological processes, pathways, transcription factor binding sites, etc., one property at a time. This common approach ignores any known relationships among the properties or the genes themselves. It is believed that known biological relationships among genes and their many properties may be exploited to more accurately reveal commonalities of a gene set. Previous work has sought to achieve this by building biological networks that combine multiple types of gene–gene or gene–property relationships, and performing network analysis to identify other genes and properties most relevant to a given gene set. Most existing network-based approaches for recognizing genes or annotations relevant to a given gene set collapse information about different properties to simplify (homogenize) the networks. Results: We present a network-based method for ranking genes or properties related to a given gene set. Such related genes or properties are identified from among the nodes of a large, heterogeneous network of biological information. Our method involves a random walk with restarts, performed on an initial network with multiple node and edge types that preserve more of the original, specific property information than current methods that operate on homogeneous networks. In this first stage of our algorithm, we find the properties that are the most relevant to the given gene set and extract a subnetwork of the original network, comprising only these relevant properties. We then re-rank genes by their similarity to the given gene set, based on a second random walk with restarts, performed on the above subnetwork. We demonstrate the effectiveness of this algorithm for ranking genes related to Drosophila embryonic development and aggressive responses in the brains of social animals. Availability and Implementation: DRaWR was implemented as
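A minimal sketch of the core ranking step, a random walk with restarts on a small homogeneous graph; the heterogeneous node and edge types of DRaWR are omitted, and the graph is illustrative:

```python
import numpy as np

# Small undirected graph; nodes 0 and 1 form the "query gene set".
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
W = A / A.sum(axis=0)            # column-stochastic transition matrix

restart = 0.3
p0 = np.array([0.5, 0.5, 0.0, 0.0, 0.0])   # restart distribution on the query set
p = p0.copy()
for _ in range(200):             # iterate p <- (1-r) W p + r p0 to convergence
    p = (1 - restart) * W @ p + restart * p0

ranking = np.argsort(p)[::-1]    # nodes ranked by relevance to the query set
print(p.round(3), ranking)
```

Nodes close to the query set (here node 2) receive high stationary probability, while distant nodes (node 4) rank last; DRaWR applies the same iteration twice, first to select relevant property nodes and then to re-rank genes on the extracted subnetwork.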

  2. Random sets technique for information fusion applied to estimation of brain functional images

    NASA Astrophysics Data System (ADS)

    Smith, Therese M.; Kelly, Patrick A.

    1999-05-01

A new mathematical technique for information fusion based on random sets, developed and described by Goodman, Mahler and Nguyen (The Mathematics of Data Fusion, Kluwer, 1997), can be useful for the estimation of functional brain images. Many image estimation algorithms employ prior models that incorporate general knowledge about the sizes, shapes and locations of brain regions. Recently, algorithms have been proposed that use specific prior knowledge obtained from other imaging modalities (for example, Bowsher, et al., IEEE Trans. Medical Imaging, 1996). However, there is more relevant information than is presently used. A technique that permits the use of additional prior information about activity levels would improve the quality of prior models and, hence, of the resulting image estimate. The use of random sets provides this capability because it allows seemingly non-statistical (or ambiguous) information, such as that contained in inference rules, to be represented and combined with observations in a single statistical model corresponding to a global joint density. This paper illustrates the use of this approach by constructing an example global joint density function for brain functional activity from measurements of functional activity, anatomical information, clinical observations and inference rules. The estimation procedure is tested on a data phantom with Poisson noise.

  3. Generalized index for spatial data sets as a measure of complete spatial randomness

    NASA Astrophysics Data System (ADS)

    Hackett-Jones, Emily J.; Davies, Kale J.; Binder, Benjamin J.; Landman, Kerry A.

    2012-06-01

Spatial data sets generated from a wide range of physical systems can be analyzed by counting the number of objects in a set of bins. Previous work has been limited to equal-sized bins, which are inappropriate for some domains (e.g., circular). We consider a nonequal-size bin configuration whereby overlapping or nonoverlapping bins cover the domain. A generalized index, defined in terms of a variance between bin counts, is developed to indicate whether or not a spatial data set, generated from exclusion or nonexclusion processes, is at the complete spatial randomness (CSR) state. Limiting values of the index are determined. Using examples, we investigate trends in the generalized index as a function of density and compare the results with those using equal-sized bins. The smallest bin size must be much larger than the mean size of the objects. We can determine whether a spatial data set is at the CSR state by comparing the values of the generalized index for different bin configurations: the values will be approximately the same if the data set is at the CSR state, while they will differ if it is not. In general, the generalized index is lower than its limiting value, since objects do not have access to the entire region due to blocking by other objects. These methods are applied to two applications: (i) spatial data sets generated from a cellular automata model of cell aggregation in the enteric nervous system and (ii) a known plant data distribution.
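The variance-between-bin-counts idea can be sketched for unequal, nonoverlapping bins with a simplified chi-square-style statistic; this is an illustrative variant, not the paper's exact generalized index:

```python
import numpy as np

rng = np.random.default_rng(8)

n_pts = 2000
pts = rng.random((n_pts, 2))                  # CSR point pattern in the unit square

# Unequal vertical strips as bins; expected counts proportional to bin width.
edges = np.array([0.0, 0.1, 0.3, 0.6, 1.0])
counts, _ = np.histogram(pts[:, 0], bins=edges)
expected = n_pts * np.diff(edges)

index = np.mean((counts - expected) ** 2 / expected)
print(round(index, 2))                        # of order 1 for CSR data

# A clustered pattern inflates the index far above the CSR level.
clustered = np.clip(rng.normal(0.2, 0.05, size=(n_pts, 2)), 0, 1)
c_counts, _ = np.histogram(clustered[:, 0], bins=edges)
c_index = np.mean((c_counts - expected) ** 2 / expected)
print(c_index > index)
```

Comparing such index values across several bin configurations, as the abstract describes, distinguishes CSR data (values stay roughly constant) from non-CSR data (values shift with the binning).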

  4. Context-free pairs of groups II — Cuts, tree sets, and random walks

    PubMed Central

    Woess, Wolfgang

    2012-01-01

This is a continuation of the study, begun by Ceccherini-Silberstein and Woess (2009) [5], of context-free pairs of groups and the related context-free graphs in the sense of Muller and Schupp (1985) [22]. The graphs under consideration are Schreier graphs of a subgroup of some finitely generated group, and context-freeness relates to a tree-like structure of those graphs. Instead of the cones of Muller and Schupp (1985) [22] (connected components resulting from deletion of finite balls with respect to the graph metric), a more general approach to context-free graphs is proposed via tree sets consisting of cuts of the graph, and associated structure trees. The existence of tree sets with certain "good" properties is studied. With a tree set, a natural context-free grammar is associated. These investigations of the structure of context-free pairs, resp. graphs, are then applied to study random walk asymptotics via complex analysis. In particular, a complete proof of the local limit theorem for return probabilities on any virtually free group is given, as well as on Schreier graphs of a finitely generated subgroup of a free group. This extends, respectively completes, the significant work of Lalley (1993, 2001) [18,20]. PMID:22267873

  5. The Ethics of Randomized Controlled Trials in Social Settings: Can Social Trials Be Scientifically Promising and Must There Be Equipoise?

    ERIC Educational Resources Information Center

    Fives, Allyn; Russell, Daniel W.; Canavan, John; Lyons, Rena; Eaton, Patricia; Devaney, Carmel; Kearns, Norean; O'Brien, Aoife

    2015-01-01

    In a randomized controlled trial (RCT), treatments are assigned randomly and treatments are withheld from participants. Is it ethically permissible to conduct an RCT in a social setting? This paper addresses two conditions for justifying RCTs: that there should be a state of equipoise and that the trial should be scientifically promising.…

  6. Implementing Randomized Controlled Trials in Preschool Settings That Include Young Children with Disabilities: Considering the Context of Strain and Bovey

    ERIC Educational Resources Information Center

    Snyder, Patricia

    2011-01-01

    In this commentary, developments related to conducting randomized controlled trials in authentic preschool settings that include young children with disabilities are discussed in relation to the Strain and Bovey study.

  7. Reducing spatial uncertainty in climatic maps through geostatistical analysis

    NASA Astrophysics Data System (ADS)

    Pesquer, Lluís; Ninyerola, Miquel; Pons, Xavier

    2014-05-01

), applying different interpolation methods/parameters are shown: RMS (mm) error values obtained from the independent test set (20% of the samples) follow, in the order IDW (exponent = 1.5, 2, 2.5, 3) / SPT (tension = 100, 125, 150, 175, 200) / OK:

LOOCV: 92.5; 80.2; 74.2; 72.3 / 181.6; 90.6; 75.7; 71.1; 69.4; 68.8
RS: 101.2; 89.6; 83.9; 81.9 / 115.1; 92.4; 84.0; 81.4; 80.9; 81.1 / 81.1
EU: 57.4; 51.3; 53.1; 55.5 / 59.1; 57.1; 55.9; 55.0; 54.3 / 51.8
A3D: 48.3; 49.8; 52.5; 62.2 / 57.1; 54.4; 52.5; 51.2; 50.2 / 49.7

To study these results, a geostatistical analysis of uncertainty was carried out. Main results: variogram analysis of the error (using the test set) shows that the total sill is reduced (by 50% for EU and 60% for A3D) when using the two new approaches, while the spatialized standard deviation model calculated from the OK shows significantly lower values when compared to RS. In conclusion, A3D and EU improve considerably on LOOCV and RS, whereas A3D slightly improves on EU. Also, LOOCV shows only slightly better results than RS, suggesting that a non-random split increases the power of both the fitting and testing steps. * Ninyerola, Pons, Roure. A methodological approach of climatological modelling of air temperature and precipitation through GIS techniques. IJC, 2000; 20:1823-1841.

  8. Canonical Naimark extension for generalized measurements involving sets of Pauli quantum observables chosen at random

    NASA Astrophysics Data System (ADS)

    Sparaciari, Carlo; Paris, Matteo G. A.

    2013-01-01

    We address measurement schemes where certain observables Xk are chosen at random within a set of nondegenerate isospectral observables and then measured on repeated preparations of a physical system. Each observable has a probability zk to be measured, with ∑kzk=1, and the statistics of this generalized measurement is described by a positive operator-valued measure. This kind of scheme is referred to as quantum roulettes, since each observable Xk is chosen at random, e.g., according to the fluctuating value of an external parameter. Here we focus on quantum roulettes for qubits involving the measurements of Pauli matrices, and we explicitly evaluate their canonical Naimark extensions, i.e., their implementation as indirect measurements involving an interaction scheme with a probe system. We thus provide a concrete model to realize the roulette without destroying the signal state, which can be measured again after the measurement or can be transmitted. Finally, we apply our results to the description of Stern-Gerlach-like experiments on a two-level system.
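
A small numerical check that a Pauli roulette yields a legitimate generalized measurement. The two-outcome form Pi_pm = sum_k z_k (I pm X_k)/2 is our reading of the scheme described above, not code from the paper:

```python
# Pauli matrices and identity as 2x2 nested lists of complex numbers
I2 = [[1, 0], [0, 1]]
X  = [[0, 1], [1, 0]]
Z  = [[1, 0], [0, -1]]

def lincomb(coeffs, mats):
    """Entrywise sum of coeff * matrix over the given pairs."""
    out = [[0j, 0j], [0j, 0j]]
    for c, m in zip(coeffs, mats):
        for i in range(2):
            for j in range(2):
                out[i][j] += c * m[i][j]
    return out

def roulette_povm(probs, paulis):
    """Two-outcome POVM of a Pauli roulette: observable X_k is measured
    with probability z_k, giving Pi_pm = sum_k z_k (I pm X_k) / 2."""
    plus  = lincomb([z / 2 for z in probs] + [sum(probs) / 2], paulis + [I2])
    minus = lincomb([-z / 2 for z in probs] + [sum(probs) / 2], paulis + [I2])
    return plus, minus

# a roulette that measures X or Z with equal probability
plus, minus = roulette_povm([0.5, 0.5], [X, Z])
total = lincomb([1, 1], [plus, minus])  # completeness: should equal the identity
```

The completeness check (the elements summing to the identity) is what makes the statistics of the roulette a positive operator-valued measure.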

  9. Reverse engineering discrete dynamical systems from data sets with random input vectors.

    PubMed

    Just, Winfried

    2006-10-01

    Recently a new algorithm for reverse engineering of biochemical networks was developed by Laubenbacher and Stigler. It is based on methods from computational algebra and finds most parsimonious models for a given data set. We derive mathematically rigorous estimates for the expected amount of data needed by this algorithm to find the correct model. In particular, we demonstrate that for one type of input parameter (graded term orders), the expected data requirements scale polynomially with the number n of chemicals in the network, while for another type (randomly chosen lex orders) they scale exponentially in n. We also show that, for a modification of the algorithm, the expected data requirements scale as the logarithm of n. PMID:17061920

  10. A Particle Multi-Target Tracker for Superpositional Measurements Using Labeled Random Finite Sets

    NASA Astrophysics Data System (ADS)

    Papi, Francesco; Kim, Du Yong

    2015-08-01

    In this paper we present a general solution for multi-target tracking with superpositional measurements. Measurements that are functions of the sum of the contributions of the targets present in the surveillance area are called superpositional measurements. We base our modelling on labeled random finite sets (RFS) in order to jointly estimate the number of targets and their trajectories. This modelling leads to a labeled version of Mahler's multi-target Bayes filter. However, a straightforward implementation of this tracker using Sequential Monte Carlo (SMC) methods is not feasible due to the difficulties of sampling in high dimensional spaces. We propose an efficient multi-target sampling strategy based on the Superpositional Approximate CPHD (SA-CPHD) filter and the recently introduced Labeled Multi-Bernoulli (LMB) and Vo-Vo densities. The applicability of the proposed approach is verified through simulation in a challenging radar application with closely spaced targets and low signal-to-noise ratio.

  11. Random finite set multi-target trackers: stochastic geometry for space situational awareness

    NASA Astrophysics Data System (ADS)

    Vo, Ba-Ngu; Vo, Ba-Tuong

    2015-05-01

    This paper describes recent developments in the random finite set (RFS) paradigm in multi-target tracking. Over the last decade the Probability Hypothesis Density (PHD) filter has become synonymous with the RFS approach. As a result, the PHD filter is often wrongly used as a performance benchmark for the RFS approach. Since there is a suite of RFS-based multi-target tracking algorithms, benchmarking the tracking performance of the RFS approach using the PHD filter, the cheapest of these, is misleading. Such benchmarking should be performed with more sophisticated RFS algorithms. In this paper we outline high-performance RFS-based multi-target trackers, such as the Generalized Labeled Multi-Bernoulli filter and a number of efficient approximations, and discuss extensions and applications of these filters. Applications to space situational awareness are discussed.

  12. Recent Advances in Geostatistical Inversion

    NASA Astrophysics Data System (ADS)

    Kitanidis, P. K.

    2011-12-01

    Inverse problems are common in hydrologic applications, such as subsurface imaging: the identification of parameters characterizing geologic formations from hydrologic and geophysical observations. In such problems, the data do not suffice to constrain the solution to be unique. The geostatistical approach utilizes probability theory and statistical inference to assimilate data and information about structure and to explore the range of possible solutions in a systematic way. This is a progress report on recent advances in terms of formulation and computational methods. The standard implementation of the geostatistical approach to the inverse method is computationally very expensive when there are millions of unknowns and hundreds of thousands of observations, as is the case in fusing data from many sources in hydrogeology. However, depending on the specific problem, alternative formulations and numerical methods can reduce the computational problem quite dramatically. One approach can utilize formulations that involve matrices with a very high degree of sparsity combined with indirect methods of solution and strategies that minimize the number of required forward runs. The potential for this method is illustrated with an application to transient hydraulic tomography. Another approach speeds up matrix-vector multiplications by utilizing hierarchical sparsity in commonly encountered matrices, particularly prior covariance matrices. A couple of examples show how the computational cost scales with the size of the problem (number of observations and unknowns) in conventional and newer methods. Yet another fruitful approach is to rely on a large number of realizations to represent ensembles of solutions, and we illustrate this approach in fusing data from two different geophysical methods. In all of these approaches, utilizing parallel processors and mixed CPU/GPU programming can significantly reduce the computational cost and make it possible to solve very

  13. Improving practice in community-based settings: a randomized trial of supervision – study protocol

    PubMed Central

    2013-01-01

    Background Evidence-based treatments for child mental health problems are not consistently available in public mental health settings. Expanding availability requires workforce training. However, research has demonstrated that training alone is not sufficient for changing provider behavior, suggesting that ongoing intervention-specific supervision or consultation is required. Supervision is notably under-investigated, particularly as provided in public mental health. The degree to which supervision in this setting includes ‘gold standard’ supervision elements from efficacy trials (e.g., session review, model fidelity, outcome monitoring, skill-building) is unknown. The current federally-funded investigation leverages the Washington State Trauma-focused Cognitive Behavioral Therapy Initiative to describe usual supervision practices and test the impact of systematic implementation of gold standard supervision strategies on treatment fidelity and clinical outcomes. Methods/Design The study has two phases. We will conduct an initial descriptive study (Phase I) of supervision practices within public mental health in Washington State followed by a randomized controlled trial of gold standard supervision strategies (Phase II), with randomization at the clinician level (i.e., supervisors provide both conditions). Study participants will be 35 supervisors and 130 clinicians in community mental health centers. We will enroll one child per clinician in Phase I (N = 130) and three children per clinician in Phase II (N = 390). We use a multi-level mixed within- and between-subjects longitudinal design. Audio recordings of supervision and therapy sessions will be collected and coded throughout both phases. Child outcome data will be collected at the beginning of treatment and at three and six months into treatment. Discussion This study will provide insight into how supervisors can optimally support clinicians delivering evidence-based treatments. Phase I will

  14. Model-Based Geostatistical Mapping of the Prevalence of Onchocerca volvulus in West Africa

    PubMed Central

    O’Hanlon, Simon J.; Slater, Hannah C.; Cheke, Robert A.; Boatin, Boakye A.; Coffeng, Luc E.; Pion, Sébastien D. S.; Boussinesq, Michel; Zouré, Honorat G. M.; Stolk, Wilma A.; Basáñez, María-Gloria

    2016-01-01

    Background The initial endemicity (pre-control prevalence) of onchocerciasis has been shown to be an important determinant of the feasibility of elimination by mass ivermectin distribution. We present the first geostatistical map of microfilarial prevalence in the former Onchocerciasis Control Programme in West Africa (OCP) before commencement of antivectorial and antiparasitic interventions. Methods and Findings Pre-control microfilarial prevalence data from 737 villages across the 11 constituent countries in the OCP epidemiological database were used as ground-truth data. These 737 data points, plus a set of statistically selected environmental covariates, were used in a Bayesian model-based geostatistical (B-MBG) approach to generate a continuous surface (at a pixel resolution of 5 km × 5 km) of microfilarial prevalence in West Africa prior to the commencement of the OCP. Uncertainty in model predictions was measured using a suite of validation statistics, performed on bootstrap samples of held-out validation data. The mean Pearson’s correlation between observed and estimated prevalence at validation locations was 0.693; the mean prediction error (average difference between observed and estimated values) was 0.77%, and the mean absolute prediction error (average magnitude of difference between observed and estimated values) was 12.2%. Within OCP boundaries, 17.8 million people were deemed to have been at risk, 7.55 million to have been infected, and mean microfilarial prevalence to have been 45% (range: 2–90%) in 1975. Conclusions and Significance This is the first map of initial onchocerciasis prevalence in West Africa using B-MBG. Important environmental predictors of infection prevalence were identified and used in a model out-performing those without spatial random effects or environmental covariates. Results may be compared with recent epidemiological mapping efforts to find areas of persisting transmission. These methods may be extended to areas where
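
The three validation statistics quoted above (Pearson's correlation, mean prediction error, mean absolute prediction error) are straightforward to compute; a sketch with invented observed/estimated prevalence pairs, not the study's data:

```python
import math

def validation_stats(observed, estimated):
    """Pearson correlation, mean (prediction) error, and mean absolute
    prediction error between observed and estimated values."""
    n = len(observed)
    mo = sum(observed) / n
    me_ = sum(estimated) / n
    cov = sum((o - mo) * (e - me_) for o, e in zip(observed, estimated))
    so = math.sqrt(sum((o - mo) ** 2 for o in observed))
    se = math.sqrt(sum((e - me_) ** 2 for e in estimated))
    pearson = cov / (so * se)
    mean_err = sum(e - o for o, e in zip(observed, estimated)) / n
    mean_abs_err = sum(abs(e - o) for o, e in zip(observed, estimated)) / n
    return pearson, mean_err, mean_abs_err

obs = [0.10, 0.40, 0.55, 0.80]   # illustrative prevalence values
est = [0.12, 0.35, 0.60, 0.78]
r, me, mae = validation_stats(obs, est)
```

Note how the mean error can be near zero (over- and under-predictions cancel) while the mean absolute error stays substantial, exactly the pattern in the 0.77% versus 12.2% figures above.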

  15. Cognitive remediation for individuals with psychosis in a supported education setting: a randomized controlled trial.

    PubMed

    Kidd, Sean A; Kaur, Jaswant; Virdee, Gursharan; George, Tony P; McKenzie, Kwame; Herman, Yarissa

    2014-08-01

    Cognitive remediation (CR) has demonstrated good outcomes when paired with supported employment; however, little is known about its effectiveness when integrated into a supported education program. This randomized controlled trial examined the effectiveness of integrating CR within a supported education program compared with supported education without CR. Thirty-seven students with psychosis were recruited into the study in the 2012 academic year. Academic functioning, cognition, self-esteem, and symptomatology were assessed at baseline, at 4 months following the first academic term in which CR was provided, and at 8 months assessing maintenance of gains. The treatment group demonstrated better retention in the academic program and a trend of improvement across a range of academic functional domains. While both treatment and control groups showed improvement in cognitive measures, the outcomes were not augmented by CR training. CR was also associated with significant and sustained improvements in self-esteem. Further research investigating specific intervention components is required to clarify the mixed findings regarding the effectiveness of CR in an education setting. PMID:24893903

  16. GEOSTATISTICS FOR WASTE MANAGEMENT: A USER'S MANUAL FOR THE GEOPACK (VERSION 1.0) GEOSTATISTICAL SOFTWARE SYSTEM

    EPA Science Inventory

    GEOPACK, a comprehensive user-friendly geostatistical software system, was developed to help in the analysis of spatially correlated data. The software system was developed to be used by scientists, engineers, regulators, etc., with little experience in geostatistical techniques...

  17. H-Area/ITP Geostatistical Assessment of In-Situ and Engineering Properties

    SciTech Connect

    Wyatt, D.; Bartlett, S.F.; Rouhani, S.; Lin, Y.

    1995-09-05

    This report describes the results of a geostatistical investigation conducted in response to tasks and requirements set out in the SRS Request for Proposal Number 93044EQ, dated January 4, 1994. The project tasks were defined in the Statement of Scope (SOS) Number 93044, Revision 0, dated December 1, 1993, developed by the Savannah River Site Geotechnical Services Department.

  18. Maximizing device-independent randomness from a Bell experiment by optimizing the measurement settings

    NASA Astrophysics Data System (ADS)

    Assad, S. M.; Thearle, O.; Lam, P. K.

    2016-07-01

    The rates at which a user can generate device-independent quantum random numbers from a Bell-type experiment depend on the measurements that the user performs. By numerically optimizing over these measurements, we present lower bounds on the randomness generation rates for a family of two-qubit states composed from a mixture of partially entangled states and the completely mixed state. We also report on the randomness generation rates from a tomographic measurement. Interestingly in this case, the randomness generation rates are not monotonic functions of entanglement.

  19. The Effect of Distributed Practice in Undergraduate Statistics Homework Sets: A Randomized Trial

    ERIC Educational Resources Information Center

    Crissinger, Bryan R.

    2015-01-01

    Most homework sets in statistics courses are constructed so that students concentrate or "mass" their practice on a certain topic in one problem set. Distributed practice homework sets include review problems in each set so that practice on a topic is distributed across problem sets. There is a body of research that points to the…

  20. Spatial continuity measures for probabilistic and deterministic geostatistics

    SciTech Connect

    Isaaks, E.H.; Srivastava, R.M.

    1988-05-01

    Geostatistics has traditionally used a probabilistic framework, one in which expected values or ensemble averages are of primary importance. The less familiar deterministic framework views geostatistical problems in terms of spatial integrals. This paper outlines the two frameworks and examines the issue of which spatial continuity measure, the covariance C(h) or the variogram σ(h), is appropriate for each framework. Although C(h) and σ(h) were defined originally in terms of spatial integrals, the convenience of probabilistic notation made the expected value definitions more common. These now classical expected value definitions entail a linear relationship between C(h) and σ(h); the spatial integral definitions do not. In a probabilistic framework, where available sample information is extrapolated to domains other than the one which was sampled, the expected value definitions are appropriate; furthermore, within a probabilistic framework, reasons exist for preferring the variogram to the covariance function. In a deterministic framework, where available sample information is interpolated within the same domain, the spatial integral definitions are appropriate and no reasons are known for preferring the variogram. A case study on a Wiener-Lévy process demonstrates differences between the two frameworks and shows that, for most estimation problems, the deterministic viewpoint is more appropriate. Several case studies on real data sets reveal that the sample covariance function reflects the character of spatial continuity better than the sample variogram. On both theoretical and practical grounds, for most geostatistical problems direct estimation of the covariance is preferable to the traditional variogram approach.
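
As an illustration of the two sample measures compared above, here are the classical estimators along a regularly spaced 1-D transect (a minimal sketch; the data are invented):

```python
def sample_variogram(z, h):
    """Classical variogram estimator: mean of (z[i+h] - z[i])**2 / 2
    over all lag-h pairs along a regularly sampled transect."""
    pairs = [(z[i], z[i + h]) for i in range(len(z) - h)]
    return sum((b - a) ** 2 for a, b in pairs) / (2 * len(pairs))

def sample_covariance(z, h):
    """Sample covariance C(h) of lag-h pairs, using the transect mean."""
    m = sum(z) / len(z)
    pairs = [(z[i], z[i + h]) for i in range(len(z) - h)]
    return sum((a - m) * (b - m) for a, b in pairs) / len(pairs)

z = [3.1, 2.9, 3.4, 3.8, 3.5, 3.0, 2.7, 3.2]
results = {h: (sample_variogram(z, h), sample_covariance(z, h)) for h in (1, 2, 3)}
```

In expectation the two are tied by σ(h) = C(0) − C(h) under second-order stationarity, but on finite samples the estimators generally disagree, which is the crux of the comparison in the abstract.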

  1. Preventing Depression among Early Adolescents in the Primary Care Setting: A Randomized Controlled Study of the Penn Resiliency Program

    ERIC Educational Resources Information Center

    Gillham, Jane E.; Hamilton, John; Freres, Derek R.; Patton, Ken; Gallop, Robert

    2006-01-01

    This study evaluated the Penn Resiliency Program's effectiveness in preventing depression when delivered by therapists in a primary care setting. Two-hundred and seventy-one 11- and 12-year-olds, with elevated depressive symptoms, were randomized to PRP or usual care. Over the 2-year follow-up, PRP improved explanatory style for positive events.…

  2. Observation of Lévy distribution and replica symmetry breaking in random lasers from a single set of measurements

    PubMed Central

    Gomes, Anderson S. L.; Raposo, Ernesto P.; Moura, André L.; Fewo, Serge I.; Pincheira, Pablo I. R.; Jerez, Vladimir; Maia, Lauro J. Q.; de Araújo, Cid B.

    2016-01-01

    Random lasers have been recently exploited as a photonic platform for studies of complex systems. This cross-disciplinary approach opened up new important avenues for the understanding of random-laser behavior, including Lévy-type distributions of strong intensity fluctuations and phase transitions to a photonic spin-glass phase. In this work, we employ the Nd:YBO random laser system to unveil, from a single set of measurements, the physical origin of the complex correspondence between the Lévy fluctuation regime and the replica-symmetry-breaking transition to the spin-glass phase. A novel unexpected finding is also reported: the trend to suppress the spin-glass behavior for high excitation pulse energies. The present description from first principles of this correspondence unfolds new possibilities to characterize other random lasers, such as random fiber lasers, nanolasers and small lasers, which include plasmonic-based, photonic-crystal and bio-derived nanodevices. The statistical nature of the emission provided by random lasers can also impact on their prominent use as sources for speckle-free laser imaging, which nowadays represents one of the most promising applications of random lasers, with expected progress even in cancer research. PMID:27292095

  5. A geostatistical algorithm to reproduce lateral gradual facies transitions: Description and implementation

    NASA Astrophysics Data System (ADS)

    Falivene, Oriol; Cabello, Patricia; Arbués, Pau; Muñoz, Josep Anton; Cabrera, Lluís

    2009-08-01

    Valid representations of geological heterogeneity are fundamental inputs for quantitative models used in managing subsurface activities. Consequently, the simulation of realistic facies distributions is a significant aim. Realistic facies distributions are typically obtained by pixel-based, object-based or process-based methods. This work presents a pixel-based geostatistical algorithm suitable for reproducing lateral gradual facies transitions (LGFTs) between two adjacent sedimentary bodies. Lateral contact (i.e., interfingering) between distinct depositional facies is a widespread geometric relationship that occurs at different scales in any depositional system. The algorithm is based on the truncation of the sum of a linear expectation trend and a random Gaussian field, and can be conditioned to well data. The implementation introduced herein also includes subroutines to clean and geometrically characterize the obtained LGFT. The cleaned sedimentary body transition provides a more appropriate and realistic facies distribution for some depositional settings. The geometric measures of the LGFT yield an intuitive measure of the morphology of the sedimentary body boundary, which can be compared to analogue data. An example of an LGFT obtained with the algorithm presented herein is also flow-simulated, quantitatively demonstrating the importance of reproducing such transitions realistically in subsurface models if accurate flow-related predictions are to be made.
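
The core idea, truncating the sum of a linear trend and a correlated Gaussian field, can be sketched in one dimension as follows. An AR(1) series stands in for the Gaussian field, and all parameters are illustrative rather than from the paper:

```python
import random

def simulate_lgft(n=60, corr=0.9, seed=0):
    """1-D sketch of a lateral gradual facies transition: truncate
    (linear trend + correlated Gaussian field) to assign facies A or B."""
    rng = random.Random(seed)
    # AR(1) series as a stand-in for a correlated Gaussian random field
    g = [rng.gauss(0, 1)]
    for _ in range(n - 1):
        g.append(corr * g[-1] + (1 - corr ** 2) ** 0.5 * rng.gauss(0, 1))
    facies = []
    for i in range(n):
        trend = (i / (n - 1)) * 4 - 2  # linear expectation trend, -2 to +2
        facies.append("B" if trend + g[i] > 0 else "A")
    return facies

facies = simulate_lgft()
```

Raising `corr` lengthens the interfingering across the transition zone; conditioning to well data, cleaning, and the 3-D geometric characterization are where the actual implementation goes further.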

  6. A Comparison of Hourly Typhoon Rainfall Forecasting Models Based on Support Vector Machines and Random Forests with Different Predictor Sets

    NASA Astrophysics Data System (ADS)

    Lin, Kun-Hsiang; Tseng, Hung-Wei; Kuo, Chen-Min; Yang, Tao-Chang; Yu, Pao-Shan

    2016-04-01

    Typhoons with heavy rainfall and strong wind often cause severe floods and losses in Taiwan, which motivates the development of rainfall forecasting models as part of an early warning system. This study therefore aims to develop rainfall forecasting models based on two machine learning methods, support vector machines (SVMs) and random forests (RFs), and to investigate the performance of the models with different predictor sets in search of the optimal set for forecasting. Four predictor sets were used to construct models for 1- to 6-hour-ahead rainfall forecasting: (1) antecedent rainfalls; (2) antecedent rainfalls and typhoon characteristics; (3) antecedent rainfalls and meteorological factors; and (4) antecedent rainfalls, typhoon characteristics and meteorological factors. An application to three rainfall stations in the Yilan River basin, northeastern Taiwan, was conducted. Firstly, the performance of the SVMs-based forecasting model with predictor set #1 was analyzed. The results show that the accuracy of the models for 2- to 6-hour-ahead forecasting decreases rapidly relative to the accuracy of the 1-hour-ahead model, which is acceptable. To improve model performance, each predictor set was further examined in the SVMs-based forecasting model. The results reveal that the SVMs-based model using predictor set #4 as input variables performs better than the other predictor sets, with a significant improvement especially for long lead times. Lastly, the performance of the SVMs-based model using predictor set #4 was compared with that of the RFs-based model using the same predictor set. The RFs-based model is found to be superior to the SVMs-based model in hourly typhoon rainfall forecasting. Keywords: hourly typhoon rainfall forecasting, predictor selection, support vector machines, random forests

  7. Set statistics in conductive bridge random access memory device with Cu/HfO{sub 2}/Pt structure

    SciTech Connect

    Zhang, Meiyun; Long, Shibing; Wang, Guoming; Xu, Xiaoxin; Li, Yang; Liu, Qi; Lv, Hangbing; Liu, Ming; Lian, Xiaojuan; Miranda, Enrique; Suñé, Jordi

    2014-11-10

    Variation of the switching parameters is one of the most important challenges for the application of resistive switching memory. In this letter, we study the set statistics of conductive bridge random access memory with a Cu/HfO{sub 2}/Pt structure. The experimental distributions of the set parameters in several off-resistance ranges are shown to fit a Weibull model nicely. The Weibull slopes of the set voltage and current increase and decrease logarithmically with off resistance, respectively. This experimental behavior is perfectly captured by a Monte Carlo simulator based on the cell-based set-voltage statistics model and the Quantum Point Contact electron transport model. Our work provides indications for improving switching uniformity.
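
Weibull slopes like those analyzed above are commonly read off a probability plot. A sketch of one standard estimator (least squares on ln(−ln(1−F)) versus ln V with median-rank plotting positions; this particular recipe is our choice for illustration, not the letter's procedure):

```python
import math
import random

def weibull_slope(samples):
    """Least-squares Weibull slope (shape) from a probability plot:
    regress ln(-ln(1 - F_i)) on ln(v_i), using median-rank plotting
    positions F_i = (i - 0.3) / (n + 0.4)."""
    v = sorted(samples)
    n = len(v)
    xs = [math.log(s) for s in v]
    ys = [math.log(-math.log(1 - (i + 1 - 0.3) / (n + 0.4))) for i in range(n)]
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# synthetic set-voltage sample with a known shape parameter of 3
rng = random.Random(1)
set_voltages = [rng.weibullvariate(1.0, 3.0) for _ in range(500)]
beta_hat = weibull_slope(set_voltages)  # should land near the true slope 3
```

Repeating this per off-resistance bin gives the slope-versus-off-resistance trends the letter reports.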

  8. Proposed sets of critical exponents for randomly branched polymers, using a known string theory model

    NASA Astrophysics Data System (ADS)

    March, N. H.; Moreno, A. J.

    2016-06-01

    The critical exponent ν for randomly branched polymers with dimensionality d equal to 3 is known exactly to be 1/2. Here, we invoke an available string theory model to predict the remaining static critical exponents. Utilizing the results of Hsu et al. (Comput Phys Commun. 2005;169:114-116), we add results for d = 8. Experiment plus simulation would now be important to confirm, or if necessary refine, the proposed values.

  9. Reducing HIV-Related Stigma in Health Care Settings: A Randomized Controlled Trial in China

    PubMed Central

    Wu, Zunyou; Liang, Li-Jung; Guan, Jihui; Jia, Manhong; Rou, Keming; Yan, Zhihua

    2013-01-01

    Objectives. The objective of the intervention was to reduce service providers’ stigmatizing attitudes and behaviors toward people living with HIV. Methods. The randomized controlled trial was conducted in 40 county-level hospitals in 2 provinces of China between October 2008 and February 2010. Forty-four service providers were randomly selected from each hospital, yielding a total of 1760 study participants. We randomized the hospitals to either an intervention condition or a control condition. In the intervention hospitals, about 15% of the popular opinion leaders were identified and trained to disseminate stigma reduction messages. Results. We observed significant improvements for the intervention group in reducing prejudicial attitudes (P < .001), reducing avoidance intent towards people living with HIV (P < .001), and increasing institutional support in the hospitals (P = .003) at 6 months after controlling for service providers’ background factors and clinic-level characteristics. The intervention effects were sustained and strengthened at 12 months. Conclusions. The intervention reduced stigmatizing attitudes and behaviors among service providers. It has the potential to be integrated into the health care systems in China and other countries. PMID:23237175

  10. Mental health first aid training in a workplace setting: A randomized controlled trial [ISRCTN13249129

    PubMed Central

    Kitchener, Betty A; Jorm, Anthony F

    2004-01-01

    Background The Mental Health First Aid training course was favorably evaluated in an uncontrolled trial in 2002 showing improvements in participants' mental health literacy, including knowledge, stigmatizing attitudes, confidence and help provided to others. This article reports the first randomized controlled trial of this course. Methods Data are reported on 301 participants randomized to either participate immediately in a course or to be wait-listed for 5 months before undertaking the training. The participants were employees in two large government departments in Canberra, Australia, where the courses were conducted during participants' work time. Data were analyzed according to an intention-to-treat approach. Results The trial found a number of benefits from this training course, including greater confidence in providing help to others, greater likelihood of advising people to seek professional help, improved concordance with health professionals about treatments, and decreased stigmatizing attitudes. An additional unexpected but exciting finding was an improvement in the mental health of the participants themselves. Conclusions The Mental Health First Aid training has shown itself to be not only an effective way to improve participants' mental health literacy but also to improve their own mental health. It is a course that has high applicability across the community. PMID:15310395

  11. Geostatistical Interpolation of Particle-Size Curves in Heterogeneous Aquifers

    NASA Astrophysics Data System (ADS)

    Guadagnini, A.; Menafoglio, A.; Secchi, P.

    2013-12-01

    We address the problem of predicting the spatial field of particle-size curves (PSCs) from measurements associated with soil samples collected at a discrete set of locations within an aquifer system. Proper estimates of the full PSC are relevant to applications related to groundwater hydrology, soil science and geochemistry and aimed at modeling physical and chemical processes occurring in heterogeneous earth systems. Hence, we focus on providing kriging estimates of the entire PSC at unsampled locations. To this end, we treat particle-size curves as cumulative distribution functions, model their densities as functional compositional data and analyze them by embedding these into the Hilbert space of compositional functions endowed with the Aitchison geometry. On this basis, we develop a new geostatistical methodology for the analysis of spatially dependent functional compositional data. Our functional compositional kriging (FCK) approach allows providing predictions at unsampled location of the entire particle-size curve, together with a quantification of the associated uncertainty, by fully exploiting both the functional form of the data and their compositional nature. This is a key advantage of our approach with respect to traditional methodologies, which treat only a set of selected features (e.g., quantiles) of PSCs. Embedding the full PSC into a geostatistical analysis enables one to provide a complete characterization of the spatial distribution of lithotypes in a reservoir, eventually leading to improved predictions of soil hydraulic attributes through pedotransfer functions as well as of soil geochemical parameters which are relevant in sorption/desorption and cation exchange processes. We test our new method on PSCs sampled along a borehole located within an alluvial aquifer near the city of Tuebingen, Germany. The quality of FCK predictions is assessed through leave-one-out cross-validation. 
A comparison between hydraulic conductivity estimates obtained
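The FCK idea of kriging compositions in Aitchison geometry can be illustrated with a minimal numeric sketch (not the authors' implementation): discretize the particle-size densities into bins, map them out of the simplex with the centered log-ratio (clr) transform, krige componentwise with an assumed exponential variogram, and map the prediction back. All locations, compositions, and variogram parameters below are synthetic.

```python
import numpy as np

def clr(p, eps=1e-12):
    """Centered log-ratio transform: simplex -> real vector space."""
    logp = np.log(p + eps)
    return logp - logp.mean(axis=1, keepdims=True)

def clr_inv(z):
    """Inverse clr: map back to the simplex (rows sum to 1)."""
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def ok_weights(coords, x0, gamma):
    """Ordinary-kriging weights (they sum to 1 via a Lagrange multiplier)."""
    n = len(coords)
    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    A = np.ones((n + 1, n + 1)); A[:n, :n] = gamma(d); A[n, n] = 0.0
    b = np.ones(n + 1); b[:n] = gamma(np.linalg.norm(coords - x0, axis=1))
    return np.linalg.solve(A, b)[:n]

gamma = lambda h: 1.0 - np.exp(-h / 2.0)   # assumed exponential variogram

rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(8, 2))   # synthetic sample locations
psc = rng.dirichlet(np.ones(5), size=8)    # discretized PSC densities (5 bins)

w = ok_weights(coords, np.array([5.0, 5.0]), gamma)
pred = clr_inv((w @ clr(psc))[None, :])[0] # krige in clr space, map back
```

Because the back-transform renormalizes, the predicted composition stays on the simplex, which is the practical payoff of working in the Aitchison geometry rather than kriging the raw fractions.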

  12. Geostatistical Analysis of Spatial Variability of Mineral Abundance and Kd in Frenchman Flat, NTS, Alluvium

    SciTech Connect

    Carle, S F; Zavarin, M; Pawloski, G A

    2002-11-01

LLNL hydrologic source term modeling at the Cambric site (Pawloski et al., 2000) showed that retardation of radionuclide transport is sensitive to the distribution and amount of radionuclide-sorbing minerals. While all mineralogic information available near the Cambric site was used in these early simulations (11 mineral abundance analyses from UE-5n and 9 from RNM-1), these older data sets were qualitative in nature, with detection limits too high to accurately measure many of the important radionuclide-sorbing minerals (e.g., iron oxide). Also, the sparse nature of the mineral abundance data permitted only a hypothetical description of the spatial distribution of radionuclide-sorbing minerals. Yet, the modeling results predicted that the spatial distribution of sorbing minerals would strongly affect radionuclide transport. Clearly, additional data are needed to improve understanding of mineral abundances and their spatial distributions if model predictions in Frenchman Flat are to be defensible. This report evaluates new high-resolution quantitative X-Ray Diffraction (XRD) data on mineral distributions and their abundances from core samples recently collected from drill hole ER-5-4. A total of 94 samples from ER-5-4 were collected at various spacings to enable evaluation of spatial variability at a variety of spatial scales, from as small as 0.3 meters up to hundreds of meters. Additional XRD analyses obtained from drill holes UE-5n, ER-5-3, and U-11g-1 are used to augment evaluation of vertical spatial variability and permit some evaluation of lateral spatial variability. A total of 163 samples are evaluated. The overall goal of this study is to understand and characterize the spatial variation of sorbing minerals in Frenchman Flat alluvium using geostatistical techniques, with consideration for the potential impact on reactive transport of radionuclides. To achieve this goal requires an effort to ensure that plausible geostatistical models are used to

  13. Does Pedometer Goal Setting Improve Physical Activity among Native Elders? Results from a Randomized Pilot Study

    ERIC Educational Resources Information Center

    Sawchuk, Craig N.; Russo, Joan E.; Charles, Steve; Goldberg, Jack; Forquera, Ralph; Roy-Byrne, Peter; Buchwald, Dedra

    2011-01-01

    We examined if step-count goal setting resulted in increases in physical activity and walking compared to only monitoring step counts with pedometers among American Indian/Alaska Native elders. Outcomes included step counts, self-reported physical activity and well-being, and performance on the 6-minute walk test. Although no significant…

  14. Evaluating the Bookmark Standard Setting Method: The Impact of Random Item Ordering

    ERIC Educational Resources Information Center

    Davis-Becker, Susan L.; Buckendahl, Chad W.; Gerrow, Jack

    2011-01-01

    Throughout the world, cut scores are an important aspect of a high-stakes testing program because they are a key operational component of the interpretation of test scores. One method for setting standards that is prevalent in educational testing programs--the Bookmark method--is intended to be a less cognitively complex alternative to methods…

  15. PREDICTION INTERVALS FOR INTEGRALS OF GAUSSIAN RANDOM FIELDS.

    PubMed

    De Oliveira, Victor; Kone, Bazoumana

    2015-03-01

    Methodology is proposed for the construction of prediction intervals for integrals of Gaussian random fields over bounded regions (called block averages in the geostatistical literature) based on observations at a finite set of sampling locations. Two bootstrap calibration algorithms are proposed, termed indirect and direct, aimed at improving upon plug-in prediction intervals in terms of coverage probability. A simulation study is carried out that illustrates the effectiveness of both procedures, and these procedures are applied to estimate block averages of chromium traces in a potentially contaminated region in Switzerland. PMID:25431507
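A minimal sketch of the baseline that the paper's bootstrap calibration aims to improve: a plug-in prediction interval for a block average of a Gaussian random field, computed from the conditional mean and conditional covariance at a discretizing grid. The exponential covariance, its parameters, and the data are illustrative assumptions, not the authors' setup.

```python
import numpy as np

def cov(d, sill=1.0, corr_len=3.0):
    """Assumed exponential covariance of the Gaussian random field."""
    return sill * np.exp(-d / corr_len)

dist = lambda a, b: np.linalg.norm(a[:, None] - b[None, :], axis=-1)

rng = np.random.default_rng(1)
sites = rng.uniform(0, 10, size=(15, 2))               # sampling locations
g = np.linspace(0, 10, 12)
grid = np.stack(np.meshgrid(g, g), -1).reshape(-1, 2)  # block discretization

K = cov(dist(sites, sites)) + 1e-9 * np.eye(len(sites))
z = np.linalg.cholesky(K) @ rng.standard_normal(len(sites))  # synthetic data

k = cov(dist(sites, grid))                        # cross-covariances (15 x 144)
mu_block = (np.linalg.solve(K, k).T @ z).mean()   # kriged block average

# conditional covariance on the grid; the variance of the grid average
# is the mean of all entries of that covariance matrix
cond = cov(dist(grid, grid)) - k.T @ np.linalg.solve(K, k)
var_block = cond.mean()
lo, hi = (mu_block - 1.96 * np.sqrt(var_block),
          mu_block + 1.96 * np.sqrt(var_block))
```

The plug-in interval treats the covariance parameters as known; the indirect and direct bootstrap procedures in the paper exist precisely because this interval under-covers when the parameters are estimated.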

  16. PREDICTION INTERVALS FOR INTEGRALS OF GAUSSIAN RANDOM FIELDS

    PubMed Central

    De Oliveira, Victor; Kone, Bazoumana

    2014-01-01

    Methodology is proposed for the construction of prediction intervals for integrals of Gaussian random fields over bounded regions (called block averages in the geostatistical literature) based on observations at a finite set of sampling locations. Two bootstrap calibration algorithms are proposed, termed indirect and direct, aimed at improving upon plug-in prediction intervals in terms of coverage probability. A simulation study is carried out that illustrates the effectiveness of both procedures, and these procedures are applied to estimate block averages of chromium traces in a potentially contaminated region in Switzerland. PMID:25431507

  17. Comparing of goal setting strategy with group education method to increase physical activity level: A randomized trial

    PubMed Central

    Jiryaee, Nasrin; Siadat, Zahra Dana; Zamani, Ahmadreza; Taleban, Roya

    2015-01-01

Background: An intervention to increase physical activity should be designed around the resources of the health care setting and be acceptable to the target group. This study was designed to assess and compare the effect of a goal-setting strategy with a group education method on increasing the physical activity of mothers of children aged 1 to 5. Materials and Methods: Mothers who had at least one child aged 1-5 years were randomized into two groups. The effects of 1) a goal-setting strategy and 2) a group education method on increasing physical activity were assessed and compared 1 month and 3 months after the intervention. Also, the weight, height, body mass index (BMI), waist and hip circumference, and well-being were compared between the two groups before and after the intervention. Results: Physical activity level increased significantly after the intervention in the goal-setting group, and it was significantly different between the two groups after the intervention (P < 0.05). BMI, waist circumference, hip circumference, and well-being score were significantly different in the goal-setting group after the intervention. In the group education method, only the well-being score improved significantly (P < 0.05). Conclusion: Our study demonstrated the effects of using the goal-setting strategy to boost physical activity, improving the state of well-being and decreasing BMI, waist, and hip circumference. PMID:26929765

  18. Continuous quality improvement (CQI) in addiction treatment settings: design and intervention protocol of a group randomized pilot study

    PubMed Central

    2014-01-01

    Background Few studies have designed and tested the use of continuous quality improvement approaches in community based substance use treatment settings. Little is known about the feasibility, costs, efficacy, and sustainment of such approaches in these settings. Methods/Design A group-randomized trial using a modified stepped wedge design is being used. In the first phase of the study, eight programs, stratified by modality (residential, outpatient) are being randomly assigned to the intervention or control condition. In the second phase, the initially assigned control programs are receiving the intervention to gain additional information about feasibility while sustainment is being studied among the programs initially assigned to the intervention. Discussion By using this design in a pilot study, we help inform the field about the feasibility, costs, efficacy and sustainment of the intervention. Determining information at the pilot stage about costs and sustainment provides value for designing future studies and implementation strategies with the goal to reduce the time between intervention development and translation to real world practice settings. PMID:24467770

  19. Randomized Trial of Two Dissemination Strategies for a Skin Cancer Prevention Program in Aquatic Settings

    PubMed Central

    Escoffery, Cam; Elliott, Tom; Nehl, Eric J.

    2015-01-01

    Objectives. We compared 2 strategies for disseminating an evidence-based skin cancer prevention program. Methods. We evaluated the effects of 2 strategies (basic vs enhanced) for dissemination of the Pool Cool skin cancer prevention program in outdoor swimming pools on (1) program implementation, maintenance, and sustainability and (2) improvements in organizational and environmental supports for sun protection. The trial used a cluster-randomized design with pools as the unit of intervention and outcome. The enhanced group received extra incentives, reinforcement, feedback, and skill-building guidance. Surveys were collected in successive years (2003–2006) from managers of 435 pools in 33 metropolitan areas across the United States participating in the Pool Cool Diffusion Trial. Results. Both treatment groups improved their implementation of the program, but pools in the enhanced condition had significantly greater overall maintenance of the program over 3 summers of participation. Furthermore, pools in the enhanced condition established and maintained significantly greater sun-safety policies and supportive environments over time. Conclusions. This study found that more intensive, theory-driven dissemination strategies can significantly enhance program implementation and maintenance of health-promoting environmental and policy changes. Future research is warranted through longitudinal follow-up to examine sustainability. PMID:25521872

  20. Wave scattering from random sets of closely spaced objects through linear embedding via Green's operators

    NASA Astrophysics Data System (ADS)

    Lancellotti, V.; de Hon, B. P.; Tijhuis, A. G.

    2011-08-01

In this paper we present the application of linear embedding via Green's operators (LEGO) to the solution of the electromagnetic scattering from clusters of arbitrary (both conducting and penetrable) bodies randomly placed in a homogeneous background medium. In the LEGO method the objects are enclosed within simple-shaped bricks described in turn via scattering operators of equivalent surface current densities. Such operators have to be computed only once for a given frequency, and hence they can be re-used to study many distributions comprising the same objects located in different positions. The surface integral equations of LEGO are solved via the method of moments combined with Adaptive Cross Approximation (to save memory) and Arnoldi basis functions (to compress the system). By means of purposefully selected numerical experiments we discuss the time requirements with respect to the geometry of a given distribution. In addition, we derive an approximate relationship between the (near-field) accuracy of the computed solution and the number of Arnoldi basis functions used to obtain it. This result endows LEGO with a handy practical criterion for both estimating the error and keeping it in check.

  1. Probiotics reduce symptoms of antibiotic use in a hospital setting: a randomized dose response study.

    PubMed

    Ouwehand, Arthur C; DongLian, Cai; Weijian, Xu; Stewart, Morgan; Ni, Jiayi; Stewart, Tad; Miller, Larry E

    2014-01-16

Probiotics are known to reduce antibiotic-associated diarrhea (AAD) and Clostridium difficile-associated diarrhea (CDAD) risk in a strain-specific manner. The aim of this study was to determine the dose-response effect of a four-strain probiotic combination (HOWARU® Restore) on the incidence of AAD and CDAD and the severity of gastrointestinal symptoms in adult in-patients requiring antibiotic therapy. Patients (n=503) were randomized among three study groups: HOWARU® Restore probiotic 1.70×10¹⁰ CFU (high-dose, n=168), HOWARU® Restore probiotic 4.17×10⁹ CFU (low-dose, n=168), or placebo (n=167). Subjects were stratified by gender, age, and duration of antibiotic treatment. Study products were administered daily up to 7 days after the final antibiotic dose. The primary endpoint of the study was the incidence of AAD. Secondary endpoints included incidence of CDAD, diarrhea duration, stools per day, bloody stools, fever, abdominal cramping, and bloating. A significant dose-response effect on AAD was observed, with incidences of 12.5, 19.6, and 24.6% with high-dose, low-dose, and placebo, respectively (p=0.02). CDAD incidence was the same in both probiotic groups (1.8%) but different from the placebo group (4.8%; p=0.04). Incidences of fever, abdominal pain, and bloating were lower with increasing probiotic dose. The number of daily liquid stools and average duration of diarrhea decreased with higher probiotic dosage. The tested four-strain probiotic combination appears to lower the risk of AAD, CDAD, and gastrointestinal symptoms in a dose-dependent manner in adult in-patients. PMID:24291194

  2. Relevant feature set estimation with a knock-out strategy and random forests.

    PubMed

    Ganz, Melanie; Greve, Douglas N; Fischl, Bruce; Konukoglu, Ender

    2015-11-15

    Group analysis of neuroimaging data is a vital tool for identifying anatomical and functional variations related to diseases as well as normal biological processes. The analyses are often performed on a large number of highly correlated measurements using a relatively smaller number of samples. Despite the correlation structure, the most widely used approach is to analyze the data using univariate methods followed by post-hoc corrections that try to account for the data's multivariate nature. Although widely used, this approach may fail to recover from the adverse effects of the initial analysis when local effects are not strong. Multivariate pattern analysis (MVPA) is a powerful alternative to the univariate approach for identifying relevant variations. Jointly analyzing all the measures, MVPA techniques can detect global effects even when individual local effects are too weak to detect with univariate analysis. Current approaches are successful in identifying variations that yield highly predictive and compact models. However, they suffer from lessened sensitivity and instabilities in identification of relevant variations. Furthermore, current methods' user-defined parameters are often unintuitive and difficult to determine. In this article, we propose a novel MVPA method for group analysis of high-dimensional data that overcomes the drawbacks of the current techniques. Our approach explicitly aims to identify all relevant variations using a "knock-out" strategy and the Random Forest algorithm. In evaluations with synthetic datasets the proposed method achieved substantially higher sensitivity and accuracy than the state-of-the-art MVPA methods, and outperformed the univariate approach when the effect size is low. In experiments with real datasets the proposed method identified regions beyond the univariate approach, while other MVPA methods failed to replicate the univariate results. 
More importantly, in a reproducibility study with the well-known ADNI dataset
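The knock-out idea can be sketched in a few lines (a toy illustration, not the authors' algorithm): repeatedly fit a Random Forest, record the most important feature, remove ("knock out") that feature, and refit, stopping once the best remaining importance looks no better than chance. The synthetic dataset and the 2/p stopping threshold are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n, p = 200, 20
X = rng.standard_normal((n, p))
# only features 0 and 1 carry signal; the remaining 18 are pure noise
y = (X[:, 0] + X[:, 1] + 0.5 * rng.standard_normal(n) > 0).astype(int)

relevant, remaining = [], list(range(p))
while remaining:
    rf = RandomForestClassifier(n_estimators=200, random_state=0)
    rf.fit(X[:, remaining], y)
    imp = rf.feature_importances_
    # stop once the best remaining feature is indistinguishable from chance
    if imp.max() < 2.0 / len(remaining):
        break
    top = remaining[int(np.argmax(imp))]
    relevant.append(top)       # record the winner...
    remaining.remove(top)      # ...then knock it out and refit
```

Unlike a single fit, the knock-out loop lets weaker relevant features surface after the dominant ones are removed, which is the sensitivity argument made in the abstract.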

  3. GEOSTATISTICS FOR WASTE MANAGEMENT: A USER'S MANUAL FOR THE GEOPACK (VERSION 1.0) GEOSTATISTICAL SOFTWARE SYSTEM

    EPA Science Inventory

    A comprehensive, user-friendly geostatistical software system called GEOPACK has been developed. The purpose of this software is to make available the programs necessary to undertake a geostatistical analysis of spatially correlated data. The programs were written so that they can ...

  4. GEOSTATISTICS FOR WASTE MANAGEMENT: A USER'S MANUAL FOR THE GEOPACK (VERSION 1.0) GEOSTATISTICAL SOFTWARE SYSTEM

    EPA Science Inventory

    A comprehensive, user-friendly geostatistical software system called GEOPACK has been developed. The purpose of this software is to make available the programs necessary to undertake a geostatistical analysis of spatially correlated data. The programs were written so that they ...

  5. A pilot randomized controlled trial of a brief parenting intervention in low-resource settings in Panama.

    PubMed

    Mejia, Anilena; Calam, Rachel; Sanders, Matthew R

    2015-07-01

    The aim of this study was to determine whether an intervention from the Triple P Positive Parenting Program system was effective in reducing parental reports of child behavioral difficulties in urban low-income settings in Panama City. A pilot parallel-group randomized controlled trial was carried out. A total of 108 parents of children 3 to 12 years old with some level of parent-rated behavioral difficulties were randomly assigned to a discussion group on "dealing with disobedience" or to a no intervention control. Blinded assessments were carried out prior to the intervention, 2 weeks, 3 months, and 6 months later. Results indicated that parental reports of child behavioral difficulties changed over time and decreased more steeply in the intervention than in the control group. The effects of the intervention on parental reports of behavioral difficulties were moderate at post-intervention and 3-month follow-up, and large at 6-month follow-up. Parents who participated in the discussion group reported fewer behavioral difficulties in their children after the intervention than those in the control condition. They also reported reduced parental stress and less use of dysfunctional parenting practices. There is a limited amount of evidence on the efficacy of parenting interventions in low-resource settings. This pilot trial was carried out using a small convenience sample living in low-income urban communities in Panama City, and therefore, the findings are of reduced generalizability to other settings. However, the methodology employed in this trial represents an example for future work in other low-resource settings. PMID:25703382

  6. Geostatistical integration and uncertainty in pollutant concentration surface under preferential sampling.

    PubMed

    Grisotto, Laura; Consonni, Dario; Cecconi, Lorenzo; Catelan, Dolores; Lagazio, Corrado; Bertazzi, Pier Alberto; Baccini, Michela; Biggeri, Annibale

    2016-01-01

    In this paper the focus is on environmental statistics, with the aim of estimating the concentration surface and related uncertainty of an air pollutant. We used air quality data recorded by a network of monitoring stations within a Bayesian framework to overcome difficulties in accounting for prediction uncertainty and to integrate information provided by deterministic models based on emissions meteorology and chemico-physical characteristics of the atmosphere. Several authors have proposed such integration, but all the proposed approaches rely on representativeness and completeness of existing air pollution monitoring networks. We considered the situation in which the spatial process of interest and the sampling locations are not independent. This is known in the literature as the preferential sampling problem, which if ignored in the analysis, can bias geostatistical inferences. We developed a Bayesian geostatistical model to account for preferential sampling with the main interest in statistical integration and uncertainty. We used PM10 data arising from the air quality network of the Environmental Protection Agency of Lombardy Region (Italy) and numerical outputs from the deterministic model. We specified an inhomogeneous Poisson process for the sampling locations intensities and a shared spatial random component model for the dependence between the spatial location of monitors and the pollution surface. We found greater predicted standard deviation differences in areas not properly covered by the air quality network. In conclusion, in this context inferences on prediction uncertainty may be misleading when geostatistical modelling does not take into account preferential sampling. PMID:27087040
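The bias that preferential sampling induces is easy to demonstrate with a synthetic one-dimensional sketch (not the paper's Bayesian model): place monitors preferentially where a latent pollution surface is high, via a thinned inhomogeneous Poisson process, and compare the naive mean of the observations with the true spatial mean. The intensity exponent and noise level are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
x = np.linspace(0, 1, 2000)
field = np.sin(2 * np.pi * x)   # latent pollutant surface, spatial mean ~ 0

# preferential design: monitoring intensity proportional to exp(2 * field),
# i.e. stations are placed more often where pollution is high
intensity = np.exp(2.0 * field)
keep = rng.uniform(size=x.size) < 0.05 * intensity / intensity.max()
obs = field[keep] + 0.1 * rng.standard_normal(keep.sum())

naive_mean = obs.mean()   # biased upward: monitors oversample high values
true_mean = field.mean()
```

The naive estimate overshoots the true mean substantially, which is why the paper couples the sampling-location intensity and the pollution surface through a shared spatial random component instead of ignoring the design.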

  7. Geostatistical prediction of flow-duration curves

    NASA Astrophysics Data System (ADS)

    Pugliese, A.; Castellarin, A.; Brath, A.

    2013-11-01

    We present in this study an adaptation of Topological kriging (or Top-kriging), which makes the geostatistical procedure capable of predicting flow-duration curves (FDCs) in ungauged catchments. Previous applications of Top-kriging mainly focused on the prediction of point streamflow indices (e.g. flood quantiles, low-flow indices, etc.). In this study Top-kriging is used to predict FDCs in ungauged sites as a weighted average of standardised empirical FDCs through the traditional linear-weighting scheme of kriging methods. Our study focuses on the prediction of period-of-record FDCs for 18 unregulated catchments located in Central Italy, for which daily streamflow series with length from 5 to 40 yr are available, together with information on climate referring to the same time-span of each daily streamflow sequence. Empirical FDCs are standardised by a reference streamflow value (i.e. mean annual flow, or mean annual precipitation times the catchment drainage area) and the overall deviation of the curves from this reference value is then used for expressing the hydrological similarity between catchments and for deriving the geostatistical weights. We performed an extensive leave-one-out cross-validation to quantify the accuracy of the proposed technique, and to compare it to traditional regionalisation models that were recently developed for the same study region. The cross-validation points out that Top-kriging is a reliable approach for predicting FDCs, which can significantly outperform traditional regional models in ungauged basins.
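The core prediction step described above, an ungauged-site FDC as a kriging-weighted average of standardised empirical FDCs, can be sketched as follows. This is a simplified stand-in: it uses ordinary-kriging weights from Euclidean distances between catchment centroids with an assumed exponential variogram, whereas Top-kriging derives weights from the stream-network topology; the curves and the reference flow are synthetic.

```python
import numpy as np

def ok_weights(coords, x0, gamma=lambda h: 1.0 - np.exp(-h / 50.0)):
    """Ordinary-kriging weights under an assumed exponential variogram."""
    n = len(coords)
    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    A = np.ones((n + 1, n + 1)); A[:n, :n] = gamma(d); A[n, n] = 0.0
    b = np.ones(n + 1); b[:n] = gamma(np.linalg.norm(coords - x0, axis=1))
    return np.linalg.solve(A, b)[:n]

rng = np.random.default_rng(2)
coords = rng.uniform(0, 100, size=(6, 2))   # gauged catchment centroids
durations = np.linspace(0.01, 0.99, 50)

# synthetic empirical FDCs, standardised by their mean (reference) flow
fdcs = np.exp(-rng.uniform(1.5, 2.5, (6, 1)) * durations)
fdcs /= fdcs.mean(axis=1, keepdims=True)

w = ok_weights(coords, np.array([50.0, 50.0]))
fdc_ungauged = w @ fdcs        # standardised FDC predicted at the ungauged site
q_ref = 12.0                   # assumed reference flow at the ungauged site
fdc_dim = q_ref * fdc_ungauged # back-transform to dimensional streamflows
```

The standardise-krige-rescale split mirrors the paper's scheme: hydrological similarity enters through the weights on the dimensionless curves, while the reference streamflow restores the local scale.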

  8. A pilot randomized trial of technology-assisted goal setting to improve physical activity among primary care patients with prediabetes.

    PubMed

    Mann, Devin M; Palmisano, Joseph; Lin, Jenny J

    2016-12-01

    Lifestyle behavior changes can prevent progression of prediabetes to diabetes, but providers often are not able to effectively counsel patients about preventive lifestyle changes. We developed and pilot tested the Avoiding Diabetes Thru Action Plan Targeting (ADAPT) program to enhance primary care providers' counseling about behavior change for patients with prediabetes. Primary care providers in two urban academic practices and their patients with prediabetes were recruited to participate in the ADAPT study, an unblinded randomized pragmatic trial to test the effectiveness of the ADAPT program, including a streamlined electronic medical record-based goal-setting tool. Providers were randomized to intervention or control arms; eligible patients whose providers were in the intervention arm received the ADAPT program. Physical activity (the primary outcome) was measured using pedometers, and data were gathered about patients' diet, weight, and glycemic control. A total of 54 patients were randomized and analyzed as part of the 6-month ADAPT study (2010-2012, New York, NY). Those in the intervention group showed an increase in total daily steps compared to those in the control group (+1418 vs. -598, p = 0.007) at 6 months. There was also a trend towards weight loss in the intervention group compared to the control group (-1.0 lbs. vs. 3.0 lbs., p = 0.11), although no change in glycemic control. The ADAPT study is among the first to use standard electronic medical record tools to embed goal setting into realistic primary care workflows and to demonstrate a significant improvement in prediabetes patients' physical activity. PMID:27413670

  9. Can Geostatistical Models Represent Nature's Variability? An Analysis Using Flume Experiments

    NASA Astrophysics Data System (ADS)

    Scheidt, C.; Fernandes, A. M.; Paola, C.; Caers, J.

    2015-12-01

The lack of understanding of the Earth's geological and physical processes governing sediment deposition renders subsurface modeling subject to large uncertainty. Geostatistics is often used to model uncertainty because of its capability to stochastically generate spatially varying realizations of the subsurface. These methods can generate a range of realizations of a given pattern - but how representative are these of the full natural variability? And how can we identify the minimum set of images that represent this natural variability? Here we use this minimum set to define the geostatistical prior model: a set of training images that represent the range of patterns generated by autogenic variability in the sedimentary environment under study. The proper definition of the prior model is essential in capturing the variability of the depositional patterns. This work starts with a set of overhead images from an experimental basin that showed ongoing autogenic variability. We use the images to analyze the essential characteristics of this suite of patterns. In particular, our goal is to define a prior model (a minimal set of selected training images) such that geostatistical algorithms, when applied to this set, can reproduce the full measured variability. A necessary prerequisite is to define a measure of variability. In this study, we measure variability using a dissimilarity distance between the images. The distance indicates whether two snapshots contain similar depositional patterns. To reproduce the variability in the images, we apply an MPS algorithm to the set of selected snapshots of the sedimentary basin that serve as training images. The training images are chosen from among the initial set by using the distance measure to ensure that only dissimilar images are chosen. Preliminary investigations show that MPS can reproduce fairly accurately the natural variability of the experimental depositional system. Furthermore, the selected training images provide
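The selection step, keeping only mutually dissimilar snapshots as training images, can be sketched greedily. Everything here is an illustrative assumption: a histogram-based dissimilarity stands in for whatever distance the authors use, and the "snapshots" are synthetic arrays drawn from two different pattern families.

```python
import numpy as np

def dissimilarity(a, b):
    """Assumed distance: Euclidean distance between intensity histograms."""
    ha, _ = np.histogram(a, bins=16, range=(0, 1), density=True)
    hb, _ = np.histogram(b, bins=16, range=(0, 1), density=True)
    return np.linalg.norm(ha - hb)

def select_training_images(images, threshold):
    """Greedy prior-model selection: keep a snapshot only if it is at least
    `threshold` away from every snapshot already selected."""
    selected = [images[0]]
    for img in images[1:]:
        if all(dissimilarity(img, s) >= threshold for s in selected):
            selected.append(img)
    return selected

rng = np.random.default_rng(3)
# synthetic overhead snapshots from two depositional "pattern families"
snaps = [rng.beta(a, 2.0, (64, 64)) for a in (0.5,) * 5 + (5.0,) * 5]
prior = select_training_images(snaps, threshold=2.0)
```

With a well-chosen threshold the greedy pass keeps one representative per pattern family and discards near-duplicates, which is exactly the "minimal set" the prior model calls for.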

  10. Benchmarking a geostatistical procedure for the homogenisation of annual precipitation series

    NASA Astrophysics Data System (ADS)

    Caineta, Júlio; Ribeiro, Sara; Henriques, Roberto; Soares, Amílcar; Costa, Ana Cristina

    2014-05-01

The European project COST Action ES0601, Advances in homogenisation methods of climate series: an integrated approach (HOME), has brought to attention the importance of establishing reliable homogenisation methods for climate data. In order to achieve that, a benchmark data set, containing monthly and daily temperature and precipitation data, was created to be used as a comparison basis for the effectiveness of those methods. Several contributions were submitted and evaluated by a number of performance metrics, validating the results against realistic inhomogeneous data. HOME also led to the development of new homogenisation software packages, which incorporated feedback and lessons learned during the project. Preliminary studies have suggested a geostatistical stochastic approach, which uses Direct Sequential Simulation (DSS), as a promising methodology for the homogenisation of precipitation data series. Based on the spatial and temporal correlation between the neighbouring stations, DSS calculates local probability density functions at a candidate station to detect inhomogeneities. The purpose of the current study is to test and compare this geostatistical approach with the methods previously presented in the HOME project, using surrogate precipitation series from the HOME benchmark data set. The benchmark data set contains monthly precipitation surrogate series, from which annual precipitation data series were derived. These annual precipitation series were subjected to exploratory analysis and to a thorough variography study. The geostatistical approach was then applied to the data set, based on different scenarios for the spatial continuity. Implementing this procedure also promoted the development of a computer program that aims to assist in the homogenisation of climate data, while minimising user interaction. 
Finally, in order to compare the effectiveness of this methodology with the homogenisation methods submitted during the HOME project, the obtained results
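The detection principle, flagging years where a candidate station falls outside the local distribution implied by its neighbours, can be caricatured in a few lines. This is a crude empirical stand-in for the DSS-simulated local probability density functions; the synthetic series, the break size, and the 5% flagging level are all assumptions.

```python
import numpy as np

def detect_inhomogeneities(candidate, neighbours, alpha=0.05):
    """Flag years where the candidate falls outside the central (1 - alpha)
    interval of the neighbour-based local distribution (an empirical
    stand-in for DSS-simulated local pdfs)."""
    lo = np.percentile(neighbours, 100 * alpha / 2, axis=0)
    hi = np.percentile(neighbours, 100 * (1 - alpha / 2), axis=0)
    return (candidate < lo) | (candidate > hi)

rng = np.random.default_rng(6)
years = 50
regional = 800 + 100 * rng.standard_normal(years)         # shared annual signal
neighbours = regional + 30 * rng.standard_normal((8, years))
candidate = regional + 30 * rng.standard_normal(years)
candidate[30:] += 200        # artificial break (e.g. a station relocation)

flags = detect_inhomogeneities(candidate, neighbours)
```

Years after the artificial break are flagged almost uniformly, while pre-break years are flagged only at roughly the nominal rate, which is the behaviour a homogenisation method exploits to locate breakpoints.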

  11. A New 2.5D Representation for Lymph Node Detection using Random Sets of Deep Convolutional Neural Network Observations

    PubMed Central

    Lu, Le; Seff, Ari; Cherry, Kevin M.; Hoffman, Joanne; Wang, Shijun; Liu, Jiamin; Turkbey, Evrim; Summers, Ronald M.

    2015-01-01

    Automated Lymph Node (LN) detection is an important clinical diagnostic task but very challenging due to the low contrast of surrounding structures in Computed Tomography (CT) and to their varying sizes, poses, shapes and sparsely distributed locations. State-of-the-art studies show the performance range of 52.9% sensitivity at 3.1 false-positives per volume (FP/vol.), or 60.9% at 6.1 FP/vol. for mediastinal LN, by one-shot boosting on 3D HAAR features. In this paper, we first operate a preliminary candidate generation stage, towards ~100% sensitivity at the cost of high FP levels (~40 per patient), to harvest volumes of interest (VOI). Our 2.5D approach consequently decomposes any 3D VOI by resampling 2D reformatted orthogonal views N times, via scale, random translations, and rotations with respect to the VOI centroid coordinates. These random views are then used to train a deep Convolutional Neural Network (CNN) classifier. In testing, the CNN is employed to assign LN probabilities for all N random views that can be simply averaged (as a set) to compute the final classification probability per VOI. We validate the approach on two datasets: 90 CT volumes with 388 mediastinal LNs and 86 patients with 595 abdominal LNs. We achieve sensitivities of 70%/83% at 3 FP/vol. and 84%/90% at 6 FP/vol. in mediastinum and abdomen respectively, which drastically improves over the previous state-of-the-art work. PMID:25333158
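The 2.5D decomposition and the averaging of per-view probabilities can be sketched as follows. The code is a schematic stand-in: rotations are reduced to a random choice of orthogonal plane plus small shifts for brevity, and a mean-intensity lambda replaces the trained CNN, so only the sample-N-views-then-average structure matches the paper.

```python
import numpy as np

def random_views(voi, n_views, rng):
    """Sample 2D views of a 3D VOI: random orthogonal plane, random slice
    near the centroid, and a small random translation (full rotations are
    omitted in this sketch)."""
    views = []
    for _ in range(n_views):
        axis = int(rng.integers(3))
        idx = voi.shape[axis] // 2 + int(rng.integers(-2, 3))
        view = np.roll(voi.take(idx, axis=axis),
                       int(rng.integers(-2, 3)), axis=0)
        views.append(view)
    return views

def classify_voi(voi, view_prob, n_views=10, seed=0):
    """Average the per-view probabilities into one score for the VOI."""
    rng = np.random.default_rng(seed)
    views = random_views(voi, n_views, rng)
    return float(np.mean([view_prob(v) for v in views]))

# stand-in for the trained CNN: mean intensity as a pseudo-probability
view_prob = lambda v: float(v.mean())

voi = np.random.default_rng(4).uniform(0, 1, (16, 16, 16))
p_ln = classify_voi(voi, view_prob)
```

Averaging over random views acts as test-time augmentation: individual view scores are noisy, but their mean is a more stable per-VOI probability.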

  12. Increasing Water Intake of Children and Parents in the Family Setting: A Randomized, Controlled Intervention Using Installation Theory.

    PubMed

    Lahlou, Saadi; Boesen-Mariani, Sabine; Franks, Bradley; Guelinckx, Isabelle

    2015-01-01
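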

On average, children and adults in developed countries consume too little water, which can lead to negative health consequences. In a one-year longitudinal field experiment in Poland, we compared the impact of three home-based interventions on helping children and their parents/caregivers to develop sustainable increased plain water consumption habits. Fluid consumption of 334 children and their caregivers was recorded over one year using a specific online fluid dietary record. Participants were initially randomly allocated to one of the following three conditions: Control, Information (child and carer received information on the health benefits of water), or Placement (in addition to information, free small bottles of still water were delivered at home for a limited time period). After three months, half of the non-controls were randomly assigned to Community (child and caregiver engaged in an online community forum providing support on water consumption). All conditions significantly increased the water consumption of children (by 21.9-56.7%) and of adults (by 22-89%). Placement + Community generated the largest effects. Community enhanced the impact of Placement for children and parents, as well as the impact of Information for parents but not children. The results suggest that the family setting offers considerable scope for successful installation of interventions encouraging children and caregivers to develop healthier consumption habits, in mutually reinforcing ways. Combining information, affordances, and social influence gives the best, and most sustainable, results. PMID:26088044

  13. Preventing depression among early adolescents in the primary care setting: a randomized controlled study of the Penn Resiliency Program.

    PubMed

    Gillham, Jane E; Hamilton, John; Freres, Derek R; Patton, Ken; Gallop, Robert

    2006-04-01

    This study evaluated the Penn Resiliency Program's effectiveness in preventing depression when delivered by therapists in a primary care setting. Two-hundred and seventy-one 11- and 12-year-olds, with elevated depressive symptoms, were randomized to PRP or usual care. Over the 2-year follow-up, PRP improved explanatory style for positive events. PRP's effects on depressive symptoms and explanatory style for negative events were moderated by sex, with girls benefiting more than boys. Stronger effects were seen in high-fidelity groups than low-fidelity groups. PRP did not significantly prevent depressive disorders but significantly prevented depression, anxiety, and adjustment disorders (when combined) among high-symptom participants. Findings are discussed in relation to previous PRP studies and research on the dissemination of psychological interventions. PMID:16741684

  14. First look: a cluster-randomized trial of ultrasound to improve pregnancy outcomes in low income country settings

    PubMed Central

    2014-01-01

    Background In high-resource settings, obstetric ultrasound is a standard component of prenatal care used to identify pregnancy complications and to establish an accurate gestational age in order to improve obstetric care. Whether or not ultrasound use will improve care and ultimately pregnancy outcomes in low-resource settings is unknown. Methods/Design This multi-country cluster randomized trial will assess the impact of antenatal ultrasound screening performed by health care staff on a composite outcome consisting of maternal mortality and maternal near-miss, stillbirth and neonatal mortality in low-resource community settings. The trial will utilize an existing research infrastructure, the Global Network for Women’s and Children’s Health Research with sites in Pakistan, Kenya, Zambia, Democratic Republic of Congo and Guatemala. A maternal and newborn health registry in defined geographic areas which documents all pregnancies and their outcomes to 6 weeks post-delivery will provide population-based rates of maternal mortality and morbidity, stillbirth, neonatal mortality and morbidity, and health care utilization for study clusters. A total of 58 study clusters each with a health center and about 500 births per year will be randomized (29 intervention and 29 control). The intervention includes training of health workers (e.g., nurses, midwives, clinical officers) to perform ultrasound examinations during antenatal care, generally at 18–22 and at 32–36 weeks for each subject. Women who are identified as having a complication of pregnancy will be referred to a hospital for appropriate care. Finally, the intervention includes community sensitization activities to inform women and their families of the availability of ultrasound at the antenatal care clinic and training in emergency obstetric and neonatal care at referral facilities. Discussion In summary, our trial will evaluate whether introduction of ultrasound during antenatal care improves pregnancy

  15. Best strategies to implement clinical pathways in an emergency department setting: study protocol for a cluster randomized controlled trial

    PubMed Central

    2013-01-01

    Background The clinical pathway is a tool that operationalizes best evidence recommendations and clinical practice guidelines in an accessible format for ‘point of care’ management by multidisciplinary health teams in hospital settings. While high-quality, expert-developed clinical pathways have many potential benefits, their impact has been limited by variable implementation strategies and suboptimal research designs. Best strategies for implementing pathways into hospital settings remain unknown. This study will seek to develop and comprehensively evaluate best strategies for effective local implementation of externally developed expert clinical pathways. Design/methods We will develop a theory-based and knowledge user-informed intervention strategy to implement two pediatric clinical pathways: asthma and gastroenteritis. Using a balanced incomplete block design, we will randomize 16 community emergency departments to receive the intervention for one clinical pathway and serve as control for the alternate clinical pathway, thus conducting two cluster randomized controlled trials to evaluate this implementation intervention. A minimization procedure will be used to randomize sites. Intervention sites will receive a tailored strategy to support full clinical pathway implementation. We will evaluate implementation strategy effectiveness through measurement of relevant process and clinical outcomes. The primary process outcome will be the presence of an appropriately completed clinical pathway on the chart for relevant patients. Primary clinical outcomes for each clinical pathway include the following: Asthma—the proportion of asthmatic patients treated appropriately with corticosteroids in the emergency department and at discharge; and Gastroenteritis—the proportion of relevant patients appropriately treated with oral rehydration therapy. Data sources include chart audits, administrative databases, environmental scans, and qualitative interviews. We will

  16. A space and time scale-dependent nonlinear geostatistical approach for downscaling daily precipitation and temperature

    NASA Astrophysics Data System (ADS)

    Jha, Sanjeev Kumar; Mariethoz, Gregoire; Evans, Jason; McCabe, Matthew F.; Sharma, Ashish

    2015-08-01

A geostatistical framework is proposed to downscale daily precipitation and temperature. The methodology is based on multiple-point geostatistics (MPS), where a multivariate training image is used to represent the spatial relationship between daily precipitation and daily temperature over several years. Here the training image consists of daily rainfall and temperature outputs from the Weather Research and Forecasting (WRF) model at 50 and 10 km resolution for a 20 year period ranging from 1985 to 2004. The data are used to predict downscaled climate variables for the year 2005. The result, for each downscaled pixel, is a daily time series of precipitation and temperature that are spatially dependent. Comparison of predicted precipitation and temperature against a reference data set indicates that both the seasonal average climate response and the temporal variability are well reproduced. The explicit inclusion of time dependence is explored by considering the climate properties of the previous day as an additional variable. Comparison of simulations with and without inclusion of time dependence shows that the temporal dependence only slightly improves the daily prediction because the temporal variability is already well represented in the conditioning data. Overall, the study shows that the multiple-point geostatistics approach is an efficient tool for statistical downscaling to obtain local-scale estimates of precipitation and temperature from General Circulation Models.

  17. Technology demonstration: geostatistical and hydrologic analysis of salt areas. Assessment of effectiveness of geologic isolation systems

    SciTech Connect

    Doctor, P.G.; Oberlander, P.L.; Rice, W.A.; Devary, J.L.; Nelson, R.W.; Tucker, P.E.

    1982-09-01

The Office of Nuclear Waste Isolation (ONWI) requested Pacific Northwest Laboratory (PNL) to: (1) use geostatistical analyses to evaluate the adequacy of hydrologic data from three salt regions, each of which contains a potential nuclear waste repository site; and (2) demonstrate a methodology that allows quantification of the value of additional data collection. The three regions examined are the Paradox Basin in Utah, the Permian Basin in Texas, and the Mississippi Study Area. Additional and new data became available to ONWI during and following these analyses; therefore, this report must be considered a demonstration of a methodology that would apply as illustrated had the complete data sets been available. A combination of geostatistical and hydrologic analyses was used for this demonstration. Geostatistical analyses provided an optimal estimate of the potentiometric surface from the available data, a measure of the uncertainty of that estimate, and a means for selecting and evaluating the location of future data. The hydrologic analyses included the calculation of transmissivities, flow paths, travel times, and ground-water flow rates from hypothetical repository sites. Simulation techniques were used to evaluate the effect of optimally located future data on the potentiometric surface, flow lines, travel times, and flow rates. Data availability, quality, quantity, and conformance with model assumptions differed in each of the salt areas. Report highlights for the three locations are given.

  18. Unsupervised classification of multivariate geostatistical data: Two algorithms

    NASA Astrophysics Data System (ADS)

    Romary, Thomas; Ors, Fabien; Rivoirard, Jacques; Deraisme, Jacques

    2015-12-01

With the increasing development of remote sensing platforms and the evolution of sampling facilities in the mining and oil industries, spatial datasets are becoming increasingly large, informing a growing number of variables and covering wider and wider areas. Therefore, it is often necessary to split the domain of study to account for radically different behaviors of the natural phenomenon over the domain and to simplify the subsequent modeling step. The definition of these areas can be seen as a problem of unsupervised classification, or clustering, where we try to divide the domain into homogeneous subdomains with respect to the values taken by the variables at hand. The application of classical clustering methods, designed for independent observations, does not ensure the spatial coherence of the resulting classes. Image segmentation methods, based on e.g. Markov random fields, are not adapted to irregularly sampled data. Other existing approaches, based on mixtures of Gaussian random functions estimated via the expectation-maximization algorithm, are limited to reasonable sample sizes and a small number of variables. In this work, we propose two algorithms based on adaptations of classical algorithms to multivariate geostatistical data. Both algorithms are model free and can handle large volumes of multivariate, irregularly spaced data. The first one proceeds by agglomerative hierarchical clustering. The spatial coherence is ensured by a proximity condition imposed for two clusters to merge. This proximity condition relies on a graph organizing the data in the coordinates space. The hierarchical algorithm can then be seen as a graph-partitioning algorithm. Following this interpretation, a spatial version of the spectral clustering algorithm is also proposed. The performances of both algorithms are assessed on toy examples and a mining dataset.
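The first algorithm's core idea, agglomerative clustering in which merges are restricted by a proximity graph built in coordinate space, can be sketched with scikit-learn's connectivity-constrained clustering. The data and parameters below are synthetic stand-ins (not the authors' mining dataset), and sklearn's Ward linkage stands in for their model-free merge criterion:

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.neighbors import kneighbors_graph

rng = np.random.default_rng(0)

# Synthetic irregularly sampled domain: two zones with different mean values
coords = rng.uniform(0, 10, size=(200, 2))            # sample locations
values = np.where(coords[:, 0] < 5, 0.0, 3.0)[:, None] \
         + rng.normal(0, 0.5, size=(200, 1))          # attribute values

# Proximity graph in coordinate space: only spatial neighbours may merge,
# which enforces the spatial coherence of the resulting classes
connectivity = kneighbors_graph(coords, n_neighbors=8, include_self=False)

model = AgglomerativeClustering(n_clusters=2, connectivity=connectivity,
                                linkage="ward")
labels = model.fit_predict(values)                    # cluster on values only
```

Because the constraint lives in the connectivity graph while the linkage criterion sees only attribute values, the resulting classes are homogeneous in the variables yet spatially contiguous.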

  19. Optimisation of groundwater level monitoring networks using geostatistical modelling based on the Spartan family variogram and a genetic algorithm method

    NASA Astrophysics Data System (ADS)

    Parasyris, Antonios E.; Spanoudaki, Katerina; Kampanis, Nikolaos A.

    2016-04-01

Groundwater level monitoring networks provide essential information for water resources management, especially in areas with significant groundwater exploitation for agricultural and domestic use. Given the high maintenance costs of these networks, development of tools, which can be used by regulators for efficient network design is essential. In this work, a monitoring network optimisation tool is presented. The network optimisation tool couples geostatistical modelling based on the Spartan family variogram with a genetic algorithm method and is applied to Mires basin in Crete, Greece, an area of high socioeconomic and agricultural interest, which suffers from groundwater overexploitation leading to a dramatic decrease of groundwater levels. The purpose of the optimisation tool is to determine which wells to exclude from the monitoring network because they add little or no beneficial information to groundwater level mapping of the area. Unlike previous relevant investigations, the network optimisation tool presented here uses Ordinary Kriging with the recently established non-differentiable Spartan variogram for groundwater level mapping, which, based on a previous geostatistical study in the area, leads to optimal groundwater level mapping. Seventy boreholes operate in the area for groundwater abstraction and water level monitoring. The Spartan variogram gives overall the most accurate groundwater level estimates followed closely by the power-law model. The geostatistical model is coupled to an integer genetic algorithm method programmed in MATLAB 2015a. The algorithm is used to find the set of wells whose removal leads to the minimum error between the original water level mapping using all the available wells in the network and the groundwater level mapping using the reduced well network (error is defined as the 2-norm of the difference between the original mapping matrix with 70 wells and the mapping matrix of the reduced well network). The solution to the
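The error criterion described above (the 2-norm between the full-network map and the reduced-network map) can be illustrated with a toy well-removal loop. This sketch substitutes inverse-distance weighting for Ordinary Kriging with the Spartan variogram, and a greedy search for the integer genetic algorithm, so it shows only the shape of the optimisation problem, not the authors' method; the well data are invented:

```python
import numpy as np

rng = np.random.default_rng(42)

def idw(obs_xy, obs_z, grid_xy, power=2.0):
    """Inverse-distance-weighted interpolation (stand-in for Ordinary Kriging)."""
    d = np.linalg.norm(grid_xy[:, None, :] - obs_xy[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    return (w @ obs_z) / w.sum(axis=1)

# Hypothetical monitoring network: 70 wells with groundwater levels
wells = rng.uniform(0, 100, size=(70, 2))
levels = 50 - 0.3 * wells[:, 0] + rng.normal(0, 1, 70)   # trend + noise

gx, gy = np.meshgrid(np.linspace(0, 100, 25), np.linspace(0, 100, 25))
grid = np.column_stack([gx.ravel(), gy.ravel()])
full_map = idw(wells, levels, grid)                      # 70-well mapping

# Greedy removal: repeatedly drop the well whose removal least perturbs the map
active = list(range(70))
for _ in range(10):                                      # remove 10 wells
    errs = []
    for i in active:
        keep = [j for j in active if j != i]
        errs.append(np.linalg.norm(full_map - idw(wells[keep], levels[keep], grid)))
    active.remove(active[int(np.argmin(errs))])

# 2-norm error of the reduced (60-well) network relative to the full mapping
reduced_err = np.linalg.norm(full_map - idw(wells[active], levels[active], grid))
```

A genetic algorithm explores subsets globally rather than one removal at a time, which matters when well redundancies interact; the objective function, however, is exactly the matrix 2-norm comparison shown here.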

  20. The role of geostatistics in medical geology

    NASA Astrophysics Data System (ADS)

    Goovaerts, Pierre

    2014-05-01

Since its development in the mining industry, geostatistics has emerged as the primary tool for spatial data analysis in various fields, ranging from earth and atmospheric sciences, to agriculture, soil science, remote sensing, and more recently environmental exposure assessment. In the last few years, these tools have been tailored to the field of medical geography or spatial epidemiology, which is concerned with the study of spatial patterns of disease incidence and mortality and the identification of potential 'causes' of disease, such as environmental exposure, diet and unhealthy behaviors, economic or socio-demographic factors. On the other hand, medical geology is an emerging interdisciplinary scientific field studying the relationship between natural geological factors and their effects on human and animal health. This paper provides an introduction to the field of medical geology with an overview of geostatistical methods available for the analysis of geological and health data. Key concepts are illustrated using the mapping of groundwater arsenic concentrations across eleven Michigan counties and the exploration of its relationship to the incidence of prostate cancer at the township level. Arsenic in drinking-water is a major problem and has received much attention because of the large human population exposed and the extremely high concentrations (e.g. 600 to 700 μg/L) recorded in many instances. Few studies have however assessed the risks associated with exposure to low levels of arsenic (say < 50 μg/L) most commonly found in drinking water in the United States. In the Michigan thumb region, arsenopyrite (up to 7% As by weight) has been identified in the bedrock of the Marshall Sandstone aquifer, one of the region's most productive aquifers. Epidemiologic studies have suggested a possible association between exposure to inorganic arsenic and prostate cancer mortality, including a study of populations residing in Utah. The information available for the

  1. Tuberculosis case-finding through a village outreach programme in a rural setting in southern Ethiopia: community randomized trial.

    PubMed Central

    Shargie, Estifanos Biru; Mørkve, Odd; Lindtjørn, Bernt

    2006-01-01

OBJECTIVE: To ascertain whether case-finding through community outreach in a rural setting has an effect on case-notification rate, symptom duration, and treatment outcome of smear-positive tuberculosis (TB). METHODS: We randomly allocated 32 rural communities to intervention or control groups. In intervention communities, health workers from seven health centres held monthly diagnostic outreach clinics at which they obtained sputum samples for sputum microscopy from symptomatic TB suspects. In addition, trained community promoters distributed leaflets and discussed symptoms of TB during house visits and at popular gatherings. Symptomatic individuals were encouraged to visit the outreach team or a nearby health facility. In control communities, cases were detected through passive case-finding among symptomatic suspects reporting to health facilities. Smear-positive TB patients from the intervention and control communities diagnosed during the study period were prospectively enrolled. FINDINGS: In the 1-year study period, 159 and 221 cases of smear-positive TB were detected in the intervention and control groups, respectively. Case-notification rates in all age groups were 124.6/10^5 and 98.1/10^5 person-years, respectively (P = 0.12). The corresponding rates in adults older than 14 years were 207/10^5 and 158/10^5 person-years, respectively (P = 0.09). The proportion of patients with >3 months' symptom duration was 41% in the intervention group compared with 63% in the control group (P<0.001). Pre-treatment symptom duration in the intervention group fell by 55-60% compared with 3-20% in the control group. In the intervention and control groups, 81% and 75%, respectively, of patients successfully completed treatment (P = 0.12). CONCLUSION: The intervention was effective in improving the speed but not the extent of case finding for smear-positive TB in this setting. Both groups had comparable treatment outcomes. PMID:16501728

  2. On the comparison of the interval estimation of the Pareto parameter under simple random sampling and ranked set sampling techniques

    NASA Astrophysics Data System (ADS)

    Aissa, Aissa Omar; Ibrahim, Kamarulzaman; Dayyeh, Walid Abu; Zin, Wan Zawiah Wan

    2015-02-01

Ranked set sampling (RSS) is recognized as a useful sampling scheme for improving the precision of parameter estimates and increasing the efficiency of estimation. This type of scheme is appropriate when the variable of interest is expensive or time-consuming to quantify but easy and cheap to rank. In this study, estimation of the shape parameter of the Pareto distribution of the first type, when the scale is known, is studied for data gathered under simple random sampling (SRS), RSS, and selective order statistics based on the maximum (SORSS(max)). Confidence intervals for the shape parameter under each of the sampling techniques considered are determined. A simulation study is carried out to compare the confidence intervals in terms of coverage probabilities (CPs) and expected lengths (ELs). The comparison shows that the confidence intervals are more precise under RSS than under SRS. In particular, the coverage probabilities under SORSS(max) are found to be closest to the nominal value of 0.95.
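A minimal Monte Carlo sketch in the spirit of this comparison, assuming a known scale x_m and perfect ranking in the RSS scheme. The exact chi-square interval used here applies under SRS; the RSS part only demonstrates the variance reduction of the sufficient statistic that drives RSS's improved precision, and all parameter values are illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
alpha_true, xm, n, m, reps = 2.0, 1.0, 30, 3, 2000

def srs(size):
    # Inverse-CDF draws from Pareto(alpha, xm): F^-1(u) = xm * (1-u)^(-1/alpha)
    return xm * (1 - rng.random(size)) ** (-1 / alpha_true)

def rss(n, m):
    """Ranked set sample: for each unit, draw m candidates, keep the i-th
    order statistic, cycling i = 1..m (perfect ranking assumed)."""
    out = []
    for k in range(n):
        cand = np.sort(srs(m))
        out.append(cand[k % m])
    return np.array(out)

# Variance of the log-mean (sufficient statistic for the shape) under each scheme
t_srs = [np.mean(np.log(srs(n) / xm)) for _ in range(reps)]
t_rss = [np.mean(np.log(rss(n, m) / xm)) for _ in range(reps)]

# Exact 95% CI under SRS: 2*alpha*sum(log(x/xm)) ~ chi-square(2n)
cover = 0
for _ in range(reps):
    T = np.sum(np.log(srs(n) / xm))
    lo = stats.chi2.ppf(0.025, 2 * n) / (2 * T)
    hi = stats.chi2.ppf(0.975, 2 * n) / (2 * T)
    cover += lo <= alpha_true <= hi
coverage = cover / reps
```

Since the logarithm is monotone, ranking on x is equivalent to ranking on log x, so the classical RSS variance-reduction result for sample means applies directly to the statistic that determines the shape estimate.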

  3. Testing Allele Transmission of an SNP Set Using a Family-Based Generalized Genetic Random Field Method.

    PubMed

    Li, Ming; Li, Jingyun; He, Zihuai; Lu, Qing; Witte, John S; Macleod, Stewart L; Hobbs, Charlotte A; Cleves, Mario A

    2016-05-01

Family-based association studies are commonly used in genetic research because they can be robust to population stratification (PS). Recent advances in high-throughput genotyping technologies have produced a massive amount of genomic data in family-based studies. However, current family-based association tests are mainly focused on evaluating individual variants one at a time. In this article, we introduce a family-based generalized genetic random field (FB-GGRF) method to test the joint association between a set of autosomal SNPs (i.e., single-nucleotide polymorphisms) and disease phenotypes. The proposed method is a natural extension of a recently developed GGRF method for population-based case-control studies. It models offspring genotypes conditional on parental genotypes, and, thus, is robust to PS. Through simulations, we showed that, under various disease scenarios, the FB-GGRF has improved power over a commonly used family-based sequence kernel association test (FB-SKAT). Further, similar to GGRF, the proposed FB-GGRF method is asymptotically well-behaved, and does not require empirical adjustment of the type I error rates. We illustrate the proposed method using a study of congenital heart defects with family trios from the National Birth Defects Prevention Study (NBDPS). PMID:27061818

  4. Estimation of extreme daily precipitation: comparison between regional and geostatistical approaches.

    NASA Astrophysics Data System (ADS)

    Hellies, Matteo; Deidda, Roberto; Langousis, Andreas

    2016-04-01

    We study the extreme rainfall regime of the Island of Sardinia in Italy, based on annual maxima of daily precipitation. The statistical analysis is conducted using 229 daily rainfall records with at least 50 complete years of observations, collected at different sites by the Hydrological Survey of the Sardinia Region. Preliminary analysis, and the L-skewness and L-kurtosis diagrams, show that the Generalized Extreme Value (GEV) distribution model performs best in describing daily rainfall extremes. The GEV distribution parameters are estimated using the method of Probability Weighted Moments (PWM). To obtain extreme rainfall estimates at ungauged sites, while minimizing uncertainties due to sampling variability, a regional and a geostatistical approach are compared. The regional approach merges information from different gauged sites, within homogeneous regions, to obtain GEV parameter estimates at ungauged locations. The geostatistical approach infers the parameters of the GEV distribution model at locations where measurements are available, and then spatially interpolates them over the study region. In both approaches we use local rainfall means as index-rainfall. In the regional approach we define homogeneous regions by applying a hierarchical cluster analysis based on Ward's method, with L-moment ratios (i.e. L-CV and L-Skewness) as metrics. The analysis results in four contiguous regions, which satisfy the Hosking and Wallis (1997) homogeneity tests. The latter have been conducted using a Monte-Carlo approach based on a 4-parameter Kappa distribution model, fitted to each station cluster. Note that the 4-parameter Kappa model includes the GEV distribution as a sub-case, when the fourth parameter h is set to 0. 
In the geostatistical approach we apply kriging for uncertain data (KUD), which accounts for the error variance in local parameter estimation and, therefore, may serve as a useful tool for spatial interpolation of metrics affected by high uncertainty. In
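The PWM fit of the GEV described above can be sketched as follows, using Hosking's rational approximation for the shape parameter. This assumes scipy's `genextreme` shape `c` follows the same sign convention as Hosking's k (positive shape = bounded upper tail); the sample parameters are illustrative:

```python
import numpy as np
from math import gamma, log
from scipy.stats import genextreme

def gev_pwm(x):
    """Fit GEV by probability-weighted moments (Hosking's approximation)."""
    x = np.sort(x)
    n = len(x)
    j = np.arange(1, n + 1)
    # Sample probability-weighted moments b0, b1, b2
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    # L-moments and L-skewness
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    t3 = l3 / l2
    # Hosking's approximation for the shape k (= scipy's c)
    c = 2 / (3 + t3) - log(2) / log(3)
    k = 7.8590 * c + 2.9554 * c ** 2
    sigma = l2 * k / ((1 - 2 ** (-k)) * gamma(1 + k))
    mu = l1 - sigma * (1 - gamma(1 + k)) / k
    return mu, sigma, k

rng = np.random.default_rng(7)
sample = genextreme.rvs(c=0.1, loc=10.0, scale=2.0, size=5000, random_state=rng)
mu, sigma, k = gev_pwm(sample)   # should recover roughly (10, 2, 0.1)
```

PWM estimates are attractive for annual-maxima records precisely because they remain stable at the modest sample sizes (50 or so years) typical of rain gauges.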

  5. High Performance Geostatistical Modeling of Biospheric Resources

    NASA Astrophysics Data System (ADS)

    Pedelty, J. A.; Morisette, J. T.; Smith, J. A.; Schnase, J. L.; Crosier, C. S.; Stohlgren, T. J.

    2004-12-01

We are using parallel geostatistical codes to study spatial relationships among biospheric resources in several study areas. For example, spatial statistical models based on large- and small-scale variability have been used to predict species richness of both native and exotic plants (hot spots of diversity) and patterns of exotic plant invasion. However, broader use of geostatistics in natural resource modeling, especially at regional and national scales, has been limited due to the large computing requirements of these applications. To address this problem, we implemented parallel versions of the kriging spatial interpolation algorithm. The first uses the Message Passing Interface (MPI) in a master/slave paradigm on an open source Linux Beowulf cluster, while the second is implemented with the new proprietary Xgrid distributed processing system on an Xserve G5 cluster from Apple Computer, Inc. These techniques are proving effective and provide the basis for a national decision support capability for invasive species management that is being jointly developed by NASA and the US Geological Survey.
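The master/slave decomposition works because interpolation at distinct grid blocks is independent given the observations. The sketch below mimics that pattern with a Python thread pool, with inverse-distance weighting standing in for the kriging solve; the MPI and Xgrid implementations referenced above partition work the same way, just across cluster nodes rather than threads:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

rng = np.random.default_rng(3)
obs_xy = rng.uniform(0, 1, size=(500, 2))                 # observation sites
obs_z = np.sin(3 * obs_xy[:, 0]) + rng.normal(0, 0.1, 500)

def interpolate_chunk(chunk):
    """Worker task: interpolate one block of the prediction grid
    (IDW stand-in for the kriging system each slave would solve)."""
    d = np.linalg.norm(chunk[:, None, :] - obs_xy[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-9) ** 2
    return (w @ obs_z) / w.sum(axis=1)

gx, gy = np.meshgrid(np.linspace(0, 1, 100), np.linspace(0, 1, 100))
grid = np.column_stack([gx.ravel(), gy.ravel()])

# "Master" splits the grid into blocks and farms them out to a worker pool,
# then concatenates the partial surfaces in order
chunks = np.array_split(grid, 8)
with ThreadPoolExecutor(max_workers=4) as pool:
    surface = np.concatenate(list(pool.map(interpolate_chunk, chunks)))
```

Because each block is embarrassingly parallel, the chunked result is bit-identical to a serial pass over the whole grid.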

  6. Geostatistical Sampling Methods for Efficient Uncertainty Analysis in Flow and Transport Problems

    NASA Astrophysics Data System (ADS)

    Liodakis, Stylianos; Kyriakidis, Phaedon; Gaganis, Petros

    2015-04-01

In hydrogeological applications involving flow and transport in heterogeneous porous media, the spatial distribution of hydraulic conductivity is often parameterized in terms of a lognormal random field based on a histogram and variogram model inferred from data and/or synthesized from relevant knowledge. Realizations of simulated conductivity fields are then generated using geostatistical simulation involving simple random (SR) sampling and are subsequently used as inputs to physically-based simulators of flow and transport in a Monte Carlo framework for evaluating the uncertainty in the spatial distribution of solute concentration due to the uncertainty in the spatial distribution of hydraulic conductivity [1]. Realistic uncertainty analysis, however, calls for a large number of simulated concentration fields; hence, it can become expensive in terms of both time and computer resources. A more efficient alternative to SR sampling is Latin hypercube (LH) sampling, a special case of stratified random sampling, which yields a more representative distribution of simulated attribute values with fewer realizations [2]. Here, the term representative implies realizations that efficiently span the range of possible conductivity values corresponding to the lognormal random field. In this work we investigate the efficiency of alternative methods to classical LH sampling within the context of simulation of flow and transport in a heterogeneous porous medium. More precisely, we consider the stratified likelihood (SL) sampling method of [3], in which attribute realizations are generated using the polar simulation method by exploring the geometrical properties of the multivariate Gaussian distribution function. 
In addition, we propose a more efficient version of the above method, here termed minimum energy (ME) sampling, whereby a set of N representative conductivity realizations at M locations is constructed by: (i) generating a representative set of N points distributed on the
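The defining property that makes LH sampling more representative than SR sampling, exactly one draw per equal-probability stratum, is easy to verify numerically. The lognormal parameters below are illustrative placeholders, not values from the study:

```python
import numpy as np
from scipy.stats import qmc, lognorm

rng = np.random.default_rng(5)
N = 50   # number of realizations / strata

# Simple random sampling of the uniform probability scale
u_sr = rng.random(N)

# Latin hypercube sampling: one draw per equal-probability stratum [i/N, (i+1)/N)
u_lh = qmc.LatinHypercube(d=1, seed=5).random(N).ravel()

# Map uniforms through the lognormal quantile function to conductivity values
# (sigma and median scale here are assumed, e.g. K ~ 1e-4 m/s)
k_sr = lognorm.ppf(u_sr, s=1.0, scale=1e-4)
k_lh = lognorm.ppf(u_lh, s=1.0, scale=1e-4)

# Count how many draws fall into each of the N strata
strata_lh = np.bincount((u_lh * N).astype(int), minlength=N)
```

With SR sampling some strata receive several draws and others none, so more realizations are needed to cover the tails of the conductivity distribution; LH sampling covers every stratum by construction.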

  7. Geostatistical Study of Precipitation on the Island of Crete

    NASA Astrophysics Data System (ADS)

    Agou, Vasiliki D.; Varouchakis, Emmanouil A.; Hristopulos, Dionissios T.

    2015-04-01

    precipitation which are fitted locally to a three-parameter probability distribution, based on which a normalized index is derived. We use the Spartan variogram function to model space-time correlations, because it is more flexible than classical models [3]. The performance of the variogram model is tested by means of leave-one-out cross validation. The variogram model is then used in connection with ordinary kriging to generate precipitation maps for the entire island. In the future, we will explore the joint spatiotemporal evolution of precipitation patterns on Crete. References [1] P. Goovaerts. Geostatistical approaches for incorporating elevation into the spatial interpolation of precipitation. Journal of Hydrology, 228(1):113-129, 2000. [2] N. B. Guttman. Accepting the standardized precipitation index: a calculation algorithm. American Water Resource Association, 35(2):311-322, 1999. [3] D. T Hristopulos. Spartan Gibbs random field models for geostatistical applications. SIAM Journal on Scientific Computing, 24(6):2125-2162, 2003. [4] A.G. Koutroulis, A.-E.K. Vrohidou, and I.K. Tsanis. Spatiotemporal characteristics of meteorological drought for the island of Crete. Journal of Hydrometeorology, 12(2):206-226, 2011. [5] T. B. McKee, N. J. Doesken, and J. Kleist. The relationship of drought frequency and duration to time scales. In Proceedings of the 8th Conference on Applied Climatology, page 179-184, Anaheim, California, 1993.

  8. Geostatistics from Digital Outcrop Models of Outcrop Analogues for Hydrocarbon Reservoir Characterisation.

    NASA Astrophysics Data System (ADS)

    Hodgetts, David; Burnham, Brian; Head, William; Jonathan, Atunima; Rarity, Franklin; Seers, Thomas; Spence, Guy

    2013-04-01

In the hydrocarbon industry, stochastic approaches are the main method by which reservoirs are modelled. These stochastic modelling approaches require geostatistical information on the geometry and distribution of the geological elements of the reservoir. As the reservoir itself cannot be viewed directly (only indirectly via seismic and/or well log data), this leads to a great deal of uncertainty in the geostatistics used; therefore outcrop analogues are characterised to help obtain the geostatistical information required to model the reservoir. Lidar-derived Digital Outcrop Models (DOMs) provide the ability to collect large quantities of statistical information on the geological architecture of the outcrop, far more than is possible by field work alone, as the DOM allows accurate measurements to be made in normally inaccessible parts of the exposure. This increases the size of the measured statistical dataset, which in turn results in an increase in statistical significance. There are, however, many problems and biases in the data which cannot be overcome by sample size alone. These biases, for example, may relate to the orientation, size and quality of exposure, as well as the resolution of the DOM itself. Stochastic modelling approaches used in the hydrocarbon industry fall mainly into 4 generic categories: 1) Object Modelling, where the geology is defined by a set of simplistic shapes (such as channels) whose parameters, such as width, height and orientation, among others, can be defined. 2) Sequential Indicator Simulation, where geological shapes are less well defined and the size and distribution are defined using variograms. 3) Multipoint statistics, where training images are used to define shapes and relationships between geological elements. 4) Discrete Fracture Networks for fractured reservoirs, where information on fracture size and distribution is required. Examples of using DOMs to assist with each of these modelling approaches are presented, highlighting the
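Approach (1), object modelling, can be illustrated with a toy simulation that drops channel bodies with randomised width and thickness into a background facies grid. The geometry statistics here are invented placeholders for the DOM-derived distributions the text describes:

```python
import numpy as np

rng = np.random.default_rng(11)
nz, nx = 100, 200
facies = np.zeros((nz, nx), dtype=int)   # 0 = background (e.g. shale)

# Drop channel objects with geometries drawn from assumed distributions;
# in practice the width/thickness statistics would come from the DOM
for _ in range(15):
    width = max(1, int(rng.normal(30, 8)))      # channel width in cells
    thickness = max(1, int(rng.normal(5, 1)))   # channel thickness in cells
    x0 = int(rng.integers(0, nx))               # channel centre position
    z0 = int(rng.integers(0, nz))               # stratigraphic level
    facies[z0:z0 + thickness,
           max(0, x0 - width // 2):x0 + width // 2] = 1   # 1 = channel sand

net_to_gross = facies.mean()   # fraction of the model occupied by sand
```

Variogram-based approaches (2) replace the explicit shapes with two-point spatial statistics, while multipoint statistics (3) would instead borrow patterns directly from a training image such as a DOM-derived facies panel.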

  9. Reducing uncertainty in geostatistical description with well testing pressure data

    SciTech Connect

    Reynolds, A.C.; He, Nanqun; Oliver, D.S.

    1997-08-01

Geostatistics has proven to be an effective tool for generating realizations of reservoir properties conditioned to static data, e.g., core and log data and geologic knowledge. Due to the lack of closely spaced data in the lateral directions, there will be significant variability in reservoir descriptions generated by geostatistical simulation, i.e., significant uncertainty in the reservoir descriptions. In past work, we have presented procedures based on inverse problem theory for generating reservoir descriptions (rock property fields) conditioned to pressure data and geostatistical information represented as prior means for log-permeability and porosity and variograms. Although we have shown that the incorporation of pressure data reduces the uncertainty below the level contained in the geostatistical model based only on static information (the prior model), our previous results did not explicitly account for uncertainties in the prior means and the parameters defining the variogram model. In this work, we investigate how pressure data can help detect errors in the prior means. If errors in the prior means are large and are not taken into account, realizations conditioned to pressure data represent incorrect samples of the a posteriori probability density function for the rock property fields, whereas, if the uncertainty in the prior mean is incorporated properly into the model, one obtains realistic realizations of the rock property fields.

  10. Extending an operational meteorological monitoring network through machine learning and classical geo-statistical approaches

    NASA Astrophysics Data System (ADS)

    Appelhans, Tim; Mwangomo, Ephraim; Otte, Insa; Detsch, Florian; Nauss, Thomas; Hemp, Andreas; Ndyamkama, Jimmy

    2015-04-01

This study introduces the set-up and characteristics of a meteorological station network on the southern slopes of Mt. Kilimanjaro, Tanzania. The set-up follows a hierarchical approach covering an elevational as well as a land-use disturbance gradient. The network consists of 52 basic stations measuring ambient air temperature and above-ground air humidity and 11 precipitation measurement sites. We provide in-depth descriptions of various machine learning and classical geo-statistical methods used to fill observation gaps and extend the spatial coverage of the network to a total of 60 research sites. Performance statistics for these methods indicate that the presented data sets provide reliable measurements of the meteorological reality at Mt. Kilimanjaro. These data provide an excellent basis for ecological studies and are also of great value for regional atmospheric numerical modelling studies for which such comprehensive in-situ validation observations are rare, especially in tropical regions of complex terrain.

  11. A Parenting Intervention for Childhood Behavioral Problems: A Randomized Controlled Trial in Disadvantaged Community-Based Settings

    ERIC Educational Resources Information Center

    McGilloway, Sinead; Mhaille, Grainne Ni; Bywater, Tracey; Furlong, Mairead; Leckey, Yvonne; Kelly, Paul; Comiskey, Catherine; Donnelly, Michael

    2012-01-01

    Objective: A community-based randomized controlled trial (RCT) was conducted in urban areas characterized by high levels of disadvantage to test the effectiveness of the Incredible Years BASIC parent training program (IYBP) for children with behavioral problems. Potential moderators of intervention effects on child behavioral outcomes were also…

  12. Geostatistical analysis of soil properties at field scale using standardized data

    NASA Astrophysics Data System (ADS)

    Millan, H.; Tarquis, A. M.; Pérez, L. D.; Matos, J.; González-Posada, M.

    2012-04-01

    Identifying areas with physical degradation is a crucial step toward ameliorating the effects of soil erosion. The quantification and interpretation of spatial variability is a key issue for site-specific soil management. Geostatistics has been the main methodological tool for implementing precision agriculture using field data collected at different spatial resolutions. Even though many works have made significant contributions to the body of knowledge on spatial statistics and its applications, some other key points need to be addressed for conducting precise comparisons between soil properties using geostatistical parameters. The objectives of the present work were (i) to quantify the spatial structure of different physical properties collected from a Vertisol, (ii) to search for potential correlations between different spatial patterns and (iii) to identify relevant components through multivariate spatial analysis. The study was conducted on a Vertisol (Typic Hapludert) dedicated to sugarcane (Saccharum officinarum L.) production during the last sixty years. We used six soil properties collected from a square grid (225 points): penetrometer resistance (PR), total porosity, fragmentation dimension (Df), vertical electrical conductivity (ECv), horizontal electrical conductivity (ECh) and soil water content (WC). All the original data sets were z-transformed before geostatistical analysis. Three different types of semivariogram models were necessary for fitting individual experimental semivariograms, which suggests that the spatial variability patterns are of different natures. Soil water content rendered the largest nugget effect (C0 = 0.933), while soil total porosity showed the largest range of spatial correlation (A = 43.92 m). The bivariate geostatistical analysis also rendered significant cross-semivariance between different paired soil properties. However, four different semivariogram models were required in that case. This indicates an underlying co
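The z-transform-then-semivariogram workflow described above can be sketched in a few lines. This is a minimal illustration on a hypothetical 15 × 15 grid of synthetic, spatially correlated values (all data here are invented, not the study's Vertisol measurements):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def semivariogram(coords, values, lags, tol):
    """Experimental semivariogram of z-transformed values:
    gamma(h) = 0.5 * mean[(z_i - z_j)^2] over point pairs separated by ~h."""
    z = (values - values.mean()) / values.std()   # z-transform, as in the study
    dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    i, j = np.triu_indices(len(z), k=1)
    pair_d, pair_sq = dists[i, j], 0.5 * (z[i] - z[j]) ** 2
    return np.array([pair_sq[np.abs(pair_d - h) <= tol].mean() for h in lags])

# hypothetical 15 x 15 sampling grid with spatially correlated values
rng = np.random.default_rng(0)
gx, gy = np.meshgrid(np.arange(15.0), np.arange(15.0))
coords = np.column_stack([gx.ravel(), gy.ravel()])
field = uniform_filter(rng.normal(size=(15, 15)), size=3)  # smoothing induces correlation
gamma = semivariogram(coords, field.ravel(), lags=np.array([1.0, 2.0, 4.0, 8.0]), tol=0.5)
```

For a spatially correlated field, the semivariance rises with lag distance toward the sill, which is what nugget, range, and sill parameters are then fitted against.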

  13. [Geostatistical analysis on distribution dynamics of Myzus persicae (Sulzer) in flue-cured tobacco field].

    PubMed

    Xia, Peng-liang; Liu, Ying-hong; Fan, Jun; Tan, Jun

    2015-02-01

    Myzus persicae (Aphididae, Hemiptera) is an important migratory pest in tobacco fields. As nymph and adult, it sucks plant juice, promotes mold diseases, spreads tobacco virus diseases and causes heavy losses of yield and quality. The distribution pattern and dynamics of winged and wingless aphids in the field were investigated from the transplanting of tobacco to the harvesting stage of the middle tobacco leaves in Enshi, Hubei. The semivariogram characteristics were analyzed by geostatistical methods, and the field migration pattern was simulated. The results showed that the population dynamics of winged aphids in Enshi followed a bimodal curve, with peaks at 3 weeks after transplanting and 2 weeks after multi-topping of tobacco leaves, and the spatial pattern passed through a five-stage sequence of random, aggregated, random, aggregated and random. The population dynamics of wingless peach aphids followed a single-peak curve, reaching its peak before multi-topping, with a three-stage sequence of random, aggregated and random. Human factors and the hosts had considerable effects on the population density. Spatial distribution simulation (interpolation) maps could clearly reflect the dynamics of tobacco aphids. Combined with Pearson correlation analysis, we found that the population density was low and highly concentrated in the winged form during the immigration period, which is therefore the key period for the management of peach aphids. PMID:26094473

  14. Comparing the performance of geostatistical models with additional information from covariates for sewage plume characterization.

    PubMed

    Del Monego, Maurici; Ribeiro, Paulo Justiniano; Ramos, Patrícia

    2015-04-01

    In this work, kriging with covariates is used to model and map the spatial distribution of salinity measurements gathered by an autonomous underwater vehicle in a sea outfall monitoring campaign aiming to distinguish the effluent plume from the receiving waters and characterize its spatial variability in the vicinity of the discharge. Four different geostatistical linear models for salinity were assumed, where the distance to diffuser, the west-east positioning, and the south-north positioning were used as covariates. Sample variograms were fitted by Matérn models using weighted least squares and maximum likelihood estimation methods as a way to detect eventual discrepancies. Typically, the maximum likelihood method estimated very low ranges, which limited the kriging process. So, at least for these data sets, weighted least squares proved to be the most appropriate estimation method for variogram fitting. The kriged maps show clearly the spatial variation of salinity, and it is possible to identify the effluent plume in the area studied. The results obtained provide some guidelines for sewage monitoring if a geostatistical analysis of the data is intended. It is important to treat properly the existence of anomalous values and to adopt a sampling strategy that includes transects parallel and perpendicular to the effluent dispersion. PMID:25345922
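As a rough illustration of variogram fitting by weighted least squares with a Matérn model, the sketch below fits the closed-form Matérn variogram with smoothness 3/2 to a hypothetical experimental semivariogram, down-weighting lags with few pairs. The lag values, semivariances, and pair counts are invented, and this weighting scheme is one common choice, not necessarily the one used in the study:

```python
import numpy as np
from scipy.optimize import curve_fit

def matern32(h, sill, a):
    """Matérn variogram with smoothness 3/2 (closed form, zero nugget)."""
    s = np.sqrt(3.0) * h / a
    return sill * (1.0 - (1.0 + s) * np.exp(-s))

# hypothetical experimental semivariogram: lags, semivariances, pair counts
lags = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 120.0])
gamma = np.array([0.21, 0.38, 0.61, 0.82, 0.95, 0.99])
npairs = np.array([40, 80, 150, 300, 420, 380])

# weighted least squares: lags estimated from more pairs get more weight
popt, _ = curve_fit(matern32, lags, gamma, p0=[1.0, 30.0],
                    sigma=1.0 / np.sqrt(npairs), absolute_sigma=False)
sill_hat, range_hat = popt
```

A maximum-likelihood fit would instead maximize the Gaussian likelihood of the raw data under the covariance implied by the model, which is where the two methods can disagree on the range parameter, as the abstract reports.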

  15. Geostatistical regularization of inverse models for the retrieval of vegetation biophysical variables

    NASA Astrophysics Data System (ADS)

    Atzberger, C.; Richter, K.

    2009-09-01

    The robust and accurate retrieval of vegetation biophysical variables using radiative transfer models (RTM) is seriously hampered by the ill-posedness of the inverse problem. With this research we further develop our previously published (object-based) inversion approach [Atzberger (2004)]. The object-based RTM inversion takes advantage of the geostatistical fact that the biophysical characteristics of nearby pixels are generally more similar than those at a larger distance. A two-step inversion based on PROSPECT+SAIL generated look-up-tables is presented that can be easily implemented and adapted to other radiative transfer models. The approach takes into account the spectral signatures of neighboring pixels and optimizes a common value of the average leaf angle (ALA) for all pixels of a given image object, such as an agricultural field. Using a large set of leaf area index (LAI) measurements (n = 58) acquired over six different crops at the Barrax test site (Spain), we demonstrate that the proposed geostatistical regularization yields in most cases more accurate and spatially consistent results compared to the traditional (pixel-based) inversion. Pros and cons of the approach are discussed and possible future extensions presented.

  16. Large-scale hydraulic tomography and joint inversion of head and tracer data using the Principal Component Geostatistical Approach (PCGA)

    NASA Astrophysics Data System (ADS)

    Lee, J.; Kitanidis, P. K.

    2014-07-01

    The stochastic geostatistical inversion approach is widely used in subsurface inverse problems to estimate unknown parameter fields and corresponding uncertainty from noisy observations. However, the approach requires a large number of forward model runs to determine the Jacobian or sensitivity matrix, so the computational and storage costs become prohibitive as the number of unknowns, m, and the number of observations, n, increase. To overcome this challenge in large-scale geostatistical inversion, the Principal Component Geostatistical Approach (PCGA) has recently been developed as a "matrix-free" geostatistical inversion strategy that avoids the direct evaluation of the Jacobian matrix through the principal components (low-rank approximation) of the prior covariance and the drift matrix with a finite difference approximation. As a result, the proposed method requires about K runs of the forward problem in each iteration, independently of m and n, where K is the number of principal components and can be much less than m and n for large-scale inverse problems. Furthermore, the PCGA is easily adaptable to different forward simulation models and various data types for which the adjoint-state method may not be implemented suitably. In this paper, we apply the PCGA to representative subsurface inverse problems to illustrate its efficiency and scalability. The low-rank approximation of the large-dimensional dense prior covariance matrix is computed through a randomized eigendecomposition. A hydraulic tomography problem in which the number of observations is typically large is investigated first to validate the accuracy of the PCGA compared with the conventional geostatistical approach. Then the method is applied to a large-scale hydraulic tomography problem with 3 million unknowns, and it is shown that underlying subsurface structures are characterized successfully through an inversion that involves an affordable number of forward simulation runs. Lastly, we present a joint

  17. Stochastic Local Interaction (SLI) model: Bridging machine learning and geostatistics

    NASA Astrophysics Data System (ADS)

    Hristopulos, Dionissios T.

    2015-12-01

    Machine learning and geostatistics are powerful mathematical frameworks for modeling spatial data. Both approaches, however, suffer from poor scaling of the required computational resources for large data applications. We present the Stochastic Local Interaction (SLI) model, which employs a local representation to improve computational efficiency. SLI combines geostatistics and machine learning with ideas from statistical physics and computational geometry. It is based on a joint probability density function defined by an energy functional which involves local interactions implemented by means of kernel functions with adaptive local kernel bandwidths. SLI is expressed in terms of an explicit, typically sparse, precision (inverse covariance) matrix. This representation leads to a semi-analytical expression for interpolation (prediction), which is valid in any number of dimensions and avoids the computationally costly covariance matrix inversion.
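The prediction step expressed through a precision (inverse covariance) matrix, rather than a covariance matrix, can be illustrated with a toy example. The 5 × 5 precision matrix below is invented for illustration and is not an SLI energy functional, but it shows how the conditional mean at an unsampled site follows semi-analytically from the precision entries without inverting a covariance matrix:

```python
import numpy as np

# joint zero-mean Gaussian over 4 data sites + 1 prediction site, specified
# directly by a sparse precision (inverse covariance) matrix J (invented values;
# strictly diagonally dominant, hence positive definite)
J = np.array([
    [ 2.0, -0.5,  0.0,  0.0, -0.5],
    [-0.5,  2.0, -0.5,  0.0, -0.5],
    [ 0.0, -0.5,  2.0, -0.5, -0.5],
    [ 0.0,  0.0, -0.5,  2.0, -0.5],
    [-0.5, -0.5, -0.5, -0.5,  3.0],  # prediction site coupled to its neighbors
])

z_data = np.array([1.2, 0.7, -0.3, 0.4])

# conditional mean of the prediction site given the data:
#   z_p = -J_pp^{-1} J_pd z_data
J_pd = J[4, :4]
J_pp = J[4, 4]
z_pred = -(J_pd @ z_data) / J_pp
```

Because only the rows of the precision matrix touching the prediction site enter the formula, a sparse precision representation yields predictions whose cost scales with the local neighborhood size rather than with the full data set.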

  18. Hydrogeologic Unit Flow Characterization Using Transition Probability Geostatistics

    SciTech Connect

    Jones, N L; Walker, J R; Carle, S F

    2003-11-21

    This paper describes a technique for applying the transition probability geostatistics method for stochastic simulation to a MODFLOW model. Transition probability geostatistics has several advantages over traditional indicator kriging methods including a simpler and more intuitive framework for interpreting geologic relationships and the ability to simulate juxtapositional tendencies such as fining upwards sequences. The indicator arrays generated by the transition probability simulation are converted to layer elevation and thickness arrays for use with the new Hydrogeologic Unit Flow (HUF) package in MODFLOW 2000. This makes it possible to preserve complex heterogeneity while using reasonably sized grids. An application of the technique involving probabilistic capture zone delineation for the Aberjona Aquifer in Woburn, MA, is included.
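Transition probability geostatistics builds on Markov-chain transition rates between facies. A minimal 1-D sketch (with an invented three-facies transition matrix, not the Aberjona model) simulates a facies sequence and verifies that empirical transition frequencies recover the specified probabilities:

```python
import numpy as np

# hypothetical one-cell-lag transition probability matrix between 3 facies;
# row i gives P(next facies = j | current facies = i)
T = np.array([[0.80, 0.15, 0.05],
              [0.10, 0.80, 0.10],
              [0.05, 0.25, 0.70]])

rng = np.random.default_rng(3)
n = 2000
facies = np.empty(n, dtype=int)
facies[0] = 0
for k in range(1, n):
    facies[k] = rng.choice(3, p=T[facies[k - 1]])  # Markov-chain step

# empirical transition frequencies should approach T for a long chain
counts = np.zeros((3, 3))
for a, b in zip(facies[:-1], facies[1:]):
    counts[a, b] += 1
T_emp = counts / counts.sum(axis=1, keepdims=True)
```

In practice the transition matrix is inferred from borehole logs, and the realization is an indicator array that can then be converted to HUF layer elevations and thicknesses as the paper describes.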

  19. Assessing the resolution-dependent utility of tomograms for geostatistics

    USGS Publications Warehouse

    Day-Lewis, F. D.; Lane, J.W., Jr.

    2004-01-01

    Geophysical tomograms are used increasingly as auxiliary data for geostatistical modeling of aquifer and reservoir properties. The correlation between tomographic estimates and hydrogeologic properties is commonly based on laboratory measurements, co-located measurements at boreholes, or petrophysical models. The inferred correlation is assumed uniform throughout the interwell region; however, tomographic resolution varies spatially due to acquisition geometry, regularization, data error, and the physics underlying the geophysical measurements. Blurring and inversion artifacts are expected in regions traversed by few or only low-angle raypaths. In the context of radar traveltime tomography, we derive analytical models for (1) the variance of tomographic estimates, (2) the spatially variable correlation with a hydrologic parameter of interest, and (3) the spatial covariance of tomographic estimates. Synthetic examples demonstrate that tomograms of qualitative value may have limited utility for geostatistics; moreover, the imprint of regularization may preclude inference of meaningful spatial statistics from tomograms.

  20. A randomized controlled trial of a nurse-administered educational intervention for improving cancer pain management in ambulatory settings.

    PubMed

    Yates, Patsy; Edwards, Helen; Nash, Robyn; Aranda, Sanchia; Purdie, David; Najman, Jake; Skerman, Helen; Walsh, Anne

    2004-05-01

    The persistence of negative attitudes towards cancer pain and its treatment suggests there is scope for identifying more effective pain education strategies. This randomized controlled trial involving 189 ambulatory cancer patients evaluated an educational intervention that aimed to optimize patients' ability to manage pain. One week post-intervention, patients receiving the pain management intervention (PMI) had a significantly greater increase in self-reported pain knowledge, perceived control over pain, and number of pain treatments recommended. Intervention group patients also demonstrated a greater reduction in willingness to tolerate pain, concerns about addiction and side effects, being a "good" patient, and tolerance to pain relieving medication. The results suggest that targeted educational interventions that utilize individualized instructional techniques may alter cancer patient attitudes, which can potentially act as barriers to effective pain management. PMID:15140463

  1. Addressing uncertainty in rock properties through geostatistical simulation

    SciTech Connect

    McKenna, S.A.; Rautman, A.; Cromer, M.V.; Zelinski, W.P.

    1996-09-01

    Fracture and matrix properties in a sequence of unsaturated, welded tuffs at Yucca Mountain, Nevada, are modeled in two-dimensional cross-sections through geostatistical simulation. In the absence of large amounts of sample data, an interpretive, deterministic, stratigraphic model is coupled with a Gaussian simulation algorithm to constrain realizations of both matrix porosity and fracture frequency. Use of the deterministic, stratigraphic model imposes scientific judgment, in the form of a conceptual geologic model, onto the property realizations. Linear coregionalization and a regression relationship between matrix porosity and matrix hydraulic conductivity are used to generate realizations of matrix hydraulic conductivity. Fracture-frequency simulations conditioned on the stratigraphic model represent one class of fractures (cooling fractures) in the conceptual model of the geology. A second class of fractures (tectonic fractures) is conceptualized as fractures that cut across strata vertically and includes discrete features such as fault zones. Indicator geostatistical simulation provides locations of this second class of fractures. The indicator realizations are combined with the realizations of fracture spacing to create realizations of fracture frequency that are a combination of both classes of fractures. Evaluations of the resulting realizations include comparing vertical profiles of rock properties within the model to those observed in boreholes and checking intra-unit property distributions against collected data. Geostatistical simulation provides an efficient means of addressing spatial uncertainty in dual continuum rock properties.

  2. An improved algorithm of a priori based on geostatistics

    NASA Astrophysics Data System (ADS)

    Chen, Jiangping; Wang, Rong; Tang, Xuehua

    2008-12-01

    In data mining, one of the classical algorithms is Apriori, which was developed for association rule mining in large transaction databases, but it cannot be used directly for spatial association rule mining. The main difference between data mining in relational databases and in spatial databases is that attributes of the neighbors of an object of interest may have an influence on the object and therefore have to be considered as well. The explicit location and extension of spatial objects define implicit relations of spatial neighborhood (such as topological, distance and direction relations) which are used by spatial data mining algorithms. Therefore, new techniques are required for effective and efficient spatial data mining. Geostatistics comprises statistical methods used to describe spatial relationships among sample data and to apply this analysis to the prediction of spatial and temporal phenomena; these methods are used to explain spatial patterns and to interpolate values at unsampled locations. This paper puts forward an improved Apriori algorithm for mining association rules with geostatistics. First, the spatial autocorrelation of georeferenced attributes is estimated with geostatistical methods such as kriging and the Spatial Autoregressive Model (SAR). A spatial autocorrelation model of the attributes is then built, and an improved Apriori algorithm combined with this model is applied to mine spatial association rules. Finally, the new algorithm was tested on hay fever incidence and climate factors in the UK. The results show that the output rules are consistent with the references.

  3. Psychosocial education improves low back pain beliefs: results from a cluster randomized clinical trial (NCT00373009) in a primary prevention setting.

    PubMed

    George, Steven Z; Teyhen, Deydre S; Wu, Samuel S; Wright, Alison C; Dugan, Jessica L; Yang, Guijun; Robinson, Michael E; Childs, John D

    2009-07-01

    The general population has a pessimistic view of low back pain (LBP), and evidence-based information has been used to positively influence LBP beliefs in previously reported mass media studies. However, there is a lack of randomized trials investigating whether LBP beliefs can be modified in primary prevention settings. This cluster randomized clinical trial investigated the effect of an evidence-based psychosocial educational program (PSEP) on LBP beliefs for soldiers completing military training. A military setting was selected for this clinical trial, because LBP is a common cause of soldier disability. Companies of soldiers (n = 3,792) were recruited, and cluster randomized to receive a PSEP or no education (control group, CG). The PSEP consisted of an interactive seminar, and soldiers were issued the Back Book for reference material. The primary outcome measure was the back beliefs questionnaire (BBQ), which assesses inevitable consequences of and ability to cope with LBP. The BBQ was administered before randomization and 12 weeks later. A linear mixed model was fitted for the BBQ at the 12-week follow-up, and a generalized linear mixed model was fitted for the dichotomous outcomes on BBQ change of greater than two points. Sensitivity analyses were performed to account for dropout. BBQ scores (potential range: 9-45) improved significantly from baseline of 25.6 +/- 5.7 (mean +/- SD) to 26.9 +/- 6.2 for those receiving the PSEP, while there was a significant decline from 26.1 +/- 5.7 to 25.6 +/- 6.0 for those in the CG. The adjusted mean BBQ score at follow-up for those receiving the PSEP was 1.49 points higher than those in the CG (P < 0.0001). The adjusted odds ratio of BBQ improvement of greater than two points for those receiving the PSEP was 1.51 (95% CI = 1.22-1.86) times that of those in the CG. BBQ improvement was also mildly associated with race and college education. Sensitivity analyses suggested minimal influence of dropout. In conclusion, soldiers

  4. Classroom-based Interventions and Teachers’ Perceived Job Stressors and Confidence: Evidence from a Randomized Trial in Head Start Settings

    PubMed Central

    Zhai, Fuhua; Raver, C. Cybele; Li-Grining, Christine

    2011-01-01

    Preschool teachers’ job stressors have received increasing attention but have been understudied in the literature. We investigated the impacts of a classroom-based intervention, the Chicago School Readiness Project (CSRP), on teachers’ perceived job stressors and confidence, as indexed by their perceptions of job control, job resources, job demands, and confidence in behavior management. Using a clustered randomized controlled trial (RCT) design, the CSRP provided multifaceted services to the treatment group, including teacher training and mental health consultation, which were accompanied by stress-reduction services and workshops. Overall, 90 teachers in 35 classrooms at 18 Head Start sites participated in the study. After adjusting for teacher and classroom factors and site fixed effects, we found that the CSRP had significant effects on the improvement of teachers’ perceived job control and work-related resources. We also found that the CSRP decreased teachers’ confidence in behavior management and had no statistically significant effects on job demands. Overall, we did not find significant moderation effects of teacher race/ethnicity, education, teaching experience, or teacher type. The implications for research and policy are discussed. PMID:21927538

  5. Geostatistical analysis of fault and joint measurements in Austin Chalk, Superconducting Super Collider Site, Texas

    SciTech Connect

    Mace, R.E.; Nance, H.S.; Laubach, S.E.

    1995-06-01

    Faults and joints are conduits for ground-water flow and targets for horizontal drilling in the petroleum industry. Spacing and size distribution are rarely predicted accurately by current structural models or documented adequately by conventional borehole or outcrop samples. Tunnel excavations present opportunities to measure fracture attributes in continuous subsurface exposures. These fracture measurements can be used to improve structural models, guide interpretation of conventional borehole and outcrop data, and geostatistically quantify spatial and spacing characteristics for comparison to outcrop data or for generating distributions of fractures for numerical flow and transport modeling. Structure maps of over 9 mi of nearly continuous tunnel excavations in Austin Chalk at the Superconducting Super Collider (SSC) site in Ellis County, Texas, provide a unique database of fault and joint populations for geostatistical analysis. Observationally, small faults (<10 ft. throw) occur in clusters or swarms that have as many as 24 faults; fault swarms are as much as 2,000 ft. wide and appear to be on average 1,000 ft. apart, and joints are in swarms spaced 500 to more than 21,000 ft. apart. Semi-variograms show varying degrees of spatial correlation. These variograms have structured sills that correlate directly to highs and lows in fracture frequency observed in the tunnel. Semi-variograms generated with respect to fracture spacing and number also have structured sills, but tend not to show any near-field correlation. The distribution of fault spacing can be described with a negative exponential, which suggests a random distribution. However, there is clearly some structure and clustering in the spacing data as shown by running averages and variograms, which implies that a number of different methods should be utilized to characterize fracture spacing.
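The observation that a negative exponential spacing distribution suggests spatial randomness can be checked directly: for a Poisson (spatially random) arrangement of fractures along a scanline, the spacings between successive fractures are exponentially distributed. The sketch below uses synthetic fracture positions (invented, not the SSC tunnel data):

```python
import numpy as np
from scipy import stats

# synthetic spatially random fracture positions along ~9 mi of tunnel (in feet)
rng = np.random.default_rng(1)
positions = np.sort(rng.uniform(0.0, 9.0 * 5280.0, size=300))
spacings = np.diff(positions)

# fit the exponential scale (mean spacing) and test the fit with a KS statistic
mean_spacing = spacings.mean()
ks_stat, p_value = stats.kstest(spacings, "expon", args=(0.0, mean_spacing))
```

For clustered (swarm-like) spacings, by contrast, the KS statistic would be large and the exponential fit would be rejected, which is one quantitative way to detect the structure the abstract notes in the spacing data.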

  6. Psychoneuroendocrine effects of cognitive-behavioral stress management in a naturalistic setting--a randomized controlled trial.

    PubMed

    Gaab, J; Sonderegger, L; Scherrer, S; Ehlert, U

    2006-05-01

    It is assumed that chronic or extensive release of cortisol due to stress has deleterious effects on somatic and psychological health, making interventions aiming to reduce and/or normalize cortisol secretion to stress of interest. Cognitive-behavioral stress management (CBSM) has repeatedly been shown to effectively reduce cortisol responses to acute psychosocial stress. However, the effects of CBSM on psychoneuroendocrine responses during "real-life" stress have not yet been examined in healthy subjects. Eight weeks before all subjects took an important academic exam, 28 healthy economics students were randomly assigned to four weekly sessions of cognitive behavioral stress management (CBSM) training or a waiting control condition. Psychological and somatic symptoms were repeatedly assessed throughout the preparation period. Salivary cortisol (cortisol awakening response and short circadian cortisol profile) was repeatedly measured at baseline and on the day of the exam. In addition, cognitive appraisal was assessed on the day of the exam. Subjects in the CBSM group showed significantly lower anxiety and somatic symptom levels throughout the period prior to the exam. On the day of the exam, groups differed in their cortisol awakening stress responses, with significantly attenuated cortisol levels in controls. Short circadian cortisol levels did not differ between groups. Interestingly, groups differed in their associations between cortisol responses before the exam and cognitive stress appraisal, with dissociation in controls but not in the CBSM group. The results show that CBSM reduces psychological and somatic symptoms and influences the ability to show a cortisol response corresponding to subjectively perceived stress. In line with current psychoneuroendocrine models, the inability to mount a cortisol response corresponding to the cognitive appraisal in controls could be a result of a dysregulated HPA axis, probably as a consequence of long-lasting stress. PMID

  7. A randomized controlled trial of a computer-based physician workstation in an outpatient setting: implementation barriers to outcome evaluation.

    PubMed Central

    Rotman, B L; Sullivan, A N; McDonald, T W; Brown, B W; DeSmedt, P; Goodnature, D; Higgins, M C; Suermondt, H J; Young, C; Owens, D K

    1996-01-01

    OBJECTIVE: A research prototype Physician Workstation (PWS) incorporating a graphical user interface and a drug ordering module was compared with the existing hospital information system in an academic Veterans Administration General Medical Clinic. Physicians in the intervention group received recommendations for drug substitutions to reduce costs and were alerted to potential drug interactions. The objective was to evaluate the effect of the PWS on user satisfaction, on health-related outcomes, and on costs. DESIGN: A one-year, two-period, randomized controlled trial with 37 subjects. MEASUREMENTS: Differences in the reliance on noncomputer sources of information, in user satisfaction, in the cost of prescribed medications, and in the rate of clinically relevant drug interactions were assessed. RESULTS: The study subjects logged onto the workstation an average of 6.53 times per provider and used it to generate 2.8% of prescriptions during the intervention period. On a five-point scale (5 = very satisfied, 1 = very dissatisfied), user satisfaction declined in the PWS group (3.44 to 2.98, p = 0.008), and increased in the control group (3.23 to 3.72, p < 0.0001). CONCLUSION: The intervention physicians did not use the PWS frequently enough to influence information-seeking behavior, health outcomes, or cost. The study design did not determine whether the poor usage resulted from satisfaction with the control system, problems using the PWS intervention, or the functions provided by the PWS intervention. Evaluative studies should include provisions to improve the chance of successful implementation as well as to yield maximum information if a negative study occurs. PMID:8880681

  8. A Controlled Design of Aligned and Random Nanofibers for 3D Bi-functionalized Nerve Conduits Fabricated via a Novel Electrospinning Set-up

    PubMed Central

    Kim, Jeong In; Hwang, Tae In; Aguilar, Ludwig Erik; Park, Chan Hee; Kim, Cheol Sang

    2016-01-01

    Scaffolds made of aligned nanofibers are favorable for nerve regeneration due to their superior nerve cell attachment and proliferation. However, it is challenging not only to produce a neat mat or a conduit form with aligned nanofibers but also to use these for surgical applications as a nerve guide conduit due to their insufficient mechanical strength. Furthermore, no studies have been reported on the fabrication of aligned nanofibers and randomly-oriented nanofibers on the same mat. In this study, we have successfully produced a mat with both aligned and randomly-oriented nanofibers by using a novel electrospinning set up. A new conduit with a highly-aligned electrospun mat is produced with this modified electrospinning method, and this proposed conduit with favorable features, such as selective permeability, hydrophilicity and nerve growth directional steering, were fabricated as nerve guide conduits (NGCs). The inner surface of the nerve conduit is covered with highly aligned electrospun nanofibers and is able to enhance the proliferation of neural cells. The central part of the tube is double-coated with randomly-oriented nanofibers over the aligned nanofibers, strengthening the weak mechanical strength of the aligned nanofibers. PMID:27021221

  9. A Controlled Design of Aligned and Random Nanofibers for 3D Bi-functionalized Nerve Conduits Fabricated via a Novel Electrospinning Set-up.

    PubMed

    Kim, Jeong In; Hwang, Tae In; Aguilar, Ludwig Erik; Park, Chan Hee; Kim, Cheol Sang

    2016-01-01

    Scaffolds made of aligned nanofibers are favorable for nerve regeneration due to their superior nerve cell attachment and proliferation. However, it is challenging not only to produce a neat mat or a conduit form with aligned nanofibers but also to use these for surgical applications as a nerve guide conduit due to their insufficient mechanical strength. Furthermore, no studies have been reported on the fabrication of aligned nanofibers and randomly-oriented nanofibers on the same mat. In this study, we have successfully produced a mat with both aligned and randomly-oriented nanofibers by using a novel electrospinning set up. A new conduit with a highly-aligned electrospun mat is produced with this modified electrospinning method, and this proposed conduit with favorable features, such as selective permeability, hydrophilicity and nerve growth directional steering, were fabricated as nerve guide conduits (NGCs). The inner surface of the nerve conduit is covered with highly aligned electrospun nanofibers and is able to enhance the proliferation of neural cells. The central part of the tube is double-coated with randomly-oriented nanofibers over the aligned nanofibers, strengthening the weak mechanical strength of the aligned nanofibers. PMID:27021221

  10. Analysis of vadose zone tritium transport from an underground storage tank release using numerical modeling and geostatistics

    SciTech Connect

    Lee, K.H.

    1997-09-01

    Numerical and geostatistical analyses show that the artificial smoothing effect of kriging removes high-permeability flow paths from hydrogeologic data sets, reducing simulated contaminant transport rates in heterogeneous vadose zone systems. Therefore, kriging alone is not recommended for estimating the spatial distribution of soil hydraulic properties for contaminant transport analysis at vadose zone sites. Vadose zone transport is modeled more effectively by combining kriging with stochastic simulation to better represent the high degree of spatial variability usually found in the hydraulic properties of field soils. However, kriging is a viable technique for estimating the initial mass distribution of contaminants in the subsurface.
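The smoothing effect of kriging described above can be demonstrated with a small simple-kriging example in 1-D. The data and the exponential covariance model below are synthetic and assumed for illustration; this shows the general phenomenon, not the study's vadose zone model:

```python
import numpy as np

rng = np.random.default_rng(2)

def cov(h, sill=1.0, a=10.0):
    """Assumed exponential covariance model (for illustration only)."""
    return sill * np.exp(-np.abs(h) / a)

# sample a zero-mean 1-D Gaussian random field at 30 scattered data locations
xd = np.sort(rng.uniform(0.0, 100.0, 30))
Cdd = cov(xd[:, None] - xd[None, :])
zd = np.linalg.cholesky(Cdd + 1e-10 * np.eye(30)) @ rng.normal(size=30)

# simple kriging (known zero mean) on a dense grid
xg = np.linspace(0.0, 100.0, 500)
Cgd = cov(xg[:, None] - xd[None, :])
lam = np.linalg.solve(Cdd + 1e-10 * np.eye(30), Cgd.T)  # kriging weights, (30, 500)
zk = lam.T @ zd

# kriging reproduces the data exactly but shrinks toward the mean between and
# beyond them, so the kriged curve typically has less spread than the data
zk_at_data = np.linalg.solve(Cdd + 1e-10 * np.eye(30), Cdd).T @ zd
var_est, var_data = zk.var(), zd.var()
```

The extreme values (the high-permeability paths of the abstract) are exactly what this shrinkage clips away, which is why a stochastic simulation that reproduces the full variability is preferred for transport predictions.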

  11. Geostatistical modeling of uncertainty of the spatial distribution of available phosphorus in soil in a sugarcane field

    NASA Astrophysics Data System (ADS)

    Tadeu Pereira, Gener; Ribeiro de Oliveira, Ismênia; De Bortoli Teixeira, Daniel; Arantes Camargo, Livia; Rodrigo Panosso, Alan; Marques, José, Jr.

    2015-04-01

    Phosphorus is one of the limiting nutrients for sugarcane development in Brazilian soils. The spatial variability of this nutrient is great, defined by the properties that control its adsorption and desorption reactions. Spatial estimates to characterize this variability are based on geostatistical interpolation. Thus, assessing the uncertainty of estimates associated with the spatial distribution of available P (Plabile) is decisive for optimizing the use of phosphate fertilizers. The purpose of this study was to evaluate the performance of sequential Gaussian simulation (sGs) and ordinary kriging (OK) in modeling the uncertainty of available P estimates. A sampling grid with 626 points was established in a 200-ha experimental sugarcane field in Tabapuã, São Paulo State, Brazil; the soil was sampled at the crossover points of a regular grid with intervals of 50 m. Before the geostatistical modeling, 63 points (approximately 10% of the sampled points) were randomly selected to compose a data set used for validation, while the remaining 563 points were used to predict the variable at unsampled locations. The sGs generated 200 realizations, from which different measures of estimation and uncertainty were obtained. The standard deviation, calculated point by point across all simulated maps, provided a deviation map used to assess local uncertainty. Visual analysis showed that the spatial patterns produced by the sGs E-type map and the OK map were similar; however, the characteristic smoothing effect of OK was evident, especially in regions with extreme values. The standardized variograms of selected sGs realizations showed a range and model similar to the variogram of the observed Plabile data. The OK variogram showed a structure distinct from the observed data, underestimating the variability over short distances, presenting parabolic behavior near
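The variogram comparisons described above start from the experimental semivariogram. A minimal sketch, using a synthetic spatially uncorrelated field as a stand-in for the actual P data (all lag settings illustrative), might look like:

```python
import numpy as np

# Minimal sketch of the experimental (semi)variogram underlying both sGs and
# OK; the field below is synthetic and all lag settings are illustrative.

def empirical_semivariogram(coords, values, lag_edges):
    """Average 0.5*(z_i - z_j)^2 over point pairs binned by separation."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)       # count each pair once
    d, sq = d[iu], sq[iu]
    gamma = []
    for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
        sel = (d >= lo) & (d < hi)
        gamma.append(sq[sel].mean() if sel.any() else np.nan)
    return np.array(gamma)

rng = np.random.default_rng(1)
coords = rng.uniform(0, 1000, size=(400, 2))     # sample coordinates, m
values = rng.normal(size=400)                    # pure-nugget stand-in field
lag_edges = np.arange(0.0, 550.0, 50.0)          # 50-m lag bins
gamma = empirical_semivariogram(coords, values, lag_edges)
# For spatially uncorrelated data the semivariogram sits flat near the field
# variance (the sill); structured data would instead rise toward the sill
# over the correlation range.
print(np.round(gamma, 2))
```

A model (e.g. spherical or exponential) fitted to these binned values is what OK and sGs then use for prediction and simulation.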

  12. ON THE GEOSTATISTICAL APPROACH TO THE INVERSE PROBLEM. (R825689C037)

    EPA Science Inventory

    Abstract

    The geostatistical approach to the inverse problem is discussed with emphasis on the importance of structural analysis. Although the geostatistical approach is occasionally misconstrued as mere cokriging, in fact it consists of two steps: estimation of statist...

  13. Geospatial Interpolation and Mapping of Tropospheric Ozone Pollution Using Geostatistics

    PubMed Central

    Kethireddy, Swatantra R.; Tchounwou, Paul B.; Ahmad, Hafiz A.; Yerramilli, Anjaneyulu; Young, John H.

    2014-01-01

    Tropospheric ozone (O3) pollution is a major problem worldwide, including in the United States of America (USA), particularly during the summer months. Ozone oxidative capacity and its impact on human health have attracted the attention of the scientific community. In the USA, sparse spatial observations for O3 may not provide a reliable source of data over a geo-environmental region. Geostatistical Analyst in ArcGIS has the capability to interpolate values in unmonitored geo-spaces of interest. In this study of eastern Texas O3 pollution, hourly episodes for spring and summer 2012 were selectively identified. To visualize the O3 distribution, geostatistical techniques were employed in ArcMap. Using ordinary Kriging, geostatistical layers of O3 for all the studied hours were predicted and mapped at a spatial resolution of 1 kilometer. A decent level of prediction accuracy was achieved and was confirmed from cross-validation results. The mean prediction error was close to 0, the root mean-standardized-prediction error was close to 1, and the root mean square and average standard errors were small. O3 pollution map data can be further used in analysis and modeling studies. Kriging results and O3 decadal trends indicate that the populace in Houston-Sugar Land-Baytown, Dallas-Fort Worth-Arlington, Beaumont-Port Arthur, San Antonio, and Longview are repeatedly exposed to high levels of O3-related pollution, and are prone to the corresponding respiratory and cardiovascular health effects. Optimization of the monitoring network proves to be an added advantage for the accurate prediction of exposure levels. PMID:24434594

  14. Assessment of spatial distribution of fallout radionuclides through geostatistics concept.

    PubMed

    Mabit, L; Bernard, C

    2007-01-01

    After introducing geostatistics concept and its utility in environmental science and especially in Fallout Radionuclide (FRN) spatialisation, a case study for cesium-137 ((137)Cs) redistribution at the field scale using geostatistics is presented. On a Canadian agricultural field, geostatistics coupled with a Geographic Information System (GIS) was used to test three different techniques of interpolation [Ordinary Kriging (OK), Inverse Distance Weighting power one (IDW1) and two (IDW2)] to create a (137)Cs map and to establish a radioisotope budget. Following the optimization of variographic parameters, an experimental semivariogram was developed to determine the spatial dependence of (137)Cs. It was adjusted to a spherical isotropic model with a range of 30 m and a very small nugget effect. This (137)Cs semivariogram showed a good autocorrelation (R(2)=0.91) and was well structured ('nugget-to-sill' ratio of 4%). It also revealed that the sampling strategy was adequate to reveal the spatial correlation of (137)Cs. The spatial redistribution of (137)Cs was estimated by Ordinary Kriging and IDW to produce contour maps. A radioisotope budget was established for the 2.16 ha agricultural field under investigation. It was estimated that around 2 x 10(7)Bq of (137)Cs were missing (around 30% of the total initial fallout) and were exported by physical processes (runoff and erosion processes) from the area under investigation. The cross-validation analysis showed that in the case of spatially structured data, OK is a better interpolation method than IDW1 or IDW2 for the assessment of potential radioactive contamination and/or pollution. PMID:17673340

  15. Geospatial interpolation and mapping of tropospheric ozone pollution using geostatistics.

    PubMed

    Kethireddy, Swatantra R; Tchounwou, Paul B; Ahmad, Hafiz A; Yerramilli, Anjaneyulu; Young, John H

    2014-01-01

    Tropospheric ozone (O3) pollution is a major problem worldwide, including in the United States of America (USA), particularly during the summer months. Ozone oxidative capacity and its impact on human health have attracted the attention of the scientific community. In the USA, sparse spatial observations for O3 may not provide a reliable source of data over a geo-environmental region. Geostatistical Analyst in ArcGIS has the capability to interpolate values in unmonitored geo-spaces of interest. In this study of eastern Texas O3 pollution, hourly episodes for spring and summer 2012 were selectively identified. To visualize the O3 distribution, geostatistical techniques were employed in ArcMap. Using ordinary Kriging, geostatistical layers of O3 for all the studied hours were predicted and mapped at a spatial resolution of 1 kilometer. A decent level of prediction accuracy was achieved and was confirmed from cross-validation results. The mean prediction error was close to 0, the root mean-standardized-prediction error was close to 1, and the root mean square and average standard errors were small. O3 pollution map data can be further used in analysis and modeling studies. Kriging results and O3 decadal trends indicate that the populace in Houston-Sugar Land-Baytown, Dallas-Fort Worth-Arlington, Beaumont-Port Arthur, San Antonio, and Longview are repeatedly exposed to high levels of O3-related pollution, and are prone to the corresponding respiratory and cardiovascular health effects. Optimization of the monitoring network proves to be an added advantage for the accurate prediction of exposure levels. PMID:24434594

  16. Examples of improved reservoir modeling through geostatistical data integration

    SciTech Connect

    Bashore, W.M.; Araktingi, U.G.

    1994-12-31

    Results from four case studies are presented to demonstrate improvements in reservoir modeling and subsequent flow predictions through various uses of geostatistical integration methods. Specifically, these cases highlight improvements gained from (1) better understanding of reservoir geometries through 3D visualization, (2) forward modeling to assess the value of new data prior to acquisition and integration, (3) assessment of reduced uncertainty in porosity prediction through integration of seismic acoustic impedance, and (4) integration of crosswell tomographic and reflection data. The intent of each of these examples is to quantify the add-value of geological and geophysical data integration in engineering terms such as fluid-flow results and reservoir property predictions.

  17. Mercury emissions from coal combustion in Silesia, analysis using geostatistics

    NASA Astrophysics Data System (ADS)

    Zasina, Damian; Zawadzki, Jaroslaw

    2015-04-01

    Data provided by the UNEP report on mercury [1] show that solid fuel combustion is a significant source of mercury emission to air. Silesia, located in southwestern Poland, is notably affected by mercury emission because it is one of the most industrialized Polish regions: a place of coal mining, metal production, stone mining, mineral quarrying and chemical industry. Moreover, Silesia is a region with high population density. People are exposed to severe risk from mercury emitted by both industrial and domestic sources (i.e., small household furnaces). Small sources contribute significantly to the total emission of mercury. Official and statistical analyses, including those prepared for international purposes [2], did not provide data on the spatial distribution of the mercury emitted to air; however, a number of analyses of the Polish public power and energy sector have been prepared so far [3; 4]. The distribution of locations exposed to mercury emission from small domestic sources is an interesting matter, merging information from various sources: statistical, economical and environmental. This paper presents a geostatistical approach to the distribution of mercury emission from coal combustion. The analysed data are organized at two independent levels: individual, bottom-up data derived from the national emission reporting system [5; 6], and top-down regional data calculated on the basis of official statistics [7]. The analysis to be presented will compare spatial distributions of mercury emission using data derived from the sources mentioned above. The investigation will cover three voivodeships of Poland: Lower Silesian, Opole and Silesian, using selected geostatistical methodologies including ordinary kriging [8]. References [1] UNEP. Global Mercury Assessment 2013: Sources, Emissions, Releases and Environmental Transport. UNEP Chemicals Branch, Geneva, Switzerland, 2013. [2] NCEM. Poland's Informative Inventory Report 2014. NCEM at the IEP-NRI, 2014. http

  18. Geostatistical three-dimensional modeling of oolite shoals, St. Louis Limestone, southwest Kansas

    USGS Publications Warehouse

    Qi, L.; Carr, T.R.; Goldstein, R.H.

    2007-01-01

    In the Hugoton embayment of southwestern Kansas, reservoirs composed of relatively thin (<4 m; <13.1 ft) oolitic deposits within the St. Louis Limestone have produced more than 300 million bbl of oil. The geometry and distribution of oolitic deposits control the heterogeneity of the reservoirs, resulting in exploration challenges and relatively low recovery. Geostatistical three-dimensional (3-D) models were constructed to quantify the geometry and spatial distribution of oolitic reservoirs, and the continuity of flow units within Big Bow and Sand Arroyo Creek fields. Lithofacies in uncored wells were predicted from digital logs using a neural network. The tilting effect from the Laramide orogeny was removed to construct restored structural surfaces at the time of deposition. Well data and structural maps were integrated to build 3-D models of oolitic reservoirs using stochastic simulations with geometry data. Three-dimensional models provide insights into the distribution, the external and internal geometry of oolitic deposits, and the sedimentologic processes that generated reservoir intervals. The structural highs and general structural trend had a significant impact on the distribution and orientation of the oolitic complexes. The depositional pattern and connectivity analysis suggest an overall aggradation of shallow-marine deposits during pulses of relative sea level rise followed by deepening near the top of the St. Louis Limestone. Cemented oolitic deposits were modeled as barriers and baffles and tend to concentrate at the edge of oolitic complexes. Spatial distribution of porous oolitic deposits controls the internal geometry of rock properties. Integrated geostatistical modeling methods can be applicable to other complex carbonate or siliciclastic reservoirs in shallow-marine settings. Copyright © 2007. The American Association of Petroleum Geologists. All rights reserved.

  19. Geostatistical inference of main Y-STR-haplotype groups in Europe.

    PubMed

    Diaz-Lacava, Amalia; Walier, Maja; Willuweit, Sascha; Wienker, Thomas F; Fimmers, Rolf; Baur, Max P; Roewer, Lutz

    2011-03-01

    We examined the multifarious genetic heterogeneity of Europe and neighboring regions from a geographical perspective. We created composite maps outlining the estimated geographical distribution of major groups of genetically similar individuals on the basis of forensic Y-chromosomal markers. We analyzed Y-chromosomal haplotypes composed of 7 highly polymorphic STR loci, genotyped for 33,010 samples, collected at 249 sites in Europe, Western Asia and North Africa, deposited in the YHRD database (www.yhrd.org). The data set comprised 4176 different haplotypes, which we grouped into 20 clusters. For each cluster, the frequency per site was calculated. All geostatistical analysis was performed with the geographic information system GRASS-GIS. We interpolated frequency values across the study area separately for each cluster. Juxtaposing all 20 interpolated surfaces, we point-wisely screened for the highest cluster frequencies and stored it in parallel with the respective cluster label. We combined these two types of data in a composite map. We repeated this procedure for the second highest frequencies in Europe. Major groups were assigned to Northern, Western and Eastern Europe. North Africa built a separate region, Southeastern Europe, Turkey and Near East were divided into several regions. The spatial distribution of the groups accounting for the second highest frequencies in Europe overlapped with the territories of the largest countries. The genetic structure presented in the composite maps fits major historical geopolitical regions and is in agreement with previous studies of genetic frequencies, validating our approach. Our genetic geostatistical approach provides, on the basis of two composite maps, detailed evidence of the geographical distribution and relative frequencies of the most predominant groups of the extant male European population, examined on the basis of forensic Y-STR haplotypes. 
The existence of considerable genetic differences among geographic

  20. Enhancing multiple-point geostatistical modeling: 1. Graph theory and pattern adjustment

    NASA Astrophysics Data System (ADS)

    Tahmasebi, Pejman; Sahimi, Muhammad

    2016-03-01

    In recent years, higher-order geostatistical methods have been used for modeling of a wide variety of large-scale porous media, such as groundwater aquifers and oil reservoirs. Their popularity stems from their ability to account for qualitative data and the great flexibility that they offer for conditioning the models to hard (quantitative) data, which endow them with the capability for generating realistic realizations of porous formations with very complex channels, as well as features that are mainly a barrier to fluid flow. One group of such models consists of pattern-based methods that use a set of data points for generating stochastic realizations by which the large-scale structure and highly-connected features are reproduced accurately. The cross correlation-based simulation (CCSIM) algorithm, proposed previously by the authors, is a member of this group that has been shown to be capable of simulating multimillion-cell models in a matter of a few CPU seconds. The method is, however, sensitive to the patterns' specifications, such as boundaries and the number of replicates. In this paper the original CCSIM algorithm is reconsidered and two significant improvements are proposed for accurately reproducing large-scale patterns of heterogeneities in porous media. First, an effective boundary-correction method based on graph theory is presented by which one identifies the optimal cutting path/surface for removing the patchiness and discontinuities in the realization of a porous medium. Next, a new pattern-adjustment method is proposed that automatically transfers the features in a pattern to one that seamlessly matches the surrounding patterns. The original CCSIM algorithm is then combined with the two methods and is tested using various complex two- and three-dimensional examples. It should, however, be emphasized that the methods that we propose in this paper are applicable to other pattern-based geostatistical simulation methods.

  1. Spatially explicit Schistosoma infection risk in eastern Africa using Bayesian geostatistical modelling.

    PubMed

    Schur, Nadine; Hürlimann, Eveline; Stensgaard, Anna-Sofie; Chimfwembe, Kingford; Mushinge, Gabriel; Simoonga, Christopher; Kabatereine, Narcis B; Kristensen, Thomas K; Utzinger, Jürg; Vounatsou, Penelope

    2013-11-01

    Schistosomiasis remains one of the most prevalent parasitic diseases in the tropics and subtropics, but current statistics are outdated due to demographic and ecological transformations and ongoing control efforts. Reliable risk estimates are important to plan and evaluate interventions in a spatially explicit and cost-effective manner. We analysed a large ensemble of georeferenced survey data derived from an open-access neglected tropical diseases database to create smooth empirical prevalence maps for Schistosoma mansoni and Schistosoma haematobium for a total of 13 countries of eastern Africa. Bayesian geostatistical models based on climatic and other environmental data were used to account for potential spatial clustering in spatially structured exposures. Geostatistical variable selection was employed to reduce the set of covariates. Alignment factors were implemented to combine surveys on different age-groups and to acquire separate estimates for individuals aged ≤20 years and entire communities. Prevalence estimates were combined with population statistics to obtain country-specific numbers of Schistosoma infections. We estimate that 122 million individuals in eastern Africa are currently infected with either S. mansoni, or S. haematobium, or both species concurrently. Country-specific population-adjusted prevalence estimates range between 12.9% (Uganda) and 34.5% (Mozambique) for S. mansoni and between 11.9% (Djibouti) and 40.9% (Mozambique) for S. haematobium. Our models revealed that infection risk in Burundi, Eritrea, Ethiopia, Kenya, Rwanda, Somalia and Sudan might be considerably higher than previously reported, while in Mozambique and Tanzania, the risk might be lower than current estimates suggest. Our empirical, large-scale, high-resolution infection risk estimates for S. mansoni and S. haematobium in eastern Africa can guide future control interventions and provide a benchmark for subsequent monitoring and evaluation activities. PMID:22019933

  2. Comparative soil CO2 flux measurements and geostatistical estimation methods on Masaya volcano, Nicaragua

    USGS Publications Warehouse

    Lewicki, J.L.; Bergfeld, D.; Cardellini, C.; Chiodini, G.; Granieri, D.; Varley, N.; Werner, C.

    2005-01-01

    We present a comparative study of soil CO2 flux (FCO2) measured by five groups (Groups 1-5) at the IAVCEI-CCVG Eighth Workshop on Volcanic Gases on Masaya volcano, Nicaragua. Groups 1-5 measured FCO2 using the accumulation chamber method at 5-m spacing within a 900 m2 grid during a morning (AM) period. These measurements were repeated by Groups 1-3 during an afternoon (PM) period. Measured FCO2 ranged from 218 to 14,719 g m-2 day-1. The variability of the five measurements made at each grid point ranged from ±5 to 167%. However, the arithmetic means of fluxes measured over the entire grid and associated total CO2 emission rate estimates varied between groups by only ±22%. All three groups that made PM measurements reported an 8-19% increase in total emissions over the AM results. Based on a comparison of measurements made during AM and PM times, we argue that this change is due in large part to natural temporal variability of gas flow, rather than to measurement error. In order to estimate the mean and associated CO2 emission rate of one data set and to map the spatial FCO2 distribution, we compared six geostatistical methods: arithmetic and minimum variance unbiased estimator means of uninterpolated data, and arithmetic means of data interpolated by the multiquadric radial basis function, ordinary kriging, multi-Gaussian kriging, and sequential Gaussian simulation methods. While the total CO2 emission rates estimated using the different techniques only varied by ±4.4%, the FCO2 maps showed important differences. We suggest that the sequential Gaussian simulation method yields the most realistic representation of the spatial distribution of FCO2, but a variety of geostatistical methods are appropriate to estimate the total CO2 emission rate from a study area, which is a primary goal in volcano monitoring research. © Springer-Verlag 2005.

  3. The effectiveness of physical activity monitoring and distance counseling in an occupational setting – Results from a randomized controlled trial (CoAct)

    PubMed Central

    2012-01-01

    Background Lack of physical activity (PA) is a known risk factor for many health conditions. The workplace is a setting often used to promote activity and health. We investigated the effectiveness of an intervention on PA and productivity-related outcomes in an occupational setting. Methods We conducted a randomized controlled trial of 12 months duration with two 1:1 allocated parallel groups of insurance company employees. Eligibility criteria included permanent employment and absence of any condition that risked the participant’s health during PA. Subjects in the intervention group monitored their daily PA with an accelerometer, set goals, had access to an online service to help them track their activity levels, and received counseling via telephone or web messages for 12 months. The control group received the results of a fitness test and an information leaflet on PA at the beginning of the study. The intervention’s aim was to increase PA, improve work productivity, and decrease sickness absence. Primary outcomes were PA (measured as MET minutes per week), work productivity (quantity and quality of work; QQ index), and sickness absence (SA) days at 12 months. Participants were assigned to groups using block randomization with a computer-generated scheme. The study was not blinded. Results There were 544 randomized participants, of which 521 were included in the analysis (64% female, mean age 43 years). At 12 months, there was no significant difference in physical activity levels between the intervention group (n = 264) and the control group (n = 257). The adjusted mean difference was −206 MET min/week [95% Bayesian credible interval −540 to 128; negative values favor control group]. There was also no significant difference in the QQ index (−0.5 [−4.4 to 3.3]) or SA days (0.0 [−1.2 to 0.9]). Of secondary outcomes, body weight (0.5 kg [0.0 to 1.0]) and percentage of body fat (0.6% [0.2% to 1.1%]) were slightly higher in the

  4. A multiple-point geostatistical approach to quantifying uncertainty for flow and transport simulation in geologically complex environments

    NASA Astrophysics Data System (ADS)

    Cronkite-Ratcliff, C.; Phelps, G. A.; Boucher, A.

    2011-12-01

    In many geologic settings, the pathways of groundwater flow are controlled by geologic heterogeneities which have complex geometries. Models of these geologic heterogeneities, and consequently, their effects on the simulated pathways of groundwater flow, are characterized by uncertainty. Multiple-point geostatistics, which uses a training image to represent complex geometric descriptions of geologic heterogeneity, provides a stochastic approach to the analysis of geologic uncertainty. Incorporating multiple-point geostatistics into numerical models provides a way to extend this analysis to the effects of geologic uncertainty on the results of flow simulations. We present two case studies to demonstrate the application of multiple-point geostatistics to numerical flow simulation in complex geologic settings with both static and dynamic conditioning data. Both cases involve the development of a training image from a complex geometric description of the geologic environment. Geologic heterogeneity is modeled stochastically by generating multiple equally-probable realizations, all consistent with the training image. Numerical flow simulation for each stochastic realization provides the basis for analyzing the effects of geologic uncertainty on simulated hydraulic response. The first case study is a hypothetical geologic scenario developed using data from the alluvial deposits in Yucca Flat, Nevada. The SNESIM algorithm is used to stochastically model geologic heterogeneity conditioned to the mapped surface geology as well as vertical drill-hole data. Numerical simulation of groundwater flow and contaminant transport through geologic models produces a distribution of hydraulic responses and contaminant concentration results. From this distribution of results, the probability of exceeding a given contaminant concentration threshold can be used as an indicator of uncertainty about the location of the contaminant plume boundary. The second case study considers a

  5. Testing geostatistical methods to combine radar and rain gauges for precipitation mapping in a mountainous region

    NASA Astrophysics Data System (ADS)

    Erdin, R.; Frei, C.; Sideris, I.; Kuensch, H.-R.

    2010-09-01

    There is an increasing demand for accurate mapping of precipitation at a spatial resolution of kilometers. Radar and rain gauges - the two main precipitation measurement systems - exhibit complementary strengths and weaknesses. Radar offers high spatial and temporal resolution but lacks accuracy of absolute values, whereas rain gauges provide accurate values at their specific point location but suffer from poor spatial representativeness. Methods of geostatistical mapping have been proposed to combine radar and rain gauge data for quantitative precipitation estimation (QPE). The aim is to combine the respective strengths and compensate for the respective weaknesses of the two observation platforms. Several studies have demonstrated the potential of these methods over topography of moderate complexity, but their performance remains unclear for high-mountain regions where rainfall patterns are complex, the representativeness of rain gauge measurements is limited and radar observations are obstructed. In this study we examine the potential and limitations of two frequently used geostatistical mapping methods for the territory of Switzerland, where the mountain chain of the Alps poses particular challenges to QPE. The two geostatistical methods explored are kriging with external drift (KED) using radar as drift variable and ordinary kriging of radar errors (OKRE). The radar data is a composite from three C-band radars using a constant Z-R relationship, advanced correction processings for visibility, ground clutter and beam shielding and a climatological bias adjustment. The rain gauge data originates from an automatic network with a typical inter-station distance of 25 km. Both combination methods are applied to a set of case examples representing typical rainfall situations in the Alps with their inherent challenges at daily and hourly time resolution. 
The quality of precipitation estimates is assessed by several skill scores calculated from cross validation errors at

  6. A geostatistical analysis of the geographic distribution of lymphatic filariasis prevalence in southern India.

    PubMed

    Srividya, A; Michael, E; Palaniyandi, M; Pani, S P; Das, P K

    2002-11-01

    Gaining a better understanding of the spatial population structure of infectious agents is increasingly recognized as being key to their more effective mapping and to improving knowledge of their overall population dynamics and control. Here, we investigate the spatial structure of bancroftian filariasis distribution using geostatistical methods in an endemic region in Southern India. Analysis of a parasite antigenemia prevalence dataset assembled by sampling 79 villages selected using a World Health Organization (WHO) proposed 25 x 25 km grid sampling procedure in a 225 x 225 km area within this region was compared with that of a corresponding microfilaraemia prevalence dataset assembled by sampling 119 randomly selected villages from a smaller subregion located within the main study area. A major finding from the analysis was that once large-scale spatial trends were removed, the antigenemia data did not show evidence for the existence of any small-scale dependency at the study sampling interval of 25 km. By contrast, analysis of the randomly sampled microfilaraemia data indicated strong spatial contagion in prevalence up to a distance of approximately 6.6 kms, suggesting the likely existence of small spatial patches or foci of transmission in the study area occurring below the sampling scale used for sampling the antigenemia data. While this could indicate differences in parasite spatial population dynamics based on antigenemia versus microfilaraemia data, the result may also suggest that the WHO recommended 25 x 25 km sampling grid for rapid filariasis mapping could have been too coarse a scale to capture and describe the likely local variation in filariasis infection in this endemic location and highlights the need for caution when applying uniform sampling schemes in diverse endemic regions for investigating the spatial pattern of this parasitic infection. The present results, on the other hand, imply that both small-scale spatial processes and large

  7. Probabilistic assessment of ground-water contamination. 1: Geostatistical framework

    SciTech Connect

    Rautman, C.A.; Istok, J.D.

    1996-09-01

    Characterizing the extent and severity of ground-water contamination at waste sites is expensive and time-consuming. A probabilistic approach, based on the acceptance of uncertainty and a finite probability of making classification errors (contaminated relative to a regulatory threshold vs. uncontaminated), is presented as an alternative to traditional site characterization methodology. The approach utilizes geostatistical techniques to identify and model the spatial continuity of contamination at a site (variography) and to develop alternate plausible simulations of contamination fields (conditional simulation). Probabilistic summaries of many simulations provide tools for (a) estimating the range of plausible contaminant concentrations at unsampled locations, (b) identifying the locations of boundaries between contaminated and uncontaminated portions of the site and the degree of certainty in those locations, and (c) estimating the range of plausible values for total contaminant mass. The first paper in the series presents the geostatistical framework and illustrates the approach using synthetic data for a hypothetical site. The second paper presents an application of the proposed methodology to the probabilistic assessment of ground-water contamination at a site involving ground-water contamination by nitrate and herbicide in a shallow, unconfined alluvial aquifer in an agricultural area in eastern Oregon.
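The probabilistic summaries (a)-(c) described above can be sketched as follows, with random arrays standing in for the output of an actual conditional simulator; the threshold, classification tolerances, and unit conversions are hypothetical:

```python
import numpy as np

# Sketch: summarizing an ensemble of conditional simulations into a
# probability-of-contamination map and a plausible range for total mass.
# The "realizations" here are random stand-ins, not simulator output.

rng = np.random.default_rng(4)
n_real, ny, nx = 200, 40, 40
realizations = rng.lognormal(mean=1.0, sigma=0.7, size=(n_real, ny, nx))

threshold = 5.0                               # regulatory concentration limit
p_contaminated = (realizations > threshold).mean(axis=0)   # per-cell probability

# Cells are classified with a stated error tolerance instead of a hard line:
clearly_clean = p_contaminated < 0.05
clearly_contaminated = p_contaminated > 0.95
uncertain = ~(clearly_clean | clearly_contaminated)        # needs more sampling

# Plausible range for total contaminant mass: one value per realization,
# then quantiles across the ensemble (hypothetical unit conversion factors).
cell_area, depth, porosity = 1.0, 1.0, 0.3
mass = realizations.sum(axis=(1, 2)) * cell_area * depth * porosity
lo, hi = np.percentile(mass, [5, 95])
```

The boundary between the "clearly clean" and "clearly contaminated" regions, together with the width of the uncertain band, is exactly the classification-error trade-off the probabilistic approach makes explicit.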

  8. Validating spatial structure in canopy water content using geostatistics

    NASA Technical Reports Server (NTRS)

    Sanderson, E. W.; Zhang, M. H.; Ustin, S. L.; Rejmankova, E.; Haxo, R. S.

    1995-01-01

    Heterogeneity in ecological phenomena is scale dependent and affects the hierarchical structure of image data. AVIRIS pixels average reflectance produced by complex absorption and scattering interactions between biogeochemical composition, canopy architecture, view and illumination angles, species distributions, and plant cover, as well as other factors. These scales affect validation of pixel reflectance, typically performed by relating pixel spectra to ground measurements acquired at scales of 1 m² or less (e.g., field spectra, foliage and soil samples, etc.). As image analyses become more sophisticated, such as those for detection of canopy chemistry, better validation becomes a critical problem. This paper presents a methodology for bridging between point measurements and pixels using geostatistics. Geostatistics has been used extensively in geological and hydrogeological studies but has received little application in ecological studies. The key criterion for kriging estimation is that the phenomenon varies in space and that an underlying controlling process produces spatial correlation between the measured data points. Ecological variation meets this requirement because communities vary along environmental gradients like soil moisture, nutrient availability, or topography.
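
The spatial correlation underpinning kriging is usually diagnosed with an empirical semivariogram: half the mean squared difference between values, binned by separation distance. A minimal numpy sketch on synthetic data (the field, point count, and lag bins are illustrative):

```python
import numpy as np

# Empirical semivariogram: half the mean squared difference of values,
# averaged within distance (lag) bins.
def semivariogram(coords, values, lags):
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)        # each pair counted once
    d, sq = d[iu], sq[iu]
    return np.array([sq[(d >= a) & (d < b)].mean()
                     for a, b in zip(lags[:-1], lags[1:])])

# Synthetic spatially correlated field: value varies smoothly with position.
rng = np.random.default_rng(2)
coords = rng.uniform(0, 100, size=(200, 2))
values = np.sin(coords[:, 0] / 20.0) + rng.normal(0, 0.1, 200)

gamma = semivariogram(coords, values, lags=np.array([0, 10, 20, 40, 80]))
print(gamma[0] < gamma[-1])   # semivariance grows with distance: spatial correlation
```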

  9. A geostatistical modeling study of the effect of heterogeneity on radionuclide transport in the unsaturated zone, Yucca Mountain.

    PubMed

    Viswanathan, Hari S; Robinson, Bruce A; Gable, Carl W; Carey, James W

    2003-01-01

    Retardation of certain radionuclides due to sorption to zeolitic minerals is considered one of the major barriers to contaminant transport in the unsaturated zone of Yucca Mountain. However, zeolitically altered areas are lower in permeability than unaltered regions, which raises the possibility that contaminants might bypass the sorptive zeolites. The relationship between hydrologic and chemical properties must be understood to predict the transport of radionuclides through zeolitically altered areas. In this study, we incorporate mineralogical information into an unsaturated zone transport model using geostatistical techniques to correlate zeolitic abundance to hydrologic and chemical properties. Geostatistical methods are used to develop variograms, kriging maps, and conditional simulations of zeolitic abundance. We then investigate, using flow and transport modeling on a heterogeneous field, the relationship between percent zeolitic alteration, permeability changes due to alteration, sorption due to alteration, and their overall effect on radionuclide transport. We compare these geostatistical simulations to a simplified threshold method in which each spatial location in the model is assigned either zeolitic or vitric properties based on the zeolitic abundance at that location. A key conclusion is that retardation due to sorption predicted by using the continuous distribution is larger than the retardation predicted by the threshold method. The reason for larger retardation when using the continuous distribution is a small but significant sorption at locations with low zeolitic abundance. If, for practical reasons, models with homogeneous properties within each layer are used, we recommend setting nonzero K(d)s in the vitric tuffs to mimic the more rigorous continuous distribution simulations. Regions with high zeolitic abundance may not be as effective in retarding radionuclides such as neptunium, since these rocks are lower in permeability and contaminants can

  10. Large to intermediate-scale aquifer heterogeneity in fine-grain dominated alluvial fans (Cenozoic As Pontes Basin, northwestern Spain): insight based on three-dimensional geostatistical reconstruction

    NASA Astrophysics Data System (ADS)

    Falivene, O.; Cabrera, L.; Sáez, A.

    2007-08-01

    Facies reconstructions are used in hydrogeology to improve the interpretation of aquifer permeability distribution. In the absence of sufficient data to define the heterogeneity due to geological processes, uncertainties in the distribution of aquifer hydrofacies and characteristics may appear. Geometric and geostatistical methods are used to understand and model aquifer hydrofacies distribution, providing models to improve comprehension and development of aquifers. However, these models require some input statistical parameters that can be difficult to infer from the study site. A three-dimensional reconstruction of a kilometer scale fine-grain dominated Cenozoic alluvial fan derived from more than 200 continuously cored, closely spaced, and regularly distributed wells is presented. The facies distributions were reconstructed using a genetic stratigraphic subdivision and a deterministic geostatistical algorithm. The reconstruction is only slightly affected by variations in the geostatistical input parameters because of the high-density data set. Analysis of the reconstruction allowed identification in the proximal to medial alluvial fan zones of several laterally extensive sand bodies with relatively higher permeability; these sand bodies were quantified in terms of volume, mean thickness, maximum area, and maximum equivalent diameter. These quantifications provide trends and geological scenarios for input statistical parameters to model aquifer systems in similar alluvial fan depositional settings.

  11. Feasibility intervention trial of two types of improved cookstoves in three resource-limited settings: study protocol for a randomized controlled trial

    PubMed Central

    2013-01-01

    Background Exposure to biomass fuel smoke is one of the leading risk factors for disease burden worldwide. International campaigns are currently promoting the widespread adoption of improved cookstoves in resource-limited settings, yet little is known about the cultural and social barriers to successful improved cookstove adoption and how these barriers affect environmental exposures and health outcomes. Design We plan to conduct a one-year crossover, feasibility intervention trial in three resource-limited settings (Kenya, Nepal and Peru). We will enroll 40 to 46 female primary cooks aged 20 to 49 years in each site (total 120 to 138). Methods At baseline, we will collect information on sociodemographic characteristics and cooking practices, and measure respiratory health and blood pressure for all participating women. An initial observational period of four months while households use their traditional, open-fire design cookstoves will take place prior to randomization. All participants will then be randomized to receive one of two types of improved, ventilated cookstoves with a chimney: a commercially-constructed cookstove (Envirofit G3300/G3355) or a locally-constructed cookstove. After four months of observation, participants will crossover and receive the other improved cookstove design and be followed for another four months. During each of the three four-month study periods, we will collect monthly information on self-reported respiratory symptoms, cooking practices, compliance with cookstove use (intervention periods only), and measure peak expiratory flow, forced expiratory volume in 1 second, exhaled carbon monoxide and blood pressure. We will also perform pulmonary function testing in the participating women and measure 24-hour kitchen particulate matter and carbon monoxide levels at least once per period. Discussion Findings from this study will help us better understand the behavioral, biological, and environmental changes that occur with a cookstove

  12. No Effect of Insecticide Treated Curtain Deployment on Aedes Infestation in a Cluster Randomized Trial in a Setting of Low Dengue Transmission in Guantanamo, Cuba

    PubMed Central

    Lambert, Isora; Montada, Domingo; Baly, Alberto; Van der Stuyft, Patrick

    2015-01-01

    Objective & Methodology The current study evaluated the effectiveness and cost-effectiveness of Insecticide Treated Curtain (ITC) deployment for reducing dengue vector infestation levels in the Cuban context with intensive routine control activities. A cluster randomized controlled trial took place in Guantanamo city, east Cuba. Twelve neighborhoods (about 500 households each) were selected among the ones with the highest Aedes infestation levels in the previous two years, and were randomly allocated to the intervention and control arms. Long lasting ITC (PermaNet) were distributed in the intervention clusters in March 2009. Routine control activities were continued in the whole study area. In both study arms, we monitored monthly pre- and post-intervention House Index (HI, number of houses with at least 1 container with Aedes immature stages/100 houses inspected), during 12 and 18 months respectively. We evaluated the effect of ITC deployment on HI by fitting a generalized linear regression model with a negative binomial link function to these data. Principal Findings At distribution, the ITC coverage (% of households using ≥1 ITC) reached 98.4%, with a median of 3 ITC distributed/household. After 18 months, the coverage remained 97.4%. The local Aedes species was susceptible to deltamethrin (mosquito mortality rate of 99.7%) and the residual deltamethrin activity in the ITC was within acceptable levels (mosquito mortality rate of 73.1%) after one year of curtain use. Over the 18 month observation period after ITC distribution, the adjusted HI rate ratio, intervention versus control clusters, was 1.15 (95% CI 0.57 to 2.34). The annualized cost per household of ITC implementation was 3.8 USD, against 16.8 USD for all routine ACP activities. Conclusion Deployment of ITC in a setting with already intensive routine Aedes control actions does not lead to reductions in Aedes infestation levels. PMID:25794192

  13. Mobile phone technologies improve adherence to antiretroviral treatment in a resource-limited setting: a randomized controlled trial of text message reminders

    PubMed Central

    Pop-Eleches, Cristian; Thirumurthy, Harsha; Habyarimana, James P.; Zivin, Joshua G.; Goldstein, Markus P.; de Walque, Damien; MacKeen, Leslie; Haberer, Jessica; Kimaiyo, Sylvester; Sidle, John; Ngare, Duncan; Bangsberg, David R.

    2013-01-01

    Objective There is limited evidence on whether growing mobile phone availability in sub-Saharan Africa can be used to promote high adherence to antiretroviral therapy (ART). This study tested the efficacy of short message service (SMS) reminders on adherence to ART among patients attending a rural clinic in Kenya. Design A randomized controlled trial of four SMS reminder interventions with 48 weeks of follow-up. Methods Four hundred and thirty-one adult patients who had initiated ART within 3 months were enrolled and randomly assigned to a control group or one of the four intervention groups. Participants in the intervention groups received SMS reminders that were either short or long and sent at a daily or weekly frequency. Adherence was measured using the medication event monitoring system. The primary outcome was whether adherence exceeded 90% during each 12-week period of analysis and the 48-week study period. The secondary outcome was whether there were treatment interruptions lasting at least 48 h. Results In intention-to-treat analysis, 53% of participants receiving weekly SMS reminders achieved adherence of at least 90% during the 48 weeks of the study, compared with 40% of participants in the control group (P=0.03). Participants in groups receiving weekly reminders were also significantly less likely to experience treatment interruptions exceeding 48 h during the 48-week follow-up period than participants in the control group (81 vs. 90%, P = 0.03). Conclusion These results suggest that SMS reminders may be an important tool to achieve optimal treatment response in resource-limited settings. PMID:21252632

  14. A group randomized trial of a complexity-based organizational intervention to improve risk factors for diabetes complications in primary care settings: study protocol

    PubMed Central

    Parchman, Michael L; Pugh, Jacqueline A; Culler, Steven D; Noel, Polly H; Arar, Nedal H; Romero, Raquel L; Palmer, Raymond F

    2008-01-01

    Background Most patients with type 2 diabetes have suboptimal control of their glucose, blood pressure (BP), and lipids – three risk factors for diabetes complications. Although the chronic care model (CCM) provides a roadmap for improving these outcomes, developing theoretically sound implementation strategies that will work across diverse primary care settings has been challenging. One explanation for this difficulty may be that most strategies do not account for the complex adaptive system (CAS) characteristics of the primary care setting. A CAS is comprised of individuals who can learn, interconnect, self-organize, and interact with their environment in a way that demonstrates non-linear dynamic behavior. One implementation strategy that may be used to leverage these properties is practice facilitation (PF). PF creates time for learning and reflection by members of the team in each clinic, improves their communication, and promotes an individualized approach to implement a strategy to improve patient outcomes. Specific objectives The specific objectives of this protocol are to: evaluate the effectiveness and sustainability of PF to improve risk factor control in patients with type 2 diabetes across a variety of primary care settings; assess the implementation of the CCM in response to the intervention; examine the relationship between communication within the practice team and the implementation of the CCM; and determine the cost of the intervention both from the perspective of the organization conducting the PF intervention and from the perspective of the primary care practice. Intervention The study will be a group randomized trial conducted in 40 primary care clinics. Data will be collected on all clinics, with 60 patients in each clinic, using a multi-method assessment process at baseline, 12, and 24 months. The intervention, PF, will consist of a series of practice improvement team meetings led by trained facilitators over 12 months. Primary hypotheses

  15. A combined community- and facility-based approach to improve pregnancy outcomes in low-resource settings: a Global Network cluster randomized trial

    PubMed Central

    2013-01-01

    Background Fetal and neonatal mortality rates in low-income countries are at least 10-fold greater than in high-income countries. These differences have been related to poor access to and poor quality of obstetric and neonatal care. Methods This trial tested the hypothesis that teams of health care providers, administrators and local residents can address the problem of limited access to quality obstetric and neonatal care and lead to a reduction in perinatal mortality in intervention compared to control locations. In seven geographic areas in five low-income and one middle-income country, most with high perinatal mortality rates and substantial numbers of home deliveries, we performed a cluster randomized non-masked trial of a package of interventions that included community mobilization focusing on birth planning and hospital transport, community birth attendant training in problem recognition, and facility staff training in the management of obstetric and neonatal emergencies. The primary outcome was perinatal mortality at ≥28 weeks gestation or birth weight ≥1000 g. Results Despite extensive effort in all sites in each of the three intervention areas, no differences emerged in the primary or any secondary outcome between the intervention and control clusters. In both groups, the mean perinatal mortality was 40.1/1,000 births (P = 0.9996). Nor were there differences between the two groups in outcomes during the last six months of the project, in the year following intervention cessation, or in the clusters that best implemented the intervention. Conclusions This cluster randomized comprehensive, large-scale, multi-sector intervention did not result in detectable impact on the proposed outcomes. While this does not negate the importance of these interventions, we expect that achieving improvement in pregnancy outcomes in these settings will require substantially more obstetric and neonatal care infrastructure than was available at the sites during this trial.

  16. Indoor radon variations in central Iran and its geostatistical map

    NASA Astrophysics Data System (ADS)

    Hadad, Kamal; Mokhtari, Javad

    2015-02-01

    We present the results of a 2-year indoor radon survey in 10 cities of Yazd province in central Iran (covering an area of 80,000 km²). We used passive diffusive samplers with LATEX polycarbonate films as Solid State Nuclear Track Detectors (SSNTD). The study was carried out in central Iran, where there are major mineral deposits and uranium mines. Our results indicate that, despite a few extraordinarily high concentrations, average annual concentrations of indoor radon are within ICRP guidelines. When the geostatistical spatial distribution of radon was mapped onto the geographical features of the province, it was observed that the risk of high radon concentration increases near the cities of Saqand, Bafq, Harat and Abarkooh; this depended on elevation and proximity to the ores and mines.

  17. Bayesian geostatistics in health cartography: the perspective of malaria.

    PubMed

    Patil, Anand P; Gething, Peter W; Piel, Frédéric B; Hay, Simon I

    2011-06-01

    Maps of parasite prevalences and other aspects of infectious diseases that vary in space are widely used in parasitology. However, spatial parasitological datasets rarely, if ever, have sufficient coverage to allow exact determination of such maps. Bayesian geostatistics (BG) is a method for finding a large sample of maps that can explain a dataset, in which maps that do a better job of explaining the data are more likely to be represented. This sample represents the knowledge that the analyst has gained from the data about the unknown true map. BG provides a conceptually simple way to convert these samples to predictions of features of the unknown map, for example regional averages. These predictions account for each map in the sample, yielding an appropriate level of predictive precision. PMID:21420361

  18. Geostatistical analysis of allele presence patterns among American black bears in eastern North Carolina

    USGS Publications Warehouse

    Thompson, L.M.; Van Manen, F.T.; King, T.L.

    2005-01-01

    Highways are one of the leading causes of wildlife habitat fragmentation and may particularly affect wide-ranging species, such as American black bears (Ursus americanus). We initiated a research project in 2000 to determine potential effects of a 4-lane highway on black bear ecology in Washington County, North Carolina. The research design included a treatment area (highway construction) and a control area, each with a pre- and post-construction phase. We used data from the pre-construction phase to determine whether we could detect scale dependency or directionality among allele occurrence patterns using geostatistics. Detection of such patterns could provide a powerful tool to measure the effects of landscape fragmentation on gene flow. We sampled DNA from black bear hair roots at 70 hair-sampling sites on each study area for 7 weeks during fall of 2000. We used microsatellite analysis based on 10 loci to determine unique multi-locus genotypes. We examined all alleles sampled at ≥25 sites on each study area and mapped their presence or absence at each hair-sample site. We calculated semivariograms, which measure the strength of statistical correlation as a function of distance, and adjusted them for anisotropy to determine the maximum direction of spatial continuity. We then calculated the mean direction of spatial continuity for all examined alleles. The mean direction of allele frequency variation was 118.3° (SE = 8.5) on the treatment area and 172.3° (SE = 6.0) on the control area. Rayleigh's tests showed that these directions differed from random distributions (P = 0.028 and P < 0.001, respectively), indicating consistent directional patterns for the alleles we examined in each area. Despite the small spatial scale of our study (approximately 11,000 ha for each study area), we observed distinct and consistent patterns of allele occurrence, suggesting different directions of gene flow between the study areas. These directions seemed to coincide with the
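
The mean-direction and Rayleigh-test summary used here can be sketched for a set of directions in degrees. This simplified version treats the data as circular (for axial data such as directions of spatial continuity one would typically double the angles first); the angle values below are illustrative:

```python
import numpy as np

# Mean direction, mean resultant length, and a large-sample Rayleigh p-value
# for a set of directions given in degrees.
def mean_direction(deg):
    rad = np.deg2rad(deg)
    C, S = np.cos(rad).sum(), np.sin(rad).sum()
    n = len(deg)
    R = np.hypot(C, S) / n                    # mean resultant length, in [0, 1]
    mean = np.rad2deg(np.arctan2(S, C)) % 360
    p = np.exp(-n * R**2)                     # Rayleigh test approximation
    return mean, R, p

# Tightly clustered directions -> strong mean direction, tiny p-value.
deg = np.array([115, 120, 118, 122, 117, 119, 116, 121])
mean, R, p = mean_direction(deg)
print(117 < mean < 120)   # circular mean near 118.5 degrees
print(p < 0.001)          # directions differ from a random distribution
```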

  19. Mapping aboveground woody biomass using forest inventory, remote sensing and geostatistical techniques.

    PubMed

    Yadav, Bechu K V; Nandy, S

    2015-05-01

    Mapping forest biomass is fundamental for estimating CO₂ emissions and for planning and monitoring forests and ecosystem productivity. The present study attempted to map aboveground woody biomass (AGWB) by integrating forest inventory, remote sensing and geostatistical techniques, viz., direct radiometric relationships (DRR), k-nearest neighbours (k-NN) and cokriging (CoK), and to evaluate their accuracy. A part of the Timli Forest Range of Kalsi Soil and Water Conservation Division, Uttarakhand, India was selected for the present study. Stratified random sampling was used to collect biophysical data from 36 sample plots of 0.1 ha (31.62 m × 31.62 m) size. Species-specific volumetric equations were used for calculating volume, which was multiplied by specific gravity to obtain biomass. Three forest-type density classes, viz. 10-40, 40-70 and >70% of Shorea robusta forest, and four non-forest classes were delineated using on-screen visual interpretation of IRS P6 LISS-III data of December 2012. The volume in different strata of forest-type density ranged from 189.84 to 484.36 m³ ha⁻¹. The total growing stock of the forest was found to be 2,024,652.88 m³. The AGWB ranged from 143 to 421 Mg ha⁻¹. Spectral bands and vegetation indices were used as independent variables and biomass as the dependent variable for DRR, k-NN and CoK. After validation and comparison, the k-NN method with Mahalanobis distance (root mean square error (RMSE) = 42.25 Mg ha⁻¹) was found to be the best method, followed by fuzzy distance and Euclidean distance with RMSE of 44.23 and 45.13 Mg ha⁻¹ respectively. DRR was found to be the least accurate method, with an RMSE of 67.17 Mg ha⁻¹. The study highlighted the potential of integrating forest inventory, remote sensing and geostatistical techniques for forest biomass mapping. PMID:25930205
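
A sketch of k-NN prediction with Mahalanobis distance, the best-performing configuration reported in this study. All data below are synthetic, and the helper function is ours, not the study's code:

```python
import numpy as np

# k-NN regression: predict biomass at a query point as the mean of the k
# training samples nearest in Mahalanobis distance (accounts for correlated,
# differently scaled spectral features).
def knn_mahalanobis(X_train, y_train, X_query, k=3):
    VI = np.linalg.inv(np.cov(X_train, rowvar=False))   # inverse covariance
    preds = []
    for q in X_query:
        diff = X_train - q
        d2 = np.einsum('ij,jk,ik->i', diff, VI, diff)   # squared Mahalanobis
        nn = np.argsort(d2)[:k]
        preds.append(y_train[nn].mean())
    return np.array(preds)

rng = np.random.default_rng(3)
X = rng.normal(size=(60, 4))                    # spectral bands / indices
y = 200 + 50 * X[:, 0] + rng.normal(0, 5, 60)   # biomass, Mg/ha (synthetic)
pred = knn_mahalanobis(X, y, X[:5], k=3)
rmse = np.sqrt(np.mean((pred - y[:5]) ** 2))
print(rmse < 80)   # querying at training points keeps the error modest
```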

  20. Medical Geography: a Promising Field of Application for Geostatistics

    PubMed Central

    Goovaerts, P.

    2008-01-01

    The analysis of health data and putative covariates, such as environmental, socio-economic, behavioral or demographic factors, is a promising application for geostatistics. It presents, however, several methodological challenges that arise from the fact that data are typically aggregated over irregular spatial supports and consist of a numerator and a denominator (i.e. population size). This paper presents an overview of recent developments in the field of health geostatistics, with an emphasis on three main steps in the analysis of areal health data: estimation of the underlying disease risk, detection of areas with significantly higher risk, and analysis of relationships with putative risk factors. The analysis is illustrated using age-adjusted cervix cancer mortality rates recorded over the 1970–1994 period for 118 counties of four states in the Western USA. Poisson kriging allows the filtering of noisy mortality rates computed from small population sizes, enhancing the correlation with two putative explanatory variables: percentage of inhabitants living below the federally defined poverty line, and percentage of Hispanic females. Area-to-point kriging formulation creates continuous maps of mortality risk, reducing the visual bias associated with the interpretation of choropleth maps. Stochastic simulation is used to generate realizations of cancer mortality maps, which allows one to quantify numerically how the uncertainty about the spatial distribution of health outcomes translates into uncertainty about the location of clusters of high values or the correlation with covariates. Lastly, geographically-weighted regression highlights the non-stationarity in the explanatory power of covariates: the higher mortality values along the coast are better explained by the two covariates than the lower risk recorded in Utah. PMID:19412347
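
The rate-filtering idea can be illustrated in its simplest, non-spatial form: shrink each area's raw rate toward the global mean, more strongly for smaller populations. Poisson kriging extends this with weights derived from the spatial correlation of risk; the empirical-Bayes-style weights and all counts below are a simplified, invented illustration:

```python
import numpy as np

# Raw rates from small populations are noisy; shrinkage stabilizes them.
deaths = np.array([1, 3, 40, 200])
pop = np.array([500, 800, 60000, 250000])          # person-years, synthetic
rate = deaths / pop
global_rate = deaths.sum() / pop.sum()

# Weight grows with population: large areas trust their own rate more.
between_var = max(np.var(rate) - (global_rate / pop).mean(), 1e-12)
w = between_var / (between_var + global_rate / pop)
smoothed = w * rate + (1 - w) * global_rate

# Every smoothed rate lies between the raw rate and the global mean.
shrunk = np.abs(smoothed - global_rate) <= np.abs(rate - global_rate) + 1e-15
print(shrunk.all())
```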

  1. Principal Component Geostatistical Approach for large-dimensional inverse problems

    PubMed Central

    Kitanidis, P K; Lee, J

    2014-01-01

    The quasi-linear geostatistical approach is for weakly nonlinear underdetermined inverse problems, such as Hydraulic Tomography and Electrical Resistivity Tomography. It provides best estimates as well as measures for uncertainty quantification. However, for its textbook implementation, the approach involves iterations, to reach an optimum, and requires the determination of the Jacobian matrix, i.e., the derivative of the observation function with respect to the unknown. Although there are elegant methods for the determination of the Jacobian, the cost is high when the number of unknowns, m, and the number of observations, n, is high. It is also wasteful to compute the Jacobian for points away from the optimum. Irrespective of the issue of computing derivatives, the computational cost of implementing the method is generally of the order of m²n, though there are methods to reduce the computational cost. In this work, we present an implementation that utilizes a Gauss-Newton method that is matrix free in terms of the Jacobian and improves the scalability of the geostatistical inverse problem. For each iteration, it is required to perform K runs of the forward problem, where K is not just much smaller than m but can be smaller than n. The computational and storage cost of the inverse procedure scales roughly linearly with m instead of m² as in the textbook approach. For problems of very large m, this implementation constitutes a dramatic reduction in computational cost compared to the textbook approach. Results illustrate the validity of the approach and provide insight into the conditions under which this method performs best. PMID:25558113
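
The core of the matrix-free idea can be sketched with finite differences: a Jacobian-vector product J v costs one extra run of the forward model, so J never has to be formed or stored. The toy forward model below is ours, not the paper's, and serves only to check the approximation:

```python
import numpy as np

# J v ~= (h(s + eps*v) - h(s)) / eps : one extra forward run per product.
def forward(s):
    # Stand-in nonlinear forward model mapping parameters to observations.
    return np.array([s[0] ** 2 + s[1], np.sin(s[1]) + s[2], s[0] * s[2]])

def jac_vec(h, s, v, eps=1e-6):
    return (h(s + eps * v) - h(s)) / eps

s = np.array([1.0, 0.5, 2.0])
v = np.array([0.3, -0.2, 0.1])

# Analytic Jacobian at s, used here only to verify the approximation.
J = np.array([[2 * s[0], 1.0, 0.0],
              [0.0, np.cos(s[1]), 1.0],
              [s[2], 0.0, s[0]]])
print(np.allclose(jac_vec(forward, s, v), J @ v, atol=1e-4))  # True
```

In the geostatistical setting, a Krylov-type solver needs only such products, which is what lets the cost per iteration drop to K forward runs.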

  2. Principal Component Geostatistical Approach for large-dimensional inverse problems

    NASA Astrophysics Data System (ADS)

    Kitanidis, P. K.; Lee, J.

    2014-07-01

    The quasi-linear geostatistical approach is for weakly nonlinear underdetermined inverse problems, such as Hydraulic Tomography and Electrical Resistivity Tomography. It provides best estimates as well as measures for uncertainty quantification. However, for its textbook implementation, the approach involves iterations, to reach an optimum, and requires the determination of the Jacobian matrix, i.e., the derivative of the observation function with respect to the unknown. Although there are elegant methods for the determination of the Jacobian, the cost is high when the number of unknowns, m, and the number of observations, n, is high. It is also wasteful to compute the Jacobian for points away from the optimum. Irrespective of the issue of computing derivatives, the computational cost of implementing the method is generally of the order of m²n, though there are methods to reduce the computational cost. In this work, we present an implementation that utilizes a Gauss-Newton method that is matrix free in terms of the Jacobian and improves the scalability of the geostatistical inverse problem. For each iteration, it is required to perform K runs of the forward problem, where K is not just much smaller than m but can be smaller than n. The computational and storage cost of the inverse procedure scales roughly linearly with m instead of m² as in the textbook approach. For problems of very large m, this implementation constitutes a dramatic reduction in computational cost compared to the textbook approach. Results illustrate the validity of the approach and provide insight into the conditions under which this method performs best.

  3. DEM-based delineation for improving geostatistical interpolation of rainfall in mountainous region of Central Himalayas, India

    NASA Astrophysics Data System (ADS)

    Kumari, Madhuri; Singh, Chander Kumar; Bakimchandra, Oinam; Basistha, Ashoke

    2016-07-01

    In mountainous regions with heterogeneous topography, geostatistical modeling of rainfall using a global data set may not conform to the intrinsic hypothesis of stationarity. This study focused on improving the precision of interpolated rainfall maps by spatial stratification in complex terrain. Predictions of the normal annual rainfall data were carried out by ordinary kriging, universal kriging, and co-kriging, using 80 point observations in the Indian Himalayas extending over an area of 53,484 km². A two-step spatial clustering approach is proposed. In the first step, the study area was delineated into two regions, namely lowland and upland, based on elevation derived from the digital elevation model; the delineation used the natural break classification method. In the next step, the rainfall data were clustered into two groups based on their spatial location in the lowland or upland. The terrain ruggedness index (TRI) was incorporated as a co-variable in the co-kriging interpolation algorithm. The precision of the kriged and co-kriged maps was assessed by two accuracy measures, root mean square error and Chatfield's percent better. It was observed that the stratification of rainfall data resulted in a 5-20 % increase in the performance efficiency of the interpolation methods. Co-kriging outperformed the kriging models at annual and seasonal scales. The results illustrate that stratification of the study area improves the stationarity characteristic of the point data, thus enhancing the precision of the interpolated rainfall maps derived using geostatistical methods.
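
Ordinary kriging, the baseline interpolator compared here, reduces to a small linear solve per prediction location. A numpy sketch with an assumed spherical variogram model (station coordinates, sill, and range are illustrative):

```python
import numpy as np

# Spherical variogram model (no nugget), flat beyond the range.
def spherical(h, sill=1.0, rng_=50.0):
    h = np.minimum(h, rng_)
    return sill * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3)

# Ordinary kriging weights: solve [Gamma 1; 1^T 0] [w; mu] = [gamma0; 1].
def ok_weights(coords, x0, model=spherical):
    n = len(coords)
    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = model(d)
    A[:n, n] = A[n, :n] = 1.0                  # unbiasedness constraint row/col
    b = np.append(model(np.linalg.norm(coords - x0, axis=1)), 1.0)
    return np.linalg.solve(A, b)[:n]           # kriging weights (sum to 1)

coords = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [40.0, 40.0]])
w = ok_weights(coords, np.array([2.0, 2.0]))
print(np.isclose(w.sum(), 1.0))                # weights sum to one
print(w[0] == w.max())                         # nearest station weighted most
```

The predicted rainfall at x0 is then simply `w @ observed_values`; co-kriging extends the same system with cross-variograms for the covariate (TRI here).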

  4. Spatial distribution of Biomphalaria mollusks at São Francisco River Basin, Minas Gerais, Brazil, using geostatistical procedures.

    PubMed

    Guimarães, Ricardo J P S; Freitas, Corina C; Dutra, Luciano V; Felgueiras, Carlos A; Moura, Ana C M; Amaral, Ronaldo S; Drummond, Sandra C; Scholte, Ronaldo G C; Oliveira, Guilherme; Carvalho, Omar S

    2009-03-01

    Geostatistics is used in this work to make inferences about the presence of species of Biomphalaria (B. glabrata, B. tenagophila and/or B. straminea), intermediate hosts of Schistosoma mansoni, in the São Francisco River Basin, Minas Gerais, Brazil. One of these geostatistical procedures, known as indicator kriging, allows the classification of categorical data in areas where data are not available, using a punctual sample set. The result is a map of species and risk area definition. Beyond a single map of the categorical attribute, the procedure also permits association of the uncertainties of the stochastic model, which can be used to qualify the inferences. In order to validate the estimated data of the risk map, fieldwork in five municipalities was carried out. The obtained results showed that indicator kriging is a rather robust tool, since it presented very good agreement with the field findings. The obtained risk map can be thought of as an auxiliary tool to formulate proper public health strategies and to guide other fieldwork, considering the places with the highest occurrence probability of the most important snail species. The risk map will also enable better resource distribution and adequate policies for mollusk control. This methodology will be applied to other river basins to generate a predictive map of Biomphalaria species distribution for the entire state of Minas Gerais. PMID:19046937
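
The indicator idea in simplified form: code species presence as 0/1 at sampled sites, then interpolate the indicator to get an occurrence probability at unsampled locations. For brevity this sketch substitutes inverse-distance weights where indicator kriging would derive weights from an indicator variogram; coordinates and presence data are invented:

```python
import numpy as np

coords = np.array([[0, 0], [10, 0], [0, 10], [10, 10], [50, 50]], float)
present = np.array([1, 1, 1, 0, 0])          # species found at site? (0/1)

# Weighted average of the 0/1 indicator = estimated occurrence probability.
def indicator_idw(coords, ind, x0, power=2.0):
    d = np.linalg.norm(coords - x0, axis=1)
    w = 1.0 / np.maximum(d, 1e-9) ** power   # inverse-distance weights
    w /= w.sum()
    return float(w @ ind)

p_near = indicator_idw(coords, present, np.array([2.0, 2.0]))
p_far = indicator_idw(coords, present, np.array([48.0, 48.0]))
print(p_near > 0.5)   # close to positive sites -> high probability
print(p_far < 0.5)    # close to a negative site -> low probability
```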

  5. Revised Geostatistical Analysis of the Inventory of Carbon Tetrachloride in the Unconfined Aquifer in the 200 West Area of the Hanford Site

    SciTech Connect

    Murray, Christopher J.; Bott, Yi-Ju

    2008-12-30

    This report provides an updated estimate of the inventory of carbon tetrachloride (CTET) in the unconfined aquifer in the 200 West Area of the Hanford Site. The contaminant plumes of interest extend within the 200-ZP-1 and 200-UP-1 operable units. CH2M HILL Plateau Remediation Company (CHPRC) currently is preparing a plan identifying locations for groundwater extraction wells, injection wells, transfer stations, and one or more treatment facilities to address contaminants of concern identified in the 200-ZP-1 CERCLA Record of Decision. To accomplish this, a current understanding of the inventory of CTET is needed throughout the unconfined aquifer in the 200 West Area. Pacific Northwest National Laboratory (PNNL) previously developed an estimate of the CTET inventory in the area using a Monte Carlo approach based on geostatistical simulation of the three-dimensional (3D) distribution of CTET and chloroform in the aquifer. Fluor Hanford, Inc. (FH) (the previous site contractor) requested PNNL to update that inventory estimate using as input a set of geostatistical realizations of CTET and chloroform recently created for a related but separate project, referred to as the mapping project. The scope of work for the inventory revision complemented the scope of work for the mapping project, performed for FH by PNNL. This report briefly describes the spatial and univariate distribution of the CTET and chloroform data, along with the results of the geostatistical analysis and simulation performed for the mapping project.
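    The Monte Carlo inventory step described above is straightforward to sketch: within each geostatistical realization, sum the contaminant mass over all cells, then read the uncertainty off the distribution of totals across realizations. Every number below (realization count, grid size, per-cell water volume, lognormal concentrations) is a made-up stand-in for the actual simulated CTET fields.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stand-in for the simulated realizations: 200 realizations of
# CTET concentration (ug/L) on a coarse grid, plus a water volume per cell (L).
# In the real workflow these come from conditional 3D geostatistical simulation.
n_real, n_cells = 200, 500
conc = rng.lognormal(mean=1.0, sigma=1.2, size=(n_real, n_cells))  # ug/L
water_volume = np.full(n_cells, 5.0e7)                             # L per cell

# Inventory per realization: total mass over all cells (convert ug -> kg)
inventory_kg = (conc * water_volume).sum(axis=1) / 1.0e9

# The spread across realizations quantifies the inventory uncertainty
p05, p50, p95 = np.percentile(inventory_kg, [5, 50, 95])
print(f"CTET inventory: median {p50:.0f} kg (90% interval {p05:.0f}-{p95:.0f} kg)")
```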

  6. An in vitro systematic spectroscopic examination of the photostabilities of a random set of commercial sunscreen lotions and their chemical UVB/UVA active agents.

    PubMed

    Serpone, Nick; Salinaro, Angela; Emeline, Alexei V; Horikoshi, Satoshi; Hidaka, Hisao; Zhao, Jincai

    2002-12-01

    The photostabilities of a random set of commercially available sunscreen lotions and their active ingredients are examined spectroscopically following simulated-sunlight UV exposure. Loss of filtering efficacy can occur because of photochemical modifications of the sunscreen active agents. Changes in the absorption of UVA/UVB sunlight by agents in sunscreen lotions also lead to a reduction of the expected photoprotection of human skin and DNA against harmful UV radiation. The active ingredients were investigated in aqueous media and in organic solvents of various polarities (methanol, acetonitrile, and n-hexane) under aerobic and anaerobic conditions. The UV absorption features are affected by the nature of the solvents, whose properties are closely related to the oil-in-water (o/w) or water-in-oil (w/o) emulsions actually used in sunscreen formulations, and by the presence of molecular oxygen. The photostabilities of two combined chemical ingredients (oxybenzone and octyl methoxycinnamate) and of the combination oxybenzone/titanium dioxide were also explored. In the latter case, oxybenzone undergoes significant photodegradation in the presence of the physical filter TiO2. PMID:12661594

  7. Building on crossvalidation for increasing the quality of geostatistical modeling

    USGS Publications Warehouse

    Olea, R.A.

    2012-01-01

    The random function is a mathematical model commonly used in the assessment of uncertainty associated with a spatially correlated attribute that has been partially sampled. There are multiple algorithms for modeling such random functions, all sharing the requirement of specifying various parameters that have a critical influence on the results. The importance of finding ways to compare the methods and set parameters so as to better model uncertainty has increased as these algorithms have grown in number and complexity. Crossvalidation has been used in spatial statistics, mostly in kriging, for the analysis of mean square errors. An appeal of this approach is its ability to work with the same empirical sample available for running the algorithms. This paper goes beyond checking estimates by formulating a function sensitive to conditional bias. Under ideal conditions, such a function turns into a straight line, which can be used as a reference for preparing measures of performance. Applied to kriging, deviations from the ideal line provide a sensitivity to the semivariogram lacking in crossvalidation of kriging errors, and are more sensitive to conditional bias than analyses of errors alone. For stochastic simulation, in addition to finding better parameters, the deviations allow comparison of the realizations resulting from the application of different methods. Examples show improvements of about 30% in the deviations and approximately 10% in the square root of the mean square errors between reasonable starting models and the solutions obtained under the new criteria. © 2011 US Government.
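    The crossvalidation machinery the paper builds on can be sketched with leave-one-out re-estimation. In this minimal illustration an inverse-distance estimator stands in for kriging (an assumption for brevity), and the regression slope of true values on estimates gives a crude conditional-bias check: under conditional unbiasedness the slope is 1, while a slope below 1 signals the smoothing-induced conditional bias the paper targets.

```python
import numpy as np

def idw_estimate(xy, z, xy0, power=2.0):
    """Inverse-distance estimate; stands in here for any spatial estimator."""
    d = np.linalg.norm(xy - xy0, axis=1)
    w = 1.0 / d ** power
    w /= w.sum()
    return float(w @ z)

def loo_crossvalidate(xy, z):
    """Leave-one-out crossvalidation: re-estimate each datum from the rest."""
    n = len(z)
    est = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i
        est[i] = idw_estimate(xy[keep], z[keep], xy[i])
    rmse = float(np.sqrt(np.mean((est - z) ** 2)))
    # Slope of the regression of true values on estimates: 1.0 under
    # conditional unbiasedness; < 1 signals conditional bias (smoothing).
    slope = float(np.polyfit(est, z, 1)[0])
    return est, rmse, slope

# Synthetic demonstration on 40 hypothetical sample points
rng = np.random.default_rng(1)
xy = rng.uniform(0.0, 100.0, size=(40, 2))
z = np.sin(xy[:, 0] / 20.0) + 0.1 * rng.standard_normal(40)
est, rmse, slope = loo_crossvalidate(xy, z)
```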

  8. Geostatistical noise filtering of geophysical images : application to unexploded ordnance (UXO) sites.

    SciTech Connect

    Saito, Hirotaka; McKenna, Sean Andrew; Coburn, Timothy C.

    2004-07-01

    Geostatistical and non-geostatistical noise filtering methodologies (factorial kriging and a low-pass filter, respectively), together with a region-growing method, are applied to analytic-signal magnetometer images at two UXO-contaminated sites to delineate UXO target areas. Overall delineation performance is improved by removing background noise. Factorial kriging slightly outperforms the low-pass filter, but there is no distinct difference between them in terms of finding anomalies of interest.
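    A low-pass filter of the kind compared against factorial kriging above can be sketched as a separable moving average; the window size and edge handling below are illustrative choices, not the study's settings. Subtracting the smoothed background from the raw image is one simple way to suppress long-wavelength noise and highlight compact anomalies.

```python
import numpy as np

def low_pass(image, size=5):
    """Moving-average low-pass filter via separable row/column convolution.

    `size` must be odd so the output aligns with the input grid.
    Edge padding keeps the output the same shape as the input.
    """
    kernel = np.ones(size) / size
    pad = size // 2
    padded = np.pad(image, pad, mode="edge")
    # separable: filter every row, then every column
    rows = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode="valid"), 1, padded)
    return np.apply_along_axis(
        lambda c: np.convolve(c, kernel, mode="valid"), 0, rows)

# A constant image passes through unchanged (sanity check)
img = np.ones((10, 12))
smoothed = low_pass(img, size=5)
```

Factorial kriging achieves a similar separation of scales, but does so by decomposing the variogram into nested structures and filtering the short-range (noise) component explicitly.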

  9. Should hydraulic tomography data be interpreted using geostatistical inverse modeling? A laboratory sandbox investigation

    NASA Astrophysics Data System (ADS)

    Illman, Walter A.; Berg, Steven J.; Zhao, Zhanfeng

    2015-05-01

    The robust performance of hydraulic tomography (HT) based on geostatistics has been demonstrated through numerous synthetic, laboratory, and field studies. While geostatistical inverse methods offer many advantages, one key disadvantage is their highly parameterized nature, which renders them computationally intensive for large-scale problems. Another issue is that geostatistics-based HT may produce overly smooth images of subsurface heterogeneity when few monitoring-interval data are available. Therefore, some may question the utility of the geostatistical inversion approach in certain situations and seek alternative approaches. To investigate these issues, we simultaneously calibrated different groundwater models with varying subsurface conceptualizations and parameter resolutions using a laboratory sandbox aquifer. The compared models included: (1) isotropic and anisotropic effective parameter models; (2) a heterogeneous model that faithfully represents the geological features; and (3) a heterogeneous model based on geostatistical inverse modeling. The performance of these models was assessed by quantitatively examining the results of model calibration and validation. Calibration data consisted of steady-state drawdown data from eight pumping tests, and validation data consisted of data from 16 separate pumping tests not used in the calibration effort. Results revealed that the geostatistical inversion approach performed best among the approaches compared, although the geological model that faithfully represented the stratigraphy came a close second. In addition, when the number of pumping tests available for inverse modeling was small, the geological modeling approach yielded more robust validation results. This suggests that better knowledge of stratigraphy, obtained via geophysics or other means, may contribute to improved HT results.

  10. Optimized Field Sampling and Monitoring of Airborne Hazardous Transport Plumes; A Geostatistical Simulation Approach

    SciTech Connect

    Chen, DI-WEN

    2001-11-21

    Airborne hazardous plumes inadvertently released during nuclear/chemical/biological incidents are mostly of unknown composition and concentration until measurements are taken of post-accident ground concentrations from plume-ground deposition of constituents. Unfortunately, measurements often are days post-incident and rely on hazardous manned air-vehicle measurements. Before this happens, computational plume migration models are the only source of information on the plume characteristics, constituents, concentrations, directions of travel, ground deposition, etc. A mobile "lighter than air" (LTA) system is being developed at Oak Ridge National Laboratory that will be part of the first response in emergency conditions. These interactive and remote unmanned air vehicles will carry light-weight detectors and weather instrumentation to measure the conditions during and after plume release. This requires a cooperative computationally organized, GPS-controlled set of LTAs that self-coordinate around the objectives in an emergency situation in restricted time frames. A critical step before an optimum and cost-effective field sampling and monitoring program proceeds is the collection of data that provides statistically significant information, collected in a reliable and expeditious manner. Efficient aerial arrangements of the detectors taking the data (for active airborne release conditions) are necessary for plume identification, computational 3-dimensional reconstruction, and source distribution functions. This report describes the application of stochastic or geostatistical simulations to delineate the plume for guiding subsequent sampling and monitoring designs. A case study is presented of building digital plume images, based on existing "hard" experimental data and "soft" preliminary transport modeling results of Prairie Grass Trials Site. Markov Bayes Simulation, a coupled Bayesian/geostatistical methodology, quantitatively combines soft information

  11. Geostatistical prediction of flow-duration curves in an index-flow framework

    NASA Astrophysics Data System (ADS)

    Pugliese, Alessio; Castellarin, Attilio; Brath, Armando

    2014-05-01

    An empirical period-of-record Flow-Duration Curve (FDC) describes the percentage of time (duration) in which a given streamflow was equaled or exceeded over an historical period of time. FDCs have always attracted a great deal of interest in engineering applications because of their ability to provide a simple yet comprehensive graphical view of the overall historical variability of streamflows in a river basin, from floods to low-flows. Nevertheless, in many practical applications one has to construct FDC in basins that are ungauged or where very few observations are available. We present in this study an application strategy of Topological kriging (or Top-kriging), which makes the geostatistical procedure capable of predicting flow-duration curves (FDCs) in ungauged catchments. Previous applications of Top-kriging mainly focused on the prediction of point streamflow indices (e.g. flood quantiles, low-flow indices, etc.). In this study Top-kriging is used to predict FDCs in ungauged sites as a weighted average of standardised empirical FDCs through the traditional linear-weighting scheme of kriging methods. Our study focuses on the prediction of FDCs for 18 unregulated catchments located in Central Italy, for which daily streamflow series with length from 5 to 40 years are available, together with information on climate referring to the same time-span of each daily streamflow sequence. Empirical FDCs are standardised by a reference index-flow value (i.e. mean annual flow, or mean annual precipitation times the catchment drainage area) and the overall deviation of the curves from this reference value is then used for expressing the hydrological similarity between catchments and for deriving the geostatistical weights. We performed an extensive leave-one-out cross-validation to quantify the accuracy of the proposed technique, and to compare it to traditional regionalisation models that were recently developed for the same study region. The cross-validation points
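    Constructing and standardising an empirical FDC, the input to the Top-kriging averaging step described above, can be sketched as follows. The Weibull plotting position and the synthetic lognormal daily flows are assumptions chosen for illustration, not details from the study.

```python
import numpy as np

def flow_duration_curve(q):
    """Empirical period-of-record FDC from a daily streamflow series.

    Returns exceedance probabilities (durations) and the corresponding
    sorted flows: FDC(d) is the flow equaled or exceeded a fraction d
    of the time.
    """
    q = np.sort(np.asarray(q, dtype=float))[::-1]   # descending flows
    n = len(q)
    duration = np.arange(1, n + 1) / (n + 1)        # Weibull plotting position
    return duration, q

# Synthetic year of daily flows, standardised by an index flow (here the
# mean flow) before curves from different catchments are averaged with
# the kriging weights
rng = np.random.default_rng(0)
daily_q = rng.lognormal(mean=2.0, sigma=0.8, size=365)
d, q_sorted = flow_duration_curve(daily_q)
q_std = q_sorted / daily_q.mean()   # dimensionless standardised FDC
```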

  12. Optimization of ventilator setting by flow and pressure waveforms analysis during noninvasive ventilation for acute exacerbations of COPD: a multicentric randomized controlled trial

    PubMed Central

    2011-01-01

    Introduction: The analysis of flow and pressure waveforms generated by ventilators can be useful in the optimization of patient-ventilator interactions, notably in chronic obstructive pulmonary disease (COPD) patients. To date, however, a real clinical benefit of this approach has not been proven. Methods: The aim of the present randomized, multi-centric, controlled study was to compare optimized ventilation, driven by the analysis of flow and pressure waveforms, to standard ventilation (same physician, same initial ventilator setting, same time spent at the bedside while the ventilator screen was obscured, with numerical data always available). The primary aim was the rate of pH normalization at two hours, while secondary aims were changes in PaCO2, respiratory rate and the patient's tolerance to ventilation (all parameters evaluated at baseline, 30, 120, 360 minutes and 24 hours after the beginning of ventilation). Seventy patients (35 for each group) with acute exacerbation of COPD were enrolled. Results: Optimized ventilation led to a more rapid normalization of pH at two hours (51 vs. 26% of patients), to a significant improvement of the patient's tolerance to ventilation at two hours, and to a higher decrease of PaCO2 at two and six hours. Optimized ventilation induced physicians to use higher levels of external positive end-expiratory pressure, more sensitive inspiratory triggers and a faster speed of pressurization. Conclusions: The analysis of the waveforms generated by ventilators has a significant positive effect on physiological and patient-centered outcomes during acute exacerbation of COPD. The acquisition of specific skills in this field should be encouraged. Trial registration: ClinicalTrials.gov NCT01291303. PMID:22115190

  13. Hand-held echocardiography in the setting of pre-operative cardiac evaluation of patients undergoing non-cardiac surgery: results from a randomized pilot study.

    PubMed

    Cavallari, Ilaria; Mega, Simona; Goffredo, Costanza; Patti, Giuseppe; Chello, Massimo; Di Sciascio, Germano

    2015-06-01

    Transthoracic echocardiography is not a routine test in the pre-operative cardiac evaluation of patients undergoing non-cardiac surgery, but may be considered in those with known heart failure or valvular heart disease, or complaining of cardiac symptoms. In this setting, hand-held echocardiography (HHE) could find a potential application as an alternative to standard echocardiography in selected patients; however, its utility in this context has not been investigated. The aim of this pilot study was to evaluate the conclusiveness of HHE compared to standard echocardiography in this subset of patients. 100 patients scheduled for non-cardiac surgery were randomized to receive a standard exam with a Philips Ie33 or a bedside evaluation with a pocket-size imaging device (Opti-Go, Philips Medical System). The primary endpoint was the percentage of satisfactory diagnoses at the end of the examination, referred to as conclusiveness. Secondary endpoints were the mean duration time and the mean waiting time of the exams. No significant difference in conclusiveness between HHE and standard echo was found (86 vs 96%; P = 0.08). Mean duration of the examination was 6.1 ± 1.2 min with HHE and 13.1 ± 2.6 min with standard echocardiography (P < 0.001). HHE resulted in a considerable saving of waiting time, because it was performed on the same day as the clinical evaluation, whereas patients waited 10.1 ± 6.1 days for a standard echocardiography (P < 0.001). This study suggests a potential role for HHE in the pre-operative evaluation of selected patients undergoing non-cardiac surgery, since it provided similar information but was performed faster and earlier than standard echocardiography. PMID:25985940

  14. Prospective Randomized Double-Blind Pilot Study of Site-Specific Consensus Atlas Implementation for Rectal Cancer Target Volume Delineation in the Cooperative Group Setting

    SciTech Connect

    Fuller, Clifton D.; Nijkamp, Jasper; Duppen, Joop C.; Rasch, Coen R.N.; Thomas, Charles R.; Wang, Samuel J.; Okunieff, Paul; Jones, William E.; Baseman, Daniel; Patel, Shilpen; Demandante, Carlo G.N.; Harris, Anna M.; Smith, Benjamin D.; Katz, Alan W.; McGann, Camille

    2011-02-01

    Purpose: Variations in target volume delineation represent a significant hurdle in clinical trials involving conformal radiotherapy. We sought to determine the effect of a consensus guideline-based visual atlas on contouring the target volumes. Methods and Materials: A representative case was contoured (Scan 1) by 14 physician observers and a reference expert with and without target volume delineation instructions derived from a proposed rectal cancer clinical trial involving conformal radiotherapy. The gross tumor volume (GTV), and two clinical target volumes (CTVA, including the internal iliac, presacral, and perirectal nodes, and CTVB, which included the external iliac nodes) were contoured. The observers were randomly assigned to receipt (Group A) or nonreceipt (Group B) of a consensus guideline and atlas for anorectal cancers and then instructed to recontour the same case/images (Scan 2). Observer variation was analyzed volumetrically using the conformation number (CN, where CN = 1 equals total agreement). Results: Of 14 evaluable contour sets (1 expert and 7 Group A and 6 Group B observers), greater agreement was found for the GTV (mean CN, 0.75) than for the CTVs (mean CN, 0.46-0.65). Atlas exposure for Group A led to significantly increased interobserver agreement for CTVA (mean initial CN, 0.68, after atlas use, 0.76; p = .03) and increased agreement with the expert reference (initial mean CN, 0.58; after atlas use, 0.69; p = .02). For the GTV and CTVB, neither the interobserver nor the expert agreement was altered after atlas exposure. Conclusion: Consensus guideline atlas implementation resulted in a detectable difference in interobserver agreement and a greater approximation of expert volumes for the CTVA but not for the GTV or CTVB in the specified case. Visual atlas inclusion should be considered as a feature in future clinical trials incorporating conformal RT.

  15. Randomized controlled trial of meat compared with multimicronutrient-fortified cereal in infants and toddlers with high stunting rates in diverse settings123

    PubMed Central

    Krebs, Nancy F; Mazariegos, Manolo; Chomba, Elwyn; Sami, Neelofar; Pasha, Omrana; Tshefu, Antoinette; Carlo, Waldemar A; Goldenberg, Robert L; Bose, Carl L; Wright, Linda L; Koso-Thomas, Marion; Goco, Norman; Kindem, Mark; McClure, Elizabeth M; Westcott, Jamie; Garces, Ana; Lokangaka, Adrien; Manasyan, Albert; Imenda, Edna; Hartwell, Tyler D; Hambidge, K Michael

    2012-01-01

    Background: Improved complementary feeding is cited as a critical factor for reducing stunting. Consumption of meats has been advocated, but its efficacy in low-resource settings has not been tested. Objective: The objective was to test the hypothesis that daily intake of 30 to 45 g meat from 6 to 18 mo of age would result in greater linear growth velocity and improved micronutrient status in comparison with an equicaloric multimicronutrient-fortified cereal. Design: This was a cluster randomized efficacy trial conducted in the Democratic Republic of Congo, Zambia, Guatemala, and Pakistan. Individual daily portions of study foods and education messages to enhance complementary feeding were delivered to participants. Blood tests were obtained at trial completion. Results: A total of 532 (86.1%) and 530 (85.8%) participants from the meat and cereal arms, respectively, completed the study. Linear growth velocity did not differ between treatment groups: 1.00 (95% CI: 0.99, 1.02) and 1.02 (95% CI: 1.00, 1.04) cm/mo for the meat and cereal groups, respectively (P = 0.39). From baseline to 18 mo, stunting [length-for-age z score (LAZ) <−2.0] rates increased from ∼33% to nearly 50%. Years of maternal education and maternal height were positively associated with linear growth velocity (P = 0.0006 and 0.003, respectively); LAZ at 6 mo was negatively associated (P < 0.0001). Anemia rates did not differ by group; iron deficiency was significantly lower in the cereal group. Conclusion: The high rate of stunting at baseline and the lack of effect of either the meat or multiple micronutrient-fortified cereal intervention to reverse its progression argue for multifaceted interventions beginning in the pre- and early postnatal periods. This trial was registered at clinicaltrials.gov as NCT01084109. PMID:22952176

  16. Estimating transmissivity in the Edwards Aquifer using upscaling, geostatistics, and Bayesian updating

    NASA Astrophysics Data System (ADS)

    Painter, S. L.; Jiang, Y.; Woodbury, A. D.

    2002-12-01

    The Edwards Aquifer, a highly heterogeneous karst aquifer located in south central Texas, is the sole source of drinking water for more than one million people. Hydraulic conductivity (K) measurements in the Edwards Aquifer are sparse, highly variable (log-K variance of 6.4), and are mostly from single-well drawdown tests that are appropriate for the spatial scale of a few meters. To support ongoing efforts to develop a groundwater management (MODFLOW) model of the San Antonio segment of the Edwards Aquifer, a multistep procedure was developed to assign hydraulic parameters to the 402 m x 402 m computational cells intended for the management model. The approach used a combination of nonparametric geostatistical analysis, stochastic simulation, numerical upscaling, and automatic model calibration based on Bayesian updating [1,2]. Indicator correlograms reveal a nested spatial structure in the well-test K of the confined zone, with practical correlation ranges of 3,600 and 15,000 meters and a large nugget effect. The fitted geostatistical model was used in unconditional stochastic simulations by the sequential indicator simulation method. The resulting realizations of K, defined at the scale of the well tests, were then numerically upscaled to the block scale. A new geostatistical model was fitted to the upscaled values. The upscaled model was then used to cokrige the block-scale K based on the well-test K. The resulting K map was then converted to transmissivity (T) using deterministically mapped aquifer thickness. When tested in a forward groundwater model, the upscaled T reproduced hydraulic heads better than a simple kriging of the well-test values (mean error of -3.9 meter and mean-absolute-error of 12 meters, as compared with -13 and 17 meters for the simple kriging). As the final step in the study, the upscaled T map was used as the prior distribution in an inverse procedure based on Bayesian updating [1,2]. When input to the forward groundwater model, the
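    The first step of workflows like the one above, fitting a variogram model to sparse well-test data, can be sketched with a method-of-moments semivariogram and a coarse grid-search fit. The single-structure exponential model and the synthetic data below are simplifications for illustration; the study fitted a nested indicator model to its well-test K data.

```python
import numpy as np

def empirical_semivariogram(xy, z, lags):
    """Method-of-moments estimator: gamma(h) = 0.5 * mean[(z_i - z_j)^2]."""
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    dz2 = (z[:, None] - z[None, :]) ** 2
    iu = np.triu_indices(len(z), k=1)          # each pair counted once
    d, dz2 = d[iu], dz2[iu]
    gamma = np.array([0.5 * dz2[(d >= a) & (d < b)].mean()
                      for a, b in zip(lags[:-1], lags[1:])])
    centers = 0.5 * (lags[:-1] + lags[1:])
    return centers, gamma

def fit_exponential(centers, gamma):
    """Coarse grid-search fit of nugget + (sill - nugget)*(1 - exp(-3h/range))."""
    sill = float(gamma.max())
    best, best_err = None, np.inf
    for nug in np.linspace(0.0, sill, 11):
        for pr in np.linspace(centers[1], centers[-1], 20):
            model = nug + (sill - nug) * (1.0 - np.exp(-3.0 * centers / pr))
            err = float(np.sum((model - gamma) ** 2))
            if err < best_err:
                best, best_err = (nug, sill, pr), err
    return best  # (nugget, sill, practical range)

# Synthetic demonstration (hypothetical data, not the Edwards Aquifer set)
rng = np.random.default_rng(2)
xy = rng.uniform(0.0, 100.0, size=(150, 2))
z = np.sin(xy[:, 0] / 15.0) + 0.2 * rng.standard_normal(150)
lags = np.linspace(0.0, 60.0, 9)
centers, gamma = empirical_semivariogram(xy, z, lags)
nug, sill, prange = fit_exponential(centers, gamma)
```

The fitted model then drives the simulation and cokriging steps; the nugget captures the small-scale and measurement variability that the study's large nugget effect reflects.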

  17. A multicenter randomized controlled trial of a plant-based nutrition program to reduce body weight and cardiovascular risk in the corporate setting: the GEICO study

    PubMed Central

    Mishra, S; Xu, J; Agarwal, U; Gonzales, J; Levin, S; Barnard, N D

    2013-01-01

    Background/objectives: To determine the effects of a low-fat plant-based diet program on anthropometric and biochemical measures in a multicenter corporate setting. Subjects/methods: Employees from 10 sites of a major US company with body mass index ⩾25 kg/m2 and/or previous diagnosis of type 2 diabetes were randomized to either follow a low-fat vegan diet, with weekly group support and work cafeteria options available, or make no diet changes for 18 weeks. Dietary intake, body weight, plasma lipid concentrations, blood pressure and glycated hemoglobin (HbA1C) were determined at baseline and 18 weeks. Results: Mean body weight fell 2.9 kg and 0.06 kg in the intervention and control groups, respectively (P<0.001). Total and low-density lipoprotein (LDL) cholesterol fell 8.0 and 8.1 mg/dl in the intervention group and 0.01 and 0.9 mg/dl in the control group (P<0.01). HbA1C fell 0.6 percentage point and 0.08 percentage point in the intervention and control group, respectively (P<0.01). Among study completers, mean changes in body weight were −4.3 kg and −0.08 kg in the intervention and control groups, respectively (P<0.001). Total and LDL cholesterol fell 13.7 and 13.0 mg/dl in the intervention group and 1.3 and 1.7 mg/dl in the control group (P<0.001). HbA1C levels decreased 0.7 percentage point and 0.1 percentage point in the intervention and control group, respectively (P<0.01). Conclusions: An 18-week dietary intervention using a low-fat plant-based diet in a corporate setting improves body weight, plasma lipids, and, in individuals with diabetes, glycemic control. PMID:23695207

  18. Working with men to prevent intimate partner violence in a conflict-affected setting: a pilot cluster randomized controlled trial in rural Côte d’Ivoire

    PubMed Central

    2014-01-01

    Background Evidence from armed conflict settings points to high levels of intimate partner violence (IPV) against women. Current knowledge on how to prevent IPV is limited—especially within war-affected settings. To inform prevention programming on gender-based violence in settings affected by conflict, we evaluated the impact of adding a targeted men’s intervention to a community-based prevention programme in Côte d’Ivoire. Methods We conducted a two-armed, non-blinded cluster randomized trial in Côte d’Ivoire among 12 pair-matched communities spanning government-controlled, UN buffer, and rebel–controlled zones. The intervention communities received a 16-week IPV prevention intervention using a men’s discussion group format. All communities received community-based prevention programmes. Baseline data were collected from couples in September 2010 (pre-intervention) and follow-up in March 2012 (one year post-intervention). The primary trial outcome was women’s reported experiences of physical and/or sexual IPV in the last 12 months. We also assessed men’s reported intention to use physical IPV, attitudes towards sexual IPV, use of hostility and conflict management skills, and participation in gendered household tasks. An adjusted cluster-level intention to treat analysis was used to compare outcomes between intervention and control communities at follow-up. Results At follow-up, reported levels of physical and/or sexual IPV in the intervention arm had decreased compared to the control arm (ARR 0.52, 95% CI 0.18-1.51, not significant). Men participating in the intervention reported decreased intentions to use physical IPV (ARR 0.83, 95% CI 0.66-1.06) and improved attitudes toward sexual IPV (ARR 1.21, 95% CI 0.77-1.91). Significant differences were found between men in the intervention and control arms’ reported ability to control their hostility and manage conflict (ARR 1.3, 95% CI 1.06-1.58), and participation in gendered household tasks (ARR

  19. Geostatistical analysis of tritium, groundwater age and other noble gas derived parameters in California.

    PubMed

    Visser, A; Moran, J E; Hillegonds, Darren; Singleton, M J; Kulongoski, Justin T; Belitz, Kenneth; Esser, B K

    2016-03-15

    Key characteristics of California groundwater systems related to aquifer vulnerability, sustainability, recharge locations and mechanisms, and anthropogenic impact on recharge are revealed in a spatial geostatistical analysis of a unique data set of tritium, noble gas and other isotopic analyses, unprecedented in size at nearly 4000 samples. The correlation length of key groundwater residence-time parameters varies from tens of kilometers (³H; age) to the order of a hundred kilometers (terrigenic ⁴He; ¹⁴C; tritiogenic ³He). The correlation length of parameters related to climate, topography and atmospheric processes is on the order of several hundred kilometers (recharge temperature; δ¹⁸O). Young groundwater ages that highlight regional recharge areas are located in the eastern San Joaquin Valley, in the southern Santa Clara Valley Basin, in the upper LA basin and along unlined canals carrying Colorado River water, showing that much of the recent recharge in central and southern California is dominated by river recharge and managed aquifer recharge. Modern groundwater is found in wells with the top open intervals below 60 m depth in the southeastern San Joaquin Valley, Santa Clara Valley and Los Angeles basin, as the result of intensive pumping and/or managed aquifer recharge operations. PMID:26803267

  20. Characterizing the spatial structure of endangered species habitat using geostatistical analysis of IKONOS imagery

    USGS Publications Warehouse

    Wallace, C.S.A.; Marsh, S.E.

    2005-01-01

    Our study used geostatistics to extract measures that characterize the spatial structure of vegetated landscapes from satellite imagery for mapping endangered Sonoran pronghorn habitat. Fine spatial resolution IKONOS data provided information at the scale of individual trees or shrubs that permitted analysis of vegetation structure and pattern. We derived images of landscape structure by calculating local estimates of the nugget, sill, and range variogram parameters within 25 × 25-m image windows. These variogram parameters, which describe the spatial autocorrelation of the 1-m image pixels, are shown in previous studies to discriminate between different species-specific vegetation associations. We constructed two independent models of pronghorn landscape preference by coupling the derived measures with Sonoran pronghorn sighting data: a distribution-based model and a cluster-based model. The distribution-based model used the descriptive statistics for variogram measures at pronghorn sightings, whereas the cluster-based model used the distribution of pronghorn sightings within clusters of an unsupervised classification of derived images. Both models define similar landscapes, and validation results confirm they effectively predict the locations of an independent set of pronghorn sightings. Such information, although not a substitute for field-based knowledge of the landscape and associated ecological processes, can provide valuable reconnaissance information to guide natural resource management efforts. © 2005 Taylor & Francis Group Ltd.

  1. Geostatistical investigations for suitable mapping of the water table: the Bordeaux case (France)

    NASA Astrophysics Data System (ADS)

    Guekie simo, Aubin Thibaut; Marache, Antoine; Lastennet, Roland; Breysse, Denys

    2016-02-01

    Methodologies have been developed to establish realistic water-table maps using geostatistical methods: ordinary kriging (OK), cokriging (CoK), collocated cokriging (CoCoK), and kriging with external drift (KED). In hilly terrain, when piezometric data are sparsely distributed over large areas, the water-table maps obtained by these methods provide exact water levels at monitoring wells but can fail to represent the groundwater flow system, manifested as an interpolated water table lying above the topography. A methodology is developed to rebuild water-table maps for urban areas at the city scale. The interpolation methodology is presented and applied in a case study where water levels are monitored at a set of 47 points over a partly urban domain covering 25.6 km2 close to Bordeaux, France. To select the best method, a geographic information system was used to visualize the surfaces reconstructed with each method, and a cross-validation was carried out to evaluate the predictive performance of each kriging method. KED proves to be the most accurate and yields a better description of the local fluctuations induced by the topography (natural occurrence of ridges and valleys).

  2. Spatial analysis of heavy metals in surface soils based on GeoStatistics

    NASA Astrophysics Data System (ADS)

    Sun, Yingjun; Ding, Ning; Cai, Fei; Meng, Fei

    2008-10-01

    The pollution of surface soils by heavy metals has become a widely discussed problem. Rather than seeking the "best" estimate at unsampled points, the authors focused on assessing the spatial uncertainty of unsampled values. Geostatistical simulation, which reproduces the same statistics (histogram, variogram), can generate a set of equally probable realizations that is particularly useful for assessing uncertainty in the spatial distribution of attribute values. The case study came from an urban-rural transition zone of Shanghai, China. Six heavy metals (Cu, Pb, Cd, Cr, Hg and As) in agricultural surface soils were analyzed. Based on the spatial variation of each heavy metal, different realizations of the six metals were generated with sequential simulation methods. The authors concluded that Cu, Cd and Cr were the dominant elements influencing soil quality in the study area, and uncertainty maps for the six heavy metals are presented.
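
    The use of equally probable realizations to build an uncertainty map, rather than a single "best" estimate, can be illustrated as follows. The lognormal draws below merely stand in for the output of a conditional sequential simulation; the threshold, grid size, and distribution parameters are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_real, n_cells = 200, 50
    threshold = 1.5  # hypothetical regulatory limit for one metal

    # Stand-in for sequential-simulation output: each row is one equally
    # probable realization of a concentration field over 50 grid cells.
    realizations = rng.lognormal(mean=0.0, sigma=0.5, size=(n_real, n_cells))

    # Per-cell probability of exceeding the threshold: the fraction of
    # realizations above it. This is the uncertainty map -- it quantifies
    # where exceedance is likely rather than picking one estimate.
    p_exceed = (realizations > threshold).mean(axis=0)
    ```

    Cells where `p_exceed` is near 0 or 1 are classified with confidence; intermediate values flag locations where more sampling would help.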

  3. Obtaining parsimonious hydraulic conductivity fields using head and transport observations: A bayesian geostatistical parameter estimation approach

    USGS Publications Warehouse

    Fienen, M.; Hunt, R.; Krabbenhoft, D.; Clemo, T.

    2009-01-01

    Flow path delineation is a valuable tool for interpreting the subsurface hydrogeochemical environment. Different types of data, such as groundwater flow and transport, inform different aspects of hydrogeologic parameter values (hydraulic conductivity in this case) which, in turn, determine flow paths. This work combines flow and transport information to estimate a unified set of hydrogeologic parameters using the Bayesian geostatistical inverse approach. Parameter flexibility is allowed by using a highly parameterized approach with the level of complexity informed by the data. Despite the effort to adhere to the ideal of minimal a priori structure imposed on the problem, extreme contrasts in parameters can result in the need to censor correlation across hydrostratigraphic bounding surfaces. These partitions segregate parameters into facies associations. With an iterative approach in which partitions are based on inspection of initial estimates, flow path interpretation is progressively refined through the inclusion of more types of data. Head observations, stable oxygen isotope (18O/16O) ratios, and tritium are all used to progressively refine flow path delineation on an isthmus between two lakes in the Trout Lake watershed, northern Wisconsin, United States. Despite allowing significant parameter freedom by estimating many distributed parameter values, a smooth field is obtained. Copyright 2009 by the American Geophysical Union.

  4. Geostatistical Analysis of Spatio-Temporal Forest Fire Data

    NASA Astrophysics Data System (ADS)

    Vega Orozco, Carmen D.; Kanevski, Mikhail; Tonini, Marj; Conedera, Marc

    2010-05-01

    Forest fire is one of the major phenomena causing degradation of the environment, landscape, natural ecosystems, human health and the economy. One of the main topics in forest fire studies is the detection, analysis and modelling of spatio-temporal clustering patterns. The spatial patterns of forest fire locations, their sizes and their sequence in time are of great interest for fire prediction, for forest fire management planning and for the optimal distribution of the necessary resources. Fires can currently be analyzed and monitored using different statistical tools, for example Ripley's K-function, fractals, the Allan factor and scan statistics. Some of these are adapted to temporal or spatial data and are either local or global. In the present study the main attention is paid to the application of geostatistical tools - variography and methods for the analysis of monitoring network (MN) clustering (topological, statistical and fractal measures) - in order to detect and characterize spatio-temporal forest fire patterns. The main studies performed include: a) analysis of forest fire temporal sequences; b) spatial clustering of forest fires; c) geostatistical spatial analysis of burnt areas. Variography was carried out for both temporal and spatial data. The real case study is based on forest-fire event data from the Canton of Ticino (Switzerland) for the period 1969 to 2008. The results of the temporal analysis show the presence of clustering and seasonal periodicities. Comprehensive analysis of the variograms shows an anisotropy: in the direction 30° East-North smooth changes are detected, while in the direction 30° North-West a greater variability was identified. The research was completed with an application of different MN analysis techniques including analysis of distributions of distances between events, the Morisita Index (MI), fractal dimensions (sandbox counting and box counting methods) and functional fractal dimensions, adapted and

  5. Geostatistical Analyses of the Persistence and Inventory of Carbon Tetrachloride in the 200 West Area of the Hanford Site

    SciTech Connect

    Murray, Christopher J.; Bott, Yi-Ju; Truex, Michael J.

    2007-04-30

    This report documents two separate geostatistical studies performed by researchers from Pacific Northwest National Laboratory to evaluate the carbon tetrachloride plume in the groundwater on the Hanford Site.

  6. Bootstrapped models for intrinsic random functions

    SciTech Connect

    Campbell, K.

    1987-01-01

    The use of intrinsic random function stochastic models as a basis for estimation in geostatistical work requires the identification of the generalized covariance function of the underlying process, and the fact that this function has to be estimated from the data introduces an additional source of error into predictions based on the model. This paper develops the sample reuse procedure called the "bootstrap" in the context of intrinsic random functions to obtain realistic estimates of these errors. Simulation results support the conclusion that bootstrap distributions of functionals of the process, as well as of their "kriging variance," provide a reasonable picture of the variability introduced by imperfect estimation of the generalized covariance function.

  7. Bootstrapped models for intrinsic random functions

    SciTech Connect

    Campbell, K.

    1988-08-01

    Use of intrinsic random function stochastic models as a basis for estimation in geostatistical work requires the identification of the generalized covariance function of the underlying process. The fact that this function has to be estimated from data introduces an additional source of error into predictions based on the model. This paper develops the sample reuse procedure called the bootstrap in the context of intrinsic random functions to obtain realistic estimates of these errors. Simulation results support the conclusion that bootstrap distributions of functionals of the process, as well as their kriging variance, provide a reasonable picture of variability introduced by imperfect estimation of the generalized covariance function.
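
    The bootstrap procedure described in these two records (resample the data, re-estimate the parameter, and inspect the spread of the re-estimates) can be sketched generically. Here `np.var` stands in for a generalized-covariance estimator, and the data are synthetic; this is an illustration of the resampling idea, not the paper's algorithm.

    ```python
    import numpy as np

    def bootstrap_param(data, estimator, n_boot=1000, seed=0):
        """Bootstrap distribution of an estimated parameter: resample the
        data with replacement and re-apply the estimator. The spread of
        the results pictures the extra variability introduced by having
        to estimate the parameter from data."""
        rng = np.random.default_rng(seed)
        data = np.asarray(data, dtype=float)
        n = len(data)
        stats = [estimator(data[rng.integers(0, n, size=n)])
                 for _ in range(n_boot)]
        return np.array(stats)

    # Synthetic data with true variance 4; the bootstrap distribution of
    # the variance estimate centers near 4 with nonzero spread.
    rng = np.random.default_rng(1)
    sample = rng.normal(loc=0.0, scale=2.0, size=500)
    boot_vars = bootstrap_param(sample, np.var)
    ```

    In the intrinsic-random-function setting, the resampling must respect the spatial structure (the papers develop this carefully); the plain i.i.d. resample above is only the simplest version of the idea.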

  8. Priority Setting and Influential Factors on Acceptance of Pharmaceutical Recommendations in Collaborative Medication Reviews in an Ambulatory Care Setting – Analysis of a Cluster Randomized Controlled Trial (WestGem-Study)

    PubMed Central

    Rose, Olaf; Mennemann, Hugo; John, Carina; Lautenschläger, Marcus; Mertens-Keller, Damaris; Richling, Katharina; Waltering, Isabel; Hamacher, Stefanie; Felsch, Moritz; Herich, Lena; Czarnecki, Kathrin; Schaffert, Corinna; Jaehde, Ulrich; Köberlein-Neu, Juliane

    2016-01-01

    Background Medication reviews are recognized services to increase quality of therapy and reduce medication risks. The selection of eligible patients with potential to receive a major benefit is based on assumptions rather than on factual data. Acceptance of interprofessional collaboration is crucial to increase the quality of medication therapy. Objective The research question was to identify and prioritize eligible patients for a medication review and to provide evidence-based criteria for patient selection. The prescribing general practitioners’ acceptance of the pharmaceutical recommendations was measured, and factors influencing physicians’ acceptance were explored, to obtain an impression of the extent of collaboration in medication review in an ambulatory care setting. Methods Based on data of a cluster-randomized controlled study (WestGem-study), the correlation between patient parameters and the individual performance in a medication review was calculated in a multiple logistic regression model. Physicians’ acceptance of the suggested interventions was assessed using feedback forms. Influential factors were analyzed. Results The number of drugs in use (p = 0.001), discrepancies between prescribed and used medicines (p = 0.014), the baseline Medication Appropriateness Index score (p<0.001) and the duration of the intervention (p = 0.006) could be identified as influential factors for a major benefit from a medication review, whereas morbidity (p>0.05) and a low kidney function (p>0.05) do not predetermine the outcome. Longitudinal patient care with repeated reviews showed higher interprofessional acceptance and superior patient benefit. A total of 54.9% of the recommendations in a medication review on drug therapy were accepted for implementation. Conclusions The number of drugs in use and medication reconciliation could be a first rational step in patient selection for a medication review. Most elderly, multimorbid patients with polymedication

  9. Spatiotemporal analysis of olive flowering using geostatistical techniques.

    PubMed

    Rojo, Jesús; Pérez-Badia, Rosa

    2015-02-01

    Analysis of flowering patterns in the olive (Olea europaea L.) is of considerable agricultural and ecological interest, and also provides valuable information for allergy-sufferers, enabling identification of the major sources of airborne pollen at any given moment by interpreting the aerobiological data recorded in pollen traps. The present spatiotemporal analysis of olive flowering in central Spain combined geostatistical techniques with a geographic information system, and compared results for flowering intensity with airborne pollen records. The results were used to obtain continuous phenological maps showing the pattern of succession of olive flowering. The results also show that, although the highest airborne olive-pollen counts were recorded during the greatest flowering intensity of the groves closest to the pollen trap, the counts recorded at the start of the pollen season were not linked to local olive groves, which had not yet begun to flower. To detect the remote sources of olive pollen, several episodes of pollen recorded before the local flowering season were analysed using a HYSPLIT trajectory model, and the findings showed that western, southern and southwestern winds transported pollen grains into the study area from earlier-flowering groves located outside the territory. PMID:25461089

  10. Kriging in the Shadows: Geostatistical Interpolation for Remote Sensing

    NASA Technical Reports Server (NTRS)

    Rossi, Richard E.; Dungan, Jennifer L.; Beck, Louisa R.

    1994-01-01

    It is often useful to estimate obscured or missing remotely sensed data. Traditional interpolation methods, such as nearest-neighbor or bilinear resampling, do not take full advantage of the spatial information in the image. An alternative method, a geostatistical technique known as indicator kriging, is described and demonstrated using a Landsat Thematic Mapper image in southern Chiapas, Mexico. The image was first classified into pasture and nonpasture land cover. For each pixel that was obscured by cloud or cloud shadow, the probability that it was pasture was assigned by the algorithm. An exponential omnidirectional variogram model was used to characterize the spatial continuity of the image for use in the kriging algorithm. Assuming a cutoff probability level of 50%, the error was shown to be 17% with no obvious spatial bias but with some tendency to categorize nonpasture as pasture (overestimation). While this is a promising result, the method's practical application in other missing data problems for remotely sensed images will depend on the amount and spatial pattern of the unobscured pixels and missing pixels and the success of the spatial continuity model used.
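
    The indicator workflow described here can be sketched in three steps: code clear pixels as 0/1 (non-pasture/pasture), interpolate a pasture probability at each obscured pixel, and apply the 50% cutoff. Inverse-distance weighting is used below as a simple stand-in for the kriging step; the coordinates and indicator values are made up.

    ```python
    import numpy as np

    def idw_indicator(xy_obs, ind_obs, xy_new, power=2.0):
        """Estimate P(pasture) at obscured pixels by inverse-distance
        weighting of neighbouring 0/1 indicator values -- a simple
        substitute for the indicator-kriging step in the abstract."""
        preds = []
        for p in np.atleast_2d(xy_new):
            d = np.linalg.norm(xy_obs - p, axis=1)
            w = 1.0 / np.maximum(d, 1e-12) ** power
            preds.append(np.sum(w * ind_obs) / np.sum(w))
        return np.array(preds)

    # Clear pixels around one cloud-obscured pixel: pasture (1) on the
    # left, non-pasture (0) farther away on the right.
    xy = np.array([[0, 0], [0, 1], [3, 0], [3, 1]], dtype=float)
    ind = np.array([1, 1, 0, 0])
    p_pasture = idw_indicator(xy, ind, np.array([[1.0, 0.5]]))
    label = int(p_pasture[0] >= 0.5)  # 50% cutoff, as in the study
    ```

    Indicator kriging would replace the ad-hoc distance weights with weights derived from the fitted exponential variogram, which is what lets the method exploit the image's spatial continuity.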

  11. Geostatistical modeling of uncertainty, simulation, and proposed applications in GIScience

    NASA Astrophysics Data System (ADS)

    Doucette, Peter; Dolloff, John; Lenihan, Michael

    2015-05-01

    Geostatistical modeling of spatial uncertainty has its roots in the mining, water and oil reservoir exploration communities, and has great potential for broader applications as proposed in this paper. This paper describes the underlying statistical models and their use in both the estimation of quantities of interest and the Monte-Carlo simulation of their uncertainty or errors, including their variance or expected magnitude and their spatial correlations or inter-relationships. These quantities can include 2D or 3D terrain locations, feature vertex locations, or any specified attributes whose statistical properties vary spatially. The simulation of spatial uncertainty or errors is a practical and powerful tool for understanding the effects of error propagation in complex systems. This paper describes various simulation techniques and trades off their generality against complexity and speed. One technique recently proposed by the authors, Fast Sequential Simulation, has the ability to simulate tens of millions of errors with specifiable variance and spatial correlations in a few seconds on a laptop computer. This ability allows for the timely evaluation of resultant output errors or the performance of a "down-stream" module or application. It also allows for near-real time evaluation when such a simulation capability is built into the application itself.
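
    A standard (if slow) way to simulate errors with specified variance and spatial correlation is via a Cholesky factor of the covariance matrix: e = Lz with z standard normal. The dense O(n³) factorization below is exactly the cost that fast sequential methods are designed to avoid at scale; all parameters here are illustrative.

    ```python
    import numpy as np

    def simulate_correlated_errors(coords, sigma=1.0, range_=2.0,
                                   n_sims=500, seed=0):
        """Draw zero-mean errors with variance sigma**2 and exponential
        spatial correlation exp(-d/range_) via e = L z, where L is the
        Cholesky factor of the covariance matrix and z ~ N(0, I)."""
        rng = np.random.default_rng(seed)
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        C = sigma ** 2 * np.exp(-d / range_)
        L = np.linalg.cholesky(C)
        return (L @ rng.normal(size=(len(coords), n_sims))).T

    # Ten points on a line, one unit apart; neighbouring errors should
    # show correlation near exp(-1/2) ~ 0.61.
    coords = np.array([[float(i), 0.0] for i in range(10)])
    errs = simulate_correlated_errors(coords, n_sims=2000)
    emp_cov = np.cov(errs.T)
    ```

    Checking the empirical covariance of the draws against the target model, as done in the test here, is a useful sanity check for any simulator of this kind.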

  12. Soil Organic Carbon Mapping by Geostatistics in Europe Scale

    NASA Astrophysics Data System (ADS)

    Aksoy, E.; Panagos, P.; Montanarella, L.

    2013-12-01

    Accuracy in assessing the distribution of soil organic carbon (SOC) is an important issue because SOC is an important soil component that plays key roles in the functions of both natural ecosystems and agricultural systems. The SOC content varies from place to place and is strongly related to climate variables (temperature and rainfall), terrain features, soil texture, parent material, vegetation, land-use types, and human management (management and degradation) at different spatial scales. Geostatistical techniques allow for the prediction of soil properties using soil information and environmental covariates. In this study, the SOC distribution was predicted with the regression-kriging method at the European scale. In this prediction, soil samples collected in the LUCAS (European Land Use/Cover Area frame statistical Survey) and BioSoil projects were combined with local soil data from six different CZOs in Europe and ten spatial predictors (slope, aspect, elevation, CTI, CORINE land-cover classification, parent material, texture, WRB soil classification, annual average temperature and precipitation). Significant correlation between the covariates and the organic carbon dependent variable was found. Moreover, investigating the contribution of the local watershed-scale dataset to the regional European-scale dataset was an important challenge.
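
    The regression half of regression-kriging can be sketched as an ordinary least-squares trend on environmental covariates; the residuals of that trend would then be kriged and added back to the prediction. The covariates and SOC values below are hypothetical, and the kriging step is omitted for brevity.

    ```python
    import numpy as np

    def regression_kriging_trend(X_obs, z_obs, X_new):
        """Trend component of regression-kriging: OLS fit of the target
        variable on covariates. Returns trend predictions at new sites
        and the residuals at observed sites (which would be kriged)."""
        A = np.column_stack([np.ones(len(X_obs)), X_obs])
        beta, *_ = np.linalg.lstsq(A, z_obs, rcond=None)
        A_new = np.column_stack([np.ones(len(X_new)), X_new])
        return A_new @ beta, z_obs - A @ beta

    # Hypothetical covariates: [elevation (m), mean annual temperature (C)].
    X = np.array([[100.0, 10.0], [200.0, 9.0], [300.0, 8.0], [400.0, 7.0]])
    z = np.array([2.0, 2.5, 3.0, 3.5])  # SOC %, rising with elevation here
    trend, resid = regression_kriging_trend(X, z, X)
    ```

    With real data the residuals are not zero, and kriging them captures the spatially correlated part of the SOC signal that the covariates miss.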

  13. An interactive Bayesian geostatistical inverse protocol for hydraulic tomography

    USGS Publications Warehouse

    Fienen, Michael N.; Clemo, Tom; Kitanidis, Peter K.

    2008-01-01

    Hydraulic tomography is a powerful technique for characterizing heterogeneous hydrogeologic parameters. An explicit trade-off between characterization based on measurement misfit and subjective characterization using prior information is presented. We apply a Bayesian geostatistical inverse approach that is well suited to accommodate a flexible model with the level of complexity driven by the data and explicitly considering uncertainty. Prior information is incorporated through the selection of a parameter covariance model characterizing continuity and providing stability. Often, discontinuities in the parameter field, typically caused by geologic contacts between contrasting lithologic units, necessitate subdivision into zones across which there is no correlation among hydraulic parameters. We propose an interactive protocol in which zonation candidates are implied from the data and are evaluated using cross validation and expert knowledge. Uncertainty introduced by limited knowledge of dynamic regional conditions is mitigated by using drawdown rather than native head values. An adjoint state formulation of MODFLOW-2000 is used to calculate sensitivities which are used both for the solution to the inverse problem and to guide protocol decisions. The protocol is tested using synthetic two-dimensional steady state examples in which the wells are located at the edge of the region of interest.

  14. Effect of mixing on oxygen reduction, denitrification, and isotopic fractionation in a heterogeneous aquifer in an agricultural setting.

    NASA Astrophysics Data System (ADS)

    Green, C. T.; Bekins, B. A.

    2007-12-01

    Large fluxes of nitrate in agricultural settings threaten water quality worldwide. In groundwater, the transport and fate of anthropogenic nitrate can be influenced by denitrification, which occurs under anaerobic conditions. It is difficult to characterize denitrification because (1) waters from multiple flow paths commingle in the aquifer due to dispersion and (2) a sample typically includes water from multiple lithologies. For a site near Merced, California, geostatistical simulations, flow modeling, and backward random walk particle tracking were applied to simulate the influence of mixing on apparent reaction rates, redox boundaries, and fractionation of stable isotopes of nitrogen. Results show that dispersion has a strong effect on the distribution of redox indicators. When reactions are rapid, mixing can create the appearance of gradual redox transitions that mask the actual reaction rates and the degree of isotopic fractionation.
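
    The particle-tracking component mentioned above can be sketched in one dimension. The study used backward tracking to trace sampled water to its sources; the forward analogue below, for advection-dispersion with velocity v and dispersion coefficient D, shows the same random-walk step. All parameters are illustrative.

    ```python
    import numpy as np

    def random_walk_particles(n_particles, n_steps, v=1.0, D=0.1,
                              dt=0.1, seed=0):
        """Forward random-walk particle tracking for 1-D advection-
        dispersion: each step moves a particle by v*dt plus a random
        displacement sqrt(2*D*dt)*xi with xi ~ N(0, 1)."""
        rng = np.random.default_rng(seed)
        x = np.zeros(n_particles)
        for _ in range(n_steps):
            x += v * dt + np.sqrt(2.0 * D * dt) * rng.normal(size=n_particles)
        return x

    # After t = n_steps*dt = 10, the plume center is near v*t = 10 and
    # its variance near 2*D*t = 2, matching the analytical solution.
    x = random_walk_particles(20_000, 100)
    ```

    In the heterogeneous-aquifer setting, v and D vary in space (drawn from the geostatistical realizations), which is how mixing across lithologies enters the simulated redox and isotope signals.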

  15. How effective is an in-hospital heart failure self-care program in a Japanese setting? Lessons from a randomized controlled pilot study

    PubMed Central

    Kato, Naoko P; Kinugawa, Koichiro; Sano, Miho; Kogure, Asuka; Sakuragi, Fumika; Kobukata, Kihoko; Ohtsu, Hiroshi; Wakita, Sanae; Jaarsma, Tiny; Kazuma, Keiko

    2016-01-01

    Background Although the effectiveness of heart failure (HF) disease management programs has been established in Western countries, to date there have been no such programs in Japan. These programs may have different effectiveness due to differences in health care organization and possible cultural differences with regard to self-care. Therefore, the purpose of this study was to evaluate the effectiveness of a pilot HF program in a Japanese setting. Methods We developed an HF program focused on enhancing patient self-care before hospital discharge. Patients were randomized 1:1 to receive the new HF program or usual care. The primary outcome was self-care behavior as assessed by the European Heart Failure Self-Care Behavior Scale (EHFScBS). Secondary outcomes included HF knowledge and the 2-year rate of HF hospitalization and/or cardiac death. Results A total of 32 patients were enrolled (mean age, 63 years; 31% female). There was no difference in the total score of the EHFScBS between the two groups. One specific behavior score regarding a low-salt diet significantly improved compared with baseline in the intervention group. HF knowledge in the intervention group tended to improve more over 6 months than in the control group (a group-by-time effect, F=2.47, P=0.098). During a 2-year follow-up, the HF program was related to better outcomes regarding HF hospitalization and/or cardiac death (14% vs 48%, log-rank test P=0.04). In Cox regression analysis after adjustment for age, sex, and logarithmic of B-type natriuretic peptide, the program was associated with a reduction in HF hospitalization and/or cardiac death (hazard ratio, 0.17; 95% confidence interval, 0.03–0.90; P=0.04). Conclusion The HF program was likely to increase patients’ HF knowledge, change their behavior regarding a low-salt diet, and reduce HF hospitalization and/or cardiac events. Further improvement focused on the transition of knowledge to self-care behavior is necessary. PMID:26937177

  16. A multiple-point geostatistical method for characterizing uncertainty of subsurface alluvial units and its effects on flow and transport

    USGS Publications Warehouse

    Cronkite-Ratcliff, C.; Phelps, G.A.; Boucher, A.

    2012-01-01

    This report provides a proof-of-concept to demonstrate the potential application of multiple-point geostatistics for characterizing geologic heterogeneity and its effect on flow and transport simulation. The study presented in this report is the result of collaboration between the U.S. Geological Survey (USGS) and Stanford University. This collaboration focused on improving the characterization of alluvial deposits by incorporating prior knowledge of geologic structure and estimating the uncertainty of the modeled geologic units. In this study, geologic heterogeneity of alluvial units is characterized as a set of stochastic realizations, and uncertainty is indicated by variability in the results of flow and transport simulations for this set of realizations. This approach is tested on a hypothetical geologic scenario developed using data from the alluvial deposits in Yucca Flat, Nevada. Yucca Flat was chosen as a data source for this test case because it includes both complex geologic and hydrologic characteristics and also contains a substantial amount of both surface and subsurface geologic data. Multiple-point geostatistics is used to model geologic heterogeneity in the subsurface. A three-dimensional (3D) model of spatial variability is developed by integrating alluvial units mapped at the surface with vertical drill-hole data. The SNESIM (Single Normal Equation Simulation) algorithm is used to represent geologic heterogeneity stochastically by generating 20 realizations, each of which represents an equally probable geologic scenario. A 3D numerical model is used to simulate groundwater flow and contaminant transport for each realization, producing a distribution of flow and transport responses to the geologic heterogeneity. From this distribution of flow and transport responses, the frequency of exceeding a given contaminant concentration threshold can be used as an indicator of uncertainty about the location of the contaminant plume boundary.

  17. Examining the spatial distribution of flower thrips in southern highbush blueberries by utilizing geostatistical methods.

    PubMed

    Rhodes, Elena M; Liburd, Oscar E; Grunwald, Sabine

    2011-08-01

    Flower thrips (Frankliniella spp.) are one of the key pests of southern highbush blueberries (Vaccinium corymbosum L. x V. darrowii Camp), a high-value crop in Florida. Thrips' feeding and oviposition injury to flowers can result in fruit scarring that renders the fruit unmarketable. Flower thrips often form areas of high population, termed "hot spots", in blueberry plantings. The objective of this study was to model thrips spatial distribution patterns with geostatistical techniques. Semivariogram models were used to determine optimum trap spacing and two commonly used interpolation methods, inverse distance weighting (IDW) and ordinary kriging (OK), were compared for their ability to model thrips spatial patterns. The experimental design consisted of a grid of 100 white sticky traps spaced at 15.24-m and 7.61-m intervals in 2008 and 2009, respectively. Thirty additional traps were placed randomly throughout the sampling area to collect information on distances shorter than the grid spacing. The semivariogram analysis indicated that, in most cases, spacing traps at least 28.8 m apart would result in spatially independent samples. Also, the 7.61-m grid spacing captured more of the thrips spatial variability than the 15.24-m grid spacing. IDW and OK produced maps with similar accuracy in both years, which indicates that thrips spatial distribution patterns, including "hot spots," can be modeled using either interpolation method. Future studies can use this information to determine if the formation of "hot spots" can be predicted using flower density, temperature, and other environmental factors. If so, this development would allow growers to spot treat the "hot spots" rather than their entire field. PMID:22251691
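
    The link between a fitted semivariogram range and spatially independent trap spacing can be illustrated with a spherical model, which reaches its sill exactly at the range. The 28.8-m value is the range reported above; the nugget and sill below are invented.

    ```python
    import numpy as np

    def spherical_semivariogram(h, nugget, sill, range_):
        """Spherical semivariogram model: rises from the nugget and hits
        the sill exactly at the range, beyond which pairs of samples are
        treated as spatially independent."""
        h = np.asarray(h, dtype=float)
        r = h / range_
        inside = nugget + (sill - nugget) * (1.5 * r - 0.5 * r ** 3)
        return np.where(h < range_, inside, sill)

    # The two grid spacings used in the study, plus the reported range:
    # traps spaced at or beyond 28.8 m give independent counts.
    lags = np.array([7.61, 15.24, 28.8, 45.0])
    gamma = spherical_semivariogram(lags, nugget=0.1, sill=1.0, range_=28.8)
    ```

    The finer 7.61-m spacing sits well below the range, which is why it captured more of the thrips' spatial variability than the 15.24-m grid.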

  18. Massively Parallel Geostatistical Inversion of Coupled Processes in Heterogeneous Porous Media

    NASA Astrophysics Data System (ADS)

    Ngo, A.; Schwede, R. L.; Li, W.; Bastian, P.; Ippisch, O.; Cirpka, O. A.

    2012-04-01

    The quasi-linear geostatistical approach is an inversion scheme that can be used to estimate the spatial distribution of a heterogeneous hydraulic conductivity field. The estimated parameter field is considered to be a random variable that varies continuously in space, meets the measurements of dependent quantities (such as the hydraulic head, the concentration of a transported solute or its arrival time) and shows the required spatial correlation (described by certain variogram models). This is a method of conditioning a parameter field to observations. Upon discretization, this results in as many parameters as elements of the computational grid. For a full three-dimensional representation of the heterogeneous subsurface, the resolutions achievable on a serial computer (up to one million parameters) are hardly sufficient. The forward problems to be solved within the inversion procedure consist of the elliptic steady-state groundwater flow equation and the formally elliptic but nearly hyperbolic steady-state advection-dominated solute transport equation in a heterogeneous porous medium. Both equations are discretized by Finite Element Methods (FEM) using fully scalable domain decomposition techniques. Whereas standard conforming FEM is sufficient for the flow equation, for the advection-dominated transport equation, which raises well-known numerical difficulties at sharp fronts or boundary layers, we use the streamline diffusion approach. The arising linear systems are solved using efficient iterative solvers with an AMG (algebraic multigrid) pre-conditioner. During each iteration step of the inversion scheme one needs to solve a multitude of forward and adjoint problems in order to calculate the sensitivities of each measurement and the related cross-covariance matrix of the unknown parameters and the observations. In order to reduce interprocess communications and to improve the scalability of the code on larger clusters

  19. A geostatistical approach to mapping site response spectral amplifications

    USGS Publications Warehouse

    Thompson, E.M.; Baise, L.G.; Kayen, R.E.; Tanaka, Y.; Tanaka, H.

    2010-01-01

    If quantitative estimates of the seismic properties do not exist at a location of interest, then the site response spectral amplifications must be estimated from data collected at other locations. Currently, the most common approach employs correlations of site class with maps of surficial geology. Analogously, correlations of site class with topographic slope can be employed where the surficial geology is unknown. Our goal is to identify and validate a method to estimate site response with greater spatial resolution and accuracy for regions where additional effort is warranted. This method consists of three components: region-specific data collection, a spatial model for interpolating seismic properties, and a theoretical method for computing spectral amplifications from the interpolated seismic properties. We consider three spatial interpolation schemes: correlations with surficial geology, termed the geologic trend (GT), ordinary kriging (OK), and kriging with a trend (KT). We estimate the spectral amplifications from seismic properties using the square root of impedance method, thereby linking the frequency-dependent spectral amplifications to the depth-dependent seismic properties. Thus, the range of periods for which this method is applicable is limited by the depth of exploration. A dense survey of near-surface S-wave slowness (Ss) throughout Kobe, Japan shows that the geostatistical methods give more accurate estimates of Ss than the topographic slope and GT methods, and the OK and KT methods perform equally well. We prefer the KT model because it can be seamlessly integrated with geologic maps that cover larger regions. Empirical spectral amplifications show that the region-specific data achieve more accurate estimates of observed median short-period amplifications than the topographic slope method. © 2010 Elsevier B.V.
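
    The square-root-of-impedance amplification can be written as A = sqrt(Z_ref / Z_site), where impedance Z is density times shear-wave velocity: lower-impedance (softer, slower) site materials amplify ground motion relative to the reference rock. The densities and velocities below are hypothetical round numbers, not values from the study.

    ```python
    import math

    def sqrt_impedance_amplification(rho_ref, v_ref, rho_site, v_site):
        """Square-root-of-impedance site amplification:
        A = sqrt((rho_ref * v_ref) / (rho_site * v_site))."""
        return math.sqrt((rho_ref * v_ref) / (rho_site * v_site))

    # Hypothetical rock reference (2500 kg/m^3, 1500 m/s) versus a
    # soft-soil site (1800 kg/m^3, 200 m/s).
    amp = sqrt_impedance_amplification(rho_ref=2500.0, v_ref=1500.0,
                                       rho_site=1800.0, v_site=200.0)
    ```

    Because the site velocity enters through a depth-averaged slowness profile, the usable period range of such estimates is limited by the exploration depth, as the abstract notes.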

  20. Bayesian geostatistical modeling of Malaria Indicator Survey data in Angola.

    PubMed

    Gosoniu, Laura; Veta, Andre Mia; Vounatsou, Penelope

    2010-01-01

    The 2006-2007 Angola Malaria Indicator Survey (AMIS) is the first nationally representative household survey in the country assessing coverage of the key malaria control interventions and measuring malaria-related burden among children under 5 years of age. In this paper, the Angolan MIS data were analyzed to produce the first smooth map of parasitaemia prevalence based on contemporary nationwide empirical data in the country. Bayesian geostatistical models were fitted to assess the effect of interventions after adjusting for environmental, climatic and socio-economic factors. Non-linear relationships between parasitaemia risk and environmental predictors were modeled by categorizing the covariates and by employing two non-parametric approaches, the B-splines and the P-splines. The results of the model validation showed that the categorical model was able to better capture the relationship between parasitaemia prevalence and the environmental factors. Model fit and prediction were handled within a Bayesian framework using Markov chain Monte Carlo (MCMC) simulations. Combining estimates of parasitaemia prevalence with the number of children under 5, we obtained estimates of the number of infected children in the country. The population-adjusted prevalence ranges from in Namibe province to in Malanje province. The odds of parasitaemia in children living in a household with at least ITNs per person was 41% lower (CI: 14%, 60%) than in those with fewer ITNs. The estimates of the number of parasitaemic children produced in this paper are important for planning and implementing malaria control interventions and for monitoring the impact of prevention and control activities. PMID:20351775

  1. Geostatistical inspired metamodeling and optimization of nanoscale analog circuits

    NASA Astrophysics Data System (ADS)

    Okobiah, Oghenekarho

    The current trend towards miniaturization of modern consumer electronic devices significantly affects their design. The demand for efficient all-in-one appliances leads to smaller, yet more complex and powerful nanoelectronic devices. The increasing complexity in the design of such nanoscale Analog/Mixed-Signal Systems-on-Chip (AMS-SoCs) presents difficult challenges to designers. One promising design method used to mitigate the burden of this design effort is the use of metamodeling (surrogate modeling) techniques. Their use significantly reduces the time for computer simulation and design space exploration and optimization. This dissertation addresses several issues of metamodeling-based nanoelectronic AMS design exploration. A surrogate modeling technique which uses geostatistical Kriging prediction methods in creating metamodels is proposed. Kriging prediction techniques take into account the correlation effects between input parameters for performance point prediction. We propose the use of Kriging to utilize this property for the accurate modeling of process variation effects of designs in the deep nanometer region. Different Kriging methods have been explored for this work, such as simple and ordinary Kriging. We also propose another metamodeling technique, Kriging-Bootstrapped Neural Network, which combines the accuracy and process variation awareness of Kriging with artificial neural network models for ultra-fast and accurate process-aware metamodeling design. The proposed methodologies combine Kriging metamodels with selected algorithms for ultra-fast layout optimization. The selected algorithms explored are: Gravitational Search Algorithm (GSA), Simulated Annealing Optimization (SAO), and Ant Colony Optimization (ACO). Experimental results demonstrate that the proposed Kriging metamodel based methodologies can perform the optimizations with minimal computational burden compared to traditional (SPICE-based) design flows.

  2. Bayesian Geostatistical Modeling of Malaria Indicator Survey Data in Angola

    PubMed Central

    Gosoniu, Laura; Veta, Andre Mia; Vounatsou, Penelope

    2010-01-01

    The 2006–2007 Angola Malaria Indicator Survey (AMIS) is the first nationally representative household survey in the country assessing coverage of the key malaria control interventions and measuring malaria-related burden among children under 5 years of age. In this paper, the Angolan MIS data were analyzed to produce the first smooth map of parasitaemia prevalence based on contemporary nationwide empirical data in the country. Bayesian geostatistical models were fitted to assess the effect of interventions after adjusting for environmental, climatic and socio-economic factors. Non-linear relationships between parasitaemia risk and environmental predictors were modeled by categorizing the covariates and by employing two non-parametric approaches, B-splines and P-splines. The results of the model validation showed that the categorical model was better able to capture the relationship between parasitaemia prevalence and the environmental factors. Model fit and prediction were handled within a Bayesian framework using Markov chain Monte Carlo (MCMC) simulations. Combining estimates of parasitaemia prevalence with the number of children under 5 years of age, we obtained estimates of the number of infected children in the country. The population-adjusted prevalence ranges from in Namibe province to in Malanje province. The odds of parasitaemia in children living in a household with at least ITNs per person were 41% lower (CI: 14%, 60%) than in those with fewer ITNs. The estimates of the number of parasitaemic children produced in this paper are important for planning and implementing malaria control interventions and for monitoring the impact of prevention and control activities. PMID:20351775

  3. Use of geostatistics for remediation planning to transcend urban political boundaries.

    PubMed

    Milillo, Tammy M; Sinha, Gaurav; Gardella, Joseph A

    2012-11-01

    Soil remediation plans are often dictated by areas of jurisdiction or property lines instead of scientific information. This study exemplifies how geostatistically interpolated surfaces can substantially improve remediation planning. Ordinary kriging, ordinary co-kriging, and inverse distance weighting spatial interpolation methods were compared for analyzing surface and sub-surface soil sample data originally collected by the US EPA and researchers at the University at Buffalo in Hickory Woods, an industrial-residential neighborhood in Buffalo, NY, where both lead and arsenic contamination is present. Past clean-up efforts estimated contamination levels from point samples, but parcel and agency jurisdiction boundaries were used to define remediation sites, rather than geostatistical models estimating the spatial behavior of the contaminants in the soil. Residents were understandably dissatisfied with the arbitrariness of the remediation plan. In this study we show how geostatistical mapping and participatory assessment can make soil remediation scientifically defensible, socially acceptable, and economically feasible. PMID:22771352

  4. Use of geostatistical modeling to capture complex geology in finite-element analyses

    SciTech Connect

    Rautman, C.A.; Longenbaugh, R.S.; Ryder, E.E.

    1995-12-01

    This paper summarizes a number of transient thermal analyses performed for a representative two-dimensional cross section of volcanic tuffs at Yucca Mountain using the finite-element, nonlinear heat-conduction code COYOTE-II. In addition to conventional design analyses, in which material properties are formulated as a uniform single material and as horizontally layered, internally uniform materials, an attempt was made to increase the resemblance of the thermal property field to the actual geology by creating two fairly complex, geologically realistic models. The first model was created by digitizing an existing two-dimensional geologic cross section of Yucca Mountain. The second model was created using conditional geostatistical simulation. Direct mapping of geostatistically generated material property fields onto finite-element computational meshes was demonstrated to yield temperature fields approximately equivalent to those generated through more conventional procedures. However, the ability to use the geostatistical models offers a means of simplifying the physical-process analyses.

  5. Geostatistical analysis of the spatial variation of the ash reserves in the litter of bog birch forests in Western Siberia

    NASA Astrophysics Data System (ADS)

    Efremova, T. T.; Sekretenko, O. P.; Avrova, A. F.; Efremov, S. P.

    2013-01-01

    A typological series of native Betula pubescens Ehrh. dendrocenoses along the channel of a river crossing a bog was studied. The variability of the mineral element reserves is described by geostatistical methods as the sum of trend, autocorrelation, and random components. The contributions of the deterministic and random components were assessed in years with average precipitation and in 2007, a year of high and prolonged flooding. The empirical variograms and the parameters of the model variograms are presented, and the class of the spatial correlation of the ash reserves is described. A primary cause of the variability of the ash content is the specific water regime, which is determined by the following: (i) the abundance and duration of the spring floods, responsible for the silt mass brought by the river, and (ii) the draining effect of the intrabog river, the distance from which governed the formation of forest ground cover with a specific species composition and ash content. The fall of material from the arboreal layer in the bog birch forests formed the fundamental mineral background of the litter.

  6. Geostatistical analysis of groundwater level using Euclidean and non-Euclidean distance metrics and variable variogram fitting criteria

    NASA Astrophysics Data System (ADS)

    Theodoridou, Panagiota G.; Karatzas, George P.; Varouchakis, Emmanouil A.; Corzo Perez, Gerald A.

    2015-04-01

    Groundwater level is important information in hydrological modelling. Geostatistical methods are often employed to map the free surface of an aquifer. In geostatistical analysis using Kriging techniques, the selection of the optimal variogram model is very important for method performance. This work compares three different criteria, the least squares sum method, the Akaike Information Criterion and Cressie's indicator, to assess the theoretical variogram that fits the experimental one, and investigates the impact on the prediction results. Moreover, five different distance functions (Euclidean, Minkowski, Manhattan, Canberra, and Bray-Curtis) are applied to calculate the distance between observations, which affects both the variogram calculation and the Kriging estimator. Cross-validation analysis in terms of Ordinary Kriging is applied by sequentially using a different distance metric and the above three variogram fitting criteria. The spatial dependence of the observations in the tested dataset is studied by fitting classical variogram models and the Matérn model. The proposed comparison analysis was performed for a data set of two hundred fifty hydraulic head measurements distributed over an alluvial aquifer that covers an area of 210 km2. The study area is located in the Prefecture of Drama, which belongs to the Water District of East Macedonia (Greece). This area was selected in terms of hydro-geological data availability and geological homogeneity. The analysis showed that the combination of the Akaike Information Criterion for the variogram fitting assessment and the Bray-Curtis distance metric provided the most accurate cross-validation results. The power-law variogram model provided the best fit to the experimental data. For this dataset, the aforementioned approach in terms of the Ordinary Kriging method improves the prediction efficiency in comparison to the classical Euclidean distance metric. Therefore, maps of the spatial
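
    The experimental variogram that the criteria above are fitted against can be computed under any of the listed distance functions. The following toy sketch (not the paper's code) implements the classical Matheron estimator with a pluggable Minkowski distance, where p=2 gives the Euclidean metric and p=1 the Manhattan metric; lag bins and tolerance are hypothetical choices.

```python
import numpy as np

def minkowski_dist(a, b, p=2.0):
    """Minkowski distance of order p; p=2 is Euclidean, p=1 is Manhattan."""
    return np.sum(np.abs(a - b) ** p) ** (1.0 / p)

def experimental_variogram(X, z, lags, tol, p=2.0):
    """Matheron estimator: half the mean squared difference of all
    observation pairs whose separation falls within tol of each lag."""
    n = len(X)
    d, sq = [], []
    for i in range(n):
        for j in range(i + 1, n):
            d.append(minkowski_dist(X[i], X[j], p))
            sq.append((z[i] - z[j]) ** 2)
    d, sq = np.array(d), np.array(sq)
    gamma = []
    for h in lags:
        mask = np.abs(d - h) <= tol
        gamma.append(0.5 * sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# toy usage: hydraulic heads at random locations, Manhattan distance
rng = np.random.default_rng(0)
X = rng.random((30, 2))
z = rng.random(30)
print(experimental_variogram(X, z, lags=[0.2, 0.5], tol=0.2, p=1.0))
```

Swapping `p` (or the distance function entirely) changes which pairs land in each lag bin, which is exactly why the metric choice propagates into both the fitted variogram and the Kriging estimator.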

  7. Systematic evaluation of sequential geostatistical resampling within MCMC for posterior sampling of near-surface geophysical inverse problems

    NASA Astrophysics Data System (ADS)

    Ruggeri, Paolo; Irving, James; Holliger, Klaus

    2015-08-01

    We critically examine the performance of sequential geostatistical resampling (SGR) as a model proposal mechanism for Bayesian Markov-chain-Monte-Carlo (MCMC) solutions to near-surface geophysical inverse problems. Focusing on a series of simple yet realistic synthetic crosshole georadar tomographic examples characterized by different numbers of data, levels of data error and degrees of model parameter spatial correlation, we investigate the efficiency of three different resampling strategies with regard to their ability to generate statistically independent realizations from the Bayesian posterior distribution. Quite importantly, our results show that, no matter what resampling strategy is employed, many of the examined test cases require an unreasonably high number of forward model runs to produce independent posterior samples, meaning that the SGR approach as currently implemented will not be computationally feasible for a wide range of problems. Although use of a novel gradual-deformation-based proposal method can help to alleviate these issues, it does not offer a full solution. Further, the nature of the SGR is found to strongly influence MCMC performance; however, no clear rule exists as to what set of inversion parameters and/or overall proposal acceptance rate will allow for the most efficient implementation. We conclude that although the SGR methodology is highly attractive, as it allows for the consideration of complex geostatistical priors as well as conditioning to hard and soft data, further developments are necessary in the context of novel or hybrid MCMC approaches for it to be considered generally suitable for near-surface geophysical inversions.

  8. UNCERT: geostatistics, uncertainty analysis and visualization software applied to groundwater flow and contaminant transport modeling

    NASA Astrophysics Data System (ADS)

    Wingle, William L.; Poeter, Eileen P.; McKenna, Sean A.

    1999-05-01

    UNCERT is a 2D and 3D geostatistics, uncertainty analysis and visualization software package applied to ground water flow and contaminant transport modeling. It is a collection of modules that provides tools for linear regression, univariate statistics, semivariogram analysis, inverse-distance gridding, trend-surface analysis, simple and ordinary kriging and discrete conditional indicator simulation. Graphical user interfaces for MODFLOW and MT3D, ground water flow and contaminant transport models, are provided for streamlined data input and result analysis. Visualization tools are included for displaying data input and output. These include, but are not limited to, 2D and 3D scatter plots, histograms, box and whisker plots, 2D contour maps, surface renderings of 2D gridded data and 3D views of gridded data. By design, UNCERT's graphical user interface and visualization tools facilitate model design and analysis. There are few built-in restrictions on data set sizes and each module (with two exceptions) can be run in either graphical or batch mode. UNCERT is in the public domain and is available from the World Wide Web with complete on-line and printable (PDF) documentation. UNCERT is written in ANSI-C with a small amount of FORTRAN77, for UNIX workstations running X-Windows and Motif (or Lesstif). This article discusses the features of each module and demonstrates how they can be used individually and in combination. The tools are applicable to a wide range of fields and are currently used by researchers in ground water, mining, mathematics, chemistry and geophysics, among other disciplines.
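
    Inverse-distance gridding, one of the UNCERT interpolation modules listed above, is simple enough to sketch in full. This is a generic illustration (not UNCERT's source): each grid node receives a weighted mean of the samples, with weights proportional to 1/d^power; the `eps` floor is a hypothetical guard against division by zero at sample locations.

```python
import numpy as np

def idw(X, z, grid, power=2.0, eps=1e-12):
    """Inverse-distance-weighted gridding: value at each grid node is the
    mean of the samples z weighted by 1/d**power."""
    d = np.linalg.norm(grid[:, None, :] - X[None, :, :], axis=-1)  # (nodes, samples)
    w = 1.0 / np.maximum(d, eps) ** power
    return (w @ z) / w.sum(axis=1)

# toy usage: two samples, two grid nodes
X = np.array([[0.0, 0.0], [1.0, 1.0]])
z = np.array([10.0, 20.0])
grid = np.array([[0.0, 0.0], [0.5, 0.5]])
print(idw(X, z, grid))   # honours the datum at the first node, averages midway
```

Because the result is a convex combination of the data, IDW never extrapolates outside the sample range, which is one reason kriging (with its variogram model) is often preferred when spatial structure matters.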

  9. Error modeling based on geostatistics for uncertainty analysis in crop mapping using Gaofen-1 multispectral imagery

    NASA Astrophysics Data System (ADS)

    You, Jiong; Pei, Zhiyuan

    2015-01-01

    With the development of remote sensing technology, its applications in agricultural monitoring systems, crop mapping accuracy, and spatial distribution are being explored more and more by administrators and users. Uncertainty in crop mapping is profoundly affected by the spatial pattern of spectral reflectance values obtained from the applied remote sensing data. Errors in remotely sensed crop cover information, and their propagation into derivative products, need to be quantified and handled correctly. Therefore, this study discusses methods of error modeling for uncertainty characterization in crop mapping using GF-1 multispectral imagery. An error modeling framework based on geostatistics is proposed, which introduces the sequential Gaussian simulation algorithm to explore the relationship between classification errors and the spectral signature from the remote sensing data source. On this basis, a misclassification probability model is developed to produce a spatially explicit classification error probability surface for the map of a crop, which realizes the uncertainty characterization for crop mapping. In this process, trend surface analysis was carried out to generate a spatially varying mean response and the corresponding residual response with spatial variation for the spectral bands of GF-1 multispectral imagery. Variogram models were employed to measure the spatial dependence in the spectral bands and the derived misclassification probability surfaces. Simulated spectral data and classification results were quantitatively analyzed. Through experiments using data sets from a region in the low rolling country of the Yangtze River valley, it was found that GF-1 multispectral imagery can be used for crop mapping with good overall performance, the proposed error modeling framework can be used to quantify the uncertainty in crop mapping, and the misclassification probability model can summarize the spatial variation in map accuracy and is helpful for
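
    The core mechanic of the sequential Gaussian simulation mentioned above can be illustrated in one dimension. This is a hypothetical sketch, not the paper's implementation: it assumes a zero-mean field with a unit-sill exponential covariance, visits grid nodes in random order, and draws each value from the normal distribution whose mean and variance come from simple kriging on the data plus previously simulated nodes.

```python
import numpy as np

def sgs_1d(grid_x, cond_x, cond_z, rng_len=5.0, seed=0):
    """Toy 1-D sequential Gaussian simulation conditioned on (cond_x, cond_z)."""
    rng = np.random.default_rng(seed)
    cov = lambda h: np.exp(-np.abs(h) / rng_len)   # unit-sill exponential covariance
    known_x = list(cond_x)
    known_z = list(cond_z)
    out = {}
    for x in rng.permutation(grid_x):              # random visiting order
        kx = np.array(known_x)
        C = cov(kx[:, None] - kx[None, :])
        c0 = cov(kx - x)
        w = np.linalg.solve(C, c0)                 # simple-kriging weights
        mean = w @ np.array(known_z)               # zero-mean prior
        var = max(1.0 - w @ c0, 0.0)               # simple-kriging variance
        z = rng.normal(mean, np.sqrt(var))         # draw from the conditional normal
        known_x.append(x)                          # simulated node becomes data
        known_z.append(z)
        out[x] = z
    return np.array([out[x] for x in grid_x])

# toy usage: one conditioning datum (z=2 at x=0), two grid nodes
print(sgs_1d([1e-6, 50.0], [0.0], [2.0]))
```

Nodes adjacent to a datum reproduce it almost exactly (near-zero kriging variance), while distant nodes revert to draws from the unconditional distribution, which is what makes repeated runs usable as equally probable error realizations.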

  10. Geostatistical analysis of soil geochemical data from an industrial area (Puertollano, South-Central Spain).

    NASA Astrophysics Data System (ADS)

    Esbrí, José M.; Higueras, Pablo; López-Berdonces, Miguel A.; García-Noguero, Eva M.; González-Corrochano, Beatriz; Fernández-Calderón, Sergio; Martínez-Coronado, Alba

    2015-04-01

    Puertollano is the biggest industrial city of Castilla-La Mancha, with 48,086 inhabitants. It is located 250 km south of Madrid, on the northern border of the Ojailén River valley. The industrial area includes a large open-pit coal mine (ENCASUR), two power plants (EON and ELCOGAS), a petrochemical complex (REPSOL) and a fertiliser factory (ENFERSA), all located in the proximity of the town. Together, these industries present a complex scenario in terms of metal and metalloid emissions. For instance, mercury emissions declared to the PRTR inventory during 2010 were 210 kg year-1 (REPSOL), 130 kg year-1 (ELCOGAS) and 11.9 kg year-1 (EON). In addition, diffuse sources of other potentially toxic elements from the different industrial sites may remain unaccounted for. Multielemental analyses of soils from two different depths covering the whole valley were carried out by means of XRF with a portable Oxford Instruments device. Geostatistical data treatment was performed using SURFER software, applying block kriging to obtain interpolation maps for the study area. Semivariograms of elemental concentrations make a clear distinction between volatile (Hg, Se) and non-volatile elements (Cu, Ni), with differences in scales and variances between the two soil horizons considered. Semivariograms also show different models for elements emitted by combustion processes (Ni) and for anomalous elements from the geological substrate (Pb, Zn). In addition to differences in the anisotropy of the data, these models reflect different forms of elemental dispersion; despite this, identification of particular sources for the different elements is not possible for this geochemical data set.

  11. Quantifying the Relationship between Dynamical Cores and Physical Parameterizations by Geostatistical Methods

    NASA Astrophysics Data System (ADS)

    Yorgun, M. S.; Rood, R. B.

    2010-12-01

    The behavior of atmospheric models is sensitive to the algorithms that are used to represent the equations of motion. Typically, comprehensive models are conceived in terms of the resolved fluid dynamics (i.e. the dynamical core) and subgrid, unresolved physics represented by parameterizations. Deterministic weather predictions are often validated with feature-by-feature comparison. Probabilistic weather forecasts and climate projections are evaluated with statistical methods. We seek to develop model evaluation strategies that identify “objects”: coherent systems with an associated set of measurable parameters. This makes it possible to evaluate processes in models without needing to reproduce the time and location of, for example, a particular observed cloud system. Process- and object-based evaluation preserves information in the observations by avoiding the need for extensive spatial and temporal averaging. As a concrete example, we focus on analyzing how the choice of dynamical core impacts the representation of precipitation in the Pacific Northwest of the United States, Western Canada, and Alaska; this brings attention to the interaction of the resolved and the parameterized components of the model. Two dynamical cores are considered within the Community Atmosphere Model. These are the Spectral (Eulerian), which relies on global basis functions, and the Finite Volume (FV), which uses only local information. We introduce the concept of "meteorological realism": that is, do local representations of large-scale phenomena, for example fronts and orographic precipitation, look like the observations? A follow-on question is, does the representation of these phenomena improve with resolution? Our approach to quantifying meteorological realism starts with methods of geospatial statistics. Specifically, we employ variography, a geostatistical method used to measure the spatial continuity of a regionalized variable, and principal component

  12. Bayesian Geostatistical Analysis and Prediction of Rhodesian Human African Trypanosomiasis

    PubMed Central

    Wardrop, Nicola A.; Atkinson, Peter M.; Gething, Peter W.; Fèvre, Eric M.; Picozzi, Kim; Kakembo, Abbas S. L.; Welburn, Susan C.

    2010-01-01

    Background The persistent spread of Rhodesian human African trypanosomiasis (HAT) in Uganda in recent years has increased concerns of a potential overlap with the Gambian form of the disease. Recent research has aimed to increase the evidence base for targeting control measures by focusing on the environmental and climatic factors that control the spatial distribution of the disease. Objectives One recent study used simple logistic regression methods to explore the relationship between prevalence of Rhodesian HAT and several social, environmental and climatic variables in two of the most recently affected districts of Uganda, and suggested the disease had spread into the study area due to the movement of infected, untreated livestock. Here we extend this study to account for spatial autocorrelation, incorporate uncertainty in input data and model parameters, and undertake predictive mapping for risk of high HAT prevalence in the future. Materials and Methods We use a spatial analysis in which a generalised linear geostatistical model is applied in a Bayesian framework to account explicitly for spatial autocorrelation and to incorporate uncertainty in input data and model parameters. This more rigorous analytical approach potentially results in more accurate parameter and significance estimates and increased predictive accuracy, allowing an assessment of the validity of the livestock movement hypothesis given more robust parameter estimation and appropriate assessment of covariate effects. Results Analysis strongly supports the theory that Rhodesian HAT was imported to the study area via the movement of untreated, infected livestock from endemic areas. The confounding effect of health care accessibility on the spatial distribution of Rhodesian HAT and the linkages between the disease's distribution and minimum land surface temperature have also been confirmed via the application of these methods. Conclusions Predictive mapping indicates an

  13. Spectral turning bands for efficient Gaussian random fields generation on GPUs and accelerators

    NASA Astrophysics Data System (ADS)

    Hunger, L.; Cosenza, B.; Kimeswenger, S.; Fahringer, T.

    2015-11-01

    A random field (RF) is a set of correlated random variables associated with different spatial locations. RF generation algorithms are of crucial importance for many scientific areas, such as astrophysics, geostatistics, and computer graphics, among others. Current approaches commonly make use of the 3D fast Fourier transform (FFT), which does not scale well for RFs larger than the available memory and is limited to regular rectilinear meshes. We introduce random field generation with the turning band method (RAFT), an RF generation algorithm based on the turning band method that is optimized for massively parallel hardware such as GPUs and accelerators. Our algorithm replaces the 3D FFT with a lower-order, one-dimensional FFT followed by a projection step and is further optimized with loop unrolling and blocking. RAFT can easily generate RFs on non-regular (non-uniform) meshes and efficiently produce fields with mesh sizes bigger than the available device memory by using a streaming, out-of-core approach. Our algorithm generates RFs with the correct statistical behavior and has been tested on a variety of modern hardware, such as NVIDIA Tesla, AMD FirePro and Intel Phi. RAFT is faster than the traditional methods on regular meshes and has been successfully applied to two real case scenarios: planetary nebulae and cosmological simulations.
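
    The spectral/turning-band idea — synthesize the field from harmonics along randomly oriented directions instead of a full 3D FFT — can be sketched in a toy form. This is not RAFT: it is a randomized-spectral construction (random-Fourier-feature style) that assumes a squared-exponential covariance, drawing frequencies from that covariance's spectral density and summing cosines with random phases. Because it evaluates pointwise, it works on arbitrary, non-regular meshes.

```python
import numpy as np

def spectral_field(points, n_lines=1000, length_scale=1.0, seed=0):
    """Approximate a zero-mean, unit-variance Gaussian random field with
    squared-exponential covariance as a sum of cosines along random
    spectral directions (a spectral/turning-bands-style construction)."""
    rng = np.random.default_rng(seed)
    # frequencies drawn from the spectral density of the SE covariance
    W = rng.normal(0.0, 1.0 / length_scale, size=(n_lines, points.shape[1]))
    phi = rng.uniform(0.0, 2.0 * np.pi, size=n_lines)
    return np.sqrt(2.0 / n_lines) * np.cos(points @ W.T + phi).sum(axis=1)

# toy usage: evaluate the field at scattered (irregular) 3D points
pts = np.random.default_rng(1).random((5, 3)) * 10.0
print(spectral_field(pts))
```

Each evaluation point is independent of any grid, and memory scales with the number of retained harmonics rather than with a mesh, which mirrors (in miniature) why line-based methods suit out-of-core, GPU-friendly generation.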

  14. Delivering Prevention Interventions to People Living with HIV in Clinical Care Settings: Results of a Cluster Randomized Trial in Kenya, Namibia, and Tanzania

    PubMed Central

    Kidder, Daniel; Medley, Amy; Pals, Sherri L.; Carpenter, Deborah; Howard, Andrea; Antelman, Gretchen; DeLuca, Nicolas; Muhenje, Odylia; Sheriff, Muhsin; Somi, Geoffrey; Katuta, Frieda; Cherutich, Peter; Moore, Janet

    2016-01-01

    We conducted a group randomized trial to assess the feasibility and effectiveness of a multi-component, clinic-based HIV prevention intervention for HIV-positive patients attending clinical care in Namibia, Kenya, and Tanzania. Eighteen HIV care and treatment clinics (six per country) were randomly assigned to intervention or control arms. Approximately 200 sexually active clients from each clinic were enrolled and interviewed at baseline and 6- and 12-months post-intervention. Mixed model logistic regression with random effects for clinic and participant was used to assess the effectiveness of the intervention. Of 3522 HIV-positive patients enrolled, 3034 (86 %) completed a 12-month follow-up interview. Intervention participants were significantly more likely to report receiving provider-delivered messages on disclosure, partner testing, family planning, alcohol reduction, and consistent condom use compared to participants in comparison clinics. Participants in intervention clinics were less likely to report unprotected sex in the past 2 weeks (OR = 0.56, 95 % CI 0.32, 0.99) compared to participants in comparison clinics. In Tanzania, a higher percentage of participants in intervention clinics (17 %) reported using a highly effective method of contraception compared to participants in comparison clinics (10 %, OR = 2.25, 95 % CI 1.24, 4.10). This effect was not observed in Kenya or Namibia. HIV prevention services are feasible to implement as part of routine care and are associated with a self-reported decrease in unprotected sex. Further operational research is needed to identify strategies to address common operational challenges including staff turnover and large patient volumes. PMID:26995678

  16. Integrated geostatistics for modeling fluid contacts and shales in Prudhoe Bay

    SciTech Connect

    Perez, G.; Chopra, A.K.; Severson, C.D.

    1997-12-01

    Geostatistical techniques are being used increasingly to model reservoir heterogeneity at a wide range of scales. A variety of techniques is now available, with differing underlying assumptions, complexity, and applications. This paper introduces a novel geostatistical method to model dynamic gas-oil contacts and shales in the Prudhoe Bay reservoir. The method integrates reservoir description and surveillance data within the same geostatistical framework. Surveillance logs and shale data are transformed to indicator variables. These variables are used to evaluate the vertical and horizontal spatial correlation and cross-correlation of gas and shale at different times and to develop variogram models. Conditional simulation techniques are used to generate multiple three-dimensional (3D) descriptions of gas and shales that provide a measure of uncertainty. These techniques capture the complex 3D distribution of gas-oil contacts through time. The authors compare results of the geostatistical method with conventional techniques as well as with infill wells drilled after the study. Predicted gas-oil contacts and shale distributions are in close agreement with gas-oil contacts observed at infill wells.
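
    The indicator transform mentioned above is a one-line coding step, and the experimental indicator semivariogram follows directly from it. The sketch below is illustrative only (the cutoff, the synthetic log, and the regular-sampling assumption are all hypothetical, not from the paper):

```python
import numpy as np

def to_indicator(log_values, cutoff):
    """Indicator coding: 1 where the logged attribute meets the cutoff
    (e.g. gas present), 0 elsewhere."""
    return (np.asarray(log_values, dtype=float) >= cutoff).astype(int)

def indicator_semivariance(ind, lag):
    """Experimental indicator semivariance along a regularly sampled log:
    half the mean squared difference between samples `lag` steps apart."""
    a, b = ind[:-lag], ind[lag:]
    return 0.5 * np.mean((a - b) ** 2)

# toy example: a synthetic gas/no-gas column
log = [0.1, 0.2, 0.8, 0.9, 0.85, 0.3, 0.1]
ind = to_indicator(log, cutoff=0.5)   # -> array([0, 0, 1, 1, 1, 0, 0])
print(indicator_semivariance(ind, lag=1), indicator_semivariance(ind, lag=3))
```

The semivariance grows with lag here because nearby samples tend to share the same indicator state; fitting a model to such curves is what supplies the variograms used by the conditional simulation.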

  17. USING GEOSTATISTICS TO UNDERSTAND THE SPATIAL DISTRIBUTION OF SOIL PROPERTIES AND THE FIELD DISSIPATION OF HERBICIDES

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Current research in precision agriculture has focused on solving problems dealing with soil fertility and crop yield. Specifically, geostatistical methods are being used to link the spatial distribution in soil physical and chemical properties to spatial patterns in crop yield. Because soil proper...

  18. Estimating Solar PV Output Using Modern Space/Time Geostatistics (Presentation)

    SciTech Connect

    Lee, S. J.; George, R.; Bush, B.

    2009-04-29

    This presentation describes a project that uses mapping techniques to predict solar output at subhourly resolution at any spatial point, develops a methodology that is applicable to natural resources in general, and demonstrates the capability of geostatistical techniques to predict the output of a potential solar plant.

  19. Combining Geostatistics with Moran’s I Analysis for Mapping Soil Heavy Metals in Beijing, China

    PubMed Central

    Huo, Xiao-Ni; Li, Hong; Sun, Dan-Feng; Zhou, Lian-Di; Li, Bao-Guo

    2012-01-01

    Production of high-quality interpolation maps of heavy metals is important for risk assessment of environmental pollution. In this paper, the spatial correlation information obtained from Moran’s I analysis was used to supplement traditional geostatistics. According to Moran’s I analysis, four characteristic distances were obtained and used as the active lag distance to calculate the semivariance. Validation of the optimality of the semivariance demonstrated that using as the active lag distance the two distances at which Moran’s I and the standardized Moran’s I, Z(I), reached a maximum can improve the fitting accuracy of the semivariance. Spatial interpolation was then produced based on the two distances and their nested model. The comparative analysis of estimation accuracy and of the measured and predicted pollution status showed that the method combining geostatistics with Moran’s I analysis was better than traditional geostatistics. Thus, Moran’s I analysis is a useful complement to geostatistics for improving the spatial interpolation accuracy of heavy metals. PMID:22690179
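
    Global Moran’s I, the statistic the paper combines with the semivariance, is straightforward to compute from a spatial weight matrix. A minimal sketch (not the paper's code; the rook-adjacency weights in the example are a hypothetical toy configuration):

```python
import numpy as np

def morans_i(z, W):
    """Global Moran's I for values z under spatial weight matrix W
    (W[i, j] > 0 when sites i and j are neighbours; zero diagonal)."""
    z = np.asarray(z, dtype=float)
    dev = z - z.mean()                      # deviations from the mean
    n = len(z)
    return n * (dev @ W @ dev) / (W.sum() * (dev @ dev))

# toy usage: rook-adjacency weights for four sites on a line 0-1-2-3
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(morans_i([1, 1, -1, -1], W))   # clustered values: positive autocorrelation
print(morans_i([1, -1, 1, -1], W))   # alternating values: negative autocorrelation
```

Evaluating I over increasing neighbourhood distances, and locating where I (or its standardized form Z(I)) peaks, is the step that yields the characteristic distances used as active lag distances above.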

  20. Combined assimilation of streamflow and satellite soil moisture with the particle filter and geostatistical modeling

    NASA Astrophysics Data System (ADS)

    Yan, Hongxiang; Moradkhani, Hamid

    2016-08-01

    Assimilation of satellite soil moisture and streamflow data into a distributed hydrologic model has received increasing attention over the past few years. This study provides a detailed analysis of the joint and separate assimilation of streamflow and Advanced Scatterometer (ASCAT) surface soil moisture into a distributed Sacramento Soil Moisture Accounting (SAC-SMA) model, with the use of the recently developed particle filter-Markov chain Monte Carlo (PF-MCMC) method. Performance is assessed over the Salt River Watershed in Arizona, which is one of the watersheds without anthropogenic effects in the Model Parameter Estimation Experiment (MOPEX). A total of five data assimilation (DA) scenarios are designed, and the effects of the locations of streamflow gauges and of the ASCAT soil moisture on the predictions of soil moisture and streamflow are assessed. In addition, a geostatistical model is introduced to overcome the significant bias and discontinuity issues in the satellite soil moisture data. The results indicate that: (1) solely assimilating outlet streamflow can lead to biased soil moisture estimation; (2) when the study area can only be partially covered by the satellite data, the geostatistical approach can estimate the soil moisture for the uncovered grid cells; (3) joint assimilation of streamflow and soil moisture from geostatistical modeling can further improve the surface soil moisture prediction. This study recommends the geostatistical model as a helpful tool for aiding the remote sensing technique and hydrologic DA studies.

  1. Adapting geostatistics to analyze spatial and temporal trends in weed populations

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Geostatistics were originally developed in mining to estimate the location, abundance and quality of ore over large areas from soil samples to optimize future mining efforts. Here, some of these methods were adapted to weeds to account for a limited distribution area (i.e., inside a field), variatio...

  2. The use of geostatistics in the study of floral phenology of Vulpia geniculata (L.) link.

    PubMed

    León Ruiz, Eduardo J; García Mozo, Herminia; Domínguez Vilches, Eugenio; Galán, Carmen

    2012-01-01

    Traditionally, phenology studies have focused on changes through time, but there are many instances in ecological research where it is necessary to interpolate among spatially stratified samples. The combined use of Geographical Information Systems (GIS) and Geostatistics can be an essential tool for spatial analysis in phenological studies. Geostatistics are a family of statistics that describe correlations through space/time, and they can be used both for quantifying spatial correlation and for interpolating unsampled points. In the present work, estimations based upon Geostatistics and GIS mapping have enabled the construction of spatial models that reflect the phenological evolution of Vulpia geniculata (L.) Link throughout the study area during the sampling season. Ten sampling points, scattered throughout the city and the low mountains of the "Sierra de Córdoba", were chosen for weekly phenological monitoring during the flowering season. The phenological data were interpolated with the traditional geostatistical method of kriging, which was used to produce weekly estimates of V. geniculata phenology in unsampled areas. Finally, the application of Geostatistics and GIS to create phenological maps could be an essential complement in pollen aerobiological studies, given the increased interest in obtaining automatic aerobiological forecasting maps. PMID:22629169
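
Kriging interpolation of the kind used for the weekly phenology maps can be sketched in a few lines of numpy. This is a generic ordinary-kriging illustration with an assumed exponential semivariogram model, not the study's actual fitted model:

```python
import numpy as np

def semivariogram(h, sill=1.0, rang=5.0):
    """Exponential semivariogram model (zero nugget) - an assumed form."""
    return sill * (1.0 - np.exp(-h / rang))

def ordinary_kriging(coords, values, query):
    """Ordinary-kriging prediction at a single query point."""
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = semivariogram(d)        # data-to-data semivariances
    A[n, n] = 0.0                       # Lagrange-multiplier corner
    b = np.ones(n + 1)
    b[:n] = semivariogram(np.linalg.norm(coords - query, axis=1))
    lam = np.linalg.solve(A, b)[:n]     # kriging weights (sum to 1)
    return lam @ values
```

With a zero-nugget model, the predictor honors the data exactly at sampled locations, and the unbiasedness constraint makes the weights sum to one.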

  3. Geostatistical prediction of stream-flow regime in southeastern United States

    NASA Astrophysics Data System (ADS)

    Pugliese, Alessio; Castellarin, Attilio; Archfield, Stacey; Farmer, William

    2015-04-01

    similar performances independently of the interpretation of the curves (i.e. period-of-record/annual, or complete/seasonal) or Q* (MAF or MAP*); at-site performances are satisfactory or good (i.e. Nash-Sutcliffe Efficiency NSE ranges from 0.60 to 0.90 for cross-validated FDCs, depending on the model setting), while the overall performance at regional scale indicates that OK and TK are associated with smaller BIAS and RMSE relative to the six benchmark procedures. Acknowledgements: We thankfully acknowledge Alessia Bononi and Antonio Liguori for their preliminary analyses and Jon O. Skøien and Edzer Pebesma for their helpful assistance with the R packages rtop and gstat. The study is part of the research activities carried out by the working group Anthropogenic and Climatic Controls on WateR AvailabilitY (ACCuRAcY) of Panta Rhei - Everything Flows: Change in Hydrology and Society (IAHS Scientific Decade 2013-2022). References: Pugliese, A., A. Castellarin, and A. Brath (2014): Geostatistical prediction of flow-duration curves in an index-flow framework, Hydrol. Earth Syst. Sci., 18, 3801-3816, doi:10.5194/hess-18-3801-2014. Castiglioni, S., A. Castellarin, and A. Montanari (2009): Prediction of low-flow indices in ungauged basins through physiographical space-based interpolation, Journal of Hydrology, 378, 272-280.

  4. Introduction to This Special Issue on Geostatistics and Geospatial Techniques in Remote Sensing

    NASA Technical Reports Server (NTRS)

    Atkinson, Peter; Quattrochi, Dale A.; Goodman, H. Michael (Technical Monitor)

    2000-01-01

    The germination of this special Computers & Geosciences (C&G) issue began at the Royal Geographical Society (with the Institute of British Geographers) (RGS-IBG) annual meeting in January 1997 held at the University of Exeter, UK. The snow and cold of the English winter were tempered greatly by warm and cordial discussion of how to stimulate and enhance cooperation on geostatistical and geospatial research in remote sensing 'across the big pond' between UK and US researchers. It was decided that one way forward would be to hold parallel sessions in 1998 on geostatistical and geospatial research in remote sensing at appropriate venues in both the UK and the US. Selected papers given at these sessions would be published as special issues of C&G on the UK side and Photogrammetric Engineering and Remote Sensing (PE&RS) on the US side. These issues would highlight the commonality in research on geostatistical and geospatial research in remote sensing on both sides of the Atlantic Ocean. As a consequence, a session on "Geostatistics and Geospatial Techniques for Remote Sensing of Land Surface Processes" was held at the RGS-IBG annual meeting in Guildford, Surrey, UK in January 1998, organized by the Modeling and Advanced Techniques Special Interest Group (MAT SIG) of the Remote Sensing Society (RSS). A similar session was held at the Association of American Geographers (AAG) annual meeting in Boston, Massachusetts in March 1998, sponsored by the AAG's Remote Sensing Specialty Group (RSSG). The 10 papers that make up this issue of C&G comprise 7 papers from the UK and 3 papers from the US. We are both co-editors of each of the journal special issues, with the lead editor of each journal issue being from their respective side of the Atlantic. The special issue of PE&RS (vol. 65) that constitutes the other half of this co-edited journal series was published in early 1999, comprising 6 papers by US authors.
We are indebted to the International Association for Mathematical

  5. Evaluation of the supply of antifungal medication for the treatment of vaginal thrush in the community pharmacy setting: a randomized controlled trial

    PubMed Central

    Schneider, Carl R.; Emery, Lyndal; Brostek, Raisa; Clifford, Rhonda M.

    Background The Pharmaceutical Society of Australia has developed "guidance" for the supply of several medicines available without prescription to the general public. Limited research has been published assessing the effect of these guidelines on the provision of medication within the practice of pharmacy. Objective To assess appropriate supply of non-prescription antifungal medications for the treatment of vaginal thrush in community pharmacies, with and without a guideline. A secondary aim was to describe the assessment and counseling provided to patients when requesting this medication. Methods A randomized controlled trial was undertaken whereby two simulated patients conducted visits to 100 randomly selected community pharmacies in a metropolitan region. A product-based request for fluconazole (an oral antifungal that has a guideline) was compared to a product-based request for clotrimazole (a topical antifungal without a guideline). The same patient details were used for both requests. Outcome measures of the visits were the appropriateness of supply and referral to a medical practitioner. Results Overall 16% (n=16) of visits resulted in an appropriate outcome; 10% (n=5) of fluconazole requests compared with 22% (n=11) of clotrimazole requests (chi-square=2.68, p=0.10). There was a difference in the type of assessment performed by pharmacy staff between visits for fluconazole and clotrimazole. A request for clotrimazole resulted in significantly more frequent assessment of the reason for the request (chi-square=8.57, p=0.003), symptom location (chi-square=8.27, p=0.004), and prior history (chi-square=5.09, p=0.02). Conclusions Overall practice was poor, with the majority of pharmacies inappropriately supplying antifungal medication. New strategies are required to improve the current practice of community pharmacies in the provision of non-prescription antifungals for the treatment of vaginal thrush. PMID:24223077

  6. Evaluation and implementation of graded in vivo exposure for chronic low back pain in a German outpatient setting: a study protocol of a randomized controlled trial

    PubMed Central

    2013-01-01

    Background The purpose of the present study is to introduce an adapted protocol of in vivo exposure for fear-avoidant back pain patients and its implementation in the German health care system without multidisciplinary teams. Case studies demonstrated promising effects, but three preceding randomized controlled trials (RCTs) could not support the former results. More empirical support is necessary to further substantiate the effectiveness of in vivo exposure. Methods A total of 108 chronic low back pain patients are randomly assigned to one of three conditions (A: exposure_long (15 sessions), B: exposure_short (10 sessions), or C: control condition cognitive behavioral therapy (15 sessions)). The inclusion criteria are: back pain ≥3 months and a sufficient level of fear-avoidance. An effect evaluation, a process evaluation and an economic evaluation are conducted. Primary outcomes are pain-related disability and pain intensity. Secondary outcomes are: emotional distress, fear avoidance, catastrophizing and costs. Data are collected at baseline, upon completion of the intervention, after 10 sessions, and at six months following completion of treatment. Besides the comparison of exposure in vivo and cognitive behavioral therapy (CBT), we additionally compare a short and a long version of exposure to analyze dose-response effects. Discussion This is, to our knowledge, the first RCT comparing in vivo exposure to psychological treatment as usual in terms of cognitive behavioral therapy. Results will help to find out whether a tailored treatment for fear-avoidant back pain patients is more effective than a general pain management treatment. Trial registration The trial has been registered at ClinicalTrials.gov. The trial registration number is NCT01484418. PMID:23837551

  7. Computer generation and application of 3-D model porous media: From pore-level geostatistics to the estimation of formation factor

    SciTech Connect

    Ioannidis, M.; Kwiecien, M.; Chatzis, I.

    1995-12-31

    This paper describes a new method for the computer generation of 3-D stochastic realizations of porous media using geostatistical information obtained from high-contrast 2-D images of pore casts. The stochastic method yields model porous media with statistical properties identical to those of their real counterparts. Synthetic media obtained in this manner can form the basis for a number of studies related to the detailed characterization of the porous microstructure and, ultimately, the prediction of important petrophysical and reservoir engineering properties. In this context, direct computer estimation of the formation resistivity factor is examined using a discrete random walk algorithm. The dependence of formation factor on measurable statistical properties of the pore space is also investigated.
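
A discrete random walk estimate of the formation factor can be sketched as below, in the spirit of the approach described; the blind-walk rule, periodic boundaries, and normalization are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def formation_factor(pore, n_walkers=200, n_steps=2000, seed=0):
    """Estimate the formation factor of a periodic voxel image `pore`
    (boolean array, True = pore space) with a blind random walk.
    In lattice units a free walker has msd(t) = t, so D_eff/D_0 = msd(t)/t
    and F = 1 / (porosity * D_eff/D_0)."""
    rng = np.random.default_rng(seed)
    shape = np.array(pore.shape)
    phi = pore.mean()                         # porosity
    idx = np.argwhere(pore)                   # start walkers on pore voxels
    pos = idx[rng.integers(0, len(idx), n_walkers)]
    disp = np.zeros((n_walkers, 3))           # unwrapped displacement
    steps = np.vstack([np.eye(3, dtype=int), -np.eye(3, dtype=int)])
    for _ in range(n_steps):
        move = steps[rng.integers(0, 6, n_walkers)]
        trial = (pos + move) % shape          # periodic boundaries
        ok = pore[trial[:, 0], trial[:, 1], trial[:, 2]]
        pos[ok] = trial[ok]                   # blind walk: blocked moves rejected
        disp[ok] += move[ok]
    msd = (disp ** 2).sum(axis=1).mean()
    return 1.0 / (phi * msd / n_steps)
```

On a fully open medium this returns roughly 1; obstructed media yield larger values, consistent with F growing as the pore space becomes more tortuous and less porous.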

  8. Considerations in Applying the Results of Randomized Controlled Clinical Trials to the Care of Older Adults With Kidney Disease in the Clinical Setting: The SHARP Trial.

    PubMed

    Butler, Catherine R; O'Hare, Ann M

    2016-01-01

    The Study of Heart and Renal Protection (SHARP) found that treatment with ezetimibe and low-dose simvastatin reduced the incidence of major atherosclerotic events in patients with kidney disease. Due to the paucity of evidence-based interventions that lower cardiovascular morbidity in this high-risk population, the SHARP trial will likely have a large impact on clinical practice. However, applying the results of clinical trials conducted in select populations to the care of individual patients in real-world settings can be fraught with difficulty. This is especially true when caring for older adults with complex comorbidity and limited life expectancy. These patients are often excluded from clinical trials, frequently have competing health priorities, and may be less likely to benefit and more likely to be harmed by medications. We discuss key considerations in applying the results of the SHARP trial to the care of older adults with CKD in real-world clinical settings using guiding principles set forth by the American Geriatrics Society's Expert Panel on the Care of Older Adults with Multimorbidity. Using this schema, we emphasize the importance of evaluating trial results in the unique context of each patient's goals, values, priorities, and circumstances. PMID:26709060

  9. RANDOM LASSO.

    PubMed

    Wang, Sijian; Nan, Bin; Rosset, Saharon; Zhu, Ji

    2011-03-01

    We propose a computationally intensive method, the random lasso method, for variable selection in linear models. The method consists of two major steps. In step 1, the lasso method is applied to many bootstrap samples, each using a set of randomly selected covariates. A measure of importance is yielded from this step for each covariate. In step 2, a similar procedure to the first step is implemented with the exception that for each bootstrap sample, a subset of covariates is randomly selected with unequal selection probabilities determined by the covariates' importance. Adaptive lasso may be used in the second step with weights determined by the importance measures. The final set of covariates and their coefficients are determined by averaging bootstrap results obtained from step 2. The proposed method alleviates some limitations of the lasso, elastic-net and related methods noted especially in the context of microarray data analysis: it avoids the tendency to remove highly correlated variables altogether or to select them all, and maintains maximal flexibility in estimating their coefficients, particularly with different signs; the number of selected variables is no longer limited by the sample size; and the resulting prediction accuracy is competitive with or superior to that of the alternatives. We illustrate the proposed method by extensive simulation studies. The proposed method is also applied to a Glioblastoma microarray data analysis. PMID:22997542
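
The two bootstrap steps can be sketched as follows. This is a simplified numpy illustration of the scheme; the plain coordinate-descent lasso solver, subset sizes, and penalty value are assumptions, not the authors' code (which also allows an adaptive lasso in step 2):

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=100):
    """Plain coordinate-descent lasso: min 0.5*||y - X b||^2 + lam*||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0)
    r = y.copy()                             # residual with beta = 0
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * beta[j]           # remove j's contribution
            rho = X[:, j] @ r
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_ss[j]
            r -= X[:, j] * beta[j]
    return beta

def random_lasso(X, y, lam=1.0, B=50, q1=4, q2=4, seed=0):
    """Step 1: importance from lassos on bootstrap samples with random
    feature subsets. Step 2: importance-weighted feature sampling; the
    final coefficients average the bootstrap fits."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    imp = np.zeros(p)
    for _ in range(B):                       # step 1
        rows = rng.integers(0, n, n)         # bootstrap sample
        cols = rng.choice(p, q1, replace=False)
        imp[cols] += np.abs(lasso_cd(X[rows][:, cols], y[rows], lam))
    prob = imp / B + 1e-12                   # floor so all probs are positive
    prob /= prob.sum()
    beta = np.zeros(p)
    for _ in range(B):                       # step 2: unequal selection probs
        rows = rng.integers(0, n, n)
        cols = rng.choice(p, q2, replace=False, p=prob)
        beta[cols] += lasso_cd(X[rows][:, cols], y[rows], lam)
    return beta / B
```

On data where only the first two of several covariates matter, the averaged coefficients concentrate on those two while the rest shrink toward zero.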

  10. Systematic Review and Pooled Analyses of Recent Neurointerventional Randomized Controlled Trials: Setting a New Standard of Care for Acute Ischemic Stroke Treatment after 20 Years

    PubMed Central

    Hussain, Mohammed; Moussavi, Mohammad; Korya, Daniel; Mehta, Siddhart; Brar, Jaskiran; Chahal, Harina; Qureshi, Ihtesham; Mehta, Tapan; Ahmad, Javaad; Zaidat, Osama O.; Kirmani, Jawad F.

    2016-01-01

    Background Recent advances in the treatment of ischemic stroke have focused on revascularization and led to better clinical and functional outcomes. A systematic review and pooled analyses of 6 recent multicentered prospective randomized controlled trials (MPRCT) were performed to compare intravenous tissue plasminogen activator (IV tPA) and endovascular therapy (intervention) with IV tPA alone (control) for anterior circulation ischemic stroke (AIS) secondary to large vessel occlusion (LVO). Objectives Six MPRCTs (MR CLEAN, ESCAPE, EXTEND IA, SWIFT PRIME, REVASCAT and THERAPY) incorporating image-based LVO AIS were selected for assessing the following: (1) prespecified primary clinical outcomes of AIS patients in intervention and control arms: good outcomes were defined by a modified Rankin Scale score of 0-2 at 90 days; (2) secondary clinical outcomes were: (a) revascularization rates [favorable outcomes defined as modified Thrombolysis in Cerebral Infarction scale (mTICI) score of 2b/3]; (b) symptomatic intracranial hemorrhage (sICH) rates and mortality; (c) derivation of number needed to harm (NNH), number needed to treat (NNT), and relative percent difference (RPD) between intervention and control groups, and (d) random effects model to determine overall significance (forest and funnel plots). Results A total of 1,386 patients were included. Good outcomes at 90 days were seen in 46% of patients in the intervention (p < 0.00001) and in 27% of patients in the control groups (p < 0.00002). An mTICI score of 2b/3 was achieved in 70.2% of patients in the intervention arm. The sICH and mortality in the intervention arm compared with the control arm were 4.7 and 14.3% versus 7.9 and 17.8%, respectively. The NNT and NNH in the intervention and control groups were 5.3 and 9.1, respectively. Patients in the intervention arm had a 50.1% (RPD) better chance of achieving a good 90-day outcome as compared to controls. Conclusions Endovascular therapy combined with IV t

  11. Short-term effects of goal-setting focusing on the life goal concept on subjective well-being and treatment engagement in subacute inpatients: a quasi-randomized controlled trial

    PubMed Central

    Ogawa, Tatsuya; Omon, Kyohei; Yuda, Tomohisa; Ishigaki, Tomoya; Imai, Ryota; Ohmatsu, Satoko; Morioka, Shu

    2016-01-01

    Objective: To investigate the short-term effects of the life goal concept on subjective well-being and treatment engagement, and to determine the sample size required for a larger trial. Design: A quasi-randomized controlled trial that was not blinded. Setting: A subacute rehabilitation ward. Subjects: A total of 66 patients were randomized to a goal-setting intervention group with the life goal concept (Life Goal), a standard rehabilitation group with no goal-setting intervention (Control 1), or a goal-setting intervention group without the life goal concept (Control 2). Interventions: The goal-setting intervention in the Life Goal and Control 2 was Goal Attainment Scaling. The Life Goal patients were assessed in terms of their life goals, and the hierarchy of goals was explained. The intervention duration was four weeks. Main measures: Patients were assessed pre- and post-intervention. The outcome measures were the Hospital Anxiety and Depression Scale, 12-item General Health Questionnaire, Pittsburgh Rehabilitation Participation Scale, and Functional Independence Measure. Results: Of the 296 potential participants, 66 were enrolled; Life Goal (n = 22), Control 1 (n = 22) and Control 2 (n = 22). Anxiety was significantly lower in the Life Goal (4.1 ±3.0) than in Control 1 (6.7 ±3.4), but treatment engagement was significantly higher in the Life Goal (5.3 ±0.4) compared with both the Control 1 (4.8 ±0.6) and Control 2 (4.9 ±0.5). Conclusions: The life goal concept had a short-term effect on treatment engagement. A sample of 31 patients per group would be required for a fully powered clinical trial. PMID:27496700

  12. Geostatistical and Stochastic Study of Flow and Transport in the Unsaturated Zone at Yucca Mountain

    SciTech Connect

    Ye, Ming; Pan, Feng; Hu, Xiaolong; Zhu, Jianting

    2007-08-14

    Yucca Mountain has been proposed by the U.S. Department of Energy as the nation’s long-term, permanent geologic repository for spent nuclear fuel or high-level radioactive waste. The potential repository would be located in Yucca Mountain’s unsaturated zone (UZ), which acts as a critical natural barrier delaying arrival of radionuclides to the water table. Since radionuclide transport in groundwater can pose serious threats to human health and the environment, it is important to understand how much and how fast water and radionuclides travel through the UZ to groundwater. The UZ system consists of multiple hydrogeologic units whose hydraulic and geochemical properties exhibit systematic and random spatial variation, or heterogeneity, at multiple scales. Predictions of radionuclide transport under such complicated conditions are uncertain, and the uncertainty complicates decision making and risk analysis. This project aims at using geostatistical and stochastic methods to assess uncertainty of unsaturated flow and radionuclide transport in the UZ at Yucca Mountain. Focus of this study is parameter uncertainty of hydraulic and transport properties of the UZ. The parametric uncertainty arises since limited parameter measurements are unable to deterministically describe spatial variability of the parameters. In this project, matrix porosity, permeability and sorption coefficient of the reactive tracer (neptunium) of the UZ are treated as random variables. Corresponding propagation of parametric uncertainty is quantitatively measured using mean, variance, 5th and 95th percentiles of simulated state variables (e.g., saturation, capillary pressure, percolation flux, and travel time). These statistics are evaluated using a Monte Carlo method, in which a three-dimensional flow and transport model implemented using the TOUGH2 code is executed with multiple parameter realizations of the random model parameters. The project specifically studies uncertainty of unsaturated
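
The Monte Carlo propagation of parameter uncertainty described above can be illustrated schematically. All parameter distributions and the toy travel-time surrogate below are invented stand-ins; the actual study runs a three-dimensional TOUGH2 flow and transport model for each parameter realization:

```python
import numpy as np

rng = np.random.default_rng(42)
n_real = 5000                                   # number of Monte Carlo realizations

# invented stand-in distributions for the uncertain parameters
porosity = rng.uniform(0.05, 0.35, n_real)      # matrix porosity [-]
log10_k = rng.normal(-12.0, 0.5, n_real)        # log10 matrix permeability [m^2]
kd = rng.lognormal(0.0, 0.7, n_real) * 1e-3     # Np sorption coefficient [m^3/kg]

# toy travel-time surrogate: flux scales with permeability,
# and sorption retards the tracer (assumed bulk density 1500 kg/m^3)
q = 1e9 * 10.0 ** log10_k                       # percolation flux [m/yr]
R = 1.0 + (1500.0 / porosity) * kd              # retardation factor [-]
t = 300.0 * porosity * R / q                    # travel time over a 300 m column [yr]

print("mean = %.3g yr, variance = %.3g yr^2" % (t.mean(), t.var()))
print("5th / 95th percentiles [yr]:", np.percentile(t, [5, 95]))
```

The same summary statistics (mean, variance, 5th and 95th percentiles) are the ones used in the study to quantify how parametric uncertainty propagates into the simulated state variables.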

  13. The Impact of Brief Messages on HSV-2 Screening Uptake Among Female Defendants in a Court Setting: A Randomized Controlled Trial Utilizing Prospect Theory

    PubMed Central

    ROTH, ALEXIS M.; VAN DER POL, BARBARA; FORTENBERRY, J. DENNIS; DODGE, BRIAN; REECE, MICHAEL; CERTO, DAVID; ZIMET, GREGORY D.

    2015-01-01

    Epidemiologic data demonstrate that women involved with the criminal justice system in the United States are at high risk for sexually transmitted infections, including herpes simplex virus type 2 (HSV-2). Female defendants were recruited from a misdemeanor court to assess whether brief framed messages utilizing prospect theory could encourage testing for HSV-2. Participants were randomly assigned to a message condition (gain, loss, or control), completed an interviewer-administered survey assessing factors associated with antibody test uptake/refusal and were offered free point-of-care HSV-2 serologic testing. Although individuals in the loss-frame group accepted testing at the highest rate, an overall statistical difference in HSV-2 testing behavior by group (p ≤.43) was not detected. The majority of the sample (74.6%) characterized receiving a serological test for HSV-2 as health affirming. However, this did not moderate the effect of the intervention nor was it significantly associated with test acceptance (p ≤.82). Although the effects of message framing are subtle, the findings have important theoretical implications given the participants’ characterization of HSV-2 screening as health affirming despite being a detection behavior. Implications of study results for health care providers interested in brief, low cost interventions are also explored. PMID:25494832

  14. Consequences of anorectal cancer atlas implementation in the cooperative group setting: Radiobiologic analysis of a prospective randomized in silico target delineation study

    PubMed Central

    Mavroidis, Panayiotis; Giantsoudis, Drosoula; Awan, Musaddiq J.; Nijkamp, Jasper; Rasch, Coen R. N.; Duppen, Joop C.; Thomas, Charles R.; Okunieff, Paul; Jones, William E.; Kachnic, Lisa A.; Papanikolaou, Niko; Fuller, Clifton D.

    2014-01-01

    Purpose The aim of this study is to ascertain the subsequent radiobiological impact of using a consensus guideline target volume delineation atlas. Materials and methods Using a representative case and target volume delineation instructions derived from a proposed IMRT rectal cancer clinical trial, gross tumor volume (GTV) and clinical/planning target volumes (CTV/PTV) were contoured by 13 physician observers (Phase 1). The observers were then randomly assigned to follow (atlas) or not-follow (control) a consensus guideline/atlas for anorectal cancers, and instructed to re-contour the same case (Phase 2). Results The atlas group was found to have increased tumor control probability (TCP) after the atlas intervention for both the CTV (p < 0.0001) and PTV1 (p = 0.0011) with decreasing normal tissue complication probability (NTCP) for small intestine, while the control group did not. Additionally, the atlas group had reduced variance in TCP for all target volumes and reduced variance in NTCP for the bowel. In Phase 2, the atlas group had increased TCP relative to the control for CTV (p = 0.03). Conclusions Visual atlas and consensus treatment guidelines usage in the development of rectal cancer IMRT treatment plans reduced the inter-observer radiobiological variation, with clinically relevant TCP alteration for CTV and PTV volumes. PMID:24996454

  15. The PRESLO study: evaluation of a global secondary low back pain prevention program for health care personnel in a hospital setting. Multicenter, randomized intervention trial

    PubMed Central

    2012-01-01

    Background Common low back pain represents a major public health problem in terms of its direct cost to health care and its socio-economic repercussions. Ten percent of individuals who suffer from low back pain evolve toward a chronic case and as such are responsible for 75 to 80% of the direct cost of low back pain. It is therefore imperative to highlight the predictive factors of low back pain chronification in order to lighten the economic burden of low back pain-related invalidity. Despite being particularly affected by low back pain, Hospices Civils de Lyon (HCL) personnel have never been offered a specific, tailor-made treatment plan. The PRESLO study (with PRESLO referring to Secondary Low Back Pain Prevention, or in French, PREvention Secondaire de la LOmbalgie), proposed by HCL occupational health services and the Centre Médico-Chirurgical et de Réadaptation des Massues – Croix Rouge Française, is a randomized trial that aims to evaluate the feasibility and efficiency of a global secondary low back pain prevention program for the low back pain sufferers among HCL hospital personnel, a population at risk for recurrence and chronification. This program, which is based on the concept of physical retraining, employs a multidisciplinary approach uniting physical activity, cognitive education about low back pain and lumbopelvic morphotype analysis. No study targeting populations at risk for low back pain chronification has as yet evaluated the efficiency of lighter secondary prevention programs. Methods/Design This study is a two-arm parallel randomized controlled trial proposed to all low back pain sufferers among HCL workers, included between October 2008 and July 2011 and followed over two years. The personnel following their usual treatment (control group) and those following the global prevention program in addition to their usual treatment (intervention group) are compared in terms of low back pain recurrence and the impairments measured at the

  16. Predictive modeling of groundwater nitrate pollution using Random Forest and multisource variables related to intrinsic and specific vulnerability: a case study in an agricultural setting (Southern Spain).

    PubMed

    Rodriguez-Galiano, Victor; Mendes, Maria Paula; Garcia-Soldado, Maria Jose; Chica-Olmo, Mario; Ribeiro, Luis

    2014-04-01

    Watershed management decisions need robust methods, which allow an accurate predictive modeling of pollutant occurrences. Random Forest (RF) is a powerful machine learning data-driven method that is rarely used in water resources studies, and thus has not been evaluated thoroughly in this field when compared to more conventional pattern recognition techniques. Key advantages of RF include: its non-parametric nature; high predictive accuracy; and capability to determine variable importance. This last characteristic can be used to better understand the individual role and the combined effect of explanatory variables in both protecting and exposing groundwater from and to a pollutant. In this paper, the performance of RF regression for predictive modeling of nitrate pollution is explored, based on intrinsic and specific vulnerability assessment of the Vega de Granada aquifer. The applicability of this new machine learning technique is demonstrated in an agriculture-dominated area where nitrate concentrations in groundwater can exceed the trigger value of 50 mg/L at many locations. A comprehensive GIS database of twenty-four parameters related to intrinsic hydrogeologic properties, driving forces, remotely sensed variables and physical-chemical variables measured in situ were used as inputs to build different predictive models of nitrate pollution. RF measures of importance were also used to define the most significant predictors of nitrate pollution in groundwater, allowing the establishment of the pollution sources (pressures). The potential of RF for generating a vulnerability map to nitrate pollution is assessed considering multiple criteria related to variations in the algorithm parameters and the accuracy of the maps. The performance of the RF is also evaluated in comparison to the logistic regression (LR) method using different efficiency measures to ensure their generalization ability. 
Prediction results show the ability of RF to build accurate models
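
A minimal scikit-learn sketch of RF regression with variable importance, in the spirit of the study; the three synthetic predictors and the response below are invented stand-ins (the real model used twenty-four GIS-derived variables):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
# invented predictors standing in for the GIS layers
X = rng.uniform(0.0, 1.0, size=(n, 3))     # depth to water, N load, rainfall
# nitrate driven mainly by the N load, weakly by depth, not by rainfall
y = 60.0 * X[:, 1] - 15.0 * X[:, 0] + rng.normal(0.0, 3.0, n)

rf = RandomForestRegressor(n_estimators=200, oob_score=True, random_state=0)
rf.fit(X, y)
importances = dict(zip(["depth", "n_load", "rain"], rf.feature_importances_))
print(importances)                          # n_load should dominate
print("out-of-bag R^2:", round(rf.oob_score_, 3))
```

The `feature_importances_` ranking plays the role of the RF importance measures used above to identify pollution pressures, and the out-of-bag score gives a built-in generalization estimate without a separate validation set.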

  17. Randomized comparative study of left versus right radial approach in the setting of primary percutaneous coronary intervention for ST-elevation myocardial infarction

    PubMed Central

    Fu, Qiang; Hu, Hongyu; Wang, Dezhao; Chen, Wei; Tan, Zhixu; Li, Qun; Chen, Buxing

    2015-01-01

    Background Growing evidence suggests that the left radial approach (LRA) is related to decreased coronary procedure duration and fewer cerebrovascular complications as compared to the right radial approach (RRA) in elective percutaneous coronary intervention (PCI). However, the feasibility of LRA in primary PCI has yet to be studied further. Therefore, the aim of this study was to investigate the efficacy of LRA compared with RRA for primary PCI in ST-elevation myocardial infarction (STEMI) patients. Materials and methods A total of 200 consecutive patients with STEMI who received primary PCI were randomized to LRA (number [n]=100) or RRA (n=100). The study endpoint was needle-to-balloon time, defined as the time from local anesthesia infiltration to the first balloon inflation. Radiation dose by measuring cumulative air kerma (CAK) and CAK dose area product, as well as fluoroscopy time and contrast volume were also investigated. Results There were no significant differences in the baseline characteristics between the two groups. The coronary procedural success rate was similar between both radial approaches (98% for left versus 94% for right; P=0.28). Compared with RRA, LRA had significantly shorter needle-to-balloon time (16.0±4.8 minutes versus 18.0±6.5 minutes, respectively; P=0.02). Additionally, fluoroscopy time (7.4±3.4 minutes versus 8.8±3.5 minutes, respectively; P=0.01) and CAK dose area product (51.9±30.4 Gy cm2 versus 65.3±49.1 Gy cm2, respectively; P=0.04) were significantly lower with LRA than with RRA. Conclusion Primary PCI can be performed via LRA with earlier blood flow restoration in the infarct-related artery and lower radiation exposure when compared with RRA; therefore, the LRA may become a feasible and attractive alternative to perform primary PCI for STEMI patients. PMID:26150704

  18. SRS 2010 Vegetation Inventory GeoStatistical Mapping Results for Custom Reaction Intensity and Total Dead Fuels.

    SciTech Connect

    Edwards, Lloyd A.; Parresol, Bernard

    2014-09-01

    This report presents the geostatistical analysis results for two fire-fuel response variables, custom reaction intensity and total dead fuels, as part of the SRS 2010 vegetation inventory project. For a detailed description of the project, theory, and background, including sample design, methods, and results, please refer to the USDA Forest Service Savannah River Site internal report “SRS 2010 Vegetation Inventory GeoStatistical Mapping Report” (Edwards & Parresol 2013).

  19. A geostatistical approach for describing spatial pattern in stream networks

    USGS Publications Warehouse

    Ganio, L.M.; Torgersen, C.E.; Gresswell, R.E.

    2005-01-01

    The shape and configuration of branched networks influence ecological patterns and processes. Recent investigations of network influences in riverine ecology stress the need to quantify spatial structure not only in a two-dimensional plane, but also in networks. An initial step in understanding data from stream networks is discerning non-random patterns along the network. At the same time, data collected in the network may be spatially autocorrelated and thus not suitable for traditional statistical analyses. Here we provide a method that uses commercially available software to construct an empirical variogram to describe spatial pattern in the relative abundance of coastal cutthroat trout in headwater stream networks. We describe the mathematical and practical considerations involved in calculating a variogram using a non-Euclidean distance metric to incorporate the network pathway structure in the analysis of spatial variability, and use a non-parametric technique to ascertain whether the pattern in the empirical variogram is non-random.
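
The key idea, computing an empirical variogram from a precomputed (possibly non-Euclidean, network pathway) distance matrix, can be sketched with numpy; the binning scheme and test data below are illustrative assumptions:

```python
import numpy as np

def empirical_variogram(dist, values, bins):
    """Empirical semivariogram from a precomputed pairwise distance matrix
    `dist` (Euclidean or network pathway distance) and observed values.
    Returns bin centers, semivariances, and pair counts per lag bin."""
    v = np.asarray(values, dtype=float)
    i, j = np.triu_indices(len(v), k=1)     # each unordered pair once
    h = dist[i, j]                          # pairwise separations
    sq = 0.5 * (v[i] - v[j]) ** 2           # semivariance contributions
    which = np.digitize(h, bins)
    centers, gamma, counts = [], [], []
    for b in range(1, len(bins)):
        m = which == b
        if m.any():
            centers.append(0.5 * (bins[b - 1] + bins[b]))
            gamma.append(sq[m].mean())
            counts.append(int(m.sum()))
    return np.array(centers), np.array(gamma), np.array(counts)
```

Because only the distance matrix enters the computation, the same routine works whether distances are straight-line or measured along the network, which is exactly what makes the network-pathway variogram practical.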

  20. Performance of a Culturally Tailored Cognitive Behavioral Intervention (CBI) Integrated in a Public Health Setting to Reduce Risk of Antepartum Depression: A Randomized Clinical Trial

    PubMed Central

    Jesse, D. Elizabeth; Gaynes, Bradley N.; Feldhousen, Elizabeth; Newton, Edward R.; Bunch, Shelia; Hollon, Steven D.

    2016-01-01

    Introduction Cognitive behavioral group interventions have been shown to improve depressive symptoms in adult populations. This article details the feasibility and efficacy of a 6-week culturally tailored cognitive behavioral intervention offered to rural, minority, low-income women at risk for antepartum depression. Methods 146 pregnant women were stratified by high-risk for antepartum depression (Edinburgh Postnatal Depression Scale (EPDS) score of 10 or higher) or low-moderate risk (EPDS score of 4-9) and randomized to a cognitive behavioral intervention or treatment-as-usual. Differences in mean change of EPDS and BDI-II scores for low-moderate and high-risk women in the cognitive behavioral intervention and treatment-as-usual for the full sample were assessed from baseline (T1), post-treatment (T2) and 1-month follow-up (T3) and for African-American women in the subsample. Results Both the cognitive behavioral intervention and treatment-as-usual groups had significant reductions in the EPDS scores from T1 to T2 and T1 to T3. In women at high-risk for depression (n=62), there was no significant treatment effect from T1 to T2 or T3 for the Edinburgh Postnatal Depression Scale. However, in low-moderate risk women, there was a significant decrease in the BDI-II scores from T1 to T2 (4.92 vs. 0.59, P=.018) and T1 to T3 (5.67 vs. 1.51, P=.04). Also, the cognitive behavioral intervention significantly reduced EPDS scores for African-American women at high-risk (n=43) from T1 to T2 (5.59 vs. 2.18, P=.02) and from T1 to T3 (6.32 vs. 3.14, P= .04). Discussion A cognitive behavioral intervention integrated within prenatal clinics is feasible in this sample, although attrition rates were high. Compared to treatment-as-usual, the cognitive behavioral intervention reduced depressive symptoms for African-American women at high-risk for antepartum depression and for the full sample of women at low-moderate risk for antepartum depression. These promising findings need to be

  1. Geostatistical simulations for radon indoor with a nested model including the housing factor.

    PubMed

    Cafaro, C; Giovani, C; Garavaglia, M

    2016-01-01

    The definition of radon-prone areas is the subject of much research in radioecology, since radon is considered a leading cause of lung tumours and authorities therefore need support in developing an appropriate sanitary prevention strategy. In this paper, we use geostatistical tools to elaborate a definition that accounts for some of the available information about the dwellings. Co-kriging is the interpolator used in geostatistics to refine predictions by means of external covariates. A priori, co-kriging is not guaranteed to improve significantly on the results obtained with common lognormal kriging. Here, however, the multivariate approach reduces the cross-validation residual variance to an extent deemed satisfying. Furthermore, with the application of Monte Carlo simulations, the paradigm provides a more conservative radon-prone area definition than the one previously produced by lognormal kriging. PMID:26547362
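    The Monte Carlo step behind a radon-prone area classification — estimating the probability that the (lognormally distributed) indoor concentration exceeds a reference level — can be sketched as follows. The kriged log-mean, log-standard deviation, and the 200 Bq/m³ threshold here are illustrative assumptions, not values from the paper:

    ```python
    import math
    import random

    random.seed(0)

    # Hypothetical kriged prediction for one map cell, on the log scale:
    # mean and std of ln(radon concentration in Bq/m3). Values are invented.
    mu_log, sigma_log = math.log(120.0), 0.6
    threshold = 200.0  # illustrative reference level, Bq/m3

    # Monte Carlo estimate of the exceedance probability for this cell.
    n = 100_000
    exceed = sum(1 for _ in range(n)
                 if random.lognormvariate(mu_log, sigma_log) > threshold)
    p_exceed = exceed / n
    print(f"P(radon > {threshold:.0f} Bq/m3) ≈ {p_exceed:.3f}")
    ```

    A cell (or municipality) would then be flagged as radon-prone when its exceedance probability passes a chosen cutoff; simulating rather than relying on the kriged mean alone is what makes the classification more conservative.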

  2. Three-dimensional geostatistical inversion of flowmeter and pumping test data.

    PubMed

    Li, Wei; Englert, Andreas; Cirpka, Olaf A; Vereecken, Harry

    2008-01-01

    We jointly invert field data of flowmeter and multiple pumping tests in fully screened wells to estimate hydraulic conductivity using a geostatistical method. We use the steady-state drawdowns of pumping tests and the discharge profiles of flowmeter tests as our data in the inference. The discharge profiles need not be converted to absolute hydraulic conductivities. Consequently, we do not need measurements of depth-averaged hydraulic conductivity at well locations. The flowmeter profiles contain information about relative vertical distributions of hydraulic conductivity, while drawdown measurements of pumping tests provide information about horizontal fluctuation of the depth-averaged hydraulic conductivity. We apply the method to data obtained at the Krauthausen test site of the Forschungszentrum Jülich, Germany. The resulting estimate of our joint three-dimensional (3D) geostatistical inversion shows an improved 3D structure in comparison to the inversion of pumping test data only. PMID:18266734

  3. Geochemical and geostatistical evaluation, Arkansas Canyon Planning Unit, Fremont and Custer Counties, Colorado

    USGS Publications Warehouse

    Weiland, E.F.; Connors, R.A.; Robinson, M.L.; Lindemann, J.W.; Meyer, W.T.

    1982-01-01

    A mineral assessment of the Arkansas Canyon Planning Unit was undertaken by Barringer Resources Inc., under the terms of contract YA-553-CTO-100 with the Bureau of Land Management, Colorado State Office. The study was based on a geochemical-geostatistical survey in which 700 stream sediment samples were collected and analyzed for 25 elements. Geochemical results were interpreted by statistical processing which included factor, discriminant, multiple regression and characteristic analysis. The major deposit types evaluated were massive sulfide-base metal, sedimentary and magmatic uranium, thorium vein, magmatic segregation, and carbonatite related deposits. Results of the single element data and multivariate geostatistical analysis indicate that limited potential exists for base metal mineralization near the Horseshoe, El Plomo, and Green Mountain Mines. Thirty areas are considered to be anomalous with regard to one or more of the geochemical parameters evaluated during this study. The evaluation of carbonatite related mineralization was restricted due to the lack of geochemical data specific to this environment.

  4. Evaluation of Effectiveness and Cost‐Effectiveness of a Clinical Decision Support System in Managing Hypertension in Resource Constrained Primary Health Care Settings: Results From a Cluster Randomized Trial

    PubMed Central

    Anchala, Raghupathy; Kaptoge, Stephen; Pant, Hira; Di Angelantonio, Emanuele; Franco, Oscar H.; Prabhakaran, D.

    2015-01-01

    Background Randomized control trials from the developed world report that clinical decision support systems (DSS) could provide an effective means to improve the management of hypertension (HTN). However, evidence from developing countries in this regard is rather limited, and there is a need to assess the impact of a clinical DSS on managing HTN in primary health care center (PHC) settings. Methods and Results We performed a cluster randomized trial to test the effectiveness and cost‐effectiveness of a clinical DSS among Indian adult hypertensive patients (between 35 and 64 years of age), wherein 16 PHC clusters from a district of Telangana state, India, were randomized to receive either a DSS or a chart‐based support (CBS) system. Each intervention arm had 8 PHC clusters, with a mean of 102 hypertensive patients per cluster (n=845 in DSS and 783 in CBS groups). Mean change in systolic blood pressure (SBP) from baseline to 12 months was the primary endpoint. The mean difference in SBP change from baseline between the DSS and CBS at the 12th month of follow‐up, adjusted for age, sex, height, waist, body mass index, alcohol consumption, vegetable intake, pickle intake, and baseline differences in blood pressure, was −6.59 mm Hg (95% confidence interval: −12.18 to −1.42; P=0.021). The cost‐effective ratio for CBS and DSS groups was $96.01 and $36.57 per mm of SBP reduction, respectively. Conclusion Clinical DSS are effective and cost‐effective in the management of HTN in resource‐constrained PHC settings. Clinical Trial Registration URL: http://www.ctri.nic.in. Unique identifier: CTRI/2012/03/002476. PMID:25559011

  5. Family, Community and Clinic Collaboration to Treat Overweight and Obese Children: Stanford GOALS -- a Randomized Controlled Trial of a Three-Year, Multi-Component, Multi-Level, Multi-Setting Intervention

    PubMed Central

    Robinson, Thomas N.; Matheson, Donna; Desai, Manisha; Wilson, Darrell M.; Weintraub, Dana L.; Haskell, William L.; McClain, Arianna; McClure, Samuel; Banda, Jorge; Sanders, Lee M.; Haydel, K. Farish; Killen, Joel D.

    2013-01-01

    Objective To test the effects of a three-year, community-based, multi-component, multi-level, multi-setting (MMM) approach for treating overweight and obese children. Design Two-arm, parallel group, randomized controlled trial with measures at baseline, 12, 24, and 36 months after randomization. Participants Seven through eleven year old, overweight and obese children (BMI ≥ 85th percentile) and their parents/caregivers recruited from community locations in low-income, primarily Latino neighborhoods in Northern California. Interventions Families are randomized to the MMM intervention versus a community health education active-placebo comparison intervention. Interventions last for three years for each participant. The MMM intervention includes a community-based after school team sports program designed specifically for overweight and obese children, a home-based family intervention to reduce screen time, alter the home food/eating environment, and promote self-regulatory skills for eating and activity behavior change, and a primary care behavioral counseling intervention linked to the community and home interventions. The active-placebo comparison intervention includes semi-annual health education home visits, monthly health education newsletters for children and for parents/guardians, and a series of community-based health education events for families. Main Outcome Measure Body mass index trajectory over the three-year study. Secondary outcome measures include waist circumference, triceps skinfold thickness, accelerometer-measured physical activity, 24-hour dietary recalls, screen time and other sedentary behaviors, blood pressure, fasting lipids, glucose, insulin, hemoglobin A1c, C-reactive protein, alanine aminotransferase, and psychosocial measures. Conclusions The Stanford GOALS trial is testing the efficacy of a novel community-based multi-component, multi-level, multi-setting treatment for childhood overweight and obesity in low-income, Latino families

  6. Application of a computationally efficient geostatistical approach to characterizing variably spaced water-table data

    SciTech Connect

    Quinn, J.J.

    1996-02-01

    Geostatistical analysis of hydraulic head data is useful in producing unbiased contour plots of head estimates and relative errors. However, at most sites being characterized, monitoring wells are generally present at different densities, with clusters of wells in some areas and few wells elsewhere. The problem that arises when kriging data at different densities is in achieving adequate resolution of the grid while maintaining computational efficiency and working within software limitations. For the site considered, 113 data points were available over a 14-mi{sup 2} study area, including 57 monitoring wells within an area of concern of 1.5 mi{sup 2}. Variogram analyses of the data indicate a linear model with a negligible nugget effect. The geostatistical package used in the study allows a maximum grid of 100 by 100 cells. Two-dimensional kriging was performed for the entire study area with a 500-ft grid spacing, while the smaller zone was modeled separately with a 100-ft spacing. In this manner, grid cells for the dense area and the sparse area remained small relative to the well separation distances, and the maximum dimensions of the program were not exceeded. The spatial head results for the detailed zone were then nested into the regional output by use of a graphical, object-oriented database that performed the contouring of the geostatistical output. This study benefitted from the two-scale approach and from very fine geostatistical grid spacings relative to typical data separation distances. The combining of the sparse, regional results with those from the finer-resolution area of concern yielded contours that honored the actual data at every measurement location. The method applied in this study can also be used to generate reproducible, unbiased representations of other types of spatial data.
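    The grid-sizing trade-off described above can be checked with simple arithmetic: at the chosen spacings, both the regional and the detailed grids fit within the package's 100 × 100 cell limit. A sketch assuming square study areas (the real areas need not be square):

    ```python
    import math

    MAX_CELLS = 100     # per-axis limit of the geostatistical package cited in the study
    MILE_FT = 5280.0

    def cells_needed(area_sq_mi, spacing_ft):
        """Cells per axis to cover a square area of the given size at the given spacing."""
        side_ft = math.sqrt(area_sq_mi) * MILE_FT
        return math.ceil(side_ft / spacing_ft)

    # Regional grid: 14 mi^2 at 500-ft spacing; detailed zone: 1.5 mi^2 at 100-ft spacing.
    for area, spacing in [(14.0, 500.0), (1.5, 100.0)]:
        n = cells_needed(area, spacing)
        status = "ok" if n <= MAX_CELLS else "exceeds limit"
        print(f"{area} mi^2 at {spacing:.0f}-ft spacing: {n} x {n} cells ({status})")
    ```

    Both grids come in well under the limit while keeping the cell size small relative to typical well separations, which is the point of the two-scale approach.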

  7. Tomogram-based comparison of geostatistical models: Application to the Macrodispersion Experiment (MADE) site

    NASA Astrophysics Data System (ADS)

    Linde, Niklas; Lochbühler, Tobias; Dogan, Mine; Van Dam, Remke L.

    2015-12-01

    We propose a new framework to compare alternative geostatistical descriptions of a given site. Multiple realizations of each of the considered geostatistical models and their corresponding tomograms (based on inversion of noise-contaminated simulated data) are used as a multivariate training image. The training image is scanned with a direct sampling algorithm to obtain conditional realizations of hydraulic conductivity that are not only in agreement with the geostatistical model, but also honor the spatially varying resolution of the site-specific tomogram. Model comparison is based on the quality of the simulated geophysical data from the ensemble of conditional realizations. The tomogram in this study is obtained by inversion of cross-hole ground-penetrating radar (GPR) first-arrival travel time data acquired at the MAcro-Dispersion Experiment (MADE) site in Mississippi (USA). Various heterogeneity descriptions ranging from multi-Gaussian fields to fields with complex multiple-point statistics inferred from outcrops are considered. Under the assumption that the relationship between porosity and hydraulic conductivity inferred from local measurements is valid, we find that conditioned multi-Gaussian realizations and derivatives thereof can explain the crosshole geophysical data. A training image based on an aquifer analog from Germany was found to be in better agreement with the geophysical data than the one based on the local outcrop, which appears to under-represent high hydraulic conductivity zones. These findings are only based on the information content in a single resolution-limited tomogram and extending the analysis to tracer or higher resolution surface GPR data might lead to different conclusions (e.g., that discrete facies boundaries are necessary). Our framework makes it possible to identify inadequate geostatistical models and petrophysical relationships, effectively narrowing the space of possible heterogeneity representations.

  8. A geostatistical approach to predicting sulfur content in the Pittsburgh coal bed

    USGS Publications Warehouse

    Watson, W.D.; Ruppert, L.F.; Bragg, L.J.; Tewalt, S.J.

    2001-01-01

    The US Geological Survey (USGS) is completing a national assessment of coal resources in the five top coal-producing regions in the US. Point-located data provide measurements on coal thickness and sulfur content. The sample data and their geologic interpretation represent the most regionally complete and up-to-date assessment of what is known about top-producing US coal beds. The sample data are analyzed using a combination of geologic and Geographic Information System (GIS) models to estimate tonnages and qualities of the coal beds. Traditionally, GIS practitioners use contouring to represent geographical patterns of "similar" data values. The tonnage and grade of coal resources are then assessed by using the contour lines as references for interpolation. An assessment taken to this point is only indicative of resource quantity and quality. Data users may benefit from a statistical approach that would allow them to better understand the uncertainty and limitations of the sample data. To develop a quantitative approach, geostatistics were applied to the data on coal sulfur content from samples taken in the Pittsburgh coal bed (located in the eastern US, in the southwestern part of the state of Pennsylvania, and in adjoining areas in the states of Ohio and West Virginia). Geostatistical methods that account for regional and local trends were applied to blocks 2.7 mi (4.3 km) on a side. The data and geostatistics support conclusions concerning the average sulfur content and its degree of reliability at regional- and economic-block scale over the large, contiguous part of the Pittsburgh outcrop, but not to a mine scale. To validate the method, a comparison was made with the sulfur contents in sample data taken from 53 coal mines located in the study area. The comparison showed a high degree of similarity between the sulfur content in the mine samples and the sulfur content represented by the geostatistically derived contours. Published by Elsevier Science B.V.
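    A block-scale summary like the one used here (blocks 2.7 mi on a side) can be illustrated by averaging point samples into square blocks. This is only a stand-in for the trend-aware geostatistical estimator the study actually applies, and the sample coordinates and sulfur values are invented:

    ```python
    from collections import defaultdict

    def block_average(samples, block_mi=2.7):
        """Average the point values falling in each square block of side block_mi."""
        acc = defaultdict(list)
        for x, y, value in samples:
            key = (int(x // block_mi), int(y // block_mi))
            acc[key].append(value)
        return {k: sum(v) / len(v) for k, v in acc.items()}

    # Invented point samples: (x in mi, y in mi, sulfur content in wt. %)
    samples = [(0.5, 0.4, 1.8), (2.0, 1.0, 2.2), (3.0, 0.5, 3.1), (3.5, 2.9, 2.5)]
    print(block_average(samples))
    ```

    Reporting at block scale rather than point scale is what lets the study make reliability statements for regional and economic blocks while declining to extrapolate to mine scale.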

  9. Geostatistical mapping of effluent-affected sediment distribution on the Palos Verdes Shelf

    SciTech Connect

    Murray, Christopher J.; Lee, H. J.; Hampton, M. A.

    2001-12-01

    Geostatistical techniques were used to study the spatial continuity of the thickness of effluent-affected sediment in the offshore Palos Verdes margin area. The thickness data were measured directly from cores and indirectly from high-frequency subbottom profiles collected over the Palos Verdes Margin. Strong spatial continuity of the sediment thickness data was identified, with a maximum range of correlation in excess of 1.4 km. The spatial correlation showed a marked anisotropy, and was more than twice as continuous in the alongshore direction as in the cross-shelf direction. Sequential indicator simulation employing models fit to the thickness data variograms was used to map the distribution of the sediment, and to quantify the uncertainty in those estimates. A strong correlation between sediment thickness data and measurements of the mass of the contaminant p,p′-DDE per unit area was identified. A calibration based on the bivariate distribution of the thickness and p,p′-DDE data was applied using Markov-Bayes indicator simulation to extend the geostatistical study and map the contamination levels in the sediment. Integrating the map grids produced by the geostatistical study of the two variables indicated that 7.8 million cubic meters of effluent-affected sediment exist in the map area, containing approximately 61 to 72 Mg (metric tons) of p,p′-DDE. Most of the contaminated sediment (about 85% of the sediment and 89% of the p,p′-DDE) occurs in water depths less than 100 m. The geostatistical study also indicated that the samples available for mapping are well distributed and the uncertainty of the estimates of the thickness and contamination level of the sediments is lowest in areas where the contaminated sediment is most prevalent.
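    The final integration step — summing a mapped thickness grid to a total sediment volume and, through a thickness-to-contaminant calibration, to a total p,p′-DDE mass — can be sketched as follows. The grid values, cell size, and linear calibration constant are invented for illustration; the study's actual calibration is bivariate, not a single proportionality constant:

    ```python
    # Toy grid of effluent-affected sediment thickness (m) on 100 m x 100 m cells.
    CELL_AREA_M2 = 100.0 * 100.0
    thickness_m = [
        [0.0, 0.2, 0.5],
        [0.1, 0.6, 0.4],
    ]

    # Hypothetical linear calibration: grams of p,p'-DDE per m2 per metre of thickness.
    DDE_G_PER_M2_PER_M = 2.0

    # Integrate cell by cell: thickness * cell area gives volume; apply the
    # calibration for contaminant mass, then convert grams to Mg (1 Mg = 1e6 g).
    volume_m3 = sum(t * CELL_AREA_M2 for row in thickness_m for t in row)
    dde_mg = sum(t * DDE_G_PER_M2_PER_M * CELL_AREA_M2
                 for row in thickness_m for t in row) / 1e6

    print(f"sediment volume: {volume_m3:.0f} m3, p,p'-DDE mass: {dde_mg:.3f} Mg")
    ```

    The same cell-by-cell summation over the simulated map grids is what yields the study's 7.8 million m³ and 61–72 Mg figures, with the range reflecting simulation uncertainty.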

  10. Hierarchical probabilistic regionalization of volcanism for Sengan region in Japan using multivariate statistical techniques and geostatistical interpolation techniques

    SciTech Connect

    Park, Jinyong; Balasingham, P; McKenna, Sean Andrew; Pinnaduwa H.S.W. Kulatilake

    2004-09-01

    Sandia National Laboratories, under contract to Nuclear Waste Management Organization of Japan (NUMO), is performing research on regional classification of given sites in Japan with respect to potential volcanic disruption using multivariate statistics and geostatistical interpolation techniques. This report provides results obtained for hierarchical probabilistic regionalization of volcanism for the Sengan region in Japan by applying multivariate statistical techniques and geostatistical interpolation techniques on the geologic data provided by NUMO. A workshop report produced in September 2003 by Sandia National Laboratories (Arnold et al., 2003) on volcanism lists a set of most important geologic variables as well as some secondary information related to volcanism. Geologic data extracted for the Sengan region in Japan from the data provided by NUMO revealed that data are not available at the same locations for all the important geologic variables. In other words, the geologic variable vectors were found to be incomplete spatially. However, it is necessary to have complete geologic variable vectors to perform multivariate statistical analyses. As a first step towards constructing complete geologic variable vectors, the Universal Transverse Mercator (UTM) zone 54 projected coordinate system and a 1 km square regular grid system were selected. The data available for each geologic variable on a geographic coordinate system were transferred to the aforementioned grid system. Also the recorded data on volcanic activity for the Sengan region were produced on the same grid system. Each geologic variable map was compared with the recorded volcanic activity map to determine the geologic variables that are most important for volcanism. In the regionalized classification procedure, this step is known as the variable selection step. The following variables were determined as most important for volcanism: geothermal gradient, groundwater temperature, heat discharge, groundwater