Science.gov

Sample records for geostatistics random sets

  1. Random vectors and spatial analysis by geostatistics for geotechnical applications

    SciTech Connect

    Young, D.S.

    1987-08-01

    Geostatistics is extended to the spatial analysis of vector variables by defining the estimation variance and vector variogram in terms of the magnitudes of difference vectors. Many random variables in geotechnology are vectorial rather than scalar, and structural analysis requires interpolation of the sampled variables to construct and characterize structural models. A better local estimator results in higher-quality input models, and geostatistics can provide such estimators: kriging estimators. The efficiency of geostatistics for vector variables is demonstrated in a case study of rock joint orientations in geological formations. The positive cross-validation encourages application of geostatistics to spatial analysis of random vectors in geoscience as well as in various geotechnical fields, including optimum site characterization, rock mechanics for mining and civil structures, cavability analysis of block caving, petroleum engineering, and hydrologic and hydraulic modeling.
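
    The central quantity in this record is a variogram defined on the magnitudes of difference vectors rather than on scalar differences. A minimal Python sketch of that idea, on hypothetical synthetic orientation data (the trend, noise level, and lag bins are all assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D sample: locations and unit orientation vectors with a
# smooth spatial trend in the angle plus noise.
n = 200
xy = rng.uniform(0, 100, size=(n, 2))
angles = 0.03 * xy[:, 0] + rng.normal(0, 0.2, n)
vec = np.c_[np.cos(angles), np.sin(angles)]

def vector_variogram(xy, vec, lags, tol):
    """Empirical vector variogram: half the mean squared magnitude of the
    difference vectors |v_i - v_j|^2, binned by separation distance."""
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    dv2 = np.sum((vec[:, None, :] - vec[None, :, :]) ** 2, axis=-1)
    iu = np.triu_indices(len(xy), k=1)          # each pair counted once
    d, dv2 = d[iu], dv2[iu]
    return np.array([0.5 * dv2[np.abs(d - h) <= tol].mean() for h in lags])

lags = np.arange(5, 50, 5.0)
gamma = vector_variogram(xy, vec, lags, tol=2.5)
print(gamma)
```

    With spatially structured orientations the curve rises with lag distance, which is what makes kriging of vector data worthwhile.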

  2. Random spatial processes and geostatistical models for soil variables

    NASA Astrophysics Data System (ADS)

    Lark, R. M.

    2009-04-01

    Geostatistical models of soil variation have been used to considerable effect to facilitate efficient and powerful prediction of soil properties at unsampled sites or over partially sampled regions. Geostatistical models can also be used to investigate the scaling behaviour of soil process models, to design sampling strategies, and to account for spatial dependence in the random effects of linear mixed models for spatial variables. However, most geostatistical models (variograms) are selected for reasons of mathematical convenience (in particular, to ensure positive definiteness of the corresponding covariance functions). They assume some underlying spatial mathematical operator which may give a good description of observed variation of the soil, but which may not relate in any clear way to the processes that we know give rise to that variation in the real world. In this paper I shall argue that soil scientists should pay closer attention to the underlying operators in geostatistical models, with a view to identifying, wherever possible, operators that reflect our knowledge of processes in the soil. I shall illustrate how this can be done in the case of two problems. The first exemplar problem is the definition of operators to represent statistically those processes in which the soil landscape is divided into discrete domains. This may occur at disparate scales, from the landscape (outcrops, catchments, fields with different land use) to the soil core (aggregates, rhizospheres). The operators that underlie standard geostatistical models of soil variation typically describe continuous variation, and so do not offer any way to incorporate information on processes which occur in discrete domains. I shall present the Poisson Voronoi tessellation as an alternative spatial operator, examine its corresponding variogram, and apply these to some real data. The second exemplar problem arises from different operators that are equifinal with respect to the variograms of the
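
    The Poisson Voronoi operator described above can be simulated directly: Poisson seed points tessellate the plane into cells, each cell receives an independent value, and the resulting mosaic field has a variogram that rises as pairs of points become more likely to fall in different cells. A sketch (seed density, field values, and lag bins are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Poisson Voronoi mosaic: each point takes the value of its nearest seed's cell,
# giving a piecewise-constant field of "discrete domains".
seeds = rng.uniform(0, 100, size=(30, 2))     # Poisson seed points
cell_values = rng.normal(0, 1, len(seeds))    # one independent value per cell

def field(pts):
    d = np.linalg.norm(pts[:, None, :] - seeds[None, :, :], axis=-1)
    return cell_values[np.argmin(d, axis=1)]

# Empirical variogram of the mosaic field
pts = rng.uniform(0, 100, size=(500, 2))
z = field(pts)
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
dz2 = (z[:, None] - z[None, :]) ** 2
iu = np.triu_indices(len(pts), k=1)
d, dz2 = d[iu], dz2[iu]
lags = np.arange(2, 40, 4.0)
gamma = np.array([0.5 * dz2[np.abs(d - h) < 2].mean() for h in lags])
print(gamma)
```

    For a mosaic model the variogram is proportional to the probability that two points a given distance apart lie in different cells, which gives it a characteristically different shape from the continuous-variation models the paper contrasts it with.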

  3. Fast Geostatistical Inversion using Randomized Matrix Decompositions and Sketchings for Heterogeneous Aquifer Characterization

    NASA Astrophysics Data System (ADS)

    O'Malley, D.; Le, E. B.; Vesselinov, V. V.

    2015-12-01

    We present a fast, scalable, and easily implementable stochastic inverse method for characterization of aquifer heterogeneity. The method utilizes recent advances in randomized matrix algebra and exploits the structure of the Quasi-Linear Geostatistical Approach (QLGA) without requiring a structured grid, unlike Fast Fourier Transform (FFT) methods. The QLGA framework is a more stable version of Gauss-Newton iteration for a large number of unknown model parameters, and it provides unbiased estimates. The methods are matrix-free and do not require derivatives or adjoints, and are thus ideal for complex models and black-box implementation. We also incorporate randomized least-squares solvers and data-reduction methods, which speed up computation and simulate missing data points. The new inverse methodology is coded in Julia and implemented in the MADS computational framework (http://mads.lanl.gov). Julia is an advanced high-level scientific programming language that allows for efficient memory management and utilization of high-performance computational resources. Inversion results based on a series of synthetic problems with steady-state and transient calibration data are presented.
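
    The "randomized matrix algebra" this record relies on can be illustrated with a generic randomized low-rank decomposition (a Halko-style randomized SVD), applied to the kind of smooth covariance matrix that arises in geostatistical inversion. This is not the QLGA code itself, just a sketch of the building block; the matrix, rank, and oversampling are assumed values:

```python
import numpy as np

rng = np.random.default_rng(2)

def randomized_svd(A, rank, oversample=10):
    """Randomized SVD sketch: probe A with a random test matrix, orthonormalize
    the result to approximate A's range, then take an exact SVD of the small
    projected matrix."""
    m, n = A.shape
    Omega = rng.normal(size=(n, rank + oversample))  # random test matrix
    Q, _ = np.linalg.qr(A @ Omega)                   # range approximation
    B = Q.T @ A                                      # small (rank+p) x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return Q @ Ub, s, Vt

# A covariance-like matrix with rapidly decaying spectrum (exponential model)
n = 300
x = np.linspace(0, 1, n)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.2)
U, s, Vt = randomized_svd(C, rank=20)
err = np.linalg.norm(C - U @ np.diag(s) @ Vt) / np.linalg.norm(C)
print(err)
```

    The cost is dominated by a few matrix-matrix products with a thin random matrix, which is what makes such methods attractive for large numbers of unknowns.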

  4. A general parallelization strategy for random path based geostatistical simulation methods

    NASA Astrophysics Data System (ADS)

    Mariethoz, Grégoire

    2010-07-01

    The size of simulation grids used for numerical models has increased by many orders of magnitude in the past years, and this trend is likely to continue. Efficient pixel-based geostatistical simulation algorithms have been developed, but for very large grids and complex spatial models the computational burden remains heavy. As cluster computers become widely available, using parallel strategies is a natural step for increasing the usable grid size and the complexity of the models. These strategies must profit from the possibilities offered by machines with a large number of processors. On such machines, the bottleneck is often the communication time between processors. We present a strategy that distributes grid nodes among all available processors while minimizing communication and latency times. It consists of centralizing the simulation on a master processor that calls the other (slave) processors, as if they were functions, to simulate one node at a time. The key is to decouple the sending and receiving operations to avoid synchronization. Centralization allows a conflict-management system that ensures that nodes being simulated simultaneously do not interfere in terms of neighborhood. The strategy is computationally efficient and versatile enough to be applicable to all random-path-based simulation methods.
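
    The master/worker pattern with neighbourhood conflict management described above can be sketched in a few lines. Here threads stand in for processors, and a toy average-plus-noise draw stands in for the kriging-based simulation of a node; the grid size and neighbourhood radius are assumptions. The master dispatches only nodes whose neighbourhoods do not overlap any in-flight node, and receiving is decoupled from sending:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

rng = np.random.default_rng(3)

GRID, NEIGH = 20, 2                        # grid side, neighbourhood radius (assumed)
path = list(rng.permutation(GRID * GRID))  # random simulation path over all nodes
grid = np.full((GRID, GRID), np.nan)

def simulate_node(node):
    """Worker ('slave') task: simulate one node from its already-simulated
    neighbours -- a toy stand-in for a kriging-based conditional draw."""
    i, j = divmod(node, GRID)
    neigh = grid[max(0, i - NEIGH):i + NEIGH + 1, max(0, j - NEIGH):j + NEIGH + 1]
    known = neigh[~np.isnan(neigh)]
    mean = known.mean() if known.size else 0.0
    return node, mean + rng.normal(0, 0.1)

def conflicts(a, b):
    ia, ja = divmod(a, GRID)
    ib, jb = divmod(b, GRID)
    return max(abs(ia - ib), abs(ja - jb)) <= 2 * NEIGH

in_flight = {}                             # node -> Future: the conflict ledger
with ThreadPoolExecutor(max_workers=4) as pool:
    while path or in_flight:
        # master dispatches every pending node whose neighbourhood is free
        for node in list(path):
            if not any(conflicts(node, busy) for busy in in_flight):
                path.remove(node)
                in_flight[node] = pool.submit(simulate_node, node)
        # receiving is decoupled from sending: collect one finished node
        done_node, value = next(iter(in_flight.values())).result()
        grid[divmod(done_node, GRID)] = value
        del in_flight[done_node]
```

    Because two in-flight nodes are always more than one neighbourhood apart, a write to a finished node can never fall inside another worker's active search window, which is the essence of the conflict-management idea.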

  5. Geostatistics as a validation tool for setting ozone standards for durum wheat.

    PubMed

    De Marco, Alessandra; Screpanti, Augusto; Paoletti, Elena

    2010-02-01

    Which is the best standard for protecting plants from ozone? To answer this question, we must validate the standards by testing biological responses against ambient data in the field. Such a validation is missing for the European and USA standards, because the networks for ozone, meteorology and plant responses are spatially independent. We proposed geostatistics as a validation tool, and used durum wheat in central Italy as a test. The standards summarized ozone impact on yield better than hourly averages did. Although USA criteria explained ozone-induced yield losses better than European criteria, the USA legal level (75 ppb) protected only 39% of sites. European exposure-based standards protected ≥90%. Reducing the USA level to the Canadian 65 ppb or using W126 protected 91% and 97% of sites, respectively. For a no-threshold accumulated stomatal flux, 22 mmol m^-2 was suggested to protect 97% of sites. In a multiple regression, precipitation explained 22% and ozone explained <0.9% of yield variability. Copyright (c) 2009 Elsevier Ltd. All rights reserved.
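
    The W126 exposure index mentioned in this record is a sigmoidally weighted cumulative sum of hourly ozone concentrations. A sketch of the weighting, on a hypothetical ozone series (the series itself and its length are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)

def w126(hourly_ppm):
    """W126 cumulative exposure index (sketch): each hourly concentration C, in
    ppm, is weighted by w = 1 / (1 + 4403 * exp(-126 * C)), so high
    concentrations count almost fully while low ones contribute little."""
    c = np.asarray(hourly_ppm, dtype=float)
    w = 1.0 / (1.0 + 4403.0 * np.exp(-126.0 * c))
    return float(np.sum(c * w))

# Hypothetical daytime (12 h/day) ozone series over a 90-day period, in ppm
hourly = rng.uniform(0.02, 0.09, size=90 * 12)
print(round(w126(hourly), 1))

# The sigmoidal weight passes through ~0.5 near 65-70 ppb
half = 1.0 / (1.0 + 4403.0 * np.exp(-126.0 * 0.067))
```

    This threshold-free weighting is why W126 behaves differently from a fixed 75 ppb exceedance criterion when ranking sites.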

  6. GEOSTATISTICAL SAMPLING DESIGNS FOR HAZARDOUS WASTE SITES

    EPA Science Inventory

    This chapter discusses field sampling design for environmental sites and hazardous waste sites with respect to random variable sampling theory, Gy's sampling theory, and geostatistical (kriging) sampling theory. The literature often presents these sampling methods as an adversari...

  8. Geostatistics and petroleum geology

    SciTech Connect

    Hohn, M.E.

    1988-01-01

    This book examines the purpose and use of geostatistics in exploration and development of oil and gas, with an emphasis on appropriate and pertinent case studies. It presents an overview of geostatistics. Topics covered include: The semivariogram; Linear estimation; Multivariate geostatistics; Nonlinear estimation; From indicator variables to nonparametric estimation; and More detail, less certainty: conditional simulation.

  9. Description and validation of a new set of object-based temporal geostatistical features for land-use/land-cover change detection

    NASA Astrophysics Data System (ADS)

    Gil-Yepes, Jose L.; Ruiz, Luis A.; Recio, Jorge A.; Balaguer-Beser, Ángel; Hermosilla, Txomin

    2016-11-01

    A new set of temporal features derived from codispersion and cross-semivariogram geostatistical functions is proposed, described, extracted, and evaluated for object-based land-use/land-cover change detection using high-resolution images. Five features were extracted from the codispersion function and another six from the cross-semivariogram. The set of features describes the temporal behaviour of the internal structure of the image objects defined in a cadastral database. The extracted features were combined with spectral information, and a feature selection study was performed using forward stepwise discriminant analysis, principal component analysis, and correlation and feature interpretation analysis. The temporal feature set was validated using high-resolution aerial images from an agricultural area located in south-east Spain, in order to solve a tree crop change detection problem. Direct classification using a decision tree classifier was used as the change detection method. Different classifications were performed comparing various feature group combinations in order to obtain the most suitable features for this study. Results showed that the new sets of cross-semivariogram and codispersion features provided high global classification accuracy (83.55% and 85.71%, respectively), showing high potential for detecting changes related to the internal structure of agricultural tree crop parcels. A significant increase in accuracy was obtained when combining features from both groups with spectral information (94.59%).

  10. Excursion set theory for correlated random walks

    NASA Astrophysics Data System (ADS)

    Farahi, Arya; Benson, Andrew J.

    2013-08-01

    We present a new method to compute the first-crossing distribution in excursion set theory for the case of correlated random walks. We use a combination of the path-integral formalism of Maggiore & Riotto and the integral-equation solutions of Zhang & Hui and of Benson et al. to obtain a numerically convenient algorithm for the first-crossing distribution. We apply this methodology to the specific case of a Gaussian random density field filtered with a Gaussian smoothing function. By comparing our solutions to results from Monte Carlo calculations of the first-crossing distribution, we demonstrate that the approximation is in good agreement with the exact solution for certain barriers and at large masses. Our approach is quite general and can be adapted to other smoothing functions and barrier functions, and also to non-Gaussian density fields.
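
    The Monte Carlo benchmark this record compares against is easy to sketch in the simplest (uncorrelated, Markovian) case, where the first-crossing probability has a closed form via the reflection principle. Walk count, step size, and barrier value are assumptions chosen for illustration:

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(5)

# Monte Carlo first crossing for Markovian excursion-set walks: each walk gains
# an independent Gaussian increment as the variance S grows, and we ask whether
# it crosses a constant barrier delta_c before S reaches S_max.
delta_c = 1.686                      # constant (spherical-collapse) barrier
n_walks, n_steps, dS = 10000, 400, 0.01

steps = rng.normal(0.0, sqrt(dS), size=(n_walks, n_steps))
walks = np.cumsum(steps, axis=1)
crossed = (walks >= delta_c).any(axis=1)

# Fraction that crossed before S_max, versus the exact Markovian result
S_max = n_steps * dS
frac_mc = float(np.mean(crossed))
frac_exact = erfc(delta_c / sqrt(2.0 * S_max))
print(round(frac_mc, 3), round(frac_exact, 3))
```

    For correlated walks (e.g. Gaussian smoothing instead of sharp-k filtering) the increments are no longer independent, which is exactly the regime where the path-integral and integral-equation machinery of the paper is needed.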

  11. Imprecise (fuzzy) information in geostatistics

    SciTech Connect

    Bardossy, A.; Bogardi, I.; Kelly, W.E.

    1988-05-01

    A methodology based on fuzzy set theory for the utilization of imprecise data in geostatistics is presented. A common problem preventing a broader use of geostatistics has been the insufficient amount of accurate measurement data. In certain cases, additional but uncertain (soft) information is available; it can be encoded as subjective probabilities, and the soft kriging method can then be applied (Journel, 1986). In other cases, a fuzzy encoding of soft information may be more realistic and simplify the numerical calculations. Imprecise (fuzzy) spatial information on the possible variogram is integrated into a single variogram, which is used in a fuzzy kriging procedure. The overall uncertainty of prediction is represented by the estimation variance and the calculated membership function for each kriged point. The methodology is applied to the permeability prediction of a soil liner for hazardous waste containment. The available number of hard measurement data (20) was not enough for a classical geostatistical analysis. An additional 20 soft data points made it possible to prepare kriged contour maps using the fuzzy geostatistical procedure.

  12. Application of geostatistics to risk assessment.

    PubMed

    Thayer, William C; Griffith, Daniel A; Goodrum, Philip E; Diamond, Gary L; Hassett, James M

    2003-10-01

    Geostatistics offers two fundamental contributions to environmental contaminant exposure assessment: (1) a group of methods to quantitatively describe the spatial distribution of a pollutant and (2) the ability to improve estimates of the exposure point concentration by exploiting the geospatial information present in the data. The second contribution is particularly valuable when exposure estimates must be derived from small data sets, which is often the case in environmental risk assessment. This article addresses two topics related to the use of geostatistics in human and ecological risk assessments performed at hazardous waste sites: (1) the importance of assessing model assumptions when using geostatistics and (2) the use of geostatistics to improve estimates of the exposure point concentration (EPC) in the limited data scenario. The latter topic is approached here by comparing design-based estimators that are familiar to environmental risk assessors (e.g., Land's method) with geostatistics, a model-based estimator. In this report, we summarize the basics of spatial weighting of sample data, kriging, and geostatistical simulation. We then explore the two topics identified above in a case study, using soil lead concentration data from a Superfund site (a skeet and trap range). We also describe several areas where research is needed to advance the use of geostatistics in environmental risk assessment.
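
    The "spatial weighting of sample data" this record summarizes can be illustrated with cell declustering, a simple precursor to kriging-based EPC estimation: when sampling is clustered on a hot spot, the naive arithmetic mean over-weights the hot spot, and spatial re-weighting corrects this. The site layout, concentrations, and cell size below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(6)

# Clustered sampling of a hypothetical site: dense samples in a hot spot
# (x < 20), sparse samples in the low-concentration background.
hot = rng.uniform(0, 20, size=(60, 2))
bg = rng.uniform(0, 100, size=(40, 2))
xy = np.vstack([hot, bg])
z = np.where(xy[:, 0] < 20, 400.0, 50.0) + rng.normal(0, 10, len(xy))

def decluster_mean(xy, z, cell=20.0):
    """Cell-declustered mean: each sample is down-weighted by the number of
    samples sharing its grid cell, so densely sampled areas do not dominate."""
    keys = [tuple(c) for c in np.floor(xy / cell).astype(int)]
    counts = {k: keys.count(k) for k in set(keys)}
    w = np.array([1.0 / counts[k] for k in keys])
    return float(np.sum(w * z) / np.sum(w))

naive = float(z.mean())
declustered = decluster_mean(xy, z)
print(round(naive, 1), round(declustered, 1))
```

    The declustered estimate falls well below the naive mean and much closer to the area-weighted concentration, which is the behaviour that makes model-based (geostatistical) EPC estimators attractive for small, clustered data sets.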

  13. Scaling random walks on arbitrary sets

    NASA Astrophysics Data System (ADS)

    Harris, Simon C.; Williams, David; Sibson, Robin

    1999-01-01

    Let I be a countably infinite set of points in ℝ which we can write as I = {u_i : i ∈ ℤ}, with u_i < u_{i+1}. [...] A random walk, when repeatedly rescaled suitably in space and time, looks more and more like a Brownian motion. In this paper we explore the convergence properties of the Markov chain Y on the set I under suitable space-time scalings. Later, we consider some cases when the set I consists of the points of a renewal process and the jump rates assigned to each state in I are perhaps also randomly chosen. This work sprang from a question asked by one of us (Sibson) about 'driftless nearest-neighbour' Markov chains on countable subsets I of ℝ^d, work of Sibson [7] and of Christ, Friedberg and Lee [2] having identified examples of such chains in terms of the Dirichlet tessellation associated with I. Amongst the methods which can be brought to bear on this d-dimensional problem is the theory of Dirichlet forms. There are potential problems in doing this because we wish I to be random (for example, a realization of a Poisson point process), and we do not wish to impose artificial boundedness conditions which would clearly make things work for certain deterministic sets I. In the 1-dimensional case discussed here and in the following paper by Harris, much simpler techniques (where we embed the Markov chain in a Brownian motion using local time) work very effectively; and it is these, rather than the theory of Dirichlet forms, that we use.

  14. Geostatistics and petroleum geology

    SciTech Connect

    Hohn, M.E.

    1988-01-01

    The book under review is designed as a practical guide to geostatistics, or kriging, for petroleum geologists. The author's aim is to explain geostatistics as a working tool for petroleum geologists through extensive use of case-study material, mostly drawn from his own research in gas potential evaluation in West Virginia. Theory and mathematics are pared down to immediate needs.

  15. Geostatistical model to estimate in stream pollutant loads and concentrations.

    NASA Astrophysics Data System (ADS)

    Polus, E.; Flipo, N.; de Fouquet, C.; Poulin, M.

    2009-04-01

    Models that estimate loads and concentrations of pollutants in streams can roughly be classified into two categories: physically-based and stochastic models. While the first tend to reproduce the physical processes that occur in streams, stochastic models consider loads and concentrations as random variables. This work is interested in such models, and particularly in geostatistical models, which provide an estimate of loads and concentrations together with a joint measure of uncertainty: the estimation variance. Along a stream network, which can be modelled as a graph, most usual geostatistical covariance or variogram models are no longer valid. Based on recent models for tree graphs, we present a covariance or variogram construction combining one-dimensional Random Functions (RF) defined on each path between the sources and the outlet. The model properties are examined, namely the consistency conditions at confluences for different variables. In practice, the scarcity of spatial data makes a precise inference of covariances difficult. Can a phenomenological model then be used to guide the geostatistical modelling? To answer this question, the example of a portion of the Seine River (France) is examined, where both measurement data and the outputs of the physically-based model ProSe are used. The comparison between the two data sets shows an excellent agreement for discharges and a consistent one for nitrate concentrations. Nevertheless, a detailed exploratory analysis highlights the importance of the boundary conditions, which are not consistent with the downstream measurements. The agreement between data and modelled values can be improved thanks to a reconstruction of consistent boundary conditions by cokriging. This is an example of the usefulness of using physically-based models and geostatistics jointly. The next step is a joint modelling of discharges, loads and concentrations along the stream network. This modelling should improve the

  16. Fractal and geostatistical methods for modeling of a fracture network

    SciTech Connect

    Chiles, J.P.

    1988-08-01

    The modeling of fracture networks is useful for fluid flow and rock mechanics studies. About 6600 fracture traces were recorded on drifts of a uranium mine in a granite massif. The traces have an extension of 0.20-20 m. The network was studied by fractal and by geostatistical methods, but it can be considered neither a fractal with a constant dimension nor a set of purely randomly located fractures. Two kinds of generalization of conventional models can still provide more flexibility for the characterization of the network: (a) a nonscaling fractal model with variable similarity dimension (for a 2-D network of traces, the dimension varies from 2 at the 10-m scale to 1 at the centimeter scale); (b) a parent-daughter model with a regionalized density. The geostatistical study allows a 3-D model to be established in which fractures are assumed to be discs, fractures are grouped in clusters or swarms, and fracture density is regionalized (with two ranges, at about 30 and 300 m). The fractal model is easy to fit and to simulate along a line, but 2-D and 3-D simulations are more difficult. The geostatistical model is more complex, but easy to simulate, even in 3-D.
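
    The parent-daughter (clustered) model described above can be sketched as a Neyman-Scott-style simulation: Poisson parent points each spawn a swarm of daughter fracture centres, and each centre becomes a trace with a random orientation and a heavy-tailed length. All rates, spreads, and length bounds below are illustrative assumptions, loosely echoing the 0.20-20 m trace extents in the record:

```python
import numpy as np

rng = np.random.default_rng(7)

# Parent-daughter sketch of a 2-D fracture trace network.
n_parents = rng.poisson(15)
parents = rng.uniform(0, 300, size=(n_parents, 2))

fractures = []
for p in parents:
    n_d = rng.poisson(20)                            # swarm size
    centres = p + rng.normal(0, 15, size=(n_d, 2))   # cluster spread ~15 m
    lengths = np.clip(rng.pareto(1.5, n_d) * 0.2, 0.2, 20.0)  # traces 0.2-20 m
    angles = rng.uniform(0, np.pi, n_d)
    for c, L, a in zip(centres, lengths, angles):
        d = 0.5 * L * np.array([np.cos(a), np.sin(a)])
        fractures.append((c - d, c + d))             # trace endpoints

lengths_all = [float(np.linalg.norm(b - a)) for a, b in fractures]
print(len(fractures), round(float(np.median(lengths_all)), 2))
```

    Making the parent density itself a regionalized variable (rather than constant) is what turns this toy into the regionalized-density model of the record.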

  17. Geostatistics and spatial analysis in biological anthropology.

    PubMed

    Relethford, John H

    2008-05-01

    A variety of methods have been used to make evolutionary inferences based on the spatial distribution of biological data, including reconstruction of population history and detection of the geographic pattern of natural selection. This article provides an examination of geostatistical analysis, a method used widely in geology but not often applied in biological anthropology. Geostatistical analysis begins with the examination of a variogram, a plot showing the relationship between a biological distance measure and the geographic distance between data points, which provides information on the extent and pattern of spatial correlation. The results of variogram analysis are used for interpolating values at unsampled data points in order to construct a contour map, a process known as kriging. Geostatistical methods, and a discussion of potential problems, are applied to a large data set of anthropometric measures for 197 populations in Ireland. The geostatistical analysis reveals two major sources of spatial variation. One pattern, seen for overall body and craniofacial size, shows an east-west cline most likely reflecting the combined effects of past population dispersal and settlement. The second pattern, seen for craniofacial height, shows an isolation-by-distance pattern reflecting rapid spatial changes in the midlands region of Ireland, perhaps attributable to the genetic impact of the Vikings. The correspondence of these results with other analyses of these data, and the additional insights generated from variogram analysis and kriging, illustrate the potential utility of geostatistical analysis in biological anthropology. (c) 2008 Wiley-Liss, Inc.
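
    The variogram-then-kriging workflow this record describes reduces, at a single unsampled location, to solving one small linear system. A minimal ordinary-kriging sketch, assuming a known exponential variogram (in a real study the variogram would first be fitted to the data; all parameters here are invented):

```python
import numpy as np

rng = np.random.default_rng(8)

def exp_variogram(h, nugget=0.1, sill=1.0, rang=30.0):
    """Exponential variogram model (assumed, not fitted)."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-np.asarray(h) / rang))

def ordinary_krige(xy, z, x0, gamma):
    """Solve the ordinary kriging system (variogram form, with a Lagrange
    multiplier enforcing weights that sum to 1) for one target location x0."""
    n = len(xy)
    H = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(H)
    np.fill_diagonal(A[:n, :n], 0.0)       # gamma(0) = 0 by definition
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = gamma(np.linalg.norm(xy - x0, axis=1))
    w = np.linalg.solve(A, b)
    return float(w[:n] @ z), float(w @ b)  # estimate, kriging variance

xy = rng.uniform(0, 100, size=(50, 2))
z = np.sin(xy[:, 0] / 20) + rng.normal(0, 0.1, 50)
est, kvar = ordinary_krige(xy, z, np.array([50.0, 50.0]), exp_variogram)
print(round(est, 2), round(kvar, 3))
```

    Solving this system on a grid of target points and contouring the estimates is exactly the kriged map construction the article applies to the Irish anthropometric data.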

  18. Geostatistical case studies

    SciTech Connect

    Matheron, G.; Armstrong, M.

    1987-01-01

    The objective of this volume of contributed chapters is to present a series of applications of geostatistics. These range from a careful variographic analysis of uranium data, through detailed studies of geologically complex deposits, right up to the latest nonlinear methods applied to deposits with highly skewed data distributions. Applications of new techniques, such as the external drift method for combining well data with seismic information, have also been included. The volume emphasizes geostatistics in practice. Notation has been kept to a minimum and mathematical details have been relegated to annexes.

  19. On statistical inference for the random set generated Cox process with set-marking.

    PubMed

    Penttinen, Antti; Niemi, Aki

    2007-04-01

    The Cox point process is a model class for hierarchical modelling of systems of non-interacting points in R^d under environmental heterogeneity, which is modelled through a random intensity function. In this work a class of Cox processes is suggested in which the random intensity is generated by a random closed set. Such heterogeneity appears, for example, in forestry, where silvicultural treatments like harvesting and site preparation create geometrical patterns of tree density variation in two different phases. In this paper the second-order property, important both in data analysis and in the context of spatial sampling, is derived. The usefulness of the random set generated Cox process is greatly increased if, for each point, it is observed whether or not it is included in the random set. This additional information is easy and economical to obtain in many cases and hence of practical value; it leads to marks for the points. The resulting random set marked Cox process is a marked point process in which the marks are intensity-dependent. The problem with set-marking is that the marks are not a representative sample from the random set. This paper derives the second-order property of the random set marked Cox process and suggests a practical estimation method for the area fraction and covariance of the random set and for the point densities within and outside the random set. A simulated example and a forestry example are given.
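
    A toy version of the construction is easy to simulate: take a random set (here a union of random discs, standing in for treated patches), use one intensity inside it and another outside, and record the set-marking for each point. The window, disc parameters, and intensities are all assumptions; the density estimators at the end mirror the quantities the paper estimates:

```python
import numpy as np

rng = np.random.default_rng(9)

W = 100.0
discs = np.c_[rng.uniform(0, W, (8, 2)), rng.uniform(8, 15, 8)]  # x, y, radius

def in_set(pts):
    d = np.linalg.norm(pts[:, None, :] - discs[None, :, :2], axis=-1)
    return (d <= discs[:, 2]).any(axis=1)

lam_in, lam_out = 0.08, 0.01           # intensities inside/outside (assumed)
# Simulate by thinning a dominating Poisson process of intensity lam_in
n_prop = rng.poisson(lam_in * W * W)
pts = rng.uniform(0, W, size=(n_prop, 2))
inside = in_set(pts)
keep = inside | (rng.uniform(size=n_prop) < lam_out / lam_in)
pts, marks = pts[keep], inside[keep]   # marks: the set-marking

# Estimate the set's area fraction by Monte Carlo, then the two densities
u = rng.uniform(0, W, size=(20000, 2))
p_in = float(in_set(u).mean())
dens_in = float(marks.sum() / (p_in * W * W))
dens_out = float((~marks).sum() / ((1.0 - p_in) * W * W))
print(round(p_in, 2), round(dens_in, 3), round(dens_out, 3))
```

    Note that the marks alone are indeed not a representative sample of the set: points over-sample the high-intensity phase, which is why the area fraction must be estimated separately.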

  20. Randomization methods in emergency setting trials: a descriptive review

    PubMed Central

    Moe‐Byrne, Thirimon; Oddie, Sam; McGuire, William

    2015-01-01

    Background Quasi‐randomization might expedite recruitment into trials in emergency care settings but may also introduce selection bias. Methods We searched the Cochrane Library and other databases for systematic reviews of interventions in emergency medicine or urgent care settings. We assessed selection bias (baseline imbalances) in prognostic indicators between treatment groups in trials using true randomization versus trials using quasi‐randomization. Results Seven reviews contained 16 trials that used true randomization and 11 that used quasi‐randomization. Baseline group imbalance was identified in four trials using true randomization (25%) and in two quasi‐randomized trials (18%). Of the four truly randomized trials with imbalance, three concealed treatment allocation adequately. Clinical heterogeneity and poor reporting limited the assessment of trial recruitment outcomes. Conclusions We did not find strong or consistent evidence that quasi‐randomization is associated with selection bias more often than true randomization. High risk of bias judgements for quasi‐randomized emergency studies should therefore not be assumed in systematic reviews. Clinical heterogeneity across trials within reviews, coupled with limited availability of relevant trial accrual data, meant it was not possible to adequately explore the possibility that true randomization might result in slower trial recruitment rates, or the recruitment of less representative populations. © 2015 The Authors. Research Synthesis Methods published by John Wiley & Sons, Ltd. PMID:26333419

  1. Multivariable geostatistics in S: the gstat package

    NASA Astrophysics Data System (ADS)

    Pebesma, Edzer J.

    2004-08-01

    This paper discusses advantages and shortcomings of the S environment for multivariable geostatistics, in particular when extended with the gstat package, an extension package for the S environments (R, S-Plus). The gstat S package provides multivariable geostatistical modelling, prediction and simulation, as well as several visualisation functions. In particular, it makes the calculation, simultaneous fitting, and visualisation of a large number of direct and cross (residual) variograms very easy. Gstat was started 10 years ago and was released under the GPL in 1996; gstat.org was started in 1998. Gstat was not initially written for teaching purposes but for research purposes, emphasising flexibility, scalability and portability. It can deal with a large number of practical issues in geostatistics, including change of support (block kriging), simple/ordinary/universal (co)kriging, fast local neighbourhood selection, flexible trend modelling, variables with different sampling configurations, efficient simulation of large spatially correlated random fields, indicator kriging and simulation, and (directional) variogram and cross-variogram modelling. The formula/models interface of the S language is used to define multivariable geostatistical models. This paper introduces the gstat S package and discusses a number of design and implementation issues. It also draws attention to a number of papers on the integration of spatial statistics software, GIS and the S environment that were presented at the spatial statistics workshop and sessions during the Distributed Statistical Computing 2003 conference.
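
    The direct and cross variograms that gstat fits simultaneously have a simple empirical definition: for two co-located variables the cross variogram is γ₁₂(h) = ½ E[(z₁(x) − z₁(x+h))(z₂(x) − z₂(x+h))]. A Python sketch on synthetic co-varying data (not gstat itself; the shared-signal model and lag bins are assumptions):

```python
import numpy as np

rng = np.random.default_rng(10)

# Two co-located variables sharing a spatial signal, plus independent noise
n = 300
xy = rng.uniform(0, 100, size=(n, 2))
common = np.sin(xy[:, 0] / 15) + np.cos(xy[:, 1] / 15)
z1 = common + rng.normal(0, 0.2, n)
z2 = 0.8 * common + rng.normal(0, 0.2, n)

def variograms(xy, z1, z2, lags, tol):
    """Columns: direct variogram of z1, direct variogram of z2, and the
    cross variogram, all binned by separation distance."""
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    d1 = z1[:, None] - z1[None, :]
    d2 = z2[:, None] - z2[None, :]
    iu = np.triu_indices(len(xy), k=1)
    d, d1, d2 = d[iu], d1[iu], d2[iu]
    out = []
    for h in lags:
        m = np.abs(d - h) < tol
        out.append((0.5 * (d1[m] ** 2).mean(),
                    0.5 * (d2[m] ** 2).mean(),
                    0.5 * (d1[m] * d2[m]).mean()))
    return np.array(out)

g = variograms(xy, z1, z2, lags=np.arange(5, 50, 5.0), tol=2.5)
print(g[0], g[-1])
```

    Fitting a coherent set of models to all three curves at once (a linear model of coregionalization) is the "simultaneous fitting" convenience the abstract highlights.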

  2. The use of a genetic algorithm-based search strategy in geostatistics: application to a set of anisotropic piezometric head data

    NASA Astrophysics Data System (ADS)

    Abedini, M. J.; Nasseri, M.; Burn, D. H.

    2012-04-01

    In any geostatistical study, an important consideration is the choice of an appropriate, repeatable, and objective search strategy that controls the nearby samples to be included in the location-specific estimation procedure. Almost all geostatistical software available in the market puts the onus on the user to supply search strategy parameters in a heuristic manner. These parameters are solely controlled by geographical coordinates that are defined for the entire area under study, and the user has no guidance as to how to choose these parameters. The main thesis of the current study is that the selection of search strategy parameters has to be driven by data—both the spatial coordinates and the sample values—and cannot be chosen beforehand. For this purpose, a genetic-algorithm-based ordinary kriging with moving neighborhood technique is proposed. The search capability of a genetic algorithm is exploited to search the feature space for appropriate, either local or global, search strategy parameters. Radius of circle/sphere and/or radii of standard or rotated ellipse/ellipsoid are considered as the decision variables to be optimized by GA. The superiority of GA-based ordinary kriging is demonstrated through application to the Wolfcamp Aquifer piezometric head data. Assessment of numerical results showed that definition of search strategy parameters based on both geographical coordinates and sample values improves cross-validation statistics when compared with that based on geographical coordinates alone. In the case of a variable search neighborhood for each estimation point, optimization of local search strategy parameters for an elliptical support domain—the orientation of which is dictated by anisotropic axes—via GA was able to capture the dynamics of piezometric head in west Texas/New Mexico in an efficient way.
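
    The core idea of the record above, letting the data choose the search-strategy parameters by optimizing cross-validation error with a genetic algorithm, can be sketched compactly. To keep it short, a single search radius is evolved and an inverse-distance estimator stands in for the moving-neighbourhood kriging estimator; the GA operators, population size, and data are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic spatial data set
xy = rng.uniform(0, 100, size=(80, 2))
z = np.sin(xy[:, 0] / 20) * np.cos(xy[:, 1] / 20) + rng.normal(0, 0.05, 80)

def loo_error(radius):
    """Leave-one-out CV error of an inverse-distance estimator restricted to a
    circular search neighbourhood of the given radius."""
    errs = []
    for i in range(len(xy)):
        d = np.linalg.norm(xy - xy[i], axis=1)
        m = (d < radius) & (d > 0)
        if not m.any():
            return np.inf                 # empty neighbourhood: infeasible
        w = 1.0 / d[m] ** 2
        errs.append((np.sum(w * z[m]) / np.sum(w) - z[i]) ** 2)
    return float(np.mean(errs))

# Minimal real-coded GA: tournament selection, blend crossover, mutation
pop = rng.uniform(5, 80, size=20)
for _ in range(30):
    fit = np.array([loo_error(r) for r in pop])
    new = []
    for _ in range(len(pop)):
        i, j = rng.integers(0, len(pop), 2)
        a = pop[i] if fit[i] < fit[j] else pop[j]
        k, l = rng.integers(0, len(pop), 2)
        b = pop[k] if fit[k] < fit[l] else pop[l]
        child = 0.5 * (a + b) + rng.normal(0, 2.0)
        new.append(float(np.clip(child, 2.0, 100.0)))
    pop = np.array(new)

best = pop[np.argmin([loo_error(r) for r in pop])]
print(round(float(best), 1), round(loo_error(float(best)), 4))
```

    The paper's version evolves richer parameterizations (rotated ellipse/ellipsoid radii and orientations) and scores candidates with kriging cross-validation, but the fitness-function structure is the same.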

  3. Applications of Geostatistics in Plant Nematology

    PubMed Central

    Wallace, M. K.; Hawkins, D. M.

    1994-01-01

    The application of geostatistics to plant nematology was made by evaluating soil and nematode data acquired from 200 soil samples collected from the Ap horizon of a reed canary-grass field in northern Minnesota. Geostatistical concepts relevant to nematology include semi-variogram modelling, kriging, and change of support calculations. Soil and nematode data generally followed a spherical semi-variogram model, with little random variability associated with soil data and large inherent variability for nematode data. Block kriging of soil and nematode data provided useful contour maps of the data. Change of support calculations indicated that most of the random variation in nematode data was due to short-range spatial variability in the nematode population densities. PMID:19279938

  4. Applications of geostatistics in plant nematology.

    PubMed

    Wallace, M K; Hawkins, D M

    1994-12-01

    The application of geostatistics to plant nematology was made by evaluating soil and nematode data acquired from 200 soil samples collected from the Ap horizon of a reed canary-grass field in northern Minnesota. Geostatistical concepts relevant to nematology include semi-variogram modelling, kriging, and change of support calculations. Soil and nematode data generally followed a spherical semi-variogram model, with little random variability associated with soil data and large inherent variability for nematode data. Block kriging of soil and nematode data provided useful contour maps of the data. Change of support calculations indicated that most of the random variation in nematode data was due to short-range spatial variability in the nematode population densities.
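
    The change-of-support effect invoked in the nematology records above, short-range variability averaging out when moving from point to block support, is easy to demonstrate numerically. A sketch on a synthetic field (grid size, block size, and noise level are assumptions):

```python
import numpy as np

rng = np.random.default_rng(12)

# Point-support values: a smooth spatial component plus short-range
# ("nugget-like") noise on a 64x64 grid.
n = 64
smooth = np.add.outer(np.sin(np.arange(n) / 8), np.cos(np.arange(n) / 8))
points = smooth + rng.normal(0, 1.0, (n, n))

# Block support: average over 8x8 blocks; short-range noise variance drops
# by roughly the number of points per block.
B = 8
blocks = points.reshape(n // B, B, n // B, B).mean(axis=(1, 3))

print(round(float(points.var()), 2), round(float(blocks.var()), 2))
```

    The drop from point-support to block-support variance is the quantitative content of the records' conclusion that most nematode-count variation was short-range.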

  5. Randomization Methods in Emergency Setting Trials: A Descriptive Review

    ERIC Educational Resources Information Center

    Corbett, Mark Stephen; Moe-Byrne, Thirimon; Oddie, Sam; McGuire, William

    2016-01-01

    Background: Quasi-randomization might expedite recruitment into trials in emergency care settings but may also introduce selection bias. Methods: We searched the Cochrane Library and other databases for systematic reviews of interventions in emergency medicine or urgent care settings. We assessed selection bias (baseline imbalances) in prognostic…

  7. Dissecting random and systematic differences between noisy composite data sets.

    PubMed

    Diederichs, Kay

    2017-04-01

    Composite data sets measured on different objects are usually affected by random errors, but may also be influenced by systematic (genuine) differences in the objects themselves, or the experimental conditions. If the individual measurements forming each data set are quantitative and approximately normally distributed, a correlation coefficient is often used to compare data sets. However, the relations between data sets are not obvious from the matrix of pairwise correlations since the numerical value of the correlation coefficient is lowered by both random and systematic differences between the data sets. This work presents a multidimensional scaling analysis of the pairwise correlation coefficients which places data sets into a unit sphere within low-dimensional space, at a position given by their CC* values [as defined by Karplus & Diederichs (2012), Science, 336, 1030-1033] in the radial direction and by their systematic differences in one or more angular directions. This dimensionality reduction can be used not only for classification purposes but also to derive data-set relations on a continuous scale. Projecting the arrangement of data sets onto the subspace spanned by systematic differences (the surface of a unit sphere) allows, irrespective of the random-error levels, the identification of clusters of closely related data sets. The method gains power with increasing numbers of data sets. It is illustrated with an example from low signal-to-noise ratio image processing, and an application in macromolecular crystallography is shown, but the approach is completely general and thus should be widely applicable.
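The dimensionality-reduction idea can be illustrated with classical multidimensional scaling on a matrix of pairwise correlation coefficients. The distance transform and the example matrix below are common illustrative choices, not the paper's exact CC*-based construction:

```python
import numpy as np

def classical_mds(corr, ndim=2):
    """Classical MDS of a pairwise correlation matrix. Distances are taken
    as sqrt(2 * (1 - r)), one common choice for correlation data."""
    d2 = 2.0 * (1.0 - corr)            # squared distances between data sets
    n = d2.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n
    b = -0.5 * j @ d2 @ j              # double-centred Gram matrix
    w, v = np.linalg.eigh(b)
    idx = np.argsort(w)[::-1][:ndim]   # keep the largest eigenvalues
    return v[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Three data sets: 1 and 2 closely related, 3 systematically different.
corr = np.array([[1.0, 0.95, 0.2],
                 [0.95, 1.0, 0.2],
                 [0.2, 0.2, 1.0]])
coords = classical_mds(corr)
```

In the embedding, the two highly correlated data sets land close together while the systematically different one is placed far away, mirroring the clustering behaviour described in the abstract.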

  8. Robust geostatistical analysis of spatial data

    NASA Astrophysics Data System (ADS)

    Papritz, Andreas; Künsch, Hans Rudolf; Schwierz, Cornelia; Stahel, Werner A.

    2013-04-01

    Most geostatistical software tools rely on non-robust algorithms. This is unfortunate, because outlying observations are the rule rather than the exception, in particular in environmental data sets. Outliers affect the modelling of the large-scale spatial trend, the estimation of the spatial dependence of the residual variation and the predictions by kriging. Identifying outliers manually is cumbersome and requires expertise because one needs parameter estimates to decide which observation is a potential outlier. Moreover, inference after the rejection of some observations is problematic. A better approach is to use robust algorithms that automatically prevent outlying observations from having undue influence. Former studies on robust geostatistics focused on robust estimation of the sample variogram and ordinary kriging without external drift. Furthermore, Richardson and Welsh (1995) proposed a robustified version of (restricted) maximum likelihood ([RE]ML) estimation for the variance components of a linear mixed model, which was later used by Marchant and Lark (2007) for robust REML estimation of the variogram. We propose here a novel method for robust REML estimation of the variogram of a Gaussian random field that is possibly contaminated by independent errors from a long-tailed distribution. It is based on robustification of estimating equations for the Gaussian REML estimation (Welsh and Richardson, 1997). Besides robust estimates of the parameters of the external drift and of the variogram, the method also provides standard errors for the estimated parameters, robustified kriging predictions at both sampled and non-sampled locations, and kriging variances. Apart from presenting our modelling framework, we shall present selected simulation results by which we explored the properties of the new method. This will be complemented by an analysis of a data set on heavy metal contamination of the soil in the vicinity of a metal smelter. Marchant, B.P. and Lark, R

  9. Finitely approximable random sets and their evolution via differential equations

    NASA Astrophysics Data System (ADS)

    Ananyev, B. I.

    2016-12-01

    In this paper, random closed sets (RCS) in Euclidean space are considered along with their distributions and approximation. Distributions of RCS may be used for the calculation of expectation and other characteristics. Reachable sets from given initial data, and ways of approximately describing their evolution, are investigated for stochastic differential equations (SDE) with initial state in some RCS. The Markov property of random reachable sets is proved in the space of closed sets. For approximate calculation, the initial RCS is replaced by a finite set on an integer multidimensional grid, and a multistage Markov chain is substituted for the SDE. The Markov chain is constructed by methods of SDE numerical integration. Some examples are also given.
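The approximation strategy (a finite initial set, with a numerically integrated chain standing in for the SDE) can be sketched roughly as below. The drift, diffusion, and initial set are invented for the example and use the simple Euler-Maruyama scheme, one standard method of SDE numerical integration:

```python
import math
import random

def euler_maruyama_reach(initial_set, drift, diffusion, dt, steps, n_paths=200, seed=0):
    """Approximate the reachable set of dX = drift(X) dt + diffusion(X) dW,
    started from each point of a finite initial set, by simulating sample
    paths and recording the extremes of the terminal states."""
    gen = random.Random(seed)
    finals = []
    for x0 in initial_set:
        for _ in range(n_paths):
            x = x0
            for _ in range(steps):
                x += drift(x) * dt + diffusion(x) * math.sqrt(dt) * gen.gauss(0.0, 1.0)
            finals.append(x)
    return min(finals), max(finals)

# Mean-reverting dynamics started from the two-point set {0, 1}.
lo, hi = euler_maruyama_reach([0.0, 1.0],
                              drift=lambda x: -x,
                              diffusion=lambda x: 0.1,
                              dt=0.01, steps=100)
```

The interval [lo, hi] is a crude one-dimensional summary of the reachable set at the final time; the paper's construction works with closed sets in higher dimensions.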

  10. The non-integer operation associated to random variation sets of the self-similar set

    NASA Astrophysics Data System (ADS)

    Ren, Fu-Yao; Liang, Jin-Rong

    2000-10-01

    When memory sets are random variation sets of the self-similar set and the total number of remaining states in each stage of the division of this set is normalized to unity, the corresponding flux and fractional integral are “robust” and stable under some conditions. This answers an open problem proposed by Alain Le Méhauté et al. in their book entitled Irréversibilité Temporelle et Géométrie Fractale.

  11. Unsupervised gene set testing based on random matrix theory.

    PubMed

    Frost, H Robert; Amos, Christopher I

    2016-11-04

    Gene set testing, or pathway analysis, is a bioinformatics technique that performs statistical testing on biologically meaningful sets of genomic variables. Although originally developed for supervised analyses, i.e., to test the association between gene sets and an outcome variable, gene set testing also has important unsupervised applications, e.g., p-value weighting. For unsupervised testing, however, few effective gene set testing methods are available, and support is especially poor for several biologically relevant use cases. In this paper, we describe two new unsupervised gene set testing methods based on random matrix theory, the Marc̆enko-Pastur Distribution Test (MPDT) and the Tracy-Widom Test (TWT), that support both self-contained and competitive null hypotheses. For the self-contained case, we contrast our proposed tests with the classic multivariate test based on a modified likelihood ratio criterion. For the competitive case, we compare the new tests against a competitive version of the classic test and our recently developed Spectral Gene Set Enrichment (SGSE) method. Evaluation of the TWT and MPDT methods is based on both simulation studies and a weighted p-value analysis of two real gene expression data sets using gene sets drawn from MSigDB collections. The MPDT and TWT methods are novel and effective tools for unsupervised gene set analysis with superior statistical performance relative to existing techniques and the ability to generate biologically important results on real genomic data sets.
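The Marchenko-Pastur idea behind such tests can be illustrated with a toy check of the largest sample-covariance eigenvalue against the MP upper edge. This is a simplified stand-in for the intuition, not the authors' actual MPDT or TWT statistics:

```python
import numpy as np

def mp_edge_test(x):
    """Return the top eigenvalue of the sample covariance of standardized
    data together with the Marchenko-Pastur upper edge (1 + sqrt(p/n))^2.
    A top eigenvalue well above the edge suggests structure beyond noise."""
    n, p = x.shape
    x = (x - x.mean(0)) / x.std(0)
    eigvals = np.linalg.eigvalsh(np.cov(x, rowvar=False))
    edge = (1.0 + np.sqrt(p / n)) ** 2
    return eigvals.max(), edge

rng = np.random.default_rng(0)
noise = rng.standard_normal((500, 20))        # independent genes: no signal
shared = rng.standard_normal((500, 1))
correlated = 0.5 * noise + shared             # strong shared factor
top_noise, edge = mp_edge_test(noise)
top_corr, _ = mp_edge_test(correlated)
```

For the pure-noise set the top eigenvalue sits near the MP edge, while the shared-factor set pushes far above it, which is the kind of departure from random-matrix behaviour these tests detect.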

  12. Geostatistical applications in environmental remediation

    SciTech Connect

    Stewart, R.N.; Purucker, S.T.; Lyon, B.F.

    1995-02-01

    Geostatistical analysis refers to a collection of statistical methods for addressing data that vary in space. By incorporating spatial information into the analysis, geostatistics has advantages over traditional statistical analysis for problems with a spatial context. Geostatistics has a history of success in earth science applications, and its popularity is increasing in other areas, including environmental remediation. Due to recent advances in computer technology, geostatistical algorithms can be executed at a speed comparable to many standard statistical software packages. When used responsibly, geostatistics is a systematic and defensible tool that can be used in various decision frameworks, such as the Data Quality Objectives (DQO) process. At every point in the site, geostatistics can estimate both the concentration level and the probability or risk of exceeding a given value. Using these probability maps can assist in identifying clean-up zones. Given any decision threshold and an acceptable level of risk, the probability maps identify those areas that are estimated to be above or below the acceptable risk. Those areas that are above the threshold are of the most concern with regard to remediation. In addition to estimating clean-up zones, geostatistics can assist in designing cost-effective secondary sampling schemes. Those areas of the probability map with high levels of estimated uncertainty are areas where more secondary sampling should occur. In addition, geostatistics has the ability to incorporate soft data directly into the analysis. These data include historical records, a highly correlated secondary contaminant, or expert judgment. In environmental remediation, geostatistics is a tool that, used in conjunction with other methods, can provide a common forum for building consensus.
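The probability maps described above combine, at each location, the kriging estimate and the kriging variance into an exceedance probability. A minimal sketch under a Gaussian error assumption follows; the concentration values and threshold are invented:

```python
import math

def exceedance_probability(kriged_mean, kriging_variance, threshold):
    """P(true value > threshold), assuming the kriging error at the
    location is Gaussian with the given mean and variance."""
    sd = math.sqrt(kriging_variance)
    z = (threshold - kriged_mean) / sd
    # Survival function of the standard normal, via erfc.
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# A location estimated at 80 ppm with kriging variance 100 (sd 10 ppm),
# checked against a hypothetical 100 ppm action level:
p_exceed = exceedance_probability(80.0, 100.0, 100.0)
```

Locations where this probability exceeds the acceptable risk level would be flagged for remediation or secondary sampling, as the abstract describes.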

  13. Randomized trial of infusion set function: steel versus teflon.

    PubMed

    Patel, Parul J; Benasi, Kari; Ferrari, Gina; Evans, Mark G; Shanmugham, Satya; Wilson, Darrell M; Buckingham, Bruce A

    2014-01-01

    This study compared infusion set function for up to 1 week using either a Teflon(®) (Dupont(™), Wilmington, DE) catheter or a steel catheter for insulin pump therapy in type 1 diabetes mellitus. Twenty subjects participating in a randomized, open-labeled, crossover study were asked to wear two Quick-Set(®) and two Sure-T(®) infusion sets (both from Medtronic Minimed, Northridge, CA) until the infusion set failed or was worn for 1 week. All subjects wore a MiniMed continuous glucose monitoring system for the duration of the study. One subject withdrew from the study. There were 38 weeks of Sure-T wear and 39 weeks of Quick-Set wear with no difference in the survival curves of the infusion sets. There was, however, a 15% initial failure rate with the Teflon infusion set. After 7 days, both types of infusion sets had a 64% failure rate. Overall, 30% failed because of hyperglycemia and a failed correction dose, 13% were removed for pain, 10% were pulled out by accident, 10% had erythema and/or induration of >10 mm, 5% fell out because of loss of adhesion, and 4% were removed for infection. The main predictor of length of wear was the individual subject. There was no increase in hyperglycemia or daily insulin requirements when an infusion set was successfully used for 7 days (n=25 of 77 weeks). We found no difference between steel and Teflon infusion sets in their function over 7 days, although 15% of Teflon sets failed because of kinking on insertion. The strongest predictor of prolonged 7-day infusion set function was the individual subject, not the type of infusion set.

  14. Randomized Trial of Infusion Set Function: Steel Versus Teflon

    PubMed Central

    Patel, Parul J.; Benasi, Kari; Ferrari, Gina; Evans, Mark G.; Shanmugham, Satya; Wilson, Darrell M.

    2014-01-01

    Abstract Background: This study compared infusion set function for up to 1 week using either a Teflon® (Dupont™, Wilmington, DE) catheter or a steel catheter for insulin pump therapy in type 1 diabetes mellitus. Subjects and Methods: Twenty subjects participating in a randomized, open-labeled, crossover study were asked to wear two Quick-Set® and two Sure-T® infusion sets (both from Medtronic Minimed, Northridge, CA) until the infusion set failed or was worn for 1 week. All subjects wore a MiniMed continuous glucose monitoring system for the duration of the study. Results: One subject withdrew from the study. There were 38 weeks of Sure-T wear and 39 weeks of Quick-Set wear with no difference in the survival curves of the infusion sets. There was, however, a 15% initial failure rate with the Teflon infusion set. After 7 days, both types of infusion sets had a 64% failure rate. Overall, 30% failed because of hyperglycemia and a failed correction dose, 13% were removed for pain, 10% were pulled out by accident, 10% had erythema and/or induration of >10 mm, 5% fell out because of loss of adhesion, and 4% were removed for infection. The main predictor of length of wear was the individual subject. There was no increase in hyperglycemia or daily insulin requirements when an infusion set was successfully used for 7 days (n=25 of 77 weeks). Conclusions: We found no difference between steel and Teflon infusion sets in their function over 7 days, although 15% of Teflon sets failed because of kinking on insertion. The strongest predictor of prolonged 7-day infusion set function was the individual subject, not the type of infusion set. PMID:24090124

  15. Geostatistical analysis as applied to two environmental radiometric time series.

    PubMed

    Dowdall, Mark; Lind, Bjørn; Gerland, Sebastian; Rudjord, Anne Liv

    2003-03-01

    This article details the results of an investigation into the application of geostatistical data analysis to two environmental radiometric time series. The data series employed consist of 99Tc values for seaweed (Fucus vesiculosus) and seawater samples taken as part of a marine monitoring program conducted on the coast of northern Norway by the Norwegian Radiation Protection Authority. Geostatistical methods were selected in order to provide information on values of the variables at unsampled times and to investigate the temporal correlation exhibited by the data sets. This information is of use in the optimisation of future sampling schemes and for providing information on the temporal behaviour of the variables in question that may not be obtained during a cursory analysis. The results indicate a high degree of temporal correlation within the data sets, the correlation for the seawater and seaweed data being modelled with an exponential and linear function, respectively. The semi-variogram for the seawater data indicates a temporal range of correlation of approximately 395 days with no apparent random component to the overall variance structure and was described best by an exponential function. The temporal structure of the seaweed data was best modelled by a linear function with a small nugget component. Evidence of drift was present in both semi-variograms. Interpolation of the data sets using the fitted models and a simple kriging procedure were compared, using a cross-validation procedure, with simple linear interpolation. Results of this exercise indicate that, for the seawater data, the kriging procedure outperformed the simple interpolation with respect to error distribution and correlation of estimates with actual values. Using the unbounded linear model with the seaweed data produced estimates that were only marginally better than those produced by the simple interpolation.
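The kriging interpolation compared against simple linear interpolation in the abstract can be sketched in one dimension. The exponential semivariogram and the observation times and values below are illustrative only:

```python
import numpy as np

def ordinary_kriging(t_obs, z_obs, t_new, gamma):
    """1-D ordinary kriging estimate at time t_new, given observations
    z_obs at times t_obs and a semivariogram function `gamma`."""
    n = len(t_obs)
    # Kriging system: semivariances between observations, bordered by the
    # Lagrange row/column that forces the weights to sum to 1.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(np.abs(t_obs[:, None] - t_obs[None, :]))
    A[n, n] = 0.0
    rhs = np.ones(n + 1)
    rhs[:n] = gamma(np.abs(t_obs - t_new))
    w = np.linalg.solve(A, rhs)[:n]
    return float(w @ z_obs)

exp_gamma = lambda h: 1.0 - np.exp(-h / 100.0)   # exponential model, no nugget
t = np.array([0.0, 50.0, 200.0, 300.0])          # observation times (days)
z = np.array([2.0, 2.5, 4.0, 3.5])               # observed activity values
z_hat = ordinary_kriging(t, z, 150.0, exp_gamma)
```

With no nugget, kriging reproduces the data exactly at observed times; between observations it weights neighbours according to the fitted temporal correlation rather than by distance alone, which is where it can outperform simple interpolation.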

  16. An application of geostatistics and fractal geometry for reservoir characterization

    SciTech Connect

    Aasum, Y.; Kelkar, M.G. ); Gupta, S.P. )

    1991-03-01

    This paper presents an application of geostatistics and fractal geometry concepts for 2D characterization of rock properties (k and φ) in a dolomitic, layer-cake reservoir. The results indicate that a lack of closely spaced data yields effectively random distributions of properties. Further, incorporation of geology reduces uncertainties in fractal interpolation of wellbore properties.

  17. Preferential sampling and Bayesian geostatistics: Statistical modeling and examples.

    PubMed

    Cecconi, Lorenzo; Grisotto, Laura; Catelan, Dolores; Lagazio, Corrado; Berrocal, Veronica; Biggeri, Annibale

    2016-08-01

    Preferential sampling refers to any situation in which the spatial process and the sampling locations are not stochastically independent. In this paper, we present two examples of geostatistical analysis in which the usual assumption of stochastic independence between the point process and the measurement process is violated. To account for preferential sampling, we specify a flexible and general Bayesian geostatistical model that includes a shared spatial random component. We apply the proposed model to two different case studies that allow us to highlight three different modeling and inferential aspects of geostatistical modeling under preferential sampling: (1) continuous or finite spatial sampling frame; (2) underlying causal model and relevant covariates; and (3) inferential goals related to mean prediction surface or prediction uncertainty.

  18. A Practical Primer on Geostatistics

    USGS Publications Warehouse

    Olea, Ricardo A.

    2009-01-01

    THE CHALLENGE Most geological phenomena are extraordinarily complex in their interrelationships and vast in their geographical extension. Ordinarily, engineers and geoscientists are faced with corporate or scientific requirements to properly prepare geological models with measurements involving a small fraction of the entire area or volume of interest. Exact description of a system such as an oil reservoir is neither feasible nor economically possible. The results are necessarily uncertain. Note that the uncertainty is not an intrinsic property of the systems; it is the result of incomplete knowledge by the observer. THE AIM OF GEOSTATISTICS The main objective of geostatistics is the characterization of spatial systems that are incompletely known, systems that are common in geology. A key difference from classical statistics is that geostatistics uses the sampling location of every measurement. Unless the measurements show spatial correlation, the application of geostatistics is pointless. Ordinarily the need for additional knowledge goes beyond a few points, which explains the display of results graphically as fishnet plots, block diagrams, and maps. GEOSTATISTICAL METHODS Geostatistics is a collection of numerical techniques for the characterization of spatial attributes using primarily two tools: probabilistic models, which are used for spatial data in a manner similar to the way in which time-series analysis characterizes temporal data, or pattern recognition techniques. The probabilistic models are used as a way to handle uncertainty in results away from sampling locations, making a radical departure from alternative approaches like inverse distance estimation methods. DIFFERENCES WITH TIME SERIES On dealing with time-series analysis, users frequently concentrate their attention on extrapolations for making forecasts. Although users of geostatistics may be interested in extrapolation, the methods work at their best interpolating. This simple difference has

  19. Model Selection for Geostatistical Models

    SciTech Connect

    Hoeting, Jennifer A.; Davis, Richard A.; Merton, Andrew A.; Thompson, Sandra E.

    2006-02-01

    We consider the problem of model selection for geospatial data. Spatial correlation is typically ignored in the selection of explanatory variables and this can influence model selection results. For example, the inclusion or exclusion of particular explanatory variables may not be apparent when spatial correlation is ignored. To address this problem, we consider the Akaike Information Criterion (AIC) as applied to a geostatistical model. We offer a heuristic derivation of the AIC in this context and provide simulation results that show that using AIC for a geostatistical model is superior to the often used approach of ignoring spatial correlation in the selection of explanatory variables. These ideas are further demonstrated via a model for lizard abundance. We also employ the principle of minimum description length (MDL) to variable selection for the geostatistical model. The effect of sampling design on the selection of explanatory covariates is also explored.

  20. Geostatistics and GIS: tools for characterizing environmental contamination.

    PubMed

    Henshaw, Shannon L; Curriero, Frank C; Shields, Timothy M; Glass, Gregory E; Strickland, Paul T; Breysse, Patrick N

    2004-08-01

    Geostatistics is a set of statistical techniques used in the analysis of georeferenced data that can be applied to environmental contamination and remediation studies. In this study, the 1,1-dichloro-2,2-bis(p-chlorophenyl)ethylene (DDE) contamination at a Superfund site in western Maryland is evaluated. Concern about the site and its future clean up has triggered interest within the community because residential development surrounds the area. Spatial statistical methods, of which geostatistics is a subset, are becoming increasingly popular, in part due to the availability of geographic information system (GIS) software in a variety of application packages. In this article, the joint use of ArcGIS software and the R statistical computing environment is demonstrated as an approach for comprehensive geostatistical analyses. The spatial regression method, kriging, is used to provide predictions of DDE levels at unsampled locations both within the site and the surrounding areas where residential development is ongoing.

  1. The random set approach to nontraditional measurements is rigorously Bayesian

    NASA Astrophysics Data System (ADS)

    Mahler, Ronald; El-Fallah, Adel

    2012-06-01

    In several previous publications the first author has proposed a "generalized likelihood function" (GLF) approach for processing nontraditional measurements such as attributes, features, natural-language statements, and inference rules. The GLF approach is based on random set "generalized measurement models" for nontraditional measurements. GLFs are not conventional likelihood functions, since they are not density functions and their integrals are usually infinite, rather than equal to 1. For this reason, it has been unclear whether or not the GLF approach is fully rigorous from a strict Bayesian point of view. In a recent paper, the first author demonstrated that the GLF of a specific type of nontraditional measurement, quantized measurements, is rigorously Bayesian. In this paper we show that this result can be generalized to arbitrary nontraditional measurements, thus removing any doubt that the GLF approach is rigorously Bayesian.

  2. High-performance computational and geostatistical experiments for testing the capabilities of 3-d electrical tomography

    SciTech Connect

    Carle, S. F.; Daily, W. D.; Newmark, R. L.; Ramirez, A.; Tompson, A.

    1999-01-19

    This project explores the feasibility of combining geologic insight, geostatistics, and high-performance computing to analyze the capabilities of 3-D electrical resistance tomography (ERT). Geostatistical methods are used to characterize the spatial variability of geologic facies that control sub-surface variability of permeability and electrical resistivity. Synthetic ERT data sets are generated from geostatistical realizations of alluvial facies architecture. The synthetic data sets enable comparison of the "truth" to inversion results, quantification of the ability to detect particular facies at particular locations, and sensitivity studies on inversion parameters.

  3. Model selection for geostatistical models.

    PubMed

    Hoeting, Jennifer A; Davis, Richard A; Merton, Andrew A; Thompson, Sandra E

    2006-02-01

    We consider the problem of model selection for geospatial data. Spatial correlation is often ignored in the selection of explanatory variables, and this can influence model selection results. For example, the importance of particular explanatory variables may not be apparent when spatial correlation is ignored. To address this problem, we consider the Akaike Information Criterion (AIC) as applied to a geostatistical model. We offer a heuristic derivation of the AIC in this context and provide simulation results that show that using AIC for a geostatistical model is superior to the often-used traditional approach of ignoring spatial correlation in the selection of explanatory variables. These ideas are further demonstrated via a model for lizard abundance. We also apply the principle of minimum description length (MDL) to variable selection for the geostatistical model. The effect of sampling design on the selection of explanatory covariates is also explored. R software to implement the geostatistical model selection methods described in this paper is available in the Supplement.

  4. A practical primer on geostatistics

    USGS Publications Warehouse

    Olea, Ricardo A.

    2009-01-01

    The Challenge—Most geological phenomena are extraordinarily complex in their interrelationships and vast in their geographical extension. Ordinarily, engineers and geoscientists are faced with corporate or scientific requirements to properly prepare geological models with measurements involving a small fraction of the entire area or volume of interest. Exact description of a system such as an oil reservoir is neither feasible nor economically possible. The results are necessarily uncertain. Note that the uncertainty is not an intrinsic property of the systems; it is the result of incomplete knowledge by the observer.The Aim of Geostatistics—The main objective of geostatistics is the characterization of spatial systems that are incompletely known, systems that are common in geology. A key difference from classical statistics is that geostatistics uses the sampling location of every measurement. Unless the measurements show spatial correlation, the application of geostatistics is pointless. Ordinarily the need for additional knowledge goes beyond a few points, which explains the display of results graphically as fishnet plots, block diagrams, and maps.Geostatistical Methods—Geostatistics is a collection of numerical techniques for the characterization of spatial attributes using primarily two tools: probabilistic models, which are used for spatial data in a manner similar to the way in which time-series analysis characterizes temporal data, or pattern recognition techniques. The probabilistic models are used as a way to handle uncertainty in results away from sampling locations, making a radical departure from alternative approaches like inverse distance estimation methods.Differences with Time Series—On dealing with time-series analysis, users frequently concentrate their attention on extrapolations for making forecasts. Although users of geostatistics may be interested in extrapolation, the methods work at their best interpolating. This simple difference

  5. Qualitative optimization of image processing systems using random set modeling

    NASA Astrophysics Data System (ADS)

    Kelly, Patrick A.; Derin, Haluk; Vaidya, Priya G.

    2000-08-01

    Many decision-making systems involve image processing that converts input sensor data into output images having desirable features. Typically, the system user selects some processing parameters. The processor together with the input image can then be viewed as a system that maps the processing parameters into output features. However, the most significant output features often are not numerical quantities, but instead are subjective measures of image quality. It can be a difficult task for a user to find the processing parameters that give the 'best' output. We wish to automate this qualitative optimization task. The key to this is incorporating linguistic operating rules and qualitative output parameters in a numerical optimization scheme. In this paper, we use the test system of input parameter selection for 2D Wiener filtering to restore noisy and blurred images. Operating rules represented with random sets are used to generate a nominal input-output system model, which is then used to select initial Wiener filter input parameters. When the nominally optimal Wiener filter is applied to an observed image, the operator's assessment of output image quality is used in an adaptive filtering algorithm to adjust the model and select new input parameters. Tests on several images have confirmed that with a few such iterations, a significant improvement in output quality is achieved.

  6. Efficient geostatistical inversion of transient groundwater flow using preconditioned nonlinear conjugate gradients

    NASA Astrophysics Data System (ADS)

    Klein, Ole; Cirpka, Olaf A.; Bastian, Peter; Ippisch, Olaf

    2017-04-01

    In the geostatistical inverse problem of subsurface hydrology, continuous hydraulic parameter fields, in most cases hydraulic conductivity, are estimated from measurements of dependent variables, such as hydraulic heads, under the assumption that the parameter fields are autocorrelated random space functions. Upon discretization, the continuous fields become large parameter vectors with O(10^4-10^7) elements. While cokriging-like inversion methods have been shown to be efficient for highly resolved parameter fields when the number of measurements is small, they require the calculation of the sensitivity of each measurement with respect to all parameters, which may become prohibitive with large sets of measured data such as those arising from transient groundwater flow. We present a Preconditioned Conjugate Gradient method for the geostatistical inverse problem, in which a single adjoint equation needs to be solved to obtain the gradient of the objective function. Using the autocovariance matrix of the parameters as preconditioning matrix, expensive multiplications with its inverse can be avoided, and the number of iterations is significantly reduced. We use a randomized spectral decomposition of the posterior covariance matrix of the parameters to perform a linearized uncertainty quantification of the parameter estimate. The feasibility of the method is tested by virtual examples of head observations in steady-state and transient groundwater flow. These synthetic tests demonstrate that transient data can reduce both parameter uncertainty and time spent conducting experiments, while the presented methods are able to handle the resulting large number of measurements.
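A generic preconditioned conjugate gradient loop of the kind described can be sketched as follows. The toy system and diagonal (Jacobi) preconditioner are invented for illustration; in the paper's setting, applying the preconditioner would amount to a multiplication by the prior autocovariance matrix, so its inverse is never formed:

```python
import numpy as np

def pcg(apply_A, b, apply_Minv, tol=1e-10, maxit=200):
    """Preconditioned conjugate gradients for the SPD system A x = b.
    `apply_A` applies the system operator; `apply_Minv` applies the
    preconditioner to a residual vector."""
    x = np.zeros_like(b)
    r = b - apply_A(x)
    z = apply_Minv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxit):
        Ap = apply_A(p)
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = apply_Minv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Toy SPD system, preconditioned by the inverse of its diagonal.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 30))
A = A @ A.T + 30.0 * np.eye(30)
b = rng.standard_normal(30)
x = pcg(lambda v: A @ v, b, lambda v: v / np.diag(A))
```

Because only operator applications appear in the loop, `apply_A` can hide an adjoint-based gradient computation and `apply_Minv` a covariance multiplication, which is what makes the approach tractable for the large parameter vectors mentioned above.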

  7. Restricted spatial regression in practice: Geostatistical models, confounding, and robustness under model misspecification

    USGS Publications Warehouse

    Hanks, Ephraim M.; Schliep, Erin M.; Hooten, Mevin B.; Hoeting, Jennifer A.

    2015-01-01

    In spatial generalized linear mixed models (SGLMMs), covariates that are spatially smooth are often collinear with spatially smooth random effects. This phenomenon is known as spatial confounding and has been studied primarily in the case where the spatial support of the process being studied is discrete (e.g., areal spatial data). In this case, the most common approach suggested is restricted spatial regression (RSR) in which the spatial random effects are constrained to be orthogonal to the fixed effects. We consider spatial confounding and RSR in the geostatistical (continuous spatial support) setting. We show that RSR provides computational benefits relative to the confounded SGLMM, but that Bayesian credible intervals under RSR can be inappropriately narrow under model misspecification. We propose a posterior predictive approach to alleviating this potential problem and discuss the appropriateness of RSR in a variety of situations. We illustrate RSR and SGLMM approaches through simulation studies and an analysis of malaria frequencies in The Gambia, Africa.

  8. Geostatistics applied to gas reservoirs

    SciTech Connect

    Meunier, G.; Coulomb, C.; Laille, J.P.

    1989-09-01

    The spatial distribution of many of the physical parameters connected with a gas reservoir is of primary interest to both engineers and geologists throughout the study, development, and operation of a field. It is therefore desirable for the distribution to be capable of statistical interpretation, to have a simple graphical representation, and to allow data to be entered from either two- or three-dimensional grids. To satisfy these needs while dealing with the geographical variables, new methods have been developed under the name geostatistics. This paper describes briefly the theory of geostatistics and its most recent improvements for the specific problem of subsurface description. The external-drift technique has been emphasized in particular, and in addition, four case studies related to gas reservoirs are presented.

  9. Geostatistical Modeling of Pore Velocity

    SciTech Connect

    Devary, J.L.; Doctor, P.G.

    1981-06-01

    A significant part of evaluating a geologic formation as a nuclear waste repository involves the modeling of contaminant transport in the surrounding media in the event the repository is breached. The commonly used contaminant transport models are deterministic. However, the spatial variability of hydrologic field parameters introduces uncertainties into contaminant transport predictions. This paper discusses the application of geostatistical techniques to the modeling of spatially varying hydrologic field parameters required as input to contaminant transport analyses. Kriging estimation techniques were applied to Hanford Reservation field data to calculate hydraulic conductivity and the ground-water potential gradients. These quantities were statistically combined to estimate the groundwater pore velocity and to characterize the pore velocity estimation error. Combining geostatistical modeling techniques with product error propagation techniques results in an effective stochastic characterization of groundwater pore velocity, a hydrologic parameter required for contaminant transport analyses.
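The product error propagation step described above can be sketched with the delta method. The numbers below are illustrative placeholders, not Hanford values, and porosity is treated as known, which simplifies the full treatment:

```python
import math

# Delta-method propagation of kriging errors into Darcy pore velocity
# v = K * i / n_e, combining kriged estimates and kriging variances of
# hydraulic conductivity K and head gradient i.
K, var_K = 1.2e-4, (2.0e-5) ** 2   # kriged conductivity (m/s) and its variance
i, var_i = 5.0e-3, (1.0e-3) ** 2   # kriged head gradient and its variance
n_e = 0.25                         # effective porosity (assumed known here)

v = K * i / n_e
# var(v) ~ (dv/dK)^2 var(K) + (dv/di)^2 var(i), K and i assumed uncorrelated
var_v = (i / n_e) ** 2 * var_K + (K / n_e) ** 2 * var_i
sd_v = math.sqrt(var_v)
```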

  10. Geostatistical inversion of moment equations of groundwater flow at the Montalto Uffugo research site (Italy)

    NASA Astrophysics Data System (ADS)

    Bianchi Janetti, E.; Riva, M.; Straface, S.; Guadagnini, A.

    2009-04-01

    We present the application of a methodology for inverting stochastic mean groundwater flow equations to characterize the spatial variability of (natural) log-transmissivity at the Montalto Uffugo research site (Italy). The methodology was originally proposed by Hernandez et al. [2003, 2006]. It relies on a nonlinear geostatistical inverse algorithm for steady-state groundwater flow that allows jointly estimating the spatial variability of log-transmissivity, the underlying variogram and its parameters, and the variance-covariance of the estimates. Exact mean flow equations are rendered workable by means of a suitable second-order approximation (in terms of a small parameter representing the standard deviation of the underlying random log-transmissivities). A unique feature of the method is its capability of providing estimates of prediction errors of hydraulic heads and fluxes, which are calculated a posteriori upon solving corresponding moment equations. Prior estimates of the transmissivity variogram and its associated parameters at the test site are obtained on the basis of available electrical resistivity data. Transmissivity is parameterized geostatistically on the basis of an available measured value and a set of unknown values at discrete pilot points. While prior pilot point values are obtained by generalized kriging, posterior estimates at pilot points are obtained by calibrating mean flow against late-time values of hydraulic head collected during a pumping test. Information on hydraulic heads is obtained on the basis of self-potential signals recorded by 47 surface electrodes during the test. We explore the effectiveness of both a second-order and a lower-order closure of the mean flow equation at capturing the parameters of the estimated log-transmissivity variogram. The latter are estimated a posteriori using formal model selection criteria. Our results highlight that assimilating hydrogeophysical data within a second-order model for mean

  11. Using geostatistics to evaluate cleanup goals

    SciTech Connect

    Marcon, M.F.; Hopkins, L.P.

    1995-12-01

    Geostatistical analysis is a powerful predictive tool typically used to define spatial variability in environmental data. The information from a geostatistical analysis using kriging, a geostatistical tool, can be taken a step further to optimize sampling location and frequency and help quantify sampling uncertainty in both the remedial investigation and remedial design at a hazardous waste site. Geostatistics were used to quantify sampling uncertainty in attainment of a risk-based cleanup goal and to determine the optimal sampling frequency necessary to delineate the horizontal extent of impacted soils at a Gulf Coast waste site.
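One common way to quantify uncertainty in attainment of a cleanup goal is to convert a kriging estimate and its kriging variance into an exceedance probability. This is a sketch under a Gaussian estimation-error assumption, not necessarily the exact procedure used at the Gulf Coast site:

```python
import math

# Probability that the true concentration at a location still exceeds a
# risk-based cleanup goal, given the kriged estimate and kriging variance
# and assuming (as a simplification) Gaussian estimation error.
def prob_exceeds(z_hat: float, krig_var: float, goal: float) -> float:
    sd = math.sqrt(krig_var)
    if sd == 0.0:
        return 1.0 if z_hat > goal else 0.0
    # Standard normal survival function expressed via erf
    return 0.5 * (1.0 - math.erf((goal - z_hat) / (sd * math.sqrt(2.0))))

# A location kriged exactly at the goal has a 50% chance of exceeding it
p = prob_exceeds(z_hat=10.0, krig_var=4.0, goal=10.0)
```

Mapping this probability over the site, and resampling where it is near 0.5, is one way such an analysis guides sampling frequency.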

  12. Geostatistical enhancement of European hydrological predictions

    NASA Astrophysics Data System (ADS)

    Pugliese, Alessio; Castellarin, Attilio; Parajka, Juraj; Arheimer, Berit; Bagli, Stefano; Mazzoli, Paolo; Montanari, Alberto; Blöschl, Günter

    2016-04-01

    Geostatistical Enhancement of European Hydrological Prediction (GEEHP) is a research experiment developed within the EU-funded SWITCH-ON project, which proposes to conduct comparative experiments in a virtual laboratory in order to share water-related information and tackle changes in the hydrosphere for operational needs (http://www.water-switch-on.eu). The main objective of GEEHP deals with the prediction of streamflow indices and signatures in ungauged basins at different spatial scales. In particular, among several possible hydrological signatures we focus in our experiment on the prediction of flow-duration curves (FDCs) along the stream-network, which has attracted increasing scientific attention in the last decades due to the large number of practical and technical applications of the curves (e.g. hydropower potential estimation, riverine habitat suitability and ecological assessments, etc.). We apply a geostatistical procedure based on Top-kriging, which has recently been shown to be a particularly reliable and easy-to-use regionalization approach, employing two different types of streamflow data: pan-European E-HYPE simulations (http://hypeweb.smhi.se/europehype) and observed daily streamflow series collected in two pilot study regions, i.e. Tyrol (merging data from Austrian and Italian stream gauging networks) and Sweden. The merger of the two study regions results in a rather large area (~450000 km2) and might be considered as a proxy for a pan-European application of the approach. In a first phase, we implement a bidirectional validation, i.e. E-HYPE catchments are set as training sites to predict FDCs at the same sites where observed data are available, and vice-versa. Such a validation procedure reveals (1) the usability of the proposed approach for predicting the FDCs over the entire river network of interest using either observed data or E-HYPE simulations, and (2) the accuracy of E-HYPE-based predictions of FDCs in ungauged sites. In a

  13. In School Settings, Are All RCTs (Randomized Control Trials) Exploratory?

    ERIC Educational Resources Information Center

    Newman, Denis; Jaciw, Andrew P.

    2012-01-01

    The motivation for this paper is the authors' recent work on several randomized control trials in which they found the primary result, which averaged across subgroups or sites, to be moderated by demographic or site characteristics. They are led to examine a distinction that the Institute of Education Sciences (IES) makes between "confirmatory"…

  15. Importance of stationarity for geostatistical assessment of environmental contamination

    SciTech Connect

    Dagdelen, K.; Turner, A.K.

    1996-12-31

    This paper describes a geostatistical case study to assess TCE contamination from multiple point sources migrating through geologically complex conditions involving several aquifers. The paper highlights the importance of the stationarity assumption by demonstrating how biased assessments of TCE contamination result when ordinary kriging is applied to data that violate stationarity assumptions. Division of the data set into more homogeneous geologic and hydrologic zones improved the accuracy of the estimates. Indicator kriging offers an alternate method for providing a stochastic model that is more appropriate for the data. Further improvement in the estimates results when indicator kriging is applied to individual subregional data sets that are based on geological considerations. This further enhances the data homogeneity and makes the use of a stationary model more appropriate. By combining geological and geostatistical evaluations, more realistic maps may be produced that reflect the hydrogeological environment and provide a sound basis for future investigations and remediation.

  16. Using geostatistics to estimate coal reserves

    SciTech Connect

    Royle, A.G.

    1982-09-01

    Geostatistics have, in the past, been used for evaluating metallic ore reserves. Today they are finding more use for coal reserve determination. An example is given to show how geostatistics can be used to estimate mean thickness, sulphur content and other data from in situ coal. (3 refs.)

  17. Geostatistical inversion of transient moment equations of groundwater flow

    NASA Astrophysics Data System (ADS)

    Riva, M.; Guadagnini, A.; Neuman, S. P.; Bianchi Janetti, E.; Malama, B.

    2009-04-01

    We present a methodology for conditioning estimates of hydraulic heads and fluxes and their associated uncertainty on information about transmissivity, T , and hydraulic heads, h, collected within a randomly heterogeneous aquifer under transient conditions. Our approach is based on recursive finite-element approximations of exact nonlocal first and second conditional moment equations. We present a nonlinear geostatistical inverse algorithm for transient groundwater flow that allows estimating jointly the spatial variability of log-transmissivity, Y = ln T, the underlying variogram and its parameters, and the variance-covariance of the estimates. Log-transmissivity is parameterized geostatistically based on measured values at discrete locations and unknown values at discrete "pilot points." While prior pilot point values are obtained by generalized kriging, posterior estimates at pilot points are obtained by history matching of transient mean flow against values of hydraulic head collected during a pumping test. Parameters are then projected onto a computational grid by kriging. Prior information on hydraulic properties is included in the optimization process via a suitable regularization term which is included in the objective function to be minimized. The weight of the regularization term, hydraulic and unknown variogram parameters are then estimated by maximum likelihood calibration. The main features of the methodology are explored by means of a synthetic example. As alternative flow models we consider (a) a second-order and (b) a lower-order closure of the mean transient flow equation and assess the ability of these models at capturing the parameters of the estimated log-transmissivity variogram. With the aid of formal model selection criteria we associate each mean flow model and different sets of tested variogram parameters with a weight, or posterior probability, representing their relative degrees of likelihood. 
Our findings suggest that the weight of the

  18. Local Geostatistical Models and Big Data in Hydrological and Ecological Applications

    NASA Astrophysics Data System (ADS)

    Hristopulos, Dionissios

    2015-04-01

    The advent of the big data era creates new opportunities for environmental and ecological modelling but also presents significant challenges. The availability of remote sensing images and low-cost wireless sensor networks means that spatiotemporal environmental data now cover larger spatial domains at higher spatial and temporal resolution and for longer time windows. Handling such voluminous data presents several technical and scientific challenges. In particular, the geostatistical methods used to process spatiotemporal data need to overcome the dimensionality curse associated with the need to store and invert large covariance matrices. There are various mathematical approaches for addressing the dimensionality problem, including change of basis, dimensionality reduction, hierarchical schemes, and local approximations. We present a Stochastic Local Interaction (SLI) model that can be used to model local correlations in spatial data. SLI is a random field model suitable for data on discrete supports (i.e., regular lattices or irregular sampling grids). The degree of localization is determined by means of kernel functions and appropriate bandwidths, and the strength of the correlations is determined by means of coefficients. In the "plain vanilla" version, the parameter set involves scale and rigidity coefficients as well as a characteristic length; the latter, together with the rigidity coefficient, determines the correlation length of the random field. The SLI model is based on statistical field theory and extends previous research on Spartan spatial random fields [2,3] from continuum spaces to explicitly discrete supports. The SLI kernel functions employ adaptive bandwidths learned from the sampling spatial distribution [1]. The SLI precision matrix is expressed explicitly in terms of the model parameters and the kernel function. Hence, covariance matrix inversion is not necessary for parameter inference, which is based on leave-one-out cross-validation.
This property

  19. Reservoir property grids improve with geostatistics

    SciTech Connect

    Vogt, J. (E and P Technology Dept.)

    1993-09-01

    Visualization software, reservoir simulators, and many other E and P software applications need reservoir property grids as input. Using geostatistics, as compared to other gridding methods, to produce these grids leads to the best output from the software programs. For the purpose stated herein, geostatistics simply comprises two types of gridding methods. Mathematically, these methods are based on minimizing or duplicating certain statistical properties of the input data. One geostatistical method, called kriging, is used when the highest possible point-by-point accuracy is desired. The other method, called conditional simulation, is used when one wants the statistics and texture of the resulting grid to be the same as those of the input data. In the following discussion, each method is explained, compared to other gridding methods, and illustrated through example applications. Proper use of geostatistical data in flow simulations, use of geostatistical data for history matching, and situations where geostatistics has no significant advantage over other methods also will be covered.
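The contrast between the two gridding methods can be hinted at with an unconditional Gaussian simulation, the basic building block of conditional simulation. The covariance model and sizes below are illustrative assumptions:

```python
import numpy as np

# Unconditional simulation of a 1-D Gaussian field with an exponential
# covariance via Cholesky factorization.  Unlike a kriged grid, which is
# smooth with reduced variance away from data, a simulated realization
# reproduces the full variance (sill) and texture of the model.
rng = np.random.default_rng(42)
n, lam, sill = 400, 15.0, 1.0
x = np.arange(n, dtype=float)
C = sill * np.exp(-np.abs(x[:, None] - x[None, :]) / lam)  # target covariance

L = np.linalg.cholesky(C + 1e-10 * np.eye(n))  # jitter for numerical stability
sim = L @ rng.standard_normal(n)               # one realization with the
                                               # target texture and sill
```

Conditional simulation additionally forces each realization to honor the data values; kriging alone would return the smooth expected surface instead.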

  20. On the geostatistical characterization of hierarchical media

    NASA Astrophysics Data System (ADS)

    Neuman, Shlomo P.; Riva, Monica; Guadagnini, Alberto

    2008-02-01

    The subsurface consists of porous and fractured materials exhibiting a hierarchical geologic structure, which gives rise to systematic and random spatial and directional variations in hydraulic and transport properties on a multiplicity of scales. Traditional geostatistical moment analysis allows one to infer the spatial covariance structure of such hierarchical, multiscale geologic materials on the basis of numerous measurements on a given support scale across a domain or "window" of a given length scale. The resultant sample variogram often appears to fit a stationary variogram model with constant variance (sill) and integral (spatial correlation) scale. In fact, some authors, who recognize that hierarchical sedimentary architecture and associated log hydraulic conductivity fields tend to be nonstationary, nevertheless associate them with stationary "exponential-like" transition probabilities and variograms, respectively, the latter being a consequence of the former. We propose that (1) the apparent ability of stationary spatial statistics to characterize the covariance structure of nonstationary hierarchical media is an artifact stemming from the finite size of the windows within which geologic and hydrologic variables are ubiquitously sampled, and (2) the artifact is eliminated upon characterizing the covariance structure of such media with the aid of truncated power variograms, which represent stationary random fields obtained upon sampling a nonstationary fractal over finite windows. To support our opinion, we note that truncated power variograms arise formally when a hierarchical medium is sampled jointly across all geologic categories and scales within a window; cite direct evidence that geostatistical parameters (variance and integral scale) inferred on the basis of traditional variograms vary systematically with support and window scales; demonstrate the ability of truncated power models to capture these variations in terms of a few scaling parameters

  1. Random Access: The DotsPlus Tactile Font Set.

    ERIC Educational Resources Information Center

    Gardner, John

    1998-01-01

    Describes "DotsPlus," a tactile font set that allows computers to print documents in any language which uses the Roman alphabet in tactile form. DotsPlus overcomes such Braille problems as code translation, Braille numbers, exotic symbols, and symbols out of context. A new printing technology (TIGER--Tactile Graphics Embosser) produces DotsPlus…

  2. Network motifs come in sets: correlations in the randomization process.

    PubMed

    Ginoza, Reid; Mugler, Andrew

    2010-07-01

    The identification of motifs--subgraphs that appear significantly more often in a particular network than in an ensemble of randomized networks--has become a ubiquitous method for uncovering potentially important subunits within networks drawn from a wide variety of fields. We find that the most common algorithms used to generate the ensemble from the real network change subgraph counts in a highly correlated manner, such that one subgraph's status as a motif may not be independent from the statuses of the other subgraphs. We demonstrate this effect for the problem of three- and four-node motif identification in the transcriptional regulatory networks of E. coli and S. cerevisiae in which randomized networks are generated via an edge-swapping algorithm. We find strong correlations among subgraph counts; for three-node subgraphs these correlations are easily interpreted, and we present an information-theoretic tool that may be used to identify correlations among subgraphs of any size. Our results suggest that single-feature statistics such as Z scores that implicitly assume independence among subgraph counts constitute an insufficient summary of the network.
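The randomization step the paper analyzes can be sketched with the generic degree-preserving double edge swap for directed edges; published variants differ in details such as connectivity checks:

```python
import random

# Degree-preserving randomization by double edge swaps: pick two directed
# edges (a, b) and (c, d) and rewire them to (a, d) and (c, b), rejecting
# swaps that would create self-loops or duplicate edges.  In- and
# out-degree sequences are preserved exactly.
def edge_swap_randomize(edges, n_swaps, seed=0, max_attempts=10000):
    rng = random.Random(seed)
    edge_set = set(edges)
    done = attempts = 0
    while done < n_swaps and attempts < max_attempts:
        attempts += 1
        (a, b), (c, d) = rng.sample(sorted(edge_set), 2)
        if a == d or c == b or (a, d) in edge_set or (c, b) in edge_set:
            continue  # swap would create a self-loop or a multi-edge
        edge_set -= {(a, b), (c, d)}
        edge_set |= {(a, d), (c, b)}
        done += 1
    return sorted(edge_set)

g = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
g_rand = edge_swap_randomize(g, n_swaps=10)
```

Because every accepted swap rewires two edges at once, subgraph counts in the randomized ensemble change in the correlated way the abstract describes, rather than independently.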

  4. Identification and Simulation of Subsurface Soil patterns using hidden Markov random fields and remote sensing and geophysical EMI data sets

    NASA Astrophysics Data System (ADS)

    Wang, Hui; Wellmann, Florian; Verweij, Elizabeth; von Hebel, Christian; van der Kruk, Jan

    2017-04-01

    Lateral and vertical spatial heterogeneity of subsurface properties such as soil texture and structure influences the available water and resource supply for crop growth. High-resolution mapping of subsurface structures using non-invasive geo-referenced geophysical measurements, such as electromagnetic induction (EMI), enables a characterization of 3D soil structures, which have shown correlations to remote sensing information of the crop states. The benefit of EMI is that it can return 3D subsurface information; however, its spatial extent is limited by the labor-intensive measurement procedure. While active and passive sensors mounted on air- or space-borne platforms return only 2D images, they cover much larger areas. Combining both approaches provides a potential pathway to extend the detailed 3D geophysical information to a larger area by using remote sensing information. In this study, we aim at extracting and providing insights into the spatial and statistical correlation of the geophysical and remote sensing observations of the soil/vegetation continuum system. To this end, two key points need to be addressed: 1) how to detect and recognize the geometric patterns (i.e., spatial heterogeneity) from multiple data sets, and 2) how to quantitatively describe the statistical correlation between remote sensing information and geophysical measurements. In the current study, the spatial domain is restricted to shallow depths up to 3 meters, and the geostatistical database contains the normalized difference vegetation index (NDVI) derived from RapidEye satellite images and apparent electrical conductivities (ECa) measured with multi-receiver EMI sensors for nine depths of exploration ranging from 0-2.7 m. The integrated data sets are mapped into both the physical space (i.e., the spatial domain) and feature space (i.e., a two-dimensional space framed by the NDVI and the ECa data). Hidden Markov Random Fields (HMRF) are employed to model the

  5. Patient agenda setting in respiratory outpatients: A randomized controlled trial.

    PubMed

    Early, Frances; Everden, Angharad Jt; O'Brien, Cathy M; Fagan, Petrea L; Fuld, Jonathan P

    2015-11-01

    Soliciting a patient's agenda (the reason for their visit, their concerns, and their expectations) is fundamental to health care, but if it is not done effectively, outcomes can be adversely affected. Forms to help patients consider important issues prior to a consultation have been tested with mixed results. We hypothesized that using an agenda form would affect the extent to which patients felt their doctor discussed the issues that were important to them. Patients were randomized either to receive an agenda form to complete whilst waiting or to usual care. The primary outcome measure was the proportion of patients agreeing with the statement 'My doctor discussed the issues that were important to me', rated on a four-point scale. Secondary outcomes included other experience and satisfaction measures, consultation duration, and patient confidence. There was no significant effect of agenda form use on primary or secondary outcomes. Post hoc exploratory analyses suggested possible differential effects for new compared to follow-up patients. There was no overall benefit from the form and a risk of detrimental impact on patient experience for some patients. There is a need for greater understanding of what works for whom in supporting patients to get the most from their consultation.

  6. Identification of Fuzzy Sets with a Class of Canonically Induced Random Sets and Some Applications.

    DTIC Science & Technology

    1980-09-15

    Vol. I. 27. C.L. Chang, "Fuzzy Topological Spaces," J. Math. Anal. & Applic. 24, 182-190 (1968). 28. B. Hutton and I. Reilly, "Separation Axioms in Fuzzy Topological Spaces," Fuzzy Sets and Systems 3, 93-104 (1980). 29. L.A. Zadeh, Fuzzy Sets and Their Application to Pattern Classification and

  7. Bayesian Geostatistical Design: Optimal Site Investigation When the Geostatistical Model is Uncertain

    NASA Astrophysics Data System (ADS)

    Nowak, W.; de Barros, F. P. J.; Rubin, Y.

    2009-04-01

    Geostatistical optimal design optimizes subsurface exploration for maximum information towards task-specific prediction goals. Until recently, geostatistical design studies have assumed that the geostatistical description (i.e., the mean, trends, covariance models and their parameters) is given a priori, even if few or no data support such assumptions. This contradicts the fact that the bulk of data acquisition is merely being planned at this stage. We believe that geostatistical design should comply with the following four guidelines: 1. Avoid unjustified a priori assumptions on the geostatistical description, such as claiming certainty in the geostatistical model, and instead acknowledge the inevitable uncertainty of geostatistical descriptions; 2. Reduce geostatistical model uncertainty as a secondary design objective; 3. Rate this secondary objective optimally with respect to the overall prediction goal; and 4. Be robust even under inaccurate geostatistical assumptions. Bayesian Geostatistical Design (Diggle and Lophaven, 2006) follows these four guidelines by considering uncertain covariance model parameters. These authors considered a kriging-like prediction task, using the spatial average of the estimation variance as the objective function for the design. We transfer their concept from kriging-like applications to geostatistical inverse problems, thus generalizing towards arbitrary hydrogeological or geophysical data and prediction goals. A remaining concern is that we deem it inappropriate to consider parametric uncertainty only within a single covariance model. The Matérn family of covariance functions has an additional shape parameter, and so allows for uncertain smoothness and shape of the covariance function (Zhang and Rubin, submitted to WRR). Controlling model shape by a parameter converts covariance model selection to parameter identification and resembles Bayesian model averaging over a continuous spectrum of covariance models.
We illustrate
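The effect of the Matérn shape parameter can be sketched using the well-known closed forms for nu = 1/2 and nu = 3/2; parameter names and values here are illustrative assumptions:

```python
import math

# Matérn covariance for the two most common shape-parameter values.
# nu = 0.5 gives the exponential model (rough, non-differentiable fields);
# nu = 1.5 gives once-differentiable (smoother) fields.  Treating nu as
# uncertain spans a continuum between such models.
def matern(h, sill=1.0, ell=1.0, nu=0.5):
    if h == 0.0:
        return sill
    if nu == 0.5:                     # exponential model
        return sill * math.exp(-h / ell)
    if nu == 1.5:                     # once-differentiable model
        a = math.sqrt(3.0) * h / ell
        return sill * (1.0 + a) * math.exp(-a)
    raise ValueError("only nu = 0.5 and nu = 1.5 implemented in this sketch")

# The smoother model decays more slowly near the origin
c_rough = matern(0.1, nu=0.5)
c_smooth = matern(0.1, nu=1.5)
```

General (non-half-integer) nu requires the modified Bessel function of the second kind; the closed forms above avoid that dependency for the sketch.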

  8. 3D vadose zone modeling using geostatistical inferences

    SciTech Connect

    Knutson, C.F.; Lee, C.B.

    1991-01-01

    In developing a 3D model of the 600-ft-thick interbedded basalt and sediment complex that constitutes the vadose zone at the Radioactive Waste Management Complex (RWMC) at the Idaho National Engineering Laboratory (INEL), geostatistical data were captured for 12-15 parameters (e.g., permeability, porosity, and saturation, and flow height, flow width, and flow internal zonation). This two-scale data set was generated from studies of subsurface core and geophysical log suites at RWMC, and from surface outcrop exposures at the Box Canyon of the Big Lost River and the Hell's Half Acre lava field, all located in the general RWMC area. Based on these currently available data, it is possible to build a 3D stochastic model that utilizes: cumulative distribution functions obtained from the geostatistical data; backstripping and rebuilding of stratigraphic units; and an "expert" system that incorporates rules based on expert geologic analysis and experimentally derived geostatistics, providing (a) a structural and isopach map of each layer, (b) a realization of the flow geometry of each basalt flow unit, and (c) a realization of the internal flow parameters (e.g., permeability, porosity, and saturation) for each flow. 10 refs., 4 figs., 1 tab.

  9. A Random Finite Set Approach to Space Junk Tracking and Identification

    DTIC Science & Technology

    2014-09-03

    Final report covering 31 Jan 13 to 29 Apr 14. Title: A Random Finite Set Approach to Space Junk Tracking and Identification. Authors: Ba-Ngu Vo, Ba-Tuong Vo.

  10. Self-similar continuous cascades supported by random Cantor sets: Application to rainfall data

    NASA Astrophysics Data System (ADS)

    Muzy, Jean-François; Baïle, Rachel

    2016-05-01

    We introduce a variant of continuous random cascade models that extends the former constructions of Barral-Mandelbrot and Bacry-Muzy, in the sense that the cascades can be supported by sets of arbitrary fractal dimension. The so-introduced sets are exactly self-similar, stationary versions of the random Cantor sets formerly introduced by Mandelbrot as "random cutouts." We discuss the main mathematical properties of our construction and compute its scaling properties. We then illustrate our approach with several numerical examples and consider a possible application to rainfall data. Notably, we show that our model reproduces remarkably well the distribution of dry-period durations.
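A toy "random cutout" construction in the spirit of the random Cantor sets mentioned above can be sketched as a recursion; this discrete version is purely illustrative, not the paper's stationary continuous-parameter construction:

```python
import random

# Toy random Cantor set: split [lo, hi) into three equal subintervals and
# keep each independently with probability p, recursing on the survivors.
# The expected total length after `depth` levels is p**depth, so the
# surviving support thins out like a fractal "cutout" of the line.
def random_cantor(depth, p=2.0 / 3.0, lo=0.0, hi=1.0, rng=None):
    rng = rng or random.Random(7)
    if depth == 0:
        return [(lo, hi)]
    third = (hi - lo) / 3.0
    kept = []
    for k in range(3):
        if rng.random() < p:
            kept += random_cantor(depth - 1, p,
                                  lo + k * third, lo + (k + 1) * third, rng)
    return kept

intervals = random_cantor(depth=5)
total_length = sum(b - a for a, b in intervals)
```

In the rainfall analogy, the removed intervals play the role of dry periods carving a fractal support out of the time axis.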

  11. [Geostatistical modeling of Ascaris lumbricoides infection].

    PubMed

    Fortes, Bruno de Paula Menezes Drumond; Ortiz Valencia, Luis Iván; Ribeiro, Simone do Vale; Medronho, Roberto de Andrade

    2004-01-01

    This study models the spatial distribution of ascariasis through the use of geoprocessing and geostatistical analysis. The database was taken from the PAISQUA project, including a coproparasitologic and domiciliary survey conducted in 19 selected census tracts of Rio de Janeiro State, Brazil, randomly selecting 1,550 children aged 1 to 9 years and plotting them at their respective domiciles' centroids. Risk maps of Ascaris lumbricoides infection were generated by indicator kriging. The estimated and observed values from cross-validation were compared using a ROC curve. An isotropic spherical semivariogram model with a range of 30 m and a nugget effect of 50% was employed in ordinary indicator kriging to create a map of the probability of A. lumbricoides infection. The area under the ROC curve indicated significant global accuracy. The occurrence of disease could be estimated in the study area, and a risk map was elaborated through the use of ordinary indicator kriging. Spatial statistical analysis proved adequate for predicting the occurrence of ascariasis, unrestricted by the regions' political boundaries.
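The ROC validation step can be sketched with the rank-based (Mann-Whitney) formulation of the AUC; the observed statuses and kriged probabilities below are invented for illustration:

```python
# AUC as the probability that a randomly chosen positive case receives a
# higher predicted risk than a randomly chosen negative case, with ties
# counted as half.  Equivalent to the area under the ROC curve.
def roc_auc(labels, scores):
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

y_obs = [1, 1, 0, 1, 0, 0, 0, 1]                   # observed infection status
p_krig = [0.9, 0.7, 0.4, 0.6, 0.3, 0.5, 0.2, 0.8]  # kriged probabilities
auc = roc_auc(y_obs, p_krig)                       # 1.0: perfect ranking here
```

An AUC of 0.5 would indicate no discrimination; values near 1 indicate the kriged probabilities rank infected and uninfected locations well.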

  12. Reservoir studies with geostatistics to forecast performance

    SciTech Connect

    Tang, R.W.; Behrens, R.A.; Emanuel, A.S.

    1991-05-01

    In this paper, example geostatistics and streamtube applications are presented for a waterflood and a CO{sub 2} flood in two low-permeability sandstone reservoirs. The hybrid approach of combining fine vertical resolution in cross-sectional models with streamtubes resulted in models that showed water channeling and provided realistic performance estimates. Results indicate that the combination of detailed geostatistical cross sections and fine-grid streamtube models offers a systematic approach for realistic performance forecasts.

  13. Improved Reconstruction of Two-Dimensional Resistivity Field Data Using Geostatistics

    NASA Astrophysics Data System (ADS)

    Bagtzoglou, A.; Lane, J.; Cornacchiulo, D.; Ergun, K.

    2003-04-01

    The need to reduce the effects of various types of noise observed in geophysical field data motivated this study to determine whether geostatistical methods (in this case, kriging) could be used to restore field data that had been eliminated from source files as noisy prior to inversion. We used resistivity forward and inverse modeling software for a simple fault earth model to produce and invert synthetic datasets that were manipulated using a geostatistical program developed for this study. The effects of a range of random noise levels and data-density deletions were examined to study the influence these factors have on the inversion of resistivity data and the subsequent interpretability of the geologic structure.

  14. [Spatial distribution pattern of Chilo suppressalis analyzed by classical method and geostatistics].

    PubMed

    Yuan, Zheming; Fu, Wei; Li, Fangyi

    2004-04-01

    Two original samples of Chilo suppressalis and their grid, random, and sequence subsamples were analyzed by the classical method and by geostatistics to characterize the spatial distribution pattern of C. suppressalis. The limitations of spatial distribution analysis with the classical method, especially its sensitivity to the original position of the grid, are summarized in detail. By contrast, geostatistics characterized well the spatial distribution pattern, aggregation intensity and spatial heterogeneity of C. suppressalis. According to the geostatistical analysis, the population followed a Poisson distribution at low density. At higher density the distribution was aggregative, with an aggregation intensity of 0.1056 and a dependence range of 193 cm. Spatial heterogeneity was also found in the higher-density population: spatial correlation was stronger in the line direction than in the row direction, with dependence ranges of 115 and 264 cm in the line and row directions, respectively.

  15. Identifying the minor set cover of dense connected bipartite graphs via random matching edge sets

    NASA Astrophysics Data System (ADS)

    Hamilton, Kathleen E.; Humble, Travis S.

    2017-04-01

    Using quantum annealing to solve an optimization problem requires minor embedding a logic graph into a known hardware graph. In an effort to reduce the complexity of the minor embedding problem, we introduce the minor set cover (MSC) of a known graph G: a subset of graph minors which contain any remaining minor of the graph as a subgraph. Any graph that can be embedded into G will be embeddable into a member of the MSC. Focusing on embedding into the hardware graph of commercially available quantum annealers, we establish the MSC for a particular known virtual hardware, which is a complete bipartite graph. We show that the complete bipartite graph K_{N,N} has an MSC of N minors, from which K_{N+1} is identified as the largest clique minor of K_{N,N}. The case of determining the largest clique minor of hardware with faults is briefly discussed but remains an open question.
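
    The claim that K_{N+1} is a clique minor of K_{N,N} can be checked directly by contracting a partial matching of N-1 edges, leaving one vertex on each side uncontracted. A small self-contained sketch with plain adjacency sets (my own illustration, not the authors' code):

```python
def complete_bipartite(n):
    """Adjacency sets of K_{n,n}: left nodes 0..n-1, right nodes n..2n-1."""
    adj = {v: set() for v in range(2 * n)}
    for i in range(n):
        for j in range(n, 2 * n):
            adj[i].add(j)
            adj[j].add(i)
    return adj

def contract_matching(adj, pairs):
    """Contract each (u, v) pair into u; return the quotient graph."""
    rep = {v: v for v in adj}
    for u, v in pairs:
        rep[v] = u
    new_adj = {}
    for v, nbrs in adj.items():
        rv = rep[v]
        new_adj.setdefault(rv, set())
        for w in nbrs:
            rw = rep[w]
            if rw != rv:
                new_adj[rv].add(rw)
    return new_adj

N = 5
adj = complete_bipartite(N)
# contract a partial matching of N-1 edges: (i, N+i) for i < N-1
minor = contract_matching(adj, [(i, N + i) for i in range(N - 1)])
# the quotient is a complete graph on N+1 vertices, i.e. K_{N+1}
assert len(minor) == N + 1
assert all(len(nbrs) == N for nbrs in minor.values())
```

    Each contracted pair inherits neighbors on both sides of the bipartition, so all contracted vertices and the two uncontracted ones become pairwise adjacent.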

  16. Identifying the minor set cover of dense connected bipartite graphs via random matching edge sets

    DOE PAGES

    Hamilton, Kathleen E.; Humble, Travis S.

    2017-02-23

    Using quantum annealing to solve an optimization problem requires minor embedding a logic graph into a known hardware graph. We introduce the minor set cover (MSC) of a known graph G: a subset of graph minors which contain any remaining minor of the graph as a subgraph, in an effort to reduce the complexity of the minor embedding problem. Any graph that can be embedded into G will be embeddable into a member of the MSC. Focusing on embedding into the hardware graph of commercially available quantum annealers, we establish the MSC for a particular known virtual hardware, which is a complete bipartite graph. Furthermore, we show that the complete bipartite graph K_{N,N} has an MSC of N minors, from which K_{N+1} is identified as the largest clique minor of K_{N,N}. The case of determining the largest clique minor of hardware with faults is briefly discussed but remains an open question.

  17. Examining the Missing Completely at Random Mechanism in Incomplete Data Sets: A Multiple Testing Approach

    ERIC Educational Resources Information Center

    Raykov, Tenko; Lichtenberg, Peter A.; Paulson, Daniel

    2012-01-01

    A multiple testing procedure for examining implications of the missing completely at random (MCAR) mechanism in incomplete data sets is discussed. The approach uses the false discovery rate concept and is concerned with testing group differences on a set of variables. The method can be used for ascertaining violations of MCAR and disproving this…
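
    The false discovery rate concept referred to above is typically operationalized with the Benjamini-Hochberg step-up procedure. A minimal sketch of that procedure (not the authors' implementation):

```python
def benjamini_hochberg(p_values, q=0.05):
    """Benjamini-Hochberg step-up procedure: sort the m p-values, find
    the largest k with p_(k) <= (k/m) * q, and reject the hypotheses
    with the k smallest p-values."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= rank / m * q:
            k_max = rank
    return set(order[:k_max])  # indices of rejected hypotheses

# group-difference tests on four variables, say
rejected = benjamini_hochberg([0.01, 0.5, 0.02, 0.03])
```

    In the MCAR setting, each p-value would come from a test of group differences (respondents vs. non-respondents) on one variable; a non-empty rejection set flags a violation of MCAR.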

  18. Book Review Geostatistical Analysis of Compositional Data

    SciTech Connect

    Carle, S F

    2007-03-26

    Compositional data are represented as vector variables with individual vector components ranging between zero and a positive maximum value representing a constant sum constraint, usually unity (or 100 percent). The earth sciences are flooded with spatial distributions of compositional data, such as concentrations of major ion constituents in natural waters (e.g. mole, mass, or volume fractions), mineral percentages, ore grades, or proportions of mutually exclusive categories (e.g. a water-oil-rock system). While geostatistical techniques have become popular in earth science applications since the 1970s, very little attention has been paid to the unique mathematical properties of geostatistical formulations involving compositional variables. The book 'Geostatistical Analysis of Compositional Data' by Vera Pawlowsky-Glahn and Ricardo Olea (Oxford University Press, 2004), unlike any previous book on geostatistics, directly confronts the mathematical difficulties inherent to applying geostatistics to compositional variables. The book righteously justifies itself with prodigious referencing to previous work addressing nonsensical ranges of estimated values and error, spurious correlation, and singular cross-covariance matrices.
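
    A central tool in the log-ratio approach that the book builds on is the centred log-ratio (clr) transform, which maps constant-sum compositions into unconstrained space where standard geostatistics applies. A minimal numpy sketch (the function name is mine):

```python
import numpy as np

def clr(compositions):
    """Centred log-ratio transform of compositional rows: each strictly
    positive row (summing to a constant) is mapped to
    log(x_i / g(x)), where g(x) is the row's geometric mean. The
    transformed rows sum to zero."""
    x = np.asarray(compositions, dtype=float)
    log_x = np.log(x)
    return log_x - log_x.mean(axis=1, keepdims=True)

y = clr([[0.2, 0.3, 0.5], [0.1, 0.1, 0.8]])
```

    Kriging the clr-transformed components (rather than the raw fractions) avoids the spurious correlation and out-of-range estimates the review mentions.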

  19. Reducing spatial uncertainty in climatic maps through geostatistical analysis

    NASA Astrophysics Data System (ADS)

    Pesquer, Lluís; Ninyerola, Miquel; Pons, Xavier

    2014-05-01

    ), applying different interpolation methods/parameters are shown. RMS error values (mm) obtained from the independent test set (20% of the samples) follow, in the order IDW (exponent = 1.5, 2, 2.5, 3) / SPT (tension = 100, 125, 150, 175, 200) / OK:
    LOOCV: 92.5; 80.2; 74.2; 72.3 / 181.6; 90.6; 75.7; 71.1; 69.4; 68.8
    RS: 101.2; 89.6; 83.9; 81.9 / 115.1; 92.4; 84.0; 81.4; 80.9; 81.1 / 81.1
    EU: 57.4; 51.3; 53.1; 55.5 / 59.1; 57.1; 55.9; 55.0; 54.3 / 51.8
    A3D: 48.3; 49.8; 52.5; 62.2 / 57.1; 54.4; 52.5; 51.2; 50.2 / 49.7
    To study these results, a geostatistical analysis of uncertainty was carried out. Main results: variogram analysis of the error (using the test set) shows that the total sill is reduced (50% EU, 60% A3D) when using the two new approaches, while the spatialized standard deviation model calculated from the OK shows significantly lower values when compared to the RS. In conclusion, A3D and EU greatly improve on LOOCV and RS, whereas A3D slightly improves on EU. Also, LOOCV shows only slightly better results than RS, suggesting that a non-random split increases the power of both fitting and testing steps. * Ninyerola, Pons, Roure. A methodological approach of climatological modelling of air temperature and precipitation through GIS techniques. IJC, 2000; 20:1823-1841.

  20. Model-Based Geostatistical Mapping of the Prevalence of Onchocerca volvulus in West Africa

    PubMed Central

    O’Hanlon, Simon J.; Slater, Hannah C.; Cheke, Robert A.; Boatin, Boakye A.; Coffeng, Luc E.; Pion, Sébastien D. S.; Boussinesq, Michel; Zouré, Honorat G. M.; Stolk, Wilma A.; Basáñez, María-Gloria

    2016-01-01

    Background The initial endemicity (pre-control prevalence) of onchocerciasis has been shown to be an important determinant of the feasibility of elimination by mass ivermectin distribution. We present the first geostatistical map of microfilarial prevalence in the former Onchocerciasis Control Programme in West Africa (OCP) before commencement of antivectorial and antiparasitic interventions. Methods and Findings Pre-control microfilarial prevalence data from 737 villages across the 11 constituent countries in the OCP epidemiological database were used as ground-truth data. These 737 data points, plus a set of statistically selected environmental covariates, were used in a Bayesian model-based geostatistical (B-MBG) approach to generate a continuous surface (at a pixel resolution of 5 km x 5 km) of microfilarial prevalence in West Africa prior to the commencement of the OCP. Uncertainty in model predictions was measured using a suite of validation statistics, performed on bootstrap samples of held-out validation data. The mean Pearson’s correlation between observed and estimated prevalence at validation locations was 0.693; the mean prediction error (average difference between observed and estimated values) was 0.77%, and the mean absolute prediction error (average magnitude of difference between observed and estimated values) was 12.2%. Within OCP boundaries, 17.8 million people were deemed to have been at risk, 7.55 million to have been infected, and mean microfilarial prevalence to have been 45% (range: 2–90%) in 1975. Conclusions and Significance This is the first map of initial onchocerciasis prevalence in West Africa using B-MBG. Important environmental predictors of infection prevalence were identified and used in a model out-performing those without spatial random effects or environmental covariates. Results may be compared with recent epidemiological mapping efforts to find areas of persisting transmission. These methods may be extended to areas where
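
    The validation statistics quoted above (Pearson's correlation, mean prediction error, mean absolute prediction error) are straightforward to compute. A generic sketch, not the study's code:

```python
import numpy as np

def validation_stats(observed, estimated):
    """Pearson correlation, mean (signed) error, and mean absolute
    error between observed and estimated values."""
    obs = np.asarray(observed, dtype=float)
    est = np.asarray(estimated, dtype=float)
    r = np.corrcoef(obs, est)[0, 1]
    me = np.mean(obs - est)            # mean prediction error
    mae = np.mean(np.abs(obs - est))   # mean absolute prediction error
    return r, me, mae

r, me, mae = validation_stats([10, 20, 30], [12, 18, 30])
```

    In the study these statistics were averaged over bootstrap samples of held-out validation locations.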

  1. GEOSTATISTICS FOR WASTE MANAGEMENT: A USER'S MANUAL FOR THE GEOPACK (VERSION 1.0) GEOSTATISTICAL SOFTWARE SYSTEM

    EPA Science Inventory

    GEOPACK, a comprehensive user-friendly geostatistical software system, was developed to help in the analysis of spatially correlated data. The software system was developed to be used by scientists, engineers, regulators, etc., with little experience in geostatistical techniques...

  3. Application of Frechet and other random-set averaging techniques to fusion of information

    NASA Astrophysics Data System (ADS)

    Goodman, I. R.

    1998-07-01

    An obviously important aspect of target tracking, and more generally of data fusion, is the combination of those pieces of multi-source information deemed to belong together. Recently, it has been pointed out that a random set approach to target tracking and data fusion may be more appropriate than the standard point-vector estimate approach -- especially in the case of large inherent parameter errors. In addition, since many data fusion problems involve non-numerical linguistic descriptions, it is also desirable to have a method which averages, in some qualitative sense, random sets which are non-numerically-valued, i.e., which take on propositions or events, such as 'the target appears in area A or C, given the weather conditions of yesterday and source 1' and 'the target appears in area A or B, given the weather conditions of today and source 2.' This leads to the fundamental problem of how best to define the expectation of a random set. To date, this open issue has only been considered for numerically-based random sets. This paper addresses the issue in part by proposing an approach which is algebraically based but also applicable to numerically-based random sets, and directly related to both the Frechet and the Aumann-Artstein-Vitale random set averaging procedures. The technique employs the concept of 'constant probability events,' which has also played a key role in the recent development of 'relational event algebra,' a new mathematical tool for representing various models in the form of various functions of probabilities.
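
    One concrete Fréchet-type mean for finitely many realizations of a finite random set uses the symmetric-difference metric d(A, B) = |A Δ B|: because each element contributes to the distance independently, the minimizer of the summed distance is elementwise majority vote. A hedged toy sketch of this idea, not the paper's constant-probability-event construction:

```python
from collections import Counter

def frechet_mean_sets(realizations):
    """Frechet-type average of finite sets under the symmetric-difference
    metric d(A, B) = |A ^ B|: keep exactly the elements occurring in
    more than half of the realizations (majority vote), which minimizes
    the total distance to the realizations."""
    counts = Counter()
    for s in realizations:
        counts.update(s)
    n = len(realizations)
    return {e for e, c in counts.items() if c > n / 2}

# 'the target appears in area ...' reports from three sources
mean = frechet_mean_sets([{"A", "C"}, {"A", "B"}, {"A"}])
```

    Here the three reports agree only on area A, so the averaged random set is {"A"}.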

  4. Risk Assessment of Distribution Network Based on Random set Theory and Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Sh; Bai, C. X.; Liang, J.; Jiao, L.; Hou, Z.; Liu, B. Zh

    2017-05-01

    Considering the complexity and uncertainty of operating information in distribution networks, this paper introduces random set theory for risk assessment. The proposed method is based on operating conditions defined in the random set framework, from which the upper and lower cumulative probability functions of the risk indices are obtained. Moreover, the sensitivity of the risk indices effectively reflects information about system reliability and operating conditions, and this information can be used to find the bottlenecks that limit system reliability. Analysis of a typical radial distribution network shows that the proposed method is reasonable and effective.
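
    Upper and lower cumulative probability functions from a finite random set (a mass assignment over intervals) can be computed as the belief and plausibility of the event (-inf, x]. A minimal sketch under that Dempster-Shafer-style reading (names and data are mine, not the paper's):

```python
def cumulative_bounds(focal_intervals, x):
    """Lower/upper cumulative probabilities at x from a finite random
    set given as ((lo, hi), mass) pairs with masses summing to 1.
    The lower CDF (belief) counts intervals certainly <= x; the upper
    CDF (plausibility) counts intervals possibly <= x."""
    lower = sum(m for (lo, hi), m in focal_intervals if hi <= x)
    upper = sum(m for (lo, hi), m in focal_intervals if lo <= x)
    return lower, upper

# hypothetical risk-index intervals under two operating conditions
focal = [((0.0, 1.0), 0.5), ((2.0, 3.0), 0.5)]
low1, up1 = cumulative_bounds(focal, 1.0)
```

    The gap between the two curves reflects the epistemic uncertainty in the operating information.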

  5. Regression and Geostatistical Techniques: Considerations and Observations from Experiences in NE-FIA

    Treesearch

    Rachel Riemann; Andrew Lister

    2005-01-01

    Maps of forest variables improve our understanding of the forest resource by allowing us to view and analyze it spatially. The USDA Forest Service's Northeastern Forest Inventory and Analysis unit (NE-FIA) has used geostatistical techniques, particularly stochastic simulation, to produce maps and spatial data sets of FIA variables. That work underscores the...

  6. Quantifying natural delta variability using a multiple-point geostatistics prior uncertainty model

    NASA Astrophysics Data System (ADS)

    Scheidt, Céline; Fernandes, Anjali M.; Paola, Chris; Caers, Jef

    2016-10-01

    We address the question of quantifying uncertainty associated with autogenic pattern variability in a channelized transport system by means of a modern geostatistical method. This question has considerable relevance for practical subsurface applications as well, particularly those related to uncertainty quantification relying on Bayesian approaches. Specifically, we show how the autogenic variability in a laboratory experiment can be represented and reproduced by a multiple-point geostatistical prior uncertainty model. The latter geostatistical method requires selection of a limited set of training images from which a possibly infinite set of geostatistical model realizations, mimicking the training image patterns, can be generated. To that end, we investigate two methods to determine how many training images and what training images should be provided to reproduce natural autogenic variability. The first method relies on distance-based clustering of overhead snapshots of the experiment; the second method relies on a rate of change quantification by means of a computer vision algorithm termed the demon algorithm. We show quantitatively that with either training image selection method, we can statistically reproduce the natural variability of the delta formed in the experiment. In addition, we study the nature of the patterns represented in the set of training images as a representation of the "eigenpatterns" of the natural system. The eigenpatterns in the training image sets display patterns consistent with previous physical interpretations of the fundamental modes of this type of delta system: a highly channelized, incisional mode; a poorly channelized, depositional mode; and an intermediate mode between the two.
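
    The distance-based selection of training images can be caricatured with greedy farthest-point sampling over a pairwise snapshot-distance matrix: start from the most central snapshot, then repeatedly add the snapshot farthest from those already chosen, so the selection spans pattern variability. This is a simplified stand-in for the clustering methods described, not the authors' algorithm:

```python
def select_training_images(dist, k):
    """Pick k representative snapshot indices from a pairwise distance
    matrix by greedy farthest-point sampling."""
    n = len(dist)
    # start from the snapshot most central on average
    first = min(range(n), key=lambda i: sum(dist[i]))
    chosen = [first]
    while len(chosen) < k:
        nxt = max((i for i in range(n) if i not in chosen),
                  key=lambda i: min(dist[i][j] for j in chosen))
        chosen.append(nxt)
    return chosen

# toy distances: four snapshots at positions 0, 1, 2, 10 on a line
dist = [[abs(a - b) for b in (0, 1, 2, 10)] for a in (0, 1, 2, 10)]
picks = select_training_images(dist, 2)
```

    With real snapshots, `dist` would hold image dissimilarities (for example, from the demon algorithm's rate-of-change measure).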

  7. Auricular Therapy for Treatment of Musculoskeletal Pain in the Setting of Military Personnel: A Randomized Trial

    DTIC Science & Technology

    2015-10-01

    Award Number: W81XWH-10-2-0163. Title: Auricular Therapy for Treatment of Musculoskeletal Pain in the Setting of Military Personnel: A Randomized Trial. Subject terms: auricular therapy; musculoskeletal pain.

  8. Spatial continuity measures for probabilistic and deterministic geostatistics

    SciTech Connect

    Isaaks, E.H.; Srivastava, R.M.

    1988-05-01

    Geostatistics has traditionally used a probabilistic framework, one in which expected values or ensemble averages are of primary importance. The less familiar deterministic framework views geostatistical problems in terms of spatial integrals. This paper outlines the two frameworks and examines the issue of which spatial continuity measure, the covariance C(h) or the variogram σ(h), is appropriate for each framework. Although C(h) and σ(h) were defined originally in terms of spatial integrals, the convenience of probabilistic notation made the expected value definitions more common. These now classical expected value definitions entail a linear relationship between C(h) and σ(h); the spatial integral definitions do not. In a probabilistic framework, where available sample information is extrapolated to domains other than the one which was sampled, the expected value definitions are appropriate; furthermore, within a probabilistic framework, reasons exist for preferring the variogram to the covariance function. In a deterministic framework, where available sample information is interpolated within the same domain, the spatial integral definitions are appropriate and no reasons are known for preferring the variogram. A case study on a Wiener-Levy process demonstrates differences between the two frameworks and shows that, for most estimation problems, the deterministic viewpoint is more appropriate. Several case studies on real data sets reveal that the sample covariance function reflects the character of spatial continuity better than the sample variogram. From both theoretical and practical considerations, it is clear that for most geostatistical problems direct estimation of the covariance is better than the traditional variogram approach.
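
    Under second-order stationarity the expected-value definitions satisfy variogram(h) = C(0) - C(h), but the sample estimators computed from data are constructed differently and need not obey this identity exactly, which is the practical point of the comparison above. A minimal 1-D sketch of the two estimators (generic, not the paper's code):

```python
import numpy as np

def sample_variogram(z, h):
    """Classical semivariogram estimator at integer lag h > 0 on a
    regularly spaced 1-D transect."""
    d = z[h:] - z[:-h]
    return 0.5 * float(np.mean(d ** 2))

def sample_covariance(z, h):
    """Sample covariance estimator at integer lag h > 0, using the
    overall sample mean."""
    zb = z.mean()
    return float(np.mean((z[h:] - zb) * (z[:-h] - zb)))

z = np.arange(10.0)            # toy transect
g1 = sample_variogram(z, 1)    # every unit increment is 1, so 0.5
```

    The variogram estimator needs no mean estimate, which is one classical reason for preferring it in the probabilistic framework; the paper argues the covariance estimator behaves better in practice.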

  9. Adaptive geostatistical sampling enables efficient identification of malaria hotspots in repeated cross-sectional surveys in rural Malawi

    PubMed Central

    Chipeta, Michael G.; McCann, Robert S.; Phiri, Kamija S.; van Vugt, Michèle; Takken, Willem; Diggle, Peter; Terlouw, Anja D.

    2017-01-01

    Introduction In the context of malaria elimination, interventions will need to target high burden areas to further reduce transmission. Current tools to monitor and report disease burden lack the capacity to continuously detect fine-scale spatial and temporal variations of disease distribution exhibited by malaria. These tools use random sampling techniques that are inefficient for capturing underlying heterogeneity, while health facility data in resource-limited settings are inaccurate. Continuous community surveys of malaria burden provide real-time results of local spatio-temporal variation. Adaptive geostatistical design (AGD) improves prediction of the outcome of interest compared to current random sampling techniques. We present findings of continuous malaria prevalence surveys using an adaptive sampling design. Methods We conducted repeated cross-sectional surveys guided by an adaptive sampling design to monitor the prevalence of malaria parasitaemia and anaemia in children below five years old in the communities living around Majete Wildlife Reserve in Chikwawa district, Southern Malawi. AGD sampling uses previously collected data to sample new locations with high prediction variance or where predicted prevalence exceeds a set threshold. We fitted a geostatistical model to predict malaria prevalence in the area. Findings We conducted five rounds of sampling and tested 876 children aged 6–59 months from 1377 households over a 12-month period. Malaria prevalence prediction maps showed spatial heterogeneity and the presence of hotspots, where predicted malaria prevalence was above 30%; predictors of malaria included age, socio-economic status and ownership of insecticide-treated mosquito nets. Conclusions Continuous malaria prevalence surveys using adaptive sampling increased malaria prevalence prediction accuracy. Results from the surveys were readily available after data collection. The tool can assist local managers to target malaria control interventions in areas with the

  10. Adaptive geostatistical sampling enables efficient identification of malaria hotspots in repeated cross-sectional surveys in rural Malawi.

    PubMed

    Kabaghe, Alinune N; Chipeta, Michael G; McCann, Robert S; Phiri, Kamija S; van Vugt, Michèle; Takken, Willem; Diggle, Peter; Terlouw, Anja D

    2017-01-01

    In the context of malaria elimination, interventions will need to target high burden areas to further reduce transmission. Current tools to monitor and report disease burden lack the capacity to continuously detect fine-scale spatial and temporal variations of disease distribution exhibited by malaria. These tools use random sampling techniques that are inefficient for capturing underlying heterogeneity, while health facility data in resource-limited settings are inaccurate. Continuous community surveys of malaria burden provide real-time results of local spatio-temporal variation. Adaptive geostatistical design (AGD) improves prediction of the outcome of interest compared to current random sampling techniques. We present findings of continuous malaria prevalence surveys using an adaptive sampling design. We conducted repeated cross-sectional surveys guided by an adaptive sampling design to monitor the prevalence of malaria parasitaemia and anaemia in children below five years old in the communities living around Majete Wildlife Reserve in Chikwawa district, Southern Malawi. AGD sampling uses previously collected data to sample new locations with high prediction variance or where predicted prevalence exceeds a set threshold. We fitted a geostatistical model to predict malaria prevalence in the area. We conducted five rounds of sampling and tested 876 children aged 6-59 months from 1377 households over a 12-month period. Malaria prevalence prediction maps showed spatial heterogeneity and the presence of hotspots, where predicted malaria prevalence was above 30%; predictors of malaria included age, socio-economic status and ownership of insecticide-treated mosquito nets. Continuous malaria prevalence surveys using adaptive sampling increased malaria prevalence prediction accuracy. Results from the surveys were readily available after data collection. The tool can assist local managers to target malaria control interventions in areas with the greatest health impact and is ready for
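
    The adaptive step, choosing the next sampling location where prediction is most uncertain, can be caricatured by picking the candidate farthest from all existing samples, since kriging variance grows with distance to data. This is a simplified stand-in for the AGD criterion, not the authors' code:

```python
import math

def next_sample_location(sampled, candidates):
    """Crude adaptive-design step: among candidate locations, pick the
    one whose distance to the nearest existing sample is largest, a
    proxy for highest prediction variance."""
    def nearest_sample_distance(p):
        return min(math.dist(p, s) for s in sampled)
    return max(candidates, key=nearest_sample_distance)

nxt = next_sample_location([(0, 0), (10, 0)], [(5, 0), (0, 5), (20, 0)])
```

    A full AGD implementation would instead evaluate the fitted geostatistical model's prediction variance at each candidate, and also flag locations where predicted prevalence exceeds the action threshold.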

  11. Geostatistics for environmental and geotechnical applications

    SciTech Connect

    Rouhani, S.; Srivastava, R.M.; Desbarats, A.J.; Cromer, M.V.; Johnson, A.I.

    1996-12-31

    This conference was held January 26--27, 1995 in Phoenix, Arizona. The purpose of this conference was to provide a multidisciplinary forum for exchange of state-of-the-art information on the technology of geostatistics and its applicability for environmental studies, especially site characterization. Individual papers have been processed separately for inclusion in the appropriate data bases.

  12. Ice Surface Classification using Geo-Statistical Texture-Parameters

    NASA Astrophysics Data System (ADS)

    Wallin, B. F.; Herzfeld, U. C.

    2009-12-01

    The morphological features of ice surfaces contain valuable information on the state of morphogenetic, environmental, and dynamic processes that the ice has experienced. To effectively use this information, however, the scale and rapid evolution of earth's cryospheric systems necessitate intermediate processing and abstraction as vital tools. We demonstrate methods for automated detection of ice surface classes using robust geostatistical texture parameters and several clustering/classification techniques. By measuring texture parameters as aggregates of regional spatial point distributions, a great variety of surface types can be characterized and detected, at the expense of computational complexity. Unsupervised clustering algorithms such as OPTICS identify numerically distinct sets of values which then seed segmentation algorithms. The results of the methods applied to several data-sets are presented, including RADARSAT, ATM, and ICESAT observations over both land and sea-ice. The surface roughness characteristics are analyzed at the various scales covered by the data-sets.

  13. Building Damage-Resilient Dominating Sets in Complex Networks against Random and Targeted Attacks

    PubMed Central

    Molnár, F.; Derzsy, N.; Szymanski, B. K.; Korniss, G.

    2015-01-01

    We study the vulnerability of dominating sets against random and targeted node removals in complex networks. While small, cost-efficient dominating sets play a significant role in controllability and observability of these networks, a fixed and intact network structure is always implicitly assumed. We find that cost-efficiency of dominating sets optimized for small size alone comes at a price of being vulnerable to damage; domination in the remaining network can be severely disrupted, even if a small fraction of dominator nodes are lost. We develop two new methods for finding flexible dominating sets, allowing either adjustable overall resilience, or dominating set size, while maximizing the dominated fraction of the remaining network after the attack. We analyze the efficiency of each method on synthetic scale-free networks, as well as real complex networks. PMID:25662371
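
    A minimal illustration of the fragility described above: a greedy minimum dominating set on a star graph is the single hub, and losing that hub destroys domination entirely. A toy sketch, not the paper's methods:

```python
def greedy_dominating_set(adj):
    """Greedy minimum dominating set: repeatedly take the node that
    dominates the most not-yet-dominated nodes."""
    undominated = set(adj)
    dom = set()
    while undominated:
        best = max(adj, key=lambda v: len(({v} | adj[v]) & undominated))
        dom.add(best)
        undominated -= {best} | adj[best]
    return dom

def dominated_fraction(adj, dom, removed):
    """Fraction of surviving nodes still dominated after node removals."""
    alive = set(adj) - removed
    dom_alive = dom - removed
    covered = {v for v in alive if v in dom_alive or adj[v] & dom_alive}
    return len(covered) / len(alive) if alive else 1.0

# star graph: hub 0 connected to leaves 1..5
star = {0: {1, 2, 3, 4, 5}, **{i: {0} for i in range(1, 6)}}
dom = greedy_dominating_set(star)
```

    The flexible methods in the paper trade a larger dominating set for redundancy, so removing a few dominators leaves most of the surviving network covered.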

  14. A pilot cluster randomized controlled trial of structured goal-setting following stroke.

    PubMed

    Taylor, William J; Brown, Melanie; Levack, William; McPherson, Kathryn M; Reed, Kirk; Dean, Sarah G; Weatherall, Mark

    2012-04-01

    To determine the feasibility, the cluster design effect, and the variance and minimal clinically important difference in the primary outcome in a pilot study of a structured approach to goal-setting. A cluster randomized controlled trial. Inpatient rehabilitation facilities. People who were admitted to inpatient rehabilitation following stroke who had sufficient cognition to engage in structured goal-setting and complete the primary outcome measure. Structured goal elicitation using the Canadian Occupational Performance Measure. Quality of life at 12 weeks using the Schedule for Individualised Quality of Life (SEIQOL-DW), Functional Independence Measure, Short Form 36 and Patient Perception of Rehabilitation (measuring satisfaction with rehabilitation). Assessors were blinded to the intervention. Four rehabilitation services and 41 patients were randomized. We found high values of the intraclass correlation for the outcome measures (ranging from 0.03 to 0.40) and high variance of the SEIQOL-DW (SD 19.6) in relation to the minimal clinically important difference of 2.1, leading to impractically large sample size requirements for a cluster randomized design. A cluster randomized design is not a practical means of avoiding contamination effects in studies of inpatient rehabilitation goal-setting. Other techniques for coping with contamination effects are necessary.

  15. Dancing for Parkinson Disease: A Randomized Trial of Irish Set Dancing Compared With Usual Care.

    PubMed

    Shanahan, Joanne; Morris, Meg E; Bhriain, Orfhlaith Ni; Volpe, Daniele; Lynch, Tim; Clifford, Amanda M

    2017-09-01

    To examine the feasibility of a randomized controlled study design and to explore the benefits of a set dancing intervention compared with usual care. Randomized controlled design, with participants randomized to Irish set dance classes or a usual care group. Community based. Individuals with idiopathic Parkinson disease (PD) (N=90). The dance group attended a 1.5-hour dancing class each week for 10 weeks and undertook a home dance program for 20 minutes, 3 times per week. The usual care group continued with their usual care and daily activities. The primary outcome was feasibility, determined by recruitment rates, success of randomization and allocation procedures, attrition, adherence, safety, willingness of participants to be randomized, resource availability, and cost. Secondary outcomes were motor function (motor section of the Unified Parkinson's Disease Rating Scale), quality of life (Parkinson's Disease Questionnaire-39), functional endurance (6-min walk test), and balance (mini-BESTest). Ninety participants were randomized (45 per group). There were no adverse effects or resource constraints. Although adherence to the dancing program was 93.5%, there was >40% attrition in each group. Postintervention, the dance group had greater nonsignificant gains in quality of life than the usual care group. There was a meaningful deterioration in endurance in the usual care group. There were no meaningful changes in other outcomes. The exit questionnaire showed participants enjoyed the classes and would like to continue participation. For people with mild to moderately severe PD, set dancing is feasible and enjoyable and may improve quality of life. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  16. Random forests-based differential analysis of gene sets for gene expression data.

    PubMed

    Hsueh, Huey-Miin; Zhou, Da-Wei; Tsai, Chen-An

    2013-04-10

    In DNA microarray studies, gene-set analysis (GSA) has become the focus of gene expression data analysis. GSA utilizes the gene expression profiles of functionally related gene sets in Gene Ontology (GO) categories or a priori defined biological classes to assess the significance of gene sets associated with clinical outcomes or phenotypes. Many statistical approaches have been proposed to determine whether such functionally related gene sets are differentially expressed (enrichment and/or depletion) across variations of phenotypes. However, little attention has been given to the discriminatory power of gene sets and classification of patients. In this study, we propose a method of gene set analysis in which gene sets are used to develop classifications of patients based on the Random Forest (RF) algorithm. The corresponding empirical p-value of an observed out-of-bag (OOB) error rate of the classifier is introduced to identify differentially expressed gene sets using an adequate resampling method. In addition, we discuss the impacts and correlations of genes within each gene set based on the measures of variable importance in the RF algorithm. Significant classifications are reported and visualized together with the underlying gene sets and their contribution to the phenotypes of interest. Numerical studies using both synthesized data and a series of publicly available gene expression data sets are conducted to evaluate the performance of the proposed methods. Compared with other hypothesis testing approaches, our proposed methods are reliable and successful in identifying enriched gene sets and in discovering the contributions of genes within a gene set. The classification results of identified gene sets can provide a valuable alternative to gene set testing to reveal the unknown, biologically relevant classes of samples or patients. In summary, our proposed method allows one to simultaneously assess the discriminatory ability of gene sets and the importance of genes for

  17. Fusion of Imperfect Information in the Unified Framework of Random Sets Theory: Application to Target Identification

    DTIC Science & Technology

    2007-11-01

    Florea, Mihai Cristian (Informatique WGZ inc.); Jousselme, Anne-Laure; Bossé, Éloi (DRDC Valcartier). Defence R&D Canada – Valcartier, Technical Report DRDC Valcartier TR 2003-319, November 2007.

  18. A unified development of several techniques for the representation of random vectors and data sets

    NASA Technical Reports Server (NTRS)

    Bundick, W. T.

    1973-01-01

    Linear vector space theory is used to develop a general representation of a set of data vectors or random vectors by linear combinations of orthonormal vectors such that the mean squared error of the representation is minimized. The orthonormal vectors are shown to be the eigenvectors of an operator. The general representation is applied to several specific problems involving the use of the Karhunen-Loeve expansion, principal component analysis, and empirical orthogonal functions; and the common properties of these representations are developed.
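
The optimality property described above can be checked numerically: projecting onto the leading eigenvectors of the sample covariance (the Karhunen-Loeve/PCA basis) gives a lower mean squared reconstruction error than another orthonormal basis of the same size. A small NumPy sketch on synthetic data (the data and dimensions are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# 500 zero-mean data vectors in R^6 with strongly anisotropic variances
X = rng.normal(size=(500, 6)) * np.array([5.0, 3.0, 1.0, 0.5, 0.2, 0.1])

def mse_of_rank_k(X, basis, k):
    """Mean squared error of representing X by k orthonormal vectors."""
    B = basis[:, :k]                     # (6, k) orthonormal columns
    recon = X @ B @ B.T                  # project, then reconstruct
    return np.mean(np.sum((X - recon) ** 2, axis=1))

# Eigenvectors of the sample covariance (Karhunen-Loeve / PCA basis)
evals, evecs = np.linalg.eigh(np.cov(X, rowvar=False))
kl_basis = evecs[:, ::-1]                # sort by decreasing eigenvalue

# A random orthonormal basis for comparison
Q, _ = np.linalg.qr(rng.normal(size=(6, 6)))

# The eigenbasis representation has the smaller mean squared error
assert mse_of_rank_k(X, kl_basis, 2) <= mse_of_rank_k(X, Q, 2)
```

With the full set of six eigenvectors the reconstruction is exact up to floating-point error, consistent with the representation being complete.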

  19. Alcohol risk management in college settings: the safer California universities randomized trial.

    PubMed

    Saltz, Robert F; Paschall, Mallie J; McGaffigan, Richard P; Nygaard, Peter M O

    2010-12-01

    Potentially effective environmental strategies have been recommended to reduce heavy alcohol use among college students. However, studies to date on environmental prevention strategies are few in number and have been limited by their nonexperimental designs, inadequate sample sizes, and lack of attention to settings where the majority of heavy drinking events occur. To determine whether environmental prevention strategies targeting off-campus settings would reduce the likelihood and incidence of student intoxication at those settings. The Safer California Universities study involved 14 large public universities, half of which were assigned randomly to the Safer intervention condition after baseline data collection in 2003. Environmental interventions took place in 2005 and 2006 after 1 year of planning with seven Safer intervention universities. Random cross-sectional samples of undergraduates completed online surveys in four consecutive fall semesters (2003-2006). Campuses and communities surrounding eight campuses of the University of California and six in the California State University system were utilized. The study used random samples of undergraduates (∼500-1000 per campus per year) attending the 14 public California universities. Safer environmental interventions included nuisance party enforcement operations, minor decoy operations, driving-under-the-influence checkpoints, social host ordinances, and use of campus and local media to increase the visibility of environmental strategies. Proportion of drinking occasions in which students drank to intoxication at six different settings during the fall semester (residence hall party, campus event, fraternity or sorority party, party at off-campus apartment or house, bar/restaurant, outdoor setting), any intoxication at each setting during the semester, and whether students drank to intoxication the last time they went to each setting. Significant reductions in the incidence and likelihood of intoxication at

  20. Mixed-point geostatistical simulation: A combination of two- and multiple-point geostatistics

    NASA Astrophysics Data System (ADS)

    Cordua, Knud Skou; Hansen, Thomas Mejer; Gulbrandsen, Mats Lundh; Barnes, Christophe; Mosegaard, Klaus

    2016-09-01

    Multiple-point-based geostatistical methods are used to model complex geological structures. However, a training image containing the characteristic patterns of the Earth model has to be provided. If no training image is available, two-point (i.e., covariance-based) geostatistical methods are typically applied instead because these methods provide fewer constraints on the Earth model. This study is motivated by the case where 1-D vertical training images are available through borehole logs, whereas little or no information about horizontal dependencies exists. This problem is solved by developing theory that makes it possible to combine information from multiple- and two-point geostatistics for different directions, leading to a mixed-point geostatistical model. An example of combining information from the multiple-point-based single normal equation simulation algorithm and two-point-based sequential indicator simulation algorithm is provided. The mixed-point geostatistical model is used for conditional sequential simulation based on vertical training images from five borehole logs and a range parameter describing the horizontal dependencies.
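
As a reminder of what the two-point side of such a model captures, here is a minimal sketch (synthetic data, not from the paper) of the classical empirical semivariogram estimator for a regularly sampled 1-D sequence such as a borehole log:

```python
import numpy as np

def empirical_semivariogram(z, max_lag):
    """Classical estimator gamma(h) = 0.5 * mean((z[i+h] - z[i])^2)
    for a regularly sampled 1-D sequence (e.g. a borehole log)."""
    return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2)
                     for h in range(1, max_lag + 1)])

rng = np.random.default_rng(0)
# Correlated series: moving average of white noise (range of ~10 samples)
noise = rng.normal(size=1000)
z = np.convolve(noise, np.ones(10) / 10, mode="valid")

gamma = empirical_semivariogram(z, max_lag=30)
# Semivariance grows with lag until the correlation range is passed
assert gamma[0] < gamma[4] < gamma[9]
```

For this moving-average process the theoretical semivariogram rises linearly to its sill at lag 10, which the estimate reproduces up to sampling noise.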

  1. A more complete task-set reconfiguration in random than in predictable task switch.

    PubMed

    Tornay, F J; Milán, E G

    2001-08-01

    Three experiments are presented that compare the cost found when switching from one task to another in two different conditions. In one of them, the tasks switch in predictable sequences. In the other condition, the tasks alternate at random. A smaller time cost is found in the random-switch condition when enough preparation time is allowed. This effect arises because the random-switch cost continues to decrease with preparation time after the predictable-switch cost has reached an asymptote. Although the relationship between the number of repetitions of one task and the time cost differs between the random- and predictable-switch conditions, only the latter shows the presence of an "exogenous" component of cost. The implications of this finding are discussed in relation to the usual distinction between an endogenous component of switch cost that is affected by preparation time and another exogenous, residual component (e.g., Rogers & Monsell, 1995). It is proposed that a different kind of task-set preparation is at work when tasks alternate at random.

  2. Set-back versus buried vertical mattress suturing: results of a randomized blinded trial.

    PubMed

    Wang, Audrey S; Kleinerman, Rebecca; Armstrong, April W; Fitzmaurice, Sarah; Pascucci, Anabella; Awasthi, Smita; Ratnarathorn, Mondhipa; Sivamani, Raja; King, Thomas H; Eisen, Daniel B

    2015-04-01

    The set-back suture, an absorbable dermal suturing technique, purportedly improves wound eversion and cosmetic outcomes. We sought to conduct a split-wound, prospective, randomized study to compare the cosmetic outcome and wound eversion achieved with the set-back suture and the buried vertical mattress suture (BVMS). A total of 46 surgical elliptical wounds were randomized to subcuticular closure with the set-back suture on one half and the BVMS on the other. Maximum eversion height and width were measured immediately postoperatively. At 3 months, 2 blinded observers evaluated each scar using a 7-point Likert physician global scar assessment scale. Subjects and observers also completed the validated Patient and Observer Scar Assessment Scale, where a score of 6 represents normal-appearing skin and 60 represents worst imaginable scar. In all, 42 subjects completed the study. The set-back suture provided significantly greater wound eversion. On the Likert scale, observers rated the set-back suture side 1 point better than the BVMS side. Both patient and observer total Patient and Observer Scar Assessment Scale scores were significantly lower for the set-back suture side (subject mean 13.0 ± 8.7 vs 16.2 ± 12.0 [P = .039]; observer mean 24.5 ± 10.4 vs 27.7 ± 13.6 [P = .028], respectively). Single-institution experience and relatively short follow-up are limitations. The set-back suture provides superior wound eversion and better cosmetic outcomes than the BVMS.

  3. Multiple Point Geostatistics for automated landform mapping

    NASA Astrophysics Data System (ADS)

    Karssenberg, D.; Vannametee, E.; Babel, L.; Schuur, J.; Hendriks, M.; Bierkens, M. F.

    2011-12-01

    Land-surface processes are often studied at the level of elementary landform units, e.g. geomorphological units. To avoid expensive and difficult field surveys and to ensure a consistent mapping scheme, automated derivation of these units is desirable. However, automated classification based on two-point statistics of topographical attributes (e.g. the semivariogram) is inadequate in reproducing complex, curvilinear landform patterns. Therefore, the spatial structure and configuration of terrain characteristics suitable for landform classification should be based on statistics from multiple points. In this study, a generic automated landform classification routine is developed which is based on Multiple Point Geostatistics (MPG), using information from a field map of geomorphology in a training area and a gridded Digital Elevation Model (DEM). Focus is on classification of geomorphologic units, e.g. alluvial fan, river terrace. The approach is evaluated using data from the French Alps. In the first procedural step, spatial statistics of the geomorphologic units are retrieved from a training data set, consisting of a digital elevation model and a geomorphologic map, created using field observations and 37.5 × 37.5 m² cells. For each grid cell in the training data set, the geomorphological unit of the grid cell and a set of topographical attributes (i.e. a pattern) of the grid cell are stored in a frequency database. The set of topographical attributes stored is chosen such that it represents criteria used in field mapping. These are, for instance, topographical slope gradient, upstream area, or geomorphological units mapped in the neighborhood of the cell. Continuous information (e.g. slope) is converted to categorical data (slope class), which is required in the MPG approach. The second step is to use the knowledge stored in the frequency database for mapping.
The algorithm reads a set of attribute classes from a classification target cell and its surrounding cells taking
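
The two-step procedure described above (build a frequency database of attribute patterns from the training area, then classify target cells by pattern lookup) can be sketched as follows; the attribute names and classes are hypothetical, not the study's actual mapping criteria:

```python
from collections import Counter, defaultdict

# Hypothetical training cells: (pattern of attribute classes, mapped unit),
# where a pattern might be (slope_class, upstream_area_class, neighbour_unit)
training = [
    (("steep", "small", "scree"), "scree"),
    (("steep", "small", "scree"), "scree"),
    (("gentle", "large", "terrace"), "alluvial_fan"),
    (("gentle", "large", "fan"), "alluvial_fan"),
    (("flat", "large", "fan"), "river_terrace"),
]

# Step 1: frequency database mapping each pattern to landform-unit counts
freq = defaultdict(Counter)
for pattern, unit in training:
    freq[pattern][unit] += 1

# Step 2: classify a target cell by the most frequent unit for its pattern
def classify(pattern):
    if pattern not in freq:
        return None  # pattern never observed in the training area
    return freq[pattern].most_common(1)[0][0]

assert classify(("steep", "small", "scree")) == "scree"
assert classify(("gentle", "large", "fan")) == "alluvial_fan"
```

A real implementation would also handle unseen patterns (e.g. by relaxing the pattern or falling back on marginal frequencies) rather than returning None.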

  4. A PC-Windows-Based program for geostatistical modeling application

    SciTech Connect

    Wu, G.G.; Yang, A.P.

    1994-12-31

    This paper describes a technically advanced, user-friendly, PC-Windows™-based reservoir simulation tool (SIMTOOLS) that allows construction of realistic reservoir models using a geostatistical approach. This PC-Windows-based product has three application tools: digitizing, mapping, and geostatistics. It has been designed primarily to enable reservoir engineers to apply the geostatistical gridding technique in mapping and reservoir simulation practices.

  5. Stability of Dominating Sets in Complex Networks against Random and Targeted Attacks

    NASA Astrophysics Data System (ADS)

    Molnar, F.; Derzsy, N.; Szymanski, B. K.; Korniss, G.

    2014-03-01

    Minimum dominating sets (MDS) are involved in efficiently controlling and monitoring many social and technological networks. However, MDS influence over the entire network may be significantly reduced when some MDS nodes are disabled due to random breakdowns or targeted attacks against nodes in the network. We investigate the stability of domination in scale-free networks in such scenarios. We define stability as the fraction of nodes in the network that are still dominated after some nodes have been removed, either randomly, or by targeting the highest-degree nodes. We find that although the MDS is the most cost-efficient solution (requiring the least number of nodes) for reaching every node in an undamaged network, it is also very sensitive to damage. Further, we investigate alternative methods for finding dominating sets that are less efficient (more costly) than MDS but provide better stability. Finally we construct an algorithm based on greedy node selection that allows us to precisely control the balance between domination stability and cost, to achieve any desired stability at minimum cost, or the best possible stability at any given cost. Analysis of our method shows moderate improvement of domination cost efficiency against random breakdowns, but substantial improvements against targeted attacks. Supported by DARPA, DTRA, ARL NS-CTA, ARO, and ONR.
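
A minimal sketch of greedy node selection for dominating sets, the building block the algorithm above refines (the toy graph is an illustration, not one of the studied networks):

```python
def greedy_dominating_set(adj):
    """Greedy dominating set: repeatedly pick the node whose closed
    neighbourhood covers the most not-yet-dominated nodes.

    adj: dict mapping node -> set of neighbours (undirected graph).
    """
    undominated = set(adj)
    ds = set()
    while undominated:
        # coverage of v = itself plus neighbours, among undominated nodes
        best = max(adj, key=lambda v: len((adj[v] | {v}) & undominated))
        ds.add(best)
        undominated -= adj[best] | {best}
    return ds

# Star graph: the hub alone dominates every node
star = {0: {1, 2, 3, 4}, 1: {0}, 2: {0}, 3: {0}, 4: {0}}
assert greedy_dominating_set(star) == {0}
```

Balancing stability against cost, as in the paper, would bias this selection rule toward redundant coverage instead of pure greedy efficiency.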

  6. Efficient estimation in two-stage randomized clinical trials using ranked sets.

    PubMed

    Hossain, Syed Shahadat; Awan, Nabil

    2016-12-14

    Clinical trials designed for survival probability estimation of different treatment policies for chronic diseases like cancer, leukemia, and schizophrenia usually need randomization of treatments in two stages. Since complete remission is rare for these diseases, an induction therapy is initially given to achieve the patient's remission. Further treatment, which is often an expensive maintenance therapy, is administered only to the patients with remission. If the maintenance therapy is so expensive that it inflates the cost of the trial, only a simple random sample of patients will be treated with the expensive maintenance therapy due to budget constraints. In this article, we have implemented a design using ranked sets instead of simple randomization in the second stage and obtained an unbiased estimator of the overall survival distribution for a particular treatment combination. Through simulation studies under different conditions, we have found that the design we developed based on ranked sets gives an unbiased estimate of the population survival probability which is more efficient than the estimate obtained by the usual design.
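
A small simulation sketch of the ranked-set idea, assuming perfect ranking; the population and sample sizes are hypothetical and only illustrate why the ranked-set mean is unbiased yet less variable than a simple random sample of the same size:

```python
import numpy as np

def ranked_set_sample(draw, m, rng):
    """One cycle of ranked set sampling (RSS) with perfect ranking:
    draw m sets of m units, rank each set, and measure only the i-th
    order statistic of the i-th set."""
    return np.array([np.sort(draw(m, rng))[i] for i in range(m)])

def draw(n, rng):
    # Hypothetical skewed, survival-like population with true mean 2.0
    return rng.exponential(scale=2.0, size=n)

rng = np.random.default_rng(0)
m, cycles, reps = 3, 10, 1000

rss_means = [np.mean(np.concatenate([ranked_set_sample(draw, m, rng)
                                     for _ in range(cycles)]))
             for _ in range(reps)]
srs_means = [np.mean(draw(m * cycles, rng)) for _ in range(reps)]

# The RSS mean is unbiased and more efficient than the simple random
# sample (SRS) mean based on the same number of measured units
assert abs(np.mean(rss_means) - 2.0) < 0.05
assert np.var(rss_means) < np.var(srs_means)
```

Averaging one order statistic from each ranked set recovers the population mean exactly in expectation, while the stratification by rank reduces the variance.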

  7. Geostatistical Interpolation of Particle-Size Curves in Heterogeneous Aquifers

    NASA Astrophysics Data System (ADS)

    Guadagnini, A.; Menafoglio, A.; Secchi, P.

    2013-12-01

    We address the problem of predicting the spatial field of particle-size curves (PSCs) from measurements associated with soil samples collected at a discrete set of locations within an aquifer system. Proper estimates of the full PSC are relevant to applications related to groundwater hydrology, soil science and geochemistry and aimed at modeling physical and chemical processes occurring in heterogeneous earth systems. Hence, we focus on providing kriging estimates of the entire PSC at unsampled locations. To this end, we treat particle-size curves as cumulative distribution functions, model their densities as functional compositional data and analyze them by embedding these into the Hilbert space of compositional functions endowed with the Aitchison geometry. On this basis, we develop a new geostatistical methodology for the analysis of spatially dependent functional compositional data. Our functional compositional kriging (FCK) approach allows providing predictions at unsampled location of the entire particle-size curve, together with a quantification of the associated uncertainty, by fully exploiting both the functional form of the data and their compositional nature. This is a key advantage of our approach with respect to traditional methodologies, which treat only a set of selected features (e.g., quantiles) of PSCs. Embedding the full PSC into a geostatistical analysis enables one to provide a complete characterization of the spatial distribution of lithotypes in a reservoir, eventually leading to improved predictions of soil hydraulic attributes through pedotransfer functions as well as of soil geochemical parameters which are relevant in sorption/desorption and cation exchange processes. We test our new method on PSCs sampled along a borehole located within an alluvial aquifer near the city of Tuebingen, Germany. The quality of FCK predictions is assessed through leave-one-out cross-validation. 
A comparison between hydraulic conductivity estimates obtained

  8. Characterizing gene sets using discriminative random walks with restart on heterogeneous biological networks

    PubMed Central

    Blatti, Charles; Sinha, Saurabh

    2016-01-01

    Motivation: Analysis of co-expressed gene sets typically involves testing for enrichment of different annotations or ‘properties’ such as biological processes, pathways, transcription factor binding sites, etc., one property at a time. This common approach ignores any known relationships among the properties or the genes themselves. It is believed that known biological relationships among genes and their many properties may be exploited to more accurately reveal commonalities of a gene set. Previous work has sought to achieve this by building biological networks that combine multiple types of gene–gene or gene–property relationships, and performing network analysis to identify other genes and properties most relevant to a given gene set. Most existing network-based approaches for recognizing genes or annotations relevant to a given gene set collapse information about different properties to simplify (homogenize) the networks. Results: We present a network-based method for ranking genes or properties related to a given gene set. Such related genes or properties are identified from among the nodes of a large, heterogeneous network of biological information. Our method involves a random walk with restarts, performed on an initial network with multiple node and edge types that preserve more of the original, specific property information than current methods that operate on homogeneous networks. In this first stage of our algorithm, we find the properties that are the most relevant to the given gene set and extract a subnetwork of the original network, comprising only these relevant properties. We then re-rank genes by their similarity to the given gene set, based on a second random walk with restarts, performed on the above subnetwork. We demonstrate the effectiveness of this algorithm for ranking genes related to Drosophila embryonic development and aggressive responses in the brains of social animals. Availability and Implementation: DRaWR was implemented as
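
A minimal sketch of a random walk with restarts on a small homogeneous toy graph; DRaWR itself operates on large heterogeneous networks with typed nodes and edges, so this only illustrates the underlying iteration:

```python
import numpy as np

def random_walk_with_restart(W, seed_nodes, restart=0.3, tol=1e-10):
    """Stationary scores of a random walk with restarts: the walker
    follows the column-stochastic transition matrix W, and with
    probability `restart` jumps back to the seed set (e.g. a gene set)."""
    n = W.shape[0]
    p0 = np.zeros(n)
    p0[list(seed_nodes)] = 1.0 / len(seed_nodes)
    p = p0.copy()
    while True:
        p_next = (1 - restart) * W @ p + restart * p0
        if np.abs(p_next - p).sum() < tol:
            return p_next
        p = p_next

# Toy graph: two triangles joined by one edge; seeds in the left triangle
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)
W = A / A.sum(axis=0)            # normalise columns to be stochastic
p = random_walk_with_restart(W, seed_nodes={0, 1})

# Nodes close to the seed set receive higher stationary scores
assert p[2] > p[4] and p[2] > p[5]
```

Ranking nodes by these stationary scores is what lets the method surface genes and properties most relevant to the query gene set.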

  9. Geostatistics: models and tools for the earth sciences

    SciTech Connect

    Journel, A.G.

    1986-01-01

    The probability construct underlying geostatistical methodology is recalled, stressing that stationarity is a property of the model rather than of the phenomenon being represented. Geostatistics is more than interpolation, and kriging(s) is more than linear interpolation through ordinary kriging. A few common misconceptions are addressed.
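
As an illustration that kriging is more than generic linear interpolation, here is a minimal ordinary-kriging sketch in 1-D: the weights solve a variogram-based linear system with an unbiasedness constraint. The exponential variogram model and the data values are assumptions for illustration:

```python
import numpy as np

def ordinary_kriging(xs, zs, x0, variogram):
    """Ordinary kriging prediction at x0 from observations zs at 1-D
    locations xs; unbiasedness is enforced via a Lagrange multiplier."""
    n = len(xs)
    # Kriging system: [Gamma 1; 1' 0] [w; mu] = [gamma(xi, x0); 1]
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(np.abs(xs[:, None] - xs[None, :]))
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(np.abs(xs - x0))
    w = np.linalg.solve(A, b)[:n]
    return w @ zs, w

# Assumed exponential variogram model: sill 1, range parameter 2
variogram = lambda h: 1.0 - np.exp(-h / 2.0)

xs = np.array([0.0, 1.0, 3.0, 4.0])
zs = np.array([1.0, 2.0, 4.0, 3.0])
z_hat, w = ordinary_kriging(xs, zs, x0=2.0, variogram=variogram)

assert abs(w.sum() - 1.0) < 1e-9     # unbiasedness: weights sum to 1
```

Unlike a fixed interpolation rule, the weights adapt to the assumed spatial correlation structure: changing the variogram changes the prediction.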

  10. A natural setting behavior management program for persons with acquired brain injury: a randomized controlled trial.

    PubMed

    Carnevale, George J; Anselmi, Vera; Johnston, Mark V; Busichio, Kim; Walsh, Vanessa

    2006-10-01

    To investigate the efficacy of a behavior management program delivered in the natural community setting for persons with brain injury and their caregivers. Three-group randomized controlled trial. Homes and other community settings. Thirty-seven persons with traumatic and other acquired brain injury and their caregivers. Natural Setting Behavior Management (NSBM) involving education and individualized behavior modification program versus education only versus control group. Changes in frequency of targeted problematic behaviors. Subscale in Questionnaire on Resources and Stress, Maslach Burnout Inventory, and the Neurobehavioral Functioning Inventory. While no significant effects were detected at termination of education only (P<.075) or of NSBM (P<.56), significant treatment effects were found at the main outcome point 3 months after termination of services (P<.002). Rates of disruptive or aggressive behaviors declined significantly in the NSBM group. Differences in caregiver-rated stress, burden, and aggression were not statistically significant. A program of caregiver education and individualized behavior management in natural settings can decrease the frequency of disruptive behavioral challenges. Larger studies are needed to clarify the duration and intensity of education and individualized treatment required to diminish behavioral challenges and to understand relationships with general stress and burden experienced by caregivers.

  11. Random sets of stadiums in square and collective behavior of bacteria.

    PubMed

    Czapla, Roman

    2016-09-21

    Collective motion of swimmers can be detected through hydrodynamic interactions via the effective (macroscopic) viscosity. It follows from general hydrodynamics that the effective viscosity of non-dilute random suspensions depends on the shape of the particles and on their spatial probabilistic distribution. Therefore, a comparative analysis of disordered and collectively interacting particles of bacterial shape can be done in terms of the probabilistic geometric parameters which determine the effective viscosity. In this paper, we develop a quantitative criterion to detect the collective behavior of bacteria. This criterion is based on the basic statistical moments (e-sums, or generalized Eisenstein-Rayleigh sums) which characterize the high-order correlation functions. The locations and shapes of bacteria are modeled by stadiums randomly embedded in a medium without overlapping. This shape model can be considered an improvement of the previous segment model. We calculate the e-sums of the simulated disordered sets and of the observed experimental locations of Bacillus subtilis. The obtained results show a difference between these two sets that demonstrates the collective motion of bacteria.

  12. An Efficient ERP-Based Brain-Computer Interface Using Random Set Presentation and Face Familiarity

    PubMed Central

    Müller, Klaus-Robert; Lee, Seong-Whan

    2014-01-01

    Event-related potential (ERP)-based P300 spellers are commonly used in the field of brain-computer interfaces as an alternative channel of communication for people with severe neuro-muscular diseases. This study introduces a novel P300-based brain-computer interface (BCI) stimulus paradigm using a random set presentation pattern and exploiting the effects of face familiarity. The effect of face familiarity is widely studied in the cognitive neurosciences and has recently been addressed for the purpose of BCI. In this study we compare P300-based BCI performances of a conventional row-column (RC)-based paradigm with our approach, which combines a random set presentation paradigm with (non-) self-face stimuli. Our experimental results indicate stronger deflections of the ERPs in response to face stimuli, which are further enhanced when using the self-face images, thereby improving P300-based spelling performance. This led to a significant reduction of the stimulus sequences required for correct character classification. These findings demonstrate a promising new approach for improving the speed and thus fluency of BCI-enhanced communication with the widely used P300-based BCI setup. PMID:25384045

  13. Random Finite Set Based Bayesian Filtering with OpenCL in a Heterogeneous Platform

    PubMed Central

    Hu, Biao; Sharif, Uzair; Koner, Rajat; Chen, Guang; Huang, Kai; Zhang, Feihu; Stechele, Walter; Knoll, Alois

    2017-01-01

    While most filtering approaches based on random finite sets have focused on improving performance, in this paper, we argue that computation times are very important in order to enable real-time applications such as pedestrian detection. Towards this goal, this paper investigates the use of OpenCL to accelerate the computation of random finite set-based Bayesian filtering in a heterogeneous system. In detail, we developed an efficient and fully-functional pedestrian-tracking system implementation, which can run under real-time constraints while offering decent tracking accuracy. An extensive evaluation analysis was carried out to ensure the fulfillment of sufficient accuracy requirements. This was followed by extensive profiling analysis to spot the potential bottlenecks in terms of execution performance, which were then targeted to come up with an OpenCL accelerated application. Video-throughput improvements from roughly 15 fps to 100 fps (6×) were observed on average while processing typical MOT benchmark videos. Moreover, the worst-case frame processing yielded an 18× advantage from nearly 2 fps to 36 fps, thereby comfortably meeting the real-time constraints. Our implementation is released as open-source code. PMID:28417906

  14. Random Finite Set Based Bayesian Filtering with OpenCL in a Heterogeneous Platform.

    PubMed

    Hu, Biao; Sharif, Uzair; Koner, Rajat; Chen, Guang; Huang, Kai; Zhang, Feihu; Stechele, Walter; Knoll, Alois

    2017-04-12

    While most filtering approaches based on random finite sets have focused on improving performance, in this paper, we argue that computation times are very important in order to enable real-time applications such as pedestrian detection. Towards this goal, this paper investigates the use of OpenCL to accelerate the computation of random finite set-based Bayesian filtering in a heterogeneous system. In detail, we developed an efficient and fully-functional pedestrian-tracking system implementation, which can run under real-time constraints while offering decent tracking accuracy. An extensive evaluation analysis was carried out to ensure the fulfillment of sufficient accuracy requirements. This was followed by extensive profiling analysis to spot the potential bottlenecks in terms of execution performance, which were then targeted to come up with an OpenCL accelerated application. Video-throughput improvements from roughly 15 fps to 100 fps (6×) were observed on average while processing typical MOT benchmark videos. Moreover, the worst-case frame processing yielded an 18× advantage from nearly 2 fps to 36 fps, thereby comfortably meeting the real-time constraints. Our implementation is released as open-source code.

  15. An efficient ERP-based brain-computer interface using random set presentation and face familiarity.

    PubMed

    Yeom, Seul-Ki; Fazli, Siamac; Müller, Klaus-Robert; Lee, Seong-Whan

    2014-01-01

    Event-related potential (ERP)-based P300 spellers are commonly used in the field of brain-computer interfaces as an alternative channel of communication for people with severe neuro-muscular diseases. This study introduces a novel P300-based brain-computer interface (BCI) stimulus paradigm using a random set presentation pattern and exploiting the effects of face familiarity. The effect of face familiarity is widely studied in the cognitive neurosciences and has recently been addressed for the purpose of BCI. In this study we compare P300-based BCI performances of a conventional row-column (RC)-based paradigm with our approach, which combines a random set presentation paradigm with (non-) self-face stimuli. Our experimental results indicate stronger deflections of the ERPs in response to face stimuli, which are further enhanced when using the self-face images, thereby improving P300-based spelling performance. This led to a significant reduction of the stimulus sequences required for correct character classification. These findings demonstrate a promising new approach for improving the speed and thus fluency of BCI-enhanced communication with the widely used P300-based BCI setup.

  16. Geostatistical Analysis of Spatial Variability of Mineral Abundance and Kd in Frenchman Flat, NTS, Alluvium

    SciTech Connect

    Carle, S F; Zavarin, M; Pawloski, G A

    2002-11-01

    LLNL hydrologic source term modeling at the Cambric site (Pawloski et al., 2000) showed that retardation of radionuclide transport is sensitive to the distribution and amount of radionuclide-sorbing minerals. While all mineralogic information available near the Cambric site was used in these early simulations (11 mineral abundance analyses from UE-5n and 9 from RNM-1), these older data sets were qualitative in nature, with detection limits too high to accurately measure many of the important radionuclide-sorbing minerals (e.g. iron oxide). Also, the sparse nature of the mineral abundance data permitted only a hypothetical description of the spatial distribution of radionuclide-sorbing minerals. Yet, the modeling results predicted that the spatial distribution of sorbing minerals would strongly affect radionuclide transport. Clearly, additional data are needed to improve understanding of mineral abundances and their spatial distributions if model predictions in Frenchman Flat are to be defensible. This report evaluates new high-resolution quantitative X-Ray Diffraction (XRD) data on mineral distributions and their abundances from core samples recently collected from drill hole ER-5-4. A total of 94 samples from ER-5-4 were collected at various spacings to enable evaluation of spatial variability at a variety of spatial scales, as small as 0.3 meters and up to hundreds of meters. Additional XRD analyses obtained from drill holes UE-5n, ER-5-3, and U-11g-1 are used to augment evaluation of vertical spatial variability and permit some evaluation of lateral spatial variability. A total of 163 samples are evaluated. The overall goal of this study is to understand and characterize the spatial variation of sorbing minerals in Frenchman Flat alluvium using geostatistical techniques, with consideration for the potential impact on reactive transport of radionuclides. To achieve this goal requires an effort to ensure that plausible geostatistical models are used to

  17. Randomized Trial of Plastic Bags to Prevent Term Neonatal Hypothermia in a Resource-Poor Setting

    PubMed Central

    Belsches, Theodore C.; Tilly, Alyssa E.; Miller, Tonya R.; Kambeyanda, Rohan H.; Leadford, Alicia; Manasyan, Albert; Chomba, Elwyn; Ramani, Manimaran; Ambalavanan, Namasivayam

    2013-01-01

    OBJECTIVES: Term infants in resource-poor settings frequently develop hypothermia during the first hours after birth. Plastic bags or wraps are a low-cost intervention for the prevention of hypothermia in preterm and low birth weight infants that may also be effective in term infants. Our objective was to test the hypothesis that placement of term neonates in plastic bags at birth reduces hypothermia at 1 hour after birth in a resource-poor hospital. METHODS: This parallel-group randomized controlled trial was conducted at University Teaching Hospital, the tertiary referral center in Zambia. Inborn neonates with both a gestational age ≥37 weeks and a birth weight ≥2500 g were randomized 1:1 to either a standard thermoregulation protocol or to a standard thermoregulation protocol with placement of the torso and lower extremities inside a plastic bag within 10 minutes after birth. The primary outcome was hypothermia (<36.5°C axillary temperature) at 1 hour after birth. RESULTS: Neonates randomized to plastic bag (n = 135) or to standard thermoregulation care (n = 136) had similar baseline characteristics (birth weight, gestational age, gender, and baseline temperature). Neonates in the plastic bag group had a lower rate of hypothermia (60% vs 73%, risk ratio 0.76, confidence interval 0.60–0.96, P = .026) and a higher axillary temperature (36.4 ± 0.5°C vs 36.2 ± 0.7°C, P < .001) at 1 hour after birth compared with infants receiving standard care. CONCLUSIONS: Placement in a plastic bag at birth reduced the incidence of hypothermia at 1 hour after birth in term neonates born in a resource-poor setting, but most neonates remained hypothermic. PMID:23979082
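
The risk ratio and confidence interval reported above can be computed as follows; the counts in this sketch are hypothetical, not the trial's data:

```python
import math

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """Risk ratio of event rates a/n1 vs b/n2 with a Wald confidence
    interval on the log scale. The counts here are hypothetical."""
    rr = (a / n1) / (b / n2)
    # standard error of log(RR)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo, hi = rr * math.exp(-z * se), rr * math.exp(z * se)
    return rr, lo, hi

# Hypothetical example: 60/100 events in one arm vs 80/100 in the other
rr, lo, hi = risk_ratio_ci(60, 100, 80, 100)
```

An interval whose upper bound stays below 1 corresponds to a statistically significant reduction, as reported in the trial.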

  18. GEOSTATISTICS FOR WASTE MANAGEMENT: A USER'S MANUAL FOR THE GEOPACK (VERSION 1.0) GEOSTATISTICAL SOFTWARE SYSTEM

    EPA Science Inventory

    A comprehensive, user-friendly geostatistical software system called GEOPACk has been developed. The purpose of this software is to make available the programs necessary to undertake a geostatistical analysis of spatially correlated data. The programs were written so that they ...

  20. Context-free pairs of groups II — Cuts, tree sets, and random walks

    PubMed Central

    Woess, Wolfgang

    2012-01-01

    This is a continuation of the study, begun by Ceccherini-Silberstein and Woess (2009) [5], of context-free pairs of groups and the related context-free graphs in the sense of Muller and Schupp (1985) [22]. The graphs under consideration are Schreier graphs of a subgroup of some finitely generated group, and context-freeness relates to a tree-like structure of those graphs. Instead of the cones of Muller and Schupp (1985) [22] (connected components resulting from deletion of finite balls with respect to the graph metric), a more general approach to context-free graphs is proposed via tree sets consisting of cuts of the graph, and associated structure trees. The existence of tree sets with certain “good” properties is studied. With a tree set, a natural context-free grammar is associated. These investigations of the structure of context-free pairs, resp. graphs, are then applied to study random walk asymptotics via complex analysis. In particular, a complete proof of the local limit theorem for return probabilities on any virtually free group is given, as well as on Schreier graphs of a finitely generated subgroup of a free group. This extends, respectively completes, the significant work of Lalley (1993, 2001) [18,20]. PMID:22267873
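
    As an illustrative aside on the return probabilities discussed above: for the simple random walk on the free group F2 (whose Cayley graph is the 4-regular tree), exact return probabilities can be computed from the distance-from-identity Markov chain. The sketch below is not from the paper; it only computes the quantity that the local limit theorem describes.

```python
from fractions import Fraction

# Distance-from-identity chain for the simple random walk on the free
# group F2 (4-regular tree): from distance 0 the walk moves to 1 with
# probability 1; from distance k >= 1 it moves to k+1 with probability
# 3/4 and back to k-1 with probability 1/4.
def return_probabilities(n_steps):
    dist = {0: Fraction(1)}          # exact distribution over distances
    p_return = []
    for _ in range(n_steps):
        nxt = {}
        for k, p in dist.items():
            if k == 0:
                nxt[1] = nxt.get(1, Fraction(0)) + p
            else:
                nxt[k + 1] = nxt.get(k + 1, Fraction(0)) + p * Fraction(3, 4)
                nxt[k - 1] = nxt.get(k - 1, Fraction(0)) + p * Fraction(1, 4)
        dist = nxt
        p_return.append(dist.get(0, Fraction(0)))
    return p_return

probs = return_probabilities(10)
print(probs[1])   # P(X_2 = e) = 1/4
```

The even-step probabilities computed this way decay like rho^(2n) * n^(-3/2) with rho = sqrt(3)/2; local limit theorems of the kind proved in the paper make this asymptotic precise.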

  1. Cluster randomized controlled trial protocol: addressing reproductive coercion in health settings (ARCHES).

    PubMed

    Tancredi, Daniel J; Silverman, Jay G; Decker, Michele R; McCauley, Heather L; Anderson, Heather A; Jones, Kelley A; Ciaravino, Samantha; Hicks, Angela; Raible, Claire; Zelazny, Sarah; James, Lisa; Miller, Elizabeth

    2015-08-06

    Women ages 16-29 utilizing family planning clinics for medical services experience higher rates of intimate partner violence (IPV) and reproductive coercion (RC) than their same-age peers, increasing risk for unintended pregnancy and related poor reproductive health outcomes. Brief interventions integrated into routine family planning care have shown promise in reducing risk for RC, but longer-term intervention effects on partner violence victimization, RC, and unintended pregnancy have not been examined. The 'Addressing Reproductive Coercion in Health Settings (ARCHES)' Intervention Study is a cluster randomized controlled trial evaluating the effectiveness of a brief, clinician-delivered universal education and counseling intervention to reduce IPV, RC and unintended pregnancy compared to standard-of-care in family planning clinic settings. The ARCHES intervention was refined based on formative research. Twenty-five family planning clinics were randomized (in 17 clusters) to either a three-hour training for all family planning clinic staff on how to deliver the ARCHES intervention or to a standard-of-care control condition. All women ages 16-29 seeking care in these family planning clinics were eligible to participate. Consenting clients use laptop computers to answer survey questions immediately prior to their clinic visit, a brief exit survey immediately after the clinic visit, a first follow up survey 12-20 weeks after the baseline visit (T2), and a final survey 12 months after the baseline (T3). Medical record chart review provides additional data about IPV and RC assessment and disclosure, sexual and reproductive health diagnoses, and health care utilization. Of 4009 women approached and determined to be eligible based on age (16-29 years old), 3687 (92% participation) completed the baseline survey and were included in the sample. 
The ARCHES Intervention Study is a community-partnered study designed to provide a rigorous assessment of the short (3-4 months

  2. The Ethics of Randomized Controlled Trials in Social Settings: Can Social Trials Be Scientifically Promising and Must There Be Equipoise?

    ERIC Educational Resources Information Center

    Fives, Allyn; Russell, Daniel W.; Canavan, John; Lyons, Rena; Eaton, Patricia; Devaney, Carmel; Kearns, Norean; O'Brien, Aoife

    2015-01-01

    In a randomized controlled trial (RCT), treatments are assigned randomly and treatments are withheld from participants. Is it ethically permissible to conduct an RCT in a social setting? This paper addresses two conditions for justifying RCTs: that there should be a state of equipoise and that the trial should be scientifically promising.…

  3. Randomized Controlled Trial in Clinical Settings to Evaluate Effectiveness of Coping Skills Education Used with Progressive Tinnitus Management

    ERIC Educational Resources Information Center

    Henry, James A.; Thielman, Emily J.; Zaugg, Tara L.; Kaelin, Christine; Schmidt, Caroline J.; Griest, Susan; McMillan, Garnett P.; Myers, Paula; Rivera, Izel; Baldwin, Robert; Carlson, Kathleen

    2017-01-01

    Purpose: This randomized controlled trial evaluated, within clinical settings, the effectiveness of coping skills education that is provided with progressive tinnitus management (PTM). Method: At 2 Veterans Affairs medical centers, N = 300 veterans were randomized to either PTM intervention or 6-month wait-list control. The PTM intervention…

  5. Fitting parametric random effects models in very large data sets with application to VHA national data.

    PubMed

    Gebregziabher, Mulugeta; Egede, Leonard; Gilbert, Gregory E; Hunt, Kelly; Nietert, Paul J; Mauldin, Patrick

    2012-10-24

    With the current focus on personalized medicine, patient/subject level inference is often of key interest in translational research. As a result, random effects models (REM) are becoming popular for patient level inference. However, for very large data sets that are characterized by large sample size, it can be difficult to fit REM using commonly available statistical software such as SAS, since they require inordinate amounts of computer time and memory allocations beyond what are available, preventing model convergence. For example, in a retrospective cohort study of over 800,000 Veterans with type 2 diabetes with longitudinal data over 5 years, fitting REM via generalized linear mixed modeling using currently available standard procedures in SAS (e.g. PROC GLIMMIX) was very difficult, and the same problems exist in Stata's gllamm or R's lme packages. Thus, this study proposes and assesses the performance of a meta-regression approach and compares it with methods based on sampling of the full data. We use both simulated and real data from a national cohort of Veterans with type 2 diabetes (n=890,394), which was created by linking multiple patient and administrative files, resulting in a cohort with longitudinal data collected over 5 years. The outcome of interest was mean annual HbA1c measured over a 5-year period. Using this outcome, we compared parameter estimates from the proposed random effects meta-regression (REMR) with estimates based on simple random sampling and VISN (Veterans Integrated Service Networks) based stratified sampling of the full data. Our results indicate that REMR provides parameter estimates that are less likely to be biased, with tighter confidence intervals, when the VISN level estimates are homogenous. When the interest is to fit REM in repeated measures data with very large sample size, REMR can be used as a good alternative. It leads to reasonable inference for both Gaussian and non-Gaussian responses if parameter estimates are
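
    The core of a random effects meta-regression of this kind is fitting the model within strata and then pooling the stratum-level estimates. A minimal sketch of inverse-variance pooling, with simulated stratum estimates standing in for VISN-level fits (all numbers below are illustrative, not VHA data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate per-stratum estimates of a regression slope: in practice each
# stratum's model is fit separately, yielding an estimate and a standard
# error. The true slope and stratum count here are invented.
true_slope = 0.05
n_strata = 20
se = rng.uniform(0.01, 0.03, size=n_strata)   # stratum standard errors
est = true_slope + rng.normal(0.0, se)        # stratum slope estimates

# Fixed-effect meta-analytic pooling with inverse-variance weights:
# strata with smaller standard errors contribute more.
w = 1.0 / se**2
pooled = np.sum(w * est) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))

print(f"pooled slope = {pooled:.4f} (SE {pooled_se:.4f})")
```

The pooled standard error is always smaller than the smallest stratum standard error, which is why the abstract reports tighter confidence intervals when the stratum estimates are homogeneous; a random-effects pooling step would additionally widen the weights by a between-stratum variance term.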

  6. Directly observed therapy for chronic hepatitis C: a randomized clinical trial in the prison setting.

    PubMed

    Saiz de la Hoya, Pablo; Portilla, Joaquín; Marco, Andrés; García-Guerrero, Julio; Faraco, Inmaculada; Antón, José; de Juan, José; Pozo, Edelmira

    2014-10-01

    The diagnosis and treatment of chronic hepatitis C are major concerns in prisons. The aim of this randomized clinical trial was to determine the extent to which directly observed therapy (DOT) improved the efficacy of the standard treatment for chronic hepatitis C in the prison setting. A randomized clinical trial was carried out to evaluate the efficacy of a DOT compared with a self-administered therapy in prison inmates who underwent standard treatment for chronic hepatitis C (based on pegylated interferon alpha-2a and ribavirin). A total of 252 inmates were randomized, of which 244 were analyzed: 109 in the DOT group and 135 in the non-DOT group. The mean age was 35.88 years (SD 6.54), 94.3% were men, 72.1% reported intravenous drug use, 21.3% were HIV co-infected, and 55.3% had genotype 1 or 4. The patients received the study treatment for a median time of 33.9 weeks in the overall sample. Sustained virological response was achieved in 60.6% (95% CI, 51.17-69.22) of the DOT group and in 65.9% (95% CI, 57.59-73.38) of the standard therapy group (risk ratio=0.92; 95% CI, 0.76-1.12). The mean proportion of patients continuing the treatment was 83% (SD=31). Adverse events were reported in 93.4% of the patients, and serious adverse events were reported in 8.2%, with no significant differences between groups. Sustained virological response was remarkably high, although there were no differences between groups, probably due to high treatment adherence. Copyright © 2013 Elsevier España, S.L.U. and AEEH y AEG. All rights reserved.

  7. Timing of enteral feeding in cerebral malaria in resource-poor settings: a randomized trial.

    PubMed

    Maude, Richard J; Hoque, Gofranul; Hasan, Mahtab Uddin; Sayeed, Abu; Akter, Shahena; Samad, Rasheda; Alam, Badrul; Yunus, Emran Bin; Rahman, Ridwanur; Rahman, Waliur; Chowdhury, Romal; Seal, Tapan; Charunwatthana, Prakaykaew; Chang, Christina C; White, Nicholas J; Faiz, M Abul; Day, Nicholas P J; Dondorp, Arjen M; Hossain, Amir

    2011-01-01

    Early start of enteral feeding is an established treatment strategy in intubated patients in intensive care since it reduces invasive bacterial infections and length of hospital stay. There is equipoise whether early enteral feeding is also beneficial in non-intubated patients with cerebral malaria in resource poor settings. We hypothesized that the risk of aspiration pneumonia might outweigh the potential benefits of earlier recovery and prevention of hypoglycaemia. A randomized trial of early (day of admission) versus late (after 60 hours in adults or 36 hours in children) start of enteral feeding was undertaken in patients with cerebral malaria in Chittagong, Bangladesh from May 2008 to August 2009. The primary outcome measures were incidence of aspiration pneumonia, hypoglycaemia and coma recovery time. The trial was terminated after inclusion of 56 patients because of a high incidence of aspiration pneumonia in the early feeding group (9/27 (33%)), compared to the late feeding group (0/29 (0%)), p = 0.001). One patient in the late feeding group, and none in the early group, had hypoglycaemia during admission. There was no significant difference in overall mortality (9/27 (33%) vs 6/29 (21%), p = 0.370), but mortality was 5/9 (56%) in patients with aspiration pneumonia. In conclusion, early start of enteral feeding is detrimental in non-intubated patients with cerebral malaria in many resource-poor settings. Evidence gathered in resource rich settings is not necessarily transferable to resource-poor settings. Controlled-Trials.com ISRCTN57488577.

  8. Randomized controlled clinical trial on the three-dimensional accuracy of fast-set impression materials.

    PubMed

    Rudolph, Heike; Quaas, Sebastian; Haim, Manuela; Preißler, Jörg; Walter, Michael H; Koch, Rainer; Luthardt, Ralph G

    2013-06-01

    The use of fast-setting impression materials with different viscosities for the one-stage impression technique demands precise working times when mixing. We examined the effect of varying working time on impression precision in a randomized clinical trial. Focusing on tooth 46, three impressions were made from each of 96 volunteers, using either a polyether (PE: Impregum Penta H/L DuoSoft Quick, 3M ESPE) or an addition-curing silicone (AS: Aquasil Ultra LV, Dentsply/DeTrey), one with the manufacturer's recommended working time (used as a reference) and two with altered working times. All stages of the impression-taking were subject to randomization. The three-dimensional precision of the non-standard working time impressions was digitally analyzed compared to the reference impression. Statistical analysis was performed using multivariate models. The mean difference in the position of the lower right first molar (vs. the reference impression) ranged from ±12 μm for PE to +19 and -14 μm for AS. Significantly higher mean values (+62 to -40 μm) were found for AS compared to PE (+21 to -26 μm) in the area of the distal adjacent tooth. Fast-set impression materials offer high precision when used for single tooth restorations as part of a one-stage impression technique, even when the working time (mixing plus application of the light- and heavy-body components) diverges significantly from the manufacturer's recommended protocol. Best accuracy was achieved with machine-mixed heavy-body/light-body polyether. Both materials examined met the clinical requirements regarding precision when the teeth were completely syringed with light material.

  9. Geostatistical integration and uncertainty in pollutant concentration surface under preferential sampling.

    PubMed

    Grisotto, Laura; Consonni, Dario; Cecconi, Lorenzo; Catelan, Dolores; Lagazio, Corrado; Bertazzi, Pier Alberto; Baccini, Michela; Biggeri, Annibale

    2016-04-18

    In this paper the focus is on environmental statistics, with the aim of estimating the concentration surface and related uncertainty of an air pollutant. We used air quality data recorded by a network of monitoring stations within a Bayesian framework to overcome difficulties in accounting for prediction uncertainty and to integrate information provided by deterministic models based on emissions meteorology and chemico-physical characteristics of the atmosphere. Several authors have proposed such integration, but all the proposed approaches rely on representativeness and completeness of existing air pollution monitoring networks. We considered the situation in which the spatial process of interest and the sampling locations are not independent. This is known in the literature as the preferential sampling problem, which if ignored in the analysis, can bias geostatistical inferences. We developed a Bayesian geostatistical model to account for preferential sampling with the main interest in statistical integration and uncertainty. We used PM10 data arising from the air quality network of the Environmental Protection Agency of Lombardy Region (Italy) and numerical outputs from the deterministic model. We specified an inhomogeneous Poisson process for the sampling locations intensities and a shared spatial random component model for the dependence between the spatial location of monitors and the pollution surface. We found greater predicted standard deviation differences in areas not properly covered by the air quality network. In conclusion, in this context inferences on prediction uncertainty may be misleading when geostatistical modelling does not take into account preferential sampling.
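
    The bias that preferential sampling induces in naive inference can be illustrated with a toy example: if monitor locations are drawn with intensity proportional to the pollutant field itself, the plain sample mean over-estimates the field mean. The field and design below are invented for illustration and are unrelated to the PM10 data analysed in the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy 1-D pollutant field on [0, 1]; purely illustrative values.
x_grid = np.linspace(0.0, 1.0, 1001)
field = 50.0 + 20.0 * np.sin(2.0 * np.pi * x_grid)
true_mean = field.mean()

# Preferential design: monitor locations drawn with intensity
# proportional to the pollutant level itself (an inhomogeneous point
# process), so high-pollution areas are over-sampled.
p = field / field.sum()
monitors = rng.choice(x_grid.size, size=100, replace=False, p=p)
naive_mean = field[monitors].mean()

# Expected value of a single preferentially-drawn observation, computed
# analytically: it exceeds the field mean whenever the field varies.
design_expectation = np.sum(field * p)

print(f"true mean {true_mean:.2f}, naive sample mean {naive_mean:.2f}, "
      f"design expectation {design_expectation:.2f}")
```

A model that ignores this dependence between locations and surface inherits the upward bias, which is what the shared spatial random component in the paper's Bayesian model is meant to correct.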

  10. Lifestyle referral assessment in an acute cardiology setting: study protocol for a randomized controlled feasibility trial.

    PubMed

    Hill, Kate M; Walwyn, Rebecca E A; Camidge, Diana C; Meads, David M; Murray, Jenni Y; Reynolds, Greg; Farrin, Amanda J; House, Allan O

    2013-07-11

    Lifestyle and behaviour change are important factors in the prevention of cardiovascular disease and reduction of premature mortality. Public health initiatives have focused on opportunities for healthcare staff to deliver lifestyle advice routinely in primary and secondary care but there is no consistent approach to onward referrals and the rate of uptake of advice remains low. We do not know if advice is more effective in supporting behaviour change when a systematic approach is taken that includes identification of barriers to change, directing patients toward services, referral to services, and feedback on outcome. This is a single-centre, randomized, unblinded feasibility trial in an acute hospital setting which aims to assess the feasibility of a definitive trial and provide proof of concept for the systematic delivery of individualized lifestyle advice in patients managed through an acute cardiology in-patient service.Patients will be recruited before discharge and randomized to two groups. A control group will receive the usual lifestyle assessment and referral, while an intervention group will receive the usual assessment plus the new individualized lifestyle assessment and referral. The new assessment will inform assignment of each patient to one of three categories based on personal barriers to change. Patients may be referred to a formal lifestyle-change programme, through the 'Leeds Let's Change' website, or they may be guided in self-management, using goal setting, or they may be assigned to a 'deferment' category, for reassessment at follow-up. These latter patients will be given a contact card for the 'Leeds Let's Change' service. Lifestyle change is an important mechanism for improving health and wellbeing across the population but there are widely acknowledged difficulties in addressing lifestyle factors with patients and supporting behaviour change. 
A systematic approach to assessment would facilitate audit and provide an indicator of the quality

  11. Mine planning and emission control strategies using geostatistics

    SciTech Connect

    Martino, F.; Kim, Y.C.

    1983-03-01

    This paper reviews the past four years' research efforts performed jointly by the University of Arizona and the Homer City Owners in which geostatistics was applied to solve various problems associated with coal characterization, mine planning, and development of emission control strategies. Because geostatistics is the only technique that can quantify the degree of confidence associated with a given estimate (or prediction), it played an important role throughout the research efforts. Through geostatistics, it was learned that there is an urgent need for closely spaced sample information if short-term coal quality predictions are to be made for mine planning purposes.

  12. Can Geostatistical Models Represent Nature's Variability? An Analysis Using Flume Experiments

    NASA Astrophysics Data System (ADS)

    Scheidt, C.; Fernandes, A. M.; Paola, C.; Caers, J.

    2015-12-01

    The lack of understanding of the Earth's geological and physical processes governing sediment deposition renders subsurface modeling subject to large uncertainty. Geostatistics is often used to model uncertainty because of its capability to stochastically generate spatially varying realizations of the subsurface. These methods can generate a range of realizations of a given pattern - but how representative are these of the full natural variability? And how can we identify the minimum set of images that represent this natural variability? Here we use this minimum set to define the geostatistical prior model: a set of training images that represent the range of patterns generated by autogenic variability in the sedimentary environment under study. The proper definition of the prior model is essential in capturing the variability of the depositional patterns. This work starts with a set of overhead images from an experimental basin that showed ongoing autogenic variability. We use the images to analyze the essential characteristics of this suite of patterns. In particular, our goal is to define a prior model (a minimal set of selected training images) such that geostatistical algorithms, when applied to this set, can reproduce the full measured variability. A necessary prerequisite is to define a measure of variability. In this study, we measure variability using a dissimilarity distance between the images. The distance indicates whether two snapshots contain similar depositional patterns. To reproduce the variability in the images, we apply an MPS algorithm to the set of selected snapshots of the sedimentary basin that serve as training images. The training images are chosen from among the initial set by using the distance measure to ensure that only dissimilar images are chosen. Preliminary investigations show that MPS can reproduce fairly accurately the natural variability of the experimental depositional system. Furthermore, the selected training images provide
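
    The selection step described above can be sketched as a greedy filter over a dissimilarity distance: keep a snapshot only if it is far enough from everything already kept. The stand-in images, distance, and threshold below are placeholders, not the measure used in the study.

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in "snapshots": 20 random binary pattern images, 32x32 pixels.
images = rng.integers(0, 2, size=(20, 32, 32)).astype(float)

def dissimilarity(a, b):
    # Simple dissimilarity distance: fraction of pixels that differ.
    # (Any pattern-aware distance could be substituted here.)
    return np.mean(a != b)

# Greedy selection: a snapshot joins the training set only if it is
# sufficiently dissimilar from every snapshot already selected.
threshold = 0.45
selected = [0]
for i in range(1, len(images)):
    if all(dissimilarity(images[i], images[j]) >= threshold for j in selected):
        selected.append(i)

print("training images kept:", selected)
```

By construction, every pair of retained snapshots is at least `threshold` apart, so the retained set spans the observed variability without near-duplicates.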

  13. Geostatistical Estimations of Regional Hydraulic Conductivity Fields

    NASA Astrophysics Data System (ADS)

    Patriarche, D.; Castro, M. C.; Goovaerts, P.

    2004-12-01

    Direct and indirect measurements of hydraulic conductivity (K) are commonly performed, providing information on the magnitude of this parameter at the local scale (tens of centimeters to hundreds of meters) and at shallow depths. By contrast, field information on hydraulic conductivities at regional scales of tens to hundreds of kilometers and at greater depths is relatively scarce. Geostatistical methods allow for sparsely sampled observations of a variable (primary information) to be complemented by a more densely sampled secondary attribute. Geostatistical estimations of the hydraulic conductivity field in the Carrizo aquifer, a major groundwater flow system extending along Texas, are performed using available primary (e.g., transmissivity, hydraulic conductivity) and secondary (specific capacity) information, for depths up to 2.2 km, and over three regional domains of increasing extent: 1) the domain corresponding to a three-dimensional groundwater flow model previously built (model domain); 2) the area corresponding to the ten counties encompassing the model domain (County domain); and 3) the full extension of the Carrizo aquifer within Texas (Texas domain). Two different approaches are used: 1) an indirect approach where transmissivity (T) is estimated first and K is retrieved through division of the T estimate by the screening length of the wells; and 2) a direct approach where K data are kriged directly. Prediction performances of the tested geostatistical procedures (kriging combined with linear regression, kriging with known local means, kriging of residuals, and cokriging) are evaluated through cross validation for both log-transformed variables and back-transformed ones. For the indirect approach, kriging of log T residuals yields the best estimates for both log-transformed and back-transformed variables in the model domain. For larger regional scales (County and Texas domains), cokriging performs generally better than univariate kriging procedures
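
    The indirect approach described above amounts to a simple transformation once T has been estimated: divide each T estimate by the well's screened length. A minimal sketch with invented values (not Carrizo aquifer data):

```python
import numpy as np

# Indirect approach: estimate transmissivity T first (stand-in kriged
# values below), then recover hydraulic conductivity K by dividing T by
# each well's screened length. All numbers are illustrative.
T_est = np.array([120.0, 85.0, 240.0, 60.0])     # m^2/day, kriged T
screen_len = np.array([30.0, 17.0, 60.0, 12.0])  # m, well screen lengths

K_est = T_est / screen_len                       # m/day

# Equivalent computation in log10 space, as is typical for K and T,
# whose distributions are usually close to lognormal:
logK = np.log10(T_est) - np.log10(screen_len)

print(K_est)
```

Because kriging is usually performed on log-transformed T, the division becomes a subtraction in log space, which is why the abstract evaluates both log-transformed and back-transformed estimates.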

  14. Geostatistics, remote sensing and precision farming.

    PubMed

    Mulla, D J

    1997-01-01

    Precision farming is possible today because of advances in farming technology, procedures for mapping and interpolating spatial patterns, and geographic information systems for overlaying and interpreting several soil, landscape and crop attributes. The key component of precision farming is the map showing spatial patterns in field characteristics. Obtaining information for this map is often achieved by soil sampling. This approach, however, can be cost-prohibitive for grain crops. Soil sampling strategies can be simplified by use of auxiliary data provided by satellite or aerial photo imagery. This paper describes geostatistical methods for estimating spatial patterns in soil organic matter, soil test phosphorus and wheat grain yield from a combination of Thematic Mapper imaging and soil sampling.
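
    The starting point for the interpolation methods described is the empirical semivariogram, which summarizes how dissimilarity between soil samples grows with separation distance. A minimal sketch on a synthetic 1-D transect (invented data, not the paper's soil measurements):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic transect of a soil property (e.g., organic matter %) with
# spatial autocorrelation plus measurement noise.
x = np.arange(0.0, 200.0, 5.0)                 # sample locations, m
signal = 3.0 + 0.8 * np.sin(x / 30.0)
z = signal + rng.normal(0.0, 0.1, size=x.size)

def empirical_semivariogram(x, z, lags, tol):
    """gamma(h) = 0.5 * mean[(z_i - z_j)^2] over pairs with |x_i - x_j| ~ h."""
    gammas = []
    d = np.abs(x[:, None] - x[None, :])        # pairwise distances
    diffs = z[:, None] - z[None, :]            # pairwise differences
    for h in lags:
        mask = np.triu(np.abs(d - h) <= tol, k=1)  # each pair counted once
        gammas.append(0.5 * np.mean(diffs[mask] ** 2))
    return np.array(gammas)

lags = np.array([5.0, 20.0, 50.0])
gam = empirical_semivariogram(x, z, lags, tol=2.5)
print(gam)  # semivariance grows with lag for autocorrelated data
```

Fitting a model (spherical, exponential, etc.) to these points supplies the weights for kriging; auxiliary imagery enters the same machinery as a cheap, densely sampled covariate.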

  15. Multimodal Intervention to Improve Osteoporosis Care in Home Health Settings: Results from a Cluster Randomized Trial

    PubMed Central

    Kilgore, Meredith L.; Outman, Ryan; Locher, Julie L.; Allison, Jeroan J.; Mudano, Amy; Kitchin, Beth; Saag, Kenneth G.; Curtis, Jeffrey R.

    2014-01-01

    Purpose To test an evidence-implementation intervention to improve the quality of care in the home health care setting for patients at high risk for fractures. Methods We conducted a cluster randomized trial of a multimodal intervention targeted at home care for high-risk patients (prior fracture or physician-diagnosed osteoporosis) receiving care in a statewide home health agency in Alabama. Offices throughout the state were randomized to receive the intervention or to usual care. The primary outcome was the proportion of high-risk home health patients treated with osteoporosis medications. A t-test of difference in proportions was conducted between intervention and control arms and constituted the primary analysis. Secondary analyses included logistic regression estimating the effect of individual patients being treated in an intervention arm office on the likelihood of a patient receiving osteoporosis medications. A follow-on analysis examined the effect of an automated alert built into the electronic medical record that prompted the home health care nurses to deploy the intervention for high risk patients using a pre-post design. Results Among the offices in the intervention arm the average proportion of eligible patients receiving osteoporosis medications post-intervention was 19.1%, compared with 15.7% in the usual care arm (difference in proportions 3.4%, 95% CI: −2.6% to 9.5%). The overall rates of osteoporosis medication use increased from 14.8% prior to activation of the automated alert to 17.6% afterward, a non-significant difference. Conclusions The home health intervention did not result in a significant improvement in use of osteoporosis medications in high risk patients. PMID:23536256

  16. Fitting parametric random effects models in very large data sets with application to VHA national data

    PubMed Central

    2012-01-01

    Background With the current focus on personalized medicine, patient/subject level inference is often of key interest in translational research. As a result, random effects models (REM) are becoming popular for patient level inference. However, for very large data sets that are characterized by large sample size, it can be difficult to fit REM using commonly available statistical software such as SAS, since they require inordinate amounts of computer time and memory allocations beyond what are available, preventing model convergence. For example, in a retrospective cohort study of over 800,000 Veterans with type 2 diabetes with longitudinal data over 5 years, fitting REM via generalized linear mixed modeling using currently available standard procedures in SAS (e.g. PROC GLIMMIX) was very difficult, and the same problems exist in Stata’s gllamm or R’s lme packages. Thus, this study proposes and assesses the performance of a meta-regression approach and compares it with methods based on sampling of the full data. Data We use both simulated and real data from a national cohort of Veterans with type 2 diabetes (n=890,394), which was created by linking multiple patient and administrative files, resulting in a cohort with longitudinal data collected over 5 years. Methods and results The outcome of interest was mean annual HbA1c measured over a 5-year period. Using this outcome, we compared parameter estimates from the proposed random effects meta-regression (REMR) with estimates based on simple random sampling and VISN (Veterans Integrated Service Networks) based stratified sampling of the full data. Our results indicate that REMR provides parameter estimates that are less likely to be biased, with tighter confidence intervals, when the VISN level estimates are homogenous. Conclusion When the interest is to fit REM in repeated measures data with very large sample size, REMR can be used as a good alternative. It leads to reasonable inference for both Gaussian and non

  17. Validation and comparison of geostatistical and spline models for spatial stream networks.

    PubMed

    Rushworth, A M; Peterson, E E; Ver Hoef, J M; Bowman, A W

    2015-08-01

    Scientists need appropriate spatial-statistical models to account for the unique features of stream network data. Recent advances provide a growing methodological toolbox for modelling these data, but general-purpose statistical software has only recently emerged, with little information about when to use different approaches. We implemented a simulation study to evaluate and validate geostatistical models that use continuous distances, and penalised spline models that use a finite discrete approximation for stream networks. Data were simulated from the geostatistical model, with performance measured by empirical prediction and fixed effects estimation. We found that both models were comparable in terms of squared error, with a slight advantage for the geostatistical models. Generally, both methods were unbiased and had valid confidence intervals. The most marked differences were found for confidence intervals on fixed-effect parameter estimates, where, for small sample sizes, the spline models underestimated variance. However, the penalised spline models were always more computationally efficient, which may be important for real-time prediction and estimation. Thus, decisions about which method to use must be influenced by the size and format of the data set, in addition to the characteristics of the environmental process and the modelling goals. ©2015 The Authors. Environmetrics published by John Wiley & Sons, Ltd.

  18. Technology demonstration: geostatistical and hydrologic analysis of salt areas. Assessment of effectiveness of geologic isolation systems

    SciTech Connect

    Doctor, P.G.; Oberlander, P.L.; Rice, W.A.; Devary, J.L.; Nelson, R.W.; Tucker, P.E.

    1982-09-01

    The Office of Nuclear Waste Isolation (ONWI) requested Pacific Northwest Laboratory (PNL) to: (1) use geostatistical analyses to evaluate the adequacy of hydrologic data from three salt regions, each of which contains a potential nuclear waste repository site; and (2) demonstrate a methodology that allows quantification of the value of additional data collection. The three regions examined are the Paradox Basin in Utah, the Permian Basin in Texas, and the Mississippi Study Area. Additional and new data became available to ONWI during and following these analyses; therefore, this report must be considered a methodology demonstration; the methodology would apply as illustrated here had the complete data sets been available. A combination of geostatistical and hydrologic analyses was used for this demonstration. Geostatistical analyses provided an optimal estimate of the potentiometric surface from the available data, a measure of the uncertainty of that estimate, and a means for selecting and evaluating the location of future data. The hydrologic analyses included the calculation of transmissivities, flow paths, travel times, and ground-water flow rates from hypothetical repository sites. Simulation techniques were used to evaluate the effect of optimally located future data on the potentiometric surface, flow lines, travel times, and flow rates. Data availability, quality, quantity, and conformance with model assumptions differed in each of the salt areas. Report highlights for the three locations are given.

  19. Use of the Airtraq laryngoscope for emergency intubation in the prehospital setting: a randomized control trial.

    PubMed

    Trimmel, Helmut; Kreutziger, Janett; Fertsak, Georg; Fitzka, Robert; Dittrich, Markus; Voelckel, Wolfgang G

    2011-03-01

    The optical Airtraq laryngoscope (Prodol Meditec, Vizcaya, Spain) has been shown to have advantages when compared with direct laryngoscopy in difficult airway patients. Furthermore, it has been suggested that it is easy to use and handle even for inexperienced advanced life support providers. As such, we sought to assess whether the Airtraq may be a reliable alternative to conventional intubation when used in the prehospital setting. Prospective, randomized control trial in emergency patients requiring endotracheal intubation provided by anesthesiologists or emergency physicians responding with an emergency medical service helicopter or ground unit associated with the Department of Anesthesiology, General Hospital, Wiener Neustadt, Austria. During the 18-month study period, 212 patients were enrolled. When the Airtraq was used as first-line airway device (n=106) vs. direct laryngoscopy (n=106), success rate was 47% vs. 99%, respectively (p<.001). Reasons for failed Airtraq intubation were related to the fiber-optic characteristic of this device (i.e., impaired sight due to blood and vomitus, n=11) or to assumed handling problems (i.e., cuff damage, tube misplacement, or inappropriate visualization of the glottis, n=24). In 54 of 56 patients where Airtraq intubation failed, direct laryngoscopy was successful on the first attempt; in the remaining two and in one additional case of failed direct laryngoscopy, the airway was finally secured employing the Fastrach laryngeal mask. There was no correlation between success rates and body mass index, age, indication for airway management, emergency medical service unit, or experience of the physicians. Based on these results, the use of the Airtraq laryngoscope as a primary airway device cannot be recommended in the prehospital setting without significant clinical experience obtained in the operation room. We conclude that the clinical learning process of the Airtraq laryngoscope is much longer than reported in the

  20. Novel Geostatistical Characterization of the Borden Aquifer, Canada

    NASA Astrophysics Data System (ADS)

    Maghrebi, M.; Jankovic, I.; Allen-King, R. M.; Rabideau, A. J.; Weissmann, G. S.

    2012-12-01

    Geostatistical (spatial) characterization of aquifer properties (hydraulic conductivity and sorption distribution coefficient) is the basis for development of contaminant transport models including stochastic models. Large datasets are required for complete spatial analysis of aquifer properties. The process of collecting the required field data is labor intensive and expensive. Because of limited data availability, spatial analysis is traditionally limited to 2-point overall covariance/variogram analysis. Additional assumptions, not supported by field data, are adopted to develop contaminant transport models. For example, hydraulic conductivity is often modeled as a MultiGaussian random field. A very large dataset of aquifer properties was collected and analyzed for the Borden aquifer, Canada. This is accomplished by exposing 15 panels in a sand quarry during our summer 2010 field study. The sedimentary facies from these 15 panels were then mapped using high-resolution photography, terrestrial Lidar images with field descriptions (Pickel, A.C. et al, Outcrop Analog Analysis of Lithofacies Distributions within Borden Aquifer Sediments, Ontario, CA, AGU 2011 Fall meeting, Poster ID: H51H-1297). The mapped images enable us to determine the three-dimensional positions of hydrofacies at the site and to classify these hydrofacies. Hydrofacies were classified into 6 general material groups with distinct magnitude of hydraulic conductivity and sorption distribution coefficient. The resulting dataset contains the information of 3D point coordinates and the group indices for approximately 3 million points. In order to perform the spatial analysis of this dataset, computer code GSLIB was parallelized with distributed and shared-memory directives and used to compute indicator semi-variograms of six material groups. While this does not constitute a full statistical analysis of the formation (multi-point and/or higher order correlations would be necessary for such analysis

  1. Incorporating reservoir heterogeneity with geostatistics to investigate waterflood recoveries

    SciTech Connect

Wolcott, D.S.; Chopra, A.K.

    1993-03-01

    This paper presents an investigation of infill drilling performance and reservoir continuity with geostatistics and a reservoir simulator. The geostatistical technique provides many possible realizations and realistic descriptions of reservoir heterogeneity. Correlation between recovery efficiency and thickness of individual sand subunits is shown. Additional recovery from infill drilling results from thin, discontinuous subunits. The technique may be applied to variations in continuity for other sandstone reservoirs.

  2. Cognitive remediation for individuals with psychosis in a supported education setting: a randomized controlled trial.

    PubMed

    Kidd, Sean A; Kaur, Jaswant; Virdee, Gursharan; George, Tony P; McKenzie, Kwame; Herman, Yarissa

    2014-08-01

Cognitive remediation (CR) has demonstrated good outcomes when paired with supported employment; however, little is known about its effectiveness when integrated into a supported education program. This randomized controlled trial examined the effectiveness of integrating CR within a supported education program compared with supported education without CR. Thirty-seven students with psychosis were recruited into the study in the 2012 academic year. Academic functioning, cognition, self-esteem, and symptomatology were assessed at baseline, at 4 months following the first academic term in which CR was provided, and at 8 months assessing maintenance of gains. The treatment group demonstrated better retention in the academic program and a trend of improvement across a range of academic functional domains. While both treatment and control groups showed improvement in cognitive measures, the outcomes were not augmented by CR training. CR was also associated with significant and sustained improvements in self-esteem. Further research investigating specific intervention components is required to clarify the mixed findings regarding the effectiveness of CR in an education setting. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. A randomized trial of group parent training: reducing child conduct problems in real-world settings.

    PubMed

    Kjøbli, John; Hukkelberg, Silje; Ogden, Terje

    2013-03-01

Group-based Parent Management Training, the Oregon model (PMTO, 12 sessions), was delivered by the regular staff of municipal child and family services. PMTO is based on social interaction learning theory and promotes positive parenting skills in parents of children with conduct problems. This study examined the effectiveness of the group-based training intervention in real-world settings both immediately following and six months after termination of the intervention. One hundred thirty-seven children (3-12 years) and their parents participated in this study. The families were randomly assigned to group-based training or a comparison group. Data were collected from parents and teachers. The caregiver assessments of parenting practices and child conduct problems and caregiver- and teacher-reported social competence revealed immediate and significant intervention effects. Short- and long-term beneficial effects were reported from parents, although no follow-up effects were evident on teacher reports. These effectiveness findings and the potential for increasing the number of families served support the further dissemination and implementation of group-based parent training. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. A randomized controlled trial of aromatherapy massage in a hospice setting.

    PubMed

    Soden, Katie; Vincent, Karen; Craske, Stephen; Lucas, Caroline; Ashley, Sue

    2004-03-01

    Research suggests that patients with cancer, particularly in the palliative care setting, are increasingly using aromatherapy and massage. There is good evidence that these therapies may be helpful for anxiety reduction for short periods, but few studies have looked at the longer term effects. This study was designed to compare the effects of four-week courses of aromatherapy massage and massage alone on physical and psychological symptoms in patients with advanced cancer. Forty-two patients were randomly allocated to receive weekly massages with lavender essential oil and an inert carrier oil (aromatherapy group), an inert carrier oil only (massage group) or no intervention. Outcome measures included a Visual Analogue Scale (VAS) of pain intensity, the Verran and Snyder-Halpern (VSH) sleep scale, the Hospital Anxiety and Depression (HAD) scale and the Rotterdam Symptom Checklist (RSCL). We were unable to demonstrate any significant long-term benefits of aromatherapy or massage in terms of improving pain control, anxiety or quality of life. However, sleep scores improved significantly in both the massage and the combined massage (aromatherapy and massage) groups. There were also statistically significant reductions in depression scores in the massage group. In this study of patients with advanced cancer, the addition of lavender essential oil did not appear to increase the beneficial effects of massage. Our results do suggest, however, that patients with high levels of psychological distress respond best to these therapies.

  5. The Effect of Distributed Practice in Undergraduate Statistics Homework Sets: A Randomized Trial

    ERIC Educational Resources Information Center

    Crissinger, Bryan R.

    2015-01-01

    Most homework sets in statistics courses are constructed so that students concentrate or "mass" their practice on a certain topic in one problem set. Distributed practice homework sets include review problems in each set so that practice on a topic is distributed across problem sets. There is a body of research that points to the…

  7. Geostatistical upscaling of rain gauge data to support uncertainty analysis of lumped urban hydrological models

    NASA Astrophysics Data System (ADS)

    Muthusamy, Manoranjan; Schellart, Alma; Tait, Simon; Heuvelink, Gerard B. M.

    2017-04-01

Geostatistical methods have been used to analyse the spatial correlation structure of rainfall at various spatial scales, but their application to estimate the level of uncertainty in rainfall upscaling has not been fully explored, mainly due to its inherent complexity and demanding data requirements. In this study we present a method to overcome these challenges and predict areal average rainfall intensity (AARI) together with associated uncertainty using geostatistical upscaling. Rainfall data collected from a cluster of eight paired rain gauges in a 400 m × 200 m urban catchment are used in combination with spatial stochastic simulation to obtain optimal predictions of the spatially averaged rainfall intensity at any point in time within the urban catchment. The uncertainty in the prediction of catchment average rainfall intensity is obtained for multiple combinations of intensity ranges and temporal averaging intervals. The two main challenges addressed in this study are scarcity of rainfall measurement locations and non-normality of rainfall data, both of which need to be considered when adopting a geostatistical approach. Scarcity of measurement points is dealt with by pooling sample variograms of repeated rainfall measurements with similar characteristics. Normality of rainfall data is achieved through the use of Normal Score Transformation. Geostatistical models in the form of variograms are derived for transformed rainfall intensity. Next, spatial stochastic simulation, which is robust to nonlinear data transformation, is applied to produce realisations of rainfall fields. These realisations in transformed space are first back-transformed and next spatially aggregated to derive a random sample of the spatially averaged rainfall intensity. This study shows that for small time and space scales the use of a single geostatistical model based on a single variogram is not appropriate and a distinction between rainfall intensity classes and length of temporal averaging intervals should be made.
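The Normal Score Transformation mentioned above maps skewed rainfall intensities to a standard normal distribution via their empirical quantiles; simulated normal-score fields are later returned to the data scale by the inverse quantile map. A minimal rank-based sketch (one common implementation, not necessarily the authors' exact procedure; the gamma-distributed sample is synthetic):

```python
import numpy as np
from scipy.stats import norm

def normal_score(x):
    """Map data to standard-normal scores via their empirical ranks."""
    n = len(x)
    ranks = np.argsort(np.argsort(x))      # rank 0..n-1 of each value
    p = (ranks + 0.5) / n                  # plotting positions in (0, 1)
    return norm.ppf(p)

def back_transform(y, x):
    """Map normal scores back to the original data scale."""
    xs = np.sort(x)
    ys = np.sort(normal_score(x))
    return np.interp(y, ys, xs)            # piecewise-linear quantile map

rain = np.random.default_rng(1).gamma(0.8, 2.0, size=200)  # skewed intensities
scores = normal_score(rain)
recovered = back_transform(scores, rain)
```

In practice the back-transform is applied to simulated realisations rather than to the original scores; applying it to the scores themselves, as here, simply recovers the data and serves as a sanity check.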

8. Comparative Effectiveness of Goal Setting in Diabetes Mellitus Group Clinics: Randomized Clinical Trial

    PubMed Central

    Naik, Aanand D.; Palmer, Nynikka; Petersen, Nancy J.; Street, Richard L.; Rao, Radha; Suarez-Almazor, Maria; Haidet, Paul

    2011-01-01

Background Diabetes group clinics can effectively control hypertension, but data to support glycemic control are equivocal. This study evaluated the comparative effectiveness of two diabetes group clinic interventions on glycosylated hemoglobin (HbA1c) levels in primary care. Methods Participants (n = 87) were recruited from a diabetes registry of a single regional VA medical center to participate in an open, randomized comparative effectiveness study. Two primary care based diabetes group interventions of three months duration were compared. Empowering Patients in Care (EPIC) was a clinician-led, patient-centered group clinic consisting of four sessions on setting self-management action plans (diet, exercise, home monitoring, medications, etc.) and communicating about progress with action plans. The comparison intervention consisted of group education sessions with a diabetes educator and dietician followed by an additional visit with one's primary care provider. HbA1c levels were compared post-intervention and at one-year follow-up. Results Participants in the EPIC intervention had significantly greater improvements in HbA1c levels immediately following the active intervention (8.86 to 8.04 vs. 8.74 to 8.70, mean [SD] between-group difference 0.67±1.3, P=.03) and these differences persisted at 1-year follow-up (.59±1.4, P=.05). A repeated measures analysis using all study time points found a significant time-by-treatment interaction effect on HbA1c levels favoring the EPIC intervention (F(2,85)=3.55, P=.03). The effect of the time-by-treatment interaction appears to be partially mediated by diabetes self-efficacy (F(1,85)=10.39, P=.002). Conclusions Primary care based diabetes group clinics that include structured goal-setting approaches to self-management can significantly improve HbA1c levels post-intervention and maintain improvements for 1 year. Trial registration ClinicalTrials.gov Identifier: NCT00481286 PMID:21403042

  9. The role of geostatistics in medical geology

    NASA Astrophysics Data System (ADS)

    Goovaerts, Pierre

    2014-05-01

Since its development in the mining industry, geostatistics has emerged as the primary tool for spatial data analysis in various fields, ranging from earth and atmospheric sciences, to agriculture, soil science, remote sensing, and more recently environmental exposure assessment. In the last few years, these tools have been tailored to the field of medical geography or spatial epidemiology, which is concerned with the study of spatial patterns of disease incidence and mortality and the identification of potential 'causes' of disease, such as environmental exposure, diet and unhealthy behaviors, economic or socio-demographic factors. On the other hand, medical geology is an emerging interdisciplinary scientific field studying the relationship between natural geological factors and their effects on human and animal health. This paper provides an introduction to the field of medical geology with an overview of geostatistical methods available for the analysis of geological and health data. Key concepts are illustrated using the mapping of groundwater arsenic concentrations across eleven Michigan counties and the exploration of its relationship to the incidence of prostate cancer at the township level. Arsenic in drinking-water is a major problem and has received much attention because of the large human population exposed and the extremely high concentrations (e.g. 600 to 700 μg/L) recorded in many instances. Few studies have however assessed the risks associated with exposure to low levels of arsenic (say < 50 μg/L) most commonly found in drinking water in the United States. In the Michigan thumb region, arsenopyrite (up to 7% As by weight) has been identified in the bedrock of the Marshall Sandstone aquifer, one of the region's most productive aquifers. Epidemiologic studies have suggested a possible association between exposure to inorganic arsenic and prostate cancer mortality, including a study of populations residing in Utah. The information available for the

  10. Unsupervised classification of multivariate geostatistical data: Two algorithms

    NASA Astrophysics Data System (ADS)

    Romary, Thomas; Ors, Fabien; Rivoirard, Jacques; Deraisme, Jacques

    2015-12-01

With the increasing development of remote sensing platforms and the evolution of sampling facilities in mining and oil industry, spatial datasets are becoming increasingly large, describing a growing number of variables and covering wider and wider areas. Therefore, it is often necessary to split the domain of study to account for radically different behaviors of the natural phenomenon over the domain and to simplify the subsequent modeling step. The definition of these areas can be seen as a problem of unsupervised classification, or clustering, where we try to divide the domain into homogeneous domains with respect to the values taken by the variables at hand. The application of classical clustering methods, designed for independent observations, does not ensure the spatial coherence of the resulting classes. Image segmentation methods, based on e.g. Markov random fields, are not adapted to irregularly sampled data. Other existing approaches, based on mixtures of Gaussian random functions estimated via the expectation-maximization algorithm, are limited to reasonable sample sizes and a small number of variables. In this work, we propose two algorithms based on adaptations of classical algorithms to multivariate geostatistical data. Both algorithms are model free and can handle large volumes of multivariate, irregularly spaced data. The first one proceeds by agglomerative hierarchical clustering. The spatial coherence is ensured by a proximity condition imposed for two clusters to merge. This proximity condition relies on a graph organizing the data in the coordinates space. The hierarchical algorithm can then be seen as a graph-partitioning algorithm. Following this interpretation, a spatial version of the spectral clustering algorithm is also proposed. The performances of both algorithms are assessed on toy examples and a mining dataset.
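The first algorithm's idea (agglomerative clustering in which two clusters may merge only if they are adjacent in a graph built on the coordinates) can be sketched with scikit-learn's connectivity-constrained agglomerative clustering. This is a simplified stand-in assuming a k-nearest-neighbour graph and Ward linkage; the authors' exact proximity condition and their spectral variant are not reproduced, and the two-zone dataset is synthetic.

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.neighbors import kneighbors_graph

rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, size=(300, 2))                           # sample coordinates
vals = np.where(xy[:, 0] < 50, 0.0, 5.0) + rng.normal(0, 1, 300)  # two spatial zones

# Graph on the coordinates: clusters may only merge if spatially adjacent.
graph = kneighbors_graph(xy, n_neighbors=8, include_self=False)

# Cluster on the variable values, constrained by the spatial graph.
model = AgglomerativeClustering(n_clusters=2, connectivity=graph, linkage="ward")
labels = model.fit_predict(vals.reshape(-1, 1))
```

Without the `connectivity` argument the same call would cluster on values alone and could scatter each class across the map; the graph is what enforces spatially coherent domains.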

  11. Implementing Random Assignment: A Computer-Based Approach in a Field Experimental Setting.

    ERIC Educational Resources Information Center

    Dobson, Douglas; Cook, Thomas J.

    1979-01-01

    A major problem in social science research is that of successfully carrying out the random assignment of persons to experimental and control groups. In this study a computer-based random assignment procedure operated successfully on a weekly basis for 17 consecutive weeks in a program serving over 360 ex-offenders. (CTM)

  12. Optimisation of groundwater level monitoring networks using geostatistical modelling based on the Spartan family variogram and a genetic algorithm method

    NASA Astrophysics Data System (ADS)

    Parasyris, Antonios E.; Spanoudaki, Katerina; Kampanis, Nikolaos A.

    2016-04-01

    Groundwater level monitoring networks provide essential information for water resources management, especially in areas with significant groundwater exploitation for agricultural and domestic use. Given the high maintenance costs of these networks, development of tools, which can be used by regulators for efficient network design is essential. In this work, a monitoring network optimisation tool is presented. The network optimisation tool couples geostatistical modelling based on the Spartan family variogram with a genetic algorithm method and is applied to Mires basin in Crete, Greece, an area of high socioeconomic and agricultural interest, which suffers from groundwater overexploitation leading to a dramatic decrease of groundwater levels. The purpose of the optimisation tool is to determine which wells to exclude from the monitoring network because they add little or no beneficial information to groundwater level mapping of the area. Unlike previous relevant investigations, the network optimisation tool presented here uses Ordinary Kriging with the recently-established non-differentiable Spartan variogram for groundwater level mapping, which, based on a previous geostatistical study in the area leads to optimal groundwater level mapping. Seventy boreholes operate in the area for groundwater abstraction and water level monitoring. The Spartan variogram gives overall the most accurate groundwater level estimates followed closely by the power-law model. The geostatistical model is coupled to an integer genetic algorithm method programmed in MATLAB 2015a. The algorithm is used to find the set of wells whose removal leads to the minimum error between the original water level mapping using all the available wells in the network and the groundwater level mapping using the reduced well network (error is defined as the 2-norm of the difference between the original mapping matrix with 70 wells and the mapping matrix of the reduced well network). The solution to the
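The optimisation loop described above can be sketched as a small genetic algorithm over binary keep/drop masks, with fitness defined as the negative 2-norm between the full-network map and the reduced-network map. To keep the sketch short, inverse-distance weighting stands in for Ordinary Kriging with the Spartan variogram, and all sizes (30 wells, keep 20, population 40) are illustrative rather than taken from the study.

```python
import numpy as np

rng = np.random.default_rng(2)
wells = rng.uniform(0, 10, size=(30, 2))               # monitoring well locations
heads = 100 - 2 * wells[:, 0] + rng.normal(0, 0.5, 30) # synthetic water levels
gx, gy = np.meshgrid(np.linspace(0, 10, 25), np.linspace(0, 10, 25))
grid = np.column_stack([gx.ravel(), gy.ravel()])

def idw_map(mask):
    """Inverse-distance map from the wells kept by `mask` (kriging stand-in)."""
    d = np.linalg.norm(grid[:, None] - wells[None, mask], axis=-1) + 1e-9
    w = 1.0 / d**2
    return (w @ heads[mask]) / w.sum(axis=1)

full = idw_map(np.ones(30, bool))                      # mapping with all wells

def fitness(mask):
    return -np.linalg.norm(full - idw_map(mask))       # minimise 2-norm error

def random_mask():                                     # drop 10 of 30 wells
    m = np.ones(30, bool)
    m[rng.choice(30, 10, replace=False)] = False
    return m

pop = [random_mask() for _ in range(40)]
for _ in range(30):
    pop.sort(key=fitness, reverse=True)                # elitist selection
    children = []
    for _ in range(20):
        a, b = pop[rng.integers(10)], pop[rng.integers(10)]
        c = np.where(rng.random(30) < 0.5, a, b)       # uniform crossover
        kept = np.flatnonzero(c)
        if len(kept) > 20:                             # repair to exactly 20 wells
            c[rng.choice(kept, len(kept) - 20, replace=False)] = False
        elif len(kept) < 20:
            off = np.flatnonzero(~c)
            c[rng.choice(off, 20 - len(kept), replace=False)] = True
        children.append(c)
    pop = pop[:20] + children
best = max(pop, key=fitness)
```

`best` is the 20-well subset whose reduced-network map best reproduces the full-network map; an integer-coded GA, as used in the study, encodes the same search over well subsets.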

  13. Estimation of extreme daily precipitation: comparison between regional and geostatistical approaches.

    NASA Astrophysics Data System (ADS)

    Hellies, Matteo; Deidda, Roberto; Langousis, Andreas

    2016-04-01

    We study the extreme rainfall regime of the Island of Sardinia in Italy, based on annual maxima of daily precipitation. The statistical analysis is conducted using 229 daily rainfall records with at least 50 complete years of observations, collected at different sites by the Hydrological Survey of the Sardinia Region. Preliminary analysis, and the L-skewness and L-kurtosis diagrams, show that the Generalized Extreme Value (GEV) distribution model performs best in describing daily rainfall extremes. The GEV distribution parameters are estimated using the method of Probability Weighted Moments (PWM). To obtain extreme rainfall estimates at ungauged sites, while minimizing uncertainties due to sampling variability, a regional and a geostatistical approach are compared. The regional approach merges information from different gauged sites, within homogeneous regions, to obtain GEV parameter estimates at ungauged locations. The geostatistical approach infers the parameters of the GEV distribution model at locations where measurements are available, and then spatially interpolates them over the study region. In both approaches we use local rainfall means as index-rainfall. In the regional approach we define homogeneous regions by applying a hierarchical cluster analysis based on Ward's method, with L-moment ratios (i.e. L-CV and L-Skewness) as metrics. The analysis results in four contiguous regions, which satisfy the Hosking and Wallis (1997) homogeneity tests. The latter have been conducted using a Monte-Carlo approach based on a 4-parameter Kappa distribution model, fitted to each station cluster. Note that the 4-parameter Kappa model includes the GEV distribution as a sub-case, when the fourth parameter h is set to 0. 
In the geostatistical approach we apply kriging for uncertain data (KUD), which accounts for the error variance in local parameter estimation and, therefore, may serve as a useful tool for spatial interpolation of metrics affected by high uncertainty. In
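For reference, GEV estimation from Probability Weighted Moments is typically carried out via sample L-moments and Hosking's rational approximation for the shape parameter. A sketch of that computation (the shape sign convention follows Hosking, where k > 0 means a bounded upper tail; the Gumbel-distributed sample of annual maxima is synthetic):

```python
import numpy as np
from scipy.special import gamma as G

def lmoments(x):
    """First three sample L-moments via probability-weighted moments."""
    x = np.sort(x)
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    return b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0     # lam1, lam2, lam3

def gev_lmom(x):
    """GEV (location, scale, shape) by Hosking's L-moment approximation."""
    l1, l2, l3 = lmoments(x)
    t3 = l3 / l2                                     # L-skewness
    c = 2.0 / (3.0 + t3) - np.log(2) / np.log(3)
    k = 7.8590 * c + 2.9554 * c**2                   # rational fit for shape
    alpha = l2 * k / ((1 - 2.0**-k) * G(1 + k))
    xi = l1 - alpha * (1 - G(1 + k)) / k
    return xi, alpha, k

annual_max = np.random.default_rng(3).gumbel(40, 12, size=60)  # synthetic maxima
xi, alpha, k = gev_lmom(annual_max)
```

For Gumbel-distributed data the fitted shape should be close to zero, with the GEV reducing to the Gumbel sub-case, which is the same nesting the abstract notes for the 4-parameter Kappa model at h = 0.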

  14. High Performance Geostatistical Modeling of Biospheric Resources

    NASA Astrophysics Data System (ADS)

    Pedelty, J. A.; Morisette, J. T.; Smith, J. A.; Schnase, J. L.; Crosier, C. S.; Stohlgren, T. J.

    2004-12-01

We are using parallel geostatistical codes to study spatial relationships among biospheric resources in several study areas. For example, spatial statistical models based on large- and small-scale variability have been used to predict species richness of both native and exotic plants (hot spots of diversity) and patterns of exotic plant invasion. However, broader use of geostatistics in natural resource modeling, especially at regional and national scales, has been limited due to the large computing requirements of these applications. To address this problem, we implemented parallel versions of the kriging spatial interpolation algorithm. The first uses the Message Passing Interface (MPI) in a master/slave paradigm on an open source Linux Beowulf cluster, while the second is implemented with the new proprietary Xgrid distributed processing system on an Xserve G5 cluster from Apple Computer, Inc. These techniques are proving effective and provide the basis for a national decision support capability for invasive species management that is being jointly developed by NASA and the US Geological Survey.
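The master/slave decomposition is straightforward to sketch: the master partitions the prediction grid into blocks and workers interpolate each block independently, since predictions at different grid points do not depend on one another. The sketch below uses a Python thread pool and an inverse-distance kernel as stand-ins for the paper's MPI/Xgrid implementations and kriging code; all data are synthetic.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

rng = np.random.default_rng(4)
obs_xy = rng.uniform(0, 1, size=(200, 2))                   # sampled plot locations
obs_z = np.sin(3 * obs_xy[:, 0]) + rng.normal(0, 0.1, 200)  # e.g. species richness

def idw_block(block):
    """Interpolate one block of grid points (stand-in for a kriging kernel)."""
    d = np.linalg.norm(block[:, None] - obs_xy[None], axis=-1) + 1e-9
    w = 1.0 / d**2
    return (w @ obs_z) / w.sum(axis=1)

gx, gy = np.meshgrid(np.linspace(0, 1, 100), np.linspace(0, 1, 100))
grid = np.column_stack([gx.ravel(), gy.ravel()])
blocks = np.array_split(grid, 8)                 # master: partition the grid

with ThreadPoolExecutor(max_workers=4) as pool:  # workers: one block each
    surface = np.concatenate(list(pool.map(idw_block, blocks)))
```

Because the blocks are embarrassingly parallel, the same pattern maps directly onto MPI ranks or Xgrid tasks; only the dispatch mechanism changes.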

  15. A geostatistical approach to contaminant source identification

    NASA Astrophysics Data System (ADS)

    Snodgrass, Mark F.; Kitanidis, Peter K.

    1997-04-01

    A geostatistical approach to contaminant source estimation is presented. The problem is to estimate the release history of a conservative solute given point concentration measurements at some time after the release. A Bayesian framework is followed to derive the best estimate and to quantify the estimation error. The relation between this approach and common regularization and interpolation schemes is discussed. The performance of the method is demonstrated for transport in a simple one-dimensional homogeneous medium, although the approach is directly applicable to transport in two- or three-dimensional domains. The methodology produces a best estimate of the release history and a confidence interval. Conditional realizations of the release history are generated that are useful in visualization and risk assessment. The performance of the method with sparse data and large measurement error is examined. Emphasis is placed on formulating the estimation method in a computationally efficient manner. The method does not require the inversion of matrices whose size depends on the grid size used to resolve the solute release history. The issue of model validation is addressed.
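For a linear forward model y = Hs + noise, the geostatistical best estimate of the release history s and its posterior covariance have standard closed forms: s_hat = QHᵀ(HQHᵀ + R)⁻¹y and V = Q − QHᵀ(HQHᵀ + R)⁻¹HQ, with Q the prior covariance of s and R the measurement-error covariance. The sketch below illustrates this with an assumed 1-D advection-dispersion kernel and illustrative parameters; note that the paper's formulation is more computationally efficient than this naive direct inversion, which is only reasonable for small grids.

```python
import numpy as np

# Forward model: concentration at location x and time T from a unit release
# at time t (1-D advection-dispersion kernel; v, D, T are assumed values).
v, D, T = 1.0, 0.1, 100.0
t_rel = np.linspace(0, 95, 80)                 # release-history grid
x_obs = np.linspace(20, 120, 25)               # measurement locations at time T
tau = T - t_rel                                # travel times
H = np.exp(-(x_obs[:, None] - v * tau) ** 2 / (4 * D * tau)) \
    / np.sqrt(4 * np.pi * D * tau)

# Geostatistical prior on the release history: smooth Gaussian covariance.
Q = np.exp(-((t_rel[:, None] - t_rel[None, :]) / 10.0) ** 2)
R = 1e-4 * np.eye(len(x_obs))                  # measurement-error covariance

true_s = np.exp(-0.5 * ((t_rel - 30) / 5) ** 2)  # hypothetical release pulse
y = H @ true_s + np.random.default_rng(5).normal(0, 1e-2, len(x_obs))

# Best (posterior mean) estimate and posterior covariance.
K = Q @ H.T @ np.linalg.inv(H @ Q @ H.T + R)
s_hat = K @ y
V = Q - K @ H @ Q
```

The diagonal of `V` gives the pointwise confidence interval on the recovered history, and conditional realizations can be drawn from N(s_hat, V) for visualization and risk assessment, as the abstract describes.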

  16. Preventing Depression among Early Adolescents in the Primary Care Setting: A Randomized Controlled Study of the Penn Resiliency Program

    ERIC Educational Resources Information Center

    Gillham, Jane E.; Hamilton, John; Freres, Derek R.; Patton, Ken; Gallop, Robert

    2006-01-01

    This study evaluated the Penn Resiliency Program's effectiveness in preventing depression when delivered by therapists in a primary care setting. Two-hundred and seventy-one 11- and 12-year-olds, with elevated depressive symptoms, were randomized to PRP or usual care. Over the 2-year follow-up, PRP improved explanatory style for positive events.…

  18. Implementing Randomized Controlled Trial Studies in Afterschool Settings: The State of the Field. Afterschool Research Brief. Issue No. 1

    ERIC Educational Resources Information Center

    Vaden-Kiernan, Michael; Jones, Debra Hughes; Rudo, Zena

    2008-01-01

    SEDL is providing analytic and technical support to three large-scale randomized controlled trials assessing the efficacy of promising literacy curriculum in afterschool settings on student academic achievement. In the field of educational research, competition among research organizations and researchers can often impede collaborative efforts in…

  19. Effects of excluding a set of random effects on prediction error variance of breeding value.

    PubMed

    Cantet, R J

    1997-01-12

The effects of excluding a set of random effects (U-effects) uncorrelated to breeding values (BV) on prediction error variance (PEV) is studied analytically. Two situations are considered for model comparison: (a) existence of a 'true' model, (b) uncertainty about which of the competing models is 'true'. Models compared are the 'long' one, which includes BV + U-effects, and the 'short' one which includes BV's as the only random systematic effect. Expressions for PEV(BV) were obtained for the long model (PEVL); the short model (PEVS); and the short model assuming the long model is the correct one (PEVSI). It is shown that in general PEVS ≤ PEVL ≤ PEVSI. Results are exemplified by means of an example including a computer simulation.

  20. A fluence-convolution method to calculate radiation therapy dose distributions that incorporate random set-up error

    NASA Astrophysics Data System (ADS)

    Beckham, W. A.; Keall, P. J.; Siebers, J. V.

    2002-10-01

    The International Commission on Radiation Units and Measurements Report 62 (ICRU 1999) introduced the concept of expanding the clinical target volume (CTV) to form the planning target volume by a two-step process. The first step is adding a clinically definable internal margin, which produces an internal target volume that accounts for the size, shape and position of the CTV in relation to anatomical reference points. The second is the use of a set-up margin (SM) that incorporates the uncertainties of patient beam positioning, i.e. systematic and random set-up errors. We propose to replace the random set-up error component of the SM by explicitly incorporating the random set-up error into the dose-calculation model by convolving the incident photon beam fluence with a Gaussian set-up error kernel. This fluence-convolution method was implemented into a Monte Carlo (MC) based treatment-planning system. Also implemented for comparison purposes was a dose-matrix-convolution algorithm similar to that described by Leong (1987 Phys. Med. Biol. 32 327-34). Fluence and dose-matrix-convolution agree in homogeneous media. However, for the heterogeneous phantom calculations, discrepancies of up to 5% in the dose profiles were observed with a 0.4 cm set-up error value. Fluence-convolution mimics reality more closely, as dose perturbations at interfaces are correctly predicted (Wang et al 1999 Med. Phys. 26 2626-34, Sauer 1995 Med. Phys. 22 1685-90). Fluence-convolution effectively decouples the treatment beams from the patient, and more closely resembles the reality of particle fluence distributions for many individual beam-patient set-ups. However, dose-matrix-convolution reduces the random statistical noise in MC calculations. Fluence-convolution can easily be applied to convolution/superposition based dose-calculation algorithms.
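The fluence-convolution idea itself is simple: before dose calculation, convolve the incident fluence with a Gaussian whose width equals the random set-up error. A 1-D sketch with the 0.4 cm set-up error value quoted above, applied to a synthetic 10 cm open-field profile (the grid spacing and field size are illustrative):

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

# 1-D incident fluence profile: 10 cm open field on a 0.05 cm grid.
dx = 0.05
x = np.arange(-10, 10, dx)
fluence = ((x > -5) & (x < 5)).astype(float)   # step profile, field edges at ±5 cm

# Random set-up error: Gaussian kernel with sigma = 0.4 cm applied to the
# fluence *before* dose calculation (sigma converted to grid samples).
sigma_cm = 0.4
blurred = gaussian_filter1d(fluence, sigma=sigma_cm / dx)
```

The convolution preserves total fluence and broadens the penumbra, which is the dosimetric signature of random set-up error; the paper applies the same operation to each beam's fluence inside a Monte Carlo planning system, so that interface dose perturbations are still predicted correctly.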

  1. Screening for intimate partner violence in health care settings: a randomized trial.

    PubMed

    MacMillan, Harriet L; Wathen, C Nadine; Jamieson, Ellen; Boyle, Michael H; Shannon, Harry S; Ford-Gilboe, Marilyn; Worster, Andrew; Lent, Barbara; Coben, Jeffrey H; Campbell, Jacquelyn C; McNutt, Louise-Anne

    2009-08-05

    Whether intimate partner violence (IPV) screening reduces violence or improves health outcomes for women is unknown. To determine the effectiveness of IPV screening and communication of positive results to clinicians. Randomized controlled trial conducted in 11 emergency departments, 12 family practices, and 3 obstetrics/gynecology clinics in Ontario, Canada, among 6743 English-speaking female patients aged 18 to 64 years who presented between July 2005 and December 2006, could be seen individually, and were well enough to participate. Women in the screened group (n=3271) self-completed the Woman Abuse Screening Tool (WAST); if a woman screened positive, this information was given to her clinician before the health care visit. Subsequent discussions and/or referrals were at the discretion of the treating clinician. The nonscreened group (n=3472) self-completed the WAST and other measures after their visit. Women disclosing past-year IPV were interviewed at baseline and every 6 months until 18 months regarding IPV reexposure and quality of life (primary outcomes), as well as several health outcomes and potential harms of screening. Participant loss to follow-up was high: 43% (148/347) of screened women and 41% (148/360) of nonscreened women. At 18 months (n = 411), observed recurrence of IPV among screened vs nonscreened women was 46% vs 53% (modeled odds ratio, 0.82; 95% confidence interval, 0.32-2.12). Screened vs nonscreened women exhibited about a 0.2-SD greater improvement in quality-of-life scores (modeled score difference at 18 months, 3.74; 95% confidence interval, 0.47-7.00). When multiple imputation was used to account for sample loss, differences between groups were reduced and quality-of-life differences were no longer significant. Screened women reported no harms of screening. Although sample attrition urges cautious interpretation, the results of this trial do not provide sufficient evidence to support IPV screening in health care settings. 

  2. Observation of Lévy distribution and replica symmetry breaking in random lasers from a single set of measurements

    NASA Astrophysics Data System (ADS)

    Gomes, Anderson S. L.; Raposo, Ernesto P.; Moura, André L.; Fewo, Serge I.; Pincheira, Pablo I. R.; Jerez, Vladimir; Maia, Lauro J. Q.; de Araújo, Cid B.

    2016-06-01

    Random lasers have been recently exploited as a photonic platform for studies of complex systems. This cross-disciplinary approach opened up new important avenues for the understanding of random-laser behavior, including Lévy-type distributions of strong intensity fluctuations and phase transitions to a photonic spin-glass phase. In this work, we employ the Nd:YBO random laser system to unveil, from a single set of measurements, the physical origin of the complex correspondence between the Lévy fluctuation regime and the replica-symmetry-breaking transition to the spin-glass phase. A novel unexpected finding is also reported: the trend to suppress the spin-glass behavior for high excitation pulse energies. The present description from first principles of this correspondence unfolds new possibilities to characterize other random lasers, such as random fiber lasers, nanolasers and small lasers, which include plasmonic-based, photonic-crystal and bio-derived nanodevices. The statistical nature of the emission provided by random lasers can also impact on their prominent use as sources for speckle-free laser imaging, which nowadays represents one of the most promising applications of random lasers, with expected progress even in cancer research.

  3. Observation of Lévy distribution and replica symmetry breaking in random lasers from a single set of measurements.

    PubMed

    Gomes, Anderson S L; Raposo, Ernesto P; Moura, André L; Fewo, Serge I; Pincheira, Pablo I R; Jerez, Vladimir; Maia, Lauro J Q; de Araújo, Cid B

    2016-06-13

    Random lasers have been recently exploited as a photonic platform for studies of complex systems. This cross-disciplinary approach opened up new important avenues for the understanding of random-laser behavior, including Lévy-type distributions of strong intensity fluctuations and phase transitions to a photonic spin-glass phase. In this work, we employ the Nd:YBO random laser system to unveil, from a single set of measurements, the physical origin of the complex correspondence between the Lévy fluctuation regime and the replica-symmetry-breaking transition to the spin-glass phase. A novel unexpected finding is also reported: the trend to suppress the spin-glass behavior for high excitation pulse energies. The present description from first principles of this correspondence unfolds new possibilities to characterize other random lasers, such as random fiber lasers, nanolasers and small lasers, which include plasmonic-based, photonic-crystal and bio-derived nanodevices. The statistical nature of the emission provided by random lasers can also impact on their prominent use as sources for speckle-free laser imaging, which nowadays represents one of the most promising applications of random lasers, with expected progress even in cancer research.

  4. Observation of Lévy distribution and replica symmetry breaking in random lasers from a single set of measurements

    PubMed Central

    Gomes, Anderson S. L.; Raposo, Ernesto P.; Moura, André L.; Fewo, Serge I.; Pincheira, Pablo I. R.; Jerez, Vladimir; Maia, Lauro J. Q.; de Araújo, Cid B.

    2016-01-01

    Random lasers have been recently exploited as a photonic platform for studies of complex systems. This cross-disciplinary approach opened up new important avenues for the understanding of random-laser behavior, including Lévy-type distributions of strong intensity fluctuations and phase transitions to a photonic spin-glass phase. In this work, we employ the Nd:YBO random laser system to unveil, from a single set of measurements, the physical origin of the complex correspondence between the Lévy fluctuation regime and the replica-symmetry-breaking transition to the spin-glass phase. A novel unexpected finding is also reported: the trend to suppress the spin-glass behavior for high excitation pulse energies. The present description from first principles of this correspondence unfolds new possibilities to characterize other random lasers, such as random fiber lasers, nanolasers and small lasers, which include plasmonic-based, photonic-crystal and bio-derived nanodevices. The statistical nature of the emission provided by random lasers can also impact on their prominent use as sources for speckle-free laser imaging, which nowadays represents one of the most promising applications of random lasers, with expected progress even in cancer research. PMID:27292095

  5. A Comparison of Traditional, Step-Path, and Geostatistical Techniques in the Stability Analysis of a Large Open Pit

    NASA Astrophysics Data System (ADS)

    Mayer, J. M.; Stead, D.

    2017-04-01

    With the increased drive towards deeper and more complex mine designs, geotechnical engineers are often forced to reconsider traditional deterministic design techniques in favour of probabilistic methods. These alternative techniques allow for the direct quantification of uncertainties within a risk and/or decision analysis framework. However, conventional probabilistic practices typically discretize geological materials into discrete, homogeneous domains, with attributes defined by spatially constant random variables, despite the fact that geological media display inherently heterogeneous spatial characteristics. This research directly simulates this heterogeneity using a geostatistical approach known as sequential Gaussian simulation. The method utilizes the variogram, which imposes a controlled degree of spatial heterogeneity on the system. Simulations are constrained using data from the Ok Tedi mine site in Papua New Guinea and designed to randomly vary the geological strength index and uniaxial compressive strength using Monte Carlo techniques. Results suggest that conventional probabilistic techniques have a fundamental limitation compared to geostatistical approaches, as they fail to account for the spatial dependencies inherent to geotechnical datasets. This can result in erroneous model predictions, which are overly conservative when compared to the geostatistical results.
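
    Sequential Gaussian simulation, the method named in this abstract, visits grid nodes along a random path, kriges a local mean and variance from previously simulated neighbours, and draws from the resulting Gaussian. A minimal 1D sketch with simple kriging and an exponential covariance; the grid size, range and neighbourhood limit are arbitrary illustration values, not the study's parameters:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def cov(h, sill=1.0, a=10.0):
        """Exponential covariance C(h) = sill * exp(-3|h|/a), practical range a."""
        return sill * np.exp(-3.0 * np.abs(h) / a)

    def sgs_1d(n=50, max_neighbors=8):
        """Sequential Gaussian simulation on a 1D grid (simple kriging, mean 0)."""
        x = np.arange(n, dtype=float)
        z = np.full(n, np.nan)
        for node in rng.permutation(n):              # random visiting path
            known = np.flatnonzero(~np.isnan(z))
            if known.size == 0:
                z[node] = rng.normal()               # first node: unconditional draw
                continue
            # condition on the nearest previously simulated nodes
            near = known[np.argsort(np.abs(x[known] - x[node]))[:max_neighbors]]
            C = cov(x[near, None] - x[None, near])
            c0 = cov(x[near] - x[node])
            w = np.linalg.solve(C, c0)               # simple kriging weights
            mean, var = w @ z[near], max(cov(0.0) - w @ c0, 0.0)
            z[node] = rng.normal(mean, np.sqrt(var))
        return z

    field = sgs_1d()
    print(field.mean(), field.std())
    ```

    Each realization honours the imposed covariance, so an ensemble of such fields carries spatial dependence into the Monte Carlo stability runs instead of treating each domain as a single constant random variable.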

  6. Selective remediation of contaminated sites using a two-level multiphase strategy and geostatistics.

    PubMed

    Saito, Hirotaka; Goovaerts, Pierre

    2003-05-01

    Selective soil remediation aims to reduce costs by cleaning only the fraction of an exposure unit (EU) necessary to lower the average concentration below the regulatory threshold. This approach requires a prior stratification of each EU into smaller remediation units (RUs), which are then selected according to various criteria. This paper presents a geostatistical framework to account for uncertainties attached to both RU and EU average concentrations in selective remediation. The selection of RUs is based on their impact on the postremediation probability for the EU average concentration to exceed the regulatory threshold, which is assessed using geostatistical stochastic simulation. Application of the technique to a set of 600 dioxin concentrations collected at the Piazza Road EPA Superfund site in Missouri shows a substantial decrease in the number of RUs remediated compared with single-phase remediation. The lower remediation costs achieved by the new strategy come at the cost of a higher risk of false negatives, yet for this data set that risk remains below the 5% rate set by EPA Region 7.
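
    The selection criterion described above, cleaning RUs until the postremediation probability that the EU average exceeds the threshold drops below the 5% rate, can be mimicked with a toy Monte Carlo calculation. The lognormal realizations, threshold and post-remediation concentration below are invented for illustration, not the Piazza Road data:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical EU of 16 remediation units; each row is one simulated
    # realization of the RU concentrations from geostatistical simulation
    n_real, n_ru = 500, 16
    realizations = rng.lognormal(mean=0.0, sigma=1.0, size=(n_real, n_ru))

    threshold = 1.0      # regulatory threshold on the EU average (hypothetical)
    clean_value = 0.1    # assumed concentration of a remediated RU

    def exceedance_prob(sims, cleaned):
        """P(EU average > threshold) given the set of cleaned RU indices."""
        s = sims.copy()
        s[:, list(cleaned)] = clean_value
        return np.mean(s.mean(axis=1) > threshold)

    # Greedily clean the RU with the highest expected concentration until the
    # post-remediation exceedance probability drops below the 5% rate
    cleaned = set()
    for ru in np.argsort(-realizations.mean(axis=0)):
        if exceedance_prob(realizations, cleaned) <= 0.05:
            break
        cleaned.add(int(ru))

    print(len(cleaned), exceedance_prob(realizations, cleaned))
    ```

    The point of the simulation-based criterion is that it propagates the full uncertainty in the EU average, rather than comparing a single kriged estimate against the threshold.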

  7. Comparison of Direct and Indirect Laryngoscopes in Vomitus and Hematemesis Settings: A Randomized Simulation Trial

    PubMed Central

    Mihara, Ryosuke; Komasawa, Nobuyasu; Matsunami, Sayuri; Minami, Toshiaki

    2015-01-01

    Background. Videolaryngoscopes may not be useful in the presence of hematemesis or vomitus. We compared the utility of the Macintosh laryngoscope (McL), which is a direct laryngoscope, with that of the Pentax-AWS Airwayscope (AWS) and McGRATH MAC (McGRATH), which are videolaryngoscopes, in simulated hematemesis and vomitus settings. Methods. Seventeen anesthesiologists with more than 1 year of experience performed tracheal intubation on an adult manikin using McL, AWS, and McGRATH under normal, hematemesis, and vomitus simulations. Results. In the normal setting, the intubation success rate was 100% for all three laryngoscopes. In the hematemesis settings, the intubation success rate differed significantly among the three laryngoscopes (P = 0.021). In the vomitus settings, all participants succeeded in tracheal intubation with McL or McGRATH, while five failed in the AWS trial with significant difference (P = 0.003). The intubation time did not significantly differ in normal settings, while it was significantly longer in the AWS trial compared to McL or McGRATH trial in the hematemesis or vomitus settings (P < 0.001, compared to McL or McGRATH in both settings). Conclusion. The performance of McGRATH and McL can be superior to that of AWS for tracheal intubation in vomitus and hematemesis settings in adults. PMID:26618177

  8. Geostatistical Solutions for Downscaling Remotely Sensed Land Surface Temperature

    NASA Astrophysics Data System (ADS)

    Wang, Q.; Rodriguez-Galiano, V.; Atkinson, P. M.

    2017-09-01

    Remotely sensed land surface temperature (LST) downscaling is an important issue in remote sensing. Geostatistical methods have shown their applicability in downscaling multi/hyperspectral images. In this paper, four geostatistical solutions, including regression kriging (RK), downscaling cokriging (DSCK), kriging with external drift (KED) and area-to-point regression kriging (ATPRK), are applied for downscaling remotely sensed LST. Their differences are analyzed theoretically and the performances are compared experimentally using a Landsat 7 ETM+ dataset. They are also compared to the classical TsHARP method.

  9. Geostatistics for environmental and geotechnical applications: A technology transferred

    SciTech Connect

    Cromer, M.V.

    1996-12-31

    Although successfully applied during the past few decades for predicting the spatial occurrence of properties that are cloaked from direct observation, geostatistical methods remain somewhat of a mystery to practitioners in the environmental and geotechnical fields. The techniques are powerful analytical tools that integrate numerical and statistical methods with scientific intuition and professional judgment to resolve conflicts between conceptual interpretation and direct measurement. This paper examines the practicality of these techniques for environmental and geotechnical applications and concludes by introducing a practical case study in which the geostatistical approach is thoroughly executed.

  10. Hydrogeologic unit flow characterization using transition probability geostatistics.

    PubMed

    Jones, Norman L; Walker, Justin R; Carle, Steven F

    2005-01-01

    This paper describes a technique for applying the transition probability geostatistics method for stochastic simulation to a MODFLOW model. Transition probability geostatistics has some advantages over traditional indicator kriging methods including a simpler and more intuitive framework for interpreting geologic relationships and the ability to simulate juxtapositional tendencies such as fining upward sequences. The indicator arrays generated by the transition probability simulation are converted to layer elevation and thickness arrays for use with the new Hydrogeologic Unit Flow package in MODFLOW 2000. This makes it possible to preserve complex heterogeneity while using reasonably sized grids and/or grids with nonuniform cell thicknesses.
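
    The juxtapositional tendencies mentioned above (e.g. fining-upward sequences) are encoded in transition probabilities between facies; in one dimension the simulation reduces to a Markov chain over facies indicators. A sketch with an invented three-facies matrix, as a stand-in for the full 3D transition probability workflow:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical vertical transition probability matrix for three facies
    # (0 = sand, 1 = silt, 2 = clay); each row sums to 1
    P = np.array([[0.80, 0.15, 0.05],
                  [0.10, 0.80, 0.10],
                  [0.05, 0.25, 0.70]])

    def simulate_column(P, n_cells=200, start=0):
        """Simulate a 1D facies column as a Markov chain driven by P."""
        facies = np.empty(n_cells, dtype=int)
        facies[0] = start
        for k in range(1, n_cells):
            facies[k] = rng.choice(3, p=P[facies[k - 1]])
        return facies

    column = simulate_column(P)
    print(np.bincount(column, minlength=3))   # cell counts per facies
    ```

    The resulting indicator column is the kind of array that would then be converted to layer elevations and thicknesses for a Hydrogeologic Unit Flow representation.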

  11. Comparing spatial regression to random forests for large environmental data sets

    EPA Science Inventory

    Environmental data may be “large” due to number of records, number of covariates, or both. Random forests has a reputation for good predictive performance when using many covariates, whereas spatial regression, when using reduced rank methods, has a reputatio...

  12. Set statistics in conductive bridge random access memory device with Cu/HfO{sub 2}/Pt structure

    SciTech Connect

    Zhang, Meiyun; Long, Shibing; Wang, Guoming; Xu, Xiaoxin; Li, Yang; Liu, Qi; Lv, Hangbing; Liu, Ming; Lian, Xiaojuan; Miranda, Enrique; Suñé, Jordi

    2014-11-10

    The switching parameter variation of resistive switching memory is one of the most important challenges in its application. In this letter, we have studied the set statistics of conductive bridge random access memory with a Cu/HfO{sub 2}/Pt structure. The experimental distributions of the set parameters in several off resistance ranges are shown to nicely fit a Weibull model. The Weibull slopes of the set voltage and current increase and decrease logarithmically with off resistance, respectively. This experimental behavior is perfectly captured by a Monte Carlo simulator based on the cell-based set voltage statistics model and the Quantum Point Contact electron transport model. Our work provides indications for the improvement of the switching uniformity.
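
    The Weibull analysis described here is conventionally done on a Weibull plot, where the CDF F(V) = 1 - exp(-(V/V63)^beta) becomes linear: ln(-ln(1-F)) = beta ln V - beta ln V63, and the Weibull slope is the fitted gradient. A sketch with synthetic set voltages (the shape and scale values are invented, not the Cu/HfO2/Pt measurements):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Synthetic set-voltage sample drawn from a Weibull distribution
    # (beta = Weibull slope, v63 = scale, i.e. the 63% quantile)
    beta_true, v63_true = 4.0, 2.0
    v = v63_true * rng.weibull(beta_true, size=2000)

    # Weibull plot: ln(-ln(1 - F)) versus ln V should be a straight line
    v_sorted = np.sort(v)
    n = v_sorted.size
    F = (np.arange(1, n + 1) - 0.5) / n        # plotting-position estimate of F
    y = np.log(-np.log(1.0 - F))
    slope, intercept = np.polyfit(np.log(v_sorted), y, 1)

    print(f"Weibull slope ~ {slope:.2f}, scale ~ {np.exp(-intercept / slope):.2f}")
    ```

    Repeating the fit for samples binned by off resistance would reproduce the kind of slope-versus-resistance trend reported in the letter.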

  13. Geostatistical Study of Precipitation on the Island of Crete

    NASA Astrophysics Data System (ADS)

    Agou, Vasiliki D.; Varouchakis, Emmanouil A.; Hristopulos, Dionissios T.

    2015-04-01

    precipitation which are fitted locally to a three-parameter probability distribution, based on which a normalized index is derived. We use the Spartan variogram function to model space-time correlations, because it is more flexible than classical models [3]. The performance of the variogram model is tested by means of leave-one-out cross validation. The variogram model is then used in connection with ordinary kriging to generate precipitation maps for the entire island. In the future, we will explore the joint spatiotemporal evolution of precipitation patterns on Crete. References [1] P. Goovaerts. Geostatistical approaches for incorporating elevation into the spatial interpolation of precipitation. Journal of Hydrology, 228(1):113-129, 2000. [2] N. B. Guttman. Accepting the standardized precipitation index: a calculation algorithm. American Water Resource Association, 35(2):311-322, 1999. [3] D. T Hristopulos. Spartan Gibbs random field models for geostatistical applications. SIAM Journal on Scientific Computing, 24(6):2125-2162, 2003. [4] A.G. Koutroulis, A.-E.K. Vrohidou, and I.K. Tsanis. Spatiotemporal characteristics of meteorological drought for the island of Crete. Journal of Hydrometeorology, 12(2):206-226, 2011. [5] T. B. McKee, N. J. Doesken, and J. Kleist. The relationship of drought frequency and duration to time scales. In Proceedings of the 8th Conference on Applied Climatology, page 179-184, Anaheim, California, 1993.
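
    Ordinary kriging, used above to generate the island-wide precipitation maps, solves a small linear system in which a Lagrange multiplier enforces that the weights sum to one. A minimal sketch with invented gauge data and a spherical variogram standing in for the Spartan model used in the study:

    ```python
    import numpy as np

    # Hypothetical rain-gauge data: station coordinates (km), precipitation (mm)
    pts = np.array([[0.0, 0.0], [3.0, 1.0], [1.0, 4.0], [4.0, 4.0]])
    vals = np.array([120.0, 95.0, 150.0, 110.0])

    def gamma(h, nugget=0.0, sill=400.0, a=10.0):
        """Spherical variogram model (an assumed fit, not the Spartan model)."""
        h = np.asarray(h, dtype=float)
        g = nugget + sill * (1.5 * h / a - 0.5 * (h / a) ** 3)
        return np.where(h >= a, nugget + sill, g)

    def ordinary_kriging(pts, vals, x0):
        """Ordinary kriging prediction at x0 via the Lagrange-multiplier system."""
        n = len(pts)
        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
        A = np.empty((n + 1, n + 1))
        A[:n, :n] = gamma(d)
        A[n, :n] = A[:n, n] = 1.0        # unbiasedness constraint
        A[n, n] = 0.0
        b = np.append(gamma(np.linalg.norm(pts - x0, axis=1)), 1.0)
        w = np.linalg.solve(A, b)
        return w[:n] @ vals, w[:n]

    pred, weights = ordinary_kriging(pts, vals, np.array([2.0, 2.0]))
    print(pred, weights.sum())           # weights sum to 1 by construction
    ```

    Evaluating the same system on a grid of target points produces the interpolated map; leave-one-out cross validation, as used in the study, simply repeats the prediction at each station with that station removed.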

  14. Proposed sets of critical exponents for randomly branched polymers, using a known string theory model

    NASA Astrophysics Data System (ADS)

    March, N. H.; Moreno, A. J.

    2016-06-01

    The critical exponent ν for randomly branched polymers with dimensionality d equal to 3, is known exactly as 1/2. Here, we invoke an already available string theory model to predict the remaining static critical exponents. Utilizing results of Hsu et al. (Comput Phys Commun. 2005;169:114-116), results are added for d = 8. Experiment plus simulation would now be important to confirm, or if necessary to refine, the proposed values.

  15. High-resolution Geostatistical Inversion of the Transient Richards Equation

    NASA Astrophysics Data System (ADS)

    Klein, Ole; Bastian, Peter; Ippisch, Olaf

    2015-04-01

    The vadose zone and the complex physical processes in it play a vital role in our understanding of the environment. The production of most food is directly or indirectly linked to the growth of organic matter sustained by subsurface flow. For a reliable assessment of the influence of natural and anthropogenic changes to such a coupled system detailed knowledge about the flow patterns and dynamics is important, but the high spatial variability of subsurface hydraulic parameters makes reliable predictions about flow patterns difficult. Direct measurement of these properties is not possible, making indirect observations through dependent quantities and parameter estimation a necessity. The geostatistical approach characterizes these hydraulic parameters without predetermined zonation. The parameter fields are treated as stochastic processes, optionally incorporating a priori information in the probability distribution. Maximizing the likelihood of the parameters with regard to the given observations yields a parameter estimate with high spatial resolution. This approach naturally leads to non-linear least squares optimization problems that may theoretically be solved using standard techniques. However, the accurate numerical representation of the Richards equation necessitates high spatio-temporal resolution and therefore a large number of parameters, while time series of observed physical quantities typically lead to many data points to invert. This high dimensionality in both the parameter and observation space makes standard techniques infeasible. We present an extension of one of these existing inversion methods, developed for stationary flow in confined aquifers, to instationary flow regimes in partially saturated porous media. Our approach uses a Conjugate Gradients scheme preconditioned with the prior covariance matrix to avoid both multiplications with its inverse and the explicit assembly of the sensitivity matrix. 
Instead, one combined adjoint model run is
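
    The preconditioning idea can be sketched on a small dense problem. In the matrix-free setting described above, products with the Gauss-Newton matrix come from adjoint model runs and the inverse prior covariance is never formed; here, purely for illustration, the matrix is assembled explicitly and all dimensions and operators are invented:

    ```python
    import numpy as np
    from scipy.sparse.linalg import cg, LinearOperator

    rng = np.random.default_rng(2)
    n, m = 60, 15                          # parameters, observations (hypothetical)

    # Prior covariance Q with exponential decay (the geostatistical prior)
    grid = np.arange(n)
    Q = np.exp(-np.abs(grid[:, None] - grid[None, :]) / 8.0)

    H = rng.normal(size=(m, n)) / np.sqrt(n)   # linearized forward map (invented)
    R_inv = np.eye(m) / 0.01                   # observation error precision
    A = H.T @ R_inv @ H + np.linalg.inv(Q)     # Gauss-Newton normal matrix
    b = rng.normal(size=n)

    # Preconditioning with the prior covariance Q clusters the spectrum of the
    # normal matrix and keeps Q^{-1} out of the iteration itself
    M = LinearOperator((n, n), matvec=lambda v: Q @ v)
    x, info = cg(A, b, M=M)
    print(info, np.linalg.norm(A @ x - b))
    ```

    With `info == 0` the preconditioned CG run has converged; in the full inversion each matrix-vector product would correspond to one combined forward/adjoint model evaluation.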

  16. Reducing HIV-Related Stigma in Health Care Settings: A Randomized Controlled Trial in China

    PubMed Central

    Wu, Zunyou; Liang, Li-Jung; Guan, Jihui; Jia, Manhong; Rou, Keming; Yan, Zhihua

    2013-01-01

    Objectives. The objective of the intervention was to reduce service providers’ stigmatizing attitudes and behaviors toward people living with HIV. Methods. The randomized controlled trial was conducted in 40 county-level hospitals in 2 provinces of China between October 2008 and February 2010. Forty-four service providers were randomly selected from each hospital, yielding a total of 1760 study participants. We randomized the hospitals to either an intervention condition or a control condition. In the intervention hospitals, about 15% of the popular opinion leaders were identified and trained to disseminate stigma reduction messages. Results. We observed significant improvements for the intervention group in reducing prejudicial attitudes (P < .001), reducing avoidance intent towards people living with HIV (P < .001), and increasing institutional support in the hospitals (P = .003) at 6 months after controlling for service providers’ background factors and clinic-level characteristics. The intervention effects were sustained and strengthened at 12 months. Conclusions. The intervention reduced stigmatizing attitudes and behaviors among service providers. It has the potential to be integrated into the health care systems in China and other countries. PMID:23237175

  17. Geostatistical Sampling Methods for Efficient Uncertainty Analysis in Flow and Transport Problems

    NASA Astrophysics Data System (ADS)

    Liodakis, Stylianos; Kyriakidis, Phaedon; Gaganis, Petros

    2015-04-01

    In hydrogeological applications involving flow and transport in heterogeneous porous media, the spatial distribution of hydraulic conductivity is often parameterized in terms of a lognormal random field based on a histogram and variogram model inferred from data and/or synthesized from relevant knowledge. Realizations of simulated conductivity fields are then generated using geostatistical simulation involving simple random (SR) sampling and are subsequently used as inputs to physically-based simulators of flow and transport in a Monte Carlo framework for evaluating the uncertainty in the spatial distribution of solute concentration due to the uncertainty in the spatial distribution of hydraulic conductivity [1]. Realistic uncertainty analysis, however, calls for a large number of simulated concentration fields; hence, it can become expensive in terms of both time and computer resources. A more efficient alternative to SR sampling is Latin hypercube (LH) sampling, a special case of stratified random sampling, which yields a more representative distribution of simulated attribute values with fewer realizations [2]. Here, the term "representative" implies realizations spanning efficiently the range of possible conductivity values corresponding to the lognormal random field. In this work we investigate the efficiency of alternative methods to classical LH sampling within the context of simulation of flow and transport in a heterogeneous porous medium. More precisely, we consider the stratified likelihood (SL) sampling method of [3], in which attribute realizations are generated using the polar simulation method by exploring the geometrical properties of the multivariate Gaussian distribution function. 
In addition, we propose a more efficient version of the above method, here termed minimum energy (ME) sampling, whereby a set of N representative conductivity realizations at M locations is constructed by: (i) generating a representative set of N points distributed on the
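
    The contrast between simple random and Latin hypercube sampling is easiest to see in one dimension: LH sampling splits the distribution into n equal-probability strata and draws exactly one sample per stratum. A sketch with arbitrary sizes (not the paper's SL or ME schemes):

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(3)

    def lhs_normal(n, m):
        """n-by-m standard normal sample via Latin hypercube sampling: in each
        dimension, each of the n equal-probability strata is sampled once."""
        u = np.empty((n, m))
        for j in range(m):
            # one uniform draw per stratum, in randomly permuted order
            u[:, j] = (rng.permutation(n) + rng.uniform(size=n)) / n
        return norm.ppf(u)

    n = 20
    sr = rng.standard_normal((n, 1))   # simple random (SR) sample for comparison
    lh = lhs_normal(n, 1)

    # The LH sample covers the distribution more evenly than SR sampling, so
    # small ensembles are more representative of the target distribution
    print("SR mean %.3f  LH mean %.3f" % (sr.mean(), lh.mean()))
    ```

    Applied to the normal scores driving a geostatistical simulator, this stratification is what lets a small ensemble of conductivity realizations span the lognormal field efficiently.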

  18. Comparison of the Predictive Performance and Interpretability of Random Forest and Linear Models on Benchmark Data Sets.

    PubMed

    Marchese Robinson, Richard L; Palczewska, Anna; Palczewski, Jan; Kidley, Nathan

    2017-08-28

    The ability to interpret the predictions made by quantitative structure-activity relationships (QSARs) offers a number of advantages. While QSARs built using nonlinear modeling approaches, such as the popular Random Forest algorithm, might sometimes be more predictive than those built using linear modeling approaches, their predictions have been perceived as difficult to interpret. However, a growing number of approaches have been proposed for interpreting nonlinear QSAR models in general and Random Forest in particular. In the current work, we compare the performance of Random Forest to those of two widely used linear modeling approaches: linear Support Vector Machines (SVMs) (or Support Vector Regression (SVR)) and partial least-squares (PLS). We compare their performance in terms of their predictivity as well as the chemical interpretability of the predictions using novel scoring schemes for assessing heat map images of substructural contributions. We critically assess different approaches for interpreting Random Forest models as well as for obtaining predictions from the forest. We assess the models on a large number of widely employed public-domain benchmark data sets corresponding to regression and binary classification problems of relevance to hit identification and toxicology. We conclude that Random Forest typically yields comparable or possibly better predictive performance than the linear modeling approaches and that its predictions may also be interpreted in a chemically and biologically meaningful way. In contrast to earlier work looking at interpretation of nonlinear QSAR models, we directly compare two methodologically distinct approaches for interpreting Random Forest models. The approaches for interpreting Random Forest assessed in our article were implemented using open-source programs that we have made available to the community. These programs are the rfFC package ( https://r-forge.r-project.org/R/?group_id=1725 ) for the R statistical
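
    A toy version of the comparison, Random Forest versus a linear model on data with a deliberately nonlinear structure-activity relationship, assuming scikit-learn is available. The descriptors and response below are synthetic, not the benchmark sets used in the paper:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(4)

    # Synthetic "activity" data: only the first two descriptors are informative,
    # and they act nonlinearly, which a linear model cannot capture
    X = rng.uniform(-2, 2, size=(400, 5))
    y = np.sin(2 * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=400)

    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.5, random_state=0)

    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(Xtr, ytr)
    lin = LinearRegression().fit(Xtr, ytr)

    mse = lambda m: np.mean((m.predict(Xte) - yte) ** 2)
    print(f"RF MSE {mse(rf):.3f}  linear MSE {mse(lin):.3f}")

    # A crude interpretability probe: impurity-based importances should
    # concentrate on the two informative descriptors
    print(np.round(rf.feature_importances_, 2))
    ```

    This is only the predictivity half of the paper's comparison; the substructural heat-map interpretation it describes requires the rfFC-style feature-contribution machinery.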

  19. Monte Carlo Analysis of Reservoir Models Using Seismic Data and Geostatistical Models

    NASA Astrophysics Data System (ADS)

    Zunino, A.; Mosegaard, K.; Lange, K.; Melnikova, Y.; Hansen, T. M.

    2013-12-01

    We present a study on the analysis of petroleum reservoir models consistent with seismic data and geostatistical constraints, performed on a synthetic reservoir model. Our aim is to invert directly for structure and rock bulk properties of the target reservoir zone. To infer the rock facies, porosity and oil saturation, seismology alone is not sufficient; a rock physics model, which links the unknown properties to the elastic parameters, must be taken into account. We then combine a rock physics model with a simple convolutional approach for seismic waves to invert the "measured" seismograms. To solve this inverse problem, we employ a Markov chain Monte Carlo (MCMC) method, because it offers the possibility to handle non-linearity and complex, multi-step forward models, and provides realistic estimates of uncertainties. However, for large data sets the MCMC method may be impractical because of its very high computational demand. To face this challenge, one strategy is to feed the algorithm with realistic models, hence relying on proper prior information. To address this problem, we utilize an algorithm drawn from geostatistics to generate geologically plausible models which represent samples of the prior distribution. The geostatistical algorithm learns the multiple-point statistics from prototype models (in the form of training images), then generates thousands of different models which are accepted or rejected by a Metropolis sampler. To further reduce the computation time we parallelize the software and run it on multi-core machines. The solution of the inverse problem is then represented by a collection of reservoir models in terms of facies, porosity and oil saturation, which constitute samples of the posterior distribution. We are finally able to produce probability maps of the properties we are interested in by performing statistical analysis on the collection of solutions.
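
    The Metropolis sampler at the heart of the MCMC approach is simple to state: propose a perturbation of the current model, then accept it with probability min(1, posterior ratio). A scalar toy inversion for porosity, where the forward model, prior and noise level are all invented stand-ins for the rock-physics-plus-convolution chain:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Toy inverse problem: infer porosity phi from one noisy "seismic" datum
    phi_true = 0.25
    g = lambda phi: 2.0 * phi + 0.1 * phi ** 2      # hypothetical forward model
    d_obs = g(phi_true) + rng.normal(0.0, 0.01)
    sigma_d, prior_mu, prior_sd = 0.01, 0.2, 0.1    # noise and Gaussian prior

    def log_post(phi):
        if not 0.0 < phi < 0.5:                     # physical bounds
            return -np.inf
        ll = -0.5 * ((d_obs - g(phi)) / sigma_d) ** 2
        lp = -0.5 * ((phi - prior_mu) / prior_sd) ** 2
        return ll + lp

    # Metropolis: random-walk proposal, accept with prob min(1, ratio)
    phi, samples = 0.2, []
    lp_cur = log_post(phi)
    for _ in range(5000):
        prop = phi + rng.normal(0.0, 0.02)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp_cur:
            phi, lp_cur = prop, lp_prop
        samples.append(phi)
    post = np.array(samples[1000:])                 # discard burn-in
    print(post.mean(), post.std())
    ```

    In the reservoir setting each "proposal" is a whole geostatistical realization drawn from the multiple-point prior, which is why prior realism matters so much for the sampler's efficiency.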

  20. A Nonparametric Geostatistical Method For Estimating Species Importance

    Treesearch

    Andrew J. Lister; Rachel Riemann; Michael Hoppus

    2001-01-01

    Parametric statistical methods are not always appropriate for conducting spatial analyses of forest inventory data. Parametric geostatistical methods such as variography and kriging are essentially averaging procedures, and thus can be affected by extreme values. Furthermore, non normal distributions violate the assumptions of analyses in which test statistics are...

  1. Geostatistical Modeling of Evolving Landscapes by Means of Image Quilting

    NASA Astrophysics Data System (ADS)

    Mendes, J. H.; Caers, J.; Scheidt, C.

    2015-12-01

    Realistic geological representation of subsurface heterogeneity remains an important outstanding challenge. While many geostatistical methods exist for representing sedimentary systems, such as multiple-point geostatistics, rule-based methods or Boolean methods, the question of what the prior uncertainty on parameters (or training images) of such algorithms is remains outstanding. In this initial work, we investigate the use of flume experiments to better constrain such prior uncertainty and to start understanding what information should be provided to geostatistical algorithms. In particular, we study the use of image quilting as a novel multiple-point method for generating fast geostatistical realizations once a training image is provided. Image quilting is a method emanating from computer graphics in which patterns are extracted from training images and then stochastically quilted along a raster path to create stochastic variation of the given training image. In this initial study, we use a flume experiment and extract 10 training images as representative of the variability of the evolving landscape over a period of 136 minutes. The training images consist of wet/dry regions obtained from overhead shots taken over the flume experiment. To investigate whether such image quilting reproduces the same variability of the evolving landscape in terms of wet/dry regions, we generate multiple realizations with all 10 training images and compare that variability with the variability seen in the entire flume experiment. By proper tuning of the quilting parameters we find generally reasonable agreement with the flume experiment.
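
    A stripped-down version of image quilting: patches from a training image are laid along a raster path, each chosen among the best matches to the already-simulated overlap region (real implementations also compute a minimum-error boundary cut, omitted here). The diagonal-stripe training image below is an invented stand-in for the flume wet/dry maps:

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    # Hypothetical binary wet/dry training image: diagonal stripe "channels"
    ti = ((np.add.outer(np.arange(40), np.arange(40)) // 5) % 2).astype(float)

    p, o = 8, 3                          # patch size and overlap width
    out = np.full((28, 28), np.nan)      # realization grid, sized to tile exactly

    # all candidate patches from the training image
    cands = np.array([ti[i:i + p, j:j + p]
                      for i in range(ti.shape[0] - p)
                      for j in range(ti.shape[1] - p)])

    for i in range(0, out.shape[0] - p + 1, p - o):      # raster path
        for j in range(0, out.shape[1] - p + 1, p - o):
            region = out[i:i + p, j:j + p]
            known = ~np.isnan(region)
            if known.any():
                # sum-of-squares mismatch against the simulated overlap pixels
                diff = np.where(known, cands - region, 0.0)
                err = (diff ** 2).sum(axis=(1, 2))
                best = np.argsort(err)[:5]   # keep some stochastic variation
                pick = cands[rng.choice(best)]
            else:
                pick = cands[rng.integers(len(cands))]   # free first patch
            out[i:i + p, j:j + p] = pick

    print(np.unique(out))
    ```

    Drawing among the few best-matching patches, rather than always the single best, is what makes repeated runs produce distinct realizations of the same training-image pattern.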

  2. A geostatistical and sampling analysis of regraded spoil materials

    SciTech Connect

    Myers, J.C.; Brown, T.H.

    1990-12-31

    Characterization of the pH and acid-base account levels in regraded spoil materials from mining operations is a difficult task due to mixing and the directional nature of product extraction. Geostatistical analysis of regraded spoil materials is currently being studied as the eventual methodology for determining sample grid size and sub-sample number for minesoil monitoring programs in the State of Texas. It is anticipated that geostatistics will soon be utilized for similar reasons at mine sites in other regions. In view of this, it is necessary to develop a position on geostatistics as a method for determining sample intensity necessary to statistically characterize Acid Forming Material (AFM) conditions existing in post-mined soils. A group of six Texas lignite mines has been analyzed using geostatistical methods. Acid-base account and pH values were mapped at four levels in each site. Determinations as to the confidence of the sampling programs were performed for all sites. Recommendations and strategies were developed for future sampling programs. Additional techniques to minimize sub-sample spacing were also developed.

  3. Reducing uncertainty in geostatistical description with well testing pressure data

    SciTech Connect

    Reynolds, A.C.; He, Nanqun; Oliver, D.S.

    1997-08-01

    Geostatistics has proven to be an effective tool for generating realizations of reservoir properties conditioned to static data, e.g., core and log data and geologic knowledge. Due to the lack of closely spaced data in the lateral directions, there will be significant variability in reservoir descriptions generated by geostatistical simulation, i.e., significant uncertainty in the reservoir descriptions. In past work, we have presented procedures based on inverse problem theory for generating reservoir descriptions (rock property fields) conditioned to pressure data and to geostatistical information represented as prior means for log-permeability and porosity and as variograms. Although we have shown that the incorporation of pressure data reduces the uncertainty below the level contained in the geostatistical model based only on static information (the prior model), our previous results did not explicitly account for uncertainties in the prior means and the parameters defining the variogram model. In this work, we investigate how pressure data can help detect errors in the prior means. If errors in the prior means are large and are not taken into account, realizations conditioned to pressure data represent incorrect samples of the a posteriori probability density function for the rock property fields, whereas, if the uncertainty in the prior mean is incorporated properly into the model, one obtains realistic realizations of the rock property fields.

  4. Gstat: a program for geostatistical modelling, prediction and simulation

    NASA Astrophysics Data System (ADS)

    Pebesma, Edzer J.; Wesseling, Cees G.

    1998-01-01

    Gstat is a computer program for variogram modelling, and geostatistical prediction and simulation. It provides a generic implementation of the multivariable linear model with trends modelled as a linear function of coordinate polynomials or of user-defined base functions, and independent or dependent, geostatistically modelled, residuals. Simulation in gstat comprises conditional or unconditional (multi-) Gaussian sequential simulation of point values or block averages, or (multi-) indicator sequential simulation. Besides many of the popular options found in other geostatistical software packages, gstat offers the unique combination of (i) an interactive user interface for modelling variograms and generalized covariances (residual variograms), that uses the device-independent plotting program gnuplot for graphical display, (ii) support for several ASCII and binary data and map file formats for input and output, (iii) a concise, intuitive and flexible command language, (iv) user customization of program defaults, (v) no built-in limits, and (vi) free, portable ANSI-C source code. This paper describes the class of problems gstat can solve, and addresses aspects of efficiency and implementation, managing geostatistical projects, and relevant technical details.
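    Sequential Gaussian simulation, one of the simulation modes this abstract mentions, visits unsimulated locations along a random path, computes a kriged conditional mean and variance from the data plus previously simulated values, and draws from the resulting Gaussian. A stripped-down 1-D sketch of that recipe (simple kriging with zero mean and an exponential covariance; illustrative code, not gstat's):

    ```python
    import math
    import random

    def cov(h, sill=1.0, rng_len=10.0):
        """Exponential covariance model: C(h) = sill * exp(-3|h| / range)."""
        return sill * math.exp(-3.0 * abs(h) / rng_len)

    def solve(A, b):
        """Gaussian elimination with partial pivoting for small dense systems."""
        n = len(A)
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for c in range(n):
            piv = max(range(c, n), key=lambda r: abs(M[r][c]))
            M[c], M[piv] = M[piv], M[c]
            for r in range(c + 1, n):
                f = M[r][c] / M[c][c]
                for k in range(c, n + 1):
                    M[r][k] -= f * M[c][k]
        x = [0.0] * n
        for r in range(n - 1, -1, -1):
            x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
        return x

    def sgs(data_x, data_z, targets, seed=0):
        """Sequential Gaussian simulation along a random path (simple kriging)."""
        rng = random.Random(seed)
        kx, kz = list(data_x), list(data_z)
        path = list(targets)
        rng.shuffle(path)
        out = {}
        for x0 in path:
            A = [[cov(xi - xj) for xj in kx] for xi in kx]
            b = [cov(xi - x0) for xi in kx]
            w = solve(A, b)                      # simple kriging weights
            mean = sum(wi * zi for wi, zi in zip(w, kz))
            var = max(cov(0.0) - sum(wi * bi for wi, bi in zip(w, b)), 0.0)
            z0 = rng.gauss(mean, math.sqrt(var))
            out[x0] = z0
            kx.append(x0)   # later nodes are conditioned on this draw
            kz.append(z0)
        return [out[t] for t in targets]
    ```

    Simulating at a conditioning location reproduces the datum exactly (zero kriging variance), which is the defining property of conditional simulation.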

  5. Childhood obesity prevention interventions in childcare settings: systematic review of randomized and nonrandomized controlled trials.

    PubMed

    Zhou, Yuan E; Emerson, Janice S; Levine, Robert S; Kihlberg, Courtney J; Hull, Pamela C

    2014-01-01

    Childcare settings are an opportune location for early intervention programs seeking to prevent childhood obesity. This article reports on a systematic review of controlled trials of obesity prevention interventions in childcare settings. The review was limited to English language articles published in PubMed, Web of Science, and Education Resources Information Center (ERIC) between January 2000 and April 2012. Inclusion criteria: childhood obesity prevention interventions in childcare settings using controlled designs that reported adiposity and behavior outcomes. Exclusion criteria: no interventions, non-childcare settings, clinical weight loss programs, non-English publications. Publications were identified by key word search. Two authors reviewed eligible studies to extract study information and study results. Qualitative synthesis was conducted, including tabulation of information and a narrative summary. Fifteen studies met the eligibility criteria. Seven studies reported improvements in adiposity. Six of the 13 interventions with dietary components reported improved intake or eating behaviors. Eight of the 12 interventions with physical activity components reported improved activity levels or physical fitness. Evidence was mixed for all outcomes. Results should be interpreted cautiously given the high variability in study designs and interventions. Further research needs long-term follow-up, multistrategy interventions that include changes in the nutrition and physical activity environment, reporting of cost data, and consideration of sustainability.

  6. Evaluating the Bookmark Standard Setting Method: The Impact of Random Item Ordering

    ERIC Educational Resources Information Center

    Davis-Becker, Susan L.; Buckendahl, Chad W.; Gerrow, Jack

    2011-01-01

    Throughout the world, cut scores are an important aspect of a high-stakes testing program because they are a key operational component of the interpretation of test scores. One method for setting standards that is prevalent in educational testing programs--the Bookmark method--is intended to be a less cognitively complex alternative to methods…

  7. Does Pedometer Goal Setting Improve Physical Activity among Native Elders? Results from a Randomized Pilot Study

    ERIC Educational Resources Information Center

    Sawchuk, Craig N.; Russo, Joan E.; Charles, Steve; Goldberg, Jack; Forquera, Ralph; Roy-Byrne, Peter; Buchwald, Dedra

    2011-01-01

    We examined if step-count goal setting resulted in increases in physical activity and walking compared to only monitoring step counts with pedometers among American Indian/Alaska Native elders. Outcomes included step counts, self-reported physical activity and well-being, and performance on the 6-minute walk test. Although no significant…

  8. Using Performance Feedback and Goal Setting to Improve Elementary Students' Writing Fluency: A Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Koenig, Elizabeth A.; Eckert, Tanya L.; Hier, Bridget O.

    2016-01-01

    Although performance feedback interventions successfully lead to improvements in students' performance, research suggests that the combination of feedback and goal setting leads to greater performance than either component alone and that graphing performance in relation to a goal can lead to improvements in academic performance. The goal of the…

  9. Web-based weight management programs in an integrated health care setting: a randomized, controlled trial.

    PubMed

    Rothert, Kendra; Strecher, Victor J; Doyle, Laurie A; Caplan, William M; Joyce, Jodi S; Jimison, Holly B; Karm, Lya M; Mims, Adrienne D; Roth, Mark A

    2006-02-01

    To assess the efficacy of a Web-based tailored behavioral weight management program compared with Web-based information-only weight management materials. Participants, 2862 eligible overweight and obese (BMI = 27 to 40 kg/m2) members from four regions of Kaiser Permanente's integrated health care delivery system, were randomized to receive either a tailored expert system or information-only Web-based weight management materials. Weight change and program satisfaction were assessed by self-report through an Internet-based survey at 3- and 6-month follow-up periods. Significantly greater weight loss at follow-up was found among participants assigned to the tailored expert system than among those assigned to the information-only condition. Subjects in the tailored expert system lost a mean of 3 ± 0.3% of their baseline weight, whereas subjects in the information-only condition lost a mean of 1.2 ± 0.4% (p < 0.0004). Participants were also more likely to report that the tailored expert system was personally relevant, helpful, and easy to understand. Notably, 36% of enrollees were African-American, with enrollment rates higher than the general proportion of African Americans in any of the study regions. The results of this large, randomized controlled trial show the potential benefit of the Web-based tailored expert system for weight management compared with a Web-based information-only weight management program.

  10. Adjustment of Open-Loop Settings to Improve Closed-Loop Results in Type 1 Diabetes: A Multicenter Randomized Trial

    PubMed Central

    Dassau, Eyal; Brown, Sue A.; Basu, Ananda; Pinsker, Jordan E.; Kudva, Yogish C.; Gondhalekar, Ravi; Patek, Steve; Lv, Dayu; Schiavon, Michele; Lee, Joon Bok; Dalla Man, Chiara; Hinshaw, Ling; Castorino, Kristin; Mallad, Ashwini; Dadlani, Vikash; McCrady-Spitzer, Shelly K.; McElwee-Malloy, Molly; Wakeman, Christian A.; Bevier, Wendy C.; Bradley, Paige K.; Kovatchev, Boris; Cobelli, Claudio; Zisser, Howard C.

    2015-01-01

    Context: Closed-loop control (CLC) relies on an individual's open-loop insulin pump settings to initialize the system. Optimizing open-loop settings before using CLC usually requires significant time and effort. Objective: The objective was to investigate the effects of a one-time algorithmic adjustment of basal rate and insulin-to-carbohydrate ratio open-loop settings on the performance of CLC. Design: This study reports a multicenter, outpatient, randomized, crossover clinical trial. Patients: Thirty-seven adults with type 1 diabetes were enrolled at three clinical sites. Interventions: Each subject's insulin pump settings were subject to a one-time algorithmic adjustment based on 1 week of open-loop (i.e., home care) data collection. Subjects then underwent two 27-hour periods of CLC in random order with either unchanged (control) or algorithmically adjusted basal rate and carbohydrate ratio settings (adjusted) used to initialize the zone-model predictive control artificial pancreas controller. Subjects followed their usual meal plan and had an unannounced exercise session. Main Outcomes and Measures: Time in the glucose range 80-140 mg/dL was compared between the two arms. Results: Thirty-two subjects completed the protocol. Median time in CLC was 25.3 hours. The median time in the 80-140 mg/dL range was similar in both groups (39.7% control, 44.2% adjusted). Subjects in both arms of CLC showed minimal time spent below 70 mg/dL (median 1.34% and 1.37%, respectively). There were no significant differences in time spent above 140 mg/dL. Conclusions: A one-time algorithmic adjustment of open-loop settings did not alter glucose control in a relatively short-duration outpatient closed-loop study. The CLC system proved very robust and adaptable, with minimal (<2%) time spent in the hypoglycemic range in either arm. PMID:26204135

  11. Geostatistical analysis of the flood risk perception queries in the village of Navaluenga (Central Spain)

    NASA Astrophysics Data System (ADS)

    Guardiola-Albert, Carolina; Díez-Herrero, Andrés; Amérigo, María; García, Juan Antonio; María Bodoque, José; Fernández-Naranjo, Nuria

    2017-04-01

    response actions, such as designing optimal evacuation routes during flood emergencies. Geostatistical tools also provide a set of interpolation techniques for predicting the value of a variable at unsampled locations, based on the sampled point values and on other variables related to the measured variable. We test different geostatistical interpolation methods to obtain continuous surfaces of risk perception and level of awareness in the study area, and evaluate the use of these maps for future extensions and updates of the Civil Protection Plan. References: Bodoque, J. M., Amérigo, M., Díez-Herrero, A., García, J. A., Cortés, B., Ballesteros-Cánovas, J. A., & Olcina, J. (2016). Improvement of resilience of urban areas by integrating social perception in flash-flood risk management. Journal of Hydrology.

  12. Geostatistical analysis of soil properties at field scale using standardized data

    NASA Astrophysics Data System (ADS)

    Millan, H.; Tarquis, A. M.; Pérez, L. D.; Matos, J.; González-Posada, M.

    2012-04-01

    Identifying areas with physical degradation is a crucial step in ameliorating the effects of soil erosion. The quantification and interpretation of spatial variability is a key issue for site-specific soil management. Geostatistics has been the main methodological tool for implementing precision agriculture using field data collected at different spatial resolutions. Even though many works have made significant contributions to the body of knowledge on spatial statistics and its applications, some key points still need to be addressed for conducting precise comparisons between soil properties using geostatistical parameters. The objectives of the present work were (i) to quantify the spatial structure of different physical properties collected from a Vertisol, (ii) to search for potential correlations between different spatial patterns and (iii) to identify relevant components through multivariate spatial analysis. The study was conducted on a Vertisol (Typic Hapludert) dedicated to sugarcane (Saccharum officinarum L.) production during the last sixty years. We used six soil properties collected on a square grid of 225 points: penetrometer resistance (PR), total porosity, fragmentation dimension (Df), vertical electrical conductivity (ECv), horizontal electrical conductivity (ECh) and soil water content (WC). All the original data sets were z-transformed before geostatistical analysis. Three different types of semivariogram models were necessary for fitting the individual experimental semivariograms, which suggests different natures of the spatial variability patterns. Soil water content rendered the largest nugget effect (C0 = 0.933) while soil total porosity showed the largest range of spatial correlation (A = 43.92 m). The bivariate geostatistical analysis also rendered significant cross-semivariance between different paired soil properties. However, four different semivariogram models were required in that case. This indicates an underlying co
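    The preprocessing this abstract describes (z-transform the data, then compute experimental semivariograms) can be illustrated with a short sketch. The binning scheme and the lag tolerance below are illustrative choices, not the authors' settings:

    ```python
    import math

    def zscore(values):
        """Standardize to zero mean and unit variance (the z-transform)."""
        m = sum(values) / len(values)
        s = math.sqrt(sum((v - m) ** 2 for v in values) / len(values))
        return [(v - m) / s for v in values]

    def semivariogram(points, values, lags, tol):
        """Omnidirectional experimental semivariogram of standardized data:
        gamma(h) = mean of 0.5 * (z_i - z_j)^2 over pairs at distance ~h."""
        z = zscore(values)
        n = len(points)
        gamma = []
        for h in lags:
            acc, cnt = 0.0, 0
            for i in range(n):
                for j in range(i + 1, n):
                    if abs(math.dist(points[i], points[j]) - h) <= tol:
                        acc += 0.5 * (z[i] - z[j]) ** 2
                        cnt += 1
            gamma.append(acc / cnt if cnt else float('nan'))
        return gamma
    ```

    Standardizing first puts all six soil properties on a common variance scale, which is what makes their semivariogram parameters (nugget, sill, range) directly comparable.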

  13. Extending an operational meteorological monitoring network through machine learning and classical geo-statistical approaches

    NASA Astrophysics Data System (ADS)

    Appelhans, Tim; Mwangomo, Ephraim; Otte, Insa; Detsch, Florian; Nauss, Thomas; Hemp, Andreas; Ndyamkama, Jimmy

    2015-04-01

    This study introduces the set-up and characteristics of a meteorological station network on the southern slopes of Mt. Kilimanjaro, Tanzania. The set-up follows a hierarchical approach covering an elevational as well as a land-use disturbance gradient. The network consists of 52 basic stations measuring ambient air temperature and above-ground air humidity and 11 precipitation measurement sites. We provide in-depth descriptions of various machine learning and classical geo-statistical methods used to fill observation gaps and extend the spatial coverage of the network to a total of 60 research sites. Performance statistics for these methods indicate that the presented data sets provide reliable measurements of the meteorological reality at Mt. Kilimanjaro. These data provide an excellent basis for ecological studies and are also of great value for regional atmospheric numerical modelling studies, for which such comprehensive in-situ validation observations are rare, especially in tropical regions of complex terrain.

  14. Laparoscopic and robotic skills are transferable in a simulation setting: a randomized controlled trial.

    PubMed

    Thomaier, Lauren; Orlando, Megan; Abernethy, Melinda; Paka, Chandhana; Chen, Chi Chiung Grace

    2017-08-01

    Although surgical simulation provides an effective supplement to traditional training, it is not known whether skills are transferable between minimally invasive surgical modalities. The purpose of this study was to assess the transferability of skills between minimally invasive surgical simulation platforms among simulation-naïve participants. Forty simulation-naïve medical students were enrolled in this randomized single-blinded controlled trial. Participants completed a baseline evaluation on laparoscopic (Fundamentals of Laparoscopic Surgery Program, Los Angeles, CA) and robotic (dV-Trainer, Mimic, Seattle, WA) simulation peg transfer tasks. Participants were then randomized to perform a practice session on either the robotic (N = 20) or laparoscopic (N = 20) simulator. Two blinded, expert minimally invasive surgeons evaluated participants before and after training using a modified previously validated subjective global rating scale. Objective measures including time to task completion and Mimic dV-Trainer motion metrics were also recorded. At baseline, there were no significant differences between the training groups as measured by objective and subjective measures for either simulation task. After training, participants randomized to the laparoscopic practice group completed the laparoscopic task faster (p < 0.003) and with higher global rating scale scores (p < 0.001) than the robotic group. Robotic-trained participants performed the robotic task faster (p < 0.001), with improved economy of motion (p < 0.001), and with higher global rating scale scores (p = 0.006) than the laparoscopic group. The robotic practice group also demonstrated significantly improved performance on the laparoscopic task (p = 0.02). Laparoscopic-trained participants also improved their robotic performance (p = 0.02), though the robotic group had a higher percent improvement on the robotic task (p = 0.037). Skills acquired through practice on either laparoscopic

  15. BETTER HEALTH: Durham -- protocol for a cluster randomized trial of BETTER in community and public health settings.

    PubMed

    Paszat, Lawrence; Sutradhar, Rinku; O'Brien, Mary Ann; Lofters, Aisha; Pinto, Andrew; Selby, Peter; Baxter, Nancy; Donnelly, Peter D; Elliott, Regina; Glazier, Richard H; Kyle, Robert; Manca, Donna; Pietrusiak, Mary-Anne; Rabeneck, Linda; Sopcak, Nicolette; Tinmouth, Jill; Wall, Becky; Grunfeld, Eva

    2017-09-29

    The Building on Existing Tools to Improve Chronic Disease Prevention and Screening (BETTER) cluster randomized trial in primary care settings demonstrated a 30% improvement in adherence to evidence-based Chronic Disease Prevention and Screening (CDPS) activities. CDPS activities included healthy activities, lifestyle modifications, and screening tests. We present a protocol for the adaptation of BETTER to a public health setting, and testing the adaptation in a cluster randomized trial (BETTER HEALTH: Durham) among low income neighbourhoods in Durham Region, Ontario (Canada). The BETTER intervention consists of a personalized prevention visit between a participant and a prevention practitioner, which is focused on the participant's eligible CDPS activities, and uses Brief Action Planning, to empower the participant to set achievable short-term goals. Durham aims to establish that the BETTER intervention can be adapted and proven effective among 40-64 year old residents of low income areas when provided in the community by public health nurses trained as prevention practitioners. Focus groups and key informant interviews among stakeholders and eligible residents of low income areas will inform the adaptation, along with feedback from the trial's Community Advisory Committee. We have created a sampling frame of 16 clusters composed of census dissemination areas in the lowest urban quintile of median household income, and will sample 10 clusters to be randomly allocated to immediate intervention or six month wait list control. Accounting for the clustered design effect, the trial will have 80% power to detect an absolute 30% difference in the primary outcome, a composite score of completed eligible CDPS actions six months after enrollment. The prevention practitioner will attempt to link participants without a primary care provider (PCP) to a local PCP. The implementation of BETTER HEALTH: Durham will be evaluated by focus groups and key informant interviews. The

  16. Randomized Trial of Two Dissemination Strategies for a Skin Cancer Prevention Program in Aquatic Settings

    PubMed Central

    Escoffery, Cam; Elliott, Tom; Nehl, Eric J.

    2015-01-01

    Objectives. We compared 2 strategies for disseminating an evidence-based skin cancer prevention program. Methods. We evaluated the effects of 2 strategies (basic vs enhanced) for dissemination of the Pool Cool skin cancer prevention program in outdoor swimming pools on (1) program implementation, maintenance, and sustainability and (2) improvements in organizational and environmental supports for sun protection. The trial used a cluster-randomized design with pools as the unit of intervention and outcome. The enhanced group received extra incentives, reinforcement, feedback, and skill-building guidance. Surveys were collected in successive years (2003–2006) from managers of 435 pools in 33 metropolitan areas across the United States participating in the Pool Cool Diffusion Trial. Results. Both treatment groups improved their implementation of the program, but pools in the enhanced condition had significantly greater overall maintenance of the program over 3 summers of participation. Furthermore, pools in the enhanced condition established and maintained significantly greater sun-safety policies and supportive environments over time. Conclusions. This study found that more intensive, theory-driven dissemination strategies can significantly enhance program implementation and maintenance of health-promoting environmental and policy changes. Future research is warranted through longitudinal follow-up to examine sustainability. PMID:25521872

  17. Geostatistical regularization of inverse models for the retrieval of vegetation biophysical variables

    NASA Astrophysics Data System (ADS)

    Atzberger, C.; Richter, K.

    2009-09-01

    The robust and accurate retrieval of vegetation biophysical variables using radiative transfer models (RTM) is seriously hampered by the ill-posedness of the inverse problem. With this research we further develop our previously published (object-based) inversion approach [Atzberger (2004)]. The object-based RTM inversion takes advantage of the geostatistical fact that the biophysical characteristics of nearby pixels are generally more similar than those at a larger distance. A two-step inversion based on PROSPECT+SAIL generated look-up tables is presented that can be easily implemented and adapted to other radiative transfer models. The approach takes into account the spectral signatures of neighboring pixels and optimizes a common value of the average leaf angle (ALA) for all pixels of a given image object, such as an agricultural field. Using a large set of leaf area index (LAI) measurements (n = 58) acquired over six different crops of the Barrax test site (Spain), we demonstrate that the proposed geostatistical regularization yields in most cases more accurate and spatially consistent results than the traditional (pixel-based) inversion. Pros and cons of the approach are discussed and possible future extensions are presented.
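    The two-step, object-based look-up-table inversion can be sketched with a toy forward model standing in for PROSPECT+SAIL (the real model is far richer; `toy_rt` and every parameter value here are invented purely for illustration). Step 1 fixes a single ALA for the whole object by minimizing the summed spectral misfit over its pixels; step 2 retrieves per-pixel LAI with that ALA held fixed:

    ```python
    import math

    def toy_rt(lai, ala):
        """Hypothetical two-band stand-in for a PROSPECT+SAIL forward run."""
        red = 0.30 * math.exp(-0.4 * lai) + 0.002 * ala
        nir = 0.55 * (1.0 - math.exp(-0.5 * lai)) - 0.001 * ala
        return (red, nir)

    def build_lut(lais, alas):
        """Look-up table of (LAI, ALA, simulated spectrum) entries."""
        return [(lai, ala, toy_rt(lai, ala)) for lai in lais for ala in alas]

    def rmse(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

    def invert_object(pixels, lut, alas):
        """Step 1: choose one ALA for the whole object (sum of per-pixel best
        misfits); step 2: retrieve per-pixel LAI with that ALA fixed."""
        best_ala = min(alas, key=lambda ala: sum(
            min(rmse(spec, px) for _, a, spec in lut if a == ala)
            for px in pixels))
        lai_est = [min((e for e in lut if e[1] == best_ala),
                       key=lambda e: rmse(e[2], px))[0] for px in pixels]
        return best_ala, lai_est
    ```

    Forcing one ALA per object is the regularization: it removes a poorly constrained degree of freedom from each per-pixel inversion, which is why the retrieved LAI fields come out more spatially consistent.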

  18. Optimal design of hydraulic head monitoring networks using space-time geostatistics

    NASA Astrophysics Data System (ADS)

    Herrera, G. S.; Júnez-Ferreira, H. E.

    2013-05-01

    This paper presents a new methodology for the optimal design of space-time hydraulic head monitoring networks and its application to the Valle de Querétaro aquifer in Mexico. The selection of the space-time monitoring points is done using a static Kalman filter combined with a sequential optimization method. The Kalman filter requires as input a space-time covariance matrix, which is derived from a geostatistical analysis. At each step, the sequential optimization method selects the space-time point that minimizes a function of the variance. We demonstrate the methodology by applying it to the redesign of the hydraulic head monitoring network of the Valle de Querétaro aquifer with the objective of selecting, from a set of monitoring positions and times, those that minimize the spatiotemporal redundancy. The database for the geostatistical space-time analysis corresponds to information from 273 wells located within the aquifer for the period 1970-2007. A total of 1,435 hydraulic head data were used to construct the experimental space-time variogram. The results show that of the existing monitoring program, which consists of 418 space-time monitoring points, only 178 are not redundant. The implied reduction of monitoring costs was possible because the proposed method is successful in propagating information in space and time.
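    The combination of a static Kalman filter with sequential optimization can be sketched as a greedy loop: starting from a prior spatial covariance, each step adds the candidate measurement whose scalar Kalman update most reduces the total variance (trace of the covariance matrix). This is an illustrative sketch under assumed covariance and noise parameters, not the authors' implementation:

    ```python
    import math

    def cov(d, sill=1.0, rng_len=15.0):
        """Exponential spatial covariance used to build the prior."""
        return sill * math.exp(-3.0 * d / rng_len)

    def greedy_network(candidates, n_select, noise=0.05):
        """Greedy monitoring-point selection with a static Kalman filter:
        each step commits the scalar update P <- P - p_j p_j' / (P[j][j] + r)
        for the candidate j that minimizes the resulting trace of P."""
        m = len(candidates)
        P = [[cov(math.dist(a, b)) for b in candidates] for a in candidates]
        chosen = []
        for _ in range(n_select):
            best, best_trace = None, None
            for j in range(m):
                if j in chosen:
                    continue
                denom = P[j][j] + noise
                trace = sum(P[i][i] - P[i][j] ** 2 / denom for i in range(m))
                if best_trace is None or trace < best_trace:
                    best, best_trace = j, trace
            denom = P[best][best] + noise
            col = [P[i][best] for i in range(m)]
            P = [[P[i][k] - col[i] * col[k] / denom for k in range(m)]
                 for i in range(m)]
            chosen.append(best)
        return chosen, sum(P[i][i] for i in range(m))
    ```

    A redundant candidate (one highly correlated with an already-chosen point) barely lowers the trace, so the greedy loop naturally skips it; this is the mechanism behind the 418-to-178 reduction reported above.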

  19. Geological, geomechanical and geostatistical assessment of rockfall hazard in San Quirico Village (Abruzzo, Italy)

    NASA Astrophysics Data System (ADS)

    Chiessi, Vittorio; D'Orefice, Maurizio; Scarascia Mugnozza, Gabriele; Vitale, Valerio; Cannese, Christian

    2010-07-01

    This paper describes the results of a rockfall hazard assessment for the village of San Quirico (Abruzzo region, Italy) based on an engineering-geological model. After the collection of geological, geomechanical, and geomorphological data, the rockfall hazard assessment was performed using two separate approaches: i) simulation of the detachment of rock blocks and their downhill movement using a GIS; and ii) application of geostatistical techniques to the analysis of georeferenced observations of previously fallen blocks, in order to assess the probability of arrival of blocks from potential future collapses. The results show that the trajectographic analysis is significantly influenced by the input parameters, with particular reference to the coefficients of restitution; to address this, the model was calibrated against repeated field observations. The geostatistical approach is useful because it gives the best estimation of point-source phenomena such as rockfalls; however, the sensitivity of the results to basic assumptions, e.g. the assessment of variograms and the choice of a threshold value, may be problematic. Consequently, interpolations derived from different variograms were computed and compared, and those showing the lowest errors were adopted. The data sets that were statistically analysed concern both kinetic energy and surveyed rock blocks in the accumulation area. The obtained maps highlight areas susceptible to rock block arrivals, and show that the area accommodating the new settlement of S. Quirico Village has the highest level of hazard according to both probabilistic and deterministic methods.

  20. Comparing the performance of geostatistical models with additional information from covariates for sewage plume characterization.

    PubMed

    Del Monego, Maurici; Ribeiro, Paulo Justiniano; Ramos, Patrícia

    2015-04-01

    In this work, kriging with covariates is used to model and map the spatial distribution of salinity measurements gathered by an autonomous underwater vehicle in a sea outfall monitoring campaign, aiming to distinguish the effluent plume from the receiving waters and characterize its spatial variability in the vicinity of the discharge. Four different geostatistical linear models for salinity were assumed, where the distance to the diffuser, the west-east positioning, and the south-north positioning were used as covariates. Sample variograms were fitted with Matérn models using weighted least squares and maximum likelihood estimation methods as a way to detect eventual discrepancies. Typically, the maximum likelihood method estimated very low ranges, which limited the kriging process. So, at least for these data sets, weighted least squares proved to be the more appropriate estimation method for variogram fitting. The kriged maps clearly show the spatial variation of salinity, and it is possible to identify the effluent plume in the area studied. The results obtained offer some guidelines for sewage monitoring when a geostatistical analysis of the data is intended. It is important to treat properly the existence of anomalous values and to adopt a sampling strategy that includes transects parallel and perpendicular to the effluent dispersion.
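    Weighted-least-squares variogram fitting of the kind compared in this study can be sketched for the Matérn model with smoothness 3/2, which has a closed form requiring no Bessel functions. The grid-search minimizer and the pair-count weights below are illustrative simplifications (practical fits use continuous optimization and Cressie-style weights):

    ```python
    import math

    def matern32(h, sill, rang):
        """Matérn semivariogram with smoothness 3/2 (closed form, no Bessel)."""
        if h == 0:
            return 0.0
        a = math.sqrt(3.0) * h / rang
        return sill * (1.0 - (1.0 + a) * math.exp(-a))

    def wls_fit(lags, gammas, npairs, sills, ranges):
        """Grid-search weighted least squares: each lag is weighted by the
        number of point pairs that contributed to its sample semivariance."""
        def score(s, r):
            return sum(n * (g - matern32(h, s, r)) ** 2
                       for h, g, n in zip(lags, gammas, npairs))
        return min(((s, r) for s in sills for r in ranges),
                   key=lambda p: score(*p))
    ```

    Because the weights favor well-populated short lags, WLS is less prone than plain least squares to the very-low-range estimates that, per the abstract, hampered the maximum likelihood fits here.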

  1. Analysis of "Randomized Cross-Over Study Comparing Two Infusion Sets for CSII in Daily Life".

    PubMed

    Renard, Eric

    2017-03-01

    In an article in Journal of Diabetes Science and Technology, Freckmann et al report an evaluation of 2 infusion sets used for CSII in daily life, both from Roche. Because the 2 catheters differed at the soft-cannula level in terms of geometry and introducer needle, the focus was on pain at insertion. While no significant difference was noted on this primary endpoint, unplanned catheter changes occurred in close to 20% of cases with both catheter models, mainly driven by apparent insulin delivery issues. Since kinked cannulas were infrequent, there is a clear need for a better understanding of "unexplained hyperglycemia" as the main reason for reduced catheter lifetime. Fixing this weak aspect of CSII becomes crucial as we move toward closed-loop insulin delivery.

  2. The efficacy of early language intervention in mainstream school settings: a randomized controlled trial.

    PubMed

    Fricke, Silke; Burgoyne, Kelly; Bowyer-Crane, Claudine; Kyriacou, Maria; Zosimidou, Alexandra; Maxwell, Liam; Lervåg, Arne; Snowling, Margaret J; Hulme, Charles

    2017-10-01

    Oral language skills are a critical foundation for literacy and more generally for educational success. The current study shows that oral language skills can be improved by providing suitable additional help to children with language difficulties in the early stages of formal education. We conducted a randomized controlled trial with 394 children in England, comparing a 30-week oral language intervention programme starting in nursery (N = 132) with a 20-week version of the same programme starting in Reception (N = 133). The intervention groups were compared to an untreated waiting control group (N = 129). The programmes were delivered by trained teaching assistants (TAs) working in the children's schools/nurseries. All testers were blind to group allocation. Both the 20- and 30-week programmes produced improvements on primary outcome measures of oral language skill compared to the untreated control group. Effect sizes were small to moderate (20-week programme: d = .21; 30-week programme: d = .30) immediately following the intervention and were maintained at follow-up 6 months later. The difference in improvement between the 20-week and 30-week programmes was not statistically significant. Neither programme produced statistically significant improvements in children's early word reading or reading comprehension skills (secondary outcome measures). This study provides further evidence that oral language interventions can be delivered successfully by trained TAs to children with oral language difficulties in nursery and Reception classes. The methods evaluated have potentially important policy implications for early education. © 2017 Association for Child and Adolescent Mental Health.

  3. Probiotics reduce symptoms of antibiotic use in a hospital setting: a randomized dose response study.

    PubMed

    Ouwehand, Arthur C; DongLian, Cai; Weijian, Xu; Stewart, Morgan; Ni, Jiayi; Stewart, Tad; Miller, Larry E

    2014-01-16

    Probiotics are known to reduce antibiotic associated diarrhea (AAD) and Clostridium difficile associated diarrhea (CDAD) risk in a strain-specific manner. The aim of this study was to determine the dose-response effect of a four strain probiotic combination (HOWARU® Restore) on the incidence of AAD and CDAD and severity of gastrointestinal symptoms in adult in-patients requiring antibiotic therapy. Patients (n=503) were randomized among three study groups: HOWARU® Restore probiotic 1.70×10^10 CFU (high-dose, n=168), HOWARU® Restore probiotic 4.17×10^9 CFU (low-dose, n=168), or placebo (n=167). Subjects were stratified by gender, age, and duration of antibiotic treatment. Study products were administered daily up to 7 days after the final antibiotic dose. The primary endpoint of the study was the incidence of AAD. Secondary endpoints included incidence of CDAD, diarrhea duration, stools per day, bloody stools, fever, abdominal cramping, and bloating. A significant dose-response effect on AAD was observed with incidences of 12.5, 19.6, and 24.6% with high-dose, low-dose, and placebo, respectively (p=0.02). CDAD was the same in both probiotic groups (1.8%) but different from the placebo group (4.8%; p=0.04). Incidences of fever, abdominal pain, and bloating were lower with increasing probiotic dose. The number of daily liquid stools and average duration of diarrhea decreased with higher probiotic dosage. The tested four strain probiotic combination appears to lower the risk of AAD, CDAD, and gastrointestinal symptoms in a dose-dependent manner in adult in-patients.

  4. Collaborative care for depression symptoms in an outpatient cardiology setting: A randomized clinical trial.

    PubMed

    Carney, Robert M; Freedland, Kenneth E; Steinmeyer, Brian C; Rubin, Eugene H; Ewald, Gregory

    2016-09-15

    Depression is a risk factor for morbidity and mortality in patients with coronary heart disease. Finding effective methods for identifying and treating depression in these patients is a high priority. The purpose of this study was to determine whether collaborative care (CC) for patients who screen positive for depression during an outpatient cardiology visit results in greater improvement in depression symptoms and better medical outcomes than seen in patients who screen positive for depression but receive only usual care (UC). Two hundred one patients seen in an outpatient cardiology clinic who screened positive for depression during an outpatient visit were randomized to receive either CC or UC. Recommendations for depression treatment and ongoing support and monitoring of depression symptoms were provided to CC patients and their primary care physicians (PCPs) for up to 6 months. There were no differences between the arms in mean Beck Depression Inventory-II scores (CC, 15.9; UC, 17.4; p=.45) or in depression remission rates (CC, 32.5%; UC, 26.2%; p=0.34) after 6 months, or in the number of hospitalizations after 12 months (p=0.73). There were fewer deaths among the CC (1/100) than UC patients (8/101) (p=0.03). This trial did not show that CC produces better depression outcomes than UC. Screening led to a higher rate of depression treatment than was expected in the UC group, and delays in obtaining depression treatment from PCPs may have reduced treatment effectiveness for the CC patients. A different strategy for depression treatment following screening in outpatient cardiology services is needed. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  5. Preoperative Nutrition and Postoperative Discomfort in an ERAS Setting: A Randomized Study in Gastric Bypass Surgery.

    PubMed

    Karlsson, A; Wendel, K; Polits, S; Gislason, H; Hedenbro, J L

    2016-04-01

    Many patients experience postoperative nausea and vomiting (PONV). Preoperative treatment with carbohydrate solutions seems to improve the course after different types of surgery. This study was undertaken to investigate the potential value of different models for preoperative hydration/nutrition, in addition to our ERAS (enhanced recovery after surgery) protocol. Ninety non-diabetic women planned for elective laparoscopic gastric bypass and aged 18-65 years were included. All were on a preoperative low-calorie diet (LCD). They were randomized into three arms (a carbohydrate-rich drink, a protein-enriched drink, or tap water) and instructed to drink 800 mL and 200 mL at 16 and 2 h, respectively, prior to operation. Risk factors for PONV were recorded preoperatively. All patients were operated before lunch and received 1500-2000 mL of Ringer-Acetate solution during the 24-30-h postoperative hospital time. Four variables (nausea, pain, tiredness, and headache) were registered on 100-mm visual analog scales six times over 22 h. The need for additional medication was registered. Out of 90 patients, 73 complete datasets were obtained. Nausea peaked at 7 p.m. but with no statistically significant differences between groups for any of the variables. Pain peaked the first 2 h postoperatively, remained longer, and had not returned to baseline values at 6 a.m. the morning after surgery, but with no difference between groups. Within our ERAS protocol, additional preoperative carbohydrate- or protein-enriched fluid treatment did not further reduce immediate patient discomfort in laparoscopic gastric bypass surgery.

  6. Relevant Feature Set Estimation with a Knock-out Strategy and Random Forests

    PubMed Central

    Ganz, Melanie; Greve, Douglas N.; Fischl, Bruce; Konukoglu, Ender

    2015-01-01

    Group analysis of neuroimaging data is a vital tool for identifying anatomical and functional variations related to diseases as well as normal biological processes. The analyses are often performed on a large number of highly correlated measurements using a relatively smaller number of samples. Despite the correlation structure, the most widely used approach is to analyze the data using univariate methods followed by post-hoc corrections that try to account for the data’s multivariate nature. Although widely used, this approach may fail to recover from the adverse effects of the initial analysis when local effects are not strong. Multivariate pattern analysis (MVPA) is a powerful alternative to the univariate approach for identifying relevant variations. Jointly analyzing all the measures, MVPA techniques can detect global effects even when individual local effects are too weak to detect with univariate analysis. Current approaches are successful in identifying variations that yield highly predictive and compact models. However, they suffer from lessened sensitivity and instabilities in identification of relevant variations. Furthermore, current methods’ user-defined parameters are often unintuitive and difficult to determine. In this article, we propose a novel MVPA method for group analysis of high-dimensional data that overcomes the drawbacks of the current techniques. Our approach explicitly aims to identify all relevant variations using a “knock-out” strategy and the Random Forest algorithm. In evaluations with synthetic datasets the proposed method achieved substantially higher sensitivity and accuracy than the state-of-the-art MVPA methods, and outperformed the univariate approach when the effect size is low. In experiments with real datasets the proposed method identified regions beyond the univariate approach, while other MVPA methods failed to replicate the univariate results. More importantly, in a reproducibility study with the well-known ADNI

  7. Relevant feature set estimation with a knock-out strategy and random forests.

    PubMed

    Ganz, Melanie; Greve, Douglas N; Fischl, Bruce; Konukoglu, Ender

    2015-11-15

    Group analysis of neuroimaging data is a vital tool for identifying anatomical and functional variations related to diseases as well as normal biological processes. The analyses are often performed on a large number of highly correlated measurements using a relatively smaller number of samples. Despite the correlation structure, the most widely used approach is to analyze the data using univariate methods followed by post-hoc corrections that try to account for the data's multivariate nature. Although widely used, this approach may fail to recover from the adverse effects of the initial analysis when local effects are not strong. Multivariate pattern analysis (MVPA) is a powerful alternative to the univariate approach for identifying relevant variations. Jointly analyzing all the measures, MVPA techniques can detect global effects even when individual local effects are too weak to detect with univariate analysis. Current approaches are successful in identifying variations that yield highly predictive and compact models. However, they suffer from lessened sensitivity and instabilities in identification of relevant variations. Furthermore, current methods' user-defined parameters are often unintuitive and difficult to determine. In this article, we propose a novel MVPA method for group analysis of high-dimensional data that overcomes the drawbacks of the current techniques. Our approach explicitly aims to identify all relevant variations using a "knock-out" strategy and the Random Forest algorithm. In evaluations with synthetic datasets the proposed method achieved substantially higher sensitivity and accuracy than the state-of-the-art MVPA methods, and outperformed the univariate approach when the effect size is low. In experiments with real datasets the proposed method identified regions beyond the univariate approach, while other MVPA methods failed to replicate the univariate results. 
More importantly, in a reproducibility study with the well-known ADNI dataset
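The knock-out idea described above can be sketched generically: fit a model, record the most important feature, remove ("knock out") that feature, and refit on the remainder, repeating until nothing relevant is left. A minimal sketch under stated assumptions: `importance_fn` is a hypothetical stand-in for a Random Forest importance estimate, and the threshold and round limit are illustrative, not the paper's parameters.

```python
def knockout_select(features, importance_fn, threshold=0.0, max_rounds=10):
    """Greedy knock-out feature selection sketch.

    importance_fn(feature_list) -> {feature: importance} stands in for
    refitting a Random Forest on the remaining features each round.
    """
    remaining = list(features)
    relevant = []
    for _ in range(max_rounds):
        if not remaining:
            break
        scores = importance_fn(remaining)
        best = max(remaining, key=lambda f: scores[f])
        if scores[best] <= threshold:
            break  # no remaining feature looks relevant; stop
        relevant.append(best)
        remaining.remove(best)  # knock it out and refit on the rest
    return relevant
```

The point of the repeated refits is that features masked by a stronger, correlated feature get a chance to surface once the stronger one is removed.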

  8. [Geostatistical analysis on distribution pattern of the tobacco budworm larva in Enshi, Hubei, China].

    PubMed

    Xia, Peng-Liang; Wang, Rui; Tan, Jun

    2014-03-01

    Tobacco budworm (Helicoverpa assulta) larvae feed on tobacco leaves (Nicotiana sp.), resulting in significant loss in tobacco production. A geostatistical method was used to analyze H. assulta spatial patterns and dynamics in this paper. The results showed that H. assulta larvae appeared 40 days after the tobacco plants were transplanted, and reached their peak at the early-mature period. The nested spherical and exponential model was the major model for tobacco budworm larvae in the field, suggesting an aggregated distribution. The spatial variability C/(C0 + C) was larger than 0.75, which indicated that H. assulta larvae had wider structural variation and narrower random variation. There was a massive migration of tobacco budworm larvae in the fast-growing stage of tobacco. Their numbers became stable after that, especially at the mature stage of tobacco.

  9. Randomized Controlled Trial in Clinical Settings to Evaluate Effectiveness of Coping Skills Education Used With Progressive Tinnitus Management.

    PubMed

    Henry, James A; Thielman, Emily J; Zaugg, Tara L; Kaelin, Christine; Schmidt, Caroline J; Griest, Susan; McMillan, Garnett P; Myers, Paula; Rivera, Izel; Baldwin, Robert; Carlson, Kathleen

    2017-05-24

    This randomized controlled trial evaluated, within clinical settings, the effectiveness of coping skills education that is provided with progressive tinnitus management (PTM). At 2 Veterans Affairs medical centers, N = 300 veterans were randomized to either PTM intervention or 6-month wait-list control. The PTM intervention involved 5 group workshops: 2 led by an audiologist (teaching how to use sound as therapy) and 3 by a psychologist (teaching coping skills derived from cognitive behavioral therapy). It was hypothesized that PTM would be more effective than wait-list control in reducing functional effects of tinnitus and that there would be no differences in effectiveness between sites. At both sites, a statistically significant improvement in mean Tinnitus Functional Index scores was seen at 6 months for the PTM group. Combined data across sites revealed a statistically significant improvement in Tinnitus Functional Index relative to wait-list control. The effect size for PTM using the Tinnitus Functional Index was 0.36 (small). Results suggest that PTM is effective at reducing tinnitus-related functional distress in clinical settings. Although effect sizes were small, they provide evidence of clinical effectiveness of PTM in the absence of stringent research-related inclusion criteria and with a relatively small number of sessions of cognitive behavioral therapy.

  10. Effects of gut-directed hypnotherapy on IBS in different clinical settings-results from two randomized, controlled trials.

    PubMed

    Lindfors, Perjohan; Unge, Peter; Arvidsson, Patrik; Nyhlin, Henry; Björnsson, Einar; Abrahamsson, Hasse; Simrén, Magnus

    2012-02-01

    Gut-directed hypnotherapy has been found to be effective in irritable bowel syndrome (IBS). However, randomized, controlled studies are rare and few have been performed outside highly specialized research centers. The objective of this study was to study the effect of gut-directed hypnotherapy in IBS in different clinical settings outside the traditional research units. The study population included IBS patients refractory to standard management. In study 1, patients were randomized to receive gut-directed hypnotherapy (12 sessions, 1 h/week) in psychology private practices or supportive therapy, whereas in study 2 patients were randomized to receive gut-directed hypnotherapy in a small county hospital or to serve as waiting list controls. Gastrointestinal symptom severity and quality of life were evaluated at baseline, at 3 months follow-up and after 1 year. We randomized 138 IBS patients refractory to standard management, 90 in study 1 and 48 in study 2. In both studies, IBS-related symptoms were improved at 3 months in the gut-directed hypnotherapy groups (P<0.05), but not in the control groups (ns). In study 1, a significantly greater improvement of IBS-related symptom severity could be detected in the gut-directed hypnotherapy group than in the control group (P<0.05), and a trend in the same direction was seen in study 2 (P=0.17). The results seen at 3 months were sustained up to 1 year. Gut-directed hypnotherapy is an effective treatment alternative for patients with refractory IBS, but the effectiveness is lower when the therapy is given outside the highly specialized research centers.

  11. Assessing the resolution-dependent utility of tomograms for geostatistics

    USGS Publications Warehouse

    Day-Lewis, F. D.; Lane, J.W.

    2004-01-01

    Geophysical tomograms are used increasingly as auxiliary data for geostatistical modeling of aquifer and reservoir properties. The correlation between tomographic estimates and hydrogeologic properties is commonly based on laboratory measurements, co-located measurements at boreholes, or petrophysical models. The inferred correlation is assumed uniform throughout the interwell region; however, tomographic resolution varies spatially due to acquisition geometry, regularization, data error, and the physics underlying the geophysical measurements. Blurring and inversion artifacts are expected in regions traversed by few or only low-angle raypaths. In the context of radar traveltime tomography, we derive analytical models for (1) the variance of tomographic estimates, (2) the spatially variable correlation with a hydrologic parameter of interest, and (3) the spatial covariance of tomographic estimates. Synthetic examples demonstrate that tomograms of qualitative value may have limited utility for geostatistics; moreover, the imprint of regularization may preclude inference of meaningful spatial statistics from tomograms.
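Inferring spatial statistics from a tomogram starts from the experimental semivariogram, gamma(h) = (1 / (2 N(h))) * sum of (z[i+h] - z[i])^2 over the N(h) pairs separated by lag h. A minimal sketch for regularly spaced 1-D estimates (the data values in the usage are illustrative, not from the paper):

```python
def semivariogram(z, max_lag):
    """Experimental semivariogram for a regularly spaced 1-D transect.

    Returns {lag: gamma(lag)} for lags 1..max_lag, where gamma(h) is
    half the mean squared difference of values h cells apart.
    """
    gamma = {}
    for h in range(1, max_lag + 1):
        diffs = [(z[i + h] - z[i]) ** 2 for i in range(len(z) - h)]
        gamma[h] = sum(diffs) / (2 * len(diffs))
    return gamma
```

The paper's caution applies here: if `z` comes from a regularized tomogram, the shape of this curve partly reflects the inversion's smoothing, not only the true spatial structure.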

  12. Hydrogeologic Unit Flow Characterization Using Transition Probability Geostatistics

    SciTech Connect

    Jones, N L; Walker, J R; Carle, S F

    2003-11-21

    This paper describes a technique for applying the transition probability geostatistics method for stochastic simulation to a MODFLOW model. Transition probability geostatistics has several advantages over traditional indicator kriging methods, including a simpler and more intuitive framework for interpreting geologic relationships and the ability to simulate juxtapositional tendencies such as fining-upwards sequences. The indicator arrays generated by the transition probability simulation are converted to layer elevation and thickness arrays for use with the new Hydrogeologic Unit Flow (HUF) package in MODFLOW 2000. This makes it possible to preserve complex heterogeneity while using reasonably sized grids. An application of the technique involving probabilistic capture zone delineation for the Aberjona Aquifer in Woburn, MA, is included.
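At its core, transition probability geostatistics replaces the variogram with a matrix of facies transition probabilities, which directly encodes juxtapositional tendencies (e.g. which facies tend to overlie which). A minimal 1-D sketch of simulating a vertical facies column from such a matrix; the facies names and probabilities below are illustrative assumptions, not values from the Aberjona application:

```python
import random

FACIES = ["sand", "silt", "clay"]
# P(next facies | current facies) per vertical step; each row sums to 1.
T = {
    "sand": {"sand": 0.70, "silt": 0.20, "clay": 0.10},
    "silt": {"sand": 0.25, "silt": 0.50, "clay": 0.25},
    "clay": {"sand": 0.10, "silt": 0.30, "clay": 0.60},
}

def simulate_column(n_cells, start="sand", seed=0):
    """Draw one realization of a facies column as a Markov chain."""
    rng = random.Random(seed)
    col = [start]
    for _ in range(n_cells - 1):
        u, cum = rng.random(), 0.0
        for f in FACIES:  # invert the cumulative transition probabilities
            cum += T[col[-1]][f]
            if u <= cum:
                col.append(f)
                break
    return col
```

Full transition probability simulation (as in T-PROGS-style codes) extends this idea to three directions and conditions on borehole data, but the row-stochastic matrix is the same building block.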

  13. Three-dimensional ERT imaging by the geostatistical approach

    NASA Astrophysics Data System (ADS)

    Kitanidis, P. K.; Lee, J. H.

    2015-12-01

    Electric resistivity tomography (ERT), with observations made at the surface or in boreholes, is a method of imaging the subsurface with many potential applications in areas that include hydrology and environmental engineering. The estimation of the resistivity function from observations is a classic inverse problem. One method to solve this problem is the Geostatistical Approach (GA), which is a stochastic method that allows one to explore the range of possible solutions. GA is an objective and empirical Bayes method. The emphasis of this talk is on methods to reduce the computational cost of implementing this approach. We will show examples of application of the Principal Component Geostatistical Approach (PCGA). PCGA is Jacobian-free and uses forward solvers as black boxes. It utilizes the leading principal components from the prior covariance to obtain a good approximation of the solution at a fraction of the cost.

  14. Stochastic Local Interaction (SLI) model: Bridging machine learning and geostatistics

    NASA Astrophysics Data System (ADS)

    Hristopulos, Dionissios T.

    2015-12-01

    Machine learning and geostatistics are powerful mathematical frameworks for modeling spatial data. Both approaches, however, suffer from poor scaling of the required computational resources for large data applications. We present the Stochastic Local Interaction (SLI) model, which employs a local representation to improve computational efficiency. SLI combines geostatistics and machine learning with ideas from statistical physics and computational geometry. It is based on a joint probability density function defined by an energy functional which involves local interactions implemented by means of kernel functions with adaptive local kernel bandwidths. SLI is expressed in terms of an explicit, typically sparse, precision (inverse covariance) matrix. This representation leads to a semi-analytical expression for interpolation (prediction), which is valid in any number of dimensions and avoids the computationally costly covariance matrix inversion.
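The sparse precision-matrix representation is what makes prediction semi-analytical: for a zero-mean Gaussian field with precision matrix J, the conditional mean at site p given the remaining sites is E[x_p | x_others] = -(1/J[p][p]) * sum over j != p of J[p][j] * x[j], so no covariance matrix inversion is required. A minimal sketch of that identity with illustrative numbers (this is the generic Gaussian mechanism, not the SLI energy functional itself):

```python
def predict_site(J, x, p):
    """Conditional mean at site p for a zero-mean Gaussian with precision J.

    J is a dense list-of-lists here for clarity; in SLI-style models it
    would be sparse, so the sum runs only over p's local neighbors.
    """
    s = sum(J[p][j] * x[j] for j in range(len(x)) if j != p)
    return -s / J[p][p]
```

Because each row of a sparse J has only a few nonzeros, this prediction costs O(neighbors) per site instead of the O(n^3) of a covariance inversion.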

  15. Geostatistical joint inversion of seismic and potential field methods

    NASA Astrophysics Data System (ADS)

    Shamsipour, Pejman; Chouteau, Michel; Giroux, Bernard

    2016-04-01

    Interpretation of geophysical data needs to integrate different types of information to make the proposed model geologically realistic. Multiple data sets can reduce the uncertainty and non-uniqueness present in separate geophysical data inversions. Seismic data can play an important role in mineral exploration; however, processing and interpretation of seismic data are difficult due to the complexity of hard-rock geology. On the other hand, the recovered model from potential field methods is affected by inherent non-uniqueness caused by the nature of the physics and by underdetermination of the problem. Joint inversion of seismic and potential field data can mitigate the weaknesses of separate inversions of these methods. A stochastic joint inversion method based on geostatistical techniques is applied to estimate density and velocity distributions from gravity and travel time data. The method fully integrates the physical relations between density and gravity, on one hand, and slowness and travel time, on the other. As a consequence, when the data are considered noise-free, the responses from the inverted slowness and density data exactly reproduce the observed data. The required density and velocity auto- and cross-covariances are assumed to follow a linear model of coregionalization (LCM). Recently developed nonlinear models of coregionalization could also be applied if needed. The kernel function for the gravity method is obtained from a closed-form formulation. For ray tracing, we use the shortest-path method (SPM) to calculate the operation matrix. The joint inversion is performed on a structured grid; however, it is possible to extend it to unstructured grids. The method is tested on two synthetic models: a model consisting of two objects buried in a homogeneous background and a model with stochastic distribution of parameters.
The results illustrate the capability of the method to improve the inverted model compared to the separate inverted models with either gravity

  16. Geostatistics for natural resources characterization. Part 1 and part 2

    SciTech Connect

    Verly, G.

    1984-01-01

    This collection of 33 research papers (in two volumes) bears witness to the fact that applications of geostatistics are no longer limited to the mining industry. Reports are given from fields such as hydrology, soil sciences, pollution control, and geotechnical engineering. Contents, abridged: (pt. 1) Variogram. Kriging. Recoverable reserves. Spectral analysis and data analysis. (pt. 2) Applications in the petroleum industry and automatic contouring. Applications in hydrogeology and geochemical exploration. Case studies in ore reserve estimation. Simulation. Index.

  17. Peaks and dips in Gaussian random fields: a new algorithm for the shear eigenvalues, and the excursion set theory

    NASA Astrophysics Data System (ADS)

    Rossi, Graziano

    2013-04-01

    We present a new algorithm to sample the constrained eigenvalues of the initial shear field associated with Gaussian statistics, called the `peak/dip excursion-set-based' algorithm, at positions which correspond to peaks or dips of the correlated density field. The computational procedure is based on a new formula which extends Doroshkevich's unconditional distribution for the eigenvalues of the linear tidal field, to account for the fact that haloes and voids may correspond to maxima or minima of the density field. The ability to differentiate between random positions and special points in space around which haloes or voids may form (i.e. peaks/dips), encoded in the new formula and reflected in the algorithm, naturally leads to a straightforward implementation of an excursion set model for peaks and dips in Gaussian random fields - one of the key advantages of this sampling procedure. In addition, it offers novel insights into the statistical description of the cosmic web. As a first physical application, we show how the standard distributions of shear ellipticity and prolateness in triaxial models of structure formation are modified by the constraint. In particular, we provide a new expression for the conditional distribution of shape parameters given the density peak constraint, which generalizes some previous literature work. The formula has important implications for the modelling of non-spherical dark matter halo shapes, in relation to their initial shape distribution. We also test and confirm our theoretical predictions for the individual distributions of eigenvalues subjected to the extremum constraint, along with other directly related conditional probabilities. Finally, we indicate how the proposed sampling procedure naturally integrates into the standard excursion set model, potentially solving some of its well-known problems, and into the ellipsoidal collapse framework. Several other ongoing applications and extensions, towards the development of

  18. Addressing uncertainty in rock properties through geostatistical simulation

    SciTech Connect

    McKenna, S.A.; Rautman, A.; Cromer, M.V.; Zelinski, W.P.

    1996-09-01

    Fracture and matrix properties in a sequence of unsaturated, welded tuffs at Yucca Mountain, Nevada, are modeled in two-dimensional cross-sections through geostatistical simulation. In the absence of large amounts of sample data, an interpretive, deterministic stratigraphic model is coupled with a Gaussian simulation algorithm to constrain realizations of both matrix porosity and fracture frequency. Use of the deterministic stratigraphic model imposes scientific judgment, in the form of a conceptual geologic model, onto the property realizations. Linear coregionalization and a regression relationship between matrix porosity and matrix hydraulic conductivity are used to generate realizations of matrix hydraulic conductivity. Fracture-frequency simulations conditioned on the stratigraphic model represent one class of fractures (cooling fractures) in the conceptual model of the geology. A second class of fractures (tectonic fractures) is conceptualized as fractures that cut across strata vertically and includes discrete features such as fault zones. Indicator geostatistical simulation provides locations of this second class of fractures. The indicator realizations are combined with the realizations of fracture spacing to create realizations of fracture frequency that are a combination of both classes of fractures. Evaluations of the resulting realizations include comparing vertical profiles of rock properties within the model to those observed in boreholes and checking intra-unit property distributions against collected data. Geostatistical simulation provides an efficient means of addressing spatial uncertainty in dual-continuum rock properties.

  19. Breast carcinoma, intratumour heterogeneity and histological grading, using geostatistics.

    PubMed

    Sharifi-Salamatian, V; de Roquancourt, A; Rigaut, J P

    2000-01-01

    Tumour progression is currently believed to result from genetic instability. Chromosomal patterns specific of a type of cancer are frequent even though phenotypic spatial heterogeneity is omnipresent. The latter is the usual cause of histological grading imprecision, a well documented problem, without any fully satisfactory solution up to now. The present article addresses this problem in breast carcinoma. The assessment of a genetic marker for human tumours requires quantifiable measures of intratumoral heterogeneity. If any invariance paradigm representing a stochastic or geostatistic function could be discovered, this might help in solving the grading problem. A novel methodological approach using geostatistics to measure heterogeneity is used. Twenty tumours from the three usual (Scarff-Bloom and Richardson) grades were obtained and paraffin sections stained by MIB-1 (Ki-67) and peroxidase staining. Whole two-dimensional sections were sampled. Morphometric grids of variable sizes allowed a simple and fast recording of positions of epithelial nuclei, marked or not by MIB-1. The geostatistical method is based here upon the asymptotic behaviour of dispersion variance. Measure of asymptotic exponent of dispersion variance shows an increase from grade 1 to grade 3. Preliminary results are encouraging: grades 1 and 3 on one hand and 2 and 3 on the other hand are totally separated. The final proof of an improved grading using this measure will of course require a confrontation with the results of survival studies.

  20. Application of geostatistics to coal-resource characterization and mine planning. Final report

    SciTech Connect

    Kauffman, P.W.; Walton, D.R.; Martuneac, L.; Kim, Y.C.; Knudsen, H.P.; Baafi, E.Y.; Lonergan, J.E.; Martino, F.

    1981-12-01

    Geostatistics is a proven method of ore reserve estimation in many non-coal mining areas but little has been published concerning its application to coal resources. This report presents the case for using geostatistics for coal mining applications and describes how a coal mining concern can best utilize geostatistical techniques for coal resource characterization and mine planning. An overview of the theory of geostatistics is also presented. Many of the applications discussed are documented in case studies that are a part of the report. The results of an exhaustive literature search are presented and recommendations are made for needed future research and demonstration projects.

  1. Academic Performance of Subsequent Schools and Impacts of Early Interventions: Evidence from a Randomized Controlled Trial in Head Start Settings

    PubMed Central

    Zhai, Fuhua; Raver, C. Cybele; Jones, Stephanie M.

    2012-01-01

    The role of subsequent school contexts in the long-term effects of early childhood interventions has received increasing attention, but has been understudied in the literature. Using data from the Chicago School Readiness Project (CSRP), a cluster-randomized controlled trial conducted in Head Start programs, we investigate whether the intervention had differential effects on academic and behavioral outcomes in kindergarten if children attended high- or low-performing schools subsequent to the preschool intervention year. To address the issue of selection bias, we adopt an innovative method, principal score matching, and control for a set of child, mother, and classroom covariates. We find that exposure to the CSRP intervention in the Head Start year had significant effects on academic and behavioral outcomes in kindergarten for children who subsequently attended high-performing schools, but no significant effects on children attending low-performing schools. Policy implications of the findings are discussed. PMID:22773872

  2. A New 2.5D Representation for Lymph Node Detection using Random Sets of Deep Convolutional Neural Network Observations

    PubMed Central

    Lu, Le; Seff, Ari; Cherry, Kevin M.; Hoffman, Joanne; Wang, Shijun; Liu, Jiamin; Turkbey, Evrim; Summers, Ronald M.

    2015-01-01

    Automated Lymph Node (LN) detection is an important clinical diagnostic task but very challenging due to the low contrast of surrounding structures in Computed Tomography (CT) and to their varying sizes, poses, shapes and sparsely distributed locations. State-of-the-art studies show the performance range of 52.9% sensitivity at 3.1 false-positives per volume (FP/vol.), or 60.9% at 6.1 FP/vol. for mediastinal LN, by one-shot boosting on 3D HAAR features. In this paper, we first operate a preliminary candidate generation stage, towards ~100% sensitivity at the cost of high FP levels (~40 per patient), to harvest volumes of interest (VOI). Our 2.5D approach consequently decomposes any 3D VOI by resampling 2D reformatted orthogonal views N times, via scale, random translations, and rotations with respect to the VOI centroid coordinates. These random views are then used to train a deep Convolutional Neural Network (CNN) classifier. In testing, the CNN is employed to assign LN probabilities for all N random views that can be simply averaged (as a set) to compute the final classification probability per VOI. We validate the approach on two datasets: 90 CT volumes with 388 mediastinal LNs and 86 patients with 595 abdominal LNs. We achieve sensitivities of 70%/83% at 3 FP/vol. and 84%/90% at 6 FP/vol. in mediastinum and abdomen respectively, which drastically improves over the previous state-of-the-art work. PMID:25333158

  3. A new 2.5D representation for lymph node detection using random sets of deep convolutional neural network observations.

    PubMed

    Roth, Holger R; Lu, Le; Seff, Ari; Cherry, Kevin M; Hoffman, Joanne; Wang, Shijun; Liu, Jiamin; Turkbey, Evrim; Summers, Ronald M

    2014-01-01

    Automated Lymph Node (LN) detection is an important clinical diagnostic task but very challenging due to the low contrast of surrounding structures in Computed Tomography (CT) and to their varying sizes, poses, shapes and sparsely distributed locations. State-of-the-art studies show the performance range of 52.9% sensitivity at 3.1 false-positives per volume (FP/vol.), or 60.9% at 6.1 FP/vol. for mediastinal LN, by one-shot boosting on 3D HAAR features. In this paper, we first operate a preliminary candidate generation stage, towards ~100% sensitivity at the cost of high FP levels (~40 per patient), to harvest volumes of interest (VOI). Our 2.5D approach consequently decomposes any 3D VOI by resampling 2D reformatted orthogonal views N times, via scale, random translations, and rotations with respect to the VOI centroid coordinates. These random views are then used to train a deep Convolutional Neural Network (CNN) classifier. In testing, the CNN is employed to assign LN probabilities for all N random views that can be simply averaged (as a set) to compute the final classification probability per VOI. We validate the approach on two datasets: 90 CT volumes with 388 mediastinal LNs and 86 patients with 595 abdominal LNs. We achieve sensitivities of 70%/83% at 3 FP/vol. and 84%/90% at 6 FP/vol. in mediastinum and abdomen respectively, which drastically improves over the previous state-of-the-art work.
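The aggregation step described in the abstract is simple: the CNN scores each of the N random 2.5D views and the per-VOI probability is the plain average of those view scores. A minimal sketch, with `cnn_score` as a hypothetical stand-in for the trained classifier (not the paper's actual network):

```python
def voi_probability(views, cnn_score):
    """Average per-view classifier scores into one probability per VOI.

    views     : the N randomly translated/rotated 2.5D views of one VOI
    cnn_score : callable mapping a view to a probability in [0, 1],
                standing in for the trained CNN
    """
    scores = [cnn_score(v) for v in views]
    return sum(scores) / len(scores)
```

Averaging over random views acts as test-time augmentation: individual view scores are noisy, but their mean is a lower-variance estimate of the VOI's lymph node probability.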

  4. Geostatistical analysis of disease data: estimation of cancer mortality risk from empirical frequencies using Poisson kriging

    PubMed Central

    Goovaerts, Pierre

    2005-01-01

    Background Cancer mortality maps are used by public health officials to identify areas of excess and to guide surveillance and control activities. Quality of decision-making thus relies on an accurate quantification of risks from observed rates which can be very unreliable when computed from sparsely populated geographical units or recorded for minority populations. This paper presents a geostatistical methodology that accounts for spatially varying population sizes and spatial patterns in the processing of cancer mortality data. Simulation studies are conducted to compare the performances of Poisson kriging to a few simple smoothers (i.e. population-weighted estimators and empirical Bayes smoothers) under different scenarios for the disease frequency, the population size, and the spatial pattern of risk. A public-domain executable with example datasets is provided. Results The analysis of age-adjusted mortality rates for breast and cervix cancers illustrated some key features of commonly used smoothing techniques. Because of the small weight assigned to the rate observed over the entity being smoothed (kernel weight), the population-weighted average leads to risk maps that show little variability. Other techniques assign larger and similar kernel weights but they use a different piece of auxiliary information in the prediction: global or local means for global or local empirical Bayes smoothers, and spatial combination of surrounding rates for the geostatistical estimator. Simulation studies indicated that Poisson kriging outperforms other approaches for most scenarios, with a clear benefit when the risk values are spatially correlated. Global empirical Bayes smoothers provide more accurate predictions under the least frequent scenario of spatially random risk. Conclusion The approach presented in this paper enables researchers to incorporate the pattern of spatial dependence of mortality rates into the mapping of risk values and the quantification of the

  5. Geostatistical upscaling of rain gauge data to support uncertainty analysis of lumped urban hydrological models

    NASA Astrophysics Data System (ADS)

    Muthusamy, Manoranjan; Schellart, Alma; Tait, Simon; Heuvelink, Gerard B. M.

    2017-02-01

    In this study we develop a method to estimate the spatially averaged rainfall intensity together with associated level of uncertainty using geostatistical upscaling. Rainfall data collected from a cluster of eight paired rain gauges in a 400 m × 200 m urban catchment are used in combination with spatial stochastic simulation to obtain optimal predictions of the spatially averaged rainfall intensity at any point in time within the urban catchment. The uncertainty in the prediction of catchment average rainfall intensity is obtained for multiple combinations of intensity ranges and temporal averaging intervals. The two main challenges addressed in this study are scarcity of rainfall measurement locations and non-normality of rainfall data, both of which need to be considered when adopting a geostatistical approach. Scarcity of measurement points is dealt with by pooling sample variograms of repeated rainfall measurements with similar characteristics. Normality of rainfall data is achieved through the use of normal score transformation. Geostatistical models in the form of variograms are derived for transformed rainfall intensity. Next spatial stochastic simulation which is robust to nonlinear data transformation is applied to produce realisations of rainfall fields. These realisations in transformed space are first back-transformed and next spatially aggregated to derive a random sample of the spatially averaged rainfall intensity. Results show that the prediction uncertainty comes mainly from two sources: spatial variability of rainfall and measurement error. At smaller temporal averaging intervals both these effects are high, resulting in a relatively high uncertainty in prediction. With longer temporal averaging intervals the uncertainty becomes lower due to stronger spatial correlation of rainfall data and relatively smaller measurement error. Results also show that the measurement error increases with decreasing rainfall intensity resulting in a higher
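The normal score transformation mentioned above is a rank-based mapping of skewed data onto a standard normal distribution. A minimal standard-library sketch, assuming the common rank/(n+1) plotting-position convention (the study's exact convention is not stated in the abstract):

```python
from statistics import NormalDist

def normal_score_transform(x):
    """Map skewed values (e.g. rainfall intensities) to standard-normal scores.

    Each value is replaced by the normal quantile of its rank-based
    plotting position rank/(n+1). Ties are broken by order of appearance,
    which is adequate for a sketch; production code would average tied ranks.
    """
    n = len(x)
    order = sorted(range(n), key=lambda i: x[i])
    nd = NormalDist()
    scores = [0.0] * n
    for rank, i in enumerate(order, start=1):
        scores[i] = nd.inv_cdf(rank / (n + 1))  # plotting position in (0, 1)
    return scores
```

Variograms are then fitted to the transformed values; simulated realisations must be back-transformed before spatial aggregation, which is why the abstract stresses simulation methods that are robust to nonlinear transformation.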

  6. Increasing Water Intake of Children and Parents in the Family Setting: A Randomized, Controlled Intervention Using Installation Theory.

    PubMed

    Lahlou, Saadi; Boesen-Mariani, Sabine; Franks, Bradley; Guelinckx, Isabelle

    2015-01-01

    On average, children and adults in developed countries consume too little water, which can lead to negative health consequences. In a one-year longitudinal field experiment in Poland, we compared the impact of three home-based interventions on helping children and their parents/caregivers to develop sustainable increased plain water consumption habits. Fluid consumption of 334 children and their caregivers was recorded over one year using an online specific fluid dietary record. They were initially randomly allocated to one of the three following conditions: Control, Information (child and carer received information on the health benefits of water), or Placement (in addition to information, free small bottles of still water for a limited time period were delivered at home). After three months, half of the non-controls were randomly assigned to Community (child and caregiver engaged in an online community forum providing support on water consumption). All conditions significantly increased the water consumption of children (by 21.9-56.7%) and of adults (by 22-89%). Placement + Community generated the largest effects. Community enhanced the impact of Placement for children and parents, as well as the impact of Information for parents but not children. The results suggest that the family setting offers considerable scope for successful installation of interventions encouraging children and caregivers to develop healthier consumption habits, in mutually reinforcing ways. Combining information, affordances, and social influence gives the best, and most sustainable, results.

  7. Alcohol screening and brief interventions for offenders in the probation setting (SIPS Trial): a pragmatic multicentre cluster randomized controlled trial.

    PubMed

    Newbury-Birch, Dorothy; Coulton, Simon; Bland, Martin; Cassidy, Paul; Dale, Veronica; Deluca, Paolo; Gilvarry, Eilish; Godfrey, Christine; Heather, Nick; Kaner, Eileen; McGovern, Ruth; Myles, Judy; Oyefeso, Adenekan; Parrott, Steve; Patton, Robert; Perryman, Katherine; Phillips, Tom; Shepherd, Jonathan; Drummond, Colin

    2014-01-01

    To evaluate the effectiveness of different brief intervention strategies at reducing hazardous or harmful drinking in the probation setting. Offender managers were randomized to three interventions, each of which built on the previous one: feedback on screening outcome and a client information leaflet (control group); 5 min of structured brief advice; and 20 min of brief lifestyle counselling. A pragmatic multicentre factorial cluster randomized controlled trial. The primary outcome was self-reported hazardous or harmful drinking status measured by Alcohol Use Disorders Identification Test (AUDIT) at 6 months (negative status was a score of <8). Secondary outcomes were AUDIT status at 12 months, experience of alcohol-related problems, health utility, service utilization, readiness to change and reduction in conviction rates. Follow-up rates were 68% at 6 months and 60% at 12 months. At both time points, there was no significant advantage of more intensive interventions compared with the control group in terms of AUDIT status. Those in the brief advice and brief lifestyle counselling intervention groups were statistically significantly less likely to reoffend (36 and 38%, respectively) than those in the client information leaflet group (50%) in the year following intervention. Brief advice or brief lifestyle counselling provided no additional benefit in reducing hazardous or harmful drinking compared with feedback on screening outcome and a client information leaflet. The impact of more intensive brief intervention on reoffending warrants further research. © The Author 2014. Medical Council on Alcohol and Oxford University Press. All rights reserved.

  8. Use of the GlideScope Ranger Video Laryngoscope for Emergency Intubation in the Prehospital Setting: A Randomized Control Trial.

    PubMed

    Trimmel, Helmut; Kreutziger, Janett; Fitzka, Robert; Szüts, Stephan; Derdak, Christoph; Koch, Elisabeth; Erwied, Boris; Voelckel, Wolfgang G

    2016-07-01

    We sought to assess whether the GlideScope Ranger video laryngoscope may be a reliable alternative to direct laryngoscopy in the prehospital setting. Multicenter, prospective, randomized, control trial with patient recruitment over 18 months. Four study centers operating physician-staffed rescue helicopters or ground units in Austria and Norway. Adult emergency patients requiring endotracheal intubation. Airway management strictly following a prehospital algorithm. First and second intubation attempt employing GlideScope or direct laryngoscopy as randomized; third attempt crossover. After three failed intubation attempts, immediate use of an extraglottic airway device. A total of 326 patients were enrolled. Success rate with the GlideScope (n = 168) versus direct laryngoscopy (n = 158) group was 61.9% (104/168) versus 96.2% (152/158), respectively (p < 0.001). The main reasons for failed GlideScope intubation were failure to advance the tube into the larynx or trachea (26/168 vs 0/158; p < 0.001) and/or impaired sight due to blood or fluids (21/168 vs 3/158; p < 0.001). When GlideScope intubation failed, direct laryngoscopy was successful in 61 of 64 patients (95.3%), whereas GlideScope enabled intubation in four of six cases (66.7%) where direct laryngoscopy failed (p = 0.055). In addition, GlideScope was prone to impaired visualization of the monitor because of ambient light (29/168; 17.3%). There was no correlation between success rates and body mass index, age, indication for airway management, or experience of the physicians, respectively. Video laryngoscopy is an established tool in difficult airway management, but our results shed light on the specific problems in the emergency medical service setting. Prehospital use of the GlideScope was associated with some major problems, thus resulting in a lower intubation success rate when compared with direct laryngoscopy.

  9. Modal Petrology and Geostatistics of the Blue Hills Igneous Complex, Boston, Massachusetts, by Rietveld X-ray Diffraction: Multi-scalar Investigation of Volcanic and Intrusive Relationships

    NASA Astrophysics Data System (ADS)

    Besancon, J. R.; Spence, T. M.

    2004-12-01

    The Blue Hills Igneous Complex of eastern Massachusetts consists of mildly peralkaline volcanic and intrusive units including the Quincy Granite, the Blue Hills Porphyry, and a set of mainly pyroclastic rhyolite flow units traditionally called the Aporhyolite. Similar whole-rock chemistry has led most workers to assume that they are related rocks, despite some unclear field relationships. Kaktins (1976) divided the volcanic rocks into six units, but buried contacts do not permit confidence in either their number or stratigraphic position. To test a new method of modal analysis of these rocks, thirty-five samples were crushed, ground to approximately 5 micrometers, spray-dried to produce randomly oriented powder, and analyzed by x-ray diffraction. A constant eleven-phase Rietveld starting model was applied to the x-ray spectra, and then refined to produce a modal database of phase proportions in each sample. Geostatistical analysis with GIS software delineates a number of trends, with statistical measures of uncertainty. Aegirine in volcanics decreases in abundance with distance south from the E-W contact of volcanic rocks and granite. Riebeckite is found in the granite (both as veins and as apparently magmatic crystals) and the porphyry, but is less abundant or absent among the volcanic rocks. Where both amphibole and pyroxene are present, they are negatively correlated. The goal is to develop an additional tool for correlation of volcanic rocks, one based on mineral proportions in both aphanitic and phaneritic rocks.

  10. Geostatistical modeling of a portion of the alluvial aquifer of Mexico City

    NASA Astrophysics Data System (ADS)

    Morales-Casique, E.; Medina-Ortega, P.; Escolero-Fuentes, O.; Hernandez Espriu, A.

    2012-12-01

    Mexico City is one of the largest cities in the world and the pressure exerted on water resources generates problems such as intensive groundwater exploitation, subsidence and groundwater pollution. Most of the main aquifer under exploitation underlies lacustrine sediments and it is composed of a highly heterogeneous mixture of alluvial deposits and volcanic rocks. Lithological records from 113 production water wells are analyzed using indicator geostatistics. The different lithological categories are grouped into four hydrofacies, where a hydrofacies is a set of lithological categories which have similar hydraulic properties. An exponential variogram model was fitted to each hydrofacies by minimizing cross validation errors. The data is then kriged to obtain the three-dimensional distribution of each hydrofacies within the alluvial aquifer of Mexico City.
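The exponential variogram model fitted to each hydrofacies above has a simple closed form. A small sketch with illustrative parameters (the study's fitted values are not given in the abstract):

```python
import math

def exponential_variogram(h, nugget, sill, a):
    """Exponential model: gamma(h) = nugget + (sill - nugget) * (1 - exp(-3h/a)).

    'a' is the practical range, the lag at which gamma reaches ~95% of the sill.
    For indicator kriging of hydrofacies, the sill is bounded by p*(1-p),
    where p is the hydrofacies proportion.
    """
    return nugget + (sill - nugget) * (1.0 - math.exp(-3.0 * h / a))

# Illustrative indicator-variogram parameters for one hydrofacies.
gammas = [exponential_variogram(h, nugget=0.05, sill=0.25, a=800.0)
          for h in (0.0, 400.0, 800.0, 5000.0)]
```

In a workflow like the one described, these model values feed the kriging system that interpolates hydrofacies indicators between the 113 well logs.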

  11. GEOSTATISTICAL ANALYSIS OF HEALTH DATA WITH DIFFERENT LEVELS OF SPATIAL AGGREGATION

    PubMed Central

    Goovaerts, Pierre

    2012-01-01

    This paper presents a geostatistical approach to combine two geographical sets of area-based data into the mapping of disease risk, with an application to the rate of prostate cancer late-stage diagnosis in North Florida. This methodology is used to combine individual-level data assigned to census tracts for confidentiality reasons with individual-level data that were allocated to ZIP codes because of incomplete geocoding. This form of binomial kriging, which accounts for the population size and shape of each geographical unit, can generate choropleth or isopleth risk maps that are all coherent through spatial aggregation. Incorporation of both types of areal data reduces the loss of information associated with incomplete geocoding, leading to maps of risk estimates that are globally less smooth and with smaller prediction error variance. PMID:22469493

  12. Geostatistics - a tool applied to the distribution of Legionella pneumophila in a hospital water system.

    PubMed

    Laganà, Pasqualina; Moscato, Umberto; Poscia, Andrea; La Milia, Daniele Ignazio; Boccia, Stefania; Avventuroso, Emanuela; Delia, Santi

    2015-01-01

    Legionnaires' disease is normally acquired by inhalation of legionellae from a contaminated environmental source. Water systems of large buildings, such as hospitals, are often contaminated with legionellae and therefore represent a potential risk for the hospital population. The aim of this study was to evaluate the potential contamination of Legionella pneumophila (LP) in a large hospital in Italy through georeferential statistical analysis to assess the possible sources of dispersion and, consequently, the risk of exposure for both health care staff and patients. LP serogroups 1 and 2-14 distribution was considered in the wards housed on two consecutive floors of the hospital building. On the basis of information provided by 53 bacteriological analyses, a 'random' grid of points was chosen and spatial geostatistics or FAIk Kriging was applied and compared with the results of classical statistical analysis. Over 50% of the examined samples were positive for Legionella pneumophila. LP 1 was isolated in 69% of samples from the ground floor and in 60% of samples from the first floor; LP 2-14 in 36% of samples from the ground floor and 24% from the first. The iso-estimation maps show clearly the most contaminated pipe and the difference in the diffusion of the different L. pneumophila serogroups. Experimental work has demonstrated that geostatistical methods applied to the microbiological analysis of water matrices allow better modeling of the phenomenon under study, a greater potential for risk management and a greater choice of methods of prevention and environmental recovery to be put in place with respect to classical statistical analysis.

  13. Mapping aboveground woody biomass using forest inventory, remote sensing and geostatistical techniques.

    PubMed

    Yadav, Bechu K V; Nandy, S

    2015-05-01

    Mapping forest biomass is fundamental for estimating CO₂ emissions, and planning and monitoring of forests and ecosystem productivity. The present study attempted to map aboveground woody biomass (AGWB) integrating forest inventory, remote sensing and geostatistical techniques, viz., direct radiometric relationships (DRR), k-nearest neighbours (k-NN) and cokriging (CoK) and to evaluate their accuracy. A part of the Timli Forest Range of Kalsi Soil and Water Conservation Division, Uttarakhand, India was selected for the present study. Stratified random sampling was used to collect biophysical data from 36 sample plots of 0.1 ha (31.62 m × 31.62 m) size. Species-specific volumetric equations were used for calculating volume and multiplied by specific gravity to get biomass. Three forest-type density classes, viz. 10-40, 40-70 and >70% of Shorea robusta forest and four non-forest classes were delineated using on-screen visual interpretation of IRS P6 LISS-III data of December 2012. The volume in different strata of forest-type density ranged from 189.84 to 484.36 m³ ha⁻¹. The total growing stock of the forest was found to be 2,024,652.88 m³. The AGWB ranged from 143 to 421 Mg ha⁻¹. Spectral bands and vegetation indices were used as independent variables and biomass as dependent variable for DRR, k-NN and CoK. After validation and comparison, the k-NN method with Mahalanobis distance (root mean square error (RMSE) = 42.25 Mg ha⁻¹) was found to be the best method followed by fuzzy distance and Euclidean distance with RMSE of 44.23 and 45.13 Mg ha⁻¹ respectively. DRR was found to be the least accurate method with RMSE of 67.17 Mg ha⁻¹. The study highlighted the potential of integrating forest inventory, remote sensing and geostatistical techniques for forest biomass mapping.

  14. EFFICIENT MODEL-FITTING AND MODEL-COMPARISON FOR HIGH-DIMENSIONAL BAYESIAN GEOSTATISTICAL MODELS. (R826887)

    EPA Science Inventory

    Geostatistical models are appropriate for spatially distributed data measured at irregularly spaced locations. We propose an efficient Markov chain Monte Carlo (MCMC) algorithm for fitting Bayesian geostatistical models with substantial numbers of unknown parameters to sizable...

  16. Analysis of vadose zone tritium transport from an underground storage tank release using numerical modeling and geostatistics

    SciTech Connect

    Lee, K.H.

    1997-09-01

    Numerical and geostatistical analyses show that the artificial smoothing effect of kriging removes high permeability flow paths from hydrogeologic data sets, reducing simulated contaminant transport rates in heterogeneous vadose zone systems. Therefore, kriging alone is not recommended for estimating the spatial distribution of soil hydraulic properties for contaminant transport analysis at vadose zone sites. Vadose zone transport is modeled more effectively by combining kriging with stochastic simulation to better represent the high degree of spatial variability usually found in the hydraulic properties of field soils. However, kriging is a viable technique for estimating the initial mass distribution of contaminants in the subsurface.

  17. Randomized, controlled trial of the effectiveness of simulation education: A 24-month follow-up study in a clinical setting.

    PubMed

    Jansson, Miia M; Syrjälä, Hannu P; Ohtonen, Pasi P; Meriläinen, Merja H; Kyngäs, Helvi A; Ala-Kokko, Tero I

    2016-04-01

    Critical care nurses' knowledge and skills in adhering to evidence-based guidelines for avoiding complications associated with intubation and mechanical ventilation are currently limited. We hypothesized that a single simulation education session would lead to a long-lasting higher level of skills among critical care nurses. A randomized controlled trial was conducted in a 22-bed adult mixed medical-surgical intensive care unit in Finland during the period February 2012-March 2014. Thirty out of 40 initially randomized critical care nurses participated in a 24-month follow-up study. Behavior and cognitive development was evaluated through a validated Ventilator Bundle Observation Schedule and Questionnaire at the baseline measurement and repeated 3 times during simulation and real-life clinic settings. After simulation education, the average skills score increased from 46.8% to 58.8% of the total score in the final postintervention measurement (P(time) < .001, P(time × group) = .040, P(group) = .11). The average knowledge scores within groups did not change significantly. The average between-group difference in skills scores was significant only at the measurement taken at 6 months (P = .006). Critical care nurses' skills in adhering to evidence-based guidelines improved in both groups over time, but the improvement differed significantly between the study groups only at 6 months and was no longer evident 2 years after a single simulation education session. Copyright © 2016 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  18. ON THE GEOSTATISTICAL APPROACH TO THE INVERSE PROBLEM. (R825689C037)

    EPA Science Inventory

    Abstract

    The geostatistical approach to the inverse problem is discussed with emphasis on the importance of structural analysis. Although the geostatistical approach is occasionally misconstrued as mere cokriging, in fact it consists of two steps: estimation of statist...

  19. Geospatial Interpolation and Mapping of Tropospheric Ozone Pollution Using Geostatistics

    PubMed Central

    Kethireddy, Swatantra R.; Tchounwou, Paul B.; Ahmad, Hafiz A.; Yerramilli, Anjaneyulu; Young, John H.

    2014-01-01

    Tropospheric ozone (O3) pollution is a major problem worldwide, including in the United States of America (USA), particularly during the summer months. Ozone oxidative capacity and its impact on human health have attracted the attention of the scientific community. In the USA, sparse spatial observations for O3 may not provide a reliable source of data over a geo-environmental region. Geostatistical Analyst in ArcGIS has the capability to interpolate values in unmonitored geo-spaces of interest. In this study of eastern Texas O3 pollution, hourly episodes for spring and summer 2012 were selectively identified. To visualize the O3 distribution, geostatistical techniques were employed in ArcMap. Using ordinary Kriging, geostatistical layers of O3 for all the studied hours were predicted and mapped at a spatial resolution of 1 kilometer. A decent level of prediction accuracy was achieved and was confirmed from cross-validation results. The mean prediction error was close to 0, the root mean-standardized-prediction error was close to 1, and the root mean square and average standard errors were small. O3 pollution map data can be further used in analysis and modeling studies. Kriging results and O3 decadal trends indicate that the populace in Houston-Sugar Land-Baytown, Dallas-Fort Worth-Arlington, Beaumont-Port Arthur, San Antonio, and Longview are repeatedly exposed to high levels of O3-related pollution, and are prone to the corresponding respiratory and cardiovascular health effects. Optimization of the monitoring network proves to be an added advantage for the accurate prediction of exposure levels. PMID:24434594
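The cross-validation targets quoted above (mean prediction error near 0, root mean standardized prediction error near 1, small RMSE) are straightforward to compute from leave-one-out results. A minimal sketch with invented numbers, not the study's data:

```python
import math

def cv_diagnostics(observed, predicted, kriging_sd):
    """Cross-validation summaries for a kriging model: mean error (want ~0),
    root mean square error (want small), and root mean standardized
    squared error (want ~1, i.e. kriging standard errors are well calibrated)."""
    n = len(observed)
    errors = [p - o for o, p in zip(observed, predicted)]
    me = sum(errors) / n
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    rmsse = math.sqrt(sum((e / s) ** 2
                          for e, s in zip(errors, kriging_sd)) / n)
    return me, rmse, rmsse
```

A standardized error well above 1 means the kriging variances understate the true uncertainty; well below 1 means they overstate it.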

  20. Assessment of spatial distribution of fallout radionuclides through geostatistics concept.

    PubMed

    Mabit, L; Bernard, C

    2007-01-01

    After introducing the geostatistics concept and its utility in environmental science, and especially in Fallout Radionuclide (FRN) spatialisation, a case study of cesium-137 (¹³⁷Cs) redistribution at the field scale using geostatistics is presented. On a Canadian agricultural field, geostatistics coupled with a Geographic Information System (GIS) was used to test three different techniques of interpolation [Ordinary Kriging (OK), Inverse Distance Weighting power one (IDW1) and two (IDW2)] to create a ¹³⁷Cs map and to establish a radioisotope budget. Following the optimization of variographic parameters, an experimental semivariogram was developed to determine the spatial dependence of ¹³⁷Cs. It was adjusted to a spherical isotropic model with a range of 30 m and a very small nugget effect. This ¹³⁷Cs semivariogram showed a good autocorrelation (R² = 0.91) and was well structured ('nugget-to-sill' ratio of 4%). It also revealed that the sampling strategy was adequate to reveal the spatial correlation of ¹³⁷Cs. The spatial redistribution of ¹³⁷Cs was estimated by Ordinary Kriging and IDW to produce contour maps. A radioisotope budget was established for the 2.16 ha agricultural field under investigation. It was estimated that around 2 × 10⁷ Bq of ¹³⁷Cs were missing (around 30% of the total initial fallout) and were exported by physical processes (runoff and erosion processes) from the area under investigation. The cross-validation analysis showed that in the case of spatially structured data, OK is a better interpolation method than IDW1 or IDW2 for the assessment of potential radioactive contamination and/or pollution.
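The spherical semivariogram fitted in this study (30 m range, nugget-to-sill ratio of 4%) has a closed form that is easy to evaluate. A sketch, with a unit sill assumed purely for illustration:

```python
def spherical_variogram(h, nugget, sill, a):
    """Spherical model: nugget + (sill - nugget) * (1.5(h/a) - 0.5(h/a)^3)
    for h < a, flat at the sill for h >= a.

    Strictly gamma(0) = 0; the nugget is the discontinuity at the origin,
    which this sketch ignores for simplicity.
    """
    if h >= a:
        return sill
    r = h / a
    return nugget + (sill - nugget) * (1.5 * r - 0.5 * r ** 3)

# Parameters echoing the abstract: range a = 30 m, nugget-to-sill ratio of 4%.
sill, a = 1.0, 30.0
nugget = 0.04 * sill
gamma_15m = spherical_variogram(15.0, nugget, sill, a)
```

Beyond the 30 m range, samples are spatially uncorrelated, which is the property the abstract uses to judge that the sampling grid was dense enough.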

  1. Geospatial interpolation and mapping of tropospheric ozone pollution using geostatistics.

    PubMed

    Kethireddy, Swatantra R; Tchounwou, Paul B; Ahmad, Hafiz A; Yerramilli, Anjaneyulu; Young, John H

    2014-01-10

    Tropospheric ozone (O3) pollution is a major problem worldwide, including in the United States of America (USA), particularly during the summer months. Ozone oxidative capacity and its impact on human health have attracted the attention of the scientific community. In the USA, sparse spatial observations for O3 may not provide a reliable source of data over a geo-environmental region. Geostatistical Analyst in ArcGIS has the capability to interpolate values in unmonitored geo-spaces of interest. In this study of eastern Texas O3 pollution, hourly episodes for spring and summer 2012 were selectively identified. To visualize the O3 distribution, geostatistical techniques were employed in ArcMap. Using ordinary Kriging, geostatistical layers of O3 for all the studied hours were predicted and mapped at a spatial resolution of 1 kilometer. A decent level of prediction accuracy was achieved and was confirmed from cross-validation results. The mean prediction error was close to 0, the root mean-standardized-prediction error was close to 1, and the root mean square and average standard errors were small. O3 pollution map data can be further used in analysis and modeling studies. Kriging results and O3 decadal trends indicate that the populace in Houston-Sugar Land-Baytown, Dallas-Fort Worth-Arlington, Beaumont-Port Arthur, San Antonio, and Longview are repeatedly exposed to high levels of O3-related pollution, and are prone to the corresponding respiratory and cardiovascular health effects. Optimization of the monitoring network proves to be an added advantage for the accurate prediction of exposure levels.

  2. Focused Training for Humanitarian Responders in Regional Anesthesia Techniques for a Planned Randomized Controlled Trial in a Disaster Setting

    PubMed Central

    Aluisio, Adam R.; Teicher, Carrei; Wiskel, Tess; Guy, Allysia; Levine, Adam

    2016-01-01

    Background: Lower extremity trauma during earthquakes accounts for the largest burden of geophysical disaster-related injuries. Insufficient pain management is common in disaster settings, and regional anesthesia (RA) has the potential to reduce pain in injured patients beyond current standards. To date, no prospective research has evaluated the use of RA in a disaster setting. This cross-sectional study assesses knowledge translation and skill acquisition outcomes for lower extremity RA performed with and without ultrasound guidance among a cohort of Médecins Sans Frontières (MSF) volunteers who will function as proceduralists in a planned randomized controlled trial evaluating the efficacy of RA for pain management in an earthquake setting. Methods: Generalist humanitarian healthcare responders, including both physicians and nurses, were trained in ultrasound guided femoral nerve block (USGFNB) and landmark guided fascia iliaca compartment block (LGFICB) techniques using didactic sessions and interactive simulations during a one-day focused course. Outcome measures evaluated interval knowledge attainment and technical proficiency in performing the RA procedures. Knowledge attainment was assessed via pre- and post-test evaluations and procedural proficiency was evaluated through monitored simulations, with performance of critical actions graded by two independent observers. Results: Twelve humanitarian response providers were enrolled and completed the trainings and assessments. Knowledge scores significantly increased from a mean pre-test score of 79% to post-test score of 88% (p<0.001). In practical evaluation of the LGFICB, participants correctly performed a median of 15.0 (Interquartile Range (IQR) 14.0-16.0) out of 16 critical actions. For the USGFNB, the median score was also 15.0 (IQR 14.0-16.0) out of 16 critical actions. Inter-rater reliability for completion of critical actions was excellent, with inter-rater agreement of 83.3% and 91.7% for the LGFICB

  3. Geostatistical enhancement of macro-scale runoff simulations

    NASA Astrophysics Data System (ADS)

    Pugliese, Alessio; Persiano, Simone; Castellarin, Attilio; Parajka, Juraj; Arheimer, Berit; Capell, René; Bagli, Stefano; Mazzoli, Paolo; Montanari, Alberto; Blöschl, Günter

    2017-04-01

    This study presents the results of the research experiment Geostatistical Enhancement of European Hydrological Prediction (GEEHP). GEEHP is developed within the EU funded SWITCH-ON project, which proposes to conduct collaborative experiments in a virtual laboratory in order to share water-related information and to tackle changes in the hydrosphere for operational needs (http://www.water-switch-on.eu). The main objective of GEEHP is to develop a simple and easy-to-apply technique for locally enhancing the performance of macro-scale rainfall-runoff models on the basis of observed streamflow data available at nearby streamgauges, without re-running computationally intensive rainfall-runoff simulations. The experiment relies upon the prediction of regional period-of-record flow-duration curves (FDCs) by means of a geostatistical procedure based on Top-kriging, which has been recently shown to be particularly reliable for the regionalization of FDCs. The procedure developed employs two different types of daily streamflow data collected in a limited portion of territories centred in Tyrol (Austria and Italy): large-scale rainfall-runoff model simulation series (EHYPE, http://hypeweb.smhi.se/europehype) and observed series from 46 gauged catchments. The first phase of the experiment required the implementation and cross-validation of the geostatistically-based regional model over the study area, capable of predicting FDCs in ungauged sites. Cross-validation results showed good overall performances of the regional model, with an average Nash-Sutcliffe efficiency on log-flows (LNSE) equal to 0.898 over the entire river network in Tyrol. In a second phase, we selected 11 target catchments within the study area, for which both EHYPE simulations and observed data were available over the period 1980-2010. 
Then, we computed residuals between Top-kriged FDCs and FDCs constructed from simulated streamflow series, and, finally, we used these residuals for enhancing simulated time
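    The residual-correction step described above can be sketched in a few lines: compare the geostatistically predicted FDC with the FDC of the simulated series at the same duration points, then map the log-space residuals back onto the simulated flows. The function below is a hypothetical illustration, not the GEEHP implementation; it assumes the kriged FDC is supplied in descending order at the empirical duration points of the simulated record.

```python
import numpy as np

def enhance_simulated_flows(sim_series, kriged_fdc):
    """Correct simulated daily flows using log-space residuals between a
    Top-kriged FDC and the simulated series' own empirical FDC.
    `kriged_fdc` must be sorted in descending order, one value per
    duration point of the simulated record (an assumed convention)."""
    sim_series = np.asarray(sim_series, dtype=float)
    order = np.argsort(sim_series)[::-1]      # descending duration ranks
    sim_fdc = sim_series[order]               # empirical FDC of the simulation
    residuals = np.log(kriged_fdc) - np.log(sim_fdc)
    corrected = np.empty_like(sim_series)
    corrected[order] = np.exp(np.log(sim_fdc) + residuals)
    return corrected
```

    Because the correction is applied rank-by-rank, the corrected series preserves the timing of the simulated hydrograph while adopting the kriged flow magnitudes.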

  4. Brain lesion detection in MRI with fuzzy and geostatistical models.

    PubMed

    Pham, Tuan D

    2010-01-01

    Automated image detection of white matter changes in the brain is helpful in providing a quantitative measure for studying the association of white matter lesions with other types of biomedical data. Such studies allow the validation of medical hypotheses that can lead to therapeutic treatment and prevention. This paper presents a new clustering-based segmentation approach for detecting white matter changes in magnetic resonance imaging, with particular reference to cognitive decline in the elderly. The proposed method is formulated using the principles of the fuzzy c-means algorithm and geostatistics.
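    The fuzzy c-means component can be illustrated with a minimal NumPy implementation. This is only the standard FCM iteration, not the paper's geostatistical extension; the cluster count, fuzzifier, and iteration count are arbitrary choices.

```python
import numpy as np

def fuzzy_cmeans(X, c=2, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means sketch: X is (n_samples, n_features)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # Random initial membership matrix U (n x c), rows sum to 1
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        Um = U ** m
        # Cluster centres as membership-weighted means
        centres = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Squared distances from each sample to each centre
        d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2) + 1e-12
        # Membership update: u_ik proportional to d_ik^(-2/(m-1))
        inv = d2 ** (-1.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centres, U
```

    A hard segmentation, when needed, is obtained by assigning each voxel to the cluster with the largest membership (`U.argmax(axis=1)`).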

  5. Using geostatistics to predict the characteristics of washed coal

    SciTech Connect

    Armstrong, M.

    1984-04-01

    This paper was presented to an SME-AIME meeting in 1981. The established techniques of linear geostatistics (ordinary kriging) can be used to estimate the total tonnage and the grade of coal in situ; more sophisticated techniques are required for predicting the characteristics of washed coal in situ. Two approaches are being investigated. One involves a parametric model of the washability curves and disjunctive kriging. The other is similar to the service variable approach used for estimating recoverable uranium reserves. This latter method is described in this paper.

  6. Mercury emissions from coal combustion in Silesia, analysis using geostatistics

    NASA Astrophysics Data System (ADS)

    Zasina, Damian; Zawadzki, Jaroslaw

    2015-04-01

    Data provided by the UNEP report on mercury [1] show that solid fuel combustion is a significant source of mercury emission to air. Silesia, located in southwestern Poland, is notably affected by mercury emission because it is one of the most industrialized Polish regions: a place of coal mining, metal production, stone mining, mineral quarrying and chemical industry. Moreover, Silesia is a region with high population density. People are exposed to severe risk from mercury emitted by both industrial and domestic sources (i.e. small household furnaces). Small sources contribute significantly to the total emission of mercury. Official statistical analyses, including those prepared for international purposes [2], did not provide data on the spatial distribution of mercury emitted to air; however, a number of analyses of the Polish public power and energy sector have been prepared so far [3; 4]. The distribution of locations exposed to mercury emission from small domestic sources is an interesting problem that merges information from various sources: statistical, economic and environmental. This paper presents a geostatistical approach to the distribution of mercury emission from coal combustion. The analysed data are organized at two independent levels: an individual, bottom-up level derived from the national emission reporting system [5; 6], and a top-down level of regional data calculated from official statistics [7]. The analysis presented will compare spatial distributions of mercury emission using data derived from the sources mentioned above. The investigation covers three voivodeships of Poland: Lower Silesian, Opole and Silesian, using selected geostatistical methodologies including ordinary kriging [8]. References [1] UNEP. Global Mercury Assessment 2013: Sources, Emissions, Releases and Environmental Transport. UNEP Chemicals Branch, Geneva, Switzerland, 2013. [2] NCEM. Poland's Informative Inventory Report 2014. NCEM at the IEP-NRI, 2014. http
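    Ordinary kriging, the interpolation method cited above, can be sketched compactly: solve the kriging system built from a variogram model for weights that sum to one. The exponential variogram and its parameters below are arbitrary placeholders, not values fitted to the Silesian emission data.

```python
import numpy as np

def ordinary_kriging(xy, z, xy0, sill=1.0, rng_par=1.0, nugget=0.0):
    """Ordinary kriging prediction at one target point xy0 from data
    (xy, z), using an exponential variogram (illustrative sketch)."""
    def gamma(h):
        return nugget + sill * (1.0 - np.exp(-h / rng_par))
    n = len(z)
    # Pairwise distances among data points, and data-to-target distances
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    d0 = np.linalg.norm(xy - xy0, axis=1)
    # Ordinary kriging system with a Lagrange multiplier enforcing sum(w) = 1
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, n] = 0.0
    b = np.append(gamma(d0), 1.0)
    w = np.linalg.solve(A, b)
    return float(w[:n] @ z)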

  7. Geostatistical modeling of uncertainty of the spatial distribution of available phosphorus in soil in a sugarcane field

    NASA Astrophysics Data System (ADS)

    Tadeu Pereira, Gener; Ribeiro de Oliveira, Ismênia; De Bortoli Teixeira, Daniel; Arantes Camargo, Livia; Rodrigo Panosso, Alan; Marques, José, Jr.

    2015-04-01

    Phosphorus is one of the limiting nutrients for sugarcane development in Brazilian soils. The spatial variability of this nutrient is great, defined by the properties that control its adsorption and desorption reactions. Spatial estimates to characterize this variability are based on geostatistical interpolation. Thus, the assessment of the uncertainty of estimates associated with the spatial distribution of available P (Plabile) is decisive to optimize the use of phosphate fertilizers. The purpose of this study was to evaluate the performance of sequential Gaussian simulation (sGs) and ordinary kriging (OK) in the modeling of uncertainty in available P estimates. A sampling grid with 626 points was established in a 200-ha experimental sugarcane field in Tabapuã, São Paulo State, Brazil. The soil was sampled at the crossover points of a regular grid with intervals of 50 m. From the observations, 63 points (approximately 10% of the sampled points) were randomly selected before the geostatistical modeling to compose a data set used for validation, while the remaining 563 points were used to predict the variable at unsampled locations. The sGs generated 200 realizations. From the realizations generated, different measures of estimation and uncertainty were obtained. The standard deviation, calculated point by point across all simulated maps, provided a map of deviation used to assess local uncertainty. Visual analysis of the E-type and OK maps showed that the spatial patterns produced by both methods were similar; however, it was possible to observe the characteristic smoothing effect of OK, especially in regions with extreme values. The standardized variograms of selected sGs realizations showed both range and model similar to the variogram of the observed Plabile data. The OK variogram showed a structure distinct from the observed data, underestimating the variability over short distances and presenting parabolic behavior near
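    The per-point summaries described above (the E-type estimate and the local deviation map) are straightforward to compute once the realizations are stacked; a generic sketch:

```python
import numpy as np

def realization_summaries(realizations):
    """Given sGs realizations stacked as (n_realizations, n_points), return
    the E-type estimate (point-wise mean) and the local-uncertainty map
    (point-wise standard deviation across realizations)."""
    r = np.asarray(realizations, dtype=float)
    return r.mean(axis=0), r.std(axis=0)
```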

  8. Comparative soil CO2 flux measurements and geostatistical estimation methods on Masaya volcano, Nicaragua

    USGS Publications Warehouse

    Lewicki, J.L.; Bergfeld, D.; Cardellini, C.; Chiodini, G.; Granieri, D.; Varley, N.; Werner, C.

    2005-01-01

    We present a comparative study of soil CO2 flux (FCO2) measured by five groups (Groups 1-5) at the IAVCEI-CCVG Eighth Workshop on Volcanic Gases on Masaya volcano, Nicaragua. Groups 1-5 measured FCO2 using the accumulation chamber method at 5-m spacing within a 900 m2 grid during a morning (AM) period. These measurements were repeated by Groups 1-3 during an afternoon (PM) period. Measured FCO2 ranged from 218 to 14,719 g m-2 day-1. The variability of the five measurements made at each grid point ranged from ±5 to 167%. However, the arithmetic means of fluxes measured over the entire grid and associated total CO2 emission rate estimates varied between groups by only ±22%. All three groups that made PM measurements reported an 8-19% increase in total emissions over the AM results. Based on a comparison of measurements made during AM and PM times, we argue that this change is due in large part to natural temporal variability of gas flow, rather than to measurement error. In order to estimate the mean and associated CO2 emission rate of one data set and to map the spatial FCO2 distribution, we compared six geostatistical methods: arithmetic and minimum variance unbiased estimator means of uninterpolated data, and arithmetic means of data interpolated by the multiquadric radial basis function, ordinary kriging, multi-Gaussian kriging, and sequential Gaussian simulation methods. While the total CO2 emission rates estimated using the different techniques only varied by ±4.4%, the FCO2 maps showed important differences. We suggest that the sequential Gaussian simulation method yields the most realistic representation of the spatial distribution of FCO2, but a variety of geostatistical methods are appropriate to estimate the total CO2 emission rate from a study area, which is a primary goal in volcano monitoring research. © Springer-Verlag 2005.
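    One of the interpolators compared above, the multiquadric radial basis function, can be sketched directly in NumPy. The shape parameter `eps` is an arbitrary assumption; the workshop's actual gridding settings are not given in the abstract.

```python
import numpy as np

def multiquadric_rbf(x, y, z, eps=1.0):
    """Fit a multiquadric RBF interpolant phi(r) = sqrt(r^2 + eps^2) to
    scattered flux data and return the surface as a callable."""
    pts = np.column_stack([x, y])
    r = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
    weights = np.linalg.solve(np.sqrt(r**2 + eps**2), z)

    def surface(qx, qy):
        q = np.column_stack([np.ravel(qx), np.ravel(qy)])
        rq = np.linalg.norm(q[:, None] - pts[None, :], axis=2)
        return (np.sqrt(rq**2 + eps**2) @ weights).reshape(np.shape(qx))

    return surface
```

    Evaluating the fitted surface on a fine grid and averaging gives the "arithmetic mean of interpolated data" estimator of mean flux, which multiplied by the grid area yields a total emission rate.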

  9. Enhancing multiple-point geostatistical modeling: 1. Graph theory and pattern adjustment

    NASA Astrophysics Data System (ADS)

    Tahmasebi, Pejman; Sahimi, Muhammad

    2016-03-01

    In recent years, higher-order geostatistical methods have been used for modeling of a wide variety of large-scale porous media, such as groundwater aquifers and oil reservoirs. Their popularity stems from their ability to account for qualitative data and the great flexibility that they offer for conditioning the models to hard (quantitative) data, which endow them with the capability for generating realistic realizations of porous formations with very complex channels, as well as features that are mainly a barrier to fluid flow. One group of such models consists of pattern-based methods that use a set of data points for generating stochastic realizations by which the large-scale structure and highly-connected features are reproduced accurately. The cross correlation-based simulation (CCSIM) algorithm, proposed previously by the authors, is a member of this group that has been shown to be capable of simulating multimillion-cell models in a matter of a few CPU seconds. The method is, however, sensitive to the patterns' specifications, such as boundaries and the number of replicates. In this paper the original CCSIM algorithm is reconsidered and two significant improvements are proposed for accurately reproducing large-scale patterns of heterogeneities in porous media. First, an effective boundary-correction method based on graph theory is presented by which one identifies the optimal cutting path/surface for removing the patchiness and discontinuities in the realization of a porous medium. Next, a new pattern adjustment method is proposed that automatically transfers the features in a pattern to one that seamlessly matches the surrounding patterns. The original CCSIM algorithm is then combined with the two methods and is tested using various complex two- and three-dimensional examples. It should, however, be emphasized that the methods that we propose in this paper are applicable to other pattern-based geostatistical simulation methods.
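    The optimal-cutting-path idea (finding a boundary through the overlap of two patterns that minimizes visible discontinuity) can be illustrated with a simple dynamic-programming seam, as used in image quilting; the paper's actual graph-theoretic formulation may differ.

```python
import numpy as np

def min_error_cut(overlap_a, overlap_b):
    """Find a minimal-error vertical cutting path through the overlap of
    two patterns via dynamic programming: at each row the path may move
    at most one column left or right. Returns one column index per row."""
    err = (overlap_a - overlap_b) ** 2
    h, w = err.shape
    cost = err.copy()
    # Accumulate the cheapest cost of reaching each cell from the top
    for i in range(1, h):
        for j in range(w):
            lo, hi = max(j - 1, 0), min(j + 2, w)
            cost[i, j] += cost[i - 1, lo:hi].min()
    # Backtrack the cheapest path from bottom to top
    path = [int(np.argmin(cost[-1]))]
    for i in range(h - 2, -1, -1):
        j = path[-1]
        lo, hi = max(j - 1, 0), min(j + 2, w)
        path.append(lo + int(np.argmin(cost[i, lo:hi])))
    return path[::-1]
```

    Stitching each row of the realization from pattern A left of the path and pattern B right of it removes the patchiness at pattern boundaries.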

  10. Geostatistical three-dimensional modeling of oolite shoals, St. Louis Limestone, southwest Kansas

    USGS Publications Warehouse

    Qi, L.; Carr, T.R.; Goldstein, R.H.

    2007-01-01

    In the Hugoton embayment of southwestern Kansas, reservoirs composed of relatively thin (<4 m; <13.1 ft) oolitic deposits within the St. Louis Limestone have produced more than 300 million bbl of oil. The geometry and distribution of oolitic deposits control the heterogeneity of the reservoirs, resulting in exploration challenges and relatively low recovery. Geostatistical three-dimensional (3-D) models were constructed to quantify the geometry and spatial distribution of oolitic reservoirs, and the continuity of flow units within Big Bow and Sand Arroyo Creek fields. Lithofacies in uncored wells were predicted from digital logs using a neural network. The tilting effect from the Laramide orogeny was removed to construct restored structural surfaces at the time of deposition. Well data and structural maps were integrated to build 3-D models of oolitic reservoirs using stochastic simulations with geometry data. Three-dimensional models provide insights into the distribution, the external and internal geometry of oolitic deposits, and the sedimentologic processes that generated reservoir intervals. The structural highs and general structural trend had a significant impact on the distribution and orientation of the oolitic complexes. The depositional pattern and connectivity analysis suggest an overall aggradation of shallow-marine deposits during pulses of relative sea level rise followed by deepening near the top of the St. Louis Limestone. Cemented oolitic deposits were modeled as barriers and baffles and tend to concentrate at the edge of oolitic complexes. Spatial distribution of porous oolitic deposits controls the internal geometry of rock properties. Integrated geostatistical modeling methods can be applicable to other complex carbonate or siliciclastic reservoirs in shallow-marine settings. Copyright © 2007. The American Association of Petroleum Geologists. All rights reserved.

  11. Spatially explicit Schistosoma infection risk in eastern Africa using Bayesian geostatistical modelling.

    PubMed

    Schur, Nadine; Hürlimann, Eveline; Stensgaard, Anna-Sofie; Chimfwembe, Kingford; Mushinge, Gabriel; Simoonga, Christopher; Kabatereine, Narcis B; Kristensen, Thomas K; Utzinger, Jürg; Vounatsou, Penelope

    2013-11-01

    Schistosomiasis remains one of the most prevalent parasitic diseases in the tropics and subtropics, but current statistics are outdated due to demographic and ecological transformations and ongoing control efforts. Reliable risk estimates are important to plan and evaluate interventions in a spatially explicit and cost-effective manner. We analysed a large ensemble of georeferenced survey data derived from an open-access neglected tropical diseases database to create smooth empirical prevalence maps for Schistosoma mansoni and Schistosoma haematobium for a total of 13 countries of eastern Africa. Bayesian geostatistical models based on climatic and other environmental data were used to account for potential spatial clustering in spatially structured exposures. Geostatistical variable selection was employed to reduce the set of covariates. Alignment factors were implemented to combine surveys on different age-groups and to acquire separate estimates for individuals aged ≤20 years and entire communities. Prevalence estimates were combined with population statistics to obtain country-specific numbers of Schistosoma infections. We estimate that 122 million individuals in eastern Africa are currently infected with either S. mansoni, or S. haematobium, or both species concurrently. Country-specific population-adjusted prevalence estimates range between 12.9% (Uganda) and 34.5% (Mozambique) for S. mansoni and between 11.9% (Djibouti) and 40.9% (Mozambique) for S. haematobium. Our models revealed that infection risk in Burundi, Eritrea, Ethiopia, Kenya, Rwanda, Somalia and Sudan might be considerably higher than previously reported, while in Mozambique and Tanzania, the risk might be lower than current estimates suggest. Our empirical, large-scale, high-resolution infection risk estimates for S. mansoni and S. haematobium in eastern Africa can guide future control interventions and provide a benchmark for subsequent monitoring and evaluation activities. Copyright © 2011

  12. Geostatistical prediction of flow-duration curves in an index-flow framework

    NASA Astrophysics Data System (ADS)

    Pugliese, A.; Castellarin, A.; Brath, A.

    2014-09-01

    An empirical period-of-record flow-duration curve (FDC) describes the percentage of time (duration) in which a given streamflow was equaled or exceeded over an historical period of time. In many practical applications one has to construct FDCs in basins that are ungauged or where very few observations are available. We present an application strategy of top-kriging, which makes the geostatistical procedure capable of predicting FDCs in ungauged catchments. Previous applications of top-kriging mainly focused on the prediction of point streamflow indices (e.g. flood quantiles, low-flow indices, etc.); here the procedure is used to predict the entire curve in ungauged sites as a weighted average of standardised empirical FDCs through the traditional linear-weighting scheme of kriging methods. In particular, we propose to standardise empirical FDCs by a reference index-flow value (i.e. mean annual flow, or mean annual precipitation × the drainage area) and to compute the overall negative deviation of the curves from this reference value. We then propose to use these values, which we term total negative deviation (TND), for expressing the hydrological similarity between catchments and for deriving the geostatistical weights. We focus on the prediction of FDCs for 18 unregulated catchments located in central Italy, and we quantify the accuracy of the proposed technique under various operational conditions through an extensive cross-validation and sensitivity analysis. The cross-validation points out that top-kriging is a reliable approach for predicting FDCs, with Nash-Sutcliffe efficiency measures ranging from 0.85 to 0.96 (depending on the model settings), very low biases over the entire duration range, and an enhanced representation of the low-flow regime relative to other regionalisation models that were recently developed for the same study region.
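    The standardisation and the total negative deviation (TND) can be sketched as follows. The discrete sum below is an assumed approximation of the paper's deviation measure, using Weibull plotting positions and the mean flow as the reference index.

```python
import numpy as np

def empirical_fdc(q):
    """Empirical period-of-record FDC: flows sorted descending against
    Weibull plotting-position durations."""
    q = np.sort(np.asarray(q, dtype=float))[::-1]
    d = np.arange(1, len(q) + 1) / (len(q) + 1)
    return d, q

def total_negative_deviation(q, ref=None):
    """Total negative deviation of a standardised FDC: the accumulated
    deficit of the curve below the reference index flow (here the mean
    flow), approximated by a discrete average (sketch)."""
    q = np.asarray(q, dtype=float)
    ref = q.mean() if ref is None else ref
    d, qs = empirical_fdc(q / ref)
    return float(np.sum(np.clip(1.0 - qs, 0.0, None)) / len(qs))
```

    Catchments with similar TND values are treated as hydrologically similar when deriving the kriging weights.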

  13. Tuberculosis case-finding through a village outreach programme in a rural setting in southern Ethiopia: community randomized trial.

    PubMed

    Shargie, Estifanos Biru; Mørkve, Odd; Lindtjørn, Bernt

    2006-02-01

    To ascertain whether case-finding through community outreach in a rural setting has an effect on case-notification rate, symptom duration, and treatment outcome of smear-positive tuberculosis (TB). We randomly allocated 32 rural communities to intervention or control groups. In intervention communities, health workers from seven health centres held monthly diagnostic outreach clinics at which they obtained sputum samples for sputum microscopy from symptomatic TB suspects. In addition, trained community promoters distributed leaflets and discussed symptoms of TB during house visits and at popular gatherings. Symptomatic individuals were encouraged to visit the outreach team or a nearby health facility. In control communities, cases were detected through passive case-finding among symptomatic suspects reporting to health facilities. Smear-positive TB patients from the intervention and control communities diagnosed during the study period were prospectively enrolled. In the 1-year study period, 159 and 221 cases of smear-positive TB were detected in the intervention and control groups, respectively. Case-notification rates in all age groups were 124.6/10⁵ and 98.1/10⁵ person-years, respectively (P = 0.12). The corresponding rates in adults older than 14 years were 207/10⁵ and 158/10⁵ person-years, respectively (P = 0.09). The proportion of patients with >3 months' symptom duration was 41% in the intervention group compared with 63% in the control group (P<0.001). Pre-treatment symptom duration in the intervention group fell by 55-60% compared with 3-20% in the control group. In the intervention and control groups, 81% and 75%, respectively of patients successfully completed treatment (P = 0.12). The intervention was effective in improving the speed but not the extent of case finding for smear-positive TB in this setting. Both groups had comparable treatment outcomes.

  14. Tuberculosis case-finding through a village outreach programme in a rural setting in southern Ethiopia: community randomized trial.

    PubMed Central

    Shargie, Estifanos Biru; Mørkve, Odd; Lindtjørn, Bernt

    2006-01-01

    OBJECTIVE: To ascertain whether case-finding through community outreach in a rural setting has an effect on case-notification rate, symptom duration, and treatment outcome of smear-positive tuberculosis (TB). METHODS: We randomly allocated 32 rural communities to intervention or control groups. In intervention communities, health workers from seven health centres held monthly diagnostic outreach clinics at which they obtained sputum samples for sputum microscopy from symptomatic TB suspects. In addition, trained community promoters distributed leaflets and discussed symptoms of TB during house visits and at popular gatherings. Symptomatic individuals were encouraged to visit the outreach team or a nearby health facility. In control communities, cases were detected through passive case-finding among symptomatic suspects reporting to health facilities. Smear-positive TB patients from the intervention and control communities diagnosed during the study period were prospectively enrolled. FINDINGS: In the 1-year study period, 159 and 221 cases of smear-positive TB were detected in the intervention and control groups, respectively. Case-notification rates in all age groups were 124.6/10⁵ and 98.1/10⁵ person-years, respectively (P = 0.12). The corresponding rates in adults older than 14 years were 207/10⁵ and 158/10⁵ person-years, respectively (P = 0.09). The proportion of patients with >3 months' symptom duration was 41% in the intervention group compared with 63% in the control group (P<0.001). Pre-treatment symptom duration in the intervention group fell by 55-60% compared with 3-20% in the control group. In the intervention and control groups, 81% and 75%, respectively of patients successfully completed treatment (P = 0.12). CONCLUSION: The intervention was effective in improving the speed but not the extent of case finding for smear-positive TB in this setting. Both groups had comparable treatment outcomes. PMID:16501728

  15. Phase III randomized trial assessing rofecoxib in the adjuvant setting of colorectal cancer: final results of the VICTOR trial.

    PubMed

    Midgley, Rachel S; McConkey, Christopher C; Johnstone, Elaine C; Dunn, Janet A; Smith, Justine L; Grumett, Simon A; Julier, Patrick; Iveson, Claire; Yanagisawa, Yoko; Warren, Bryan; Langman, Michael J; Kerr, David J

    2010-10-20

    Laboratory and case-control studies suggest a pivotal role for the cyclooxygenase-2 (COX-2) pathway in colorectal carcinogenesis. The purpose of this study was to test whether the COX-2 inhibitor rofecoxib could reduce recurrence and improve survival when administered in the adjuvant setting of colorectal cancer (CRC). Patients who had undergone potentially curative surgery and completion of adjuvant therapy for stage II and III CRC were randomly assigned to receive rofecoxib (20 mg daily) or placebo. The primary end point was overall survival (OS). Where formalin-fixed paraffin-embedded tumor tissue samples were available, COX-2 expression was evaluated by immunohistochemistry and correlated with clinical outcome. Two thousand four hundred thirty-four patients were entered onto the study. The trial was terminated early because of the worldwide withdrawal of rofecoxib. At this point, 1,167 patients had received rofecoxib and 1,160 patients had received placebo for median treatment durations of 7.4 and 8.2 months, respectively. For the rofecoxib and placebo arms, median follow-up times were 4.84 and 4.85 years, with 241 and 246 deaths and 297 and 329 recurrences, respectively. No difference was demonstrated in OS (hazard ratio [HR] = 0.97; 95% CI, 0.81 to 1.16; P = .75) or recurrence (HR = 0.89; 95% CI, 0.76 to 1.04; P = .15) comparing the two groups. Tumor COX-2 expression by immunohistochemistry was assessed for 871 patients, but neither prognostic nor predictive effects were observed. In this study of abbreviated therapy in the adjuvant setting of CRC, rofecoxib did not improve OS or protect from recurrence in unselected patients. In addition, COX-2 expression did not correlate with prognosis overall or predict effectiveness of COX-2 inhibitors.

  16. Home use of misoprostol for early medical abortion in a low resource setting: secondary analysis of a randomized controlled trial.

    PubMed

    Iyengar, Kirti; Klingberg-Allvin, Marie; Iyengar, Sharad D; Paul, Mandira; Essén, Birgitta; Gemzell-Danielsson, Kristina

    2016-02-01

    Although home use of misoprostol for early medical abortion is considered to be safe, effective and feasible, it has not become standard service delivery practice. The aim of this study was to compare the efficacy, safety, and acceptability of home use of misoprostol with clinic misoprostol in a low-resource setting. This was a secondary analysis of a randomized controlled trial conducted in six primary care clinics in India. Women seeking medical abortion within up to nine gestational weeks (n = 731) received mifepristone in the clinic and were allocated either to home or clinic administration of misoprostol. Follow-up contact was after 10-15 days. Of 731 participants, 73% were from rural areas and 55% had no formal education. Complete abortion rates in the home and clinic misoprostol groups were 94.2 and 94.4%, respectively. The rate of adverse events was similar in both groups (0.3%). A greater proportion of home users (90.2%) said that they would opt for misoprostol at home in the event of a future abortion compared with clinic users (79.7%) who would opt for misoprostol at the clinic in a similar situation (p = 0.0002). Ninety-six percent of women using misoprostol at home or in the clinic were satisfied with their abortion experience. Home use of misoprostol for early medical abortion is as effective and acceptable as clinic use in low-resource settings. Women should be offered a choice of this option regardless of distance of their residence from the clinic and communication facilities. © 2015 Nordic Federation of Societies of Obstetrics and Gynecology.

  17. Testing Allele Transmission of an SNP Set Using a Family-Based Generalized Genetic Random Field Method.

    PubMed

    Li, Ming; Li, Jingyun; He, Zihuai; Lu, Qing; Witte, John S; Macleod, Stewart L; Hobbs, Charlotte A; Cleves, Mario A

    2016-05-01

    Family-based association studies are commonly used in genetic research because they can be robust to population stratification (PS). Recent advances in high-throughput genotyping technologies have produced a massive amount of genomic data in family-based studies. However, current family-based association tests are mainly focused on evaluating individual variants one at a time. In this article, we introduce a family-based generalized genetic random field (FB-GGRF) method to test the joint association between a set of autosomal SNPs (i.e., single-nucleotide polymorphisms) and disease phenotypes. The proposed method is a natural extension of a recently developed GGRF method for population-based case-control studies. It models offspring genotypes conditional on parental genotypes, and, thus, is robust to PS. Through simulations, we show that under various disease scenarios the FB-GGRF has improved power over a commonly used family-based sequence kernel association test (FB-SKAT). Further, similar to GGRF, the proposed FB-GGRF method is asymptotically well-behaved, and does not require empirical adjustment of the type I error rates. We illustrate the proposed method using a study of congenital heart defects with family trios from the National Birth Defects Prevention Study (NBDPS).

  18. Translating U-500R Randomized Clinical Trial Evidence to the Practice Setting: A Diabetes Educator/Expert Prescriber Team Approach

    PubMed Central

    Bergen, Paula M.; Kruger, Davida F.; Taylor, April D.; Eid, Wael E.; Bhan, Arti; Jackson, Jeffrey A.

    2017-01-01

    Purpose The purpose of this article is to provide recommendations to the diabetes educator/expert prescriber team for the use of human regular U-500 insulin (U-500R) in patients with severely insulin-resistant type 2 diabetes, including its initiation and titration, by utilizing dosing charts and teaching materials translated from a recent U-500R clinical trial. Conclusions Clinically relevant recommendations and teaching materials for the optimal use and management of U-500R in clinical practice are provided based on the efficacy and safety results of and lessons learned from the U-500R clinical trial by Hood et al, current standards of practice, and the authors’ clinical expertise. This trial was the first robustly powered, randomized, titration-to-target trial to compare twice-daily and three-times-daily U-500R dosing regimens. Modifications were made to the initiation and titration dosing algorithms used in this trial to simplify dosing strategies for the clinical setting and align with current glycemic targets recommended by the American Diabetes Association. Leveraging the expertise, resources, and patient interactions of the diabetes educator who can provide diabetes self-management education and support in collaboration with the multidisciplinary diabetes team is strongly recommended to ensure patients treated with U-500R receive the timely and comprehensive care required to safely and effectively use this highly concentrated insulin. PMID:28427304

  19. High resolution general purpose D-layer experiment for EISCAT incoherent scatter radars using selected set of random codes

    NASA Astrophysics Data System (ADS)

    Turunen, T.; Westman, A.; Häggström, I.; Wannberg, G.

    2002-09-01

    The ionospheric D-layer is a narrow-bandwidth radar target, often with a very small scattering cross section. The target autocorrelation function can be obtained by transmitting a series of relatively short coded pulses and computing the correlation between data obtained from different pulses. The spatial resolution should be as high as possible and the spatial side lobes of the codes used should be as small as possible. However, due to the short pulse repetition period (on the order of milliseconds), at any instant the radar receives detectable scattered signals not only from the pulse illuminating the D-region but also from 3-5 ambiguous-range pulses, which makes it difficult to produce a reliable estimate near zero lag of the autocorrelation function. A new experimental solution to this measurement problem, using a selected set of 40-bit random codes with 4 µs elements giving 600 m spatial resolution, is presented. The zero lag is approximated by dividing the pulse into two 20-bit codes and computing the correlation between those two pulses. The lowest altitudes of the E-layer are measured by dividing the pulse into 5 pieces of 8 bits, which allows for computation of 4 lags. In addition, coherent integration of data from four pulses is used for obtaining separately the autocorrelation function estimate for the lowest altitudes and in cases when the target contains structures with a long coherence time. Design details and responses of the experiment are given, and analysed test data are shown.
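    The use of random binary codes can be illustrated with a matched-filter decoding sketch: cross-correlating the received samples with the transmitted ±1 code concentrates a scatterer's power at its range gate, while the randomness of the code keeps range sidelobes low. This is a generic illustration, not the EISCAT experiment's actual signal processing chain.

```python
import numpy as np

def decode_profile(rx, code):
    """Decode a coded-pulse return by cross-correlating the received
    samples with the transmitted +/-1 code (matched filtering).
    Returns one decoded value per candidate range gate."""
    code = np.asarray(code, dtype=float)
    n = len(code)
    return np.array([rx[k:k + n] @ code for k in range(len(rx) - n + 1)])
```

    For a 40-bit random code, the correlation at the true range gate equals 40 times the scatterer amplitude, while any range sidelobe is bounded by the shorter overlap of the shifted codes.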

  20. A multiple-point geostatistical approach to quantifying uncertainty for flow and transport simulation in geologically complex environments

    NASA Astrophysics Data System (ADS)

    Cronkite-Ratcliff, C.; Phelps, G. A.; Boucher, A.

    2011-12-01

    In many geologic settings, the pathways of groundwater flow are controlled by geologic heterogeneities which have complex geometries. Models of these geologic heterogeneities, and consequently, their effects on the simulated pathways of groundwater flow, are characterized by uncertainty. Multiple-point geostatistics, which uses a training image to represent complex geometric descriptions of geologic heterogeneity, provides a stochastic approach to the analysis of geologic uncertainty. Incorporating multiple-point geostatistics into numerical models provides a way to extend this analysis to the effects of geologic uncertainty on the results of flow simulations. We present two case studies to demonstrate the application of multiple-point geostatistics to numerical flow simulation in complex geologic settings with both static and dynamic conditioning data. Both cases involve the development of a training image from a complex geometric description of the geologic environment. Geologic heterogeneity is modeled stochastically by generating multiple equally-probable realizations, all consistent with the training image. Numerical flow simulation for each stochastic realization provides the basis for analyzing the effects of geologic uncertainty on simulated hydraulic response. The first case study is a hypothetical geologic scenario developed using data from the alluvial deposits in Yucca Flat, Nevada. The SNESIM algorithm is used to stochastically model geologic heterogeneity conditioned to the mapped surface geology as well as vertical drill-hole data. Numerical simulation of groundwater flow and contaminant transport through geologic models produces a distribution of hydraulic responses and contaminant concentration results. From this distribution of results, the probability of exceeding a given contaminant concentration threshold can be used as an indicator of uncertainty about the location of the contaminant plume boundary. The second case study considers a
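    The threshold-exceedance indicator described above is simple to compute once the equally probable realizations are in hand; a generic sketch:

```python
import numpy as np

def exceedance_probability(realizations, threshold):
    """Probability that simulated concentration exceeds a threshold at each
    cell, estimated as the fraction of equally probable realizations that
    lie above it. `realizations` is stacked as (n_realizations, n_cells)."""
    return (np.asarray(realizations) > threshold).mean(axis=0)
```

    Mapping this probability over the model domain delineates where the contaminant plume boundary is uncertain (probabilities near 0.5) versus confidently inside or outside the plume.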

  1. A Bayesian geostatistical transfer function approach to tracer test analysis

    NASA Astrophysics Data System (ADS)

    Fienen, Michael N.; Luo, Jian; Kitanidis, Peter K.

    2006-07-01

    Reactive transport modeling is often used in support of bioremediation and chemical treatment planning and design. There remains a pressing need for practical and efficient models that do not require (or assume attainable) the high level of characterization needed by complex numerical models. We focus on a linear systems or transfer function approach to the problem of reactive tracer transport in a heterogeneous saprolite aquifer. Transfer functions are obtained through the Bayesian geostatistical inverse method applied to tracer injection histories and breakthrough curves. We employ nonparametric transfer functions, which require minimal assumptions about shape and structure. The resulting flexibility empowers the data to determine the nature of the transfer function with minimal prior assumptions. Nonnegativity is enforced through a reflected Brownian motion stochastic model. The inverse method enables us to quantify uncertainty and to generate conditional realizations of the transfer function. Complex information about a hydrogeologic system is distilled into a relatively simple but rigorously obtained function that describes the transport behavior of the system between two wells. The resulting transfer functions are valuable in reactive transport models based on traveltime and streamline methods. The information contained in the data, particularly in the case of strong heterogeneity, is not overextended but is fully used. This is the first application of Bayesian geostatistical inversion to transfer functions in hydrogeology, but the methodology can be extended to any linear system.
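    The linear-systems view underlying this approach is that the breakthrough curve at the observation well is the discrete convolution of the injection history with the system's transfer function. A self-contained sketch with illustrative values (not data from the saprolite aquifer study):

```python
# Breakthrough curve as a discrete convolution of an injection history
# with a transfer function (illustrative values only).

def convolve(injection, transfer):
    """Discrete convolution: out[t] = sum_k injection[k] * transfer[t - k]."""
    n = len(injection) + len(transfer) - 1
    out = [0.0] * n
    for k, inj in enumerate(injection):
        for j, h in enumerate(transfer):
            out[k + j] += inj * h
    return out

injection = [1.0, 1.0, 0.0, 0.0]   # a two-step tracer pulse
transfer = [0.1, 0.5, 0.3, 0.1]    # nonnegative and sums to 1 (mass conserving)

breakthrough = convolve(injection, transfer)
```

    Because the transfer function integrates to one, the total tracer mass in the breakthrough curve equals the injected mass, which is the property that makes such functions usable in traveltime-based reactive transport models.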

  2. Regional flow duration curves: Geostatistical techniques versus multivariate regression

    USGS Publications Warehouse

    Pugliese, Alessio; Farmer, William H.; Castellarin, Attilio; Archfield, Stacey A.; Vogel, Richard M.

    2016-01-01

    A period-of-record flow duration curve (FDC) represents the relationship between the magnitude and frequency of daily streamflows. Prediction of FDCs is of great importance for locations characterized by sparse or missing streamflow observations. We present a detailed comparison of two methods capable of predicting an FDC at ungauged basins: (1) an adaptation of the geostatistical method Top-kriging, employing a linear weighted average of dimensionless empirical FDCs standardised with a reference streamflow value; and (2) regional multiple linear regression of streamflow quantiles, perhaps the most common method for the prediction of FDCs at ungauged sites. In particular, Top-kriging relies on a metric for expressing the similarity between catchments computed as the negative deviation of the FDC from a reference streamflow value, which we term total negative deviation (TND). Comparisons of these two methods are made in 182 largely unregulated river catchments in the southeastern U.S. using a three-fold cross-validation algorithm. Our results reveal that the two methods perform similarly across flow regimes, with average Nash-Sutcliffe Efficiencies of 0.566 and 0.662 (0.883 and 0.829 on log-transformed quantiles) for the geostatistical and linear regression models, respectively. Differences in the reproduction of FDCs occurred mostly for low flows with exceedance probability (i.e. duration) above 0.98.
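    The skill score used for the comparison, the Nash-Sutcliffe Efficiency, is NSE = 1 - SSE/SST, where SST is the sum of squares about the observed mean: 1 is a perfect fit and 0 means the model is no better than predicting the mean. A sketch with invented series (not the study's streamflow quantiles):

```python
# Nash-Sutcliffe Efficiency: NSE = 1 - SSE / SST.

def nash_sutcliffe(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    sst = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / sst

obs = [10.0, 8.0, 6.0, 4.0, 2.0]   # invented observed quantiles
sim = [9.0, 8.5, 5.5, 4.0, 2.5]    # invented predicted quantiles

nse = nash_sutcliffe(obs, sim)
```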

  3. Validating spatial structure in canopy water content using geostatistics

    NASA Technical Reports Server (NTRS)

    Sanderson, E. W.; Zhang, M. H.; Ustin, S. L.; Rejmankova, E.; Haxo, R. S.

    1995-01-01

    Heterogeneity in ecological phenomena is scale dependent and affects the hierarchical structure of image data. AVIRIS pixels average reflectance produced by complex absorption and scattering interactions between biogeochemical composition, canopy architecture, view and illumination angles, species distributions, and plant cover, as well as other factors. These scales affect validation of pixel reflectance, typically performed by relating pixel spectra to ground measurements acquired at scales of 1 m(exp 2) or less (e.g., field spectra, foliage and soil samples, etc.). As image analyses become more sophisticated, such as those for detection of canopy chemistry, better validation becomes a critical problem. This paper presents a methodology for bridging between point measurements and pixels using geostatistics. Geostatistics has been used extensively in geological and hydrogeological studies but has received little application in ecological studies. The key criterion for kriging estimation is that the phenomenon varies in space and that an underlying controlling process produces spatial correlation between the measured data points. Ecological variation meets this requirement because communities vary along environmental gradients like soil moisture, nutrient availability, or topography.
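    The spatial-correlation criterion for kriging is usually checked with an empirical semivariogram: half the mean squared difference between samples separated by a given lag, which rises with lag when nearby samples are similar. A minimal 1-D sketch with a toy transect (not AVIRIS measurements):

```python
# Empirical semivariogram for a 1-D transect of toy canopy-water values.

def semivariogram(values, lag):
    """gamma(h) = 0.5 * mean((z[i+h] - z[i])^2) over all pairs at lag h."""
    pairs = [(values[i], values[i + lag]) for i in range(len(values) - lag)]
    return 0.5 * sum((b - a) ** 2 for a, b in pairs) / len(pairs)

transect = [1.0, 1.2, 1.1, 1.6, 2.0, 2.1, 2.5]  # invented transect values
gammas = [semivariogram(transect, h) for h in (1, 2, 3)]
```

    The increasing gammas indicate spatial correlation: close samples resemble each other more than distant ones, which is exactly the structure kriging exploits.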

  4. Spatial analysis of hazardous waste data using geostatistics

    SciTech Connect

    Zirschky, J.H.

    1984-01-01

    The objective of this investigation was to determine whether geostatistics could be a useful tool for evaluating hazardous waste sites. Three sites contaminated by dioxin (2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD)) were investigated. The first site evaluated was a creek into which TCDD-contaminated soil had eroded. The second site was a town in which TCDD-contaminated wastes had been sprayed onto the streets. Finally, the third site was a highway whose shoulders were contaminated by dust deposition from a nearby hazardous waste site. The distribution of TCDD at the first and third sites was investigated using kriging, an optimal estimation technique. By using kriging, the areas of both sites requiring cleanup were successfully identified. At the second site, the town, satisfactory results were not obtained. The distribution of contamination in this town is believed to be very heterogeneous; thus, reasonable estimates could not be obtained. Additional sampling was therefore recommended at this site. Based upon this research, geostatistics appears to be a very useful tool for evaluating a hazardous waste site if the distribution of contaminants at the site is homogeneous or can be divided into homogeneous areas.

  5. Regional flow duration curves: Geostatistical techniques versus multivariate regression

    NASA Astrophysics Data System (ADS)

    Pugliese, Alessio; Farmer, William H.; Castellarin, Attilio; Archfield, Stacey A.; Vogel, Richard M.

    2016-10-01

    A period-of-record flow duration curve (FDC) represents the relationship between the magnitude and frequency of daily streamflows. Prediction of FDCs is of great importance for locations characterized by sparse or missing streamflow observations. We present a detailed comparison of two methods capable of predicting an FDC at ungauged basins: (1) an adaptation of the geostatistical method Top-kriging, employing a linear weighted average of dimensionless empirical FDCs standardised with a reference streamflow value; and (2) regional multiple linear regression of streamflow quantiles, perhaps the most common method for the prediction of FDCs at ungauged sites. In particular, Top-kriging relies on a metric for expressing the similarity between catchments computed as the negative deviation of the FDC from a reference streamflow value, which we term total negative deviation (TND). Comparisons of these two methods are made in 182 largely unregulated river catchments in the southeastern U.S. using a three-fold cross-validation algorithm. Our results reveal that the two methods perform similarly across flow regimes, with average Nash-Sutcliffe Efficiencies of 0.566 and 0.662 (0.883 and 0.829 on log-transformed quantiles) for the geostatistical and linear regression models, respectively. Differences in the reproduction of FDCs occurred mostly for low flows with exceedance probability (i.e. duration) above 0.98.

  6. Self-management support intervention to control cancer pain in the outpatient setting: a randomized controlled trial study protocol.

    PubMed

    Hochstenbach, Laura M J; Courtens, Annemie M; Zwakhalen, Sandra M G; van Kleef, Maarten; de Witte, Luc P

    2015-05-19

    Pain is a prevalent and distressing symptom in patients with cancer, having an enormous impact on functioning and quality of life. Fragmentation of care, inadequate pain communication, and reluctance towards pain medication contribute to difficulties in optimizing outcomes. Integration of patient self-management and professional care by means of healthcare technology provides new opportunities in the outpatient setting. This study protocol outlines a two-armed multicenter randomized controlled trial that compares a technology-based multicomponent self-management support intervention with care as usual and includes an effect, economic and process evaluation. Patients will be recruited consecutively via the outpatient oncology clinics and inpatient oncology wards of one academic hospital and one regional hospital in the south of the Netherlands. Irrespective of the stage of disease, patients are eligible when they are diagnosed with cancer and have uncontrolled moderate to severe cancer (treatment) related pain, defined as NRS≥4 for more than two weeks. Randomization (1:1) will assign patients to either the intervention or control group; patients in the intervention group receive self-management support and patients in the control group receive care as usual. The intervention will be delivered by registered nurses specialized in pain and palliative care. Important components include monitoring of pain, adverse effects and medication, as well as graphical feedback, education, and nurse support. Effect measurements for both groups will be carried out with questionnaires at baseline (T0), after 4 weeks (T1) and after 12 weeks (T2). Pain intensity and quality of life are the primary outcomes. Secondary outcomes include self-efficacy, knowledge, anxiety, depression and pain medication use. The final questionnaire also contains questions for the economic evaluation, which includes both cost-effectiveness and cost-utility analysis. Data for the process evaluation will be

  7. Estimating the Burden of Malaria in Senegal: Bayesian Zero-Inflated Binomial Geostatistical Modeling of the MIS 2008 Data

    PubMed Central

    Giardina, Federica; Gosoniu, Laura; Konate, Lassana; Diouf, Mame Birame; Perry, Robert; Gaye, Oumar; Faye, Ousmane; Vounatsou, Penelope

    2012-01-01

    The Research Center for Human Development in Dakar (CRDH), with the technical assistance of ICF Macro and the National Malaria Control Programme (NMCP), conducted in 2008/2009 the Senegal Malaria Indicator Survey (SMIS), the first nationally representative household survey collecting parasitological data and malaria-related indicators. In this paper, we present spatially explicit parasitaemia risk estimates and the number of infected children below 5 years of age. Geostatistical Zero-Inflated Binomial (ZIB) models were developed to take into account the large number of zero-prevalence survey locations (70%) in the data. Bayesian variable selection methods were incorporated within a geostatistical framework in order to choose the best set of environmental and climatic covariates associated with the parasitaemia risk. Model validation confirmed that the ZIB model had better predictive ability than the standard Binomial analogue. Markov chain Monte Carlo (MCMC) methods were used for inference. Several insecticide treated net (ITN) coverage indicators were calculated to assess the effectiveness of interventions. After adjusting for climatic and socio-economic factors, the presence of at least one ITN for every two household members and living in urban areas reduced the odds of parasitaemia by 86% and 81%, respectively. Posterior estimates of the ORs related to the wealth index show a decreasing trend with the quintiles. Infection odds appear to increase with age. The population-adjusted prevalence ranges from 0.12% in Thillé-Boubacar to 13.1% in Dabo. Tambacounda has the highest population-adjusted predicted prevalence (8.08%), whereas the region with the highest estimated number of infected children under the age of 5 years is Kolda (13940). The contemporary map and estimates of malaria burden identify the priority areas for future control interventions and provide baseline information for monitoring and evaluation. Zero-Inflated formulations are more appropriate in
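    The zero-inflated binomial mixes a point mass at zero (structural-zero probability p0) with an ordinary binomial: a survey location can report zero infected children either because it is truly parasite-free or by sampling chance. A sketch of the ZIB probability mass function with illustrative parameters (not the fitted SMIS values):

```python
# Zero-inflated binomial pmf: a point mass at zero mixed with a binomial.
from math import comb

def zib_pmf(k, n, prev, p0):
    """P(K = k) for n examined children, prevalence prev, structural-zero prob p0."""
    binom = comb(n, k) * prev ** k * (1 - prev) ** (n - k)
    if k == 0:
        return p0 + (1 - p0) * binom
    return (1 - p0) * binom

# With 30% structural zeros, an observed zero count is far more likely
# than the plain binomial alone would suggest.
p_zero_zib = zib_pmf(0, n=10, prev=0.2, p0=0.3)
p_zero_binom = zib_pmf(0, n=10, prev=0.2, p0=0.0)
```

    This inflation of the zero class is why the ZIB model validated better than the standard binomial on data with 70% zero-prevalence locations.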

  8. A geostatistical modeling study of the effect of heterogeneity on radionuclide transport in the unsaturated zone, Yucca Mountain.

    PubMed

    Viswanathan, Hari S; Robinson, Bruce A; Gable, Carl W; Carey, James W

    2003-01-01

    Retardation of certain radionuclides due to sorption to zeolitic minerals is considered one of the major barriers to contaminant transport in the unsaturated zone of Yucca Mountain. However, zeolitically altered areas are lower in permeability than unaltered regions, which raises the possibility that contaminants might bypass the sorptive zeolites. The relationship between hydrologic and chemical properties must be understood to predict the transport of radionuclides through zeolitically altered areas. In this study, we incorporate mineralogical information into an unsaturated zone transport model using geostatistical techniques to correlate zeolitic abundance to hydrologic and chemical properties. Geostatistical methods are used to develop variograms, kriging maps, and conditional simulations of zeolitic abundance. We then investigate, using flow and transport modeling on a heterogeneous field, the relationship between percent zeolitic alteration, permeability changes due to alteration, sorption due to alteration, and their overall effect on radionuclide transport. We compare these geostatistical simulations to a simplified threshold method in which each spatial location in the model is assigned either zeolitic or vitric properties based on the zeolitic abundance at that location. A key conclusion is that retardation due to sorption predicted by using the continuous distribution is larger than the retardation predicted by the threshold method. The reason for larger retardation when using the continuous distribution is a small but significant sorption at locations with low zeolitic abundance. If, for practical reasons, models with homogeneous properties within each layer are used, we recommend setting nonzero K(d)s in the vitric tuffs to mimic the more rigorous continuous distribution simulations. Regions with high zeolitic abundance may not be as effective in retarding radionuclides such as neptunium, since these rocks are lower in permeability and contaminants can
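    The key conclusion can be illustrated with the linear-sorption retardation factor R = 1 + (rho_b / theta) * Kd: R exceeds 1 wherever Kd is nonzero, so even small Kd values at low-abundance locations contribute, which a threshold model that zeroes them out misses. All parameter values below are assumptions for illustration, not Yucca Mountain data:

```python
# Mean retardation under a continuous Kd distribution vs a threshold model.
# rho_b, theta, Kd scaling and the 50% cutoff are all assumed values.

RHO_B = 1.6    # bulk density, g/cm^3 (assumed)
THETA = 0.2    # volumetric water content (assumed)

def retardation(kd):
    return 1.0 + (RHO_B / THETA) * kd

# Continuous model: Kd scales with zeolitic abundance at each location.
abundance = [0.05, 0.10, 0.40, 0.80]          # fraction zeolite at 4 locations
kd_continuous = [0.5 * a for a in abundance]  # cm^3/g, proportional to abundance

# Threshold model: Kd = 0 below 50% abundance ("vitric"), fixed above.
kd_threshold = [0.0 if a < 0.5 else 0.4 for a in abundance]

mean_r_cont = sum(retardation(k) for k in kd_continuous) / len(abundance)
mean_r_thresh = sum(retardation(k) for k in kd_threshold) / len(abundance)
```

    The continuous model yields the larger mean retardation, mirroring the paper's recommendation to assign nonzero K(d)s to the vitric tuffs.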

  9. Ethical and legal issues in emergency research: barriers to conducting prospective randomized trials in an emergency setting.

    PubMed

    Morrison, C Anne; Horwitz, Irwin B; Carrick, Matthew M

    2009-11-01

    As in any area of medicine, clinical trials are crucial to the advancement of trauma care and the establishment of evidence-based guidelines. This work identifies consent regulations that impede advances in trauma resuscitation research and examines several ethical issues underlying current policies in the United States which regulate how clinical trials are conducted in an emergency setting. Trauma is a leading cause of mortality in the U.S. Minorities and those in low socioeconomic groups are subject to a disproportionate amount of traumatic injuries and have worse treatment outcomes than non-minority individuals. Current regulations guiding consent requirements in emergency research were enacted to protect such vulnerable populations from exploitation. Ironically, these same regulations also serve as barriers to clinical trials in trauma research, thus depriving these same vulnerable groups of the benefits of advances in trauma care. A literature review was conducted on areas affecting emergency medical research including: informed consent, socioeconomic and racial disparities, federal regulations in trauma research and biomedical ethics. In the ten-year period following the passage of the FDA's Common Rule (21 CFR 50.24) in 1995, 21 published emergency research studies were conducted under the waiver of informed consent. Misconceptions regarding federal regulations and cumbersome institutional review board approval processes are frequently cited as significant barriers to conducting prospective randomized trials in the emergency setting. Given the history of past abuses in medical research, the principle of maintaining autonomy of choice is of paramount importance. However, trauma resuscitation is unique in that patients are either unconscious or of limited mental capacity at the time treatment is required, and thus the standard of informed consent is unable to be achieved as in other areas of medicine. While this paradox was recognized by the FDA in 1995 with the

  10. Impact of counselling on exclusive breast-feeding practices in a poor urban setting in Kenya: a randomized controlled trial.

    PubMed

    Ochola, Sophie A; Labadarios, Demetre; Nduati, Ruth W

    2013-10-01

    To determine the impact of facility-based semi-intensive and home-based intensive counselling in improving exclusive breast-feeding (EBF) in a low-resource urban setting in Kenya. A cluster randomized controlled trial in which nine villages were assigned in a 1:1:1 ratio, by computer, to two intervention groups and a control group. The home-based intensive counselling group (HBICG) received seven counselling sessions at home by trained peers, one prenatally and six postnatally. The facility-based semi-intensive counselling group (FBSICG) received only one counselling session prenatally. The control group (CG) received no counselling from the research team. Information on infant feeding practices was collected monthly for 6 months after delivery. The data-gathering team was blinded to the intervention allocation. The outcome was EBF prevalence at 6 months. Kibera slum, Nairobi. A total of 360 HIV-negative women, 34-36 weeks pregnant, were selected from an antenatal clinic in Kibera; 120 per study group. Of the 360 women enrolled, 265 completed the study and were included in the analysis (CG n 89; FBSICG n 87; HBICG n 89). Analysis was by intention to treat. The prevalence of EBF at 6 months was 23.6% in HBICG, 9.2% in FBSICG and 5.6% in CG. HBICG mothers were four times more likely to practise EBF than those in the CG (adjusted relative risk = 4.01; 95% CI 2.30, 7.01; P=0.001). There was no significant difference between EBF rates in FBSICG and CG. EBF can be promoted in low socio-economic conditions using home-based intensive counselling. One session of facility-based counselling is not sufficient to sustain EBF.

  11. Comparing the applicability of some geostatistical methods to predict the spatial distribution of topsoil Calcium Carbonate in part of farmland of Zanjan Province

    NASA Astrophysics Data System (ADS)

    Sarmadian, Fereydoon; Keshavarzi, Ali

    2010-05-01

    Most soils in Iran are located in arid and semi-arid regions and have high pH (more than 7) and a high amount of calcium carbonate, which leads to their calcification. In calcareous soils, plant growth and production are difficult. Much of this problem relates to the high pH and high concentration of calcium ions, which cause fixation and unavailability of pH-dependent elements, especially phosphorus and some micronutrients such as Fe, Zn, Mn and Cu. Prediction of soil calcium carbonate in non-sampled areas and mapping of calcium carbonate variability for sustainable management of soil fertility are therefore very important. This research was carried out to evaluate and analyze the spatial variability of topsoil calcium carbonate as an aspect of soil fertility and plant nutrition, to compare geostatistical methods such as kriging and co-kriging, and to map topsoil calcium carbonate. For the geostatistical analysis, sampling was done using a stratified random method, and soil samples from 0 to 15 cm depth were collected with an auger at 23 locations. In the co-kriging method, salinity data were used as the auxiliary variable. For comparison and evaluation of the geostatistical methods, cross-validation with the RMSE statistic was used. The results showed that the co-kriging method had the highest correlation coefficient and the lowest RMSE, and was therefore more accurate than kriging for predicting calcium carbonate content in non-sampled areas.
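    The cross-validation used to rank the two methods holds out each sample in turn, predicts it from the rest, and compares RMSEs. For a self-contained sketch we substitute two simple interpolators (nearest-neighbour and inverse-distance weighting) for the kriging variants, on invented 1-D values, not the Zanjan samples:

```python
# Leave-one-out cross-validation RMSE for two simple spatial interpolators,
# standing in for the kriging/co-kriging comparison (toy 1-D data).
from math import sqrt

def nearest(x, xs, zs):
    """Predict with the value of the closest remaining sample."""
    return min(zip(xs, zs), key=lambda p: abs(p[0] - x))[1]

def idw(x, xs, zs, power=2):
    """Inverse-distance-weighted average of the remaining samples."""
    weights = [1.0 / abs(xi - x) ** power for xi in xs]
    return sum(w * z for w, z in zip(weights, zs)) / sum(weights)

def loo_rmse(predict, xs, zs):
    errs = []
    for i in range(len(xs)):
        rest_x = xs[:i] + xs[i + 1:]
        rest_z = zs[:i] + zs[i + 1:]
        errs.append(predict(xs[i], rest_x, rest_z) - zs[i])
    return sqrt(sum(e * e for e in errs) / len(errs))

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
zs = [12.0, 13.5, 15.0, 16.2, 18.1, 19.5]  # smoothly varying toy values

rmse_nn = loo_rmse(nearest, xs, zs)
rmse_idw = loo_rmse(idw, xs, zs)
```

    On smoothly varying data the distance-weighted interpolator achieves the lower cross-validation RMSE, the same criterion by which co-kriging outperformed kriging in the study.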

  12. Key Issues and Strategies for Recruitment and Implementation in Large-Scale Randomized Controlled Trial Studies in Afterschool Settings. Afterschool Research Brief. Issue No. 2

    ERIC Educational Resources Information Center

    Jones, Debra Hughes; Vaden-Kiernan, Michael; Rudo, Zena; Fitzgerald, Robert; Hartry, Ardice; Chambers, Bette; Smith, Dewi; Muller, Patricia; Moss, Marcey A.

    2008-01-01

    Under the larger scope of the National Partnership for Quality Afterschool Learning, SEDL funded three awardees to carry out large-scale randomized controlled trials (RCT) assessing the efficacy of promising literacy curricula in afterschool settings on student academic achievement. SEDL provided analytic and technical support to the RCT studies…

  13. [Spatial distribution of soil animals: a geostatistical approach].

    PubMed

    Gongal'skiĭ, K B; Zaĭtsev, A S; Savin, F A

    2009-01-01

    Spatial distribution is one of the main parameters of soil animal populations. Spatial soil ecology, which has been developing over recent decades, bases estimates of animal distribution on the geostatistical approach. The simple principle underlying its methodology is that samples placed close to each other are more similar than those placed far apart; this is usually called autocorrelation. The principles of basic statistics cannot be applied to autocorrelated data. Applying variograms, the Mantel test, Moran's index, and SADIE statistics makes it possible to reveal the size of clusters of both soil parameters and soil animal aggregations. This line of investigation, quite popular in the Western literature, is only rarely employed by Russian soil ecologists. Statistically correct procedures allow the development of field sampling methodology that is vital in applied studies of soil ecology, namely in bioindication and ecotoxicology of soils and in the assessment of biological resources in terms of abundance and biomass of soil animals. This methodology is of decisive importance in the development of soil biogeography.
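    Moran's index, one of the autocorrelation statistics mentioned above, is near +1 when neighbouring samples are similar (clustering) and near 0 under spatial randomness. A 1-D sketch treating adjacent transect samples as neighbours, with invented abundance values:

```python
# Moran's I for toy abundance counts along a transect.

def morans_i(values, weights):
    """weights[i][j] = 1 if i and j are neighbours, else 0 (symmetric, zero diagonal)."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    w_sum = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * dev[i] * dev[j] for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_sum) * (num / den)

# Five sample points; each is a neighbour of the adjacent ones.
vals = [2.0, 3.0, 8.0, 9.0, 10.0]
W = [[1 if abs(i - j) == 1 else 0 for j in range(5)] for i in range(5)]

i_stat = morans_i(vals, W)
```

    The clearly positive value reflects the clustering of low and high counts, the kind of aggregation these statistics are used to detect in soil fauna.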

  14. Bayesian geostatistics in health cartography: the perspective of malaria.

    PubMed

    Patil, Anand P; Gething, Peter W; Piel, Frédéric B; Hay, Simon I

    2011-06-01

    Maps of parasite prevalences and other aspects of infectious diseases that vary in space are widely used in parasitology. However, spatial parasitological datasets rarely, if ever, have sufficient coverage to allow exact determination of such maps. Bayesian geostatistics (BG) is a method for finding a large sample of maps that can explain a dataset, in which maps that do a better job of explaining the data are more likely to be represented. This sample represents the knowledge that the analyst has gained from the data about the unknown true map. BG provides a conceptually simple way to convert these samples to predictions of features of the unknown map, for example regional averages. These predictions account for each map in the sample, yielding an appropriate level of predictive precision.
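    The conversion from a sample of maps to a prediction works by computing the feature of interest (here a regional average) on every sampled map and summarizing the resulting distribution. A toy sketch in which four invented "maps" stand in for posterior samples:

```python
# Predicting a regional average from a sample of maps: compute the feature
# on each sampled map, then summarize. Values are invented, not malaria data.

# Each "map" gives prevalence at three pixels of a region.
map_samples = [
    [0.10, 0.20, 0.30],
    [0.12, 0.22, 0.32],
    [0.08, 0.18, 0.28],
    [0.10, 0.24, 0.26],
]

regional_means = [sum(m) / len(m) for m in map_samples]
point_prediction = sum(regional_means) / len(regional_means)
spread = max(regional_means) - min(regional_means)  # crude predictive spread
```

    Because every map in the sample contributes, the spread of the per-map features directly expresses the predictive precision the abstract refers to.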

  15. Interactive Declustering of Spatial Environmental Data for Geostatistical Analyses

    SciTech Connect

    Christopher P. Oertel; John R. Giles; Stanley M. Miller

    2009-01-01

    Clustered sampling often results during environmental site investigations when localized areas are over-sampled due to specific concerns in those areas or due to monitoring programs focused on specific zones. An interactive, nearest-neighbor method for efficient spatial declustering has been developed as part of the ongoing monitoring and assessment of Cs-137 concentrations in soils at the Idaho National Laboratory (INL) site. In situ field measurements of Cs-137 have been obtained with a field gamma-ray spectrometer using a sampling layout with points concentrated around several former nuclear processing facilities. The spatial declustering process allows for more useful and productive geostatistical studies focused on characterization of spatial dependence and subsequent spatial estimation to generate maps that depict Cs-137 concentrations across the site.
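    A minimal nearest-neighbour declustering sketch (an illustration of the general idea, not the INL implementation): weight each sample by the distance to its nearest neighbour, so points in an over-sampled cluster receive small weights and the declustered mean is pulled toward the sparsely sampled background.

```python
# Nearest-neighbour declustering weights for 1-D sample locations (toy data).

def nn_decluster_weights(xs):
    dists = []
    for i, x in enumerate(xs):
        dists.append(min(abs(x - y) for j, y in enumerate(xs) if j != i))
    total = sum(dists)
    return [d / total for d in dists]

# Three tightly clustered high readings near a facility, two background points.
locs = [0.0, 0.1, 0.2, 5.0, 10.0]
vals = [9.0, 9.5, 9.2, 2.0, 1.0]

w = nn_decluster_weights(locs)
naive_mean = sum(vals) / len(vals)
declustered_mean = sum(wi * vi for wi, vi in zip(w, vals))
```

    The naive mean is dominated by the cluster of high readings; the declustered mean down-weights them, which is the bias correction that makes subsequent variography and estimation more representative.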

  16. Bayesian geostatistics in health cartography: the perspective of malaria

    PubMed Central

    Patil, Anand P.; Gething, Peter W.; Piel, Frédéric B.; Hay, Simon I.

    2011-01-01

    Maps of parasite prevalences and other aspects of infectious diseases that vary in space are widely used in parasitology. However, spatial parasitological datasets rarely, if ever, have sufficient coverage to allow exact determination of such maps. Bayesian geostatistics (BG) is a method for finding a large sample of maps that can explain a dataset, in which maps that do a better job of explaining the data are more likely to be represented. This sample represents the knowledge that the analyst has gained from the data about the unknown true map. BG provides a conceptually simple way to convert these samples to predictions of features of the unknown map, for example regional averages. These predictions account for each map in the sample, yielding an appropriate level of predictive precision. PMID:21420361

  17. Medical Geography: a Promising Field of Application for Geostatistics.

    PubMed

    Goovaerts, P

    2009-01-01

    The analysis of health data and putative covariates, such as environmental, socio-economic, behavioral or demographic factors, is a promising application for geostatistics. It presents, however, several methodological challenges that arise from the fact that data are typically aggregated over irregular spatial supports and consist of a numerator and a denominator (i.e. population size). This paper presents an overview of recent developments in the field of health geostatistics, with an emphasis on three main steps in the analysis of areal health data: estimation of the underlying disease risk, detection of areas with significantly higher risk, and analysis of relationships with putative risk factors. The analysis is illustrated using age-adjusted cervix cancer mortality rates recorded over the 1970-1994 period for 118 counties of four states in the Western USA. Poisson kriging allows the filtering of noisy mortality rates computed from small population sizes, enhancing the correlation with two putative explanatory variables: percentage of inhabitants living below the federally defined poverty line, and percentage of Hispanic females. Area-to-point kriging formulation creates continuous maps of mortality risk, reducing the visual bias associated with the interpretation of choropleth maps. Stochastic simulation is used to generate realizations of cancer mortality maps, which allows one to quantify numerically how the uncertainty about the spatial distribution of health outcomes translates into uncertainty about the location of clusters of high values or the correlation with covariates. Last, geographically-weighted regression highlights the non-stationarity in the explanatory power of covariates: the higher mortality values along the coast are better explained by the two covariates than the lower risk recorded in Utah.

  18. Performance prediction using geostatistics and window reservoir simulation

    SciTech Connect

    Fontanilla, J.P.; Al-Khalawi, A.A.; Johnson, S.G.

    1995-11-01

    This paper is the first window model study in the northern area of a large carbonate reservoir in Saudi Arabia. It describes window reservoir simulation with geostatistics to model uneven water encroachment in the southwest producing area of the northern portion of the reservoir. In addition, this paper describes performance predictions that investigate the sweep efficiency of the current peripheral waterflood. A 50 x 50 x 549 (240 m. x 260 m. x 0.15 m. average grid block size) geological model was constructed with geostatistics software. Conditional simulation was used to obtain spatial distributions of porosity and volume of dolomite. Core data transforms were used to obtain horizontal and vertical permeability distributions. Simple averaging techniques were used to convert the 549-layer geological model to a 50 x 50 x 10 (240 m. x 260 m. x 8 m. average grid block size) window reservoir simulation model. Flux injectors and flux producers were assigned to the outermost grid blocks. Historical boundary flux rates were obtained from a coarsely-gridded full-field model. Pressure distribution, water cuts, GORs, and recent flowmeter data were history matched. Permeability correction factors and numerous parameter adjustments were required to obtain the final history match. The permeability correction factors were based on pressure transient permeability-thickness analyses. The prediction phase of the study evaluated the effects of infill drilling, the use of artificial lifts, workovers, horizontal wells, producing rate constraints, and tight zone development to formulate depletion strategies for the development of this area. The window model will also be used to investigate day-to-day reservoir management problems in this area.

  19. Spatial evaluation of the risk of groundwater quality degradation. A comparison between disjunctive kriging and geostatistical simulation.

    PubMed

    Barca, E; Passarella, G

    2008-02-01

    In some previous papers a probabilistic methodology was introduced to estimate a spatial index of risk of groundwater quality degradation, defined as the conditional probability of exceeding assigned thresholds of concentration of a generic chemical sampled in the studied water system. A crucial stage of this methodology was the use of geostatistical techniques to provide an estimation of the above-mentioned probability in a number of selected points by crossing spatial and temporal information. In this work, spatial risk values were obtained using alternatively stochastic conditional simulation and disjunctive kriging (DK). A comparison between the resulting two sets of spatial risks, based on global and local statistical tests, showed that they do not come from the same statistical population and, consequently, they cannot be viewed as equivalent in a statistical sense. At first glance, geostatistical conditional simulation may appear to represent the spatial variability of the phenomenon more effectively, as the latter tends to be smoothed by DK. However, a close examination of real case study results suggests that disjunctive kriging is more effective than simulation in estimating the spatial risk of groundwater quality degradation. In the study case, the potentially 'harmful event' considered, threatening a natural 'vulnerable groundwater system,' is fertilizer and manure application.

  20. Using rank-order geostatistics for spatial interpolation of highly skewed data in a heavy-metal contaminated site.

    PubMed

    Juang, K W; Lee, D Y; Ellsworth, T R

    2001-01-01

    The spatial distribution of a pollutant in contaminated soils is usually highly skewed. As a result, the sample variogram often differs considerably from its regional counterpart and the geostatistical interpolation is hindered. In this study, rank-order geostatistics with standardized rank transformation was used for the spatial interpolation of pollutants with a highly skewed distribution in contaminated soils when commonly used nonlinear methods, such as logarithmic and normal-scored transformations, are not suitable. A real data set of soil Cd concentrations with great variation and high skewness in a contaminated site of Taiwan was used for illustration. The spatial dependence of ranks transformed from Cd concentrations was identified and kriging estimation was readily performed in the standardized-rank space. The estimated standardized rank was back-transformed into the concentration space using the middle point model within a standardized-rank interval of the empirical distribution function (EDF). The spatial distribution of Cd concentrations was then obtained. The probability of Cd concentration being higher than a given cutoff value also can be estimated by using the estimated distribution of standardized ranks. The contour maps of Cd concentrations and the probabilities of Cd concentrations being higher than the cutoff value can be simultaneously used for delineation of hazardous areas of contaminated soils.
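    The rank-order workflow described above can be shown in miniature: transform the skewed values to standardized ranks in (0, 1], interpolate in rank space, then back-transform an estimated rank to a concentration via the midpoint of the matching interval of the empirical distribution function. The data below are invented, not the Taiwan survey:

```python
# Standardized-rank transform and EDF-midpoint back-transform (toy data).

def standardized_ranks(values):
    """Rank each value (1..n) and divide by n, giving ranks in (0, 1]."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    for r, i in enumerate(order, start=1):
        ranks[i] = r / len(values)
    return ranks

def back_transform(est_rank, values):
    """Map an estimated standardized rank to the midpoint of its EDF interval
    (ranks below 2/n map to the first interval in this simple sketch)."""
    srt = sorted(values)
    n = len(srt)
    for k in range(1, n):
        if est_rank <= (k + 1) / n:   # falls between ranks k/n and (k+1)/n
            return 0.5 * (srt[k - 1] + srt[k])
    return srt[-1]

cd = [0.3, 0.5, 0.8, 1.2, 25.0]      # highly skewed: one extreme value
ranks = standardized_ranks(cd)

# Suppose interpolation in rank space yields a standardized rank of 0.5 at
# an unsampled point; back-transforming it is unaffected by the outlier.
estimate = back_transform(0.5, cd)
```

    Because the interpolation happens in rank space, the extreme value 25.0 cannot inflate estimates at unsampled points, which is the motivation for the method on highly skewed Cd data.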

  1. Spatial distribution of Biomphalaria mollusks at São Francisco River Basin, Minas Gerais, Brazil, using geostatistical procedures.

    PubMed

    Guimarães, Ricardo J P S; Freitas, Corina C; Dutra, Luciano V; Felgueiras, Carlos A; Moura, Ana C M; Amaral, Ronaldo S; Drummond, Sandra C; Scholte, Ronaldo G C; Oliveira, Guilherme; Carvalho, Omar S

    2009-03-01

    Geostatistics is used in this work to make inferences about the presence of species of Biomphalaria (B. glabrata, B. tenagophila and/or B. straminea), intermediate hosts of Schistosoma mansoni, in the São Francisco River Basin, Minas Gerais, Brazil. One of these geostatistical procedures, known as indicator kriging, allows the classification of categorical data in areas where data are not available, using a point sample set. The result is a map of species and risk-area definition. Beyond a single map of the categorical attribute, the procedure also quantifies the uncertainties of the stochastic model, which can be used to qualify the inferences. To validate the estimated data of the risk map, fieldwork in five municipalities was carried out. The results showed that indicator kriging is a rather robust tool, since it presented very good agreement with the field findings. The risk map can be thought of as an auxiliary tool to formulate proper public health strategies and to guide further fieldwork, targeting the places with the highest occurrence probability of the most important snail species. The risk map will also enable better resource distribution and adequate policies for mollusk control. This methodology will be applied to other river basins to generate a predictive map of Biomphalaria species distribution for the entire state of Minas Gerais.
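Indicator kriging of a categorical attribute starts by coding each category as a binary indicator, each of which is then kriged separately to yield an occurrence probability. A minimal sketch of the coding step (species labels are illustrative, not the study's data):

```python
import numpy as np

# Hypothetical species observed at four sampled points
species = np.array(["glabrata", "tenagophila", "glabrata", "straminea"])
categories = ["glabrata", "tenagophila", "straminea"]

# One binary indicator per category: 1 where that category was observed.
# Kriging each indicator field gives, at unsampled points, an estimate of
# the probability of that category, plus a kriging variance (uncertainty).
indicators = {c: (species == c).astype(float) for c in categories}
```

The final categorical map assigns each grid node the category with the highest kriged indicator value, and the associated uncertainty can qualify the inference, as in the abstract.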

  2. Revised Geostatistical Analysis of the Inventory of Carbon Tetrachloride in the Unconfined Aquifer in the 200 West Area of the Hanford Site

    SciTech Connect

    Murray, Christopher J.; Bott, Yi-Ju

    2008-12-30

    This report provides an updated estimate of the inventory of carbon tetrachloride (CTET) in the unconfined aquifer in the 200 West Area of the Hanford Site. The contaminant plumes of interest extend within the 200-ZP-1 and 200-UP-1 operable units. CH2M HILL Plateau Remediation Company (CHPRC) currently is preparing a plan identifying locations for groundwater extraction wells, injection wells, transfer stations, and one or more treatment facilities to address contaminants of concern identified in the 200-ZP-1 CERCLA Record of Decision. To accomplish this, a current understanding of the inventory of CTET is needed throughout the unconfined aquifer in the 200 West Area. Pacific Northwest National Laboratory (PNNL) previously developed an estimate of the CTET inventory in the area using a Monte Carlo approach based on geostatistical simulation of the three-dimensional (3D) distribution of CTET and chloroform in the aquifer. Fluor Hanford, Inc. (FH) (the previous site contractor) requested PNNL to update that inventory estimate using as input a set of geostatistical realizations of CTET and chloroform recently created for a related but separate project, referred to as the mapping project. The scope of work for the inventory revision complemented the scope of work for the mapping project, performed for FH by PNNL. This report briefly describes the spatial and univariate distribution of the CTET and chloroform data, along with the results of the geostatistical analysis and simulation performed for the mapping project.

  3. Combining geostatistics with Moran's I analysis for mapping soil heavy metals in Beijing, China.

    PubMed

    Huo, Xiao-Ni; Li, Hong; Sun, Dan-Feng; Zhou, Lian-Di; Li, Bao-Guo

    2012-03-01

    Production of high-quality interpolation maps of heavy metals is important for risk assessment of environmental pollution. In this paper, spatial correlation information obtained from Moran's I analysis was used to supplement traditional geostatistics. From the Moran's I analysis, four characteristic distances were obtained and used as the active lag distance for calculating the semivariance. Validation showed that using as the active lag distance the two distances at which Moran's I and the standardized Moran's I, Z(I), reach their maxima improves the fitting accuracy of the semivariance. Spatial interpolation was then produced based on the two distances and their nested model. Comparative analysis of estimation accuracy, and of the measured and predicted pollution status, showed that the method combining geostatistics with Moran's I analysis outperformed traditional geostatistics. Thus, Moran's I analysis is a useful complement to geostatistics for improving the spatial interpolation accuracy of heavy metals.
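Global Moran's I, the statistic scanned over lag distances above, is I = (n/W) · Σᵢⱼ wᵢⱼ(xᵢ−x̄)(xⱼ−x̄) / Σᵢ(xᵢ−x̄)², where W is the sum of the spatial weights. A minimal sketch with a toy rook-neighbour weight matrix (not the paper's data or weighting scheme):

```python
import numpy as np

def morans_i(x, w):
    """Global Moran's I for values x with spatial weight matrix w
    (w[i, j] > 0 when sites i and j are neighbours, zero diagonal)."""
    x = np.asarray(x, dtype=float)
    w = np.asarray(w, dtype=float)
    z = x - x.mean()
    n = len(x)
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

# Four sites on a line, adjacent sites as neighbours, autocorrelated values
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
I = morans_i([1.0, 2.0, 3.0, 4.0], w)
```

A positive I (here 1/3) indicates spatial clustering of similar values; repeating the computation with distance-band weights at increasing lags is what yields the characteristic distances used in the paper.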

  4. Radon risk mapping in southern Belgium: an application of geostatistical and GIS techniques.

    PubMed

    Zh, H C; Charlet, J M; Poffijn, A

    2001-05-14

    A data set of long-term radon measurements in approximately 2200 houses in southern Belgium has been collected in an ongoing national radon survey. The spatial variation of indoor Rn concentrations is modelled by variograms. A radon distribution map is produced using the log-normal kriging technique. A GIS is used to digitise, process and integrate a variety of data, including geological maps, Rn concentrations associated with house locations, an administrative map, etc. It also allows evaluation of the relationships between the various spatial data sets with the goal of producing radon risk maps. Based on geostatistical mapping and spatial analysis, we define three categories of risk areas: high-, medium- and low-risk. The correlation between radon concentrations and geological features is demonstrated in this study. High- and medium-risk Rn zones are predominantly situated in bedrock from the Cambrian to the Lower Devonian, although a few medium-risk zones lie within the Jurassic. It is evident that high-risk zones are related to a strongly folded and fractured context.

  5. Geostatistical analysis of morphometric features in the selected parts of the Sudetes (SW Poland)

    NASA Astrophysics Data System (ADS)

    Pawlowski, Lukasz; Szymanowski, Mariusz; Migon, Piotr

    2017-04-01

    Recent years have brought rapid development of quantitative techniques that are successfully applied in geomorphology. They open up new interpretation possibilities, even in seemingly very well recognized areas. In particular, geomorphometric and geostatistical techniques, integrated in Geographic Information Systems, allow one to look at the spatial pattern of landforms and process signatures from a new perspective. The morphology of the Sudetes, as of other mountain ranges in central Europe, is the result of protracted interaction of tectonic and surface processes, passive geological factors such as lithology and structure, and the passage of time. This raises the question whether, and to what extent, these different controls and signals have resulted in similarities or differences in the morphometric structure of different parts of the same mountain range. In this paper we assume that geomorphic signals of various origins are expressed by a set of primary and secondary topographic attributes, which can be further analyzed as regionalized variables and modelled using geostatistical methods. Special attention is paid to variogram modelling. This method allows the identification of the spatial structure of the morphometric characteristics, and of its spatial scale and direction, reflected in quantitative parameters of variograms (model function, range, sill, nugget, anisotropy). These parameters are compared across areas to find (dis-)similarities between different parts of the Sudetes. Thus, the main goals of the paper are: 1. To evaluate the usefulness of variogram modelling of topographic attributes for quantifying the spatial morphometric structure of mountain areas, using the example of medium-altitude, non-glaciated mountain terrain. 2. To compare different parts of the Sudetes to find similarities and differences between them and to interpret the findings through examination of the geology and geomorphology of the region. 
The analysis

  6. Analysis of alluvial hydrostratigraphy using indicator geostatistics, with examples from Santa Clara Valley, California

    SciTech Connect

    1995-03-01

    Current trends in hydrogeology seek to enlist sedimentary concepts in the interpretation of permeability structures. However, existing conceptual models of alluvial deposition tend to inadequately account for the heterogeneity caused by complex sedimentological and external factors. This dissertation presents three analyses of alluvial hydrostratigraphy using indicator geostatistics. This approach empirically acknowledges both the random and structured qualities of alluvial structures at scales relevant to site investigations. The first analysis introduces the indicator approach, whereby binary values are assigned to borehole-log intervals on the basis of inferred relative permeability; it presents a case study of indicator variography at a well-documented ground-water contamination site, and uses indicator kriging to interpolate an aquifer-aquitard sequence in three dimensions. The second analysis develops an alluvial-architecture context for interpreting semivariograms, and performs comparative variography for a suite of alluvial sites in Santa Clara Valley, California. The third analysis investigates the use of a water well perforation indicator for assessing large-scale hydrostratigraphic structures within relatively deep production zones.
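The indicator approach described above reduces a borehole log to a binary series (e.g., 1 = inferred high permeability), whose experimental semivariogram then quantifies the spatial persistence of aquifer versus aquitard material. A minimal 1-D sketch with a toy log (illustrative values and cutoff only):

```python
import numpy as np

def indicator_variogram(z, cutoff, lags):
    """Experimental indicator semivariogram along a regularly sampled
    transect: gamma(h) = 0.5 * mean[(i(x) - i(x+h))^2]."""
    i = (np.asarray(z) > cutoff).astype(float)   # 1 = coarse/permeable
    return [0.5 * np.mean((i[h:] - i[:-h]) ** 2) for h in lags]

# Toy log: alternating fine/coarse intervals (relative-permeability proxy)
log = [1, 1, 5, 6, 1, 2, 7, 8, 1, 1]
gam = indicator_variogram(log, cutoff=3, lags=[1, 2])
```

Low semivariance at short lags reflects the vertical continuity of hydrofacies; fitting a model to such curves is the "indicator variography" step preceding 3-D indicator kriging.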

  7. [Geostatistical analysis of the causes of the ring-shaped distribution of plant communities in Horqin Sandy Land].

    PubMed

    He, Xingdong; Gao, Yubao; Zhao, Wenzhi; Cong, Zili

    2004-09-01

    Field investigation showed that plant communities on a few large fixed dunes in the middle part of Horqin Sandy Land form typical concentric ring distribution patterns along the habitat gradient from dune top and slope to interdune lowland. To explain this phenomenon, the water content of the sand layers and its spatial heterogeneity were analyzed at different positions on the dunes. Water contents in the sand layers of the dune tops were lower than those of the slopes, and both were lower than those of the interdunes. According to the geostatistical analysis, for both shifting and fixed dunes the spatial heterogeneity of sand-layer water content changed regularly: the nugget-to-sill ratios and the ranges decreased gradually, and the fractal dimension increased gradually, from the top through the slope to the interdune. These regular changes indicate that random spatial heterogeneity decreased and autocorrelated spatial heterogeneity increased along this gradient. The regular changes in sand-layer water content and its spatial heterogeneity across dune positions may therefore be an important cause of the concentric ring patterns of the plant communities on these fixed dunes.

  8. Geostatistical analysis of allele presence patterns among American black bears in eastern North Carolina

    USGS Publications Warehouse

    Thompson, L.M.; Van Manen, F.T.; King, T.L.

    2005-01-01

    Highways are one of the leading causes of wildlife habitat fragmentation and may particularly affect wide-ranging species, such as American black bears (Ursus americanus). We initiated a research project in 2000 to determine potential effects of a 4-lane highway on black bear ecology in Washington County, North Carolina. The research design included a treatment area (highway construction) and a control area, and a pre- and post-construction phase. We used data from the pre-construction phase to determine whether we could detect scale dependency or directionality among allele occurrence patterns using geostatistics. Detection of such patterns could provide a powerful tool to measure the effects of landscape fragmentation on gene flow. We sampled DNA from roots of black bear hair at 70 hair-sampling sites on each study area for 7 weeks during fall of 2000. We used microsatellite analysis based on 10 loci to determine unique multi-locus genotypes. We examined all alleles sampled at ≥25 sites on each study area and mapped their presence or absence at each hair-sample site. We calculated semivariograms, which measure the strength of statistical correlation as a function of distance, and adjusted them for anisotropy to determine the maximum direction of spatial continuity. We then calculated the mean direction of spatial continuity for all examined alleles. The mean direction of allele frequency variation was 118.3° (SE = 8.5) on the treatment area and 172.3° (SE = 6.0) on the control area. Rayleigh's tests showed that these directions differed from random distributions (P = 0.028 and P < 0.001, respectively), indicating consistent directional patterns for the alleles we examined in each area. Despite the small spatial scale of our study (approximately 11,000 ha for each study area), we observed distinct and consistent patterns of allele occurrence, suggesting different directions of gene flow between the study areas. These directions seemed to coincide with the

  9. Efficient Geostatistical Inversion under Transient Flow Conditions in Heterogeneous Porous Media

    NASA Astrophysics Data System (ADS)

    Klein, Ole; Cirpka, Olaf A.; Bastian, Peter; Ippisch, Olaf

    2014-05-01

    a reasonable range. Transient inversion, however, requires time series of measurements and therefore typically leads to a large number of observations, and under these circumstances the existing methods become unfeasible. We present an extension of the existing inversion methods to transient flow regimes. Our approach uses a Conjugate Gradients scheme preconditioned with the prior covariance matrix Q_YY to avoid both multiplications with Q_YY^-1 and the explicit assembly of H_z. Instead, one combined adjoint model run is used for all observations at once. As the computing time of our approach is largely independent of the number of measurements used for inversion, the presented method can be applied to large data sets. This facilitates the treatment of applications with variable boundary conditions (nearby rivers, precipitation). We integrate the geostatistical inversion method into the software framework DUNE, enabling the use of high-performance-computing techniques and full parallelization. The feasibility of our approach is demonstrated through the joint inversion of several synthetic data sets in two and three dimensions, e.g. estimation of hydraulic conductivity using hydraulic head values and tracer concentrations, and the scalability of the new method is analyzed. A comparison with existing geostatistical inversion approaches highlights its advantages and drawbacks and demonstrates scenarios in which our scheme can be beneficial.
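The key trick, preconditioning Conjugate Gradients with the prior covariance so that only products with Q_YY (never its inverse) are needed, can be illustrated on a toy symmetric positive definite system. This is a generic PCG sketch, not the DUNE implementation; the matrices are illustrative:

```python
import numpy as np

def pcg(A, b, M, tol=1e-10, maxit=200):
    """Conjugate gradients with the preconditioner applied as a plain
    matrix-vector product M @ r (here M plays the role of Q_YY, so no
    inverse of the prior covariance is ever formed)."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M @ r
    p = z.copy()
    for _ in range(maxit):
        Ap = A @ p
        alpha = (r @ z) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            return x
        z_new = M @ r_new
        beta = (r_new @ z_new) / (r @ z)
        p = z_new + beta * p
        r, z = r_new, z_new
    return x

# Toy SPD system standing in for the inversion's linear solve
Q = np.array([[2.0, 0.5], [0.5, 1.0]])     # "prior covariance"
A = np.linalg.inv(Q) + np.eye(2)           # SPD system matrix
b = np.array([1.0, 2.0])
s = pcg(A, b, M=Q)
```

In the real method, `A @ p` is evaluated matrix-free via forward and adjoint model runs, which is what makes the cost nearly independent of the number of observations.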

  10. Geostatistical noise filtering of geophysical images : application to unexploded ordnance (UXO) sites.

    SciTech Connect

    Saito, Hirotaka; McKenna, Sean Andrew; Coburn, Timothy C.

    2004-07-01

    Geostatistical and non-geostatistical noise filtering methodologies, factorial kriging and a low-pass filter, and a region growing method are applied to analytic signal magnetometer images at two UXO contaminated sites to delineate UXO target areas. Overall delineation performance is improved by removing background noise. Factorial kriging slightly outperforms the low-pass filter but there is no distinct difference between them in terms of finding anomalies of interest.
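The non-geostatistical baseline in this comparison, a low-pass filter, amounts to local averaging that suppresses high-frequency background noise. A minimal moving-average sketch (a generic stand-in, not the study's filter kernel):

```python
import numpy as np

def low_pass(img, k=3):
    """Simple k-by-k moving-average low-pass filter with edge padding --
    one basic non-geostatistical alternative to factorial kriging."""
    img = np.asarray(img, dtype=float)
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    for r in range(img.shape[0]):
        for c in range(img.shape[1]):
            out[r, c] = padded[r:r + k, c:c + k].mean()
    return out

# A single-pixel spike (noise) is spread and damped by the filter
noisy = np.array([[0.0, 0.0, 0.0],
                  [0.0, 9.0, 0.0],
                  [0.0, 0.0, 0.0]])
smooth = low_pass(noisy)
```

Factorial kriging instead decomposes the variogram into nested structures and removes only the component attributed to noise, which is why it can outperform a blind moving average.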

  11. Analysis of dengue fever risk using geostatistics model in bone regency

    NASA Astrophysics Data System (ADS)

    Amran, Stang, Mallongi, Anwar

    2017-03-01

    This research aims to analyze dengue fever risk based on a geostatistics model in Bone Regency. Risk levels of dengue fever are denoted by the parameter of a Binomial distribution. The effects of temperature, rainfall, elevation, and larvae abundance are investigated through the geostatistics model, with a Bayesian hierarchical method used in the estimation process. Using dengue fever data from eleven locations, this research shows that temperature and rainfall have a significant effect on dengue fever risk in Bone Regency.

  12. A Parenting Intervention for Childhood Behavioral Problems: A Randomized Controlled Trial in Disadvantaged Community-Based Settings

    ERIC Educational Resources Information Center

    McGilloway, Sinead; Mhaille, Grainne Ni; Bywater, Tracey; Furlong, Mairead; Leckey, Yvonne; Kelly, Paul; Comiskey, Catherine; Donnelly, Michael

    2012-01-01

    Objective: A community-based randomized controlled trial (RCT) was conducted in urban areas characterized by high levels of disadvantage to test the effectiveness of the Incredible Years BASIC parent training program (IYBP) for children with behavioral problems. Potential moderators of intervention effects on child behavioral outcomes were also…

  13. Optimized Field Sampling and Monitoring of Airborne Hazardous Transport Plumes; A Geostatistical Simulation Approach

    SciTech Connect

    Chen, DI-WEN

    2001-11-21

    Airborne hazardous plumes inadvertently released during nuclear/chemical/biological incidents are mostly of unknown composition and concentration until measurements are taken of post-accident ground concentrations from plume-ground deposition of constituents. Unfortunately, measurements often come days post-incident and rely on hazardous manned air-vehicle measurements. Before this happens, computational plume migration models are the only source of information on the plume characteristics, constituents, concentrations, directions of travel, ground deposition, etc. A mobile ''lighter than air'' (LTA) system is being developed at Oak Ridge National Laboratory that will be part of the first response in emergency conditions. These interactive and remote unmanned air vehicles will carry light-weight detectors and weather instrumentation to measure the conditions during and after plume release. This requires a cooperative, computationally organized, GPS-controlled set of LTAs that self-coordinate around the objectives in an emergency situation in restricted time frames. A critical step before an optimum and cost-effective field sampling and monitoring program proceeds is the reliable and expeditious collection of data that provide statistically significant information. Efficient aerial arrangements of the detectors taking the data (for active airborne release conditions) are necessary for plume identification, computational 3-dimensional reconstruction, and source distribution functions. This report describes the application of stochastic or geostatistical simulations to delineate the plume for guiding subsequent sampling and monitoring designs. A case study is presented of building digital plume images, based on existing ''hard'' experimental data and ''soft'' preliminary transport modeling results of Prairie Grass Trials Site. Markov Bayes Simulation, a coupled Bayesian/geostatistical methodology, quantitatively combines soft information

  14. Geostatistical prediction of flow-duration curves in an index-flow framework

    NASA Astrophysics Data System (ADS)

    Pugliese, Alessio; Castellarin, Attilio; Brath, Armando

    2014-05-01

    An empirical period-of-record Flow-Duration Curve (FDC) describes the percentage of time (duration) in which a given streamflow was equaled or exceeded over a historical period of time. FDCs have always attracted a great deal of interest in engineering applications because of their ability to provide a simple yet comprehensive graphical view of the overall historical variability of streamflows in a river basin, from floods to low flows. Nevertheless, in many practical applications one has to construct FDCs in basins that are ungauged or where very few observations are available. In this study we present an application strategy of Topological kriging (or Top-kriging), which makes the geostatistical procedure capable of predicting FDCs in ungauged catchments. Previous applications of Top-kriging mainly focused on the prediction of point streamflow indices (e.g. flood quantiles, low-flow indices, etc.). In this study Top-kriging is used to predict FDCs in ungauged sites as a weighted average of standardised empirical FDCs through the traditional linear-weighting scheme of kriging methods. Our study focuses on the prediction of FDCs for 18 unregulated catchments located in Central Italy, for which daily streamflow series with lengths from 5 to 40 years are available, together with information on climate referring to the same time-span as each daily streamflow sequence. Empirical FDCs are standardised by a reference index-flow value (i.e. mean annual flow, or mean annual precipitation times the catchment drainage area) and the overall deviation of the curves from this reference value is then used for expressing the hydrological similarity between catchments and for deriving the geostatistical weights. We performed an extensive leave-one-out cross-validation to quantify the accuracy of the proposed technique, and to compare it to traditional regionalisation models that were recently developed for the same study region. The cross-validation points
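Constructing the standardised empirical FDCs that Top-kriging averages is a simple sort-and-rank operation. A minimal sketch using the Weibull plotting position and the mean annual flow as index flow (toy values; the paper's plotting-position convention may differ):

```python
import numpy as np

def empirical_fdc(q):
    """Empirical flow-duration curve: flows sorted in descending order
    paired with exceedance durations d_i = i / (n + 1)."""
    q = np.sort(np.asarray(q, dtype=float))[::-1]    # descending flows
    d = np.arange(1, len(q) + 1) / (len(q) + 1)      # exceedance duration
    return d, q

# Standardise by the mean annual flow, as in the index-flow framework
flows = np.array([10.0, 2.0, 4.0, 8.0, 6.0])
d, q = empirical_fdc(flows)
q_std = q / flows.mean()
```

In the regionalisation step, the dimensionless curves `q_std` from gauged catchments are combined with kriging weights, and the result is rescaled by the estimated index flow of the ungauged site.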

  15. Estimating transmissivity in the Edwards Aquifer using upscaling, geostatistics, and Bayesian updating

    NASA Astrophysics Data System (ADS)

    Painter, S. L.; Jiang, Y.; Woodbury, A. D.

    2002-12-01

    The Edwards Aquifer, a highly heterogeneous karst aquifer located in south central Texas, is the sole source of drinking water for more than one million people. Hydraulic conductivity (K) measurements in the Edwards Aquifer are sparse, highly variable (log-K variance of 6.4), and are mostly from single-well drawdown tests that are appropriate for a spatial scale of a few meters. To support ongoing efforts to develop a groundwater management (MODFLOW) model of the San Antonio segment of the Edwards Aquifer, a multistep procedure was developed to assign hydraulic parameters to the 402 m x 402 m computational cells intended for the management model. The approach used a combination of nonparametric geostatistical analysis, stochastic simulation, numerical upscaling, and automatic model calibration based on Bayesian updating [1,2]. Indicator correlograms reveal a nested spatial structure in the well-test K of the confined zone, with practical correlation ranges of 3,600 and 15,000 meters and a large nugget effect. The fitted geostatistical model was used in unconditional stochastic simulations by the sequential indicator simulation method. The resulting realizations of K, defined at the scale of the well tests, were then numerically upscaled to the block scale. A new geostatistical model was fitted to the upscaled values. The upscaled model was then used to cokrige the block-scale K based on the well-test K. The resulting K map was then converted to transmissivity (T) using deterministically mapped aquifer thickness. When tested in a forward groundwater model, the upscaled T reproduced hydraulic heads better than a simple kriging of the well-test values (mean error of -3.9 meters and mean absolute error of 12 meters, as compared with -13 and 17 meters for the simple kriging). As the final step in the study, the upscaled T map was used as the prior distribution in an inverse procedure based on Bayesian updating [1,2]. When input to the forward groundwater model, the
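The upscaling step maps fine-scale K realizations onto coarse model blocks. As a simple stand-in for the numerical upscaling used in the study, the geometric mean (exact for lognormal K in some idealized flow configurations) can be sketched as follows (toy values):

```python
import numpy as np

def upscale_geometric(K, block):
    """Upscale a fine-grid conductivity field to coarse blocks using the
    geometric mean -- a common first approximation to block-effective K."""
    K = np.asarray(K, dtype=float)
    nr, nc = K.shape[0] // block, K.shape[1] // block
    out = np.empty((nr, nc))
    for i in range(nr):
        for j in range(nc):
            cell = K[i*block:(i+1)*block, j*block:(j+1)*block]
            out[i, j] = np.exp(np.log(cell).mean())   # geometric mean
    return out

# One 2x2 fine block with strongly contrasting conductivities
K_fine = np.array([[1.0, 100.0],
                   [100.0, 1.0]])
K_block = upscale_geometric(K_fine, block=2)
```

True numerical upscaling instead solves local flow problems on each block; the geometric mean only illustrates why block-scale variability is smoother than the well-test-scale field.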

  16. Comparing different approaches - data mining, geostatistics, and deterministic pedology - to assess the frequency of WRB Reference Soil Groups in the Italian soil regions

    NASA Astrophysics Data System (ADS)

    Lorenzetti, Romina; Barbetti, Roberto; L'Abate, Giovanni; Fantappiè, Maria; Costantini, Edoardo A. C.

    2013-04-01

    Estimating the frequency of soil classes in a map unit is always affected by some degree of uncertainty, especially at small scales, where generalization is larger. The aim of this study was to compare different possible approaches - data mining, geostatistics, deterministic pedology - for assessing the frequency of WRB Reference Soil Groups (RSG) in the major Italian soil regions. In the soil map of Italy (Costantini et al., 2012), a list of the first five RSG was reported for each of the 10 major soil regions. The soil map was produced using the national soil geodatabase, which stored 22,015 analyzed and classified pedons, 1,413 soil typological units (STU) and a set of auxiliary variables (lithology, land use, DEM). Other variables were added to better account for the influence of soil-forming factors (slope, soil aridity index, carbon stock, soil inorganic carbon content, clay, sand, geography of soil regions and soil systems), and a grid with a 1 km mesh was set up. Traditional deterministic pedology assessed STU frequency according to expert judgment of STU presence in every elementary landscape forming the map unit. Different data mining techniques were first compared in their ability to predict RSG from the auxiliary variables (neural networks, random forests, boosted trees, support vector machines (SVM)). We selected SVM based on the results for a test set. An SVM model is a representation of the examples as points in space, mapped so that examples of separate categories are divided by a clear gap that is as wide as possible. The geostatistical algorithm we used was collocated indicator cokriging. The class values of the auxiliary variables, available at all points of the grid, were transformed into indicator variables (values 0, 1). A principal component analysis allowed us to select the variables that explained the largest variability, and to correlate each RSG with the first principal component, which explained 51% of the total variability. 
The

  17. Porous media flux sensitivity to pore-scale geostatistics: A bottom-up approach

    NASA Astrophysics Data System (ADS)

    Di Palma, P. R.; Guyennon, N.; Heße, F.; Romano, E.

    2017-04-01

    Macroscopic properties of flow through porous media can be computed directly by solving the Navier-Stokes equations at the scales of the actual flow processes, while representing the porous structures explicitly. The aim of this paper is to investigate the effects of the pore-scale spatial distribution on seepage velocity through numerical simulations of 3D fluid flow performed with the lattice Boltzmann method. To this end, we generate multiple random Gaussian fields whose spatial correlation follows an assigned semi-variogram function. The exponential and Gaussian semi-variograms are chosen as extreme cases of short-distance correlation, and the statistical properties of the resulting porous media (indicator fields) are described using the Matérn covariance model, with characteristic lengths of spatial autocorrelation (pore size) varying from 2% to 13% of the linear domain. To assess the sensitivity of the modeling results to the geostatistical representativeness of the domain, as well as to the adopted resolution, porous media were generated repeatedly with re-initialized random seeds, and three different resolutions were tested for each resulting realization. The main difference among the results is observed between the two adopted semi-variograms, indicating that roughness (short-distance autocorrelation) is the property mainly affecting the flux. However, the computed seepage velocities additionally show a wide variability (about three orders of magnitude) for each semi-variogram model in relation to the assigned correlation length, corresponding to pore size. The spatial resolution affects the results more for short correlation lengths (i.e., small pore sizes), resulting in an increasing underestimation of the seepage velocity as the correlation length decreases. On the other hand, the results show increasing uncertainty as the correlation length approaches the domain size.
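Generating a correlated Gaussian field with an assigned covariance model, and thresholding it into a pore/solid indicator field, can be sketched in 1-D via Cholesky factorisation of the covariance matrix. The paper's fields are 3-D and would use faster (e.g. spectral) generators; this is only an illustration of the principle:

```python
import numpy as np

def gaussian_field_1d(n, corr_len, model="exponential", seed=0):
    """Draw a zero-mean Gaussian field on a 1-D grid whose covariance
    follows the chosen model (Cholesky factorisation; fine for small
    grids, spectral methods scale better in 3-D)."""
    x = np.arange(n, dtype=float)
    h = np.abs(x[:, None] - x[None, :])          # lag matrix
    if model == "exponential":
        C = np.exp(-h / corr_len)
    else:                                        # "gaussian" model
        C = np.exp(-(h / corr_len) ** 2)
    C += 1e-10 * np.eye(n)                       # numerical jitter
    L = np.linalg.cholesky(C)
    rng = np.random.default_rng(seed)
    return L @ rng.standard_normal(n)

field = gaussian_field_1d(64, corr_len=8.0)
pores = field > 0.0          # indicator field: pore space vs. solid matrix
```

Thresholding at zero gives a binary medium with roughly 50% porosity; varying `corr_len` changes the characteristic pore size, the parameter whose effect on seepage velocity the paper investigates.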

  18. Integrating Ensemble Data Assimilation and Indicator Geostatistics to Delineate Hydrofacies Spatial Distribution

    NASA Astrophysics Data System (ADS)

    Song, X.; Chen, X.; Ye, M.; Dai, Z.; Hammond, G. E.

    2015-12-01

    We present a new framework for delineating spatial distributions of hydrofacies from indirect data by linking ensemble-based data assimilation method (e.g., Ensemble Kalman filter, EnKF) with indicator geostatistics based on transition probability. The nature of ensemble data assimilation makes the framework efficient and flexible to integrate various types of observation data. We leveraged the level set concept to establish transformations between discrete hydrofacies and continuous variables, which is a critical element to implement ensemble data assimilation methods for hydrofacies delineation. T-PROGS is used to generate realizations of hydrofacies fields given conditioning points. An additional quenching step of T-PROGS is taken to preserve spatial structure of updated hydrofacies after each data assimilation step. This new method is illustrated by a two-dimensional (2-D) synthetic study in which transient hydraulic head data resulting from pumping is assimilated to delineate hydrofacies distribution. Our results showed that the proposed framework was able to characterize hydrofacies distribution and their associated permeability with adequate accuracy even with limited direct hydrofacies data. This method may find broader applications in facies delineation using other types of indirect measurements, such as tracer tests and geophysical surveys.
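The ensemble analysis step that drives such a framework can be sketched with a generic stochastic EnKF update for a linear observation operator. This is not the paper's T-PROGS-coupled implementation; state size, ensemble size, and values are illustrative:

```python
import numpy as np

def enkf_update(X, d_obs, H, R, seed=0):
    """Stochastic EnKF analysis step.
    X: (n_state, n_ens) ensemble; d_obs: observations;
    H: linear observation operator; R: observation-error covariance."""
    rng = np.random.default_rng(seed)
    n_ens = X.shape[1]
    Y = H @ X                                       # predicted observations
    Xp = X - X.mean(axis=1, keepdims=True)          # state anomalies
    Yp = Y - Y.mean(axis=1, keepdims=True)          # observation anomalies
    Cxy = Xp @ Yp.T / (n_ens - 1)
    Cyy = Yp @ Yp.T / (n_ens - 1) + R
    K = Cxy @ np.linalg.inv(Cyy)                    # Kalman gain
    D = d_obs[:, None] + rng.multivariate_normal(
        np.zeros(len(d_obs)), R, size=n_ens).T      # perturbed observations
    return X + K @ (D - Y)

# Toy: 3-cell state observed at cell 0, 50-member ensemble
rng = np.random.default_rng(1)
X = rng.standard_normal((3, 50)) + 2.0
H = np.array([[1.0, 0.0, 0.0]])
Xa = enkf_update(X, d_obs=np.array([5.0]), H=H, R=np.eye(1) * 0.01)
```

In the hydrofacies framework, the continuous level-set variables play the role of `X`, and the quenching step restores the transition-probability structure after each such update.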

  19. Mapping soil gas radon concentration: a comparative study of geostatistical methods.

    PubMed

    Buttafuoco, Gabriele; Tallarico, Adalisa; Falcone, Giovanni

    2007-08-01

    Understanding spatial variations in soil gas radon can help the builder of a new house prevent radon flowing in from the ground. The distribution of indoor radon concentration depends on many parameters, and it is difficult to use its spatial variation to assess radon potential; many scientists therefore measure outdoor soil gas radon concentrations for this purpose. Geostatistical methods provide a valuable tool for studying the spatial structure of radon concentration and for mapping it. To explore the structure of soil gas radon concentration within an area of southern Italy and to choose a kriging algorithm, we compared the prediction performance of four different kriging algorithms: ordinary kriging, lognormal kriging, ordinary multi-Gaussian kriging, and ordinary indicator cokriging. Their results were compared using an independent validation data set. The comparison of predictions was based on three measures of accuracy: (1) the mean absolute error, (2) the mean-squared error of prediction, and (3) the mean relative error, plus a measure of effectiveness: the goodness-of-prediction estimate. The results obtained in this case study showed that multi-Gaussian kriging was the most accurate approach among those considered. Comparing radon anomalies with lithology and fault locations, no evidence of a strict correlation between the type of outcropping terrain and radon anomalies was found, except in the western sector with its granitic and gneissic terrain. Moreover, there was a clear correlation between radon anomalies and fault systems.
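The validation measures used to rank the kriging algorithms are simple to compute from observed/predicted pairs. A minimal sketch, using one common formulation of the goodness of prediction G (1 minus MSE over the variance of a mean-only predictor; the paper's exact scaling may differ):

```python
import numpy as np

def validation_metrics(obs, pred):
    """MAE, MSE, mean relative error, and goodness of prediction G
    (G = 1 is perfect; G < 0 is worse than predicting the sample mean)."""
    obs = np.asarray(obs, dtype=float)
    pred = np.asarray(pred, dtype=float)
    err = pred - obs
    mae = np.abs(err).mean()
    mse = (err ** 2).mean()
    mre = (err / obs).mean()
    g = 1.0 - mse / ((obs - obs.mean()) ** 2).mean()
    return mae, mse, mre, g

# Toy validation set (radon concentrations, arbitrary units)
mae, mse, mre, g = validation_metrics([10.0, 20.0, 30.0], [12.0, 18.0, 33.0])
```

Computing these metrics on an independent validation set, rather than on the calibration data, is what allows a fair comparison among the four kriging variants.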

  20. Obtaining parsimonious hydraulic conductivity fields using head and transport observations: A Bayesian geostatistical parameter estimation approach

    NASA Astrophysics Data System (ADS)

    Fienen, M.; Hunt, R.; Krabbenhoft, D.; Clemo, T.

    2009-08-01

    Flow path delineation is a valuable tool for interpreting the subsurface hydrogeochemical environment. Different types of data, such as groundwater flow and transport, inform different aspects of hydrogeologic parameter values (hydraulic conductivity in this case) which, in turn, determine flow paths. This work combines flow and transport information to estimate a unified set of hydrogeologic parameters using the Bayesian geostatistical inverse approach. Parameter flexibility is allowed by using a highly parameterized approach with the level of complexity informed by the data. Despite the effort to adhere to the ideal of minimal a priori structure imposed on the problem, extreme contrasts in parameters can result in the need to censor correlation across hydrostratigraphic bounding surfaces. These partitions segregate parameters into facies associations. With an iterative approach in which partitions are based on inspection of initial estimates, flow path interpretation is progressively refined through the inclusion of more types of data. Head observations, stable oxygen isotopes (18O/16O ratios), and tritium are all used to progressively refine flow path delineation on an isthmus between two lakes in the Trout Lake watershed, northern Wisconsin, United States. Despite allowing significant parameter freedom by estimating many distributed parameter values, a smooth field is obtained.

  1. Geostatistical analysis of regional hydraulic conductivity variations in the Snake River Plain aquifer, eastern Idaho

    USGS Publications Warehouse

    Welhan, J.A.; Reed, M.F.

    1997-01-01

    The regional spatial correlation structure of bulk horizontal hydraulic conductivity (Kb), estimated from published transmissivity data from 79 open boreholes in the fractured basalt aquifer of the eastern Snake River Plain, was analyzed with geostatistical methods. The two-dimensional spatial correlation structure of ln Kb shows a pronounced 4:1 range anisotropy, with a maximum correlation range of about 6 km in the north-northwest to south-southeast direction. The maximum variogram range of ln Kb is similar to the mean length of flow groups exposed at the surface. The ln Kb range anisotropy is similar to the mean width/length ratio of late Quaternary and Holocene basalt lava flows and to the orientations of the major volcanic structural features on the eastern Snake River Plain. The similarity between ln Kb correlation scales and basalt flow dimensions, and between basalt flow orientations and the correlation range anisotropy, suggests that the spatial distribution of zones of high hydraulic conductivity may be controlled by the lateral dimensions, spatial distribution, and interconnection of the highly permeable zones known to occur between lava flows within flow groups. If hydraulic conductivity and lithology are eventually shown to be cross correlative in this geologic setting, it may be possible to stochastically simulate hydraulic conductivity distributions conditional on a knowledge of volcanic stratigraphy.
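    Correlation ranges like those described above are read off experimental variograms. A minimal omnidirectional version can be sketched as below (the study's analysis was directional, to capture the 4:1 anisotropy); the borehole coordinates and ln Kb values are synthetic.

```python
import numpy as np

def experimental_variogram(coords, values, lag_edges):
    """Classical semivariance 0.5 * mean((z_i - z_j)^2), binned by separation."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    i, j = np.triu_indices(len(values), k=1)
    h = np.linalg.norm(coords[i] - coords[j], axis=1)  # pair separation distances
    sq = 0.5 * (values[i] - values[j]) ** 2            # per-pair semivariances
    gamma = []
    for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
        mask = (h >= lo) & (h < hi)
        gamma.append(sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# three synthetic boreholes on a line (coordinates in km) with ln Kb values
coords = [[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]]
ln_kb = [0.0, 1.0, 2.0]
gamma = experimental_variogram(coords, ln_kb, [0.5, 1.5, 2.5])
```

    The lag at which gamma levels off (the range) is what gets compared with flow-group dimensions; estimating it along different azimuths is what reveals range anisotropy.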

  2. Improving Imperfect Data from Health Management Information Systems in Africa Using Space–Time Geostatistics

    PubMed Central

    Gething, Peter W; Noor, Abdisalan M; Gikandi, Priscilla W; Ogara, Esther A. A; Hay, Simon I; Nixon, Mark S; Snow, Robert W; Atkinson, Peter M

    2006-01-01

    Background Reliable and timely information on disease-specific treatment burdens within a health system is critical for the planning and monitoring of service provision. Health management information systems (HMIS) exist to address this need at national scales across Africa but are failing to deliver adequate data because of widespread underreporting by health facilities. Faced with this inadequacy, vital public health decisions often rely on crudely adjusted regional and national estimates of treatment burdens. Methods and Findings This study has taken the example of presumed malaria in outpatients within the largely incomplete Kenyan HMIS database and has defined a geostatistical modelling framework that can predict values for all data that are missing through space and time. The resulting complete set can then be used to define treatment burdens for presumed malaria at any level of spatial and temporal aggregation. Validation of the model has shown that these burdens are quantified to an acceptable level of accuracy at the district, provincial, and national scale. Conclusions The modelling framework presented here provides, to our knowledge for the first time, reliable information from imperfect HMIS data to support evidence-based decision-making at national and sub-national levels. PMID:16719557

  3. Geostatistical analysis of tritium, groundwater age and other noble gas derived parameters in California.

    PubMed

    Visser, A; Moran, J E; Hillegonds, Darren; Singleton, M J; Kulongoski, Justin T; Belitz, Kenneth; Esser, B K

    2016-03-15

    Key characteristics of California groundwater systems related to aquifer vulnerability, sustainability, recharge locations and mechanisms, and anthropogenic impact on recharge are revealed in a spatial geostatistical analysis of a unique data set of tritium, noble gases and other isotopic analyses unprecedented in size at nearly 4000 samples. The correlation length of key groundwater residence time parameters varies from tens of kilometers (³H; age) to on the order of a hundred kilometers (⁴He_ter; ¹⁴C; ³He_trit). The correlation length of parameters related to climate, topography and atmospheric processes is on the order of several hundred kilometers (recharge temperature; δ¹⁸O). Young groundwater ages that highlight regional recharge areas are located in the eastern San Joaquin Valley, in the southern Santa Clara Valley Basin, in the upper LA basin and along unlined canals carrying Colorado River water, showing that much of the recent recharge in central and southern California is dominated by river recharge and managed aquifer recharge. Modern groundwater is found in wells with the top open intervals below 60 m depth in the southeastern San Joaquin Valley, Santa Clara Valley and Los Angeles basin, as the result of intensive pumping and/or managed aquifer recharge operations.

  4. Geostatistical investigations for suitable mapping of the water table: the Bordeaux case (France)

    NASA Astrophysics Data System (ADS)

    Guekie simo, Aubin Thibaut; Marache, Antoine; Lastennet, Roland; Breysse, Denys

    2016-02-01

    Methodologies have been developed to establish realistic water-table maps using geostatistical methods: ordinary kriging (OK), cokriging (CoK), collocated cokriging (CoCoK), and kriging with external drift (KED). In hilly terrain, when piezometric data are sparsely distributed over large areas, the water-table maps obtained by these methods provide exact water levels at monitoring wells but fail to represent the groundwater flow system, manifested as an interpolated water table lying above the topography. A methodology is developed in order to rebuild water-table maps for urban areas at the city scale. The interpolation methodology is presented and applied in a case study where water levels are monitored at a set of 47 points over a partly urban domain covering 25.6 km² close to Bordeaux city, France. To select the best method, a geographic information system was used to visualize the surfaces reconstructed with each method. A cross-validation was carried out to evaluate the predictive performance of each kriging method. KED proves to be the most accurate and yields a better description of the local fluctuations induced by the topography (natural occurrence of ridges and valleys).

  5. Delineation of estuarine management areas using multivariate geostatistics: the case of Sado Estuary.

    PubMed

    Caeiro, Sandra; Goovaerts, Pierre; Painho, Marco; Costa, M Helena

    2003-09-15

    The Sado Estuary is a coastal zone located in the south of Portugal where conflicts between conservation and development exist because of its location near industrialized urban zones and its designation as a natural reserve. The aim of this paper is to evaluate a set of multivariate geostatistical approaches to delineate spatially contiguous regions of sediment structure for the Sado Estuary. These areas will be the supporting infrastructure of an environmental management system for this estuary. The boundaries of each homogeneous area were derived from three sediment characterization attributes through three different approaches: (1) cluster analysis of a dissimilarity matrix that is a function of geographical separation, followed by indicator kriging of the cluster data; (2) discriminant analysis of kriged values of the three sediment attributes; and (3) a combination of methods 1 and 2. The final maximum likelihood classification was integrated into a geographical information system. All methods generated fairly spatially contiguous management areas that reproduce the environment of the estuary well. Map comparison techniques based on kappa statistics showed that the three resulting maps are similar, supporting the choice of any of the methods as appropriate for management of the Sado Estuary. However, the results of method 1 seem to be in better agreement with estuary behavior, assessment of contamination sources, and previous work conducted at this site.

  6. Characterizing the spatial structure of endangered species habitat using geostatistical analysis of IKONOS imagery

    USGS Publications Warehouse

    Wallace, C.S.A.; Marsh, S.E.

    2005-01-01

    Our study used geostatistics to extract measures that characterize the spatial structure of vegetated landscapes from satellite imagery for mapping endangered Sonoran pronghorn habitat. Fine spatial resolution IKONOS data provided information at the scale of individual trees or shrubs that permitted analysis of vegetation structure and pattern. We derived images of landscape structure by calculating local estimates of the nugget, sill, and range variogram parameters within 25 × 25-m image windows. These variogram parameters, which describe the spatial autocorrelation of the 1-m image pixels, are shown in previous studies to discriminate between different species-specific vegetation associations. We constructed two independent models of pronghorn landscape preference by coupling the derived measures with Sonoran pronghorn sighting data: a distribution-based model and a cluster-based model. The distribution-based model used the descriptive statistics for variogram measures at pronghorn sightings, whereas the cluster-based model used the distribution of pronghorn sightings within clusters of an unsupervised classification of derived images. Both models define similar landscapes, and validation results confirm they effectively predict the locations of an independent set of pronghorn sightings. Such information, although not a substitute for field-based knowledge of the landscape and associated ecological processes, can provide valuable reconnaissance information to guide natural resource management efforts. © 2005 Taylor & Francis Group Ltd.

  7. Obtaining parsimonious hydraulic conductivity fields using head and transport observations: A bayesian geostatistical parameter estimation approach

    USGS Publications Warehouse

    Fienen, M.; Hunt, R.; Krabbenhoft, D.; Clemo, T.

    2009-01-01

    Flow path delineation is a valuable tool for interpreting the subsurface hydrogeochemical environment. Different types of data, such as groundwater flow and transport, inform different aspects of hydrogeologic parameter values (hydraulic conductivity in this case) which, in turn, determine flow paths. This work combines flow and transport information to estimate a unified set of hydrogeologic parameters using the Bayesian geostatistical inverse approach. Parameter flexibility is allowed by using a highly parameterized approach with the level of complexity informed by the data. Despite the effort to adhere to the ideal of minimal a priori structure imposed on the problem, extreme contrasts in parameters can result in the need to censor correlation across hydrostratigraphic bounding surfaces. These partitions segregate parameters into facies associations. With an iterative approach in which partitions are based on inspection of initial estimates, flow path interpretation is progressively refined through the inclusion of more types of data. Head observations, stable oxygen isotopes (18O/16O ratios), and tritium are all used to progressively refine flow path delineation on an isthmus between two lakes in the Trout Lake watershed, northern Wisconsin, United States. Despite allowing significant parameter freedom by estimating many distributed parameter values, a smooth field is obtained. Copyright 2009 by the American Geophysical Union.

  8. A geostatistics-informed hierarchical sensitivity analysis method for complex groundwater flow and transport modeling

    NASA Astrophysics Data System (ADS)

    Dai, Heng; Chen, Xingyuan; Ye, Ming; Song, Xuehang; Zachara, John M.

    2017-05-01

    Sensitivity analysis is an important tool for development and improvement of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study, we developed a new sensitivity analysis method that integrates the concept of variance-based method with a hierarchical uncertainty quantification framework. Different uncertain inputs are grouped and organized into a multilayer framework based on their characteristics and dependency relationships to reduce the dimensionality of the sensitivity analysis. A set of new sensitivity indices are defined for the grouped inputs using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially distributed input variables.
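    The grouped variance-based index at the heart of such an analysis can be sketched with a two-group toy model; the model, the group labels ("boundary condition" versus "permeability"), and the sample sizes below are illustrative assumptions, not the Hanford flow and transport model.

```python
import numpy as np

def group_first_order(model, n_outer=2000, n_inner=500, seed=0):
    """First-order index of input group A: Var(E[y | A]) / Var(y),
    estimated by fixing group A along each row and resampling group B."""
    rng = np.random.default_rng(seed)
    a = rng.standard_normal((n_outer, 1))        # group A, fixed per row
    b = rng.standard_normal((n_outer, n_inner))  # group B, resampled within rows
    y = model(a, b)                              # broadcasts to (n_outer, n_inner)
    return y.mean(axis=1).var() / y.var()        # Var of conditional means / total Var

# toy response in which the 'boundary condition' group dominates the variance
toy = lambda bc, perm: 3.0 * bc + perm + 0.5 * bc * perm
s_bc = group_first_order(toy)   # analytic value: Var(3*bc)/Var(y) = 9/10.25
```

    Grouping inputs this way shrinks the number of indices from one per spatially distributed parameter to one per group, which is what makes the variance decomposition tractable for high-dimensional fields.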

  9. Use of a Transition Probability/Markov Approach to Improve Geostatistical Simulation of Facies Architecture

    SciTech Connect

    Carle, S.F.

    2000-11-01

    Facies may account for the largest permeability contrasts within the reservoir model at the scale relevant to production. Conditional simulation of the spatial distribution of facies is one of the most important components of building a reservoir model. Geostatistical techniques are widely used to produce realistic and geologically plausible realizations of facies architecture. However, there are two stumbling blocks to the traditional indicator variogram-based approaches: (1) intensive data sets are needed to develop models of spatial variability by empirical curve-fitting to sample indicator (cross-) variograms and to implement "post-processing" simulation algorithms; and (2) the prevalent "sequential indicator simulation" (SIS) methods do not accurately produce patterns of spatial variability for systems with three or more facies (Seifert and Jensen, 1999). This paper demonstrates an alternative transition probability/Markov approach that emphasizes: (1) Conceptual understanding of the parameters of the spatial variability model, so that geologic insight can support and enhance model development when data are sparse. (2) Mathematical rigor, so that the "coregionalization" model (including the spatial cross-correlations) obeys probability law. (3) Consideration of spatial cross-correlation, so that juxtapositional tendencies--how frequently one facies tends to occur adjacent to another facies--are honored.
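    The elementary object in this approach, a matrix of juxtapositional tendencies estimated from a vertical facies succession, can be sketched as follows; the facies log and facies names are made up.

```python
import numpy as np

def transition_matrix(sequence, n_facies):
    """P[i, j] = Pr(next facies = j | current facies = i), from observed counts."""
    counts = np.zeros((n_facies, n_facies))
    for cur, nxt in zip(sequence[:-1], sequence[1:]):
        counts[cur, nxt] += 1
    rows = counts.sum(axis=1, keepdims=True)
    # row-normalise; rows with no observed transitions stay zero
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

# made-up upward facies succession: 0 = mud, 1 = sand, 2 = gravel
log = [0, 0, 1, 1, 2, 1, 0, 0, 1, 2]
P = transition_matrix(log, 3)
```

    Unlike an indicator variogram, each entry has a direct geological reading, e.g. P[1, 2] is the probability that sand is immediately overlain by gravel, which is how geologic insight can constrain the model when data are sparse.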

  10. Decision-making and goal-setting in chronic disease management: Baseline findings of a randomized controlled trial.

    PubMed

    Kangovi, Shreya; Mitra, Nandita; Smith, Robyn A; Kulkarni, Raina; Turr, Lindsey; Huo, Hairong; Glanz, Karen; Grande, David; Long, Judith A

    2017-03-01

    Growing interest in collaborative goal-setting has raised questions. First, are patients making the 'right choices' from a biomedical perspective? Second, are patients and providers setting goals of appropriate difficulty? Finally, what types of support will patients need to accomplish their goals? We analyzed goals and action plans from a trial of collaborative goal-setting among 302 residents of a high-poverty urban region who had multiple chronic conditions. Patients used a low-literacy aid to prioritize one of their chronic conditions and then set a goal for that condition with their primary care provider. Patients created patient-driven action plans for reaching these goals. Patients chose to focus on conditions that were in poor control and set ambitious chronic disease management goals. The mean goal weight loss was -16.8 lbs (SD 19.5), the mean goal HbA1c reduction was -1.3% (SD 1.7%), and the mean goal blood pressure reduction was -9.8 mmHg (SD 19.2 mmHg). Patient-driven action plans spanned domains including health behavior (58.9%) and psychosocial factors (23.5%). High-risk, low-SES patients identified high-priority conditions, set ambitious goals, and generated individualized action plans for chronic disease management. Practices may require flexible personnel who can support patients using a blend of coaching, social support, and navigation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  11. Geostatistical Analyses of the Persistence and Inventory of Carbon Tetrachloride in the 200 West Area of the Hanford Site

    SciTech Connect

    Murray, Christopher J.; Bott, Yi-Ju; Truex, Michael J.

    2007-04-30

    This report documents two separate geostatistical studies performed by researchers from Pacific Northwest National Laboratory to evaluate the carbon tetrachloride plume in the groundwater on the Hanford Site.

  12. Building on crossvalidation for increasing the quality of geostatistical modeling

    USGS Publications Warehouse

    Olea, R.A.

    2012-01-01

    The random function is a mathematical model commonly used in the assessment of uncertainty associated with a spatially correlated attribute that has been partially sampled. There are multiple algorithms for modeling such random functions, all sharing the requirement of specifying various parameters that have critical influence on the results. The importance of finding ways to compare the methods and set parameters so as to obtain results that better model uncertainty has increased as these algorithms have grown in number and complexity. Crossvalidation has been used in spatial statistics, mostly in kriging, for the analysis of mean square errors. An appeal of this approach is its ability to work with the same empirical sample available for running the algorithms. This paper goes beyond checking estimates by formulating a function sensitive to conditional bias. Under ideal conditions, such a function turns into a straight line, which can be used as a reference for preparing measures of performance. Applied to kriging, deviations from the ideal line provide a sensitivity to the semivariogram that is lacking in crossvalidation of kriging errors, and are more sensitive to conditional bias than analyses of errors. In terms of stochastic simulation, in addition to finding better parameters, the deviations allow comparison of the realizations resulting from the application of different methods. Examples show improvements of about 30% in the deviations and approximately 10% in the square root of the mean square errors between a reasonable starting model and the solutions according to the new criteria. © 2011 US Government.
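    The conditional-bias idea can be illustrated with a toy crossvalidation set: regress the true values on their crossvalidation estimates and compare the fit with the ideal 1:1 line. The numbers are illustrative, and this simple regression diagnostic is a simplified stand-in for the function formulated in the paper.

```python
import numpy as np

def conditional_bias_line(estimates, true_values):
    """Least-squares line of true values against estimates; an estimator
    with no conditional bias gives slope 1 and intercept 0."""
    slope, intercept = np.polyfit(np.asarray(estimates, dtype=float),
                                  np.asarray(true_values, dtype=float), 1)
    return slope, intercept

# toy leave-one-out crossvalidation results (illustrative)
est = [10.0, 20.0, 30.0, 40.0]
tru = [12.0, 18.0, 33.0, 37.0]
slope, intercept = conditional_bias_line(est, tru)
# slope < 1 here: high estimates overshoot and low ones undershoot the
# true values, the classic signature of conditional bias (smoothing)
```

    Departures of the fitted line from the 1:1 reference give the kind of performance measure the paper builds on, beyond the usual mean square errors.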

  13. Auricular Therapy for Treatment of Musculoskeletal Pain in the Setting of Deployed Military Personnel: A Randomized Trial

    DTIC Science & Technology

    2016-12-01

    to randomize 150 subjects to examine whether the addition of a specific auricular therapy protocol to standard care will have a beneficial impact on... The goal of the study was to determine whether auricular therapy provides a) more rapid and significant pain relief than usual care; b) more rapid... and significant relief of co-morbidities related to pain (sleep disruption, mood changes, etc.) than usual care; c) more rapid and significant return

  14. A Randomized Feasibility Trial of a New Lifestyle Referral Assessment Versus Usual Assessment in an Acute Cardiology Setting.

    PubMed

    Hill, Kate; Walwyn, Rebecca; Camidge, Diana; Murray, Jenni; Meads, David; Reynolds, Greg; Farrin, Amanda; House, Allan

    A healthy diet, taking exercise, and not smoking or consuming alcohol in excess are important to reduce the risk of cardiovascular disease either alone or in combination with statin medication. Health education, including providing information to patients on healthy living and guidance on how to achieve it, is a key nursing function. This study aims first to assess the feasibility of conducting a full-scale trial of lifestyle referral assessment as shown by recruitment rate, data collection, and follow-up and second to assess proof of concept and explore possible mechanisms of change. This was a single-center, randomized, 2-arm, parallel-group, unblinded feasibility trial conducted in an acute teaching hospital trust. Participants were followed up at 3 and 6 months after randomization. Eight hundred eighty-seven patients were screened for eligibility, of whom 132 (15%) were randomized into the trial. Of the patients allocated to the individualized assessment, 27% accepted referral or self-referred by 3 months in comparison to 5% allocated to the usual assessment. We demonstrated that a full-scale trial is feasible and that an individualized approach increased the number of patients accepting referral to a formal program and initiating lifestyle change. However, we should consider the aim of the assessment and ways in which the process of change can be optimized in order to produce long-term benefit for patients. current controlled trials ISRCTN41781196.

  15. Predictors of nicotine withdrawal symptoms: findings from the first randomized smoking cessation trial in a low-income country setting.

    PubMed

    Ben Taleb, Ziyad; Ward, Kenneth D; Asfar, Taghrid; Jaber, Rana; Auf, Rehab; Maziak, Wasim

    2016-07-01

    To identify predictors of nicotine withdrawal symptoms among smokers who participated in a randomized cessation trial in a low-income country. We analyzed data from 269 smokers who participated in a randomized, placebo-controlled smoking cessation trial conducted in primary healthcare in Aleppo, Syria. All participants received behavioral counseling and were randomized to receive either 6 weeks of nicotine or placebo patch and were followed for one year. Throughout the study, lower total withdrawal score was associated with greater education (p = 0.044), older age of smoking initiation (p = 0.017), lower nicotine dependence (p = 0.024), higher confidence in ability to quit (p = 0.020), lower reported depression (p < 0.001), higher adherence to patch (p = 0.026), belief of receiving nicotine patches rather than placebo (p = 0.011), and waterpipe use (p = 0.047). Lower nicotine dependence, greater educational attainment, higher confidence in ability to quit and waterpipe use predict lower withdrawal severity. Waterpipe smoking may serve as a barrier to smoking cessation efforts in countries where its use is highly prevalent. Further, expectancies about the effects of pharmacotherapy appear to mediate the experience of nicotine withdrawal.

  16. Three nested randomized controlled trials of peer-only or multiple stakeholder group feedback within Delphi surveys during core outcome and information set development.

    PubMed

    Brookes, Sara T; Macefield, Rhiannon C; Williamson, Paula R; McNair, Angus G; Potter, Shelley; Blencowe, Natalie S; Strong, Sean; Blazeby, Jane M

    2016-08-17

    Methods for developing a core outcome or information set require involvement of key stakeholders to prioritise many items and achieve agreement as to the core set. The Delphi technique requires participants to rate the importance of items in sequential questionnaires (or rounds) with feedback provided in each subsequent round such that participants are able to consider the views of others. This study examines the impact of receiving feedback from different stakeholder groups, on the subsequent rating of items and the level of agreement between stakeholders. Randomized controlled trials were nested within the development of three core sets each including a Delphi process with two rounds of questionnaires, completed by patients and health professionals. Participants rated items from 1 (not essential) to 9 (absolutely essential). For round 2, participants were randomized to receive feedback from their peer stakeholder group only (peer) or both stakeholder groups separately (multiple). Decisions as to which items to retain following each round were determined by pre-specified criteria. Whilst type of feedback did not impact on the percentage of items for which a participant subsequently changed their rating, or the magnitude of change, it did impact on items retained at the end of round 2. Each core set contained discordant items retained by one feedback group but not the other (3-22 % discordant items). Consensus between patients and professionals in items to retain was greater amongst those receiving multiple group feedback in each core set (65-82 % agreement for peer-only feedback versus 74-94 % for multiple feedback). In addition, differences in round 2 scores were smaller between stakeholder groups receiving multiple feedback than between those receiving peer group feedback only. Variability in item scores across stakeholders was reduced following any feedback but this reduction was consistently greater amongst the multiple feedback group. In the development of

  17. PRagmatic trial Of Video Education in Nursing homes: The design and rationale for a pragmatic cluster randomized trial in the nursing home setting.

    PubMed

    Mor, Vincent; Volandes, Angelo E; Gutman, Roee; Gatsonis, Constantine; Mitchell, Susan L

    2017-04-01

    Background/Aims Nursing homes are complex healthcare systems serving an increasingly sick population. Nursing homes must engage patients in advance care planning, but do so inconsistently. Video decision support tools improved advance care planning in small randomized controlled trials. Pragmatic trials are increasingly employed in health services research, although not commonly in the nursing home setting to which they are well-suited. This report presents the design and rationale for a pragmatic cluster randomized controlled trial that evaluated the "real world" application of an Advance Care Planning Video Program in two large US nursing home healthcare systems. Methods PRagmatic trial Of Video Education in Nursing homes was conducted in 360 nursing homes (N = 119 intervention/N = 241 control) owned by two healthcare systems. Over an 18-month implementation period, intervention facilities were instructed to offer the Advance Care Planning Video Program to all patients. Control facilities employed usual advance care planning practices. Patient characteristics and outcomes were ascertained from Medicare Claims, Minimum Data Set assessments, and facility electronic medical record data. Intervention adherence was measured using a Video Status Report embedded into electronic medical record systems. The primary outcome was the number of hospitalizations/person-day alive among long-stay patients with advanced dementia or cardiopulmonary disease. The rationale for the approaches to facility randomization and recruitment, intervention implementation, population selection, data acquisition, regulatory issues, and statistical analyses are discussed. Results The large number of well-characterized candidate facilities enabled several unique design features including stratification on historical hospitalization rates, randomization prior to recruitment, and 2:1 control to intervention facilities ratio. Strong endorsement from corporate leadership made randomization

  18. Bayesian Geostatistical Modeling of Leishmaniasis Incidence in Brazil

    PubMed Central

    Karagiannis-Voules, Dimitrios-Alexios; Scholte, Ronaldo G. C.; Guimarães, Luiz H.; Utzinger, Jürg; Vounatsou, Penelope

    2013-01-01

    Background Leishmaniasis is endemic in 98 countries with an estimated 350 million people at risk and approximately 2 million cases annually. Brazil is one of the most severely affected countries. Methodology We applied Bayesian geostatistical negative binomial models to analyze reported incidence data of cutaneous and visceral leishmaniasis in Brazil covering a 10-year period (2001–2010). Particular emphasis was placed on spatial and temporal patterns. The models were fitted using integrated nested Laplace approximations to perform fast approximate Bayesian inference. Bayesian variable selection was employed to determine the most important climatic, environmental, and socioeconomic predictors of cutaneous and visceral leishmaniasis. Principal Findings For both types of leishmaniasis, precipitation and socioeconomic proxies were identified as important risk factors. The predicted number of cases in 2010 were 30,189 (standard deviation [SD]: 7,676) for cutaneous leishmaniasis and 4,889 (SD: 288) for visceral leishmaniasis. Our risk maps predicted the highest numbers of infected people in the states of Minas Gerais and Pará for visceral and cutaneous leishmaniasis, respectively. Conclusions/Significance Our spatially explicit, high-resolution incidence maps identified priority areas where leishmaniasis control efforts should be targeted with the ultimate goal to reduce disease incidence. PMID:23675545

  19. Kriging in the Shadows: Geostatistical Interpolation for Remote Sensing

    NASA Technical Reports Server (NTRS)

    Rossi, Richard E.; Dungan, Jennifer L.; Beck, Louisa R.

    1994-01-01

    It is often useful to estimate obscured or missing remotely sensed data. Traditional interpolation methods, such as nearest-neighbor or bilinear resampling, do not take full advantage of the spatial information in the image. An alternative method, a geostatistical technique known as indicator kriging, is described and demonstrated using a Landsat Thematic Mapper image in southern Chiapas, Mexico. The image was first classified into pasture and nonpasture land cover. For each pixel that was obscured by cloud or cloud shadow, the probability that it was pasture was assigned by the algorithm. An exponential omnidirectional variogram model was used to characterize the spatial continuity of the image for use in the kriging algorithm. Assuming a cutoff probability level of 50%, the error was shown to be 17% with no obvious spatial bias but with some tendency to categorize nonpasture as pasture (overestimation). While this is a promising result, the method's practical application in other missing data problems for remotely sensed images will depend on the amount and spatial pattern of the unobscured pixels and missing pixels and the success of the spatial continuity model used.
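    The indicator idea behind this infilling can be sketched as follows: classified pixels are coded 1 (pasture) / 0 (nonpasture), an obscured pixel's pasture probability is a weighted average of nearby indicators, and a 50% cutoff assigns the class. For brevity, inverse-distance weights stand in for the kriging weights the paper derives from its exponential omnidirectional variogram model; the coordinates and labels are made up.

```python
import numpy as np

def pasture_probability(xy_known, indicators, xy_target, power=2.0):
    """Weighted average of 0/1 class indicators at a target pixel.

    Inverse-distance weights are used here purely for brevity; the study
    obtains its weights by indicator kriging with a variogram model.
    """
    xy_known = np.asarray(xy_known, dtype=float)
    ind = np.asarray(indicators, dtype=float)
    d = np.linalg.norm(xy_known - np.asarray(xy_target, dtype=float), axis=1)
    w = 1.0 / np.maximum(d, 1e-12) ** power   # guard against zero distance
    return float(np.sum(w * ind) / np.sum(w))

# made-up unobscured pixels: 1 = pasture, 0 = nonpasture
known = [(0.0, 0.0), (0.0, 2.0), (2.0, 0.0), (4.0, 4.0)]
labels = [1, 1, 0, 0]
p = pasture_probability(known, labels, (0.0, 1.0))
is_pasture = p >= 0.5   # the 50% cutoff probability used in the study
```

    As the abstract notes, how well such infilling works depends on the amount and spatial pattern of the unobscured pixels surrounding each gap.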

  20. Use of geostatistics in planning optimum drilling program

    SciTech Connect

    Ghose, S.

    1989-08-01

    Application of geostatistics in the natural resources industry is well established. In a typical estimation process, statistically dependent geological data are used to predict the characteristics of a deposit. The estimator used is the best linear unbiased estimator (BLUE), and a numerical factor of confidence is also provided. The natural inhomogeneity and anisotropy of a deposit are also quantified precisely. Drilling is the most reliable way of obtaining data for mining and related industries. However, it is often difficult to decide what the optimum number of drill holes necessary for evaluation is. In this paper, sequential measures of percent variation at the 95% confidence level of a geological variable have been used to determine an economically optimum drilling density. A coal reserve model is used to illustrate the method and findings. Fictitious drilling data were added (within the domain of the population characteristics) in stages, to obtain a point of stability beyond which further gains were insignificant (diminishing marginal benefit). The final relations are established by graphically projecting and comparing two variables, cost and precision. By mapping the percent variation at each stage, localized areas of discrepancy can be identified. These are the locations where additional drilling is needed. The system can be controlled if performed in progressive stages and the approach toward stability is monitored.

  1. An interactive Bayesian geostatistical inverse protocol for hydraulic tomography

    USGS Publications Warehouse

    Fienen, Michael N.; Clemo, Tom; Kitanidis, Peter K.

    2008-01-01

Hydraulic tomography is a powerful technique for characterizing heterogeneous hydrogeologic parameters. An explicit trade-off between characterization based on measurement misfit and subjective characterization using prior information is presented. We apply a Bayesian geostatistical inverse approach that is well suited to accommodate a flexible model with the level of complexity driven by the data and explicitly considering uncertainty. Prior information is incorporated through the selection of a parameter covariance model characterizing continuity and providing stability. Often, discontinuities in the parameter field, typically caused by geologic contacts between contrasting lithologic units, necessitate subdivision into zones across which there is no correlation among hydraulic parameters. We propose an interactive protocol in which zonation candidates are inferred from the data and are evaluated using cross validation and expert knowledge. Uncertainty introduced by limited knowledge of dynamic regional conditions is mitigated by using drawdown rather than native head values. An adjoint state formulation of MODFLOW-2000 is used to calculate sensitivities, which are used both for the solution to the inverse problem and to guide protocol decisions. The protocol is tested using synthetic two-dimensional steady state examples in which the wells are located at the edge of the region of interest.

  2. Spatiotemporal analysis of olive flowering using geostatistical techniques.

    PubMed

    Rojo, Jesús; Pérez-Badia, Rosa

    2015-02-01

Analysis of flowering patterns in the olive (Olea europaea L.) is of considerable agricultural and ecological interest, and also provides valuable information for allergy sufferers, enabling identification of the major sources of airborne pollen at any given moment by interpreting the aerobiological data recorded in pollen traps. The present spatiotemporal analysis of olive flowering in central Spain combined geostatistical techniques with a Geographic Information System (GIS), and compared results for flowering intensity with airborne pollen records. The results were used to obtain continuous phenological maps which trace the pattern of succession of olive flowering. The results also show that, although the highest airborne olive-pollen counts were recorded during the period of greatest flowering intensity of the groves closest to the pollen trap, the counts recorded at the start of the pollen season were not linked to local olive groves, which had not yet begun to flower. To detect the remote sources of olive pollen, several episodes of pollen recorded before the local flowering season were analysed using the HYSPLIT trajectory model, and the findings showed that western, southern and southwestern winds transported pollen grains into the study area from earlier-flowering groves located outside the territory.

  3. A multiple-point geostatistical method for characterizing uncertainty of subsurface alluvial units and its effects on flow and transport

    USGS Publications Warehouse

    Cronkite-Ratcliff, C.; Phelps, G.A.; Boucher, A.

    2012-01-01

    This report provides a proof-of-concept to demonstrate the potential application of multiple-point geostatistics for characterizing geologic heterogeneity and its effect on flow and transport simulation. The study presented in this report is the result of collaboration between the U.S. Geological Survey (USGS) and Stanford University. This collaboration focused on improving the characterization of alluvial deposits by incorporating prior knowledge of geologic structure and estimating the uncertainty of the modeled geologic units. In this study, geologic heterogeneity of alluvial units is characterized as a set of stochastic realizations, and uncertainty is indicated by variability in the results of flow and transport simulations for this set of realizations. This approach is tested on a hypothetical geologic scenario developed using data from the alluvial deposits in Yucca Flat, Nevada. Yucca Flat was chosen as a data source for this test case because it includes both complex geologic and hydrologic characteristics and also contains a substantial amount of both surface and subsurface geologic data. Multiple-point geostatistics is used to model geologic heterogeneity in the subsurface. A three-dimensional (3D) model of spatial variability is developed by integrating alluvial units mapped at the surface with vertical drill-hole data. The SNESIM (Single Normal Equation Simulation) algorithm is used to represent geologic heterogeneity stochastically by generating 20 realizations, each of which represents an equally probable geologic scenario. A 3D numerical model is used to simulate groundwater flow and contaminant transport for each realization, producing a distribution of flow and transport responses to the geologic heterogeneity. From this distribution of flow and transport responses, the frequency of exceeding a given contaminant concentration threshold can be used as an indicator of uncertainty about the location of the contaminant plume boundary.

  4. Suprascapular neuropathy in the setting of rotator cuff tears: study protocol for a double-blinded randomized controlled trial.

    PubMed

    Sachinis, Nikolaos Platon; Boutsiadis, Achilleas; Papagiannopoulos, Sotirios; Ditsios, Konstantinos; Christodoulou, Anastasios; Papadopoulos, Pericles

    2016-11-22

It has been indicated that rotator cuff tears, especially large or massive ones, can cause suprascapular neuropathy. When such a diagnosis has been established, it is still unknown whether an arthroscopic release of the superior transverse scapular ligament during cuff repair can change the course of this neuropathy. This is a single-center, double-blinded randomized controlled trial for which 42 patients with large or massive repairable rotator cuff tears and suprascapular neuropathy will be recruited and followed up at 6 and 12 months. Nerve function will be measured by nerve conduction and electromyography studies preoperatively and at the selected follow-up periods. Patients will be randomly divided into equally numbered groups, the first being the control group. Patients of this group will undergo arthroscopic repair of the rotator cuff without combined arthroscopic release of the superior transverse scapular ligament; in the second group the ligament will be released. The primary objective is to test the null hypothesis that arthroscopic repair of large/massive rotator cuff tears in patients with combined suprascapular neuropathy provides outcomes equivalent to one-stage arthroscopic cuff repair in which the superior transverse scapular ligament is additionally released. The secondary objective is to search for a relation between rotator cuff tear size and degree of suprascapular nerve recovery. The tertiary objective is to demonstrate any relation between rotator cuff muscle fatty infiltration grade and degree of suprascapular nerve function. Patients, clinicians during follow-up clinics, and the neurologist will be blinded to the type of surgery performed. To the best of our knowledge, there are no prospective, randomized double-blinded studies with similar objectives. So far, the evidence suggests a positive correlation between massive rotator cuff tears and suprascapular neuropathy. However, there is mixed evidence suggesting that neuropathy can be

  5. Characterization of indoor air contaminants in a randomly selected set of commercial nail salons in Salt Lake County, Utah, USA.

    PubMed

    Alaves, Victor M; Sleeth, Darrah K; Thiese, Matthew S; Larson, Rodney R

    2013-01-01

Air samples were collected in 12 randomly selected commercial nail salons in Salt Lake County, Utah. Measurements of salon physical/chemical parameters (room volume, CO2 levels) were obtained. Volatile organic compound (VOC) concentrations were collected using summa air canisters and sorbent media tubes for an 8-h period. Multivariate analyses were used to identify relationships between salon physical/chemical characteristics and the VOCs found in the air samples. The ACGIH® additive mixing formula was also applied to determine whether there were potential overexposures to the combined airborne concentrations of the chemicals monitored. Methyl methacrylate was detected in 58% of the establishments despite having been banned for use in nail products by the state of Utah. Formaldehyde was found above the NIOSH REL (0.016 ppm) in 58% of the establishments. Given the assortment of VOCs to which nail salon workers are potentially exposed, a combination of engineering controls and personal protective equipment is recommended.
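The additive mixing check referred to above sums each contaminant's concentration divided by its exposure limit; a total above 1 indicates a combined overexposure even when no single chemical exceeds its own limit. The concentrations and limit values below are made-up illustrations; actual TLVs should be taken from the current ACGIH tables:

```python
# Sketch of the ACGIH additive mixing check: exposure to a mixture exceeds
# the limit when the sum of C_i / TLV_i exceeds 1.
def mixture_exposure_index(conc_ppm, tlv_ppm):
    """Return sum(C_i / TLV_i); > 1.0 indicates a combined overexposure."""
    return sum(c / t for c, t in zip(conc_ppm, tlv_ppm))

# Hypothetical 8-h averages for three VOCs measured in a salon.
conc = [0.020, 40.0, 60.0]      # formaldehyde, toluene, acetone (ppm)
tlv  = [0.1,   20.0, 250.0]     # illustrative limit values (ppm)

idx = mixture_exposure_index(conc, tlv)
print(round(idx, 2))            # prints 2.44 -> combined exposure exceeds 1
```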

  6. Cognitive behaviour therapy for low self-esteem: a preliminary randomized controlled trial in a primary care setting.

    PubMed

    Waite, Polly; McManus, Freda; Shafran, Roz

    2012-12-01

Low self-esteem (LSE) is associated with psychiatric disorder, and is distressing and debilitating in its own right. Hence, it is a frequent target for treatment in cognitive behavioural interventions, yet it has rarely been the primary focus for intervention. This paper reports on a preliminary randomized controlled trial of cognitive behaviour therapy (CBT) for LSE using Fennell's cognitive conceptualisation and transdiagnostic treatment approach (1997, 1999). Twenty-two participants were randomly allocated to either immediate treatment (IT) (n=11) or to a waitlist condition (WL) (n=11). Treatment consisted of 10 sessions of individual CBT accompanied by workbooks. Participants allocated to the WL condition received the CBT intervention once the waitlist period was completed, and all participants were followed up 11 weeks after completing CBT. The IT group showed significantly better functioning than the WL group on measures of LSE, overall functioning and depression, and had fewer psychiatric diagnoses at the end of treatment. The WL group showed the same pattern of response to CBT as the group who had received CBT immediately. All treatment gains were maintained at follow-up assessment. The sample size is small and consists mainly of women with a high level of educational attainment, and the follow-up period was relatively short. These preliminary findings suggest that a focused, brief CBT intervention can be effective in treating LSE and associated symptoms and diagnoses in a clinically representative group of individuals with a range of different and co-morbid disorders. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. GIS, geostatistics, metadata banking, and tree-based models for data analysis and mapping in environmental monitoring and epidemiology.

    PubMed

    Schröder, Winfried

    2006-05-01

By the example of environmental monitoring, some applications of geographic information systems (GIS), geostatistics, metadata banking, and Classification and Regression Trees (CART) are presented. These tools are recommended for mapping statistically estimated hot spots of vectors and pathogens. GIS were introduced as tools for spatially modelling the real world. The modelling can be done by mapping objects according to the spatial information content of data. Additionally, this can be supported by geostatistical and multivariate statistical modelling. This is demonstrated by the examples of modelling marine habitats of benthic communities and terrestrial ecoregions. Such ecoregionalisations may then be used to predict phenomena based on the statistical relation between measurements of a phenomenon of interest, such as the incidence of medically relevant species, and correlated characteristics of the ecoregions. The combination of meteorological data and data on plant phenology can enhance the spatial resolution of the information on climate change. To this end, meteorological and phenological data have to be correlated. To enable this, both data sets, which come from disparate monitoring networks, have to be spatially connected by means of geostatistical estimation. This is demonstrated by the example of transformation of site-specific data on plant phenology into surface data. The analysis allows for spatial comparison of the phenology during the two periods 1961-1990 and 1991-2002 covering the whole of Germany. The changes in both plant phenology and air temperature were proved to be statistically significant. Thus, they can be combined by GIS overlay technique to enhance the spatial resolution of the information on the climate change and use them for the prediction of vector incidences at the regional scale. The localisation of such risk hot spots can be done by geometrically merging surface data on promoting factors. This is demonstrated by the example of the

  8. The systematic early integration of palliative care into multidisciplinary oncology care in the hospital setting (IPAC), a randomized controlled trial: the study protocol.

    PubMed

    Vanbutsele, Gaëlle; Van Belle, Simon; De Laat, Martine; Surmont, Veerle; Geboes, Karen; Eecloo, Kim; Pardon, Koen; Deliens, Luc

    2015-12-15

Previous studies in the US and Canada have shown the positive impact of early palliative care programs for advanced cancer patients on quality of life (QoL) and even survival time. There has been a lack of similar research in Europe. In order to generalize the findings of the US and Canadian research, similar studies are needed in different countries with different care settings. The aim of this paper is to describe the research protocol of a randomized controlled trial, situated in Flanders, Belgium, evaluating the effect of systematic early integration of palliative care in standard oncology care. A randomized controlled trial will be conducted as follows: 182 patients with advanced cancer will be recruited from the departments of Medical Oncology, Digestive Oncology and Thoracic Oncology of the Ghent University Hospital. The trial will randomize patients to either systematic early integration of palliative care in standard oncology care or standard oncology care alone. Patients and informal caregivers will be asked to fill out questionnaires on QoL, mood, illness understanding and satisfaction with care at baseline, 12 weeks and every six weeks thereafter. Other outcome measures are end-of-life care decisions and overall survival time. This trial will be the first randomized controlled trial in the Belgian health care setting to evaluate the effect of systematic early integration of palliative care for advanced cancer patients. The results will enable us to evaluate whether systematic early integration of palliative care has positive effects on QoL, mood and patient illness-understanding, and which components of the intervention contribute to these effects. Clinicaltrials.gov Identifier: NCT01865396, registered 24 May 2013.

  9. Circumspheres of sets of n + 1 random points in the d-dimensional Euclidean unit ball (1 ≤ n ≤ d)

    NASA Astrophysics Data System (ADS)

    Le Caër, G.

    2017-05-01

In the d-dimensional Euclidean space, any set of n + 1 independent random points, uniformly distributed in the interior of a unit ball of center O, determines almost surely a circumsphere of center C and radius Ω (1 ≤ n ≤ d) and an n-flat (1 ≤ n ≤ d - 1). The orthogonal projection of O onto this flat is called O', while Δ designates the distance O'C. The classical problem of the distance between two random points in a unit ball corresponds to n = 1. The focus is set on the family of circumspheres which are contained in this unit ball. For any d ≥ 2 and 1 ≤ n ≤ d - 1, the joint probability density function of the distance Δ ≡ O'C and circumradius Ω has a simple closed-form expression. The marginal probability density functions of Δ and Ω are both products of powers and a Gauss hypergeometric function. Stochastic representations of the latter random variables are described in terms of geometric means of two independent beta random variables. For n = d ≥ 1, Δ and Ω have a joint Dirichlet distribution with parameters (d, d², 1), while Δ and Ω are each beta distributed. Results of Monte Carlo simulations are in very good agreement with their calculated counterparts. The tail behavior of the circumradius probability density function has been studied by Monte Carlo simulations for 2 ≤ n = d ≤ 9, where all circumspheres are this time considered, regardless of whether or not they are entirely contained in the unit ball.
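For the classical n = 1 case mentioned above, a quick Monte Carlo check is easy to write. This sketch (dimension, sample size, and seed chosen arbitrarily) samples pairs of uniform points in the 3-dimensional unit ball, where the mean inter-point distance is known in closed form to be 36/35:

```python
# Monte Carlo check of the n = 1 case (distance between two independent
# uniform points in the unit ball), here for d = 3.
import numpy as np

rng = np.random.default_rng(1)

def uniform_ball(n_pts, d):
    """Uniform samples in the d-dimensional unit ball (normalize a Gaussian
    direction, then scale the radius by U**(1/d))."""
    x = rng.normal(size=(n_pts, d))
    x /= np.linalg.norm(x, axis=1, keepdims=True)
    return x * rng.random((n_pts, 1)) ** (1.0 / d)

n, d = 200_000, 3
dist = np.linalg.norm(uniform_ball(n, d) - uniform_ball(n, d), axis=1)
print(dist.mean())   # ~ 36/35 ≈ 1.0286
```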

  10. Geostatistics as a tool to improve the natural background level definition: An application in groundwater.

    PubMed

    Dalla Libera, Nico; Fabbri, Paolo; Mason, Leonardo; Piccinini, Leonardo; Pola, Marco

    2017-11-15

The Natural Background Level (NBL), suggested by the EU BRIDGE project, is suited for spatially distributed datasets, providing a regional value that can be higher than the Threshold Value (TV) set by each country. In hydro-geochemically inhomogeneous areas, the use of a unique regional NBL higher than the TV can make it difficult to distinguish between natural occurrences and anthropogenic contaminant sources. Hence, the goal of this study is to improve the NBL definition using a geostatistical approach, which reconstructs the contaminant's spatial structure while accounting for geochemical and hydrogeological relationships. This integrated mapping is fundamental to evaluating the impact of the contaminant's distribution on the NBL, giving indications for improving it. We tested this method on the Drainage Basin of Venice Lagoon (DBVL, NE Italy), where the existing NBL is seven times higher than the TV. This area is notoriously affected by naturally occurring arsenic contamination. An available geochemical dataset collected from 50 piezometers was used to reconstruct the spatial distribution of arsenic in the densely populated area of the DBVL. A cokriging approach was applied, exploiting the geochemical relationships among As, Fe and NH4+. The obtained spatial predictions of arsenic concentration were divided into three zones: i) areas with an As concentration lower than the TV, ii) areas with an As concentration between the TV and the median of the values higher than the TV, and iii) areas with an As concentration higher than the median. Following the BRIDGE suggestions, where enough samples were available, the 90th percentile for each zone was calculated to obtain a local NBL (LNBL). Unlike the original NBL, this local value gives more detailed water quality information, accounting for the hydrogeological and geochemical setting and the contaminant's spatial variation. Hence, the LNBL could give more indications about the distinction between natural occurrence and

  11. Geostatistical applications in petroleum geology and sedimentary geology

    SciTech Connect

    Murray, C.J.

    1992-01-01

    Statistical tools can aid the geologist in understanding geologic data, and in the solution of real-world problems. This thesis examines several tools, concentrating on techniques with wide applicability in sedimentary and petroleum geology. A case study presents the use of geostatistics for mapping hydrocarbon pore volume and risk assessment in an area surrounding Amos Draw field in the northern Powder River basin of Wyoming. The study used sequential Gaussian and indicator simulation techniques, and documents the ability of indicator simulation to incorporate correlated secondary data. Reservoir characterization requires the generation of numerical grids of geologic properties. Because those properties differ for each rock type, one should first simulate the distribution of rock types, and then the distribution of the reservoir properties. This thesis proposes two multivariate statistical techniques, discriminant function analysis and cluster analysis, for the identification of petrophysical rock types in the Muddy Formation at Amos Draw. The geology of those rock types is discussed using core descriptions, thin-sections, and well log data. The rock types were simulated in three dimensions using indicator principal component simulation. The study also used simulated annealing for post-processing of the simulations, incorporating information from the wells on the transition frequencies between the rock types. The third case study used runs analysis for the identification of patterns in bed thickness and grain size in turbidites. Upward-thickening and thinning patterns have been used to assign turbidite sequences to depositional environments, although there has been disagreement on their identification. Runs analysis was applied to a turbidite section in the Sites Formation at Cache Creek, in northern California.

  12. Geostatistics applied to seismic noise measurements for hydrothermal basin characterization

    NASA Astrophysics Data System (ADS)

    Boaga, Jacopo; Trevisani, Sebastiano; Agostini, Laura; Galgaro, Antonio

    2016-04-01

We present a geostatistical analysis applied to seismic noise measurements in the framework of a thermal basin characterization. The test site is located in northeastern Italy (Caldiero, Verona Province), where more than 100 passive single-station seismic noise measurements were conducted. The final aim was the characterization of an important hydrothermal basin, which has been exploited since the Roman period. The large number of measurements provides high-density coverage: the measurement points have an average spacing of 100 m over a total investigated area of ca. 100 ha. HVSR (Horizontal to Vertical Spectral Ratio) is a passive geophysical technique used to retrieve the fundamental resonance frequency of the subsoil. The measurement consists of passively recording seismic noise with three-component broadband receivers. From the spectral analysis of the recorded data, we can retrieve the resonance frequency of the soil and hence information about the depth and mechanical properties of the soil cover. Since HVSR is a point measurement, 2D maps of the results are usually produced by interpolation, using common kriging or natural-neighbor techniques. However, rigorous statistical analysis of the spatial distribution is rarely adopted in HVSR prospecting, limiting the real significance of the dataset. Here we present the use of advanced spatial-statistical techniques (e.g., cross-validation, analysis of residual distributions) applied to HVSR data. Our results show that critical data scrubbing, combined with a rigorous statistical approach to data interpolation, is mandatory to ensure meaningful structural interpretation of a microtremor HVSR survey. The maps obtained are compared with borehole data, reflection seismic prospecting, and geological information. The proposed procedure highlights the potential of these quick passive measurements, if correctly treated from the statistical point of view.
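The core HVSR computation can be sketched on synthetic three-component noise. This is a schematic only (invented resonance frequency, amplification, and smoothing; real processing averages many windows and uses standard smoothing such as Konno-Ohmachi): the horizontal-to-vertical spectral ratio peaks near the frequency band that is amplified on the horizontal components.

```python
# Minimal HVSR sketch on synthetic 3-component noise.
import numpy as np

fs, T = 100.0, 600.0                       # sampling rate (Hz), duration (s)
t = np.arange(0, T, 1.0 / fs)
rng = np.random.default_rng(2)

f0 = 1.5                                   # assumed resonance frequency (Hz)
resonant = np.sin(2 * np.pi * f0 * t)
ns = rng.normal(size=t.size) + 3.0 * resonant   # N-S component, amplified band
ew = rng.normal(size=t.size) + 3.0 * resonant   # E-W component
v = rng.normal(size=t.size)                     # vertical component

def amp_spectrum(x):
    """Windowed amplitude spectrum."""
    return np.abs(np.fft.rfft(x * np.hanning(x.size)))

freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
h = np.sqrt((amp_spectrum(ns) ** 2 + amp_spectrum(ew) ** 2) / 2.0)
hv = h / (amp_spectrum(v) + 1e-12)

# Smooth and pick the peak in the band of engineering interest (0.2-20 Hz).
band = (freqs > 0.2) & (freqs < 20.0)
kernel = np.ones(51) / 51.0
hv_smooth = np.convolve(hv, kernel, mode="same")
f_peak = freqs[band][np.argmax(hv_smooth[band])]
print(f_peak)   # close to the assumed 1.5 Hz resonance
```

Interpolating many such single-station peak frequencies into a map is then the geostatistical step the abstract discusses.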

  13. A geostatistical approach to mapping site response spectral amplifications

    USGS Publications Warehouse

    Thompson, E.M.; Baise, L.G.; Kayen, R.E.; Tanaka, Y.; Tanaka, H.

    2010-01-01

If quantitative estimates of the seismic properties do not exist at a location of interest, then the site response spectral amplifications must be estimated from data collected at other locations. Currently, the most common approach employs correlations of site class with maps of surficial geology. Analogously, correlations of site class with topographic slope can be employed where the surficial geology is unknown. Our goal is to identify and validate a method to estimate site response with greater spatial resolution and accuracy for regions where additional effort is warranted. This method consists of three components: region-specific data collection, a spatial model for interpolating seismic properties, and a theoretical method for computing spectral amplifications from the interpolated seismic properties. We consider three spatial interpolation schemes: correlations with surficial geology, termed the geologic trend (GT); ordinary kriging (OK); and kriging with a trend (KT). We estimate the spectral amplifications from seismic properties using the square root of impedance method, thereby linking the frequency-dependent spectral amplifications to the depth-dependent seismic properties. Thus, the range of periods for which this method is applicable is limited by the depth of exploration. A dense survey of near-surface S-wave slowness (Ss) throughout Kobe, Japan shows that the geostatistical methods give more accurate estimates of Ss than the topographic slope and GT methods, and the OK and KT methods perform equally well. We prefer the KT model because it can be seamlessly integrated with geologic maps that cover larger regions. Empirical spectral amplifications show that the region-specific data achieve more accurate estimates of observed median short-period amplifications than the topographic slope method. © 2010 Elsevier B.V.
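One common reading of the square-root-of-impedance idea is that amplification scales as the square root of the ratio between a reference (bedrock) impedance and the depth-averaged near-surface impedance. The sketch below uses that reading with an invented layered profile and an invented reference impedance; it is not the paper's Kobe data or its exact formulation.

```python
# Sketch of a square-root-impedance amplification estimate: amplification
# ~ sqrt(rho_ref * v_ref / (rho_avg * v_avg)) for a travel-time-averaged
# profile down to a chosen depth (illustrative values only).
import numpy as np

# Layered S-wave profile: thickness (m), Vs (m/s), density (kg/m3).
thick = np.array([5.0, 10.0, 25.0])
vs = np.array([150.0, 300.0, 600.0])
rho = np.array([1700.0, 1900.0, 2100.0])

rho_ref, v_ref = 2500.0, 1500.0   # assumed reference (bedrock) values

def sqrt_impedance_amp(depth):
    """Time-averaged Vs and thickness-averaged density to `depth`,
    then the square root of the impedance ratio."""
    d = np.minimum(np.cumsum(thick), depth)
    h = np.diff(np.concatenate(([0.0], d)))
    tt = np.sum(h / vs)                    # S-wave travel time
    v_avg = np.sum(h) / tt
    rho_avg = np.sum(h * rho) / np.sum(h)
    return np.sqrt((rho_ref * v_ref) / (rho_avg * v_avg))

print(round(sqrt_impedance_amp(30.0), 2))
```

The depth limit in the function mirrors the abstract's point that the usable period range is bounded by the depth of exploration.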

  14. Patch-based iterative conditional geostatistical simulation using graph cuts

    NASA Astrophysics Data System (ADS)

    Li, Xue; Mariethoz, Gregoire; Lu, DeTang; Linde, Niklas

    2016-08-01

Training image-based geostatistical methods are increasingly popular in groundwater hydrology, even if existing algorithms present limitations that often make real-world applications difficult. These limitations include a computational cost that can be prohibitive for high-resolution 3-D applications, the presence of visual artifacts in the model realizations, and a low variability between model realizations due to the limited pool of patterns available in a finite-size training image. In this paper, we address these issues by proposing an iterative patch-based algorithm which adapts a graph cuts methodology that is widely used in computer graphics. Our adapted graph cuts method optimally cuts patches of pixel values borrowed from the training image and assembles them successively, each time accounting for the information of previously stitched patches. The initial simulation result might display artifacts, which are identified as regions of high cost. These artifacts are reduced by iteratively placing new patches in high-cost regions. In contrast to most patch-based algorithms, the proposed scheme can also efficiently address point conditioning. An advantage of the method is that the cut process results in the creation of new patterns that are not present in the training image, thereby increasing pattern variability. To quantify this effect, a new measure of variability, the merging index, is developed; it quantifies the pattern variability in the realizations with respect to the training image. A series of sensitivity analyses demonstrates the stability of the proposed graph cuts approach, which produces satisfying simulations for a wide range of parameter values. Applications to 2-D and 3-D cases are compared to state-of-the-art multiple-point methods. The results show that the proposed approach obtains significant speedups and increases variability between realizations. Connectivity functions applied to the 2-D models and transport simulations in the 3-D models are used to

  15. Bayesian Geostatistical Modeling of Malaria Indicator Survey Data in Angola

    PubMed Central

    Gosoniu, Laura; Veta, Andre Mia; Vounatsou, Penelope

    2010-01-01

The 2006–2007 Angola Malaria Indicator Survey (AMIS) is the first nationally representative household survey in the country assessing coverage of the key malaria control interventions and measuring malaria-related burden among children under 5 years of age. In this paper, the Angolan MIS data were analyzed to produce the first smooth map of parasitaemia prevalence based on contemporary nationwide empirical data in the country. Bayesian geostatistical models were fitted to assess the effect of interventions after adjusting for environmental, climatic and socio-economic factors. Non-linear relationships between parasitaemia risk and environmental predictors were modeled by categorizing the covariates and by employing two non-parametric approaches, the B-splines and the P-splines. The results of the model validation showed that the categorical model was better able to capture the relationship between parasitaemia prevalence and the environmental factors. Model fit and prediction were handled within a Bayesian framework using Markov chain Monte Carlo (MCMC) simulations. Combining estimates of parasitaemia prevalence with the number of children under 5 years of age, we obtained estimates of the number of infected children in the country. The population-adjusted prevalence ranges from in Namibe province to in Malanje province. The odds of parasitaemia in children living in a household with at least ITNs per person were 41% lower (CI: 14%, 60%) than in those with fewer ITNs. The estimates of the number of parasitaemic children produced in this paper are important for planning and implementing malaria control interventions and for monitoring the impact of prevention and control activities. PMID:20351775

  16. Massively Parallel Geostatistical Inversion of Coupled Processes in Heterogeneous Porous Media

    NASA Astrophysics Data System (ADS)

    Ngo, A.; Schwede, R. L.; Li, W.; Bastian, P.; Ippisch, O.; Cirpka, O. A.

    2012-04-01

The quasi-linear geostatistical approach is an inversion scheme that can be used to estimate the spatial distribution of a heterogeneous hydraulic conductivity field. The estimated parameter field is considered to be a random variable that varies continuously in space, meets the measurements of dependent quantities (such as the hydraulic head, the concentration of a transported solute, or its arrival time), and shows the required spatial correlation (described by certain variogram models). This is a method of conditioning a parameter field to observations. Upon discretization, it results in as many parameters as elements of the computational grid. For a full three-dimensional representation of the heterogeneous subsurface, the resolutions of the model domain that can be achieved on a serial computer (up to one million parameters) are hardly sufficient. The forward problem to be solved within the inversion procedure consists of the elliptic steady-state groundwater flow equation and the formally elliptic but nearly hyperbolic steady-state advection-dominated solute transport equation in a heterogeneous porous medium. Both equations are discretized by finite element methods (FEM) using fully scalable domain decomposition techniques. Whereas standard conforming FEM is sufficient for the flow equation, for the advection-dominated transport equation, which gives rise to well-known numerical difficulties at sharp fronts or boundary layers, we use the streamline diffusion approach. The arising linear systems are solved using efficient iterative solvers with an AMG (algebraic multigrid) preconditioner. During each iteration step of the inversion scheme, one needs to solve a multitude of forward and adjoint problems in order to calculate the sensitivities of each measurement and the related cross-covariance matrix of the unknown parameters and the observations. In order to reduce interprocess communication and to improve the scalability of the code on larger clusters

  17. A Novel Approach of Understanding and Incorporating Error of Chemical Transport Models into a Geostatistical Framework

    NASA Astrophysics Data System (ADS)

    Reyes, J.; Vizuete, W.; Serre, M. L.; Xu, Y.

    2015-12-01

The EPA employs a vast monitoring network to measure ambient PM2.5 concentrations across the United States, with one of its goals being to quantify exposure within the population. However, there are several areas of the country with sparse monitoring, both spatially and temporally. One means to fill in these monitoring gaps is to use PM2.5 estimates from Chemical Transport Models (CTMs), specifically the Community Multi-scale Air Quality (CMAQ) model. CMAQ is able to provide complete spatial coverage but is subject to systematic and random error due to model uncertainty. Because of the deterministic nature of CMAQ, these uncertainties are often not quantified. Much effort is spent quantifying the efficacy of these models through different metrics of model performance. Currently, evaluation is specific to locations with observed data. Multiyear studies across the United States are challenging because the error and model performance of CMAQ are not uniform over such large space/time domains: error changes regionally and temporally. Because of the complex mix of species that constitutes PM2.5, CMAQ error is also a function of increasing PM2.5 concentration. To address this issue, we introduce a model performance evaluation for PM2.5 CMAQ that is regionalized and non-linear, leading to error quantification for each CMAQ grid cell so that areas and time periods of error are better characterized. The regionalized error correction approach is non-linear and is therefore more flexible at characterizing model performance than approaches that rely on linearity assumptions and assume homoscedasticity of CMAQ prediction errors. Corrected CMAQ data are then incorporated into the modern geostatistical framework of Bayesian Maximum Entropy (BME). Through cross validation, it is shown that incorporating error-corrected CMAQ data leads to more accurate estimates than using observed data alone.
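A concentration-dependent (hence non-linear) correction can be illustrated with a binned bias adjustment on synthetic data. This is a much-simplified stand-in for the regionalized correction described above (the data, bin count, and correction rule are all invented):

```python
# Sketch of a non-linear, binned error correction for modeled PM2.5: within
# each model-concentration bin, replace the model value with the mean
# observed value at collocated monitors (synthetic data only).
import numpy as np

rng = np.random.default_rng(4)
obs = rng.gamma(shape=4.0, scale=3.0, size=2000)          # "true" PM2.5
model = 0.7 * obs + 2.0 + rng.normal(0, 2.0, obs.size)    # biased CMAQ-like

bins = np.quantile(model, np.linspace(0, 1, 11))
idx = np.clip(np.digitize(model, bins) - 1, 0, 9)
correction = np.array([obs[idx == b].mean() for b in range(10)])

model_corrected = correction[idx]
bias_before = abs((model - obs).mean())
bias_after = abs((model_corrected - obs).mean())
print(bias_before, bias_after)   # corrected bias is near zero
```

The corrected values would then serve as soft data in a framework such as BME, with the monitors' observations kept as hard data.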

  18. A new approach to upscaling fracture network models while preserving geostatistical and geomechanical characteristics

    NASA Astrophysics Data System (ADS)

    Lei, Qinghua; Latham, John-Paul; Tsang, Chin-Fu; Xiang, Jiansheng; Lang, Philipp

    2015-07-01

    A new approach to upscaling two-dimensional fracture network models is proposed for preserving geostatistical and geomechanical characteristics of a smaller-scale "source" fracture pattern. First, the scaling properties of an outcrop system are examined in terms of spatial organization, lengths, connectivity, and normal/shear displacements using fractal geometry and power law relations. The fracture pattern is observed to be nonfractal with the fractal dimension D ≈ 2, while its length distribution tends to follow a power law with the exponent 2 < a < 3. To introduce a realistic distribution of fracture aperture and shear displacement, a geomechanical model using the combined finite-discrete element method captures the response of a fractured rock sample with a domain size L = 2 m under in situ stresses. Next, a novel scheme accommodating discrete-time random walks in recursive self-referencing lattices is developed to nucleate and propagate fractures together with their stress- and scale-dependent attributes into larger domains of up to 54 m × 54 m. The advantages of this approach include preserving the nonplanarity of natural cracks, capturing the existence of long fractures, retaining the realism of variable apertures, and respecting the stress dependency of displacement-length correlations. Hydraulic behavior of multiscale growth realizations is modeled by single-phase flow simulation, where distinct permeability scaling trends are observed for different geomechanical scenarios. A transition zone is identified where flow structure shifts from extremely channeled to distributed as the network scale increases. The results of this paper have implications for upscaling network characteristics for reservoir simulation.
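The power-law length distribution with exponent 2 < a < 3 noted in the abstract can be drawn from by inverse-transform sampling; this is a generic sketch (the cutoff l_min and exponent are illustrative values, not taken from the study):

```python
import numpy as np

def sample_powerlaw_lengths(n, a=2.5, l_min=0.5, rng=None):
    """Draw n fracture lengths from a power-law density f(l) ~ l^-a,
    l >= l_min, by inverse-transform sampling. For this density the
    inverse CDF is l = l_min * (1 - u)^(-1/(a-1)) with u uniform."""
    rng = rng if rng is not None else np.random.default_rng()
    u = rng.random(n)
    return l_min * (1.0 - u) ** (-1.0 / (a - 1.0))
```

Note that for a <= 3 the variance of l diverges, which is why occasional very long fractures dominate the connectivity of such networks.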

  19. Geostatistics: a new tool for describing spatially-varied surface conditions from timber harvested and burned hillslopes

    Treesearch

    Peter R. Robichaud

    1997-01-01

Geostatistics provides a method to describe the spatial continuity of many natural phenomena. Spatial models are based upon the concepts of scaling, kriging, and conditional simulation. These techniques were used to describe the spatially-varied surface conditions on timber harvested and burned hillslopes. Geostatistical techniques provided estimates of the ground cover (...

  20. Psychosocial education improves low back pain beliefs: results from a cluster randomized clinical trial (NCT00373009) in a primary prevention setting.

    PubMed

    George, Steven Z; Teyhen, Deydre S; Wu, Samuel S; Wright, Alison C; Dugan, Jessica L; Yang, Guijun; Robinson, Michael E; Childs, John D

    2009-07-01

The general population has a pessimistic view of low back pain (LBP), and evidence-based information has been used to positively influence LBP beliefs in previously reported mass media studies. However, there is a lack of randomized trials investigating whether LBP beliefs can be modified in primary prevention settings. This cluster randomized clinical trial investigated the effect of an evidence-based psychosocial educational program (PSEP) on LBP beliefs for soldiers completing military training. A military setting was selected for this clinical trial because LBP is a common cause of soldier disability. Companies of soldiers (n = 3,792) were recruited and cluster-randomized to receive a PSEP or no education (control group, CG). The PSEP consisted of an interactive seminar, and soldiers were issued the Back Book for reference material. The primary outcome measure was the Back Beliefs Questionnaire (BBQ), which assesses beliefs about the inevitable consequences of, and the ability to cope with, LBP. The BBQ was administered before randomization and 12 weeks later. A linear mixed model was fitted for the BBQ at the 12-week follow-up, and a generalized linear mixed model was fitted for the dichotomous outcome of BBQ change greater than two points. Sensitivity analyses were performed to account for dropout. BBQ scores (potential range: 9-45) improved significantly from a baseline of 25.6 +/- 5.7 (mean +/- SD) to 26.9 +/- 6.2 for those receiving the PSEP, while there was a significant decline from 26.1 +/- 5.7 to 25.6 +/- 6.0 for those in the CG. The adjusted mean BBQ score at follow-up for those receiving the PSEP was 1.49 points higher than for those in the CG (P < 0.0001). The adjusted odds ratio of BBQ improvement of greater than two points for those receiving the PSEP was 1.51 (95% CI = 1.22-1.86) times that of those in the CG. BBQ improvement was also mildly associated with race and college education. Sensitivity analyses suggested minimal influence of dropout. In conclusion, soldiers

  1. Effect of mobile reminders on screening yield during opportunistic screening for type 2 diabetes mellitus in a primary health care setting: A randomized trial

    PubMed Central

    Kumar, Sathish; Shewade, Hemant Deepak; Vasudevan, Kavita; Durairaju, Kathamuthu; Santhi, V.S.; Sunderamurthy, Bhuvaneswary; Krishnakumari, Velavane; Panigrahi, Krishna Chandra

    2015-01-01

Objective. We wanted to study whether mobile reminders increased follow-up for definitive tests, resulting in a higher screening yield during opportunistic screening for diabetes. Methods. This was a facility-based parallel randomized controlled trial during routine outpatient department hours in a primary health care setting in Puducherry, India (2014). We offered random blood glucose testing to non-pregnant, non-diabetic adults aged >30 years (667 total, 390 consented); eligible outpatients (random blood glucose ≥ 6.1 mmol/l, n = 268) were requested to follow up for definitive tests (fasting and postprandial blood glucose). Eligible outpatients either received (intervention arm, n = 133) or did not receive (control arm, n = 135) mobile reminders to follow up for definitive tests. We measured capillary blood glucose using a glucometer to make an epidemiological diagnosis of diabetes. The trial was registered with the Clinical Trial Registry of India (CTRI/2014/10/005138). Results. 85.7% of outpatients in the intervention arm returned for definitive tests, compared to 53.3% in the control arm (relative risk = 1.61; 95% confidence interval: 1.35, 1.91). Screening yield in the intervention and control arms was 18.6% and 10.2%, respectively. The etiologic fraction was 45.2% and the number needed to screen was 11.9. Conclusion. In countries like India, which is emerging as the diabetes capital of the world, and given the widespread use of mobile phones and the real-life, resource-limited settings in which this study was carried out, mobile reminders during opportunistic screening in a primary health care setting improve the screening yield of diabetes. PMID:26844130

  2. Use of geostatistical modeling to capture complex geology in finite-element analyses

    SciTech Connect

    Rautman, C.A.; Longenbaugh, R.S.; Ryder, E.E.

    1995-12-01

This paper summarizes a number of transient thermal analyses performed for a representative two-dimensional cross section of volcanic tuffs at Yucca Mountain using the finite element, nonlinear heat-conduction code COYOTE-II. In addition to conventional design analyses, in which material properties are formulated as a single uniform material and as horizontally layered, internally uniform materials, an attempt was made to increase the resemblance of the thermal property field to the actual geology by creating two fairly complex, geologically realistic models. The first model was created by digitizing an existing two-dimensional geologic cross section of Yucca Mountain. The second model was created using conditional geostatistical simulation. Direct mapping of geostatistically generated material property fields onto finite element computational meshes was demonstrated to yield temperature fields approximately equivalent to those generated through more conventional procedures. Moreover, the ability to use the geostatistical models offers a means of simplifying the physical-process analyses.
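The direct mapping of a simulated material property field onto a finite element mesh can be sketched, under the assumption of a regular simulation grid, as a nearest-cell lookup at each element centroid (function and parameter names here are hypothetical, not from the paper):

```python
import numpy as np

def map_field_to_mesh(field, origin, spacing, centroids):
    """Map a 2-D geostatistically simulated property grid onto finite
    element centroids by nearest-cell lookup: convert each centroid
    coordinate to a (row, col) cell index and read the grid value."""
    ij = np.floor((centroids - origin) / spacing).astype(int)
    ij[:, 0] = np.clip(ij[:, 0], 0, field.shape[0] - 1)  # clamp to grid
    ij[:, 1] = np.clip(ij[:, 1], 0, field.shape[1] - 1)
    return field[ij[:, 0], ij[:, 1]]
```

A per-element lookup like this assigns each element the property of the simulation cell containing its centroid, which is the simplest way to transfer a simulated field to a computational mesh without averaging.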

  3. Geostatistical modeling of a heterogeneous site bordering the Venice lagoon, Italy.

    PubMed

    Trevisani, Sebastiano; Fabbri, Paolo

    2010-01-01

Geostatistical methods are well suited for analyzing the local and spatial uncertainties that accompany the modeling of highly heterogeneous three-dimensional (3D) geological architectures. The spatial modeling of 3D hydrogeological architectures is crucial for polluted site characterization, with regard to both groundwater modeling and the planning of remediation procedures. From this perspective, the polluted site of Porto Marghera, located on the periphery of the Venice lagoon, represents an interesting example. For this site, the available dense spatial sampling network, with 769 boreholes over an area of 6 km², allows us to evaluate the high geological heterogeneity by means of indicator kriging and sequential indicator simulation. We show that geostatistical methodologies and ad hoc post-processing of geostatistical analysis results allow us to effectively analyze the high hydrogeological heterogeneity of the studied site.
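Indicator kriging and sequential indicator simulation both start from an indicator (one-hot) coding of the categorical borehole data; a minimal sketch of that transform, with illustrative category codes:

```python
import numpy as np

def indicator_transform(categories, codes):
    """Indicator coding of borehole lithology: one binary column per
    category code, so each column can be kriged as its own variable
    (the starting point for indicator kriging and SIS)."""
    return np.stack([(categories == c).astype(float) for c in codes], axis=1)
```

Each column's kriged value at an unsampled location is then interpreted as the (order-corrected) probability of that lithology occurring there.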

  4. TiConverter: A training image converting tool for multiple-point geostatistics

    NASA Astrophysics Data System (ADS)

    Fadlelmula F., Mohamed M.; Killough, John; Fraim, Michael

    2016-11-01

TiConverter is a tool developed to ease the application of multiple-point geostatistics, whether with the open-source Stanford Geostatistical Modeling Software (SGeMS) or other available commercial software. TiConverter has a user-friendly interface and allows the conversion of 2D training images into numerical representations in four different file formats without the need for additional code writing. These are the ASCII (.txt), geostatistical software library (GSLIB) (.txt), Isatis (.dat), and VTK formats. It performs the conversion based on the RGB color system. In addition, TiConverter offers several useful tools, including image resizing, smoothing, and segmenting tools. The purpose of this study is to introduce TiConverter and to demonstrate its application and advantages with several examples from the literature.
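A conversion based on the RGB color system, of the kind TiConverter performs, can be sketched as a nearest-palette-colour lookup; this is a generic illustration (the palette is user-supplied and TiConverter's exact behaviour may differ):

```python
import numpy as np

def rgb_to_facies(img, palette):
    """Convert an RGB training image (H x W x 3 uint8 array) into
    integer facies codes by assigning each pixel the index of the
    nearest palette colour (squared Euclidean distance in RGB)."""
    img = img.astype(float)
    pal = np.asarray(palette, dtype=float)           # (K, 3)
    d2 = ((img[..., None, :] - pal) ** 2).sum(-1)    # (H, W, K)
    return d2.argmin(-1)                             # (H, W) facies codes
```

The resulting integer grid is what then gets written out in GSLIB, Isatis, or VTK layout for the MPS software to read.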

  5. Use of geostatistics for remediation planning to transcend urban political boundaries.

    PubMed

    Milillo, Tammy M; Sinha, Gaurav; Gardella, Joseph A

    2012-11-01

    Soil remediation plans are often dictated by areas of jurisdiction or property lines instead of scientific information. This study exemplifies how geostatistically interpolated surfaces can substantially improve remediation planning. Ordinary kriging, ordinary co-kriging, and inverse distance weighting spatial interpolation methods were compared for analyzing surface and sub-surface soil sample data originally collected by the US EPA and researchers at the University at Buffalo in Hickory Woods, an industrial-residential neighborhood in Buffalo, NY, where both lead and arsenic contamination is present. Past clean-up efforts estimated contamination levels from point samples, but parcel and agency jurisdiction boundaries were used to define remediation sites, rather than geostatistical models estimating the spatial behavior of the contaminants in the soil. Residents were understandably dissatisfied with the arbitrariness of the remediation plan. In this study we show how geostatistical mapping and participatory assessment can make soil remediation scientifically defensible, socially acceptable, and economically feasible.
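Of the three interpolators compared in this study, inverse distance weighting is the simplest; a minimal sketch (the power parameter and the eps guard against zero distances are conventional choices, not values taken from the study):

```python
import numpy as np

def idw(xy_obs, z_obs, xy_new, power=2.0, eps=1e-12):
    """Inverse distance weighted interpolation: each prediction is a
    weighted mean of the observations, with weights 1/d^power."""
    d = np.linalg.norm(xy_new[:, None, :] - xy_obs[None, :, :], axis=-1)
    w = 1.0 / (d ** power + eps)   # eps keeps collocated points finite
    return (w * z_obs).sum(axis=1) / w.sum(axis=1)
```

Unlike kriging, IDW ignores the spatial covariance structure of the contaminant and provides no estimation variance, which is one reason geostatistical interpolators are preferred for defensible remediation boundaries.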

  6. Associations of High-Dose Melphalan Pharmacokinetics and Outcomes in the Setting of a Randomized Cryotherapy Trial.

    PubMed

    Cho, Y K; Sborov, D W; Lamprecht, M; Li, J; Wang, J; Hade, E M; Gao, Y; Tackett, K; Williams, N; Benson, D M; Efebera, Y A; Rosko, A E; Devine, S M; Poi, M; Hofmeister, C C; Phelps, M A

    2017-09-01

    High-dose melphalan followed by autologous stem cell transplantation remains the standard of care for eligible patients with multiple myeloma, but disease response and toxicity, including severe mucositis, varies among patients. Our randomized trial investigated duration of cryotherapy (2 and 6 h) for reduction of mucositis prevalence and severity and explored factors associated with variability in pharmacokinetics and outcomes from melphalan therapy. The results demonstrate that 2-h is at least as effective as 6-h cryotherapy in decreasing severe mucositis. From a population pharmacokinetic model, we identified that fat-free mass, hematocrit, and creatinine clearance were significant covariates, as reported previously. Furthermore, we observed the rs4240803 SLC7A5 polymorphism was significantly associated with pharmacokinetic variability, and pharmacokinetics was associated with both mucositis and neutropenia. However, melphalan exposure was not associated with progression-free or overall survival in our dataset. These findings contribute to ongoing efforts to personalize melphalan dosing in transplant patients. © 2017 American Society for Clinical Pharmacology and Therapeutics.

  7. Classroom-based Interventions and Teachers' Perceived Job Stressors and Confidence: Evidence from a Randomized Trial in Head Start Settings.

    PubMed

    Zhai, Fuhua; Raver, C Cybele; Li-Grining, Christine

    2011-09-01

    Preschool teachers' job stressors have received increasing attention but have been understudied in the literature. We investigated the impacts of a classroom-based intervention, the Chicago School Readiness Project (CSRP), on teachers' perceived job stressors and confidence, as indexed by their perceptions of job control, job resources, job demands, and confidence in behavior management. Using a clustered randomized controlled trial (RCT) design, the CSRP provided multifaceted services to the treatment group, including teacher training and mental health consultation, which were accompanied by stress-reduction services and workshops. Overall, 90 teachers in 35 classrooms at 18 Head Start sites participated in the study. After adjusting for teacher and classroom factors and site fixed effects, we found that the CSRP had significant effects on the improvement of teachers' perceived job control and work-related resources. We also found that the CSRP decreased teachers' confidence in behavior management and had no statistically significant effects on job demands. Overall, we did not find significant moderation effects of teacher race/ethnicity, education, teaching experience, or teacher type. The implications for research and policy are discussed.

  8. Weight gain prevention in the school worksite setting: Results of a multi-level cluster randomized trial

    PubMed Central

    Lemon, Stephenie C.; Wang, Monica L.; Wedick, Nicole M.; Estabrook, Barbara; Druker, Susan; Schneider, Kristin L.; Li, Wenjun; Pbert, Lori

    2014-01-01

Objective: To describe the effectiveness, reach, and implementation of a weight gain prevention intervention among public school employees. Method: A multi-level intervention was tested in a cluster randomized trial among 782 employees in 12 central Massachusetts public high schools from 2009 to 2012. The intervention targeted the nutrition and physical activity environment and policies, the social environment, and individual knowledge, attitudes, and skills. The intervention was compared to a materials-only condition. The primary outcome measures were change in weight and body mass index (BMI) at 24-month follow-up. Implementation of physical environment, policy, and social environment strategies at the school and interpersonal levels, and intervention participation at the individual level, were assessed. Results: At 24-month follow-up, there was a net change (difference-in-differences) of −3.03 pounds (p=.04) and of −.48 BMI units (p=.05) between intervention and comparison conditions. The majority of intervention strategies were successfully implemented by all intervention schools, although establishing formal policies was challenging. Employee participation in programs targeting the physical and social environment was maintained over time. Conclusion: This study supports that a multi-level intervention integrated within the organizational culture can be successfully implemented and can prevent weight gain among public high school employees. PMID:24345602

  9. Screening and Brief Intervention for Unhealthy Drug Use in Primary Care Settings: Randomized Clinical Trials Are Needed

    PubMed Central

    Saitz, Richard; Alford, Daniel P.; Bernstein, Judith; Cheng, Debbie M.; Samet, Jeffrey; Palfai, Tibor

    2010-01-01

    The efficacy of screening and brief intervention (SBI) for drug use in primary care patients is largely unknown. Because of this lack of evidence, US professional organizations do not recommend it. Yet, a strong theoretical case can be made for drug SBI. Drug use is common and associated with numerous health consequences, patients usually do not seek help for drug abuse and dependence, and SBI has proven efficacy for unhealthy alcohol use. On the other hand, the diversity of drugs of abuse and the high prevalence of abuse and dependence among those who use them raise concerns that drug SBI may have limited or no efficacy. Federal efforts to disseminate SBI for drug use are underway, and reimbursement codes to compensate clinicians for these activities have been developed. However, the discrepancies between science and policy developments underscore the need for evidence-based research regarding the efficacy of SBI for drug use. This article discusses the rationale for drug SBI and existing research on its potential to improve drug-use outcomes and makes the argument that randomized controlled trials to determine its efficacy are urgently needed to bridge the gap between research, policy, and clinical practice. PMID:20936079

  10. Classroom-based Interventions and Teachers’ Perceived Job Stressors and Confidence: Evidence from a Randomized Trial in Head Start Settings

    PubMed Central

    Zhai, Fuhua; Raver, C. Cybele; Li-Grining, Christine

    2011-01-01

    Preschool teachers’ job stressors have received increasing attention but have been understudied in the literature. We investigated the impacts of a classroom-based intervention, the Chicago School Readiness Project (CSRP), on teachers’ perceived job stressors and confidence, as indexed by their perceptions of job control, job resources, job demands, and confidence in behavior management. Using a clustered randomized controlled trial (RCT) design, the CSRP provided multifaceted services to the treatment group, including teacher training and mental health consultation, which were accompanied by stress-reduction services and workshops. Overall, 90 teachers in 35 classrooms at 18 Head Start sites participated in the study. After adjusting for teacher and classroom factors and site fixed effects, we found that the CSRP had significant effects on the improvement of teachers’ perceived job control and work-related resources. We also found that the CSRP decreased teachers’ confidence in behavior management and had no statistically significant effects on job demands. Overall, we did not find significant moderation effects of teacher race/ethnicity, education, teaching experience, or teacher type. The implications for research and policy are discussed. PMID:21927538

  11. Multiple-Point Geostatistics and Near-Surface Geophysics for Modeling Heterogeneity in a Coastal Aquifer

    NASA Astrophysics Data System (ADS)

    Trainor, W. J.; Knight, R. J.; Caers, J. K.

    2007-12-01

In order to effectively manage groundwater resources, water agencies have begun to incorporate precipitation, temperature, stream-gauge, land-cover and groundwater level data into their aquifer models. For Western States in particular, stored groundwater is an important source for agriculture and human consumption, but the estimates of groundwater quantity are arguably the most uncertain in the water balance equation. Current practice in constructing subsurface models relies on substandard and incomplete data due, in large part, to budgetary constraints. Once a final model has been developed, the possible inaccuracies in the geological scenarios are rarely examined or investigated. How wrong can the subsurface model be while still giving accurate prediction results? How sensitive is the model response to perturbations in the subsurface parameters and long-term irrigation, precipitation and recharge conditions? This study examines these questions through a sensitivity analysis. The "working" aquifers of California's agricultural central coast were used as analog systems for the construction of this sensitivity study. The fluvial geologic interpretations of these coastal aquifer systems were used in Boolean (object-oriented) and multiple-point geostatistical algorithms to create many alternative permeability fields, reflecting the uncertainty in the spatial distribution and geological scenario of the subsurface permeability field. Two sets of models were created using SNESIM, a multiple-point geostatistical algorithm. SNESIM is able to generate a stochastic facies realization using a training image (TI, a conceptual representation of the geologic system) with rotation and affinity maps. The first set of models is higher-entropy, representing less continuous clay layers; these were created from a TI of clay ellipses (which was created using the GSLIB Ellipsim program). The second set of models is more heterogeneous, using a fluvial TI within SNESIM. All the realizations
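A binary ellipse training image of the kind produced with GSLIB's Ellipsim program can be sketched as follows; all shapes, counts, and radii here are illustrative, not the study's parameters:

```python
import numpy as np

def ellipse_training_image(shape=(100, 100), n=30, rx=8, ry=3, seed=0):
    """Generate a binary training image of random ellipses (code 1,
    e.g. clay bodies) in a background matrix (code 0), loosely in the
    spirit of GSLIB's Ellipsim: drop n axis-aligned ellipses of
    semi-axes (rx, ry) at uniformly random centres."""
    rng = np.random.default_rng(seed)
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    ti = np.zeros(shape, dtype=int)
    for _ in range(n):
        cy, cx = rng.uniform(0, shape[0]), rng.uniform(0, shape[1])
        ti[((xx - cx) / rx) ** 2 + ((yy - cy) / ry) ** 2 <= 1.0] = 1
    return ti
```

A TI like this encodes short, discontinuous elongated bodies; MPS algorithms such as SNESIM then borrow its multiple-point patterns (optionally rotated and rescaled via rotation and affinity maps) when simulating facies realizations.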

  12. Geostatistical discrimination between different sources of soil pollutants using a magneto-geochemical data set.

    PubMed

    Zawadzki, Jarosław; Szuszkiewicz, Marcin; Fabijańczyk, Piotr; Magiera, Tadeusz

    2016-12-01

The primary goal of this work was to distinguish between soil pollution from long-range and local transport of atmospheric pollutants using soil magnetometry supported by geochemical analyses. The study area was located in the Izery region of Poland (within the "Black Triangle" region, which is the nickname for one of Europe's most polluted areas, where Germany, Poland and the Czech Republic meet). One site of the study area was situated in the Forest Glade and was exposed to anthropogenic pollution from a former glassworks. The second site of the study area was located on a neighboring hill (Granicznik), whose western, northwestern and southwestern slopes were exposed to the long-range transport of atmospheric pollutants from the Czech Republic, Germany and Poland. Magnetic susceptibility was measured on the soil surface and in the soil samples using an MS2 Bartington meter equipped with MS2D and MS2C sensors, respectively. Using soil magnetometry, it was possible to discriminate between long-range transport of atmospheric pollutants and anthropogenic pollution related to the former glassworks located in the Forest Glade. Additionally, using MS2C measurements and geochemical analyses of sixteen trace elements, it was possible to discriminate between natural and anthropogenic origins of a soil magnetic susceptibility signal. Our results indicate that the Forest Glade site is characterized by relatively significant anthropogenic translocation of topsoil horizons, the presence of artefacts, more hot spots, very high spatial variability, and a higher nugget effect than Granicznik Hill.

  13. Psychoneuroendocrine effects of cognitive-behavioral stress management in a naturalistic setting--a randomized controlled trial.

    PubMed

    Gaab, J; Sonderegger, L; Scherrer, S; Ehlert, U

    2006-05-01

It is assumed that chronic or extensive release of cortisol due to stress has deleterious effects on somatic and psychological health, making interventions that aim to reduce and/or normalize the cortisol response to stress of interest. Cognitive-behavioral stress management (CBSM) has repeatedly been shown to effectively reduce cortisol responses to acute psychosocial stress. However, the effects of CBSM on psychoneuroendocrine responses during "real-life" stress have not yet been examined in healthy subjects. Eight weeks before all subjects took an important academic exam, 28 healthy economics students were randomly assigned to four weekly sessions of CBSM training or a waiting control condition. Psychological and somatic symptoms were repeatedly assessed throughout the preparation period. Salivary cortisol (cortisol awakening response and short circadian cortisol profile) was repeatedly measured at baseline and on the day of the exam. In addition, cognitive appraisal was assessed on the day of the exam. Subjects in the CBSM group showed significantly lower anxiety and somatic symptom levels throughout the period prior to the exam. On the day of the exam, groups differed in their cortisol awakening stress responses, with significantly attenuated cortisol levels in controls. Short circadian cortisol levels did not differ between groups. Interestingly, groups differed in the associations between their cortisol responses before the exam and cognitive stress appraisal, with dissociation in controls but not in the CBSM group. The results show that CBSM reduces psychological and somatic symptoms and influences the ability to mount a cortisol response corresponding to subjectively perceived stress. In line with current psychoneuroendocrine models, the inability to mount a cortisol response corresponding to the cognitive appraisal in controls could be the result of a dysregulated HPA axis, probably as a consequence of long-lasting stress.

  14. Creating an integrated care model for childhood obesity: a randomized pilot study utilizing telehealth in a community primary care setting.

    PubMed

    Fleischman, A; Hourigan, S E; Lyon, H N; Landry, M G; Reynolds, J; Steltz, S K; Robinson, L; Keating, S; Feldman, H A; Antonelli, R C; Ludwig, D S; Ebbeling, C B

    2016-12-01

In an integrated care model, involving primary care providers (PCPs) and obesity specialists, telehealth may be useful for overcoming barriers to treating childhood obesity. We conducted a pilot study comparing body mass index (BMI) changes between two arms: (i) PCP in-person clinic visits plus obesity specialist tele-visits (PCP visits + specialist tele-visits) and (ii) PCP in-person clinic visits only (PCP visits only), with ongoing tele-consultation between PCPs and obesity specialists for both arms. Patients (N = 40, 10-17 years, BMI ≥ 95th percentile) were randomized to Group 1 or 2. Both groups had PCP visits every 3 months for 12 months. Using a cross-over protocol, Group 1 had PCP visits + specialist tele-visits during the first 6 months and PCP visits only during the second 6 months, and Group 2 followed the opposite sequence. Each of 12 tele-visits was conducted by a dietitian or psychologist with a patient and parent. Retention rates were 90% at 6 months and 80% at 12 months. BMI (z-score) decreased more for Group 1 (started with PCP visits + specialist tele-visits) vs. Group 2 (started with PCP visits only) at 3 months (-0.11 vs. -0.05, P = 0.049) following frequent tele-visits. At 6 months (primary outcome), BMI was lower than baseline within Group 1 (-0.11, P = 0.0006) but not Group 2 (-0.06, P = 0.08); however, decrease in BMI at 6 months did not differ between groups. After crossover, BMI remained lower than baseline for Group 1 and dropped below baseline for Group 2. An integrated care model utilizing telehealth holds promise for treating children with obesity. © 2016 World Obesity Federation.

  15. Random-phase approximation correlation energies from Lanczos chains and an optimal basis set: theory and applications to the benzene dimer.

    PubMed

    Rocca, Dario

    2014-05-14

A new ab initio approach is introduced to compute the correlation energy within the adiabatic connection fluctuation-dissipation theorem in the random phase approximation. First, an optimally small basis set to represent the response functions is obtained by diagonalizing an approximate dielectric matrix containing the kinetic energy contribution only. Then, the Lanczos algorithm is used to compute the full dynamical dielectric matrix and the correlation energy. Convergence issues with respect to the number of empty states or the dimension of the basis set are avoided, and dynamical effects are easily taken into account. To demonstrate the accuracy and efficiency of this approach, the binding curves for three different configurations of the benzene dimer are computed: T-shaped, sandwich, and slipped parallel.
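For context, the ACFDT-RPA correlation energy that such schemes evaluate is commonly written as the following imaginary-frequency integral (standard textbook form, not reproduced from this paper; χ₀ is the independent-particle response function and v the Coulomb kernel):

```latex
E_c^{\mathrm{RPA}} = \frac{1}{2\pi} \int_0^{\infty} \mathrm{d}\omega \,
\mathrm{Tr}\!\left[ \ln\!\left( 1 - \chi_0(i\omega)\, v \right) + \chi_0(i\omega)\, v \right]
```

The trace over the response matrix is what makes a compact basis for χ₀, as described in the abstract, so effective at reducing the cost.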

  16. Geostatistical analysis of groundwater level using Euclidean and non-Euclidean distance metrics and variable variogram fitting criteria

    NASA Astrophysics Data System (ADS)

    Theodoridou, Panagiota G.; Karatzas, George P.; Varouchakis, Emmanouil A.; Corzo Perez, Gerald A.

    2015-04-01

Groundwater level is an important input in hydrological modelling. Geostatistical methods are often employed to map the free surface of an aquifer. In geostatistical analysis using kriging techniques, the selection of the optimal variogram model is very important for optimal method performance. This work compares three different criteria, the least squares sum method, the Akaike Information Criterion, and Cressie's indicator, to assess the theoretical variogram that best fits the experimental one, and investigates the impact on the prediction results. Moreover, five different distance functions (Euclidean, Minkowski, Manhattan, Canberra, and Bray-Curtis) are applied to calculate the distance between observations, which affects both the variogram calculation and the kriging estimator. Cross-validation analysis in terms of ordinary kriging is applied by sequentially using a different distance metric and the above three variogram fitting criteria. The spatial dependence of the observations in the tested dataset is studied by fitting classical variogram models and the Matérn model. The proposed comparison analysis was performed for a data set of two hundred fifty hydraulic head measurements distributed over an alluvial aquifer that covers an area of 210 km². The study area is located in the Prefecture of Drama, which belongs to the Water District of East Macedonia (Greece). This area was selected in terms of hydro-geological data availability and geological homogeneity. The analysis showed that a combination of the Akaike Information Criterion for the variogram fitting assessment and the Bray-Curtis distance metric provided the most accurate cross-validation results. The power-law variogram model provided the best fit to the experimental data. For this dataset, the aforementioned approach improves the prediction efficiency of ordinary kriging in comparison to the classical Euclidean distance metric. Therefore, maps of the spatial
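The five distance functions compared in this study can be written out directly for a pair of coordinate vectors; a minimal sketch (the Minkowski order is an arbitrary illustrative choice, and Canberra assumes no coordinate pair sums to zero):

```python
import numpy as np

def distances(p, q, minkowski_p=3):
    """The five distance metrics compared in the study, for two
    coordinate vectors p and q."""
    diff = np.abs(p - q)
    return {
        "euclidean": np.sqrt((diff ** 2).sum()),
        "minkowski": (diff ** minkowski_p).sum() ** (1.0 / minkowski_p),
        "manhattan": diff.sum(),
        "canberra": (diff / (np.abs(p) + np.abs(q))).sum(),
        "bray_curtis": diff.sum() / (np.abs(p) + np.abs(q)).sum(),
    }
```

Substituting a non-Euclidean metric changes every lag in the experimental variogram, so the fitted model and the kriging weights both inherit the choice of distance function.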

  17. Effectiveness of a self-management program for dual sensory impaired seniors in aged care settings: study protocol for a cluster randomized controlled trial

    PubMed Central

    2013-01-01

Background: Five to 25 percent of residents in aged care settings have a combined hearing and visual sensory impairment. Usual care is generally restricted to single sensory impairment, neglecting the consequences of dual sensory impairment on social participation and autonomy. The aim of this study is to evaluate the effectiveness of a self-management program for seniors who acquired dual sensory impairment at old age. Methods/Design: In a cluster randomized, single-blind controlled trial, with aged care settings as the unit of randomization, the effectiveness of a self-management program will be compared to usual care. A minimum of 14 and maximum of 20 settings will be randomized to either the intervention cluster or the control cluster, aiming to include a total of 132 seniors with dual sensory impairment. Each senior will be linked to a licensed practical nurse working at the setting. During a five to six month intervention period, nurses at the intervention clusters will be trained in a self-management program to support and empower seniors to use self-management strategies. In two separate diaries, nurses keep track of the interviews with the seniors and their reflections on their own learning process. Nurses of the control clusters offer care as usual. At senior level, the primary outcome is the social participation of the seniors measured using the Hearing Handicap Questionnaire and the Activity Card Sort, and secondary outcomes are mood, autonomy and quality of life. At nurse level, the outcome is job satisfaction. Effectiveness will be evaluated using linear mixed model analysis. Discussion: The results of this study will provide evidence for the effectiveness of the Self-Management Program for seniors with dual sensory impairment living in aged care settings. The findings are expected to contribute to the knowledge on the program’s potential to enhance social participation and autonomy of the seniors, as well as increasing the job satisfaction of the

  18. Effectiveness of a self-management program for dual sensory impaired seniors in aged care settings: study protocol for a cluster randomized controlled trial.

    PubMed

    Roets-Merken, Lieve M; Graff, Maud J L; Zuidema, Sytse U; Hermsen, Pieter G J M; Teerenstra, Steven; Kempen, Gertrudis I J M; Vernooij-Dassen, Myrra J F J

    2013-10-07

    Five to 25 percent of residents in aged care settings have a combined hearing and visual sensory impairment. Usual care is generally restricted to single sensory impairment, neglecting the consequences of dual sensory impairment on social participation and autonomy. The aim of this study is to evaluate the effectiveness of a self-management program for seniors who acquired dual sensory impairment at old age. In a cluster randomized, single-blind controlled trial, with aged care settings as the unit of randomization, the effectiveness of a self-management program will be compared to usual care. A minimum of 14 and maximum of 20 settings will be randomized to either the intervention cluster or the control cluster, aiming to include a total of 132 seniors with dual sensory impairment. Each senior will be linked to a licensed practical nurse working at the setting. During a five to six month intervention period, nurses at the intervention clusters will be trained in a self-management program to support and empower seniors to use self-management strategies. In two separate diaries, nurses keep track of the interviews with the seniors and their reflections on their own learning process. Nurses of the control clusters offer care as usual. At senior level, the primary outcome is the social participation of the seniors measured using the Hearing Handicap Questionnaire and the Activity Card Sort, and secondary outcomes are mood, autonomy and quality of life. At nurse level, the outcome is job satisfaction. Effectiveness will be evaluated using linear mixed model analysis. The results of this study will provide evidence for the effectiveness of the Self-Management Program for seniors with dual sensory impairment living in aged care settings. The findings are expected to contribute to the knowledge on the program's potential to enhance social participation and autonomy of the seniors, as well as increasing the job satisfaction of the licensed practical nurses. 
Furthermore, an

  19. Family-centered rounds in Pakistani pediatric intensive care settings: non-randomized pre- and post-study design.

    PubMed

    Ladak, Laila Akbar; Premji, Shahirose Sadrudin; Amanullah, Muhammad Muneer; Haque, Anwarul; Ajani, Khairulnissa; Siddiqui, Fahad Javaid

    2013-06-01

Involvement of family in bedside rounds is one strategy to implement family-centered care to help families get clear information about their child, and be actively involved in decision-making about care. However, in developing countries such as Pakistan, daily bedside rounds include the physician, residents, medical students and a nurse/technician. Parents are not currently a part of these rounds. To assess whether family-centered rounds improve parents' and health care professionals' satisfaction, decrease patient length of stay, and improve time utilization when compared to traditional practice rounds in a population with a low literacy rate, socioeconomic status, and different cultural values and beliefs. A non-randomized before-after study design. A private hospital in Karachi, Pakistan. A convenience sample of 82 parents, whose children were hospitalized for a minimum of 48h, and 25 health care professionals able to attend two consecutive rounds. During the before phase, traditional bedside rounds were practiced; and during the after phase, family-centered rounds were practiced. Parents and health care professionals completed a questionnaire on the second day of rounds. An observational form facilitated data collection on length of stay and time utilization during rounds. Parents' ratings during the family-centered rounds were significantly higher for some parental satisfaction items: evidence of team work (p=0.007), use of simple language during the rounds (p=0.002), feeling of inclusion in discussion at rounds (p=0.03), decision making (p=0.01), and preference for family-centered rounds (p<0.001). No significant differences were found in health care professionals' satisfaction between rounds. Patient length of stay was significantly reduced in the family-centered rounds group, while no significant difference was found in the duration of rounds. Family-centered rounds served as an opportunity for parents to correct/add to patient history or documentation. 
Parents were

  20. MD-Logic overnight type 1 diabetes control in home settings: A multicentre, multinational, single blind randomized trial.

    PubMed

    Nimri, Revital; Bratina, Natasa; Kordonouri, Olga; Avbelj Stefanija, Magdalena; Fath, Maryam; Biester, Torben; Muller, Ido; Atlas, Eran; Miller, Shahar; Fogel, Aviel; Phillip, Moshe; Danne, Thomas; Battelino, Tadej

    2017-04-01

    To evaluate the safety, efficacy and need for remote monitoring of the MD-Logic closed-loop system during short-term overnight use at home. Seventy-five patients (38 male; aged 10-54 years; average A1c, 7.8% ± 0.7%, 61.8 ± 7.2 mmol/mol) were enrolled from 3 clinical sites. Patients were randomly assigned to participate in 2 overnight crossover periods, each including 4 consecutive nights, 1 under closed-loop control and 1 under sensor-augmented pump (SAP) therapy in the patient's home. Both study arms were supervised using a remote-monitoring system in a blinded manner. Primary endpoints were time spent with glucose levels below 70 mg/dL and percentage of nights in which mean overnight glucose levels were within 90 to 140 mg/dL. The median [interquartile range] percentage of time spent in hypoglycaemia was significantly lower on nights when MD-Logic was used, compared to SAP therapy (2.07 [0, 4.78] and 2.6 [0, 10.34], respectively; P = .004) and the percentage of individual nights with a mean overnight glucose level in target was significantly greater (75 [42, 75] and 50 [25,75], respectively; P = .008). The time spent in target range was increased by a median of 28% (P = .001), with the same amount of insulin (10.69 [7.28, 13.94] and 10.41[6.9, 14.07], respectively; P = .087). The remote monitoring triggered calls for hypoglycaemia at twice the rate during SAP therapy compared to closed-loop control (62 and 29, respectively; P = .002). The MD-Logic system demonstrated a safe and efficient profile during overnight use by children, adolescents and adults with type 1 diabetes and, therefore, provides an effective means of mitigating the risk of nocturnal hypoglycaemia. © 2016 John Wiley & Sons Ltd.

  1. A Controlled Design of Aligned and Random Nanofibers for 3D Bi-functionalized Nerve Conduits Fabricated via a Novel Electrospinning Set-up

    PubMed Central

    Kim, Jeong In; Hwang, Tae In; Aguilar, Ludwig Erik; Park, Chan Hee; Kim, Cheol Sang

    2016-01-01

    Scaffolds made of aligned nanofibers are favorable for nerve regeneration due to their superior nerve cell attachment and proliferation. However, it is challenging not only to produce a neat mat or a conduit form with aligned nanofibers but also to use these for surgical applications as a nerve guide conduit due to their insufficient mechanical strength. Furthermore, no studies have been reported on the fabrication of aligned nanofibers and randomly-oriented nanofibers on the same mat. In this study, we have successfully produced a mat with both aligned and randomly-oriented nanofibers by using a novel electrospinning set up. A new conduit with a highly-aligned electrospun mat is produced with this modified electrospinning method, and this proposed conduit with favorable features, such as selective permeability, hydrophilicity and nerve growth directional steering, were fabricated as nerve guide conduits (NGCs). The inner surface of the nerve conduit is covered with highly aligned electrospun nanofibers and is able to enhance the proliferation of neural cells. The central part of the tube is double-coated with randomly-oriented nanofibers over the aligned nanofibers, strengthening the weak mechanical strength of the aligned nanofibers. PMID:27021221

  2. A Controlled Design of Aligned and Random Nanofibers for 3D Bi-functionalized Nerve Conduits Fabricated via a Novel Electrospinning Set-up.

    PubMed

    Kim, Jeong In; Hwang, Tae In; Aguilar, Ludwig Erik; Park, Chan Hee; Kim, Cheol Sang

    2016-03-29

    Scaffolds made of aligned nanofibers are favorable for nerve regeneration due to their superior nerve cell attachment and proliferation. However, it is challenging not only to produce a neat mat or a conduit form with aligned nanofibers but also to use these for surgical applications as a nerve guide conduit due to their insufficient mechanical strength. Furthermore, no studies have been reported on the fabrication of aligned nanofibers and randomly-oriented nanofibers on the same mat. In this study, we have successfully produced a mat with both aligned and randomly-oriented nanofibers by using a novel electrospinning set up. A new conduit with a highly-aligned electrospun mat is produced with this modified electrospinning method, and this proposed conduit with favorable features, such as selective permeability, hydrophilicity and nerve growth directional steering, were fabricated as nerve guide conduits (NGCs). The inner surface of the nerve conduit is covered with highly aligned electrospun nanofibers and is able to enhance the proliferation of neural cells. The central part of the tube is double-coated with randomly-oriented nanofibers over the aligned nanofibers, strengthening the weak mechanical strength of the aligned nanofibers.

  3. Cardiovascular risk management and its impact on hypertension control in primary care in low-resource settings: a cluster-randomized trial.

    PubMed

    Mendis, Shanthi; Johnston, S Claiborne; Fan, Wu; Oladapo, Olulola; Cameron, Ali; Faramawi, Mohammed F

    2010-06-01

    To evaluate a simple cardiovascular risk management package for assessing and managing cardiovascular risk using hypertension as an entry point in primary care facilities in low-resource settings. Two geographically distant regions in two countries (China and Nigeria) were selected and 10 pairs of primary care facilities in each region were randomly selected and matched. Regions were then randomly assigned to a control group, which received usual care, or to an intervention group, which applied the cardiovascular risk management package. Each facility enrolled 60 consecutive patients with hypertension. Intervention sites educated patients about risk factors at baseline and initiated treatment with hydrochlorothiazide at 4 months in patients at medium risk of a cardiovascular event, according to a standardized treatment algorithm. Systolic blood pressure change from baseline to 12 months was the primary outcome measure. The study included 2397 patients with baseline hypertension: 1191 in 20 intervention facilities and 1206 in 20 control facilities. Systolic and diastolic blood pressure decreased more in intervention patients than in controls. However, at 12 months more than half of patients still had uncontrolled hypertension (systolic blood pressure > 140 mmHg and/or diastolic blood pressure > 90 mmHg). Behavioural risk factors had improved among intervention patients in Nigeria but not in China. Only about 2% of hypertensive patients required referral to the next level of care. Even in low-resource settings, hypertensive patients can be effectively assessed and managed in primary care facilities.

  4. Effectiveness of intensive group and individual interventions for smoking cessation in primary health care settings: a randomized trial

    PubMed Central

    2010-01-01

    Objectives Primary: To compare the effectiveness of intensive group and individual interventions for smoking cessation in a primary health care setting; secondary: to identify the variables associated with smoking cessation. Methods Three-pronged clinical trial with randomisation at the individual level. We performed the following: an intensive individual intervention (III), an intensive group intervention (IGI) and a minimal intervention (MI). Included in the study were smokers who were prepared to quit smoking. Excluded from the study were individuals aged less than 18 years or with severe mental conditions or terminal illnesses. The outcome measure was continued abstinence at 12 months confirmed through CO-oximetry (CO). The analysis was based on intention to treat. Results In total, 287 smokers were recruited: 81 in the III, 111 in the IGI, and 95 in the MI. Continued abstinence at 12 months confirmed through CO was 7.4% in the III, 5.4% in the IGI, and 1% in the MI. No significant differences were noted between III and MI on the one hand, and between IGI and MI on the other [RR 7.04 (0.9-7.2) and RR 5.1 (0.6-41.9), respectively]. No differences were noted between IGI and III [RR 0.7 (0.2-2.2)]. In multivariate analysis, only overall visit length showed a statistically significant association with smoking cessation. Conclusions The effectiveness of intensive smoking interventions in this study was lower than expected. No statistically significant differences were found between the results of individual and group interventions. Trial registration number ISRCTN32323770 PMID:20178617

  5. UNCERT: geostatistics, uncertainty analysis and visualization software applied to groundwater flow and contaminant transport modeling

    NASA Astrophysics Data System (ADS)

    Wingle, William L.; Poeter, Eileen P.; McKenna, Sean A.

    1999-05-01

UNCERT is a 2D and 3D geostatistics, uncertainty analysis and visualization software package applied to ground water flow and contaminant transport modeling. It is a collection of modules that provides tools for linear regression, univariate statistics, semivariogram analysis, inverse-distance gridding, trend-surface analysis, simple and ordinary kriging and discrete conditional indicator simulation. Graphical user interfaces for MODFLOW and MT3D, ground water flow and contaminant transport models, are provided for streamlined data input and result analysis. Visualization tools are included for displaying data input and output. These include, but are not limited to, 2D and 3D scatter plots, histograms, box and whisker plots, 2D contour maps, surface renderings of 2D gridded data and 3D views of gridded data. By design, UNCERT's graphical user interface and visualization tools facilitate model design and analysis. There are few built-in restrictions on data set sizes and each module (with two exceptions) can be run in either graphical or batch mode. UNCERT is in the public domain and is available from the World Wide Web with complete on-line and printable (PDF) documentation. UNCERT is written in ANSI-C with a small amount of FORTRAN77, for UNIX workstations running X-Windows and Motif (or Lesstif). This article discusses the features of each module and demonstrates how they can be used individually and in combination. The tools are applicable to a wide range of fields and are currently used by researchers in ground water, mining, mathematics, chemistry and geophysics, to name a few disciplines.
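Among the tools listed, inverse-distance gridding is the simplest to sketch. The following is a minimal illustration of the idea only, not UNCERT's actual implementation; the function name and parameters are hypothetical:

```python
import numpy as np

def idw_interpolate(xy_obs, values, xy_grid, power=2.0, eps=1e-12):
    """Inverse-distance-weighted estimate at each grid point.

    xy_obs: (n, 2) observation coordinates; values: (n,) measurements;
    xy_grid: (m, 2) target coordinates. A grid point coinciding with an
    observation effectively receives that observation's value.
    """
    # pairwise distances between grid points and observations, shape (m, n)
    d = np.linalg.norm(xy_grid[:, None, :] - xy_obs[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, eps) ** power      # closer points weigh more
    return (w * values).sum(axis=1) / w.sum(axis=1)
```

Unlike kriging, the weights here depend only on distance, not on a fitted variogram, which is why kriging is preferred when a reliable model of spatial continuity is available.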

  6. Geostatistical conditional simulation for the assessment of contaminated land by abandoned heavy metal mining.

    PubMed

    Ersoy, Adem; Yunsel, Tayfun Yusuf; Atici, Umit

    2008-02-01

Abandoned mine workings can cause varying degrees of contamination of soil with heavy metals such as lead and zinc, and such contamination has occurred on a global scale. Exposure to these elements may harm human health and the environment. In the study, a total of 269 soil samples were collected at 1, 5, and 10 m regular grid intervals over a 100 x 100 m area of Carsington Pasture in the UK. A cell declustering technique was applied to the data set because the samples were not statistically representative of the area. Directional experimental semivariograms of the elements for the transformed data showed that both geometric and zonal anisotropy exist in the data. The most evident spatial dependence structures for the directional experimental semivariograms of Pb and Zn, characterized by spherical and exponential models, were obtained. This study reports the spatial distribution and uncertainty of Pb and Zn concentrations in soil at the study site using a probabilistic approach. The approach was based on geostatistical sequential Gaussian simulation (SGS), which was used to yield a series of conditional images characterized by equally probable spatial distributions of the heavy element concentrations across the area. Postprocessing of many simulations allowed the mapping of contaminated and uncontaminated areas, and provided a model for the uncertainty in the spatial distribution of element concentrations. Maps of the simulated Pb and Zn concentrations revealed the extent and severity of contamination. SGS was validated by statistics, histogram and variogram reproduction, and simulation errors. The maps of the elements might be used in remediation studies and help decision-makers and others involved with abandoned heavy metal mining sites worldwide.

  7. Error modeling based on geostatistics for uncertainty analysis in crop mapping using Gaofen-1 multispectral imagery

    NASA Astrophysics Data System (ADS)

    You, Jiong; Pei, Zhiyuan

    2015-01-01

With the development of remote sensing technology, its applications in agriculture monitoring systems, crop mapping accuracy, and spatial distribution are increasingly being explored by administrators and users. Uncertainty in crop mapping is profoundly affected by the spatial pattern of spectral reflectance values obtained from the applied remote sensing data. Errors in remotely sensed crop cover information and their propagation into derivative products need to be quantified and handled correctly. Therefore, this study discusses methods of error modeling for uncertainty characterization in crop mapping using GF-1 multispectral imagery. An error modeling framework based on geostatistics is proposed, which introduces the sequential Gaussian simulation algorithm to explore the relationship between classification errors and the spectral signature from the remote sensing data source. On this basis, a misclassification probability model is developed to produce a spatially explicit classification error probability surface for the map of a crop, which realizes the uncertainty characterization for crop mapping. In this process, trend surface analysis was carried out to generate a spatially varying mean response and the corresponding residual response with spatial variation for the spectral bands of GF-1 multispectral imagery. Variogram models were employed to measure the spatial dependence in the spectral bands and the derived misclassification probability surfaces. Simulated spectral data and classification results were quantitatively analyzed. Through experiments using data sets from a region in the low rolling country located in the Yangtze River valley, it was found that GF-1 multispectral imagery can be used for crop mapping with good overall performance, the proposed error modeling framework can be used to quantify the uncertainty in crop mapping, and the misclassification probability model can summarize the spatial variation in map accuracy and is helpful for

  8. Quantifying the Relationship between Dynamical Cores and Physical Parameterizations by Geostatistical Methods

    NASA Astrophysics Data System (ADS)

    Yorgun, M. S.; Rood, R. B.

    2010-12-01

The behavior of atmospheric models is sensitive to the algorithms that are used to represent the equations of motion. Typically, comprehensive models are conceived in terms of the resolved fluid dynamics (i.e. the dynamical core) and subgrid, unresolved physics represented by parameterizations. Deterministic weather predictions are often validated with feature-by-feature comparison. Probabilistic weather forecasts and climate projections are evaluated with statistical methods. We seek to develop model evaluation strategies that identify like “objects” - coherent systems with an associated set of measurable parameters. This makes it possible to evaluate processes in models without needing to reproduce the time and location of, for example, a particular observed cloud system. Process- and object-based evaluation preserves information in the observations by avoiding the need for extensive spatial and temporal averaging. As a concrete example, we focus on analyzing how the choice of dynamical core impacts the representation of precipitation in the Pacific Northwest of the United States, Western Canada, and Alaska; this brings attention to the interaction of the resolved and the parameterized components of the model. Two dynamical cores are considered within the Community Atmosphere Model. These are the Spectral (Eulerian), which relies on global basis functions, and the Finite Volume (FV), which uses only local information. We introduce the concept of "meteorological realism"; that is, do local representations of large-scale phenomena, for example, fronts and orographic precipitation, look like the observations? A follow-on question is, does the representation of these phenomena improve with resolution? Our approach to quantifying meteorological realism starts with methods of geospatial statistics. Specifically, we employ variography, a geostatistical method used to measure the spatial continuity of a regionalized variable, and principal component
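Variography of this kind starts from the classical (Matheron) semivariance estimator, gamma(h) = (1 / 2N(h)) * sum of (z_i - z_j)^2 over the N(h) point pairs separated by roughly h. A minimal sketch of that estimator (the function and variable names are hypothetical, not the authors' code):

```python
import numpy as np

def empirical_variogram(xy, z, bin_edges):
    """Classical semivariance per lag bin: the average of 0.5*(z_i - z_j)^2
    over all point pairs whose separation distance falls in the bin."""
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    sq = 0.5 * (z[:, None] - z[None, :]) ** 2
    iu = np.triu_indices(len(z), k=1)          # count each pair once
    d, sq = d[iu], sq[iu]
    lags, gammas = [], []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        m = (d >= lo) & (d < hi)
        if m.any():
            lags.append(d[m].mean())           # mean lag within the bin
            gammas.append(sq[m].mean())        # semivariance for the bin
    return np.array(lags), np.array(gammas)
```

A gamma(h) that rises and then plateaus indicates spatial continuity up to a range; a flat variogram indicates spatially random variation.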

  9. Geostatistical analysis of soil geochemical data from an industrial area (Puertollano, South-Central Spain).

    NASA Astrophysics Data System (ADS)

    Esbrí, José M.; Higueras, Pablo; López-Berdonces, Miguel A.; García-Noguero, Eva M.; González-Corrochano, Beatriz; Fernández-Calderón, Sergio; Martínez-Coronado, Alba

    2015-04-01

Puertollano is the biggest industrial city of Castilla-La Mancha, with 48,086 inhabitants. It is located 250 km south of Madrid on the northern border of the Ojailén River valley. The industrial area includes a big coal open pit (ENCASUR), two power plants (EON and ELCOGAS), a petrochemical complex (REPSOL) and a fertiliser factory (ENFERSA), all located in the proximity of the town. These industries present a complex scenario in terms of metal and metalloid emissions. For instance, mercury emissions declared to the PRTR inventory during 2010 were 210 kg year-1 (REPSOL), 130 kg year-1 (ELCOGAS) and 11.9 kg year-1 (EON). Besides, there may remain unaccounted diffuse sources of other potentially toxic elements coming from the different industrial sites. Multielemental analyses of soils from two different depths covering the whole valley were carried out by means of XRF with a portable Oxford Instruments device. Geostatistical data treatment was performed using SURFER software, applying block kriging to obtain interpolation maps for the study area. Semivariograms of elemental concentrations make a clear distinction between volatile (Hg, Se) and non-volatile elements (Cu, Ni), with differences in scales and variances between the two soil horizons considered. Semivariograms also show different models for elements emitted by combustion processes (Ni) and for anomalous elements from the geological substrate (Pb, Zn). In addition to differences in anisotropy of the data, these models reflect different forms of elemental dispersion; despite this, identification of particular sources for the different elements is not possible for this geochemical data set.

  10. Video-games used in a group setting is feasible and effective to improve indicators of physical activity in individuals with chronic stroke: a randomized controlled trial.

    PubMed

    Givon, Noa; Zeilig, Gabi; Weingarden, Harold; Rand, Debbie

    2016-04-01

    To investigate the feasibility of using video-games in a group setting and to compare the effectiveness of video-games as a group intervention to a traditional group intervention for improving physical activity in individuals with chronic stroke. A single-blind randomized controlled trial with evaluations pre and post a 3-month intervention, and at 3-month follow-up. Compliance (session attendance), satisfaction and adverse effects were feasibility measures. Grip strength and gait speed were measures of physical activity. Hip accelerometers quantified steps/day and the Action Research Arm Test assessed the functional ability of the upper extremity. Forty-seven community-dwelling individuals with chronic stroke (29-78 years) were randomly allocated to receive video-game (N=24) or traditional therapy (N=23) in a group setting. There was high treatment compliance for both interventions (video-games-78%, traditional therapy-66%), but satisfaction was rated higher for the video-game (93%) than the traditional therapy (71%) (χ(2)=4.98, P=0.026). Adverse effects were not reported in either group. Significant improvements were demonstrated in both groups for gait speed (F=3.9, P=0.02), grip strength of the weaker (F=6.67, P=0.002) and stronger hands (F=7.5, P=0.001). Daily steps and functional ability of the weaker hand did not increase in either group. Using video-games in a small group setting is feasible, safe and satisfying. Video-games improve indicators of physical activity of individuals with chronic stroke. © The Author(s) 2015.

  11. Predictive and Prognostic Role of Tumor-Infiltrating Lymphocytes for Early Breast Cancer According to Disease Subtypes: Sensitivity Analysis of Randomized Trials in Adjuvant and Neoadjuvant Setting.

    PubMed

    Carbognin, Luisa; Pilotto, Sara; Nortilli, Rolando; Brunelli, Matteo; Nottegar, Alessia; Sperduti, Isabella; Giannarelli, Diana; Bria, Emilio; Tortora, Giampaolo

    2016-03-01

    The role of tumor-infiltrating lymphocytes (TILs) in breast cancer (BC) is still an issue for clinical research. Toward this end, a sensitivity analysis of neoadjuvant and adjuvant randomized clinical trials was performed according to disease subtypes. Pathological complete responses (pCRs) after neoadjuvant treatment according to the presence or absence of lymphocyte-predominant BC (LPBC) were extracted and cumulated as odds ratios (ORs) by adopting a random-effects model by subtype. Overall survival hazard ratios as a function of 10% incremental values of stromal TILs (sTILs) in adjuvant trials were extracted. The interaction test was adopted to determine the differential effect according to the subtype. Eight trials (5,514 patients) were identified. With regard to neoadjuvant setting (4 studies), a significant interaction (p < .0001) according to LPBC was found. The presence of LPBC was associated with a 29.5% increase in pCR rate compared with non-LPBC (p < .0001). The pCR rate was significantly higher in patients with LPBC in triple-negative BC (TNBC) and HER2-positive BC settings, with an absolute difference of 15.7% (95% confidence interval [CI], 4.9%-26.2%) and 33.3% (95% CI, 23.6%-42.7%), respectively. With respect to the adjuvant setting (4 studies), a significant interaction (p < .0001) according to sTILs was found. A survival benefit was more likely to be determined for HER2-positive BC (p = .025) and TNBC (p < .0001), with no statistically significant difference for estrogen receptor-positive/HER2-negative disease. Despite the retrospective nature of this analysis, the presence of TILs may represent a robust predictive and prognostic marker for BC, particularly for TNBC and HER2-positive disease. ©AlphaMed Press.

  12. Spatial analysis of the distribution of Spodoptera frugiperda (J.E. Smith) (Lepidoptera: Noctuidae) and losses in maize crop productivity using geostatistics.

    PubMed

    Farias, Paulo R S; Barbosa, José C; Busoli, Antonio C; Overal, William L; Miranda, Vicente S; Ribeiro, Susane M

    2008-01-01

The fall armyworm, Spodoptera frugiperda (J.E. Smith), is one of the chief pests of maize in the Americas. The study of its spatial distribution is fundamental for designing correct control strategies, improving sampling methods, determining actual and potential crop losses, and adopting precision agricultural techniques. In São Paulo state, Brazil, a maize field was sampled at weekly intervals, from germination through harvest, for caterpillar densities, using quadrats. In each of 200 quadrats, 10 plants were sampled per week. Harvest weights were obtained in the field for each quadrat, and ear diameters and lengths were also sampled (15 ears per quadrat) and used to estimate potential productivity of the quadrat. Geostatistical analyses of caterpillar densities showed greatest ranges for small caterpillars when semivariograms were fitted to a spherical model, which showed the best fit. As the caterpillars developed in the field, their spatial distribution became increasingly random, as shown by a model adjusted to a straight line, indicating a lack of spatial dependence among samples. Harvest weight and ear length followed the spherical model, indicating the existence of spatial variability of the production parameters in the maize field. Geostatistics shows promise for the application of precision methods in the integrated control of pests.
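The spherical model referred to here has a standard closed form: gamma(h) = c0 + c * (1.5*h/a - 0.5*(h/a)^3) for 0 < h < a, and c0 + c for h >= a, where c0 is the nugget, c the partial sill and a the range. A small sketch (parameter names are illustrative):

```python
import numpy as np

def spherical_variogram(h, nugget, psill, vrange):
    """Spherical semivariogram model: zero at h = 0, rising from the
    nugget and leveling off at nugget + psill once h reaches vrange."""
    h = np.asarray(h, dtype=float)
    inside = nugget + psill * (1.5 * h / vrange - 0.5 * (h / vrange) ** 3)
    gamma = np.where(h < vrange, inside, nugget + psill)
    return np.where(h == 0.0, 0.0, gamma)      # gamma(0) = 0 by definition
```

The flat, straight-line fit reported for the older caterpillars corresponds to a semivariogram with essentially no structure: gamma(h) roughly constant at the sill for all lags, i.e. no spatial dependence among samples.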

  13. Hungarian contribution to the Global Soil Organic Carbon Map (GSOC17) using advanced machine learning algorithms and geostatistics

    NASA Astrophysics Data System (ADS)

    Szatmári, Gábor; Laborczi, Annamária; Takács, Katalin; Pásztor, László

    2017-04-01

The knowledge about soil organic carbon (SOC) baselines and changes, and the detection of vulnerable hot spots for SOC losses and gains under climate change and changed land management, is still fairly limited. Thus the Global Soil Partnership (GSP) has been requested to develop a global SOC mapping campaign by 2017. GSP's concept builds on official national data sets; therefore, a bottom-up (country-driven) approach is pursued. The elaborated Hungarian methodology suits the general specifications of GSOC17 provided by GSP. The input data for the GSOC17@HU mapping approach have involved legacy soil data bases, as well as proper environmental covariates related to the main soil forming factors, such as climate, organisms, relief and parent material. Nowadays, digital soil mapping (DSM) highly relies on the assumption that soil properties of interest can be modelled as a sum of a deterministic and a stochastic component, which can be treated and modelled separately. We also adopted this assumption in our methodology. In practice, multiple regression techniques are commonly used to model the deterministic part. However, these global (and usually linear) models commonly oversimplify the often complex and non-linear relationship, which has a crucial effect on the resulting soil maps. Thus, we integrated machine learning algorithms (namely random forest and quantile regression forest) in the elaborated methodology, supposing them to be more suitable for the problem at hand. This approach has enabled us to model the GSOC17 soil properties in forms as complex and non-linear as the soil itself. Furthermore, it has enabled us to model and assess the uncertainty of the results, which is highly relevant in decision making. The applied methodology has used a geostatistical approach to model the stochastic part of the spatial variability of the soil properties of interest. We created the GSOC17@HU map with 1 km grid resolution according to the GSP's specifications. The map contributes to the GSPs
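The deterministic-plus-stochastic decomposition described above can be illustrated in miniature. The sketch below uses ordinary least squares as a stand-in for the random forest trend model the authors actually use, leaving residuals that would then be modelled geostatistically; all names are hypothetical:

```python
import numpy as np

def trend_and_residual(X, y):
    """Split y into a deterministic trend, fitted by least squares on the
    covariates X (n, p), and a stochastic residual. In regression-kriging
    style workflows the residual is subsequently kriged and added back."""
    A = np.column_stack([np.ones(len(X)), X])   # design matrix + intercept
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    trend = A @ beta
    return trend, y - trend
```

Swapping the least-squares fit for a random forest (or quantile regression forest, to get prediction intervals) changes only the trend step; the residual step, and hence the geostatistical treatment of the stochastic part, stays the same.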

  14. Effect of a Quality Improvement Intervention With Daily Round Checklists, Goal Setting, and Clinician Prompting on Mortality of Critically Ill Patients: A Randomized Clinical Trial.

    PubMed

    Cavalcanti, Alexandre B; Bozza, Fernando Augusto; Machado, Flavia R; Salluh, Jorge I F; Campagnucci, Valquiria Pelisser; Vendramim, Patricia; Guimaraes, Helio Penna; Normilio-Silva, Karina; Damiani, Lucas Petri; Romano, Edson; Carrara, Fernanda; Lubarino Diniz de Souza, Juliana; Silva, Aline Reis; Ramos, Grazielle Viana; Teixeira, Cassiano; Brandão da Silva, Nilton; Chang, Chung-Chou H; Angus, Derek C; Berwanger, Otavio

    2016-04-12

    The effectiveness of checklists, daily goal assessments, and clinician prompts as quality improvement interventions in intensive care units (ICUs) is uncertain. To determine whether a multifaceted quality improvement intervention reduces the mortality of critically ill adults. This study had 2 phases. Phase 1 was an observational study to assess baseline data on work climate, care processes, and clinical outcomes, conducted between August 2013 and March 2014 in 118 Brazilian ICUs. Phase 2 was a cluster randomized trial conducted between April and November 2014 with the same ICUs. The first 60 admissions of longer than 48 hours per ICU were enrolled in each phase. Intensive care units were randomized to a quality improvement intervention, including a daily checklist and goal setting during multidisciplinary rounds with follow-up clinician prompting for 11 care processes, or to routine care. In-hospital mortality truncated at 60 days (primary outcome) was analyzed using a random-effects logistic regression model, adjusted for patients' severity and the ICU's baseline standardized mortality ratio. Exploratory secondary outcomes included adherence to care processes, safety climate, and clinical events. A total of 6877 patients (mean age, 59.7 years; 3218 [46.8%] women) were enrolled in the baseline (observational) phase and 6761 (mean age, 59.6 years; 3098 [45.8%] women) in the randomized phase, with 3327 patients enrolled in ICUs (n = 59) assigned to the intervention group and 3434 patients in ICUs (n = 59) assigned to routine care. There was no significant difference in in-hospital mortality between the intervention group and the usual care group, with 1096 deaths (32.9%) and 1196 deaths (34.8%), respectively (odds ratio, 1.02; 95% CI, 0.82-1.26; P = .88). Among 20 prespecified secondary outcomes not adjusted for multiple comparisons, 6 were significantly improved in the intervention group (use of low tidal volumes, avoidance of heavy sedation, use of

  15. Combined assimilation of streamflow and satellite soil moisture with the particle filter and geostatistical modeling

    NASA Astrophysics Data System (ADS)

    Yan, Hongxiang; Moradkhani, Hamid

    2016-08-01

    Assimilation of satellite soil moisture and streamflow data into a distributed hydrologic model has received increasing attention over the past few years. This study provides a detailed analysis of the joint and separate assimilation of streamflow and Advanced Scatterometer (ASCAT) surface soil moisture into a distributed Sacramento Soil Moisture Accounting (SAC-SMA) model, with the use of the recently developed particle filter-Markov chain Monte Carlo (PF-MCMC) method. Performance is assessed over the Salt River Watershed in Arizona, which is one of the watersheds without anthropogenic effects in the Model Parameter Estimation Experiment (MOPEX). A total of five data assimilation (DA) scenarios are designed and the effects of the locations of streamflow gauges and the ASCAT soil moisture on the predictions of soil moisture and streamflow are assessed. In addition, a geostatistical model is introduced to overcome the significant bias in the satellite soil moisture and also the discontinuity issue. The results indicate that: (1) solely assimilating outlet streamflow can lead to biased soil moisture estimation; (2) when the study area can only be partially covered by the satellite data, the geostatistical approach can estimate the soil moisture for those uncovered grid cells; (3) joint assimilation of streamflow and soil moisture from geostatistical modeling can further improve the surface soil moisture prediction. This study recommends that the geostatistical model is a helpful tool to aid the remote sensing technique and the hydrologic DA study.
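
The PF-MCMC machinery in the study is elaborate, but the bootstrap particle filter it builds on fits in a few lines: propagate particles through the model, weight them by the observation likelihood, and resample. The toy linear storage model below is an illustration under assumed noise levels, not SAC-SMA:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 1-D storage model (illustration only, not SAC-SMA):
# s_t = 0.9 * s_{t-1} + rain_t + process noise;  obs_t = s_t + observation noise
T, N = 50, 500                                   # time steps, particles
rain = rng.exponential(1.0, T)
true_s = np.zeros(T)
for t in range(1, T):
    true_s[t] = 0.9 * true_s[t - 1] + rain[t] + rng.normal(0.0, 0.2)
obs = true_s + rng.normal(0.0, 0.5, T)

particles = np.zeros(N)
est = np.zeros(T)
for t in range(1, T):
    particles = 0.9 * particles + rain[t] + rng.normal(0.0, 0.2, N)  # propagate
    w = np.exp(-0.5 * ((obs[t] - particles) / 0.5) ** 2)             # Gaussian likelihood
    w /= w.sum()
    particles = rng.choice(particles, size=N, p=w)                   # resample
    est[t] = particles.mean()                                        # filtered state
```

Because the filter combines the model forecast with each noisy observation, its state estimate is less noisy than the raw observations alone.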

  16. Estimating Solar PV Output Using Modern Space/Time Geostatistics (Presentation)

    SciTech Connect

    Lee, S. J.; George, R.; Bush, B.

    2009-04-29

    This presentation describes a project that uses mapping techniques to predict solar output at subhourly resolution at any spatial point, develop a methodology that is applicable to natural resources in general, and demonstrate capability of geostatistical techniques to predict the output of a potential solar plant.

  17. The use of geostatistics in the study of floral phenology of Vulpia geniculata (L.) link.

    PubMed

    León Ruiz, Eduardo J; García Mozo, Herminia; Domínguez Vilches, Eugenio; Galán, Carmen

    2012-01-01

    Traditionally phenology studies have been focused on changes through time, but there exist many instances in ecological research where it is necessary to interpolate among spatially stratified samples. The combined use of Geographical Information Systems (GIS) and Geostatistics can be an essential tool for spatial analysis in phenological studies. Geostatistics are a family of statistics that describe correlations through space/time and they can be used for both quantifying spatial correlation and interpolating unsampled points. In the present work, estimations based upon Geostatistics and GIS mapping have enabled the construction of spatial models that reflect phenological evolution of Vulpia geniculata (L.) Link throughout the study area during the sampling season. Ten sampling points, scattered throughout the city and low mountains in the "Sierra de Córdoba", were chosen to carry out the weekly phenological monitoring during the flowering season. The phenological data were interpolated by applying the traditional geostatistical method of Kriging, which was used to elaborate weekly estimations of V. geniculata phenology in unsampled areas. Finally, the application of Geostatistics and GIS to create phenological maps could be an essential complement in pollen aerobiological studies, given the increased interest in obtaining automatic aerobiological forecasting maps.
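
The Kriging interpolation used in this study can be sketched in a few lines of NumPy: ordinary kriging solves one small linear system per target location, with weights constrained to sum to one via a Lagrange multiplier. The exponential variogram and its parameters below are assumptions for illustration, not values fitted to the Vulpia data:

```python
import numpy as np

def ordinary_kriging(xy, z, xy0, sill=1.0, rng_par=3.0):
    """Ordinary kriging of samples z at coords xy, predicting at xy0.

    Uses an assumed exponential semivariogram gamma(h) = sill*(1 - exp(-h/rng_par)).
    """
    gamma = lambda h: sill * (1.0 - np.exp(-h / rng_par))
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, n] = 0.0                                   # Lagrange-multiplier row/column
    b = np.ones(n + 1)
    b[:n] = gamma(np.linalg.norm(xy - xy0, axis=1))
    w = np.linalg.solve(A, b)[:n]                   # kriging weights, sum to 1
    return float(w @ z)

# ten hypothetical sampling points, each with a phenological index value
pts = np.array([[0, 0], [1, 3], [4, 1], [2, 2], [5, 5],
                [3, 0], [0, 4], [4, 4], [1, 1], [5, 2]], dtype=float)
vals = np.array([0.1, 0.4, 0.3, 0.5, 0.9, 0.2, 0.6, 0.8, 0.3, 0.5])
pred = ordinary_kriging(pts, vals, np.array([2.5, 2.5]))
```

Evaluating `ordinary_kriging` over a regular grid of `xy0` locations produces exactly the kind of weekly interpolated map the study describes; with no nugget, kriging reproduces each sample value at its own location.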

  18. Integrated geostatistics for modeling fluid contacts and shales in Prudhoe Bay

    SciTech Connect

    Perez, G.; Chopra, A.K.; Severson, C.D.

    1997-12-01

    Geostatistics techniques are being used increasingly to model reservoir heterogeneity at a wide range of scales. A variety of techniques is now available with differing underlying assumptions, complexity, and applications. This paper introduces a novel method of geostatistics to model dynamic gas-oil contacts and shales in the Prudhoe Bay reservoir. The method integrates reservoir description and surveillance data within the same geostatistical framework. Surveillance logs and shale data are transformed to indicator variables. These variables are used to evaluate vertical and horizontal spatial correlation and cross-correlation of gas and shale at different times and to develop variogram models. Conditional simulation techniques are used to generate multiple three-dimensional (3D) descriptions of gas and shales that provide a measure of uncertainty. These techniques capture the complex 3D distribution of gas-oil contacts through time. The authors compare results of the geostatistical method with conventional techniques as well as with infill wells drilled after the study. Predicted gas-oil contacts and shale distributions are in close agreement with gas-oil contacts observed at infill wells.
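
The indicator transform described in this record is a one-line thresholding, after which an experimental semivariogram of the 0/1 variable measures the spatial persistence of (say) gas or shale. A minimal 1-D sketch on a synthetic log — the cutoff, lags, and data are arbitrary, not Prudhoe Bay values:

```python
import numpy as np

rng = np.random.default_rng(7)
depth = np.arange(0.0, 60.0)                 # hypothetical log depths, 1 m apart
vshale = rng.random(60)                      # hypothetical shale-fraction log
shale_ind = (vshale > 0.5).astype(float)     # indicator: 1 = shale, 0 = not shale

def semivariogram(x, z, lags, tol=0.5):
    """Experimental semivariogram: half the mean squared difference per lag."""
    gam = []
    for h in lags:
        sq = [(z[j] - z[i]) ** 2
              for i in range(len(x)) for j in range(i + 1, len(x))
              if abs((x[j] - x[i]) - h) <= tol]
        gam.append(0.5 * np.mean(sq))
    return np.array(gam)

gam = semivariogram(depth, shale_ind, lags=[1, 2, 5, 10])
```

A variogram model fitted to `gam` is what conditional indicator simulation then draws on to generate the multiple 3D realizations mentioned in the abstract.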

  19. Adapting geostatistics to analyze spatial and temporal trends in weed populations

    USDA-ARS?s Scientific Manuscript database

    Geostatistics were originally developed in mining to estimate the location, abundance and quality of ore over large areas from soil samples to optimize future mining efforts. Here, some of these methods were adapted to weeds to account for a limited distribution area (i.e., inside a field), variatio...

  20. Spatial interpolation of forest conditions using co-conditional geostatistical simulation

    Treesearch

    H. Todd Mowrer

    2000-01-01

    In recent work the author used the geostatistical Monte Carlo technique of sequential Gaussian simulation (s.G.s.) to investigate uncertainty in a GIS analysis of potential old-growth forest areas. The current study compares this earlier technique to that of co-conditional simulation, wherein the spatial cross-correlations between variables are included. As in the...
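
Sequential Gaussian simulation visits grid nodes one at a time, but the easiest way to see what any Gaussian geostatistical simulation produces is the matrix (Cholesky/LU) route: factor an assumed covariance and multiply by white noise to get correlated realisations. A 1-D sketch (covariance model and range are assumptions, and it omits the conditioning to data and the cross-correlations that co-conditional simulation adds):

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0.0, 10.0, 64)
d = np.abs(x[:, None] - x[None, :])
C = np.exp(-d / 2.0)                             # assumed exponential covariance, unit sill
L = np.linalg.cholesky(C + 1e-10 * np.eye(64))   # tiny jitter for numerical stability
fields = L @ rng.standard_normal((64, 500))      # 500 correlated realisations (columns)
```

Summarising the ensemble of realisations (e.g., the fraction exceeding a threshold at each node) is how such Monte Carlo simulation quantifies uncertainty in a GIS analysis.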

  1. The Use of Geostatistics in the Study of Floral Phenology of Vulpia geniculata (L.) Link

    PubMed Central

    León Ruiz, Eduardo J.; García Mozo, Herminia; Domínguez Vilches, Eugenio; Galán, Carmen

    2012-01-01

    Traditionally phenology studies have been focused on changes through time, but there exist many instances in ecological research where it is necessary to interpolate among spatially stratified samples. The combined use of Geographical Information Systems (GIS) and Geostatistics can be an essential tool for spatial analysis in phenological studies. Geostatistics are a family of statistics that describe correlations through space/time and they can be used for both quantifying spatial correlation and interpolating unsampled points. In the present work, estimations based upon Geostatistics and GIS mapping have enabled the construction of spatial models that reflect phenological evolution of Vulpia geniculata (L.) Link throughout the study area during the sampling season. Ten sampling points, scattered throughout the city and low mountains in the “Sierra de Córdoba”, were chosen to carry out the weekly phenological monitoring during the flowering season. The phenological data were interpolated by applying the traditional geostatistical method of Kriging, which was used to elaborate weekly estimations of V. geniculata phenology in unsampled areas. Finally, the application of Geostatistics and GIS to create phenological maps could be an essential complement in pollen aerobiological studies, given the increased interest in obtaining automatic aerobiological forecasting maps. PMID:22629169

  2. Effects of Prereactivation Propranolol on Cocaine Craving Elicited by Imagery Script/Cue Sets in Opioid-dependent Polydrug Users: A Randomized Study

    PubMed Central

    Jobes, Michelle L.; Aharonovich, Efrat; Epstein, David H.; Phillips, Karran A.; Reamer, David; Anderson, Micheline; Preston, Kenzie L.

    2015-01-01

    Objectives Relapse to drug misuse may follow exposure to drug cues that elicit craving. The learned associations, or “emotional memories,” that underlie responses to cues may be attenuated or erased by the beta-adrenergic antagonist propranolol during a “reconsolidation window” shortly after the memories are reactivated by cues. Methods We evaluated the effects of propranolol on cue-induced drug cravings in healthy opioid-dependent individuals who used cocaine while receiving methadone maintenance (n = 33). Participants were asked to recall specific cocaine-use and neutral events in an interview; these events were used to develop personalized auditory script/cue sets. Approximately one week later, propranolol (40 mg) or placebo (random assignment, double-blind) was administered orally before presentation of the script/cue sets; presentation of the script/cue sets was tested 1 and 5 weeks after the propranolol/placebo session. Ongoing drug use was monitored via urine screens and self-report in twice-weekly visits. Results Cue reactivity, as assessed by craving scales and physiological responses, was unexpectedly greater in the propranolol group than in the placebo group. This counter-hypothesized group difference was present acutely during propranolol administration and appeared to persist (without reaching statistical significance) during the subsequent test sessions. Conclusions Our results do not support the use of propranolol for cue-induced cocaine craving in opioid-maintained patients. PMID:26501788

  3. Predictive and Prognostic Role of Tumor-Infiltrating Lymphocytes for Early Breast Cancer According to Disease Subtypes: Sensitivity Analysis of Randomized Trials in Adjuvant and Neoadjuvant Setting

    PubMed Central

    Carbognin, Luisa; Pilotto, Sara; Nortilli, Rolando; Brunelli, Matteo; Nottegar, Alessia; Sperduti, Isabella; Giannarelli, Diana; Tortora, Giampaolo

    2016-01-01

    Background. The role of tumor-infiltrating lymphocytes (TILs) in breast cancer (BC) is still an issue for clinical research. Toward this end, a sensitivity analysis of neoadjuvant and adjuvant randomized clinical trials was performed according to disease subtypes. Methods. Pathological complete responses (pCRs) after neoadjuvant treatment according to the presence or absence of lymphocyte-predominant BC (LPBC) were extracted and cumulated as odds ratios (ORs) by adopting a random-effects model by subtype. Overall survival hazard ratios as a function of 10% incremental values of stromal TILs (sTILs) in adjuvant trials were extracted. The interaction test was adopted to determine the differential effect according to the subtype. Results. Eight trials (5,514 patients) were identified. With regard to neoadjuvant setting (4 studies), a significant interaction (p < .0001) according to LPBC was found. The presence of LPBC was associated with a 29.5% increase in pCR rate compared with non-LPBC (p < .0001). The pCR rate was significantly higher in patients with LPBC in triple-negative BC (TNBC) and HER2-positive BC settings, with an absolute difference of 15.7% (95% confidence interval [CI], 4.9%–26.2%) and 33.3% (95% CI, 23.6%–42.7%), respectively. With respect to the adjuvant setting (4 studies), a significant interaction (p < .0001) according to sTILs was found. A survival benefit was more likely to be determined for HER2-positive BC (p = .025) and TNBC (p < .0001), with no statistically significant difference for estrogen receptor-positive/HER2-negative disease. Conclusion. Despite the retrospective nature of this analysis, the presence of TILs may represent a robust predictive and prognostic marker for BC, particularly for TNBC and HER2-positive disease. Implications for Practice: This sensitivity analysis of neoadjuvant and adjuvant randomized clinical trials in breast cancer explores the potential predictive and prognostic role of tumor-infiltrating lymphocytes

  4. The effectiveness of physical activity monitoring and distance counseling in an occupational setting – Results from a randomized controlled trial (CoAct)

    PubMed Central

    2012-01-01

    Background Lack of physical activity (PA) is a known risk factor for many health conditions. The workplace is a setting often used to promote activity and health. We investigated the effectiveness of an intervention on PA and productivity-related outcomes in an occupational setting. Methods We conducted a randomized controlled trial of 12 months duration with two 1:1 allocated parallel groups of insurance company employees. Eligibility criteria included permanent employment and absence of any condition that risked the participant’s health during PA. Subjects in the intervention group monitored their daily PA with an accelerometer, set goals, had access to an online service to help them track their activity levels, and received counseling via telephone or web messages for 12 months. The control group received the results of a fitness test and an information leaflet on PA at the beginning of the study. The intervention’s aim was to increase PA, improve work productivity, and decrease sickness absence. Primary outcomes were PA (measured as MET minutes per week), work productivity (quantity and quality of work; QQ index), and sickness absence (SA) days at 12 months. Participants were assigned to groups using block randomization with a computer-generated scheme. The study was not blinded. Results There were 544 randomized participants, of which 521 were included in the analysis (64% female, mean age 43 years). At 12 months, there was no significant difference in physical activity levels between the intervention group (n = 264) and the control group (n = 257). The adjusted mean difference was −206 MET min/week [95% Bayesian credible interval −540 to 128; negative values favor control group]. There was also no significant difference in the QQ index (−0.5 [−4.4 to 3.3]) or SA days (0.0 [−1.2 to 0.9]). Of secondary outcomes, body weight (0.5 kg [0.0 to 1.0]) and percentage of body fat (0.6% [0.2% to 1.1%]) were slightly higher in the

  5. Geostatistical prediction of stream-flow regime in southeastern United States

    NASA Astrophysics Data System (ADS)

    Pugliese, Alessio; Castellarin, Attilio; Archfield, Stacey; Farmer, William

    2015-04-01

    similar performances independently of the interpretation of the curves (i.e. period-of-record/annual, or complete/seasonal) or Q* (MAF or MAP*); at-site performances are satisfactory or good (i.e. Nash-Sutcliffe Efficiency NSE ranges from 0.60 to 0.90 for cross-validated FDCs, depending on the model setting), while the overall performance at regional scale indicates that OK and TK are associated with smaller BIAS and RMSE relative to the six benchmark procedures. Acknowledgements: We thankfully acknowledge Alessia Bononi and Antonio Liguori for their preliminary analyses and Jon O. Skøien and Edzer Pebesma for their helpful assistance with the R packages rtop and gstat. The study is part of the research activities carried out by the working group: Anthropogenic and Climatic Controls on WateR AvailabilitY (ACCuRAcY) of Panta Rhei - Everything Flows: Change in Hydrology and Society (IAHS Scientific Decade 2013-2022). References: Pugliese, A., A. Castellarin, and A. Brath (2014): Geostatistical prediction of flow-duration curves in an index-flow framework, Hydrol. Earth Syst. Sci., 18, 3801-3816, doi:10.5194/hess-18-3801-2014. Castiglioni, S., A. Castellarin, and A. Montanari (2009): Prediction of low-flow indices in ungauged basins through physiographical space-based interpolation, Journal of Hydrology, 378, 272-280.
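
The Nash-Sutcliffe Efficiency quoted in this record compares model error with the variance of the observations: NSE = 1 is a perfect fit, NSE = 0 is no better than predicting the observed mean. A direct implementation:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    obs = np.asarray(obs, dtype=float)
    sim = np.asarray(sim, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# example streamflow-like series (made-up values): a perfect model scores 1,
# and a model that always predicts the observed mean scores 0
q_obs = np.array([3.0, 5.0, 9.0, 4.0, 6.0])
```

In cross-validation, `obs` would be the flow-duration curve at a gauge withheld from the interpolation and `sim` its geostatistical prediction.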

  6. Introduction to This Special Issue on Geostatistics and Geospatial Techniques in Remote Sensing

    NASA Technical Reports Server (NTRS)

    Atkinson, Peter; Quattrochi, Dale A.; Goodman, H. Michael (Technical Monitor)

    2000-01-01

    The germination of this special Computers & Geosciences (C&G) issue began at the Royal Geographical Society (with the Institute of British Geographers) (RGS-IBG) annual meeting in January 1997 held at the University of Exeter, UK. The snow and cold of the English winter were tempered greatly by warm and cordial discussion of how to stimulate and enhance cooperation on geostatistical and geospatial research in remote sensing 'across the big pond' between UK and US researchers. It was decided that one way forward would be to hold parallel sessions in 1998 on geostatistical and geospatial research in remote sensing at appropriate venues in both the UK and the US. Selected papers given at these sessions would be published as special issues of C&G on the UK side and Photogrammetric Engineering and Remote Sensing (PE&RS) on the US side. These issues would highlight the commonality in research on geostatistical and geospatial research in remote sensing on both sides of the Atlantic Ocean. As a consequence, a session on "Geostatistics and Geospatial Techniques for Remote Sensing of Land Surface Processes" was held at the RGS-IBG annual meeting in Guildford, Surrey, UK in January 1998, organized by the Modeling and Advanced Techniques Special Interest Group (MAT SIG) of the Remote Sensing Society (RSS). A similar session was held at the Association of American Geographers (AAG) annual meeting in Boston, Massachusetts in March 1998, sponsored by the AAG's Remote Sensing Specialty Group (RSSG). The 10 papers that make up this issue of C&G comprise 7 papers from the UK and 3 papers from the US. We are both co-editors of each of the journal special issues, with the lead editor of each journal issue being from their respective side of the Atlantic. The special issue of PE&RS (vol. 65) that constitutes the other half of this co-edited journal series was published in early 1999, comprising 6 papers by US authors.
We are indebted to the International Association for Mathematical

  8. How well are the ASAS/OMERACT Core Outcome Sets for Ankylosing Spondylitis implemented in randomized clinical trials? A systematic literature review.

    PubMed

    Bautista-Molano, Wilson; Navarro-Compán, Victoria; Landewé, Robert B M; Boers, Maarten; Kirkham, Jamie J; van der Heijde, Désirée

    2014-09-01

    This study aims to investigate how well the Assessment of SpondyloArthritis international Society (ASAS)/Outcome Measures in Rheumatology Clinical Trials (OMERACT) core set and response criteria for ankylosing spondylitis (AS) have been implemented in randomized controlled trials (RCTs) testing pharmacological and non-pharmacological interventions. A systematic literature search was performed up to June 2013 looking for RCTs in patients with axial spondyloarthritis (SpA) (AS and non-radiographic axial SpA). The assessed domains and instruments belonging to the core sets for disease-controlling anti-rheumatic therapy (DC-ART) and symptom-modifying anti-rheumatic drugs (SMARDs) were extracted. Results were reported separately for those trials published until 2 years after the publication of the core set (1 April 2001; 'control trials') and those trials published at least 2 years after the publication date ('implementation trials'). One hundred twenty-three articles from 99 RCTs were included in the analysis, comparing 48 'control trials' and 51 'implementation trials'. Regarding the DC-ART core set, the following domains were significantly more frequently assessed in the 'implementation group' in comparison to the 'control group': 'physical function' (100 vs 41.7 %; p ≤ 0.001), 'peripheral joints/entheses' (100 vs 33.3 %; p ≤ 0.001) and 'fatigue' (100 vs 0 %; p ≤ 0.001). Three instruments were significantly more used in the 'implementation group': Bath Ankylosing Spondylitis Functional Index (BASFI) (100 vs 8.3 %; p ≤ 0.001), CRP (92.3 vs 58.3 %; p = 0.01) and Bath Ankylosing Spondylitis Metrology Index (BASMI) (53.8 vs 0 %; p = 0.001). Regarding SMARD core set domains, physical function (92 vs 23 %; p ≤ 0.001) and fatigue (84 vs 17 %; p ≤ 0.001), as well as the instruments BASFI (88 vs 14 %; p ≤ 0.001) and BASMI (52 vs 0 %; p ≤ 0.001), increased significantly in the 'implementation group'. Twenty per cent of

  9. Focused breastfeeding counselling improves short- and long-term success in an early-discharge setting: A cluster-randomized study.

    PubMed

    Nilsson, Ingrid M S; Strandberg-Larsen, Katrine; Knight, Christopher H; Hansen, Anne Vinkel; Kronborg, Hanne

    2017-02-14

    Length of postnatal hospitalization has decreased and has been shown to be associated with infant nutritional problems and an increase in readmissions. We aimed to evaluate if guidelines for breastfeeding counselling in an early discharge hospital setting had an effect on maternal breastfeeding self-efficacy, infant readmission and breastfeeding duration. A cluster randomized trial was conducted and assigned nine maternity settings in Denmark to intervention or usual care. Women were eligible if they expected a single infant, intended to breastfeed, were able to read Danish, and expected to be discharged within 50 hr postnatally. Between April 2013 and August 2014, 2,065 mothers were recruited at intervention and 1,476 at reference settings. Results show that the intervention did not affect maternal breastfeeding self-efficacy (primary outcome). However, fewer infants were readmitted 1 week postnatally in the intervention compared to the reference group (adjusted OR 0.55, 95% CI 0.37-0.81), and 6 months following birth, more infants were exclusively breastfed in the intervention group (adjusted OR 1.36, 95% CI 1.02-1.81). Moreover, mothers in the intervention compared to the reference group were breastfeeding more frequently (p < .001), and spent more hours skin to skin with their infants (p < .001). The infants were less often treated for jaundice (p = .003) and there was more paternal involvement (p = .037). In an early discharge hospital setting, a focused breastfeeding programme concentrating on increased skin to skin contact, frequent breastfeeding, good positioning of the mother-infant dyad, and enhanced involvement of the father improved short-term and long-term breastfeeding success.

  10. TTM-based motivational counselling does not increase physical activity of low back pain patients in a primary care setting--A cluster-randomized controlled trial.

    PubMed

    Leonhardt, Corinna; Keller, Stefan; Chenot, Jean-François; Luckmann, Judith; Basler, Heinz-Dieter; Wegscheider, Karl; Baum, Erika; Donner-Banzhoff, Norbert; Pfingsten, Michael; Hildebrandt, Jan; Kochen, Michael M; Becker, Annette

    2008-01-01

    To investigate the effectiveness of a TTM-based motivational counselling approach by trained practice nurses to promote physical activity of low back pain patients in a German primary care setting. Data were collected in a cluster-randomized controlled trial with three study arms via questionnaires and patient interviews at baseline and after 6 and 12 months. We analysed total physical activity and self-efficacy by using random effect models to allow for clustering. A total of 1378 low back pain patients, many with acute symptoms, were included in the study. Nearly 40% of all patients reported sufficient physical activity at baseline. While there were significant improvements in patients' physical activity behaviour in all study arms, there was no evidence for an intervention effect. The outcome may be explained by insufficient performance of the practice nurses, implementation barriers caused by the German health care system and the heterogeneous sample. Given the objective to incorporate practice nurses into patient education, there is a need for better basic training of the nurses and for a change towards an organizational structure that facilitates patient-nurse communication. Counselling for low back pain patients has to consider more specific aims for different subgroups.

  11. Random blood glucose may be used to assess long-term glycaemic control among patients with type 2 diabetes mellitus in a rural African clinical setting.

    PubMed

    Rasmussen, Jon B; Nordin, Lovisa S; Rasmussen, Niclas S; Thomsen, Jakúp A; Street, Laura A; Bygbjerg, Ib C; Christensen, Dirk L

    2014-12-01

    To investigate the diagnostic accuracy of random blood glucose (RBG) in detecting good glycaemic control among patients with diabetes mellitus (DM) in a rural African setting. Cross-sectional study at St. Francis' Hospital in eastern Zambia. RBG and HbA1c were measured during one clinical review only. Other information obtained was age, sex, body mass index, waist circumference, blood pressure, urine albumin-creatinine ratio, duration since diagnosis and medication. One hundred and one patients with DM (type 1 DM = 23, type 2 DM = 78) were included. Spearman's rank correlation coefficient revealed a significant correlation between RBG and HbA1c among the patients with type 2 DM (r = 0.73, P < 0.001) but not patients with type 1 DM (r = 0.17, P = 0.44). Furthermore, in a multivariate linear regression model (R(2) = 0.71), RBG (per mmol/l increment) (B = 0.28, 95% CI: 0.24-0.32, P < 0.001) was significantly associated with HbA1c among the patients with type 2 DM. Based on ROC analysis (AUC = 0.80, SE = 0.05), RBG ≤7.5 mmol/l was determined as the optimal cut-off value for good glycaemic control (HbA1c <7.0% [53 mmol/mol]) among patients with type 2 DM (sensitivity = 76.7%; specificity = 70.8%; positive predictive value = 62.2%; negative predictive value = 82.9%). Random blood glucose could possibly be used to assess glycaemic control among patients with type 2 DM in rural settings of sub-Saharan Africa. © 2014 John Wiley & Sons Ltd.
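
An ROC-derived cut-off like the one reported here is typically found by maximising Youden's J (sensitivity + specificity - 1) over candidate thresholds. A sketch on made-up glucose values, where "positive" means good control and a patient is flagged when RBG is at or below the cut-off (this is an illustration of the generic method, not the study's analysis):

```python
import numpy as np

def youden_cutoff(values, is_positive):
    """Return the threshold maximising Youden's J; positives are values <= cutoff."""
    values = np.asarray(values, dtype=float)
    pos = np.asarray(is_positive, dtype=bool)
    best_c, best_j = None, -1.0
    for c in np.unique(values):
        pred = values <= c
        sens = np.mean(pred[pos])        # true positives / all positives
        spec = np.mean(~pred[~pos])      # true negatives / all negatives
        j = sens + spec - 1.0
        if j > best_j:
            best_c, best_j = c, j
    return best_c, best_j

# hypothetical RBG readings (mmol/l); the first four patients have HbA1c < 7.0%
rbg = [5.2, 6.1, 7.0, 7.4, 9.8, 11.0, 12.5]
good = [True, True, True, True, False, False, False]
cutoff, j = youden_cutoff(rbg, good)
```

With this perfectly separable toy data the optimal cut-off is 7.4 mmol/l with J = 1; real data, as in the study, gives a trade-off between sensitivity and specificity.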

  12. Limited efficacy of a nonrestricted intervention on antimicrobial prescription of commonly used antibiotics in the hospital setting: results of a randomized controlled trial.

    PubMed

    Masiá, M; Matoses, C; Padilla, S; Murcia, A; Sánchez, V; Romero, I; Navarro, A; Hernández, I; Gutiérrez, F

    2008-07-01

    Most interventions aimed at diminishing the use of antimicrobials in hospitals have focussed on newly introduced antibiotics and very few have been randomly controlled. We evaluated the impact on antibiotic consumption of an intervention without restrictions in antibiotic use, focussed on commonly used antibiotics with a controlled randomized trial. All new prescriptions of levofloxacin, carbapenems, or vancomycin in hospitalized patients were randomized to an intervention or a control group. Intervention consisted of an antibiotic regimen counselling targeted to match local antibiotic guidelines, performed using only patients' charts. Clinical charts of patients assigned to the control group were reviewed daily by a pharmacist. The primary endpoint was a reduction in consumption of the targeted antibiotics. Two hundred seventy-eight prescriptions corresponding to 253 patients were included: 146 were assigned to the intervention and 132 to the control group. Total consumption of the targeted antibiotics (median [IQR]) was slightly lower in the intervention (8 [4-12] defined daily doses [DDDs] per patient) than in the control group (10 [6-16] DDDs per patient; p = 0.04). No differences in number of DDDs were observed when antibiotics of substitution were included (11.05 [6-18.2] vs 10 [6-16.5] in the intervention and control groups, respectively, p = 0.13). The total number of days on treatment with the targeted antibiotics was lower in the intervention (4 [3-7] days per patient) than in the control group (6 [4-10] days per patient; p = 0.002). Differences in number of days on treatment only reached statistical significance in the prescriptions of carbapenems. There were no differences between intervention and control groups in terms of number of deaths, hospital readmissions, length of hospital stay, or antibiotic costs. 
In this trial, an intervention without restrictions focussed on antimicrobial prescriptions of commonly used antibiotics in the hospital setting
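The consumption endpoint above is reported as median [IQR] defined daily doses (DDDs) per patient. A minimal sketch of how such a summary can be computed from per-patient dose totals, using only the Python standard library (the numbers below are synthetic, not the trial's data):

```python
from statistics import quantiles

def ddd_summary(ddds_per_patient):
    """Return the median and interquartile range of DDDs per patient."""
    q1, q2, q3 = quantiles(ddds_per_patient, n=4)  # the three quartile cut points
    return q2, (q1, q3)

# Synthetic illustration only -- not the trial's data.
intervention = [4, 6, 8, 8, 10, 12, 12]
control = [6, 8, 10, 12, 14, 16, 16]

med_i, iqr_i = ddd_summary(intervention)
med_c, iqr_c = ddd_summary(control)
print(f"intervention: {med_i} {iqr_i}")
print(f"control: {med_c} {iqr_c}")
```

Comparing such skewed per-patient counts is typically done with a rank-based test (e.g. Mann-Whitney U), which is why the abstract reports medians rather than means.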

  13. Development of a Core Set of Outcomes for Randomized Controlled Trials with Multiple Outcomes – Example of Pulp Treatments of Primary Teeth for Extensive Decay in Children

    PubMed Central

    Smaïl-Faugeron, Violaine; Fron Chabouis, Hélène; Durieux, Pierre; Attal, Jean-Pierre; Muller-Bolla, Michèle; Courson, Frédéric

    2013-01-01

    Objectives Evidence-based comparisons of interventions can be challenging because of the diversity of outcomes in randomized controlled trials (RCTs). We aimed to describe outcomes in RCTs assessing pulp treatments for primary teeth and to develop a core set of component outcomes to be part of a composite outcome defining the failure of a pulp treatment. Methods We systematically reviewed articles of RCTs comparing pulp treatments for primary molars published up to February 2012. We abstracted all outcomes assessed in each trial, then used a small-group consensus process to group similar outcomes, which were reduced to a composite outcome of failure of a pulp treatment by a 3-round Delphi process involving expert authors and dentists. Results We included 47 reports of RCTs in the review, reporting 83 outcomes (median 11 outcomes per RCT). These outcomes were grouped into 24 overarching outcome categories. We contacted 210 experts for the Delphi process and 25% to 30% participated. The process identified the following 5 component outcomes as part of a composite outcome of failure of a pulp treatment: soft-tissue pathology, pain, pathologic mobility, pathologic radiolucency and pathologic root resorption. Conclusions RCTs of pulp treatments for primary teeth investigate diverse outcomes. Our consensus process, involving clinicians but no patients, allowed us to compile a core set of component outcomes to define the composite outcome of failure of a pulp treatment for primary teeth. PMID:23300955

  14. A geostatistical methodology for the optimal design of space-time hydraulic head monitoring networks and its application to the Valle de Querétaro aquifer

    NASA Astrophysics Data System (ADS)

    Herrera, G. S.; Júnez-Ferreira, H. E.

    2012-12-01

    This work introduces a new methodology for the optimal design of space-time hydraulic head monitoring networks and its application to the Valle de Querétaro aquifer in Mexico. The selection of the space-time monitoring points is done using a static Kalman filter combined with a sequential optimization method. The Kalman filter requires as input a space-time covariance matrix, which is derived from a geostatistical analysis. A sequential optimization method that selects the space-time point that minimizes a function of the variance, in each step, is used. We demonstrate the methodology applying it to the redesign of the hydraulic head monitoring network of the Valle de Querétaro aquifer with the objective of selecting from a set of monitoring positions and times, those that minimize the spatiotemporal redundancy. The database for the geostatistical space-time analysis corresponds to information of 273 wells located within the aquifer for the period 1970-2007. A total of 1,435 hydraulic head data were used to construct the experimental space-time variogram. The results show that from the existing monitoring program that consists of 418 space-time monitoring points, only 178 are not redundant. The implied reduction of monitoring costs was possible because the proposed method is successful in propagating information in space and time.
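The sequential optimization described above repeatedly selects the space-time point whose measurement most reduces a variance criterion, updating the covariance with a static Kalman filter step. A hypothetical, minimal sketch of that greedy loop (the covariance matrix, noise level, and criterion below are invented for illustration; the paper derives its actual space-time covariance from a geostatistical analysis):

```python
import numpy as np

def greedy_network(P, noise_var, n_select):
    """Greedily pick measurement indices that minimize total posterior variance.

    P is the prior covariance matrix over the estimation points (space-time).
    Each selection applies the static Kalman filter update
        P <- P - P[:, i] P[i, :] / (P[i, i] + noise_var).
    """
    P = P.copy()
    chosen = []
    for _ in range(n_select):
        # Trace reduction achieved by measuring candidate i.
        gains = np.array([np.sum(P[:, i] ** 2) / (P[i, i] + noise_var)
                          for i in range(len(P))])
        gains[chosen] = -np.inf  # do not pick the same point twice
        i = int(np.argmax(gains))
        chosen.append(i)
        P -= np.outer(P[:, i], P[i, :]) / (P[i, i] + noise_var)
    return chosen, P

# Toy exponential covariance over 5 points on a line (illustrative only).
x = np.arange(5.0)
P0 = np.exp(-np.abs(x[:, None] - x[None, :]) / 2.0)
picked, P_post = greedy_network(P0, noise_var=0.1, n_select=2)
print(picked, np.trace(P_post))
```

Points whose selection yields negligible variance reduction are the "redundant" ones in the sense used by the abstract.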

  15. A geostatistical methodology for the optimal design of space-time hydraulic head monitoring networks and its application to the Valle de Querétaro aquifer.

    PubMed

    Júnez-Ferreira, H E; Herrera, G S

    2013-04-01

    This paper presents a new methodology for the optimal design of space-time hydraulic head monitoring networks and its application to the Valle de Querétaro aquifer in Mexico. The selection of the space-time monitoring points is done using a static Kalman filter combined with a sequential optimization method. The Kalman filter requires as input a space-time covariance matrix, which is derived from a geostatistical analysis. A sequential optimization method that selects the space-time point that minimizes a function of the variance, in each step, is used. We demonstrate the methodology applying it to the redesign of the hydraulic head monitoring network of the Valle de Querétaro aquifer with the objective of selecting from a set of monitoring positions and times, those that minimize the spatiotemporal redundancy. The database for the geostatistical space-time analysis corresponds to information of 273 wells located within the aquifer for the period 1970-2007. A total of 1,435 hydraulic head data were used to construct the experimental space-time variogram. The results show that from the existing monitoring program that consists of 418 space-time monitoring points, only 178 are not redundant. The implied reduction of monitoring costs was possible because the proposed method is successful in propagating information in space and time.

  16. Feasibility intervention trial of two types of improved cookstoves in three resource-limited settings: study protocol for a randomized controlled trial

    PubMed Central

    2013-01-01

    Background Exposure to biomass fuel smoke is one of the leading risk factors for disease burden worldwide. International campaigns are currently promoting the widespread adoption of improved cookstoves in resource-limited settings, yet little is known about the cultural and social barriers to successful improved cookstove adoption and how these barriers affect environmental exposures and health outcomes. Design We plan to conduct a one-year crossover, feasibility intervention trial in three resource-limited settings (Kenya, Nepal and Peru). We will enroll 40 to 46 female primary cooks aged 20 to 49 years in each site (total 120 to 138). Methods At baseline, we will collect information on sociodemographic characteristics and cooking practices, and measure respiratory health and blood pressure for all participating women. An initial observational period of four months while households use their traditional, open-fire design cookstoves will take place prior to randomization. All participants will then be randomized to receive one of two types of improved, ventilated cookstoves with a chimney: a commercially-constructed cookstove (Envirofit G3300/G3355) or a locally-constructed cookstove. After four months of observation, participants will cross over and receive the other improved cookstove design and be followed for another four months. During each of the three four-month study periods, we will collect monthly information on self-reported respiratory symptoms, cooking practices, compliance with cookstove use (intervention periods only), and measure peak expiratory flow, forced expiratory volume in 1 second, exhaled carbon monoxide and blood pressure. We will also perform pulmonary function testing in the women participants and measure 24-hour kitchen particulate matter and carbon monoxide levels at least once per period. Discussion Findings from this study will help us better understand the behavioral, biological, and environmental changes that occur with a cookstove

  17. Geostatistical applications in ground-water modeling in south-central Kansas

    USGS Publications Warehouse

    Ma, T.-S.; Sophocleous, M.; Yu, Y.-S.

    1999-01-01

    This paper emphasizes the supportive role of geostatistics in applying ground-water models. Field data of 1994 ground-water level, bedrock, and saltwater-freshwater interface elevations in south-central Kansas were collected and analyzed using the geostatistical approach. Ordinary kriging was adopted to estimate initial conditions for ground-water levels and topography of the Permian bedrock at the nodes of a finite difference grid used in a three-dimensional numerical model. Cokriging was used to estimate initial conditions for the saltwater-freshwater interface. An assessment of uncertainties in the estimated data is presented. The kriged and cokriged estimation variances were analyzed to evaluate the adequacy of data employed in the modeling. Although water levels and bedrock elevations are well described by spherical semivariogram models, additional data are required for better cokriging estimation of the interface data. The geostatistically analyzed data were employed in a numerical model of the Siefkes site in the project area. Results indicate that the computed chloride concentrations and ground-water drawdowns reproduced the observed data satisfactorily.
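Ordinary kriging with a spherical semivariogram, as used above for the water-level and bedrock surfaces, can be sketched as follows. This is a toy 1-D example with made-up observations (the study itself estimated values at the nodes of a 3-D finite-difference grid):

```python
import numpy as np

def spherical(h, sill=1.0, rng=3.0):
    """Spherical semivariogram model with zero nugget."""
    h = np.minimum(h, rng)
    return sill * (1.5 * h / rng - 0.5 * (h / rng) ** 3)

def ordinary_krige(xs, zs, x0):
    """Ordinary kriging estimate at x0 from observations (xs, zs)."""
    n = len(xs)
    # Kriging system with a Lagrange multiplier enforcing sum(weights) = 1.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical(np.abs(xs[:, None] - xs[None, :]))
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = spherical(np.abs(xs - x0))
    w = np.linalg.solve(A, b)[:n]
    return float(w @ zs)

# Made-up 1-D water-level observations (illustrative only).
xs = np.array([0.0, 1.0, 2.5, 4.0])
zs = np.array([10.0, 12.0, 11.0, 9.0])
print(ordinary_krige(xs, zs, 1.5))
```

With a zero nugget the estimator honors the data exactly at sampled locations, which is why kriged fields are suitable as initial conditions for numerical models.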

  18. SRS 2010 Vegetation Inventory GeoStatistical Mapping Results for Custom Reaction Intensity and Total Dead Fuels.

    SciTech Connect

    Edwards, Lloyd A.; Parresol, Bernard

    2014-09-01

    This report presents the geostatistical analysis results for two fire-fuels response variables, custom reaction intensity and total dead fuels, as part of the SRS 2010 vegetation inventory project. For a detailed description of the project, including its theory and background, sample design, methods, and results, please refer to the USDA Forest Service Savannah River Site internal report “SRS 2010 Vegetation Inventory GeoStatistical Mapping Report” (Edwards & Parresol 2013).

  19. A randomized trial of computer-based reminders and audit and feedback to improve HIV screening in a primary care setting.

    PubMed

    Sundaram, V; Lazzeroni, L C; Douglass, L R; Sanders, G D; Tempio, P; Owens, D K

    2009-08-01

    Despite recommendations for voluntary HIV screening, few medical centres have implemented screening programmes. The objective of the study was to determine whether an intervention with computer-based reminders and feedback would increase screening for HIV in a Department of Veterans Affairs (VA) health-care system. The design of the study was a randomized controlled trial at five primary care clinics at the VA Palo Alto Health Care System. All primary care providers were eligible to participate in the study. The study intervention was computer-based reminders to either assess HIV risk behaviours or to offer HIV testing; feedback on adherence to reminders was provided. The main outcome measure was the difference in HIV testing rates between intervention and control group providers. The control group providers tested 1.0% (n = 67) and 1.4% (n = 106) of patients in the preintervention and intervention period, respectively; intervention providers tested 1.8% (n = 98) and 1.9% (n = 114), respectively (P = 0.75). In our random sample of 753 untested patients, 204 (27%) had documented risk behaviours. Providers were more likely to adhere to reminders to test than to reminders to perform risk assessment (11% versus 5%, P < 0.01). Sixty-one percent of providers felt that lack of time prevented risk assessment. In conclusion, in primary care clinics in our setting, HIV testing rates were low. Providers were unaware of the high rates of risky behaviour in their patient population and perceived important barriers to testing. Low-intensity clinical reminders and feedback did not increase rates of screening.

  20. Mobile phone technologies improve adherence to antiretroviral treatment in a resource-limited setting: a randomized controlled trial of text message reminders

    PubMed Central

    Pop-Eleches, Cristian; Thirumurthy, Harsha; Habyarimana, James P.; Zivin, Joshua G.; Goldstein, Markus P.; de Walque, Damien; MacKeen, Leslie; Haberer, Jessica; Kimaiyo, Sylvester; Sidle, John; Ngare, Duncan; Bangsberg, David R.

    2013-01-01

    Objective There is limited evidence on whether growing mobile phone availability in sub-Saharan Africa can be used to promote high adherence to antiretroviral therapy (ART). This study tested the efficacy of short message service (SMS) reminders on adherence to ART among patients attending a rural clinic in Kenya. Design A randomized controlled trial of four SMS reminder interventions with 48 weeks of follow-up. Methods Four hundred and thirty-one adult patients who had initiated ART within 3 months were enrolled and randomly assigned to a control group or one of the four intervention groups. Participants in the intervention groups received SMS reminders that were either short or long and sent at a daily or weekly frequency. Adherence was measured using the medication event monitoring system. The primary outcome was whether adherence exceeded 90% during each 12-week period of analysis and the 48-week study period. The secondary outcome was whether there were treatment interruptions lasting at least 48 h. Results In intention-to-treat analysis, 53% of participants receiving weekly SMS reminders achieved adherence of at least 90% during the 48 weeks of the study, compared with 40% of participants in the control group (P=0.03). Participants in groups receiving weekly reminders were also significantly less likely to experience treatment interruptions exceeding 48 h during the 48-week follow-up period than participants in the control group (81 vs. 90%, P = 0.03). Conclusion These results suggest that SMS reminders may be an important tool to achieve optimal treatment response in resource-limited settings. PMID:21252632

  1. No Effect of Insecticide Treated Curtain Deployment on Aedes Infestation in a Cluster Randomized Trial in a Setting of Low Dengue Transmission in Guantanamo, Cuba

    PubMed Central

    Lambert, Isora; Montada, Domingo; Baly, Alberto; Van der Stuyft, Patrick

    2015-01-01

    Objective & Methodology The current study evaluated the effectiveness and cost-effectiveness of Insecticide Treated Curtain (ITC) deployment for reducing dengue vector infestation levels in the Cuban context with intensive routine control activities. A cluster randomized controlled trial took place in Guantanamo city, east Cuba. Twelve neighborhoods (about 500 households each) were selected among the ones with the highest Aedes infestation levels in the previous two years, and were randomly allocated to the intervention and control arms. Long lasting ITC (PermaNet) were distributed in the intervention clusters in March 2009. Routine control activities were continued in the whole study area. In both study arms, we monitored monthly pre- and post-intervention House Index (HI, number of houses with at least 1 container with Aedes immature stages/100 houses inspected), during 12 and 18 months respectively. We evaluated the effect of ITC deployment on HI by fitting a generalized linear regression model with a negative binomial link function to these data. Principal Findings At distribution, the ITC coverage (% of households using ≥1 ITC) reached 98.4%, with a median of 3 ITC distributed/household. After 18 months, the coverage remained 97.4%. The local Aedes species was susceptible to deltamethrin (mosquito mortality rate of 99.7%) and the residual deltamethrin activity in the ITC was within acceptable levels (mosquito mortality rate of 73.1%) after one year of curtain use. Over the 18 month observation period after ITC distribution, the adjusted HI rate ratio, intervention versus control clusters, was 1.15 (95% CI 0.57 to 2.34). The annualized cost per household of ITC implementation was 3.8 USD, against 16.8 USD for all routine ACP activities. Conclusion Deployment of ITC in a setting with already intensive routine Aedes control actions does not lead to reductions in Aedes infestation levels. PMID:25794192

  2. The effectiveness of the Pain Resource Nurse Program to improve pain management in the hospital setting: A cluster randomized controlled trial.

    PubMed

    Gunnarsdottir, Sigridur; Zoëga, Sigridur; Serlin, Ronald C; Sveinsdottir, Herdis; Hafsteinsdottir, Elin Johanna Gudrun; Fridriksdottir, Nanna; Gretarsdottir, Elfa Tholl; Ward, Sandra Evelyn

    2017-07-16

    The Pain Resource Nurse program is a widely disseminated, evidence-based, nursing staff development program, designed to improve pain management in hospitals. The program has shown promising results, but has never been tested with a rigorous research design. Our objective was to test the effectiveness of the Pain Resource Nurse program. Hypothesized outcomes included improvements in nurses' knowledge, attitudes, and assessment practices, and in patients' participation in decision-making, adequacy of pain management, pain severity, time spent in severe pain, pain interference, and satisfaction. Cluster randomized controlled trial. Setting: A 650-bed university hospital in Iceland. Participants: The sample consisted of a) patients ≥18 years of age, native speaking, hospitalized for at least 24h, alert and able to participate; and b) registered nurses who worked on the participating units. Twenty three surgical and medical inpatient units were randomly assigned to the Pain Resource Nurse program (n=12) or to wait list control (n=11). The American Pain Society Outcome Questionnaire and the Knowledge and Attitudes Survey were used to collect data from patients and nurses respectively. Baseline data (T1) for patients were collected simultaneously on all units, followed by data collection from nurses. Then randomization took place, and the Pain Resource Nurse program was instituted. Ten months later, follow up (T2) data were collected, after which the nurses on the control group units received the Pain Resource Nurse program. At baseline, data were collected from 305 of the 396 eligible patients and at follow up from 326 of the 392 eligible patients, a 77% and 83% response rate respectively. At baseline, 232 of 479 eligible nurses responded and at follow-up 176 of the eligible 451 nurses responded, a 49% and 39% response rate, respectively. A nested mixed model analysis of covariance revealed that the intervention was successful in changing pain assessment practices, with pain

  3. High Intensity Interval Training in a Real World Setting: A Randomized Controlled Feasibility Study in Overweight Inactive Adults, Measuring Change in Maximal Oxygen Uptake

    PubMed Central

    Lunt, Helen; Draper, Nick; Marshall, Helen C.; Logan, Florence J.; Hamlin, Michael J.; Shearman, Jeremy P.; Cotter, James D.; Kimber, Nicholas E.; Blackwell, Gavin; Frampton, Christopher M. A.

    2014-01-01

    Background In research clinic settings, overweight adults undertaking HIIT (high intensity interval training) improve their fitness as effectively as those undertaking conventional walking programs but can do so within a shorter time spent exercising. We undertook a randomized controlled feasibility (pilot) study aimed at extending HIIT into a real world setting by recruiting overweight/obese, inactive adults into a group based activity program, held in a community park. Methods Participants were allocated into one of three groups. The two interventions, aerobic interval training and maximal volitional interval training, were compared with an active control group undertaking walking based exercise. Supervised group sessions (36 per intervention) were held outdoors. Cardiorespiratory fitness was measured using VO2max (maximal oxygen uptake, results expressed in ml/min/kg), before and after the 12 week interventions. Results On ITT (intention to treat) analyses, baseline (N = 49) and exit (N = 39) VO2max was 25.3±4.5 and 25.3±3.9, respectively. Participant allocation and baseline/exit VO2max by group was as follows: aerobic interval training N = 16, 24.2±4.8/25.6±4.8; maximal volitional interval training N = 16, 25.0±2.8/25.2±3.4; walking N = 17, 26.5±5.3/25.2±3.6. The post intervention change in VO2max was +1.01 in the aerobic interval training, −0.06 in the maximal volitional interval training and −1.03 in the walking subgroups. The aerobic interval training subgroup increased VO2max compared to walking (p = 0.03). The actual (observed, rather than prescribed) time spent exercising (minutes per week, ITT analysis) was 74 for aerobic interval training, 45 for maximal volitional interval training and 116 for walking (p = 0.001). On descriptive analysis, the walking subgroup had the fewest adverse events. Conclusions In contrast to earlier studies, the improvement in cardiorespiratory fitness in a cohort of overweight

  4. An integrated intervention to reduce intimate partner violence and psychological distress with refugees in low-resource settings: study protocol for the Nguvu cluster randomized trial.

    PubMed

    Tol, Wietse A; Greene, M Claire; Likindikoki, Samuel; Misinzo, Lusia; Ventevogel, Peter; Bonz, Ann G; Bass, Judith K; Mbwambo, Jessie K K

    2017-05-18

    Intimate partner violence (IPV) is a critical public health and human rights concern globally, including for refugee women in low-resource settings. Little is known about effective interventions for this population. IPV and psychological distress have a bi-directional relationship, indicating the potential benefit of a structured psychological component as part of efforts to reduce IPV for women currently in violent relationships. This protocol describes a cluster randomized controlled trial aimed at evaluating an 8-session integrated psychological and advocacy intervention (Nguvu) with female adult survivors of past-year IPV displaying moderate to severe psychological distress. Outcomes are reductions in: recurrence of IPV; symptoms of anxiety, depression and post-traumatic stress (primary); and functional impairment (secondary). Hypothesized mediators of the intervention are improvements in social support, coping skills and support seeking. We will recruit 400 participants from existing women's support groups operating within villages in Nyarugusu refugee camp, Tanzania. Women's groups will be randomized to receive the intervention (Nguvu and usual care) or usual care alone. All eligible women will complete a baseline assessment (week 0) followed by a post-treatment (week 9) and a 3-month post-treatment assessment (week 20). The efficacy of the intervention will be determined by between-group differences in the longitudinal trajectories of primary outcomes evaluated using mixed-effects models. Study procedures have been approved by Institutional Review Boards in the United States and Tanzania. This trial will provide evidence on the efficacy of a novel integrated group intervention aimed at secondary prevention of IPV that includes a structured psychological component to address psychological distress. The psychological and advocacy components of the proposed intervention have been shown to be efficacious for their respective outcomes when delivered in

  5. Geochemical and geostatistical evaluation, Arkansas Canyon Planning Unit, Fremont and Custer Counties, Colorado

    USGS Publications Warehouse

    Weiland, E.F.; Connors, R.A.; Robinson, M.L.; Lindemann, J.W.; Meyer, W.T.

    1982-01-01

    A mineral assessment of the Arkansas Canyon Planning Unit was undertaken by Barringer Resources Inc., under the terms of contract YA-553-CTO-100 with the Bureau of Land Management, Colorado State Office. The study was based on a geochemical-geostatistical survey in which 700 stream sediment samples were collected and analyzed for 25 elements. Geochemical results were interpreted by statistical processing which included factor, discriminant, multiple regression and characteristic analysis. The major deposit types evaluated were massive sulfide-base metal, sedimentary and magmatic uranium, thorium vein, magmatic segregation, and carbonatite related deposits. Results of the single element data and multivariate geostatistical analysis indicate that limited potential exists for base metal mineralization near the Horseshoe, El Plomo, and Green Mountain Mines. Thirty areas are considered to be anomalous with regard to one or more of the geochemical parameters evaluated during this study. The evaluation of carbonatite related mineralization was restricted due to the lack of geochemical data specific to this environment.

  6. Three-dimensional geostatistical inversion of flowmeter and pumping test data.

    PubMed

    Li, Wei; Englert, Andreas; Cirpka, Olaf A; Vereecken, Harry

    2008-01-01

    We jointly invert field data of flowmeter and multiple pumping tests in fully screened wells to estimate hydraulic conductivity using a geostatistical method. We use the steady-state drawdowns of pumping tests and the discharge profiles of flowmeter tests as our data in the inference. The discharge profiles need not be converted to absolute hydraulic conductivities. Consequently, we do not need measurements of depth-averaged hydraulic conductivity at well locations. The flowmeter profiles contain information about relative vertical distributions of hydraulic conductivity, while drawdown measurements of pumping tests provide information about horizontal fluctuation of the depth-averaged hydraulic conductivity. We apply the method to data obtained at the Krauthausen test site of the Forschungszentrum Jülich, Germany. The resulting estimate of our joint three-dimensional (3D) geostatistical inversion shows an improved 3D structure in comparison to the inversion of pumping test data only.

  7. Geostatistics for spatial genetic structures: study of wild populations of perennial ryegrass.

    PubMed

    Monestiez, P; Goulard, M; Charmet, G

    1994-04-01

    Methods based on geostatistics were applied to quantitative traits of agricultural interest measured on a collection of 547 wild populations of perennial ryegrass in France. The mathematical background of these methods, which resembles spatial autocorrelation analysis, is briefly described. When a single variable is studied, the spatial structure analysis is similar to spatial autocorrelation analysis, and a spatial prediction method, called "kriging", gives a filtered map of the spatial pattern over all the sampled area. When complex interactions of agronomic traits with different evaluation sites define a multivariate structure for the spatial analysis, geostatistical methods allow the spatial variations to be broken down into two main spatial structures with ranges of 120 km and 300 km, respectively. The predicted maps that corresponded to each range were interpreted as a result of the isolation-by-distance model and as a consequence of selection by environmental factors. Practical collecting methodology for breeders may be derived from such spatial structures.
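The spatial structure analysis described above starts from an experimental semivariogram: half the mean squared difference between sample pairs, binned by separation distance. A minimal sketch with synthetic coordinates and trait values (the study itself used 547 population locations across France):

```python
import numpy as np

def experimental_variogram(coords, values, bin_edges):
    """Binned experimental semivariogram: gamma(h) = mean of 0.5*(z_i - z_j)^2."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)  # count each pair once
    d, sq = d[iu], sq[iu]
    gamma = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (d >= lo) & (d < hi)
        gamma.append(sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# Synthetic sample sites and a trait with a west-east trend (illustrative only).
rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(50, 2))
values = coords[:, 0] + rng.normal(0, 0.1, 50)
print(experimental_variogram(coords, values, np.array([0, 2, 4, 6])))
```

A model (spherical, exponential, or nested combinations, as in the two-range structure reported above) is then fitted to these binned values before kriging.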

  8. Geostatistical simulations for radon indoor with a nested model including the housing factor.

    PubMed

    Cafaro, C; Giovani, C; Garavaglia, M

    2016-01-01

    The definition of radon prone areas is the subject of much research in radioecology, since radon is considered a leading cause of lung tumours, and the authorities therefore need support to develop an appropriate sanitary prevention strategy. In this paper, we use geostatistical tools to elaborate a definition accounting for some of the available information about the dwellings. Co-kriging is the proper interpolator used in geostatistics to refine predictions by using external covariates. A priori, co-kriging is not guaranteed to significantly improve on the results obtained by applying common lognormal kriging. Here, however, such a multivariate approach reduces the cross-validation residual variance to an extent which is deemed satisfying. Furthermore, with the application of Monte Carlo simulations, the paradigm provides a more conservative radon prone areas definition than the one previously made by lognormal kriging.
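Radon-prone-area definitions of this kind are often cast as exceedance probabilities: once (log-)kriging yields a mean and variance for log-radon at a location, the probability of exceeding a reference level follows from the normal CDF. A hedged sketch of that final step only (the threshold and kriging outputs below are invented; the paper's actual definition additionally involves co-kriging and Monte Carlo simulation):

```python
from math import erf, log, sqrt

def exceedance_probability(mu_log, var_log, threshold):
    """P(Z > threshold) when log(Z) ~ Normal(mu_log, var_log)."""
    z = (log(threshold) - mu_log) / sqrt(var_log)
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

# Invented kriging outputs for one map cell: mean/variance of log(Bq/m^3).
p = exceedance_probability(mu_log=5.0, var_log=0.5, threshold=300.0)
print(round(p, 3))
```

Cells whose exceedance probability passes a chosen cut-off would then be flagged as radon prone; larger kriging variance pushes the probability toward 0.5, which is one way a simulation-based definition becomes more conservative.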

  9. Statistical and Geostatistical analysis of interannual rainfall data in the island of Crete

    NASA Astrophysics Data System (ADS)

    Kotsopoulou, Anastasia; Varouchakis, Emmanouil

    2017-04-01

    Hydrological modeling requires spatially distributed precipitation data of high accuracy. However, precipitation is usually measured at a limited number of locations. In particular, in areas of complex terrain, where the topography plays a key role in the precipitation process, rainfall stations are usually sparse. Spatial interpolation techniques can be applied both to interpolate rainfall data and to combine them with secondary information that may improve the results. Regression Kriging (RK) is an interpolation methodology that combines a regression approach with a geostatistical approach. RK along with Ordinary Kriging (OK) are applied to represent the rainfall spatial distribution on the island of Crete, Greece. A period of 30 years is examined, and the statistical analysis of rainfall data is performed to identify key hydrological years and to analyze the interannual behavior of the rainfall. Then, geostatistical analysis is conducted to present the average spatial distribution of rainfall on the island of Crete.
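Regression Kriging as described above fits a regression of rainfall on secondary information (for example, elevation) and then kriges the residuals, adding the two parts back together at prediction time. A compact, hypothetical sketch in one dimension with synthetic data and an assumed exponential residual covariance (the study itself fits models to 30 years of rainfall records for Crete):

```python
import numpy as np

def regression_krige(x, elev, rain, x0, elev0, rng_par=2.0, sill=1.0):
    """Regression kriging: linear trend on a covariate + simple kriging of residuals."""
    # 1. Fit the trend rain ~ a + b*elev by ordinary least squares.
    X = np.column_stack([np.ones_like(elev), elev])
    beta, *_ = np.linalg.lstsq(X, rain, rcond=None)
    resid = rain - X @ beta
    # 2. Simple kriging of the residuals with an exponential covariance.
    C = sill * np.exp(-np.abs(x[:, None] - x[None, :]) / rng_par)
    c0 = sill * np.exp(-np.abs(x - x0) / rng_par)
    w = np.linalg.solve(C, c0)
    # 3. Recombine the trend at x0 with the interpolated residual.
    return float(beta[0] + beta[1] * elev0 + w @ resid)

# Synthetic station positions, elevations (m) and annual rainfall (mm).
x = np.array([0.0, 1.0, 2.0, 3.0])
elev = np.array([100.0, 300.0, 200.0, 400.0])
rain = np.array([500.0, 700.0, 620.0, 810.0])
print(regression_krige(x, elev, rain, x0=1.5, elev0=250.0))
```

Ordinary Kriging, the comparison method in the abstract, is the special case in which no covariate trend is used and the constant mean is estimated implicitly.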

  10. Effects of a settings-based intervention to promote student wellbeing and reduce smoking in vocational schools: A non-randomized controlled study.

    PubMed

    Andersen, Susan; Rod, Morten Hulvej; Ersbøll, Annette Kjær; Stock, Christiane; Johansen, Christoffer; Holmberg, Teresa; Zinckernagel, Line; Ingholt, Liselotte; Sørensen, Betina Bang; Tolstrup, Janne Schurmann

    2016-07-01

    School dropout and health risk behavior such as cigarette smoking represent major problems among students attending upper secondary vocational education. Modifications to the social environment may promote educational attainment as well as health and wellbeing of young people. However, there is a need for more evidence-based intervention programs. The aim of this study was to assess the effectiveness of an intervention targeting the socio-environmental setting at vocational schools on student wellbeing and smoking. We conducted a non-randomized controlled trial of 5794 students (mean age 21 years; 81% male) in 10 (four intervention and six comparison) large vocational schools in Denmark. The intervention involved changes in everyday school practices focusing on four themes: (i) introduction activities, (ii) daily class meetings, (iii) scheduled breaks and (iv) pleasant non-smoking environment. Outcomes were student wellbeing (four subscales: school connectedness, student support, teacher relatedness, positive valuing of the profession) and daily smoking measured at 10-week follow-up. We found statistically significant between-group difference in school connectedness, but not in student support, teacher relatedness and valuing the profession. The intervention had no effect on daily smoking. However, we found a statistically significant interaction between baseline smoking status and condition. This interaction suggested that baseline occasional smokers in the intervention group had significantly reduced odds ratio (OR) of becoming a daily smoker compared to baseline occasional smokers in the control group (8% versus 16%; OR = 0.44). The positive effects on school connectedness and in preventing occasional smokers becoming daily smokers indicate that it is possible to tackle school-related wellbeing and smoking in a high risk population through settings-based interventions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Tomogram-based comparison of geostatistical models: Application to the Macrodispersion Experiment (MADE) site

    NASA Astrophysics Data System (ADS)

    Linde, Niklas; Lochbühler, Tobias; Dogan, Mine; Van Dam, Remke L.

    2015-12-01

    We propose a new framework to compare alternative geostatistical descriptions of a given site. Multiple realizations of each of the considered geostatistical models and their corresponding tomograms (based on inversion of noise-contaminated simulated data) are used as a multivariate training image. The training image is scanned with a direct sampling algorithm to obtain conditional realizations of hydraulic conductivity that are not only in agreement with the geostatistical model, but also honor the spatially varying resolution of the site-specific tomogram. Model comparison is based on the quality of the simulated geophysical data from the ensemble of conditional realizations. The tomogram in this study is obtained by inversion of cross-hole ground-penetrating radar (GPR) first-arrival travel time data acquired at the MAcro-Dispersion Experiment (MADE) site in Mississippi (USA). Various heterogeneity descriptions ranging from multi-Gaussian fields to fields with complex multiple-point statistics inferred from outcrops are considered. Under the assumption that the relationship between porosity and hydraulic conductivity inferred from local measurements is valid, we find that conditioned multi-Gaussian realizations and derivatives thereof can explain the crosshole geophysical data. A training image based on an aquifer analog from Germany was found to be in better agreement with the geophysical data than the one based on the local outcrop, which appears to under-represent high hydraulic conductivity zones. These findings are only based on the information content in a single resolution-limited tomogram and extending the analysis to tracer or higher resolution surface GPR data might lead to different conclusions (e.g., that discrete facies boundaries are necessary). Our framework makes it possible to identify inadequate geostatistical models and petrophysical relationships, effectively narrowing the space of possible heterogeneity representations.

  12. Application of Bayesian geostatistics for evaluation of mass discharge uncertainty at contaminated sites

    NASA Astrophysics Data System (ADS)

    Troldborg, Mads; Nowak, Wolfgang; Lange, Ida V.; Santos, Marta C.; Binning, Philip J.; Bjerg, Poul L.

    2012-09-01

    Mass discharge estimates are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Such estimates are, however, rather uncertain as they integrate uncertain spatial distributions of both concentration and groundwater flow. Here a geostatistical simulation method for quantifying the uncertainty of the mass discharge across a multilevel control plane is presented. The method accounts for (1) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics, (2) measurement uncertainty, and (3) uncertain source zone and transport parameters. The method generates conditional realizations of the spatial flow and concentration distribution. An analytical macrodispersive transport solution is employed to simulate the mean concentration distribution, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed deviations from this mean solution. By combining the flow and concentration realizations, a mass discharge probability distribution is obtained. The method has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is demonstrated on a field site contaminated with chlorinated ethenes. For this site, we show that including a physically meaningful concentration trend and the cosimulation of hydraulic conductivity and hydraulic gradient across the transect helps constrain the mass discharge uncertainty. The number of sampling points required for accurate mass discharge estimation and the relative influence of different data types on mass discharge uncertainty is discussed.
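The core idea above, combining conditional realizations of flow and concentration into a mass discharge probability distribution, can be sketched with a minimal Monte Carlo loop. The lognormal field parameters, cell areas, and realization count below are illustrative assumptions, not values from the study:

```python
import random

def mass_discharge_distribution(n_real, cell_areas, rng):
    """Monte Carlo sketch: draw per-cell flow and concentration values
    across a control plane and sum them into mass-discharge realizations."""
    totals = []
    for _ in range(n_real):
        total = 0.0
        for area in cell_areas:
            # hypothetical lognormal draws for Darcy flux (m/d) and
            # concentration (g/m^3); a real study would use conditional
            # geostatistical realizations honoring the site data
            q = rng.lognormvariate(-2.0, 0.5)
            c = rng.lognormvariate(0.0, 1.0)
            total += q * c * area  # g/d crossing this cell
        totals.append(total)
    totals.sort()
    return totals

rng = random.Random(42)
totals = mass_discharge_distribution(2000, [1.0] * 25, rng)
p10, p50, p90 = totals[200], totals[1000], totals[1800]
```

The sorted realizations directly yield the percentiles of the mass discharge distribution, which is the quantity of interest for risk assessment.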

  13. Geostatistical mapping of effluent-affected sediment distribution on the Palos Verdes Shelf

    SciTech Connect

    Murray, Christopher J.; Lee, H.J.; Hampton, M.A.

    2001-12-01

    Geostatistical techniques were used to study the spatial continuity of the thickness of effluent-affected sediment in the offshore Palos Verdes margin area. The thickness data were measured directly from cores and indirectly from high-frequency subbottom profiles collected over the Palos Verdes Margin. Strong spatial continuity of the sediment thickness data was identified, with a maximum range of correlation in excess of 1.4 km. The spatial correlation showed a marked anisotropy, and was more than twice as continuous in the alongshore direction as in the cross-shelf direction. Sequential indicator simulation employing models fit to the thickness data variograms was used to map the distribution of the sediment, and to quantify the uncertainty in those estimates. A strong correlation between sediment thickness data and measurements of the mass of the contaminant p,p′-DDE per unit area was identified. A calibration based on the bivariate distribution of the thickness and p,p′-DDE data was applied using Markov-Bayes indicator simulation to extend the geostatistical study and map the contamination levels in the sediment. Integrating the map grids produced by the geostatistical study of the two variables indicated that 7.8 million cubic meters of effluent-affected sediment exist in the map area, containing approximately 61 to 72 Mg (metric tons) of p,p′-DDE. Most of the contaminated sediment (about 85% of the sediment and 89% of the p,p′-DDE) occurs in water depths less than 100 m. The geostatistical study also indicated that the samples available for mapping are well distributed and the uncertainty of the estimates of the thickness and contamination level of the sediments is lowest in areas where the contaminated sediment is most prevalent.
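The grid-integration step described above can be sketched as follows. A real study would generate the thickness grids by sequential indicator simulation conditioned on the core and subbottom-profile data; here stand-in realizations (all parameters hypothetical) show how an ensemble of simulated grids yields a volume estimate with uncertainty bounds:

```python
import random

rng = random.Random(3)
cell_area = 50.0 * 50.0      # m^2, hypothetical mapping-grid cell
n_cells = 400                # cells in the hypothetical map area
n_real = 100                 # number of simulated realizations

# stand-in for an ensemble of simulated sediment-thickness grids (m);
# thickness is drawn from a truncated Gaussian purely for illustration
realizations = [[max(0.0, rng.gauss(0.45, 0.2)) for _ in range(n_cells)]
                for _ in range(n_real)]

# integrate each grid to a total sediment volume, then summarize
volumes = sorted(sum(r) * cell_area for r in realizations)
v_mean = sum(volumes) / n_real
v_p10, v_p90 = volumes[10], volumes[89]
```

Because every realization honors the conditioning data, the spread between the percentile volumes quantifies the mapping uncertainty rather than measurement scatter.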

  14. Application of a computationally efficient geostatistical approach to characterizing variably spaced water-table data

    SciTech Connect

    Quinn, J.J.

    1996-02-01

    Geostatistical analysis of hydraulic head data is useful in producing unbiased contour plots of head estimates and relative errors. However, at most sites being characterized, monitoring wells are generally present at different densities, with clusters of wells in some areas and few wells elsewhere. The problem that arises when kriging data at different densities is in achieving adequate resolution of the grid while maintaining computational efficiency and working within software limitations. For the site considered, 113 data points were available over a 14-mi{sup 2} study area, including 57 monitoring wells within an area of concern of 1.5 mi{sup 2}. Variogram analyses of the data indicate a linear model with a negligible nugget effect. The geostatistical package used in the study allows a maximum grid of 100 by 100 cells. Two-dimensional kriging was performed for the entire study area with a 500-ft grid spacing, while the smaller zone was modeled separately with a 100-ft spacing. In this manner, grid cells for the dense area and the sparse area remained small relative to the well separation distances, and the maximum dimensions of the program were not exceeded. The spatial head results for the detailed zone were then nested into the regional output by use of a graphical, object-oriented database that performed the contouring of the geostatistical output. This study benefitted from the two-scale approach and from very fine geostatistical grid spacings relative to typical data separation distances. The combining of the sparse, regional results with those from the finer-resolution area of concern yielded contours that honored the actual data at every measurement location. The method applied in this study can also be used to generate reproducible, unbiased representations of other types of spatial data.
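A compact sketch of the ordinary kriging step underlying the abstract above, using the linear, negligible-nugget variogram it reports. The coordinates, head values, and unit variogram slope are hypothetical:

```python
import math

def linear_vgm(h, slope=1.0):
    """Linear variogram with negligible nugget, as fit in the study."""
    return slope * h

def solve(a, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def ordinary_krige(pts, vals, target):
    """Ordinary kriging estimate at target from scattered well data."""
    n = len(pts)
    dist = lambda p, q: math.hypot(p[0] - q[0], p[1] - q[1])
    a = [[linear_vgm(dist(pts[i], pts[j])) for j in range(n)] + [1.0]
         for i in range(n)]
    a.append([1.0] * n + [0.0])  # unbiasedness constraint row
    b = [linear_vgm(dist(p, target)) for p in pts] + [1.0]
    w = solve(a, b)[:n]
    return sum(wi * vi for wi, vi in zip(w, vals))

# hypothetical wells on a 500-ft spacing with measured heads
pts = [(0.0, 0.0), (500.0, 0.0), (0.0, 500.0), (500.0, 500.0)]
heads = [10.0, 12.0, 11.0, 13.0]
est = ordinary_krige(pts, heads, (250.0, 250.0))
```

Kriging is an exact interpolator at the data locations, which is why the nested two-scale contours in the study honor every measurement.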

  15. Geostatistical mapping of effluent-affected sediment distribution on the Palos Verdes shelf

    USGS Publications Warehouse

    Murray, C.J.; Lee, H.J.; Hampton, M.A.

    2002-01-01

    Geostatistical techniques were used to study the spatial continuity of the thickness of effluent-affected sediment in the offshore Palos Verdes Margin area. The thickness data were measured directly from cores and indirectly from high-frequency subbottom profiles collected over the Palos Verdes Margin. Strong spatial continuity of the sediment thickness data was identified, with a maximum range of correlation in excess of 1.4 km. The spatial correlation showed a marked anisotropy, and was more than twice as continuous in the alongshore direction as in the cross-shelf direction. Sequential indicator simulation employing models fit to the thickness data variograms was used to map the distribution of the sediment, and to quantify the uncertainty in those estimates. A strong correlation between sediment thickness data and measurements of the mass of the contaminant p,p′-DDE per unit area was identified. A calibration based on the bivariate distribution of the thickness and p,p′-DDE data was applied using Markov-Bayes indicator simulation to extend the geostatistical study and map the contamination levels in the sediment. Integrating the map grids produced by the geostatistical study of the two variables indicated that 7.8 million m3 of effluent-affected sediment exist in the map area, containing approximately 61-72 Mg (metric tons) of p,p′-DDE. Most of the contaminated sediment (about 85% of the sediment and 89% of the p,p′-DDE) occurs in water depths < 100 m. The geostatistical study also indicated that the samples available for mapping are well distributed and the uncertainty of the estimates of the thickness and contamination level of the sediments is lowest in areas where the contaminated sediment is most prevalent. © 2002 Elsevier Science Ltd. All rights reserved.

  16. A geostatistical approach to predicting sulfur content in the Pittsburgh coal bed

    USGS Publications Warehouse

    Watson, W.D.; Ruppert, L.F.; Bragg, L.J.; Tewalt, S.J.

    2001-01-01

    The US Geological Survey (USGS) is completing a national assessment of coal resources in the five top coal-producing regions in the US. Point-located data provide measurements on coal thickness and sulfur content. The sample data and their geologic interpretation represent the most regionally complete and up-to-date assessment of what is known about top-producing US coal beds. The sample data are analyzed using a combination of geologic and Geographic Information System (GIS) models to estimate tonnages and qualities of the coal beds. Traditionally, GIS practitioners use contouring to represent geographical patterns of "similar" data values. The tonnage and grade of coal resources are then assessed by using the contour lines as references for interpolation. An assessment taken to this point is only indicative of resource quantity and quality. Data users may benefit from a statistical approach that would allow them to better understand the uncertainty and limitations of the sample data. To develop a quantitative approach, geostatistics were applied to the data on coal sulfur content from samples taken in the Pittsburgh coal bed (located in the eastern US, in the southwestern part of the state of Pennsylvania, and in adjoining areas in the states of Ohio and West Virginia). Geostatistical methods that account for regional and local trends were applied to blocks 2.7 mi (4.3 km) on a side. The data and geostatistics support conclusions concerning the average sulfur content and its degree of reliability at regional- and economic-block scale over the large, contiguous part of the Pittsburgh outcrop, but not to a mine scale. To validate the method, a comparison was made with the sulfur contents in sample data taken from 53 coal mines located in the study area. The comparison showed a high degree of similarity between the sulfur content in the mine samples and the sulfur content represented by the geostatistically derived contours. Published by Elsevier Science B.V.

  17. An in vitro systematic spectroscopic examination of the photostabilities of a random set of commercial sunscreen lotions and their chemical UVB/UVA active agents.

    PubMed

    Serpone, Nick; Salinaro, Angela; Emeline, Alexei V; Horikoshi, Satoshi; Hidaka, Hisao; Zhao, Jincai

    2002-12-01

    The photostabilities of a random set of commercially available sunscreen lotions and their active ingredients are examined spectroscopically subsequent to simulated sunlight UV exposure. Loss of filtering efficacy can occur because of possible photochemical modifications of the sunscreen active agents. Changes in absorption of UVA/UVB sunlight by agents in sunscreen lotions also lead to a reduction of the expected photoprotection of human skin and DNA against the harmful UV radiation. The active ingredients were investigated in aqueous media and in organic solvents of various polarities (methanol, acetonitrile, and n-hexane) under aerobic and anaerobic conditions. The UV absorption features are affected by the nature of the solvents, with properties closely related to the oil-in-water (o/w) or water-in-oil (w/o) emulsions actually used in sunscreen formulations, and by the presence of molecular oxygen. The photostabilities of two combined chemical ingredients (oxybenzone and octyl methoxycinnamate) and the combination oxybenzone/titanium dioxide were also explored. In the latter case, oxybenzone undergoes significant photodegradation in the presence of the physical filter TiO2.

  18. Implementation Intentions as a Strategy to Increase the Notification Rate of Potential Ocular Tissue Donors by Nurses: A Clustered Randomized Trial in Hospital Settings

    PubMed Central

    2014-01-01

    Aim. The purpose of this study is to evaluate the impact, among nurses in hospital settings, of a questionnaire-based implementation intentions intervention on notification of potential ocular tissue donors to donation stakeholders. Methods. This randomized intervention was clustered at the level of hospital departments with two study arms: questionnaire-based implementation intentions intervention and control. In the intervention group, nurses were asked to plan specific actions if faced with a number of barriers when reporting potential ocular donors. The primary outcome was the potential ocular tissue donors' notification rate before and after the intervention. Analysis was based on a generalized linear model with an identity link and a binomial distribution. Results. We compared outcomes in 26 departments from 5 hospitals, 13 departments per condition. The implementation intentions intervention did not significantly increase the notification rate of ocular tissue donors (intervention: 23.1% versus control: 21.1%; χ² = 1.14, df = 2; P = 0.56). Conclusion. A single and brief implementation intentions intervention among nurses did not modify the notification rate of potential ocular tissue donors to donation stakeholders. Low exposure to the intervention was a major challenge in this study. Further studies should carefully consider a multicomponent intervention to increase exposure to this type of intervention. PMID:25132990

  19. Text Messaging to Improve Hypertension Medication Adherence in African Americans From Primary Care and Emergency Department Settings: Results From Two Randomized Feasibility Studies

    PubMed Central

    Hirzel, Lindsey; Dawood, Rachelle M; Dawood, Katee L; Nichols, Lauren P; Artinian, Nancy T; Schwiebert, Loren; Yarandi, Hossein N; Roberson, Dana N; Plegue, Melissa A; Mango, LynnMarie C; Levy, Phillip D

    2017-01-01

    Background Hypertension (HTN) is an important problem in the United States, with an estimated 78 million Americans aged 20 years and older suffering from this condition. Health disparities related to HTN are common in the United States, with African Americans suffering from greater prevalence of the condition than whites, as well as greater severity, earlier onset, and more complications. Medication adherence is an important component of HTN management, but adherence is often poor, and simply forgetting to take medications is often cited as a reason. Mobile health (mHealth) strategies have the potential to be a low-cost and effective method for improving medication adherence that also has broad reach. Objective Our goal was to determine the feasibility, acceptability, and preliminary clinical effectiveness of BPMED, an intervention designed to improve medication adherence among African Americans with uncontrolled HTN, through fully automated text messaging support. Methods We conducted two parallel, unblinded randomized controlled pilot trials with African-American patients who had uncontrolled HTN, recruited from primary care and emergency department (ED) settings. In each trial, participants were randomized to receive either usual care or the BPMED intervention for one month. Data were collected in-person at baseline and one-month follow-up, assessing the effect on medication adherence, systolic and diastolic blood pressure (SBP and DBP), medication adherence self-efficacy, and participant satisfaction. Data for both randomized controlled pilot trials were analyzed separately and combined. Results A total of 58 primary care and 65 ED participants were recruited with retention rates of 91% (53/58) and 88% (57/65), respectively. BPMED participants consistently showed numerically greater, yet nonsignificant, improvements in measures of medication adherence (mean change 0.9, SD 2.0 vs mean change 0.5, SD 1.5, P=.26), SBP (mean change –12.6, SD 24.0 vs mean change

  20. A geostatistical approach for describing spatial pattern in stream networks

    USGS Publications Warehouse

    Ganio, L.M.; Torgersen, C.E.; Gresswell, R.E.

    2005-01-01

    The shape and configuration of branched networks influence ecological patterns and processes. Recent investigations of network influences in riverine ecology stress the need to quantify spatial structure not only in a two-dimensional plane, but also in networks. An initial step in understanding data from stream networks is discerning non-random patterns along the network. However, data collected in a network may be spatially autocorrelated and thus not suitable for traditional statistical analyses. Here we provide a method that uses commercially available software to construct an empirical variogram to describe spatial pattern in the relative abundance of coastal cutthroat trout in headwater stream networks. We describe the mathematical and practical considerations involved in calculating a variogram using a non-Euclidean distance metric to incorporate the network pathway structure in the analysis of spatial variability, and use a non-parametric technique to ascertain if the pattern in the empirical variogram is non-random.
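The key step above, an empirical variogram built on a network (non-Euclidean) hydrologic distance rather than straight-line distance, can be sketched as follows. The toy network, abundance values, and lag bins are illustrative assumptions:

```python
from itertools import combinations

# hypothetical headwater network: child node -> (parent, segment length in m)
net = {"A": (None, 0.0), "B": ("A", 120.0), "C": ("B", 80.0),
       "D": ("B", 150.0), "E": ("A", 200.0)}

def path_to_root(node):
    """Nodes whose upstream segment lies on the path from node to the root."""
    out = []
    while net[node][0] is not None:
        out.append(node)
        node = net[node][0]
    return out

def network_distance(a, b):
    """Hydrologic distance: segments on exactly one root-path join a and b."""
    pa, pb = set(path_to_root(a)), set(path_to_root(b))
    return sum(net[n][1] for n in pa ^ pb)

def empirical_variogram(samples, bins):
    """Average semivariance per lag bin using the network distance metric."""
    sums, counts = [0.0] * len(bins), [0] * len(bins)
    for (na, va), (nb, vb) in combinations(samples, 2):
        h = network_distance(na, nb)
        for i, (lo, hi) in enumerate(bins):
            if lo <= h < hi:
                sums[i] += 0.5 * (va - vb) ** 2
                counts[i] += 1
    return [s / c if c else None for s, c in zip(sums, counts)]

# hypothetical relative-abundance samples at network nodes, with lag bins in m
samples = [("B", 3.0), ("C", 2.5), ("D", 4.0), ("E", 1.0)]
bins = [(0.0, 150.0), (150.0, 300.0), (300.0, 500.0)]
gamma = empirical_variogram(samples, bins)
```

An increasing semivariance with lag, as in this toy output, is the kind of non-random spatial pattern the study's non-parametric test is designed to detect.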

  1. Hierarchical probabilistic regionalization of volcanism for Sengan region in Japan using multivariate statistical techniques and geostatistical interpolation techniques

    SciTech Connect

    Park, Jinyong; Balasingham, P; McKenna, Sean Andrew; Pinnaduwa H.S.W. Kulatilake

    2004-09-01

    Sandia National Laboratories, under contract to Nuclear Waste Management Organization of Japan (NUMO), is performing research on regional classification of given sites in Japan with respect to potential volcanic disruption using multivariate statistics and geostatistical interpolation techniques. This report provides results obtained for hierarchical probabilistic regionalization of volcanism for the Sengan region in Japan by applying multivariate statistical techniques and geostatistical interpolation techniques on the geologic data provided by NUMO. A workshop report produced in September 2003 by Sandia National Laboratories (Arnold et al., 2003) on volcanism lists a set of most important geologic variables as well as some secondary information related to volcanism. Geologic data extracted for the Sengan region in Japan from the data provided by NUMO revealed that data are not available at the same locations for all the important geologic variables. In other words, the geologic variable vectors were found to be incomplete spatially. However, it is necessary to have complete geologic variable vectors to perform multivariate statistical analyses. As a first step towards constructing complete geologic variable vectors, the Universal Transverse Mercator (UTM) zone 54 projected coordinate system and a 1 km square regular grid system were selected. The data available for each geologic variable on a geographic coordinate system were transferred to the aforementioned grid system. Also the recorded data on volcanic activity for Sengan region were produced on the same grid system. Each geologic variable map was compared with the recorded volcanic activity map to determine the geologic variables that are most important for volcanism. In the regionalized classification procedure, this step is known as the variable selection step. 
The following variables were determined as most important for volcanism: geothermal gradient, groundwater temperature, heat discharge, groundwater

  2. Integration of GIS, Geostatistics, and 3-D Technology to Assess the Spatial Distribution of Soil Moisture

    NASA Technical Reports Server (NTRS)

    Betts, M.; Tsegaye, T.; Tadesse, W.; Coleman, T. L.; Fahsi, A.

    1998-01-01

    The spatial and temporal distribution of near surface soil moisture is of fundamental importance to many physical, biological, biogeochemical, and hydrological processes. However, knowledge of these space-time dynamics and the processes which control them remains unclear. The integration of geographic information systems (GIS) and geostatistics together promise a simple mechanism to evaluate and display the spatial and temporal distribution of this vital hydrologic and physical variable. Therefore, this research demonstrates the use of geostatistics and GIS to predict and display soil moisture distribution under vegetated and non-vegetated plots. The research was conducted at the Winfred Thomas Agricultural Experiment Station (WTAES), Hazel Green, Alabama. Soil moisture measurements were made on a 10 by 10 m grid from tall fescue grass (GR), alfalfa (AA), bare rough (BR), and bare smooth (BS) plots. Results indicated that variance associated with soil moisture was higher for vegetated plots than non-vegetated plots. The presence of vegetation in general contributed to the spatial variability of soil moisture. Integration of geostatistics and GIS can improve the productivity of farm lands and the precision of farming.

  3. Geostatistical estimation of signal-to-noise ratios for spectral vegetation indices

    USGS Publications Warehouse

    Ji, Lei; Zhang, Li; Rover, Jennifer R.; Wylie, Bruce K.; Chen, Xuexia

    2014-01-01

    In the past 40 years, many spectral vegetation indices have been developed to quantify vegetation biophysical parameters. An ideal vegetation index should contain the maximum level of signal related to specific biophysical characteristics and the minimum level of noise such as background soil influences and atmospheric effects. However, accurate quantification of signal and noise in a vegetation index remains a challenge, because it requires a large number of field measurements or laboratory experiments. In this study, we applied a geostatistical method to estimate signal-to-noise ratio (S/N) for spectral vegetation indices. Based on the sample semivariogram of vegetation index images, we used the standardized noise to quantify the noise component of vegetation indices. In a case study in the grasslands and shrublands of the western United States, we demonstrated the geostatistical method for evaluating S/N for a series of soil-adjusted vegetation indices derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor. The soil-adjusted vegetation indices were found to have higher S/N values than the traditional normalized difference vegetation index (NDVI) and simple ratio (SR) in the sparsely vegetated areas. This study shows that the proposed geostatistical analysis can constitute an efficient technique for estimating signal and noise components in vegetation indices.
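The nugget-based reasoning behind the method above can be illustrated on a synthetic one-dimensional transect: the lag-1 semivariance estimates the noise (nugget) variance, the total variance approximates the sill, and their ratio gives S/N. The signal shape and noise amplitude are assumptions for illustration only:

```python
import math
import random
import statistics

rng = random.Random(7)
n = 2000
# synthetic vegetation-index transect: a smooth spatial signal plus
# white sensor noise (amplitudes are illustrative, not MODIS values)
signal = [0.5 + 0.2 * math.sin(2 * math.pi * i / 400) for i in range(n)]
noise_sd = 0.02
z = [s + rng.gauss(0.0, noise_sd) for s in signal]

def semivariance(z, h):
    """Empirical semivariance of the transect at lag h."""
    pairs = len(z) - h
    return sum((z[i] - z[i + h]) ** 2 for i in range(pairs)) / (2.0 * pairs)

# lag-1 semivariance approximates the nugget (noise variance) when the
# underlying signal is smooth at the pixel scale; the total variance of
# the image approximates the sill, so (sill - nugget) / nugget gives S/N
nugget = semivariance(z, 1)
sill = statistics.pvariance(z)
snr = (sill - nugget) / nugget
```

For this transect the true noise variance is 0.02² = 4e-4 and the signal variance is 0.2²/2 = 0.02, so the estimated S/N should land near the true ratio of 50.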

  4. A conceptual sedimentological-geostatistical model of aquifer heterogeneity based on outcrop studies

    SciTech Connect

    Davis, J.M.

    1994-01-01

    Three outcrop studies were conducted in deposits of different depositional environments. At each site, permeability measurements were obtained with an air-minipermeameter developed as part of this study. In addition, the geological units were mapped with either surveying, photographs, or both. Geostatistical analysis of the permeability data was performed to estimate the characteristics of the probability distribution function and the spatial correlation structure. The information obtained from the geological mapping was then compared with the results of the geostatistical analysis for any relationships that may exist. The main field site was located in the Albuquerque Basin of central New Mexico at an outcrop of the Pliocene-Pleistocene Sierra Ladrones Formation. The second study was conducted on the walls of waste pits in alluvial fan deposits at the Nevada Test Site. The third study was conducted on an outcrop of an eolian deposit (miocene) south of Socorro, New Mexico. The results of the three studies were then used to construct a conceptual model relating depositional environment to geostatistical models of heterogeneity. The model presented is largely qualitative but provides a basis for further hypothesis formulation and testing.

  5. Geostatistical estimation of signal-to-noise ratios for spectral vegetation indices

    NASA Astrophysics Data System (ADS)

    Ji, Lei; Zhang, Li; Rover, Jennifer; Wylie, Bruce K.; Chen, Xuexia

    2014-10-01

    In the past 40 years, many spectral vegetation indices have been developed to quantify vegetation biophysical parameters. An ideal vegetation index should contain the maximum level of signal related to specific biophysical characteristics and the minimum level of noise such as background soil influences and atmospheric effects. However, accurate quantification of signal and noise in a vegetation index remains a challenge, because it requires a large number of field measurements or laboratory experiments. In this study, we applied a geostatistical method to estimate signal-to-noise ratio (S/N) for spectral vegetation indices. Based on the sample semivariogram of vegetation index images, we used the standardized noise to quantify the noise component of vegetation indices. In a case study in the grasslands and shrublands of the western United States, we demonstrated the geostatistical method for evaluating S/N for a series of soil-adjusted vegetation indices derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor. The soil-adjusted vegetation indices were found to have higher S/N values than the traditional normalized difference vegetation index (NDVI) and simple ratio (SR) in the sparsely vegetated areas. This study shows that the proposed geostatistical analysis can constitute an efficient technique for estimating signal and noise components in vegetation indices.

  6. Integration of GIS, Geostatistics, and 3-D Technology to Assess the Spatial Distribution of Soil Moisture

    NASA Technical Reports Server (NTRS)

    Betts, M.; Tsegaye, T.; Tadesse, W.; Coleman, T. L.; Fahsi, A.

    1998-01-01

    The spatial and temporal distribution of near surface soil moisture is of fundamental importance to many physical, biological, biogeochemical, and hydrological processes. However, knowledge of these space-time dynamics and the processes which control them remains unclear. The integration of geographic information systems (GIS) and geostatistics together promise a simple mechanism to evaluate and display the spatial and temporal distribution of this vital hydrologic and physical variable. Therefore, this research demonstrates the use of geostatistics and GIS to predict and display soil moisture distribution under vegetated and non-vegetated plots. The research was conducted at the Winfred Thomas Agricultural Experiment Station (WTAES), Hazel Green, Alabama. Soil moisture measurements were made on a 10 by 10 m grid from tall fescue grass (GR), alfalfa (AA), bare rough (BR), and bare smooth (BS) plots. Results indicated that variance associated with soil moisture was higher for vegetated plots than non-vegetated plots. The presence of vegetation in general contributed to the spatial variability of soil moisture. Integration of geostatistics and GIS can improve the productivity of farm lands and the precision of farming.

  7. A geostatistical methodology to assess the accuracy of unsaturated flow models

    SciTech Connect

    Smoot, J.L.; Williams, R.E.

    1996-04-01

    The Pacific Northwest National Laboratory (PNNL) has developed a Hydrologic Evaluation Methodology (HEM) to assist the U.S. Nuclear Regulatory Commission in evaluating the potential that infiltrating meteoric water will produce leachate at commercial low-level radioactive waste disposal sites. Two key issues are raised in the HEM: (1) evaluation of mathematical models that predict facility performance, and (2) estimation of the uncertainty associated with these mathematical model predictions. The technical objective of this research is to adapt geostatistical tools commonly used for model parameter estimation to the problem of estimating the spatial distribution of the dependent variable to be calculated by the model. To fulfill this objective, a database describing the spatiotemporal movement of water injected into unsaturated sediments at the Hanford Site in Washington State was used to develop a new method for evaluating mathematical model predictions. Measured water content data were interpolated geostatistically to a 16 x 16 x 36 grid at several time intervals. Then a mathematical model was used to predict water content at the same grid locations at the selected times. Node-by-node comparison of the mathematical model predictions with the geostatistically interpolated values was conducted. The method facilitates a complete accounting and categorization of model error at every node. The comparison suggests that model results generally are within measurement error. The worst model error occurs in silt lenses and is in excess of measurement error.
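The node-by-node accounting described in this abstract reduces to a simple per-node comparison of model predictions against geostatistically interpolated observations. The water-content values and the measurement-error threshold below are illustrative assumptions:

```python
# flattened grid of geostatistically interpolated observations vs. model
# predictions (volumetric water content); values are hypothetical
interpolated = [0.12, 0.15, 0.22, 0.31, 0.18, 0.25]
predicted    = [0.13, 0.14, 0.21, 0.36, 0.19, 0.24]
meas_err = 0.02  # assumed measurement-error tolerance

def categorize(obs, pred, tol):
    """Label each node: within measurement error, over-, or under-predicted."""
    cats = []
    for o, p in zip(obs, pred):
        err = p - o
        if abs(err) <= tol:
            cats.append("within")
        else:
            cats.append("over" if err > 0 else "under")
    return cats

cats = categorize(interpolated, predicted, meas_err)
```

Tallying the categories over all nodes gives the complete error accounting the method provides; isolated "over"/"under" clusters (the silt lenses in the study) stand out immediately.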

  8. Topsoil moisture mapping using geostatistical techniques under different Mediterranean climatic conditions.

    PubMed

    Martínez-Murillo, J F; Hueso-González, P; Ruiz-Sinoga, J D

    2017-10-01

    Soil mapping has been considered as an important factor in the widening of Soil Science and giving response to many different environmental questions. Geostatistical techniques, through kriging and co-kriging techniques, have made possible to improve the understanding of eco-geomorphologic variables, e.g., soil moisture. This study is focused on mapping of topsoil moisture using geostatistical techniques under different Mediterranean climatic conditions (humid, dry and semiarid) in three small watersheds and considering topography and soil properties as key factors. A Digital Elevation Model (DEM) with a resolution of 1 × 1 m was derived from a topographical survey, and soils were sampled to analyze the properties controlling topsoil moisture, which was measured over four years. Afterwards, some topography attributes were derived from the DEM, the soil properties were analyzed in the laboratory, and the topsoil moisture was modeled for the entire watersheds applying three geostatistical techniques: i) ordinary kriging; ii) co-kriging considering topography attributes as co-variates; and iii) co-kriging considering topography attributes and gravel content as co-variates. The results indicated topsoil moisture was more accurately mapped in the dry and semiarid watersheds when the co-kriging procedure was performed. The study is a contribution to improve the efficiency and accuracy of studies about the Mediterranean eco-geomorphologic system and soil hydrology in field conditions. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Bootstrapped models for intrinsic random functions

    SciTech Connect

    Campbell, K.

    1988-08-01

    Use of intrinsic random function stochastic models as a basis for estimation in geostatistical work requires the identification of the generalized covariance function of the underlying process. The fact that this function has to be estimated from data introduces an additional source of error into predictions based on the model. This paper develops the sample reuse procedure called the bootstrap in the context of intrinsic random functions to obtain realistic estimates of these errors. Simulation results support the conclusion that bootstrap distributions of functionals of the process, as well as their kriging variance, provide a reasonable picture of variability introduced by imperfect estimation of the generalized covariance function.
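
    The idea of bootstrapping the estimated spatial structure can be illustrated with a naive sketch: resample the observations with replacement and re-estimate a method-of-moments semivariance each time. Campbell's procedure for generalized covariances is more careful than this plain resampling, which ignores spatial dependence; the data here are synthetic.

```python
import numpy as np

def empirical_semivariance(x, z, lag, tol):
    """Method-of-moments semivariance for pairs within lag +/- tol."""
    d = np.abs(x[:, None] - x[None, :])
    mask = np.triu((d > lag - tol) & (d <= lag + tol), k=1)
    i, j = np.nonzero(mask)
    return 0.5 * np.mean((z[i] - z[j]) ** 2)

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 100, 80))
z = np.sin(x / 15.0) + 0.1 * rng.standard_normal(80)

# Bootstrap: resample (location, value) pairs and re-estimate, giving a
# distribution that reflects the error introduced by estimating the
# covariance/variogram from the data itself.
boot = []
for _ in range(200):
    idx = rng.integers(0, len(x), len(x))
    boot.append(empirical_semivariance(x[idx], z[idx], lag=10.0, tol=5.0))
boot = np.array(boot)
print(f"semivariance at lag 10: {boot.mean():.3f} +/- {boot.std():.3f}")
```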

  10. Bootstrapped models for intrinsic random functions

    SciTech Connect

    Campbell, K.

    1987-01-01

    The use of intrinsic random function stochastic models as a basis for estimation in geostatistical work requires the identification of the generalized covariance function of the underlying process, and the fact that this function has to be estimated from the data introduces an additional source of error into predictions based on the model. This paper develops the sample reuse procedure called the "bootstrap" in the context of intrinsic random functions to obtain realistic estimates of these errors. Simulation results support the conclusion that bootstrap distributions of functionals of the process, as well as of their "kriging variance," provide a reasonable picture of the variability introduced by imperfect estimation of the generalized covariance function.

  11. Adaptive goal setting and financial incentives: a 2 × 2 factorial randomized controlled trial to increase adults' physical activity.

    PubMed

    Adams, Marc A; Hurley, Jane C; Todd, Michael; Bhuiyan, Nishat; Jarrett, Catherine L; Tucker, Wesley J; Hollingshead, Kevin E; Angadi, Siddhartha S

    2017-03-29

    Emerging interventions that rely on and harness variability in behavior to adapt to individual performance over time may outperform interventions that prescribe static goals (e.g., 10,000 steps/day). The purpose of this factorial trial was to compare adaptive vs. static goal setting and immediate vs. delayed, non-contingent financial rewards for increasing free-living physical activity (PA). A 4-month 2 × 2 factorial randomized controlled trial tested main effects for goal setting (adaptive vs. static goals) and rewards (immediate vs. delayed) and interactions between factors to increase steps/day as measured by a Fitbit Zip. Moderate-to-vigorous PA (MVPA) minutes/day was examined as a secondary outcome. Participants (N = 96) were mainly female (77%), aged 41 ± 9.5 years, and all were insufficiently active and overweight/obese (mean BMI = 34.1 ± 6.2). Participants across all groups increased by 2389 steps/day on average from baseline to intervention phase (p < .001). Participants receiving static goals showed a stronger increase in steps/day from baseline phase to intervention phase (2630 steps/day) than those receiving adaptive goals (2149 steps/day; difference = 482 steps/day, p = .095). Participants receiving immediate rewards showed stronger improvement (2762 steps/day increase) from baseline to intervention phase than those receiving delayed rewards (2016 steps/day increase; difference = 746 steps/day, p = .009). However, the adaptive goals group showed a slower decrease in steps/day from the beginning of the intervention phase to the end of the intervention phase (i.e. less than half the rate) compared to the static goals group (-7.7 steps vs. -18.3 steps each day; difference = 10.7 steps/day, p < .001), resulting in better improvements for the adaptive goals group by study end. Rate of change over the intervention phase did not differ between reward groups. Significant goal phase x goal setting x reward interactions were
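
    One common adaptive-goal scheme of the kind the trial describes (setting the next goal to a percentile of the participant's recent performance, bounded by a floor and ceiling) can be sketched as below; the percentile, bounds, and step counts are illustrative assumptions, not necessarily the trial's exact rule.

```python
import numpy as np

def adaptive_goal(recent_steps, pct=60, floor=2000, ceiling=20000):
    """Next step goal = a percentile of recent performance, clipped to a
    floor/ceiling (all parameters here are illustrative)."""
    g = np.percentile(recent_steps, pct)
    return int(np.clip(g, floor, ceiling))

STATIC_GOAL = 10000  # the fixed comparator, e.g. 10,000 steps/day

history = [4200, 5100, 3800, 6000, 4900, 5300, 4500, 5800, 5000]
print(adaptive_goal(history))  # tracks the individual's own variability
print(STATIC_GOAL)
```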

  12. Quantifying aggregated uncertainty in Plasmodium falciparum malaria prevalence and populations at risk via efficient space-time geostatistical joint simulation.

    PubMed

    Gething, Peter W; Patil, Anand P; Hay, Simon I

    2010-04-01

    Risk maps estimating the spatial distribution of infectious diseases are required to guide public health policy from local to global scales. The advent of model-based geostatistics (MBG) has allowed these maps to be generated in a formal statistical framework, providing robust metrics of map uncertainty that enhances their utility for decision-makers. In many settings, decision-makers require spatially aggregated measures over large regions such as the mean prevalence within a country or administrative region, or national populations living under different levels of risk. Existing MBG mapping approaches provide suitable metrics of local uncertainty--the fidelity of predictions at each mapped pixel--but have not been adapted for measuring uncertainty over large areas, due largely to a series of fundamental computational constraints. Here the authors present a new efficient approximating algorithm that can generate for the first time the necessary joint simulation of prevalence values across the very large prediction spaces needed for global scale mapping. This new approach is implemented in conjunction with an established model for P. falciparum allowing robust estimates of mean prevalence at any specified level of spatial aggregation. The model is used to provide estimates of national populations at risk under three policy-relevant prevalence thresholds, along with accompanying model-based measures of uncertainty. By overcoming previously unchallenged computational barriers, this study illustrates how MBG approaches, already at the forefront of infectious disease mapping, can be extended to provide large-scale aggregate measures appropriate for decision-makers.
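
    The aggregation step can be sketched as follows: given joint realizations of pixel-level prevalence, each realization is summarized into a population-weighted mean prevalence and a population-at-risk count, and the spread across realizations gives the aggregate uncertainty. The realizations below are synthetic stand-ins for the conditional simulation, and the threshold is illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n_pix, n_real = 1000, 500
pop = rng.integers(100, 5000, n_pix)  # people per pixel (hypothetical)

# Stand-in for joint geostatistical simulation: correlated logit-prevalence
# fields. (MBG would condition these on survey data; here they are synthetic.)
base = rng.normal(-1.5, 0.5, n_pix)
real = 1 / (1 + np.exp(-(base[:, None] + rng.normal(0, 0.3, (n_pix, n_real)))))

# Aggregate each realization: population-weighted national prevalence and
# population living above a policy-relevant threshold (e.g. PfPR > 0.2).
nat_prev = (real * pop[:, None]).sum(0) / pop.sum()
pop_at_risk = ((real > 0.2) * pop[:, None]).sum(0)

lo, hi = np.percentile(nat_prev, [2.5, 97.5])
print(f"national prevalence: {nat_prev.mean():.3f} (95% CI {lo:.3f}-{hi:.3f})")
print(f"population at risk (median): {np.median(pop_at_risk):,.0f}")
```

    Summing within realizations before summarizing across them is what preserves the spatial correlation that pixel-wise (marginal) uncertainty maps discard.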

  13. Geostatistics and Geographic Information Systems to Study the Spatial Distribution of Grapholita molesta (Busck) (Lepidoptera: Tortricidae) in Peach Fields.

    PubMed

    Duarte, F; Calvo, M V; Borges, A; Scatoni, I B

    2015-08-01

    The oriental fruit moth, Grapholita molesta (Busck), is the most serious pest in peach, and several insecticide applications are required to reduce crop damage to acceptable levels. Geostatistics and Geographic Information Systems (GIS) are employed to measure the range of spatial correlation of G. molesta in order to define the optimum sampling distance for performing spatial analysis and to determine the current distribution of the pest in peach orchards of southern Uruguay. From 2007 to 2010, 135 pheromone traps per season were installed and georeferenced in peach orchards distributed over 50,000 ha. Male adult captures were recorded weekly from September to April. Structural analysis of the captures was performed, yielding 14 semivariograms for the accumulated captures analyzed by generation and growing season. Two sets of maps were constructed to describe the pest distribution. Nine significant models were obtained in the 14 evaluated periods. The range estimated for the correlation was from 908 to 6884 m. Three hot spots of high population level and some areas with comparatively low populations were constant over the 3-year period, while there is a greater variation in the size of the population in different generations and years in other areas.
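
    The range reported in these semivariograms is the distance beyond which captures are no longer spatially correlated; fitting a spherical model to an empirical semivariogram recovers it. The lag/semivariance values below are hypothetical, not the Uruguayan trap data.

```python
import numpy as np
from scipy.optimize import curve_fit

def spherical(h, nugget, sill, rng_):
    """Spherical variogram model; reaches the sill at h = rng_ (the range)."""
    h = np.asarray(h, float)
    g = nugget + (sill - nugget) * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3)
    return np.where(h < rng_, g, sill)

# Hypothetical empirical semivariances of trap captures at lag distances (m)
lags = np.array([500, 1000, 1500, 2000, 3000, 4000, 5000, 6000, 7000])
gamma = np.array([12.0, 21.0, 28.0, 33.0, 38.5, 41.0, 42.0, 42.5, 42.4])

p, _ = curve_fit(spherical, lags, gamma, p0=[5.0, 40.0, 4000.0])
nugget, sill, rng_ = p
print(f"fitted range ~ {rng_:.0f} m")  # distance of spatial correlation
```

    The fitted range is what defines the optimum sampling distance: traps spaced beyond it contribute statistically independent information.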

  14. Evaluating shrub-associated spatial patterns of soil properties in a shrub-steppe ecosystem using multiple-variable geostatistics

    SciTech Connect

    Halvorson, J.J.; Smith, J.L. |; Bolton, H. Jr.; Rossi, R.E.

    1995-09-01

    Geostatistics are often calculated for a single variable at a time, even though many natural phenomena are functions of several variables. The objective of this work was to demonstrate a nonparametric approach for assessing the spatial characteristics of multiple-variable phenomena. Specifically, we analyzed the spatial characteristics of resource islands in the soil under big sagebrush (Artemisia tridentata Nutt.), a dominant shrub in the intermountain western USA. For our example, we defined resource islands as a function of six soil variables representing concentrations of soil resources, populations of microorganisms, and soil microbial physiological variables. By collectively evaluating the indicator transformations of these individual variables, we created a new data set, termed a multiple-variable indicator transform or MVIT. Alternate MVITs were obtained by varying the selection criteria. Each MVIT was analyzed with variography to characterize spatial continuity, and with indicator kriging to predict the combined probability of their occurrence at unsampled locations in the landscape. Simple graphical analysis and variography demonstrated spatial dependence for all individual soil variables. Maps derived from ordinary kriging of MVITs suggested that the combined probabilities for encountering zones of above-median resources were greatest near big sagebrush. 51 refs., 5 figs., 1 tab.
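
    The indicator-transform step can be sketched as follows: each variable is coded 1/0 against a threshold (here its median), and an MVIT combines the codes under a selection criterion. The data and criteria below are illustrative, not the sagebrush measurements.

```python
import numpy as np

def mvit(data, thresholds, min_true=None):
    """Multiple-variable indicator transform: indicator-code each variable
    against its threshold, then combine. A sample is coded 1 when at least
    `min_true` variables exceed their thresholds (all of them, by default)."""
    ind = (data > thresholds).astype(int)  # per-variable 0/1 indicators
    k = data.shape[1] if min_true is None else min_true
    return (ind.sum(axis=1) >= k).astype(int)

rng = np.random.default_rng(3)
soil = rng.lognormal(size=(50, 6))  # 6 hypothetical soil variables, 50 sites
med = np.median(soil, axis=0)

combined = mvit(soil, med)              # strict: all six above their medians
relaxed = mvit(soil, med, min_true=4)   # an alternate selection criterion
print(combined.sum(), relaxed.sum())
```

    The resulting 0/1 field is itself a regionalized variable, so it can be passed to variography and indicator kriging exactly as described above.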

  15. Assessment of nitrate pollution in the Grand Morin aquifers (France): combined use of geostatistics and physically based modeling.

    PubMed

    Flipo, Nicolas; Jeannée, Nicolas; Poulin, Michel; Even, Stéphanie; Ledoux, Emmanuel

    2007-03-01

    The objective of this work is to combine several approaches to better understand nitrate fate in the Grand Morin aquifers (2700 km(2)), part of the Seine basin. CaWaQS results from the coupling of the hydrogeological model NEWSAM with the hydrodynamic and biogeochemical river model ProSe. CaWaQS is coupled with the agronomic model STICS in order to simulate nitrate migration in basins. First, kriging provides a satisfactory representation of aquifer nitrate contamination from local observations, used to set initial conditions for the physically based model. Then the associated confidence intervals, derived from the data using geostatistics, are used to validate the CaWaQS results. Results and their evaluation obtained from the combination of these approaches are given for the period 1977-1988. CaWaQS is then used to simulate nitrate fate over a 20-year period (1977-1996). The mean increase in nitrate concentration in the aquifers is 0.09 mgN L(-1) yr(-1), resulting from an average infiltration flux of 3500 kgN km(-2) yr(-1).

  16. Groundwater levels time series sensitivity to pluviometry and air temperature: a geostatistical approach to Sfax region, Tunisia.

    PubMed

    Triki, Ibtissem; Trabelsi, Nadia; Hentati, Imen; Zairi, Moncef

    2014-03-01

    In this paper, the pattern of groundwater level fluctuations is investigated by statistical techniques for 24 monitoring wells located in an unconfined coastal aquifer in Sfax (Tunisia) for a time period from 1997 to 2006. Firstly, a geostatistical study is performed to characterize the temporal behaviors of the data sets in terms of variograms and to make predictions about the value of the groundwater level at unsampled times. Secondly, multivariate statistical methods, i.e., principal component analysis (PCA) and cluster analysis (CA) of the time series of groundwater levels, are used to classify groundwater hydrographs according to their fluctuation patterns. Three groundwater groups (A, B, and C) were identified. In group "A," the water level decreases continuously throughout the study period with rapid annual cyclic variation, whereas in group "B," the water level contains much less high-frequency variation. The wells of group "C" represent a steady and gradual increase of groundwater levels caused by artificial recharge of the aquifer. Furthermore, a cross-correlation analysis is used to investigate the aquifer response to local rainfall and temperature records. The results revealed that temperature affects the variation of the groundwater level of group "A" wells more than rainfall does. However, the second and the third groups are less affected by rainfall or temperature.
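
    The PCA-plus-clustering workflow used to group the hydrographs can be sketched with synthetic wells mimicking the three reported behaviours (declining with strong seasonality, smoothly declining, and rising); all series and group sizes below are invented for illustration.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(4)
t = np.arange(120)  # e.g. monthly levels over 10 years

# Three hypothetical hydrograph behaviours (cf. groups A, B, C)
decline_seasonal = -0.02 * t + 0.5 * np.sin(2 * np.pi * t / 12)
decline_smooth = -0.02 * t
rise = 0.015 * t
wells = np.vstack([g + 0.1 * rng.standard_normal((8, t.size))
                   for g in (decline_seasonal, decline_smooth, rise)])  # 24 wells

# PCA via SVD on the centred hydrograph matrix
X = wells - wells.mean(axis=1, keepdims=True)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = U[:, :2] * S[:2]  # each well projected onto the first two PCs

# Hierarchical (Ward) clustering of the PC scores into three groups
labels = fcluster(linkage(scores, method="ward"), t=3, criterion="maxclust")
print(np.bincount(labels)[1:])  # wells per cluster
```

    Cross-correlating each cluster's mean hydrograph with rainfall and temperature series (e.g. via np.corrcoef at a set of lags) is then a natural follow-up, as in the study.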

  17. How well do randomized controlled trial data generalize to 'real-world' clinical practice settings? A comparison of two generalized anxiety disorder studies.

    PubMed

    Kasper, Siegfried; Brasser, Matthias; Schweizer, Edward; Lyndon, Gavin; Prieto, Rita

    2014-01-01

    The aim of this post-hoc comparison is to compare efficacy and tolerability results from two generalized anxiety disorder (GAD) studies: a placebo-controlled, randomized controlled trial (RCT) and a study conducted in the clinical practice setting, and to evaluate the extent to which results from RCTs in GAD patients can be generalized to clinical practice. In the clinical practice study, GAD outpatients (n=578) were treated with 4 weeks of pregabalin 150-600mg/day. In the double-blind placebo-controlled RCT, GAD outpatients (n=249) were randomized to 8 weeks of pregabalin (300-600mg/day), or placebo (only the first 4 weeks are included in the current analysis). Efficacy measures included the Hospital Anxiety and Depression Scale - Anxiety and Depression subscales (HADS-A; HADS-D), a visual analogue anxiety scale (VAS-Anxiety), and the Medical Outcomes Study Sleep Problems Index (MOS-SPI). Baseline HADS-A and HADS-D scores were both higher in the clinical practice study vs. the RCT. In the RCT, treatment with pregabalin resulted in significantly greater Week 4 change vs. placebo in the HADS-A (-5.3 vs. -3.9; P<0.005), VAS-Anxiety (-24.0 vs. -13.3; P<0.02), MOS-SPI (-19.1 vs. -9.5; P<0.01), and HADS-D (-2.7 vs. -1.4; P<0.05). The magnitude of Week 4 improvement on pregabalin in the clinical practice study was numerically larger on the HADS-A (-5.9), VAS-Anxiety (-36.0), MOS-SPI (-22.7), and HADS-D (-5.1), despite use of lower doses. These results suggest that clinical practice patients with GAD may achieve comparable efficacy on lower doses of pregabalin than tested in RCTs, despite having comparable levels of anxiety symptom severity at baseline. The current exploratory comparison also suggests that results from RCTs in patients with GAD may not be directly generalizable to clinical practice. © 2013 Published by Elsevier B.V. and ECNP.

  18. Adoption and maintenance of gym-based strength training in the community setting in adults with excess weight or type 2 diabetes: a randomized controlled trial.

    PubMed

    Teychenne, Megan; Ball, Kylie; Salmon, Jo; Daly, Robin M; Crawford, David A; Sethi, Parneet; Jorna, Michelle; Dunstan, David W

    2015-08-25

    Participant adoption and maintenance is a major challenge in strength training (ST) programs in the community-setting. In adults who were overweight or with type 2 diabetes (T2DM), the aim of this study was to compare the effectiveness of a standard ST program (SST) to an enhanced program (EST) on the adoption and maintenance of ST and cardio-metabolic risk factors and muscle strength. A 12-month cluster-randomized controlled trial consisting of a 6-month adoption phase followed by a 6-month maintenance phase. In 2008-2009, men and women aged 40-75 years (n = 318) with T2DM (n = 117) or a BMI >25 (n = 201) who had not participated in ST previously were randomized into either a SST or an EST program (which included additional motivationally-tailored behavioral counselling). Adoption and maintenance were defined as undertaking ≥ 3 weekly gym-based exercise sessions during the first 6-months and from 6-12 months respectively and were assessed using a modified version of the CHAMPS (Community Healthy Activity Models Program for Seniors) instrument. Relative to the SST group, the adjusted odds ratio (OR) of adopting ST for all participants in the EST group was 3.3 (95 % CI 1.2 to 9.4). In stratified analyses including only those with T2DM, relative to the SST group, the adjusted OR of adopting ST in the EST group was 8.2 (95 % CI 1.5-45.5). No significant between-group differences were observed for maintenance of ST in either pooled or stratified analyses. In those with T2DM, there was a significant reduction in HbA1c in the EST compared to SST group during the adoption phase (net difference, -0.13 % [-0.26 to -0.01]), which persisted after 12-months (-0.17 % [-0.3 to -0.05]). A behaviorally-focused community-based EST intervention was more effective than a SST program for the adoption of ST in adults with excess weight or T2DM and led to greater improvements in glycemic control in those with T2DM. Registered at ACTRN12611000695909 (Date registered

  19. Forced-use therapy for children with cerebral palsy in the community setting: a single-blinded randomized controlled pilot trial.

    PubMed

    Eugster-Buesch, Francisca; de Bruin, Eling D; Boltshauser, Eugen; Steinlin, Maja; Küenzle, Christoph; Müller, Elisabeth; Capone, Andrea; Pfann, Renat; Meyer-Heim, Andreas

    2012-01-01

    The aim of this study was to elucidate the feasibility, efficacy, and sustainability of a home-based, two-week, forced-use therapy (FUT) program for children with hemiplegic cerebral palsy (CP). A single-blinded, randomized controlled design was chosen. The Melbourne Assessment of Unilateral Upper Limb Function (MA) was carried out at baseline, pretest, post-test, and follow-up at two weeks, three months, and 12 months. Additionally, a questionnaire was used to evaluate the clinical relevance and integration of FUT in the home setting. 23 children, ages six to 16 years, took part in the study and were randomized into either an intervention group (n=12, mean age 9.8 ± 3.5 years) or a control group (n=11, mean age 11.7 ± 3.7 years). The intervention consisted of constraint of the unaffected hand for six hours per day and promotion of different activities of daily living according to an age-related manual for the use of the non-constrained hand. Unpaired t-tests for the change in MA scores relative to the pre-test values showed no difference between the groups at any time point: post-test (p=0.304), two weeks (p=0.193), or three months (p=0.957). Improvements in Activities of Daily Living (ADLs) assessed by questionnaires were observed by 64% of parents of the intervention group. Fifty-five percent of parents stated that the FUT program was exhausting and only 45% indicated that they achieved constraint for 6 hours per day. Our results evaluating a home-based FUT program of 14 days show no statistically significant improvement of upper extremity function in children with CP. The lack of compliance and absence of structured exercises proved to be considerable pitfalls of the home-based FUT program. Therefore, future home-based FUT concepts should put special emphasis on the close monitoring and support of children and their families, as well as the integration of structured exercise sessions.

  20. A Bayesian geostatistical approach for evaluating the uncertainty of contaminant mass discharges from point sources

    NASA Astrophysics Data System (ADS)

    Troldborg, M.; Nowak, W.; Binning, P. J.; Bjerg, P. L.

    2012-12-01

    Estimates of mass discharge (mass/time) are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Mass discharge estimates are, however, prone to rather large uncertainties as they integrate uncertain spatial distributions of both concentration and groundwater flow velocities. For risk assessments or any other decisions that are being based on mass discharge estimates, it is essential to address these uncertainties. We present a novel Bayesian geostatistical approach for quantifying the uncertainty of the mass discharge across a multilevel control plane. The method decouples the flow and transport simulation and has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is based on conditional geostatistical simulation and accounts for i) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics (including the uncertainty in covariance functions), ii) measurement uncertainty, and iii) uncertain source zone geometry and transport parameters. The method generates multiple equally likely realizations of the spatial flow and concentration distribution, which all honour the measured data at the control plane. The flow realizations are generated by analytical co-simulation of the hydraulic conductivity and the hydraulic gradient across the control plane. These realizations are made consistent with measurements of both hydraulic conductivity and head at the site. An analytical macro-dispersive transport solution is employed to simulate the mean concentration distribution across the control plane, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed
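
    The core mass-discharge computation can be sketched as a Monte Carlo sum over the control plane: each realization multiplies specific discharge (hydraulic conductivity times gradient) by concentration and cell area, and the ensemble of sums quantifies the uncertainty. All distributions below are illustrative stand-ins for the conditional geostatistical realizations.

```python
import numpy as np

rng = np.random.default_rng(5)
n_cells, n_real = 200, 1000
cell_area = 0.25  # m^2 per control-plane cell (hypothetical)

# Stand-ins for conditional realizations: specific discharge q = K * i
# (lognormal conductivity K, uncertain gradient i) and concentration C.
K = rng.lognormal(mean=np.log(1e-4), sigma=0.8, size=(n_real, n_cells))  # m/s
i = rng.normal(3e-3, 5e-4, size=(n_real, 1))                             # -
C = rng.lognormal(mean=np.log(0.05), sigma=1.0, size=(n_real, n_cells))  # kg/m^3

# Mass discharge per realization: sum of q * C * area over the plane (kg/s)
MD = (K * i * C * cell_area).sum(axis=1)

lo, med, hi = np.percentile(MD, [5, 50, 95])
print(f"mass discharge: median {med:.2e} kg/s (90% interval {lo:.2e}-{hi:.2e})")
```

    In the actual method the K, i, and C fields are co-simulated conditionally on the multilevel measurements, so each realization honours the data; the independent draws here only illustrate the aggregation and uncertainty summary.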

  1. Working with men to prevent intimate partner violence in a conflict-affected setting: a pilot cluster randomized controlled trial in rural Côte d’Ivoire

    PubMed Central

    2014-01-01

    Background Evidence from armed conflict settings points to high levels of intimate partner violence (IPV) against women. Current knowledge on how to prevent IPV is limited—especially within war-affected settings. To inform prevention programming on gender-based violence in settings affected by conflict, we evaluated the impact of adding a targeted men’s intervention to a community-based prevention programme in Côte d’Ivoire. Methods We conducted a two-armed, non-blinded cluster randomized trial in Côte d’Ivoire among 12 pair-matched communities spanning government-controlled, UN buffer, and rebel-controlled zones. The intervention communities received a 16-week IPV prevention intervention using a men’s discussion group format. All communities received community-based prevention programmes. Baseline data were collected from couples in September 2010 (pre-intervention) and follow-up in March 2012 (one year post-intervention). The primary trial outcome was women’s reported experiences of physical and/or sexual IPV in the last 12 months. We also assessed men’s reported intention to use physical IPV, attitudes towards sexual IPV, use of hostility and conflict management skills, and participation in gendered household tasks. An adjusted cluster-level intention to treat analysis was used to compare outcomes between intervention and control communities at follow-up. Results At follow-up, reported levels of physical and/or sexual IPV in the intervention arm had decreased compared to the control arm (ARR 0.52, 95% CI 0.18-1.51, not significant). Men participating in the intervention reported decreased intentions to use physical IPV (ARR 0.83, 95% CI 0.66-1.06) and improved attitudes toward sexual IPV (ARR 1.21, 95% CI 0.77-1.91). Significant differences were found between men in the intervention and control arms’ reported ability to control their hostility and manage conflict (ARR 1.3, 95% CI 1.06-1.58), and participation in gendered household tasks (ARR

  2. A multicenter randomized controlled trial of a plant-based nutrition program to reduce body weight and cardiovascular risk in the corporate setting: the GEICO study.

    PubMed

    Mishra, S; Xu, J; Agarwal, U; Gonzales, J; Levin, S; Barnard, N D

    2013-07-01

    To determine the effects of a low-fat plant-based diet program on anthropometric and biochemical measures in a multicenter corporate setting. Employees from 10 sites of a major US company with body mass index ≥ 25 kg/m(2) and/or previous diagnosis of type 2 diabetes were randomized to either follow a low-fat vegan diet, with weekly group support and work cafeteria options available, or make no diet changes for 18 weeks. Dietary intake, body weight, plasma lipid concentrations, blood pressure and glycated hemoglobin (HbA1C) were determined at baseline and 18 weeks. Mean body weight fell 2.9 kg and 0.06 kg in the intervention and control groups, respectively (P<0.001). Total and low-density lipoprotein (LDL) cholesterol fell 8.0 and 8.1 mg/dl in the intervention group and 0.01 and 0.9 mg/dl in the control group (P<0.01). HbA1C fell 0.6 percentage point and 0.08 percentage point in the intervention and control group, respectively (P<0.01). Among study completers, mean changes in body weight were -4.3 kg and -0.08 kg in the intervention and control groups, respectively (P<0.001). Total and LDL cholesterol fell 13.7 and 13.0 mg/dl in the intervention group and 1.3 and 1.7 mg/dl in the control group (P<0.001). HbA1C levels decreased 0.7 percentage point and 0.1 percentage point in the intervention and control group, respectively (P<0.01). An 18-week dietary intervention using a low-fat plant-based diet in a corporate setting improves body weight, plasma lipids, and, in individuals with diabetes, glycemic control.

  3. A multicenter randomized controlled trial of a plant-based nutrition program to reduce body weight and cardiovascular risk in the corporate setting: the GEICO study

    PubMed Central

    Mishra, S; Xu, J; Agarwal, U; Gonzales, J; Levin, S; Barnard, N D

    2013-01-01

    Background/objectives: To determine the effects of a low-fat plant-based diet program on anthropometric and biochemical measures in a multicenter corporate setting. Subjects/methods: Employees from 10 sites of a major US company with body mass index ⩾25 kg/m2 and/or previous diagnosis of type 2 diabetes were randomized to either follow a low-fat vegan diet, with weekly group support and work cafeteria options available, or make no diet changes for 18 weeks. Dietary intake, body weight, plasma lipid concentrations, blood pressure and glycated hemoglobin (HbA1C) were determined at baseline and 18 weeks. Results: Mean body weight fell 2.9 kg and 0.06 kg in the intervention and control groups, respectively (P<0.001). Total and low-density lipoprotein (LDL) cholesterol fell 8.0 and 8.1 mg/dl in the intervention group and 0.01 and 0.9 mg/dl in the control group (P<0.01). HbA1C fell 0.6 percentage point and 0.08 percentage point in the intervention and control group, respectively (P<0.01). Among study completers, mean changes in body weight were −4.3 kg and −0.08 kg in the intervention and control groups, respectively (P<0.001). Total and LDL cholesterol fell 13.7 and 13.0 mg/dl in the intervention group and 1.3 and 1.7 mg/dl in the control group (P<0.001). HbA1C levels decreased 0.7 percentage point and 0.1 percentage point in the intervention and control group, respectively (P<0.01). Conclusions: An 18-week dietary intervention using a low-fat plant-based diet in a corporate setting improves body weight, plasma lipids, and, in individuals with diabetes, glycemic control. PMID:23695207

  4. Impact of physical activity on energy balance, food intake and choice in normal weight and obese children in the setting of acute social stress: a randomized controlled trial.

    PubMed

    Horsch, Antje; Wobmann, Marion; Kriemler, Susi; Munsch, Simone; Borloz, Sylvie; Balz, Alexandra; Marques-Vidal, Pedro; Borghini, Ayala; Puder, Jardena J

    2015-02-19

    Psychological stress negatively influences food intake and food choices, thereby contributing to the development of childhood obesity. Physical activity can also moderate eating behavior and influence calorie intake. However, it is unknown if acute physical activity influences food intake and overall energy balance after acute stress exposure in children. We therefore investigated the impact of acute physical activity on overall energy balance (food intake minus energy expenditure), food intake, and choice in the setting of acute social stress in normal weight (NW) and overweight/obese (OW/OB) children as well as the impact of psychological risk factors. After receiving written consent from their parents, 26 NW (BMI < 90th percentile) and 24 OW (n = 5)/OB (n = 19; BMI ≥ 90th percentile) children aged 7 to 11 years were randomly allocated using computer-generated numbers (1:1, after stratification for weight status) to acute moderate physical or to sedentary activity for 30 min. Afterwards, all children were exposed to an acute social stressor. Children and their parents completed self-report questionnaires. At the end of the stressor, children were allowed to eat freely from a range of 12 different foods (6 sweet/6 salty; each of low/high caloric density). Energy balance, food intake/choice and obesity-related psychological risk factors were assessed. Lower overall energy balance (p = 0.019) and a decreased choice of low density salty foods (p < 0.001) in NW children compared with OW/OB children was found after acute moderate physical activity but not sedentary activity. Independent of their allocation, OW/OB children ate more high density salty foods (104 kcal (34 to 173), p = 0.004) following stress. They scored higher on impulsive behavior (p = 0.005), restrained eating (p < 0.001) and parental corporal punishment (p = 0.03), but these psychological factors were not related to stress-induced food intake/choice. Positive parenting tended to be related to

  5. An alcohol-focused intervention versus a healthy living intervention for problem drinkers identified in a general hospital setting (ADAPTA): study protocol for a randomized, controlled pilot trial

    PubMed Central

    2013-01-01

    Background Alcohol misuse is a major cause of premature mortality and ill health. Although there is a high prevalence of alcohol problems among patients presenting to general hospital, many of these people are not help seekers and do not engage in specialist treatment. Hospital admission is an opportunity to steer people towards specialist treatment, which can reduce health-care utilization and costs to the public sector and produce substantial individual health and social benefits. Alcohol misuse is associated with other lifestyle problems, which are amenable to intervention. It has been suggested that the development of a healthy or balanced lifestyle is potentially beneficial for reducing or abstaining from alcohol use, and relapse prevention. The aim of the study is to test whether or not the offer of a choice of health-related lifestyle interventions is more acceptable, and therefore able to engage more problem drinkers in treatment, than an alcohol-focused intervention. Methods/design This is a pragmatic, randomized, controlled, open pilot study in a UK general hospital setting with concurrent economic evaluation and a qualitative component. Potential participants are those admitted to hospital with a diagnosis likely to be responsive to addiction interventions who score equal to or more than 16 on the Alcohol Use Disorders Identification Test (AUDIT). The main purpose of this pilot study is to evaluate the acceptability of two sorts of interventions (healthy living related versus alcohol focused) to the participants and to assess the components and processes of the design. Qualitative research will be undertaken to explore acceptability and the impact of the approach, assessment, recruitment and intervention on trial participants and non-participants. The effectiveness of the two treatments will be compared at 6 months using AUDIT scores as the primary outcome measure. There will be additional economic, qualitative and secondary outcome measurements

  6. Prospective Randomized Double-Blind Pilot Study of Site-Specific Consensus Atlas Implementation for Rectal Cancer Target Volume Delineation in the Cooperative Group Setting

    SciTech Connect

    Fuller, Clifton D.; Nijkamp, Jasper; Duppen, Joop C.; Rasch, Coen R.N.; Thomas, Charles R.; Wang, Samuel J.; Okunieff, Paul; Jones, William E.; Baseman, Daniel; Patel, Shilpen; Demandante, Carlo G.N.; Harris, Anna M.; Smith, Benjamin D.; Katz, Alan W.; McGann, Camille

    2011-02-01

    Purpose: Variations in target volume delineation represent a significant hurdle in clinical trials involving conformal radiotherapy. We sought to determine the effect of a consensus guideline-based visual atlas on contouring of the target volumes. Methods and Materials: A representative case was contoured (Scan 1) by 14 physician observers and a reference expert, with and without target volume delineation instructions derived from a proposed rectal cancer clinical trial involving conformal radiotherapy. The gross tumor volume (GTV) and two clinical target volumes (CTVA, including the internal iliac, presacral, and perirectal nodes, and CTVB, which included the external iliac nodes) were contoured. The observers were randomly assigned to receipt (Group A) or nonreceipt (Group B) of a consensus guideline and atlas for anorectal cancers and then instructed to recontour the same case/images (Scan 2). Observer variation was analyzed volumetrically using the conformation number (CN, where CN = 1 equals total agreement). Results: Of 14 evaluable contour sets (1 expert, 7 Group A, and 6 Group B observers), greater agreement was found for the GTV (mean CN, 0.75) than for the CTVs (mean CN, 0.46-0.65). Atlas exposure for Group A led to significantly increased interobserver agreement for CTVA (mean initial CN, 0.68; after atlas use, 0.76; p = .03) and increased agreement with the expert reference (initial mean CN, 0.58; after atlas use, 0.69; p = .02). For the GTV and CTVB, neither the interobserver nor the expert agreement was altered after atlas exposure. Conclusion: Consensus guideline atlas implementation resulted in a detectable difference in interobserver agreement and a greater approximation of expert volumes for the CTVA, but not for the GTV or CTVB, in the specified case. Inclusion of a visual atlas should be considered as a feature in future clinical trials incorporating conformal radiotherapy.
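The conformation number used in this record can be illustrated with a short sketch. For two delineations A and B, a common pairwise form is CN = |A ∩ B|² / (|A| · |B|), which equals 1 only when the contours coincide exactly. The masks and grid below are hypothetical toy data, not taken from the study:

```python
import numpy as np

def conformation_number(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Pairwise conformation number for two binary delineation masks.

    CN = |A ∩ B|**2 / (|A| * |B|); CN = 1 means total agreement.
    """
    inter = np.logical_and(mask_a, mask_b).sum()
    va, vb = mask_a.sum(), mask_b.sum()
    if va == 0 or vb == 0:
        return 0.0
    return float(inter) ** 2 / (float(va) * float(vb))

# Toy 1-D "volumes": voxels 2-7 vs voxels 4-9 on a 12-voxel grid
a = np.zeros(12, dtype=bool); a[2:8] = True
b = np.zeros(12, dtype=bool); b[4:10] = True
print(round(conformation_number(a, b), 3))  # overlap of 4 voxels -> 16/36 = 0.444
```

The same idea extends to multiple observers by averaging pairwise CN values, which is one way mean CN figures like those above can be produced.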

  7. Randomized controlled trial of meat compared with multimicronutrient-fortified cereal in infants and toddlers with high stunting rates in diverse settings.

    PubMed

    Krebs, Nancy F; Mazariegos, Manolo; Chomba, Elwyn; Sami, Neelofar; Pasha, Omrana; Tshefu, Antoinette; Carlo, Waldemar A; Goldenberg, Robert L; Bose, Carl L; Wright, Linda L; Koso-Thomas, Marion; Goco, Norman; Kindem, Mark; McClure, Elizabeth M; Westcott, Jamie; Garces, Ana; Lokangaka, Adrien; Manasyan, Albert; Imenda, Edna; Hartwell, Tyler D; Hambidge, K Michael

    2012-10-01

    Improved complementary feeding is cited as a critical factor for reducing stunting. Consumption of meats has been advocated, but its efficacy in low-resource settings has not been tested. The objective was to test the hypothesis that daily intake of 30 to 45 g meat from 6 to 18 mo of age would result in greater linear growth velocity and improved micronutrient status in comparison with an equicaloric multimicronutrient-fortified cereal. This was a cluster randomized efficacy trial conducted in the Democratic Republic of Congo, Zambia, Guatemala, and Pakistan. Individual daily portions of study foods and education messages to enhance complementary feeding were delivered to participants. Blood tests were obtained at trial completion. A total of 532 (86.1%) and 530 (85.8%) participants from the meat and cereal arms, respectively, completed the study. Linear growth velocity did not differ between treatment groups: 1.00 (95% CI: 0.99, 1.02) and 1.02 (95% CI: 1.00, 1.04) cm/mo for the meat and cereal groups, respectively (P = 0.39). From baseline to 18 mo, stunting [length-for-age z score (LAZ) <-2.0] rates increased from ~33% to nearly 50%. Years of maternal education and maternal height were positively associated with linear growth velocity (P = 0.0006 and 0.003, respectively); LAZ at 6 mo was negatively associated (P < 0.0001). Anemia rates did not differ by group; iron deficiency was significantly lower in the cereal group. The high rate of stunting at baseline and the lack of effect of either the meat or multiple micronutrient-fortified cereal intervention to reverse its progression argue for multifaceted interventions beginning in the pre- and early postnatal periods.

  8. Effects of a simple home-based exercise program on fall prevention in older adults: A 12-month primary care setting, randomized controlled trial.

    PubMed

    Boongird, Chitima; Keesukphan, Prasit; Phiphadthakusolkul, Soontraporn; Rattanasiri, Sasivimol; Thakkinstian, Ammarin

    2017-04-24

    To investigate the effects of a simple home-based exercise program on falls, physical functioning, fear of falling and quality of life in a primary care setting. Participants (n = 439), aged ≥65 years with mild-to-moderate balance dysfunction, were randomly assigned to an exercise (n = 219) or control (n = 220) group. The program consisted of five combined exercises, which progressed in difficulty, and a walking plan. Controls received fall prevention education. Physical functioning and other outcomes were measured at 3- and 6-month follow-up visits. Falls were monitored with fall diaries and phone interviews at 3, 6, 9 and 12 months. Over the 12 months of the home-based exercise program, the incidence of falls was 0.30 falls per person-year in the exercise group, compared with 0.40 in the control group. The estimated incidence rate ratio was 0.75 (95% CI 0.55-1.04), which was not statistically significant. Fear of falling (measured by the Thai fall efficacy scale) was significantly lower in the exercise group than in the control group (24.7 vs 27.0, P = 0.003). Program adherence in the exercise group also trended upward (29.6% to 56.8%). This simple home-based exercise program showed a reduction in fear of falling and a positive trend towards exercise adherence. Further studies should focus on factors associated with exercise adherence and the benefits of increased home visits, and should follow participants longer in order to evaluate the effects of the program. Geriatr Gerontol Int 2017; ••: ••-••. © 2017 Japan Geriatrics Society.
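The incidence rate ratio reported in this record is simply the ratio of the two fall rates, and its confidence interval is conventionally computed on the log scale. The event counts and person-years below are hypothetical values chosen only to be consistent with the reported rates (0.30 vs 0.40 falls per person-year); the study's actual denominators are not given in the abstract:

```python
import math

def irr_with_ci(events_a, py_a, events_b, py_b, z=1.96):
    """Incidence rate ratio (group A vs group B) with a Wald 95% CI
    computed on the log scale."""
    irr = (events_a / py_a) / (events_b / py_b)
    se_log = math.sqrt(1 / events_a + 1 / events_b)
    lo = math.exp(math.log(irr) - z * se_log)
    hi = math.exp(math.log(irr) + z * se_log)
    return irr, lo, hi

# Hypothetical counts consistent with rates of 0.30 and 0.40 falls/person-year
irr, lo, hi = irr_with_ci(events_a=60, py_a=200, events_b=80, py_b=200)
print(f"IRR = {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # IRR = 0.75
```

Because the interval straddles 1, the rate reduction is not statistically significant, matching the abstract's interpretation.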

  9. A random set scoring model for prioritization of disease candidate genes using protein complexes and data-mining of GeneRIF, OMIM and PubMed records.

    PubMed

    Jiang, Li; Edwards, Stefan M; Thomsen, Bo; Workman, Christopher T; Guldbrandtsen, Bernt; Sørensen, Peter

    2014-09-24

    Prioritizing genetic variants is a challenge because disease susceptibility loci are often located in genes of unknown function or the relationship with the corresponding phenotype is unclear. A global data-mining exercise on the biomedical literature can establish the phenotypic profile of genes with respect to their connection to disease phenotypes. The importance of protein-protein interaction networks in the genetic heterogeneity of common diseases or complex traits is becoming increasingly recognized. Thus, the development of a network-based approach combined with phenotypic profiling would be useful for disease gene prioritization. We developed a random-set scoring model and implemented it to quantify phenotype relevance in a network-based disease gene-prioritization approach. We validated our approach based on different gene phenotypic profiles, which were generated from PubMed abstracts, OMIM, and GeneRIF records. We also investigated the validity of several vocabulary filters and different likelihood thresholds for predicted protein-protein interactions in terms of their effect on the network-based gene-prioritization approach, which relies on text-mining of the phenotype data. Our method demonstrated good precision and sensitivity compared with those of two alternative complex-based prioritization approaches. We then conducted a global ranking of all human genes according to their relevance to a range of human diseases. The resulting accurate ranking of known causal genes supported the reliability of our approach. Moreover, these data suggest many promising novel candidate genes for human disorders that have a complex mode of inheritance. We have implemented and validated a network-based approach to prioritize genes for human diseases based on their phenotypic profile. We have devised a powerful and transparent tool to identify and rank candidate genes. Our global gene prioritization provides a unique resource for the biological interpretation of data
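One common formulation of random-set scoring, which this record's approach resembles, compares the mean phenotype-relevance score of a gene set (e.g. a protein complex) against the distribution of means over random sets of the same size. The sketch below is a generic illustration under that assumption, with entirely hypothetical relevance scores; the paper's exact scoring model differs in its details:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_set_score(member_scores, all_scores, n_perm=2000):
    """Z-score of a gene set's mean relevance against the empirical
    distribution of means of random gene sets of the same size."""
    member_scores = np.asarray(member_scores, dtype=float)
    all_scores = np.asarray(all_scores, dtype=float)
    k = member_scores.size
    perm_means = np.array([rng.choice(all_scores, k, replace=False).mean()
                           for _ in range(n_perm)])
    return (member_scores.mean() - perm_means.mean()) / perm_means.std(ddof=1)

# Hypothetical phenotype-relevance scores: one complex strongly enriched
background = rng.normal(0.0, 1.0, 1000)
complex_scores = background[:5] + 3.0   # shift five genes well above background
print(random_set_score(complex_scores, background))
```

Ranking all candidate sets by such a score gives a prioritization list of the kind described above.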

  10. Optimization of ventilator setting by flow and pressure waveforms analysis during noninvasive ventilation for acute exacerbations of COPD: a multicentric randomized controlled trial

    PubMed Central

    2011-01-01

    Introduction The analysis of flow and pressure waveforms generated by ventilators can be useful in the optimization of patient-ventilator interactions, notably in chronic obstructive pulmonary disease (COPD) patients. To date, however, a real clinical benefit of this approach has not been proven. Methods The aim of the present randomized, multi-centric, controlled study was to compare optimized ventilation, driven by the analysis of flow and pressure waveforms, to standard ventilation (same physician, same initial ventilator setting, same time spent at the bedside while the ventilator screen was obscured with numerical data always available). The primary aim was the rate of pH normalization at two hours, while secondary aims were changes in PaCO2, respiratory rate and the patient's tolerance to ventilation (all parameters evaluated at baseline, 30, 120, 360 minutes and 24 hours after the beginning of ventilation). Seventy patients (35 for each group) with acute exacerbation of COPD were enrolled. Results Optimized ventilation led to a more rapid normalization of pH at two hours (51 vs. 26% of patients), to a significant improvement of the patient's tolerance to ventilation at two hours, and to a higher decrease of PaCO2 at two and six hours. Optimized ventilation induced physicians to use higher levels of external positive end-expiratory pressure, more sensitive inspiratory triggers and a faster speed of pressurization. Conclusions The analysis of the waveforms generated by ventilators has a significant positive effect on physiological and patient-centered outcomes during acute exacerbation of COPD. The acquisition of specific skills in this field should be encouraged. Trial registration ClinicalTrials.gov NCT01291303. PMID:22115190

  11. Hand-held echocardiography in the setting of pre-operative cardiac evaluation of patients undergoing non-cardiac surgery: results from a randomized pilot study.

    PubMed

    Cavallari, Ilaria; Mega, Simona; Goffredo, Costanza; Patti, Giuseppe; Chello, Massimo; Di Sciascio, Germano

    2015-06-01

    Transthoracic echocardiography is not a routine test in the pre-operative cardiac evaluation of patients undergoing non-cardiac surgery, but may be considered in those with known heart failure or valvular heart disease, or complaining of cardiac symptoms. In this setting, hand-held echocardiography (HHE) could find a potential application as an alternative to standard echocardiography in selected patients; however, its utility in this context has not been investigated. The aim of this pilot study was to evaluate the conclusiveness of HHE compared with standard echocardiography in this subset of patients. 100 patients scheduled for non-cardiac surgery were randomized to receive a standard exam with a Philips iE33 or a bedside evaluation with a pocket-size imaging device (Opti-Go, Philips Medical System). The primary endpoint was the percentage of satisfactory diagnoses at the end of the examination, referred to as conclusiveness. Secondary endpoints were the mean duration time and the mean waiting time to perform the exams. No significant difference in conclusiveness between HHE and standard echo was found (86 vs 96%; P = 0.08). Mean examination time was 6.1 ± 1.2 min with HHE and 13.1 ± 2.6 min with standard echocardiography (P < 0.001). HHE resulted in a considerable saving of waiting time because it was performed on the same day as the clinical evaluation, whereas patients waited 10.1 ± 6.1 days for a standard echocardiography (P < 0.001). This study suggests a potential role for HHE in the pre-operative evaluation of selected patients undergoing non-cardiac surgery, since it provided similar information but was performed faster and earlier than standard echocardiography.

  12. [Spatial variability of soil nutrients based on geostatistics combined with GIS--a case study in Zunghua City of Hebei Province].

    PubMed

    Guo, X; Fu, B; Ma, K; Chen, L

    2000-08-01

    Geostatistics combined with GIS was applied to analyze the spatial variability of soil nutrients in topsoil (0-20 cm) in Zunghua City of Hebei Province. GIS can integrate attribute data with geographical data of system variables, which makes the application of geostatistical techniques at large spatial scales more convenient. Soil nutrient data in this study included available N (alkaline hydrolyzing nitrogen), total N, available K, available P and organic matter. The results showed that the semivariograms of the soil nutrients were best described by a spherical model, except for that of available K, which was best fitted by a combined structure of an exponential model and a linear-with-sill model. The spatial variability of available K was mainly produced by structural factors, while that of available N, total N, available P and organic matter was primarily caused by random factors. However, their degrees of spatial heterogeneity differed: those of total N and organic matter were higher, and those of available P and available N were lower. The results also indicated that the spatial correlation of the five tested soil nutrients at this large scale was moderately dependent. The ranges of available N and available P were almost the same, at 5 km and 5.5 km, respectively. The range of total N was up to 18 km, and that of organic matter was 8.5 km. For available K, the spatial variability primarily followed an exponential model between 0 and 3.5 km, but a linear-with-sill model between 3.5 and 25.5 km. In addition, the five soil nutrients exhibited different isotropic ranges. Available N and available P were isotropic through the whole research range (0-28 km). The isotropic range of available K was 0-8 km, and that of total N and organic matter was 0-10 km.
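The spherical semivariogram model that fitted most nutrients in this record has the standard closed form γ(h) = c₀ + (c − c₀)(1.5·h/a − 0.5·(h/a)³) for h < a, flattening at the sill c for h ≥ a. A minimal sketch, with hypothetical nugget and sill values (only the 18 km range for total N comes from the abstract):

```python
import numpy as np

def spherical_variogram(h, nugget, sill, range_):
    """Spherical semivariogram: rises from the nugget and flattens at the
    sill once lag h reaches the range (the distance beyond which samples
    are spatially uncorrelated)."""
    h = np.asarray(h, dtype=float)
    hr = np.minimum(h / range_, 1.0)          # clamp so h >= range gives the sill
    return nugget + (sill - nugget) * (1.5 * hr - 0.5 * hr**3)

# Hypothetical parameters for total N (range ~18 km, as in the abstract)
lags = np.array([0.0, 4.5, 9.0, 18.0, 25.0])
print(spherical_variogram(lags, nugget=0.1, sill=0.5, range_=18.0))
```

In practice such a model is fitted to an empirical semivariogram of the soil samples and then used for kriging interpolation.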

  13. Analysis of the spatio-temporal distribution of Eurygaster integriceps (Hemiptera: Scutelleridae) by using spatial analysis by distance indices and geostatistics.

    PubMed

    Karimzadeh, R; Hejazi, M J; Helali, H; Iranipour, S; Mohammadi, S A

    2011-10-01

    Eurygaster integriceps Puton (Hemiptera: Scutelleridae) is the most serious insect pest of wheat (Triticum aestivum L.) and barley (Hordeum vulgare L.) in Iran. In this study, the spatio-temporal distribution of this pest in wheat was determined by using spatial analysis by distance indices (SADIE) and geostatistics. Global positioning and geographic information systems were used for spatial sampling and for mapping the distribution of this insect. The study was conducted over three growing seasons in Gharamalek, an agricultural region to the west of Tabriz, Iran. Weekly sampling began when E. integriceps adults migrated to wheat fields from overwintering sites and ended when the new-generation adults appeared at the end of the season. The adults were sampled using 1- by 1-m quadrat and distance-walk methods. A sweep net was used for sampling the nymphs, with five 180° sweeps taken as the sampling unit. The results of spatial analyses using geostatistics and SADIE indicated that E. integriceps adults were clumped after migration to the fields and had significant spatial dependency. The second- and third-instar nymphs showed an aggregated spatial structure in the middle of the growing season. At the end of the season, the population distribution changed toward random or regular patterns, and the fourth and fifth instars had weaker spatial structure than the younger nymphs. In Iran, management measures for E. integriceps in wheat fields are mainly applied against overwintering adults, as well as second and third instars. Because of the aggregated distribution of these life stages, site-specific spraying of chemicals is feasible in managing E. integriceps.
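The aggregated/random/regular distinction drawn in this record can be illustrated with the classical variance-to-mean ratio for quadrat counts. This is only a simple proxy for the analysis described, not SADIE itself (SADIE is based on distances to regularity of the count arrangement), and the quadrat counts below are hypothetical:

```python
import numpy as np

def dispersion_index(counts):
    """Variance-to-mean ratio for quadrat counts: values > 1 suggest an
    aggregated (clumped) pattern, ~1 a random (Poisson) pattern, and < 1
    a regular pattern."""
    counts = np.asarray(counts, dtype=float)
    return counts.var(ddof=1) / counts.mean()

clumped = [0, 0, 12, 1, 0, 9, 0, 14, 0, 1]   # hypothetical quadrat counts
regular = [3, 4, 3, 4, 3, 4, 3, 4, 3, 4]
print(dispersion_index(clumped) > 1, dispersion_index(regular) < 1)  # True True
```

A shift of this index toward 1 over the season would mirror the reported late-season drift from aggregated toward random or regular patterns.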

  14. Predicting water quality impaired stream segments using landscape-scale data and a regional geostatistical model: a case study in Maryland.

    PubMed

    Peterson, Erin E; Urquhart, N Scott

    2006-10-01

    In the United States, probability-based water quality surveys are typically used to meet the requirements of Section 305(b) of the Clean Water Act. The survey design allows an inference to be generated concerning regional stream condition, but it cannot be used to identify water quality impaired stream segments. Therefore, a rapid and cost-efficient method is needed to locate potentially impaired stream segments throughout large areas. We fit a set of geostatistical models to 312 samples of dissolved organic carbon (DOC) collected in 1996 for the Maryland Biological Stream Survey using coarse-scale watershed characteristics. The models were developed using two distance measures, straight-line distance (SLD) and weighted asymmetric hydrologic distance (WAHD). We used the Corrected Spatial Akaike Information Criterion and the mean square prediction er