Sample records for random effects approach

  1. A comparison of methods for estimating the random effects distribution of a linear mixed model.

    PubMed

    Ghidey, Wendimagegn; Lesaffre, Emmanuel; Verbeke, Geert

    2010-12-01

This article reviews several recently suggested approaches to estimating the random effects distribution in a linear mixed model, namely (1) the smoothing-by-roughening approach of Shen and Louis,(1) (2) the semi-non-parametric approach of Zhang and Davidian,(2) (3) the heterogeneity model of Verbeke and Lesaffre,(3) and (4) the flexible approach of Ghidey et al.(4) These four approaches are compared via an extensive simulation study. We conclude that, for the cases considered, the approach of Ghidey et al.(4) often has the smallest integrated mean squared error for estimating the random effects distribution. An analysis of a longitudinal dental data set illustrates the performance of the methods in a practical example.
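
    As a point of reference for the flexible estimators above, the following minimal sketch (Python, statsmodels) fits an ordinary normal-random-effects linear mixed model and extracts the empirical Bayes estimates of the subject-level random effects, whose distribution can then be inspected for departures from normality. It is not an implementation of any of the four cited approaches; the simulated data and column names are assumptions for illustration.

```python
# Minimal sketch (not one of the four cited estimators): fit a standard linear
# mixed model and inspect the distribution of the predicted subject-level
# random effects. Data and column names are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subj, n_obs = 100, 6
subj = np.repeat(np.arange(n_subj), n_obs)
time = np.tile(np.arange(n_obs), n_subj)
# Deliberately non-normal (bimodal) random-intercept distribution.
u = rng.choice([-2.0, 2.0], size=n_subj) + 0.3 * rng.standard_normal(n_subj)
y = 1.0 + 0.5 * time + u[subj] + rng.standard_normal(n_subj * n_obs)
data = pd.DataFrame({"y": y, "time": time, "subject": subj})

fit = smf.mixedlm("y ~ time", data, groups=data["subject"]).fit()
print(fit.summary())

# Empirical Bayes estimates of the subject random intercepts; a histogram of
# these values hints at how far the true distribution is from a normal one.
re_hat = np.array([np.asarray(v)[0] for v in fit.random_effects.values()])
print("random-intercept estimates: mean %.2f, sd %.2f" % (re_hat.mean(), re_hat.std()))
```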

  2. Individualizing drug dosage with longitudinal data.

    PubMed

    Zhu, Xiaolu; Qu, Annie

    2016-10-30

We propose a two-step procedure to personalize drug dosage over time under the framework of a log-linear mixed-effect model. We model patients' heterogeneity using subject-specific random effects, which are treated as realizations of an unspecified stochastic process. In the first step, we extend the conditional quadratic inference function to estimate both fixed-effect coefficients and individual random effects on a longitudinal training data sample; in the second step, we propose an adaptive procedure to estimate new patients' random effects and provide dosage recommendations for them. An advantage of our approach is that we do not impose any distributional assumption on the random effects. Moreover, the new approach can accommodate more general time-varying covariates corresponding to the random effects. We show in theory and in numerical studies that the proposed method is more efficient than existing approaches, especially when covariates are time varying. In addition, a real data example from a clozapine study confirms that our two-step procedure leads to more accurate drug dosage recommendations. Copyright © 2016 John Wiley & Sons, Ltd.

  3. A theoretical approach to quantify the effect of random cracks on rock deformation in uniaxial compression

    NASA Astrophysics Data System (ADS)

    Zhou, Shuwei; Xia, Caichu; Zhou, Yu

    2018-06-01

Cracks have a significant effect on the uniaxial compression of rocks. Thus, an analytical approach was proposed to assess the effects of randomly distributed cracks on the effective Young’s modulus during the uniaxial compression of rocks. Each stage of rock failure during uniaxial compression was analyzed and classified. The analytical expression for the effective Young’s modulus of a rock with only a single crack was derived while considering the three crack states under stress, namely opening, closure-sliding, and closure-nonsliding. The rock was then assumed to contain many cracks with randomly distributed directions, and the effect of crack shape and number during each stage of uniaxial compression on the effective Young’s modulus was considered. The approach for the effective Young’s modulus was then used to obtain the whole stress-strain process of uniaxial compression. Afterward, the proposed approach was employed to analyze the effects of related parameters on the whole stress-strain curve. The proposed approach was eventually compared with existing rock tests to validate its applicability and feasibility. The proposed approach has a clear physical meaning and shows favorable agreement with the rock test results.

  4. General Framework for Effect Sizes in Cluster Randomized Experiments

    ERIC Educational Resources Information Center

    VanHoudnos, Nathan

    2016-01-01

    Cluster randomized experiments are ubiquitous in modern education research. Although a variety of modeling approaches are used to analyze these data, perhaps the most common methodology is a normal mixed effects model where some effects, such as the treatment effect, are regarded as fixed, and others, such as the effect of group random assignment…

  5. Logistic random effects regression models: a comparison of statistical packages for binary and ordinal outcomes.

    PubMed

    Li, Baoyue; Lingsma, Hester F; Steyerberg, Ewout W; Lesaffre, Emmanuel

    2011-05-23

Logistic random effects models are a popular tool to analyze multilevel (also called hierarchical) data with a binary or ordinal outcome. Here, we aim to compare different statistical software implementations of these models. We used individual patient data from 8509 patients in 231 centers with moderate and severe Traumatic Brain Injury (TBI) enrolled in eight Randomized Controlled Trials (RCTs) and three observational studies. We fitted logistic random effects regression models with the 5-point Glasgow Outcome Scale (GOS) as outcome, both dichotomized and ordinal, with center and/or trial as random effects, and with age, motor score, pupil reactivity or trial as covariates. We then compared the implementations of frequentist and Bayesian methods to estimate the fixed and random effects. Frequentist approaches included R (lme4), Stata (GLLAMM), SAS (GLIMMIX and NLMIXED), MLwiN ([R]IGLS) and MIXOR; Bayesian approaches included WinBUGS, MLwiN (MCMC), the R package MCMCglmm and the SAS experimental procedure MCMC. Three data sets (the full data set and two sub-datasets) were analysed using essentially two logistic random effects models, with either one random effect for the center or two random effects for center and trial. For the ordinal outcome in the full data set, a proportional odds model with a random center effect was also fitted. The packages gave similar parameter estimates for both the fixed and random effects and for the binary (and ordinal) models for the main study, when based on a relatively large number of level-1 (patient level) data units compared to the number of level-2 (hospital level) units. However, when based on a relatively sparse data set, i.e. when the numbers of level-1 and level-2 data units were about the same, the frequentist and Bayesian approaches showed somewhat different results. The software implementations differ considerably in flexibility, computation time, and usability. There are also differences in the availability of additional tools for model evaluation, such as diagnostic plots. The experimental SAS (version 9.2) procedure MCMC appeared to be inefficient. On relatively large data sets, the different software implementations of logistic random effects regression models produced similar results. Thus, for a large data set there seems to be no clear preference (absent, of course, a preference on philosophical grounds) for either a frequentist or a Bayesian approach (if based on vague priors). The choice of a particular implementation may largely depend on the desired flexibility and the usability of the package. For small data sets the random effects variances are difficult to estimate. In the frequentist approaches the MLE of this variance was often estimated to be zero, with a standard error that was either zero or could not be determined, while for Bayesian methods the estimates could depend on the chosen "non-informative" prior of the variance parameter. The starting value for the variance parameter may also be critical for the convergence of the Markov chain.
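
    The record above compares R, SAS, Stata, MLwiN and WinBUGS implementations; none of them is needed to sketch the basic model. The hedged Python sketch below fits a logistic random-intercept (random center effect) model with statsmodels' variational Bayes mixed GLM on simulated data; the data, column names, and the choice of fit_vb() are illustrative assumptions, not a reproduction of the study's analyses.

```python
# Hedged sketch of a logistic random-intercept model (random center effect),
# analogous in spirit to the models compared in this record. Uses statsmodels'
# variational-Bayes mixed GLM; data and column names are illustrative only.
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(7)
n_centers, n_per = 30, 60
center = np.repeat(np.arange(n_centers), n_per)
age = rng.normal(45.0, 15.0, size=center.size)
u = rng.normal(0.0, 0.8, size=n_centers)                 # true center effects
lin = -1.0 + 0.03 * (age - 45.0) + u[center]
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-lin)))
data = pd.DataFrame({"y": y, "age": age, "center": center})

# One variance component: a random intercept for each center.
model = BinomialBayesMixedGLM.from_formula(
    "y ~ age", {"center": "0 + C(center)"}, data)
result = model.fit_vb()                                   # fast variational fit
print(result.summary())
```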

  6. Correction of confounding bias in non-randomized studies by appropriate weighting.

    PubMed

    Schmoor, Claudia; Gall, Christine; Stampf, Susanne; Graf, Erika

    2011-03-01

    In non-randomized studies, the assessment of a causal effect of treatment or exposure on outcome is hampered by possible confounding. Applying multiple regression models including the effects of treatment and covariates on outcome is the well-known classical approach to adjust for confounding. In recent years other approaches have been promoted. One of them is based on the propensity score and considers the effect of possible confounders on treatment as a relevant criterion for adjustment. Another proposal is based on using an instrumental variable. Here inference relies on a factor, the instrument, which affects treatment but is thought to be otherwise unrelated to outcome, so that it mimics randomization. Each of these approaches can basically be interpreted as a simple reweighting scheme, designed to address confounding. The procedures will be compared with respect to their fundamental properties, namely, which bias they aim to eliminate, which effect they aim to estimate, and which parameter is modelled. We will expand our overview of methods for analysis of non-randomized studies to methods for analysis of randomized controlled trials and show that analyses of both study types may target different effects and different parameters. The considerations will be illustrated using a breast cancer study with a so-called Comprehensive Cohort Study design, including a randomized controlled trial and a non-randomized study in the same patient population as sub-cohorts. This design offers ideal opportunities to discuss and illustrate the properties of the different approaches. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
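
    The "simple reweighting scheme" interpretation of the propensity score approach can be made concrete with a short inverse-probability-of-treatment weighting sketch. Everything below (variable names, the simulated confounding structure, the effect size of 2.0) is an invented illustration, not the breast cancer study discussed in the record.

```python
# Illustrative inverse-probability-of-treatment weighting, one of the
# reweighting schemes discussed above: estimate the propensity score by
# logistic regression, then compare naive and weighted effect estimates.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 5000
x = rng.standard_normal(n)                             # confounder
t = rng.binomial(1, 1.0 / (1.0 + np.exp(-1.5 * x)))    # confounded treatment
y = 2.0 * t + 3.0 * x + rng.standard_normal(n)         # true treatment effect = 2

naive = y[t == 1].mean() - y[t == 0].mean()

ps = LogisticRegression().fit(x[:, None], t).predict_proba(x[:, None])[:, 1]
w = np.where(t == 1, 1.0 / ps, 1.0 / (1.0 - ps))       # IPTW weights
ipw = (np.average(y[t == 1], weights=w[t == 1])
       - np.average(y[t == 0], weights=w[t == 0]))

print("naive difference: %.2f   IPW estimate: %.2f   (truth: 2.0)" % (naive, ipw))
```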

  7. Full Bayes Poisson gamma, Poisson lognormal, and zero inflated random effects models: Comparing the precision of crash frequency estimates.

    PubMed

    Aguero-Valverde, Jonathan

    2013-01-01

In recent years, complex statistical modeling approaches have been proposed to handle the unobserved heterogeneity and the excess of zeros frequently found in crash data, including random effects and zero inflated models. This research compares random effects, zero inflated, and zero inflated random effects models using a full Bayes hierarchical approach. The models are compared not just in terms of goodness-of-fit measures but also in terms of the precision of posterior crash frequency estimates, since the precision of these estimates is vital for ranking sites for engineering improvement. Fixed-over-time random effects models are also compared to independent-over-time random effects models. For the crash dataset analyzed, it was found that once the random effects are included in the zero inflated models, the probability of being in the zero state is drastically reduced, and the zero inflated models degenerate to their non-zero-inflated counterparts. Also, by fixing the random effects over time, the fit of the models and the precision of the crash frequency estimates are significantly increased. The rankings of the fixed-over-time random effects models were found to be very consistent with one another. In addition, the results show that by fixing the random effects over time, the standard errors of the crash frequency estimates are significantly reduced for the majority of the segments at the top of the ranking. Copyright © 2012 Elsevier Ltd. All rights reserved.
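
    A key reason the random effects matter here is the extra-Poisson variation they absorb. The hedged sketch below is not the paper's full Bayes hierarchical model; it simply simulates site-level gamma random effects and shows that the resulting marginal counts are overdispersed (negative binomial) relative to a plain Poisson model, with all parameter values invented.

```python
# Hedged illustration: a Poisson model with multiplicative site-level gamma
# random effects has a negative binomial marginal, i.e. variance > mean,
# which is the overdispersion that plain Poisson models miss.
import numpy as np

rng = np.random.default_rng(11)
n_sites = 20000
mu = 1.5                              # expected crashes per site (invented)
phi = 2.0                             # gamma shape, inverse dispersion (invented)

theta = rng.gamma(shape=phi, scale=1.0 / phi, size=n_sites)   # E[theta] = 1
counts_re = rng.poisson(mu * theta)   # counts with site random effects
counts_plain = rng.poisson(mu, size=n_sites)

print("plain Poisson : mean %.2f  var %.2f" % (counts_plain.mean(), counts_plain.var()))
print("Poisson-gamma : mean %.2f  var %.2f  (theory %.2f)"
      % (counts_re.mean(), counts_re.var(), mu + mu ** 2 / phi))
```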

  8. Logistic random effects regression models: a comparison of statistical packages for binary and ordinal outcomes

    PubMed Central

    2011-01-01

Background Logistic random effects models are a popular tool to analyze multilevel (also called hierarchical) data with a binary or ordinal outcome. Here, we aim to compare different statistical software implementations of these models. Methods We used individual patient data from 8509 patients in 231 centers with moderate and severe Traumatic Brain Injury (TBI) enrolled in eight Randomized Controlled Trials (RCTs) and three observational studies. We fitted logistic random effects regression models with the 5-point Glasgow Outcome Scale (GOS) as outcome, both dichotomized and ordinal, with center and/or trial as random effects, and with age, motor score, pupil reactivity or trial as covariates. We then compared the implementations of frequentist and Bayesian methods to estimate the fixed and random effects. Frequentist approaches included R (lme4), Stata (GLLAMM), SAS (GLIMMIX and NLMIXED), MLwiN ([R]IGLS) and MIXOR; Bayesian approaches included WinBUGS, MLwiN (MCMC), the R package MCMCglmm and the SAS experimental procedure MCMC. Three data sets (the full data set and two sub-datasets) were analysed using essentially two logistic random effects models, with either one random effect for the center or two random effects for center and trial. For the ordinal outcome in the full data set, a proportional odds model with a random center effect was also fitted. Results The packages gave similar parameter estimates for both the fixed and random effects and for the binary (and ordinal) models for the main study, when based on a relatively large number of level-1 (patient level) data units compared to the number of level-2 (hospital level) units. However, when based on a relatively sparse data set, i.e. when the numbers of level-1 and level-2 data units were about the same, the frequentist and Bayesian approaches showed somewhat different results. The software implementations differ considerably in flexibility, computation time, and usability. There are also differences in the availability of additional tools for model evaluation, such as diagnostic plots. The experimental SAS (version 9.2) procedure MCMC appeared to be inefficient. Conclusions On relatively large data sets, the different software implementations of logistic random effects regression models produced similar results. Thus, for a large data set there seems to be no clear preference (absent, of course, a preference on philosophical grounds) for either a frequentist or a Bayesian approach (if based on vague priors). The choice of a particular implementation may largely depend on the desired flexibility and the usability of the package. For small data sets the random effects variances are difficult to estimate. In the frequentist approaches the MLE of this variance was often estimated to be zero, with a standard error that was either zero or could not be determined, while for Bayesian methods the estimates could depend on the chosen "non-informative" prior of the variance parameter. The starting value for the variance parameter may also be critical for the convergence of the Markov chain. PMID:21605357

  9. A Stochastic Simulation Framework for the Prediction of Strategic Noise Mapping and Occupational Noise Exposure Using the Random Walk Approach

    PubMed Central

    Haron, Zaiton; Bakar, Suhaimi Abu; Dimon, Mohamad Ngasri

    2015-01-01

Strategic noise mapping provides important information for noise impact assessment and noise abatement. However, producing reliable strategic noise mapping in a dynamic, complex working environment is difficult. This study proposes the implementation of the random walk approach as a new stochastic technique to simulate noise mapping and to predict the noise exposure level in a workplace. A stochastic simulation framework and software, namely RW-eNMS, were developed to facilitate the random walk approach in noise mapping prediction. This framework considers the randomness and complexity of machinery operation and noise emission levels. It also assesses the impact of noise on the workers and the surrounding environment. For validation, three case studies were conducted to check the accuracy of the predictions and to determine the efficiency and effectiveness of this approach. The results showed high prediction accuracy, with most absolute differences below 2 dBA; the predicted noise doses were also mostly within the range of the measurements. Therefore, the random walk approach was effective in dealing with environmental noise. It can predict strategic noise mapping to facilitate noise monitoring and noise control in workplaces. PMID:25875019

  10. Using Multisite Experiments to Study Cross-Site Variation in Treatment Effects: A Hybrid Approach with Fixed Intercepts and A Random Treatment Coefficient

    ERIC Educational Resources Information Center

    Bloom, Howard S.; Raudenbush, Stephen W.; Weiss, Michael J.; Porter, Kristin

    2017-01-01

    The present article considers a fundamental question in evaluation research: "By how much do program effects vary across sites?" The article first presents a theoretical model of cross-site impact variation and a related estimation model with a random treatment coefficient and fixed site-specific intercepts. This approach eliminates…

  11. Extensively Parameterized Mutation-Selection Models Reliably Capture Site-Specific Selective Constraint.

    PubMed

    Spielman, Stephanie J; Wilke, Claus O

    2016-11-01

    The mutation-selection model of coding sequence evolution has received renewed attention for its use in estimating site-specific amino acid propensities and selection coefficient distributions. Two computationally tractable mutation-selection inference frameworks have been introduced: One framework employs a fixed-effects, highly parameterized maximum likelihood approach, whereas the other employs a random-effects Bayesian Dirichlet Process approach. While both implementations follow the same model, they appear to make distinct predictions about the distribution of selection coefficients. The fixed-effects framework estimates a large proportion of highly deleterious substitutions, whereas the random-effects framework estimates that all substitutions are either nearly neutral or weakly deleterious. It remains unknown, however, how accurately each method infers evolutionary constraints at individual sites. Indeed, selection coefficient distributions pool all site-specific inferences, thereby obscuring a precise assessment of site-specific estimates. Therefore, in this study, we use a simulation-based strategy to determine how accurately each approach recapitulates the selective constraint at individual sites. We find that the fixed-effects approach, despite its extensive parameterization, consistently and accurately estimates site-specific evolutionary constraint. By contrast, the random-effects Bayesian approach systematically underestimates the strength of natural selection, particularly for slowly evolving sites. We also find that, despite the strong differences between their inferred selection coefficient distributions, the fixed- and random-effects approaches yield surprisingly similar inferences of site-specific selective constraint. We conclude that the fixed-effects mutation-selection framework provides the more reliable software platform for model application and future development. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  12. Random function representation of stationary stochastic vector processes for probability density evolution analysis of wind-induced structures

    NASA Astrophysics Data System (ADS)

    Liu, Zhangjun; Liu, Zenghui

    2018-06-01

This paper develops a hybrid approach of spectral representation and random functions for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula can be effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. On this basis, satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully captured by just several hundred sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), the approach enables dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulence wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies of the probability density evolution analysis of the wind-induced structure are conducted to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
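
    The original spectral representation (OSR) that the paper compresses into two elementary random variables is itself straightforward to simulate. The sketch below generates sample functions of a stationary scalar process from a target one-sided power spectrum using the classical cosine-series formula with independent uniform random phases; the spectrum, frequency cutoff and discretization are illustrative assumptions, and the paper's random-function dimension reduction is not reproduced.

```python
# Classical spectral-representation (OSR) simulation of a stationary scalar
# process: X(t) = sum_k sqrt(2*S(w_k)*dw) * cos(w_k*t + phi_k), phi_k ~ U(0,2*pi).
# The spectrum S(w) and the discretization below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_freq, w_max = 256, 4.0
dw = w_max / n_freq
w = (np.arange(n_freq) + 0.5) * dw
S = np.exp(-w)                               # illustrative one-sided spectrum
amp = np.sqrt(2.0 * S * dw)

def realization(t):
    """One sample function from the cosine-series spectral representation."""
    phi = rng.uniform(0.0, 2.0 * np.pi, size=n_freq)
    return np.cos(np.outer(np.atleast_1d(t), w) + phi) @ amp

t = np.linspace(0.0, 60.0, 601)
x = realization(t)                           # one simulated sample function
print("first values of one sample function:", np.round(x[:4], 3))

# Ensemble check: the variance at any fixed time equals the integral of S(w).
samples = np.array([realization(0.0)[0] for _ in range(4000)])
print("ensemble variance %.3f  vs  target %.3f" % (samples.var(), (S * dw).sum()))
```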

  13. Effectiveness of Treatment Approaches for Children and Adolescents with Reading Disabilities: A Meta-Analysis of Randomized Controlled Trials

    PubMed Central

    Galuschka, Katharina; Ise, Elena; Krick, Kathrin; Schulte-Körne, Gerd

    2014-01-01

Children and adolescents with reading disabilities experience a significant impairment in the acquisition of reading and spelling skills. Given the emotional and academic consequences for children with persistent reading disorders, evidence-based interventions are critically needed. The present meta-analysis extracts the results of all available randomized controlled trials. The aims were to determine the effectiveness of different treatment approaches and the impact of various factors on the efficacy of interventions. The literature search for published randomized controlled trials comprised an electronic search in the databases ERIC, PsycINFO, PubMed, and Cochrane, and an examination of bibliographical references. To check for unpublished trials, we searched the websites clinicaltrials.com and ProQuest, and contacted experts in the field. Twenty-two randomized controlled trials with a total of 49 comparisons of experimental and control groups could be included. The comparisons evaluated five reading fluency trainings, three phonemic awareness instructions, three reading comprehension trainings, 29 phonics instructions, three auditory trainings, two medical treatments, and four interventions with coloured overlays or lenses. One trial evaluated the effectiveness of sunflower therapy and another investigated the effectiveness of motor exercises. The results revealed that phonics instruction is not only the most frequently investigated treatment approach, but also the only approach whose efficacy on reading and spelling performance in children and adolescents with reading disabilities is statistically confirmed. The mean effect sizes of the remaining treatment approaches did not reach statistical significance. The present meta-analysis demonstrates that severe reading and spelling difficulties can be ameliorated with appropriate treatment. In order to be better able to provide evidence-based interventions to children and adolescents with reading disabilities, research should intensify the application of blinded randomized controlled trials. PMID:24587110

  14. Membrane fouling in a submerged membrane bioreactor: An unified approach to construct topography and to evaluate interaction energy between two randomly rough surfaces.

    PubMed

    Cai, Xiang; Shen, Liguo; Zhang, Meijia; Chen, Jianrong; Hong, Huachang; Lin, Hongjun

    2017-11-01

Quantitatively evaluating the interaction energy between two randomly rough surfaces is a prerequisite for quantitatively understanding and controlling membrane fouling in membrane bioreactors (MBRs). In this study, a new unified approach to construct rough topographies and to quantify the interaction energy between a randomly rough particle and a randomly rough membrane was proposed. It was found that the natural rough topographies of both foulants and membrane could be well constructed by a modified two-variable Weierstrass-Mandelbrot (WM) function from fractal theory. Spatial differential relationships between the two constructed surfaces were accordingly established. Thereafter, a new approach combining these relationships, the surface element integration (SEI) method and composite Simpson's rule was derived to calculate the interaction energy between two randomly rough surfaces in a submerged MBR. The obtained results indicate profound effects of surface morphology on interaction energy and membrane fouling. This study provides a basic approach to investigate membrane fouling and interface behaviors. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. LOD score exclusion analyses for candidate genes using random population samples.

    PubMed

    Deng, H W; Li, J; Recker, R R

    2001-05-01

While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes with random population samples. We develop a LOD score approach for exclusion analyses of candidate genes with random population samples. Under this approach, specific genetic effects and inheritance models at candidate genes can be analysed and if a LOD score is ≤ -2.0, the locus can be excluded from having an effect larger than that specified. Computer simulations show that, with sample sizes often employed in association studies, this approach has high power to exclude a gene from having moderate genetic effects. In contrast to regular association analyses, population admixture will not affect the robustness of our analyses; in fact, it renders our analyses more conservative and thus any significant exclusion result is robust. Our exclusion analysis complements association analysis for candidate genes in random population samples and is parallel to the exclusion mapping analyses that may be conducted in linkage analyses with pedigrees or relative pairs. The usefulness of the approach is demonstrated by an application to test the importance of vitamin D receptor and estrogen receptor genes underlying the differential risk to osteoporotic fractures.

  16. Randomized Control Trials on the Dynamic Geometry Approach

    ERIC Educational Resources Information Center

    Jiang, Zhonghong; White, Alexander; Rosenwasser, Alana

    2011-01-01

    The project reported here is conducting repeated randomized control trials of an approach to high school geometry that utilizes Dynamic Geometry (DG) software to supplement ordinary instructional practices. It compares effects of that intervention with standard instruction that does not make use of computer drawing/exploration tools. The basic…

  17. Optical characterization of randomly microrough surfaces covered with very thin overlayers using effective medium approximation and Rayleigh-Rice theory

    NASA Astrophysics Data System (ADS)

    Ohlídal, Ivan; Vohánka, Jiří; Čermák, Martin; Franta, Daniel

    2017-10-01

The modification of the effective medium approximation for randomly microrough surfaces covered by very thin overlayers, based on inhomogeneous fictitious layers, is formulated. The numerical analysis of this modification is performed using simulated ellipsometric data calculated using the Rayleigh-Rice theory. The system used to perform this numerical analysis consists of a randomly microrough silicon single crystal surface covered with a SiO2 overlayer. A comparison to the effective medium approximation based on homogeneous fictitious layers is carried out within this numerical analysis. For ellipsometry of the system mentioned above, the possibilities and limitations of both effective medium approximation approaches are discussed. The results obtained by means of the numerical analysis are confirmed by the ellipsometric characterization of two randomly microrough silicon single crystal substrates covered with native oxide overlayers. It is shown that the effective medium approximation approaches for this system exhibit strong deficiencies compared to the Rayleigh-Rice theory. The practical consequences implied by these results are presented. The results concerning the random microroughness are verified by means of measurements performed using atomic force microscopy.

  18. Should multiple imputation be the method of choice for handling missing data in randomized trials?

    PubMed Central

    Sullivan, Thomas R; White, Ian R; Salter, Amy B; Ryan, Philip; Lee, Katherine J

    2016-01-01

The use of multiple imputation has increased markedly in recent years, and journal reviewers may expect to see multiple imputation used to handle missing data. However, in randomized trials, where treatment group is always observed and independent of baseline covariates, other approaches may be preferable. Using data simulation, we evaluated multiple imputation, performed both overall and separately by randomized group, across a range of commonly encountered scenarios. We considered both missing outcome and missing baseline data, with missing outcome data induced under missing at random mechanisms. Provided the analysis model was correctly specified, multiple imputation produced unbiased treatment effect estimates, but alternative unbiased approaches were often more efficient. When the analysis model overlooked an interaction effect involving randomized group, multiple imputation produced biased estimates of the average treatment effect when applied to missing outcome data, unless imputation was performed separately by randomized group. Based on these results, we conclude that multiple imputation should not be seen as the only acceptable way to handle missing data in randomized trials. In settings where multiple imputation is adopted, we recommend that imputation is carried out separately by randomized group. PMID:28034175
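
    The practical recommendation here, running the imputation model separately within each randomized arm so that an arm-by-covariate interaction cannot be flattened by a pooled imputation model, can be sketched in a few lines. The column layout, the use of scikit-learn's IterativeImputer, and the single (rather than multiple) imputation shown below are assumptions for illustration, not the authors' implementation.

```python
# Hedged sketch of the recommendation: impute separately within each randomized
# arm so an arm-by-covariate interaction survives imputation. A single
# imputation is shown for brevity; full MI would repeat this with different
# seeds and pool the results with Rubin's rules. All data are simulated.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(5)
n = 2000
arm = rng.binomial(1, 0.5, n)
x = rng.standard_normal(n)
y = 1.0 + 0.5 * x + arm * (1.0 + 1.5 * x) + rng.standard_normal(n)  # interaction
y = np.where(rng.random(n) < 0.3, np.nan, y)   # ~30% of outcomes missing
df = pd.DataFrame({"arm": arm, "x": x, "y": y})

def impute_by_arm(frame):
    parts = []
    for _, grp in frame.groupby("arm"):
        filled = IterativeImputer(random_state=0).fit_transform(grp[["x", "y"]])
        out = grp.copy()
        out["y"] = filled[:, 1]
        parts.append(out)
    return pd.concat(parts)

completed = impute_by_arm(df)
effect = completed.groupby("arm")["y"].mean().diff().iloc[-1]
print("estimated average treatment effect: %.2f (truth: 1.0)" % effect)
```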

  19. Should multiple imputation be the method of choice for handling missing data in randomized trials?

    PubMed

    Sullivan, Thomas R; White, Ian R; Salter, Amy B; Ryan, Philip; Lee, Katherine J

    2016-01-01

The use of multiple imputation has increased markedly in recent years, and journal reviewers may expect to see multiple imputation used to handle missing data. However, in randomized trials, where treatment group is always observed and independent of baseline covariates, other approaches may be preferable. Using data simulation, we evaluated multiple imputation, performed both overall and separately by randomized group, across a range of commonly encountered scenarios. We considered both missing outcome and missing baseline data, with missing outcome data induced under missing at random mechanisms. Provided the analysis model was correctly specified, multiple imputation produced unbiased treatment effect estimates, but alternative unbiased approaches were often more efficient. When the analysis model overlooked an interaction effect involving randomized group, multiple imputation produced biased estimates of the average treatment effect when applied to missing outcome data, unless imputation was performed separately by randomized group. Based on these results, we conclude that multiple imputation should not be seen as the only acceptable way to handle missing data in randomized trials. In settings where multiple imputation is adopted, we recommend that imputation is carried out separately by randomized group.

  20. The Ecological Effects of Universal and Selective Violence Prevention Programs for Middle School Students: A Randomized Trial

    ERIC Educational Resources Information Center

    Simon, Thomas R.; Ikeda, Robin M.; Smith, Emilie Phillips; Reese, Le'Roy E.; Rabiner, David L.; Miller, Shari; Winn, Donna-Marie; Dodge, Kenneth A.; Asher, Steven R.; Horne, Arthur M.; Orpinas, Pamela; Martin, Roy; Quinn, William H.; Tolan, Patrick H.; Gorman-Smith, Deborah; Henry, David B.; Gay, Franklin N.; Schoeny, Michael; Farrell, Albert D.; Meyer, Aleta L.; Sullivan, Terri N.; Allison, Kevin W.

    2009-01-01

    This study reports the findings of a multisite randomized trial evaluating the separate and combined effects of 2 school-based approaches to reduce violence among early adolescents. A total of 37 schools at 4 sites were randomized to 4 conditions: (1) a universal intervention that involved implementing a student curriculum and teacher training…

  1. Impulsive synchronization of Markovian jumping randomly coupled neural networks with partly unknown transition probabilities via multiple integral approach.

    PubMed

    Chandrasekar, A; Rakkiyappan, R; Cao, Jinde

    2015-10-01

This paper studies the impulsive synchronization of Markovian jumping randomly coupled neural networks with partly unknown transition probabilities via a multiple integral approach. The array of neural networks is coupled in a random fashion governed by a Bernoulli random variable. The aim of this paper is to obtain synchronization criteria that are suitable for both exactly known and partly unknown transition probabilities, such that the coupled neural network is synchronized with mixed time-delay. The considered impulsive effects can be synchronized at partly unknown transition probabilities. Besides, a multiple integral approach is also proposed to strengthen the results for Markovian jumping randomly coupled neural networks with partly unknown transition probabilities. By making use of the Kronecker product and some useful integral inequalities, a novel Lyapunov-Krasovskii functional is designed for handling the coupled neural network with mixed delay, and the impulsive synchronization criteria are then obtained as a set of linear matrix inequalities. Finally, numerical examples are presented to illustrate the effectiveness and advantages of the theoretical results. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Understanding and comparisons of different sampling approaches for the Fourier Amplitudes Sensitivity Test (FAST)

    PubMed Central

    Xu, Chonggang; Gertner, George

    2013-01-01

Fourier Amplitude Sensitivity Test (FAST) is one of the most popular uncertainty and sensitivity analysis techniques. It uses a periodic sampling approach and a Fourier transformation to decompose the variance of a model output into partial variances contributed by different model parameters. Until now, FAST analysis has mainly been confined to the estimation of partial variances contributed by the main effects of model parameters, and does not allow for those contributed by specific interactions among parameters. In this paper, we theoretically show that FAST analysis can be used to estimate partial variances contributed by both main effects and interaction effects of model parameters using different sampling approaches (i.e., traditional search-curve based sampling, simple random sampling and random balance design sampling). We also analytically calculate the potential errors and biases in the estimation of partial variances. Hypothesis tests are constructed to reduce the effect of sampling errors on the estimation of partial variances. Our results show that, compared to simple random sampling and random balance design sampling, sensitivity indices (ratios of partial variances to the variance of a specific model output) estimated by search-curve based sampling generally have higher precision but larger underestimations. Compared to simple random sampling, random balance design sampling generally provides higher estimation precision for partial variances contributed by the main effects of parameters. The theoretical derivation of partial variances contributed by higher-order interactions and the calculation of their corresponding estimation errors in different sampling schemes can help us better understand the FAST method and provide a fundamental basis for FAST applications and further improvements. PMID:24143037
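
    A minimal, hedged illustration of the first-order FAST recipe (search-curve sampling followed by a Fourier decomposition of the model output) is given below for an additive toy model. The driver frequencies, the number of retained harmonics, and the toy model itself are ad-hoc assumptions; the interaction-effect extensions discussed in the paper are not covered.

```python
# Minimal, hedged FAST illustration for an additive toy model
# Y = 4*X1 + 2*X2 + X3 with Xi ~ U(0,1). The search-curve sampling and the
# Fourier decomposition follow the classical first-order FAST recipe; the
# frequency set below is an ad-hoc choice assumed to be interference-free.
import numpy as np

a = np.array([4.0, 2.0, 1.0])
omega = np.array([11, 35, 73])                 # driver frequencies (assumed)
M = 4                                          # harmonics retained per factor
N = 2 * M * omega.max() + 1                    # minimum sample size (585)

s = np.pi * (2 * np.arange(N) - N + 1) / N     # search variable over (-pi, pi)
X = 0.5 + np.arcsin(np.sin(np.outer(s, omega))) / np.pi   # search-curve samples
Y = X @ a                                      # model evaluations

F = np.fft.rfft(Y) / N
power = 2.0 * np.abs(F[1:]) ** 2               # variance at frequencies 1, 2, ...
V_total = Y.var()

for i, w in enumerate(omega):
    Vi = sum(power[w * h - 1] for h in range(1, M + 1))   # harmonics of factor i
    print(f"X{i+1}: estimated first-order index {Vi / V_total:.3f}")
# Analytic first-order indices for this additive model: 16/21, 4/21, 1/21.
```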

  3. Understanding and comparisons of different sampling approaches for the Fourier Amplitudes Sensitivity Test (FAST).

    PubMed

    Xu, Chonggang; Gertner, George

    2011-01-01

Fourier Amplitude Sensitivity Test (FAST) is one of the most popular uncertainty and sensitivity analysis techniques. It uses a periodic sampling approach and a Fourier transformation to decompose the variance of a model output into partial variances contributed by different model parameters. Until now, FAST analysis has mainly been confined to the estimation of partial variances contributed by the main effects of model parameters, and does not allow for those contributed by specific interactions among parameters. In this paper, we theoretically show that FAST analysis can be used to estimate partial variances contributed by both main effects and interaction effects of model parameters using different sampling approaches (i.e., traditional search-curve based sampling, simple random sampling and random balance design sampling). We also analytically calculate the potential errors and biases in the estimation of partial variances. Hypothesis tests are constructed to reduce the effect of sampling errors on the estimation of partial variances. Our results show that, compared to simple random sampling and random balance design sampling, sensitivity indices (ratios of partial variances to the variance of a specific model output) estimated by search-curve based sampling generally have higher precision but larger underestimations. Compared to simple random sampling, random balance design sampling generally provides higher estimation precision for partial variances contributed by the main effects of parameters. The theoretical derivation of partial variances contributed by higher-order interactions and the calculation of their corresponding estimation errors in different sampling schemes can help us better understand the FAST method and provide a fundamental basis for FAST applications and further improvements.

  4. Meta-analysis of laparoscopic versus open repair of perforated peptic ulcer.

    PubMed

    Antoniou, Stavros A; Antoniou, George A; Koch, Oliver O; Pointner, Rudolph; Granderath, Frank A

    2013-01-01

    Laparoscopic treatment of perforated peptic ulcer (PPU) has been introduced as an alternative procedure to open surgery. It has been postulated that the minimally invasive approach involves less operative stress and results in decreased morbidity and mortality. We conducted a meta-analysis of randomized trials to test this hypothesis. Medline, EMBASE, and the Cochrane Central Register of Randomized Trials databases were searched, with no date or language restrictions. Our literature search identified 4 randomized trials, with a cumulative number of 289 patients, that compared the laparoscopic approach with open sutured repair of perforated ulcer. Analysis of outcomes did not favor either approach in terms of morbidity, mortality, and reoperation rate, although odds ratios seemed to consistently support the laparoscopic approach. Results did not determine the comparative efficiency and safety of laparoscopic or open approach for PPU. In view of an increased interest in the laparoscopic approach, further randomized trials are considered essential to determine the relative effectiveness of laparoscopic and open repair of PPU.

  5. Meta-analysis of Laparoscopic Versus Open Repair of Perforated Peptic Ulcer

    PubMed Central

    Antoniou, George A.; Koch, Oliver O.; Pointner, Rudolph; Granderath, Frank A.

    2013-01-01

    Background and Objectives: Laparoscopic treatment of perforated peptic ulcer (PPU) has been introduced as an alternative procedure to open surgery. It has been postulated that the minimally invasive approach involves less operative stress and results in decreased morbidity and mortality. Methods: We conducted a meta-analysis of randomized trials to test this hypothesis. Medline, EMBASE, and the Cochrane Central Register of Randomized Trials databases were searched, with no date or language restrictions. Results: Our literature search identified 4 randomized trials, with a cumulative number of 289 patients, that compared the laparoscopic approach with open sutured repair of perforated ulcer. Analysis of outcomes did not favor either approach in terms of morbidity, mortality, and reoperation rate, although odds ratios seemed to consistently support the laparoscopic approach. Results did not determine the comparative efficiency and safety of laparoscopic or open approach for PPU. Conclusion: In view of an increased interest in the laparoscopic approach, further randomized trials are considered essential to determine the relative effectiveness of laparoscopic and open repair of PPU. PMID:23743368

  6. A Structural Modeling Approach to a Multilevel Random Coefficients Model.

    ERIC Educational Resources Information Center

    Rovine, Michael J.; Molenaar, Peter C. M.

    2000-01-01

    Presents a method for estimating the random coefficients model using covariance structure modeling and allowing one to estimate both fixed and random effects. The method is applied to real and simulated data, including marriage data from J. Belsky and M. Rovine (1990). (SLD)

  7. Cross-validation analysis for genetic evaluation models for ranking in endurance horses.

    PubMed

    García-Ballesteros, S; Varona, L; Valera, M; Gutiérrez, J P; Cervantes, I

    2018-01-01

    Ranking trait was used as a selection criterion for competition horses to estimate racing performance. In the literature the most common approaches to estimate breeding values are the linear or threshold statistical models. However, recent studies have shown that a Thurstonian approach was able to fix the race effect (competitive level of the horses that participate in the same race), thus suggesting a better prediction accuracy of breeding values for ranking trait. The aim of this study was to compare the predictability of linear, threshold and Thurstonian approaches for genetic evaluation of ranking in endurance horses. For this purpose, eight genetic models were used for each approach with different combinations of random effects: rider, rider-horse interaction and environmental permanent effect. All genetic models included gender, age and race as systematic effects. The database that was used contained 4065 ranking records from 966 horses and that for the pedigree contained 8733 animals (47% Arabian horses), with an estimated heritability around 0.10 for the ranking trait. The prediction ability of the models for racing performance was evaluated using a cross-validation approach. The average correlation between real and predicted performances across genetic models was around 0.25 for threshold, 0.58 for linear and 0.60 for Thurstonian approaches. Although no significant differences were found between models within approaches, the best genetic model included: the rider and rider-horse random effects for threshold, only rider and environmental permanent effects for linear approach and all random effects for Thurstonian approach. The absolute correlations of predicted breeding values among models were higher between threshold and Thurstonian: 0.90, 0.91 and 0.88 for all animals, top 20% and top 5% best animals. For rank correlations these figures were 0.85, 0.84 and 0.86. The lower values were those between linear and threshold approaches (0.65, 0.62 and 0.51). In conclusion, the Thurstonian approach is recommended for the routine genetic evaluations for ranking in endurance horses.

  8. A comparison of three random effects approaches to analyze repeated bounded outcome scores with an application in a stroke revalidation study.

    PubMed

    Molas, Marek; Lesaffre, Emmanuel

    2008-12-30

    Discrete bounded outcome scores (BOS), i.e. discrete measurements that are restricted on a finite interval, often occur in practice. Examples are compliance measures, quality of life measures, etc. In this paper we examine three related random effects approaches to analyze longitudinal studies with a BOS as response: (1) a linear mixed effects (LM) model applied to a logistic transformed modified BOS; (2) a model assuming that the discrete BOS is a coarsened version of a latent random variable, which after a logistic-normal transformation, satisfies an LM model; and (3) a random effects probit model. We consider also the extension whereby the variability of the BOS is allowed to depend on covariates. The methods are contrasted using a simulation study and on a longitudinal project, which documents stroke rehabilitation in four European countries using measures of motor and functional recovery. Copyright 2008 John Wiley & Sons, Ltd.

  9. Testing the Intervention Effect in Single-Case Experiments: A Monte Carlo Simulation Study

    ERIC Educational Resources Information Center

    Heyvaert, Mieke; Moeyaert, Mariola; Verkempynck, Paul; Van den Noortgate, Wim; Vervloet, Marlies; Ugille, Maaike; Onghena, Patrick

    2017-01-01

    This article reports on a Monte Carlo simulation study, evaluating two approaches for testing the intervention effect in replicated randomized AB designs: two-level hierarchical linear modeling (HLM) and using the additive method to combine randomization test "p" values (RTcombiP). Four factors were manipulated: mean intervention effect,…
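
    The RTcombiP procedure pairs a randomization test within each replicated AB case with the additive combination of the resulting p values. The sketch below is a hedged toy version: the case data, the admissible intervention start points, and the assumption that the observed start is the first admissible one are all invented for illustration, and the combination step uses the Irwin-Hall tail probability commonly associated with Edgington's additive method.

```python
# Hedged toy version of the RTcombiP idea: a start-point randomization test
# within each replicated AB case, then the additive combination of the p values
# via the Irwin-Hall tail probability. All data and design choices are invented.
import numpy as np
from math import comb, factorial, floor

rng = np.random.default_rng(2)

def ab_randomization_p(y, starts):
    """One-sided randomization test p value for an AB phase design; the observed
    intervention start is taken to be the first admissible point (an assumption)."""
    def stat(k):
        return y[k:].mean() - y[:k].mean()       # B-phase mean minus A-phase mean
    observed = stat(starts[0])
    return np.mean([stat(k) >= observed for k in starts])

def additive_combined_p(pvals):
    """P(sum of k independent U(0,1) p values <= observed sum): Irwin-Hall CDF."""
    k, s = len(pvals), float(sum(pvals))
    return sum((-1) ** j * comb(k, j) * (s - j) ** k
               for j in range(floor(s) + 1)) / factorial(k)

starts = list(range(5, 16))                      # admissible intervention starts
pvals = []
for _ in range(4):                               # four replicated cases
    y = rng.standard_normal(20)
    y[starts[0]:] += 1.0                         # simulated intervention effect
    pvals.append(ab_randomization_p(y, starts))

print("case p values:", np.round(pvals, 3))
print("combined p (additive method): %.4f" % additive_combined_p(pvals))
```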

  10. A dynamic spatio-temporal model for spatial data

    USGS Publications Warehouse

    Hefley, Trevor J.; Hooten, Mevin B.; Hanks, Ephraim M.; Russell, Robin; Walsh, Daniel P.

    2017-01-01

    Analyzing spatial data often requires modeling dependencies created by a dynamic spatio-temporal data generating process. In many applications, a generalized linear mixed model (GLMM) is used with a random effect to account for spatial dependence and to provide optimal spatial predictions. Location-specific covariates are often included as fixed effects in a GLMM and may be collinear with the spatial random effect, which can negatively affect inference. We propose a dynamic approach to account for spatial dependence that incorporates scientific knowledge of the spatio-temporal data generating process. Our approach relies on a dynamic spatio-temporal model that explicitly incorporates location-specific covariates. We illustrate our approach with a spatially varying ecological diffusion model implemented using a computationally efficient homogenization technique. We apply our model to understand individual-level and location-specific risk factors associated with chronic wasting disease in white-tailed deer from Wisconsin, USA and estimate the location the disease was first introduced. We compare our approach to several existing methods that are commonly used in spatial statistics. Our spatio-temporal approach resulted in a higher predictive accuracy when compared to methods based on optimal spatial prediction, obviated confounding among the spatially indexed covariates and the spatial random effect, and provided additional information that will be important for containing disease outbreaks.

  11. Analyzing degradation data with a random effects spline regression model

    DOE PAGES

    Fugate, Michael Lynn; Hamada, Michael Scott; Weaver, Brian Phillip

    2017-03-17

This study proposes using a random effects spline regression model to analyze degradation data. Spline regression avoids having to specify a parametric function for the true degradation of an item. A distribution for the spline regression coefficients captures the variation of the true degradation curves from item to item. We illustrate the proposed methodology with a real example using a Bayesian approach. The Bayesian approach allows prediction of the degradation of a population over time, and estimation of reliability is straightforward to perform.

  12. Analyzing degradation data with a random effects spline regression model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fugate, Michael Lynn; Hamada, Michael Scott; Weaver, Brian Phillip

This study proposes using a random effects spline regression model to analyze degradation data. Spline regression avoids having to specify a parametric function for the true degradation of an item. A distribution for the spline regression coefficients captures the variation of the true degradation curves from item to item. We illustrate the proposed methodology with a real example using a Bayesian approach. The Bayesian approach allows prediction of the degradation of a population over time, and estimation of reliability is straightforward to perform.

  13. Facilitating Involvement in Alcoholics Anonymous During Outpatient Treatment: A Randomized Clinical Trial

    PubMed Central

    Walitzer, Kimberly S.; Dermen, Kurt H.; Barrick, Christopher

    2009-01-01

    AIM This study evaluated two strategies to facilitate involvement in Alcoholics Anonymous (AA) – a 12-step-based directive approach and a motivational enhancement approach – during skills-focused individual treatment. DESIGN Randomized controlled trial with assessments at baseline, end of treatment, and 3, 6, 9, and 12 months after treatment. PARTICIPANTS, SETTING, and INTERVENTION 169 alcoholic outpatients (57 women) randomly assigned to one of three conditions: a directive approach to facilitating AA, a motivational enhancement approach to facilitating AA, or treatment as usual with no special emphasis on AA. MEASUREMENTS Self-report of AA meeting attendance and involvement, alcohol consumption (percent days abstinent, percent days heavy drinking), and negative alcohol consequences. FINDINGS Participants exposed to the 12-step directive condition for facilitating AA involvement reported more AA meeting attendance, more evidence of active involvement in AA, and a higher percent days abstinent relative to participants in the treatment-as-usual comparison group. Evidence suggested also that the effect of the directive strategy on abstinent days was partially mediated through AA involvement. The motivational enhancement approach to facilitating AA had no effect on outcome measures. CONCLUSIONS These results suggest that treatment providers can use a 12-step-based directive approach to effectively facilitate involvement in AA and thereby improve client outcome. PMID:19207347

  14. Neither fixed nor random: weighted least squares meta-analysis.

    PubMed

    Stanley, T D; Doucouliagos, Hristos

    2015-06-15

    This study challenges two core conventional meta-analysis methods: fixed effect and random effects. We show how and explain why an unrestricted weighted least squares estimator is superior to conventional random-effects meta-analysis when there is publication (or small-sample) bias and better than a fixed-effect weighted average if there is heterogeneity. Statistical theory and simulations of effect sizes, log odds ratios and regression coefficients demonstrate that this unrestricted weighted least squares estimator provides satisfactory estimates and confidence intervals that are comparable to random effects when there is no publication (or small-sample) bias and identical to fixed-effect meta-analysis when there is no heterogeneity. When there is publication selection bias, the unrestricted weighted least squares approach dominates random effects; when there is excess heterogeneity, it is clearly superior to fixed-effect meta-analysis. In practical applications, an unrestricted weighted least squares weighted average will often provide superior estimates to both conventional fixed and random effects. Copyright © 2015 John Wiley & Sons, Ltd.
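
    Because the three estimators contrasted in this record differ mainly in their weights and standard errors, a compact numerical comparison is easy to write down. The effect sizes and standard errors below are invented toy inputs, and the UWLS standard error is computed from the usual weighted intercept-only regression formula as a hedged stand-in for the estimator described in the record.

```python
# Hedged numerical comparison of a fixed-effect average, DerSimonian-Laird
# random effects, and an unrestricted weighted least squares (UWLS) average.
# The effect sizes and standard errors are invented toy inputs.
import numpy as np

y = np.array([0.10, 0.35, 0.22, 0.55, 0.48, 0.15])   # study effect estimates
se = np.array([0.08, 0.12, 0.10, 0.20, 0.15, 0.09])  # their standard errors
w = 1.0 / se**2
k = len(y)

# Fixed-effect weighted average.
fe = np.sum(w * y) / np.sum(w)
fe_se = np.sqrt(1.0 / np.sum(w))

# UWLS: same point estimate, but the SE is scaled by the root mean squared
# error of the weighted intercept-only regression (multiplicative heterogeneity).
mse = np.sum(w * (y - fe) ** 2) / (k - 1)
uwls_se = fe_se * np.sqrt(mse)

# DerSimonian-Laird random effects (additive heterogeneity tau^2).
Q = np.sum(w * (y - fe) ** 2)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1.0 / (se**2 + tau2)
re = np.sum(w_re * y) / np.sum(w_re)
re_se = np.sqrt(1.0 / np.sum(w_re))

print("fixed effect : %.3f (SE %.3f)" % (fe, fe_se))
print("UWLS         : %.3f (SE %.3f)" % (fe, uwls_se))
print("random effect: %.3f (SE %.3f)  tau^2 = %.4f" % (re, re_se, tau2))
```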

  15. A note on variance estimation in random effects meta-regression.

    PubMed

    Sidik, Kurex; Jonkman, Jeffrey N

    2005-01-01

    For random effects meta-regression inference, variance estimation for the parameter estimates is discussed. Because estimated weights are used for meta-regression analysis in practice, the assumed or estimated covariance matrix used in meta-regression is not strictly correct, due to possible errors in estimating the weights. Therefore, this note investigates the use of a robust variance estimation approach for obtaining variances of the parameter estimates in random effects meta-regression inference. This method treats the assumed covariance matrix of the effect measure variables as a working covariance matrix. Using an example of meta-analysis data from clinical trials of a vaccine, the robust variance estimation approach is illustrated in comparison with two other methods of variance estimation. A simulation study is presented, comparing the three methods of variance estimation in terms of bias and coverage probability. We find that, despite the seeming suitability of the robust estimator for random effects meta-regression, the improved variance estimator of Knapp and Hartung (2003) yields the best performance among the three estimators, and thus may provide the best protection against errors in the estimated weights.

  16. Demystifying the memory effect: A geometrical approach to understanding speckle correlations

    NASA Astrophysics Data System (ADS)

    Prunty, Aaron C.; Snieder, Roel K.

    2017-05-01

    The memory effect has seen a surge of research into its fundamental properties and applications since its discovery by Feng et al. [Phys. Rev. Lett. 61, 834 (1988)]. While the wave trajectories for which the memory effect holds are hidden implicitly in the diffusion probability function [Phys. Rev. B 40, 737 (1989)], the physical intuition of why these trajectories satisfy the memory effect has often been masked by the derivation of the memory correlation function itself. In this paper, we explicitly derive the specific trajectories through a random medium for which the memory effect holds. Our approach shows that the memory effect follows from a simple conservation argument, which imposes geometrical constraints on the random trajectories that contribute to the memory effect. We illustrate the time-domain effects of these geometrical constraints with numerical simulations of pulse transmission through a random medium. The results of our derivation and numerical simulations are consistent with established theory and experimentation.

  17. Evaluating the Use of Random Distribution Theory to Introduce Statistical Inference Concepts to Business Students

    ERIC Educational Resources Information Center

    Larwin, Karen H.; Larwin, David A.

    2011-01-01

    Bootstrapping methods and random distribution methods are increasingly recommended as better approaches for teaching students about statistical inference in introductory-level statistics courses. The authors examined the effect of teaching undergraduate business statistics students using random distribution and bootstrapping simulations. It is the…

  18. LOD score exclusion analyses for candidate QTLs using random population samples.

    PubMed

    Deng, Hong-Wen

    2003-11-01

While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes as putative QTLs using random population samples. Previously, we developed an LOD score exclusion mapping approach for candidate genes for complex diseases. Here, we extend this LOD score approach for exclusion analyses of candidate genes for quantitative traits. Under this approach, specific genetic effects (as reflected by heritability) and inheritance models at candidate QTLs can be analyzed and if an LOD score is ≤ -2.0, the locus can be excluded from having a heritability larger than that specified. Simulations show that this approach has high power to exclude a candidate gene from having moderate genetic effects if it is not a QTL and is robust to population admixture. Our exclusion analysis complements association analysis for candidate genes as putative QTLs in random population samples. The approach is applied to test the importance of Vitamin D receptor (VDR) gene as a potential QTL underlying the variation of bone mass, an important determinant of osteoporosis.

  19. Analysis of stationary and dynamic factors affecting highway accident occurrence: A dynamic correlated grouped random parameters binary logit approach.

    PubMed

    Fountas, Grigorios; Sarwar, Md Tawfiq; Anastasopoulos, Panagiotis Ch; Blatt, Alan; Majka, Kevin

    2018-04-01

Traditional accident analysis typically explores non-time-varying (stationary) factors that affect accident occurrence on roadway segments. However, the impact of time-varying (dynamic) factors is not thoroughly investigated. This paper seeks to simultaneously identify pre-crash stationary and dynamic factors of accident occurrence, while accounting for unobserved heterogeneity. Using highly disaggregate information for the potential dynamic factors, and aggregate data for the traditional stationary elements, a dynamic binary random parameters (mixed) logit framework is employed. With this approach, the dynamic nature of weather-related, driving-condition and pavement-condition information is jointly investigated with traditional roadway geometric and traffic characteristics. To additionally account for the combined effect of the dynamic and stationary factors on accident occurrence, the developed random parameters logit framework allows for possible correlations among the random parameters. The analysis is based on crash and non-crash observations between 2011 and 2013, drawn from urban and rural highway segments in the state of Washington. The findings show that the proposed methodological framework can account for both stationary and dynamic factors affecting accident occurrence probabilities, for panel effects, for unobserved heterogeneity through the use of random parameters, and for possible correlation among the latter. The comparative evaluation among the correlated grouped random parameters logit model, the uncorrelated random parameters logit model, and their fixed parameters logit counterpart demonstrates the potential of random parameters modeling, in general, and the benefits of the correlated grouped random parameters approach, specifically, in terms of statistical fit and explanatory power. Published by Elsevier Ltd.

  20. A statistical approach to EMI - Theory and experiment

    NASA Astrophysics Data System (ADS)

    Weiner, Donald; Capraro, Gerard

    A probabilistic approach to electromagnetic interference (EMI) is presented. The approach is illustrated by analyzing an experimental circuit in which EMI occurs. Both random and weakly nonlinear effects are accounted for in the analysis.

  1. Effective Perron-Frobenius eigenvalue for a correlated random map

    NASA Astrophysics Data System (ADS)

    Pool, Roman R.; Cáceres, Manuel O.

    2010-09-01

    We investigate the evolution of random positive linear maps with various types of disorder by analytic perturbation and direct simulation. Our theoretical result indicates that the statistics of a random linear map can be successfully described at long times by the mean-value vector state. The growth rate can be characterized by an effective Perron-Frobenius eigenvalue that strongly depends on the type of correlation between the elements of the projection matrix. We apply this approach to an age-structured population dynamics model. We show that the asymptotic mean-value vector state characterizes the population growth rate when the age-structured model has random vital parameters. In this case our approach reveals the nontrivial dependence of the effective growth rate on cross correlations. The problem was reduced to the calculation of the smallest positive root of a secular polynomial, which can be obtained by perturbations in terms of a Green's function diagrammatic technique built with noncommutative cumulants for arbitrary n-point correlations.
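
    A hedged sketch of the mean-value idea (not the paper's Green's-function diagrammatic technique): average a randomly drawn age-structured (Leslie) projection matrix and take the dominant eigenvalue of the mean matrix as an effective growth rate. The vital-rate values and noise level below are hypothetical.

    ```python
    # Effective growth rate of a random 2-age-class Leslie map via the mean projection matrix.
    import numpy as np

    rng = np.random.default_rng(0)

    def random_leslie(fert_mean=(1.2, 0.8), surv_mean=0.7, noise=0.1):
        """Leslie matrix with random vital parameters (illustrative values)."""
        f = np.maximum(rng.normal(fert_mean, noise), 0.0)   # fertilities of the two classes
        s = np.clip(rng.normal(surv_mean, noise), 0.0, 1.0)  # survival to the second class
        return np.array([[f[0], f[1]],
                         [s,    0.0 ]])

    mean_A = np.mean([random_leslie() for _ in range(10_000)], axis=0)
    lam_eff = np.max(np.abs(np.linalg.eigvals(mean_A)))      # effective Perron-Frobenius eigenvalue
    print(round(lam_eff, 3))
    ```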

  2. Approaches to answering critical CER questions.

    PubMed

    Kinnier, Christine V; Chung, Jeanette W; Bilimoria, Karl Y

    2015-01-01

    While randomized controlled trials (RCTs) are the gold standard for research, many research questions cannot be ethically and practically answered using an RCT. Comparative effectiveness research (CER) techniques are often better suited than RCTs to address the effects of an intervention under routine care conditions, an outcome otherwise known as effectiveness. CER research techniques covered in this section include: effectiveness-oriented experimental studies such as pragmatic trials and cluster randomized trials, treatment response heterogeneity, observational and database studies including adjustment techniques such as sensitivity analysis and propensity score analysis, systematic reviews and meta-analysis, decision analysis, and cost effectiveness analysis. Each section describes the technique and covers the strengths and weaknesses of the approach.

  3. When They Call, Will They Come? A Contextually Responsive Approach for Engaging Multistressed Families in an Urban Child Mental Health Center: A Randomized Clinical Trial

    ERIC Educational Resources Information Center

    Stern, Susan B.; Walsh, Margaret; Mercado, Micaela; Levene, Kathryn; Pepler, Debra J.; Carr, Ashley; Heppell, Allison; Lowe, Erin

    2015-01-01

    Objective: This study examines the effect of an ecological and contextually responsive approach, during initial intake call, on engagement for multistressed families seeking child mental health services in an urban setting. Methods: Using a randomized design, parents were allocated to phone Intake As Usual (IAU) or Enhanced Engagement Phone Intake…

  4. Usefulness of Mendelian Randomization in Observational Epidemiology

    PubMed Central

    Bochud, Murielle; Rousson, Valentin

    2010-01-01

    Mendelian randomization refers to the random allocation of alleles at the time of gamete formation. In observational epidemiology, this refers to the use of genetic variants to estimate a causal effect between a modifiable risk factor and an outcome of interest. In this review, we recall the principles of a “Mendelian randomization” approach in observational epidemiology, which is based on the technique of instrumental variables; we provide simulations and an example based on real data to demonstrate its implications; we present the results of a systematic search on original articles having used this approach; and we discuss some limitations of this approach in view of what has been found so far. PMID:20616999

  5. Neither fixed nor random: weighted least squares meta-regression.

    PubMed

    Stanley, T D; Doucouliagos, Hristos

    2017-03-01

    Our study revisits and challenges two core conventional meta-regression estimators: the prevalent use of 'mixed-effects' or random-effects meta-regression analysis and the correction of standard errors that defines fixed-effects meta-regression analysis (FE-MRA). We show how and explain why an unrestricted weighted least squares MRA (WLS-MRA) estimator is superior to conventional random-effects (or mixed-effects) meta-regression when there is publication (or small-sample) bias, is as good as FE-MRA in all cases, and is better than fixed effects in most practical applications. Simulations and statistical theory show that WLS-MRA provides satisfactory estimates of meta-regression coefficients that are practically equivalent to mixed effects or random effects when there is no publication bias. When there is publication selection bias, WLS-MRA always has smaller bias than mixed effects or random effects. In practical applications, an unrestricted WLS meta-regression is likely to give practically equivalent or superior estimates to fixed-effects, random-effects, and mixed-effects meta-regression approaches. However, random-effects meta-regression remains viable and perhaps somewhat preferable if selection for statistical significance (publication bias) can be ruled out and when random, additive normal heterogeneity is known to directly affect the 'true' regression coefficient. Copyright © 2016 John Wiley & Sons, Ltd.
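
    As a minimal illustration of an unrestricted WLS meta-regression, the sketch below fits inverse-variance-weighted least squares with a freely estimated multiplicative dispersion (the default behaviour of statsmodels' WLS). Study effects, standard errors, and the moderator are hypothetical; this is not presented as the authors' exact estimator.

    ```python
    # Unrestricted WLS meta-regression sketch with inverse-variance weights.
    import numpy as np
    import statsmodels.api as sm

    effect = np.array([0.10, 0.25, 0.32, 0.05, 0.40])   # hypothetical study effect estimates
    se     = np.array([0.08, 0.12, 0.15, 0.06, 0.20])   # their standard errors
    x      = np.array([1.0, 2.0, 3.0, 1.5, 4.0])        # hypothetical moderator

    X = sm.add_constant(x)
    wls = sm.WLS(effect, X, weights=1.0 / se**2).fit()   # dispersion estimated, not fixed at 1
    print(wls.params, wls.bse)
    ```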

  6. Random attractor of non-autonomous stochastic Boussinesq lattice system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Min, E-mail: zhaomin1223@126.com; Zhou, Shengfan, E-mail: zhoushengfan@yahoo.com

    2015-09-15

    In this paper, we first consider the existence of a tempered random attractor for a second-order non-autonomous stochastic lattice dynamical system of nonlinear Boussinesq equations affected by time-dependent coupled coefficients, deterministic forces, and multiplicative white noise. Then, we establish the upper semicontinuity of random attractors as the intensity of noise approaches zero.

  7. The Effectiveness of Mandatory-Random Student Drug Testing

    ERIC Educational Resources Information Center

    James-Burdumy, Susanne; Goesling, Brian; Deke, John; Einspruch, Eric

    2011-01-01

    One approach some U.S. schools now use to combat high rates of adolescent substance use is school-based mandatory-random student drug testing (MRSDT). Under MRSDT, students and their parents sign consent forms agreeing to the students' participation in random drug testing as a condition of participating in athletics and other school-sponsored…

  8. Effectiveness of Supplemental Kindergarten Vocabulary Instruction for English Learners: A Randomized Study of Immediate and Longer-Term Effects of Two Approaches

    ERIC Educational Resources Information Center

    Vadasy, Patricia F.; Sanders, Elizabeth A.; Nelson, J. Ron

    2015-01-01

    A two-cohort cluster randomized trial was conducted to estimate effects of small-group supplemental vocabulary instruction for at-risk kindergarten English learners (ELs). "Connections" students received explicit instruction in high-frequency decodable root words, and interactive book reading (IBR) students were taught the same words in…

  9. A unified approach for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties

    NASA Astrophysics Data System (ADS)

    Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie

    2017-09-01

    Automotive brake systems are always subjected to various types of uncertainties and two types of random-fuzzy uncertainties may exist in the brakes. In this paper, a unified approach is proposed for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties. In the proposed approach, two uncertainty analysis models with mixed variables are introduced to model the random-fuzzy uncertainties. The first one is the random and fuzzy model, in which random variables and fuzzy variables exist simultaneously and independently. The second one is the fuzzy random model, in which uncertain parameters are all treated as random variables while their distribution parameters are expressed as fuzzy numbers. Firstly, the fuzziness is discretized by using α-cut technique and the two uncertainty analysis models are simplified into random-interval models. Afterwards, by temporarily neglecting interval uncertainties, the random-interval models are degraded into random models, in which the expectations, variances, reliability indexes and reliability probabilities of system stability functions are calculated. And then, by reconsidering the interval uncertainties, the bounds of the expectations, variances, reliability indexes and reliability probabilities are computed based on Taylor series expansion. Finally, by recomposing the analysis results at each α-cut level, the fuzzy reliability indexes and probabilities can be obtained, by which the brake squeal instability can be evaluated. The proposed approach gives a general framework to deal with both types of random-fuzzy uncertainties that may exist in the brakes and its effectiveness is demonstrated by numerical examples. It will be a valuable supplement to the systematic study of brake squeal considering uncertainty.
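
    A hedged sketch of the α-cut step only: a triangular fuzzy parameter is discretized at several membership levels, and interval bounds of a toy response statistic are computed at each level. The fuzzy number, the response function, and all values are assumptions of this sketch; the brake squeal model itself is omitted.

    ```python
    # Alpha-cut discretization of a triangular fuzzy parameter and interval bounds of a toy response.
    def alpha_cut_triangular(a, b, c, alpha):
        """Interval of a triangular fuzzy number (a, b, c) at membership level alpha."""
        return a + alpha * (b - a), c - alpha * (c - b)

    def response_mean(mu_friction):
        """Toy stand-in for the expectation of a stability function of the system."""
        return 1.0 - 2.0 * mu_friction

    for alpha in (0.0, 0.5, 1.0):
        lo, hi = alpha_cut_triangular(0.3, 0.4, 0.5, alpha)
        bounds = sorted([response_mean(lo), response_mean(hi)])
        print(f"alpha = {alpha:.1f}  response bounds = [{bounds[0]:.2f}, {bounds[1]:.2f}]")
    ```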

  10. Estimation of treatment effect in a subpopulation: An empirical Bayes approach.

    PubMed

    Shen, Changyu; Li, Xiaochun; Jeong, Jaesik

    2016-01-01

    It is well recognized that the benefit of a medical intervention may not be distributed evenly in the target population due to patient heterogeneity, and conclusions based on conventional randomized clinical trials may not apply to every person. Given the increasing cost of randomized trials and difficulties in recruiting patients, there is a strong need to develop analytical approaches to estimate treatment effect in subpopulations. In particular, due to limited sample size for subpopulations and the need for multiple comparisons, standard analysis tends to yield wide confidence intervals of the treatment effect that are often noninformative. We propose an empirical Bayes approach to combine both information embedded in a target subpopulation and information from other subjects to construct confidence intervals of the treatment effect. The method is appealing in its simplicity and tangibility in characterizing the uncertainty about the true treatment effect. Simulation studies and a real data analysis are presented.
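
    A minimal normal-normal shrinkage sketch of the empirical Bayes idea: the subgroup estimate is pulled toward the overall estimate with a weight determined by the between-subgroup variance. All numbers are hypothetical, and the interval construction is a simplification, not the authors' procedure.

    ```python
    # Empirical Bayes style shrinkage of a subgroup treatment effect toward the overall effect.
    import numpy as np

    def eb_shrink(y_sub, se_sub, y_all, tau2):
        """tau2 is the between-subgroup variance (assumed estimated from all subgroups)."""
        w = tau2 / (tau2 + se_sub**2)      # weight on the subgroup's own data
        est = w * y_sub + (1 - w) * y_all
        var = w * se_sub**2                # approximate posterior variance
        return est, 1.96 * np.sqrt(var)

    est, half_width = eb_shrink(y_sub=0.35, se_sub=0.20, y_all=0.15, tau2=0.01)
    print(est, est - half_width, est + half_width)
    ```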

  11. A Poisson approach to the validation of failure time surrogate endpoints in individual patient data meta-analyses.

    PubMed

    Rotolo, Federico; Paoletti, Xavier; Burzykowski, Tomasz; Buyse, Marc; Michiels, Stefan

    2017-01-01

    Surrogate endpoints are often used in clinical trials instead of well-established hard endpoints for practical convenience. The meta-analytic approach relies on two measures of surrogacy: one at the individual level and one at the trial level. In the survival data setting, a two-step model based on copulas is commonly used. We present a new approach which employs a bivariate survival model with an individual random effect shared between the two endpoints and correlated treatment-by-trial interactions. We fit this model using auxiliary mixed Poisson models. We study via simulations the operating characteristics of this mixed Poisson approach as compared to the two-step copula approach. We illustrate the application of the methods on two individual patient data meta-analyses in gastric cancer, in the advanced setting (4069 patients from 20 randomized trials) and in the adjuvant setting (3288 patients from 14 randomized trials).

  12. A pattern-mixture model approach for handling missing continuous outcome data in longitudinal cluster randomized trials.

    PubMed

    Fiero, Mallorie H; Hsu, Chiu-Hsieh; Bell, Melanie L

    2017-11-20

    We extend the pattern-mixture approach to handle missing continuous outcome data in longitudinal cluster randomized trials, which randomize groups of individuals to treatment arms, rather than the individuals themselves. Individuals who drop out at the same time point are grouped into the same dropout pattern. We approach extrapolation of the pattern-mixture model by applying multilevel multiple imputation, which imputes missing values while appropriately accounting for the hierarchical data structure found in cluster randomized trials. To assess parameters of interest under various missing data assumptions, imputed values are multiplied by a sensitivity parameter, k, which increases or decreases imputed values. Using simulated data, we show that estimates of parameters of interest can vary widely under differing missing data assumptions. We conduct a sensitivity analysis using real data from a cluster randomized trial by increasing k until the treatment effect inference changes. By performing a sensitivity analysis for missing data, researchers can assess whether certain missing data assumptions are reasonable for their cluster randomized trial. Copyright © 2017 John Wiley & Sons, Ltd.
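
    The sensitivity-parameter idea can be illustrated with a toy example: multiply imputed outcomes in one arm by k and track how the estimated treatment effect moves. The data, the simple mean-difference estimand, and the single-level imputation stand-in are assumptions of this sketch; the paper uses multilevel multiple imputation.

    ```python
    # Delta-type sensitivity sweep over the multiplier k applied to imputed outcomes.
    import numpy as np

    rng = np.random.default_rng(1)
    observed_treat   = rng.normal(1.0, 1.0, 80)
    observed_control = rng.normal(0.5, 1.0, 80)
    imputed_treat    = rng.normal(1.0, 1.0, 20)   # stand-in for multiply imputed dropouts

    for k in (1.0, 0.8, 0.6, 0.4):                # shrink imputed outcomes in the treated arm
        treat = np.concatenate([observed_treat, k * imputed_treat])
        effect = treat.mean() - observed_control.mean()
        print(f"k = {k:.1f}  estimated treatment effect = {effect:.3f}")
    ```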

  13. Can Rational Prescribing Be Improved by an Outcome-Based Educational Approach? A Randomized Trial Completed in Iran

    ERIC Educational Resources Information Center

    Esmaily, Hamideh M.; Silver, Ivan; Shiva, Shadi; Gargani, Alireza; Maleki-Dizaji, Nasrin; Al-Maniri, Abdullah; Wahlstrom, Rolf

    2010-01-01

    Introduction: An outcome-based education approach has been proposed to develop more effective continuing medical education (CME) programs. We have used this approach in developing an outcome-based educational intervention for general physicians working in primary care (GPs) and evaluated its effectiveness compared with a concurrent CME program in…

  14. Random effects coefficient of determination for mixed and meta-analysis models

    PubMed Central

    Demidenko, Eugene; Sargent, James; Onega, Tracy

    2011-01-01

    The key feature of a mixed model is the presence of random effects. We have developed a coefficient, called the random effects coefficient of determination, Rr2, that estimates the proportion of the conditional variance of the dependent variable explained by random effects. This coefficient takes values from 0 to 1 and indicates how strong the random effects are. The difference from the earlier suggested fixed effects coefficient of determination is emphasized. If Rr2 is close to 0, there is weak support for random effects in the model because the reduction of the variance of the dependent variable due to random effects is small; consequently, random effects may be ignored and the model simplifies to standard linear regression. The value of Rr2 apart from 0 indicates the evidence of the variance reduction in support of the mixed model. If random effects coefficient of determination is close to 1 the variance of random effects is very large and random effects turn into free fixed effects—the model can be estimated using the dummy variable approach. We derive explicit formulas for Rr2 in three special cases: the random intercept model, the growth curve model, and meta-analysis model. Theoretical results are illustrated with three mixed model examples: (1) travel time to the nearest cancer center for women with breast cancer in the U.S., (2) cumulative time watching alcohol related scenes in movies among young U.S. teens, as a risk factor for early drinking onset, and (3) the classic example of the meta-analysis model for combination of 13 studies on tuberculosis vaccine. PMID:23750070
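
    As a hedged illustration for the random-intercept case, one natural version of such a coefficient is the share of the conditional variance attributable to the random intercepts, sigma_b^2 / (sigma_b^2 + sigma_e^2); the paper derives the exact formulas, and this simplification, like the simulated data, is an assumption of the sketch.

    ```python
    # Variance share explained by random intercepts in a simulated random-intercept model.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    groups = np.repeat(np.arange(30), 10)
    b = rng.normal(0, 1.0, 30)[groups]                  # random intercepts (sd = 1.0)
    y = 2.0 + b + rng.normal(0, 0.5, 300)               # residual sd = 0.5
    df = pd.DataFrame({"y": y, "g": groups})

    fit = smf.mixedlm("y ~ 1", df, groups=df["g"]).fit()
    sigma_b2 = float(fit.cov_re.iloc[0, 0])             # estimated random-intercept variance
    sigma_e2 = fit.scale                                 # estimated residual variance
    print(sigma_b2 / (sigma_b2 + sigma_e2))              # close to 1 / (1 + 0.25) = 0.8
    ```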

  15. An in silico approach helped to identify the best experimental design, population, and outcome for future randomized clinical trials.

    PubMed

    Bajard, Agathe; Chabaud, Sylvie; Cornu, Catherine; Castellan, Anne-Charlotte; Malik, Salma; Kurbatova, Polina; Volpert, Vitaly; Eymard, Nathalie; Kassai, Behrouz; Nony, Patrice

    2016-01-01

    The main objective of our work was to compare different randomized clinical trial (RCT) experimental designs in terms of power, accuracy of the estimation of treatment effect, and number of patients receiving active treatment using in silico simulations. A virtual population of patients was simulated and randomized in potential clinical trials. Treatment effect was modeled using a dose-effect relation for quantitative or qualitative outcomes. Different experimental designs were considered, and performances between designs were compared. One thousand clinical trials were simulated for each design based on an example of modeled disease. According to simulation results, the number of patients needed to reach 80% power was 50 for crossover, 60 for parallel or randomized withdrawal, 65 for drop the loser (DL), and 70 for early escape or play the winner (PW). For a given sample size, each design had its own advantage: low duration (parallel, early escape), high statistical power and precision (crossover), and higher number of patients receiving the active treatment (PW and DL). Our approach can help to identify the best experimental design, population, and outcome for future RCTs. This may be particularly useful for drug development in rare diseases, theragnostic approaches, or personalized medicine. Copyright © 2016 Elsevier Inc. All rights reserved.
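
    A minimal sketch of the simulation idea for one design only (a two-arm parallel trial with a quantitative outcome): repeatedly draw virtual patients, test the arm difference, and report the fraction of significant trials as power. The effect size, variance, and alpha below are illustrative assumptions, not the paper's virtual-population model.

    ```python
    # Monte Carlo power estimate for a two-arm parallel design.
    import numpy as np
    from scipy.stats import ttest_ind

    rng = np.random.default_rng(4)

    def power_parallel(n_per_arm, effect=0.6, sd=1.0, n_trials=1000, alpha=0.05):
        hits = 0
        for _ in range(n_trials):
            control = rng.normal(0.0, sd, n_per_arm)
            treated = rng.normal(effect, sd, n_per_arm)
            if ttest_ind(treated, control).pvalue < alpha:
                hits += 1
        return hits / n_trials

    print(power_parallel(30))   # estimated power with 30 virtual patients per arm
    ```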

  16. On the apparent insignificance of the randomness of flexible joints on large space truss dynamics

    NASA Technical Reports Server (NTRS)

    Koch, R. M.; Klosner, J. M.

    1993-01-01

    Deployable periodic large space structures have been shown to exhibit high dynamic sensitivity to period-breaking imperfections and uncertainties. These can be brought on by manufacturing or assembly errors, structural imperfections, as well as nonlinear and/or nonconservative joint behavior. In addition, the necessity of precise pointing and position capability can require the consideration of these usually negligible and unknown parametric uncertainties and their effect on the overall dynamic response of large space structures. This work describes the use of a new design approach for the global dynamic solution of beam-like periodic space structures possessing parametric uncertainties. Specifically, the effect of random flexible joints on the free vibrations of simply-supported periodic large space trusses is considered. The formulation is a hybrid approach in terms of an extended Timoshenko beam continuum model, Monte Carlo simulation scheme, and first-order perturbation methods. The mean and mean-square response statistics for a variety of free random vibration problems are derived for various input random joint stiffness probability distributions. The results of this effort show that, although joint flexibility has a substantial effect on the modal dynamic response of periodic large space trusses, the effect of any reasonable uncertainty or randomness associated with these joint flexibilities is insignificant.
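
    As a toy illustration of the Monte Carlo ingredient only, the sketch below propagates randomness in a joint stiffness through a single-degree-of-freedom surrogate frequency omega = sqrt(k/m); it is not the extended Timoshenko beam continuum formulation, and all values are hypothetical.

    ```python
    # Monte Carlo propagation of random joint stiffness through a 1-DOF frequency surrogate.
    import numpy as np

    rng = np.random.default_rng(5)
    m = 10.0                                      # modal mass (kg), illustrative
    k = rng.normal(1.0e5, 5.0e3, 100_000)         # joint stiffness with 5% randomness (N/m)
    omega = np.sqrt(k / m)

    # Mean frequency and its coefficient of variation (roughly half that of the stiffness).
    print(omega.mean(), omega.std() / omega.mean())
    ```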

  17. The Random-Effect DINA Model

    ERIC Educational Resources Information Center

    Huang, Hung-Yu; Wang, Wen-Chung

    2014-01-01

    The DINA (deterministic input, noisy, and gate) model has been widely used in cognitive diagnosis tests and in the process of test development. The outcomes known as slip and guess are included in the DINA model function representing the responses to the items. This study aimed to extend the DINA model by using the random-effect approach to allow…

  18. Application of random effects to the study of resource selection by animals

    USGS Publications Warehouse

    Gillies, C.S.; Hebblewhite, M.; Nielsen, S.E.; Krawchuk, M.A.; Aldridge, Cameron L.; Frair, J.L.; Saher, D.J.; Stevens, C.E.; Jerde, C.L.

    2006-01-01

    1. Resource selection estimated by logistic regression is used increasingly in studies to identify critical resources for animal populations and to predict species occurrence. 2. Most frequently, individual animals are monitored and pooled to estimate population-level effects without regard to group or individual-level variation. Pooling assumes that both observations and their errors are independent, and resource selection is constant given individual variation in resource availability. 3. Although researchers have identified ways to minimize autocorrelation, variation between individuals caused by differences in selection or available resources, including functional responses in resource selection, have not been well addressed. 4. Here we review random-effects models and their application to resource selection modelling to overcome these common limitations. We present a simple case study of an analysis of resource selection by grizzly bears in the foothills of the Canadian Rocky Mountains with and without random effects. 5. Both categorical and continuous variables in the grizzly bear model differed in interpretation, both in statistical significance and coefficient sign, depending on how a random effect was included. We used a simulation approach to clarify the application of random effects under three common situations for telemetry studies: (a) discrepancies in sample sizes among individuals; (b) differences among individuals in selection where availability is constant; and (c) differences in availability with and without a functional response in resource selection. 6. We found that random intercepts accounted for unbalanced sample designs, and models with random intercepts and coefficients improved model fit given the variation in selection among individuals and functional responses in selection. Our empirical example and simulations demonstrate how including random effects in resource selection models can aid interpretation and address difficult assumptions limiting their generality. This approach will allow researchers to appropriately estimate marginal (population) and conditional (individual) responses, and account for complex grouping, unbalanced sample designs and autocorrelation.

  19. Application of random effects to the study of resource selection by animals.

    PubMed

    Gillies, Cameron S; Hebblewhite, Mark; Nielsen, Scott E; Krawchuk, Meg A; Aldridge, Cameron L; Frair, Jacqueline L; Saher, D Joanne; Stevens, Cameron E; Jerde, Christopher L

    2006-07-01

    1. Resource selection estimated by logistic regression is used increasingly in studies to identify critical resources for animal populations and to predict species occurrence. 2. Most frequently, individual animals are monitored and pooled to estimate population-level effects without regard to group or individual-level variation. Pooling assumes that both observations and their errors are independent, and resource selection is constant given individual variation in resource availability. 3. Although researchers have identified ways to minimize autocorrelation, variation between individuals caused by differences in selection or available resources, including functional responses in resource selection, have not been well addressed. 4. Here we review random-effects models and their application to resource selection modelling to overcome these common limitations. We present a simple case study of an analysis of resource selection by grizzly bears in the foothills of the Canadian Rocky Mountains with and without random effects. 5. Both categorical and continuous variables in the grizzly bear model differed in interpretation, both in statistical significance and coefficient sign, depending on how a random effect was included. We used a simulation approach to clarify the application of random effects under three common situations for telemetry studies: (a) discrepancies in sample sizes among individuals; (b) differences among individuals in selection where availability is constant; and (c) differences in availability with and without a functional response in resource selection. 6. We found that random intercepts accounted for unbalanced sample designs, and models with random intercepts and coefficients improved model fit given the variation in selection among individuals and functional responses in selection. Our empirical example and simulations demonstrate how including random effects in resource selection models can aid interpretation and address difficult assumptions limiting their generality. This approach will allow researchers to appropriately estimate marginal (population) and conditional (individual) responses, and account for complex grouping, unbalanced sample designs and autocorrelation.

  20. Application of Poisson random effect models for highway network screening.

    PubMed

    Jiang, Ximiao; Abdel-Aty, Mohamed; Alamili, Samer

    2014-02-01

    In recent years, Bayesian random effect models that account for the temporal and spatial correlations of crash data became popular in traffic safety research. This study employs random effect Poisson Log-Normal models for crash risk hotspot identification. Both the temporal and spatial correlations of crash data were considered. Potential for Safety Improvement (PSI) was adopted as a measure of the crash risk. Using the fatal and injury crashes that occurred on urban 4-lane divided arterials from 2006 to 2009 in the Central Florida area, the random effect approaches were compared to the traditional Empirical Bayesian (EB) method and the conventional Bayesian Poisson Log-Normal model. A series of method examination tests were conducted to evaluate the performance of different approaches. These tests include the previously developed site consistence test, method consistence test, total rank difference test, and the modified total score test, as well as the newly proposed total safety performance measure difference test. Results show that the Bayesian Poisson model accounting for both temporal and spatial random effects (PTSRE) outperforms the model with only a temporal random effect, and both are superior to the conventional Poisson Log-Normal model (PLN) and the EB model in the fitting of crash data. Additionally, the method evaluation tests indicate that the PTSRE model is significantly superior to the PLN model and the EB model in consistently identifying hotspots during successive time periods. The results suggest that the PTSRE model is a superior alternative for road site crash risk hotspot identification. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Cluster randomized trials in comparative effectiveness research: randomizing hospitals to test methods for prevention of healthcare-associated infections.

    PubMed

    Platt, Richard; Takvorian, Samuel U; Septimus, Edward; Hickok, Jason; Moody, Julia; Perlin, Jonathan; Jernigan, John A; Kleinman, Ken; Huang, Susan S

    2010-06-01

    The need for evidence about the effectiveness of therapeutics and other medical practices has triggered new interest in methods for comparative effectiveness research. We describe an approach to comparative effectiveness research involving cluster randomized trials in networks of hospitals, health plans, or medical practices with centralized administrative and informatics capabilities. We discuss the example of an ongoing cluster randomized trial to prevent methicillin-resistant Staphylococcus aureus (MRSA) infection in intensive care units (ICUs). The trial randomizes 45 hospitals to: (a) screening cultures of ICU admissions, followed by Contact Precautions if MRSA-positive, (b) screening cultures of ICU admissions followed by decolonization if MRSA-positive, or (c) universal decolonization of ICU admissions without screening. The study population comprises all admissions to adult ICUs. The primary outcome is MRSA-positive clinical cultures occurring ≥2 days following ICU admission. Secondary outcomes include blood and urine infection caused by MRSA (and, separately, all pathogens), as well as the development of resistance to decolonizing agents. Recruitment of hospitals is complete. Data collection will end in Summer 2011. This trial takes advantage of existing personnel, procedures, infrastructure, and information systems in a large integrated hospital network to conduct a low-cost evaluation of prevention strategies under usual practice conditions. This approach is applicable to many comparative effectiveness topics in both inpatient and ambulatory settings.

  2. A randomization approach to handling data scaling in nuclear medicine.

    PubMed

    Bai, Chuanyong; Conwell, Richard; Kindem, Joel

    2010-06-01

    In medical imaging, data scaling is sometimes desired to handle the system complexity, such as uniformity calibration. Since the data are usually saved in short integer, conventional data scaling will first scale the data in floating point format and then truncate or round the floating point data to short integer data. For example, when using truncation, scaling of 9 by 1.1 results in 9 and scaling of 10 by 1.1 results in 11. When the count level is low, such scaling may change the local data distribution and affect the intended application of the data. In this work, the authors use an example gated cardiac SPECT study to illustrate the effect of conventional scaling by factors of 1.1 and 1.2. The authors then scaled the data with the same scaling factors using a randomization approach, in which a random number evenly distributed between 0 and 1 is generated to determine how the floating point data will be saved as short integer data. If the random number is between 0 and 0.9, then 9.9 will be saved as 10, otherwise 9. In other words, the floating point value 9.9 will be saved in short integer value as 10 with 90% probability or 9 with 10% probability. For statistical analysis of the performance, the authors applied the conventional approach with rounding and the randomization approach to 50 consecutive gated studies from a clinical site. For the example study, the image reconstructed from the original data showed an apparent perfusion defect at the apex of the myocardium. The defect size was noticeably changed by scaling with 1.1 and 1.2 using the conventional approaches with truncation and rounding. Using the randomization approach, in contrast, the images from the scaled data appeared identical to the original image. Line profile analysis of the scaled data showed that the randomization approach introduced the least change to the data as compared to the conventional approaches. For the 50 gated data sets, significantly more studies showed quantitative differences between the original images and the images from the data scaled by 1.2 using the rounding approach than the randomization approach [46/50 (92%) versus 3/50 (6%), p < 0.05]. Likewise, significantly more studies showed visually noticeable differences between the original images and the images from the data scaled by 1.2 using the rounding approach than randomization [29/50 (58%) versus 1/50 (2%), p < 0.05]. In conclusion, the proposed randomization approach minimizes the scaling-introduced local data change as compared to the conventional approaches. It is preferred for nuclear medicine data scaling.
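
    A minimal sketch of the randomization (stochastic rounding) idea described above, assuming short-integer (int16) storage; the function and parameter names are illustrative, not taken from the paper.

    ```python
    # Stochastic rounding so that the expected stored value equals the floating-point product,
    # e.g. 9 * 1.1 = 9.9 is stored as 10 with probability 0.9 and as 9 with probability 0.1.
    import numpy as np

    def scale_stochastic(counts, factor, rng=np.random.default_rng()):
        scaled = counts.astype(np.float64) * factor
        floor = np.floor(scaled)
        frac = scaled - floor
        up = rng.random(scaled.shape) < frac          # round up with probability = fractional part
        return (floor + up).astype(np.int16)

    counts = np.array([9, 10, 3, 0], dtype=np.int16)
    print(scale_stochastic(counts, 1.1))
    ```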

  3. Random effects coefficient of determination for mixed and meta-analysis models.

    PubMed

    Demidenko, Eugene; Sargent, James; Onega, Tracy

    2012-01-01

    The key feature of a mixed model is the presence of random effects. We have developed a coefficient, called the random effects coefficient of determination, Rr2, that estimates the proportion of the conditional variance of the dependent variable explained by random effects. This coefficient takes values from 0 to 1 and indicates how strong the random effects are. The difference from the earlier suggested fixed effects coefficient of determination is emphasized. If Rr2 is close to 0, there is weak support for random effects in the model because the reduction of the variance of the dependent variable due to random effects is small; consequently, random effects may be ignored and the model simplifies to standard linear regression. The value of Rr2 apart from 0 indicates the evidence of the variance reduction in support of the mixed model. If random effects coefficient of determination is close to 1 the variance of random effects is very large and random effects turn into free fixed effects-the model can be estimated using the dummy variable approach. We derive explicit formulas for Rr2 in three special cases: the random intercept model, the growth curve model, and meta-analysis model. Theoretical results are illustrated with three mixed model examples: (1) travel time to the nearest cancer center for women with breast cancer in the U.S., (2) cumulative time watching alcohol related scenes in movies among young U.S. teens, as a risk factor for early drinking onset, and (3) the classic example of the meta-analysis model for combination of 13 studies on tuberculosis vaccine.

  4. Assessing variance components in multilevel linear models using approximate Bayes factors: A case study of ethnic disparities in birthweight

    PubMed Central

    Saville, Benjamin R.; Herring, Amy H.; Kaufman, Jay S.

    2013-01-01

    Racial/ethnic disparities in birthweight are a large source of differential morbidity and mortality worldwide and have remained largely unexplained in epidemiologic models. We assess the impact of maternal ancestry and census tract residence on infant birth weights in New York City and the modifying effects of race and nativity by incorporating random effects in a multilevel linear model. Evaluating the significance of these predictors involves the test of whether the variances of the random effects are equal to zero. This is problematic because the null hypothesis lies on the boundary of the parameter space. We generalize an approach for assessing random effects in the two-level linear model to a broader class of multilevel linear models by scaling the random effects to the residual variance and introducing parameters that control the relative contribution of the random effects. After integrating over the random effects and variance components, the resulting integrals needed to calculate the Bayes factor can be efficiently approximated with Laplace’s method. PMID:24082430

  5. Random Time Identity Based Firewall In Mobile Ad hoc Networks

    NASA Astrophysics Data System (ADS)

    Suman, Patel, R. B.; Singh, Parvinder

    2010-11-01

    A mobile ad hoc network (MANET) is a self-organizing network of mobile routers and associated hosts connected by wireless links. MANETs are highly flexible and adaptable but at the same time are highly prone to security risks due to the open medium, dynamically changing network topology, cooperative algorithms, and lack of centralized control. A firewall is an effective means of protecting a local network from network-based security threats and forms a key component in the MANET security architecture. This paper presents a review of firewall implementation techniques in MANETs and their relative merits and demerits. A new approach is proposed to select MANET nodes at random for firewall implementation. This approach randomly selects a new node as the firewall after a fixed time, based on critical values of certain parameters such as power backup. This approach effectively balances power and resource utilization across the entire MANET because the responsibility of implementing the firewall is shared equally among all the nodes. At the same time it ensures improved security for MANETs from outside attacks, as an intruder will not be able to find the entry point into the MANET due to the random selection of nodes for firewall implementation.

  6. Revisiting Fixed- and Random-Effects Models: Some Considerations for Policy-Relevant Education Research

    ERIC Educational Resources Information Center

    Clarke, Paul; Crawford, Claire; Steele, Fiona; Vignoles, Anna

    2015-01-01

    The use of fixed (FE) and random effects (RE) in two-level hierarchical linear regression is discussed in the context of education research. We compare the robustness of FE models with the modelling flexibility and potential efficiency of those from RE models. We argue that the two should be seen as complementary approaches. We then compare both…

  7. Massage Therapy for Pain and Function in Patients With Arthritis: A Systematic Review of Randomized Controlled Trials.

    PubMed

    Nelson, Nicole L; Churilla, James R

    2017-09-01

    Massage therapy is gaining interest as a therapeutic approach to managing osteoarthritis and rheumatoid arthritis symptoms. To date, there have been no systematic reviews investigating the effects of massage therapy on these conditions. Systematic review was used. The primary aim of this review was to critically appraise and synthesize the current evidence regarding the effects of massage therapy as a stand-alone treatment on pain and functional outcomes among those with osteoarthritis or rheumatoid arthritis. Relevant randomized controlled trials were searched using the electronic databases Google Scholar, MEDLINE, and PEDro. The PEDro scale was used to assess risk of bias, and the quality of evidence was assessed with the GRADE approach. This review found seven randomized controlled trials representing 352 participants who satisfied the inclusion criteria. Risk of bias ranged from four to seven. Our results found low- to moderate-quality evidence that massage therapy is superior to nonactive therapies in reducing pain and improving certain functional outcomes. It is unclear whether massage therapy is more effective than other forms of treatment. There is a need for large, methodologically rigorous randomized controlled trials investigating the effectiveness of massage therapy as an intervention for individuals with arthritis.

  8. A Randomized Controlled Trial of Two Syntactic Treatment Procedures With Cantonese-Speaking, School-Age Children With Language Disorders.

    PubMed

    To, Carol K S; Lui, Hoi Ming; Li, Xin Xin; Lam, Gary Y H

    2015-08-01

    In this study, we aimed to evaluate the efficacy of sentence-combining (SC) and narrative-based (NAR) intervention approaches to syntax intervention using a randomized-controlled-trial design. Fifty-two Cantonese-speaking, school-age children with language impairment were assigned randomly to either the SC or the NAR treatment arm. Seven children did not receive treatment as assigned. Intervention in both arms targeted the same complex syntactical structures. The SC group focused on sentence combination training, whereas the NAR group made use of narratives in which the target structures were embedded. Pretest and posttest performances measured using a standardized language assessment were subjected to mixed-effect-model analyses of covariance. Children in both treatment arms demonstrated significant growth after 4 months of intervention. The interaction between treatment arm and time was not significant after controlling for pretest performance, suggesting that both treatment approaches showed similar effects. The main effect of time was significant. This study provided evidence to support language intervention in the school years in Cantonese-speaking children. However, neither approach was shown to be more efficacious than the other. Future researchers could examine the effects of a longer treatment period and include functional outcome measures.

  9. Random-effects meta-analysis: the number of studies matters.

    PubMed

    Guolo, Annamaria; Varin, Cristiano

    2017-06-01

    This paper investigates the impact of the number of studies on meta-analysis and meta-regression within the random-effects model framework. It is frequently neglected that inference in random-effects models requires a substantial number of studies included in meta-analysis to guarantee reliable conclusions. Several authors warn about the risk of inaccurate results of the traditional DerSimonian and Laird approach especially in the common case of meta-analysis involving a limited number of studies. This paper presents a selection of likelihood and non-likelihood methods for inference in meta-analysis proposed to overcome the limitations of the DerSimonian and Laird procedure, with a focus on the effect of the number of studies. The applicability and the performance of the methods are investigated in terms of Type I error rates and empirical power to detect effects, according to scenarios of practical interest. Simulation studies and applications to real meta-analyses highlight that it is not possible to identify an approach uniformly superior to alternatives. The overall recommendation is to avoid the DerSimonian and Laird method when the number of meta-analysis studies is modest and prefer a more comprehensive procedure that compares alternative inferential approaches. R code for meta-analysis according to all of the inferential methods examined in the paper is provided.
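
    For reference, a minimal DerSimonian and Laird computation on hypothetical study data is sketched below; the paper's point is precisely that such estimates can be unreliable when the number of studies is small.

    ```python
    # DerSimonian-Laird random-effects pooling of hypothetical study effects.
    import numpy as np

    y = np.array([0.30, 0.10, 0.45, 0.20])   # study effect estimates
    v = np.array([0.02, 0.03, 0.05, 0.04])   # within-study variances

    w = 1.0 / v
    q = np.sum(w * (y - np.sum(w * y) / np.sum(w))**2)      # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)                  # DL between-study variance

    w_star = 1.0 / (v + tau2)
    mu_hat = np.sum(w_star * y) / np.sum(w_star)             # pooled effect
    se_hat = np.sqrt(1.0 / np.sum(w_star))
    print(mu_hat, se_hat)
    ```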

  10. Using Non-experimental Data to Estimate Treatment Effects

    PubMed Central

    Stuart, Elizabeth A.; Marcus, Sue M.; Horvitz-Lennon, Marcela V.; Gibbons, Robert D.; Normand, Sharon-Lise T.

    2009-01-01

    While much psychiatric research is based on randomized controlled trials (RCTs), where patients are randomly assigned to treatments, sometimes RCTs are not feasible. This paper describes propensity score approaches, which are increasingly used for estimating treatment effects in non-experimental settings. The primary goal of propensity score methods is to create sets of treated and comparison subjects who look as similar as possible, in essence replicating a randomized experiment, at least with respect to observed patient characteristics. A study to estimate the metabolic effects of antipsychotic medication in a sample of Florida Medicaid beneficiaries with schizophrenia illustrates methods. PMID:20563313

  11. Estimation of treatment efficacy with complier average causal effects (CACE) in a randomized stepped wedge trial.

    PubMed

    Gruber, Joshua S; Arnold, Benjamin F; Reygadas, Fermin; Hubbard, Alan E; Colford, John M

    2014-05-01

    Complier average causal effects (CACE) estimate the impact of an intervention among treatment compliers in randomized trials. Methods used to estimate CACE have been outlined for parallel-arm trials (e.g., using an instrumental variables (IV) estimator) but not for other randomized study designs. Here, we propose a method for estimating CACE in randomized stepped wedge trials, where experimental units cross over from control conditions to intervention conditions in a randomized sequence. We illustrate the approach with a cluster-randomized drinking water trial conducted in rural Mexico from 2009 to 2011. Additionally, we evaluated the plausibility of assumptions required to estimate CACE using the IV approach, which are testable in stepped wedge trials but not in parallel-arm trials. We observed small increases in the magnitude of CACE risk differences compared with intention-to-treat estimates for drinking water contamination (risk difference (RD) = -22% (95% confidence interval (CI): -33, -11) vs. RD = -19% (95% CI: -26, -12)) and diarrhea (RD = -0.8% (95% CI: -2.1, 0.4) vs. RD = -0.1% (95% CI: -1.1, 0.9)). Assumptions required for IV analysis were probably violated. Stepped wedge trials allow investigators to estimate CACE with an approach that avoids the stronger assumptions required for CACE estimation in parallel-arm trials. Inclusion of CACE estimates in stepped wedge trials with imperfect compliance could enhance reporting and interpretation of the results of such trials.
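
    A hedged sketch of the instrumental-variables (Wald-type) intuition: the intention-to-treat effect is scaled by the difference in compliance between arms. The numbers are hypothetical rather than the trial's data, and the stepped wedge machinery is omitted.

    ```python
    # Wald-type CACE: ITT effect divided by the difference in compliance between arms.
    itt_effect         = -0.19   # hypothetical intention-to-treat risk difference
    compliance_treat   = 0.85    # proportion using the intervention when assigned to it
    compliance_control = 0.02    # contamination in the control condition

    cace = itt_effect / (compliance_treat - compliance_control)
    print(round(cace, 3))        # approximately -0.229
    ```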

  12. Evaluation of uncertainty in the adjustment of fundamental constants

    NASA Astrophysics Data System (ADS)

    Bodnar, Olha; Elster, Clemens; Fischer, Joachim; Possolo, Antonio; Toman, Blaza

    2016-02-01

    Combining multiple measurement results for the same quantity is an important task in metrology and in many other areas. Examples include the determination of fundamental constants, the calculation of reference values in interlaboratory comparisons, or the meta-analysis of clinical studies. However, neither the GUM nor its supplements give any guidance for this task. Various approaches are applied such as weighted least-squares in conjunction with the Birge ratio or random effects models. While the former approach, which is based on a location-scale model, is particularly popular in metrology, the latter represents a standard tool used in statistics for meta-analysis. We investigate the reliability and robustness of the location-scale model and the random effects model with particular focus on resulting coverage or credible intervals. The interval estimates are obtained by adopting a Bayesian point of view in conjunction with a non-informative prior that is determined by a currently favored principle for selecting non-informative priors. Both approaches are compared by applying them to simulated data as well as to data for the Planck constant and the Newtonian constant of gravitation. Our results suggest that the proposed Bayesian inference based on the random effects model is more reliable and less sensitive to model misspecifications than the approach based on the location-scale model.

  13. Comparison of Cognitive Orientation to daily Occupational Performance and conventional occupational therapy on occupational performance in individuals with stroke: A randomized controlled trial.

    PubMed

    Ahn, Si-Nae; Yoo, Eun-Young; Jung, Min-Ye; Park, Hae-Yean; Lee, Ji-Yeon; Choi, Yoo-Im

    2017-01-01

    The Cognitive Orientation to daily Occupational Performance (CO-OP) approach is a cognitive-strategy-based approach in occupational therapy. The aim was to investigate the effects of the CO-OP approach on occupational performance in individuals with hemiparetic stroke. This study was designed as a 5-week, randomized, single-blind trial. Forty-three participants who had a diagnosis of first stroke were enrolled in this study. The participants were randomly assigned to the experimental group (n = 20) or the control group (n = 23). The experimental group received the CO-OP approach, while the control group received conventional occupational therapy based on occupational performance components. This study measured the Canadian Occupational Performance Measure (COPM) and the Performance Quality Rating Scale (PQRS). Outcome measurements were performed at baseline and post-intervention. After training, COPM and PQRS scores for the trained tasks were significantly higher in the experimental group than in the control group. Scores for the non-trained tasks were also significantly higher in the experimental group than in the control group on both the COPM and the PQRS. This study suggests that the CO-OP approach has beneficial effects on occupational performance in individuals with hemiparetic stroke, and positive effects on the generalization and transfer of acquired skills.

  14. 'Mendelian randomization': an approach for exploring causal relations in epidemiology.

    PubMed

    Gupta, V; Walia, G K; Sachdeva, M P

    2017-04-01

    To assess the current status of the Mendelian randomization (MR) approach in effectively influencing observational epidemiology for examining causal relationships. Narrative review of studies related to the principle, strengths, limitations, and achievements of the MR approach. Observational epidemiological studies have repeatedly produced several apparently beneficial associations which were discarded when tested by standard randomized controlled trials (RCTs). The technique which is more feasible, highly similar to RCTs, and has the potential to establish a causal relationship between modifiable exposures and disease outcomes is known as MR. The technique uses genetic variants related to modifiable traits/exposures as instruments for detecting causal and directional associations with outcomes. In the last decade, the MR approach has methodologically developed and progressed to a stage of high acceptance among epidemiologists and is gradually expanding the landscape of causal relationships in non-communicable chronic diseases. Copyright © 2016 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
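
    A minimal Wald-ratio sketch of the MR idea: the causal effect is estimated as the ratio of the gene-outcome to the gene-exposure association, assuming a valid instrument. The summary statistics are hypothetical, and the first-order standard error ignores gene-exposure uncertainty.

    ```python
    # Wald ratio estimate of a causal effect from summary statistics.
    beta_gx, se_gx = 0.20, 0.02     # SNP -> exposure association (hypothetical)
    beta_gy, se_gy = 0.05, 0.01     # SNP -> outcome association (hypothetical)

    beta_mr = beta_gy / beta_gx
    # First-order (delta method) standard error, ignoring gene-exposure uncertainty.
    se_mr = se_gy / abs(beta_gx)
    print(beta_mr, beta_mr - 1.96 * se_mr, beta_mr + 1.96 * se_mr)
    ```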

  15. Chi-Squared Test of Fit and Sample Size-A Comparison between a Random Sample Approach and a Chi-Square Value Adjustment Method.

    PubMed

    Bergh, Daniel

    2015-01-01

    Chi-square statistics are commonly used for tests of fit of measurement models. Chi-square is also sensitive to sample size, which is why several approaches to handle large samples in test of fit analysis have been developed. One strategy to handle the sample size problem may be to adjust the sample size in the analysis of fit. An alternative is to adopt a random sample approach. The purpose of this study was to analyze and to compare these two strategies using simulated data. Given an original sample size of 21,000, for reductions of sample sizes down to the order of 5,000, the adjusted sample size function works as well as the random sample approach. In contrast, when applying adjustments to sample sizes of lower order, the adjustment function is less effective at approximating the chi-square value for an actual random sample of the relevant size. Hence, the fit is exaggerated and misfit under-estimated using the adjusted sample size function. Although there are big differences in chi-square values between the two approaches at lower sample sizes, the inferences based on the p-values may be the same.
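
    A hedged sketch contrasting the two strategies on a toy contingency table: (i) rescale the full-sample chi-square in proportion to a smaller target n (an assumed, simplified form of adjustment, not necessarily the article's function), versus (ii) draw an actual random subsample of that size and recompute.

    ```python
    # Proportional chi-square adjustment vs. an actual random subsample on a toy 2x2 table.
    import numpy as np
    from scipy.stats import chi2_contingency

    rng = np.random.default_rng(3)
    n_full, n_target = 21_000, 5_000
    x = rng.choice([0, 1], size=n_full, p=[0.6, 0.4])
    y = (rng.random(n_full) < 0.5 + 0.02 * x).astype(int)    # weak association

    def chi2_of(xs, ys):
        table = np.array([[np.sum((xs == i) & (ys == j)) for j in (0, 1)] for i in (0, 1)])
        return chi2_contingency(table, correction=False)[0]

    chi2_full = chi2_of(x, y)
    adjusted = chi2_full * n_target / n_full                  # assumed proportional adjustment
    idx = rng.choice(n_full, n_target, replace=False)
    subsample = chi2_of(x[idx], y[idx])                       # actual random subsample
    print(chi2_full, adjusted, subsample)
    ```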

  16. Radiative characterization of random fibrous media with long cylindrical fibers: Comparison of single- and multi-RTE approaches

    NASA Astrophysics Data System (ADS)

    Randrianalisoa, Jaona; Haussener, Sophia; Baillis, Dominique; Lipiński, Wojciech

    2017-11-01

    Radiative heat transfer is analyzed in participating media consisting of long cylindrical fibers with a diameter in the limit of geometrical optics. The absorption and scattering coefficients and the scattering phase function of the medium are determined based on the discrete-level medium geometry and optical properties of individual fibers. The fibers are assumed to be randomly oriented and positioned inside the medium. Two approaches are employed: a volume-averaged two-intensity approach referred to as multi-RTE approach and a homogenized single-intensity approach referred to as the single-RTE approach. Both approaches require effective properties, determined using direct Monte Carlo ray tracing techniques. The macroscopic radiative transfer equations (for single intensity or two volume-averaged intensities) with the corresponding effective properties are solved using Monte Carlo techniques and allow for the determination of the radiative flux distribution as well as overall transmittance and reflectance of the medium. The results are compared against predictions by the direct Monte Carlo simulation on the exact morphology. The effects of fiber volume fraction and optical properties on the effective radiative properties and the overall slab radiative characteristics are investigated. The single-RTE approach gives accurate predictions for high porosity fibrous media (porosity about 95%). The multi-RTE approach is recommended for isotropic fibrous media with porosity in the range of 79-95%.

  17. A spatial error model with continuous random effects and an application to growth convergence

    NASA Astrophysics Data System (ADS)

    Laurini, Márcio Poletti

    2017-10-01

    We propose a spatial error model with continuous random effects based on Matérn covariance functions and apply this model for the analysis of income convergence processes (β-convergence). The use of a model with continuous random effects permits a clearer visualization and interpretation of the spatial dependency patterns, avoids the problems of defining neighborhoods in spatial econometrics models, and allows projecting the spatial effects for every possible location in the continuous space, circumventing the existing aggregations in discrete lattice representations. We apply this model approach to analyze the economic growth of Brazilian municipalities between 1991 and 2010 using unconditional and conditional formulations and a spatiotemporal model of convergence. The results indicate that the estimated spatial random effects are consistent with the existence of income convergence clubs for Brazilian municipalities in this period.
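
    As a small illustration of the covariance building block, a Matérn function with smoothness nu = 3/2 (a common closed form) is sketched below; the parameter values are illustrative and the full spatial error model is omitted.

    ```python
    # Matérn covariance with nu = 3/2: sigma2 * (1 + sqrt(3) d / rho) * exp(-sqrt(3) d / rho).
    import numpy as np

    def matern_32(dist, sigma2=1.0, rho=1.0):
        a = np.sqrt(3.0) * np.asarray(dist) / rho
        return sigma2 * (1.0 + a) * np.exp(-a)

    print(matern_32([0.0, 0.5, 1.0, 2.0], sigma2=0.8, rho=1.5))
    ```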

  18. Multivariate Meta-Analysis of Genetic Association Studies: A Simulation Study

    PubMed Central

    Neupane, Binod; Beyene, Joseph

    2015-01-01

    In a meta-analysis with multiple end points of interests that are correlated between or within studies, multivariate approach to meta-analysis has a potential to produce more precise estimates of effects by exploiting the correlation structure between end points. However, under random-effects assumption the multivariate estimation is more complex (as it involves estimation of more parameters simultaneously) than univariate estimation, and sometimes can produce unrealistic parameter estimates. Usefulness of multivariate approach to meta-analysis of the effects of a genetic variant on two or more correlated traits is not well understood in the area of genetic association studies. In such studies, genetic variants are expected to roughly maintain Hardy-Weinberg equilibrium within studies, and also their effects on complex traits are generally very small to modest and could be heterogeneous across studies for genuine reasons. We carried out extensive simulation to explore the comparative performance of multivariate approach with most commonly used univariate inverse-variance weighted approach under random-effects assumption in various realistic meta-analytic scenarios of genetic association studies of correlated end points. We evaluated the performance with respect to relative mean bias percentage, and root mean square error (RMSE) of the estimate and coverage probability of corresponding 95% confidence interval of the effect for each end point. Our simulation results suggest that multivariate approach performs similarly or better than univariate method when correlations between end points within or between studies are at least moderate and between-study variation is similar or larger than average within-study variation for meta-analyses of 10 or more genetic studies. Multivariate approach produces estimates with smaller bias and RMSE especially for the end point that has randomly or informatively missing summary data in some individual studies, when the missing data in the endpoint are imputed with null effects and quite large variance. PMID:26196398

  19. Multivariate Meta-Analysis of Genetic Association Studies: A Simulation Study.

    PubMed

    Neupane, Binod; Beyene, Joseph

    2015-01-01

    In a meta-analysis with multiple end points of interests that are correlated between or within studies, multivariate approach to meta-analysis has a potential to produce more precise estimates of effects by exploiting the correlation structure between end points. However, under random-effects assumption the multivariate estimation is more complex (as it involves estimation of more parameters simultaneously) than univariate estimation, and sometimes can produce unrealistic parameter estimates. Usefulness of multivariate approach to meta-analysis of the effects of a genetic variant on two or more correlated traits is not well understood in the area of genetic association studies. In such studies, genetic variants are expected to roughly maintain Hardy-Weinberg equilibrium within studies, and also their effects on complex traits are generally very small to modest and could be heterogeneous across studies for genuine reasons. We carried out extensive simulation to explore the comparative performance of multivariate approach with most commonly used univariate inverse-variance weighted approach under random-effects assumption in various realistic meta-analytic scenarios of genetic association studies of correlated end points. We evaluated the performance with respect to relative mean bias percentage, and root mean square error (RMSE) of the estimate and coverage probability of corresponding 95% confidence interval of the effect for each end point. Our simulation results suggest that multivariate approach performs similarly or better than univariate method when correlations between end points within or between studies are at least moderate and between-study variation is similar or larger than average within-study variation for meta-analyses of 10 or more genetic studies. Multivariate approach produces estimates with smaller bias and RMSE especially for the end point that has randomly or informatively missing summary data in some individual studies, when the missing data in the endpoint are imputed with null effects and quite large variance.

  20. Meta-analysis in evidence-based healthcare: a paradigm shift away from random effects is overdue.

    PubMed

    Doi, Suhail A R; Furuya-Kanamori, Luis; Thalib, Lukman; Barendregt, Jan J

    2017-12-01

    Each year up to 20 000 systematic reviews and meta-analyses are published whose results influence healthcare decisions, thus making the robustness and reliability of meta-analytic methods one of the world's top clinical and public health priorities. The evidence synthesis makes use of either fixed-effect or random-effects statistical methods. The fixed-effect method has largely been replaced by the random-effects method as heterogeneity of study effects led to poor error estimation. However, despite the widespread use and acceptance of the random-effects method to correct this, it too remains unsatisfactory and continues to suffer from defective error estimation, posing a serious threat to decision-making in evidence-based clinical and public health practice. We discuss here the problem with the random-effects approach and demonstrate that there exist better estimators under the fixed-effect model framework that can achieve optimal error estimation. We argue for an urgent return to the earlier framework with updates that address these problems and conclude that doing so can markedly improve the reliability of meta-analytical findings and thus decision-making in healthcare.

  1. Sequential causal inference: Application to randomized trials of adaptive treatment strategies

    PubMed Central

    Dawson, Ree; Lavori, Philip W.

    2009-01-01

    Clinical trials that randomize subjects to decision algorithms, which adapt treatments over time according to individual response, have gained considerable interest as investigators seek designs that directly inform clinical decision making. We consider designs in which subjects are randomized sequentially at decision points, among adaptive treatment options under evaluation. We present a sequential method to estimate the comparative effects of the randomized adaptive treatments, which are formalized as adaptive treatment strategies. Our causal estimators are derived using Bayesian predictive inference. We use analytical and empirical calculations to compare the predictive estimators to (i) the ‘standard’ approach that allocates the sequentially obtained data to separate strategy-specific groups as would arise from randomizing subjects at baseline; (ii) the semi-parametric approach of marginal mean models that, under appropriate experimental conditions, provides the same sequential estimator of causal differences as the proposed approach. Simulation studies demonstrate that sequential causal inference offers substantial efficiency gains over the standard approach to comparing treatments, because the predictive estimators can take advantage of the monotone structure of shared data among adaptive strategies. We further demonstrate that the semi-parametric asymptotic variances, which are marginal ‘one-step’ estimators, may exhibit significant bias, in contrast to the predictive variances. We show that the conditions under which the sequential method is attractive relative to the other two approaches are those most likely to occur in real studies. PMID:17914714

  2. A cost-effective approach to the development of printed materials: a randomized controlled trial of three strategies.

    PubMed

    Paul, C L; Redman, S; Sanson-Fisher, R W

    2004-12-01

    Printed materials have been a primary mode of communication in public health education. Three major approaches to the development of these materials--the application of characteristics identified in the literature, behavioral strategies and marketing strategies--have major implications for both the effectiveness and cost of materials. However, little attention has been directed towards the cost-effectiveness of such approaches. In the present study, three pamphlets were developed using successive addition of each approach: first literature characteristics only ('C' pamphlet), then behavioral strategies ('C + B' pamphlet) and then marketing strategies ('C + B + M' pamphlet). Each pamphlet encouraged women to join a Pap Test Reminder Service (PTRS). Each pamphlet was mailed to a randomly selected sample of 2700 women aged 50-69 years. Registrations with the PTRS were monitored and 420 women in each pamphlet group were surveyed by telephone. It was reported that the 'C + B' and 'C + B + M' pamphlets were significantly more effective than the 'C' pamphlet. The 'C + B' pamphlet was the most cost-effective of the three pamphlets. There were no significant differences between any of the pamphlet groups on acceptability, knowledge or attitudes. It was suggested that the inclusion of behavioral strategies is likely to be a cost-effective approach to the development of printed health education materials.

  3. Probabilistic models for reactive behaviour in heterogeneous condensed phase media

    NASA Astrophysics Data System (ADS)

    Baer, M. R.; Gartling, D. K.; DesJardin, P. E.

    2012-02-01

    This work presents statistically-based models to describe reactive behaviour in heterogeneous energetic materials. Mesoscale effects are incorporated in continuum-level reactive flow descriptions using probability density functions (pdfs) that are associated with thermodynamic and mechanical states. A generalised approach is presented that includes multimaterial behaviour by treating the volume fraction as a random kinematic variable. Model simplifications are then sought to reduce the complexity of the description without compromising the statistical approach. Reactive behaviour is first considered for non-deformable media having a random temperature field as an initial state. A pdf transport relationship is derived and an approximate moment approach is incorporated in finite element analysis to model an example application whereby a heated fragment impacts a reactive heterogeneous material which leads to a delayed cook-off event. Modelling is then extended to include deformation effects associated with shock loading of a heterogeneous medium whereby random variables of strain, strain-rate and temperature are considered. A demonstrative mesoscale simulation of a non-ideal explosive is discussed that illustrates the joint statistical nature of the strain and temperature fields during shock loading to motivate the probabilistic approach. This modelling is derived in a Lagrangian framework that can be incorporated in continuum-level shock physics analysis. Future work will consider particle-based methods for a numerical implementation of this modelling approach.

  4. Estimating Unbiased Treatment Effects in Education Using a Regression Discontinuity Design

    ERIC Educational Resources Information Center

    Smith, William C.

    2014-01-01

    The ability of regression discontinuity (RD) designs to provide an unbiased treatment effect while avoiding the ethical concerns that plague randomized controlled trials (RCTs) makes RD a valuable and useful approach in education evaluation. RD is the only explicitly recognized quasi-experimental approach identified by the Institute of Education…

  5. Vehicle track segmentation using higher order random fields

    DOE PAGES

    Quach, Tu -Thach

    2017-01-09

    Here, we present an approach to segment vehicle tracks in coherent change detection images, a product of combining two synthetic aperture radar images taken at different times. The approach uses multiscale higher order random field models to capture track statistics, such as curvatures and their parallel nature, that are not currently utilized in existing methods. These statistics are encoded as 3-by-3 patterns at different scales. The model can complete disconnected tracks often caused by sensor noise and various environmental effects. Coupling the model with a simple classifier, our approach is effective at segmenting salient tracks. We improve the F-measure on a standard vehicle track data set to 0.963, up from 0.897 obtained by the current state-of-the-art method.

  6. Vehicle track segmentation using higher order random fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quach, Tu -Thach

    Here, we present an approach to segment vehicle tracks in coherent change detection images, a product of combining two synthetic aperture radar images taken at different times. The approach uses multiscale higher order random field models to capture track statistics, such as curvatures and their parallel nature, that are not currently utilized in existing methods. These statistics are encoded as 3-by-3 patterns at different scales. The model can complete disconnected tracks often caused by sensor noise and various environmental effects. Coupling the model with a simple classifier, our approach is effective at segmenting salient tracks. We improve the F-measure on a standard vehicle track data set to 0.963, up from 0.897 obtained by the current state-of-the-art method.

  7. The RESPECT Approach to Tailored Telephone Education

    ERIC Educational Resources Information Center

    Brouse, Corey H.; Basch, Charles E.; Wolf, Randi L.

    2008-01-01

    Objective: The objective of the RESPECT approach to tailored telephone education (TTE) is described. This approach was shown to be highly effective through a randomized intervention trial for increasing the rate of colorectal cancer (CRC) screening. Methods: At the conclusion of the trial, the investigators identified the main principles that…

  8. Meditation and Yoga for Posttraumatic Stress Disorder: A Meta-Analytic Review of Randomized Controlled Trials

    PubMed Central

    Gallegos, Autumn M.; Crean, Hugh F.; Pigeon, Wilfred R.; Heffner, Kathi L.

    2018-01-01

    Posttraumatic stress disorder (PTSD) is a chronic and debilitating disorder that affects the lives of 7-8% of adults in the U.S. Although several interventions demonstrate clinical effectiveness for treating PTSD, many patients continue to have residual symptoms and ask for a variety of treatment options. Complementary health approaches, such as meditation and yoga, hold promise for treating symptoms of PTSD. This meta-analysis evaluates the effect size (ES) of yoga and meditation on PTSD outcomes in adult patients. We also examined whether the intervention type, PTSD outcome measure, study population, sample size, or control condition moderated the effects of complementary approaches on PTSD outcomes. The studies included were 19 randomized control trials with data on 1,173 participants. A random effects model yielded a statistically significant ES in the small to medium range (ES = −.39, p < .001, 95% CI [−.57, −.22]). There were no appreciable differences between intervention types, study population, outcome measures, or control condition. There was, however, a marginally significant higher ES for sample size ≤ 30 (ES = −.78, k = 5). These findings suggest that meditation and yoga are promising complementary approaches in the treatment of PTSD among adults and warrant further study. PMID:29100863

  9. The causal effect of red blood cell folate on genome-wide methylation in cord blood: a Mendelian randomization approach.

    PubMed

    Binder, Alexandra M; Michels, Karin B

    2013-12-04

    Investigation of the biological mechanism by which folate acts to affect fetal development can inform appraisal of expected benefits and risk management. This research is ethically imperative given the ubiquity of folic acid fortified products in the US. Considering that folate is an essential component in the one-carbon metabolism pathway that provides methyl groups for DNA methylation, epigenetic modifications provide a putative molecular mechanism mediating the effect of folic acid supplementation on neonatal and pediatric outcomes. In this study we use a Mendelian Randomization approach to assess the effect of red blood cell (RBC) folate on genome-wide DNA methylation in cord blood. Site-specific CpG methylation within the proximal promoter regions of approximately 14,500 genes was analyzed using the Illumina Infinium Human Methylation27 Bead Chip for 50 infants from the Epigenetic Birth Cohort at Brigham and Women's Hospital in Boston. Using methylenetetrahydrofolate reductase genotype as the instrument, the Mendelian Randomization approach identified 7 CpG loci with a significant (mostly positive) association between RBC folate and methylation level. Among the genes in closest proximity to this significant subset of CpG loci, several enriched biologic processes were involved in nucleic acid transport and metabolic processing. Compared to the standard ordinary least squares regression method, our estimates were demonstrated to be more robust to unmeasured confounding. To the authors' knowledge, this is the largest genome-wide analysis of the effects of folate on methylation pattern, and the first to employ Mendelian Randomization to assess the effects of an exposure on epigenetic modifications. These results can help guide future analyses of the causal effects of periconceptional folate levels on candidate pathways.
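
    A minimal sketch of the instrumental-variable logic used in Mendelian randomization, based on the Wald ratio estimator with a genotype as the instrument. The simulated variables (genotype, folate exposure, methylation outcome) and effect sizes are hypothetical and do not reproduce the cited genome-wide analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
g = rng.integers(0, 3, size=n)                        # hypothetical genotype (0/1/2 alleles)
folate = 1.0 + 0.3 * g + rng.normal(0, 1, n)          # exposure influenced by genotype
methyl = 0.5 + 0.2 * folate + rng.normal(0, 1, n)     # outcome influenced by exposure

def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    x = np.asarray(x, float)
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

# Wald ratio: (effect of instrument on outcome) / (effect of instrument on exposure)
beta_iv = slope(g, methyl) / slope(g, folate)
print("IV estimate of the exposure effect:", beta_iv)
```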

  10. A simple method for assessing occupational exposure via the one-way random effects model.

    PubMed

    Krishnamoorthy, K; Mathew, Thomas; Peng, Jie

    2016-11-01

    A one-way random effects model is postulated for the log-transformed shift-long personal exposure measurements, where the random effect in the model represents an effect due to the worker. Simple closed-form confidence intervals are proposed for the relevant parameters of interest using the method of variance estimates recovery (MOVER). The performance of the confidence bounds is evaluated and compared with those based on the generalized confidence interval approach. Comparison studies indicate that the proposed MOVER confidence bounds are better than the generalized confidence bounds for the overall mean exposure and an upper percentile of the exposure distribution. The proposed methods are illustrated using a few examples involving industrial hygiene data.
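
    The MOVER idea can be sketched generically: separate confidence limits for two parameters are combined into a closed-form interval for their sum. The example below applies this to the log of a lognormal mean (mean of the logs plus half their variance), ignoring the between-worker random effect of the cited model; the summary numbers are hypothetical and the paper's exact variance-component intervals are not reproduced.

```python
import numpy as np
from scipy import stats

# Hypothetical log-transformed exposure summaries from n measurements
n = 25
mean_log, var_log = 0.8, 0.5

# Parameter of interest: log of the lognormal mean = mu + 0.5 * sigma^2
theta1, theta2 = mean_log, 0.5 * var_log

# Separate 95% limits for mu (t-based) and for 0.5*sigma^2 (chi-square based)
t = stats.t.ppf(0.975, n - 1)
l1 = theta1 - t * np.sqrt(var_log / n)
u1 = theta1 + t * np.sqrt(var_log / n)
q_hi, q_lo = stats.chi2.ppf(0.975, n - 1), stats.chi2.ppf(0.025, n - 1)
l2 = 0.5 * (n - 1) * var_log / q_hi
u2 = 0.5 * (n - 1) * var_log / q_lo

# MOVER limits for theta1 + theta2
est = theta1 + theta2
L = est - np.sqrt((theta1 - l1) ** 2 + (theta2 - l2) ** 2)
U = est + np.sqrt((u1 - theta1) ** 2 + (u2 - theta2) ** 2)
print("95% CI for the mean exposure:", np.exp(L), np.exp(U))
```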

  11. Prospective randomized clinical trial comparing laparoscopic cholecystectomy and hybrid natural orifice transluminal endoscopic surgery (NOTES) (NCT00835250).

    PubMed

    Noguera, José F; Cuadrado, Angel; Dolz, Carlos; Olea, José M; García, Juan C

    2012-12-01

    Natural orifice transluminal endoscopic surgery (NOTES) is a technique still in experimental development whose safety and effectiveness call for assessment through clinical trials. In this paper we present a three-arm, noninferiority, prospective randomized clinical trial of 1 year duration comparing the vaginal and transumbilical approaches for transluminal endoscopic surgery with the conventional laparoscopic approach for elective cholecystectomy. Sixty female patients between the ages of 18 and 65 years who were eligible for elective cholecystectomy were randomized in a ratio of 1:1:1 to receive hybrid transvaginal NOTES (TV group), hybrid transumbilical NOTES (TU group) or conventional laparoscopy (CL group). The main study variable was parietal complications (wound infection, bleeding, and eventration). The analysis was by intention to treat, and losses were not replaced. Cholecystectomy was successfully performed on 94% of the patients. One patient in the TU group was reconverted to CL owing to difficulty in maneuvering the endoscope. After a minimum follow-up period of 1 year, no differences were noted in the rate of parietal complications. Postoperative pain, length of hospital stay, and time off from work were similar in the three groups. No patient developed dyspareunia. Surgical time was longer among cases in which a flexible endoscope was used (CL, 47.04 min; TV, 64.85 min; TU, 59.80 min). NOTES approaches using the flexible endoscope are not inferior in safety or effectiveness to conventional laparoscopy. The transumbilical approach with flexible endoscope is as effective and safe as the transvaginal approach and is a promising, single-incision approach.

  12. A Randomized Clinical Trial of Alternative Stress Management Interventions in Persons with HIV Infection

    ERIC Educational Resources Information Center

    McCain, Nancy L.; Gray, D. Patricia; Elswick, R. K., Jr.; Robins, Jolynne W.; Tuck, Inez; Walter, Jeanne M.; Rausch, Sarah M.; Ketchum, Jessica McKinney

    2008-01-01

    Research in psychoneuroimmunology suggests that immunosuppression associated with perceived stress may contribute to disease progression in persons with HIV infection. While stress management interventions may enhance immune function, few alternative approaches have yet been tested. This randomized clinical trial was conducted to test effects of…

  13. Random Forests for Evaluating Pedagogy and Informing Personalized Learning

    ERIC Educational Resources Information Center

    Spoon, Kelly; Beemer, Joshua; Whitmer, John C.; Fan, Juanjuan; Frazee, James P.; Stronach, Jeanne; Bohonak, Andrew J.; Levine, Richard A.

    2016-01-01

    Random forests are presented as an analytics foundation for educational data mining tasks. The focus is on course- and program-level analytics including evaluating pedagogical approaches and interventions and identifying and characterizing at-risk students. As part of this development, the concept of individualized treatment effects (ITE) is…

  14. Longitudinal analysis of the strengths and difficulties questionnaire scores of the Millennium Cohort Study children in England using M-quantile random-effects regression.

    PubMed

    Tzavidis, Nikos; Salvati, Nicola; Schmid, Timo; Flouri, Eirini; Midouhas, Emily

    2016-02-01

    Multilevel modelling is a popular approach for longitudinal data analysis. Statistical models conventionally target a parameter at the centre of a distribution. However, when the distribution of the data is asymmetric, modelling other location parameters, e.g. percentiles, may be more informative. We present a new approach, M-quantile random-effects regression, for modelling multilevel data. The proposed method is used for modelling location parameters of the distribution of the strengths and difficulties questionnaire scores of children in England who participate in the Millennium Cohort Study. Quantile mixed models are also considered. The analyses offer insights to child psychologists about the differential effects of risk factors on children's outcomes.

  15. Evaluating the Health Impact of Large-Scale Public Policy Changes: Classical and Novel Approaches

    PubMed Central

    Basu, Sanjay; Meghani, Ankita; Siddiqi, Arjumand

    2018-01-01

    Large-scale public policy changes are often recommended to improve public health. Despite varying widely—from tobacco taxes to poverty-relief programs—such policies present a common dilemma to public health researchers: how to evaluate their health effects when randomized controlled trials are not possible. Here, we review the state of knowledge and experience of public health researchers who rigorously evaluate the health consequences of large-scale public policy changes. We organize our discussion by detailing approaches to address three common challenges of conducting policy evaluations: distinguishing a policy effect from time trends in health outcomes or preexisting differences between policy-affected and -unaffected communities (using difference-in-differences approaches); constructing a comparison population when a policy affects a population for whom a well-matched comparator is not immediately available (using propensity score or synthetic control approaches); and addressing unobserved confounders by utilizing quasi-random variations in policy exposure (using regression discontinuity, instrumental variables, or near-far matching approaches). PMID:28384086
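
    A small difference-in-differences sketch matching the first challenge described above: the policy effect is estimated as the interaction of a policy-affected indicator and a post-period indicator in a two-way regression. The data, effect sizes, and column names are simulated for illustration and are not from any cited evaluation.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
treated = rng.integers(0, 2, n)   # policy-affected community indicator
post = rng.integers(0, 2, n)      # observation after the policy change
# Outcome with a group difference, a common time trend, and a true effect of -2.0
y = 10 + 3 * treated + 1.5 * post - 2.0 * treated * post + rng.normal(0, 2, n)

df = pd.DataFrame({"y": y, "treated": treated, "post": post})
fit = smf.ols("y ~ treated * post", data=df).fit()
print(fit.params["treated:post"])  # difference-in-differences estimate
```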

  16. Methods for sample size determination in cluster randomized trials

    PubMed Central

    Rutterford, Clare; Copas, Andrew; Eldridge, Sandra

    2015-01-01

    Background: The use of cluster randomized trials (CRTs) is increasing, along with the variety in their design and analysis. The simplest approach for their sample size calculation is to calculate the sample size assuming individual randomization and inflate this by a design effect to account for randomization by cluster. The assumptions of a simple design effect may not always be met; alternative or more complicated approaches are required. Methods: We summarise a wide range of sample size methods available for cluster randomized trials. For those familiar with sample size calculations for individually randomized trials but with less experience in the clustered case, this manuscript provides formulae for a wide range of scenarios with associated explanation and recommendations. For those with more experience, comprehensive summaries are provided that allow quick identification of methods for a given design, outcome and analysis method. Results: We present first those methods applicable to the simplest two-arm, parallel group, completely randomized design followed by methods that incorporate deviations from this design such as: variability in cluster sizes; attrition; non-compliance; or the inclusion of baseline covariates or repeated measures. The paper concludes with methods for alternative designs. Conclusions: There is a large amount of methodology available for sample size calculations in CRTs. This paper gives the most comprehensive description of published methodology for sample size calculation and provides an important resource for those designing these trials. PMID:26174515
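
    The "simplest approach" described above can be sketched directly: compute the individually randomized sample size and inflate it by the design effect 1 + (m - 1) * ICC. The function names, default power, and example inputs below are illustrative assumptions, not formulas taken from the cited review.

```python
import math
from scipy import stats

def n_individual(effect_size, power=0.8, alpha=0.05):
    """Per-arm n for a two-sample comparison of means (normal approximation)."""
    z_a = stats.norm.ppf(1 - alpha / 2)
    z_b = stats.norm.ppf(power)
    return 2 * (z_a + z_b) ** 2 / effect_size ** 2

def n_cluster_trial(effect_size, cluster_size, icc, power=0.8, alpha=0.05):
    """Inflate the individually randomized n by the design effect 1 + (m-1)*ICC."""
    deff = 1 + (cluster_size - 1) * icc
    n = n_individual(effect_size, power, alpha) * deff
    clusters_per_arm = math.ceil(n / cluster_size)
    return math.ceil(n), clusters_per_arm

# e.g. standardized effect 0.3, 20 participants per cluster, ICC 0.05
print(n_cluster_trial(0.3, 20, 0.05))
```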

  17. Finite-sample corrected generalized estimating equation of population average treatment effects in stepped wedge cluster randomized trials.

    PubMed

    Scott, JoAnna M; deCamp, Allan; Juraska, Michal; Fay, Michael P; Gilbert, Peter B

    2017-04-01

    Stepped wedge designs are increasingly commonplace and advantageous for cluster randomized trials when it is both unethical to assign placebo, and it is logistically difficult to allocate an intervention simultaneously to many clusters. We study marginal mean models fit with generalized estimating equations for assessing treatment effectiveness in stepped wedge cluster randomized trials. This approach has advantages over the more commonly used mixed models that (1) the population-average parameters have an important interpretation for public health applications and (2) they avoid untestable assumptions on latent variable distributions and avoid parametric assumptions about error distributions, therefore, providing more robust evidence on treatment effects. However, cluster randomized trials typically have a small number of clusters, rendering the standard generalized estimating equation sandwich variance estimator biased and highly variable and hence yielding incorrect inferences. We study the usual asymptotic generalized estimating equation inferences (i.e., using sandwich variance estimators and asymptotic normality) and four small-sample corrections to generalized estimating equation for stepped wedge cluster randomized trials and for parallel cluster randomized trials as a comparison. We show by simulation that the small-sample corrections provide improvement, with one correction appearing to provide at least nominal coverage even with only 10 clusters per group. These results demonstrate the viability of the marginal mean approach for both stepped wedge and parallel cluster randomized trials. We also study the comparative performance of the corrected methods for stepped wedge and parallel designs, and describe how the methods can accommodate interval censoring of individual failure times and incorporate semiparametric efficient estimators.
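
    A rough sketch of the marginal (population-average) GEE analysis described above, fitted with statsmodels using an exchangeable working correlation on simulated parallel-arm cluster data. The simulated data, effect sizes, and the `bias_reduced` covariance option mentioned in the closing comment are assumptions; the specific small-sample corrections studied in the paper are not reproduced here.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
clusters, per_cluster = 12, 30
df = pd.DataFrame({
    "cluster": np.repeat(np.arange(clusters), per_cluster),
    "treated": np.repeat(rng.integers(0, 2, clusters), per_cluster),
})
cluster_effect = np.repeat(rng.normal(0, 0.3, clusters), per_cluster)
p = 1 / (1 + np.exp(-(-0.5 + 0.4 * df["treated"] + cluster_effect)))
df["y"] = rng.binomial(1, p)

X = sm.add_constant(df[["treated"]])
model = sm.GEE(df["y"], X, groups=df["cluster"],
               family=sm.families.Binomial(),
               cov_struct=sm.cov_struct.Exchangeable())
res = model.fit()   # robust (sandwich) covariance by default
print(res.params, res.bse)
# With few clusters the sandwich variance is biased and highly variable; recent
# statsmodels versions expose model.fit(cov_type="bias_reduced") as one
# small-sample correction, though the paper compares several such corrections.
```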

  18. Longitudinal data analysis with non-ignorable missing data.

    PubMed

    Tseng, Chi-hong; Elashoff, Robert; Li, Ning; Li, Gang

    2016-02-01

    A common problem in the longitudinal data analysis is the missing data problem. Two types of missing patterns are generally considered in statistical literature: monotone and non-monotone missing data. Nonmonotone missing data occur when study participants intermittently miss scheduled visits, while monotone missing data can be from discontinued participation, loss to follow-up, and mortality. Although many novel statistical approaches have been developed to handle missing data in recent years, few methods are available to provide inferences to handle both types of missing data simultaneously. In this article, a latent random effects model is proposed to analyze longitudinal outcomes with both monotone and non-monotone missingness in the context of missing not at random. Another significant contribution of this article is to propose a new computational algorithm for latent random effects models. To reduce the computational burden of high-dimensional integration problem in latent random effects models, we develop a new computational algorithm that uses a new adaptive quadrature approach in conjunction with the Taylor series approximation for the likelihood function to simplify the E-step computation in the expectation-maximization algorithm. Simulation study is performed and the data from the scleroderma lung study are used to demonstrate the effectiveness of this method. © The Author(s) 2012.

  19. Random forests of interaction trees for estimating individualized treatment effects in randomized trials.

    PubMed

    Su, Xiaogang; Peña, Annette T; Liu, Lei; Levine, Richard A

    2018-04-29

    Assessing heterogeneous treatment effects is a growing interest in advancing precision medicine. Individualized treatment effects (ITEs) play a critical role in such an endeavor. Concerning experimental data collected from randomized trials, we put forward a method, termed random forests of interaction trees (RFIT), for estimating ITE on the basis of interaction trees. To this end, we propose a smooth sigmoid surrogate method, as an alternative to greedy search, to speed up tree construction. The RFIT outperforms the "separate regression" approach in estimating ITE. Furthermore, standard errors for the estimated ITE via RFIT are obtained with the infinitesimal jackknife method. We assess and illustrate the use of RFIT via both simulation and the analysis of data from an acupuncture headache trial. Copyright © 2018 John Wiley & Sons, Ltd.

  20. Linear mixed model for heritability estimation that explicitly addresses environmental variation.

    PubMed

    Heckerman, David; Gurdasani, Deepti; Kadie, Carl; Pomilla, Cristina; Carstensen, Tommy; Martin, Hilary; Ekoru, Kenneth; Nsubuga, Rebecca N; Ssenyomo, Gerald; Kamali, Anatoli; Kaleebu, Pontiano; Widmer, Christian; Sandhu, Manjinder S

    2016-07-05

    The linear mixed model (LMM) is now routinely used to estimate heritability. Unfortunately, as we demonstrate, LMM estimates of heritability can be inflated when using a standard model. To help reduce this inflation, we used a more general LMM with two random effects: one based on genomic variants and one based on easily measured spatial location as a proxy for environmental effects. We investigated this approach with simulated data and with data from a Uganda cohort of 4,778 individuals for 34 phenotypes including anthropometric indices, blood factors, glycemic control, blood pressure, lipid tests, and liver function tests. For the genomic random effect, we used identity-by-descent estimates from accurately phased genome-wide data. For the environmental random effect, we constructed a covariance matrix based on a Gaussian radial basis function. Across the simulated and Ugandan data, narrow-sense heritability estimates were lower using the more general model. Thus, our approach addresses, in part, the issue of "missing heritability" in the sense that much of the heritability previously thought to be missing was fictional. Software is available at https://github.com/MicrosoftGenomics/FaST-LMM.
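
    A minimal sketch of how a Gaussian radial basis function covariance can be built from spatial coordinates to serve as the environmental random-effect covariance in a two-random-effect LMM. The coordinates and length scale are hypothetical, and this does not reproduce the FaST-LMM implementation linked above.

```python
import numpy as np

def rbf_covariance(coords, length_scale=1.0):
    """Gaussian radial basis function covariance from spatial coordinates.

    coords: (n, 2) array of locations; returns an (n, n) matrix with
    K[i, j] = exp(-||x_i - x_j||^2 / (2 * length_scale^2)).
    """
    diff = coords[:, None, :] - coords[None, :, :]
    sq_dist = np.sum(diff ** 2, axis=-1)
    return np.exp(-sq_dist / (2.0 * length_scale ** 2))

rng = np.random.default_rng(3)
coords = rng.uniform(0, 10, size=(5, 2))     # hypothetical household locations
K_env = rbf_covariance(coords, length_scale=2.0)
print(K_env.round(3))
# In the two-random-effect LMM, K_env plays the role of the environmental
# covariance alongside a genomic relatedness matrix.
```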

  1. Using Natural Experiments to Study the Impact of Media on the Family

    ERIC Educational Resources Information Center

    Price, Joseph; Dahl, Gordon B.

    2012-01-01

    The randomized trial is the gold standard in scientific research and is used by several fields to study the effects of media. Although useful for studying the immediate response to media exposure, the experimental approach is not well suited to studying long-term effects or behavior outside the laboratory. The "natural experiment" approach, a…

  2. Assessing the quality of a non-randomized pragmatic trial for primary prevention of falls among older adults.

    PubMed

    Albert, Steven M; Edelstein, Offer; King, Jennifer; Flatt, Jason; Lin, Chyongchiou J; Boudreau, Robert; Newman, Anne B

    2015-01-01

    Current approaches to falls prevention mostly rely on secondary and tertiary prevention and target individuals at high risk of falls. An alternative is primary prevention, in which all seniors are screened, referred as appropriate, and educated regarding falls risk. Little information is available on research designs that allow investigation of this approach in the setting of aging services delivery, where randomization may not be possible. Healthy Steps for Older Adults, a statewide program of the Pennsylvania (PA) Department of Aging, involves a combination of education about falls and screening for balance problems, with referral to personal physicians and home safety assessments. We developed a non-randomized statewide trial, Falls Free PA, to assess its effectiveness in reducing falls incidence over 12 months. We recruited 814 seniors who completed the program (503 first-time participants, 311 people repeating the program) and 1,020 who did not participate in the program, from the same sites. We assessed the quality of this non-randomized design by examining recruitment, follow-up across study groups, and comparability at baseline. Of older adults approached in senior centers, 90.5 % (n = 2,219) signed informed consent, and 1,834 (82.4 %) completed baseline assessments and were eligible for follow-up. Attrition in the three groups over 12 months was low and non-differential (<10 % for withdrawal and <2 % for other loss to follow-up). Median follow-up, which involved standardized monthly assessment of falls, was 10 months in all study groups. At baseline, the groups did not differ in measures of health or falls risk factors. Comparable status at baseline, recruitment from common sites, and similar experience with retention suggest that the non-randomized design will be effective for assessment of this approach to primary prevention of falls.

  3. Unbiased split variable selection for random survival forests using maximally selected rank statistics.

    PubMed

    Wright, Marvin N; Dankowski, Theresa; Ziegler, Andreas

    2017-04-15

    The most popular approach for analyzing survival data is the Cox regression model. The Cox model may, however, be misspecified, and its proportionality assumption may not always be fulfilled. An alternative approach for survival prediction is random forests for survival outcomes. The standard split criterion for random survival forests is the log-rank test statistic, which favors splitting variables with many possible split points. Conditional inference forests avoid this split variable selection bias. However, linear rank statistics are utilized by default in conditional inference forests to select the optimal splitting variable, which cannot detect non-linear effects in the independent variables. An alternative is to use maximally selected rank statistics for the split point selection. As in conditional inference forests, splitting variables are compared on the p-value scale. However, instead of the conditional Monte-Carlo approach used in conditional inference forests, p-value approximations are employed. We describe several p-value approximations and the implementation of the proposed random forest approach. A simulation study demonstrates that unbiased split variable selection is possible. However, there is a trade-off between unbiased split variable selection and runtime. In benchmark studies of prediction performance on simulated and real datasets, the new method performs better than random survival forests if informative dichotomous variables are combined with uninformative variables with more categories and better than conditional inference forests if non-linear covariate effects are included. In a runtime comparison, the method proves to be computationally faster than both alternatives, if a simple p-value approximation is used. Copyright © 2017 John Wiley & Sons, Ltd.

  4. The effect of a chest imaging lecture on emergency department doctors' ability to interpret chest CT images: a randomized study.

    PubMed

    Keijzers, Gerben; Sithirasenan, Vasugi

    2012-02-01

    To assess the chest computed tomography (CT) imaging interpreting skills of emergency department (ED) doctors and to study the effect of a CT chest imaging interpretation lecture on these skills. Sixty doctors in two EDs were randomized, using computerized randomization, to either attend a chest CT interpretation lecture or not to attend this lecture. Within 2 weeks of the lecture, the participants completed a questionnaire on demographic variables, anatomical knowledge, and diagnostic interpretation of 10 chest CT studies. Outcome measures included anatomical knowledge score, diagnosis score, and the combined overall score, all expressed as a percentage of correctly answered questions (0-100). Data on 58 doctors were analyzed, of which 27 were randomized to attend the lecture. The CT interpretation lecture did not have an effect on anatomy knowledge scores (72.9 vs. 70.2%), diagnosis scores (71.2 vs. 69.2%), or overall scores (71.4 vs. 69.5%). Twenty-nine percent of doctors stated that they had a systematic approach to chest CT interpretation. Overall self-perceived competency for interpreting CT imaging (brain, chest, abdomen) was low (between 3.2 and 5.2 on a 10-point Visual Analogue Scale). A single chest CT interpretation lecture did not improve chest CT interpretation by ED doctors. Less than one-third of doctors had a systematic approach to chest CT interpretation. A standardized systematic approach may improve interpretation skills.

  5. A Randomized Clinical Trial Comparison Between Pivotal Response Treatment (PRT) and Structured Applied Behavior Analysis (ABA) Intervention for Children with Autism

    PubMed Central

    Mohammadzaheri, Fereshteh; Koegel, Lynn Kern; Rezaee, Mohammad; Rafiee, Seyed Majid

    2014-01-01

    Accumulating studies are documenting specific motivational variables that, when combined into a naturalistic teaching paradigm, can positively influence the effectiveness of interventions for children with autism spectrum disorder (ASD). The purpose of this study was to compare two ABA intervention procedures, a naturalistic approach, Pivotal Response Treatment (PRT) with a structured ABA approach in a school setting. A Randomized Clinical Trial design using two groups of children, matched according to age, sex and mean length of utterance was used to compare the interventions. The data showed that the PRT approach was significantly more effective in improving targeted and untargeted areas after three months of intervention. The results are discussed in terms of variables that produce more rapid improvements in communication for children with ASD. PMID:24840596

  6. A random effects meta-analysis model with Box-Cox transformation.

    PubMed

    Yamaguchi, Yusuke; Maruo, Kazushi; Partlett, Christopher; Riley, Richard D

    2017-07-19

    In a random effects meta-analysis model, true treatment effects for each study are routinely assumed to follow a normal distribution. However, normality is a restrictive assumption and the misspecification of the random effects distribution may result in a misleading estimate of the overall mean treatment effect, an inappropriate quantification of heterogeneity across studies and a wrongly symmetric prediction interval. We focus on problems caused by an inappropriate normality assumption of the random effects distribution, and propose a novel random effects meta-analysis model where a Box-Cox transformation is applied to the observed treatment effect estimates. The proposed model aims to normalise an overall distribution of observed treatment effect estimates, which is the sum of the within-study sampling distributions and the random effects distribution. When sampling distributions are approximately normal, non-normality in the overall distribution will be mainly due to the random effects distribution, especially when the between-study variation is large relative to the within-study variation. The Box-Cox transformation addresses this flexibly according to the observed departure from normality. We use a Bayesian approach for estimating parameters in the proposed model, and suggest summarising the meta-analysis results by an overall median, an interquartile range and a prediction interval. The model can be applied to any kind of variable once the treatment effect estimate is defined from it. A simulation study suggested that when the overall distribution of treatment effect estimates is skewed, the overall mean and conventional I2 from the normal random effects model could be inappropriate summaries, and the proposed model helped reduce this issue. We illustrated the proposed model using two examples, which revealed some important differences in summary results, heterogeneity measures and prediction intervals from the normal random effects model. The random effects meta-analysis with the Box-Cox transformation may be an important tool for examining robustness of traditional meta-analysis results against skewness in the observed treatment effect estimates. Further critical evaluation of the method is needed.
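
    The cited model is Bayesian and works with weighted study-level estimates; the sketch below only illustrates the basic transformation step with scipy: apply a maximum-likelihood Box-Cox transformation to skewed effect estimates, summarise on the transformed scale, and back-transform the centre. The effect estimates, the unweighted pooling, and the variable names are assumptions for illustration.

```python
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

# Hypothetical, right-skewed treatment effect estimates from k studies
effects = np.array([0.12, 0.18, 0.25, 0.30, 0.41, 0.55, 0.90, 1.60])

# Box-Cox requires positive values; lambda is estimated by maximum likelihood
transformed, lam = stats.boxcox(effects)

# Pool on the transformed scale (a simple unweighted mean here, unlike the
# cited weighted Bayesian analysis) and back-transform the centre
pooled_t = transformed.mean()
pooled_median = inv_boxcox(pooled_t, lam)
print(f"lambda = {lam:.2f}, back-transformed centre = {pooled_median:.3f}")
```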

  7. Multilevel covariance regression with correlated random effects in the mean and variance structure.

    PubMed

    Quintero, Adrian; Lesaffre, Emmanuel

    2017-09-01

    Multivariate regression methods generally assume a constant covariance matrix for the observations. In case a heteroscedastic model is needed, the parametric and nonparametric covariance regression approaches can be restrictive in the literature. We propose a multilevel regression model for the mean and covariance structure, including random intercepts in both components and allowing for correlation between them. The implied conditional covariance function can be different across clusters as a result of the random effect in the variance structure. In addition, allowing for correlation between the random intercepts in the mean and covariance makes the model convenient for skewedly distributed responses. Furthermore, it permits us to analyse directly the relation between the mean response level and the variability in each cluster. Parameter estimation is carried out via Gibbs sampling. We compare the performance of our model to other covariance modelling approaches in a simulation study. Finally, the proposed model is applied to the RN4CAST dataset to identify the variables that impact burnout of nurses in Belgium. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Ultrasound-guided versus fluoroscopy-guided sacroiliac joint intra-articular injections in the noninflammatory sacroiliac joint dysfunction: a prospective, randomized, single-blinded study.

    PubMed

    Jee, Haemi; Lee, Ji-Hae; Park, Ki Deok; Ahn, Jaeki; Park, Yongbum

    2014-02-01

    To compare the short-term effects and safety of ultrasound (US)-guided sacroiliac joint (SIJ) injections with fluoroscopy (FL)-guided SIJ injections in patients with noninflammatory SIJ dysfunction. Prospective, randomized controlled trial. University hospital. Patients (N=120) with noninflammatory sacroiliac arthritis were enrolled. All procedures were performed using an FL or US apparatus. Subjects were randomly assigned to either the FL or US group. Immediately after the SIJ injections, fluoroscopy was applied to verify the correct placement of the injected medication and intravascular injections. Treatment effects and functional improvement were compared at 2 and 12 weeks after the procedures. The verbal numeric pain scale and Oswestry Disability Index improved at 2 and 12 weeks after the injections without statistical significances between groups. Of 55 US-guided injections, 48 (87.3%) were successful and 7 (12.7%) were missed. The FL-guided SIJ approach exhibited a greater accuracy (98.2%) than the US-guided approach. Vascularization around the SIJ was seen in 34 of 55 patients. Among the 34 patients, 7 had vascularization inside the joint, 23 had vascularization around the joint, and 4 had vascularization both inside and around the joint. Three cases of intravascular injections occurred in the FL group. The US-guided approach may facilitate the identification and avoidance of the critical vessels around or within the SIJ. Function and pain relief significantly improved in both groups without significant differences between groups. The US-guided approach was shown to be as effective as the FL-guided approach in treatment effects. However, diagnostic application in the SIJ may be limited because of the significantly lower accuracy rate (87.3%). Copyright © 2014 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  9. Meta-analysis in clinical trials revisited.

    PubMed

    DerSimonian, Rebecca; Laird, Nan

    2015-11-01

    In this paper, we revisit a 1986 article we published in this Journal, Meta-Analysis in Clinical Trials, where we introduced a random-effects model to summarize the evidence about treatment efficacy from a number of related clinical trials. Because of its simplicity and ease of implementation, our approach has been widely used (with more than 12,000 citations to date) and the "DerSimonian and Laird method" is now often referred to as the 'standard approach' or a 'popular' method for meta-analysis in medical and clinical research. The method is especially useful for providing an overall effect estimate and for characterizing the heterogeneity of effects across a series of studies. Here, we review the background that led to the original 1986 article, briefly describe the random-effects approach for meta-analysis, explore its use in various settings and trends over time and recommend a refinement to the method using a robust variance estimator for testing overall effect. We conclude with a discussion of repurposing the method for Big Data meta-analysis and Genome Wide Association Studies for studying the importance of genetic variants in complex diseases. Published by Elsevier Inc.
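
    The DerSimonian and Laird method itself is a short method-of-moments calculation; a minimal implementation is sketched below with hypothetical study effects and variances (the robust variance refinement recommended in the paper is not included).

```python
import numpy as np

def dersimonian_laird(y, v):
    """Random-effects pooling of study effects y with within-study variances v."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    w = 1.0 / v                                   # fixed-effect weights
    mu_fe = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - mu_fe) ** 2)              # Cochran's Q
    k = len(y)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / c)            # between-study variance
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    mu_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return mu_re, se_re, tau2

# Hypothetical log odds ratios and their variances from five trials
print(dersimonian_laird([0.2, -0.1, 0.4, 0.3, 0.1],
                        [0.04, 0.06, 0.05, 0.09, 0.03]))
```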

  10. Curative effects of microneedle fractional radiofrequency system on skin laxity in Asian patients: A prospective, double-blind, randomized, controlled face-split study.

    PubMed

    Lu, Wenli; Wu, Pinru; Zhang, Zhen; Chen, Jinan; Chen, Xiangdong; Ewelina, Biskup

    2017-04-01

    To date, no studies compared curative effects of thermal lesions in deep and superficial dermal layers in the same patient (face-split study). To evaluate skin laxity effects of microneedle fractional radiofrequency induced thermal lesions in different dermal layers. 13 patients underwent three sessions of a randomized face-split microneedle fractional radiofrequency system (MFRS) treatment of deep dermal and superficial dermal layer. Skin laxity changes were evaluated objectively (digital images, 2 independent experts) and subjectively (patients' satisfaction numerical rating). 12 of 13 subjects completed a course of 3 treatments and a 1-year follow-up. Improvement of nasolabial folds in deep dermal approach was significantly better than that in superficial approach at three months (P=.0002) and 12 months (P=.0057) follow-up. Effects on infraorbital rhytides were only slightly better (P=.3531). MFRS is an effective method to improve skin laxity. Thermal lesion approach seems to provide better outcomes when applied to deep dermal layers. It is necessary to consider the skin thickness of different facial regions when choosing the treatment depth.

  11. Using instrumental variables to disentangle treatment and placebo effects in blinded and unblinded randomized clinical trials influenced by unmeasured confounders

    NASA Astrophysics Data System (ADS)

    Chaibub Neto, Elias

    2016-11-01

    Clinical trials traditionally employ blinding as a design mechanism to reduce the influence of placebo effects. In practice, however, it can be difficult or impossible to blind study participants and unblinded trials are common in medical research. Here we show how instrumental variables can be used to quantify and disentangle treatment and placebo effects in randomized clinical trials comparing control and active treatments in the presence of confounders. The key idea is to use randomization to separately manipulate treatment assignment and psychological encouragement conversations/interactions that increase the participants’ desire for improved symptoms. The proposed approach is able to improve the estimation of treatment effects in blinded studies and, most importantly, opens the doors to account for placebo effects in unblinded trials.

  12. Modelling wildland fire propagation by tracking random fronts

    NASA Astrophysics Data System (ADS)

    Pagnini, G.; Mentrelli, A.

    2013-11-01

    Wildland fire propagation is studied in the literature by two alternative approaches, namely the reaction-diffusion equation and the level-set method. These two approaches are considered alternatives to each other because the solution of the reaction-diffusion equation is generally a continuous smooth function with an exponential decay and infinite support, while the level-set method, which is a front-tracking technique, generates a sharp function with finite support. However, these two approaches can indeed be considered complementary and reconciled. Turbulent hot-air transport and fire spotting are phenomena with a random character that are extremely important in wildland fire propagation. As a consequence the fire front acquires a random character, too. Hence a tracking method for random fronts is needed. In particular, the level-set contour is here randomized according to the probability density function of the interface particle displacement. When the level-set method is developed for tracking a front interface with random motion, the resulting averaged process turns out to be governed by an evolution equation of the reaction-diffusion type. In this reconciled approach, the rate of spread of the fire keeps the same key, characterizing role it has in the level-set approach. The resulting model is suitable for simulating effects due to turbulent convection, such as flank and backing fire, the faster fire spread caused by hot-air pre-heating and ember landing, and the fire overcoming a firebreak zone, a case not resolved by models based on the level-set method. Moreover, the proposed formulation yields a correction to the rate-of-spread formula due to the mean jump length of firebrands in the downwind direction for the leeward sector of the fireline contour.

  13. Endovascular therapy for acute ischemic stroke.

    PubMed

    Broderick, Joseph P

    2009-03-01

    To review advances in endovascular therapy for acute ischemic stroke. Data from primate studies, randomized studies of intravenous recombinant tissue-type plasminogen activator, and nonrandomized and randomized studies of endovascular therapy were reviewed. Clinical trial data demonstrate the superiority of endovascular treatment with thrombolytic medication or mechanical methods to reopen arteries compared with control patients from the PROACT II Trial treated with heparin alone. However, these same clinical trials, as well as preclinical primate models, indicate that recanalization, whether by endovascular approaches or standard-dose recombinant tissue-type plasminogen activator, is unlikely to improve clinical outcome after a certain time point. Although the threshold beyond which reperfusion has no or little benefit has yet to be conclusively defined, accumulated data to this point indicate an overall threshold of approximately 6 to 7 hours. In addition, although the risk of symptomatic intracerebral hemorrhage is similar in trials of intravenous lytics and endovascular approaches, endovascular approaches have distinctive risk profiles that can impact outcome. The treatment of acute ischemic stroke is evolving with new tools to reopen arteries and salvage the ischemic brain. Ongoing randomized trials of these new approaches are prerequisite next steps to demonstrate whether reperfusion translates into clinical effectiveness. Physiologic time to reperfusion will remain critical no matter which tools prove most effective and safest.

  14. Effects of Cooperative Concept Mapping Teaching Approach on Secondary School Students' Motivation in Biology in Gucha District, Kenya

    ERIC Educational Resources Information Center

    Keraro, Fred Nyabuti; Wachanga, Samuel W.; Orora, William

    2007-01-01

    This study investigated the effects of using the cooperative concept mapping (CCM) teaching approach on secondary school students' motivation in biology. A nonequivalent control group design within a quasi-experimental framework was used, with a random sample of four co-educational secondary schools. The four schools were randomly…

  15. Effect of Self Regulated Learning Approach on Junior Secondary School Students' Achievement in Basic Science

    ERIC Educational Resources Information Center

    Nwafor, Chika E.; Obodo, Abigail Chikaodinaka; Okafor, Gabriel

    2015-01-01

    This study explored the effect of a self-regulated learning approach on junior secondary school students' achievement in basic science. A quasi-experimental design was used for the study. Two co-educational schools were drawn for the study through a simple random sampling technique. One school was assigned to the treatment group while the other was…

  16. Confidence intervals for a difference between lognormal means in cluster randomization trials.

    PubMed

    Poirier, Julia; Zou, G Y; Koval, John

    2017-04-01

    Cluster randomization trials, in which intact social units are randomized to different interventions, have become popular in the last 25 years. Outcomes from these trials in many cases are positively skewed, following approximately lognormal distributions. When inference is focused on the difference between treatment arm arithmetic means, existing confidence interval procedures either make restrictive assumptions or are complex to implement. We approach this problem by assuming log-transformed outcomes from each treatment arm follow a one-way random effects model. The treatment arm means are functions of multiple parameters for which separate confidence intervals are readily available, suggesting that the method of variance estimates recovery may be applied to obtain closed-form confidence intervals. A simulation study showed that this simple approach performs well in small sample sizes in terms of empirical coverage, relatively balanced tail errors, and interval widths as compared to existing methods. The methods are illustrated using data arising from a cluster randomization trial investigating a critical pathway for the treatment of community acquired pneumonia.

  17. Random matrix approach to plasmon resonances in the random impedance network model of disordered nanocomposites

    NASA Astrophysics Data System (ADS)

    Olekhno, N. A.; Beltukov, Y. M.

    2018-05-01

    Random impedance networks are widely used as a model to describe plasmon resonances in disordered metal-dielectric and other two-component nanocomposites. In the present work, the spectral properties of resonances in random networks are studied within the framework of the random matrix theory. We have shown that the appropriate ensemble of random matrices for the considered problem is the Jacobi ensemble (the MANOVA ensemble). The obtained analytical expressions for the density of states in such resonant networks show a good agreement with the results of numerical simulations in a wide range of metal filling fractions between 0 and 1.

  18. Cluster Randomized Test-Negative Design (CR-TND) Trials: A Novel and Efficient Method to Assess the Efficacy of Community Level Dengue Interventions.

    PubMed

    Anders, Katherine L; Cutcher, Zoe; Kleinschmidt, Immo; Donnelly, Christl A; Ferguson, Neil M; Indriani, Citra; O'Neill, Scott L; Jewell, Nicholas P; Simmons, Cameron P

    2018-05-07

    Cluster randomized trials are the gold standard for assessing efficacy of community-level interventions, such as vector control strategies against dengue. We describe a novel cluster randomized trial methodology with a test-negative design, which offers advantages over traditional approaches. It utilizes outcome-based sampling of patients presenting with a syndrome consistent with the disease of interest, who are subsequently classified as test-positive cases or test-negative controls on the basis of diagnostic testing. We use simulations of a cluster trial to demonstrate validity of efficacy estimates under the test-negative approach. This demonstrates that, provided study arms are balanced for both test-negative and test-positive illness at baseline and that other test-negative design assumptions are met, the efficacy estimates closely match true efficacy. We also briefly discuss analytical considerations for an odds ratio-based effect estimate arising from clustered data, and outline potential approaches to analysis. We conclude that application of the test-negative design to certain cluster randomized trials could increase their efficiency and ease of implementation.
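
    A minimal sketch of the odds-ratio-based efficacy estimate that underlies the test-negative design: aggregate counts of test-positive and test-negative presentations by arm, form the odds ratio, and report efficacy as one minus the odds ratio. The counts are hypothetical, and the naive interval shown ignores clustering, which a real cluster randomized analysis must account for (e.g., via cluster-level summaries or cluster-robust variances).

```python
import numpy as np

# Hypothetical aggregated counts from intervention and control clusters
intervention = {"pos": 45, "neg": 500}    # test-positive (dengue) / test-negative
control      = {"pos": 110, "neg": 480}

odds_ratio = (intervention["pos"] / intervention["neg"]) / \
             (control["pos"] / control["neg"])
efficacy = 1.0 - odds_ratio

# Naive Woolf-type interval on the log odds ratio (ignores clustering)
se_log_or = np.sqrt(sum(1.0 / c for arm in (intervention, control)
                        for c in arm.values()))
ci = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)
print(f"OR = {odds_ratio:.2f}, efficacy = {efficacy:.0%}, naive 95% CI for OR = {ci.round(2)}")
```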

  19. 76 FR 17654 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-30

    ... OMB Review; Comment Request Title: Evaluation of Adolescent Pregnancy Prevention Approaches-- First... as part of the Evaluation of Adolescent Pregnancy Prevention Approaches (PPA). PPA is a random assignment evaluation designed to result in rigorous evidence on effective ways to reduce teen pregnancy. The...

  20. Physiotherapy treatment approaches for the recovery of postural control and lower limb function following stroke: a systematic review.

    PubMed

    Pollock, Alex; Baer, Gillian; Langhorne, Peter; Pomeroy, Valerie

    2007-05-01

    To determine whether there is a difference in global dependency and functional independence in patients with stroke associated with different approaches to physiotherapy treatment. We searched the Cochrane Stroke Group Trials Register (last searched May 2005), Cochrane Central Register of Controlled Trials (CENTRAL) (Cochrane Library Issue 2, 2005), MEDLINE (1966 to May 2005), EMBASE (1980 to May 2005) and CINAHL (1982 to May 2005). We contacted experts and researchers with an interest in stroke rehabilitation. Inclusion criteria were: (a) randomized or quasi-randomized controlled trials; (b) adults with a clinical diagnosis of stroke; (c) physiotherapy treatment approaches aimed at promoting postural control and lower limb function; (d) measures of disability, motor impairment or participation. Two independent reviewers categorized identified trials according to the inclusion/exclusion criteria, documented the methodological quality and extracted the data. Twenty trials (1087 patients) were included in the review. Comparisons included: neurophysiological approach versus other approach; motor learning approach versus other approach; mixed approach versus other approach for the outcomes of global dependency and functional independence. A mixed approach was significantly more effective than no treatment control at improving functional independence (standardized mean difference (SMD) 0.94, 95% confidence interval (CI) 0.08 to 1.80). There were no significant differences found for any other comparisons. Physiotherapy intervention, using a 'mix' of components from different 'approaches' is more effective than no treatment control in attaining functional independence following stroke. There is insufficient evidence to conclude that any one physiotherapy 'approach' is more effective in promoting recovery of disability than any other approach.
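
    The pooled comparisons above are reported as standardized mean differences; a short sketch of the underlying per-trial statistic (Hedges' g, i.e., the small-sample-corrected SMD) is given below with hypothetical outcome scores, not data from the review.

```python
import numpy as np

def hedges_g(x_treat, x_ctrl):
    """Standardized mean difference with the small-sample (Hedges) correction."""
    x_treat, x_ctrl = np.asarray(x_treat, float), np.asarray(x_ctrl, float)
    n1, n2 = len(x_treat), len(x_ctrl)
    s_pooled = np.sqrt(((n1 - 1) * x_treat.var(ddof=1) +
                        (n2 - 1) * x_ctrl.var(ddof=1)) / (n1 + n2 - 2))
    d = (x_treat.mean() - x_ctrl.mean()) / s_pooled
    correction = 1 - 3 / (4 * (n1 + n2) - 9)      # small-sample bias correction
    return d * correction

rng = np.random.default_rng(4)
print(hedges_g(rng.normal(62, 10, 30), rng.normal(55, 10, 28)))
```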

  1. A Two-Stage Estimation Method for Random Coefficient Differential Equation Models with Application to Longitudinal HIV Dynamic Data.

    PubMed

    Fang, Yun; Wu, Hulin; Zhu, Li-Xing

    2011-07-01

    We propose a two-stage estimation method for random coefficient ordinary differential equation (ODE) models. A maximum pseudo-likelihood estimator (MPLE) is derived based on a mixed-effects modeling approach and its asymptotic properties for population parameters are established. The proposed method does not require repeatedly solving ODEs, and is computationally efficient although it does pay a price with the loss of some estimation efficiency. However, the method does offer an alternative approach when the exact likelihood approach fails due to model complexity and high-dimensional parameter space, and it can also serve as a method to obtain the starting estimates for more accurate estimation methods. In addition, the proposed method does not need to specify the initial values of state variables and preserves all the advantages of the mixed-effects modeling approach. The finite sample properties of the proposed estimator are studied via Monte Carlo simulations and the methodology is also illustrated with application to an AIDS clinical data set.

  2. Restricted spatial regression in practice: Geostatistical models, confounding, and robustness under model misspecification

    USGS Publications Warehouse

    Hanks, Ephraim M.; Schliep, Erin M.; Hooten, Mevin B.; Hoeting, Jennifer A.

    2015-01-01

    In spatial generalized linear mixed models (SGLMMs), covariates that are spatially smooth are often collinear with spatially smooth random effects. This phenomenon is known as spatial confounding and has been studied primarily in the case where the spatial support of the process being studied is discrete (e.g., areal spatial data). In this case, the most common approach suggested is restricted spatial regression (RSR) in which the spatial random effects are constrained to be orthogonal to the fixed effects. We consider spatial confounding and RSR in the geostatistical (continuous spatial support) setting. We show that RSR provides computational benefits relative to the confounded SGLMM, but that Bayesian credible intervals under RSR can be inappropriately narrow under model misspecification. We propose a posterior predictive approach to alleviating this potential problem and discuss the appropriateness of RSR in a variety of situations. We illustrate RSR and SGLMM approaches through simulation studies and an analysis of malaria frequencies in The Gambia, Africa.
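
    The core restriction in RSR is a projection of the spatial random effect (or its basis) onto the orthogonal complement of the fixed-effect design matrix. A minimal numerical sketch of that projection is given below, with synthetic data and a generic stand-in for the spatial basis; it is not the authors' geostatistical implementation.

```python
# Minimal sketch of the core RSR idea (hedged): constrain spatial basis vectors to be
# orthogonal to the fixed-effect design matrix X. Data and basis here are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # fixed-effect design (intercept + covariate)
B = rng.normal(size=(n, 10))                            # stand-in for spatial basis functions

# projection onto the orthogonal complement of the column space of X
P_perp = np.eye(n) - X @ np.linalg.solve(X.T @ X, X.T)
B_restricted = P_perp @ B                               # restricted (de-confounded) spatial basis

print(np.abs(X.T @ B_restricted).max())   # ~0: restricted basis is orthogonal to X
```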

  3. Use of the Randomization Test in Single-Case Research

    ERIC Educational Resources Information Center

    Grünke, Matthias; Boon, Richard T.; Burke, Mack D.

    2015-01-01

    The purpose of this study was to illustrate the use of the randomization test for single-case research designs (SCR; Kratochwill & Levin, 2010). To demonstrate the application of this approach, a systematic replication of Grünke, Wilbert, and Calder Stegemann (2013) was conducted to evaluate the effects of a story map to improve the reading…

  4. Randomized Controlled Trial for Behavioral Smoking and Weight Control Treatment: Effect of Concurrent Versus Sequential Intervention.

    ERIC Educational Resources Information Center

    Spring, Bonnie; Pagoto, Sherry; Pingitore, Regina; Doran, Neal; Schneider, Kristin; Hedeker, Don

    2004-01-01

    The authors compared simultaneous versus sequential approaches to multiple health behavior change in diet, exercise, and cigarette smoking. Female regular smokers (N = 315) randomized to 3 conditions received 16 weeks of behavioral smoking treatment, quit smoking at Week 5, and were followed for 9 months after quit date. Weight management was…

  5. The Impact of Random Metal Detector Searches on Contraband Possession and Feelings of Safety at School

    ERIC Educational Resources Information Center

    Bhatt, Rachana; Davis, Tomeka

    2018-01-01

    Weapons at school pose a danger to students as well as faculty. Educational administrators have attempted to reduce their prevalence by implementing random weapons searches in schools. This article examines the effectiveness of this approach using data from two geographically adjacent school districts in Florida (Miami-Dade and Broward). In the…

  6. Radiative transfer in multilayered random medium with laminar structure - Green's function approach

    NASA Technical Reports Server (NTRS)

    Karam, M. A.; Fung, A. K.

    1986-01-01

    For a multilayered random medium with a laminar structure a Green's function approach is introduced to obtain the emitted intensity due to an arbitrary point source. It is then shown that the approach is applicable to both active and passive remote sensing. In active remote sensing, the computed radar backscattering cross section for the multilayered medium includes the effects of both volume multiple scattering and surface multiple scattering at the layer boundaries. In passive remote sensing, the brightness temperature is obtained for arbitrary temperature profiles in the layers. As an illustration the brightness temperature and reflectivity are calculated for a bounded layer and compared with results in the literature.

  7. Which Approach Is More Effective in the Selection of Plants with Antimicrobial Activity?

    PubMed Central

    Silva, Ana Carolina Oliveira; Santana, Elidiane Fonseca; Saraiva, Antonio Marcos; Coutinho, Felipe Neves; Castro, Ricardo Henrique Acre; Pisciottano, Maria Nelly Caetano; Amorim, Elba Lúcia Cavalcanti; Albuquerque, Ulysses Paulino

    2013-01-01

    The development of the present study was based on selections using random, direct ethnopharmacological, and indirect ethnopharmacological approaches, aiming to evaluate which method is the best for bioprospecting new antimicrobial plant drugs. A crude extract of 53 species of herbaceous plants collected in the semiarid region of Northeast Brazil was tested against 11 microorganisms. Well-agar diffusion and minimum inhibitory concentration (MIC) techniques were used. Ten extracts from direct, six from random, and three from indirect ethnopharmacological selections exhibited activities that ranged from weak to very active against the organisms tested. The strain most susceptible to the evaluated extracts was Staphylococcus aureus. The MIC analysis revealed the best result for the direct ethnopharmacological approach, considering that some species yielded extracts classified as active or moderately active (MICs between 250 and 1000 µg/mL). Furthermore, one species from this approach inhibited the growth of the three Candida strains. Thus, it was concluded that the direct ethnopharmacological approach is the most effective when selecting species for bioprospecting new plant drugs with antimicrobial activities. PMID:23878595

  8. Depth Reconstruction from Single Images Using a Convolutional Neural Network and a Condition Random Field Model.

    PubMed

    Liu, Dan; Liu, Xuejun; Wu, Yiguang

    2018-04-24

    This paper presents an effective approach for depth reconstruction from a single image through the incorporation of semantic information and local details from the image. A unified framework for depth acquisition is constructed by joining a deep Convolutional Neural Network (CNN) and a continuous pairwise Conditional Random Field (CRF) model. Semantic information and relative depth trends of local regions inside the image are integrated into the framework. A deep CNN network is firstly used to automatically learn a hierarchical feature representation of the image. To get more local details in the image, the relative depth trends of local regions are incorporated into the network. Combined with semantic information of the image, a continuous pairwise CRF is then established and is used as the loss function of the unified model. Experiments on real scenes demonstrate that the proposed approach is effective and that the approach obtains satisfactory results.

  9. A Hybrid Approach to Protect Palmprint Templates

    PubMed Central

    Sun, Dongmei; Xiong, Ke; Qiu, Zhengding

    2014-01-01

    Biometric template protection is indispensable to protect personal privacy in large-scale deployment of biometric systems. Accuracy, changeability, and security are three critical requirements for template protection algorithms. However, existing template protection algorithms cannot satisfy all these requirements well. In this paper, we propose a hybrid approach that combines random projection and fuzzy vault to improve the performances at these three points. Heterogeneous space is designed for combining random projection and fuzzy vault properly in the hybrid scheme. New chaff point generation method is also proposed to enhance the security of the heterogeneous vault. Theoretical analyses of proposed hybrid approach in terms of accuracy, changeability, and security are given in this paper. Palmprint database based experimental results well support the theoretical analyses and demonstrate the effectiveness of proposed hybrid approach. PMID:24982977

  10. A hybrid approach to protect palmprint templates.

    PubMed

    Liu, Hailun; Sun, Dongmei; Xiong, Ke; Qiu, Zhengding

    2014-01-01

    Biometric template protection is indispensable to protect personal privacy in large-scale deployment of biometric systems. Accuracy, changeability, and security are three critical requirements for template protection algorithms. However, existing template protection algorithms cannot satisfy all these requirements well. In this paper, we propose a hybrid approach that combines random projection and fuzzy vault to improve the performances at these three points. Heterogeneous space is designed for combining random projection and fuzzy vault properly in the hybrid scheme. New chaff point generation method is also proposed to enhance the security of the heterogeneous vault. Theoretical analyses of proposed hybrid approach in terms of accuracy, changeability, and security are given in this paper. Palmprint database based experimental results well support the theoretical analyses and demonstrate the effectiveness of proposed hybrid approach.
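
    As a hedged sketch of just the random-projection component of such hybrid schemes (the fuzzy-vault and chaff-point steps are omitted), the snippet below projects a feature vector with a key-seeded random matrix, illustrating how a template can be revoked by re-issuing the projection.

```python
# Hedged sketch of the random-projection step only (not the full fuzzy-vault scheme):
# project a palmprint feature vector with a user-specific random matrix, so a compromised
# template can be revoked by issuing a new projection matrix.
import numpy as np

def random_projection(features: np.ndarray, out_dim: int, seed: int) -> np.ndarray:
    rng = np.random.default_rng(seed)          # the seed plays the role of a user-specific key
    R = rng.normal(size=(out_dim, features.size)) / np.sqrt(out_dim)
    return R @ features

x = np.random.default_rng(0).normal(size=1024)             # stand-in palmprint feature vector
template_v1 = random_projection(x, out_dim=128, seed=42)
template_v2 = random_projection(x, out_dim=128, seed=43)   # revoked and re-issued template
print(template_v1.shape, np.corrcoef(template_v1, template_v2)[0, 1])
```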

  11. Design approaches to experimental mediation

    PubMed Central

    Pirlott, Angela G.; MacKinnon, David P.

    2016-01-01

    Identifying causal mechanisms has become a cornerstone of experimental social psychology, and editors in top social psychology journals champion the use of mediation methods, particularly innovative ones when possible (e.g. Halberstadt, 2010, Smith, 2012). Commonly, studies in experimental social psychology randomly assign participants to levels of the independent variable and measure the mediating and dependent variables, and the mediator is assumed to causally affect the dependent variable. However, participants are not randomly assigned to levels of the mediating variable(s), i.e., the relationship between the mediating and dependent variables is correlational. Although researchers likely know that correlational studies pose a risk of confounding, this problem seems forgotten when thinking about experimental designs randomly assigning participants to levels of the independent variable and measuring the mediator (i.e., “measurement-of-mediation” designs). Experimentally manipulating the mediator provides an approach to solving these problems, yet these methods contain their own set of challenges (e.g., Bullock, Green, & Ha, 2010). We describe types of experimental manipulations targeting the mediator (manipulations demonstrating a causal effect of the mediator on the dependent variable and manipulations targeting the strength of the causal effect of the mediator) and types of experimental designs (double randomization, concurrent double randomization, and parallel), provide published examples of the designs, and discuss the strengths and challenges of each design. Therefore, the goals of this paper include providing a practical guide to manipulation-of-mediator designs in light of their challenges and encouraging researchers to use more rigorous approaches to mediation because manipulation-of-mediator designs strengthen the ability to infer causality of the mediating variable on the dependent variable. PMID:27570259

  12. Design approaches to experimental mediation.

    PubMed

    Pirlott, Angela G; MacKinnon, David P

    2016-09-01

    Identifying causal mechanisms has become a cornerstone of experimental social psychology, and editors in top social psychology journals champion the use of mediation methods, particularly innovative ones when possible (e.g. Halberstadt, 2010, Smith, 2012). Commonly, studies in experimental social psychology randomly assign participants to levels of the independent variable and measure the mediating and dependent variables, and the mediator is assumed to causally affect the dependent variable. However, participants are not randomly assigned to levels of the mediating variable(s), i.e., the relationship between the mediating and dependent variables is correlational. Although researchers likely know that correlational studies pose a risk of confounding, this problem seems forgotten when thinking about experimental designs randomly assigning participants to levels of the independent variable and measuring the mediator (i.e., "measurement-of-mediation" designs). Experimentally manipulating the mediator provides an approach to solving these problems, yet these methods contain their own set of challenges (e.g., Bullock, Green, & Ha, 2010). We describe types of experimental manipulations targeting the mediator (manipulations demonstrating a causal effect of the mediator on the dependent variable and manipulations targeting the strength of the causal effect of the mediator) and types of experimental designs (double randomization, concurrent double randomization, and parallel), provide published examples of the designs, and discuss the strengths and challenges of each design. Therefore, the goals of this paper include providing a practical guide to manipulation-of-mediator designs in light of their challenges and encouraging researchers to use more rigorous approaches to mediation because manipulation-of-mediator designs strengthen the ability to infer causality of the mediating variable on the dependent variable.

  13. The role of randomized cluster crossover trials for comparative effectiveness testing in anesthesia: design of the Benzodiazepine-Free Cardiac Anesthesia for Reduction in Postoperative Delirium (B-Free) trial.

    PubMed

    Spence, Jessica; Belley-Côté, Emilie; Lee, Shun Fu; Bangdiwala, Shrikant; Whitlock, Richard; LeManach, Yannick; Syed, Summer; Lamy, Andre; Jacobsohn, Eric; MacIsaac, Sarah; Devereaux, P J; Connolly, Stuart

    2018-07-01

    Increasingly, clinicians and researchers recognize that studies of interventions need to evaluate not only their therapeutic efficacy (i.e., the effect on an outcome in ideal, controlled settings) but also their real-world effectiveness in broad, unselected patient groups. Effectiveness trials inform clinical practice by comparing variations in therapeutic approaches that fall within the standard of care. In this article, we discuss the need for studies of comparative effectiveness in anesthesia and the limitations of individual patient randomized-controlled trials in determining comparative effectiveness. We introduce the concept of randomized cluster crossover trials as a means of answering questions of comparative effectiveness in anesthesia, using the design of the Benzodiazepine-Free Cardiac Anesthesia for Reduction in Postoperative Delirium (B-Free) trial (Clinicaltrials.gov identifier NCT03053869).

  14. The Effect of Learning Cycle Constructivist-Based Approach on Students' Academic Achievement and Attitude towards Chemistry in Secondary Schools in North-Eastern Part of Nigeria

    ERIC Educational Resources Information Center

    Jack, Gladys Uzezi

    2017-01-01

    This study investigated the effect of learning cycle constructivist-based approach on secondary schools students' academic achievement and their attitude towards chemistry. The design used was a pre-test, post-test non randomized control group quasi experimental research design. The design consisted of two instructional groups (learning cycle…

  15. Effects of Eclectic Learning Approach on Students' Academic Achievement and Retention in English at Elementary Level

    ERIC Educational Resources Information Center

    Suleman, Qaiser; Hussain, Ishtiaq

    2016-01-01

    The purpose of the research paper was to investigate the effect of eclectic learning approach on the academic achievement and retention of students in English at elementary level. A sample of forty students of 8th grade randomly selected from Government Boys High School Khurram District Karak was used. It was an experimental study and that's why…

  16. Effectiveness of Mutual Learning Approach in the Academic Achievement of B.Ed Students in Learning Optional II English

    ERIC Educational Resources Information Center

    Arulselvi, Evangelin

    2013-01-01

    The present study aims at finding out the effectiveness of Mutual learning approach over the conventional method in learning English optional II among B.Ed students. The randomized pre-test, post test, control group and experimental group design was employed. The B.Ed students of the same college formed the control and experimental groups. Each…

  17. A functional renormalization method for wave propagation in random media

    NASA Astrophysics Data System (ADS)

    Lamagna, Federico; Calzetta, Esteban

    2017-08-01

    We develop the exact renormalization group approach as a way to evaluate the effective speed of the propagation of a scalar wave in a medium with random inhomogeneities. We use the Martin-Siggia-Rose formalism to translate the problem into a non equilibrium field theory one, and then consider a sequence of models with a progressively lower infrared cutoff; in the limit where the cutoff is removed we recover the problem of interest. As a test of the formalism, we compute the effective dielectric constant of an homogeneous medium interspersed with randomly located, interpenetrating bubbles. A simple approximation to the renormalization group equations turns out to be equivalent to a self-consistent two-loops evaluation of the effective dielectric constant.

  18. Simplification of Markov chains with infinite state space and the mathematical theory of random gene expression bursts.

    PubMed

    Jia, Chen

    2017-09-01

    Here we develop an effective approach to simplify two-time-scale Markov chains with infinite state spaces by removal of states with fast leaving rates, which improves the simplification method of finite Markov chains. We introduce the concept of fast transition paths and show that the effective transitions of the reduced chain can be represented as the superposition of the direct transitions and the indirect transitions via all the fast transition paths. Furthermore, we apply our simplification approach to the standard Markov model of single-cell stochastic gene expression and provide a mathematical theory of random gene expression bursts. We give the precise mathematical conditions for the bursting kinetics of both mRNAs and proteins. It turns out that random bursts exactly correspond to the fast transition paths of the Markov model. This helps us gain a better understanding of the physics behind the bursting kinetics as an emergent behavior from the fundamental multiscale biochemical reaction kinetics of stochastic gene expression.

  19. Simplification of Markov chains with infinite state space and the mathematical theory of random gene expression bursts

    NASA Astrophysics Data System (ADS)

    Jia, Chen

    2017-09-01

    Here we develop an effective approach to simplify two-time-scale Markov chains with infinite state spaces by removal of states with fast leaving rates, which improves the simplification method of finite Markov chains. We introduce the concept of fast transition paths and show that the effective transitions of the reduced chain can be represented as the superposition of the direct transitions and the indirect transitions via all the fast transition paths. Furthermore, we apply our simplification approach to the standard Markov model of single-cell stochastic gene expression and provide a mathematical theory of random gene expression bursts. We give the precise mathematical conditions for the bursting kinetics of both mRNAs and proteins. It turns out that random bursts exactly correspond to the fast transition paths of the Markov model. This helps us gain a better understanding of the physics behind the bursting kinetics as an emergent behavior from the fundamental multiscale biochemical reaction kinetics of stochastic gene expression.
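
    A hedged finite-state sketch of the removal idea is shown below: a fast state of a continuous-time Markov chain is eliminated by rerouting its incoming flux along its outgoing branching probabilities. The paper's construction for infinite state spaces and its bursting analysis are more general; this only illustrates the basic reduction step.

```python
# Hedged sketch of the general idea on a *finite* generator matrix: eliminate a fast state
# by rerouting its incoming probability flux along its outgoing transitions.
import numpy as np

def eliminate_state(Q: np.ndarray, k: int) -> np.ndarray:
    """Remove state k from CTMC generator Q, redistributing flux through k."""
    keep = [i for i in range(Q.shape[0]) if i != k]
    leave_rate = -Q[k, k]                        # total rate of leaving the fast state
    branch = Q[k, keep] / leave_rate             # branching probabilities out of k
    Q_red = Q[np.ix_(keep, keep)] + np.outer(Q[keep, k], branch)
    np.fill_diagonal(Q_red, 0.0)
    np.fill_diagonal(Q_red, -Q_red.sum(axis=1))  # restore generator property (rows sum to 0)
    return Q_red

# three-state example where state 1 is fast (large leaving rate)
Q = np.array([[-1.0,    1.0,   0.0],
              [50.0, -100.0,  50.0],
              [ 0.0,    2.0,  -2.0]])
print(eliminate_state(Q, k=1))
```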

  20. Random walk study of electron motion in helium in crossed electromagnetic fields

    NASA Technical Reports Server (NTRS)

    Englert, G. W.

    1972-01-01

    Random walk theory, previously adapted to electron motion in the presence of an electric field, is extended to include a transverse magnetic field. In principle, the random walk approach avoids mathematical complexity and concomitant simplifying assumptions and permits determination of energy distributions and transport coefficients within the accuracy of available collisional cross section data. Application is made to a weakly ionized helium gas. Time of relaxation of electron energy distribution, determined by the random walk, is described by simple expressions based on energy exchange between the electron and an effective electric field. The restrictive effect of the magnetic field on electron motion, which increases the required number of collisions per walk to reach a terminal steady state condition, as well as the effect of the magnetic field on electron transport coefficients and mean energy can be quite adequately described by expressions involving only the Hall parameter.

  1. Emergency skill training--a randomized controlled study on the effectiveness of the 4-stage approach compared to traditional clinical teaching.

    PubMed

    Greif, Robert; Egger, Lars; Basciani, Reto M; Lockey, Andrew; Vogt, Andreas

    2010-12-01

    The "4-stage approach" has been widely accepted for practical skill training replacing the traditional 2 stages ("see one, do one"). However, the superior effectiveness of the 4-stage approach was never proved. To evaluate whether skill training with the 4-stage approach results in shorter performance time needed for a successful percutaneous needle-puncture cricothyroidotomy, and consequently in a reduced number of attempts needed to perform the skill in <60s compared to traditional teaching. Randomized controlled single-blinded parallel group study at the University Hospital Bern. With IRB approval and informed consent 128 undergraduate medical students were randomized in four groups: traditional teaching, no stage 2, no stage 3, and 4-stage approach for the training of cricothyroidotomy. Everyone watched a video of the cricothyroidotomy as stage 1 followed by skill training in the respective teaching group. Participants had to perform the cricothyroidotomy 10 times on skin-covered pig larynxes. Performance time was measured from skin palpation to trachea ventilation. Study participants filled out a self-rating on competency during the training. Performance time for each attempt was comparable in all groups and improved similarly to reach a performance time of <60 s. Self-rating revealed that all groups felt equally competent throughout. Even if the 4-stage approach is widely accepted and used as a didactic method for skill teaching we could not find evidence that its use or omitting stage 2 or 3 results in superior learning of an emergency skill compared to traditional teaching. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  2. The response of an airplane to random atmospheric disturbances

    NASA Technical Reports Server (NTRS)

    Diederich, Franklin W

    1957-01-01

    The statistical approach to the gust-load problem which consists in considering flight through turbulent air to be a stationary random process is extended by including the effect of lateral variation of the instantaneous gust intensity on the aerodynamic forces. The forces obtained in this manner are used in dynamic analyses of rigid and flexible airplanes free to move vertically, in pitch, and in roll. The effect of the interaction of longitudinal, vertical, and lateral gusts on the wing stresses is also considered.

  3. An Ecological Approach of Constraint Induced Movement Therapy for 2-3-Year-Old Children: A Randomized Control Trial

    ERIC Educational Resources Information Center

    Eliasson, Ann-Christin; Shaw, Karin; Berg, Elisabeth; Krumlinde-Sundholm, Lena

    2011-01-01

    The aim was to evaluate the effect of Eco-CIMT in young children with unilateral cerebral palsy in a randomized controlled crossover design. The training was implemented within the regular pediatric services, provided by the child's parents and/or preschool teacher and supervised by the child's regular therapist. Methods: Twenty-five children…

  4. Modeling of Academic Achievement of Primary School Students in Ethiopia Using Bayesian Multilevel Approach

    ERIC Educational Resources Information Center

    Sebro, Negusse Yohannes; Goshu, Ayele Taye

    2017-01-01

    This study aims to explore Bayesian multilevel modeling to investigate variations of average academic achievement of grade eight school students. A sample of 636 students is randomly selected from 26 private and government schools by a two-stage stratified sampling design. Bayesian method is used to estimate the fixed and random effects. Input and…

  5. Evaluating the Effectiveness of Developmental Mathematics by Embedding a Randomized Experiment within a Regression Discontinuity Design

    ERIC Educational Resources Information Center

    Moss, Brian G.; Yeaton, William H.; Lloyd, Jane E.

    2014-01-01

    Using a novel design approach, a randomized experiment (RE) was embedded within a regression discontinuity (RD) design (R-RE-D) to evaluate the impact of developmental mathematics at a large midwestern college ("n" = 2,122). Within a region of uncertainty near the cut-score, estimates of benefit from a prospective RE were closely…

  6. Vijana Vijiweni II: a cluster-randomized trial to evaluate the efficacy of a microfinance and peer health leadership intervention for HIV and intimate partner violence prevention among social networks of young men in Dar es Salaam.

    PubMed

    Kajula, Lusajo; Balvanz, Peter; Kilonzo, Mrema Noel; Mwikoko, Gema; Yamanis, Thespina; Mulawa, Marta; Kajuna, Deus; Hill, Lauren; Conserve, Donaldson; Reyes, Heathe Luz McNaughton; Leatherman, Sheila; Singh, Basant; Maman, Suzanne

    2016-02-03

    Intimate partner violence (IPV) and sexually transmitted infections (STIs), including HIV, remain important public health problems with devastating health effects for men and women in sub-Saharan Africa. There have been calls to engage men in prevention efforts, however, we lack effective approaches to reach and engage them. Social network approaches have demonstrated effective and sustained outcomes on changing risk behaviors in the U.S. Our team has identified and engaged naturally occurring social networks comprised mostly of young men in Dar es Salaam in an intervention designed to jointly reduce STI incidence and the perpetration of IPV. These stable networks are locally referred to as "camps." In a pilot study we demonstrated the feasibility and acceptability of a combined microfinance and peer health leadership intervention within these camp-based peer networks. We are implementing a cluster-randomized trial to evaluate the efficacy of an intervention combining microfinance with health leadership training in 60 camps in Dar es Salaam, Tanzania. Half of the camps have been randomized to the intervention arm, and half to a control arm. The camps in the intervention arm will receive a combined microfinance and health leadership intervention for a period of two years. The camps in the control arm will receive a delayed intervention. We have enrolled 1,258 men across the 60 study camps. Behavioral surveys will be conducted at baseline, 12-months post intervention launch and 30-month post intervention launch and biological samples will be drawn to test for Neisseria gonorrhea (NG), Chlamydia trachomatis (CT), and Trichomonas vaginalis (TV) at baseline and 30-months. The primary endpoints for assessing intervention impact are IPV perpetration and STI incidence. This is the first cluster-randomized trial targeting social networks of men in sub-Saharan Africa that jointly addresses HIV and IPV perpetration and has both biological and behavioral endpoints. Effective approaches to engage men in HIV and IPV prevention are needed in low resource, high prevalence settings like Tanzania. If we determine that this approach is effective, we will examine how to adapt and scale up this approach to other urban, sub-Saharan African settings. Clinical Trials.gov: NCT01865383 . Registration date: May 24, 2013.

  7. Causal mediation analysis for longitudinal data with exogenous exposure

    PubMed Central

    Bind, M.-A. C.; Vanderweele, T. J.; Coull, B. A.; Schwartz, J. D.

    2016-01-01

    Mediation analysis is a valuable approach to examine pathways in epidemiological research. Prospective cohort studies are often conducted to study biological mechanisms and often collect longitudinal measurements on each participant. Mediation formulae for longitudinal data have been developed. Here, we formalize the natural direct and indirect effects using a causal framework with potential outcomes that allows for an interaction between the exposure and the mediator. To allow different types of longitudinal measures of the mediator and outcome, we assume two generalized mixed-effects models for both the mediator and the outcome. The model for the mediator has subject-specific random intercepts and random exposure slopes for each cluster, and the outcome model has random intercepts and random slopes for the exposure, the mediator, and their interaction. We also expand our approach to settings with multiple mediators and derive the mediated effects, jointly through all mediators. Our method requires the absence of time-varying confounding with respect to the exposure and the mediator. This assumption is achieved in settings with exogenous exposure and mediator, especially when exposure and mediator are not affected by variables measured at earlier time points. We apply the methodology to data from the Normative Aging Study and estimate the direct and indirect effects, via DNA methylation, of air pollution, and temperature on intercellular adhesion molecule 1 (ICAM-1) protein levels. Our results suggest that air pollution and temperature have a direct effect on ICAM-1 protein levels (i.e. not through a change in ICAM-1 DNA methylation) and that temperature has an indirect effect via a change in ICAM-1 DNA methylation. PMID:26272993
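
    For orientation, the simplest cross-sectional versions of the natural direct and indirect effects under linear mediator and outcome models with an exposure-mediator interaction are shown below (standard formulas; the article's longitudinal mixed-effects derivation generalizes them). All coefficient values are hypothetical.

```python
# Hedged sketch: natural direct/indirect effects under simple linear models with an
# exposure-mediator interaction (standard cross-sectional formulas; coefficients made up).
def natural_effects(beta0, beta1, beta2, theta1, theta2, theta3, a, a_star, c):
    # mediator model:  M = beta0 + beta1*A + beta2*C
    # outcome model :  Y = theta0 + theta1*A + theta2*M + theta3*A*M + theta4*C
    nde = (theta1 + theta3 * (beta0 + beta1 * a_star + beta2 * c)) * (a - a_star)
    nie = (theta2 * beta1 + theta3 * beta1 * a) * (a - a_star)
    return nde, nie

nde, nie = natural_effects(beta0=0.2, beta1=0.5, beta2=0.1,
                           theta1=0.3, theta2=0.4, theta3=0.05,
                           a=1.0, a_star=0.0, c=0.0)
print(f"NDE = {nde:.3f}, NIE = {nie:.3f}, total = {nde + nie:.3f}")
```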

  8. Comparing cluster-level dynamic treatment regimens using sequential, multiple assignment, randomized trials: Regression estimation and sample size considerations.

    PubMed

    NeCamp, Timothy; Kilbourne, Amy; Almirall, Daniel

    2017-08-01

    Cluster-level dynamic treatment regimens can be used to guide sequential treatment decision-making at the cluster level in order to improve outcomes at the individual or patient-level. In a cluster-level dynamic treatment regimen, the treatment is potentially adapted and re-adapted over time based on changes in the cluster that could be impacted by prior intervention, including aggregate measures of the individuals or patients that compose it. Cluster-randomized sequential multiple assignment randomized trials can be used to answer multiple open questions preventing scientists from developing high-quality cluster-level dynamic treatment regimens. In a cluster-randomized sequential multiple assignment randomized trial, sequential randomizations occur at the cluster level and outcomes are observed at the individual level. This manuscript makes two contributions to the design and analysis of cluster-randomized sequential multiple assignment randomized trials. First, a weighted least squares regression approach is proposed for comparing the mean of a patient-level outcome between the cluster-level dynamic treatment regimens embedded in a sequential multiple assignment randomized trial. The regression approach facilitates the use of baseline covariates which is often critical in the analysis of cluster-level trials. Second, sample size calculators are derived for two common cluster-randomized sequential multiple assignment randomized trial designs for use when the primary aim is a between-dynamic treatment regimen comparison of the mean of a continuous patient-level outcome. The methods are motivated by the Adaptive Implementation of Effective Programs Trial which is, to our knowledge, the first-ever cluster-randomized sequential multiple assignment randomized trial in psychiatry.
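
    A minimal sketch of the analysis idea, not the authors' estimator: a patient-level outcome is regressed on a regimen indicator and a baseline covariate by weighted least squares, with cluster-robust standard errors, using statsmodels. Data, weights, and variable names are hypothetical; in a real SMART analysis the weights would reflect the sequential randomization probabilities.

```python
# Hedged sketch: weighted least squares comparison of a patient-level outcome with
# cluster-robust standard errors. Simulated data; not the paper's exact estimator.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_clusters, n_per = 30, 20
df = pd.DataFrame({
    "cluster": np.repeat(np.arange(n_clusters), n_per),
    "dtr": np.repeat(rng.integers(0, 2, n_clusters), n_per),   # embedded regimen indicator
    "baseline": rng.normal(size=n_clusters * n_per),
})
cluster_effect = np.repeat(rng.normal(scale=0.5, size=n_clusters), n_per)
df["y"] = 0.3 * df["dtr"] + 0.2 * df["baseline"] + cluster_effect + rng.normal(size=len(df))
df["w"] = 1.0   # SMART design weights (e.g. inverse randomization probabilities) would go here

X = sm.add_constant(df[["dtr", "baseline"]])
fit = sm.WLS(df["y"], X, weights=df["w"]).fit(cov_type="cluster",
                                              cov_kwds={"groups": df["cluster"]})
print(fit.summary().tables[1])
```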

  9. A Robust Random Forest-Based Approach for Heart Rate Monitoring Using Photoplethysmography Signal Contaminated by Intense Motion Artifacts.

    PubMed

    Ye, Yalan; He, Wenwen; Cheng, Yunfei; Huang, Wenxia; Zhang, Zhilin

    2017-02-16

    The estimation of heart rate (HR) based on wearable devices is of interest in fitness. Photoplethysmography (PPG) is a promising approach to estimate HR due to low cost; however, it is easily corrupted by motion artifacts (MA). In this work, a robust approach based on random forest is proposed for accurately estimating HR from the photoplethysmography signal contaminated by intense motion artifacts, consisting of two stages. Stage 1 proposes a hybrid method to effectively remove MA with a low computation complexity, where two MA removal algorithms are combined by an accurate binary decision algorithm whose aim is to decide whether or not to adopt the second MA removal algorithm. Stage 2 proposes a random forest-based spectral peak-tracking algorithm, whose aim is to locate the spectral peak corresponding to HR, formulating the problem of spectral peak tracking into a pattern classification problem. Experiments on the PPG datasets including 22 subjects used in the 2015 IEEE Signal Processing Cup showed that the proposed approach achieved the average absolute error of 1.65 beats per minute (BPM) on the 22 PPG datasets. Compared to state-of-the-art approaches, the proposed approach has better accuracy and robustness to intense motion artifacts, indicating its potential use in wearable sensors for health monitoring and fitness tracking.
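
    The following sketch captures only the core idea of stage 2, treating spectral peak tracking as a supervised classification problem with a random forest (scikit-learn). The features, labels, and data are synthetic stand-ins, not the paper's pipeline or the IEEE Signal Processing Cup data.

```python
# Hedged sketch: classify candidate spectral peaks as "HR peak" vs "not HR peak" with a
# random forest, using synthetic per-peak features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
# hypothetical per-peak features: amplitude, distance (BPM) to the previous HR estimate,
# and a crude motion-artifact score
X = np.column_stack([
    rng.normal(1.0, 0.5, n),
    np.abs(rng.normal(0.0, 15.0, n)),
    rng.uniform(0, 1, n),
])
# synthetic labels: true HR peaks tend to be strong and close to the previous estimate
y = ((X[:, 0] > 0.9) & (X[:, 1] < 10) & (X[:, 2] < 0.7)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.3f}")
```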

  10. Increase in the Random Dopant Induced Threshold Fluctuations and Lowering in Sub 100 nm MOSFETs Due to Quantum Effects: A 3-D Density-Gradient Simulation Study

    NASA Technical Reports Server (NTRS)

    Asenov, Asen; Slavcheva, G.; Brown, A. R.; Davies, J. H.; Saini, S.

    2000-01-01

    In this paper we present a detailed simulation study of the influence of quantum mechanical effects in the inversion layer on random dopant induced threshold voltage fluctuations and lowering in sub 100 nm MOSFETs. The simulations have been performed using a 3-D implementation of the density gradient (DG) formalism incorporated in our established 3-D atomistic simulation approach. This results in a self-consistent 3-D quantum mechanical picture, which implies not only the vertical inversion layer quantisation but also the lateral confinement effects related to current filamentation in the 'valleys' of the random potential fluctuations. We have shown that the net result of including quantum mechanical effects, while considering statistical dopant fluctuations, is an increase in both threshold voltage fluctuations and lowering. At the same time, the random dopant induced threshold voltage lowering partially compensates for the quantum mechanical threshold voltage shift in aggressively scaled MOSFETs with ultrathin gate oxides.

  11. Evaluation of empowerment model on indicators of metabolic control in patients with type 2 diabetes, a randomized clinical trial study.

    PubMed

    Ebrahimi, Hossein; Sadeghi, Mahdi; Amanpour, Farzaneh; Vahedi, Hamid

    2016-04-01

    Diabetes education is a major component of achieving optimal glycemic control, and an effective empowerment approach can be beneficial for improving patients' health. The aim of this study was to evaluate the effect of an empowerment model on indicators of metabolic control in patients with type 2 diabetes. In a randomized controlled trial conducted in 2014, 103 patients with type 2 diabetes were randomly assigned to either the intervention group (empowerment approach training) or the control group (conventional training). Empowerment approach training was delivered to the experimental group for eight weeks. Data collection tools included a demographic information form and an indicators-of-metabolic-control checklist. Analysis was performed by one-way analysis of variance, chi-square test, paired t-test, independent t-test and multiple linear regression. Before the intervention, the two groups were homogeneous in terms of demographic variables, glycosylated hemoglobin (HbA1C), and other indicators of metabolic control. After the intervention, average HbA1C and all other metabolic indicators except LDL showed significant differences in the experimental group compared to the control group. The results indicate the positive effects of applying the empowerment model on metabolic control indicators; applying this model is therefore recommended to nurses and the relevant authorities in order to improve clinical outcomes in diabetic patients. Copyright © 2015 Primary Care Diabetes Europe. Published by Elsevier Ltd. All rights reserved.

  12. Teaching Semantic Prosody of English Verbs through the DDL Approach and Its Effect on Learners' Vocabulary Choice Appropriateness in a Persian EFL Context

    ERIC Educational Resources Information Center

    Mansoory, Niloofar; Jafarpour, Mohsen

    2014-01-01

    This study examined teaching SP of English verbs through the data-driven learning (DDL) approach and its effect on learners' vocabulary choice appropriateness in the Persian English foreign language (EFL) context. In the present study, two male intact classes were selected. One of these two classes was randomly selected as a treatment group and…

  13. Effects of Making Sense of SCIENCE[TM] Professional Development on the Achievement of Middle School Students, Including English Language Learners. Final Report. NCEE 2012-4002

    ERIC Educational Resources Information Center

    Heller, Joan I.

    2012-01-01

    This study evaluated an approach to professional development for middle school science teachers by closely examining one grade 8 course that embodies that approach. Using a cluster-randomized experimental design, the study tested the effectiveness of the Making Sense of SCIENCE[TM] professional development course on force and motion (Daehler,…

  14. Fitting and Calibrating a Multilevel Mixed-Effects Stem Taper Model for Maritime Pine in NW Spain

    PubMed Central

    Arias-Rodil, Manuel; Castedo-Dorado, Fernando; Cámara-Obregón, Asunción; Diéguez-Aranda, Ulises

    2015-01-01

    Stem taper data are usually hierarchical (several measurements per tree, and several trees per plot), making application of a multilevel mixed-effects modelling approach essential. However, correlation between trees in the same plot/stand has often been ignored in previous studies. Fitting and calibration of a variable-exponent stem taper function were conducted using data from 420 trees felled in even-aged maritime pine (Pinus pinaster Ait.) stands in NW Spain. In the fitting step, the tree level explained much more variability than the plot level, and therefore calibration at plot level was omitted. Several stem heights were evaluated for measurement of the additional diameter needed for calibration at tree level. Calibration with an additional diameter measured at between 40 and 60% of total tree height showed the greatest improvement in volume and diameter predictions. If additional diameter measurement is not available, the fixed-effects model fitted by the ordinary least squares technique should be used. Finally, we also evaluated how the expansion of parameters with random effects affects the stem taper prediction, as we consider this a key question when applying the mixed-effects modelling approach to taper equations. The results showed that correlation between random effects should be taken into account when assessing the influence of random effects in stem taper prediction. PMID:26630156
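
    The calibration step can be illustrated for a linear mixed model: given one extra observation, a new subject's random effects are predicted with the usual EBLUP formula and then used in prediction. The sketch below uses made-up numbers and a linear model, whereas the taper function in the study is nonlinear.

```python
# Hedged sketch of random-effects calibration (EBLUP) for a *linear* mixed model.
import numpy as np

def calibrate_random_effects(y_new, X_new, Z_new, beta, D, sigma2):
    """Empirical best linear unbiased predictor of a new subject's random effects."""
    R = sigma2 * np.eye(len(y_new))
    V = Z_new @ D @ Z_new.T + R
    return D @ Z_new.T @ np.linalg.solve(V, y_new - X_new @ beta)

# one extra measurement (hypothetical numbers): fixed part X*beta plus a random
# intercept and slope in Z
beta = np.array([30.0, -0.25])            # population-level coefficients
D = np.diag([4.0, 0.01])                  # estimated random-effects covariance
X_new = np.array([[1.0, 10.0]])           # e.g. intercept and relative height of measurement
Z_new = np.array([[1.0, 10.0]])
y_new = np.array([26.0])                  # the additional measured diameter

b_hat = calibrate_random_effects(y_new, X_new, Z_new, beta, D, sigma2=1.0)
print("calibrated random effects:", b_hat)
```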

  15. Predicting longitudinal trajectories of health probabilities with random-effects multinomial logit regression.

    PubMed

    Liu, Xian; Engel, Charles C

    2012-12-20

    Researchers often encounter longitudinal health data characterized by three or more ordinal or nominal categories. Random-effects multinomial logit models are generally applied to account for the potential lack of independence inherent in such clustered data. When parameter estimates are used to describe longitudinal processes, however, random effects, both between and within individuals, need to be retransformed to correctly predict outcome probabilities. This study attempts to go beyond existing work by developing a retransformation method that derives longitudinal growth trajectories of unbiased health probabilities. We estimated variances of the predicted probabilities by using the delta method. Additionally, we transformed the covariates' regression coefficients on the multinomial logit scale, which are not substantively meaningful in themselves, into conditional effects on the predicted probabilities. The empirical illustration uses longitudinal data from the Asset and Health Dynamics among the Oldest Old study. Our analysis compared three sets of predicted probabilities of three health states at six time points, obtained from, respectively, the retransformation method, the best linear unbiased prediction, and the fixed-effects approach. The results demonstrate that neglecting to retransform random errors in the random-effects multinomial logit model yields severely biased longitudinal trajectories of health probabilities as well as overestimated effects of covariates on the probabilities. Copyright © 2012 John Wiley & Sons, Ltd.
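
    The retransformation issue can be made concrete with a small Monte Carlo sketch: marginal category probabilities from a random-intercept multinomial logit are obtained by averaging over the random-effect distribution rather than by plugging in a zero random effect. The coefficients and variance below are illustrative, not estimates from the study.

```python
# Hedged sketch: population-averaged category probabilities from a random-intercept
# multinomial logit require integrating over the random effects, not plugging in zero.
import numpy as np

def softmax(eta):
    e = np.exp(eta - eta.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
eta_fixed = np.array([0.0, 0.8, -0.4])        # linear predictors for 3 health states (ref = 0)
sigma_b = 1.2                                 # SD of a shared random intercept on states 2 and 3

# naive plug-in at b = 0
p_plugin = softmax(eta_fixed)

# Monte Carlo retransformation: average probabilities over the random-effect distribution
b = rng.normal(0.0, sigma_b, size=(100_000, 1))
eta = eta_fixed + np.hstack([np.zeros_like(b), b, b])     # random intercept enters states 2, 3
p_marginal = softmax(eta).mean(axis=0)

print("plug-in  :", np.round(p_plugin, 3))
print("marginal :", np.round(p_marginal, 3))
```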

  16. Effects of recreational soccer in men with prostate cancer undergoing androgen deprivation therapy: study protocol for the ‘FC Prostate’ randomized controlled trial

    PubMed Central

    2013-01-01

    Background Androgen deprivation therapy (ADT) is a cornerstone in the treatment of advanced prostate cancer. Adverse musculoskeletal and cardiovascular effects of ADT are widely reported and investigations into the potential of exercise to ameliorate the effects of treatment are warranted. The ‘Football Club (FC) Prostate’ study is a randomized trial comparing the effects of soccer training with standard treatment approaches on body composition, cardiovascular function, physical function parameters, glucose tolerance, bone health, and patient-reported outcomes in men undergoing ADT for prostate cancer. Methods/Design Using a single-center randomized controlled design, 80 men with histologically confirmed locally advanced or disseminated prostate cancer undergoing ADT for 6 months or more at The Copenhagen University Hospital will be enrolled on this trial. After baseline assessments eligible participants will be randomly assigned to a soccer training group or a control group receiving usual care. The soccer intervention will consist of 12 weeks of training 2–3 times/week for 45–60 min after which the assessment protocol will be repeated. Soccer training will then continue bi-weekly for an additional 20 weeks at the end of which all measures will be repeated to allow for additional analyses of long-term effects. The primary endpoint is changes in lean body mass from baseline to 12 weeks assessed by dual X-ray absorptiometry scan. Secondary endpoints include changes of cardiovascular, metabolic, and physical function parameters, as well as markers of bone metabolism and patient-reported outcomes. Discussion The FC Prostate trial will assess the safety and efficacy of a novel soccer-training approach to cancer rehabilitation on a number of clinically important health outcomes in men with advanced prostate cancer during ADT. The results may pave the way for innovative, community-based interventions in the approach to treating prostate cancer. Trial registration ClinicalTrials.gov: NCT01711892 PMID:24330570

  17. Coordination and Management of Multisite Complementary and Alternative Medicine (CAM) Therapies: Experience from a Multisite Reflexology Intervention Trial

    PubMed Central

    Rahbar, Mohammad H.; Wyatt, Gwen; Sikorskii, Alla; Victorson, David; Ardjomand-Hessabi, Manouchehr

    2011-01-01

    Background Multisite randomized clinical trials allow for increased research collaboration among investigators and expedite data collection efforts. As a result, government funding agencies typically look favorably upon this approach. As the field of complementary and alternative medicine (CAM) continues to evolve, so do increased calls for the use of more rigorous study design and trial methodologies, which can present challenges for investigators. Purpose To describe the processes involved in the coordination and management of a multisite randomized clinical trial of a CAM intervention. Methods Key aspects related to the coordination and management of a multisite CAM randomized clinical trial are presented, including organizational and site selection considerations, recruitment concerns and issues related to data collection and randomization to treatment groups. Management and monitoring of data, as well as quality assurance procedures are described. Finally, a real world perspective is shared from a recently conducted multisite randomized clinical trial of reflexology for women diagnosed with advanced breast cancer. Results The use of multiple sites in the conduct of CAM-based randomized clinical trials can provide an efficient, collaborative and robust approach to study coordination and data collection that maximizes efficiency and ensures the quality of results. Conclusions Multisite randomized clinical trial designs can offer the field of CAM research a more standardized and efficient approach to examine the effectiveness of novel therapies and treatments. Special attention must be given to intervention fidelity, consistent data collection and ensuring data quality. Assessment and reporting of quantitative indicators of data quality should be required. PMID:21664296

  18. Instructional Approaches on Science Performance, Attitude and Inquiry Ability in a Computer-Supported Collaborative Learning Environment

    ERIC Educational Resources Information Center

    Chen, Ching-Huei; Chen, Chia-Ying

    2012-01-01

    This study examined the effects of an inquiry-based learning (IBL) approach compared to that of a problem-based learning (PBL) approach on learner performance, attitude toward science and inquiry ability. Ninety-six students from three 7th-grade classes at a public school were randomly assigned to two experimental groups and one control group. All…

  19. A Two-Step Approach for Analysis of Nonignorable Missing Outcomes in Longitudinal Regression: an Application to Upstate KIDS Study.

    PubMed

    Liu, Danping; Yeung, Edwina H; McLain, Alexander C; Xie, Yunlong; Buck Louis, Germaine M; Sundaram, Rajeshwari

    2017-09-01

    Imperfect follow-up in longitudinal studies commonly leads to missing outcome data that can potentially bias the inference when the missingness is nonignorable; that is, the propensity of missingness depends on missing values in the data. In the Upstate KIDS Study, we seek to determine if the missingness of child development outcomes is nonignorable, and how a simple model assuming ignorable missingness would compare with more complicated models for a nonignorable mechanism. To correct for nonignorable missingness, the shared random effects model (SREM) jointly models the outcome and the missing mechanism. However, the computational complexity and lack of software packages has limited its practical applications. This paper proposes a novel two-step approach to handle nonignorable missing outcomes in generalized linear mixed models. We first analyse the missing mechanism with a generalized linear mixed model and predict values of the random effects; then, the outcome model is fitted adjusting for the predicted random effects to account for heterogeneity in the missingness propensity. Extensive simulation studies suggest that the proposed method is a reliable approximation to SREM, with a much faster computation. The nonignorability of missing data in the Upstate KIDS Study is estimated to be mild to moderate, and the analyses using the two-step approach or SREM are similar to the model assuming ignorable missingness. The two-step approach is a computationally straightforward method that can be conducted as sensitivity analyses in longitudinal studies to examine violations to the ignorable missingness assumption and the implications relative to health outcomes. © 2017 John Wiley & Sons Ltd.
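
    A rough sketch of the two-step logic is given below using statsmodels; for simplicity it uses a linear working mixed model for the missingness indicator (the paper works with generalized linear mixed models), and all variable names and data are hypothetical.

```python
# Hedged two-step sketch with simulated data: (1) mixed model for the missingness
# indicator, extract predicted random intercepts; (2) outcome model on observed rows,
# adjusting for those predicted random effects.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subj, n_visits = 200, 4
df = pd.DataFrame({
    "id": np.repeat(np.arange(n_subj), n_visits),
    "visit": np.tile(np.arange(n_visits), n_subj),
})
u = np.repeat(rng.normal(size=n_subj), n_visits)          # shared latent propensity
df["y"] = 1.0 + 0.5 * df["visit"] + u + rng.normal(size=len(df))
df["miss"] = (rng.uniform(size=len(df)) < 1 / (1 + np.exp(1.0 - u))).astype(float)

# Step 1: mixed model for missingness; collect each subject's predicted random intercept
m1 = smf.mixedlm("miss ~ visit", data=df, groups=df["id"]).fit()
re = pd.Series({k: v.iloc[0] for k, v in m1.random_effects.items()}, name="re_miss")
df = df.merge(re.to_frame(), left_on="id", right_index=True)

# Step 2: outcome model on observed rows, adjusting for the predicted random effect
obs = df[df["miss"] == 0]
m2 = smf.mixedlm("y ~ visit + re_miss", data=obs, groups=obs["id"]).fit()
print(m2.summary())
```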

  20. Unsupervised Metric Fusion Over Multiview Data by Graph Random Walk-Based Cross-View Diffusion.

    PubMed

    Wang, Yang; Zhang, Wenjie; Wu, Lin; Lin, Xuemin; Zhao, Xiang

    2017-01-01

    Learning an ideal metric is crucial to many tasks in computer vision. Diverse feature representations can address this problem from different aspects: visual data objects described by multiple features can be decomposed into multiple views, which often provide complementary information. In this paper, we propose a cross-view fusion algorithm that produces a similarity metric for multiview data by systematically fusing multiple similarity measures. Unlike existing paradigms, we focus on learning a distance measure by exploiting the graph structure of data samples, where an input similarity matrix can be improved through propagation by a graph random walk. In particular, we construct multiple graphs, each corresponding to an individual view, and present a cross-view fusion approach based on graph random walk to derive an optimal distance measure by fusing multiple metrics. Our method is scalable to large amounts of data by enforcing sparsity through an anchor graph representation. To adaptively control the effects of different views, we dynamically learn view-specific coefficients, which are incorporated into the graph random walk to balance the views. However, such a strategy may lead to an over-smooth similarity metric in which affinities between dissimilar samples are enlarged by excessive cross-view fusion; we therefore use a heuristic approach to controlling the number of iterations in the fusion process to avoid over-smoothing. Extensive experiments conducted on real-world data sets validate the effectiveness and efficiency of our approach.

  1. Empirical likelihood inference in randomized clinical trials.

    PubMed

    Zhang, Biao

    2017-01-01

    In individually randomized controlled trials, in addition to the primary outcome, information is often available on a number of covariates prior to randomization. This information is frequently utilized to adjust for baseline characteristics in order to increase the precision of estimated average treatment effects; such adjustment is usually performed via covariate adjustment in outcome regression models. Although covariate adjustment is widely seen as desirable for making treatment effect estimates more precise and the corresponding hypothesis tests more powerful, there are considerable concerns that objective inference in randomized clinical trials can potentially be compromised. In this paper, we study an empirical likelihood approach to covariate adjustment and propose two unbiased estimating functions that automatically decouple evaluation of average treatment effects from regression modeling of covariate-outcome relationships. The resulting empirical likelihood estimator of the average treatment effect is as efficient as the existing efficient adjusted estimators(1) when separate treatment-specific working regression models are correctly specified, yet is at least as efficient as those estimators for any given treatment-specific working regression models, whether or not they coincide with the true treatment-specific covariate-outcome relationships. We present a simulation study comparing the finite sample performance of various methods, along with results from the analysis of a data set from an HIV clinical trial. The simulation results indicate that the proposed empirical likelihood approach is more efficient and powerful than its competitors when the working covariate-outcome relationships by treatment status are misspecified.

  2. Random Testing and Model Checking: Building a Common Framework for Nondeterministic Exploration

    NASA Technical Reports Server (NTRS)

    Groce, Alex; Joshi, Rajeev

    2008-01-01

    Two popular forms of dynamic analysis, random testing and explicit-state software model checking, are perhaps best viewed as search strategies for exploring the state spaces introduced by nondeterminism in program inputs. We present an approach that enables this nondeterminism to be expressed in the SPIN model checker's PROMELA language, and then lets users generate either model checkers or random testers from a single harness for a tested C program. Our approach makes it easy to compare model checking and random testing for models with precisely the same input ranges and probabilities and allows us to mix random testing with model checking's exhaustive exploration of non-determinism. The PROMELA language, as intended in its design, serves as a convenient notation for expressing nondeterminism and mixing random choices with nondeterministic choices. We present and discuss a comparison of random testing and model checking. The results derive from using our framework to test a C program with an effectively infinite state space, a module in JPL's next Mars rover mission. More generally, we show how the ability of the SPIN model checker to call C code can be used to extend SPIN's features, and hope to inspire others to use the same methods to implement dynamic analyses that can make use of efficient state storage, matching, and backtracking.
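
    As a toy illustration of the trade-off discussed above (not the SPIN/PROMELA harness from the paper), the sketch below checks a deliberately buggy function both by exhaustive enumeration of a small input space and by random testing over the same ranges.

```python
# Hedged toy sketch: exhaustive exploration vs. random testing of the same input space.
import itertools
import random

def buggy_max3(a: int, b: int, c: int) -> int:
    # seeded bug: wrong answer in a narrow corner of the input space
    if a == b and b > c:
        return c
    return max(a, b, c)

domain = range(-10, 11)

# exhaustive, model-checking-style exploration of all inputs
exhaustive_failures = sum(buggy_max3(a, b, c) != max(a, b, c)
                          for a, b, c in itertools.product(domain, repeat=3))

# random testing with the same input ranges and a fixed budget
random.seed(0)
budget, random_failures = 2000, 0
for _ in range(budget):
    a, b, c = (random.choice(domain) for _ in range(3))
    random_failures += buggy_max3(a, b, c) != max(a, b, c)

print(f"exhaustive: {exhaustive_failures} failing inputs out of {21 ** 3}")
print(f"random    : {random_failures} failures found in {budget} trials")
```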

  3. Quantum Mechanical Enhancement of the Random Dopant Induced Threshold Voltage Fluctuations and Lowering in Sub 0.1 Micron MOSFETs

    NASA Technical Reports Server (NTRS)

    Asenov, Asen; Slavcheva, G.; Brown, A. R.; Davies, J. H.; Saini, Subhash

    1999-01-01

    A detailed study of the influence of quantum effects in the inversion layer on the random dopant induced threshold voltage fluctuations and lowering in sub 0.1 micron MOSFETs has been performed. This has been achieved using a full 3D implementation of the density gradient (DG) formalism incorporated in our previously published 3D 'atomistic' simulation approach. This results in a consistent, fully 3D, quantum mechanical picture which implies not only the vertical inversion layer quantisation but also the lateral confinement effects manifested by current filamentation in the 'valleys' of the random potential fluctuations. We have shown that the net result of including quantum mechanical effects, while considering statistical fluctuations, is an increase in both threshold voltage fluctuations and lowering.

  4. Cost effectiveness of the occupation-based approach for subacute stroke patients: result of a randomized controlled trial.

    PubMed

    Nagayama, Hirofumi; Tomori, Kounosuke; Ohno, Kanta; Takahashi, Kayoko; Nagatani, Ryutaro; Izumi, Ryota; Moriwaki, Kensuke; Yamauchi, Keita

    2017-07-01

    The cost effectiveness of occupational therapy for subacute stroke patients is unclear in the extant literature. Consequently, this study determined the cost effectiveness of the occupation-based approach using Aid for Decision-Making in Occupation Choice (ADOC) for subacute stroke patients compared with an impairment-based approach. We conducted an economic evaluation from a societal perspective alongside a pilot randomized controlled trial with a blinded assessor, for participants in 10 subacute rehabilitation units in Japan. The intervention group received occupation-based goal setting using ADOC, with interventions focused on meaningful occupations. The control group received an impairment-based approach focused on restoring capacities. For both groups, occupational-therapy intervention was administered more than five times per week, for over 40 min each time, and participants received physical and speech therapy prior to discharge. The main outcomes were quality-adjusted life years (QALYs) and total costs. Further, sensitivity analyses were performed to examine the influence of parameter uncertainty on the base case results. The final number of participants was 24 in each of the two groups. QALYs were significantly higher in the intervention group than in the control group (p = 0.001, 95% CI of the difference: 0.002-0.008), whereas the difference in total costs was not statistically significant. Applying a willingness-to-pay threshold of JPY 5 million/QALY, the probability of the occupation-based approach using ADOC being cost effective was estimated to be 65.3%. The results show that the occupation-based approach is associated with significantly improved QALYs and has potential cost effectiveness compared with the impairment-based approach.
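
    For readers unfamiliar with how a "probability of being cost effective" is obtained, the sketch below bootstraps incremental QALYs and costs from simulated per-arm data and counts how often the net monetary benefit is positive at the willingness-to-pay threshold. The numbers are invented and do not reproduce the trial's result.

```python
# Hedged sketch: bootstrap estimate of P(cost-effective) at a willingness-to-pay
# threshold, using simulated per-arm QALYs and costs.
import numpy as np

rng = np.random.default_rng(0)
n = 24                                                                  # participants per arm
q_int, q_ctl = rng.normal(0.105, 0.01, n), rng.normal(0.100, 0.01, n)   # QALYs
c_int, c_ctl = rng.normal(1.50e6, 4e5, n), rng.normal(1.45e6, 4e5, n)   # costs (JPY)
wtp = 5_000_000                                                         # JPY per QALY

nmb_positive, n_boot = 0, 5000
for _ in range(n_boot):
    bi, bc = rng.integers(0, n, n), rng.integers(0, n, n)               # bootstrap resamples
    d_qaly = q_int[bi].mean() - q_ctl[bc].mean()
    d_cost = c_int[bi].mean() - c_ctl[bc].mean()
    nmb_positive += (wtp * d_qaly - d_cost) > 0

print(f"P(cost-effective at {wtp:,} JPY/QALY) = {nmb_positive / n_boot:.2f}")
```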

  5. Impact of Thematic Approach on Communication Skill in Preschool

    ERIC Educational Resources Information Center

    Ashokan, Varun; Venugopal, Kalpana

    2016-01-01

    The study investigated the effects of thematic approach on communication skills for preschool children. The study was a quasi experimental non-equivalent pretest-post-test control group design whereby 5-6 year old preschool children (n = 49) were randomly assigned to an experimental and a control group. The experimental group students were exposed…

  6. Experimental Effects of Program Management Approach on Teachers' Professional Ties and Social Capital

    ERIC Educational Resources Information Center

    Quinn, David M.; Kim, James S.

    2018-01-01

    Theory and empirical work suggest that teachers' social capital influences school improvement efforts. Social ties are prerequisite for social capital, yet little causal evidence exists on how malleable factors, such as instructional management approaches, affect teachers' ties. In this cluster-randomized trial, we apply a decision-making…

  7. Learning Historical Thinking with Oral History Interviews: A Cluster Randomized Controlled Intervention Study of Oral History Interviews in History Lessons

    ERIC Educational Resources Information Center

    Bertram, Christiane; Wagner, Wolfgang; Trautwein, Ulrich

    2017-01-01

    The present study examined the effectiveness of the oral history approach with respect to students' historical competence. A total of 35 ninth-grade classes (N = 900) in Germany were randomly assigned to one of four conditions--live, video, text, or a (nontreated) control group--in a pretest, posttest, and follow-up design. Comparing the three…

  8. Influence of General Self-Efficacy on the Effects of a School-Based Universal Primary Prevention Program of Depressive Symptoms in Adolescents: A Randomized and Controlled Follow-up Study

    ERIC Educational Resources Information Center

    Possel, Patrick; Baldus, Christiane; Horn, Andrea B.; Groen, Gunter; Hautzinger, Martin

    2005-01-01

    Background: Depressive disorders in adolescents are a widespread and increasing problem. Prevention seems a promising and feasible approach. Methods: We designed a cognitive-behavioral school-based universal primary prevention program and followed 347 eighth-grade students participating in a randomized controlled trial for three months. Results:…

  9. Measuring the Plasticity of Social Approach: A Randomized Controlled Trial of the Effects of the PEERS Intervention on EEG Asymmetry in Adolescents with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Van Hecke, Amy Vaughan; Stevens, Sheryl; Carson, Audrey M.; Karst, Jeffrey S.; Dolan, Bridget; Schohl, Kirsten; McKindles, Ryan J.; Remmel, Rheanna; Brockman, Scott

    2015-01-01

    This study examined whether the Program for the Education and Enrichment of Relational Skills ("PEERS: Social skills for teenagers with developmental and autism spectrum disorders: The PEERS treatment manual," Routledge, New York, 2010a) affected neural function, via EEG asymmetry, in a randomized controlled trial of adolescents with…

  10. Meta-analysis of diagnostic test data: a bivariate Bayesian modeling approach.

    PubMed

    Verde, Pablo E

    2010-12-30

    In the last decades, the amount of published results on clinical diagnostic tests has expanded very rapidly. The counterpart to this development has been the formal evaluation and synthesis of diagnostic results. However, published results present substantial heterogeneity, and they can be regarded as so far removed from the classical domain of meta-analysis that they provide a rather severe test of classical statistical methods. Recently, bivariate random effects meta-analytic methods, which model the pairs of sensitivities and specificities, have been presented from the classical point of view. In this work a bivariate Bayesian modeling approach is presented. This approach substantially extends the scope of classical bivariate methods by allowing the structural distribution of the random effects to depend on multiple sources of variability. Meta-analysis is summarized by the predictive posterior distributions for sensitivity and specificity. This new approach also allows substantial model checking, model diagnostics, and model selection. Statistical computations are implemented in public domain statistical software (WinBUGS and R) and illustrated with real data examples. Copyright © 2010 John Wiley & Sons, Ltd.
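
    A minimal sketch of the bivariate random-effects structure described above, written in Python/NumPy rather than WinBUGS and R; it only simulates study-level sensitivities and specificities from a bivariate normal on the logit scale (all hyperparameter values are invented) and omits the full Bayesian posterior computation:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical hyperparameters of the bivariate random-effects distribution (logit scale)
    mu = np.array([1.5, 2.0])      # mean logit-sensitivity, mean logit-specificity
    sd = np.array([0.4, 0.5])
    rho = -0.3                     # negative correlation, as often induced by threshold effects
    cov = np.array([[sd[0]**2, rho*sd[0]*sd[1]],
                    [rho*sd[0]*sd[1], sd[1]**2]])

    def expit(x):
        return 1.0 / (1.0 + np.exp(-x))

    # Draw study-specific (sensitivity, specificity) pairs, then binomial counts per study
    n_studies, n_diseased, n_healthy = 15, 60, 80
    logits = rng.multivariate_normal(mu, cov, size=n_studies)
    sens, spec = expit(logits[:, 0]), expit(logits[:, 1])
    tp = rng.binomial(n_diseased, sens)   # true positives in each study
    tn = rng.binomial(n_healthy, spec)    # true negatives in each study

    # Crude summaries; a full Bayesian analysis would place priors on mu, sd and rho
    print("observed sensitivities:", np.round(tp / n_diseased, 2))
    print("observed specificities:", np.round(tn / n_healthy, 2))
    ```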

  11. A motivation-focused weight loss maintenance program is an effective alternative to a skill-based approach.

    PubMed

    West, D S; Gorin, A A; Subak, L L; Foster, G; Bragg, C; Hecht, J; Schembri, M; Wing, R R

    2011-02-01

    Maintaining weight loss is a major challenge in obesity treatment. Individuals often indicate that waning motivation prompts cessation of effective weight management behaviors. Therefore, a novel weight loss maintenance program that specifically targets motivational factors was evaluated. Overweight women (N=338; 19% African American) with urinary incontinence were randomized to lifestyle obesity treatment or control and followed for 18 months. All participants in lifestyle (N=226) received the same initial 6-month group behavioral obesity treatment and were then randomized to (1) a novel motivation-focused maintenance program (N=113) or (2) a standard skill-based maintenance approach (N=113). Weight was assessed at baseline, 6, and 18 months. Both treatment groups (motivation-focused and skill-based) achieved comparable 18-month weight losses (-5.48% for motivation-focused vs -5.55% in skill-based, P=0.98), and both groups lost significantly more than controls (-1.51%; P=0.0012 in motivation-focused and P=0.0021 in skill-based). A motivation-focused maintenance program offers an alternative, effective approach to weight maintenance, expanding available evidence-based interventions beyond traditional skill-based programs.

  12. Suppression of the noise-induced effects in an electrostatic micro-plate using an adaptive back-stepping sliding mode control.

    PubMed

    Nwagoum Tuwa, Peguy Roussel; Woafo, P

    2018-01-01

    In this work, an adaptive backstepping sliding mode control approach is applied through the piezoelectric layer in order to control and stabilize an electrostatic micro-plate. A mathematical model of the system is derived, taking into account the small fluctuations in the gap, which are treated as bounded noise. The accuracy of the proposed modal equation is proven using the method of lines. Using both approaches, the effects of noise are examined: they are found to lead to pull-in instability as well as to random chaos. A suitable backstepping approach to improve the tracking performance is integrated into the adaptive sliding mode control in order to eliminate chattering phenomena and reinforce the robustness of the system in the presence of uncertainties and external random disturbances. It is proved that all the variables of the closed-loop system are bounded and that the system can follow the given reference signals as closely as possible. Numerical simulations are provided to show the effectiveness of the proposed controller. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  13. Comparing three knowledge communication strategies - Diffusion, Dissemination and Translation - through randomized controlled studies.

    PubMed

    Lane, Joseph P; Stone, Vathsala I

    2015-01-01

    This paper describes a series of three randomized controlled case studies comparing the effectiveness of three strategies for communicating new research-based knowledge (Diffusion, Dissemination, Translation) to different Assistive Technology (AT) stakeholder groups. Pre- and post-intervention measures of the level of knowledge use (unaware, aware, interested, using), obtained via the LOKUS instrument, assessed the relative effectiveness of the three strategies. The latter two approaches were both more effective than diffusion, and they were equally effective relative to each other. The results question the value added by tailoring research findings to specific audiences and instead support the critical yet neglected role of relevance in determining knowledge use by stakeholders.

  14. A Random Finite Set Approach to Space Junk Tracking and Identification

    DTIC Science & Technology

    2014-09-03

    [Report documentation page residue; recoverable details only.] Final report covering 31 Jan 2013 to 29 Apr 2014; contract number FA2386-13... (truncated); title: A Random Finite Set Approach to Space Junk Tracking and Identification; authors: Ba-Ngu Vo, Ba-Tuong Vo.

  15. An approximate generalized linear model with random effects for informative missing data.

    PubMed

    Follmann, D; Wu, M

    1995-03-01

    This paper develops a class of models to deal with missing data from longitudinal studies. We assume that separate models for the primary response and missingness (e.g., number of missed visits) are linked by a common random parameter. Such models have been developed in the econometrics (Heckman, 1979, Econometrica 47, 153-161) and biostatistics (Wu and Carroll, 1988, Biometrics 44, 175-188) literature for a Gaussian primary response. We allow the primary response, conditional on the random parameter, to follow a generalized linear model and approximate the generalized linear model by conditioning on the data that describes missingness. The resultant approximation is a mixed generalized linear model with possibly heterogeneous random effects. An example is given to illustrate the approximate approach, and simulations are performed to critique the adequacy of the approximation for repeated binary data.

  16. A one-dimensional stochastic approach to the study of cyclic voltammetry with adsorption effects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samin, Adib J.

    In this study, a one-dimensional stochastic model based on the random walk approach is used to simulate cyclic voltammetry. The model takes into account mass transport, kinetics of the redox reactions, adsorption effects and changes in the morphology of the electrode. The model is shown to display the expected behavior. Furthermore, the model shows consistent qualitative agreement with a finite difference solution. This approach allows for an understanding of phenomena on a microscopic level and may be useful for analyzing qualitative features observed in experimentally recorded signals.
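
    As an illustration of the general idea (not the author's model), a toy one-dimensional random walk with a potential-dependent reaction probability at the electrode can be sketched in a few lines of Python; adsorption and electrode morphology changes are omitted and all parameters are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy 1-D random walk: particles diffuse on a lattice; the electrode sits at x = 0 and
    # consumes arriving particles with a potential-dependent probability (triangular sweep).
    n_particles, n_cells, n_steps = 2000, 100, 400
    x = rng.integers(1, n_cells, size=n_particles)      # initial positions in solution
    alive = np.ones(n_particles, dtype=bool)
    current = np.zeros(n_steps)

    def reaction_probability(step, period=200):
        """Crude stand-in for a triangular potential sweep driving the electrode kinetics."""
        phase = (step % period) / period
        sweep = 2 * phase if phase < 0.5 else 2 * (1 - phase)
        return 0.05 + 0.9 * sweep

    for t in range(n_steps):
        steps = rng.choice([-1, 1], size=n_particles)    # unbiased +/-1 moves
        x = np.where(alive, np.clip(x + steps, 0, n_cells - 1), x)
        at_electrode = alive & (x == 0)
        reacts = at_electrode & (rng.random(n_particles) < reaction_probability(t))
        current[t] = reacts.sum()                        # reacted particles per step ~ current
        alive &= ~reacts

    print("peak current (arbitrary units):", current.max())
    ```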

  17. A one-dimensional stochastic approach to the study of cyclic voltammetry with adsorption effects

    NASA Astrophysics Data System (ADS)

    Samin, Adib J.

    2016-05-01

    In this study, a one-dimensional stochastic model based on the random walk approach is used to simulate cyclic voltammetry. The model takes into account mass transport, kinetics of the redox reactions, adsorption effects and changes in the morphology of the electrode. The model is shown to display the expected behavior. Furthermore, the model shows consistent qualitative agreement with a finite difference solution. This approach allows for an understanding of phenomena on a microscopic level and may be useful for analyzing qualitative features observed in experimentally recorded signals.

  18. A random matrix approach to credit risk.

    PubMed

    Münnix, Michael C; Schäfer, Rudi; Guhr, Thomas

    2014-01-01

    We estimate generic statistical properties of a structural credit risk model by considering an ensemble of correlation matrices. This ensemble is set up by Random Matrix Theory. We demonstrate analytically that the presence of correlations severely limits the effect of diversification in a credit portfolio if the correlations are not identically zero. The existence of correlations alters the tails of the loss distribution considerably, even if their average is zero. Under the assumption of randomly fluctuating correlations, a lower bound for the estimation of the loss distribution is provided.
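
    A rough Python sketch of the kind of ensemble computation described, using a Wishart-like construction for the random correlation matrices and a simple threshold-crossing default rule; the obligor count, default threshold, and ensemble size are invented, and the sketch is not the authors' structural model:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    K, N = 50, 5000          # K obligors, N Monte Carlo scenarios per correlation matrix

    def random_correlation(k, n_factors=60):
        """Draw a random correlation matrix from a Wishart-like construction."""
        W = rng.standard_normal((k, n_factors))
        C = W @ W.T
        d = np.sqrt(np.diag(C))
        return C / np.outer(d, d)

    losses = []
    for _ in range(200):                                  # ensemble of correlation matrices
        C = random_correlation(K)
        L = np.linalg.cholesky(C + 1e-10 * np.eye(K))
        values = L @ rng.standard_normal((K, N))          # correlated asset values
        defaults = values < -2.0                          # default when value falls below threshold
        losses.append(defaults.mean(axis=0))              # fraction of portfolio lost per scenario
    losses = np.concatenate(losses)

    print("mean loss fraction:", losses.mean())
    print("99.9% quantile of the loss distribution:", np.quantile(losses, 0.999))
    ```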

  19. A Random Matrix Approach to Credit Risk

    PubMed Central

    Guhr, Thomas

    2014-01-01

    We estimate generic statistical properties of a structural credit risk model by considering an ensemble of correlation matrices. This ensemble is set up by Random Matrix Theory. We demonstrate analytically that the presence of correlations severely limits the effect of diversification in a credit portfolio if the correlations are not identically zero. The existence of correlations alters the tails of the loss distribution considerably, even if their average is zero. Under the assumption of randomly fluctuating correlations, a lower bound for the estimation of the loss distribution is provided. PMID:24853864

  20. On Acoustic Source Specification for Rotor-Stator Interaction Noise Prediction

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Envia, Edmane; Burley, Caesy L.

    2010-01-01

    This paper describes the use of measured source data to assess the effects of acoustic source specification on rotor-stator interaction noise predictions. Specifically, the acoustic propagation and radiation portions of a recently developed coupled computational approach are used to predict tonal rotor-stator interaction noise from a benchmark configuration. In addition to the use of full measured data, randomization of source mode relative phases is also considered for specification of the acoustic source within the computational approach. Comparisons with sideline noise measurements are performed to investigate the effects of various source descriptions on both inlet and exhaust predictions. The inclusion of additional modal source content is shown to have a much greater influence on the inlet results. Reasonable agreement between predicted and measured levels is achieved for the inlet, as well as the exhaust when shear layer effects are taken into account. For the number of trials considered, phase randomized predictions follow statistical distributions similar to those found in previous statistical source investigations. The shape of the predicted directivity pattern relative to measurements also improved with phase randomization, having predicted levels generally within one standard deviation of the measured levels.

  1. Sensitivity of Above-Ground Biomass Estimates to Height-Diameter Modelling in Mixed-Species West African Woodlands

    PubMed Central

    Aynekulu, Ermias; Pitkänen, Sari; Packalen, Petteri

    2016-01-01

    It has been suggested that above-ground biomass (AGB) inventories should include tree height (H), in addition to diameter (D). As H is a difficult variable to measure, H-D models are commonly used to predict H. We tested a number of approaches for H-D modelling, including additive terms which increased the complexity of the model, and observed how differences in tree-level predictions of H propagated to plot-level AGB estimations. We were especially interested in detecting whether the choice of method can lead to bias. The compared approaches listed in the order of increasing complexity were: (B0) AGB estimations from D-only; (B1) involving also H obtained from a fixed-effects H-D model; (B2) involving also species; (B3) including also between-plot variability as random effects; and (B4) involving multilevel nested random effects for grouping plots in clusters. In light of the results, the modelling approach affected the AGB estimation significantly in some cases, although differences were negligible for some of the alternatives. The most important differences were found between including H or not in the AGB estimation. We observed that AGB predictions without H information were very sensitive to the environmental stress parameter (E), which can induce a critical bias. Regarding the H-D modelling, the most relevant effect was found when species was included as an additive term. We presented a two-step methodology, which succeeded in identifying the species for which the general H-D relation was relevant to modify. Based on the results, our final choice was the single-level mixed-effects model (B3), which accounts for the species but also for the plot random effects reflecting site-specific factors such as soil properties and degree of disturbance. PMID:27367857
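
    The contrast between a purely fixed-effects H-D model (approach B1) and a model with plot-level random intercepts (in the spirit of B3) can be sketched on simulated data, assuming pandas and statsmodels are available; the log-log model form and all parameter values below are invented for illustration:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)

    # Simulated tree list: a log-log H-D relation with plot-level random intercepts
    n_plots, trees_per_plot = 20, 30
    plot = np.repeat(np.arange(n_plots), trees_per_plot)
    plot_effect = rng.normal(0, 0.10, n_plots)[plot]        # site-specific deviations
    D = rng.uniform(5, 60, n_plots * trees_per_plot)         # diameter (cm)
    logH = 0.5 + 0.55 * np.log(D) + plot_effect + rng.normal(0, 0.08, D.size)
    data = pd.DataFrame({"logH": logH, "logD": np.log(D), "plot": plot})

    # (B1)-style fixed-effects H-D model
    fixed = smf.ols("logH ~ logD", data).fit()

    # (B3)-style model with plot-level random intercepts
    mixed = smf.mixedlm("logH ~ logD", data, groups=data["plot"]).fit()

    print(fixed.params)
    print(mixed.params)   # fixed part plus the estimated plot-level variance component
    ```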

  2. A Meta-Analysis of Red Yeast Rice: An Effective and Relatively Safe Alternative Approach for Dyslipidemia

    PubMed Central

    Li, Yinhua; Jiang, Long; Jia, Zhangrong; Xin, Wei; Yang, Shiwei; Yang, Qiu; Wang, Luya

    2014-01-01

    Objective To explore whether red yeast rice is a safe and effective alternative approach for dyslipidemia. Methods Pubmed, the Cochrane Library, EBSCO host, Chinese VIP Information (VIP), China National Knowledge Infrastructure (CNKI), Wanfang Databases were searched for appropriate articles. Randomized trials of RYR (not including Xuezhikang and Zhibituo) and placebo as control in patients with dyslipidemia were considered. Two authors read all papers and independently extracted all relevant information. The primary outcomes were serum total cholesterol (TC), low-density lipoprotein cholesterol (LDL-C), triglyceride (TG), and high-density lipoprotein cholesterol (HDL-C). The secondary outcomes were increased levels of alanine transaminase, aspartate aminotransferase, creatine kinase, creatinine and fasting blood glucose. Results A total of 13 randomized, placebo-controlled trials containing 804 participants were analyzed. Red yeast rice exhibited significant lowering effects on serum TC [WMD = −0.97 (95% CI: −1.13, −0.80) mmol/L, P<0.001], TG [WMD = −0.23 (95% CI: −0.31, −0.14) mmol/L, P<0.001], and LDL-C [WMD = −0.87 (95% CI: −1.03, −0.71) mmol/L, P<0.001] but no significant increasing effect on HDL-C [WMD = 0.08 (95% CI: −0.02, 0.19) mmol/L, P = 0.11] compared with placebo. No serious side effects were reported in all trials. Conclusions The meta-analysis suggests that red yeast rice is an effective and relatively safe approach for dyslipidemia. However, further long-term, rigorously designed randomized controlled trials are still warranted before red yeast rice could be recommended to patients with dyslipidemia, especially as an alternative to statins. PMID:24897342
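
    For readers unfamiliar with how such pooled weighted mean differences (WMDs) are obtained, a standard DerSimonian-Laird random-effects pooling can be sketched in Python; the per-study effect sizes and variances below are invented and do not reproduce the trials in this meta-analysis:

    ```python
    import numpy as np

    # Hypothetical per-study mean differences in TC (mmol/L) and their variances
    y = np.array([-0.9, -1.1, -0.8, -1.0, -0.7])
    v = np.array([0.010, 0.020, 0.015, 0.012, 0.025])

    # DerSimonian-Laird random-effects pooling
    w_fixed = 1.0 / v
    y_fixed = np.sum(w_fixed * y) / np.sum(w_fixed)
    Q = np.sum(w_fixed * (y - y_fixed) ** 2)
    k = len(y)
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)))

    w_re = 1.0 / (v + tau2)                      # random-effects weights
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    print(f"pooled WMD = {y_re:.2f}, 95% CI = ({y_re - 1.96*se_re:.2f}, {y_re + 1.96*se_re:.2f})")
    ```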

  3. Learning accurate and interpretable models based on regularized random forests regression

    PubMed Central

    2014-01-01

    Background Many biology related research works combine data from multiple sources in an effort to understand the underlying problems. It is important to find and interpret the most important information from these sources. Thus it will be beneficial to have an effective algorithm that can simultaneously extract decision rules and select critical features for good interpretation while preserving the prediction performance. Methods In this study, we focus on regression problems for biological data where target outcomes are continuous. In general, models constructed from linear regression approaches are relatively easy to interpret. However, many practical biological applications are nonlinear in essence where we can hardly find a direct linear relationship between input and output. Nonlinear regression techniques can reveal nonlinear relationship of data, but are generally hard for human to interpret. We propose a rule based regression algorithm that uses 1-norm regularized random forests. The proposed approach simultaneously extracts a small number of rules from generated random forests and eliminates unimportant features. Results We tested the approach on some biological data sets. The proposed approach is able to construct a significantly smaller set of regression rules using a subset of attributes while achieving prediction performance comparable to that of random forests regression. Conclusion It demonstrates high potential in aiding prediction and interpretation of nonlinear relationships of the subject being studied. PMID:25350120

  4. Interviewer effects on non-response propensity in longitudinal surveys: a multilevel modelling approach

    PubMed Central

    Vassallo, Rebecca; Durrant, Gabriele B; Smith, Peter W F; Goldstein, Harvey

    2015-01-01

    The paper investigates two different multilevel approaches, the multilevel cross-classified and the multiple-membership models, for the analysis of interviewer effects on wave non-response in longitudinal surveys. The models proposed incorporate both interviewer and area effects to account for the non-hierarchical structure, the influence of potentially more than one interviewer across waves and possible confounding of area and interviewer effects arising from the non-random allocation of interviewers across areas. The methods are compared by using a data set: the UK Family and Children Survey. PMID:25598587

  5. The Effect of Educational Modules Strategy on the Direct and Postponed Study's Achievement of Seventh Primary Grade Students in Science, in Comparison with the Conventional Approach

    ERIC Educational Resources Information Center

    Alelaimat, Abeer Rashed; Ghoneem, Khowla Abd Al Raheem

    2012-01-01

    This study aimed at revealing the effect of educational modules strategy on the direct and postponed study's achievement of seventh primary grade students in science, in comparison with the conventional approach. The sample of the study consists of (174) male and female students randomly chosen from schools in the city of Mafraq, students are…

  6. Sample size calculations for stepped wedge and cluster randomised trials: a unified approach

    PubMed Central

    Hemming, Karla; Taljaard, Monica

    2016-01-01

    Objectives To clarify and illustrate sample size calculations for the cross-sectional stepped wedge cluster randomized trial (SW-CRT) and to present a simple approach for comparing the efficiencies of competing designs within a unified framework. Study Design and Setting We summarize design effects for the SW-CRT, the parallel cluster randomized trial (CRT), and the parallel cluster randomized trial with before and after observations (CRT-BA), assuming cross-sectional samples are selected over time. We present new formulas that enable trialists to determine the required cluster size for a given number of clusters. We illustrate by example how to implement the presented design effects and give practical guidance on the design of stepped wedge studies. Results For a fixed total cluster size, the choice of study design that provides the greatest power depends on the intracluster correlation coefficient (ICC) and the cluster size. When the ICC is small, the CRT tends to be more efficient; when the ICC is large, the SW-CRT tends to be more efficient and can serve as an alternative design when the CRT is an infeasible design. Conclusion Our unified approach allows trialists to easily compare the efficiencies of three competing designs to inform the decision about the most efficient design in a given scenario. PMID:26344808
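
    A small Python sketch of the clustering adjustment that underlies such comparisons: the classical design effect for a parallel CRT, 1 + (m - 1) x ICC, applied to an individually randomized sample size. The stepped wedge design effect presented in the paper is more involved and is not reproduced here; the numbers are illustrative only:

    ```python
    # Standard design effect for a parallel cluster randomized trial (CRT):
    # DE = 1 + (m - 1) * ICC, where m is the cluster size.
    def crt_design_effect(m, icc):
        return 1.0 + (m - 1) * icc

    def inflated_n(n_individual, m, icc):
        """Individually randomized sample size inflated for clustering."""
        return n_individual * crt_design_effect(m, icc)

    n_ind = 400          # hypothetical sample size for an individually randomized trial
    for icc in (0.01, 0.05, 0.10):
        print(f"ICC = {icc:.2f}: total n required = {round(inflated_n(n_ind, m=50, icc=icc))}")
    ```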

  7. Random Versus Nonrandom Peer Review: A Case for More Meaningful Peer Review.

    PubMed

    Itri, Jason N; Donithan, Adam; Patel, Sohil H

    2018-05-10

    Random peer review programs are not optimized to discover cases with diagnostic error and thus have inherent limitations with respect to educational and quality improvement value. Nonrandom peer review offers an alternative approach in which diagnostic error cases are targeted for collection during routine clinical practice. The objective of this study was to compare error cases identified through random and nonrandom peer review approaches at an academic center. During the 1-year study period, the number of discrepancy cases and score of discrepancy were determined from each approach. The nonrandom peer review process collected 190 cases, of which 60 were scored as 2 (minor discrepancy), 94 as 3 (significant discrepancy), and 36 as 4 (major discrepancy). In the random peer review process, 1,690 cases were reviewed, of which 1,646 were scored as 1 (no discrepancy), 44 were scored as 2 (minor discrepancy), and none were scored as 3 or 4. Several teaching lessons and quality improvement measures were developed as a result of analysis of error cases collected through the nonrandom peer review process. Our experience supports the implementation of nonrandom peer review as a replacement to random peer review, with nonrandom peer review serving as a more effective method for collecting diagnostic error cases with educational and quality improvement value. Copyright © 2018 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  8. Network meta-analysis of disconnected networks: How dangerous are random baseline treatment effects?

    PubMed

    Béliveau, Audrey; Goring, Sarah; Platt, Robert W; Gustafson, Paul

    2017-12-01

    In network meta-analysis, the use of fixed baseline treatment effects (a priori independent) in a contrast-based approach is regularly preferred to the use of random baseline treatment effects (a priori dependent). That is because, often, there is not a need to model baseline treatment effects, which carry the risk of model misspecification. However, in disconnected networks, fixed baseline treatment effects do not work (unless extra assumptions are made), as there is not enough information in the data to update the prior distribution on the contrasts between disconnected treatments. In this paper, we investigate to what extent the use of random baseline treatment effects is dangerous in disconnected networks. We take 2 publicly available datasets of connected networks and disconnect them in multiple ways. We then compare the results of treatment comparisons obtained from a Bayesian contrast-based analysis of each disconnected network using random normally distributed and exchangeable baseline treatment effects to those obtained from a Bayesian contrast-based analysis of their initial connected network using fixed baseline treatment effects. For the 2 datasets considered, we found that the use of random baseline treatment effects in disconnected networks was appropriate. Because those datasets were not cherry-picked, there should be other disconnected networks that would benefit from being analyzed using random baseline treatment effects. However, there is also a risk for the normality and exchangeability assumption to be inappropriate in other datasets even though we have not observed this situation in our case study. We provide code, so other datasets can be investigated. Copyright © 2017 John Wiley & Sons, Ltd.

  9. Comparative effectiveness of congregation- versus clinic-based approach to prevention of mother-to-child HIV transmission: study protocol for a cluster randomized controlled trial

    PubMed Central

    2013-01-01

    Background A total of 22 priority countries have been identified by the WHO that account for 90% of pregnant women living with HIV. Nigeria is one of only 4 countries among the 22 with an HIV testing rate for pregnant women of less than 20%. Currently, most pregnant women must access a healthcare facility (HF) to be screened and receive available prevention of mother-to-child HIV transmission (PMTCT) interventions. Finding new approaches to increase HIV testing among pregnant women is necessary to realize the WHO/ President's Emergency Plan for AIDS Relief (PEPFAR) goal of eliminating new pediatric infections by 2015. Methods This cluster randomized trial tests the comparative effectiveness of a congregation-based Healthy Beginning Initiative (HBI) versus a clinic-based approach on the rates of HIV testing and PMTCT completion among a cohort of church attending pregnant women. Recruitment occurs at the level of the churches and participants (in that order), while randomization occurs only at the church level. The trial is unblinded, and the churches are informed of their randomization group. Eligible participants, pregnant women attending study churches, are recruited during prayer sessions. HBI is delivered by trained community health nurses and church-based health advisors and provides free, integrated on-site laboratory tests (HIV plus hemoglobin, malaria, hepatitis B, sickle cell gene, syphilis) during a church-organized ‘baby shower.’ The baby shower includes refreshments, gifts exchange, and an educational game show testing participants’ knowledge of healthy pregnancy habits in addition to HIV acquisition modes, and effective PMTCT interventions. Baby receptions provide a contact point for follow-up after delivery. This approach was designed to reduce barriers to screening including knowledge, access, cost and stigma. The primary aim is to evaluate the effect of HBI on the HIV testing rate among pregnant women. The secondary aims are to evaluate the effect of HBI on the rate of HIV testing among male partners of pregnant women and the rate of PMTCT completion among HIV-infected pregnant women. Discussion Results of this study will provide further understanding of the most effective strategies for increasing HIV testing among pregnant women in hard-to-reach communities. Trial Registration Clinicaltrials.gov, NCT01795261 PMID:23758933

  10. Estimating causal contrasts involving intermediate variables in the presence of selection bias.

    PubMed

    Valeri, Linda; Coull, Brent A

    2016-11-20

    An important goal across the biomedical and social sciences is the quantification of the role of intermediate factors in explaining how an exposure exerts an effect on an outcome. Selection bias has the potential to severely undermine the validity of inferences on direct and indirect causal effects in observational as well as in randomized studies. The phenomenon of selection may arise through several mechanisms, and we here focus on instances of missing data. We study the sign and magnitude of selection bias in the estimates of direct and indirect effects when data on any of the factors involved in the analysis is either missing at random or not missing at random. Under some simplifying assumptions, the bias formulae can lead to nonparametric sensitivity analyses. These sensitivity analyses can be applied to causal effects on the risk difference and risk-ratio scales irrespectively of the estimation approach employed. To incorporate parametric assumptions, we also develop a sensitivity analysis for selection bias in mediation analysis in the spirit of the expectation-maximization algorithm. The approaches are applied to data from a health disparities study investigating the role of stage at diagnosis on racial disparities in colorectal cancer survival. Copyright © 2016 John Wiley & Sons, Ltd.

  11. Causal mediation analysis for longitudinal data with exogenous exposure.

    PubMed

    Bind, M-A C; Vanderweele, T J; Coull, B A; Schwartz, J D

    2016-01-01

    Mediation analysis is a valuable approach to examine pathways in epidemiological research. Prospective cohort studies are often conducted to study biological mechanisms and often collect longitudinal measurements on each participant. Mediation formulae for longitudinal data have been developed. Here, we formalize the natural direct and indirect effects using a causal framework with potential outcomes that allows for an interaction between the exposure and the mediator. To allow different types of longitudinal measures of the mediator and outcome, we assume two generalized mixed-effects models for both the mediator and the outcome. The model for the mediator has subject-specific random intercepts and random exposure slopes for each cluster, and the outcome model has random intercepts and random slopes for the exposure, the mediator, and their interaction. We also expand our approach to settings with multiple mediators and derive the mediated effects, jointly through all mediators. Our method requires the absence of time-varying confounding with respect to the exposure and the mediator. This assumption is achieved in settings with exogenous exposure and mediator, especially when exposure and mediator are not affected by variables measured at earlier time points. We apply the methodology to data from the Normative Aging Study and estimate the direct and indirect effects, via DNA methylation, of air pollution, and temperature on intercellular adhesion molecule 1 (ICAM-1) protein levels. Our results suggest that air pollution and temperature have a direct effect on ICAM-1 protein levels (i.e. not through a change in ICAM-1 DNA methylation) and that temperature has an indirect effect via a change in ICAM-1 DNA methylation. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  12. Effects of unstratified and centre-stratified randomization in multi-centre clinical trials.

    PubMed

    Anisimov, Vladimir V

    2011-01-01

    This paper deals with the analysis of randomization effects in multi-centre clinical trials. The two randomization schemes most often used in clinical trials are considered: unstratified and centre-stratified block-permuted randomization. The prediction of the number of patients randomized to different treatment arms in different regions during the recruitment period accounting for the stochastic nature of the recruitment and effects of multiple centres is investigated. A new analytic approach using a Poisson-gamma patient recruitment model (patients arrive at different centres according to Poisson processes with rates sampled from a gamma distributed population) and its further extensions is proposed. Closed-form expressions for corresponding distributions of the predicted number of the patients randomized in different regions are derived. In the case of two treatments, the properties of the total imbalance in the number of patients on treatment arms caused by using centre-stratified randomization are investigated and for a large number of centres a normal approximation of imbalance is proved. The impact of imbalance on the power of the study is considered. It is shown that the loss of statistical power is practically negligible and can be compensated by a minor increase in sample size. The influence of patient dropout is also investigated. The impact of randomization on predicted drug supply overage is discussed. Copyright © 2010 John Wiley & Sons, Ltd.
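
    The Poisson-gamma recruitment model and the imbalance induced by centre-stratified block randomization lend themselves to a short simulation sketch in Python; the gamma parameters, number of centres, and block size are invented, and the sketch is an illustration rather than the paper's analytic derivation:

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    # Poisson-gamma recruitment: each centre's rate is drawn from a gamma population and
    # patients arrive at that centre as a Poisson process. Parameter values are invented.
    n_centres, months = 100, 12
    rates = rng.gamma(shape=2.0, scale=1.5, size=n_centres)   # patients per centre per month

    def stratified_imbalance():
        """Total treatment imbalance under centre-stratified randomization with blocks of 2.

        Only a centre's final, possibly incomplete block contributes: an odd recruit count
        leaves that centre +/-1 out of balance."""
        recruited = rng.poisson(rates * months)
        odd = recruited % 2 == 1
        return rng.choice([-1, 1], size=odd.sum()).sum()

    imbalances = np.array([stratified_imbalance() for _ in range(5000)])
    print("mean imbalance:", imbalances.mean(), "sd:", imbalances.std())
    # With many centres the imbalance distribution is close to normal, in line with the
    # approximation discussed in the abstract.
    ```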

  13. A random wave model for the Aharonov-Bohm effect

    NASA Astrophysics Data System (ADS)

    Houston, Alexander J. H.; Gradhand, Martin; Dennis, Mark R.

    2017-05-01

    We study an ensemble of random waves subject to the Aharonov-Bohm effect. The introduction of a point with a magnetic flux of arbitrary strength into a random wave ensemble gives a family of wavefunctions whose distribution of vortices (complex zeros) is responsible for the topological phase associated with the Aharonov-Bohm effect. Analytical expressions are found for the vortex number and topological charge densities as functions of distance from the flux point. Comparison is made with the distribution of vortices in the isotropic random wave model. The results indicate that as the flux approaches half-integer values, a vortex with the same sign as the fractional part of the flux is attracted to the flux point, merging with it in the limit of half-integer flux. We construct a statistical model of the neighbourhood of the flux point to study how this vortex-flux merger occurs in more detail. Other features of the Aharonov-Bohm vortex distribution are also explored.

  14. Evaluation of an integrated treatment for active duty service members with comorbid posttraumatic stress disorder and major depressive disorder: Study protocol for a randomized controlled trial.

    PubMed

    Walter, Kristen H; Glassman, Lisa H; Michael Hunt, W; Otis, Nicholas P; Thomsen, Cynthia J

    2018-01-01

    Posttraumatic stress disorder (PTSD) commonly co-occurs with major depressive disorder (MDD) in both civilian and military/veteran populations. Existing, evidence-based PTSD treatments, such as cognitive processing therapy (CPT), often reduce symptoms of both PTSD and depression; however, findings related to the influence of comorbid MDD on PTSD treatment outcomes are mixed, and few studies use samples of individuals with both conditions. Behavioral activation (BA), an approach that relies on behavioral principles, is an effective treatment for depression. We have integrated BA into CPT (BA+CPT), a more cognitive approach, to address depressive symptoms among active duty service members with both PTSD and comorbid MDD. We describe an ongoing randomized controlled trial investigating the efficacy of our innovative, integrated BA+CPT intervention, compared with standard CPT, for active duty service members with PTSD and comorbid MDD. We detail the development of this integrated treatment, as well as the design and implementation of the randomized controlled trial, to evaluate its effect on symptoms. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Improvement of spontaneous language in stroke patients with chronic aphasia treated with music therapy: a randomized controlled trial.

    PubMed

    Raglio, Alfredo; Oasi, Osmano; Gianotti, Marta; Rossi, Agnese; Goulene, Karine; Stramba-Badiale, Marco

    2016-01-01

    The aim of this research is to evaluate the effects of active music therapy (MT) based on free improvisation (relational approach) in addition to speech language therapy (SLT), compared with SLT alone (communicative-pragmatic approach: Promoting Aphasic's Communicative Effectiveness), in stroke patients with chronic aphasia. The experimental group (n = 10) was randomized to 30 MT individual sessions over 15 weeks in addition to 30 SLT individual sessions, while the control group (n = 10) was randomized to only 30 SLT sessions during the same period. Psychological and speech language assessments were made before (T0) and after (T1) the treatments. The study shows a significant improvement in spontaneous speech in the experimental group (Aachener Aphasie subtest: p = 0.020; Cohen's d = 0.35); 50% of the experimental group also showed an improvement in the vitality scores of the Short Form Health Survey (chi-square test = 4.114; p = 0.043). The current trial highlights the possibility that the combined use of MT and SLT can lead to a better result in the rehabilitation of patients with aphasia than SLT alone.

  16. A Comparison of Four Approaches to Account for Method Effects in Latent State-Trait Analyses

    ERIC Educational Resources Information Center

    Geiser, Christian; Lockhart, Ginger

    2012-01-01

    Latent state-trait (LST) analysis is frequently applied in psychological research to determine the degree to which observed scores reflect stable person-specific effects, effects of situations and/or person-situation interactions, and random measurement error. Most LST applications use multiple repeatedly measured observed variables as indicators…

  17. Accounting for center in the Early External Cephalic Version trials: an empirical comparison of statistical methods to adjust for center in a multicenter trial with binary outcomes.

    PubMed

    Reitsma, Angela; Chu, Rong; Thorpe, Julia; McDonald, Sarah; Thabane, Lehana; Hutton, Eileen

    2014-09-26

    Clustering of outcomes at centers involved in multicenter trials is a type of center effect. The Consolidated Standards of Reporting Trials Statement recommends that multicenter randomized controlled trials (RCTs) should account for center effects in their analysis; however, most do not. The Early External Cephalic Version (EECV) trials published in 2003 and 2011 stratified by center at randomization, but did not account for center in the analyses, and due to the nature of the intervention and number of centers, may have been prone to center effects. Using data from the EECV trials, we undertook an empirical study to compare various statistical approaches to account for center effect while estimating the impact of external cephalic version timing (early or delayed) on the outcomes of cesarean section, preterm birth, and non-cephalic presentation at the time of birth. The data from the EECV pilot trial and the EECV2 trial were merged into one dataset. Fisher's exact method was used to test the overall effect of external cephalic version timing unadjusted for center effects. Seven statistical models that accounted for center effects were applied to the data. The models included: i) the Mantel-Haenszel test, ii) logistic regression with fixed center effect and fixed treatment effect, iii) center-size-weighted and iv) unweighted logistic regression with fixed center effect and fixed treatment-by-center interaction, v) logistic regression with random center effect and fixed treatment effect, vi) logistic regression with random center effect and random treatment-by-center interaction, and vii) generalized estimating equations. For each of the three outcomes of interest, approaches to account for center effect did not alter the overall findings of the trial. The results were similar for the majority of the methods used to adjust for center, illustrating the robustness of the findings. Despite literature that suggests center effect can change the estimate of effect in multicenter trials, this empirical study does not show a difference in the outcomes of the EECV trials when accounting for center effect. The EECV2 trial was registered on 30 July 2005 with Current Controlled Trials: ISRCTN 56498577.
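
    As an illustration of model ii) above alongside an unadjusted comparison, here is a sketch on simulated data (assuming pandas and statsmodels are available; the EECV data are not reproduced here and all numbers below are invented):

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(5)

    # Simulated multicentre trial with a binary outcome; all parameters are invented.
    n_centres, per_centre = 10, 80
    centre = np.repeat(np.arange(n_centres), per_centre)
    treat = np.tile([0, 1], n_centres * per_centre // 2)       # 1:1 allocation within centres
    centre_effect = rng.normal(0, 0.5, n_centres)[centre]
    logit_p = -0.3 + 0.4 * treat + centre_effect
    y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))
    data = pd.DataFrame({"y": y, "treat": treat, "centre": centre})

    # Model ii): fixed centre effects and a fixed treatment effect
    adjusted = smf.logit("y ~ treat + C(centre)", data).fit(disp=False)

    # Unadjusted comparison, for contrast with the centre-adjusted estimate
    unadjusted = smf.logit("y ~ treat", data).fit(disp=False)

    print("treatment log-odds ratio, centre-adjusted:", round(adjusted.params["treat"], 3))
    print("treatment log-odds ratio, unadjusted:     ", round(unadjusted.params["treat"], 3))
    ```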

  18. Reliability Coupled Sensitivity Based Design Approach for Gravity Retaining Walls

    NASA Astrophysics Data System (ADS)

    Guha Ray, A.; Baidya, D. K.

    2012-09-01

    Sensitivity analysis involving different random variables and different potential failure modes of a gravity retaining wall focuses on the fact that high sensitivity of a particular variable on a particular mode of failure does not necessarily imply a remarkable contribution to the overall failure probability. The present paper aims at identifying a probabilistic risk factor (R_f) for each random variable based on the combined effects of the failure probability (P_f) of each mode of failure of a gravity retaining wall and the sensitivity of each of the random variables on these failure modes. P_f is calculated by Monte Carlo simulation, and the sensitivity analysis of each random variable is carried out by F-test analysis. The structure, redesigned by modifying the original random variables with the risk factors, is safe against all the variations of the random variables. It is observed that R_f for the friction angle of the backfill soil (φ_1) increases and that for the cohesion of the foundation soil (c_2) decreases with an increase in the variation of φ_1, while R_f for the unit weights of both soils (γ_1 and γ_2) and for the friction angle of the foundation soil (φ_2) remains almost constant under variation of the soil properties. The results compared well with some of the existing deterministic and probabilistic methods and were found to be cost-effective. It is seen that if the variation of φ_1 remains within 5%, a significant reduction in cross-sectional area can be achieved, but if the variation is more than 7-8%, the structure needs to be modified. Finally, design guidelines for different wall dimensions, based on the present approach, are proposed.
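
    The Monte Carlo step of such an analysis can be sketched in Python; the limit-state function, distributions, and coefficients of variation below are invented placeholders rather than the paper's wall geometry or failure modes:

    ```python
    import numpy as np

    rng = np.random.default_rng(2024)

    # Monte Carlo estimate of a failure probability from random soil properties.
    n = 200_000
    phi1 = rng.normal(34.0, 34.0 * 0.07, n)      # backfill friction angle (deg), 7% c.o.v.
    gamma1 = rng.normal(18.0, 18.0 * 0.05, n)    # backfill unit weight (kN/m^3)
    c2 = rng.normal(25.0, 25.0 * 0.10, n)        # foundation cohesion (kPa)

    def safety_margin(phi1, gamma1, c2):
        """Toy sliding limit state: resistance minus driving force."""
        ka = np.tan(np.radians(45 - phi1 / 2)) ** 2      # Rankine active pressure coefficient
        driving = 0.5 * ka * gamma1 * 5.0 ** 2           # active thrust on a 5 m wall
        resistance = 55.0 + c2                           # lumped base friction plus adhesion
        return resistance - driving

    pf = np.mean(safety_margin(phi1, gamma1, c2) < 0)    # estimated P_f
    print("estimated failure probability:", pf)
    ```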

  19. Effect of inventory method on niche models: random versus systematic error

    Treesearch

    Heather E. Lintz; Andrew N. Gray; Bruce McCune

    2013-01-01

    Data from large-scale biological inventories are essential for understanding and managing Earth's ecosystems. The Forest Inventory and Analysis Program (FIA) of the U.S. Forest Service is the largest biological inventory in North America; however, the FIA inventory recently changed from an amalgam of different approaches to a nationally-standardized approach in...

  20. Investigations of Students' Motivation Towards Learning Secondary School Physics through Mastery Learning Approach

    ERIC Educational Resources Information Center

    Changeiywo, Johnson M.; Wambugu, P. W.; Wachanga, S. W.

    2011-01-01

    Teaching method is a major factor that affects students' motivation to learn physics. This study investigated the effects of using mastery learning approach (MLA) on secondary school students' motivation to learn physics. Solomon four non-equivalent control group design under the quasi-experimental research method was used in which a random sample…

  1. Stimulating Graphical Summarization in Late Elementary Education: The Relationship between Two Instructional Mind-Map Approaches and Student Characteristics

    ERIC Educational Resources Information Center

    Merchie, Emmelien; Van Keer, Hilde

    2016-01-01

    This study examined the effectiveness of two instructional mind-mapping approaches to stimulate fifth and sixth graders' graphical summarization skills. Thirty-five fifth- and sixth-grade teachers and 644 students from 17 different elementary schools participated. A randomized quasi-experimental repeated-measures design was set up with two…

  2. The Effect of the Psychiatric Nursing Approach Based on the Tidal Model on Coping and Self-esteem in People with Alcohol Dependency: A Randomized Trial.

    PubMed

    Savaşan, Ayşegül; Çam, Olcay

    2017-06-01

    People with alcohol dependency have lower self-esteem than controls and when their alcohol use increases, their self-esteem decreases. Coping skills in alcohol related issues are predicted to reduce vulnerability to relapse. It is important to adapt care to individual needs so as to prevent a return to the cycle of alcohol use. The Tidal Model focuses on providing support and services to people who need to live a constructive life. The aim of the randomized study was to determine the effect of the psychiatric nursing approach based on the Tidal Model on coping and self-esteem in people with alcohol dependency. The study was semi-experimental in design with a control group, and was conducted on 36 individuals (18 experimental, 18 control). An experimental and a control group were formed by assigning persons to each group using the stratified randomization technique in the order in which they were admitted to hospital. The Coping Inventory (COPE) and the Coopersmith Self-Esteem Inventory (CSEI) were used as measurement instruments. The measurement instruments were applied before the application and three months after the application. In addition to routine treatment and follow-up, the psychiatric nursing approach based on the Tidal Model was applied to the experimental group in the One-to-One Sessions. The psychiatric nursing approach based on the Tidal Model is an approach which is effective in increasing the scores of people with alcohol dependency in positive reinterpretation and growth, active coping, restraint, emotional social support and planning and reducing their scores in behavioral disengagement. It was seen that self-esteem rose, but the difference from the control group did not reach significance. The psychiatric nursing approach based on the Tidal Model has an effect on people with alcohol dependency in maintaining their abstinence. The results of the study may provide practices on a theoretical basis for improving coping behaviors and self-esteem and facilitating the recovery process of alcohol dependents with implications for mental health nursing. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Random walks with shape prior for cochlea segmentation in ex vivo μCT.

    PubMed

    Ruiz Pujadas, Esmeralda; Kjer, Hans Martin; Piella, Gemma; Ceresa, Mario; González Ballester, Miguel Angel

    2016-09-01

    Cochlear implantation is a safe and effective surgical procedure to restore hearing in deaf patients. However, the level of restoration achieved may vary due to differences in anatomy, implant type and surgical access. In order to reduce the variability of the surgical outcomes, we previously proposed the use of a high-resolution model built from μCT images and then adapted to patient-specific clinical CT scans. As the accuracy of the model is dependent on the precision of the original segmentation, it is extremely important to have accurate μCT segmentation algorithms. We propose a new framework for cochlea segmentation in ex vivo μCT images using random walks where a distance-based shape prior is combined with a region term estimated by a Gaussian mixture model. The prior is also weighted by a confidence map to adjust its influence according to the strength of the image contour. Random walks is performed iteratively, and the prior mask is aligned in every iteration. We tested the proposed approach in ten μCT data sets and compared it with other random walks-based segmentation techniques such as guided random walks (Eslami et al. in Med Image Anal 17(2):236-253, 2013) and constrained random walks (Li et al. in Advances in image and video technology. Springer, Berlin, pp 215-226, 2012). Our approach demonstrated higher accuracy results due to the probability density model constituted by the region term and shape prior information weighed by a confidence map. The weighted combination of the distance-based shape prior with a region term into random walks provides accurate segmentations of the cochlea. The experiments suggest that the proposed approach is robust for cochlea segmentation.
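
    For orientation, a plain random-walker segmentation (without the distance-based shape prior or confidence weighting proposed above) can be run on a synthetic image with scikit-image, assuming that library is available; the image and seed placement are invented:

    ```python
    import numpy as np
    from skimage.segmentation import random_walker

    rng = np.random.default_rng(0)

    # Synthetic 2-D image: a bright disc on a noisy background, standing in for a structure
    # of interest. This shows only the basic random-walker step, not the shape-prior extension.
    yy, xx = np.mgrid[:128, :128]
    disc = (yy - 64) ** 2 + (xx - 64) ** 2 < 30 ** 2
    image = disc.astype(float) + rng.normal(0, 0.35, disc.shape)

    # Seeds: 1 = object (centre), 2 = background (corner); 0 = pixels to be labelled
    labels = np.zeros_like(image, dtype=int)
    labels[60:68, 60:68] = 1
    labels[:10, :10] = 2

    segmentation = random_walker(image, labels, beta=100)
    print("object pixels found:", int((segmentation == 1).sum()),
          "true disc pixels:", int(disc.sum()))
    ```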

  4. Comparative Effectiveness Research in Oncology

    PubMed Central

    2013-01-01

    Although randomized controlled trials represent the gold standard for comparative effective research (CER), a number of additional methods are available when randomized controlled trials are lacking or inconclusive because of the limitations of such trials. In addition to more relevant, efficient, and generalizable trials, there is a need for additional approaches utilizing rigorous methodology while fully recognizing their inherent limitations. CER is an important construct for defining and summarizing evidence on effectiveness and safety and comparing the value of competing strategies so that patients, providers, and policymakers can be offered appropriate recommendations for optimal patient care. Nevertheless, methodological as well as political and social challenges for CER remain. CER requires constant and sophisticated methodological oversight of study design and analysis similar to that required for randomized trials to reduce the potential for bias. At the same time, if appropriately conducted, CER offers an opportunity to identify the most effective and safe approach to patient care. Despite rising and unsustainable increases in health care costs, an even greater challenge to the implementation of CER arises from the social and political environment questioning the very motives and goals of CER. Oncologists and oncology professional societies are uniquely positioned to provide informed clinical and methodological expertise to steer the appropriate application of CER toward critical discussions related to health care costs, cost-effectiveness, and the comparative value of the available options for appropriate care of patients with cancer. PMID:23697601

  5. Cooperative effect of random and time-periodic coupling strength on synchronization transitions in one-way coupled neural system: mean field approach.

    PubMed

    Jiancheng, Shi; Min, Luo; Chusheng, Huang

    2017-08-01

    The cooperative effect of random coupling strength and time-periodic coupling strength on synchronization transitions in a one-way coupled neural system has been investigated by a mean field approach. Results show that cooperative coupling strength (CCS) plays an active role in the enhancement of synchronization transitions. There exist an optimal frequency of CCS at which the system displays the best CCS-induced synchronization transitions, a critical frequency of CCS beyond which the CCS-induced synchronization transitions are no longer affected, and a critical amplitude of CCS below which the CCS-induced synchronization transitions do not occur. Meanwhile, noise intensity plays a negative role in the CCS-induced synchronization transitions. Furthermore, novel CCS amplitude-induced and CCS frequency-induced synchronization transitions are found.

  6. Comparative effects of different dietary approaches on blood pressure in hypertensive and pre-hypertensive patients: A systematic review and network meta-analysis.

    PubMed

    Schwingshackl, Lukas; Chaimani, Anna; Schwedhelm, Carolina; Toledo, Estefania; Pünsch, Marina; Hoffmann, Georg; Boeing, Heiner

    2018-05-02

    Pairwise meta-analyses have shown beneficial effects of individual dietary approaches on blood pressure but their comparative effects have not been established. Therefore we performed a systematic review of different dietary intervention trials and estimated the aggregate blood pressure effects through network meta-analysis including hypertensive and pre-hypertensive patients. PubMed, Cochrane CENTRAL, and Google Scholar were searched until June 2017. The inclusion criteria were defined as follows: i) Randomized trial with a dietary approach; ii) hypertensive and pre-hypertensive adult patients; and iii) minimum intervention period of 12 weeks. In order to determine the pooled effect of each intervention relative to each of the other interventions for both systolic and diastolic blood pressure (SBP and DBP), random effects network meta-analysis was performed. A total of 67 trials comparing 13 dietary approaches (DASH, low-fat, moderate-carbohydrate, high-protein, low-carbohydrate, Mediterranean, Palaeolithic, vegetarian, low-GI/GL, low-sodium, Nordic, Tibetan, and control) enrolling 17,230 participants were included. In the network meta-analysis, the DASH, Mediterranean, low-carbohydrate, Palaeolithic, high-protein, low-glycaemic index, low-sodium, and low-fat dietary approaches were significantly more effective in reducing SBP (-8.73 to -2.32 mmHg) and DBP (-4.85 to -1.27 mmHg) compared to a control diet. According to the SUCRAs, the DASH diet was ranked the most effective dietary approach in reducing SBP (90%) and DBP (91%), followed by the Palaeolithic, and the low-carbohydrate diet (ranked 3rd for SBP) or the Mediterranean diet (ranked 3rd for DBP). For most comparisons, the credibility of evidence was rated very low to moderate, with the exception of the DASH vs. the low-fat dietary approach, for which the quality of evidence was rated high. The present network meta-analysis suggests that the DASH dietary approach might be the most effective dietary measure to reduce blood pressure among hypertensive and pre-hypertensive patients, based on high quality evidence.

  7. Does an outcome-based approach to continuing medical education improve physicians' competences in rational prescribing?

    PubMed

    Esmaily, Hamideh M; Savage, Carl; Vahidi, Rezagoli; Amini, Abolghasem; Dastgiri, Saeed; Hult, Hakan; Dahlgren, Lars Owe; Wahlstrom, Rolf

    2009-11-01

    Continuing medical education (CME) is compulsory in Iran, and traditionally it is lecture-based, an approach that is mostly unsuccessful. Outcome-based education has been proposed for CME programs. The aim was to evaluate the effectiveness of an outcome-based educational intervention, with a new approach based on outcomes and aligned teaching methods, on the knowledge and skills of general physicians (GPs) working in primary care, compared with a concurrent CME program in the field of "Rational prescribing". The method used was a cluster randomized controlled design. All GPs working in six cities in one province in Iran were invited to participate. The cities were matched and randomly divided into an intervention arm for education on rational prescribing with an outcome-based approach, and a control arm for a traditional program on the same topic. Knowledge and skills were assessed using a pre- and post-test, including case scenarios. In total, 112 GPs participated. There were significant improvements in knowledge and prescribing skills after the training in the intervention arm, both over time and in comparison with the changes in the control arm. The overall intervention effect was 26 percentage units. The introduction of an outcome-based approach in CME appears to be effective when creating programs to improve GPs' knowledge and skills.

  8. Behavioral Outcome Effects of Serious Gaming as an Adjunct to Treatment for Children With Attention-Deficit/Hyperactivity Disorder: A Randomized Controlled Trial.

    PubMed

    Bul, Kim C M; Kato, Pamela M; Van der Oord, Saskia; Danckaerts, Marina; Vreeke, Leonie J; Willems, Annik; van Oers, Helga J J; Van Den Heuvel, Ria; Birnie, Derk; Van Amelsvoort, Thérèse A M J; Franken, Ingmar H A; Maras, Athanasios

    2016-02-16

    The need for accessible and motivating treatment approaches within mental health has led to the development of an Internet-based serious game intervention (called "Plan-It Commander") as an adjunct to treatment as usual for children with attention-deficit/hyperactivity disorder (ADHD). The aim was to determine the effects of Plan-It Commander on daily life skills of children with ADHD in a multisite randomized controlled crossover open-label trial. Participants (N=170) in this 20-week trial had a diagnosis of ADHD and ranged in age from 8 to 12 years (male: 80.6%, 137/170; female: 19.4%, 33/170). They were randomized to a serious game intervention group (group 1; n=88) or a treatment-as-usual crossover group (group 2; n=82). Participants randomized to group 1 received a serious game intervention in addition to treatment as usual for the first 10 weeks and then received treatment as usual for the next 10 weeks. Participants randomized to group 2 received treatment as usual for the first 10 weeks and crossed over to the serious game intervention in addition to treatment as usual for the subsequent 10 weeks. Primary (parent report) and secondary (parent, teacher, and child self-report) outcome measures were administered at baseline, 10 weeks, and 10-week follow-up. After 10 weeks, participants in group 1 compared to group 2 achieved significantly greater improvements on the primary outcome of time management skills (parent-reported; P=.004) and on secondary outcomes of the social skill of responsibility (parent-reported; P=.04), and working memory (parent-reported; P=.02). Parents and teachers reported that total social skills improved over time within groups, whereas effects on total social skills and teacher-reported planning/organizing skills were nonsignificant between groups. Within group 1, positive effects were maintained or further improved in the last 10 weeks of the study. Participants in group 2, who played the serious game during the second period of the study (weeks 10 to 20), improved on comparable domains of daily life functioning over time. Plan-It Commander, as an adjunct to traditional therapeutic ADHD approaches, offers an effective therapeutic option for improving functional outcomes in daily life. International Standard Randomized Controlled Trial Number (ISRCTN): 62056259; http://www.controlled-trials.com/ISRCTN62056259 (Archived by WebCite at http://www.webcitation.org/6eNsiTDJV).

  9. Behavioral Outcome Effects of Serious Gaming as an Adjunct to Treatment for Children With Attention-Deficit/Hyperactivity Disorder: A Randomized Controlled Trial

    PubMed Central

    2016-01-01

    Background The need for accessible and motivating treatment approaches within mental health has led to the development of an Internet-based serious game intervention (called “Plan-It Commander”) as an adjunct to treatment as usual for children with attention-deficit/hyperactivity disorder (ADHD). Objective The aim was to determine the effects of Plan-It Commander on daily life skills of children with ADHD in a multisite randomized controlled crossover open-label trial. Methods Participants (N=170) in this 20-week trial had a diagnosis of ADHD and ranged in age from 8 to 12 years (male: 80.6%, 137/170; female: 19.4%, 33/170). They were randomized to a serious game intervention group (group 1; n=88) or a treatment-as-usual crossover group (group 2; n=82). Participants randomized to group 1 received a serious game intervention in addition to treatment as usual for the first 10 weeks and then received treatment as usual for the next 10 weeks. Participants randomized to group 2 received treatment as usual for the first 10 weeks and crossed over to the serious game intervention in addition to treatment as usual for the subsequent 10 weeks. Primary (parent report) and secondary (parent, teacher, and child self-report) outcome measures were administered at baseline, 10 weeks, and 10-week follow-up. Results After 10 weeks, participants in group 1 compared to group 2 achieved significantly greater improvements on the primary outcome of time management skills (parent-reported; P=.004) and on secondary outcomes of the social skill of responsibility (parent-reported; P=.04), and working memory (parent-reported; P=.02). Parents and teachers reported that total social skills improved over time within groups, whereas effects on total social skills and teacher-reported planning/organizing skills were nonsignificant between groups. Within group 1, positive effects were maintained or further improved in the last 10 weeks of the study. Participants in group 2, who played the serious game during the second period of the study (weeks 10 to 20), improved on comparable domains of daily life functioning over time. Conclusions Plan-It Commander, as an adjunct to traditional therapeutic ADHD approaches, offers an effective therapeutic option for improving functional outcomes in daily life. Trial Registration International Standard Randomized Controlled Trial Number (ISRCTN): 62056259; http://www.controlled-trials.com/ISRCTN62056259 (Archived by WebCite at http://www.webcitation.org/6eNsiTDJV). PMID:26883052

  10. The Impact of Individual Differences on a Bilingual Vocabulary Approach for Latino Preschoolers.

    PubMed

    Méndez, Lucía I; Crais, Elizabeth R; Kainz, Kirsten

    2018-04-17

    The purpose of this study was twofold: First, we replicated in a new sample our previous findings that a culturally and linguistically responsive (CLR) bilingual approach for English vocabulary instruction for preschool Latino dual language learners was effective. Subsequently, we investigated whether the positive effect of CLR instruction varies as a function of individual child characteristics, including baseline vocabulary levels and gender. Using a randomized pretest-posttest follow-up group design, we first replicated our previous study (N = 42) with a new sample by randomly assigning 35 Spanish-speaking Latino preschoolers to a CLR bilingual group or an English-only group. The preschoolers received small-group evidence-informed shared readings targeting 30 English words 3 times a week for 5 weeks in their preschools. Vocabulary outcomes were measured using both standardized and researcher-developed measures. We subsequently conducted further studies with the combined sample size of 77 children to examine the variability in intervention effects related to child gender and baseline vocabulary levels. The direct replication study confirmed findings of our earlier work suggesting that the CLR bilingual approach promoted greater gains in L1 and L2 vocabulary than in an English-only approach. The extension studies revealed that the effect of the CLR bilingual vocabulary approach on English and Spanish vocabulary outcomes was not impacted by gender or vocabulary status at baseline. This study provides additional evidence of the benefits of strategically combining L1 and L2 for vocabulary instruction over an English-only approach. Our findings also suggest that preschool Latino dual language learners can benefit from a bilingual vocabulary instructional approach regardless of gender or baseline vocabulary levels in L1.

  11. A Randomized Effectiveness Trial of a Systems-Level Approach to Stepped Care for War-Related PTSD

    DTIC Science & Technology

    2016-05-01

    digitize consent forms and store them centrally at RTI for the required six-year time period rather than storing the hard copies at their respective posts ... treating depression and post-traumatic stress disorder in military personnel. Under review. Marshall G, et al. Temporal associations among PTSD ... Belsher, B., Jaycox, L.H. The cost-effectiveness of a collaborative care approach to treating depression and post-traumatic stress disorder in

  12. Prediction models for clustered data: comparison of a random intercept and standard regression model

    PubMed Central

    2013-01-01

    Background When study data are clustered, standard regression analysis is considered inappropriate and analytical techniques for clustered data need to be used. For prediction research in which the interest of predictor effects is on the patient level, random effect regression models are probably preferred over standard regression analysis. It is well known that the random effect parameter estimates and the standard logistic regression parameter estimates are different. Here, we compared random effect and standard logistic regression models for their ability to provide accurate predictions. Methods Using an empirical study on 1642 surgical patients at risk of postoperative nausea and vomiting, who were treated by one of 19 anesthesiologists (clusters), we developed prognostic models either with standard or random intercept logistic regression. External validity of these models was assessed in new patients from other anesthesiologists. We supported our results with simulation studies using intra-class correlation coefficients (ICC) of 5%, 15%, or 30%. Standard performance measures and measures adapted for the clustered data structure were estimated. Results The model developed with random effect analysis showed better discrimination than the standard approach, if the cluster effects were used for risk prediction (standard c-index of 0.69 versus 0.66). In the external validation set, both models showed similar discrimination (standard c-index 0.68 versus 0.67). The simulation study confirmed these results. For datasets with a high ICC (≥15%), model calibration was only adequate in external subjects, if the used performance measure assumed the same data structure as the model development method: standard calibration measures showed good calibration for the standard developed model, calibration measures adapting the clustered data structure showed good calibration for the prediction model with random intercept. Conclusion The models with random intercept discriminate better than the standard model only if the cluster effect is used for predictions. The prediction model with random intercept had good calibration within clusters. PMID:23414436

  13. Prediction models for clustered data: comparison of a random intercept and standard regression model.

    PubMed

    Bouwmeester, Walter; Twisk, Jos W R; Kappen, Teus H; van Klei, Wilton A; Moons, Karel G M; Vergouwe, Yvonne

    2013-02-15

    When study data are clustered, standard regression analysis is considered inappropriate and analytical techniques for clustered data need to be used. For prediction research in which the interest of predictor effects is on the patient level, random effect regression models are probably preferred over standard regression analysis. It is well known that the random effect parameter estimates and the standard logistic regression parameter estimates are different. Here, we compared random effect and standard logistic regression models for their ability to provide accurate predictions. Using an empirical study on 1642 surgical patients at risk of postoperative nausea and vomiting, who were treated by one of 19 anesthesiologists (clusters), we developed prognostic models either with standard or random intercept logistic regression. External validity of these models was assessed in new patients from other anesthesiologists. We supported our results with simulation studies using intra-class correlation coefficients (ICC) of 5%, 15%, or 30%. Standard performance measures and measures adapted for the clustered data structure were estimated. The model developed with random effect analysis showed better discrimination than the standard approach, if the cluster effects were used for risk prediction (standard c-index of 0.69 versus 0.66). In the external validation set, both models showed similar discrimination (standard c-index 0.68 versus 0.67). The simulation study confirmed these results. For datasets with a high ICC (≥15%), model calibration was only adequate in external subjects, if the used performance measure assumed the same data structure as the model development method: standard calibration measures showed good calibration for the standard developed model, calibration measures adapting the clustered data structure showed good calibration for the prediction model with random intercept. The models with random intercept discriminate better than the standard model only if the cluster effect is used for predictions. The prediction model with random intercept had good calibration within clusters.
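
    To make the discrimination comparison concrete, the following simulation sketch contrasts the c-index of a risk score that uses the cluster (random intercept) effect with the c-index of a purely patient-level score. The 19 clusters echo the 19 anesthesiologists above, but the intercept standard deviation, predictor effect, and cluster size are arbitrary, and the true simulation parameters are used in place of fitted models to keep the example short.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n_clusters, n_per = 19, 80                      # e.g. 19 anesthesiologists (clusters)
        cluster = np.repeat(np.arange(n_clusters), n_per)
        b = rng.normal(0, 1.0, n_clusters)              # cluster-specific random intercepts
        x = rng.normal(size=cluster.size)               # patient-level predictor
        eta = -1.0 + 0.8 * x + b[cluster]
        y = rng.binomial(1, 1 / (1 + np.exp(-eta)))

        # c-index (equivalently, ROC AUC) with and without the cluster effect in the risk score
        print("with cluster effect:", roc_auc_score(y, 0.8 * x + b[cluster]).round(3))
        print("patient-level only: ", roc_auc_score(y, 0.8 * x).round(3))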

  14. A randomized study of the effect of anonymity, quasi-anonymity, and Certificates of Confidentiality on postpartum women's disclosure of sensitive information.

    PubMed

    Beatty, Jessica R; Chase, Sara K; Ondersma, Steven J

    2014-01-01

    Under-reporting of substance use and other sensitive information is a substantial threat to internal study validity, particularly during the perinatal period. Anonymous approaches are associated with greater disclosure but are incompatible with longitudinal follow-up. Alternative approaches include use of a U.S. Federal Certificate of Confidentiality (CoC) and quasi-anonymous methods, in which there is no link between name and data. However, the relative effect of these procedures on disclosure is unknown. This randomized study was designed to evaluate the effects of consent condition (anonymous, quasi-anonymous, CoC, and traditional confidentiality) on disclosure of sensitive information among postpartum women. Participants were 200 postpartum, primarily African-American women who were randomly assigned to one of the four consent conditions and completed a brief computer-delivered assessment of alcohol and drug use, sexual risk, intimate partner violence, and emotional distress. Participants in the anonymous and quasi-anonymous conditions disclosed significantly more sensitive information than those in the traditional consent condition. In contrast, no advantage in overall disclosure was observed for the CoC condition. This result was largely consistent across specific content areas with the exception of emotional distress, disclosure of which was unrelated to consent condition. Although use of a CoC has limited impact on disclosure, the quasi-anonymous method may increase disclosure to a similar extent as full anonymity. Quasi-anonymous approaches should be considered when under-reporting is likely, a context in which the disadvantages of this approach must be balanced against its advantages. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  15. The WORD (Wholeness, Oneness, Righteousness, Deliverance): design of a randomized controlled trial testing the effectiveness of an evidence-based weight loss and maintenance intervention translated for a faith-based, rural, African American population using a community-based participatory approach.

    PubMed

    Yeary, Karen Hye-cheon Kim; Cornell, Carol E; Prewitt, Elaine; Bursac, Zoran; Tilford, J Mick; Turner, Jerome; Eddings, Kenya; Love, ShaRhonda; Whittington, Emily; Harris, Kimberly

    2015-01-01

    The positive effects of weight loss on obesity-related risk factors diminish unless weight loss is maintained. Yet little work has focused on the translation of evidence-based weight loss interventions with the aim of sustaining weight loss in underserved populations. Using a community-based participatory approach (CBPR) that engages the strong faith-based social infrastructure characteristic of rural African American communities is a promising way to sustain weight loss in African Americans, who bear a disproportionate burden of the obesity epidemic. Led by a collaborative community-academic partnership, The WORD aims to change dietary and physical activity behaviors to produce and maintain weight loss in rural, African American adults of faith. The WORD is a randomized controlled trial with 450 participants nested within 30 churches. All churches will receive a 16-session core weight loss intervention. Half of the churches will be randomized to receive an additional 12-session maintenance component. The WORD is a cultural adaptation of the Diabetes Prevention Program, whereby small groups will be led by trained church members. Participants will be assessed at baseline, 6, 12, and 18 months. A detailed cost-effectiveness and process evaluation will be included. The WORD aims to sustain weight loss in rural African Americans. The utilization of a CBPR approach and the engagement of the faith-based social infrastructure of African American communities will maximize the intervention's sustainability. Unique aspects of this trial include the focus on weight loss maintenance and the use of a faith-based CBPR approach in translating evidence-based obesity interventions. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Multi-factor challenge/response approach for remote biometric authentication

    NASA Astrophysics Data System (ADS)

    Al-Assam, Hisham; Jassim, Sabah A.

    2011-06-01

    Although biometric authentication is perceived to be more reliable than traditional authentication schemes, it becomes vulnerable to many attacks when it comes to remote authentication over open networks and raises serious privacy concerns. This paper proposes a biometric-based challenge-response approach to be used for remote authentication between two parties A and B over open networks. In the proposed approach, a remote authenticator system B (e.g. a bank) challenges its client A, who wants to authenticate himself or herself to the system, by sending a one-time public random challenge. The client A responds by employing the random challenge along with secret information obtained from a password and a token to produce a one-time cancellable representation of a freshly captured biometric sample. The one-time biometric representation, which is based on multiple factors, is then sent back to B for matching. Here, we argue that eavesdropping on the one-time random challenge and/or the resulting one-time biometric representation does not compromise the security of the system, and no information about the original biometric data is leaked. In addition to securing biometric templates, the proposed protocol offers a practical solution to the replay attack on biometric systems. Moreover, we propose a new scheme for generating password-based pseudo-random numbers/permutations to be used as a building block in the proposed approach. The proposed scheme is also designed to provide protection against repudiation. We illustrate the viability and effectiveness of the proposed approach by experimental results based on two biometric modalities: fingerprint and face biometrics.
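
    A minimal sketch of one way such a password-based pseudo-random permutation could be derived, assuming PBKDF2 key derivation and a NumPy generator; the function name, parameter values, and the idea of mixing the server's one-time challenge into the salt are illustrative assumptions, not the authors' actual scheme.

        import hashlib
        import numpy as np

        def one_time_permutation(password: str, token_secret: bytes, challenge: bytes, n: int):
            """Derive a permutation of n elements from the password, token secret, and one-time challenge."""
            key = hashlib.pbkdf2_hmac("sha256", password.encode(), token_secret + challenge, 100_000)
            rng = np.random.default_rng(int.from_bytes(key, "big"))
            return rng.permutation(n)

        # every new challenge from the authenticator yields a fresh, non-replayable permutation
        perm = one_time_permutation("correct horse battery staple", b"token-secret", b"server-nonce-123", 64)
        print(perm[:10])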

  17. Analysis of overdispersed count data by mixtures of Poisson variables and Poisson processes.

    PubMed

    Hougaard, P; Lee, M L; Whitmore, G A

    1997-12-01

    Count data often show overdispersion compared to the Poisson distribution. Overdispersion is typically modeled by a random effect for the mean, based on the gamma distribution, leading to the negative binomial distribution for the count. This paper considers a larger family of mixture distributions, including the inverse Gaussian mixture distribution. It is demonstrated that it gives a significantly better fit for a data set on the frequency of epileptic seizures. The same approach can be used to generate counting processes from Poisson processes, where the rate or the time is random. A random rate corresponds to variation between patients, whereas a random time corresponds to variation within patients.
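
    The gamma-mixed Poisson (negative binomial) setup described here is easy to reproduce. The sketch below simulates subject-specific rates from a gamma distribution, confirms that the resulting counts are overdispersed (variance well above the mean), and recovers the dispersion parameter by the method of moments; the shape and scale values are arbitrary.

        import numpy as np

        rng = np.random.default_rng(0)
        shape, scale = 2.0, 3.0                     # gamma parameters for the random rate (mean rate 6)
        lam = rng.gamma(shape, scale, size=5000)    # between-subject variation in the rate
        y = rng.poisson(lam)                        # observed counts

        m, v = y.mean(), y.var(ddof=1)
        print(f"mean {m:.2f}, variance {v:.2f}")    # variance >> mean indicates overdispersion
        r_hat = m**2 / (v - m)                      # method-of-moments negative binomial dispersion
        print(f"estimated dispersion r: {r_hat:.2f} (true {shape})")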

  18. A random sampling approach for robust estimation of tissue-to-plasma ratio from extremely sparse data.

    PubMed

    Chu, Hui-May; Ette, Ene I

    2005-09-02

    This study was performed to develop a new nonparametric approach for the estimation of robust tissue-to-plasma ratio from extremely sparsely sampled paired data (ie, one sample each from plasma and tissue per subject). Tissue-to-plasma ratio was estimated from paired/unpaired experimental data using the independent time points approach, area under the curve (AUC) values calculated with the naive data averaging approach, and AUC values calculated using sampling based approaches (eg, the pseudoprofile-based bootstrap [PpbB] approach and the random sampling approach [our proposed approach]). The random sampling approach involves the use of a 2-phase algorithm. The convergence of the sampling/resampling approaches was investigated, as well as the robustness of the estimates produced by different approaches. To evaluate the latter, new data sets were generated by introducing outlier(s) into the real data set. One to 2 concentration values were inflated by 10% to 40% from their original values to produce the outliers. Tissue-to-plasma ratios computed using the independent time points approach varied between 0 and 50 across time points. The ratio obtained from AUC values acquired using the naive data averaging approach was not associated with any measure of uncertainty or variability. Calculating the ratio without regard to pairing yielded poorer estimates. The random sampling and pseudoprofile-based bootstrap approaches yielded tissue-to-plasma ratios with uncertainty and variability. However, the random sampling approach, because of the 2-phase nature of its algorithm, yielded more robust estimates and required fewer replications. Therefore, a 2-phase random sampling approach is proposed for the robust estimation of tissue-to-plasma ratio from extremely sparsely sampled data.
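
    As a simplified illustration of estimating a tissue-to-plasma ratio from one paired sample per subject, the sketch below applies naive data averaging with trapezoidal AUCs and attaches uncertainty with a plain one-phase subject bootstrap; the concentration values are hypothetical, and the two-phase algorithm described above is not reproduced.

        import numpy as np

        rng = np.random.default_rng(0)
        times = np.array([0.5, 1, 2, 4, 8])                  # nominal sampling times (h)
        # one paired (plasma, tissue) sample per subject; 4 subjects per nominal time (made-up values)
        plasma = np.array([[9.8, 10.5, 11.2, 9.1],
                           [8.3,  7.6,  8.9, 7.9],
                           [5.2,  4.8,  5.5, 4.6],
                           [2.6,  2.3,  2.8, 2.4],
                           [1.1,  0.9,  1.2, 1.0]])
        tissue = 2.1 * plasma * rng.normal(1.0, 0.1, plasma.shape)   # tissue roughly twice plasma

        def trapz(y, x):
            return float(np.sum((y[1:] + y[:-1]) / 2 * np.diff(x)))

        def auc_ratio(p, t):
            # naive averaging: mean concentration per time point, then trapezoidal AUCs
            return trapz(t.mean(axis=1), times) / trapz(p.mean(axis=1), times)

        boots = []
        for _ in range(2000):                                # resample subjects within each time point
            idx = rng.integers(0, plasma.shape[1], size=plasma.shape)
            boots.append(auc_ratio(np.take_along_axis(plasma, idx, axis=1),
                                   np.take_along_axis(tissue, idx, axis=1)))
        print(round(auc_ratio(plasma, tissue), 2), np.percentile(boots, [2.5, 97.5]).round(2))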

  19. Sequential Multiple Assignment Randomized Trial (SMART) with Adaptive Randomization for Quality Improvement in Depression Treatment Program

    PubMed Central

    Chakraborty, Bibhas; Davidson, Karina W.

    2015-01-01

    Summary An implementation study is an important tool for deploying state-of-the-art treatments from clinical efficacy studies into a treatment program, with the dual goals of learning about the effectiveness of the treatments and improving the quality of care for patients enrolled in the program. In this article, we deal with the design of a treatment program of dynamic treatment regimens (DTRs) for patients with depression following acute coronary syndrome. We introduce a novel adaptive randomization scheme for a sequential multiple assignment randomized trial of DTRs. Our approach adapts the randomization probabilities to favor treatment sequences having comparatively superior Q-functions, as used in Q-learning. The proposed approach addresses three main concerns of an implementation study: it allows incorporation of historical data or opinions, it includes randomization for learning purposes, and it aims to improve care via adaptation throughout the program. We demonstrate how to apply our method to design a depression treatment program using data from a previous study. By simulation, we illustrate that the inputs from historical data are important for program performance as measured by the expected outcomes of the enrollees, but we also show that the adaptive randomization scheme can compensate for poorly specified historical inputs by improving patient outcomes within a reasonable horizon. The simulation results also confirm that the proposed design allows efficient learning of the treatments by alleviating the curse of dimensionality. PMID:25354029
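
    The abstract does not spell out the exact adaptation rule, but a common way to let randomization probabilities favor treatment sequences with larger estimated Q-functions is a softmax weighting, sketched below with hypothetical Q-values; the temperature parameter controlling how strongly the probabilities concentrate on the best sequence is an assumption.

        import numpy as np

        def adaptive_randomization_probs(q_values, temperature=1.0):
            """Softmax weighting: sequences with higher estimated Q-values get larger randomization probabilities."""
            q = np.asarray(q_values, dtype=float)
            w = np.exp((q - q.max()) / temperature)
            return w / w.sum()

        rng = np.random.default_rng(0)
        q_hat = [0.4, 0.9, 0.6]                 # hypothetical Q-function estimates for three sequences
        p = adaptive_randomization_probs(q_hat)
        next_assignment = rng.choice(len(q_hat), p=p)
        print(p.round(2), next_assignment)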

  20. Design and implementation of a dental caries prevention trial in remote Canadian Aboriginal communities.

    PubMed

    Harrison, Rosamund; Veronneau, Jacques; Leroux, Brian

    2010-05-13

    The goal of this cluster randomized trial is to test the effectiveness of a counseling approach, Motivational Interviewing, to control dental caries in young Aboriginal children. Motivational Interviewing, a client-centred, directive counseling style, has not yet been evaluated as an approach for promotion of behaviour change in indigenous communities in remote settings. Aboriginal women were hired from the 9 communities to recruit expectant and new mothers to the trial, administer questionnaires and deliver the counseling to mothers in the test communities. The goal is for mothers to receive the intervention during pregnancy and at their child's immunization visits. Data on children's dental health status and family dental health practices will be collected when children are 30-months of age. The communities were randomly allocated to test or control group by a random "draw" over community radio. Sample size and power were determined based on an anticipated 20% reduction in caries prevalence. Randomization checks were conducted between groups. In the 5 test and 4 control communities, 272 of the original target sample size of 309 mothers have been recruited over a two-and-a-half year period. A power calculation using the actual attained sample size showed power to be 79% to detect a treatment effect. If an attrition fraction of 4% per year is maintained, power will remain at 80%. Power will still be > 90% to detect a 25% reduction in caries prevalence. The distribution of most baseline variables was similar for the two randomized groups of mothers. However, despite the random assignment of communities to treatment conditions, group differences exist for stage of pregnancy and prior tooth extractions in the family. Because of the group imbalances on certain variables, control of baseline variables will be done in the analyses of treatment effects. This paper explains the challenges of conducting randomized trials in remote settings, the importance of thorough community collaboration, and also illustrates the likelihood that some baseline variables that may be clinically important will be unevenly split in group-randomized trials when the number of groups is small. This trial is registered as ISRCTN41467632.
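
    For readers who want to reproduce this style of calculation, here is a rough normal-approximation power sketch for comparing two prevalences that deflates the effective sample size by the cluster design effect. The prevalences, ICC, cluster size, and per-arm count below are illustrative guesses, not the trial's actual inputs.

        import numpy as np
        from scipy.stats import norm

        def power_two_proportions(p1, p2, n_per_arm, icc=0.0, cluster_size=1, alpha=0.05):
            """Approximate power, with the sample size deflated by the design effect 1 + (m - 1) * ICC."""
            n_eff = n_per_arm / (1 + (cluster_size - 1) * icc)
            se = np.sqrt(p1 * (1 - p1) / n_eff + p2 * (1 - p2) / n_eff)
            return norm.cdf(abs(p1 - p2) / se - norm.ppf(1 - alpha / 2))

        # e.g. caries prevalence 0.80 vs. 0.60 (a 25% relative reduction), 136 mothers per arm
        print(round(power_two_proportions(0.80, 0.60, 136, icc=0.02, cluster_size=30), 2))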

  1. Design and implementation of a dental caries prevention trial in remote Canadian Aboriginal communities

    PubMed Central

    2010-01-01

    Background The goal of this cluster randomized trial is to test the effectiveness of a counseling approach, Motivational Interviewing, to control dental caries in young Aboriginal children. Motivational Interviewing, a client-centred, directive counseling style, has not yet been evaluated as an approach for promotion of behaviour change in indigenous communities in remote settings. Methods/design Aboriginal women were hired from the 9 communities to recruit expectant and new mothers to the trial, administer questionnaires and deliver the counseling to mothers in the test communities. The goal is for mothers to receive the intervention during pregnancy and at their child's immunization visits. Data on children's dental health status and family dental health practices will be collected when children are 30-months of age. The communities were randomly allocated to test or control group by a random "draw" over community radio. Sample size and power were determined based on an anticipated 20% reduction in caries prevalence. Randomization checks were conducted between groups. Discussion In the 5 test and 4 control communities, 272 of the original target sample size of 309 mothers have been recruited over a two-and-a-half year period. A power calculation using the actual attained sample size showed power to be 79% to detect a treatment effect. If an attrition fraction of 4% per year is maintained, power will remain at 80%. Power will still be > 90% to detect a 25% reduction in caries prevalence. The distribution of most baseline variables was similar for the two randomized groups of mothers. However, despite the random assignment of communities to treatment conditions, group differences exist for stage of pregnancy and prior tooth extractions in the family. Because of the group imbalances on certain variables, control of baseline variables will be done in the analyses of treatment effects. This paper explains the challenges of conducting randomized trials in remote settings, the importance of thorough community collaboration, and also illustrates the likelihood that some baseline variables that may be clinically important will be unevenly split in group-randomized trials when the number of groups is small. Trial registration This trial is registered as ISRCTN41467632. PMID:20465831

  2. Study design and "evidence" in patient-oriented research.

    PubMed

    Concato, John

    2013-06-01

    Individual studies in patient-oriented research, whether described as "comparative effectiveness" or using other terms, are based on underlying methodological designs. A simple taxonomy of study designs includes randomized controlled trials on the one hand, and observational studies (such as case series, cohort studies, and case-control studies) on the other. A rigid hierarchy of these design types is a fairly recent phenomenon, promoted as a tenet of "evidence-based medicine," with randomized controlled trials receiving gold-standard status in terms of producing valid results. Although randomized trials have many strengths, and contribute substantially to the evidence base in clinical care, making presumptions about the quality of a study based solely on category of research design is unscientific. Both the limitations of randomized trials as well as the strengths of observational studies tend to be overlooked when a priori assumptions are made. This essay presents an argument in support of a more balanced approach to evaluating evidence, and discusses representative examples from the general medical as well as pulmonary and critical care literature. The simultaneous consideration of validity (whether results are correct "internally") and generalizability (how well results apply to "external" populations) is warranted in assessing whether a study's results are accurate for patients likely to receive the intervention-examining the intersection of clinical and methodological issues in what can be called a medicine-based evidence approach. Examination of cause-effect associations in patient-oriented research should recognize both the strengths and limitations of randomized trials as well as observational studies.

  3. A multi-site proof-of-concept investigation of computerized approach-avoidance training in adolescent cannabis users.

    PubMed

    Jacobus, Joanna; Taylor, Charles T; Gray, Kevin M; Meredith, Lindsay R; Porter, Anna M; Li, Irene; Castro, Norma; Squeglia, Lindsay M

    2018-06-01

    Few effective treatment options exist for cannabis-using youth. This pilot study aimed to test Approach-Avoidance Training to reduce cannabis use among non-treatment-seeking adolescents. Eighty cannabis-using non-treatment-seeking adolescents (average age 19) were recruited from San Diego, California and Charleston, South Carolina, and randomized to complete either six sessions of Cannabis Approach-Avoidance Task Training (CAAT-training) designed to reduce automatic approach biases for cannabis cues or CAAT-sham training. Change in two primary outcome variables was examined: 1) cannabis approach bias and 2) percent cannabis use days over study enrollment. Change in percent alcohol use days over study enrollment was explored as a secondary outcome. A mixed models repeated measures analysis showed that the group-by-time interaction effect for approach bias failed to reach statistical significance (p = .06). Significant group-by-time interaction effects (ps < 0.05) predicted percent days of cannabis and alcohol use over study enrollment. Participants randomized to the avoid cannabis condition (CAAT-training) reported 7% fewer days of cannabis use compared to 0% change for sham; unexpectedly, those in the avoid cannabis condition reported 10% more alcohol use days compared to 3% more for sham. Computerized cognitive bias modification paradigms may have utility in reducing adolescent cannabis use. Future work should consider developing a paradigm that addresses both cannabis and alcohol, as well as alternative computerized approaches for coping with addictive behavior in conjunction with bias modification. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Modeling Training Site Vegetation Coverage Probability with a Random Optimizing Procedure: An Artificial Neural Network Approach.

    DTIC Science & Technology

    1998-05-01

    Coverage Probability with a Random Optimization Procedure: An Artificial Neural Network Approach, by Biing T. Guan, George Z. Gertner, and Alan B. ... Modeling Training Site Vegetation Coverage Probability with a Random Optimizing Procedure: An Artificial Neural Network Approach. AUTHOR(S): Biing ... coverage based on past coverage. Approach: A literature survey was conducted to identify artificial neural network analysis techniques applicable for

  5. Relevant Feature Set Estimation with a Knock-out Strategy and Random Forests

    PubMed Central

    Ganz, Melanie; Greve, Douglas N.; Fischl, Bruce; Konukoglu, Ender

    2015-01-01

    Group analysis of neuroimaging data is a vital tool for identifying anatomical and functional variations related to diseases as well as normal biological processes. The analyses are often performed on a large number of highly correlated measurements using a relatively smaller number of samples. Despite the correlation structure, the most widely used approach is to analyze the data using univariate methods followed by post-hoc corrections that try to account for the data’s multivariate nature. Although widely used, this approach may fail to recover from the adverse effects of the initial analysis when local effects are not strong. Multivariate pattern analysis (MVPA) is a powerful alternative to the univariate approach for identifying relevant variations. Jointly analyzing all the measures, MVPA techniques can detect global effects even when individual local effects are too weak to detect with univariate analysis. Current approaches are successful in identifying variations that yield highly predictive and compact models. However, they suffer from lessened sensitivity and instabilities in identification of relevant variations. Furthermore, current methods’ user-defined parameters are often unintuitive and difficult to determine. In this article, we propose a novel MVPA method for group analysis of high-dimensional data that overcomes the drawbacks of the current techniques. Our approach explicitly aims to identify all relevant variations using a “knock-out” strategy and the Random Forest algorithm. In evaluations with synthetic datasets the proposed method achieved substantially higher sensitivity and accuracy than the state-of-the-art MVPA methods, and outperformed the univariate approach when the effect size is low. In experiments with real datasets the proposed method identified regions beyond the univariate approach, while other MVPA methods failed to replicate the univariate results. More importantly, in a reproducibility study with the well-known ADNI dataset the proposed method yielded higher stability and power than the univariate approach. PMID:26272728

  6. Controllability of social networks and the strategic use of random information.

    PubMed

    Cremonini, Marco; Casamassima, Francesca

    2017-01-01

    This work is aimed at studying realistic social control strategies for social networks based on the introduction of random information into the state of selected driver agents. Deliberately exposing selected agents to random information is a technique already experimented in recommender systems or search engines, and represents one of the few options for influencing the behavior of a social context that could be accepted as ethical, could be fully disclosed to members, and does not involve the use of force or of deception. Our research is based on a model of knowledge diffusion applied to a time-varying adaptive network and considers two well-known strategies for influencing social contexts: One is the selection of few influencers for manipulating their actions in order to drive the whole network to a certain behavior; the other, instead, drives the network behavior acting on the state of a large subset of ordinary, scarcely influencing users. The two approaches have been studied in terms of network and diffusion effects. The network effect is analyzed through the changes induced on network average degree and clustering coefficient, while the diffusion effect is based on two ad hoc metrics which are defined to measure the degree of knowledge diffusion and skill level, as well as the polarization of agent interests. The results, obtained through simulations on synthetic networks, show a rich dynamics and strong effects on the communication structure and on the distribution of knowledge and skills. These findings support our hypothesis that the strategic use of random information could represent a realistic approach to social network controllability, and that with both strategies, in principle, the control effect could be remarkable.

  7. A unified procedure for meta-analytic evaluation of surrogate end points in randomized clinical trials

    PubMed Central

    Dai, James Y.; Hughes, James P.

    2012-01-01

    The meta-analytic approach to evaluating surrogate end points assesses the predictiveness of treatment effect on the surrogate toward treatment effect on the clinical end point based on multiple clinical trials. Definition and estimation of the correlation of treatment effects were developed in linear mixed models and later extended to binary or failure time outcomes on a case-by-case basis. In a general regression setting that covers nonnormal outcomes, we discuss in this paper several metrics that are useful in the meta-analytic evaluation of surrogacy. We propose a unified 3-step procedure to assess these metrics in settings with binary end points, time-to-event outcomes, or repeated measures. First, the joint distribution of estimated treatment effects is ascertained by an estimating equation approach; second, the restricted maximum likelihood method is used to estimate the means and the variance components of the random treatment effects; finally, confidence intervals are constructed by a parametric bootstrap procedure. The proposed method is evaluated by simulations and applications to 2 clinical trials. PMID:22394448

  8. Meta-Analysis in Clinical Trials Revisited

    PubMed Central

    Laird, Nan

    2015-01-01

    In this paper, we revisit a 1986 article we published in this Journal, Meta-Analysis in Clinical Trials, where we introduced a random-effect model to summarize the evidence about treatment efficacy from a number of related clinical trials. Because of its simplicity and ease of implementation, our approach has been widely used (with more than 12,000 citations to date) and the “DerSimonian and Laird method” is now often referred to as the ‘standard approach’ or a ‘popular’ method for meta-analysis in medical and clinical research. The method is especially useful for providing an overall effect estimate and for characterizing the heterogeneity of effects across a series of studies. Here, we review the background that led to the original 1986 article, briefly describe the random-effects approach for meta-analysis, explore its use in various settings and trends over time and recommend a refinement to the method using a robust variance estimator for testing overall effect. We conclude with a discussion of repurposing the method for Big Data meta-analysis and Genome Wide Association Studies for studying the importance of genetic variants in complex diseases. PMID:26343745
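
    The DerSimonian-Laird estimator itself is compact enough to state in code: fixed-effect weights give Cochran's Q, a method-of-moments estimate of the between-study variance follows, and the studies are then re-weighted. The implementation below follows that standard recipe; the example effect sizes and variances are made up.

        import numpy as np

        def dersimonian_laird(y, v):
            """DerSimonian-Laird random-effects meta-analysis.
            y: study effect estimates; v: their within-study variances."""
            y, v = np.asarray(y, float), np.asarray(v, float)
            w = 1.0 / v                                    # fixed-effect weights
            theta_fixed = np.sum(w * y) / np.sum(w)
            q = np.sum(w * (y - theta_fixed) ** 2)         # Cochran's Q
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (q - (len(y) - 1)) / c)        # between-study variance (method of moments)
            w_re = 1.0 / (v + tau2)
            theta_re = np.sum(w_re * y) / np.sum(w_re)
            return theta_re, np.sqrt(1.0 / np.sum(w_re)), tau2

        # five hypothetical trials reporting log odds ratios and their variances
        print(dersimonian_laird([-0.35, -0.10, -0.45, 0.05, -0.25],
                                [0.04, 0.09, 0.05, 0.12, 0.06]))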

  9. Toward a Framework for Learner Segmentation

    ERIC Educational Resources Information Center

    Azarnoush, Bahareh; Bekki, Jennifer M.; Runger, George C.; Bernstein, Bianca L.; Atkinson, Robert K.

    2013-01-01

    Effectively grouping learners in an online environment is a highly useful task. However, datasets used in this task often have large numbers of attributes of disparate types and different scales, which traditional clustering approaches cannot handle effectively. Here, a unique dissimilarity measure based on the random forest, which handles the…

  10. An Improvement in Instructional Quality: Can Evaluation of Teaching Effectiveness Make a Difference?

    ERIC Educational Resources Information Center

    Ngware, Moses Waithanji; Ndirangu, Mwangi

    2005-01-01

    Purpose: To report study findings on teaching effectiveness and feedback mechanisms in Kenyan universities, which can guide management in developing a comprehensive quality control policy. Design/methodology/approach: The study adopted an exploratory descriptive design. Three public and two private universities were randomly selected to…

  11. A unifying framework for marginalized random intercept models of correlated binary outcomes

    PubMed Central

    Swihart, Bruce J.; Caffo, Brian S.; Crainiceanu, Ciprian M.

    2013-01-01

    We demonstrate that many current approaches for marginal modeling of correlated binary outcomes produce likelihoods that are equivalent to the copula-based models herein. These general copula models of underlying latent threshold random variables yield likelihood-based models for marginal fixed effects estimation and interpretation in the analysis of correlated binary data with exchangeable correlation structures. Moreover, we propose a nomenclature and set of model relationships that substantially elucidates the complex area of marginalized random intercept models for binary data. A diverse collection of didactic mathematical and numerical examples are given to illustrate concepts. PMID:25342871

  12. Calibrating random forests for probability estimation.

    PubMed

    Dankowski, Theresa; Ziegler, Andreas

    2016-09-30

    Probabilities can be consistently estimated using random forests. It is, however, unclear how random forests should be updated to make predictions for other centers or at different time points. In this work, we present two approaches for updating random forests for probability estimation. The first method was proposed by Elkan and may be used for updating any machine learning approach yielding consistent probabilities, so-called probability machines. The second approach is a new strategy specifically developed for random forests. Using the terminal nodes, which represent conditional probabilities, the random forest is first translated to logistic regression models. These are, in turn, used for re-calibration. The two updating strategies were compared in a simulation study and are illustrated with data from the German Stroke Study Collaboration. In most simulation scenarios, both methods led to similar improvements. In the simulation scenario in which the stricter assumptions of Elkan's method were not met, the logistic regression-based re-calibration approach for random forests outperformed Elkan's method. It also performed better on the stroke data than Elkan's method. The strength of Elkan's method is its general applicability to any probability machine. However, if the strict assumptions underlying this approach are not met, the logistic regression-based approach is preferable for updating random forests for probability estimation. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
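
    As a simplified sketch of the general re-calibration idea (not the authors' terminal-node translation), the code below fits a random forest at one "center", then re-calibrates its predicted probabilities for a new center by regressing the new outcomes on the logit of the forest's probability; the dataset, split sizes, and forest settings are arbitrary.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split

        X, y = make_classification(n_samples=3000, n_features=10, random_state=0)
        X_dev, X_new, y_dev, y_new = train_test_split(X, y, test_size=0.5, random_state=0)

        rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_dev, y_dev)
        p_new = rf.predict_proba(X_new)[:, 1]

        # re-calibrate on part of the new center's data: logistic regression on the logit of the RF probability
        logit = np.log(np.clip(p_new, 1e-6, 1 - 1e-6) / np.clip(1 - p_new, 1e-6, 1 - 1e-6))
        recal = LogisticRegression().fit(logit[:500].reshape(-1, 1), y_new[:500])
        p_updated = recal.predict_proba(logit[500:].reshape(-1, 1))[:, 1]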

  13. A statistical approach to selecting and confirming validation targets in -omics experiments

    PubMed Central

    2012-01-01

    Background Genomic technologies are, by their very nature, designed for hypothesis generation. In some cases, the hypotheses that are generated require that genome scientists confirm findings about specific genes or proteins. But one major advantage of high-throughput technology is that global genetic, genomic, transcriptomic, and proteomic behaviors can be observed. Manual confirmation of every statistically significant genomic result is prohibitively expensive. This has led researchers in genomics to adopt the strategy of confirming only a handful of the most statistically significant results, a small subset chosen for biological interest, or a small random subset. But there is no standard approach for selecting and quantitatively evaluating validation targets. Results Here we present a new statistical method and approach for statistically validating lists of significant results based on confirming only a small random sample. We apply our statistical method to show that the usual practice of confirming only the most statistically significant results does not statistically validate result lists. We analyze an extensively validated RNA-sequencing experiment to show that confirming a random subset can statistically validate entire lists of significant results. Finally, we analyze multiple publicly available microarray experiments to show that statistically validating random samples can both (i) provide evidence to confirm long gene lists and (ii) save thousands of dollars and hundreds of hours of labor over manual validation of each significant result. Conclusions For high-throughput -omics studies, statistical validation is a cost-effective and statistically valid approach to confirming lists of significant results. PMID:22738145
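
    One simple way to quantify what a small random validation sample says about an entire list of significant results is an exact binomial (Clopper-Pearson) interval for the confirmed fraction, sketched below with hypothetical counts; this illustrates the general idea of validating a random subset rather than the authors' specific procedure.

        from scipy.stats import beta

        def confirmed_fraction_ci(n_validated, n_confirmed, alpha=0.05):
            """Clopper-Pearson interval for the true-positive fraction of the full significant list,
            based on manually confirming a small random subsample."""
            k, n = n_confirmed, n_validated
            lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
            hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
            return lo, hi

        # hypothetical: 30 results drawn at random from 2,000 significant genes, 27 confirmed
        lo, hi = confirmed_fraction_ci(30, 27)
        print(f"confirmed fraction 0.90, 95% CI ({lo:.2f}, {hi:.2f})")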

  14. Controversies in the treatment of patients with STEMI and multivessel disease: is it time for PCI of all lesions?

    PubMed

    Ong, Peter; Sechtem, Udo

    2016-06-01

    Several randomized trials have suggested a benefit for multivessel PCI in patients with STEMI and multivessel disease. However, none of the studies compared multivessel PCI with a staged PCI-approach which is the current guideline recommended approach. The results of the trials may overestimate the beneficial effect of the multivessel PCI approach because the control group did not receive any ischaemia testing for evaluation of the significance of remaining lesions. Thus, unfavourable aspects of the multivessel PCI approach such as overestimation of non-culprit lesions at the time of acute coronary angiography, complications associated with PCI of the non-culprit lesion (i.e. dissection, no-reflow, acute stent thrombosis) or increased risk for contrast induced nephropathy may have gone unnoticed as the comparative management pathway was unusual and likely inferior to the guideline recommended approach. We believe that culprit lesion only PCI and staged evaluation of remaining areas of myocardial ischaemia with subsequent PCI is still preferable in patients with STEMI and multivessel disease but a randomized study comparing this approach with multivessel PCI is needed.

  15. Comparison of Two Different Techniques of Cooperative Learning Approach: Undergraduates' Conceptual Understanding in the Context of Hormone Biochemistry

    ERIC Educational Resources Information Center

    Mutlu, Ayfer

    2018-01-01

    The purpose of the research was to compare the effects of two different techniques of the cooperative learning approach, namely Team-Game Tournament and Jigsaw, on undergraduates' conceptual understanding in a Hormone Biochemistry course. Undergraduates were randomly assigned to Group 1 (N = 23) and Group 2 (N = 29). Instructions were accomplished…

  16. Increasing the Precision of Estimates in Follow-Up Surveys: A Case Study. AIR 1983 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Clark, Sheldon B.; Nichols, James O.

    Survey data concerning teacher education program graduates were used to demonstrate the advantages of a stratified random sampling approach, with followup, relative to a one-shot mailing to an entire population. Sampling issues involved in such an approach are addressed, particularly with regard to quantifying the effects of nonresponse on the…

  17. Cross-Sectional HIV Incidence Surveillance: A Benchmarking of Approaches for Estimating the 'Mean Duration of Recent Infection'.

    PubMed

    Kassanjee, Reshma; De Angelis, Daniela; Farah, Marian; Hanson, Debra; Labuschagne, Jan Phillipus Lourens; Laeyendecker, Oliver; Le Vu, Stéphane; Tom, Brian; Wang, Rui; Welte, Alex

    2017-03-01

    The application of biomarkers for 'recent' infection in cross-sectional HIV incidence surveillance requires the estimation of critical biomarker characteristics. Various approaches have been employed for using longitudinal data to estimate the Mean Duration of Recent Infection (MDRI) - the average time in the 'recent' state. In this systematic benchmarking of MDRI estimation approaches, a simulation platform was used to measure accuracy and precision of over twenty approaches, in thirty scenarios capturing various study designs, subject behaviors and test dynamics that may be encountered in practice. Results highlight that assuming a single continuous sojourn in the 'recent' state can produce substantial bias. Simple interpolation provides useful MDRI estimates provided subjects are tested at regular intervals. Regression performs the best - while 'random effects' describe the subject-clustering in the data, regression models without random effects proved easy to implement, stable, and of similar accuracy in scenarios considered; robustness to parametric assumptions was improved by regressing 'recent'/'non-recent' classifications rather than continuous biomarker readings. All approaches were vulnerable to incorrect assumptions about subjects' (unobserved) infection times. Results provided show the relationships between MDRI estimation performance and the number of subjects, inter-visit intervals, missed visits, loss to follow-up, and aspects of biomarker signal and noise.
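
    A minimal sketch of the regression-of-classifications idea for MDRI estimation: regress the binary 'recent'/'non-recent' outcome on time since infection, then integrate the fitted recency probability up to the cutoff time T. The recency curve, cutoff, and sample size below are simulated assumptions rather than real assay data.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        t = rng.uniform(0, 2.0, 1000)                         # times since infection (years)
        p_recent = 1 / (1 + np.exp(-(4.0 - 6.0 * t)))         # true recency curve (simulation only)
        recent = rng.binomial(1, p_recent)                    # observed 'recent' classifications

        model = LogisticRegression().fit(t.reshape(-1, 1), recent)
        T = 2.0                                               # post-infection cutoff time
        grid = np.linspace(0, T, 501)
        phat = model.predict_proba(grid.reshape(-1, 1))[:, 1]
        mdri_years = np.sum((phat[1:] + phat[:-1]) / 2 * np.diff(grid))   # MDRI = integral of P(recent | t) dt
        print(f"estimated MDRI: {mdri_years * 365.25:.0f} days")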

  18. Cluster ensemble based on Random Forests for genetic data.

    PubMed

    Alhusain, Luluah; Hafez, Alaaeldin M

    2017-01-01

    Clustering plays a crucial role in several application domains, such as bioinformatics. In bioinformatics, clustering has been extensively used as an approach for detecting interesting patterns in genetic data. One application is population structure analysis, which aims to group individuals into subpopulations based on shared genetic variations, such as single nucleotide polymorphisms. Advances in DNA sequencing technology have made it possible to obtain genetic datasets of exceptional size. Genetic data usually contain hundreds of thousands of genetic markers genotyped for thousands of individuals, making an efficient means for handling such data desirable. Random Forests (RFs) has emerged as an efficient algorithm capable of handling high-dimensional data. RFs provides a proximity measure that can capture different levels of co-occurring relationships between variables. RFs has been widely considered a supervised learning method, although it can be converted into an unsupervised learning method. Therefore, an RF-derived proximity measure combined with a clustering technique may be well suited for determining the underlying structure of unlabeled data. This paper proposes RFcluE, a cluster ensemble approach for determining the underlying structure of genetic data based on RFs. The approach comprises a cluster ensemble framework to combine multiple runs of RF clustering. Experiments were conducted on a high-dimensional, real genetic dataset to evaluate the proposed approach. The experiments included an examination of the impact of parameter changes, comparing RFcluE performance against other clustering methods, and an assessment of the relationship between the diversity and quality of the ensemble and its effect on RFcluE performance. This paper proposes RFcluE, a cluster ensemble approach based on RF clustering, to address the problem of population structure analysis and demonstrates the effectiveness of the approach. The paper also illustrates that applying a cluster ensemble approach, combining multiple RF clusterings, produces more robust and higher-quality results as a consequence of feeding the ensemble with diverse views of high-dimensional genetic data obtained through bagging and random subspace, the two key features of the RF algorithm.
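
    A single-run sketch of RF-based clustering (RFcluE combines many such runs into an ensemble): an unsupervised forest is trained to separate real rows from column-permuted synthetic rows, the leaf co-occurrence proximity is computed, and hierarchical clustering is applied to one minus the proximity. The genotype-like data are simulated and far smaller than the datasets discussed above.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import squareform

        rng = np.random.default_rng(0)
        # hypothetical genotype matrix: 200 individuals x 500 markers coded 0/1/2, two subpopulations
        X = np.vstack([rng.binomial(2, 0.2, (100, 500)), rng.binomial(2, 0.6, (100, 500))])

        # unsupervised RF: discriminate real rows from rows with independently permuted columns
        X_synth = np.column_stack([rng.permutation(col) for col in X.T])
        rf = RandomForestClassifier(n_estimators=100, random_state=0)
        rf.fit(np.vstack([X, X_synth]), np.r_[np.ones(len(X)), np.zeros(len(X_synth))])

        leaves = rf.apply(X)                                              # leaf index per (sample, tree)
        prox = (leaves[:, None, :] == leaves[None, :, :]).mean(axis=2)    # co-occurrence proximity
        dist = 1.0 - prox
        np.fill_diagonal(dist, 0.0)
        labels = fcluster(linkage(squareform(dist, checks=False), method="average"),
                          t=2, criterion="maxclust")
        print(np.bincount(labels))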

  19. A weight-neutral versus weight-loss approach for health promotion in women with high BMI: A randomized-controlled trial.

    PubMed

    Mensinger, Janell L; Calogero, Rachel M; Stranges, Saverio; Tylka, Tracy L

    2016-10-01

    Weight loss is the primary recommendation for health improvement in individuals with high body mass index (BMI) despite limited evidence of long-term success. Alternatives to weight-loss approaches (such as Health At Every Size - a weight-neutral approach) have been met with their own concerns and require further empirical testing. This study compared the effectiveness of a weight-neutral versus a weight-loss program for health promotion. Eighty women, aged 30-45 years, with high body mass index (BMI ≥ 30 kg/m(2)) were randomized to 6 months of facilitator-guided weekly group meetings using structured manuals that emphasized either a weight-loss or weight-neutral approach to health. Health measurements occurred at baseline, post-intervention, and 24-months post-randomization. Measurements included blood pressure, lipid panels, blood glucose, BMI, weight, waist circumference, hip circumference, distress, self-esteem, quality of life, dietary risk, fruit and vegetable intake, intuitive eating, and physical activity. Intention-to-treat analyses were performed using linear mixed-effects models to examine group-by-time interaction effects and between and within-group differences. Group-by-time interactions were found for LDL cholesterol, intuitive eating, BMI, weight, and dietary risk. At post-intervention, the weight-neutral program had larger reductions in LDL cholesterol and greater improvements in intuitive eating; the weight-loss program had larger reductions in BMI, weight, and larger (albeit temporary) decreases in dietary risk. Significant positive changes were observed overall between baseline and 24-month follow-up for waist-to-hip ratio, total cholesterol, physical activity, fruit and vegetable intake, self-esteem, and quality of life. These findings highlight that numerous health benefits, even in the absence of weight loss, are achievable and sustainable in the long term using a weight-neutral approach. The trial positions weight-neutral programs as a viable health promotion alternative to weight-loss programs for women of high weight. Copyright © 2016 Elsevier Ltd. All rights reserved.
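
    For readers unfamiliar with the group-by-time analysis mentioned above, the sketch below fits a linear mixed-effects model with a participant-level random intercept to simulated LDL-like data; the effect sizes, time points, and variable names are invented for illustration and do not reproduce the trial's data.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        n = 80
        ids = np.arange(n)
        group = np.where(ids % 2 == 0, "weight_neutral", "weight_loss")
        subj = rng.normal(0, 0.4, n)                      # participant random intercepts

        rows = []
        for t in (0, 6, 24):                              # baseline, post-intervention, 24 months
            drop = np.where(group == "weight_neutral", 0.25, 0.05) * (t > 0)
            ldl = 3.4 - drop + subj + rng.normal(0, 0.3, n)
            rows.append(pd.DataFrame({"id": ids, "group": group, "time": t, "ldl": ldl}))
        df = pd.concat(rows, ignore_index=True)

        # random intercept per participant; the group-by-time interaction is the effect of interest
        fit = smf.mixedlm("ldl ~ C(group) * C(time)", df, groups=df["id"]).fit()
        print(fit.summary())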

  20. Short-term effects of a randomized computer-based out-of-school smoking prevention trial aimed at elementary schoolchildren.

    PubMed

    Ausems, Marlein; Mesters, Ilse; van Breukelen, Gerard; De Vries, Hein

    2002-06-01

    Smoking prevention programs usually run during school hours. In our study, an out-of-school program was developed consisting of a computer-tailored intervention aimed at the age group before school transition (11- to 12-year-old elementary schoolchildren). The aim of this study is to evaluate the additional effect of out-of-school smoking prevention. One hundred fifty-six participating schools were randomly allocated to one of four research conditions: (a) the in-school condition, an existing seven-lesson program; (b) the out-of-school condition, three computer-tailored letters sent to the students' homes; (c) the in-school and out-of-school condition, a combined approach; (d) the control condition. Pretest and 6 months follow-up data on smoking initiation and continuation, and data on psychosocial variables were collected from 3,349 students. Control and out-of-school conditions differed regarding posttest smoking initiation (18.1 and 10.4%) and regarding posttest smoking continuation (23.5 and 13.1%). Multilevel logistic regression analyses showed positive effects regarding the out-of-school program. Significant effects were not found regarding the in-school program, nor did the combined approach show stronger effects than the single-method approaches. The findings of this study suggest that smoking prevention trials for elementary schoolchildren can be effective when using out-of-school computer-tailored interventions. Copyright 2002 Elsevier Science (USA).

  1. Current Evidence on Auricular Therapy for Chemotherapy-Induced Nausea and Vomiting in Cancer Patients: A Systematic Review of Randomized Controlled Trials

    PubMed Central

    Molassiotis, Alexander; Wang, Tao; Suen, Lorna K. P.

    2014-01-01

    Auricular therapy (AT) has historically been viewed as a convenient adjunct to pharmacological therapy for cancer patients with chemotherapy-induced nausea and vomiting (CINV). The aim of this study was to assess the evidence of the therapeutic effect of AT for CINV management in cancer patients. Relevant randomized controlled trials were retrieved from 12 electronic databases without language restrictions. In addition, a manual search was conducted of Chinese journals on complementary medicine published within the last five years, and the reference lists of included studies were also checked to identify any possible eligible studies. Twenty-one studies with 1713 participants were included. The effect rate of AT for managing acute CINV ranged from 44.44% to 93.33% in the intervention groups and 15% to 91.67% in the control groups. For delayed CINV, it was 62.96% to 100% and 25% to 100%, respectively. AT seems to be a promising approach in managing CINV. However, the level of evidence was low, and a definite effect cannot be concluded because significant methodological flaws were identified in the analyzed studies. The implications drawn from the 21 studies provide some guidance for future practice in this area, including the need to conduct more rigorously designed randomized controlled trials. PMID:25525445

  2. A pilot cluster randomized controlled trial of structured goal-setting following stroke.

    PubMed

    Taylor, William J; Brown, Melanie; William, Levack; McPherson, Kathryn M; Reed, Kirk; Dean, Sarah G; Weatherall, Mark

    2012-04-01

    To determine the feasibility, the cluster design effect and the variance and minimal clinically important difference in the primary outcome in a pilot study of a structured approach to goal-setting. A cluster randomized controlled trial. Inpatient rehabilitation facilities. People who were admitted to inpatient rehabilitation following stroke who had sufficient cognition to engage in structured goal-setting and complete the primary outcome measure. Structured goal elicitation using the Canadian Occupational Performance Measure. Quality of life at 12 weeks using the Schedule for Individualised Quality of Life (SEIQOL-DW), Functional Independence Measure, Short Form 36 and Patient Perception of Rehabilitation (measuring satisfaction with rehabilitation). Assessors were blinded to the intervention. Four rehabilitation services and 41 patients were randomized. We found high values of the intraclass correlation for the outcome measures (ranging from 0.03 to 0.40) and high variance of the SEIQOL-DW (SD 19.6) in relation to the minimally important difference of 2.1, leading to impractically large sample size requirements for a cluster randomized design. A cluster randomized design is not a practical means of avoiding contamination effects in studies of inpatient rehabilitation goal-setting. Other techniques for coping with contamination effects are necessary.
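    A minimal sketch of why the reported intraclass correlations and outcome variance translate into impractically large samples for a cluster randomized design: the design effect 1 + (m - 1) x ICC inflates the individually randomized sample size. The cluster size and the ICC of 0.2 below are illustrative assumptions; the SD (19.6) and minimally important difference (2.1) are taken from the abstract.

```python
# Minimal sketch: how the intraclass correlation inflates the required sample
# size in a cluster randomized design.  SD and minimally important difference
# come from the abstract; the ICC and cluster size are illustrative assumptions.
from scipy.stats import norm

sd, mid = 19.6, 2.1
icc, cluster_size = 0.2, 10          # assumed values
alpha, power = 0.05, 0.80

z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
n_individual = 2 * (z * sd / mid) ** 2          # per arm, individually randomized
deff = 1 + (cluster_size - 1) * icc             # design effect
n_cluster_trial = n_individual * deff           # per arm, cluster randomized
print(round(n_individual), round(deff, 2), round(n_cluster_trial))
```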

  3. Estimating the optimal dynamic antipsychotic treatment regime: Evidence from the sequential multiple assignment randomized CATIE Schizophrenia Study

    PubMed Central

    Shortreed, Susan M.; Moodie, Erica E. M.

    2012-01-01

    Summary Treatment of schizophrenia is notoriously difficult and typically requires personalized adaption of treatment due to lack of efficacy of treatment, poor adherence, or intolerable side effects. The Clinical Antipsychotic Trials in Intervention Effectiveness (CATIE) Schizophrenia Study is a sequential multiple assignment randomized trial comparing the typical antipsychotic medication, perphenazine, to several newer atypical antipsychotics. This paper describes the marginal structural modeling method for estimating optimal dynamic treatment regimes and applies the approach to the CATIE Schizophrenia Study. Missing data and valid estimation of confidence intervals are also addressed. PMID:23087488

  4. Mendelian randomization in nutritional epidemiology

    PubMed Central

    Qi, Lu

    2013-01-01

    Nutritional epidemiology aims to identify dietary and lifestyle causes for human diseases. Causality inference in nutritional epidemiology is largely based on evidence from studies of observational design, and may be distorted by unmeasured or residual confounding and reverse causation. Mendelian randomization is a recently developed methodology that combines genetic and classical epidemiological analysis to infer causality for environmental exposures, based on the principle of Mendel’s law of independent assortment. Mendelian randomization uses genetic variants as proxies for environmental exposures of interest. Associations derived from Mendelian randomization analysis are less likely to be affected by confounding and reverse causation. During the past 5 years, a body of studies examined the causal effects of diet/lifestyle factors and biomarkers on a variety of diseases. The Mendelian randomization approach also holds considerable promise in the study of intrauterine influences on offspring health outcomes. However, the application of Mendelian randomization in nutritional epidemiology has some limitations. PMID:19674341
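    For readers unfamiliar with the mechanics, the sketch below shows the basic summary-statistic Mendelian randomization estimators (per-variant Wald ratios and their fixed-effect inverse-variance-weighted combination). It is a generic illustration with invented numbers, not an analysis from the review.

```python
# Illustrative sketch of the basic Mendelian randomization estimators from
# summary statistics: per-variant Wald ratios and their inverse-variance
# weighted (IVW) combination.  All numbers are made up for illustration.
import numpy as np

beta_exposure = np.array([0.12, 0.08, 0.15])      # SNP-exposure associations
beta_outcome  = np.array([0.030, 0.018, 0.041])   # SNP-outcome associations
se_outcome    = np.array([0.010, 0.009, 0.012])   # standard errors of the above

wald_ratios = beta_outcome / beta_exposure         # per-variant causal estimates
weights = (beta_exposure / se_outcome) ** 2        # inverse-variance weights
ivw = np.sum(wald_ratios * weights) / np.sum(weights)
ivw_se = 1 / np.sqrt(np.sum(weights))
print(wald_ratios, ivw, ivw_se)
```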

  5. A mediator effect size in randomized clinical trials.

    PubMed

    Kraemer, Helena Chmura

    2014-12-01

    To understand the process by which a treatment (T) achieves an effect on outcome (O) and thus to improve the effect of T on O, it is vital to detect mediators, to compare the impact of different mediators, and to develop hypotheses about the causal factors (all mediators) linking T and O. An index is needed to facilitate interpretation of the potential clinical importance of a mediator (M) of the effect of T on O in randomized clinical trials (RCTs). Ideally such a mediator effect size should (1) be invariant under any rescaling of M and O consistent with the model used, and (2) reflect the difference between the overall observed effect of T on O and what the maximal effect of T on O could be were the association between T and M broken. A mediator effect size is derived first for the traditional linear model, and then more generally for any categorical (ordered or non-ordered) potential mediator. Issues such as the problem of multiple treatments, outcomes and mediators, and of causal inferences, and the correspondence between this approach and earlier ones, are discussed. Illustrations are given of the application of the approach. Copyright © 2014 John Wiley & Sons, Ltd.
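    The specific effect-size index derived in the paper is not reproduced here; as a hedged point of reference only, the sketch below computes the familiar product-of-coefficients mediation decomposition on simulated RCT data, the conventional baseline that such an index is intended to improve upon.

```python
# Hedged illustration only: the conventional product-of-coefficients mediation
# decomposition (a*b) on simulated RCT data.  This is NOT the effect-size index
# derived in the paper above; it is the familiar baseline for comparison.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
T = rng.integers(0, 2, n)                    # randomized treatment
M = 0.5 * T + rng.normal(size=n)             # mediator
O = 0.4 * M + 0.2 * T + rng.normal(size=n)   # outcome

a = sm.OLS(M, sm.add_constant(T)).fit().params[1]                           # T -> M
b = sm.OLS(O, sm.add_constant(np.column_stack([T, M]))).fit().params[2]     # M -> O given T
total = sm.OLS(O, sm.add_constant(T)).fit().params[1]                       # total effect
print(a * b, total - a * b, total)           # indirect, direct, total effects
```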

  6. Transport properties of random media: A new effective medium theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Busch, K.; Soukoulis, C.M.

    We present a new method for efficient, accurate calculations of transport properties of random media. It is based on the principle that the wave energy density should be uniform when averaged over length scales larger than the size of the scatterers. This scheme captures the effects of resonant scattering of the individual scatterer exactly, as well as the multiple scattering in a mean-field sense. It has been successfully applied to both "scalar" and "vector" classical wave calculations. Results for the energy transport velocity are in agreement with experiment. This approach is of general use and can be easily extended to treat different types of wave propagation in random media. © 1995 The American Physical Society.

  7. Personalized ovarian stimulation for assisted reproductive technology: study design considerations to move from hype to added value for patients.

    PubMed

    Mol, Ben W; Bossuyt, Patrick M; Sunkara, Sesh K; Garcia Velasco, Juan A; Venetis, Christos; Sakkas, Denny; Lundin, Kersti; Simón, Carlos; Taylor, Hugh S; Wan, Robert; Longobardi, Salvatore; Cottell, Evelyn; D'Hooghe, Thomas

    2018-06-01

    Although most medical treatments are designed for the average patient with a one-size-fits-all approach, they may not benefit all. Better understanding of the function of genes, proteins, and metabolites, and of personal and environmental factors has led to a call for personalized medicine. Personalized reproductive medicine is still in its infancy, without clear guidance on treatment aspects that could be personalized and on trial design to evaluate personalized treatment effect and benefit-harm balance. While the rationale for a personalized approach often relies on retrospective analyses of large observational studies or real-world data, solid evidence of superiority of a personalized approach will come from randomized trials comparing outcomes and safety between a personalized and one-size-fits-all strategy. A more efficient, targeted randomized trial design may recruit only patients or couples for whom the personalized approach would differ from the previous, standard approach. Multiple monocenter studies using the same study protocol (allowing future meta-analysis) might reduce the major center effect associated with multicenter studies. In certain cases, single-arm observational studies can generate the necessary evidence for a personalized approach. This review describes each of the main segments of patient care in assisted reproductive technologies treatment, addressing which aspects could be personalized, emphasizing current evidence and relevant study design. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  8. Lessons Learned from Community-Led Recruitment of Immigrants and Refugee Participants for a Randomized, Community-Based Participatory Research Study.

    PubMed

    Hanza, Marcelo M; Goodson, Miriam; Osman, Ahmed; Porraz Capetillo, Maria D; Hared, Abdullah; Nigon, Julie A; Meiers, Sonja J; Weis, Jennifer A; Wieland, Mark L; Sia, Irene G

    2016-10-01

    Ethnic minorities remain underrepresented in clinical trials despite efforts to increase their enrollment. Although community-based participatory research (CBPR) approaches have been effective for conducting research studies in minority and socially disadvantaged populations, protocols for CBPR recruitment design and implementation among immigrants and refugees have not been well described. We used a community-led and community-implemented CBPR strategy for recruiting 45 Hispanic, Somali, and Sudanese families (160 individuals) to participate in a large, randomized, community-based trial aimed at evaluating a physical activity and nutrition intervention. We achieved 97.7 % of our recruitment goal for families and 94.4 % for individuals. Use of a CBPR approach is an effective strategy for recruiting immigrant and refugee participants for clinical trials. We believe the lessons we learned during the process of participatory recruitment design and implementation will be helpful for others working with these populations.

  9. Exploring Inflated Pahoehoe Lava Flow Morphologies and the Effects of Cooling Using a New Simulation Approach

    NASA Technical Reports Server (NTRS)

    Glaze, L. S.; Baloga, S. M.

    2014-01-01

    Pahoehoe lavas are recognized as an important landform on Earth, Mars and Io. Observations of such flows on Earth (e.g., Figure 1) indicate that the emplacement process is dominated by random effects. Existing models for lobate a`a lava flows that assume viscous fluid flow on an inclined plane are not appropriate for dealing with the numerous random factors present in pahoehoe emplacement. Thus, interpretation of emplacement conditions for pahoehoe lava flows on Mars requires fundamentally different models. A new model that implements a simulation approach has recently been developed that allows exploration of a variety of key influences on pahoehoe lobe emplacement (e.g., source shape, confinement, slope). One important factor that has an impact on the final topographic shape and morphology of a pahoehoe lobe is the volumetric flow rate of lava, where cooling of lava on the lobe surface influences the likelihood of subsequent breakouts.

  10. Modelling wildland fire propagation by tracking random fronts

    NASA Astrophysics Data System (ADS)

    Pagnini, G.; Mentrelli, A.

    2014-08-01

    Wildland fire propagation is studied in the literature by two alternative approaches, namely the reaction-diffusion equation and the level-set method. These two approaches are considered alternatives to each other because the solution of the reaction-diffusion equation is generally a continuous smooth function that has an exponential decay, and it is not zero in an infinite domain, while the level-set method, which is a front tracking technique, generates a sharp function that is not zero inside a compact domain. However, these two approaches can indeed be considered complementary and reconciled. Turbulent hot-air transport and fire spotting are phenomena with a random nature and they are extremely important in wildland fire propagation. Consequently, the fire front gets a random character, too; hence, a tracking method for random fronts is needed. In particular, the level-set contour is randomised here according to the probability density function of the interface particle displacement. Actually, when the level-set method is developed for tracking a front interface with a random motion, the resulting averaged process emerges to be governed by an evolution equation of the reaction-diffusion type. In this reconciled approach, the rate of spread of the fire keeps the same key and characterising role that is typical of the level-set approach. The resulting model emerges to be suitable for simulating effects due to turbulent convection, such as fire flank and backing fire, the faster fire spread being because of the actions by hot-air pre-heating and by ember landing, and also due to the fire overcoming a fire-break zone, which is a case not resolved by models based on the level-set method. Moreover, from the proposed formulation, a correction follows for the formula of the rate of spread which is due to the mean jump length of firebrands in the downwind direction for the leeward sector of the fireline contour. The presented study constitutes a proof of concept, and it needs to be subjected to a future validation.
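    A minimal sketch of the central idea (under simplifying assumptions, not the authors' implementation): a sharp level-set indicator of the burned region is averaged over a probability density of random front-particle displacement, here an isotropic Gaussian, which produces the smooth effective burned fraction whose evolution is of the reaction-diffusion type. The grid size and spread below are arbitrary illustrative choices.

```python
# Sketch: average a sharp level-set indicator of the burned region over the
# probability density of random front-particle displacement (isotropic Gaussian
# here), yielding a smooth effective burned fraction.  Values are illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter

nx = ny = 200
x, y = np.meshgrid(np.linspace(-1, 1, nx), np.linspace(-1, 1, ny))
phi = np.sqrt(x**2 + y**2) - 0.3         # signed-distance level set, fire radius 0.3
indicator = (phi <= 0).astype(float)     # sharp burned/unburned indicator

sigma_disp = 0.05                        # std of random displacement (domain units)
sigma_px = sigma_disp / (2.0 / nx)       # convert to grid cells
burned_fraction = gaussian_filter(indicator, sigma_px)   # effective burned fraction
print(burned_fraction.min(), burned_fraction.max())      # smooth field in [0, 1]
```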

  11. Study protocol for a cluster randomized trial of the Community of Voices choir intervention to promote the health and well-being of diverse older adults.

    PubMed

    Johnson, Julene K; Nápoles, Anna M; Stewart, Anita L; Max, Wendy B; Santoyo-Olsson, Jasmine; Freyre, Rachel; Allison, Theresa A; Gregorich, Steven E

    2015-10-13

    Older adults are the fastest growing segment of the United States population. There is an immediate need to identify novel, cost-effective community-based approaches that promote health and well-being for older adults, particularly those from diverse racial/ethnic and socioeconomic backgrounds. Because choral singing is multi-modal (requires cognitive, physical, and psychosocial engagement), it has the potential to improve health outcomes across several dimensions to help older adults remain active and independent. The purpose of this study is to examine the effect of a community choir program (Community of Voices) on health and well-being and to examine its costs and cost-effectiveness in a large sample of diverse, community-dwelling older adults. In this cluster randomized controlled trial, diverse adults age 60 and older were enrolled at Administration on Aging-supported senior centers and completed baseline assessments. The senior centers were randomly assigned to either start the choir immediately (intervention group) or wait 6 months to start (control). Community of Voices is a culturally tailored choir program delivered at the senior centers by professional music conductors that reflects three components of engagement (cognitive, physical, and psychosocial). We describe the nature of the study including the cluster randomized trial study design, sampling frame, sample size calculation, methods of recruitment and assessment, and primary and secondary outcomes. The study involves conducting a randomized trial of an intervention as delivered in "real-world" settings. The choir program was designed using a novel translational approach that integrated evidence-based research on the benefits of singing for older adults, community best practices related to community choirs for older adults, and the perspective of the participating communities. The practicality and relatively low cost of the choir intervention means it can be incorporated into a variety of community settings and adapted to diverse cultures and languages. If successful, this program will be a practical and acceptable community-based approach for promoting health and well-being of older adults. ClinicalTrials.gov NCT01869179 registered 9 January 2013.

  12. Personalized contact strategies and predictors of time to survey completion: analysis of two sequential randomized trials.

    PubMed

    Dinglas, Victor D; Huang, Minxuan; Sepulveda, Kristin A; Pinedo, Mariela; Hopkins, Ramona O; Colantuoni, Elizabeth; Needham, Dale M

    2015-01-09

    Effective strategies for contacting and recruiting study participants are critical in conducting clinical research. In this study, we conducted two sequential randomized controlled trials of mail- and telephone-based strategies for contacting and recruiting participants, and evaluated participant-related variables' association with time to survey completion and survey completion rates. Subjects eligible for this study were survivors of acute lung injury who had been previously enrolled in a 12-month observational follow-up study evaluating their physical, cognitive and mental health outcomes, with their last study visit completed at a median of 34 months previously. Eligible subjects were contacted to complete a new research survey as part of two randomized trials, initially using a randomized mail-based contact strategy, followed by a randomized telephone-based contact strategy for non-responders to the mail strategy. Both strategies focused on using either a personalized versus a generic approach. In addition, 18 potentially relevant subject-related variables (e.g., demographics, last known physical and mental health status) were evaluated for association with time to survey completion. Of 308 eligible subjects, 67% completed the survey with a median (IQR) of 3 (2, 5) contact attempts required. There was no significant difference in the time to survey completion for either randomized trial of mail- or phone-based contact strategy. Among all subject-related variables, age ≤40 years and minority race were independently associated with a longer time to survey completion. We found that age ≤40 years and minority race were associated with a longer time to survey completion, but personalized versus generic approaches to mail- and telephone-based contact strategies had no significant effect. Repeating both mail and telephone contact attempts was important for increasing survey completion rate. NCT00719446.

  13. Approximating prediction uncertainty for random forest regression models

    Treesearch

    John W. Coulston; Christine E. Blinn; Valerie A. Thomas; Randolph H. Wynne

    2016-01-01

    Machine learning approaches such as random forest have increased for the spatial modeling and mapping of continuous variables. Random forest is a non-parametric ensemble approach, and unlike traditional regression approaches there is no direct quantification of prediction error. Understanding prediction uncertainty is important when using model-based continuous maps as...
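    The abstract above is truncated, so the following is only a generic illustration of one common way to approximate prediction uncertainty for random forest regression (not necessarily the authors' method): use the spread of the individual tree predictions at each new point.

```python
# Generic sketch (not necessarily the authors' method): approximate prediction
# uncertainty for random forest regression from the spread of per-tree predictions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.datasets import make_regression

X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=0)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)

X_new = X[:5]
per_tree = np.stack([t.predict(X_new) for t in rf.estimators_])  # (n_trees, n_points)
mean_pred = per_tree.mean(axis=0)
sd_pred = per_tree.std(axis=0)            # simple uncertainty proxy per prediction
print(np.round(mean_pred, 1), np.round(sd_pred, 1))
```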

  14. Evaluation of a nurse-led dementia education and knowledge translation programme in primary care: A cluster randomized controlled trial.

    PubMed

    Wang, Yao; Xiao, Lily Dongxia; Ullah, Shahid; He, Guo-Ping; De Bellis, Anita

    2017-02-01

    The lack of dementia education programmes for health professionals in primary care is one of the major factors contributing to the unmet demand for dementia care services. To determine the effectiveness of a nurse-led dementia education and knowledge translation programme for health professionals in primary care; participants' satisfaction with the programme; and to understand participants' perceptions of and experiences in the programme. A cluster randomized controlled trial was used as the main methodology to evaluate health professionals' knowledge, attitudes and care approach. Focus groups were used at the end of the project to understand health professionals' perceptions of and experiences in the programme. Fourteen community health service centres in a province in China participated in the study. Seven centres were randomly assigned to the intervention or control group respectively and 85 health professionals in each group completed the programme. A train-the-trainer model was used to implement a dementia education and knowledge translation programme. Outcome variables were measured at baseline, on the completion of the programme and at 3-month follow-up. A mixed effect linear regression model was applied to compare the significant differences of outcome measures over time between the two groups. Focus groups were guided by four semi-structured questions and analysed using content analysis. Findings revealed significant effects of the education and knowledge translation programme on participants' knowledge, attitudes and a person-centred care approach. Focus groups confirmed that the programme had a positive impact on dementia care practice. A dementia education and knowledge translation programme for health professionals in primary care has positive effects on their knowledge, attitudes, care approach and care practice. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Bilateral robotic priming before task-oriented approach in subacute stroke rehabilitation: a pilot randomized controlled trial.

    PubMed

    Hsieh, Yu-Wei; Wu, Ching-Yi; Wang, Wei-En; Lin, Keh-Chung; Chang, Ku-Chou; Chen, Chih-Chi; Liu, Chien-Ting

    2017-02-01

    To investigate the treatment effects of bilateral robotic priming combined with the task-oriented approach on motor impairment, disability, daily function, and quality of life in patients with subacute stroke. A randomized controlled trial. Occupational therapy clinics in medical centers. Thirty-one subacute stroke patients were recruited. Participants were randomly assigned to receive bilateral priming combined with the task-oriented approach (i.e., primed group) or to the task-oriented approach alone (i.e., unprimed group) for 90 minutes/day, 5 days/week for 4 weeks. The primed group began with the bilateral priming technique by using a bimanual robot-aided device. Motor impairments were assessed by the Fugl-Meyer Assessment, grip strength, and the Box and Block Test. Disability and daily function were measured by the modified Rankin Scale, the Functional Independence Measure, and actigraphy. Quality of life was examined by the Stroke Impact Scale. The primed and unprimed groups improved significantly on most outcomes over time. The primed group demonstrated significantly better improvement on the Stroke Impact Scale strength subscale (p = 0.012) and a trend for greater improvement on the modified Rankin Scale (p = 0.065) than the unprimed group. Bilateral priming combined with the task-oriented approach elicited more improvements in self-reported strength and disability degrees than the task-oriented approach by itself. Further large-scale research with at least 31 participants in each intervention group is suggested to confirm the study findings.

  16. Randomized controlled trial of supplemental augmentative and alternative communication versus voice rest alone after phonomicrosurgery.

    PubMed

    Rousseau, Bernard; Gutmann, Michelle L; Mau, Theodore; Francis, David O; Johnson, Jeffrey P; Novaleski, Carolyn K; Vinson, Kimberly N; Garrett, C Gaelyn

    2015-03-01

    This randomized trial investigated voice rest and supplemental text-to-speech communication versus voice rest alone on visual analog scale measures of communication effectiveness and magnitude of voice use. Randomized clinical trial. Multicenter outpatient voice clinics. Thirty-seven patients undergoing phonomicrosurgery. Patients undergoing phonomicrosurgery were randomized to voice rest and supplemental text-to-speech communication or voice rest alone. The primary outcome measure was the impact of voice rest on ability to communicate effectively over a 7-day period. Pre- and postoperative magnitude of voice use was also measured as an observational outcome. Patients randomized to voice rest and supplemental text-to-speech communication reported higher median communication effectiveness on each postoperative day compared to those randomized to voice rest alone, with significantly higher median communication effectiveness on postoperative days 3 (P=.03) and 5 (P=.01). Magnitude of voice use did not differ on any preoperative (P>.05) or postoperative day (P>.05), nor did patients significantly decrease voice use as the surgery date approached (P>.05). However, there was a significant reduction in median voice use pre- to postoperatively across patients (P<.001) with median voice use ranging from 0 to 3 throughout the postoperative week. Supplemental text-to-speech communication increased patient-perceived communication effectiveness on postoperative days 3 and 5 over voice rest alone. With the prevalence of smartphones and the widespread use of text messaging, supplemental text-to-speech communication may provide an accessible and cost-effective communication option for patients on vocal restrictions. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2015.

  17. The positive deviance/hearth approach to reducing child malnutrition: systematic review.

    PubMed

    Bisits Bullen, Piroska A

    2011-11-01

    The Positive Deviance/Hearth approach aims to rehabilitate malnourished children using practices from mothers in the community who have well-nourished children despite living in poverty. This study assesses its effectiveness in a range of settings. Systematic review of peer reviewed intervention trials and grey literature evaluation reports of child malnutrition programs using the Positive Deviance/Hearth approach. Ten peer reviewed studies and 14 grey literature reports met the inclusion criteria. These described results for 17 unique Positive Deviance/Hearth programs in 12 countries. Nine programs used a pre- and post-test design without a control, which limited the conclusions that could be drawn. Eight used more robust designs such as non-randomized trials, non-randomized cross-sectional sibling studies and randomized controlled trials (RCTs). Of the eight programs that reported nutritional outcomes, five reported some type of positive result in terms of nutritional status - although the improvement was not always as large as predicted, or across the entire target population. Both RCTs demonstrated improvements in carer feeding practices. Qualitative results unanimously reported high levels of satisfaction from participants and recipient communities. Overall this study shows mixed results in terms of program effectiveness, although some Positive Deviance/Hearth programs have clearly been successful in particular settings. Sibling studies suggest that the Positive Deviance/Hearth approach may have a role in preventing malnutrition, not just rehabilitation. Further research is needed using more robust study designs and larger sample sizes. Issues related to community participation and consistency in reporting results need to be addressed. © 2011 Blackwell Publishing Ltd.

  18. Interpreting findings from Mendelian randomization using the MR-Egger method.

    PubMed

    Burgess, Stephen; Thompson, Simon G

    2017-05-01

    Mendelian randomization-Egger (MR-Egger) is an analysis method for Mendelian randomization using summarized genetic data. MR-Egger consists of three parts: (1) a test for directional pleiotropy, (2) a test for a causal effect, and (3) an estimate of the causal effect. While conventional analysis methods for Mendelian randomization assume that all genetic variants satisfy the instrumental variable assumptions, the MR-Egger method is able to assess whether genetic variants have pleiotropic effects on the outcome that differ on average from zero (directional pleiotropy), as well as to provide a consistent estimate of the causal effect, under a weaker assumption-the InSIDE (INstrument Strength Independent of Direct Effect) assumption. In this paper, we provide a critical assessment of the MR-Egger method with regard to its implementation and interpretation. While the MR-Egger method is a worthwhile sensitivity analysis for detecting violations of the instrumental variable assumptions, there are several reasons why causal estimates from the MR-Egger method may be biased and have inflated Type 1 error rates in practice, including violations of the InSIDE assumption and the influence of outlying variants. The issues raised in this paper have potentially serious consequences for causal inferences from the MR-Egger approach. We give examples of scenarios in which the estimates from conventional Mendelian randomization methods and MR-Egger differ, and discuss how to interpret findings in such cases.
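    For concreteness, a minimal sketch of the MR-Egger regression described above: a weighted regression of SNP-outcome on SNP-exposure associations with an unconstrained intercept, where the intercept tests directional pleiotropy and the slope estimates the causal effect under the InSIDE assumption. The summary statistics below are invented.

```python
# Sketch of MR-Egger regression: weighted regression of SNP-outcome on
# SNP-exposure associations with a free intercept.  The intercept tests
# directional pleiotropy; the slope is the causal estimate under InSIDE.
import numpy as np
import statsmodels.api as sm

beta_x = np.array([0.10, 0.14, 0.08, 0.20, 0.12])        # SNP-exposure (invented)
beta_y = np.array([0.028, 0.041, 0.020, 0.055, 0.035])   # SNP-outcome (invented)
se_y   = np.array([0.010, 0.012, 0.009, 0.015, 0.011])   # outcome standard errors

# Orient so all SNP-exposure associations are positive (standard MR-Egger step)
sign = np.sign(beta_x)
beta_x, beta_y = beta_x * sign, beta_y * sign

egger = sm.WLS(beta_y, sm.add_constant(beta_x), weights=1 / se_y**2).fit()
print(egger.params)    # [intercept (pleiotropy), slope (causal effect)]
print(egger.pvalues)
```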

  19. Finite-time stability of neutral-type neural networks with random time-varying delays

    NASA Astrophysics Data System (ADS)

    Ali, M. Syed; Saravanan, S.; Zhu, Quanxin

    2017-11-01

    This paper is devoted to the finite-time stability analysis of neutral-type neural networks with random time-varying delays. The randomly time-varying delays are characterised by a Bernoulli stochastic variable, and the results can be extended to the analysis and design of neutral-type neural networks with random time-varying delays. We constructed a suitable Lyapunov-Krasovskii functional and established a set of sufficient conditions, expressed as linear matrix inequalities, that guarantee the finite-time stability of the system concerned. By employing Jensen's inequality, the free-weighting matrix method and Wirtinger's double integral inequality, the proposed conditions are derived, and two numerical examples illustrate the effectiveness of the developed techniques.

  20. Effect of packing method on the randomness of disc packings

    NASA Astrophysics Data System (ADS)

    Zhang, Z. P.; Yu, A. B.; Oakeshott, R. B. S.

    1996-06-01

    The randomness of disc packings, generated by random sequential adsorption (RSA), random packing under gravity (RPG) and Mason packing (MP) which gives a packing density close to that of the RSA packing, has been analysed, based on the Delaunay tessellation, and is evaluated at two levels, i.e. the randomness at individual subunit level which relates to the construction of a triangle from a given edge length distribution and the randomness at network level which relates to the connection between triangles from a given triangle frequency distribution. The Delaunay tessellation itself is also analysed and its almost perfect randomness at the two levels is demonstrated, which verifies the proposed approach and provides a random reference system for the present analysis. It is found that (i) the construction of a triangle subunit is not random for the RSA, MP and RPG packings, with the degree of randomness decreasing from the RSA to MP and then to RPG packing; (ii) the connection of triangular subunits in the network is almost perfectly random for the RSA packing, acceptable for the MP packing and not good for the RPG packing. Packing method is an important factor governing the randomness of disc packings.

  1. Variance approach for multi-objective linear programming with fuzzy random of objective function coefficients

    NASA Astrophysics Data System (ADS)

    Indarsih, Indrati, Ch. Rini

    2016-02-01

    In this paper, we define the variance of fuzzy random variables through alpha levels. We present a theorem showing that the variance of a fuzzy random variable is a fuzzy number. We consider a multi-objective linear programming (MOLP) problem with fuzzy random objective function coefficients and solve it by the variance approach. The approach transforms the MOLP problem with fuzzy random objective function coefficients into an MOLP problem with fuzzy objective function coefficients. By the weighting method, we obtain a linear programming problem with fuzzy coefficients, which we solve by the simplex method for fuzzy linear programming.

  2. Using environmental heterogeneity to plan for sea-level rise.

    PubMed

    Hunter, Elizabeth A; Nibbelink, Nathan P

    2017-12-01

    Environmental heterogeneity is increasingly being used to select conservation areas that will provide for future biodiversity under a variety of climate scenarios. This approach, termed conserving nature's stage (CNS), assumes environmental features respond to climate change more slowly than biological communities, but will CNS be effective if the stage were to change as rapidly as the climate? We tested the effectiveness of using CNS to select sites in salt marshes for conservation in coastal Georgia (U.S.A.), where environmental features will change rapidly as sea level rises. We calculated species diversity based on distributions of 7 bird species with a variety of niches in Georgia salt marshes. Environmental heterogeneity was assessed across six landscape gradients (e.g., elevation, salinity, and patch area). We used 2 approaches to select sites with high environmental heterogeneity: site complementarity (environmental diversity [ED]) and local environmental heterogeneity (environmental richness [ER]). Sites selected based on ER predicted present-day species diversity better than randomly selected sites (up to an 8.1% improvement), were resilient to areal loss from SLR (1.0% average areal loss by 2050 compared with 0.9% loss of randomly selected sites), and provided habitat to a threatened species (0.63 average occupancy compared with 0.6 average occupancy of randomly selected sites). Sites selected based on ED predicted species diversity no better or worse than random and were not resilient to SLR (2.9% average areal loss by 2050). Despite the discrepancy between the 2 approaches, CNS is a viable strategy for conservation site selection in salt marshes because the ER approach was successful. It has potential for application in other coastal areas where SLR will affect environmental features, but its performance may depend on the magnitude of geological changes caused by SLR. Our results indicate that conservation planners that had heretofore excluded low-lying coasts from CNS planning could include coastal ecosystems in regional conservation strategies. © 2017 Society for Conservation Biology.

  3. Elastic constants of random solid solutions by SQS and CPA approaches: the case of fcc Ti-Al.

    PubMed

    Tian, Li-Yun; Hu, Qing-Miao; Yang, Rui; Zhao, Jijun; Johansson, Börje; Vitos, Levente

    2015-08-12

    Special quasi-random structure (SQS) and coherent potential approximation (CPA) are techniques widely employed in the first-principles calculations of random alloys. Here we scrutinize these approaches by focusing on the local lattice distortion (LLD) and the crystal symmetry effects. We compare the elastic parameters obtained from SQS and CPA calculations, taking the random face-centered cubic (fcc) Ti(1-x)Al(x) (0 ≤ x ≤ 1) alloy as an example of systems with components showing different electronic structures and bonding characteristics. For the CPA and SQS calculations, we employ the Exact Muffin-Tin Orbitals (EMTO) method and the pseudopotential method as implemented in the Vienna Ab initio Simulation Package (VASP), respectively. We show that the predicted trends of the VASP-SQS and EMTO-CPA parameters against composition are in good agreement with each other. The energy associated with the LLD increases with x up to x = 0.625 ~ 0.750 and drops drastically thereafter. The influence of the LLD on the lattice constants and C12 elastic constant is negligible. C11 and C44 decrease after atomic relaxation for alloys with large LLD, however, the trends of C11 and C44 are not significantly affected. In general, the uncertainties in the elastic parameters associated with the symmetry lowering turn out to be superior to the differences between the two techniques including the effect of LLD.

  4. Multiobjective optimization approach: thermal food processing.

    PubMed

    Abakarov, A; Sushkov, Y; Almonacid, S; Simpson, R

    2009-01-01

    The objective of this study was to utilize a multiobjective optimization technique for the thermal sterilization of packaged foods. The multiobjective optimization approach used in this study is based on the optimization of well-known aggregating functions by an adaptive random search algorithm. The applicability of the proposed approach was illustrated by solving widely used multiobjective test problems taken from the literature. The numerical results obtained for the multiobjective test problems and for the thermal processing problem show that the proposed approach can be effectively used for solving multiobjective optimization problems arising in the food engineering field.
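    As a toy illustration of the aggregating-function idea (not the authors' adaptive random search implementation), the sketch below collapses two illustrative objectives into a weighted sum and minimizes it with a plain random search.

```python
# Minimal sketch of the idea above (not the authors' algorithm): collapse two
# objectives into a weighted-sum aggregating function and optimize it with a
# simple random search.  The test objectives below are illustrative only.
import numpy as np

rng = np.random.default_rng(1)

def f1(x):  # e.g., processing-time surrogate
    return (x[0] - 1.0) ** 2 + x[1] ** 2

def f2(x):  # e.g., quality-loss surrogate
    return x[0] ** 2 + (x[1] - 1.0) ** 2

def aggregate(x, w=0.5):
    return w * f1(x) + (1 - w) * f2(x)

best_x, best_val = None, np.inf
for _ in range(20000):                      # plain (non-adaptive) random search
    x = rng.uniform(-2.0, 2.0, size=2)
    val = aggregate(x)
    if val < best_val:
        best_x, best_val = x, val
print(best_x, best_val)                     # approaches (0.5, 0.5) for w = 0.5
```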

  5. Assessment on the Prevention of Progression by Rosiglitazone on Atherosclerosis in diabetes patients with Cardiovascular History (APPROACH): study design and baseline characteristics.

    PubMed

    Ratner, Robert E; Cannon, Christopher P; Gerstein, Hertzel C; Nesto, Richard W; Serruys, Patrick W; Van Es, Gerrit-Anne; Kolatkar, Nikheel S; Kravitz, Barbara G; Zalewski, Andrew; Fitzgerald, Peter J

    2008-12-01

    Rosiglitazone, a thiazolidinedione, has effects on insulin sensitivity and cardiovascular risk factors that may favorably impact the progression of coronary atherosclerosis. APPROACH is a double-blind randomized clinical trial comparing the effects of the insulin sensitizer rosiglitazone with the insulin secretagogue glipizide on the progression of coronary atherosclerosis. Patients with type 2 diabetes and coronary artery disease undergoing clinically indicated coronary angiography or percutaneous coronary intervention are randomized to receive rosiglitazone or glipizide for 18 months using a titration algorithm designed to provide comparable glycemic control between treatment groups. The primary end point is change in percent atheroma volume from baseline to study completion in a nonintervened coronary artery, as measured by intravascular ultrasound. Cardiovascular events are adjudicated by an end point committee. A total of 672 patients were randomized. The mean age was 61 years, hemoglobin A(1c) (HbA(1c)) 7.2%, body mass index 29.5 kg/m(2), and median duration of diabetes 4.8 years. At baseline, approximately half of the participants were receiving oral antidiabetic monotherapy (53.9%) with 27.5% receiving dual combination therapy and 17.9% treated with diet and exercise alone. Approximately two thirds of the participants (68%) had dyslipidemia, 79.9% hypertension, and 24% prior myocardial infarction. APPROACH has fully enrolled a high-risk patient population and will compare the glucose-independent effects of rosiglitazone and glipizide on the progression of coronary atherosclerosis, as well as provide additional data on the cardiovascular safety of rosiglitazone in patients with type 2 diabetes and coronary artery disease.

  6. Community-based intermittent mass testing and treatment for malaria in an area of high transmission intensity, western Kenya: study design and methodology for a cluster randomized controlled trial.

    PubMed

    Samuels, Aaron M; Awino, Nobert; Odongo, Wycliffe; Abong'o, Benard; Gimnig, John; Otieno, Kephas; Shi, Ya Ping; Were, Vincent; Allen, Denise Roth; Were, Florence; Sang, Tony; Obor, David; Williamson, John; Hamel, Mary J; Patrick Kachur, S; Slutsker, Laurence; Lindblade, Kim A; Kariuki, Simon; Desai, Meghna

    2017-06-07

    Most human Plasmodium infections in western Kenya are asymptomatic and are believed to contribute importantly to malaria transmission. Elimination of asymptomatic infections requires active treatment approaches, such as mass testing and treatment (MTaT) or mass drug administration (MDA), as infected persons do not seek care for their infection. Evaluations of community-based approaches that are designed to reduce malaria transmission require careful attention to study design to ensure that important effects can be measured accurately. This manuscript describes the study design and methodology of a cluster-randomized controlled trial to evaluate a MTaT approach for malaria transmission reduction in an area of high malaria transmission. Ten health facilities in western Kenya were purposively selected for inclusion. The communities within 3 km of each health facility were divided into three clusters of approximately equal population size. Two clusters around each health facility were randomly assigned to the control arm, and one to the intervention arm. Three times per year for 2 years, after the long and short rains, and again before the long rains, teams of community health volunteers visited every household within the intervention arm, tested all consenting individuals with malaria rapid diagnostic tests, and treated all positive individuals with an effective anti-malarial. The effect of mass testing and treatment on malaria transmission was measured through population-based longitudinal cohorts, outpatient visits for clinical malaria, periodic population-based cross-sectional surveys, and entomological indices.

  7. The Effects of Scenario Planning on Participant Reports of Resilience

    ERIC Educational Resources Information Center

    Chermack, Thomas J.; Coons, Laura M.; O'barr, Gregory; Khatami, Shiva

    2017-01-01

    Purpose: The purpose of this research is to examine the effects of scenario planning on participant ratings of resilience. Design/methodology/approach: The research design is a quasi-experimental pretest/posttest design with treatment and control groups. Random selection or assignment was not achieved. Findings: Results show a significant difference in…

  8. Effects of Differentiated Reading on Elementary Students' Reading Comprehension and Attitudes toward Reading

    ERIC Educational Resources Information Center

    Shaunessy-Dedrick, Elizabeth; Evans, Linda; Ferron, John; Lindo, Myriam

    2015-01-01

    In this investigation, we examined the effects of a differentiated reading approach on fourth grade students' reading comprehension and attitudes toward reading. Eight Title I schools within one urban district were randomly assigned to treatment (Schoolwide Enrichment Model-Reading [SEM-R]) or control (district reading curriculum) conditions.…

  9. Item and Testlet Position Effects in Computer-Based Alternate Assessments for Students with Disabilities

    ERIC Educational Resources Information Center

    Bulut, Okan; Lei, Ming; Guo, Qi

    2018-01-01

    Item positions in educational assessments are often randomized across students to prevent cheating. However, if altering item positions results in any significant impact on students' performance, it may threaten the validity of test scores. Two widely used approaches for detecting position effects -- logistic regression and hierarchical…

  10. A Joint Modeling Approach for Reaction Time and Accuracy in Psycholinguistic Experiments

    ERIC Educational Resources Information Center

    Loeys, T.; Rosseel, Y.; Baten, K.

    2011-01-01

    In the psycholinguistic literature, reaction times and accuracy can be analyzed separately using mixed (logistic) effects models with crossed random effects for item and subject. Given the potential correlation between these two outcomes, a joint model for the reaction time and accuracy may provide further insight. In this paper, a Bayesian…

  11. Applying Generalizability Theory To Evaluate Treatment Effect in Single-Subject Research.

    ERIC Educational Resources Information Center

    Lefebvre, Daniel J.; Suen, Hoi K.

    An empirical investigation of methodological issues associated with evaluating treatment effect in single-subject research (SSR) designs is presented. This investigation: (1) conducted a generalizability (G) study to identify the sources of systematic and random measurement error (SRME); (2) used an analytic approach based on G theory to integrate…

  12. Extended Mixed-Effects Item Response Models with the MH-RM Algorithm

    ERIC Educational Resources Information Center

    Chalmers, R. Philip

    2015-01-01

    A mixed-effects item response theory (IRT) model is presented as a logical extension of the generalized linear mixed-effects modeling approach to formulating explanatory IRT models. Fixed and random coefficients in the extended model are estimated using a Metropolis-Hastings Robbins-Monro (MH-RM) stochastic imputation algorithm to accommodate for…

  13. The Effects of a Multilinguistic Morphological Awareness Approach for Improving Language and Literacy

    ERIC Educational Resources Information Center

    Wolter, Julie A.; Dilworth, Valisa

    2014-01-01

    The purpose of this study was to examine the effectiveness of a multilinguistic intervention to improve reading and spelling in primary grade students who struggle with literacy. Twenty second-grade students with spelling deficits were randomly assigned to receive a multilinguistic intervention with a phonological and orthographic awareness…

  14. The Role of Executive Functions for Dyadic Literacy Learning in Kindergarten

    ERIC Educational Resources Information Center

    van de Sande, Eva; Segers, Eliane; Verhoeven, Ludo

    2018-01-01

    The current study used a dyadic and coconstructive approach to examine how to embed exercises that support executive functioning into early literacy instruction to empower its effects. Using a randomized controlled trial design with 100 children, we examined the effects of dyadic activities in which children scaffolded each other's learning and…

  15. Improved estimation of random vibration loads in launch vehicles

    NASA Technical Reports Server (NTRS)

    Mehta, R.; Erwin, E.; Suryanarayan, S.; Krishna, Murali M. R.

    1993-01-01

    Random vibration induced load is an important component of the total design load environment for payload and launch vehicle components and their support structures. The current approach to random vibration load estimation is based, particularly at the preliminary design stage, on the use of Miles' equation which assumes a single degree-of-freedom (DOF) system and white noise excitation. This paper examines the implications of the use of multi-DOF system models and response calculation based on numerical integration using the actual excitation spectra for random vibration load estimation. The analytical study presented considers a two-DOF system and brings out the effects of modal mass, damping and frequency ratios on the random vibration load factor. The results indicate that load estimates based on the Miles' equation can be significantly different from the more accurate estimates based on multi-DOF models.
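    For reference, the single-degree-of-freedom Miles' equation baseline mentioned above gives the RMS acceleration response to a flat input spectrum as G_rms = sqrt((pi/2) * f_n * Q * W(f_n)); the numbers in the sketch below are illustrative only.

```python
# The single-DOF Miles' equation baseline mentioned above: RMS acceleration
# response to a flat (white) acceleration spectral density.  Inputs are illustrative.
import math

f_n = 100.0      # natural frequency, Hz
Q = 10.0         # amplification factor (1 / (2*zeta))
asd = 0.04       # input acceleration spectral density at f_n, g^2/Hz

g_rms = math.sqrt((math.pi / 2.0) * f_n * Q * asd)
print(round(g_rms, 2), "g RMS")   # ~7.9 g; a 3-sigma design load would be 3 * g_rms
```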

  16. Using Big Data to Emulate a Target Trial When a Randomized Trial Is Not Available.

    PubMed

    Hernán, Miguel A; Robins, James M

    2016-04-15

    Ideally, questions about comparative effectiveness or safety would be answered using an appropriately designed and conducted randomized experiment. When we cannot conduct a randomized experiment, we analyze observational data. Causal inference from large observational databases (big data) can be viewed as an attempt to emulate a randomized experiment-the target experiment or target trial-that would answer the question of interest. When the goal is to guide decisions among several strategies, causal analyses of observational data need to be evaluated with respect to how well they emulate a particular target trial. We outline a framework for comparative effectiveness research using big data that makes the target trial explicit. This framework channels counterfactual theory for comparing the effects of sustained treatment strategies, organizes analytic approaches, provides a structured process for the criticism of observational studies, and helps avoid common methodologic pitfalls. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  17. Comparing the Effects of Teen Mentors to Adult Teachers on Child Lifestyle Behaviors and Health Outcomes in Appalachia

    ERIC Educational Resources Information Center

    Smith, Laureen H.; Holloman, Christopher

    2013-01-01

    Childhood obesity prevalence rates in the United States are the highest in the rural Appalachian areas. Teens mentoring younger children to reverse obesity health risks are an understudied approach. This randomized-controlled trial compared the effects of two curriculum delivery methods and assessed the mediating effects of the number of sessions…

  18. An Efficient ERP-Based Brain-Computer Interface Using Random Set Presentation and Face Familiarity

    PubMed Central

    Müller, Klaus-Robert; Lee, Seong-Whan

    2014-01-01

    Event-related potential (ERP)-based P300 spellers are commonly used in the field of brain-computer interfaces as an alternative channel of communication for people with severe neuro-muscular diseases. This study introduces a novel P300 based brain-computer interface (BCI) stimulus paradigm using a random set presentation pattern and exploiting the effects of face familiarity. The effect of face familiarity is widely studied in the cognitive neurosciences and has recently been addressed for the purpose of BCI. In this study we compare P300-based BCI performances of a conventional row-column (RC)-based paradigm with our approach that combines a random set presentation paradigm with (non-) self-face stimuli. Our experimental results indicate stronger deflections of the ERPs in response to face stimuli, which are further enhanced when using the self-face images, thereby improving P300-based spelling performance. This led to a significant reduction of stimulus sequences required for correct character classification. These findings demonstrate a promising new approach for improving the speed and thus fluency of BCI-enhanced communication with the widely used P300-based BCI setup. PMID:25384045

  19. An efficient ERP-based brain-computer interface using random set presentation and face familiarity.

    PubMed

    Yeom, Seul-Ki; Fazli, Siamac; Müller, Klaus-Robert; Lee, Seong-Whan

    2014-01-01

    Event-related potential (ERP)-based P300 spellers are commonly used in the field of brain-computer interfaces as an alternative channel of communication for people with severe neuro-muscular diseases. This study introduces a novel P300 based brain-computer interface (BCI) stimulus paradigm using a random set presentation pattern and exploiting the effects of face familiarity. The effect of face familiarity is widely studied in the cognitive neurosciences and has recently been addressed for the purpose of BCI. In this study we compare P300-based BCI performances of a conventional row-column (RC)-based paradigm with our approach that combines a random set presentation paradigm with (non-) self-face stimuli. Our experimental results indicate stronger deflections of the ERPs in response to face stimuli, which are further enhanced when using the self-face images, thereby improving P300-based spelling performance. This led to a significant reduction of stimulus sequences required for correct character classification. These findings demonstrate a promising new approach for improving the speed and thus fluency of BCI-enhanced communication with the widely used P300-based BCI setup.

  20. A randomized controlled pilot study of the effectiveness of occupational therapy for children with sensory modulation disorder.

    PubMed

    Miller, Lucy Jane; Coll, Joseph R; Schoen, Sarah A

    2007-01-01

    A pilot randomized controlled trial (RCT) of the effectiveness of occupational therapy using a sensory integration approach (OT-SI) was conducted with children who had sensory modulation disorders (SMDs). This study evaluated the effectiveness of three treatment groups. In addition, sample size estimates for a large scale, multisite RCT were calculated. Twenty-four children with SMD were randomly assigned to one of three treatment conditions; OT-SI, Activity Protocol, and No Treatment. Pretest and posttest measures of behavior, sensory and adaptive functioning, and physiology were administered. The OT-SI group, compared to the other two groups, made significant gains on goal attainment scaling and on the Attention subtest and the Cognitive/Social composite of the Leiter International Performance Scale-Revised. Compared to the control groups, OT-SI improvement trends on the Short Sensory Profile, Child Behavior Checklist, and electrodermal reactivity were in the hypothesized direction. Findings suggest that OT-SI may be effective in ameliorating difficulties of children with SMD.

  1. Interaction of the sonic boom with atmospheric turbulence

    NASA Technical Reports Server (NTRS)

    Rusak, Zvi; Cole, Julian D.

    1994-01-01

    Theoretical research was carried out to study the effect of free-stream turbulence on sonic boom pressure fields. A new transonic small-disturbance model to analyze the interactions of random disturbances with a weak shock was developed. The model equation has an extended form of the classic small-disturbance equation for unsteady transonic aerodynamics. An alternative approach shows that the pressure field may be described by an equation that has an extended form of the classic nonlinear acoustics equation that describes the propagation of sound beams with narrow angular spectrum. The model shows that diffraction effects, nonlinear steepening effects, focusing and caustic effects and random induced vorticity fluctuations interact simultaneously to determine the development of the shock wave in space and time and the pressure field behind it. A finite-difference algorithm to solve the mixed type elliptic-hyperbolic flows around the shock wave was also developed. Numerical calculations of shock wave interactions with various deterministic and random fluctuations will be presented in a future report.

  2. Variable- and Person-Centered Approaches to the Analysis of Early Adolescent Substance Use: Linking Peer, Family, and Intervention Effects with Developmental Trajectories

    ERIC Educational Resources Information Center

    Connell, Arin M.; Dishion, Thomas J.; Deater-Deckard, Kirby

    2006-01-01

    This 4-year study of 698 young adolescents examined the covariates of early onset substance use from Grade 6 through Grade 9. The youth were randomly assigned to a family-centered Adolescent Transitions Program (ATP) condition. Variable-centered (zero-inflated Poisson growth model) and person-centered (latent growth mixture model) approaches were…

  3. Changing Health Behaviors to Improve Health Outcomes after Angioplasty: A Randomized Trial of Net Present Value versus Future Value Risk Communication

    ERIC Educational Resources Information Center

    Charlson, M. E.; Peterson, J. C.; Boutin-Foster, C.; Briggs, W. M.; Ogedegbe, G. G.; McCulloch, C. E.; Hollenberg, J.; Wong, C.; Allegrante, J. P.

    2008-01-01

    Patients who have undergone angioplasty experience difficulty modifying at-risk behaviors for subsequent cardiac events. The purpose of this study was to test whether an innovative approach to framing of risk, based on "net present value" economic theory, would be more effective in behavioral intervention than the standard "future value approach"…

  4. Causal inference as an emerging statistical approach in neurology: an example for epilepsy in the elderly.

    PubMed

    Moura, Lidia Mvr; Westover, M Brandon; Kwasnik, David; Cole, Andrew J; Hsu, John

    2017-01-01

    The elderly population faces an increasing number of cases of chronic neurological conditions, such as epilepsy and Alzheimer's disease. Because the elderly with epilepsy are commonly excluded from randomized controlled clinical trials, there are few rigorous studies to guide clinical practice. When the elderly are eligible for trials, they either rarely participate or frequently have poor adherence to therapy, thus limiting both generalizability and validity. In contrast, large observational data sets are increasingly available, but are susceptible to bias when using common analytic approaches. Recent developments in causal inference-analytic approaches also introduce the possibility of emulating randomized controlled trials to yield valid estimates. We provide a practical example of the application of the principles of causal inference to a large observational data set of patients with epilepsy. This review also provides a framework for comparative-effectiveness research in chronic neurological conditions.

  5. Treatment of eating disorders in child and adolescent psychiatry.

    PubMed

    Herpertz-Dahlmann, Beate

    2017-11-01

    Recent research on the multimodal treatment of eating disorders in child and adolescent psychiatry has yielded a significant increase in randomized controlled trials and systematic reviews. This review aims to present relevant findings published during the last 2 years related to medical and psychological treatment of anorexia nervosa, bulimia nervosa and avoidant/restrictive food intake disorder (ARFID). For anorexia nervosa, recent reports described the efficacy of different treatment settings, lengths of hospital stay and high vs. low-calorie refeeding programmes. For both anorexia and bulimia nervosa, a number of randomized controlled trials comparing individual and family-oriented treatment approaches were published. For the newly defined ARFID, only very preliminary results on possible treatment approaches implying a multidisciplinary treatment programme were obtained. Although there is some evidence of the effectiveness of new child and adolescent psychiatric treatment approaches to eating disorders, the relapse rate remains very high, and there is an urgent need for ongoing intensive research.

  6. Implementation research design: integrating participatory action research into randomized controlled trials

    PubMed Central

    Leykum, Luci K; Pugh, Jacqueline A; Lanham, Holly J; Harmon, Joel; McDaniel, Reuben R

    2009-01-01

    Background A gap continues to exist between what is known to be effective and what is actually delivered in the usual course of medical care. The goal of implementation research is to reduce this gap. However, a tension exists between the need to obtain generalizable knowledge through implementation trials, and the inherent differences between healthcare organizations that make standard interventional approaches less likely to succeed. The purpose of this paper is to explore the integration of participatory action research and randomized controlled trial (RCT) study designs to suggest a new approach for studying interventions in healthcare settings. Discussion We summarize key elements of participatory action research, with particular attention to its collaborative, reflective approach. Elements of participatory action research and RCT study designs are discussed and contrasted, with a complex adaptive systems approach used to frame their integration. Summary The integration of participatory action research and RCT design results in a new approach that reflects not only the complex nature of healthcare organizations, but also the need to obtain generalizable knowledge regarding the implementation process. PMID:19852784

  7. A Bayesian Approach to the Paleomagnetic Conglomerate Test

    NASA Astrophysics Data System (ADS)

    Heslop, David; Roberts, Andrew P.

    2018-02-01

    The conglomerate test has served the paleomagnetic community for over 60 years as a means to detect remagnetizations. The test states that if a suite of clasts within a bed have uniformly random paleomagnetic directions, then the conglomerate cannot have experienced a pervasive event that remagnetized the clasts in the same direction. The current form of the conglomerate test is based on null hypothesis testing, which results in a binary "pass" (uniformly random directions) or "fail" (nonrandom directions) outcome. We have recast the conglomerate test in a Bayesian framework with the aim of providing more information concerning the level of support a given data set provides for a hypothesis of uniformly random paleomagnetic directions. Using this approach, we place the conglomerate test in a fully probabilistic framework that allows for inconclusive results when insufficient information is available to draw firm conclusions concerning the randomness or nonrandomness of directions. With our method, sample sets larger than those typically employed in paleomagnetism may be required to achieve strong support for a hypothesis of random directions. Given the potentially detrimental effect of unrecognized remagnetizations on paleomagnetic reconstructions, it is important to provide a means to draw statistically robust data-driven inferences. Our Bayesian analysis provides a means to do this for the conglomerate test.
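
    As a rough illustration of the logic the Bayesian recasting builds on, the sketch below (hypothetical data and function names, not the authors' code) Monte Carlo-samples uniformly random directions and compares the observed resultant length against that null distribution; this is the classical resultant-length check of randomness that the Bayesian test generalizes.

```python
import numpy as np

def monte_carlo_randomness_check(directions, n_sim=20000, seed=0):
    """Estimate P(R >= R_obs) under uniformly random directions on the sphere.

    A large p-value is consistent with uniformly random (unremagnetized) clast directions.
    """
    rng = np.random.default_rng(seed)
    dirs = np.asarray(directions, dtype=float)
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)        # force unit vectors
    n = len(dirs)
    r_obs = np.linalg.norm(dirs.sum(axis=0))                   # observed resultant length
    sims = rng.normal(size=(n_sim, n, 3))
    sims /= np.linalg.norm(sims, axis=2, keepdims=True)        # uniform directions on the sphere
    r_null = np.linalg.norm(sims.sum(axis=1), axis=1)
    return r_obs, float((r_null >= r_obs).mean())

# Hypothetical example: 10 clast directions biased toward +z (a remagnetization signature)
# give a large resultant length and hence a small p-value.
rng = np.random.default_rng(1)
clustered = rng.normal([0.0, 0.0, 3.0], 1.0, size=(10, 3))
print(monte_carlo_randomness_check(clustered))
```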

  8. Stochastic uncertainty analysis for solute transport in randomly heterogeneous media using a Karhunen‐Loève‐based moment equation approach

    USGS Publications Warehouse

    Liu, Gaisheng; Lu, Zhiming; Zhang, Dongxiao

    2007-01-01

    A new approach has been developed for solving solute transport problems in randomly heterogeneous media using the Karhunen‐Loève‐based moment equation (KLME) technique proposed by Zhang and Lu (2004). The KLME approach combines the Karhunen‐Loève decomposition of the underlying random conductivity field and the perturbative and polynomial expansions of dependent variables including the hydraulic head, flow velocity, dispersion coefficient, and solute concentration. The equations obtained in this approach are sequential, and their structure is formulated in the same form as the original governing equations such that any existing simulator, such as Modular Three‐Dimensional Multispecies Transport Model for Simulation of Advection, Dispersion, and Chemical Reactions of Contaminants in Groundwater Systems (MT3DMS), can be directly applied as the solver. Through a series of two‐dimensional examples, the validity of the KLME approach is evaluated against the classical Monte Carlo simulations. Results indicate that under the flow and transport conditions examined in this work, the KLME approach provides an accurate representation of the mean concentration. For the concentration variance, the accuracy of the KLME approach is good when the conductivity variance is 0.5. As the conductivity variance increases up to 1.0, the mismatch on the concentration variance becomes large, although the mean concentration can still be accurately reproduced by the KLME approach. Our results also indicate that when the conductivity variance is relatively large, neglecting the effects of the cross terms between velocity fluctuations and local dispersivities, as done in some previous studies, can produce noticeable errors, and a rigorous treatment of the dispersion terms becomes more appropriate.
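
    The moment-equation machinery itself is beyond a short example, but the Karhunen-Loève building block can be sketched. The snippet below (illustrative grid, variance, and correlation length; not the authors' implementation) forms a discrete KL expansion of a one-dimensional log-conductivity field with exponential covariance and reports how much variance a 20-mode truncation retains.

```python
import numpy as np

# Discrete Karhunen-Loeve expansion of a 1-D log-conductivity field Y(x) with exponential
# covariance C(x1, x2) = sigma2 * exp(-|x1 - x2| / corr_len).  All values are illustrative
# assumptions; the KLME paper couples such an expansion with sequential moment equations.
n, sigma2, corr_len = 200, 1.0, 0.2
x = np.linspace(0.0, 1.0, n)
C = sigma2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

eigval, eigvec = np.linalg.eigh(C)               # symmetric covariance -> real eigenpairs
order = np.argsort(eigval)[::-1]                 # sort modes by decreasing variance
eigval, eigvec = eigval[order], eigvec[:, order]

n_kl = 20                                        # truncation level
rng = np.random.default_rng(1)
xi = rng.standard_normal(n_kl)                   # independent standard normal KL coefficients
Y = eigvec[:, :n_kl] @ (np.sqrt(eigval[:n_kl]) * xi)   # one realization of the field

print("variance captured by 20 modes: %.1f%%" % (100 * eigval[:n_kl].sum() / eigval.sum()))
```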

  9. An effective medium approach to predict the apparent contact angle of drops on super-hydrophobic randomly rough surfaces.

    PubMed

    Bottiglione, F; Carbone, G

    2015-01-14

    The apparent contact angle of large 2D drops with randomly rough self-affine profiles is numerically investigated. The numerical approach is based upon the assumption of large separation of length scales, i.e. it is assumed that the roughness length scales are much smaller than the drop size, thus making it possible to treat the problem through a mean-field like approach relying on the large-separation of scales. The apparent contact angle at equilibrium is calculated in all wetting regimes from full wetting (Wenzel state) to partial wetting (Cassie state). It was found that for very large values of the roughness Wenzel parameter (r_W > -1/cos θ_Y, where θ_Y is the Young's contact angle), the interface approaches the perfect non-wetting condition and the apparent contact angle is almost equal to 180°. The results are compared with the case of roughness on one single scale (sinusoidal surface) and it is found that, given the same value of the Wenzel roughness parameter r_W, the apparent contact angle is much larger for the case of a randomly rough surface, proving that the multi-scale character of randomly rough surfaces is a key factor to enhance superhydrophobicity. Moreover, it is shown that for millimetre-sized drops, the actual drop pressure at static equilibrium weakly affects the wetting regime, which instead seems to be dominated by the roughness parameter. For this reason a methodology to estimate the apparent contact angle is proposed, which relies only upon the micro-scale properties of the rough surface.
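
    For orientation, the classical Wenzel relation that the roughness parameter enters can be evaluated directly. The sketch below (hypothetical Young's angle, not the paper's effective-medium calculation) shows why r_W approaching -1/cos θ_Y pushes the predicted apparent angle toward 180°.

```python
import numpy as np

def wenzel_apparent_angle(theta_y_deg, r_w):
    """Classical Wenzel relation cos(theta*) = r_w * cos(theta_Y), clipped to the physical range.

    For a hydrophobic surface (theta_Y > 90 deg), cos(theta_Y) < 0, so as r_w grows toward
    -1/cos(theta_Y) the predicted cos(theta*) approaches -1, i.e. theta* -> 180 deg.
    """
    c = r_w * np.cos(np.radians(theta_y_deg))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

theta_y = 110.0                                   # assumed Young's angle
r_limit = -1.0 / np.cos(np.radians(theta_y))      # about 2.92 for theta_Y = 110 deg
for r_w in (1.0, 2.0, r_limit, 4.0):
    print(f"r_W = {r_w:4.2f} -> apparent angle = {wenzel_apparent_angle(theta_y, r_w):6.1f} deg")
```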

  10. Logistic regression of family data from retrospective study designs.

    PubMed

    Whittemore, Alice S; Halpern, Jerry

    2003-11-01

    We wish to study the effects of genetic and environmental factors on disease risk, using data from families ascertained because they contain multiple cases of the disease. To do so, we must account for the way participants were ascertained, and for within-family correlations in both disease occurrences and covariates. We model the joint probability distribution of the covariates of ascertained family members, given family disease occurrence and pedigree structure. We describe two such covariate models: the random effects model and the marginal model. Both models assume a logistic form for the distribution of one person's covariates that involves a vector β of regression parameters. The components of β in the two models have different interpretations, and they differ in magnitude when the covariates are correlated within families. We describe ascertainment assumptions needed to estimate consistently the parameters β_RE in the random effects model and the parameters β_M in the marginal model. Under the ascertainment assumptions for the random effects model, we show that conditional logistic regression (CLR) of matched family data gives a consistent estimate β̂_RE for β_RE and a consistent estimate for the covariance matrix of β̂_RE. Under the ascertainment assumptions for the marginal model, we show that unconditional logistic regression (ULR) gives a consistent estimate for β_M, and we give a consistent estimator for its covariance matrix. The random effects/CLR approach is simple to use and to interpret, but it can use data only from families containing both affected and unaffected members. The marginal/ULR approach uses data from all individuals, but its variance estimates require special computations. A C program to compute these variance estimates is available at http://www.stanford.edu/dept/HRP/epidemiology. We illustrate these pros and cons by application to data on the effects of parity on ovarian cancer risk in mother/daughter pairs, and use simulations to study the performance of the estimates. Copyright 2003 Wiley-Liss, Inc.
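
    A minimal sketch of the random effects/CLR idea for 1:1 matched data: for mother/daughter pairs the conditional likelihood depends only on within-pair covariate differences, so the fit reduces to a no-intercept logistic model. The function names and the toy parity data below are hypothetical, not the authors' C program or data.

```python
import numpy as np
from scipy.optimize import minimize

def clr_matched_pairs(x_case, x_control):
    """Conditional logistic regression for 1:1 matched case-control pairs.

    Each pair contributes exp(b'x_case) / (exp(b'x_case) + exp(b'x_control)), which depends
    only on the within-pair covariate difference d = x_case - x_control.
    """
    d = np.asarray(x_case, float) - np.asarray(x_control, float)   # (n_pairs, n_covariates)

    def neg_loglik(beta):
        eta = d @ beta
        return np.sum(np.logaddexp(0.0, -eta))                     # -sum log(1 / (1 + exp(-eta)))

    fit = minimize(neg_loglik, x0=np.zeros(d.shape[1]), method="BFGS")
    return fit.x

# Toy example: one covariate (e.g. parity) for 200 hypothetical matched pairs.
rng = np.random.default_rng(0)
x_ctrl = rng.poisson(2, size=(200, 1)).astype(float)
x_case = x_ctrl + rng.normal(-0.3, 1.0, size=(200, 1))             # cases have slightly lower parity
print("estimated beta_RE:", clr_matched_pairs(x_case, x_ctrl))
```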

  11. First passage time: Connecting random walks to functional responses in heterogeneous environments (Invited)

    NASA Astrophysics Data System (ADS)

    Lewis, M. A.; McKenzie, H.; Merrill, E.

    2010-12-01

    In this talk I will outline first passage time analysis for animals undertaking complex movement patterns, and will demonstrate how first passage time can be used to derive functional responses in predator prey systems. The result is a new approach to understanding type III functional responses based on a random walk model. I will extend the analysis to heterogeneous environments to assess the effects of linear features on functional responses in wolves and elk using GPS tracking data.

  12. Application of the Radon-FCL approach to seismic random noise suppression and signal preservation

    NASA Astrophysics Data System (ADS)

    Meng, Fanlei; Li, Yue; Liu, Yanping; Tian, Yanan; Wu, Ning

    2016-08-01

    The fractal conservation law (FCL) is a linear partial differential equation that is modified by an anti-diffusive term of lower order. The analysis indicated that this algorithm could eliminate high frequencies and preserve or amplify low/medium-frequencies. Thus, this method is quite suitable for the simultaneous noise suppression and enhancement or preservation of seismic signals. However, the conventional FCL filters seismic data only along the time direction, thereby ignoring the spatial coherence between neighbouring traces, which leads to the loss of directional information. Therefore, we consider the development of the conventional FCL into the time-space domain and propose a Radon-FCL approach. We applied a Radon transform to implement the FCL method in this article; performing FCL filtering in the Radon domain achieves a higher level of noise attenuation. Using this method, seismic reflection events can be recovered with the sacrifice of fewer frequency components while effectively attenuating more random noise than conventional FCL filtering. Experiments using both synthetic and common shot point data demonstrate the advantages of the Radon-FCL approach versus the conventional FCL method with regard to both random noise attenuation and seismic signal preservation.

  13. Information without Implementation: A Practical Example for Developing a Best Practice Education Control Group.

    PubMed

    Balderson, Benjamin H; McCurry, Susan M; Vitiello, Michael V; Shortreed, Susan M; Rybarczyk, Bruce D; Keefe, Francis J; Korff, Michael Von

    2016-01-01

    This article considers methodology for developing an education-only control group and proposes a simple approach to designing rigorous and well-accepted control groups. This approach is demonstrated in a large randomized trial. The Lifestyles trial (n = 367) compared three group interventions: (a) cognitive-behavioral treatment (CBT) for osteoarthritis pain, (b) CBT for osteoarthritis pain and insomnia, and (c) education-only control (EOC). EOC emulated the interventions excluding hypothesized treatment components and controlling for nonspecific treatment effects. Results showed this approach resulted in a control group that was highly credible and acceptable to patients. This approach can be an effective and practical guide for developing high-quality control groups in trials of behavioral interventions.

  14. Comparison of the Predictive Performance and Interpretability of Random Forest and Linear Models on Benchmark Data Sets.

    PubMed

    Marchese Robinson, Richard L; Palczewska, Anna; Palczewski, Jan; Kidley, Nathan

    2017-08-28

    The ability to interpret the predictions made by quantitative structure-activity relationships (QSARs) offers a number of advantages. While QSARs built using nonlinear modeling approaches, such as the popular Random Forest algorithm, might sometimes be more predictive than those built using linear modeling approaches, their predictions have been perceived as difficult to interpret. However, a growing number of approaches have been proposed for interpreting nonlinear QSAR models in general and Random Forest in particular. In the current work, we compare the performance of Random Forest to those of two widely used linear modeling approaches: linear Support Vector Machines (SVMs) (or Support Vector Regression (SVR)) and partial least-squares (PLS). We compare their performance in terms of their predictivity as well as the chemical interpretability of the predictions using novel scoring schemes for assessing heat map images of substructural contributions. We critically assess different approaches for interpreting Random Forest models as well as for obtaining predictions from the forest. We assess the models on a large number of widely employed public-domain benchmark data sets corresponding to regression and binary classification problems of relevance to hit identification and toxicology. We conclude that Random Forest typically yields comparable or possibly better predictive performance than the linear modeling approaches and that its predictions may also be interpreted in a chemically and biologically meaningful way. In contrast to earlier work looking at interpretation of nonlinear QSAR models, we directly compare two methodologically distinct approaches for interpreting Random Forest models. The approaches for interpreting Random Forest assessed in our article were implemented using open-source programs that we have made available to the community. These programs are the rfFC package ( https://r-forge.r-project.org/R/?group_id=1725 ) for the R statistical programming language and the Python program HeatMapWrapper [ https://doi.org/10.5281/zenodo.495163 ] for heat map generation.
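
    A generic version of such a comparison can be set up with scikit-learn. The sketch below uses a synthetic regression set rather than the paper's benchmark data, and it does not reproduce the rfFC or HeatMapWrapper interpretation tooling.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import LinearSVR
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

# Toy stand-in for a QSAR regression set: 500 "compounds", 50 "descriptors".
X, y = make_regression(n_samples=500, n_features=50, n_informative=15, noise=10.0, random_state=0)

models = {
    "Random Forest": RandomForestRegressor(n_estimators=300, random_state=0),
    "Linear SVR": LinearSVR(C=1.0, max_iter=10000, random_state=0),
    "PLS (10 components)": PLSRegression(n_components=10),
}
for name, model in models.items():
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name:20s} mean CV R^2 = {r2.mean():.3f}")
```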

  15. Rating the raters in a mixed model: An approach to deciphering the rater reliability

    NASA Astrophysics Data System (ADS)

    Shang, Junfeng; Wang, Yougui

    2013-05-01

    Rating the raters has attracted extensive attention in recent years. Ratings are quite complex in that the subjective assessment and a number of criteria are involved in a rating system. Whenever the human judgment is a part of ratings, the inconsistency of ratings is the source of variance in scores, and it is therefore quite natural for people to verify the trustworthiness of ratings. Accordingly, estimation of the rater reliability will be of great interest and an appealing issue. To facilitate the evaluation of the rater reliability in a rating system, we propose a mixed model where the scores of the ratees offered by a rater are described with the fixed effects determined by the ability of the ratees and the random effects produced by the disagreement of the raters. In such a mixed model, for the rater random effects, we derive its posterior distribution for the prediction of random effects. To quantitatively make a decision in revealing the unreliable raters, the predictive influence function (PIF) serves as a criterion which compares the posterior distributions of random effects between the full data and rater-deleted data sets. The benchmark for this criterion is also discussed. This proposed methodology of deciphering the rater reliability is investigated in the multiple simulated and two real data sets.
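
    A toy analogue of the model (not the paper's exact predictive influence function) can be written with a normal-normal rater effect: center scores within each ratee, shrink each rater's mean residual toward zero to obtain a posterior rater effect, and flag the rater whose removal moves the summary most. All parameter values below are illustrative assumptions.

```python
import numpy as np

# Toy analogue: score_ij = ability_i (fixed) + rater_j (random, N(0, tau2)) + noise (N(0, s2)).
rng = np.random.default_rng(2)
n_ratees, n_raters, tau2, s2 = 30, 6, 0.5, 1.0
ability = rng.normal(0, 1, n_ratees)
rater_fx = rng.normal(0, np.sqrt(tau2), n_raters)
rater_fx[-1] += 2.0                                    # one deliberately biased/unreliable rater
scores = ability[:, None] + rater_fx[None, :] + rng.normal(0, np.sqrt(s2), (n_ratees, n_raters))

# Centering each row removes the ratee ability, leaving each rater's effect relative to the
# average rater plus noise; the normal-normal posterior mean then shrinks the mean residual.
resid = scores - scores.mean(axis=1, keepdims=True)
shrink = tau2 / (tau2 + s2 / n_ratees)                 # approximate shrinkage factor
post_mean = shrink * resid.mean(axis=0)

# Crude influence measure: shift of the average rater effect when rater j is deleted.
influence = [abs(post_mean.mean() - np.delete(post_mean, j).mean()) for j in range(n_raters)]
print("posterior rater effects:", np.round(post_mean, 2))
print("most suspect rater index:", int(np.argmax(influence)))
```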

  16. A dynamic programming-based particle swarm optimization algorithm for an inventory management problem under uncertainty

    NASA Astrophysics Data System (ADS)

    Xu, Jiuping; Zeng, Ziqiang; Han, Bernard; Lei, Xiao

    2013-07-01

    This article presents a dynamic programming-based particle swarm optimization (DP-based PSO) algorithm for solving an inventory management problem for large-scale construction projects under a fuzzy random environment. By taking into account the purchasing behaviour and strategy under rules of international bidding, a multi-objective fuzzy random dynamic programming model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform fuzzy random parameters into fuzzy variables that are subsequently defuzzified by using an expected value operator with optimistic-pessimistic index. The iterative nature of the authors' model motivates them to develop a DP-based PSO algorithm. More specifically, their approach treats the state variables as hidden parameters. This in turn eliminates many redundant feasibility checks during initialization and particle updates at each iteration. Results and sensitivity analysis are presented to highlight the performance of the authors' optimization method, which is very effective as compared to the standard PSO algorithm.
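
    For readers unfamiliar with the underlying heuristic, the sketch below shows the plain PSO velocity/position update on a toy function. It is only the generic core that the authors extend with dynamic programming and fuzzy random parameters, not their algorithm.

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain particle swarm optimization on the box [-5, 5]^dim (generic core only)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))           # positions
    v = np.zeros_like(x)                                  # velocities
    pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[np.argmin(pbest_val)].copy()                # global best position
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, -5, 5)
        val = np.apply_along_axis(f, 1, x)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, pbest_val.min()

# Example: minimize the sphere function in 4 dimensions.
best_x, best_val = pso_minimize(lambda z: float(np.sum(z ** 2)), dim=4)
print(best_val)
```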

  17. Active music therapy approach in amyotrophic lateral sclerosis: a randomized-controlled trial.

    PubMed

    Raglio, Alfredo; Giovanazzi, Elena; Pain, Debora; Baiardi, Paola; Imbriani, Chiara; Imbriani, Marcello; Mora, Gabriele

    2016-12-01

    This randomized controlled study assessed the efficacy of active music therapy (AMT) on anxiety, depression, and quality of life in amyotrophic lateral sclerosis (ALS). Communication and relationship during AMT treatment were also evaluated. Thirty patients were assigned randomly to experimental [AMT plus standard of care (SC)] or control (SC) groups. AMT consisted of 12 sessions (three times a week), whereas the SC treatment was based on physical and speech rehabilitation sessions, occupational therapy, and psychological support. ALS Functional Rating Scale-Revised, Hospital Anxiety and Depression Scale, McGill Quality of Life Questionnaire, and Music Therapy Rating Scale were administered to assess functional, psychological, and music therapy outcomes. The AMT group improved significantly in McGill Quality of Life Questionnaire global scores (P=0.035) and showed a positive trend in nonverbal and sonorous-music relationship during the treatment. Further studies involving larger samples in a longer AMT intervention are needed to confirm the effectiveness of this approach in ALS.

  18. Charged-particle therapy in cancer: clinical uses and future perspectives.

    PubMed

    Durante, Marco; Orecchia, Roberto; Loeffler, Jay S

    2017-08-01

    Radiotherapy with high-energy charged particles has become an attractive therapeutic option for patients with several tumour types because this approach better spares healthy tissue from radiation than conventional photon therapy. The cost associated with the delivery of charged particles, however, is higher than that of even the most elaborate photon-delivery technologies. Reliable evidence of the relative cost-effectiveness of both modalities can only come from the results of randomized clinical trials. Thus, the hurdles that currently limit direct comparisons of these two approaches in clinical trials, especially those related to insurance coverage, should be removed. Herein, we review several randomized trials of charged-particle therapies that are ongoing, with results that will enable selective delivery to patients who are most likely to benefit from them. We also discuss aspects related to radiobiology, including the immune response and hypoxia, which will need to be taken into consideration in future randomized trials to fully exploit the potential of charged particles.

  19. Effects of traditional and discovery instructional approaches on learning outcomes for learners of different intellectual development: A study of chemistry students in Zambia

    NASA Astrophysics Data System (ADS)

    Mulopo, Moses M.; Seymour Fowler, H.

    This study examined the differential effectiveness of traditional and discovery methods of instruction for the teaching of science concepts, understandings about science, and scientific attitudes, to learners at the concrete and formal level of cognitive development. The dependent variables were achievement, understanding science, and scientific attitude, assessed through the use of the ACS Achievement Test (high school chemistry, Form 1979), the Test on Understanding Science (Form W), and the Test on Scientific Attitude, respectively. Mode of instruction and cognitive development were the independent variables. Subjects were 120 Form IV (11th grade) males enrolled in chemistry classes in Lusaka, Zambia. Sixty of these were concrete reasoners (mean age = 18.23) randomly selected from one of the two schools. The remaining 60 subjects were formal reasoners (mean age = 18.06) randomly selected from a second boys' school. Each of these two groups was randomly split into two subgroups of 30 subjects each. Traditional and discovery approaches were randomly assigned to the two subgroups of concrete reasoners and to the two subgroups of formal reasoners. Prior to instruction, the subjects were pretested using the ACS Achievement Test, the Test on Understanding Science, and the Test on Scientific Attitude. Subjects received instruction covering eight chemistry topics during approximately 10 weeks. Posttests followed using the same standard tests. Two-way analysis of covariance, with pretest scores serving as covariates, was used, and the 0.05 level of significance was adopted. The Tukey WSD technique was used as a follow-up test where applicable. It was found that (1) for the formal reasoners, the discovery group earned significantly higher understanding science scores than the traditional group. For the concrete reasoners, mode of instruction did not make a difference; (2) overall, formal reasoners earned significantly higher achievement scores than concrete reasoners; (3) in general, subjects taught by the discovery approach earned significantly higher scientific attitude scores than those taught by the traditional approach. The traditional group outperformed the discovery group in achievement scores. It was concluded that the traditional approach might be an efficient instructional mode for the teaching of scientific facts and principles to high school students, while the discovery approach seemed to be more suitable for teaching scientific attitudes and for promoting understanding about science and scientists among formal operational learners.

  20. Assessing the Effect of Cooperative Learning on Financial Accounting Achievement among Secondary School Students

    ERIC Educational Resources Information Center

    Inuwa, Umar; Abdullah, Zarifah; Hassan, Haslinda

    2017-01-01

    This study examined the effect of a cooperative learning approach on financial accounting achievement among secondary school students in Gombe state, Nigeria. A pre-test-post-test-control group design was adopted. The 120 students who participated in the study were selected randomly from six schools. The students were divided into two equal groups, namely:…

  1. The Effects of Parent Training on Knowledge of Transition Services for Students with Disabilities

    ERIC Educational Resources Information Center

    Young, John; Morgan, Robert L.; Callow-Heusser, Catherine A.; Lindstrom, Lauren

    2016-01-01

    This study examined effects of two parent-training approaches to increase knowledge of transition resources by (a) giving parents a brochure describing local transition services or (b) providing the same brochure plus 60 min of small-group training. We randomly assigned parents to groups who completed pre- and posttests on knowledge of transition…

  2. The Effect of Content-Focused Coaching on the Quality of Classroom Text Discussions

    ERIC Educational Resources Information Center

    Matsumura, Lindsay Clare; Garnier, Helen E.; Spybrook, Jessaca

    2012-01-01

    This study examines the effect of a comprehensive literacy-coaching program focused on enacting a discussion-based approach to reading comprehension instruction (content-focused coaching [CFC]) on the quality of classroom text discussions over 2 years. The study used a cluster-randomized trial in which schools were assigned to either CFC or…

  3. "Familias: Preparando La Nueva Generación": A Randomized Control Trial Testing the Effects on Positive Parenting Practices

    ERIC Educational Resources Information Center

    Marsiglia, Flavio F.; Williams, Lela Rankin; Ayers, Stephanie L.; Booth, Jaime M.

    2014-01-01

    Objectives: This article reports the effects of a culturally grounded parenting intervention to strengthen positive parenting practices. Method: The intervention was designed and tested with primarily Mexican origin parents in a large urban setting of the southwestern United States using an ecodevelopmental approach. Parents (N = 393) were…

  4. Impulsivity moderates the effect of approach bias modification on healthy food consumption.

    PubMed

    Kakoschke, Naomi; Kemps, Eva; Tiggemann, Marika

    2017-10-01

    The study aimed to modify approach bias for healthy and unhealthy food and to determine its effect on subsequent food consumption. In addition, we investigated the potential moderating role of impulsivity in the effect of approach bias re-training on food consumption. Participants were 200 undergraduate women (17-26 years) who were randomly allocated to one of five conditions of an approach-avoidance task varying in the training of an approach bias for healthy food, unhealthy food, and non-food cues in a single session of 10 min. Outcome variables were approach bias for healthy and unhealthy food and the proportion of healthy relative to unhealthy snack food consumed. As predicted, approach bias for healthy food significantly increased in the 'avoid unhealthy food/approach healthy food' condition. Importantly, the effect of training on snack consumption was moderated by trait impulsivity. Participants high in impulsivity consumed a greater proportion of healthy snack food following the 'avoid unhealthy food/approach healthy food' training. This finding supports the suggestion that automatic processing of appetitive cues has a greater influence on consumption behaviour in individuals with poor self-regulatory control. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Random mechanics: Nonlinear vibrations, turbulences, seisms, swells, fatigue

    NASA Astrophysics Data System (ADS)

    Kree, P.; Soize, C.

    The random modeling of physical phenomena, together with probabilistic methods for the numerical calculation of random mechanical forces, are analytically explored. Attention is given to theoretical examinations such as probabilistic concepts, linear filtering techniques, and trajectory statistics. Applications of the methods to structures experiencing atmospheric turbulence, the quantification of turbulence, and the dynamic responses of the structures are considered. A probabilistic approach is taken to study the effects of earthquakes on structures and to the forces exerted by ocean waves on marine structures. Theoretical analyses by means of vector spaces and stochastic modeling are reviewed, as are Markovian formulations of Gaussian processes and the definition of stochastic differential equations. Finally, random vibrations with a variable number of links and linear oscillators undergoing the square of Gaussian processes are investigated.

  6. Randomized Controlled Trial of Supplemental Augmentative and Alternative Communication versus Voice Rest Alone after Phonomicrosurgery

    PubMed Central

    Rousseau, Bernard; Gutmann, Michelle L.; Mau, I-fan Theodore; Francis, David O.; Johnson, Jeffrey P.; Novaleski, Carolyn K.; Vinson, Kimberly N.; Garrett, C. Gaelyn

    2015-01-01

    Objective This randomized trial investigated voice rest and supplemental text-to-speech communication versus voice rest alone on visual analog scale measures of communication effectiveness and magnitude of voice use. Study Design Randomized clinical trial. Setting Multicenter outpatient voice clinics. Subjects Thirty-seven patients undergoing phonomicrosurgery. Methods Patients undergoing phonomicrosurgery were randomized to voice rest and supplemental text-to-speech communication or voice rest alone. The primary outcome measure was the impact of voice rest on ability to communicate effectively over a seven-day period. Pre- and post-operative magnitude of voice use was also measured as an observational outcome. Results Patients randomized to voice rest and supplemental text-to-speech communication reported higher median communication effectiveness on each post-operative day compared to those randomized to voice rest alone, with significantly higher median communication effectiveness on post-operative day 3 (p = 0.03) and 5 (p = 0.01). Magnitude of voice use did not differ on any pre-operative (p > 0.05) or post-operative day (p > 0.05), nor did patients significantly decrease voice use as the surgery date approached (p > 0.05). However, there was a significant reduction in median voice use pre- to post-operatively across patients (p < 0.001) with median voice use ranging from 0–3 throughout the post-operative week. Conclusion Supplemental text-to-speech communication increased patient perceived communication effectiveness on post-operative days 3 and 5 over voice rest alone. With the prevalence of smartphones and the widespread use of text messaging, supplemental text-to-speech communication may provide an accessible and cost-effective communication option for patients on vocal restrictions. PMID:25605690

  7. Functionality, Complexity, and Approaches to Assessment of Resilience Under Constrained Energy and Information

    DTIC Science & Technology

    2015-03-26

    albeit powerful, method available for exploring CAS. As discussed above, there are many useful mathematical tools appropriate for CAS modeling. Agent-based...cells, telephone calls, and sexual contacts approach power-law distributions. [48] Networks in general are robust against random failures, but...targeted failures can have powerful effects – provided the targeter has a good understanding of the network structure. Some argue (convincingly) that all

  8. Effects of Systemic Therapy on Mental Health of Children and Adolescents: A Meta-Analysis.

    PubMed

    Riedinger, Verena; Pinquart, Martin; Teubert, Daniela

    2017-01-01

    Systemic therapy is a frequently used form of psychotherapy for the treatment of mental disorders in children and adolescents. The present study reports the results of the first meta-analysis on the effects of systemic treatment of mental disorders and behavior problems in children and adolescents. Based on systematic search in electronic databases (PsycINFO, Psyndex, PubMed, ISI Web of Knowledge, CINAHL), k = 56 randomized, controlled trials met the inclusion criteria. We computed a random-effects meta-analysis. Systemic therapy showed small-to-medium effects in comparison with an untreated control group (posttest: k = 7, g = .59 standard deviation units, follow-up: k = 2, g = .27) and alternative treatment (posttest: k = 43, g = .32, follow-up: k = 38, g = .28). At follow-up, longer interventions produced larger effect sizes. No other moderator effects were identified. Although available randomized, controlled trials show convincing results, their effects refer to a limited number of systemic approaches and mental disorders, and also pertain to adolescents rather than younger children. Thus, more research is needed before more general conclusions about the effects of systemic therapy can be drawn.
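
    A random-effects pooling of this kind is commonly computed with the DerSimonian-Laird estimator. The sketch below uses made-up effect sizes and variances purely to illustrate the mechanics, not the review's data.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate using the DerSimonian-Laird tau^2 estimator."""
    y, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / v                                            # fixed-effect (inverse-variance) weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)                     # Cochran's Q
    df = len(y) - 1
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_star = 1.0 / (v + tau2)                              # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, tau2, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical standardized mean differences (Hedges g) and their variances from 6 trials.
g = [0.45, 0.80, 0.20, 0.65, 0.10, 0.55]
v = [0.04, 0.09, 0.05, 0.12, 0.03, 0.07]
print(dersimonian_laird(g, v))
```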

  9. Applying the Transtheoretical Model to evaluate the effect of a call-recall program in enhancing Pap smear practice: a cluster randomized trial.

    PubMed

    Abdullah, Fauziah; Su, Tin Tin

    2013-01-01

    The objective of this study was to evaluate the effect of a call-recall approach in enhancing Pap smear practice through changes in motivation stage among non-compliant women. A cluster randomized controlled trial with a parallel, unblinded design was conducted between January and November 2010 in 40 public secondary schools in Malaysia among 403 female teachers who never or infrequently attended for a Pap test. A cluster randomization was applied in assigning schools to both groups. An intervention group received an invitation and reminder (call-recall program) for a Pap test (20 schools with 201 participants), while the control group received usual care from the existing cervical screening program (20 schools with 202 participants). Multivariate logistic regression was performed to determine the effect of the intervention program on the action stage (Pap smear uptake) at 24 weeks. In both groups, the pre-contemplation stage accounted for the highest proportion of stage changes. At 24 weeks, the intervention group had more than twice the odds of being in the action stage compared with the control group (adjusted odds ratio 2.44, 95% CI 1.29-4.62). The positive effect of a call-recall approach in motivating women to change the behavior of screening practice should be appreciated by policy makers and health care providers in developing countries as an intervention to enhance Pap smear uptake. Copyright © 2013 Elsevier Inc. All rights reserved.

  10. NIMROD: a program for inference via a normal approximation of the posterior in models with random effects based on ordinary differential equations.

    PubMed

    Prague, Mélanie; Commenges, Daniel; Guedj, Jérémie; Drylewicz, Julia; Thiébaut, Rodolphe

    2013-08-01

    Models based on ordinary differential equations (ODE) are widespread tools for describing dynamical systems. In biomedical sciences, data from each subject can be sparse, making it difficult to estimate individual parameters precisely by standard non-linear regression, but information can often be gained from between-subjects variability. This makes the use of mixed-effects models to estimate population parameters natural. Although the maximum likelihood approach is a valuable option, identifiability issues favour Bayesian approaches which can incorporate prior knowledge in a flexible way. However, the combination of difficulties coming from the ODE system and from the presence of random effects raises a major numerical challenge. Computations can be simplified by making a normal approximation of the posterior to find the maximum of the posterior distribution (MAP). Here we present the NIMROD program (normal approximation inference in models with random effects based on ordinary differential equations) devoted to MAP estimation in ODE models. We describe the specific implemented features such as convergence criteria and an approximation of the leave-one-out cross-validation to assess the model quality of fit. First, in pharmacokinetic models, we evaluate the properties of this algorithm and compare it with FOCE and MCMC algorithms in simulations. Then, we illustrate the use of NIMROD on Amprenavir pharmacokinetics data from the PUZZLE clinical trial in HIV infected patients. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
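
    A stripped-down analogue of MAP estimation in an ODE mixed model (one subject, one random effect; all values hypothetical and unrelated to NIMROD or the PUZZLE data) can be written with SciPy:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize_scalar

# Toy analogue: dC/dt = -k * C with subject-specific k = exp(theta + b), b ~ N(0, omega2),
# and noisy concentration measurements y_t = C(t) + N(0, sigma2).
theta, omega2, sigma2, c0 = np.log(0.3), 0.2, 0.05, 10.0
t_obs = np.array([1.0, 2.0, 4.0, 8.0])
y_obs = np.array([7.0, 5.2, 2.9, 0.9])         # hypothetical data for one subject

def predicted(b):
    k = np.exp(theta + b)
    sol = solve_ivp(lambda t, c: -k * c, (0.0, t_obs[-1]), [c0], t_eval=t_obs)
    return sol.y[0]

def neg_log_posterior(b):
    resid = y_obs - predicted(b)
    return 0.5 * np.sum(resid ** 2) / sigma2 + 0.5 * b ** 2 / omega2

map_fit = minimize_scalar(neg_log_posterior, bounds=(-3.0, 3.0), method="bounded")
print("MAP estimate of the random effect b:", round(map_fit.x, 3))
```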

  11. How to derive biological information from the value of the normalization constant in allometric equations.

    PubMed

    Kaitaniemi, Pekka

    2008-04-09

    Allometric equations are widely used in many branches of biological science. The potential information content of the normalization constant b in allometric equations of the form Y = bX^a has, however, remained largely neglected. To demonstrate the potential for utilizing this information, I generated a large number of artificial datasets that resembled those that are frequently encountered in biological studies, i.e., relatively small samples including measurement error or uncontrolled variation. The value of X was allowed to vary randomly within the limits describing different data ranges, and a was set to a fixed theoretical value. The constant b was set to a range of values describing the effect of a continuous environmental variable. In addition, a normally distributed random error was added to the values of both X and Y. Two different approaches were then used to model the data. The traditional approach estimated both a and b using a regression model, whereas an alternative approach set the exponent a at its theoretical value and only estimated the value of b. Both approaches produced virtually the same model fit with less than 0.3% difference in the coefficient of determination. Only the alternative approach was able to precisely reproduce the effect of the environmental variable, which was largely lost among noise variation when using the traditional approach. The results show how the value of b can be used as a source of valuable biological information if an appropriate regression model is selected.
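
    The two fitting strategies are easy to contrast on simulated data. In the sketch below (hypothetical exponent, sample size, and noise levels) the "alternative" fit fixes the exponent a at its theoretical value and estimates only b, as the abstract describes.

```python
import numpy as np

# Simulate Y = b * X^a with a fixed theoretical exponent and noise on both X and Y, then
# compare (1) fitting both a and b with (2) fixing a and estimating b only.
rng = np.random.default_rng(3)
a_true, b_true = 0.75, 2.3                           # assumed exponent; b carries the signal
x = np.exp(rng.uniform(0.0, 1.5, 60)) + rng.normal(0, 0.05, 60)
y = b_true * x ** a_true * np.exp(rng.normal(0, 0.1, 60))

log_x, log_y = np.log(x), np.log(y)

# (1) traditional: estimate slope (a) and intercept (log b) by least squares on the log-log scale
a_hat, logb_hat = np.polyfit(log_x, log_y, 1)

# (2) alternative: fix a at its theoretical value and estimate only log b
logb_fixed = np.mean(log_y - a_true * log_x)

print(f"true b = {b_true:.3f}")
print(f"free exponent : a = {a_hat:.3f}, b = {np.exp(logb_hat):.3f}")
print(f"fixed exponent: a = {a_true:.3f}, b = {np.exp(logb_fixed):.3f}")
```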

  12. [The effectiveness of psychosocial treatment approaches for alcohol dependence--a review].

    PubMed

    Bottlender, M; Köhler, J; Soyka, M

    2006-01-01

    Treatment approaches which are used in the context of inpatient alcoholism treatment are frequently neither theoretically justified nor empirically examined. In view of the enormous variety of methods, treatment guidelines need to be developed to give practitioners and pension funds indicators of which treatments are likely to succeed. As a first step, it must be examined which treatments are effective, which are ineffective, and which are possibly even counterproductive. This article aims to give a comprehensive review of randomized controlled studies and meta-analyses on the efficacy of different treatment approaches. This article reporting the literature review is part of a larger programme to develop clinical practice guidelines for rehabilitation which is supported in form, content and finance by the German Pension Institute for Salaried Employees (Bundesversicherungsanstalt für Angestellte, BfA). Summing up, treatment is effective compared to no treatment. Cognitive behavioural therapy included in a multimodal treatment program is effective. There are a number of treatment protocols for which controlled research has consistently found positive results, such as social skills training, community reinforcement approaches, behaviour contracting, motivation-enhancing treatment, and family/marital therapy. There are also a number of commonly used treatment approaches that either showed no positive result or were counterproductive, such as relapse prevention, non-behavioural marital therapy, insight psychotherapy, confrontational counseling, education, relaxation training, and milieu therapy. Support for matching to a specific treatment is weak, but interventions against alcohol problems should be differentiated according to the severity of the alcohol problem. Since treatment evaluation is conducted mainly in the US, and care delivery structures in the US and Germany differ substantially, any wholesale transfer of US research results to Germany should be viewed with caution. Randomized controlled studies are needed in Germany.

  13. The effectiveness of a voice treatment approach for teachers with self-reported voice problems.

    PubMed

    Gillivan-Murphy, Patricia; Drinnan, Michael J; O'Dwyer, Tadhg P; Ridha, Hayder; Carding, Paul

    2006-09-01

    Teachers are considered the professional group most at risk of developing voice problems, but evidence on treatment effectiveness is limited. We studied prospectively the effectiveness of a 6-week combined treatment approach using vocal function exercises (VFEs) and vocal hygiene (VH) education with 20 teachers with self-reported voice problems. Twenty subjects were randomly assigned to a no-treatment control group (n = 11) or a treatment group (n = 9). Fibreoptic endoscopic evaluation was carried out on all subjects before randomization. Two self-report voice outcome measures were used: the Voice-Related Quality of Life (VRQOL) and the Voice Symptom Severity Scale (VoiSS). A Voice Care Knowledge Visual Analogue Scale (VAS), developed specifically for the study, was also used to evaluate change in selected voice knowledge areas. A Student unpaired t test revealed a statistically significant (P < 0.05) improvement in the treatment group as measured by the VoiSS. There was not a significant improvement in the treatment group as measured by the VRQOL. The difference in voice care knowledge areas was also significant for the treatment group (P < 0.05). This study suggests that a voice treatment approach of VFEs and VH education improved self-reported voice symptoms and voice care knowledge in a group of teachers.

  14. Effects of a social accountability approach, CARE's Community Score Card, on reproductive health-related outcomes in Malawi: A cluster-randomized controlled evaluation.

    PubMed

    Gullo, Sara; Galavotti, Christine; Sebert Kuhlmann, Anne; Msiska, Thumbiko; Hastings, Phil; Marti, C Nathan

    2017-01-01

    Social accountability approaches, which emphasize mutual responsibility and accountability by community members, health care workers, and local health officials for improving health outcomes in the community, are increasingly being employed in low-resource settings. We evaluated the effects of a social accountability approach, CARE's Community Score Card (CSC), on reproductive health outcomes in Ntcheu district, Malawi using a cluster-randomized control design. We matched 10 pairs of communities, randomly assigning one from each pair to intervention and control arms. We conducted two independent cross-sectional surveys of women who had given birth in the last 12 months, at baseline and at two years post-baseline. Using difference-in-difference (DiD) and local average treatment effect (LATE) estimates, we evaluated the effects on outcomes including modern contraceptive use, antenatal and postnatal care service utilization, and service satisfaction. We also evaluated changes in indicators developed by community members and service providers in the intervention areas. DiD analyses showed significantly greater improvements in the proportion of women receiving a home visit during pregnancy (B = 0.20, P < .01), receiving a postnatal visit (B = 0.06, P = .01), and overall service satisfaction (B = 0.16, P < .001) in intervention compared to control areas. LATE analyses estimated significant effects of the CSC intervention on home visits by health workers (114% higher in intervention compared to control) (B = 1.14, P < .001) and current use of modern contraceptives (57% higher) (B = 0.57, P < .01). All 13 community- and provider-developed indicators improved, with 6 of them showing significant improvements. By facilitating the relationship between community members, health service providers, and local government officials, the CSC contributed to important improvements in reproductive health-related outcomes. Further, the CSC builds mutual accountability, and ensures that solutions to problems are locally-relevant, locally-supported and feasible to implement.
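
    The headline difference-in-differences quantity is just a double difference of group means. The sketch below uses invented proportions solely to show the arithmetic, not the study's estimates or its adjustment model.

```python
# Generic difference-in-differences from cross-sectional group means (illustrative numbers only):
# DiD = (treat_post - treat_pre) - (control_post - control_pre).
means = {
    ("intervention", "baseline"): 0.25,   # e.g. proportion of women receiving a home visit
    ("intervention", "endline"):  0.50,
    ("control",      "baseline"): 0.27,
    ("control",      "endline"):  0.32,
}
did = (means[("intervention", "endline")] - means[("intervention", "baseline")]) \
    - (means[("control", "endline")] - means[("control", "baseline")])
print(f"difference-in-differences estimate: {did:+.2f}")   # +0.20 with these made-up numbers
```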

  15. Time-varying SMART design and data analysis methods for evaluating adaptive intervention effects.

    PubMed

    Dai, Tianjiao; Shete, Sanjay

    2016-08-30

    In a standard two-stage SMART design, the intermediate response to the first-stage intervention is measured at a fixed time point for all participants. Subsequently, responders and non-responders are re-randomized and the final outcome of interest is measured at the end of the study. To reduce the side effects and costs associated with first-stage interventions in a SMART design, we proposed a novel time-varying SMART design in which individuals are re-randomized to the second-stage interventions as soon as a pre-fixed intermediate response is observed. With this strategy, the duration of the first-stage intervention will vary. We developed a time-varying mixed effects model and a joint model that allows for modeling the outcomes of interest (intermediate and final) and the random durations of the first-stage interventions simultaneously. The joint model borrows strength from the survival sub-model in which the duration of the first-stage intervention (i.e., time to response to the first-stage intervention) is modeled. We performed a simulation study to evaluate the statistical properties of these models. Our simulation results showed that the two modeling approaches were both able to provide good estimations of the means of the final outcomes of all the embedded interventions in a SMART. However, the joint modeling approach was more accurate for estimating the coefficients of first-stage interventions and time of the intervention. We conclude that the joint modeling approach provides more accurate parameter estimates and a higher estimated coverage probability than the single time-varying mixed effects model, and we recommend the joint model for analyzing data generated from time-varying SMART designs. In addition, we showed that the proposed time-varying SMART design is cost-efficient and equally effective in selecting the optimal embedded adaptive intervention as the standard SMART design.

  16. Selective randomized load balancing and mesh networks with changing demands

    NASA Astrophysics Data System (ADS)

    Shepherd, F. B.; Winzer, P. J.

    2006-05-01

    We consider the problem of building cost-effective networks that are robust to dynamic changes in demand patterns. We compare several architectures using demand-oblivious routing strategies. Traditional approaches include single-hop architectures based on a (static or dynamic) circuit-switched core infrastructure and multihop (packet-switched) architectures based on point-to-point circuits in the core. To address demand uncertainty, we seek minimum cost networks that can carry the class of hose demand matrices. Apart from shortest-path routing, Valiant's randomized load balancing (RLB), and virtual private network (VPN) tree routing, we propose a third, highly attractive approach: selective randomized load balancing (SRLB). This is a blend of dual-hop hub routing and randomized load balancing that combines the advantages of both architectures in terms of network cost, delay, and delay jitter. In particular, we give empirical analyses for the cost (in terms of transport and switching equipment) for the discussed architectures, based on three representative carrier networks. Of these three networks, SRLB maintains the resilience properties of RLB while achieving significant cost reduction over all other architectures, including RLB and multihop Internet protocol/multiprotocol label switching (IP/MPLS) networks using VPN-tree routing.
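
    The core property of plain randomized load balancing under the hose model can be checked numerically: splitting every flow equally over all N intermediates bounds each directed link's load by 2r/N. The sketch below demonstrates that bound on a random hose-feasible traffic matrix; it covers only the RLB ingredient, not the selective (SRLB) hub restriction or the cost modelling in the paper.

```python
import numpy as np

# Two-phase (Valiant) randomized load balancing on an N-node full mesh under the hose model
# (each node sends and receives at most r): every flow is split equally over all N
# intermediate nodes, so each directed link carries at most 2r/N regardless of the matrix.
rng = np.random.default_rng(4)
N, r = 10, 1.0

T = rng.random((N, N))                                     # random demand matrix
np.fill_diagonal(T, 0.0)
T *= r / max(T.sum(axis=1).max(), T.sum(axis=0).max())     # scale so row/column sums respect r

link_load = np.zeros((N, N))
for s in range(N):
    for d in range(N):
        for m in range(N):                                 # phase 1: s -> m, phase 2: m -> d
            if m != s:
                link_load[s, m] += T[s, d] / N
            if m != d:
                link_load[m, d] += T[s, d] / N

print("max per-link load:", round(float(link_load.max()), 3))
print("RLB bound 2r/N   :", round(2 * r / N, 3))
```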

  17. Mixed models approaches for joint modeling of different types of responses.

    PubMed

    Ivanova, Anna; Molenberghs, Geert; Verbeke, Geert

    2016-01-01

    In many biomedical studies, one jointly collects longitudinal continuous, binary, and survival outcomes, possibly with some observations missing. Random-effects models, sometimes called shared-parameter models or frailty models, have received a lot of attention. In such models, the corresponding variance components can be employed to capture the association between the various sequences. In some cases, random effects are considered common to various sequences, perhaps up to a scaling factor; in others, there are different but correlated random effects. Even though a variety of data types has been considered in the literature, less attention has been devoted to ordinal data. For univariate longitudinal or hierarchical data, the proportional odds mixed model (POMM) is an instance of the generalized linear mixed model (GLMM; Breslow and Clayton, 1993). Ordinal data are conveniently replaced by a parsimonious set of dummies, which in the longitudinal setting leads to a repeated set of dummies. When ordinal longitudinal data are part of a joint model, the complexity increases further. This is the setting considered in this paper. We formulate a random-effects-based model that, in addition, allows for overdispersion. Using two case studies, it is shown that the combination of random effects to capture association with further correction for overdispersion can improve the model's fit considerably and that the resulting models make it possible to answer research questions that could not be addressed otherwise. Parameters can be estimated in a fairly straightforward way, using the SAS procedure NLMIXED.

  18. The Coach2Move Approach: Development and Acceptability of an Individually Tailored Physical Therapy Strategy to Increase Activity Levels in Older Adults With Mobility Problems.

    PubMed

    de Vries, Nienke M; van Ravensberg, C Dorine; Hobbelen, Johannes S M; van der Wees, Philip J; Olde Rikkert, Marcel G M; Staal, J Bart; Nijhuis-van der Sanden, Maria W G

    2015-01-01

    Despite the positive effects of physical activity on numerous aspects of health, many older adults remain sedentary even after participating in physical activity interventions. Standardized exercise programs do not necessarily bring about the behavioral change that is necessary. Therefore, a patient-centered approach is needed. The purpose of this study was to develop and assess the acceptability and potential effectiveness of the Coach2Move strategy: a physical therapy (PT) approach aimed at improving the long-term level of physical activity in mobility-limited older adults. The Coach2Move strategy was developed on the basis of 2 systematic literature studies and expert consultations. Multiple focus group meetings and a Delphi procedure were organized to gain consensus on the Coach2Move strategy. Acceptability and potential effectiveness were studied in a pilot study with a pre-/postdesign in which 2 physical therapists and 12 patients participated. To assess acceptability, patients were interviewed, discussions were held with the involved physical therapists, and health records were studied. Potential effectiveness was tested by measuring the level of physical activity, frailty, quality of life, and mobility before and after treatment. On the basis of the literature study and expert consultations, an algorithm based on the Hypothesis Oriented Algorithm for Clinicians Part II was developed: the Coach2Move approach. Key elements of the Coach2Move approach include an extensive intake using motivational interviewing, clinical reasoning, coaching to increase physical activity and self-management, focusing on meaningful activities, and working according to 3 patient-tailored intervention profiles with a predefined number of sessions. The pilot study showed high appraisal of the strategy by both physical therapists and patients. Moreover, a potential effect on the level of physical activity, frailty, quality of life, and mobility was observed. Because the pilot study was not randomized or controlled and included a small sample, no conclusions can be drawn about the effectiveness of the Coach2Move strategy. However, all suggestions made in this study were implemented in an ongoing, randomized controlled trial in which the Coach2Move strategy will be compared to usual care PT. In conclusion, the Coach2Move strategy can be considered acceptable in PT practice and showed potential benefits. The results on the (cost-)effectiveness of this strategy based on a large, randomized, controlled trial are expected in 2014.

  19. A Randomized Controlled Trial Comparing the McKenzie Method to Motor Control Exercises in People With Chronic Low Back Pain and a Directional Preference.

    PubMed

    Halliday, Mark H; Pappas, Evangelos; Hancock, Mark J; Clare, Helen A; Pinto, Rafael Z; Robertson, Gavin; Ferreira, Paulo H

    2016-07-01

    Study Design Randomized clinical trial. Background Motor control exercises are believed to improve coordination of the trunk muscles. It is unclear whether increases in trunk muscle thickness can be facilitated by approaches such as the McKenzie method. Furthermore, it is unclear which approach may have superior clinical outcomes. Objectives The primary aim was to compare the effects of the McKenzie method and motor control exercises on trunk muscle recruitment in people with chronic low back pain classified with a directional preference. The secondary aim was to conduct a between-group comparison of outcomes for pain, function, and global perceived effect. Methods Seventy people with chronic low back pain who demonstrated a directional preference using the McKenzie assessment were randomized to receive 12 treatments over 8 weeks with the McKenzie method or with motor control approaches. All outcomes were collected at baseline and at 8-week follow-up by blinded assessors. Results No significant between-group difference was found for trunk muscle thickness of the transversus abdominis (-5.8%; 95% confidence interval [CI]: -15.2%, 3.7%), obliquus internus (-0.7%; 95% CI: -6.6%, 5.2%), and obliquus externus (1.2%; 95% CI: -4.3%, 6.8%). Perceived recovery was slightly superior in the McKenzie group (-0.8; 95% CI: -1.5, -0.1) on a -5 to +5 scale. No significant between-group differences were found for pain or function (P = .99 and P = .26, respectively). Conclusion We found no significant effect of treatment group for trunk muscle thickness. Participants reported a slightly greater sense of perceived recovery with the McKenzie method than with the motor control approach. Level of Evidence Therapy, level 1b-. Registered September 7, 2011 at www.anzctr.org.au (ACTRN12611000971932). J Orthop Sports Phys Ther 2016;46(7):514-522. Epub 12 May 2016. doi:10.2519/jospt.2016.6379.

  20. A simple and efficient alternative to implementing systematic random sampling in stereological designs without a motorized microscope stage.

    PubMed

    Melvin, Neal R; Poda, Daniel; Sutherland, Robert J

    2007-10-01

    When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random sampling [corrected] start point outside the structure of interest, and sampling relevant objects at [corrected] sites that are placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy, and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates the systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative to accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
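
    The sampling rule being automated is simple to express in code. The sketch below (hypothetical field size and step lengths) picks a uniformly random start within the first interval and then steps at fixed spacing, which is the essence of systematic random sampling.

```python
import random

def systematic_random_sites(extent_x, extent_y, step_x, step_y, seed=None):
    """Return (x, y) sampling sites on a grid with a uniformly random start in the first interval."""
    rng = random.Random(seed)
    x0 = rng.uniform(0, step_x)              # random start point, as in systematic random sampling
    y0 = rng.uniform(0, step_y)
    sites = []
    y = y0
    while y < extent_y:
        x = x0
        while x < extent_x:
            sites.append((x, y))
            x += step_x
        y += step_y
    return sites

# Example: a 2 mm x 1.5 mm region sampled every 300 um in x and 250 um in y (units: um).
print(len(systematic_random_sites(2000, 1500, 300, 250, seed=7)), "sites")
```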

  1. A random walk approach to quantum algorithms.

    PubMed

    Kendon, Vivien M

    2006-12-15

    The development of quantum algorithms based on quantum versions of random walks is placed in the context of the emerging field of quantum computing. Constructing a suitable quantum version of a random walk is not trivial; pure quantum dynamics is deterministic, so randomness only enters during the measurement phase, i.e. when converting the quantum information into classical information. The outcome of a quantum random walk is very different from the corresponding classical random walk owing to the interference between the different possible paths. The upshot is that quantum walkers find themselves further from their starting point than a classical walker on average, and this forms the basis of a quantum speed up, which can be exploited to solve problems faster. Surprisingly, the effect of making the walk slightly less than perfectly quantum can optimize the properties of the quantum walk for algorithmic applications. Looking to the future, even with a small quantum computer available, the development of quantum walk algorithms might proceed more rapidly than it has, especially for solving real problems.
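
    A minimal discrete-time Hadamard walk on a line illustrates the ballistic spreading the abstract describes. The implementation below is a standard textbook construction, not tied to any particular algorithm in the review.

```python
import numpy as np

def hadamard_walk_std(steps):
    """Position standard deviation of a discrete-time Hadamard walk started at the origin."""
    n_pos = 2 * steps + 1
    amp = np.zeros((n_pos, 2), dtype=complex)        # amplitude[position, coin]
    amp[steps, 0] = 1 / np.sqrt(2)                   # symmetric initial coin state (1, i)/sqrt(2)
    amp[steps, 1] = 1j / np.sqrt(2)
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard coin
    for _ in range(steps):
        amp = amp @ H.T                              # apply the coin at every site
        shifted = np.zeros_like(amp)
        shifted[:-1, 0] = amp[1:, 0]                 # coin 0 amplitudes move one site left
        shifted[1:, 1] = amp[:-1, 1]                 # coin 1 amplitudes move one site right
        amp = shifted
    prob = (np.abs(amp) ** 2).sum(axis=1)
    pos = np.arange(-steps, steps + 1)
    mean = np.sum(prob * pos)
    return np.sqrt(np.sum(prob * (pos - mean) ** 2))

steps = 100
print("quantum walk spread  :", round(hadamard_walk_std(steps), 1))   # grows linearly with steps
print("classical walk spread:", round(np.sqrt(steps), 1))             # grows as sqrt(steps)
```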

  2. Efficacy of a modern neuroscience approach versus usual care evidence-based physiotherapy on pain, disability and brain characteristics in chronic spinal pain patients: protocol of a randomized clinical trial

    PubMed Central

    2014-01-01

    Background Among the multiple conservative modalities, physiotherapy is a commonly utilized treatment modality in managing chronic non-specific spinal pain. Despite scientific progress with regard to pain and motor control neuroscience, treatment of chronic spinal pain (CSP) often tends to stick to a peripheral biomechanical model, without targeting brain mechanisms. With a view to enhancing the clinical efficacy of existing physiotherapeutic treatments for CSP, the development of clinical strategies targeted at ‘training the brain’ is to be pursued. Promising proof-of-principle results have been reported for the effectiveness of a modern neuroscience approach to CSP when compared to usual care, but confirmation is required in a larger, multi-center trial with appropriate evidence-based control intervention and long-term follow-up. The aim of this study is to assess the effectiveness of a modern neuroscience approach, compared to usual care evidence-based physiotherapy, for reducing pain and improving functioning in patients with CSP. A secondary objective entails examining the effectiveness of the modern neuroscience approach versus usual care physiotherapy for normalizing brain gray matter in patients with CSP. Methods/Design The study is a multi-center, triple-blind, two-arm (1:1) randomized clinical trial with 1-year follow-up. 120 CSP patients will be randomly allocated to either the experimental group (receiving pain neuroscience education followed by cognition-targeted motor control training) or the control group (receiving usual care physiotherapy), each comprising 3 months of treatment. The main outcome measures are pain (including symptoms and indices of central sensitization) and self-reported disability. Secondary outcome measures include brain gray matter structure, motor control, muscle properties, and psychosocial correlates. Clinical assessment and brain imaging will be performed at baseline, post-treatment and at 1-year follow-up. Web-based questionnaires will be completed at baseline, after the first 3 treatment sessions, post-treatment, and at 6- and 12-month follow-up. Discussion Findings may provide empirical evidence on: (1) the effectiveness of a modern neuroscience approach to CSP for reducing pain and improving functioning, (2) the effectiveness of a modern neuroscience approach for normalizing brain gray matter in CSP patients, and (3) factors associated with therapy success. Hence, this trial might contribute towards refining guidelines for good clinical practice and might be used as a basis for health authorities’ recommendations. Trial registration ClinicalTrials.gov Identifier: NCT02098005. PMID:24885889

  3. A randomized controlled trial of a nurse-led case management programme for hospital-discharged older adults with co-morbidities

    PubMed Central

    Chow, Susan Ka Yee; Wong, Frances Kam Yuet

    2014-01-01

    Aim To examine the effects of a nurse-led case management programme for hospital-discharged older adults with co-morbidities. Background The most significant chronic conditions today involve diseases of the cardiovascular, respiratory, endocrine and renal systems. Previous studies have suggested that a nurse-led case management approach using either telephone follow-ups or home visits was able to improve clinical and patient outcomes for patients with a single chronic disease, while the effects for older patients having at least two long-term conditions are unknown. A self-help programme using motivation and empowerment approaches is the framework of care in the study. Design Randomized controlled trial. Method The study was conducted from 2010–2012. Older patients having at least two chronic diseases were included for analysis. The participants were randomized into three arms: two study groups and one control group. Data were collected at baseline and at 4 and 12 weeks later. Results Two hundred and eighty-one patients completed the study. The interventions demonstrated significant differences in hospital readmission rates within 84 days post discharge. The two intervention groups had lower readmission rates than the control group. Patients in the two study arms had significantly better self-rated health and self-efficacy. There was a significant difference between the groups in the SF-36 physical component score, but no significant difference in the mental component score. Conclusion The postdischarge interventions led by nurse case managers, which promoted self-management of disease through an empowerment approach, improved clinical and patient outcomes for older patients with co-morbidities. PMID:24617755

  4. One-Year Randomized Controlled Trial and Follow-Up of Integrated Neurocognitive Therapy for Schizophrenia Outpatients

    PubMed Central

    Mueller, Daniel R.; Schmidt, Stefanie J.; Roder, Volker

    2015-01-01

    Objective: Cognitive remediation (CR) approaches have been shown to be effective in improving cognitive functions in schizophrenia. However, there is a lack of integrated CR approaches that target multiple neuro- and social-cognitive domains with a special focus on the generalization of therapy effects to functional outcome. Method: This 8-site randomized controlled trial evaluated the efficacy of a novel CR group therapy approach called integrated neurocognitive therapy (INT). INT includes well-defined exercises to improve all neuro- and social-cognitive domains as defined by the Measurement And Treatment Research to Improve Cognition in Schizophrenia (MATRICS) initiative by compensation and restitution. One hundred and fifty-six outpatients with a diagnosis of schizophrenia or schizoaffective disorder according to DSM-IV-TR or ICD-10 were randomly assigned to receive 15 weeks of INT or treatment as usual (TAU). INT patients received 30 twice-weekly therapy sessions, each lasting 90 minutes. Mixed models were applied to assess changes in neurocognition, social cognition, symptoms, and functional outcome at post-treatment and at 9-month follow-up. Results: In comparison to TAU, INT patients showed significant improvements in several neuro- and social-cognitive domains, negative symptoms, and functional outcome after therapy and at 9-month follow-up. Number-needed-to-treat analyses indicate that only 5 INT patients are necessary to produce durable and meaningful improvements in functional outcome. Conclusions: Integrated interventions on neurocognition and social cognition have the potential to improve not only cognitive performance but also functional outcome. These findings are important as treatment guidelines for schizophrenia have criticized CR for its poor generalization effects. PMID:25713462

  5. Quantitative benefit-harm assessment for setting research priorities: the example of roflumilast for patients with COPD.

    PubMed

    Puhan, Milo A; Yu, Tsung; Boyd, Cynthia M; Ter Riet, Gerben

    2015-07-02

    When faced with uncertainties about the effects of medical interventions, regulatory agencies, guideline developers, clinicians, and researchers commonly ask for more research, and in particular for more randomized trials. The conduct of additional randomized trials is, however, sometimes not the most efficient way to reduce uncertainty. Instead, approaches such as value of information analysis should be used to prioritize research that will most likely reduce uncertainty and inform decisions. In situations where additional research for specific interventions needs to be prioritized, we propose the use of quantitative benefit-harm assessments that illustrate how the benefit-harm balance may change as a consequence of additional research. The example of roflumilast for patients with chronic obstructive pulmonary disease shows that additional research on patient preferences (e.g., how important are exacerbations relative to psychiatric harms?) or outcome risks (e.g., what is the incidence of psychiatric outcomes in patients with chronic obstructive pulmonary disease without treatment?) is sometimes more valuable than additional randomized trials. We propose that quantitative benefit-harm assessments have the potential to explore the impact of additional research and to identify research priorities. Our approach may be seen as another type of value of information analysis and as a useful approach to stimulate specific new research that has the potential to change current estimates of the benefit-harm balance and decision making.

  6. Shakeout: A New Approach to Regularized Deep Neural Network Training.

    PubMed

    Kang, Guoliang; Li, Jun; Tao, Dacheng

    2018-05-01

    Recent years have witnessed the success of deep neural networks in dealing with many practical problems. Dropout has played an essential role in many successful deep neural networks, by inducing regularization in the model training. In this paper, we present a new regularized training approach: Shakeout. Instead of randomly discarding units as Dropout does at the training stage, Shakeout randomly chooses to enhance or reverse each unit's contribution to the next layer. This minor modification of Dropout has a useful statistical trait: the regularizer induced by Shakeout adaptively combines L0, L1, and L2 regularization terms. Our classification experiments with representative deep architectures on the image datasets MNIST, CIFAR-10 and ImageNet show that Shakeout deals with over-fitting effectively and outperforms Dropout. We empirically demonstrate that Shakeout leads to sparser weights under both unsupervised and supervised settings. Shakeout also leads to a grouping effect among the input units of a layer. Because the weights reflect the importance of connections, these properties make Shakeout more useful than Dropout for deep model compression. Moreover, we demonstrate that Shakeout can effectively reduce the instability of the training process of deep architectures.
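
    A minimal NumPy sketch of the scheme as we read it: during training each input unit is either kept, with its outgoing weights rescaled and pushed away from zero, or "reversed", with its weights replaced by a negative sign term, such that the perturbed weights are unbiased and no test-time rescaling is needed. The exact operator and the roles of tau (drop probability) and c (sign-term strength) are our assumptions and should be checked against the original paper; this is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def shakeout_forward(x, W, tau=0.5, c=0.1, train=True):
    """One linear layer with a Shakeout-style random weight perturbation."""
    if not train:
        return x @ W                                   # expectation of the training-time layer
    # One Bernoulli draw per input unit (row of W), shared across output units.
    r = rng.binomial(1, 1.0 - tau, size=(W.shape[0], 1))
    kept = W / (1.0 - tau) + (c * tau / (1.0 - tau)) * np.sign(W)
    reversed_ = -c * np.sign(W)
    W_tilde = r * kept + (1 - r) * reversed_           # E[W_tilde] = W
    return x @ W_tilde

# Toy check that the perturbation is unbiased on average.
x = rng.normal(size=(4, 8))
W = rng.normal(size=(8, 3))
avg = np.mean([shakeout_forward(x, W) for _ in range(20000)], axis=0)
print("max deviation from x @ W:", np.abs(avg - x @ W).max())
```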

  7. Closing the Achievement Gap through Modification of Neurocognitive and Neuroendocrine Function: Results from a Cluster Randomized Controlled Trial of an Innovative Approach to the Education of Children in Kindergarten

    PubMed Central

    Blair, Clancy; Raver, C. Cybele

    2014-01-01

    Effective early education is essential for academic achievement and positive life outcomes, particularly for children in poverty. Advances in neuroscience suggest that a focus on self-regulation in education can enhance children’s engagement in learning and establish beneficial academic trajectories in the early elementary grades. Here, we experimentally evaluate an innovative approach to the education of children in kindergarten that embeds support for self-regulation, particularly executive functions, into literacy, mathematics, and science learning activities. Results from a cluster randomized controlled trial involving 29 schools, 79 classrooms, and 759 children indicated positive effects on executive functions, reasoning ability, the control of attention, and levels of salivary cortisol and alpha amylase. Results also demonstrated improvements in reading, vocabulary, and mathematics at the end of kindergarten that increased into the first grade. A number of effects were specific to high-poverty schools, suggesting that a focus on executive functions and associated aspects of self-regulation in early elementary education holds promise for closing the achievement gap. PMID:25389751

  8. Closing the achievement gap through modification of neurocognitive and neuroendocrine function: results from a cluster randomized controlled trial of an innovative approach to the education of children in kindergarten.

    PubMed

    Blair, Clancy; Raver, C Cybele

    2014-01-01

    Effective early education is essential for academic achievement and positive life outcomes, particularly for children in poverty. Advances in neuroscience suggest that a focus on self-regulation in education can enhance children's engagement in learning and establish beneficial academic trajectories in the early elementary grades. Here, we experimentally evaluate an innovative approach to the education of children in kindergarten that embeds support for self-regulation, particularly executive functions, into literacy, mathematics, and science learning activities. Results from a cluster randomized controlled trial involving 29 schools, 79 classrooms, and 759 children indicated positive effects on executive functions, reasoning ability, the control of attention, and levels of salivary cortisol and alpha amylase. Results also demonstrated improvements in reading, vocabulary, and mathematics at the end of kindergarten that increased into the first grade. A number of effects were specific to high-poverty schools, suggesting that a focus on executive functions and associated aspects of self-regulation in early elementary education holds promise for closing the achievement gap.

  9. A null model for Pearson coexpression networks.

    PubMed

    Gobbi, Andrea; Jurman, Giuseppe

    2015-01-01

    Gene coexpression networks inferred by correlation from high-throughput profiling such as microarray data represent simple but effective structures for discovering and interpreting linear gene relationships. In recent years, several approaches have been proposed to tackle the problem of deciding when the resulting correlation values are statistically significant. This is most crucial when the number of samples is small, yielding a non-negligible chance that even high correlation values are due to random effects. Here we introduce a novel hard thresholding solution based on the assumption that a coexpression network inferred by randomly generated data is expected to be empty. The threshold is theoretically derived by means of an analytic approach and, as a deterministic independent null model, it depends only on the dimensions of the starting data matrix, with assumptions on the skewness of the data distribution compatible with the structure of gene expression levels data. We show, on synthetic and array datasets, that the proposed threshold is effective in eliminating all false positive links, with an offsetting cost in terms of false negative detected edges.
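
    The article derives its threshold analytically from the dimensions of the data matrix. As a crude empirical stand-in (not the authors' formula), one can simulate random expression matrices of the same shape and record the largest absolute Pearson correlation they produce, giving a hard threshold above which a null network is expected to be empty. All sizes below are illustrative.

```python
import numpy as np

def empirical_null_threshold(n_genes, n_samples, n_rep=200, seed=0):
    """Largest |Pearson correlation| observed across random matrices of a given shape."""
    rng = np.random.default_rng(seed)
    max_abs = 0.0
    for _ in range(n_rep):
        X = rng.normal(size=(n_genes, n_samples))   # rows = genes, columns = samples
        C = np.corrcoef(X)
        np.fill_diagonal(C, 0.0)                    # ignore self-correlations
        max_abs = max(max_abs, np.abs(C).max())
    return max_abs

# With few samples even random data produces large correlations,
# so the hard threshold has to be correspondingly strict.
print(empirical_null_threshold(n_genes=500, n_samples=10))
print(empirical_null_threshold(n_genes=500, n_samples=100))
```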

  10. The effect of static scanning and mobility training on mobility in people with hemianopia after stroke: A randomized controlled trial comparing standardized versus non-standardized treatment protocols

    PubMed Central

    2011-01-01

    Background Visual loss following stroke has a significant impact on activities of daily living and is an independent risk factor for becoming dependent. Routinely, allied health clinicians provide training for visual field loss, mainly with eye movement-based therapy. The effectiveness of the compensatory approach to rehabilitation remains inconclusive, largely due to the difficulty of validating functional outcome given the varied type and dosage of therapy received by an individual patient. This study aims to determine which treatment is more effective, a standardized approach or individualized therapy, in patients with homonymous hemianopia post stroke. Methods/Design This study is a double-blind, randomized controlled, multicenter trial. A standardized scanning rehabilitation program (the Neuro Vision Technology [NVT] program), delivered 3 times per week for 7 weeks, is compared with individualized therapy recommended by clinicians. Discussion The results of the trial will provide information that could potentially inform the allocation of resources in visual rehabilitation post stroke. Trial Registration Australia and New Zealand Clinical Trials Register (ANZCTR): ACTRN12610000494033 PMID:21767413

  11. Increasing independent decision-making skills of women with mental retardation in simulated interpersonal situations of abuse.

    PubMed

    Khemka, I

    2000-09-01

    The effectiveness of two decision-making training approaches in increasing independent decision-making skills of 36 women with mild mental retardation in response to hypothetical social interpersonal situations involving abuse was evaluated. Participants were randomly assigned to a control or one of two training conditions (a decision-making training approach that either addressed both cognitive and motivational aspects of decision-making or included only instruction on the cognitive aspect of decision-making). Although both approaches were effective relative to a control condition, the combined cognitive and motivational training approach was superior to the cognitive only training approach. The superiority of this approach was also reflected on a verbally presented generalization task requiring participants to respond to a decision-making situation involving abuse from their own perspective and on a locus of control scale that measured perceptions of control.

  12. Peer Influence, Genetic Propensity, and Binge Drinking: A Natural Experiment and a Replication.

    PubMed

    Guo, Guang; Li, Yi; Wang, Hongyu; Cai, Tianji; Duncan, Greg J

    2015-11-01

    The authors draw data from the College Roommate Study (ROOM) and the National Longitudinal Study of Adolescent Health to investigate gene-environment interaction effects on youth binge drinking. In ROOM, the environmental influence was measured by the precollege drinking behavior of randomly assigned roommates. Random assignment safeguards against friend selection and removes the threat of gene-environment correlation that makes gene-environment interaction effects difficult to interpret. On average, being randomly assigned a drinking peer as opposed to a nondrinking peer increased college binge drinking by 0.5-1.0 episodes per month, or 20%-40% of the average amount of binge drinking. However, this peer influence was found only among youths with a medium level of genetic propensity for alcohol use; those with either a low or high genetic propensity were not influenced by peer drinking. A replication of the findings is provided in data drawn from Add Health. The study shows that gene-environment interaction analysis can uncover social-contextual effects likely to be missed by traditional sociological approaches.

  13. Protocol for the effect evaluation of Individual Placement and Support (IPS): a randomized controlled multicenter trial of IPS versus treatment as usual for patients with moderate to severe mental illness in Norway.

    PubMed

    Sveinsdottir, Vigdis; Løvvik, Camilla; Fyhn, Tonje; Monstad, Karin; Ludvigsen, Kari; Øverland, Simon; Reme, Silje Endresen

    2014-11-18

    Roughly one third of disability pensions in Norway are issued for mental and behavioral disorders, and vocational rehabilitation offered to this group has traditionally been dominated by train-and-place approaches with assisted or sheltered employment. Based on a more innovative place-and-train approach, Individual Placement and Support (IPS) involves supported employment in real-life competitive work settings, and has shown great promise for patients with severe mental illness. The study is a multicenter Randomized Controlled Trial (RCT) of IPS in a Norwegian context, involving an effect evaluation, a process evaluation, and a cost/benefit analysis. IPS will be compared to high quality treatment as usual (TAU), with labor market participation and educational activity at 12 months post inclusion as the primary outcome. The primary outcome will be measured using register data, and the project will also include complete follow-up up to 4 years after inclusion for long-term outcome data. Secondary outcomes include mental health status, disability and quality of life, collected through survey questionnaires at baseline, and after 6 and 12 months. Participants will include patients undergoing treatment for moderate to severe mental illness who are either unemployed or on sickness or social benefits. The estimated total sample size of 400-500 will be randomly assigned to the interventions. To be eligible, participants must have an expressed desire to work, and sufficient Norwegian reading and writing skills to fill out the questionnaires. The Effect Evaluation of Individual Placement and Support (IPS) will be one of the largest randomized controlled trials to date investigating the effectiveness of IPS on competitive employment, and the first study to evaluate the effectiveness of IPS for patients with moderate to severe mental illness within a Norwegian context. Clinicaltrials.gov: NCT01964092 . Registered October 16th, 2013.

  14. Open Lung Approach for the Acute Respiratory Distress Syndrome: A Pilot, Randomized Controlled Trial.

    PubMed

    Kacmarek, Robert M; Villar, Jesús; Sulemanji, Demet; Montiel, Raquel; Ferrando, Carlos; Blanco, Jesús; Koh, Younsuck; Soler, Juan Alfonso; Martínez, Domingo; Hernández, Marianela; Tucci, Mauro; Borges, Joao Batista; Lubillo, Santiago; Santos, Arnoldo; Araujo, Juan B; Amato, Marcelo B P; Suárez-Sipmann, Fernando

    2016-01-01

    The open lung approach is a mechanical ventilation strategy involving lung recruitment and a decremental positive end-expiratory pressure trial. We compared the Acute Respiratory Distress Syndrome network protocol using low levels of positive end-expiratory pressure with open lung approach resulting in moderate to high levels of positive end-expiratory pressure for the management of established moderate/severe acute respiratory distress syndrome. A prospective, multicenter, pilot, randomized controlled trial. A network of 20 multidisciplinary ICUs. Patients meeting the American-European Consensus Conference definition for acute respiratory distress syndrome were considered for the study. At 12-36 hours after acute respiratory distress syndrome onset, patients were assessed under standardized ventilator settings (FIO2≥0.5, positive end-expiratory pressure ≥10 cm H2O). If Pao2/FIO2 ratio remained less than or equal to 200 mm Hg, patients were randomized to open lung approach or Acute Respiratory Distress Syndrome network protocol. All patients were ventilated with a tidal volume of 4 to 8 ml/kg predicted body weight. From 1,874 screened patients with acute respiratory distress syndrome, 200 were randomized: 99 to open lung approach and 101 to Acute Respiratory Distress Syndrome network protocol. Main outcome measures were 60-day and ICU mortalities, and ventilator-free days. Mortality at day-60 (29% open lung approach vs. 33% Acute Respiratory Distress Syndrome Network protocol, p = 0.18, log rank test), ICU mortality (25% open lung approach vs. 30% Acute Respiratory Distress Syndrome network protocol, p = 0.53 Fisher's exact test), and ventilator-free days (8 [0-20] open lung approach vs. 7 [0-20] d Acute Respiratory Distress Syndrome network protocol, p = 0.53 Wilcoxon rank test) were not significantly different. Airway driving pressure (plateau pressure - positive end-expiratory pressure) and PaO2/FIO2 improved significantly at 24, 48 and 72 hours in patients in open lung approach compared with patients in Acute Respiratory Distress Syndrome network protocol. Barotrauma rate was similar in both groups. In patients with established acute respiratory distress syndrome, open lung approach improved oxygenation and driving pressure, without detrimental effects on mortality, ventilator-free days, or barotrauma. This pilot study supports the need for a large, multicenter trial using recruitment maneuvers and a decremental positive end-expiratory pressure trial in persistent acute respiratory distress syndrome.

  15. Unitary n-designs via random quenches in atomic Hubbard and spin models: Application to the measurement of Rényi entropies

    NASA Astrophysics Data System (ADS)

    Vermersch, B.; Elben, A.; Dalmonte, M.; Cirac, J. I.; Zoller, P.

    2018-02-01

    We present a general framework for the generation of random unitaries based on random quenches in atomic Hubbard and spin models, forming approximate unitary n-designs, and their application to the measurement of Rényi entropies. We generalize our protocol presented in Elben et al. [Phys. Rev. Lett. 120, 050406 (2018), 10.1103/PhysRevLett.120.050406] to a broad class of atomic and spin-lattice models. We further present an in-depth numerical and analytical study of experimental imperfections, including the effect of decoherence and statistical errors, and discuss connections of our approach with many-body quantum chaos.

  16. Spatially-Distributed Cost-Effectiveness Analysis Framework to Control Phosphorus from Agricultural Diffuse Pollution.

    PubMed

    Geng, Runzhe; Wang, Xiaoyan; Sharpley, Andrew N; Meng, Fande

    2015-01-01

    Best management practices (BMPs) for agricultural diffuse pollution control are implemented at the field or small-watershed scale. However, quantifying the benefits of BMP implementation for receiving water quality at multiple spatial scales is an ongoing challenge. In this paper, we introduce an integrated approach that combines risk assessment (i.e., a Phosphorus (P) index), model simulation techniques (Hydrological Simulation Program-FORTRAN), and a BMP placement tool at various scales to identify the optimal locations for implementing multiple BMPs and to estimate BMP effectiveness after implementation. A statistically significant decrease in nutrient discharge from watersheds is proposed to evaluate the effectiveness of BMPs strategically targeted within watersheds. Specifically, we estimate two types of cost-effectiveness curves (total pollution reduction and proportion of watersheds improved) for four allocation approaches. Selection of a "best approach" depends on the relative importance of the two types of effectiveness, which involves a value judgment based on the random/aggregated degree of BMP distribution among and within sub-watersheds. A statistical optimization framework is developed and evaluated in the Chaohe River Watershed, located in the northern mountain area of Beijing. Results show that BMP implementation significantly (p > 0.001) decreased P loss from the watershed. Remedial strategies in which BMPs were targeted to areas at high risk of P loss decreased P loads compared with strategies in which BMPs were randomly located across watersheds. Sensitivity analysis indicated that aggregated BMP placement within particular watersheds is the most cost-effective scenario for decreasing P loss. The optimization approach outlined in this paper is a spatially hierarchical method for targeting nonpoint source controls across a range of scales from field to farm, to watersheds, to regions. Further, model estimates showed that targeting at multiple scales is necessary to optimize program efficiency. The integrated modeling approach described here, which selects and places BMPs at varying levels of implementation, provides a new theoretical basis and technical guidance for diffuse pollution management in agricultural watersheds.

  17. The use of propensity scores to assess the generalizability of results from randomized trials

    PubMed Central

    Stuart, Elizabeth A.; Cole, Stephen R.; Bradshaw, Catherine P.; Leaf, Philip J.

    2014-01-01

    Randomized trials remain the most accepted design for estimating the effects of interventions, but they do not necessarily answer a question of primary interest: Will the program be effective in a target population in which it may be implemented? In other words, are the results generalizable? There has been very little statistical research on how to assess the generalizability, or “external validity,” of randomized trials. We propose the use of propensity-score-based metrics to quantify the similarity of the participants in a randomized trial and a target population. In this setting the propensity score model predicts participation in the randomized trial, given a set of covariates. The resulting propensity scores are used first to quantify the difference between the trial participants and the target population, and then to match, subclassify, or weight the control group outcomes to the population, assessing how well the propensity score-adjusted outcomes track the outcomes actually observed in the population. These metrics can serve as a first step in assessing the generalizability of results from randomized trials to target populations. This paper lays out these ideas, discusses the assumptions underlying the approach, and illustrates the metrics using data on the evaluation of a schoolwide prevention program called Positive Behavioral Interventions and Supports. PMID:24926156
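
    A minimal sketch of the weighting step described above, on synthetic data and with scikit-learn as an illustrative choice of library: fit a model for trial participation given covariates, compare the resulting propensity scores between trial and target population, and weight the trial control group by its inverse odds of participation to project its outcomes onto the population. The covariates, sample sizes, and outcome model are all made up.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Synthetic covariates: trial participants differ systematically from the target population.
n_trial, n_pop = 500, 5000
X_trial = rng.normal(loc=0.5, size=(n_trial, 3))
X_pop = rng.normal(loc=0.0, size=(n_pop, 3))
y_control = X_trial[:, 0] + rng.normal(size=n_trial)     # outcome among trial controls

# 1. Propensity score model: probability of trial participation given covariates.
X = np.vstack([X_trial, X_pop])
s = np.r_[np.ones(n_trial), np.zeros(n_pop)]             # 1 = trial participant
model = LogisticRegression(max_iter=1000).fit(X, s)
ps_trial = model.predict_proba(X_trial)[:, 1]
ps_pop = model.predict_proba(X_pop)[:, 1]

# 2. Quantify trial/population similarity, e.g. by comparing propensity distributions.
print("mean propensity, trial vs population:", ps_trial.mean(), ps_pop.mean())

# 3. Weight trial controls by the inverse odds of participation so their
#    weighted outcomes stand in for the target population.
w = (1 - ps_trial) / ps_trial
print("unweighted control mean:", y_control.mean())
print("weighted control mean  :", np.average(y_control, weights=w))
```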

  18. Effects of Sex Education and Kegel Exercises on the Sexual Function of Postmenopausal Women: A Randomized Clinical Trial.

    PubMed

    Nazarpour, Soheila; Simbar, Masoumeh; Ramezani Tehrani, Fahimeh; Alavi Majd, Hamid

    2017-07-01

    The sex lives of women are strongly affected by menopause. Non-pharmacologic approaches to improving the sexual function of postmenopausal women might prove effective. To compare two methods of intervention (formal sex education and Kegel exercises) with routine postmenopausal care services in a randomized clinical trial. A randomized clinical trial was conducted of 145 postmenopausal women residing in Chalus and Noshahr, Iran. Their sexual function statuses were assessed using the Female Sexual Function Index (FSFI) questionnaire. After obtaining written informed consents, they were randomly assigned to one of three groups: (i) formal sex education, (ii) Kegel exercises, or (iii) routine postmenopausal care. After 12 weeks, all participants completed the FSFI again. Analysis of covariance was used to compare the participants' sexual function before and after the interventions, and multiple linear regression analysis was used to determine the predictive factors for variation in FSFI scores in the postintervention stage. Sexual function was assessed using the FSFI. There were no statistically significant differences in demographic and socioeconomic characteristics and FSFI total scores among the three study groups at the outset of the study. After 12 weeks, the scores of arousal in the formal sex education and Kegel groups were significantly higher compared with the control group (3.38 and 3.15 vs 2.77, respectively). The scores of orgasm and satisfaction in the Kegel group were significantly higher compared with the control group (4.43 and 4.88 vs 3.95 and 4.39, respectively). Formal sex education and Kegel exercises were used as two non-pharmacologic approaches to improve the sexual function of women after menopause. The main strength of this study was its design: a well-organized randomized trial using precise eligibility criteria with a small sample loss. The second strength was the methods of intervention used, namely non-pharmacologic approaches that are simple, easily accessible, and fairly inexpensive. The main limitation of the study was our inability to objectively assess the participants' commitment to exercise and the sexual function of their partners. Sex education programs and Kegel exercises could cause improvements in some domains of sexual function-specifically arousal, orgasm, and satisfaction-in postmenopausal women. Nazarpour S, Simbar M, Tehrani FR, Majd HA. Effects of Sex Education and Kegel Exercises on the Sexual Function of Postmenopausal Women: A Randomized Clinical Trial. J Sex Med 2017;14:959-967. Copyright © 2017 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.

  19. Scaling Techniques for Combustion Device Random Vibration Predictions

    NASA Technical Reports Server (NTRS)

    Kenny, R. J.; Ferebee, R. C.; Duvall, L. D.

    2016-01-01

    This work compares scaling techniques that can be used for prediction of combustion device component random vibration levels with excitation due to the internal combustion dynamics. Acceleration and unsteady dynamic pressure data from multiple component test programs are compared and normalized per the two scaling approaches reviewed. Two scaling techniques are reviewed and compared against the collected component test data. The first technique is an existing approach developed by Barrett, and the second technique is an updated approach new to this work. Results from utilizing both techniques are presented and recommendations about future component random vibration prediction approaches are given.

  20. Suspicious activity recognition in infrared imagery using Hidden Conditional Random Fields for outdoor perimeter surveillance

    NASA Astrophysics Data System (ADS)

    Rogotis, Savvas; Ioannidis, Dimosthenis; Tzovaras, Dimitrios; Likothanassis, Spiros

    2015-04-01

    The aim of this work is to present a novel approach for automatic recognition of suspicious activities in outdoor perimeter surveillance systems based on infrared video processing. Through the combination of size-, speed- and appearance-based features, like the Center-Symmetric Local Binary Patterns, short-term actions are identified and serve as input, along with user location, for modeling target activities using the theory of Hidden Conditional Random Fields. HCRFs are used to directly link a set of observations to the most appropriate activity label and as such to discriminate high-risk activities (e.g., trespassing) from zero-risk activities (e.g., loitering outside the perimeter). Experimental results demonstrate the effectiveness of our approach in identifying suspicious activities for video surveillance systems.

  1. Noise characteristics of nanoscaled redox-cycling sensors: investigations based on random walks.

    PubMed

    Kätelhön, Enno; Krause, Kay J; Singh, Pradyumna S; Lemay, Serge G; Wolfrum, Bernhard

    2013-06-19

    We investigate noise effects in nanoscaled electrochemical sensors using a three-dimensional simulation based on random walks. The presented approach allows the prediction of time-dependent signals and noise characteristics for redox cycling devices of arbitrary geometry. We demonstrate that the simulation results closely match experimental data as well as theoretical expectations with regard to measured currents and noise power spectra. We further analyze the impact of the sensor design on characteristics of the noise power spectrum. Specific transitions between independent noise sources in the frequency domain are indicative of the sensor-reservoir coupling and can be used to identify stationary design features or time-dependent blocking mechanisms. We disclose the source code of our simulation. Since our approach is highly flexible with regard to the implemented boundary conditions, it opens up the possibility for integrating a variety of surface-specific molecular reactions in arbitrary electrochemical systems. Thus, it may become a useful tool for the investigation of a wide range of noise effects in nanoelectrochemical sensors.
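
    The simulation in the paper is three-dimensional and geometry-aware; as a heavily simplified illustration of the general idea only, the toy sketch below lets molecules perform one-dimensional random walks between two electrodes, counts electrode encounters per time step as a crude stand-in for the current, and computes a noise power spectrum. All parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)

def redox_cycling_trace(n_molecules=200, n_steps=20000, n_bins=50):
    """Toy 1D random-walk model of molecules shuttling between two electrodes."""
    pos = rng.integers(0, n_bins, size=n_molecules)
    current = np.zeros(n_steps)
    for t in range(n_steps):
        pos += rng.integers(0, 2, size=n_molecules) * 2 - 1   # hop -1 or +1
        pos = np.clip(pos, 0, n_bins - 1)                     # reflecting walls
        hits = (pos == 0) | (pos == n_bins - 1)               # electrode encounters
        current[t] = hits.sum()
    return current

i_t = redox_cycling_trace()
i_t -= i_t.mean()
psd = np.abs(np.fft.rfft(i_t)) ** 2 / len(i_t)   # one-sided noise power spectrum
freqs = np.fft.rfftfreq(len(i_t), d=1.0)
print(freqs[1:6])
print(psd[1:6])
```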

  2. Exploring Differential Effects Across Two Decoding Treatments on Item-Level Transfer in Children with Significant Word Reading Difficulties: A New Approach for Testing Intervention Elements.

    PubMed

    Steacy, Laura M; Elleman, Amy M; Lovett, Maureen W; Compton, Donald L

    2016-01-01

    In English, gains in decoding skill do not map directly onto increases in word reading. However, beyond the Self-Teaching Hypothesis (Share, 1995), little is known about the transfer of decoding skills to word reading. In this study, we offer a new approach to testing specific decoding elements on transfer to word reading. To illustrate, we modeled word-reading gains among children with reading disability (RD) enrolled in Phonological and Strategy Training (PHAST) or Phonics for Reading (PFR). Conditions differed in sublexical training with PHAST stressing multi-level connections and PFR emphasizing simple grapheme-phoneme correspondences. Thirty-seven children with RD, 3rd-6th grade, were randomly assigned 60 lessons of PHAST or PFR. Crossed random-effects models allowed us to identify specific intervention elements that differentially impacted word-reading performance at posttest, with children in PHAST better able to read words with variant vowel pronunciations. Results suggest that sublexical emphasis influences transfer gains to word reading.

  3. Environmental Health Practice: Statistically Based Performance Measurement

    PubMed Central

    Enander, Richard T.; Gagnon, Ronald N.; Hanumara, R. Choudary; Park, Eugene; Armstrong, Thomas; Gute, David M.

    2007-01-01

    Objectives. State environmental and health protection agencies have traditionally relied on a facility-by-facility inspection-enforcement paradigm to achieve compliance with government regulations. We evaluated the effectiveness of a new approach that uses a self-certification random sampling design. Methods. Comprehensive environmental and occupational health data from a 3-year statewide industry self-certification initiative were collected from representative automotive refinishing facilities located in Rhode Island. Statistical comparisons between baseline and postintervention data facilitated a quantitative evaluation of statewide performance. Results. The analysis of field data collected from 82 randomly selected automotive refinishing facilities showed statistically significant improvements (P<.05, Fisher exact test) in 4 major performance categories: occupational health and safety, air pollution control, hazardous waste management, and wastewater discharge. Statistical significance was also shown when a modified Bonferroni adjustment for multiple comparisons was performed. Conclusions. Our findings suggest that the new self-certification approach to environmental and worker protection is effective and can be used as an adjunct to further enhance state and federal enforcement programs. PMID:17267709
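
    The comparisons described here (baseline versus post-intervention compliance, Fisher exact test, multiple-comparison adjustment) can be reproduced generically with SciPy. The counts below are invented placeholders, and a plain Bonferroni division stands in for the modified Bonferroni adjustment mentioned in the abstract.

```python
from scipy.stats import fisher_exact

# Hypothetical (compliant, non-compliant) counts at baseline and post-intervention
# for the four performance categories named above.
categories = {
    "occupational health and safety": ((45, 37), (68, 14)),
    "air pollution control":          ((50, 32), (70, 12)),
    "hazardous waste management":     ((40, 42), (60, 22)),
    "wastewater discharge":           ((55, 27), (72, 10)),
}

alpha = 0.05
m = len(categories)
for name, (baseline, post) in categories.items():
    _, p = fisher_exact([baseline, post])
    adjusted_alpha = alpha / m                     # simple Bonferroni correction
    print(f"{name}: p = {p:.4f}, significant at {adjusted_alpha:.4f}: {p < adjusted_alpha}")
```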

  4. Point and interval estimation of pollinator importance: a study using pollination data of Silene caroliniana.

    PubMed

    Reynolds, Richard J; Fenster, Charles B

    2008-05-01

    Pollinator importance, the product of visitation rate and pollinator effectiveness, is a descriptive parameter of the ecology and evolution of plant-pollinator interactions. Naturally, sources of its variation should be investigated, but the SE of pollinator importance has never been properly reported. Here, a Monte Carlo simulation study and a result from mathematical statistics on the variance of the product of two random variables are used to estimate the mean and confidence limits of pollinator importance for three visitor species of the wildflower, Silene caroliniana. Both methods provided similar estimates of mean pollinator importance and its interval if the sample size of the visitation and effectiveness datasets were comparatively large. These approaches allowed us to determine that bumblebee importance was significantly greater than clearwing hawkmoth, which was significantly greater than beefly. The methods could be used to statistically quantify temporal and spatial variation in pollinator importance of particular visitor species. The approaches may be extended for estimating the variance of more than two random variables. However, unless the distribution function of the resulting statistic is known, the simulation approach is preferable for calculating the parameter's confidence limits.
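
    The "result from mathematical statistics" invoked above is, for two independent random variables, the exact expression for the variance of their product; whether the authors used precisely this form is not stated in the abstract. The moments below (for visitation rate X and per-visit effectiveness Y) are made up, and the Monte Carlo run mirrors the simulation-based check described.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical, independent visitation rate (X) and per-visit effectiveness (Y).
mu_x, var_x = 4.0, 1.5     # e.g. visits per hour
mu_y, var_y = 0.3, 0.02    # e.g. seeds sired per visit

# For independent X and Y:
#   E[XY]   = E[X] * E[Y]
#   Var(XY) = Var(X)*Var(Y) + Var(X)*E[Y]**2 + Var(Y)*E[X]**2
mean_importance = mu_x * mu_y
var_importance = var_x * var_y + var_x * mu_y**2 + var_y * mu_x**2
print("analytic :", mean_importance, var_importance)

# Monte Carlo check.
x = rng.normal(mu_x, np.sqrt(var_x), size=1_000_000)
y = rng.normal(mu_y, np.sqrt(var_y), size=1_000_000)
print("simulated:", (x * y).mean(), (x * y).var())
```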

  5. Onset of natural convection in a continuously perturbed system

    NASA Astrophysics Data System (ADS)

    Ghorbani, Zohreh; Riaz, Amir

    2017-11-01

    The convective mixing triggered by gravitational instability plays an important role in CO2 sequestration in saline aquifers. The linear stability analysis and the numerical simulation concerning convective mixing in porous media requires perturbations of small amplitude to be imposed on the concentration field in the form of an initial shape function. In aquifers, however, the instability is triggered by local porosity and permeability. In this work, we consider a canonical 2D homogeneous system where perturbations arise due to spatial variation of porosity in the system. The advantage of this approach is not only the elimination of the required initial shape function, but it also serves as a more realistic approach. Using a reduced nonlinear method, we first explore the effect of harmonic variations of porosity in the transverse and streamwise direction on the onset time of convection and late time behavior. We then obtain the optimal porosity structure that minimizes the convection onset. We further examine the effect of a random porosity distribution, that is independent of the spatial mode of porosity structure, on the convection onset. Using high-order pseudospectral DNS, we explore how the random distribution differs from the modal approach in predicting the onset time.

  6. Decisions with Uncertain Consequences—A Total Ordering on Loss-Distributions

    PubMed Central

    König, Sandra; Schauer, Stefan

    2016-01-01

    Decisions are often based on imprecise, uncertain or vague information. Likewise, the consequences of an action are often equally unpredictable, thus putting the decision maker into a twofold jeopardy. Assuming that the effects of an action can be modeled by a random variable, then the decision problem boils down to comparing different effects (random variables) by comparing their distribution functions. Although the full space of probability distributions cannot be ordered, a properly restricted subset of distributions can be totally ordered in a practically meaningful way. We call these loss-distributions, since they provide a substitute for the concept of loss-functions in decision theory. This article introduces the theory behind the necessary restrictions and the hereby constructible total ordering on random loss variables, which enables decisions under uncertainty of consequences. Using data obtained from simulations, we demonstrate the practical applicability of our approach. PMID:28030572

  7. Using Temporal Correlations and Full Distributions to Separate Intrinsic and Extrinsic Fluctuations in Biological Systems

    NASA Astrophysics Data System (ADS)

    Hilfinger, Andreas; Chen, Mark; Paulsson, Johan

    2012-12-01

    Studies of stochastic biological dynamics typically compare observed fluctuations to theoretically predicted variances, sometimes after separating the intrinsic randomness of the system from the enslaving influence of changing environments. But variances have been shown to discriminate surprisingly poorly between alternative mechanisms, while for other system properties no approaches exist that rigorously disentangle environmental influences from intrinsic effects. Here, we apply the theory of generalized random walks in random environments to derive exact rules for decomposing time series and higher statistics, rather than just variances. We show for which properties and for which classes of systems intrinsic fluctuations can be analyzed without accounting for extrinsic stochasticity and vice versa. We derive two independent experimental methods to measure the separate noise contributions and show how to use the additional information in temporal correlations to detect multiplicative effects in dynamical systems.

  8. Effects of a Multicomponent Life-Style Intervention on Weight, Glycemic Control, Depressive Symptoms, and Renal Function in Low-Income, Minority Patients With Type 2 Diabetes: Results of the Community Approach to Lifestyle Modification for Diabetes Randomized Controlled Trial.

    PubMed

    Moncrieft, Ashley E; Llabre, Maria M; McCalla, Judith Rey; Gutt, Miriam; Mendez, Armando J; Gellman, Marc D; Goldberg, Ronald B; Schneiderman, Neil

    2016-09-01

    Few interventions have combined life-style and psychosocial approaches in the context of Type 2 diabetes management. The purpose of this study was to determine the effect of a multicomponent behavioral intervention on weight, glycemic control, renal function, and depressive symptoms in a sample of overweight/obese adults with Type 2 diabetes and marked depressive symptoms. A sample of 111 adults with Type 2 diabetes were randomly assigned to a 1-year intervention (n = 57) or usual care (n = 54) in a parallel groups design. Primary outcomes included weight, glycosylated hemoglobin, and Beck Depression Inventory II score. Estimated glomerular filtration rate served as a secondary outcome. All measures were assessed at baseline and 6 and 12 months after randomization by assessors blind to randomization. Latent growth modeling was used to examine intervention effects on each outcome. The intervention resulted in decreased weight (mean [M] = 0.322 kg, standard error [SE] = 0.124 kg, p = .010) and glycosylated hemoglobin (M = 0.066%, SE = 0.028%, p = .017), and Beck Depression Inventory II scores (M = 1.009, SE = 0.226, p < .001), and improved estimated glomerular filtration rate (M = 0.742 ml·min·1.73 m, SE = 0.318 ml·min·1.73 m, p = .020) each month during the first 6 months relative to usual care. Multicomponent behavioral interventions targeting weight loss and depressive symptoms as well as diet and physical activity are efficacious in the management of Type 2 diabetes. This study is registered at Clinicaltrials.gov ID: NCT01739205.

  9. Observational Study Designs for Comparative Effectiveness Research: An Alternative Approach to Close Evidence Gaps in Head-and-Neck Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goulart, Bernardo H.L., E-mail: bhg@uw.edu; University of Washington, Seattle, Washington; Ramsey, Scott D.

    Comparative effectiveness research (CER) has emerged as an approach to improve quality of care and patient outcomes while reducing healthcare costs by providing evidence to guide healthcare decisions. Randomized controlled trials (RCTs) have represented the ideal study design to support treatment decisions in head-and-neck (H and N) cancers. In RCTs, formal chance (randomization) determines treatment allocation, which prevents selection bias from distorting the measure of treatment effects. Despite this advantage, only a minority of patients qualify for inclusion in H and N RCTs, which limits the validity of their results to the broader H and N cancer patient population seen in clinical practice. Randomized controlled trials often do not address other knowledge gaps in the management of H and N cancer, including treatment comparisons for rare types of H and N cancers, monitoring of rare or late toxicity events (eg, osteoradionecrosis), or in some instances an RCT is simply not feasible. Observational studies, or studies in which treatment allocation occurs independently of investigators' choice or randomization, may address several of these gaps in knowledge, thereby complementing the role of RCTs. This critical review discusses how observational CER studies complement RCTs in generating the evidence to inform healthcare decisions and improve the quality of care and outcomes of H and N cancer patients. Review topics include a balanced discussion about the strengths and limitations of both RCT and observational CER study designs; a brief description of design and analytic techniques to handle selection bias in observational studies; examples of observational studies that inform current clinical practices and management of H and N cancers; and suggestions for relevant CER questions that could be addressed by an observational study design.

  10. Observational study designs for comparative effectiveness research: an alternative approach to close evidence gaps in head-and-neck cancer.

    PubMed

    Goulart, Bernardo H L; Ramsey, Scott D; Parvathaneni, Upendra

    2014-01-01

    Comparative effectiveness research (CER) has emerged as an approach to improve quality of care and patient outcomes while reducing healthcare costs by providing evidence to guide healthcare decisions. Randomized controlled trials (RCTs) have represented the ideal study design to support treatment decisions in head-and-neck (H&N) cancers. In RCTs, formal chance (randomization) determines treatment allocation, which prevents selection bias from distorting the measure of treatment effects. Despite this advantage, only a minority of patients qualify for inclusion in H&N RCTs, which limits the validity of their results to the broader H&N cancer patient population seen in clinical practice. Randomized controlled trials often do not address other knowledge gaps in the management of H&N cancer, including treatment comparisons for rare types of H&N cancers, monitoring of rare or late toxicity events (eg, osteoradionecrosis), or in some instances an RCT is simply not feasible. Observational studies, or studies in which treatment allocation occurs independently of investigators' choice or randomization, may address several of these gaps in knowledge, thereby complementing the role of RCTs. This critical review discusses how observational CER studies complement RCTs in generating the evidence to inform healthcare decisions and improve the quality of care and outcomes of H&N cancer patients. Review topics include a balanced discussion about the strengths and limitations of both RCT and observational CER study designs; a brief description of design and analytic techniques to handle selection bias in observational studies; examples of observational studies that inform current clinical practices and management of H&N cancers; and suggestions for relevant CER questions that could be addressed by an observational study design. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Effect of an educational toolkit on quality of care: a pragmatic cluster randomized trial.

    PubMed

    Shah, Baiju R; Bhattacharyya, Onil; Yu, Catherine H Y; Mamdani, Muhammad M; Parsons, Janet A; Straus, Sharon E; Zwarenstein, Merrick

    2014-02-01

    Printed educational materials for clinician education are one of the most commonly used approaches for quality improvement. The objective of this pragmatic cluster randomized trial was to evaluate the effectiveness of an educational toolkit focusing on cardiovascular disease screening and risk reduction in people with diabetes. All 933,789 people aged ≥40 years with diagnosed diabetes in Ontario, Canada were studied using population-level administrative databases, with additional clinical outcome data collected from a random sample of 1,592 high risk patients. Family practices were randomly assigned to receive the educational toolkit in June 2009 (intervention group) or May 2010 (control group). The primary outcome in the administrative data study, death or non-fatal myocardial infarction, occurred in 11,736 (2.5%) patients in the intervention group and 11,536 (2.5%) in the control group (p = 0.77). The primary outcome in the clinical data study, use of a statin, occurred in 700 (88.1%) patients in the intervention group and 725 (90.1%) in the control group (p = 0.26). Pre-specified secondary outcomes, including other clinical events, processes of care, and measures of risk factor control, were also not improved by the intervention. A limitation is the high baseline rate of statin prescribing in this population. The educational toolkit did not improve quality of care or cardiovascular outcomes in a population with diabetes. Despite being relatively easy and inexpensive to implement, printed educational materials were not effective. The study highlights the need for a rigorous and scientifically based approach to the development, dissemination, and evaluation of quality improvement interventions. http://www.ClinicalTrials.gov NCT01411865 and NCT01026688.

  12. The Effect of the Integration of Corpora in Reading Comprehension Classrooms on English as a Foreign Language Learners' Vocabulary Development

    ERIC Educational Resources Information Center

    Gordani, Yahya

    2013-01-01

    This study used a randomized pretest-posttest control group design to examine the effect of the integration of corpora in general English courses on the students' vocabulary development. To enhance the learners' lexical repertoire and thereby improve their reading comprehension, an online corpus-based approach was integrated into 42 hours of…

  13. Investigating Best Practice and Effectiveness of Leadership Wisdom among Principals of Excellent Secondary School Malaysia: Perceptions of Senior Assistants

    ERIC Educational Resources Information Center

    Ahmad, Abdul Razaq; Salleh, Mohamad Johdi; Awang, Mohd Mahzan; Mohamad, Nazifah Alwani

    2013-01-01

    The aim of the current study is to investigate the practices and effectiveness of leadership wisdom among the principals of excellent secondary schools as perceived by the Senior Assistants. This research employed survey approach by using a validated questionnaire. The respondents were 417 Senior Assistants, who were randomly selected from the…

  14. The Effect of Instructional Objectives and General Objectives on Student Self-Evaluation of Psychomotor Performance in Power Mechanics.

    ERIC Educational Resources Information Center

    Janeczko, Robert John

    The major purpose of this study was to ascertain the relative effects of student exposure to instructional objectives upon student self-evaluation of psychomotor activities in a college-level power mechanics course. A randomized posttest-only control group design was used with two different approaches to the statement of the objectives. Four…

  15. Estimating Treatment Effects from Contaminated Multi-Period Education Experiments: The Dynamic Impacts of Class Size Reductions. NBER Working Paper No. 15200

    ERIC Educational Resources Information Center

    Ding, Weili; Lehrer, Steven F.

    2009-01-01

    This paper introduces an empirical strategy to estimate dynamic treatment effects in randomized trials that provide treatment in multiple stages and in which various noncompliance problems arise such as attrition and selective transitions between treatment and control groups. Our approach is applied to the highly influential four year randomized…

  16. A Cognitive Approach to Child Mistreatment Prevention among Medically At-Risk Infants

    ERIC Educational Resources Information Center

    Bugental, Daphne Blunt; Schwartz, Alex

    2009-01-01

    The authors assessed the effectiveness of a home visitation program in enhancing the early parenting history of infants born at medical risk--a population that is at risk for mistreatment. A randomized clinical trial design was used to compare the effects of a cognitively based extension of the Healthy Start home visitation program (HV+) with a…

  17. AFRESh: an adaptive framework for compression of reads and assembled sequences with random access functionality.

    PubMed

    Paridaens, Tom; Van Wallendael, Glenn; De Neve, Wesley; Lambert, Peter

    2017-05-15

    The past decade has seen the introduction of new technologies that have steadily lowered the cost of genomic sequencing. We can even observe that the cost of sequencing is dropping significantly faster than the cost of storage and transmission. The latter motivates a need for continuous improvements in the area of genomic data compression, not only at the level of effectiveness (compression rate), but also at the level of functionality (e.g. random access), configurability (effectiveness versus complexity, coding tool set …) and versatility (support for both sequenced reads and assembled sequences). In that regard, we can point out that current approaches mostly do not support random access, requiring full files to be transmitted, and that current approaches are restricted to either read or sequence compression. We propose AFRESh, an adaptive framework for no-reference compression of genomic data with random access functionality, targeting the effective representation of the raw genomic symbol streams of both reads and assembled sequences. AFRESh makes use of a configurable set of prediction and encoding tools, extended by a Context-Adaptive Binary Arithmetic Coding (CABAC) scheme, to compress raw genetic codes. To the best of our knowledge, our paper is the first to describe an effective implementation of CABAC outside of its original application. By applying CABAC, the compression effectiveness improves by up to 19% for assembled sequences and up to 62% for reads. By applying AFRESh to the genomic symbols of the MPEG genomic compression test set for reads, a compression gain of up to 51% is achieved compared to SCALCE, 42% compared to LFQC and 44% compared to ORCOM. When comparing to generic compression approaches, a compression gain of up to 41% is achieved compared to GNU Gzip and 22% compared to 7-Zip at the Ultra setting. Additionally, when compressing assembled sequences of the Human Genome, a compression gain of up to 34% is achieved compared to GNU Gzip and 16% compared to 7-Zip at the Ultra setting. A Windows executable version can be downloaded at https://github.com/tparidae/AFresh . tom.paridaens@ugent.be. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  18. Problem-based learning in dental education: a systematic review of the literature.

    PubMed

    Bassir, Seyed Hossein; Sadr-Eshkevari, Pooyan; Amirikhorheh, Shaden; Karimbux, Nadeem Y

    2014-01-01

    The purpose of this systematic review was to compare the effectiveness of problem-based learning (PBL) with that of traditional (non-PBL) approaches in dental education. The search strategy included electronic and manual searches of studies published up to October 2012. The PICO (Population, Intervention, Comparator, and Outcome) framework was utilized to guide the inclusion or exclusion of studies. The search strategy identified 436 articles, seventeen of which met the inclusion criteria. No randomized controlled trial was found comparing the effectiveness of PBL with that of a lecture-based approach at the level of an entire curriculum. Three randomized controlled trials had evaluated the effectiveness of PBL at a single course level. The quality assessment rated four studies as being of moderate quality, while the other studies were assessed as being of weak quality. This review concludes that there are a very limited number of well-designed controlled studies evaluating the effectiveness of PBL in dental education. The data in those studies reveal that PBL does not negatively influence the acquisition of factual knowledge in dental students and that PBL enhances students' ability to apply their knowledge to clinical situations. In addition, PBL positively affects students' perceived preparedness.

  19. Multilevel structural equation models for assessing moderation within and across levels of analysis.

    PubMed

    Preacher, Kristopher J; Zhang, Zhen; Zyphur, Michael J

    2016-06-01

    Social scientists are increasingly interested in multilevel hypotheses, data, and statistical models as well as moderation or interactions among predictors. The result is a focus on hypotheses and tests of multilevel moderation within and across levels of analysis. Unfortunately, existing approaches to multilevel moderation have a variety of shortcomings, including conflated effects across levels of analysis and bias due to using observed cluster averages instead of latent variables (i.e., "random intercepts") to represent higher-level constructs. To overcome these problems and elucidate the nature of multilevel moderation effects, we introduce a multilevel structural equation modeling (MSEM) logic that clarifies the nature of the problems with existing practices and remedies them with latent variable interactions. This remedy uses random coefficients and/or latent moderated structural equations (LMS) for unbiased tests of multilevel moderation. We describe our approach and provide an example using the publicly available High School and Beyond data with Mplus syntax in the Appendix. Our MSEM method eliminates problems of conflated multilevel effects and reduces bias in parameter estimates while offering a coherent framework for conceptualizing and testing multilevel moderation effects. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
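
    For readers wanting a concrete starting point, the sketch below fits a random-coefficient model with a cross-level interaction (a level-1 slope moderated by a level-2 covariate) on simulated data using statsmodels. It illustrates only the random-coefficients route to multilevel moderation; the latent-variable MSEM/LMS approach described in the abstract requires dedicated software such as Mplus, and all variable names and parameter values here are assumptions.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        n_clusters, n_per = 100, 20
        cluster = np.repeat(np.arange(n_clusters), n_per)
        w = rng.normal(size=n_clusters)[cluster]          # level-2 moderator (cluster-level)
        x = rng.normal(size=n_clusters * n_per)           # level-1 predictor
        u0 = rng.normal(0, 0.5, n_clusters)[cluster]      # random intercepts
        u1 = rng.normal(0, 0.3, n_clusters)[cluster]      # random slopes for x
        # Cross-level moderation: the effect of x depends on the level-2 variable w.
        y = 1.0 + 0.5 * x + 0.2 * w + 0.4 * x * w + u0 + u1 * x + rng.normal(0, 1, x.size)
        df = pd.DataFrame(dict(y=y, x=x, w=w, cluster=cluster))

        # Random intercept and random slope for x; the x:w term carries the
        # cross-level interaction (moderation of the level-1 slope by w).
        model = smf.mixedlm("y ~ x * w", df, groups=df["cluster"], re_formula="~x")
        print(model.fit().summary())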

  20. Pharmacological approaches to the treatment of complicated grief: rationale and a brief review of the literature

    PubMed Central

    Bui, Eric; Nadal-Vicens, Mireya; M. Simon, Naomi

    2012-01-01

    Complicated grief (CG) is a common and often under-acknowledged cause of profound impairment experienced after the loss of a loved one. Although both clinical and basic research suggests that pharmacological agents might be of use in the treatment of CG, research on pharmacological approaches to this condition is still scarce. Three open-label trials and one randomized trial on bereavement-related depression suggest that tricyclic antidepressants may be effective, although they may be more efficacious for depressive symptoms than for grief-specific symptoms. Four open-label trials (total number of participants, 50) of selective serotonin reuptake inhibitors (SSRIs) have yielded results, providing very preliminary support that they might be effective in the treatment of CG, both as a standalone treatment and in conjunction with psychotherapeutic interventions. These more recent studies have shown an effect on both depression and grief-specific scales. Furthermore, therapeutic interventions for CG may be more effective in conjunction with SSRI administration. Given the small number of pharmacological studies to date, there is a need for randomized trials to test the potential efficacy of pharmacological agents in the treatment of CG. PMID:22754287

  1. Stochastic uncertainty analysis for unconfined flow systems

    USGS Publications Warehouse

    Liu, Gaisheng; Zhang, Dongxiao; Lu, Zhiming

    2006-01-01

    A new stochastic approach proposed by Zhang and Lu (2004), called the Karhunen-Loeve decomposition-based moment equation (KLME), has been extended to solving nonlinear, unconfined flow problems in randomly heterogeneous aquifers. This approach is based on an innovative combination of Karhunen-Loeve decomposition, polynomial expansion, and perturbation methods. The random log-transformed hydraulic conductivity field ln K_S is first expanded into a series in terms of orthogonal standard Gaussian random variables, with the coefficients obtained from the eigenvalues and eigenfunctions of the covariance function of ln K_S. Next, the head h is decomposed as a perturbation expansion series h = Σ_m h^(m), where h^(m) represents the mth-order head term with respect to the standard deviation of ln K_S. Then h^(m) is further expanded into a polynomial series of m products of orthogonal standard Gaussian random variables whose coefficients h^(m)_{i1,i2,...,im} are deterministic and are solved sequentially from low to high expansion orders using MODFLOW-2000. Finally, the statistics of head and flux are computed using simple algebraic operations on the h^(m)_{i1,i2,...,im}. A series of numerical test results in 2-D and 3-D unconfined flow systems indicated that the KLME approach is effective in estimating the mean and (co)variance of both heads and fluxes and requires much less computational effort as compared to the traditional Monte Carlo simulation technique.
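
    The first step of the method, the Karhunen-Loeve expansion of the log-conductivity field, can be sketched numerically: discretize the covariance function, take its eigenpairs, and synthesize realizations from a truncated series. The sketch below does only this step on a 1-D grid with an assumed exponential covariance and illustrative parameter values; the moment equations, perturbation expansion, and MODFLOW-2000 solves that make up the rest of the KLME machinery are not reproduced.

        import numpy as np

        # 1-D grid and an exponential covariance for ln K (illustrative values)
        n, sigma2, corr_len = 200, 1.0, 0.2
        x = np.linspace(0.0, 1.0, n)
        C = sigma2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

        # Discrete Karhunen-Loeve decomposition: eigenpairs of the covariance matrix
        eigvals, eigvecs = np.linalg.eigh(C)
        idx = np.argsort(eigvals)[::-1]                  # sort modes by decreasing energy
        eigvals, eigvecs = eigvals[idx], eigvecs[:, idx]

        # Truncate to the leading modes and synthesize realizations:
        # ln K(x) = mean + sum_i sqrt(lambda_i) * phi_i(x) * xi_i,  xi_i ~ N(0, 1)
        m = 30
        rng = np.random.default_rng(0)
        xi = rng.standard_normal((m, 5))                 # 5 independent realizations
        lnK = 0.0 + eigvecs[:, :m] @ (np.sqrt(eigvals[:m, None]) * xi)

        print("variance captured:", eigvals[:m].sum() / eigvals.sum())
        print("sample field shape:", lnK.shape)          # (n, 5)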

  2. Extending existing structural identifiability analysis methods to mixed-effects models.

    PubMed

    Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D

    2018-01-01

    The concept of structural identifiability for state-space models is expanded to cover mixed-effects state-space models. Two methods applicable for the analytical study of the structural identifiability of mixed-effects models are presented. The two methods are based on previously established techniques for non-mixed-effects models; namely the Taylor series expansion and the input-output form approach. By generating an exhaustive summary, and by assuming an infinite number of subjects, functions of random variables can be derived which in turn determine the distribution of the system's observation function(s). By considering the uniqueness of the analytical statistical moments of the derived functions of the random variables, the structural identifiability of the corresponding mixed-effects model can be determined. The two methods are applied to a set of examples of mixed-effects models to illustrate how they work in practice. Copyright © 2017 Elsevier Inc. All rights reserved.
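
    The Taylor series route to an exhaustive summary is easy to demonstrate on a toy (non-mixed-effects) model; the paper's extension then treats the parameters as random effects and examines the statistical moments of these same quantities. The example below uses a hypothetical one-compartment model chosen purely for illustration.

        import sympy as sp

        t, th1, th2 = sp.symbols("t theta1 theta2", positive=True)
        # Hypothetical example model: dx/dt = -theta1*x, x(0) = 1, observed as y = theta2*x
        x = sp.exp(-th1 * t)            # closed-form state trajectory
        y = th2 * x                     # observation function
        # Exhaustive summary: successive output derivatives evaluated at t = 0
        summary = [sp.simplify(sp.diff(y, t, k).subs(t, 0)) for k in range(3)]
        print(summary)                  # [theta2, -theta1*theta2, theta1**2*theta2]
        # theta2 is fixed by the first coefficient and theta1 by the ratio of the second
        # to the first, so (theta1, theta2) is structurally globally identifiable.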

  3. Pre-Survey Text Messages (SMS) Improve Participation Rate in an Australian Mobile Telephone Survey: An Experimental Study.

    PubMed

    Dal Grande, Eleonora; Chittleborough, Catherine Ruth; Campostrini, Stefano; Dollard, Maureen; Taylor, Anne Winifred

    2016-01-01

    Mobile telephone numbers are increasingly being included in household survey samples. As approach letters cannot be sent to numbers without address details, alternative approaches have been considered. This study assesses the effectiveness of sending a short message service (SMS) message to a random sample of mobile telephone numbers to increase response rates. A simple random sample of 9000 Australian mobile telephone numbers was drawn: 4500 were randomly assigned to be sent a pre-notification SMS, and the remaining 4500 did not have an SMS sent. Adults aged 18 years and over, and currently in paid employment, were eligible to participate. American Association for Public Opinion Research formulas were used to calculate response, cooperation, and refusal rates. Response and cooperation rates were higher for the SMS group (12.4% and 28.6%, respectively) than for the group with no SMS (7.7% and 16.0%). Refusal rates were lower for the SMS group (27.3%) than for the group with no SMS (35.9%). When asked, 85.8% of the pre-notification group indicated they remembered receiving an SMS about the study. Sending a pre-notification SMS is effective in improving participation in population-based surveys. Response rates were increased by 60% and cooperation rates by 79%.

  4. The Promoting Effective Advance Care for Elders (PEACE) randomized pilot study: theoretical framework and study design.

    PubMed

    Allen, Kyle R; Hazelett, Susan E; Radwany, Steven; Ertle, Denise; Fosnight, Susan M; Moore, Pamela S

    2012-04-01

    Practice guidelines are available for hospice and palliative medicine specialists and geriatricians. However, these guidelines do not adequately address the needs of patients who straddle the 2 specialties: homebound chronically ill patients. The purpose of this article is to describe the theoretical basis for the Promoting Effective Advance Care for Elders (PEACE) randomized pilot study. PEACE is an ongoing 2-group randomized pilot study (n=80) to test an in-home interdisciplinary care management intervention that combines palliative care approaches to symptom management, psychosocial and emotional support, and advance care planning with geriatric medicine approaches to optimizing function and addressing polypharmacy. The population comprises new enrollees into PASSPORT, Ohio's community-based, long-term care Medicaid waiver program. All PASSPORT enrollees have geriatric/palliative care crossover needs because they are nursing home eligible. The intervention is based on Wagner's Chronic Care Model and includes comprehensive interdisciplinary care management for these low-income frail elders with chronic illnesses, uses evidence-based protocols, emphasizes patient activation, and integrates with community-based long-term care and other community agencies. Our model, with its standardized, evidence-based medical and psychosocial intervention protocols, will transport easily to other sites that are interested in optimizing outcomes for community-based, chronically ill older adults. © Mary Ann Liebert, Inc.

  5. Effectiveness of interactive discussion group in suicide risk assessment among general nurses in Taiwan: a randomized controlled trial.

    PubMed

    Wu, Chia-Yi; Lin, Yi-Yin; Yeh, Mei Chang; Huang, Lian-Hua; Chen, Shaw-Ji; Liao, Shih-Cheng; Lee, Ming-Been

    2014-11-01

    The evidence on suicide prevention training for nurses is scarce. Strategies to enhance general nurses' ability in suicide risk assessment are critical to developing effective training programs in general medical settings. This study aimed to examine the effectiveness of an interactive discussion group in a suicide prevention training program for general nurses. In this randomized study with a two-group pre-post design, the sample was recruited from the Medical, Surgical, and Emergency/Intensive Care Sectors of a 2000-bed general hospital via stratified randomization. Among the 111 nurses, 57 participants randomly assigned to the control group received a two-hour baseline suicide gatekeeper lecture, and 54 participants assigned to the experimental group received an additional five-hour group discussion about suicide risk assessment skills. Using a case vignette, the nurses discussed and assessed suicide risk factors specified in a 10-item Chinese SAD PERSONS Scale during the group discussion intervention. The findings revealed that the nurses achieved significant and consistent improvements in risk identification and assessment after the intervention, without their own mental health status being affected by assessing suicide risks. The results suggest that interactive group discussion is an effective approach for facilitating critical thinking and learning of suicide risk assessment skills among general nurses. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Effect of Cosmos caudatus (Ulam raja) supplementation in patients with type 2 diabetes: Study protocol for a randomized controlled trial.

    PubMed

    Cheng, Shi-Hui; Ismail, Amin; Anthony, Joseph; Ng, Ooi Chuan; Hamid, Azizah Abdul; Yusof, Barakatun-Nisak Mohd

    2016-02-27

    Type 2 diabetes mellitus is a major health threat worldwide. Cosmos caudatus is one of the medicinal plants used to treat type 2 diabetes. Therefore, this study aims to determine the effectiveness and safety of C. caudatus in patients with type 2 diabetes. A metabolomic approach will be carried out to compare the metabolite profiles of C. caudatus-treated diabetic patients and diabetic controls. This is a single-center, randomized, controlled, two-arm parallel design clinical trial that will be carried out in a tertiary hospital in Malaysia. In this study, 100 patients diagnosed with type 2 diabetes will be enrolled. Diabetic patients who meet the eligibility criteria will be randomly allocated to two groups: the diabetic C. caudatus-treated (U) group and the diabetic control (C) group. Primary and secondary outcomes will be measured at baseline, 4, 8, and 12 weeks. The serum and urine metabolome of both groups will be examined using proton NMR spectroscopy. The study will be the first randomized controlled trial to assess whether C. caudatus can confer a beneficial effect in patients with type 2 diabetes. The results of this trial will provide clinical evidence on the effectiveness and safety of C. caudatus in patients with type 2 diabetes. ClinicalTrials.gov identifier: NCT02322268.

  7. A novel cognitive behaviour therapy for bipolar disorders (Think Effectively About Mood Swings or TEAMS): study protocol for a randomized controlled trial.

    PubMed

    Mansell, Warren; Tai, Sara; Clark, Alexandra; Akgonul, Savas; Dunn, Graham; Davies, Linda; Law, Heather; Morriss, Richard; Tinning, Neil; Morrison, Anthony P

    2014-10-24

    Existing psychological therapies for bipolar disorders have been found to have mixed results, with a consensus that they provide a significant, but modest, effect on clinical outcomes. Typically, these approaches have focused on promoting strategies to prevent future relapse. An alternative treatment approach, termed 'Think Effectively About Mood Swings' (TEAMS), addresses current symptoms, including subclinical hypomania, depression and anxiety, and promotes long-term recovery. Following the publication of a theoretical model, a range of research studies testing the model and a case series have demonstrated positive results. The current study reports the protocol of a feasibility randomized controlled trial to inform a future multi-centre trial. A target number of 84 patients with a diagnosis of bipolar I or II disorder, or bipolar disorder not-otherwise-specified, are screened, allocated to a baseline assessment and randomized to either 16 sessions of TEAMS therapy plus treatment-as-usual (TAU) or TAU. Patients complete self-report inventories of depression, anxiety, recovery status and bipolar cognitions targeted by TEAMS. Assessments of diagnosis, bipolar symptoms, medication, access to services and quality of life are conducted by assessors blind to treatment condition at 3, 6, 12 and 18 months post-randomization. The main aim is to evaluate recruitment and retention of participants into both arms of the study, as well as adherence to therapy, to determine feasibility and acceptability. It is predicted that TEAMS plus TAU will reduce self-reported depression in comparison to TAU alone at six months post-randomization. The secondary hypotheses are that TEAMS will reduce the severity of hypomanic symptoms and anxiety, reduce bipolar cognitions, improve social functioning and promote recovery compared to TAU alone at post-treatment and follow-up. The study also incorporates semi-structured interviews about the experiences of previous treatment and the experience of TEAMS therapy that will be subject to qualitative analyses to inform future developments of the approach. The design will provide preliminary evidence of efficacy, feasibility, acceptability, uptake, attrition and barriers to treatment to design a definitive trial of this novel intervention compared to treatment as usual. This trial was registered with Current Controlled Trials (ISRCTN83928726) on 25 July 2014.

  8. Percutaneous transhepatic vs. endoscopic retrograde biliary drainage for suspected malignant hilar obstruction: study protocol for a randomized controlled trial.

    PubMed

    Al-Kawas, Firas; Aslanian, Harry; Baillie, John; Banovac, Filip; Buscaglia, Jonathan M; Buxbaum, James; Chak, Amitabh; Chong, Bradford; Coté, Gregory A; Draganov, Peter V; Dua, Kulwinder; Durkalski, Valerie; Elmunzer, B Joseph; Foster, Lydia D; Gardner, Timothy B; Geller, Brian S; Jamidar, Priya; Jamil, Laith H; Keswani, Rajesh N; Khashab, Mouen A; Lang, Gabriel D; Law, Ryan; Lichtenstein, David; Lo, Simon K; McCarthy, Sean; Melo, Silvio; Mullady, Daniel; Nieto, Jose; Bayne Selby, J; Singh, Vikesh K; Spitzer, Rebecca L; Strife, Brian; Tarnaksy, Paul; Taylor, Jason R; Tokar, Jeffrey; Wang, Andrew Y; Williams, April; Willingham, Field; Yachimski, Patrick

    2018-02-14

    The optimal approach to the drainage of malignant obstruction at the liver hilum remains uncertain. We aim to compare percutaneous transhepatic biliary drainage (PTBD) to endoscopic retrograde cholangiography (ERC) as the first intervention in patients with cholestasis due to suspected malignant hilar obstruction (MHO). The INTERCPT trial is a multi-center, comparative effectiveness, randomized, superiority trial of PTBD vs. ERC for decompression of suspected MHO. One hundred and eighty-four eligible patients across medical centers in the United States, who provide informed consent, will be randomly assigned in 1:1 fashion via a web-based electronic randomization system to either ERC or PTBD as the initial drainage and, if indicated, diagnostic procedure. All subsequent clinical interventions, including crossover to the alternative procedure, will be dictated by treating physicians per usual clinical care. Enrolled subjects will be assessed for successful biliary drainage (primary outcome measure), adequate tissue diagnosis, adverse events, the need for additional procedures, hospitalizations, and oncological outcomes over a 6-month follow-up period. Subjects, treating clinicians and outcome assessors will not be blinded. The INTERCPT trial is designed to determine whether PTBD or ERC is the better initial approach when managing a patient with suspected MHO, a common clinical dilemma that has never been investigated in a randomized trial. ClinicalTrials.gov, Identifier: NCT03172832 . Registered on 1 June 2017.

  9. The Individualized Diet and Exercise Adherence Pilot Trial (IDEA-P) in prostate cancer patients undergoing androgen deprivation therapy: study protocol for a randomized controlled trial.

    PubMed

    Focht, Brian C; Lucas, Alexander R; Grainger, Elizabeth; Simpson, Christina; Thomas-Ahner, Jennifer M; Clinton, Steven K

    2014-09-09

    Androgen deprivation therapy (ADT) is the foundation of treatment for men with metastatic prostate cancer and is now frequently incorporated into multimodality strategies for the curative treatment of locally advanced prostate cancer. Nevertheless, the catabolic effects of ADT result in meaningful adverse effects on physiological and quality of life outcomes, which may, in turn, increase the risk of functional decline, frailty, cardiovascular disease, and metabolic syndrome. Recent evidence demonstrates that lifestyle intervention promoting change in exercise and dietary behaviors is a promising approach, and may offset, or even reverse, the adverse effects accompanying ADT. Unfortunately, the limited existing studies of the effects of exercise and dietary interventions targeting patients with prostate cancer on ADT are characterized by high attrition rates and poor postintervention maintenance of treatment effects. Consequently, the Individualized Diet and Exercise Adherence Pilot Trial (IDEA-P) is designed to contrast the effects of a lifestyle intervention designed to promote independent self-management of exercise and dietary behavior with those of standard care disease management approach in the treatment of prostate cancer. A total of 40 patients with prostate cancer undergoing ADT will be randomly assigned to lifestyle intervention or standard care. Outcomes of interest in IDEA-P include changes in self-reported and objectively assessed physical function and physical activity, dietary behavior, body composition, muscular strength, and quality of life. Outcomes will be obtained at baseline, 2-month, and 3-month assessments by trial personnel blinded to participants' randomization assignment. Findings from this study will establish the feasibility and preliminary efficacy of an innovative lifestyle intervention designed to promote progressively independent self-regulated exercise and dietary behavior change in the treatment of patients with prostate cancer undergoing ADT. ClinicalTrials.gov NCT02050906.

  10. Endoscopic Evacuation of Basal Ganglia Hemorrhage via Keyhole Approach Using an Adjustable Cannula in Comparison with Craniotomy

    PubMed Central

    Zhang, Heng-Zhu; Li, Yu-Ping; Yan, Zheng-cun; Wang, Xing-dong; She, Lei; Wang, Xiao-dong; Dong, Lun

    2014-01-01

    Neuroendoscopic (NE) surgery is a promising, minimally invasive treatment for basal ganglia hemorrhage. The present study aims to evaluate the efficacy and safety of the NE approach using an adjustable cannula to treat basal ganglia hemorrhage. In this study, we analysed the clinical and radiographic outcomes of the NE group (21 cases) and the craniotomy group (30 cases). The results indicated that NE surgery might be an effective and safe approach for basal ganglia hemorrhage and suggested that the NE approach may promote good functional recovery. However, the NE approach suits only selected patients, and its usefulness needs to be evaluated in further randomized controlled trials (RCTs). PMID:24949476

  11. Multigroup Propensity Score Approach to Evaluating an Effectiveness Trial of the New Beginnings Program.

    PubMed

    Tein, Jenn-Yun; Mazza, Gina L; Gunn, Heather J; Kim, Hanjoe; Stuart, Elizabeth A; Sandler, Irwin N; Wolchik, Sharlene A

    2018-06-01

    We used a multigroup propensity score approach to evaluate a randomized effectiveness trial of the New Beginnings Program (NBP), an intervention targeting divorced or separated families. Two features of effectiveness trials, high nonattendance rates and inclusion of an active control, make program effects harder to detect. To estimate program effects based on actual intervention participation, we created a synthetic inactive control comprised of nonattenders and assessed the impact of attending the NBP or active control relative to no intervention (inactive control). We estimated propensity scores using generalized boosted models and applied inverse probability of treatment weighting for the comparisons. Relative to the inactive control, NBP strengthened parenting quality as well as reduced child exposure to interparental conflict, parent psychological distress, and child internalizing problems. Some effects were moderated by parent gender, parent ethnicity, or child age. On the other hand, the effects of active versus inactive control were minimal for parenting and in the unexpected direction for child internalizing problems. Findings from the propensity score approach complement and enhance the interpretation of findings from the intention-to-treat approach.
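
    A minimal sketch of the estimation idea, under assumed data: fit a multigroup propensity model with boosted trees, take each observation's probability of its observed condition, and apply inverse-probability-of-treatment weights when comparing group outcomes. The simulated variables and three-group structure below are illustrative only; the NBP analysis itself used generalized boosted models with additional balance diagnostics not shown here.

        import numpy as np
        import pandas as pd
        from sklearn.ensemble import GradientBoostingClassifier

        rng = np.random.default_rng(2)
        n = 3000
        X = pd.DataFrame({"parent_distress": rng.normal(size=n),
                          "child_age": rng.integers(9, 13, size=n)})
        # Three observed "conditions": 0 = inactive control (nonattenders), 1 = active control, 2 = program.
        # Group membership depends on the covariates, so the raw groups are not directly comparable.
        logits = np.c_[np.zeros(n),
                       0.5 * X["parent_distress"],
                       0.8 * X["parent_distress"] + 0.1 * (X["child_age"] - 10)]
        p = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
        group = np.array([rng.choice(3, p=pi) for pi in p])
        y = 0.3 * X["parent_distress"].to_numpy() - 0.5 * (group == 2) + rng.normal(size=n)

        # Multigroup (generalized) propensity scores via boosted trees
        ps_model = GradientBoostingClassifier().fit(X, group)
        ps = ps_model.predict_proba(X)[np.arange(n), group]   # P(observed group | X)
        w = 1.0 / ps                                           # IPTW weights

        for g in range(3):
            mask = group == g
            print(f"group {g}: weighted mean outcome = {np.average(y[mask], weights=w[mask]):.3f}")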

  12. Complementary and Alternative Approaches to Pain Relief During Labor

    PubMed Central

    Theau-Yonneau, Anne

    2007-01-01

    This review evaluated the effect of complementary and alternative medicine on pain during labor, using conventional scientific methods and electronic databases through 2006. Only randomized controlled trials with outcome measures for labor pain were kept for the conclusions. Many studies did not meet the scientific inclusion criteria. According to the randomized controlled trials, we conclude the following regarding the decrease of labor pain and/or the reduction of the need for conventional analgesic methods: (i) efficacy was found for acupressure and sterile water blocks; (ii) most results favored some efficacy for acupuncture and hydrotherapy; (iii) studies of other complementary or alternative therapies for labor pain control have not shown their effectiveness. PMID:18227907

  13. Urban tree cover change in Detroit and Atlanta, USA, 1951-2010

    Treesearch

    Krista Merry; Jacek Siry; Pete Bettinger; J.M. Bowker

    2014-01-01

    We assessed tree cover using random points and polygons distributed within the administrative boundaries of Detroit, MI and Atlanta, GA. Two approaches were tested: a point-based approach using 1000 randomly located sample points, and a polygon-based approach using 250 circular areas, each 200 m in radius (12.56 ha). In the case of Atlanta, both approaches arrived at similar...

  14. Comparison of Wiltse's paraspinal approach and open book laminectomy for thoracolumbar burst fractures with greenstick lamina fractures: a randomized controlled trial.

    PubMed

    Chen, Zhi-da; Wu, Jin; Yao, Xiao-Tao; Cai, Tao-Yi; Zeng, Wen-Rong; Lin, Bin

    2018-03-02

    Posterior short-segment pedicle screw fixation is used to treat thoracolumbar burst fractures. However, no randomized controlled studies have compared the efficacy of the two approaches--Wiltse's paraspinal approach and open book laminectomy--in the treatment of thoracolumbar burst fractures with greenstick lamina fractures. Patients with burst fractures of the thoracolumbar spine without neurological deficit were randomized to receive either Wiltse's paraspinal approach (group A, 24 patients) or open book laminectomy (group B, 23 patients). Patients were followed postoperatively for an average of 27.4 months. Clinical and radiographic data for the two approaches were collected and compared. Our results showed that the anterior segmental height, kyphotic angle, visual analog scale (VAS) score, and Smiley-Webster Scale (SWS) score significantly improved postoperatively in both groups, indicating that both Wiltse's paraspinal approach and open book laminectomy can effectively treat thoracolumbar burst fractures with greenstick lamina fractures. Wiltse's paraspinal approach was found to have significantly shorter operating time, less blood loss, and shorter length of hospital stay compared to open book laminectomy. However, two (2/24) patients in group A had neurological deficits postoperatively and required a second exploratory operation. Dural tears and/or cauda equina entrapment were found intraoperatively in four patients in group B and in both patients with neurological deficits in group A. No screw loosening, plate breakage, or other internal fixation failures were found at final follow-up. The results demonstrated that either of the two surgical approaches can achieve satisfactory results in treating thoracolumbar burst fractures in patients with greenstick lamina fractures. However, if there is any preoperative clinical or radiographic suspicion of a dural tear and/or cauda equina entrapment, patients should receive an open book laminectomy to avoid a second exploratory operation. More research is still needed to optimize clinical decision-making regarding the surgical approach.

  15. A Randomized Control Trial for Evaluating Efficacies of Two Online Cognitive Interventions With and Without Fear-Appeal Imagery Approaches in Preventing Unprotected Anal Sex Among Chinese Men Who Have Sex with Men.

    PubMed

    Lau, Joseph T F; Lee, Annisa L; Tse, Wai S; Mo, Phoenix K H; Fong, Francois; Wang, Zixin; Cameron, Linda D; Sheer, Vivian

    2016-09-01

    The fear appeal approach has been used in health promotion, but its effectiveness has been mixed. It has not been well applied to HIV prevention among men who have sex with men (MSM). The present study developed and evaluated the relative efficacy of three online interventions (SC: STD-related cognitive approach; SCFI: STD-related cognitive plus fear appeal imagery approach; Control: HIV-related information-based approach) in reducing the prevalence of unprotected anal intercourse (UAI) among 396 MSM, using a randomized controlled trial design. Participants' levels of fear-related emotions immediately after watching the assigned intervention materials were also assessed. Participants were evaluated at baseline and 3 months after the intervention. Results showed that participants in the SCFI group scored significantly higher on the instrument assessing fear after watching the intervention materials. However, no statistically significant differences were found across the three groups in terms of UAI at Month 3. Some significant within-group reductions in some measures of UAI were found in the three groups. Further studies are warranted to test the role of fear appeal in HIV prevention.

  16. Size-dependent piezoelectric energy-harvesting analysis of micro/nano bridges subjected to random ambient excitations

    NASA Astrophysics Data System (ADS)

    Radgolchin, Moeen; Moeenfard, Hamid

    2018-02-01

    The construction of self-powered micro-electro-mechanical units by converting the mechanical energy of the systems into electrical power has attracted much attention in recent years. While power harvesting from deterministic external excitations is state of the art, it has been much more difficult to derive mathematical models for scavenging electrical energy from ambient random vibrations, due to the stochastic nature of the excitations. The current research concerns analytical modeling of micro-bridge energy harvesters based on random vibration theory. Since classical elasticity fails to accurately predict the mechanical behavior of micro-structures, strain gradient theory is employed as a powerful tool to increase the accuracy of the random vibration modeling of the micro-harvester. Equations of motion of the system in the time domain are derived using the Lagrange approach. These are then utilized to determine the frequency and impulse responses of the structure. Assuming the energy harvester to be subjected to a combination of broadband and limited-band random support motion and transverse loading, closed-form expressions for mean, mean square, correlation and spectral density of the output power are derived. The suggested formulation is further exploited to investigate the effect of the different design parameters, including the geometric properties of the structure as well as the properties of the electrical circuit on the resulting power. Furthermore, the effect of length scale parameters on the harvested energy is investigated in detail. It is observed that the predictions of classical and even simple size-dependent theories (such as couple stress) appreciably differ from the findings of strain gradient theory on the basis of random vibration. This study presents a first-time modeling of micro-scale harvesters under stochastic excitations using a size-dependent approach and can be considered as a reliable foundation for future research in the field of micro/nano harvesters subjected to non-deterministic loads.
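
    A minimal classical illustration of the random vibration machinery involved (not the paper's size-dependent strain gradient model): for a single-degree-of-freedom oscillator under white-noise forcing, the mean-square response follows from integrating the response power spectral density |H(w)|^2 S0 over frequency, and the numerical integral can be checked against the classical closed form pi*S0/(k*c). All parameter values below are assumed.

        import numpy as np

        # Single-degree-of-freedom oscillator  m x'' + c x' + k x = F(t), with F(t)
        # a white-noise force of constant two-sided power spectral density S0.
        m, c, k, S0 = 1e-9, 2e-7, 1e-3, 1e-12        # illustrative (assumed) values

        wn = np.sqrt(k / m)                           # natural frequency
        omega = np.linspace(-50 * wn, 50 * wn, 1_000_001)
        H = 1.0 / (k - m * omega**2 + 1j * c * omega)     # frequency response function
        Sx = S0 * np.abs(H) ** 2                          # response PSD

        Ex2_numeric = np.trapz(Sx, omega)             # E[x^2] = integral of the response PSD
        Ex2_closed = np.pi * S0 / (k * c)             # classical closed-form result
        print(Ex2_numeric, Ex2_closed)                # the two agree closely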

  17. Protocol for evaluating the effects of a therapeutic foot exercise program on injury incidence, foot functionality and biomechanics in long-distance runners: a randomized controlled trial.

    PubMed

    Matias, Alessandra B; Taddei, Ulisses T; Duarte, Marcos; Sacco, Isabel C N

    2016-04-14

    Overall performance, particularly in a very popular sports activity such as running, is typically influenced by the status of the musculoskeletal system and the level of training and conditioning of the biological structures. Any change in the musculoskeletal system's biomechanics, especially in the feet and ankles, will strongly influence the biomechanics of runners, possibly predisposing them to injuries. A thorough understanding of the effects of a therapeutic approach focused on feet biomechanics, on strength and functionality of lower limb muscles will contribute to the adoption of more effective therapeutic and preventive strategies for runners. A randomized, prospective controlled and parallel trial with blind assessment is designed to study the effects of a "ground-up" therapeutic approach focused on the foot-ankle complex as it relates to the incidence of running-related injuries in the lower limbs. One hundred and eleven (111) healthy long-distance runners will be randomly assigned to either a control (CG) or intervention (IG) group. IG runners will participate in a therapeutic exercise protocol for the foot-ankle for 8 weeks, with 1 directly supervised session and 3 remotely supervised sessions per week. After the 8-week period, IG runners will keep exercising for the remaining 10 months of the study, supervised only by web-enabled software three times a week. At baseline, 2 months, 4 months and 12 months, all runners will be assessed for running-related injuries (primary outcome), time for the occurrence of the first injury, foot health and functionality, muscle trophism, intrinsic foot muscle strength, dynamic foot arch strain and lower-limb biomechanics during walking and running (secondary outcomes). This is the first randomized clinical trial protocol to assess the effect of an exercise protocol that was designed specifically for the foot-and-ankle complex on running-related injuries to the lower limbs of long-distance runners. We intend to show that the proposed protocol is an innovative and effective approach to decreasing the incidence of injuries. We also expect a lengthening in the time of occurrence of the first injury, an improvement in foot function, an increase in foot muscle mass and strength and beneficial biomechanical changes while running and walking after a year of exercising. Clinicaltrials.gov Identifier NCT02306148 (November 28, 2014) under the name "Effects of Foot Strengthening on the Prevalence of Injuries in Long Distance Runners". Committee of Ethics in Research of the School of Medicine of the University of Sao Paulo (18/03/2015, Protocol # 031/15).

  18. Principal Score Methods: Assumptions, Extensions, and Practical Considerations

    ERIC Educational Resources Information Center

    Feller, Avi; Mealli, Fabrizia; Miratrix, Luke

    2017-01-01

    Researchers addressing posttreatment complications in randomized trials often turn to principal stratification to define relevant assumptions and quantities of interest. One approach for the subsequent estimation of causal effects in this framework is to use methods based on the "principal score," the conditional probability of belonging…

  19. Solution-Focused Brief Therapy: Impacts on Academic and Emotional Difficulties

    ERIC Educational Resources Information Center

    Daki, Julia; Savage, Robert S.

    2010-01-01

    This randomized control trial study evaluated the effectiveness of the solution-focused approach in addressing academic, motivational, and socioemotional needs of 14 children with reading difficulties. The intervention group received five 40-min solution-focused sessions. The control group received academic homework support. Results showed…

  20. School-based programmes for preventing smoking.

    PubMed

    Thomas, R; Perera, R

    2006-07-19

    Smoking rates in adolescents are rising in some countries. Helping young people to avoid starting smoking is a widely endorsed goal of public health, but there is uncertainty about how to do this. Schools provide a route for communicating with a large proportion of young people, and school-based programmes for smoking prevention have been widely developed and evaluated. To review all randomized controlled trials of behavioural interventions in schools to prevent children (aged 5 to 12) and adolescents (aged 13 to 18) starting smoking. We searched the Cochrane Central Register of Controlled Trials (CENTRAL) and the Cochrane Tobacco Addiction Group's Specialized Register, MEDLINE, EMBASE, PsycINFO, ERIC, CINAHL, Health Star, Dissertation Abstracts and studies identified in the bibliographies of articles. Individual MEDLINE searches were made for 133 authors who had undertaken randomized controlled trials in this area. Types of studies: those in which individual students, classes, schools, or school districts were randomized to the intervention or control groups and followed for at least six months. Types of participants: children (aged 5 to 12) or adolescents (aged 13 to 18) in school settings. Types of interventions: classroom programmes or curricula, including those with associated family and community interventions, intended to deter use of tobacco. We included programmes or curricula that provided information, those that used social influences approaches, those that taught generic social competence, and those that included interventions beyond the school into the community. We included programmes with a drug or alcohol focus if outcomes for tobacco use were reported. Types of outcome measures: prevalence of non-smoking at follow up among those not smoking at baseline. We did not require biochemical validation of self-reported tobacco use for study inclusion. We assessed whether identified citations were randomized controlled trials. We assessed the quality of design and execution, and abstracted outcome data. Because of the marked heterogeneity of design and outcomes, we computed pooled estimates only for those trials that could be analyzed together and for which statistical data were available. We predominantly synthesized the data using narrative systematic review. We grouped studies by intervention method (information; social competence; social influences; combined social influences/social competence; multi-modal programmes). Within each group, we placed them into three categories (low, medium and high risk of bias) according to validity using quality criteria for reported study design. Of the 94 randomized controlled trials identified, we classified 23 as category one (most valid). There was one category one study of information-giving and two of teaching social competence. There were thirteen category one studies of social influences interventions. Of these, nine found some positive effect of intervention on smoking prevalence, and four failed to detect an effect on smoking prevalence. The largest and most rigorous study, the Hutchinson Smoking Prevention Project, found no long-term effect of an intensive eight-year programme on smoking behaviour. There were three category one RCTs of combined social influences and social competence interventions: one provided significant results, and one found significant results only for instruction by health educators compared to self-instruction. There was a lack of high quality evidence about the effectiveness of combinations of social influences and social competence approaches. 
There was one category one study providing data on social influences compared with information giving. There were four category one studies of multi-modal approaches but they provided limited evidence about the effectiveness of multi-modal approaches including community initiatives. There is one rigorous test of the effects of information-giving about smoking. There are well-conducted randomized controlled trials to test the effects of social influences interventions: in half of the group of best quality studies those in the intervention group smoke less than those in the control, but many studies failed to detect an effect of the intervention. There are only three high quality RCTs which test the effectiveness of combinations of social influences and social competence interventions, and four which test multi-modal interventions; half showed significant positive results.

  1. Impact of different dietary approaches on glycemic control and cardiovascular risk factors in patients with type 2 diabetes: a protocol for a systematic review and network meta-analysis.

    PubMed

    Schwingshackl, Lukas; Chaimani, Anna; Hoffmann, Georg; Schwedhelm, Carolina; Boeing, Heiner

    2017-03-20

    Dietary advice is one of the cornerstones in the management of type 2 diabetes mellitus. The American Diabetes Association recommended a hypocaloric diet for overweight or obese adults with type 2 diabetes in order to induce weight loss. However, there is limited evidence on the optimal approaches to control hyperglycemia in type 2 diabetes patients. The aim of the present study is to assess the comparative efficacy of different dietary approaches on glycemic control and blood lipids in patients with type 2 diabetes mellitus in a systematic review including a standard pairwise and network meta-analysis of randomized trials. We will conduct searches in the Cochrane Central Register of Controlled Trials (CENTRAL) in the Cochrane Library, PubMed (from 1966), and Google Scholar. Citations, abstracts, and relevant papers will be screened for eligibility by two reviewers independently. Randomized controlled trials (with a control group or randomized trials with at least two intervention groups) will be included if they meet the following criteria: (1) include patients with type 2 diabetes mellitus, (2) include patients aged ≥18 years, (3) include a dietary intervention (different types of diets: e.g., Mediterranean dietary pattern, low-carbohydrate diet, low-fat diet, vegetarian diet, high protein diet), either hypocaloric, isocaloric, or ad libitum, (4) minimum intervention period of 12 weeks. For each outcome measure of interest, random effects pairwise and network meta-analyses will be performed in order to determine the pooled relative effect of each intervention relative to every other intervention in terms of the post-intervention values (or mean differences between the changes from baseline value scores). Subgroup analyses are planned for study length, sample size, age, and sex. This systematic review will synthesize the available evidence on the comparative efficacy of different dietary approaches in the management of glycosylated hemoglobin (primary outcome), fasting glucose, and cardiovascular risk factors in type 2 diabetes mellitus patients. The results of the present network meta-analysis will influence evidence-based treatment decisions since they will be fundamental for evidence-based recommendations in the management of type 2 diabetes. PROSPERO 42016047464.

  2. Modular Approach to Therapy for Anxiety, Depression, Trauma, or Conduct Problems in outpatient child and adolescent mental health services in New Zealand: study protocol for a randomized controlled trial.

    PubMed

    Lucassen, Mathijs F G; Stasiak, Karolina; Crengle, Sue; Weisz, John R; Frampton, Christopher M A; Bearman, Sarah Kate; Ugueto, Ana M; Herren, Jennifer; Cribb-Su'a, Ainsleigh; Faleafa, Monique; Kingi-'Ulu'ave, Denise; Loy, Jik; Scott, Rebecca M; Hartdegen, Morgyn; Merry, Sally N

    2015-10-12

    Mental health disorders are common and disabling for young people because of the potential to disrupt key developmental tasks. Implementation of evidence-based psychosocial therapies in New Zealand is limited, owing to the inaccessibility, length, and cost of training in these therapies. Furthermore, most therapies address one problem area at a time, although comorbidity and changing clinical needs commonly occur in practice. A more flexible approach is needed. The Modular Approach to Therapy for Children with Anxiety, Depression, Trauma, or Conduct Problems (MATCH-ADTC) is designed to overcome these challenges; it provides a range of treatment modules addressing different problems, within a single training program. A clinical trial of MATCH-ADTC in the USA showed that MATCH-ADTC outperformed usual care and standard evidence-based treatment on several clinical measures. We aim to replicate these findings and evaluate the impact of providing training and supervision in MATCH-ADTC to: (1) improve clinical outcomes for youth attending mental health services; (2) increase the amount of evidence-based therapy content; (3) increase the efficiency of service delivery. This is an assessor-blinded multi-site effectiveness randomized controlled trial. Randomization occurs at two levels: (1) clinicians (≥60) are randomized to intervention or usual care; (2) youth participants (7-14 years old) accepted for treatment in child and adolescent mental health services (with a primary disorder that includes anxiety, depression, trauma-related symptoms, or disruptive behavior) are randomly allocated to receive MATCH-ADTC or usual care. Youth participants are recruited from 'mainstream', Māori-specific, and Pacific-specific child and adolescent mental health services. We originally planned to recruit 400 youth participants, but this has been revised to 200 participants. Centralized computer randomization ensures allocation concealment. The primary outcome measures are: (i) the difference in trajectory of change of clinical severity between groups (using the parent-rated Brief Problem Monitor); (ii) clinicians' use of evidence-based treatment procedures during therapy sessions; (iii) total time spent by clinicians delivering therapy. If MATCH-ADTC demonstrates effectiveness it could offer a practical efficient method to increase access to evidence-based therapies, and improve outcomes for youth attending secondary care services. Australian and New Zealand Clinical Trials Registry ACTRN12614000297628 .

  3. A comparison of observation-level random effect and Beta-Binomial models for modelling overdispersion in Binomial data in ecology & evolution.

    PubMed

    Harrison, Xavier A

    2015-01-01

    Overdispersion is a common feature of models of biological data, but researchers often fail to model the excess variation driving the overdispersion, resulting in biased parameter estimates and standard errors. Quantifying and modeling overdispersion when it is present is therefore critical for robust biological inference. One means to account for overdispersion is to add an observation-level random effect (OLRE) to a model, where each data point receives a unique level of a random effect that can absorb the extra-parametric variation in the data. Although some studies have investigated the utility of OLRE to model overdispersion in Poisson count data, studies doing so for Binomial proportion data are scarce. Here I use a simulation approach to investigate the ability of both OLRE models and Beta-Binomial models to recover unbiased parameter estimates in mixed effects models of Binomial data under various degrees of overdispersion. In addition, as ecologists often fit random intercept terms to models when the random effect sample size is low (<5 levels), I investigate the performance of both model types under a range of random effect sample sizes when overdispersion is present. Simulation results revealed that the efficacy of OLRE depends on the process that generated the overdispersion; OLRE failed to cope with overdispersion generated from a Beta-Binomial mixture model, leading to biased slope and intercept estimates, but performed well for overdispersion generated by adding random noise to the linear predictor. Comparison of parameter estimates from an OLRE model with those from its corresponding Beta-Binomial model readily identified when OLRE were performing poorly due to disagreement between effect sizes, and this strategy should be employed whenever OLRE are used for Binomial data to assess their reliability. Beta-Binomial models performed well across all contexts, but showed a tendency to underestimate effect sizes when modelling non-Beta-Binomial data. Finally, both OLRE and Beta-Binomial models performed poorly when models contained <5 levels of the random intercept term, especially for estimating variance components, and this effect appeared independent of total sample size. These results suggest that OLRE are a useful tool for modelling overdispersion in Binomial data, but that they do not perform well in all circumstances and researchers should take care to verify the robustness of parameter estimates of OLRE models.
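
    A minimal simulation of the data-generating situation that proved hardest for OLRE in the paper: Binomial counts whose success probabilities are drawn from a Beta distribution, which inflates the variance well beyond the plain Binomial value. The parameter values are illustrative, and no mixed model is fitted here; the sketch only makes the overdispersion visible.

        import numpy as np

        rng = np.random.default_rng(3)
        n_trials, n_obs, p_mean, rho = 20, 5000, 0.3, 0.1     # rho governs overdispersion

        # Beta parameters with mean p_mean and intra-observation correlation rho
        a = p_mean * (1 - rho) / rho
        b = (1 - p_mean) * (1 - rho) / rho

        p_i = rng.beta(a, b, size=n_obs)                       # observation-level probabilities
        y_bb = rng.binomial(n_trials, p_i)                     # Beta-Binomial counts
        y_bin = rng.binomial(n_trials, p_mean, size=n_obs)     # plain Binomial counts

        binom_var = n_trials * p_mean * (1 - p_mean)
        print("theoretical binomial variance:", binom_var)
        print("plain binomial sample variance:", y_bin.var())
        print("beta-binomial sample variance: ", y_bb.var())   # clearly larger: overdispersion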

  4. Does telemedicine improve treatment outcomes for diabetes? A meta-analysis of results from 55 randomized controlled trials.

    PubMed

    Su, Dejun; Zhou, Junmin; Kelley, Megan S; Michaud, Tzeyu L; Siahpush, Mohammad; Kim, Jungyoon; Wilson, Fernando; Stimpson, Jim P; Pagán, José A

    2016-06-01

    To assess the overall effect of telemedicine on diabetes management and to identify features of telemedicine interventions that are associated with better diabetes management outcomes. Hedges's g was estimated as the summary measure of the mean difference in HbA1c between patients with diabetes who went through telemedicine care and those who went through conventional, non-telemedicine care, using a random-effects model. Q statistics were calculated to assess whether the effect of telemedicine on diabetes management differs by type of diabetes, age group of patients, duration of intervention, and primary telemedicine approach used. The analysis included 55 randomized controlled trials with a total of 9258 patients with diabetes, of whom 4607 were randomized to telemedicine groups and 4651 to conventional, non-telemedicine care groups. The results favored telemedicine over conventional care (Hedges's g=-0.48, p<0.001) in diabetes management. The beneficial effect of telemedicine was more pronounced among patients with type 2 diabetes (Hedges's g=-0.63, p<0.001) than among those with type 1 diabetes (Hedges's g=-0.27, p=0.027) (Q=4.25, p=0.04). Compared to conventional care, telemedicine is more effective in improving treatment outcomes for diabetes patients, especially for those with type 2 diabetes. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
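
    A minimal sketch of the kind of random-effects pooling described (DerSimonian-Laird estimation of the between-trial variance, then inverse-variance weighting of Hedges's g); the effect sizes and variances below are made up for illustration and are not the 55 trials analysed in the paper.

        import numpy as np

        # Illustrative per-trial effect sizes (Hedges's g) and their variances
        g = np.array([-0.60, -0.35, -0.80, -0.20, -0.55, -0.40])
        v = np.array([0.040, 0.025, 0.060, 0.030, 0.045, 0.020])

        # Fixed-effect step: inverse-variance weights and Cochran's Q
        w = 1.0 / v
        g_fe = np.sum(w * g) / np.sum(w)
        Q = np.sum(w * (g - g_fe) ** 2)
        df = len(g) - 1
        C = np.sum(w) - np.sum(w**2) / np.sum(w)

        # DerSimonian-Laird between-trial variance, then random-effects pooling
        tau2 = max(0.0, (Q - df) / C)
        w_re = 1.0 / (v + tau2)
        g_re = np.sum(w_re * g) / np.sum(w_re)
        se_re = np.sqrt(1.0 / np.sum(w_re))

        print(f"tau^2 = {tau2:.4f}")
        print(f"pooled g = {g_re:.3f}  (95% CI {g_re - 1.96*se_re:.3f} to {g_re + 1.96*se_re:.3f})")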

  5. Constrained Perturbation Regularization Approach for Signal Estimation Using Random Matrix Theory

    NASA Astrophysics Data System (ADS)

    Suliman, Mohamed; Ballal, Tarig; Kammoun, Abla; Al-Naffouri, Tareq Y.

    2016-12-01

    In this supplementary appendix we provide proofs and additional extensive simulations that complement the analysis of the main paper (constrained perturbation regularization approach for signal estimation using random matrix theory).

  6. Modeling of correlated data with informative cluster sizes: An evaluation of joint modeling and within-cluster resampling approaches.

    PubMed

    Zhang, Bo; Liu, Wei; Zhang, Zhiwei; Qu, Yanping; Chen, Zhen; Albert, Paul S

    2017-08-01

    Joint modeling and within-cluster resampling are two approaches that are used for analyzing correlated data with informative cluster sizes. Motivated by a developmental toxicity study, we examined the performance and validity of these two approaches in testing covariate effects in generalized linear mixed-effects models. We show that the joint modeling approach is robust to the misspecification of cluster size models in terms of Type I and Type II errors when the corresponding covariates are not included in the random effects structure; otherwise, statistical tests may be affected. We also evaluate the performance of the within-cluster resampling procedure and thoroughly investigate its validity in modeling correlated data with informative cluster sizes. We show that within-cluster resampling is a valid alternative to joint modeling for cluster-specific covariates, but it is invalid for time-dependent covariates. The two methods are applied to a developmental toxicity study that investigated the effect of exposure to diethylene glycol dimethyl ether.
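
    A minimal sketch of the within-cluster resampling procedure for a cluster-specific covariate (the setting in which the paper finds it valid): repeatedly keep one observation per cluster, fit an ordinary regression to each resampled data set, and average the estimates. The simulated litter-style data and parameter values below are assumptions for illustration.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(4)
        rows = []
        for cid in range(300):
            dose = rng.choice([0.0, 1.0])                 # cluster-specific covariate
            b = rng.normal(0, 0.5)                        # cluster (litter) random effect
            size = rng.poisson(3 + 3 * (b > 0)) + 1       # informative cluster size
            for _ in range(size):
                y = 1.0 + 0.8 * dose + b + rng.normal()
                rows.append((cid, dose, y))
        df = pd.DataFrame(rows, columns=["cluster", "dose", "y"])

        # Within-cluster resampling: one observation per cluster, many times over
        n_resamples = 500
        coefs = []
        for i in range(n_resamples):
            sub = df.groupby("cluster").sample(n=1, random_state=i)
            coefs.append(smf.ols("y ~ dose", data=sub).fit().params["dose"])

        print("naive pooled estimate of the dose effect:", smf.ols("y ~ dose", data=df).fit().params["dose"])
        print("within-cluster resampling estimate:      ", np.mean(coefs))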

  7. A Mock Randomized Controlled Trial With Audience Response Technology for Teaching and Learning Epidemiology.

    PubMed

    Baker, Philip R A; Francis, Daniel P; Cathcart, Abby

    2017-04-01

    The study's objective was to apply and assess an active learning approach to epidemiology and critical appraisal. Active learning comprised a mock, randomized controlled trial (RCT) conducted with learners in 3 countries. The mock trial consisted of blindly eating red Smarties candy (intervention) compared to yellow Smarties (control) to determine whether red Smarties increase happiness. Audience response devices were employed with the threefold purpose of producing outcome data for analysis of the effects of red Smarties, identifying baseline and subsequent changes in participants' knowledge of and confidence in understanding RCTs, and assessing the teaching approach. Of those attending, 82% (117 of 143 learners) participated in the trial component. Participating in the mock trial was a positive experience, and the use of the technology aided learning. The trial produced data that learners analyzed in "real time" during the class. The mock RCT is a fun and engaging approach to teaching RCTs and helping students to develop skills in critical appraisal.

  8. Validating predictions from climate envelope models

    USGS Publications Warehouse

    Watling, J.; Bucklin, D.; Speroterra, C.; Brandt, L.; Cabal, C.; Romañach, Stephanie S.; Mazzotti, Frank J.

    2013-01-01

    Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species’ distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967–1971 (t1) and evaluated using occurrence data from 1998–2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species.
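
    A minimal presence/absence sketch of one of the modeling components described (a random forest climate envelope evaluated by sensitivity and specificity); the climate variables, the species' synthetic envelope, and the split into calibration and evaluation sets below are all assumptions, not the breeding bird data.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(5)
        n = 4000
        temp = rng.normal(10, 5, n)                 # mean breeding-season temperature (synthetic)
        precip = rng.normal(800, 200, n)            # annual precipitation (synthetic)
        # Species occupies a climatic "envelope": warm enough and wet enough sites,
        # with some noise so the envelope is not perfectly separable.
        presence = ((temp > 8) & (precip > 700)).astype(int)
        flip = rng.random(n) < 0.1
        presence = np.where(flip, 1 - presence, presence)

        X = np.column_stack([temp, precip])
        X_t1, X_t2, y_t1, y_t2 = train_test_split(X, presence, test_size=0.5, random_state=0)

        model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_t1, y_t1)
        pred = model.predict(X_t2)

        sensitivity = ((pred == 1) & (y_t2 == 1)).sum() / (y_t2 == 1).sum()   # true presences found
        specificity = ((pred == 0) & (y_t2 == 0)).sum() / (y_t2 == 0).sum()   # true absences found
        print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")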

  9. Effect of Retraining Approach-Avoidance Tendencies on an Exercise Task: A Randomized Controlled Trial.

    PubMed

    Cheval, Boris; Sarrazin, Philippe; Pelletier, Luc; Friese, Malte

    2016-12-01

    Promoting regular physical activity (PA) and lessening sedentary behaviors (SB) constitute a public health priority. Recent evidence suggests that PA and SB are not only related to reflective processes (eg, behavioral intentions), but also to impulsive approach-avoidance tendencies (IAAT). This study aims to test the effect of a computerized IAAT intervention on an exercise task. Participants (N = 115) were randomly assigned to 1 of 3 experimental conditions, in which they were either trained to approach PA and avoid SB (ApPA-AvSB condition), to approach SB and avoid PA (ApSB-AvPA condition), or to approach and avoid PA and SB equally often (active control condition). The main outcome variable was the time spent carrying out a moderate intensity exercise task. IAAT toward PA decreased in the ApSB-AvPA condition, tended to increase in the ApPA-AvSB condition, and remained stable in the control condition. Most importantly, the ApPA-AvSB manipulation led to more time spent exercising than the ApSB-AvPA condition. Sensitivity analyses excluding individuals who were highly physically active further revealed that participants in the ApPA-AvSB condition spent more time exercising than participants in the control condition. These findings provide preliminary evidence that a single intervention session can successfully change impulsive approach tendencies toward PA and can increase the time devoted to an exercise task, especially among individuals who need to be more physically active. Potential implications for health behavior theories and behavior change interventions are outlined.

  10. Meta-analysis of Odds Ratios: Current Good Practices

    PubMed Central

    Chang, Bei-Hung; Hoaglin, David C.

    2016-01-01

    Background Many systematic reviews of randomized clinical trials lead to meta-analyses of odds ratios. The customary methods of estimating an overall odds ratio involve weighted averages of the individual trials’ estimates of the logarithm of the odds ratio. That approach, however, has several shortcomings, arising from assumptions and approximations, that render the results unreliable. Although the problems have been documented in the literature for many years, the conventional methods persist in software and applications. A well-developed alternative approach avoids the approximations by working directly with the numbers of subjects and events in the arms of the individual trials. Objective We aim to raise awareness of methods that avoid the conventional approximations, can be applied with widely available software, and produce more-reliable results. Methods We summarize the fixed-effect and random-effects approaches to meta-analysis; describe conventional, approximate methods and alternative methods; apply the methods in a meta-analysis of 19 randomized trials of endoscopic sclerotherapy in patients with cirrhosis and esophagogastric varices; and compare the results. We demonstrate the use of SAS, Stata, and R software for the analysis. Results In the example, point estimates and confidence intervals for the overall log-odds-ratio differ between the conventional and alternative methods, in ways that can affect inferences. Programming is straightforward in the three software packages; an appendix gives the details. Conclusions The modest additional programming required should not be an obstacle to adoption of the alternative methods. Because their results are unreliable, use of the conventional methods for meta-analysis of odds ratios should be discontinued. PMID:28169977
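
    As a concrete point of reference for the conventional, approximate method critiqued above, the sketch below pools log odds ratios by inverse-variance weighting (with a 0.5 continuity correction) on invented 2x2 trial counts; it is not the alternative, likelihood-based method the authors recommend, and the counts are illustrative only.

        import numpy as np

        # Hypothetical per-trial 2x2 counts: events and totals in treatment and control arms.
        events_t = np.array([12, 5, 30])
        n_t = np.array([50, 40, 120])
        events_c = np.array([20, 9, 41])
        n_c = np.array([48, 42, 118])

        # 0.5 continuity correction guards against zero cells, one source of the
        # approximation error discussed in the abstract.
        a = events_t + 0.5
        b = n_t - events_t + 0.5
        c = events_c + 0.5
        d = n_c - events_c + 0.5

        log_or = np.log(a * d / (b * c))         # per-trial log odds ratio
        var = 1 / a + 1 / b + 1 / c + 1 / d      # Woolf variance approximation
        w = 1 / var                              # inverse-variance weights

        pooled = np.sum(w * log_or) / np.sum(w)  # fixed-effect pooled log odds ratio
        se = np.sqrt(1 / np.sum(w))
        ci = np.exp(pooled + np.array([-1.96, 1.96]) * se)
        print("pooled OR:", np.exp(pooled), "95% CI:", ci)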

  11. Randomized Trial of Family Therapy versus Non-Family Treatment for Adolescent Behavior Problems in Usual Care

    PubMed Central

    Hogue, Aaron; Dauber, Sarah; Henderson, Craig E.; Bobek, Molly; Johnson, Candace; Lichvar, Emily; Morgenstern, Jon

    2014-01-01

    Objective A major focus of implementation science is discovering whether evidence-based approaches can be delivered with fidelity and potency in routine practice. This randomized trial compared usual care family therapy (UC-FT), implemented without a treatment manual or extramural support as the standard-of-care approach in a community clinic, to non-family treatment (UC-Other) for adolescent conduct and substance use disorders. Method The study recruited 205 adolescents (mean age 15.7 years; 52% male; 59% Hispanic American, 21% African American) from a community referral network, enrolling 63% for primary mental health problems and 37% for primary substance use problems. Clients were randomly assigned to either the UC-FT site or one of five UC-Other sites. Implementation data confirmed that UC-FT showed adherence to the family therapy approach and differentiation from UC-Other. Follow-ups were completed at 3, 6, and 12 months post-baseline. Results There was no between-group difference in treatment attendance. Both conditions demonstrated improvements in externalizing, internalizing, and delinquency symptoms. However, UC-FT produced greater reductions in youth-reported externalizing and internalizing among the whole sample, in delinquency among substance-using youth, and in alcohol and drug use among substance-using youth. The degree to which UC-FT outperformed UC-Other was consistent with effect sizes from controlled trials of manualized family therapy models. Conclusions Non-manualized family therapy can be effective for adolescent behavior problems within diverse populations in usual care, and it may be superior to non-family alternatives. PMID:25496283

  12. A two-stage Monte Carlo approach to the expression of uncertainty with finite sample sizes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crowder, Stephen Vernon; Moyer, Robert D.

    2005-05-01

    Proposed supplement I to the GUM outlines a 'propagation of distributions' approach to deriving the distribution of a measurand for any non-linear function and for any set of random inputs. The supplement's proposed Monte Carlo approach assumes that the distributions of the random inputs are known exactly. This implies that the sample sizes are effectively infinite. In this case, the mean of the measurand can be determined precisely using a large number of Monte Carlo simulations. In practice, however, the distributions of the inputs will rarely be known exactly, but must be estimated using possibly small samples. If these approximated distributions are treated as exact, the uncertainty in estimating the mean is not properly taken into account. In this paper, we propose a two-stage Monte Carlo procedure that explicitly takes into account the finite sample sizes used to estimate parameters of the input distributions. We will illustrate the approach with a case study involving the efficiency of a thermistor mount power sensor. The performance of the proposed approach will be compared to the standard GUM approach for finite samples using simple non-linear measurement equations. We will investigate performance in terms of coverage probabilities of derived confidence intervals.
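
    The following sketch conveys the two-stage idea on a toy measurement equation y = x1 * x2 rather than the thermistor-mount case study: an outer stage draws input-distribution parameters consistent with the small observed samples (here via the standard normal-model posterior under a noninformative prior, one reasonable way to encode finite-sample uncertainty), and an inner stage propagates each draw through the measurement equation. The numbers and the parameter-drawing rule are illustrative assumptions, not the authors' exact procedure.

        import numpy as np

        rng = np.random.default_rng(1)

        def draw_mu_sigma(sample):
            """Draw one (mu, sigma) pair consistent with a small sample of a normal input:
            sigma^2 ~ (n-1)s^2 / chi2_{n-1}, then mu | sigma ~ N(xbar, sigma/sqrt(n))."""
            n, xbar, s2 = len(sample), sample.mean(), sample.var(ddof=1)
            sigma2 = (n - 1) * s2 / rng.chisquare(n - 1)
            mu = rng.normal(xbar, np.sqrt(sigma2 / n))
            return mu, np.sqrt(sigma2)

        # Small observed samples for the two inputs (invented values).
        x1_obs = np.array([10.1, 9.8, 10.3, 10.0, 9.9])
        x2_obs = np.array([2.02, 1.97, 2.05, 2.01])

        n_outer, n_inner = 2000, 500
        means = np.empty(n_outer)
        for i in range(n_outer):                  # stage 1: parameter uncertainty
            mu1, sd1 = draw_mu_sigma(x1_obs)
            mu2, sd2 = draw_mu_sigma(x2_obs)
            x1 = rng.normal(mu1, sd1, n_inner)    # stage 2: propagation of distributions
            x2 = rng.normal(mu2, sd2, n_inner)
            means[i] = np.mean(x1 * x2)           # toy non-linear measurand

        lo, hi = np.percentile(means, [2.5, 97.5])
        print(f"measurand mean: {means.mean():.3f}, 95% interval: [{lo:.3f}, {hi:.3f}]")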

  13. Acute effects of intoxication and arousal on approach / avoidance biases toward sexual risk stimuli in heterosexual men

    PubMed Central

    Simons, Jeffrey S.; Maisto, Stephen A.; Wray, Tyler B.; Emery, Noah N.

    2015-01-01

    This study tested the effects of alcohol intoxication and physiological arousal on cognitive biases toward erotic stimuli and condoms. Ninety-seven heterosexual men were randomized to 1 of 6 independent conditions in a 2 (high arousal or control) × 3 (alcohol [target BAC = 0.08], placebo, or juice control) design and then completed a variant of the Approach Avoidance Task (AAT). The AAT assessed reaction times toward approaching and avoiding erotic stimuli and condoms with a joystick. Consistent with hypotheses, the alcohol condition exhibited an approach bias toward erotic stimuli, whereas the control and placebo groups exhibited an approach bias toward condom stimuli. Similarly, participants in the high arousal condition exhibited an approach bias toward erotic stimuli, and the low arousal control condition exhibited an approach bias toward condoms. The results suggest that acute changes in intoxication and physiological arousal independently foster biased responding toward sexual stimuli, and these biases are associated with sexual risk intentions. PMID:25808719

  14. Random Matrix Approach for Primal-Dual Portfolio Optimization Problems

    NASA Astrophysics Data System (ADS)

    Tada, Daichi; Yamamoto, Hisashi; Shinzato, Takashi

    2017-12-01

    In this paper, we revisit the portfolio optimization problems of the minimization/maximization of investment risk under constraints of budget and investment concentration (primal problem) and the maximization/minimization of investment concentration under constraints of budget and investment risk (dual problem) for the case that the variances of the return rates of the assets are identical. We analyze both optimization problems by the Lagrange multiplier method and the random matrix approach. Thereafter, we compare the results obtained from our proposed approach with the results obtained in previous work. Moreover, we use numerical experiments to validate the results obtained from the replica approach and the random matrix approach as methods for analyzing both the primal and dual portfolio optimization problems.

  15. The effect of Think Pair Share (TPS) using scientific approach on students’ self-confidence and mathematical problem-solving

    NASA Astrophysics Data System (ADS)

    Rifa’i, A.; Lestari, H. P.

    2018-03-01

    This study was designed to determine the effects of Think Pair Share (TPS) using a scientific approach on students' self-confidence and mathematical problem-solving. A quasi-experimental pre-test post-test non-equivalent group design was used. A self-confidence questionnaire and a problem-solving test were used to measure the two variables. Two first-grade classes of a religious senior high school (MAN) in Indonesia were randomly selected for this study. The sequence-and-series material from the mathematics textbook was taught in the traditional way in the control group and with the TPS-using-scientific-approach method in the experimental group. For data analysis regarding students' problem-solving skill and self-confidence, the one-sample t-test, independent-samples t-test, and multivariate analysis of variance (MANOVA) were used. The results showed that (1) both TPS using a scientific approach and traditional learning had positive effects, and (2) TPS using a scientific approach had a greater effect than traditional learning on students' self-confidence and problem-solving skill.

  16. Bayesian analysis of zero inflated spatiotemporal HIV/TB child mortality data through the INLA and SPDE approaches: Applied to data observed between 1992 and 2010 in rural North East South Africa

    NASA Astrophysics Data System (ADS)

    Musenge, Eustasius; Chirwa, Tobias Freeman; Kahn, Kathleen; Vounatsou, Penelope

    2013-06-01

    Longitudinal mortality data with few deaths usually have problems of zero-inflation. This paper presents and applies two Bayesian models which cater for zero-inflation and for spatial and temporal random effects. To reduce the computational burden experienced when a large number of geo-locations are treated as a Gaussian field (GF), we transformed the field to a Gaussian Markov random field (GMRF) by triangulation. We then modelled the spatial random effects using stochastic partial differential equations (SPDEs). Inference was done using a computationally efficient alternative to Markov chain Monte Carlo (MCMC), the Integrated Nested Laplace Approximation (INLA), which is suited to GMRFs. The models were applied to data from 71,057 children aged 0 to under 10 years from rural north-east South Africa living in 15,703 households over the years 1992-2010. We found protective effects on HIV/TB mortality due to greater birth weight, older age and more antenatal clinic visits during pregnancy (adjusted RR (95% CI): 0.73 (0.53; 0.99), 0.18 (0.14; 0.22) and 0.96 (0.94; 0.97), respectively). Therefore, childhood HIV/TB mortality could be reduced if mothers are better catered for during pregnancy, as this can reduce mother-to-child transmissions and contribute to improved birth weights. The INLA and SPDE approaches are computationally good alternatives for modelling large multilevel spatiotemporal GMRF data structures.

  17. Thymectomy in Myasthenia Gravis

    PubMed Central

    Aydin, Yener; Ulas, Ali Bilal; Mutlu, Vahit; Colak, Abdurrahim; Eroglu, Atilla

    2017-01-01

    In recent years, thymectomy has become a widespread procedure in the treatment of myasthenia gravis (MG). The likelihood of remission was highest in mild preoperative disease classifications (Osserman classification 1, 2A). In the absence of thymoma or hyperplasia, there was no relationship between age or gender and remission after thymectomy. Randomized trials comparing conservative treatment with thymectomy in MG have recently begun. As in non-randomized trials, remission with thymectomy was better than with conservative, medication-only treatment. There are four major methods for the surgical approach: transcervical, minimally invasive, transsternal, and combined transcervical-transsternal thymectomy. The transsternal approach has been the accepted standard surgical approach for many years. In recent years, the use of thymectomy performed with minimally invasive thoracoscopic and robotic techniques has been increasing. There are no randomized controlled studies comparing surgical techniques. However, comparisons of non-randomized trials show that minimally invasive thymectomy approaches give results similar to those of more aggressive approaches. PMID:28416933

  18. Comparing Web, Group and Telehealth Formats of a Military Parenting Program

    DTIC Science & Technology

    2017-06-01

    directed approaches. Comparative effectiveness will be tested by specifying a non-equivalence hypothesis for group-based and web-facilitated approaches relative to self-directed... documents for review and approval. 1a. Finalize human subjects protocol and consent documents for pilot group (N=5 families), and randomized controlled

  19. Capturing the Cumulative Effects of School Reform: An 11-Year Study of the Impacts of America's Choice on Student Achievement

    ERIC Educational Resources Information Center

    May, Henry; Supovitz, Jonathan A.

    2006-01-01

    This article presents the results of an 11-year longitudinal study of the impact of America's Choice comprehensive school reform (CSR) design on student learning gains in Rochester, New York. A quasi-experimental interrupted time-series approach using Bayesian hierarchical growth curve analysis with crossed random effects is used to compare the…

  20. The Employment Retention and Advancement Project: How Effective Are Different Approaches Aiming to Increase Employment Retention and Advancement? Final Impacts for Twelve Models. Executive Summary

    ERIC Educational Resources Information Center

    Hendra, Richard; Dillman, Keri-Nicole; Hamilton, Gayle; Lundquist, Erika; Martinson, Karin; Wavelet, Melissa

    2010-01-01

    This report summarizes the final impact results for the national Employment Retention and Advancement (ERA) project. This project tested, using a random assignment design, the effectiveness of numerous programs intended to promote steady work and career advancement. All the programs targeted current and former welfare recipients and other low-wage…

  1. Comparing Ways of Altering Parent-Child Interaction.

    ERIC Educational Resources Information Center

    Kogan, Kate L.; Tyler, Nancy B.

    This study tests the effectiveness of 2 approaches to parenting instruction for parents of preschool developmentally delayed children aged 3 through 5. Sixty parent/child pairs were randomly assigned to 1 of 3 groups: (1) individual parenting instruction only, (2) individual plus group instruction, and (3) comparison group with no instruction.…

  2. A Randomized Controlled Trial of Brief Interventions for Body Dissatisfaction

    ERIC Educational Resources Information Center

    Wade, Tracey; George, Wing Man; Atkinson, Melissa

    2009-01-01

    The authors examined the relative effectiveness of 3 different approaches to the experience of body dissatisfaction compared to a control and ruminative attention control condition, with respect to increasing weight and appearance satisfaction. One hundred female undergraduates (mean age = 24.38, SD = 9.39) underwent a body dissatisfaction…

  3. Treatment Approaches for Presurgical Anxiety: A Health Care Concern.

    ERIC Educational Resources Information Center

    Keogh, Nancy Jones; And Others

    To test the differential effectiveness of preoperative instruction (factual information, emotional expression, and trust relationship), mastery modeling, and coping modeling, 100 children, aged 7-12, were studied. Subjects from two hospitals were randomly assigned to four experimental groups and one control group: alone (the control group, N=20);…

  4. A Composite Likelihood Inference in Latent Variable Models for Ordinal Longitudinal Responses

    ERIC Educational Resources Information Center

    Vasdekis, Vassilis G. S.; Cagnone, Silvia; Moustaki, Irini

    2012-01-01

    The paper proposes a composite likelihood estimation approach that uses bivariate instead of multivariate marginal probabilities for ordinal longitudinal responses using a latent variable model. The model considers time-dependent latent variables and item-specific random effects to be accountable for the interdependencies of the multivariate…

  5. Allowing for Correlations between Correlations in Random-Effects Meta-Analysis of Correlation Matrices

    ERIC Educational Resources Information Center

    Prevost, A. Toby; Mason, Dan; Griffin, Simon; Kinmonth, Ann-Louise; Sutton, Stephen; Spiegelhalter, David

    2007-01-01

    Practical meta-analysis of correlation matrices generally ignores covariances (and hence correlations) between correlation estimates. The authors consider various methods for allowing for covariances, including generalized least squares, maximum marginal likelihood, and Bayesian approaches, illustrated using a 6-dimensional response in a series of…

  6. Recruiting Unmotivated Smokers into a Smoking Induction Trial

    ERIC Educational Resources Information Center

    Harris, Kari Jo; Bradley-Ewing, Andrea; Goggin, Kathy; Richter, Kimber P.; Patten, Christi; Williams, Karen; Lee, Hyoung S.; Staggs, Vincent S.; Catley, Delwyn

    2016-01-01

    Little is known about effective methods to recruit unmotivated smokers into cessation induction trials, the reasons unmotivated smokers agree to participate, and the impact of those reasons on study outcomes. A mixed-method approach was used to examine recruitment data from a randomized controlled cessation induction trial that enrolled 255 adult…

  7. Alcohol-Related Incident Guardianship and Undergraduate College Parties: Enhancing the Social Norms Marketing Approach

    ERIC Educational Resources Information Center

    Gilbertson, Troy A.

    2006-01-01

    This randomized experiment examines the effects of contextual information on undergraduate college student's levels of alcohol-related incident guardianship at college parties. The research is conceptualized using routine activities theory and the theory of planned behavior. The experiment examines attitudinal variations about heavy drinking…

  8. Mastery, Maladaptive Learning Behaviour, and Academic Achievement: An Intervention Approach

    ERIC Educational Resources Information Center

    Ranellucci, John; Hall, Nathan; Muis, Krista; Lajoie, Susanne; Robinson, Kristy

    2017-01-01

    The effects of three interventions designed to boost academic achievement among mastery-oriented students were evaluated on interest-based studying, social desirability, and perceived goal difficulty. Undergraduate students (N = 177) completed relevant self-report measures at the beginning and the end of the semester and were randomly assigned to…

  9. Random walks of colloidal probes in viscoelastic materials

    NASA Astrophysics Data System (ADS)

    Khan, Manas; Mason, Thomas G.

    2014-04-01

    To overcome limitations of using a single fixed time step in random walk simulations, such as those that rely on the classic Wiener approach, we have developed an algorithm for exploring random walks based on random temporal steps that are uniformly distributed in logarithmic time. This improvement enables us to generate random-walk trajectories of probe particles that span a highly extended dynamic range in time, thereby facilitating the exploration of probe motion in soft viscoelastic materials. By combining this faster approach with a Maxwell-Voigt model (MVM) of linear viscoelasticity, based on a slowly diffusing harmonically bound Brownian particle, we rapidly create trajectories of spherical probes in soft viscoelastic materials over more than 12 orders of magnitude in time. Appropriate windowing of these trajectories over different time intervals demonstrates that the random walk for the MVM is neither self-similar nor self-affine, even if the viscoelastic material is isotropic. We extend this approach to spatially anisotropic viscoelastic materials, using binning to calculate the anisotropic mean square displacements and creep compliances along different orthogonal directions. The elimination of a fixed time step in simulations of random processes, including random walks, opens up interesting possibilities for modeling dynamics and response over a highly extended temporal dynamic range.
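
    A minimal sketch of the log-spaced-time ingredient: observation times are drawn uniformly in log10(time), and a harmonically bound Brownian particle (an Ornstein-Uhlenbeck process, which has an exact update over an arbitrary step) is advanced between those times. Parameter values are arbitrary, and this is only one building block, not the full Maxwell-Voigt construction described above.

        import numpy as np

        rng = np.random.default_rng(0)

        # Observation times uniformly distributed in log10(time), spanning 12 decades.
        n_steps = 4000
        t = np.sort(10.0 ** rng.uniform(-6, 6, n_steps))

        tau, eq_var = 1e-2, 1e-3      # relaxation time and equilibrium variance (arbitrary units)
        x = np.empty(n_steps)
        x[0] = rng.normal(0.0, np.sqrt(eq_var))
        for k in range(1, n_steps):
            dt = t[k] - t[k - 1]
            decay = np.exp(-dt / tau)
            # Exact Ornstein-Uhlenbeck update, valid for any (non-uniform) time step.
            x[k] = x[k - 1] * decay + rng.normal(0.0, np.sqrt(eq_var * (1.0 - decay ** 2)))

        msd = (x - x[0]) ** 2          # squared displacement relative to the starting point
        print("late-time plateau of the squared displacement ~", msd[t > 1.0].mean())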

  10. Statistical controversies in clinical research: an initial evaluation of a surrogate end point using a single randomized clinical trial and the Prentice criteria

    PubMed Central

    Heller, G.

    2015-01-01

    Surrogate end point research has grown in recent years with the increasing development and usage of biomarkers in clinical research. Surrogacy analysis is derived through randomized clinical trial data and it is carried out at the individual level and at the trial level. A common surrogate analysis at the individual level is the application of the Prentice criteria. An approach for the evaluation of the Prentice criteria is discussed, with a focus on its most difficult component, the determination of whether the treatment effect is captured by the surrogate. An interpretation of this criterion is illustrated using data from a randomized clinical trial in prostate cancer. PMID:26254442

  11. A practical approach to automate randomized design of experiments for ligand-binding assays.

    PubMed

    Tsoi, Jennifer; Patel, Vimal; Shih, Judy

    2014-03-01

    Design of experiments (DOE) is utilized in optimizing ligand-binding assays by modeling factor effects. To reduce the analyst's workload and the error inherent in executing DOE manually, we propose the integration of automated liquid handlers to perform the randomized designs. A randomized design created in statistical software was imported into a custom macro that converted the design into a liquid-handler worklist to automate reagent delivery. An optimized assay was transferred to a contract research organization, resulting in a successful validation. We developed a practical solution for assay optimization by integrating DOE and automation to increase assay robustness and enable successful method transfer. The flexibility of this process allows it to be applied to a variety of assay designs.
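
    To illustrate the kind of glue code involved, the sketch below randomizes a small two-factor design and writes it out as a generic liquid-handler worklist (a CSV of destination well, factor levels, and volume). The factor names, plate layout, and column headings are invented placeholders; a real deployment would match the worklist format expected by the specific instrument and the design exported from the statistical software.

        import csv
        import itertools
        import random

        random.seed(42)  # a fixed seed stands in for a reproducible run key

        # Two hypothetical assay factors at three levels each.
        capture_conc = [0.5, 1.0, 2.0]        # ug/mL
        detect_dilution = [1000, 2000, 4000]  # 1:x

        runs = list(itertools.product(capture_conc, detect_dilution))
        random.shuffle(runs)                  # randomized run order, as in the DOE

        wells = [f"{row}{col}" for row in "ABCDEFGH" for col in range(1, 13)]

        with open("worklist.csv", "w", newline="") as fh:
            writer = csv.writer(fh)
            writer.writerow(["dest_well", "capture_ug_per_mL", "detect_dilution", "volume_uL"])
            for well, (cap, det) in zip(wells, runs):
                writer.writerow([well, cap, det, 50])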

  12. HyDEn: A Hybrid Steganocryptographic Approach for Data Encryption Using Randomized Error-Correcting DNA Codes

    PubMed Central

    Regoui, Chaouki; Durand, Guillaume; Belliveau, Luc; Léger, Serge

    2013-01-01

    This paper presents a novel hybrid DNA encryption (HyDEn) approach that uses randomized assignments of unique error-correcting DNA Hamming code words for single characters in the extended ASCII set. HyDEn relies on custom-built quaternary codes and a private key used in the randomized assignment of code words and the cyclic permutations applied on the encoded message. Along with its ability to detect and correct errors, HyDEn equals or outperforms existing cryptographic methods and represents a promising in silico DNA steganographic approach. PMID:23984392
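
    The sketch below shows only the key-driven randomized assignment of nucleotide code words to extended-ASCII characters; as a placeholder it uses all 256 four-letter quaternary words rather than the error-correcting DNA Hamming code words and cyclic permutations that HyDEn actually employs, so it illustrates the randomization step but provides no error detection or correction.

        import itertools
        import random

        def make_codebook(key: int):
            """Key-seeded random assignment of a unique 4-nucleotide word to each byte value."""
            words = ["".join(w) for w in itertools.product("ACGT", repeat=4)]  # 4**4 = 256 words
            rng = random.Random(key)
            rng.shuffle(words)
            encode = {byte: words[byte] for byte in range(256)}
            decode = {word: byte for byte, word in encode.items()}
            return encode, decode

        def dna_encode(message: str, key: int) -> str:
            encode, _ = make_codebook(key)
            return "".join(encode[b] for b in message.encode("latin-1"))

        def dna_decode(strand: str, key: int) -> str:
            _, decode = make_codebook(key)
            return bytes(decode[strand[i:i + 4]] for i in range(0, len(strand), 4)).decode("latin-1")

        strand = dna_encode("secret", key=2013)
        assert dna_decode(strand, key=2013) == "secret"
        print(strand)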

  13. Image subsampling and point scoring approaches for large-scale marine benthic monitoring programs

    NASA Astrophysics Data System (ADS)

    Perkins, Nicholas R.; Foster, Scott D.; Hill, Nicole A.; Barrett, Neville S.

    2016-07-01

    Benthic imagery is an effective tool for quantitative description of ecologically and economically important benthic habitats and biota. The recent development of autonomous underwater vehicles (AUVs) allows surveying of spatial scales that were previously unfeasible. However, an AUV collects a large number of images, the scoring of which is time and labour intensive. There is a need to optimise the way that subsamples of imagery are chosen and scored to gain meaningful inferences for ecological monitoring studies. We examine the trade-off between the number of images selected within transects and the number of random points scored within images on the percent cover of target biota, the typical output of such monitoring programs. We also investigate the efficacy of various image selection approaches, such as systematic or random, on the bias and precision of cover estimates. We use simulated biotas that have varying size, abundance and distributional patterns. We find that a relatively small sampling effort is required to minimise bias. An increased precision for groups that are likely to be the focus of monitoring programs is best gained through increasing the number of images sampled rather than the number of points scored within images. For rare species, sampling using point count approaches is unlikely to provide sufficient precision, and alternative sampling approaches may need to be employed. The approach by which images are selected (simple random sampling, regularly spaced etc.) had no discernible effect on mean and variance estimates, regardless of the distributional pattern of biota. Field validation of our findings is provided through Monte Carlo resampling analysis of a previously scored benthic survey from temperate waters. We show that point count sampling approaches are capable of providing relatively precise cover estimates for candidate groups that are not overly rare. The amount of sampling required, in terms of both the number of images and number of points, varies with the abundance, size and distributional pattern of target biota. Therefore, we advocate either the incorporation of prior knowledge or the use of baseline surveys to establish key properties of intended target biota in the initial stages of monitoring programs.
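
    The trade-off described above can be explored with a small simulation (simplified, with invented parameters): per-image true cover is drawn from a beta distribution to mimic patchiness, point scores within an image are binomial, and the standard error of the estimated percent cover is compared across allocations with equal total effort.

        import numpy as np

        rng = np.random.default_rng(7)

        def cover_se(n_images, n_points, mean_cover=0.10, patchiness=20, n_sims=3000):
            """Monte Carlo standard error of estimated cover for one subsampling allocation."""
            a = mean_cover * patchiness
            b = (1 - mean_cover) * patchiness
            per_image = rng.beta(a, b, size=(n_sims, n_images))  # between-image variation
            scored = rng.binomial(n_points, per_image)           # random points scored per image
            estimates = scored.sum(axis=1) / (n_images * n_points)
            return estimates.std(ddof=1)

        total_effort = 2000                      # images x points held constant
        for n_images, n_points in [(40, 50), (100, 20), (200, 10)]:
            assert n_images * n_points == total_effort
            print(f"{n_images:>3} images x {n_points:>2} points -> SE = {cover_se(n_images, n_points):.4f}")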

  14. The effects of mindfulness-based stress reduction on psychosocial outcomes and quality of life in early-stage breast cancer patients: a randomized trial.

    PubMed

    Henderson, Virginia P; Clemow, Lynn; Massion, Ann O; Hurley, Thomas G; Druker, Susan; Hébert, James R

    2012-01-01

    The aim of this study was to determine the effectiveness of a mindfulness-based stress-reduction (MBSR) program on quality of life (QOL) and psychosocial outcomes in women with early-stage breast cancer, using a three-arm randomized controlled clinical trial (RCT). The RCT enrolled 172 women, aged 20-65 with stage I or II breast cancer, and compared the 8-week MBSR program to a nutrition education program (NEP) and usual supportive care (UC). Follow-up was performed at three post-intervention points: 4 months, 1 year, and 2 years. Standardized, validated self-administered questionnaires were used to assess psychosocial variables. Statistical analysis included descriptive and regression analyses incorporating both intention-to-treat and post hoc multivariable approaches. Of the 163 women with complete data at baseline, those who were randomized to MBSR experienced a significant improvement in the primary measures of QOL and coping outcomes compared to the NEP, UC, or both, including the spirituality subscale of the FACT-B as well as increases on the dealing-with-illness scale in active behavioral coping and active cognitive coping. Secondary outcome improvements resulting in significant between-group contrasts favoring the MBSR group at 4 months included meaningfulness, depression, paranoid ideation, hostility, anxiety, unhappiness, and emotional control. Results tended to decline at 12 months and even more at 24 months, though at all times they were as robust in women with lower expectation of effect as in those with higher expectation. The MBSR intervention appears to benefit psychosocial adjustment in cancer patients, over and above the effects of usual care or a credible control condition. The universality of effects across levels of expectation indicates a potential to utilize this stress reduction approach as complementary therapy in oncologic practice.

  15. Mindfulness-Based Cognitive Therapy as a Treatment for Chronic Tinnitus: A Randomized Controlled Trial.

    PubMed

    McKenna, Laurence; Marks, Elizabeth M; Hallsworth, Christopher A; Schaette, Roland

    2017-01-01

    Tinnitus is experienced by up to 15% of the population and can lead to significant disability and distress. There is rarely a medical or surgical target and psychological therapies are recommended. We investigated whether mindfulness-based cognitive therapy (MBCT) could offer an effective new therapy for tinnitus. This single-site randomized controlled trial compared MBCT to intensive relaxation training (RT) for chronic, distressing tinnitus in adults. Both treatments involved 8 weekly, 120-min sessions focused on either relaxation (RT) or mindfulness meditation (MBCT). Assessments were completed at baseline and at treatment commencement 8 weeks later. The primary outcomes were tinnitus severity (Tinnitus Questionnaire) and psychological distress (Clinical Outcomes in Routine Evaluation - Non-Risk, CORE-NR), 16 weeks after baseline. The analysis utilized a modified intention-to-treat approach. A total of 75 patients were randomly allocated to MBCT (n = 39) or RT (n = 36). Both groups showed significant reductions in tinnitus severity and loudness, psychological distress, anxiety, depression, and disability. MBCT led to a significantly greater reduction in tinnitus severity than RT, with a mean difference of 6.3 (95% CI 1.3-11.4, p = 0.016). Effects persisted 6 months later, with a mean difference of 7.2 (95% CI 2.1-2.3, p = 0.006) and a standardized effect size of 0.56 (95% CI 0.16-0.96). Treatment was effective regardless of initial tinnitus severity, duration, or hearing loss. MBCT is effective in reducing tinnitus severity in chronic tinnitus patients compared to intensive RT. It also reduces psychological distress and disability. Future studies should explore the generalizability of this approach and how outcome relates to different aspects of the intervention. © 2017 The Author(s) Published by S. Karger AG, Basel.

  16. Assessing the predictive capability of randomized tree-based ensembles in streamflow modelling

    NASA Astrophysics Data System (ADS)

    Galelli, S.; Castelletti, A.

    2013-02-01

    Combining randomization methods with ensemble prediction is emerging as an effective option to balance accuracy and computational efficiency in data-driven modeling. In this paper we investigate the prediction capability of extremely randomized trees (Extra-Trees), in terms of accuracy, explanation ability and computational efficiency, in a streamflow modeling exercise. Extra-Trees are a totally randomized tree-based ensemble method that (i) alleviates the poor generalization property and tendency to overfitting of traditional standalone decision trees (e.g. CART); (ii) is computationally very efficient; and (iii) allows one to infer the relative importance of the input variables, which might help in the ex-post physical interpretation of the model. The Extra-Trees potential is analyzed on two real-world case studies (Marina catchment (Singapore) and Canning River (Western Australia)) representing two different morphoclimatic contexts, in comparison with other tree-based methods (CART and M5) and parametric data-driven approaches (ANNs and multiple linear regression). Results show that Extra-Trees perform comparably to the best of the benchmarks (i.e. M5) in both watersheds, while outperforming the other approaches in terms of computational requirements when adopted on large datasets. In addition, the ranking of the input variables provided can be given a physically meaningful interpretation.
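
    A minimal scikit-learn sketch of the same modelling idea on synthetic data (not the Marina or Canning datasets): lagged rainfall and the previous day's flow are the inputs, an extremely randomized trees regressor is the data-driven model, and the built-in impurity-based importances stand in for the input-variable ranking mentioned above.

        import numpy as np
        from sklearn.ensemble import ExtraTreesRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(3)

        # Synthetic daily rainfall and a toy flow signal driven by lagged rainfall.
        n = 2000
        rain = rng.gamma(shape=0.3, scale=10.0, size=n)
        flow = np.zeros(n)
        for t in range(3, n):
            flow[t] = 0.6 * flow[t - 1] + 0.3 * rain[t - 1] + 0.1 * rain[t - 2] + rng.normal(0, 0.5)

        # Predictors: rainfall at lags 1-3 and flow at lag 1; target: flow at time t.
        X = np.column_stack([rain[2:-1], rain[1:-2], rain[:-3], flow[2:-1]])
        y = flow[3:]
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        model = ExtraTreesRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
        print("R^2 on held-out data:", round(model.score(X_te, y_te), 3))
        print("relative importances:", model.feature_importances_.round(3))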

  17. Assessing the predictive capability of randomized tree-based ensembles in streamflow modelling

    NASA Astrophysics Data System (ADS)

    Galelli, S.; Castelletti, A.

    2013-07-01

    Combining randomization methods with ensemble prediction is emerging as an effective option to balance accuracy and computational efficiency in data-driven modelling. In this paper, we investigate the prediction capability of extremely randomized trees (Extra-Trees), in terms of accuracy, explanation ability and computational efficiency, in a streamflow modelling exercise. Extra-Trees are a totally randomized tree-based ensemble method that (i) alleviates the poor generalisation property and tendency to overfitting of traditional standalone decision trees (e.g. CART); (ii) is computationally efficient; and (iii) allows one to infer the relative importance of the input variables, which might help in the ex-post physical interpretation of the model. The Extra-Trees potential is analysed on two real-world case studies - Marina catchment (Singapore) and Canning River (Western Australia) - representing two different morphoclimatic contexts. The evaluation is performed against other tree-based methods (CART and M5) and parametric data-driven approaches (ANNs and multiple linear regression). Results show that Extra-Trees perform comparably to the best of the benchmarks (i.e. M5) in both watersheds, while outperforming the other approaches in terms of computational requirements when adopted on large datasets. In addition, the ranking of the input variables provided can be given a physically meaningful interpretation.

  18. The effects of the one-step replica symmetry breaking on the Sherrington-Kirkpatrick spin glass model in the presence of random field with a joint Gaussian probability density function for the exchange interactions and random fields

    NASA Astrophysics Data System (ADS)

    Hadjiagapiou, Ioannis A.; Velonakis, Ioannis N.

    2018-07-01

    The Sherrington-Kirkpatrick Ising spin glass model, in the presence of a random magnetic field, is investigated within the framework of the one-step replica symmetry breaking. The two random variables (exchange integral interaction Jij and random magnetic field hi) are drawn from a joint Gaussian probability density function characterized by a correlation coefficient ρ, assuming positive and negative values. The thermodynamic properties, the three different phase diagrams and system's parameters are computed with respect to the natural parameters of the joint Gaussian probability density function at non-zero and zero temperatures. The low temperature negative entropy controversy, a result of the replica symmetry approach, has been partly remedied in the current study, leading to a less negative result. In addition, the present system possesses two successive spin glass phase transitions with characteristic temperatures.

  19. Analysis of foliage effects on mobile propagation in dense urban environments

    NASA Astrophysics Data System (ADS)

    Bronshtein, Alexander; Mazar, Reuven; Lu, I.-Tai

    2000-07-01

    Attempts to reduce the interference level and to increase the spectral efficiency of cellular radio communication systems operating in dense urban and suburban areas lead to the microcellular approach, with a consequent requirement to lower antenna heights. In large metropolitan areas with tall buildings, this requirement causes a situation where the transmitting and receiving antennas are both located below the rooftops, and the city street acts as a type of waveguiding channel for the propagating signal. In this work, the city street is modeled as a random multislit waveguide with randomly distributed regions of foliage parallel to the building boundaries. The statistical propagation characteristics are expressed in terms of multiple ray-fields approaching the observer. Algorithms for predicting the path-loss along the waveguide and for computing the transverse field structure are presented.

  20. A web-based approach to managing stress and mood disorders in the workforce.

    PubMed

    Billings, Douglas W; Cook, Royer F; Hendrickson, April; Dove, David C

    2008-08-01

    To evaluate the effectiveness of a web-based multimedia health promotion program for the workplace, designed to help reduce stress and to prevent depression, anxiety, and substance abuse. Using a randomized controlled trial design, 309 working adults were randomly assigned to the web-based condition or to a wait-list control condition. All participants were assessed on multiple self-reported outcomes at pretest and posttest. Relative to controls, the web-based group reduced their stress, increased their knowledge of depression and anxiety, developed more positive attitudes toward treatment, and adopted a more healthy approach to alcohol consumption. We found that a brief and easily adaptable web-based stress management program can simultaneously reduce worker stress and address stigmatized behavioral health problems by embedding this prevention material into a more positive stress management framework.

  1. Large-scale inverse model analyses employing fast randomized data reduction

    NASA Astrophysics Data System (ADS)

    Lin, Youzuo; Le, Ellen B.; O'Malley, Daniel; Vesselinov, Velimir V.; Bui-Thanh, Tan

    2017-08-01

    When the number of observations is large, it is computationally challenging to apply classical inverse modeling techniques. We have developed a new computationally efficient technique for solving inverse problems with a large number of observations (e.g., on the order of 10^7 or greater). Our method, which we call the randomized geostatistical approach (RGA), is built upon the principal component geostatistical approach (PCGA). We employ a data reduction technique combined with the PCGA to improve the computational efficiency and reduce the memory usage. Specifically, we employ a randomized numerical linear algebra technique based on a so-called "sketching" matrix to effectively reduce the dimension of the observations without losing the information content needed for the inverse analysis. In this way, the computational and memory costs for RGA scale with the information content rather than the size of the calibration data. Our algorithm is coded in Julia and implemented in the MADS open-source high-performance computational framework (http://mads.lanl.gov). We apply our new inverse modeling method to invert for a synthetic transmissivity field. Compared to a standard geostatistical approach (GA), our method is more efficient when the number of observations is large. Most importantly, our method is capable of solving larger inverse problems than the standard GA and PCGA approaches. Therefore, our new model inversion method is a powerful tool for solving large-scale inverse problems. The method can be applied in any field and is not limited to hydrogeological applications such as the characterization of aquifer heterogeneity.
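
    The core dimension-reduction trick can be illustrated in a few lines: a random "sketching" matrix compresses a tall linear(ized) system before it is solved in the reduced space, so the solve cost scales with the sketch size rather than the number of observations. The toy least-squares problem below is only meant to convey that idea; it is not the RGA/PCGA implementation in MADS.

        import numpy as np

        rng = np.random.default_rng(0)

        n_obs, n_par, k = 20_000, 50, 200          # many observations, few parameters, small sketch
        G = rng.normal(size=(n_obs, n_par))        # forward/sensitivity matrix (toy, dense)
        m_true = rng.normal(size=n_par)
        d = G @ m_true + 0.01 * rng.normal(size=n_obs)

        # Gaussian sketching matrix: k random combinations replace the n_obs observations.
        S = rng.normal(size=(k, n_obs)) / np.sqrt(k)

        m_full, *_ = np.linalg.lstsq(G, d, rcond=None)            # full-data solve
        m_sketch, *_ = np.linalg.lstsq(S @ G, S @ d, rcond=None)  # sketched solve

        print("max |difference| between the two solutions:", np.abs(m_full - m_sketch).max())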

  2. Seeking mathematics success for college students: a randomized field trial of an adapted approach

    NASA Astrophysics Data System (ADS)

    Gula, Taras; Hoessler, Carolyn; Maciejewski, Wes

    2015-11-01

    Many students enter the Canadian college system with insufficient mathematical ability and leave the system with little improvement. Those students who enter with poor mathematics ability typically take a developmental mathematics course as their first and possibly only mathematics course. The educational experiences that comprise a developmental mathematics course vary widely and are, too often, ineffective at improving students' ability. This trend is concerning, since low mathematics ability is known to be related to lower rates of success in subsequent courses. To date, little attention has been paid to the selection of an instructional approach to consistently apply across developmental mathematics courses. Prior research suggests that an appropriate instructional method would involve explicit instruction and practising mathematical procedures linked to a mathematical concept. This study reports on a randomized field trial of a developmental mathematics approach at a college in Ontario, Canada. The new approach is an adaptation of the JUMP Math program, an explicit instruction method designed for primary and secondary school curriculae, to the college learning environment. In this study, a subset of courses was assigned to JUMP Math and the remainder was taught in the same style as in the previous years. We found consistent, modest improvement in the JUMP Math sections compared to the non-JUMP sections, after accounting for potential covariates. The findings from this randomized field trial, along with prior research on effective education for developmental mathematics students, suggest that JUMP Math is a promising way to improve college student outcomes.

  3. Preoperative clonidine use in trans-sphenoidal pituitary adenoma surgeries - a randomized controlled trial.

    PubMed

    Bajaj, Jitin; Mittal, Radhe Shyam; Sharma, Achal

    2017-02-01

    Pituitary masses are common lesions accounting for about 15-20% of all brain tumours. Oozing blood is an annoyance in the microscopic sublabial trans-sphenoidal approach for these masses. There have been many ways of reducing the ooze, each with its own pros and cons. The aim was to determine the efficacy and safety of clonidine in reducing blood loss in pituitary adenoma surgery through a randomized masked trial. This was a prospective randomized controlled trial. A total of 50 patients with pituitary adenomas were randomized into two groups. Group A (25 patients) was given 200 μg clonidine orally, while Group B (25 patients) was given placebo. The surgeon, anaesthesiologist and patient were blinded for the trial. A sublabial trans-septal trans-sphenoidal approach to the sella and excision of the mass were performed in each patient. Patients were studied for pre-, intra- and post-operative blood pressure and heart rate, pre- and post-operative imaging findings, intra-operative blood loss, bleeding grading by the surgeon, the surgeon's satisfaction with the condition of the specific part and the quality of the surgical field, operative time and extent of resection. Blood loss during surgery, operative time and bleeding grading by the surgeon were significantly lower in the clonidine group, while the quality of the surgical field, the condition of the specific part and the extent of resection were significantly better in the clonidine group (p value < .05). There were no untoward adverse effects of the drug in the test group. Clonidine is a safe and effective drug to reduce bleeding in trans-sphenoidal microscopic pituitary adenoma surgeries.

  4. A Bayesian, generalized frailty model for comet assays.

    PubMed

    Ghebretinsae, Aklilu Habteab; Faes, Christel; Molenberghs, Geert; De Boeck, Marlies; Geys, Helena

    2013-05-01

    This paper proposes a flexible modeling approach for so-called comet assay data regularly encountered in preclinical research. While such data consist of non-Gaussian outcomes in a multilevel hierarchical structure, traditional analyses typically completely or partly ignore this hierarchical nature by summarizing measurements within a cluster. Non-Gaussian outcomes are often modeled using exponential family models. This is true not only for binary and count data, but also, for example, for time-to-event outcomes. Two important reasons for extending this family are (1) the possible occurrence of overdispersion, meaning that the variability in the data may not be adequately described by the models, which often exhibit a prescribed mean-variance link, and (2) the accommodation of a hierarchical structure in the data owing to clustering. The first issue is dealt with through so-called overdispersion models. Clustering is often accommodated through the inclusion of random subject-specific effects. Conventionally, though not always, such random effects are assumed to be normally distributed. In the case of time-to-event data, one encounters, for example, the gamma frailty model (Duchateau and Janssen, 2007). While both of these issues may occur simultaneously, models combining both are uncommon. Molenberghs et al. (2010) proposed a broad class of generalized linear models accommodating overdispersion and clustering through two separate sets of random effects. Here, we use this method to model data from a comet assay with a three-level hierarchical structure. Although a conjugate gamma random effect is used for the overdispersion random effect, both gamma and normal random effects are considered for the hierarchical random effect. Apart from model formulation, we place emphasis on Bayesian estimation. Our proposed method has an upper hand over the traditional analysis in that it (1) uses the appropriate distribution stipulated in the literature; (2) deals with the complete hierarchical nature; and (3) uses all information instead of summary measures. The fit of the model to the comet assay data is compared against the background of more conventional model fits. Results indicate the toxicity of 1,2-dimethylhydrazine dihydrochloride at different dose levels (low, medium, and high).

  5. Detection of mastitis in dairy cattle by use of mixture models for repeated somatic cell scores: a Bayesian approach via Gibbs sampling.

    PubMed

    Odegård, J; Jensen, J; Madsen, P; Gianola, D; Klemetsdal, G; Heringstad, B

    2003-11-01

    The distribution of somatic cell scores could be regarded as a mixture of at least two components depending on a cow's udder health status. A heteroscedastic two-component Bayesian normal mixture model with random effects was developed and implemented via Gibbs sampling. The model was evaluated using datasets consisting of simulated somatic cell score records. Somatic cell score was simulated as a mixture representing two alternative udder health statuses ("healthy" or "diseased"). Animals were assigned randomly to the two components according to the probability of group membership (Pm). Random effects (additive genetic and permanent environment), when included, had identical distributions across mixture components. Posterior probabilities of putative mastitis were estimated for all observations, and model adequacy was evaluated using measures of sensitivity, specificity, and posterior probability of misclassification. Fitting different residual variances in the two mixture components caused some bias in estimation of parameters. When the components were difficult to disentangle, so were their residual variances, causing bias in estimation of Pm and of location parameters of the two underlying distributions. When all variance components were identical across mixture components, the mixture model analyses returned parameter estimates essentially without bias and with a high degree of precision. Including random effects in the model increased the probability of correct classification substantially. No sizable differences in probability of correct classification were found between models in which a single cow effect (ignoring relationships) was fitted and models where this effect was split into genetic and permanent environmental components, utilizing relationship information. When genetic and permanent environmental effects were fitted, the between-replicate variance of estimates of posterior means was smaller because the model accounted for random genetic drift.
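
    For readers unfamiliar with the machinery, here is a stripped-down Gibbs sampler for a heteroscedastic two-component normal mixture on simulated scores. It omits the additive genetic and permanent environmental random effects of the full model and uses simple conjugate-style priors chosen purely for the illustration.

        import numpy as np

        rng = np.random.default_rng(11)

        # Simulated somatic cell scores from "healthy" and "diseased" components.
        y = np.concatenate([rng.normal(3.0, 1.0, 700), rng.normal(6.0, 1.5, 300)])
        n = len(y)

        mu = np.array([y.min(), y.max()])   # crude starting values
        var = np.array([1.0, 1.0])
        p_dis = 0.5
        draws = []

        for it in range(3000):
            # 1. Sample each record's component membership (0 = healthy, 1 = diseased).
            lik0 = (1 - p_dis) * np.exp(-0.5 * (y - mu[0]) ** 2 / var[0]) / np.sqrt(var[0])
            lik1 = p_dis * np.exp(-0.5 * (y - mu[1]) ** 2 / var[1]) / np.sqrt(var[1])
            z = rng.random(n) < lik1 / (lik0 + lik1)

            # 2. Sample the mixing proportion (Beta(1, 1) prior).
            p_dis = rng.beta(1 + z.sum(), 1 + n - z.sum())

            # 3. Sample component means (flat prior) and variances (scaled inverse chi-square).
            for k in (0, 1):
                yk = y[z] if k == 1 else y[~z]
                mu[k] = rng.normal(yk.mean(), np.sqrt(var[k] / len(yk)))
                var[k] = ((yk - mu[k]) ** 2).sum() / rng.chisquare(len(yk))
            draws.append((mu.copy(), var.copy(), p_dis))

        burn = 1000
        print("posterior mean of the 'diseased' fraction:", np.mean([d[2] for d in draws[burn:]]))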

  6. Discrete-element modeling of nacre-like materials: Effects of random microstructures on strain localization and mechanical performance

    NASA Astrophysics Data System (ADS)

    Abid, Najmul; Mirkhalaf, Mohammad; Barthelat, Francois

    2018-03-01

    Natural materials such as nacre, collagen, and spider silk are composed of staggered stiff and strong inclusions in a softer matrix. This type of hybrid microstructure results in remarkable combinations of stiffness, strength, and toughness and it now inspires novel classes of high-performance composites. However, the analytical and numerical approaches used to predict and optimize the mechanics of staggered composites often neglect statistical variations and inhomogeneities, which may have significant impacts on modulus, strength, and toughness. Here we present an analysis of localization using small representative volume elements (RVEs) and large scale statistical volume elements (SVEs) based on the discrete element method (DEM). DEM is an efficient numerical method which enabled the evaluation of more than 10,000 microstructures in this study, each including about 5,000 inclusions. The models explore the combined effects of statistics, inclusion arrangement, and interface properties. We find that statistical variations have a negative effect on all properties, in particular on the ductility and energy absorption because randomness precipitates the localization of deformations. However, the results also show that the negative effects of random microstructures can be offset by interfaces with large strain at failure accompanied by strain hardening. More specifically, this quantitative study reveals an optimal range of interface properties where the interfaces are the most effective at delaying localization. These findings show how carefully designed interfaces in bioinspired staggered composites can offset the negative effects of microstructural randomness, which is inherent to most current fabrication methods.

  7. Multidimensional Normalization to Minimize Plate Effects of Suspension Bead Array Data.

    PubMed

    Hong, Mun-Gwan; Lee, Woojoo; Nilsson, Peter; Pawitan, Yudi; Schwenk, Jochen M

    2016-10-07

    Enhanced by the growing number of biobanks, biomarker studies can now be performed with reasonable statistical power by using large sets of samples. Antibody-based proteomics by means of suspension bead arrays offers one attractive approach to analyze serum, plasma, or CSF samples for such studies in microtiter plates. To expand measurements beyond single batches, with either 96 or 384 samples per plate, suitable normalization methods are required to minimize the variation between plates. Here we propose two normalization approaches utilizing MA coordinates. The multidimensional MA (multi-MA) and MA-loess both consider all samples of a microtiter plate per suspension bead array assay and thus do not require any external reference samples. We demonstrate the performance of the two MA normalization methods with data obtained from the analysis of 384 samples including both serum and plasma. Samples were randomized across 96-well sample plates, processed, and analyzed in assay plates, respectively. Using principal component analysis (PCA), we could show that plate-wise clusters found in the first two components were eliminated by multi-MA normalization as compared with other normalization methods. Furthermore, we studied the correlation profiles between random pairs of antibodies and found that both MA normalization methods substantially reduced the inflated correlation introduced by plate effects. Normalization approaches using multi-MA and MA-loess minimized batch effects arising from the analysis of several assay plates with antibody suspension bead arrays. In a simulated biomarker study, multi-MA restored associations lost due to plate effects. Our normalization approaches, which are available as R package MDimNormn, could also be useful in studies using other types of high-throughput assay data.
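
    The sketch below illustrates the general MA-plus-smoother idea for one plate, normalizing each sample against a per-antibody reference profile with statsmodels' lowess; the authors' multi-MA method works in a multidimensional MA coordinate system across all samples (packaged in R as MDimNormn), so this two-coordinate version is a simplification for illustration only.

        import numpy as np
        from statsmodels.nonparametric.smoothers_lowess import lowess

        rng = np.random.default_rng(5)

        # Toy MFI matrix: 96 samples x 300 antibody bead identities, with a plate-wide shift.
        signal = rng.lognormal(mean=6.0, sigma=0.8, size=(96, 300))
        signal[:48] *= 1.4                       # pretend half the samples ran "hotter"

        log_x = np.log2(signal)
        reference = np.median(log_x, axis=0)     # per-antibody reference profile

        normalized = np.empty_like(log_x)
        for i in range(log_x.shape[0]):
            M = log_x[i] - reference             # minus: sample-versus-reference difference
            A = 0.5 * (log_x[i] + reference)     # average: overall intensity
            trend = lowess(M, A, frac=0.5, return_sorted=False)   # intensity-dependent bias
            normalized[i] = log_x[i] - trend     # remove the trend, keep the residual signal

        print(normalized.shape)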

  8. The Effects of Approach-Avoidance Modification on Social Anxiety Disorder: A Pilot Study

    PubMed Central

    Asnaani, Anu; Rinck, Mike; Becker, Eni; Hofmann, Stefan G.

    2014-01-01

    Cognitive bias modification has recently been discussed as a possible intervention for mental disorders. A specific form of this novel treatment approach is approach-avoidance modification. In order to examine the efficacy of approach-avoidance modification for positive stimuli associated with social anxiety, we recruited 43 individuals with social anxiety disorder and randomly assigned them to a training (implicit training to approach smiling faces) or a control (equal approach and avoidance of smiling faces) condition in three sessions over the course of a one-week period. Dependent measures included clinician ratings, self-report measures of social anxiety, and overt behavior during behavioral approach tasks. No group differences in any of the outcome measures were observed after training. In addition, while individuals in the training group showed increased approach tendency in one of the sessions, this effect was inconsistent across the three sessions and did not result in long-term changes in implicit approach tendencies between the groups over the course of the entire study. These results suggest that approach-avoidance modification might result in short-lasting effects on implicit approach tendencies towards feared positive stimuli, but this modification may not result in meaningful behavioral change or symptom reduction in individuals with social anxiety disorder. PMID:24659832

  9. Effectiveness and Cost-Effectiveness of Occupation-Based Occupational Therapy Using the Aid for Decision Making in Occupation Choice (ADOC) for Older Residents: Pilot Cluster Randomized Controlled Trial

    PubMed Central

    Nagayama, Hirofumi; Tomori, Kounosuke; Ohno, Kanta; Takahashi, Kayoko; Ogahara, Kakuya; Sawada, Tatsunori; Uezu, Sei; Nagatani, Ryutaro; Yamauchi, Keita

    2016-01-01

    Background Care-home residents are mostly inactive, have little interaction with staff, and are dependent on staff to engage in daily occupations. We recently developed an iPad application called the Aid for Decision-making in Occupation Choice (ADOC) to promote shared decision-making in activities and occupation-based goal setting by choosing from illustrations describing daily activities. This study aimed to evaluate if interventions based on occupation-based goal setting using the ADOC could focus on meaningful activities to improve quality of life and independent activities of daily living, with greater cost-effectiveness than an impairment-based approach as well as to evaluate the feasibility of conducting a large cluster, randomized controlled trial. Method In this single (assessor)-blind pilot cluster randomized controlled trial, the intervention group (ADOC group) received occupational therapy based on occupation-based goal setting using the ADOC, and the interventions were focused on meaningful occupations. The control group underwent an impairment-based approach focused on restoring capacities, without goal setting tools. In both groups, the 20-minute individualized intervention sessions were conducted twice a week for 4 months. Main Outcome Measures Short Form-36 (SF-36) score, SF-6D utility score, quality adjusted life years (QALY), Barthel Index, and total care cost. Results We randomized and analyzed 12 facilities (44 participants, 18.5% drop-out rate), with 6 facilities each allocated to the ADOC (n = 23) and control (n = 21) groups. After the 4-month intervention, the ADOC group had a significantly greater change in the BI score, with improved scores (P = 0.027, 95% CI 0.41 to 6.87, intracluster correlation coefficient = 0.14). No other outcome was significantly different. The incremental cost-effectiveness ratio, calculated using the change in BI score, was $63.1. Conclusion The results suggest that occupational therapy using the ADOC for older residents might be effective and cost-effective. We also found that conducting an RCT in the occupational therapy setting is feasible. Trial Registration UMIN Clinical Trials Registry UMIN000012994 PMID:26930191

  10. Effectiveness and Cost-Effectiveness of Occupation-Based Occupational Therapy Using the Aid for Decision Making in Occupation Choice (ADOC) for Older Residents: Pilot Cluster Randomized Controlled Trial.

    PubMed

    Nagayama, Hirofumi; Tomori, Kounosuke; Ohno, Kanta; Takahashi, Kayoko; Ogahara, Kakuya; Sawada, Tatsunori; Uezu, Sei; Nagatani, Ryutaro; Yamauchi, Keita

    2016-01-01

    Care-home residents are mostly inactive, have little interaction with staff, and are dependent on staff to engage in daily occupations. We recently developed an iPad application called the Aid for Decision-making in Occupation Choice (ADOC) to promote shared decision-making in activities and occupation-based goal setting by choosing from illustrations describing daily activities. This study aimed to evaluate if interventions based on occupation-based goal setting using the ADOC could focus on meaningful activities to improve quality of life and independent activities of daily living, with greater cost-effectiveness than an impairment-based approach as well as to evaluate the feasibility of conducting a large cluster, randomized controlled trial. In this single (assessor)-blind pilot cluster randomized controlled trial, the intervention group (ADOC group) received occupational therapy based on occupation-based goal setting using the ADOC, and the interventions were focused on meaningful occupations. The control group underwent an impairment-based approach focused on restoring capacities, without goal setting tools. In both groups, the 20-minute individualized intervention sessions were conducted twice a week for 4 months. Short Form-36 (SF-36) score, SF-6D utility score, quality adjusted life years (QALY), Barthel Index, and total care cost. We randomized and analyzed 12 facilities (44 participants, 18.5% drop-out rate), with 6 facilities each allocated to the ADOC (n = 23) and control (n = 21) groups. After the 4-month intervention, the ADOC group had a significantly greater change in the BI score, with improved scores (P = 0.027, 95% CI 0.41 to 6.87, intracluster correlation coefficient = 0.14). No other outcome was significantly different. The incremental cost-effectiveness ratio, calculated using the change in BI score, was $63.1. The results suggest that occupational therapy using the ADOC for older residents might be effective and cost-effective. We also found that conducting an RCT in the occupational therapy setting is feasible. UMIN Clinical Trials Registry UMIN000012994.

  11. Using Audit Information to Adjust Parameter Estimates for Data Errors in Clinical Trials

    PubMed Central

    Shepherd, Bryan E.; Shaw, Pamela A.; Dodd, Lori E.

    2013-01-01

    Background Audits are often performed to assess the quality of clinical trial data, but beyond detecting fraud or sloppiness, the audit data is generally ignored. In earlier work using data from a non-randomized study, Shepherd and Yu (2011) developed statistical methods to incorporate audit results into study estimates, and demonstrated that audit data could be used to eliminate bias. Purpose In this manuscript we examine the usefulness of audit-based error-correction methods in clinical trial settings where a continuous outcome is of primary interest. Methods We demonstrate the bias of multiple linear regression estimates in general settings with an outcome that may have errors and a set of covariates for which some may have errors and others, including treatment assignment, are recorded correctly for all subjects. We study this bias under different assumptions including independence between treatment assignment, covariates, and data errors (conceivable in a double-blinded randomized trial) and independence between treatment assignment and covariates but not data errors (possible in an unblinded randomized trial). We review moment-based estimators to incorporate the audit data and propose new multiple imputation estimators. The performance of estimators is studied in simulations. Results When treatment is randomized and unrelated to data errors, estimates of the treatment effect using the original error-prone data (i.e., ignoring the audit results) are unbiased. In this setting, both moment and multiple imputation estimators incorporating audit data are more variable than standard analyses using the original data. In contrast, in settings where treatment is randomized but correlated with data errors and in settings where treatment is not randomized, standard treatment effect estimates will be biased. And in all settings, parameter estimates for the original, error-prone covariates will be biased. Treatment and covariate effect estimates can be corrected by incorporating audit data using either the multiple imputation or moment-based approaches. Bias, precision, and coverage of confidence intervals improve as the audit size increases. Limitations The extent of bias and the performance of methods depend on the extent and nature of the error as well as the size of the audit. This work only considers methods for the linear model. Settings much different than those considered here need further study. Conclusions In randomized trials with continuous outcomes and treatment assignment independent of data errors, standard analyses of treatment effects will be unbiased and are recommended. However, if treatment assignment is correlated with data errors or other covariates, naive analyses may be biased. In these settings, and when covariate effects are of interest, approaches for incorporating audit results should be considered. PMID:22848072

  12. Multilevel and quasi-Monte Carlo methods for uncertainty quantification in particle travel times through random heterogeneous porous media

    NASA Astrophysics Data System (ADS)

    Crevillén-García, D.; Power, H.

    2017-08-01

    In this study, we apply four Monte Carlo simulation methods, namely, Monte Carlo, quasi-Monte Carlo, multilevel Monte Carlo and multilevel quasi-Monte Carlo to the problem of uncertainty quantification in the estimation of the average travel time during the transport of particles through random heterogeneous porous media. We apply the four methodologies to a model problem where the only input parameter, the hydraulic conductivity, is modelled as a log-Gaussian random field by using direct Karhunen-Loève decompositions. The random terms in such expansions represent the coefficients in the equations. Numerical calculations demonstrating the effectiveness of each of the methods are presented. A comparison of the computational cost incurred by each of the methods for three different tolerances is provided. The accuracy of the approaches is quantified via the mean square error.
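
    A minimal sketch of the sampling idea described in this record: draw realisations of a log-Gaussian conductivity field from a truncated Karhunen-Loève expansion, then compare a plain Monte Carlo estimate of a travel-time-like quantity with a quasi-Monte Carlo (scrambled Sobol) estimate. The 1-D grid, exponential covariance kernel, and harmonic-mean surrogate for travel time are simplifying assumptions made for brevity, not the paper's setup.

```python
# Hedged sketch: truncated Karhunen-Loeve expansion of a 1-D log-Gaussian
# conductivity field, sampled with plain Monte Carlo and with quasi-Monte Carlo
# (scrambled Sobol) points.  All modelling choices here are illustrative.
import numpy as np
from scipy.stats import norm, qmc

n_grid, n_terms, corr_len = 200, 30, 0.1
x = np.linspace(0.0, 1.0, n_grid)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)  # exponential covariance
eigval, eigvec = np.linalg.eigh(C)
eigval, eigvec = eigval[::-1][:n_terms], eigvec[:, ::-1][:, :n_terms]  # leading modes

def travel_time(xi):
    """Surrogate 'travel time': inverse harmonic mean of K(x) = exp(g(x))."""
    g = eigvec @ (np.sqrt(eigval) * xi)   # truncated KL realisation of log K
    return np.mean(1.0 / np.exp(g))

rng = np.random.default_rng(0)
mc = [travel_time(rng.standard_normal(n_terms)) for _ in range(1024)]

sobol = qmc.Sobol(d=n_terms, scramble=True, seed=0)
u = np.clip(sobol.random(1024), 1e-12, 1 - 1e-12)         # keep away from 0 and 1
qmc_vals = [travel_time(z) for z in norm.ppf(u)]

print("MC estimate:", np.mean(mc), "QMC estimate:", np.mean(qmc_vals))
```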

  13. Multilevel and quasi-Monte Carlo methods for uncertainty quantification in particle travel times through random heterogeneous porous media.

    PubMed

    Crevillén-García, D; Power, H

    2017-08-01

    In this study, we apply four Monte Carlo simulation methods, namely, Monte Carlo, quasi-Monte Carlo, multilevel Monte Carlo and multilevel quasi-Monte Carlo to the problem of uncertainty quantification in the estimation of the average travel time during the transport of particles through random heterogeneous porous media. We apply the four methodologies to a model problem where the only input parameter, the hydraulic conductivity, is modelled as a log-Gaussian random field by using direct Karhunen-Loève decompositions. The random terms in such expansions represent the coefficients in the equations. Numerical calculations demonstrating the effectiveness of each of the methods are presented. A comparison of the computational cost incurred by each of the methods for three different tolerances is provided. The accuracy of the approaches is quantified via the mean square error.

  14. Multilevel and quasi-Monte Carlo methods for uncertainty quantification in particle travel times through random heterogeneous porous media

    PubMed Central

    Power, H.

    2017-01-01

    In this study, we apply four Monte Carlo simulation methods, namely, Monte Carlo, quasi-Monte Carlo, multilevel Monte Carlo and multilevel quasi-Monte Carlo to the problem of uncertainty quantification in the estimation of the average travel time during the transport of particles through random heterogeneous porous media. We apply the four methodologies to a model problem where the only input parameter, the hydraulic conductivity, is modelled as a log-Gaussian random field by using direct Karhunen–Loève decompositions. The random terms in such expansions represent the coefficients in the equations. Numerical calculations demonstrating the effectiveness of each of the methods are presented. A comparison of the computational cost incurred by each of the methods for three different tolerances is provided. The accuracy of the approaches is quantified via the mean square error. PMID:28878974

  15. An Approach to Assess Generalizability in Comparative Effectiveness Research: A Case Study of the Whole Systems Demonstrator Cluster Randomized Trial Comparing Telehealth with Usual Care for Patients with Chronic Health Conditions.

    PubMed

    Steventon, Adam; Grieve, Richard; Bardsley, Martin

    2015-11-01

    Policy makers require estimates of comparative effectiveness that apply to the population of interest, but there has been little research on quantitative approaches to assess and extend the generalizability of randomized controlled trial (RCT)-based evaluations. We illustrate an approach using observational data. Our example is the Whole Systems Demonstrator (WSD) trial, in which 3230 adults with chronic conditions were assigned to receive telehealth or usual care. First, we used novel placebo tests to assess whether outcomes were similar between the RCT control group and a matched subset of nonparticipants who received usual care. We matched on 65 baseline variables obtained from the electronic medical record. Second, we conducted sensitivity analysis to consider whether the estimates of treatment effectiveness were robust to alternative assumptions about whether "usual care" is defined by the RCT control group or nonparticipants. Thus, we provided alternative estimates of comparative effectiveness by contrasting the outcomes of the RCT telehealth group and matched nonparticipants. For some endpoints, such as the number of outpatient attendances, the placebo tests passed, and the effectiveness estimates were robust to the choice of comparison group. However, for other endpoints, such as emergency admissions, the placebo tests failed and the estimates of treatment effect differed markedly according to whether telehealth patients were compared with RCT controls or matched nonparticipants. The proposed placebo tests indicate those cases when estimates from RCTs do not generalize to routine clinical practice and motivate complementary estimates of comparative effectiveness that use observational data. Future RCTs are recommended to incorporate these placebo tests and the accompanying sensitivity analyses to enhance their relevance to policy making. © The Author(s) 2015.
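
    As a rough illustration of the placebo-test idea in this record, the sketch below matches usual-care nonparticipants to RCT controls on baseline covariates and tests whether their outcomes differ. Propensity-score nearest-neighbour matching and a t-test are illustrative choices only; the WSD analysis matched on 65 baseline variables using its own matching procedure.

```python
# Hedged sketch of a placebo test: if RCT controls and matched nonparticipants
# both receive usual care, their outcomes should be similar.  Inputs are numpy
# arrays of baseline covariates (rows = people) and outcomes.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
from scipy import stats

def placebo_test(X_controls, y_controls, X_nonpart, y_nonpart):
    X = np.vstack([X_controls, X_nonpart])
    z = np.concatenate([np.ones(len(X_controls)), np.zeros(len(X_nonpart))])
    ps = LogisticRegression(max_iter=1000).fit(X, z).predict_proba(X)[:, 1]
    ps_c, ps_n = ps[z == 1], ps[z == 0]
    # one-to-one nearest-neighbour match on the propensity score
    nn = NearestNeighbors(n_neighbors=1).fit(ps_n.reshape(-1, 1))
    _, idx = nn.kneighbors(ps_c.reshape(-1, 1))
    y_matched = y_nonpart[idx.ravel()]
    _, p = stats.ttest_ind(y_controls, y_matched)
    return p  # a large p-value is consistent with the placebo test "passing"
```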

  16. Effectiveness of Neuromuscular Electrical Stimulation on Patients With Dysphagia With Medullary Infarction.

    PubMed

    Zhang, Ming; Tao, Tao; Zhang, Zhao-Bo; Zhu, Xiao; Fan, Wen-Guo; Pu, Li-Jun; Chu, Lei; Yue, Shou-Wei

    2016-03-01

    To evaluate and compare the effects of neuromuscular electrical stimulation (NMES) acting on the sensory input or motor muscle in treating patients with dysphagia with medullary infarction. Prospective randomized controlled study. Department of physical medicine and rehabilitation. Patients with dysphagia with medullary infarction (N=82). Participants were randomized over 3 intervention groups: traditional swallowing therapy, sensory approach combined with traditional swallowing therapy, and motor approach combined with traditional swallowing therapy. Electrical stimulation sessions were for 20 minutes, twice a day, for 5d/wk, over a 4-week period. Swallowing function was evaluated by the water swallow test and Standardized Swallowing Assessment, oral intake was evaluated by the Functional Oral Intake Scale, quality of life was evaluated by the Swallowing-Related Quality of Life (SWAL-QOL) Scale, and cognition was evaluated by the Mini-Mental State Examination (MMSE). There were no statistically significant differences between the groups in age, sex, duration, MMSE score, or severity of the swallowing disorder (P>.05). All groups showed improved swallowing function (P≤.01); the sensory approach combined with traditional swallowing therapy group showed significantly greater improvement than the other 2 groups, and the motor approach combined with traditional swallowing therapy group showed greater improvement than the traditional swallowing therapy group (P<.05). SWAL-QOL Scale scores increased more significantly in the sensory approach combined with traditional swallowing therapy and motor approach combined with traditional swallowing therapy groups than in the traditional swallowing therapy group, and the sensory approach combined with traditional swallowing therapy and motor approach combined with traditional swallowing therapy groups showed statistically significant differences (P=.04). NMES that targets either sensory input or motor muscle coupled with traditional therapy is conducive to recovery from dysphagia and improves quality of life for patients with dysphagia with medullary infarction. A sensory approach appears to be better than a motor approach. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  17. SMART DOCS: A New Patient-Centered Outcomes and Coordinated-Care Management Approach for the Future Practice of Sleep Medicine

    PubMed Central

    Kushida, Clete A.; Nichols, Deborah A.; Holmes, Tyson H.; Miller, Ric; Griffin, Kara; Cardell, Chia-Yu; Hyde, Pamela R.; Cohen, Elyse; Manber, Rachel; Walsh, James K.

    2015-01-01

    The practice of medicine is currently undergoing a transformation to become more efficient, cost-effective, and patient centered in its delivery of care. The aim of this article is to stimulate discussion within the sleep medicine community in addressing these needs by our approach as well as other approaches to sleep medicine care. The primary goals of the Sustainable Methods, Algorithms, and Research Tools for Delivering Optimal Care Study (SMART DOCS) are: (1) to introduce a new Patient-Centered Outcomes and Coordinated-Care Management (PCCM) approach for the future practice of sleep medicine, and (2) to test the PCCM approach against a Conventional Diagnostic and Treatment Outpatient Medical Care (CONV) approach in a randomized, two-arm, single-center, long-term, comparative effectiveness trial. The PCCM approach is integrated into a novel outpatient care delivery model for patients with sleep disorders that includes the latest technology, allowing providers to obtain more accurate and rapid diagnoses and to make evidence-based treatment recommendations, while simultaneously enabling patients to have access to personalized medical information and reports regarding their diagnosis and treatment so that they can make more informed health care decisions. Additionally, the PCCM approach facilitates better communication between patients, referring primary care physicians, sleep specialists, and allied health professionals so that providers can better assist patients in achieving their preferred outcomes. A total of 1,506 patients 18 y or older will be randomized to either the PCCM or CONV approach and will be followed for at least 1 y with endpoints of improved health care performance, better health, and cost control. Clinical Trials Registration: ClinicalTrials.gov Identifier: NCT02037438. Citation: Kushida CA, Nichols DA, Holmes TH, Miller R, Griffin K, Cardell CY, Hyde PR, Cohen E, Manber R, Walsh JK. SMART DOCS: a new patient-centered outcomes and coordinated-care management approach for the future practice of sleep medicine. SLEEP 2015;38(2):315–326. PMID:25409112

  18. The effectiveness and applicability of different lifestyle interventions for enhancing wellbeing: the study design for a randomized controlled trial for persons with metabolic syndrome risk factors and psychological distress.

    PubMed

    Lappalainen, Raimo; Sairanen, Essi; Järvelä, Elina; Rantala, Sanni; Korpela, Riitta; Puttonen, Sampsa; Kujala, Urho M; Myllymäki, Tero; Peuhkuri, Katri; Mattila, Elina; Kaipainen, Kirsikka; Ahtinen, Aino; Karhunen, Leila; Pihlajamäki, Jussi; Järnefelt, Heli; Laitinen, Jaana; Kutinlahti, Eija; Saarelma, Osmo; Ermes, Miikka; Kolehmainen, Marjukka

    2014-04-04

    Obesity and stress are among the most common lifestyle-related health problems. Most of the current disease prevention and management models are not satisfactorily cost-effective and hardly reach those who need them the most. Therefore, novel evidence-based controlled interventions are necessary to evaluate models for prevention and treatment based on self-management. This randomized controlled trial examines the effectiveness, applicability, and acceptability of different lifestyle interventions with individuals having symptoms of metabolic syndrome and psychological distress. The offered interventions are based on cognitive behavioral approaches, and are designed for enhancing general well-being and supporting personalized lifestyle changes. 339 obese individuals reporting stress symptoms were recruited and randomized to either (1) a minimal contact web-guided Cognitive Behavioral Therapy-based (CBT) intervention including an approach of health assessment and coaching methods, (2) a mobile-guided intervention comprising mindfulness, acceptance and value-based exercises, (3) a face-to-face group intervention using a mindfulness, acceptance and value-based approach, or (4) a control group. The participants were measured three times during the study (pre = week 0, post = week 10, and follow-up = week 36). Psychological well-being, lifestyles and habits, eating behaviors, and user experiences were measured using online surveys. Laboratory measurements for physical well-being and general health were performed including e.g. liver function, thyroid glands, kidney function, blood lipids and glucose levels and body composition analysis. In addition, a 3-day ambulatory heart rate and 7-day movement data were collected for analyzing stress, recovery, physical activity, and sleep patterns. Food intake data were collected with a 48-hour diet recall interview via telephone. Differences in the effects of the interventions will be examined using multiple-group modeling techniques and effect-size calculations. This study will provide additional knowledge about the effects of three low-intensity interventions for improving general well-being among individuals with obesity and stress symptoms. The study will show the effects of two technology-guided self-help interventions as well as the effect of an acceptance and value-based brief group intervention. Identifying those who might benefit from the aforesaid interventions will increase the knowledge base needed to better understand what mechanisms facilitate the effects of the interventions. Current Clinical Trials NCT01738256, Registered 17 August, 2012.

  19. Assessing risk-adjustment approaches under non-random selection.

    PubMed

    Luft, Harold S; Dudley, R Adams

    2004-01-01

    Various approaches have been proposed to adjust for differences in enrollee risk in health plans. Because risk-selection strategies may have different effects on enrollment, we simulated three types of selection--dumping, skimming, and stinting. Concurrent diagnosis-based risk adjustment, and a hybrid using concurrent adjustment for about 8% of the cases and prospective adjustment for the rest, perform markedly better than prospective or demographic adjustments, both in terms of R2 and the extent to which plans experience unwarranted gains or losses. The simulation approach offers a valuable tool for analysts in assessing various risk-adjustment strategies under different selection situations.

  20. Phonological and articulation treatment approaches in Portuguese children with speech and language impairments: a randomized controlled intervention study.

    PubMed

    Lousada, M; Jesus, Luis M T; Capelas, S; Margaça, C; Simões, D; Valente, A; Hall, A; Joffe, V L

    2013-01-01

    In Portugal, the routine clinical practice of speech and language therapists (SLTs) in treating children with all types of speech sound disorder (SSD) continues to be articulation therapy (AT). There is limited use of phonological therapy (PT) or phonological awareness training in Portugal. Additionally, at an international level there is a focus on collecting information on and differentiating between the effectiveness of PT and AT for children with different types of phonologically based SSD, as well as on the role of phonological awareness in remediating SSD. It is important to collect more evidence for the most effective and efficient type of intervention approach for different SSDs and for these data to be collected from diverse linguistic and cultural perspectives. To evaluate the effectiveness of a PT and AT approach for treatment of 14 Portuguese children, aged 4.0-6.7 years, with a phonologically based SSD. The children were randomly assigned to one of the two treatment approaches (seven children in each group). All children were treated by the same SLT, blind to the aims of the study, over three blocks of a total of 25 weekly sessions of intervention. Outcome measures of phonological ability (percentage of consonants correct (PCC), percentage occurrence of different phonological processes and phonetic inventory) were taken before and after intervention. A qualitative assessment of intervention effectiveness from the perspective of the parents of participants was included. Both treatments were effective in improving the participants' speech, with the children receiving PT showing a more significant improvement in PCC score than those receiving the AT. Children in the PT group also showed greater generalization to untreated words than those receiving AT. Parents reported both intervention approaches to be as effective in improving their children's speech. The PT (combination of expressive phonological tasks, phonological awareness, listening and discrimination activities) proved to be an effective integrated method of improving phonological SSD in children. These findings provide some evidence for Portuguese SLTs to employ PT with children with phonologically based SSD. © 2012 Royal College of Speech and Language Therapists.

  1. The effectiveness of artificial intelligent 3-D virtual reality vocational problem-solving training in enhancing employment opportunities for people with traumatic brain injury.

    PubMed

    Man, David Wai Kwong; Poon, Wai Sang; Lam, Chow

    2013-01-01

    People with traumatic brain injury (TBI) often experience cognitive deficits in attention, memory, executive functioning and problem-solving. The purpose of the present research study was to examine the effectiveness of an artificial intelligent virtual reality (VR)-based vocational problem-solving skill training programme designed to enhance employment opportunities for people with TBI. This was a prospective randomized controlled trial (RCT) comparing the effectiveness of the above programme with that of the conventional psycho-educational approach. Forty participants with mild (n = 20) or moderate (n = 20) brain injury were randomly assigned to each training programme. Comparisons of problem-solving skills were performed with the Wisconsin Card Sorting Test, the Tower of London Test and the Vocational Cognitive Rating Scale. Improvements in selective memory processes and perception of memory function were found. Across-group comparison showed that the VR group performed more favourably than the therapist-led one in terms of objective and subjective outcome measures and better vocational outcomes. These results support the potential use of a VR-based approach in cognitive training in people with TBI. Further VR applications, limitations and future research are described.

  2. The Effectiveness and Cost of Clinical Supervision for Motivational Interviewing: A Randomized Controlled Trial

    PubMed Central

    Martino, Steve; Paris, Manuel; Añez, Luis; Nich, Charla; Canning-Ball, Monica; Hunkele, Karen; Olmstead, Todd A.; Carroll, Kathleen M.

    2016-01-01

    The effectiveness of a competency-based supervision approach called Motivational Interviewing Assessment: Supervisory Tools for Enhancing Proficiency (MIA: STEP) was compared to Supervision-As-Usual (SAU) for increasing clinicians’ motivational interviewing (MI) adherence and competence and client retention and primary substance abstinence in a multisite Hybrid Type 2 effectiveness-implementation randomized controlled trial. Participants were 66 clinicians and 450 clients within one of eleven outpatient substance abuse programs. An independent evaluation of audio recorded supervision sessions indicated that MIA: STEP and SAU were highly and comparably discriminable across sites. While clinicians in both supervision conditions improved their MI performance, clinicians supervised with MIA: STEP, compared to those in SAU, showed significantly greater increases in the competency with which they used fundamental and advanced MI strategies when using MI across seven intakes through a 16-week follow-up. There were no retention or substance use differences among the clients seen by clinicians in MIA: STEP or SAU. MIA: STEP was substantially more expensive to deliver than SAU. Innovative alternatives to resource-intensive competency-based supervision approaches such as MIA: STEP are needed to promote the implementation of evidence-based practices. PMID:27431042

  3. Marginalized zero-altered models for longitudinal count data.

    PubMed

    Tabb, Loni Philip; Tchetgen, Eric J Tchetgen; Wellenius, Greg A; Coull, Brent A

    2016-10-01

    Count data often exhibit more zeros than predicted by common count distributions like the Poisson or negative binomial. In recent years, there has been considerable interest in methods for analyzing zero-inflated count data in longitudinal or other correlated data settings. A common approach has been to extend zero-inflated Poisson models to include random effects that account for correlation among observations. However, these models have been shown to have a few drawbacks, including interpretability of regression coefficients and numerical instability of fitting algorithms even when the data arise from the assumed model. To address these issues, we propose a model that parameterizes the marginal associations between the count outcome and the covariates as easily interpretable log relative rates, while including random effects to account for correlation among observations. One of the main advantages of this marginal model is that it allows a basis upon which we can directly compare the performance of standard methods that ignore zero inflation with that of a method that explicitly takes zero inflation into account. We present simulations of these various model formulations in terms of bias and variance estimation. Finally, we apply the proposed approach to analyze toxicological data of the effect of emissions on cardiac arrhythmias.
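
    A small simulation of the kind of data these models address: longitudinal counts with excess zeros and within-subject correlation induced by a random intercept. The generating mechanism below is a generic zero-inflated Poisson illustration, not the authors' marginalized formulation.

```python
# Hedged sketch of zero-inflated, correlated longitudinal counts.
import numpy as np

rng = np.random.default_rng(1)
n_subjects, n_visits = 200, 5
x = rng.binomial(1, 0.5, size=n_subjects)    # subject-level binary covariate
b = rng.normal(0.0, 0.7, size=n_subjects)    # random intercepts inducing correlation

records = []
for i in range(n_subjects):
    rate = np.exp(0.2 + 0.5 * x[i] + b[i])   # conditional Poisson rate
    for t in range(n_visits):
        structural_zero = rng.random() < 0.3  # excess-zero component
        y = 0 if structural_zero else rng.poisson(rate)
        records.append((i, t, int(x[i]), int(y)))

counts = np.array([r[3] for r in records])
print("proportion of zeros:", np.mean(counts == 0))
```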

  4. Marginalized zero-altered models for longitudinal count data

    PubMed Central

    Tabb, Loni Philip; Tchetgen, Eric J. Tchetgen; Wellenius, Greg A.; Coull, Brent A.

    2015-01-01

    Count data often exhibit more zeros than predicted by common count distributions like the Poisson or negative binomial. In recent years, there has been considerable interest in methods for analyzing zero-inflated count data in longitudinal or other correlated data settings. A common approach has been to extend zero-inflated Poisson models to include random effects that account for correlation among observations. However, these models have been shown to have a few drawbacks, including interpretability of regression coefficients and numerical instability of fitting algorithms even when the data arise from the assumed model. To address these issues, we propose a model that parameterizes the marginal associations between the count outcome and the covariates as easily interpretable log relative rates, while including random effects to account for correlation among observations. One of the main advantages of this marginal model is that it allows a basis upon which we can directly compare the performance of standard methods that ignore zero inflation with that of a method that explicitly takes zero inflation into account. We present simulations of these various model formulations in terms of bias and variance estimation. Finally, we apply the proposed approach to analyze toxicological data of the effect of emissions on cardiac arrhythmias. PMID:27867423

  5. Inclusion of Topological Measurements into Analytic Estimates of Effective Permeability in Fractured Media

    NASA Astrophysics Data System (ADS)

    Sævik, P. N.; Nixon, C. W.

    2017-11-01

    We demonstrate how topology-based measures of connectivity can be used to improve analytical estimates of effective permeability in 2-D fracture networks, which is one of the key parameters necessary for fluid flow simulations at the reservoir scale. Existing methods in this field usually compute fracture connectivity using the average fracture length. This approach is valid for ideally shaped, randomly distributed fractures, but is not immediately applicable to natural fracture networks. In particular, natural networks tend to be more connected than randomly positioned fractures of comparable lengths, since natural fractures often terminate in each other. The proposed topological connectivity measure is based on the number of intersections and fracture terminations per sampling area, which for statistically stationary networks can be obtained directly from limited outcrop exposures. To evaluate the method, numerical permeability upscaling was performed on a large number of synthetic and natural fracture networks, with varying topology and geometry. The proposed method was seen to provide much more reliable permeability estimates than the length-based approach, across a wide range of fracture patterns. We summarize our results in a single, explicit formula for the effective permeability.

  6. Improving patient-centeredness of fertility care using a multifaceted approach: study protocol for a randomized controlled trial

    PubMed Central

    2012-01-01

    Background Beside traditional outcomes of safety and (cost-)effectiveness, the Institute of Medicine states patient-centeredness as an independent outcome indicator to evaluate the quality of healthcare. Providing patient-centered care is important because patients want to be heard for their ideas and concerns. Healthcare areas associated with high emotions and intensive treatment periods could especially benefit from patient-centered care. How care can become optimally improved in patient-centeredness is unknown. Therefore, we will conduct a study in the context of Dutch fertility care to determine the effects of a multifaceted approach on patient-centeredness, patients’ quality of life (QoL) and levels of distress. Our aims are to investigate the effectiveness of a multifaceted approach and to identify determinants of a change in the level of patient-centeredness, patients’ QoL and distress levels. This paper presents the study protocol. Methods/Design In a cluster-randomized trial in 32 Dutch fertility clinics the effects of a multifaceted approach will be determined on the level of patient-centeredness (Patient-centredness Questionnaire – Infertility), patients’ QoL (FertiQoL) and levels of distress (SCREENIVF). The multifaceted approach includes audit and feedback, educational outreach visits and patient-mediated interventions. Potential determinants of a change in patient-centeredness, patients’ QoL and levels of distress will be collected by an addendum to the patients’ questionnaire and a professionals’ questionnaire. The latter includes the Organizational Culture Assessment Instrument about the clinic’s culture as a possible determinant of an increase in patient-centered care. Discussion The study is expected to yield important new evidence about the effects of a multifaceted approach on levels of patient-centeredness, patients’ QoL and distress in fertility care. Furthermore, determinants associated with a change in these outcome measures will be studied. With knowledge of these results, patient-centered care and thus the quality of healthcare can be improved. Moreover, the results of this study could be useful for similar initiatives to improve the quality of care delivery. The results of this project are expected at the end of 2013. Trial registration Clinicialtrials.gov NCT01481064 PMID:23006997

  7. Person mobility in the design and analysis of cluster-randomized cohort prevention trials.

    PubMed

    Vuchinich, Sam; Flay, Brian R; Aber, Lawrence; Bickman, Leonard

    2012-06-01

    Person mobility is an inescapable fact of life for most cluster-randomized (e.g., schools, hospitals, clinic, cities, state) cohort prevention trials. Mobility rates are an important substantive consideration in estimating the effects of an intervention. In cluster-randomized trials, mobility rates are often correlated with ethnicity, poverty and other variables associated with disparity. This raises the possibility that estimated intervention effects may generalize to only the least mobile segments of a population and, thus, create a threat to external validity. Such mobility can also create threats to the internal validity of conclusions from randomized trials. Researchers must decide how to deal with persons who leave study clusters during a trial (dropouts), persons and clusters that do not comply with an assigned intervention, and persons who enter clusters during a trial (late entrants), in addition to the persons who remain for the duration of a trial (stayers). Statistical techniques alone cannot solve the key issues of internal and external validity raised by the phenomenon of person mobility. This commentary presents a systematic, Campbellian-type analysis of person mobility in cluster-randomized cohort prevention trials. It describes four approaches for dealing with dropouts, late entrants and stayers with respect to data collection, analysis and generalizability. The questions at issue are: 1) From whom should data be collected at each wave of data collection? 2) Which cases should be included in the analyses of an intervention effect? and 3) To what populations can trial results be generalized? The conclusions lead to recommendations for the design and analysis of future cluster-randomized cohort prevention trials.

  8. The withholding of test results as a means of assessing the effectiveness of treatment in test-positive persons.

    PubMed

    Weiss, Noel S

    2013-04-01

    In recent years, a number of studies have achieved randomization of patients to alternative management strategies by blinding some patients (and their providers of medical care) to the results of tests that guide such strategies. Although this research approach has the potential to be a powerful means of measuring treatment effectiveness, the interpretation of the results may not be straightforward if the treatment received by test-positive persons is variable or not well documented, or if the analysis is not restricted to outcomes in test-positive persons. Studies in which the test results are withheld at random may face ethical issues that, to date, have received little discussion. Copyright © 2013 Elsevier Inc. All rights reserved.

  9. Tunneling Conductivity and Piezoresistivity of Composites Containing Randomly Dispersed Conductive Nano-Platelets

    PubMed Central

    Oskouyi, Amirhossein Biabangard; Sundararaj, Uttandaraman; Mertiny, Pierre

    2014-01-01

    In this study, a three-dimensional continuum percolation model was developed based on a Monte Carlo simulation approach to investigate the percolation behavior of an electrically insulating matrix reinforced with conductive nano-platelet fillers. The conductivity behavior of composites rendered conductive by randomly dispersed conductive platelets was modeled by developing a three-dimensional finite element resistor network. Parameters related to the percolation threshold and a power-law describing the conductivity behavior were determined. The piezoresistivity behavior of conductive composites was studied employing a reoriented resistor network emulating a conductive composite subjected to mechanical strain. The effects of the governing parameters, i.e., electron tunneling distance, conductive particle aspect ratio and size effects on conductivity behavior were examined. PMID:28788580
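
    The flavour of a Monte Carlo continuum-percolation calculation can be conveyed with a much-reduced 2-D analogue: drop random sticks in a unit square, link any pair that intersects, and check whether a cluster spans the sample. This sketch omits the paper's 3-D platelets, tunnelling distance, and finite element resistor network.

```python
# Hedged 2-D stick-percolation sketch (the study itself treats 3-D platelets,
# tunnelling, and resistor networks).  Union-find links intersecting sticks and
# we check for a cluster spanning the unit square from left to right.
import numpy as np

def _ccw(a, b, c):
    return (c[1] - a[1]) * (b[0] - a[0]) > (b[1] - a[1]) * (c[0] - a[0])

def intersects(p1, p2, p3, p4):
    """Proper-intersection test for two segments (ignores collinear edge cases)."""
    return _ccw(p1, p3, p4) != _ccw(p2, p3, p4) and _ccw(p1, p2, p3) != _ccw(p1, p2, p4)

def spans(n_sticks, length=0.15, seed=0):
    rng = np.random.default_rng(seed)
    centers = rng.random((n_sticks, 2))
    theta = rng.uniform(0.0, np.pi, n_sticks)
    d = 0.5 * length * np.column_stack([np.cos(theta), np.sin(theta)])
    a, b = centers - d, centers + d                 # stick endpoints
    parent = list(range(n_sticks))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in range(n_sticks):
        for j in range(i + 1, n_sticks):
            if intersects(a[i], b[i], a[j], b[j]):
                parent[find(i)] = find(j)
    left = {find(i) for i in range(n_sticks) if min(a[i, 0], b[i, 0]) <= 0.0}
    right = {find(i) for i in range(n_sticks) if max(a[i, 0], b[i, 0]) >= 1.0}
    return bool(left & right)

# crude percolation-probability estimate at a fixed stick density
print(np.mean([spans(300, seed=s) for s in range(20)]))
```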

  10. CMOS integration of high-k/metal gate transistors in diffusion and gate replacement (D&GR) scheme for dynamic random access memory peripheral circuits

    NASA Astrophysics Data System (ADS)

    Dentoni Litta, Eugenio; Ritzenthaler, Romain; Schram, Tom; Spessot, Alessio; O’Sullivan, Barry; Machkaoutsan, Vladimir; Fazan, Pierre; Ji, Yunhyuck; Mannaert, Geert; Lorant, Christophe; Sebaai, Farid; Thiam, Arame; Ercken, Monique; Demuynck, Steven; Horiguchi, Naoto

    2018-04-01

    Integration of high-k/metal gate stacks in peripheral transistors is a major candidate to ensure continued scaling of dynamic random access memory (DRAM) technology. In this paper, the CMOS integration of diffusion and gate replacement (D&GR) high-k/metal gate stacks is investigated, evaluating four different approaches for the critical patterning step of removing the N-type field effect transistor (NFET) effective work function (eWF) shifter stack from the P-type field effect transistor (PFET) area. The effect of plasma exposure during the patterning step is investigated in detail and found to have a strong impact on threshold voltage tunability. A CMOS integration scheme based on an experimental wet-compatible photoresist is developed and the fulfillment of the main device metrics [equivalent oxide thickness (EOT), eWF, gate leakage current density, on/off currents, short channel control] is demonstrated.

  11. Methods to Limit Attrition in Longitudinal Comparative Effectiveness Trials: Lessons from the Lithium Use for Bipolar Disorder (LiTMUS) Study

    PubMed Central

    Sylvia, Louisa G.; Reilly-Harrington, Noreen A.; Leon, Andrew C.; Kansky, Christine I.; Ketter, Terence A.; Calabrese, Joseph R.; Thase, Michael E.; Bowden, Charles L.; Friedman, Edward S.; Ostacher, Michael J.; Iosifescu, Dan V.; Severe, Joanne; Nierenberg, Andrew A.

    2013-01-01

    Background High attrition rates which occur frequently in longitudinal clinical trials of interventions for bipolar disorder limit the interpretation of results. Purpose The aim of this article is to present design approaches that limited attrition in the Lithium Use for Bipolar Disorder (LiTMUS) Study. Methods LiTMUS was a 6-month randomized, longitudinal multi-site comparative effectiveness trial that examined bipolar participants who were at least mildly ill. Participants were randomized to either low to moderate doses of lithium or no lithium, in addition to other treatments needed for mood stabilization administered in a guideline-informed, empirically supported, and personalized fashion (N=283). Results Components of the study design that may have contributed to the low attrition rate of the study included use of: (1) an intent-to-treat design; (2) a randomized adjunctive single-blind design; (3) participant reimbursement; (4) intent-to-attend the next study visit (includes a discussion of attendance obstacles when intention is low); (5) quality care with limited participant burden; and (6) target windows for study visits. Limitations Site differences and the effectiveness and tolerability data have not been analyzed yet. Conclusions These components of the LiTMUS study design may have reduced the probability of attrition which would inform the design of future randomized clinical effectiveness trials. PMID:22076437

  12. Digital-Analog Hybrid Scheme and Its Application to Chaotic Random Number Generators

    NASA Astrophysics Data System (ADS)

    Yuan, Zeshi; Li, Hongtao; Miao, Yunchi; Hu, Wen; Zhu, Xiaohua

    2017-12-01

    Practical random number generation (RNG) circuits are typically achieved with analog devices or digital approaches. Digital-based techniques, which use field-programmable gate arrays (FPGAs), graphics processing units (GPUs), etc., usually have better performance than analog methods, as they are programmable, efficient and robust. However, digital realizations suffer from the effect of finite precision. Accordingly, the generated random numbers (RNs) are actually periodic instead of being truly random. To tackle this limitation, in this paper we propose a novel digital-analog hybrid scheme that employs the digital unit as the main body, and minimum analog devices to generate physical RNs. Moreover, the possibility of realizing the proposed scheme with only one memory element is discussed. Without loss of generality, we use a capacitor and a memristor along with an FPGA to construct the proposed hybrid system, and a chaotic true random number generator (TRNG) circuit is realized, producing physical RNs at a throughput of Gbit/s scale. These RNs successfully pass all the tests in the NIST SP800-22 package, confirming the significance of the scheme in practical applications. In addition, the use of this new scheme is not restricted to RNGs, and it also provides a strategy to mitigate the effect of finite precision in other digital systems.
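
    A toy software illustration of the general idea: iterate a chaotic map, quantise its state to bits, and screen the stream with a simple frequency (monobit) test. This is not the paper's FPGA/memristor design, and a single test is far weaker than the full NIST SP800-22 battery.

```python
# Hedged sketch: bits from a chaotic logistic map plus a monobit frequency test.
import math

def logistic_bits(n_bits, x=0.123456789, r=3.99):
    bits = []
    for _ in range(n_bits):
        x = r * x * (1.0 - x)              # chaotic logistic map
        bits.append(1 if x >= 0.5 else 0)  # crude 1-bit quantiser
    return bits

def monobit_p_value(bits):
    s = sum(2 * b - 1 for b in bits)       # map bits to +1/-1 and sum
    s_obs = abs(s) / math.sqrt(len(bits))
    return math.erfc(s_obs / math.sqrt(2))

bits = logistic_bits(100_000)
print("monobit p-value:", monobit_p_value(bits))  # > 0.01 passes this one test
```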

  13. The learning effects of different presentations of worked examples on medical students' breaking-bad-news skills: A randomized and blinded field trial.

    PubMed

    Schmitz, Felix Michael; Schnabel, Kai Philipp; Bauer, Daniel; Bachmann, Cadja; Woermann, Ulrich; Guttormsen, Sissel

    2018-02-24

    Effective instructional approaches are needed to enable undergraduates to optimally prepare for the limited training time they receive with simulated patients (SPs). This study examines the learning effects of different presentation formats of a worked example on student SP communication. Sixty-seven fourth-year medical students attending a mandatory communication course participated in this randomized field trial. Prior to the course, they worked through an e-learning module that introduced the SPIKES protocol for delivering bad news to patients. In this module, a single worked example was presented to one group of students in a text version, to a second group in a video version, and to a third group in a video version enriched with text hints denoting the SPIKES steps. The video-with-hints group broke bad news to SPs significantly more appropriately than either of the other groups. Although no further condition-related effects were revealed, students who learned from the text version most frequently (although non-significantly) ignored unpleasant emotions (standardised emotional cues and concerns) expressed by the SPs. The learning effect was strongest when the video-based worked example was accompanied by hints. Video-related learning approaches that embed attention-guiding hints can effectively prepare undergraduates for SP encounters. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Evaluating the effectiveness of Behavior-Based Safety education methods for commercial vehicle drivers.

    PubMed

    Wang, Xuesong; Xing, Yilun; Luo, Lian; Yu, Rongjie

    2018-08-01

    Risky driving behavior is one of the main causes of commercial vehicle related crashes. In order to achieve safer vehicle operation, safety education for drivers is often provided. However, the education programs vary in quality and may not always be successful in reducing crash rates. Behavior-Based Safety (BBS) education is a popular approach found effective by numerous studies, but even this approach varies as to the combination of frequency, mode and content used by different education providers. This study therefore evaluates and compares the effectiveness of BBS education methods. Thirty-five drivers in Shanghai, China, were coached with one of three different BBS education methods for 13 weeks following a 13-week baseline phase with no education. A random-effects negative binomial (NB) model was built and calibrated to investigate the relationship between BBS education and the driver at-fault safety-related event rate. Based on the results of the random-effects NB model, event modification factors (EMF) were calculated to evaluate and compare the effectiveness of the methods. Results show that (1) BBS education was confirmed to be effective in safety-related event reduction; (2) the most effective method among the three applied monthly face-to-face coaching, including feedback with video and statistical data, and training on strategies to avoid driver-specific unsafe behaviors; (3) weekly telephone coaching using statistics and strategies was rated by drivers as the most convenient delivery mode, and was also significantly effective. Copyright © 2018 Elsevier Ltd. All rights reserved.
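
    An event modification factor of the kind summarised here can be read off the fitted coefficient of the education indicator in the negative binomial event-rate model. A minimal sketch of that step, with a hypothetical coefficient and standard error (not the study's estimates):

```python
# Hedged sketch: turning a fitted coefficient from a (random-effects) negative
# binomial model into an event modification factor with a 95% CI.  The numbers
# below are hypothetical placeholders.
import math

beta_education = -0.35      # hypothetical log rate ratio for the coached phase
se_beta = 0.12              # hypothetical standard error

emf = math.exp(beta_education)
ci = (math.exp(beta_education - 1.96 * se_beta),
      math.exp(beta_education + 1.96 * se_beta))
print(f"EMF = {emf:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")  # EMF < 1 => fewer events
```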

  15. Large-scale randomized clinical trials of bioactives and nutrients in relation to human health and disease prevention - Lessons from the VITAL and COSMOS trials.

    PubMed

    Rautiainen, Susanne; Sesso, Howard D; Manson, JoAnn E

    2017-12-29

    Several bioactive compounds and nutrients in foods have physiological properties that are beneficial for human health. While nutrients typically have clear definitions with established levels of recommended intakes, bioactive compounds often lack such a definition. Although a food-based approach is often the optimal approach to ensure adequate intake of bioactives and nutrients, these components are also often produced as dietary supplements. However, many of these supplements are not sufficiently studied and have an unclear role in chronic disease prevention. Randomized trials are considered the gold standard of study designs, but have not been fully applied to understand the effects of bioactives and nutrients. We review the specific role of large-scale trials to test whether bioactives and nutrients have an effect on health outcomes through several crucial components of trial design, including selection of intervention, recruitment, compliance, outcome selection, and interpretation and generalizability of study findings. We will discuss these components in the context of two randomized clinical trials, the VITamin D and OmegA-3 TriaL (VITAL) and the COcoa Supplement and Multivitamin Outcomes Study (COSMOS). We will mainly focus on dietary supplements of bioactives and nutrients while also emphasizing the need for translation and integration with food-based trials that are of vital importance within nutritional research. Copyright © 2017. Published by Elsevier Ltd.

  16. Linear mixed-effects modeling approach to FMRI group analysis

    PubMed Central

    Chen, Gang; Saad, Ziad S.; Britton, Jennifer C.; Pine, Daniel S.; Cox, Robert W.

    2013-01-01

    Conventional group analysis is usually performed with Student-type t-test, regression, or standard AN(C)OVA in which the variance–covariance matrix is presumed to have a simple structure. Some correction approaches are adopted when assumptions about the covariance structure are violated. However, as experiments are designed with different degrees of sophistication, these traditional methods can become cumbersome, or even be unable to handle the situation at hand. For example, most current FMRI software packages have difficulty analyzing the following scenarios at group level: (1) taking within-subject variability into account when there are effect estimates from multiple runs or sessions; (2) continuous explanatory variables (covariates) modeling in the presence of a within-subject (repeated measures) factor, multiple subject-grouping (between-subjects) factors, or the mixture of both; (3) subject-specific adjustments in covariate modeling; (4) group analysis with estimation of hemodynamic response (HDR) function by multiple basis functions; (5) various cases of missing data in longitudinal studies; and (6) group studies involving family members or twins. Here we present a linear mixed-effects modeling (LME) methodology that extends the conventional group analysis approach to analyze many complicated cases, including the six prototypes delineated above, whose analyses would be otherwise either difficult or unfeasible under traditional frameworks such as AN(C)OVA and general linear model (GLM). In addition, the strength of the LME framework lies in its flexibility to model and estimate the variance–covariance structures for both random effects and residuals. The intraclass correlation (ICC) values can be easily obtained with an LME model with crossed random effects, even in the presence of confounding fixed effects. The simulations of one prototypical scenario indicate that the LME modeling keeps a balance between the control for false positives and the sensitivity for activation detection. The importance of hypothesis formulation is also illustrated in the simulations. Comparisons with alternative group analysis approaches and the limitations of LME are discussed in detail. PMID:23376789
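
    One concrete point from this record, reading the intraclass correlation off the variance components of a random-intercept mixed model, can be sketched with simulated data. statsmodels' MixedLM is used purely for illustration; it is not the paper's FMRI group-analysis implementation.

```python
# Hedged sketch: ICC from a random-intercept linear mixed model on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subj, n_obs = 40, 6
subj = np.repeat(np.arange(n_subj), n_obs)
u = rng.normal(0.0, 1.0, n_subj)                    # subject random effects
y = 2.0 + u[subj] + rng.normal(0.0, 1.5, n_subj * n_obs)
df = pd.DataFrame({"y": y, "subj": subj})

fit = smf.mixedlm("y ~ 1", df, groups=df["subj"]).fit()
var_subj = float(fit.cov_re.iloc[0, 0])             # between-subject variance
var_resid = fit.scale                                # residual variance
print("ICC:", var_subj / (var_subj + var_resid))
```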

  17. Spatially-Distributed Cost–Effectiveness Analysis Framework to Control Phosphorus from Agricultural Diffuse Pollution

    PubMed Central

    Geng, Runzhe; Wang, Xiaoyan; Sharpley, Andrew N.; Meng, Fande

    2015-01-01

    Best management practices (BMPs) for agricultural diffuse pollution control are implemented at the field or small-watershed scale. However, quantifying the benefits of BMP implementation on receiving water quality at multiple spatial scales is an ongoing challenge. In this paper, we introduce an integrated approach that combines risk assessment (i.e., Phosphorus (P) index), model simulation techniques (Hydrological Simulation Program–FORTRAN), and a BMP placement tool at various scales to identify the optimal location for implementing multiple BMPs and estimate BMP effectiveness after implementation. A statistically significant decrease in nutrient discharge from watersheds is proposed to evaluate the effectiveness of BMPs, strategically targeted within watersheds. Specifically, we estimate two types of cost-effectiveness curves (total pollution reduction and proportion of watersheds improved) for four allocation approaches. Selection of a “best approach” depends on the relative importance of the two types of effectiveness, which involves a value judgment based on the random/aggregated degree of BMP distribution among and within sub-watersheds. A statistical optimization framework is developed and evaluated in Chaohe River Watershed located in the northern mountain area of Beijing. Results show that BMP implementation significantly (p < 0.001) decreased P loss from the watershed. Remedial strategies where BMPs were targeted to areas at high risk of P loss decreased P loads compared with strategies where BMPs were randomly located across watersheds. Sensitivity analysis indicated that aggregated BMP placement in particular watersheds is the most cost-effective scenario to decrease P loss. The optimization approach outlined in this paper is a spatially hierarchical method for targeting nonpoint source controls across a range of scales from field to farm, to watersheds, to regions. Further, model estimates showed targeting at multiple scales is necessary to optimize program efficiency. The integrated model approach described here, which selects and places BMPs at varying levels of implementation, provides a new theoretical basis and technical guidance for diffuse pollution management in agricultural watersheds. PMID:26313561

  18. Linear mixed-effects modeling approach to FMRI group analysis.

    PubMed

    Chen, Gang; Saad, Ziad S; Britton, Jennifer C; Pine, Daniel S; Cox, Robert W

    2013-06-01

    Conventional group analysis is usually performed with Student-type t-test, regression, or standard AN(C)OVA in which the variance-covariance matrix is presumed to have a simple structure. Some correction approaches are adopted when assumptions about the covariance structure are violated. However, as experiments are designed with different degrees of sophistication, these traditional methods can become cumbersome, or even be unable to handle the situation at hand. For example, most current FMRI software packages have difficulty analyzing the following scenarios at group level: (1) taking within-subject variability into account when there are effect estimates from multiple runs or sessions; (2) continuous explanatory variables (covariates) modeling in the presence of a within-subject (repeated measures) factor, multiple subject-grouping (between-subjects) factors, or the mixture of both; (3) subject-specific adjustments in covariate modeling; (4) group analysis with estimation of hemodynamic response (HDR) function by multiple basis functions; (5) various cases of missing data in longitudinal studies; and (6) group studies involving family members or twins. Here we present a linear mixed-effects modeling (LME) methodology that extends the conventional group analysis approach to analyze many complicated cases, including the six prototypes delineated above, whose analyses would be otherwise either difficult or unfeasible under traditional frameworks such as AN(C)OVA and general linear model (GLM). In addition, the strength of the LME framework lies in its flexibility to model and estimate the variance-covariance structures for both random effects and residuals. The intraclass correlation (ICC) values can be easily obtained with an LME model with crossed random effects, even in the presence of confounding fixed effects. The simulations of one prototypical scenario indicate that the LME modeling keeps a balance between the control for false positives and the sensitivity for activation detection. The importance of hypothesis formulation is also illustrated in the simulations. Comparisons with alternative group analysis approaches and the limitations of LME are discussed in detail. Published by Elsevier Inc.

  19. The Effects of Focus on Forms and Focus on Form in Teaching Complex Grammatical Structures

    ERIC Educational Resources Information Center

    Pawlak, Miroslaw

    2012-01-01

    The classroom-based study reported in the present paper sought to compare the effectiveness of the focus on forms (FonFs) and focus on form (FonF) approach in teaching English third conditional to Polish high school students. It involved three intact classes, randomly designated as FonF (n = 34), FonFs (n = 36), and Control (n = 35) with a pretest…

  20. Using Bayesian Adaptive Trial Designs for Comparative Effectiveness Research: A Virtual Trial Execution.

    PubMed

    Luce, Bryan R; Connor, Jason T; Broglio, Kristine R; Mullins, C Daniel; Ishak, K Jack; Saunders, Elijah; Davis, Barry R

    2016-09-20

    Bayesian and adaptive clinical trial designs offer the potential for more efficient processes that result in lower sample sizes and shorter trial durations than traditional designs. To explore the use and potential benefits of Bayesian adaptive clinical trial designs in comparative effectiveness research. Virtual execution of ALLHAT (Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial) as if it had been done according to a Bayesian adaptive trial design. Comparative effectiveness trial of antihypertensive medications. Patient data sampled from the more than 42 000 patients enrolled in ALLHAT with publicly available data. Number of patients randomly assigned between groups, trial duration, observed numbers of events, and overall trial results and conclusions. The Bayesian adaptive approach and original design yielded similar overall trial conclusions. The Bayesian adaptive trial randomly assigned more patients to the better-performing group and would probably have ended slightly earlier. This virtual trial execution required limited resampling of ALLHAT patients for inclusion in RE-ADAPT (REsearch in ADAptive methods for Pragmatic Trials). Involvement of a data monitoring committee and other trial logistics were not considered. In a comparative effectiveness research trial, Bayesian adaptive trial designs are a feasible approach and potentially generate earlier results and allocate more patients to better-performing groups. National Heart, Lung, and Blood Institute.
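
    A generic illustration of response-adaptive randomisation of the kind discussed here, using Beta-Bernoulli posteriors and Thompson-style sampling. This is not the RE-ADAPT/ALLHAT design itself, and the event risks below are hypothetical.

```python
# Hedged sketch: response-adaptive allocation that favours the arm whose sampled
# posterior event risk is lower.  All numbers are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(42)
true_event_rate = {"arm_A": 0.10, "arm_B": 0.14}    # hypothetical event risks
events = {a: 1 for a in true_event_rate}             # Beta(1, 1) priors on event risk
non_events = {a: 1 for a in true_event_rate}
allocation = {a: 0 for a in true_event_rate}

for _ in range(2000):                                # patients enrolled one at a time
    draws = {a: rng.beta(events[a], non_events[a]) for a in true_event_rate}
    arm = min(draws, key=draws.get)                  # favour the lower sampled risk
    allocation[arm] += 1
    had_event = rng.random() < true_event_rate[arm]
    events[arm] += int(had_event)
    non_events[arm] += int(not had_event)

print(allocation)  # the lower-risk arm tends to receive more patients
```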

  1. Differing antidepressant maintenance methodologies.

    PubMed

    Safer, Daniel J

    2017-10-01

    The principal evidence that antidepressant medication (ADM) is an effective maintenance treatment for adults with major depressive disorder (MDD) is from placebo substitution trials. These trials enter responders from ADM efficacy trials into randomized, double-blind placebo-controlled (RDBPC) effectiveness trials to measure the rate of MDD relapse over time. However, other randomized maintenance trial methodologies merit consideration and comparison. A systematic review of ADM randomized maintenance trials included research reports from multiple databases. Relapse rate was the main effectiveness outcome assessed. Five ADM randomized maintenance methodologies for MDD responders are described and compared for outcome. These effectiveness trials include: placebo-substitution, ADM/placebo extension, ADM extension, ADM vs. psychotherapy, and treatment as usual. The placebo-substitution trials for those abruptly switched to placebo resulted in unusually high (46%) rates of relapse over 6-12 months, twice the continuing ADM rate. These trials were characterized by selective screening, high attrition, an anxious anticipation of a switch to placebo, and a risk of drug withdrawal symptoms. Selectively screened ADM efficacy responders who entered into 4-12 month extension trials experienced relapse rates averaging ~10% with a low attrition rate. Non-industry sponsored randomized trials of adults with multiple prior MDD episodes who were treated with ADM maintenance for 1-2 years experienced relapse rates averaging 40%. Placebo substitution trial methodology represents only one approach to assess ADM maintenance. Antidepressant maintenance research for adults with MDD should be evaluated for industry sponsorship, attrition, the impact of the switch to placebo, and major relapse differences in MDD subpopulations. Copyright © 2017. Published by Elsevier Inc.

  2. Study of Tomography Of Nephrolithiasis Evaluation (STONE): methodology, approach and rationale.

    PubMed

    Valencia, Victoria; Moghadassi, Michelle; Kriesel, Dana R; Cummings, Steve; Smith-Bindman, Rebecca

    2014-05-01

    Urolithiasis (kidney stones) is a common reason for Emergency Department (ED) visits, accounting for nearly 1% of all visits in the United States. Computed tomography (CT) has become the most common imaging test for these patients but there are few comparative effectiveness data to support its use in comparison to ultrasound. This paper describes the rationale and methods of STONE (Study of Tomography Of Nephrolithiasis Evaluation), a pragmatic randomized comparative effectiveness trial comparing different imaging strategies for patients with suspected urolithiasis. STONE is a multi-center, non-blinded pragmatic randomized comparative effectiveness trial of patients between ages 18 and 75 with suspected nephrolithiasis seen in an ED setting. Patients were randomized to one of three initial imaging examinations: point-of-care ultrasound, ultrasound performed by a radiologist or CT. Participants then received diagnosis and treatment per usual care. The primary aim is to compare the rate of severe SAEs (Serious Adverse Events) between the three arms. In addition, a broad range of secondary outcomes was assessed at baseline and regularly for six months post-baseline using phone, email and mail questionnaires. Excluding 17 patients who withdrew after randomization, a total of 2759 patients were randomized and completed a baseline questionnaire (n=908, 893 and 958 in the point-of-care ultrasound, radiology ultrasound and radiology CT arms, respectively). Follow-up is complete, and full or partial outcomes were assessed on over 90% of participants. The detailed methodology of STONE will provide a roadmap for comparative effectiveness studies of diagnostic imaging conducted in an ED setting. Published by Elsevier Inc.

  3. Physiotherapy to improve physical activity in community-dwelling older adults with mobility problems (Coach2Move): study protocol for a randomized controlled trial.

    PubMed

    de Vries, Nienke M; Staal, J Bart; Teerenstra, Steven; Adang, Eddy M M; Rikkert, Marcel G M Olde; Nijhuis-van der Sanden, Maria W G

    2013-12-17

    Older adults can benefit from physical activity in numerous ways. Physical activity is considered to be one of the few ways to influence the level of frailty. Standardized exercise programs do not necessarily lead to more physical activity in daily life, however, and a more personalized approach seems appropriate. The main objective of this study is to investigate whether a focused, problem-oriented coaching intervention ('Coach2Move') delivered by a physiotherapist specializing in geriatrics is more effective for improving physical activity, mobility and health status in community-dwelling older adults than usual physiotherapy care. In addition, cost-effectiveness will be determined. The design of this study is a single-blind randomized controlled trial in thirteen physiotherapy practices. Randomization will take place at the individual patient level. The study population consists of older adults, ≥70 years of age, with decreased physical functioning and mobility and/or a physically inactive lifestyle. The intervention group will receive geriatric physiotherapy according to the Coach2Move strategy. The control group will receive the usual physiotherapy care. Measurements will be performed by research assistants not aware of group assignment. The results will be evaluated on the amount of physical activity (LASA Physical Activity Questionnaire), mobility (modified 'get up and go' test, walking speed and six-minute walking test), quality of life (SF-36), degree of frailty (Evaluative Frailty Index for Physical Activity), fatigue (NRS-fatigue), perceived effect (Global Perceived Effect and Patient Specific Complaints questionnaire) and health care costs. Most studies on the effect of exercise or physical activity consist of standardized programs. In this study, a personalized approach is evaluated within a group of frail older adults, many of whom suffer from multiple and complex diseases and problems. A complicating factor in evaluating a new approach is that it may not be automatically adopted by clinicians. Specific actions are undertaken to optimize implementation of the Coach2Move strategy during the trial. Whether or not these will be sufficient is a matter we will consider subsequently, using quality indicators and process analysis. The Netherlands National Trial Register: NTR3527.

  4. Theoretical model for plasmonic photothermal response of gold nanostructures solutions

    NASA Astrophysics Data System (ADS)

    Phan, Anh D.; Nga, Do T.; Viet, Nguyen A.

    2018-03-01

    Photothermal effects of gold core-shell nanoparticles and nanorods dispersed in water are theoretically investigated using the transient bioheat equation and the extended Mie theory. Properly calculating the absorption cross section is a crucial step in determining the elevation of the solution temperature. The nanostructures are assumed to be randomly and uniformly distributed in the solution. Compared with previous experiments, our calculated temperature increases during laser illumination show reasonable qualitative and quantitative agreement across various systems. This approach can be a highly reliable tool to predict photothermal effects in experimentally unexplored structures. We also validate our approach and discuss its limitations.

  5. Impact of alcohol-promoting and alcohol-warning advertisements on alcohol consumption, affect, and implicit cognition in heavy-drinking young adults: A laboratory-based randomized controlled trial.

    PubMed

    Stautz, Kaidy; Frings, Daniel; Albery, Ian P; Moss, Antony C; Marteau, Theresa M

    2017-02-01

    There is sparse evidence regarding the effect of alcohol-advertising exposure on alcohol consumption among heavy drinkers. This study aimed to assess the immediate effects of alcohol-promoting and alcohol-warning video advertising on objective alcohol consumption in heavy-drinking young adults, and to examine underlying processes. Between-participants randomized controlled trial with three conditions. Two hundred and four young adults (aged 18-25) who self-reported as heavy drinkers were randomized to view one of three sets of 10 video advertisements that included either (1) alcohol-promoting, (2) alcohol-warning, or (3) non-alcohol advertisements. The primary outcome was the proportion of alcoholic beverages consumed in a sham taste test. Affective responses to advertisements, implicit alcohol approach bias, and alcohol attentional bias were assessed as secondary outcomes and possible mediators. Typical alcohol consumption, Internet use, and television use were measured as covariates. There was no main effect of condition on alcohol consumption. Participants exposed to alcohol-promoting advertisements showed increased positive affect and an increased approach/reduced avoidance bias towards alcohol relative to those exposed to non-alcohol advertisements. There was an indirect effect of exposure to alcohol-warning advertisements on reduced alcohol consumption via negative affect experienced in response to these advertisements. Restricting alcohol-promoting advertising could remove a potential influence on positive alcohol-related emotions and cognitions among heavy-drinking young adults. Producing alcohol-warning advertising that generates negative emotion may be an effective strategy to reduce alcohol consumption. Statement of contribution What is already known on this subject? Exposure to alcohol advertising has immediate and distal effects on alcohol consumption. There is some evidence that effects may be larger in heavy drinkers. Alcohol-warning advertising has been found to have mixed effects on alcohol-related cognitions. What does this study add? Among heavy-drinking young adults: Alcohol advertising does not appear to have an immediate impact on alcohol consumption. Alcohol advertising generates positive affect and increases alcohol approach bias. Alcohol-warning advertising that generates displeasure reduces alcohol consumption. © 2016 The Authors. British Journal of Health Psychology published by John Wiley & Sons Ltd on behalf of the British Psychological Society.

  6. Accounting for Heterogeneity in Relative Treatment Effects for Use in Cost-Effectiveness Models and Value-of-Information Analyses

    PubMed Central

    Soares, Marta O.; Palmer, Stephen; Ades, Anthony E.; Harrison, David; Shankar-Hari, Manu; Rowan, Kathy M.

    2015-01-01

    Cost-effectiveness analysis (CEA) models are routinely used to inform health care policy. Key model inputs include relative effectiveness of competing treatments, typically informed by meta-analysis. Heterogeneity is ubiquitous in meta-analysis, and random effects models are usually used when there is variability in effects across studies. In the absence of observed treatment effect modifiers, various summaries from the random effects distribution (random effects mean, predictive distribution, random effects distribution, or study-specific estimate [shrunken or independent of other studies]) can be used depending on the relationship between the setting for the decision (population characteristics, treatment definitions, and other contextual factors) and the included studies. If covariates have been measured that could potentially explain the heterogeneity, then these can be included in a meta-regression model. We describe how covariates can be included in a network meta-analysis model and how the output from such an analysis can be used in a CEA model. We outline a model selection procedure to help choose between competing models and stress the importance of clinical input. We illustrate the approach with a health technology assessment of intravenous immunoglobulin for the management of adult patients with severe sepsis in an intensive care setting, which exemplifies how risk of bias information can be incorporated into CEA models. We show that the results of the CEA and value-of-information analyses are sensitive to the model and highlight the importance of sensitivity analyses when conducting CEA in the presence of heterogeneity. The methods presented extend naturally to heterogeneity in other model inputs, such as baseline risk. PMID:25712447
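
    For readers wanting a concrete handle on two of the random effects summaries mentioned above (the random effects mean versus the predictive distribution), the sketch below runs a standard DerSimonian-Laird random effects meta-analysis on hypothetical log odds ratios and reports both a confidence interval for the mean and a normal-approximation predictive interval for a new setting. It is a univariate illustration only, not the network meta-regression framework described in the article.

```python
import numpy as np
from scipy import stats

# Hypothetical study-level log odds ratios and within-study variances
y = np.array([-0.31, -0.10, -0.45, -0.22, 0.05])
v = np.array([0.040, 0.055, 0.030, 0.070, 0.060])

# DerSimonian-Laird moment estimator of the between-study variance tau^2
w = 1.0 / v
y_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fixed) ** 2)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (len(y) - 1)) / c)

# Random effects mean: precision weights with tau^2 added to each variance
w_star = 1.0 / (v + tau2)
mu = np.sum(w_star * y) / np.sum(w_star)
se_mu = np.sqrt(1.0 / np.sum(w_star))

# 95% interval for the mean effect vs. 95% predictive interval for the effect
# in a new setting (normal approximation)
z = stats.norm.ppf(0.975)
se_pred = np.sqrt(se_mu ** 2 + tau2)
print(f"tau^2 = {tau2:.3f}")
print(f"random effects mean {mu:.3f}, 95% CI ({mu - z*se_mu:.3f}, {mu + z*se_mu:.3f})")
print(f"95% predictive interval ({mu - z*se_pred:.3f}, {mu + z*se_pred:.3f})")
```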

  7. Accounting for Heterogeneity in Relative Treatment Effects for Use in Cost-Effectiveness Models and Value-of-Information Analyses.

    PubMed

    Welton, Nicky J; Soares, Marta O; Palmer, Stephen; Ades, Anthony E; Harrison, David; Shankar-Hari, Manu; Rowan, Kathy M

    2015-07-01

    Cost-effectiveness analysis (CEA) models are routinely used to inform health care policy. Key model inputs include relative effectiveness of competing treatments, typically informed by meta-analysis. Heterogeneity is ubiquitous in meta-analysis, and random effects models are usually used when there is variability in effects across studies. In the absence of observed treatment effect modifiers, various summaries from the random effects distribution (random effects mean, predictive distribution, random effects distribution, or study-specific estimate [shrunken or independent of other studies]) can be used depending on the relationship between the setting for the decision (population characteristics, treatment definitions, and other contextual factors) and the included studies. If covariates have been measured that could potentially explain the heterogeneity, then these can be included in a meta-regression model. We describe how covariates can be included in a network meta-analysis model and how the output from such an analysis can be used in a CEA model. We outline a model selection procedure to help choose between competing models and stress the importance of clinical input. We illustrate the approach with a health technology assessment of intravenous immunoglobulin for the management of adult patients with severe sepsis in an intensive care setting, which exemplifies how risk of bias information can be incorporated into CEA models. We show that the results of the CEA and value-of-information analyses are sensitive to the model and highlight the importance of sensitivity analyses when conducting CEA in the presence of heterogeneity. The methods presented extend naturally to heterogeneity in other model inputs, such as baseline risk. © The Author(s) 2015.

  8. Combined cognitive-strategy and task-specific training improves transfer to untrained activities in sub-acute stroke: An exploratory randomized controlled trial

    PubMed Central

    McEwen, Sara; Polatajko, Helene; Baum, Carolyn; Rios, Jorge; Cirone, Dianne; Doherty, Meghan; Wolf, Timothy

    2014-01-01

    Purpose The purpose of this study was to estimate the effect of the Cognitive Orientation to daily Occupational Performance (CO-OP) approach compared to usual outpatient rehabilitation on activity and participation in people less than 3 months post stroke. Methods An exploratory, single blind, randomized controlled trial with a usual care control arm was conducted. Participants referred to 2 stroke rehabilitation outpatient programs were randomized to receive either Usual Care or CO-OP. The primary outcome was actual performance of trained and untrained self-selected activities, measured using the Performance Quality Rating Scale (PQRS). Additional outcomes included the Canadian Occupational Performance Measure (COPM), the Stroke Impact Scale Participation Domain, the Community Participation Index, and the Self Efficacy Gauge. Results Thirty-five (35) eligible participants were randomized; 26 completed the intervention. Post-intervention, PQRS change scores demonstrated CO-OP had a medium effect over Usual Care on trained self-selected activities (d=0.5) and a large effect on untrained (d=1.2). At a 3 month follow-up, PQRS change scores indicated a large effect of CO-OP on both trained (d=1.6) and untrained activities (d=1.1). CO-OP had a small effect on COPM and a medium effect on the Community Participation Index perceived control and the Self-Efficacy Gauge. Conclusion CO-OP was associated with a large treatment effect on follow up performances of self-selected activities, and demonstrated transfer to untrained activities. A larger trial is warranted. PMID:25416738

  9. Combined Cognitive-Strategy and Task-Specific Training Improve Transfer to Untrained Activities in Subacute Stroke: An Exploratory Randomized Controlled Trial.

    PubMed

    McEwen, Sara; Polatajko, Helene; Baum, Carolyn; Rios, Jorge; Cirone, Dianne; Doherty, Meghan; Wolf, Timothy

    2015-07-01

    The purpose of this study was to estimate the effect of the Cognitive Orientation to daily Occupational Performance (CO-OP) approach compared with usual outpatient rehabilitation on activity and participation in people <3 months poststroke. An exploratory, single-blind, randomized controlled trial, with a usual-care control arm, was conducted. Participants referred to 2 stroke rehabilitation outpatient programs were randomized to receive either usual care or CO-OP. The primary outcome was actual performance of trained and untrained self-selected activities, measured using the Performance Quality Rating Scale (PQRS). Additional outcomes included the Canadian Occupational Performance Measure (COPM), the Stroke Impact Scale Participation Domain, the Community Participation Index, and the Self-Efficacy Gauge. A total of 35 eligible participants were randomized; 26 completed the intervention. Post intervention, PQRS change scores demonstrated that CO-OP had a medium effect over usual care on trained self-selected activities (d = 0.5) and a large effect on untrained activities (d = 1.2). At a 3-month follow-up, PQRS change scores indicated a large effect of CO-OP on both trained (d = 1.6) and untrained activities (d = 1.1). CO-OP had a small effect on COPM and a medium effect on the Community Participation Index perceived control and on the Self-Efficacy Gauge. CO-OP was associated with a large treatment effect on follow-up performances of self-selected activities and demonstrated transfer to untrained activities. A larger trial is warranted. © The Author(s) 2014.

  10. Interpreting null findings from trials of alcohol brief interventions.

    PubMed

    Heather, Nick

    2014-01-01

    The effectiveness of alcohol brief intervention (ABI) has been established by a succession of meta-analyses but, because the effects of ABI are small, null findings from randomized controlled trials are often reported and can sometimes lead to skepticism regarding the benefits of ABI in routine practice. This article first explains why null findings are likely to occur under null hypothesis significance testing (NHST) due to the phenomenon known as "the dance of the p-values." A number of misconceptions about null findings are then described, using as an example the way in which the results of the primary care arm of a recent cluster-randomized trial of ABI in England (the SIPS project) have been misunderstood. These misinterpretations include the fallacy of "proving the null hypothesis" that lack of a significant difference between the means of sample groups can be taken as evidence of no difference between their population means, and the possible effects of this and related misunderstandings of the SIPS findings are examined. The mistaken inference that reductions in alcohol consumption seen in control groups from baseline to follow-up are evidence of real effects of control group procedures is then discussed and other possible reasons for such reductions, including regression to the mean, research participation effects, historical trends, and assessment reactivity, are described. From the standpoint of scientific progress, the chief problem about null findings under the conventional NHST approach is that it is not possible to distinguish "evidence of absence" from "absence of evidence." By contrast, under a Bayesian approach, such a distinction is possible and it is explained how this approach could classify ABIs in particular settings or among particular populations as either truly ineffective or as of unknown effectiveness, thus accelerating progress in the field of ABI research.

  11. The pulling power of chocolate: Effects of approach-avoidance training on approach bias and consumption.

    PubMed

    Dickson, Hugh; Kavanagh, David J; MacLeod, Colin

    2016-04-01

    Previous research has shown that action tendencies to approach alcohol may be modified using a computerized Approach-Avoidance Task (AAT), and that this affected subsequent consumption. A recent paper in this journal (Becker, Jostman, Wiers, & Holland, 2015) failed to show significant training effects for food in three studies, nor did it find effects on subsequent consumption. However, avoidance training for high-calorie foods was tested against a control condition rather than against approach training. The present study used a paradigm more comparable to the alcohol studies. It randomly assigned 90 participants to 'approach' or 'avoid' chocolate images on the AAT, and then asked them to taste and rate chocolates. A significant interaction of condition and time showed that training to avoid chocolate resulted in faster avoidance responses to chocolate images, compared with training to approach it. Consistent with Becker et al.'s Study 3, no effect was found on the amount of chocolate consumed, although a newly published study in this journal (Schumacher, Kemps, & Tiggemann, 2016) did find such an effect. The collective evidence does not yet provide a solid basis for applying AAT training to the reduction of problematic food consumption, although clinical trials have yet to be conducted. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. A Path Analysis of a Randomized "Promotora de Salud" Cardiovascular Disease-Prevention Trial among At-Risk Hispanic Adults

    ERIC Educational Resources Information Center

    de Heer, Hendrik Dirk; Balcazar, Hector G.; Castro, Felipe; Schulz, Leslie

    2012-01-01

    This study assessed effectiveness of an educational community intervention taught by "promotoras de salud" in reducing cardiovascular disease (CVD) risk among Hispanics using a structural equation modeling (SEM) approach. Model development was guided by a social ecological framework proposing CVD risk reduction through improvement of…

  13. The Relationship between Critical Thinking Abilities and Classroom Management Skills of High School Teachers

    ERIC Educational Resources Information Center

    Demirdag, Seyithan

    2015-01-01

    High school teachers experience difficulties while providing effective teaching approaches in their classrooms. Some of the difficulties are associated with the lack of classroom management skills and critical thinking abilities. This quantitative study includes non-random selection of the participants and aims to examine critical thinking…

  14. Structuring Cooperative Learning for Motivation and Conceptual Change in the Concepts of Mixtures

    ERIC Educational Resources Information Center

    Belge Can, Hatice; Boz, Yezdan

    2016-01-01

    This study investigates the effect of structuring cooperative learning based on conceptual change approach on grade 9 students' understanding the concepts of mixtures and their motivation, compared with traditional instruction. Among six classes of a high school, two of them were randomly assigned to cooperative learning group where students were…

  15. An INAR(1) Negative Multinomial Regression Model for Longitudinal Count Data.

    ERIC Educational Resources Information Center

    Bockenholt, Ulf

    1999-01-01

    Discusses a regression model for the analysis of longitudinal count data in a panel study by adapting an integer-valued first-order autoregressive (INAR(1)) Poisson process to represent time-dependent correlation between counts. Derives a new negative multinomial distribution by combining INAR(1) representation with a random effects approach.…
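
    A minimal sketch of the INAR(1) Poisson representation referred to above, simulating counts via binomial thinning plus Poisson innovations; the parameter values are hypothetical and the negative multinomial random effects extension derived in the article is not implemented here.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_inar1(alpha, lam, T):
    """Simulate an INAR(1) process: X_t = alpha o X_{t-1} + eps_t,
    where 'o' denotes binomial thinning and eps_t ~ Poisson(lam)."""
    x = np.empty(T, dtype=int)
    # Start from the stationary marginal, Poisson(lam / (1 - alpha))
    x[0] = rng.poisson(lam / (1.0 - alpha))
    for t in range(1, T):
        survivors = rng.binomial(x[t - 1], alpha)   # binomial thinning
        x[t] = survivors + rng.poisson(lam)          # new arrivals
    return x

series = simulate_inar1(alpha=0.6, lam=2.0, T=200)
print(series[:20])
print("lag-1 autocorrelation (should be near alpha):",
      np.corrcoef(series[:-1], series[1:])[0, 1])
```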

  16. Enhanced Case Management versus Substance Abuse Treatment Alone among Substance Abusers with Depression

    ERIC Educational Resources Information Center

    Striley, Catherine W.; Nattala, Prasanthi; Ben Abdallah, Arbi; Dennis, Michael L.; Cottler, Linda B.

    2013-01-01

    This pilot study evaluated the effectiveness of enhanced case management, an integrated approach to care, for substance abusers with comorbid major depression. One hundred and twenty participants admitted to drug treatment who also met Computerized Diagnostic Interview Schedule criteria for major depression at baseline were randomized to…

  17. Quality Quandaries: Predicting a Population of Curves

    DOE PAGES

    Fugate, Michael Lynn; Hamada, Michael Scott; Weaver, Brian Phillip

    2017-12-19

    We present a random effects spline regression model that provides an integrated approach for analyzing functional data, i.e., curves, when the shape of the curves is not parametrically specified. An analysis using this model is presented that makes inferences about a population of curves as well as features of the curves.
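
    As a rough illustration of the idea (not necessarily the authors' exact formulation, whose details are not given in the abstract), the sketch below fits a simulated population of curves with a common cubic spline basis plus a subject-level random intercept using statsmodels; all data, knots, and coefficients are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulate a population of curves: a common smooth shape plus subject-level shifts
n_subj, n_obs = 20, 30
t = np.linspace(0, 1, n_obs)
knots = np.array([0.25, 0.5, 0.75])

def spline_basis(t, knots):
    """Cubic truncated-power spline basis (intercept handled separately)."""
    cols = [t, t**2, t**3] + [np.clip(t - k, 0, None) ** 3 for k in knots]
    return np.column_stack(cols)

X_one = spline_basis(t, knots)
beta = np.array([2.0, -3.0, 1.0, 4.0, -6.0, 3.0])   # hypothetical fixed coefficients

rows, y, groups = [], [], []
for i in range(n_subj):
    b_i = rng.normal(0, 0.5)                          # subject-level random intercept
    y.append(1.0 + X_one @ beta + b_i + rng.normal(0, 0.2, n_obs))
    rows.append(X_one)
    groups.append(np.full(n_obs, i))

X = sm.add_constant(np.vstack(rows))
y = np.concatenate(y)
groups = np.concatenate(groups)

# Random-intercept spline regression; richer subject-level spline coefficients
# could be added via exog_re if curves vary in shape as well as level.
fit = sm.MixedLM(y, X, groups=groups).fit()
print(fit.summary())
```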

  18. Effective Programs for Elementary Science: A Best-Evidence Synthesis

    ERIC Educational Resources Information Center

    Slavin, Robert E.; Lake, Cynthia; Hanley, Pam; Thurston, Allen

    2012-01-01

    This article presents a systematic review of research on the achievement outcomes of all types of approaches to teaching science in elementary schools. Study inclusion criteria included use of randomized or matched control groups, a study duration of at least 4 weeks, and use of achievement measures independent of the experimental treatment. A…

  19. Analyzing Empirical Evaluations of Non-Experimental Methods in Field Settings

    ERIC Educational Resources Information Center

    Steiner, Peter M.; Wong, Vivian

    2016-01-01

    Despite recent emphasis on the use of randomized control trials (RCTs) for evaluating education interventions, in most areas of education research, observational methods remain the dominant approach for assessing program effects. Over the last three decades, the within-study comparison (WSC) design has emerged as a method for evaluating the…

  20. Quality Quandaries: Predicting a Population of Curves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fugate, Michael Lynn; Hamada, Michael Scott; Weaver, Brian Phillip

    We present a random effects spline regression model that provides an integrated approach for analyzing functional data, i.e., curves, when the shape of the curves is not parametrically specified. An analysis using this model is presented that makes inferences about a population of curves as well as features of the curves.

  1. Effects of a Universal Positive Classroom Behavior Program on Student Learning

    ERIC Educational Resources Information Center

    Diperna, James Clyde; Lei, Puiwa; Bellinger, Jillian; Cheng, Weiyi

    2016-01-01

    The purpose of this study was to examine the impact of a universal program to promote positive classroom behavior on students' approaches to learning and early academic skills. Second grade classrooms (N = 39) were randomly assigned to treatment and business-as-usual control conditions. Teachers in intervention classrooms implemented the Social…

  2. Simulation-Extrapolation for Estimating Means and Causal Effects with Mismeasured Covariates

    ERIC Educational Resources Information Center

    Lockwood, J. R.; McCaffrey, Daniel F.

    2015-01-01

    Regression, weighting and related approaches to estimating a population mean from a sample with nonrandom missing data often rely on the assumption that conditional on covariates, observed samples can be treated as random. Standard methods using this assumption generally will fail to yield consistent estimators when covariates are measured with…

  3. A Family-School Intervention for Children with ADHD: Results of a Randomized Clinical Trial

    ERIC Educational Resources Information Center

    Power, Thomas J.; Mautone, Jennifer A.; Soffer, Stephen L.; Clarke, Angela T.; Marshall, Stephen A.; Sharman, Jaclyn; Blum, Nathan J.; Glanzman, Marianne; Elia, Josephine; Jawad, Abbas F.

    2012-01-01

    Objective: Accumulating evidence highlights the importance of using psychosocial approaches to intervention for children with attention-deficit/hyperactivity disorder (ADHD) that target the family and school, as well as the intersection of family and school. This study evaluated the effectiveness of a family-school intervention, Family-School…

  4. A Randomised Efficacy Study of Web-Based Synthetic and Analytic Programmes among Disadvantaged Urban Kindergarten Children

    ERIC Educational Resources Information Center

    Comaskey, Erin M.; Savage, Robert S.; Abrami, Philip

    2009-01-01

    This study explores whether two computer-based literacy interventions--a "synthetic phonics" and an "analytic phonics" approach produce qualitatively distinct effects on the early phonological abilities and reading skills of disadvantaged urban Kindergarten (Reception) children. Participants (n=53) were assigned by random allocation to one of the…

  5. Examining the Relationships of Component Reading Skills to Reading Comprehension in Struggling Adult Readers

    ERIC Educational Resources Information Center

    Tighe, Elizabeth L.; Schatschneider, Christopher

    2016-01-01

    The current study employed a meta-analytic approach to investigate the relative importance of component reading skills to reading comprehension in struggling adult readers. A total of 10 component skills were consistently identified across 16 independent studies and 2,707 participants. Random effects models generated 76 predictor-reading…

  6. Effective pore-scale dispersion upscaling with a correlated continuous time random walk approach

    NASA Astrophysics Data System (ADS)

    Le Borgne, T.; Bolster, D.; Dentz, M.; de Anna, P.; Tartakovsky, A.

    2011-12-01

    We investigate the upscaling of dispersion from a pore-scale analysis of Lagrangian velocities. A key challenge in the upscaling procedure is to relate the temporal evolution of spreading to the pore-scale velocity field properties. We test the hypothesis that one can represent Lagrangian velocities at the pore scale as a Markov process in space. The resulting effective transport model is a continuous time random walk (CTRW) characterized by a correlated random time increment, here denoted as correlated CTRW. We consider a simplified sinusoidal wavy channel model as well as a more complex heterogeneous pore space. For both systems, the predictions of the correlated CTRW model, with parameters defined from the velocity field properties (both distribution and correlation), are found to be in good agreement with results from direct pore-scale simulations over preasymptotic and asymptotic times. In this framework, the nontrivial dependence of dispersion on the pore boundary fluctuations is shown to be related to the competition between distribution and correlation effects. In particular, explicit inclusion of spatial velocity correlation in the effective CTRW model is found to be important to represent incomplete mixing in the pore throats.
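
    A minimal sketch of the correlated CTRW idea described above: particle velocities evolve as a Markov chain over discrete velocity classes in space, producing correlated random time increments, and longitudinal spreading is read off as the variance of particle positions at fixed observation times. The velocity classes and transition probabilities below are hypothetical, not fitted to any pore geometry.

```python
import numpy as np

rng = np.random.default_rng(42)

# Discretized Lagrangian velocity classes (hypothetical values)
v_class = np.array([0.1, 0.5, 1.0, 2.0])
dx = 1.0                                  # fixed spatial step: one pore length

# Transition matrix between velocity classes along the trajectory; diagonal
# dominance encodes spatial velocity correlation. A matrix with identical rows
# would recover an uncorrelated CTRW.
P = np.array([[0.70, 0.20, 0.08, 0.02],
              [0.20, 0.60, 0.15, 0.05],
              [0.05, 0.15, 0.60, 0.20],
              [0.02, 0.08, 0.20, 0.70]])

n_particles, n_steps = 500, 400
obs_times = np.linspace(1.0, 120.0, 40)
positions = np.zeros((n_particles, obs_times.size))

for i in range(n_particles):
    state = rng.integers(0, 4)
    times = np.zeros(n_steps + 1)
    for k in range(n_steps):
        times[k + 1] = times[k] + dx / v_class[state]   # correlated time increment
        state = rng.choice(4, p=P[state])               # Markov chain in space
    x_traj = np.arange(n_steps + 1) * dx
    positions[i] = np.interp(obs_times, times, x_traj)  # position at fixed times

# Longitudinal spreading (variance of particle positions) versus time
spreading = positions.var(axis=0)
print("sigma^2 at t =", obs_times[-1], ":", spreading[-1])
```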

  7. Bayesian network meta-analysis for cluster randomized trials with binary outcomes.

    PubMed

    Uhlmann, Lorenz; Jensen, Katrin; Kieser, Meinhard

    2017-06-01

    Network meta-analysis is becoming a common approach to combine direct and indirect comparisons of several treatment arms. In recent research, there have been various developments and extensions of the standard methodology. Simultaneously, cluster randomized trials are experiencing an increased popularity, especially in the field of health services research, where, for example, medical practices are the units of randomization but the outcome is measured at the patient level. Combination of the results of cluster randomized trials is challenging. In this tutorial, we examine and compare different approaches for the incorporation of cluster randomized trials in a (network) meta-analysis. Furthermore, we provide practical insight on the implementation of the models. In simulation studies, it is shown that some of the examined approaches lead to unsatisfying results. However, there are alternatives which are suitable to combine cluster randomized trials in a network meta-analysis as they are unbiased and reach accurate coverage rates. In conclusion, the methodology can be extended in such a way that an adequate inclusion of the results obtained in cluster randomized trials becomes feasible. Copyright © 2016 John Wiley & Sons, Ltd.
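
    One widely used (though not the only) way to fold a cluster randomized trial into a meta-analysis of binary outcomes is to deflate its counts by the design effect before computing the effect estimate. The sketch below illustrates that adjustment under assumed cluster size and ICC values; it is an illustration of the general idea, not one of the specific models compared in the tutorial.

```python
import math

def effective_counts(events, n, cluster_size, icc):
    """Scale events and sample size of one arm of a cluster randomized trial
    by the design effect DE = 1 + (m - 1) * ICC (assumed adjustment)."""
    de = 1.0 + (cluster_size - 1.0) * icc
    return events / de, n / de

def log_or_with_se(e1, n1, e0, n0):
    """Log odds ratio and its standard error from (possibly deflated) counts."""
    lor = math.log((e1 * (n0 - e0)) / (e0 * (n1 - e1)))
    se = math.sqrt(1 / e1 + 1 / (n1 - e1) + 1 / e0 + 1 / (n0 - e0))
    return lor, se

# Hypothetical cluster RCT: 30 patients per practice, ICC = 0.05
e1, n1 = effective_counts(events=45, n=300, cluster_size=30, icc=0.05)
e0, n0 = effective_counts(events=70, n=300, cluster_size=30, icc=0.05)
print(log_or_with_se(e1, n1, e0, n0))
```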

  8. Methods for identifying SNP interactions: a review on variations of Logic Regression, Random Forest and Bayesian logistic regression.

    PubMed

    Chen, Carla Chia-Ming; Schwender, Holger; Keith, Jonathan; Nunkesser, Robin; Mengersen, Kerrie; Macrossan, Paula

    2011-01-01

    Due to advancements in computational ability, enhanced technology and a reduction in the price of genotyping, more data are being generated for understanding genetic associations with diseases and disorders. However, with the availability of large data sets comes the inherent challenges of new methods of statistical analysis and modeling. Considering a complex phenotype may be the effect of a combination of multiple loci, various statistical methods have been developed for identifying genetic epistasis effects. Among these methods, logic regression (LR) is an intriguing approach incorporating tree-like structures. Various methods have built on the original LR to improve different aspects of the model. In this study, we review four variations of LR, namely Logic Feature Selection, Monte Carlo Logic Regression, Genetic Programming for Association Studies, and Modified Logic Regression-Gene Expression Programming, and investigate the performance of each method using simulated and real genotype data. We contrast these with another tree-like approach, namely Random Forests, and a Bayesian logistic regression with stochastic search variable selection.
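
    For orientation only, the sketch below shows the Random Forest side of the comparison on simulated genotype data with a purely epistatic (two-locus interaction) signal; all names and parameters are hypothetical and the logic regression variants reviewed in the article are not implemented here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(7)

# Simulate genotypes (0/1/2 minor-allele counts) for 1000 subjects and 50 SNPs
n, p = 1000, 50
maf = rng.uniform(0.1, 0.4, p)
X = rng.binomial(2, maf, size=(n, p))

# Case status driven by an epistatic effect of SNP 3 and SNP 17 only
logit = -1.0 + 1.5 * (X[:, 3] > 0) * (X[:, 17] > 0)
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

rf = RandomForestClassifier(n_estimators=500, min_samples_leaf=5, random_state=0)
rf.fit(X, y)

# The interacting SNPs should rank highly by importance even though no
# interaction term was specified explicitly
top = np.argsort(rf.feature_importances_)[::-1][:5]
print("top-ranked SNPs by importance:", top)
```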

  9. Metal-superconductor transition in low-dimensional superconducting clusters embedded in two-dimensional electron systems

    NASA Astrophysics Data System (ADS)

    Bucheli, D.; Caprara, S.; Castellani, C.; Grilli, M.

    2013-02-01

    Motivated by recent experimental data on thin film superconductors and oxide interfaces, we propose a random-resistor network apt to describe the occurrence of a metal-superconductor transition in a two-dimensional electron system with disorder on the mesoscopic scale. We consider low-dimensional (e.g. filamentary) structures of a superconducting cluster embedded in the two-dimensional network and we explore the separate effects and the interplay of the superconducting structure and of the statistical distribution of local critical temperatures. The thermal evolution of the resistivity is determined by a numerical calculation of the random-resistor network and, for comparison, a mean-field approach called effective medium theory (EMT). Our calculations reveal the relevance of the distribution of critical temperatures for clusters with low connectivity. In addition, we show that the presence of spatial correlations requires a modification of standard EMT to give qualitative agreement with the numerical results. Applying the present approach to an LaTiO3/SrTiO3 oxide interface, we find that the measured resistivity curves are compatible with a network of spatially dense but loosely connected superconducting islands.
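
    As an illustration of the effective medium theory (EMT) comparison mentioned above, the hedged sketch below solves the standard EMT self-consistency condition for a two-component bond network (a highly conducting "superconducting" fraction embedded in a normal background on a square lattice). This is a generic textbook EMT calculation, not the authors' spatially correlated model.

```python
import numpy as np
from scipy.optimize import brentq

def emt_effective_conductance(p_sc, sigma_sc=1e4, sigma_n=1.0, z=4):
    """Solve the EMT self-consistency equation for a binary bond network:
    sum_i p_i (sigma_m - sigma_i) / (sigma_i + (z/2 - 1) sigma_m) = 0."""
    def f(sigma_m):
        term_sc = p_sc * (sigma_m - sigma_sc) / (sigma_sc + (z / 2 - 1) * sigma_m)
        term_n = (1 - p_sc) * (sigma_m - sigma_n) / (sigma_n + (z / 2 - 1) * sigma_m)
        return term_sc + term_n
    return brentq(f, sigma_n * 1e-6, sigma_sc)

# Effective conductance rises sharply as the superconducting bond fraction
# approaches the EMT percolation threshold (1/2 on the square lattice)
for p in [0.1, 0.3, 0.45, 0.5, 0.6]:
    print(p, emt_effective_conductance(p))
```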

  10. Effect of a mobile app intervention on vegetable consumption in overweight adults: a randomized controlled trial.

    PubMed

    Mummah, Sarah; Robinson, Thomas N; Mathur, Maya; Farzinkhou, Sarah; Sutton, Stephen; Gardner, Christopher D

    2017-09-15

    Mobile applications (apps) have been heralded as transformative tools to deliver behavioral health interventions at scale, but few have been tested in rigorous randomized controlled trials. We tested the effect of a mobile app to increase vegetable consumption among overweight adults attempting weight loss maintenance. Overweight adults (n=135) aged 18-50 years with BMI = 28-40 kg/m² near Stanford, CA were recruited from an ongoing 12-month weight loss trial (parent trial) and randomly assigned to either the stand-alone, theory-based Vegethon mobile app (enabling goal setting, self-monitoring, and feedback and using "process motivators" including fun, surprise, choice, control, social comparison, and competition) or a wait-listed control condition. The primary outcome was daily vegetable servings, measured by an adapted Harvard food frequency questionnaire (FFQ) 8 weeks post-randomization. Daily vegetable servings from 24-hour dietary recalls, administered by trained, certified, and blinded interviewers 5 weeks post-randomization, were included as a secondary outcome. All analyses were conducted according to principles of intention-to-treat. Daily vegetable consumption was significantly greater in the intervention versus control condition for both measures (adjusted mean difference: 2.0 servings; 95% CI: 0.1, 3.8, p=0.04 for FFQ; and 1.0 servings; 95% CI: 0.2, 1.9; p=0.02 for 24-hour recalls). Baseline vegetable consumption was a significant moderator of intervention effects (p=0.002) in which effects increased as baseline consumption increased. These results demonstrate the efficacy of a mobile app to increase vegetable consumption among overweight adults. Theory-based mobile interventions may present a low-cost, scalable, and effective approach to improving dietary behaviors and preventing associated chronic diseases. ClinicalTrials.gov NCT01826591. Registered 27 March 2013.

  11. Random-effects linear modeling and sample size tables for two special crossover designs of average bioequivalence studies: the four-period, two-sequence, two-formulation and six-period, three-sequence, three-formulation designs.

    PubMed

    Diaz, Francisco J; Berg, Michel J; Krebill, Ron; Welty, Timothy; Gidal, Barry E; Alloway, Rita; Privitera, Michael

    2013-12-01

    Due to concern and debate in the epilepsy medical community and to the current interest of the US Food and Drug Administration (FDA) in revising approaches to the approval of generic drugs, the FDA is currently supporting ongoing bioequivalence studies of antiepileptic drugs, the EQUIGEN studies. During the design of these crossover studies, the researchers could not find commercial or non-commercial statistical software that quickly allowed computation of sample sizes for their designs, particularly software implementing the FDA requirement of using random-effects linear models for the analyses of bioequivalence studies. This article presents tables for sample-size evaluations of average bioequivalence studies based on the two crossover designs used in the EQUIGEN studies: the four-period, two-sequence, two-formulation design, and the six-period, three-sequence, three-formulation design. Sample-size computations assume that random-effects linear models are used in bioequivalence analyses with crossover designs. Random-effects linear models have been traditionally viewed by many pharmacologists and clinical researchers as just mathematical devices to analyze repeated-measures data. In contrast, a modern view of these models attributes an important mathematical role in theoretical formulations in personalized medicine to them, because these models not only have parameters that represent average patients, but also have parameters that represent individual patients. Moreover, the notation and language of random-effects linear models have evolved over the years. Thus, another goal of this article is to provide a presentation of the statistical modeling of data from bioequivalence studies that highlights the modern view of these models, with special emphasis on power analyses and sample-size computations.

  12. Bi-dimensional null model analysis of presence-absence binary matrices.

    PubMed

    Strona, Giovanni; Ulrich, Werner; Gotelli, Nicholas J

    2018-01-01

    Comparing the structure of presence/absence (i.e., binary) matrices with those of randomized counterparts is a common practice in ecology. However, differences in the randomization procedures (null models) can affect the results of the comparisons, leading matrix structural patterns to appear either "random" or not. Subjectivity in the choice of one particular null model over another often makes it advisable to compare the results obtained using several different approaches. Yet, available algorithms to randomize binary matrices differ substantially with respect to the constraints they impose on the discrepancy between observed and randomized row and column marginal totals, which complicates the interpretation of contrasting patterns. This calls for new strategies both to explore intermediate scenarios of restrictiveness in-between extreme constraint assumptions, and to properly synthesize the resulting information. Here we introduce a new modeling framework based on a flexible matrix randomization algorithm (named the "Tuning Peg" algorithm) that addresses both issues. The algorithm consists of a modified swap procedure in which the discrepancy between the row and column marginal totals of the target matrix and those of its randomized counterpart can be "tuned" in a continuous way by two parameters (controlling, respectively, row and column discrepancy). We show how combining the Tuning Peg with a wise random walk procedure makes it possible to explore the complete null space embraced by existing algorithms. This exploration allows researchers to visualize matrix structural patterns in an innovative bi-dimensional landscape of significance/effect size. We demonstrate the rationale and potential of our approach with a set of simulated and real matrices, showing how the simultaneous investigation of a comprehensive and continuous portion of the null space can be extremely informative, and possibly key to resolving longstanding debates in the analysis of ecological matrices. © 2017 The Authors. Ecology, published by Wiley Periodicals, Inc., on behalf of the Ecological Society of America.
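
    The sketch below illustrates only the fixed-marginal swap (checkerboard) randomization that the Tuning Peg algorithm generalizes; the published algorithm additionally relaxes the row/column constraints through two tuning parameters, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

def checkerboard_swap_null(M, n_swaps=10000):
    """Randomize a binary matrix while preserving row and column totals exactly
    by repeatedly swapping 2x2 'checkerboard' submatrices.

    The Tuning Peg algorithm (Strona et al.) allows controlled discrepancy in
    the marginals; this sketch keeps them strictly fixed."""
    M = M.copy()
    n_rows, n_cols = M.shape
    for _ in range(n_swaps):
        r = rng.choice(n_rows, 2, replace=False)
        c = rng.choice(n_cols, 2, replace=False)
        sub = M[np.ix_(r, c)]
        # A swap is legal only when the 2x2 submatrix is a checkerboard
        if sub[0, 0] == sub[1, 1] and sub[0, 1] == sub[1, 0] and sub[0, 0] != sub[0, 1]:
            M[np.ix_(r, c)] = 1 - sub
    return M

A = (rng.random((12, 15)) < 0.3).astype(int)
B = checkerboard_swap_null(A)
assert (A.sum(axis=1) == B.sum(axis=1)).all() and (A.sum(axis=0) == B.sum(axis=0)).all()
print("row/column totals preserved; matrices differ in", (A != B).sum(), "cells")
```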

  13. A systematic review of randomized controlled trials with herbal medicine on chronic rhinosinusitis.

    PubMed

    Anushiravani, Majid; Bakhshaee, Mahdi; Taghipour, Ali; Naghedi-Baghdar, Hamideh; Farshchi, Masoumeh Kaboli; Hoseini, Seyed Saeed; Mehri, Mohammad Reza

    2018-03-01

    Chronic rhinosinusitis (CRS) is a common disease with evidence to show that its incidence and prevalence are increasing. Medicinal plants are commonly used to treat CRS. This systematic review aimed to assess the effectiveness and safety of herbal preparations for the treatment of patients with CRS. Cochrane, Embase, ISI, PubMed, and Scopus databases were searched until August 1, 2016. Only randomized controlled trials were included. Four randomized controlled trials were included in this systematic review. Various medicinal plants were studied in each article. Inclusion and exclusion criteria and outcome measures varied among the articles. The results of these trials showed that these medicinal plants may be effective in the treatment of CRS. No serious reactions were reported during the administration of herbal remedies in the 4 studies. However, well-designed trials are needed to study the actual safety and efficacy of herbs in the treatment of CRS. Copyright © 2017 John Wiley & Sons, Ltd.

  14. A double-blind study on clonazepam in patients with burning mouth syndrome.

    PubMed

    Heckmann, Siegfried M; Kirchner, Elena; Grushka, Miriam; Wichmann, Manfred G; Hummel, Thomas

    2012-04-01

    In the treatment of burning mouth syndrome (BMS), various approaches have been tried with equivocal results. The aim of the present randomized clinical trial was to determine the efficacy of clonazepam, a GABA agonist designed as an antiepileptic drug that exerts the typical effects of benzodiazepines. Randomized clinical trial. Twenty patients with idiopathic BMS were carefully selected. Clonazepam (0.5 mg/day, n = 10) or placebo (lactose, n = 10) were randomly assigned to the patients. Patients on clonazepam significantly improved in pain ratings (P < .001). These changes were less pronounced in the placebo group (P < .11). No significant changes were observed in a mood scale (P = .56) or for depression scores (P = .56). Taste test and salivary flow increased over sessions, but were not different between groups (P = .83 and P = .06, respectively). Clonazepam appears to have a positive effect on pain in BMS patients. Copyright © 2011 The American Laryngological, Rhinological, and Otological Society, Inc.

  15. Best (but oft-forgotten) practices: the design, analysis, and interpretation of Mendelian randomization studies1

    PubMed Central

    Bowden, Jack; Relton, Caroline; Davey Smith, George

    2016-01-01

    Mendelian randomization (MR) is an increasingly important tool for appraising causality in observational epidemiology. The technique exploits the principle that genotypes are not generally susceptible to reverse causation bias and confounding, reflecting their fixed nature and Mendel’s first and second laws of inheritance. The approach is, however, subject to important limitations and assumptions that, if unaddressed or compounded by poor study design, can lead to erroneous conclusions. Nevertheless, the advent of 2-sample approaches (in which exposure and outcome are measured in separate samples) and the increasing availability of open-access data from large consortia of genome-wide association studies and population biobanks mean that the approach is likely to become routine practice in evidence synthesis and causal inference research. In this article we provide an overview of the design, analysis, and interpretation of MR studies, with a special emphasis on assumptions and limitations. We also consider different analytic strategies for strengthening causal inference. Although impossible to prove causality with any single approach, MR is a highly cost-effective strategy for prioritizing intervention targets for disease prevention and for strengthening the evidence base for public health policy. PMID:26961927

  16. Bilingual approach to online cancer genetics education for deaf American Sign Language users produces greater knowledge and confidence than English text only: A randomized study

    PubMed Central

    Palmer, Christina G.S.; Boudreault, Patrick; Berman, Barbara A.; Wolfson, Alicia; Duarte, Lionel; Venne, Vickie L.; Sinsheimer, Janet S.

    2016-01-01

    Introduction Deaf American Sign Language (ASL) users have limited access to cancer genetics information they can readily understand, increasing risk for health disparities. We compared the effectiveness of online cancer genetics information presented using a bilingual approach (ASL with English closed captioning) and a monolingual approach (English text). Hypothesis Bilingual modality would increase cancer genetics knowledge and confidence to create a family tree; education would interact with modality. Methods We used a block 2:1 randomized pre-post study design stratified on education. 150 Deaf ASL users ≥18 years old with computer and internet access participated online; 100 (70 high, 30 low education) and 50 (35 high, 15 low education) were randomized to the bilingual and monolingual modalities. The two modalities provide virtually identical content on creating a family tree, using the family tree to identify inherited cancer risk factors, understanding how cancer predisposition can be inherited, and the role of genetic counseling and testing for prevention or treatment. 25 True/False items assessed knowledge; a Likert scale item assessed confidence. Data were collected within 2 weeks before and after viewing the information. Results A significant interaction of language modality, education, and change in knowledge scores was observed (p=.01). The high-education group increased knowledge regardless of modality (Bilingual: p<.001; d=.56; Monolingual: p<.001; d=1.08). The low-education group increased knowledge with the bilingual (p<.001; d=.85), but not the monolingual (p=.79; d=.08) modality. Bilingual modality yielded greater confidence creating a family tree (p=.03). Conclusions A bilingual approach provides a better opportunity for lower educated Deaf ASL users to access cancer genetics information than a monolingual approach. PMID:27594054

  17. A non-iterative extension of the multivariate random effects meta-analysis.

    PubMed

    Makambi, Kepher H; Seung, Hyunuk

    2015-01-01

    Multivariate methods in meta-analysis are becoming popular and more accepted in biomedical research despite computational issues in some of the techniques. A number of approaches, both iterative and non-iterative, have been proposed, including the multivariate DerSimonian and Laird method by Jackson et al. (2010), which is non-iterative. In this study, we propose an extension of the method by Hartung and Makambi (2002) and Makambi (2001) to multivariate situations. A comparison of the bias and mean square error from a simulation study indicates that, in some circumstances, the proposed approach performs better than the multivariate DerSimonian-Laird approach. An example is presented to demonstrate the application of the proposed approach.
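
    By way of illustration only, the sketch below contrasts the univariate DerSimonian-Laird estimator of the between-study variance with a Hartung-Makambi-type non-negative estimator. The Hartung-Makambi formula used here, τ² = Q² / (c·(2(k−1) + Q)), is an assumption based on common software implementations (e.g. the "HM" method in metafor), not taken from this article, and the article's actual contribution, the multivariate extension, is not implemented.

```python
import numpy as np

# Hypothetical univariate meta-analysis data (effects and within-study variances)
y = np.array([0.12, 0.35, -0.05, 0.28, 0.41, 0.10])
v = np.array([0.02, 0.05, 0.03, 0.04, 0.06, 0.03])

w = 1.0 / v
k = len(y)
y_bar = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_bar) ** 2)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)

# DerSimonian-Laird: moment estimator, truncated at zero
tau2_dl = max(0.0, (Q - (k - 1)) / c)

# Hartung-Makambi-type estimator: strictly non-negative without truncation
# (assumed form; see the note in the lead-in paragraph)
tau2_hm = Q ** 2 / (c * (2 * (k - 1) + Q))

print(f"DL tau^2 = {tau2_dl:.4f}, HM tau^2 = {tau2_hm:.4f}")
```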

  18. Latent spatial models and sampling design for landscape genetics

    USGS Publications Warehouse

    Hanks, Ephraim M.; Hooten, Mevin B.; Knick, Steven T.; Oyler-McCance, Sara J.; Fike, Jennifer A.; Cross, Todd B.; Schwartz, Michael K.

    2016-01-01

    We propose a spatially-explicit approach for modeling genetic variation across space and illustrate how this approach can be used to optimize spatial prediction and sampling design for landscape genetic data. We propose a multinomial data model for categorical microsatellite allele data commonly used in landscape genetic studies, and introduce a latent spatial random effect to allow for spatial correlation between genetic observations. We illustrate how modern dimension reduction approaches to spatial statistics can allow for efficient computation in landscape genetic statistical models covering large spatial domains. We apply our approach to propose a retrospective spatial sampling design for greater sage-grouse (Centrocercus urophasianus) population genetics in the western United States.

  19. Mendelian randomization with fine-mapped genetic data: Choosing from large numbers of correlated instrumental variables.

    PubMed

    Burgess, Stephen; Zuber, Verena; Valdes-Marquez, Elsa; Sun, Benjamin B; Hopewell, Jemma C

    2017-12-01

    Mendelian randomization uses genetic variants to make causal inferences about the effect of a risk factor on an outcome. With fine-mapped genetic data, there may be hundreds of genetic variants in a single gene region any of which could be used to assess this causal relationship. However, using too many genetic variants in the analysis can lead to spurious estimates and inflated Type 1 error rates. But if only a few genetic variants are used, then the majority of the data is ignored and estimates are highly sensitive to the particular choice of variants. We propose an approach based on summarized data only (genetic association and correlation estimates) that uses principal components analysis to form instruments. This approach has desirable theoretical properties: it takes the totality of data into account and does not suffer from numerical instabilities. It also has good properties in simulation studies: it is not particularly sensitive to varying the genetic variants included in the analysis or the genetic correlation matrix, and it does not have greatly inflated Type 1 error rates. Overall, the method gives estimates that are less precise than those from variable selection approaches (such as using a conditional analysis or pruning approach to select variants), but are more robust to seemingly arbitrary choices in the variable selection step. Methods are illustrated by an example using genetic associations with testosterone for 320 genetic variants to assess the effect of sex hormone related pathways on coronary artery disease risk, in which variable selection approaches give inconsistent inferences. © 2017 The Authors Genetic Epidemiology Published by Wiley Periodicals, Inc.
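
    A deliberately simplified sketch of the summarized-data idea: take principal components of the variant correlation matrix, project the per-variant association estimates onto the leading components, and combine them into a single causal estimate. The published method weights the decomposition and propagates the correlated uncertainties properly; the unweighted version below, with simulated inputs, is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical summarized data for 50 correlated variants in one gene region:
# per-variant associations with the risk factor (bx) and the outcome (by)
m = 50
rho = 0.8 ** np.abs(np.subtract.outer(np.arange(m), np.arange(m)))  # correlation
L = np.linalg.cholesky(rho)
true_effect = 0.4
bx = (L @ rng.normal(0, 1, m)) * 0.05 + 0.1
by = true_effect * bx + rng.normal(0, 0.01, m)

# Principal components of the correlation matrix serve as derived instruments
eigval, eigvec = np.linalg.eigh(rho)
order = np.argsort(eigval)[::-1]
k = np.searchsorted(np.cumsum(eigval[order]) / eigval.sum(), 0.99) + 1
W = eigvec[:, order[:k]]            # components explaining ~99% of the variance

# Project the association estimates onto the components and combine them
bx_pc, by_pc = W.T @ bx, W.T @ by
estimate = np.sum(bx_pc * by_pc) / np.sum(bx_pc ** 2)   # unweighted IVW-type slope
print(f"{k} components retained, causal effect estimate ~ {estimate:.3f}")
```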

  20. Comparison of NMR simulations of porous media derived from analytical and voxelized representations.

    PubMed

    Jin, Guodong; Torres-Verdín, Carlos; Toumelin, Emmanuel

    2009-10-01

    We develop and compare two formulations of the random-walk method, grain-based and voxel-based, to simulate the nuclear-magnetic-resonance (NMR) response of fluids contained in various models of porous media. The grain-based approach uses a spherical grain pack as input, where the solid surface is analytically defined without an approximation. In the voxel-based approach, the input is a computer-tomography or computer-generated image of reconstructed porous media. Implementation of the two approaches is largely the same, except for the representation of porous media. For comparison, both approaches are applied to various analytical and digitized models of porous media: isolated spherical pore, simple cubic packing of spheres, and random packings of monodisperse and polydisperse spheres. We find that spin magnetization decays much faster in the digitized models than in their analytical counterparts. The difference in decay rate relates to the overestimation of surface area due to the discretization of the sample; it cannot be eliminated even if the voxel size decreases. However, once considering the effect of surface-area increase in the simulation of surface relaxation, good quantitative agreement is found between the two approaches. Different grain or pore shapes entail different rates of increase of surface area, whereupon we emphasize that the value of the "surface-area-corrected" coefficient may not be universal. Using an example of X-ray-CT image of Fontainebleau rock sample, we show that voxel size has a significant effect on the calculated surface area and, therefore, on the numerically simulated magnetization response.
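
    A hedged sketch of the random-walk idea in the simplest analytical geometry, a single spherical pore, with surface relaxation implemented as a killing probability at wall encounters. The killing-probability formula and all parameter values are assumptions for illustration, not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(5)

# Random-walk NMR magnetization decay in a spherical pore of radius R.
R = 10e-6            # pore radius [m]
D = 2.5e-9           # water diffusivity [m^2/s]
rho_s = 20e-6        # surface relaxivity [m/s]
dt = 1e-5            # time step [s]
step = np.sqrt(6 * D * dt)                       # fixed step length
kill_prob = 2 * rho_s * step / (3 * D)           # killing probability (assumed form)

n_walkers, n_steps = 20000, 2000
pos = (rng.random((n_walkers, 3)) - 0.5) * (2 * R / np.sqrt(3))  # start inside pore
alive = np.ones(n_walkers, dtype=bool)
decay = []

for _ in range(n_steps):
    move = rng.normal(0, 1, (n_walkers, 3))
    move *= step / np.linalg.norm(move, axis=1, keepdims=True)
    trial = pos + move
    hit = np.linalg.norm(trial, axis=1) > R
    # Walkers hitting the wall relax with probability kill_prob, else stay put
    killed = hit & (rng.random(n_walkers) < kill_prob) & alive
    alive &= ~killed
    pos = np.where(hit[:, None], pos, trial)
    decay.append(alive.mean())

print("M(t)/M(0) after", n_steps * dt, "s:", decay[-1])
```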
