Meta-analysis in evidence-based healthcare: a paradigm shift away from random effects is overdue.
Doi, Suhail A R; Furuya-Kanamori, Luis; Thalib, Lukman; Barendregt, Jan J
2017-12-01
Each year up to 20 000 systematic reviews and meta-analyses are published whose results influence healthcare decisions, thus making the robustness and reliability of meta-analytic methods one of the world's top clinical and public health priorities. The evidence synthesis makes use of either fixed-effect or random-effects statistical methods. The fixed-effect method has largely been replaced by the random-effects method as heterogeneity of study effects led to poor error estimation. However, despite the widespread use and acceptance of the random-effects method to correct this, it too remains unsatisfactory and continues to suffer from defective error estimation, posing a serious threat to decision-making in evidence-based clinical and public health practice. We discuss here the problem with the random-effects approach and demonstrate that there exist better estimators under the fixed-effect model framework that can achieve optimal error estimation. We argue for an urgent return to the earlier framework with updates that address these problems and conclude that doing so can markedly improve the reliability of meta-analytical findings and thus decision-making in healthcare.
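The contrast this abstract draws between the fixed-effect (inverse-variance) and random-effects estimators can be made concrete with a short sketch. The DerSimonian-Laird estimator shown here is the most common random-effects implementation; the study data are invented for illustration only.

```python
import numpy as np

def fixed_effect(y, v):
    """Inverse-variance (common-effect) pooled estimate and its standard error."""
    w = 1.0 / v
    mu = np.sum(w * y) / np.sum(w)
    return mu, np.sqrt(1.0 / np.sum(w))

def dersimonian_laird(y, v):
    """Random-effects pooled estimate using the DerSimonian-Laird tau^2."""
    w = 1.0 / v
    mu_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - mu_fe) ** 2)                 # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)          # method-of-moments heterogeneity
    w_re = 1.0 / (v + tau2)
    mu_re = np.sum(w_re * y) / np.sum(w_re)
    return mu_re, np.sqrt(1.0 / np.sum(w_re)), tau2

# toy data: study effect estimates (e.g. log odds ratios) and their variances
y = np.array([-0.35, -0.10, -0.55, 0.05, -0.20])
v = np.array([0.04, 0.09, 0.05, 0.12, 0.06])
print(fixed_effect(y, v))
print(dersimonian_laird(y, v))
```

When tau^2 > 0 the random-effects weights become more nearly equal across studies and the pooled standard error grows, which is exactly the error-estimation behaviour the abstract debates.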
Interpreting findings from Mendelian randomization using the MR-Egger method.
Burgess, Stephen; Thompson, Simon G
2017-05-01
Mendelian randomization-Egger (MR-Egger) is an analysis method for Mendelian randomization using summarized genetic data. MR-Egger consists of three parts: (1) a test for directional pleiotropy, (2) a test for a causal effect, and (3) an estimate of the causal effect. While conventional analysis methods for Mendelian randomization assume that all genetic variants satisfy the instrumental variable assumptions, the MR-Egger method is able to assess whether genetic variants have pleiotropic effects on the outcome that differ on average from zero (directional pleiotropy), as well as to provide a consistent estimate of the causal effect, under a weaker assumption-the InSIDE (INstrument Strength Independent of Direct Effect) assumption. In this paper, we provide a critical assessment of the MR-Egger method with regard to its implementation and interpretation. While the MR-Egger method is a worthwhile sensitivity analysis for detecting violations of the instrumental variable assumptions, there are several reasons why causal estimates from the MR-Egger method may be biased and have inflated Type 1 error rates in practice, including violations of the InSIDE assumption and the influence of outlying variants. The issues raised in this paper have potentially serious consequences for causal inferences from the MR-Egger approach. We give examples of scenarios in which the estimates from conventional Mendelian randomization methods and MR-Egger differ, and discuss how to interpret findings in such cases.
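A minimal sketch of the MR-Egger regression described above, using made-up summarized genetic data: variants are oriented so the exposure associations are positive, and a weighted regression of the outcome associations on the exposure associations yields an intercept (the directional-pleiotropy test) and a slope (the causal-effect estimate).

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical summarized data: per-variant associations with the exposure (bx)
# and outcome (by), and the outcome standard errors (by_se).
bx    = np.array([0.12, 0.08, 0.15, 0.05, 0.10, 0.07])
by    = np.array([0.05, 0.02, 0.07, 0.01, 0.04, 0.03])
by_se = np.array([0.02, 0.015, 0.025, 0.01, 0.02, 0.015])

# Orient all variants so the exposure association is positive.
sign = np.sign(bx)
bx, by = bx * sign, by * sign

# Weighted regression of by on bx with an intercept, weights 1/by_se^2.
X = sm.add_constant(bx)
fit = sm.WLS(by, X, weights=1.0 / by_se**2).fit()

print("intercept (directional pleiotropy test):", fit.params[0], fit.pvalues[0])
print("slope (causal effect estimate):         ", fit.params[1], fit.pvalues[1])
```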
2014-01-01
Background Meta-regression is becoming increasingly used to model study level covariate effects. However this type of statistical analysis presents many difficulties and challenges. Here two methods for calculating confidence intervals for the magnitude of the residual between-study variance in random effects meta-regression models are developed. A further suggestion for calculating credible intervals using informative prior distributions for the residual between-study variance is presented. Methods Two recently proposed and, under the assumptions of the random effects model, exact methods for constructing confidence intervals for the between-study variance in random effects meta-analyses are extended to the meta-regression setting. The use of Generalised Cochran heterogeneity statistics is extended to the meta-regression setting and a Newton-Raphson procedure is developed to implement the Q profile method for meta-analysis and meta-regression. WinBUGS is used to implement informative priors for the residual between-study variance in the context of Bayesian meta-regressions. Results Results are obtained for two contrasting examples, where the first example involves a binary covariate and the second involves a continuous covariate. Intervals for the residual between-study variance are wide for both examples. Conclusions Statistical methods, and R computer software, are available to compute exact confidence intervals for the residual between-study variance under the random effects model for meta-regression. These frequentist methods are almost as easily implemented as their established counterparts for meta-analysis. Bayesian meta-regressions are also easily performed by analysts who are comfortable using WinBUGS. Estimates of the residual between-study variance in random effects meta-regressions should be routinely reported and accompanied by some measure of their uncertainty. Confidence and/or credible intervals are well-suited to this purpose. PMID:25196829
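A sketch of the Q-profile idea this abstract extends, written here for the simplest intercept-only random-effects model with invented data. For meta-regression the weighted mean is replaced by a weighted least-squares fit and the chi-squared degrees of freedom become the number of studies minus the number of regression coefficients.

```python
import numpy as np
from scipy import stats, optimize

def generalized_q(tau2, y, v):
    """Q statistic evaluated at a candidate tau^2 (intercept-only random-effects model)."""
    w = 1.0 / (v + tau2)
    mu = np.sum(w * y) / np.sum(w)
    return np.sum(w * (y - mu) ** 2)

def q_profile_ci(y, v, level=0.95, tau2_max=1e4):
    """Q-profile confidence interval for the between-study variance tau^2.
    df = k - 1 here; with meta-regression covariates the df would be k minus
    the number of regression coefficients."""
    k = len(y)
    q_hi = stats.chi2.ppf(1 - (1 - level) / 2, df=k - 1)
    q_lo = stats.chi2.ppf((1 - level) / 2, df=k - 1)
    lower = 0.0 if generalized_q(0.0, y, v) < q_hi else \
        optimize.brentq(lambda t: generalized_q(t, y, v) - q_hi, 0.0, tau2_max)
    upper = 0.0 if generalized_q(0.0, y, v) < q_lo else \
        optimize.brentq(lambda t: generalized_q(t, y, v) - q_lo, 0.0, tau2_max)
    return lower, upper

# invented effect sizes and within-study variances
y = np.array([0.40, 0.10, 0.70, 0.30, -0.10, 0.50])
v = np.array([0.04, 0.06, 0.05, 0.03, 0.08, 0.05])
print(q_profile_ci(y, v))
```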
RAYNOR, HOLLIE A.; OSTERHOLT, KATHRIN M.; HART, CHANTELLE N.; JELALIAN, ELISSA; VIVIER, PATRICK; WING, RENA R.
2016-01-01
Objective Evaluate enrollment numbers, randomization rates, costs, and cost-effectiveness of active versus passive recruitment methods for parent-child dyads into two pediatric obesity intervention trials. Methods Recruitment methods were categorized into active (pediatrician referral and targeted mailings, with participants identified by researcher/health care provider) versus passive methods (newspaper, bus, internet, television, and earning statements; fairs/community centers/schools; and word of mouth; with participants self-identified). Numbers of enrolled and randomized families and costs/recruitment method were monitored throughout the 22-month recruitment period. Costs (in USD) per recruitment method included staff time, mileage, and targeted costs of each method. Results A total of 940 families were referred or made contact, with 164 families randomized (child: 7.2±1.6 years, 2.27±0.61 standardized body mass index [zBMI], 86.6% obese, 61.7% female, 83.5% white; parent: 38.0±5.8 years, 32.9±8.4 BMI, 55.2% obese, 92.7% female, 89.6% white). Pediatrician referral, followed by targeted mailings, produced the largest number of enrolled and randomized families (both methods combined producing 87.2% of randomized families). Passive recruitment methods yielded better retention from enrollment to randomization (p < 0.05), but produced few families (21 in total). Approximately $91 000 was spent on recruitment, with cost per randomized family at $554.77. Pediatrician referral was the most cost-effective method, $145.95/randomized family, but yielded only 91 randomized families over 22 months of continuous recruitment. Conclusion Pediatrician referral and targeted mailings, which are active recruitment methods, were the most successful strategies. However, recruitment demanded significant resources. Successful recruitment for pediatric trials should use several strategies. Clinical Trials Registration: NCT00259324, NCT00200265 PMID:19922036
Raynor, Hollie A; Osterholt, Kathrin M; Hart, Chantelle N; Jelalian, Elissa; Vivier, Patrick; Wing, Rena R
2009-01-01
Evaluate enrollment numbers, randomization rates, costs, and cost-effectiveness of active versus passive recruitment methods for parent-child dyads into two pediatric obesity intervention trials. Recruitment methods were categorized into active (pediatrician referral and targeted mailings, with participants identified by researcher/health care provider) versus passive methods (newspaper, bus, internet, television, and earning statements; fairs/community centers/schools; and word of mouth; with participants self-identified). Numbers of enrolled and randomized families and costs/recruitment method were monitored throughout the 22-month recruitment period. Costs (in USD) per recruitment method included staff time, mileage, and targeted costs of each method. A total of 940 families were referred or made contact, with 164 families randomized (child: 7.2±1.6 years, 2.27±0.61 standardized body mass index [zBMI], 86.6% obese, 61.7% female, 83.5% Caucasian; parent: 38.0±5.8 years, 32.9±8.4 BMI, 55.2% obese, 92.7% female, 89.6% Caucasian). Pediatrician referral, followed by targeted mailings, produced the largest number of enrolled and randomized families (both methods combined producing 87.2% of randomized families). Passive recruitment methods yielded better retention from enrollment to randomization (p<0.05), but produced few families (21 in total). Approximately $91,000 was spent on recruitment, with cost per randomized family at $554.77. Pediatrician referral was the most cost-effective method, $145.95/randomized family, but yielded only 91 randomized families over 22 months of continuous recruitment. Pediatrician referral and targeted mailings, which are active recruitment methods, were the most successful strategies. However, recruitment demanded significant resources. Successful recruitment for pediatric trials should use several strategies. NCT00259324, NCT00200265.
Kim, Yoonsang; Choi, Young-Ku; Emery, Sherry
2013-08-01
Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods' performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages-SAS GLIMMIX Laplace and SuperMix Gaussian quadrature-perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes.
Wang, Wei; Griswold, Michael E
2016-11-30
The random effect Tobit model is a regression model that accommodates both left- and/or right-censoring and within-cluster dependence of the outcome variable. Regression coefficients of random effect Tobit models have conditional interpretations on a constructed latent dependent variable and do not provide inference of overall exposure effects on the original outcome scale. Marginalized random effects model (MREM) permits likelihood-based estimation of marginal mean parameters for the clustered data. For random effect Tobit models, we extend the MREM to marginalize over both the random effects and the normal space and boundary components of the censored response to estimate overall exposure effects at population level. We also extend the 'Average Predicted Value' method to estimate the model-predicted marginal means for each person under different exposure status in a designated reference group by integrating over the random effects and then use the calculated difference to assess the overall exposure effect. The maximum likelihood estimation is proposed utilizing a quasi-Newton optimization algorithm with Gauss-Hermite quadrature to approximate the integration of the random effects. We use these methods to carefully analyze two real datasets. Copyright © 2016 John Wiley & Sons, Ltd.
Seismic random noise attenuation method based on empirical mode decomposition of Hausdorff dimension
NASA Astrophysics Data System (ADS)
Yan, Z.; Luan, X.
2017-12-01
Introduction Empirical mode decomposition (EMD) is a noise suppression algorithm based on wave-field separation, which exploits the scale differences between effective signal and noise. However, because the complexity of the real seismic wave field results in serious mode aliasing, denoising with this method alone is neither ideal nor effective. Building on the multi-scale decomposition characteristics of the EMD algorithm and combining it with Hausdorff dimension constraints, we propose a new method for seismic random noise attenuation. First, we apply the EMD algorithm to adaptively decompose the seismic data and obtain a series of intrinsic mode functions (IMFs) with different scales. Based on the difference in Hausdorff dimension between effective signals and random noise, we identify the IMF components mixed with random noise. We then use a threshold correlation filtering process to separate the valid signal and the random noise effectively. Compared with the traditional EMD method, the results show that the new method of seismic random noise attenuation has a better suppression effect. The implementation process The EMD algorithm is used to decompose seismic signals into IMF sets and analyze their spectra. Since most of the random noise is high-frequency noise, the IMF sets can be divided into three categories: the first category is the effective wave composition at the larger scales; the second category is the noise part at the smaller scales; the third category is the IMF components containing random noise. Then, the third kind of IMF component is processed by the Hausdorff dimension algorithm, and an appropriate time window size, initial step and increment are selected to calculate the Hausdorff instantaneous dimension of each component. The dimension of the random noise is between 1.0 and 1.05, while the dimension of the effective wave is between 1.05 and 2.0. On the basis of the previous steps, according to the dimension difference between the random noise and the effective signal, we extract the sample points whose fractal dimension value is less than or equal to 1.05 for each IMF component, to separate the residual noise. Using the IMF components after dimension filtering and the effective-wave IMF components from the first selection for reconstruction, we obtain the de-noised result.
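A rough sketch of the decompose-screen-reconstruct workflow described above. It assumes the third-party PyEMD package (pip install EMD-signal) for the decomposition and uses a Higuchi fractal-dimension estimate as a stand-in for the Hausdorff dimension; with the Higuchi estimator broadband noise tends toward a dimension near 2, so the screening direction and cutoff below are illustrative and do not reproduce the 1.0-1.05 thresholds reported in the abstract.

```python
import numpy as np
from PyEMD import EMD   # assumed third-party dependency: pip install EMD-signal

def higuchi_fd(x, kmax=8):
    """Higuchi fractal-dimension estimate: ~1 for smooth curves, ~2 for white noise."""
    n = len(x)
    log_len, log_inv_k = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            n_m = (n - m - 1) // k
            if n_m < 1:
                continue
            idx = m + np.arange(n_m + 1) * k
            lengths.append(np.sum(np.abs(np.diff(x[idx]))) * (n - 1) / (n_m * k * k))
        log_len.append(np.log(np.mean(lengths)))
        log_inv_k.append(np.log(1.0 / k))
    return np.polyfit(log_inv_k, log_len, 1)[0]       # slope = dimension estimate

# synthetic trace: a decaying low-frequency arrival plus broadband random noise
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 1000)
trace = np.sin(2 * np.pi * 8 * t) * np.exp(-3 * t) + 0.4 * rng.standard_normal(t.size)

imfs = EMD().emd(trace)                               # adaptive decomposition into IMFs
dims = np.array([higuchi_fd(imf) for imf in imfs])    # screen each IMF by its dimension
keep = dims < 1.6                                     # illustrative cutoff (higher = noisier here)
denoised = imfs[keep].sum(axis=0)                     # reconstruct from the retained IMFs
```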
Anglewicz, Philip; Gourvenec, Diana; Halldorsdottir, Iris; O'Kane, Cate; Koketso, Obakeng; Gorgens, Marelize; Kasper, Toby
2013-02-01
Since self-reports of sensitive behaviors play an important role in HIV/AIDS research, the accuracy of these measures has often been examined. In this paper we (1) examine the effect of three survey interview methods on self-reported sexual behavior and perceptions of community sexual norms in Botswana, and (2) introduce an interview method to research on self-reported sexual behavior in sub-Saharan Africa. Comparing across these three survey methods (face-to-face, ballot box, and randomized response), we find that ballot box and randomized response surveys both provide higher reports of sensitive behaviors; the results for randomized response are particularly strong. Within these overall patterns, however, there is variation by question type; additionally the effect of interview method differs by sex. We also examine interviewer effects to gain insight into the effectiveness of these interview methods, and our results suggest that caution be used when interpreting the differences between survey methods.
Kim, Yoonsang; Emery, Sherry
2013-01-01
Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods’ performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages—SAS GLIMMIX Laplace and SuperMix Gaussian quadrature—perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes. PMID:24288415
A Structural Modeling Approach to a Multilevel Random Coefficients Model.
ERIC Educational Resources Information Center
Rovine, Michael J.; Molenaar, Peter C. M.
2000-01-01
Presents a method for estimating the random coefficients model using covariance structure modeling and allowing one to estimate both fixed and random effects. The method is applied to real and simulated data, including marriage data from J. Belsky and M. Rovine (1990). (SLD)
A simple method for assessing occupational exposure via the one-way random effects model.
Krishnamoorthy, K; Mathew, Thomas; Peng, Jie
2016-11-01
A one-way random effects model is postulated for the log-transformed shift-long personal exposure measurements, where the random effect in the model represents an effect due to the worker. Simple closed-form confidence intervals are proposed for the relevant parameters of interest using the method of variance estimates recovery (MOVER). The performance of the confidence bounds is evaluated and compared with those based on the generalized confidence interval approach. Comparison studies indicate that the proposed MOVER confidence bounds are better than the generalized confidence bounds for the overall mean exposure and an upper percentile of the exposure distribution. The proposed methods are illustrated using a few examples involving industrial hygiene data.
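The MOVER recombination referred to above can be sketched generically: given point estimates and separate confidence limits for the components of a sum, the limits are recombined as below. The component values and limits here are invented; the paper derives the component intervals for the one-way random-effects model itself (e.g. for the log-scale mean exposure mu + 0.5*(sigma_b^2 + sigma_e^2)).

```python
import numpy as np

def mover_sum(estimates, lowers, uppers):
    """MOVER confidence limits for a sum of parameters, given point estimates and
    individual confidence limits for each component."""
    est = np.asarray(estimates, float)
    lo = np.asarray(lowers, float)
    up = np.asarray(uppers, float)
    total = est.sum()
    L = total - np.sqrt(np.sum((est - lo) ** 2))
    U = total + np.sqrt(np.sum((up - est) ** 2))
    return L, U

# hypothetical components of mu + 0.5*sigma_b^2 + 0.5*sigma_e^2 (log scale),
# with made-up point estimates and 95% confidence limits
est = [2.10, 0.5 * 0.30, 0.5 * 0.12]
lo  = [1.85, 0.5 * 0.15, 0.5 * 0.08]
up  = [2.35, 0.5 * 0.70, 0.5 * 0.19]
L, U = mover_sum(est, lo, up)
print("95% CI for the mean exposure (original scale):", np.exp(L), np.exp(U))
```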
Methods for sample size determination in cluster randomized trials
Rutterford, Clare; Copas, Andrew; Eldridge, Sandra
2015-01-01
Background: The use of cluster randomized trials (CRTs) is increasing, along with the variety in their design and analysis. The simplest approach for their sample size calculation is to calculate the sample size assuming individual randomization and inflate this by a design effect to account for randomization by cluster. The assumptions of a simple design effect may not always be met; alternative or more complicated approaches are required. Methods: We summarise a wide range of sample size methods available for cluster randomized trials. For those familiar with sample size calculations for individually randomized trials but with less experience in the clustered case, this manuscript provides formulae for a wide range of scenarios with associated explanation and recommendations. For those with more experience, comprehensive summaries are provided that allow quick identification of methods for a given design, outcome and analysis method. Results: We present first those methods applicable to the simplest two-arm, parallel group, completely randomized design followed by methods that incorporate deviations from this design such as: variability in cluster sizes; attrition; non-compliance; or the inclusion of baseline covariates or repeated measures. The paper concludes with methods for alternative designs. Conclusions: There is a large amount of methodology available for sample size calculations in CRTs. This paper gives the most comprehensive description of published methodology for sample size calculation and provides an important resource for those designing these trials. PMID:26174515
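The simplest approach mentioned in the Background can be sketched directly: compute the individually randomized sample size and inflate it by the design effect 1 + (m - 1) * ICC for clusters of average size m. All numbers below are illustrative.

```python
import math
from scipy import stats

def n_per_arm_individual(p1, p2, alpha=0.05, power=0.8):
    """Per-arm sample size for comparing two proportions (normal approximation)."""
    za = stats.norm.ppf(1 - alpha / 2)
    zb = stats.norm.ppf(power)
    pbar = (p1 + p2) / 2
    return (za * math.sqrt(2 * pbar * (1 - pbar)) +
            zb * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2 / (p1 - p2) ** 2

# illustrative inputs: control and intervention proportions, cluster size, ICC
p1, p2, m, icc = 0.20, 0.30, 50, 0.02
n_ind = n_per_arm_individual(p1, p2)
deff = 1 + (m - 1) * icc                      # design effect for equal cluster sizes
n_clustered = n_ind * deff
clusters_per_arm = math.ceil(n_clustered / m)
print(round(n_ind), deff, round(n_clustered), clusters_per_arm)
```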
Su, Xiaogang; Peña, Annette T; Liu, Lei; Levine, Richard A
2018-04-29
Assessing heterogeneous treatment effects is a growing interest in advancing precision medicine. Individualized treatment effects (ITEs) play a critical role in such an endeavor. Concerning experimental data collected from randomized trials, we put forward a method, termed random forests of interaction trees (RFIT), for estimating ITE on the basis of interaction trees. To this end, we propose a smooth sigmoid surrogate method, as an alternative to greedy search, to speed up tree construction. The RFIT outperforms the "separate regression" approach in estimating ITE. Furthermore, standard errors for the estimated ITE via RFIT are obtained with the infinitesimal jackknife method. We assess and illustrate the use of RFIT via both simulation and the analysis of data from an acupuncture headache trial. Copyright © 2018 John Wiley & Sons, Ltd.
Meta-analysis of two studies in the presence of heterogeneity with applications in rare diseases.
Friede, Tim; Röver, Christian; Wandel, Simon; Neuenschwander, Beat
2017-07-01
Random-effects meta-analyses are used to combine evidence of treatment effects from multiple studies. Since treatment effects may vary across trials due to differences in study characteristics, heterogeneity in treatment effects between studies must be accounted for to achieve valid inference. The standard model for random-effects meta-analysis assumes approximately normal effect estimates and a normal random-effects model. However, standard methods based on this model ignore the uncertainty in estimating the between-trial heterogeneity. In the special setting of only two studies and in the presence of heterogeneity, we investigate here alternatives such as the Hartung-Knapp-Sidik-Jonkman method (HKSJ), the modified Knapp-Hartung method (mKH, a variation of the HKSJ method) and Bayesian random-effects meta-analyses with priors covering plausible heterogeneity values; R code to reproduce the examples is presented in an appendix. The properties of these methods are assessed by applying them to five examples from various rare diseases and by a simulation study. Whereas the standard method based on normal quantiles has poor coverage, the HKSJ and mKH generally lead to very long, and therefore inconclusive, confidence intervals. The Bayesian intervals on the whole show satisfying properties and offer a reasonable compromise between these two extremes. © 2016 The Authors. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
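A compact sketch of the Hartung-Knapp-Sidik-Jonkman interval, together with the ad hoc truncation used by the modified Knapp-Hartung variant, for a two-study meta-analysis with invented data and an assumed tau^2. With only two studies the t quantile has one degree of freedom, which is why these intervals can become very long, as the abstract notes.

```python
import numpy as np
from scipy import stats

def hksj(y, v, tau2, level=0.95, modified=False):
    """HKSJ interval for the pooled random-effects mean, given a tau^2 estimate
    (e.g. DerSimonian-Laird); modified=True applies the ad hoc truncation
    q* = max(1, q) of the modified Knapp-Hartung variant."""
    k = len(y)
    w = 1.0 / (v + tau2)
    mu = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - mu) ** 2) / (k - 1)
    if modified:
        q = max(1.0, q)
    se = np.sqrt(q / np.sum(w))
    half = stats.t.ppf(1 - (1 - level) / 2, df=k - 1) * se
    return mu, mu - half, mu + half

# two hypothetical trials (log hazard ratios and variances); tau^2 assumed for illustration
y = np.array([-0.45, -0.05])
v = np.array([0.05, 0.07])
tau2 = 0.03
print(hksj(y, v, tau2))
print(hksj(y, v, tau2, modified=True))
```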
Estimation of the Nonlinear Random Coefficient Model when Some Random Effects Are Separable
ERIC Educational Resources Information Center
du Toit, Stephen H. C.; Cudeck, Robert
2009-01-01
A method is presented for marginal maximum likelihood estimation of the nonlinear random coefficient model when the response function has some linear parameters. This is done by writing the marginal distribution of the repeated measures as a conditional distribution of the response given the nonlinear random effects. The resulting distribution…
ERIC Educational Resources Information Center
Maynard, Brandy R.; Kjellstrand, Elizabeth K.; Thompson, Aaron M.
2014-01-01
Objectives: This study examined the effects of Check & Connect (C&C) on the attendance, behavior, and academic outcomes of at-risk youth in a field-based effectiveness trial. Method: A multisite randomized block design was used, wherein 260 primarily Hispanic (89%) and economically disadvantaged (74%) students were randomized to treatment…
ERIC Educational Resources Information Center
Bawaneh, Ali Khalid Ali; Nurulazam Md Zain, Ahmad; Salmiza, Saleh
2011-01-01
The purpose of this study was to investigate the effect of Herrmann Whole Brain Teaching Method over conventional teaching method on eight graders in their understanding of simple electric circuits in Jordan. Participants (N = 273 students; M = 139, F = 134) were randomly selected from Bani Kenanah region-North of Jordan and randomly assigned to…
ERIC Educational Resources Information Center
Larwin, Karen H.; Larwin, David A.
2011-01-01
Bootstrapping methods and random distribution methods are increasingly recommended as better approaches for teaching students about statistical inference in introductory-level statistics courses. The authors examined the effect of teaching undergraduate business statistics students using random distribution and bootstrapping simulations. It is the…
Hubacher, David; Spector, Hannah; Monteith, Charles; Chen, Pai-Lien; Hart, Catherine
2017-02-01
Measures of contraceptive effectiveness combine technology and user-related factors. Observational studies show higher effectiveness of long-acting reversible contraception compared with short-acting reversible contraception. Women who choose long-acting reversible contraception may differ in key ways from women who choose short-acting reversible contraception, and it may be these differences that are responsible for the high effectiveness of long-acting reversible contraception. Wider use of long-acting reversible contraception is recommended, but scientific evidence of acceptability and successful use is lacking in a population that typically opts for short-acting methods. The objective of the study was to reduce bias in measuring contraceptive effectiveness and better isolate the independent role that long-acting reversible contraception has in preventing unintended pregnancy relative to short-acting reversible contraception. We conducted a partially randomized patient preference trial and recruited women aged 18-29 years who were seeking a short-acting method (pills or injectable). Participants who agreed to randomization were assigned to 1 of 2 categories: long-acting reversible contraception or short-acting reversible contraception. Women who declined randomization but agreed to follow-up in the observational cohort chose their preferred method. Under randomization, participants chose a specific method in the category and received it for free, whereas participants in the preference cohort paid for the contraception in their usual fashion. Participants were followed up prospectively to measure primary outcomes of method continuation and unintended pregnancy at 12 months. Kaplan-Meier techniques were used to estimate method continuation probabilities. Intent-to-treat principles were applied after method initiation for comparing incidence of unintended pregnancy. We also measured acceptability in terms of level of happiness with the products. Of the 916 participants, 43% chose randomization and 57% chose the preference option. Complete loss to follow-up at 12 months was <2%. The 12-month method continuation probabilities were 63.3% (95% confidence interval, 58.9-67.3) (preference short-acting reversible contraception), 53.0% (95% confidence interval, 45.7-59.8) (randomized short-acting reversible contraception), and 77.8% (95% confidence interval, 71.0-83.2) (randomized long-acting reversible contraception) (P < .001 in the primary comparison involving randomized groups). The 12-month cumulative unintended pregnancy probabilities were 6.4% (95% confidence interval, 4.1-8.7) (preference short-acting reversible contraception), 7.7% (95% confidence interval, 3.3-12.1) (randomized short-acting reversible contraception), and 0.7% (95% confidence interval, 0.0-4.7) (randomized long-acting reversible contraception) (P = .01 when comparing randomized groups). In the secondary comparisons involving only short-acting reversible contraception users, the continuation probability was higher in the preference group compared with the randomized group (P = .04). However, the short-acting reversible contraception randomized group and short-acting reversible contraception preference group had statistically equivalent rates of unintended pregnancy (P = .77). Seventy-eight percent of randomized long-acting reversible contraception users were happy/neutral with their initial method, compared with 89% of randomized short-acting reversible contraception users (P < .05). 
However, among method continuers at 12 months, all groups were equally happy/neutral (>90%). Even in a typical population of women who presented to initiate or continue short-acting reversible contraception, long-acting reversible contraception proved highly acceptable. One year after initiation, women randomized to long-acting reversible contraception had high continuation rates and consequently experienced superior protection from unintended pregnancy compared with women using short-acting reversible contraception; these findings are attributable to the initial technology and not underlying factors that often bias observational estimates of effectiveness. The similarly patterned experiences of the 2 short-acting reversible contraception cohorts provide a bridge of generalizability between the randomized group and usual-care preference group. Benefits of increased voluntary uptake of long-acting reversible contraception may extend to wider populations than previously thought. Copyright © 2016 Elsevier Inc. All rights reserved.
Selby-Harrington, M; Sorenson, J R; Quade, D; Stearns, S C; Tesh, A S; Donat, P L
1995-01-01
OBJECTIVES. A randomized controlled trial was conducted to test the effectiveness and cost effectiveness of three outreach interventions to promote well-child screening for children on Medicaid. METHODS. In rural North Carolina, a random sample of 2053 families with children due or overdue for screening was stratified according to the presence of a home phone. Families were randomly assigned to receive a mailed pamphlet and letter, a phone call, or a home visit outreach intervention, or the usual (control) method of informing at Medicaid intake. RESULTS. All interventions produced more screenings than the control method, but increases were significant only for families with phones. Among families with phones, a home visit was the most effective intervention but a phone call was the most cost-effective. However, absolute rates of effectiveness were low, and incremental costs per effect were high. CONCLUSIONS. Pamphlets, phone calls, and home visits by nurses were minimally effective for increasing well-child screenings. Alternate outreach methods are needed, especially for families without phones. PMID:7573627
[Theory, method and application of method R on estimation of (co)variance components].
Liu, Wen-Zhong
2004-07-01
The theory, method and application of Method R for the estimation of (co)variance components are reviewed so that the method can be used appropriately. Estimation requires R values, which are regressions of the predicted random effects calculated from the complete dataset on the predicted random effects calculated from random subsets of the same data. By using a multivariate iteration algorithm based on a transformation matrix, combined with the preconditioned conjugate gradient method to solve the mixed model equations, the computational efficiency of Method R is much improved. Method R is computationally inexpensive, and the sampling errors and approximate credible intervals of the estimates can be obtained. Disadvantages of Method R include a larger sampling variance than other methods for the same data and biased estimates in small datasets. As an alternative method, Method R can be used for larger datasets. Its theoretical properties should be studied further and its range of application broadened.
Liu, Xian; Engel, Charles C
2012-12-20
Researchers often encounter longitudinal health data characterized with three or more ordinal or nominal categories. Random-effects multinomial logit models are generally applied to account for potential lack of independence inherent in such clustered data. When parameter estimates are used to describe longitudinal processes, however, random effects, both between and within individuals, need to be retransformed for correctly predicting outcome probabilities. This study attempts to go beyond existing work by developing a retransformation method that derives longitudinal growth trajectories of unbiased health probabilities. We estimated variances of the predicted probabilities by using the delta method. Additionally, we transformed the covariates' regression coefficients on the multinomial logit function, not substantively meaningful, to the conditional effects on the predicted probabilities. The empirical illustration uses the longitudinal data from the Asset and Health Dynamics among the Oldest Old. Our analysis compared three sets of the predicted probabilities of three health states at six time points, obtained from, respectively, the retransformation method, the best linear unbiased prediction, and the fixed-effects approach. The results demonstrate that neglect of retransforming random errors in the random-effects multinomial logit model results in severely biased longitudinal trajectories of health probabilities as well as overestimated effects of covariates on the probabilities. Copyright © 2012 John Wiley & Sons, Ltd.
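The core retransformation point, that predicted probabilities must be integrated over the random-effect distribution rather than evaluated at a random effect of zero, can be sketched for a single random intercept and one logit; the paper itself treats the multinomial case and adds delta-method variances. All parameter values below are assumed.

```python
import numpy as np

def inv_logit(eta):
    return 1.0 / (1.0 + np.exp(-eta))

beta0, beta1, sigma_u = -1.0, 0.8, 1.5       # assumed fixed effects and random-intercept SD
x = 1.0                                      # covariate value of interest

# conditional probability at the "typical" subject (random effect set to zero)
p_conditional = inv_logit(beta0 + beta1 * x)

# marginal probability: integrate over u ~ N(0, sigma_u^2) via Gauss-Hermite quadrature
# (probabilists' Hermite rule, weight function exp(-z^2 / 2))
nodes, weights = np.polynomial.hermite_e.hermegauss(40)
p_marginal = np.sum(weights * inv_logit(beta0 + beta1 * x + sigma_u * nodes)) / np.sqrt(2 * np.pi)

print(p_conditional, p_marginal)   # the two differ whenever sigma_u > 0
```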
Application of Poisson random effect models for highway network screening.
Jiang, Ximiao; Abdel-Aty, Mohamed; Alamili, Samer
2014-02-01
In recent years, Bayesian random effect models that account for the temporal and spatial correlations of crash data have become popular in traffic safety research. This study employs random effect Poisson Log-Normal models for crash risk hotspot identification. Both the temporal and spatial correlations of crash data were considered. The Potential for Safety Improvement (PSI) was adopted as a measure of crash risk. Using the fatal and injury crashes that occurred on urban 4-lane divided arterials from 2006 to 2009 in the Central Florida area, the random effect approaches were compared to the traditional Empirical Bayesian (EB) method and the conventional Bayesian Poisson Log-Normal model. A series of method examination tests were conducted to evaluate the performance of the different approaches. These tests include the previously developed site consistency test, method consistency test, total rank difference test, and modified total score test, as well as the newly proposed total safety performance measure difference test. Results show that the Bayesian Poisson model accounting for both temporal and spatial random effects (PTSRE) outperforms the model with only a temporal random effect, and both are superior to the conventional Poisson Log-Normal model (PLN) and the EB model in the fitting of crash data. Additionally, the method evaluation tests indicate that the PTSRE model is significantly superior to the PLN model and the EB model in consistently identifying hotspots during successive time periods. The results suggest that the PTSRE model is a superior alternative for road site crash risk hotspot identification. Copyright © 2013 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Conn, Vicki S.; Hafdahl, Adam R.; Cooper, Pamela S.; Ruppar, Todd M.; Mehr, David R.; Russell, Cynthia L.
2009-01-01
Purpose: This study investigated the effectiveness of interventions to improve medication adherence (MA) in older adults. Design and Methods: Meta-analysis was used to synthesize results of 33 published and unpublished randomized controlled trials. Random-effects models were used to estimate overall mean effect sizes (ESs) for MA, knowledge,…
ERIC Educational Resources Information Center
Stice, Eric; Rohde, Paul; Seeley, John R.; Gau, Jeff M.
2010-01-01
Objective: Evaluate a new 5-step method for testing mediators hypothesized to account for the effects of depression prevention programs. Method: In this indicated prevention trial, at-risk teens with elevated depressive symptoms were randomized to a group cognitive-behavioral (CB) intervention, group supportive expressive intervention, CB…
A Method of Reducing Random Drift in the Combined Signal of an Array of Inertial Sensors
2015-09-30
stability of the collective output, Bayard et al, US Patent 6,882,964. The prior art methods rely upon the use of Kalman filtering and averaging...including scale-factor errors, quantization effects, temperature effects, random drift, and additive noise. A comprehensive account of all of these
Removal of Stationary Sinusoidal Noise from Random Vibration Signals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Brian; Cap, Jerome S.
In random vibration environments, sinusoidal line noise may appear in the vibration signal and can affect analysis of the resulting data. We studied two methods which remove stationary sine tones from random noise: a matrix inversion algorithm and a chirp-z transform algorithm. In addition, we developed new methods to determine the frequency of the tonal noise. The results show that both of the removal methods can eliminate sine tones in prefabricated random vibration data when the sine-to-random ratio is at least 0.25. For smaller ratios down to 0.02 only the matrix inversion technique can remove the tones, but the metrics to evaluate its effectiveness also degrade. We also found that using fast Fourier transforms best identified the tonal noise, and determined that band-pass-filtering the signals prior to the process improved sine removal. When applied to actual vibration test data, the methods were not as effective at removing harmonic tones, which we believe to be a result of mixed-phase sinusoidal noise.
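A minimal sketch of a least-squares ("matrix inversion") sine-tone removal of the kind studied: fit sine/cosine pairs at known tone frequencies and subtract the fitted tones. The tone frequency here is assumed known; the report identifies tone frequencies with FFT-based methods, and the chirp-z variant is not shown.

```python
import numpy as np

def remove_sine_tones(x, fs, freqs):
    """Fit a mean plus sin/cos pairs at the given frequencies by least squares
    and subtract the fitted sinusoidal content from the record."""
    t = np.arange(len(x)) / fs
    cols = [np.ones_like(t)]
    for f in freqs:
        cols.append(np.sin(2 * np.pi * f * t))
        cols.append(np.cos(2 * np.pi * f * t))
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, x, rcond=None)
    tones = A[:, 1:] @ coef[1:]          # reconstructed sinusoidal content only
    return x - tones

# synthetic record: broadband noise plus a 120 Hz tone (illustrative sine-to-random ratio)
fs = 2048.0
rng = np.random.default_rng(1)
x = rng.standard_normal(8192) + 0.5 * np.sin(2 * np.pi * 120.0 * np.arange(8192) / fs)
clean = remove_sine_tones(x, fs, freqs=[120.0])
```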
NASA Astrophysics Data System (ADS)
Pan'kov, A. A.
1997-05-01
The feasibility of using a generalized self-consistent method for predicting the effective elastic properties of composites with random hybrid structures has been examined. Using this method, the problem is reduced to the solution of simpler special averaged problems for composites with single inclusions and corresponding transition layers in the medium examined. The dimensions of the transition layers are defined by the correlation radii of the random structure of the composite, while the heterogeneous elastic properties of the transition layers account for the probabilities of variation in the size and configuration of the inclusions through averaged special indicator functions. Results are given for a numerical calculation of the averaged indicator functions and for an analysis of the effect of micropores in the matrix-fiber interface region on the effective elastic properties of a unidirectional fiberglass-epoxy composite using the generalized self-consistent method, and they are compared with experimental data and previously reported solutions.
ERIC Educational Resources Information Center
Bawaneh, Ali Khalid Ali; Zain, Ahmad Nurulazam Md; Ghazali, Munirah
2010-01-01
The purpose of the present study is to investigate the effectiveness of Conflict Maps and the V-Shape method as teaching methods in bringing about conceptual change in science among primary eighth-grade students in Jordan. A randomly selected sample (N = 63) from the Bani Kenana region North of Jordan was randomly assigned to the two teaching…
Using Non-experimental Data to Estimate Treatment Effects
Stuart, Elizabeth A.; Marcus, Sue M.; Horvitz-Lennon, Marcela V.; Gibbons, Robert D.; Normand, Sharon-Lise T.
2009-01-01
While much psychiatric research is based on randomized controlled trials (RCTs), where patients are randomly assigned to treatments, sometimes RCTs are not feasible. This paper describes propensity score approaches, which are increasingly used for estimating treatment effects in non-experimental settings. The primary goal of propensity score methods is to create sets of treated and comparison subjects who look as similar as possible, in essence replicating a randomized experiment, at least with respect to observed patient characteristics. A study to estimate the metabolic effects of antipsychotic medication in a sample of Florida Medicaid beneficiaries with schizophrenia illustrates methods. PMID:20563313
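A small simulated sketch of the propensity score idea described above, using inverse-probability-of-treatment weighting (one of several propensity score approaches): the unweighted group contrast is confounded, while the weighted contrast recovers something close to the true treatment effect built into the simulation. All data-generating values are invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
x = rng.standard_normal((n, 3))                       # observed patient characteristics
p_treat = 1 / (1 + np.exp(-(0.5 * x[:, 0] - 0.3 * x[:, 1])))
z = rng.binomial(1, p_treat)                          # non-randomized treatment assignment
y = 1.0 * z + x[:, 0] + 0.5 * x[:, 2] + rng.standard_normal(n)   # true effect = 1.0

# 1) estimate propensity scores from observed covariates
ps = LogisticRegression().fit(x, z).predict_proba(x)[:, 1]

# 2) inverse-probability-of-treatment weights
w = np.where(z == 1, 1 / ps, 1 / (1 - ps))

naive = y[z == 1].mean() - y[z == 0].mean()
weighted = (np.average(y[z == 1], weights=w[z == 1]) -
            np.average(y[z == 0], weights=w[z == 0]))
print(naive, weighted)    # the weighted contrast should be closer to the true effect of 1.0
```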
Efficacy of Virtual Patients in Medical Education: A Meta-Analysis of Randomized Studies
ERIC Educational Resources Information Center
Consorti, Fabrizio; Mancuso, Rosaria; Nocioni, Martina; Piccolo, Annalisa
2012-01-01
A meta-analysis was performed to assess the Effect Size (ES) from randomized studies comparing the effect of educational interventions in which Virtual patients (VPs) were used either as an alternative method or additive to usual curriculum versus interventions based on more traditional methods. Meta-analysis was designed, conducted and reported…
A note on variance estimation in random effects meta-regression.
Sidik, Kurex; Jonkman, Jeffrey N
2005-01-01
For random effects meta-regression inference, variance estimation for the parameter estimates is discussed. Because estimated weights are used for meta-regression analysis in practice, the assumed or estimated covariance matrix used in meta-regression is not strictly correct, due to possible errors in estimating the weights. Therefore, this note investigates the use of a robust variance estimation approach for obtaining variances of the parameter estimates in random effects meta-regression inference. This method treats the assumed covariance matrix of the effect measure variables as a working covariance matrix. Using an example of meta-analysis data from clinical trials of a vaccine, the robust variance estimation approach is illustrated in comparison with two other methods of variance estimation. A simulation study is presented, comparing the three methods of variance estimation in terms of bias and coverage probability. We find that, despite the seeming suitability of the robust estimator for random effects meta-regression, the improved variance estimator of Knapp and Hartung (2003) yields the best performance among the three estimators, and thus may provide the best protection against errors in the estimated weights.
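The three variance estimators compared in this note can be sketched for a weighted least-squares meta-regression: the model-based covariance, the robust (sandwich) estimator that treats the weight matrix as a working covariance, and the Knapp-Hartung scaled version. The data and the tau^2 value below are invented.

```python
import numpy as np

def meta_regression_variances(y, v, X, tau2):
    """WLS meta-regression coefficients with model-based, robust (sandwich),
    and Knapp-Hartung coefficient variances."""
    k, p = X.shape
    W = np.diag(1.0 / (v + tau2))
    bread = np.linalg.inv(X.T @ W @ X)
    beta = bread @ X.T @ W @ y
    r = y - X @ beta
    model_var = bread                                   # assumes the weights are exact
    meat = X.T @ W @ np.diag(r ** 2) @ W @ X
    robust_var = bread @ meat @ bread                   # sandwich estimator
    kh_var = bread * (r @ W @ r) / (k - p)              # Knapp-Hartung scaling
    return beta, np.diag(model_var), np.diag(robust_var), np.diag(kh_var)

# toy example: effect sizes, within-study variances, one binary covariate, assumed tau^2
y = np.array([0.2, 0.5, 0.1, 0.6, 0.3, 0.7])
v = np.array([0.04, 0.06, 0.05, 0.03, 0.08, 0.05])
X = np.column_stack([np.ones(6), np.array([0, 1, 0, 1, 0, 1])])
print(meta_regression_variances(y, v, X, tau2=0.02))
```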
Assessing the Generalizability of Randomized Trial Results to Target Populations
Stuart, Elizabeth A.; Bradshaw, Catherine P.; Leaf, Philip J.
2014-01-01
Recent years have seen increasing interest in and attention to evidence-based practices, where the “evidence” generally comes from well-conducted randomized trials. However, while those trials yield accurate estimates of the effect of the intervention for the participants in the trial (known as “internal validity”), they do not always yield relevant information about the effects in a particular target population (known as “external validity”). This may be due to a lack of specification of a target population when designing the trial, difficulties recruiting a sample that is representative of a pre-specified target population, or to interest in considering a target population somewhat different from the population directly targeted by the trial. This paper first provides an overview of existing design and analysis methods for assessing and enhancing the ability of a randomized trial to estimate treatment effects in a target population. It then provides a case study using one particular method, which weights the subjects in a randomized trial to match the population on a set of observed characteristics. The case study uses data from a randomized trial of School-wide Positive Behavioral Interventions and Supports (PBIS); our interest is in generalizing the results to the state of Maryland. In the case of PBIS, after weighting, estimated effects in the target population were similar to those observed in the randomized trial. The paper illustrates that statistical methods can be used to assess and enhance the external validity of randomized trials, making the results more applicable to policy and clinical questions. However, there are also many open research questions; future research should focus on questions of treatment effect heterogeneity and further developing these methods for enhancing external validity. Researchers should think carefully about the external validity of randomized trials and be cautious about extrapolating results to specific populations unless they are confident of the similarity between the trial sample and that target population. PMID:25307417
Assessing the generalizability of randomized trial results to target populations.
Stuart, Elizabeth A; Bradshaw, Catherine P; Leaf, Philip J
2015-04-01
Recent years have seen increasing interest in and attention to evidence-based practices, where the "evidence" generally comes from well-conducted randomized trials. However, while those trials yield accurate estimates of the effect of the intervention for the participants in the trial (known as "internal validity"), they do not always yield relevant information about the effects in a particular target population (known as "external validity"). This may be due to a lack of specification of a target population when designing the trial, difficulties recruiting a sample that is representative of a prespecified target population, or to interest in considering a target population somewhat different from the population directly targeted by the trial. This paper first provides an overview of existing design and analysis methods for assessing and enhancing the ability of a randomized trial to estimate treatment effects in a target population. It then provides a case study using one particular method, which weights the subjects in a randomized trial to match the population on a set of observed characteristics. The case study uses data from a randomized trial of school-wide positive behavioral interventions and supports (PBIS); our interest is in generalizing the results to the state of Maryland. In the case of PBIS, after weighting, estimated effects in the target population were similar to those observed in the randomized trial. The paper illustrates that statistical methods can be used to assess and enhance the external validity of randomized trials, making the results more applicable to policy and clinical questions. However, there are also many open research questions; future research should focus on questions of treatment effect heterogeneity and further developing these methods for enhancing external validity. Researchers should think carefully about the external validity of randomized trials and be cautious about extrapolating results to specific populations unless they are confident of the similarity between the trial sample and that target population.
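A simulated sketch of the weighting method used in the case study: model trial participation against membership in the target population from observed characteristics, weight trial participants by the resulting odds of belonging to the population, and recompute the treatment effect. The data-generating values are invented; in this simulation the trial-sample effect and the population effect genuinely differ, so the weighted estimate moves toward the population value.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# one covariate (e.g. baseline risk) in the target population and in the trial,
# where the trial over-represents high values of x
x_pop   = rng.normal(0.0, 1.0, 5000)
x_trial = rng.normal(0.8, 1.0, 600)
z = rng.binomial(1, 0.5, x_trial.size)                 # randomized treatment within the trial
y = 0.5 + (1.0 - 0.6 * x_trial) * z + rng.standard_normal(x_trial.size)  # effect varies with x

# model trial participation vs. membership in the target population
X = np.concatenate([x_trial, x_pop]).reshape(-1, 1)
s = np.concatenate([np.ones(x_trial.size), np.zeros(x_pop.size)])
ps = LogisticRegression().fit(X, s).predict_proba(x_trial.reshape(-1, 1))[:, 1]
w = (1 - ps) / ps                                      # weight trial subjects toward the population

sate = y[z == 1].mean() - y[z == 0].mean()             # effect in the trial sample
pate = (np.average(y[z == 1], weights=w[z == 1]) -
        np.average(y[z == 0], weights=w[z == 0]))      # weighted, population-targeted estimate
print(sate, pate)
```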
Modeling pattern in collections of parameters
Link, W.A.
1999-01-01
Wildlife management is increasingly guided by analyses of large and complex datasets. The description of such datasets often requires a large number of parameters, among which certain patterns might be discernible. For example, one may consider a long-term study producing estimates of annual survival rates; of interest is the question whether these rates have declined through time. Several statistical methods exist for examining pattern in collections of parameters. Here, I argue for the superiority of 'random effects models' in which parameters are regarded as random variables, with distributions governed by 'hyperparameters' describing the patterns of interest. Unfortunately, implementation of random effects models is sometimes difficult. Ultrastructural models, in which the postulated pattern is built into the parameter structure of the original data analysis, are approximations to random effects models. However, this approximation is not completely satisfactory: failure to account for natural variation among parameters can lead to overstatement of the evidence for pattern among parameters. I describe quasi-likelihood methods that can be used to improve the approximation of random effects models by ultrastructural models.
A Comparison of Single Sample and Bootstrap Methods to Assess Mediation in Cluster Randomized Trials
ERIC Educational Resources Information Center
Pituch, Keenan A.; Stapleton, Laura M.; Kang, Joo Youn
2006-01-01
A Monte Carlo study examined the statistical performance of single sample and bootstrap methods that can be used to test and form confidence interval estimates of indirect effects in two cluster randomized experimental designs. The designs were similar in that they featured random assignment of clusters to one of two treatment conditions and…
ERIC Educational Resources Information Center
Hedeker, Donald; And Others
1996-01-01
Methods are proposed and described for estimating the degree to which relations among variables vary at the individual level. As an example, M. Fishbein and I. Ajzen's theory of reasoned action is examined. This article illustrates the use of empirical Bayes methods based on a random-effects regression model to estimate individual influences…
Krusche, Adele; Rudolf von Rohr, Isabelle; Muse, Kate; Duggan, Danielle; Crane, Catherine; Williams, J. Mark G.
2014-01-01
Background Randomized controlled trials (RCTs) are widely accepted as being the most efficient way of investigating the efficacy of psychological therapies. However, researchers conducting RCTs commonly report difficulties recruiting an adequate sample within planned timescales. In an effort to overcome recruitment difficulties, researchers often are forced to expand their recruitment criteria or extend the recruitment phase, thus increasing costs and delaying publication of results. Research investigating the effectiveness of recruitment strategies is limited and trials often fail to report sufficient details about the recruitment sources and resources utilised. Purpose We examined the efficacy of strategies implemented during the Staying Well after Depression RCT in Oxford to recruit participants with a history of recurrent depression. Methods We describe eight recruitment methods utilised and two further sources not initiated by the research team and examine their efficacy in terms of (i) the return, including the number of potential participants who contacted the trial and the number who were randomized into the trial, (ii) cost-effectiveness, comprising direct financial cost and manpower for initial contacts and randomized participants, and (iii) comparison of sociodemographic characteristics of individuals recruited from different sources. Results Poster advertising, web-based advertising and mental health worker referrals were the cheapest methods per randomized participant; however, the ratio of randomized participants to initial contacts differed markedly per source. Advertising online, via posters and on a local radio station were the most cost-effective recruitment methods for soliciting participants who subsequently were randomized into the trial. Advertising across many sources (saturation) was found to be important. Limitations It may not be feasible to employ all the recruitment methods used in this trial to obtain participation from other populations, such as those currently unwell, or in other geographical locations. Recruitment source was unavailable for participants who could not be reached after the initial contact. Thus, it is possible that the efficiency of certain methods of recruitment was poorer than estimated. Efficacy and costs of other recruitment initiatives, such as providing travel expenses to the in-person eligibility assessment and making follow-up telephone calls to candidates who contacted the recruitment team but could not be screened promptly, were not analysed. Conclusions Website advertising resulted in the highest number of randomized participants and was the second cheapest method of recruiting. Future research should evaluate the effectiveness of recruitment strategies for other samples to contribute to a comprehensive base of knowledge for future RCTs. PMID:24686105
Modeling for Ultrasonic Health Monitoring of Foams with Embedded Sensors
NASA Technical Reports Server (NTRS)
Wang, L.; Rokhlin, S. I.; Rokhlin, Stanislav, I.
2005-01-01
In this report analytical and numerical methods are proposed to estimate the effective elastic properties of regular and random open-cell foams. The methods are based on the principle of minimum energy and on structural beam models. The analytical solutions are obtained using symbolic processing software. The microstructure of the random foam is simulated using Voronoi tessellation together with a rate-dependent random close-packing algorithm. The statistics of the geometrical properties of random foams corresponding to different packing fractions have been studied. The effects of the packing fraction on elastic properties of the foams have been investigated by decomposing the compliance into bending and axial compliance components. It is shown that the bending compliance increases and the axial compliance decreases when the packing fraction increases. Keywords: Foam; Elastic properties; Finite element; Randomness
Analytical connection between thresholds and immunization strategies of SIS model in random networks
NASA Astrophysics Data System (ADS)
Zhou, Ming-Yang; Xiong, Wen-Man; Liao, Hao; Wang, Tong; Wei, Zong-Wen; Fu, Zhong-Qian
2018-05-01
Devising effective strategies for hindering the propagation of viruses and protecting the population against epidemics is critical for public security and health. Despite a number of studies based on the susceptible-infected-susceptible (SIS) model devoted to this topic, we still lack a general framework to compare different immunization strategies in completely random networks. Here, we address this problem by suggesting a novel method based on heterogeneous mean-field theory for the SIS model. Our method builds the relationship between the thresholds and different immunization strategies in completely random networks. Besides, we provide an analytical argument that the targeted large-degree strategy achieves the best performance in random networks with arbitrary degree distribution. Moreover, the experimental results demonstrate the effectiveness of the proposed method in both artificial and real-world networks.
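The heterogeneous mean-field quantity behind this analysis, the epidemic threshold lambda_c = <k>/<k^2> for an uncorrelated random network, is easy to compute. The comparison below of targeted large-degree versus random immunization is only a rough illustration, since it rescores the kept nodes' original degrees and ignores the edges lost to immunized neighbours.

```python
import numpy as np

def sis_threshold(degrees):
    """Heterogeneous mean-field SIS threshold lambda_c = <k> / <k^2>
    for an uncorrelated random network with the given degree sequence."""
    k = np.asarray(degrees, float)
    return k.mean() / (k ** 2).mean()

rng = np.random.default_rng(3)
# a heavy-tailed degree sequence (discrete approximation of a power law)
degrees = np.clip(rng.zipf(2.5, size=20000), 2, 1000)
print("no immunization:         ", sis_threshold(degrees))

frac = 0.05
n_immunized = int(frac * degrees.size)

# targeted large-degree strategy: remove the highest-degree nodes
targeted = np.sort(degrees)[: degrees.size - n_immunized]
print("targeted (top 5% degree):", sis_threshold(targeted))

# random strategy: remove the same number of nodes uniformly at random
random_kept = rng.choice(degrees, size=degrees.size - n_immunized, replace=False)
print("random (5%):             ", sis_threshold(random_kept))
```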
Generalized self-adjustment method for statistical mechanics of composite materials
NASA Astrophysics Data System (ADS)
Pan'kov, A. A.
1997-03-01
A new method, the generalized self-adjustment method, is developed for the statistical mechanics of composite materials; it reduces the problem of predicting the effective elastic properties of composites with random structures to the solution of two simpler "averaged" problems of an inclusion with transitional layers in a medium having the desired effective elastic properties. The inhomogeneous elastic properties and dimensions of the transitional layers take into account both the approximate order of mutual positioning and the variation in the dimensions and elastic properties of the inclusions through appropriate special averaged indicator functions of the random structure of the composite. A numerical calculation of the averaged indicator functions and effective elastic characteristics is performed by the generalized self-adjustment method for a unidirectional fiberglass composite on the basis of various models of actual random structures in the plane of isotropy.
ERIC Educational Resources Information Center
Henry, James A.; Thielman, Emily J.; Zaugg, Tara L.; Kaelin, Christine; Schmidt, Caroline J.; Griest, Susan; McMillan, Garnett P.; Myers, Paula; Rivera, Izel; Baldwin, Robert; Carlson, Kathleen
2017-01-01
Purpose: This randomized controlled trial evaluated, within clinical settings, the effectiveness of coping skills education that is provided with progressive tinnitus management (PTM). Method: At 2 Veterans Affairs medical centers, N = 300 veterans were randomized to either PTM intervention or 6-month wait-list control. The PTM intervention…
ERIC Educational Resources Information Center
Eack, Shaun M.; Hogarty, Gerard E.; Greenwald, Deborah P.; Hogarty, Susan S.; Keshavan, Matcheri S.
2011-01-01
Objective: To examine the effects of psychosocial cognitive rehabilitation on employment outcomes in a randomized controlled trial for individuals with early course schizophrenia. Method: Early course schizophrenia outpatients (N = 58) were randomly assigned to cognitive enhancement therapy (CET) or an enriched supportive therapy (EST) control and…
Two new methods to fit models for network meta-analysis with random inconsistency effects.
Law, Martin; Jackson, Dan; Turner, Rebecca; Rhodes, Kirsty; Viechtbauer, Wolfgang
2016-07-28
Meta-analysis is a valuable tool for combining evidence from multiple studies. Network meta-analysis is becoming more widely used as a means to compare multiple treatments in the same analysis. However, a network meta-analysis may exhibit inconsistency, whereby the treatment effect estimates do not agree across all trial designs, even after taking between-study heterogeneity into account. We propose two new estimation methods for network meta-analysis models with random inconsistency effects. The model we consider is an extension of the conventional random-effects model for meta-analysis to the network meta-analysis setting and allows for potential inconsistency using random inconsistency effects. Our first new estimation method uses a Bayesian framework with empirically-based prior distributions for both the heterogeneity and the inconsistency variances. We fit the model using importance sampling and thereby avoid some of the difficulties that might be associated with using Markov Chain Monte Carlo (MCMC). However, we confirm the accuracy of our importance sampling method by comparing the results to those obtained using MCMC as the gold standard. The second new estimation method we describe uses a likelihood-based approach, implemented in the metafor package, which can be used to obtain (restricted) maximum-likelihood estimates of the model parameters and profile likelihood confidence intervals of the variance components. We illustrate the application of the methods using two contrasting examples. The first uses all-cause mortality as an outcome, and shows little evidence of between-study heterogeneity or inconsistency. The second uses "ear discharge" as an outcome, and exhibits substantial between-study heterogeneity and inconsistency. Both new estimation methods give results similar to those obtained using MCMC. The extent of heterogeneity and inconsistency should be assessed and reported in any network meta-analysis. Our two new methods can be used to fit models for network meta-analysis with random inconsistency effects. They are easily implemented using the accompanying R code in the Additional file 1. Using these estimation methods, the extent of inconsistency can be assessed and reported.
Sequential change detection and monitoring of temporal trends in random-effects meta-analysis.
Dogo, Samson Henry; Clark, Allan; Kulinskaya, Elena
2017-06-01
Temporal changes in magnitude of effect sizes reported in many areas of research are a threat to the credibility of the results and conclusions of meta-analysis. Numerous sequential methods for meta-analysis have been proposed to detect changes and monitor trends in effect sizes so that meta-analysis can be updated when necessary and interpreted based on the time it was conducted. The difficulties of sequential meta-analysis under the random-effects model are caused by dependencies in increments introduced by the estimation of the heterogeneity parameter τ². In this paper, we propose the use of a retrospective cumulative sum (CUSUM)-type test with bootstrap critical values. This method allows retrospective analysis of the past trajectory of cumulative effects in random-effects meta-analysis and its visualization on a chart similar to a CUSUM chart. Simulation results show that the new method demonstrates good control of Type I error regardless of the number or size of the studies and the amount of heterogeneity. Application of the new method is illustrated on two examples of medical meta-analyses. © 2016 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
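A rough sketch of the idea, not the authors' implementation: cumulative DerSimonian-Laird estimates are computed in study order, a CUSUM of their increments is formed, and a critical value is obtained by permuting study order (a crude stand-in for the paper's bootstrap). The effect sizes and variances below are hypothetical.

```python
# Illustrative sketch only: cumulative DerSimonian-Laird estimates in study
# order, a CUSUM of the increments, and a permutation-based critical value.
import numpy as np

rng = np.random.default_rng(1)
yi = np.array([0.5, 0.4, 0.55, 0.1, 0.05, 0.0, -0.1])    # effects in time order
vi = np.array([0.05, 0.04, 0.06, 0.03, 0.05, 0.04, 0.03])

def dl_estimate(y, v):
    """DerSimonian-Laird random-effects pooled estimate."""
    if len(y) < 2:
        return y[0]
    w = 1.0 / v
    mu_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - mu_fe) ** 2)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)
    w_re = 1.0 / (v + tau2)
    return np.sum(w_re * y) / np.sum(w_re)

def cusum_stat(y, v):
    """Max absolute CUSUM of increments of the cumulative pooled estimate."""
    cum = np.array([dl_estimate(y[:k], v[:k]) for k in range(1, len(y) + 1)])
    incr = np.diff(cum)
    return np.max(np.abs(np.cumsum(incr - incr.mean())))

obs = cusum_stat(yi, vi)

# Reference distribution under "no temporal trend": shuffle study order
perm_stats = []
for _ in range(2000):
    idx = rng.permutation(len(yi))
    perm_stats.append(cusum_stat(yi[idx], vi[idx]))
print(f"CUSUM statistic = {obs:.3f}, "
      f"permutation 95% critical value = {np.quantile(perm_stats, 0.95):.3f}")
```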
Krusche, Adele; Rudolf von Rohr, Isabelle; Muse, Kate; Duggan, Danielle; Crane, Catherine; Williams, J Mark G
2014-04-01
Randomized controlled trials (RCTs) are widely accepted as being the most efficient way of investigating the efficacy of psychological therapies. However, researchers conducting RCTs commonly report difficulties in recruiting an adequate sample within planned timescales. In an effort to overcome recruitment difficulties, researchers often are forced to expand their recruitment criteria or extend the recruitment phase, thus increasing costs and delaying publication of results. Research investigating the effectiveness of recruitment strategies is limited, and trials often fail to report sufficient details about the recruitment sources and resources utilized. We examined the efficacy of strategies implemented during the Staying Well after Depression RCT in Oxford to recruit participants with a history of recurrent depression. We describe eight recruitment methods utilized and two further sources not initiated by the research team and examine their efficacy in terms of (1) the return, including the number of potential participants who contacted the trial and the number who were randomized into the trial; (2) cost-effectiveness, comprising direct financial cost and manpower for initial contacts and randomized participants; and (3) comparison of sociodemographic characteristics of individuals recruited from different sources. Poster advertising, web-based advertising, and mental health worker referrals were the cheapest methods per randomized participant; however, the ratio of randomized participants to initial contacts differed markedly per source. Advertising online, via posters, and on a local radio station were the most cost-effective recruitment methods for soliciting participants who subsequently were randomized into the trial. Advertising across many sources (saturation) was found to be important. It may not be feasible to employ all the recruitment methods used in this trial to obtain participation from other populations, such as those currently unwell, or in other geographical locations. Recruitment source was unavailable for participants who could not be reached after the initial contact. Thus, it is possible that the efficiency of certain methods of recruitment was poorer than estimated. Efficacy and costs of other recruitment initiatives, such as providing travel expenses to the in-person eligibility assessment and making follow-up telephone calls to candidates who contacted the recruitment team but could not be screened promptly, were not analysed. Website advertising resulted in the highest number of randomized participants and was the second cheapest method of recruiting. Future research should evaluate the effectiveness of recruitment strategies for other samples to contribute to a comprehensive base of knowledge for future RCTs.
ERIC Educational Resources Information Center
Evans, David K.; Ghosh, Arkadipta
2008-01-01
Poor countries need development programs that are both effective and cost-effective. To assess effectiveness, researchers are increasingly using randomized trials (or quasi-experimental methods that imitate randomized trials), which provide a clear picture of which outcomes are attributable to the program being evaluated. This "Policy Insight"…
Neither fixed nor random: weighted least squares meta-analysis.
Stanley, T D; Doucouliagos, Hristos
2015-06-15
This study challenges two core conventional meta-analysis methods: fixed effect and random effects. We show how and explain why an unrestricted weighted least squares estimator is superior to conventional random-effects meta-analysis when there is publication (or small-sample) bias and better than a fixed-effect weighted average if there is heterogeneity. Statistical theory and simulations of effect sizes, log odds ratios and regression coefficients demonstrate that this unrestricted weighted least squares estimator provides satisfactory estimates and confidence intervals that are comparable to random effects when there is no publication (or small-sample) bias and identical to fixed-effect meta-analysis when there is no heterogeneity. When there is publication selection bias, the unrestricted weighted least squares approach dominates random effects; when there is excess heterogeneity, it is clearly superior to fixed-effect meta-analysis. In practical applications, an unrestricted weighted least squares weighted average will often provide superior estimates to both conventional fixed and random effects. Copyright © 2015 John Wiley & Sons, Ltd.
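The following minimal sketch reflects one common reading of the unrestricted weighted least squares (WLS) weighted average: the point estimate coincides with the fixed-effect estimate, while its variance is rescaled by the regression mean squared error Q/(k-1). The data are hypothetical, and the snippet is an illustration rather than the authors' implementation.

```python
# Minimal sketch (my reading of the unrestricted WLS weighted average, not the
# authors' code). Data are hypothetical log odds ratios and standard errors.
import numpy as np

yi = np.array([0.21, 0.35, -0.05, 0.40, 0.18, 0.55])   # hypothetical effects
se = np.array([0.12, 0.20, 0.15, 0.25, 0.10, 0.30])    # hypothetical std. errors

w = 1.0 / se**2
k = len(yi)

# Fixed-effect (inverse-variance) estimate and its standard error
mu_fe = np.sum(w * yi) / np.sum(w)
se_fe = np.sqrt(1.0 / np.sum(w))

# Unrestricted WLS: OLS of yi/se on 1/se without an intercept
q = np.sum(w * (yi - mu_fe) ** 2)     # Cochran's Q = residual sum of squares
mse = q / (k - 1)                     # regression mean squared error
mu_wls = mu_fe                        # same point estimate as fixed effect
se_wls = se_fe * np.sqrt(mse)         # standard error rescaled by sqrt(MSE)

print(f"fixed effect: {mu_fe:.3f} (SE {se_fe:.3f})")
print(f"unrestricted WLS: {mu_wls:.3f} (SE {se_wls:.3f})")
```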
ERIC Educational Resources Information Center
Krabbenborg, Manon A. M.; Boersma, Sandra N.; van der Veld, William M.; van Hulst, Bente; Vollebergh, Wilma A. M.; Wolf, Judith R. L. M.
2017-01-01
Objective: To test the effectiveness of Houvast: a strengths-based intervention for homeless young adults. Method: A cluster randomized controlled trial was conducted with 10 Dutch shelter facilities randomly allocated to an intervention and a control group. Homeless young adults were interviewed when entering the facility and when care ended.…
Micro-Randomized Trials: An Experimental Design for Developing Just-in-Time Adaptive Interventions
Klasnja, Predrag; Hekler, Eric B.; Shiffman, Saul; Boruvka, Audrey; Almirall, Daniel; Tewari, Ambuj; Murphy, Susan A.
2015-01-01
Objective This paper presents an experimental design, the micro-randomized trial, developed to support optimization of just-in-time adaptive interventions (JITAIs). JITAIs are mHealth technologies that aim to deliver the right intervention components at the right times and locations to optimally support individuals’ health behaviors. Micro-randomized trials offer a way to optimize such interventions by enabling modeling of causal effects and time-varying effect moderation for individual intervention components within a JITAI. Methods The paper describes the micro-randomized trial design, enumerates research questions that this experimental design can help answer, and provides an overview of the data analyses that can be used to assess the causal effects of studied intervention components and investigate time-varying moderation of those effects. Results Micro-randomized trials enable causal modeling of proximal effects of the randomized intervention components and assessment of time-varying moderation of those effects. Conclusions Micro-randomized trials can help researchers understand whether their interventions are having intended effects, when and for whom they are effective, and what factors moderate the interventions’ effects, enabling creation of more effective JITAIs. PMID:26651463
Designing Studies That Would Address the Multilayered Nature of Health Care
Pennell, Michael; Rhoda, Dale; Hade, Erinn M.; Paskett, Electra D.
2010-01-01
We review design and analytic methods available for multilevel interventions in cancer research with particular attention to study design, sample size requirements, and potential to provide statistical evidence for causal inference. The most appropriate methods will depend on the stage of development of the research and whether randomization is possible. Early on, fractional factorial designs may be used to screen intervention components, particularly when randomization of individuals is possible. Quasi-experimental designs, including time-series and multiple baseline designs, can be useful once the intervention is designed because they require few sites and can provide the preliminary evidence to plan efficacy studies. In efficacy and effectiveness studies, group-randomized trials are preferred when randomization is possible and regression discontinuity designs are preferred otherwise if assignment based on a quantitative score is possible. Quasi-experimental designs may be used, especially when combined with recent developments in analytic methods to reduce bias in effect estimates. PMID:20386057
Analysis on pseudo excitation of random vibration for structure of time flight counter
NASA Astrophysics Data System (ADS)
Wu, Qiong; Li, Dapeng
2015-03-01
Traditional computational methods are inefficient for obtaining the key dynamical parameters of complicated structures. The Pseudo Excitation Method (PEM) is an effective method for the calculation of random vibration. Because the random vibration during a rocket or shuttle launch is complicated and coupled, a new staged white-noise mathematical model is derived from the practical launch environment. This model is applied with PEM to the specific structure of a Time-of-Flight Counter (ToFC). The power spectral density responses and the relevant dynamic characteristic parameters of the ToFC are obtained at the flight acceptance test level. Taking the stiffness of the fixture structure into account, random vibration experiments are conducted in three directions for comparison with the revised PEM. The experimental results show that the structure can withstand the random vibration caused by launch without damage, and the key dynamical parameters of the ToFC are obtained. The comparison shows that the revised PEM and the random vibration experiments agree in both dynamical parameters and responses, with a maximum error within 9%. The sources of error are analyzed to improve the reliability of the calculation. This research provides an effective method for computing the dynamical characteristic parameters of complicated structures during rocket or shuttle launches.
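As a hedged illustration of the pseudo-excitation idea (not the paper's staged white-noise model or the actual ToFC structure), the sketch below applies PEM to a single-degree-of-freedom oscillator under a flat force PSD: the response PSD is obtained directly as |H(ω)|² S(ω) from the pseudo harmonic excitation. All numerical values are made up.

```python
# Hedged PEM sketch for a single-degree-of-freedom oscillator: form the pseudo
# excitation sqrt(S_ff(w))*exp(i*w*t); the response PSD is then the squared
# magnitude of the pseudo response. All numbers are illustrative.
import numpy as np

m, c, k = 1.0, 0.8, 4.0e4                 # mass, damping, stiffness (illustrative)
omega = np.linspace(1.0, 1000.0, 20000)   # frequency grid (rad/s)
S_ff = np.full_like(omega, 2.0)           # flat ("white") one-sided force PSD

# Frequency response of displacement to force
H = 1.0 / (k - m * omega**2 + 1j * c * omega)

# Pseudo excitation: the exp(i*w*t) factor cancels inside the magnitude
pseudo_force = np.sqrt(S_ff)
S_yy = np.abs(H * pseudo_force) ** 2      # response power spectral density

# RMS displacement from the area under the response PSD
rms = np.sqrt(np.trapz(S_yy, omega))
print(f"peak response PSD at {omega[np.argmax(S_yy)]:.1f} rad/s, "
      f"RMS displacement = {rms:.3e}")
```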
Effective dynamics of a random walker on a heterogeneous ring: Exact results
NASA Astrophysics Data System (ADS)
Masharian, S. R.
2018-07-01
In this paper, by considering a biased random walker hopping on a one-dimensional lattice with a ring geometry, we investigate the fluctuations of the speed of the random walker. We assume that the lattice is heterogeneous, i.e. the hopping rate of the random walker between the first and the last lattice sites is different from the hopping rate between the other links of the lattice. Assuming that the average speed of the random walker in the steady state is v∗, we have been able to find the unconditional effective dynamics of the random walker, in which the average speed of the random walker is -v∗. Using a perturbative method in the large system-size limit, we have also been able to show that the effective hopping rates of the random walker near the defective link are highly site-dependent.
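A simple Monte Carlo companion to these analytical results (not the paper's derivation): simulate the biased walker on a ring with one defective link and estimate the steady-state speed. All hopping rates below are illustrative.

```python
# Continuous-time simulation (Gillespie algorithm) of a biased random walker on
# an L-site ring whose link between the last and first sites has different
# hopping rates; we estimate the steady-state average speed. Rates are made up.
import numpy as np

rng = np.random.default_rng(0)
L = 20                    # ring size
p, q = 1.0, 0.3           # forward / backward rates on ordinary links
p_def, q_def = 0.2, 0.05  # forward / backward rates across the defective link

def rates(site):
    """Hopping rates out of `site`; the defective link joins site L-1 and site 0."""
    fwd = p_def if site == L - 1 else p
    bwd = q_def if site == 0 else q
    return fwd, bwd

site, position, t = 0, 0, 0.0
T_max = 200000.0
while t < T_max:
    fwd, bwd = rates(site)
    total = fwd + bwd
    t += rng.exponential(1.0 / total)          # waiting time until the next hop
    if rng.random() < fwd / total:             # forward hop
        site, position = (site + 1) % L, position + 1
    else:                                      # backward hop
        site, position = (site - 1) % L, position - 1

print(f"estimated steady-state speed v* = {position / t:.4f} sites per unit time")
```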
Kahan, Brennan C; Harhay, Michael O
2015-12-01
Adjustment for center in multicenter trials is recommended when there are between-center differences or when randomization has been stratified by center. However, common methods of analysis (such as fixed-effects, Mantel-Haenszel, or stratified Cox models) often require a large number of patients or events per center to perform well. We reviewed 206 multicenter randomized trials published in four general medical journals to assess the average number of patients and events per center and determine whether appropriate methods of analysis were used in trials with few patients or events per center. The median number of events per center/treatment arm combination for trials using a binary or survival outcome was 3 (interquartile range, 1-10). Sixteen percent of trials had less than 1 event per center/treatment combination, 50% fewer than 3, and 63% fewer than 5. Of the trials which adjusted for center using a method of analysis which requires a large number of events per center, 6% had less than 1 event per center-treatment combination, 25% fewer than 3, and 50% fewer than 5. Methods of analysis that allow for few events per center, such as random-effects models or generalized estimating equations (GEEs), were rarely used. Many multicenter trials contain few events per center. Adjustment for center using random-effects models or GEE with model-based (non-robust) standard errors may be beneficial in these scenarios. Copyright © 2015 Elsevier Inc. All rights reserved.
Confidence intervals for single-case effect size measures based on randomization test inversion.
Michiels, Bart; Heyvaert, Mieke; Meulders, Ann; Onghena, Patrick
2017-02-01
In the current paper, we present a method to construct nonparametric confidence intervals (CIs) for single-case effect size measures in the context of various single-case designs. We use the relationship between a two-sided statistical hypothesis test at significance level α and a 100 (1 - α) % two-sided CI to construct CIs for any effect size measure θ that contain all point null hypothesis θ values that cannot be rejected by the hypothesis test at significance level α. This method of hypothesis test inversion (HTI) can be employed using a randomization test as the statistical hypothesis test in order to construct a nonparametric CI for θ. We will refer to this procedure as randomization test inversion (RTI). We illustrate RTI in a situation in which θ is the unstandardized and the standardized difference in means between two treatments in a completely randomized single-case design. Additionally, we demonstrate how RTI can be extended to other types of single-case designs. Finally, we discuss a few challenges for RTI as well as possibilities when using the method with other effect size measures, such as rank-based nonoverlap indices. Supplementary to this paper, we provide easy-to-use R code, which allows the user to construct nonparametric CIs according to the proposed method.
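A minimal sketch of the randomization test inversion idea for the unstandardized mean difference in a completely randomized two-condition design is given below; it is not the authors' supplementary R code, and the scores, assignment, and theta grid are hypothetical.

```python
# Minimal RTI sketch: collect all theta0 values that a two-sided randomization
# test does not reject after subtracting theta0 from the treatment scores.
import numpy as np
from itertools import combinations

scores = np.array([3.0, 5.0, 4.0, 7.0, 8.0, 6.0, 9.0, 7.5])  # hypothetical data
treat = np.array([0, 0, 0, 1, 1, 0, 1, 1])                   # 1 = treatment
alpha = 0.05

def rand_test_pvalue(y, labels):
    """Two-sided randomization test p-value for a difference in means."""
    n_treat = labels.sum()
    obs = y[labels == 1].mean() - y[labels == 0].mean()
    diffs = []
    for idx in combinations(range(len(y)), n_treat):  # all equally likely assignments
        mask = np.zeros(len(y), dtype=bool)
        mask[list(idx)] = True
        diffs.append(y[mask].mean() - y[~mask].mean())
    return np.mean(np.abs(diffs) >= abs(obs) - 1e-12)

theta_grid = np.arange(-2.0, 8.0, 0.05)
kept = [t0 for t0 in theta_grid
        if rand_test_pvalue(scores - t0 * treat, treat) > alpha]
print(f"approximate {100*(1-alpha):.0f}% randomization CI: "
      f"[{min(kept):.2f}, {max(kept):.2f}]")
```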
NASA Astrophysics Data System (ADS)
Schießl, Stefan P.; Rother, Marcel; Lüttgens, Jan; Zaumseil, Jana
2017-11-01
The field-effect mobility is an important figure of merit for semiconductors such as random networks of single-walled carbon nanotubes (SWNTs). However, owing to their network properties and quantum capacitance, the standard models for field-effect transistors cannot be applied without modifications. Several different methods are used to determine the mobility with often very different results. We fabricated and characterized field-effect transistors with different polymer-sorted, semiconducting SWNT network densities ranging from low (≈6 μm⁻¹) to densely packed quasi-monolayers (≈26 μm⁻¹) with a maximum on-conductance of 0.24 μS μm⁻¹ and compared four different techniques to evaluate the field-effect mobility. We demonstrate the limits and requirements for each method with regard to device layout and carrier accumulation. We find that techniques that take into account the measured capacitance on the active device give the most reliable mobility values. Finally, we compare our experimental results to a random-resistor-network model.
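For illustration, the sketch below extracts a linear-regime field-effect mobility from the transconductance using an areal capacitance assumed to have been measured on the active device; it shows one common extraction approach rather than reproducing the four techniques compared in the paper, and every device number is hypothetical.

```python
# One common mobility-extraction approach (not necessarily any of the four
# techniques compared in the paper): linear-regime mobility from the
# transconductance and the measured areal capacitance. Numbers are made up.
import numpy as np

L = 20e-4            # channel length in cm (20 um)
W = 1000e-4          # channel width in cm (1 mm)
C_areal = 1.5e-7     # measured areal capacitance in F/cm^2 (assumed value)
V_DS = -0.1          # drain-source voltage in V (linear regime)

# Hypothetical transfer curve I_D(V_G) for a toy p-type transistor
V_G = np.linspace(0.0, -5.0, 101)
I_D = -1e-6 * np.clip(-(V_G + 1.0), 0.0, None) * abs(V_DS)

g_m = np.gradient(I_D, V_G)                       # transconductance dI_D/dV_G
mu_lin = (L / (W * C_areal * abs(V_DS))) * np.abs(g_m)

print(f"peak linear mobility = {mu_lin.max():.2f} cm^2 V^-1 s^-1")
```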
NASA Astrophysics Data System (ADS)
Liao, Zhikun; Lu, Dawei; Hu, Jiemin; Zhang, Jun
2018-04-01
For a random hopping frequency (RHF) signal, the modulated frequencies are randomly distributed over a given bandwidth. The randomness of the modulated frequency not only improves the electronic counter-countermeasure capability of radar systems, but also determines the performance of range compression. In this paper, the range ambiguity function of the RHF signal is first derived. Then, a design method for the frequency hopping pattern, based on the stationary phase principle, is proposed to improve the peak-to-sidelobe ratio. Finally, simulated experiments show the effectiveness of the presented design method.
Asymptotic Effect of Misspecification in the Random Part of the Multilevel Model
ERIC Educational Resources Information Center
Berkhof, Johannes; Kampen, Jarl Kennard
2004-01-01
The authors examine the asymptotic effect of omitting a random coefficient in the multilevel model and derive expressions for the change in (a) the variance components estimator and (b) the estimated variance of the fixed effects estimator. They apply the method of moments, which yields a closed form expression for the omission effect. In…
Methods for Synthesizing Findings on Moderation Effects Across Multiple Randomized Trials
Brown, C Hendricks; Sloboda, Zili; Faggiano, Fabrizio; Teasdale, Brent; Keller, Ferdinand; Burkhart, Gregor; Vigna-Taglianti, Federica; Howe, George; Masyn, Katherine; Wang, Wei; Muthén, Bengt; Stephens, Peggy; Grey, Scott; Perrino, Tatiana
2011-01-01
This paper presents new methods for synthesizing results from subgroup and moderation analyses across different randomized trials. We demonstrate that such a synthesis generally results in additional power to detect significant moderation findings above what one would find in a single trial. Three general methods for conducting synthesis analyses are discussed, with two methods, integrative data analysis, and parallel analyses, sharing a large advantage over traditional methods available in meta-analysis. We present a broad class of analytic models to examine moderation effects across trials that can be used to assess their overall effect and explain sources of heterogeneity, and present ways to disentangle differences across trials due to individual differences, contextual level differences, intervention, and trial design. PMID:21360061
The prompted optional randomization trial: a new design for comparative effectiveness research.
Flory, James; Karlawish, Jason
2012-12-01
Randomized controlled trials are the gold standard for medical evidence because randomization provides the best-known protection against confounding of results. Randomization has practical and ethical problems that limit the number of trials that can be conducted, however. A different method for collecting clinical data retains the statistically useful properties of randomization without incurring its practical and ethical challenges. A computerized prompt introduces a random element into clinical decision-making that can be instantly overridden if it conflicts with optimal patient care. This creates a weak form of randomization that still eliminates the effect of all confounders, can be carried out without disturbing routine clinical care, and arguably will not require research-grade informed consent.
Blocking for Sequential Political Experiments
Moore, Sally A.
2013-01-01
In typical political experiments, researchers randomize a set of households, precincts, or individuals to treatments all at once, and characteristics of all units are known at the time of randomization. However, in many other experiments, subjects “trickle in” to be randomized to treatment conditions, usually via complete randomization. To take advantage of the rich background data that researchers often have (but underutilize) in these experiments, we develop methods that use continuous covariates to assign treatments sequentially. We build on biased coin and minimization procedures for discrete covariates and demonstrate that our methods outperform complete randomization, producing better covariate balance in simulated data. We then describe how we selected and deployed a sequential blocking method in a clinical trial and demonstrate the advantages of our having done so. Further, we show how that method would have performed in two larger sequential political trials. Finally, we compare causal effect estimates from differences in means, augmented inverse propensity weighted estimators, and randomization test inversion. PMID:24143061
MicroRNA array normalization: an evaluation using a randomized dataset as the benchmark.
Qin, Li-Xuan; Zhou, Qin
2014-01-01
MicroRNA arrays possess a number of unique data features that challenge the assumption key to many normalization methods. We assessed the performance of existing normalization methods using two microRNA array datasets derived from the same set of tumor samples: one dataset was generated using a blocked randomization design when assigning arrays to samples and hence was free of confounding array effects; the second dataset was generated without blocking or randomization and exhibited array effects. The randomized dataset was assessed for differential expression between two tumor groups and treated as the benchmark. The non-randomized dataset was assessed for differential expression after normalization and compared against the benchmark. Normalization improved the true positive rate significantly in the non-randomized data but still possessed a false discovery rate as high as 50%. Adding a batch adjustment step before normalization further reduced the number of false positive markers while maintaining a similar number of true positive markers, which resulted in a false discovery rate of 32% to 48%, depending on the specific normalization method. We concluded the paper with some insights on possible causes of false discoveries to shed light on how to improve normalization for microRNA arrays.
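As an illustration of the kind of normalization being evaluated, the sketch below implements quantile normalization, one widely used array normalization method; the paper compares several existing methods and does not necessarily endorse this particular one. The data matrix is randomly generated.

```python
# Illustration only: a minimal quantile-normalization routine on a random toy
# microRNA-by-array matrix (ties handled naively).
import numpy as np

rng = np.random.default_rng(42)
X = rng.lognormal(mean=2.0, sigma=1.0, size=(200, 6))   # 200 microRNAs x 6 arrays

def quantile_normalize(data):
    """Force every column (array) to share the same empirical distribution."""
    order = np.argsort(data, axis=0)                  # per-column sort order
    ranks = np.empty_like(order)
    for j in range(data.shape[1]):
        ranks[order[:, j], j] = np.arange(data.shape[0])
    mean_sorted = np.sort(data, axis=0).mean(axis=1)  # reference distribution
    return mean_sorted[ranks]

X_norm = quantile_normalize(X)
print("column means before:", np.round(X.mean(axis=0), 2))
print("column means after: ", np.round(X_norm.mean(axis=0), 2))
```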
Cost-effectiveness of Recruitment Methods in an Obesity Prevention Trial for Young Children
Robinson, Jodie L.; Fuerch, Janene H.; Winiewicz, Dana D.; Salvy, Sarah J.; Roemmich, James N.; Epstein, Leonard H.
2007-01-01
Background Recruitment of participants for clinical trials requires considerable effort and cost. There is no research on the cost-effectiveness of recruitment methods for an obesity prevention trial of young children. Methods This study determined the cost-effectiveness of recruiting 70 families with a child aged 4 to 7 (5.9 ± 1.3) years in Western New York from February, 2003 to November, 2004, for a two year randomized obesity prevention trial to reduce television watching in the home. Results Of the 70 randomized families, 65.7% (n = 46) were obtained through direct mailings, 24.3% (n = 17) were acquired through newspaper advertisements, 7.1 % (n = 5) from other sources (e.g. word of mouth), and 2.9% (n = 2) through posters and brochures. Costs of each recruitment method were computed by adding the cost of materials, staff time, and media expenses. Cost-effectiveness (money spent per randomized participant) was US $0 for other sources, US $227.76 for direct mailing, US $546.95 for newspaper ads, and US $3,020.84 for posters and brochures. Conclusion Of the methods with associated costs, direct mailing was the most cost effective in recruiting families with young children, which supports the growing literature of the effectiveness of direct mailing. PMID:17475318
Testing the Intervention Effect in Single-Case Experiments: A Monte Carlo Simulation Study
ERIC Educational Resources Information Center
Heyvaert, Mieke; Moeyaert, Mariola; Verkempynck, Paul; Van den Noortgate, Wim; Vervloet, Marlies; Ugille, Maaike; Onghena, Patrick
2017-01-01
This article reports on a Monte Carlo simulation study, evaluating two approaches for testing the intervention effect in replicated randomized AB designs: two-level hierarchical linear modeling (HLM) and using the additive method to combine randomization test "p" values (RTcombiP). Four factors were manipulated: mean intervention effect,…
Xing, Haifeng; Hou, Bo; Lin, Zhihui; Guo, Meifeng
2017-10-13
MEMS (micro-electro-mechanical system) gyroscopes have been widely applied in various fields, but MEMS gyroscope random drift has nonlinear and non-stationary characteristics. Modeling and compensating this random drift has attracted much attention because it can improve the precision of inertial devices. This paper proposes using wavelet filtering to reduce noise in the original MEMS gyroscope data, reconstructing the random drift data with phase space reconstruction (PSR), and modeling the reconstructed data with a least squares support vector machine (LSSVM) whose parameters are optimized by chaotic particle swarm optimization (CPSO). Comparing models of the MEMS gyroscope random drift built with a back-propagation artificial neural network (BP-ANN) and with the proposed method showed that the latter has better prediction accuracy. After compensation of three groups of MEMS gyroscope random drift data, the standard deviations of the three groups of experimental data dropped from 0.00354°/s, 0.00412°/s, and 0.00328°/s to 0.00065°/s, 0.00072°/s, and 0.00061°/s, respectively, demonstrating that the proposed method can reduce the influence of MEMS gyroscope random drift and verifying its effectiveness for modeling this drift.
A comparison of methods for estimating the random effects distribution of a linear mixed model.
Ghidey, Wendimagegn; Lesaffre, Emmanuel; Verbeke, Geert
2010-12-01
This article reviews various recently suggested approaches to estimate the random effects distribution in a linear mixed model, i.e. (1) the smoothing-by-roughening approach of Shen and Louis, (2) the semi-non-parametric approach of Zhang and Davidian, (3) the heterogeneity model of Verbeke and Lesaffre, and (4) the flexible approach of Ghidey et al. These four approaches are compared via an extensive simulation study. We conclude that for the considered cases, the approach of Ghidey et al. often has the smallest integrated mean squared error for estimating the random effects distribution. An analysis of a longitudinal dental data set illustrates the performance of the methods in a practical example.
2014-01-01
Background Bipolar I disorder (BD-I) is a chronic mental illness characterized by the presence of one or more manic episodes, or both depressive and manic episodes, usually separated by asymptomatic intervals. Pharmacists can contribute to the management of BD-I, mainly with the use of effective and safe drugs, and improve the patient’s life quality through pharmaceutical care. Some studies have shown the effect of pharmaceutical care in the achievement of therapeutic goals in different illnesses; however, to our knowledge, there is a lack of randomized controlled trials designed to assess the effect of pharmacist intervention in patients with BD. The aim of this study is to assess the effectiveness of the Dader Method for pharmaceutical care in patients with BD-I. Methods/design Randomized, controlled, prospective, single-center clinical trial with duration of 12 months will be performed to compare the effect of Dader Method of pharmaceutical care with the usual care process of patients in a psychiatric clinic. Patients diagnosed with BD-I aged between 18 and 65 years who have been discharged or referred from outpatients service of the San Juan de Dios Clinic (Antioquia, Colombia) will be included. Patients will be randomized into the intervention group who will receive pharmaceutical care provided by pharmacists working in collaboration with psychiatrists, or into the control group who will receive usual care and verbal-written counseling regarding BD. Study outcomes will be assessed at baseline and at 3, 6, 9, and 12 months after randomization. The primary outcome will be to measure the number of hospitalizations, emergency service consultations, and unscheduled outpatient visits. Effectiveness, safety, adherence, and quality of life will be assessed as secondary outcomes. Statistical analyses will be performed using two-tailed McNemar tests, Pearson chi-square tests, and Student’s t-tests; a P value <0.05 will be considered as statistically significant. Discussion As far as we know, this is the first randomized controlled trial to assess the effect of the Dader Method for pharmaceutical care in patients with BD-I and it could generate valuable information and recommendations about the role of pharmacists in the improvement of therapeutic goals, solution of drug-related problems, and adherence. Trial registration Registration number NCT01750255 on August 6, 2012. First patient randomized on 24 November 2011. PMID:24885673
Diagnostics of Robust Growth Curve Modeling Using Student's "t" Distribution
ERIC Educational Resources Information Center
Tong, Xin; Zhang, Zhiyong
2012-01-01
Growth curve models with different types of distributions of random effects and of intraindividual measurement errors for robust analysis are compared. After demonstrating the influence of distribution specification on parameter estimation, 3 methods for diagnosing the distributions for both random effects and intraindividual measurement errors…
2011-01-01
Background Logistic random effects models are a popular tool to analyze multilevel also called hierarchical data with a binary or ordinal outcome. Here, we aim to compare different statistical software implementations of these models. Methods We used individual patient data from 8509 patients in 231 centers with moderate and severe Traumatic Brain Injury (TBI) enrolled in eight Randomized Controlled Trials (RCTs) and three observational studies. We fitted logistic random effects regression models with the 5-point Glasgow Outcome Scale (GOS) as outcome, both dichotomized as well as ordinal, with center and/or trial as random effects, and as covariates age, motor score, pupil reactivity or trial. We then compared the implementations of frequentist and Bayesian methods to estimate the fixed and random effects. Frequentist approaches included R (lme4), Stata (GLLAMM), SAS (GLIMMIX and NLMIXED), MLwiN ([R]IGLS) and MIXOR, Bayesian approaches included WinBUGS, MLwiN (MCMC), R package MCMCglmm and SAS experimental procedure MCMC. Three data sets (the full data set and two sub-datasets) were analysed using basically two logistic random effects models with either one random effect for the center or two random effects for center and trial. For the ordinal outcome in the full data set also a proportional odds model with a random center effect was fitted. Results The packages gave similar parameter estimates for both the fixed and random effects and for the binary (and ordinal) models for the main study and when based on a relatively large number of level-1 (patient level) data compared to the number of level-2 (hospital level) data. However, when based on relatively sparse data set, i.e. when the numbers of level-1 and level-2 data units were about the same, the frequentist and Bayesian approaches showed somewhat different results. The software implementations differ considerably in flexibility, computation time, and usability. There are also differences in the availability of additional tools for model evaluation, such as diagnostic plots. The experimental SAS (version 9.2) procedure MCMC appeared to be inefficient. Conclusions On relatively large data sets, the different software implementations of logistic random effects regression models produced similar results. Thus, for a large data set there seems to be no explicit preference (of course if there is no preference from a philosophical point of view) for either a frequentist or Bayesian approach (if based on vague priors). The choice for a particular implementation may largely depend on the desired flexibility, and the usability of the package. For small data sets the random effects variances are difficult to estimate. In the frequentist approaches the MLE of this variance was often estimated zero with a standard error that is either zero or could not be determined, while for Bayesian methods the estimates could depend on the chosen "non-informative" prior of the variance parameter. The starting value for the variance parameter may be also critical for the convergence of the Markov chain. PMID:21605357
Prediction models for clustered data: comparison of a random intercept and standard regression model
2013-01-01
Background When study data are clustered, standard regression analysis is considered inappropriate and analytical techniques for clustered data need to be used. For prediction research in which the interest of predictor effects is on the patient level, random effect regression models are probably preferred over standard regression analysis. It is well known that the random effect parameter estimates and the standard logistic regression parameter estimates are different. Here, we compared random effect and standard logistic regression models for their ability to provide accurate predictions. Methods Using an empirical study on 1642 surgical patients at risk of postoperative nausea and vomiting, who were treated by one of 19 anesthesiologists (clusters), we developed prognostic models either with standard or random intercept logistic regression. External validity of these models was assessed in new patients from other anesthesiologists. We supported our results with simulation studies using intra-class correlation coefficients (ICC) of 5%, 15%, or 30%. Standard performance measures and measures adapted for the clustered data structure were estimated. Results The model developed with random effect analysis showed better discrimination than the standard approach, if the cluster effects were used for risk prediction (standard c-index of 0.69 versus 0.66). In the external validation set, both models showed similar discrimination (standard c-index 0.68 versus 0.67). The simulation study confirmed these results. For datasets with a high ICC (≥15%), model calibration was only adequate in external subjects, if the used performance measure assumed the same data structure as the model development method: standard calibration measures showed good calibration for the standard developed model, calibration measures adapting the clustered data structure showed good calibration for the prediction model with random intercept. Conclusion The models with random intercept discriminate better than the standard model only if the cluster effect is used for predictions. The prediction model with random intercept had good calibration within clusters. PMID:23414436
The Impact of Five Missing Data Treatments on a Cross-Classified Random Effects Model
ERIC Educational Resources Information Center
Hoelzle, Braden R.
2012-01-01
The present study compared the performance of five missing data treatment methods within a Cross-Classified Random Effects Model environment under various levels and patterns of missing data given a specified sample size. Prior research has shown the varying effect of missing data treatment options within the context of numerous statistical…
A Randomized Effectiveness Trial of Brief Parent Training: Six-Month Follow-Up
ERIC Educational Resources Information Center
Kjøbli, John; Bjørnebekk, Gunnar
2013-01-01
Objective: To examine the follow-up effectiveness of brief parent training (BPT) for children with emerging or existing conduct problems. Method: With the use of a randomized controlled trial and parent and teacher reports, this study examined the effectiveness of BPT compared to regular services 6 months after the end of the intervention.…
The investigation of social networks based on multi-component random graphs
NASA Astrophysics Data System (ADS)
Zadorozhnyi, V. N.; Yudin, E. B.
2018-01-01
Methods for calibrating non-homogeneous random graphs are developed for social network simulation. The graphs are calibrated by the degree distributions of the vertices and the edges. The mathematical foundation of the methods is formed by the theory of random graphs with the nonlinear preferential attachment rule and the theory of Erdős-Rényi random graphs. Well-calibrated network graph models, and computer experiments with these models, would help developers (owners) of networks to predict their development correctly and to choose effective strategies for controlling network projects.
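A sketch of the kind of model the calibration builds on, a growing graph with a nonlinear preferential attachment rule (attachment probability proportional to degree^α), is given below; it is not the authors' calibration procedure, and all parameters are illustrative.

```python
# Growing random graph with nonlinear preferential attachment: each new node
# attaches to existing nodes with probability proportional to degree**alpha.
import numpy as np
from collections import Counter

rng = np.random.default_rng(7)
n_nodes, m_per_node, alpha = 2000, 2, 0.8

degree = np.zeros(n_nodes, dtype=float)
edges = []

# Start from a small seed clique
for i in range(m_per_node + 1):
    for j in range(i):
        edges.append((i, j))
        degree[i] += 1
        degree[j] += 1

for new in range(m_per_node + 1, n_nodes):
    weights = degree[:new] ** alpha                  # nonlinear preferential attachment
    probs = weights / weights.sum()
    targets = rng.choice(new, size=m_per_node, replace=False, p=probs)
    for t in targets:
        edges.append((new, int(t)))
        degree[new] += 1
        degree[t] += 1

# Empirical degree distribution (the quantity a calibration would target)
counts = Counter(degree.astype(int))
for k in sorted(counts)[:10]:
    print(f"degree {k}: {counts[k] / n_nodes:.3f}")
```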
Extending existing structural identifiability analysis methods to mixed-effects models.
Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D
2018-01-01
The concept of structural identifiability for state-space models is expanded to cover mixed-effects state-space models. Two methods applicable for the analytical study of the structural identifiability of mixed-effects models are presented. The two methods are based on previously established techniques for non-mixed-effects models; namely the Taylor series expansion and the input-output form approach. By generating an exhaustive summary, and by assuming an infinite number of subjects, functions of random variables can be derived which in turn determine the distribution of the system's observation function(s). By considering the uniqueness of the analytical statistical moments of the derived functions of the random variables, the structural identifiability of the corresponding mixed-effects model can be determined. The two methods are applied to a set of examples of mixed-effects models to illustrate how they work in practice. Copyright © 2017 Elsevier Inc. All rights reserved.
Small, J R
1993-01-01
This paper is a study into the effects of experimental error on the estimated values of flux control coefficients obtained using specific inhibitors. Two possible techniques for analysing the experimental data are compared: a simple extrapolation method (the so-called graph method) and a non-linear function fitting method. For these techniques, the sources of systematic errors are identified and the effects of systematic and random errors are quantified, using both statistical analysis and numerical computation. It is shown that the graph method is very sensitive to random errors and, under all conditions studied, that the fitting method, even under conditions where the assumptions underlying the fitted function do not hold, outperformed the graph method. Possible ways of designing experiments to minimize the effects of experimental errors are analysed and discussed. PMID:8257434
Simulation of the Effects of Random Measurement Errors
ERIC Educational Resources Information Center
Kinsella, I. A.; Hannaidh, P. B. O.
1978-01-01
Describes a simulation method for measurement of errors that requires calculators and tables of random digits. Each student simulates the random behaviour of the component variables in the function and by combining the results of all students, the outline of the sampling distribution of the function can be obtained. (GA)
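A computer analogue of this classroom exercise might look like the sketch below: each simulated "student" draws randomly perturbed component variables, evaluates the function (here, hypothetically, a resistance R = V/I), and the pooled results trace out its sampling distribution. The error sizes are made up.

```python
# Monte Carlo analogue of the random-digit-table exercise: propagate random
# measurement errors through a function and inspect the resulting spread.
import numpy as np

rng = np.random.default_rng(3)
n_students = 10000

# Component measurements with independent random errors (illustrative values)
voltage = rng.normal(loc=12.0, scale=0.2, size=n_students)   # volts
current = rng.normal(loc=2.5, scale=0.05, size=n_students)   # amperes

resistance = voltage / current     # the derived quantity R = V / I

print(f"mean R = {resistance.mean():.3f} ohm, "
      f"std of R = {resistance.std(ddof=1):.3f} ohm")
print(f"first-order propagation estimate = "
      f"{np.sqrt((0.2/12.0)**2 + (0.05/2.5)**2) * 12.0/2.5:.3f} ohm")
```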
NASA Astrophysics Data System (ADS)
Zhao, Leihong; Qu, Xiaolu; Lin, Hongjun; Yu, Genying; Liao, Bao-Qiang
2018-03-01
Simulation of randomly rough bioparticle surfaces is crucial to better understand and control interface behaviors and membrane fouling. A survey of the literature indicated a lack of effective methods for simulating random rough bioparticle surfaces. In this study, a new method that combines a Gaussian distribution, Fourier transform, the spectrum method, and coordinate transformation was proposed to simulate the surface topography of foulant bioparticles in a membrane bioreactor (MBR). The natural surface of a foulant bioparticle was found to be irregular and randomly rough. The topography simulated by the new method was quite similar to that of real foulant bioparticles. Moreover, the simulated topography of foulant bioparticles was critically affected by the correlation length (l) and root-mean-square roughness (σ) parameters. The new method proposed in this study shows notable superiority over conventional methods for the simulation of randomly rough foulant bioparticles. The ease, facility, and fitness of the new method point towards potential applications in research on interface behaviors and membrane fouling.
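A generic spectral-method sketch in the same spirit, generating a Gaussian randomly rough surface with prescribed RMS height σ and correlation length l, is shown below; it is not the authors' exact algorithm (their coordinate transformation step is omitted), and the correlation-length convention and all numbers are assumptions.

```python
# Generic spectral-method sketch: filter white Gaussian noise with a Gaussian
# spectral filter via the FFT, then rescale to the target RMS roughness.
import numpy as np

rng = np.random.default_rng(11)
N, dx = 256, 0.02          # grid points per side, grid spacing (um)
sigma, l = 0.05, 0.4       # RMS roughness and correlation length (um), assumed

# Wavenumber grid
k = 2.0 * np.pi * np.fft.fftfreq(N, d=dx)
kx, ky = np.meshgrid(k, k, indexing="ij")
k2 = kx**2 + ky**2

# Filter white Gaussian noise with a Gaussian spectral filter, then rescale
noise = rng.standard_normal((N, N))
filt = np.exp(-k2 * l**2 / 8.0)                       # Gaussian autocorrelation shape
surface = np.real(np.fft.ifft2(np.fft.fft2(noise) * filt))
surface *= sigma / surface.std()                      # enforce the target RMS

print(f"generated {N}x{N} surface: RMS = {surface.std():.4f} um, "
      f"height range = [{surface.min():.3f}, {surface.max():.3f}] um")
```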
Computer-Assisted Dieting: Effects of a Randomized Nutrition Intervention
ERIC Educational Resources Information Center
Schroder, Kerstin E. E.
2011-01-01
Objectives: To compare the effects of a computer-assisted dieting intervention (CAD) with and without self-management training on dieting among 55 overweight and obese adults. Methods: Random assignment to a single-session nutrition intervention (CAD-only) or a combined CAD plus self-management group intervention (CADG). Dependent variables were…
Liu, Xiaolei; Huang, Meng; Fan, Bin; Buckler, Edward S.; Zhang, Zhiwu
2016-01-01
False positives in a genome-wide association study (GWAS) can be effectively controlled by a fixed-effect and random-effect mixed linear model (MLM) that incorporates population structure and kinship among individuals to adjust association tests on markers; however, the adjustment also compromises true positives. The modified MLM method, the Multiple Loci Linear Mixed Model (MLMM), incorporates multiple markers simultaneously as covariates in a stepwise MLM to partially remove the confounding between testing markers and kinship. To completely eliminate the confounding, we divided MLMM into two parts, a Fixed Effect Model (FEM) and a Random Effect Model (REM), and use them iteratively. FEM tests markers one at a time, with multiple associated markers as covariates to control false positives. To avoid the model over-fitting problem in FEM, the associated markers are estimated in REM by using them to define kinship. The P values of the testing markers and the associated markers are unified at each iteration. We named the new method Fixed and random model Circulating Probability Unification (FarmCPU). Both real and simulated data analyses demonstrated that FarmCPU improves statistical power compared with current methods. Additional benefits include computing time that is linear in both the number of individuals and the number of markers: a dataset with half a million individuals and half a million markers can now be analyzed within three days. PMID:26828793
An analysis of random projection for changeable and privacy-preserving biometric verification.
Wang, Yongjin; Plataniotis, Konstantinos N
2010-10-01
Changeability and privacy protection are important factors for widespread deployment of biometrics-based verification systems. This paper presents a systematic analysis of a random-projection (RP)-based method for addressing these problems. The employed method transforms biometric data using a random matrix with each entry an independent and identically distributed Gaussian random variable. The similarity- and privacy-preserving properties, as well as the changeability of the biometric information in the transformed domain, are analyzed in detail. Specifically, RP on both high-dimensional image vectors and dimensionality-reduced feature vectors is discussed and compared. A vector translation method is proposed to improve the changeability of the generated templates. The feasibility of the introduced solution is well supported by detailed theoretical analyses. Extensive experimentation on a face-based biometric verification problem shows the effectiveness of the proposed method.
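The core random-projection step can be sketched as follows; this is only the projection and a distance-preservation check, not the paper's full analysis or its vector translation method. The feature vectors are synthetic.

```python
# Random projection with an i.i.d. Gaussian matrix: pairwise distances are
# approximately preserved, while re-drawing the matrix yields a different
# ("changeable") template. Feature vectors are synthetic.
import numpy as np

rng = np.random.default_rng(5)
d, m, n = 1024, 128, 50                      # original dim, projected dim, samples
X = rng.standard_normal((n, d))              # synthetic biometric feature vectors

def random_projection(X, m, seed):
    R = np.random.default_rng(seed).standard_normal((X.shape[1], m)) / np.sqrt(m)
    return X @ R

Y1 = random_projection(X, m, seed=101)       # template under key/seed 101
Y2 = random_projection(X, m, seed=202)       # re-issued template under a new seed

def pairwise_dists(Z):
    diff = Z[:, None, :] - Z[None, :, :]
    return np.sqrt((diff**2).sum(-1))[np.triu_indices(len(Z), k=1)]

ratio = pairwise_dists(Y1) / pairwise_dists(X)
print(f"distance ratio after projection: mean {ratio.mean():.3f}, std {ratio.std():.3f}")
print(f"correlation between the two templates: "
      f"{np.corrcoef(Y1.ravel(), Y2.ravel())[0, 1]:.3f}")
```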
Cost effectiveness of recruitment methods in an obesity prevention trial for young children.
Robinson, Jodie L; Fuerch, Janene H; Winiewicz, Dana D; Salvy, Sarah J; Roemmich, James N; Epstein, Leonard H
2007-06-01
Recruitment of participants for clinical trials requires considerable effort and cost. There is no research on the cost effectiveness of recruitment methods for an obesity prevention trial of young children. This study determined the cost effectiveness of recruiting 70 families with a child aged 4 to 7 (5.9+/-1.3) years in Western New York from February 2003 to November 2004, for a 2-year randomized obesity prevention trial to reduce television watching in the home. Of the 70 randomized families, 65.7% (n=46) were obtained through direct mailings, 24.3% (n=17) were acquired through newspaper advertisements, 7.1% (n=5) from other sources (e.g., word of mouth), and 2.9% (n=2) through posters and brochures. Costs of each recruitment method were computed by adding the cost of materials, staff time, and media expenses. Cost effectiveness (money spent per randomized participant) was US $0 for other sources, US $227.76 for direct mailing, US $546.95 for newspaper ads, and US $3,020.84 for posters and brochures. Of the methods with associated costs, direct mailing was the most cost effective in recruiting families with young children, which supports the growing literature of the effectiveness of direct mailing.
Austin, Peter C
2014-03-30
Propensity score methods are increasingly being used to estimate causal treatment effects in observational studies. In medical and epidemiological studies, outcomes are frequently time-to-event in nature. Propensity-score methods are often applied incorrectly when estimating the effect of treatment on time-to-event outcomes. This article describes how two different propensity score methods (matching and inverse probability of treatment weighting) can be used to estimate the measures of effect that are frequently reported in randomized controlled trials: (i) marginal survival curves, which describe survival in the population if all subjects were treated or if all subjects were untreated; and (ii) marginal hazard ratios. The use of these propensity score methods allows one to replicate the measures of effect that are commonly reported in randomized controlled trials with time-to-event outcomes: both absolute and relative reductions in the probability of an event occurring can be determined. We also provide guidance on variable selection for the propensity score model, highlight methods for assessing the balance of baseline covariates between treated and untreated subjects, and describe the implementation of a sensitivity analysis to assess the effect of unmeasured confounding variables on the estimated treatment effect when outcomes are time-to-event in nature. The methods in the paper are illustrated by estimating the effect of discharge statin prescribing on the risk of death in a sample of patients hospitalized with acute myocardial infarction. In this tutorial article, we describe and illustrate all the steps necessary to conduct a comprehensive analysis of the effect of treatment on time-to-event outcomes. © 2013 The authors. Statistics in Medicine published by John Wiley & Sons, Ltd.
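A simplified sketch of the weighting half of this workflow on synthetic data is given below: fit a propensity model, form plain (unstabilized) inverse-probability-of-treatment weights, and compute weighted Kaplan-Meier marginal survival curves. It omits matching, weight stabilization, and robust variance estimation, and is not the article's tutorial code.

```python
# IPTW sketch on synthetic data: propensity model, plain IPT weights, and
# weighted Kaplan-Meier marginal survival curves for each treatment arm.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(8)
n = 2000
age = rng.normal(65, 10, n)
sev = rng.normal(0, 1, n)                                   # illness severity
p_treat = 1 / (1 + np.exp(-(-0.03 * (age - 65) + 0.5 * sev)))
treat = rng.binomial(1, p_treat)
rate = 0.05 * np.exp(0.4 * sev - 0.7 * treat)               # treatment lowers hazard
time = rng.exponential(1 / rate)
event = (time < 10).astype(int)
time = np.minimum(time, 10)                                 # administrative censoring

# Propensity scores and inverse-probability-of-treatment weights
X_ps = np.column_stack([age - 65, sev])
ps = LogisticRegression().fit(X_ps, treat).predict_proba(X_ps)[:, 1]
w = np.where(treat == 1, 1 / ps, 1 / (1 - ps))

def weighted_km(t, e, wts):
    """Weighted Kaplan-Meier estimate at the observed times."""
    order = np.argsort(t)
    t, e, wts = t[order], e[order], wts[order]
    surv, s, at_risk = [], 1.0, wts.sum()
    for ti, ei, wi in zip(t, e, wts):
        if ei:
            s *= 1 - wi / at_risk
        at_risk -= wi
        surv.append(s)
    return t, np.array(surv)

for arm in (0, 1):
    m = treat == arm
    t_arm, s_arm = weighted_km(time[m], event[m], w[m])
    idx = max(np.searchsorted(t_arm, 5.0) - 1, 0)
    print(f"arm {arm}: weighted survival at t = 5 is about {s_arm[idx]:.3f}")
```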
ERIC Educational Resources Information Center
Eisenkopf, Gerald; Sulser, Pascal A.
2016-01-01
The authors present results from a comprehensive field experiment at Swiss high schools in which they compare the effectiveness of teaching methods in economics. They randomly assigned classes into an experimental and a conventional teaching group, or a control group that received no specific instruction. Both teaching treatments improve economic…
Economic Intervention and Parenting: A Randomized Experiment of Statewide Child Development Accounts
ERIC Educational Resources Information Center
Nam, Yunju; Wikoff, Nora; Sherraden, Michael
2016-01-01
Objective: We examine the effects of Child Development Accounts (CDAs) on parenting stress and practices. Methods: We use data from the SEED for Oklahoma Kids (SEED OK) experiment. SEED OK selected caregivers of infants from Oklahoma birth certificates using a probability sampling method, randomly assigned caregivers to the treatment (n = 1,132)…
ERIC Educational Resources Information Center
Modebelu, M. N.; Ogbonna, C. C.
2014-01-01
This study aimed at determining the effect of reform-based instructional method learning styles on students' achievement and retention in mathematics. A sample of 119 students was randomly selected. A quasi-experimental design comprising pre-test, post-test, and a randomized control group was employed. The Collin Rose learning styles…
Saville, Benjamin R.; Herring, Amy H.; Kaufman, Jay S.
2013-01-01
Racial/ethnic disparities in birthweight are a large source of differential morbidity and mortality worldwide and have remained largely unexplained in epidemiologic models. We assess the impact of maternal ancestry and census tract residence on infant birth weights in New York City and the modifying effects of race and nativity by incorporating random effects in a multilevel linear model. Evaluating the significance of these predictors involves the test of whether the variances of the random effects are equal to zero. This is problematic because the null hypothesis lies on the boundary of the parameter space. We generalize an approach for assessing random effects in the two-level linear model to a broader class of multilevel linear models by scaling the random effects to the residual variance and introducing parameters that control the relative contribution of the random effects. After integrating over the random effects and variance components, the resulting integrals needed to calculate the Bayes factor can be efficiently approximated with Laplace’s method. PMID:24082430
Correction of confounding bias in non-randomized studies by appropriate weighting.
Schmoor, Claudia; Gall, Christine; Stampf, Susanne; Graf, Erika
2011-03-01
In non-randomized studies, the assessment of a causal effect of treatment or exposure on outcome is hampered by possible confounding. Applying multiple regression models including the effects of treatment and covariates on outcome is the well-known classical approach to adjust for confounding. In recent years other approaches have been promoted. One of them is based on the propensity score and considers the effect of possible confounders on treatment as a relevant criterion for adjustment. Another proposal is based on using an instrumental variable. Here inference relies on a factor, the instrument, which affects treatment but is thought to be otherwise unrelated to outcome, so that it mimics randomization. Each of these approaches can basically be interpreted as a simple reweighting scheme, designed to address confounding. The procedures will be compared with respect to their fundamental properties, namely, which bias they aim to eliminate, which effect they aim to estimate, and which parameter is modelled. We will expand our overview of methods for analysis of non-randomized studies to methods for analysis of randomized controlled trials and show that analyses of both study types may target different effects and different parameters. The considerations will be illustrated using a breast cancer study with a so-called Comprehensive Cohort Study design, including a randomized controlled trial and a non-randomized study in the same patient population as sub-cohorts. This design offers ideal opportunities to discuss and illustrate the properties of the different approaches. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Jackson, Dan; White, Ian R; Riley, Richard D
2013-01-01
Multivariate meta-analysis is becoming more commonly used. Methods for fitting the multivariate random effects model include maximum likelihood, restricted maximum likelihood, Bayesian estimation and multivariate generalisations of the standard univariate method of moments. Here, we provide a new multivariate method of moments for estimating the between-study covariance matrix with the properties that (1) it allows for either complete or incomplete outcomes and (2) it allows for covariates through meta-regression. Further, for complete data, it is invariant to linear transformations. Our method reduces to the usual univariate method of moments, proposed by DerSimonian and Laird, in a single dimension. We illustrate our method and compare it with some of the alternatives using a simulation study and a real example. PMID:23401213
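Since the proposed estimator reduces to the DerSimonian and Laird method of moments in one dimension, the sketch below implements that univariate special case only (not the authors' multivariate estimator), on hypothetical data.

```python
# Univariate DerSimonian-Laird method of moments (the one-dimensional special
# case the multivariate estimator reduces to). Data are hypothetical.
import numpy as np

yi = np.array([0.10, 0.32, 0.25, -0.05, 0.41, 0.18])   # study effect estimates
vi = np.array([0.03, 0.06, 0.02, 0.04, 0.08, 0.05])    # within-study variances

w = 1.0 / vi
mu_fe = np.sum(w * yi) / np.sum(w)                      # fixed-effect estimate
Q = np.sum(w * (yi - mu_fe) ** 2)                       # Cochran's Q statistic
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (Q - (len(yi) - 1)) / c)                # moment estimate of tau^2

w_re = 1.0 / (vi + tau2)
mu_re = np.sum(w_re * yi) / np.sum(w_re)                # random-effects estimate
se_re = np.sqrt(1.0 / np.sum(w_re))
print(f"tau^2 = {tau2:.4f}, pooled effect = {mu_re:.3f} (SE {se_re:.3f})")
```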
Time-variant random interval natural frequency analysis of structures
NASA Astrophysics Data System (ADS)
Wu, Binhua; Wu, Di; Gao, Wei; Song, Chongmin
2018-02-01
This paper presents a new robust method, the unified interval Chebyshev-based random perturbation method, to tackle the hybrid random interval structural natural frequency problem. In the proposed approach, the random perturbation method is implemented to furnish the statistical features (i.e., mean and standard deviation), and a Chebyshev surrogate model strategy is incorporated to formulate the statistical information of the natural frequency with regard to the interval inputs. The comprehensive analysis framework combines the strengths of both methods in a way that dramatically reduces the computational cost. The presented method is thus capable of accurately and efficiently investigating the day-to-day time-variant natural frequency of structures under intrinsic concrete creep with probabilistic and interval uncertain variables. The extreme bounds of the mean and standard deviation of the natural frequency are captured through the optimization strategy embedded within the analysis procedure. Three numerical examples, of progressively increasing complexity in both structure type and uncertainty variables, are presented to demonstrate the computational applicability, accuracy, and efficiency of the proposed method.
ERIC Educational Resources Information Center
Bundy, Anita; Engelen, Lina; Wyver, Shirley; Tranter, Paul; Ragen, Jo; Bauman, Adrian; Baur, Louise; Schiller, Wendy; Simpson, Judy M.; Niehues, Anita N.; Perry, Gabrielle; Jessup, Glenda; Naughton, Geraldine
2017-01-01
Background: We assessed the effectiveness of a simple intervention for increasing children's physical activity, play, perceived competence/social acceptance, and social skills. Methods: A cluster-randomized controlled trial was conducted, in which schools were the clusters. Twelve Sydney (Australia) primary schools were randomly allocated to…
Roggemann, M C; Welsh, B M; Montera, D; Rhoadarmer, T A
1995-07-10
Simulating the effects of atmospheric turbulence on optical imaging systems is an important aspect of understanding the performance of these systems. Simulations are particularly important for understanding the statistics of some adaptive-optics system performance measures, such as the mean and variance of the compensated optical transfer function, and for understanding the statistics of estimators used to reconstruct intensity distributions from turbulence-corrupted image measurements. Current methods of simulating the performance of these systems typically make use of random phase screens placed in the system pupil. Methods exist for making random draws of phase screens that have the correct spatial statistics. However, simulating temporal effects and anisoplanatism requires one or more phase screens at different distances from the aperture, possibly moving with different velocities. We describe and demonstrate a method for creating random draws of phase screens with the correct space-time statistics for arbitrary turbulence and wind-velocity profiles, which can be placed in the telescope pupil in simulations. Results are provided for both the von Kármán and the Kolmogorov turbulence spectra. We also show how to simulate anisoplanatic effects with this technique.
Effects of Selected Meditative Asanas on Kinaesthetic Perception and Speed of Movement
ERIC Educational Resources Information Center
Singh, Kanwaljeet; Bal, Baljinder S.; Deol, Nishan S.
2009-01-01
Study aim: To assess the effects of selected meditative "asanas" on kinesthetic perception and movement speed. Material and methods: Thirty randomly selected male students aged 18-24 years volunteered to participate in the study. They were randomly assigned to two groups: A (meditative) and B (control). The Nelson's movement speed and…
The Effectiveness of Healthy Start Home Visit Program: Cluster Randomized Controlled Trial
ERIC Educational Resources Information Center
Leung, Cynthia; Tsang, Sandra; Heung, Kitty
2015-01-01
Purpose: The study reported the effectiveness of a home visit program for disadvantaged Chinese parents with preschool children, using cluster randomized controlled trial design. Method: Participants included 191 parents and their children from 24 preschools, with 84 dyads (12 preschools) in the intervention group and 107 dyads (12 preschools) in…
ERIC Educational Resources Information Center
Shire, Stephanie Y.; Chang, Ya-Chih; Shih, Wendy; Bracaglia, Suzanne; Kodjoe, Maria; Kasari, Connie
2017-01-01
Background: Interventions found to be effective in research settings are often not as effective when implemented in community settings. Considering children with autism, studies have rarely examined the efficacy of laboratory-tested interventions on child outcomes in community settings using randomized controlled designs. Methods: One hundred and…
ERIC Educational Resources Information Center
Clarke, Gregory; DeBar, Lynn; Lynch, Frances; Powell, James; Gale, John; O'Connor, Elizabeth; Ludman, Evette; Bush, Terry; Lin, Elizabeth H. B.; Von Korff, Michael; Hertert, Stephanie
2005-01-01
Objective: To test a collaborative-care, cognitive-behavioral therapy (CBT) program adjunctive to selective serotonin reuptake inhibitor (SSRI) treatment in HMO pediatric primary care. Method: A randomized effectiveness trial comparing a treatment-as-usual (TAU) control condition consisting primarily of SSRI medication delivered outside the…
Performance of Random Effects Model Estimators under Complex Sampling Designs
ERIC Educational Resources Information Center
Jia, Yue; Stokes, Lynne; Harris, Ian; Wang, Yan
2011-01-01
In this article, we consider estimation of parameters of random effects models from samples collected via complex multistage designs. Incorporation of sampling weights is one way to reduce estimation bias due to unequal probabilities of selection. Several weighting methods have been proposed in the literature for estimating the parameters of…
ERIC Educational Resources Information Center
Lincoln, Tania M.; Ziegler, Michael; Mehl, Stephanie; Kesting, Marie-Luise; Lullmann, Eva; Westermann, Stefan; Rief, Winfried
2012-01-01
Objective: Randomized controlled trials have attested the efficacy of cognitive behavioral therapy (CBT) in reducing psychotic symptoms. Now, studies are needed to investigate its effectiveness in routine clinical practice settings. Method: Eighty patients with schizophrenia spectrum disorders who were seeking outpatient treatment were randomized…
ERIC Educational Resources Information Center
Middleton, Kathryn R.; Perri, Michael G.
2014-01-01
Objective: The current study was a randomized controlled trial investigating the effect of an innovative, short-term lifestyle intervention on weight gain in female freshman college students. Participants: Ninety-five freshmen were recruited from a large public university in the United States. Methods: Participants completed baseline assessments…
Sun, Jin; Xu, Xiaosu; Liu, Yiting; Zhang, Tao; Li, Yao
2016-07-12
In order to reduce the influence of fiber optic gyroscope (FOG) random drift error on inertial navigation systems, an improved autoregressive (AR) model is put forward in this paper. First, based on real-time observations at each restart of the gyroscope, the model of FOG random drift can be established online. In the improved AR model, the measured FOG signal is employed instead of zero-mean signals. Then, the modified Sage-Husa adaptive Kalman filter (SHAKF) is introduced, which can directly carry out real-time filtering on the FOG signals. Finally, static and dynamic experiments are conducted to verify the effectiveness, and the filtering results are analyzed with Allan variance. The analysis shows that the improved AR model has high fitting accuracy and strong adaptability, with a minimum fitting accuracy for a single noise term of 93.2%. Based on the improved AR(3) model, the SHAKF denoising method is more effective than traditional methods, improving on them by more than 30%. The random drift error of the FOG is reduced effectively, and the precision of the FOG is improved.
Paule‐Mandel estimators for network meta‐analysis with random inconsistency effects
Veroniki, Areti Angeliki; Law, Martin; Tricco, Andrea C.; Baker, Rose
2017-01-01
Network meta-analysis is used to simultaneously compare multiple treatments in a single analysis. However, network meta-analyses may exhibit inconsistency, where direct and different forms of indirect evidence are not in agreement with each other, even after allowing for between-study heterogeneity. Models for network meta-analysis with random inconsistency effects have the dual aim of allowing for inconsistencies and estimating average treatment effects across the whole network. To date, two classical estimation methods for fitting this type of model have been developed: a method of moments that extends DerSimonian and Laird's univariate method and maximum likelihood estimation. However, the Paule and Mandel estimator is another recommended classical estimation method for univariate meta-analysis. In this paper, we extend the Paule and Mandel method so that it can be used to fit models for network meta-analysis with random inconsistency effects. We apply all three estimation methods to a variety of examples that have been used previously and we also examine a challenging new dataset that is highly heterogeneous. We perform a simulation study based on this new example. We find that the proposed Paule and Mandel method performs satisfactorily and generally better than the previously proposed method of moments because it provides more accurate inferences. Furthermore, the Paule and Mandel method possesses some advantages over likelihood-based methods because it is both semiparametric and requires no convergence diagnostics. Although restricted maximum likelihood estimation remains the gold standard, the proposed methodology is a fully viable alternative to this and other estimation methods. PMID:28585257
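In the univariate case referred to above, the Paule and Mandel estimator chooses the heterogeneity variance so that the generalized Q statistic equals its degrees of freedom, k - 1. The sketch below shows that univariate idea only (solved here by simple bisection); it is our illustration, not the network extension proposed in the paper.

```python
import numpy as np

def paule_mandel(y, v, tol=1e-8):
    """Univariate Paule-Mandel estimator: find tau2 such that
    Q(tau2) = sum(w_i * (y_i - mu_hat)^2) = k - 1, with w_i = 1/(v_i + tau2)."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    k = len(y)

    def Q(tau2):
        w = 1.0 / (v + tau2)
        mu = np.sum(w * y) / np.sum(w)
        return np.sum(w * (y - mu) ** 2)

    if Q(0.0) <= k - 1:                 # no excess heterogeneity
        return 0.0
    lo, hi = 0.0, 10.0 * np.var(y)      # Q is decreasing in tau2
    while Q(hi) > k - 1:
        hi *= 2.0
    while hi - lo > tol:                # bisection
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if Q(mid) > k - 1 else (lo, mid)
    return 0.5 * (lo + hi)

print(paule_mandel([0.2, 0.5, -0.1, 0.4], [0.04, 0.09, 0.05, 0.02]))
```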
Physical Principle for Generation of Randomness
NASA Technical Reports Server (NTRS)
Zak, Michail
2009-01-01
A physical principle (more precisely, a principle that incorporates mathematical models used in physics) has been conceived as the basis of a method of generating randomness in Monte Carlo simulations. The principle eliminates the need for conventional random-number generators. The Monte Carlo simulation method is among the most powerful computational methods for solving high-dimensional problems in physics, chemistry, economics, and information processing. The Monte Carlo simulation method is especially effective for solving problems in which computational complexity increases exponentially with dimensionality. The main advantage of the Monte Carlo simulation method over other methods is that the demand on computational resources becomes independent of dimensionality. As augmented by the present principle, the Monte Carlo simulation method becomes an even more powerful computational method that is especially useful for solving problems associated with dynamics of fluids, planning, scheduling, and combinatorial optimization. The present principle is based on coupling of dynamical equations with the corresponding Liouville equation. The randomness is generated by non-Lipschitz instability of dynamics triggered and controlled by feedback from the Liouville equation. (In non-Lipschitz dynamics, the derivatives of solutions of the dynamical equations are not required to be bounded.)
Random domain name and address mutation (RDAM) for thwarting reconnaissance attacks
Chen, Xi; Zhu, Yuefei
2017-01-01
Network address shuffling is a novel moving target defense (MTD) that invalidates the address information collected by the attacker by dynamically changing or remapping the host’s network addresses. However, most network address shuffling methods are constrained by the limited address space and rely on the host’s static domain name to map to its dynamic address; therefore these methods cannot effectively defend against random scanning attacks, and cannot defend against an attacker who knows the target’s domain name. In this paper, we propose a network defense method based on random domain name and address mutation (RDAM), which increases the scanning space of the attacker through a dynamic domain name method and reduces the probability that a host will be hit by an attacker scanning IP addresses using the domain name system (DNS) query list and the time window methods. Theoretical analysis and experimental results show that RDAM can defend against scanning attacks and worm propagation more effectively than general network address shuffling methods, while introducing an acceptable operational overhead. PMID:28489910
Mazidi, Mohsen; Karimi, Ehsan; Rezaie, Peyman; Ferns, Gordon A
2017-07-01
To undertake a systematic review and meta-analysis of randomized controlled trials of the effect of glucagon-like peptide-1 receptor agonist (GLP-1 RAs) therapy on serum C-reactive protein (CRP) concentrations. PubMed-Medline, SCOPUS, Web of Science and Google Scholar databases were searched for the period up until March 16, 2016. Prospective studies evaluating the impact of GLP-1 RAs on serum CRP were identified. A random effects model (using the DerSimonian-Laird method) and generic inverse variance methods were used for quantitative data synthesis. Sensitivity analysis was conducted using the leave-one-out method. Heterogeneity was quantitatively assessed using the I² index. Random effects meta-regression was performed using the unrestricted maximum likelihood method to evaluate the impact of potential moderators. International Prospective Register for Systematic Reviews (PROSPERO) number CRD42016036868. Meta-analysis of the data from 7 treatment arms revealed a significant reduction in serum CRP concentrations following treatment with GLP-1 RAs (WMD -2.14 (mg/dL), 95% CI -3.51, -0.78, P=0.002; I² = 96.1%). Removal of one study in the meta-analysis did not change the result in the sensitivity analysis (WMD -2.14 (mg/dL), 95% CI -3.51, -0.78, P=0.002; I² = 96.1%), indicating that our results could not be solely attributed to the effect of a single study. Random effects meta-regression was performed to evaluate the impact of potential moderators on the estimated effect size. Changes in serum CRP concentration were associated with the duration of treatment (slope -0.097, 95% CI -0.158, -0.042, P<0.001). This meta-analysis suggests that GLP-1 RAs therapy causes a significant reduction in CRP. Copyright © 2016 Elsevier Inc. All rights reserved.
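The random-effects meta-regression step described above amounts to a weighted regression of effect sizes on the moderator with weights 1/(v_i + tau^2). The sketch below uses invented numbers, not the study data, and treats tau^2 as known, whereas the study estimated it by unrestricted maximum likelihood.

```python
import numpy as np

# Hypothetical effect sizes (WMD in CRP), their variances and treatment duration (weeks)
y = np.array([-1.1, -2.0, -3.4, -0.6, -2.8])
v = np.array([0.30, 0.25, 0.50, 0.20, 0.40])
duration = np.array([12, 16, 26, 8, 24])

tau2 = 0.5                                   # residual heterogeneity, assumed known here
w = 1.0 / (v + tau2)                         # random-effects weights
X = np.column_stack([np.ones_like(y), duration])

# Weighted least squares: beta = (X' W X)^{-1} X' W y
W = np.diag(w)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print("intercept and slope per week of treatment:", beta)
```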
Dahabreh, Issa J.; Sheldrick, Radley C.; Paulus, Jessica K.; Chung, Mei; Varvarigou, Vasileia; Jafri, Haseeb; Rassen, Jeremy A.; Trikalinos, Thomas A.; Kitsios, Georgios D.
2012-01-01
Aims: Randomized controlled trials (RCTs) are the gold standard for assessing the efficacy of therapeutic interventions because randomization protects from biases inherent in observational studies. Propensity score (PS) methods, proposed as a potential solution to confounding of the treatment–outcome association, are widely used in observational studies of therapeutic interventions for acute coronary syndromes (ACS). We aimed to systematically assess agreement between observational studies using PS methods and RCTs on therapeutic interventions for ACS. Methods and results: We searched for observational studies of interventions for ACS that used PS methods to estimate treatment effects on short- or long-term mortality. Using a standardized algorithm, we matched observational studies to RCTs based on patients’ characteristics, interventions, and outcomes (‘topics’), and we compared estimates of treatment effect between the two designs. When multiple observational studies or RCTs were identified for the same topic, we performed a meta-analysis and used the summary relative risk for comparisons. We matched 21 observational studies investigating 17 distinct clinical topics to 63 RCTs (median = 3 RCTs per observational study) for short-term (7 topics) and long-term (10 topics) mortality. Estimates from PS analyses differed statistically significantly from randomized evidence in two instances; however, observational studies reported more extreme beneficial treatment effects compared with RCTs in 13 of 17 instances (P = 0.049). Sensitivity analyses limited to large RCTs, and using alternative meta-analysis models yielded similar results. Conclusion: For the treatment of ACS, observational studies using PS methods produce treatment effect estimates that are of more extreme magnitude compared with those from RCTs, although the differences are rarely statistically significant. PMID:22711757
Genomic-Enabled Prediction Kernel Models with Random Intercepts for Multi-environment Trials.
Cuevas, Jaime; Granato, Italo; Fritsche-Neto, Roberto; Montesinos-Lopez, Osval A; Burgueño, Juan; Bandeira E Sousa, Massaine; Crossa, José
2018-03-28
In this study, we compared the prediction accuracy of the main genotypic effect model (MM) without G×E interactions, the multi-environment single variance G×E deviation model (MDs), and the multi-environment environment-specific variance G×E deviation model (MDe) where the random genetic effects of the lines are modeled with the markers (or pedigree). With the objective of further modeling the genetic residual of the lines, we incorporated the random intercepts of the lines (l) and generated another three models. Each of these 6 models was fitted with a linear kernel method (Genomic Best Linear Unbiased Predictor, GB) and a Gaussian Kernel (GK) method. We compared these 12 model-method combinations with another two multi-environment G×E interaction models with unstructured variance-covariances (MUC) using GB and GK kernels (4 model-method combinations). Thus, we compared the genomic-enabled prediction accuracy of a total of 16 model-method combinations on two maize data sets with positive phenotypic correlations among environments, and on two wheat data sets with complex G×E that includes some negative and close to zero phenotypic correlations among environments. The two models (MDs and MDe with the random intercept of the lines and the GK method) were computationally efficient and gave high prediction accuracy in the two maize data sets. Regarding the more complex G×E wheat data sets, the prediction accuracy of the model-method combination with G×E, MDs and MDe, including the random intercepts of the lines with the GK method had important savings in computing time as compared with the G×E interaction multi-environment models with unstructured variance-covariances but with lower genomic prediction accuracy. Copyright © 2018 Cuevas et al.
Weighted re-randomization tests for minimization with unbalanced allocation.
Han, Baoguang; Yu, Menggang; McEntegart, Damian
2013-01-01
The re-randomization test has been considered a robust alternative to the traditional population model-based methods for analyzing randomized clinical trials. This is especially so when the clinical trials are randomized according to minimization, which is a popular covariate-adaptive randomization method for ensuring balance among prognostic factors. Among various re-randomization tests, fixed-entry-order re-randomization is advocated as an effective strategy when a temporal trend is suspected. Yet when minimization is applied to trials with unequal allocation, the fixed-entry-order re-randomization test is biased and thus compromised in power. We find that the bias is due to non-uniform re-allocation probabilities incurred by the re-randomization in this case. We therefore propose a weighted fixed-entry-order re-randomization test to overcome the bias. The performance of the new test was investigated in simulation studies that mimic the settings of a real clinical trial. The weighted re-randomization test was found to work well in the scenarios investigated, including the presence of a strong temporal trend. Copyright © 2013 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Crevillén-García, D.; Power, H.
2017-08-01
In this study, we apply four Monte Carlo simulation methods, namely, Monte Carlo, quasi-Monte Carlo, multilevel Monte Carlo and multilevel quasi-Monte Carlo to the problem of uncertainty quantification in the estimation of the average travel time during the transport of particles through random heterogeneous porous media. We apply the four methodologies to a model problem where the only input parameter, the hydraulic conductivity, is modelled as a log-Gaussian random field by using direct Karhunen-Loève decompositions. The random terms in such expansions represent the coefficients in the equations. Numerical calculations demonstrating the effectiveness of each of the methods are presented. A comparison of the computational cost incurred by each of the methods for three different tolerances is provided. The accuracy of the approaches is quantified via the mean square error.
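The multilevel Monte Carlo idea can be shown generically: the expectation at the finest level is written as a telescoping sum of level corrections, each estimated from coupled draws of the same random input, with many cheap samples at coarse levels and few expensive ones at fine levels. The toy surrogate below only illustrates that structure; it is not the flow solver, Karhunen-Loève field or travel-time functional of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def travel_time(xi, level):
    """Toy surrogate: integral of 1/K over [0,1], with log-conductivity from a
    truncated sine expansion and a midpoint rule refined with the level."""
    n = 2 ** (level + 3)
    x = (np.arange(n) + 0.5) / n
    g = sum(z / (j + 1) * np.sin((j + 1) * np.pi * x) for j, z in enumerate(xi))
    return np.mean(np.exp(-g))               # 1/K = exp(-g)

def mlmc(levels, samples_per_level, dim=8):
    """E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}], corrections from coupled draws."""
    total = 0.0
    for level, n_samp in zip(range(levels + 1), samples_per_level):
        acc = 0.0
        for _ in range(n_samp):
            xi = rng.normal(size=dim)         # same input drives both resolutions
            fine = travel_time(xi, level)
            coarse = travel_time(xi, level - 1) if level > 0 else 0.0
            acc += fine - coarse
        total += acc / n_samp
    return total

print(mlmc(levels=3, samples_per_level=[4000, 1000, 250, 60]))
```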
Effects of multiple spreaders in community networks
NASA Astrophysics Data System (ADS)
Hu, Zhao-Long; Ren, Zhuo-Ming; Yang, Guang-Yong; Liu, Jian-Guo
2014-12-01
Human contact networks exhibit community structure. Understanding how such community structure affects epidemic spreading could provide insights for preventing the spreading of epidemics between communities. In this paper, we explore the spreading of multiple spreaders in community networks. A network based on the clustering preferential mechanism is evolved, whose communities are detected by the Girvan-Newman (GN) algorithm. We investigate the spreading effectiveness by selecting nodes as spreaders in the following ways: the nodes with the largest degree in each community (community hubs), the same number of nodes with the largest degree from the global network (global large-degree), and one randomly selected node within each community (community random). The experimental results on the SIR model show that the spreading effectiveness of the global large-degree and community hubs methods is the same in the early stage of the infection, and that the community random method is the worst. However, when the infection rate exceeds the critical value, the global large-degree method exhibits the worst spreading effectiveness. Furthermore, the discrepancy in effectiveness among the three methods decreases as the infection rate increases. Therefore, we should immunize the hubs in each community rather than the hubs of the global network to prevent the outbreak of epidemics.
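A self-contained toy version of this comparison can be sketched with a simple two-community block model and a discrete-time SIR process; this is our own simplified illustration, not the clustering-preferential network, GN community detection or parameter settings used in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy two-community network: dense within communities, sparse between them.
n, half = 200, 100
comm = np.array([0] * half + [1] * half)
prob = np.where(comm[:, None] == comm[None, :], 0.08, 0.005)
adj = np.triu(rng.random((n, n)) < prob, k=1)
adj = adj | adj.T
degree = adj.sum(axis=1)

def spreaders(method):
    if method == "community_hubs":        # largest degree within each community
        return [np.flatnonzero(comm == c)[np.argmax(degree[comm == c])] for c in (0, 1)]
    if method == "global_large_degree":   # same number of nodes, largest global degree
        return list(np.argsort(degree)[-2:])
    return [rng.choice(np.flatnonzero(comm == c)) for c in (0, 1)]  # community random

def sir(seeds, beta=0.15, steps=30):
    """Discrete-time SIR (recovery after one step); returns final recovered fraction."""
    state = np.zeros(n, dtype=int)        # 0 = S, 1 = I, 2 = R
    state[list(seeds)] = 1
    for _ in range(steps):
        infected = np.flatnonzero(state == 1)
        if infected.size == 0:
            break
        exposure = adj[:, infected].sum(axis=1)
        new_inf = (state == 0) & (rng.random(n) < 1 - (1 - beta) ** exposure)
        state[infected] = 2
        state[new_inf] = 1
    return np.mean(state == 2)

for m in ("community_hubs", "global_large_degree", "community_random"):
    print(m, round(np.mean([sir(spreaders(m)) for _ in range(50)]), 3))
```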
Random-effects meta-analysis: the number of studies matters.
Guolo, Annamaria; Varin, Cristiano
2017-06-01
This paper investigates the impact of the number of studies on meta-analysis and meta-regression within the random-effects model framework. It is frequently neglected that inference in random-effects models requires a substantial number of studies included in meta-analysis to guarantee reliable conclusions. Several authors warn about the risk of inaccurate results of the traditional DerSimonian and Laird approach especially in the common case of meta-analysis involving a limited number of studies. This paper presents a selection of likelihood and non-likelihood methods for inference in meta-analysis proposed to overcome the limitations of the DerSimonian and Laird procedure, with a focus on the effect of the number of studies. The applicability and the performance of the methods are investigated in terms of Type I error rates and empirical power to detect effects, according to scenarios of practical interest. Simulation studies and applications to real meta-analyses highlight that it is not possible to identify an approach uniformly superior to alternatives. The overall recommendation is to avoid the DerSimonian and Laird method when the number of meta-analysis studies is modest and prefer a more comprehensive procedure that compares alternative inferential approaches. R code for meta-analysis according to all of the inferential methods examined in the paper is provided.
ERIC Educational Resources Information Center
Newton, Nicola C.; Conrod, Patricia J.; Slade, Tim; Carragher, Natacha; Champion, Katrina E.; Barrett, Emma L.; Kelly, Erin V.; Nair, Natasha K.; Stapinski, Lexine; Teesson, Maree
2016-01-01
Background: This study investigated the long-term effectiveness of Preventure, a selective personality-targeted prevention program, in reducing the uptake of alcohol, harmful use of alcohol, and alcohol-related harms over a 3-year period. Methods: A cluster randomized controlled trial was conducted to assess the effectiveness of Preventure.…
ERIC Educational Resources Information Center
Aydin, Burak; Leite, Walter L.; Algina, James
2016-01-01
We investigated methods of including covariates in two-level models for cluster randomized trials to increase power to detect the treatment effect. We compared multilevel models that included either an observed cluster mean or a latent cluster mean as a covariate, as well as the effect of including Level 1 deviation scores in the model. A Monte…
NASA Astrophysics Data System (ADS)
Abid, Najmul; Mirkhalaf, Mohammad; Barthelat, Francois
2018-03-01
Natural materials such as nacre, collagen, and spider silk are composed of staggered stiff and strong inclusions in a softer matrix. This type of hybrid microstructure results in remarkable combinations of stiffness, strength, and toughness and it now inspires novel classes of high-performance composites. However, the analytical and numerical approaches used to predict and optimize the mechanics of staggered composites often neglect statistical variations and inhomogeneities, which may have significant impacts on modulus, strength, and toughness. Here we present an analysis of localization using small representative volume elements (RVEs) and large scale statistical volume elements (SVEs) based on the discrete element method (DEM). DEM is an efficient numerical method which enabled the evaluation of more than 10,000 microstructures in this study, each including about 5,000 inclusions. The models explore the combined effects of statistics, inclusion arrangement, and interface properties. We find that statistical variations have a negative effect on all properties, in particular on the ductility and energy absorption because randomness precipitates the localization of deformations. However, the results also show that the negative effects of random microstructures can be offset by interfaces with large strain at failure accompanied by strain hardening. More specifically, this quantitative study reveals an optimal range of interface properties where the interfaces are the most effective at delaying localization. These findings show how carefully designed interfaces in bioinspired staggered composites can offset the negative effects of microstructural randomness, which is inherent to most current fabrication methods.
Che, W W; Frey, H Christopher; Lau, Alexis K H
2014-12-01
Population and diary sampling methods are employed in exposure models to sample simulated individuals and their daily activity on each simulation day. Different sampling methods may lead to variations in estimated human exposure. In this study, two population sampling methods (stratified-random and random-random) and three diary sampling methods (random resampling, diversity and autocorrelation, and Markov-chain cluster [MCC]) are evaluated. Their impacts on estimated children's exposure to ambient fine particulate matter (PM2.5 ) are quantified via case studies for children in Wake County, NC for July 2002. The estimated mean daily average exposure is 12.9 μg/m(3) for simulated children using the stratified population sampling method, and 12.2 μg/m(3) using the random sampling method. These minor differences are caused by the random sampling among ages within census tracts. Among the three diary sampling methods, there are differences in the estimated number of individuals with multiple days of exposures exceeding a benchmark of concern of 25 μg/m(3) due to differences in how multiday longitudinal diaries are estimated. The MCC method is relatively more conservative. In case studies evaluated here, the MCC method led to 10% higher estimation of the number of individuals with repeated exposures exceeding the benchmark. The comparisons help to identify and contrast the capabilities of each method and to offer insight regarding implications of method choice. Exposure simulation results are robust to the two population sampling methods evaluated, and are sensitive to the choice of method for simulating longitudinal diaries, particularly when analyzing results for specific microenvironments or for exposures exceeding a benchmark of concern. © 2014 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Zhang, Hong; Hou, Rui; Yi, Lei; Meng, Juan; Pan, Zhisong; Zhou, Yuhuan
2016-07-01
The accurate identification of encrypted data streams helps to regulate illegal data, detect network attacks and protect users' information. In this paper, a novel encrypted data stream identification algorithm is introduced. The proposed method is based on the randomness characteristics of the encrypted data stream. We use an l1-norm regularized logistic regression to improve the sparse representation of randomness features and a Fuzzy Gaussian Mixture Model (FGMM) to improve identification accuracy. Experimental results demonstrate that the method can be adopted as an effective technique for encrypted data stream identification.
Altstein, L.; Li, G.
2012-01-01
Summary: This paper studies a semiparametric accelerated failure time mixture model for estimation of a biological treatment effect on a latent subgroup of interest with a time-to-event outcome in randomized clinical trials. Latency is induced because membership is observable in one arm of the trial and unidentified in the other. This method is useful in randomized clinical trials with all-or-none noncompliance when patients in the control arm have no access to active treatment and in, for example, oncology trials when a biopsy used to identify the latent subgroup is performed only on subjects randomized to active treatment. We derive a computational method to estimate model parameters by iterating between an expectation step and a weighted Buckley-James optimization step. The bootstrap method is used for variance estimation, and the performance of our method is corroborated in simulation. We illustrate our method through an analysis of a multicenter selective lymphadenectomy trial for melanoma. PMID:23383608
Platt, Richard; Takvorian, Samuel U; Septimus, Edward; Hickok, Jason; Moody, Julia; Perlin, Jonathan; Jernigan, John A; Kleinman, Ken; Huang, Susan S
2010-06-01
The need for evidence about the effectiveness of therapeutics and other medical practices has triggered new interest in methods for comparative effectiveness research. Describe an approach to comparative effectiveness research involving cluster randomized trials in networks of hospitals, health plans, or medical practices with centralized administrative and informatics capabilities. We discuss the example of an ongoing cluster randomized trial to prevent methicillin-resistant Staphylococcus aureus (MRSA) infection in intensive care units (ICUs). The trial randomizes 45 hospitals to: (a) screening cultures of ICU admissions, followed by Contact Precautions if MRSA-positive, (b) screening cultures of ICU admissions followed by decolonization if MRSA-positive, or (c) universal decolonization of ICU admissions without screening. All admissions to adult ICUs. The primary outcome is MRSA-positive clinical cultures occurring ≥2 days following ICU admission. Secondary outcomes include blood and urine infection caused by MRSA (and, separately, all pathogens), as well as the development of resistance to decolonizing agents. Recruitment of hospitals is complete. Data collection will end in Summer 2011. This trial takes advantage of existing personnel, procedures, infrastructure, and information systems in a large integrated hospital network to conduct a low-cost evaluation of prevention strategies under usual practice conditions. This approach is applicable to many comparative effectiveness topics in both inpatient and ambulatory settings.
Effect of Atomoxetine on Executive Function Impairments in Adults with ADHD
ERIC Educational Resources Information Center
Brown, Thomas E.; Holdnack, James; Saylor, Keith; Adler, Lenard; Spencer, Thomas; Williams, David W.; Padival, Anoop K.; Schuh, Kory; Trzepacz, Paula T.; Kelsey, Douglas
2011-01-01
Objective: To assess the effect of atomoxetine on ADHD-related executive functions over a 6-month period using the Brown Attention-Deficit Disorder Scale (BADDS) for Adults, a normed, 40-item, self-report scale in a randomized, double-blind, placebo-controlled clinical trial. Method: In a randomized, double-blind clinical trial, adults with ADHD…
ERIC Educational Resources Information Center
Smith-Lock, Karen M.; Leitão, Suze; Prior, Polly; Nickels, Lyndsey
2015-01-01
Purpose: This study compared the effectiveness of two grammar treatment procedures for children with specific language impairment. Method: A double-blind superiority trial with cluster randomization was used to compare a cueing procedure, designed to elicit a correct production following an initial error, to a recasting procedure, which required…
ERIC Educational Resources Information Center
Jia, Yue; Stokes, Lynne; Harris, Ian; Wang, Yan
2011-01-01
Estimation of parameters of random effects models from samples collected via complex multistage designs is considered. One way to reduce estimation bias due to unequal probabilities of selection is to incorporate sampling weights. Many researchers have proposed various weighting methods (Korn & Graubard, 2003; Pfeffermann, Skinner,…
ERIC Educational Resources Information Center
Travis, Heather E.; Lawrance, Kelli-an G.
2009-01-01
Objective: Between September 2002 and February 2003, the authors assessed the effectiveness of a new, age-tailored, self-help smoking-cessation program for college students. Participants: College student smokers (N = 216) from 6 Ontario universities participated. Methods: The researchers used a randomized controlled trial with a 3-month telephone…
ERIC Educational Resources Information Center
Robbins, Michael S.; Feaster, Daniel J.; Horigian, Viviana E.; Rohrbaugh, Michael; Shoham, Varda; Bachrach, Ken; Miller, Michael; Burlew, Kathleen A.; Hodgkins, Candy; Carrion, Ibis; Vandermark, Nancy; Schindler, Eric; Werstlein, Robert; Szapocznik, Jose
2011-01-01
Objective: To determine the effectiveness of brief strategic family therapy (BSFT; an evidence-based family therapy) compared to treatment as usual (TAU) as provided in community-based adolescent outpatient drug abuse programs. Method: A randomized effectiveness trial in the National Drug Abuse Treatment Clinical Trials Network compared BSFT to…
ERIC Educational Resources Information Center
Warren, Steven F.; Fey, Marc E.; Finestack, Lizbeth, H.; Brady, Nancy C.; Bredin-Oja, Shelley L.; Fleming, Kandace K.
2008-01-01
Purpose: To evaluate the longitudinal effects of a 6-month course of responsivity education (RE)/prelinguistic milieu teaching (PMT) for young children with developmental delay. Method: Fifty-one children, age 24-33 months, with fewer than 10 expressive words were randomly assigned to early-treatment/no-treatment groups. All treatment was added as…
ERIC Educational Resources Information Center
Moreno, Mario; Harwell, Michael; Guzey, S. Selcen; Phillips, Alison; Moore, Tamara J.
2016-01-01
Hierarchical linear models have become a familiar method for accounting for a hierarchical data structure in studies of science and mathematics achievement. This paper illustrates the use of cross-classified random effects models (CCREMs), which are likely less familiar. The defining characteristic of CCREMs is a hierarchical data structure…
A new compound control method for sine-on-random mixed vibration test
NASA Astrophysics Data System (ADS)
Zhang, Buyun; Wang, Ruochen; Zeng, Falin
2017-09-01
Vibration environmental testing (VET) is one of the important and effective methods of providing support for the strength design, reliability and durability testing of mechanical products. A new separation control strategy was proposed for multiple-input multiple-output (MIMO) sine-on-random (SOR) mixed-mode vibration tests, an advanced and demanding test type within VET. As the key problem of the strategy, the correlation integral method was applied to separate the mixed signals, which included random and sinusoidal components. The feedback control formula of the MIMO linear random vibration system was systematically deduced in the frequency domain, and a Jacobi control algorithm was proposed based on elements of the power spectral density (PSD) matrix such as the auto-spectra, coherence and phase. Because of the excessive correction of the excitation in sine vibration tests, a compression factor was introduced to reduce the excitation correction and avoid damage to the vibration table or other devices. The two methods were combined and applied in the MIMO SOR vibration test system. Finally, a verification test system with the vibration of a cantilever beam as the control object was established to verify the reliability and effectiveness of the methods proposed in the paper. The test results show that the exceedance values can be controlled accurately within the tolerance range of the references, and the method can provide theoretical and practical support for mechanical engineering.
Time-frequency peak filtering for random noise attenuation of magnetic resonance sounding signal
NASA Astrophysics Data System (ADS)
Lin, Tingting; Zhang, Yang; Yi, Xiaofeng; Fan, Tiehu; Wan, Ling
2018-05-01
When measuring in a geomagnetic field, the method of magnetic resonance sounding (MRS) is often limited because of the notably low signal-to-noise ratio (SNR). Most current studies focus on discarding spiky noise and power-line harmonic noise cancellation. However, the effects of random noise should not be underestimated. The common method for random noise attenuation is stacking, but collecting multiple recordings merely to suppress random noise is time-consuming. Moreover, stacking is insufficient to suppress high-level random noise. Here, we propose the use of time-frequency peak filtering for random noise attenuation, which is performed after the traditional de-spiking and power-line harmonic removal method. By encoding the noisy signal with frequency modulation and estimating the instantaneous frequency using the peak of the time-frequency representation of the encoded signal, the desired MRS signal can be acquired from only one stack. The performance of the proposed method is tested on synthetic envelope signals and field data from different surveys. Good estimations of the signal parameters are obtained at different SNRs. Moreover, an attempt to use the proposed method to handle a single recording provides better results compared to 16 stacks. Our results suggest that the number of stacks can be appropriately reduced to shorten the measurement time and improve the measurement efficiency.
Brown, Alexandra R; Gajewski, Byron J; Aaronson, Lauren S; Mudaranthakam, Dinesh Pal; Hunt, Suzanne L; Berry, Scott M; Quintana, Melanie; Pasnoor, Mamatha; Dimachkie, Mazen M; Jawdat, Omar; Herbelin, Laura; Barohn, Richard J
2016-08-31
In the last few decades, the number of trials using Bayesian methods has grown rapidly. Publications prior to 1990 included only three clinical trials that used Bayesian methods, but that number quickly jumped to 19 in the 1990s and to 99 from 2000 to 2012. While this literature provides many examples of Bayesian Adaptive Designs (BAD), none of the papers that are available walks the reader through the detailed process of conducting a BAD. This paper fills that gap by describing the BAD process used for one comparative effectiveness trial (Patient Assisted Intervention for Neuropathy: Comparison of Treatment in Real Life Situations) that can be generalized for use by others. A BAD was chosen with efficiency in mind. Response-adaptive randomization allows the potential for substantially smaller sample sizes, and can provide faster conclusions about which treatment or treatments are most effective. An Internet-based electronic data capture tool, which features a randomization module, facilitated data capture across study sites and an in-house computation software program was developed to implement the response-adaptive randomization. A process for adapting randomization with minimal interruption to study sites was developed. A new randomization table can be generated quickly and can be seamlessly integrated in the data capture tool with minimal interruption to study sites. This manuscript is the first to detail the technical process used to evaluate a multisite comparative effectiveness trial using adaptive randomization. An important opportunity for the application of Bayesian trials is in comparative effectiveness trials. The specific case study presented in this paper can be used as a model for conducting future clinical trials using a combination of statistical software and a web-based application. ClinicalTrials.gov Identifier: NCT02260388 , registered on 6 October 2014.
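The response-adaptive randomization step can be illustrated schematically: after each interim update, allocation probabilities are tilted toward arms with a higher posterior probability of being best. The sketch below uses beta-binomial updating on invented response rates; it is a generic illustration of the idea, not the trial's actual algorithm, software or data capture tool.

```python
import numpy as np

rng = np.random.default_rng(4)

def allocation_probs(successes, failures, n_draws=10000):
    """Posterior probability that each arm is best under a beta-binomial model,
    used as the randomization probabilities for the next block."""
    draws = np.column_stack([rng.beta(1 + s, 1 + f, n_draws)
                             for s, f in zip(successes, failures)])
    best = np.argmax(draws, axis=1)
    return np.bincount(best, minlength=len(successes)) / n_draws

true_rates = [0.35, 0.50, 0.65]                 # hypothetical response rates per arm
succ, fail = np.zeros(3, int), np.zeros(3, int)

for block in range(10):                          # interim update after each block of 30
    probs = allocation_probs(succ, fail)
    arms = rng.choice(3, size=30, p=probs)
    outcomes = rng.random(30) < np.take(true_rates, arms)
    for a in range(3):
        succ[a] += outcomes[arms == a].sum()
        fail[a] += (~outcomes[arms == a]).sum()
    print(f"block {block}: allocation probabilities {np.round(probs, 2)}")
```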
Yavorska, Olena O; Burgess, Stephen
2017-12-01
MendelianRandomization is a software package for the R open-source software environment that performs Mendelian randomization analyses using summarized data. The core functionality is to implement the inverse-variance weighted, MR-Egger and weighted median methods for multiple genetic variants. Several options are available to the user, such as the use of robust regression, fixed- or random-effects models and the penalization of weights for genetic variants with heterogeneous causal estimates. Extensions to these methods, such as allowing for variants to be correlated, can be chosen if appropriate. Graphical commands allow summarized data to be displayed in an interactive graph, or the plotting of causal estimates from multiple methods, for comparison. Although the main method of data entry is directly by the user, there is also an option for allowing summarized data to be incorporated from the PhenoScanner database of genotype-phenotype associations. We hope to develop this feature in future versions of the package. The R software environment is available for download from [https://www.r-project.org/]. The MendelianRandomization package can be downloaded from the Comprehensive R Archive Network (CRAN) within R, or directly from [https://cran.r-project.org/web/packages/MendelianRandomization/]. Both R and the MendelianRandomization package are released under GNU General Public Licenses (GPL-2|GPL-3). © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association.
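The inverse-variance weighted and MR-Egger estimates that the package computes from summarized data follow standard closed forms; a minimal re-implementation sketch (ours, not the package code, with hypothetical per-variant summary statistics) is:

```python
import numpy as np

def ivw(beta_x, beta_y, se_y):
    """Inverse-variance weighted causal estimate from summarized data."""
    w = 1.0 / se_y ** 2
    est = np.sum(w * beta_x * beta_y) / np.sum(w * beta_x ** 2)
    return est, np.sqrt(1.0 / np.sum(w * beta_x ** 2))

def mr_egger(beta_x, beta_y, se_y):
    """MR-Egger: weighted regression of outcome on exposure associations with an
    unconstrained intercept (the directional-pleiotropy term)."""
    w = 1.0 / se_y ** 2
    X = np.column_stack([np.ones_like(beta_x), beta_x])
    coef = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * beta_y))
    return coef                                   # [intercept, causal slope]

bx = np.array([0.12, 0.08, 0.15, 0.05, 0.10])     # variant-exposure associations
by = np.array([0.06, 0.03, 0.08, 0.02, 0.05])     # variant-outcome associations
sy = np.array([0.02, 0.02, 0.03, 0.02, 0.02])     # their standard errors
print(ivw(bx, by, sy), mr_egger(bx, by, sy))
```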
Evaluation of some random effects methodology applicable to bird ringing data
Burnham, K.P.; White, Gary C.
2002-01-01
Existing models for ring recovery and recapture data analysis treat temporal variations in annual survival probability (S) as fixed effects. Often there is no explainable structure to the temporal variation in S1,..., Sk; random effects can then be a useful model: Si = E(S) + εi. Here, the temporal variation in survival probability is treated as random, with E(ε²) = σ². This random effects model can now be fit in program MARK. Resultant inferences include point and interval estimation for process variation, σ², estimation of E(S) and var(Ê(S)), where the latter includes a component for σ² as well as the traditional component for var(Ŝ|S). Furthermore, the random effects model leads to shrinkage estimates, S̃i, as improved (in mean square error) estimators of Si compared to the MLE, Ŝi, from the unrestricted time-effects model. Appropriate confidence intervals based on the S̃i are also provided. In addition, AIC has been generalized to random effects models. This paper presents results of a Monte Carlo evaluation of inference performance under the simple random effects model. Examined by simulation, under the simple one-group Cormack-Jolly-Seber (CJS) model, are issues such as bias of σ̂², confidence interval coverage on σ², coverage and mean square error comparisons for inference about Si based on shrinkage versus maximum likelihood estimators, and performance of AIC model selection over three models: Si ≡ S (no effects), Si = E(S) + εi (random effects), and S1,..., Sk (fixed effects). For the cases simulated, the random effects methods performed well and were uniformly better than fixed effects MLE for the Si.
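The shrinkage idea can be shown schematically with the standard empirical-Bayes weighting, in which each annual estimate is pulled toward the estimated mean in proportion to the process variance relative to its total variance. The exact estimator implemented in program MARK may differ in detail, so the following is only a sketch with hypothetical numbers.

```python
import numpy as np

def shrink(s_hat, se, mu_hat, sigma2):
    """Empirical-Bayes style shrinkage of annual survival estimates toward the
    estimated mean, with weight sigma^2 / (sigma^2 + se_i^2)."""
    h = sigma2 / (sigma2 + se ** 2)
    return mu_hat + h * (s_hat - mu_hat)

# Hypothetical annual survival MLEs, their standard errors, and random-effects quantities
s_hat = np.array([0.62, 0.71, 0.55, 0.68])
se = np.array([0.05, 0.09, 0.12, 0.04])
print(shrink(s_hat, se, mu_hat=0.64, sigma2=0.002))
```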
Network meta-analysis, electrical networks and graph theory.
Rücker, Gerta
2012-12-01
Network meta-analysis is an active field of research in clinical biostatistics. It aims to combine information from all randomized comparisons among a set of treatments for a given medical condition. We show how graph-theoretical methods can be applied to network meta-analysis. A meta-analytic graph consists of vertices (treatments) and edges (randomized comparisons). We illustrate the correspondence between meta-analytic networks and electrical networks, where variance corresponds to resistance, treatment effects to voltage, and weighted treatment effects to current flows. Based thereon, we then show that graph-theoretical methods that have been routinely applied to electrical networks also work well in network meta-analysis. In more detail, the resulting consistent treatment effects induced in the edges can be estimated via the Moore-Penrose pseudoinverse of the Laplacian matrix. Moreover, the variances of the treatment effects are estimated in analogy to electrical effective resistances. It is shown that this method, being computationally simple, leads to the usual fixed effect model estimate when applied to pairwise meta-analysis and is consistent with published results when applied to network meta-analysis examples from the literature. Moreover, problems of heterogeneity and inconsistency, random effects modeling and including multi-armed trials are addressed. Copyright © 2012 John Wiley & Sons, Ltd.
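The electrical analogy can be reproduced in a few lines: with the edge-vertex incidence matrix B and inverse-variance weights W, the Laplacian is L = B'WB, the consistent treatment effects follow from its Moore-Penrose pseudoinverse, and comparison variances are effective resistances. The toy three-treatment example below is our illustration with made-up numbers, not the paper's data.

```python
import numpy as np

# Toy network: treatments A, B, C; edges are the pairwise comparisons A-B, B-C, A-C.
edges = [(0, 1), (1, 2), (0, 2)]
y = np.array([0.5, 0.3, 0.9])          # observed comparison effects
v = np.array([0.04, 0.06, 0.05])       # their variances

m, n = len(edges), 3
B = np.zeros((m, n))                   # edge-vertex incidence matrix
for k, (i, j) in enumerate(edges):
    B[k, i], B[k, j] = 1.0, -1.0
W = np.diag(1.0 / v)                   # weights = inverse variances

L = B.T @ W @ B                        # graph Laplacian
L_pinv = np.linalg.pinv(L)             # Moore-Penrose pseudoinverse

d_hat = B @ L_pinv @ B.T @ W @ y       # consistent effects for each comparison
resistance = np.array([L_pinv[i, i] + L_pinv[j, j] - 2 * L_pinv[i, j]
                       for i, j in edges])   # variances as effective resistances
print(d_hat, resistance)
```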
Hedden, Sarra L; Woolson, Robert F; Carter, Rickey E; Palesch, Yuko; Upadhyaya, Himanshu P; Malcolm, Robert J
2009-07-01
"Loss to follow-up" can be substantial in substance abuse clinical trials. When extensive losses to follow-up occur, one must cautiously analyze and interpret the findings of a research study. Aims of this project were to introduce the types of missing data mechanisms and describe several methods for analyzing data with loss to follow-up. Furthermore, a simulation study compared Type I error and power of several methods when missing data amount and mechanism varies. Methods compared were the following: Last observation carried forward (LOCF), multiple imputation (MI), modified stratified summary statistics (SSS), and mixed effects models. Results demonstrated nominal Type I error for all methods; power was high for all methods except LOCF. Mixed effect model, modified SSS, and MI are generally recommended for use; however, many methods require that the data are missing at random or missing completely at random (i.e., "ignorable"). If the missing data are presumed to be nonignorable, a sensitivity analysis is recommended.
Effect of packing method on the randomness of disc packings
NASA Astrophysics Data System (ADS)
Zhang, Z. P.; Yu, A. B.; Oakeshott, R. B. S.
1996-06-01
The randomness of disc packings, generated by random sequential adsorption (RSA), random packing under gravity (RPG) and Mason packing (MP) which gives a packing density close to that of the RSA packing, has been analysed, based on the Delaunay tessellation, and is evaluated at two levels, i.e. the randomness at individual subunit level which relates to the construction of a triangle from a given edge length distribution and the randomness at network level which relates to the connection between triangles from a given triangle frequency distribution. The Delaunay tessellation itself is also analysed and its almost perfect randomness at the two levels is demonstrated, which verifies the proposed approach and provides a random reference system for the present analysis. It is found that (i) the construction of a triangle subunit is not random for the RSA, MP and RPG packings, with the degree of randomness decreasing from the RSA to MP and then to RPG packing; (ii) the connection of triangular subunits in the network is almost perfectly random for the RSA packing, acceptable for the MP packing and not good for the RPG packing. Packing method is an important factor governing the randomness of disc packings.
A random effects meta-analysis model with Box-Cox transformation.
Yamaguchi, Yusuke; Maruo, Kazushi; Partlett, Christopher; Riley, Richard D
2017-07-19
In a random effects meta-analysis model, true treatment effects for each study are routinely assumed to follow a normal distribution. However, normality is a restrictive assumption and the misspecification of the random effects distribution may result in a misleading estimate of the overall mean treatment effect, an inappropriate quantification of heterogeneity across studies and a wrongly symmetric prediction interval. We focus on problems caused by an inappropriate normality assumption of the random effects distribution, and propose a novel random effects meta-analysis model where a Box-Cox transformation is applied to the observed treatment effect estimates. The proposed model aims to normalise an overall distribution of observed treatment effect estimates, which is the sum of the within-study sampling distributions and the random effects distribution. When sampling distributions are approximately normal, non-normality in the overall distribution will be mainly due to the random effects distribution, especially when the between-study variation is large relative to the within-study variation. The Box-Cox transformation addresses this flexibly according to the observed departure from normality. We use a Bayesian approach for estimating parameters in the proposed model, and suggest summarising the meta-analysis results by an overall median, an interquartile range and a prediction interval. The model can be applied to any kind of variable once the treatment effect estimate is defined from the variable. A simulation study suggested that when the overall distribution of treatment effect estimates is skewed, the overall mean and conventional I² from the normal random effects model could be inappropriate summaries, and the proposed model helped reduce this issue. We illustrated the proposed model using two examples, which revealed some important differences in summary results, heterogeneity measures and prediction intervals from the normal random effects model. The random effects meta-analysis with the Box-Cox transformation may be an important tool for examining robustness of traditional meta-analysis results against skewness in the observed treatment effect estimates. Further critical evaluation of the method is needed.
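The transformation itself is the standard Box-Cox family applied to the observed treatment effect estimates; the sketch below shows only the transformation and its inverse, whereas the paper estimates the transformation parameter within a Bayesian random-effects model, which is not reproduced here.

```python
import numpy as np

def boxcox(y, lam):
    """Box-Cox transformation of positive effect estimates (e.g., risk ratios)."""
    y = np.asarray(y, float)
    return np.log(y) if np.isclose(lam, 0.0) else (y ** lam - 1.0) / lam

def boxcox_inverse(z, lam):
    """Back-transform, e.g., to report a median or prediction interval on the original scale."""
    z = np.asarray(z, float)
    return np.exp(z) if np.isclose(lam, 0.0) else (lam * z + 1.0) ** (1.0 / lam)

rr = np.array([0.8, 0.9, 1.1, 1.3, 2.5, 4.0])   # skewed, hypothetical risk ratios
print(boxcox(rr, 0.3), boxcox_inverse(boxcox(rr, 0.3), 0.3))
```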
Li, Baoyue; Lingsma, Hester F; Steyerberg, Ewout W; Lesaffre, Emmanuel
2011-05-23
Logistic random effects models are a popular tool to analyze multilevel (also called hierarchical) data with a binary or ordinal outcome. Here, we aim to compare different statistical software implementations of these models. We used individual patient data from 8509 patients in 231 centers with moderate and severe Traumatic Brain Injury (TBI) enrolled in eight Randomized Controlled Trials (RCTs) and three observational studies. We fitted logistic random effects regression models with the 5-point Glasgow Outcome Scale (GOS) as outcome, both dichotomized as well as ordinal, with center and/or trial as random effects, and as covariates age, motor score, pupil reactivity or trial. We then compared the implementations of frequentist and Bayesian methods to estimate the fixed and random effects. Frequentist approaches included R (lme4), Stata (GLLAMM), SAS (GLIMMIX and NLMIXED), MLwiN ([R]IGLS) and MIXOR; Bayesian approaches included WinBUGS, MLwiN (MCMC), the R package MCMCglmm and the SAS experimental procedure MCMC. Three data sets (the full data set and two sub-datasets) were analysed using basically two logistic random effects models with either one random effect for the center or two random effects for center and trial. For the ordinal outcome in the full data set, a proportional odds model with a random center effect was also fitted. The packages gave similar parameter estimates for both the fixed and random effects and for the binary (and ordinal) models for the main study and when based on a relatively large number of level-1 (patient level) data compared to the number of level-2 (hospital level) data. However, when based on a relatively sparse data set, i.e., when the numbers of level-1 and level-2 data units were about the same, the frequentist and Bayesian approaches showed somewhat different results. The software implementations differ considerably in flexibility, computation time, and usability. There are also differences in the availability of additional tools for model evaluation, such as diagnostic plots. The experimental SAS (version 9.2) procedure MCMC appeared to be inefficient. On relatively large data sets, the different software implementations of logistic random effects regression models produced similar results. Thus, for a large data set there seems to be no explicit preference (of course if there is no preference from a philosophical point of view) for either a frequentist or Bayesian approach (if based on vague priors). The choice for a particular implementation may largely depend on the desired flexibility, and the usability of the package. For small data sets the random effects variances are difficult to estimate. In the frequentist approaches the MLE of this variance was often estimated as zero with a standard error that is either zero or could not be determined, while for Bayesian methods the estimates could depend on the chosen "non-informative" prior of the variance parameter. The starting value for the variance parameter may also be critical for the convergence of the Markov chain.
Random forests as cumulative effects models: A case study of lakes and rivers in Muskoka, Canada.
Jones, F Chris; Plewes, Rachel; Murison, Lorna; MacDougall, Mark J; Sinclair, Sarah; Davies, Christie; Bailey, John L; Richardson, Murray; Gunn, John
2017-10-01
Cumulative effects assessment (CEA) - a type of environmental appraisal - lacks effective methods for modeling cumulative effects, evaluating indicators of ecosystem condition, and exploring the likely outcomes of development scenarios. Random forests are an extension of classification and regression trees, which model response variables by recursive partitioning. Random forests were used to model a series of candidate ecological indicators that described lakes and rivers from a case study watershed (The Muskoka River Watershed, Canada). Suitability of the candidate indicators for use in cumulative effects assessment and watershed monitoring was assessed according to how well they could be predicted from natural habitat features and how sensitive they were to human land-use. The best models explained 75% of the variation in a multivariate descriptor of lake benthic-macroinvertebrate community structure, and 76% of the variation in the conductivity of river water. Similar results were obtained by cross-validation. Several candidate indicators detected a simulated doubling of urban land-use in their catchments, and a few were able to detect a simulated doubling of agricultural land-use. The paper demonstrates that random forests can be used to describe the combined and singular effects of multiple stressors and natural environmental factors, and furthermore, that random forests can be used to evaluate the performance of monitoring indicators. The numerical methods presented are applicable to any ecosystem and indicator type, and therefore represent a step forward for CEA. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.
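As a schematic of how a random forest can serve as a cumulative effects model, the sketch below fits a forest to synthetic lake data and then perturbs a land-use input to mimic a development scenario. The response and covariates are invented; this is not the Muskoka data or the authors' workflow.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)

# Synthetic lakes: conductivity responds to natural features and to land use.
n = 400
area = rng.lognormal(2.0, 0.5, n)            # catchment area
elev = rng.normal(300, 50, n)                # elevation
urban = rng.uniform(0, 30, n)                # % urban land use
agric = rng.uniform(0, 40, n)                # % agricultural land use
cond = 20 + 0.05 * area + 0.02 * elev + 3.0 * urban + 1.0 * agric + rng.normal(0, 10, n)

X = np.column_stack([area, elev, urban, agric])
rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0).fit(X, cond)
print("out-of-bag R^2:", round(rf.oob_score_, 2))
print("importances:", dict(zip(["area", "elev", "urban", "agric"],
                               rf.feature_importances_.round(2))))

# Scenario exploration: double urban land use and predict the change in the indicator.
# (Forests cannot extrapolate beyond the training range, so such scenarios are indicative only.)
X2 = X.copy()
X2[:, 2] *= 2
print("mean predicted change:", float((rf.predict(X2) - rf.predict(X)).mean()))
```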
Wampler, Peter J; Rediske, Richard R; Molla, Azizur R
2013-01-18
A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground-based surveys to collect household location data and can be performed in a rapid, cost-effective manner. The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel, which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. A total of 537 homes were initially mapped, and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed, and 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. The method used to generate and field-locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent, and only rarely was local knowledge required to identify and locate households. This method provides an important technique that can be applied to other developing countries where a randomized study design is needed but infrastructure is lacking to implement more traditional participant selection methods.
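The household-randomization step itself is simple; the sketch below draws a random subset of 96 of the 537 mapped homes in R (the study used Microsoft Excel for this step), with hypothetical home identifiers.

```r
# Sketch of the randomization step: pick 96 of the 537 mapped homes.
# In the study this was done in Microsoft Excel; base R is shown here instead.
set.seed(2012)
mapped_homes <- sprintf("HOME_%03d", 1:537)   # hypothetical IDs exported from Google Earth
survey_homes <- sample(mapped_homes, size = 96)
head(survey_homes)
```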
Game-Based Learning as a Vehicle to Teach First Aid Content: A Randomized Experiment
ERIC Educational Resources Information Center
Charlier, Nathalie; De Fraine, Bieke
2013-01-01
Background: Knowledge of first aid (FA), which constitutes lifesaving treatments for injuries or illnesses, is important for every individual. In this study, we have set up a group-randomized controlled trial to assess the effectiveness of a board game for learning FA. Methods: Four class groups (120 students) were randomly assigned to 2…
Individualizing drug dosage with longitudinal data.
Zhu, Xiaolu; Qu, Annie
2016-10-30
We propose a two-step procedure to personalize drug dosage over time under the framework of a log-linear mixed-effect model. We model patients' heterogeneity using subject-specific random effects, which are treated as the realizations of an unspecified stochastic process. We extend the conditional quadratic inference function to estimate both fixed-effect coefficients and individual random effects on a longitudinal training data sample in the first step and propose an adaptive procedure to estimate new patients' random effects and provide dosage recommendations for new patients in the second step. An advantage of our approach is that we do not impose any distribution assumption on estimating random effects. Moreover, the new approach can accommodate more general time-varying covariates corresponding to random effects. We show in theory and numerical studies that the proposed method is more efficient compared with existing approaches, especially when covariates are time varying. In addition, a real data example of a clozapine study confirms that our two-step procedure leads to more accurate drug dosage recommendations. Copyright © 2016 John Wiley & Sons, Ltd.
Analysis of drift correction in different simulated weighing schemes
NASA Astrophysics Data System (ADS)
Beatrici, A.; Rebelo, A.; Quintão, D.; Cacais, F. L.; Loayza, V. M.
2015-10-01
In the calibration of high accuracy mass standards, some weighing schemes are used to reduce or eliminate zero-drift effects in mass comparators. There are different sources for the drift and different methods for its treatment. Using numerical methods, drift functions were simulated and a random term was included in each function. A comparison between the results obtained from the ABABAB and ABBA weighing series was carried out. The results show better efficacy of the ABABAB method for drift with smooth variation and small randomness.
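The comparison can be illustrated by Monte Carlo simulation in R as below; the drift function, noise level, and the drift-cancelling estimators are simple textbook choices for illustration and are not necessarily those used in the study.

```r
# Monte Carlo sketch comparing ABBA and ABABAB weighing series under drift.
# Drift function, noise level and estimators are illustrative assumptions.
set.seed(42)
true_diff <- 0.010                      # true mass difference A - B (mg), hypothetical
drift     <- function(t) 0.002 * t^1.5  # smooth, slowly accelerating zero drift
sigma     <- 0.001                      # random comparator noise (mg)

one_cycle <- function() {
  # ABBA: readings at times 1..4 in the order A, B, B, A
  rA1 <- true_diff + drift(1) + rnorm(1, 0, sigma)
  rB1 <-             drift(2) + rnorm(1, 0, sigma)
  rB2 <-             drift(3) + rnorm(1, 0, sigma)
  rA2 <- true_diff + drift(4) + rnorm(1, 0, sigma)
  abba <- (rA1 + rA2 - rB1 - rB2) / 2

  # ABABAB: readings at times 1..6 in the order A, B, A, B, A, B;
  # each B reading is compared with the time-interpolated neighbouring A readings.
  a <- true_diff + drift(c(1, 3, 5)) + rnorm(3, 0, sigma)
  b <-             drift(c(2, 4, 6)) + rnorm(3, 0, sigma)
  ababab <- mean(c((a[1] + a[2]) / 2 - b[1], (a[2] + a[3]) / 2 - b[2]))
  c(ABBA = abba, ABABAB = ababab)
}

est <- replicate(10000, one_cycle())
rowMeans(est) - true_diff   # bias of each scheme
apply(est, 1, sd)           # spread of each scheme
```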
Andridge, Rebecca R.
2011-01-01
In cluster randomized trials (CRTs), identifiable clusters rather than individuals are randomized to study groups. Resulting data often consist of a small number of clusters with correlated observations within a treatment group. Missing data often present a problem in the analysis of such trials, and multiple imputation (MI) has been used to create complete data sets, enabling subsequent analysis with well-established analysis methods for CRTs. We discuss strategies for accounting for clustering when multiply imputing a missing continuous outcome, focusing on estimation of the variance of group means as used in an adjusted t-test or ANOVA. These analysis procedures are congenial to (can be derived from) a mixed effects imputation model; however, this imputation procedure is not yet available in commercial statistical software. An alternative approach that is readily available and has been used in recent studies is to include fixed effects for cluster, but the impact of using this convenient method has not been studied. We show that under this imputation model the MI variance estimator is positively biased and that smaller ICCs lead to larger overestimation of the MI variance. Analytical expressions for the bias of the variance estimator are derived in the case of data missing completely at random (MCAR), and cases in which data are missing at random (MAR) are illustrated through simulation. Finally, various imputation methods are applied to data from the Detroit Middle School Asthma Project, a recent school-based CRT, and differences in inference are compared. PMID:21259309
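A sketch of the convenient fixed-cluster-effects imputation strategy examined above is given below, using the R package mice; the data frame school_crt and its columns (y, x, cluster, group) are hypothetical, and the final analysis model is deliberately minimal.

```r
# Sketch of multiply imputing a missing continuous outcome in a CRT with fixed
# effects for cluster (the convenient approach examined above). The data frame
# `school_crt` and its columns y, x, cluster, group are hypothetical.
library(mice)

dat <- school_crt
dat$cluster <- factor(dat$cluster)

vars <- dat[, c("y", "x", "cluster", "group")]
meth <- make.method(vars)          # default methods; only y has missing values
meth["y"] <- "norm"                # Bayesian linear regression imputation for y
pred <- make.predictorMatrix(vars)
pred["y", "group"] <- 0            # group is collinear with the cluster dummies, so impute from cluster + x only

imp  <- mice(vars, m = 20, method = meth, predictorMatrix = pred, printFlag = FALSE)
fits <- with(imp, lm(y ~ group))   # a full CRT analysis would also adjust the variance for clustering
summary(pool(fits))                # Rubin's rules; the abstract shows this MI variance can be biased upward
```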
ERIC Educational Resources Information Center
Rodriguez-Sanchez, Emiliano; Patino-Alonso, Maria C.; Mora-Simon, Sara; Gomez-Marcos, Manuel A.; Perez-Penaranda, Anibal; Losada-Baltar, Andres; Garcia-Ortiz, Luis
2013-01-01
Purpose: To assess, in the context of Primary Health Care (PHC), the effect of a psychological intervention in mental health among caregivers (CGs) of dependent relatives. Design and Methods: Randomized multicenter, controlled clinical trial. The 125 CGs included in the trial were receiving health care in PHC. Inclusion criteria: Identifying…
ERIC Educational Resources Information Center
Rhoads, Christopher
2011-01-01
Researchers planning a randomized field trial to evaluate the effectiveness of an educational intervention often face the following dilemma. They plan to recruit schools to participate in their study. The question is, "Should the researchers randomly assign individuals (either students or teachers, depending on the intervention) within schools to…
ERIC Educational Resources Information Center
Iriyama, Yae; Murayama, Nobuko
2014-01-01
Objective: We conducted a randomized controlled crossover trial to evaluate the effects of a new worksite weight-control programme designed for men with or at risk of obesity using a combination of nutrition education and nutrition environmental interventions. Subjects and methods: Male workers with or at risk of obesity were recruited for this…
ERIC Educational Resources Information Center
Jackson, Dan
2013-01-01
Statistical inference is problematic in the common situation in meta-analysis where the random effects model is fitted to just a handful of studies. In particular, the asymptotic theory of maximum likelihood provides a poor approximation, and Bayesian methods are sensitive to the prior specification. Hence, less efficient, but easily computed and…
ERIC Educational Resources Information Center
Schlosser, Ralf W.; Koul, Rajinder; Shane, Howard; Sorce, James; Brock, Kristofer; Harmon, Ashley; Moerlein, Dorothy; Hearn, Emilia
2014-01-01
Purpose: The effects of animation on naming and identification of graphic symbols for verbs and prepositions were studied in 2 graphic symbol sets in preschoolers. Method: Using a 2 × 2 × 2 × 3 completely randomized block design, preschoolers across three age groups were randomly assigned to combinations of symbol set (Autism Language Program…
Child-Parent Interventions for Childhood Anxiety Disorders: A Systematic Review and Meta-Analysis
ERIC Educational Resources Information Center
Brendel, Kristen Esposito; Maynard, Brandy R.
2014-01-01
Objective: This study compared the effects of direct child-parent interventions to the effects of child-focused interventions on anxiety outcomes for children with anxiety disorders. Method: Systematic review methods and meta-analytic techniques were employed. Eight randomized controlled trials examining effects of family cognitive behavior…
Confidence intervals for a difference between lognormal means in cluster randomization trials.
Poirier, Julia; Zou, G Y; Koval, John
2017-04-01
Cluster randomization trials, in which intact social units are randomized to different interventions, have become popular in the last 25 years. Outcomes from these trials in many cases are positively skewed, following approximately lognormal distributions. When inference is focused on the difference between treatment arm arithmetic means, existing confidence interval procedures either make restrictive assumptions or are complex to implement. We approach this problem by assuming log-transformed outcomes from each treatment arm follow a one-way random effects model. The treatment arm means are functions of multiple parameters for which separate confidence intervals are readily available, suggesting that the method of variance estimates recovery may be applied to obtain closed-form confidence intervals. A simulation study showed that this simple approach performs well in small sample sizes in terms of empirical coverage, relatively balanced tail errors, and interval widths as compared to existing methods. The methods are illustrated using data arising from a cluster randomization trial investigating a critical pathway for the treatment of community-acquired pneumonia.
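As a simplified illustration of the method of variance estimates recovery (MOVER) idea, the sketch below builds a closed-form confidence interval for a single-sample lognormal mean in R, ignoring the cluster-level random effect that the paper's method accounts for.

```r
# Simplified MOVER sketch for a lognormal mean, ignoring clustering.
lognormal_mean_ci <- function(y, conf = 0.95) {
  x <- log(y); n <- length(x)
  a  <- 1 - conf
  m  <- mean(x); s2 <- var(x)
  theta <- m + s2 / 2                          # log of the arithmetic mean
  # separate CIs for mu and sigma^2
  l_mu <- m - qt(1 - a/2, n - 1) * sqrt(s2 / n)
  u_mu <- m + qt(1 - a/2, n - 1) * sqrt(s2 / n)
  l_s2 <- (n - 1) * s2 / qchisq(1 - a/2, n - 1)
  u_s2 <- (n - 1) * s2 / qchisq(a/2, n - 1)
  # MOVER combination for the sum mu + sigma^2/2
  L <- theta - sqrt((m - l_mu)^2 + (s2/2 - l_s2/2)^2)
  U <- theta + sqrt((u_mu - m)^2 + (u_s2/2 - s2/2)^2)
  exp(c(estimate = theta, lower = L, upper = U))
}

set.seed(1)
lognormal_mean_ci(rlnorm(30, meanlog = 1, sdlog = 0.8))
```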
NASA Astrophysics Data System (ADS)
Wang, Tao; Zhou, Guoqing; Wang, Jianzhou; Zhou, Lei
2018-03-01
The artificial ground freezing method (AGF) is widely used in civil and mining engineering, and the thermal regime of frozen soil around the freezing pipe affects the safety of design and construction. The thermal parameters can be truly random due to heterogeneity of the soil properties, which leads to randomness in the thermal regime of frozen soil around the freezing pipe. The purpose of this paper is to study the one-dimensional (1D) random thermal regime problem on the basis of a stochastic analysis model and the Monte Carlo (MC) method. Considering the uncertain thermal parameters of frozen soil as random variables, stochastic processes and random fields, the corresponding stochastic thermal regimes of frozen soil around a single freezing pipe are obtained and analyzed. Taking the variability of each stochastic parameter into account individually, the influence of each stochastic thermal parameter on the stochastic thermal regime is investigated. The results show that the mean temperatures of frozen soil around the single freezing pipe are the same for the three modeling approaches, while the standard deviations are different. The distributions of the standard deviation differ considerably at different radial coordinate locations, and the larger standard deviations occur mainly in the phase change area. The results computed with the random variable and stochastic process methods differ considerably from the measured data, while those computed with the random field method agree well with the measured data. Each uncertain thermal parameter has a different effect on the standard deviation of frozen soil temperature around the single freezing pipe. These results can provide a theoretical basis for the design and construction of AGF.
Effects of random tooth profile errors on the dynamic behaviors of planetary gears
NASA Astrophysics Data System (ADS)
Xun, Chao; Long, Xinhua; Hua, Hongxing
2018-02-01
In this paper, a nonlinear random model is built to describe the dynamics of planetary gear trains (PGTs), in which the time-varying mesh stiffness, tooth profile modification (TPM), tooth contact loss, and random tooth profile error are considered. A stochastic method based on the method of multiple scales (MMS) is extended to analyze the statistical properties of the dynamic performance of PGTs. With the proposed multiple-scales-based stochastic method, the distributions of the dynamic transmission errors (DTEs) are investigated, and the lower and upper bounds are determined based on the 3σ principle. The Monte Carlo method is employed to verify the proposed method. Results indicate that the proposed method can determine the distribution of the DTE of PGTs with high efficiency and allows a link between the manufacturing precision and the dynamical response. In addition, the effects of tooth profile modification on the distributions of vibration amplitudes and the probability of tooth contact loss with different manufacturing tooth profile errors are studied. The results show that the manufacturing precision affects the distribution of dynamic transmission errors dramatically and that appropriate TPMs help to decrease the nominal value and the deviation of the vibration amplitudes.
SNP selection and classification of genome-wide SNP data using stratified sampling random forests.
Wu, Qingyao; Ye, Yunming; Liu, Yang; Ng, Michael K
2012-09-01
For high-dimensional genome-wide association (GWA) case-control data of complex disease, there are usually a large proportion of single-nucleotide polymorphisms (SNPs) that are irrelevant to the disease. A simple random sampling method in random forests, using the default mtry parameter to choose the feature subspace, will select too many subspaces without informative SNPs. Exhaustively searching for an optimal mtry is often required in order to include useful and relevant SNPs and discard the vast number of non-informative SNPs; however, this is too time-consuming and not favorable for high-dimensional GWA data. The main aim of this paper is to propose a stratified sampling method for feature subspace selection to generate decision trees in a random forest for GWA high-dimensional data. Our idea is to design an equal-width discretization scheme for informativeness to divide SNPs into multiple groups. In feature subspace selection, we randomly select the same number of SNPs from each group and combine them to form a subspace to generate a decision tree. This stratified sampling procedure ensures that each subspace contains enough useful SNPs while avoiding the very high computational cost of an exhaustive search for an optimal mtry, and it maintains the randomness of the random forest. We employ two genome-wide SNP data sets (Parkinson case-control data comprising 408 803 SNPs and Alzheimer case-control data comprising 380 157 SNPs) to demonstrate that the proposed stratified sampling method is effective, and that it can generate better random forests with higher accuracy and lower error bounds than those produced by Breiman's random forest generation method. For the Parkinson data, we also show some interesting genes identified by the method, which may be associated with neurological disorders, for further biological investigation.
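The stratified subspace-sampling step can be sketched in a few lines of R; the genotype matrix geno and case-control labels pheno are hypothetical, and the chi-square statistic is used here as one possible informativeness score.

```r
# Sketch of the stratified feature-subspace sampling step: SNPs are binned by an
# informativeness score into equal-width groups and an equal number is drawn from
# each group to form the subspace for one tree. `geno` and `pheno` are hypothetical.
informativeness <- apply(geno, 2, function(snp)
  suppressWarnings(chisq.test(table(snp, pheno))$statistic))

n_groups <- 10
groups   <- cut(informativeness, breaks = n_groups, labels = FALSE)  # equal-width bins

sample_subspace <- function(per_group = 5) {
  unlist(lapply(split(seq_along(groups), groups), function(idx)
    idx[sample.int(length(idx), min(per_group, length(idx)))]))
}

subspace <- sample_subspace()   # column indices used to grow one decision tree
length(subspace)
```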
A new simple technique for improving the random properties of chaos-based cryptosystems
NASA Astrophysics Data System (ADS)
Garcia-Bosque, M.; Pérez-Resa, A.; Sánchez-Azqueta, C.; Celma, S.
2018-03-01
A new technique for improving the security of chaos-based stream ciphers has been proposed and tested experimentally. This technique improves the randomness properties of the generated keystream by preventing the system from falling into short-period cycles due to digitization. In order to test this technique, a stream cipher based on a Skew Tent Map algorithm was implemented on a Virtex 7 FPGA. The randomness of the keystream generated by this system was compared to the randomness of the keystream generated by the same system with the proposed randomness-enhancement technique. By subjecting both keystreams to the National Institute of Standards and Technology (NIST) tests, we show that our method can considerably improve the randomness of the generated keystreams. Incorporating the randomness-enhancement technique required only 41 extra slices, showing that, besides being effective, this method is also efficient in terms of area and hardware resources.
Miyamoto, Gisela Cristiane; Moura, Katherinne Ferro; Franco, Yuri Rafael dos Santos; Oliveira, Naiane Teixeira Bastos de; Amaral, Diego Diulgeroglo Vicco; Branco, Amanda Nery Castelo; Silva, Maria Liliane da; Lin, Christine; Cabral, Cristina Maria Nunes
2016-03-01
The Pilates method has been recommended to patients with low back pain, but the evidence on effectiveness is inconclusive. In addition, there is still no evidence for the cost-effectiveness of this method or for the ideal number of sessions to achieve the highest effectiveness. The aim of this study will be to investigate the effectiveness and cost-effectiveness of the Pilates method with different weekly frequencies in the treatment of patients with nonspecific low back pain. This is a randomized controlled trial with blinded assessor. This study will be conducted at a physical therapy clinic in São Paulo, Brazil. Two hundred ninety-six patients with nonspecific low back pain between the ages of 18 and 80 years will be assessed and randomly allocated to 4 groups (n=74 patients per group). All groups will receive an educational booklet. The booklet group will not receive additional exercises. Pilates group 1 will follow a Pilates-based program once a week, Pilates group 2 will follow the same program twice a week, and Pilates group 3 will follow the same program 3 times a week. The intervention will last 6 weeks. A blinded assessor will evaluate pain, quality-adjusted life-years, general and specific disability, kinesiophobia, pain catastrophizing, and global perceived effect 6 weeks, 6 months, and 12 months after randomization. Therapists and patients will not be blinded. This will be the first study to investigate different weekly frequencies of treatment sessions for nonspecific low back pain. The results of this study will contribute to a better definition of treatment programs for this population. © 2016 American Physical Therapy Association.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romero, Vicente; Bonney, Matthew; Schroeder, Benjamin
When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: the central 95% of response, and the 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depend on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large data base and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
Meta-analysis in clinical trials revisited.
DerSimonian, Rebecca; Laird, Nan
2015-11-01
In this paper, we revisit a 1986 article we published in this Journal, Meta-Analysis in Clinical Trials, where we introduced a random-effects model to summarize the evidence about treatment efficacy from a number of related clinical trials. Because of its simplicity and ease of implementation, our approach has been widely used (with more than 12,000 citations to date) and the "DerSimonian and Laird method" is now often referred to as the 'standard approach' or a 'popular' method for meta-analysis in medical and clinical research. The method is especially useful for providing an overall effect estimate and for characterizing the heterogeneity of effects across a series of studies. Here, we review the background that led to the original 1986 article, briefly describe the random-effects approach for meta-analysis, explore its use in various settings and trends over time and recommend a refinement to the method using a robust variance estimator for testing overall effect. We conclude with a discussion of repurposing the method for Big Data meta-analysis and Genome Wide Association Studies for studying the importance of genetic variants in complex diseases. Published by Elsevier Inc.
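For reference, the sketch below computes the DerSimonian and Laird moment estimator of the between-study variance and the corresponding random-effects summary in R, using illustrative (not real) study estimates; the same estimator is available in the metafor package via method = "DL".

```r
# Minimal sketch of the DerSimonian-Laird random-effects computation for study
# effect estimates `yi` with within-study variances `vi` (illustrative numbers).
dl_meta <- function(yi, vi) {
  w  <- 1 / vi
  mu_fixed <- sum(w * yi) / sum(w)
  Q  <- sum(w * (yi - mu_fixed)^2)
  k  <- length(yi)
  tau2 <- max(0, (Q - (k - 1)) / (sum(w) - sum(w^2) / sum(w)))  # moment estimator
  w_re  <- 1 / (vi + tau2)
  mu_re <- sum(w_re * yi) / sum(w_re)
  se_re <- sqrt(1 / sum(w_re))
  c(tau2 = tau2, estimate = mu_re,
    lower = mu_re - 1.96 * se_re, upper = mu_re + 1.96 * se_re)
}

yi <- c(-0.35, -0.10, -0.42, 0.05, -0.20)   # hypothetical log odds ratios
vi <- c(0.04, 0.02, 0.06, 0.05, 0.03)
dl_meta(yi, vi)
# The same estimator is available as metafor::rma(yi, vi, method = "DL").
```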
2014-01-01
Background There is a need for evidence of the clinical effectiveness of minimally invasive surgery for the treatment of esophageal cancer, but randomized controlled trials in surgery are often difficult to conduct. The ROMIO (Randomized Open or Minimally Invasive Oesophagectomy) study will establish the feasibility of a main trial which will examine the clinical and cost-effectiveness of minimally invasive and open surgical procedures for the treatment of esophageal cancer. Methods/Design A pilot randomized controlled trial (RCT) in two centers (University Hospitals Bristol NHS Foundation Trust and Plymouth Hospitals NHS Trust) will examine numbers of incident and eligible patients who consent to participate in the ROMIO study. Interventions will include esophagectomy by: (1) open gastric mobilization and right thoracotomy, (2) laparoscopic gastric mobilization and right thoracotomy, and (3) totally minimally invasive surgery (in the Bristol center only). The primary outcomes of the feasibility study will be measures of recruitment, successful development of methods to monitor quality of surgery and fidelity to a surgical protocol, and development of a core outcome set to evaluate esophageal cancer surgery. The study will test patient-reported outcome measures to assess recovery, methods to blind participants, assessments of surgical morbidity, and methods to capture cost and resource use. ROMIO will integrate methods to monitor and improve recruitment using audio recordings of consultations between recruiting surgeons, nurses, and patients to provide feedback for recruiting staff. Discussion The ROMIO study aims to establish efficient methods to undertake a main trial of minimally invasive surgery versus open surgery for esophageal cancer. Trial registration The pilot trial has Current Controlled Trials registration number ISRCTN59036820 (25/02/2013) at http://www.controlled-trials.com; the ROMIO trial record at that site gives a link to the original version of the study protocol. PMID:24888266
ERIC Educational Resources Information Center
Warnell, Ronald L.; Duk, Anthony D.; Christison, George W.; Haviland, Mark G.
2005-01-01
Objective: To compare the effects of learning about electroconvulsive therapy (ECT) via live observation to learning via an instructional videotape. Method: During their psychiatry clerkship, 122 medical students were randomized using these two educational methods, and their ECT knowledge and attitudes were assessed during the first and last weeks…
NASA Astrophysics Data System (ADS)
Witteveen, Jeroen A. S.; Bijl, Hester
2009-10-01
The Unsteady Adaptive Stochastic Finite Elements (UASFE) method resolves the effect of randomness in numerical simulations of single-mode aeroelastic responses with a constant accuracy in time for a constant number of samples. In this paper, the UASFE framework is extended to multi-frequency responses and continuous structures by employing a wavelet decomposition pre-processing step to decompose the sampled multi-frequency signals into single-frequency components. The effect of the randomness on the multi-frequency response is then obtained by summing the results of the UASFE interpolation at constant phase for the different frequency components. Results for multi-frequency responses and continuous structures show a three orders of magnitude reduction of computational costs compared to crude Monte Carlo simulations in a harmonically forced oscillator, a flutter panel problem, and the three-dimensional transonic AGARD 445.6 wing aeroelastic benchmark subject to random fields and random parameters with various probability distributions.
Mid-trimester induced abortion: a review.
Lalitkumar, S; Bygdeman, M; Gemzell-Danielsson, K
2007-01-01
Mid-trimester abortion constitutes 10-15% of all induced abortions. The aim of this article is to provide a review of the current literature on mid-trimester methods of abortion with respect to efficacy, side effects and acceptability. There have been continuing efforts to improve abortion technology in terms of effectiveness, technical ease of performance, acceptability and reduction of side effects and complications. During the last decade, medical methods for mid-trimester induced abortion have developed considerably and have become safe and more accessible. The combination of mifepristone and misoprostol is now an established and highly effective method for termination of pregnancy (TOP). Advantages and disadvantages of medical versus surgical methods are discussed. Randomized studies are lacking; pain treatment and the safety of any method used in patients with a previous uterine scar are debated, and data are scarce. Pain management in abortion requires special attention. This review highlights the need for randomized studies to set guidelines for mid-trimester abortion methods in terms of safety and acceptability as well as for better analgesic regimens.
Cambron, Jerrilyn A; Dexheimer, Jennifer M; Chang, Mabel; Cramer, Gregory D
2010-01-01
The purpose of this article is to describe the methods for recruitment in a clinical trial on chiropractic care for lumbar spinal stenosis. This randomized, placebo-controlled pilot study investigated the efficacy of different amounts of total treatment dosage over 6 weeks in 60 volunteer subjects with lumbar spinal stenosis. Subjects were recruited for this study through several media venues, focusing on successful and cost-effective strategies. Included in our efforts were radio advertising, newspaper advertising, direct mail, and various other low-cost initiatives. Of the 1211 telephone screens, 60 responders (5.0%) were randomized into the study. The most successful recruitment method was radio advertising, generating more than 64% of the calls (776 subjects). Newspaper and magazine advertising generated approximately 9% of all calls (108 subjects), and direct mail generated less than 7% (79 subjects). The total direct cost for recruitment was $40 740 or $679 per randomized patient. The costs per randomization were highest for direct mail ($995 per randomization) and lowest for newspaper/magazine advertising ($558 per randomization). Success of recruitment methods may vary based on target population and location. Planning of recruitment efforts is essential to the success of any clinical trial. Copyright 2010 National University of Health Sciences. Published by Mosby, Inc. All rights reserved.
Xiao, Yongling; Abrahamowicz, Michal
2010-03-30
We propose two bootstrap-based methods to correct the standard errors (SEs) from Cox's model for within-cluster correlation of right-censored event times. The cluster-bootstrap method resamples, with replacement, only the clusters, whereas the two-step bootstrap method resamples (i) the clusters, and (ii) individuals within each selected cluster, with replacement. In simulations, we evaluate both methods and compare them with the existing robust variance estimator and the shared gamma frailty model, which are available in statistical software packages. We simulate clustered event time data, with latent cluster-level random effects, which are ignored in the conventional Cox's model. For cluster-level covariates, both proposed bootstrap methods yield accurate SEs, type I error rates, and acceptable coverage rates, regardless of the true random effects distribution, and avoid serious variance under-estimation by conventional Cox-based standard errors. However, the two-step bootstrap method over-estimates the variance for individual-level covariates. We also apply the proposed bootstrap methods to obtain confidence bands around flexible estimates of time-dependent effects in a real-life analysis of clustered event times.
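A sketch of the cluster-bootstrap standard error for a Cox model is given below in R; the data frame dat and its columns (time, status, x, cluster) are hypothetical, and the number of bootstrap replicates is illustrative.

```r
# Sketch of the cluster-bootstrap standard error for a Cox model, resampling whole
# clusters with replacement. The data frame `dat` and its columns are hypothetical.
library(survival)

beta_hat <- coef(coxph(Surv(time, status) ~ x, data = dat))

B <- 500
boot_beta <- replicate(B, {
  ids  <- sample(unique(dat$cluster), replace = TRUE)
  bdat <- do.call(rbind, lapply(ids, function(id) dat[dat$cluster == id, ]))
  coef(coxph(Surv(time, status) ~ x, data = bdat))
})

c(estimate = unname(beta_hat), cluster_bootstrap_se = sd(boot_beta))
```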
Solving large test-day models by iteration on data and preconditioned conjugate gradient.
Lidauer, M; Strandén, I; Mäntysaari, E A; Pösö, J; Kettunen, A
1999-12-01
A preconditioned conjugate gradient method was implemented into an iteration-on-data program for the estimation of breeding values, and its convergence characteristics were studied. An algorithm was used as a reference in which one fixed effect was solved by the Gauss-Seidel method, and other effects were solved by a second-order Jacobi method. Implementation of the preconditioned conjugate gradient required storing four vectors (of size equal to the number of unknowns in the mixed model equations) in random access memory and reading the data at each round of iteration. The preconditioner comprised diagonal blocks of the coefficient matrix. Comparison of the algorithms was based on solutions of mixed model equations obtained with a single-trait animal model and a single-trait random regression test-day model. Data sets for both models used milk yield records of primiparous Finnish dairy cows. The animal model data comprised 665,629 lactation milk yields, and the random regression test-day model data comprised 6,732,765 test-day milk yields. Both models included pedigree information on 1,099,622 animals. The animal model [random regression test-day model] required 122 [305] rounds of iteration to converge with the reference algorithm, but only 88 [149] were required with the preconditioned conjugate gradient. Solving the random regression test-day model with the preconditioned conjugate gradient required 237 megabytes of random access memory and took 14% of the computation time needed by the reference algorithm.
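A generic sketch of a preconditioned conjugate gradient solver with a simple diagonal (Jacobi) preconditioner is shown below in R; the paper's implementation never forms the coefficient matrix explicitly (it iterates on the data) and uses diagonal blocks rather than single diagonal elements, so this is only the core algorithm.

```r
# Generic sketch of preconditioned conjugate gradient with a diagonal (Jacobi)
# preconditioner; the paper uses block-diagonal preconditioning and iteration on data.
pcg <- function(A, b, tol = 1e-8, maxit = 1000) {
  Minv <- 1 / diag(A)                 # diagonal preconditioner
  x <- rep(0, length(b))
  r <- b - A %*% x
  z <- Minv * r
  p <- z
  for (i in seq_len(maxit)) {
    Ap    <- A %*% p
    alpha <- drop(crossprod(r, z) / crossprod(p, Ap))
    x     <- x + alpha * p
    r_new <- r - alpha * Ap
    if (sqrt(sum(r_new^2)) < tol * sqrt(sum(b^2))) return(list(x = x, iterations = i))
    z_new <- Minv * r_new
    beta  <- drop(crossprod(r_new, z_new) / crossprod(r, z))
    p <- z_new + beta * p
    r <- r_new; z <- z_new
  }
  list(x = x, iterations = maxit)
}

set.seed(1)
M <- crossprod(matrix(rnorm(100), 10))  # small symmetric positive definite test matrix
A <- M + diag(10)
b <- rnorm(10)
max(abs(pcg(A, b)$x - solve(A, b)))     # should be near zero
```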
ERIC Educational Resources Information Center
Siddique, Juned; Chung, Joyce Y.; Brown, C. Hendricks; Miranda, Jeanne
2012-01-01
Objective: To examine whether there are latent trajectory classes in response to treatment and whether they moderate the effects of medication versus psychotherapy. Method: Data come from a 1-year randomized controlled trial of 267 low-income, young (M = 29 years), minority (44% Black, 50% Latina, 6% White) women with current major depression…
ERIC Educational Resources Information Center
Fey, Marc E.; Warren, Steven F.; Brady, Nancy; Finestack, Lizbeth H.; Bredin-Oja, Shelley L.; Fairchild, Martha; Sokol, Shari; Yoder, Paul J.
2006-01-01
Purpose: To evaluate the efficacy of a 6-month course of responsivity education/prelinguistic milieu teaching (RE/PMT) for children with developmental delay and RE/PMT's effects on parenting stress in a randomized clinical trial. Method: Fifty-one children, age 24-33 months, with no more than 10 expressive words or signs, were randomly assigned to…
ERIC Educational Resources Information Center
Tonge, Bruce; Brereton, Avril; Kiomall, Melissa; MacKinnon, Andrew; King, Neville; Rinehart, Nicole
2006-01-01
Objective: To determine the impact of a parent education and behavior management intervention (PEBM) on the mental health and adjustment of parents with preschool children with autism. Method: A randomized, group-comparison design involving a parent education and counseling intervention to control for nonspecific therapist effects and a control…
ERIC Educational Resources Information Center
McLeod, Sharynne; Baker, Elise; McCormack, Jane; Wren, Yvonne; Roulstone, Sue; Crowe, Kathryn; Masso, Sarah; White, Paul; Howland, Charlotte
2017-01-01
Purpose: The aim was to evaluate the effectiveness of computer-assisted input-based intervention for children with speech sound disorders (SSD). Method: The Sound Start Study was a cluster-randomized controlled trial. Seventy-nine early childhood centers were invited to participate, 45 were recruited, and 1,205 parents and educators of 4- and…
Controlling the influence of elastic eigenmodes on nanomagnet dynamics through pattern geometry
NASA Astrophysics Data System (ADS)
Berk, C.; Yahagi, Y.; Dhuey, S.; Cabrini, S.; Schmidt, H.
2017-03-01
The effect of the nanoscale array geometry on the interaction between optically generated surface acoustic waves (SAWs) and nanomagnet dynamics is investigated using Time-Resolved Magneto-Optical Kerr Effect Microscopy (TR-MOKE). It is demonstrated that altering the nanomagnet geometry from a periodic to a randomized aperiodic pattern effectively removes the magneto-elastic effect of SAWs on the magnetization dynamics. The efficiency of this method depends on the extent of any residual spatial correlations and is quantified by spatial Fourier analysis of the two structures. Randomization allows intrinsic magnetic parameters such as spin wave frequencies and damping to be observed and extracted using all-optical methods, enabling the conclusion that the fabrication process does not affect the damping.
Group versus individual family planning counseling in Ghana: a randomized, noninferiority trial.
Schwandt, Hilary M; Creanga, Andreea A; Danso, Kwabena A; Adanu, Richard M K; Agbenyega, Tsiri; Hindin, Michelle J
2013-08-01
Group, rather than individual, family planning counseling has the potential to increase family planning knowledge and use through more efficient use of limited human resources. A randomized, noninferiority study design was utilized to identify whether group family planning counseling is as effective as individual family planning counseling in Ghana. Female gynecology patients were enrolled from two teaching hospitals in Ghana in June and July 2008. Patients were randomized to receive either group or individual family planning counseling. The primary outcome in this study was change in modern contraceptive method knowledge. Changes in family planning use intention before and after the intervention and intended method type were also explored. Comparisons between the two study arms suggest that randomization was successful. The difference in change in modern contraceptive methods known from baseline to follow-up between the two study arms (group minus individual), adjusted for study site, was -0.21 (95% confidence interval: -0.53 to 0.12), suggesting no difference between the two arms. Group family planning counseling was as effective as individual family planning counseling in increasing modern contraceptive knowledge among female gynecology patients in Ghana. Copyright © 2013 Elsevier Inc. All rights reserved.
Reitsma, Angela; Chu, Rong; Thorpe, Julia; McDonald, Sarah; Thabane, Lehana; Hutton, Eileen
2014-09-26
Clustering of outcomes at centers involved in multicenter trials is a type of center effect. The Consolidated Standards of Reporting Trials Statement recommends that multicenter randomized controlled trials (RCTs) should account for center effects in their analysis; however, most do not. The Early External Cephalic Version (EECV) trials published in 2003 and 2011 stratified by center at randomization but did not account for center in the analyses, and due to the nature of the intervention and the number of centers, may have been prone to center effects. Using data from the EECV trials, we undertook an empirical study to compare various statistical approaches to account for center effect while estimating the impact of external cephalic version timing (early or delayed) on the outcomes of cesarean section, preterm birth, and non-cephalic presentation at the time of birth. The data from the EECV pilot trial and the EECV2 trial were merged into one dataset. Fisher's exact method was used to test the overall effect of external cephalic version timing unadjusted for center effects. Seven statistical models that accounted for center effects were applied to the data: (i) the Mantel-Haenszel test, (ii) logistic regression with fixed center effect and fixed treatment effect, (iii) center-size-weighted and (iv) unweighted logistic regression with fixed center effect and fixed treatment-by-center interaction, (v) logistic regression with random center effect and fixed treatment effect, (vi) logistic regression with random center effect and random treatment-by-center interaction, and (vii) generalized estimating equations. For each of the three outcomes of interest, the approaches to account for center effect did not alter the overall findings of the trial. The results were similar for the majority of the methods used to adjust for center, illustrating the robustness of the findings. Despite literature that suggests center effect can change the estimate of effect in multicenter trials, this empirical study does not show a difference in the outcomes of the EECV trials when accounting for center effect. The EECV2 trial was registered on 30 July 2005 with Current Controlled Trials: ISRCTN 56498577.
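A sketch of a few of the listed analyses for a binary outcome is given below in R; the data frame eecv and its columns (outcome, timing, center) are hypothetical, and the observations are sorted by center for the GEE fit.

```r
# Sketch of some of the center-effect analyses compared above, for a binary outcome
# (e.g., cesarean section). The data frame `eecv` and its columns are hypothetical;
# outcome and timing are assumed to be binary factors.
library(lme4)
library(geepack)

eecv <- eecv[order(eecv$center), ]   # geepack expects clusters in contiguous blocks

# (i) Mantel-Haenszel test stratified by center
mantelhaen.test(table(eecv$timing, eecv$outcome, eecv$center))

# (ii) logistic regression with fixed center effects
glm(outcome ~ timing + factor(center), data = eecv, family = binomial)

# (v) logistic regression with a random center effect
glmer(outcome ~ timing + (1 | center), data = eecv, family = binomial)

# (vii) generalized estimating equations with exchangeable within-center correlation
geeglm(outcome ~ timing, id = center, data = eecv,
       family = binomial, corstr = "exchangeable")
```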
Should multiple imputation be the method of choice for handling missing data in randomized trials?
Sullivan, Thomas R; White, Ian R; Salter, Amy B; Ryan, Philip; Lee, Katherine J
2016-01-01
The use of multiple imputation has increased markedly in recent years, and journal reviewers may expect to see multiple imputation used to handle missing data. However in randomized trials, where treatment group is always observed and independent of baseline covariates, other approaches may be preferable. Using data simulation we evaluated multiple imputation, performed both overall and separately by randomized group, across a range of commonly encountered scenarios. We considered both missing outcome and missing baseline data, with missing outcome data induced under missing at random mechanisms. Provided the analysis model was correctly specified, multiple imputation produced unbiased treatment effect estimates, but alternative unbiased approaches were often more efficient. When the analysis model overlooked an interaction effect involving randomized group, multiple imputation produced biased estimates of the average treatment effect when applied to missing outcome data, unless imputation was performed separately by randomized group. Based on these results, we conclude that multiple imputation should not be seen as the only acceptable way to handle missing data in randomized trials. In settings where multiple imputation is adopted, we recommend that imputation is carried out separately by randomized group. PMID:28034175
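A sketch of the recommended strategy, imputing separately by randomized group before pooling, is shown below using the R package mice; the data frame trial and its columns (y, x, arm) are hypothetical.

```r
# Sketch of imputing missing outcomes separately within each randomized group and
# then pooling. The data frame `trial` and its columns y, x, arm are hypothetical.
library(mice)

arms <- split(trial[, c("y", "x")], trial$arm)

imp_by_arm <- lapply(arms, function(d) {
  meth <- make.method(d); meth["y"] <- "norm"   # only y has missing values
  mice(d, m = 20, method = meth, printFlag = FALSE)
})

fits <- lapply(seq_len(20), function(i) {
  completed <- do.call(rbind, Map(function(imp, arm)
    cbind(mice::complete(imp, action = i), arm = arm),
    imp_by_arm, names(imp_by_arm)))
  lm(y ~ arm, data = completed)   # analysis model fitted to each completed data set
})
summary(pool(as.mira(fits)))      # Rubin's rules across the 20 imputations
```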
Treatment selection in a randomized clinical trial via covariate-specific treatment effect curves.
Ma, Yunbei; Zhou, Xiao-Hua
2017-02-01
For time-to-event data in a randomized clinical trial, we proposed two new methods for selecting an optimal treatment for a patient based on the covariate-specific treatment effect curve, which is used to represent the clinical utility of a predictive biomarker. To select an optimal treatment for a patient with a specific biomarker value, we proposed pointwise confidence intervals for each covariate-specific treatment effect curve and for the difference between the covariate-specific treatment effect curves of two treatments. Furthermore, to select an optimal treatment for a future biomarker-defined subpopulation of patients, we proposed confidence bands for each covariate-specific treatment effect curve and for the difference between each pair of covariate-specific treatment effect curves over a fixed interval of biomarker values. We constructed the confidence bands based on a resampling technique. We also conducted simulation studies to evaluate the finite-sample properties of the proposed estimation methods. Finally, we illustrated the application of the proposed method in a real-world data set.
Scott, JoAnna M; deCamp, Allan; Juraska, Michal; Fay, Michael P; Gilbert, Peter B
2017-04-01
Stepped wedge designs are increasingly commonplace and advantageous for cluster randomized trials when it is both unethical to assign placebo and logistically difficult to allocate an intervention simultaneously to many clusters. We study marginal mean models fit with generalized estimating equations for assessing treatment effectiveness in stepped wedge cluster randomized trials. This approach has advantages over the more commonly used mixed models in that (1) the population-average parameters have an important interpretation for public health applications and (2) it avoids untestable assumptions on latent variable distributions and parametric assumptions about error distributions, therefore providing more robust evidence on treatment effects. However, cluster randomized trials typically have a small number of clusters, rendering the standard generalized estimating equation sandwich variance estimator biased and highly variable and hence yielding incorrect inferences. We study the usual asymptotic generalized estimating equation inferences (i.e., using sandwich variance estimators and asymptotic normality) and four small-sample corrections to generalized estimating equations for stepped wedge cluster randomized trials and for parallel cluster randomized trials as a comparison. We show by simulation that the small-sample corrections provide improvement, with one correction appearing to provide at least nominal coverage even with only 10 clusters per group. These results demonstrate the viability of the marginal mean approach for both stepped wedge and parallel cluster randomized trials. We also study the comparative performance of the corrected methods for stepped wedge and parallel designs, and describe how the methods can accommodate interval censoring of individual failure times and incorporate semiparametric efficient estimators.
Differential Susceptibility to Prevention: GABAergic, Dopaminergic, and Multilocus Effects
ERIC Educational Resources Information Center
Brody, Gene H.; Chen, Yi-fu; Beach, Steven R. H.
2013-01-01
Background: Randomized prevention trials provide a unique opportunity to test hypotheses about the interaction of genetic predispositions with contextual processes to create variations in phenotypes over time. Methods: Using two longitudinal, randomized prevention trials, molecular genetic and alcohol use outcome data were gathered from more than…
Hossain, Ahmed; Beyene, Joseph
2014-01-01
This article compares baseline, average, and longitudinal data analysis methods for identifying genetic variants in a genome-wide association study using the Genetic Analysis Workshop 18 data. We apply methods that include (a) linear mixed models with baseline measures, (b) random intercept linear mixed models with mean measures as the outcome, and (c) random intercept linear mixed models with longitudinal measurements. In the linear mixed models, covariates are included as fixed effects, whereas relatedness among individuals is incorporated as the variance-covariance structure of the random effect for the individuals. The overall strategy of applying linear mixed models to decorrelate the data is based on Aulchenko et al.'s GRAMMAR. Using systolic and diastolic blood pressure separately as outcomes, we compare the 3 methods in identifying a known genetic variant on chromosome 3 that is associated with blood pressure, using the simulated phenotype data. We also analyze the real phenotype data to illustrate the methods. We conclude that the linear mixed model with longitudinal measurements of diastolic blood pressure is the most accurate at identifying the known single-nucleotide polymorphism among the methods, but linear mixed models with baseline measures perform best with systolic blood pressure as the outcome.
The Performance of Methods to Test Upper-Level Mediation in the Presence of Nonnormal Data
ERIC Educational Resources Information Center
Pituch, Keenan A.; Stapleton, Laura M.
2008-01-01
A Monte Carlo study compared the statistical performance of standard and robust multilevel mediation analysis methods to test indirect effects for a cluster randomized experimental design under various departures from normality. The performance of these methods was examined for an upper-level mediation process, where the indirect effect is a fixed…
ERIC Educational Resources Information Center
Greenhill, Laurence L.; Newcorn, Jeffrey H.; Gao, Haitao; Feldman, Peter D.
2007-01-01
Objective: To compare the effects of two different methods for initiating atomoxetine in terms of the incidence of early adverse events. Method: Data on atomoxetine treatment-emergent adverse events in youths, ages 6 to 18 years, were analyzed from five randomized, double-blind, placebo-controlled, acute-phase studies. Two studies involve…
Using Audit Information to Adjust Parameter Estimates for Data Errors in Clinical Trials
Shepherd, Bryan E.; Shaw, Pamela A.; Dodd, Lori E.
2013-01-01
Background Audits are often performed to assess the quality of clinical trial data, but beyond detecting fraud or sloppiness, the audit data is generally ignored. In earlier work using data from a non-randomized study, Shepherd and Yu (2011) developed statistical methods to incorporate audit results into study estimates, and demonstrated that audit data could be used to eliminate bias. Purpose In this manuscript we examine the usefulness of audit-based error-correction methods in clinical trial settings where a continuous outcome is of primary interest. Methods We demonstrate the bias of multiple linear regression estimates in general settings with an outcome that may have errors and a set of covariates for which some may have errors and others, including treatment assignment, are recorded correctly for all subjects. We study this bias under different assumptions including independence between treatment assignment, covariates, and data errors (conceivable in a double-blinded randomized trial) and independence between treatment assignment and covariates but not data errors (possible in an unblinded randomized trial). We review moment-based estimators to incorporate the audit data and propose new multiple imputation estimators. The performance of estimators is studied in simulations. Results When treatment is randomized and unrelated to data errors, estimates of the treatment effect using the original error-prone data (i.e., ignoring the audit results) are unbiased. In this setting, both moment and multiple imputation estimators incorporating audit data are more variable than standard analyses using the original data. In contrast, in settings where treatment is randomized but correlated with data errors and in settings where treatment is not randomized, standard treatment effect estimates will be biased. And in all settings, parameter estimates for the original, error-prone covariates will be biased. Treatment and covariate effect estimates can be corrected by incorporating audit data using either the multiple imputation or moment-based approaches. Bias, precision, and coverage of confidence intervals improve as the audit size increases. Limitations The extent of bias and the performance of methods depend on the extent and nature of the error as well as the size of the audit. This work only considers methods for the linear model. Settings much different than those considered here need further study. Conclusions In randomized trials with continuous outcomes and treatment assignment independent of data errors, standard analyses of treatment effects will be unbiased and are recommended. However, if treatment assignment is correlated with data errors or other covariates, naive analyses may be biased. In these settings, and when covariate effects are of interest, approaches for incorporating audit results should be considered. PMID:22848072
Randomized trial of anesthetic methods for intravitreal injections.
Blaha, Gregory R; Tilton, Elisha P; Barouch, Fina C; Marx, Jeffrey L
2011-03-01
To compare the effectiveness of four different anesthetic methods for intravitreal injection. Twenty-four patients each received four intravitreal injections using each of four types of anesthesia (proparacaine, tetracaine, lidocaine pledget, and subconjunctival injection of lidocaine) in a prospective, masked, randomized block design. Pain was graded by the patient on a 0 to 10 scale for both the anesthesia and the injection. The average combined pain scores for both the anesthesia and the intravitreal injection were 4.4 for the lidocaine pledget, 3.5 for topical proparacaine, 3.8 for the subconjunctival lidocaine injection, and 4.1 for topical tetracaine. The differences were not significant (P = 0.65). There were also no statistical differences in the individual anesthesia or injection pain scores. Subconjunctival lidocaine injection had the most side effects. Topical anesthesia is an effective method for limiting pain associated with intravitreal injections.
Wang, Xiaolong; Li, Lin; Zhao, Jiaxin; Li, Fangliang; Guo, Wei; Chen, Xia
2017-04-01
To evaluate the effects of different preservation methods (stored in a -20°C ice chest, preserved in liquid nitrogen and dried in silica gel) on inter simple sequence repeat (ISSR) or random amplified polymorphic DNA (RAPD) analyses in various botanical specimens (including broad-leaved plants, needle-leaved plants and succulent plants) for different times (three weeks and three years), we used a statistical analysis based on the number of bands, genetic index and cluster analysis. The results demonstrate that methods used to preserve samples can provide sufficient amounts of genomic DNA for ISSR and RAPD analyses; however, the effect of different preservation methods on these analyses vary significantly, and the preservation time has little effect on these analyses. Our results provide a reference for researchers to select the most suitable preservation method depending on their study subject for the analysis of molecular markers based on genomic DNA. Copyright © 2017 Académie des sciences. Published by Elsevier Masson SAS. All rights reserved.
Sampling large random knots in a confined space
NASA Astrophysics Data System (ADS)
Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.
2007-09-01
DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is of the order O(n^2). Therefore, two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.
Zhang, Kejiang; Achari, Gopal; Li, Hua
2009-11-03
Traditionally, uncertainty in parameters is represented as probabilistic distributions and incorporated into groundwater flow and contaminant transport models. With the advent of newer uncertainty theories, it is now understood that stochastic methods cannot properly represent non-random uncertainties. In the groundwater flow and contaminant transport equations, uncertainty in some parameters may be random, whereas that of others may be non-random. The objective of this paper is to develop a fuzzy-stochastic partial differential equation (FSPDE) model to simulate conditions where both random and non-random uncertainties are involved in groundwater flow and solute transport. Three potential solution techniques, namely (a) transforming a probability distribution to a possibility distribution (Method I), in which case an FSPDE becomes a fuzzy partial differential equation (FPDE), (b) transforming a possibility distribution to a probability distribution (Method II), in which case an FSPDE becomes a stochastic partial differential equation (SPDE), and (c) the combination of Monte Carlo methods and FPDE solution techniques (Method III), are proposed and compared. The effects of these three methods on the predictive results are investigated using two case studies. The results show that the predictions obtained from Method II are a specific case of those obtained from Method I. When an exact probabilistic result is needed, Method II is suggested. As the loss or gain of information during a probability-possibility (or vice versa) transformation cannot be quantified, its influence on the predictive results is not known. Thus, Method III should probably be preferred for risk assessments.
2011-01-01
Background Clinical researchers have often preferred to use a fixed effects model for the primary interpretation of a meta-analysis. Heterogeneity is usually assessed via the well known Q and I2 statistics, along with the random effects estimate they imply. In recent years, alternative methods for quantifying heterogeneity have been proposed, that are based on a 'generalised' Q statistic. Methods We review 18 IPD meta-analyses of RCTs into treatments for cancer, in order to quantify the amount of heterogeneity present and also to discuss practical methods for explaining heterogeneity. Results Differing results were obtained when the standard Q and I2 statistics were used to test for the presence of heterogeneity. The two meta-analyses with the largest amount of heterogeneity were investigated further, and on inspection the straightforward application of a random effects model was not deemed appropriate. Compared to the standard Q statistic, the generalised Q statistic provided a more accurate platform for estimating the amount of heterogeneity in the 18 meta-analyses. Conclusions Explaining heterogeneity via the pre-specification of trial subgroups, graphical diagnostic tools and sensitivity analyses produced a more desirable outcome than an automatic application of the random effects model. Generalised Q statistic methods for quantifying and adjusting for heterogeneity should be incorporated as standard into statistical software. Software is provided to help achieve this aim. PMID:21473747
NASA Astrophysics Data System (ADS)
Zi, Bin; Zhou, Bin
2016-07-01
For the prediction of the dynamic response field of the luffing system of an automobile crane (LSOAAC) with random and interval parameters, a hybrid uncertain model is introduced. In the hybrid uncertain model, parameters with a known probability distribution are modeled as random variables, whereas parameters with only lower and upper bounds are modeled as interval variables instead of being given precise values. Based on the hybrid uncertain model, the hybrid uncertain dynamic response equilibrium equation, in which different random and interval parameters are simultaneously included in the input and output terms, is constructed. Then a modified hybrid uncertain analysis method (MHUAM) is proposed. In the MHUAM, the dynamic response expression of the LSOAAC is developed based on the random interval perturbation method, the first-order Taylor series expansion and the first-order Neumann series. Moreover, the extrema of the bounds of the dynamic response are determined by the random interval moment method and a monotonic analysis technique. Compared with the hybrid Monte Carlo method (HMCM) and the interval perturbation method (IPM), numerical results show the feasibility and efficiency of the MHUAM for solving hybrid LSOAAC problems. The effects of different uncertain models and parameters on the LSOAAC response field are also investigated in depth, and numerical results indicate that the impact made by the randomness in the thrust of the luffing cylinder F is larger than that made by the gravity of the weight in suspension Q. In addition, the impact made by the uncertainty in the displacement between the lower end of the lifting arm and the luffing cylinder a is larger than that made by the length of the lifting arm L.
Burgess, Stephen; Daniel, Rhian M; Butterworth, Adam S; Thompson, Simon G
2015-01-01
Background: Mendelian randomization uses genetic variants, assumed to be instrumental variables for a particular exposure, to estimate the causal effect of that exposure on an outcome. If the instrumental variable criteria are satisfied, the resulting estimator is consistent even in the presence of unmeasured confounding and reverse causation. Methods: We extend the Mendelian randomization paradigm to investigate more complex networks of relationships between variables, in particular where some of the effect of an exposure on the outcome may operate through an intermediate variable (a mediator). If instrumental variables for the exposure and mediator are available, direct and indirect effects of the exposure on the outcome can be estimated, for example using either a regression-based method or structural equation models. The direction of effect between the exposure and a possible mediator can also be assessed. Methods are illustrated in an applied example considering causal relationships between body mass index, C-reactive protein and uric acid. Results: These estimators are consistent in the presence of unmeasured confounding if, in addition to the instrumental variable assumptions, the effects of both the exposure on the mediator and the mediator on the outcome are homogeneous across individuals and linear without interactions. Nevertheless, a simulation study demonstrates that even considerable heterogeneity in these effects does not lead to bias in the estimates. Conclusions: These methods can be used to estimate direct and indirect causal effects in a mediation setting, and have potential for the investigation of more complex networks between multiple interrelated exposures and disease outcomes. PMID:25150977
Do little interactions get lost in dark random forests?
Wright, Marvin N; Ziegler, Andreas; König, Inke R
2016-03-31
Random forests have often been claimed to uncover interaction effects. However, if and how interaction effects can be differentiated from marginal effects remains unclear. In extensive simulation studies, we investigate whether random forest variable importance measures capture or detect gene-gene interactions. We define capturing an interaction as the ability to identify a variable that acts through an interaction with another one, and detection as the ability to identify an interaction effect as such. Of the single importance measures, the Gini importance captured interaction effects in most of the simulated scenarios; however, they were masked by marginal effects in other variables. With the permutation importance, the proportion of captured interactions was lower in all cases. Pairwise importance measures performed about equally, with a slight advantage for the joint variable importance method. However, the overall fraction of detected interactions was low. In almost all scenarios the detection fraction in a model with only marginal effects was larger than in a model with an interaction effect only. Random forests are generally capable of capturing gene-gene interactions, but current variable importance measures are unable to detect them as interactions. In most cases, interactions are masked by marginal effects and cannot be differentiated from marginal effects. Consequently, caution is warranted when claiming that random forests uncover interactions.
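The contrast between Gini and permutation importance described above can be illustrated with a small simulation. This is a hedged sketch using scikit-learn, not the authors' simulation design; the two interacting binary "SNPs" and the noise features below are hypothetical.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

# Pure two-locus interaction with no marginal effects: y = XOR of two binary
# "SNPs" (x0, x1) plus label noise; eight further noise features are included.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(2000, 10)).astype(float)
y = np.logical_xor(X[:, 0] > 0, X[:, 1] > 0).astype(int)
flip = rng.random(2000) < 0.1                          # 10% label noise
y = np.where(flip, 1 - y, y)

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
gini = rf.feature_importances_                         # impurity (Gini) importance
perm = permutation_importance(rf, X, y, n_repeats=10,
                              random_state=0).importances_mean

print("Gini importance       :", np.round(gini, 3))
print("Permutation importance:", np.round(perm, 3))
# High importance for x0 and x1 shows the interaction is *captured*, but neither
# measure identifies it as an interaction rather than two main effects.
```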
ERIC Educational Resources Information Center
Kariuki, Patrick N. K.; Bush, Elizabeth Danielle
2008-01-01
The purpose of this study was to examine the effects of Total Physical Response by Storytelling and the traditional teaching method on a foreign language in a selected high school. The sample consisted of 30 students who were randomly selected and randomly assigned to experimental and control group. The experimental group was taught using Total…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, S.Y.; Tepikian, S.
1985-01-01
Nonlinear magnetic forces become more important for particles in modern large accelerators. These nonlinear elements are introduced either intentionally to control beam dynamics or through uncontrollable random errors. Equations of motion for the nonlinear Hamiltonian are usually non-integrable. Because of the nonlinear part of the Hamiltonian, the tune diagram of accelerators is a jungle. Nonlinear magnet multipoles are important in keeping the accelerator operating point in the safe quarter of the hostile jungle of resonant tunes. Indeed, all modern accelerator designs have taken advantage of nonlinear mechanics. On the other hand, the effect of the uncontrollable random multipoles should be evaluated carefully. A powerful method of studying the effect of these nonlinear multipoles is particle tracking, in which a group of test particles is traced through the magnetic multipoles of the accelerator for hundreds to millions of turns in order to test the dynamical aperture of the machine. These methods are extremely useful in the design of large accelerators such as the SSC, LEP, HERA and RHIC. These calculations unfortunately take a tremendous amount of computing time. In this review, the method of determining chaotic orbits and its application to nonlinear problems in accelerator physics are discussed. We then discuss the scaling properties and the effect of random sextupoles.
Longitudinal data analysis with non-ignorable missing data.
Tseng, Chi-hong; Elashoff, Robert; Li, Ning; Li, Gang
2016-02-01
A common problem in longitudinal data analysis is missing data. Two types of missing patterns are generally considered in the statistical literature: monotone and non-monotone missing data. Non-monotone missing data occur when study participants intermittently miss scheduled visits, while monotone missing data can result from discontinued participation, loss to follow-up, and mortality. Although many novel statistical approaches have been developed to handle missing data in recent years, few methods are available to provide inferences that handle both types of missing data simultaneously. In this article, a latent random effects model is proposed to analyze longitudinal outcomes with both monotone and non-monotone missingness in the context of missing not at random. Another significant contribution of this article is to propose a new computational algorithm for latent random effects models. To reduce the computational burden of the high-dimensional integration problem in latent random effects models, we develop a new computational algorithm that uses an adaptive quadrature approach in conjunction with a Taylor series approximation of the likelihood function to simplify the E-step computation in the expectation-maximization algorithm. A simulation study is performed, and data from the scleroderma lung study are used to demonstrate the effectiveness of this method. © The Author(s) 2012.
Shahgholian, Nahid; Jazi, Shahrzad Khojandi; Karimian, Jahangir; Valiani, Mahboubeh
2016-01-01
Background: The prevalence of restless leg syndrome is high among patients undergoing hemodialysis. Because of the side effects of pharmacological treatments, patients often prefer non-pharmacological methods. Therefore, the present study aimed to investigate the effects of two methods, reflexology and stretching exercises, on the severity of restless leg syndrome among patients undergoing hemodialysis. Materials and Methods: This study was a randomized clinical trial conducted on 90 eligible patients undergoing hemodialysis in selected hospitals of Isfahan, who were diagnosed with restless leg syndrome through the standard restless leg syndrome questionnaire. They were randomly assigned, using a random number table, to three groups: reflexology, stretching exercises, and control. Foot reflexology and stretching exercises were conducted three times a week for 30–40 min over 4 consecutive weeks. Data analysis was performed with SPSS version 18 using descriptive and inferential statistical analyses [one-way analysis of variance (ANOVA), paired t-test, and least significant difference (LSD) post hoc test]. Results: There was a significant difference in the mean scores of restless leg syndrome severity between the reflexology and stretching exercises groups and the control group (P < 0.001), but no significant difference between the two intervention groups. Changes in the mean score of restless leg syndrome severity were significantly greater in the reflexology and stretching exercises groups than in the control group (P < 0.001), but showed no significant difference between the reflexology and stretching exercises groups. Conclusions: Our results showed that reflexology and stretching exercises can reduce the severity of restless leg syndrome. These two methods of treatment are recommended to patients. PMID:27186197
Improved Cardiovascular Prevention Using Best CME Practices: A Randomized Trial
ERIC Educational Resources Information Center
Laprise, Rejean; Thivierge, Robert; Gosselin, Gilbert; Bujas-Bobanovic, Maja; Vandal, Sylvie; Paquette, Daniel; Luneau, Micheline; Julien, Pierre; Goulet, Serge; Desaulniers, Jean; Maltais, Paule
2009-01-01
Introduction: It was hypothesized that after a continuing medical education (CME) event, practice enablers and reinforcers addressing main clinical barriers to preventive care would be more effective in improving general practitioners' (GPs) adherence to cardiovascular guidelines than a CME event only. Methods: A cluster-randomized trial was…
Tharwat, Alaa; Moemen, Yasmine S; Hassanien, Aboul Ella
2016-12-09
Measuring toxicity is one of the main steps in drug development. Hence, there is a high demand for computational models to predict the toxicity effects of potential drugs. In this study, we used a dataset comprising four toxicity effects: mutagenic, tumorigenic, irritant and reproductive effects. The proposed model consists of three phases. In the first phase, rough set-based methods are used to select the most discriminative features, reducing classification time and improving classification performance. Because of the imbalanced class distribution, in the second phase, different sampling methods such as Random Under-Sampling, Random Over-Sampling and the Synthetic Minority Oversampling Technique are used to address the problem of imbalanced datasets. An ITerative Sampling (ITS) method is proposed to avoid the limitations of those methods. The ITS method has two steps. The first step (sampling step) iteratively modifies the prior distribution of the minority and majority classes. In the second step, a data cleaning method is used to remove the overlap produced by the first step. In the third phase, a Bagging classifier is used to classify an unknown drug as toxic or non-toxic. The experimental results showed that the proposed model performed well in classifying unknown samples according to all toxic effects in the imbalanced datasets.
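The standard resampling strategies the authors compare against (random under-sampling, random over-sampling, SMOTE) combined with a Bagging classifier can be sketched with the imbalanced-learn and scikit-learn packages. The proposed ITS method itself is not implemented here, and the toxicity dataset is replaced by a synthetic imbalanced one; this is a sketch under those assumptions, not the study's pipeline.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import balanced_accuracy_score
from imblearn.over_sampling import SMOTE, RandomOverSampler
from imblearn.under_sampling import RandomUnderSampler

# Imbalanced toy data standing in for one toxicity endpoint (e.g. mutagenic vs not)
X, y = make_classification(n_samples=3000, n_features=30, weights=[0.9, 0.1],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

samplers = {"under": RandomUnderSampler(random_state=0),
            "over": RandomOverSampler(random_state=0),
            "smote": SMOTE(random_state=0)}

for name, sampler in samplers.items():
    X_bal, y_bal = sampler.fit_resample(X_tr, y_tr)   # rebalance the training set only
    clf = BaggingClassifier(n_estimators=100, random_state=0).fit(X_bal, y_bal)
    print(name, round(balanced_accuracy_score(y_te, clf.predict(X_te)), 3))
```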
Meta-Analysis in Clinical Trials Revisited
Laird, Nan
2015-01-01
In this paper, we revisit a 1986 article we published in this Journal, Meta-Analysis in Clinical Trials, where we introduced a random-effect model to summarize the evidence about treatment efficacy from a number of related clinical trials. Because of its simplicity and ease of implementation, our approach has been widely used (with more than 12,000 citations to date) and the “DerSimonian and Laird method” is now often referred to as the ‘standard approach’ or a ‘popular’ method for meta-analysis in medical and clinical research. The method is especially useful for providing an overall effect estimate and for characterizing the heterogeneity of effects across a series of studies. Here, we review the background that led to the original 1986 article, briefly describe the random-effects approach for meta-analysis, explore its use in various settings and trends over time and recommend a refinement to the method using a robust variance estimator for testing overall effect. We conclude with a discussion of repurposing the method for Big Data meta-analysis and Genome Wide Association Studies for studying the importance of genetic variants in complex diseases. PMID:26343745
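For readers unfamiliar with the method, the moment-based DerSimonian and Laird estimator can be written in a few lines. This is a minimal sketch from the standard formulas (the robust variance refinement recommended above is not included), and the example numbers are hypothetical.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate using the DerSimonian-Laird moment estimator."""
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v
    mu_fixed = np.sum(w * y) / np.sum(w)              # fixed-effect estimate
    q = np.sum(w * (y - mu_fixed) ** 2)               # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)           # between-study variance
    w_star = 1.0 / (v + tau2)                         # random-effects weights
    mu_re = np.sum(w_star * y) / np.sum(w_star)
    se_re = np.sqrt(1.0 / np.sum(w_star))
    return mu_re, se_re, tau2

# Hypothetical treatment effects (e.g. log odds ratios) and variances from four trials
mu, se, tau2 = dersimonian_laird([0.10, 0.30, -0.05, 0.25], [0.02, 0.03, 0.04, 0.01])
print(f"pooled effect = {mu:.3f} (SE {se:.3f}), tau^2 = {tau2:.3f}")
```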
Effects of Pilates method in elderly people: Systematic review of randomized controlled trials.
de Oliveira Francisco, Cristina; de Almeida Fagundes, Alessandra; Gorges, Bruna
2015-07-01
The Pilates method has been widely used in physical training and rehabilitation. Evidence regarding the effectiveness of this method in elderly people is limited. Six randomized controlled trials involving the use of the Pilates method for elderly people, published prior to December 2013, were selected from the databases PubMed, MEDLINE, Embase, Cochrane, Scielo and PEDro. Three articles suggested that Pilates produced improvements in balance. Two studies evaluated adherence to Pilates programs. One study assessed Pilates' influence on cardio-metabolic parameters and another evaluated changes in body composition. Strong evidence was found for beneficial effects of Pilates on static and dynamic balance in women. Nevertheless, evidence of balance improvement in both genders, of changes in body composition in women, and of adherence to Pilates programs was limited. Effects of Pilates training on cardio-metabolic parameters were inconclusive. Pilates may be a useful tool in rehabilitation and prevention programs, but more high-quality studies are necessary to establish all its effects in elderly populations. Copyright © 2015 Elsevier Ltd. All rights reserved.
Random mechanics: Nonlinear vibrations, turbulences, seisms, swells, fatigue
NASA Astrophysics Data System (ADS)
Kree, P.; Soize, C.
The random modeling of physical phenomena, together with probabilistic methods for the numerical calculation of random mechanical forces, is analytically explored. Attention is given to theoretical topics such as probabilistic concepts, linear filtering techniques, and trajectory statistics. Applications of the methods to structures experiencing atmospheric turbulence, the quantification of turbulence, and the dynamic responses of the structures are considered. A probabilistic approach is taken to study the effects of earthquakes on structures and the forces exerted by ocean waves on marine structures. Theoretical analyses by means of vector spaces and stochastic modeling are reviewed, as are Markovian formulations of Gaussian processes and the definition of stochastic differential equations. Finally, random vibrations with a variable number of links and linear oscillators driven by the square of Gaussian processes are investigated.
Yao, Shengnan; Zeng, Weiming; Wang, Nizhuan; Chen, Lei
2013-07-01
Independent component analysis (ICA) has been proven effective for functional magnetic resonance imaging (fMRI) data analysis. However, ICA decomposition requires iterative optimization of the unmixing matrix, whose initial values are generated randomly, so the randomness of the initialization leads to different ICA decomposition results. A single decomposition is therefore not usually reliable for fMRI data analysis. Under this circumstance, several repeated-decomposition ICA (RDICA) methods were proposed to reveal the stability of ICA decomposition. Although RDICA has achieved satisfying results in validating the performance of ICA decomposition, it costs substantial computing time. To mitigate this problem, we propose a method, named ATGP-ICA, for fMRI data analysis. This method generates fixed initial values with an automatic target generation process (ATGP) instead of producing them randomly. We performed experiments on both hybrid data and fMRI data to demonstrate the effectiveness of the new method and compared the traditional one-time decomposition with ICA (ODICA), RDICA and ATGP-ICA. The proposed method not only eliminates the randomness of ICA decomposition but also saves considerable computing time compared to RDICA. Furthermore, a ROC (Receiver Operating Characteristic) power analysis indicated better signal reconstruction performance for ATGP-ICA than for RDICA. Copyright © 2013 Elsevier Inc. All rights reserved.
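The sensitivity to initialization that motivates ATGP-ICA can be illustrated with scikit-learn's FastICA, which accepts a user-supplied initial unmixing matrix via w_init. The sketch below uses an identity initialization purely as a stand-in, since ATGP itself is not implemented here, and the mixed signals are synthetic.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Mix three synthetic sources to stand in for fMRI time courses
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 1000)
S = np.c_[np.sin(3 * t), np.sign(np.sin(5 * t)), rng.laplace(size=t.size)]
X = S @ rng.normal(size=(3, 3)).T                      # observed mixtures

# Random initialization: the recovered components differ from run to run
for seed in (1, 2):
    ica = FastICA(n_components=3, random_state=seed, max_iter=1000)
    print("random init, seed", seed, np.round(ica.fit(X).components_[0], 3))

# Fixed initialization (identity here; ATGP would supply a data-driven w_init)
ica_fixed = FastICA(n_components=3, w_init=np.eye(3), max_iter=1000)
print("fixed init:", np.round(ica_fixed.fit(X).components_[0], 3))
```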
Chan, Derwin K; Ivarsson, Andreas; Stenling, Andreas; Yang, Sophie X; Chatzisarantis, Nikos L; Hagger, Martin S
2015-12-01
Consistency tendency is characterized by the propensity of participants to respond to subsequent items in a survey in a manner consistent with their responses to previous items. This method effect might contaminate the results of sport psychology surveys using cross-sectional designs. We present a randomized controlled crossover study examining the effect of consistency tendency on the motivational pathway (i.e., autonomy support → autonomous motivation → intention) of self-determination theory in the context of sport injury prevention. Athletes from Sweden (N = 341) responded to the survey printed with either low interitem distance (IID; consistency tendency likely) or high IID (consistency tendency suppressed) on two separate occasions, with a one-week interim period. Participants were randomly allocated into two groups, and they received the survey with a different IID at each occasion. Bayesian structural equation modeling showed that the low IID condition yielded stronger parameter estimates than the high IID condition, but the differences were not statistically significant.
Applying a weighted random forests method to extract karst sinkholes from LiDAR data
NASA Astrophysics Data System (ADS)
Zhu, Junfeng; Pierskalla, William P.
2016-02-01
Detailed mapping of sinkholes provides critical information for mitigating sinkhole hazards and for understanding groundwater and surface water interactions in karst terrains. LiDAR (Light Detection and Ranging) measures the earth's surface at high resolution and high density and has shown great potential to drastically improve the locating and delineation of sinkholes. However, processing LiDAR data to extract sinkholes requires separating sinkholes from other depressions, which can be laborious because of the sheer number of depressions commonly generated from LiDAR data. In this study, we applied random forests, a machine learning method, to automatically separate sinkholes from other depressions in a karst region in central Kentucky. The sinkhole-extraction random forest was grown on a training dataset built from an area where LiDAR-derived depressions were manually classified through visual inspection and field verification. Based on the geometry of depressions, as well as natural and human factors related to sinkholes, 11 parameters were selected as predictive variables to form the dataset. Because the training dataset was imbalanced, with the majority of depressions being non-sinkholes, a weighted random forests method was used to improve the accuracy of predicting sinkholes. The weighted random forest achieved an average accuracy of 89.95% for the training dataset, demonstrating that the random forest can be an effective sinkhole classifier. Testing of the random forest in another area, however, resulted in moderate success, with an average accuracy of 73.96%. This study suggests that an automatic sinkhole extraction procedure such as the random forest classifier can significantly reduce time and labor costs and make it more tractable to map sinkholes from LiDAR data over large areas. However, the random forests method cannot totally replace manual procedures such as visual inspection and field verification.
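One common way to weight a random forest toward a rare positive class in scikit-learn is the class_weight option. This is only a hedged illustration of that idea on synthetic data, not the authors' exact weighting scheme or their 11 LiDAR-derived predictors.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical depression features: 11 geometric/contextual predictors, with
# sinkholes (label 1) forming a small minority of all depressions.
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 11))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=2, size=5000) > 2.5).astype(int)

# "balanced" reweights classes inversely to their frequency, so rare sinkholes
# are not swamped by the majority of non-sinkhole depressions.
rf = RandomForestClassifier(n_estimators=500, class_weight="balanced",
                            random_state=0)
print("CV accuracy:", round(cross_val_score(rf, X, y, cv=5).mean(), 3))
```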
Likelihood-Based Random-Effect Meta-Analysis of Binary Events.
Amatya, Anup; Bhaumik, Dulal K; Normand, Sharon-Lise; Greenhouse, Joel; Kaizar, Eloise; Neelon, Brian; Gibbons, Robert D
2015-01-01
Meta-analysis has been used extensively for evaluation of efficacy and safety of medical interventions. Its advantages and utilities are well known. However, recent studies have raised questions about the accuracy of the commonly used moment-based meta-analytic methods in general and for rare binary outcomes in particular. The issue is further complicated for studies with heterogeneous effect sizes. Likelihood-based mixed-effects modeling provides an alternative to moment-based methods such as inverse-variance weighted fixed- and random-effects estimators. In this article, we compare and contrast different mixed-effect modeling strategies in the context of meta-analysis. Their performance in estimation and testing of overall effect and heterogeneity are evaluated when combining results from studies with a binary outcome. Models that allow heterogeneity in both baseline rate and treatment effect across studies have low type I and type II error rates, and their estimates are the least biased among the models considered.
Robust, Adaptive Functional Regression in Functional Mixed Model Framework.
Zhu, Hongxiao; Brown, Philip J; Morris, Jeffrey S
2011-09-01
Functional data are increasingly encountered in scientific studies, and their high dimensionality and complexity lead to many analytical challenges. Various methods for functional data analysis have been developed, including functional response regression methods that involve regression of a functional response on univariate/multivariate predictors with nonparametrically represented functional coefficients. In existing methods, however, the functional regression can be sensitive to outlying curves and outlying regions of curves, so is not robust. In this paper, we introduce a new Bayesian method, robust functional mixed models (R-FMM), for performing robust functional regression within the general functional mixed model framework, which includes multiple continuous or categorical predictors and random effect functions accommodating potential between-function correlation induced by the experimental design. The underlying model involves a hierarchical scale mixture model for the fixed effects, random effect and residual error functions. These modeling assumptions across curves result in robust nonparametric estimators of the fixed and random effect functions which down-weight outlying curves and regions of curves, and produce statistics that can be used to flag global and local outliers. These assumptions also lead to distributions across wavelet coefficients that have outstanding sparsity and adaptive shrinkage properties, with great flexibility for the data to determine the sparsity and the heaviness of the tails. Together with the down-weighting of outliers, these within-curve properties lead to fixed and random effect function estimates that appear in our simulations to be remarkably adaptive in their ability to remove spurious features yet retain true features of the functions. We have developed general code to implement this fully Bayesian method that is automatic, requiring the user to only provide the functional data and design matrices. It is efficient enough to handle large data sets, and yields posterior samples of all model parameters that can be used to perform desired Bayesian estimation and inference. Although we present details for a specific implementation of the R-FMM using specific distributional choices in the hierarchical model, 1D functions, and wavelet transforms, the method can be applied more generally using other heavy-tailed distributions, higher dimensional functions (e.g. images), and using other invertible transformations as alternatives to wavelets.
A Bayesian, generalized frailty model for comet assays.
Ghebretinsae, Aklilu Habteab; Faes, Christel; Molenberghs, Geert; De Boeck, Marlies; Geys, Helena
2013-05-01
This paper proposes a flexible modeling approach for so-called comet assay data regularly encountered in preclinical research. While such data consist of non-Gaussian outcomes in a multilevel hierarchical structure, traditional analyses typically ignore this hierarchical nature, completely or in part, by summarizing measurements within a cluster. Non-Gaussian outcomes are often modeled using exponential family models. This is true not only for binary and count data, but also, for example, for time-to-event outcomes. Two important reasons for extending this family are (1) the possible occurrence of overdispersion, meaning that the variability in the data may not be adequately described by models that often exhibit a prescribed mean-variance link, and (2) the accommodation of a hierarchical structure in the data, owing to clustering. The first issue is dealt with through so-called overdispersion models. Clustering is often accommodated through the inclusion of random subject-specific effects. Conventionally, though not always, such random effects are assumed to be normally distributed. In the case of time-to-event data, one encounters, for example, the gamma frailty model (Duchateau and Janssen, 2007). While both issues may occur simultaneously, models combining them are uncommon. Molenberghs et al. (2010) proposed a broad class of generalized linear models accommodating overdispersion and clustering through two separate sets of random effects. Here, we use this method to model data from a comet assay with a three-level hierarchical structure. Although a conjugate gamma random effect is used for the overdispersion random effect, both gamma and normal random effects are considered for the hierarchical random effect. Apart from model formulation, we place emphasis on Bayesian estimation. Our proposed method has an advantage over the traditional analysis in that it (1) uses the appropriate distribution stipulated in the literature, (2) deals with the complete hierarchical nature of the data, and (3) uses all information instead of summary measures. The fit of the model to the comet assay data is compared against more conventional model fits. Results indicate the toxicity of 1,2-dimethylhydrazine dihydrochloride at different dose levels (low, medium, and high).
Ewing, Alexander C.; Kottke, Melissa J.; Kraft, Joan Marie; Sales, Jessica M.; Brown, Jennifer L.; Goedken, Peggy; Wiener, Jeffrey; Kourtis, Athena P.
2018-01-01
Background African American adolescent females are at elevated risk for unintended pregnancy and sexually transmitted infections (STIs). Dual protection (DP) is defined as concurrent prevention of pregnancy and STIs. This can be achieved by abstinence, consistent condom use, or the dual methods of condoms plus an effective non-barrier contraceptive. Previous clinic-based interventions showed short-term effects on increasing dual method use, but evidence of sustained effects on dual method use and decreased incident pregnancies and STIs are lacking. Methods/Design This manuscript describes the 2GETHER Project. 2GETHER is a randomized controlled trial of a multi-component intervention to increase dual protection use among sexually active African American females aged 14–19 years not desiring pregnancy at a Title X clinic in Atlanta, GA. The intervention is clinic-based and includes a culturally tailored interactive multimedia component and counseling sessions, both to assist in selection of a DP method and to reinforce use of the DP method. The participants are randomized to the study intervention or the standard of care, and followed for 12 months to evaluate how the intervention influences DP method selection and adherence, pregnancy and STI incidence, and participants’ DP knowledge, intentions, and self-efficacy. Discussion The 2GETHER Project is a novel trial to reduce unintended pregnancies and STIs among African American adolescents. The intervention is unique in the comprehensive and complementary nature of its components and its individual tailoring of provider-patient interaction. If the trial interventions are shown to be effective, then it will be reasonable to assess their scalability and applicability in other populations. PMID:28007634
Nejati, Parisa; Ghahremaninia, Armita; Naderi, Farrokh; Gharibzadeh, Safoora; Mazaherinezhad, Ali
2017-01-01
Background: Subacromial impingement syndrome (SAIS) is the most common disorder of the shoulder. The evidence for the effectiveness of treatment options is inconclusive and limited. Therefore, there is a need for more evidence in this regard, particularly for long-term outcomes. Hypothesis: Platelet-rich plasma (PRP) would be an effective method in treating subacromial impingement. Study Design: Randomized controlled trial; Level of evidence, 1. Methods: This was a single-blinded randomized clinical trial with 1-, 3-, and 6-month follow-up. Sixty-two patients were randomly placed into 2 groups, receiving either PRP or exercise therapy. The outcome parameters were pain, shoulder range of motion (ROM), muscle force, functionality, and magnetic resonance imaging findings. Results: Both treatment options significantly reduced pain and increased shoulder ROM compared with baseline measurements. Both treatments also significantly improved functionality. However, the treatment choices were not significantly effective in improving muscle force. Trend analysis revealed that in the first and third months, exercise therapy was superior to PRP in pain, shoulder flexion and abduction, and functionality. However, in the sixth month, only shoulder abduction and total Western Ontario Rotator Cuff score were significantly different between the 2 groups. Conclusion: Both PRP injection and exercise therapy were effective in reducing pain and disability in patients with SAIS, with exercise therapy proving more effective. PMID:28567426
Fang, Yun; Wu, Hulin; Zhu, Li-Xing
2011-07-01
We propose a two-stage estimation method for random coefficient ordinary differential equation (ODE) models. A maximum pseudo-likelihood estimator (MPLE) is derived based on a mixed-effects modeling approach and its asymptotic properties for population parameters are established. The proposed method does not require repeatedly solving ODEs, and is computationally efficient although it does pay a price with the loss of some estimation efficiency. However, the method does offer an alternative approach when the exact likelihood approach fails due to model complexity and high-dimensional parameter space, and it can also serve as a method to obtain the starting estimates for more accurate estimation methods. In addition, the proposed method does not need to specify the initial values of state variables and preserves all the advantages of the mixed-effects modeling approach. The finite sample properties of the proposed estimator are studied via Monte Carlo simulations and the methodology is also illustrated with application to an AIDS clinical data set.
Jian Yang; Hong S. He; Brian R. Sturtevant; Brian R. Miranda; Eric J. Gustafson
2008-01-01
We compared four fire spread simulation methods (completely random, dynamic percolation, size-based minimum travel time algorithm, and duration-based minimum travel time algorithm) and two fire occurrence simulation methods (Poisson fire frequency model and hierarchical fire frequency model) using a two-way factorial design. We examined these treatment effects on...
Electromagnetic Scattering by Fully Ordered and Quasi-Random Rigid Particulate Samples
NASA Technical Reports Server (NTRS)
Mishchenko, Michael I.; Dlugach, Janna M.; Mackowski, Daniel W.
2016-01-01
In this paper we have analyzed circumstances under which a rigid particulate sample can behave optically as a true discrete random medium consisting of particles randomly moving relative to each other during measurement. To this end, we applied the numerically exact superposition T-matrix method to model far-field scattering characteristics of fully ordered and quasi-randomly arranged rigid multiparticle groups in fixed and random orientations. We have shown that, in and of itself, averaging optical observables over movements of a rigid sample as a whole is insufficient unless it is combined with a quasi-random arrangement of the constituent particles in the sample. Otherwise, certain scattering effects typical of discrete random media (including some manifestations of coherent backscattering) may not be accurately replicated.
Latent class instrumental variables: A clinical and biostatistical perspective
Baker, Stuart G.; Kramer, Barnett S.; Lindeman, Karen S.
2015-01-01
In some two-arm randomized trials, some participants receive the treatment assigned to the other arm as a result of technical problems, refusal of a treatment invitation, or a choice of treatment in an encouragement design. In some before-and-after studies, the availability of a new treatment changes from one time period to the next. Under assumptions that are often reasonable, the latent class instrumental variable (IV) method estimates the effect of treatment received in the aforementioned scenarios involving all-or-none compliance and all-or-none availability. Key aspects are four initial latent classes (sometimes called principal strata) based on the treatment that would be received under each randomization group or time period, the exclusion restriction assumption (in which randomization group or time period is an instrumental variable), the monotonicity assumption (which drops an implausible latent class from the analysis), and the estimated effect of receiving treatment in one latent class (sometimes called efficacy, the local average treatment effect, or the complier average causal effect). Since its independent formulations in the biostatistics and econometrics literatures, the latent class IV method (which has no well-established name) has gained increasing popularity. We review the latent class IV method from a clinical and biostatistical perspective, focusing on underlying assumptions, methodological extensions, and applications in our fields of obstetrics and cancer research. PMID:26239275
A Tutorial on Multilevel Survival Analysis: Methods, Models and Applications
Austin, Peter C.
2017-01-01
Summary Data that have a multilevel structure occur frequently across a range of disciplines, including epidemiology, health services research, public health, education and sociology. We describe three families of regression models for the analysis of multilevel survival data. First, Cox proportional hazards models with mixed effects incorporate cluster-specific random effects that modify the baseline hazard function. Second, piecewise exponential survival models partition the duration of follow-up into mutually exclusive intervals and fit a model that assumes that the hazard function is constant within each interval. This is equivalent to a Poisson regression model that incorporates the duration of exposure within each interval. By incorporating cluster-specific random effects, generalised linear mixed models can be used to analyse these data. Third, after partitioning the duration of follow-up into mutually exclusive intervals, one can use discrete time survival models that use a complementary log–log generalised linear model to model the occurrence of the outcome of interest within each interval. Random effects can be incorporated to account for within-cluster homogeneity in outcomes. We illustrate the application of these methods using data consisting of patients hospitalised with a heart attack. We illustrate the application of these methods using three statistical programming languages (R, SAS and Stata). PMID:29307954
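The equivalence between the piecewise exponential survival model and a Poisson regression with a log-exposure offset, mentioned above, can be sketched with statsmodels. The person-period data are simulated, and the cluster-specific random effects of the full generalised linear mixed model are omitted; this is an illustrative sketch, not the tutorial's code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical person-period data: one row per patient per follow-up interval,
# with days at risk in the interval and the event count in that interval.
rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "interval": rng.choice(["0-30d", "31-90d", "91-365d"], size=n),
    "age": np.round(rng.normal(65, 8, size=n)),
    "exposure": rng.integers(5, 60, size=n),          # days at risk in the interval
})
rate = np.exp(-5.0 + 0.03 * (df["age"] - 65))         # simulated hazard per day
df["events"] = rng.poisson(rate * df["exposure"])

# Piecewise exponential model as a Poisson GLM with a log-exposure offset;
# the interval factor gives the piecewise-constant baseline hazard.
fit = smf.glm("events ~ C(interval) + age", data=df,
              family=sm.families.Poisson(),
              offset=np.log(df["exposure"])).fit()
print(fit.params)
```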
Role of re-screening of cervical smears in internal quality control.
Baker, A; Melcher, D; Smith, R
1995-01-01
AIMS--To investigate the use of rapid re-screening as a quality control method for previously screened cervical slides, and to compare this method with 10% random re-screening and clinically indicated double screening. METHODS--Between June 1990 and December 1994, 117,890 negative smears were subjected to rapid re-screening. RESULTS--This study shows that rapid re-screening detects far greater numbers of false negative cases than both 10% random re-screening and clinically indicated double screening, with no additional demand on human resources. The technique also identifies variation in the performance of screening personnel as an additional benefit. CONCLUSION--Rapid re-screening is an effective method of quality control. Although less sensitive, rapid re-screening should replace 10% random re-screening and selected re-screening, as greater numbers of false negative results are detected while consuming fewer resources. PMID:8543619
Tehran Air Pollutants Prediction Based on Random Forest Feature Selection Method
NASA Astrophysics Data System (ADS)
Shamsoddini, A.; Aboodi, M. R.; Karami, J.
2017-09-01
Air pollution, as one of the most serious forms of environmental pollution, poses a huge threat to human life. Air pollution leads to environmental instability and has harmful and undesirable effects on the environment. Modern methods for predicting pollutant concentrations can improve decision making and provide appropriate solutions. This study examines the performance of Random Forest feature selection in combination with multiple linear regression and Multilayer Perceptron Artificial Neural Network methods, in order to achieve an efficient model for estimating carbon monoxide, nitrogen dioxide, sulfur dioxide and PM2.5 concentrations in the air. The results indicated that Artificial Neural Networks fed by the attributes selected by the Random Forest feature selection method performed more accurately than the other models for all pollutants. The estimation accuracy for sulfur dioxide was lower than for the other air contaminants, whereas nitrogen dioxide was predicted more accurately than the other pollutants.
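A hedged sketch of the general pipeline described above (random-forest-based feature selection feeding a multilayer perceptron) using scikit-learn. The predictors and the pollutant target are synthetic stand-ins, and the selection threshold and network size are illustrative choices rather than those of the study.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import SelectFromModel
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Synthetic stand-in for meteorological/temporal predictors of a pollutant (e.g. PM2.5)
X, y = make_regression(n_samples=2000, n_features=25, n_informative=8, noise=10,
                       random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Random-forest importances drive the feature selection feeding the neural network
selector = SelectFromModel(RandomForestRegressor(n_estimators=300, random_state=0))
model = make_pipeline(selector, StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(50,), max_iter=2000,
                                   random_state=0))
model.fit(X_tr, y_tr)
print("R^2 on held-out data:", round(r2_score(y_te, model.predict(X_te)), 3))
```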
ERIC Educational Resources Information Center
Oen, Urban T.; Sweany, H. Paul
To compare the effectiveness of individualized and lecture-discussion methods with a non-instruction (Control) method in developing turfgrass competencies in 11th and 12th grade students as measured by achievement in a battery of tests, teachers from 29 Michigan schools were randomly placed in three groups and attended workshops where they were…
Gagnon, Marie-Pierre; Gagnon, Johanne; Desmartis, Marie; Njoya, Merlin
2013-01-01
This study aimed to assess the effectiveness of a blended-teaching intervention using Internet-based tutorials coupled with traditional lectures in an introduction to research undergraduate nursing course. Effects of the intervention were compared with conventional, face-to-face classroom teaching on three outcomes: knowledge, satisfaction, and self-learning readiness. A two-group, randomized, controlled design was used, involving 112 participants. Descriptive statistics and analysis of covariance (ANCOVA) were performed. The teaching method was found to have no direct impact on knowledge acquisition, satisfaction, and self-learning readiness. However, motivation and teaching method had an interaction effect on knowledge acquisition by students. Among less motivated students, those in the intervention group performed better than those who received traditional training. These findings suggest that this blended-teaching method could better suit some students, depending on their degree of motivation and level of self-directed learning readiness.
Stochastic Seismic Response of an Algiers Site with Random Depth to Bedrock
NASA Astrophysics Data System (ADS)
Badaoui, M.; Berrah, M. K.; Mébarki, A.
2010-05-01
Among the important effects of the Boumerdes earthquake (Algeria, May 21st 2003) was that, within the same zone, destruction was more severe in some parts than in others. This phenomenon is due to site effects, which alter the characteristics of seismic motions and cause concentration of damage during earthquakes. Local site conditions such as the thickness and mechanical properties of soil layers have important effects on surface ground motions. This paper deals with the effect of randomness in the depth to bedrock (soil layer heights), which is assumed to be a random variable with a lognormal distribution. This distribution is suitable for strictly non-negative random variables with large coefficients of variation. Monte Carlo simulations are combined with the stiffness matrix method, used here as the deterministic solver, to evaluate the effect of depth-to-bedrock uncertainty on the seismic response of a multilayered soil. The study considers P and SV wave propagation using input accelerations recorded at Keddara station, located 20 km from the epicenter directly on bedrock. A parametric study is conducted to derive the stochastic behavior of the peak ground acceleration and its response spectrum, the transfer function and the amplification factors. It is found that soil height heterogeneity causes a widening of the frequency content and an increase in the fundamental frequency of the soil profile, indicating that the resonance phenomenon concerns a larger number of structures.
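The core idea of propagating a lognormal depth to bedrock through Monte Carlo simulation can be sketched with a deliberately simplified site-response proxy. The stiffness-matrix wave-propagation solver used in the study is replaced here by the textbook relation f0 = Vs/(4H) for a uniform layer over rigid bedrock, and all parameter values are hypothetical.

```python
import numpy as np

def fundamental_frequency(depth, vs=300.0):
    """Placeholder site-response quantity: fundamental frequency of a uniform layer."""
    return vs / (4.0 * depth)

mean_H, cov_H = 30.0, 0.4                      # mean depth (m) and coefficient of variation
sigma_ln = np.sqrt(np.log(1.0 + cov_H ** 2))   # lognormal parameters from mean and COV
mu_ln = np.log(mean_H) - 0.5 * sigma_ln ** 2

rng = np.random.default_rng(0)
H = rng.lognormal(mu_ln, sigma_ln, size=100_000)   # Monte Carlo samples of depth to bedrock
f0 = fundamental_frequency(H)
print(f"f0: mean = {f0.mean():.2f} Hz, 5-95% range = "
      f"[{np.percentile(f0, 5):.2f}, {np.percentile(f0, 95):.2f}] Hz")
```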
Introducing Statistical Inference to Biology Students through Bootstrapping and Randomization
ERIC Educational Resources Information Center
Lock, Robin H.; Lock, Patti Frazer
2008-01-01
Bootstrap methods and randomization tests are increasingly being used as alternatives to standard statistical procedures in biology. They also serve as an effective introduction to the key ideas of statistical inference in introductory courses for biology students. We discuss the use of such simulation based procedures in an integrated curriculum…
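A minimal sketch of the two simulation-based procedures mentioned above, a bootstrap confidence interval and a randomization (permutation) test for a difference in group means; the data are simulated and the group sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical measurements from a control and a treatment group in a lab exercise
control = rng.normal(10.0, 2.0, size=25)
treatment = rng.normal(11.5, 2.0, size=25)
observed = treatment.mean() - control.mean()

# Bootstrap: resample each group with replacement to get a CI for the difference
boot = np.array([rng.choice(treatment, size=25).mean()
                 - rng.choice(control, size=25).mean()
                 for _ in range(10_000)])
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])

# Randomization test: shuffle the pooled data to build the null distribution
pooled = np.concatenate([control, treatment])
null_diffs = []
for _ in range(10_000):
    shuffled = rng.permutation(pooled)
    null_diffs.append(shuffled[25:].mean() - shuffled[:25].mean())
p_value = np.mean(np.abs(null_diffs) >= abs(observed))

print(f"difference = {observed:.2f}, 95% bootstrap CI = ({ci_low:.2f}, {ci_high:.2f}), "
      f"randomization p = {p_value:.4f}")
```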
ERIC Educational Resources Information Center
Cream, Angela; O'Brian, Sue; Jones, Mark; Block, Susan; Harrison, Elisabeth; Lincoln, Michelle; Hewat, Sally; Packman, Ann; Menzies, Ross; Onslow, Mark
2010-01-01
Purpose: In this study, the authors investigated the efficacy of video self-modeling (VSM) following speech restructuring treatment to improve the maintenance of treatment effects. Method: The design was an open-plan, parallel-group, randomized controlled trial. Participants were 89 adults and adolescents who undertook intensive speech…
Tic Reduction with Risperidone Versus Pimozide in a Randomized, Double-Blind, Crossover Trial
ERIC Educational Resources Information Center
Gilbert, Donald L.; Batterson, J. Robert; Sethuraman, Gopalan; Sallee, Floyd R.
2004-01-01
Objective: To compare the tic suppression, electrocardiogram (ECG) changes, weight gain, and side effect profiles of pimozide versus risperidone in children and adolescents with tic disorders. Method: This was a randomized, double-blind, crossover (evaluable patient analysis) study. Nineteen children aged 7 to 17 years with Tourette's or chronic…
Individual mineral supplement intake by ewes swath grazing or confinement fed pea-barley forage
USDA-ARS?s Scientific Manuscript database
Sixty mature ewes (non-pregnant, non-lactating) were used in a completely randomized design to determine if feeding method of pea-barley forage (swath grazing or hay in confinement) had an effect on individual ewe mineral consumption. Thirty ewes were randomly allocated to 3 confinement pens and 30 ...
EEG Neurofeedback for ADHD: Double-Blind Sham-Controlled Randomized Pilot Feasibility Trial
ERIC Educational Resources Information Center
Arnold, L. Eugene; Lofthouse, Nicholas; Hersch, Sarah; Pan, Xueliang; Hurt, Elizabeth; Bates, Bethany; Kassouf, Kathleen; Moone, Stacey; Grantier, Cara
2013-01-01
Objective: Preparing for a definitive randomized clinical trial (RCT) of neurofeedback (NF) for ADHD, this pilot trial explored feasibility of a double-blind, sham-controlled design and adherence/palatability/relative effect of two versus three treatments/week. Method: Unmedicated 6- to 12-year-olds with "Diagnostic and Statistical Manual of…
ERIC Educational Resources Information Center
O'Neill, James M.; Clark, Jeffrey K.; Jones, James A.
2016-01-01
Background: In elementary grades, comprehensive health education curricula have demonstrated effectiveness in addressing singular health issues. The Michigan Model for Health (MMH) was implemented and evaluated to determine its impact on nutrition, physical fitness, and safety knowledge and skills. Methods: Schools (N = 52) were randomly assigned…
Unbiased Causal Inference from an Observational Study: Results of a Within-Study Comparison
ERIC Educational Resources Information Center
Pohl, Steffi; Steiner, Peter M.; Eisermann, Jens; Soellner, Renate; Cook, Thomas D.
2009-01-01
Adjustment methods such as propensity scores and analysis of covariance are often used for estimating treatment effects in nonexperimental data. Shadish, Clark, and Steiner used a within-study comparison to test how well these adjustments work in practice. They randomly assigned participating students to a randomized or nonrandomized experiment.…
ERIC Educational Resources Information Center
Braam, W.; Didden, R.; Smits, M.; Curfs, L.
2008-01-01
Background: While several small-number or open-label studies suggest that melatonin improves sleep in individuals with intellectual disabilities (ID) with chronic sleep disturbance, a larger randomized control trial is necessary to validate these promising results. Methods: The effectiveness of melatonin for the treatment of chronic sleep…
ERIC Educational Resources Information Center
Thurstone, Christian; Riggs, Paula D.; Salomonsen-Sautel, Stacy; Mikulich-Gilbertson, Susan K.
2010-01-01
Objective: To evaluate the effect of atomoxetine hydrochloride versus placebo on attention-deficit/hyperactivity disorder (ADHD) and substance use disorder (SUD) in adolescents receiving motivational interviewing/cognitive behavioral therapy (MI/CBT) for SUD. Method: This single-site, randomized, controlled trial was conducted between December…
Methods of Learning in Statistical Education: A Randomized Trial of Public Health Graduate Students
ERIC Educational Resources Information Center
Enders, Felicity Boyd; Diener-West, Marie
2006-01-01
A randomized trial of 265 consenting students was conducted within an introductory biostatistics course: 69 received eight small group cooperative learning sessions; 97 accessed internet learning sessions; 96 received no intervention. Effect on examination score (95% CI) was assessed by intent-to-treat analysis and by incorporating reported…
Randomized Clinical Trial: The Use of SpeechEasy® in Stuttering Treatment
ERIC Educational Resources Information Center
Ritto, Ana Paula; Juste, Fabiola Staróbole; Stuart, Andrew; Kalinowski, Joseph; de Andrade, Claudia Regina Furquim
2016-01-01
Background: Numerous studies have demonstrated the benefit of devices delivering altered auditory feedback (AAF) as a therapeutic alternative for those who stutter. Aims: The effectiveness of a device delivering AAF (SpeechEasy®) was compared with behavioural techniques in the treatment of stuttering in a randomized clinical trial. Methods &…
DOT National Transportation Integrated Search
2003-04-01
This study evaluates two methods for repairing slope surface failures of clayey soil embankments. One method involves reinforcing the cohesive soils with randomly oriented synthetic fibers; the other method incorporates non-woven geotextiles. The per...
Object-based change detection method using refined Markov random field
NASA Astrophysics Data System (ADS)
Peng, Daifeng; Zhang, Yongjun
2017-01-01
In order to fully consider the local spatial constraints between neighboring objects in object-based change detection (OBCD), an OBCD approach is presented that introduces a refined Markov random field (MRF). First, images from two periods are stacked and segmented to produce image objects. Second, object spectral and textural histogram features are extracted, and the G-statistic is used to measure the distance between different histogram distributions. Meanwhile, object heterogeneity is calculated by combining spectral and textural histogram distances using adaptive weights. Third, an expectation-maximization algorithm is applied to determine the change category of each object, and an initial change map is generated. Finally, a refined change map is produced by employing the proposed refined object-based MRF method. Three experiments were conducted and compared with state-of-the-art unsupervised OBCD methods to evaluate the effectiveness of the proposed method. Experimental results demonstrate that the proposed method obtains the highest accuracy among the methods considered, which confirms its validity and effectiveness for OBCD.
A Numerical Theory for Impedance Eduction in Three-Dimensional Normal Incidence Tubes
NASA Technical Reports Server (NTRS)
Watson, Willie R.; Jones, Michael G.
2016-01-01
A method for educing the locally reacting acoustic impedance of a test sample mounted in a 3-D normal incidence impedance tube is presented and validated. The unique feature of the method is that the excitation frequency (or duct geometry) may be such that high-order duct modes exist. The method educes the impedance iteratively by minimizing an objective function consisting of the difference between the measured and numerically computed acoustic pressure at preselected measurement points in the duct. The method is validated on planar and high-order mode sources with data synthesized from exact mode theory. These data are then subjected to random jitter to simulate the effects of measurement uncertainties on the educed impedance spectrum. The primary conclusions of the study are (1) without random jitter, the method is in excellent agreement with results for known impedance samples, and (2) random jitter comparable to that found in a typical experiment has minimal impact on the accuracy of the educed impedance.
Technical Note: Introduction of variance component analysis to setup error analysis in radiotherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matsuo, Yukinori, E-mail: ymatsuo@kuhp.kyoto-u.ac.
Purpose: The purpose of this technical note is to introduce variance component analysis for the estimation of systematic and random components of setup error in radiotherapy. Methods: Balanced data according to a one-factor random effects model were assumed. Results: Analysis-of-variance (ANOVA)-based computation was applied to estimate the values and confidence intervals (CIs) of the systematic and random errors and the population mean of setup errors. The conventional method overestimates the systematic error, especially in hypofractionated settings. The CI for the systematic error is much wider than that for the random error. The ANOVA-based estimation can be extended to a multifactor model considering multiple causes of setup errors (e.g., interpatient, interfraction, and intrafraction). Conclusions: Variance component analysis may lead to novel applications in setup error analysis in radiotherapy.
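A minimal sketch of the ANOVA-based variance component computation for balanced one-factor data, following the standard moment equations rather than any particular software used in the note; the setup errors below are simulated.

```python
import numpy as np

def variance_components(errors):
    """One-factor random-effects ANOVA estimates of the systematic (between-patient)
    and random (within-patient) setup-error components from balanced data.
    errors: array of shape (n_patients, n_fractions)."""
    errors = np.asarray(errors, dtype=float)
    m, n = errors.shape                                   # patients, fractions per patient
    patient_means = errors.mean(axis=1)
    grand_mean = errors.mean()
    msb = n * np.sum((patient_means - grand_mean) ** 2) / (m - 1)          # between MS
    msw = np.sum((errors - patient_means[:, None]) ** 2) / (m * (n - 1))   # within MS
    sigma_random2 = msw
    sigma_systematic2 = max(0.0, (msb - msw) / n)
    return grand_mean, np.sqrt(sigma_systematic2), np.sqrt(sigma_random2)

# Hypothetical setup errors (mm): 10 patients x 5 fractions
rng = np.random.default_rng(0)
true_sys = rng.normal(0.0, 2.0, size=(10, 1))             # per-patient systematic error
data = true_sys + rng.normal(0.0, 1.5, size=(10, 5))      # plus per-fraction random error
print(variance_components(data))
```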
Inverse random source scattering for the Helmholtz equation in inhomogeneous media
NASA Astrophysics Data System (ADS)
Li, Ming; Chen, Chuchu; Li, Peijun
2018-01-01
This paper is concerned with an inverse random source scattering problem in an inhomogeneous background medium. The wave propagation is modeled by the stochastic Helmholtz equation with the source driven by additive white noise. The goal is to reconstruct the statistical properties of the random source such as the mean and variance from the boundary measurement of the radiated random wave field at multiple frequencies. Both the direct and inverse problems are considered. We show that the direct problem has a unique mild solution by a constructive proof. For the inverse problem, we derive Fredholm integral equations, which connect the boundary measurement of the radiated wave field with the unknown source function. A regularized block Kaczmarz method is developed to solve the ill-posed integral equations. Numerical experiments are included to demonstrate the effectiveness of the proposed method.
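For intuition about the reconstruction step, a plain relaxed Kaczmarz sweep for a discretized linear system is sketched below; the paper itself develops a regularized block variant for the Fredholm integral equations, so treat this as a schematic of the row-action idea only, with all names and parameters hypothetical.

```python
import numpy as np

def kaczmarz(A, b, sweeps=50, relax=0.5):
    """Relaxed Kaczmarz iterations for A x = b (complex entries allowed).
    Each row is used in turn to project the iterate onto its hyperplane."""
    m, n = A.shape
    x = np.zeros(n, dtype=complex)
    for _ in range(sweeps):
        for i in range(m):
            ai = A[i]
            x = x + relax * (b[i] - ai @ x) / np.real(ai @ ai.conj()) * ai.conj()
    return x

# Toy consistent system to exercise the routine
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 5)) + 1j * rng.normal(size=(20, 5))
x_true = rng.normal(size=5) + 1j * rng.normal(size=5)
print(np.max(np.abs(kaczmarz(A, A @ x_true, sweeps=200, relax=1.0) - x_true)))
```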
Ewing, Alexander C; Kottke, Melissa J; Kraft, Joan Marie; Sales, Jessica M; Brown, Jennifer L; Goedken, Peggy; Wiener, Jeffrey; Kourtis, Athena P
2017-03-01
African American adolescent females are at elevated risk for unintended pregnancy and sexually transmitted infections (STIs). Dual protection (DP) is defined as concurrent prevention of pregnancy and STIs. This can be achieved by abstinence, consistent condom use, or the dual methods of condoms plus an effective non-barrier contraceptive. Previous clinic-based interventions showed short-term effects on increasing dual method use, but evidence of sustained effects on dual method use and decreased incident pregnancies and STIs is lacking. This manuscript describes the 2GETHER Project. 2GETHER is a randomized controlled trial of a multi-component intervention to increase dual protection use among sexually active African American females aged 14-19 years not desiring pregnancy at a Title X clinic in Atlanta, GA. The intervention is clinic-based and includes a culturally tailored interactive multimedia component and counseling sessions, both to assist in selection of a DP method and to reinforce use of the DP method. The participants are randomized to the study intervention or the standard of care, and followed for 12 months to evaluate how the intervention influences DP method selection and adherence, pregnancy and STI incidence, and participants' DP knowledge, intentions, and self-efficacy. The 2GETHER Project is a novel trial to reduce unintended pregnancies and STIs among African American adolescents. The intervention is unique in the comprehensive and complementary nature of its components and its individual tailoring of provider-patient interaction. If the trial interventions are shown to be effective, then it will be reasonable to assess their scalability and applicability in other populations. Published by Elsevier Inc.
A functional renormalization method for wave propagation in random media
NASA Astrophysics Data System (ADS)
Lamagna, Federico; Calzetta, Esteban
2017-08-01
We develop the exact renormalization group approach as a way to evaluate the effective speed of the propagation of a scalar wave in a medium with random inhomogeneities. We use the Martin-Siggia-Rose formalism to translate the problem into a nonequilibrium field theory problem, and then consider a sequence of models with a progressively lower infrared cutoff; in the limit where the cutoff is removed we recover the problem of interest. As a test of the formalism, we compute the effective dielectric constant of a homogeneous medium interspersed with randomly located, interpenetrating bubbles. A simple approximation to the renormalization group equations turns out to be equivalent to a self-consistent two-loop evaluation of the effective dielectric constant.
Inference of median difference based on the Box-Cox model in randomized clinical trials.
Maruo, K; Isogawa, N; Gosho, M
2015-05-10
In randomized clinical trials, many medical and biological measurements are not normally distributed and are often skewed. The Box-Cox transformation is a powerful procedure for comparing two treatment groups for skewed continuous variables in terms of a statistical test. However, it is difficult to directly estimate and interpret the location difference between the two groups on the original scale of the measurement. We propose a helpful method that infers the difference of the treatment effect on the original scale in a more easily interpretable form. We also provide statistical analysis packages that consistently include an estimate of the treatment effect, covariance adjustments, standard errors, and statistical hypothesis tests. The simulation study that focuses on randomized parallel group clinical trials with two treatment groups indicates that the performance of the proposed method is equivalent to or better than that of the existing non-parametric approaches in terms of the type-I error rate and power. We illustrate our method with cluster of differentiation 4 data in an acquired immune deficiency syndrome clinical trial. Copyright © 2015 John Wiley & Sons, Ltd.
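One way to read the underlying idea: fit a common Box-Cox transformation, compare the groups on the transformed (approximately normal) scale, and map the group means back to the original scale, where they correspond to medians. The snippet below is a bare-bones illustration of that logic under the Box-Cox model, not the paper's full procedure (no covariate adjustment, standard errors, or hypothesis tests), and it requires strictly positive data.

```python
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

def median_difference_boxcox(y_treat, y_ctrl):
    """Sketch: estimate a common Box-Cox lambda from the pooled (positive) data,
    average on the transformed scale, and back-transform to group medians
    on the original scale."""
    pooled = np.concatenate([y_treat, y_ctrl])
    _, lam = stats.boxcox(pooled)                 # ML estimate of the transformation parameter
    zt = stats.boxcox(y_treat, lmbda=lam)
    zc = stats.boxcox(y_ctrl, lmbda=lam)
    med_t = inv_boxcox(np.mean(zt), lam)          # transformed-scale mean -> original-scale median
    med_c = inv_boxcox(np.mean(zc), lam)
    return med_t - med_c, lam

# Toy skewed data (not from the trial)
rng = np.random.default_rng(1)
print(median_difference_boxcox(rng.lognormal(1.2, 0.6, 80), rng.lognormal(1.0, 0.6, 80)))
```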
Molas, Marek; Lesaffre, Emmanuel
2008-12-30
Discrete bounded outcome scores (BOS), i.e. discrete measurements that are restricted on a finite interval, often occur in practice. Examples are compliance measures, quality of life measures, etc. In this paper we examine three related random effects approaches to analyze longitudinal studies with a BOS as response: (1) a linear mixed effects (LM) model applied to a logistic transformed modified BOS; (2) a model assuming that the discrete BOS is a coarsened version of a latent random variable, which after a logistic-normal transformation, satisfies an LM model; and (3) a random effects probit model. We consider also the extension whereby the variability of the BOS is allowed to depend on covariates. The methods are contrasted using a simulation study and on a longitudinal project, which documents stroke rehabilitation in four European countries using measures of motor and functional recovery. Copyright 2008 John Wiley & Sons, Ltd.
Mukaka, Mavuto; White, Sarah A; Terlouw, Dianne J; Mwapasa, Victor; Kalilani-Phiri, Linda; Faragher, E Brian
2016-07-22
Missing outcomes can seriously impair the ability to make correct inferences from randomized controlled trials (RCTs). Complete case (CC) analysis is commonly used, but it reduces sample size and is perceived to lead to reduced statistical efficiency of estimates while increasing the potential for bias. As multiple imputation (MI) methods preserve sample size, they are generally viewed as the preferred analytical approach. We examined this assumption, comparing the performance of CC and MI methods to determine risk difference (RD) estimates in the presence of missing binary outcomes. We conducted simulation studies of 5000 simulated data sets with 50 imputations of RCTs with one primary follow-up endpoint at different underlying levels of RD (3-25 %) and missing outcomes (5-30 %). For missing at random (MAR) or missing completely at random (MCAR) outcomes, CC method estimates generally remained unbiased and achieved precision similar to or better than MI methods, and high statistical coverage. Missing not at random (MNAR) scenarios yielded invalid inferences with both methods. Effect size estimate bias was reduced in MI methods by always including group membership even if this was unrelated to missingness. Surprisingly, under MAR and MCAR conditions in the assessed scenarios, MI offered no statistical advantage over CC methods. While MI must inherently accompany CC methods for intention-to-treat analyses, these findings endorse CC methods for per protocol risk difference analyses in these conditions. These findings provide an argument for the use of the CC approach to always complement MI analyses, with the usual caveat that the validity of the mechanism for missingness be thoroughly discussed. More importantly, researchers should strive to collect as much data as possible.
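As a concrete illustration of the complete-case estimator discussed above, the toy simulation below drops participants with missing outcomes and compares event proportions between arms under an MCAR mechanism; the sample size, event rates, and missingness rate are arbitrary and not from the study.

```python
import numpy as np

def complete_case_risk_difference(outcome, arm, observed):
    """Complete-case analysis: discard participants whose outcome is missing,
    then take the difference in event proportions between arms."""
    keep = observed.astype(bool)
    y, a = outcome[keep], arm[keep]
    return y[a == 1].mean() - y[a == 0].mean()

# MCAR toy simulation with a true risk difference of 0.10 and ~20% missingness
rng = np.random.default_rng(2)
n = 2000
arm = rng.integers(0, 2, n)
outcome = rng.binomial(1, 0.20 + 0.10 * arm)
observed = rng.binomial(1, 0.80, n)
print(complete_case_risk_difference(outcome, arm, observed))
```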
Stochastic stability of parametrically excited random systems
NASA Astrophysics Data System (ADS)
Labou, M.
2004-01-01
Multidegree-of-freedom dynamic systems subjected to parametric excitation are analyzed for stochastic stability. The variation of excitation intensity with time is described by the sum of a harmonic function and a stationary random process. The stability boundaries are determined by the stochastic averaging method. The effect of random parametric excitation on the stability of trivial solutions of systems of differential equations for the moments of phase variables is studied. It is assumed that the frequency of the harmonic component falls within the region of combination resonances. Stability conditions for the first and second moments are obtained. It turns out that additional parametric excitation may have a stabilizing or destabilizing effect, depending on the values of certain parameters of random excitation. As an example, the stability of a beam in plane bending is analyzed.
Sylvia, Louisa G.; Reilly-Harrington, Noreen A.; Leon, Andrew C.; Kansky, Christine I.; Ketter, Terence A.; Calabrese, Joseph R.; Thase, Michael E.; Bowden, Charles L.; Friedman, Edward S.; Ostacher, Michael J.; Iosifescu, Dan V.; Severe, Joanne; Nierenberg, Andrew A.
2013-01-01
Background High attrition rates, which occur frequently in longitudinal clinical trials of interventions for bipolar disorder, limit the interpretation of results. Purpose The aim of this article is to present design approaches that limited attrition in the Lithium Use for Bipolar Disorder (LiTMUS) Study. Methods LiTMUS was a 6-month randomized, longitudinal multi-site comparative effectiveness trial that examined bipolar participants who were at least mildly ill. Participants were randomized to either low to moderate doses of lithium or no lithium, in addition to other treatments needed for mood stabilization administered in a guideline-informed, empirically supported, and personalized fashion (N=283). Results Components of the study design that may have contributed to the low attrition rate of the study included use of: (1) an intent-to-treat design; (2) a randomized adjunctive single-blind design; (3) participant reimbursement; (4) intent-to-attend the next study visit (including a discussion of attendance obstacles when intention is low); (5) quality care with limited participant burden; and (6) target windows for study visits. Limitations Site differences and the effectiveness and tolerability data have not been analyzed yet. Conclusions These components of the LiTMUS study design may have reduced the probability of attrition, which would inform the design of future randomized clinical effectiveness trials. PMID:22076437
Effect of Oral Carbohydrate Intake on Labor Progress: Randomized Controlled Trial
Rahmani, R; Khakbazan, Z; Yavari, P; Granmayeh, M; Yavari, L
2012-01-01
Background Lack of information regarding biochemical changes in women during labor and their outcomes on maternal and neonatal health remains an unanswered question. This study aims to explore the effectiveness of oral carbohydrate intake during labor on the duration of the active phase and other maternal and neonatal outcomes. Methods: A parallel prospective randomized controlled trial was conducted at the University Affiliated Teaching Hospital in Gonabad. In total, 190 women were randomly assigned to an intervention (N=87) or control (N=90) group. Inclusion criteria were low-risk women with a singleton cephalic presentation and cervical dilatation of 3–4 cm. Randomization was performed with a daily random number generator; odd numbers were assigned to the intervention group and even numbers to the control group. The intervention was based on the woman's preference among: 3 medium dates plus 110 ml of water; 3 dates plus 110 ml of light tea without sugar; or 110 ml of orange juice. The intervention was administered only once, but women ate and drank gradually before the second stage of labor. The control group fasted, as per routine practice. Neither participants nor caregivers or staff could be blinded to group allocation. Differences in the duration of the active phase of labor were assessed as the primary outcome measure. Results: There was a significant difference in the length of the second stage of labor (P < .05). The effect size for this variable was 0.48. There were no significant differences in other maternal and neonatal outcomes. Conclusions: Oral intake of carbohydrate was an effective method for shortening the duration of the second stage of labor in low-risk women. PMID:23304677
Valbuza, Juliana Spelta; de Oliveira, Márcio Moysés; Conti, Cristiane Fiquene; Prado, Lucila Bizari F; de Carvalho, Luciane Bizari Coin; do Prado, Gilmar Fernandes
2010-12-01
Treatment of obstructive sleep apnea (OSA) using methods for increasing upper airway muscle tonus has been controversial and poorly reported. Thus, a review of the evidence is needed to evaluate the effectiveness of these methods. The design used was a systematic review of randomized controlled trials. Data sources are from the Cochrane Library, Medline, Embase and Scielo, registries of ongoing trials, theses indexed at Biblioteca Regional de Medicina/Pan-American Health Organization of the World Health Organization and the reference lists of all the trials retrieved. This was a review of randomized or quasi-randomized double-blind trials on OSA. Two reviewers independently applied eligibility criteria. One reviewer assessed study quality and extracted data, and these processes were checked by a second reviewer. The primary outcome was a decrease in the apnea/hypopnea index (AHI) of below five episodes per hour. Other outcomes were subjective sleep quality, sleep quality measured by night polysomnography, quality of life measured subjectively and adverse events associated with the treatments. Three eligible trials were included. Two studies showed improvements through the objective and subjective analyses, and one study showed improvement of snoring, but not of AHI while the subjective analyses showed no improvement. The adverse events were reported and they were not significant. There is no accepted scientific evidence that methods aiming to increase muscle tonus of the stomatognathic system are effective in reducing AHI to below five events per hour. Well-designed randomized controlled trials are needed to assess the efficacy of such methods.
Disease Mapping of Zero-excessive Mesothelioma Data in Flanders
Neyens, Thomas; Lawson, Andrew B.; Kirby, Russell S.; Nuyts, Valerie; Watjou, Kevin; Aregay, Mehreteab; Carroll, Rachel; Nawrot, Tim S.; Faes, Christel
2016-01-01
Purpose To investigate the distribution of mesothelioma in Flanders using Bayesian disease mapping models that account for both an excess of zeros and overdispersion. Methods The numbers of newly diagnosed mesothelioma cases within all Flemish municipalities between 1999 and 2008 were obtained from the Belgian Cancer Registry. To deal with overdispersion, zero-inflation and geographical association, the hurdle combined model was proposed, which has three components: a Bernoulli zero-inflation mixture component to account for excess zeros, a gamma random effect to adjust for overdispersion and a normal conditional autoregressive random effect to attribute spatial association. This model was compared with other existing methods in literature. Results The results indicate that hurdle models with a random effects term accounting for extra-variance in the Bernoulli zero-inflation component fit the data better than hurdle models that do not take overdispersion in the occurrence of zeros into account. Furthermore, traditional models that do not take into account excessive zeros but contain at least one random effects term that models extra-variance in the counts have better fits compared to their hurdle counterparts. In other words, the extra-variability, due to an excess of zeros, can be accommodated by spatially structured and/or unstructured random effects in a Poisson model such that the hurdle mixture model is not necessary. Conclusions Models taking into account zero-inflation do not always provide better fits to data with excessive zeros than less complex models. In this study, a simple conditional autoregressive model identified a cluster in mesothelioma cases near a former asbestos processing plant (Kapelle-op-den-Bos). This observation is likely linked with historical local asbestos exposures. Future research will clarify this. PMID:27908590
Shahgholian, Nahid; Jazi, Shahrzad Khojandi; Karimian, Jahangir; Valiani, Mahboubeh
2016-01-01
Restless leg syndrome prevalence is high among patients undergoing hemodialysis. Because of the several side effects of pharmacological treatments, patients prefer non-pharmacological methods. Therefore, the present study aimed to investigate the effects of two methods, reflexology and stretching exercises, on the severity of restless leg syndrome among patients undergoing hemodialysis. This randomized clinical trial was done on 90 qualified patients undergoing hemodialysis in selected hospitals of Isfahan who were diagnosed with restless leg syndrome through the standard restless leg syndrome questionnaire. They were randomly allocated, using a random number table, to three groups: reflexology, stretching exercises, and control. Foot reflexology and stretching exercises were conducted three times a week for 30-40 min over 4 consecutive weeks. Data analysis was performed by SPSS version 18 using descriptive and inferential statistical analyses [one-way analysis of variance (ANOVA), paired t-test, and least significant difference (LSD) post hoc test]. There was a significant difference in the mean scores of restless leg syndrome severity between each of the reflexology and stretching exercises groups and the control group (P < 0.001), but no significant difference between the two intervention groups. Changes in the mean score of restless leg syndrome severity were significantly greater in the reflexology and stretching exercises groups compared with the control group (P < 0.001), but showed no significant difference between the reflexology massage and stretching exercises groups. These results show that reflexology and stretching exercises can reduce the severity of restless leg syndrome. These two methods of treatment are recommended to patients.
Sample Size Calculations for Micro-randomized Trials in mHealth
Liao, Peng; Klasnja, Predrag; Tewari, Ambuj; Murphy, Susan A.
2015-01-01
The use and development of mobile interventions are experiencing rapid growth. In “just-in-time” mobile interventions, treatments are provided via a mobile device and they are intended to help an individual make healthy decisions “in the moment,” and thus have a proximal, near future impact. Currently the development of mobile interventions is proceeding at a much faster pace than that of associated data science methods. A first step toward developing data-based methods is to provide an experimental design for testing the proximal effects of these just-in-time treatments. In this paper, we propose a “micro-randomized” trial design for this purpose. In a micro-randomized trial, treatments are sequentially randomized throughout the conduct of the study, with the result that each participant may be randomized on hundreds or thousands of occasions at which a treatment might be provided. Further, we develop a test statistic for assessing the proximal effect of a treatment as well as an associated sample size calculator. We conduct simulation evaluations of the sample size calculator in various settings. Rules of thumb that might be used in designing a micro-randomized trial are discussed. This work is motivated by our collaboration on the HeartSteps mobile application designed to increase physical activity. PMID:26707831
Complementary and Alternative Approaches to Pain Relief During Labor
Theau-Yonneau, Anne
2007-01-01
This review evaluated the effect of complementary and alternative medicine on labor pain using conventional scientific methods; electronic databases through 2006 were searched. Only randomized controlled trials with outcome measures for labor pain were kept for the conclusions. Many studies did not meet the scientific inclusion criteria. According to the randomized controlled trials, we conclude the following regarding the decrease of labor pain and/or the reduction of the need for conventional analgesic methods: (i) efficacy was found for acupressure and sterile water blocks; (ii) most results favored some efficacy for acupuncture and hydrotherapy; (iii) studies of other complementary or alternative therapies for labor pain control have not shown effectiveness. PMID:18227907
Leung, Joseph; Mann, Surinder; Siao-Salera, Rodelei; Ransibrahmanakul, Kanat; Lim, Brian; Canete, Wilhelmina; Samson, Laramie; Gutierrez, Rebeck; Leung, Felix W
2011-01-01
Sedation for colonoscopy discomfort imposes a recovery-time burden on patients. The water method permitted 52% of patients accepting on-demand sedation to complete colonoscopy without sedation. On-site and at-home recovery times were not reported. To confirm the beneficial effect of the water method and document the patient recovery-time burden. Randomized, controlled trial, with single-blinded, intent-to-treat analysis. Veterans Affairs outpatient endoscopy unit. This study involved veterans accepting on-demand sedation for screening and surveillance colonoscopy. Air versus water method for colonoscope insertion. Proportion of patients completing colonoscopy without sedation, cecal intubation rate, medication requirement, maximum discomfort (0 = none, 10 = severe), procedure-related and patient-related outcomes. One hundred veterans were randomized to the air (n = 50) or water (n = 50) method. The proportions of patients who could complete colonoscopy without sedation in the water group (78%) and the air group (54%) were significantly different (P = .011, Fisher exact test), but the cecal intubation rate was similar (100% in both groups). Secondary analysis (data as Mean [SD]) shows that the water method produced a reduction in medication requirement: fentanyl, 12.5 (26.8) μg versus 24.0 (30.7) μg; midazolam, 0.5 (1.1) mg versus 0.94 (1.20) mg; maximum discomfort, 2.3 (1.7) versus 4.9 (2.0); recovery time on site, 8.4 (6.8) versus 12.3 (9.4) minutes; and recovery time at home, 4.5 (9.2) versus 10.9 (14.0) hours (P = .049; P = .06; P = .0012; P = .0199; and P = .0048, respectively, t test). Single Veterans Affairs site, predominantly male population, unblinded examiners. This randomized, controlled trial confirms the reported beneficial effects of the water method. The combination of the water method with on-demand sedation minimizes the patient recovery-time burden. ( NCT00920751.). Copyright © 2011 American Society for Gastrointestinal Endoscopy. Published by Mosby, Inc. All rights reserved.
ERIC Educational Resources Information Center
Schuppert, H. Marieke; Timmerman, Marieke E.; Bloo, Josephine; van Gemert, Tonny G.; Wiersema, Herman M.; Minderaa, Ruud B.; Emmelkamp, Paul M. G.; Nauta, Maaike H.
2012-01-01
Objective: To evaluate the effectiveness of Emotion Regulation Training (ERT), a 17-session weekly group training for adolescents with borderline personality disorder (BPD) symptoms. Method: One hundred nine adolescents with borderline traits (73% meeting the full criteria for BPD) were randomized to treatment as usual only (TAU) or ERT + TAU.…
Mentalization-Based Treatment for Self-Harm in Adolescents: A Randomized Controlled Trial
ERIC Educational Resources Information Center
Rossouw, Trudie I.; Fonagy, Peter
2012-01-01
Objective: We examined whether mentalization-based treatment for adolescents (MBT-A) is more effective than treatment as usual (TAU) for adolescents who self-harm. Method: A total of 80 adolescents (85% female) consecutively presenting to mental health services with self-harm and comorbid depression were randomly allocated to either MBT-A or TAU.…
ERIC Educational Resources Information Center
Greeson, Jeffrey M.; Juberg, Michael K.; Maytan, Margaret; James, Kiera; Rogers, Holly
2014-01-01
Objective: To evaluate the effectiveness of Koru, a mindfulness training program for college students and other emerging adults. Participants: Ninety students (66% female, 62% white, 71% graduate students) participated between Fall 2012 and Spring 2013. Methods: Randomized controlled trial. It was hypothesized that Koru, compared with a wait-list…
A Randomized Placebo-Controlled Trial of a School-Based Depression Prevention Program.
ERIC Educational Resources Information Center
Merry, Sally; McDowell, Heather; Wild, Chris J.; Bir, Julliet; Cunliffe, Rachel
2004-01-01
Objective: To conduct a placebo-controlled study of the effectiveness of a universal school-based depression prevention program. Method: Three hundred ninety-two students age 13 to 15 from two schools were randomized to intervention (RAP-Kiwi) and placebo programs run by teachers. RAP-Kiwi was an 11-session manual-based program derived from…
ERIC Educational Resources Information Center
Dowling, S.; Hubert, J.; White, S.; Hollins, S.
2006-01-01
Background: Bereaved adults with intellectual disabilities are known to experience prolonged and atypical grief which is often unrecognized. The aim of this project was to find an effective way to improve mental health and behavioural outcomes. Methods: Subjects were randomized to two different therapeutic interventions: traditional counselling by…
ERIC Educational Resources Information Center
Henderson, Craig E.; Dakof, Gayle A.; Greenbaum, Paul E.; Liddle, Howard A.
2010-01-01
Objective: We used growth mixture modeling to examine heterogeneity in treatment response in a secondary analysis of 2 randomized controlled trials testing multidimensional family therapy (MDFT), an established evidence-based therapy for adolescent drug abuse and delinquency. Method: The first study compared 2 evidence-based adolescent substance…
ERIC Educational Resources Information Center
Abikoff, Howard; Gallagher, Richard; Wells, Karen C.; Murray, Desiree W.; Huang, Lei; Lu, Feihan; Petkova, Eva
2013-01-01
Objective: The study compared the efficacy of 2 behavioral interventions to ameliorate organization, time management, and planning (OTMP) difficulties in 3rd- to 5th-grade children with attention-deficit/hyperactivity disorder (ADHD). Method: In a dual-site randomized controlled trial, 158 children were assigned to organizational skills training…
Vestibular Stimulation for ADHD: Randomized Controlled Trial of Comprehensive Motion Apparatus
ERIC Educational Resources Information Center
Clark, David L.; Arnold, L. Eugene; Crowl, Lindsay; Bozzolo, Hernan; Peruggia, Mario; Ramadan, Yaser; Bornstein, Robert; Hollway, Jill A.; Thompson, Susan; Malone, Krista; Hall, Kristy L.; Shelton, Sara B.; Bozzolo, Dawn R.; Cook, Amy
2008-01-01
Objective: This research evaluates effects of vestibular stimulation by Comprehensive Motion Apparatus (CMA) in ADHD. Method: Children ages 6 to 12 (48 boys, 5 girls) with ADHD were randomized to thrice-weekly 30-min treatments for 12 weeks with CMA, stimulating otoliths and semicircular canals, or a single-blind control of equal duration and…
ERIC Educational Resources Information Center
Diamond, Guy S.; Wintersteen, Matthew B.; Brown, Gregory K.; Diamond, Gary M.; Gallop, Robert; Shelef, Karni; Levy, Suzanne
2010-01-01
Objective: To evaluate whether Attachment-Based Family Therapy (ABFT) is more effective than Enhanced Usual Care (EUC) for reducing suicidal ideation and depressive symptoms in adolescents. Method: This was a randomized controlled trial of suicidal adolescents between the ages of 12 and 17, identified in primary care and emergency departments. Of…
ERIC Educational Resources Information Center
Yli-Piipari, Sami; Layne, Todd; Hinson, Janet; Irwin, Carol
2018-01-01
Purpose: Grounded in the trans-contextual model of motivation framework, this cluster-randomized trial examined the effectiveness of an autonomy supportive physical education (PE) instruction on student motivation and physical activity (PA). Method: The study comprised six middle schools and 408 students (M[subscript age] = 12.29), with primary…
ERIC Educational Resources Information Center
Yoder, Paul; Stone, Wendy L.
2006-01-01
Purpose: This randomized group experiment compared the efficacy of 2 communication interventions (Responsive Education and Prelinguistic Milieu Teaching [RPMT] and the Picture Exchange Communication System [PECS]) on spoken communication in 36 preschoolers with autism spectrum disorders (ASD). Method: Each treatment was delivered to children for a…
Reducing Sexual Risk Behaviors for HIV/STDs in Women with Alcohol Use Disorders
ERIC Educational Resources Information Center
Langhorst, Diane M.; Choi, Y. Joon; Keyser-Marcus, Lori; Svikis, Dace S.
2012-01-01
Objective: A pilot randomized clinical trial (RCT) examined effectiveness of HIV/STD Safer Sex Skills Building + Alcohol (SSB+A) intervention for women with Alcohol Use Disorders (AUDs) in a residential treatment setting. Method: After randomizing thirty-six women with AUDs and reporting having intercourse with a male partner in the past 180 days…
Rivera, Margarita; Locke, Adam E.; Corre, Tanguy; Czamara, Darina; Wolf, Christiane; Ching-Lopez, Ana; Milaneschi, Yuri; Kloiber, Stefan; Cohen-Woods, Sara; Rucker, James; Aitchison, Katherine J.; Bergmann, Sven; Boomsma, Dorret I.; Craddock, Nick; Gill, Michael; Holsboer, Florian; Hottenga, Jouke-Jan; Korszun, Ania; Kutalik, Zoltan; Lucae, Susanne; Maier, Wolfgang; Mors, Ole; Müller-Myhsok, Bertram; Owen, Michael J.; Penninx, Brenda W. J. H.; Preisig, Martin; Rice, John; Rietschel, Marcella; Tozzi, Federica; Uher, Rudolf; Vollenweider, Peter; Waeber, Gerard; Willemsen, Gonneke; Craig, Ian W.; Farmer, Anne E.; Lewis, Cathryn M.; Breen, Gerome; McGuffin, Peter
2017-01-01
Background Depression and obesity are highly prevalent, have major impacts on public health, and frequently co-occur. Recently, we reported that having depression moderates the effect of the FTO gene, suggesting its implication in the association between depression and obesity. Aims To confirm these findings by investigating the FTO polymorphism rs9939609 in new cohorts, and subsequently in a meta-analysis. Method The sample consists of 6902 individuals with depression and 6799 controls from three replication cohorts and two original discovery cohorts. Linear regression models were performed to test for association between rs9939609 and body mass index (BMI), and for the interaction between rs9939609 and depression status for an effect on BMI. Fixed and random effects meta-analyses were performed using METASOFT. Results In the replication cohorts, we observed a significant interaction between FTO, BMI and depression with fixed effects meta-analysis (β = 0.12, P = 2.7 × 10^-4) and with the Han/Eskin random effects method (P = 1.4 × 10^-7) but not with traditional random effects (β = 0.1, P = 0.35). When combined with the discovery cohorts, random effects meta-analysis also supports the interaction (β = 0.12, P = 0.027), which is highly significant under the Han/Eskin model (P = 6.9 × 10^-8). On average, carriers of the risk allele who have depression have a 2.2% higher BMI for each risk allele, over and above the main effect of FTO. Conclusions This meta-analysis provides additional support for a significant interaction between FTO, depression and BMI, indicating that depression increases the effect of FTO on BMI. The findings provide a useful starting point in understanding the biological mechanism involved in the association between obesity and depression. PMID:28642257
Meta-analysis of Odds Ratios: Current Good Practices
Chang, Bei-Hung; Hoaglin, David C.
2016-01-01
Background Many systematic reviews of randomized clinical trials lead to meta-analyses of odds ratios. The customary methods of estimating an overall odds ratio involve weighted averages of the individual trials’ estimates of the logarithm of the odds ratio. That approach, however, has several shortcomings, arising from assumptions and approximations, that render the results unreliable. Although the problems have been documented in the literature for many years, the conventional methods persist in software and applications. A well-developed alternative approach avoids the approximations by working directly with the numbers of subjects and events in the arms of the individual trials. Objective We aim to raise awareness of methods that avoid the conventional approximations, can be applied with widely available software, and produce more-reliable results. Methods We summarize the fixed-effect and random-effects approaches to meta-analysis; describe conventional, approximate methods and alternative methods; apply the methods in a meta-analysis of 19 randomized trials of endoscopic sclerotherapy in patients with cirrhosis and esophagogastric varices; and compare the results. We demonstrate the use of SAS, Stata, and R software for the analysis. Results In the example, point estimates and confidence intervals for the overall log-odds-ratio differ between the conventional and alternative methods, in ways that can affect inferences. Programming is straightforward in the three software packages; an appendix gives the details. Conclusions The modest additional programming required should not be an obstacle to adoption of the alternative methods. Because their results are unreliable, use of the conventional methods for meta-analysis of odds ratios should be discontinued. PMID:28169977
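For reference, the conventional approach the authors critique pools per-trial log odds ratios by inverse-variance weighting (the Woolf approximation). A minimal fixed-effect version is sketched below with hypothetical count arrays; it omits continuity corrections for zero cells and the alternative methods the paper recommends.

```python
import numpy as np

def fixed_effect_log_or(events_t, n_t, events_c, n_c):
    """Conventional inverse-variance (Woolf) fixed-effect pooling of log odds ratios.
    Inputs are per-trial event counts and arm sizes; zero cells are not handled."""
    a = np.asarray(events_t, float); b = np.asarray(n_t, float) - a
    c = np.asarray(events_c, float); d = np.asarray(n_c, float) - c
    log_or = np.log((a * d) / (b * c))        # per-trial log odds ratio
    w = 1.0 / (1/a + 1/b + 1/c + 1/d)         # inverse of the approximate variance
    pooled = np.sum(w * log_or) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return np.exp(pooled), (np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se))

# Three hypothetical trials (not the sclerotherapy data)
print(fixed_effect_log_or([12, 30, 9], [100, 150, 60], [20, 41, 15], [100, 150, 60]))
```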
Campos, Nicole G.; Castle, Philip E.; Schiffman, Mark; Kim, Jane J.
2013-01-01
Background Although the randomized controlled trial (RCT) is widely considered the most reliable method for evaluation of health care interventions, challenges to both internal and external validity exist. Thus, the efficacy of an intervention in a trial setting does not necessarily represent the real-world performance that decision makers seek to inform comparative effectiveness studies and economic evaluations. Methods Using data from the ASCUS-LSIL Triage Study (ALTS), we performed a simplified economic evaluation of age-based management strategies to detect cervical intraepithelial neoplasia grade 3 (CIN3) among women who were referred to the study with low-grade squamous intraepithelial lesions (LSIL). We used data from the trial itself to adjust for 1) potential lead time bias and random error that led to variation in the observed prevalence of CIN3 by study arm, and 2) potential ascertainment bias among providers in the most aggressive management arm. Results We found that using unadjusted RCT data may result in counterintuitive cost-effectiveness results when random error and/or bias are present. Following adjustment, the rank order of management strategies changed for two of the three age groups we considered. Conclusion Decision analysts need to examine study design, available trial data and cost-effectiveness results closely in order to detect evidence of potential bias. Adjustment for random error and bias in RCTs may yield different policy conclusions relative to unadjusted trial data. PMID:22147881
Game-based learning as a vehicle to teach first aid content: a randomized experiment.
Charlier, Nathalie; De Fraine, Bieke
2013-07-01
Knowledge of first aid (FA), which constitutes lifesaving treatments for injuries or illnesses, is important for every individual. In this study, we have set up a group-randomized controlled trial to assess the effectiveness of a board game for learning FA. Four class groups (120 students) were randomly assigned to 2 conditions, a board game or a traditional lecture method (control condition). The effect of the learning environment on students' achievement was examined through a paper-and-pencil test of FA knowledge. Two months after the intervention, the participants took a retention test and completed a questionnaire assessing enjoyment, interest, and motivation. An analysis of pre- and post-test knowledge scores showed that both conditions produced significant increases in knowledge. The lecture was significantly more effective in increasing knowledge, as compared to the board game. Participants indicated that they liked the game condition more than their fellow participants in the traditional lecture condition. These results suggest that traditional lectures are more effective in increasing student knowledge, whereas educational games are more effective for student enjoyment. From this case study we recommend alteration or a combination of these teaching methods to make learning both effective and enjoyable. © 2013, American School Health Association.
Latent class instrumental variables: a clinical and biostatistical perspective.
Baker, Stuart G; Kramer, Barnett S; Lindeman, Karen S
2016-01-15
In some two-arm randomized trials, some participants receive the treatment assigned to the other arm as a result of technical problems, refusal of a treatment invitation, or a choice of treatment in an encouragement design. In some before-and-after studies, the availability of a new treatment changes from one time period to the next. Under assumptions that are often reasonable, the latent class instrumental variable (IV) method estimates the effect of treatment received in the aforementioned scenarios involving all-or-none compliance and all-or-none availability. Key aspects are four initial latent classes (sometimes called principal strata) based on treatment received if in each randomization group or time period, the exclusion restriction assumption (in which randomization group or time period is an instrumental variable), the monotonicity assumption (which drops an implausible latent class from the analysis), and the estimated effect of receiving treatment in one latent class (sometimes called efficacy, the local average treatment effect, or the complier average causal effect). Since its independent formulations in the biostatistics and econometrics literatures, the latent class IV method (which has no well-established name) has gained increasing popularity. We review the latent class IV method from a clinical and biostatistical perspective, focusing on underlying assumptions, methodological extensions, and applications in our fields of obstetrics and cancer research. Copyright © 2015 John Wiley & Sons, Ltd.
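Under the exclusion restriction and monotonicity assumptions, the latent-class IV estimate of the effect of treatment received reduces to the intention-to-treat effect divided by the between-arm difference in the probability of receiving treatment. The numbers below are hypothetical and serve only to show the arithmetic.

```python
def complier_average_causal_effect(mean_y_arm1, mean_y_arm0, p_treated_arm1, p_treated_arm0):
    """Latent-class IV / CACE sketch: ITT effect on the outcome divided by the
    difference in treatment receipt between randomized arms."""
    itt = mean_y_arm1 - mean_y_arm0
    uptake = p_treated_arm1 - p_treated_arm0
    return itt / uptake

# Hypothetical: outcome means 0.30 vs 0.24, treatment received by 80% vs 10%
print(complier_average_causal_effect(0.30, 0.24, 0.80, 0.10))  # about 0.086
```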
ERIC Educational Resources Information Center
Sewasew, Daniel; Mengestle, Missaye; Abate, Gebeyehu
2015-01-01
The aim of this study was to compare the PPT and traditional lecture methods in terms of material understandability, effectiveness, and attitude among university students. A comparative descriptive survey research design was employed to answer the research questions raised. Four hundred and twenty-nine participants were selected randomly using stratified sampling…
Researching Sex Bias in the Classroom.
ERIC Educational Resources Information Center
Donlan, Dan
This paper outlines five methods of research on sex bias in the classroom: one-time survey, one class/one treatment, two class/two treatment, one class/random assignment to treatment, and analysis of differentiated effect. It shows how each method could be used in attempting to measure the effect of a unit on Norma Klein's "Mom, the Wolfman and…
ERIC Educational Resources Information Center
Filges, Trine; Andersen, Ditte; Jørgensen, Anne-Marie Klint
2018-01-01
Purpose: This review evaluates the evidence of the effects of multidimensional family therapy (MDFT) on drug use reduction in young people for the treatment of nonopioid drug use. Method: We followed Campbell Collaboration guidelines to conduct a systematic review of randomized and nonrandomized trials. Meta-analytic methods were used to…
Effects of the School-to-Work Group Method among Young People
ERIC Educational Resources Information Center
Koivisto, Petri; Vuori, Jukka; Nykyri, Elina
2007-01-01
This study examines effects of the School-to-Work Group Method among 17-25-year-old young people facing the transition from vocational college to work. After baseline measurement (N=416) participants were randomized into experimental and control groups. The results of ten month follow-up (N=334) showed notable beneficial impacts of the group…
Bouwmeester, Walter; Twisk, Jos W R; Kappen, Teus H; van Klei, Wilton A; Moons, Karel G M; Vergouwe, Yvonne
2013-02-15
When study data are clustered, standard regression analysis is considered inappropriate and analytical techniques for clustered data need to be used. For prediction research in which the interest is in predictor effects at the patient level, random effect regression models are probably preferred over standard regression analysis. It is well known that the random effect parameter estimates and the standard logistic regression parameter estimates are different. Here, we compared random effect and standard logistic regression models for their ability to provide accurate predictions. Using an empirical study on 1642 surgical patients at risk of postoperative nausea and vomiting, who were treated by one of 19 anesthesiologists (clusters), we developed prognostic models either with standard or random intercept logistic regression. External validity of these models was assessed in new patients from other anesthesiologists. We supported our results with simulation studies using intra-class correlation coefficients (ICC) of 5%, 15%, or 30%. Standard performance measures and measures adapted for the clustered data structure were estimated. The model developed with random effect analysis showed better discrimination than the standard approach, if the cluster effects were used for risk prediction (standard c-index of 0.69 versus 0.66). In the external validation set, both models showed similar discrimination (standard c-index 0.68 versus 0.67). The simulation study confirmed these results. For datasets with a high ICC (≥15%), model calibration was only adequate in external subjects if the performance measure used assumed the same data structure as the model development method: standard calibration measures showed good calibration for the standard developed model, and calibration measures adapted to the clustered data structure showed good calibration for the prediction model with random intercept. The models with random intercept discriminate better than the standard model only if the cluster effect is used for predictions. The prediction model with random intercept had good calibration within clusters.
The estimation of branching curves in the presence of subject-specific random effects.
Elmi, Angelo; Ratcliffe, Sarah J; Guo, Wensheng
2014-12-20
Branching curves are a technique for modeling curves that change trajectory at a change (branching) point. Currently, the estimation framework is limited to independent data, and smoothing splines are used for estimation. This article aims to extend the branching curve framework to the longitudinal data setting where the branching point varies by subject. If the branching point is modeled as a random effect, then the longitudinal branching curve framework is a semiparametric nonlinear mixed effects model. Given existing issues with using random effects within a smoothing spline, we express the model as a B-spline based semiparametric nonlinear mixed effects model. Simple, clever smoothness constraints are enforced on the B-splines at the change point. The method is applied to Women's Health data where we model the shape of the labor curve (cervical dilation measured longitudinally) before and after treatment with oxytocin (a labor stimulant). Copyright © 2014 John Wiley & Sons, Ltd.
[Computer-assisted education in problem-solving in neurology; a randomized educational study].
Weverling, G J; Stam, J; ten Cate, T J; van Crevel, H
1996-02-24
To determine the effect of computer-based medical teaching (CBMT) as a supplementary method to teach clinical problem-solving during the clerkship in neurology. Randomized controlled blinded study. Academic Medical Centre, Amsterdam, the Netherlands. A total of 103 students were assigned at random to a group with access to CBMT and a control group. CBMT consisted of 20 computer-simulated patients with neurological diseases, and was permanently available for five weeks to students in the CBMT group. The ability to recognize and solve neurological problems was assessed with two free-response tests, scored by two blinded observers. The CBMT students scored significantly better on the test related to the CBMT cases (mean score 7.5 on a zero to 10 point scale; control group 6.2; p < 0.001). There was no significant difference on the control test not related to the problems practised with CBMT. CBMT can be an effective method for teaching clinical problem-solving, when used as a supplementary teaching facility during a clinical clerkship. The increased ability to solve the problems learned through CBMT had no demonstrable effect on performance with other neurological problems.
Ship Detection Based on Multiple Features in Random Forest Model for Hyperspectral Images
NASA Astrophysics Data System (ADS)
Li, N.; Ding, L.; Zhao, H.; Shi, J.; Wang, D.; Gong, X.
2018-04-01
A novel ship detection method that aims to make full use of both the spatial and spectral information from hyperspectral images is proposed. First, a band with a high signal-to-noise ratio in the near-infrared or short-wave infrared range is used to segment land and sea with the Otsu threshold segmentation method. Second, multiple features, including spectral and texture features, are extracted from the hyperspectral images: principal components analysis (PCA) is used to extract spectral features, and the Grey Level Co-occurrence Matrix (GLCM) is used to extract texture features. Finally, a Random Forest (RF) model is introduced to detect ships based on the extracted features. To illustrate the effectiveness of the method, we carry out experiments on the EO-1 data, comparing single features and different combinations of multiple features. Compared with the traditional single-feature method and a Support Vector Machine (SVM) model, the proposed method stably achieves ship detection against complex backgrounds and effectively improves detection accuracy.
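The classification stage of such a pipeline can be prototyped with standard tools: compress the spectral bands with PCA, stack them with texture descriptors (GLCM statistics in the paper), and train a random forest. The sketch below uses random placeholder arrays in place of real EO-1 samples and purely illustrative parameter settings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier

# Placeholder data: per-object samples (rows) with 120 spectral bands and
# 8 texture descriptors; labels 1 = ship, 0 = background.
rng = np.random.default_rng(3)
X_spectral = rng.normal(size=(2000, 120))
X_texture = rng.normal(size=(2000, 8))
y = rng.integers(0, 2, size=2000)

# Spectral features compressed with PCA, then stacked with texture features
pcs = PCA(n_components=10).fit_transform(X_spectral)
X = np.hstack([pcs, X_texture])

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print(clf.score(X, y))
```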
NASA Astrophysics Data System (ADS)
Müller, Tobias M.; Gurevich, Boris
2005-04-01
An important dissipation mechanism for waves in randomly inhomogeneous poroelastic media is the effect of wave-induced fluid flow. In the framework of Biot's theory of poroelasticity, this mechanism can be understood as scattering from fast into slow compressional waves. To describe this conversion scattering effect in poroelastic random media, the dynamic characteristics of the coherent wavefield are analyzed using the theory of statistical wave propagation. In particular, the method of statistical smoothing is applied to Biot's equations of poroelasticity. Within the accuracy of first-order statistical smoothing, an effective wave number of the coherent field, which accounts for the effect of wave-induced flow, is derived. This wave number is complex and involves an integral over the correlation function of the medium's fluctuations. It is shown that the known one-dimensional (1-D) result can be obtained as a special case of the present 3-D theory. The expression for the effective wave number allows us to derive a model for elastic attenuation and dispersion due to wave-induced fluid flow. These wavefield attributes are analyzed in a companion paper.
NASA Astrophysics Data System (ADS)
Luo, D. M.; Xie, Y.; Su, X. R.; Zhou, Y. L.
2018-01-01
Based on the four classical models of Mooney-Rivlin (M-R), Yeoh, Ogden, and Neo-Hookean (N-H), a large-deformation strain energy constitutive equation for rubber composites reinforced with random ceramic particles is proposed in this paper from the standpoint of continuum mechanics. By decoupling the interaction between the matrix and the random particles, the strain energy of each phase is obtained to derive the explicit constitutive equation for the rubber composites. Uniaxial tension, pure shear, and equibiaxial tension tests are simulated by the non-linear finite element method on the ANSYS platform. The finite element results are compared with those from experiments, the material parameters are determined by fitting the results from different test conditions, and the influence of the radius of the random ceramic particles on the effective mechanical properties is analyzed.
Adjusting for multiple prognostic factors in the analysis of randomised trials
2013-01-01
Background When multiple prognostic factors are adjusted for in the analysis of a randomised trial, it is unclear (1) whether it is necessary to account for each of the strata, formed by all combinations of the prognostic factors (stratified analysis), when randomisation has been balanced within each stratum (stratified randomisation), or whether adjusting for the main effects alone will suffice, and (2) the best method of adjustment in terms of type I error rate and power, irrespective of the randomisation method. Methods We used simulation to (1) determine if a stratified analysis is necessary after stratified randomisation, and (2) to compare different methods of adjustment in terms of power and type I error rate. We considered the following methods of analysis: adjusting for covariates in a regression model, adjusting for each stratum using either fixed or random effects, and Mantel-Haenszel or a stratified Cox model depending on outcome. Results Stratified analysis is required after stratified randomisation to maintain correct type I error rates when (a) there are strong interactions between prognostic factors, and (b) there are approximately equal number of patients in each stratum. However, simulations based on real trial data found that type I error rates were unaffected by the method of analysis (stratified vs unstratified), indicating these conditions were not met in real datasets. Comparison of different analysis methods found that with small sample sizes and a binary or time-to-event outcome, most analysis methods lead to either inflated type I error rates or a reduction in power; the lone exception was a stratified analysis using random effects for strata, which gave nominal type I error rates and adequate power. Conclusions It is unlikely that a stratified analysis is necessary after stratified randomisation except in extreme scenarios. Therefore, the method of analysis (accounting for the strata, or adjusting only for the covariates) will not generally need to depend on the method of randomisation used. Most methods of analysis work well with large sample sizes, however treating strata as random effects should be the analysis method of choice with binary or time-to-event outcomes and a small sample size. PMID:23898993
Heckman, James; Moon, Seong Hyeok; Pinto, Rodrigo; Savelyev, Peter; Yavitz, Adam
2012-01-01
Social experiments are powerful sources of information about the effectiveness of interventions. In practice, initial randomization plans are almost always compromised. Multiple hypotheses are frequently tested. “Significant” effects are often reported with p-values that do not account for preliminary screening from a large candidate pool of possible effects. This paper develops tools for analyzing data from experiments as they are actually implemented. We apply these tools to analyze the influential HighScope Perry Preschool Program. The Perry program was a social experiment that provided preschool education and home visits to disadvantaged children during their preschool years. It was evaluated by the method of random assignment. Both treatments and controls have been followed from age 3 through age 40. Previous analyses of the Perry data assume that the planned randomization protocol was implemented. In fact, as in many social experiments, the intended randomization protocol was compromised. Accounting for compromised randomization, multiple-hypothesis testing, and small sample sizes, we find statistically significant and economically important program effects for both males and females. We also examine the representativeness of the Perry study. PMID:23255883
Zerfu, Taddese Alemu; Ayele, Henok Taddese; Bogale, Tariku Nigatu
2018-06-01
To investigate the effect of innovative means of distributing long-acting reversible contraceptives (LARC) on contraceptive use, we implemented a three-arm, parallel-group, cluster-randomized community trial. The intervention consisted of placing trained community-based reproductive health nurses (CORN) within health centers or health posts. The nurses provided counseling to encourage women to use LARC and distributed all contraceptive methods. A total of 282 villages were randomly selected and assigned to a control arm (n = 94) or 1 of 2 treatment arms (n = 94 each). The treatment groups differed by where the new service providers were deployed, health post or health center. We calculated difference-in-differences (DID) estimates to assess program impacts on LARC use. After nine months of intervention, the use of LARC methods increased significantly by 72.3 percent, while the use of short-acting methods declined by 19.6 percent. The proportion of women using LARC methods increased by 45.9 percent and 45.7 percent in the health post and health center based intervention arms, respectively. Compared to the control group, the DID estimates indicate that the use of LARC methods increased by 11.3 and 12.3 percentage points in the health post and health center based intervention arms. © 2018 The Population Council, Inc.
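The difference-in-differences (DID) estimator used here is simply the change in the outcome in an intervention arm minus the change in the control arm. A toy calculation with made-up prevalences (not the trial's data) is shown below.

```python
def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Difference-in-differences: intervention-arm change minus control-arm change."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Hypothetical baseline/endline LARC prevalence in percent
print(did_estimate(treat_pre=8.0, treat_post=50.0, ctrl_pre=9.0, ctrl_post=30.0))  # 21.0 points
```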
Stochastic Seismic Response of an Algiers Site with Random Depth to Bedrock
DOE Office of Scientific and Technical Information (OSTI.GOV)
Badaoui, M.; Mebarki, A.; Berrah, M. K.
2010-05-21
Among the important effects of the Boumerdes earthquake (Algeria, May 21st, 2003) was that, within the same zone, the destruction in certain parts was more severe than in others. This phenomenon is due to site effects, which alter the characteristics of seismic motions and cause concentration of damage during earthquakes. Local site effects such as the thickness and mechanical properties of soil layers have important effects on surface ground motions. This paper deals with the randomness of the depth to bedrock (soil layer heights), which is assumed to be a random variable with a lognormal distribution. This distribution is suitable for strictly non-negative random variables with large values of the coefficient of variation. In this case, Monte Carlo simulations are combined with the stiffness matrix method, used herein as a deterministic method, for evaluating the effect of the depth-to-bedrock uncertainty on the seismic response of a multilayered soil. This study considers a P and SV wave propagation pattern using input accelerations recorded at the Keddara station, located 20 km from the epicenter and directly on the bedrock. A parametric study is conducted to derive the stochastic behavior of the peak ground acceleration and its response spectrum, the transfer function, and the amplification factors. It is found that the heterogeneity of soil height causes a widening of the frequency content and an increase in the fundamental frequency of the soil profile, indicating that the resonance phenomenon concerns a larger number of structures.
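A lognormal depth-to-bedrock variable is conveniently specified by its mean and coefficient of variation; each Monte Carlo draw is then fed to the deterministic stiffness-matrix model. The parameterization below is the standard one, but the numerical values are illustrative and not from the study.

```python
import numpy as np

def lognormal_depth_samples(mean_depth, cov, n=10000, seed=0):
    """Draw Monte Carlo samples of the depth to bedrock from a lognormal
    distribution given its mean and coefficient of variation."""
    sigma2 = np.log(1.0 + cov ** 2)           # variance of log-depth
    mu = np.log(mean_depth) - 0.5 * sigma2    # mean of log-depth
    rng = np.random.default_rng(seed)
    return rng.lognormal(mean=mu, sigma=np.sqrt(sigma2), size=n)

depths = lognormal_depth_samples(mean_depth=30.0, cov=0.3)
print(depths.mean(), depths.std() / depths.mean())   # close to 30.0 and 0.3
```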
Zhang, Peng; Luo, Dandan; Li, Pengfei; Sharpsten, Lucie; Medeiros, Felipe A.
2015-01-01
Glaucoma is a progressive disease due to damage in the optic nerve with associated functional losses. Although the relationship between structural and functional progression in glaucoma is well established, there is disagreement on how this association evolves over time. In addressing this issue, we propose a new class of non-Gaussian linear-mixed models to estimate the correlations among subject-specific effects in multivariate longitudinal studies with a skewed distribution of random effects, to be used in a study of glaucoma. This class provides an efficient estimation of subject-specific effects by modeling the skewed random effects through the log-gamma distribution. It also provides more reliable estimates of the correlations between the random effects. To validate the log-gamma assumption against the usual normality assumption of the random effects, we propose a lack-of-fit test using the profile likelihood function of the shape parameter. We apply this method to data from a prospective observation study, the Diagnostic Innovations in Glaucoma Study, to present a statistically significant association between structural and functional change rates that leads to a better understanding of the progression of glaucoma over time. PMID:26075565
Wang, Xuesong; Xing, Yilun; Luo, Lian; Yu, Rongjie
2018-08-01
Risky driving behavior is one of the main causes of commercial vehicle related crashes. In order to achieve safer vehicle operation, safety education for drivers is often provided. However, the education programs vary in quality and may not always be successful in reducing crash rates. Behavior-Based Safety (BBS) education is a popular approach found effective by numerous studies, but even this approach varies as to the combination of frequency, mode and content used by different education providers. This study therefore evaluates and compares the effectiveness of BBS education methods. Thirty-five drivers in Shanghai, China, were coached with one of three different BBS education methods for 13 weeks following a 13-week baseline phase with no education. A random-effects negative binomial (NB) model was built and calibrated to investigate the relationship between BBS education and the driver at-fault safety-related event rate. Based on the results of the random-effects NB model, event modification factors (EMF) were calculated to evaluate and compare the effectiveness of the methods. Results show that (1) BBS education was confirmed to be effective in safety-related event reduction; (2) the most effective method among the three applied monthly face-to-face coaching, including feedback with video and statistical data, and training on strategies to avoid driver-specific unsafe behaviors; (3) weekly telephone coaching using statistics and strategies was rated by drivers as the most convenient delivery mode, and was also significantly effective. Copyright © 2018 Elsevier Ltd. All rights reserved.
A method for validating Rent's rule for technological and biological networks.
Alcalde Cuesta, Fernando; González Sequeiros, Pablo; Lozano Rojo, Álvaro
2017-07-14
Rent's rule is an empirical power law introduced in an effort to describe and optimize the wiring complexity of computer logic graphs. It is known that brain and neuronal networks also obey Rent's rule, which is consistent with the idea that wiring costs play a fundamental role in brain evolution and development. Here we propose a method to validate this power law for a certain range of network partitions. This method is based on the bifurcation phenomenon that appears when the network is subjected to random alterations preserving its degree distribution. It has been tested on a set of VLSI circuits and real networks, including biological and technological ones. We also analyzed the effect of different types of random alterations on the Rentian scaling in order to test the influence of the degree distribution. Some network architectures are quite sensitive to these randomization procedures, showing significant increases in the values of the Rent exponents.
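Rent's rule relates the number of external terminals T of a partition to its number of internal nodes G as T ≈ k·G^p. A minimal illustration of estimating the Rent exponent by a log-log fit is sketched below with invented partition counts; the paper's bifurcation-based validation under degree-preserving randomization is not reproduced.

```python
import numpy as np

# Hypothetical (gates, terminals) pairs collected over a range of partitions.
gates = np.array([8, 16, 32, 64, 128, 256, 512])
terminals = np.array([10, 16, 26, 41, 66, 105, 168])

# Fit log T = log k + p * log G by least squares.
p, log_k = np.polyfit(np.log(gates), np.log(terminals), 1)
print(f"Rent exponent p ~= {p:.2f}, coefficient k ~= {np.exp(log_k):.2f}")
```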
NASA Astrophysics Data System (ADS)
Shi, Jing; Shi, Yunli; Tan, Jian; Zhu, Lei; Li, Hu
2018-02-01
Traditional power forecasting models cannot efficiently take various factors into account, nor can they identify the most relevant factors. In this paper, mutual information from information theory and the random forests algorithm from artificial intelligence are introduced into medium- and long-term electricity demand prediction. Mutual information can identify highly related factors based on the average mutual information between a variety of variables and electricity demand; different industries may be highly associated with different variables. The random forests algorithm was then used to build a separate forecasting model for each industry according to its correlated factors. Electricity consumption data from Jiangsu Province are taken as a practical example, and the above methods are compared with methods that take no account of mutual information or industry segmentation. The simulation results show that the above method is sound and effective and can provide higher prediction accuracy.
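A minimal sketch of the two-step idea, mutual-information screening followed by a random forest per industry, using scikit-learn on synthetic data; the factor structure, threshold and model settings are illustrative assumptions rather than the paper's configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(1)
n = 300

# Synthetic candidate drivers and an electricity-demand series for one industry.
X = rng.normal(size=(n, 6))            # e.g. GDP, population, temperature, ...
y = 2.0 * X[:, 0] + 0.5 * X[:, 2] ** 2 + rng.normal(scale=0.3, size=n)

# Step 1: screen factors by mutual information with demand.
mi = mutual_info_regression(X, y, random_state=0)
selected = np.argsort(mi)[::-1][:3]    # keep the three most informative factors
print("selected factor indices:", selected)

# Step 2: fit a random forest on the selected factors only.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X[:, selected], y)
print("in-sample R^2:", round(model.score(X[:, selected], y), 3))
```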
Staley, James R; Burgess, Stephen
2017-05-01
Mendelian randomization, the use of genetic variants as instrumental variables (IV), can test for and estimate the causal effect of an exposure on an outcome. Most IV methods assume that the function relating the exposure to the expected value of the outcome (the exposure-outcome relationship) is linear. However, in practice, this assumption may not hold. Indeed, often the primary question of interest is to assess the shape of this relationship. We present two novel IV methods for investigating the shape of the exposure-outcome relationship: a fractional polynomial method and a piecewise linear method. We divide the population into strata using the exposure distribution, and estimate a causal effect, referred to as a localized average causal effect (LACE), in each stratum of the population. The fractional polynomial method performs meta-regression on these LACE estimates. The piecewise linear method estimates a continuous piecewise linear function, the gradient of which is the LACE estimate in each stratum. Both methods were demonstrated in a simulation study to estimate the true exposure-outcome relationship well, particularly when the relationship was a fractional polynomial (for the fractional polynomial method) or was piecewise linear (for the piecewise linear method). The methods were used to investigate the shape of the relationship of body mass index with systolic blood pressure and diastolic blood pressure. © 2017 The Authors Genetic Epidemiology Published by Wiley Periodicals, Inc.
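A rough sketch of the stratified estimation underlying both methods: strata are formed on the IV-free exposure (the exposure residual after removing the instrument's contribution) and a localized average causal effect is computed per stratum as a Wald ratio; the data are simulated, and the fractional-polynomial meta-regression step is omitted.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20_000

g = rng.binomial(2, 0.3, size=n)                 # genetic score (instrument)
u = rng.normal(size=n)                           # confounder
x = 0.5 * g + u + rng.normal(size=n)             # exposure
y = 0.2 * x**2 + u + rng.normal(size=n)          # nonlinear outcome model

# IV-free exposure: residual of exposure after removing the instrument's effect.
beta_gx = np.polyfit(g, x, 1)[0]
x0 = x - beta_gx * g

# Stratify on the IV-free exposure and compute a Wald ratio (LACE) per stratum.
quantiles = np.quantile(x0, np.linspace(0, 1, 6))
for lo, hi in zip(quantiles[:-1], quantiles[1:]):
    idx = (x0 >= lo) & (x0 <= hi)
    lace = np.cov(g[idx], y[idx])[0, 1] / np.cov(g[idx], x[idx])[0, 1]
    print(f"stratum [{lo:5.2f}, {hi:5.2f}]: LACE ~= {lace:.2f}")
```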
Causal inference from observational data.
Listl, Stefan; Jürges, Hendrik; Watt, Richard G
2016-10-01
Randomized controlled trials have long been considered the 'gold standard' for causal inference in clinical research. In the absence of randomized experiments, identification of reliable intervention points to improve oral health is often perceived as a challenge. But other fields of science, such as social science, have always been challenged by ethical constraints to conducting randomized controlled trials. Methods have been established to make causal inference using observational data, and these methods are becoming increasingly relevant in clinical medicine, health policy and public health research. This study provides an overview of state-of-the-art methods specifically designed for causal inference in observational data, including difference-in-differences (DiD) analyses, instrumental variables (IV), regression discontinuity designs (RDD) and fixed-effects panel data analysis. The described methods may be particularly useful in dental research, not least because of the increasing availability of routinely collected administrative data and electronic health records ('big data'). © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
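Of the approaches listed, instrumental variable estimation is straightforward to sketch: below is a minimal two-stage least-squares example on simulated data (the instrument, effect sizes and confounding structure are invented), contrasting the confounded OLS estimate with the IV estimate.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5_000

z = rng.binomial(1, 0.5, size=n)               # instrument (e.g. policy eligibility)
u = rng.normal(size=n)                         # unobserved confounder
x = 0.8 * z + u + rng.normal(size=n)           # exposure, confounded by u
y = 1.5 * x + 2.0 * u + rng.normal(size=n)     # outcome; true causal effect = 1.5

def ols(design, response):
    """Least-squares coefficients for a design matrix with an intercept column."""
    return np.linalg.lstsq(design, response, rcond=None)[0]

ones = np.ones(n)
x_hat = np.column_stack([ones, z]) @ ols(np.column_stack([ones, z]), x)  # stage 1
beta = ols(np.column_stack([ones, x_hat]), y)                            # stage 2

print(f"naive OLS estimate: {ols(np.column_stack([ones, x]), y)[1]:.2f}")
print(f"2SLS (IV) estimate: {beta[1]:.2f}")
```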
Baker, John [Walnut Creek, CA; Archer, Daniel E [Knoxville, TN; Luke, Stanley John [Pleasanton, CA; Decman, Daniel J [Livermore, CA; White, Gregory K [Livermore, CA
2009-06-23
A tailpulse signal generating/simulating apparatus, system, and method designed to produce electronic pulses which simulate tailpulses produced by a gamma radiation detector, including the pileup effect caused by the characteristic exponential decay of the detector pulses, and the random Poisson-distributed pulse timing characteristic of radioactive materials. A digital signal processor (DSP) is programmed and configured to produce digital values corresponding to pseudo-randomly selected pulse amplitudes and pseudo-randomly selected Poisson timing intervals of the tailpulses. Pulse amplitude values are exponentially decayed while the digital value is output to a digital-to-analog converter (DAC), and the amplitudes of new pulses are added to decaying pulses to simulate the pileup effect for enhanced realism in the simulation.
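A minimal offline sketch of the same signal model, Poisson-timed arrivals, pseudo-random amplitudes, exponential tails and pileup, written in NumPy rather than for a DSP/DAC; the count rate, decay constant and amplitude range are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

rate = 5e3            # assumed mean count rate (pulses per second)
tau = 50e-6           # assumed exponential decay constant of a tailpulse (s)
fs = 1e6              # sample rate of the synthesized waveform (Hz)
duration = 0.01       # seconds of waveform to generate

# Poisson process: exponential inter-arrival times, kept within the window.
arrivals = np.cumsum(rng.exponential(1.0 / rate, size=int(rate * duration * 2)))
arrivals = arrivals[arrivals < duration]
amplitudes = rng.uniform(0.1, 1.0, size=arrivals.size)   # pseudo-random pulse heights

t = np.arange(int(duration * fs)) / fs
waveform = np.zeros_like(t)
for t0, a in zip(arrivals, amplitudes):
    mask = t >= t0
    waveform[mask] += a * np.exp(-(t[mask] - t0) / tau)   # decaying tails pile up

print(f"{arrivals.size} pulses generated; peak amplitude with pileup = {waveform.max():.2f}")
```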
Alternate methods for FAAT S-curve generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaufman, A.M.
The FAAT (Foreign Asset Assessment Team) assessment methodology attempts to derive a probability of effect as a function of incident field strength. The probability of effect is the likelihood that the stress put on a system exceeds its strength. In the FAAT methodology, both the stress and strength are random variables whose statistical properties are estimated by experts. Each random variable has two components of uncertainty: systematic and random. The systematic uncertainty drives the confidence bounds in the FAAT assessment. Its variance can be reduced by improved information. The variance of the random uncertainty is not reducible. The FAAT methodology uses an assessment code called ARES to generate probability of effect curves (S-curves) at various confidence levels. ARES assumes lognormal distributions for all random variables. The S-curves themselves are lognormal cumulants associated with the random portion of the uncertainty. The placement of the S-curves depends on confidence bounds. The systematic uncertainty in both stress and strength is usually described by a mode and an upper and lower variance. Such a description is not consistent with the lognormal assumption of ARES, and an unsatisfactory workaround solution is used to obtain the required placement of the S-curves at each confidence level. We have looked into this situation and have found that significant errors are introduced by this workaround. These errors are at least several dB-W/cm² at all confidence levels, but they are especially bad in the estimate of the median. In this paper, we suggest two alternate solutions for the placement of S-curves. To compare these calculational methods, we have tabulated the common combinations of upper and lower variances and generated the relevant S-curve offsets from the mode difference of stress and strength.
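Under the lognormal stress/strength assumption described above, the probability of effect has a closed form; a minimal sketch with invented medians and logarithmic standard deviations (not FAAT or ARES values) is:

```python
import numpy as np
from scipy.stats import norm

# Illustrative lognormal parameters (natural-log space).
mu_stress, sigma_stress = np.log(30.0), 0.6      # median 30, log-sd 0.6
mu_strength, sigma_strength = np.log(80.0), 0.8  # median 80, log-sd 0.8

# P(stress > strength) when both are independent lognormals:
# ln(stress) - ln(strength) is normal with the mean and sd used below.
z = (mu_stress - mu_strength) / np.hypot(sigma_stress, sigma_strength)
print(f"probability of effect ~= {norm.cdf(z):.3f}")
```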
NASA Astrophysics Data System (ADS)
Kudoh, Eisuke; Ito, Haruki; Wang, Zhisen; Adachi, Fumiyuki
In mobile communication systems, high-speed packet data services are in demand. In high-speed data transmission, throughput degrades severely due to inter-path interference (IPI). Recently, we proposed a random transmit power control (TPC) to increase the uplink throughput of DS-CDMA packet mobile communications. In this paper, we apply IPI cancellation in addition to the random TPC. We derive the numerical expression for the received signal-to-interference plus noise power ratio (SINR) and introduce an IPI cancellation factor. We also derive the numerical expression for the system throughput when IPI is cancelled ideally, for comparison with the Monte Carlo numerically evaluated system throughput. Then we evaluate, by the Monte Carlo numerical computation method, the combined effect of random TPC and IPI cancellation on the uplink throughput of DS-CDMA packet mobile communications.
Chen, Xiaoqin; Li, Ying; Zheng, Hui; Hu, Kaming; Zhang, Hongxing; Zhao, Ling; Li, Yan; Liu, Lian; Mang, Lingling; Yu, Shuyuan
2009-07-01
Acupuncture to treat Bell's palsy is one of the most commonly used methods in China. There are a variety of acupuncture treatment options for Bell's palsy in clinical practice. Because Bell's palsy has three different path-stages (acute stage, resting stage and restoration stage), whether acupuncture is effective in the different path-stages and which acupuncture treatment is the best method are major issues in acupuncture clinical trials for Bell's palsy. In this article, we report the design and protocol of a large-sample multi-center randomized controlled trial of acupuncture for Bell's palsy. There are five acupuncture groups, four staged according to path-stage and one not. In total, 900 patients with Bell's palsy are enrolled in this study. These patients are randomly assigned to receive one of the following four treatments according to path-stage, i.e. 1) staging acupuncture, 2) staging acupuncture and moxibustion, 3) staging electro-acupuncture, 4) staging acupuncture along the yangming musculature, or to a non-staging acupuncture control group. The outcome measurements in this trial are comparisons among these five groups in terms of the House-Brackmann scale (Global Score and Regional Score), the Facial Disability Index scale, the Classification scale of Facial Paralysis, and the WHOQOL-BREF scale before randomization (baseline phase) and after randomization. The results of this trial will assess the efficacy of staging acupuncture and moxibustion for Bell's palsy and help identify the best acupuncture treatment among these five methods.
Gruber, Joshua S; Arnold, Benjamin F; Reygadas, Fermin; Hubbard, Alan E; Colford, John M
2014-05-01
Complier average causal effects (CACE) estimate the impact of an intervention among treatment compliers in randomized trials. Methods used to estimate CACE have been outlined for parallel-arm trials (e.g., using an instrumental variables (IV) estimator) but not for other randomized study designs. Here, we propose a method for estimating CACE in randomized stepped wedge trials, where experimental units cross over from control conditions to intervention conditions in a randomized sequence. We illustrate the approach with a cluster-randomized drinking water trial conducted in rural Mexico from 2009 to 2011. Additionally, we evaluated the plausibility of assumptions required to estimate CACE using the IV approach, which are testable in stepped wedge trials but not in parallel-arm trials. We observed small increases in the magnitude of CACE risk differences compared with intention-to-treat estimates for drinking water contamination (risk difference (RD) = -22% (95% confidence interval (CI): -33, -11) vs. RD = -19% (95% CI: -26, -12)) and diarrhea (RD = -0.8% (95% CI: -2.1, 0.4) vs. RD = -0.1% (95% CI: -1.1, 0.9)). Assumptions required for IV analysis were probably violated. Stepped wedge trials allow investigators to estimate CACE with an approach that avoids the stronger assumptions required for CACE estimation in parallel-arm trials. Inclusion of CACE estimates in stepped wedge trials with imperfect compliance could enhance reporting and interpretation of the results of such trials.
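A minimal sketch of the instrumental-variables (Wald) estimator of CACE, the intention-to-treat effect divided by the between-arm difference in uptake; the summary numbers are invented and the stepped-wedge period structure is ignored for brevity.

```python
# Hypothetical arm-level summaries (fractions).
risk_treated_arm = 0.18      # outcome risk in clusters assigned to the intervention
risk_control_arm = 0.37      # outcome risk in control clusters
uptake_treated_arm = 0.70    # proportion actually using the intervention when assigned
uptake_control_arm = 0.05    # contamination: uptake among controls

itt = risk_treated_arm - risk_control_arm                 # intention-to-treat risk difference
compliance = uptake_treated_arm - uptake_control_arm      # net compliance
cace = itt / compliance                                   # IV (Wald) estimator of CACE

print(f"ITT  = {itt:+.3f}")
print(f"CACE = {cace:+.3f}")
```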
Turbulence and fire-spotting effects into wild-land fire simulators
NASA Astrophysics Data System (ADS)
Kaur, Inderpreet; Mentrelli, Andrea; Bosseur, Frédéric; Filippi, Jean-Baptiste; Pagnini, Gianni
2016-10-01
This paper presents a mathematical approach to model the effects and the role of phenomena with a random nature, such as turbulence and fire-spotting, within existing wildfire simulators. The formulation proposes that the propagation of the fire front is the sum of a drifting component (obtained from an existing wildfire simulator without turbulence and fire-spotting) and a random fluctuating component. The modelling of the random effects is embodied in a probability density function accounting for the fluctuations around the fire perimeter given by the drifting component. In the past, this formulation has been applied to include these random effects in a wildfire simulator based on an Eulerian moving interface method, namely the Level Set Method (LSM), but in this paper the same formulation is adapted to a wildfire simulator based on a Lagrangian front tracking technique, namely the Discrete Event System Specification (DEVS). The main highlight of the present study is the comparison of the performance of a Lagrangian and an Eulerian moving interface method when applied to wild-land fire propagation. Simple idealised numerical experiments are used to investigate the potential applicability of the proposed formulation to DEVS and to compare its behaviour with respect to the LSM. The results show that the DEVS-based wildfire propagation model qualitatively improves its performance (e.g., reproducing flank and back fire, increased fire spread due to pre-heating of the fuel by hot air and firebrands, fire propagation across no-fuel zones, secondary fire generation, ...) when random effects are included according to the present formulation. The performance of the DEVS- and LSM-based wildfire models is comparable, and the only differences that arise between the two are due to differences in the geometrical construction of the direction of propagation. Though the results presented here are not accompanied by a validation exercise and provide only a proof of concept, they point strongly towards an intended operational use. The existing LSM- and DEVS-based operational simulators, such as WRF-SFIRE and ForeFire respectively, can serve as an ideal basis for this.
Randomized controlled trial of a computer-based module to improve contraceptive method choice.
Garbers, Samantha; Meserve, Allison; Kottke, Melissa; Hatcher, Robert; Ventura, Alicia; Chiasson, Mary Ann
2012-10-01
Unintended pregnancy is common in the United States, and interventions are needed to improve contraceptive use among women at higher risk of unintended pregnancy, including Latinas and women with low educational attainment. A three-arm randomized controlled trial was conducted at two family planning sites serving low-income, predominantly Latina populations. The trial tested the efficacy of a computer-based contraceptive assessment module in increasing the proportion of patients choosing an effective method of contraception (<10 pregnancies/100 women per year, typical use). Participants were randomized to complete the module and receive tailored health materials, to complete the module and receive generic health materials, or to a control condition. In intent-to-treat analyses adjusted for recruitment site (n=2231), family planning patients who used the module were significantly more likely to choose an effective contraceptive method: 75% among those who received tailored materials [odds ratio (OR)=1.56; 95% confidence interval (CI): 1.23-1.98] and 78% among those who received generic materials (OR=1.74; 95% CI: 1.35-2.25), compared to 65% among control arm participants. The findings support prior research suggesting that patient-centered interventions can positively influence contraceptive method choice. Copyright © 2012 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Eliasson, Ann-Christin; Shaw, Karin; Berg, Elisabeth; Krumlinde-Sundholm, Lena
2011-01-01
The aim was to evaluate the effect of Eco-CIMT in young children with unilateral cerebral palsy in a randomized controlled crossover design. The training was implemented within the regular pediatric services, provided by the child's parents and/or preschool teacher and supervised by the child's regular therapist. Methods: Twenty-five children…
ERIC Educational Resources Information Center
Foley, Elizabeth; Baillie, Andrew; Huxter, Malcolm; Price, Melanie; Sinclair, Emma
2010-01-01
Objective: This study evaluated the effectiveness of mindfulness-based cognitive therapy (MBCT) for individuals with a diagnosis of cancer. Method: Participants (N = 115) diagnosed with cancer, across site and stage, were randomly allocated to either the treatment or the wait-list condition. Treatment was conducted at 1 site, by a single…
ERIC Educational Resources Information Center
Hesser, Hugo; Gustafsson, Tore; Lunden, Charlotte; Henrikson, Oskar; Fattahi, Kidjan; Johnsson, Erik; Westin, Vendela Zetterqvist; Carlbring, Per; Maki-Torkko, Elina; Kaldo, Viktor; Andersson, Gerhard
2012-01-01
Objective: Our aim in this randomized controlled trial was to investigate the effects on global tinnitus severity of 2 Internet-delivered psychological treatments, acceptance and commitment therapy (ACT) and cognitive behavior therapy (CBT), in guided self-help format. Method: Ninety-nine participants (mean age = 48.5 years; 43% female) who were…
ERIC Educational Resources Information Center
Coalition for Evidence-Based Policy, 2012
2012-01-01
The increasing ability of social policy researchers to conduct randomized controlled trials (RCTs) at low cost could revolutionize the field of performance-based government. RCTs are widely judged to be the most credible method of evaluating whether a social program is effective, overcoming the demonstrated inability of other, more common methods…
ERIC Educational Resources Information Center
Smith, Sherri L.; Saunders, Gabrielle H.; Chisolm, Theresa H.; Frederick, Melissa; Bailey, Beth A.
2016-01-01
Purpose: The purpose of this study was to determine if patient characteristics or clinical variables could predict who benefits from individual auditory training. Method: A retrospective series of analyses were performed using a data set from a large, multisite, randomized controlled clinical trial that compared the treatment effects of at-home…
ERIC Educational Resources Information Center
Strait, Gerald Gill; Smith, Bradley H.; McQuillin, Sam; Terry, John; Swan, Suzanne; Malone, Patrick S.
2012-01-01
Motivational interviewing (MI) is an effective method of promoting change in adults, but research on adolescents is limited. This study tests the efficacy of MI for promoting academic achievement in middle school students. Participants were 103 6th-, 7th-, and 8th-grade students randomly assigned to either a MI (n = 50) or a waitlist control…
ERIC Educational Resources Information Center
Verde, Pablo E.; Ohmann, Christian
2015-01-01
Researchers may have multiple motivations for combining disparate pieces of evidence in a meta-analysis, such as generalizing experimental results or increasing the power to detect an effect that a single study is not able to detect. However, while in meta-analysis, the main question may be simple, the structure of evidence available to answer it…
ERIC Educational Resources Information Center
Sebro, Negusse Yohannes; Goshu, Ayele Taye
2017-01-01
This study aims to explore Bayesian multilevel modeling to investigate variations of average academic achievement of grade eight school students. A sample of 636 students is randomly selected from 26 private and government schools by a two-stage stratified sampling design. Bayesian method is used to estimate the fixed and random effects. Input and…
ERIC Educational Resources Information Center
Glassman, Jill R.; Potter, Susan C.; Baumler, Elizabeth R.; Coyle, Karin K.
2015-01-01
Introduction: Group-randomized trials (GRTs) are one of the most rigorous methods for evaluating the effectiveness of group-based health risk prevention programs. Efficiently designing GRTs with a sample size that is sufficient for meeting the trial's power and precision goals while not wasting resources exceeding them requires estimates of the…
ERIC Educational Resources Information Center
Hester, Reid K.; Delaney, Harold D.; Campbell, William
2011-01-01
Objective: To evaluate the effectiveness of a web-based protocol, ModerateDrinking.com (MD; "www.moderatedrinking.com") combined with use of the online resources of Moderation Management (MM; "www.moderation.org") as opposed to the use of the online resources of MM alone. Method: We randomly assigned 80 problem drinkers to…
ERIC Educational Resources Information Center
Szobot, C. M.; Ketzer, C.; Parente, M. A.; Biederman, J.; Rohde, L. A.
2004-01-01
Objective: To evaluate the acute efficacy of methylphenidate (MPH) in Brazilian male children and adolescents with ADHD. Method: In a 4-day, double-blind, placebo-controlled, randomized, fix dose escalating, parallel-group trial, 36 ADHD children and adolescents were allocated to two groups: MPH (n = 19) and placebo (n = 17). Participants were…
NASA Astrophysics Data System (ADS)
Smith, Lyndon N.; Smith, Melvyn L.
2000-10-01
Particulate materials undergo processing in many industries, and therefore there are significant commercial motivators for attaining improvements in the flow and packing behavior of powders. This can be achieved by modeling the effects of particle size, friction, and most importantly, particle shape or morphology. The method presented here for simulating powders employs a random number generator to construct a model of a random particle by combining a sphere with a number of smaller spheres. The resulting 3D model particle has a nodular type of morphology, which is similar to that exhibited by the atomized powders that are used in the bulk of powder metallurgy (PM) manufacture. The irregularity of the model particles is dependent upon vision system data gathered from microscopic analysis of real powder particles. A methodology is proposed whereby randomly generated model particles of various sizes and irregularities can be combined in a random packing simulation. The proposed Monte Carlo technique would allow incorporation of the effects of gravity, wall friction, and inter-particle friction. The improvements in simulation realism that this method is expected to provide would prove useful for controlling powder production, and for predicting die fill behavior during the production of PM parts.
Modelling wildland fire propagation by tracking random fronts
NASA Astrophysics Data System (ADS)
Pagnini, G.; Mentrelli, A.
2013-11-01
Wildland fire propagation is studied in the literature by two alternative approaches, namely the reaction-diffusion equation and the level-set method. These two approaches are considered alternatives to each other because the solution of the reaction-diffusion equation is generally a continuous smooth function with an exponential decay and an infinite support, while the level-set method, which is a front-tracking technique, generates a sharp function with a finite support. However, these two approaches can indeed be considered complementary and reconciled. Turbulent hot-air transport and fire spotting are phenomena with a random character that are extremely important in wildland fire propagation. As a consequence, the fire front acquires a random character, too; hence a tracking method for random fronts is needed. In particular, the level-set contour is here randomized according to the probability density function of the interface-particle displacement. Indeed, when the level-set method is developed for tracking a front interface with random motion, the resulting averaged process turns out to be governed by an evolution equation of the reaction-diffusion type. In this reconciled approach, the rate of spread of the fire keeps the same key and characterizing role proper to the level-set approach. The resulting model is suitable for simulating effects due to turbulent convection, such as flank and backing fire, the faster fire spread caused by pre-heating of the fuel by hot air and by ember landing, and also fire overcoming a firebreak zone, a case not resolved by models based on the level-set method alone. Moreover, the proposed formulation yields a correction to the rate-of-spread formula due to the mean jump length of firebrands in the downwind direction for the leeward sector of the fireline contour.
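A minimal sketch of the randomization idea: the sharp indicator of the burned region defined by a level-set contour is convolved with a Gaussian probability density of particle displacement to obtain an effective, smoothed burned-probability field; the grid, front radius and turbulence scale are illustrative assumptions and no rate-of-spread model is included.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

n = 200
x, y = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))

phi = np.hypot(x, y) - 0.4                 # level-set function: zero contour = fire front
burned = (phi <= 0).astype(float)          # sharp indicator of the burned region

sigma_turb = 0.05                          # assumed turbulent displacement scale (domain units)
sigma_px = sigma_turb / (2.0 / n)          # convert to grid cells
effective = gaussian_filter(burned, sigma=sigma_px)   # smoothed "effective" burned fraction

# An effective front can be taken where the burned probability crosses 0.5.
front_cells = np.count_nonzero((effective > 0.45) & (effective < 0.55))
print(f"sharp burned area fraction: {burned.mean():.3f}")
print(f"smoothed burned probability range: {effective.min():.2f}-{effective.max():.2f}")
print(f"cells near the effective front: {front_cells}")
```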
Developing Appropriate Methods for Cost-Effectiveness Analysis of Cluster Randomized Trials
Gomes, Manuel; Ng, Edmond S.-W.; Nixon, Richard; Carpenter, James; Thompson, Simon G.
2012-01-01
Aim. Cost-effectiveness analyses (CEAs) may use data from cluster randomized trials (CRTs), where the unit of randomization is the cluster, not the individual. However, most studies use analytical methods that ignore clustering. This article compares alternative statistical methods for accommodating clustering in CEAs of CRTs. Methods. Our simulation study compared the performance of statistical methods for CEAs of CRTs with 2 treatment arms. The study considered a method that ignored clustering—seemingly unrelated regression (SUR) without a robust standard error (SE)—and 4 methods that recognized clustering—SUR and generalized estimating equations (GEEs), both with robust SE, a “2-stage” nonparametric bootstrap (TSB) with shrinkage correction, and a multilevel model (MLM). The base case assumed CRTs with moderate numbers of balanced clusters (20 per arm) and normally distributed costs. Other scenarios included CRTs with few clusters, imbalanced cluster sizes, and skewed costs. Performance was reported as bias, root mean squared error (rMSE), and confidence interval (CI) coverage for estimating incremental net benefits (INBs). We also compared the methods in a case study. Results. Each method reported low levels of bias. Without the robust SE, SUR gave poor CI coverage (base case: 0.89 v. nominal level: 0.95). The MLM and TSB performed well in each scenario (CI coverage, 0.92–0.95). With few clusters, the GEE and SUR (with robust SE) had coverage below 0.90. In the case study, the mean INBs were similar across all methods, but ignoring clustering underestimated statistical uncertainty and the value of further research. Conclusions. MLMs and the TSB are appropriate analytical methods for CEAs of CRTs with the characteristics described. SUR and GEE are not recommended for studies with few clusters. PMID:22016450
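A rough sketch of the two-stage nonparametric bootstrap for the incremental net benefit in a CRT: clusters are resampled with replacement, then individuals within each sampled cluster, and the INB is recomputed on each replicate; the data are simulated, and the shrinkage correction used in the paper is omitted.

```python
import numpy as np

rng = np.random.default_rng(5)
lam = 20_000                     # willingness-to-pay per QALY (assumed)

def simulate_arm(n_clusters, treat):
    """Simulated cluster data: list of (costs, qalys) arrays, one pair per cluster."""
    clusters = []
    for _ in range(n_clusters):
        u = rng.normal(scale=300)                       # cluster-level cost effect
        n = rng.integers(20, 40)
        costs = 5_000 + 1_500 * treat + u + rng.normal(scale=800, size=n)
        qalys = 0.70 + 0.05 * treat + rng.normal(scale=0.10, size=n)
        clusters.append((costs, qalys))
    return clusters

def inb(arm_t, arm_c):
    nb = lambda arm: np.mean(np.concatenate([lam * q - c for c, q in arm]))
    return nb(arm_t) - nb(arm_c)

treat = simulate_arm(20, 1)
ctrl = simulate_arm(20, 0)

def resample(arm):
    picked = [arm[i] for i in rng.integers(len(arm), size=len(arm))]   # stage 1: clusters
    return [(c[idx], q[idx])                                           # stage 2: individuals
            for c, q in picked
            for idx in [rng.integers(len(c), size=len(c))]]

boots = np.array([inb(resample(treat), resample(ctrl)) for _ in range(1_000)])
print(f"INB point estimate: {inb(treat, ctrl):,.0f}")
print(f"95% bootstrap CI: {np.percentile(boots, 2.5):,.0f} to {np.percentile(boots, 97.5):,.0f}")
```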
Methods to assess an exercise intervention trial based on 3-level functional data.
Li, Haocheng; Kozey Keadle, Sarah; Staudenmayer, John; Assaad, Houssein; Huang, Jianhua Z; Carroll, Raymond J
2015-10-01
Motivated by data recording the effects of an exercise intervention on subjects' physical activity over time, we develop a model to assess the effects of a treatment when the data are functional with 3 levels (subjects, weeks and days in our application) and possibly incomplete. We develop a model with 3-level mean structure effects, all stratified by treatment and subject random effects, including a general subject effect and nested effects for the 3 levels. The mean and random structures are specified as smooth curves measured at various time points. The association structure of the 3-level data is induced through the random curves, which are summarized using a few important principal components. We use penalized splines to model the mean curves and the principal component curves, and cast the proposed model into a mixed effects model framework for model fitting, prediction and inference. We develop an algorithm to fit the model iteratively with the Expectation/Conditional Maximization Either (ECME) version of the EM algorithm and eigenvalue decompositions. Selection of the number of principal components and handling incomplete data issues are incorporated into the algorithm. The performance of the Wald-type hypothesis test is also discussed. The method is applied to the physical activity data and evaluated empirically by a simulation study. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
ERIC Educational Resources Information Center
Abdu-Raheem, B. O.
2012-01-01
This study investigated the effects of problem-solving method of teaching on secondary school students' achievement and retention in Social Studies. The study adopted the quasi-experimental, pre-test, post-test, control group design. The sample for the study consisted of 240 Junior Secondary School Class II students randomly selected from six…
Dressel, Anne; Schneider, Robert; DeNomie, Melissa; Kusch, Jennifer; Welch, Whitney; Sosa, Mirtha; Yeldell, Sally; Maida, Tatiana; Wineberg, Jessica; Holt, Keith; Bernstein, Rebecca
2017-09-01
Most low-income Americans fail to meet physical activity recommendations. Inactivity and poor diet contribute to obesity, a risk factor for multiple chronic diseases. Health promotion activities have the potential to improve health outcomes for low-income populations. Measuring the effectiveness of these activities, however, can be challenging in community settings. A "Biking for Health" study tested the impact of a bicycling intervention on overweight or obese low-income Latino and African American adults to reduce barriers to cycling and increase physical activity and fitness. A randomized controlled trial was conducted in Milwaukee, Wisconsin, in summer 2015. A 12-week bicycling intervention was implemented at two sites with low-income, overweight, or obese Latino and African American adults. We found that randomized controlled trial methodology was suboptimal for use in this small pilot study and that it negatively affected participation. More discussion is needed about the effectiveness of using traditional research methods in community settings to assess the effectiveness of health promotion interventions. Modifications or alternative methods may yield better results. The aim of this article is to discuss the effectiveness and feasibility of using traditional research methods to assess health promotion interventions in community-based settings.
The chaotic dynamical aperture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, S.Y.; Tepikian, S.
1985-10-01
Nonlinear magnetic forces become more important for particles in modern large accelerators. These nonlinear elements are introduced either intentionally to control beam dynamics or by uncontrollable random errors. Equations of motion in the nonlinear Hamiltonian are usually non-integrable. Because of the nonlinear part of the Hamiltonian, the tune diagram of accelerators is a jungle. Nonlinear magnet multipoles are important in keeping the accelerator operation point in the safe quarter of the hostile jungle of resonant tunes. Indeed, all modern accelerator designs have taken advantage of nonlinear mechanics. On the other hand, the effect of the uncontrollable random multipoles should be evaluated carefully. A powerful method of studying the effect of these nonlinear multipoles is a particle tracking calculation, where a group of test particles is traced through these magnetic multipoles in the accelerator for hundreds to millions of turns in order to test the dynamical aperture of the machine. These methods are extremely useful in the design of large accelerators such as the SSC, LEP, HERA and RHIC. These calculations unfortunately take a tremendous amount of computing time. In this paper, we apply existing methods in nonlinear dynamics to study a possible alternative solution. When the Hamiltonian motion becomes chaotic, the tune of the machine becomes undefined. The aperture related to the chaotic orbit can be identified as the chaotic dynamical aperture. We review the method of determining chaotic orbits and apply it to nonlinear problems in accelerator physics. We then discuss the scaling properties and the effect of random sextupoles.
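A minimal sketch of the kind of tracking calculation described above: a linear one-turn rotation followed by a thin sextupole kick (a Hénon-like map), with chaos flagged by the divergence of two nearby orbits; the tune, kick strength, aperture and threshold are illustrative assumptions.

```python
import numpy as np

def track(x, p, turns, nu=0.205, k2=1.0):
    """One-turn map: linear rotation by tune nu, then a thin sextupole kick."""
    c, s = np.cos(2 * np.pi * nu), np.sin(2 * np.pi * nu)
    traj = np.empty((turns, 2))
    for i in range(turns):
        x, p = c * x + s * p, -s * x + c * p      # linear rotation
        p += k2 * x**2                            # sextupole kick
        traj[i] = x, p
        if abs(x) > 10:                           # particle lost: outside aperture
            return traj[: i + 1], True
    return traj, False

def is_chaotic(x0, turns=2_000, eps=1e-8):
    """Crude chaos test: do two initially close orbits separate by orders of magnitude?"""
    t1, lost1 = track(x0, 0.0, turns)
    t2, lost2 = track(x0 + eps, 0.0, turns)
    if lost1 or lost2:
        return True
    sep = np.linalg.norm(t1 - t2, axis=1)
    return sep.max() > 1e3 * eps

for x0 in np.linspace(0.05, 0.6, 8):
    print(f"x0 = {x0:.2f}: {'chaotic/lost' if is_chaotic(x0) else 'regular'}")
```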
Effects of absorption on multiple scattering by random particulate media: exact results.
Mishchenko, Michael I; Liu, Li; Hovenier, Joop W
2007-10-01
We employ the numerically exact superposition T-matrix method to perform extensive computations of electromagnetic scattering by a volume of discrete random medium densely filled with increasingly absorbing as well as non-absorbing particles. Our numerical data demonstrate that increasing absorption diminishes and nearly extinguishes certain optical effects such as depolarization and coherent backscattering and increases the angular width of coherent backscattering patterns. This result corroborates the multiple-scattering origin of such effects and further demonstrates the heuristic value of the concept of multiple scattering even in application to densely packed particulate media.
Analyzing Empirical Evaluations of Non-Experimental Methods in Field Settings
ERIC Educational Resources Information Center
Steiner, Peter M.; Wong, Vivian
2016-01-01
Despite recent emphasis on the use of randomized control trials (RCTs) for evaluating education interventions, in most areas of education research, observational methods remain the dominant approach for assessing program effects. Over the last three decades, the within-study comparison (WSC) design has emerged as a method for evaluating the…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-27
... permissible methods of taking, other means of effecting the least practicable impact on the species or stock... non-destructive sampling methods to monitor rocky intertidal algal and invertebrate species abundances... and random quadrat are sampled, using methods described by Foster et al. (1991) and Dethier et al...
Analysis of the Interaction of Student Characteristics with Method in Micro-Teaching.
ERIC Educational Resources Information Center
Chavers, Katherine; And Others
A study examined the comparative effects on microteaching performance of (1) eight different methods of teacher training and (2) the interaction of method with student characteristics. Subjects, 71 enrollees in an educational psychology course, were randomly assigned to eight treatment groups (including one control group). Treatments consisted of…
A Weighting Method for Assessing Between-Site Heterogeneity in Causal Mediation Mechanism
ERIC Educational Resources Information Center
Qin, Xu; Hong, Guanglei
2017-01-01
When a multisite randomized trial reveals between-site variation in program impact, methods are needed for further investigating heterogeneous mediation mechanisms across the sites. We conceptualize and identify a joint distribution of site-specific direct and indirect effects under the potential outcomes framework. A method-of-moments procedure…
Chen, Carla Chia-Ming; Schwender, Holger; Keith, Jonathan; Nunkesser, Robin; Mengersen, Kerrie; Macrossan, Paula
2011-01-01
Due to advancements in computational ability, enhanced technology and a reduction in the price of genotyping, more data are being generated for understanding genetic associations with diseases and disorders. However, with the availability of large data sets comes the inherent challenge of developing new methods of statistical analysis and modeling. Considering that a complex phenotype may be the effect of a combination of multiple loci, various statistical methods have been developed for identifying genetic epistasis effects. Among these methods, logic regression (LR) is an intriguing approach incorporating tree-like structures. Various methods have built on the original LR to improve different aspects of the model. In this study, we review four variations of LR, namely Logic Feature Selection, Monte Carlo Logic Regression, Genetic Programming for Association Studies, and Modified Logic Regression-Gene Expression Programming, and investigate the performance of each method using simulated and real genotype data. We contrast these with another tree-like approach, namely Random Forests, and with a Bayesian logistic regression with stochastic search variable selection.
Kaye, T.N.; Pyke, David A.
2003-01-01
Population viability analysis is an important tool for conservation biologists, and matrix models that incorporate stochasticity are commonly used for this purpose. However, stochastic simulations may require assumptions about the distribution of matrix parameters, and modelers often select a statistical distribution that seems reasonable without sufficient data to test its fit. We used data from long-term (5–10 year) studies with 27 populations of five perennial plant species to compare seven methods of incorporating environmental stochasticity. We estimated stochastic population growth rate (a measure of viability) using a matrix-selection method, in which whole observed matrices were selected at random at each time step of the model. In addition, we drew matrix elements (transition probabilities) at random using various statistical distributions: beta, truncated-gamma, truncated-normal, triangular, uniform, or discontinuous/observed. Recruitment rates were held constant at their observed mean values. Two methods of constraining stage-specific survival to ≤100% were also compared. Different methods of incorporating stochasticity and constraining matrix column sums interacted in their effects and resulted in different estimates of stochastic growth rate (differing by up to 16%). Modelers should be aware that when constraining stage-specific survival to 100%, different methods may introduce different levels of bias in transition element means, and when this happens, different distributions for generating random transition elements may result in different viability estimates. There was no species effect on the results and the growth rates derived from all methods were highly correlated with one another. We conclude that the absolute value of population viability estimates is sensitive to model assumptions, but the relative ranking of populations (and management treatments) is robust. Furthermore, these results are applicable to a range of perennial plants and possibly other life histories.
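A minimal sketch of the matrix-selection method: at each time step a whole observed transition matrix is drawn at random, applied to the population vector, and the stochastic log growth rate is taken as the long-run mean of the one-step log growth increments; the two matrices below are invented stand-ins for observed annual matrices.

```python
import numpy as np

rng = np.random.default_rng(6)

# Invented stage-structured annual matrices (stages: seedling, juvenile, adult).
A_good = np.array([[0.10, 0.00, 4.00],
                   [0.30, 0.40, 0.00],
                   [0.00, 0.35, 0.90]])
A_bad = np.array([[0.05, 0.00, 1.50],
                  [0.15, 0.30, 0.00],
                  [0.00, 0.20, 0.80]])
matrices = [A_good, A_bad]

n = np.array([50.0, 20.0, 10.0])
n /= n.sum()                        # work with proportions; track growth separately
steps, burn_in = 20_000, 100
log_increments = []

for t in range(steps):
    A = matrices[rng.integers(len(matrices))]   # matrix selection: draw a whole matrix
    n = A @ n
    growth = n.sum()                            # one-step population growth factor
    if t >= burn_in:
        log_increments.append(np.log(growth))
    n /= growth                                 # rescale to avoid overflow/underflow

print(f"stochastic log growth rate ~= {np.mean(log_increments):.4f}")
```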
Cauley, Jane A.; LaCroix, Andrea Z.; Robbins, John A.; Larson, Joseph; Wallace, Robert; Wactawski-Wende, Jean; Chen, Zhao; Bauer, Douglas C.; Cummings, Steven R.; Jackson, Rebecca
2009-01-01
Purpose To test the hypothesis that the reduction in fractures with hormone therapy (HT) is greater in women with lower estradiol levels. Methods We conducted a nested case-control study within the Women’s Health Initiative HT Trials. The sample included 231 hip fracture case-control pairs and a random sample of 519 all-fracture case-control pairs. Cases and controls were matched for age, ethnicity, randomization date, fracture history and hysterectomy status. Hormones were measured prior to randomization. Incident cases of fracture were identified over an average follow-up of 6.53 years. Results There was no evidence that the effect of HT on fracture differed by baseline estradiol (E2) or sex hormone binding globulin (SHBG). Across all quartiles of E2 and SHBG, women randomized to HT had about a 50% lower risk of fracture, including hip fracture, compared to placebo. Conclusion The effect of HT on fracture reduction is independent of estradiol and SHBG levels. PMID:19436934
Effects of zinc supplementation on subscales of anorexia in children: A randomized controlled trial
Khademian, Majid; Farhangpajouh, Neda; Shahsanaee, Armindokht; Bahreynian, Maryam; Mirshamsi, Mehran; Kelishadi, Roya
2014-01-01
Objectives: This study aims to assess the effects of zinc supplementation on improving the appetite and its subscales in children. Methods: This study was conducted in 2013 in Isfahan, Iran. It had two phases. In the first step, after validation of the Child Eating Behaviour Questionnaire (CEBQ), it was completed for 300 preschool children, who were randomly selected. The second phase was conducted as a randomized controlled trial. Eighty of these children were randomly selected and randomly assigned to two groups of equal number receiving zinc (10 mg/day) or placebo for 12 weeks. Results: Overall, 77 children completed the trial (39 in the case group and 38 in the control group). The results showed that zinc supplementation can improve calorie intake in children by affecting some CEBQ subscales such as Emotional Overeating and Food Responsiveness. Conclusion: Zinc supplementation had a positive impact on calorie intake and on some subscales of anorexia. PMID:25674110
Phase unwrapping using region-based markov random field model.
Dong, Ying; Ji, Jim
2010-01-01
Phase unwrapping is a classical problem in Magnetic Resonance Imaging (MRI), Interferometric Synthetic Aperture Radar and Sonar (InSAR/InSAS), fringe pattern analysis, and spectroscopy. Although many methods have been proposed to address this problem, robust and effective phase unwrapping remains a challenge. This paper presents a novel phase unwrapping method using a region-based Markov Random Field (MRF) model. Specifically, the phase image is segmented into regions within which the phase is not wrapped. Then, the phase image is unwrapped between different regions using an improved Highest Confidence First (HCF) algorithm to optimize the MRF model. The proposed method has desirable theoretical properties as well as an efficient implementation. Simulations and experimental results on MRI images show that the proposed method provides phase unwrapping similar to or better than the Phase Unwrapping MAx-flow/min-cut (PUMA) and ZπM methods.
Clagett, Bartholt; Nathanson, Katherine L.; Ciosek, Stephanie L.; McDermoth, Monique; Vaughn, David J.; Mitra, Nandita; Weiss, Andrew; Martonik, Rachel; Kanetsky, Peter A.
2013-01-01
Random-digit dialing (RDD) using landline telephone numbers is the historical gold standard for control recruitment in population-based epidemiologic research. However, increasing cell-phone usage and diminishing response rates suggest that the effectiveness of RDD in recruiting a random sample of the general population, particularly for younger target populations, is decreasing. In this study, we compared landline RDD with alternative methods of control recruitment, including RDD using cell-phone numbers and address-based sampling (ABS), to recruit primarily white men aged 18–55 years into a study of testicular cancer susceptibility conducted in the Philadelphia, Pennsylvania, metropolitan area between 2009 and 2012. With few exceptions, eligible and enrolled controls recruited by means of RDD and ABS were similar with regard to characteristics for which data were collected on the screening survey. While we find ABS to be a comparably effective method of recruiting young males compared with landline RDD, we acknowledge the potential impact that selection bias may have had on our results because of poor overall response rates, which ranged from 11.4% for landline RDD to 1.7% for ABS. PMID:24008901
Generalizing Evidence From Randomized Clinical Trials to Target Populations
Cole, Stephen R.; Stuart, Elizabeth A.
2010-01-01
Properly planned and conducted randomized clinical trials remain susceptible to a lack of external validity. The authors illustrate a model-based method to standardize observed trial results to a specified target population using a seminal human immunodeficiency virus (HIV) treatment trial, and they provide Monte Carlo simulation evidence supporting the method. The example trial enrolled 1,156 HIV-infected adult men and women in the United States in 1996, randomly assigned 577 to a highly active antiretroviral therapy and 579 to a largely ineffective combination therapy, and followed participants for 52 weeks. The target population was US people infected with HIV in 2006, as estimated by the Centers for Disease Control and Prevention. Results from the trial apply, albeit muted by 12%, to the target population, under the assumption that the authors have measured and correctly modeled the determinants of selection that reflect heterogeneity in the treatment effect. In simulations with a heterogeneous treatment effect, a conventional intent-to-treat estimate was biased with poor confidence limit coverage, but the proposed estimate was largely unbiased with appropriate confidence limit coverage. The proposed method standardizes observed trial results to a specified target population and thereby provides information regarding the generalizability of trial results. PMID:20547574
Packet Randomized Experiments for Eliminating Classes of Confounders
Pavela, Greg; Wiener, Howard; Fontaine, Kevin R.; Fields, David A.; Voss, Jameson D.; Allison, David B.
2014-01-01
Background Although randomization is considered essential for causal inference, it is often not possible to randomize in nutrition and obesity research. To address this, we develop a framework for an experimental design—packet randomized experiments (PREs), which improves causal inferences when randomization on a single treatment variable is not possible. This situation arises when subjects are randomly assigned to a condition (such as a new roommate) which varies in one characteristic of interest (such as weight), but also varies across many others. There has been no general discussion of this experimental design, including its strengths, limitations, and statistical properties. As such, researchers are left to develop and apply PREs on an ad hoc basis, limiting its potential to improve causal inferences among nutrition and obesity researchers. Methods We introduce PREs as an intermediary design between randomized controlled trials and observational studies. We review previous research that used the PRE design and describe its application in obesity-related research, including random roommate assignments, heterochronic parabiosis, and the quasi-random assignment of subjects to geographic areas. We then provide a statistical framework to control for potential packet-level confounders not accounted for by randomization. Results PREs have successfully been used to improve causal estimates of the effect of roommates, altitude, and breastfeeding on weight outcomes. When certain assumptions are met, PREs can asymptotically control for packet-level characteristics. This has the potential to statistically estimate the effect of a single treatment even when randomization to a single treatment did not occur. Conclusions Applying PREs to obesity-related research will improve decisions about clinical, public health, and policy actions insofar as it offers researchers new insight into cause and effect relationships among variables. PMID:25444088
Luck, Tobias; Motzek, Tom; Luppa, Melanie; Matschinger, Herbert; Fleischer, Steffen; Sesselmann, Yves; Roling, Gudrun; Beutner, Katrin; König, Hans-Helmut; Behrens, Johann; Riedel-Heller, Steffi G
2013-01-01
Background Falls in older people are a major public health issue, but the underlying causes are complex. We sought to evaluate the effectiveness of preventive home visits as a multifactorial, individualized strategy to reduce falls in community-dwelling older people. Methods Data were derived from a prospective randomized controlled trial with follow-up examination after 18 months. Two hundred and thirty participants (≥80 years of age) with functional impairment were randomized to intervention and control groups. The intervention group received up to three preventive home visits including risk assessment, home counseling intervention, and a booster session. The control group received no preventive home visits. Structured interviews at baseline and follow-up provided information concerning falls in both study groups. Random-effects Poisson regression evaluated the effect of preventive home visits on the number of falls controlling for covariates. Results Random-effects Poisson regression showed a significant increase in the number of falls between baseline and follow-up in the control group (incidence rate ratio 1.96) and a significant decrease in the intervention group (incidence rate ratio 0.63) controlling for age, sex, family status, level of care, and impairment in activities of daily living. Conclusion Our results indicate that a preventive home visiting program can be effective in reducing falls in community-dwelling older people. PMID:23788832
Dolan, Lori A.; Donnelly, Melanie J.; Spratt, Kevin F.; Weinstein, Stuart L.
2015-01-01
Objective To determine if community equipoise exists concerning the effectiveness of bracing in adolescent idiopathic scoliosis. Background Data Bracing is the standard of care for adolescent idiopathic scoliosis despite the lack of strong research evidence concerning its effectiveness. Thus, some researchers support the idea of a randomized trial, whereas others think that randomization in the face of a standard of care would be unethical. Methods A random sample of Scoliosis Research Society and Pediatric Orthopaedic Society of North America members was asked to consider 12 clinical profiles and to give their opinion concerning the radiographic outcomes after observation and bracing. Results An expert panel was created from the respondents. They expressed a wide array of opinions concerning the percentage of patients within each scenario who would benefit from bracing. Agreement was noted concerning the risk due to bracing for post-menarchal patients only. Conclusions This study found a high degree of variability in opinion among clinicians concerning the effectiveness of bracing, suggesting that a randomized trial of bracing would be ethical. PMID:17414008
The Effect of General Statistical Fiber Misalignment on Predicted Damage Initiation in Composites
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Aboudi, Jacob; Arnold, Steven M.
2014-01-01
A micromechanical method is employed for the prediction of unidirectional composites in which the fiber orientation can possess various statistical misalignment distributions. The method relies on the probability-weighted averaging of the appropriate concentration tensor, which is established by the micromechanical procedure. This approach provides access to the local field quantities throughout the constituents, from which initiation of damage in the composite can be predicted. In contrast, a typical macromechanical procedure can determine the effective composite elastic properties in the presence of statistical fiber misalignment, but cannot provide the local fields. Fully random fiber distribution is presented as a special case using the proposed micromechanical method. Results are given that illustrate the effects of various amounts of fiber misalignment in terms of the standard deviations of in-plane and out-of-plane misalignment angles, where normal distributions have been employed. Damage initiation envelopes, local fields, effective moduli, and strengths are predicted for polymer and ceramic matrix composites with given normal distributions of misalignment angles, as well as fully random fiber orientation.
Dynamic Loads Generation for Multi-Point Vibration Excitation Problems
NASA Technical Reports Server (NTRS)
Shen, Lawrence
2011-01-01
A random-force method has been developed to predict dynamic loads produced by rocket-engine random vibrations for new rocket-engine designs. The method develops random forces at multiple excitation points based on random vibration environments scaled from accelerometer data obtained during hot-fire tests of existing rocket engines. This random-force method applies random forces to the model and creates expected dynamic response in a manner that simulates the way the operating engine applies self-generated random vibration forces (random pressure acting on an area) with the resulting responses that we measure with accelerometers. This innovation includes the methodology (implementation sequence), the computer code, two methods to generate the random-force vibration spectra, and two methods to reduce some of the inherent conservatism in the dynamic loads. This methodology would be implemented to generate the random-force spectra at excitation nodes without requiring the use of artificial boundary conditions in a finite element model. More accurate random dynamic loads than those predicted by current industry methods can then be generated using the random force spectra. The scaling method used to develop the initial power spectral density (PSD) environments for deriving the random forces for the rocket engine case is based on the Barrett Criteria developed at Marshall Space Flight Center in 1963. This invention approach can be applied in the aerospace, automotive, and other industries to obtain reliable dynamic loads and responses from a finite element model for any structure subject to multipoint random vibration excitations.
Tubal anastomosis after previous sterilization: a systematic review.
van Seeters, Jacoba A H; Chua, Su Jen; Mol, Ben W J; Koks, Carolien A M
2017-05-01
Female sterilization is one of the most common contraceptive methods. A small number of women, however, opt for reversal of sterilization procedures after they experience regret. Procedures can be performed by laparotomy or laparoscopy, with or without robotic assistance. Another commonly utilized alternative is IVF. The choice between surgery and IVF is often influenced by reimbursement politics for that particular geographic location. We evaluated the fertility outcomes of different surgical methods available for the reversal of female sterilization, compared these to IVF and assessed the prognostic factors for success. Two search strategies were employed. Firstly, we searched for randomized and non-randomized clinical studies presenting fertility outcomes of sterilization reversal up to July 2016. Data on the following outcomes were collected: pregnancy rate, ectopic pregnancy rate, cost of the procedure and operative time. Eligible study designs included prospective or retrospective studies, randomized controlled trials, cohort studies, case-control studies and case series. No age restriction was applied. Exclusion criteria were patients suffering from tubal infertility from any other reason (e.g. infection, endometriosis and adhesions from previous surgery) and studies including <10 participants. The following factors likely to influence the success of sterilization reversal procedures were then evaluated: female age, BMI and duration and method of sterilization. Secondly, we searched for randomized and non-randomized clinical studies that compared reversal of sterilization to IVF and evaluated them for pregnancy outcomes and cost effectiveness. We included 37 studies that investigated a total of 10 689 women. No randomized controlled trials were found. Most studies were retrospective cohort studies of a moderate quality. The pooled pregnancy rate after sterilization reversal was 42-69%, with heterogeneity seen from the different methods utilized. The reported ectopic pregnancy rate was 4-8%. The only prognostic factor affecting the chance of conception was female age. The surgical approach (i.e. laparotomy [microscopic], laparoscopy or robotic) had no impact on the outcome, with the exception of the macroscopic laparotomic technique, which had inferior results and is not currently utilized. For older women, IVF could be a more cost-effective alternative for the reversal of sterilization. However, direct comparative data are lacking and a cut-off age cannot be stated. In sterilized women who suffer regret, surgical tubal re-anastomosis is an effective treatment, especially in younger women. However, there is a need for randomized controlled trials comparing the success rates and costs of surgical reversal with IVF. © The Author 2017. Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. For Permissions, please email: journals.permissions@oup.com
Recruitment and accrual of women in a placebo-controlled clinical pilot study on manual therapy.
Cambron, Jerrilyn A; Hawk, Cheryl; Evans, Roni; Long, Cynthia R
2004-06-01
To investigate the accrual rates and recruitment processes among 3 Midwestern sites during a pilot study on manual therapy for chronic pelvic pain. Multisite pilot study for a randomized, placebo-controlled clinical trial. Three chiropractic institutions in or near major metropolitan cities in the Midwestern United States. Thirty-nine women aged 18 to 45 with chronic pelvic pain of at least 6 months duration, diagnosed by a board certified gynecologist. The method of recruitment was collected for each individual who responded to an advertisement and completed an interviewer-administered telephone screen. Participants who were willing and eligible after 3 baseline visits were entered into a randomized clinical trial. The number of responses and accrual rates were determined for the overall study, each of the 3 treatment sites, and each of the 5 recruitment efforts. In this study, 355 women were screened over the telephone and 39 were randomized, making the rate of randomization approximately 10%. The most effective recruitment methods leading to randomization were direct mail (38%) and radio advertisements (34%). However, success of the recruitment process differed by site. Based on the accrual of this multisite pilot study, a full-scale trial would not be feasible using this study's parameters. However, useful information was gained on recruitment effectiveness, eligibility criteria, and screening protocols among the 3 metropolitan sites.
Wijeysundera, Duminda N; Austin, Peter C; Hux, Janet E; Beattie, W Scott; Laupacis, Andreas
2009-01-01
Randomized trials generally use "frequentist" statistics based on P-values and 95% confidence intervals. Frequentist methods have limitations that might be overcome, in part, by Bayesian inference. To illustrate these advantages, we re-analyzed randomized trials published in four general medical journals during 2004. We used Medline to identify randomized superiority trials with two parallel arms, individual-level randomization and dichotomous or time-to-event primary outcomes. Studies with P<0.05 in favor of the intervention were deemed "positive"; otherwise, they were "negative." We used several prior distributions and exact conjugate analyses to calculate Bayesian posterior probabilities for clinically relevant effects. Of 88 included studies, 39 were positive using a frequentist analysis. Although the Bayesian posterior probabilities of any benefit (relative risk or hazard ratio<1) were high in positive studies, these probabilities were lower and variable for larger benefits. The positive studies had only moderate probabilities for exceeding the effects that were assumed for calculating the sample size. By comparison, there were moderate probabilities of any benefit in negative studies. Bayesian and frequentist analyses complement each other when interpreting the results of randomized trials. Future reports of randomized trials should include both.
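A minimal sketch of an exact conjugate analysis for a dichotomous outcome is shown below; it is not the authors' code, and the flat Beta(1, 1) prior and the example event counts are assumptions. It computes the posterior probability that the event risk is lower on treatment (i.e., any benefit), which is the kind of quantity such a Bayesian re-analysis reports.

```python
import numpy as np

rng = np.random.default_rng(1)

def prob_of_benefit(events_t, n_t, events_c, n_c, a0=1.0, b0=1.0, draws=100_000):
    """Conjugate Beta-Binomial sketch: posterior probability that the event risk
    in the treatment arm is below that in the control arm (relative risk < 1).
    a0, b0 are the Beta prior parameters (flat prior by default)."""
    p_t = rng.beta(a0 + events_t, b0 + n_t - events_t, draws)
    p_c = rng.beta(a0 + events_c, b0 + n_c - events_c, draws)
    return np.mean(p_t < p_c)

# hypothetical example: 30/200 events on treatment vs 45/200 on control
print(prob_of_benefit(30, 200, 45, 200))
```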
John, Bindu; Bellipady, Sumanth Shetty; Bhat, Shrinivasa Undaru
2016-01-01
Aims. The purpose of this pilot trial was to determine the efficacy of a sleep promotion program, to adapt it for use by adolescents studying in various schools of Mangalore, India, and to evaluate the feasibility issues before conducting a randomized controlled trial in a larger sample of adolescents. Methods. A randomized controlled trial design with a stratified random sampling method was used. Fifty-eight adolescents were selected (mean age: 14.02 ± 2.15 years; intervention group, n = 34; control group, n = 24). Self-report questionnaires, including a sociodemographic questionnaire with some additional questions on sleep and activities, the Sleep Hygiene Index, the Pittsburgh Sleep Quality Index, The Cleveland Adolescent Sleepiness Questionnaire, and the PedsQL™ Present Functioning Visual Analogue Scale, were used. Results. Insufficient weekday-weekend sleep duration with increasing age of adolescents was observed. The program revealed a significant effect in the experimental group over the control group in overall sleep quality, sleep onset latency, sleep duration, daytime sleepiness, and emotional and overall distress. No significant effect was observed in sleep hygiene and other sleep parameters. All target variables showed significant correlations with each other. Conclusion. The intervention holds promise for improving sleep behaviors in healthy adolescents. However, the effect of the sleep promotion program has yet to be proven through future research. This trial is registered with ISRCTN13083118. PMID:27088040
Propagation of elastic wave in nanoporous material with distributed cylindrical nanoholes
NASA Astrophysics Data System (ADS)
Qiang, FangWei; Wei, PeiJun; Liu, XiQiang
2013-08-01
The effective propagation constants of plane longitudinal and shear waves in nanoporous material with randomly distributed parallel cylindrical nanoholes are studied. The surface elastic theory is used to consider the surface stress effects and to derive the nontraditional boundary condition on the surface of nanoholes. The plane wave expansion method is used to obtain the scattering waves from a single nanohole. The multiple scattering effects are taken into consideration by summing the scattered waves from all scatterers and performing the configuration averaging of randomly distributed scatterers. The effective propagation constants of coherent waves along with the associated dynamic effective elastic modulus are numerically evaluated. The influences of surface stress are discussed based on the numerical results.
On coherent oscillations of a string.
NASA Technical Reports Server (NTRS)
Liu, C. H.
1972-01-01
Vibrations of an elastic string when the separation between the ends varies randomly are studied. The emphasis is on the evolution of the coherent, or ordered, oscillations of the string. Using a perturbation technique borrowed from quantum field theory and the modified Kryloff-Bogoliuboff method, the 'multiple scattering' effect of the random separation between the ends on the linear and nonlinear coherent oscillations is investigated. It is found that due to the random interactions the coherent fundamental oscillation as well as the harmonics are damped. Their frequencies are also modified.
ERIC Educational Resources Information Center
Dong, Nianbo; Lipsey, Mark
2014-01-01
When randomized controlled trials (RCTs) are not feasible, researchers seek other methods to make causal inference, e.g., propensity score methods. One of the underlying assumptions for the propensity score methods to obtain unbiased treatment effect estimates is the ignorability assumption, that is, conditional on the propensity score, treatment…
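As a hedged illustration of the general idea (not drawn from this study), the sketch below estimates propensity scores by logistic regression and forms an inverse-probability-weighted contrast; under the ignorability assumption mentioned above, this recovers the average treatment effect. The function and variable names are placeholders.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def iptw_ate(X, treated, y):
    """Propensity-score sketch: logistic-regression propensity scores and an
    inverse-probability-weighted estimate of the average treatment effect.
    X = covariates, treated = 0/1 treatment indicator, y = outcome."""
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    w = np.where(treated == 1, 1.0 / ps, 1.0 / (1.0 - ps))   # IPT weights
    mean_t = np.average(y[treated == 1], weights=w[treated == 1])
    mean_c = np.average(y[treated == 0], weights=w[treated == 0])
    return mean_t - mean_c
```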
A Novel Motion Compensation Method for Random Stepped Frequency Radar with M-sequence
NASA Astrophysics Data System (ADS)
Liao, Zhikun; Hu, Jiemin; Lu, Dawei; Zhang, Jun
2018-01-01
The random stepped frequency radar is a new kind of synthetic wideband radar. In our research, it has been found that it possesses a thumbtack-like ambiguity function, which is considered to be the ideal one. This also means that only precise motion compensation can result in a correct high-resolution range profile. In this paper, we first introduce the random stepped frequency radar coded by an M-sequence and briefly analyse the effect of relative motion between target and radar on distance imaging, which is called the defocusing problem. Then, a novel motion compensation method, named complementary code cancellation, is put forward to solve this problem. Finally, simulated experiments demonstrate its validity and a computational analysis shows its efficiency.
Jongerling, Joran; Laurenceau, Jean-Philippe; Hamaker, Ellen L
2015-01-01
In this article we consider a multilevel first-order autoregressive [AR(1)] model with random intercepts, random autoregression, and random innovation variance (i.e., the level 1 residual variance). Including random innovation variance is an important extension of the multilevel AR(1) model for two reasons. First, between-person differences in innovation variance are important from a substantive point of view, in that they capture differences in sensitivity and/or exposure to unmeasured internal and external factors that influence the process. Second, using simulation methods we show that modeling the innovation variance as fixed across individuals, when it should be modeled as a random effect, leads to biased parameter estimates. Additionally, we use simulation methods to compare maximum likelihood estimation to Bayesian estimation of the multilevel AR(1) model and investigate the trade-off between the number of individuals and the number of time points. We provide an empirical illustration by applying the extended multilevel AR(1) model to daily positive affect ratings from 89 married women over the course of 42 consecutive days.
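The data-generating process described above can be sketched as follows; the population means, autoregression, and variance components used here are illustrative assumptions, not estimates from the daily-diary data.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_multilevel_ar1(n_persons=89, n_times=42):
    """Sketch of a multilevel AR(1) process with a per-person random intercept,
    random autoregression, and random (log) innovation variance."""
    data = np.empty((n_persons, n_times))
    for i in range(n_persons):
        mu = rng.normal(5.0, 1.0)                            # random intercept
        phi = np.clip(rng.normal(0.3, 0.1), -0.95, 0.95)     # random AR(1) coefficient
        sigma = np.exp(rng.normal(0.0, 0.3))                 # random innovation SD
        y = mu
        for t in range(n_times):
            y = mu + phi * (y - mu) + rng.normal(0.0, sigma)
            data[i, t] = y
    return data
```

Fitting a model that forces sigma to be equal across persons to data generated this way reproduces the bias the authors describe.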
Tao, Da; Or, Calvin Kl
2013-04-01
We conducted a systematic review and meta-analysis of randomized controlled trials (RCTs) which had evaluated self-management health information technology (SMHIT) for glycaemic control in patients with diabetes. A total of 43 RCTs was identified, which reported on 52 control-intervention comparisons. The glycosylated haemoglobin (HbA1c) data were pooled using a random effects meta-analysis method, followed by a meta-regression and subgroup analyses to examine the effects of a set of moderators. The meta-analysis showed that use of SMHITs was associated with a significant reduction in HbA1c compared to usual care, with a pooled standardized mean difference of -0.30% (95% CI -0.39 to -0.21, P < 0.001). Sample size, age, study setting, type of application and method of data entry significantly moderated the effects of SMHIT use. The review supports the use of SMHITs as a self-management approach to improve glycaemic control. The effect of SMHIT use is significantly greater when the technology is a web-based application, when a mechanism for patients' health data entry is provided (manual or automatic) and when the technology is operated in the home or without location restrictions. Integrating these variables into the design of SMHITs may augment the effectiveness of the interventions. © SAGE Publications Ltd, 2013.
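For readers unfamiliar with random-effects pooling, a minimal sketch is given below; the review does not state which between-study variance estimator was used, so the DerSimonian-Laird choice here is an assumption, and yi/vi stand for the per-study effect sizes (e.g., standardized mean differences) and their within-study variances.

```python
import numpy as np

def dersimonian_laird(yi, vi):
    """Random-effects pooling sketch: pooled effect, 95% CI, and the
    between-study variance tau^2 estimated by the DerSimonian-Laird method."""
    yi, vi = np.asarray(yi, float), np.asarray(vi, float)
    w = 1.0 / vi                               # fixed-effect weights
    mu_fe = np.sum(w * yi) / np.sum(w)
    q = np.sum(w * (yi - mu_fe) ** 2)          # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(yi) - 1)) / c)   # between-study variance
    w_re = 1.0 / (vi + tau2)
    mu = np.sum(w_re * yi) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return mu, (mu - 1.96 * se, mu + 1.96 * se), tau2
```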
ERIC Educational Resources Information Center
McGinnis, Kathleen A.; Schulz, Richard; Stone, Roslyn A.; Klinger, Julie; Mercurio, Rocco
2006-01-01
Purpose: We assess the effects of racial or ethnic concordance between caregivers and interventionists on caregiver attrition, change in depression, and change in burden in a multisite randomized clinical trial. Design and Methods: Family caregivers of patients with Alzheimer's disease were randomized to intervention or control groups at six sites…
ERIC Educational Resources Information Center
Stern, Susan B.; Walsh, Margaret; Mercado, Micaela; Levene, Kathryn; Pepler, Debra J.; Carr, Ashley; Heppell, Allison; Lowe, Erin
2015-01-01
Objective: This study examines the effect of an ecological and contextually responsive approach, during the initial intake call, on engagement for multistressed families seeking child mental health services in an urban setting. Methods: Using a randomized design, parents were allocated to phone Intake As Usual (IAU) or Enhanced Engagement Phone Intake…
ERIC Educational Resources Information Center
Possel, Patrick; Baldus, Christiane; Horn, Andrea B.; Groen, Gunter; Hautzinger, Martin
2005-01-01
Background: Depressive disorders in adolescents are a widespread and increasing problem. Prevention seems a promising and feasible approach. Methods: We designed a cognitive-behavioral school-based universal primary prevention program and followed 347 eighth-grade students participating in a randomized controlled trial for three months. Results:…
ERIC Educational Resources Information Center
Pina, Armando A.; Zerr, Argero A.; Villalta, Ian K.; Gonzales, Nancy A.
2012-01-01
Objective: This trial of a randomized indicated anxiety prevention and early intervention explored initial program effects as well as the role of ethnicity and language on measured outcomes. Method: A total of 88 youth (M = 10.36 years; 45 girls, 52 Latino) received 1 of 2 protocols with varying degrees of parent involvement, and response was…
ERIC Educational Resources Information Center
Brandon, Paul R.; Harrison, George M.; Lawton, Brian E.
2013-01-01
When evaluators plan site-randomized experiments, they must conduct the appropriate statistical power analyses. These analyses are most likely to be valid when they are based on data from the jurisdictions in which the studies are to be conducted. In this method note, we provide software code, in the form of a SAS macro, for producing statistical…
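The SAS macro itself is not reproduced here; as a rough, assumption-laden analogue, the sketch below computes approximate power for a two-arm site-randomized design using the standard design-effect inflation and a normal approximation. Equal arm sizes, equal site sizes, and a standardized outcome (SD = 1) are assumed.

```python
from math import sqrt
from scipy.stats import norm

def site_randomized_power(n_sites, site_size, icc, effect_size, alpha=0.05):
    """Approximate power for a two-arm site-randomized trial with sites split
    equally between arms, using the design effect 1 + (m - 1) * ICC."""
    deff = 1.0 + (site_size - 1.0) * icc
    se = sqrt(4.0 * deff / (n_sites * site_size))   # SE of the mean difference
    z_crit = norm.ppf(1.0 - alpha / 2.0)
    return norm.cdf(effect_size / se - z_crit)

# hypothetical example: 20 sites of 50 students, ICC = 0.15, effect size 0.25 SD
print(round(site_randomized_power(20, 50, 0.15, 0.25), 2))
```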
NASA Astrophysics Data System (ADS)
Zhang, Y.; Li, F.; Zhang, S.; Hao, W.; Zhu, T.; Yuan, L.; Xiao, F.
2017-09-01
In this paper, a Statistical Distribution based Conditional Random Fields (STA-CRF) algorithm is exploited to improve marginal ice-water classification. Pixel-level ice concentration is presented for the comparison of CRF-based methods. Furthermore, in order to identify an effective statistical distribution model to integrate into STA-CRF, five statistical distribution models are investigated. The STA-CRF methods are tested on two scenes around Prydz Bay and the Adélie Depression, which contain a variety of ice types during the melt season. Experimental results indicate that the proposed method can resolve the sea ice edge well in the Marginal Ice Zone (MIZ) and shows a robust distinction between ice and water.
Impedance measurement using a two-microphone, random-excitation method
NASA Technical Reports Server (NTRS)
Seybert, A. F.; Parrott, T. L.
1978-01-01
The feasibility of using a two-microphone, random-excitation technique for the measurement of acoustic impedance was studied. Equations were developed, including the effect of mean flow, which show that acoustic impedance is related to the pressure ratio and phase difference between two points in a duct carrying plane waves only. The impedances of a honeycomb ceramic specimen and a Helmholtz resonator were measured and compared with impedances obtained using the conventional standing-wave method. Agreement between the two methods was generally good. A sensitivity analysis was performed to pinpoint possible error sources and recommendations were made for future study. The two-microphone approach evaluated in this study appears to have some advantages over other impedance measuring techniques.
Vincenzi, Simone; Mangel, Marc; Crivelli, Alain J; Munch, Stephan; Skaug, Hans J
2014-09-01
The differences in demographic and life-history processes between organisms living in the same population have important consequences for ecological and evolutionary dynamics. Modern statistical and computational methods allow the investigation of individual and shared (among homogeneous groups) determinants of the observed variation in growth. We use an Empirical Bayes approach to estimate individual and shared variation in somatic growth using a von Bertalanffy growth model with random effects. To illustrate the power and generality of the method, we consider two populations of marble trout Salmo marmoratus living in Slovenian streams, where individually tagged fish have been sampled for more than 15 years. We use year-of-birth cohort, population density during the first year of life, and individual random effects as potential predictors of the von Bertalanffy growth function's parameters k (rate of growth) and L∞ (asymptotic size). Our results showed that size ranks were largely maintained throughout marble trout lifetime in both populations. According to the Akaike Information Criterion (AIC), the best models showed different growth patterns for year-of-birth cohorts as well as the existence of substantial individual variation in growth trajectories after accounting for the cohort effect. For both populations, models including density during the first year of life showed that growth tended to decrease with increasing population density early in life. Model validation showed that predictions of individual growth trajectories using the random-effects model were more accurate than predictions based on mean size-at-age of fish.
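The growth model at the core of the analysis is the von Bertalanffy function L(t) = L∞(1 − exp(−k(t − t₀))). The sketch below simulates individual trajectories with lognormal random effects on k and L∞; the population values, units, and measurement-error level are illustrative assumptions, not the fitted estimates from the marble trout data.

```python
import numpy as np

rng = np.random.default_rng(3)

def von_bertalanffy(t, l_inf, k, t0=0.0):
    """von Bertalanffy growth function: length at age t."""
    return l_inf * (1.0 - np.exp(-k * (t - t0)))

def simulate_trout_growth(n_fish=50, ages=np.arange(1, 8)):
    """Sketch of individual growth trajectories with lognormal random effects
    on k (growth rate) and L-infinity (asymptotic size)."""
    lengths = np.empty((n_fish, len(ages)))
    for i in range(n_fish):
        l_inf = 300.0 * np.exp(rng.normal(0.0, 0.10))   # mm, individual asymptotic size
        k = 0.35 * np.exp(rng.normal(0.0, 0.15))        # individual growth rate
        lengths[i] = von_bertalanffy(ages, l_inf, k) + rng.normal(0.0, 5.0, len(ages))
    return ages, lengths
```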
Transcranial direct current stimulation (tDCS) for idiopathic Parkinson's disease.
Elsner, Bernhard; Kugler, Joachim; Pohl, Marcus; Mehrholz, Jan
2016-07-18
Idiopathic Parkinson's disease (IPD) is a neurodegenerative disorder, with the severity of the disability usually increasing with disease duration. IPD affects patients' health-related quality of life, disability, and impairment. Current rehabilitation approaches have limited effectiveness in improving outcomes in patients with IPD, but a possible adjunct to rehabilitation might be non-invasive brain stimulation by transcranial direct current stimulation (tDCS) to modulate cortical excitability, and hence to improve these outcomes in IPD. To assess the effectiveness of tDCS in improving motor and non-motor symptoms in people with IPD. We searched the following databases (until February 2016): the Cochrane Central Register of Controlled Trials (CENTRAL; the Cochrane Library; 2016, Issue 2), MEDLINE, EMBASE, CINAHL, AMED, Science Citation Index, the Physiotherapy Evidence Database (PEDro), Rehabdata, and Inspec. In an effort to identify further published, unpublished, and ongoing trials, we searched trial registers and reference lists, handsearched conference proceedings, and contacted authors and equipment manufacturers. We included only randomised controlled trials (RCTs) and randomised controlled cross-over trials that compared tDCS versus control in patients with IPD for improving health-related quality of life, disability, and impairment. Two review authors independently assessed trial quality (JM and MP) and extracted data (BE and JM). If necessary, we contacted study authors to ask for additional information. We collected information on dropouts and adverse events from the trial reports. We included six trials with a total of 137 participants. We found two studies with 45 participants examining the effects of tDCS compared to control (sham tDCS) on our primary outcome measure, impairment, as measured by the Unified Parkinson's Disease Rating Scale (UPDRS). There was very low quality evidence for no effect of tDCS on change in global UPDRS score (mean difference (MD) -7.10%, 95% confidence interval (CI) -19.18 to 4.97; P = 0.25, I² = 21%, random-effects model). However, there was evidence of an effect on UPDRS part III motor subsection score at the end of the intervention phase (MD -14.43%, 95% CI -24.68 to -4.18; P = 0.006, I² = 2%, random-effects model; very low quality evidence). One study with 25 participants measured the reduction in off and on time with dyskinesia, but there was no evidence of an effect (MD 0.10 hours, 95% CI -0.14 to 0.34; P = 0.41, I² = 0%, random-effects model; and MD 0.00 hours, 95% CI -0.12 to 0.12; P = 1, I² = 0%, random-effects model, respectively; very low quality evidence). Two trials with a total of 41 participants measured gait speed using measures of timed gait at the end of the intervention phase, revealing no evidence of an effect (standardised mean difference (SMD) 0.50, 95% CI -0.17 to 1.18; P = 0.14, I² = 11%, random-effects model; very low quality evidence). Another secondary outcome was health-related quality of life and we found one study with 25 participants reporting on the physical health and mental health aspects of health-related quality of life (MD 1.00 SF-12 score, 95% CI -5.20 to 7.20; I² = 0%, inverse variance method with random-effects model; very low quality evidence; and MD 1.60 SF-12 score, 95% CI -5.08 to 8.28; I² = 0%, inverse variance method with random-effects model; very low quality evidence, respectively). We found no study examining the effects of tDCS for improving activities of daily living.
In two of six studies, dropouts, adverse events, or deaths occurring during the intervention phase were reported. There was insufficient evidence that dropouts, adverse effects, or deaths were higher with intervention (risk difference (RD) 0.04, 95% CI -0.05 to 0.12; P = 0.40, I² = 0%, random-effects model; very low quality evidence). We found one trial with a total of 16 participants examining the effects of tDCS plus movement therapy compared to control (sham tDCS) plus movement therapy on our secondary outcome, gait speed at the end of the intervention phase, revealing no evidence of an effect (MD 0.05 m/s, 95% CI -0.15 to 0.25; inverse variance method with random-effects model; very low quality evidence). We found no evidence of an effect regarding differences in dropouts and adverse effects between intervention and control groups (RD 0.00, 95% CI -0.21 to 0.21; Mantel-Haenszel method with random-effects model; very low quality evidence). There is insufficient evidence to determine the effects of tDCS for reducing off time (when the symptoms are not controlled by the medication) and on time with dyskinesia (time when symptoms are controlled but the person still experiences involuntary muscle movements), and for improving health-related quality of life, disability, and impairment in patients with IPD. Evidence of very low quality indicates no difference in dropouts and adverse events between tDCS and control groups.
Sensing Urban Land-Use Patterns by Integrating Google Tensorflow and Scene-Classification Models
NASA Astrophysics Data System (ADS)
Yao, Y.; Liang, H.; Li, X.; Zhang, J.; He, J.
2017-09-01
With the rapid progress of China's urbanization, research on the automatic detection of land-use patterns in Chinese cities is of substantial importance. Deep learning is an effective method to extract image features. To take advantage of the deep-learning method in detecting urban land-use patterns, we applied a transfer-learning-based remote-sensing image approach to extract and classify features. Using the Google Tensorflow framework, a powerful convolution neural network (CNN) library was created. First, the transferred model was previously trained on ImageNet, one of the largest object-image data sets, to fully develop the model's ability to generate feature vectors of standard remote-sensing land-cover data sets (UC Merced and WHU-SIRI). Then, a random-forest-based classifier was constructed and trained on these generated vectors to classify the actual urban land-use pattern on the scale of traffic analysis zones (TAZs). To avoid the multi-scale effect of remote-sensing imagery, a large random patch (LRP) method was used. The proposed method could efficiently obtain acceptable accuracy (OA = 0.794, Kappa = 0.737) for the study area. In addition, the results show that the proposed method can effectively overcome the multi-scale effect that occurs in urban land-use classification at the irregular land-parcel level. The proposed method can help planners monitor dynamic urban land use and evaluate the impact of urban-planning schemes.
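The second stage of the pipeline described above (a random forest trained on transferred CNN features) can be sketched as below; the feature-extraction step is assumed to have been done already, and the split proportion and number of trees are assumptions rather than the paper's settings.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, cohen_kappa_score

# Assume `features` holds CNN feature vectors already extracted for each image
# patch (e.g. from an ImageNet-pretrained model) and `labels` the land-use classes.
def train_landuse_classifier(features, labels, n_trees=500, seed=0):
    """Sketch of a random-forest classifier trained on transferred CNN features;
    returns the fitted model plus overall accuracy and kappa on a held-out split."""
    X_tr, X_te, y_tr, y_te = train_test_split(
        features, labels, test_size=0.3, random_state=seed, stratify=labels)
    rf = RandomForestClassifier(n_estimators=n_trees, random_state=seed)
    rf.fit(X_tr, y_tr)
    pred = rf.predict(X_te)
    return rf, accuracy_score(y_te, pred), cohen_kappa_score(y_te, pred)
```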
Pandis, Nikolaos; Polychronopoulou, Argy; Eliades, Theodore
2011-12-01
Randomization is a key step in reducing selection bias during the treatment allocation phase in randomized clinical trials. The process of randomization follows specific steps, which include generation of the randomization list, allocation concealment, and implementation of randomization. The phenomenon in the dental and orthodontic literature of characterizing treatment allocation as random is frequent; however, often the randomization procedures followed are not appropriate. Randomization methods assign, at random, treatment to the trial arms without foreknowledge of allocation by either the participants or the investigators thus reducing selection bias. Randomization entails generation of random allocation, allocation concealment, and the actual methodology of implementing treatment allocation randomly and unpredictably. Most popular randomization methods include some form of restricted and/or stratified randomization. This article introduces the reasons, which make randomization an integral part of solid clinical trial methodology, and presents the main randomization schemes applicable to clinical trials in orthodontics.
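As an illustration of restricted randomization, the sketch below generates a permuted-block allocation list for one stratum; the block size, arm labels, and list length are arbitrary examples, not recommendations from the article. In practice the list would be prepared by someone independent of recruitment and the upcoming allocations concealed from the clinicians enrolling patients.

```python
import random

def permuted_block_list(n_per_stratum, block_size=4, arms=("A", "B"), seed=42):
    """Permuted-block randomization sketch for one stratum: each block contains a
    balanced, randomly ordered mix of the trial arms, keeping group sizes close."""
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n_per_stratum:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)
        allocation.extend(block)
    return allocation[:n_per_stratum]

# e.g. a 24-patient list for one stratum (say, growing vs. non-growing patients)
print(permuted_block_list(24))
```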
Simental-Mendía, Luis E; Simental-Mendía, Mario; Sahebkar, Amirhossein; Rodríguez-Morán, Martha; Guerrero-Romero, Fernando
2017-05-01
We performed a meta-analysis of randomized controlled trials (RCTs) in order to evaluate the effect of oral magnesium supplementation on lipid profile of both diabetic and non-diabetic individuals. PubMed-Medline, SCOPUS, Web of Science, and Google Scholar databases were searched (from inception to February 23, 2016) to identify RCTs evaluating the effect of magnesium on lipid concentrations. A random-effects model and generic inverse variance method were used for quantitative data synthesis. Sensitivity analysis was conducted using the leave-one-out method. A weighted random-effects meta-regression was performed to evaluate the impact of potential confounders on lipid concentrations. Magnesium treatment was not found to significantly affect plasma concentrations of any of the lipid indices including total cholesterol (WMD 0.03 mmol/L, 95% CI -0.11, 0.16, p = 0.671), LDL-C (WMD -0.01 mmol/L, 95% CI -0.13, 0.11, p = 0.903), HDL-C (WMD 0.03 mmol/L, 95% CI -0.003, 0.06, p = 0.076), and triglycerides concentrations (WMD -0.10 mmol/L, 95% CI -0.25, 0.04, p = 0.149). In a subgroup analysis comparing studies with and without diabetes, no difference was observed between subgroups in terms of changes in plasma total cholesterol (p = 0.924), LDL-C (p = 0.161), HDL-C (p = 0.822), and triglyceride (p = 0.162) concentrations. Results of the present meta-analysis indicated that magnesium supplementation showed no significant effects on the lipid profile of either diabetic or non-diabetic individuals.
Yoga for veterans with chronic low back pain: Design and methods of a randomized clinical trial.
Groessl, Erik J; Schmalzl, Laura; Maiya, Meghan; Liu, Lin; Goodman, Debora; Chang, Douglas G; Wetherell, Julie L; Bormann, Jill E; Atkinson, J Hamp; Baxi, Sunita
2016-05-01
Chronic low back pain (CLBP) afflicts millions of people worldwide, with particularly high prevalence in military veterans. Many treatment options exist for CLBP, but most have limited effectiveness and some have significant side effects. In general populations with CLBP, yoga has been shown to improve health outcomes with few side effects. However, yoga has not been adequately studied in military veteran populations. In the current paper we will describe the design and methods of a randomized clinical trial aimed at examining whether yoga can effectively reduce disability and pain in US military veterans with CLBP. A total of 144 US military veterans with CLBP will be randomized to either yoga or a delayed treatment comparison group. The yoga intervention will consist of 2× weekly yoga classes for 12 weeks, complemented by regular home practice guided by a manual. The delayed treatment group will receive the same intervention after six months. The primary outcome is the change in back pain-related disability measured with the Roland-Morris Disability Questionnaire at baseline and 12 weeks. Secondary outcomes include pain intensity, pain interference, depression, anxiety, fatigue/energy, quality of life, self-efficacy, sleep quality, and medication usage. Additional process and/or mediational factors will be measured to examine dose response and effect mechanisms. Assessments will be conducted at baseline, 6 weeks, 12 weeks, and 6 months. All randomized participants will be included in intention-to-treat analyses. Study results will provide much needed evidence on the feasibility and effectiveness of yoga as a therapeutic modality for the treatment of CLBP in US military veterans. Published by Elsevier Inc.
Under What Assumptions Do Site-by-Treatment Instruments Identify Average Causal Effects?
ERIC Educational Resources Information Center
Reardon, Sean F.; Raudenbush, Stephen W.
2013-01-01
The increasing availability of data from multi-site randomized trials provides a potential opportunity to use instrumental variables methods to study the effects of multiple hypothesized mediators of the effect of a treatment. We derive nine assumptions needed to identify the effects of multiple mediators when using site-by-treatment interactions…
Effect of H-wave polarization on laser radar detection of partially convex targets in random media.
El-Ocla, Hosam
2010-07-01
The performance of the laser radar cross section (LRCS) of large conducting targets is investigated numerically in free space and in random media. The LRCS is calculated using a boundary value method with beam wave incidence and H-wave polarization. Considered are those elements that contribute to the LRCS problem, including random medium strength, target configuration, and beam width. The effect of the creeping waves, stimulated by H-polarization, on the LRCS behavior is manifested. Targets with sizes of up to five wavelengths are sufficiently larger than the beam width and are large enough to represent fairly complex targets. Scatterers are assumed to have analytical partially convex contours with inflection points.
Yen, A M-F; Liou, H-H; Lin, H-L; Chen, T H-H
2006-01-01
The study aimed to develop a predictive model to deal with data fraught with heterogeneity that cannot be explained by sampling variation or measured covariates. The random-effect Poisson regression model was first proposed to deal with over-dispersion for data fraught with heterogeneity after making allowance for measured covariates. A Bayesian acyclic graphical model in conjunction with the Markov Chain Monte Carlo (MCMC) technique was then applied to estimate the parameters of both the relevant covariates and the random effect. A predictive distribution was then generated to compare the predicted with the observed values for the Bayesian model with and without the random effect. Data from repeated measurement of episodes among 44 patients with intractable epilepsy were used as an illustration. The application of Poisson regression to the epilepsy data without taking heterogeneity into account yielded a large value of heterogeneity (heterogeneity factor = 17.90, deviance = 1485, degrees of freedom (df) = 83). After taking the random effect into account, the heterogeneity factor was greatly reduced (heterogeneity factor = 0.52, deviance = 42.5, df = 81). The Pearson chi-square statistics for the comparison between the expected and observed seizure frequencies at two and three months, for the models with and without the random effect, were 34.27 (p = 1.00) and 1799.90 (p < 0.0001), respectively. The Bayesian acyclic model using the MCMC method was demonstrated to have great potential for disease prediction when data show over-dispersion attributed either to a correlated property or to subject-to-subject variability.
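The over-dispersion mechanism can be illustrated with a small simulation (not the authors' model fit): each patient receives a lognormal random effect that multiplies the expected seizure rate, so the marginal variance of the counts greatly exceeds the mean. The baseline rate and random-effect spread below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_seizure_counts(n_patients=44, n_months=3, base_rate=6.0, re_sd=0.8):
    """Random-effect Poisson sketch: a lognormal patient-level random effect
    multiplies the expected monthly seizure rate, producing over-dispersion."""
    u = rng.lognormal(mean=0.0, sigma=re_sd, size=n_patients)   # patient random effects
    rates = base_rate * u
    counts = rng.poisson(np.repeat(rates, n_months)).reshape(n_patients, n_months)
    return counts

counts = simulate_seizure_counts()
# marginal variance far exceeds the mean, the signature of over-dispersion
print(counts.mean(), counts.var())
```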
Randomized subspace-based robust principal component analysis for hyperspectral anomaly detection
NASA Astrophysics Data System (ADS)
Sun, Weiwei; Yang, Gang; Li, Jialin; Zhang, Dianfa
2018-01-01
A randomized subspace-based robust principal component analysis (RSRPCA) method for anomaly detection in hyperspectral imagery (HSI) is proposed. The RSRPCA combines the advantages of a randomized column subspace and robust principal component analysis (RPCA). It assumes that the background has low-rank properties, and the anomalies are sparse and do not lie in the column subspace of the background. First, RSRPCA implements random sampling to sketch the original HSI dataset from columns and to construct a randomized column subspace of the background. Structured random projections are also adopted to sketch the HSI dataset from rows. Sketching from columns and rows could greatly reduce the computational requirements of RSRPCA. Second, the RSRPCA adopts columnwise RPCA (CWRPCA) to eliminate the negative effects of sampled anomaly pixels, purifying the previously constructed randomized column subspace by removing sampled anomaly columns. The CWRPCA decomposes the submatrix of the HSI data into a low-rank matrix (i.e., background component), a noisy matrix (i.e., noise component), and a sparse anomaly matrix (i.e., anomaly component) with only a small proportion of nonzero columns. The inexact augmented Lagrange multiplier algorithm is utilized to optimize the CWRPCA problem and estimate the sparse matrix. Nonzero columns of the sparse anomaly matrix point to sampled anomaly columns in the submatrix. Third, all the pixels are projected onto the complementary subspace of the purified randomized column subspace of the background, and the anomaly pixels in the original HSI data are finally located exactly. Several experiments on three real hyperspectral images are carefully designed to investigate the detection performance of RSRPCA, and the results are compared with four state-of-the-art methods. Experimental results show that the proposed RSRPCA outperforms the four comparison methods both in detection performance and in computational time.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Emanuel, A.E.
1991-03-01
This article presents a preliminary analysis of the effect of randomly varying harmonic voltages on the temperature rise of squirrel-cage motors. The stochastic process of random variations of harmonic voltages is defined by means of simple statistics (mean, standard deviation, type of distribution). Computational models based on a first-order approximation of the motor losses and on the Monte Carlo method yield results which show that equipment with a large thermal time constant is capable of withstanding, for a short period of time, distortions larger than THD = 5%.
Effect of aperiodicity on the broadband reflection of silicon nanorod structures for photovoltaics.
Lin, Chenxi; Huang, Ningfeng; Povinelli, Michelle L
2012-01-02
We carry out a systematic numerical study of the effects of aperiodicity on silicon nanorod anti-reflection structures. We use the scattering matrix method to calculate the average reflection loss over the solar spectrum for periodic and aperiodic arrangements of nanorods. We find that aperiodicity can either improve or deteriorate the anti-reflection performance, depending on the nanorod diameter. We use a guided random-walk algorithm to design optimal aperiodic structures that exhibit lower reflection loss than both optimal periodic and random aperiodic structures.
ERIC Educational Resources Information Center
Huang, SuHua
2012-01-01
The mixed-method explanatory research design was employed to investigate the effectiveness of the Accelerated Reader (AR) program on middle school students' reading achievement and motivation. A total of 211 sixth to eighth-grade students provided quantitative data by completing an AR Survey. Thirty of the 211 students were randomly selected to…
ERIC Educational Resources Information Center
Sola, Agboola Omowunmi; Ojo, Oloyede Ezekiel
2007-01-01
This study assessed and compared the relative effectiveness of three methods for teaching and conducting experiments in separation of mixtures in chemistry. A pre-test, post-test experimental design with a control group was used. Two hundred and thirty three randomly selected Senior Secondary School I (SSS I) chemistry students were drawn from…
Mavridis, Dimitris; White, Ian R; Higgins, Julian P T; Cipriani, Andrea; Salanti, Georgia
2015-02-28
Missing outcome data are commonly encountered in randomized controlled trials and hence may need to be addressed in a meta-analysis of multiple trials. A common and simple approach to deal with missing data is to restrict analysis to individuals for whom the outcome was obtained (complete case analysis). However, estimated treatment effects from complete case analyses are potentially biased if informative missing data are ignored. We develop methods for estimating meta-analytic summary treatment effects for continuous outcomes in the presence of missing data for some of the individuals within the trials. We build on a method previously developed for binary outcomes, which quantifies the degree of departure from a missing at random assumption via the informative missingness odds ratio. Our new model quantifies the degree of departure from missing at random using either an informative missingness difference of means or an informative missingness ratio of means, both of which relate the mean value of the missing outcome data to that of the observed data. We propose estimating the treatment effects, adjusted for informative missingness, and their standard errors by a Taylor series approximation and by a Monte Carlo method. We apply the methodology to examples of both pairwise and network meta-analysis with multi-arm trials. © 2014 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
Gemmell, Isla; Dunn, Graham
2011-03-01
In a partially randomized preference trial (PRPT) patients with no treatment preference are allocated to groups at random, but those who express a preference receive the treatment of their choice. It has been suggested that the design can improve the external and internal validity of trials. We used computer simulation to illustrate the impact that an unmeasured confounder could have on the results and conclusions drawn from a PRPT. We generated 4000 observations ("patients") that reflected the distribution of the Beck Depression Inventory (BDI) in trials of depression. Half were randomly assigned to a randomized controlled trial (RCT) design and half were assigned to a PRPT design. In the RCT, "patients" were evenly split between treatment and control groups, whereas in the preference arm, to reflect patient choice, 87.5% of patients were allocated to the experimental treatment and 12.5% to the control. Unadjusted analyses of the PRPT data consistently overestimated the treatment effect and its standard error. This led to Type I errors when the true treatment effect was small and Type II errors when the confounder effect was large. The PRPT design is not recommended as a method of establishing an unbiased estimate of treatment effect due to the potential influence of unmeasured confounders. Copyright © 2011 John Wiley & Sons, Ltd.
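A stripped-down version of such a simulation is sketched below; the effect sizes, the logistic preference model, and the outcome scale are illustrative assumptions rather than the authors' settings. Because the unmeasured confounder u drives both the preference for treatment and the outcome, the unadjusted treatment-control contrast in the preference arm drifts away from the true effect.

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate_prpt(n=2000, true_effect=-2.0, confounder_effect=-4.0):
    """Preference-arm sketch: an unmeasured confounder affects both the choice of
    treatment (roughly 87.5% choose the experimental arm) and the outcome, so the
    naive treatment-control difference is biased away from true_effect."""
    u = rng.normal(size=n)                                   # unmeasured confounder
    prefers_treatment = rng.random(n) < 1 / (1 + np.exp(-(2.0 + 1.5 * u)))
    treated = prefers_treatment.astype(int)
    baseline = rng.normal(25.0, 8.0, n)                      # BDI-like baseline score
    outcome = baseline + true_effect * treated + confounder_effect * u + rng.normal(0, 5, n)
    naive = outcome[treated == 1].mean() - outcome[treated == 0].mean()
    return naive   # differs from true_effect because u is ignored

print(simulate_prpt())
```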
On the apparent insignificance of the randomness of flexible joints on large space truss dynamics
NASA Technical Reports Server (NTRS)
Koch, R. M.; Klosner, J. M.
1993-01-01
Deployable periodic large space structures have been shown to exhibit high dynamic sensitivity to period-breaking imperfections and uncertainties. These can be brought on by manufacturing or assembly errors, structural imperfections, as well as nonlinear and/or nonconservative joint behavior. In addition, the necessity of precise pointing and position capability can require the consideration of these usually negligible and unknown parametric uncertainties and their effect on the overall dynamic response of large space structures. This work describes the use of a new design approach for the global dynamic solution of beam-like periodic space structures possessing parametric uncertainties. Specifically, the effect of random flexible joints on the free vibrations of simply-supported periodic large space trusses is considered. The formulation is a hybrid approach in terms of an extended Timoshenko beam continuum model, Monte Carlo simulation scheme, and first-order perturbation methods. The mean and mean-square response statistics for a variety of free random vibration problems are derived for various input random joint stiffness probability distributions. The results of this effort show that, although joint flexibility has a substantial effect on the modal dynamic response of periodic large space trusses, the effect of any reasonable uncertainty or randomness associated with these joint flexibilities is insignificant.
A meta-analysis of third wave mindfulness-based cognitive behavioral therapies for older people.
Kishita, Naoko; Takei, Yuko; Stewart, Ian
2017-12-01
The aim of this study is to review the effectiveness of third wave mindfulness-based cognitive behavioral therapies (CBTs) for depressive or anxiety symptomatology in older adults across a wide range of physical and psychological conditions. Electronic literature databases were searched for articles, and random-effects meta-analysis was conducted. Ten studies met the inclusion criteria, of which nine reported the efficacy of interventions on depressive symptoms and seven on anxiety symptoms. Effect-size estimates suggested that mindfulness-based CBT is moderately effective on depressive symptoms in older adults (g = 0.55). The results demonstrated a similar level of overall effect size for anxiety symptoms (g = 0.58). However, there was a large heterogeneity, and publication bias was evident in studies reporting outcomes on anxiety symptoms, and thus, this observed efficacy for late-life anxiety may not be robust. The quality of the included studies varied. Only one study used an active psychological control condition. There were a limited number of studies that used an intent-to-treat (last observation carried forward method) analysis and reported appropriate methods for clinical trials (e.g., treatment-integrity reporting). Third wave mindfulness-based CBT may be robust in particular for depressive symptoms in older adults. We recommend that future studies (i) conduct randomized controlled trials with intent-to-treat to compare mindfulness-based CBT with other types of psychotherapy in older people and (ii) improve study quality by using appropriate methods for checking treatment adherence, randomization, and blinding of assessors. Copyright © 2016 John Wiley & Sons, Ltd.
Traverzim, Maria Aparecida Dos Santos; Makabe, Sergio; Silva, Daniela Fátima Teixeira; Pavani, Christiane; Bussadori, Sandra Kalil; Fernandes, Kristianne Santos Porta; Motta, Lara Jansiski
2018-06-01
Labor pain is one of the most intense pains experienced by women, which leads to an increase in the number of women opting to undergo a cesarean delivery. Pharmacological and nonpharmacological analgesia methods are used to control labor pain. Epidural analgesia is the most commonly used pharmacological analgesia method. However, it may have side effects on the fetus and the mother. Light-emitting diode (LED) photobiomodulation is an effective and noninvasive alternative to pharmacological methods. To evaluate the effects of LED photobiomodulation on analgesia during labor. In total, 60 women in labor admitted to a public maternity hospital will be selected for a randomized controlled trial. The participants will be randomized into 2 groups: intervention group [analgesia with LED therapy (n = 30)] and control group [analgesia with bath therapy (n = 30)]. The perception of pain will be assessed using the visual analogue scale (VAS), with a score from 0 to 10 at baseline, that is, before the intervention. In both the groups, the procedures will last 10 minutes and will be performed at 3 time points during labor: during cervical dilation of 4 to 5 cm, 6 to 7 cm, and 8 to 9 cm. At all 3 time points, pain perception will be evaluated using VAS shortly after the intervention. In addition, the evaluation of membrane characteristics (intact or damaged), heart rate, uterine dynamics, and cardiotocography will be performed at all time points. The use of LED photobiomodulation will have an analgesic effect superior to that of the bath therapy.
Heidari, Saeide; Babaii, Atye; Abbasinia, Mohammad; Shamali, Mahdi; Abbasi, Mohammad; Rezaei, Mahboobe
2015-01-01
Background: The instability of cardiovascular indices and anxiety disorders are common among patients undergoing coronary artery bypass graft (CABG) and could interfere with their recovery. Therefore, improving the cardiovascular indices and anxiety is essential. Objectives: This study aimed to investigate the effect of music therapy on anxiety and cardiovascular indices in patients undergoing CABG. Patients and Methods: In this randomized controlled trial, 60 patients hospitalized in the cardiovascular surgical intensive care unit of Shahid Beheshti Hospital in Qom city, Iran, in 2013 were selected using a consecutive sampling method and randomly allocated into the experimental and control groups. In the experimental group, patients received 30 minutes of light music, whereas in the control group, patients had 30 minutes of rest in bed. The cardiovascular indices and anxiety were measured immediately before, immediately after and half an hour after the study. Data were analyzed using the chi-square test and repeated measures analysis of variance. Results: Compared to the values immediately before the intervention, the mean anxiety scores immediately after and 30 minutes after the intervention were significantly lower in the experimental group (P < 0.037), while they did not significantly change in the control group. However, there were no significant differences regarding the cardiovascular indices in the three consecutive measurements (P > 0.05). Conclusions: Music therapy is effective in decreasing anxiety among patients undergoing CABG. However, the intervention was not effective on cardiovascular indices. Music can effectively be used as a non-pharmacological method to manage anxiety after CABG. PMID:26835471
Burgess, Stephen; Scott, Robert A; Timpson, Nicholas J; Davey Smith, George; Thompson, Simon G
2015-07-01
Finding individual-level data for adequately-powered Mendelian randomization analyses may be problematic. As publicly-available summarized data on genetic associations with disease outcomes from large consortia are becoming more abundant, use of published data is an attractive analysis strategy for obtaining precise estimates of the causal effects of risk factors on outcomes. We detail the necessary steps for conducting Mendelian randomization investigations using published data, and present novel statistical methods for combining data on the associations of multiple (correlated or uncorrelated) genetic variants with the risk factor and outcome into a single causal effect estimate. A two-sample analysis strategy may be employed, in which evidence on the gene-risk factor and gene-outcome associations are taken from different data sources. These approaches allow the efficient identification of risk factors that are suitable targets for clinical intervention from published data, although the ability to assess the assumptions necessary for causal inference is diminished. Methods and guidance are illustrated using the example of the causal effect of serum calcium levels on fasting glucose concentrations. The estimated causal effect of a 1 standard deviation (0.13 mmol/L) increase in calcium levels on fasting glucose (mM) using a single lead variant from the CASR gene region is 0.044 (95% credible interval -0.002, 0.100). In contrast, using our method to account for the correlation between variants, the corresponding estimate using 17 genetic variants is 0.022 (95% credible interval 0.009, 0.035), a more clearly positive causal effect.
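Combining multiple correlated variants into a single estimate can be expressed as a generalized weighted (inverse-variance) regression of the gene-outcome associations on the gene-risk-factor associations. The sketch below is a frequentist analogue of that calculation, not the authors' code; the argument names are placeholders for summarized association estimates, their standard errors, and the variant correlation (LD) matrix.

```python
import numpy as np

def ivw_correlated(beta_x, beta_y, se_y, rho):
    """Inverse-variance-weighted causal estimate from summarized data, accounting
    for correlation between variants via generalized weighted least squares.
    beta_x / beta_y: per-variant associations with risk factor / outcome,
    se_y: standard errors of beta_y, rho: variant correlation matrix."""
    bx = np.asarray(beta_x, float)
    by = np.asarray(beta_y, float)
    se = np.asarray(se_y, float)
    omega = np.outer(se, se) * np.asarray(rho, float)   # outcome covariance matrix
    omega_inv = np.linalg.inv(omega)
    estimate = (bx @ omega_inv @ by) / (bx @ omega_inv @ bx)
    std_error = np.sqrt(1.0 / (bx @ omega_inv @ bx))
    return estimate, std_error
```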
Brown, Justin C.; Troxel, Andrea B.; Ky, Bonnie; Damjanov, Nevena; Zemel, Babette S.; Rickels, Michael R.; Rhim, Andrew D.; Rustgi, Anil K.; Courneya, Kerry S.; Schmitz, Kathryn H.
2016-01-01
Background Observational studies indicate that higher volumes of physical activity are associated with improved disease outcomes among colon cancer survivors. The aim of this report is to describe the purpose, study design, methods, and recruitment results of the COURAGE trial, a National Cancer Institute (NCI) sponsored, phase II, randomized, dose-response exercise trial among colon cancer survivors. Methods/Results The primary objective of the COURAGE trial is to quantify the feasibility, safety, and physiologic effects of low-dose (150 min·wk−1) and high-dose (300 min·wk−1) moderate-intensity aerobic exercise compared to usual-care control group over six months. The exercise groups are provided with in-home treadmills and heart rate monitors. Between January and July 2015, 1,433 letters were mailed using a population-based state cancer registry; 126 colon cancer survivors inquired about participation, and 39 were randomized onto the study protocol. Age was associated with inquiry about study participation (P<0.001) and randomization onto the study protocol (P<0.001). No other demographic, clinical, or geographic characteristics were associated with study inquiry or randomization. The final trial participant was randomized in August 2015. Six month endpoint data collection was completed in February 2016. Discussion The recruitment of colon cancer survivors into an exercise trial is feasible. The findings from this trial will inform key design aspects for future phase 2 and phase 3 randomized controlled trials to examine the efficacy of exercise to improve clinical outcomes among colon cancer survivors. PMID:26970181
NASA Astrophysics Data System (ADS)
Elmore, K. L.
2016-12-01
The Meteorological Phenomena Identification Near the Ground (mPING) project is an example of a crowd-sourced, citizen science effort to gather data of sufficient quality and quantity needed by new post-processing methods that use machine learning. Transportation and infrastructure are particularly sensitive to precipitation type in winter weather. We extract attributes from operational numerical forecast models and use them in a random forest to generate forecast winter precipitation types. We find that random forests applied to forecast soundings are effective at generating skillful forecasts of surface precipitation type (ptype) with considerably more skill than the current algorithms, especially for ice pellets and freezing rain. We also find that three very different forecast models yield similar overall results, showing that random forests are able to extract essentially equivalent information from different forecast models. We also show that the random forest for each model and each profile type is unique to the particular forecast model, and that the random forests developed using a particular model suffer significant degradation when given attributes derived from a different model. This implies that no single algorithm can perform well across all forecast models. Clearly, random forests extract information unavailable to "physically based" methods because the physical information in the models does not appear as we expect. One interesting result is that results from the classic "warm nose" sounding profile are, by far, the most sensitive to the particular forecast model, but this profile is also the one for which random forests are most skillful. Finally, a method for calibrating probabilities for each different ptype using multinomial logistic regression is shown.
Su, Tin Tin; Majid, Hazreen Abdul; Nahar, Azmi Mohamed; Azizan, Nurul Ain; Hairi, Farizah Mohd; Thangiah, Nithiah; Dahlui, Maznah; Bulgiba, Awang; Murray, Liam J
2017-11-06
After publication of the article [1], it has been brought to our attention that the methodology outlined in the original article could not be fully carried out. The article planned a two-armed randomized controlled trial. However, due to a lower response than expected and one housing complex dropping out of the study, the method was changed to pre- and post-intervention with no control group. All other methods were conducted as outlined in the original article.
Enhancing sparsity of Hermite polynomial expansions by iterative rotations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Xiu; Lei, Huan; Baker, Nathan A.
2016-02-01
Compressive sensing has become a powerful addition to uncertainty quantification in recent years. This paper identifies new bases for random variables through linear mappings such that the representation of the quantity of interest is more sparse with new basis functions associated with the new random variables. This sparsity increases both the efficiency and accuracy of the compressive sensing-based uncertainty quantification method. Specifically, we consider rotation-based linear mappings which are determined iteratively for Hermite polynomial expansions. We demonstrate the effectiveness of the new method with applications in solving stochastic partial differential equations and high-dimensional (O(100)) problems.
Comparison of Three Tobacco Survey Methods with College Students: A Case Study
ERIC Educational Resources Information Center
James, Delores C. S.; Chen, W. William; Sheu, Jiunn-Jye
2005-01-01
The goals of this case study were to: (1) determine the efficiency and effectiveness of three survey methods--postal mail survey, web-based survey, and random in-class administration survey--in assessing tobacco-related attitudes and behaviors among college students and (2) compare the response rate and procedures of these three methods. There was…
NASA Technical Reports Server (NTRS)
Tomberlin, T. J.
1985-01-01
Research studies of residents' responses to noise consist of interviews with samples of individuals who are drawn from a number of different compact study areas. The statistical techniques developed provide a basis for those sample design decisions. These techniques are suitable for a wide range of sample survey applications. A sample may consist of a random sample of residents selected from a sample of compact study areas, or in a more complex design, of a sample of residents selected from a sample of larger areas (e.g., cities). The techniques may be applied to estimates of the effects on annoyance of noise level, numbers of noise events, the time-of-day of the events, ambient noise levels, or other factors. Methods are provided for determining, in advance, how accurately these effects can be estimated for different sample sizes and study designs. Using a simple cost function, they also provide for optimum allocation of the sample across the stages of the design for estimating these effects. These techniques are developed via a regression model in which the regression coefficients are assumed to be random, with components of variance associated with the various stages of a multi-stage sample design.
Probalistic Finite Elements (PFEM) structural dynamics and fracture mechanics
NASA Technical Reports Server (NTRS)
Liu, Wing-Kam; Belytschko, Ted; Mani, A.; Besterfield, G.
1989-01-01
The purpose of this work is to develop computationally efficient methodologies for assessing the effects of randomness in loads, material properties, and other aspects of a problem by a finite element analysis. The resulting group of methods is called probabilistic finite elements (PFEM). The overall objective of this work is to develop methodologies whereby the lifetime of a component can be predicted, accounting for the variability in the material and geometry of the component, the loads, and other aspects of the environment; and the range of response expected in a particular scenario can be presented to the analyst in addition to the response itself. Emphasis has been placed on methods which are not statistical in character; that is, they do not involve Monte Carlo simulations. The reason for this choice of direction is that Monte Carlo simulations of complex nonlinear response require a tremendous amount of computation. The focus of efforts so far has been on nonlinear structural dynamics. However, in the continuation of this project, emphasis will be shifted to probabilistic fracture mechanics so that the effect of randomness in crack geometry and material properties can be studied interactively with the effect of random load and environment.
Principal Score Methods: Assumptions, Extensions, and Practical Considerations
ERIC Educational Resources Information Center
Feller, Avi; Mealli, Fabrizia; Miratrix, Luke
2017-01-01
Researchers addressing posttreatment complications in randomized trials often turn to principal stratification to define relevant assumptions and quantities of interest. One approach for the subsequent estimation of causal effects in this framework is to use methods based on the "principal score," the conditional probability of belonging…
2014-01-01
In current practice, determining the safety factor of a slope with a two-dimensional circular potential failure surface typically combines a Genetic Algorithm (GA) search for the critical slip surface with Fellenius' slices method to calculate the safety factor. However, GA needs to be validated with more numerical tests, and Fellenius' slices method, like the finite element method, is only approximate. This paper proposes a new method to determine the minimum slope safety factor: the safety factor is computed with an analytical solution, and the critical slip surface is searched with the Genetic-Traversal Random Method. The analytical solution is more accurate than Fellenius' slices method. The Genetic-Traversal Random Method uses random picks to implement mutation. A computer program that automatically performs the search was developed for the Genetic-Traversal Random Method. Comparison with other methods, such as the slope/w software, indicates that the Genetic-Traversal Random Search Method can give a very low safety factor, about half of that from the other methods. However, the minimum safety factor obtained with the Genetic-Traversal Random Search Method is very close to the lower-bound solutions of the slope safety factor given by the Ansys software. PMID:24782679
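To make the search idea concrete, here is a hypothetical sketch of a random-perturbation ("mutation by random pick") search over circular slip surfaces. The `safety_factor` function is a placeholder only; the paper's analytical solution is not reproduced here.

```python
# Illustrative sketch only: random search over circular slip surfaces
# (centre x, centre y, radius) for the minimum safety factor.
import random

def safety_factor(xc, yc, r):
    # Hypothetical placeholder objective; replace with the analytical solution.
    return 1.5 + 0.01 * ((xc - 10) ** 2 + (yc - 20) ** 2) + 0.05 * abs(r - 15)

best_fs, best_surface = float("inf"), None
candidate = (10.0, 25.0, 12.0)                     # initial slip-surface guess
for _ in range(10000):
    # "Random pick" used as mutation: perturb the current candidate at random.
    trial = tuple(v + random.uniform(-1.0, 1.0) for v in candidate)
    fs = safety_factor(*trial)
    if fs < best_fs:
        best_fs, best_surface, candidate = fs, trial, trial
print("minimum safety factor ~", round(best_fs, 3), "surface:", best_surface)
```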
Beidas, Rinad S; Maclean, Johanna Catherine; Fishman, Jessica; Dorsey, Shannon; Schoenwald, Sonja K; Mandell, David S; Shea, Judy A; McLeod, Bryce D; French, Michael T; Hogue, Aaron; Adams, Danielle R; Lieberman, Adina; Becker-Haimes, Emily M; Marcus, Steven C
2016-09-15
This randomized trial will compare three methods of assessing fidelity to cognitive-behavioral therapy (CBT) for youth to identify the most accurate and cost-effective method. The three methods include self-report (i.e., therapist completes a self-report measure on the CBT interventions used in session while circumventing some of the typical barriers to self-report), chart-stimulated recall (i.e., therapist reports on the CBT interventions used in session via an interview with a trained rater, and with the chart to assist him/her) and behavioral rehearsal (i.e., therapist demonstrates the CBT interventions used in session via a role-play with a trained rater). Direct observation will be used as the gold-standard comparison for each of the three methods. This trial will recruit 135 therapists in approximately 12 community agencies in the City of Philadelphia. Therapists will be randomized to one of the three conditions. Each therapist will provide data from three unique sessions, for a total of 405 sessions. All sessions will be audio-recorded and coded using the Therapy Process Observational Coding System for Child Psychotherapy-Revised Strategies scale. This will enable comparison of each measurement approach to direct observation of therapist session behavior to determine which most accurately assesses fidelity. Cost data associated with each method will be gathered. To gather stakeholder perspectives of each measurement method, we will use purposive sampling to recruit 12 therapists from each condition (total of 36 therapists) and 12 supervisors to participate in semi-structured qualitative interviews. Results will provide needed information on how to accurately and cost-effectively measure therapist fidelity to CBT for youth, as well as important information about stakeholder perspectives with regard to each measurement method. Findings will inform fidelity measurement practices in future implementation studies as well as in clinical practice. NCT02820623 , June 3rd, 2016.
FitzGerald, Mary P; Anderson, Rodney U; Potts, Jeannette; Payne, Christopher K; Peters, Kenneth M; Clemens, J Quentin; Kotarinos, Rhonda; Fraser, Laura; Cosby, Annamarie; Fortman, Carole; Neville, Cynthia; Badillo, Suzanne; Odabachian, Lisa; Sanfield, Anna; O’Dougherty, Betsy; Halle-Podell, Rick; Cen, Liyi; Chuai, Shannon; Landis, J Richard; Kusek, John W; Nyberg, Leroy M
2010-01-01
Objectives To determine the feasibility of conducting a randomized clinical trial designed to compare two methods of manual therapy (myofascial physical therapy (MPT) and global therapeutic massage (GTM)) among patients with urologic chronic pelvic pain syndromes. Materials and Methods Our goal was to recruit 48 subjects with chronic prostatitis/chronic pelvic pain syndrome or interstitial cystitis/painful bladder syndrome at six clinical centers. Eligible patients were randomized to either MPT or GTM and were scheduled to receive up to 10 weekly treatments, each 1 hour in duration. Criteria to assess feasibility included adherence of therapists to the prescribed therapeutic protocol as determined by records of treatment, adverse events which occurred during study treatment, and rate of response to therapy as assessed by the Patient Global Response Assessment (GRA). Primary outcome analysis compared response rates between treatment arms using Mantel-Haenszel methods. Results Twenty-three (49%) men and 24 (51%) women were randomized over a six month period. Twenty-four (51%) patients were randomized to GTM, 23 (49%) to MPT; 44 (94%) patients completed the study. Therapist adherence to the treatment protocols was excellent. The GRA response rate of 57% in the MPT group was significantly higher than the rate of 21% in the GTM treatment group (p=0.03). Conclusions The goals to judge feasibility of conducting a full-scale trial of physical therapy methods were met. The preliminary findings of a beneficial effect of MPT warrant further study. PMID:19535099
A dynamic spatio-temporal model for spatial data
Hefley, Trevor J.; Hooten, Mevin B.; Hanks, Ephraim M.; Russell, Robin; Walsh, Daniel P.
2017-01-01
Analyzing spatial data often requires modeling dependencies created by a dynamic spatio-temporal data generating process. In many applications, a generalized linear mixed model (GLMM) is used with a random effect to account for spatial dependence and to provide optimal spatial predictions. Location-specific covariates are often included as fixed effects in a GLMM and may be collinear with the spatial random effect, which can negatively affect inference. We propose a dynamic approach to account for spatial dependence that incorporates scientific knowledge of the spatio-temporal data generating process. Our approach relies on a dynamic spatio-temporal model that explicitly incorporates location-specific covariates. We illustrate our approach with a spatially varying ecological diffusion model implemented using a computationally efficient homogenization technique. We apply our model to understand individual-level and location-specific risk factors associated with chronic wasting disease in white-tailed deer from Wisconsin, USA and estimate the location the disease was first introduced. We compare our approach to several existing methods that are commonly used in spatial statistics. Our spatio-temporal approach resulted in a higher predictive accuracy when compared to methods based on optimal spatial prediction, obviated confounding among the spatially indexed covariates and the spatial random effect, and provided additional information that will be important for containing disease outbreaks.
NASA Astrophysics Data System (ADS)
Abdulbaqi, Hayder Saad; Jafri, Mohd Zubir Mat; Omar, Ahmad Fairuz; Mustafa, Iskandar Shahrim Bin; Abood, Loay Kadom
2015-04-01
Brain tumors are an abnormal growth of tissues in the brain. They may arise in people of any age. They must be detected early, diagnosed accurately, monitored carefully, and treated effectively in order to optimize patient outcomes regarding both survival and quality of life. Manual segmentation of brain tumors from CT scan images is a challenging and time-consuming task. Accurate detection of brain tumor size and location plays a vital role in the successful diagnosis and treatment of tumors, and brain tumor detection is considered a challenging task in medical image processing. The aim of this paper is to introduce a scheme for tumor detection in CT scan images using two different techniques: Hidden Markov Random Fields (HMRF) and Fuzzy C-Means (FCM). The proposed method developed in this research constructs a hybrid of HMRF and thresholding. These methods have been applied on 4 different patient data sets. The comparison among these methods shows that the proposed method gives good results for brain tissue detection, and is more robust and effective compared with the FCM technique.
Classification of Hyperspectral Data Based on Guided Filtering and Random Forest
NASA Astrophysics Data System (ADS)
Ma, H.; Feng, W.; Cao, X.; Wang, L.
2017-09-01
Hyperspectral images usually consist of more than one hundred spectral bands, which have the potential to provide rich spatial and spectral information. However, the application of hyperspectral data remains challenging due to "the curse of dimensionality". In this context, many techniques that aim to make full use of both the spatial and spectral information have been investigated. To preserve geometrical information while using fewer spectral bands, we propose a novel method that combines principal component analysis (PCA), guided image filtering, and the random forest classifier (RF). In detail, PCA is first employed to reduce the dimension of the spectral bands. Secondly, the guided image filtering technique is introduced to smooth land objects while preserving their edges. Finally, the features are fed into the RF classifier. To illustrate the effectiveness of the method, we carry out experiments on the popular Indian Pines data set, which was collected by the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) sensor. By comparing the proposed method with methods that use only PCA or only the guided image filter, we find that the proposed method performs better.
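A hedged sketch of the three-stage pipeline follows, with synthetic data in place of the Indian Pines scene: PCA for spectral reduction, a simple box-window guided filter applied to each retained component, and a random forest on the filtered pixel features. The guided filter is implemented directly from its standard closed form rather than taken from a library, and all shapes and parameters are illustrative.

```python
# Hedged sketch of the pipeline: PCA -> guided filtering -> random forest.
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier

def guided_filter(I, p, radius=4, eps=1e-3):
    """Edge-preserving guided filter; guide image I and input band p are 2-D."""
    size = 2 * radius + 1
    mean = lambda a: uniform_filter(a, size)
    mI, mp = mean(I), mean(p)
    a = (mean(I * p) - mI * mp) / (mean(I * I) - mI * mI + eps)
    b = mp - a * mI
    return mean(a) * I + mean(b)

rng = np.random.default_rng(2)
cube = rng.random((50, 50, 100))                   # hypothetical 100-band image
H, W, B = cube.shape
pcs = PCA(n_components=5).fit_transform(cube.reshape(-1, B)).reshape(H, W, 5)
guide = pcs[..., 0]                                # first component as the guide
filtered = np.stack([guided_filter(guide, pcs[..., k]) for k in range(5)], axis=-1)

labels = rng.integers(0, 3, size=(H, W))           # hypothetical training labels
rf = RandomForestClassifier(n_estimators=100, random_state=0)
rf.fit(filtered.reshape(-1, 5), labels.ravel())
pred = rf.predict(filtered.reshape(-1, 5)).reshape(H, W)
```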
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, Luc, E-mail: luc.thomas@headway.com; Jan, Guenole; Le, Son
The thermal stability of perpendicular Spin-Transfer-Torque Magnetic Random Access Memory (STT-MRAM) devices is investigated at chip level. Experimental data are analyzed in the framework of the Néel-Brown model including distributions of the thermal stability factor Δ. We show that in the low error rate regime important for applications, the effect of distributions of Δ can be described by a single quantity, the effective thermal stability factor Δ_eff, which encompasses both the median and the standard deviation of the distributions. Data retention of memory chips can be assessed accurately by measuring Δ_eff as a function of device diameter and temperature. We apply this method to show that 54 nm devices based on our perpendicular STT-MRAM design meet our 10 year data retention target up to 120 °C.
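For readers unfamiliar with the retention arithmetic, the following hedged sketch uses the standard Néel-Brown relaxation expression for a single bit's flip probability over time. The attempt time tau0 = 1 ns is an assumed value, and the calculation is not taken from the paper, which further folds the distribution of Δ into Δ_eff.

```python
# Hedged illustration of the Neel-Brown retention estimate: the probability
# that a single bit flips within time t is P = 1 - exp(-(t/tau0)*exp(-Delta)).
# tau0 is an assumed attempt time; Delta_eff would summarise a distribution.
import numpy as np

tau0 = 1e-9                                  # assumed attempt time (s)
ten_years = 10 * 365.25 * 24 * 3600          # retention target (s)

def bit_error_rate(delta, t=ten_years):
    return 1.0 - np.exp(-(t / tau0) * np.exp(-delta))

for delta in (40, 60, 80):
    print(delta, bit_error_rate(delta))
```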
Tanprasertkul, Chamnan; Ekarattanawong, Sophapun; Sreshthaputra, Opas; Vutyavanich, Teraporn
2014-08-01
To evaluate the impact on ovarian reserve of two different methods of hemostasis after laparoscopic ovarian endometrioma excision. A randomized controlled study was conducted from January to December 2013 in Thammasat University Hospital, Thailand. Reproductive-age women (18-45 years) who underwent laparoscopic ovarian cystectomy were randomized to electrocoagulation and suture groups. Clinical baseline data and the ovarian reserve outcome (anti-Mullerian hormone (AMH)) were evaluated. Fifty participants were recruited and randomized into two groups; the electrocoagulation and suture groups each consisted of 25 participants. Baseline characteristics between the 2 groups (age, weight, BMI, height, cyst diameter, duration, and estimated blood loss) were not statistically different. There was no significant difference in AMH between the electrocoagulation and suture groups at the pre-operative (2.90±2.26 vs. 2.52±2.37 ng/ml), 1 week (1.78±1.51 vs. 1.99±1.71 ng/ml), 1 month (1.76±1.50 vs. 2.09±1.62 ng/ml), 3 months (2.09±1.66 vs. 1.96±1.68 ng/ml) and 6 months (2.11±1.84 vs 1.72±1.68 ng/ml) assessments, respectively. However, mean AMH in both groups decreased significantly from the first week after the operation, and this decline was sustained until 6 months. Laparoscopic cystectomy of ovarian endometrioma has a negative impact on ovarian reserve, and the electrocoagulation and suture methods did not differ in this effect.
Sahebkar, Amirhossein; Simental-Mendía, Luis E; Ferretti, Gianna; Bacchetti, Tiziana; Golledge, Jonathan
2015-12-01
Vitamin E is one of the most important natural antioxidants, and its plasma levels are inversely associated with the progression of atherosclerosis. There have been reports suggesting a potential negative effect of statin therapy on plasma vitamin E levels. The aim of this meta-analysis was to determine the impact of statin therapy on plasma vitamin E concentrations. PubMed-Medline, SCOPUS, Web of Science and Google Scholar databases were searched to identify randomized placebo-controlled trials evaluating the impact of statins on plasma vitamin E concentrations from inception to February 27, 2015. A systematic assessment of bias in the included studies was performed using the Cochrane criteria. A random-effects model (using DerSimonian-Laird method) and the generic inverse variance method were used to examine the effect of statins on plasma vitamin E concentrations. Heterogeneity was quantitatively assessed using the I(2) index. Sensitivity analysis was conducted using the leave-one-out method. A meta-analysis of data from 8 randomized treatment arms including 504 participants indicated a significant reduction in plasma vitamin E concentrations following statin treatment (WMD: -16.30%, 95% CI: -16.93, -15.98, p < 0.001). However, cholesterol-adjusted vitamin E concentrations (defined as vitamin E:total cholesterol ratio) were found to be improved by statin therapy (WMD: 29.35%, 95% CI: 24.98, 33.72, p < 0.001). Statin therapy was not associated with any significant alteration in LDL vitamin E content (SMD: 0.003, 95% CI: -0.90, 0.90, p = 0.995). Findings of the present study suggest that statin therapy has no negative impact on plasma vitamin E concentrations or LDL vitamin E content. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
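As a minimal sketch of the pooling step named in this abstract (DerSimonian-Laird random-effects with the generic inverse-variance method), the following code combines a few invented study effects; the numbers are not those of the meta-analysis.

```python
# DerSimonian-Laird random-effects pooling, generic inverse-variance method.
# Study effects and standard errors below are made-up illustrative numbers.
import numpy as np

y = np.array([-15.0, -18.2, -16.5, -14.1])   # per-study effects (e.g. % change)
se = np.array([2.0, 2.5, 1.8, 2.2])          # their standard errors

w = 1.0 / se**2                              # fixed-effect (inverse-variance) weights
mu_fe = np.sum(w * y) / w.sum()
Q = np.sum(w * (y - mu_fe) ** 2)             # Cochran's Q
k = len(y)
tau2 = max(0.0, (Q - (k - 1)) / (w.sum() - np.sum(w**2) / w.sum()))  # DL estimate

w_re = 1.0 / (se**2 + tau2)                  # random-effects weights
pooled = np.sum(w_re * y) / w_re.sum()
se_pooled = np.sqrt(1.0 / w_re.sum())
lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
print(f"pooled effect {pooled:.2f} (95% CI {lo:.2f} to {hi:.2f}), tau^2 = {tau2:.3f}")
```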
Multilevel Modeling with Correlated Effects
ERIC Educational Resources Information Center
Kim, Jee-Seon; Frees, Edward W.
2007-01-01
When there exist omitted effects, measurement error, and/or simultaneity in multilevel models, explanatory variables may be correlated with random components, and standard estimation methods do not provide consistent estimates of model parameters. This paper introduces estimators that are consistent under such conditions. By employing generalized…
NASA Astrophysics Data System (ADS)
Hong, Liang
2013-10-01
The availability of high spatial resolution remote sensing data provides new opportunities for urban land-cover classification. More geometric detail can be observed in high resolution remote sensing images, ground objects display rich texture, structure, shape, and hierarchical semantic characteristics, and more landscape elements are represented by small groups of pixels. In recent years, object-based remote sensing analysis has become widely accepted and applied in high resolution remote sensing image processing. A classification method based on geo-ontology and conditional random fields is presented in this paper. The proposed method consists of four parts: (1) a hierarchical ground-object semantic framework is constructed based on geo-ontology; (2) image objects are generated by mean-shift segmentation, which yields boundary-preserving and spectrally homogeneous over-segmented regions; (3) the relations between the hierarchical ground-object semantics and the over-segmented regions are defined within a conditional random field framework; (4) hierarchical classification results are obtained based on geo-ontology and conditional random fields. Finally, high-resolution remotely sensed image data (GeoEye) are used to test the performance of the presented method. The experimental results show the superiority of this method to the eCognition method in both effectiveness and accuracy, which implies that it is suitable for the classification of high resolution remote sensing images.
The effects of Sahaja Yoga meditation on mental health: a systematic review.
Hendriks, Tom
2018-05-30
Objectives To determine the efficacy of Sahaja Yoga (SY) meditation on mental health among clinical and healthy populations. Methods All publications on SY were eligible. Databases were searched up to November 2017, namely PubMed, MEDLINE (NLM), PsychINFO, and Scopus. An internet search (Google Scholar) was also conducted. The quality of the randomized controlled trials was assessed using the Cochrane risk of bias assessment. The quality of the cross-sectional studies, a non-randomized controlled trial, and a cohort study was assessed with the Newcastle-Ottawa Quality Assessment Scale. Results We included a total of eleven studies: four randomized controlled trials, one non-randomized controlled trial, five cross-sectional studies, and one prospective cohort study. The studies included a total of 910 participants. Significant findings were reported in relation to the following outcomes: anxiety, depression, stress, subjective well-being, and psychological well-being. Two randomized studies were rated as high-quality studies and two as low-quality studies. The quality of the non-randomized trial, the cross-sectional studies, and the cohort study was high. Effect sizes could not be calculated in five studies due to unclear or incomplete reporting. Conclusions After reviewing the articles and taking the quality of the studies into account, it appears that SY may reduce depression and possibly anxiety. In addition, the practice of SY is also associated with increased subjective well-being and psychological well-being. However, due to the limited number of publications, definite conclusions on the effects of SY cannot be made, and more high-quality randomized studies are needed to justify any firm conclusions on the beneficial effects of SY on mental health.
Mesoscopic description of random walks on combs
NASA Astrophysics Data System (ADS)
Méndez, Vicenç; Iomin, Alexander; Campos, Daniel; Horsthemke, Werner
2015-12-01
Combs are a simple caricature of various types of natural branched structures, which belong to the category of loopless graphs and consist of a backbone and branches. We study continuous time random walks on combs and present a generic method to obtain their transport properties. The random walk along the branches may be biased, and we account for the effect of the branches by renormalizing the waiting time probability distribution function for the motion along the backbone. We analyze the overall diffusion properties along the backbone and find normal diffusion, anomalous diffusion, and stochastic localization (diffusion failure), respectively, depending on the characteristics of the continuous time random walk along the branches, and compare our analytical results with stochastic simulations.
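A toy simulation can make the comb geometry concrete. The sketch below is a discrete-time caricature in which backbone steps are only possible at the base of a tooth; it simply reports the backbone mean-squared displacement. The paper's analytic treatment via renormalised waiting-time distributions is not reproduced.

```python
# Toy Monte Carlo of a random walk on a comb: motion along the backbone (x)
# is only possible when the walker sits at a tooth base (y = 0); otherwise
# it diffuses along the tooth. Subdiffusive backbone transport is expected.
import numpy as np

rng = np.random.default_rng(3)
walkers, steps = 2000, 5000
x = np.zeros(walkers, dtype=int)
y = np.zeros(walkers, dtype=int)

for _ in range(steps):
    on_backbone = (y == 0)
    r = rng.random(walkers)
    # On the backbone: +/-1 along x with probability 1/4 each, else enter a tooth.
    step_x = np.where(on_backbone & (r < 0.25), 1, 0) \
           - np.where(on_backbone & (r >= 0.25) & (r < 0.5), 1, 0)
    # Off the backbone: unbiased +/-1 along the tooth.
    step_y = np.where(on_backbone, (r >= 0.5).astype(int),
                      np.where(r >= 0.5, 1, -1))
    x += step_x
    y += step_y

print("backbone MSD after", steps, "steps:", np.mean(x.astype(float) ** 2))
```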
Extraction of linear features on SAR imagery
NASA Astrophysics Data System (ADS)
Liu, Junyi; Li, Deren; Mei, Xin
2006-10-01
Linear features are usually extracted from SAR imagery by a few edge detectors derived from the contrast-ratio edge detector with a constant probability of false alarm. On the other hand, the Hough Transform (HT) is an elegant way of extracting global features like curve segments from binary edge images. The Randomized Hough Transform can reduce the computation time and memory usage of the HT drastically, but its random sampling leaves a great number of accumulator cells invalid. In this paper, we propose a new approach to extract linear features from SAR imagery, an almost automatic algorithm based on edge detection and the Randomized Hough Transform. The improved method makes full use of the directional information of each edge candidate point to solve the problem of invalid accumulation. The applied results are in good agreement with the theoretical study, and the main linear features in SAR imagery are extracted automatically. The method saves storage space and computational time, which shows its effectiveness and applicability.
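The sketch below illustrates the basic randomized Hough transform for straight lines on synthetic edge points: pairs of points are sampled at random, the (theta, rho) line they define is computed, and votes accumulate in coarse cells. The directional-information refinement proposed in the paper is not reproduced.

```python
# Hedged sketch of a randomized Hough transform for lines on synthetic data.
import numpy as np

rng = np.random.default_rng(4)
t = rng.uniform(0, 100, 200)
line_pts = np.stack([t, 0.5 * t + 20 + rng.normal(0, 0.5, t.size)], axis=1)
clutter = rng.uniform(0, 120, (200, 2))            # random non-line edge points
pts = np.vstack([line_pts, clutter])

acc = {}
for _ in range(5000):
    i, j = rng.choice(len(pts), size=2, replace=False)
    (x1, y1), (x2, y2) = pts[i], pts[j]
    dx, dy = x2 - x1, y2 - y1
    theta = np.arctan2(dx, -dy) % np.pi            # line normal angle in [0, pi)
    rho = x1 * np.cos(theta) + y1 * np.sin(theta)  # signed distance from origin
    cell = (int(round(np.degrees(theta))), int(round(rho)))
    acc[cell] = acc.get(cell, 0) + 1

theta_deg, rho = max(acc, key=acc.get)
print("dominant line: theta ~ %d deg, rho ~ %d" % (theta_deg, rho))
```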
Use of simulation to compare the performance of minimization with stratified blocked randomization.
Toorawa, Robert; Adena, Michael; Donovan, Mark; Jones, Steve; Conlon, John
2009-01-01
Minimization is an alternative method to stratified permuted block randomization, which may be more effective at balancing treatments when there are many strata. However, its use in the regulatory setting for industry trials remains controversial, primarily due to the difficulty in interpreting conventional asymptotic statistical tests under restricted methods of treatment allocation. We argue that the use of minimization should be critically evaluated when designing the study for which it is proposed. We demonstrate by example how simulation can be used to investigate whether minimization improves treatment balance compared with stratified randomization, and how much randomness can be incorporated into the minimization before any balance advantage is no longer retained. We also illustrate by example how the performance of the traditional model-based analysis can be assessed, by comparing the nominal test size with the observed test size over a large number of simulations. We recommend that the assignment probability for the minimization be selected using such simulations. Copyright (c) 2008 John Wiley & Sons, Ltd.
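As an illustration of the simulation strategy recommended above, the sketch below compares within-stratum imbalance under a biased-coin minimization and under simple stratified randomization. It is a single-factor caricature: real Pocock-Simon minimization balances marginal totals over several prognostic factors, and the stratified comparator would normally use permuted blocks.

```python
# Toy simulation comparing treatment imbalance under minimization versus
# simple stratified randomization; strata and probabilities are illustrative.
import numpy as np

rng = np.random.default_rng(5)
n_patients, p_assign = 200, 0.8           # probability of taking the minimizing arm

def simulate(method):
    strata = rng.integers(0, 8, n_patients)          # 8 hypothetical strata
    counts = np.zeros((8, 2))                        # patients per stratum x arm
    for s in strata:
        if method == "minimization":
            arm = int(counts[s, 0] > counts[s, 1])   # arm with fewer patients so far
            if counts[s, 0] == counts[s, 1] or rng.random() > p_assign:
                arm = rng.integers(0, 2)             # random element of the biased coin
        else:                                        # simple stratified randomization
            arm = rng.integers(0, 2)
        counts[s, arm] += 1
    return np.abs(counts[:, 0] - counts[:, 1]).sum() # total within-stratum imbalance

for m in ("minimization", "stratified"):
    print(m, np.mean([simulate(m) for _ in range(500)]))
```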
Hernández-Cordero, Sonia; González-Castell, Dinorah; Rodríguez-Ramírez, Sonia; Villanueva-Borbolla, María Ángeles; Unar, Mishel; Barquera, Simón; de Cossío, Teresita González; Rivera-Dommarco, Juan; Popkin, Barry M
2014-01-01
Objective To describe the design, methods, and challenges encountered during a randomized clinical trial aimed at promoting water intake to reduce the risk of metabolic syndrome in Mexican women. Materials and methods In a randomized clinical trial in Cuernavaca, Mexico, overweight and obese women (body mass index [BMI] ≥ 25 and < 39), 18 to < 45 years old, with an intake of sugar-sweetened beverages ≥ 250 kilocalories per day (kcal/day) were randomly allocated to the water and education provision group (n = 120) or the education provision only group (n = 120). Results We screened 1 756 women. The main difficulties encountered were identifying participants who met the recruitment criteria, delivering water to participants, and the time demanded from the study participants. Conclusions The trial’s main challenges were difficulties surrounding recruitment, delivery of the intervention, and the time demanded from the study participants. Modifications were effectively implemented without jeopardizing the original protocol. PMID:24715012
Jackson, Dan; Bowden, Jack
2016-09-07
Confidence intervals for the between study variance are useful in random-effects meta-analyses because they quantify the uncertainty in the corresponding point estimates. Methods for calculating these confidence intervals have been developed that are based on inverting hypothesis tests using generalised heterogeneity statistics. Whilst, under the random effects model, these new methods furnish confidence intervals with the correct coverage, the resulting intervals are usually very wide, making them uninformative. We discuss a simple strategy for obtaining 95 % confidence intervals for the between-study variance with a markedly reduced width, whilst retaining the nominal coverage probability. Specifically, we consider the possibility of using methods based on generalised heterogeneity statistics with unequal tail probabilities, where the tail probability used to compute the upper bound is greater than 2.5 %. This idea is assessed using four real examples and a variety of simulation studies. Supporting analytical results are also obtained. Our results provide evidence that using unequal tail probabilities can result in shorter 95 % confidence intervals for the between-study variance. We also show some further results for a real example that illustrates how shorter confidence intervals for the between-study variance can be useful when performing sensitivity analyses for the average effect, which is usually the parameter of primary interest. We conclude that using unequal tail probabilities when computing 95 % confidence intervals for the between-study variance, when using methods based on generalised heterogeneity statistics, can result in shorter confidence intervals. We suggest that those who find the case for using unequal tail probabilities convincing should use the '1-4 % split', where greater tail probability is allocated to the upper confidence bound. The 'width-optimal' interval that we present deserves further investigation.
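The following sketch, on made-up study data, shows the kind of Q-profile computation discussed here with a "1-4% split" of the tail probabilities (1% on the lower limit, 4% on the upper): the generalised Q statistic is profiled over tau^2 and inverted against chi-square quantiles.

```python
# Hedged sketch of a Q-profile interval for the between-study variance tau^2
# with unequal tail probabilities. Study data are invented numbers.
import numpy as np
from scipy import stats, optimize

y = np.array([0.10, 0.35, -0.05, 0.42, 0.20, 0.31])   # study effects (illustrative)
v = np.array([0.02, 0.03, 0.015, 0.04, 0.025, 0.02])  # within-study variances
k = len(y)

def Q(tau2):
    w = 1.0 / (v + tau2)
    mu = np.sum(w * y) / w.sum()
    return np.sum(w * (y - mu) ** 2)                   # generalised Q, decreasing in tau2

def bound(tail):
    target = stats.chi2.ppf(tail, df=k - 1)
    if Q(0.0) <= target:                               # limit truncated at zero
        return 0.0
    return optimize.brentq(lambda t: Q(t) - target, 0.0, 100.0)

lower = bound(1 - 0.01)   # 1% tail gives the lower limit
upper = bound(0.04)       # 4% tail gives the upper limit
print(f"95% CI for tau^2: ({lower:.4f}, {upper:.4f})")
```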
Crowther, Michael J; Look, Maxime P; Riley, Richard D
2014-09-28
Multilevel mixed effects survival models are used in the analysis of clustered survival data, such as repeated events, multicenter clinical trials, and individual participant data (IPD) meta-analyses, to investigate heterogeneity in baseline risk and covariate effects. In this paper, we extend parametric frailty models including the exponential, Weibull and Gompertz proportional hazards (PH) models and the log logistic, log normal, and generalized gamma accelerated failure time models to allow any number of normally distributed random effects. Furthermore, we extend the flexible parametric survival model of Royston and Parmar, modeled on the log-cumulative hazard scale using restricted cubic splines, to include random effects while also allowing for non-PH (time-dependent effects). Maximum likelihood is used to estimate the models utilizing adaptive or nonadaptive Gauss-Hermite quadrature. The methods are evaluated through simulation studies representing clinically plausible scenarios of a multicenter trial and IPD meta-analysis, showing good performance of the estimation method. The flexible parametric mixed effects model is illustrated using a dataset of patients with kidney disease and repeated times to infection and an IPD meta-analysis of prognostic factor studies in patients with breast cancer. User-friendly Stata software is provided to implement the methods. Copyright © 2014 John Wiley & Sons, Ltd.
Optimization Of Mean-Semivariance-Skewness Portfolio Selection Model In Fuzzy Random Environment
NASA Astrophysics Data System (ADS)
Chatterjee, Amitava; Bhattacharyya, Rupak; Mukherjee, Supratim; Kar, Samarjit
2010-10-01
The purpose of the paper is to construct a mean-semivariance-skewness portfolio selection model in a fuzzy random environment. The objective is to maximize the skewness subject to a predefined maximum risk tolerance and minimum expected return. The security returns in the objectives and constraints are assumed to be fuzzy random variables, and their vagueness is transformed into fuzzy variables similar to trapezoidal numbers. The newly formed fuzzy model is then converted into a deterministic optimization model. The feasibility and effectiveness of the proposed method are verified by a numerical example extracted from the Bombay Stock Exchange (BSE). The exact parameters of the fuzzy membership functions and probability density functions are obtained through fuzzy random simulation of past data.
Vehicle track segmentation using higher order random fields
Quach, Tu -Thach
2017-01-09
Here, we present an approach to segment vehicle tracks in coherent change detection images, a product of combining two synthetic aperture radar images taken at different times. The approach uses multiscale higher order random field models to capture track statistics, such as curvatures and their parallel nature, that are not currently utilized in existing methods. These statistics are encoded as 3-by-3 patterns at different scales. The model can complete disconnected tracks often caused by sensor noise and various environmental effects. Coupling the model with a simple classifier, our approach is effective at segmenting salient tracks. We improve the F-measure on a standard vehicle track data set to 0.963, up from 0.897 obtained by the current state-of-the-art method.
NASA Technical Reports Server (NTRS)
Suomi, V. E.
1975-01-01
The stability of stochastic satellites and the stability and control of flexible satellites were investigated. The effects of random environmental torques and of noise in the moments of inertia of spinning and three-axis stabilized satellites were first compared analytically by four methods and by analog simulations. Among the analytical methods, it was shown that the Fokker-Planck formulation yields predictions that coincide most closely with the simulation results. It was then shown that, under the assumption that the environmental and control torques experienced by the satellite are random, the required stability criterion of a satellite is quite different from that obtained by a deterministic approach. Finally, it was demonstrated that, by monitoring the deformations of the flexible elements of a satellite, the effectiveness of the satellite control system can be increased considerably.
Lamontagne, Marie-Eve; Perreault, Kadija; Gagnon, Marie-Pierre
2014-04-10
Despite growing interest in the importance of, and challenges associated with the involvement of patient and population (IPP) in the process of developing and adapting clinical practice guidelines (CPGs), there is a lack of knowledge about the best method to use. This is especially problematic in the field of rehabilitation, where individuals with disabilities might face many barriers to their involvement in the guideline development and adaptation process. The goal of this pilot trial is to document the acceptability, feasibility and effectiveness of two methods of involving patients with a disability (traumatic brain injury) in CPG development. A single-blind, randomized, crossover pragmatic trial will be performed with 20 patients with traumatic brain injury (TBI). They will be randomized into two groups, and each will try two alternative methods of producing recommendations; a discussion group (control intervention) and a Wiki, a webpage that can be modified by those who have access to it (experimental intervention). The participants will rate the acceptability of the two methods, and feasibility will be assessed using indicators such as the number of participants who accessed and completed the two methods, and the number of support interventions required. Twenty experts, blinded to the method of producing the recommendations, will independently rate the recommendations produced by the participants for clarity, accuracy, appropriateness and usefulness. Our trial will allow for the use of optimal IPP methods in a larger project of adapting guidelines for the rehabilitation of individuals with TBI. Ultimately the results will inform the science of CPG development and contribute to the growing knowledge about IPP in rehabilitation settings. Clinical trial KT Canada 87776.
Wall, Kristin M; Vwalika, Bellington; Haddad, Lisa; Khu, Naw H; Vwalika, Cheswa; Kilembe, William; Chomba, Elwyn; Stephenson, Rob; Kleinbaum, David; Nizam, Azhar; Brill, Ilene; Tichacek, Amanda; Allen, Susan
2013-05-01
To evaluate the impact of family planning promotion on incident pregnancy in a combined effort to address Prongs 1 and 2 of prevention of mother-to-child transmission of HIV. We conducted a factorial randomized controlled trial of 2 video-based interventions. "Methods" and "Motivational" messages promoted long-term contraceptive use among 1060 couples with HIV in Lusaka, Zambia. Among couples not using contraception before randomization (n = 782), the video interventions had no impact on incident pregnancy. Among baseline contraceptive users, viewing the "Methods video" which focused on the intrauterine device and contraceptive implant was associated with a significantly lower pregnancy incidence [hazard ratio (HR) = 0.38; 95% confidence interval (CI): 0.19 to 0.75] relative to those viewing control and/or motivational videos. The effect was strongest in concordant positive couples (HR = 0.22; 95% CI: 0.08 to 0.58) and couples with HIV-positive women (HR = 0.23; 95% CI: 0.09 to 0.55). The "Methods video" intervention was previously shown to increase uptake of long-acting contraception and to prompt a shift from daily oral contraceptives to quarterly injectables and long-acting methods such as the intrauterine device and implant. Follow-up confirms sustained intervention impact on pregnancy incidence among baseline contraceptive users, in particular couples with HIV-positive women. Further work is needed to identify effective interventions to promote long-acting contraception among couples who have not yet adopted modern methods.
Cogo-Moreira, Hugo; de Ávila, Clara Regina Brandão; Ploubidis, George B.; Mari, Jair de Jesus
2013-01-01
Introduction Difficulties in word-level reading skills are prevalent in Brazilian schools and may deter children from gaining the knowledge obtained through reading and from academic achievement. Music education has emerged as a potential method to improve reading skills because of a common neurobiological substratum. Objective To evaluate the effectiveness of music education for the improvement of reading skills and academic achievement among children (eight to 10 years of age) with reading difficulties. Method 235 children with reading difficulties in 10 schools participated in a five-month cluster randomized clinical trial (RCT) in an impoverished zone within the city of São Paulo to test the effects of a music education intervention, with reading skills and academic achievement assessed during the school year. Five schools were chosen randomly to incorporate music classes (n = 114), and five served as controls (n = 121). Two different methods of analysis were used to evaluate the effectiveness of the intervention: the standard intention-to-treat (ITT) method and the Complier Average Causal Effect (CACE) estimation method, which takes compliance status into account. Results The ITT analyses were not very promising; there was only a marginal effect for the rate of correct real words read per minute. Considering ITT, improvements were observed in the secondary outcomes (slope of Portuguese = 0.21 [p<0.001] and slope of math = 0.25 [p<0.001]). With CACE estimation (i.e., complier children versus non-complier children), more promising effects were observed for the rate of correct words read per minute [β = 13.98, p<0.001] and phonological awareness [β = 19.72, p<0.001], as well as the secondary outcomes (academic achievement in Portuguese [β = 0.77, p<0.0001] and math [β = 0.49, p<0.001] throughout the school year). Conclusion The results may be seen as promising, but they are not, in themselves, enough to support making music lessons public policy. PMID:23544117
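To clarify the distinction between the two analyses, the sketch below computes an ITT estimate and a simple CACE estimate on simulated data, using the Wald (instrumental-variable) form in which the ITT effect is rescaled by the compliance difference. This is one common way to estimate CACE and not necessarily the estimator used in the paper; all numbers are synthetic.

```python
# Simulated contrast between intention-to-treat (ITT) and a Wald-type CACE.
import numpy as np

rng = np.random.default_rng(6)
n = 1000
z = rng.integers(0, 2, n)                 # randomized to music classes or not
complier = rng.random(n) < 0.6            # latent compliance status
d = z * complier                          # actually received the lessons
y = 0.3 * d + rng.normal(0, 1, n)         # reading gain; true effect 0.3 for compliers

itt = y[z == 1].mean() - y[z == 0].mean()
compliance_diff = d[z == 1].mean() - d[z == 0].mean()
cace = itt / compliance_diff              # Wald / instrumental-variable estimate
print(f"ITT = {itt:.3f}, CACE (Wald) = {cace:.3f}")
```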
NASA Astrophysics Data System (ADS)
Tian, Yu-Kun; Zhou, Hui; Chen, Han-Ming; Zou, Ya-Ming; Guan, Shou-Jun
2013-12-01
Seismic inversion is a highly ill-posed problem, due to many factors such as the limited seismic frequency bandwidth and inappropriate forward modeling. To obtain a unique solution, some smoothing constraints, e.g., Tikhonov regularization, are usually applied. The Tikhonov method can maintain a globally smooth solution, but it blurs structure edges. In this paper we use the Huber-Markov random-field edge-protection method in the inversion of three parameters: P-velocity, S-velocity, and density. The method avoids blurring structure edges and resists noise. For the parameter to be inverted, the Huber-Markov random field constructs a neighborhood system, which further acts as the vertical and lateral constraint. We use a quadratic Huber edge penalty function within a layer to suppress noise and a linear one at edges to avoid a fuzzy result. The effectiveness of our method is demonstrated by inverting synthetic data with and without noise. The relationship between the adopted constraints and the inversion results is analyzed as well.
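For reference, the Huber penalty that makes this edge protection work is quadratic for small differences between neighboring model parameters and linear for large ones. The sketch below evaluates it on a toy velocity profile with one structural edge; the threshold value is an illustrative choice, not the paper's.

```python
# Huber penalty: quadratic for small neighbour differences (smooths noise
# within a layer), linear for large ones (preserves structure edges).
import numpy as np

def huber(d, delta=1.0):
    small = np.abs(d) <= delta
    return np.where(small, 0.5 * d**2, delta * np.abs(d) - 0.5 * delta**2)

profile = np.array([2.0, 2.02, 2.01, 3.5, 3.52, 3.51])   # toy P-velocity with one edge
neighbour_diffs = np.diff(profile)
print(huber(neighbour_diffs))   # the large jump is penalized linearly, not quadratically
```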
NASA Astrophysics Data System (ADS)
El Sachat, Alexandros; Meristoudi, Anastasia; Markos, Christos; Pispas, Stergios; Riziotis, Christos
2014-03-01
A low-cost and low-complexity optical method for the detection of proteins is presented, employing a detection scheme based on electrostatic interactions and implemented by sensitizing a polymer optical fiber (POF) surface with thin overlayers of properly designed sensitive copolymer materials carrying predesigned charges. This method enables the fast detection of proteins having a charge opposite to that of the overlayer, and also the effective discrimination of differently charged proteins such as lysozyme (LYS) and bovine serum albumin (BSA). As sensitive materials, the block and random copolymers of the same monomers were employed, namely the block copolymer poly(styrene-b-2vinylpyridine) (PS-b-P2VP) and the corresponding random copolymer poly(styrene-r-2vinylpyridine) (PS-r-P2VP), of similar composition and molecular weights. Results show a systematically different response between the block and the random copolymers, although of the same order of magnitude, allowing important conclusions to be drawn on the techno-economic aspects of their application, given that the two materials have significantly different manufacturing methods and costs. The use of the POF platform, in combination with these adaptable copolymer sensing materials, could lead to efficient low-cost bio-detection schemes.
Effect of anger management education on mental health and aggression of prisoner women
Bahrami, Elaheh; Mazaheri, Maryam Amidi; Hasanzadeh, Akbar
2016-01-01
Background and Purpose: Uncontrolled anger is a serious threat to personal adjustment and health. Weaknesses and shortcomings in the management of anger lead to consequences ranging from personal distress and the destruction of interpersonal relationships to broader public health problems, failure to compromise, and the adverse outcomes of aggressive behavior. This study investigates the effects of anger management education on the mental health and aggression of women prisoners in Isfahan. Materials and Methods: A single-group quasi-experimental (pretest-posttest) study of women imprisoned in the central prison of Isfahan was conducted. A multi-stage random sampling method was used. Initially, 165 women were selected randomly and completed the Buss and Perry Aggression Questionnaire and the General Health Questionnaire-28; those with scores above 78 (the cut point) on the aggression scale were identified, and 70 of them were randomly selected. In the next step, the intervention was conducted in four 90-minute training sessions. The posttest was performed 1 month after the intervention. Data were analyzed using SPSS-20 software. Results: Data analysis showed that anger management training was effective in reducing aggression (P < 0.001) and also had a positive effect on mental health (P < 0.001). Conclusion: Given the importance of aggression for adjustment and for individual and collective health, and according to these findings, educational programs on anger management are essential for female prisoners. PMID:27512697
Method for removal of random noise in eddy-current testing system
Levy, Arthur J.
1995-01-01
Eddy-current response voltages, generated during inspection of metallic structures for anomalies, are often replete with noise. Therefore, analysis of the inspection data and results is difficult or near impossible, resulting in inconsistent or unreliable evaluation of the structure. This invention processes the eddy-current response voltage, removing the effect of random noise, to allow proper identification of anomalies within and associated with the structure.
ERIC Educational Resources Information Center
Johnson, Mats; Fransson, Gunnar; Östlund, Sven; Areskoug, Björn; Gillberg, Christopher
2017-01-01
Background: Previous research has shown positive effects of Omega 3/6 fatty acids in children with inattention and reading difficulties. We aimed to investigate if Omega 3/6 improved reading ability in mainstream schoolchildren. Methods: We performed a 3-month parallel, randomized, double-blind, placebo-controlled trial followed by 3-month active…
ERIC Educational Resources Information Center
Costa, Rochelle Rocha; Pilla, Carmen; Buttelli, Adriana Cristine Koch; Barreto, Michelle Flores; Vieiro, Priscila Azevedo; Alberton, Cristine Lima; Bracht, Cláudia Gomes; Kruel, Luiz Fernando Martins
2018-01-01
Purpose: This study aimed to investigate the effects of water-based aerobic training on the lipid profile and lipoprotein lipase (LPL) levels in premenopausal women with dyslipidemia. Method: Forty women were randomly assigned to: aquatic training (WA; n = 20) or a control group (CG; n = 20). The WA group underwent 12 weeks of water-based interval…
Game of Life on the Equal Degree Random Lattice
NASA Astrophysics Data System (ADS)
Shao, Zhi-Gang; Chen, Tao
2010-12-01
An effective matrix method is used to build the equal degree random (EDR) lattice, and a cellular automaton game of life on the EDR lattice is then studied by Monte Carlo (MC) simulation. The standard mean field approximation (MFA) is applied, and the density of live cells given by the MFA is ρ=0.37017, which is consistent with the result ρ=0.37±0.003 from the MC simulation.
Recruiting Unmotivated Smokers into a Smoking Induction Trial
ERIC Educational Resources Information Center
Harris, Kari Jo; Bradley-Ewing, Andrea; Goggin, Kathy; Richter, Kimber P.; Patten, Christi; Williams, Karen; Lee, Hyoung S.; Staggs, Vincent S.; Catley, Delwyn
2016-01-01
Little is known about effective methods to recruit unmotivated smokers into cessation induction trials, the reasons unmotivated smokers agree to participate, and the impact of those reasons on study outcomes. A mixed-method approach was used to examine recruitment data from a randomized controlled cessation induction trial that enrolled 255 adult…
Monolingual or Bilingual Intervention for Primary Language Impairment? A Randomized Control Trial
ERIC Educational Resources Information Center
Thordardottir, Elin; Cloutier, Geneviève; Ménard, Suzanne; Pelland-Blais, Elaine; Rvachew, Susan
2015-01-01
Purpose: This study investigated the clinical effectiveness of monolingual versus bilingual language intervention, the latter involving speech-language pathologist-parent collaboration. The study focuses on methods that are currently being recommended and that are feasible within current clinical contexts. Method: Bilingual children with primary…
Reactive Power Pricing Model Considering the Randomness of Wind Power Output
NASA Astrophysics Data System (ADS)
Dai, Zhong; Wu, Zhou
2018-01-01
With the increase of wind power capacity integrated into the grid, the influence of the randomness of wind power output on the reactive power distribution of the grid is becoming increasingly pronounced. Meanwhile, power market reform places higher requirements on the reasonable pricing of reactive power service. On this basis, the article combines an optimal power flow model that considers wind power randomness with an integrated cost allocation method to price reactive power. Considering the advantages and disadvantages of present cost allocation methods and of marginal cost pricing, an integrated cost allocation method based on optimal power flow tracing is proposed. The model achieves the optimal power flow distribution of reactive power with minimal integrated cost under wind power integration, while guaranteeing balanced reactive power pricing. Finally, through multi-scenario example calculations and stochastic simulation of wind power outputs, the article compares the results of the model pricing with marginal cost pricing, which demonstrates that the model is accurate and effective.
Analysis of axial compressive loaded beam under random support excitations
NASA Astrophysics Data System (ADS)
Xiao, Wensheng; Wang, Fengde; Liu, Jian
2017-12-01
An analytical procedure to investigate the response spectrum of a uniform Bernoulli-Euler beam with axial compressive load subjected to random support excitations is implemented based on the Mindlin-Goodman method and the mode superposition method in the frequency domain. The random response spectrum of the simply supported beam subjected to white noise excitation and to Pierson-Moskowitz spectrum excitation is investigated, and the characteristics of the response spectrum are further explored. Moreover, the effect of axial compressive load is studied and a method to determine the axial load is proposed. The research results show that the response spectrum mainly consists of the beam's additional displacement response spectrum when the excitation is white noise; however, the quasi-static displacement response spectrum is the main component when the excitation is the Pierson-Moskowitz spectrum. Under white noise excitation, the amplitude of the power spectral density function decreased as the axial compressive load increased, while the frequency band of the vibration response spectrum increased with the increase of axial compressive load.
Estimating peer effects in networks with peer encouragement designs.
Eckles, Dean; Kizilcec, René F; Bakshy, Eytan
2016-07-05
Peer effects, in which the behavior of an individual is affected by the behavior of their peers, are central to social science. Because peer effects are often confounded with homophily and common external causes, recent work has used randomized experiments to estimate effects of specific peer behaviors. These experiments have often relied on the experimenter being able to randomly modulate mechanisms by which peer behavior is transmitted to a focal individual. We describe experimental designs that instead randomly assign individuals' peers to encouragements to behaviors that directly affect those individuals. We illustrate this method with a large peer encouragement design on Facebook for estimating the effects of receiving feedback from peers on posts shared by focal individuals. We find evidence for substantial effects of receiving marginal feedback on multiple behaviors, including giving feedback to others and continued posting. These findings provide experimental evidence for the role of behaviors directed at specific individuals in the adoption and continued use of communication technologies. In comparison, observational estimates differ substantially, both underestimating and overestimating effects, suggesting that researchers and policy makers should be cautious in relying on them.
Yamashita, Yasunobu; Ueda, Kazuki; Kawaji, Yuki; Tamura, Takashi; Itonaga, Masahiro; Yoshida, Takeichi; Maeda, Hiroki; Magari, Hirohito; Maekita, Takao; Iguchi, Mikitaka; Tamai, Hideyuki; Ichinose, Masao; Kato, Jun
2016-07-15
Transpapillary forceps biopsy is an effective diagnostic technique in patients with biliary stricture. This prospective study aimed to determine the usefulness of the wire-grasping method as a new technique for forceps biopsy. Consecutive patients with biliary stricture or irregularities of the bile duct wall were randomly allocated to either the direct or the wire-grasping method group. In the wire-grasping method, forceps in the duodenum grasp a guidewire placed into the bile duct beforehand, and then the forceps are pushed through the papilla without endoscopic sphincterotomy. In the direct method, forceps are directly pushed into the bile duct alongside a guidewire. The primary endpoint was the success rate of obtaining specimens suitable for adequate pathological examination. In total, 32 patients were enrolled, and 28 (14 in each group) were eligible for analysis. The success rate was significantly higher using the wire-grasping method than the direct method (100% vs 50%, p=0.016). Sensitivity and accuracy for the diagnosis of cancer were comparable between the two methods in patients with successful procurement of biopsy specimens (91% vs 83% and 93% vs 86%, respectively). The wire-grasping method is useful for diagnosing patients with biliary stricture or irregularities of the bile duct wall.
Effective Recruitment of Schools for Randomized Clinical Trials: Role of School Nurses.
Petosa, R L; Smith, L
2017-01-01
In school settings, nurses lead efforts to improve student health and well-being to support academic success. Nurses are guided by evidence-based practice and data to inform care decisions. The randomized controlled trial (RCT) is considered the gold standard of scientific rigor for clinical trials. RCTs are critical to the development of evidence-based health promotion programs in schools. The purpose of this article is to present practical solutions for implementing principles of randomization in RCTs conducted in school settings. Randomization is a powerful sampling method used to build internal and external validity. The school's daily organization and educational mission present several barriers to randomization. Based on their experience in conducting school-based RCTs, the authors offer a host of practical solutions for working with schools to successfully implement randomization procedures. Nurses play a critical role in implementing RCTs in schools to promote rigorous science in support of evidence-based practice.
Amiri, Hamid Reza; Mirzaei, Mojtaba; Beig Mohammadi, Mohammad Taghi; Tavakoli, Farhad
2016-01-01
Background Preemptive analgesia may be considered a method not only to alleviate postoperative pain but also to decrease analgesic consumption. Different regimens are suggested, but there is currently no standard. Objectives The aim was to measure the efficacy of preemptive analgesia with pregabalin, acetaminophen, naproxen, and dextromethorphan in radical neck dissection surgery for reducing the intensity of pain and morphine consumption. Patients and Methods This study was conducted as a randomized double-blind clinical trial. Eighty adult patients (18 to 60 years of age) with American Society of Anesthesiologists (ASA) physical status I or II undergoing elective radical neck dissection were enrolled. Patients were randomized into two groups of 40 with a simple randomization method. The case group received a combination of 15 mg/kg acetaminophen, 2.5 mg/kg pregabalin, 7 mg/kg naproxen, and 0.3 mg/kg dextromethorphan administered orally one hour prior to surgery. Postoperative pain was assessed with the universal pain assessment tool (UPAT) at 0, 2, 4, 6, 12, and 24 hours after surgery. Subjects received morphine based on the postoperative pain control protocol. Total administered morphine doses were noted. Results Postoperative pain rates at 0, 2, 4, 6, 12, and 24 hours after surgery were significantly lower for the case group than the control group (P values = 0.014, 0.003, 0.00, 0.00, and 0.00, respectively). Total morphine doses for the preemptive analgesia group were 45% lower than those of the other group. Side effects were similar for both groups. Conclusions A single preoperative oral dose of pregabalin, acetaminophen, dextromethorphan, and naproxen one hour before surgery is an effective method for reducing postoperative pain and morphine consumption in patients undergoing radical neck dissection. PMID:27843771
Abbasinia, Mohammad; Irajpour, Alireza; Babaii, Atye; Shamali, Mehdi; Vahdatnezhad, Jahanbakhsh
2014-01-01
Introduction: Endotracheal tube suctioning is essential for improving oxygenation in patients undergoing mechanical ventilation. There are two types of endotracheal tube suctioning: shallow and deep. This study aimed to evaluate the effect of shallow and deep suctioning methods on respiratory rate (RR), arterial blood oxygen saturation (SpO2) and the number of suctioning episodes in patients hospitalized in the intensive care units of Al-Zahra Hospital, Isfahan, Iran. Methods: In this randomized controlled trial, 74 patients hospitalized in the intensive care units of Isfahan Al-Zahra Hospital were randomly allocated to the shallow and deep suctioning groups. RR and SpO2 were measured immediately before, immediately after, and 1 and 3 minutes after each suctioning. The number of suctioning episodes was also recorded in each group. Data were analyzed using repeated measures analysis of variance (RMANOVA), chi-square and independent t-tests. Results: RR significantly increased and SpO2 significantly decreased after each suctioning in both groups; however, these changes did not differ significantly between the two groups. The number of suctioning episodes was significantly higher in the shallow suctioning group than in the deep suctioning group. Conclusion: Shallow and deep suctioning had a similar effect on RR and SpO2. However, shallow suctioning required more manipulation of the patient's trachea than the deep suctioning method. Therefore, it seems that the deep endotracheal tube suctioning method can be used to clean the airway with less manipulation of the trachea. PMID:25276759
Experiments in randomly agitated granular assemblies close to the jamming transition
NASA Astrophysics Data System (ADS)
Caballero, Gabriel; Lindner, Anke; Ovarlez, Guillaume; Reydellet, Guillaume; Lanuza, José; Clément, Eric
2004-11-01
We present the results obtained for two experiments on randomly agitated granular assemblies using a novel way of shaking. First we discuss the transport properties of a 2D model system undergoing classical shaking that show the importance of large scale dynamics for this type of agitation and offer a local view of the microscopic motions of a grain. We then develop a new way of vibrating the system allowing for random accelerations smaller than gravity. Using this method we study the evolution of the free surface as well as results from a light scattering method for a 3D model system. The final aim of these experiments is to investigate the ideas of effective temperature on the one hand as a function of inherent states and on the other hand using fluctuation dissipation relations.
Kundu, Anjana; Lin, Yuting; Oron, Assaf P.; Doorenbos, Ardith Z.
2014-01-01
Purpose To examine the effects of Reiki as an adjuvant to opioid therapy for postoperative pain control in pediatric patients. Methods This was a double-blind, randomized controlled study of children undergoing dental procedures. Participants were randomly assigned to receive either Reiki therapy or the control therapy (sham Reiki) preoperatively. Postoperative pain scores, opioid requirements, and side effects were assessed. Family members were also asked about perioperative care satisfaction. Multiple linear regressions were used for analysis. Results Thirty-eight children participated. The blinding procedure was successful. No statistically significant difference was observed between groups on any outcome measure. Implications Our study provides a successful example of a blinding procedure for Reiki therapy among children in the perioperative period. This study does not support the effectiveness of Reiki as an adjuvant to opioid therapy for postoperative pain control in pediatric patients. PMID:24439640
Yavari kia, Parisa; Safajou, Farzaneh; Shahnazi, Mahnaz; Nazemiyeh, Hossein
2014-01-01
Background: Nausea and vomiting of pregnancy are among the most common complaints and affect both the physical and mental condition of pregnant women. Given the increasing tendency of women to use herbal medications during pregnancy, the effect of lemon inhalation aromatherapy on nausea and vomiting of pregnancy was investigated in this study. Objectives: The aim of this study was to determine the effect of lemon inhalation aromatherapy on nausea and vomiting during pregnancy. Materials and Methods: This was a randomized clinical trial in which 100 pregnant women with nausea and vomiting who met the eligibility criteria were randomly divided into intervention and control groups using block randomization with blocks of four and six. Lemon essential oil and placebo were given to the intervention and control groups, respectively, to inhale as soon as they felt nauseated. The intensity of nausea, vomiting, and retching was assessed 24 hours before and during the four days of treatment by means of the PUQE-24 (24-hour Pregnancy Unique Quantification of Emesis). Results: There was a statistically significant difference between the two groups in the mean scores of nausea and vomiting on the second and fourth days (P = 0.017 and P = 0.039, respectively). The mean intensity of nausea and vomiting on the second and fourth days was significantly lower in the intervention group than in the control group. In addition, in the intragroup comparison with repeated measures ANOVA, the mean nausea and vomiting scores across the five intervals showed a statistically significant difference within each group (P < 0.001 and P = 0.049, respectively). Conclusions: Lemon scent can be effective in reducing nausea and vomiting of pregnancy. PMID:24829772
Estimating intervention effects of prevention programs: Accounting for noncompliance
Stuart, Elizabeth A.; Perry, Deborah F.; Le, Huynh-Nhu; Ialongo, Nicholas S.
2010-01-01
Individuals not fully complying with their assigned treatments is a common problem encountered in randomized evaluations of behavioral interventions. Treatment group members rarely attend all sessions or do all “required” activities; control group members sometimes find ways to participate in aspects of the intervention. As a result, there is often interest in estimating both the effect of being assigned to participate in the intervention, as well as the impact of actually participating and doing all of the required activities. Methods known broadly as “complier average causal effects” (CACE) or “instrumental variables” (IV) methods have been developed to estimate this latter effect, but they are more commonly applied in medical and treatment research. Since the use of these statistical techniques in prevention trials has been less widespread, many prevention scientists may not be familiar with the underlying assumptions and limitations of CACE and IV approaches. This paper provides an introduction to these methods, described in the context of randomized controlled trials of two preventive interventions: one for perinatal depression among at-risk women and the other for aggressive disruptive behavior in children. Through these case studies, the underlying assumptions and limitations of these methods are highlighted. PMID:18843535
Robust electroencephalogram phase estimation with applications in brain-computer interface systems.
Seraj, Esmaeil; Sameni, Reza
2017-03-01
In this study, a robust method is developed for frequency-specific electroencephalogram (EEG) phase extraction using the analytic representation of the EEG. Based on recent theoretical findings in this area, it is shown that some of the phase variations previously attributed to the brain response are systematic side-effects of the methods used for EEG phase calculation, especially during low analytical amplitude segments of the EEG. With this insight, the proposed method generates randomized ensembles of the EEG phase using minor perturbations in the zero-pole loci of narrow-band filters, followed by phase estimation using the signal's analytic form and ensemble averaging over the randomized ensembles to obtain a robust EEG phase and frequency. This Monte Carlo estimation method is shown to be very robust to noise and to minor changes of the filter parameters, and it reduces the effect of spurious EEG phase jumps that do not have a cerebral origin. As proof of concept, the proposed method is used for extracting EEG phase features for a brain-computer interface (BCI) application. The results show significant improvement in classification rates using rather simple phase-related features and standard K-nearest neighbors and random forest classifiers on a standard BCI dataset. The average performance improved by 4-7% (in the absence of additive noise) and 8-12% (in the presence of additive noise). The significance of these improvements was statistically confirmed by a paired sample t-test, with p-values of 0.01 and 0.03, respectively. The proposed method for EEG phase calculation is very generic and may be applied to other EEG phase-based studies.
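A minimal sketch of the ensemble-averaging idea in Python, assuming a toy 10 Hz rhythm in noise, an 8-12 Hz band, and small random jitter of the band edges as a stand-in for the zero-pole perturbations described above (all of these choices are illustrative, not the authors' implementation):

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

rng = np.random.default_rng(0)
fs = 250
t = np.arange(0, 4, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.8 * rng.normal(size=t.size)  # toy alpha rhythm in noise

phases = []
for _ in range(50):
    # slightly perturbed band edges stand in for perturbing the filter's zero-pole loci
    lo, hi = 8 + rng.normal(0, 0.2), 12 + rng.normal(0, 0.2)
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    phases.append(np.angle(hilbert(filtfilt(b, a, eeg))))  # analytic-signal phase
phases = np.array(phases)

# circular mean over the randomized ensemble gives the robust phase estimate
robust_phase = np.angle(np.exp(1j * phases).mean(axis=0))
print(robust_phase[:5])
```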
Schaeffer, Christine; Teter, Caroline; Finch, Emily A; Hurt, Courtney; Keeter, Mary Kate; Liss, David T; Rogers, Angela; Sheth, Avani; Ackermann, Ronald
2018-02-01
Transitional care programs have been widely used to reduce readmissions and improve the quality and safety of the handoff process between hospital and outpatient providers. Very little is known about effective transitional care interventions among patients who are uninsured or covered by Medicaid. This paper describes the design and baseline characteristics of a pragmatic randomized comparative effectiveness trial of transitional care. The Northwestern Medical Group-Transitional Care (NMG-TC) care model was developed to address the needs of patients with multiple medical problems that required lifestyle changes and were amenable to office-based management. We present the design, evaluation methods and baseline characteristics of NMG-TC trial patients. Baseline demographic characteristics indicate that our patient population is predominantly male, Medicaid-insured and non-white. This study will evaluate two methods for implementing an effective transitional care model in a medically complex and socioeconomically diverse population. Copyright © 2017 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Hussein, Hussein El-ghamry Mohammad
2016-01-01
This study investigated the effect of Blackboard-based instruction on pre-service teachers' achievement in the teaching methods course at The Faculty of Education for Girls, in Bisha, KSA. Forty seventh-level English Department students were randomly assigned into either the experimental group (N = 20) or the control group (N = 20). While studying…
Xu, Chonggang; Gertner, George
2013-01-01
Fourier Amplitude Sensitivity Test (FAST) is one of the most popular uncertainty and sensitivity analysis techniques. It uses a periodic sampling approach and a Fourier transformation to decompose the variance of a model output into partial variances contributed by different model parameters. Until now, FAST analysis has mainly been confined to the estimation of partial variances contributed by the main effects of model parameters, and has not allowed for those contributed by specific interactions among parameters. In this paper, we theoretically show that FAST analysis can be used to estimate partial variances contributed by both main effects and interaction effects of model parameters using different sampling approaches (i.e., traditional search-curve based sampling, simple random sampling and random balance design sampling). We also analytically calculate the potential errors and biases in the estimation of partial variances. Hypothesis tests are constructed to reduce the effect of sampling errors on the estimation of partial variances. Our results show that compared to simple random sampling and random balance design sampling, sensitivity indices (ratios of partial variances to the variance of a specific model output) estimated by search-curve based sampling generally have higher precision but larger underestimation. Compared to simple random sampling, random balance design sampling generally provides higher estimation precision for partial variances contributed by the main effects of parameters. The theoretical derivation of partial variances contributed by higher-order interactions and the calculation of their corresponding estimation errors under different sampling schemes can help us better understand the FAST method and provide a fundamental basis for FAST applications and further improvements. PMID:24143037
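A minimal sketch of the classical first-order FAST recipe in Python, assuming a toy linear model and a simplified set of odd driving frequencies (a production implementation would choose frequencies more carefully to avoid harmonic interference):

```python
import numpy as np

def fast_first_order(model, n_params, n_samples=1025, n_harmonics=4):
    """Estimate first-order sensitivity indices with search-curve FAST."""
    omegas = np.array([2 * i + 3 for i in range(n_params)])   # simple odd frequencies 3, 5, 7, ...
    s = np.linspace(-np.pi, np.pi, n_samples, endpoint=False)
    # search-curve transformation maps s onto each parameter in [0, 1]
    x = 0.5 + np.arcsin(np.sin(np.outer(omegas, s))) / np.pi
    y = model(x.T)
    y = y - y.mean()
    power = 2 * np.abs(np.fft.rfft(y) / n_samples) ** 2       # one-sided spectral power
    total_var = power[1:].sum()
    indices = []
    for w in omegas:
        # partial variance = power at the parameter's frequency and its first harmonics
        partial = sum(power[p * w] for p in range(1, n_harmonics + 1) if p * w < power.size)
        indices.append(partial / total_var)
    return np.array(indices)

# toy model in which the first parameter dominates the output variance
model = lambda X: 4.0 * X[:, 0] + 1.0 * X[:, 1] + 0.5 * X[:, 2]
print(fast_first_order(model, n_params=3))
```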
Sun, Xin; Briel, Matthias; Busse, Jason W; Akl, Elie A; You, John J; Mejza, Filip; Bala, Malgorzata; Diaz-Granados, Natalia; Bassler, Dirk; Mertz, Dominik; Srinathan, Sadeesh K; Vandvik, Per Olav; Malaga, German; Alshurafa, Mohamed; Dahm, Philipp; Alonso-Coello, Pablo; Heels-Ansdell, Diane M; Bhatnagar, Neera; Johnston, Bradley C; Wang, Li; Walter, Stephen D; Altman, Douglas G; Guyatt, Gordon H
2009-01-01
Background Subgroup analyses in randomized trials examine whether effects of interventions differ between subgroups of study populations according to characteristics of patients or interventions. However, findings from subgroup analyses may be misleading, potentially resulting in suboptimal clinical and health decision making. Few studies have investigated the reporting and conduct of subgroup analyses and a number of important questions remain unanswered. The objectives of this study are: 1) to describe the reporting of subgroup analyses and claims of subgroup effects in randomized controlled trials, 2) to assess study characteristics associated with reporting of subgroup analyses and with claims of subgroup effects, and 3) to examine the analysis, and interpretation of subgroup effects for each study's primary outcome. Methods We will conduct a systematic review of 464 randomized controlled human trials published in 2007 in the 118 Core Clinical Journals defined by the National Library of Medicine. We will randomly select journal articles, stratified in a 1:1 ratio by higher impact versus lower impact journals. According to 2007 ISI total citations, we consider the New England Journal of Medicine, JAMA, Lancet, Annals of Internal Medicine, and BMJ as higher impact journals. Teams of two reviewers will independently screen full texts of reports for eligibility, and abstract data, using standardized, pilot-tested extraction forms. We will conduct univariable and multivariable logistic regression analyses to examine the association of pre-specified study characteristics with reporting of subgroup analyses and with claims of subgroup effects for the primary and any other outcomes. Discussion A clear understanding of subgroup analyses, as currently conducted and reported in published randomized controlled trials, will reveal both strengths and weaknesses of this practice. Our findings will contribute to a set of recommendations to optimize the conduct and reporting of subgroup analyses, and claim and interpretation of subgroup effects in randomized trials. PMID:19900273
Novel image encryption algorithm based on multiple-parameter discrete fractional random transform
NASA Astrophysics Data System (ADS)
Zhou, Nanrun; Dong, Taiji; Wu, Jianhua
2010-08-01
A new method of digital image encryption is presented that utilizes a new multiple-parameter discrete fractional random transform. Image encryption and decryption are performed based on the index additivity and the multiple parameters of the multiple-parameter fractional random transform. The plaintext and ciphertext are respectively in the spatial domain and in the fractional domain determined by the encryption keys. The proposed algorithm can effectively resist statistical analyses. Computer simulation results show that the proposed encryption algorithm is sensitive to the multiple keys and has considerable robustness, noise immunity and security.
A spectral analysis of the domain decomposed Monte Carlo method for linear systems
Slattery, Stuart R.; Evans, Thomas M.; Wilson, Paul P. H.
2015-09-08
The domain decomposed behavior of the adjoint Neumann-Ulam Monte Carlo method for solving linear systems is analyzed using the spectral properties of the linear operator. Relationships for the average length of the adjoint random walks, a measure of convergence speed and serial performance, are made with respect to the eigenvalues of the linear operator. In addition, relationships for the effective optical thickness of a domain in the decomposition are presented based on the spectral analysis and diffusion theory. Using the effective optical thickness, the Wigner rational approximation and the mean chord approximation are applied to estimate the leakage fraction of random walks from a domain in the decomposition as a measure of parallel performance and potential communication costs. The one-speed, two-dimensional neutron diffusion equation is used as a model problem in numerical experiments to test the models for symmetric operators with spectral qualities similar to light water reactor problems. We find, in general, the derived approximations show good agreement with random walk lengths and leakage fractions computed by the numerical experiments.
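The paper analyzes the adjoint variant; for intuition, here is a minimal sketch in Python of the simpler forward Neumann-Ulam estimator, assuming a small system scaled so that the iteration matrix H = I - A has row sums of |H| below one (an illustrative toy, not the domain-decomposed algorithm itself):

```python
import numpy as np

rng = np.random.default_rng(0)

def neumann_ulam_solve(A, b, n_walks=20000):
    """Estimate the solution of A x = b by random walks on H = I - A."""
    n = len(b)
    H = np.eye(n) - A
    P = np.abs(H)                       # transition probabilities; each row must sum to < 1
    x = np.zeros(n)
    for i in range(n):
        total = 0.0
        for _ in range(n_walks):
            state, weight = i, 1.0
            while True:
                total += weight * b[state]        # collision estimator tallies b along the walk
                cum = np.cumsum(P[state])
                r = rng.random()
                if r >= cum[-1]:
                    break                         # walk is absorbed
                state_next = int(np.searchsorted(cum, r))
                weight *= np.sign(H[state, state_next])
                state = state_next
        x[i] = total / n_walks
    return x

A = np.array([[1.0, -0.2, 0.1], [0.0, 1.0, -0.3], [-0.1, 0.2, 1.0]])
b = np.array([1.0, 2.0, 0.5])
print(neumann_ulam_solve(A, b))
print(np.linalg.solve(A, b))            # reference solution
```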
Quantification of moving target cyber defenses
NASA Astrophysics Data System (ADS)
Farris, Katheryn A.; Cybenko, George
2015-05-01
Current network and information systems are static, making it simple for attackers to maintain an advantage. Adaptive defenses, such as Moving Target Defenses (MTD) have been developed as potential "game-changers" in an effort to increase the attacker's workload. With many new methods being developed, it is difficult to accurately quantify and compare their overall costs and effectiveness. This paper compares the tradeoffs between current approaches to the quantification of MTDs. We present results from an expert opinion survey on quantifying the overall effectiveness, upfront and operating costs of a select set of MTD techniques. We find that gathering informed scientific opinions can be advantageous for evaluating such new technologies as it offers a more comprehensive assessment. We end by presenting a coarse ordering of a set of MTD techniques from most to least dominant. We found that seven out of 23 methods rank as the more dominant techniques. Five of which are techniques of either address space layout randomization or instruction set randomization. The remaining two techniques are applicable to software and computer platforms. Among the techniques that performed the worst are those primarily aimed at network randomization.
ERIC Educational Resources Information Center
Rubin, Allen; Washburn, Micki; Schieszler, Christine
2017-01-01
Purpose: This article provides benchmark data on within-group effect sizes from published randomized clinical trials (RCTs) supporting the efficacy of trauma-focused cognitive behavioral therapy (TF-CBT) for traumatized children. Methods: Within-group effect-size benchmarks for symptoms of trauma, anxiety, and depression were calculated via the…
A Comparison of Four Approaches to Account for Method Effects in Latent State-Trait Analyses
ERIC Educational Resources Information Center
Geiser, Christian; Lockhart, Ginger
2012-01-01
Latent state-trait (LST) analysis is frequently applied in psychological research to determine the degree to which observed scores reflect stable person-specific effects, effects of situations and/or person-situation interactions, and random measurement error. Most LST applications use multiple repeatedly measured observed variables as indicators…
Lin, Hung-Yu; Flask, Chris A; Dale, Brian M; Duerk, Jeffrey L
2007-06-01
To investigate and evaluate a new rapid dark-blood vessel-wall imaging method using random bipolar gradients with a radial steady-state free precession (SSFP) acquisition in carotid applications. The carotid artery bifurcations of four asymptomatic volunteers (28-37 years old, mean age = 31 years) were included in this study. Dark-blood contrast was achieved through the use of random bipolar gradients applied prior to the signal acquisition of each radial projection in a balanced SSFP acquisition. The resulting phase variation for moving spins established significant destructive interference in the low-frequency region of k-space. This phase variation resulted in a net nulling of the signal from flowing spins, while the bipolar gradients had a minimal effect on the static spins. The net effect was that the regular SSFP signal amplitude (SA) in stationary tissues was preserved while dark-blood contrast was achieved for moving spins. In this implementation, application of the random bipolar gradient pulses along all three spatial directions nulled the signal from both in-plane and through-plane flow in phantom and in vivo studies. In vivo imaging trials confirmed that dark-blood contrast can be achieved with the radial random bipolar SSFP method, thereby substantially reversing the vessel-to-lumen contrast-to-noise ratio (CNR) of a conventional rectilinear SSFP "bright-blood" acquisition from bright blood to dark blood with only a modest increase in TR (approximately 4 msec) to accommodate the additional bipolar gradients. Overall, this sequence offers a simple and effective dark-blood contrast mechanism for high-SNR SSFP acquisitions in vessel wall imaging within a short acquisition time.
Neil, Jordan M.; Strekalova, Yulia A.; Sarge, Melanie A.
2017-01-01
Background: Improving informed consent to participate in randomized clinical trials (RCTs) is a key challenge in cancer communication. The current study examines strategies for enhancing randomization comprehension among patients with diverse levels of health literacy and identifies cognitive and affective predictors of intentions to participate in cancer RCTs. Methods: Using a post-test-only experimental design, cancer patients (n = 500) were randomly assigned to receive one of three message conditions for explaining randomization (i.e., plain language condition, gambling metaphor, benign metaphor) or a control message. All statistical tests were two-sided. Results: Health literacy was a statistically significant moderator of randomization comprehension (P = .03). Among participants with the lowest levels of health literacy, the benign metaphor resulted in greater comprehension of randomization as compared with plain language (P = .04) and control (P = .004) messages. Among participants with the highest levels of health literacy, the gambling metaphor resulted in greater randomization comprehension as compared with the benign metaphor (P = .04). A serial mediation model showed a statistically significant negative indirect effect of comprehension on behavioral intention through personal relevance of RCTs and anxiety associated with participation in RCTs (P < .001). Conclusions: The effectiveness of metaphors for explaining randomization depends on health literacy, with a benign metaphor being particularly effective for patients at the lower end of the health literacy spectrum. The theoretical model demonstrates the cognitive and affective predictors of behavioral intention to participate in cancer RCTs and offers guidance on how future research should employ communication strategies to improve the informed consent processes. PMID:27794035
Effect of Methods of Learning and Self Regulated Learning toward Outcomes of Learning Social Studies
ERIC Educational Resources Information Center
Tjalla, Awaluddin; Sofiah, Evi
2015-01-01
This research aims to reveal the influence of learning methods and self-regulated learning on students' learning scores in Social Studies. The research was done in an Islamic Junior High School (MTs Manba'ul Ulum), Batuceper City, Tangerang, using a quasi-experimental method. The research employed a simple random sampling technique with 28 students. Data were…
Quantum Random Number Generation Using a Quanta Image Sensor
Amri, Emna; Felk, Yacine; Stucki, Damien; Ma, Jiaju; Fossum, Eric R.
2016-01-01
A new quantum random number generation method is proposed. The method is based on the randomness of the photon emission process and the single-photon counting capability of the Quanta Image Sensor (QIS). It has the potential to generate high-quality random numbers at a remarkable data output rate. In this paper, the principle of photon statistics and the theory of entropy are discussed. Sample data were collected with a QIS jot device, and their randomness quality was analyzed. The randomness assessment method and results are discussed. PMID:27367698
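A minimal sketch in Python of the general photon-counting idea, assuming Poisson emission statistics, parity of each count as the raw bit, and a von Neumann extractor for debiasing (the mean count and the extraction scheme are illustrative assumptions, not the paper's exact pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)

def von_neumann_extract(bits):
    """Debias raw bits: keep the first bit of 01/10 pairs, drop 00/11 pairs."""
    pairs = bits[: bits.size // 2 * 2].reshape(-1, 2)
    keep = pairs[:, 0] != pairs[:, 1]
    return pairs[keep, 0]

counts = rng.poisson(lam=0.8, size=100_000)   # simulated photon counts per jot exposure
raw_bits = counts & 1                         # parity of each count as a (biased) raw bit
random_bits = von_neumann_extract(raw_bits)
print(random_bits.size, random_bits[:16])
```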
NASA Astrophysics Data System (ADS)
Walrand, Stephan; Hesse, Michel; Jamar, François; Lhommel, Renaud
2018-04-01
Our literature survey revealed a physical effect unknown to the nuclear medicine community, i.e. internal bremsstrahlung emission, and also the existence of long energy resolution tails in crystal scintillation. Neither of these effects has ever been modelled in PET Monte Carlo (MC) simulations. This study investigates whether these two effects could be at the origin of two unexplained observations in 90Y imaging by PET: the increasing tails in the radial profile of true coincidences, and the presence of spurious extrahepatic counts post radioembolization in non-TOF PET and their absence in TOF PET. These spurious extrahepatic counts hamper the microsphere delivery check in liver radioembolization. An acquisition of a 32P vial was performed on a GSO PET system; this is the ideal setup to study the impact of bremsstrahlung x-rays on the true coincidence rate when no positron emission and no crystal radioactivity are present. An MC simulation of the acquisition was performed using Gate-Geant4. MC simulations of non-TOF PET and TOF PET imaging of a synthetic 90Y human liver radioembolization phantom were also performed. Including internal bremsstrahlung and long energy resolution tails in the MC simulations quantitatively predicts the increasing tails in the radial profile. In addition, internal bremsstrahlung explains the discrepancy previously observed in bremsstrahlung SPECT between the measured 90Y bremsstrahlung spectrum and its simulation with Gate-Geant4. However, the spurious extrahepatic counts in non-TOF PET mainly result from the failure of conventional random correction methods in such low count rate studies and their poor robustness to emission-transmission inconsistency. A novel proposed random correction method succeeds in cleaning the spurious extrahepatic counts in non-TOF PET. In summary, two physical effects not previously considered in nuclear medicine were identified to be at the origin of the unusual 90Y true coincidence radial profile. The removal of the spurious extrahepatic counts by TOF reconstruction was theoretically explained by a better robustness against emission-transmission inconsistency. A novel random correction method was proposed to overcome the issue in non-TOF PET. Further studies are needed to assess the robustness of the novel random correction method.
Weight Control Intervention for Truck Drivers: The SHIFT Randomized Controlled Trial, United States
Wipfli, Brad; Thompson, Sharon V.; Elliot, Diane L.; Anger, W. Kent; Bodner, Todd; Hammer, Leslie B.; Perrin, Nancy A.
2016-01-01
Objectives. To evaluate the effectiveness of the Safety and Health Involvement For Truckers (SHIFT) intervention with a randomized controlled design. Methods. The multicomponent intervention was a weight-loss competition supported with body weight and behavioral self-monitoring, computer-based training, and motivational interviewing. We evaluated intervention effectiveness with a cluster-randomized design involving 22 terminals from 5 companies in the United States in 2012 to 2014. Companies were required to provide interstate transportation services and operate at least 2 larger terminals. We randomly assigned terminals to intervention or usual-practice control conditions. We assessed participating drivers (n = 452) at baseline and 6 months. Results. In an intent-to-treat analysis, the postintervention difference between groups in mean body mass index change was 1.00 kilogram per meter squared (P < .001; intervention = −0.73; control = +0.27). Behavioral changes included statistically significant improvements in fruit and vegetable consumption and physical activity. Conclusions. Results establish the effectiveness of a multicomponent and remotely administered intervention for producing significant weight loss among commercial truck drivers. PMID:27463067
Efficacy of abstinence promotion media messages: findings from an online randomized trial.
Evans, W Douglas; Davis, Kevin C; Ashley, Olivia Silber; Blitstein, Jonathan; Koo, Helen; Zhang, Yun
2009-10-01
We conducted an online randomized experiment to evaluate the efficacy of messages from the Parents Speak Up National Campaign (PSUNC) to promote parent-child communication about sex. We randomly assigned a national sample of 1,969 mothers and fathers to treatment (PSUNC exposure) and control (no exposure) conditions. Mothers were further randomized into treatment and booster (additional messages) conditions to evaluate dose-response effects. Participants were surveyed at baseline, 4 weeks postexposure, and 6 months postexposure. We used multivariable logistic regression procedures in our analysis. Treatment fathers were more likely than control fathers to initiate conversations about sex at 4 weeks, and treatment fathers and mothers were more likely than controls at 6 months to recommend that their children wait to have sex. Treatment fathers and mothers were far more likely than controls to use the campaign Web site. There was a dose-response effect for mothers' Web site use. Using new media methods, this study shows that PSUNC messages are efficacious in promoting parent-child communication about sex and abstinence. Future research should evaluate mechanisms and effectiveness in natural settings.
Application of random seismic inversion method based on tectonic model in thin sand body research
NASA Astrophysics Data System (ADS)
Dianju, W.; Jianghai, L.; Qingkai, F.
2017-12-01
Oil and gas exploitation in the Songliao Basin, Northeast China, has already progressed to a period of high water production. Previous detailed reservoir descriptions based on seismic images, sediment cores, and borehole logging have great limitations for small-scale structural interpretation and thin sand body characterization. Thus, precise guidance for petroleum exploration badly needs a more advanced method. To this end, we derived a random seismic inversion method constrained by a tectonic model. Combined with numerical simulation techniques, it can effectively improve the ability to depict thin sand bodies and credibly reduce the blindness of reservoir analysis, from the whole to the local and from the macroscopic to the microscopic. At the same time, it can reduce the limitations of studies conducted under different geological conditions of the reservoir and achieve a more accurate estimation of the effective reservoir. Based on this research, this paper optimizes regional effective reservoir evaluation and the adjustment of production locations, in combination with practical exploration and development in the Aonan oil field.
PNF and manual therapy treatment results of patients with cervical spine osteoarthritis.
Maicki, Tomasz; Bilski, Jan; Szczygieł, Elżbieta; Trąbka, Rafał
2017-09-22
The aim of this study was to evaluate the effectiveness of the PNF and manual therapy methods in the treatment of patients with cervical spine osteoarthritis, especially their efficacy in reducing pain and improving functionality in everyday life. Long-term results were also compared in order to determine which method of treatment is more effective. Eighty randomly selected females aged 45-65 were included in the study and randomly divided into two groups of 40. One group received PNF treatment and the other received manual therapy (MAN.T). The Functional Rating Index was used to evaluate functional capabilities, and a shortened version of the McGill Questionnaire was used to evaluate changes in pain. The PNF group achieved a greater reduction in pain than the MAN.T group. The PNF group also showed a greater improvement in performing daily activities such as sleeping, personal care, travelling, work, recreation, lifting, walking and standing, as well as decreased intensity and frequency of pain, compared to the MAN.T group. The PNF method proved to be more effective in both the short term (after two weeks) and the long term (after three months).
Transport properties of random media: A new effective medium theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Busch, K.; Soukoulis, C.M.
We present a new method for efficient, accurate calculations of transport properties of random media. It is based on the principle that the wave energy density should be uniform when averaged over length scales larger than the size of the scatterers. This scheme captures the effects of resonant scattering of the individual scatterer exactly, as well as the multiple scattering in a mean-field sense. It has been successfully applied to both "scalar" and "vector" classical wave calculations. Results for the energy transport velocity are in agreement with experiment. This approach is of general use and can be easily extended to treat different types of wave propagation in random media. © 1995 The American Physical Society.
Rousseau, Bernard; Gutmann, Michelle L.; Mau, I-fan Theodore; Francis, David O.; Johnson, Jeffrey P.; Novaleski, Carolyn K.; Vinson, Kimberly N.; Garrett, C. Gaelyn
2015-01-01
Objective This randomized trial investigated voice rest and supplemental text-to-speech communication versus voice rest alone on visual analog scale measures of communication effectiveness and magnitude of voice use. Study Design Randomized clinical trial. Setting Multicenter outpatient voice clinics. Subjects Thirty-seven patients undergoing phonomicrosurgery. Methods Patients undergoing phonomicrosurgery were randomized to voice rest and supplemental text-to-speech communication or voice rest alone. The primary outcome measure was the impact of voice rest on ability to communicate effectively over a seven-day period. Pre- and post-operative magnitude of voice use was also measured as an observational outcome. Results Patients randomized to voice rest and supplemental text-to-speech communication reported higher median communication effectiveness on each post-operative day compared to those randomized to voice rest alone, with significantly higher median communication effectiveness on post-operative day 3 (p = 0.03) and 5 (p = 0.01). Magnitude of voice use did not differ on any pre-operative (p > 0.05) or post-operative day (p > 0.05), nor did patients significantly decrease voice use as the surgery date approached (p > 0.05). However, there was a significant reduction in median voice use pre- to post-operatively across patients (p < 0.001) with median voice use ranging from 0–3 throughout the post-operative week. Conclusion Supplemental text-to-speech communication increased patient perceived communication effectiveness on post-operative days 3 and 5 over voice rest alone. With the prevalence of smartphones and the widespread use of text messaging, supplemental text-to-speech communication may provide an accessible and cost-effective communication option for patients on vocal restrictions. PMID:25605690
Chen, Minghao; Wei, Shiyou; Hu, Junyan; Yuan, Jing; Liu, Fenghua
2017-01-01
The present study aimed to review the available evidence assessing whether time-lapse imaging (TLI) has favorable outcomes for embryo incubation and selection compared with conventional methods in clinical in vitro fertilization (IVF). PubMed, EMBASE, the Cochrane Library and ClinicalTrials.gov were searched up to February 2017 for randomized controlled trials (RCTs) comparing TLI versus conventional methods. Studies that randomized either women or oocytes were included. For studies that randomized women, the primary outcomes were live birth and ongoing pregnancy, and the secondary outcomes were clinical pregnancy and miscarriage; for studies that randomized oocytes, the primary outcome was blastocyst rate and the secondary outcome was good-quality embryo on Day 2/3. Subgroup analysis was conducted based on differences in incubation and embryo selection between groups. Ten RCTs were included, four randomizing oocytes and six randomizing women. For the oocyte-based review, the pooled analysis observed no significant difference between the TLI group and the control group in blastocyst rate [relative risk (RR) 1.08, 95% CI 0.94-1.25, I2 = 0%, two studies, including 1154 embryos]. The quality of evidence was moderate for all outcomes in the oocyte-based review. For the woman-based review, only one study provided the live birth rate (RR 1.23, 95% CI 1.06-1.44, I2 N/A, one study, including 842 women), and the pooled result showed no significant difference in ongoing pregnancy rate (RR 1.04, 95% CI 0.80-1.36, I2 = 59%, four studies, including 1403 women) between the two groups. The quality of the evidence was low or very low for all outcomes in the woman-based review. Currently there is insufficient evidence to support that TLI is superior to conventional methods for human embryo incubation and selection. In view of the limitations and flaws of the included studies, more well-designed RCTs are still needed to comprehensively evaluate the effectiveness of clinical TLI use.
Model-Mapped RPA for Determining the Effective Coulomb Interaction
NASA Astrophysics Data System (ADS)
Sakakibara, Hirofumi; Jang, Seung Woo; Kino, Hiori; Han, Myung Joon; Kuroki, Kazuhiko; Kotani, Takao
2017-04-01
We present a new method to obtain a model Hamiltonian from first-principles calculations. The effective interaction contained in the model is determined on the basis of random phase approximation (RPA). In contrast to previous methods such as projected RPA and constrained RPA (cRPA), the new method named "model-mapped RPA" takes into account the long-range part of the polarization effect to determine the effective interaction in the model. After discussing the problems of cRPA, we present the formulation of the model-mapped RPA, together with a numerical test for the single-band Hubbard model of HgBa2CuO4.
Biases and power for groups comparison on subjective health measurements.
Hamel, Jean-François; Hardouin, Jean-Benoit; Le Neel, Tanguy; Kubis, Gildas; Roquelaure, Yves; Sébille, Véronique
2012-01-01
Subjective health measurements are increasingly used in clinical research, particularly for comparisons of patient groups. Two main types of analytical strategies can be used for such data: so-called classical test theory (CTT), relying on observed scores, and models coming from Item Response Theory (IRT), relying on a response model relating the item responses to a latent parameter, often called the latent trait. Whether IRT or CTT would be the most appropriate method to compare two independent groups of patients on a patient-reported outcome measurement remains unknown and was investigated using simulations. For CTT-based analyses, groups comparison was performed using a t-test on the scores. For IRT-based analyses, several methods were compared, according to whether the Rasch model was considered with random effects or with fixed effects, and whether the group effect was included as a covariate or not. Individual latent trait values were estimated using either a deterministic method or stochastic approaches, and the latent traits were then compared with a t-test. Finally, a two-step method was performed to compare the latent trait distributions, and a Wald test was performed to test the group effect in the Rasch model including group covariates. The only unbiased IRT-based method was the Wald test on the group covariate, performed on the random-effects Rasch model. This model displayed the highest observed power, which was similar to the power of the score t-test. These results need to be extended to the case frequently encountered in practice where data are missing and possibly informative.
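A minimal sketch in Python of the CTT arm of such a comparison, assuming simulated dichotomous responses from a Rasch model with an illustrative group shift of 0.5 and ten evenly spaced item difficulties (all values are assumptions for the demo):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n_per_group = 200
difficulties = np.linspace(-1.5, 1.5, 10)                    # item difficulty parameters
theta = np.concatenate([rng.normal(0.0, 1.0, n_per_group),   # control group latent traits
                        rng.normal(0.5, 1.0, n_per_group)])  # shifted group latent traits
group = np.repeat([0, 1], n_per_group)

# Rasch model: P(correct) = logistic(theta - difficulty)
p = 1.0 / (1.0 + np.exp(-(theta[:, None] - difficulties[None, :])))
responses = rng.random(p.shape) < p

# CTT-style analysis: compare observed sum scores with a t-test
scores = responses.sum(axis=1)
t, pval = stats.ttest_ind(scores[group == 1], scores[group == 0])
print(f"t = {t:.2f}, p = {pval:.4f}")
```

The IRT-based alternatives discussed above would instead fit a random-effects Rasch model with a group covariate (for example in a dedicated mixed-model or IRT package) and test the group effect with a Wald test.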
Martinussen, Torben; Vansteelandt, Stijn; Tchetgen Tchetgen, Eric J; Zucker, David M
2017-12-01
The use of instrumental variables for estimating the effect of an exposure on an outcome is popular in econometrics, and increasingly so in epidemiology. This increasing popularity may be attributed to the natural occurrence of instrumental variables in observational studies that incorporate elements of randomization, either by design or by nature (e.g., random inheritance of genes). Instrumental variables estimation of exposure effects is well established for continuous outcomes and to some extent for binary outcomes. It is, however, largely lacking for time-to-event outcomes because of complications due to censoring and survivorship bias. In this article, we make a novel proposal under a class of structural cumulative survival models which parameterize time-varying effects of a point exposure directly on the scale of the survival function; these models are essentially equivalent with a semi-parametric variant of the instrumental variables additive hazards model. We propose a class of recursive instrumental variable estimators for these exposure effects, and derive their large sample properties along with inferential tools. We examine the performance of the proposed method in simulation studies and illustrate it in a Mendelian randomization study to evaluate the effect of diabetes on mortality using data from the Health and Retirement Study. We further use the proposed method to investigate potential benefit from breast cancer screening on subsequent breast cancer mortality based on the HIP-study. © 2017, The International Biometric Society.
JahaniShoorab, Nahid; Ebrahimzadeh Zagami, Samira; Nahvi, Ali; Mazluom, Seyed Reza; Golmakani, Nahid; Talebi, Mahdi; Pabarja, Ferial
2015-01-01
Background Pain is one of the side effects of episiotomy. Virtual reality (VR) is a non-pharmacological method for pain relief. The purpose of this study was to determine the effect of using video glasses on pain reduction in primiparous women during episiotomy repair. Methods This clinical trial was conducted on 30 primiparous parturient women giving birth at Omolbanin Hospital (Mashhad, Iran) during May-July 2012. During episiotomy repair, participants were randomly divided into two equal groups. The intervention group received the usual treatment plus VR (video glasses, with local infiltration of 5 ml of 2% lidocaine solution), and the control group received only local infiltration (5 ml of 2% lidocaine solution). Pain was measured using the Numeric Pain Rating Scale (0-100 scale) before, during and after the episiotomy repair. Data were analyzed using Fisher's exact test, chi-square, Mann-Whitney and repeated measures ANOVA tests with SPSS 11.5 software. Results There was a statistically significant difference between the pain scores of the two groups during episiotomy repair (P=0.038). Conclusion Virtual reality is an effective complementary non-pharmacological method to reduce pain during episiotomy repair. Trial Registration Number: IRCT138811063185N1. PMID:25999621
Moreno-Segura, Noemi; Igual-Camacho, Celedonia; Ballester-Gil, Yéntel; Blasco-Igual, María Clara; Blasco, Jose María
2018-04-01
Exercising with the Pilates method may be a beneficial treatment to improve balance and decrease the number of falls. To ascertain this, our search in 7 databases included 15 randomized controlled trials in which Pilates was the primary intervention. Participants were over 60 years of age; the outcomes were related to balance and falls. The Cochrane tool and PEDro scale were used to assess risk of bias and quality of individual studies. Current evidence supported the view that exercising with the Pilates method improves the balance of older adults with a high practical effect in terms of the dynamic (SMD = 0.75 [0.17;1.32]), static (SMD = 1.33 [0.53;2.13]), and overall balance (SMD = 0.96[0.00;1.91]). Pilates also produced greater improvements with a moderate effect in terms of the dynamic (SMD = 0.37[-0.36;1.11]) and overall balance (SMD = 0.58[0.19;0.96]) compared to other training approaches oriented to the same end. Literature evaluating the effects on falls is scarce, and results were not conclusive.
Zhu, Qiaohao; Carriere, K C
2016-01-01
Publication bias can significantly limit the validity of meta-analysis when trying to draw conclusions about a research question from independent studies. Most research on detection of and correction for publication bias in meta-analysis focuses mainly on funnel plot-based methodologies or selection models. In this paper, we formulate publication bias as a truncated distribution problem and propose new parametric solutions. We develop methodologies for estimating the underlying overall effect size and the severity of publication bias. We distinguish two major situations, in which publication bias may be induced by (1) small effect size or (2) large p-value. We consider both fixed- and random-effects models, and derive estimators for the overall mean and the truncation proportion. These estimators are obtained using maximum likelihood estimation and the method of moments under fixed- and random-effects models, respectively. We carried out extensive simulation studies to evaluate the performance of our methodology and to compare it with the non-parametric trim-and-fill method based on the funnel plot. We find that our methods based on the truncated normal distribution perform consistently well, both in detecting and in correcting publication bias under various situations.
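A minimal sketch in Python of the truncated-normal idea under a fixed-effect model, assuming that studies are published only when z = y/se > 1.96; the true effect of 0.2, the spread of standard errors, and the selection rule are illustrative assumptions, not the paper's estimators:

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)

# simulate studies with a common true effect and selective publication
mu_true = 0.2
se = rng.uniform(0.05, 0.3, size=2000)
y = rng.normal(mu_true, se)
published = y / se > 1.96                        # only "significant" studies are published
y_obs, se_obs = y[published], se[published]

def neg_loglik(mu):
    cut = 1.96 * se_obs                          # publication threshold on the effect scale
    dens = stats.norm.pdf(y_obs, mu, se_obs)
    tail = stats.norm.sf(cut, mu, se_obs)        # normalizing constant of the truncated density
    return -np.sum(np.log(dens / tail))

res = optimize.minimize_scalar(neg_loglik, bounds=(-1.0, 1.0), method="bounded")
naive = np.average(y_obs, weights=1.0 / se_obs**2)   # inverse-variance pooling, ignores truncation
print(f"naive pooled = {naive:.3f}, truncated-normal MLE = {res.x:.3f}")
```

The truncated-likelihood estimate should sit much closer to the simulated true effect than the naive pooled estimate, which ignores the selection.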
Vincenzi, Simone; Mangel, Marc; Crivelli, Alain J.; Munch, Stephan; Skaug, Hans J.
2014-01-01
The differences in demographic and life-history processes between organisms living in the same population have important consequences for ecological and evolutionary dynamics. Modern statistical and computational methods allow the investigation of individual and shared (among homogeneous groups) determinants of the observed variation in growth. We use an Empirical Bayes approach to estimate individual and shared variation in somatic growth using a von Bertalanffy growth model with random effects. To illustrate the power and generality of the method, we consider two populations of marble trout Salmo marmoratus living in Slovenian streams, where individually tagged fish have been sampled for more than 15 years. We use year-of-birth cohort, population density during the first year of life, and individual random effects as potential predictors of the von Bertalanffy growth function's parameters k (rate of growth) and L∞ (asymptotic size). Our results showed that size ranks were largely maintained throughout marble trout lifetimes in both populations. According to the Akaike Information Criterion (AIC), the best models showed different growth patterns for year-of-birth cohorts as well as the existence of substantial individual variation in growth trajectories after accounting for the cohort effect. For both populations, models including density during the first year of life showed that growth tended to decrease with increasing population density early in life. Model validation showed that predictions of individual growth trajectories using the random-effects model were more accurate than predictions based on the mean size-at-age of fish. PMID:25211603
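A minimal sketch in Python of the growth function with individual random effects on k and the asymptotic size (the population means, lognormal variances, and ages shown are illustrative assumptions, not the fitted marble trout values):

```python
import numpy as np

rng = np.random.default_rng(0)

def von_bertalanffy(t, L_inf, k, t0=0.0):
    """von Bertalanffy growth: length at age t."""
    return L_inf * (1.0 - np.exp(-k * (t - t0)))

ages = np.arange(1, 8)
# individual-level (random-effect) draws around illustrative population means
k_i = np.exp(rng.normal(np.log(0.35), 0.15, size=5))      # individual growth rates
Linf_i = np.exp(rng.normal(np.log(300.0), 0.10, size=5))  # individual asymptotic sizes (mm)

trajectories = np.array([von_bertalanffy(ages, L, k) for L, k in zip(Linf_i, k_i)])
print(trajectories.round(1))
```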
NASA Astrophysics Data System (ADS)
Ma, L. X.; Tan, J. Y.; Zhao, J. M.; Wang, F. Q.; Wang, C. A.; Wang, Y. Y.
2017-07-01
Due to dependent scattering and absorption effects, the radiative transfer equation (RTE) may not be suitable for dealing with radiative transfer in dense discrete random media. This paper continues previous research on multiple and dependent scattering in densely packed discrete particle systems, with emphasis on the effects of the particle complex refractive index. The Mueller matrix elements of scattering systems with different complex refractive indexes are obtained by both an electromagnetic method and a radiative transfer method. The Maxwell equations are directly solved with the superposition T-matrix method, while the RTE is solved by the Monte Carlo method combined with the hard sphere model in the Percus-Yevick approximation (HSPYA) to account for dependent scattering effects. The results show that for densely packed discrete random media composed of particles of medium size parameter (6.964 in this study), the demarcation line between independent and dependent scattering has remarkable connections with the particle complex refractive index. As the particle volume fraction increases to a certain value, densely packed discrete particles with higher refractive index contrasts between the particles and the host medium and higher particle absorption indexes are more likely to show stronger dependent characteristics. Due to the failure of the extended Rayleigh-Debye scattering condition, the HSPYA has a weak effect on the dependent scattering correction at large phase shift parameters.
Adaptive threshold shearlet transform for surface microseismic data denoising
NASA Astrophysics Data System (ADS)
Tang, Na; Zhao, Xian; Li, Yue; Zhu, Dan
2018-06-01
Random noise suppression plays an important role in microseismic data processing. Microseismic data are often corrupted by strong random noise, which directly influences the identification and location of microseismic events. The shearlet transform is a new multiscale transform that can effectively process low-magnitude microseismic data. In the shearlet domain, because valid signals and random noise have different distributions, shearlet coefficients can be shrunk by thresholding, so the choice of threshold is vital for suppressing random noise. Conventional threshold denoising algorithms usually apply the same threshold to all coefficients, which causes either inefficient noise suppression or loss of valid signal. To solve these problems, we propose the adaptive threshold shearlet transform (ATST) for surface microseismic data denoising. In the new algorithm, we first calculate a fundamental threshold for each direction subband. In each direction subband, an adjustment factor is obtained from each subband coefficient and its neighboring coefficients, in order to adaptively regulate the fundamental threshold for different shearlet coefficients. Finally, we apply the adaptive threshold to the different shearlet coefficients. Denoising experiments on synthetic records and field data illustrate that the proposed method performs better in suppressing random noise and preserving valid signal than the conventional shearlet denoising method.
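A minimal sketch in Python of the neighborhood-adaptive thresholding step only (the shearlet decomposition itself would come from a dedicated library); the universal base threshold and the 3x3-neighborhood adjustment rule are illustrative assumptions, not the ATST formulas:

```python
import numpy as np

rng = np.random.default_rng(0)

def adaptive_soft_threshold(coeffs, sigma):
    """Soft-threshold a subband with a locally adjusted threshold."""
    base = sigma * np.sqrt(2.0 * np.log(coeffs.size))       # universal threshold as the base
    # mean absolute coefficient in a 3x3 neighborhood measures local signal strength
    padded = np.pad(np.abs(coeffs), 1, mode="edge")
    local = np.mean([padded[i:i + coeffs.shape[0], j:j + coeffs.shape[1]]
                     for i in range(3) for j in range(3)], axis=0)
    thresh = base * sigma / (sigma + local)                  # lower the threshold where signal is strong
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - thresh, 0.0)

# toy subband: Gaussian noise plus a block of stronger "valid signal" coefficients
subband = rng.normal(0.0, 0.1, (64, 64))
subband[20:28, 30:50] += 1.0
denoised = adaptive_soft_threshold(subband, sigma=0.1)
print(np.abs(denoised).max(), np.count_nonzero(denoised))
```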
Shortreed, Susan M.; Moodie, Erica E. M.
2012-01-01
Treatment of schizophrenia is notoriously difficult and typically requires personalized adaptation of treatment due to lack of efficacy, poor adherence, or intolerable side effects. The Clinical Antipsychotic Trials in Intervention Effectiveness (CATIE) Schizophrenia Study is a sequential multiple assignment randomized trial comparing the typical antipsychotic medication, perphenazine, to several newer atypical antipsychotics. This paper describes the marginal structural modeling method for estimating optimal dynamic treatment regimes and applies the approach to the CATIE Schizophrenia Study. Missing data and valid estimation of confidence intervals are also addressed. PMID:23087488
Studies in astronomical time series analysis. I - Modeling random processes in the time domain
NASA Technical Reports Server (NTRS)
Scargle, J. D.
1981-01-01
Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures of model construction, computational methods, and numerical experiments. A FORTRAN algorithm of time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effect of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 272 is considered as an example.
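A minimal sketch in Python of the simplest autoregressive case, an AR(1) process, with a least-squares estimate of its coefficient (the coefficient value 0.7 and the series length are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

n, phi = 2000, 0.7
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()   # AR(1): new noise pulse plus memory of the last value

# least-squares (conditional) estimate of the autoregressive coefficient
phi_hat = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])
print(f"estimated phi = {phi_hat:.3f}")
```

A moving average model would instead express each observation as a weighted sum of the current and past noise pulses.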
Inferring animal densities from tracking data using Markov chains.
Whitehead, Hal; Jonsen, Ian D
2013-01-01
The distributions and relative densities of species are key to ecology. Large amounts of tracking data are being collected on a wide variety of animal species using several methods, especially electronic tags that record location. These tracking data are effectively used for many purposes, but generally provide biased measures of distribution, because the starting points of the tracks are not randomly distributed among the locations used by the animals. We introduce a simple Markov-chain method that produces unbiased measures of relative density from tracking data. The density estimates can be over a geographical grid, and/or relative to environmental measures. The method assumes that the tracked animals are a random subset of the population with respect to how they move through the habitat cells, and that the movements of the animals among the habitat cells form a time-homogeneous Markov chain. We illustrate the method using simulated data as well as real data on the movements of sperm whales. The simulations illustrate the bias introduced when the initial tracking locations are not randomly distributed, as well as the lack of bias when the Markov method is used. We believe that this method will be important in giving unbiased estimates of density from the growing corpus of animal tracking data.
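A minimal sketch in Python of the core computation, assuming tracks already discretized into habitat cells (the toy tracks and three-cell grid are illustrative); the relative density is taken as the stationary distribution of the empirical transition matrix:

```python
import numpy as np

def relative_density(tracks, n_cells):
    """Relative density over habitat cells from the tracks' Markov transition matrix."""
    T = np.zeros((n_cells, n_cells))
    for track in tracks:
        for a, b in zip(track[:-1], track[1:]):
            T[a, b] += 1.0                        # count observed cell-to-cell transitions
    T = T / T.sum(axis=1, keepdims=True)          # row-normalize to transition probabilities
    # stationary distribution = leading left eigenvector of T
    vals, vecs = np.linalg.eig(T.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    return pi / pi.sum()

tracks = [[0, 1, 1, 2, 1, 0, 1], [2, 1, 1, 0, 1, 2, 2]]  # toy discretized tracks
print(relative_density(tracks, n_cells=3))
```

Because the stationary distribution does not depend on where the chains start, the estimate is insensitive to the non-random placement of the tracks' starting points described above.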
Soheili, Mozhgan; Nazari, Fatemeh; Shaygannejad, Vahid; Valiani, Mahboobeh
2017-01-01
Background: Multiple sclerosis (MS) presents with a variety of physical and psychological symptoms, yet there is no conclusive cure for the disease. Complementary medicine is a current treatment approach that appears effective in relieving symptoms in patients with MS. This study therefore aimed to determine and compare the effects of reflexology and relaxation on anxiety, stress, and depression in women with MS. Subjects and Methods: This randomized clinical trial included 75 women with MS referred to the MS Clinic of Kashani Hospital. After nonrandom sampling, participants were randomly assigned by the minimization method to three groups: reflexology, relaxation, and control (25 patients in each group). In the experimental groups, reflexology and relaxation interventions were performed twice a week for 40 min over 4 weeks; the control group received only routine treatment as directed by a physician. Data were collected with the Depression Anxiety and Stress Scale questionnaire before, immediately after, and 2 months after the interventions in all three groups. Chi-square, Kruskal-Wallis, repeated-measures analysis of variance, one-way analysis of variance, and least significant difference post hoc tests were performed with SPSS version 18; P < 0.05 was considered significant. Results: The results showed a significant reduction in the severity of anxiety, stress, and depression over time in the reflexology and relaxation groups compared with the control group (P < 0.05). Conclusion: Reflexology and relaxation are effective in relieving anxiety, stress, and depression in women with MS, and these two techniques can be recommended. PMID:28546976
NASA Astrophysics Data System (ADS)
Henri, Christopher; Fernàndez-Garcia, Daniel
2015-04-01
Modeling multi-species reactive transport in natural systems with strong heterogeneities and complex biochemical reactions is a major challenge for assessing polluted groundwater sites containing organic and inorganic contaminants. A large variety of these contaminants react according to serial-parallel reaction networks that are commonly simplified by a combination of first-order kinetic reactions. In this context, a random-walk particle tracking method is presented that can efficiently simulate the motion of particles affected by first-order network reactions in three-dimensional systems represented by spatially variable physical and biochemical coefficients described at high resolution. The approach is based on the development of transition probabilities that describe the likelihood that particles belonging to a given species and location at a given time will be transformed into and moved to another species and location afterwards. These probabilities are derived from the solution matrix of the spatial-moment governing equations. The method is fully coupled with reactions, free of numerical dispersion, and overcomes the inherent numerical problems stemming from incorporating heterogeneities into reactive transport codes. In doing so, we demonstrate that the motion of particles follows a standard random walk with time-dependent effective retardation and dispersion parameters that depend on the initial and final chemical state of the particle. The behavior of these effective parameters arises from differential retardation effects among species. Moreover, explicit analytic solutions of the transition probability matrix and the related particle motions are provided for serial reactions. An example of the effect of heterogeneity on the dechlorination of organic solvents in a three-dimensional random porous medium shows that the power-law behavior typically observed in conservative tracer breakthrough curves can be largely compromised by the effect of biochemical reactions.
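A heavily simplified Python sketch of the species-transition idea: the transition probability matrix over one time step is the matrix exponential of a first-order reaction network, and each particle takes an advective-dispersive random-walk step followed by a sampled species change. The rates, the 1-D homogeneous medium and the constant coefficients are illustrative assumptions; the authors' method treats 3-D heterogeneous fields and derives transitions from the spatial-moment equations.

    import numpy as np
    from scipy.linalg import expm

    # Hypothetical serial network A -> B -> C with first-order rates (1/day);
    # columns of K sum to zero, so expm(K*dt) has columns summing to one.
    K = np.array([[-0.05,  0.00, 0.0],
                  [ 0.05, -0.02, 0.0],
                  [ 0.00,  0.02, 0.0]])
    dt, v, D = 1.0, 0.5, 0.05
    P = expm(K * dt)           # P[i, j] = Pr(species i at t+dt | species j at t)

    rng = np.random.default_rng(1)
    n = 10_000
    x = np.zeros(n)            # particle positions (1-D for brevity)
    s = np.zeros(n, dtype=int) # particle species, all starting as A

    for _ in range(200):
        # advective-dispersive random-walk step
        x += v * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal(n)
        # species transition: sample the next species from column s of P
        cum = np.cumsum(P[:, s], axis=0)
        s = np.minimum((rng.random(n) > cum).sum(axis=0), 2)

    print(np.bincount(s, minlength=3) / n)   # approximate species fractions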
ERIC Educational Resources Information Center
Rogers, Sally J.; Estes, Annette; Lord, Catherine; Vismara, Laurie; Winter, Jamie; Fitzpatrick, Annette; Guo, Mengye; Dawson, Geraldine
2012-01-01
Objective: This study was carried out to examine the efficacy of a 12-week, low-intensity (1-hour/wk of therapist contact), parent-delivered intervention for toddlers at risk for autism spectrum disorders (ASD) aged 14 to 24 months and their families. Method: A randomized controlled trial involving 98 children and families was carried out in three…
Research on user behavior authentication model based on stochastic Petri nets
NASA Astrophysics Data System (ADS)
Zhang, Chengyuan; Xu, Haishui
2017-08-01
A behavioural authentication model based on stochastic Petri nets is proposed to capture the randomness, uncertainty, and concurrency of user behaviour. The places, transitions, arcs, and tokens of the stochastic model are used to describe various authentication and game relationships, allowing the user behaviour authentication model to be analysed graphically; the accompanying proofs verify that the model is valuable.
Finite-time stability of neutral-type neural networks with random time-varying delays
NASA Astrophysics Data System (ADS)
Ali, M. Syed; Saravanan, S.; Zhu, Quanxin
2017-11-01
This paper is devoted to the finite-time stability analysis of neutral-type neural networks with random time-varying delays. The randomly time-varying delays are characterised by a Bernoulli stochastic variable, and the results can be extended to the analysis and design of neutral-type neural networks with random time-varying delays. We construct a suitable Lyapunov-Krasovskii functional and establish a set of sufficient conditions, expressed as linear matrix inequalities, that guarantee finite-time stability of the system concerned. The conditions are derived using Jensen's inequality, the free-weighting matrix method, and Wirtinger's double integral inequality, and two numerical examples demonstrate the effectiveness of the developed techniques.
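As a rough illustration of how LMI-based sufficient conditions of this kind are checked numerically, the toy Python/cvxpy sketch below verifies the classical Lyapunov inequality for a hypothetical stable matrix A; the paper's actual delay-dependent, neutral-type conditions are considerably more involved but are tested in the same feasibility-problem form.

    import numpy as np
    import cvxpy as cp

    A = np.array([[-2.0, 1.0],
                  [ 0.0, -1.5]])        # hypothetical system matrix
    n = A.shape[0]
    P = cp.Variable((n, n), symmetric=True)
    eps = 1e-6
    constraints = [P >> eps * np.eye(n),                      # P positive definite
                   A.T @ P + P @ A << -eps * np.eye(n)]       # Lyapunov LMI
    prob = cp.Problem(cp.Minimize(0), constraints)
    prob.solve(solver=cp.SCS)
    print(prob.status)   # "optimal" means the LMI is feasible (stability certified)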
NASA Astrophysics Data System (ADS)
Tang, Li-Chuan; Hu, Guang W.; Russell, Kendra L.; Chang, Chen S.; Chang, Chi Ching
2000-10-01
We propose a new holographic memory scheme based on random phase-encoded multiplexing in a photorefractive LiNbO3:Fe crystal. Experimental results show that rotating a diffuser placed as a random phase modulator in the path of the reference beam provides a simple yet effective method of increasing the holographic storage capabilities of the crystal. Combining this rotational multiplexing with angular multiplexing offers further advantages. Storage capabilities can be optimized by using a post-image random phase plate in the path of the object beam. The technique is applied to a triple phase-encoded optical security system that takes advantage of the high angular selectivity of the angular-rotational multiplexing components.
Simental-Mendia, Luis E; Pirro, Matteo; Atkin, Stephen L; Banach, Maciej; Mikhailidis, Dimitri P; Sahebkar, Amirhossein
2018-01-01
Fibrinogen is a key mediator of thrombosis and has been implicated in the pathogenesis of atherosclerosis. Because metformin has shown a potential protective effect on different atherothrombotic risk factors, we assessed its effect on plasma fibrinogen concentrations in this meta-analysis. A systematic review and meta-analysis was carried out to identify randomized placebo-controlled trials evaluating the effect of metformin administration on fibrinogen levels. The search included the PubMed-Medline, Scopus, ISI Web of Knowledge and Google Scholar databases (to June 2, 2017), and study quality was assessed according to Cochrane criteria. Quantitative data synthesis was conducted using a random-effects model, with sensitivity analysis by the leave-one-out method. Meta-regression analysis was performed to assess modifiers of treatment response. Meta-analysis of data from 9 randomized placebo-controlled clinical trials with 2302 patients comprising 10 treatment arms did not suggest a significant change in plasma fibrinogen concentrations following metformin therapy (WMD: -0.25 g/L, 95% CI: -0.53, 0.04, p = 0.092). The effect size was robust in the leave-one-out sensitivity analysis and remained non-significant after omission of each single study. In summary, no significant effect of metformin on plasma fibrinogen concentrations was demonstrated in the current meta-analysis.
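For readers unfamiliar with the machinery, a minimal Python sketch of a DerSimonian-Laird random-effects pooling of weighted mean differences is given below; the trial effects and standard errors are made up for illustration and are not the data of this meta-analysis.

    import numpy as np
    from scipy import stats

    def random_effects_meta(y, se):
        # y: per-trial mean differences; se: their standard errors.
        y, se = np.asarray(y, float), np.asarray(se, float)
        w = 1.0 / se**2                                   # inverse-variance weights
        fixed = np.sum(w * y) / w.sum()
        Q = np.sum(w * (y - fixed)**2)                    # Cochran's Q
        tau2 = max(0.0, (Q - (len(y) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))
        w_re = 1.0 / (se**2 + tau2)                       # random-effects weights
        pooled = np.sum(w_re * y) / w_re.sum()
        se_pooled = np.sqrt(1.0 / w_re.sum())
        ci = pooled + np.array([-1.0, 1.0]) * stats.norm.ppf(0.975) * se_pooled
        return pooled, ci, tau2

    # illustrative (made-up) fibrinogen mean differences in g/L and their SEs
    wmd, ci, tau2 = random_effects_meta([-0.4, -0.1, 0.05, -0.3],
                                        [0.15, 0.20, 0.10, 0.25])
    print(f"WMD {wmd:.2f} g/L, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}], tau^2 = {tau2:.3f}")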
Yuan, Jing; Liu, Fenghua
2017-01-01
Objective: The present study aimed to review the available evidence on whether time-lapse imaging (TLI) yields more favorable outcomes for embryo incubation and selection than conventional methods in clinical in vitro fertilization (IVF). Methods: PubMed, EMBASE, the Cochrane Library and ClinicalTrials.gov were searched up to February 2017 for randomized controlled trials (RCTs) comparing TLI with conventional methods. Studies that randomized either women or oocytes were included. For studies that randomized women, the primary outcomes were live birth and ongoing pregnancy and the secondary outcomes were clinical pregnancy and miscarriage; for studies that randomized oocytes, the primary outcome was blastocyst rate and the secondary outcome was good-quality embryo on Day 2/3. Subgroup analysis was conducted based on differences in incubation and embryo selection between groups. Results: Ten RCTs were included, four that randomized oocytes and six that randomized women. For the oocyte-based review, the pooled analysis observed no significant difference between the TLI group and the control group in blastocyst rate [relative risk (RR) 1.08, 95% CI 0.94–1.25, I2 = 0%, two studies, including 1154 embryos]. The quality of evidence was moderate for all outcomes in the oocyte-based review. For the woman-based review, only one study reported live birth rate (RR 1.23, 95% CI 1.06–1.44, I2 N/A, one study, including 842 women), and the pooled result showed no significant difference in ongoing pregnancy rate (RR 1.04, 95% CI 0.80–1.36, I2 = 59%, four studies, including 1403 women) between the two groups. The quality of the evidence was low or very low for all outcomes in the woman-based review. Conclusions: There is currently insufficient evidence that TLI is superior to conventional methods for human embryo incubation and selection. Given the limitations and flaws of the included studies, more well-designed RCTs are needed to comprehensively evaluate the effectiveness of clinical TLI use. PMID:28570713
Nejati, Parisa; Ghahremaninia, Armita; Naderi, Farrokh; Gharibzadeh, Safoora; Mazaherinezhad, Ali
2017-05-01
Subacromial impingement syndrome (SAIS) is the most common disorder of the shoulder. The evidence for the effectiveness of treatment options is inconclusive and limited, so there is a need for more evidence, particularly on long-term outcomes. We hypothesized that platelet-rich plasma (PRP) would be an effective method for treating subacromial impingement. Randomized controlled trial; Level of evidence, 1. This was a single-blinded randomized clinical trial with 1-, 3-, and 6-month follow-up. Sixty-two patients were randomly assigned to 2 groups, receiving either PRP or exercise therapy. The outcome parameters were pain, shoulder range of motion (ROM), muscle force, functionality, and magnetic resonance imaging findings. Both treatments significantly reduced pain and increased shoulder ROM compared with baseline measurements, and both significantly improved functionality; neither was significantly effective in improving muscle force. Trend analysis revealed that in the first and third months, exercise therapy was superior to PRP for pain, shoulder flexion and abduction, and functionality. By the sixth month, only shoulder abduction and total Western Ontario Rotator Cuff score differed significantly between the 2 groups. Both PRP injection and exercise therapy were effective in reducing pain and disability in patients with SAIS, with exercise therapy proving more effective.
A proposed method to investigate reliability throughout a questionnaire
2011-01-01
Background: Questionnaires are used extensively in medical and health care research and depend on validity and reliability. However, participants' interest and awareness may vary throughout long questionnaires, which can affect the reliability of their answers. A method is proposed for screening for systematic change in random error, which could indicate changed reliability of answers. Methods: A simulation study was conducted to explore whether systematic change in reliability, expressed as changed random error, could be assessed using unsupervised classification of subjects by cluster analysis (CA) and estimation of the intraclass correlation coefficient (ICC). The method was also applied to a clinical dataset from 753 cardiac patients who completed the Jalowiec Coping Scale. Results: The simulation study showed a relationship between the systematic change in random error throughout a questionnaire and the slope between the ICC estimated for subjects classified by CA and successive items in the questionnaire. This slope was proposed as an awareness measure, assessing whether respondents provide only random answers or answers based on substantial cognitive effort. Scales from different factor structures of the Jalowiec Coping Scale had different effects on this awareness measure. Conclusions: Even though the assumptions of the simulation study may be limited compared with real datasets, the approach is promising for assessing systematic change in reliability throughout long questionnaires. Results from a clinical dataset indicated that the awareness measure differed between scales. PMID:21974842
Effects of random aspects of cutting tool wear on surface roughness and tool life
NASA Astrophysics Data System (ADS)
Nabil, Ben Fredj; Mabrouk, Mohamed
2006-10-01
The effects of random aspects of cutting tool flank wear on surface roughness and on tool lifetime when turning AISI 1045 carbon steel were studied in this investigation. It was found that the standard deviations of tool flank wear and of surface roughness increase exponentially with cutting time. Under cutting conditions corresponding to finishing operations, no significant differences were found between the values of the capability index Cp calculated at the steady-state region of tool flank wear using the best-fit method, the Box-Cox transformation, or the assumption that the surface roughness data are normally distributed. Hence, a method could be established for setting cutting tool lifetime that simultaneously respects the desired average surface roughness and the required capability index.
ERIC Educational Resources Information Center
Egan, Rylan G.
2012-01-01
Introduction: The following study investigates relationships between spaced practice (re-studying after a delay) and transfer of learning. Specifically, the impact on learners ability to transfer learning after participating in spaced model-building or unstructured study of narrated text. Method: Subjects were randomly assigned either to a…
Community of Inquiry Method and Language Skills Acquisition: Empirical Evidence
ERIC Educational Resources Information Center
Preece, Abdul Shakhour Duncan
2015-01-01
The study investigates the effectiveness of community of inquiry method in preparing students to develop listening and speaking skills in a sample of junior secondary school students in Borno state, Nigeria. A sample of 100 students in standard classes was drawn in one secondary school in Maiduguri metropolis through stratified random sampling…
Cognitive-Behavioral Therapies for Young People in Outpatient Treatment for Nonopioid Drug Use
ERIC Educational Resources Information Center
Filges, Trine; Jorgensen, Anne-Marie Klint
2018-01-01
Objectives: This review evaluates the evidence on the effects of cognitive-behavioral therapy (CBT) on drug use reduction for young people in treatment for nonopioid drug use. Method: We followed Campbell Collaboration guidelines to conduct a systematic review of randomized and nonrandomized trials. Meta-analytic methods were used to…
Application of Network Planning to Teaching Wind-Surfing
ERIC Educational Resources Information Center
Zybko, Przemyslaw; Jaczynowski, Lech
2008-01-01
Study aim: To determine the effects of network planning on teaching untrained subjects windsurfing. Material and methods: Untrained physical education students (n = 390), aged 19-23 years, took part in the study while staying on a summer camp. They were randomly assigned into two groups: experimental (n = 216) and control (n = 174). Two methods of…
Mynaugh, P A
1991-09-01
This study examined the effects of two methods of teaching perineal massage on the rates of practice of perineal massage, of episiotomy, and of lacerations in primiparas at birth. Couples in 20 randomly selected sections of four prenatal class series received either routine printed and verbal instruction plus a 12-minute video demonstration of perineal massage, or the routine printed and verbal instruction alone. Women reported their practice rates in daily diary records, which were mailed to the researcher weekly, and hospital records provided delivery data. Of the 83 women, 23 (28%) practiced perineal massage: 16 (35.6%) in the experimental group and 7 (18.4%) in the control group. Although the rate of practice almost doubled among experimental-group women, the effect of the videotape instruction method was not statistically significant. Episiotomy and laceration rates were not affected by teaching method. More severe lacerations occurred in the experimental group; however, the control group had almost four times as many severe (21%) as minor (5.3%) lacerations, whereas the experimental group had about twice as many severe (28.9%) as minor (13.3%) lacerations. These results were also nonsignificant.
Hedeker, D; Flay, B R; Petraitis, J
1996-02-01
Methods are proposed and described for estimating the degree to which relations among variables vary at the individual level. As an example of the methods, M. Fishbein and I. Ajzen's (1975; I. Ajzen & M. Fishbein, 1980) theory of reasoned action is examined, which posits first that an individual's behavioral intentions are a function of 2 components: the individual's attitudes toward the behavior and the subjective norms as perceived by the individual. A second component of their theory is that individuals may weight these 2 components differently in assessing their behavioral intentions. This article illustrates the use of empirical Bayes methods based on a random-effects regression model to estimate these individual influences, estimating an individual's weighting of both of these components (attitudes toward the behavior and subjective norms) in relation to their behavioral intentions. This method can be used when an individual's behavioral intentions, subjective norms, and attitudes toward the behavior are all repeatedly measured. In this case, the empirical Bayes estimates are derived as a function of the data from the individual, strengthened by the overall sample data.
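A small Python sketch of the general idea, using statsmodels' linear mixed model with random slopes: the per-person empirical Bayes (BLUP) deviations for the attitude and norm weights are read from the fitted model. The simulated data and variable names (person, attitude, norm, intention) are hypothetical and stand in for repeatedly measured constructs.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    rows = []
    for pid in range(50):                      # 50 individuals, 8 occasions each
        b_att = 0.6 + rng.normal(0, 0.3)       # person-specific attitude weight
        b_norm = 0.3 + rng.normal(0, 0.3)      # person-specific norm weight
        att, norm = rng.normal(0, 1, 8), rng.normal(0, 1, 8)
        intent = b_att * att + b_norm * norm + rng.normal(0, 0.5, 8)
        rows += [dict(person=pid, attitude=a, norm=s, intention=i)
                 for a, s, i in zip(att, norm, intent)]
    df = pd.DataFrame(rows)

    # random-effects regression with random slopes for both components
    model = smf.mixedlm("intention ~ attitude + norm", df,
                        groups=df["person"], re_formula="~attitude + norm")
    result = model.fit()
    print(result.fe_params)          # population-average weights
    print(result.random_effects[0])  # person 0's shrunken (empirical Bayes) deviations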
SAR Image Change Detection Based on Fuzzy Markov Random Field Model
NASA Astrophysics Data System (ADS)
Zhao, J.; Huang, G.; Zhao, Z.
2018-04-01
Most existing SAR image change detection algorithms consider only single-pixel information from the different images and ignore the spatial dependencies among image pixels, so the change detection results are susceptible to image noise and the detection effect is not ideal. A Markov random field (MRF) can make full use of the spatial dependence of image pixels and improve detection accuracy. When segmenting the difference image, regions of different categories show a high degree of similarity at their junctions, making it difficult to clearly distinguish the labels of pixels near these boundaries. In the traditional MRF method, each pixel is given a hard label during iteration; MRF is thus a hard-decision process and causes loss of information. This paper applies a combination of fuzzy theory and MRF to the change detection of SAR images. The experimental results show that the proposed method has a better detection effect than the traditional MRF method.
Reducing DNA context dependence in bacterial promoters
Carr, Swati B.; Densmore, Douglas M.
2017-01-01
Variation in the DNA sequence upstream of bacterial promoters is known to affect the expression levels of the products they regulate, sometimes dramatically. While neutral synthetic insulator sequences have been found to buffer promoters from upstream DNA context, there are no established methods for designing effective insulator sequences with predictable effects on expression levels. We address this problem with Degenerate Insulation Screening (DIS), a novel method based on a randomized 36-nucleotide insulator library and a simple, high-throughput, flow-cytometry-based screen that randomly samples from a library of 4^36 potential insulated promoters. The results of this screen can then be compared against a reference uninsulated device to select a set of insulated promoters providing a precise level of expression. We verify this method by insulating the constitutive, inducible, and repressible promoters of a four-transcriptional-unit inverter (NOT-gate) circuit, finding both that order dependence is largely eliminated by insulation and that circuit performance is significantly improved, with a 5.8-fold mean improvement in on/off ratio. PMID:28422998
ERIC Educational Resources Information Center
Hans, Eva; Hiller, Wolfgang
2013-01-01
Objective: The primary aim of this study was to assess the overall effectiveness of and dropout from individual and group outpatient cognitive behavioral therapy (CBT) for adults with a primary diagnosis of unipolar depressive disorder in routine clinical practice. Method: We conducted a random effects meta-analysis of 34 nonrandomized…
ERIC Educational Resources Information Center
Kim, James S.; Samson, Jennifer F.; Fitzgerald, Robert; Hartry, Ardice
2010-01-01
The purpose of this study was (1) to examine the causal effects of READ 180, a mixed-methods literacy intervention, on measures of word reading efficiency, reading comprehension and vocabulary, and oral reading fluency and (2) to examine whether print exposure among children in the experimental condition explained variance in posttest reading…
Analysis of Variance in the Modern Design of Experiments
NASA Technical Reports Server (NTRS)
Deloach, Richard
2010-01-01
This paper is a tutorial introduction to the analysis of variance (ANOVA), intended as a reference for aerospace researchers who are being introduced to the analytical methods of the Modern Design of Experiments (MDOE), or who may have other opportunities to apply this method. One-way and two-way fixed-effects ANOVA, as well as random effects ANOVA, are illustrated in practical terms that will be familiar to most practicing aerospace researchers.
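A brief Python illustration of the two views with simulated data (the group means, sample sizes and the simple method-of-moments variance-component estimate are illustrative, not part of the tutorial): a one-way fixed-effects F-test asks whether the set-point means differ, while the random-effects view estimates how much variance the set-point factor contributes.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    groups = [rng.normal(loc=mu, scale=1.0, size=6)
              for mu in (10.0, 10.5, 12.0, 11.0)]      # 4 set points, 6 replicates

    # fixed-effects one-way ANOVA
    F, p = stats.f_oneway(*groups)

    # random-effects view: between-group variance component (balanced design)
    k, n = len(groups), len(groups[0])
    grand = np.mean(np.concatenate(groups))
    msb = n * sum((np.mean(g) - grand) ** 2 for g in groups) / (k - 1)
    msw = sum(np.var(g, ddof=1) for g in groups) / k
    sigma2_between = max(0.0, (msb - msw) / n)

    print(f"F = {F:.2f}, p = {p:.4f}, between-group variance ~ {sigma2_between:.2f}")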
Xing, Dongyuan; Huang, Yangxin; Chen, Henian; Zhu, Yiliang; Dagne, Getachew A; Baldwin, Julie
2017-08-01
Semicontinuous data featuring an excessive proportion of zeros and right-skewed continuous positive values arise frequently in practice; one example is substance abuse/dependence symptoms data, for which a substantial proportion of subjects may report zero. Two-part mixed-effects models have been developed to analyze repeated measures of semicontinuous data from longitudinal studies. In this paper, we propose a flexible two-part mixed-effects model with skew distributions for correlated semicontinuous alcohol data within a Bayesian framework. The proposed model specification consists of two mixed-effects models linked by correlated random effects: (i) a model for the occurrence of positive values using a generalized logistic mixed-effects model (Part I); and (ii) a model for the intensity of positive values using a linear mixed-effects model in which the model errors follow skew distributions, including the skew-t and skew-normal distributions (Part II). The proposed method is illustrated with alcohol abuse/dependence symptoms data from a longitudinal observational study, and the analytic results are reported by comparing potential models under different random-effects structures. Simulation studies are conducted to assess the performance of the proposed models and method.
Improving waveform inversion using modified interferometric imaging condition
NASA Astrophysics Data System (ADS)
Guo, Xuebao; Liu, Hong; Shi, Ying; Wang, Weihong; Zhang, Zhen
2017-12-01
Similar to reverse-time migration, full waveform inversion in the time domain is a memory-intensive processing method. The computational storage size for waveform inversion mainly depends on the model size and recording length: in general, 3D and 4D data volumes need to be saved for 2D and 3D waveform inversion gradient calculations, respectively, and even the boundary-region wavefield-saving strategy creates a huge storage demand. Using the last two slices of the wavefield to reconstruct wavefields at other moments through a random boundary avoids storing a large number of wavefields; however, the traditional random boundary method is less effective at low frequencies. In this study, we follow a new random boundary designed to regenerate random velocity anomalies in the boundary region for each shot of each iteration. With the random boundary condition, results in less-illuminated areas are more seriously affected by random scattering than other areas due to the lack of coverage. In this paper, we replace direct correlation for computing the waveform inversion gradient with modified interferometric imaging, which enhances the continuity of the imaging path and reduces noise interference. The new imaging condition, a weighted average of extended imaging gathers, can be used directly in the gradient computation. The objective function is unchanged, and the role of the imaging condition is similar to regularization. The window size of the modified interferometric imaging condition plays an important role in this process. Numerical examples show that the proposed method significantly enhances waveform inversion performance.
Sung, Vivian W; Borello-France, Diane; Dunivan, Gena; Gantz, Marie; Lukacz, Emily S; Moalli, Pamela; Newman, Diane K; Richter, Holly E; Ridgeway, Beri; Smith, Ariana L; Weidner, Alison C; Meikle, Susan
2016-10-01
Mixed urinary incontinence (MUI) can be a challenging condition to manage. We describe the protocol design and rationale for the Effects of Surgical Treatment Enhanced with Exercise for Mixed Urinary Incontinence (ESTEEM) trial, designed to compare a combined conservative and surgical treatment approach versus surgery alone for improving patient-centered MUI outcomes at 12 months. ESTEEM is a multisite, prospective, randomized trial in which female participants with MUI are randomized to a standardized perioperative behavioral/pelvic floor exercise intervention plus midurethral sling versus midurethral sling alone. We describe our methods and four challenges encountered during the design phase: defining the study population, selecting relevant patient-centered outcomes, determining sample size estimates using a patient-reported outcome measure, and designing an analysis plan that accommodates MUI failure rates. A central theme in the design was patient centeredness, which guided many key decisions. Our primary outcome is patient-reported MUI symptoms measured using the Urogenital Distress Inventory (UDI) score at 12 months. Secondary outcomes include quality of life, sexual function, cost-effectiveness, time to failure, and need for additional treatment. The final study design was implemented in November 2013 across eight clinical sites in the Pelvic Floor Disorders Network. As of 27 February 2016, 433 of the 472 targeted participants had been randomized. We describe the ESTEEM protocol and our methods for reaching consensus on methodological challenges in designing a trial for MUI while maintaining the patient perspective at the core of key decisions. This trial will provide information that can directly affect patient care and clinical decision making.
Dai, James Y.; Hughes, James P.
2012-01-01
The meta-analytic approach to evaluating surrogate end points assesses the predictiveness of treatment effect on the surrogate toward treatment effect on the clinical end point based on multiple clinical trials. Definition and estimation of the correlation of treatment effects were developed in linear mixed models and later extended to binary or failure time outcomes on a case-by-case basis. In a general regression setting that covers nonnormal outcomes, we discuss in this paper several metrics that are useful in the meta-analytic evaluation of surrogacy. We propose a unified 3-step procedure to assess these metrics in settings with binary end points, time-to-event outcomes, or repeated measures. First, the joint distribution of estimated treatment effects is ascertained by an estimating equation approach; second, the restricted maximum likelihood method is used to estimate the means and the variance components of the random treatment effects; finally, confidence intervals are constructed by a parametric bootstrap procedure. The proposed method is evaluated by simulations and applications to 2 clinical trials. PMID:22394448
The Locomotion of Mouse Fibroblasts in Tissue Culture
Gail, Mitchell H.; Boone, Charles W.
1970-01-01
Time-lapse cinematography was used to investigate the motion of mouse fibroblasts in tissue culture. Observations over successive short time intervals revealed a tendency for the cells to persist in their direction of motion from one 2.5 hr time interval to the next. Over 5.0-hr time intervals, however, the direction of motion appeared random. This fact suggested that D, the diffusion constant of a random walk model, might serve to characterize cellular motility if suitably long observation times were used. We therefore investigated the effect of “persistence” on the pure random walk model, and we found theoretically and confirmed experimentally that the motility of a persisting cell could indeed be characterized by an augmented diffusion constant, D*. A method for determining confidence limits on D* was also developed. Thus a random walk model, modified to comprehend the persistence effect, was found to describe the motion of fibroblasts in tissue culture and to provide a numerical measure of cellular motility. PMID:5531614
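A small Python sketch of the modified model: simulate a persistent 2-D random walk and recover an effective diffusion constant from the long-time slope of the mean-squared displacement (MSD ~ 4 D* t in two dimensions). The persistence mechanism and parameter values here are illustrative assumptions, not the paper's fitted quantities.

    import numpy as np

    rng = np.random.default_rng(4)
    n_cells, n_steps, dt = 200, 400, 0.1
    speed, persistence = 1.0, 0.8           # step speed; 0 = pure random walk

    theta = rng.uniform(0, 2 * np.pi, n_cells)     # initial headings
    pos = np.zeros((n_cells, 2))
    msd = []
    for _ in range(n_steps):
        # each new heading mixes the previous one with a random turn
        theta += (1.0 - persistence) * rng.normal(0, np.pi / 2, n_cells)
        pos += speed * dt * np.column_stack([np.cos(theta), np.sin(theta)])
        msd.append(np.mean(np.sum(pos**2, axis=1)))

    times = dt * np.arange(1, n_steps + 1)
    slope = np.polyfit(times[n_steps // 2:], np.array(msd)[n_steps // 2:], 1)[0]
    print(f"augmented diffusion constant D* ~ {slope / 4:.3f}")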
An Analysis of Costs in Institutions of Higher Education in England
ERIC Educational Resources Information Center
Johnes, Geraint; Johnes, Jill; Thanassoulis, Emmanuel
2008-01-01
Cost functions are estimated, using random effects and stochastic frontier methods, for English higher education institutions. The article advances on existing literature by employing finer disaggregation by subject, institution type and location, and by introducing consideration of quality effects. Estimates are provided of average incremental…
The Effects of Three Nebulized Osmotic Agents in the Dry Larynx
ERIC Educational Resources Information Center
Tanner, Kristine; Roy, Nelson; Merrill, Ray M.; Elstad, Mark
2007-01-01
Purpose: This investigation examined the effects of nebulized hypertonic saline, isotonic saline (IS), and sterile (hypotonic) water on phonation threshold pressure (PTP) and self-perceived phonatory effort (PPE) following a surface laryngeal dehydration challenge. Method: In a double-blind, randomized experimental trial, 60 vocally healthy women…
Sahebkar, Amirhossein; Simental-Mendía, Luis E; Pirro, Matteo; Montecucco, Fabrizio; Carbone, Federico; Banach, Maciej; Barreto, George E; Butler, Alexandra E
2018-06-29
To assess the effect of fibrates on circulating cystatin C levels. Clinical studies evaluating the effect of a fibrate on circulating cystatin C levels were searched in the PubMed-Medline, SCOPUS, Web of Science and Google Scholar databases. A random-effects model and the generic inverse variance method were used for quantitative data synthesis, sensitivity analysis was conducted using the leave-one-out method, and weighted random-effects meta-regression was performed to evaluate potential confounders of cystatin C levels. This meta-analysis of data from 9 published studies (16 treatment arms) involved a total of 2195 subjects. In a single-arm analysis of clinical trials (without control group; 8 studies comprising 14 treatment arms), fibrate therapy increased circulating cystatin C concentrations (WMD: 0.07 mg/dL, 95% CI: 0.04, 0.10, p < 0.001; I2 = 82.66%). When the analysis was restricted to randomized controlled trials (4 studies comprising 6 treatment arms), elevation of circulating cystatin C levels was again observed (WMD: 0.06 mg/L, 95% CI: 0.03, 0.09, p < 0.001; I2 = 42.98%). Elevated cystatin C levels were only seen with fenofibrate, not other fibrates. The results suggest that fenofibrate treatment adversely affects cystatin C levels and might partially explain the limited efficacy of fenofibrate in reducing cardiovascular events.
Yamashita, Yasunobu; Ueda, Kazuki; Kawaji, Yuki; Tamura, Takashi; Itonaga, Masahiro; Yoshida, Takeichi; Maeda, Hiroki; Magari, Hirohito; Maekita, Takao; Iguchi, Mikitaka; Tamai, Hideyuki; Ichinose, Masao; Kato, Jun
2016-01-01
Background/Aims: Transpapillary forceps biopsy is an effective diagnostic technique in patients with biliary stricture. This prospective study aimed to determine the usefulness of the wire-grasping method as a new technique for forceps biopsy. Methods: Consecutive patients with biliary stricture or irregularities of the bile duct wall were randomly allocated to either the direct or the wire-grasping method group. In the wire-grasping method, the forceps in the duodenum grasp a guide-wire placed into the bile duct beforehand, and the forceps are then pushed through the papilla without endoscopic sphincterotomy. In the direct method, the forceps are pushed directly into the bile duct alongside a guide-wire. The primary endpoint was the success rate of obtaining specimens suitable for adequate pathological examination. Results: In total, 32 patients were enrolled, and 28 (14 in each group) were eligible for analysis. The success rate was significantly higher with the wire-grasping method than with the direct method (100% vs 50%, p=0.016). Sensitivity and accuracy for the diagnosis of cancer were comparable between the two methods in patients with successful procurement of biopsy specimens (91% vs 83% and 93% vs 86%, respectively). Conclusions: The wire-grasping method is useful for diagnosing patients with biliary stricture or irregularities of the bile duct wall. PMID:27021502
Certified randomness in quantum physics.
Acín, Antonio; Masanes, Lluis
2016-12-07
The concept of randomness plays an important part in many disciplines. On the one hand, the question of whether random processes exist is fundamental for our understanding of nature. On the other, randomness is a resource for cryptography, algorithms and simulations. Standard methods for generating randomness rely on assumptions about the devices that are often not valid in practice. However, quantum technologies enable new methods for generating certified randomness, based on the violation of Bell inequalities. These methods are referred to as device-independent because they do not rely on any modelling of the devices. Here we review efforts to design device-independent randomness generators and the associated challenges.
ERIC Educational Resources Information Center
Lou, Vivian W. Q.; Zhang, Yiqi
2006-01-01
Objective: This study evaluated the effectiveness of a Participatory Empowerment Group (PEG) for Chinese type 2 diabetes patients in Shanghai. Method: A randomized waiting list control and pretest and posttest comparisons were used to evaluate the effectiveness of the intervention by comparing blood sugar level and health-related quality of life.…
Effects of a 2-Year School-Based Intervention of Enhanced Physical Education in the Primary School
ERIC Educational Resources Information Center
Sacchetti, Rossella; Ceciliani, Andrea; Garulli, Andrea; Dallolio, Laura; Beltrami, Patrizia; Leoni, Erica
2013-01-01
Background: This study aimed to assess whether a school-based physical education intervention was effective in improving physical abilities and influencing daily physical activity habits in primary school children. The possible effect on body mass index (BMI) was also considered. Methods: Twenty-six 3rd-grade classes were randomly selected…
Zolghadri, Jaleh; Younesi, Masoumeh; Asadi, Nasrin; Khosravi, Dezire; Behdin, Shabnam; Tavana, Zohre; Ghaffarpasand, Fariborz
2014-02-01
To compare the effectiveness of the double cervical cerclage method with the single method in women with recurrent second-trimester delivery, we conducted a randomized clinical trial including 33 singleton pregnancies with recurrent second-trimester pregnancy loss (≥2 consecutive fetal losses during the second trimester, or a history of unsuccessful procedures using the McDonald method) due to cervical incompetence. Patients were randomly assigned to undergo either the classic McDonald method (n = 14) or the double cerclage method (n = 19). The successful pregnancy rate and gestational age at delivery were compared between the two groups. The two study groups were comparable regarding their baseline characteristics. The successful pregnancy rate did not differ significantly between those who underwent the double cerclage method and those who underwent the classic McDonald cerclage method (100% vs 85.7%; P = 0.172). Likewise, the preterm delivery rate (<34 weeks of gestation) was comparable between the two study groups (10.5% vs 35.7%; P = 0.106). Those undergoing the double cerclage method had a longer gestational duration (37.2 ± 2.6 vs 34.3 ± 3.8 weeks; P = 0.016). The double cervical cerclage method seems to provide better cervical support than the classic McDonald cerclage method in women with recurrent pregnancy loss due to cervical incompetence.
Robust inference in summary data Mendelian randomization via the zero modal pleiotropy assumption.
Hartwig, Fernando Pires; Davey Smith, George; Bowden, Jack
2017-12-01
Mendelian randomization (MR) is being increasingly used to strengthen causal inference in observational studies. Availability of summary data of genetic associations for a variety of phenotypes from large genome-wide association studies (GWAS) allows straightforward application of MR using summary data methods, typically in a two-sample design. In addition to the conventional inverse variance weighting (IVW) method, recently developed summary data MR methods, such as the MR-Egger and weighted median approaches, allow a relaxation of the instrumental variable assumptions. Here, a new method - the mode-based estimate (MBE) - is proposed to obtain a single causal effect estimate from multiple genetic instruments. The MBE is consistent when the largest number of similar (identical in infinite samples) individual-instrument causal effect estimates comes from valid instruments, even if the majority of instruments are invalid. We evaluate the performance of the method in simulations designed to mimic the two-sample summary data setting, and demonstrate its use by investigating the causal effect of plasma lipid fractions and urate levels on coronary heart disease risk. The MBE presented less bias and lower type-I error rates than other methods under the null in many situations. Its power to detect a causal effect was smaller compared with the IVW and weighted median methods, but was larger than that of MR-Egger regression, with sample size requirements typically smaller than those available from GWAS consortia. The MBE relaxes the instrumental variable assumptions, and should be used in combination with other approaches in sensitivity analyses.
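A schematic Python sketch of the core idea: per-variant Wald ratios are computed from two-sample summary statistics, and the causal estimate is taken as the mode of their (kernel-smoothed) distribution. The simulated data, the unweighted ratios and the default kernel bandwidth are simplifying assumptions; the published MBE uses specific bandwidth rules and weighted variants not reproduced here.

    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(5)
    n_snps, true_effect = 30, 0.3

    # simulated summary statistics: most instruments valid, a minority pleiotropic
    beta_x = rng.uniform(0.05, 0.2, n_snps)                 # SNP-exposure effects
    pleio = np.where(rng.random(n_snps) < 0.3,
                     rng.normal(0.05, 0.02, n_snps), 0.0)   # directional pleiotropy
    beta_y = true_effect * beta_x + pleio + rng.normal(0, 0.005, n_snps)

    ratios = beta_y / beta_x                                # per-variant Wald ratios
    kde = gaussian_kde(ratios)
    grid = np.linspace(ratios.min(), ratios.max(), 2000)
    mbe = grid[np.argmax(kde(grid))]                        # mode-based estimate

    print(f"mean of ratios: {np.mean(ratios):.3f}, mode-based estimate: {mbe:.3f}")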
Wang, Chaoyi; Chen, Xiaoan; Wang, Huiru
2018-01-01
Objective: The purpose of this review was to objectively evaluate the effects of Baduanjin exercise on rehabilitative outcomes in stroke patients. Methods: Both Chinese and English electronic databases were searched for potentially relevant trials. Two review authors independently screened eligible trials against the inclusion criteria, extracted data, and assessed methodological quality using the revised PEDro scale. Meta-analysis was performed only for balance function. Results: In total, eight randomized controlled trials were selected for this systematic review. The aggregated result of four trials showed a significant benefit in favor of Baduanjin on balance function (Hedges’ g = 2.39, 95% CI 2.14 to 2.65, p < 0.001, I2 = 61.54). Additionally, Baduanjin exercise effectively improved sensorimotor function of the lower extremities and ability to perform daily activities, and reduced depressive symptoms, leading to improved quality of life. Conclusion: Baduanjin exercise, as a safe adjunctive method, may help stroke patients achieve the best possible short-term outcome and should be integrated with mainstream rehabilitation programs. More rigorous randomized controlled trials with long-term intervention periods and large samples of stroke patients are needed to draw a firm conclusion regarding the rehabilitative effects for this population. PMID:29584623
Curiac, Daniel-Ioan; Volosencu, Constantin
2015-10-08
Providing unpredictable trajectories for patrol robots is essential when coping with adversaries. To solve this problem, we developed an effective approach based on the known protean behavior of individual prey animals: random zig-zag movement. The proposed bio-inspired method modifies the robot's normal path by incorporating sudden and irregular direction changes without jeopardizing the robot's mission. This tactic aims to confuse the enemy (e.g., a sniper) by offering less time to acquire and retain sight alignment and sight picture. The idea is implemented by simulating a series of fictive, temporary obstacles that randomly appear in the robot's field of view, causing the obstacle-avoidance mechanism to react. The general methodology is particularized by using Arnold's cat map to generate the timely random appearance and disappearance of the fictive obstacles. The viability of the proposed method is confirmed through an extensive simulation case study.
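A minimal Python sketch of how Arnold's cat map can drive the chaotic-but-deterministic placement of fictive obstacles; the field-of-view mapping, obstacle geometry and the robot's avoidance controller are hypothetical simplifications of the scheme described above.

    import numpy as np

    def cat_map(state):
        # one iteration of Arnold's cat map on the unit square
        x, y = state
        return np.array([(x + y) % 1.0, (x + 2.0 * y) % 1.0])

    def fictive_obstacles(seed_state, n, fov=5.0):
        # deterministic for the robot, unpredictable-looking to an observer
        state = np.array(seed_state, dtype=float)
        offsets = []
        for _ in range(n):
            state = cat_map(state)
            # map the unit-square state to an (x, y) offset ahead of the robot
            offsets.append((fov * state[0], fov * (state[1] - 0.5)))
        return offsets

    print(fictive_obstacles((0.3, 0.7), 5))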
Carman, Kristin L; Mallery, Coretta; Maurer, Maureen; Wang, Grace; Garfinkel, Steve; Yang, Manshu; Gilmore, Dierdre; Windham, Amy; Ginsburg, Marjorie; Sofaer, Shoshanna; Gold, Marthe; Pathak-Sen, Ela; Davies, Todd; Siegel, Joanna; Mangrum, Rikki; Fernandez, Jessica; Richmond, Jennifer; Fishkin, James; Siu Chao, Alice
2015-05-01
Public deliberation elicits informed perspectives on complex issues that are values-laden and lack technical solutions. This Deliberative Methods Demonstration examined the effectiveness of public deliberation for obtaining informed public input regarding the role of medical evidence in U.S. healthcare. We conducted a 5-arm randomized controlled trial, assigning participants to one of four deliberative methods or to a reading-materials-only (RMO) control group. The four deliberative methods reflected important differences in implementation, including the length of the deliberative process and the mode of interaction. The project convened 76 groups between August and November 2012 in four U.S. locations (Chicago, IL; Sacramento, CA; Silver Spring, MD; and Durham, NC), capturing a sociodemographically diverse sample with specific attention to ensuring inclusion of Hispanic, African-American, and elderly participants. Of 1774 people recruited, 75% participated: 961 took part in a deliberative method and 377 comprised the RMO control group. To assess the effectiveness of the deliberative methods overall and of individual methods, we evaluated whether mean pre-post changes on a knowledge and attitude survey differed statistically from the RMO control using ANCOVA. In addition, we calculated mean scores capturing participants' views of the impact and value of deliberation. Participating in deliberation increased participants' knowledge of evidence and comparative effectiveness research and shifted participants' attitudes regarding the role of evidence in decision-making. When each deliberative method was compared with the RMO control group, all four deliberative methods resulted in statistically significant change on at least one knowledge or attitude measure. These findings were underscored by self-reports that the experience affected participants' opinions. Public deliberation offers unique potential for those seeking informed input on complex, values-laden topics affecting broad public constituencies.