Methods for sample size determination in cluster randomized trials
Rutterford, Clare; Copas, Andrew; Eldridge, Sandra
2015-01-01
Background: The use of cluster randomized trials (CRTs) is increasing, along with the variety in their design and analysis. The simplest approach for their sample size calculation is to calculate the sample size assuming individual randomization and inflate this by a design effect to account for randomization by cluster. The assumptions of a simple design effect may not always be met; alternative or more complicated approaches are required. Methods: We summarise a wide range of sample size methods available for cluster randomized trials. For those familiar with sample size calculations for individually randomized trials but with less experience in the clustered case, this manuscript provides formulae for a wide range of scenarios with associated explanation and recommendations. For those with more experience, comprehensive summaries are provided that allow quick identification of methods for a given design, outcome and analysis method. Results: We present first those methods applicable to the simplest two-arm, parallel group, completely randomized design followed by methods that incorporate deviations from this design such as: variability in cluster sizes; attrition; non-compliance; or the inclusion of baseline covariates or repeated measures. The paper concludes with methods for alternative designs. Conclusions: There is a large amount of methodology available for sample size calculations in CRTs. This paper gives the most comprehensive description of published methodology for sample size calculation and provides an important resource for those designing these trials. PMID:26174515
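The simple design-effect calculation described in this abstract can be sketched in a few lines; the function name and the worked numbers below are illustrative, not taken from the paper.

```python
import math

def crt_sample_size(n_individual, cluster_size, icc):
    """Inflate a sample size computed under individual randomization by the
    design effect for a cluster randomized trial with equal cluster sizes:
    design effect = 1 + (m - 1) * ICC, where m is the cluster size and ICC
    is the intracluster correlation coefficient."""
    design_effect = 1 + (cluster_size - 1) * icc
    n_total = math.ceil(n_individual * design_effect)
    clusters_per_arm = math.ceil(n_total / (2 * cluster_size))
    return n_total, clusters_per_arm

# Illustrative numbers: 128 individuals needed under individual randomization,
# clusters of 17, ICC = 0.0625 -> design effect = 2.0
n_total, clusters_per_arm = crt_sample_size(128, 17, 0.0625)
```

As the abstract notes, this simple inflation assumes equal cluster sizes and a two-arm, parallel-group, completely randomized design; the paper catalogues methods for when those assumptions fail.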
A Bayesian sequential design with adaptive randomization for 2-sided hypothesis test.
Yu, Qingzhao; Zhu, Lin; Zhu, Han
2017-11-01
Bayesian sequential and adaptive randomization designs are gaining popularity in clinical trials because of their potential to reduce the number of required participants and save resources. We propose a Bayesian sequential design with adaptive randomization rates so as to more efficiently assign newly recruited patients to the different treatment arms. In this paper, we consider 2-arm clinical trials. Patients are allocated to the 2 arms with a randomization rate chosen to achieve minimum variance for the test statistic. Algorithms are presented to calculate the optimal randomization rate, critical values, and power for the proposed design. Sensitivity analysis is implemented to check the influence of changing the prior distributions on the design. Simulation studies are applied to compare the proposed method and traditional methods in terms of power and actual sample sizes. Simulations show that, when the total sample size is fixed, the proposed design can obtain greater power and/or require a smaller actual sample size than the traditional Bayesian sequential design. Finally, we apply the proposed method to a real data set and compare the results with the Bayesian sequential design without adaptive randomization in terms of sample sizes. The proposed method can further reduce the required sample size. Copyright © 2017 John Wiley & Sons, Ltd.
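For a two-arm difference-in-means test, the minimum-variance allocation the abstract alludes to reduces, in the simplest frequentist limit, to Neyman allocation; this sketch is an assumption-laden simplification, not the paper's Bayesian algorithm.

```python
def optimal_allocation_rate(sigma1, sigma2):
    """Neyman allocation: with var = sigma1^2/n1 + sigma2^2/n2 and
    n1 = r*n, n2 = (1-r)*n, the variance of the difference-in-means
    statistic is minimized at r = sigma1 / (sigma1 + sigma2)."""
    return sigma1 / (sigma1 + sigma2)

# The noisier arm receives the larger share of patients.
r = optimal_allocation_rate(2.0, 1.0)  # -> 2/3 of patients to arm 1
```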
Designing Studies That Would Address the Multilayered Nature of Health Care
Pennell, Michael; Rhoda, Dale; Hade, Erinn M.; Paskett, Electra D.
2010-01-01
We review design and analytic methods available for multilevel interventions in cancer research with particular attention to study design, sample size requirements, and potential to provide statistical evidence for causal inference. The most appropriate methods will depend on the stage of development of the research and whether randomization is possible. Early on, fractional factorial designs may be used to screen intervention components, particularly when randomization of individuals is possible. Quasi-experimental designs, including time-series and multiple baseline designs, can be useful once the intervention is designed because they require few sites and can provide the preliminary evidence to plan efficacy studies. In efficacy and effectiveness studies, group-randomized trials are preferred when randomization is possible and regression discontinuity designs are preferred otherwise if assignment based on a quantitative score is possible. Quasi-experimental designs may be used, especially when combined with recent developments in analytic methods to reduce bias in effect estimates. PMID:20386057
Review of Recent Methodological Developments in Group-Randomized Trials: Part 2-Analysis.
Turner, Elizabeth L; Prague, Melanie; Gallis, John A; Li, Fan; Murray, David M
2017-07-01
In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have updated that review with developments in analysis of the past 13 years, with a companion article to focus on developments in design. We discuss developments in the topics of the earlier review (e.g., methods for parallel-arm GRTs, individually randomized group-treatment trials, and missing data) and in new topics, including methods to account for multiple-level clustering and alternative estimation methods (e.g., augmented generalized estimating equations, targeted maximum likelihood, and quadratic inference functions). In addition, we describe developments in analysis of alternative group designs (including stepped-wedge GRTs, network-randomized trials, and pseudocluster randomized trials), which require clustering to be accounted for in their design and analysis.
NASA Astrophysics Data System (ADS)
Liao, Zhikun; Lu, Dawei; Hu, Jiemin; Zhang, Jun
2018-04-01
For the random hopping frequency (RHF) signal, the modulated frequencies are randomly distributed over a given bandwidth. The randomness of the modulated frequency not only improves the electronic counter-countermeasure capability of radar systems but also determines the signal's range-compression performance. In this paper, the range ambiguity function of the RHF signal is first derived. Then, a design method for the frequency hopping pattern, based on the stationary phase principle, is proposed to improve the peak-to-sidelobe ratio. Finally, simulated experiments demonstrate the effectiveness of the presented design method.
A Comparison of Single Sample and Bootstrap Methods to Assess Mediation in Cluster Randomized Trials
ERIC Educational Resources Information Center
Pituch, Keenan A.; Stapleton, Laura M.; Kang, Joo Youn
2006-01-01
A Monte Carlo study examined the statistical performance of single sample and bootstrap methods that can be used to test and form confidence interval estimates of indirect effects in two cluster randomized experimental designs. The designs were similar in that they featured random assignment of clusters to one of two treatment conditions and…
A New Approach to Extreme Value Estimation Applicable to a Wide Variety of Random Variables
NASA Technical Reports Server (NTRS)
Holland, Frederic A., Jr.
1997-01-01
Designing reliable structures requires an estimate of the maximum and minimum values (i.e., strength and load) that may be encountered in service. Yet designs based on very extreme values (to ensure safety) can result in extra material usage and, hence, uneconomic systems. In aerospace applications, severe over-design cannot be tolerated, making it almost mandatory to design closer to the assumed limits of the design random variables. The issue then is predicting extreme values that are practical, i.e., neither too conservative nor non-conservative. Obtaining design values by employing safety factors is well known to often result in overly conservative designs. Safety factor values have historically been selected rather arbitrarily, often lacking a sound rational basis. The question of how safe a design needs to be has led design theorists to probabilistic and statistical methods. The so-called three-sigma approach is one such method and has been described as the first step in utilizing information about the data dispersion. However, this method is based on the assumption that the random variable is dispersed symmetrically about the mean and is essentially limited to normally distributed random variables. Use of this method can therefore result in unsafe or overly conservative design allowables if the common assumption of normality is incorrect.
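A minimal sketch of the three-sigma approach criticized here, to make the symmetry assumption concrete; the data are invented.

```python
import statistics

def three_sigma_allowables(data):
    """Classic three-sigma design values: mean +/- 3 sample standard
    deviations. The bounds are symmetric about the mean, so they are only
    trustworthy when the underlying variable is roughly normal; for skewed
    variables they can be unsafe or overly conservative, as the text argues."""
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)
    return mu - 3 * sigma, mu + 3 * sigma

lo, hi = three_sigma_allowables([4, 8, 10, 12, 16])
```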
Zhao, Wenle; Weng, Yanqiu; Wu, Qi; Palesch, Yuko
2012-01-01
To evaluate the performance of randomization designs under various parameter settings and trial sample sizes, and to identify optimal designs with respect to both treatment imbalance and allocation randomness, we evaluate 260 design scenarios from 14 randomization designs under 15 sample sizes ranging from 10 to 300, using three measures for imbalance and three measures for randomness. The maximum absolute imbalance and the correct guess (CG) probability are selected to assess the trade-off performance of each randomization design. As measured by the maximum absolute imbalance and the CG probability, we found that the performances of the 14 randomization designs lie in a closed region with the upper boundary (worst case) given by Efron's biased coin design (BCD) and the lower boundary (best case) given by Soares and Wu's big stick design (BSD). Designs close to the lower boundary provide a smaller imbalance and a higher randomness than designs close to the upper boundary. Our research suggests that optimization of randomization designs is possible based on quantified evaluation of imbalance and randomness. Based on the maximum imbalance and CG probability, the BSD, Chen's biased coin design with imbalance tolerance method, and Chen's Ehrenfest urn design perform better than the popularly used permuted block design, EBCD, and Wei's urn design. Copyright © 2011 John Wiley & Sons, Ltd.
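The big stick design that forms the favorable boundary here is easy to state in code; this sketch (with an invented tolerance and sample size) also shows how the maximum-absolute-imbalance measure can be tracked.

```python
import random

def big_stick_sequence(n, tolerance, rng):
    """Soares and Wu's big stick design: assign by a fair coin unless the
    absolute imbalance (count of arm A minus count of arm B) has reached
    the tolerance, in which case assign deterministically to the lagging
    arm. The imbalance therefore never exceeds the tolerance."""
    imbalance, seq = 0, []
    for _ in range(n):
        if imbalance >= tolerance:
            arm = 'B'
        elif imbalance <= -tolerance:
            arm = 'A'
        else:
            arm = rng.choice('AB')
        imbalance += 1 if arm == 'A' else -1
        seq.append(arm)
    return seq

seq = big_stick_sequence(100, 2, random.Random(0))
```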
NASA Technical Reports Server (NTRS)
Merchant, D. H.
1976-01-01
Methods are presented for calculating design limit loads compatible with probabilistic structural design criteria. The approach is based on the concept that the desired limit load, defined as the largest load occurring in a mission, is a random variable having a specific probability distribution which may be determined from extreme-value theory. The design limit load, defined as a particular value of this random limit load, is the value conventionally used in structural design. Methods are presented for determining the limit load probability distributions from both time-domain and frequency-domain dynamic load simulations. Numerical demonstrations of the method are also presented.
Methods for Combining Payload Parameter Variations with Input Environment
NASA Technical Reports Server (NTRS)
Merchant, D. H.; Straayer, J. W.
1975-01-01
Methods are presented for calculating design limit loads compatible with probabilistic structural design criteria. The approach is based on the concept that the desired limit load, defined as the largest load occurring in a mission, is a random variable having a specific probability distribution which may be determined from extreme-value theory. The design limit load, defined as a particular value of this random limit load, is the value conventionally used in structural design. Methods are presented for determining the limit load probability distributions from both time-domain and frequency-domain dynamic load simulations. Numerical demonstrations of the methods are also presented.
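Extreme-value theory commonly leads to a Gumbel (type I) distribution for the largest load per mission; this hedged sketch reads off a design limit load as a percentile of that distribution. The parameters are illustrative, not from the report.

```python
import math

def gumbel_limit_load(mu, beta, p):
    """Design limit load taken as the p-th percentile of a Gumbel
    distribution of mission maxima, F(x) = exp(-exp(-(x - mu)/beta)).
    Inverting F gives x_p = mu - beta * ln(-ln(p))."""
    return mu - beta * math.log(-math.log(p))

# 99th-percentile limit load for location mu=100, scale beta=8 (made-up units)
x99 = gumbel_limit_load(100.0, 8.0, 0.99)
```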
On the design of henon and logistic map-based random number generator
NASA Astrophysics Data System (ADS)
Magfirawaty; Suryadi, M. T.; Ramli, Kalamullah
2017-10-01
The key sequence is one of the main elements in a cryptosystem. True random number generators (TRNGs) are one approach to generating the key sequence. The randomness sources of TRNGs divide into three main groups: electrical-noise based, jitter based, and chaos based. Chaos-based generators utilize a non-linear dynamic system (continuous time or discrete time) as an entropy source. In this study, a new design of TRNG based on a discrete-time chaotic system is proposed, which is then simulated in LabVIEW. The principle of the design consists of combining 2D and 1D chaotic systems. A mathematical model is implemented for numerical simulations. We used a comparator process as the harvesting method to obtain the series of random bits. Without any post-processing, the proposed design generated a random bit sequence with high entropy value and passed all NIST SP 800-22 statistical tests.
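A toy illustration of the combined 2D-plus-1D idea (Henon and logistic maps with a comparator harvester); the parameter values, rescaling, and comparison rule are invented for illustration and are not the paper's design, which was implemented in LabVIEW.

```python
def logistic_henon_bits(n_bits, x=0.4, hx=0.1, hy=0.3,
                        r=3.99, a=1.4, b=0.3):
    """Iterate a 1D logistic map and a 2D Henon map in parallel and harvest
    one bit per step by comparing the two trajectories (a comparator
    harvester). No post-processing is applied."""
    bits = []
    for _ in range(n_bits):
        x = r * x * (1 - x)                      # logistic map on (0, 1)
        hx, hy = 1 - a * hx * hx + hy, b * hx    # Henon map (classic a, b)
        # comparator: rescale the Henon x-state from roughly [-1.5, 1.5]
        # to [0, 1] and compare with the logistic state
        bits.append(1 if x > (hx + 1.5) / 3.0 else 0)
    return bits

bits = logistic_henon_bits(1000)
```

A real TRNG would of course draw its entropy from a physical source and validate the output with the NIST SP 800-22 suite, as the paper does.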
R. A. Fisher and his advocacy of randomization.
Hall, Nancy S
2007-01-01
The requirement of randomization in experimental design was first stated by R. A. Fisher, statistician and geneticist, in 1925 in his book Statistical Methods for Research Workers. Earlier designs were systematic and involved the judgment of the experimenter; this led to possible bias and inaccurate interpretation of the data. Fisher's dictum was that randomization eliminates bias and permits a valid test of significance. Randomization in experimenting had been used by Charles Sanders Peirce in 1885 but the practice was not continued. Fisher developed his concepts of randomizing as he considered the mathematics of small samples, in discussions with "Student," William Sealy Gosset. Fisher published extensively. His principles of experimental design were spread worldwide by the many "voluntary workers" who came from other institutions to Rothamsted Agricultural Station in England to learn Fisher's methods.
Conditional Monte Carlo randomization tests for regression models.
Parhat, Parwen; Rosenberger, William F; Diao, Guoqing
2014-08-15
We discuss the computation of randomization tests for clinical trials of two treatments when the primary outcome is based on a regression model. We begin by revisiting the seminal paper of Gail, Tan, and Piantadosi (1988), and then describe a method based on Monte Carlo generation of randomization sequences. The tests based on this Monte Carlo procedure are design based, in that they incorporate the particular randomization procedure used. We discuss permuted block designs, complete randomization, and biased coin designs. We also use a new technique by Plamadeala and Rosenberger (2012) for simple computation of conditional randomization tests. Like Gail, Tan, and Piantadosi, we focus on residuals from generalized linear models and martingale residuals from survival models. Such techniques do not apply to longitudinal data analysis, and we introduce a method for computation of randomization tests based on the predicted rate of change from a generalized linear mixed model when outcomes are longitudinal. We show, by simulation, that these randomization tests preserve the size and power well under model misspecification. Copyright © 2014 John Wiley & Sons, Ltd.
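A minimal design-based Monte Carlo randomization test on a difference in means (rather than the model residuals the paper uses); the randomization procedure is passed in so the test matches the design actually used. The function names and data are illustrative.

```python
import random

def mc_randomization_test(outcomes, assignment, randomize, n_mc=1000, seed=1):
    """Monte Carlo randomization test: regenerate treatment sequences under
    the same randomization procedure, recompute the statistic each time,
    and report the Monte Carlo p-value for the observed statistic."""
    rng = random.Random(seed)

    def diff_means(labels):
        t = [y for y, g in zip(outcomes, labels) if g == 1]
        c = [y for y, g in zip(outcomes, labels) if g == 0]
        return sum(t) / len(t) - sum(c) / len(c)

    observed = abs(diff_means(assignment))
    hits = sum(abs(diff_means(randomize(rng, len(outcomes)))) >= observed
               for _ in range(n_mc))
    return (hits + 1) / (n_mc + 1)

def complete_randomization(rng, n):
    """Balanced complete randomization: n//2 controls, the rest treated."""
    labels = [0] * (n // 2) + [1] * (n - n // 2)
    rng.shuffle(labels)
    return labels

outcomes = [3.1, 2.7, 3.6, 2.9, 3.3, 2.8, 3.0, 3.4, 2.6, 3.2]
assignment = complete_randomization(random.Random(7), len(outcomes))
p_value = mc_randomization_test(outcomes, assignment, complete_randomization)
```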
Gu, Jing; Wang, Qi; Wang, Xiaogang; Li, Hailong; Gu, Mei; Ming, Haixia; Dong, Xiaoli; Yang, Kehu; Wu, Hongyan
2014-01-01
Background. This review provides the first methodological assessment of the protocols of acupuncture RCTs registered in the WHO International Clinical Trials Registry Platform (ICTRP). Methods. All records of acupuncture RCTs registered in the ICTRP were collected. The methodological design assessment examined whether the randomization methods, allocation concealment, and blinding were adequate, based on the information in the registration records (protocols of acupuncture RCTs). Results. A total of 453 records, found in 11 registries, were examined. Methodological details were insufficient in the registration records; 76.4%, 89.0%, and 21.4% of records did not provide information on randomization methods, allocation concealment, and blinding, respectively. The proportions of adequate randomization methods, allocation concealment, and blinding were only 107 (23.6%), 48 (10.6%), and 210 (46.4%), respectively. The methodological design improved year by year, especially after 2007. Additionally, the methodology of RCTs with ethics approval was clearly superior to that of those without ethics approval, and differed among registries. Conclusions. The overall methodological design based on registration records of acupuncture RCTs is not very good but has improved year by year. The insufficient information on randomization methods, allocation concealment, and blinding may be because these descriptions are not taken seriously in the registration of acupuncture RCTs. PMID:24688591
Certified randomness in quantum physics.
Acín, Antonio; Masanes, Lluis
2016-12-07
The concept of randomness plays an important part in many disciplines. On the one hand, the question of whether random processes exist is fundamental for our understanding of nature. On the other, randomness is a resource for cryptography, algorithms and simulations. Standard methods for generating randomness rely on assumptions about the devices that are often not valid in practice. However, quantum technologies enable new methods for generating certified randomness, based on the violation of Bell inequalities. These methods are referred to as device-independent because they do not rely on any modelling of the devices. Here we review efforts to design device-independent randomness generators and the associated challenges.
Pandis, Nikolaos; Polychronopoulou, Argy; Eliades, Theodore
2011-12-01
Randomization is a key step in reducing selection bias during the treatment allocation phase in randomized clinical trials. The process of randomization follows specific steps: generation of the randomization list, allocation concealment, and implementation of randomization. In the dental and orthodontic literature, treatment allocation is frequently characterized as random; however, the randomization procedures followed are often not appropriate. Randomization methods assign treatment to the trial arms at random, without foreknowledge of allocation by either the participants or the investigators, thus reducing selection bias. Randomization entails generation of the random allocation, allocation concealment, and the actual methodology for implementing treatment allocation randomly and unpredictably. The most popular randomization methods include some form of restricted and/or stratified randomization. This article introduces the reasons that make randomization an integral part of solid clinical trial methodology and presents the main randomization schemes applicable to clinical trials in orthodontics.
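The permuted block design mentioned as a common form of restricted randomization can be sketched as follows; the block size and seed are illustrative.

```python
import random

def permuted_block_sequence(n, block_size=4, seed=7):
    """Permuted block randomization: within each block, exactly half the
    assignments go to each arm, in a random order, which restricts the
    imbalance at the end of every block to zero."""
    rng = random.Random(seed)
    seq = []
    while len(seq) < n:
        block = ['A'] * (block_size // 2) + ['B'] * (block_size // 2)
        rng.shuffle(block)
        seq.extend(block)
    return seq[:n]

seq = permuted_block_sequence(20)
```

As the article warns, restricted schemes like this must still be combined with allocation concealment, since the tail of each block becomes predictable.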
Analysis of random signal combinations for spacecraft pointing stability
NASA Technical Reports Server (NTRS)
Howell, L.
1983-01-01
Methods for obtaining the probability density function of random signal combinations are discussed. These methods provide realistic criteria for the design of control systems subjected to external noise, with several important applications to aerospace problems.
ERIC Educational Resources Information Center
Pillemer, Karl; Meador, Rhoda; Henderson, Charles, Jr.; Robison, Julie; Hegeman, Carol; Graham, Edwin; Schultz, Leslie
2008-01-01
Purpose: This article reports on a randomized, controlled intervention study designed to reduce employee turnover by creating a retention specialist position in nursing homes. Design and Methods: We collected data three times over a 1-year period in 30 nursing homes, sampled in stratified random manner from facilities in New York State and…
Dong, Qi; Elliott, Michael R; Raghunathan, Trivellore E
2014-06-01
Outside of the survey sampling literature, samples are often assumed to be generated by a simple random sampling process that produces independent and identically distributed (IID) samples. Many statistical methods are developed largely in this IID world. Application of these methods to data from complex sample surveys without making allowance for the survey design features can lead to erroneous inferences. Hence, much time and effort have been devoted to developing statistical methods to analyze complex survey data and account for the sample design. This issue is particularly important when generating synthetic populations using finite population Bayesian inference, as is often done in missing data or disclosure risk settings, or when combining data from multiple surveys. By extending previous work in the finite population Bayesian bootstrap literature, we propose a method to generate synthetic populations from a posterior predictive distribution in a fashion that inverts the complex sampling design features and generates simple random samples from a superpopulation point of view, adjusting the complex data so that they can be analyzed as simple random samples. We consider a simulation study with a stratified, clustered unequal-probability of selection sample design, and use the proposed nonparametric method to generate synthetic populations for the 2006 National Health Interview Survey (NHIS) and the Medical Expenditure Panel Survey (MEPS), which are stratified, clustered unequal-probability of selection sample designs.
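A very crude sketch of the inversion idea, in which a synthetic population is generated by resampling units with probability proportional to their survey weights; the authors' actual method is a finite population Bayesian bootstrap, which this weighted resampling only caricatures.

```python
import random

def weighted_synthetic_population(values, weights, pop_size, seed=3):
    """Resample observed units with probability proportional to their survey
    weights, so units that were under-sampled by design are over-replicated
    in the synthetic population (a stand-in for the finite population
    Bayesian bootstrap; it ignores posterior uncertainty in the weights)."""
    rng = random.Random(seed)
    return rng.choices(values, weights=weights, k=pop_size)

pop = weighted_synthetic_population([1, 2, 3], [0.5, 0.3, 0.2], 1000)
```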
Safety assessment of a shallow foundation using the random finite element method
NASA Astrophysics Data System (ADS)
Zaskórski, Łukasz; Puła, Wojciech
2015-04-01
A complex structure of soil and its random character are reasons why soil modeling is a cumbersome task. The heterogeneity of soil has to be considered even within a homogeneous soil layer, so the estimation of shear strength parameters for a geotechnical analysis causes many problems. The applicable standard (Eurocode 7) presents no explicit method for evaluating characteristic values of soil parameters; only general guidelines on how these values should be estimated can be found. Hence many approaches to assessing characteristic values of soil parameters are presented in the literature and can be applied in practice. In this paper, the reliability assessment of a shallow strip footing was conducted using a reliability index β. Several approaches to estimating characteristic values of soil properties were compared by evaluating the values of the reliability index β achieved by applying each of them: the method of Orr and Breysse, Duncan's method, Schneider's method, Schneider's method accounting for fluctuation scales, and the method included in Eurocode 7. Design values of the bearing capacity based on these approaches were referred to the stochastic bearing capacity estimated by the random finite element method (RFEM). Design values of the bearing capacity were computed for various widths and depths of the foundation in conjunction with the design approaches (DA) defined in Eurocode. RFEM was presented by Griffiths and Fenton (1993). It combines the deterministic finite element method, random field theory, and Monte Carlo simulations. Random field theory makes it possible to consider the random character of soil parameters within a homogeneous soil layer. For this purpose, a soil property is treated as a separate random variable in every element of the finite element mesh, with a proper correlation structure between points of the given area.
RFEM was applied to estimate which theoretical probability distribution fits the empirical probability distribution of the bearing capacity, based on 3000 realizations. The fitted probability distribution was applied to compute design values of the bearing capacity and the related reliability indices β. The analyses were carried out for a cohesive soil; hence the friction angle and the cohesion were defined as random parameters, each characterized by a two-dimensional random field. The friction angle was described by a bounded distribution, as it varies within a limited range, while a lognormal distribution was applied to the cohesion. Other properties (Young's modulus, Poisson's ratio, and unit weight) were assumed to be deterministic because they have negligible influence on the stochastic bearing capacity. Griffiths D. V., & Fenton G. A. (1993). Seepage beneath water retaining structures founded on spatially random soil. Géotechnique, 43(6), 577-587.
Schweizer, Marin L.; Braun, Barbara I.; Milstone, Aaron M.
2016-01-01
Quasi-experimental studies evaluate the association between an intervention and an outcome using experiments in which the intervention is not randomly assigned. Quasi-experimental studies are often used to evaluate rapid responses to outbreaks or other patient safety problems requiring prompt non-randomized interventions. Quasi-experimental studies can be categorized into three major types: interrupted time series designs, designs with control groups, and designs without control groups. This methods paper highlights key considerations for quasi-experimental studies in healthcare epidemiology and antimicrobial stewardship including study design and analytic approaches to avoid selection bias and other common pitfalls of quasi-experimental studies. PMID:27267457
Design and analysis of group-randomized trials in cancer: A review of current practices.
Murray, David M; Pals, Sherri L; George, Stephanie M; Kuzmichev, Andrey; Lai, Gabriel Y; Lee, Jocelyn A; Myles, Ranell L; Nelson, Shakira M
2018-06-01
The purpose of this paper is to summarize current practices for the design and analysis of group-randomized trials involving cancer-related risk factors or outcomes and to offer recommendations to improve future trials. We searched for group-randomized trials involving cancer-related risk factors or outcomes that were published or online in peer-reviewed journals in 2011-15. During 2016-17, in Bethesda MD, we reviewed 123 articles from 76 journals to characterize their design and their methods for sample size estimation and data analysis. Only 66 (53.7%) of the articles reported appropriate methods for sample size estimation. Only 63 (51.2%) reported exclusively appropriate methods for analysis. These findings suggest that many investigators do not adequately attend to the methodological challenges inherent in group-randomized trials. These practices can lead to underpowered studies, to an inflated type 1 error rate, and to inferences that mislead readers. Investigators should work with biostatisticians or other methodologists familiar with these issues. Funders and editors should ensure careful methodological review of applications and manuscripts. Reviewers should ensure that studies are properly planned and analyzed. These steps are needed to improve the rigor and reproducibility of group-randomized trials. The Office of Disease Prevention (ODP) at the National Institutes of Health (NIH) has taken several steps to address these issues. ODP offers an online course on the design and analysis of group-randomized trials. ODP is working to increase the number of methodologists who serve on grant review panels. ODP has developed standard language for the Application Guide and the Review Criteria to draw investigators' attention to these issues. Finally, ODP has created a new Research Methods Resources website to help investigators, reviewers, and NIH staff better understand these issues. Published by Elsevier Inc.
Methods and analysis of realizing randomized grouping.
Hu, Liang-Ping; Bao, Xiao-Lei; Wang, Qi
2011-07-01
Randomization is one of the four basic principles of research design. The meaning of randomization includes two aspects: one is to randomly select samples from the population, which is known as random sampling; the other is to randomly group all the samples, which is called randomized grouping. Randomized grouping can be subdivided into three categories: completely, stratified and dynamically randomized grouping. This article mainly introduces the steps of complete randomization, the definition of dynamic randomization and the realization of random sampling and grouping by SAS software.
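The completely randomized grouping described here can be illustrated in a few lines; the article demonstrates the procedure in SAS, so this Python sketch is only an equivalent illustration.

```python
import random

def complete_randomized_grouping(subject_ids, n_groups=2, seed=2024):
    """Completely randomized grouping: shuffle all subjects once, then deal
    them out round-robin so group sizes differ by at most one."""
    rng = random.Random(seed)
    ids = list(subject_ids)
    rng.shuffle(ids)
    return {g: ids[g::n_groups] for g in range(n_groups)}

groups = complete_randomized_grouping(range(1, 21), n_groups=2)
```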
Robustness-Based Design Optimization Under Data Uncertainty
NASA Technical Reports Server (NTRS)
Zaman, Kais; McDonald, Mark; Mahadevan, Sankaran; Green, Lawrence
2010-01-01
This paper proposes formulations and algorithms for design optimization under both aleatory (i.e., natural or physical variability) and epistemic uncertainty (i.e., imprecise probabilistic information), from the perspective of system robustness. The proposed formulations deal with epistemic uncertainty arising from both sparse and interval data without any assumption about the probability distributions of the random variables. A decoupled approach is proposed in this paper to un-nest the robustness-based design from the analysis of non-design epistemic variables to achieve computational efficiency. The proposed methods are illustrated for the upper stage design problem of a two-stage-to-orbit (TSTO) vehicle, where the information on the random design inputs is only available as sparse point and/or interval data. As collecting more data reduces uncertainty but increases cost, the effect of sample size on the optimality and robustness of the solution is also studied. A method is developed to determine the optimal sample size for sparse point data that leads to solutions of the design problem that are least sensitive to variations in the input random variables.
A practical approach to automate randomized design of experiments for ligand-binding assays.
Tsoi, Jennifer; Patel, Vimal; Shih, Judy
2014-03-01
Design of experiments (DOE) is utilized in optimizing ligand-binding assay by modeling factor effects. To reduce the analyst's workload and error inherent with DOE, we propose the integration of automated liquid handlers to perform the randomized designs. A randomized design created from statistical software was imported into custom macro converting the design into a liquid-handler worklist to automate reagent delivery. An optimized assay was transferred to a contract research organization resulting in a successful validation. We developed a practical solution for assay optimization by integrating DOE and automation to increase assay robustness and enable successful method transfer. The flexibility of this process allows it to be applied to a variety of assay designs.
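As a sketch of the first step in such a pipeline, the snippet below builds a full-factorial design, randomizes the run order, and serializes it as a flat worklist. The factor names, levels, and CSV layout are all hypothetical; a real liquid-handler worklist would follow an instrument-specific format:

```python
import csv
import io
import itertools
import random

# Hypothetical assay factors and levels for a full-factorial DOE.
factors = {
    "capture_ab_ug_ml": [0.5, 1.0, 2.0],
    "detection_ab_dilution": [1000, 2000],
    "incubation_min": [30, 60],
}

# Build all factor-level combinations, then randomize the run order.
names = list(factors)
runs = [dict(zip(names, combo)) for combo in itertools.product(*factors.values())]
random.Random(7).shuffle(runs)

# Emit a worklist: one row per run with a 96-well position (layout illustrative).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["well"] + names)
writer.writeheader()
for i, run in enumerate(runs):
    row = {"well": f"{'ABCDEFGH'[i // 12]}{i % 12 + 1}"}
    row.update(run)
    writer.writerow(row)
worklist = buf.getvalue()
```

A custom macro of the kind the authors describe would then map each row's factor levels to reagent volumes and source positions before the file is sent to the instrument.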
Performance of Random Effects Model Estimators under Complex Sampling Designs
ERIC Educational Resources Information Center
Jia, Yue; Stokes, Lynne; Harris, Ian; Wang, Yan
2011-01-01
In this article, we consider estimation of parameters of random effects models from samples collected via complex multistage designs. Incorporation of sampling weights is one way to reduce estimation bias due to unequal probabilities of selection. Several weighting methods have been proposed in the literature for estimating the parameters of…
A Probabilistic Design Method Applied to Smart Composite Structures
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Chamis, Christos C.
1995-01-01
A probabilistic design method is described and demonstrated using a smart composite wing. Probabilistic structural design incorporates naturally occurring uncertainties including those in constituent (fiber/matrix) material properties, fabrication variables, structure geometry and control-related parameters. Probabilistic sensitivity factors are computed to identify those parameters that have a great influence on a specific structural reliability. Two performance criteria are used to demonstrate this design methodology. The first criterion requires that the actuated angle at the wing tip be bounded by upper and lower limits at a specified reliability. The second criterion requires that the probability of ply damage due to random impact load be smaller than an assigned value. When the relationship between reliability improvement and the sensitivity factors is assessed, the results show that a reduction in the scatter of the random variable with the largest sensitivity factor (absolute value) provides the lowest failure probability. An increase in the mean of the random variable with a negative sensitivity factor will reduce the failure probability. Therefore, the design can be improved by controlling or selecting distribution parameters associated with random variables. This can be implemented during the manufacturing process to obtain maximum benefit with minimum alterations.
Sørensen, Ole H
2016-10-01
Organizational-level occupational health interventions have great potential to improve employees' health and well-being. However, they often compare unfavourably to individual-level interventions. This calls for improving methods for designing, implementing and evaluating organizational interventions. This paper presents and discusses the regression discontinuity design because, like the randomized controlled trial, it is a strong summative experimental design, but it typically fits organizational-level interventions better. The paper explores advantages and disadvantages of a regression discontinuity design with an embedded randomized controlled trial. It provides an example from an intervention study focusing on reducing sickness absence in 196 preschools. The paper demonstrates that such a design fits the organizational context, because it allows management to focus on organizations or workgroups with the most salient problems. In addition, organizations may accept an embedded randomized design because the organizations or groups with the most salient needs receive obligatory treatment as part of the regression discontinuity design. Copyright © 2016 John Wiley & Sons, Ltd.
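An assignment rule of the kind described, obligatory treatment for the units with the most salient problems plus an embedded randomized band near the cutoff, might be sketched as follows. The cutoff, band width, and preschool scores are invented for illustration:

```python
import random

def assign(scores, cutoff, band, seed=None):
    """Regression discontinuity assignment with an embedded randomized band.

    Units whose problem score reaches `cutoff` receive obligatory treatment;
    units within `band` below the cutoff are randomized 1:1; the remaining
    units serve as controls. Thresholds and labels are illustrative.
    """
    rng = random.Random(seed)
    arms = {}
    for unit, score in scores.items():
        if score >= cutoff:
            arms[unit] = "treatment"                          # most salient problems
        elif score >= cutoff - band:
            arms[unit] = rng.choice(["treatment", "control"])  # embedded RCT band
        else:
            arms[unit] = "control"
    return arms

scores = {f"preschool_{i}": i for i in range(1, 21)}  # toy absence scores 1..20
arms = assign(scores, cutoff=15, band=5, seed=1)
```

The embedded randomized band supports an unbiased effect estimate near the cutoff, while the deterministic rule above the cutoff guarantees treatment for the units with the greatest need.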
Statistical Evaluation and Improvement of Methods for Combining Random and Harmonic Loads
NASA Technical Reports Server (NTRS)
Brown, A. M.; McGhee, D. S.
2003-01-01
Structures in many environments experience both random and harmonic excitation. A variety of closed-form techniques has been used in the aerospace industry to combine the loads resulting from the two sources. The resulting combined loads are then used to design for both yield/ultimate strength and high-cycle fatigue capability. This Technical Publication examines the cumulative distribution percentiles obtained using each method by integrating the joint probability density function of the sine and random components. A new Microsoft Excel spreadsheet macro that links with the software program Mathematica to calculate the combined value corresponding to any desired percentile is then presented along with a curve fit to this value. Another Excel macro that calculates the combination using Monte Carlo simulation is shown. Unlike the traditional techniques, these methods quantify the calculated load value with a consistent percentile. Using either of the presented methods can be extremely valuable in probabilistic design, which requires a statistical characterization of the loading. Additionally, since the CDF at high probability levels is very flat, the design value is extremely sensitive to the predetermined percentile; therefore, applying the new techniques can substantially lower the design loading without losing any of the identified structural reliability.
Statistical Comparison and Improvement of Methods for Combining Random and Harmonic Loads
NASA Technical Reports Server (NTRS)
Brown, Andrew M.; McGhee, David S.
2004-01-01
Structures in many environments experience both random and harmonic excitation. A variety of closed-form techniques has been used in the aerospace industry to combine the loads resulting from the two sources. The resulting combined loads are then used to design for both yield/ultimate strength and high-cycle fatigue capability. This paper examines the cumulative distribution function (CDF) percentiles obtained using each method by integrating the joint probability density function of the sine and random components. A new Microsoft Excel spreadsheet macro that links with the software program Mathematica is then used to calculate the combined value corresponding to any desired percentile, along with a curve fit to this value. Another Excel macro is used to calculate the combination using a Monte Carlo simulation. Unlike the traditional techniques, these methods quantify the calculated load value with a consistent percentile. Using either of the presented methods can be extremely valuable in probabilistic design, which requires a statistical characterization of the loading. Also, since the CDF at high probability levels is very flat, the design value is extremely sensitive to the predetermined percentile; therefore, applying the new techniques can lower the design loading substantially without losing any of the identified structural reliability.
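The Monte Carlo combination described in the Excel macro can be sketched in a few lines of Python: draw a uniformly random sine phase and a Gaussian random component, sum them, and read the desired CDF percentile off the sorted samples. The unit amplitudes and the 3-sigma-style percentile below are illustrative inputs, not values from the paper:

```python
import math
import random

def combined_load_percentile(sine_amp, random_rms, percentile, n=200_000, seed=0):
    """Monte Carlo estimate of a percentile of the combined sine + random load.

    Each sample is a uniform-phase sine value plus a zero-mean Gaussian draw;
    the requested percentile of the combined load is read from the sorted
    samples.
    """
    rng = random.Random(seed)
    samples = [
        sine_amp * math.sin(rng.uniform(0.0, 2.0 * math.pi))
        + rng.gauss(0.0, random_rms)
        for _ in range(n)
    ]
    samples.sort()
    return samples[int(percentile / 100.0 * (n - 1))]

# 3-sigma-style design load for a unit-amplitude sine plus unit-RMS random load.
design_load = combined_load_percentile(1.0, 1.0, 99.87)
```

Because the uniform-phase sine component is bounded, the percentile-based combined load falls below the traditional peak-plus-3-sigma sum (4.0 in this toy case), which is the conservatism the paper quantifies.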
Designs of Empirical Evaluations of Nonexperimental Methods in Field Settings.
Wong, Vivian C; Steiner, Peter M
2018-01-01
Over the last three decades, a research design has emerged to evaluate the performance of nonexperimental (NE) designs and design features in field settings. It is called the within-study comparison (WSC) approach or the design replication study. In the traditional WSC design, treatment effects from a randomized experiment are compared to those produced by an NE approach that shares the same target population. The nonexperiment may be a quasi-experimental design, such as a regression-discontinuity or an interrupted time-series design, or an observational study approach that includes matching methods, standard regression adjustments, and difference-in-differences methods. The goals of the WSC are to determine whether the nonexperiment can replicate results from a randomized experiment (which provides the causal benchmark estimate), and the contexts and conditions under which these methods work in practice. This article presents a coherent theory of the design and implementation of WSCs for evaluating NE methods. It introduces and identifies the multiple purposes of WSCs, required design components, common threats to validity, design variants, and causal estimands of interest in WSCs. It highlights two general approaches for empirical evaluations of methods in field settings, WSC designs with independent and dependent benchmark and NE arms. This article highlights advantages and disadvantages for each approach, and conditions and contexts under which each approach is optimal for addressing methodological questions.
Design approaches to experimental mediation☆
Pirlott, Angela G.; MacKinnon, David P.
2016-01-01
Identifying causal mechanisms has become a cornerstone of experimental social psychology, and editors in top social psychology journals champion the use of mediation methods, particularly innovative ones when possible (e.g. Halberstadt, 2010, Smith, 2012). Commonly, studies in experimental social psychology randomly assign participants to levels of the independent variable and measure the mediating and dependent variables, and the mediator is assumed to causally affect the dependent variable. However, participants are not randomly assigned to levels of the mediating variable(s), i.e., the relationship between the mediating and dependent variables is correlational. Although researchers likely know that correlational studies pose a risk of confounding, this problem seems forgotten when thinking about experimental designs randomly assigning participants to levels of the independent variable and measuring the mediator (i.e., “measurement-of-mediation” designs). Experimentally manipulating the mediator provides an approach to solving these problems, yet these methods contain their own set of challenges (e.g., Bullock, Green, & Ha, 2010). We describe types of experimental manipulations targeting the mediator (manipulations demonstrating a causal effect of the mediator on the dependent variable and manipulations targeting the strength of the causal effect of the mediator) and types of experimental designs (double randomization, concurrent double randomization, and parallel), provide published examples of the designs, and discuss the strengths and challenges of each design. Therefore, the goals of this paper include providing a practical guide to manipulation-of-mediator designs in light of their challenges and encouraging researchers to use more rigorous approaches to mediation because manipulation-of-mediator designs strengthen the ability to infer causality of the mediating variable on the dependent variable. PMID:27570259
On Some Methods in Safety Evaluation in Geotechnics
NASA Astrophysics Data System (ADS)
Puła, Wojciech; Zaskórski, Łukasz
2015-06-01
The paper demonstrates how reliability methods can be utilised to evaluate safety in geotechnics. Special attention is paid to so-called reliability-based design, which can play a useful and complementary role to Eurocode 7. In the first part, a brief review of first- and second-order reliability methods is given. Next, two examples of reliability-based design are demonstrated. The first is focussed on bearing capacity calculation and is dedicated to comparison with EC7 requirements. The second analyses a rigid pile subjected to lateral load and is oriented towards the working stress design method. In the second part, applications of random fields to safety evaluations in geotechnics are addressed. After a short review of the theory, a random finite element algorithm for reliability-based design of a shallow strip foundation is given. Finally, two illustrative examples for cohesive and cohesionless soils are demonstrated.
Sung, Vivian W; Borello-France, Diane; Dunivan, Gena; Gantz, Marie; Lukacz, Emily S; Moalli, Pamela; Newman, Diane K; Richter, Holly E; Ridgeway, Beri; Smith, Ariana L; Weidner, Alison C; Meikle, Susan
2016-10-01
Mixed urinary incontinence (MUI) can be a challenging condition to manage. We describe the protocol design and rationale for the Effects of Surgical Treatment Enhanced with Exercise for Mixed Urinary Incontinence (ESTEEM) trial, designed to compare a combined conservative and surgical treatment approach versus surgery alone for improving patient-centered MUI outcomes at 12 months. ESTEEM is a multisite, prospective, randomized trial of female participants with MUI randomized to a standardized perioperative behavioral/pelvic floor exercise intervention plus midurethral sling versus midurethral sling alone. We describe our methods and four challenges encountered during the design phase: defining the study population, selecting relevant patient-centered outcomes, determining sample size estimates using a patient-reported outcome measure, and designing an analysis plan that accommodates MUI failure rates. A central theme in the design was patient centeredness, which guided many key decisions. Our primary outcome is patient-reported MUI symptoms measured using the Urogenital Distress Inventory (UDI) score at 12 months. Secondary outcomes include quality of life, sexual function, cost-effectiveness, time to failure, and need for additional treatment. The final study design was implemented in November 2013 across eight clinical sites in the Pelvic Floor Disorders Network. As of 27 February 2016, 433 of the 472 targeted participants had been randomized. We describe the ESTEEM protocol and our methods for reaching consensus for methodological challenges in designing a trial for MUI by maintaining the patient perspective at the core of key decisions. This trial will provide information that can directly impact patient care and clinical decision making.
Landsverk, John; Brown, C Hendricks; Rolls Reutz, Jennifer; Palinkas, Lawrence; Horwitz, Sarah McCue
2011-01-01
Implementation science is an emerging field of research with considerable penetration in physical medicine and less in the fields of mental health and social services. There remains a lack of consensus on methodological approaches to the study of implementation processes and tests of implementation strategies. This paper addresses the need for methods development through a structured review that describes design elements in nine studies testing implementation strategies for evidence-based interventions addressing mental health problems of children in child welfare and child mental health settings. Randomized trial designs were dominant with considerable use of mixed method designs in the nine studies published since 2005. The findings are discussed in reference to the limitations of randomized designs in implementation science and the potential for use of alternative designs.
NASA Astrophysics Data System (ADS)
Wang, Tao; Zhou, Guoqing; Wang, Jianzhou; Zhou, Lei
2018-03-01
The artificial ground freezing (AGF) method is widely used in civil and mining engineering, and the thermal regime of the frozen soil around the freezing pipe affects the safety of design and construction. The thermal parameters can be truly random due to heterogeneity of the soil properties, which leads to randomness in the thermal regime of the frozen soil around the freezing pipe. The purpose of this paper is to study the one-dimensional (1D) random thermal regime problem on the basis of a stochastic analysis model and the Monte Carlo (MC) method. Treating the uncertain thermal parameters of frozen soil as random variables, stochastic processes and random fields, the corresponding stochastic thermal regimes of the frozen soil around a single freezing pipe are obtained and analyzed. Taking the variability of each stochastic parameter into account individually, the influence of each stochastic thermal parameter on the stochastic thermal regime is investigated. The results show that the mean temperatures of the frozen soil around the single freezing pipe are the same for the three analogy methods, while the standard deviations differ. The distributions of the standard deviation differ greatly across radial coordinate locations, with the largest standard deviations concentrated in the phase-change area. The data computed with the random variable and stochastic process methods differ greatly from the measured data, whereas the data computed with the random field method agree well with the measured data. Each uncertain thermal parameter has a different effect on the standard deviation of the frozen soil temperature around the single freezing pipe. These results can provide a theoretical basis for the design and construction of AGF.
Confidence intervals for single-case effect size measures based on randomization test inversion.
Michiels, Bart; Heyvaert, Mieke; Meulders, Ann; Onghena, Patrick
2017-02-01
In the current paper, we present a method to construct nonparametric confidence intervals (CIs) for single-case effect size measures in the context of various single-case designs. We use the relationship between a two-sided statistical hypothesis test at significance level α and a 100 (1 - α) % two-sided CI to construct CIs for any effect size measure θ that contain all point null hypothesis θ values that cannot be rejected by the hypothesis test at significance level α. This method of hypothesis test inversion (HTI) can be employed using a randomization test as the statistical hypothesis test in order to construct a nonparametric CI for θ. We will refer to this procedure as randomization test inversion (RTI). We illustrate RTI in a situation in which θ is the unstandardized and the standardized difference in means between two treatments in a completely randomized single-case design. Additionally, we demonstrate how RTI can be extended to other types of single-case designs. Finally, we discuss a few challenges for RTI as well as possibilities when using the method with other effect size measures, such as rank-based nonoverlap indices. Supplementary to this paper, we provide easy-to-use R code, which allows the user to construct nonparametric CIs according to the proposed method.
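A small, hedged Python sketch of the RTI idea for the unstandardized mean difference in a completely randomized two-condition design follows. The scores, grid resolution, and alpha are invented, and the authors' supplementary R code should be preferred for real use:

```python
import itertools
import statistics

def randomization_p(a, b):
    """Two-sided randomization test p-value for a difference in means,
    enumerating all reassignments of the pooled scores to two conditions."""
    pooled = a + b
    observed = abs(statistics.mean(a) - statistics.mean(b))
    count = total = 0
    for idx in itertools.combinations(range(len(pooled)), len(a)):
        ga = [pooled[i] for i in idx]
        gb = [pooled[i] for i in range(len(pooled)) if i not in idx]
        total += 1
        if abs(statistics.mean(ga) - statistics.mean(gb)) >= observed - 1e-12:
            count += 1
    return count / total

def rti_ci(a, b, alpha=0.05, grid=None):
    """Invert the randomization test: keep every shift theta of the A scores
    that the level-alpha test cannot reject (grid and alpha are illustrative)."""
    diff = statistics.mean(a) - statistics.mean(b)
    if grid is None:
        grid = [diff + 0.1 * k for k in range(-100, 101)]
    accepted = [t for t in grid if randomization_p([x - t for x in a], b) > alpha]
    return min(accepted), max(accepted)

treatment = [8.0, 9.0, 7.5, 8.5, 9.5]   # toy single-case measurements, condition A
control = [5.0, 6.0, 5.5, 4.5, 6.5]     # toy measurements, condition B
lo, hi = rti_ci(treatment, control)
```

Each grid value theta is subtracted from the treatment scores and the randomization test is re-run; the CI is the range of theta values the test cannot reject, so no distributional assumptions are required.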
Micro-Randomized Trials: An Experimental Design for Developing Just-in-Time Adaptive Interventions
Klasnja, Predrag; Hekler, Eric B.; Shiffman, Saul; Boruvka, Audrey; Almirall, Daniel; Tewari, Ambuj; Murphy, Susan A.
2015-01-01
Objective This paper presents an experimental design, the micro-randomized trial, developed to support optimization of just-in-time adaptive interventions (JITAIs). JITAIs are mHealth technologies that aim to deliver the right intervention components at the right times and locations to optimally support individuals’ health behaviors. Micro-randomized trials offer a way to optimize such interventions by enabling modeling of causal effects and time-varying effect moderation for individual intervention components within a JITAI. Methods The paper describes the micro-randomized trial design, enumerates research questions that this experimental design can help answer, and provides an overview of the data analyses that can be used to assess the causal effects of studied intervention components and investigate time-varying moderation of those effects. Results Micro-randomized trials enable causal modeling of proximal effects of the randomized intervention components and assessment of time-varying moderation of those effects. Conclusions Micro-randomized trials can help researchers understand whether their interventions are having intended effects, when and for whom they are effective, and what factors moderate the interventions’ effects, enabling creation of more effective JITAIs. PMID:26651463
Optimal fractional order PID design via Tabu Search based algorithm.
Ateş, Abdullah; Yeroglu, Celaleddin
2016-01-01
This paper presents an optimization method based on the Tabu Search Algorithm (TSA) to design a Fractional-Order Proportional-Integral-Derivative (FOPID) controller. All FOPID parameters are computed from random initial conditions using the proposed optimization method. Illustrative examples demonstrate the performance of the proposed FOPID controller design method. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
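The paper's FOPID cost function is not reproduced here, but the tabu search skeleton it builds on can be sketched generically. The snippet below minimizes a stand-in sphere cost from a random initial condition; the neighborhood size, step, and tabu-list length are arbitrary choices, not the authors' settings:

```python
import random

def tabu_search(cost, dim, iters=300, tabu_len=20, step=0.25, seed=None):
    """Minimal tabu search: evaluate random neighbors of the current point,
    move to the best non-tabu neighbor, and remember recently visited points
    to avoid cycling."""
    rng = random.Random(seed)
    current = [rng.uniform(-5.0, 5.0) for _ in range(dim)]   # random init
    best, best_cost = current[:], cost(current)
    tabu = []
    for _ in range(iters):
        neighbors = [
            [x + rng.uniform(-step, step) for x in current] for _ in range(20)
        ]
        candidates = [
            n for n in neighbors
            if all(max(abs(a - b) for a, b in zip(n, t)) > 1e-6 for t in tabu)
        ]
        if not candidates:
            continue
        current = min(candidates, key=cost)   # best admissible move, even uphill
        tabu.append(current[:])
        tabu = tabu[-tabu_len:]               # bounded tabu list
        c = cost(current)
        if c < best_cost:
            best, best_cost = current[:], c
    return best, best_cost

# Toy stand-in for a controller-tuning objective (not the paper's FOPID cost).
sphere = lambda p: sum(x * x for x in p)
params, err = tabu_search(sphere, dim=3, seed=3)
```

Accepting the best admissible neighbor even when it is uphill, while forbidding recently visited points, is what lets tabu search escape local minima that defeat plain hill climbing.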
Dynamic Loads Generation for Multi-Point Vibration Excitation Problems
NASA Technical Reports Server (NTRS)
Shen, Lawrence
2011-01-01
A random-force method has been developed to predict dynamic loads produced by rocket-engine random vibrations for new rocket-engine designs. The method develops random forces at multiple excitation points based on random vibration environments scaled from accelerometer data obtained during hot-fire tests of existing rocket engines. The random-force method applies random forces to the model and creates the expected dynamic response in a manner that simulates how the operating engine applies self-generated random vibration forces (random pressure acting on an area) and produces the responses measured with accelerometers. This innovation includes the methodology (implementation sequence), the computer code, two methods to generate the random-force vibration spectra, and two methods to reduce some of the inherent conservatism in the dynamic loads. The methodology generates the random-force spectra at excitation nodes without requiring artificial boundary conditions in a finite element model. More accurate random dynamic loads than those predicted by current industry methods can then be generated using the random-force spectra. The scaling method used to develop the initial power spectral density (PSD) environments for deriving the random forces for the rocket-engine case is based on the Barrett criteria developed at Marshall Space Flight Center in 1963. This approach can be applied in the aerospace, automotive, and other industries to obtain reliable dynamic loads and responses from a finite element model for any structure subject to multipoint random vibration excitations.
Concurrent design of quasi-random photonic nanostructures
Lee, Won-Kyu; Yu, Shuangcheng; Engel, Clifford J.; Reese, Thaddeus; Rhee, Dongjoon; Chen, Wei
2017-01-01
Nanostructured surfaces with quasi-random geometries can manipulate light over broadband wavelengths and wide ranges of angles. Optimization and realization of stochastic patterns have typically relied on serial, direct-write fabrication methods combined with real-space design. However, this approach is not suitable for customizable features or scalable nanomanufacturing. Moreover, trial-and-error processing cannot guarantee fabrication feasibility because processing–structure relations are not included in conventional designs. Here, we report wrinkle lithography integrated with concurrent design to produce quasi-random nanostructures in amorphous silicon at wafer scales that achieved over 160% light absorption enhancement from 800 to 1,200 nm. The quasi-periodicity of patterns, materials filling ratio, and feature depths could be independently controlled. We statistically represented the quasi-random patterns by Fourier spectral density functions (SDFs) that could bridge the processing–structure and structure–performance relations. Iterative search of the optimal structure via the SDF representation enabled concurrent design of nanostructures and processing. PMID:28760975
Sample Size Calculations for Micro-randomized Trials in mHealth
Liao, Peng; Klasnja, Predrag; Tewari, Ambuj; Murphy, Susan A.
2015-01-01
The use and development of mobile interventions are experiencing rapid growth. In “just-in-time” mobile interventions, treatments are provided via a mobile device and they are intended to help an individual make healthy decisions “in the moment,” and thus have a proximal, near future impact. Currently the development of mobile interventions is proceeding at a much faster pace than that of associated data science methods. A first step toward developing data-based methods is to provide an experimental design for testing the proximal effects of these just-in-time treatments. In this paper, we propose a “micro-randomized” trial design for this purpose. In a micro-randomized trial, treatments are sequentially randomized throughout the conduct of the study, with the result that each participant may be randomized at the 100s or 1000s of occasions at which a treatment might be provided. Further, we develop a test statistic for assessing the proximal effect of a treatment as well as an associated sample size calculator. We conduct simulation evaluations of the sample size calculator in various settings. Rules of thumb that might be used in designing a micro-randomized trial are discussed. This work is motivated by our collaboration on the HeartSteps mobile application designed to increase physical activity. PMID:26707831
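To make the sequential-randomization idea concrete, here is a toy simulation of a micro-randomized trial with a constant proximal effect. The working model (i.i.d. Gaussian outcomes, no time-varying moderation or availability constraints) is a deliberate simplification, and all numbers are illustrative:

```python
import random
import statistics

def simulate_mrt(n_people, n_occasions, effect, prob=0.5, seed=None):
    """Simulate a micro-randomized trial: each participant is re-randomized
    at every decision point, and the proximal effect is estimated as the
    mean outcome difference between treated and untreated occasions."""
    rng = random.Random(seed)
    treated, untreated = [], []
    for _ in range(n_people):
        for _ in range(n_occasions):
            a = 1 if rng.random() < prob else 0          # per-occasion randomization
            y = effect * a + rng.gauss(0.0, 1.0)         # toy proximal outcome
            (treated if a else untreated).append(y)
    return statistics.mean(treated) - statistics.mean(untreated)

# 40 participants, each randomized at 200 decision points.
est = simulate_mrt(n_people=40, n_occasions=200, effect=0.3, seed=11)
```

Because every participant is randomized at hundreds of occasions, even a modest trial accumulates thousands of treated/untreated contrasts, which is why micro-randomized designs can detect small proximal effects; repeating such simulations over candidate sample sizes gives a crude power check alongside the paper's calculator.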
Brown, Justin C.; Troxel, Andrea B.; Ky, Bonnie; Damjanov, Nevena; Zemel, Babette S.; Rickels, Michael R.; Rhim, Andrew D.; Rustgi, Anil K.; Courneya, Kerry S.; Schmitz, Kathryn H.
2016-01-01
Background Observational studies indicate that higher volumes of physical activity are associated with improved disease outcomes among colon cancer survivors. The aim of this report is to describe the purpose, study design, methods, and recruitment results of the COURAGE trial, a National Cancer Institute (NCI) sponsored, phase II, randomized, dose-response exercise trial among colon cancer survivors. Methods/Results The primary objective of the COURAGE trial is to quantify the feasibility, safety, and physiologic effects of low-dose (150 min·wk−1) and high-dose (300 min·wk−1) moderate-intensity aerobic exercise compared to usual-care control group over six months. The exercise groups are provided with in-home treadmills and heart rate monitors. Between January and July 2015, 1,433 letters were mailed using a population-based state cancer registry; 126 colon cancer survivors inquired about participation, and 39 were randomized onto the study protocol. Age was associated with inquiry about study participation (P<0.001) and randomization onto the study protocol (P<0.001). No other demographic, clinical, or geographic characteristics were associated with study inquiry or randomization. The final trial participant was randomized in August 2015. Six month endpoint data collection was completed in February 2016. Discussion The recruitment of colon cancer survivors into an exercise trial is feasible. The findings from this trial will inform key design aspects for future phase 2 and phase 3 randomized controlled trials to examine the efficacy of exercise to improve clinical outcomes among colon cancer survivors. PMID:26970181
Design space exploration for early identification of yield limiting patterns
NASA Astrophysics Data System (ADS)
Li, Helen; Zou, Elain; Lee, Robben; Hong, Sid; Liu, Square; Wang, JinYan; Du, Chunshan; Zhang, Recco; Madkour, Kareem; Ali, Hussein; Hsu, Danny; Kabeel, Aliaa; ElManhawy, Wael; Kwan, Joe
2016-03-01
In order to resolve the causality dilemma of which comes first, accurate design rules or real designs, this paper presents a flow for exploring the layout design space to identify, early on, problematic patterns that will negatively affect yield. A new random layout generation method called the Layout Schema Generator (LSG) is reported; it generates realistic, design-like layouts without any design rule violations. Lithography simulation is then used on the generated layout to discover potentially problematic patterns (hotspots). These hotspot patterns are further explored by randomly inducing feature and context variations in the identified hotspots through a flow called the Hotspot Variation Flow (HSV). Simulation is then performed on this expanded set of layout clips to identify further problematic patterns. These patterns are then classified into design-forbidden patterns, which should be included in the design rule checker, and legal patterns, which need better handling in the RET recipes and processes.
Multilevel Analysis Methods for Partially Nested Cluster Randomized Trials
ERIC Educational Resources Information Center
Sanders, Elizabeth A.
2011-01-01
This paper explores multilevel modeling approaches for 2-group randomized experiments in which a treatment condition involving clusters of individuals is compared to a control condition involving only ungrouped individuals, otherwise known as partially nested cluster randomized designs (PNCRTs). Strategies for comparing groups from a PNCRT in the…
Rationale, Design, and Methods of the Preschool ADHD Treatment Study (PATS)
ERIC Educational Resources Information Center
Kollins, Scott; Greenhill, Laurence; Swanson, James; Wigal, Sharon; Abikoff, Howard; McCracken, James; Riddle, Mark; McGough, James; Vitiello, Benedetto; Wigal, Tim; Skrobala, Anne; Posner, Kelly; Ghuman, Jaswinder; Davies, Mark; Cunningham, Charles; Bauzo, Audrey
2006-01-01
Objective: To describe the rationale and design of the Preschool ADHD Treatment Study (PATS). Method: PATS was a National Institute of Mental Health-funded, multicenter, randomized, efficacy trial designed to evaluate the short-term (5 weeks) efficacy and long-term (40 weeks) safety of methylphenidate (MPH) in preschoolers with…
Bayesian randomized clinical trials: From fixed to adaptive design.
Yin, Guosheng; Lam, Chi Kin; Shi, Haolun
2017-08-01
Randomized controlled studies are the gold standard for phase III clinical trials. Using α-spending functions to control the overall type I error rate, group sequential methods are well established and have been dominating phase III studies. Bayesian randomized design, on the other hand, can be viewed as a complement to, rather than a competitor of, the frequentist methods. For the fixed Bayesian design, the hypothesis testing can be cast in the posterior probability or Bayes factor framework, which has a direct link to the frequentist type I error rate. Bayesian group sequential design relies upon Bayesian decision-theoretic approaches based on backward induction, which is often computationally intensive. Compared with the frequentist approaches, Bayesian methods have several advantages. The posterior predictive probability serves as a useful and convenient tool for trial monitoring, and can be updated at any time as the data accrue during the trial. The Bayesian decision-theoretic framework possesses a direct link to decision making in the practical setting, and can be modeled more realistically to reflect the actual cost-benefit analysis during the drug development process. Other merits include the possibility of hierarchical modeling and the use of informative priors, which would lead to a more comprehensive utilization of information from both historical and longitudinal data. From fixed to adaptive design, we focus on Bayesian randomized controlled clinical trials and make extensive comparisons with frequentist counterparts through numerical studies. Copyright © 2017 Elsevier Inc. All rights reserved.
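For the fixed Bayesian design with binary outcomes, the posterior-probability test mentioned above reduces to a beta-binomial computation. A minimal Monte Carlo sketch follows; the priors, interim counts, and 0.975 threshold are illustrative, and in practice the threshold would be calibrated to the desired frequentist type I error rate:

```python
import random

def posterior_prob_superior(s1, n1, s0, n0, a=1.0, b=1.0, draws=100_000, seed=0):
    """Monte Carlo posterior probability that arm 1's response rate exceeds
    arm 0's, under independent Beta(a, b) priors on binomial data."""
    rng = random.Random(seed)
    wins = sum(
        rng.betavariate(a + s1, b + n1 - s1) > rng.betavariate(a + s0, b + n0 - s0)
        for _ in range(draws)
    )
    return wins / draws

# Hypothetical interim look: 30/50 responders on treatment vs 18/50 on control.
prob = posterior_prob_superior(30, 50, 18, 50)
stop_for_efficacy = prob > 0.975   # example efficacy threshold
```

Because the Beta posterior updates conjugately with each new response, this quantity can be recomputed at every interim look as the data accrue, which is the monitoring convenience the abstract highlights.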
ERIC Educational Resources Information Center
Kautz, Tim; Schochet, Peter Z.; Tilley, Charles
2017-01-01
A new design-based theory has recently been developed to estimate impacts for randomized controlled trials (RCTs) and basic quasi-experimental designs (QEDs) for a wide range of designs used in social policy research (Imbens & Rubin, 2015; Schochet, 2016). These methods use the potential outcomes framework and known features of study designs…
Luce, Bryan R; Broglio, Kristine R; Ishak, K Jack; Mullins, C Daniel; Vanness, David J; Fleurence, Rachael; Saunders, Elijah; Davis, Barry R
2013-01-01
Background Randomized clinical trials, particularly for comparative effectiveness research (CER), are frequently criticized for being overly restrictive or untimely for health-care decision making. Purpose Our prospectively designed REsearch in ADAptive methods for Pragmatic Trials (RE-ADAPT) study is a ‘proof of concept’ to stimulate investment in Bayesian adaptive designs for future CER trials. Methods We will assess whether Bayesian adaptive designs offer potential efficiencies in CER by simulating a re-execution of the Antihypertensive and Lipid Lowering Treatment to Prevent Heart Attack Trial (ALLHAT) study using actual data from ALLHAT. Results We prospectively define seven alternate designs consisting of various combinations of arm dropping, adaptive randomization, and early stopping and describe how these designs will be compared to the original ALLHAT design. We identify the one particular design that would have been executed, which incorporates early stopping and information-based adaptive randomization. Limitations While the simulation realistically emulates patient enrollment, interim analyses, and adaptive changes to design, it cannot incorporate key features like the involvement of a data monitoring committee in making decisions about adaptive changes. Conclusion This article describes our analytic approach for RE-ADAPT. The next stage of the project is to conduct the re-execution analyses using the seven prespecified designs and the original ALLHAT data. PMID:23983160
Sylvia, Louisa G.; Reilly-Harrington, Noreen A.; Leon, Andrew C.; Kansky, Christine I.; Ketter, Terence A.; Calabrese, Joseph R.; Thase, Michael E.; Bowden, Charles L.; Friedman, Edward S.; Ostacher, Michael J.; Iosifescu, Dan V.; Severe, Joanne; Nierenberg, Andrew A.
2013-01-01
Background High attrition rates, which occur frequently in longitudinal clinical trials of interventions for bipolar disorder, limit the interpretation of results. Purpose The aim of this article is to present design approaches that limited attrition in the Lithium Use for Bipolar Disorder (LiTMUS) Study. Methods LiTMUS was a 6-month randomized, longitudinal multi-site comparative effectiveness trial that examined bipolar participants who were at least mildly ill. Participants were randomized to either low to moderate doses of lithium or no lithium, in addition to other treatments needed for mood stabilization administered in a guideline-informed, empirically supported, and personalized fashion (N=283). Results Components of the study design that may have contributed to the low attrition rate of the study included use of: (1) an intent-to-treat design; (2) a randomized adjunctive single-blind design; (3) participant reimbursement; (4) intent-to-attend the next study visit (includes a discussion of attendance obstacles when intention is low); (5) quality care with limited participant burden; and (6) target windows for study visits. Limitations Site differences and the effectiveness and tolerability data have not yet been analyzed. Conclusions These components of the LiTMUS study design may have reduced the probability of attrition, which would inform the design of future randomized clinical effectiveness trials. PMID:22076437
ERIC Educational Resources Information Center
Cook, David A.; Thompson, Warren G.; Thomas, Kris G.; Thomas, Matthew R.
2009-01-01
Background: Adaptation to learning styles has been proposed to enhance learning. Objective: We hypothesized that learners with sensing learning style would perform better using a problem-first instructional method while intuitive learners would do better using an information-first method. Design: Randomized, controlled, crossover trial. Setting:…
Efficacy of Virtual Patients in Medical Education: A Meta-Analysis of Randomized Studies
ERIC Educational Resources Information Center
Consorti, Fabrizio; Mancuso, Rosaria; Nocioni, Martina; Piccolo, Annalisa
2012-01-01
A meta-analysis was performed to assess the Effect Size (ES) from randomized studies comparing the effect of educational interventions in which Virtual patients (VPs) were used either as an alternative method or additive to usual curriculum versus interventions based on more traditional methods. Meta-analysis was designed, conducted and reported…
ERIC Educational Resources Information Center
Modebelu, M. N.; Ogbonna, C. C.
2014-01-01
This study aimed at determining the effect of reform-based-instructional method learning styles on students' achievement and retention in mathematics. A sample size of 119 students was randomly selected. A quasi-experimental design comprising pre-test, post-test, and a randomized control group was employed. The Collin Rose learning styles…
Efficacy of Parent-Child Interaction Therapy with Chinese ADHD Children: Randomized Controlled Trial
ERIC Educational Resources Information Center
Leung, Cynthia; Tsang, Sandra; Ng, Gene S. H.; Choi, S. Y.
2017-01-01
Purpose: This study aimed to evaluate the efficacy of Parent-Child Interaction Therapy (PCIT) in Chinese children with attention-deficit/hyperactivity disorder (ADHD) or ADHD features. Methods: This study adopted a randomized controlled trial design without blinding. Participants were randomized into either the intervention group (n = 32) and…
A Data Management System Integrating Web-Based Training and Randomized Trials
ERIC Educational Resources Information Center
Muroff, Jordana; Amodeo, Maryann; Larson, Mary Jo; Carey, Margaret; Loftin, Ralph D.
2011-01-01
This article describes a data management system (DMS) developed to support a large-scale randomized study of an innovative web-course that was designed to improve substance abuse counselors' knowledge and skills in applying a substance abuse treatment method (i.e., cognitive behavioral therapy; CBT). The randomized trial compared the performance…
Sample Selection in Randomized Experiments: A New Method Using Propensity Score Stratified Sampling
ERIC Educational Resources Information Center
Tipton, Elizabeth; Hedges, Larry; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Caverly, Sarah
2014-01-01
Randomized experiments are often seen as the "gold standard" for causal research. Despite the fact that experiments use random assignment to treatment conditions, units are seldom selected into the experiment using probability sampling. Very little research on experimental design has focused on how to make generalizations to well-defined…
What Is Design-Based Causal Inference for RCTs and Why Should I Use It? NCEE 2017-4025
ERIC Educational Resources Information Center
Schochet, Peter Z.
2017-01-01
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies. The impact estimators are derived using the building blocks of experimental designs with minimal assumptions, and have good statistical properties. The methods apply to randomized controlled trials (RCTs) and…
Analysis/design of strip reinforced random composites (strip hybrids)
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Sinclair, J. H.
1978-01-01
Advanced analysis methods and composite mechanics were applied to a strip-reinforced random composite square panel with fixed ends to illustrate the use of these methods for the a priori assessment of the composite panel when subjected to complex loading conditions. The panel was assumed to be of E-glass random composite. The strips were assumed to be of three advanced unidirectional composites to cover a range of low, intermediate, and high modulus stiffness. The panels were assumed to be subjected to complex loadings to assess their adequacy as load-carrying members in auto body, aircraft engine nacelle and windmill blade applications. The results show that strip hybrid panels can be several times more structurally efficient than the random composite base materials. Some of the results are presented in graphical form and procedures are described for use of these graphs as guides for preliminary design of strip hybrids.
Ryeznik, Yevgen; Sverdlov, Oleksandr
2018-06-04
Randomization designs for multiarm clinical trials are increasingly used in practice, especially in phase II dose-ranging studies. Many new methods have been proposed in the literature; however, there is a lack of systematic, head-to-head comparison of the competing designs. In this paper, we systematically investigate statistical properties of various restricted randomization procedures for multiarm trials with fixed and possibly unequal allocation ratios. The design operating characteristics include measures of allocation balance, randomness of treatment assignments, variations in the allocation ratio, and statistical characteristics such as type I error rate and power. The results from the current paper should help clinical investigators select an appropriate randomization procedure for their clinical trial. We also provide a web-based R shiny application that can be used to reproduce all results in this paper and run simulations under additional user-defined experimental scenarios. Copyright © 2018 John Wiley & Sons, Ltd.
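A minimal sketch of one restricted randomization procedure of the kind such comparisons cover: permuted-block randomization for a multiarm trial with a fixed, unequal allocation ratio. The three arms, 2:1:1 ratio, and sample size are hypothetical; the paper itself compares many procedures:

```python
import random

def permuted_block_schedule(n, ratios, seed=2024):
    """Permuted-block randomization with fixed, possibly unequal allocation:
    each block contains every arm repeated per its allocation ratio, shuffled."""
    rng = random.Random(seed)
    block_template = [arm for arm, r in ratios.items() for _ in range(r)]
    schedule = []
    while len(schedule) < n:
        block = block_template[:]
        rng.shuffle(block)
        schedule.extend(block)
    return schedule[:n]

# Hypothetical three-arm trial with 2:1:1 allocation.
schedule = permuted_block_schedule(120, {'A': 2, 'B': 1, 'C': 1})
```

Because blocks always close on the target ratio, this design scores well on allocation balance at the cost of some predictability of assignments, one of the trade-offs the abstract's operating characteristics quantify.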
NASA Technical Reports Server (NTRS)
Tomberlin, T. J.
1985-01-01
Research studies of residents' responses to noise consist of interviews with samples of individuals who are drawn from a number of different compact study areas. The statistical techniques developed provide a basis for those sample design decisions. These techniques are suitable for a wide range of sample survey applications. A sample may consist of a random sample of residents selected from a sample of compact study areas, or in a more complex design, of a sample of residents selected from a sample of larger areas (e.g., cities). The techniques may be applied to estimates of the effects on annoyance of noise level, numbers of noise events, the time-of-day of the events, ambient noise levels, or other factors. Methods are provided for determining, in advance, how accurately these effects can be estimated for different sample sizes and study designs. Using a simple cost function, they also provide for optimum allocation of the sample across the stages of the design for estimating these effects. These techniques are developed via a regression model in which the regression coefficients are assumed to be random, with components of variance associated with the various stages of a multi-stage sample design.
Melvin, Neal R; Poda, Daniel; Sutherland, Robert J
2007-10-01
When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random start point outside the structure of interest, and sampling relevant objects at sites that are placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy, and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates the systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative to accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
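The systematic random sampling strategy described here reduces to a single random start inside the first interval followed by equidistant sites. A dependency-free one-dimensional sketch (the extent and interval values are hypothetical):

```python
import random

def systematic_sample_sites(extent, interval, seed=7):
    """Systematic random sampling in one dimension: one random start point
    inside the first interval, then sites at equidistant steps."""
    rng = random.Random(seed)
    x = rng.uniform(0.0, interval)  # random start, uniform on [0, interval)
    sites = []
    while x < extent:
        sites.append(x)
        x += interval
    return sites

sites = systematic_sample_sites(extent=1000.0, interval=50.0)
```

The random start makes every location equally likely to be sampled, while the fixed spacing gives the efficiency the abstract refers to.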
Causal inference from observational data.
Listl, Stefan; Jürges, Hendrik; Watt, Richard G
2016-10-01
Randomized controlled trials have long been considered the 'gold standard' for causal inference in clinical research. In the absence of randomized experiments, identification of reliable intervention points to improve oral health is often perceived as a challenge. But other fields of science, such as social science, have always been challenged by ethical constraints on conducting randomized controlled trials. Methods have been established to make causal inference using observational data, and these methods are becoming increasingly relevant in clinical medicine, health policy and public health research. This study provides an overview of state-of-the-art methods specifically designed for causal inference in observational data, including difference-in-differences (DiD) analyses, instrumental variables (IV), regression discontinuity designs (RDD) and fixed-effects panel data analysis. The described methods may be particularly useful in dental research, not least because of the increasing availability of routinely collected administrative data and electronic health records ('big data'). © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
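For example, the difference-in-differences estimator compares the pre-to-post change in a treated group with the change in an untreated group, so that shared time trends cancel. A toy sketch with hypothetical outcome data:

```python
import statistics

def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Difference-in-differences: the treated group's pre-to-post change
    minus the control group's change, removing shared time trends."""
    return (statistics.fmean(treat_post) - statistics.fmean(treat_pre)) - (
        statistics.fmean(ctrl_post) - statistics.fmean(ctrl_pre)
    )

# Hypothetical outcomes: both groups drift up ~1 unit over time; the
# treated group gains an extra 2 units after the intervention.
effect = did_estimate([10, 11, 9], [13, 14, 12], [10, 10, 10], [11, 11, 11])
```

The key identifying assumption, parallel trends in the absence of treatment, is what substitutes for randomization here.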
Small, J R
1993-01-01
This paper is a study into the effects of experimental error on the estimated values of flux control coefficients obtained using specific inhibitors. Two possible techniques for analysing the experimental data are compared: a simple extrapolation method (the so-called graph method) and a non-linear function fitting method. For these techniques, the sources of systematic errors are identified and the effects of systematic and random errors are quantified, using both statistical analysis and numerical computation. It is shown that the graph method is very sensitive to random errors and, under all conditions studied, that the fitting method, even under conditions where the assumptions underlying the fitted function do not hold, outperformed the graph method. Possible ways of designing experiments to minimize the effects of experimental errors are analysed and discussed. PMID:8257434
The Efficacy of Parent-Child Interaction Therapy with Chinese Families: Randomized Controlled Trial
ERIC Educational Resources Information Center
Leung, Cynthia; Tsang, Sandra; Sin, Tammy C. S.; Choi, Siu-yan
2015-01-01
Objective: This study aimed to examine the efficacy of the Parent-Child Interaction Therapy (PCIT) in Hong Kong Chinese families, using randomized controlled trial design. Methods: The participants included 111 Hong Kong Chinese parents with children aged 2--7 years old, who were randomized into the intervention group (n = 54) and control group (n…
ERIC Educational Resources Information Center
Conn, Vicki S.; Hafdahl, Adam R.; Cooper, Pamela S.; Ruppar, Todd M.; Mehr, David R.; Russell, Cynthia L.
2009-01-01
Purpose: This study investigated the effectiveness of interventions to improve medication adherence (MA) in older adults. Design and Methods: Meta-analysis was used to synthesize results of 33 published and unpublished randomized controlled trials. Random-effects models were used to estimate overall mean effect sizes (ESs) for MA, knowledge,…
ERIC Educational Resources Information Center
Maynard, Brandy R.; Kjellstrand, Elizabeth K.; Thompson, Aaron M.
2014-01-01
Objectives: This study examined the effects of Check & Connect (C&C) on the attendance, behavior, and academic outcomes of at-risk youth in a field-based effectiveness trial. Method: A multisite randomized block design was used, wherein 260 primarily Hispanic (89%) and economically disadvantaged (74%) students were randomized to treatment…
Venter, Anre; Maxwell, Scott E; Bolig, Erika
2002-06-01
Adding a pretest as a covariate to a randomized posttest-only design increases statistical power, as does the addition of intermediate time points to a randomized pretest-posttest design. Although typically 5 waves of data are required in this instance to produce meaningful gains in power, a 3-wave intensive design allows the evaluation of the straight-line growth model and may reduce the effect of missing data. The authors identify the statistically most powerful method of data analysis in the 3-wave intensive design. If straight-line growth is assumed, the pretest-posttest slope must assume fairly extreme values for the intermediate time point to increase power beyond the standard analysis of covariance on the posttest with the pretest as covariate, ignoring the intermediate time point.
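The power gain from adding a pretest covariate can be sketched with the standard ANCOVA variance factor: if the pretest correlates rho with the posttest, the error variance (and hence, approximately, the required sample size) is multiplied by 1 - rho^2. A one-line illustration (the 0.7 correlation is a hypothetical value, not one from the article):

```python
def ancova_variance_factor(rho):
    """Error-variance multiplier from adjusting for a baseline covariate
    with pretest-posttest correlation rho; approximately the relative
    sample size needed versus a posttest-only analysis."""
    return 1.0 - rho ** 2

# A pretest correlated 0.7 with the posttest cuts the required sample
# size roughly in half.
factor = ancova_variance_factor(0.7)
```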
2011-01-01
Background Energy-based surgical scalpels are designed to efficiently transect and seal blood vessels using thermal energy to promote protein denaturation and coagulation. Assessment and design improvement of ultrasonic scalpel performance relies on both in vivo and ex vivo testing. The objective of this work was to design and implement a robust, experimental test matrix with randomization restrictions and predictive statistical power, which allowed for identification of those experimental variables that may affect the quality of the seal obtained ex vivo. Methods The design of the experiment included three factors: temperature (two levels); the type of solution used to perfuse the artery during transection (three types); and artery type (two types) resulting in a total of twelve possible treatment combinations. Burst pressures of porcine carotid and renal arteries sealed ex vivo were assigned as the response variable. Results The experimental test matrix was designed and carried out as a split-plot experiment in order to assess the contributions of several variables and their interactions while accounting for randomization restrictions present in the experimental setup. The statistical software package SAS was utilized and PROC MIXED was used to account for the randomization restrictions in the split-plot design. The combination of temperature, solution, and vessel type had a statistically significant impact on seal quality. Conclusions The design and implementation of a split-plot experimental test-matrix provided a mechanism for addressing the existing technical randomization restrictions of ex vivo ultrasonic scalpel performance testing, while preserving the ability to examine the potential effects of independent factors or variables. 
This method for generating the experimental design and the statistical analyses of the resulting data are adaptable to a wide variety of experimental problems involving large-scale tissue-based studies of medical or experimental device efficacy and performance. PMID:21599963
Study on Nonlinear Vibration Analysis of Gear System with Random Parameters
NASA Astrophysics Data System (ADS)
Tong, Cao; Liu, Xiaoyuan; Fan, Li
2018-03-01
In order to study the dynamic characteristics of a gear nonlinear vibration system and the influence of random parameters, a nonlinear stochastic vibration model of a 3-DOF gear system is first established based on Newton’s law, and the random response of gear vibration is simulated by a stepwise integration method. Second, the influence of stochastic parameters such as meshing damping, tooth side gap and excitation frequency on the dynamic response of the gear nonlinear system is analyzed using stability analysis tools such as bifurcation diagrams and the Lyapunov exponent method. The analysis shows that the stochastic parameters cannot be neglected: they can cause random bifurcation and chaos in the system response. This study provides an important reference for vibration engineering designers.
ERIC Educational Resources Information Center
Bockenholt, Ulf; Van Der Heijden, Peter G. M.
2007-01-01
Randomized response (RR) is a well-known method for measuring sensitive behavior. Yet this method is not often applied because: (i) its lower efficiency and the resulting need for larger sample sizes make applications of RR costly; (ii) despite its privacy-protection mechanism, the RR design may not be followed by every respondent; and…
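One common RR design, sketched here from general knowledge rather than from this abstract, is Warner's model: each respondent answers the sensitive statement with probability p and its negation otherwise, and the population prevalence is recovered from the observed proportion of 'yes' answers (the counts below are hypothetical):

```python
def warner_estimate(yes_count, n, p):
    """Warner's randomized response estimator. P(yes) = p*pi + (1-p)*(1-pi),
    so pi = (lam - (1-p)) / (2p - 1), where lam is the observed 'yes' rate.
    Requires p != 0.5."""
    lam = yes_count / n
    return (lam - (1.0 - p)) / (2.0 * p - 1.0)

pi_hat = warner_estimate(yes_count=420, n=1000, p=0.7)
```

The division by (2p - 1) inflates the estimator's variance as p approaches 0.5, which is exactly the efficiency loss, and consequent need for larger samples, that the abstract cites as limitation (i).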
Methodology Series Module 5: Sampling Strategies.
Setia, Maninder Singh
2016-01-01
Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin etc.); and 2) non-probability sampling - based on the researcher's choice from a population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling methods (such as simple random sampling or stratified random sampling) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study. PMID:27688438
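As a sketch of one probability sampling method mentioned above, a proportionally allocated stratified random sample (the sampling frame, strata, and sizes are hypothetical):

```python
import random

def stratified_sample(strata, total_n, seed=42):
    """Proportionally allocated stratified random sample: each stratum
    contributes units in proportion to its share of the population."""
    rng = random.Random(seed)
    pop = sum(len(units) for units in strata.values())
    return {name: rng.sample(units, round(total_n * len(units) / pop))
            for name, units in strata.items()}

# Hypothetical sampling frame: 800 urban and 200 rural units.
frame = {'urban': list(range(800)), 'rural': list(range(800, 1000))}
sample = stratified_sample(frame, total_n=100)
```

Because selection within each stratum is by chance, every unit has a known, non-zero probability of inclusion, the defining property of probability sampling.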
ERIC Educational Resources Information Center
To, Carol K. S.; Lui, Hoi Ming; Li, Xin Xin; Lam, Gary Y. H
2015-01-01
Purpose: In this study, we aimed to evaluate the efficacy of sentence-combining (SC) and narrative-based (NAR) intervention approaches to syntax intervention using a randomized-controlled-trial design. Method: Fifty-two Cantonese-speaking, school-age children with language impairment were assigned randomly to either the SC or the NAR treatment…
Enhancements and Algorithms for Avionic Information Processing System Design Methodology.
1982-06-16
programming algorithm is enhanced by incorporating task precedence constraints and hardware failures. Stochastic network methods are used to analyze … allocations in the presence of random fluctuations. Graph theoretic methods are used to analyze hardware designs, and new designs are constructed with … There, spatial dynamic programming (SDP) was used to solve a static, deterministic software allocation problem. Under the current contract the SDP
Probability techniques for reliability analysis of composite materials
NASA Technical Reports Server (NTRS)
Wetherhold, Robert C.; Ucci, Anthony M.
1994-01-01
Traditional design approaches for composite materials have employed deterministic criteria for failure analysis. New approaches are required to predict the reliability of composite structures since strengths and stresses may be random variables. This report will examine and compare methods used to evaluate the reliability of composite laminae. The two types of methods that will be evaluated are fast probability integration (FPI) methods and Monte Carlo methods. In these methods, reliability is formulated as the probability that an explicit function of random variables is less than a given constant. Using failure criteria developed for composite materials, a function of design variables can be generated which defines a 'failure surface' in probability space. A number of methods are available to evaluate the integration over the probability space bounded by this surface; this integration delivers the required reliability. The methods which will be evaluated are: the first order, second moment FPI methods; second order, second moment FPI methods; the simple Monte Carlo; and an advanced Monte Carlo technique which utilizes importance sampling. The methods are compared for accuracy, efficiency, and for the conservativism of the reliability estimation. The methodology involved in determining the sensitivity of the reliability estimate to the design variables (strength distributions) and importance factors is also presented.
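The simple Monte Carlo approach described above can be sketched as follows: sample the random variables, evaluate the limit-state function g = strength - stress, and estimate reliability as 1 - P(g < 0). The normal distributions and their parameters below are illustrative assumptions, not values from the report:

```python
import random

def monte_carlo_reliability(n_sim=200_000, seed=3):
    """Simple Monte Carlo reliability estimate: failure occurs when the
    limit-state function g = strength - stress is negative."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_sim):
        strength = rng.gauss(500.0, 40.0)  # illustrative laminate strength, MPa
        stress = rng.gauss(350.0, 30.0)    # illustrative applied stress, MPa
        failures += (strength - stress) < 0.0
    return 1.0 - failures / n_sim

reliability = monte_carlo_reliability()
```

Plain Monte Carlo needs many samples to resolve small failure probabilities, which motivates the importance-sampling and FPI alternatives the abstract compares.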
Simulation methods to estimate design power: an overview for applied research
2011-01-01
Background Estimating the required sample size and statistical power for a study is an integral part of study design. For standard designs, power equations provide an efficient solution to the problem, but they are unavailable for many complex study designs that arise in practice. For such complex study designs, computer simulation is a useful alternative for estimating study power. Although this approach is well known among statisticians, in our experience many epidemiologists and social scientists are unfamiliar with the technique. This article aims to address this knowledge gap. Methods We review an approach to estimate study power for individual- or cluster-randomized designs using computer simulation. This flexible approach arises naturally from the model used to derive conventional power equations, but extends those methods to accommodate arbitrarily complex designs. The method is universally applicable to a broad range of designs and outcomes, and we present the material in a way that is approachable for quantitative, applied researchers. We illustrate the method using two examples (one simple, one complex) based on sanitation and nutritional interventions to improve child growth. Results We first show how simulation reproduces conventional power estimates for simple randomized designs over a broad range of sample scenarios to familiarize the reader with the approach. We then demonstrate how to extend the simulation approach to more complex designs. Finally, we discuss extensions to the examples in the article, and provide computer code to efficiently run the example simulations in both R and Stata. Conclusions Simulation methods offer a flexible option to estimate statistical power for standard and non-traditional study designs and parameters of interest. The approach we have described is universally applicable for evaluating study designs used in epidemiologic and social science research. PMID:21689447
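The simulation approach to power is easy to sketch for the simplest case, a two-arm individually randomized trial with a continuous outcome: generate data under the alternative many times, analyse each simulated trial, and count rejections. The effect size, sd, and sample size below are hypothetical, and a z-test with known sd keeps the sketch dependency-free:

```python
import math
import random
import statistics

def simulated_power(n_per_arm, effect, sd, n_sim=2000, seed=11):
    """Estimate power by simulation: repeatedly generate trial data under
    the alternative, analyse each simulated trial, and count rejections."""
    rng = random.Random(seed)
    z_crit = 1.959964  # two-sided 5% critical value
    se = sd * math.sqrt(2.0 / n_per_arm)
    rejections = 0
    for _ in range(n_sim):
        ctrl = [rng.gauss(0.0, sd) for _ in range(n_per_arm)]
        trt = [rng.gauss(effect, sd) for _ in range(n_per_arm)]
        z = (statistics.fmean(trt) - statistics.fmean(ctrl)) / se
        rejections += abs(z) > z_crit
    return rejections / n_sim

# About 64 per arm gives roughly 80% power for a standardized effect of 0.5.
power = simulated_power(n_per_arm=64, effect=0.5, sd=1.0)
```

Swapping the data-generating step for a clustered model (random cluster effects) extends the same loop to cluster-randomized designs, which is the flexibility the abstract emphasizes.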
Hernández-Cordero, Sonia; González-Castell, Dinorah; Rodríguez-Ramírez, Sonia; Villanueva-Borbolla, María Ángeles; Unar, Mishel; Barquera, Simón; de Cossío, Teresita González; Rivera-Dommarco, Juan; Popkin, Barry M
2014-01-01
Objective To describe the design, methods, and challenges encountered during a randomized clinical trial aimed to promote water intake for reducing risks of metabolic syndrome in Mexican women. Materials and methods In a randomized clinical trial in Cuernavaca, Mexico, overweight and obese (body mass index [BMI] ≥ 25 < 39) women, 18 – < 45 years old with an intake of sugar-sweetened beverages ≥ 250 kilocalories per day (kcal/day) were randomly allocated to the water and education provision group (n = 120) or the education provision only group (n = 120). Results We screened 1 756 women. The main difficulties encountered were identifying participants with the recruitment criteria, delivering water to participants, and the time demanded from the study participants. Conclusions The trial’s main challenges were difficulties surrounding recruitment, delivery of the intervention, and the time demanded from the study participants. Modifications were effectively implemented without jeopardizing the original protocol. PMID:24715012
Some practical problems in implementing randomization.
Downs, Matt; Tucker, Kathryn; Christ-Schmidt, Heidi; Wittes, Janet
2010-06-01
While often theoretically simple, implementing randomization to treatment in a masked, but confirmable, fashion can prove difficult in practice. At least three categories of problems occur in randomization: (1) bad judgment in the choice of method, (2) design and programming errors in implementing the method, and (3) human error during the conduct of the trial. This article focuses on these latter two types of errors, dealing operationally with what can go wrong after trial designers have selected the allocation method. We offer several case studies and corresponding recommendations for lessening the frequency of problems in allocating treatment or for mitigating the consequences of errors. Recommendations include: (1) reviewing the randomization schedule before starting a trial, (2) being especially cautious of systems that use on-demand random number generators, (3) drafting unambiguous randomization specifications, (4) performing thorough testing before entering a randomization system into production, (5) maintaining a dataset that captures the values investigators used to randomize participants, thereby allowing the process of treatment allocation to be reproduced and verified, (6) resisting the urge to correct errors that occur in individual treatment assignments, (7) preventing inadvertent unmasking to treatment assignments in kit allocations, and (8) checking a sample of study drug kits to allow detection of errors in drug packaging and labeling. Although we performed a literature search of documented randomization errors, the examples that we provide and the resultant recommendations are based largely on our own experience in industry-sponsored clinical trials. We do not know how representative our experience is or how common errors of the type we have seen occur. Our experience underscores the importance of verifying the integrity of the treatment allocation process before and during a trial. Clinical Trials 2010; 7: 235-245. http://ctj.sagepub.com.
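Recommendations (2) and (5) above argue against on-demand random number generators and for a reproducible allocation record. One way to achieve this, sketched here with a hypothetical two-arm permuted-block design, is to generate the entire schedule from a fixed, recorded seed so it can be reviewed before the trial and regenerated for verification afterwards:

```python
import random

def make_schedule(n, block_size=4, seed=20100601):
    """Two-arm permuted-block schedule generated from a fixed, recorded seed,
    so the treatment allocation can be reproduced and verified at any time."""
    rng = random.Random(seed)
    schedule = []
    while len(schedule) < n:
        block = ['A'] * (block_size // 2) + ['B'] * (block_size // 2)
        rng.shuffle(block)
        schedule.extend(block)
    return schedule[:n]

# Regenerating with the recorded seed yields an identical schedule, which
# supports pre-trial review and post-hoc verification of every assignment.
assert make_schedule(100) == make_schedule(100)
```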
Correction of confounding bias in non-randomized studies by appropriate weighting.
Schmoor, Claudia; Gall, Christine; Stampf, Susanne; Graf, Erika
2011-03-01
In non-randomized studies, the assessment of a causal effect of treatment or exposure on outcome is hampered by possible confounding. Applying multiple regression models including the effects of treatment and covariates on outcome is the well-known classical approach to adjust for confounding. In recent years other approaches have been promoted. One of them is based on the propensity score and considers the effect of possible confounders on treatment as a relevant criterion for adjustment. Another proposal is based on using an instrumental variable. Here inference relies on a factor, the instrument, which affects treatment but is thought to be otherwise unrelated to outcome, so that it mimics randomization. Each of these approaches can basically be interpreted as a simple reweighting scheme, designed to address confounding. The procedures will be compared with respect to their fundamental properties, namely, which bias they aim to eliminate, which effect they aim to estimate, and which parameter is modelled. We will expand our overview of methods for analysis of non-randomized studies to methods for analysis of randomized controlled trials and show that analyses of both study types may target different effects and different parameters. The considerations will be illustrated using a breast cancer study with a so-called Comprehensive Cohort Study design, including a randomized controlled trial and a non-randomized study in the same patient population as sub-cohorts. This design offers ideal opportunities to discuss and illustrate the properties of the different approaches. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
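The reweighting view can be made concrete with inverse-probability-of-treatment weighting, one propensity-score-based scheme: treated subjects receive weight 1/e and controls 1/(1-e), where e is the propensity score. In this toy sketch the propensities and outcomes are hypothetical and the propensities are taken as known, whereas in practice they are estimated:

```python
def ipw_effect(records):
    """Weighted difference in mean outcomes between treated and control
    groups, with weights 1/e (treated) and 1/(1-e) (control), re-creating
    a pseudo-population in which treatment is unconfounded."""
    wy_t = w_t = wy_c = w_c = 0.0
    for y, treated, e in records:
        if treated:
            w = 1.0 / e
            wy_t += w * y
            w_t += w
        else:
            w = 1.0 / (1.0 - e)
            wy_c += w * y
            w_c += w
    return wy_t / w_t - wy_c / w_c

# Hypothetical records: (outcome, treated?, propensity score)
data = [(3.0, True, 0.8), (2.0, False, 0.8), (2.5, True, 0.2), (1.0, False, 0.2)]
effect = ipw_effect(data)
```

The weights address confounding only for covariates captured in the propensity score, one of the fundamental properties on which the abstract compares the approaches.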
Multiaxis Rainflow Fatigue Methods for Nonstationary Vibration
NASA Technical Reports Server (NTRS)
Irvine, T.
2016-01-01
Mechanical structures and components may be subjected to cyclical loading conditions, including sine and random vibration. Such systems must be designed and tested accordingly. Rainflow cycle counting is the standard method for reducing a stress time history to a table of amplitude-cycle pairings prior to the Palmgren-Miner cumulative damage calculation. The damage calculation is straightforward for sinusoidal stress but very complicated for random stress, particularly for nonstationary vibration. This paper evaluates candidate methods and makes a recommendation for further study of a hybrid technique.
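Given the amplitude-cycle table that rainflow counting produces, the Palmgren-Miner step itself is simple; a minimal sketch, assuming a hypothetical S-N curve of the form N(S) = A·S^(-b) and illustrative cycle counts:

```python
# Palmgren-Miner cumulative damage from a rainflow amplitude-cycle table.
# The S-N curve constants A and b and the cycle table are illustrative only.
A, b = 1.0e12, 4.0                       # hypothetical S-N curve: N(S) = A * S**-b

def cycles_to_failure(stress_amplitude):
    """Allowable cycles at a given stress amplitude (MPa)."""
    return A * stress_amplitude ** -b

# (stress amplitude in MPa, counted cycles) pairs from rainflow counting
rainflow_table = [(50.0, 2.0e4), (80.0, 5.0e3), (120.0, 8.0e2)]

damage = sum(n / cycles_to_failure(s) for s, n in rainflow_table)
print(f"Miner damage index: {damage:.3f}")   # failure predicted when the index reaches 1.0
```

The difficulty the paper addresses lies upstream of this step: reducing a nonstationary random stress history to a defensible amplitude-cycle table in the first place.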
Knight, Stacey; Camp, Nicola J
2011-04-01
Current common wisdom posits that association analyses using family-based designs have inflated type 1 error rates (if relationships are ignored) and that independent controls are more powerful than familial controls. We explore these suppositions. We show theoretically that family-based designs can have deflated type 1 error rates. Through simulation, we examine the validity and power of family designs for several scenarios: cases from randomly or selectively ascertained pedigrees; and familial or independent controls. Family structures considered are as follows: sibships, nuclear families, moderate-sized and extended pedigrees. Three methods were considered with the χ² test for trend: variance correction (VC), weighted (weights assigned to account for genetic similarity), and naïve (ignoring relatedness), as well as the Modified Quasi-likelihood Score (MQLS) test. Selectively ascertained pedigrees had similar levels of disease enrichment; random ascertainment had no such restriction. Data for 1,000 cases and 1,000 controls were created under the null and alternate models. The VC and MQLS methods were always valid. The naïve method was anti-conservative if independent controls were used and valid or conservative in designs with familial controls. The weighted association method was generally valid for independent controls, and was conservative for familial controls. With regard to power, independent controls were more powerful for small-to-moderate selectively ascertained pedigrees, but familial and independent controls were equivalent in the extended pedigrees, and familial controls were consistently more powerful for all randomly ascertained pedigrees. These results suggest a more complex situation than previously assumed, which has important implications for study design and analysis. © 2011 Wiley-Liss, Inc.
Handley, Margaret A; Schillinger, Dean; Shiboski, Stephen
2011-01-01
Although randomized controlled trials are often a gold standard for determining intervention effects, in the area of practice-based research (PBR) there are many situations in which individual randomization is not possible. Alternative approaches to evaluating interventions have received increased attention, particularly those that retain elements of randomization such that they can be considered "controlled" trials. Methodological design elements and practical implementation considerations for two quasi-experimental design approaches with considerable promise in PBR settings, the stepped-wedge design and its variant, the wait-list cross-over design, are presented along with a case study from a recent PBR intervention for patients with diabetes. PBR-relevant design features include: creation of a cohort over time that collects control data but allows all participants (clusters or patients) to receive the intervention; staggered introduction of clusters; multiple data collection points; and one-way cross-over into the intervention arm. Practical considerations include: randomization versus stratification; training run-in phases; and an extended time period for overall study completion. Several design features of practice-based research studies can be adapted to local circumstances yet retain elements that improve methodological rigor. Studies that utilize these methods, such as the stepped-wedge design and the wait-list cross-over design, can increase the evidence base for controlled studies conducted within the complex environment of PBR.
Bridges for Pedestrians with Random Parameters using the Stochastic Finite Elements Analysis
NASA Astrophysics Data System (ADS)
Szafran, J.; Kamiński, M.
2017-02-01
The main aim of this paper is to present a Stochastic Finite Element Method analysis of the principal design parameters of bridges for pedestrians: the eigenfrequency and the deflection of the bridge span. These are considered with respect to the random thickness of plates in the boxed-section bridge platform, the Young's modulus of structural steel, and the static load resulting from a crowd of pedestrians. The influence of the quality of the numerical model in the context of traditional FEM is also shown using the example of a simple steel shield. Steel structures with random parameters are discretized in exactly the same way as for the traditional Finite Element Method. The probabilistic version is provided by the Response Function Method, where several numerical tests with random parameter values varying around their mean values enable the determination of the structural response and, via the Least Squares Method, its final probabilistic moments.
Power Analysis in Two-Level Unbalanced Designs
ERIC Educational Resources Information Center
Konstantopoulos, Spyros
2010-01-01
Previous work on statistical power has discussed mainly single-level designs or 2-level balanced designs with random effects. Although balanced experiments are common, in practice balance cannot always be achieved. Work on class size is one example of unbalanced designs. This study provides methods for power analysis in 2-level unbalanced designs…
Optimum systems design with random input and output applied to solar water heating
NASA Astrophysics Data System (ADS)
Abdel-Malek, L. L.
1980-03-01
Solar water heating systems are evaluated. Models were developed to estimate the percentage of energy supplied by the Sun to a household. Since solar water heating systems have random input and output, queueing theory and birth-and-death processes were the major tools in developing the evaluation models. Microeconomic methods help determine the optimum values of the solar water heating system design parameters, i.e., the water tank volume and the collector area.
GEOSTATISTICAL SAMPLING DESIGNS FOR HAZARDOUS WASTE SITES
This chapter discusses field sampling design for environmental sites and hazardous waste sites with respect to random variable sampling theory, Gy's sampling theory, and geostatistical (kriging) sampling theory. The literature often presents these sampling methods as an adversari...
Simulation methods to estimate design power: an overview for applied research.
Arnold, Benjamin F; Hogan, Daniel R; Colford, John M; Hubbard, Alan E
2011-06-20
Estimating the required sample size and statistical power for a study is an integral part of study design. For standard designs, power equations provide an efficient solution to the problem, but they are unavailable for many complex study designs that arise in practice. For such complex study designs, computer simulation is a useful alternative for estimating study power. Although this approach is well known among statisticians, in our experience many epidemiologists and social scientists are unfamiliar with the technique. This article aims to address this knowledge gap. We review an approach to estimate study power for individual- or cluster-randomized designs using computer simulation. This flexible approach arises naturally from the model used to derive conventional power equations, but extends those methods to accommodate arbitrarily complex designs. The method is universally applicable to a broad range of designs and outcomes, and we present the material in a way that is approachable for quantitative, applied researchers. We illustrate the method using two examples (one simple, one complex) based on sanitation and nutritional interventions to improve child growth. We first show how simulation reproduces conventional power estimates for simple randomized designs over a broad range of sample scenarios to familiarize the reader with the approach. We then demonstrate how to extend the simulation approach to more complex designs. Finally, we discuss extensions to the examples in the article, and provide computer code to efficiently run the example simulations in both R and Stata. Simulation methods offer a flexible option to estimate statistical power for standard and non-traditional study designs and parameters of interest. The approach we have described is universally applicable for evaluating study designs used in epidemiologic and social science research.
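The simulation recipe described here can be sketched for the simplest individually randomized case; the sample size, effect size, and replication count below are illustrative choices, not values from the paper:

```python
import numpy as np
from scipy import stats

def simulated_power(n_per_arm, effect_sd, n_sims=2000, alpha=0.05, seed=1):
    """Estimate power for a two-arm trial with a standardized mean difference.

    Simulate the trial many times under the alternative hypothesis and
    report the fraction of simulations in which the test rejects the null.
    """
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_sims):
        control = rng.normal(0.0, 1.0, n_per_arm)
        treated = rng.normal(effect_sd, 1.0, n_per_arm)
        _, p = stats.ttest_ind(treated, control)
        rejections += p < alpha
    return rejections / n_sims

# 64 per arm at a 0.5-SD effect gives roughly 80% power analytically,
# so the simulated estimate should land near 0.80.
print(simulated_power(64, 0.5))
```

Extending this to a cluster randomized design amounts to changing the data-generating step (e.g., adding a cluster-level random effect) and the analysis step, while the reject-and-count loop stays the same, which is the flexibility the authors emphasize.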
NASA Technical Reports Server (NTRS)
Belytschko, Ted; Wing, Kam Liu
1987-01-01
In the Probabilistic Finite Element Method (PFEM), finite element methods have been efficiently combined with second-order perturbation techniques to provide an effective method for informing the designer of the range of response which is likely in a given problem. The designer must provide as input the statistical character of the input variables, such as yield strength, load magnitude, and Young's modulus, by specifying their mean values and their variances. The output then consists of the mean response and the variance in the response. Thus the designer is given a much broader picture of the predicted performance than with simply a single response curve. These methods are applicable to a wide class of problems, provided that the scale of randomness is not too large and the probabilistic density functions possess decaying tails. By incorporating the computational techniques we have developed in the past 3 years for efficiency, the probabilistic finite element methods are capable of handling large systems with many sources of uncertainties. Sample results are given for an elastic-plastic ten-bar structure and an elastic-plastic plane continuum with a circular hole subject to cyclic loadings, with the yield stress modeled as a random field.
Gemmell, Isla; Dunn, Graham
2011-03-01
In a partially randomized preference trial (PRPT) patients with no treatment preference are allocated to groups at random, but those who express a preference receive the treatment of their choice. It has been suggested that the design can improve the external and internal validity of trials. We used computer simulation to illustrate the impact that an unmeasured confounder could have on the results and conclusions drawn from a PRPT. We generated 4000 observations ("patients") that reflected the distribution of the Beck Depression Inventory (BDI) in trials of depression. Half were randomly assigned to a randomized controlled trial (RCT) design and half were assigned to a PRPT design. In the RCT, "patients" were evenly split between treatment and control groups; whereas in the preference arm, to reflect patient choice, 87.5% of patients were allocated to the experimental treatment and 12.5% to the control. Unadjusted analyses of the PRPT data consistently overestimated the treatment effect and its standard error. This led to Type I errors when the true treatment effect was small and Type II errors when the confounder effect was large. The PRPT design is not recommended as a method of establishing an unbiased estimate of treatment effect due to the potential influence of unmeasured confounders. Copyright © 2011 John Wiley & Sons, Ltd.
Efficiency of a Care Coordination Model: A Randomized Study with Stroke Patients
ERIC Educational Resources Information Center
Claiborne, Nancy
2006-01-01
Objectives: This study investigated the efficiency of a social work care coordination model for stroke patients. Care coordination addresses patient care and treatment resources across the health care system to reduce risk, improve clinical outcomes, and maximize efficiency. Method: A randomly assigned, pre-post experimental design measured…
Estimation of Parameters from Discrete Random Nonstationary Time Series
NASA Astrophysics Data System (ADS)
Takayasu, H.; Nakamura, T.
For the analysis of nonstationary stochastic time series we introduce a formulation to estimate the underlying time-dependent parameters. This method is designed for random events with small numbers that are out of the applicability range of the normal distribution. The method is demonstrated for numerical data generated by a known system, and applied to time series of traffic accidents, batting average of a baseball player and sales volume of home electronics.
ERIC Educational Resources Information Center
Livingston, Samuel A.; Kim, Sooyeon
2010-01-01
A series of resampling studies investigated the accuracy of equating by four different methods in a random groups equating design with samples of 400, 200, 100, and 50 test takers taking each form. Six pairs of forms were constructed. Each pair was constructed by assigning items from an existing test taken by 9,000 or more test takers. The…
Key Aspects of Nucleic Acid Library Design for in Vitro Selection
Vorobyeva, Maria A.; Davydova, Anna S.; Vorobjev, Pavel E.; Pyshnyi, Dmitrii V.; Venyaminova, Alya G.
2018-01-01
Nucleic acid aptamers capable of selectively recognizing their target molecules have nowadays been established as powerful and tunable tools for biospecific applications, be it therapeutics, drug delivery systems or biosensors. It is now generally acknowledged that in vitro selection enables one to generate aptamers to almost any target of interest. However, the success of selection and the affinity of the resulting aptamers depend to a large extent on the nature and design of an initial random nucleic acid library. In this review, we summarize and discuss the most important features of the design of nucleic acid libraries for in vitro selection such as the nature of the library (DNA, RNA or modified nucleotides), the length of a randomized region and the presence of fixed sequences. We also compare and contrast different randomization strategies and consider computer methods of library design and some other aspects. PMID:29401748
Assessing the Generalizability of Randomized Trial Results to Target Populations
Stuart, Elizabeth A.; Bradshaw, Catherine P.; Leaf, Philip J.
2014-01-01
Recent years have seen increasing interest in and attention to evidence-based practices, where the “evidence” generally comes from well-conducted randomized trials. However, while those trials yield accurate estimates of the effect of the intervention for the participants in the trial (known as “internal validity”), they do not always yield relevant information about the effects in a particular target population (known as “external validity”). This may be due to a lack of specification of a target population when designing the trial, difficulties recruiting a sample that is representative of a pre-specified target population, or to interest in considering a target population somewhat different from the population directly targeted by the trial. This paper first provides an overview of existing design and analysis methods for assessing and enhancing the ability of a randomized trial to estimate treatment effects in a target population. It then provides a case study using one particular method, which weights the subjects in a randomized trial to match the population on a set of observed characteristics. The case study uses data from a randomized trial of School-wide Positive Behavioral Interventions and Supports (PBIS); our interest is in generalizing the results to the state of Maryland. In the case of PBIS, after weighting, estimated effects in the target population were similar to those observed in the randomized trial. The paper illustrates that statistical methods can be used to assess and enhance the external validity of randomized trials, making the results more applicable to policy and clinical questions. However, there are also many open research questions; future research should focus on questions of treatment effect heterogeneity and further developing these methods for enhancing external validity. 
Researchers should think carefully about the external validity of randomized trials and be cautious about extrapolating results to specific populations unless they are confident of the similarity between the trial sample and that target population. PMID:25307417
He, Jianbo; Li, Jijie; Huang, Zhongwen; Zhao, Tuanjie; Xing, Guangnan; Gai, Junyi; Guan, Rongzhan
2015-01-01
Experimental error control is very important in quantitative trait locus (QTL) mapping. Although numerous statistical methods have been developed for QTL mapping, a QTL detection model based on an appropriate experimental design that emphasizes error control has not been developed. Lattice design is very suitable for experiments with large sample sizes, which are usually required for accurate mapping of quantitative traits. However, the lack of a QTL mapping method based on lattice design meant that the arithmetic mean or adjusted mean of each line of observations in the lattice design had to be used as a response variable, resulting in low QTL detection power. As an improvement, we developed a QTL mapping method termed composite interval mapping based on lattice design (CIMLD). In the lattice design, experimental errors are decomposed into random errors and block-within-replication errors. Four levels of block-within-replication errors were simulated to show the power of QTL detection under different error controls. The simulation results showed that the arithmetic mean method, which is equivalent to a method under randomized complete block design (RCBD), was very sensitive to the size of the block variance: as the block variance increased, the power of QTL detection decreased from 51.3% to 9.4%. In contrast to the RCBD method, the power of CIMLD and the adjusted mean method did not change across block variances. The CIMLD method showed 1.2- to 7.6-fold higher power of QTL detection than the arithmetic or adjusted mean methods. Our proposed method was applied to real soybean (Glycine max) data as an example, and 10 QTLs for biomass were identified that explained 65.87% of the phenotypic variation, while only three and two QTLs were identified by the arithmetic and adjusted mean methods, respectively.
Optimal design of aperiodic, vertical silicon nanowire structures for photovoltaics.
Lin, Chenxi; Povinelli, Michelle L
2011-09-12
We design a partially aperiodic, vertically aligned silicon nanowire array that maximizes photovoltaic absorption. The optimal structure is obtained using a random walk algorithm with a transfer-matrix-method-based electromagnetic forward solver. The optimal, aperiodic structure exhibits a 2.35 times enhancement in ultimate efficiency compared to its periodic counterpart. The spectral behavior mimics that of a periodic array with a larger lattice constant. For our system, we find that randomly selected, aperiodic structures invariably outperform the periodic array.
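The accept-if-better random walk loop used for this kind of optimization is generic; a toy sketch, with a hypothetical one-dimensional objective standing in for the electromagnetic forward solver:

```python
import random

def random_walk_maximize(objective, x0, step=0.1, n_iters=5000, seed=0):
    """Accept a random perturbation only when it improves the objective."""
    rng = random.Random(seed)
    x, best = x0, objective(x0)
    for _ in range(n_iters):
        candidate = x + rng.uniform(-step, step)
        value = objective(candidate)
        if value > best:
            x, best = candidate, value
    return x, best

# Toy stand-in for the forward solver: a single peak at x = 1.7.
x, best = random_walk_maximize(lambda x: -(x - 1.7) ** 2, x0=0.0)
print(round(x, 2))
```

In the actual design problem each objective evaluation is an expensive full electromagnetic simulation, so the number of iterations, not the bookkeeping above, dominates the cost.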
Schneider, Kristin L.; Pagoto, Sherry L.; Handschin, Barbara; Panza, Emily; Bakke, Susan; Liu, Qin; Blendea, Mihaela; Ockene, Ira S.; Ma, Yunsheng
2011-01-01
Background The comorbidity of type 2 diabetes mellitus (T2DM) and depression is associated with poor glycemic control. Exercise has been shown to improve mood and glycemic control, but individuals with comorbid T2DM and depression are disproportionately sedentary compared to the general population and report more difficulty with exercise. Behavioral activation, an evidence-based depression psychotherapy, was designed to help people with depression make gradual behavior changes, and may be helpful to build exercise adherence in sedentary populations. This pilot randomized clinical trial will test the feasibility of a group exercise program enhanced with behavioral activation strategies among women with comorbid T2DM and depression. Methods/Design Sedentary women with inadequately controlled T2DM and depression (N=60) will be randomly assigned to one of two conditions: exercise or usual care. Participants randomized to the exercise condition will attend 38 behavioral activation-enhanced group exercise classes over 24 weeks in addition to usual care. Participants randomized to the usual care condition will receive depression treatment referrals and print information on diabetes management via diet and physical activity. Assessments will occur at baseline and 3-, 6-, and 9-months following randomization. The goals of this pilot study are to demonstrate feasibility and intervention acceptability, estimate the resources and costs required to deliver the intervention and to estimate the standard deviation of continuous outcomes (e.g., depressive symptoms and glycosylated hemoglobin) in preparation for a fully-powered randomized clinical trial. Discussion A novel intervention that combines exercise and behavioral activation strategies could potentially improve glycemic control and mood in women with comorbid type 2 diabetes and depression. Trial registration NCT01024790 PMID:21765864
Encryption method based on pseudo random spatial light modulation for single-fibre data transmission
NASA Astrophysics Data System (ADS)
Kowalski, Marcin; Zyczkowski, Marek
2017-11-01
Optical cryptosystems can provide encryption and sometimes compression simultaneously. They are increasingly attractive for securing information, especially for image encryption. Our studies have shown that optical cryptosystems can be used to encrypt optical data transmission. We propose and study a new method for securing fibre data communication. The paper presents a method for optical encryption of data transmitted over a single optical fibre. The encryption process relies on pseudo-random spatial light modulation, a combination of two encryption keys, and the Compressed Sensing framework. A linear combination of light pulses with pseudo-random patterns provides the required encryption performance. We propose an architecture to transmit the encrypted data through the optical fibre. The paper describes the method and presents the theoretical analysis, the design of a physical model, and experimental results.
The Effect of Cluster Sampling Design in Survey Research on the Standard Error Statistic.
ERIC Educational Resources Information Center
Wang, Lin; Fan, Xitao
Standard statistical methods are used to analyze data that is assumed to be collected using a simple random sampling scheme. These methods, however, tend to underestimate variance when the data is collected with a cluster design, which is often found in educational survey research. The purposes of this paper are to demonstrate how a cluster design…
Anders, Katherine L; Cutcher, Zoe; Kleinschmidt, Immo; Donnelly, Christl A; Ferguson, Neil M; Indriani, Citra; O'Neill, Scott L; Jewell, Nicholas P; Simmons, Cameron P
2018-05-07
Cluster randomized trials are the gold standard for assessing efficacy of community-level interventions, such as vector control strategies against dengue. We describe a novel cluster randomized trial methodology with a test-negative design, which offers advantages over traditional approaches. It utilizes outcome-based sampling of patients presenting with a syndrome consistent with the disease of interest, who are subsequently classified as test-positive cases or test-negative controls on the basis of diagnostic testing. We use simulations of a cluster trial to demonstrate validity of efficacy estimates under the test-negative approach. This demonstrates that, provided study arms are balanced for both test-negative and test-positive illness at baseline and that other test-negative design assumptions are met, the efficacy estimates closely match true efficacy. We also briefly discuss analytical considerations for an odds ratio-based effect estimate arising from clustered data, and outline potential approaches to analysis. We conclude that application of the test-negative design to certain cluster randomized trials could increase their efficiency and ease of implementation.
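Under the test-negative approach, intervention efficacy is typically estimated as one minus the odds ratio of test-positive versus test-negative illness across arms; a minimal sketch with illustrative counts (a real cluster-trial analysis must additionally account for clustering):

```python
# Test-negative efficacy estimate: 1 minus the odds ratio of test-positive
# disease comparing intervention with control arms. Counts are illustrative.
pos_int, neg_int = 60, 400    # intervention arm: test-positive, test-negative
pos_ctl, neg_ctl = 150, 380   # control arm: test-positive, test-negative

odds_ratio = (pos_int / neg_int) / (pos_ctl / neg_ctl)
efficacy = 1.0 - odds_ratio
print(f"OR = {odds_ratio:.2f}, efficacy = {efficacy:.1%}")
```

The outcome-based sampling is what makes the design convenient: only patients presenting with the syndrome are enrolled, and the test-negative patients serve as the comparison group in place of a community cohort.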
Clinical trial design for orthodontists.
Pandis, Nikolaos; Cobourne, Martyn T
2013-06-01
High-quality research should form the basis of all clinical practice. Randomized controlled trials currently provide the gold standard for investigating the effectiveness of treatment interventions and these are increasingly being used in orthodontics. Here we discuss the reasons why this form of investigation provides the most useful evidence for assessing treatment outcome. The methods available to achieve true randomization, a fundamental component in the design of these trials, are also discussed. In addition, we focus on how to minimize bias in clinical research, not only during the design and management of a trial, but also when disseminating results. We focus on the importance of using control groups correctly and describe methods that are available to adequately power a trial. Finally, we emphasise the importance of accurate and transparent reporting, which facilitates correct communication and assessment of the evidence.
Robust Takagi-Sugeno fuzzy control for fractional order hydro-turbine governing system.
Wang, Bin; Xue, Jianyi; Wu, Fengjiao; Zhu, Delan
2016-11-01
A robust fuzzy control method for a fractional order hydro-turbine governing system (FOHGS) in the presence of random disturbances is investigated in this paper. Firstly, the mathematical model of the FOHGS is introduced, and based on Takagi-Sugeno (T-S) fuzzy rules, the generalized T-S fuzzy model of the FOHGS is presented. Secondly, based on fractional order Lyapunov stability theory, a novel T-S fuzzy control method is designed for the stability control of the FOHGS. Thirdly, a relatively loose sufficient stability condition is obtained, which can be transformed into a group of linear matrix inequalities (LMIs) via the Schur complement, and the strict mathematical derivation is given. Furthermore, the control method can resist random disturbances, demonstrating good robustness. Simulation results indicate that the designed fractional order T-S fuzzy control scheme works well compared with the existing method. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
The Effectiveness of Healthy Start Home Visit Program: Cluster Randomized Controlled Trial
ERIC Educational Resources Information Center
Leung, Cynthia; Tsang, Sandra; Heung, Kitty
2015-01-01
Purpose: The study reported the effectiveness of a home visit program for disadvantaged Chinese parents with preschool children, using cluster randomized controlled trial design. Method: Participants included 191 parents and their children from 24 preschools, with 84 dyads (12 preschools) in the intervention group and 107 dyads (12 preschools) in…
ERIC Educational Resources Information Center
Cream, Angela; O'Brian, Sue; Jones, Mark; Block, Susan; Harrison, Elisabeth; Lincoln, Michelle; Hewat, Sally; Packman, Ann; Menzies, Ross; Onslow, Mark
2010-01-01
Purpose: In this study, the authors investigated the efficacy of video self-modeling (VSM) following speech restructuring treatment to improve the maintenance of treatment effects. Method: The design was an open-plan, parallel-group, randomized controlled trial. Participants were 89 adults and adolescents who undertook intensive speech…
ERIC Educational Resources Information Center
Shire, Stephanie Y.; Chang, Ya-Chih; Shih, Wendy; Bracaglia, Suzanne; Kodjoe, Maria; Kasari, Connie
2017-01-01
Background: Interventions found to be effective in research settings are often not as effective when implemented in community settings. Considering children with autism, studies have rarely examined the efficacy of laboratory-tested interventions on child outcomes in community settings using randomized controlled designs. Methods: One hundred and…
Adiposity and Quality of Life: A Case Study from an Urban Center in Nigeria
ERIC Educational Resources Information Center
Akinpelu, Aderonke O.; Akinola, Odunayo T.; Gbiri, Caleb A.
2009-01-01
Objective: To determine relationship between adiposity indices and quality of life (QOL) of residents of a housing estate in Lagos, Nigeria. Design: Cross-sectional survey employing multistep random sampling method. Setting: Urban residential estate. Participants: This study involved 900 randomly selected residents of Abesan Housing Estate, Lagos,…
Individual mineral supplement intake by ewes swath grazing or confinement fed pea-barley forage
USDA-ARS?s Scientific Manuscript database
Sixty mature ewes (non-pregnant, non-lactating) were used in a completely randomized design to determine if feeding method of pea-barley forage (swath grazing or hay in confinement) had an effect on individual ewe mineral consumption. Thirty ewes were randomly allocated to 3 confinement pens and 30 ...
EEG Neurofeedback for ADHD: Double-Blind Sham-Controlled Randomized Pilot Feasibility Trial
ERIC Educational Resources Information Center
Arnold, L. Eugene; Lofthouse, Nicholas; Hersch, Sarah; Pan, Xueliang; Hurt, Elizabeth; Bates, Bethany; Kassouf, Kathleen; Moone, Stacey; Grantier, Cara
2013-01-01
Objective: Preparing for a definitive randomized clinical trial (RCT) of neurofeedback (NF) for ADHD, this pilot trial explored feasibility of a double-blind, sham-controlled design and adherence/palatability/relative effect of two versus three treatments/week. Method: Unmedicated 6- to 12-year-olds with "Diagnostic and Statistical Manual of…
Testing the Intervention Effect in Single-Case Experiments: A Monte Carlo Simulation Study
ERIC Educational Resources Information Center
Heyvaert, Mieke; Moeyaert, Mariola; Verkempynck, Paul; Van den Noortgate, Wim; Vervloet, Marlies; Ugille, Maaike; Onghena, Patrick
2017-01-01
This article reports on a Monte Carlo simulation study, evaluating two approaches for testing the intervention effect in replicated randomized AB designs: two-level hierarchical linear modeling (HLM) and using the additive method to combine randomization test "p" values (RTcombiP). Four factors were manipulated: mean intervention effect,…
Evaluation of Parent and Child Enhancement (PACE) Program: Randomized Controlled Trial
ERIC Educational Resources Information Center
Leung, Cynthia; Tsang, Sandra; Lo, Cyrus
2017-01-01
Objective: This study examined the efficacy of the Parent and Child Enhancement (PACE) program on child learning, child behavior problems, and parental stress, using randomized controlled trial design, in social services centers. Methods: Eligibility criteria were (1) children aged 2 years at program commencement, (2) low-income, new immigrant, or…
ERIC Educational Resources Information Center
Katz, M.; Adar Levine, A.; Kol-Degani, H.; Kav-Venaki, L.
2010-01-01
Objective: Evaluation of the efficacy of a patented, compound herbal preparation (CHP) in improving attention, cognition, and impulse control in children with ADHD. Method: Design: A randomized, double-blind, placebo-controlled trial. Setting: University-affiliated tertiary medical center. Participants: 120 children newly diagnosed with ADHD,…
The prompted optional randomization trial: a new design for comparative effectiveness research.
Flory, James; Karlawish, Jason
2012-12-01
Randomized controlled trials are the gold standard for medical evidence because randomization provides the best-known protection against confounding of results. Randomization has practical and ethical problems that limit the number of trials that can be conducted, however. A different method for collecting clinical data retains the statistically useful properties of randomization without incurring its practical and ethical challenges. A computerized prompt introduces a random element into clinical decision-making that can be instantly overridden if it conflicts with optimal patient care. This creates a weak form of randomization that still eliminates the effect of all confounders, can be carried out without disturbing routine clinical care, and arguably will not require research-grade informed consent.
Zwarenstein, Merrick; Reeves, Scott; Russell, Ann; Kenaszchuk, Chris; Conn, Lesley Gotlib; Miller, Karen-Lee; Lingard, Lorelei; Thorpe, Kevin E
2007-01-01
Background Despite a burgeoning interest in using interprofessional approaches to promote effective collaboration in health care, systematic reviews find scant evidence of benefit. This protocol describes the first cluster randomized controlled trial (RCT) to design and evaluate an intervention intended to improve interprofessional collaborative communication and patient-centred care. Objectives The objective is to evaluate the effects of a four-component, hospital-based staff communication protocol designed to promote collaborative communication between healthcare professionals and enhance patient-centred care. Methods The study is a multi-centre mixed-methods cluster randomized controlled trial involving twenty clinical teaching teams (CTTs) in general internal medicine (GIM) divisions of five Toronto tertiary-care hospitals. CTTs will be randomly assigned either to receive an intervention designed to improve interprofessional collaborative communication, or to continue usual communication practices. Non-participant naturalistic observation, shadowing, and semi-structured, qualitative interviews were conducted to explore existing patterns of interprofessional collaboration in the CTTs, and to support intervention development. Interviews and shadowing will continue during intervention delivery in order to document interactions between the intervention settings and adopters, and changes in interprofessional communication. The primary outcome is the rate of unplanned hospital readmission. Secondary outcomes are length of stay (LOS); adherence to evidence-based prescription drug therapy; patients' satisfaction with care; self-report surveys of CTT staff perceptions of interprofessional collaboration; and frequency of calls to paging devices. Outcomes will be compared on an intention-to-treat basis using adjustment methods appropriate for data from a cluster randomized design. 
Discussion Pre-intervention qualitative analysis revealed that a substantial amount of interprofessional interaction lacks key core elements of collaborative communication such as self-introduction, description of professional role, and solicitation of other professional perspectives. Incorporating these findings, a four-component intervention was designed with a goal of creating a culture of communication in which the fundamentals of collaboration become a routine part of interprofessional interactions during unstructured work periods on GIM wards. Trial registration Registered with National Institutes of Health as NCT00466297. PMID:17877830
Methodological Issues in Trials of Complementary and Alternative Medicine Interventions
Sikorskii, Alla; Wyatt, Gwen; Victorson, David; Faulkner, Gwen; Rahbar, Mohammad Hossein
2010-01-01
Background Complementary and alternative medicine (CAM) use is widespread among cancer patients. Information on safety and efficacy of CAM therapies is needed for both patients and health care providers. Well-designed randomized clinical trials (RCTs) of CAM therapy interventions can inform both clinical research and practice. Objectives To review important issues that affect the design of RCTs for CAM interventions. Methods Using the methods component of the Consolidated Standards for Reporting Trials (CONSORT) as a guiding framework, and a National Cancer Institute-funded reflexology study as an exemplar, methodological issues related to participants, intervention, objectives, outcomes, sample size, randomization, blinding, and statistical methods were reviewed. Discussion Trials of CAM interventions designed and implemented according to appropriate methodological standards will facilitate the needed scientific rigor in CAM research. Interventions in CAM can be tested using proposed methodology, and the results of testing will inform nursing practice in providing safe and effective supportive care and improving the well-being of patients. PMID:19918155
Internet Training to Respond to Aggressive Resident Behaviors
ERIC Educational Resources Information Center
Irvine, A. Blair; Billow, Molly B.; Gates, Donna M.; Fitzwater, Evelyn L.; Seeley, John R.; Bourgeois, Michelle
2012-01-01
Purpose: This research evaluated an individualized Internet training designed to teach nurse aides (NAs) strategies to prevent or, if necessary, react to resident aggression in ways that are safe for the resident as well as the caregiver. Design and Methods: A randomized treatment and control design was implemented, with baseline, 1-, and 2-month…
A generic minimization random allocation and blinding system on web.
Cai, Hongwei; Xia, Jielai; Xu, Dezhong; Gao, Donghuai; Yan, Yongping
2006-12-01
Minimization is a dynamic randomization method for clinical trials. Although recommended by many researchers, minimization has seldom been reported in randomized trials, mainly because of the controversy surrounding the validity of conventional analyses and its complexity of implementation. However, both the statistical and clinical validity of minimization have been demonstrated in recent studies. A minimization random allocation system with an integrated blinding function, which could facilitate the implementation of this method in general clinical trials, has not previously been reported. SYSTEM OVERVIEW: The system is a web-based random allocation system using the Pocock and Simon minimization method. It also supports multiple treatment arms within a trial, multiple simultaneous trials, and blinding without further programming. The system was constructed with a generic database schema design, the Pocock and Simon minimization method and a blinding method. It was coded in the Microsoft Visual Basic and Active Server Pages (ASP) programming languages, and all datasets were managed in a Microsoft SQL Server database. Some critical programming code is also provided. SIMULATIONS AND RESULTS: Two clinical trials were simulated simultaneously to test the system's applicability. Both balanced groups and blinded allocation results were achieved in both trials. Practical considerations for the minimization method, and the benefits, general applicability and drawbacks of the technique implemented in this system, are discussed. Promising features of the proposed system are also summarized.
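The Pocock and Simon method referenced above balances marginal factor totals each time a patient is enrolled: for each candidate arm, it sums the imbalance that a hypothetical assignment would create across prognostic factors, then assigns the minimizing arm with high probability. A minimal sketch of that rule follows; the function name, the two-arm setup, and the `p_best` tie-breaking parameter are illustrative assumptions, not the web system's actual code.

```python
import random

def minimization_assign(new_patient, allocations, factors, arms, p_best=0.8):
    """Pocock-Simon minimization: pick the arm that minimizes total
    marginal imbalance across prognostic factors for the new patient."""
    imbalance = {}
    for arm in arms:
        total = 0
        for f in factors:
            level = new_patient[f]
            # count already-allocated patients sharing this factor level, per arm
            counts = {a: sum(1 for pat, a2 in allocations
                             if a2 == a and pat[f] == level) for a in arms}
            counts[arm] += 1  # hypothetical assignment of the new patient
            total += max(counts.values()) - min(counts.values())
        imbalance[arm] = total
    best = min(imbalance, key=imbalance.get)
    # keep a random element: assign the minimizing arm only with prob p_best
    if random.random() < p_best:
        return best
    return random.choice([a for a in arms if a != best])
```

Setting `p_best` below 1 preserves genuine randomness (and hence some protection against selection bias), at the cost of slightly looser balance.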
Multi-Agent Methods for the Configuration of Random Nanocomputers
NASA Technical Reports Server (NTRS)
Lawson, John W.
2004-01-01
As computational devices continue to shrink, the cost of manufacturing such devices is expected to grow exponentially. One alternative to the costly, detailed design and assembly of conventional computers is to place the nano-electronic components randomly on a chip. The price for such a trivial assembly process is that the resulting chip would not be programmable by conventional means. In this work, we show that such random nanocomputers can be adaptively programmed using multi-agent methods. This is accomplished through the optimization of an associated high-dimensional error function. By representing each of the independent variables as a reinforcement learning agent, we are able to achieve convergence much faster than with other methods, including simulated annealing. Standard combinational logic circuits such as adders and multipliers are implemented in a straightforward manner. In addition, we show that the intrinsic flexibility of these adaptive methods allows the random computers to be reconfigured easily, making them reusable. Recovery from faults is also demonstrated.
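The scheme of treating each configuration variable as an independent learning agent driven by a single global error signal can be illustrated on a toy bit-configuration problem. This is a hedged sketch: the eps-greedy rule, learning rate, and error function are assumptions for illustration, not the paper's method.

```python
import random

def configure(error_fn, n_vars, iters=3000, eps=0.1, lr=0.2, seed=0):
    """Each configuration bit is an independent agent keeping a value
    estimate per action (0 or 1); agents act eps-greedily and update
    toward the shared global reward, which is minus the chip error."""
    rng = random.Random(seed)
    pref = [[0.0, 0.0] for _ in range(n_vars)]   # value estimate per action
    best, best_err = None, float('inf')
    for _ in range(iters):
        bits = []
        for i in range(n_vars):
            if rng.random() < eps:
                bits.append(rng.randrange(2))                        # explore
            else:
                bits.append(0 if pref[i][0] >= pref[i][1] else 1)    # exploit
        err = error_fn(bits)
        reward = -err                             # one global error signal
        for i, b in enumerate(bits):
            pref[i][b] += lr * (reward - pref[i][b])
        if err < best_err:
            best, best_err = list(bits), err
    return best, best_err
```

Because every agent updates only its own two-entry value table, the search cost grows linearly in the number of variables rather than exponentially in the configuration space.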
ERIC Educational Resources Information Center
Brady, Bernadine; O'Regan, Connie
2009-01-01
The youth mentoring program Big Brothers Big Sisters is one of the first social interventions involving youth in Ireland to be evaluated using a randomized controlled trial methodology. This article sets out the design process undertaken, describing how the research team came to adopt a concurrent embedded mixed methods design as a means of…
Diestelkamp, Wiebke S; Krane, Carissa M; Pinnell, Margaret F
2011-05-20
Energy-based surgical scalpels are designed to efficiently transect and seal blood vessels using thermal energy to promote protein denaturation and coagulation. Assessment and design improvement of ultrasonic scalpel performance relies on both in vivo and ex vivo testing. The objective of this work was to design and implement a robust, experimental test matrix with randomization restrictions and predictive statistical power, which allowed for identification of those experimental variables that may affect the quality of the seal obtained ex vivo. The design of the experiment included three factors: temperature (two levels); the type of solution used to perfuse the artery during transection (three types); and artery type (two types) resulting in a total of twelve possible treatment combinations. Burst pressures of porcine carotid and renal arteries sealed ex vivo were assigned as the response variable. The experimental test matrix was designed and carried out as a split-plot experiment in order to assess the contributions of several variables and their interactions while accounting for randomization restrictions present in the experimental setup. The statistical software package SAS was utilized and PROC MIXED was used to account for the randomization restrictions in the split-plot design. The combination of temperature, solution, and vessel type had a statistically significant impact on seal quality. The design and implementation of a split-plot experimental test-matrix provided a mechanism for addressing the existing technical randomization restrictions of ex vivo ultrasonic scalpel performance testing, while preserving the ability to examine the potential effects of independent factors or variables. This method for generating the experimental design and the statistical analyses of the resulting data are adaptable to a wide variety of experimental problems involving large-scale tissue-based studies of medical or experimental device efficacy and performance.
Observational Studies: Cohort and Case-Control Studies
Song, Jae W.; Chung, Kevin C.
2010-01-01
Observational studies are an important category of study designs. To address some investigative questions in plastic surgery, randomized controlled trials are not always indicated or ethical to conduct. Instead, observational studies may be the next best method to address these types of questions. Well-designed observational studies have been shown to provide results similar to randomized controlled trials, challenging the belief that observational studies are second-rate. Cohort studies and case-control studies are two primary types of observational studies that aid in evaluating associations between diseases and exposures. In this review article, we describe these study designs, methodological issues, and provide examples from the plastic surgery literature. PMID:20697313
Methodological reporting of randomized trials in five leading Chinese nursing journals.
Shi, Chunhu; Tian, Jinhui; Ren, Dan; Wei, Hongli; Zhang, Lihuan; Wang, Quan; Yang, Kehu
2014-01-01
Randomized controlled trials (RCTs) are not always well reported, especially in terms of their methodological descriptions. This study aimed to investigate adherence to the CONSORT guidance on methodological reporting and to explore associated trial-level variables in the Chinese nursing care field. In June 2012, we identified RCTs published in five leading Chinese nursing journals and included trials with details of randomized methods. The quality of methodological reporting was measured against the methods section of the CONSORT checklist, and an overall CONSORT methodological items score was calculated and expressed as a percentage. We also hypothesized that some general and methodological characteristics were associated with reporting quality and conducted a regression with these data to explore the correlation. The descriptive and regression statistics were calculated in SPSS 13.0. In total, 680 RCTs were included. The overall CONSORT methodological items score was 6.34 ± 0.97 (mean ± SD). No RCT reported descriptions of or changes in "trial design," changes in "outcomes" and "implementation," or descriptions of the similarity of interventions for "blinding." Poor reporting was found in detailing the "settings of participants" (13.1%), "type of randomization sequence generation" (1.8%), calculation methods for "sample size" (0.4%), explanation of any interim analyses and stopping guidelines for "sample size" (0.3%), "allocation concealment mechanism" (0.3%), additional analyses in "statistical methods" (2.1%), and targeted subjects and methods of "blinding" (5.9%). More than 50% of trials described randomization sequence generation, the eligibility criteria of "participants," "interventions," and definitions of the "outcomes" and "statistical methods." The regression analysis found that publication year and ITT analysis were weakly associated with CONSORT score.
The completeness of methodological reporting of RCTs in the Chinese nursing care field is poor, especially with regard to the reporting of trial design, changes in outcomes, sample size calculation, allocation concealment, blinding, and statistical methods.
77 FR 40866 - Applications for New Awards; Innovative Approaches to Literacy Program
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-11
... supported by the methods that have been employed. The term includes, appropriate to the research being... observational methods that provide reliable data; (iv) making claims of causal relationships only in random...; and (vii) using research designs and methods appropriate to the research question posed...
ERIC Educational Resources Information Center
Juhasz, Stephen; And Others
Table of contents (TOC) practices of some 120 primary journals were analyzed. The journals were randomly selected; the method of randomization is described. The samples were selected from a university library with holdings of approximately 12,000 titles published worldwide. A questionnaire was designed, the purpose of which was to find uniformity and…
Fisher, Sir Ronald Aylmer (1890-1962)
NASA Astrophysics Data System (ADS)
Murdin, P.
2000-11-01
Statistician, born in London, England. After studying astronomy using AIRY's manual on the Theory of Errors, he became interested in statistics, and laid the foundations of randomization in experimental design, the analysis of variance and the use of data in estimating the properties of the parent population from which it was drawn. He invented the maximum likelihood method for estimating from random ...
ERIC Educational Resources Information Center
Smith-Lock, Karen M.; Leitão, Suze; Prior, Polly; Nickels, Lyndsey
2015-01-01
Purpose: This study compared the effectiveness of two grammar treatment procedures for children with specific language impairment. Method: A double-blind superiority trial with cluster randomization was used to compare a cueing procedure, designed to elicit a correct production following an initial error, to a recasting procedure, which required…
Group Lidcombe Program Treatment for Early Stuttering: A Randomized Controlled Trial
ERIC Educational Resources Information Center
Arnott, Simone; Onslow, Mark; O'Brian, Sue; Packman, Ann; Jones, Mark; Block, Susan
2014-01-01
Purpose: This study adds to the Lidcombe Program evidence base by comparing individual and group treatment of preschoolers who stutter. Method: A randomized controlled trial of 54 preschoolers was designed to establish whether group delivery outcomes were not inferior to the individual model. The group arm used a rolling group model, in which a…
ERIC Educational Resources Information Center
Jia, Yue; Stokes, Lynne; Harris, Ian; Wang, Yan
2011-01-01
Estimation of parameters of random effects models from samples collected via complex multistage designs is considered. One way to reduce estimation bias due to unequal probabilities of selection is to incorporate sampling weights. Various weighting methods have been proposed (Korn & Graubard, 2003; Pfeffermann, Skinner,…
Supervised Home Training of Dialogue Skills in Chronic Aphasia: A Randomized Parallel Group Study
ERIC Educational Resources Information Center
Nobis-Bosch, Ruth; Springer, Luise; Radermacher, Irmgard; Huber, Walter
2011-01-01
Purpose: The aim of this study was to prove the efficacy of supervised self-training for individuals with aphasia. Linguistic and communicative performance in structured dialogues represented the main study parameters. Method: In a cross-over design for randomized matched pairs, 18 individuals with chronic aphasia were examined during 12 weeks of…
ERIC Educational Resources Information Center
Hirshfeld-Becker, Dina R.; Masek, Bruce; Henin, Aude; Blakely, Lauren Raezer; Pollock-Wurman, Rachel A.; McQuade, Julia; DePetrillo, Lillian; Briesch, Jacquelyn; Ollendick, Thomas H.; Rosenbaum, Jerrold F.; Biederman, Joseph
2010-01-01
Objective: To examine the efficacy of a developmentally appropriate parent-child cognitive behavioral therapy (CBT) protocol for anxiety disorders in children ages 4-7 years. Method: Design: Randomized wait-list controlled trial. Conduct: Sixty-four children (53% female, mean age 5.4 years, 80% European American) with anxiety disorders were…
Correcting Classifiers for Sample Selection Bias in Two-Phase Case-Control Studies
Theis, Fabian J.
2017-01-01
Epidemiological studies often utilize stratified data in which rare outcomes or exposures are artificially enriched. This design can increase precision in association tests but distorts predictions when applying classifiers on nonstratified data. Several methods correct for this so-called sample selection bias, but their performance remains unclear especially for machine learning classifiers. With an emphasis on two-phase case-control studies, we aim to assess which corrections to perform in which setting and to obtain methods suitable for machine learning techniques, especially the random forest. We propose two new resampling-based methods to resemble the original data and covariance structure: stochastic inverse-probability oversampling and parametric inverse-probability bagging. We compare all techniques for the random forest and other classifiers, both theoretically and on simulated and real data. Empirical results show that the random forest profits from only the parametric inverse-probability bagging proposed by us. For other classifiers, correction is mostly advantageous, and methods perform uniformly. We discuss consequences of inappropriate distribution assumptions and reason for different behaviors between the random forest and other classifiers. In conclusion, we provide guidance for choosing correction methods when training classifiers on biased samples. For random forests, our method outperforms state-of-the-art procedures if distribution assumptions are roughly fulfilled. We provide our implementation in the R package sambia. PMID:29312464
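The inverse-probability resampling idea behind the corrections above can be sketched in a few lines: rows of the stratified sample are redrawn with weight inversely proportional to their selection probability, so that artificially enriched strata are deflated back toward their population frequencies. This is a toy illustration of the general principle, not the implementation in the `sambia` package.

```python
import random

def ip_oversample(rows, selection_prob, n_out, rng=None):
    """Resample a biased case-control sample toward the source
    population: draw rows with weight 1 / P(row was selected)."""
    rng = rng or random.Random(0)
    weights = [1.0 / selection_prob(r) for r in rows]
    return rng.choices(rows, weights=weights, k=n_out)
```

A classifier (such as a random forest) trained on the resampled data then sees class and covariate frequencies closer to those of the target population; the stochastic redraw can be repeated over bootstrap replicates, which is the bagging flavor of the idea.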
How Generalizable Is Your Experiment? An Index for Comparing Samples and Populations
ERIC Educational Resources Information Center
Tipton, Elizabeth
2013-01-01
Recent research on the design of social experiments has highlighted the effects of different design choices on research findings. Since experiments rarely collect their samples using random selection, in order to address these external validity problems and design choices, recent research has focused on two areas. The first area is on methods for…
Section Preequating under the Equivalent Groups Design without IRT
ERIC Educational Resources Information Center
Guo, Hongwen; Puhan, Gautam
2014-01-01
In this article, we introduce a section preequating (SPE) method (linear and nonlinear) under the randomly equivalent groups design. In this equating design, sections of Test X (a future new form) and another existing Test Y (an old form already on scale) are administered. The sections of Test X are equated to Test Y, after adjusting for the…
System Design under Uncertainty: Evolutionary Optimization of the Gravity Probe-B Spacecraft
NASA Technical Reports Server (NTRS)
Pullen, Samuel P.; Parkinson, Bradford W.
1994-01-01
This paper discusses the application of evolutionary random-search algorithms (simulated annealing and genetic algorithms) to the problem of spacecraft design under performance uncertainty. Traditionally, spacecraft performance uncertainty has been measured by reliability. Published algorithms for reliability optimization are seldom used in practice because they oversimplify reality. The algorithm developed here uses random-search optimization to allow us to model the problem more realistically. Monte Carlo simulations are used to evaluate the objective function for each trial design solution. These methods have been applied to the Gravity Probe-B (GP-B) spacecraft being developed at Stanford University for launch in 1999. Results of the algorithm developed here for GP-B are shown, and their implications for design optimization by evolutionary algorithms are discussed.
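The combination described above, random-search optimization with a Monte-Carlo-estimated objective, can be sketched with simulated annealing on a one-variable toy problem. The quadratic objective, step size, and cooling schedule below are illustrative assumptions, not the GP-B design model.

```python
import math
import random

def anneal(objective, x0, step=0.5, t0=1.0, cooling=0.995, iters=2000, seed=1):
    """Simulated annealing over one real design variable: worse moves are
    accepted with probability exp(-delta/T) so the search can escape
    local minima; T shrinks geometrically toward a greedy search."""
    rng = random.Random(seed)
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        cand = x + rng.gauss(0.0, step)      # random-search proposal
        fc = objective(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest
```

In the spacecraft setting, `objective` would itself be a Monte Carlo average over simulated performance outcomes, so each evaluation is noisy; annealing tolerates this noise better than derivative-based methods.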
NASA Astrophysics Data System (ADS)
Ju, Yaping; Zhang, Chuhua
2016-03-01
Blade fouling has proved to be a great threat to compressor performance in the operating stage. Current research on fouling-induced performance degradation of centrifugal compressors is based mainly on simplified roughness models that do not take into account realistic factors such as the spatial non-uniformity and randomness of the fouling-induced surface roughness. Moreover, little attention has been paid to robust design optimization of centrifugal compressor impellers with blade fouling taken into account. In this paper, a multi-objective robust design optimization method is developed for centrifugal impellers under surface roughness uncertainties due to blade fouling. A three-dimensional surface roughness map is proposed to describe the non-uniformity and randomness of realistic fouling accumulations on blades. To lower the computational cost of robust design optimization, a support vector regression (SVR) metamodel is combined with the Monte Carlo simulation (MCS) method to conduct the uncertainty analysis of fouled impeller performance. The results show that the critical fouled region associated with impeller performance degradation lies at the leading edge of the blade tip. The SVR metamodel proves to be an efficient and accurate means of detecting impeller performance variations caused by roughness uncertainties. After design optimization, the robust optimal design is more efficient and less sensitive to fouling uncertainties while maintaining good impeller performance in the clean condition. This research proposes a systematic design optimization method for centrifugal compressors that accounts for blade fouling, providing practical guidance for the design of advanced centrifugal compressors.
Rationale, design and methods of the HEALTHY study behavior intervention component
USDA-ARS?s Scientific Manuscript database
HEALTHY was a multi-center primary prevention trial designed to reduce risk factors for type 2 diabetes in adolescents. Seven centers each recruited six middle schools that were randomized to either intervention or control. The HEALTHY intervention integrated multiple components in nutrition, physic...
Interrelation Between Safety Factors and Reliability
NASA Technical Reports Server (NTRS)
Elishakoff, Isaac; Chamis, Christos C. (Technical Monitor)
2001-01-01
An evaluation was performed to establish the relationship between safety factors and reliability. The results show that the use of safety factors is not contradictory to the employment of probabilistic methods. In many cases safety factors can be expressed directly in terms of the required reliability levels. However, there is a major difference that must be emphasized: whereas safety factors are allocated in an ad hoc manner, the probabilistic approach offers a unified mathematical framework. Establishing the interrelation between the two concepts opens an avenue to specifying safety factors based on reliability. In cases where there are several modes of failure, the allocation of safety factors should be based on having the same reliability associated with each failure mode. This immediately suggests that through probabilistic methods existing over-design or under-design can be eliminated. The report includes three parts: Part 1, Random Actual Stress and Deterministic Yield Stress; Part 2, Deterministic Actual Stress and Random Yield Stress; Part 3, Both Actual Stress and Yield Stress Random.
Computational methods of robust controller design for aerodynamic flutter suppression
NASA Technical Reports Server (NTRS)
Anderson, L. R.
1981-01-01
The development of Riccati iteration, a tool for the design and analysis of linear control systems is examined. First, Riccati iteration is applied to the problem of pole placement and order reduction in two-time scale control systems. Order reduction, yielding a good approximation to the original system, is demonstrated using a 16th order linear model of a turbofan engine. Next, a numerical method for solving the Riccati equation is presented and demonstrated for a set of eighth order random examples. A literature review of robust controller design methods follows which includes a number of methods for reducing the trajectory and performance index sensitivity in linear regulators. Lastly, robust controller design for large parameter variations is discussed.
Chakraborty, Bibhas; Davidson, Karina W.
2015-01-01
Summary An implementation study is an important tool for deploying state-of-the-art treatments from clinical efficacy studies into a treatment program, with the dual goals of learning about the effectiveness of the treatments and improving the quality of care for patients enrolled in the program. In this article, we deal with the design of a treatment program of dynamic treatment regimens (DTRs) for patients with depression post acute coronary syndrome. We introduce a novel adaptive randomization scheme for a sequential multiple assignment randomized trial of DTRs. Our approach adapts the randomization probabilities to favor treatment sequences having comparatively superior Q-functions used in Q-learning. The proposed approach addresses three main concerns of an implementation study: it allows incorporation of historical data or opinions, it includes randomization for learning purposes, and it aims to improve care via adaptation throughout the program. We demonstrate how to apply our method to design a depression treatment program using data from a previous study. By simulation, we illustrate that the inputs from historical data are important for the program performance measured by the expected outcomes of the enrollees, but also show that the adaptive randomization scheme is able to compensate for poorly specified historical inputs by improving patient outcomes within a reasonable horizon. The simulation results also confirm that the proposed design allows efficient learning of the treatments by alleviating the curse of dimensionality. PMID:25354029
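A softmax-type rule is one simple way to adapt randomization probabilities toward arms with higher estimated Q-values while keeping every arm's probability strictly positive, so randomization for learning is preserved. The rule and its temperature parameter below are illustrative assumptions, not the authors' exact scheme.

```python
import math
import random

def adaptive_rand_prob(q_estimates, temperature=1.0):
    """Softmax-style adaptive randomization: arms with higher estimated
    value (Q) get higher allocation probability; no arm's probability
    is ever zero, so learning about all arms continues."""
    z = [math.exp(q / temperature) for q in q_estimates]
    s = sum(z)
    return [v / s for v in z]

def allocate(q_estimates, rng, temperature=1.0):
    """Draw an arm index according to the adaptive probabilities."""
    probs = adaptive_rand_prob(q_estimates, temperature)
    u, acc = rng.random(), 0.0
    for arm, p in enumerate(probs):
        acc += p
        if u <= acc:
            return arm
    return len(probs) - 1
```

Raising the temperature flattens the allocation toward equal randomization (more learning); lowering it concentrates enrollment on the currently best-looking sequence (more care improvement), which mirrors the learning-versus-care trade-off the abstract describes.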
Peto, R.; Pike, M. C.; Armitage, P.; Breslow, N. E.; Cox, D. R.; Howard, S. V.; Mantel, N.; McPherson, K.; Peto, J.; Smith, P. G.
1977-01-01
Part I of this report appeared in the previous issue (Br. J. Cancer (1976) 34,585), and discussed the design of randomized clinical trials. Part II now describes efficient methods of analysis of randomized clinical trials in which we wish to compare the duration of survival (or the time until some other untoward event first occurs) among different groups of patients. It is intended to enable physicians without statistical training either to analyse such data themselves using life tables, the logrank test and retrospective stratification, or, when such analyses are presented, to appreciate them more critically, but the discussion may also be of interest to statisticians who have not yet specialized in clinical trial analyses. PMID:831755
Valbuza, Juliana Spelta; de Oliveira, Márcio Moysés; Conti, Cristiane Fiquene; Prado, Lucila Bizari F; de Carvalho, Luciane Bizari Coin; do Prado, Gilmar Fernandes
2010-12-01
Treatment of obstructive sleep apnea (OSA) using methods for increasing upper airway muscle tonus has been controversial and poorly reported; thus, a review of the evidence is needed to evaluate the effectiveness of these methods. The design used was a systematic review of randomized controlled trials. Data sources were the Cochrane Library, Medline, Embase and Scielo, registries of ongoing trials, theses indexed at Biblioteca Regional de Medicina/Pan-American Health Organization of the World Health Organization, and the reference lists of all the trials retrieved. This was a review of randomized or quasi-randomized double-blind trials on OSA. Two reviewers independently applied eligibility criteria. One reviewer assessed study quality and extracted data, and these processes were checked by a second reviewer. The primary outcome was a decrease in the apnea/hypopnea index (AHI) to below five episodes per hour. Other outcomes were subjective sleep quality, sleep quality measured by night polysomnography, quality of life measured subjectively, and adverse events associated with the treatments. Three eligible trials were included. Two studies showed improvements in both the objective and subjective analyses, and one study showed improvement of snoring but not of AHI, while the subjective analyses showed no improvement. Adverse events were reported and were not significant. There is no accepted scientific evidence that methods aiming to increase muscle tonus of the stomatognathic system are effective in reducing AHI to below five events per hour. Well-designed randomized controlled trials are needed to assess the efficacy of such methods.
Modeling and dynamic environment analysis technology for spacecraft
NASA Astrophysics Data System (ADS)
Fang, Ren; Zhaohong, Qin; Zhong, Zhang; Zhenhao, Liu; Kai, Yuan; Long, Wei
Spacecraft sustain complex and severe vibration and acoustic environments during flight. Predicting the resulting structural responses, including numerical prediction of fluctuating pressure, model updating, and random vibration and acoustic analysis, plays an important role during the design, manufacture and ground testing of spacecraft. In this paper, Monotony Integrative Large Eddy Simulation (MILES) is introduced to predict the fluctuating pressure on the fairing. The exact flow structures of the fairing wall surface under different Mach numbers are obtained, and then a spacecraft model is constructed using the finite element method (FEM). According to the modal test data, the model is updated by the penalty method. On this basis, the random vibration and acoustic responses of the fairing and satellite are analyzed by different methods. The simulated results agree well with the experimental ones, which shows the validity of the modeling and dynamic environment analysis technology. This information can better support test planning, defining test conditions and designing optimal structures.
Blencowe, Natalie S; Cook, Jonathan A; Pinkney, Thomas; Rogers, Chris; Reeves, Barnaby C; Blazeby, Jane M
2017-04-01
Randomized controlled trials in surgery are notoriously difficult to design and conduct due to numerous methodological and cultural challenges. Over the last 5 years, several UK-based surgical trial-related initiatives have been funded to address these issues. These include the development of Surgical Trials Centers and Surgical Specialty Leads (individual surgeons responsible for championing randomized controlled trials in their specialist fields), both funded by the Royal College of Surgeons of England; networks of research-active surgeons in training; and investment in methodological research relating to surgical randomized controlled trials (to address issues such as recruitment, blinding, and the selection and standardization of interventions). This article discusses these initiatives more in detail and provides exemplar cases to illustrate how the methodological challenges have been tackled. The initiatives have surpassed expectations, resulting in a renaissance in surgical research throughout the United Kingdom, such that the number of patients entering surgical randomized controlled trials has doubled.
Schneider, Kristin L; Bodenlos, Jamie S; Ma, Yunsheng; Olendzki, Barbara; Oleski, Jessica; Merriam, Philip; Crawford, Sybil; Ockene, Ira S; Pagoto, Sherry L
2008-01-01
Background Obesity is often comorbid with depression and individuals with this comorbidity fare worse in behavioral weight loss treatment. Treating depression directly prior to behavioral weight loss treatment might bolster weight loss outcomes in this population, but this has not yet been tested in a randomized clinical trial. Methods and design This randomized clinical trial will examine whether behavior therapy for depression administered prior to standard weight loss treatment produces greater weight loss than standard weight loss treatment alone. Obese women with major depressive disorder (N = 174) will be recruited from primary care clinics and the community and randomly assigned to one of the two treatment conditions. Treatment will last 2 years, and will include a 6-month intensive treatment phase followed by an 18-month maintenance phase. Follow-up assessments will occur at 6 months and at 1 and 2 years following randomization. The primary outcome is weight loss. The study was designed to provide 90% power for detecting a weight change difference between conditions of 3.1 kg (standard deviation of 5.5 kg) at 1 year, assuming a 25% rate of loss to follow-up. Secondary outcomes include depression, physical activity, dietary intake, psychosocial variables and cardiovascular risk factors. Potential mediators (e.g., adherence, depression, physical activity and caloric intake) of the intervention effect on weight change will also be examined. Discussion Treating depression before administering intensive health behavior interventions could potentially boost the impact on both mental and physical health outcomes. Trial registration NCT00572520 PMID:18793398
Packet Randomized Experiments for Eliminating Classes of Confounders
Pavela, Greg; Wiener, Howard; Fontaine, Kevin R.; Fields, David A.; Voss, Jameson D.; Allison, David B.
2014-01-01
Background Although randomization is considered essential for causal inference, it is often not possible to randomize in nutrition and obesity research. To address this, we develop a framework for an experimental design—packet randomized experiments (PREs), which improves causal inferences when randomization on a single treatment variable is not possible. This situation arises when subjects are randomly assigned to a condition (such as a new roommate) which varies in one characteristic of interest (such as weight), but also varies across many others. There has been no general discussion of this experimental design, including its strengths, limitations, and statistical properties. As such, researchers are left to develop and apply PREs on an ad hoc basis, limiting its potential to improve causal inferences among nutrition and obesity researchers. Methods We introduce PREs as an intermediary design between randomized controlled trials and observational studies. We review previous research that used the PRE design and describe its application in obesity-related research, including random roommate assignments, heterochronic parabiosis, and the quasi-random assignment of subjects to geographic areas. We then provide a statistical framework to control for potential packet-level confounders not accounted for by randomization. Results PREs have successfully been used to improve causal estimates of the effect of roommates, altitude, and breastfeeding on weight outcomes. When certain assumptions are met, PREs can asymptotically control for packet-level characteristics. This has the potential to statistically estimate the effect of a single treatment even when randomization to a single treatment did not occur. Conclusions Applying PREs to obesity-related research will improve decisions about clinical, public health, and policy actions insofar as it offers researchers new insight into cause and effect relationships among variables. PMID:25444088
Study on Stationarity of Random Load Spectrum Based on the Special Road
NASA Astrophysics Data System (ADS)
Yan, Huawen; Zhang, Weigong; Wang, Dong
2017-09-01
Special road quality can be assessed using a wheel force sensor; the essence of this method is to collect the load spectrum of the vehicle, which reflects the quality of the road. According to the definition of a stochastic process, it is easy to see that the load spectrum is a stochastic process. However, the analysis methods and ranges of application of different random processes differ greatly, especially in engineering practice, which directly affects the design and development of the experiment. Therefore, determining the type of a random process has important practical significance. Based on an analysis of the digital characteristics of the road load spectrum, this paper determines that the road load spectrum in this experiment belongs to a stationary stochastic process, paving the way for subsequent modeling and feature extraction for the special road.
SNP selection and classification of genome-wide SNP data using stratified sampling random forests.
Wu, Qingyao; Ye, Yunming; Liu, Yang; Ng, Michael K
2012-09-01
For high-dimensional genome-wide association (GWA) case-control data on complex disease, a large proportion of single-nucleotide polymorphisms (SNPs) are usually irrelevant to the disease. A simple random sampling method in a random forest, using the default mtry parameter to choose the feature subspace, will select too many subspaces without informative SNPs. An exhaustive search for an optimal mtry is often required in order to include useful and relevant SNPs and discard the vast number of non-informative SNPs, but it is too time-consuming for high-dimensional GWA data. The main aim of this paper is to propose a stratified sampling method for feature subspace selection to generate decision trees in a random forest for high-dimensional GWA data. Our idea is to design an equal-width discretization scheme for informativeness to divide SNPs into multiple groups. In feature subspace selection, we randomly select the same number of SNPs from each group and combine them to form a subspace in which to generate a decision tree. This stratified sampling procedure ensures that each subspace contains enough useful SNPs, avoids the very high computational cost of an exhaustive search for an optimal mtry, and maintains the randomness of a random forest. We employ two genome-wide SNP data sets (Parkinson case-control data comprising 408 803 SNPs and Alzheimer case-control data comprising 380 157 SNPs) to demonstrate that the proposed stratified sampling method is effective, generating better random forests with higher accuracy and a lower error bound than those produced by Breiman's random forest generation method. For the Parkinson data, we also show some interesting genes identified by the method, which may be associated with neurological disorders, for further biological investigation.
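The subspace-selection step described above (equal-width discretization of an informativeness score, then equal sampling from each group) can be sketched as follows; the function name, the score input, and the group/sample counts are illustrative assumptions:

```python
import random

def stratified_subspace(informativeness, n_groups=3, per_group=2, rng=None):
    """Select a feature subspace by equal-width discretization of an
    informativeness score (e.g., a single-SNP association statistic),
    then sampling the same number of SNPs from each group.
    `informativeness` maps SNP index -> score. Illustrative sketch of
    the stratified-sampling idea, not the authors' implementation."""
    rng = rng or random.Random()
    scores = list(informativeness.values())
    lo, hi = min(scores), max(scores)
    width = (hi - lo) / n_groups or 1.0  # guard against all-equal scores
    # equal-width discretization into n_groups bins
    groups = [[] for _ in range(n_groups)]
    for snp, score in informativeness.items():
        g = min(int((score - lo) / width), n_groups - 1)
        groups[g].append(snp)
    # draw the same number of SNPs from each bin
    subspace = []
    for g in groups:
        subspace.extend(rng.sample(g, min(per_group, len(g))))
    return subspace
```

Each decision tree in the forest would call this once to obtain its feature subspace, guaranteeing representation from the more informative bins without an exhaustive mtry search.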
An Overview of Research and Evaluation Designs for Dissemination and Implementation
Brown, C. Hendricks; Curran, Geoffrey; Palinkas, Lawrence A.; Aarons, Gregory A.; Wells, Kenneth B.; Jones, Loretta; Collins, Linda M.; Duan, Naihua; Mittman, Brian S.; Wallace, Andrea; Tabak, Rachel G.; Ducharme, Lori; Chambers, David; Neta, Gila; Wiley, Tisha; Landsverk, John; Cheung, Ken; Cruden, Gracelyn
2016-01-01
Background The wide variety of dissemination and implementation designs now being used to evaluate and improve health systems and outcomes warrants review of the scope, features, and limitations of these designs. Methods This paper is one product of a design workgroup formed in 2013 by the National Institutes of Health to address dissemination and implementation research, and whose members represented diverse methodologic backgrounds, content focus areas, and health sectors. These experts integrated their collective knowledge on dissemination and implementation designs with searches of published evaluations strategies. Results This paper emphasizes randomized and non-randomized designs for the traditional translational research continuum or pipeline, which builds on existing efficacy and effectiveness trials to examine how one or more evidence-based clinical/prevention interventions are adopted, scaled up, and sustained in community or service delivery systems. We also mention other designs, including hybrid designs that combine effectiveness and implementation research, quality improvement designs for local knowledge, and designs that use simulation modeling. PMID:28384085
ERIC Educational Resources Information Center
Rodriguez-Sanchez, Emiliano; Patino-Alonso, Maria C.; Mora-Simon, Sara; Gomez-Marcos, Manuel A.; Perez-Penaranda, Anibal; Losada-Baltar, Andres; Garcia-Ortiz, Luis
2013-01-01
Purpose: To assess, in the context of Primary Health Care (PHC), the effect of a psychological intervention in mental health among caregivers (CGs) of dependent relatives. Design and Methods: Randomized multicenter, controlled clinical trial. The 125 CGs included in the trial were receiving health care in PHC. Inclusion criteria: Identifying…
ERIC Educational Resources Information Center
Sarayani, Amir; Rashidian, Arash; Gholami, Kheirollah; Torkamandi, Hassan; Javadi, Mohammadreza
2012-01-01
Introduction: Weight management is a new public health role for community pharmacists in many countries. Lack of expertise is one of the key barriers to counseling obese patients. We evaluated the comparative efficacy of three alternative continuing education (CE) meetings on weight management. Methods: We designed a randomized controlled trial…
ERIC Educational Resources Information Center
Rhoads, Christopher
2011-01-01
Researchers planning a randomized field trial to evaluate the effectiveness of an educational intervention often face the following dilemma. They plan to recruit schools to participate in their study. The question is, "Should the researchers randomly assign individuals (either students or teachers, depending on the intervention) within schools to…
ERIC Educational Resources Information Center
Eliasson, Ann-Christin; Shaw, Karin; Berg, Elisabeth; Krumlinde-Sundholm, Lena
2011-01-01
The aim was to evaluate the effect of Eco-CIMT in young children with unilateral cerebral palsy in a randomized controlled crossover design. The training was implemented within the regular pediatric services, provided by the child's parents and/or preschool teacher and supervised by the child's regular therapist. Methods: Twenty-five children…
ERIC Educational Resources Information Center
Johnson, Dawn M.; Zlotnick, Caron; Perez, Sara
2011-01-01
Objective: This study was designed to explore the acceptability, feasibility, and initial efficacy of a new shelter-based treatment for victims of intimate partner violence (IPV; i.e., Helping to Overcome PTSD through Empowerment [HOPE]). Method: A Phase I randomized clinical trial comparing HOPE (n = 35) with standard shelter services (SSS) (n =…
ERIC Educational Resources Information Center
Iriyama, Yae; Murayama, Nobuko
2014-01-01
Objective: We conducted a randomized controlled crossover trial to evaluate the effects of a new worksite weight-control programme designed for men with or at risk of obesity using a combination of nutrition education and nutrition environmental interventions. Subjects and methods: Male workers with or at risk of obesity were recruited for this…
ERIC Educational Resources Information Center
Gellis, Zvi D.; Kenaley, Bonnie; McGinty, Jean; Bardelli, Ellen; Davitt, Joan; Ten Have, Thomas
2012-01-01
Purpose: Telehealth care is emerging as a viable intervention model to treat complex chronic conditions, such as heart failure (HF) and chronic obstructive pulmonary disease (COPD), and to engage older adults in self-care disease management. Design and Methods: We report on a randomized controlled trial examining the impact of a multifaceted…
A Body Image and Disordered Eating Intervention for Women in Midlife: A Randomized Controlled Trial
ERIC Educational Resources Information Center
McLean, Sian A.; Paxton, Susan J.; Wertheim, Eleanor H.
2011-01-01
Objective: This study examined the outcome of a body image and disordered eating intervention for midlife women. The intervention was specifically designed to address risk factors that are pertinent in midlife. Method: Participants were 61 women aged 30 to 60 years (M = 43.92, SD = 8.22) randomly assigned to intervention (n = 32) or (delayed…
ERIC Educational Resources Information Center
Jones, Stephanie M.; Brown, Joshua L.; Hoglund, Wendy L. G.; Aber, J. Lawrence
2010-01-01
Objective: To report experimental impacts of a universal, integrated school-based intervention in social-emotional learning and literacy development on change over 1 school year in 3rd-grade children's social-emotional, behavioral, and academic outcomes. Method: This study employed a school-randomized, experimental design and included 942…
ERIC Educational Resources Information Center
Sebro, Negusse Yohannes; Goshu, Ayele Taye
2017-01-01
This study aims to explore Bayesian multilevel modeling to investigate variations of average academic achievement of grade eight school students. A sample of 636 students is randomly selected from 26 private and government schools by a two-stage stratified sampling design. Bayesian method is used to estimate the fixed and random effects. Input and…
ERIC Educational Resources Information Center
Lennox, Nicholas; Bain, Chris; Rey-Conde, Therese; Taylor, Miriam; Boyle, Frances M.; Purdie, David M.; Ware, Robert S.
2010-01-01
Background: People with intellectual disability who live in the community often have poor health and healthcare, partly as a consequence of poor communication, recall difficulties and incomplete patient health information. Materials and Methods: A cluster randomized-controlled trial with 2 x 2 factorial design was conducted with adults with…
ERIC Educational Resources Information Center
Glassman, Jill R.; Potter, Susan C.; Baumler, Elizabeth R.; Coyle, Karin K.
2015-01-01
Introduction: Group-randomized trials (GRTs) are one of the most rigorous methods for evaluating the effectiveness of group-based health risk prevention programs. Efficiently designing GRTs with a sample size that is sufficient for meeting the trial's power and precision goals while not wasting resources exceeding them requires estimates of the…
ERIC Educational Resources Information Center
Schlosser, Ralf W.; Koul, Rajinder; Shane, Howard; Sorce, James; Brock, Kristofer; Harmon, Ashley; Moerlein, Dorothy; Hearn, Emilia
2014-01-01
Purpose: The effects of animation on naming and identification of graphic symbols for verbs and prepositions were studied in 2 graphic symbol sets in preschoolers. Method: Using a 2 × 2 × 2 × 3 completely randomized block design, preschoolers across three age groups were randomly assigned to combinations of symbol set (Autism Language Program…
Investigating Test Equating Methods in Small Samples through Various Factors
ERIC Educational Resources Information Center
Asiret, Semih; Sünbül, Seçil Ömür
2016-01-01
In this study, the aim was to compare equating methods for the random groups design in small samples across factors such as sample size, difference in difficulty between forms, and guessing parameter. Moreover, which method gives better results under which conditions was also investigated. In this study, 5,000 dichotomous simulated data…
Analyzing Empirical Evaluations of Non-Experimental Methods in Field Settings
ERIC Educational Resources Information Center
Steiner, Peter M.; Wong, Vivian
2016-01-01
Despite recent emphasis on the use of randomized control trials (RCTs) for evaluating education interventions, in most areas of education research, observational methods remain the dominant approach for assessing program effects. Over the last three decades, the within-study comparison (WSC) design has emerged as a method for evaluating the…
Rationale, design and methods of the HEALTHY study physical education intervention component
USDA-ARS?s Scientific Manuscript database
The HEALTHY primary prevention trial was designed to reduce risk factors for type 2 diabetes in middle school students. Middle schools at seven centers across the United States participated in the 3-year study. Half of them were randomized to receive a multi-component intervention. The intervention ...
Binomial leap methods for simulating stochastic chemical kinetics.
Tian, Tianhai; Burrage, Kevin
2004-12-01
This paper discusses efficient simulation methods for stochastic chemical kinetics. Based on the tau-leap and midpoint tau-leap methods of Gillespie [D. T. Gillespie, J. Chem. Phys. 115, 1716 (2001)], binomial random variables are used in these leap methods rather than Poisson random variables. The motivation for this approach is to improve the efficiency of the Poisson leap methods by using larger stepsizes. Unlike Poisson random variables, whose range of sample values is from zero to infinity, binomial random variables have a finite range of sample values. This probabilistic property has been used to restrict possible reaction numbers and to avoid negative molecular numbers in stochastic simulations when a larger stepsize is used. In this approach a binomial random variable is defined for a single reaction channel in order to keep the reaction number of this channel below the number of molecules that undergo this reaction channel. A sampling technique is also designed for the total reaction number of a reactant species that undergoes two or more reaction channels. Samples for the total reaction number are not greater than the molecular number of this species. In addition, probability properties of the binomial random variables provide stepsize conditions for restricting reaction numbers in a chosen time interval. These stepsize conditions are important properties of robust leap control strategies. Numerical results indicate that the proposed binomial leap methods can be applied to a wide range of chemical reaction systems with very good accuracy and significant improvement in efficiency over existing approaches. (c) 2004 American Institute of Physics.
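The single-channel bounding idea can be sketched as follows. Here `rate` is a per-molecule rate constant, and the Bernoulli-sum sampler is an illustrative stand-in for a library binomial sampler; this is a sketch of the binomial-leap idea, not the paper's full algorithm:

```python
import random

def binomial_leap_step(n_molecules, rate, tau, rng=None):
    """One leap step for a single first-order reaction channel. The
    firing count is drawn from Binomial(n_molecules, p), so it can
    never exceed the available molecules, unlike a Poisson draw whose
    range is unbounded. Returns the sampled number of firings."""
    rng = rng or random.Random()
    if n_molecules == 0:
        return 0
    # per-molecule probability of reacting during the leap interval,
    # capped at 1 so the binomial parameter stays valid
    p = min(1.0, rate * tau)
    # K ~ Binomial(n_molecules, p), drawn as a sum of Bernoulli trials
    return sum(1 for _ in range(n_molecules) if rng.random() < p)
```

Because the draw is bounded by `n_molecules`, a simulation using this step cannot produce negative molecule counts even with a large stepsize, which is exactly the property the binomial leap exploits.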
MicroRNA array normalization: an evaluation using a randomized dataset as the benchmark.
Qin, Li-Xuan; Zhou, Qin
2014-01-01
MicroRNA arrays possess a number of unique data features that challenge the assumption key to many normalization methods. We assessed the performance of existing normalization methods using two microRNA array datasets derived from the same set of tumor samples: one dataset was generated using a blocked randomization design when assigning arrays to samples and hence was free of confounding array effects; the second dataset was generated without blocking or randomization and exhibited array effects. The randomized dataset was assessed for differential expression between two tumor groups and treated as the benchmark. The non-randomized dataset was assessed for differential expression after normalization and compared against the benchmark. Normalization improved the true positive rate significantly in the non-randomized data but still possessed a false discovery rate as high as 50%. Adding a batch adjustment step before normalization further reduced the number of false positive markers while maintaining a similar number of true positive markers, which resulted in a false discovery rate of 32% to 48%, depending on the specific normalization method. We concluded the paper with some insights on possible causes of false discoveries to shed light on how to improve normalization for microRNA arrays.
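The two-step pipeline evaluated above (batch adjustment before normalization) can be sketched with simple stand-ins: an additive per-batch shift for the batch adjustment and median-centering for the normalization. Both choices, and the data layout, are illustrative assumptions, not the specific methods compared in the paper:

```python
def batch_adjust_then_normalize(batches):
    """Sketch of batch adjustment followed by normalization.
    `batches` maps batch id -> list of arrays (lists of log
    expressions). Each batch's additive shift from the grand mean is
    removed first; then every array is median-centered."""
    all_vals = [v for arrays in batches.values() for a in arrays for v in a]
    grand = sum(all_vals) / len(all_vals)
    out = {}
    for batch_id, arrays in batches.items():
        vals = [v for a in arrays for v in a]
        shift = sum(vals) / len(vals) - grand  # batch-level offset
        adjusted = [[v - shift for v in a] for a in arrays]
        centered = []
        for a in adjusted:
            s = sorted(a)
            n = len(s)
            med = s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])
            centered.append([v - med for v in a])  # per-array normalization
        out[batch_id] = centered
    return out
```

Removing the batch-level offset before per-array normalization mirrors the paper's finding that a batch adjustment step reduces false positives that normalization alone leaves behind.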
Designing with fiber-reinforced plastics (planar random composites)
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1982-01-01
The use of composite mechanics to predict the hygrothermomechanical behavior of planar random composites (PRC) is reviewed and described. These composites are usually made from chopped fiber reinforced resins (thermoplastics or thermosets). The hygrothermomechanical behavior includes mechanical properties, physical properties, thermal properties, fracture toughness, creep and creep rupture. Properties are presented in graphical form with sample calculations to illustrate their use. Concepts such as directional reinforcement and strip hybrids are described. Typical data that can be used for preliminary design for various PRCs are included. Several resins and molding compounds used to make PRCs are described briefly. Pertinent references are cited that cover analysis and design methods, materials, data, fabrication procedures and applications.
Schweizer, Marin L; Braun, Barbara I; Milstone, Aaron M
2016-10-01
Quasi-experimental studies evaluate the association between an intervention and an outcome using experiments in which the intervention is not randomly assigned. Quasi-experimental studies are often used to evaluate rapid responses to outbreaks or other patient safety problems requiring prompt, nonrandomized interventions. Quasi-experimental studies can be categorized into 3 major types: interrupted time-series designs, designs with control groups, and designs without control groups. This methods paper highlights key considerations for quasi-experimental studies in healthcare epidemiology and antimicrobial stewardship, including study design and analytic approaches to avoid selection bias and other common pitfalls of quasi-experimental studies. Infect Control Hosp Epidemiol 2016;1-6.
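One common analysis for the interrupted time-series designs mentioned above is segmented regression, sketched here with ordinary least squares. The model form is standard, but the function name and parameterization are assumptions of this sketch:

```python
import numpy as np

def its_segmented_fit(y, intervention_at):
    """Segmented regression for an interrupted time series:
    y_t = b0 + b1*t + b2*post_t + b3*(t - T0)*post_t,
    where post_t indicates observations at or after the intervention
    time T0. b2 estimates the immediate level change and b3 the slope
    change. Minimal sketch of one common ITS analysis."""
    t = np.arange(len(y), dtype=float)
    post = (t >= intervention_at).astype(float)
    X = np.column_stack([
        np.ones_like(t),                 # intercept
        t,                               # pre-existing trend
        post,                            # level change at intervention
        (t - intervention_at) * post,    # slope change after intervention
    ])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return beta  # [b0, b1, level_change, slope_change]
```

Separating the level and slope terms is what lets a quasi-experimental analysis distinguish an intervention effect from the secular trend that was already underway.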
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, S.Y.; Tepikian, S.
1985-01-01
Nonlinear magnetic forces become more important for particles in modern large accelerators. These nonlinear elements are introduced either intentionally, to control beam dynamics, or by uncontrollable random errors. The equations of motion for the nonlinear Hamiltonian are usually non-integrable. Because of the nonlinear part of the Hamiltonian, the tune diagram of an accelerator is a jungle. Nonlinear magnet multipoles are important in keeping the accelerator operating point in the safe quarter of the hostile jungle of resonant tunes. Indeed, all modern accelerator designs have taken advantage of nonlinear mechanics. On the other hand, the effect of the uncontrollable random multipoles should be evaluated carefully. A powerful method of studying the effect of these nonlinear multipoles is a particle tracking calculation, in which a group of test particles is traced through these magnetic multipoles in the accelerator for hundreds to millions of turns in order to test the dynamical aperture of the machine. These methods are extremely useful in the design of large accelerators such as the SSC, LEP, HERA and RHIC. These calculations unfortunately take a tremendous amount of computing time. In this review the method of determining chaotic orbits, and its application to nonlinear problems in accelerator physics, is discussed. We then discuss the scaling properties and the effect of random sextupoles.
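The tracking idea can be sketched with the textbook one-turn map of a linear rotation plus a thin sextupole kick (a Henon-like map); the tune, kick strength, turn count, and aperture here are illustrative assumptions, not parameters of any real machine:

```python
import math

def survives(x0, px0, tune=0.205, sext=1.0, max_turns=1000, aperture=10.0):
    """Track one test particle through a one-turn map: a thin sextupole
    kick px -> px + sext*x**2 followed by a linear rotation with phase
    advance 2*pi*tune. Returns True if the particle stays inside the
    aperture for max_turns turns, as in dynamic-aperture studies."""
    c = math.cos(2 * math.pi * tune)
    s = math.sin(2 * math.pi * tune)
    x, px = x0, px0
    for _ in range(max_turns):
        px = px + sext * x * x                     # nonlinear kick
        x, px = c * x + s * px, -s * x + c * px    # linear one-turn rotation
        if x * x + px * px > aperture * aperture:
            return False                           # particle is lost
    return True
```

Scanning initial amplitudes with a map like this and recording the largest amplitude that survives is, in miniature, how the dynamic aperture of a lattice is estimated, and why full-scale studies consume so much computing time.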
Robust Airfoil Optimization to Achieve Consistent Drag Reduction Over a Mach Range
NASA Technical Reports Server (NTRS)
Li, Wu; Huyse, Luc; Padula, Sharon; Bushnell, Dennis M. (Technical Monitor)
2001-01-01
We prove mathematically that in order to avoid point-optimization at the sampled design points for multipoint airfoil optimization, the number of design points must be greater than the number of free-design variables. To overcome point-optimization at the sampled design points, a robust airfoil optimization method (called the profile optimization method) is developed and analyzed. This optimization method aims at a consistent drag reduction over a given Mach range and has three advantages: (a) it prevents severe degradation in the off-design performance by using a smart descent direction in each optimization iteration, (b) there is no random airfoil shape distortion for any iterate it generates, and (c) it allows a designer to make a trade-off between a truly optimized airfoil and the amount of computing time consumed. For illustration purposes, we use the profile optimization method to solve a lift-constrained drag minimization problem for a 2-D airfoil in Euler flow with 20 free-design variables. A comparison with other airfoil optimization methods is also included.
Messier, S P; Callahan, L F; Golightly, Y M; Keefe, F J
2015-05-01
The objective was to develop a set of "best practices" for use as a primer for those interested in entering the clinical trials field for lifestyle diet and/or exercise interventions in osteoarthritis (OA), and as a set of recommendations for experienced clinical trials investigators. A subcommittee of the non-pharmacologic therapies committee of the OARSI Clinical Trials Working Group was selected by the Steering Committee to develop a set of recommended principles for non-pharmacologic diet/exercise OA randomized clinical trials. Topics were identified for inclusion by co-authors and reviewed by the subcommittee. Resources included authors' expert opinions, traditional search methods including MEDLINE (via PubMed), and previously published guidelines. Suggested steps and considerations for study methods (e.g., recruitment and enrollment of participants, study design, intervention and assessment methods) were recommended. The recommendations set forth in this paper provide a guide from which a research group can design a lifestyle diet/exercise randomized clinical trial in patients with OA. Copyright © 2015 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
Analytic methods for questions pertaining to a randomized pretest, posttest, follow-up design.
Rausch, Joseph R; Maxwell, Scott E; Kelley, Ken
2003-09-01
Delineates 5 questions regarding group differences that are likely to be of interest to researchers within the framework of a randomized pretest, posttest, follow-up (PPF) design. These 5 questions are examined from a methodological perspective by comparing and discussing analysis of variance (ANOVA) and analysis of covariance (ANCOVA) methods and briefly discussing hierarchical linear modeling (HLM) for these questions. This article demonstrates that the pretest should be utilized as a covariate in the model rather than as a level of the time factor or as part of the dependent variable within the analysis of group differences. It is also demonstrated that how the posttest and the follow-up are utilized in the analysis of group differences is determined by the specific question asked by the researcher.
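As a hypothetical sketch (simulated data, not from the article), the ANCOVA approach the authors recommend, with the pretest entered as a covariate rather than folded into a change score, can be illustrated in a few lines of NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical PPF-style data: pretest predicts posttest; true group effect = 2.0.
n = 200
group = np.repeat([0, 1], n // 2)          # 0 = control, 1 = treatment
pretest = rng.normal(50, 10, n)
posttest = 0.6 * pretest + 2.0 * group + rng.normal(0, 5, n)

# ANCOVA: regress posttest on the group indicator and the pretest covariate.
X = np.column_stack([np.ones(n), group, pretest])
beta, *_ = np.linalg.lstsq(X, posttest, rcond=None)
resid_ancova = posttest - X @ beta

# Change-score analysis for comparison: (posttest - pretest) on group only.
Xc = np.column_stack([np.ones(n), group])
change = posttest - pretest
bc, *_ = np.linalg.lstsq(Xc, change, rcond=None)
resid_change = change - Xc @ bc

print(beta[1])  # adjusted group effect, near the true value 2.0
print(resid_ancova.var() < resid_change.var())  # ANCOVA leaves less residual noise
```

The comparison at the end illustrates the article's point: when the pretest-posttest slope is not exactly 1, covariate adjustment absorbs more pretest variation than differencing does.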
Self-organization of maze-like structures via guided wrinkling.
Bae, Hyung Jong; Bae, Sangwook; Yoon, Jinsik; Park, Cheolheon; Kim, Kibeom; Kwon, Sunghoon; Park, Wook
2017-06-01
Sophisticated three-dimensional (3D) structures found in nature are self-organized by bottom-up natural processes. To artificially construct these complex systems, various bottom-up fabrication methods, designed to transform 2D structures into 3D structures, have been developed as alternatives to conventional top-down lithography processes. We present a different self-organization approach, in which we construct microstructures that are periodic and ordered yet have random, maze-like architecture. For this purpose, we transformed planar surfaces by wrinkling, directly using the randomly generated ridges as maze walls. Highly regular maze structures, consisting of several tessellations with customized designs, were fabricated by precisely controlling wrinkling with the ridge-guiding structure, analogous to the creases in origami. The method presented here could have widespread applications in various material systems with multiple length scales.
Linear combinations come alive in crossover designs.
Shuster, Jonathan J
2017-10-30
Before learning anything about statistical inference in beginning service courses in biostatistics, students learn how to calculate the mean and variance of linear combinations of random variables. Practical precalculus examples of the importance of these exercises can be helpful for instructors, the target audience of this paper. We shall present applications to the "1-sample" and "2-sample" methods for randomized short-term 2-treatment crossover studies, where patients experience both treatments in random order with a "washout" between the active treatment periods. First, we show that the 2-sample method is preferred as it eliminates "conditional bias" when sample sizes by order differ and produces a smaller variance. We also demonstrate that it is usually advisable to use the differences in posttests (ignoring baseline and post washout values) rather than the differences between the changes in treatment from the start of the period to the end of the period ("delta of delta"). Although the intent is not to provide a definitive discussion of crossover designs, we provide a section and references to excellent alternative methods, where instructors can provide motivation to students to explore the topic in greater detail in future readings or courses. Copyright © 2017 John Wiley & Sons, Ltd.
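A hedged illustration of the crossover point above, using simulated AB/BA data (all numbers hypothetical): the within-patient period difference is a linear combination of random variables that cancels the large between-patient variance, which is exactly the precalculus exercise the author has in mind.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical AB/BA crossover: large between-patient variation, true effect tau = 3.
n_seq, tau, period = 100, 3.0, 1.5
u = rng.normal(0, 10, 2 * n_seq)            # patient random effects (large variance)
e = rng.normal(0, 1, (2 * n_seq, 2))        # small within-patient noise

# First n_seq patients are sequence AB (treatment in period 1), rest are BA.
ab = np.arange(2 * n_seq) < n_seq
y1 = u + np.where(ab, tau, 0.0) + e[:, 0]
y2 = u + period + np.where(ab, 0.0, tau) + e[:, 1]

# Within-patient difference: the patient effect u cancels in this combination.
d = y1 - y2
# 2-sample estimator: half the difference of sequence means also cancels the
# period effect, leaving an estimate of tau with only the small noise variance.
tau_hat = (d[:n_seq].mean() - d[n_seq:].mean()) / 2.0
print(tau_hat)
```

With a between-patient standard deviation ten times the within-patient one, a naive between-group comparison would be far noisier than this estimator.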
A Comparative Study of Random Patterns for Digital Image Correlation
NASA Astrophysics Data System (ADS)
Stoilov, G.; Kavardzhikov, V.; Pashkouleva, D.
2012-06-01
Digital Image Correlation (DIC) is a computer-based image analysis technique utilizing random patterns, which finds applications in experimental mechanics of solids and structures. In this paper, a comparative study of three simulated random patterns is presented. One of them is generated according to a new algorithm introduced by the authors. A criterion for the quantitative evaluation of random patterns, based on their autocorrelation functions, is introduced. The patterns' deformations are simulated numerically and realized experimentally. The displacements are measured by using the DIC method. Tensile tests are performed after printing the generated random patterns on surfaces of standard iron sheet specimens. It is found that the newly designed random pattern retains relatively good quality up to 20% deformation.
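The authors' criterion is not reproduced in code here, but a generic sketch (hypothetical pattern and threshold) of evaluating a random speckle pattern via its autocorrelation function, the quantity their criterion is built on, might look like:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical binary speckle pattern of the kind used for DIC surface preparation.
pattern = (rng.random((128, 128)) < 0.5).astype(float)
p = pattern - pattern.mean()

# Autocorrelation via the Wiener-Khinchin theorem: inverse FFT of the power spectrum.
spec = np.abs(np.fft.fft2(p)) ** 2
acf = np.fft.ifft2(spec).real
acf /= acf[0, 0]                 # normalise so the zero-lag peak equals 1

# A simple sharpness criterion (illustrative, not the authors'): off-peak
# correlation should be small, i.e. a narrow peak gives unambiguous DIC matching.
off_peak = np.abs(acf[1:, 1:]).max()
print(off_peak < 0.5)
```

A pattern with long-range structure would show large off-peak correlation and would match ambiguously during displacement tracking.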
Truss Optimization for a Manned Nuclear Electric Space Vehicle using Genetic Algorithms
NASA Technical Reports Server (NTRS)
Benford, Andrew; Tinker, Michael L.
2004-01-01
The purpose of this paper is to utilize the genetic algorithm (GA) optimization method for structural design of a nuclear propulsion vehicle. Genetic algorithms provide a guided, random search technique that mirrors biological adaptation. To verify the GA capabilities, other traditional optimization methods were used to generate results for comparison to the GA results, first for simple two-dimensional structures, and then for full-scale three-dimensional truss designs.
Analysis of Variance in the Modern Design of Experiments
NASA Technical Reports Server (NTRS)
Deloach, Richard
2010-01-01
This paper is a tutorial introduction to the analysis of variance (ANOVA), intended as a reference for aerospace researchers who are being introduced to the analytical methods of the Modern Design of Experiments (MDOE), or who may have other opportunities to apply this method. One-way and two-way fixed-effects ANOVA, as well as random effects ANOVA, are illustrated in practical terms that will be familiar to most practicing aerospace researchers.
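To accompany the tutorial, one-way fixed-effects ANOVA can be computed from first principles (illustrative data, not from the paper); the key identity is that the total sum of squares partitions into between-group and within-group parts:

```python
import numpy as np

# Hypothetical measurements from three experimental conditions.
groups = [np.array([3.0, 4.0, 5.0]),
          np.array([6.0, 7.0, 8.0]),
          np.array([9.0, 10.0, 11.0])]

grand = np.concatenate(groups).mean()

# Between-group sum of squares: group means around the grand mean.
ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
# Within-group sum of squares: observations around their own group mean.
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)

df_b = len(groups) - 1
df_w = sum(len(g) for g in groups) - len(groups)
F = (ss_between / df_b) / (ss_within / df_w)
print(ss_between, ss_within, F)  # 54.0 6.0 27.0
```

Here the group means (4, 7, 10) differ far more than the within-group scatter, so F is large; the random-effects case replaces the fixed group means with draws from a distribution but uses the same decomposition.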
Des Jarlais, Don C.; Lyles, Cynthia; Crepaz, Nicole
2004-01-01
Developing an evidence base for making public health decisions will require using data from evaluation studies with randomized and nonrandomized designs. Assessing individual studies and using studies in quantitative research syntheses require transparent reporting of the study, with sufficient detail and clarity to readily see differences and similarities among studies in the same area. The Consolidated Standards of Reporting Trials (CONSORT) statement provides guidelines for transparent reporting of randomized clinical trials. We present the initial version of the Transparent Reporting of Evaluations with Nonrandomized Designs (TREND) statement. These guidelines emphasize the reporting of theories used and descriptions of intervention and comparison conditions, research design, and methods of adjusting for possible biases in evaluation studies that use nonrandomized designs. PMID:14998794
Scott, JoAnna M; deCamp, Allan; Juraska, Michal; Fay, Michael P; Gilbert, Peter B
2017-04-01
Stepped wedge designs are increasingly commonplace and advantageous for cluster randomized trials when it is both unethical to assign placebo, and it is logistically difficult to allocate an intervention simultaneously to many clusters. We study marginal mean models fit with generalized estimating equations for assessing treatment effectiveness in stepped wedge cluster randomized trials. This approach has advantages over the more commonly used mixed models that (1) the population-average parameters have an important interpretation for public health applications and (2) they avoid untestable assumptions on latent variable distributions and avoid parametric assumptions about error distributions, therefore, providing more robust evidence on treatment effects. However, cluster randomized trials typically have a small number of clusters, rendering the standard generalized estimating equation sandwich variance estimator biased and highly variable and hence yielding incorrect inferences. We study the usual asymptotic generalized estimating equation inferences (i.e., using sandwich variance estimators and asymptotic normality) and four small-sample corrections to generalized estimating equation for stepped wedge cluster randomized trials and for parallel cluster randomized trials as a comparison. We show by simulation that the small-sample corrections provide improvement, with one correction appearing to provide at least nominal coverage even with only 10 clusters per group. These results demonstrate the viability of the marginal mean approach for both stepped wedge and parallel cluster randomized trials. We also study the comparative performance of the corrected methods for stepped wedge and parallel designs, and describe how the methods can accommodate interval censoring of individual failure times and incorporate semiparametric efficient estimators.
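The four corrections the authors study are not reproduced here, but the general shape of a cluster-robust GEE-type analysis (independence working correlation, which reduces to OLS with a sandwich variance) with a simple m/(m-1)-style small-sample inflation can be sketched as follows, on simulated parallel-CRT data with invented parameters:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical parallel CRT: 10 clusters per arm (as in the abstract's scenario),
# 20 subjects per cluster, cluster random effects, true treatment effect = 1.
m, n_per = 10, 20
arms = np.repeat([0, 1], m)
cluster_eff = rng.normal(0, 1, 2 * m)
y = np.concatenate([a + b + rng.normal(0, 1, n_per)
                    for a, b in zip(arms, cluster_eff)])
x = np.repeat(arms, n_per).astype(float)
X = np.column_stack([np.ones_like(x), x])

# With an independence working correlation, the GEE point estimate is OLS.
bread = np.linalg.inv(X.T @ X)
beta = bread @ X.T @ y
resid = y - X @ beta

# Cluster-robust sandwich variance with a crude m/(m-1)-type inflation
# (one illustrative correction; the paper compares several refined ones).
meat = np.zeros((2, 2))
for c in range(2 * m):
    idx = slice(c * n_per, (c + 1) * n_per)
    s = X[idx].T @ resid[idx]
    meat += np.outer(s, s)
n_clust = 2 * m
V = bread @ meat @ bread * n_clust / (n_clust - 1)
se = np.sqrt(V[1, 1])
print(beta[1], se)
```

Without the inflation factor (or a degrees-of-freedom adjustment for the reference distribution), the sandwich SE is biased downward with this few clusters, which is the problem motivating the paper.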
Methods for Synthesizing Findings on Moderation Effects Across Multiple Randomized Trials
Brown, C Hendricks; Sloboda, Zili; Faggiano, Fabrizio; Teasdale, Brent; Keller, Ferdinand; Burkhart, Gregor; Vigna-Taglianti, Federica; Howe, George; Masyn, Katherine; Wang, Wei; Muthén, Bengt; Stephens, Peggy; Grey, Scott; Perrino, Tatiana
2011-01-01
This paper presents new methods for synthesizing results from subgroup and moderation analyses across different randomized trials. We demonstrate that such a synthesis generally results in additional power to detect significant moderation findings above what one would find in a single trial. Three general methods for conducting synthesis analyses are discussed, with two methods, integrative data analysis, and parallel analyses, sharing a large advantage over traditional methods available in meta-analysis. We present a broad class of analytic models to examine moderation effects across trials that can be used to assess their overall effect and explain sources of heterogeneity, and present ways to disentangle differences across trials due to individual differences, contextual level differences, intervention, and trial design. PMID:21360061
Chen, Xiaoqin; Li, Ying; Zheng, Hui; Hu, Kaming; Zhang, Hongxing; Zhao, Ling; Li, Yan; Liu, Lian; Mang, Lingling; Yu, Shuyuan
2009-07-01
Acupuncture is one of the most commonly used treatments for Bell's palsy in China, and a variety of acupuncture treatment options are used in clinical practice. Because Bell's palsy has three different path-stages (acute stage, resting stage and restoration stage), whether acupuncture is effective in the different path-stages, and which acupuncture treatment is best, are major issues in acupuncture clinical trials for Bell's palsy. In this article, we report the design and protocol of a large-sample multi-center randomized controlled trial of acupuncture for Bell's palsy. There are five acupuncture groups: four staged according to path-stage and one not. In total, 900 patients with Bell's palsy are enrolled in this study. These patients are randomly assigned to one of four treatment groups staged by path-stage, i.e. 1) staging acupuncture group, 2) staging acupuncture and moxibustion group, 3) staging electro-acupuncture group, 4) staging acupuncture along yangming musculature group, or to a non-staging acupuncture control group. The outcome measurements in this trial are comparisons of effect among these five groups in terms of the House-Brackmann scale (Global Score and Regional Score), Facial Disability Index scale, Classification scale of Facial Paralysis, and WHOQOL-BREF scale before randomization (baseline phase) and after randomization. The results of this trial will assess the efficacy of staging acupuncture and moxibustion for treating Bell's palsy, and will help identify the best acupuncture treatment among these five different methods.
Flores, Glenn; Walker, Candy; Lin, Hua; Lee, Michael; Fierro, Marco; Henry, Monica; Massey, Kenneth; Portillo, Alberto
2014-01-01
Background & objectives Six million US children have no health insurance, and substantial racial/ethnic disparities exist. The design, methods, and baseline characteristics are described for Kids’ Health Insurance by Educating Lots of Parents (Kids’ HELP), the first randomized, clinical trial of the effectiveness of Parent Mentors (PMs) in insuring uninsured minority children. Methods & research design Latino and African-American children eligible for but not enrolled in Medicaid/CHIP were randomized to PMs or to a control group receiving traditional Medicaid/CHIP outreach. PMs are experienced parents with ≥1 child covered by Medicaid/CHIP. PMs received two days of training, and provide intervention families with information on Medicaid/CHIP eligibility, assistance with application submission, and help maintaining coverage. Primary outcomes include obtaining health insurance, the time interval to obtain coverage, and parental satisfaction. A blinded assessor contacts subjects monthly for one year to monitor outcomes. Results Of 49,361 candidates screened, 329 fulfilled eligibility criteria and were randomized. The mean age is seven years for children and 32 years for caregivers; 2/3 are Latino, 1/3 are African-American, and the mean annual family income is $21,857. Half of caregivers were unaware that their uninsured child is Medicaid/CHIP eligible, and 95% of uninsured children had prior insurance. Fifteen PMs completed two-day training sessions. All PMs are female and minority, 60% are unemployed, and the mean annual family income is $20,913. Post-PM-training, overall knowledge/skills test scores significantly increased, and 100% reported being very satisfied/satisfied with the training. Conclusions Kids’ HELP successfully reached target populations, met participant enrollment goals, and recruited and trained PMs. PMID:25476583
ERIC Educational Resources Information Center
McGinnis, Kathleen A.; Schulz, Richard; Stone, Roslyn A.; Klinger, Julie; Mercurio, Rocco
2006-01-01
Purpose: We assess the effects of racial or ethnic concordance between caregivers and interventionists on caregiver attrition, change in depression, and change in burden in a multisite randomized clinical trial. Design and Methods: Family caregivers of patients with Alzheimer's disease were randomized to intervention or control groups at six sites…
ERIC Educational Resources Information Center
Stern, Susan B.; Walsh, Margaret; Mercado, Micaela; Levene, Kathryn; Pepler, Debra J.; Carr, Ashley; Heppell, Allison; Lowe, Erin
2015-01-01
Objective: This study examines the effect of an ecological and contextually responsive approach, during initial intake call, on engagement for multistressed families seeking child mental health services in an urban setting. Methods: Using a randomized design, parents were allocated to phone Intake As Usual (IAU) or Enhanced Engagement Phone Intake…
ERIC Educational Resources Information Center
Rivkin, Anna; Alexander, Robert C.; Knighton, Jennifer; Hutson, Pete H.; Wang, Xiaojing J.; Snavely, Duane B.; Rosah, Thomas; Watt, Alan P.; Reimherr, Fred W.; Adler, Lenard A.
2012-01-01
Objective: Preclinical models, receptor localization, and genetic linkage data support the role of D4 receptors in the etiology of ADHD. This proof-of-concept study was designed to evaluate MK-0929, a selective D4 receptor antagonist as treatment for adult ADHD. Method: A randomized, double-blind, placebo-controlled, crossover study was conducted…
ERIC Educational Resources Information Center
Possel, Patrick; Baldus, Christiane; Horn, Andrea B.; Groen, Gunter; Hautzinger, Martin
2005-01-01
Background: Depressive disorders in adolescents are a widespread and increasing problem. Prevention seems a promising and feasible approach. Methods: We designed a cognitive-behavioral school-based universal primary prevention program and followed 347 eighth-grade students participating in a randomized controlled trial for three months. Results:…
ERIC Educational Resources Information Center
Tonge, Bruce; Brereton, Avril; Kiomall, Melissa; MacKinnon, Andrew; King, Neville; Rinehart, Nicole
2006-01-01
Objective: To determine the impact of a parent education and behavior management intervention (PEBM) on the mental health and adjustment of parents with preschool children with autism. Method: A randomized, group-comparison design involving a parent education and counseling intervention to control for nonspecific therapist effects and a control…
ERIC Educational Resources Information Center
Lane, Aoife; Murphy, Niamh; Bauman, Adrian; Chey, Tien
2010-01-01
Objective: To assess the impact of a community based, low-contact intervention on the physical activity habits of insufficiently active women. Design: Randomized controlled trial. Participants: Inactive Irish women. Method: A population sample of women participating in a mass 10 km event were followed up at 2 and 6 months, and those who had…
Sequential causal inference: Application to randomized trials of adaptive treatment strategies
Dawson, Ree; Lavori, Philip W.
2009-01-01
Clinical trials that randomize subjects to decision algorithms, which adapt treatments over time according to individual response, have gained considerable interest as investigators seek designs that directly inform clinical decision making. We consider designs in which subjects are randomized sequentially at decision points, among adaptive treatment options under evaluation. We present a sequential method to estimate the comparative effects of the randomized adaptive treatments, which are formalized as adaptive treatment strategies. Our causal estimators are derived using Bayesian predictive inference. We use analytical and empirical calculations to compare the predictive estimators to (i) the ‘standard’ approach that allocates the sequentially obtained data to separate strategy-specific groups as would arise from randomizing subjects at baseline; (ii) the semi-parametric approach of marginal mean models that, under appropriate experimental conditions, provides the same sequential estimator of causal differences as the proposed approach. Simulation studies demonstrate that sequential causal inference offers substantial efficiency gains over the standard approach to comparing treatments, because the predictive estimators can take advantage of the monotone structure of shared data among adaptive strategies. We further demonstrate that the semi-parametric asymptotic variances, which are marginal ‘one-step’ estimators, may exhibit significant bias, in contrast to the predictive variances. We show that the conditions under which the sequential method is attractive relative to the other two approaches are those most likely to occur in real studies. PMID:17914714
The Performance of Methods to Test Upper-Level Mediation in the Presence of Nonnormal Data
ERIC Educational Resources Information Center
Pituch, Keenan A.; Stapleton, Laura M.
2008-01-01
A Monte Carlo study compared the statistical performance of standard and robust multilevel mediation analysis methods to test indirect effects for a cluster randomized experimental design under various departures from normality. The performance of these methods was examined for an upper-level mediation process, where the indirect effect is a fixed…
Perspective: Randomized Controlled Trials Are Not a Panacea for Diet-Related Research12
Hébert, James R; Frongillo, Edward A; Adams, Swann A; Turner-McGrievy, Gabrielle M; Hurley, Thomas G; Miller, Donald R; Ockene, Ira S
2016-01-01
Research into the role of diet in health faces a number of methodologic challenges in the choice of study design, measurement methods, and analytic options. Heavier reliance on randomized controlled trial (RCT) designs is suggested as a way to solve these challenges. We present and discuss 7 inherent and practical considerations with special relevance to RCTs designed to study diet: 1) the need for narrow focus; 2) the choice of subjects and exposures; 3) blinding of the intervention; 4) perceived asymmetry of treatment in relation to need; 5) temporal relations between dietary exposures and putative outcomes; 6) strict adherence to the intervention protocol, despite potential clinical counter-indications; and 7) the need to maintain methodologic rigor, including measuring diet carefully and frequently. Alternatives, including observational studies and adaptive intervention designs, are presented and discussed. Given high noise-to-signal ratios interjected by using inaccurate assessment methods in studies with weak or inappropriate study designs (including RCTs), it is conceivable and indeed likely that effects of diet are underestimated. No matter which designs are used, studies will require continued improvement in the assessment of dietary intake. As technology continues to improve, there is potential for enhanced accuracy and reduced user burden of dietary assessments that are applicable to a wide variety of study designs, including RCTs. PMID:27184269
Quissell, David O.; Bryant, Lucinda L.; Braun, Patricia A.; Cudeii, Diana; Johs, Nikolas; Smith, Vongphone L.; George, Carmen; Henderson, William G.; Albino, Judith
2014-01-01
Navajo Nation children have the greatest prevalence of early childhood caries in the United States. This protocol describes an innovative combination of community-based participatory research and clinical trial methods to rigorously test a lay native Community Oral Health Specialist-delivered oral health intervention, with the goal of reducing the progression of disease and improving family knowledge and behaviors. Methods/Design This cluster-randomized trial, designed by researchers at the Center for Native Oral Health Research at the University of Colorado in conjunction with members of the Navajo Nation community, compares outcomes between the manualized 2-year oral health fluoride varnish-oral health promotion intervention and usual care in the community (child-caregiver dyads from 26 Head Start classrooms in each study arm; total of 1016 dyads). Outcome assessment includes annual dental screening and an annual caregiver survey of knowledge, attitudes and behaviors; collection of cost data will support cost-benefit analyses. Discussion The study protocol meets all standards required of randomized clinical trials. Aligned with principles of community-based participatory research, extended interaction between members of the Navajo community and researchers preceded study initiation, and collaboration between project staff and a wide variety of community members informed the study design and implementation. We believe the benefits of adding CBPR methods to those of randomized clinical studies outweigh the barriers and constraints, especially in studies of health disparities and in challenging settings. When done well, this innovative mix of methods will increase the likelihood of valid results that communities can use. PMID:24469238
ERIC Educational Resources Information Center
Hirumi, Atsusi; Kleinsmith, Andrea; Johnsen, Kyle; Kubovec, Stacey; Eakins, Michael; Bogert, Kenneth; Rivera-Gutierrez, Diego J.; Reyes, Ramsamooj Javier; Lok, Benjamin; Cendan, Juan
2016-01-01
Systematic reviews and meta-analyses of randomized controlled studies conclude that virtual patient simulations are consistently associated with higher learning outcomes compared to other educational methods. However, we cannot assume that students will learn simply from being exposed to the simulations. The instructional features that are…
Regression Discontinuity Design in Gifted and Talented Education Research
ERIC Educational Resources Information Center
Matthews, Michael S.; Peters, Scott J.; Housand, Angela M.
2012-01-01
This Methodological Brief introduces the reader to the regression discontinuity design (RDD), which is a method that when used correctly can yield estimates of research treatment effects that are equivalent to those obtained through randomized control trials and can therefore be used to infer causality. However, RDD does not require the random…
Are Written Instructions Enough? Efficacy of Male Condom Packaging Leaflets among College Students
ERIC Educational Resources Information Center
Lindemann, Dana F.; Harbke, Colin R.
2013-01-01
Objective: To evaluate whether or not written condom use instructions successfully inform correct condom use skills. Design: Between-subjects, two-group design. Setting: Public university located in rural Midwestern region of the United States. Method: Participants were randomly assigned to either a control condition (read physical exercise…
ERIC Educational Resources Information Center
Joice, Sara; Johnston, Marie; Bonetti, Debbie; Morrison, Val; MacWalter, Ron
2012-01-01
Objective: To report stroke survivors' experiences and perceived usefulness of an effective self-help workbook-based intervention. Design: A cross-sectional study involving the intervention group of an earlier randomized controlled trial. Setting: At the participants' homes approximately seven weeks post-hospital discharge. Method: Following the…
The look AHEAD trial: bone loss at four-year follow-up in type 2 diabetes
USDA-ARS?s Scientific Manuscript database
OBJECTIVE: To determine whether an intensive lifestyle intervention (ILI) designed to sustain weight loss and improve physical fitness in overweight or obese persons with type 2 diabetes was associated with bone loss after 4 years of follow-up. RESEARCH DESIGN AND METHODS: This randomized controlled...
A More Powerful Test in Three-Level Cluster Randomized Designs
ERIC Educational Resources Information Center
Konstantopoulos, Spyros
2011-01-01
Field experiments that involve nested structures frequently assign treatment conditions to entire groups (such as schools). A key aspect of the design of such experiments includes knowledge of the clustering effects that are often expressed via intraclass correlation. This study provides methods for constructing a more powerful test for the…
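For context, the variance inflation that clustering induces in the simpler two-level case (the "design effect" that CRT sample-size methods multiply into an individually randomized sample size) is easy to compute; three-level designs generalize it with a second intraclass correlation. A minimal sketch with hypothetical numbers:

```python
# Standard two-level design effect for equal cluster sizes:
# DEFF = 1 + (m - 1) * rho, where m is cluster size and rho is the
# intraclass correlation. Numbers below are invented for illustration.
def design_effect(cluster_size: int, icc: float) -> float:
    return 1.0 + (cluster_size - 1) * icc

n_individual = 128   # sample size computed assuming individual randomization
deff = design_effect(cluster_size=20, icc=0.05)
n_clustered = n_individual * deff
print(deff, n_clustered)  # DEFF ~ 1.95, so ~250 participants instead of 128
```

Even a modest intraclass correlation nearly doubles the required sample size here, which is why knowledge of clustering effects is central to designing (and powering) such experiments.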
Air data system optimization using a genetic algorithm
NASA Technical Reports Server (NTRS)
Deshpande, Samir M.; Kumar, Renjith R.; Seywald, Hans; Siemers, Paul M., III
1992-01-01
An optimization method for flush-orifice air data system design has been developed using the Genetic Algorithm approach. The optimization of the orifice array minimizes the effect of normally distributed random noise in the pressure readings on the calculation of air data parameters, namely, angle of attack, sideslip angle and freestream dynamic pressure. The optimization method is applied to the design of Pressure Distribution/Air Data System experiment (PD/ADS) proposed for inclusion in the Aeroassist Flight Experiment (AFE). Results obtained by the Genetic Algorithm method are compared to the results obtained by conventional gradient search method.
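A minimal, generic genetic-algorithm loop (not the PD/ADS implementation; the objective and all parameters are invented for illustration) shows the selection/crossover/mutation structure that both GA abstracts above rely on:

```python
import numpy as np

rng = np.random.default_rng(4)

# Stand-in objective: lower is better, optimum at the all-0.5 vector.
# A real air-data application would instead score a candidate orifice layout.
def fitness(pop):
    return ((pop - 0.5) ** 2).sum(axis=1)

pop = rng.random((40, 6))        # 40 candidate designs, 6 design variables
for _ in range(200):
    f = fitness(pop)
    order = np.argsort(f)
    parents = pop[order[:20]]                        # selection: keep best half
    mates = parents[rng.permutation(20)]
    children = (parents + mates) / 2.0               # crossover: blend parents
    children += rng.normal(0, 0.02, children.shape)  # mutation: random jitter
    pop = np.vstack([parents, children])             # elitist replacement

best = pop[np.argmin(fitness(pop))]
print(fitness(pop).min())
```

Because the best designs survive each generation unchanged, the minimum fitness is non-increasing; the "guided, random search" character comes from the stochastic crossover and mutation steps.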
Schaeffer, Christine; Teter, Caroline; Finch, Emily A; Hurt, Courtney; Keeter, Mary Kate; Liss, David T; Rogers, Angela; Sheth, Avani; Ackermann, Ronald
2018-02-01
Transitional care programs have been widely used to reduce readmissions and improve the quality and safety of the handoff process between hospital and outpatient providers. Very little is known about effective transitional care interventions among patients who are uninsured or covered by Medicaid. This paper describes the design and baseline characteristics of a pragmatic randomized comparative effectiveness trial of transitional care. The Northwestern Medical Group-Transitional Care (NMG-TC) care model was developed to address the needs of patients with multiple medical problems that required lifestyle changes and were amenable to office-based management. We present the design, evaluation methods and baseline characteristics of NMG-TC trial patients. Baseline demographic characteristics indicate that our patient population is predominantly male, Medicaid insured and non-white. This study will evaluate two methods for implementing an effective transitional care model in a medically complex and socioeconomically diverse population. Copyright © 2017 Elsevier Inc. All rights reserved.
On the design and analysis of clinical trials with correlated outcomes
Follmann, Dean; Proschan, Michael
2014-01-01
The convention in clinical trials is to regard outcomes as independently distributed, but in some situations they may be correlated. For example, in infectious diseases, correlation may be induced if participants have contact with a common infectious source, or share hygienic tips that prevent infection. This paper discusses the design and analysis of randomized clinical trials that allow arbitrary correlation among all randomized volunteers. This perspective generalizes the traditional perspective of strata, where patients are exchangeable within strata, and independent across strata. For theoretical work, we focus on the test of no treatment effect μ1 − μ0 = 0 when the n dimensional vector of outcomes follows a Gaussian distribution with known n × n covariance matrix Σ, where the half randomized to treatment (placebo) have mean response μ1 (μ0). We show how the new test corresponds to familiar tests in simple situations for independent, exchangeable, paired, and clustered data. We also discuss the design of trials where Σ is known before or during randomization of patients and evaluate randomization schemes based on such knowledge. We provide two complex examples to illustrate the method, one for a study of 23 family clusters with cardiomyopathy, the other where the malaria attack rates vary within households and clusters of households in a Malian village. PMID:25111420
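The known-Σ test described in the abstract reduces to a contrast-based z statistic; a small numerical sketch (data and covariance invented for illustration) shows that with a balanced, sum-to-zero contrast, an exchangeable correlation shared by all subjects cancels out of the contrast, so the test can actually gain precision:

```python
import numpy as np

# z statistic for mu1 - mu0 with known covariance Sigma: the contrast
# variance is c' Sigma c rather than the independence value c' c * sigma^2.
def correlated_z(y, treated, Sigma):
    n1, n0 = treated.sum(), (~treated).sum()
    c = np.where(treated, 1.0 / n1, -1.0 / n0)   # contrast: mean(trt) - mean(ctl)
    return c @ y / np.sqrt(c @ Sigma @ c)

y = np.array([3.0, 2.5, 3.5, 1.0, 1.5, 0.5])    # hypothetical outcomes
treated = np.array([True, True, True, False, False, False])

z_indep = correlated_z(y, treated, np.eye(6))    # usual independence assumption
# Exchangeable correlation 0.5 among all six subjects (e.g., a shared source):
Sigma = 0.5 * np.eye(6) + 0.5 * np.ones((6, 6))
z_corr = correlated_z(y, treated, Sigma)
print(z_indep, z_corr)                           # sqrt(6) and sqrt(12)
```

Because the contrast weights sum to zero, the common component of the covariance contributes nothing to c'Σc, leaving only the idiosyncratic variance; with other correlation structures (e.g., within-arm clustering) the comparison can go the other way.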
Methodological Reporting of Randomized Trials in Five Leading Chinese Nursing Journals
Shi, Chunhu; Tian, Jinhui; Ren, Dan; Wei, Hongli; Zhang, Lihuan; Wang, Quan; Yang, Kehu
2014-01-01
Background Randomized controlled trials (RCTs) are not always well reported, especially in terms of their methodological descriptions. This study aimed to investigate the adherence of methodological reporting complying with CONSORT and explore associated trial level variables in the Chinese nursing care field. Methods In June 2012, we identified RCTs published in five leading Chinese nursing journals and included trials with details of randomized methods. The quality of methodological reporting was measured through the methods section of the CONSORT checklist and the overall CONSORT methodological items score was calculated and expressed as a percentage. Meanwhile, we hypothesized that some general and methodological characteristics were associated with reporting quality and conducted a regression with these data to explore the correlation. The descriptive and regression statistics were calculated via SPSS 13.0. Results In total, 680 RCTs were included. The overall CONSORT methodological items score was 6.34±0.97 (Mean ± SD). No RCT reported descriptions and changes in “trial design,” changes in “outcomes” and “implementation,” or descriptions of the similarity of interventions for “blinding.” Poor reporting was found in detailing the “settings of participants” (13.1%), “type of randomization sequence generation” (1.8%), calculation methods of “sample size” (0.4%), explanation of any interim analyses and stopping guidelines for “sample size” (0.3%), “allocation concealment mechanism” (0.3%), additional analyses in “statistical methods” (2.1%), and targeted subjects and methods of “blinding” (5.9%). More than 50% of trials described randomization sequence generation, the eligibility criteria of “participants,” “interventions,” and definitions of the “outcomes” and “statistical methods.” The regression analysis found that publication year and ITT analysis were weakly associated with CONSORT score. 
Conclusions The completeness of methodological reporting of RCTs in the Chinese nursing care field is poor, especially with regard to the reporting of trial design, changes in outcomes, sample size calculation, allocation concealment, blinding, and statistical methods. PMID:25415382
Randomized controlled trials in dentistry: common pitfalls and how to avoid them.
Fleming, Padhraig S; Lynch, Christopher D; Pandis, Nikolaos
2014-08-01
Clinical trials are used to appraise the effectiveness of clinical interventions throughout medicine and dentistry. Randomized controlled trials (RCTs) are established as the optimal primary design and are published with increasing frequency within the biomedical sciences, including dentistry. This review outlines common pitfalls associated with the conduct of randomized controlled trials in dentistry. Common failings in RCT design leading to various types of bias, including selection, performance, detection, and attrition bias, are discussed in this review. Moreover, methods of minimizing and eliminating bias are presented to ensure that maximal benefit is derived from RCTs within dentistry. Well-designed RCTs have both upstream and downstream uses, acting as a template for development and populating systematic reviews to permit more precise estimates of treatment efficacy and effectiveness. However, there is increasing awareness of waste in clinical research, whereby resource-intensive studies fail to provide a commensurate level of scientific evidence. Waste may stem either from inappropriate design or from inadequate reporting of RCTs; the importance of robust conduct of RCTs within dentistry is clear. Optimal reporting of randomized controlled trials within dentistry is necessary to ensure that trials are reliable and valid. Common shortcomings leading to important forms of bias are discussed and approaches to minimizing these issues are outlined. Copyright © 2014 Elsevier Ltd. All rights reserved.
Multiobjective optimization in structural design with uncertain parameters and stochastic processes
NASA Technical Reports Server (NTRS)
Rao, S. S.
1984-01-01
The application of multiobjective optimization techniques to structural design problems involving uncertain parameters and random processes is studied. The design of a cantilever beam with a tip mass subjected to a stochastic base excitation is considered for illustration. Several of the problem parameters are assumed to be random variables and the structural mass, fatigue damage, and negative of natural frequency of vibration are considered for minimization. The solution of this three-criteria design problem is found by using global criterion, utility function, game theory, goal programming, goal attainment, bounded objective function, and lexicographic methods. It is observed that the game theory approach is superior in finding a better optimum solution, assuming the proper balance of the various objective functions. The procedures used in the present investigation are expected to be useful in the design of general dynamic systems involving uncertain parameters, stochastic process, and multiple objectives.
Statistical process control in nursing research.
Polit, Denise F; Chaboyer, Wendy
2012-02-01
In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation) and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.
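The common/special cause distinction the abstract describes is typically operationalized with Shewhart-style control limits. A minimal sketch of an individuals chart, using the conventional moving-range estimate of sigma (an illustrative sketch of the standard SPC technique, not code from the article):

```python
def control_limits(values):
    """Individuals (X) chart: lower limit, center line, upper limit.

    Sigma is estimated from the mean moving range divided by 1.128,
    the usual convention for individuals charts.
    """
    center = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    sigma = (sum(moving_ranges) / len(moving_ranges)) / 1.128
    return center - 3 * sigma, center, center + 3 * sigma

def special_cause_points(values):
    """Indices of points beyond the 3-sigma control limits (Rule 1 signals)."""
    lcl, _, ucl = control_limits(values)
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]
```

In the study design described above, a run of pre-intervention data would define the limits, and post-intervention points falling outside them would suggest special cause variation attributable to the innovation.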
Bolland, Mark J.; Grey, Andrew; Gamble, Greg D.; Reid, Ian R.
2015-01-01
Background Observational studies (OS) and randomized controlled trials (RCTs) often report discordant results. In the Women’s Health Initiative Calcium and Vitamin D (WHI CaD) RCT, women were randomly assigned to CaD or placebo, but were permitted to use personal calcium and vitamin D supplements, creating a unique opportunity to compare results from randomized and observational analyses within the same study. Methods WHI CaD was a 7-year RCT of 1g calcium/400IU vitamin D daily in 36,282 post-menopausal women. We assessed the effects of CaD on cardiovascular events, death, cancer and fracture in a randomized design, comparing CaD with placebo in the 43% of women not using personal calcium or vitamin D supplements, and in an observational design, comparing women in the placebo group (44%) using personal calcium and vitamin D supplements with non-users. Incidence was assessed using Cox proportional hazards models, and results from the two study designs were deemed concordant if the absolute difference in hazard ratios was ≤0.15. We also compared results from WHI CaD to those from the WHI Observational Study (WHI OS), which used similar methodology for analyses and recruited from the same population. Results In WHI CaD, for myocardial infarction and stroke, results of unadjusted and 6/8 covariate-controlled observational analyses (age-adjusted, multivariate-adjusted, propensity-adjusted, propensity-matched) were not concordant with the randomized design results. For death, hip and total fracture, colorectal and total cancer, unadjusted and covariate-controlled observational results were concordant with randomized results. For breast cancer, unadjusted and age-adjusted observational results were concordant with randomized results, but only 1/3 other covariate-controlled observational results were concordant with randomized results. Multivariate-adjusted results from WHI OS were concordant with randomized WHI CaD results for only 4/8 endpoints.
Conclusions Results of randomized analyses in WHI CaD were concordant with observational analyses for 5/8 endpoints in WHI CaD and 4/8 endpoints in WHI OS. PMID:26440516
Sample size calculations for the design of cluster randomized trials: A summary of methodology.
Gao, Fei; Earnest, Arul; Matchar, David B; Campbell, Michael J; Machin, David
2015-05-01
Cluster randomized trial designs are growing in popularity in, for example, cardiovascular medicine research and other clinical areas, and this growth has stimulated parallel statistical developments concerned with the design and analysis of these trials. Nevertheless, reviews suggest that design issues associated with cluster randomized trials are often poorly appreciated and there remain inadequacies in, for example, describing how the trial size is determined and how the associated results are presented. In this paper, our aim is to provide pragmatic guidance for researchers on the methods of calculating sample sizes. We focus attention on designs with the primary purpose of comparing two interventions with respect to continuous, binary, ordered categorical, incidence rate and time-to-event outcome variables. Issues of aggregate and non-aggregate cluster trials, adjustment for variation in cluster size and the effect size are detailed. The problem of establishing the anticipated magnitude of between- and within-cluster variation, to enable planning values of the intra-cluster correlation coefficient and the coefficient of variation, is also described. Illustrative examples of calculations of trial sizes for each endpoint type are included. Copyright © 2015 Elsevier Inc. All rights reserved.
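The simplest of the calculations summarized here inflates an individually randomized sample size by the design effect 1 + (m - 1)ρ, where m is the average cluster size and ρ the intra-cluster correlation coefficient; a common adjustment for unequal cluster sizes replaces m with m(1 + cv²), where cv is the coefficient of variation of cluster sizes. A minimal illustrative sketch of these standard formulae (not the paper's own code):

```python
import math

def design_effect(m, icc, cv=0.0):
    """Design effect for a cluster randomized trial.

    m   : average cluster size
    icc : intra-cluster correlation coefficient (rho)
    cv  : coefficient of variation of cluster sizes
          (0 for equal-sized clusters).
    """
    return 1 + (m * (1 + cv ** 2) - 1) * icc

def clusters_per_arm(n_individual, m, icc, cv=0.0):
    """Clusters needed per arm, given the per-arm sample size
    n_individual computed under individual randomization."""
    n_clustered = n_individual * design_effect(m, icc, cv)
    return math.ceil(n_clustered / m)
```

For example, with clusters of 20 and an ICC of 0.05 the design effect is 1.95, so a trial needing 131 individuals per arm under individual randomization would need about 13 clusters of 20 per arm.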
Secure Minutiae-Based Fingerprint Templates Using Random Triangle Hashing
NASA Astrophysics Data System (ADS)
Jin, Zhe; Jin Teoh, Andrew Beng; Ong, Thian Song; Tee, Connie
Due to privacy concerns over the widespread use of biometric authentication systems, biometric template protection has gained great attention in biometric research recently. It is a challenging task to design a biometric template protection scheme which is anonymous, revocable and noninvertible while maintaining acceptable performance. Many methods have been proposed to resolve this problem, and cancelable biometrics is one of them. In this paper, we propose a scheme coined Random Triangle Hashing which follows the concept of cancelable biometrics in the fingerprint domain. In this method, re-alignment of fingerprints is not required, as all the minutiae are translated into a pre-defined two-dimensional space based on a reference minutia. After that, the proposed Random Triangle hashing method is used to enforce the one-way property (non-invertibility) of the biometric template. The proposed method is resistant to minor translation error and rotation distortion. Finally, the hash vectors are converted into bit-strings to be stored in the database. The proposed method is evaluated using the public database FVC2004 DB1. An EER of less than 1% is achieved by using the proposed method.
NeCamp, Timothy; Kilbourne, Amy; Almirall, Daniel
2017-08-01
Cluster-level dynamic treatment regimens can be used to guide sequential treatment decision-making at the cluster level in order to improve outcomes at the individual or patient-level. In a cluster-level dynamic treatment regimen, the treatment is potentially adapted and re-adapted over time based on changes in the cluster that could be impacted by prior intervention, including aggregate measures of the individuals or patients that compose it. Cluster-randomized sequential multiple assignment randomized trials can be used to answer multiple open questions preventing scientists from developing high-quality cluster-level dynamic treatment regimens. In a cluster-randomized sequential multiple assignment randomized trial, sequential randomizations occur at the cluster level and outcomes are observed at the individual level. This manuscript makes two contributions to the design and analysis of cluster-randomized sequential multiple assignment randomized trials. First, a weighted least squares regression approach is proposed for comparing the mean of a patient-level outcome between the cluster-level dynamic treatment regimens embedded in a sequential multiple assignment randomized trial. The regression approach facilitates the use of baseline covariates which is often critical in the analysis of cluster-level trials. Second, sample size calculators are derived for two common cluster-randomized sequential multiple assignment randomized trial designs for use when the primary aim is a between-dynamic treatment regimen comparison of the mean of a continuous patient-level outcome. The methods are motivated by the Adaptive Implementation of Effective Programs Trial which is, to our knowledge, the first-ever cluster-randomized sequential multiple assignment randomized trial in psychiatry.
Osteoporosis therapies: evidence from health-care databases and observational population studies.
Silverman, Stuart L
2010-11-01
Osteoporosis is a well-recognized disease with severe consequences if left untreated. Randomized controlled trials are the most rigorous method for determining the efficacy and safety of therapies. Nevertheless, randomized controlled trials underrepresent the real-world patient population and are costly in both time and money. Modern technology has enabled researchers to use information gathered from large health-care or medical-claims databases to assess the practical utilization of available therapies in appropriate patients. Observational database studies lack randomization but, if carefully designed and successfully completed, can provide valuable information that complements results obtained from randomized controlled trials and extends our knowledge to real-world clinical patients. Randomized controlled trials comparing fracture outcomes among osteoporosis therapies are difficult to perform. In this regard, large observational database studies could be useful in identifying clinically important differences among therapeutic options. Database studies can also provide important information with regard to osteoporosis prevalence, health economics, and compliance and persistence with treatment. This article describes the strengths and limitations of both randomized controlled trials and observational database studies, discusses considerations for observational study design, and reviews a wealth of information generated by database studies in the field of osteoporosis.
Fretheim, Atle; Zhang, Fang; Ross-Degnan, Dennis; Oxman, Andrew D; Cheyne, Helen; Foy, Robbie; Goodacre, Steve; Herrin, Jeph; Kerse, Ngaire; McKinlay, R James; Wright, Adam; Soumerai, Stephen B
2015-03-01
There is often substantial uncertainty about the impacts of health system and policy interventions. Despite that, randomized controlled trials (RCTs) are uncommon in this field, partly because experiments can be difficult to carry out. An alternative method for impact evaluation is the interrupted time-series (ITS) design. Little is known, however, about how results from the two methods compare. Our aim was to explore whether ITS studies yield results that differ from those of randomized trials. We conducted single-arm ITS analyses (segmented regression) based on data from the intervention arm of cluster randomized trials (C-RCTs), that is, discarding control arm data. Secondarily, we included the control group data in the analyses, by subtracting control group data points from intervention group data points, thereby constructing a time series representing the difference between the intervention and control groups. We compared the results from the single-arm and controlled ITS analyses with results based on conventional aggregated analyses of trial data. The findings were largely concordant, yielding effect estimates with overlapping 95% confidence intervals (CI) across different analytical methods. However, our analyses revealed the importance of a concurrent control group and of taking baseline and follow-up trends into account in the analysis of C-RCTs. The ITS design is valuable for evaluation of health systems interventions, both when RCTs are not feasible and in the analysis and interpretation of data from C-RCTs. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
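The single-arm segmented regression described above fits a level and trend before the intervention plus a change in both after it. A minimal sketch using ordinary least squares (an illustrative sketch of the standard ITS model, not the authors' code; autocorrelation adjustments are omitted):

```python
import numpy as np

def segmented_regression(y, intervention_start):
    """Fit y_t = b0 + b1*t + b2*post_t + b3*time_since_t by OLS.

    b2 estimates the immediate level change at the intervention;
    b3 estimates the change in slope relative to the pre-trend.
    """
    t = np.arange(len(y), dtype=float)
    post = (t >= intervention_start).astype(float)
    time_since = post * (t - intervention_start)
    X = np.column_stack([np.ones_like(t), t, post, time_since])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return beta  # [b0, b1, b2, b3]
```

The controlled variant the authors describe applies the same fit to a series formed by subtracting control-group data points from intervention-group data points.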
Stochastic Analysis and Design of Heterogeneous Microstructural Materials System
NASA Astrophysics Data System (ADS)
Xu, Hongyi
Advanced materials system refers to new materials that are comprised of multiple traditional constituents but complex microstructure morphologies, which lead to superior properties over the conventional materials. To accelerate the development of new advanced materials systems, the objective of this dissertation is to develop a computational design framework and the associated techniques for design automation of microstructure materials systems, with an emphasis on addressing the uncertainties associated with the heterogeneity of microstructural materials. Five key research tasks are identified: design representation, design evaluation, design synthesis, material informatics and uncertainty quantification. Design representation of microstructure includes statistical characterization and stochastic reconstruction. This dissertation develops a new descriptor-based methodology, which characterizes 2D microstructures using descriptors of composition, dispersion and geometry. Statistics of 3D descriptors are predicted based on 2D information to enable 2D-to-3D reconstruction. An efficient sequential reconstruction algorithm is developed to reconstruct statistically equivalent random 3D digital microstructures. In design evaluation, a stochastic decomposition and reassembly strategy is developed to deal with the high computational costs and uncertainties induced by material heterogeneity. The properties of Representative Volume Elements (RVEs) are predicted by stochastically reassembling Statistical Volume Elements (SVEs) with stochastic properties into a coarse representation of the RVE. In design synthesis, a new descriptor-based design framework is developed, which integrates computational methods of microstructure characterization and reconstruction, sensitivity analysis, Design of Experiments (DOE), metamodeling and optimization to enable parametric optimization of the microstructure for achieving the desired material properties.
Material informatics is studied to efficiently reduce the dimension of microstructure design space. This dissertation develops a machine learning-based methodology to identify the key microstructure descriptors that highly impact properties of interest. In uncertainty quantification, a comparative study on data-driven random process models is conducted to provide guidance for choosing the most accurate model in statistical uncertainty quantification. Two new goodness-of-fit metrics are developed to provide quantitative measurements of random process models' accuracy. The benefits of the proposed methods are demonstrated by the example of designing the microstructure of polymer nanocomposites. This dissertation provides material-generic, intelligent modeling/design methodologies and techniques to accelerate the process of analyzing and designing new microstructural materials system.
Jian Yang; Hong S. He; Brian R. Sturtevant; Brian R. Miranda; Eric J. Gustafson
2008-01-01
We compared four fire spread simulation methods (completely random, dynamic percolation, size-based minimum travel time algorithm, and duration-based minimum travel time algorithm) and two fire occurrence simulation methods (Poisson fire frequency model and hierarchical fire frequency model) using a two-way factorial design. We examined these treatment effects on...
Reducing random measurement error in assessing postural load on the back in epidemiologic surveys.
Burdorf, A
1995-02-01
The goal of this study was to design strategies to assess postural load on the back in occupational epidemiology by taking into account the reliability of measurement methods and the variability of exposure among the workers under study. Intermethod reliability studies were evaluated to estimate the systematic bias (accuracy) and random measurement error (precision) of various methods to assess postural load on the back. Intramethod reliability studies were reviewed to estimate random variability of back load over time. Intermethod surveys have shown that questionnaires have a moderate reliability for gross activities such as sitting, whereas duration of trunk flexion and rotation should be assessed by observation methods or inclinometers. Intramethod surveys indicate that exposure variability can markedly affect the reliability of estimates of back load if the estimates are based upon a single measurement over a certain time period. Equations have been presented to evaluate various study designs according to the reliability of the measurement method, the optimum allocation of the number of repeated measurements per subject, and the number of subjects in the study. Prior to a large epidemiologic study, an exposure-oriented survey should be conducted to evaluate the performance of measurement instruments and to estimate sources of variability for back load. The strategy for assessing back load can be optimized by balancing the number of workers under study and the number of repeated measurements per worker.
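The allocation problem raised above, how many repeated measurements per worker are needed for a reliable estimate of back load, is governed by a standard psychometric relation: the reliability of the mean of k measurements follows the Spearman-Brown formula. A sketch of this standard relation (a generic illustration, not the specific equations presented in the paper):

```python
import math

def reliability_of_mean(r_single, k):
    """Spearman-Brown: reliability of the mean of k repeated
    measurements, given the reliability r_single of one measurement."""
    return k * r_single / (1 + (k - 1) * r_single)

def measurements_needed(r_single, r_target):
    """Smallest k at which the averaged measurement reaches r_target."""
    k = r_target * (1 - r_single) / (r_single * (1 - r_target))
    return math.ceil(k)
```

For instance, a method with single-measurement reliability 0.5 reaches reliability 0.75 when three measurements per worker are averaged; the study-design trade-off is then between more workers and more repeats per worker.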
An efficient genetic algorithm for maximum coverage deployment in wireless sensor networks.
Yoon, Yourim; Kim, Yong-Hyuk
2013-10-01
Sensor networks have a lot of applications such as battlefield surveillance, environmental monitoring, and industrial diagnostics. Coverage is one of the most important performance metrics for sensor networks since it reflects how well a sensor field is monitored. In this paper, we introduce the maximum coverage deployment problem in wireless sensor networks and analyze the properties of the problem and its solution space. Random deployment is the simplest way to deploy sensor nodes but may cause unbalanced deployment; therefore, we need a more intelligent way to deploy sensors. We found that the phenotype space of the problem is a quotient space of the genotype space in a mathematical view. Based on this property, we propose an efficient genetic algorithm using a novel normalization method. A Monte Carlo method is adopted to design an efficient evaluation function, and its computation time is decreased without loss of solution quality using a method that starts from a small number of random samples and gradually increases the number for subsequent generations. The proposed genetic algorithms could be further improved by combining with a well-designed local search. The performance of the proposed genetic algorithm is shown by a comparative experimental study. When compared with random deployment and existing methods, our genetic algorithm was not only about twice as fast but also showed significant improvement in solution quality.
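The Monte Carlo evaluation function described above can be sketched as follows: coverage is approximated by the fraction of uniformly sampled field points that lie within the sensing radius of at least one node (an illustrative sketch, not the paper's implementation; the incremental-sampling refinement is left to the caller, who can simply raise `n_samples` over generations):

```python
import random

def estimate_coverage(sensors, radius, width, height,
                      n_samples=10000, seed=0):
    """Monte Carlo estimate of the fraction of a width x height field
    covered by sensing disks of the given radius centered at `sensors`
    (a list of (x, y) tuples)."""
    rng = random.Random(seed)
    r2 = radius * radius
    covered = 0
    for _ in range(n_samples):
        x, y = rng.uniform(0, width), rng.uniform(0, height)
        if any((x - sx) ** 2 + (y - sy) ** 2 <= r2 for sx, sy in sensors):
            covered += 1
    return covered / n_samples
```

In a genetic algorithm this estimate serves as the fitness of a candidate deployment; using few samples in early generations and more later trades early precision for speed, as the authors propose.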
Knott, V; Rees, D J; Cheng, Z; Brownlee, G G
1988-01-01
Sets of overlapping cosmid clones generated by random sampling and fingerprinting methods complement data at pyrB (96.5') and oriC (84') in the published physical map of E. coli. A new cloning strategy using sheared DNA, and a low copy, inducible cosmid vector were used in order to reduce bias in libraries, in conjunction with micro-methods for preparing cosmid DNA from a large number of clones. Our results are relevant to the design of the best approach to the physical mapping of large genomes. PMID:2834694
ERIC Educational Resources Information Center
Robison, Julie; Curry, Leslie; Gruman, Cynthia; Porter, Martha; Henderson, Charles R., Jr.; Pillemer, Karl
2007-01-01
Purpose: This article reports the results of a randomized, controlled evaluation of Partners in Caregiving in a Special Care Environment, an intervention designed to improve communication and cooperation between staff and families of residents in nursing home dementia programs. Design and Methods: Participants included 388 family members and 384…
The Impact of TCARE[R] on Service Recommendation, Use, and Caregiver Well-Being
ERIC Educational Resources Information Center
Kwak, Jung; Montgomery, Rhonda J. V.; Kosloski, Karl; Lang, Josh
2011-01-01
Purpose of the Study: Findings are reported from a study that examined the effects of the Tailored Caregiver Assessment and Referral (TCARE[R]) protocol, a care management process designed to help family caregivers, on care planning and caregiver outcomes. Design and Methods: A longitudinal, randomized controlled trial was conducted with 97…
Nursing Home Quality, Cost, Staffing, and Staff Mix
ERIC Educational Resources Information Center
Rantz, Marilyn J.; Hicks, Lanis; Grando, Victoria; Petroski, Gregory F.; Madsen, Richard W.; Mehr, David R.; Conn, Vicki; Zwygart-Staffacher, Mary; Scott, Jill; Flesner, Marcia; Bostick, Jane; Porter, Rose; Maas, Meridean
2004-01-01
Purpose: The purpose of this study was to describe the processes of care, organizational attributes, cost of care, staffing level, and staff mix in a sample of Missouri homes with good, average, and poor resident outcomes. Design and Methods: A three-group exploratory study design was used, with 92 nursing homes randomly selected from all nursing…
Liu, Ying; ZENG, Donglin; WANG, Yuanjia
2014-01-01
Summary Dynamic treatment regimens (DTRs) are sequential decision rules tailored at each point where a clinical decision is made, based on each patient’s time-varying characteristics and intermediate outcomes observed at earlier points in time. The complexity, patient heterogeneity, and chronicity of mental disorders call for learning optimal DTRs to dynamically adapt treatment to an individual’s response over time. The Sequential Multiple Assignment Randomized Trial (SMART) design allows for estimating causal effects of DTRs. Modern statistical tools have been developed to optimize DTRs based on personalized variables and intermediate outcomes using rich data collected from SMARTs; these statistical methods can also be used to recommend tailoring variables for designing future SMART studies. This paper introduces DTRs and SMARTs using two examples in mental health studies, discusses two machine learning methods for estimating optimal DTRs from SMART data, and demonstrates the performance of the statistical methods using simulated data. PMID:25642116
NASA Astrophysics Data System (ADS)
Tong, Xiaojun; Cui, Minggen; Wang, Zhu
2009-07-01
The design of a new compound two-dimensional chaotic function is presented by exploiting two one-dimensional chaotic functions that switch randomly; the design is used as a chaotic sequence generator, which is proved chaotic by Devaney's definition. The properties of compound chaotic functions are also proved rigorously. In order to improve robustness against differential cryptanalysis and produce an avalanche effect, a new feedback image encryption scheme is proposed using the new compound chaos, selecting one of the two one-dimensional chaotic functions at random, and a new permutation and substitution method for image pixels is designed in detail, with the rows and columns of the pixel array controlled randomly by the compound chaos. The results from entropy analysis, difference analysis, statistical analysis, sequence randomness analysis, and cipher sensitivity analysis depending on key and plaintext have proven that the compound chaotic sequence cipher can resist cryptanalytic, statistical and brute-force attacks; in particular, it accelerates encryption speed and achieves a higher level of security. By means of dynamical compound chaos and perturbation technology, the paper addresses the low computational precision of one-dimensional chaotic functions.
Comparison of Structural Optimization Techniques for a Nuclear Electric Space Vehicle
NASA Technical Reports Server (NTRS)
Benford, Andrew
2003-01-01
The purpose of this paper is to utilize the optimization method of genetic algorithms (GA) for truss design on a nuclear propulsion vehicle. Genetic algorithms are a guided, random search that mirrors Darwin's theory of natural selection and survival of the fittest. To verify the GA's capabilities, other traditional optimization methods were used to compare the results obtained by the GA, first on simple 2-D structures and eventually on full-scale 3-D truss designs.
NASA Technical Reports Server (NTRS)
Crowe, D. R.; Henricks, W.
1983-01-01
The combined load statistics are developed by taking the acoustically induced load to be a random population, assumed to be stationary. Each element of this ensemble of acoustically induced loads is assumed to have the same power spectral density (PSD), obtained previously from a random response analysis employing the given acoustic field in the STS cargo bay as a stationary random excitation. The mechanically induced load is treated as either (1) a known deterministic transient, or (2) a nonstationary random variable of known first and second statistical moments which vary with time. A method is then shown for determining the probability that the combined load would, at any time, have a value equal to or less than a certain level. Having obtained a statistical representation of how the acoustic and mechanical loads are expected to combine, an analytical approximation for defining design levels for these loads is presented using the First Passage failure criterion.
Methodological Overview of an African American Couple-Based HIV/STD Prevention Trial
2010-01-01
Objective To provide an overview of the NIMH Multisite HIV/STD Prevention Trial for African American Couples conducted in four urban areas: Atlanta, Los Angeles, New York, and Philadelphia. The rationale, study design methods, proposed data analyses, and study management are described. Design This is a two-arm randomized trial, implementing a modified randomized block design, to evaluate the efficacy of a couples-based intervention designed for HIV-serodiscordant African American couples. Methods The study phases consisted of formative work, pilot studies, and a randomized clinical trial. The sample is 535 HIV-serodiscordant heterosexual African American couples. There are two theoretically derived behavioral interventions with eight group and individual sessions: the Eban HIV/STD Risk Reduction Intervention (treatment) versus the Eban Health Promotion Intervention (control). The treatment intervention was couples-based and focused on HIV/STD risk reduction, while the control was individual-based and focused on health promotion. The two study conditions were structurally similar in length and types of activities. At baseline, participants completed an Audio Computer-Assisted Self-Interview (ACASI) as well as an interviewer-administered questionnaire, and provided biological specimens to assess for STDs. Similar follow-up assessments were conducted immediately after the intervention, at 6 months, and at 12 months. Results The Trial results will be analyzed across the four sites by randomization assignment. Generalized estimating equations (GEE) and mixed effects modeling (MEM) are planned to test: (1) the effects of the intervention on STD incidence and condom use as well as on mediator variables of these outcomes, and (2) whether the effects of the intervention differ depending on key moderator variables (e.g., gender of the HIV-seropositive partners, length of relationship, psychological distress, sexual abuse history, and substance abuse history).
Conclusions The lessons learned from the design and conduct of this clinical trial provide guidelines for future couples-based clinical trials in HIV/STD risk reduction and can be generalized to other couples-based behavioral interventions. PMID:18724188
Divergence of Scientific Heuristic Method and Direct Algebraic Instruction
ERIC Educational Resources Information Center
Calucag, Lina S.
2016-01-01
This is an experimental study that made use of a non-randomized experimental and control group, pretest-posttest design. The experimental and control groups were two separate intact classes in Algebra. For a period of twelve sessions, the experimental group was subjected to the scientific heuristic method, but the control group instead was given…
Convenience Samples and Caregiving Research: How Generalizable Are the Findings?
ERIC Educational Resources Information Center
Pruchno, Rachel A.; Brill, Jonathan E.; Shands, Yvonne; Gordon, Judith R.; Genderson, Maureen Wilson; Rose, Miriam; Cartwright, Francine
2008-01-01
Purpose: We contrast characteristics of respondents recruited using convenience strategies with those of respondents recruited by random digit dial (RDD) methods. We compare sample variances, means, and interrelationships among variables generated from the convenience and RDD samples. Design and Methods: Women aged 50 to 64 who work full time and…
Heo, Moonseong; Meissner, Paul; Litwin, Alain H; Arnsten, Julia H; McKee, M Diane; Karasz, Alison; McKinley, Paula; Rehm, Colin D; Chambers, Earle C; Yeh, Ming-Chin; Wylie-Rosett, Judith
2017-01-01
Comparative effectiveness research trials in real-world settings may require participants to choose between preferred intervention options. A randomized clinical trial with parallel experimental and control arms is straightforward and regarded as a gold-standard design, but by design it forces participants to comply with a randomly assigned intervention regardless of their preference. Therefore, the randomized clinical trial may impose impractical limitations when planning comparative effectiveness research trials. To accommodate participants' preferences when they are expressed, and to maintain randomization, we propose an alternative design that allows participants to express a preference after randomization, which we call a "preference option randomized design" (PORD). In contrast to other preference designs, which ask whether or not participants consent to the assigned intervention after randomization, the crucial feature of the preference option randomized design is its unique informed consent process before randomization. Specifically, the preference option randomized design consent process informs participants that they can opt out and switch to the other intervention only if, after randomization, they actively express the desire to do so. Participants who do not independently express an explicit alternate preference, or who assent to the randomly assigned intervention, are considered to not have an alternate preference. In sum, the preference option randomized design intends to maximize retention, minimize the possibility of forced assignment for any participant, and maintain randomization by allowing participants with no or equal preference to represent random assignments. This design scheme makes it possible to define five effects that are interconnected through common design parameters (comparative, preference, selection, intent-to-treat, and overall/as-treated) to collectively guide decision making between interventions.
Statistical power functions for testing all these effects are derived, and simulations verified the validity of the power functions under normal and binomial distributions.
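The power cost of post-randomization switching can be illustrated with a small Monte Carlo sketch. This is not the authors' derived power functions; the effect size, switching rates, and sample sizes below are hypothetical.

```python
import numpy as np

def pord_power_sim(n_per_arm=150, effect=0.4, sd=1.0,
                   switch_prob=(0.0, 0.0), n_sim=2000, seed=1):
    """Monte Carlo power of the intent-to-treat z-test in a two-arm trial
    where a fraction of each arm opts into the other intervention after
    randomization (a PORD-like switching scheme)."""
    rng = np.random.default_rng(seed)
    crit = 1.96  # two-sided critical value at alpha = 0.05
    rejections = 0
    for _ in range(n_sim):
        assigned = np.repeat([0, 1], n_per_arm)
        p_switch = np.where(assigned == 0, switch_prob[0], switch_prob[1])
        switch = rng.random(assigned.size) < p_switch
        treated = np.where(switch, 1 - assigned, assigned)  # as-treated arm
        y = effect * treated + rng.normal(0.0, sd, assigned.size)
        # Intent-to-treat analysis: compare the randomized groups.
        y0, y1 = y[assigned == 0], y[assigned == 1]
        se = np.sqrt(y0.var(ddof=1) / y0.size + y1.var(ddof=1) / y1.size)
        rejections += abs(y1.mean() - y0.mean()) / se > crit
    return rejections / n_sim

power_no_switching = pord_power_sim()
power_with_switching = pord_power_sim(switch_prob=(0.2, 0.2))  # 20% opt out
```

Switching dilutes the intent-to-treat contrast, so the second power estimate is visibly lower than the first at the same sample size.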
Local Random Quantum Circuits are Approximate Polynomial-Designs
NASA Astrophysics Data System (ADS)
Brandão, Fernando G. S. L.; Harrow, Aram W.; Horodecki, Michał
2016-09-01
We prove that local random quantum circuits acting on n qubits composed of O(t^10 n^2) many nearest-neighbor two-qubit gates form an approximate unitary t-design. Previously it was unknown whether random quantum circuits were a t-design for any t > 3. The proof is based on an interplay of techniques from quantum many-body theory, representation theory, and the theory of Markov chains. In particular we employ a result of Nachtergaele for lower bounding the spectral gap of frustration-free quantum local Hamiltonians; a quasi-orthogonality property of permutation matrices; a result of Oliveira which extends to the unitary group the path-coupling method for bounding the mixing time of random walks; and a result of Bourgain and Gamburd showing that dense subgroups of the special unitary group, composed of elements with algebraic entries, are ∞-copy tensor-product expanders. We also consider pseudo-randomness properties of local random quantum circuits of small depth and prove that circuits of depth O(t^10 n) constitute a quantum t-copy tensor-product expander. The proof also rests on techniques from quantum many-body theory, in particular on the detectability lemma of Aharonov, Arad, Landau, and Vazirani. We give applications of the results to cryptography, equilibration of closed quantum dynamics, and the generation of topological order. In particular we show the following pseudo-randomness property of generic quantum circuits: Almost every circuit U of size O(n^k) on n qubits cannot be distinguished from a Haar uniform unitary by circuits of size O(n^((k-9)/11)) that are given oracle access to U.
A comparison of two sampling designs for fish assemblage assessment in a large river
Kiraly, Ian A.; Coghlan, Stephen M.; Zydlewski, Joseph D.; Hayes, Daniel
2014-01-01
We compared the efficiency of stratified random and fixed-station sampling designs for characterizing fish assemblages in anticipation of dam removal on the Penobscot River, the largest river in Maine. We used boat electrofishing methods in both sampling designs. Multiple 500-m transects were selected randomly and electrofished in each of nine strata within the stratified random sampling design. Within the fixed-station design, up to 11 transects (1,000 m) were electrofished, all of which had been sampled previously. In total, 88 km of shoreline were electrofished during summer and fall in 2010 and 2011, and 45,874 individuals of 34 fish species were captured. Species-accumulation and dissimilarity curve analyses indicated that all sampling effort, other than fall 2011 under the fixed-station design, provided repeatable estimates of total species richness and proportional abundances. Overall, our sampling designs were similar in precision and efficiency for sampling fish assemblages. The fixed-station design was negatively biased for estimating the abundance of species such as Common Shiner Luxilus cornutus and Fallfish Semotilus corporalis, and positively biased for estimating biomass for species such as White Sucker Catostomus commersonii and Atlantic Salmon Salmo salar. However, we found no significant differences between the designs for proportional catch and biomass per unit effort, except in fall 2011. The difference observed in fall 2011 was due to limitations on the number and location of fixed sites that could be sampled, rather than an inherent bias within the design. Given these results from the Penobscot River, the stratified random design is preferable to the fixed-station design because it is less prone to bias from varying sampling effort (as occurred in the fall 2011 fixed-station sample) or from purposeful site selection.
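The species-accumulation analysis used above to judge sampling repeatability can be sketched in a few lines. The transect catches below are invented sets of species codes, not the study's data.

```python
import random

def species_accumulation(transect_catches, n_orders=200, seed=42):
    """Mean species-accumulation curve: for many random orderings of the
    transects, count the cumulative number of unique species observed
    after 1, 2, ..., k transects, then average over orderings."""
    rng = random.Random(seed)
    k = len(transect_catches)
    totals = [0.0] * k
    for _ in range(n_orders):
        order = list(transect_catches)
        rng.shuffle(order)
        seen = set()
        for i, catch in enumerate(order):
            seen.update(catch)
            totals[i] += len(seen)
    return [t / n_orders for t in totals]

# Hypothetical catches (sets of species codes) from six electrofished transects.
catches = [{"LC", "SC", "CC"}, {"LC", "SS"}, {"CC", "SS", "AB"},
           {"LC"}, {"SC", "AB", "EP"}, {"LC", "CC"}]
curve = species_accumulation(catches)
```

A curve that flattens before the last transect suggests the sampling effort was sufficient to capture total species richness.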
Proceedings of the First NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)
2009-01-01
Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.
Wang, Wei; Griswold, Michael E
2016-11-30
The random effect Tobit model is a regression model that accommodates left- and/or right-censoring and within-cluster dependence of the outcome variable. Regression coefficients of random effect Tobit models have conditional interpretations on a constructed latent dependent variable and do not provide inference about overall exposure effects on the original outcome scale. The marginalized random effects model (MREM) permits likelihood-based estimation of marginal mean parameters for clustered data. For random effect Tobit models, we extend the MREM to marginalize over both the random effects and the normal space and boundary components of the censored response, in order to estimate overall exposure effects at the population level. We also extend the 'Average Predicted Value' method to estimate the model-predicted marginal means for each person under different exposure statuses in a designated reference group, by integrating over the random effects, and then use the calculated difference to assess the overall exposure effect. Maximum likelihood estimation is proposed using a quasi-Newton optimization algorithm with Gauss-Hermite quadrature to approximate the integration over the random effects. We use these methods to carefully analyze two real datasets. Copyright © 2016 John Wiley & Sons, Ltd.
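The Gauss-Hermite step (integrating a censored-outcome mean over a normal random intercept) can be sketched as follows. This is a minimal illustration with made-up parameters, not the authors' full MREM likelihood; because the latent variable here is marginally normal, the result can be checked against a closed form.

```python
import math
import numpy as np

def censored_normal_mean(m, s):
    """E[max(0, X)] for X ~ N(m, s^2), via the standard Tobit formula."""
    z = m / s
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    return m * cdf + s * pdf

def marginal_tobit_mean(mu, sigma, tau, n_nodes=30):
    """Marginal mean of a left-censored-at-zero outcome with linear
    predictor mu, residual SD sigma, and random-intercept SD tau,
    integrating over the random effect by Gauss-Hermite quadrature."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    b = math.sqrt(2.0) * tau * nodes  # change of variables for N(0, tau^2)
    vals = [censored_normal_mean(mu + bi, sigma) for bi in b]
    return float(np.dot(weights, vals) / math.sqrt(math.pi))

gh = marginal_tobit_mean(mu=0.5, sigma=1.0, tau=0.8)
# Marginally the latent variable is N(mu, sigma^2 + tau^2), so the same
# mean has a closed form that serves as a check on the quadrature.
exact = censored_normal_mean(0.5, math.sqrt(1.0 + 0.8 ** 2))
```

With 30 nodes the quadrature reproduces the closed form to high accuracy, which is why Gauss-Hermite is a standard choice for integrating smooth functions of a normal random effect.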
Barlow, D H; Hayes, S C
1979-01-01
A little-used and often confused design, capable of comparing two treatments within a single subject, has been termed, variously, a multielement baseline design, a multiple schedule design, and a randomization design. The background of these terms is reviewed, and a new, more descriptive term, Alternating Treatments Design, is proposed. Critical differences between this design and a Simultaneous Treatment Design are outlined, and the experimental questions answerable by each design are noted. Potential problems with multiple treatment interference in this procedure are divided into sequential confounding, carryover effects, and alternation effects, and the importance of these issues vis-a-vis other single-case experimental designs is considered. Methods of minimizing multiple treatment interference, as well as methods of studying these effects, are outlined. Finally, appropriate uses of Alternating Treatments Designs are described and discussed in the context of recent examples. PMID:489478
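The "randomization design" reading of an alternating-treatments series lends itself to an exact randomization test: the observed A-B difference is compared against every equally sized reassignment of the session labels. The session scores below are invented for illustration, not data from the article.

```python
from itertools import combinations

def randomization_test(scores, labels):
    """Exact two-sided randomization test for a single-subject
    alternating-treatments series: p-value is the fraction of all
    equally sized label reassignments whose |mean difference| is at
    least as large as the observed one."""
    n = len(scores)
    idx_a = {i for i, g in enumerate(labels) if g == "A"}
    n_a = len(idx_a)

    def mean_diff(a_set):
        mean_a = sum(scores[i] for i in a_set) / n_a
        mean_b = sum(scores[i] for i in range(n) if i not in a_set) / (n - n_a)
        return mean_a - mean_b

    observed = mean_diff(idx_a)
    count = total = 0
    for combo in combinations(range(n), n_a):
        count += abs(mean_diff(set(combo))) >= abs(observed) - 1e-12
        total += 1
    return observed, count / total

# Hypothetical session scores under alternating treatments A and B.
scores = [4, 9, 3, 8, 5, 10, 4, 9]
labels = ["B", "A", "B", "A", "B", "A", "B", "A"]
obs, p = randomization_test(scores, labels)
```

With eight sessions there are only C(8,4) = 70 reassignments, so the test is exact; longer series would use a random subset of reassignments instead.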
Greek classicism in living structure? Some deductive pathways in animal morphology.
Zweers, G A
1985-01-01
Classical temples in ancient Greece exhibit two deterministic illusionistic principles of architecture that govern their functional design: geometric proportionalism and a set of illusion-strengthening rules within the proportionalism's "stochastic margin". Animal morphology, in its mechanistic-deductive revival, applies just one architectural principle, which is not always satisfactory. Whether a "Greek Classical" situation occurs in the architecture of living structure is to be investigated by extreme testing with deductive methods. Three deductive methods for the explanation of living structure in animal morphology are proposed: the parts, the compromise, and the transformation deduction. The methods are based upon the systems concept for an organism, the flow chart for a functionalistic picture, and the network chart for a structuralistic picture, while the "optimal design" serves as the architectural principle for living structure. These methods clearly show the high explanatory power of deductive methods in morphology, but they also make one open end explicit: neutral issues do exist. Full explanation of living structure requires three entries: functional design within architectural and transformational constraints. The transformational constraint necessarily brings in a stochastic component: a random variation that acts as a sort of "free management space". This variation must be a variation from the deterministic principle of the optimal design, since any transformation requires space for plasticity in structure and action, and flexibility in role fulfilment. Nevertheless, the question finally arises whether a situation similar to that of Greek Classical temples exists for animal structure. This would mean that the random variation found when the optimal design is used to explain structure comprises, apart from a stochastic part, real deviations constituting yet another deterministic part. This deterministic part could be a set of rules that governs actualization within the "free management space".
Seiler, CM; Fröhlich, BE; Veit, JA; Gazyakan, E; Wente, MN; Wollermann, C; Deckert, A; Witte, S; Victor, N; Buchler, MW; Knaebel, HP
2006-01-01
Background Annually, more than 90,000 surgical procedures of the thyroid gland are performed in Germany. Strategies aimed at reducing the duration of the surgical procedure are relevant to patients and the health care system, especially in the context of reducing costs. However, new techniques for quick and safe hemostasis have to be tested in clinically relevant randomized controlled trials before a general recommendation can be given. The current standard for occlusion of blood vessels in thyroid surgery is the ligature. Vascular clips may be a safe alternative but have not been investigated in a large RCT. Methods/design CLIVIT (Clips versus Ligatures in Thyroid Surgery) is an investigator-initiated, multicenter, patient-blinded, two-group parallel relevance randomized controlled trial designed by the Study Center of the German Surgical Society. Patients scheduled for elective resection of at least two thirds of the gland for benign thyroid disease are eligible for participation. After surgical exploration, patients are randomized intraoperatively into either the conventional ligature group or the clip group. The primary objective is to test for a relevant reduction in operating time (at least 15 min) when using the clip technique. Since April 2004, 121 of the required 420 patients have been randomized in five centers. Discussion As in all trials, the different forms of bias have to be considered; in a surgical trial such as this one, surgical expertise plays a key role and will be documented and analyzed separately. This is the first randomized controlled multicenter relevance trial to compare different vessel occlusion techniques in thyroid surgery with adequate power, and detailed information about the design and framework is provided. If significant, the results might be generalized and may change current surgical practice. PMID:16948853
Child/Adolescent Anxiety Multimodal Study (CAMS): rationale, design, and methods
2010-01-01
Objective To present the design, methods, and rationale of the Child/Adolescent Anxiety Multimodal Study (CAMS), a recently completed federally-funded, multi-site, randomized placebo-controlled trial that examined the relative efficacy of cognitive-behavior therapy (CBT), sertraline (SRT), and their combination (COMB) against pill placebo (PBO) for the treatment of separation anxiety disorder (SAD), generalized anxiety disorder (GAD) and social phobia (SoP) in children and adolescents. Methods Following a brief review of the acute outcomes of the CAMS trial, as well as the psychosocial and pharmacologic treatment literature for pediatric anxiety disorders, the design and methods of the CAMS trial are described. Results CAMS was a six-year, six-site, randomized controlled trial. Four hundred eighty-eight (N = 488) children and adolescents (ages 7-17 years) with DSM-IV-TR diagnoses of SAD, GAD, or SoP were randomly assigned to one of four treatment conditions: CBT, SRT, COMB, or PBO. Assessments of anxiety symptoms, safety, and functional outcomes, as well as putative mediators and moderators of treatment response, were completed in a multi-measure, multi-informant fashion. Manual-based therapies, trained clinicians, and independent evaluators were used to ensure treatment and assessment fidelity. A multi-layered administrative structure with representation from all sites facilitated cross-site coordination of the entire trial, study protocols and quality assurance. Conclusions CAMS offers a model for clinical trials methods applicable to psychosocial and psychopharmacological comparative treatment trials by using state-of-the-art methods and rigorous cross-site quality controls. CAMS also provided a large-scale examination of the relative and combined efficacy and safety of the best evidence-based psychosocial (CBT) and pharmacologic (SSRI) treatments to date for the most commonly occurring pediatric anxiety disorders. 
Primary and secondary results of CAMS will hold important implications for informing practice-relevant decisions regarding the initial treatment of youth with anxiety disorders. Trial registration ClinicalTrials.gov NCT00052078. PMID:20051130
Kinematic Methods of Designing Free Form Shells
NASA Astrophysics Data System (ADS)
Korotkiy, V. A.; Khmarova, L. I.
2017-11-01
The geometrical shell model is formed in light of set requirements expressed through surface parameters. The shell is modelled using the kinematic method, according to which the shell is formed as a continuous one-parameter set of curves. The authors offer a kinematic method based on the use of second-order curves with a variable eccentricity as the form-making element. Additional guiding ruled surfaces are used to control the form of the designed surface. The authors developed a software application that plots a second-order curve specified by an arbitrary set of five coplanar points and tangents.
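Fitting a second-order curve through five coplanar points, one ingredient of such a kinematic tool (tangent conditions are omitted here), reduces to a small linear-algebra problem: the conic's six coefficients span the null space of a 5x6 incidence matrix. The points below are hypothetical.

```python
import numpy as np

def conic_through_points(points):
    """Coefficients (a, b, c, d, e, f) of the second-order curve
    a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0 through five points,
    taken as the null space of the 5x6 incidence matrix (the right-
    singular vector for the smallest singular value)."""
    m = np.array([[x * x, x * y, y * y, x, y, 1.0] for x, y in points])
    _, _, vt = np.linalg.svd(m)
    return vt[-1]

# Five hypothetical coplanar points; these happen to lie on the unit circle.
pts = [(0.0, 1.0), (1.0, 0.0), (0.0, -1.0), (-1.0, 0.0), (0.8, 0.6)]
coeffs = conic_through_points(pts)
```

If no four of the points are collinear, the conic is unique up to scale, so the recovered coefficient vector is proportional to x^2 + y^2 - 1 = 0 for this example.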
ERIC Educational Resources Information Center
Crook, Nicola; Adams, Malcolm; Shorten, Nicola; Langdon, Peter E.
2016-01-01
Background: This study investigated whether a personalized life story book and rummage box enhanced well-being and led to changes in behaviour for people with Down syndrome (DS) who have dementia. Materials and Methods: A randomized single case series design was used with five participants who had DS and a diagnosis of dementia. Participants were…
Detector Design Considerations in High-Dimensional Artificial Immune Systems
2012-03-22
a method known as randomized RNS [15]. In this approach, Monte Carlo integration is used to determine the size of self and non-self within the given…feature space, then a number of randomly placed detectors are chosen according to Monte Carlo integration calculations. Simulated annealing is then…detector is only counted once). This value is termed 'actual content' because it does not include overlapping content, but only that content that is
Markov random field model-based edge-directed image interpolation.
Li, Min; Nguyen, Truong Q
2008-07-01
This paper presents an edge-directed image interpolation algorithm. In the proposed algorithm, the edge directions are implicitly estimated with a statistics-based approach. Instead of explicit edge directions, the local edge directions are indicated by length-16 weighting vectors. Implicitly, the weighting vectors are used to formulate a geometric regularity (GR) constraint (smoothness along edges and sharpness across edges), and the GR constraint is imposed on the interpolated image through the Markov random field (MRF) model. Furthermore, under the maximum a posteriori MRF framework, the desired interpolated image corresponds to the minimal-energy state of a 2-D random field given the low-resolution image. Simulated annealing methods are used to search for the minimal-energy state in the state space. To lower the computational complexity of the MRF approach, a single-pass implementation is designed, which performs nearly as well as the iterative optimization. Simulation results show that the proposed MRF model-based edge-directed interpolation method produces edges with strong geometric regularity. Compared to traditional methods and other edge-directed interpolation methods, the proposed method improves the subjective quality of the interpolated edges while maintaining a high PSNR level.
Randomized Controlled Trials in Music Therapy: Guidelines for Design and Implementation.
Bradt, Joke
2012-01-01
Evidence from randomized controlled trials (RCTs) plays a powerful role in today's healthcare industry. At the same time, it is important that multiple types of evidence contribute to music therapy's knowledge base and that the dialogue on clinical effectiveness in music therapy is not dominated by the biomedical hierarchical model of evidence-based practice. Whether or not one agrees with the hierarchical model of evidence in the current healthcare climate, RCTs can contribute important knowledge to our field. Therefore, it is important that music therapists are prepared to design trials that meet current methodological standards and, equally important, are able to respond appropriately to those design aspects that may not be feasible in music therapy research. The purpose of this article is to provide practical guidelines to music therapy researchers for the design and implementation of RCTs, as well as to enable music therapists to be well-informed consumers of RCT evidence. The article reviews key design aspects of RCTs and discusses how best to implement these standards in music therapy trials. A systematic presentation of basic randomization methods, allocation concealment strategies, issues related to blinding in music therapy trials and strategies for their implementation, the use of treatment manuals, types of control groups, outcome selection, and sample size computation is provided. Despite the challenges of meeting all the key design demands typical of an RCT, it is possible to design rigorous music therapy RCTs that accurately estimate music therapy treatment benefits.
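One of the basic randomization methods covered above, permuted-block randomization, can be sketched in a few lines; the block size and seed are arbitrary choices for illustration, not recommendations from the article.

```python
import random

def blocked_randomization(n, block_size=4, seed=7):
    """Permuted-block allocation list for a two-arm trial: each block
    contains equal numbers of A and B in random order, so the arm
    counts can never diverge by more than block_size / 2."""
    assert block_size % 2 == 0, "block size must be even for two arms"
    rng = random.Random(seed)
    allocations = []
    while len(allocations) < n:
        block = ["A", "B"] * (block_size // 2)
        rng.shuffle(block)  # random order within the block
        allocations.extend(block)
    return allocations[:n]

schedule = blocked_randomization(20)
```

In practice the schedule would be held by a party separate from recruitment (allocation concealment), and the block size itself may be varied or kept secret so that upcoming assignments cannot be predicted.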
Statistical inference for the additive hazards model under outcome-dependent sampling.
Yu, Jichang; Liu, Yanyan; Sandler, Dale P; Zhou, Haibo
2015-09-01
Cost-effective study design and proper inference procedures for data from such designs are always of particular interests to study investigators. In this article, we propose a biased sampling scheme, an outcome-dependent sampling (ODS) design for survival data with right censoring under the additive hazards model. We develop a weighted pseudo-score estimator for the regression parameters for the proposed design and derive the asymptotic properties of the proposed estimator. We also provide some suggestions for using the proposed method by evaluating the relative efficiency of the proposed method against simple random sampling design and derive the optimal allocation of the subsamples for the proposed design. Simulation studies show that the proposed ODS design is more powerful than other existing designs and the proposed estimator is more efficient than other estimators. We apply our method to analyze a cancer study conducted at NIEHS, the Cancer Incidence and Mortality of Uranium Miners Study, to study the risk of radon exposure to cancer.
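The idea behind weighting an outcome-dependent sample back to the population can be illustrated with a simple inverse-probability-weighted mean. This is a toy sketch on simulated data, not the authors' weighted pseudo-score estimator for the additive hazards model.

```python
import numpy as np

def ods_weighted_mean(x, sampled, p_sample):
    """Horvitz-Thompson style weighted mean of x over an outcome-
    dependent sample: each sampled unit is weighted by the inverse of
    its (outcome-driven) inclusion probability."""
    w = 1.0 / p_sample[sampled]
    return float(np.sum(w * x[sampled]) / np.sum(w))

rng = np.random.default_rng(0)
n = 200_000
x = rng.normal(0.0, 1.0, n)                       # covariate, population mean 0
event = rng.random(n) < 1.0 / (1.0 + np.exp(-x))  # outcome depends on x
p = np.where(event, 0.9, 0.1)                     # oversample units with events
sampled = rng.random(n) < p

naive = float(x[sampled].mean())                  # biased by the sampling scheme
weighted = ods_weighted_mean(x, sampled, p)       # approximately unbiased
```

The naive sample mean is pulled toward the oversampled (high-outcome) units, while the weighted estimator recovers the population mean; design-specific estimators such as the authors' pseudo-score approach additionally optimize efficiency.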
Regression discontinuity was a valid design for dichotomous outcomes in three randomized trials.
van Leeuwen, Nikki; Lingsma, Hester F; Mooijaart, Simon P; Nieboer, Daan; Trompet, Stella; Steyerberg, Ewout W
2018-06-01
Regression discontinuity (RD) is a quasi-experimental design that may provide valid estimates of treatment effects in the case of continuous outcomes. We aimed to evaluate the validity and precision of the RD design for dichotomous outcomes. We performed validation studies in three large randomized controlled trials (RCTs): Corticosteroid Randomization After Significant Head injury (CRASH), the Global Utilization of Streptokinase and Tissue Plasminogen Activator for Occluded Coronary Arteries (GUSTO), and the PROspective Study of Pravastatin in elderly individuals at risk of vascular disease (PROSPER). To mimic the RD design, we selected patients above and below a cutoff (e.g., age 75 years) randomized to treatment and control, respectively. Adjusted logistic regression models using restricted cubic splines (RCS) and polynomials, and local logistic regression models, estimated the odds ratio (OR) for treatment, with 95% confidence intervals (CIs) to indicate precision. In CRASH, treatment increased mortality, with OR 1.22 [95% CI 1.06-1.40] in the RCT. The RD estimates were 1.42 [0.94-2.16] and 1.13 [0.90-1.40] with RCS adjustment and local regression, respectively. In GUSTO, treatment reduced mortality (OR 0.83 [0.72-0.95]), with more extreme estimates in the RD analysis (OR 0.57 [0.35-0.92] and 0.67 [0.51-0.86]). In PROSPER, similar RCT and RD estimates were found, again with less precision in the RD designs. We conclude that the RD design provides similar but substantially less precise treatment effect estimates compared with an RCT, with local regression being the preferred method of analysis. Copyright © 2018 Elsevier Inc. All rights reserved.
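The RD construction (treatment taken above a cutoff, outcomes compared within a bandwidth) can be mimicked on synthetic data. The sketch below uses a crude 2x2-table odds ratio rather than the study's adjusted logistic models, and every parameter value is invented.

```python
import math
import numpy as np

def local_odds_ratio(age, treated, died, cutoff, bandwidth):
    """Sharp-RD sketch: odds ratio for treatment from the 2x2 table of
    patients within `bandwidth` years of the cutoff, where treatment is
    taken only above the cutoff."""
    near = np.abs(age - cutoff) <= bandwidth
    t, d = treated[near], died[near]
    a = np.sum(d & (t == 1)) + 0.5   # Haldane-Anscombe continuity correction
    b = np.sum(~d & (t == 1)) + 0.5
    c = np.sum(d & (t == 0)) + 0.5
    e = np.sum(~d & (t == 0)) + 0.5
    return float((a / b) / (c / e))

rng = np.random.default_rng(3)
n = 300_000
age = rng.uniform(55.0, 95.0, n)
treated = (age >= 75).astype(int)                               # cutoff assignment
log_odds = -2.0 + 0.04 * (age - 75) + math.log(0.8) * treated   # true OR 0.8
died = rng.random(n) < 1.0 / (1.0 + np.exp(-log_odds))
or_hat = local_odds_ratio(age, treated, died, cutoff=75, bandwidth=2.0)
```

The crude estimate detects the protective effect but is pulled toward the null by the residual age trend inside the bandwidth, which is why the study's local and spline-adjusted regressions model that trend explicitly.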
Goulden, Peter A.; Bursac, Zoran; Hudson, Jonell; Purvis, Rachel S.; Yeary, Karen H. Kim; Aitaoto, Nia; Kohler, Peter O.
2016-01-01
This article illustrates how a collaborative research process can successfully engage an underserved minority community to address health disparities. Pacific Islanders, including the Marshallese, are one of the fastest growing US populations. They face significant health disparities, including extremely high rates of type 2 diabetes. This article describes the engagement process of designing patient-centered outcomes research with Marshallese stakeholders, highlighting the specific influences of their input on a randomized control trial to address diabetes. Over 18 months, an interdisciplinary research team used community-based participatory principles to conduct patient-engaged outcomes research that involved 31 stakeholders in all aspects of research design, from defining the research question to making decisions about budgets and staffing. This required academic researcher flexibility, but yielded a design linking scientific methodology with community wisdom. PMID:27325179
[Design of the National Surveillance of Nutritional Indicators (MONIN), Peru 2007-2010].
Campos-Sánchez, Miguel; Ricaldi-Sueldo, Rita; Miranda-Cuadros, Marianella
2011-06-01
To describe the design and methods of the national surveillance of nutritional indicators (MONIN) 2007-2010, carried out by INS/CENAN. MONIN was designed as a continuous (repeated cross-sectional) survey with stratified multi-stage random sampling, taking as the universe all children under five and pregnant women residing in Peru, divided into 5 geographical strata and 6 trimesters (randomly permuted weeks, about 78% of the time between November 19, 2007 and April 2, 2010). The total sample was 3,827 children in 361 completed clusters. The dropout rate was 8.4% for clusters, 1.8% for houses, and 13.2% for households. Dropout was also 4.2%, 13.3%, 21.2%, 55%, and 29% for the anthropometry, hemoglobin, food intake, retinol, and ioduria measurements, respectively. The MONIN design is feasible and useful for the estimation of indicators of childhood malnutrition.
Velthuis, Miranda J; May, Anne M; Monninkhof, Evelyn M; van der Wall, Elsken; Peeters, Petra H M
2012-03-01
Assessing the effects of lifestyle interventions in cancer patients poses some specific challenges. Although randomization is urgently needed for evidence-based knowledge, it is sometimes difficult to apply conventional randomization (i.e., consent preceding randomization and intervention) in daily settings. Randomization before seeking consent was proposed by Zelen, and additional modifications have been proposed since. We discuss four alternatives to conventional randomization: the single and double randomized consent designs, the two-stage randomized consent design, and the design with consent to postponed information. We considered these designs when planning a study to assess the impact of physical activity on cancer-related fatigue and quality of life. We tested the modified Zelen design with consent to postponed information in a pilot. The design was chosen to prevent dropout of participants in the control group because of disappointment about the allocation. The result was a low overall participation rate, most likely because eligible patients perceived a lack of information, and a relatively high dropout in the intervention group. We conclude that the alternatives were not better than conventional randomization. Copyright © 2012 Elsevier Inc. All rights reserved.
Recommendations for research design of telehealth studies.
Chumbler, Neale R; Kobb, Rita; Brennan, David M; Rabinowitz, Terry
2008-11-01
Properly designed randomized controlled trials (RCTs) are the gold standard for examining the effectiveness of telehealth interventions on clinical outcomes. Some published telehealth studies have employed well-designed RCTs. However, such methods are not always feasible and practical in particular settings. This white paper not only addresses the need for properly designed RCTs but also offers alternative research designs, such as quasi-experimental designs, and statistical techniques that can be employed to rigorously assess the effectiveness of telehealth studies. The paper further offers design and measurement recommendations aimed at, and relevant to, administrative decision-makers, policymakers, and practicing clinicians.
The Statistical Power of the Cluster Randomized Block Design with Matched Pairs--A Simulation Study
ERIC Educational Resources Information Center
Dong, Nianbo; Lipsey, Mark
2010-01-01
This study uses simulation techniques to examine the statistical power of the group- randomized design and the matched-pair (MP) randomized block design under various parameter combinations. Both nearest neighbor matching and random matching are used for the MP design. The power of each design for any parameter combination was calculated from…
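A power simulation of this kind can be sketched as follows. The parameters are hypothetical, pairing is on a single cluster-level prognostic covariate, and a normal approximation replaces the study's exact analysis models.

```python
import numpy as np

def cluster_power(n_pairs=10, effect=0.5, resid_sd=1.0, m=20,
                  matched=True, n_sim=1500, seed=11):
    """Simulated power of a two-arm cluster trial analysed on cluster
    means. Clusters are sorted on a prognostic cluster-level covariate
    and paired with their nearest neighbor; matched=True randomizes
    within pairs and tests the pair differences, matched=False ignores
    the pairing in the analysis."""
    rng = np.random.default_rng(seed)
    crit = 1.96  # normal approximation, to keep the sketch short
    hits = 0
    for _ in range(n_sim):
        covariate = np.sort(rng.normal(0.0, 1.0, 2 * n_pairs))
        cluster_mean = (covariate                                  # prognostic signal
                        + rng.normal(0.0, 0.3, 2 * n_pairs)        # cluster noise
                        + rng.normal(0.0, resid_sd / np.sqrt(m), 2 * n_pairs))
        pairs = cluster_mean.reshape(n_pairs, 2)   # adjacent clusters pair up
        flip = rng.integers(0, 2, n_pairs)         # coin flip within each pair
        treat = pairs[np.arange(n_pairs), flip] + effect
        ctrl = pairs[np.arange(n_pairs), 1 - flip]
        if matched:
            d = treat - ctrl
            stat = d.mean() / (d.std(ddof=1) / np.sqrt(n_pairs))
        else:
            se = np.sqrt(treat.var(ddof=1) / n_pairs + ctrl.var(ddof=1) / n_pairs)
            stat = (treat.mean() - ctrl.mean()) / se
        hits += abs(stat) > crit
    return hits / n_sim

power_matched = cluster_power(matched=True)
power_unmatched = cluster_power(matched=False)
```

When the pairing covariate is strongly prognostic, as here, the matched analysis removes most of the between-cluster variance and gains considerable power over the unmatched analysis; with an uninformative covariate the lost degrees of freedom can reverse that ordering.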
ERIC Educational Resources Information Center
Gauthier, Andrea; Jenkinson, Jodie
2017-01-01
We designed a serious game, MolWorlds, to facilitate conceptual change about molecular emergence by using game mechanics (resource management, immersed 3rd person character, sequential level progression, and 3-star scoring system) to encourage cycles of productive negativity. We tested the value-added effect of game design by comparing and…
A Watershed-Scale Survey for Stream-Foraging Birds in Northern California
Sherri L. Miller; C. John Ralph
2005-01-01
Our objective was to develop a survey technique and watershed-scale design to monitor trends of population size and habitat associations in stream-foraging birds. The resulting methods and design will be used to examine the efficacy of quantifying the association of stream and watershed quality with bird abundance. We surveyed 60 randomly selected 2-km stream reaches...
ERIC Educational Resources Information Center
Ford, Marvella E.; Havstad, Suzanne; Vernon, Sally W.; Davis, Shawna D.; Kroll, David; Lamerato, Lois; Swanson, G. Marie
2006-01-01
Purpose: The purpose of this study was to enhance adherence among older (aged 55 years and older) African American men enrolled in a cancer screening trial for prostate, lung, and colorectal cancer. For this study, we defined "adherence" as completing the trial screenings. Design and Methods: We used a randomized trial design. Case managers…
ERIC Educational Resources Information Center
Stineman, Margaret G.; Strumpf, Neville; Kurichi, Jibby E.; Charles, Jeremy; Grisso, Jeane Ann; Jayadevappa, Ravishankar
2011-01-01
Purpose of the study: To assess the recruitment, adherence, and retention of urban elderly, predominantly African Americans to a falls reduction exercise program. Design and methods: The randomized controlled trial was designed as an intervention development pilot study. The goal was to develop a culturally sensitive intervention for elderly…
ERIC Educational Resources Information Center
Mahoney, Diane Feeney; Tarlow, Barbara J.; Jones, Richard N.
2003-01-01
Purpose: We determine the main outcome effects of a 12-month computer-mediated automated interactive voice response (IVR) intervention designed to assist family caregivers managing persons with disruptive behaviors related to Alzheimer's disease (AD). Design and Methods: We conducted a randomized controlled study of 100 caregivers, 51 in the usual…
ERIC Educational Resources Information Center
Lee, Hyegyu; Paek, Hye-Jin
2013-01-01
Objective: To examine how norm appeals and guilt influence smokers' behavioural intention. Design: Quasi-experimental design. Setting: South Korea. Method: Two hundred and fifty-five male smokers were randomly assigned to descriptive, injunctive, or subjective anti-smoking norm messages. After they viewed the norm messages, their norm perceptions,…
ERIC Educational Resources Information Center
Alam, Fahad; Boet, Sylvain; Piquette, Dominique; Lai, Anita; Perkes, Christopher P.; LeBlanc, Vicki R.
2016-01-01
Enhanced podcasts increase learning, but evidence is lacking on how they should be designed to optimize their effectiveness. This study assessed the impact two learning instructional design methods (mental practice and modeling), either on their own or in combination, for teaching complex cognitive medical content when incorporated into enhanced…
2013-01-01
Background Traditional Lot Quality Assurance Sampling (LQAS) designs assume observations are collected using simple random sampling. Alternatively, randomly sampling clusters of observations and then individuals within clusters reduces costs but decreases the precision of the classifications. In this paper, we develop a general framework for designing the cluster(C)-LQAS system and illustrate the method with the design of data quality assessments for the community health worker program in Rwanda. Results To determine sample size and decision rules for C-LQAS, we use the beta-binomial distribution to account for inflated risk of errors introduced by sampling clusters at the first stage. We present general theory and code for sample size calculations. The C-LQAS sample sizes provided in this paper constrain misclassification risks below user-specified limits. Multiple C-LQAS systems meet the specified risk requirements, but numerous considerations, including per-cluster versus per-individual sampling costs, help identify optimal systems for distinct applications. Conclusions We show the utility of C-LQAS for data quality assessments, but the method generalizes to numerous applications. This paper provides the necessary technical detail and supplemental code to support the design of C-LQAS for specific programs. PMID:24160725
Hedt-Gauthier, Bethany L; Mitsunaga, Tisha; Hund, Lauren; Olives, Casey; Pagano, Marcello
2013-10-26
Traditional Lot Quality Assurance Sampling (LQAS) designs assume observations are collected using simple random sampling. Alternatively, randomly sampling clusters of observations and then individuals within clusters reduces costs but decreases the precision of the classifications. In this paper, we develop a general framework for designing the cluster(C)-LQAS system and illustrate the method with the design of data quality assessments for the community health worker program in Rwanda. To determine sample size and decision rules for C-LQAS, we use the beta-binomial distribution to account for inflated risk of errors introduced by sampling clusters at the first stage. We present general theory and code for sample size calculations.The C-LQAS sample sizes provided in this paper constrain misclassification risks below user-specified limits. Multiple C-LQAS systems meet the specified risk requirements, but numerous considerations, including per-cluster versus per-individual sampling costs, help identify optimal systems for distinct applications. We show the utility of C-LQAS for data quality assessments, but the method generalizes to numerous applications. This paper provides the necessary technical detail and supplemental code to support the design of C-LQAS for specific programs.
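To make the beta-binomial machinery concrete, here is a minimal Python sketch (not the authors' supplemental code) of how the inflated classification risk can be computed for a candidate C-LQAS decision rule. The ICC-to-Beta reparameterization and all parameter values are illustrative assumptions.

```python
from math import comb, exp, lgamma

def log_beta(a, b):
    # log of the Beta function via log-gamma
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def betabinom_pmf(k, n, a, b):
    # beta-binomial pmf: binomial sampling with extra-binomial variation
    # induced by first-stage cluster sampling
    return comb(n, k) * exp(log_beta(k + a, n - k + b) - log_beta(a, b))

def misclassification_risk(n, d, p, rho):
    """P(classify the area as acceptable | true coverage p) when the rule
    accepts on more than d positives out of n, folding an intra-cluster
    correlation rho into a Beta(a, b) mixing distribution (illustrative
    reparameterization: rho = 1 / (a + b + 1))."""
    s = (1.0 - rho) / rho          # a + b
    a, b = p * s, (1.0 - p) * s
    return sum(betabinom_pmf(k, n, a, b) for k in range(d + 1, n + 1))
```

When the true coverage p lies below the unacceptable threshold, the acceptance probability returned here is exactly the misclassification risk that the sample size and decision rule must constrain.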
Blödt, Susanne; Schützler, Lena; Huang, Wenjing; Pach, Daniel; Brinkhaus, Benno; Hummelsberger, Josef; Kirschbaum, Barbara; Kuhlmann, Kirsten; Lao, Lixing; Liang, Fanrong; Mietzner, Anna; Mittring, Nadine; Müller, Sabine; Paul, Anna; Pimpao-Niederle, Carolina; Roll, Stephanie; Wu, Huangan; Zhu, Jiang; Witt, Claudia M
2013-04-11
Self-care acupressure might be successful in treating menstrual pain, which is common among young women. There is a need for comparative effectiveness research with stakeholder engagement in all phases to address the needs of decision-makers. Our aim was to design a study on the effectiveness of additional self-care acupressure for menstrual pain compared with usual care alone, using different methods of stakeholder engagement. The study was designed using multiple mixed methods for stakeholder engagement. Based on the results of a survey and a focus group discussion, a stakeholder advisory group developed the study design. Stakeholder engagement resulted in a two-arm pragmatic randomized trial. Two hundred and twenty women aged 18 to 25 years with menstrual pain will be included in the study. Outcome measurement will be done using electronic questionnaires provided by a study-specific mobile application (app). The primary outcome will be the mean pain intensity on the days of pain during the third menstruation after therapy start. Stakeholder engagement helped to develop a study design that better serves the needs of decision-makers, including an app as a modern tool for both intervention and data collection in a young target group. ClinicalTrials.gov identifier: NCT01582724.
Burke, Lora E.; Styn, Mindi A.; Glanz, Karen; Ewing, Linda J.; Elci, Okan U.; Conroy, Margaret B.; Sereika, Susan M.; Acharya, Sushama D.; Music, Edvin; Keating, Alison L.; Sevick, Mary Ann
2009-01-01
Background The primary form of treatment for obesity today is behavioral therapy. Self-monitoring diet and physical activity plays an important role in interventions targeting behavior and weight change. The SMART weight loss trial examined the impact of replacing the standard paper record used for self-monitoring with a personal digital assistant (PDA). This paper describes the design, methods, intervention, and baseline sample characteristics of the SMART trial. Methods The SMART trial used a 3-group design to determine the effects of different modes of self-monitoring on short- and long-term weight loss and on adherence to self-monitoring in a 24-month intervention. Participants were randomized to one of three conditions (1) use of a standard paper record (PR); (2) use of a PDA with dietary and physical activity software (PDA); or (3), use of a PDA with the same software plus a customized feedback program (PDA + FB). Results We screened 704 individuals and randomized 210. There were statistically but not clinically significant differences among the three cohorts in age, education, HDL cholesterol, blood glucose and systolic blood pressure. At 24 months, retention rate for the first of three cohorts was 90%. Conclusions To the best of our knowledge, the SMART trial is the first large study to compare different methods of self-monitoring in a behavioral weight loss intervention and to compare the use of PDAs to conventional paper records. This study has the potential to reveal significant details about self-monitoring patterns and whether technology can improve adherence to this vital intervention component. PMID:19665588
A new compound control method for sine-on-random mixed vibration test
NASA Astrophysics Data System (ADS)
Zhang, Buyun; Wang, Ruochen; Zeng, Falin
2017-09-01
Vibration environmental test (VET) is one of the important and effective methods of supporting the strength design, reliability, and durability testing of mechanical products. A new separation control strategy was proposed for multiple-input multiple-output (MIMO) sine-on-random (SOR) mixed-mode vibration tests, an advanced and demanding type of VET. As the key element of the strategy, a correlation integral method was applied to separate the mixed signals, which included random and sinusoidal components. The feedback control formula of a MIMO linear random vibration system was systematically derived in the frequency domain, and a Jacobi control algorithm was proposed based on elements of the power spectral density (PSD) matrix such as auto-spectrum, coherence, and phase. Because excitation corrections in sine vibration tests can be excessive, a compression factor was introduced to reduce the correction and avoid damage to the vibration table or other devices. The two methods were combined and applied in a MIMO SOR vibration test system. Finally, a verification test system with the vibration of a cantilever beam as the control object was established to verify the reliability and effectiveness of the proposed methods. The test results show that exceedance values can be accurately controlled within the tolerance range of the references, and the method provides theoretical and practical support for mechanical engineering.
Ewing, Alexander C.; Kottke, Melissa J.; Kraft, Joan Marie; Sales, Jessica M.; Brown, Jennifer L.; Goedken, Peggy; Wiener, Jeffrey; Kourtis, Athena P.
2018-01-01
Background African American adolescent females are at elevated risk for unintended pregnancy and sexually transmitted infections (STIs). Dual protection (DP) is defined as concurrent prevention of pregnancy and STIs. This can be achieved by abstinence, consistent condom use, or the dual methods of condoms plus an effective non-barrier contraceptive. Previous clinic-based interventions showed short-term effects on increasing dual method use, but evidence of sustained effects on dual method use and decreased incident pregnancies and STIs are lacking. Methods/Design This manuscript describes the 2GETHER Project. 2GETHER is a randomized controlled trial of a multi-component intervention to increase dual protection use among sexually active African American females aged 14–19 years not desiring pregnancy at a Title X clinic in Atlanta, GA. The intervention is clinic-based and includes a culturally tailored interactive multimedia component and counseling sessions, both to assist in selection of a DP method and to reinforce use of the DP method. The participants are randomized to the study intervention or the standard of care, and followed for 12 months to evaluate how the intervention influences DP method selection and adherence, pregnancy and STI incidence, and participants’ DP knowledge, intentions, and self-efficacy. Discussion The 2GETHER Project is a novel trial to reduce unintended pregnancies and STIs among African American adolescents. The intervention is unique in the comprehensive and complementary nature of its components and its individual tailoring of provider-patient interaction. If the trial interventions are shown to be effective, then it will be reasonable to assess their scalability and applicability in other populations. PMID:28007634
Lisovskiĭ, A A; Pavlinov, I Ia
2008-01-01
Any morphospace is partitioned by the forms of group variation; its structure is described by a set of scalar (range, overlap) and vector (direction) characteristics. These are analyzed quantitatively for sex and age variation in a sample of 200 pine marten skulls described by 14 measurable traits. Standard dispersion and variance-components analyses are employed, accompanied by several resampling methods (randomization and bootstrap); effects of changes in the analysis design on the results of these methods are also considered. The maximum likelihood algorithm of variance-components analysis is shown to give adequate estimates of the portions of particular forms of group variation within the overall disparity. It is quite stable with respect to changes in the analysis design and therefore can be used in explorations of real data with variously unbalanced designs. A new algorithm is elaborated for estimating the co-directionality of particular forms of group variation within the overall disparity, based on angle measures between eigenvectors of covariance matrices of the group-variation effects calculated by dispersion analysis. A null hypothesis of a random portion of a given group variation can be tested by means of randomization of the respective grouping variable. A null hypothesis of equality of both portions and directionalities of different forms of group variation can be tested by means of the bootstrap procedure.
2010-01-01
Background Manual therapy applied to patients with non-specific neck pain has been investigated several times. In the Netherlands, manual therapy as applied according to the Utrecht School of Manual Therapy (MTU) has not been the subject of a randomized controlled trial. MTU differs in diagnosis and treatment from other forms of manual therapy. Methods/Design This is a single-blind randomized controlled trial in patients with sub-acute and chronic non-specific neck pain. Patients whose neck complaints have lasted between two weeks and one year will participate in the trial. 180 participants will be recruited in thirteen primary health care centres in the Netherlands. The experimental group will be treated with MTU during a six-week period. The control group will be treated with physical therapy (standard care, mainly active exercise therapy), also for a period of six weeks. Primary outcomes are Global Perceived Effect (GPE) and functional status (Neck Disability Index (NDI-DV)). Secondary outcomes are neck pain (Numeric Rating Scale (NRS)), EuroQol, costs and quality of life (SF-36). Discussion This paper presents details on the rationale of MTU, design, methods and operational aspects of the trial. Trial registration ClinicalTrials.gov Identifier: NCT00713843 PMID:20096136
Newton, Katherine M; Carpenter, Janet S; Guthrie, Katherine A; Anderson, Garnet L; Caan, Bette; Cohen, Lee S; Ensrud, Kristine E; Freeman, Ellen W; Joffe, Hadine; Sternfeld, Barbara; Reed, Susan D; Sherman, Sheryl; Sammel, Mary D; Kroenke, Kurt; Larson, Joseph C; Lacroix, Andrea Z
2014-01-01
This report describes the Menopausal Strategies: Finding Lasting Answers to Symptoms and Health network and methodological issues addressed in designing and implementing vasomotor symptom trials. Established in response to a National Institutes of Health request for applications, the network was charged with conducting rapid throughput randomized trials of novel and understudied available interventions postulated to alleviate vasomotor and other menopausal symptoms. Included are descriptions of and rationale for criteria used for interventions and study selection, common eligibility and exclusion criteria, common primary and secondary outcome measures, consideration of placebo response, establishment of a biorepository, trial duration, screening and recruitment, statistical methods, and quality control. All trial designs are presented, including the following: (1) a randomized, double-blind, placebo-controlled clinical trial designed to evaluate the effectiveness of the selective serotonin reuptake inhibitor escitalopram in reducing vasomotor symptom frequency and severity; (2) a two-by-three factorial design trial to test three different interventions (yoga, exercise, and ω-3 supplementation) for the improvement of vasomotor symptom frequency and bother; and (3) a three-arm comparative efficacy trial of the serotonin-norepinephrine reuptake inhibitor venlafaxine and low-dose oral estradiol versus placebo for reducing vasomotor symptom frequency. The network's structure and governance are also discussed. The methods used in and the lessons learned from the Menopausal Strategies: Finding Lasting Answers to Symptoms and Health trials are shared to encourage and support the conduct of similar trials and to encourage collaborations with other researchers.
ERIC Educational Resources Information Center
Changeiywo, Johnson M.; Wambugu, P. W.; Wachanga, S. W.
2011-01-01
Teaching method is a major factor that affects students' motivation to learn physics. This study investigated the effects of using mastery learning approach (MLA) on secondary school students' motivation to learn physics. Solomon four non-equivalent control group design under the quasi-experimental research method was used in which a random sample…
ERIC Educational Resources Information Center
Sewasew, Daniel; Mengestle, Missaye; Abate, Gebeyehu
2015-01-01
The aim of this study was to compare PPT and traditional lecture method in material understandability, effectiveness and attitude among university students. Comparative descriptive survey research design was employed to answer the research questions raised. Four hundred and twenty nine participants were selected randomly using stratified sampling…
Program Retrieval/Dissemination: A Solid State Random Access System.
ERIC Educational Resources Information Center
Weeks, Walter O., Jr.
The trend toward greater flexibility in educational methods has led to a need for better and more rapid access to a variety of aural and audiovisual resource materials. This in turn has demanded the development of a flexible, reliable system of hardware designed to aid existing distribution methods in providing such access. The system must be…
Adaptive cluster sampling: An efficient method for assessing inconspicuous species
Andrea M. Silletti; Joan Walker
2003-01-01
Restorationists typically evaluate the success of a project by estimating the population sizes of species that have been planted or seeded. Because a total census is rarely feasible, they must rely on sampling methods for population estimates. However, traditional random sampling designs may be inefficient for species that, for one reason or another, are challenging to...
Mazzucca, Stephanie; Tabak, Rachel G; Pilar, Meagan; Ramsey, Alex T; Baumann, Ana A; Kryzer, Emily; Lewis, Ericka M; Padek, Margaret; Powell, Byron J; Brownson, Ross C
2018-01-01
The need for optimal study designs in dissemination and implementation (D&I) research is increasingly recognized. Despite the wide range of study designs available for D&I research, we lack understanding of the types of designs and methodologies that are routinely used in the field. This review assesses the designs and methodologies in recently proposed D&I studies and provides resources to guide design decisions. We reviewed 404 study protocols published in the journal Implementation Science from 2/2006 to 9/2017. Eligible studies tested the efficacy or effectiveness of D&I strategies (i.e., not effectiveness of the underlying clinical or public health intervention); had a comparison by group and/or time; and used ≥1 quantitative measure. Several design elements were extracted: design category (e.g., randomized); design type [e.g., cluster randomized controlled trial (RCT)]; data type (e.g., quantitative); D&I theoretical framework; levels of treatment assignment, intervention, and measurement; and country in which the research was conducted. Each protocol was double-coded, and discrepancies were resolved through discussion. Of the 404 protocols reviewed, 212 (52%) studies tested one or more implementation strategies across 208 manuscripts, therefore meeting inclusion criteria. Of the included studies, 77% utilized randomized designs, primarily cluster RCTs. The use of alternative designs (e.g., stepped wedge) increased over time. Fewer studies were quasi-experimental (17%) or observational (6%). Many study design categories (e.g., controlled pre-post, matched pair cluster design) were represented by only one or two studies. Most articles proposed quantitative and qualitative methods (61%), with the remaining 39% proposing only quantitative. Half of protocols (52%) reported using a theoretical framework to guide the study.
The four most frequently reported frameworks were the Consolidated Framework for Implementation Research and RE-AIM (n = 16 each), followed by Promoting Action on Research Implementation in Health Services and the Theoretical Domains Framework (n = 12 each). While several novel designs for D&I research have been proposed (e.g., stepped wedge, adaptive designs), the majority of the studies in our sample employed RCT designs. Alternative study designs are increasing in use but may be underutilized for a variety of reasons, including preference of funders or lack of awareness of these designs. Promisingly, the prevalent use of quantitative and qualitative methods together reflects methodological innovation in newer D&I research.
Nagasawa, Shinji; Al-Naamani, Eman; Saeki, Akinori
2018-05-17
Owing to their diverse chemical structures, organic photovoltaic (OPV) applications based on a bulk heterojunction framework have evolved greatly over the last two decades, producing numerous organic semiconductors with improved power conversion efficiencies (PCEs). Despite the recent fast progress in materials informatics and data science, data-driven molecular design of OPV materials remains challenging. We report a screening of conjugated molecules for polymer-fullerene OPV applications by supervised learning methods (artificial neural network (ANN) and random forest (RF)). Approximately 1000 experimental parameters, including PCE, molecular weight, and electronic properties, were manually collected from the literature and subjected to machine learning with digitized chemical structures. In contrast to the low correlation coefficient obtained with the ANN, RF yields an acceptable accuracy, twice that of random classification. We demonstrate the application of RF screening to the design, synthesis, and characterization of a conjugated polymer, facilitating the rapid development of optoelectronic materials.
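As a hedged illustration of the RF screening step (not the authors' pipeline, and using synthetic descriptors rather than their curated OPV dataset), a random forest can be fit to digitized features and then used to rank unseen candidates:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical digitized descriptors: rows are candidate molecules, columns
# stand in for features such as molecular weight or frontier-orbital levels.
X = rng.normal(size=(1000, 8))
# Synthetic "PCE" target with nonlinear structure plus noise (not real data).
y = (2.0 * X[:, 0] - X[:, 1] ** 2 + 0.5 * X[:, 2] * X[:, 3]
     + rng.normal(0.0, 0.3, size=1000))

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X[:800], y[:800])
r2 = model.score(X[800:], y[800:])        # held-out R^2

# Rank unseen candidates by predicted "PCE" for follow-up synthesis.
candidates = rng.normal(size=(50, 8))
ranking = np.argsort(model.predict(candidates))[::-1]
```

The held-out R² plays the role of the paper's accuracy check before any prediction is trusted for prioritizing synthesis.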
Randomizing Roaches: Exploring the "Bugs" of Randomization in Experimental Design
ERIC Educational Resources Information Center
Wagler, Amy; Wagler, Ron
2014-01-01
Understanding the roles of random selection and random assignment in experimental design is a central learning objective in most introductory statistics courses. This article describes an activity, appropriate for a high school or introductory statistics course, designed to teach the concepts, values and pitfalls of random selection and assignment…
Westfall, Jacob; Kenny, David A; Judd, Charles M
2014-10-01
Researchers designing experiments in which a sample of participants responds to a sample of stimuli are faced with difficult questions about optimal study design. The conventional procedures of statistical power analysis fail to provide appropriate answers to these questions because they are based on statistical models in which stimuli are not assumed to be a source of random variation in the data, models that are inappropriate for experiments involving crossed random factors of participants and stimuli. In this article, we present new methods of power analysis for designs with crossed random factors, and we give detailed, practical guidance to psychology researchers planning experiments in which a sample of participants responds to a sample of stimuli. We extensively examine 5 commonly used experimental designs, describe how to estimate statistical power in each, and provide power analysis results based on a reasonable set of default parameter values. We then develop general conclusions and formulate rules of thumb concerning the optimal design of experiments in which a sample of participants responds to a sample of stimuli. We show that in crossed designs, statistical power typically does not approach unity as the number of participants goes to infinity but instead approaches a maximum attainable power value that is possibly small, depending on the stimulus sample. We also consider the statistical merits of designs involving multiple stimulus blocks. Finally, we provide a simple and flexible Web-based power application to aid researchers in planning studies with samples of stimuli.
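The paper's central point lends itself to simulation. The sketch below is an illustrative Monte Carlo (arbitrary parameter values, and a deliberately naive by-subject analysis rather than the authors' mixed-model power formulas) for a design in which every participant responds to every stimulus:

```python
import numpy as np
from scipy import stats

def crossed_power(n_subj, n_stim, d=0.5, sd_subj=0.5, sd_stim=0.5,
                  sd_resid=1.0, n_sims=500, alpha=0.05, seed=0):
    """Monte Carlo rejection rate when participants and stimuli are crossed
    random factors. The analysis is a naive by-subject paired t-test, so
    stimulus variability is NOT properly accounted for."""
    rng = np.random.default_rng(seed)
    half = n_stim // 2
    cond = np.r_[np.zeros(half), np.ones(n_stim - half)]  # stimulus -> condition
    hits = 0
    for _ in range(n_sims):
        subj = rng.normal(0, sd_subj, n_subj)[:, None]    # participant effects
        stim = rng.normal(0, sd_stim, n_stim)[None, :]    # stimulus effects
        y = subj + stim + d * cond + rng.normal(0, sd_resid, (n_subj, n_stim))
        diff = y[:, half:].mean(axis=1) - y[:, :half].mean(axis=1)
        _, p = stats.ttest_1samp(diff, 0.0)
        hits += p < alpha
    return hits / n_sims
```

Because the stimulus sample contributes a shared offset to every participant's condition difference, adding participants alone cannot average it away, which is one way to see why power plateaus below one in crossed designs.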
Zhou, Hanzhi; Elliott, Michael R; Raghunathan, Trivellore E
2016-06-01
Multistage sampling is often employed in survey samples for cost and convenience. However, accounting for clustering features when generating datasets for multiple imputation is a nontrivial task, particularly when, as is often the case, cluster sampling is accompanied by unequal probabilities of selection, necessitating case weights. Thus, multiple imputation often ignores complex sample designs and assumes simple random sampling when generating imputations, even though failing to account for complex sample design features is known to yield biased estimates and confidence intervals that have incorrect nominal coverage. In this article, we extend a recently developed, weighted, finite-population Bayesian bootstrap procedure to generate synthetic populations conditional on complex sample design data that can be treated as simple random samples at the imputation stage, obviating the need to directly model design features for imputation. We develop two forms of this method: one where the probabilities of selection are known at the first and second stages of the design, and the other, more common in public use files, where only the final weight based on the product of the two probabilities is known. We show that this method has advantages in terms of bias, mean square error, and coverage properties over methods where sample designs are ignored, with little loss in efficiency, even when compared with correct fully parametric models. An application is made using the National Automotive Sampling System Crashworthiness Data System, a multistage, unequal probability sample of U.S. passenger vehicle crashes, which suffers from a substantial amount of missing data in "Delta-V," a key crash severity measure.
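The imputation-stage idea can be caricatured in a few lines. The sketch below is a simplified stand-in for the weighted finite-population Bayesian bootstrap (a plain weighted Bayesian-bootstrap resample, not the two-stage Pólya-urn scheme the authors develop); names and values are illustrative:

```python
import numpy as np

def synthetic_population(y, w, N, seed=0):
    """Perturb the design weights with a Dirichlet draw (the Bayesian-
    bootstrap step), then resample a pseudo-population of size N that
    downstream imputation can treat as a simple random sample."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y, dtype=float)
    g = rng.dirichlet(np.ones(len(y)))     # uncertainty about the weights
    p = g * np.asarray(w, dtype=float)
    p /= p.sum()
    return rng.choice(y, size=N, replace=True, p=p)
```

Imputation models fit to such a synthetic population no longer need to model the weights or clustering directly, which is the advantage the abstract describes.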
Zhou, Hanzhi; Elliott, Michael R.; Raghunathan, Trivellore E.
2017-01-01
Multistage sampling is often employed in survey samples for cost and convenience. However, accounting for clustering features when generating datasets for multiple imputation is a nontrivial task, particularly when, as is often the case, cluster sampling is accompanied by unequal probabilities of selection, necessitating case weights. Thus, multiple imputation often ignores complex sample designs and assumes simple random sampling when generating imputations, even though failing to account for complex sample design features is known to yield biased estimates and confidence intervals that have incorrect nominal coverage. In this article, we extend a recently developed, weighted, finite-population Bayesian bootstrap procedure to generate synthetic populations conditional on complex sample design data that can be treated as simple random samples at the imputation stage, obviating the need to directly model design features for imputation. We develop two forms of this method: one where the probabilities of selection are known at the first and second stages of the design, and the other, more common in public use files, where only the final weight based on the product of the two probabilities is known. We show that this method has advantages in terms of bias, mean square error, and coverage properties over methods where sample designs are ignored, with little loss in efficiency, even when compared with correct fully parametric models. An application is made using the National Automotive Sampling System Crashworthiness Data System, a multistage, unequal probability sample of U.S. passenger vehicle crashes, which suffers from a substantial amount of missing data in “Delta-V,” a key crash severity measure. PMID:29226161
Sparse sampling and reconstruction for electron and scanning probe microscope imaging
Anderson, Hyrum; Helms, Jovana; Wheeler, Jason W.; Larson, Kurt W.; Rohrer, Brandon R.
2015-07-28
Systems and methods for conducting electron or scanning probe microscopy are provided herein. In a general embodiment, the systems and methods for conducting electron or scanning probe microscopy with an undersampled data set include: driving an electron beam or probe to scan across a sample and visit a subset of pixel locations of the sample that are randomly or pseudo-randomly designated; determining actual pixel locations on the sample that are visited by the electron beam or probe; and processing data collected by detectors from the visits of the electron beam or probe at the actual pixel locations and recovering a reconstructed image of the sample.
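The undersampling step described above amounts to choosing a random subset of pixel locations for the beam or probe to visit. A minimal sketch of that selection (the reconstruction stage, e.g. compressed-sensing recovery, is omitted; the sampling fraction is an illustrative choice):

```python
import numpy as np

def sample_pixel_subset(h, w, fraction=0.2, seed=0):
    """Choose a pseudo-random subset of (row, col) pixel locations for the
    beam or probe to visit; the remaining pixels are never scanned and are
    filled in later by the reconstruction stage."""
    rng = np.random.default_rng(seed)
    n = int(h * w * fraction)
    flat = rng.choice(h * w, size=n, replace=False)   # unique flat indices
    return np.stack(np.unravel_index(flat, (h, w)), axis=1)
```

In practice the claimed method also records where the beam actually landed, since the visited locations can deviate from the designated ones.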
Freedland, Kenneth E.; Mohr, David C.; Davidson, Karina W.; Schwartz, Joseph E.
2011-01-01
Objective To examine the use of existing practice control groups in randomized controlled trials of behavioral interventions, and the role of extrinsic healthcare services in the design and conduct of behavioral trials. Method Selective qualitative review. Results Extrinsic healthcare services, also known as nonstudy care, have important but under-recognized effects on the design and conduct of behavioral trials. Usual care, treatment as usual, standard of care, and other existing practice control groups pose a variety of methodological and ethical challenges, but they play a vital role in behavioral intervention research. Conclusion This review highlights the need for a scientific consensus statement on control groups in behavioral trials. PMID:21536837
Brantley, Phillip; Appel, Lawrence; Hollis, Jack; Stevens, Victor; Ard, Jamy; Champagne, Catherine; Elmer, Patricia; Harsha, David; Myers, Valerie; Proschan, Michael; Vollmer, William; Svetkey, Laura
2008-01-01
The Weight Loss Maintenance Trial (WLM) is a multi-center, randomized, controlled trial that compares the effects of two 30-month maintenance interventions, i.e., Personal Contact (PC) and Interactive Technology (IT), to a self-directed usual care control group (SD), in overweight or obese individuals who are at high risk for cardiovascular disease. This paper provides an overview of the design, methods, design considerations, and lessons learned from this trial. All participants received a 6-month behavioral weight loss program consisting of weekly group sessions. Participants who lost at least 4 kg were randomized to one of three conditions (PC, IT, or SD). The PC condition provided monthly contacts with an interventionist primarily via telephone and quarterly face-to-face visits. The IT condition provided frequent, individualized contact through a tailored website system. Both the PC and IT maintenance programs encouraged the DASH dietary pattern and employed theory-based behavioral techniques to promote maintenance. Design considerations included choice of study population, frequency and type of intervention visits, and choice of primary outcome. Overweight or obese persons with CVD risk factors were studied. The pros and cons of studying this population while excluding others are presented. We studied intervention contact strategies that made fewer demands on participant time and travel, while providing frequent opportunities for interaction. The primary outcome variable for the trial was change in weight from randomization to end of follow-up (30 months). Limits to generalizability are discussed. Individuals in need of weight loss strategies may have been excluded due to barriers associated with internet use. Other participants may have been excluded secondary to a comorbid condition. This paper highlights the design and methods of WLM and presents discussions of critical issues and lessons learned from the trial.
Kim, Myoung Kwon; Shin, Young Jun
2017-01-01
Background The objective of this study was to investigate the immediate effect on gait function when ankle balance taping is applied to amateur soccer players with lateral ankle sprain. Material/Methods A cross-over randomized design was used. Twenty-two soccer players with an ankle sprain underwent 3 interventions in a random order. Subjects were randomly assigned to ankle balance taping, placebo taping, and no taping groups. The assessment was performed using the GAITRite portable walkway system, which records the location and timing of each footfall during ambulation. Results Significant differences were found in the velocity, step length, stride length, and H-H base support among the 3 different taping methods (p<0.05). The ankle balance taping group showed significantly greater velocity, step length, and stride length in comparison to the placebo and no taping group. The ankle balance taping group showed a statistically significant decrease (p<0.05) in the H-H base support compared to the placebo and no taping groups, and the placebo group showed significantly greater velocity in comparison to the no taping group (p<0.05). Conclusions We conclude that ankle balance taping that uses kinesiology tape instantly increased the walking ability of amateur soccer players with lateral ankle sprain. Therefore, ankle balance taping is a useful alternative to prevent and treat ankle sprain of soccer players. PMID:29158472
Use of simulation to compare the performance of minimization with stratified blocked randomization.
Toorawa, Robert; Adena, Michael; Donovan, Mark; Jones, Steve; Conlon, John
2009-01-01
Minimization is an alternative method to stratified permuted block randomization, which may be more effective at balancing treatments when there are many strata. However, its use in the regulatory setting for industry trials remains controversial, primarily due to the difficulty in interpreting conventional asymptotic statistical tests under restricted methods of treatment allocation. We argue that the use of minimization should be critically evaluated when designing the study for which it is proposed. We demonstrate by example how simulation can be used to investigate whether minimization improves treatment balance compared with stratified randomization, and how much randomness can be incorporated into the minimization before any balance advantage is no longer retained. We also illustrate by example how the performance of the traditional model-based analysis can be assessed, by comparing the nominal test size with the observed test size over a large number of simulations. We recommend that the assignment probability for the minimization be selected using such simulations. Copyright (c) 2008 John Wiley & Sons, Ltd.
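A minimal Pocock-Simon-style minimization sketch, suitable as a starting point for the kind of simulation the abstract recommends (factor names, the assignment probability, and the imbalance metric are all illustrative choices, not the authors' implementation):

```python
import random

def minimize_assign(counts, strata, p=0.8):
    """Prefer the arm that best balances the marginal totals of the new
    patient's factor levels, assigning to it with probability p
    (p < 1 keeps some randomness in the allocation)."""
    # counts[factor][level][arm] = patients already assigned
    imbalance = [sum(counts[f][lvl][arm] for f, lvl in strata.items())
                 for arm in (0, 1)]
    if imbalance[0] == imbalance[1]:
        arm = random.randint(0, 1)
    else:
        preferred = imbalance.index(min(imbalance))
        arm = preferred if random.random() < p else 1 - preferred
    for f, lvl in strata.items():
        counts[f][lvl][arm] += 1
    return arm

# Simulate 200 enrolments over two binary stratification factors.
random.seed(1)
counts = {"sex": {"M": [0, 0], "F": [0, 0]},
          "site": {"A": [0, 0], "B": [0, 0]}}
for _ in range(200):
    patient = {"sex": random.choice(["M", "F"]),
               "site": random.choice(["A", "B"])}
    minimize_assign(counts, patient)
total = [sum(counts["sex"][lvl][arm] for lvl in "MF") for arm in (0, 1)]
```

Running many replicates of this against stratified permuted blocks, and comparing the achieved marginal balance and the observed test size, is exactly the kind of design-stage simulation the abstract argues for.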
Neustifter, Benjamin; Rathbun, Stephen L; Shiffman, Saul
2012-01-01
Ecological Momentary Assessment is an emerging method of data collection in behavioral research. It may be used to capture the times of repeated behavioral events on electronic devices, as well as information on subjects' psychological states through the electronic administration of questionnaires at the event times and at times selected from a probability-based design. A method for fitting a mixed Poisson point process model is proposed for assessing the impact of partially observed, time-varying covariates on the timing of repeated behavioral events. A random frailty is included in the point-process intensity to describe variation among subjects in baseline rates of event occurrence. Covariate coefficients are estimated using estimating equations constructed by replacing the integrated intensity in the Poisson score equations with a design-unbiased estimator. An estimator is also proposed for the variance of the random frailties. Our estimators are robust in the sense that no model assumptions are made regarding the distribution of the time-varying covariates or the distribution of the random effects. However, subject effects are estimated under gamma frailties using an approximate hierarchical likelihood. The proposed approach is illustrated using smoking data.
Bronas, Ulf G; Hirsch, Alan T; Murphy, Timothy; Badenhop, Dalynn; Collins, Tracie C; Ehrman, Jonathan K; Ershow, Abby G; Lewis, Beth; Treat-Jacobson, Diane J; Walsh, M Eileen; Oldenburg, Niki; Regensteiner, Judith G
2009-11-01
The CLaudication: Exercise Vs Endoluminal Revascularization (CLEVER) study is the first randomized, controlled, clinical, multicenter trial that is evaluating a supervised exercise program compared with revascularization procedures to treat claudication. In this report, the methods and dissemination techniques of the supervised exercise training intervention are described. A total of 217 participants are being recruited and randomized to one of three arms: (1) optimal medical care; (2) aortoiliac revascularization with stent; or (3) supervised exercise training. Of the enrolled patients, 84 will receive supervised exercise therapy. Supervised exercise will be administered according to a protocol designed by a central CLEVER exercise training committee, based on validated methods previously used in single-center randomized controlled trials. The protocol will be implemented at each site by an exercise committee member using training methods developed and standardized by the exercise training committee. The exercise training committee reviews each participant's progress and compliance with the protocol weekly. In conclusion, a multicenter approach to disseminate the supervised exercise training technique and to evaluate its efficacy, safety and cost-effectiveness for patients with claudication due to peripheral arterial disease (PAD) is being evaluated for the first time in CLEVER. The CLEVER study will further establish the role of supervised exercise training in the treatment of claudication resulting from PAD and provide standardized methods for use of supervised exercise training in future PAD clinical trials as well as in clinical practice.
2011-01-01
Background Hepatic resection is still associated with significant morbidity. Although the period of parenchymal transection presents a crucial step during the operation, uncertainty persists regarding the optimal technique of transection. It was the aim of the present randomized controlled trial to evaluate the efficacy and safety of hepatic resection using the technique of stapler hepatectomy compared to the simple clamp-crushing technique. Methods/Design The CRUNSH Trial is a prospective randomized controlled single-center trial with a two-group parallel design. Patients scheduled for elective hepatic resection without extrahepatic resection at the Department of General-, Visceral- and Transplantation Surgery, University of Heidelberg are enrolled into the trial and randomized intraoperatively to hepatic resection by the clamp-crushing technique and stapler hepatectomy, respectively. The primary endpoint is total intraoperative blood loss. A set of general and surgical variables are documented as secondary endpoints. Patients and outcome-assessors are blinded for the treatment intervention. Discussion The CRUNSH Trial is the first randomized controlled trial to evaluate efficacy and safety of stapler hepatectomy compared to the clamp-crushing technique for parenchymal transection during elective hepatic resection. Trial Registration ClinicalTrials.gov: NCT01049607 PMID:21888669
Ronaldson, Sarah; Adamson, Joy; Dyson, Lisa; Torgerson, David
2014-10-01
Randomized controlled trials (RCTs) are widely used in health care research to provide high-quality evidence of effectiveness of an intervention. However, sometimes a study does not require an RCT in order to answer its primary objective; a case-finding design may be more appropriate. The aim of this paper was to introduce a new study design that nests a waiting list RCT within a case-finding study. An example of the new study design is the DOC Study, which primarily aims to determine the diagnostic accuracy of lung function tests for chronic obstructive pulmonary disease. It also investigates the impact of lung function tests on smoking behaviour through use of a waiting list design. The first step of the study design is to obtain participants' consent. Individuals are then randomized to one of two groups; either the 'intervention now' group or the 'intervention later' group, that is, participants are placed on a waiting list. All participants receive the same intervention; the only difference between the groups is the timing of the intervention. The design addresses patient preference issues and recruitment issues that can arise in other trial designs. Potential limitations include differential attrition between study groups and potential demoralization for the 'intervention later' group. The 'waiting list case-finding trial' design is a valuable method that could be applied to case-finding studies; the design enables the case-finding component of a study to be maintained while simultaneously exploring additional hypotheses through conducting a trial. © 2014 John Wiley & Sons, Ltd.
The chaotic dynamical aperture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, S.Y.; Tepikian, S.
1985-10-01
Nonlinear magnetic forces become more important for particles in modern large accelerators. These nonlinear elements are introduced either intentionally, to control beam dynamics, or by uncontrollable random errors. Equations of motion in the nonlinear Hamiltonian are usually non-integrable. Because of the nonlinear part of the Hamiltonian, the tune diagram of an accelerator is a jungle, and nonlinear magnet multipoles are important in keeping the accelerator operating point in the safe quarter of this hostile jungle of resonant tunes. Indeed, all modern accelerator designs have taken advantage of nonlinear mechanics. On the other hand, the effect of the uncontrollable random multipoles should be evaluated carefully. A powerful method of studying the effect of these nonlinear multipoles is a particle tracking calculation, in which a group of test particles is traced through the magnetic multipoles of the accelerator for hundreds to millions of turns in order to test the dynamical aperture of the machine. These methods are extremely useful in the design of large accelerators such as the SSC, LEP, HERA and RHIC. Such calculations, unfortunately, take a tremendous amount of computing time. In this paper, we apply existing methods in nonlinear dynamics to study a possible alternative. When the Hamiltonian motion becomes chaotic, the tune of the machine becomes undefined, and the aperture related to the chaotic orbit can be identified as the chaotic dynamical aperture. We review the method of determining chaotic orbits, apply it to nonlinear problems in accelerator physics, and then discuss the scaling properties and the effect of random sextupoles.
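As a toy illustration of the tracking method described above (not the machine studies themselves), the sketch below iterates a Henon-like one-turn map, a linear rotation at tune `nu` followed by a thin sextupole kick, and scans the launch amplitude for the largest initial condition that survives a given number of turns. All parameter values are illustrative assumptions.

```python
import math

def track(x, p, nu=0.252, k=1.0, turns=2000, bound=10.0):
    """One-turn map: linear rotation (tune nu) followed by a thin
    sextupole kick p -> p - k*x**2 (a Henon-like quadratic map).
    Returns the number of turns survived before |x| or |p|
    exceeds `bound`."""
    c, s = math.cos(2 * math.pi * nu), math.sin(2 * math.pi * nu)
    for n in range(turns):
        x, p = c * x + s * p, -s * x + c * p  # linear rotation
        p -= k * x * x                        # sextupole kick
        if abs(x) > bound or abs(p) > bound:
            return n
    return turns

def dynamic_aperture(step=0.01, turns=2000):
    """Largest launch amplitude (with p=0) that survives `turns` turns."""
    a = 0.0
    while track(a + step, 0.0, turns=turns) == turns:
        a += step
    return a
```

Real dynamical-aperture studies track through the full lattice in four or six phase-space dimensions, but this one-degree-of-freedom map already exhibits the stable core and chaotic escape the abstract discusses.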
Experimental studies of two-stage centrifugal dust concentrator
NASA Astrophysics Data System (ADS)
Vechkanova, M. V.; Fadin, Yu M.; Ovsyannikov, Yu G.
2018-03-01
The article presents experimental results for a two-stage centrifugal dust concentrator, describes its design, and outlines the development of an engineering calculation method together with laboratory investigations. For the experiments, the authors used quartz, ceramic dust and slag. Dispersion analysis of the dust particles was obtained by the sedimentation method. To build a mathematical model of the dust-collection process, a central composite rotatable design for a four-factor experiment was used. The sequence of experiments was conducted in accordance with a table of random numbers, and conclusions are drawn.
DAnTE: a statistical tool for quantitative analysis of –omics data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Polpitiya, Ashoka D.; Qian, Weijun; Jaitly, Navdeep
2008-05-03
DAnTE (Data Analysis Tool Extension) is a statistical tool designed to address challenges unique to quantitative bottom-up, shotgun proteomics data. This tool has also been demonstrated for microarray data and can easily be extended to other high-throughput data types. DAnTE features selected normalization methods, missing value imputation algorithms, peptide to protein rollup methods, an extensive array of plotting functions, and a comprehensive ANOVA scheme that can handle unbalanced data and random effects. The Graphical User Interface (GUI) is designed to be very intuitive and user friendly.
Esserman, Denise; Allore, Heather G.; Travison, Thomas G.
2016-01-01
Cluster-randomized clinical trials (CRT) are trials in which the unit of randomization is not a participant but a group (e.g. healthcare systems or community centers). They are suitable when the intervention applies naturally to the cluster (e.g. healthcare policy); when lack of independence among participants may occur (e.g. nursing home hygiene); or when it is most ethical to apply an intervention to all within a group (e.g. school-level immunization). Because participants in the same cluster receive the same intervention, CRT may approximate clinical practice, and may produce generalizable findings. However, when not properly designed or interpreted, CRT may produce biased results. CRT designs have features that add complexity to statistical estimation and inference. Chief among these is the cluster-level correlation in response measurements induced by the randomization. A critical consideration is the experimental unit of inference; often it is desirable to consider intervention effects at the level of the individual rather than the cluster. Finally, given that the number of clusters available may be limited, simple forms of randomization may not achieve balance between intervention and control arms at either the cluster- or participant-level. In non-clustered clinical trials, balance of key factors may be easier to achieve because the sample can be homogenous by exclusion of participants with multiple chronic conditions (MCC). CRTs, which are often pragmatic, may eschew such restrictions. Failure to account for imbalance may introduce bias and reduce validity. This article focuses on the complexities of randomization in the design of CRTs, such as the inclusion of patients with MCC, and imbalances in covariate factors across clusters. PMID:27478520
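The cluster-level correlation described above is usually summarised by the intracluster correlation coefficient (ICC), and its sample-size cost by a design effect. The sketch below is a generic calculation, not taken from this article; the unequal-cluster-size adjustment via the coefficient of variation `cv` is one common choice and is an assumption here.

```python
import math

def design_effect(m_bar, icc, cv=0.0):
    """Design effect for a cluster randomized trial.

    m_bar: average cluster size; icc: intracluster correlation
    coefficient; cv: coefficient of variation of cluster sizes
    (0 = equal sizes). With unequal sizes this uses the common
    adjustment 1 + ((cv**2 + 1) * m_bar - 1) * icc, which reduces
    to the standard 1 + (m_bar - 1) * icc when cv = 0.
    """
    return 1 + ((cv**2 + 1) * m_bar - 1) * icc

def clusters_needed(n_individual, m_bar, icc, cv=0.0):
    """Clusters per arm, given the per-arm sample size required
    under individual randomization."""
    return math.ceil(n_individual * design_effect(m_bar, icc, cv) / m_bar)
```

For example, with an average cluster size of 20 and an ICC of 0.05, a trial needing 200 participants per arm under individual randomization needs 200 x 1.95 = 390 participants, i.e. 20 clusters per arm.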
Estimating peer effects in networks with peer encouragement designs.
Eckles, Dean; Kizilcec, René F; Bakshy, Eytan
2016-07-05
Peer effects, in which the behavior of an individual is affected by the behavior of their peers, are central to social science. Because peer effects are often confounded with homophily and common external causes, recent work has used randomized experiments to estimate effects of specific peer behaviors. These experiments have often relied on the experimenter being able to randomly modulate mechanisms by which peer behavior is transmitted to a focal individual. We describe experimental designs that instead randomly assign individuals' peers to encouragements to behaviors that directly affect those individuals. We illustrate this method with a large peer encouragement design on Facebook for estimating the effects of receiving feedback from peers on posts shared by focal individuals. We find evidence for substantial effects of receiving marginal feedback on multiple behaviors, including giving feedback to others and continued posting. These findings provide experimental evidence for the role of behaviors directed at specific individuals in the adoption and continued use of communication technologies. In comparison, observational estimates differ substantially, both underestimating and overestimating effects, suggesting that researchers and policy makers should be cautious in relying on them.
2014-01-01
Background Bipolar I disorder (BD-I) is a chronic mental illness characterized by the presence of one or more manic episodes, or both depressive and manic episodes, usually separated by asymptomatic intervals. Pharmacists can contribute to the management of BD-I, mainly with the use of effective and safe drugs, and improve the patient’s quality of life through pharmaceutical care. Some studies have shown the effect of pharmaceutical care on the achievement of therapeutic goals in different illnesses; however, to our knowledge, there is a lack of randomized controlled trials designed to assess the effect of pharmacist intervention in patients with BD. The aim of this study is to assess the effectiveness of the Dader Method for pharmaceutical care in patients with BD-I. Methods/design A randomized, controlled, prospective, single-center clinical trial with a duration of 12 months will be performed to compare the effect of the Dader Method of pharmaceutical care with the usual care process of patients in a psychiatric clinic. Patients diagnosed with BD-I aged between 18 and 65 years who have been discharged or referred from the outpatient service of the San Juan de Dios Clinic (Antioquia, Colombia) will be included. Patients will be randomized into the intervention group, who will receive pharmaceutical care provided by pharmacists working in collaboration with psychiatrists, or into the control group, who will receive usual care and verbal-written counseling regarding BD. Study outcomes will be assessed at baseline and at 3, 6, 9, and 12 months after randomization. The primary outcome will be the number of hospitalizations, emergency service consultations, and unscheduled outpatient visits. Effectiveness, safety, adherence, and quality of life will be assessed as secondary outcomes. Statistical analyses will be performed using two-tailed McNemar tests, Pearson chi-square tests, and Student’s t-tests; a P value <0.05 will be considered statistically significant.
Discussion As far as we know, this is the first randomized controlled trial to assess the effect of the Dader Method for pharmaceutical care in patients with BD-I and it could generate valuable information and recommendations about the role of pharmacists in the improvement of therapeutic goals, solution of drug-related problems, and adherence. Trial registration Registration number NCT01750255 on August 6, 2012. First patient randomized on 24 November 2011. PMID:24885673
Finite-time stability of neutral-type neural networks with random time-varying delays
NASA Astrophysics Data System (ADS)
Ali, M. Syed; Saravanan, S.; Zhu, Quanxin
2017-11-01
This paper is devoted to the finite-time stability analysis of neutral-type neural networks with random time-varying delays, where the randomly time-varying delays are characterised by a Bernoulli stochastic variable. The results can be extended to the analysis and design of neutral-type neural networks with random time-varying delays. By constructing a suitable Lyapunov-Krasovskii functional, we establish a set of sufficient conditions, in the form of linear matrix inequalities, that guarantee the finite-time stability of the system concerned. The proposed conditions are derived by employing Jensen's inequality, the free-weighting matrix method and Wirtinger's double integral inequality, and two numerical examples are provided to demonstrate the effectiveness of the developed techniques.
A Random Forest-based ensemble method for activity recognition.
Feng, Zengtao; Mo, Lingfei; Li, Meng
2015-01-01
This paper presents a multi-sensor ensemble approach to human physical activity (PA) recognition using random forests. We designed an ensemble learning algorithm that integrates several independent Random Forest classifiers, each based on a different sensor feature set, to build a more stable, more accurate and faster classifier for human activity recognition. To evaluate the algorithm, PA data collected from PAMAP (Physical Activity Monitoring for Aging People), a standard, publicly available database, were used for training and testing. The experimental results show that the algorithm correctly recognizes 19 PA types with an accuracy of 93.44%, while training is faster than with comparable methods. The ensemble classifier system based on the RF (Random Forest) algorithm achieves high recognition accuracy and fast calculation.
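The paper's base learners are full random forests trained per sensor feature set; the sketch below substitutes one-feature threshold stumps to keep the example dependency-free, but the ensemble structure, independent models on different feature subsets combined by majority vote, follows the idea described above. All data and names are illustrative.

```python
from collections import Counter

def train_stump(X, y, feat):
    """Fit a one-feature threshold classifier (a dependency-free
    stand-in for a full random forest) on feature index `feat`."""
    best = None
    for t in sorted({row[feat] for row in X}):
        for lo, hi in ((0, 1), (1, 0)):
            pred = [hi if row[feat] > t else lo for row in X]
            acc = sum(p == yy for p, yy in zip(pred, y))
            if best is None or acc > best[0]:
                best = (acc, t, lo, hi)
    _, t, lo, hi = best
    return lambda row: hi if row[feat] > t else lo

def ensemble_predict(models, row):
    """Majority vote across the per-feature-set models."""
    votes = Counter(m(row) for m in models)
    return votes.most_common(1)[0][0]
```

With informative features outvoting a noisy one, the vote recovers the correct label even when one base model errs; swapping `train_stump` for a real forest trained on each sensor's feature set gives the architecture the abstract describes.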
2013-01-01
Background Youth with serious mental illness may experience improved psychiatric stability with second generation antipsychotic (SGA) medication treatment, but unfortunately may also experience unhealthy weight gain adverse events. Research on weight loss strategies for youth who require ongoing antipsychotic treatment is quite limited. The purpose of this paper is to present the design, methods, and rationale of the Improving Metabolic Parameters in Antipsychotic Child Treatment (IMPACT) study, a federally funded, randomized trial comparing two pharmacologic strategies against a control condition to manage SGA-related weight gain. Methods The design and methodology considerations of the IMPACT trial are described and embedded in a description of health risks associated with antipsychotic-related weight gain and the limitations of currently available research. Results The IMPACT study is a 4-site, six month, randomized, open-label, clinical trial of overweight/obese youth ages 8–19 years with pediatric schizophrenia-spectrum and bipolar-spectrum disorders, psychotic or non-psychotic major depressive disorder, or irritability associated with autistic disorder. Youth who have experienced clinically significant weight gain during antipsychotic treatment in the past 3 years are randomized to either (1) switch antipsychotic plus healthy lifestyle education (HLE); (2) add metformin plus HLE; or (3) HLE with no medication change. The primary aim is to compare weight change (body mass index z-scores) for each pharmacologic intervention with the control condition. Key secondary assessments include percentage body fat, insulin resistance, lipid profile, psychiatric symptom stability (monitored independently by the pharmacotherapist and a blinded evaluator), and all-cause and specific cause discontinuation. This study is ongoing, and the targeted sample size is 132 youth. 
Conclusion Antipsychotic-related weight gain is an important public health issue for youth requiring ongoing antipsychotic treatment to maintain psychiatric stability. The IMPACT study provides a model for pediatric research on adverse event management using state-of-the art methods. The results of this study will provide needed data on risks and benefits of two pharmacologic interventions that are already being used in pediatric clinical settings but that have not yet been compared directly in randomized trials. Trial registration Clinical Trials.gov NCT00806234 PMID:23947389
Treatment of early-onset schizophrenia spectrum disorders (TEOSS): rationale, design, and methods.
McClellan, Jon; Sikich, Linmarie; Findling, Robert L; Frazier, Jean A; Vitiello, Benedetto; Hlastala, Stefanie A; Williams, Emily; Ambler, Denisse; Hunt-Harrison, Tyehimba; Maloney, Ann E; Ritz, Louise; Anderson, Robert; Hamer, Robert M; Lieberman, Jeffrey A
2007-08-01
The Treatment of Early Onset Schizophrenia Spectrum Disorders Study is a publicly funded clinical trial designed to compare the therapeutic benefits, safety, and tolerability of risperidone, olanzapine, and molindone in youths with early-onset schizophrenia spectrum disorders. The rationale, design, and methods of the Treatment of Early Onset Schizophrenia Spectrum Disorders Study are described. Using a randomized, double-blind, parallel-group design at four sites, youths with EOSS (ages 8-19 years) were assigned to an 8-week acute trial of risperidone (0.5-6.0 mg/day), olanzapine (2.5-20 mg/day), or molindone (10-140 mg/day). Responders continued double-blind treatment for 44 weeks. The primary outcome measure was responder status at 8 weeks, defined by a 20% reduction in baseline Positive and Negative Symptom Scale scores plus ratings of significant improvement on the Clinical Global Impressions. Secondary outcome measures included assessments of psychopathology, functional impairment, quality of life, and medication safety. An intent-to-treat analytic plan was used. From February 2002 to May 2006, 476 youths were screened, 173 were further evaluated, and 119 were randomized. Several significant study modifications were required to address safety, the use of adjunctive medications, and the termination of the olanzapine treatment arm due to weight gain. The Treatment of Early Onset Schizophrenia Spectrum Disorders Study will inform clinical practice regarding the use of antipsychotic medications for youths with early-onset schizophrenia spectrum disorders. Important safety concerns emerged during the study, including higher than anticipated rates of suicidality and problems tapering thymoleptic agents before randomization.
Taguchi method of experimental design in materials education
NASA Technical Reports Server (NTRS)
Weiser, Martin W.
1993-01-01
Some of the advantages and disadvantages of the Taguchi Method of experimental design as applied to Materials Science will be discussed. This is a fractional factorial method that employs the minimum number of experimental trials for the information obtained. The analysis is also very simple to use and teach, which is quite advantageous in the classroom. In addition, the Taguchi loss function can be easily incorporated to emphasize that improvements in reproducibility are often at least as important as optimization of the response. The disadvantages of the Taguchi Method include the fact that factor interactions are normally not accounted for, there are zero degrees of freedom if all of the possible factors are used, and randomization is normally not used to prevent environmental biasing. In spite of these disadvantages it is felt that the Taguchi Method is extremely useful for both teaching experimental design and as a research tool, as will be shown with a number of brief examples.
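As a concrete classroom-scale example (hypothetical, not from the article), the smallest Taguchi design, the L4 orthogonal array for three two-level factors, and a larger-is-better signal-to-noise ratio can be computed as follows:

```python
import math

# L4 orthogonal array: 4 runs, 3 two-level factors (levels 0/1).
# Every pair of factors sees each level combination equally often.
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

def main_effects(responses):
    """Average response at each level of each factor; `responses`
    holds one measurement per L4 run, in array order."""
    effects = []
    for f in range(3):
        lvl = {0: [], 1: []}
        for run, y in zip(L4, responses):
            lvl[run[f]].append(y)
        effects.append((sum(lvl[0]) / 2, sum(lvl[1]) / 2))
    return effects

def sn_larger_is_better(ys):
    """Taguchi signal-to-noise ratio (dB) for a larger-is-better
    response, computed from replicate measurements `ys`."""
    return -10 * math.log10(sum(1 / y**2 for y in ys) / len(ys))
```

Four runs estimate all three main effects, illustrating the "minimum number of experimental trials" point; note also that this array leaves no degrees of freedom for interactions, the limitation mentioned above.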
Musk as a Pheromone? Didactic Exercise.
ERIC Educational Resources Information Center
Bersted, Chris T.
A classroom/laboratory exercise has been used to introduce college students to factorial research designs, differentiate between interpretations for experimental and quasi-experimental variables, and exemplify application of laboratory research methods to test practical questions (advertising claims). The exercise involves having randomly divided…
Xu, Chonggang; Gertner, George
2013-01-01
Fourier Amplitude Sensitivity Test (FAST) is one of the most popular uncertainty and sensitivity analysis techniques. It uses a periodic sampling approach and a Fourier transformation to decompose the variance of a model output into partial variances contributed by different model parameters. Until now, the FAST analysis is mainly confined to the estimation of partial variances contributed by the main effects of model parameters, but does not allow for those contributed by specific interactions among parameters. In this paper, we theoretically show that FAST analysis can be used to estimate partial variances contributed by both main effects and interaction effects of model parameters using different sampling approaches (i.e., traditional search-curve based sampling, simple random sampling and random balance design sampling). We also analytically calculate the potential errors and biases in the estimation of partial variances. Hypothesis tests are constructed to reduce the effect of sampling errors on the estimation of partial variances. Our results show that compared to simple random sampling and random balance design sampling, sensitivity indices (ratios of partial variances to variance of a specific model output) estimated by search-curve based sampling generally have higher precision but larger underestimations. Compared to simple random sampling, random balance design sampling generally provides higher estimation precision for partial variances contributed by the main effects of parameters. The theoretical derivation of partial variances contributed by higher-order interactions and the calculation of their corresponding estimation errors in different sampling schemes can help us better understand the FAST method and provide a fundamental basis for FAST applications and further improvements. PMID:24143037
Xiong, Chengjie; van Belle, Gerald; Miller, J Philip; Morris, John C
2011-02-01
Therapeutic trials of disease-modifying agents for Alzheimer's disease (AD) require novel designs and analyses involving a switch of treatments for at least a portion of the subjects enrolled. Randomized start and randomized withdrawal designs are two examples of such designs. Crucial design parameters such as sample size and the time of treatment switch are important to understand in designing such clinical trials. The purpose of this article is to provide methods to determine sample sizes and the time of treatment switch, as well as optimum statistical tests of treatment efficacy, for clinical trials of disease-modifying agents for AD. A general linear mixed effects model is proposed to test the disease-modifying efficacy of novel therapeutic agents. This model links the longitudinal growth from both the placebo arm and the treatment arm at the time of treatment switch for those in the delayed treatment arm or early withdrawal arm, and incorporates the potential correlation in the rate of cognitive change before and after the treatment switch. Sample sizes and the optimum time for treatment switch, as well as the optimum test statistic for treatment efficacy, are determined according to the model. Assuming an evenly spaced longitudinal design over a fixed duration, the optimum treatment switching time in a randomized start or a randomized withdrawal trial is halfway through the trial. With the optimum test statistic for treatment efficacy, and over a wide spectrum of model parameters, the optimum sample size allocations are fairly close to the simplest design with a sample size ratio of 1:1:1 among the treatment arm, the delayed treatment or early withdrawal arm, and the placebo arm.
The application of the proposed methodology to AD provides evidence that much larger sample sizes are required to adequately power disease-modifying trials when compared with those for symptomatic agents, even when the treatment switch time and efficacy test are optimally chosen. The proposed method assumes that the only and immediate effect of treatment switch is on the rate of cognitive change. Crucial design parameters for clinical trials of disease-modifying agents on AD can thus be optimally chosen. Government and industry officials as well as academic researchers should consider the optimum use of clinical trial designs for disease-modifying agents on AD in their effort to search for treatments with the potential to modify the underlying pathophysiology of AD.
Samavat, Hamed; Dostal, Allison M.; Wang, Renwei; Bedell, Sarah; Emory, Tim H.; Ursin, Giske; Torkelson, Carolyn J.; Gross, Myron D.; Le, Chap T.; Yu, Mimi C.; Yang, Chung S.; Yee, Douglas; Wu, Anna H.; Yuan, Jian-Min; Kurzer, Mindy S.
2015-01-01
Purpose The Minnesota Green Tea Trial (MGTT) was a randomized, placebo-controlled, double-blinded trial investigating the effect of daily green tea extract consumption for 12 months on biomarkers of breast cancer risk. Methods Participants were healthy postmenopausal women at high risk of breast cancer due to dense breast tissue, with differing catechol-O-methyltransferase (COMT) genotypes. The intervention was a green tea catechin extract containing 843.0 ± 44.0 mg/day epigallocatechin gallate or placebo capsules for one year. Annual digital screening mammograms were obtained at baseline and month 12, and fasting blood and 24-hour urine samples were provided at baseline and months 6 and 12. Primary endpoints included changes in percent mammographic density, circulating endogenous sex hormones, and insulin-like growth factor axis proteins; secondary endpoints were changes in urinary estrogens and estrogen metabolites and circulating F2-isoprostanes, a biomarker of oxidative stress. Results The MGTT screened more than 100,000 mammograms and randomized 1075 participants to treatment (green tea extract vs. placebo), stratified by COMT genotype activity (high COMT vs. low/intermediate COMT genotype activity). In total, 937 women successfully completed the study and 138 dropped out (overall dropout rate = 12.8%). Conclusions In this paper we report the rationale, design, recruitment, participant characteristics, and methods for biomarker and statistical analyses. PMID:26206423
ERIC Educational Resources Information Center
Xu, Zeyu; Nichols, Austin
2010-01-01
The gold standard in making causal inference on program effects is a randomized trial. Most randomization designs in education randomize classrooms or schools rather than individual students. Such "clustered randomization" designs have one principal drawback: They tend to have limited statistical power or precision. This study aims to…
47 CFR 1.1602 - Designation for random selection.
Code of Federal Regulations, 2010 CFR
2010-10-01
Title 47 (Telecommunication), Part 1, edition of 2010-10-01: FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures, § 1.1602 Designation for random selection.
47 CFR 1.1602 - Designation for random selection.
Code of Federal Regulations, 2011 CFR
2011-10-01
Title 47 (Telecommunication), Part 1, edition of 2011-10-01: FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures, § 1.1602 Designation for random selection.
Auxiliary Parameter MCMC for Exponential Random Graph Models
NASA Astrophysics Data System (ADS)
Byshkin, Maksym; Stivala, Alex; Mira, Antonietta; Krause, Rolf; Robins, Garry; Lomi, Alessandro
2016-11-01
Exponential random graph models (ERGMs) are a well-established family of statistical models for analyzing social networks. Computational complexity has so far limited the appeal of ERGMs for the analysis of large social networks. Efficient computational methods are highly desirable in order to extend the empirical scope of ERGMs. In this paper we report results of a research project on the development of snowball sampling methods for ERGMs. We propose an auxiliary parameter Markov chain Monte Carlo (MCMC) algorithm for sampling from the relevant probability distributions. The method is designed to decrease the number of allowed network states without worsening the mixing of the Markov chains, and suggests a new approach for the developments of MCMC samplers for ERGMs. We demonstrate the method on both simulated and actual (empirical) network data and show that it reduces CPU time for parameter estimation by an order of magnitude compared to current MCMC methods.
Weight Control Intervention for Truck Drivers: The SHIFT Randomized Controlled Trial, United States
Wipfli, Brad; Thompson, Sharon V.; Elliot, Diane L.; Anger, W. Kent; Bodner, Todd; Hammer, Leslie B.; Perrin, Nancy A.
2016-01-01
Objectives. To evaluate the effectiveness of the Safety and Health Involvement For Truckers (SHIFT) intervention with a randomized controlled design. Methods. The multicomponent intervention was a weight-loss competition supported with body weight and behavioral self-monitoring, computer-based training, and motivational interviewing. We evaluated intervention effectiveness with a cluster-randomized design involving 22 terminals from 5 companies in the United States from 2012 to 2014. Companies were required to provide interstate transportation services and operate at least 2 larger terminals. We randomly assigned terminals to intervention or usual practice control conditions. We assessed participating drivers (n = 452) at baseline and 6 months. Results. In an intent-to-treat analysis, the postintervention difference between groups in mean body mass index change was 1.00 kg/m² (P < .001; intervention = −0.73; control = +0.27). Behavioral changes included statistically significant improvements in fruit and vegetable consumption and physical activity. Conclusions. Results establish the effectiveness of a multicomponent and remotely administered intervention for producing significant weight loss among commercial truck drivers. PMID:27463067
NASA Astrophysics Data System (ADS)
Narimani, M.; Sadeghieh Ahari, S.; Rajabi, S.
This research aims to determine and compare the efficacy of two therapeutic methods, Eye Movement Desensitization and Reprocessing (EMDR) and Cognitive Behavioral Therapy (CBT), for reducing anxiety and depression in Iranian combatants afflicted with Posttraumatic Stress Disorder (PTSD) after the imposed war. The statistical population of the current study comprises combatants afflicted with PTSD who were hospitalized in the Isar Hospital of Ardabil province or resided in Ardabil. These persons were selected through simple random sampling and randomly allocated to three groups. The study used an extended experimental method with a multi-group test-retest design; the Hospital Anxiety and Depression Scale was the measurement tool. The survey showed that both EMDR and CBT produced significant reductions in anxiety and depression.
Aiello, Allison E.; Simanek, Amanda M.; Eisenberg, Marisa C.; Walsh, Alison R.; Davis, Brian; Volz, Erik; Cheng, Caroline; Rainey, Jeanette J.; Uzicanin, Amra; Gao, Hongjiang; Osgood, Nathaniel; Knowles, Dylan; Stanley, Kevin; Tarter, Kara; Monto, Arnold S.
2016-01-01
Background Social networks are increasingly recognized as important points of intervention, yet relatively few intervention studies of respiratory infection transmission have utilized a network design. Here we describe the design, methods, and social network structure of a randomized intervention for isolating respiratory infection cases in a university setting over a 10-week period. Methodology/Principal Findings 590 students in six residence halls enrolled in the eX-FLU study during a chain-referral recruitment process from September 2012-January 2013. Of these, 262 joined as “seed” participants, who nominated their social contacts to join the study, of which 328 “nominees” enrolled. Participants were cluster-randomized by 117 residence halls. Participants were asked to respond to weekly surveys on health behaviors, social interactions, and influenza-like illness (ILI) symptoms. Participants were randomized to either a 3-Day dorm room isolation intervention or a control group (no isolation) upon illness onset. ILI cases reported on their isolation behavior during illness and provided throat and nasal swab specimens at onset, day-three, and day-six of illness. A subsample of individuals (N=103) participated in a sub-study using a novel smartphone application, iEpi, which collected sensor and contextually-dependent survey data on social interactions. Within the social network, participants were significantly positively assortative by intervention group, enrollment type, residence hall, iEpi participation, age, gender, race, and alcohol use (all P<0.002). Conclusions/Significance We identified a feasible study design for testing the impact of isolation from social networks in a university setting. These data provide an unparalleled opportunity to address questions about isolation and infection transmission, as well as insights into social networks and behaviors among college-aged students. 
Several important lessons were learned over the course of this project, including feasible isolation durations, the need for extensive organizational efforts, as well as the need for specialized programmers and server space for managing survey and smartphone data. PMID:27266848
The Effectiveness of Circular Equating as a Criterion for Evaluating Equating.
ERIC Educational Resources Information Center
Wang, Tianyou; Hanson, Bradley A.; Harris, Deborah J.
Equating a test form to itself through a chain of equatings, commonly referred to as circular equating, has been widely used as a criterion to evaluate the adequacy of equating. This paper uses both analytical methods and simulation methods to show that this criterion is in general invalid in serving this purpose. For the random groups design done…
ERIC Educational Resources Information Center
Abdu-Raheem, B. O.
2012-01-01
This study investigated the effects of problem-solving method of teaching on secondary school students' achievement and retention in Social Studies. The study adopted the quasi-experimental, pre-test, post-test, control group design. The sample for the study consisted of 240 Junior Secondary School Class II students randomly selected from six…
ERIC Educational Resources Information Center
Huang, SuHua
2015-01-01
A mixed-method embedded research design was employed to investigate the effectiveness of the integration of technology for second-grade students' vocabulary development and learning. Two second-grade classes with a total of 40 students (21 boys and 19 girls) were randomly selected to participate in this study for the course of a semester. One…
A Test of a Method of Increasing Patient Question Asking in Physician-Patient Interactions.
ERIC Educational Resources Information Center
Feeser, Teresa; Thompson, Teresa L.
A study examined the effectiveness of a method designed to increase active patient involvement in the health care context. Subjects, 38 patients visiting a three-physician dermatology practice one randomly selected morning, were asked to fill out a survey at the end of their visit. Half of the subjects were asked to read a "communication…
NASA Astrophysics Data System (ADS)
El Sachat, Alexandros; Meristoudi, Anastasia; Markos, Christos; Pispas, Stergios; Riziotis, Christos
2014-03-01
A low-cost, low-complexity optical method for protein detection is presented, employing a detection scheme based on electrostatic interactions and implemented by sensitizing a polymer optical fiber (POF) surface with thin overlayers of properly designed, sensitive copolymer materials carrying predesigned charges. The method enables fast detection of proteins with a charge opposite to that of the overlayer, as well as effective discrimination between differently charged proteins such as lysozyme (LYS) and bovine serum albumin (BSA). The sensing materials were block and random copolymers of the same monomers, namely the block copolymer poly(styrene-b-2vinylpyridine) (PS-b-P2VP) and the corresponding random copolymer poly(styrene-r-2vinylpyridine) (PS-r-P2VP), of similar composition and molecular weight. Results show a systematically different response between the block and the random copolymers, although of the same order of magnitude, allowing important conclusions about the techno-economic aspects of their application, given their significantly different manufacturing methods and costs. The use of the POF platform, in combination with these adaptable copolymer sensing materials, could lead to efficient low-cost bio-detection schemes.
Group versus individual family planning counseling in Ghana: a randomized, noninferiority trial.
Schwandt, Hilary M; Creanga, Andreea A; Danso, Kwabena A; Adanu, Richard M K; Agbenyega, Tsiri; Hindin, Michelle J
2013-08-01
Group, rather than individual, family planning counseling has the potential to increase family planning knowledge and use through more efficient use of limited human resources. A randomized, noninferiority study design was utilized to identify whether group family planning counseling is as effective as individual family planning counseling in Ghana. Female gynecology patients were enrolled from two teaching hospitals in Ghana in June and July 2008. Patients were randomized to receive either group or individual family planning counseling. The primary outcome in this study was change in modern contraceptive method knowledge. Changes in family planning use intention before and after the intervention and intended method type were also explored. Comparisons between the two study arms suggest that randomization was successful. The difference in change in modern contraceptive methods known from baseline to follow-up between the two study arms (group minus individual), adjusted for study site, was -0.21 (95% confidence interval: -0.53 to 0.12), suggesting no difference between the two arms. Group family planning counseling was as effective as individual family planning counseling in increasing modern contraceptive knowledge among female gynecology patients in Ghana. Copyright © 2013 Elsevier Inc. All rights reserved.
Review of Recent Methodological Developments in Group-Randomized Trials: Part 1—Design
Li, Fan; Gallis, John A.; Prague, Melanie; Murray, David M.
2017-01-01
In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have highlighted the developments of the past 13 years in design with a companion article to focus on developments in analysis. As a pair, these articles update the 2004 review. We have discussed developments in the topics of the earlier review (e.g., clustering, matching, and individually randomized group-treatment trials) and in new topics, including constrained randomization and a range of randomized designs that are alternatives to the standard parallel-arm GRT. These include the stepped-wedge GRT, the pseudocluster randomized trial, and the network-randomized GRT, which, like the parallel-arm GRT, require clustering to be accounted for in both their design and analysis. PMID:28426295
2012-01-01
Background Optimization of the clinical care process by integration of evidence-based knowledge is one of the active components in care pathways. When studying the impact of a care pathway by using a cluster-randomized design, standardization of the care pathway intervention is crucial. This methodology paper describes the development of the clinical content of an evidence-based care pathway for in-hospital management of chronic obstructive pulmonary disease (COPD) exacerbation in the context of a cluster-randomized controlled trial (cRCT) on care pathway effectiveness. Methods The clinical content of a care pathway for COPD exacerbation was developed based on recognized process design and guideline development methods. Subsequently, based on the COPD case study, a generalized eight-step method was designed to support the development of the clinical content of an evidence-based care pathway. Results A set of 38 evidence-based key interventions and a set of 24 process and 15 outcome indicators were developed in eight different steps. Nine Belgian multidisciplinary teams piloted both the set of key interventions and indicators. The key intervention set was judged by the teams as being valid and clinically applicable. In addition, the pilot study showed that the indicators were feasible for the involved clinicians and patients. Conclusions The set of 38 key interventions and the set of process and outcome indicators were found to be appropriate for the development and standardization of the clinical content of the COPD care pathway in the context of a cRCT on pathway effectiveness. The developed eight-step method may facilitate multidisciplinary teams caring for other patient populations in designing the clinical content of their future care pathways. PMID:23190552
Design of the algorithm of photons migration in the multilayer skin structure
NASA Astrophysics Data System (ADS)
Bulykina, Anastasiia B.; Ryzhova, Victoria A.; Korotaev, Valery V.; Samokhin, Nikita Y.
2017-06-01
The design of approaches and methods for diagnosing oncological diseases is of special significance, as it allows tumors of any kind to be detected at early stages. The development of optical and laser technologies has increased the number of methods available for diagnostic studies of oncological diseases. A promising area of biomedical diagnostics is the development of automated nondestructive testing systems for studying the polarizing properties of skin based on the detection of backscattered radiation. Characterizing the polarizing properties of the examined tissue allows the study of structural changes caused by various pathologies. Consequently, measurement and analysis of the polarizing properties of scattered optical radiation are relevant for developing methods for in vivo diagnosis and imaging of skin. The purpose of this research is to design an algorithm for photon migration in a multilayer skin structure. The algorithm is based on the Monte Carlo method, implemented as a tracking of photon paths: photons experience random discrete direction changes until they leave the analyzed area or their intensity decreases to a negligible level. The modeling algorithm consists of generating the medium and source characteristics; generating a photon with spatial coordinates and polar and azimuthal angles; calculating the photon weight reduction due to specular and diffuse reflection; determining the photon mean free path; determining the photon's direction after random scattering using a Henyey-Greenstein phase function; and calculating the medium's absorption. Biological tissue is modeled as a homogeneous scattering sheet characterized by absorption, scattering, and anisotropy coefficients.
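The core sampling steps of such a photon-migration Monte Carlo can be sketched compactly. This is a minimal illustrative sketch, not the authors' implementation: the optical coefficients (mu_a, mu_s, anisotropy g) are hypothetical values, and the walk is reduced to a single depth coordinate in a homogeneous slab.

```python
import math
import random

def sample_hg_cos_theta(g, rng):
    """Sample cos(theta) from the Henyey-Greenstein phase function."""
    xi = rng.random()
    if abs(g) < 1e-6:               # isotropic scattering limit
        return 2.0 * xi - 1.0
    frac = (1.0 - g * g) / (1.0 - g + 2.0 * g * xi)
    return (1.0 + g * g - frac * frac) / (2.0 * g)

def propagate_photon(mu_a, mu_s, g, slab_depth, rng):
    """Random-walk one photon's depth until it escapes the slab or its
    weight becomes negligible; return the fraction of weight absorbed."""
    mu_t = mu_a + mu_s
    albedo = mu_s / mu_t
    depth, cos_t, weight, absorbed = 0.0, 1.0, 1.0, 0.0
    while weight > 1e-4:
        step = -math.log(rng.random()) / mu_t   # free path ~ Exp(mu_t)
        depth += step * cos_t
        if depth < 0.0 or depth > slab_depth:   # photon escaped the slab
            break
        absorbed += weight * (1.0 - albedo)     # deposit absorbed fraction
        weight *= albedo                        # scattering survival weight
        cos_t = sample_hg_cos_theta(g, rng)     # new direction (depth projection)
    return absorbed

# Hypothetical tissue-like coefficients: mu_a = 0.1, mu_s = 10 (per mm), g = 0.9
rng = random.Random(42)
mean_absorbed = sum(propagate_photon(0.1, 10.0, 0.9, 2.0, rng)
                    for _ in range(2000)) / 2000
```

Averaging over many photons, as above, estimates the absorbed fraction in the slab; a full implementation would track all three spatial coordinates and layer-dependent coefficients.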
Comparison of Response Surface and Kriging Models for Multidisciplinary Design Optimization
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Korte, John J.; Mauery, Timothy M.; Mistree, Farrokh
1998-01-01
In this paper, we compare and contrast the use of second-order response surface models and kriging models for approximating non-random, deterministic computer analyses. After reviewing the response surface method for constructing polynomial approximations, kriging is presented as an alternative approximation method for the design and analysis of computer experiments. Both methods are applied to the multidisciplinary design of an aerospike nozzle, which consists of a computational fluid dynamics model and a finite-element model. Error analysis of the response surface and kriging models is performed along with a graphical comparison of the approximations, and four optimization problems are formulated and solved using both sets of approximation models. The second-order response surface models and kriging models (using a constant underlying global model and a Gaussian correlation function) yield comparable results.
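A second-order response surface of the kind compared here is an ordinary polynomial least-squares fit. A minimal two-variable sketch (not the aerospike-nozzle models from the paper; the sample quadratic is invented for illustration):

```python
import numpy as np

def fit_quadratic_rs(X, y):
    """Least-squares fit of a second-order response surface in two variables:
    y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict_quadratic_rs(coef, x1, x2):
    """Evaluate the fitted surface at a point."""
    return coef @ np.array([1.0, x1, x2, x1**2, x2**2, x1 * x2])

# Recover a known quadratic from noise-free samples
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(30, 2))
y = 2.0 + X[:, 0] - 3.0 * X[:, 1] + 0.5 * X[:, 0]**2 + X[:, 0] * X[:, 1]
coef = fit_quadratic_rs(X, y)
```

With noise-free deterministic responses, as in computer experiments, the fit recovers the generating coefficients exactly; kriging would instead interpolate the sample points.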
FitzGerald, Mary P; Anderson, Rodney U; Potts, Jeannette; Payne, Christopher K; Peters, Kenneth M; Clemens, J Quentin; Kotarinos, Rhonda; Fraser, Laura; Cosby, Annamarie; Fortman, Carole; Neville, Cynthia; Badillo, Suzanne; Odabachian, Lisa; Sanfield, Anna; O’Dougherty, Betsy; Halle-Podell, Rick; Cen, Liyi; Chuai, Shannon; Landis, J Richard; Kusek, John W; Nyberg, Leroy M
2010-01-01
Objectives To determine the feasibility of conducting a randomized clinical trial designed to compare two methods of manual therapy (myofascial physical therapy (MPT) and global therapeutic massage (GTM)) among patients with urologic chronic pelvic pain syndromes. Materials and Methods Our goal was to recruit 48 subjects with chronic prostatitis/chronic pelvic pain syndrome or interstitial cystitis/painful bladder syndrome at six clinical centers. Eligible patients were randomized to either MPT or GTM and were scheduled to receive up to 10 weekly treatments, each 1 hour in duration. Criteria to assess feasibility included adherence of therapists to the prescribed therapeutic protocol as determined by records of treatment, adverse events which occurred during study treatment, and rate of response to therapy as assessed by the Patient Global Response Assessment (GRA). Primary outcome analysis compared response rates between treatment arms using Mantel-Haenszel methods. Results Twenty-three (49%) men and 24 (51%) women were randomized over a six month period. Twenty-four (51%) patients were randomized to GTM, 23 (49%) to MPT; 44 (94%) patients completed the study. Therapist adherence to the treatment protocols was excellent. The GRA response rate of 57% in the MPT group was significantly higher than the rate of 21% in the GTM treatment group (p=0.03). Conclusions The goals to judge feasibility of conducting a full-scale trial of physical therapy methods were met. The preliminary findings of a beneficial effect of MPT warrant further study. PMID:19535099
Point process statistics in atom probe tomography.
Philippe, T; Duguay, S; Grancher, G; Blavette, D
2013-09-01
We present a review of spatial point processes as statistical models that we have designed for the analysis and treatment of atom probe tomography (APT) data. As a major advantage, these methods do not require sampling. The mean distance to the nearest neighbour is an attractive approach to exhibit a non-random atomic distribution. A χ² test based on distance distributions to the nearest neighbour has been developed to detect deviations from randomness. Best-fit methods based on the first nearest neighbour distance (1-NN method) and the pair correlation function are presented and compared to assess the chemical composition of tiny clusters. Delaunay tessellation for cluster selection has also been illustrated. These statistical tools have been applied to APT experiments on microelectronics materials. Copyright © 2012 Elsevier B.V. All rights reserved.
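The mean-distance-to-nearest-neighbour idea can be illustrated with a small sketch. This is not the authors' code: it assumes a 3-D Poisson (completely random) reference process, for which the expected mean 1-NN distance at density λ is Γ(4/3)·(3/(4πλ))^(1/3), and it ignores the edge corrections a real analysis would apply.

```python
import math
import random

def mean_nn_distance(points):
    """Brute-force mean first-nearest-neighbour distance."""
    total = 0.0
    for i, p in enumerate(points):
        total += min(math.dist(p, q) for j, q in enumerate(points) if j != i)
    return total / len(points)

def expected_nn_distance_csr(density):
    """Theoretical mean 1-NN distance for a 3-D Poisson point process."""
    return math.gamma(4.0 / 3.0) * (3.0 / (4.0 * math.pi * density)) ** (1.0 / 3.0)

# Uniform random atoms in a cube: the observed/expected ratio should be
# near 1; a ratio well below 1 would suggest clustering (solute segregation)
rng = random.Random(0)
n, side = 400, 10.0
pts = [(rng.uniform(0, side), rng.uniform(0, side), rng.uniform(0, side))
       for _ in range(n)]
ratio = mean_nn_distance(pts) / expected_nn_distance_csr(n / side**3)
```

In APT practice the comparison is done per chemical species, and the χ² test mentioned above compares whole distance distributions rather than only the means.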
Randomized subspace-based robust principal component analysis for hyperspectral anomaly detection
NASA Astrophysics Data System (ADS)
Sun, Weiwei; Yang, Gang; Li, Jialin; Zhang, Dianfa
2018-01-01
A randomized subspace-based robust principal component analysis (RSRPCA) method for anomaly detection in hyperspectral imagery (HSI) is proposed. The RSRPCA combines advantages of randomized column subspaces and robust principal component analysis (RPCA). It assumes that the background has low-rank properties, and that the anomalies are sparse and do not lie in the column subspace of the background. First, RSRPCA implements random sampling to sketch the original HSI dataset from columns and to construct a randomized column subspace of the background. Structured random projections are also adopted to sketch the HSI dataset from rows. Sketching from columns and rows greatly reduces the computational requirements of RSRPCA. Second, RSRPCA adopts columnwise RPCA (CWRPCA) to eliminate the negative effects of sampled anomaly pixels, thereby purifying the previous randomized column subspace by removing sampled anomaly columns. The CWRPCA decomposes the submatrix of the HSI data into a low-rank matrix (the background component), a noisy matrix (the noise component), and a sparse anomaly matrix (the anomaly component) with only a small proportion of nonzero columns. The inexact augmented Lagrange multiplier algorithm is utilized to optimize the CWRPCA problem and estimate the sparse matrix. Nonzero columns of the sparse anomaly matrix point to sampled anomaly columns in the submatrix. Third, all the pixels are projected onto the complemental subspace of the purified randomized column subspace of the background, and the anomaly pixels in the original HSI data are finally located exactly. Several experiments on three real hyperspectral images are carefully designed to investigate the detection performance of RSRPCA, and the results are compared with four state-of-the-art methods. Experimental results show that the proposed RSRPCA outperforms the four comparison methods both in detection performance and in computational time.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romero, Vicente; Bonney, Matthew; Schroeder, Benjamin
When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities, without being overly conservative, when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: the central 95% of response, and a 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depend on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
Reliability optimization design of the gear modification coefficient based on the meshing stiffness
NASA Astrophysics Data System (ADS)
Wang, Qianqian; Wang, Hui
2018-04-01
Since the time-varying meshing stiffness of a gear system is the key factor affecting gear vibration, it is important to design the meshing stiffness to reduce vibration. Based on the effect of the gear modification coefficient on the meshing stiffness, and considering random parameters, the reliability optimization design of the gear modification is studied. The dimension reduction and point estimation method is used to estimate the moments of the limit state function, and the reliability is obtained by the fourth moment method. Comparison of the dynamic amplitude results before and after optimization indicates that this research is useful for reducing vibration and noise and for improving reliability.
NASA Technical Reports Server (NTRS)
Holland, Frederic A., Jr.
2004-01-01
Modern engineering design practices are tending more toward the treatment of design parameters as random variables as opposed to fixed, or deterministic, values. The probabilistic design approach attempts to account for the uncertainty in design parameters by representing them as a distribution of values rather than as a single value. The motivations for this effort include preventing excessive overdesign as well as assessing and assuring reliability, both of which are important for aerospace applications. However, the determination of the probability distribution is a fundamental problem in reliability analysis. A random variable is often defined by the parameters of the theoretical distribution function that gives the best fit to experimental data. In many cases the distribution must be assumed from very limited information or data. Often the types of information that are available or reasonably estimated are the minimum, maximum, and most likely values of the design parameter. For these situations the beta distribution model is very convenient because the parameters that define the distribution can be easily determined from these three pieces of information. Widely used in the field of operations research, the beta model is very flexible and is also useful for estimating the mean and standard deviation of a random variable given only the aforementioned three values. However, an assumption is required to determine the four parameters of the beta distribution from only these three pieces of information (some of the more common distributions, like the normal, lognormal, gamma, and Weibull distributions, have two or three parameters). The conventional method assumes that the standard deviation is a certain fraction of the range. The beta parameters are then determined by solving a set of equations simultaneously. 
A new method developed in-house at the NASA Glenn Research Center assumes a value for one of the beta shape parameters based on an analogy with the normal distribution (ref.1). This new approach allows for a very simple and direct algebraic solution without restricting the standard deviation. The beta parameters obtained by the new method are comparable to the conventional method (and identical when the distribution is symmetrical). However, the proposed method generally produces a less peaked distribution with a slightly larger standard deviation (up to 7 percent) than the conventional method in cases where the distribution is asymmetric or skewed. The beta distribution model has now been implemented into the Fast Probability Integration (FPI) module used in the NESSUS computer code for probabilistic analyses of structures (ref. 2).
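The conventional three-point approach described above (fitting a beta distribution from minimum, most-likely, and maximum values, with the standard deviation assumed to be a fraction of the range) is commonly implemented with the PERT assumptions mean = (a + 4m + b)/6 and σ = (b − a)/6. The sketch below follows that conventional PERT-style route, not the NASA in-house method the article describes:

```python
def beta_from_three_points(a, m, b):
    """Fit beta shape parameters from minimum a, most-likely m, maximum b,
    using the conventional PERT assumptions:
        mean  = (a + 4m + b) / 6
        sigma = (b - a) / 6
    """
    mean = (a + 4.0 * m + b) / 6.0
    var = ((b - a) / 6.0) ** 2
    x = (mean - a) / (b - a)       # mean rescaled to the unit interval
    v = var / (b - a) ** 2         # variance rescaled to the unit interval
    common = x * (1.0 - x) / v - 1.0
    alpha = x * common             # first beta shape parameter
    beta = (1.0 - x) * common      # second beta shape parameter
    return alpha, beta

# Symmetric case: mode midway between min and max gives alpha == beta == 4,
# the classical PERT result
alpha, beta = beta_from_three_points(0.0, 0.5, 1.0)
```

For a skewed input (mode off-center) the two shape parameters differ, producing the asymmetric distributions the article contrasts with the symmetric case.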
GENOPT 2016: Design of a generalization-based challenge in global optimization
NASA Astrophysics Data System (ADS)
Battiti, Roberto; Sergeyev, Yaroslav; Brunato, Mauro; Kvasov, Dmitri
2016-10-01
While comparing results on benchmark functions is a widely used practice to demonstrate the competitiveness of global optimization algorithms, fixed benchmarks can lead to a negative data-mining process. To avoid this effect, the GENOPT contest benchmarks, based on randomized function generators designed for scientific experiments, with fixed statistical characteristics but individual variation of the generated instances, can be used. The generators are available to participants for off-line tests and online tuning schemes, but the final competition is based on random seeds communicated in the last phase through a cooperative process. A brief presentation and discussion of the methods and results obtained in the framework of the GENOPT contest are given in this contribution.
Tyrrell, Pascal N; Corey, Paul N; Feldman, Brian M; Silverman, Earl D
2013-06-01
Physicians often assess the effectiveness of treatments on a small number of patients. Multiple-baseline designs (MBDs), based on the Wampold-Worsham (WW) method of randomization and applied to four subjects, have relatively low power. Our objective was to propose another approach with greater power that does not suffer from the time requirements of the WW method applied to a greater number of subjects. The power of a design that involves the combination of two four-subject MBDs was estimated using computer simulation and compared with the four- and eight-subject designs. The effect of a delayed linear response to treatment on the power of the test was also investigated. Power was found to be adequate (>80%) for a standardized mean difference (SMD) greater than 0.8. The effect size associated with 80% power from combined tests was smaller than that of the single four-subject MBD (SMD=1.3) and comparable with the eight-subject MBD (SMD=0.6). A delayed linear response to the treatment resulted in important reductions in power (20-35%). By combining two four-subject MBD tests, an investigator can detect better effect sizes (SMD=0.8) and complete a comparatively timely and feasible study. Copyright © 2013 Elsevier Inc. All rights reserved.
Rotavirus vaccine effectiveness in low-income settings: An evaluation of the test-negative design.
Schwartz, Lauren M; Halloran, M Elizabeth; Rowhani-Rahbar, Ali; Neuzil, Kathleen M; Victor, John C
2017-01-03
The test-negative design (TND), an epidemiologic method currently used to measure rotavirus vaccine (RV) effectiveness, compares the vaccination status of rotavirus-positive cases and rotavirus-negative controls meeting a pre-defined case definition for acute gastroenteritis. Despite the use of this study design in low-income settings, the TND has not itself been evaluated as a method for measuring rotavirus vaccine effectiveness. This study builds upon prior methods used to evaluate the TND for influenza vaccine using a randomized controlled clinical trial database. Test-negative vaccine effectiveness (VE-TND) estimates were derived from three large randomized placebo-controlled trials (RCTs) of monovalent (RV1) and pentavalent (RV5) rotavirus vaccines in sub-Saharan Africa and Asia. Derived VE-TND estimates were compared to the original RCT vaccine efficacy estimates (VE-RCTs). The core assumption of the TND (i.e., rotavirus vaccine has no effect on rotavirus-negative diarrhea) was also assessed. TND vaccine effectiveness estimates were nearly equivalent to original RCT vaccine efficacy estimates. Neither RV had a substantial effect on rotavirus-negative diarrhea. This study supports the TND as an appropriate epidemiologic study design to measure rotavirus vaccine effectiveness in low-income settings. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.
Mieres, Jennifer H; Shaw, Leslee J; Hendel, Robert C; Heller, Gary V
2009-01-01
Coronary artery disease remains the leading cause of morbidity and mortality in women. The optimal non-invasive test for evaluation of ischemic heart disease in women is unknown. Although current guidelines support the choice of the exercise tolerance test (ETT) as a first-line test for women with a normal baseline ECG and adequate exercise capabilities, supportive data for this recommendation are controversial. The What Is the Optimal Method for Ischemia Evaluation in Women? (WOMEN) study was designed to determine the optimal non-invasive strategy for CAD risk detection in intermediate- and high-risk women presenting with chest pain or equivalent symptoms suggestive of ischemic heart disease. The study will prospectively compare the 2-year event rates in women capable of performing exercise treadmill testing or Tc-99m tetrofosmin SPECT myocardial perfusion imaging (MPI). The study will enroll women presenting for the evaluation of chest pain or anginal equivalent symptoms who are capable of performing >5 METs of exercise and are at intermediate-to-high pretest risk for ischemic heart disease; they will be randomized to either ETT alone or ETT with Tc-99m tetrofosmin SPECT MPI. The null hypothesis for this project is that the exercise ECG has the same negative predictive value for risk detection as gated myocardial perfusion SPECT in women. The primary aim is to compare 2-year cardiac event rates in women randomized to SPECT MPI with those randomized to ETT. The WOMEN study seeks to provide objective information for guidelines for the evaluation of symptomatic women with an intermediate-to-high likelihood of CAD.
Saunders, Gabrielle H; Biswas, Kousick; Serpi, Tracey; McGovern, Stephanie; Groer, Shirley; Stock, Eileen M; Magruder, Kathryn M; Storzbach, Daniel; Skelton, Kelly; Abrams, Thad; McCranie, Mark; Richerson, Joan; Dorn, Patricia A; Huang, Grant D; Fallon, Michael T
2017-11-01
Posttraumatic stress disorder (PTSD) is a leading cause of impairments in quality of life and functioning among Veterans. Service dogs have been promoted as an effective adjunctive intervention for PTSD; however, published research is limited, and design and implementation flaws in published studies limit validated conclusions. This paper describes the rationale for the study design, a detailed methodological description, and implementation challenges of a multisite randomized clinical trial examining the impact of service dogs on the functioning and quality of life of Veterans with PTSD. Trial design considerations prioritized participant and intervention (dog) safety, selection of an intervention comparison group that would optimize enrollment in all treatment arms, pragmatic methods to ensure healthy well-trained dogs, and the selection of outcomes for achieving scientific and clinical validity in a Veteran PTSD population. Since there is no blueprint for conducting a randomized clinical trial examining the impact of dogs on PTSD of this size and scope, it is our primary intent that the successful completion of this trial will set a benchmark for future trial design and scientific rigor, as well as guide researchers aiming to better understand the role that dogs can have in the management of Veterans experiencing mental health conditions such as PTSD. Published by Elsevier Inc.
Reflections on experimental research in medical education.
Cook, David A; Beckman, Thomas J
2010-08-01
As medical education research advances, it is important that education researchers employ rigorous methods for conducting and reporting their investigations. In this article we discuss several important yet oft-neglected issues in designing experimental research in education. First, randomization controls for only a subset of possible confounders. Second, the posttest-only design is inherently stronger than the pretest-posttest design, provided the study is randomized and the sample is sufficiently large. Third, demonstrating the superiority of an educational intervention in comparison to no intervention does little to advance the art and science of education. Fourth, comparisons involving multifactorial interventions are hopelessly confounded, have limited application to new settings, and do little to advance our understanding of education. Fifth, single-group pretest-posttest studies are susceptible to numerous validity threats. Finally, educational interventions (including the comparison group) must be described in detail sufficient to allow replication.
Olives, Casey; Pagano, Marcello; Deitchler, Megan; Hedt, Bethany L; Egge, Kari; Valadez, Joseph J
2009-04-01
Traditional lot quality assurance sampling (LQAS) methods require simple random sampling to guarantee valid results. However, cluster sampling has been proposed to reduce the number of random starting points. This study uses simulations to examine the classification error of two such designs, a 67x3 (67 clusters of three observations) and a 33x6 (33 clusters of six observations) sampling scheme, to assess the prevalence of global acute malnutrition (GAM). Further, we explore the use of a 67x3 sequential sampling scheme for LQAS classification of GAM prevalence. Results indicate that, for independent clusters with moderate intracluster correlation for the GAM outcome, the three sampling designs maintain approximate validity for LQAS analysis. Sequential sampling can substantially reduce the average sample size that is required for data collection. The presence of intercluster correlation can dramatically impact the classification error that is associated with LQAS analysis.
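The kind of simulation described can be sketched with a beta-binomial model for intracluster correlation. The decision value, correlation, and prevalences below are illustrative assumptions, not the study's exact parameters:

```python
import numpy as np

def lqas_high_prevalence_rate(p_true, n_clusters=67, m=3, d=33,
                              rho=0.1, n_sims=20000, seed=0):
    """Probability that an LQAS rule classifies prevalence as high
    (total cases > d) when cluster-level prevalences follow a beta
    distribution with mean p_true and intracluster correlation rho.
    The rule n = 67*3 = 201, d = 33 is illustrative, not the paper's.
    """
    rng = np.random.default_rng(seed)
    a = p_true * (1.0 - rho) / rho
    b = (1.0 - p_true) * (1.0 - rho) / rho
    pj = rng.beta(a, b, size=(n_sims, n_clusters))   # cluster prevalences
    cases = rng.binomial(m, pj).sum(axis=1)          # 67 clusters of 3
    return float((cases > d).mean())
```

Classification error at a low true prevalence is the rate of falsely classifying "high" (and, at a high true prevalence, one minus this rate); running the function over a grid of `p_true` values traces out the design's operating characteristic.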
Probabilistic DHP adaptive critic for nonlinear stochastic control systems.
Herzallah, Randa
2013-06-01
Following the recently developed algorithms for fully probabilistic control design for general dynamic stochastic systems (Herzallah & Kárný, 2011; Kárný, 1996), this paper presents the solution to the probabilistic dual heuristic programming (DHP) adaptive critic method (Herzallah & Kárný, 2011) and a randomized control algorithm for stochastic nonlinear dynamical systems. The purpose of the randomized control input design is to make the joint probability density function of the closed-loop system as close as possible to a predetermined ideal joint probability density function. This paper completes the previous work (Herzallah & Kárný, 2011; Kárný, 1996) by formulating and solving the fully probabilistic control design problem for the more general case of nonlinear stochastic discrete-time systems. A simulated example is used to demonstrate the use of the algorithm, and encouraging results have been obtained. Copyright © 2013 Elsevier Ltd. All rights reserved.
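The "as close as possible" criterion in fully probabilistic design is conventionally the Kullback-Leibler divergence of the closed-loop joint density f from the ideal density; written in generic form (the notation here is ours, a sketch of Kárný's formulation rather than the paper's exact statement):

```latex
c^{*} \;=\; \arg\min_{c}\ \mathrm{KL}\!\left(f \,\middle\|\, {}^{I}\!f\right)
      \;=\; \arg\min_{c} \int f(\mathcal{D}\mid c)\,
      \ln\frac{f(\mathcal{D}\mid c)}{{}^{I}\!f(\mathcal{D})}\;\mathrm{d}\mathcal{D}
```

where c ranges over randomized control laws, D denotes the closed-loop data, and the superscript I marks the predetermined ideal joint density.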
Wampler, Peter J; Rediske, Richard R; Molla, Azizur R
2013-01-18
A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground-based surveys to collect household location data and can be performed in a rapid, cost-effective manner. The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel, which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. A total of 537 homes initially were mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed, and 16.4% of the homes visited were unoccupied at the time of the visit because of work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. The method used to generate and field-locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent, and only rarely was local knowledge required to identify and locate households.
This method provides an important technique that can be applied to other developing countries where a randomized study design is needed but infrastructure is lacking to implement more traditional participant selection methods.
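Once homes are digitized, the selection step reduces to drawing a simple random sample from the list of mapped dwellings. A minimal sketch (the identifiers and seed are illustrative, not the study's data):

```python
import random

def select_survey_households(mapped_home_ids, k, seed=2012):
    """Draw k distinct households at random from the homes digitized in
    Google Earth. A fixed seed keeps the field list reproducible; the
    seed value and id format are illustrative assumptions."""
    return random.Random(seed).sample(mapped_home_ids, k)

homes = [f"home_{i:03d}" for i in range(537)]       # 537 homes were mapped
survey_sites = select_survey_households(homes, k=96)  # 96 potential sites
```

In the study this role was played by Microsoft Excel's randomization; the point is that the sampling frame is the set of remotely mapped homes rather than any government register.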
Teleconferenced Educational Detailing: Diabetes Education for Primary Care Physicians
ERIC Educational Resources Information Center
Harris, Stewart B.; Leiter, Lawrence A.; Webster-Bogaert, Susan; Van, Daphne M.; O'Neill, Colleen
2005-01-01
Introduction: Formal didactic continuing medical education (CME) is relatively ineffective for changing physician behavior. Diabetes mellitus is an increasingly prevalent disease, and interventions to improve adherence to clinical practice guidelines (CPGs) are needed. Methods: A stratified, cluster-randomized, controlled trial design was used to…
NASA Technical Reports Server (NTRS)
Hague, D. S.; Merz, A. W.
1975-01-01
Multivariable search techniques are applied to a particular class of airfoil optimization problems. These are the maximization of lift and the minimization of disturbance pressure magnitude in an inviscid nonlinear flow field. A variety of multivariable search techniques contained in an existing nonlinear optimization code, AESOP, are applied to this design problem. These techniques include elementary single-parameter perturbation methods, organized searches such as steepest-descent, quadratic, and Davidon methods, randomized procedures, and a generalized search acceleration technique. Airfoil design variables are seven in number and define perturbations to the profile of an existing NACA airfoil. The relative efficiency of the techniques is compared. It is shown that elementary one-parameter-at-a-time and random techniques compare favorably with organized searches in the class of problems considered. It is also shown that significant reductions in disturbance pressure magnitude can be made while retaining reasonable lift coefficient values at low free-stream Mach numbers.
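The contrast between elementary and organized searches can be illustrated on a toy seven-variable objective; this is a stand-in for exposition, not the AESOP code or the aerodynamic objective:

```python
import random

def objective(x):
    # Toy stand-in for the aerodynamic objective (minimum at all ones).
    return sum((xi - 1.0) ** 2 for xi in x)

def coordinate_search(x, step=0.1, iters=200):
    """Elementary one-parameter-at-a-time perturbation search:
    perturb one design variable per iteration, keep any improvement."""
    x = list(x)
    for it in range(iters):
        i = it % len(x)
        for delta in (step, -step):
            trial = list(x)
            trial[i] += delta
            if objective(trial) < objective(x):
                x = trial
                break
    return x

def random_search(x, step=0.1, iters=200, seed=0):
    """Randomized procedure: perturb all design variables at once,
    accept only improving moves."""
    rng = random.Random(seed)
    x = list(x)
    for _ in range(iters):
        trial = [xi + rng.uniform(-step, step) for xi in x]
        if objective(trial) < objective(x):
            x = trial
    return x

x0 = [0.0] * 7   # seven airfoil design variables, as in the paper
```

On smooth low-dimensional problems like this, both elementary strategies make steady progress, which is consistent with the paper's finding that they compare favorably with organized searches in this problem class.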
2014-01-01
Background There is a need for evidence of the clinical effectiveness of minimally invasive surgery for the treatment of esophageal cancer, but randomized controlled trials in surgery are often difficult to conduct. The ROMIO (Randomized Open or Minimally Invasive Oesophagectomy) study will establish the feasibility of a main trial which will examine the clinical and cost-effectiveness of minimally invasive and open surgical procedures for the treatment of esophageal cancer. Methods/Design A pilot randomized controlled trial (RCT), in two centers (University Hospitals Bristol NHS Foundation Trust and Plymouth Hospitals NHS Trust) will examine numbers of incident and eligible patients who consent to participate in the ROMIO study. Interventions will include esophagectomy by: (1) open gastric mobilization and right thoracotomy, (2) laparoscopic gastric mobilization and right thoracotomy, and (3) totally minimally invasive surgery (in the Bristol center only). The primary outcomes of the feasibility study will be measures of recruitment, successful development of methods to monitor quality of surgery and fidelity to a surgical protocol, and development of a core outcome set to evaluate esophageal cancer surgery. The study will test patient-reported outcomes measures to assess recovery, methods to blind participants, assessments of surgical morbidity, and methods to capture cost and resource use. ROMIO will integrate methods to monitor and improve recruitment using audio recordings of consultations between recruiting surgeons, nurses, and patients to provide feedback for recruiting staff. Discussion The ROMIO study aims to establish efficient methods to undertake a main trial of minimally invasive surgery versus open surgery for esophageal cancer. 
Trial registration The pilot trial has Current Controlled Trials registration number ISRCTN59036820 (25/02/2013) at http://www.controlled-trials.com; the ROMIO trial record at that site gives a link to the original version of the study protocol. PMID:24888266
Goldenberg, N.A.; Abshire, T.; Blatchford, P.J.; Fenton, L.Z.; Halperin, J.L.; Hiatt, W.R.; Kessler, C.M.; Kittelson, J.M.; Manco-Johnson, M.J.; Spyropoulos, A.C.; Steg, P.G.; Stence, N.V.; Turpie, A.G.G.; Schulman, S.
2015-01-01
BACKGROUND Randomized controlled trials (RCTs) in pediatric venous thromboembolism (VTE) treatment have been challenged by unsubstantiated design assumptions and/or poor accrual. Pilot/feasibility (P/F) studies are critical to future RCT success. METHODS Kids-DOTT is a multicenter RCT investigating non-inferiority of a 6-week (shortened) vs. 3-month (conventional) duration of anticoagulation in patients <21 years old with provoked venous thrombosis. Primary efficacy and safety endpoints are symptomatic recurrent VTE at 1 year and anticoagulant-related, clinically-relevant bleeding. In the P/F phase, 100 participants were enrolled in an open, blinded endpoint, parallel-cohort RCT design. RESULTS No eligibility violations or randomization errors occurred. Of enrolled patients, 69% were randomized, 3% missed the randomization window, and 28% were followed in pre-specified observational cohorts for completely occlusive thrombosis or persistent antiphospholipid antibodies. Retention at 1 year was 82%. Inter-observer agreement between local vs. blinded central determination of venous occlusion by imaging at 6 weeks post-diagnosis was strong (κ-statistic=0.75; 95% confidence interval [CI] 0.48–1.0). Primary efficacy and safety event rates were 3.3% (95% CI 0.3–11.5%) and 1.4% (0.03–7.4%). CONCLUSIONS The P/F phase of Kids-DOTT has demonstrated validity of vascular imaging findings of occlusion as a randomization criterion, and defined randomization, retention, and endpoint rates to inform the fully-powered RCT. PMID:26118944
ERIC Educational Resources Information Center
Huang, SuHua
2012-01-01
The mixed-method explanatory research design was employed to investigate the effectiveness of the Accelerated Reader (AR) program on middle school students' reading achievement and motivation. A total of 211 sixth to eighth-grade students provided quantitative data by completing an AR Survey. Thirty of the 211 students were randomly selected to…
ERIC Educational Resources Information Center
Sola, Agboola Omowunmi; Ojo, Oloyede Ezekiel
2007-01-01
This study assessed and compared the relative effectiveness of three methods for teaching and conducting experiments in separation of mixtures in chemistry. A pre-test, post-test experimental design with a control group was used. Two hundred and thirty three randomly selected Senior Secondary School I (SSS I) chemistry students were drawn from…
Grant, Aileen; Dreischulte, Tobias; Treweek, Shaun; Guthrie, Bruce
2012-08-28
Trials of complex interventions are criticized for being 'black box', so the UK Medical Research Council recommends carrying out a process evaluation to explain the trial findings. We believe it is good practice to pre-specify and publish process evaluation protocols to set standards and minimize bias. Unlike protocols for trials, little guidance or standards exist for the reporting of process evaluations. This paper presents the mixed-method process evaluation protocol of a cluster randomized trial, drawing on a framework designed by the authors. This mixed-method evaluation is based on four research questions and maps data collection to a logic model of how the data-driven quality improvement in primary care (DQIP) intervention is expected to work. Data collection will be predominantly by qualitative case studies in eight to ten of the trial practices, focus groups with patients affected by the intervention, and quantitative analysis of routine practice data, trial outcome and questionnaire data, and data from the DQIP intervention. We believe that pre-specifying the intentions of a process evaluation can help to minimize bias arising from potentially misleading post-hoc analysis. We recognize it is also important to retain flexibility to examine the unexpected and the unintended. From that perspective, a mixed-methods evaluation allows the combination of exploratory and flexible qualitative work, and more pre-specified quantitative analysis, with each method contributing to the design, implementation and interpretation of the other. As well as strengthening the study, the authors hope to stimulate discussion among their academic colleagues about publishing protocols for evaluations of randomized trials of complex interventions. Trial registration: ClinicalTrials.gov NCT01425502.
Robust, Adaptive Functional Regression in Functional Mixed Model Framework.
Zhu, Hongxiao; Brown, Philip J; Morris, Jeffrey S
2011-09-01
Functional data are increasingly encountered in scientific studies, and their high dimensionality and complexity lead to many analytical challenges. Various methods for functional data analysis have been developed, including functional response regression methods that involve regression of a functional response on univariate/multivariate predictors with nonparametrically represented functional coefficients. In existing methods, however, the functional regression can be sensitive to outlying curves and outlying regions of curves, so is not robust. In this paper, we introduce a new Bayesian method, robust functional mixed models (R-FMM), for performing robust functional regression within the general functional mixed model framework, which includes multiple continuous or categorical predictors and random effect functions accommodating potential between-function correlation induced by the experimental design. The underlying model involves a hierarchical scale mixture model for the fixed effects, random effect and residual error functions. These modeling assumptions across curves result in robust nonparametric estimators of the fixed and random effect functions which down-weight outlying curves and regions of curves, and produce statistics that can be used to flag global and local outliers. These assumptions also lead to distributions across wavelet coefficients that have outstanding sparsity and adaptive shrinkage properties, with great flexibility for the data to determine the sparsity and the heaviness of the tails. Together with the down-weighting of outliers, these within-curve properties lead to fixed and random effect function estimates that appear in our simulations to be remarkably adaptive in their ability to remove spurious features yet retain true features of the functions. We have developed general code to implement this fully Bayesian method that is automatic, requiring the user to only provide the functional data and design matrices. 
It is efficient enough to handle large data sets, and yields posterior samples of all model parameters that can be used to perform desired Bayesian estimation and inference. Although we present details for a specific implementation of the R-FMM using specific distributional choices in the hierarchical model, 1D functions, and wavelet transforms, the method can be applied more generally using other heavy-tailed distributions, higher dimensional functions (e.g. images), and using other invertible transformations as alternatives to wavelets.
Peul, Wilco C; van Houwelingen, Hans C; van der Hout, Wilbert B; Brand, Ronald; Eekhof, Just AH; Tans, Joseph ThJ; Thomeer, Ralph TWM; Koes, Bart W
2005-01-01
Background The design of a randomized multicenter trial is presented on the effectiveness of a prolonged conservative treatment strategy compared with surgery in patients with persisting intense sciatica (lumbosacral radicular syndrome). Methods/design Patients presenting to their general practitioner with disabling sciatica lasting less than twelve weeks are referred to the neurology outpatient department of one of the participating hospitals. After confirmation of the diagnosis and surgical indication, MRI scanning is performed. If a distinct disc herniation is discerned which in addition covers the clinically expected site, the patient is eligible for randomization. Depending on the outcome of the randomization scheme, the patient will either be submitted to prolonged conservative care or surgery. Surgery will be carried out according to the guidelines and between six and twelve weeks after onset of complaints. The experimental therapy consists of a prolonged conservative treatment under supervision of the general practitioner, which may be followed by surgical intervention in case of persisting or progressive disability. The main primary outcome measure is the disease-specific disability of daily functioning. Other primary outcome measures are perceived recovery and intensity of leg pain. Secondary outcome measures encompass severity of complaints, quality of life, medical consumption, absenteeism, costs and preference. The main research question will be answered at 12 months after randomization. The total follow-up period covers two years. Discussion Evidence is lacking concerning the optimal treatment of lumbar disc-induced sciatica. This pragmatic randomized trial focuses on the timing of intervention and will contribute to the decision of the general practitioner and neurologist regarding referral of patients for surgery. PMID:15707491
Vesco, Kimberly K.; Karanja, Njeri; King, Janet C.; Gillman, Matthew W.; Perrin, Nancy; McEvoy, Cindy; Eckhardt, Cara; Smith, K. Sabina; Stevens, Victor J.
2012-01-01
Background Obesity and excessive weight gain during pregnancy are associated with adverse pregnancy outcomes. Observational studies suggest that minimal or no gestational weight gain (GWG) may minimize the risk of adverse pregnancy outcomes for obese women. Objective This report describes the design of Healthy Moms, a randomized trial testing a weekly, group-based, weight management intervention designed to help limit GWG to 3% of weight (measured at the time of randomization) among obese pregnant women (BMI ≥30 kg/m2). Participants are randomized at 10–20 weeks gestation to either the intervention or a single dietary advice control condition. Primary Outcomes The study is powered for the primary outcome of total GWG, yielding a target sample size of 160 women. Additional secondary outcomes include weight change between randomization and one-year postpartum and proportion of infants with birth weight > 90th percentile for gestational age. Statistical analyses will be based on intention-to-treat. Methods Following randomization, all participants receive a 45-minute dietary consultation. They are encouraged to follow the Dietary Approaches to Stop Hypertension diet without sodium restriction. Intervention group participants receive an individualized calorie intake goal, a second individual counseling session and attend weekly group meetings until they give birth. Research staff assess all participants at 34-weeks gestation and at 2-weeks and one-year postpartum with their infants. Summary The Healthy Moms study is testing weight management techniques that have been used with non-pregnant adults. We aim to help obese women limit GWG to improve their long-term health and the health of their offspring. PMID:22465256
A novel method for predicting the power outputs of wave energy converters
NASA Astrophysics Data System (ADS)
Wang, Yingguang
2018-03-01
This paper focuses on realistically predicting the power outputs of wave energy converters operating in shallow water nonlinear waves. A heaving two-body point absorber is utilized as a specific calculation example, and the generated power of the point absorber has been predicted by using a novel method (a nonlinear simulation method) that incorporates a second order random wave model into a nonlinear dynamic filter. It is demonstrated that the second order random wave model in this article can be utilized to generate irregular waves with realistic crest-trough asymmetries, and consequently, more accurate generated power can be predicted by subsequently solving the nonlinear dynamic filter equation with the nonlinearly simulated second order waves as inputs. The research findings demonstrate that the novel nonlinear simulation method in this article can be utilized as a robust tool for ocean engineers in their design, analysis and optimization of wave energy converters.
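The crest-trough asymmetry that a second order random wave model captures can be seen in the single-component deep-water Stokes expansion, a textbook illustration; the paper's model applies quadratic corrections of this kind across all component pairs of a random sea state:

```latex
\eta(t) \;=\; a\cos\theta \;+\; \tfrac{1}{2}\,k a^{2}\cos 2\theta,
\qquad \theta = kx - \omega t + \phi
```

The quadratic term raises and sharpens crests while flattening troughs, an asymmetry that first order (linear) random wave models cannot reproduce and which directly affects the excitation, and hence the predicted power output, of a heaving point absorber.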
Zheng, Guanglou; Fang, Gengfa; Shankaran, Rajan; Orgun, Mehmet A; Zhou, Jie; Qiao, Li; Saleem, Kashif
2017-05-01
Generating random binary sequences (BSes) is a fundamental requirement in cryptography. A BS is a sequence of N bits, and each bit has a value of 0 or 1. For securing sensors within wireless body area networks (WBANs), electrocardiogram (ECG)-based BS generation methods have been widely investigated, in which interpulse intervals (IPIs) from each heartbeat cycle are processed to produce BSes. Using these IPI-based methods to generate a 128-bit BS in real time normally takes around half a minute. In order to improve the time efficiency of such methods, this paper presents an ECG multiple fiducial-points based binary sequence generation (MFBSG) algorithm. The technique of discrete wavelet transforms is employed to detect the arrival times of the fiducial points, such as the P, Q, R, S, and T peaks. Time intervals between them, including the RR, RQ, RS, RP, and RT intervals, are then calculated from these arrival times and used as ECG features to generate random BSes with low latency. According to our analysis on real ECG data, these ECG feature values exhibit the property of randomness and, thus, can be utilized to generate random BSes. Compared with schemes that rely solely on IPIs to generate BSes, the MFBSG algorithm uses five feature values from one heartbeat cycle and can be up to five times faster than the solely IPI-based methods, thus achieving its design goal of low latency. According to our analysis, the complexity of the algorithm is comparable to that of fast Fourier transforms. These randomly generated ECG BSes can be used as security keys for encryption or authentication in a WBAN system.
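The speed-up comes from harvesting several timing features per heartbeat instead of one IPI. A minimal sketch of the interval-to-bits step follows; the 1 ms quantization and the number of bits kept per interval are illustrative assumptions, not the published MFBSG encoding:

```python
def intervals_to_bits(intervals_ms, bits_per_interval=4, n_bits=128):
    """Keep the least-significant bits of each quantized timing interval
    (RR, RQ, RS, RP, RT, ...) to build an n_bits-long binary sequence.
    Low-order bits of physiological timings vary most between beats and
    so carry most of the usable randomness."""
    bits = []
    for interval in intervals_ms:
        q = int(round(interval))              # quantize to 1 ms
        for k in range(bits_per_interval):
            bits.append((q >> k) & 1)         # extract bit k (LSB first)
        if len(bits) >= n_bits:
            break
    return bits[:n_bits]

# Five intervals per beat at 4 bits each yields 20 bits per heartbeat,
# so roughly 7 beats suffice for a 128-bit sequence, versus 16 or more
# heartbeats when only one IPI per beat is harvested.
```

In practice the raw bits would still be checked (e.g. with NIST-style randomness tests, as the paper's analysis implies) before use as key material.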
2011-01-01
Background Microinsurance or Community-Based Health Insurance is a promising healthcare financing mechanism, which is increasingly applied to aid rural poor persons in low-income countries. Robust empirical evidence on the causal relations between Community-Based Health Insurance and healthcare utilisation, financial protection and other areas is scarce and necessary. This paper contains a discussion of the research design of three Cluster Randomised Controlled Trials in India to measure the impact of Community-Based Health Insurance on several outcomes. Methods/Design Each trial sets up a Community-Based Health Insurance scheme among a group of micro-finance affiliate families. Villages are grouped into clusters which are congruous with pre-existing social groupings. These clusters are randomly assigned to one of three waves of implementation, ensuring the entire population is offered Community-Based Health Insurance by the end of the experiment. Each wave of treatment is preceded by a round of mixed methods evaluation, with quantitative, qualitative and spatial evidence on impact collected. Improving upon practices in published Cluster Randomised Controlled Trial literature, we detail how research design decisions have ensured that both the households offered insurance and the implementers of the Community-Based Health Insurance scheme operate in an environment replicating a non-experimental implementation. Discussion When a Cluster Randomised Controlled Trial involves randomizing within a community, generating adequate and valid conclusions requires that the research design be made congruous with social structures within the target population, to ensure that the trial is conducted in an implementing environment that is a suitable analogue of a non-experimental one. PMID:21988774
NASA Astrophysics Data System (ADS)
Mayer, J. M.; Stead, D.
2017-04-01
With the increased drive towards deeper and more complex mine designs, geotechnical engineers are often forced to reconsider traditional deterministic design techniques in favour of probabilistic methods. These alternative techniques allow for the direct quantification of uncertainties within a risk and/or decision analysis framework. However, conventional probabilistic practices typically subdivide geological materials into discrete, homogeneous domains, with attributes defined by spatially constant random variables, despite the fact that geological media display inherently heterogeneous spatial characteristics. This research directly simulates this phenomenon using a geostatistical approach known as sequential Gaussian simulation. The method utilizes the variogram, which imposes a degree of controlled spatial heterogeneity on the system. Simulations are constrained using data from the Ok Tedi mine site in Papua New Guinea and designed to randomly vary the geological strength index and uniaxial compressive strength using Monte Carlo techniques. Results suggest that conventional probabilistic techniques have a fundamental limitation compared to geostatistical approaches, as they fail to account for the spatial dependencies inherent to geotechnical datasets. This can result in erroneous model predictions, which are overly conservative when compared to the geostatistical results.
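A drastically simplified stand-in for the geostatistical simulation described above: instead of full sequential Gaussian simulation (which visits nodes in random order and conditions on data), this sketch draws an unconditional Gaussian realization of a rock property along a 1-D profile, with spatial correlation imposed by an exponential variogram model via a Cholesky factorization of the covariance matrix. All parameter values are illustrative assumptions.

```python
import numpy as np

def simulate_field(x, mean, sill, vrange, rng):
    """Unconditional Gaussian simulation of a 1-D property profile
    (e.g. UCS along a borehole). The exponential covariance model
    C(h) = sill * exp(-3h / range) encodes the variogram; correlated
    draws are obtained as L @ z with C = L L^T (Cholesky)."""
    h = np.abs(x[:, None] - x[None, :])          # pairwise lag distances
    cov = sill * np.exp(-3.0 * h / vrange)       # exponential covariance
    L = np.linalg.cholesky(cov + 1e-9 * np.eye(len(x)))
    return mean + L @ rng.standard_normal(len(x))

rng = np.random.default_rng(42)
x = np.linspace(0.0, 100.0, 50)                  # metres along a borehole
ucs = simulate_field(x, mean=80.0, sill=25.0, vrange=30.0, rng=rng)
```

Unlike a spatially constant random variable, each realization varies smoothly along the profile, which is the heterogeneity the abstract argues conventional probabilistic practice misses.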
Improvement of Biocatalysts for Industrial and Environmental Purposes by Saturation Mutagenesis
Valetti, Francesca; Gilardi, Gianfranco
2013-01-01
Laboratory evolution techniques are becoming increasingly widespread among protein engineers for the development of novel and designed biocatalysts. The palette of approaches ranges from completely randomized strategies to rational and structure-guided mutagenesis, with a wide variety of costs, impacts, drawbacks and relevance to biotechnology. A technique that strikes a convincing compromise between the extremes of fully randomized and rational mutagenesis, with a high benefit/cost ratio, is saturation mutagenesis. Here we present and discuss this approach in its many facets, also tackling the issues of randomization, statistical evaluation of library completeness and throughput efficiency of screening methods. Successful recent applications covering different classes of enzymes are presented with reference to the literature and to research lines pursued in our group. The focus is on saturation mutagenesis as a tool for designing novel biocatalysts specifically relevant to the production of fine chemicals, the improvement of bulk enzymes for industry, and the engineering of technical enzymes involved in the treatment of waste, detoxification and the production of clean energy from renewable sources. PMID:24970191
On the design of random metasurface based devices.
Dupré, Matthieu; Hsu, Liyi; Kanté, Boubacar
2018-05-08
Metasurfaces are generally designed by placing scatterers in periodic or pseudo-periodic grids. We propose and discuss design rules for functional metasurfaces with randomly placed anisotropic elements that randomly sample a well-defined phase function. By analyzing the focusing performance of random metasurface lenses as a function of their density and the density of the phase-maps used to design them, we find that the performance of 1D metasurfaces is mostly governed by their density while 2D metasurfaces strongly depend on both the density and the near-field coupling configuration of the surface. The proposed approach is used to design all-polarization random metalenses at near infrared frequencies. Challenges, as well as opportunities of random metasurfaces compared to periodic ones are discussed. Our results pave the way to new approaches in the design of nanophotonic structures and devices from lenses to solar energy concentrators.
Baker, John [Walnut Creek, CA; Archer, Daniel E [Knoxville, TN; Luke, Stanley John [Pleasanton, CA; Decman, Daniel J [Livermore, CA; White, Gregory K [Livermore, CA
2009-06-23
A tailpulse signal generating/simulating apparatus, system, and method designed to produce electronic pulses which simulate the tailpulses produced by a gamma radiation detector, including the pileup effect caused by the characteristic exponential decay of the detector pulses and the random Poisson-distributed pulse timing of radioactive materials. A digital signal processor (DSP) is programmed and configured to produce digital values corresponding to pseudo-randomly selected pulse amplitudes and pseudo-randomly selected Poisson timing intervals of the tailpulses. Pulse amplitude values are exponentially decayed while the digital values are output to a digital-to-analog converter (DAC), and the amplitudes of new pulses are added to decaying pulses to simulate the pileup effect for enhanced realism.
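The pulse-train generation described above can be sketched numerically under stated assumptions: exponential inter-arrival intervals give the Poisson timing, amplitudes are drawn pseudo-randomly, and each pulse's exponential tail is added onto whatever is still decaying, producing pileup. The sampling rate, decay constant, and amplitude range are illustrative, and the DSP/DAC hardware path is not modeled.

```python
import numpy as np

def tailpulses(rate_hz, tau, fs, duration, rng):
    """Simulate a detector tailpulse train: Poisson arrivals (exponential
    inter-arrival intervals at mean 1/rate_hz), pseudo-random amplitudes,
    and exponential decay with time constant tau so that overlapping
    pulses pile up. Returns the sampled waveform at rate fs."""
    n = int(duration * fs)
    out = np.zeros(n)
    t = rng.exponential(1.0 / rate_hz)
    while t < duration:
        amp = rng.uniform(0.1, 1.0)                 # pseudo-random height
        i0 = int(t * fs)
        k = np.arange(n - i0)
        out[i0:] += amp * np.exp(-k / (tau * fs))   # add onto decaying tails
        t += rng.exponential(1.0 / rate_hz)         # next Poisson arrival
    return out

rng = np.random.default_rng(1)
sig = tailpulses(rate_hz=1000.0, tau=1e-3, fs=1e5, duration=0.01, rng=rng)
```

Because new pulses are summed onto still-decaying ones rather than replacing them, the pileup effect emerges automatically at high count rates.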
An uncertainty model of acoustic metamaterials with random parameters
NASA Astrophysics Data System (ADS)
He, Z. C.; Hu, J. Y.; Li, Eric
2018-01-01
Acoustic metamaterials (AMs) are man-made composite materials. In applications, however, random uncertainties arising from manufacturing and material errors are unavoidable and lead to variance in the physical responses of AMs. In this paper, an uncertainty model based on the change-of-variable perturbation stochastic finite element method (CVPS-FEM) is formulated to predict the probability density functions of the physical responses of AMs with random parameters. Three types of physical response, of great interest in the design of AMs, are studied in the uncertainty model: the band structure, the mode shapes and the frequency response function. In this computation, the physical responses of stochastic AMs are expressed as linear functions of the pre-defined random parameters by using a first-order Taylor series expansion and perturbation technique. Then, based on the linear relationships between parameters and responses, the probability density functions of the responses are calculated by the change-of-variable technique. Three numerical examples are employed to demonstrate the effectiveness of the CVPS-FEM for stochastic AMs, and the results are successfully validated against the Monte Carlo method.
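The core of the first-order perturbation step can be illustrated on a scalar toy problem: linearize a response about the mean of a Gaussian parameter, then apply the change-of-variable technique, under which a linear function of a Gaussian is itself Gaussian. The toy response below (a natural frequency depending on a random stiffness) is an assumption for illustration, not one of the paper's examples.

```python
import math

def linearized_pdf(f, df, mu, sigma, y):
    """First-order perturbation: y ~ f(mu) + f'(mu) * (x - mu) for a
    Gaussian parameter x ~ N(mu, sigma^2). By the change-of-variable
    technique the response is then Gaussian with mean f(mu) and standard
    deviation |f'(mu)| * sigma; return its density at y."""
    m = f(mu)
    s = abs(df(mu)) * sigma
    return math.exp(-0.5 * ((y - m) / s) ** 2) / (s * math.sqrt(2.0 * math.pi))

# Toy response: natural frequency sqrt(k/m) with random stiffness k, m = 2.
f = lambda k: math.sqrt(k / 2.0)
df = lambda k: 0.5 / math.sqrt(2.0 * k)
p = linearized_pdf(f, df, mu=8.0, sigma=0.4, y=2.0)
```

The same two-step pattern (Taylor linearization, then an exact density transform) is what CVPS-FEM applies to the vector-valued band-structure and frequency-response computations.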
Thase, Michael E.
2010-01-01
Background Major depressive disorder (MDD) is highly prevalent and associated with disability and chronicity. Although cognitive therapy (CT) is an effective short-term treatment for MDD, a significant proportion of responders subsequently suffer relapses or recurrences. Purpose This design prospectively evaluates: 1) a method to discriminate CT-treated responders at lower versus higher risk for relapse; and 2) the subsequent durability of 8-month continuation phase therapies in randomized higher risk responders followed for an additional 24-months. The primary prediction is: after protocol treatments are stopped, higher risk patients randomly assigned to continuation phase CT (C-CT) will have a lower risk of relapse/recurrence than those randomized to fluoxetine (FLX). Methods Outpatients, aged 18 to 70 years, with recurrent MDD received 12–14 weeks of CT provided by 15 experienced therapists from two sites. Responders (i.e., no MDD and 17-item Hamilton Rating Scale for Depression ≤ 12) were stratified into higher and lower risk groups based on stability of remission during the last 6 weeks of CT. The lower risk group entered follow-up for 32 months; the higher risk group was randomized to 8 months of continuation phase therapy with either C-CT or clinical management plus either double-blinded FLX or pill placebo. Following the continuation phase, higher risk patients were followed by blinded evaluators for 24 months. Results The trial began in 2000. Enrollment is complete (N=523). The follow-up continues. Conclusions The trial evaluates the preventive effects and durability of acute and continuation phase treatments in the largest known sample of CT responders collected worldwide. PMID:20451668
SKIRT: The design of a suite of input models for Monte Carlo radiative transfer simulations
NASA Astrophysics Data System (ADS)
Baes, M.; Camps, P.
2015-09-01
The Monte Carlo method is the most popular technique to perform radiative transfer simulations in a general 3D geometry. The algorithms behind and acceleration techniques for Monte Carlo radiative transfer are discussed extensively in the literature, and many different Monte Carlo codes are publicly available. By contrast, the design of a suite of components that can be used to describe the distribution of sources and sinks in radiative transfer codes has received very little attention. The availability of such models, with different degrees of complexity, has many benefits. For example, they can serve as toy models to test new physical ingredients, or as parameterised models for inverse radiative transfer fitting. For 3D Monte Carlo codes, this requires algorithms to efficiently generate random positions from 3D density distributions. We describe the design of a flexible suite of components for the Monte Carlo radiative transfer code SKIRT. The design is based on a combination of basic building blocks (which can be either analytical toy models or numerical models defined on grids or a set of particles) and the extensive use of decorators that combine and alter these building blocks into more complex structures. For a number of decorators, e.g. those that add spiral structure or clumpiness, we provide a detailed description of the algorithms that can be used to generate random positions. Advantages of this decorator-based design include code transparency, the avoidance of code duplication, and an increase in code maintainability. Moreover, since decorators can be chained without problems, very complex models can easily be constructed out of simple building blocks. Finally, based on a number of test simulations, we demonstrate that our design using customised random position generators is superior to a simpler design based on a generic black-box random position generator.
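The decorator-based design can be sketched in a few lines. This is a hedged illustration of the pattern, not SKIRT's actual component interface: a basic building block supplies its own random-position generator (here inverse-CDF sampling in radius with an isotropic direction), and a decorator wraps any model to alter the generated positions while remaining chainable.

```python
import math
import random

class Sphere:
    """Building block: uniform-density sphere with its own customised
    random position generator (P(radius < r) proportional to r^3)."""
    def __init__(self, radius):
        self.radius = radius
    def random_position(self, rng):
        r = self.radius * rng.random() ** (1.0 / 3.0)
        costh = 2.0 * rng.random() - 1.0
        phi = 2.0 * math.pi * rng.random()
        sinth = math.sqrt(1.0 - costh * costh)
        return (r * sinth * math.cos(phi), r * sinth * math.sin(phi), r * costh)

class Shifted:
    """Decorator: wraps any model and translates its generated positions,
    leaving the wrapped component untouched. Decorators can be chained."""
    def __init__(self, model, offset):
        self.model, self.offset = model, offset
    def random_position(self, rng):
        x, y, z = self.model.random_position(rng)
        ox, oy, oz = self.offset
        return (x + ox, y + oy, z + oz)

rng = random.Random(7)
model = Shifted(Sphere(1.0), (10.0, 0.0, 0.0))
pts = [model.random_position(rng) for _ in range(200)]
```

A clumpiness or spiral-structure decorator would follow the same shape: intercept `random_position`, modify the draw, and return it, so arbitrary combinations compose without code duplication.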
NASA Astrophysics Data System (ADS)
Ander, Louise; Lark, Murray; Smedley, Pauline; Watts, Michael; Hamilton, Elliott; Fletcher, Tony; Crabbe, Helen; Close, Rebecca; Studden, Mike; Leonardi, Giovanni
2015-04-01
A random sampling design is optimal for assessing outcomes such as the mean of a given variable across an area. However, this optimal sampling design may be compromised to an unknown extent by unavoidable real-world factors: this work examines the extent to which a study design can still be considered random, and the influence this may have on the choice of appropriate statistical data analysis. We take a study which relied on voluntary participation for the sampling of private water tap chemical composition in England, UK. This study was designed and implemented as a categorical, randomised study. The local geological classes were grouped into 10 types, considered to be most important in their likely effects on groundwater chemistry (the source of all the tap waters sampled). Locations of the users of private water supplies were made available to the study group by the Local Authority in the area. These were then assigned, based on location, to geological groups 1 to 10 and randomised within each group. However, permission to collect samples then required active, voluntary participation by householders and thus, unlike many environmental studies, could not always follow the initial sample design. Impediments to participation ranged from 'willing but not available' during the designated sampling period, to a lack of response to requests to sample (assumed to be wholly unwilling or unable to participate). Additionally, a small number of unplanned samples were collected via new participants making themselves known to the sampling teams during the sampling period. Here we examine the impact this has on the 'random' nature of the resulting data distribution, by comparison with the non-participating known supplies. We consider the implications this has for the choice of statistical analysis methods to predict values and uncertainty at un-sampled locations.
Tuffaha, Haitham W; Reynolds, Heather; Gordon, Louisa G; Rickard, Claire M; Scuffham, Paul A
2014-12-01
Value of information analysis has been proposed as an alternative to the standard hypothesis testing approach, which is based on type I and type II errors, in determining sample sizes for randomized clinical trials. However, in addition to sample size calculation, value of information analysis can optimize other aspects of research design, such as possible comparator arms and alternative follow-up times, by considering trial designs that maximize the expected net benefit of research, which is the difference between the expected value of additional information and the expected cost of the trial. To apply value of information methods to the results of a pilot study on catheter securement devices to determine the optimal design of a future larger clinical trial. An economic evaluation was performed using data from a multi-arm randomized controlled pilot study comparing the efficacy of four types of catheter securement devices: standard polyurethane, tissue adhesive, bordered polyurethane and sutureless securement device. Probabilistic Monte Carlo simulation was used to characterize uncertainty surrounding the study results and to calculate the expected value of additional information. To guide the optimal future trial design, the expected costs and benefits of the alternative trial designs were estimated and compared. Analysis of the value of further information indicated that a randomized controlled trial on catheter securement devices is potentially worthwhile. Among the possible designs for the future trial, a four-arm study with 220 patients/arm would provide the highest expected net benefit, corresponding to a 130% return-on-investment. The initially considered design of 388 patients/arm, based on hypothesis testing calculations, would provide a lower net benefit, with a return-on-investment of 79%. Cost-effectiveness and value of information analyses were based on the data from a single pilot trial, which might affect the accuracy of our uncertainty estimation. 
Another limitation was that different follow-up durations for the larger trial were not evaluated. The value of information approach allows efficient trial design by maximizing the expected net benefit of additional research. This approach should be considered early in the design of randomized clinical trials. © The Author(s) 2014.
ERIC Educational Resources Information Center
Hazell, Philip L.; Kohn, Michael R.; Dickson, Ruth; Walton, Richard J.; Granger, Renee E.; van Wyk, Gregory W.
2011-01-01
Objective: Previous studies comparing atomoxetine and methylphenidate to treat ADHD symptoms have been equivocal. This noninferiority meta-analysis compared core ADHD symptom response between atomoxetine and methylphenidate in children and adolescents. Method: Selection criteria included randomized, controlled design; duration 6 weeks; and…
Who Recommends Long-Term Care Matters
ERIC Educational Resources Information Center
Kane, Robert L.; Bershadsky, Boris; Bershadsky, Julie
2006-01-01
Purpose: Making good consumer decisions requires having good information. This study compared long-term-care recommendations among various types of health professionals. Design and Methods: We gave randomly varied scenarios to a convenience national sample of 211 professionals from varying disciplines and work locations. For each scenario, we…
Exercising Attention within the Classroom
ERIC Educational Resources Information Center
Hill, Liam; Williams, Justin H. G.; Aucott, Lorna; Milne, June; Thomson, Jenny; Greig, Jessie; Munro, Val; Mon-Williams, Mark
2010-01-01
Aim: To investigate whether increased physical exercise during the school day influenced subsequent cognitive performance in the classroom. Method: A randomized, crossover-design trial (two weeks in duration) was conducted in six mainstream primary schools (1224 children aged 8-11y). No data on sex was available. Children received a…
Sampling Methods in Cardiovascular Nursing Research: An Overview.
Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie
2014-01-01
Cardiovascular nursing research covers a wide array of topics, from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling: simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion in the research study, and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences between sampling techniques may aid nurses in the effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
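Three of the probability sampling methods mentioned above can be illustrated directly with the standard library; the population of 100 records and the two strata below are illustrative assumptions, not data from the column.

```python
import random

population = list(range(1, 101))       # e.g. 100 patient record IDs
rng = random.Random(0)

# Simple random sampling: every element has an equal, independent chance.
simple = rng.sample(population, 10)

# Systematic sampling: random start, then every k-th element.
k = len(population) // 10
start = rng.randrange(k)
systematic = population[start::k]

# Stratified sampling: random sampling within predefined strata
# (here, two halves of the population standing in for two subgroups).
strata = [population[:50], population[50:]]
stratified = [x for s in strata for x in rng.sample(s, 5)]
```

Cluster and multi-stage sampling follow the same machinery, with `rng.sample` applied first to whole groups (e.g. hospital wards) and then, optionally, to elements within the selected groups.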
Multilattice sampling strategies for region of interest dynamic MRI.
Rilling, Gabriel; Tao, Yuehui; Marshall, Ian; Davies, Mike E
2013-08-01
A multilattice sampling approach is proposed for dynamic MRI with Cartesian trajectories. It relies on the use of sampling patterns composed of several different lattices and exploits an image model where only some parts of the image are dynamic, whereas the rest is assumed static. Given the parameters of such an image model, the methodology followed for the design of a multilattice sampling pattern adapted to the model is described. The multilattice approach is compared to single-lattice sampling, as used by traditional acceleration methods such as UNFOLD (UNaliasing by Fourier-Encoding the Overlaps using the temporal Dimension) or k-t BLAST, and random sampling used by modern compressed sensing-based methods. On the considered image model, it allows more flexibility and higher accelerations than lattice sampling and better performance than random sampling. The method is illustrated on a phase-contrast carotid blood velocity mapping MR experiment. Combining the multilattice approach with the KEYHOLE technique allows up to 12× acceleration factors. Simulation and in vivo undersampling results validate the method. Compared to lattice and random sampling, multilattice sampling provides significant gains at high acceleration factors. © 2012 Wiley Periodicals, Inc.
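The idea of a sampling pattern composed of several lattices can be sketched as a k-t mask built from the union of sheared Cartesian lattices of different densities; the grid sizes, acceleration factors, and shear steps below are illustrative assumptions and do not reproduce the paper's design procedure.

```python
import numpy as np

def lattice_mask(nk, nt, accel, offset_step):
    """One k-t lattice: at frame t, sample every `accel`-th k-space line,
    with the offset shifted by (offset_step * t) mod accel across frames
    (a sheared Cartesian lattice)."""
    mask = np.zeros((nt, nk), dtype=bool)
    for t in range(nt):
        mask[t, (offset_step * t) % accel::accel] = True
    return mask

# Multilattice pattern: union of a sparse and a denser lattice, matching
# an image model where only part of the field of view is dynamic.
mask = lattice_mask(64, 16, accel=8, offset_step=3) | \
       lattice_mask(64, 16, accel=4, offset_step=1)
```

Each constituent lattice aliases its part of the image in a controlled way, which is what distinguishes this construction from the incoherent random masks used by compressed sensing methods.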
Computational methods for efficient structural reliability and reliability sensitivity analysis
NASA Technical Reports Server (NTRS)
Wu, Y.-T.
1993-01-01
This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
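The principle behind importance sampling for reliability can be shown on a scalar toy problem with a known answer: the failure probability P(X > 3) for standard normal X is estimated by sampling from a density centred on the failure region and reweighting by the ratio of the true to the sampling density. The adaptive, incremental domain updates of the AIS method itself are omitted from this sketch, and the shifted-normal sampling density is an illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 3.0                       # limit state: failure when x > 3
p_true = 0.0013499               # 1 - Phi(3), for reference

# Importance sampling: draw from N(beta, 1), centred on the failure
# region, and reweight each sample by phi(y) / phi(y - beta). The
# normalizing constants cancel, so only the exponentials remain.
n = 50_000
y = rng.normal(beta, 1.0, n)
w = np.exp(-0.5 * y**2) / np.exp(-0.5 * (y - beta)**2)
p_is = np.mean((y > beta) * w)
```

A crude Monte Carlo estimate with the same n would see only ~70 failures on average; concentrating samples near the failure domain is what makes estimates of small probabilities (and their sensitivities) affordable.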
Randomized trial of anesthetic methods for intravitreal injections.
Blaha, Gregory R; Tilton, Elisha P; Barouch, Fina C; Marx, Jeffrey L
2011-03-01
To compare the effectiveness of four different anesthetic methods for intravitreal injection. Twenty-four patients each received four intravitreal injections using each of four types of anesthesia (proparacaine, tetracaine, lidocaine pledget, and subconjunctival injection of lidocaine) in a prospective, masked, randomized block design. Pain was graded by the patient on a 0 to 10 scale for both the anesthesia and the injection. The average combined pain scores for both the anesthesia and the intravitreal injection were 4.4 for the lidocaine pledget, 3.5 for topical proparacaine, 3.8 for the subconjunctival lidocaine injection, and 4.1 for topical tetracaine. The differences were not significant (P = 0.65). There were also no statistical differences in the individual anesthesia or injection pain scores. Subconjunctival lidocaine injection had the most side effects. Topical anesthesia is an effective method for limiting pain associated with intravitreal injections.
A new method for reconstruction of solar irradiance
NASA Astrophysics Data System (ADS)
Privalsky, Victor
2018-07-01
The purpose of this research is to show how time series should be reconstructed, using an example with data on the total solar irradiance (TSI) of the Earth and on sunspot numbers (SSN) since 1749. The traditional approach through regression equation(s) is designed for time-invariant vectors of random variables and is not applicable to time series, which are random functions of time. The autoregressive reconstruction (ARR) method suggested here requires fitting a multivariate stochastic difference equation to the target/proxy time series. The reconstruction is done through the scalar equation for the target time series with the white noise term excluded. The time series approach is shown to provide a better reconstruction of TSI than the correlation/regression method. A reconstruction criterion is introduced which allows one to define in advance the achievable level of success in the reconstruction. The conclusion is that time series, including the total solar irradiance, cannot be reconstructed properly if the data are not treated as sample records of random processes and analyzed in both the time and frequency domains.
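The autoregressive reconstruction idea can be sketched on synthetic data: fit the scalar equation for the target series of a bivariate AR(1) pair by least squares, then run it forward driven by the proxy with the white-noise term excluded. The model order, coefficients, and noise levels are illustrative assumptions standing in for the TSI/SSN pair.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400
# Synthetic target x (standing in for TSI) and proxy y (standing in for
# SSN) following a bivariate AR(1) stochastic difference equation.
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.7 * x[t-1] + 0.2 * y[t-1] + 0.3 * rng.standard_normal()
    y[t] = 0.6 * y[t-1] + rng.standard_normal()

# Fit the scalar equation for the target by least squares, then
# reconstruct by iterating it with the white-noise term excluded.
A = np.column_stack([x[:-1], y[:-1]])
coef, *_ = np.linalg.lstsq(A, x[1:], rcond=None)
recon = np.zeros(n)
for t in range(1, n):
    recon[t] = coef[0] * recon[t-1] + coef[1] * y[t-1]
```

Dropping the noise term means the reconstruction recovers only the proxy-driven, predictable part of the target, which is exactly what the reconstruction criterion in the paper is meant to quantify in advance.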
Jones, Hendrée E.; Fischer, Gabriele; Heil, Sarah H.; Kaltenbach, Karol; Martin, Peter R.; Coyle, Mara G.; Selby, Peter; Stine, Susan M.; O’Grady, Kevin E.; Arria, Amelia M.
2015-01-01
Aims The Maternal Opioid Treatment: Human Experimental Research (MOTHER) project, an eight-site randomized, double-blind, double-dummy, flexible-dosing, parallel-group clinical trial is described. This study is the most current – and single most comprehensive – research effort to investigate the safety and efficacy of maternal and prenatal exposure to methadone and buprenorphine. Methods The MOTHER study design is outlined, and its basic features are presented. Conclusions At least seven important lessons have been learned from the MOTHER study: (1) an interdisciplinary focus improves the design and methods of a randomized clinical trial; (2) multiple sites in a clinical trial present continuing challenges to the investigative team due to variations in recruitment goals, patient populations, and hospital practices that in turn differentially impact recruitment rates, treatment compliance, and attrition; (3) study design and protocols must be flexible in order to meet the unforeseen demands of both research and clinical management; (4) staff turnover needs to be addressed with a proactive focus on both hiring and training; (5) the implementation of a protocol for the treatment of a particular disorder may identify important ancillary clinical issues worthy of investigation; (6) timely tracking of data in a multi-site trial is both demanding and unforgiving; and, (7) complex multi-site trials pose unanticipated challenges that complicate the choice of statistical methods, thereby placing added demands on investigators to effectively communicate their results. PMID:23106924
Handheld laser scanner automatic registration based on random coding
NASA Astrophysics Data System (ADS)
He, Lei; Yu, Chun-ping; Wang, Li
2011-06-01
Current research on laser scanners focuses mainly on static measurement; little use has been made of dynamic measurement, which is appropriate for a wider range of problems and situations. In particular, a traditional laser scanner must be kept stable while scanning and must measure coordinate transformation parameters between stations. To make scanning measurement intelligent and rapid, this paper develops a new registration algorithm for a handheld laser scanner based on the positions of targets, which realizes dynamic measurement without additional complex work. The two cameras on the laser scanner photograph artificial target points, designed with random coding, to obtain their three-dimensional coordinates. A set of matched points is then found among the control points to orient the scanner by least-squares common-point transformation. The two cameras can then directly measure the laser point cloud on the object surface and obtain point-cloud data in a unified coordinate system. The paper makes three major contributions. First, a laser scanner based on binocular vision is designed with two cameras and one laser head, realizing real-time orientation of the scanner and improving efficiency. Second, coded markers are introduced to solve the data matching problem, and a random coding method is proposed; compared with other coding methods, markers coded this way are simple to match and avoid shading the object. Finally, a recognition method for the coded markers is proposed which uses distance recognition and is more efficient. The method presented here can be used widely in measurements of objects from small to huge, such as vehicles and airplanes, strengthening intelligence and efficiency.
Theoretical analysis and experimental results demonstrate that the proposed method realizes dynamic measurement with a handheld laser scanner and that the method is reasonable and efficient.
Effect of aperiodicity on the broadband reflection of silicon nanorod structures for photovoltaics.
Lin, Chenxi; Huang, Ningfeng; Povinelli, Michelle L
2012-01-02
We carry out a systematic numerical study of the effects of aperiodicity on silicon nanorod anti-reflection structures. We use the scattering matrix method to calculate the average reflection loss over the solar spectrum for periodic and aperiodic arrangements of nanorods. We find that aperiodicity can either improve or deteriorate the anti-reflection performance, depending on the nanorod diameter. We use a guided random-walk algorithm to design optimal aperiodic structures that exhibit lower reflection loss than both optimal periodic and random aperiodic structures.
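The guided random-walk optimization can be sketched with a greedy accept/reject rule on a toy cost function standing in for the scattering-matrix reflection-loss evaluation; the step size, iteration count, and quadratic cost below are illustrative assumptions, not the paper's actual objective.

```python
import random

def guided_random_walk(cost, x0, step, iters, rng):
    """Guided random walk: propose a random perturbation of the current
    design vector and keep it only if the cost decreases; otherwise stay
    put. Returns the best design found and its cost."""
    x, best = list(x0), cost(x0)
    for _ in range(iters):
        cand = [xi + rng.uniform(-step, step) for xi in x]
        c = cost(cand)
        if c < best:
            x, best = cand, c
    return x, best

# Toy stand-in for the average reflection loss of a nanorod arrangement:
# a quadratic bowl with its optimum at (1, 1, 1, 1).
cost = lambda v: sum((vi - 1.0) ** 2 for vi in v)
rng = random.Random(0)
x, best = guided_random_walk(cost, [0.0] * 4, step=0.2, iters=2000, rng=rng)
```

In the paper's setting each cost evaluation is a full scattering-matrix computation over the solar spectrum, so the walk trades many cheap random proposals for a small number of accepted, strictly improving design changes.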
Design, objectives, execution and reporting of published open-label extension studies.
Bowers, Megan; Pickering, Ruth M; Weatherall, Mark
2012-04-01
Open-label extension (OLE) studies following blinded randomized controlled trials (RCTs) of pharmaceuticals are increasingly being carried out but do not conform to regulatory standards, and questions surround the validity of their evidence. OLE studies are usually discussed as a homogenous group, yet substantially different study designs still meet the definition of an OLE. We describe published papers reporting OLE studies, focussing on stated objectives, design, conduct and reporting. A search of Embase and Medline databases for 1996 to July 2008 revealed 268 papers reporting OLE studies that met our eligibility criteria. A random sample of 50 was selected for detailed review. Over 80% of the studies had efficacy stated as an objective. The most common methods of allocation at the start of the OLE were for all RCT participants to switch to one active treatment or for only participants on the new drug to continue, but in three studies all participants were re-randomized at the start of the OLE. Eligibility criteria and other selection factors resulted in an average of 74% of participants in the preceding RCT(s) enrolling in the OLE, and only 57% completed it. Published OLE studies do not form a homogenous group with respect to design or retention of participants, and thus the validity of evidence from an OLE should be judged on an individual basis. The term 'open label' suggests bias through lack of blinding, but slippage in relation to the sample randomized in the preceding RCT may be the more important threat to validity. © 2010 Blackwell Publishing Ltd.
Reliability-Based Design Optimization of a Composite Airframe Component
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Coroneos, Rula; Patnaik, Surya N.
2011-01-01
A stochastic design optimization (SDO) methodology has been developed to design airframe structural components made of metallic and composite materials. The design method accommodates uncertainties in load, strength, and material properties that are defined by distribution functions with mean values and standard deviations. A response parameter, such as a failure mode, thereby becomes a function of reliability. The primitive variables, such as thermomechanical loads, material properties, and failure theories, as well as geometric variables such as the depth of a beam or the thickness of a membrane, are considered random parameters with specified distribution functions defined by mean values and standard deviations.
Bruynseels, Daniel; Solomon, Cristina; Hallam, Angela; Collins, Peter W; Collis, Rachel E; Hamlyn, Vincent; Hall, Judith E
2016-01-01
The gold standard of trial design is the double-blind, placebo-controlled, randomized trial. Intravenous medication, which needs reconstitution by the attending clinician in an emergency situation, can be challenging to incorporate into a suitably blinded study. We have developed a method of blindly reconstituting and administering fibrinogen concentrate (presented as a lyophilized powder), where the placebo is normal saline. Fibrinogen concentrate is increasingly being used early in the treatment of major hemorrhage. Our methodology was designed for a multicenter study investigating the role of fibrinogen concentrate in the treatment of the coagulopathy associated with major obstetric hemorrhage. The method has been verified by a stand-alone pharmaceutical manufacturing unit with an investigational medicinal products license, and to date has successfully been applied 45 times in four study centers. There have been no difficulties in reconstitution and no related adverse events reported. We feel our method is simple to perform and maintains blinding throughout, making it potentially suitable for use in other trials conducted in psychologically high-pressure environments. Although fibrinogen concentrate was the focus of our study, it is likely that the method is applicable to other lyophilized medication with limited shelf life (e.g., antibiotics). Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Improving waveform inversion using modified interferometric imaging condition
NASA Astrophysics Data System (ADS)
Guo, Xuebao; Liu, Hong; Shi, Ying; Wang, Weihong; Zhang, Zhen
2017-12-01
Similar to reverse-time migration, full waveform inversion in the time domain is a memory-intensive processing method. The computational storage size for waveform inversion mainly depends on the model size and the time recording length. In general, 3D and 4D data volumes need to be saved for 2D and 3D waveform inversion gradient calculations, respectively, and even the boundary-region wavefield-saving strategy creates a huge storage demand. Using the last two slices of the wavefield to reconstruct the wavefield at other times through a random boundary avoids the need to store a large number of wavefields; however, the traditional random boundary method is less effective at low frequencies. In this study, we follow a new random boundary designed to regenerate random velocity anomalies in the boundary region for each shot of each iteration. Under the random boundary condition, results in less illuminated areas are more seriously affected by random scattering than other areas due to the lack of coverage. In this paper, we replace direct correlation for computing the waveform inversion gradient with modified interferometric imaging, which enhances the continuity of the imaging path and reduces noise interference. The new imaging condition, a weighted average over extended imaging gathers, can be used directly in the gradient computation. In this process, we do not change the objective function, and the role of the imaging condition is similar to regularization. The window size for the modified interferometric imaging condition-based waveform inversion plays an important role in this process. Numerical examples show that the proposed method significantly enhances waveform inversion performance.
Probabilistic Modeling of Settlement Risk at Land Disposal Facilities - 12304
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foye, Kevin C.; Soong, Te-Yang
2012-07-01
The long-term reliability of land disposal facility final cover systems - and therefore the overall waste containment - depends on the distortions imposed on these systems by differential settlement/subsidence. The evaluation of differential settlement is challenging because of the heterogeneity of the waste mass (caused by inconsistent compaction, void space distribution, debris-soil mix ratio, waste material stiffness, time-dependent primary compression of the fine-grained soil matrix, long-term creep settlement of the soil matrix and the debris, etc.) at most land disposal facilities. Deterministic approaches to long-term final cover settlement prediction are not able to capture the spatial variability in the waste mass and sub-grade properties which control differential settlement. An alternative, probabilistic solution is to use random fields to model the waste and sub-grade properties. The modeling effort informs the design, construction, operation, and maintenance of land disposal facilities. A probabilistic method to establish design criteria for waste placement and compaction is introduced using the model. Random fields are ideally suited to problems of differential settlement modeling of highly heterogeneous foundations, such as waste. Random fields model the seemingly random spatial distribution of a design parameter, such as compressibility. When used for design, the use of these models prompts the need for probabilistic design criteria. It also allows for a statistical approach to waste placement acceptance criteria. An example design evaluation was performed, illustrating the use of the probabilistic differential settlement simulation methodology to assemble a design guidance chart. The purpose of this design evaluation is to enable the designer to select optimal initial combinations of design slopes and quality control acceptance criteria that yield an acceptable proportion of post-settlement slopes meeting some design minimum. 
For this specific example, relative density, which can be determined through field measurements, was selected as the field quality control parameter for waste placement. This technique can be extended into a rigorous performance-based methodology using other parameters (void space criteria, debris-soil mix ratio, pre-loading, etc.). As shown in this example, each parameter range, or set of parameter ranges, can be selected such that it results in acceptable long-term differential settlement according to the probabilistic model. The methodology can also be used to re-evaluate the long-term differential settlement behavior at closed land disposal facilities to identify problematic facilities, if any, so that remedial action (e.g., reinforcement of upper and intermediate waste layers) can be implemented. Considering the inherent spatial variability in waste and earth materials and the need for engineers to apply sound quantitative practices to engineering analysis, it is important to apply the available probabilistic techniques to problems of differential settlement. One such method to implement probability-based differential settlement analyses for the design of landfill final covers has been presented. The design evaluation technique presented is one tool to bridge the gap from deterministic practice to probabilistic practice. (authors)
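The design-guidance idea above can be illustrated with a Monte Carlo sketch: for each candidate design slope, simulate differential settlement between the ends of many cover profiles and report the fraction that still meet a minimum post-settlement grade. This is a simplification under stated assumptions - independent lognormal end settlements rather than the paper's spatially correlated random fields, and all numbers (span, mean settlement, coefficient of variation) are illustrative, not from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

def acceptable_slope_fraction(design_slope_pct, n_profiles=10_000,
                              span_m=30.0, mean_settle_m=0.6, cov=0.4,
                              min_slope_pct=2.0):
    # Settlement at each end of a cover profile, drawn from independent
    # lognormals (a crude stand-in for a correlated random-field model).
    sigma = np.sqrt(np.log(1.0 + cov**2))
    mu = np.log(mean_settle_m) - 0.5 * sigma**2
    s_up = rng.lognormal(mu, sigma, n_profiles)    # upslope end [m]
    s_down = rng.lognormal(mu, sigma, n_profiles)  # downslope end [m]
    rise = design_slope_pct / 100.0 * span_m - (s_up - s_down)
    return float(np.mean(100.0 * rise / span_m >= min_slope_pct))

# One row of a design guidance chart: steeper initial slopes leave a
# larger proportion of profiles meeting the 2% minimum after settlement.
for slope_pct in (3.0, 4.0, 5.0):
    print(slope_pct, acceptable_slope_fraction(slope_pct))
```

Repeating this over a grid of design slopes and acceptance criteria (e.g. relative density targets that shift the settlement distribution) yields the kind of guidance chart the abstract describes.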
McFarlane, Judith; Karmaliani, Rozina; Maqbool Ahmed Khuwaja, Hussain; Gulzar, Saleema; Somani, Rozina; Saeed Ali, Tazeen; Somani, Yasmeen H; Shehzad Bhamani, Shireen; Krone, Ryan D; Paulson, Rene M; Muhammad, Atta; Jewkes, Rachel
2017-01-01
Background: Violence against and among children is a global public health problem that annually affects 50% of youth worldwide, with major impacts on child development, education, and health, including an increased probability of major causes of morbidity and mortality in adulthood. It is also associated with the experience and perpetration of later violence against women. The aim of this article is to describe the intervention, study design, methods, and baseline findings of a cluster randomized controlled trial underway in Pakistan to evaluate a school-based play intervention aiming to reduce peer violence and enhance mental health. Methods: A cluster randomized controlled design is being conducted with boys and girls in grade 6 in 40 schools in Hyderabad, Pakistan, over a period of 2 years. The Multidimensional Peer-Victimization and Peer Perpetration Scales and the Children's Depression Inventory 2 (CDI 2) are being used to measure the primary outcomes, while investigator-derived scales are being used to assess domestic violence within the family. This report covers the specifics of the intervention, field logistics, and the ethical and fidelity management issues involved in testing the program's impact on school-age youth in a volatile and politically unstable country. Baseline Results: A total of 1,752 school-age youth were enrolled and interviewed at baseline. Over the preceding 4 weeks, 94% of the boys and 85% of the girls reported 1 or more occurrences of victimization, and 85% of the boys and 66% of the girls reported 1 or more acts of perpetration. Boys reported more depression compared with girls, as well as higher negative mood and self-esteem scores and more interpersonal and emotional problems. Interpretation: Globally, prevalence of youth violence perpetration and victimization is high and associated with poor physical and emotional health. 
Applying a randomized controlled design to evaluate a peer violence prevention program that is built on a firm infrastructure and ready for scale-up and sustainability will make an important contribution to identifying evidence-informed interventions that can reduce youth victimization and perpetration. PMID:28351880
Phytostabilization of a landfill containing coal combustion waste
Christopher Barton; Donald Marx; Domy Adriano; Bon Jun Koo; Lee Newman; Stephen Czapka; John Blake
2005-01-01
The establishment of a vegetative cover to enhance evapotranspiration and control runoff and drainage was examined as a method for stabilizing a landfill containing coal combustion waste. Suitable plant species and pretreatment techniques in the form of amendments, tilling, and chemical stabilization were evaluated. A randomized plot design consisting of three...
Single-Phase Mail Survey Design for Rare Population Subgroups
ERIC Educational Resources Information Center
Brick, J. Michael; Andrews, William R.; Mathiowetz, Nancy A.
2016-01-01
Although using random digit dialing (RDD) telephone samples was the preferred method for conducting surveys of households for many years, declining response and coverage rates have led researchers to explore alternative approaches. The use of address-based sampling (ABS) has been examined for sampling the general population and subgroups, most…
Leader Positivity and Follower Creativity: An Experimental Analysis
ERIC Educational Resources Information Center
Avey, James B.; Richmond, F. Lynn; Nixon, Don R.
2012-01-01
Using an experimental research design, 191 working adults were randomly assigned to two experimental conditions in order to test a theoretical model linking leader and follower positive psychological capital (PsyCap). Multiple methods were used to gather information from the participants. We found when leader PsyCap was manipulated experimentally,…
An Intervention to Improve Motivation for Homework
ERIC Educational Resources Information Center
Akioka, Elisabeth; Gilmore, Linda
2013-01-01
A repeated measures design, with randomly assigned intervention and control groups and multiple sources of information on each participant, was used to examine whether changing the method of delivery of a school's homework program in order to better meet the students' needs for autonomy, relatedness and competence would lead to more positive…
Relationship of Study Habits with Mathematics Achievement
ERIC Educational Resources Information Center
Odiri, Onoshakpokaiye E.
2015-01-01
The study examined the relationship of study habits of students and their achievement in mathematics. The method used for the study was correlation design. A sample of 500 students were randomly selected from 25 public secondary schools in Delta Central Senatorial District, Delta State, Nigeria. Questionnaires were drawn to gather data on…
Problem-Solving Therapy for Depression in Adults: A Systematic Review
ERIC Educational Resources Information Center
Gellis, Zvi D.; Kenaley, Bonnie
2008-01-01
Objectives: This article presents a systematic review of the evidence on problem-solving therapy (PST) for depressive disorders in noninstitutionalized adults. Method: Intervention studies using randomized controlled designs are included and methodological quality is assessed using a standard set of criteria from the Cochrane Collaborative Review…
INVESTIGATION OF ORGANIC WEED CONTROL METHODS, PESTICIDE SPECIAL STUDY, COLORADO STATE UNIVERSITY
The project is proposed for the 2003 and 2004 growing seasons. Corn gluten meal (CGM), treated paper mulch and plastic mulch, along with conventional herbicide, will be applied to fields of drip irrigated broccoli in a randomized complete block design with 6 replicates. Due to ...
How Do Hired Workers Fare under Consumer-Directed Personal Care?
ERIC Educational Resources Information Center
Dale, Stacy; Brown, Randall; Phillips, Barbara; Carlson, Barbara Lepidus
2005-01-01
Purpose: This study describes the experiences of workers hired under consumer direction. Design and Methods: Medicaid beneficiaries who volunteered for the Cash and Counseling demonstration were randomly assigned to the treatment group, which could participate in the consumer-directed program, or the control group, which was referred to agency…
Supervisor Attachment, Supervisory Working Alliance, and Affect in Social Work Field Instruction
ERIC Educational Resources Information Center
Bennett, Susanne; Mohr, Jonathan; Deal, Kathleen Holtz; Hwang, Jeongha
2013-01-01
Objective: This study focused on interrelationships among supervisor attachment, supervisory working alliance, and supervision-related affect, plus the moderating effect of a field instructor training. Method: The researchers employed a pretest-posttest follow-up design of 100 randomly assigned field instructors and 64 students in two…
Resource Utilisation and Curriculum Implementation in Community Colleges in Kenya
ERIC Educational Resources Information Center
Kigwilu, Peter Changilwa; Akala, Winston Jumba
2017-01-01
The study investigated how Catholic-sponsored community colleges in Nairobi utilise the existing physical facilities and teaching and learning resources for effective implementation of Artisan and Craft curricula. The study adopted a mixed methods research design. Proportional stratified random sampling was used to sample 172 students and 18…
The Efficacy of Stuttering Measurement Training: Evaluating Two Training Programs
ERIC Educational Resources Information Center
Bainbridge, Lauren A.; Stavros, Candace; Ebrahimian, Mineh; Wang, Yuedong; Ingham, Roger J.
2015-01-01
Purpose: Two stuttering measurement training programs currently used for training clinicians were evaluated for their efficacy in improving the accuracy of total stuttering event counting. Method: Four groups, each with 12 randomly allocated participants, completed a pretest-posttest design training study. They were evaluated by their counts of…
Variable Selection in the Presence of Missing Data: Imputation-based Methods.
Zhao, Yize; Long, Qi
2017-01-01
Variable selection plays an essential role in regression analysis, as it identifies important variables that are associated with outcomes and is known to improve the predictive accuracy of the resulting models. Variable selection methods have been widely investigated for fully observed data. However, in the presence of missing data, methods for variable selection need to be carefully designed to account for missing data mechanisms and the statistical techniques used for handling missing data. Since imputation is arguably the most popular method for handling missing data due to its ease of use, statistical methods for variable selection that are combined with imputation are of particular interest. These methods, valid under the assumptions of missing at random (MAR) and missing completely at random (MCAR), largely fall into three general strategies. The first strategy applies existing variable selection methods to each imputed dataset and then combines the variable selection results across all imputed datasets. The second strategy applies existing variable selection methods to stacked imputed datasets. The third strategy combines resampling techniques such as the bootstrap with imputation. Despite recent advances, this area remains under-developed and offers fertile ground for further research.
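The first strategy above - select in each imputed dataset, then combine by majority vote - can be sketched on toy data. This is a hedged illustration: the imputation is a crude mean-plus-noise draw (a stand-in for MICE), and marginal-correlation screening stands in for a full selection method such as the lasso; none of this is the paper's own algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y depends on columns 0 and 1 only; 20% of X is MCAR-missing.
n, p = 300, 8
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.5, size=n)
X_miss = X.copy()
X_miss[rng.random((n, p)) < 0.2] = np.nan

def impute_once(Xm):
    # Single stochastic imputation: column mean plus noise scaled by the
    # observed column standard deviation (a crude stand-in for MICE).
    Xi = Xm.copy()
    for j in range(Xi.shape[1]):
        col = Xi[:, j]
        miss = np.isnan(col)
        col[miss] = rng.normal(np.nanmean(col), np.nanstd(col), miss.sum())
    return Xi

def select(Xi, y, thresh=0.3):
    # Marginal-correlation screening (toy stand-in for lasso selection).
    r = np.array([abs(np.corrcoef(Xi[:, j], y)[0, 1]) for j in range(Xi.shape[1])])
    return r > thresh

m = 20                                              # number of imputations
votes = sum(select(impute_once(X_miss), y) for _ in range(m))
chosen = np.where(votes >= m / 2)[0]                # strategy 1: majority vote
print(chosen)
```

The second strategy would instead stack the `m` imputed datasets into one long matrix and run the selection method once, typically with observation weights of 1/m.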
Araújo, Ricardo de A
2010-12-01
This paper presents a hybrid intelligent methodology to design increasing translation invariant morphological operators applied to Brazilian stock market prediction (overcoming the random walk dilemma). The proposed Translation Invariant Morphological Robust Automatic phase-Adjustment (TIMRAA) method consists of a hybrid intelligent model composed of a Modular Morphological Neural Network (MMNN) with a Quantum-Inspired Evolutionary Algorithm (QIEA), which searches for the best time lags to reconstruct the phase space of the time series generator phenomenon and determines the initial (sub-optimal) parameters of the MMNN. Each individual of the QIEA population is further trained by the Back Propagation (BP) algorithm to improve the MMNN parameters supplied by the QIEA. Also, for each prediction model generated, it uses a behavioral statistical test and a phase fix procedure to adjust time phase distortions observed in stock market time series. Furthermore, an experimental analysis is conducted with the proposed method through four Brazilian stock market time series, and the achieved results are discussed and compared to results found with random walk models and the previously introduced Time-delay Added Evolutionary Forecasting (TAEF) and Morphological-Rank-Linear Time-lag Added Evolutionary Forecasting (MRLTAEF) methods. Copyright © 2010 Elsevier Ltd. All rights reserved.
Monte Carlo Sampling in Fractal Landscapes
NASA Astrophysics Data System (ADS)
Leitão, Jorge C.; Lopes, J. M. Viana Parente; Altmann, Eduardo G.
2013-05-01
We design a random walk to explore fractal landscapes such as those describing chaotic transients in dynamical systems. We show that the random walk moves efficiently only when its step length depends on the height of the landscape via the largest Lyapunov exponent of the chaotic system. We propose a generalization of the Wang-Landau algorithm which constructs not only the density of states (transient time distribution) but also the correct step length. As a result, we obtain a flat-histogram Monte Carlo method which samples fractal landscapes in polynomial time, a dramatic improvement over the exponential scaling of traditional uniform-sampling methods. Our results are not limited by the dimensionality of the landscape and are confirmed numerically in chaotic systems with up to 30 dimensions.
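The Wang-Landau flat-histogram idea referenced above can be shown on a deliberately tiny landscape: estimating the density of states g(E) for the sum of two fair dice. The dice are my illustrative stand-in for the transient-time distribution, not the authors' system; the update rule (accept with probability g(E_old)/g(E_new), refine the modification factor when the histogram is flat) is the standard algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

log_g = np.zeros(11)            # running log g(E); index = sum - 2, E in 2..12
hist = np.zeros(11)
f = 1.0                          # log modification factor
state = np.array([1, 1])

for _round in range(200):        # safety cap; typically ~15 rounds suffice
    if f <= 1e-4:
        break
    for _ in range(20_000):
        prop = state.copy()
        prop[rng.integers(2)] = rng.integers(1, 7)        # re-roll one die
        e_old, e_new = state.sum() - 2, prop.sum() - 2
        # Accept with prob min(1, g(E_old)/g(E_new)): flat histogram in E.
        if np.log(rng.random()) < log_g[e_old] - log_g[e_new]:
            state = prop
        e = state.sum() - 2
        log_g[e] += f
        hist[e] += 1
    if hist.min() > 0.8 * hist.mean():                    # histogram flat enough
        f /= 2.0
        hist[:] = 0

g = np.exp(log_g - log_g[0])     # normalize so g(E=2) = 1
print(np.round(g, 1))            # exact counts are 1 2 3 4 5 6 5 4 3 2 1
```

The paper's contribution is choosing the walker's step length via the largest Lyapunov exponent so this same flat-histogram machinery works on fractal landscapes.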
Chen, Minghao; Wei, Shiyou; Hu, Junyan; Yuan, Jing; Liu, Fenghua
2017-01-01
The present study aimed to review the available evidence on whether time-lapse imaging (TLI) yields better outcomes for embryo incubation and selection than conventional methods in clinical in vitro fertilization (IVF). PubMed, EMBASE, the Cochrane Library and ClinicalTrials.gov were searched up to February 2017 for randomized controlled trials (RCTs) comparing TLI with conventional methods. Studies that randomized either women or oocytes were included. For studies that randomized women, the primary outcomes were live birth and ongoing pregnancy, and the secondary outcomes were clinical pregnancy and miscarriage; for studies that randomized oocytes, the primary outcome was blastocyst rate, and the secondary outcome was good-quality embryo on Day 2/3. Subgroup analysis was conducted based on differences in incubation and embryo selection between groups. Ten RCTs were included, four that randomized oocytes and six that randomized women. For the oocyte-based review, the pooled analysis observed no significant difference between the TLI group and the control group in blastocyst rate [relative risk (RR) 1.08, 95% CI 0.94-1.25, I2 = 0%, two studies, including 1154 embryos]. The quality of evidence was moderate for all outcomes in the oocyte-based review. For the woman-based review, only one study reported live birth rate (RR 1.23, 95% CI 1.06-1.44, I2 N/A, one study, including 842 women), and the pooled result showed no significant difference in ongoing pregnancy rate (RR 1.04, 95% CI 0.80-1.36, I2 = 59%, four studies, including 1403 women) between the two groups. The quality of the evidence was low or very low for all outcomes in the woman-based review. Currently there is insufficient evidence to support the claim that TLI is superior to conventional methods for human embryo incubation and selection. In view of the limitations and flaws of the included studies, more well-designed RCTs are needed to comprehensively evaluate the effectiveness of clinical TLI use.
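Pooled effect sizes of the kind quoted above (e.g. RR 1.08, 95% CI 0.94-1.25) come from inverse-variance pooling on the log-RR scale. A minimal sketch, using hypothetical 2x2 counts rather than the trial data:

```python
import math

def rr_ci(a, n1, c, n2, z=1.96):
    # Relative risk and 95% CI from 2x2 counts: a/n1 events vs c/n2.
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)      # SE of log(RR), delta method
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

def pool_fixed(studies, z=1.96):
    # Inverse-variance fixed-effect pooling on the log-RR scale.
    num = den = 0.0
    for a, n1, c, n2 in studies:
        lrr = math.log((a / n1) / (c / n2))
        w = 1.0 / (1/a - 1/n1 + 1/c - 1/n2)      # weight = 1 / var(log RR)
        num += w * lrr
        den += w
    lrr, se = num / den, math.sqrt(1.0 / den)
    return math.exp(lrr), math.exp(lrr - z * se), math.exp(lrr + z * se)

# Hypothetical counts (not taken from the included trials):
print(rr_ci(120, 400, 100, 400))
print(pool_fixed([(120, 400, 100, 400), (60, 200, 50, 200)]))
```

A random-effects pool (used when I2 is large, as in the ongoing-pregnancy result) would widen these weights with a between-study variance term.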
2011-01-01
Background We present the design, methods and population characteristics of a large community trial that assessed the efficacy of a weekly supplement containing vitamin A or beta-carotene, at recommended dietary levels, in reducing maternal mortality from early gestation through 12 weeks postpartum. We identify challenges faced and report solutions in implementing an intervention trial under low-resource, rural conditions, including the importance of population choice in promoting generalizability, maintaining rigorous data quality control to reduce inter- and intra- worker variation, and optimizing efficiencies in information and resources flow from and to the field. Methods This trial was a double-masked, cluster-randomized, dual intervention, placebo-controlled trial in a contiguous rural area of ~435 sq km with a population of ~650,000 in Gaibandha and Rangpur Districts of Northwestern Bangladesh. Approximately 120,000 married women of reproductive age underwent 5-weekly home surveillance, of whom ~60,000 were detected as pregnant, enrolled into the trial and gave birth to ~44,000 live-born infants. Upon enrollment, at ~ 9 weeks' gestation, pregnant women received a weekly oral supplement containing vitamin A (7000 ug retinol equivalents (RE)), beta-carotene (42 mg, or ~7000 ug RE) or a placebo through 12 weeks postpartum, according to prior randomized allocation of their cluster of residence. Systems described include enlistment and 5-weekly home surveillance for pregnancy based on menstrual history and urine testing, weekly supervised supplementation, periodic risk factor interviews, maternal and infant vital outcome monitoring, birth defect surveillance and clinical/biochemical substudies. Results The primary outcome was pregnancy-related mortality assessed for 3 months following parturition. 
Secondary outcomes included fetal loss due to miscarriage or stillbirth, infant mortality under three months of age, maternal obstetric and infectious morbidity, infant infectious morbidity, maternal and infant micronutrient status, fetal and infant growth and prematurity, external birth defects and postnatal infant growth to 3 months of age. Conclusion Aspects of study site selection and its "resonance" with national and rural qualities of Bangladesh, the trial's design, methods and allocation group comparability achieved by randomization, field procedures and innovative approaches to solving challenges in trial conduct are described and discussed. This trial is registered with http://Clinicaltrials.gov as protocol NCT00198822. PMID:21510905
Probabilistic structural analysis methods for improving Space Shuttle engine reliability
NASA Technical Reports Server (NTRS)
Boyce, L.
1989-01-01
Probabilistic structural analysis methods are particularly useful in the design and analysis of critical structural components and systems that operate in very severe and uncertain environments. These methods have recently found application in space propulsion systems to improve the structural reliability of Space Shuttle Main Engine (SSME) components. A computer program, NESSUS, based on a deterministic finite-element program and a method of probabilistic analysis (fast probability integration) provides probabilistic structural analysis for selected SSME components. While computationally efficient, it considers both correlated and nonnormal random variables as well as an implicit functional relationship between independent and dependent variables. The program is used to determine the response of a nickel-based superalloy SSME turbopump blade. Results include blade tip displacement statistics due to the variability in blade thickness, modulus of elasticity, Poisson's ratio or density. Modulus of elasticity significantly contributed to blade tip variability while Poisson's ratio did not. Thus, a rational method for choosing parameters to be modeled as random is provided.
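The sensitivity ranking described above (which random inputs drive response variability) can be sketched with a first-order second-moment (FOSM) propagation, a simpler relative of NESSUS's fast probability integration. The structure here is a toy cantilever, not an SSME blade, and all numbers are illustrative assumptions.

```python
# FOSM sketch: variance contributions of random inputs to a cantilever
# tip deflection, delta = F L^3 / (3 E I) with I = b t^3 / 12.
F, L, b = 100.0, 0.1, 0.02                  # load [N], length [m], width [m]
means = {"E": 200e9, "t": 0.004}             # modulus [Pa], thickness [m]
covs = {"E": 0.05, "t": 0.05}                # coefficients of variation

def tip(E, t):
    I = b * t**3 / 12.0
    return F * L**3 / (3.0 * E * I)

d0 = tip(**means)
contrib = {}
for name, m in means.items():
    x = dict(means)
    h = 1e-6 * m
    x[name] = m + h
    grad = (tip(**x) - d0) / h               # finite-difference sensitivity
    contrib[name] = (grad * covs[name] * m) ** 2   # first-order variance term

total = sum(contrib.values())
for name, v in contrib.items():
    print(name, round(v / total, 3))         # t dominates: delta ~ t**-3
```

With equal coefficients of variation, thickness dominates here because deflection scales as t to the minus third power; ranking contributions this way is the same rationale the abstract uses to justify which parameters to model as random.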
Bamberger, Michael; Tarsilla, Michele; Hesse-Biber, Sharlene
2016-04-01
Many widely-used impact evaluation designs, including randomized control trials (RCTs) and quasi-experimental designs (QEDs), frequently fail to detect what are often quite serious unintended consequences of development programs. This seems surprising, as experienced planners and evaluators are well aware that unintended consequences frequently occur. Most evaluation designs are intended to determine whether there is credible evidence (statistical, theory-based or narrative) that programs have achieved their intended objectives, and the logic of many evaluation designs, even those that are considered the most "rigorous," does not permit the identification of outcomes that were not specified in the program design. We take the example of RCTs, as they are considered by many to be the most rigorous evaluation designs. We present a number of cases to illustrate how infusing RCTs with a mixed-methods approach (sometimes called an "RCT+" design) can strengthen the credibility of these designs and capture important unintended consequences. We provide a Mixed Methods Evaluation Framework that identifies nine ways in which unintended consequences can occur, and we apply this framework to two of the case studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Chui, T. F. M.; Yang, Y.
2017-12-01
Green infrastructures (GI) have been widely used to mitigate flood risk, improve surface water quality, and restore predevelopment hydrologic regimes. Commonly used GI include bioretention systems, porous pavements, and green roofs. They are normally sized to fulfil different design criteria (e.g. providing certain storage depths, limiting peak surface flow rates) that are formulated for current climate conditions. While GI commonly have long lifespans, the sensitivity of their performance to climate change is unclear. This study first proposes a method to formulate suitable design criteria to meet different management interests (e.g. different levels of first-flush reduction and peak flow reduction). Typical designs of GI are then proposed. In addition, a high-resolution stochastic design storm generator using copulas and a random cascade model is developed and calibrated against recorded rainfall time series. A few climate change scenarios are then generated by varying the duration and depth of design storms and changing the parameters of the calibrated storm generator. Finally, the performance of GI with typical designs under the randomly synthesized design storms is assessed using numerical modeling. The robustness of the designs is obtained by comparing their performance in the future scenarios to the current one. Overall, this study examines the robustness of current GI design criteria under uncertain future climate conditions, demonstrating whether those criteria should be modified to account for climate change.
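The random cascade component mentioned above can be sketched as a discrete multiplicative cascade that disaggregates a total storm depth into short intervals. This is a generic illustration, not the authors' generator: the Beta-distributed branching weights and the parameter values are assumptions, and the real model would also be calibrated to a rainfall record and coupled to the copula-based storm properties.

```python
import numpy as np

rng = np.random.default_rng(11)

def cascade(total_mm, levels=6, a=0.7):
    # Discrete multiplicative random cascade: split a storm depth into
    # 2**levels intervals.  At each branching the parent depth is divided
    # into fractions w and (1 - w) with w ~ Beta(a, a); smaller `a` gives
    # burstier storms.  Total depth is conserved exactly at every level.
    depths = np.array([total_mm])
    for _ in range(levels):
        w = rng.beta(a, a, size=depths.size)
        depths = np.column_stack((depths * w, depths * (1 - w))).ravel()
    return depths

storm = cascade(50.0)            # a 50 mm storm over 64 sub-intervals
print(storm.sum(), storm.max())
```

Feeding many such synthetic hyetographs, with depths and durations perturbed for climate scenarios, through a GI model is the robustness-testing loop the abstract describes.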
Cook, Richard J; Wei, Wei
2003-07-01
The design of clinical trials is typically based on marginal comparisons of a primary response under two or more treatments. The considerable gains in efficiency afforded by models conditional on one or more baseline responses have been extensively studied for Gaussian models. The purpose of this article is to present methods for the design and analysis of clinical trials in which the response is a count or a point process and a corresponding baseline count is available prior to randomization. The methods are based on a conditional negative binomial model for the response given the baseline count and can be used to examine the effect of introducing selection criteria on power and sample size requirements. We show that designs based on this approach are more efficient than those proposed by McMahon et al. (1994).
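The efficiency gain from conditioning on a baseline count can be seen in a small simulation. This is a hedged sketch, not the paper's method: counts are made negative binomial through a shared gamma frailty so baseline and follow-up are correlated, and a crude change-score z-test stands in for the conditional negative binomial analysis, which would exploit the baseline more fully still. All rates and sample sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

def power_sim(n_per_arm=150, rate=2.0, shape=1.0, rr=0.75, reps=1000):
    # Estimate power to detect treatment rate ratio `rr` at alpha = 0.05,
    # with and without using the pre-randomization baseline count.
    hits = {"unadjusted": 0, "adjusted": 0}
    trt = np.repeat([0, 1], n_per_arm)
    for _ in range(reps):
        lam = rng.gamma(shape, rate / shape, size=2 * n_per_arm)  # frailty
        base = rng.poisson(lam)                                   # baseline
        follow = rng.poisson(lam * np.where(trt == 1, rr, 1.0))   # response
        for label, y in (("unadjusted", follow), ("adjusted", follow - base)):
            d = y[trt == 1].mean() - y[trt == 0].mean()
            se = np.sqrt(y[trt == 1].var(ddof=1) / n_per_arm +
                         y[trt == 0].var(ddof=1) / n_per_arm)
            hits[label] += abs(d) / se > 1.96                     # two-sided
    return {k: v / reps for k, v in hits.items()}

print(power_sim())
```

Even this crude adjustment gains power because subtracting the baseline removes part of the shared-frailty variance, which is the intuition behind the conditional model's efficiency result.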
Soil variability in engineering applications
NASA Astrophysics Data System (ADS)
Vessia, Giovanna
2014-05-01
Natural geomaterials, such as soils and rocks, show spatial variability and heterogeneity in their physical and mechanical properties, which can be measured by field and laboratory testing. Heterogeneity concerns different values of litho-technical parameters in similar lithological units placed close to each other. Variability, on the contrary, is inherent to the formation and evolution processes experienced by each geological unit (homogeneous geomaterials on average) and is captured as a spatial structure of fluctuation of physical property values about their mean trend, e.g. the unit weight, the hydraulic permeability, the friction angle and the cohesion, among others. These spatial variations must be handled by engineering models to achieve reliable designs of structures and infrastructures. Matheron (1962) introduced Geostatistics as the most comprehensive tool for managing the spatial correlation of parameter measurements in a wide range of earth science applications. In the field of engineering geology, Vanmarcke (1977) made the first pioneering attempts to describe and manage the inherent variability of geomaterials, although Terzaghi (1943) had already highlighted that the spatial fluctuations of physical and mechanical parameters used in geotechnical design cannot be neglected. A few years later, Mandelbrot (1983) and Turcotte (1986) interpreted the internal arrangement of geomaterials according to fractal theory. In the same years, Vanmarcke (1983) proposed Random Field Theory, providing mathematical tools to deal with the inherent variability of each geological unit or stratigraphic succession that can be treated as one material. In this approach, measurement fluctuations of physical parameters are interpreted through a spatial variability structure consisting of the correlation function and the scale of fluctuation. 
Fenton and Griffiths (1992) combined random field simulation with the finite element method to produce the Random Finite Element Method (RFEM). This method has been used to investigate the random behavior of soils in the context of a variety of classical geotechnical problems. Subsequent studies collected worldwide variability values for many technical parameters of soils (Phoon and Kulhawy 1999a) and their spatial correlation functions (Phoon and Kulhawy 1999b). In Italy, Cherubini et al. (2007) calculated the spatial variability structure of sandy and clayey soils from standard cone penetration test readings. The large extent of the worldwide measured spatial variability of soils and rocks heavily affects the reliability of geotechnical design, as do other uncertainties introduced by testing devices and engineering models. So far, several methods have been proposed to deal with these sources of uncertainty in engineering design models (e.g. the First Order Reliability Method, the Second Order Reliability Method, the Response Surface Method, High Dimensional Model Representation, etc.). Nowadays, efforts in this field focus on (1) measuring the spatial variability of different rocks and soils and (2) developing numerical models that take spatial variability into account as an additional physical variable. References: Cherubini C., Vessia G. and Pula W. 2007. Statistical soil characterization of Italian sites for reliability analyses. Proc. 2nd Int. Workshop on Characterization and Engineering Properties of Natural Soils, 3-4: 2681-2706. Griffiths D.V. and Fenton G.A. 1993. Seepage beneath water retaining structures founded on spatially random soil. Géotechnique, 43(6): 577-587. Mandelbrot B.B. 1983. The Fractal Geometry of Nature. San Francisco: W.H. Freeman. Matheron G. 1962. Traité de Géostatistique appliquée. Tome 1, Editions Technip, Paris, 334 p. Phoon K.K. and Kulhawy F.H. 1999a. 
Characterization of geotechnical variability. Can Geotech J, 36(4): 612-624. Phoon K.K. and Kulhawy F.H. 1999b. Evaluation of geotechnical property variability. Can Geotech J, 36(4): 625-639. Terzaghi K. 1943. Theoretical Soil Mechanics. New York: John Wiley and Sons. Turcotte D.L. 1986. Fractals and fragmentation. J Geophys Res, 91: 1921-1926. Vanmarcke E.H. 1977. Probabilistic modeling of soil profiles. J Geotech Eng Div, ASCE, 103: 1227-1246. Vanmarcke E.H. 1983. Random fields: analysis and synthesis. MIT Press, Cambridge.
Rahbar, Mohammad H.; Wyatt, Gwen; Sikorskii, Alla; Victorson, David; Ardjomand-Hessabi, Manouchehr
2011-01-01
Background Multisite randomized clinical trials allow for increased research collaboration among investigators and expedite data collection efforts. As a result, government funding agencies typically look favorably upon this approach. As the field of complementary and alternative medicine (CAM) continues to evolve, so do increased calls for the use of more rigorous study design and trial methodologies, which can present challenges for investigators. Purpose To describe the processes involved in the coordination and management of a multisite randomized clinical trial of a CAM intervention. Methods Key aspects related to the coordination and management of a multisite CAM randomized clinical trial are presented, including organizational and site selection considerations, recruitment concerns, and issues related to data collection and randomization to treatment groups. Management and monitoring of data, as well as quality assurance procedures, are described. Finally, a real-world perspective is shared from a recently conducted multisite randomized clinical trial of reflexology for women diagnosed with advanced breast cancer. Results The use of multiple sites in the conduct of CAM-based randomized clinical trials can provide a collaborative and robust approach to study coordination and data collection that maximizes efficiency and ensures the quality of results. Conclusions Multisite randomized clinical trial designs can offer the field of CAM research a more standardized and efficient approach to examine the effectiveness of novel therapies and treatments. Special attention must be given to intervention fidelity, consistent data collection and ensuring data quality. Assessment and reporting of quantitative indicators of data quality should be required. PMID:21664296
Hamm, Michele P; Scott, Shannon D; Klassen, Terry P; Moher, David; Hartling, Lisa
2012-10-18
Pediatric randomized controlled trials (RCTs) are susceptible to a high risk of bias. We examined the barriers and facilitators that pediatric trialists face in the design and conduct of unbiased trials. We used a mixed methods design, with semi-structured interviews building upon the results of a quantitative survey. We surveyed Canadian (n=253) and international (n=600) pediatric trialists regarding their knowledge and awareness of bias and their perceived barriers and facilitators in conducting clinical trials. We then interviewed 13 participants from different subspecialties and geographic locations to gain a more detailed description of how their experiences and attitudes towards research interacted with trial design and conduct. The survey response rate was 23.0% (186/807). 68.1% of respondents agreed that bias is a problem in pediatric RCTs and 72.0% felt that there is sufficient evidence to support changing some aspects of how trials are conducted. Knowledge related to bias was variable, with inconsistent awareness of study design features that may introduce bias into a study. Interview participants highlighted a lack of formal training in research methods, a negative research culture, and the pragmatics of trial conduct as barriers. Facilitators included contact with knowledgeable and supportive colleagues and infrastructure for research. A lack of awareness of bias and negative attitudes towards research present significant barriers in terms of conducting methodologically rigorous pediatric RCTs. Knowledge translation efforts must focus on these issues to ensure the relevance and validity of trial results.
Design of Phase II Non-inferiority Trials.
Jung, Sin-Ho
2017-09-01
With the development of inexpensive treatment regimens and less invasive surgical procedures, we are increasingly confronted with non-inferiority study objectives. A non-inferiority phase III trial requires a roughly four times larger sample size than a comparable standard superiority trial. Because of the large required sample size, we often face feasibility issues in opening a non-inferiority trial. Furthermore, for lack of phase II non-inferiority trial design methods, we have no opportunity to investigate the efficacy of the experimental therapy through a phase II trial. As a result, non-inferiority phase III trials often fail to open and a large number of non-inferiority clinical questions remain unanswered. In this paper, we develop designs for non-inferiority randomized phase II trials with feasible sample sizes. First, we review a design method for non-inferiority phase III trials. We then propose three different designs for non-inferiority phase II trials that can be used under different settings. Each method is demonstrated with examples, and each of the proposed designs is shown to require a reasonable sample size for a non-inferiority phase II trial. The three designs serve different settings but require similar sample sizes, typical of phase II trials.
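The "roughly four times larger" claim in the abstract above can be made concrete with the usual normal-approximation sample-size formula for two proportions. The sketch below is an illustrative calculation under invented rates and margin, not the paper's specific designs: when the non-inferiority margin is half the superiority effect, the required sample size grows by about a factor of four, because the effect term is squared in the denominator.

```python
import math
from statistics import NormalDist

def n_per_arm(p1, p2, margin, alpha=0.05, beta=0.20):
    """One-sided normal-approximation sample size per arm.
    Superiority: margin = 0 and p1 > p2 under H1.
    Non-inferiority: p1 = p2 under H1 and margin > 0."""
    z_a = NormalDist().inv_cdf(1 - alpha)
    z_b = NormalDist().inv_cdf(1 - beta)
    var = p1 * (1 - p1) + p2 * (1 - p2)
    effect = (p1 - p2) + margin        # distance from the null boundary
    return math.ceil((z_a + z_b) ** 2 * var / effect ** 2)

# superiority: detect response rates 0.70 vs 0.55 (effect 0.15)
n_sup = n_per_arm(0.70, 0.55, margin=0.0)
# non-inferiority: true rates equal at 0.70, margin half the effect (0.075)
n_ni = n_per_arm(0.70, 0.70, margin=0.075)
# n_ni / n_sup is roughly 4
```

Halving the margin again would quadruple the sample size once more, which is why phase III non-inferiority trials become infeasible so quickly.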
Peterson, Janey C; Czajkowski, Susan; Charlson, Mary E; Link, Alissa R; Wells, Martin T; Isen, Alice M; Mancuso, Carol A; Allegrante, John P; Boutin-Foster, Carla; Ogedegbe, Gbenga; Jobe, Jared B
2013-04-01
To describe a mixed-methods approach to develop and test a basic behavioral science-informed intervention to motivate behavior change in 3 high-risk clinical populations. Our theoretically derived intervention comprised a combination of positive affect and self-affirmation (PA/SA), which we applied to 3 clinical chronic disease populations. We employed a sequential mixed methods model (EVOLVE) to design and test the PA/SA intervention in order to increase physical activity in people with coronary artery disease (post-percutaneous coronary intervention [PCI]) or asthma (ASM) and to improve medication adherence in African Americans with hypertension (HTN). In an initial qualitative phase, we explored participant values and beliefs. We next pilot tested and refined the intervention and then conducted 3 randomized controlled trials with parallel study design. Participants were randomized to combined PA/SA versus an informational control and were followed bimonthly for 12 months, assessing for health behaviors and interval medical events. Over 4.5 years, we enrolled 1,056 participants. Changes were sequentially made to the intervention during the qualitative and pilot phases. The 3 randomized controlled trials enrolled 242 participants who had undergone PCI, 258 with ASM, and 256 with HTN (n = 756). Overall, 45.1% of PA/SA participants versus 33.6% of informational control participants achieved successful behavior change (p = .001). In multivariate analysis, PA/SA intervention remained a significant predictor of achieving behavior change (p < .002, odds ratio = 1.66, 95% CI [1.22, 2.27]), controlling for baseline negative affect, comorbidity, gender, race/ethnicity, medical events, smoking, and age. The EVOLVE method is a means by which basic behavioral science research can be translated into efficacious interventions for chronic disease populations.
Check-Standard Testing Across Multiple Transonic Wind Tunnels with the Modern Design of Experiments
NASA Technical Reports Server (NTRS)
Deloach, Richard
2012-01-01
This paper reports the results of an analysis of wind tunnel data acquired in support of the Facility Analysis Verification & Operational Reliability (FAVOR) project. The analysis uses methods referred to collectively at Langley Research Center as the Modern Design of Experiments (MDOE). These methods quantify the total variance in a sample of wind tunnel data and partition it into explained and unexplained components. The unexplained component is further partitioned into random and systematic components. This analysis was performed on data acquired in similar wind tunnel tests executed in four different U.S. transonic facilities. The measurement environment of each facility was quantified and compared.
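The variance partitioning described in the abstract above can be illustrated at its simplest with the classical ANOVA identity for a least-squares fit: the total sum of squares splits exactly into an explained (model) component and an unexplained (residual) component. The sketch below uses invented data and ordinary least squares, not the MDOE machinery itself.

```python
def ols_fit(x, y):
    """Ordinary least-squares line fit; returns fitted values."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
             / sum((xi - xbar) ** 2 for xi in x))
    intercept = ybar - slope * xbar
    return [intercept + slope * xi for xi in x]

def partition_variance(y, y_hat):
    """Split total sum of squares into explained and unexplained parts.
    For a least-squares fit with an intercept,
    SST = SS_explained + SS_residual holds exactly."""
    ybar = sum(y) / len(y)
    sst = sum((v - ybar) ** 2 for v in y)
    ss_exp = sum((f - ybar) ** 2 for f in y_hat)            # explained
    ss_res = sum((v - f) ** 2 for v, f in zip(y, y_hat))    # unexplained
    return sst, ss_exp, ss_res

# invented calibration-style data
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [1.0, 2.1, 2.9, 4.2, 4.8]
sst, ss_exp, ss_res = partition_variance(y, ols_fit(x, y))
```

MDOE goes one step further, splitting the residual itself into random and systematic parts, e.g. by comparing within-block replicate scatter against between-block shifts.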
Exact tests using two correlated binomial variables in contemporary cancer clinical trials.
Yu, Jihnhee; Kepner, James L; Iyer, Renuka
2009-12-01
New therapy strategies for the treatment of cancer are rapidly emerging because of recent technology advances in genetics and molecular biology. Although newer targeted therapies can improve survival without measurable changes in tumor size, clinical trial conduct has remained nearly unchanged. When potentially efficacious therapies are tested, current clinical trial design and analysis methods may not be suitable for detecting therapeutic effects. We propose an exact method for testing cytostatic cancer treatments using correlated bivariate binomial random variables to simultaneously assess two primary outcomes. The method is easy to implement. It does not increase the sample size over that of the univariate exact test and in most cases reduces the required sample size. Sample size calculations are provided for selected designs.
Perceptions of Massage Therapists Participating in a Randomized Controlled Trial
Perlman, Adam; Dreusicke, Mark; Keever, Teresa; Ali, Ather
2015-01-01
Background Clinical practice and randomized trials often have disparate aims, despite involving similar interventions. Attitudes and expectancies of practitioners influence patient outcomes, and there is growing emphasis on optimizing provider–patient relationships. In this study, we evaluated the experiences of licensed massage therapists involved in a randomized controlled clinical trial using qualitative methodology. Methods Seven massage therapists who were interventionists in a randomized controlled trial participated in structured interviews approximately 30 minutes in length. Interviews focused on their experiences and perceptions regarding aspects of the clinical trial, as well as recommendations for future trials. Transcribed interviews were analyzed for emergent topics and themes using standard qualitative methods. Results Six themes emerged. Therapists discussed 1) promoting the profession of massage therapy through research, 2) mixed views on using standardized protocols, 3) challenges of sham interventions, 4) participant response to the sham intervention, 5) views on scheduling and compensation, and 6) unanticipated benefits of participating in research. Conclusions Therapists largely appreciated the opportunity to promote massage through research. They demonstrated insight and understanding of the rationale for a clinical trial adhering to a standardized protocol. Evaluating the experiences and ideas of complementary and alternative medicine practitioners provides valuable insight that is relevant for the implementation and design of randomized trials. PMID:26388961
Percolation of the site random-cluster model by Monte Carlo method
NASA Astrophysics Data System (ADS)
Wang, Songsong; Zhang, Wanzhou; Ding, Chengxiang
2015-08-01
We propose a site random-cluster model by introducing an additional cluster weight in the partition function of traditional site percolation. To simulate the model on a square lattice, we combine the color-assignation and Swendsen-Wang methods to design a highly efficient cluster algorithm with little critical slowing down. To verify whether or not it is consistent with the bond random-cluster model, we measure several quantities, such as the wrapping probability Re, the percolating cluster density P∞, and the magnetic susceptibility per site χp, as well as two exponents, the thermal exponent yt and the fractal dimension yh of the percolating cluster. We find that for cluster-weight exponents q = 1.5, 2, 2.5, 3, 3.5, and 4, the numerical estimates of the exponents yt and yh are consistent with the theoretical values. The universalities of the site random-cluster model and the bond random-cluster model are completely identical. For larger values of q, we find clear signatures of a first-order percolation transition in the histograms and hysteresis loops of the percolating cluster density and the energy per site. Our results are helpful for the understanding of the percolation of traditional statistical models.
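As a concrete entry point to the abstract above: the q = 1 limit of such cluster-weight models is ordinary site percolation, which is easy to simulate directly. The sketch below is a plain union-find spanning test, not the authors' color-assignation/Swendsen-Wang cluster algorithm; it estimates the top-to-bottom spanning probability on an L × L square lattice.

```python
import random

def spans(L, p, rng):
    """One site-percolation configuration on an L x L square lattice:
    occupy each site with probability p, then test (via union-find)
    whether an occupied cluster connects the top row to the bottom row."""
    occ = [[rng.random() < p for _ in range(L)] for _ in range(L)]
    parent = list(range(L * L + 2))       # two virtual nodes: TOP, BOT
    TOP, BOT = L * L, L * L + 1

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    def union(i, j):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj

    for r in range(L):
        for c in range(L):
            if not occ[r][c]:
                continue
            i = r * L + c
            if r == 0:
                union(i, TOP)
            if r == L - 1:
                union(i, BOT)
            if r > 0 and occ[r - 1][c]:    # join occupied neighbor above
                union(i, (r - 1) * L + c)
            if c > 0 and occ[r][c - 1]:    # join occupied neighbor left
                union(i, r * L + c - 1)
    return find(TOP) == find(BOT)

def spanning_probability(L, p, trials=200, seed=0):
    rng = random.Random(seed)
    return sum(spans(L, p, rng) for _ in range(trials)) / trials
```

Near the square-lattice site-percolation threshold p_c ≈ 0.5927, the spanning probability rises sharply from near 0 to near 1 as p increases, the finite-size analogue of the transition the paper studies.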
Ivanova, Anastasia; Tamura, Roy N
2015-12-01
A new clinical trial design, designated the two-way enriched design (TED), is introduced, which augments the standard randomized placebo-controlled trial with second-stage enrichment designs in placebo non-responders and drug responders. The trial is run in two stages. In the first stage, patients are randomized between drug and placebo. In the second stage, placebo non-responders and drug responders are each re-randomized between drug and placebo. All first-stage data, together with second-stage data from first-stage placebo non-responders and first-stage drug responders, are utilized in the efficacy analysis. The authors developed one, two and three degrees of freedom score tests for treatment effect in the TED and give formulae for asymptotic power and for sample size computations. The authors compute the optimal first-stage allocation ratio between drug and placebo for the TED and compare the operating characteristics of the design to the standard parallel clinical trial, placebo lead-in and randomized withdrawal designs. Two motivating examples from different disease areas are presented to illustrate the possible design considerations. © The Author(s) 2011.
Wiggers, Jimme K; Coelen, Robert J S; Rauws, Erik A J; van Delden, Otto M; van Eijck, Casper H J; de Jonge, Jeroen; Porte, Robert J; Buis, Carlijn I; Dejong, Cornelis H C; Molenaar, I Quintus; Besselink, Marc G H; Busch, Olivier R C; Dijkgraaf, Marcel G W; van Gulik, Thomas M
2015-02-14
Liver surgery in perihilar cholangiocarcinoma (PHC) is associated with high postoperative morbidity because the tumor typically causes biliary obstruction. Preoperative biliary drainage is used to create a safer environment prior to liver surgery, but biliary drainage may be harmful when severe drainage-related complications deteriorate the patients' condition or increase the risk of postoperative morbidity. Biliary drainage can cause cholangitis/cholecystitis, pancreatitis, hemorrhage, portal vein thrombosis, bowel wall perforation, or dehydration. Two methods of preoperative biliary drainage are mostly applied: endoscopic biliary drainage, which is currently used in most regional centers before referring patients for surgical treatment, and percutaneous transhepatic biliary drainage. Both methods are associated with severe drainage-related complications, but two small retrospective series found a lower incidence in the number of preoperative complications after percutaneous drainage compared to endoscopic drainage (18-25% versus 38-60%, respectively). The present study randomizes patients with potentially resectable PHC and biliary obstruction between preoperative endoscopic or percutaneous transhepatic biliary drainage. The study is a multi-center trial with an "all-comers" design, randomizing patients between endoscopic or percutaneous transhepatic biliary drainage. All patients selected to potentially undergo a major liver resection for presumed PHC are eligible for inclusion in the study provided that the biliary system in the future liver remnant is obstructed (even if they underwent previous inadequate endoscopic drainage). Primary outcome measure is the total number of severe preoperative complications between randomization and exploratory laparotomy. 
The study is designed to detect superiority of percutaneous drainage: a provisional sample size of 106 patients is required to detect a relative decrease of 50% in the number of severe preoperative complications (95% confidence; 80% power). Interim analysis after inclusion of 53 patients (50%) will provide the definitive sample size. Secondary outcome measures encompass the success of biliary drainage, quality of life, and postoperative morbidity and mortality. The DRAINAGE trial is designed to identify a difference in the number of severe drainage-related complications after endoscopic and percutaneous transhepatic biliary drainage in patients selected to undergo a major liver resection for perihilar cholangiocarcinoma. Netherlands Trial Register [NTR4243, 11 October 2013].
Aiello, Allison E; Simanek, Amanda M; Eisenberg, Marisa C; Walsh, Alison R; Davis, Brian; Volz, Erik; Cheng, Caroline; Rainey, Jeanette J; Uzicanin, Amra; Gao, Hongjiang; Osgood, Nathaniel; Knowles, Dylan; Stanley, Kevin; Tarter, Kara; Monto, Arnold S
2016-06-01
Social networks are increasingly recognized as important points of intervention, yet relatively few intervention studies of respiratory infection transmission have utilized a network design. Here we describe the design, methods, and social network structure of a randomized intervention for isolating respiratory infection cases in a university setting over a 10-week period. 590 students in six residence halls enrolled in the eX-FLU study during a chain-referral recruitment process from September 2012-January 2013. Of these, 262 joined as "seed" participants, who nominated their social contacts to join the study, of which 328 "nominees" enrolled. Participants were cluster-randomized by 117 residence halls. Participants were asked to respond to weekly surveys on health behaviors, social interactions, and influenza-like illness (ILI) symptoms. Participants were randomized to either a 3-Day dorm room isolation intervention or a control group (no isolation) upon illness onset. ILI cases reported on their isolation behavior during illness and provided throat and nasal swab specimens at onset, day-three, and day-six of illness. A subsample of individuals (N=103) participated in a sub-study using a novel smartphone application, iEpi, which collected sensor and contextually-dependent survey data on social interactions. Within the social network, participants were significantly positively assortative by intervention group, enrollment type, residence hall, iEpi participation, age, gender, race, and alcohol use (all P<0.002). We identified a feasible study design for testing the impact of isolation from social networks in a university setting. These data provide an unparalleled opportunity to address questions about isolation and infection transmission, as well as insights into social networks and behaviors among college-aged students. 
Several important lessons were learned over the course of this project, including feasible isolation durations, the need for extensive organizational efforts, as well as the need for specialized programmers and server space for managing survey and smartphone data. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Shayanfar, Mohsen Ali; Barkhordari, Mohammad Ali; Roudak, Mohammad Amin
2017-06-01
Monte Carlo simulation (MCS) is a useful tool for computing the probability of failure in reliability analysis, but the large number of random samples required makes it time-consuming. The response surface method (RSM) is another common method in reliability analysis; although widely used for its simplicity, it cannot be trusted in highly nonlinear problems because of its linear nature. In this paper, a new efficient algorithm combining importance sampling, a class of MCS, with RSM is proposed. The proposed algorithm starts from importance sampling concepts, using a proposed two-step updating rule for the design point; this part finishes after a small number of samples have been generated. RSM then takes over using Bucher's experimental design, with the last design point as the center point and a proposed effective length as the radius of Bucher's approach. Illustrative numerical examples show the simplicity and efficiency of the proposed algorithm and the effectiveness of the proposed rules.
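A toy version of the importance-sampling ingredient described above: for a limit state g(u) ≤ 0 defined on standard normal variables, sampling from a normal density recentred at a presumed design point and reweighting each sample by the density ratio gives an unbiased failure-probability estimate with far fewer samples than crude Monte Carlo. The sketch below uses an invented linear limit state and is an illustration of the general idea, not the authors' combined importance-sampling/RSM algorithm.

```python
import math
import random
from statistics import NormalDist

def failure_prob_is(g, design_point, n=20000, seed=0):
    """Importance-sampling estimate of P(g(U) <= 0), U ~ N(0, I):
    sample from N(design_point, I) and reweight failures by the
    density ratio phi(u) / h(u)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        u = [rng.gauss(mu, 1.0) for mu in design_point]
        if g(u) <= 0.0:
            # log of phi(u) / h(u) for the recentred normal sampler
            log_w = sum(0.5 * (ui - mu) ** 2 - 0.5 * ui * ui
                        for ui, mu in zip(u, design_point))
            total += math.exp(log_w)
    return total / n

# invented linear limit state: failure when u1 + u2 >= 6
g = lambda u: 6.0 - (u[0] + u[1])
p_hat = failure_prob_is(g, design_point=[3.0, 3.0])   # design point at (3, 3)
p_exact = NormalDist().cdf(-6.0 / math.sqrt(2.0))     # ~1.1e-5, for comparison
```

Crude Monte Carlo would need on the order of 10^7 samples merely to observe this event; the recentred sampler lands in the failure region on roughly half its draws.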
[Quantitative and qualitative research methods, can they coexist yet?].
Hunt, Elena; Lavoie, Anne-Marise
2011-06-01
Qualitative design is gaining ground in nursing research. In spite of this relative progress, however, the evidence-based practice movement continues to dominate and to emphasize the exclusive value of quantitative design (particularly that of randomized clinical trials) for clinical decision making. In the current context, convenient to those in power making utilitarian decisions on one hand, and facing nursing criticism of the establishment in favor of qualitative research on the other, it is difficult to choose a practical and ethical path that values the nursing role within the health care system, keeps us committed to quality care, and maintains the researcher's integrity. Both qualitative and quantitative methods have advantages and disadvantages, and clearly neither can, by itself, capture, describe and explain reality adequately. A balance between the two methods is therefore needed. Researchers bear a responsibility to society and science, and they should opt for the design best able to answer the research question, not promote the design favored by research funding distributors.
Yuan, Jing; Liu, Fenghua
2017-01-01
Objective The present study aimed to review the available evidence on whether time-lapse imaging (TLI) has favorable outcomes for embryo incubation and selection compared with conventional methods in clinical in vitro fertilization (IVF). Methods We searched PubMed, EMBASE, the Cochrane Library and ClinicalTrials.gov up to February 2017 for randomized controlled trials (RCTs) comparing TLI versus conventional methods. Studies that randomized either women or oocytes were included. For studies that randomized women, the primary outcomes were live birth and ongoing pregnancy, and the secondary outcomes were clinical pregnancy and miscarriage; for studies that randomized oocytes, the primary outcome was blastocyst rate and the secondary outcome was good-quality embryo on Day 2/3. Subgroup analysis was conducted based on differences in incubation and embryo selection between groups. Results Ten RCTs were included, four that randomized oocytes and six that randomized women. For the oocyte-based review, the pooled analysis observed no significant difference between the TLI group and the control group in blastocyst rate [relative risk (RR) 1.08, 95% CI 0.94–1.25, I2 = 0%, two studies, including 1154 embryos]. The quality of evidence was moderate for all outcomes in the oocyte-based review. For the woman-based review, only one study reported live birth rate (RR 1.23, 95% CI 1.06–1.44, I2 N/A, one study, including 842 women), and the pooled result showed no significant difference in ongoing pregnancy rate (RR 1.04, 95% CI 0.80–1.36, I2 = 59%, four studies, including 1403 women) between the two groups. The quality of the evidence was low or very low for all outcomes in the woman-based review. Conclusions There is currently insufficient evidence to support that TLI is superior to conventional methods for human embryo incubation and selection. In view of the limitations and flaws of the included studies, more well-designed RCTs are needed to comprehensively evaluate the effectiveness of clinical TLI use. PMID:28570713
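Pooled estimates like those quoted in the review above typically come from fixed-effect inverse-variance weighting on the log relative-risk scale. The sketch below reimplements that standard machinery on invented 2×2 counts (not the review's data), so the mechanics rather than the numbers are the point.

```python
import math

def rr_with_ci(e1, n1, e0, n0, z=1.96):
    """Relative risk and 95% CI for one study (events / total per arm)."""
    rr = (e1 / n1) / (e0 / n0)
    se = math.sqrt(1 / e1 - 1 / n1 + 1 / e0 - 1 / n0)  # SE of log RR
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

def pooled_rr(studies, z=1.96):
    """Fixed-effect inverse-variance pooling on the log-RR scale."""
    num = den = 0.0
    for e1, n1, e0, n0 in studies:
        log_rr = math.log((e1 / n1) / (e0 / n0))
        var = 1 / e1 - 1 / n1 + 1 / e0 - 1 / n0
        w = 1.0 / var                 # weight = inverse variance
        num += w * log_rr
        den += w
    log_pooled = num / den
    se = math.sqrt(1.0 / den)
    return (math.exp(log_pooled),
            math.exp(log_pooled - z * se),
            math.exp(log_pooled + z * se))

# hypothetical studies: (events_TLI, n_TLI, events_control, n_control)
studies = [(120, 300, 110, 300), (80, 250, 78, 240)]
rr, lo, hi = pooled_rr(studies)
```

A random-effects model (e.g. DerSimonian-Laird) would widen the interval when I² indicates heterogeneity, as with the I² = 59% ongoing-pregnancy result above.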
Roetzheim, Richard G.; Freund, Karen M.; Corle, Don K.; Murray, David M.; Snyder, Frederick R.; Kronman, Andrea C.; Jean-Pierre, Pascal; Raich, Peter C.; Holden, Alan E. C.; Darnell, Julie S.; Warren-Mears, Victoria; Patierno, Steven; PNRP Design and Analysis Committee
2013-01-01
Background The Patient Navigation Research Program (PNRP) is a cooperative effort of nine research projects, each employing its own unique study design. To evaluate projects such as PNRP, it is desirable to perform a pooled analysis to increase power relative to the individual projects. There is no agreed upon prospective methodology, however, for analyzing combined data arising from different study designs. Expert opinions were thus solicited from members of the PNRP Design and Analysis Committee. Purpose To review possible methodologies for analyzing combined data arising from heterogeneous study designs. Methods The Design and Analysis Committee critically reviewed the pros and cons of five potential methods for analyzing combined PNRP project data. Conclusions were based on simple consensus. The five approaches reviewed included: 1) Analyzing and reporting each project separately, 2) Combining data from all projects and performing an individual-level analysis, 3) Pooling data from projects having similar study designs, 4) Analyzing pooled data using a prospective meta-analytic technique, 5) Analyzing pooled data utilizing a novel simulated group-randomized design. Results Methodologies varied in their ability to incorporate data from all PNRP projects, to appropriately account for differing study designs, and in how they were affected by differing project sample sizes. Limitations The conclusions reached were based on expert opinion and not derived from actual analyses performed. Conclusions The ability to analyze pooled data arising from differing study designs may provide pertinent information to inform programmatic, budgetary, and policy perspectives. Multi-site community-based research may not lend itself well to the more stringent explanatory and pragmatic standards of a randomized controlled trial design. 
Given our growing interest in community-based population research, the challenges inherent in the analysis of heterogeneous study design are likely to become more salient. Discussion of the analytic issues faced by the PNRP and the methodological approaches we considered may be of value to other prospective community-based research programs. PMID:22273587
The case for randomized controlled trials to assess the impact of clinical information systems.
Liu, Joseph L Y; Wyatt, Jeremy C
2011-01-01
There is a persistent view of a significant minority in the medical informatics community that the randomized controlled trial (RCT) has a limited role to play in evaluating clinical information systems. A common reason voiced by skeptics is that these systems are fundamentally different from drug interventions, so the RCT is irrelevant. There is an urgent need to promote the use of RCTs, given the shift to evidence-based policy and the need to demonstrate cost-effectiveness of these systems. The authors suggest returning to first principles and argue that what is required is clarity about how to match methods to evaluation questions. The authors address common concerns about RCTs, and the extent to which they are fallacious, and also discuss the challenges of conducting RCTs in informatics and alternative study designs when randomized trials are infeasible. While neither a perfect nor universal evaluation method, RCTs form an important part of an evaluator's toolkit. PMID:21270132
Chan, Derwin K; Ivarsson, Andreas; Stenling, Andreas; Yang, Sophie X; Chatzisarantis, Nikos L; Hagger, Martin S
2015-12-01
Consistency tendency is characterized by the propensity of participants to respond to subsequent items in a survey in a manner consistent with their responses to previous items. This method effect may contaminate the results of sport psychology surveys using a cross-sectional design. We present a randomized controlled crossover study examining the effect of consistency tendency on the motivational pathway (i.e., autonomy support → autonomous motivation → intention) of self-determination theory in the context of sport injury prevention. Athletes from Sweden (N = 341) responded to a survey printed with either low interitem distance (IID; consistency tendency likely) or high IID (consistency tendency suppressed) on two separate occasions, with a one-week interim period. Participants were randomly allocated into two groups and received the survey of different IID at each occasion. Bayesian structural equation modeling showed that the low IID condition had stronger parameter estimates than the high IID condition, but the differences were not statistically significant.
Ligand design by a combinatorial approach based on modeling and experiment: application to HLA-DR4
NASA Astrophysics Data System (ADS)
Evensen, Erik; Joseph-McCarthy, Diane; Weiss, Gregory A.; Schreiber, Stuart L.; Karplus, Martin
2007-07-01
Combinatorial synthesis and large scale screening methods are being used increasingly in drug discovery, particularly for finding novel lead compounds. Although these "random" methods sample larger areas of chemical space than traditional synthetic approaches, only a relatively small percentage of all possible compounds are practically accessible. It is therefore helpful to select regions of chemical space that have greater likelihood of yielding useful leads. When three-dimensional structural data are available for the target molecule, this can be achieved by applying structure-based computational design methods to focus the combinatorial library. This is advantageous over the standard usage of computational methods to design a small number of specific novel ligands, because here computation is employed as part of the combinatorial design process and so is required only to determine a propensity for binding of certain chemical moieties in regions of the target molecule. This paper describes the application of the Multiple Copy Simultaneous Search (MCSS) method, an active-site mapping and de novo structure-based design tool, to design a focused combinatorial library for the class II MHC protein HLA-DR4. Methods for synthesizing and screening the computationally designed library are presented, and evidence is provided that binding was achieved. Although the structure of the protein-ligand complex could not be determined, experimental results, including cross-exclusion of a known HLA-DR4 peptide ligand (HA) by a compound from the library, together with computational model building, suggest that at least one of the ligands designed and identified by the methods described binds in a mode similar to that of native peptides.
Learning style and teaching method preferences of Saudi students of physical therapy
Al Maghraby, Mohamed A.; Alshami, Ali M.
2013-01-01
Context: To the researchers’ knowledge, there are no published studies that have investigated the learning styles and preferred teaching methods of physical therapy students in Saudi Arabia. Aim: The study was conducted to determine the learning styles and preferred teaching methods of Saudi physical therapy students. Settings and Design: A cross-sectional study design. Materials and Methods: Fifty-three Saudis studying physical therapy (21 males and 32 females) participated in the study. The principal researcher gave an introductory lecture to explain the different learning styles and common teaching methods. Upon completion of the lecture, questionnaires were distributed, and were collected on completion. Statistical Analysis Used: Percentages were calculated for the learning styles and teaching methods. Pearson’s correlations were performed to investigate the relationship between them. Results: More than 45 (85%) of the students rated hands-on training as the most preferred teaching method. Approximately 30 (57%) students rated the following teaching methods as the most preferred methods: “Advanced organizers,” “demonstrations,” and “multimedia activities.” Although 31 (59%) students rated the concrete-sequential learning style the most preferred, these students demonstrated mixed styles on the other style dimensions: Abstract-sequential, abstract-random, and concrete-random. Conclusions: The predominant concrete-sequential learning style is consistent with the most preferred teaching method (hands-on training). The high percentage of physical therapy students whose responses were indicative of mixed learning styles suggests that they can accommodate multiple teaching methods. It is recommended that educators consider the diverse learning styles of the students and utilize a variety of teaching methods in order to promote an optimal learning environment for the students. PMID:24672278
2012-01-01
Background Multiple sclerosis (MS) is a complex, chronic and progressive disease and rehabilitation services can provide important support to patients. Few MS rehabilitation programs have been shown to provide health improvements to patients in a cost-effective manner. The objective of this study is to assess the effects in terms of changes measured by a variety of standardized quality of life, mastery, coping, compliance and individual goal-related endpoints. This combination provides the basis for analyzing the complexity of MS and outcomes of a personalized rehabilitation. Methods/Design Patients with MS referred to hospital rehabilitation services will be randomized to either early admission (within two months) or usual admission (after an average waiting time of eight months). They will complete a battery of standardized health outcome instruments prior to randomization, and again six and twelve months after randomization, and a battery of goal-related outcome measures at admission and discharge, and again one, six and twelve months after randomization. Discussion The results of the study are expected to contribute to further development of MS rehabilitation services and to discussions about the design and content of such services. The results will also provide additional information to health authorities responsible for providing and financing rehabilitation services. Trial registration Current Controlled Trials (ISRCTN05245917) PMID:22954027
Houston, Denise K.; Leng, Xiaoyan; Bray, George A.; Hergenroeder, Andrea L.; Hill, James O.; Jakicic, John M.; Johnson, Karen C.; Neiberg, Rebecca H.; Marsh, Anthony P.; Rejeski, W. Jack; Kritchevsky, Stephen B.
2014-01-01
OBJECTIVE To assess the long-term effects of an intensive lifestyle intervention on physical function using a randomized post-test design in the Look AHEAD trial. METHODS Overweight and obese (BMI ≥25 kg/m2) middle-aged and older adults (aged 45–76 years at enrollment) with type 2 diabetes (n=964) at four clinics in Look AHEAD, a trial evaluating an intensive lifestyle intervention (ILI) designed to achieve weight loss through caloric restriction and increased physical activity compared to diabetes support and education (DSE), underwent standardized assessments of performance-based physical function including an expanded short physical performance battery (SPPBexp), 20-m and 400-m walk, and grip and knee extensor strength 8 years post-randomization, during the trial’s weight maintenance phase. RESULTS Eight years post-randomization, individuals randomized to ILI had better SPPBexp scores (adjusted mean (SE) difference: 0.055 (0.022), p=0.01) and faster 20-m and 400-m walk speeds (0.032 (0.012) m/sec, p=0.01, and 0.025 (0.011) m/sec, p=0.02, respectively) compared to those randomized to DSE. Achieved weight loss greatly attenuated the group differences in physical function and the intervention effect was no longer significant. CONCLUSIONS An intensive lifestyle intervention has long-term benefits for mobility function in overweight and obese middle-aged and older individuals with type 2 diabetes. PMID:25452229
Michael G. Shelton
1997-01-01
The shelterwood reproduction cutting method using two overstory compositions (a pine basal area of 30 ft² per acre with and without 15 ft² per acre of hardwoods) and two methods of submerchantable hardwood control (chain-saw felling with and without stump-applied herbicide) was tested in a 2x2 factorial, split-plot design with four randomized complete blocks....
Luce, Bryan R; Connor, Jason T; Broglio, Kristine R; Mullins, C Daniel; Ishak, K Jack; Saunders, Elijah; Davis, Barry R
2016-09-20
Bayesian and adaptive clinical trial designs offer the potential for more efficient processes that result in lower sample sizes and shorter trial durations than traditional designs. To explore the use and potential benefits of Bayesian adaptive clinical trial designs in comparative effectiveness research. Virtual execution of ALLHAT (Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial) as if it had been done according to a Bayesian adaptive trial design. Comparative effectiveness trial of antihypertensive medications. Patient data sampled from the more than 42 000 patients enrolled in ALLHAT with publicly available data. Number of patients randomly assigned between groups, trial duration, observed numbers of events, and overall trial results and conclusions. The Bayesian adaptive approach and original design yielded similar overall trial conclusions. The Bayesian adaptive trial randomly assigned more patients to the better-performing group and would probably have ended slightly earlier. This virtual trial execution required limited resampling of ALLHAT patients for inclusion in RE-ADAPT (REsearch in ADAptive methods for Pragmatic Trials). Involvement of a data monitoring committee and other trial logistics were not considered. In a comparative effectiveness research trial, Bayesian adaptive trial designs are a feasible approach and potentially generate earlier results and allocate more patients to better-performing groups. National Heart, Lung, and Blood Institute.
Trends in study design and the statistical methods employed in a leading general medicine journal.
Gosho, M; Sato, Y; Nagashima, K; Takahashi, S
2018-02-01
Study design and statistical methods have become core components of medical research, and the methodology has become more multifaceted and complicated over time. The study of the comprehensive details and current trends of study design and statistical methods is required to support the future implementation of well-planned clinical studies providing information about evidence-based medicine. Our purpose was to illustrate the study designs and statistical methods employed in recent medical literature. This was an extension study of Sato et al. (N Engl J Med 2017; 376: 1086-1087), which reviewed 238 articles published in 2015 in the New England Journal of Medicine (NEJM) and briefly summarized the statistical methods employed in NEJM. Using the same database, we performed a new investigation of the detailed trends in study design and individual statistical methods that were not reported in the Sato study. In accordance with the CONSORT statement, prespecification and justification of sample size are obligatory in planning intervention studies. Although standard survival methods (e.g., the Kaplan-Meier estimator and Cox regression model) were most frequently applied, the Gray test and the Fine-Gray proportional hazards model for considering competing risks were sometimes used for more valid statistical inference. With respect to handling missing data, model-based methods, which are valid for missing-at-random data, were more frequently used than single imputation methods. Single imputation methods are not recommended as a primary analysis, but they have been applied in many clinical trials. Group sequential designs with interim analyses were among the standard designs, and novel designs, such as adaptive dose selection and sample size re-estimation, were sometimes employed in NEJM. Model-based approaches for handling missing data should replace single imputation methods for primary analysis in light of the information found in some publications.
Use of adaptive designs with interim analyses has been increasing since the publication of the FDA guidance for adaptive design. © 2017 John Wiley & Sons Ltd.
Kulig, Kornelia; Pomrantz, Amy B; Burnfield, Judith M; Reischl, Stephen F; Mais-Requejo, Susan; Thordarson, David B; Smith, Ronald W
2006-01-01
Background Posterior tibialis tendon dysfunction (PTTD) is a common cause of foot pain and dysfunction in adults. Clinical observations strongly suggest that the condition is progressive. There are currently no controlled studies evaluating the effectiveness of exercise, orthoses, or orthoses and exercise on Stage I or IIA PTTD. Our study will compare the effectiveness of an eccentric versus concentric strengthening intervention with the results obtained with the use of orthoses alone. Findings from this study will guide the development of more efficacious PTTD intervention programs and contribute to enhanced function and quality of life in persons with posterior tibialis tendon dysfunction. Methods/design This paper presents the rationale and design for a randomized clinical trial evaluating the effectiveness of a treatment regimen for the non-operative management of Stage I or IIA PTTD. Discussion We have presented the rationale and design for an RCT evaluating the effectiveness of a treatment regimen for the non-operative management of Stage I or IIA PTTD. The results of this trial will be presented as soon as they are available. PMID:16756656
Gagnon, Marie-Pierre; Gagnon, Johanne; Desmartis, Marie; Njoya, Merlin
2013-01-01
This study aimed to assess the effectiveness of a blended-teaching intervention using Internet-based tutorials coupled with traditional lectures in an introduction to research undergraduate nursing course. Effects of the intervention were compared with conventional, face-to-face classroom teaching on three outcomes: knowledge, satisfaction, and self-learning readiness. A two-group, randomized, controlled design was used, involving 112 participants. Descriptive statistics and analysis of covariance (ANCOVA) were performed. The teaching method was found to have no direct impact on knowledge acquisition, satisfaction, and self-learning readiness. However, motivation and teaching method had an interaction effect on knowledge acquisition by students. Among less motivated students, those in the intervention group performed better than those who received traditional training. These findings suggest that this blended-teaching method could better suit some students, depending on their degree of motivation and level of self-directed learning readiness.
Commercialization of NESSUS: Status
NASA Technical Reports Server (NTRS)
Thacker, Ben H.; Millwater, Harry R.
1991-01-01
A plan was initiated in 1988 to commercialize the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) probabilistic structural analysis software. The goal of the on-going commercialization effort is to begin the transfer of Probabilistic Structural Analysis Method (PSAM) developed technology into industry and to develop additional funding resources in the general area of structural reliability. The commercialization effort is summarized. The SwRI NESSUS Software System is a general purpose probabilistic finite element computer program using state of the art methods for predicting stochastic structural response due to random loads, material properties, part geometry, and boundary conditions. NESSUS can be used to assess structural reliability, to compute probability of failure, to rank the input random variables by importance, and to provide a more cost effective design than traditional methods. The goal is to develop a general probabilistic structural analysis methodology to assist in the certification of critical components in the next generation Space Shuttle Main Engine.
Ryeznik, Yevgen; Sverdlov, Oleksandr; Wong, Weng Kee
2015-08-01
Response-adaptive randomization designs are becoming increasingly popular in clinical trial practice. In this paper, we present RARtool , a user interface software developed in MATLAB for designing response-adaptive randomized comparative clinical trials with censored time-to-event outcomes. The RARtool software can compute different types of optimal treatment allocation designs, and it can simulate response-adaptive randomization procedures targeting selected optimal allocations. Through simulations, an investigator can assess design characteristics under a variety of experimental scenarios and select the best procedure for practical implementation. We illustrate the utility of our RARtool software by redesigning a survival trial from the literature.
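The variance-minimizing (Neyman) allocation that response-adaptive designs of this kind often target can be sketched in a few lines. This is an illustrative Python sketch, not the RARtool implementation (RARtool is a MATLAB package, and it handles censored time-to-event outcomes rather than the binary outcomes assumed here); the function name and response rates are hypothetical:

```python
import math

def neyman_allocation(p1, p2):
    """Variance-minimizing allocation fraction for arm 1 when comparing
    two binomial proportions: n1/n is proportional to sqrt(p1*(1-p1))."""
    s1 = math.sqrt(p1 * (1 - p1))
    s2 = math.sqrt(p2 * (1 - p2))
    return s1 / (s1 + s2)

# With equal response rates the optimal design is balanced:
assert abs(neyman_allocation(0.3, 0.3) - 0.5) < 1e-12

# The more variable arm (response rate nearer 0.5) receives more patients:
frac = neyman_allocation(0.5, 0.1)  # ≈ 0.625 in favor of arm 1
```

In a response-adaptive procedure, the unknown rates would be replaced by current estimates after each cohort, so the realized allocation drifts toward this target as data accrue.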
Fiellin, Lynn E; Kyriakides, Tassos C; Hieftje, Kimberly D; Pendergrass, Tyra M; Duncan, Lindsay R; Dziura, James D; Sawyer, Benjamin G; Fiellin, David A
2016-01-01
Background To address the need for risk behavior reduction and HIV prevention interventions that capture adolescents “where they live,” we created a tablet-based videogame to teach skills and knowledge and influence psychosocial antecedents for decreasing risk and preventing HIV infection in minority youth in schools, after-school programs, and summer camps. Methods We developed PlayForward: Elm City Stories over a 2-year period, working with researchers, commercial game designers, and staff and teens from community programs. The videogame PlayForward provides an interactive world where players, using an avatar, “travel” through time, facing challenges such as peer pressure to drink alcohol or engage in risky sexual behaviors. Players experience how their choices affect their future and then are able to go back in time and change their choices, creating different outcomes. A randomized controlled trial was designed to evaluate the efficacy of PlayForward. Participants were randomly assigned to play PlayForward or a set of attention/time control games on a tablet at their community-based program. Assessment data were collected during face-to-face study visits and entered into a web-based platform and unique real-time “in-game” PlayForward data were collected as players engaged in the game. The innovative methods of this randomized controlled trial are described. We highlight the logistical issues of conducting a large-scale trial using mobile technology such as the iPad®, and collecting, transferring, and storing large amounts of in-game data. We outline the methods used to analyze the in-game data alone and in conjunction with standardized assessment data to establish correlations between behaviors during gameplay and those reported in real life. We also describe the use of the in-game data as a measure of fidelity to the intervention. 
Results In total, 333 boys and girls, aged 11–14 years, were randomized over a 14-month period: 166 were assigned to play PlayForward and 167 to play the control games. To date (as of 1 March 2016), 18 have withdrawn from the study; the following have completed the protocol-defined assessments: 6 weeks: 271 (83%); 3 months: 269 (84%); 6 months: 254 (79%); 12 months: 259 (82%); the 24-month assessment is ongoing, with 152 of the 199 participants (76%) eligible to date having completed it (assessment windows were still open). Conclusion Videogames can be developed to address complex behaviors and can be subject to empiric testing using community-based randomized controlled trials. Although mobile technologies pose challenges in their use as interventions and in the collection and storage of the data they produce, they provide unique opportunities as new sources of potentially valid data and novel methods to measure the fidelity of digitally delivered behavioral interventions. PMID:27013483
Distributed collaborative response surface method for mechanical dynamic assembly reliability design
NASA Astrophysics Data System (ADS)
Bai, Guangchen; Fei, Chengwei
2013-11-01
Because of the randomness of many impact factors influencing the dynamic assembly relationships of complex machinery, reliability analysis of dynamic assembly relationships needs to be carried out from a probabilistic perspective. To improve the accuracy and efficiency of dynamic assembly relationship reliability analysis, the mechanical dynamic assembly reliability (MDAR) theory and a distributed collaborative response surface method (DCRSM) are proposed. The mathematical model of the DCRSM is established based on the quadratic response surface function and verified by the assembly relationship reliability analysis of aeroengine high pressure turbine (HPT) blade-tip radial running clearance (BTRRC). Comparison of the DCRSM with the traditional response surface method (RSM) and the Monte Carlo method (MCM) shows that the DCRSM is not only able to accomplish computational tasks that are infeasible for the other methods when the number of simulations exceeds 100 000, but its computational precision is also basically consistent with the MCM and improved by 0.40-4.63% relative to the RSM; furthermore, the computational efficiency of the DCRSM is about 188 times that of the MCM and 55 times that of the RSM at 10 000 simulations. The DCRSM is demonstrated to be a feasible and effective approach for markedly improving the computational efficiency and accuracy of MDAR analysis. The proposed research thus provides a promising theory and method for MDAR design and optimization, and opens a novel research direction of probabilistic analysis for developing high-performance, high-reliability aeroengines.
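The core idea of a response surface method — fit a cheap quadratic surrogate to a small set of runs of an expensive model, then do the Monte Carlo on the surrogate — can be sketched as follows. This is a generic Python illustration, not the distributed collaborative variant proposed in the paper; the limit-state function `g` and all coefficients are hypothetical stand-ins for a finite-element response such as blade-tip clearance:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "expensive" limit-state function of two random inputs.
def g(x1, x2):
    return 3.0 - 0.5 * x1 - 0.3 * x2 - 0.1 * x1 * x2

# 1. Fit a quadratic response surface from a small design of experiments.
X = rng.normal(size=(50, 2))
y = g(X[:, 0], X[:, 1])
# Basis: 1, x1, x2, x1^2, x2^2, x1*x2  (quadratic surface with cross term)
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# 2. Run the Monte Carlo on the cheap surrogate instead of the expensive model.
S = rng.normal(size=(100_000, 2))
B = np.column_stack([np.ones(len(S)), S[:, 0], S[:, 1],
                     S[:, 0] ** 2, S[:, 1] ** 2, S[:, 0] * S[:, 1]])
y_hat = B @ coef
p_fail = np.mean(y_hat < 0.0)  # estimated probability of violating the limit state
```

The efficiency gains the paper reports come from exactly this substitution: after the surrogate is fitted, each additional Monte Carlo sample costs only a dot product.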
Adaptive adjustment of the randomization ratio using historical control data
Hobbs, Brian P.; Carlin, Bradley P.; Sargent, Daniel J.
2013-01-01
Background Prospective trial design often occurs in the presence of “acceptable” [1] historical control data. Typically this data is only utilized for treatment comparison in a posteriori retrospective analysis to estimate population-averaged effects in a random-effects meta-analysis. Purpose We propose and investigate an adaptive trial design in the context of an actual randomized controlled colorectal cancer trial. This trial, originally reported by Goldberg et al. [2], succeeded a similar trial reported by Saltz et al. [3], and used a control therapy identical to that tested (and found beneficial) in the Saltz trial. Methods The proposed trial implements an adaptive randomization procedure for allocating patients aimed at balancing total information (concurrent and historical) among the study arms. This is accomplished by assigning more patients to receive the novel therapy in the absence of strong evidence for heterogeneity among the concurrent and historical controls. Allocation probabilities adapt as a function of the effective historical sample size (EHSS) characterizing relative informativeness defined in the context of a piecewise exponential model for evaluating time to disease progression. Commensurate priors [4] are utilized to assess historical and concurrent heterogeneity at interim analyses and to borrow strength from the historical data in the final analysis. The adaptive trial’s frequentist properties are simulated using the actual patient-level historical control data from the Saltz trial and the actual enrollment dates for patients enrolled into the Goldberg trial. Results Assessing concurrent and historical heterogeneity at interim analyses and balancing total information with the adaptive randomization procedure leads to trials that on average assign more new patients to the novel treatment when the historical controls are unbiased or slightly biased compared to the concurrent controls. 
Large magnitudes of bias lead to approximately equal allocation of patients among the treatment arms. Using the proposed commensurate prior model to borrow strength from the historical data, after balancing total information with the adaptive randomization procedure, provides admissible estimators of the novel treatment effect with desirable bias-variance trade-offs. Limitations Adaptive randomization methods in general are sensitive to population drift and more suitable for trials that initiate with gradual enrollment. Balancing information among study arms in time-to-event analyses is difficult in the presence of informative right-censoring. Conclusions The proposed design could prove important in trials that follow recent evaluations of a control therapy. Efficient use of the historical controls is especially important in contexts where reliance on pre-existing information is unavoidable because the control therapy is exceptionally hazardous, expensive, or the disease is rare. PMID:23690095
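The allocation idea — assign more new patients to the novel arm when the historical controls are carrying much of the control-arm information — can be sketched with a toy rule. This is a deliberately simplified Python illustration of balancing total information, not the commensurate-prior EHSS computation of the actual design; the function, its `weight` parameter, and the caps are hypothetical:

```python
def novel_arm_probability(n_novel, n_control, ehss, weight=1.0):
    """Illustrative adaptive rule: randomize the next patient to the novel
    arm with a probability that pushes total information (concurrent
    controls plus the effective historical sample size, EHSS) toward
    balance with the novel arm. `weight` tempers the adaptation."""
    info_control = n_control + ehss
    total = n_novel + info_control
    if total == 0:
        return 0.5                        # no information yet: equal allocation
    imbalance = info_control / total      # share of information on the control side
    p = 0.5 + weight * (imbalance - 0.5)  # shift toward the under-informed arm
    return min(max(p, 0.1), 0.9)          # cap to preserve some randomization

# Strong, commensurate historical data -> favor the novel arm:
p = novel_arm_probability(n_novel=20, n_control=20, ehss=40)  # 0.75
```

In the actual design the EHSS itself shrinks toward zero at interim analyses when the historical and concurrent controls appear heterogeneous, which is what drives allocation back toward 1:1 under large bias.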
Two new methods to fit models for network meta-analysis with random inconsistency effects.
Law, Martin; Jackson, Dan; Turner, Rebecca; Rhodes, Kirsty; Viechtbauer, Wolfgang
2016-07-28
Meta-analysis is a valuable tool for combining evidence from multiple studies. Network meta-analysis is becoming more widely used as a means to compare multiple treatments in the same analysis. However, a network meta-analysis may exhibit inconsistency, whereby the treatment effect estimates do not agree across all trial designs, even after taking between-study heterogeneity into account. We propose two new estimation methods for network meta-analysis models with random inconsistency effects. The model we consider is an extension of the conventional random-effects model for meta-analysis to the network meta-analysis setting and allows for potential inconsistency using random inconsistency effects. Our first new estimation method uses a Bayesian framework with empirically-based prior distributions for both the heterogeneity and the inconsistency variances. We fit the model using importance sampling and thereby avoid some of the difficulties that might be associated with using Markov Chain Monte Carlo (MCMC). However, we confirm the accuracy of our importance sampling method by comparing the results to those obtained using MCMC as the gold standard. The second new estimation method we describe uses a likelihood-based approach, implemented in the metafor package, which can be used to obtain (restricted) maximum-likelihood estimates of the model parameters and profile likelihood confidence intervals of the variance components. We illustrate the application of the methods using two contrasting examples. The first uses all-cause mortality as an outcome, and shows little evidence of between-study heterogeneity or inconsistency. The second uses "ear discharge" as an outcome, and exhibits substantial between-study heterogeneity and inconsistency. Both new estimation methods give results similar to those obtained using MCMC. The extent of heterogeneity and inconsistency should be assessed and reported in any network meta-analysis. 
Our two new methods can be used to fit models for network meta-analysis with random inconsistency effects. They are easily implemented using the accompanying R code in the Additional file 1. Using these estimation methods, the extent of inconsistency can be assessed and reported.
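The building block underneath these models is the conventional random-effects meta-analysis; the network versions add design-level random inconsistency effects on top of it. As a point of reference, here is a minimal Python sketch of the pairwise case using the DerSimonian-Laird method-of-moments estimator (the paper's own methods are Bayesian importance sampling and REML via the R package metafor, not this):

```python
import math

def dersimonian_laird(effects, variances):
    """Pairwise random-effects meta-analysis (DerSimonian-Laird).
    Returns the pooled effect, the heterogeneity variance tau^2,
    and the standard error of the pooled effect."""
    w = [1.0 / v for v in variances]
    sw = sum(w)
    mu_fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    q = sum(wi * (yi - mu_fixed) ** 2 for wi, yi in zip(w, effects))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # method-of-moments estimate
    w_re = [1.0 / (v + tau2) for v in variances]
    mu = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return mu, tau2, se

mu, tau2, se = dersimonian_laird([0.2, 0.5, -0.1], [0.04, 0.09, 0.05])
```

In the network setting, each treatment contrast additionally receives a design-specific random inconsistency term with its own variance, which is exactly the second variance component the two new estimation methods target.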
Data entry errors and design for model-based tight glycemic control in critical care.
Ward, Logan; Steel, James; Le Compte, Aaron; Evans, Alicia; Tan, Chia-Siong; Penning, Sophie; Shaw, Geoffrey M; Desaive, Thomas; Chase, J Geoffrey
2012-01-01
Tight glycemic control (TGC) has shown benefits but has been difficult to achieve consistently. Model-based methods and computerized protocols offer the opportunity to improve TGC quality but require human data entry, particularly of blood glucose (BG) values, which can be significantly prone to error. This study presents the design and optimization of data entry methods to minimize error for a computerized and model-based TGC method prior to pilot clinical trials. To minimize data entry error, two tests were carried out to optimize a method with errors less than the 5%-plus reported in other studies. Four initial methods were tested on 40 subjects in random order, and the best two were tested more rigorously on 34 subjects. The tests measured entry speed and accuracy. Errors were reported as corrected and uncorrected errors, with the sum comprising a total error rate. The first set of tests used randomly selected values, while the second set used the same values for all subjects to allow comparisons across users and direct assessment of the magnitude of errors. These research tests were approved by the University of Canterbury Ethics Committee. The final data entry method tested reduced errors to less than 1-2%, a 60-80% reduction from reported values. The magnitude of errors could be clinically significant: entries were typically wrong by 10.0 mmol/liter or an order of magnitude, but only for extreme values of BG < 2.0 mmol/liter or BG > 15.0-20.0 mmol/liter, both of which could be easily corrected with automated checking of extreme values for safety. The data entry method selected significantly reduced data entry errors in the limited design tests presented, and is in use in a clinical pilot TGC study. The overall approach and testing methods are easily performed and generalizable to other applications and protocols. © 2012 Diabetes Technology Society.
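The automated extreme-value check the study recommends amounts to a simple range guard before an entered value reaches the dosing controller. A minimal Python sketch, assuming the study's own extreme ranges as thresholds (the function name and return strings are hypothetical):

```python
def check_bg_entry(value_mmol_per_l, low=2.0, high=15.0):
    """Flag blood-glucose entries outside a plausible clinical band so the
    user can confirm the value before it is accepted. Thresholds echo the
    extreme BG ranges (< 2.0 and > 15.0-20.0 mmol/liter) noted in the study."""
    if value_mmol_per_l < low:
        return "confirm: unusually low"
    if value_mmol_per_l > high:
        return "confirm: unusually high"
    return "accepted"

# An order-of-magnitude slip (8.0 entered as 80.0) is caught at entry time:
result = check_bg_entry(80.0)  # "confirm: unusually high"
```

Such a guard catches exactly the error class the study found clinically significant: values wrong by an order of magnitude, which almost always land outside the plausible band.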
Design method for multi-user workstations utilizing anthropometry and preference data.
Mahoney, Joseph M; Kurczewski, Nicolas A; Froede, Erick W
2015-01-01
Past efforts have been made to design single-user workstations to accommodate users' anthropometric and preference distributions. However, there is a lack of methods for designing workstations for group interaction. This paper introduces a method for sizing workstations to allow for a personal work area for each user and a shared space for adjacent users. We first create a virtual population with the same anthropometric and preference distributions as an intended demographic of college-aged students. Members of the virtual population are randomly paired to test if their extended reaches overlap but their normal reaches do not. This process is repeated in a Monte Carlo simulation to estimate the total percentage of groups in the population that will be accommodated for a workstation size. We apply our method to two test cases: in the first, we size polygonal workstations for two populations and, in the second, we dimension circular workstations for different group sizes. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.
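The accommodation criterion described above — extended reaches overlap, normal reaches do not — lends itself to a compact Monte Carlo sketch. This Python illustration assumes two virtual users facing each other across a workstation of a given depth; the reach distributions (in cm) are hypothetical stand-ins for the anthropometric and preference data used in the paper:

```python
import random

random.seed(1)

def accommodation_rate(depth_cm, n_pairs=20_000):
    """Estimate the fraction of random user pairs accommodated by one
    workstation depth: a pair counts if their extended reaches overlap
    (a shared space exists) while their normal reaches do not (each user
    keeps a personal work area). Reach distributions are hypothetical."""
    ok = 0
    for _ in range(n_pairs):
        normal = [random.gauss(45.0, 4.0) for _ in range(2)]
        extended = [n + random.gauss(15.0, 2.0) for n in normal]
        shared = extended[0] + extended[1] >= depth_cm
        personal = normal[0] + normal[1] < depth_cm
        if shared and personal:
            ok += 1
    return ok / n_pairs

rate = accommodation_rate(depth_cm=105.0)
```

Sweeping `depth_cm` over candidate dimensions and picking the smallest size whose rate clears a target (say 90%) mirrors how the paper sizes its polygonal and circular test cases.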
Intimate Partner Violence in Older Women
ERIC Educational Resources Information Center
Bonomi, Amy E.; Anderson, Melissa L.; Reid, Robert J.; Carrell, David; Fishman, Paul A.; Rivara, Frederick P.; Thompson, Robert S.
2007-01-01
Purpose: We describe the prevalence, types, duration, frequency, and severity of intimate partner violence ("partner violence") in older women. Design and Methods: We randomly sampled a total of 370 English-speaking women (65 years of age and older) from a health care system to participate in a cross-sectional telephone interview. Using 5…
ERIC Educational Resources Information Center
Mohammadipour, Mohammad; Rashid, Sabariah Md; Rafik-Galea, Shameem; Thai, Yap Ngee
2018-01-01
Emotions are an indispensable part of second language learning. The aim of this study is to determine the relationship between the use of language learning strategies and positive emotions. The present study adopted a sequential mixed methods design. The participants were 300 Malaysian ESL undergraduates selected through stratified random sampling…
ERIC Educational Resources Information Center
Altfeld, Susan J.; Shier, Gayle E.; Rooney, Madeleine; Johnson, Tricia J.; Golden, Robyn L.; Karavolos, Kelly; Avery, Elizabeth; Nandi, Vijay; Perry, Anthony J.
2013-01-01
Purpose of the Study: To identify needs encountered by older adult patients after hospital discharge and assess the impact of a telephone transitional care intervention on stress, health care utilization, readmissions, and mortality. Design and Methods: Older adult inpatients who met criteria for risk of post-discharge complications were…
ERIC Educational Resources Information Center
Liao, Minli; Testa, Mark
2016-01-01
Objectives: This study evaluated the effects of the Adoption Preservation, Assessment, and Linkage (APAL) postpermanency program. Method: A quasi-experimental, posttest-only design was used to estimate the program's effects on youth discharged from foster care to adoption or legal guardianship. A random sample was surveyed (female = 44.7%; African…
Duration of Sleep and ADHD Tendency among Adolescents in China
ERIC Educational Resources Information Center
Lam, Lawrence T.; Yang, L.
2008-01-01
Objective: This study investigates the association between duration of sleep and ADHD tendency among adolescents. Method: This population-based health survey uses a two-stage random cluster sampling design. Participants ages 13 to 17 are recruited from the total population of adolescents attending high school in one city of China. Duration of…
Examining Work and Family Conflict among Female Bankers in Accra Metropolis, Ghana
ERIC Educational Resources Information Center
Kissi-Abrokwah, Bernard; Andoh-Robertson, Theophilus; Tutu-Danquah, Cecilia; Agbesi, Catherine Selorm
2015-01-01
This study investigated the effects and solutions of work and family conflict among female bankers in Accra Metropolis. Using a triangulatory mixed-method design, a structured questionnaire was administered to 300 randomly selected female bankers, and 15 female bankers, selected by convenience sampling, were also interviewed. The…
Impact of the Fit and Strong Intervention on Older Adults with Osteoarthritis
ERIC Educational Resources Information Center
Hughes, Susan L.; Seymour, Rachel B.; Campbell, Richard; Pollak, Naomi; Huber, Gail; Sharma, Leena
2004-01-01
Purpose: This study assessed the impact of a low cost, multicomponent physical activity intervention for older adults with lower extremity osteoarthritis. Design and Methods: A randomized controlled trial compared the effects of a facility-based multiple-component training program followed by home-based adherence (n = 80) to a wait list control…
Replicating Experimental Impact Estimates Using a Regression Discontinuity Approach. NCEE 2012-4025
ERIC Educational Resources Information Center
Gleason, Philip M.; Resch, Alexandra M.; Berk, Jillian A.
2012-01-01
This NCEE Technical Methods Paper compares the estimated impacts of an educational intervention using experimental and regression discontinuity (RD) study designs. The analysis used data from two large-scale randomized controlled trials--the Education Technology Evaluation and the Teach for America Study--to provide evidence on the performance of…
Predictors of Self-Regulated Learning in Malaysian Smart Schools
ERIC Educational Resources Information Center
Yen, Ng Lee; Bakar, Kamariah Abu; Roslan, Samsilah; Luan, Wong Su; Abd Rahman, Petri Zabariah Mega
2005-01-01
This study sought to uncover the predictors of self-regulated learning in Malaysian smart schools. The sample consisted of 409 students, from six randomly chosen smart schools. A quantitative correlational research design was employed and the data were collected through survey method. Six factors were examined in relation to the predictors of…
Physical Education in Primary Schools: Classroom Teachers' Perceptions of Benefits and Outcomes
ERIC Educational Resources Information Center
Morgan, Philip J.; Hansen, Vibeke
2008-01-01
Objective: The aim of the current study was to examine the perceptions of classroom teachers regarding the benefits and outcomes of their PE programs. Design: Cross-sectional. Setting: Thirty eight randomly selected primary schools in New South Wales (NSW), Australia. Method: A mixed-mode methodology was utilized, incorporating semi-structured…
TT : a program that implements predictor sort design and analysis
S. P. Verrill; D. W. Green; V. L. Herian
1997-01-01
In studies on wood strength, researchers sometimes replace experimental unit allocation via random sampling with allocation via sorts based on nondestructive measurements of strength predictors such as modulus of elasticity and specific gravity. This report documents TT, a computer program that implements recently published methods to increase the sensitivity of such...
Motivation among Public Primary School Teachers in Mauritius
ERIC Educational Resources Information Center
Seebaluck, Ashley Keshwar; Seegum, Trisha Devi
2013-01-01
Purpose: The purpose of this study was to critically analyse the factors that affect the motivation of public primary school teachers and also to investigate if there is any relationship between teacher motivation and job satisfaction in Mauritius. Design/methodology/approach: Simple random sampling method was used to collect data from 250 primary…
Perceptions of Online Credentials for School Principals
ERIC Educational Resources Information Center
Richardson, Jayson W.; McLeod, Scott; Dikkers, Amy Garrett
2011-01-01
Purpose: The purpose of this study is to investigate the perceptions of human resource directors in the USA about online credentials earned by K-12 school principals and principal candidates. Design/methodology/approach: In this mixed methods study, a survey was sent to a random sample of 500 human resource directors in K-12 school districts…
ERIC Educational Resources Information Center
McConeghy, Kevin; Wing, Coady; Wong, Vivian C.
2015-01-01
Randomized experiments have long been established as the gold standard for addressing causal questions. However, experiments are not always feasible or desired, so observational methods are also needed. When multiple observations on the same variable are available, a repeated measures design may be used to assess whether a treatment administered…
Educational Research with Real-World Data: Reducing Selection Bias with Propensity Scores
ERIC Educational Resources Information Center
Adelson, Jill L.
2013-01-01
Often it is infeasible or unethical to use random assignment in educational settings to study important constructs and questions. Hence, educational research often uses observational data, such as large-scale secondary data sets and state and school district data, and quasi-experimental designs. One method of reducing selection bias in estimations…
A Confirmatory Factor Analysis of the Professional Opinion Scale
ERIC Educational Resources Information Center
Greeno, Elizabeth J.; Hughes, Anne K.; Hayward, R. Anna; Parker, Karen L.
2007-01-01
The Professional Opinion Scale (POS) was developed to measure social work values orientation. Objective: A confirmatory factor analysis was performed on the POS. Method: This cross-sectional study used a mailed survey design with a national simple random sample of members of the National Association of Social Workers. Results: The study…
Primary Care Physicians' Dementia Care Practices: Evidence of Geographic Variation
ERIC Educational Resources Information Center
Fortinsky, Richard H.; Zlateva, Ianita; Delaney, Colleen; Kleppinger, Alison
2010-01-01
Purpose: This article explores primary care physicians' (PCPs) self-reported approaches and barriers to management of patients with dementia, with a focus on comparisons in dementia care practices between PCPs in 2 states. Design and Methods: In this cross-sectional study, questionnaires were mailed to 600 randomly selected licensed PCPs in…
The Impact of Sample Size and Other Factors When Estimating Multilevel Logistic Models
ERIC Educational Resources Information Center
Schoeneberger, Jason A.
2016-01-01
The design of research studies utilizing binary multilevel models must necessarily incorporate knowledge of multiple factors, including estimation method, variance component size, and number of predictors, in addition to sample sizes. This Monte Carlo study examined the performance of random effect binary outcome multilevel models under varying…
Public-Private Partnership and Infrastructural Development in Nigerian Universities
ERIC Educational Resources Information Center
Oduwaiye, R. O.; Sofoluwe, A. O.; Bello, T. O.; Durosaro, I. A.
2014-01-01
This study investigated the degree to which Public-Private Partnership (PPP) services are related to infrastructural development in Nigerian Universities. The research design used was a descriptive survey. The population for the study encompassed all 20 universities in South-west Nigeria. Stratified random sampling was used to select 12…
Context Therapy: A New Intervention Approach for Children with Cerebral Palsy
ERIC Educational Resources Information Center
Darrah, Johanna; Law, Mary C.; Pollock, Nancy; Wilson, Brenda; Russell, Dianne J.; Walter, Stephen D.; Rosenbaum, Peter; Galuppi, Barb
2011-01-01
Aim: To describe the development of context therapy, a new intervention approach designed for a randomized controlled trial. Method: Therapists were trained to change task and environmental factors to achieve parent-identified functional goals for children with cerebral palsy. Therapists did not provide any remediation strategies to change the…
Cigarette Smoking and Anti-Smoking Counseling Practices among Physicians in Wuhan, China
ERIC Educational Resources Information Center
Gong, Jie; Zhang, Zhifeng; Zhu, Zhaoyang; Wan, Jun; Yang, Niannian; Li, Fang; Sun, Huiling; Li, Weiping; Xia, Jiang; Zhou, Dunjin; Chen, Xinguang
2012-01-01
Purpose: The paper seeks to report data on cigarette smoking, anti-smoking practices, physicians' receipt of anti-smoking training, and the association between receipt of the training and anti-smoking practice among physicians in Wuhan, China. Design/methodology/approach: Participants were selected through the stratified random sampling method.…
Evaluating the Impact of a Multistrategy Inference Intervention for Middle-Grade Struggling Readers
ERIC Educational Resources Information Center
Barth, Amy E.; Elleman, Amy
2017-01-01
Purpose: We examined the effectiveness of a multistrategy inference intervention designed to increase inference making and reading comprehension for middle-grade struggling readers. Method: A total of 66 middle-grade struggling readers were randomized to treatment (n = 33) and comparison (n = 33) conditions. Students in the treatment group…
Design, Baseline Results of Irbid Longitudinal, School-Based Smoking Study
ERIC Educational Resources Information Center
Mzayek, Fawaz; Khader, Yousef; Eissenberg, Thomas; Ward, Kenneth D.; Maziak, Wasim
2011-01-01
Objective: To compare patterns of water pipe and cigarette smoking in an eastern Mediterranean country. Methods: In 2008, 1781 out of 1877 seventh graders enrolled in 19 randomly selected schools in Irbid, Jordan, were surveyed. Results: Experimentation with and current water pipe smoking were more prevalent than cigarette smoking (boys: 38.7% vs…
Component-based control of oil-gas-water mixture composition in pipelines
NASA Astrophysics Data System (ADS)
Voytyuk, I. N.
2018-03-01
The article provides a theoretical justification of a method for measuring changes in the content of oil, gas and water in pipelines; the design of a measurement system for its implementation is also discussed. An assessment of the random and systematic errors of the future system is presented, together with recommendations for its optimization.
ERIC Educational Resources Information Center
Emanouilidis, Emanuel
2005-01-01
Latin squares have existed for hundreds of years but it wasn't until rather recently that Latin squares were used in other areas such as statistics, graph theory, coding theory and the generation of random numbers as well as in the design and analysis of experiments. This note describes Latin and diagonal Latin squares, a method of constructing…
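As an illustration of the kind of construction the note alludes to, here is a minimal sketch (not the note's own method) of the classic cyclic construction of an n × n Latin square, with a validity check; the function names are ours:

```python
def latin_square(n):
    """Cyclic construction: row i is the sequence 1..n rotated left by i."""
    return [[(i + j) % n + 1 for j in range(n)] for i in range(n)]

def is_latin(square):
    """Check that every row and every column contains each symbol exactly once."""
    n = len(square)
    symbols = set(range(1, n + 1))
    rows_ok = all(set(row) == symbols for row in square)
    cols_ok = all({square[i][j] for i in range(n)} == symbols
                  for j in range(n))
    return rows_ok and cols_ok
```

For n = 3 this yields [[1, 2, 3], [2, 3, 1], [3, 1, 2]]; diagonal Latin squares add the further requirement that both diagonals also contain each symbol once, which the plain cyclic construction does not guarantee.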
ERIC Educational Resources Information Center
Mayfield, Carlene A.; Child, Stephanie; Weaver, Robert G.; Zarrett, Nicole; Beets, Michael W.; Moore, Justin B.
2017-01-01
Background: We examined the effectiveness of Peaceful Playgrounds™ (P2) to decrease antisocial behaviors (ASB) while increasing physical activity (PA) and prosocial behaviors (PSB) in elementary school children. Methods: A longitudinal, cluster-randomized design was employed in 4 elementary school playgrounds where students (third to fifth) from 2…
Review of Estimation Methods for Landline and Cell Phone Surveys
ERIC Educational Resources Information Center
Arcos, Antonio; del Mar Rueda, María; Trujillo, Manuel; Molina, David
2015-01-01
The rapid proliferation of cell phone use and the accompanying decline in landline service in recent years have resulted in substantial potential for coverage bias in landline random-digit-dial telephone surveys, which has led to the implementation of dual-frame designs that incorporate both landline and cell phone samples. Consequently,…
Olives, Casey; Pagano, Marcello; Deitchler, Megan; Hedt, Bethany L; Egge, Kari; Valadez, Joseph J
2009-01-01
Traditional lot quality assurance sampling (LQAS) methods require simple random sampling to guarantee valid results. However, cluster sampling has been proposed to reduce the number of random starting points. This study uses simulations to examine the classification error of two such designs, a 67×3 (67 clusters of three observations) and a 33×6 (33 clusters of six observations) sampling scheme to assess the prevalence of global acute malnutrition (GAM). Further, we explore the use of a 67×3 sequential sampling scheme for LQAS classification of GAM prevalence. Results indicate that, for independent clusters with moderate intracluster correlation for the GAM outcome, the three sampling designs maintain approximate validity for LQAS analysis. Sequential sampling can substantially reduce the average sample size required for data collection. The presence of intercluster correlation can dramatically impact the classification error associated with LQAS analysis. PMID:20011037
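The decision logic behind a clustered LQAS design like the 67×3 scheme can be sketched in a few lines. This is a simplified stand-in, not the study's simulation: the decision rule, the Gaussian spread used to mimic intracluster correlation, and the trial count are all hypothetical:

```python
import random

def simulate_lqas(prevalence, icc_spread, n_clusters=67, per_cluster=3,
                  decision_rule=25, trials=2000, seed=1):
    """Fraction of simulated surveys classified as 'high prevalence'.

    Each cluster draws its own prevalence around the population value
    (a crude stand-in for intracluster correlation), then cases are
    counted across all clusters and compared to the decision rule.
    """
    rng = random.Random(seed)
    exceed = 0
    for _ in range(trials):
        cases = 0
        for _ in range(n_clusters):
            p = min(1.0, max(0.0, rng.gauss(prevalence, icc_spread)))
            cases += sum(rng.random() < p for _ in range(per_cluster))
        if cases > decision_rule:  # classify the area as high prevalence
            exceed += 1
    return exceed / trials
```

Running this with a high versus a low true prevalence shows the classification probability separating, which is the property the design analyses above quantify.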
Yuan, Changzheng; Lv, Jun; VanderWeele, Tyler J.
2013-01-01
Background Relatively little is known about peer influence on health behaviors within university dormitory rooms. Moreover, in China, the problem of unhealthy behaviors among university students has not yet been sufficiently recognized. We thus investigated health behavior peer influence in Peking University dormitories utilizing a randomized cluster-assignment design. Methods Study design: Cross-sectional in-dormitory survey. Study population: Current students from Peking University Health Science Center from April to June, 2009. Measurement: Self-reported questionnaire on health behaviors: physical activity (including bicycling), dietary intake and tobacco use. Results Bicycle use, moderate-intensity exercise, frequency of sweet food and soybean milk intake, and frequency of roasted/baked/toasted food intake were significantly or marginally significantly affected by peer influence. Conclusion Health behavior peer effects exist within dormitory rooms among university students. This could provide guidance on room assignment, or inform intervention programs. These findings may warrant attention from university administrators and policy makers. PMID:24040377
Finding the Optimal Nets for Self-Folding Kirigami
NASA Astrophysics Data System (ADS)
Araújo, N. A. M.; da Costa, R. A.; Dorogovtsev, S. N.; Mendes, J. F. F.
2018-05-01
Three-dimensional shells can be synthesized from the spontaneous self-folding of two-dimensional templates of interconnected panels, called nets. However, some nets are more likely to self-fold into the desired shell under random movements. The optimal nets are the ones that maximize the number of vertex connections, i.e., vertices that have only two of their faces cut away from each other in the net. Previous methods for finding such nets are based on random search and thus do not guarantee the optimal solution. Here, we propose a deterministic procedure. We map the connectivity of the shell into a shell graph, where the nodes and links of the graph represent the vertices and edges of the shell, respectively. Identifying the nets that maximize the number of vertex connections corresponds to finding the set of maximum leaf spanning trees of the shell graph. This method allows us not only to design the self-assembly of much larger shell structures but also to apply additional design criteria, as a complete catalog of the maximum leaf spanning trees is obtained.
Morrissey, C Orla; Chen, Sharon C-A; Sorrell, Tania C; Bradstock, Kenneth F; Szer, Jeffrey; Halliday, Catriona L; Gilroy, Nicole M; Schwarer, Anthony P; Slavin, Monica A
2011-02-01
Invasive aspergillosis (IA) is a major cause of mortality in patients with hematological malignancies, due largely to the inability of traditional culture and biopsy methods to make an early or accurate diagnosis. Diagnostic accuracy studies suggest that Aspergillus galactomannan (GM) enzyme immunoassay (ELISA) and Aspergillus PCR-based methods may overcome these limitations, but their impact on patient outcomes should be evaluated in a diagnostic randomized controlled trial (D-RCT). This article describes the methodology of a D-RCT which compares a new pre-emptive strategy (GM-ELISA- and Aspergillus PCR-driven antifungal therapy) with the standard fever-driven empiric antifungal treatment strategy. Issues including primary end-point and patient selection, duration of screening, choice of tests for the pre-emptive strategy, antifungal prophylaxis and bias control, which were considered in the design of the trial, are discussed. We suggest that the template presented herein is considered by researchers when evaluating the utility of new diagnostic tests (ClinicalTrials.gov number, NCT00163722).
Raja-Khan, Nazia; Agito, Katrina; Shah, Julie; Stetter, Christy M; Gustafson, Theresa S; Socolow, Holly; Kunselman, Allen R; Reibel, Diane K; Legro, Richard S
2015-03-01
Mindfulness-based stress reduction (MBSR) may be beneficial for overweight/obese women, including women with polycystic ovary syndrome (PCOS), as it has been shown to reduce psychological distress and improve quality of life in other patient populations. Preliminary studies suggest that MBSR may also have salutary effects on blood pressure and blood glucose. This paper describes the design and methods of an ongoing pilot randomized controlled trial evaluating the feasibility and effects of MBSR in PCOS and non-PCOS women who are overweight or obese (NCT01464398). Eighty-six (86) women with body mass index ≥ 25 kg/m², including 31 women with PCOS, have been randomized to 8 weeks of MBSR or health education control, and followed for 16 weeks. The primary outcome is mindfulness assessed with the Toronto Mindfulness Scale. Secondary outcomes include measures of blood pressure, blood glucose, quality of life, anxiety and depression. Our overall hypothesis is that MBSR will increase mindfulness and ultimately lead to favorable changes in blood pressure, blood glucose, psychological distress and quality of life in PCOS and non-PCOS women. This would support the integration of MBSR with conventional medical treatments to reduce psychological distress, cardiovascular disease and diabetes in PCOS and non-PCOS women who are overweight or obese. Copyright © 2015 Elsevier Inc. All rights reserved.
Lattice Boltzmann simulations for wall-flow dynamics in porous ceramic diesel particulate filters
NASA Astrophysics Data System (ADS)
Lee, Da Young; Lee, Gi Wook; Yoon, Kyu; Chun, Byoungjin; Jung, Hyun Wook
2018-01-01
Flows through porous filter walls of wall-flow diesel particulate filter are investigated using the lattice Boltzmann method (LBM). The microscopic model of the realistic filter wall is represented by randomly overlapped arrays of solid spheres. The LB simulation results are first validated by comparison to those from previous hydrodynamic theories and constitutive models for flows in porous media with simple regular and random solid-wall configurations. We demonstrate that the newly designed randomly overlapped array structures of porous walls allow reliable and accurate simulations for the porous wall-flow dynamics in a wide range of solid volume fractions from 0.01 to about 0.8, which is beyond the maximum random packing limit of 0.625. The permeable performance of porous media is scrutinized by changing the solid volume fraction and particle Reynolds number using Darcy's law and Forchheimer's extension in the laminar flow region.
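Darcy's law, which the study uses to characterize the permeable performance of the porous walls, reduces to a one-line calculation relating flow rate, pressure drop and geometry. The sketch below uses SI units; the wall-flow numbers in the example are hypothetical, chosen only to give a permeability in the range typical of ceramic filter walls:

```python
def darcy_permeability(flow_rate, viscosity, length, area, pressure_drop):
    """Darcy's law: Q = k * A * dP / (mu * L)  =>  k = Q * mu * L / (A * dP).

    SI units: Q [m^3/s], mu [Pa*s], L [m], A [m^2], dP [Pa] -> k [m^2].
    """
    return flow_rate * viscosity * length / (area * pressure_drop)

# Hypothetical wall-flow numbers, for illustration only:
k = darcy_permeability(flow_rate=1e-6, viscosity=1.8e-5,
                       length=4e-4, area=1e-4, pressure_drop=100.0)
```

Forchheimer's extension adds an inertial term proportional to the square of the velocity, which becomes relevant at the higher particle Reynolds numbers examined in the paper.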
Asghari Jafarabadi, Mohammad; Sadeghi-Bazrgani, Homayoun; Dianat, Iman
2018-06-01
To evaluate the quality of reporting in published randomized controlled trials (RCTs) in the field of fall injuries. The 188 RCTs published between 2001 and 2011 and indexed in the EMBASE and Medline databases were extracted by searching with appropriate keywords and EMTree classification terms. The trustworthiness of the evaluation was ensured through parallel evaluations by two experts in epidemiology and biostatistics. About 40%-75% of papers had problems in reporting the random allocation method, allocation concealment, random allocation implementation, blinding and similarity among groups, intention to treat, and balancing of benefits and harms. Moreover, at least 10% of papers reported inappropriately, or did not report, the design, protocol violations, sample size justification, subgroup/adjusted analyses, flow diagram, drop outs, recruitment time, baseline data, suitable effect size on outcome, ancillary analyses, limitations and generalizability. Considering the shortcomings found and the importance of RCTs for fall injury prevention programmes, their reporting quality should be improved.
Simon, Richard
2008-06-01
Developments in genomics and biotechnology provide unprecedented opportunities for the development of effective therapeutics and companion diagnostics for matching the right drug to the right patient. Effective co-development involves many new challenges with increased opportunity for success as well as delay and failure. Clinical trial designs and adaptive analysis plans for the prospective design of pivotal trials of new therapeutics and companion diagnostics are reviewed. Effective co-development requires careful prospective planning of the design and analysis strategy for pivotal clinical trials. Randomized clinical trials continue to be important for evaluating the effectiveness of new treatments, but the target populations for analysis should be prospectively specified based on the companion diagnostic. Post hoc analyses of traditionally designed randomized clinical trials are often deeply problematic. Clear separation is generally required of the data used for developing the diagnostic test, including their threshold of positivity, from the data used for evaluating treatment effectiveness in subsets determined by the test. Adaptive analysis can be used to provide flexibility to the analysis but the use of such methods requires careful planning and prospective definition in order to assure that the pivotal trial adequately limits the chance of erroneous conclusions.
Chen, Xiao; Liu, Peng; Zhu, Xiaofei; Cao, Liehu; Zhang, Chuncai; Su, Jiacan
2013-06-01
We carried out this study to test the efficacy of the olecranon memory connector (OMC) in olecranon fractures. We designed a prospective randomised controlled trial involving 40 cases of olecranon fractures. From May 2004 to December 2009, 40 patients with olecranon fractures were randomly assigned into two groups. Twenty patients were treated with OMC, while another 20 patients were fixed with locking plates in our hospital. The DASH score, MEP score, range of motion and radiographs were used to evaluate the postoperative elbow function and complications. For MEP score, OMC was better than the locking plate; for DASH score, complication rate, and range of elbow motion, the two methods presented no significant difference. The study showed that OMC could be an effective alternative to treat olecranon fractures.
Friedrich, Roberta R; Caetano, Lisandrea C; Schiffner, Mariana D; Wagner, Mário B; Schuch, Ilaine
2015-04-11
The prevalence of child obesity in Brazil has increased rapidly in recent decades. There is, therefore, an urgent need to develop effective strategies to prevent and control child obesity. In light of these considerations, an intervention program with a focus on nutrition education and physical activity was developed to prevent and control obesity in schools. The intervention was called the TriAtiva Program: Education, Nutrition and Physical Activity. This article describes the design, randomization and method used to evaluate the TriAtiva program. This randomized controlled cluster trial was performed in 12 municipal schools in the city of Porto Alegre/RS (six schools in the intervention group and six control schools) which offered first through fourth grade, during one school year. The TriAtiva Program was implemented through educational activities related to healthy eating and physical activity, creating an environment which promoted student health while involving the school community and student families. The primary outcome of the present study was body mass, while its secondary outcomes were waist circumference, percent body fat, blood pressure and behavioural variables such as eating habits and physical activity levels, as well as the prevalence, incidence and remission rates of obesity. The intervention was developed based on a comprehensive review of controlled trials of similar design. The TriAtiva Program: Education, Nutrition and Physical Activity was the first study in Southern Brazil to use a randomized controlled design to evaluate an intervention involving both nutrition education and physical activity in schools. Our results will contribute to the development of future interventions aimed at preventing and controlling child obesity in schools, especially in Brazil. Brazilian Clinical Trials Registry (REBEC) number RBR2xx2z4.
Wang, Yonggang; Hui, Cong; Liu, Chong; Xu, Chao
2016-04-01
The contribution of this paper is proposing a new entropy extraction mechanism based on sampling phase jitter in ring oscillators to make a high throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with a maximum entropy and fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are explored to realize the precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method in a purely digital fashion can provide high-speed high-quality random bit sequences for a variety of embedded applications.
Fatigue Damage Spectrum calculation in a Mission Synthesis procedure for Sine-on-Random excitations
NASA Astrophysics Data System (ADS)
Angeli, Andrea; Cornelis, Bram; Troncossi, Marco
2016-09-01
In many real-life environments, certain mechanical and electronic components may be subjected to Sine-on-Random vibrations, i.e. excitations composed of random vibrations superimposed on deterministic (sinusoidal) contributions, in particular sine tones due to some rotating parts of the system (e.g. helicopters, engine-mounted components,...). These components must be designed to withstand the fatigue damage induced by the “composed” vibration environment, and qualification tests are advisable for the most critical ones. In the case of an accelerated qualification test, a proper test tailoring which starts from the real environment (measured vibration signals) and which preserves not only the accumulated fatigue damage but also the “nature” of the excitation (i.e. sinusoidal components plus random process) is important to obtain reliable results. In this paper, the classic time domain approach is taken as a reference for the comparison of different methods for the Fatigue Damage Spectrum (FDS) calculation in case of Sine-on-Random vibration environments. Then, a methodology to compute a Sine-on-Random specification based on a mission FDS is proposed.
The variability of software scoring of the CDMAM phantom associated with a limited number of images
NASA Astrophysics Data System (ADS)
Yang, Chang-Ying J.; Van Metter, Richard
2007-03-01
Software scoring approaches provide an attractive alternative to human evaluation of CDMAM images from digital mammography systems, particularly for annual quality control testing as recommended by the European Protocol for the Quality Control of the Physical and Technical Aspects of Mammography Screening (EPQCM). Methods for correlating CDCOM-based results with human observer performance have been proposed. A common feature of all methods is the use of a small number (at most eight) of CDMAM images to evaluate the system. This study focuses on the potential variability in the estimated system performance that is associated with these methods. Sets of 36 CDMAM images were acquired under carefully controlled conditions from three different digital mammography systems. The threshold visibility thickness (TVT) for each disk diameter was determined using previously reported post-analysis methods from the CDCOM scorings for a randomly selected group of eight images for one measurement trial. This random selection process was repeated 3000 times to estimate the variability in the resulting TVT values for each disk diameter. The results from using different post-analysis methods, different random selection strategies and different digital systems were compared. Additional variability of the 0.1 mm disk diameter was explored by comparing the results from two different image data sets acquired under the same conditions from the same system. The magnitude and the type of error estimated for experimental data were explained through modeling. The modeled results also suggest a limitation in the current phantom design for the 0.1 mm diameter disks. Through modeling, it was also found that, because of the binomial statistical nature of the CDMAM test, the true variability of the test could be underestimated by the commonly used method of random re-sampling.
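The random re-sampling procedure described above, repeatedly drawing eight of the 36 images and recomputing a detection statistic, can be sketched as follows. This is an illustrative simplification, not the study's post-analysis: the per-image binary scores, function name and parameters are made up:

```python
import random

def resample_variability(scores, k=8, trials=3000, seed=3):
    """Mean and standard deviation of the detection fraction over
    repeated random selections of k images from the full set.

    scores: per-image binary detection outcomes (1 = disk detected)
    for a single disk diameter.
    """
    rng = random.Random(seed)
    fractions = []
    for _ in range(trials):
        subset = rng.sample(scores, k)
        fractions.append(sum(subset) / k)
    mean = sum(fractions) / trials
    sd = (sum((f - mean) ** 2 for f in fractions) / trials) ** 0.5
    return mean, sd
```

Because each trial re-samples from the same finite set of 36 binomial outcomes, the spread measured this way can understate the true test variability, which is the limitation the modeling in the paper points out.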
Dahabreh, Issa J.; Sheldrick, Radley C.; Paulus, Jessica K.; Chung, Mei; Varvarigou, Vasileia; Jafri, Haseeb; Rassen, Jeremy A.; Trikalinos, Thomas A.; Kitsios, Georgios D.
2012-01-01
Aims Randomized controlled trials (RCTs) are the gold standard for assessing the efficacy of therapeutic interventions because randomization protects from biases inherent in observational studies. Propensity score (PS) methods, proposed as a potential solution to confounding of the treatment–outcome association, are widely used in observational studies of therapeutic interventions for acute coronary syndromes (ACS). We aimed to systematically assess agreement between observational studies using PS methods and RCTs on therapeutic interventions for ACS. Methods and results We searched for observational studies of interventions for ACS that used PS methods to estimate treatment effects on short- or long-term mortality. Using a standardized algorithm, we matched observational studies to RCTs based on patients’ characteristics, interventions, and outcomes (‘topics’), and we compared estimates of treatment effect between the two designs. When multiple observational studies or RCTs were identified for the same topic, we performed a meta-analysis and used the summary relative risk for comparisons. We matched 21 observational studies investigating 17 distinct clinical topics to 63 RCTs (median = 3 RCTs per observational study) for short-term (7 topics) and long-term (10 topics) mortality. Estimates from PS analyses differed statistically significantly from randomized evidence in two instances; however, observational studies reported more extreme beneficial treatment effects compared with RCTs in 13 of 17 instances (P = 0.049). Sensitivity analyses limited to large RCTs, and using alternative meta-analysis models yielded similar results. Conclusion For the treatment of ACS, observational studies using PS methods produce treatment effect estimates that are of more extreme magnitude compared with those from RCTs, although the differences are rarely statistically significant. PMID:22711757
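The confounding problem that motivates propensity score methods can be illustrated with a toy cohort: if sicker patients are both more likely to receive treatment and more likely to have worse outcomes, the naive treated-versus-untreated comparison is biased, while matching on the confounder (here monotone in the propensity score) recovers something close to the true effect. This is a self-contained sketch with invented numbers, not any study's actual analysis:

```python
import bisect
import math
import random

def simulate_cohort(n=4000, true_effect=1.0, seed=7):
    """Toy observational cohort: x confounds both treatment and outcome."""
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        x = rng.gauss(0.0, 1.0)                      # confounder
        p_treat = 1.0 / (1.0 + math.exp(-1.5 * x))   # propensity score
        t = 1 if rng.random() < p_treat else 0
        y = 2.0 * x + true_effect * t + rng.gauss(0.0, 0.5)
        data.append((x, t, y))
    return data

def naive_effect(data):
    """Unadjusted difference in mean outcome: confounded."""
    t1 = [y for x, t, y in data if t == 1]
    t0 = [y for x, t, y in data if t == 0]
    return sum(t1) / len(t1) - sum(t0) / len(t0)

def matched_effect(data, caliper=0.05):
    """Match each treated unit to the nearest control on x, within a caliper."""
    treated = [(x, y) for x, t, y in data if t == 1]
    controls = sorted((x, y) for x, t, y in data if t == 0)
    xs = [x for x, _ in controls]
    diffs = []
    for x, y in treated:
        i = bisect.bisect_left(xs, x)
        best = None
        for j in (i - 1, i):  # candidates bracketing x
            if 0 <= j < len(xs) and abs(xs[j] - x) <= caliper:
                if best is None or abs(xs[j] - x) < abs(xs[best] - x):
                    best = j
        if best is not None:
            diffs.append(y - controls[best][1])
    return sum(diffs) / len(diffs)
```

With these settings the naive estimate overstates the true effect of 1.0 considerably, while the matched estimate lands much closer, which mirrors, in miniature, why the systematic PS-versus-RCT comparisons above are informative.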
Mao, Jun J; Li, Qing S.; Soeller, Irene; Xie, Sharon X; Amsterdam, Jay D.
2014-01-01
Background Rhodiola rosea (R. rosea), a botanical of both western and traditional Chinese medicine, has been used as a folk remedy for improving stamina and reducing stress. However, few controlled clinical trials have examined the safety and efficacy of R. rosea for the treatment of major depressive disorder (MDD). This study seeks to evaluate the safety and efficacy of R. rosea in a 12-week, randomized, double-blind, placebo-controlled, parallel group study design. Methods / Design Subjects with MDD not receiving antidepressant therapy will be randomized to either R. rosea extract 340–1,360 mg daily; sertraline 50–200 mg daily, or placebo for 12 weeks. The primary outcome measure will be change over time in the mean 17-item Hamilton Depression Rating score. Secondary outcome measures will include safety and quality of life ratings. Statistical procedures will include mixed-effects models to assess efficacy for primary and secondary outcomes. Discussion This study will provide valuable preliminary information on the safety and efficacy data of R. rosea versus conventional antidepressant therapy of MDD. It will also inform additional hypotheses and study design of future, fully powered, phase III clinical trials with R. rosea to determine its safety and efficacy in MDD. PMID:25610752
True and Quasi-Experimental Designs. ERIC/AE Digest.
ERIC Educational Resources Information Center
Gribbons, Barry; Herman, Joan
Among the different types of experimental design are two general categories: true experimental designs and quasi-experimental designs. True experimental designs include more than one purposively created group, common measured outcomes, and random assignment. Quasi-experimental designs are commonly used when random assignment is not practical or…
Nonlinear analyses of composite aerospace structures in sonic fatigue
NASA Technical Reports Server (NTRS)
Mei, Chuh
1993-01-01
This report summarizes the semiannual research progress, accomplishments, and future plans under NASA Langley Research Center Grant No. NAG-1-1358. The primary research effort of this project is the development of analytical methods for the prediction of nonlinear random response of composite aerospace structures subjected to combined acoustic and thermal loads. The progress, accomplishments, and future plans on four sonic fatigue research topics are described. The sonic fatigue design and passive control of random response of shape memory alloy hybrid composites presented in section 4, which is especially suited for the HSCT, is a new initiative.
Optimized Projection Matrix for Compressive Sensing
NASA Astrophysics Data System (ADS)
Xu, Jianping; Pi, Yiming; Cao, Zongjie
2010-12-01
Compressive sensing (CS) is mainly concerned with low-coherence pairs, since the number of samples needed to recover the signal is proportional to the mutual coherence between the projection matrix and the sparsifying matrix. Until now, papers on CS have typically assumed the projection matrix to be a random matrix. In this paper, aiming at minimizing the mutual coherence, a method is proposed to optimize the projection matrix. This method is based on equiangular tight frame (ETF) design, because an ETF has minimum coherence. Because of its complexity, the problem cannot be solved exactly; therefore, an alternating-minimization method is used to find a feasible solution. The optimally designed projection matrix can further reduce the necessary number of samples for recovery or improve the recovery accuracy. The proposed method demonstrates better performance than conventional optimization methods, which brings benefits to both basis pursuit and orthogonal matching pursuit.
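As a concrete illustration of the quantity being minimized above, the sketch below computes the mutual coherence of the equivalent dictionary D = ΦΨ and compares it with the Welch bound, which an equiangular tight frame attains. The matrix sizes, the random Gaussian Φ, and the identity sparsifying basis are illustrative assumptions, not values from the paper.

```python
import numpy as np

def mutual_coherence(D):
    """Maximum absolute normalized inner product between distinct columns of D."""
    Dn = D / np.linalg.norm(D, axis=0, keepdims=True)  # unit-norm columns
    G = np.abs(Dn.T @ Dn)                              # Gram matrix of columns
    np.fill_diagonal(G, 0.0)                           # ignore self-products
    return G.max()

rng = np.random.default_rng(0)
m, n = 20, 50
Phi = rng.standard_normal((m, n))   # random Gaussian projection matrix (assumption)
Psi = np.eye(n)                     # identity sparsifying basis (assumption)
mu = mutual_coherence(Phi @ Psi)

# Welch bound: the lower limit on coherence, attained by an equiangular tight frame
welch = np.sqrt((n - m) / (m * (n - 1)))
print(mu, welch)
```

A purely random projection typically sits well above the Welch bound, which is the gap the ETF-based optimization tries to close.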
Experiment Design for Complex VTOL Aircraft with Distributed Propulsion and Tilt Wing
NASA Technical Reports Server (NTRS)
Murphy, Patrick C.; Landman, Drew
2015-01-01
Selected experimental results from a wind tunnel study of a subscale VTOL concept with distributed propulsion and tilt lifting surfaces are presented. The vehicle complexity and automated test facility were ideal for use with a randomized designed experiment. Design of Experiments and Response Surface Methods were invoked to produce run efficient, statistically rigorous regression models with minimized prediction error. Static tests were conducted at the NASA Langley 12-Foot Low-Speed Tunnel to model all six aerodynamic coefficients over a large flight envelope. This work supports investigations at NASA Langley in developing advanced configurations, simulations, and advanced control systems.
Methods of learning in statistical education: Design and analysis of a randomized trial
NASA Astrophysics Data System (ADS)
Boyd, Felicity Turner
Background. Recent psychological and technological advances suggest that active learning may enhance understanding and retention of statistical principles. A randomized trial was designed to evaluate the addition of innovative instructional methods within didactic biostatistics courses for public health professionals. Aims. The primary objectives were to evaluate and compare the addition of two active learning methods (cooperative and internet) on students' performance; assess their impact on performance after adjusting for differences in students' learning style; and examine the influence of learning style on trial participation. Methods. Consenting students enrolled in a graduate introductory biostatistics course were randomized to cooperative learning, internet learning, or control after completing a pretest survey. The cooperative learning group participated in eight small group active learning sessions on key statistical concepts, while the internet learning group accessed interactive mini-applications on the same concepts. Controls received no intervention. Students completed evaluations after each session and a post-test survey. Study outcome was performance quantified by examination scores. Intervention effects were analyzed by generalized linear models using intent-to-treat analysis and marginal structural models accounting for reported participation. Results. Of 376 enrolled students, 265 (70%) consented to randomization; 69, 100, and 96 students were randomized to the cooperative, internet, and control groups, respectively. Intent-to-treat analysis showed no differences between study groups; however, 51% of students in the intervention groups had dropped out after the second session. 
After accounting for reported participation, expected examination scores were 2.6 points higher (of 100 points) after completing one cooperative learning session (95% CI: 0.3, 4.9) and 2.4 points higher after one internet learning session (95% CI: 0.0, 4.7), versus nonparticipants or controls, adjusting for other performance predictors. Students who preferred learning by reflective observation and active experimentation experienced improved performance through internet learning (5.9 points, 95% CI: 1.2, 10.6) and cooperative learning (2.9 points, 95% CI: 0.6, 5.2), respectively. Learning style did not influence study participation. Conclusions. No performance differences by group were observed by intent-to-treat analysis. Participation in active learning appears to improve student performance in an introductory biostatistics course and provides opportunities for enhancing understanding beyond that attained in traditional didactic classrooms.
Improvement of Automated POST Case Success Rate Using Support Vector Machines
NASA Technical Reports Server (NTRS)
Zwack, Matthew R.; Dees, Patrick D.
2017-01-01
During early conceptual design of complex systems, concept down selection can have a large impact upon program life-cycle cost. Therefore, any concepts selected during early design will inherently commit program costs and affect the overall probability of program success. For this reason it is important to consider as large a design space as possible in order to better inform the down selection process. For conceptual design of launch vehicles, trajectory analysis and optimization often presents the largest obstacle to evaluating large trade spaces. This is due to the sensitivity of the trajectory discipline to changes in all other aspects of the vehicle design. Small deltas in the performance of other subsystems can result in relatively large fluctuations in the ascent trajectory because the solution space is non-linear and multi-modal [1]. In order to help capture large design spaces for new launch vehicles, the authors have performed previous work seeking to automate the execution of the industry standard tool, Program to Optimize Simulated Trajectories (POST). This work initially focused on implementation of analyst heuristics to enable closure of cases in an automated fashion, with the goal of applying the concepts of design of experiments (DOE) and surrogate modeling to enable near instantaneous throughput of vehicle cases [2]. Additional work was then completed to improve the DOE process by utilizing a graph theory based approach to connect similar design points [3]. The conclusion of the previous work illustrated the utility of the graph theory approach for completing a DOE through POST. However, this approach was still dependent upon the use of random repetitions to generate seed points for the graph. As noted in [3], only 8% of these random repetitions resulted in converged trajectories. This ultimately affects the ability of the random reps method to confidently approach the global optimum for a given vehicle case in a reasonable amount of time.
With only an 8% pass rate, tens or hundreds of thousands of reps may be needed to be confident that the best repetition is at least close to the global optimum. However, typical design study time constraints require that fewer repetitions be attempted, sometimes resulting in seed points that have only a handful of successful completions. If a small number of successful repetitions are used to generate a seed point, the graph method may inherit some inaccuracies as it chains DOE cases from non-global-optimal seed points. This creates inherent noise in the graph data, which can limit the accuracy of the resulting surrogate models. For this reason, the goal of this work is to improve the seed point generation method and ultimately the accuracy of the resulting POST surrogate model. The work focuses on increasing the case pass rate for seed point generation.
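The title suggests support vector machines are used to screen repetitions before they are run through POST. As a hedged sketch of the underlying classifier idea (not the authors' implementation), the following trains a minimal linear SVM by Pegasos-style subgradient descent on synthetic "repetition features" with made-up convergence labels; only repetitions the classifier predicts to converge would then be executed:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training set: each row holds trajectory-input features for one
# random repetition; the label says whether the case converged (+1) or not (-1).
X = rng.standard_normal((200, 4))
w_true = np.array([1.5, -2.0, 0.5, 1.0])          # illustrative ground truth
y = np.sign(X @ w_true + 0.1 * rng.standard_normal(200))

def train_linear_svm(X, y, lam=0.01, epochs=200):
    """Pegasos-style subgradient descent on the regularized hinge loss."""
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)
            if y[i] * (X[i] @ w) < 1:             # margin violated: move toward x_i
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
            else:                                  # margin satisfied: shrink only
                w = (1 - eta * lam) * w
    return w

w = train_linear_svm(X, y)
accuracy = np.mean(np.sign(X @ w) == y)
print(accuracy)
```

A high training accuracy on such nearly separable data is the behavior that makes pre-screening seed repetitions attractive.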
A Kalman Filter for SINS Self-Alignment Based on Vector Observation.
Xu, Xiang; Xu, Xiaosu; Zhang, Tao; Li, Yao; Tong, Jinwu
2017-01-29
In this paper, a self-alignment method for strapdown inertial navigation systems based on the q-method is studied. In addition, an improved method based on integrating gravitational apparent motion to form apparent velocity is designed, which can reduce the random noises of the observation vectors. For further analysis, a novel self-alignment method using a Kalman filter based on adaptive filter technology is proposed, which transforms the self-alignment procedure into an attitude estimation using the observation vectors. In the proposed method, a linear pseudo-measurement equation is adopted by employing the transfer method between the quaternion and the observation vectors. Analysis and simulation indicate that the accuracy of the self-alignment is improved. Meanwhile, to improve the convergence rate of the proposed method, a new method based on parameter recognition and a reconstruction algorithm for apparent gravitation is devised, which can reduce the influence of the random noises of the observation vectors. Simulations and turntable tests are carried out, and the results indicate that the proposed method can acquire sound alignment results with lower standard variances, and can obtain higher alignment accuracy and a faster convergence rate.
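The q-method referenced above is Davenport's eigenvalue solution of Wahba's problem: the optimal attitude quaternion is the eigenvector of the largest eigenvalue of the K matrix built from weighted vector observations. The sketch below shows the standard formulation on noise-free data (not the paper's adaptive-filter extension); the rotation, vectors, and weights are illustrative.

```python
import numpy as np

def q_method(b_list, r_list, weights):
    """Davenport's q-method: least-squares attitude from weighted vector pairs.
    Returns quaternion [q1, q2, q3, q4] (scalar last) and the max eigenvalue of K."""
    B = sum(a * np.outer(b, r) for a, b, r in zip(weights, b_list, r_list))
    sigma = np.trace(B)
    S = B + B.T
    z = sum(a * np.cross(b, r) for a, b, r in zip(weights, b_list, r_list))
    K = np.zeros((4, 4))
    K[:3, :3] = S - sigma * np.eye(3)
    K[:3, 3] = z
    K[3, :3] = z
    K[3, 3] = sigma
    vals, vecs = np.linalg.eigh(K)     # eigenvalues in ascending order
    return vecs[:, -1], vals[-1]       # eigenvector of the largest eigenvalue

def quat_to_dcm(q):
    """Attitude matrix A with b = A r (scalar-last quaternion convention)."""
    qv, q4 = q[:3], q[3]
    qx = np.array([[0, -qv[2], qv[1]], [qv[2], 0, -qv[0]], [-qv[1], qv[0], 0]])
    return (q4**2 - qv @ qv) * np.eye(3) + 2 * np.outer(qv, qv) - 2 * q4 * qx

# Noise-free check: two reference vectors rotated into the body frame
ang = np.pi / 3
A_true = np.array([[np.cos(ang), np.sin(ang), 0.0],
                   [-np.sin(ang), np.cos(ang), 0.0],
                   [0.0, 0.0, 1.0]])
r_list = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
b_list = [A_true @ r for r in r_list]
q, lam = q_method(b_list, r_list, [0.5, 0.5])
err = np.abs(quat_to_dcm(q) - A_true).max()
print(err, lam)   # for noise-free data, lam equals the sum of the weights
```

With noisy observations the gap between the largest eigenvalue and the sum of the weights measures the residual, which is the quantity the paper's apparent-velocity integration is designed to reduce.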
A Comparison of Two Balance Calibration Model Building Methods
NASA Technical Reports Server (NTRS)
DeLoach, Richard; Ulbrich, Norbert
2007-01-01
Simulated strain-gage balance calibration data is used to compare the accuracy of two balance calibration model building methods for different noise environments and calibration experiment designs. The first building method obtains a math model for the analysis of balance calibration data after applying a candidate math model search algorithm to the calibration data set. The second building method uses stepwise regression analysis in order to construct a model for the analysis. Four balance calibration data sets were simulated in order to compare the accuracy of the two math model building methods. The simulated data sets were prepared using the traditional One Factor At a Time (OFAT) technique and the Modern Design of Experiments (MDOE) approach. Random and systematic errors were introduced in the simulated calibration data sets in order to study their influence on the math model building methods. Residuals of the fitted calibration responses and other statistical metrics were compared in order to evaluate the calibration models developed with different combinations of noise environment, experiment design, and model building method. Overall, predicted math models and residuals of both math model building methods show very good agreement. Significant differences in model quality were attributable to noise environment, experiment design, and their interaction. Generally, the addition of systematic error significantly degraded the quality of calibration models developed from OFAT data by either method, but MDOE experiment designs were more robust with respect to the introduction of a systematic component of the unexplained variance.
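The second model building method above, stepwise regression, can be illustrated with a greedy forward-selection sketch. The simulated responses, the candidate regressors, and the RSS-improvement stopping rule below are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated calibration data: the response depends on two of four candidate
# regressors (coefficients and noise level are illustrative, not from the paper).
n = 100
X = rng.standard_normal((n, 4))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + 0.05 * rng.standard_normal(n)

def residual_rss(Xs, y):
    """Residual sum of squares of the least-squares fit of y on columns Xs."""
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    r = y - Xs @ beta
    return float(r @ r)

def forward_stepwise(X, y, tol=0.05):
    """Greedy forward selection: add the regressor that most reduces the RSS,
    stopping when the relative improvement falls below tol."""
    selected, remaining = [], list(range(X.shape[1]))
    rss = float(y @ y)
    while remaining:
        best = min(remaining, key=lambda j: residual_rss(X[:, selected + [j]], y))
        new_rss = residual_rss(X[:, selected + [best]], y)
        if (rss - new_rss) / rss < tol:
            break                       # no candidate improves the fit enough
        selected.append(best)
        remaining.remove(best)
        rss = new_rss
    return selected

selected = forward_stepwise(X, y)
print(selected)
```

With a well-chosen threshold the procedure recovers the two true regressors first; a threshold that is too loose admits noise terms, which mirrors the paper's point that noise environment interacts with the model building method.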
2014-01-01
Background Acute kidney injury (AKI) is observed in up to 41% of patients undergoing transcatheter aortic valve implantation (TAVI) and is associated with increased risk for mortality. The aim of the present study is to evaluate whether furosemide-induced diuresis with matched isotonic intravenous hydration using the RenalGuard system reduces AKI in patients undergoing TAVI. Methods/Design Reduce-AKI is a randomized sham-controlled study designed to examine the effect of an automated matched hydration system in the prevention of AKI in patients undergoing TAVI. Patients will be randomized in a 1:1 fashion to the RenalGuard system (active group) versus non-matched saline infusion (sham-controlled group). Both arms receive standard overnight saline infusion and N-acetyl cysteine before the procedure. Discussion The Reduce-AKI trial will investigate whether the use of automated forced diuresis with matched saline infusion is an effective therapeutic tool to reduce the occurrence of AKI in patients undergoing TAVI. Trial registration Clinicaltrials.gov: NCT01866800, 30 April 2013. PMID:24986373
NASA Astrophysics Data System (ADS)
Resita, I.; Ertikanto, C.
2018-05-01
This study aims to develop an electronic module design based on the Learning Content Development System (LCDS) to foster students’ multi-representation skills in physics subject material. The study uses the research and development method for the product design, involving 90 students and 6 physics teachers randomly chosen from 3 different senior high schools in Lampung Province. The data were collected using questionnaires and analyzed with a quantitative descriptive method. Based on the data, 95% of the students use only one form of representation in solving physics problems, and the representation they tend to use is symbolic. Students are considered to understand a physics concept if they are able to change from one form of representation to the others. The product design of the LCDS-based electronic module presents text, image, symbolic, video, and animation representations.
2013-01-01
Background A high prevalence of low back pain has persisted over the years despite extensive primary prevention initiatives among nurses’ aides. Many single-faceted interventions addressing just one aspect of low back pain have been carried out at workplaces, but with low success rate. This may be due to the multi-factorial origin of low back pain. Participatory ergonomics, cognitive behavioral training and physical training have previously shown promising effects on prevention and rehabilitation of low back pain. Therefore, the main aim of this study is to examine whether a multi-faceted workplace intervention consisting of participatory ergonomics, physical training and cognitive behavioral training can prevent low back pain and its consequences among nurses’ aides. External resources for the participating workplace and a strong commitment from the management and the organization support the intervention. Methods/design To overcome implementation barriers within usual randomized controlled trial designed workplace interventions, this study uses a stepped-wedge cluster-randomized controlled trial design with 4 groups. The intervention is delivered to the groups at random along four successive time periods three months apart. The intervention lasts three months and integrates participatory ergonomics, physical training and cognitive behavioral training tailored to the target group. Local physiotherapists and occupational therapists conduct the intervention after having received standardized training. Primary outcomes are low back pain and its consequences measured monthly by text messages up to three months after initiation of the intervention. 
Discussion Intervention effectiveness trials for preventing low back pain and its consequences in workplaces with physically demanding work are few, primarily single-faceted, with strict adherence to a traditional randomized controlled trial design that may hamper implementation and compliance, and have mostly been unsuccessful. By using a stepped-wedge design and obtaining high management commitment and support, we intend to improve implementation and aim to establish the effectiveness of a multi-faceted intervention to prevent low back pain. This study will potentially provide knowledge of prevention of low back pain and its consequences among nurses’ aides. Results are expected to be published in 2015–2016. Trial registration The study is registered as ISRCTN78113519. PMID:24261985
Alfawal, Alaa M H; Hajeer, Mohammad Y; Ajaj, Mowaffak A; Hamadah, Omar; Brad, Bassel
2018-02-17
To evaluate the effectiveness of two minimally invasive surgical procedures in the acceleration of canine retraction: piezocision and laser-assisted flapless corticotomy (LAFC). Trial design: A single-centre randomized controlled trial with a compound design (two-arm parallel-group design and a split-mouth design for each arm). Participants: 36 Class II division I patients (12 males, 24 females; age range: 15 to 27 years) requiring first upper premolar extraction followed by canine retraction. Interventions: piezocision group (PG; n = 18) and laser-assisted flapless corticotomy group (LG; n = 18). A split-mouth design was applied for each group, where the flapless surgical intervention was randomly allocated to one side and the other side served as a control side. Outcomes: the rate of canine retraction (primary outcome), anchorage loss and canine rotation, assessed at 1, 2, 3 and 4 months following the onset of canine retraction; the duration of canine retraction was also recorded. Random sequence: Computer-generated random numbers. Allocation concealment: sequentially numbered, opaque, sealed envelopes. Blinding: Single blinded (outcomes' assessor). Results: Seventeen patients in each group were included in the statistical analysis. The rate of canine retraction was significantly greater on the experimental side than on the control side in both groups: two-fold in the first month and 1.5-fold in the second month (p < 0.001). The overall canine retraction duration was also significantly reduced on the experimental side compared with the control side in both groups, by about 25% (p ≤ 0.001). There were no significant differences between the experimental and control sides regarding loss of anchorage and upper canine rotation in either group (p > 0.05), and no significant differences between the two flapless techniques in the studied variables at any evaluation time (p > 0.05).
Piezocision and laser-assisted flapless corticotomy appeared to be effective treatment methods for accelerating canine retraction without any significant untoward effect on anchorage or canine rotation during rapid retraction. ClinicalTrials.gov (Identifier: NCT02606331 ).
Zerfu, Taddese Alemu; Ayele, Henok Taddese; Bogale, Tariku Nigatu
2018-06-01
To investigate the effect of innovative means of distributing long-acting reversible contraceptives (LARC) on contraceptive use, we implemented a three-arm, parallel-group, cluster-randomized community trial. The intervention consisted of placing trained community-based reproductive health nurses (CORN) within health centers or health posts. The nurses provided counseling to encourage women to use LARC and distributed all contraceptive methods. A total of 282 villages were randomly selected and assigned to a control arm (n = 94) or 1 of 2 treatment arms (n = 94 each). The treatment groups differed by where the new service providers were deployed: health post or health center. We calculated difference-in-difference (DID) estimates to assess program impacts on LARC use. After nine months of intervention, the use of LARC methods increased significantly by 72.3 percent, while the use of short-acting methods declined by 19.6 percent. The proportion of women using LARC methods increased by 45.9 percent and 45.7 percent in the health post and health center based intervention arms, respectively. Compared to the control group, the DID estimates indicate that the use of LARC methods increased by 11.3 and 12.3 percentage points in the health post and health center based intervention arms. Given the low use of LARC methods in similar settings, deployment of contextually trained nurses at the grassroots level could substantially increase utilization of these methods. © 2018 The Population Council, Inc.
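The difference-in-difference estimator used in the trial above is simple enough to sketch. The baseline and endline proportions below are hypothetical, chosen only so the arithmetic reproduces DID estimates of roughly 11.3 and 12.3 percentage points; they are not the trial's data.

```python
# Hypothetical cluster-level proportions of women using LARC, before and after
# the nine-month intervention (illustrative numbers, not the trial's data).
baseline = {"control": 0.040, "health_post": 0.038, "health_center": 0.041}
endline  = {"control": 0.055, "health_post": 0.166, "health_center": 0.179}

def did(arm):
    """Difference-in-differences: change in the treatment arm minus the change
    in the control arm, expressed in percentage points."""
    change_treat = endline[arm] - baseline[arm]
    change_ctrl = endline["control"] - baseline["control"]
    return 100 * (change_treat - change_ctrl)

print(round(did("health_post"), 1))     # ~11.3 percentage points
print(round(did("health_center"), 1))   # ~12.3 percentage points
```

Subtracting the control arm's change nets out secular trends in contraceptive use that would have occurred without the intervention.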
Force Limited Vibration Testing: Computation C2 for Real Load and Probabilistic Source
NASA Astrophysics Data System (ADS)
Wijker, J. J.; de Boer, A.; Ellenbroek, M. H. M.
2014-06-01
To prevent over-testing of the test item during random vibration testing, Scharton proposed and discussed force limited random vibration testing (FLVT) in a number of publications, in which the factor C2 is, besides the random vibration specification, the total mass, and the turnover frequency of the load (test item), a very important parameter. A number of computational methods to estimate C2 are described in the literature, i.e. the simple and the complex two-degrees-of-freedom systems, STDFS and CTDFS, respectively. Both the STDFS and the CTDFS describe in a very reduced (simplified) manner the load and the source (the adjacent structure transferring the excitation forces to the test item, e.g. a spacecraft supporting an instrument). The motivation of this work is to establish a method for the computation of a realistic value of C2 to perform a representative force-limited random vibration test when the description of the adjacent structure (source) is more or less unknown. Marchand formulated a conservative estimate of C2 based on the maximum modal effective mass and damping of the test item (load), for when no description of the supporting structure (source) is available [13]. Marchand also gave a formal description of obtaining C2 from the maximum PSD of the acceleration and the maximum PSD of the force, both at the interface between load and source, in combination with the apparent mass and total mass of the load. This method is very convenient for computing the factor C2; however, finite element models are needed to compute the PSD spectra of both the acceleration and the force at the interface between load and source. Stevens presented the coupled systems modal approach (CSMA), where simplified asparagus patch models (parallel-oscillator representations) of load and source are connected, consisting of modal effective masses and the spring stiffnesses associated with the natural frequencies.
When the random acceleration vibration specification is given, the CSMA method is suitable for computing the value of the parameter C2. When no mathematical model of the source can be made available, estimates of the value of C2 can be found in the literature. In this paper a probabilistic mathematical representation of the unknown source is proposed, such that the asparagus patch model of the source can be approximated. The computation of the value of C2 can then be done in conjunction with the CSMA method, knowing the apparent mass of the load and the random acceleration specification at the interface between load and source. Strength and stiffness design rules for spacecraft, instrumentation, units, etc., as given in ECSS Standards and Handbooks, launch vehicle user's manuals, papers, books, etc., will be applied, and a probabilistic description of the design parameters is foreseen. As an example, a simple experiment has been worked out.
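For orientation, once C2 is known, Scharton's semi-empirical force limit takes the form S_FF = C² · M0² · S_AA below the turnover frequency f0, rolled off above it. The sketch below applies that formula; the roll-off exponent, mass, turnover frequency, and acceleration spectrum are illustrative assumptions, not values from this paper.

```python
import numpy as np

def force_limit_psd(freq, accel_psd, C2, M0, f0, n=2):
    """Semi-empirical force limit: S_FF = C^2 * M0^2 * S_AA below the turnover
    frequency f0, rolled off as (f0/f)^n above it.
    accel_psd in g^2/Hz and M0 (total load mass) in kg give force PSD in
    (kg*g)^2/Hz; the roll-off exponent n is an assumption (n = 2 is often quoted)."""
    freq = np.asarray(freq, dtype=float)
    base = C2 * M0**2 * np.asarray(accel_psd, dtype=float)
    rolloff = np.where(freq <= f0, 1.0, (f0 / freq) ** n)
    return base * rolloff

freq = np.array([20.0, 50.0, 100.0, 200.0, 400.0])
accel_psd = np.full_like(freq, 0.04)   # flat 0.04 g^2/Hz specification (illustrative)
S_FF = force_limit_psd(freq, accel_psd, C2=4.0, M0=10.0, f0=100.0)
print(S_FF)
```

During the test, the shaker input is notched wherever the measured interface force PSD would exceed this limit, which is how over-testing of the load is prevented.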
2009-01-01
Background Modern radiotherapy plays an important role in the therapy of advanced head and neck carcinomas. However, no clinical studies have been published addressing the effectiveness of postoperative radiotherapy in patients with a small tumor (pT1, pT2) and concomitant ipsilateral metastasis of a single lymph node (pN1), which would provide a basis for a general treatment recommendation. Methods/Design The present study is a non-blinded, prospective, multi-center randomized controlled trial (RCT). As the primary clinical endpoint, overall survival in patients receiving postoperative radiation therapy vs. patients without adjuvant therapy following surgery with curative intent is compared. The aim of the study is to enroll 560 adult males and females for 1:1 randomization to one of the two treatment arms (irradiation/no irradiation). Since patients with a small tumor (T1/T2) but singular lymph node metastasis are rare and the number of patients consenting to randomization is not predictable in advance, all patients rejecting randomization will be treated as preferred and enrolled in a prospective observational study (comprehensive cohort design) after giving informed consent. This observational part of the trial will be performed with maximum consistency with the treatment and observation protocol of the RCT. Because the impact of patient preference for a certain treatment option is not calculable, the parallel design of RCT and observational study may provide a maximum of evidence and efficacy for evaluation of treatment outcome. Secondary clinical endpoints are as follows: incidence of and time to tumor relapse (locoregional relapse, lymph node involvement and distant metastatic spread), quality of life as reported by the EORTC QLQ-C30 (with the H&N35 module), and time from operation to orofacial rehabilitation.
All tumors represent a homogeneous clinical state; therefore, additional investigation of protein expression levels within resection specimens may serve to establish surrogate parameters of patient outcome. Conclusion The inherent challenges of a rare clinical condition (pN1) and two substantially different therapy arms would limit the practicality of a classical randomized study. The concept of a comprehensive cohort design combines the preference of a randomized study with the option of careful data interpretation within an observational study. Trial registration ClinicalTrials.gov: NCT00964977 PMID:20028566
2013-01-01
Background Cancer and other chronic diseases reduce quality and length of life and productivity, and represent a significant financial burden to society. Evidence-based public health approaches to prevent cancer and other chronic diseases have been identified in recent decades and have the potential for high impact. Yet, barriers to implement prevention approaches persist as a result of multiple factors including lack of organizational support, limited resources, competing emerging priorities and crises, and limited skill among the public health workforce. The purpose of this study is to learn how best to promote the adoption of evidence based public health practice related to chronic disease prevention. Methods/design This paper describes the methods for a multi-phase dissemination study with a cluster randomized trial component that will evaluate the dissemination of public health knowledge about evidence-based prevention of cancer and other chronic diseases. Phase one involves development of measures of practitioner views on and organizational supports for evidence-based public health and data collection using a national online survey involving state health department chronic disease practitioners. In phase two, a cluster randomized trial design will be conducted to test receptivity and usefulness of dissemination strategies directed toward state health department chronic disease practitioners to enhance capacity and organizational support for evidence-based chronic disease prevention. Twelve state health department chronic disease units will be randomly selected and assigned to intervention or control. State health department staff and the university-based study team will jointly identify, refine, and select dissemination strategies within intervention units. Intervention (dissemination) strategies may include multi-day in-person training workshops, electronic information exchange modalities, and remote technical assistance. 
Evaluation methods include pre-post surveys, structured qualitative phone interviews, and abstraction of state-level chronic disease prevention program plans and progress reports. Trial registration clinicaltrials.gov: NCT01978054. PMID:24330729