Sample records for methods including random

  1. Randomization in clinical trials in orthodontics: its significance in research design and methods to achieve it.

    PubMed

    Pandis, Nikolaos; Polychronopoulou, Argy; Eliades, Theodore

    2011-12-01

    Randomization is a key step in reducing selection bias during the treatment allocation phase of randomized clinical trials. The process of randomization follows specific steps, which include generation of the randomization list, allocation concealment, and implementation of randomization. In the dental and orthodontic literature, treatment allocation is frequently characterized as random; however, the randomization procedures followed are often not appropriate. Randomization methods assign treatment to the trial arms at random, without foreknowledge of allocation by either the participants or the investigators, thus reducing selection bias. Randomization entails generation of the random allocation, allocation concealment, and the actual methodology of implementing treatment allocation randomly and unpredictably. The most popular randomization methods include some form of restricted and/or stratified randomization. This article introduces the reasons that make randomization an integral part of solid clinical trial methodology and presents the main randomization schemes applicable to clinical trials in orthodontics.
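The restricted (permuted-block) scheme this abstract alludes to can be sketched in a few lines. The function below is illustrative only; its name, the two-arm labels, and the block size are assumptions, not taken from the article.

```python
import random

def block_randomization_list(n_blocks, arms=("A", "B"), block_size=4, seed=1):
    """Permuted-block (restricted) randomization: each block contains every
    arm equally often, so allocation stays balanced after each full block."""
    per_arm = block_size // len(arms)
    rng = random.Random(seed)
    allocation = []
    for _ in range(n_blocks):
        block = list(arms) * per_arm   # e.g. ["A", "B", "A", "B"]
        rng.shuffle(block)             # random order within the block
        allocation.extend(block)
    return allocation

allocation = block_randomization_list(n_blocks=3)  # 12 assignments, 6 per arm
```

In practice the list is generated before recruitment and concealed from recruiters, which is the allocation-concealment step the abstract emphasizes.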

  2. A Structural Modeling Approach to a Multilevel Random Coefficients Model.

    ERIC Educational Resources Information Center

    Rovine, Michael J.; Molenaar, Peter C. M.

    2000-01-01

    Presents a method for estimating the random coefficients model using covariance structure modeling and allowing one to estimate both fixed and random effects. The method is applied to real and simulated data, including marriage data from J. Belsky and M. Rovine (1990). (SLD)

  3. Review of Recent Methodological Developments in Group-Randomized Trials: Part 2-Analysis.

    PubMed

    Turner, Elizabeth L; Prague, Melanie; Gallis, John A; Li, Fan; Murray, David M

    2017-07-01

    In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have updated that review with developments in analysis over the past 13 years; a companion article focuses on developments in design. We discuss developments in the topics of the earlier review (e.g., methods for parallel-arm GRTs, individually randomized group-treatment trials, and missing data) and in new topics, including methods to account for multiple levels of clustering and alternative estimation methods (e.g., augmented generalized estimating equations, targeted maximum likelihood, and quadratic inference functions). In addition, we describe developments in the analysis of alternative group designs (including stepped-wedge GRTs, network-randomized trials, and pseudocluster randomized trials), which require clustering to be accounted for in their design and analysis.

  4. Does time-lapse imaging have favorable results for embryo incubation and selection compared with conventional methods in clinical in vitro fertilization? A meta-analysis and systematic review of randomized controlled trials.

    PubMed

    Chen, Minghao; Wei, Shiyou; Hu, Junyan; Yuan, Jing; Liu, Fenghua

    2017-01-01

    The present study aimed to review the available evidence on whether time-lapse imaging (TLI) has favorable outcomes for embryo incubation and selection compared with conventional methods in clinical in vitro fertilization (IVF). PubMed, EMBASE, the Cochrane Library and ClinicalTrials.gov were searched up to February 2017 for randomized controlled trials (RCTs) comparing TLI with conventional methods. Studies that randomized either women or oocytes were included. For studies that randomized women, the primary outcomes were live birth and ongoing pregnancy, and the secondary outcomes were clinical pregnancy and miscarriage; for studies that randomized oocytes, the primary outcome was blastocyst rate and the secondary outcome was good-quality embryo on Day 2/3. Subgroup analysis was conducted based on differences in incubation and embryo selection between groups. Ten RCTs were included, four randomizing oocytes and six randomizing women. For the oocyte-based review, the pooled analysis observed no significant difference between the TLI group and the control group in blastocyst rate [relative risk (RR) 1.08, 95% CI 0.94-1.25, I2 = 0%, two studies, including 1154 embryos]. The quality of evidence was moderate for all outcomes in the oocyte-based review. For the woman-based review, only one study reported live birth rate (RR 1.23, 95% CI 1.06-1.44, I2 N/A, one study, including 842 women); the pooled result showed no significant difference in ongoing pregnancy rate (RR 1.04, 95% CI 0.80-1.36, I2 = 59%, four studies, including 1403 women) between the two groups. The quality of the evidence was low or very low for all outcomes in the woman-based review. Currently there is insufficient evidence to support that TLI is superior to conventional methods for human embryo incubation and selection. Given the limitations and flaws of the included studies, more well-designed RCTs are still needed to comprehensively evaluate the effectiveness of clinical TLI use.
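The pooled relative risks quoted in this abstract come from standard inverse-variance meta-analysis. A minimal fixed-effect sketch is below; the event counts in the usage line are made up for illustration, not the trial counts from this review.

```python
import math

def pooled_rr(studies, z=1.96):
    """Fixed-effect (inverse-variance) pooling of relative risks.
    Each study is a tuple (events_treat, n_treat, events_ctrl, n_ctrl)."""
    num = den = 0.0
    for a, n1, c, n2 in studies:
        log_rr = math.log((a / n1) / (c / n2))
        var = 1/a - 1/n1 + 1/c - 1/n2      # variance of log(RR)
        w = 1.0 / var                      # inverse-variance weight
        num += w * log_rr
        den += w
    est, se = num / den, math.sqrt(1.0 / den)
    return math.exp(est), math.exp(est - z * se), math.exp(est + z * se)

rr, lo, hi = pooled_rr([(50, 100, 40, 100), (30, 150, 25, 150)])  # toy data
```

Random-effects pooling (as implied by the nonzero I2 values above) would additionally estimate a between-study variance and widen the weights accordingly.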

  5. Chlorine resistant desalination membranes based on directly sulfonated poly(arylene ether sulfone) copolymers

    DOEpatents

    McGrath, James E. [Blacksburg, VA]; Park, Ho Bum [Austin, TX]; Freeman, Benny D. [Austin, TX]

    2011-10-04

    The present invention provides a membrane, kit, and method of making a hydrophilic-hydrophobic random copolymer membrane. The hydrophilic-hydrophobic random copolymer membrane includes a hydrophilic-hydrophobic random copolymer. The hydrophilic-hydrophobic random copolymer includes one or more hydrophilic monomers having a sulfonated polyarylsulfone monomer and a second monomer and one or more hydrophobic monomers having a non-sulfonated third monomer and a fourth monomer. The sulfonated polyarylsulfone monomer introduces a sulfonate into the hydrophilic-hydrophobic random copolymer prior to polymerization.

  7. Dynamic Loads Generation for Multi-Point Vibration Excitation Problems

    NASA Technical Reports Server (NTRS)

    Shen, Lawrence

    2011-01-01

    A random-force method has been developed to predict dynamic loads produced by rocket-engine random vibrations for new rocket-engine designs. The method develops random forces at multiple excitation points based on random vibration environments scaled from accelerometer data obtained during hot-fire tests of existing rocket engines. The method applies random forces to the model and creates the expected dynamic response in a manner that simulates how an operating engine applies self-generated random vibration forces (random pressure acting on an area), producing the responses that are measured with accelerometers. This innovation includes the methodology (implementation sequence), the computer code, two methods to generate the random-force vibration spectra, and two methods to reduce some of the inherent conservatism in the dynamic loads. The methodology generates the random-force spectra at excitation nodes without requiring artificial boundary conditions in a finite element model. More accurate random dynamic loads than those predicted by current industry methods can then be generated from the random-force spectra. The scaling method used to develop the initial power spectral density (PSD) environments for deriving the random forces in the rocket-engine case is based on the Barrett Criteria developed at Marshall Space Flight Center in 1963. This approach can be applied in the aerospace, automotive, and other industries to obtain reliable dynamic loads and responses from a finite element model for any structure subject to multipoint random vibration excitation.
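The NASA method itself is not published in detail, but the general idea of synthesizing a random time history from a target PSD can be sketched as follows. The amplitude scaling and Nyquist handling here are simplified assumptions, not the innovation's actual algorithm.

```python
import numpy as np

def random_signal_from_psd(psd, fs, seed=0):
    """Synthesize a zero-mean random time history whose one-sided PSD
    (units^2/Hz, length n//2 + 1) approximates `psd`, by giving the
    target spectral amplitudes random phases and inverse-FFTing."""
    rng = np.random.default_rng(seed)
    n = 2 * (len(psd) - 1)               # time-series length
    df = fs / n                          # frequency resolution
    amp = np.sqrt(psd * df / 2.0) * n    # rFFT bin amplitudes
    phases = rng.uniform(0.0, 2.0 * np.pi, len(psd))
    spec = amp * np.exp(1j * phases)
    spec[0] = 0.0                        # force zero mean
    spec[-1] = np.abs(spec[-1])          # Nyquist bin must be real
    return np.fft.irfft(spec, n)

x = random_signal_from_psd(np.ones(129), fs=256.0)  # flat 1.0 unit^2/Hz PSD
```

The signal's variance approximately equals the integral of the PSD over frequency, which is the usual consistency check for this kind of synthesis.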

  8. An observational study showed that explaining randomization using gambling-related metaphors and computer-agency descriptions impeded randomized clinical trial recruitment.

    PubMed

    Jepson, Marcus; Elliott, Daisy; Conefrey, Carmel; Wade, Julia; Rooshenas, Leila; Wilson, Caroline; Beard, David; Blazeby, Jane M; Birtle, Alison; Halliday, Alison; Stein, Rob; Donovan, Jenny L

    2018-07-01

    To explore how the concept of randomization is described by clinicians and understood by patients in randomized controlled trials (RCTs), and how it contributes to patient understanding and recruitment. Qualitative analysis of 73 audio recordings of recruitment consultations from five multicenter, UK-based RCTs with identified or anticipated recruitment difficulties. One in 10 appointments did not include any mention of randomization. Most included a description of the method or process of allocation. Descriptions often made reference to gambling-related metaphors or similes, or referred to allocation by a computer. Where reference was made to a computer, some patients assumed that they would receive the treatment that was "best for them". Descriptions of the rationale for randomization were rarely present and often came about only as a consequence of patients questioning the reason for a random allocation. The methods and processes of randomization were usually described by recruiters, but often without clarity, which could lead to patient misunderstanding. The rationale for randomization was rarely mentioned. Recruiters should avoid problematic gambling metaphors and illusions of agency in their explanations and instead focus on clearer descriptions of the rationale and method of randomization to ensure patients are better informed about randomization and RCT participation. Copyright © 2018 University of Bristol. Published by Elsevier Inc. All rights reserved.

  9. Self-correcting random number generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S.; Pooser, Raphael C.

    2016-09-06

    A system and method for generating random numbers. The system may include a random number generator (RNG), such as a quantum random number generator (QRNG), configured to self-correct or adapt in order to substantially achieve randomness in the output of the RNG. By adapting, the RNG may generate a random number that may be considered random regardless of whether the random number itself is tested as such. As an example, the RNG may include components to monitor one or more characteristics of the RNG during operation, and may use the monitored characteristics as a basis for adapting, or self-correcting, to provide a random number according to one or more performance criteria.

  10. Methods and analysis of realizing randomized grouping.

    PubMed

    Hu, Liang-Ping; Bao, Xiao-Lei; Wang, Qi

    2011-07-01

    Randomization is one of the four basic principles of research design. The meaning of randomization includes two aspects: one is to randomly select samples from the population, which is known as random sampling; the other is to randomly group all the samples, which is called randomized grouping. Randomized grouping can be subdivided into three categories: complete, stratified, and dynamic randomized grouping. This article mainly introduces the steps of complete randomization, the definition of dynamic randomization, and the realization of random sampling and grouping with SAS software.
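The article demonstrates these steps with SAS; as an assumed stand-in, completely randomized grouping can be sketched in Python by shuffling all subject IDs and dealing them into k groups.

```python
import random

def complete_randomization(subject_ids, k=2, seed=7):
    """Completely randomized grouping: shuffle all subjects, then deal
    them into k groups in round-robin order."""
    rng = random.Random(seed)
    ids = list(subject_ids)
    rng.shuffle(ids)
    return {g: ids[g::k] for g in range(k)}

groups = complete_randomization(range(10))  # two groups of five subjects
```

Stratified randomization would apply the same shuffle-and-deal step separately within each stratum; dynamic (adaptive) methods instead assign each subject as they arrive.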

  11. Hybrid spread spectrum radio system

    DOEpatents

    Smith, Stephen F.; Dress, William B.

    2010-02-02

    Systems and methods are described for hybrid spread spectrum radio systems. A method includes modulating a signal by utilizing a subset of bits from a pseudo-random code generator to control an amplification circuit that provides a gain to the signal. Another method includes: modulating a signal by utilizing a subset of bits from a pseudo-random code generator to control a fast hopping frequency synthesizer; and fast frequency hopping the signal with the fast hopping frequency synthesizer, wherein multiple frequency hops occur within a single data-bit time.

  12. Evaluation of active and passive recruitment methods used in randomized controlled trials targeting pediatric obesity

    PubMed Central

    RAYNOR, HOLLIE A.; OSTERHOLT, KATHRIN M.; HART, CHANTELLE N.; JELALIAN, ELISSA; VIVIER, PATRICK; WING, RENA R.

    2016-01-01

    Objective Evaluate enrollment numbers, randomization rates, costs, and cost-effectiveness of active versus passive recruitment methods for parent-child dyads into two pediatric obesity intervention trials. Methods Recruitment methods were categorized into active (pediatrician referral and targeted mailings, with participants identified by researcher/health care provider) versus passive methods (newspaper, bus, internet, television, and earning statements; fairs/community centers/schools; and word of mouth; with participants self-identified). Numbers of enrolled and randomized families and costs per recruitment method were monitored throughout the 22-month recruitment period. Costs (in USD) per recruitment method included staff time, mileage, and targeted costs of each method. Results A total of 940 families were referred or made contact, with 164 families randomized (child: 7.2±1.6 years, 2.27±0.61 standardized body mass index [zBMI], 86.6% obese, 61.7% female, 83.5% white; parent: 38.0±5.8 years, 32.9±8.4 BMI, 55.2% obese, 92.7% female, 89.6% white). Pediatrician referral, followed by targeted mailings, produced the largest number of enrolled and randomized families (the two methods combined producing 87.2% of randomized families). Passive recruitment methods yielded better retention from enrollment to randomization (p < 0.05) but produced few families (21 in total). Approximately $91,000 was spent on recruitment, with cost per randomized family at $554.77. Pediatrician referral was the most cost-effective method, at $145.95/randomized family, but yielded only 91 randomized families over 22 months of continuous recruitment. Conclusion Pediatrician referral and targeted mailings, which are active recruitment methods, were the most successful strategies. However, recruitment demanded significant resources. Successful recruitment for pediatric trials should use several strategies. Clinical Trials Registration: NCT00259324, NCT00200265 PMID:19922036

  14. The Effects of Including Observed Means or Latent Means as Covariates in Multilevel Models for Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Aydin, Burak; Leite, Walter L.; Algina, James

    2016-01-01

    We investigated methods of including covariates in two-level models for cluster randomized trials to increase power to detect the treatment effect. We compared multilevel models that included either an observed cluster mean or a latent cluster mean as a covariate, as well as the effect of including Level 1 deviation scores in the model. A Monte…

  15. Response Rates in Random-Digit-Dialed Telephone Surveys: Estimation vs. Measurement.

    ERIC Educational Resources Information Center

    Franz, Jennifer D.

    The efficacy of the random digit dialing method in telephone surveys was examined. Random digit dialing (RDD) generates a pure random sample and provides the advantage of including unlisted phone numbers, as well as numbers which are too new to be listed. Its disadvantage is that it generates a major proportion of nonworking and business…
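A toy sketch of how an RDD frame reaches unlisted and newly assigned numbers: fix a prefix and draw the remaining digits uniformly. The area code and number format below are placeholders, not from the report.

```python
import random

def rdd_sample(n, area_code="555", seed=3):
    """Random-digit-dialing frame: keep a fixed area code and draw the
    remaining seven digits uniformly, so every assignable number -
    listed or not - has the same chance of selection."""
    rng = random.Random(seed)
    return [f"{area_code}-{rng.randrange(10**7):07d}" for _ in range(n)]

numbers = rdd_sample(5)
```

The nonworking/business-number problem the abstract mentions follows directly: most uniformly drawn seven-digit suffixes do not map to working residential lines, so many draws are wasted.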

  16. A Method of Reducing Random Drift in the Combined Signal of an Array of Inertial Sensors

    DTIC Science & Technology

    2015-09-30

    …stability of the collective output (Bayard et al., US Patent 6,882,964). The prior-art methods rely upon the use of Kalman filtering and averaging … including scale-factor errors, quantization effects, temperature effects, random drift, and additive noise. A comprehensive account of all of these …

  17. Derek Vigil-Fowler | NREL

    Science.gov Websites

    …simulation methods for materials physics and chemistry, with particular expertise in post-DFT, high-accuracy methods such as the GW approximation for electronic structure and the random phase approximation (RPA) for total energies … the state of the art in computational methods, including efficient methods for including the effects of substrates …

  18. [Theory, method and application of method R on estimation of (co)variance components].

    PubMed

    Liu, Wen-Zhong

    2004-07-01

    The theory, method, and application of Method R for estimation of (co)variance components are reviewed so that the method can be used appropriately. Estimation requires R values, which are regressions of predicted random effects calculated from the complete dataset on predicted random effects calculated from random subsets of the same data. By using a multivariate iteration algorithm based on a transformation matrix, combined with the preconditioned conjugate gradient method to solve the mixed model equations, the computational efficiency of Method R is much improved. Method R is computationally inexpensive, and the sampling errors and approximate credible intervals of the estimates can be obtained. Disadvantages of Method R include a larger sampling variance than other methods for the same data and biased estimates in small datasets. As an alternative method, Method R can be used on larger datasets. Further study of its theoretical properties and a broader application range are still needed.

  19. Methods of making wind turbine rotor blades

    DOEpatents

    Livingston, Jamie T.; Burke, Arthur H. E.; Bakhuis, Jan Willem; Van Breugel, Sjef; Billen, Andrew

    2008-04-01

    A method of manufacturing a root portion of a wind turbine blade includes, in an exemplary embodiment, providing an outer layer of reinforcing fibers including at least two woven mats of reinforcing fibers, providing an inner layer of reinforcing fibers including at least two woven mats of reinforcing fibers, and positioning at least two bands of reinforcing fibers between the inner and outer layers, with each band of reinforcing fibers including at least two woven mats of reinforcing fibers. The method further includes positioning a mat of randomly arranged reinforcing fibers between each pair of adjacent bands of reinforcing fibers, introducing a polymeric resin into the root portion of the wind turbine blade, infusing the resin through the outer layer, the inner layer, each band of reinforcing fibers, and each mat of random reinforcing fibers, and curing the resin to form the root portion of the wind turbine blade.

  20. Genomic-Enabled Prediction Kernel Models with Random Intercepts for Multi-environment Trials.

    PubMed

    Cuevas, Jaime; Granato, Italo; Fritsche-Neto, Roberto; Montesinos-Lopez, Osval A; Burgueño, Juan; Bandeira E Sousa, Massaine; Crossa, José

    2018-03-28

    In this study, we compared the prediction accuracy of the main genotypic effect model (MM) without G×E interactions, the multi-environment single variance G×E deviation model (MDs), and the multi-environment environment-specific variance G×E deviation model (MDe), where the random genetic effects of the lines are modeled with the markers (or pedigree). With the objective of further modeling the genetic residual of the lines, we incorporated the random intercepts of the lines ([Formula: see text]) and generated another three models. Each of these 6 models was fitted with a linear kernel method (Genomic Best Linear Unbiased Predictor, GB) and a Gaussian kernel (GK) method. We compared these 12 model-method combinations with another two multi-environment G×E interaction models with unstructured variance-covariances (MUC) using GB and GK kernels (4 model-method combinations). Thus, we compared the genomic-enabled prediction accuracy of a total of 16 model-method combinations on two maize data sets with positive phenotypic correlations among environments, and on two wheat data sets with complex G×E that includes some negative and close-to-zero phenotypic correlations among environments. The two models MDs and MDe with the random intercepts of the lines and the GK method were computationally efficient and gave high prediction accuracy in the two maize data sets. Regarding the more complex G×E wheat data sets, the model-method combinations with G×E (MDs and MDe, including the random intercepts of the lines, with the GK method) gave important savings in computing time compared with the G×E interaction multi-environment models with unstructured variance-covariances, but with lower genomic prediction accuracy. Copyright © 2018 Cuevas et al.
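One common parameterization of the Gaussian kernel (GK) in this literature scales squared Euclidean distances between marker profiles by their median. The sketch below uses that convention; the bandwidth h and the median scaling are assumptions, not necessarily the authors' exact choice.

```python
import numpy as np

def gaussian_kernel(X, h=1.0):
    """Gaussian kernel K_ij = exp(-h * d_ij^2 / median(d^2)) over the rows
    of a lines-by-markers matrix X, as commonly used for GK genomic
    prediction."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise d^2
    med = np.median(sq[np.triu_indices_from(sq, k=1)])    # median off-diag d^2
    return np.exp(-h * sq / med)

K = gaussian_kernel(np.array([[0., 0.], [1., 0.], [0., 1.], [2., 2.]]))
```

The linear (GB) counterpart would instead use a genomic relationship matrix proportional to X @ X.T; the GK matrix plays the same role as the covariance of the random line effects.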

  2. The Efficacy of Parent-Child Interaction Therapy with Chinese Families: Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Leung, Cynthia; Tsang, Sandra; Sin, Tammy C. S.; Choi, Siu-yan

    2015-01-01

    Objective: This study aimed to examine the efficacy of Parent-Child Interaction Therapy (PCIT) in Hong Kong Chinese families, using a randomized controlled trial design. Methods: The participants included 111 Hong Kong Chinese parents with children aged 2-7 years old, who were randomized into the intervention group (n = 54) and control group (n…

  3. Random learning units using WIRIS quizzes in Moodle

    NASA Astrophysics Data System (ADS)

    Mora, Ángel; Mérida, Enrique; Eixarch, Ramon

    2011-09-01

    Moodle is a widely used learning management system for developing learning units, including those in mathematically based subjects. A wide variety of material can be developed in Moodle, which contains facilities for forums, questionnaires, lessons, tasks, wikis, glossaries, and chats. The Moodle platform therefore provides a meeting point for those working on a mathematics course. Mathematics requires special materials and activities: the material must include mathematical objects, and the activities included in the virtual course must be able to perform mathematical computations. WIRIS is powerful software for educational environments, with libraries for calculus, algebra, geometry, and much more. This article illustrates examples of using WIRIS in numerical methods and examples of using a new tool, WIRIS quizzes. By enhancing Moodle with WIRIS, random learning questions can be added to modules. Moodle has a simpler version of this capability, but WIRIS extends the way in which random material is presented to students. Random objects can appear in a question, in a variable of a question, in a plot, or in the definition of a mathematical object. This article illustrates material prepared for numerical methods using a WIRIS library integrated into WIRIS quizzes. As a result, WIRIS in Moodle can be considered a global solution for mathematics education.

  4. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    PubMed

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics, from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling: simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion in the research study; they include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
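The probability methods listed in this overview are easy to illustrate with a toy population; the even/odd strata below are an arbitrary stand-in for real prognostic strata.

```python
import random

population = list(range(100))
rng = random.Random(0)

# Simple random sampling: every element has an equal, independent chance.
simple = rng.sample(population, 10)

# Systematic sampling: random start, then every k-th element.
k = len(population) // 10
start = rng.randrange(k)
systematic = population[start::k]

# Stratified sampling: draw separately within predefined strata
# (here, arbitrarily: even vs. odd identifiers).
strata = {0: [x for x in population if x % 2 == 0],
          1: [x for x in population if x % 2 == 1]}
stratified = [x for s in strata.values() for x in rng.sample(s, 5)]
```

Cluster and multi-stage sampling follow the same pattern one level up: first sample whole clusters (e.g., hospitals), then optionally sample elements within the selected clusters.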

  5. Weighted re-randomization tests for minimization with unbalanced allocation.

    PubMed

    Han, Baoguang; Yu, Menggang; McEntegart, Damian

    2013-01-01

    The re-randomization test has been considered a robust alternative to traditional population-model-based methods for analyzing randomized clinical trials. This is especially so when trials are randomized according to minimization, a popular covariate-adaptive randomization method for ensuring balance among prognostic factors. Among various re-randomization tests, the fixed-entry-order re-randomization test is advocated as an effective strategy when a temporal trend is suspected. Yet when minimization is applied to trials with unequal allocation, the fixed-entry-order re-randomization test is biased and thus compromised in power. We find that the bias is due to non-uniform re-allocation probabilities incurred by the re-randomization in this case. We therefore propose a weighted fixed-entry-order re-randomization test to overcome the bias. The performance of the new test was investigated in simulation studies that mimic the settings of a real clinical trial. The weighted re-randomization test was found to work well in the scenarios investigated, including the presence of a strong temporal trend. Copyright © 2013 John Wiley & Sons, Ltd.
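For intuition, the simplest member of this family - an equal-allocation, unweighted re-randomization (permutation) test of a difference in means - can be sketched as below. Note this is not the weighted fixed-entry-order test the authors propose; under minimization, each re-allocation would instead be regenerated by the same minimization algorithm and weighted appropriately.

```python
import random

def rerandomization_test(outcomes, assignment, n_rerand=2000, seed=11):
    """Two-arm re-randomization test: re-allocate treatment labels many
    times and compare the observed mean difference against the
    re-randomization distribution; returns a two-sided p-value."""
    rng = random.Random(seed)

    def mean_diff(labels):
        t = [y for y, a in zip(outcomes, labels) if a == 1]
        c = [y for y, a in zip(outcomes, labels) if a == 0]
        return sum(t) / len(t) - sum(c) / len(c)

    observed = mean_diff(assignment)
    hits = 0
    for _ in range(n_rerand):
        labels = list(assignment)
        rng.shuffle(labels)          # uniform re-allocation of the labels
        if abs(mean_diff(labels)) >= abs(observed):
            hits += 1
    return hits / n_rerand
```

Because shuffling preserves the arm sizes, this version implicitly assumes uniform re-allocation probabilities, which is exactly the assumption that fails under minimization with unbalanced allocation.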

  6. True Randomness from Big Data.

    PubMed

    Papakonstantinou, Periklis A; Woodruff, David P; Yang, Guang

    2016-09-26

    Generating random bits is a difficult task that is important for physical systems simulation, cryptography, and many other applications relying on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomy and genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as a sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle sources of this size, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical-world empirical extractors as measured by standardized tests.
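The authors' extractor is built for big, structured sources; as a far simpler illustration of the general idea of turning biased recorded events into almost-uniform bits, the classical von Neumann extractor (not the construction in this paper) can be sketched as follows:

```python
import random

def von_neumann_extract(bits):
    """Map bit pairs 01 -> 0 and 10 -> 1; discard 00 and 11.

    The output is unbiased whenever the input bits are independent
    with a fixed (possibly unknown) bias.
    """
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out

# Biased but independent input bits (70% ones), standing in for raw recorded data.
random.seed(1)
raw = [1 if random.random() < 0.7 else 0 for _ in range(10000)]
extracted = von_neumann_extract(raw)
ones = sum(extracted) / len(extracted)
print(round(ones, 2))  # close to 0.5
```

Real big sources violate the independence assumption, which is exactly why stronger extractors of the kind studied in this paper are needed.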

  9. Wave propagation in a random medium

    NASA Technical Reports Server (NTRS)

    Lee, R. W.; Harp, J. C.

    1969-01-01

    A simple technique is used to derive statistical characterizations of the perturbations imposed upon a wave (plane, spherical or beamed) propagating through a random medium. The method is essentially physical rather than mathematical, and is probably equivalent to the Rytov method. The limitations of the method are discussed in some detail; in general they are restrictive only for optical paths longer than a few hundred meters, and for paths at the lower microwave frequencies. Situations treated include arbitrary path geometries, finite transmitting and receiving apertures, and anisotropic media. Results include, in addition to the usual statistical quantities, time-lagged functions, mixed functions involving amplitude and phase fluctuations, angle-of-arrival covariances, frequency covariances, and other higher-order quantities.

  10. Multiaxis Rainflow Fatigue Methods for Nonstationary Vibration

    NASA Technical Reports Server (NTRS)

    Irvine, T.

    2016-01-01

    Mechanical structures and components may be subjected to cyclical loading conditions, including sine and random vibration. Such systems must be designed and tested accordingly. Rainflow cycle counting is the standard method for reducing a stress time history to a table of amplitude-cycle pairings prior to the Palmgren-Miner cumulative damage calculation. The damage calculation is straightforward for sinusoidal stress but very complicated for random stress, particularly for nonstationary vibration. This paper evaluates candidate methods and makes a recommendation for further study of a hybrid technique.
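The Palmgren-Miner calculation mentioned above is straightforward once a rainflow cycle table is available. A minimal sketch, with an assumed Basquin-type S-N curve and an invented cycle table (the rainflow counting step itself is omitted):

```python
# Palmgren-Miner cumulative damage: D = sum(n_i / N_i), failure predicted at D >= 1.
# Cycles-to-failure from a Basquin-type S-N curve N(S) = A * S**(-b);
# the constants and the cycle table below are illustrative only.
A, b = 1.0e12, 3.0

def cycles_to_failure(stress):
    return A * stress ** (-b)

# (stress amplitude [MPa], counted cycles) pairs, e.g. from rainflow counting
rainflow_table = [(50.0, 200000), (80.0, 30000), (120.0, 2000)]

damage = sum(n / cycles_to_failure(s) for s, n in rainflow_table)
print(round(damage, 4))  # → 0.0438
```

For nonstationary random vibration, the hard part evaluated in the paper is producing a representative cycle table in the first place, not this final summation.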

  11. Analysis of drift correction in different simulated weighing schemes

    NASA Astrophysics Data System (ADS)

    Beatrici, A.; Rebelo, A.; Quintão, D.; Cacais, F. L.; Loayza, V. M.

    2015-10-01

    In the calibration of high-accuracy mass standards, weighing schemes are used to reduce or eliminate zero-drift effects in mass comparators. There are different sources for the drift and different methods for its treatment. Using numerical methods, drift functions were simulated and a random term was included in each function. A comparison between the results obtained from ABABAB and ABBA weighing series was carried out. The results show better efficacy of the ABABAB method for drift with smooth variation and small randomness.
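The drift cancellation exploited by such weighing series can be illustrated numerically. A minimal sketch with invented comparator values, showing that a purely linear zero drift cancels exactly in the ABBA combination (A1 - B1 - B2 + A2)/2:

```python
import numpy as np

rng = np.random.default_rng(2)
true_diff = 0.050   # hypothetical mass difference A - B (mg)
drift_rate = 0.010  # zero drift per reading interval (mg); values are invented

def reading(mass, t, noise=0.0005):
    # comparator indication = mass + linear zero drift + random noise
    return mass + drift_rate * t + rng.normal(0.0, noise)

# ABBA series at times t = 0, 1, 2, 3: the drift terms (0, d, 2d, 3d)
# cancel exactly in (A1 - B1 - B2 + A2) / 2 because they are symmetric.
a1 = reading(true_diff, 0)
b1 = reading(0.0, 1)
b2 = reading(0.0, 2)
a2 = reading(true_diff, 3)
abba = (a1 - b1 - b2 + a2) / 2
print(abs(abba - true_diff) < 0.002)  # drift removed up to the random noise
```

For nonlinear or strongly random drift the cancellation is only approximate, which is what motivates comparing schemes by simulation as the paper does.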

  12. Complementary nonparametric analysis of covariance for logistic regression in a randomized clinical trial setting.

    PubMed

    Tangen, C M; Koch, G G

    1999-03-01

    In the randomized clinical trial setting, controlling for covariates is expected to produce variance reduction for the treatment parameter estimate and to adjust for random imbalances of covariates between the treatment groups. However, for the logistic regression model, variance reduction is not obviously obtained. This can lead to concerns about the assumptions of the logistic model. We introduce a complementary nonparametric method for covariate adjustment. It provides results that are usually compatible with expectations for analysis of covariance. The only assumptions required are based on randomization and sampling arguments. The resulting treatment parameter is an (unconditional) population-average log-odds ratio that has been adjusted for random imbalance of covariates. Data from a randomized clinical trial are used to compare results from the traditional maximum likelihood logistic method with those from the nonparametric logistic method. We examine treatment parameter estimates, corresponding standard errors, and significance levels in models with and without covariate adjustment. In addition, we discuss differences between unconditional population-average treatment parameters and conditional subpopulation-average treatment parameters. Additional features of the nonparametric method, including stratified (multicenter) and multivariate (multivisit) analyses, are illustrated. Extensions of this methodology to the proportional odds model are also made.

  13. 10-year trend in quantity and quality of pediatric randomized controlled trials published in mainland China: 2002–2011

    PubMed Central

    2013-01-01

    Background Quality assessment of pediatric randomized controlled trials (RCTs) in China is limited. The aim of this study was to evaluate the quantitative trends and quality indicators of RCTs published in mainland China over a recent 10-year period. Methods We individually searched all 17 available pediatric journals published in China from January 1, 2002 to December 30, 2011 to identify RCTs of drug treatment in participants under the age of 18 years. Quality was evaluated according to the Cochrane quality assessment protocol. Results Of 1287 journal issues containing 44398 articles, a total of 2.4% (1077/44398) of articles were included in the analysis. The proportion of RCTs increased from 0.28% in 2002 to 0.32% in 2011. Individual sample sizes ranged from 10 to 905 participants (median 81 participants); 2.3% of the RCTs were multicenter trials; 63.9% evaluated Western medicine and 32.5% evaluated traditional Chinese medicine; 15% used an adequate method of random sequence generation; and 10.4% used a quasi-random method for randomization. Only 1% of the RCTs reported adequate allocation concealment and 0.6% reported the method of blinding. The follow-up period ranged from 7 days to 96 months, with a median of 7.5 months. Incomplete outcome data were reported in 8.3% of trials, of which 4.5% (4/89) used intention-to-treat analysis. Only 0.4% of the included trials used adequate random sequence allocation, concealment, and blinding. Articles published from 2007 to 2011 showed an improvement in the randomization method compared with articles published from 2002 to 2006 (from 2.7% to 23.6%, p < 0.001). Conclusions In mainland China, the quantity of RCTs did not increase in the pediatric population, and the general quality was relatively poor. Quality improvements were suboptimal in the later 5 years. PMID:23914882

  14. Interpreting findings from Mendelian randomization using the MR-Egger method.

    PubMed

    Burgess, Stephen; Thompson, Simon G

    2017-05-01

    Mendelian randomization-Egger (MR-Egger) is an analysis method for Mendelian randomization using summarized genetic data. MR-Egger consists of three parts: (1) a test for directional pleiotropy, (2) a test for a causal effect, and (3) an estimate of the causal effect. While conventional analysis methods for Mendelian randomization assume that all genetic variants satisfy the instrumental variable assumptions, the MR-Egger method is able to assess whether genetic variants have pleiotropic effects on the outcome that differ on average from zero (directional pleiotropy), as well as to provide a consistent estimate of the causal effect, under a weaker assumption, the InSIDE (INstrument Strength Independent of Direct Effect) assumption. In this paper, we provide a critical assessment of the MR-Egger method with regard to its implementation and interpretation. While the MR-Egger method is a worthwhile sensitivity analysis for detecting violations of the instrumental variable assumptions, there are several reasons why causal estimates from the MR-Egger method may be biased and have inflated Type 1 error rates in practice, including violations of the InSIDE assumption and the influence of outlying variants. The issues raised in this paper have potentially serious consequences for causal inferences from the MR-Egger approach. We give examples of scenarios in which the estimates from conventional Mendelian randomization methods and MR-Egger differ, and discuss how to interpret findings in such cases.
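The MR-Egger estimate itself is a regression of variant-outcome associations on variant-exposure associations with an unconstrained intercept. A hedged sketch on simulated summary statistics (unweighted for brevity; a real analysis would use inverse-variance weights and orient variants to a common exposure-increasing allele):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated summary statistics for 30 genetic variants (invented data):
# beta_x = variant-exposure associations, beta_y = variant-outcome associations.
n_snps = 30
beta_x = rng.uniform(0.05, 0.3, n_snps)
true_effect, pleiotropy = 0.5, 0.02  # causal slope and directional pleiotropy
beta_y = pleiotropy + true_effect * beta_x + rng.normal(0, 0.005, n_snps)

# MR-Egger: regress beta_y on beta_x WITH an intercept. The slope estimates
# the causal effect; a nonzero intercept flags directional pleiotropy.
X = np.column_stack([np.ones(n_snps), beta_x])
intercept, slope = np.linalg.lstsq(X, beta_y, rcond=None)[0]
print(round(intercept, 3), round(slope, 3))
```

The paper's warning applies directly here: if the InSIDE assumption fails, i.e. pleiotropic effects correlate with instrument strength, the recovered slope is biased even though the regression runs without complaint.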

  15. Analysis of baseline, average, and longitudinally measured blood pressure data using linear mixed models.

    PubMed

    Hossain, Ahmed; Beyene, Joseph

    2014-01-01

    This article compares baseline, average, and longitudinal data analysis methods for identifying genetic variants in a genome-wide association study using the Genetic Analysis Workshop 18 data. We apply methods that include (a) linear mixed models with baseline measures, (b) random-intercept linear mixed models with mean measures as the outcome, and (c) random-intercept linear mixed models with longitudinal measurements. In the linear mixed models, covariates are included as fixed effects, whereas relatedness among individuals is incorporated as the variance-covariance structure of the random effect for the individuals. The overall strategy of applying linear mixed models to decorrelate the data is based on Aulchenko et al.'s GRAMMAR approach. By analyzing systolic and diastolic blood pressure, which are used separately as outcomes, we compare the 3 methods in identifying a known genetic variant that is associated with blood pressure from chromosome 3 and simulated phenotype data. We also analyze the real phenotype data to illustrate the methods. We conclude that the linear mixed model with longitudinal measurements of diastolic blood pressure is the most accurate of the methods at identifying the known single-nucleotide polymorphism, but linear mixed models with baseline measures perform best with systolic blood pressure as the outcome.

  16. Multi-Agent Methods for the Configuration of Random Nanocomputers

    NASA Technical Reports Server (NTRS)

    Lawson, John W.

    2004-01-01

    As computational devices continue to shrink, the cost of manufacturing such devices is expected to grow exponentially. One alternative to the costly, detailed design and assembly of conventional computers is to place the nano-electronic components randomly on a chip. The price for such a trivial assembly process is that the resulting chip would not be programmable by conventional means. In this work, we show that such random nanocomputers can be adaptively programmed using multi-agent methods. This is accomplished through the optimization of an associated high-dimensional error function. By representing each of the independent variables as a reinforcement learning agent, we are able to achieve convergence much faster than with other methods, including simulated annealing. Standard combinational logic circuits such as adders and multipliers are implemented in a straightforward manner. In addition, we show that the intrinsic flexibility of these adaptive methods allows the random computers to be reconfigured easily, making them reusable. Recovery from faults is also demonstrated.
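The agent-per-variable idea can be sketched with a toy stand-in for the chip's error function (a hidden target bitstring replaces measured circuit behaviour; this is a simplified probability-vector scheme, not the authors' algorithm):

```python
import random

random.seed(4)
n_vars = 32
# Hidden "correct" configuration; in the real setting the error would come
# from measuring the randomly assembled circuit, not from a known target.
target = [random.randint(0, 1) for _ in range(n_vars)]

def error(config):
    return sum(c != t for c, t in zip(config, target))

# One simple agent per independent variable: each holds a probability of
# setting its bit to 1 and is nudged toward the best configuration found.
probs = [0.5] * n_vars
lr = 0.1
best, best_err = None, n_vars + 1
for _ in range(500):
    sample = [1 if random.random() < p else 0 for p in probs]
    e = error(sample)
    if e < best_err:
        best, best_err = sample, e
    # reinforce each agent toward the best-so-far, keeping some exploration
    probs = [min(0.95, max(0.05, (1 - lr) * p + lr * b))
             for p, b in zip(probs, best)]

print(best_err)
```

Because the probabilities never fully saturate, the same loop can also re-adapt the configuration after a change in the error landscape, loosely mirroring the reconfigurability and fault recovery discussed in the abstract.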

  17. Designing Studies That Would Address the Multilayered Nature of Health Care

    PubMed Central

    Pennell, Michael; Rhoda, Dale; Hade, Erinn M.; Paskett, Electra D.

    2010-01-01

    We review design and analytic methods available for multilevel interventions in cancer research with particular attention to study design, sample size requirements, and potential to provide statistical evidence for causal inference. The most appropriate methods will depend on the stage of development of the research and whether randomization is possible. Early on, fractional factorial designs may be used to screen intervention components, particularly when randomization of individuals is possible. Quasi-experimental designs, including time-series and multiple baseline designs, can be useful once the intervention is designed because they require few sites and can provide the preliminary evidence to plan efficacy studies. In efficacy and effectiveness studies, group-randomized trials are preferred when randomization is possible and regression discontinuity designs are preferred otherwise if assignment based on a quantitative score is possible. Quasi-experimental designs may be used, especially when combined with recent developments in analytic methods to reduce bias in effect estimates. PMID:20386057

  18. SNP selection and classification of genome-wide SNP data using stratified sampling random forests.

    PubMed

    Wu, Qingyao; Ye, Yunming; Liu, Yang; Ng, Michael K

    2012-09-01

    For high-dimensional genome-wide association (GWA) case-control data on complex disease, there is usually a large portion of single-nucleotide polymorphisms (SNPs) that are irrelevant to the disease. A simple random sampling method in random forests, using the default mtry parameter to choose the feature subspace, will select too many subspaces without informative SNPs. An exhaustive search for an optimal mtry is often required in order to include useful and relevant SNPs and get rid of the vast number of non-informative SNPs. However, this is too time-consuming and not favorable for high-dimensional GWA data. The main aim of this paper is to propose a stratified sampling method for feature subspace selection to generate decision trees in a random forest for GWA high-dimensional data. Our idea is to design an equal-width discretization scheme for informativeness to divide SNPs into multiple groups. In feature subspace selection, we randomly select the same number of SNPs from each group and combine them to form a subspace to generate a decision tree. This stratified sampling procedure ensures that each subspace contains enough useful SNPs while avoiding the very high computational cost of an exhaustive search for an optimal mtry, and it maintains the randomness of the random forest. We employ two genome-wide SNP data sets (Parkinson case-control data comprising 408 803 SNPs and Alzheimer case-control data comprising 380 157 SNPs) to demonstrate that the proposed stratified sampling method is effective, and that it can generate better random forests with higher accuracy and lower error bounds than those produced by Breiman's random forest generation method. For the Parkinson data, we also show some interesting genes identified by the method, which may be associated with neurological disorders and merit further biological investigation.
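The stratified subspace selection described in the abstract can be sketched directly. A minimal illustration with simulated informativeness scores (the decision-tree growing step is omitted):

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical informativeness scores (e.g. single-SNP test statistics)
n_snps = 10000
scores = rng.gamma(shape=1.0, scale=1.0, size=n_snps)

# Equal-width discretization of informativeness into groups (here 4 groups).
n_groups, per_group = 4, 25
edges = np.linspace(scores.min(), scores.max(), n_groups + 1)
group_ids = np.digitize(scores, edges[1:-1])  # group index 0..n_groups-1

def stratified_subspace():
    # sample the same number of SNPs from every informativeness group,
    # so each tree's feature subspace contains some informative SNPs
    subspace = []
    for g in range(n_groups):
        members = np.flatnonzero(group_ids == g)
        take = min(per_group, members.size)
        if take:
            subspace.extend(rng.choice(members, size=take, replace=False))
    return np.array(subspace)

subspace = stratified_subspace()
print(subspace.size)
```

Each tree in the forest would call `stratified_subspace()` to obtain its feature set, replacing the plain uniform draw of `mtry` features used in Breiman's original procedure.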

  19. Randomized Trial of Suicide Gatekeeper Training for Social Work Students

    ERIC Educational Resources Information Center

    Jacobson, Jodi M.; Osteen, Phillip J.; Sharpe, Tanya L.; Pastoor, Jennifer B.

    2012-01-01

    Problem: Education and research on social work's role in preventing client suicide is limited. Method: Seventy advanced master of social work students were randomly assigned to either the training group (Question, Persuade, and Referral suicide gatekeeper training) or the control group. Outcomes measured over time included suicide knowledge,…

  20. Recruitment methods and costs for a randomized, placebo-controlled trial of chiropractic care for lumbar spinal stenosis: a single-site pilot study.

    PubMed

    Cambron, Jerrilyn A; Dexheimer, Jennifer M; Chang, Mabel; Cramer, Gregory D

    2010-01-01

    The purpose of this article is to describe the methods for recruitment in a clinical trial on chiropractic care for lumbar spinal stenosis. This randomized, placebo-controlled pilot study investigated the efficacy of different amounts of total treatment dosage over 6 weeks in 60 volunteer subjects with lumbar spinal stenosis. Subjects were recruited for this study through several media venues, focusing on successful and cost-effective strategies. Included in our efforts were radio advertising, newspaper advertising, direct mail, and various other low-cost initiatives. Of the 1211 telephone screens, 60 responders (5.0%) were randomized into the study. The most successful recruitment method was radio advertising, generating more than 64% of the calls (776 subjects). Newspaper and magazine advertising generated approximately 9% of all calls (108 subjects), and direct mail generated less than 7% (79 subjects). The total direct cost for recruitment was $40 740 or $679 per randomized patient. The costs per randomization were highest for direct mail ($995 per randomization) and lowest for newspaper/magazine advertising ($558 per randomization). Success of recruitment methods may vary based on target population and location. Planning of recruitment efforts is essential to the success of any clinical trial. Copyright 2010 National University of Health Sciences. Published by Mosby, Inc. All rights reserved.

  1. Comparison of Efficacy of Eye Movement, Desensitization and Reprocessing and Cognitive Behavioral Therapy Therapeutic Methods for Reducing Anxiety and Depression of Iranian Combatant Afflicted by Post Traumatic Stress Disorder

    NASA Astrophysics Data System (ADS)

    Narimani, M.; Sadeghieh Ahari, S.; Rajabi, S.

    This research aims to determine and compare the efficacy of two therapeutic methods, Eye Movement Desensitization and Reprocessing (EMDR) and Cognitive Behavioral Therapy (CBT), for reducing anxiety and depression in Iranian combatants afflicted with Post-Traumatic Stress Disorder (PTSD) after the imposed war. The statistical population of the current study comprises combatants afflicted with PTSD who were hospitalized in the Isar Hospital of Ardabil province or resided in Ardabil. These persons were selected through simple random sampling and were randomly assigned to three groups. The method was an extended test method, and the study design was a multi-group test-retest design. The instruments used include the Hospital Anxiety and Depression Scale. This survey showed that EMDR and CBT produced a significant reduction in anxiety and depression.

  2. Site Selection in Experiments: An Assessment of Site Recruitment and Generalizability in Two Scale-Up Studies

    ERIC Educational Resources Information Center

    Tipton, Elizabeth; Fellers, Lauren; Caverly, Sarah; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Ruiz de Castilla, Veronica

    2016-01-01

    Recently, statisticians have begun developing methods to improve the generalizability of results from large-scale experiments in education. This work has included the development of methods for improved site selection when random sampling is infeasible, including the use of stratification and targeted recruitment strategies. This article provides…

  3. Methodological reporting of randomized trials in five leading Chinese nursing journals.

    PubMed

    Shi, Chunhu; Tian, Jinhui; Ren, Dan; Wei, Hongli; Zhang, Lihuan; Wang, Quan; Yang, Kehu

    2014-01-01

    Randomized controlled trials (RCTs) are not always well reported, especially in terms of their methodological descriptions. This study aimed to investigate adherence to the methodological reporting requirements of CONSORT and to explore associated trial-level variables in the Chinese nursing care field. In June 2012, we identified RCTs published in five leading Chinese nursing journals and included trials with details of randomization methods. The quality of methodological reporting was measured against the methods section of the CONSORT checklist, and the overall CONSORT methodological items score was calculated and expressed as a percentage. We also hypothesized that some general and methodological characteristics were associated with reporting quality and conducted a regression with these data to explore the correlation. The descriptive and regression statistics were calculated via SPSS 13.0. In total, 680 RCTs were included. The overall CONSORT methodological items score was 6.34 ± 0.97 (mean ± SD). No RCT reported descriptions of and changes in "trial design," changes in "outcomes" and "implementation," or descriptions of the similarity of interventions for "blinding." Poor reporting was found in detailing the "settings of participants" (13.1%), "type of randomization sequence generation" (1.8%), calculation methods for "sample size" (0.4%), explanation of any interim analyses and stopping guidelines for "sample size" (0.3%), "allocation concealment mechanism" (0.3%), additional analyses in "statistical methods" (2.1%), and targeted subjects and methods of "blinding" (5.9%). More than 50% of trials described randomization sequence generation, the eligibility criteria of "participants," "interventions," and definitions of the "outcomes" and "statistical methods." The regression analysis found that publication year and intention-to-treat analysis were weakly associated with CONSORT score. The completeness of methodological reporting of RCTs in the Chinese nursing care field is poor, especially with regard to the reporting of trial design, changes in outcomes, sample size calculation, allocation concealment, blinding, and statistical methods.

  4. Randomized trials published in some Chinese journals: how many are randomized?

    PubMed Central

    Wu, Taixiang; Li, Youping; Bian, Zhaoxiang; Liu, Guanjian; Moher, David

    2009-01-01

    Background The approximately 1100 medical journals now active in China are publishing a rapidly increasing number of research reports, including many studies identified by their authors as randomized controlled trials. It has been noticed that these reports mostly present positive results, and their quality and authenticity have consequently been called into question. We investigated the adequacy of randomization of clinical trials published in recent years in China to determine how many of them met acceptable standards for allocating participants to treatment groups. Methods The China National Knowledge Infrastructure electronic database was searched for reports of randomized controlled trials on 20 common diseases published from January 1994 to June 2005. From this sample, a subset of trials that appeared to have used randomization methods was selected. Twenty-one investigators trained in the relevant knowledge, communication skills and quality control issues interviewed the original authors of these trials about the participant randomization methods and related quality-control features of their trials. Results From an initial sample of 37,313 articles identified in the China National Knowledge Infrastructure database, we found 3137 apparent randomized controlled trials. Of these, 1452 were studies of conventional medicine (published in 411 journals) and 1685 were studies of traditional Chinese medicine (published in 352 journals). Interviews with the authors of 2235 of these reports revealed that only 207 studies adhered to accepted methodology for randomization and could on those grounds be deemed authentic randomized controlled trials (6.8%, 95% confidence interval 5.9–7.7). There was no statistically significant difference in the rate of authenticity between randomized controlled trials of traditional interventions and those of conventional interventions. Randomized controlled trials conducted at hospitals affiliated with medical universities were more likely to be authentic than trials conducted at level 3 and level 2 hospitals (relative risk 1.58, 95% confidence interval 1.18–2.13, and relative risk 14.42, 95% confidence interval 9.40–22.10, respectively). The likelihood of authenticity was higher in level 3 hospitals than in level 2 hospitals (relative risk 9.32, 95% confidence interval 5.83–14.89). All pre-market drug clinical trials were authentic by our criteria. Of the trials conducted at university-affiliated hospitals, 56.3% were authentic (95% confidence interval 32.0–81.0). Conclusion Most reports of randomized controlled trials published in some Chinese journals lacked an adequate description of randomization. Similarly, most so-called 'randomized controlled trials' were not real randomized controlled trials, owing to a lack of adequate understanding of rigorous clinical trial design on the part of the authors. All pre-market drug clinical trials included in this research were authentic. Randomized controlled trials conducted by authors in high-level hospitals, especially hospitals affiliated with medical universities, had a higher rate of authenticity. That so many non-randomized controlled trials were published as randomized controlled trials reflects the fact that peer review needs to be improved; a good-practice guide for peer review, including how to identify the authenticity of a study, urgently needs to be developed. PMID:19573242

  5. A complexity theory model in science education problem solving: random walks for working memory and mental capacity.

    PubMed

    Stamovlasis, Dimitrios; Tsaparlis, Georgios

    2003-07-01

    The present study examines the role of limited human channel capacity from a science education perspective. A model of science problem solving has been previously validated by applying concepts and tools of complexity theory (the working-memory random-walk method). The method correlated the subjects' rank-order achievement scores in organic-synthesis chemistry problems with the subjects' working memory capacity. In this work, we apply the same nonlinear approach to a different data set, taken from chemical-equilibrium problem solving. In contrast to the organic-synthesis problems, these problems are algorithmic, require numerical calculations, and have a complex logical structure. As a result, these problems cause deviations from the model and affect the pattern observed with the nonlinear method. In addition to Baddeley's working memory capacity, Pascual-Leone's mental (M-) capacity is examined by the same random-walk method. As the complexity of the problem increases, the fractal dimension of the working memory random walk demonstrates a sudden drop, while the fractal dimension of the M-capacity random walk decreases in a linear fashion. A review of the basic features of the two capacities and their relation is included. The method and findings have consequences for problem solving not only in chemistry and science education, but also in other disciplines.

  6. The Effectiveness of Healthy Start Home Visit Program: Cluster Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Leung, Cynthia; Tsang, Sandra; Heung, Kitty

    2015-01-01

    Purpose: The study reported the effectiveness of a home visit program for disadvantaged Chinese parents with preschool children, using cluster randomized controlled trial design. Method: Participants included 191 parents and their children from 24 preschools, with 84 dyads (12 preschools) in the intervention group and 107 dyads (12 preschools) in…

  7. Electromagnetic Scattering by Fully Ordered and Quasi-Random Rigid Particulate Samples

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael I.; Dlugach, Janna M.; Mackowski, Daniel W.

    2016-01-01

    In this paper we have analyzed circumstances under which a rigid particulate sample can behave optically as a true discrete random medium consisting of particles randomly moving relative to each other during measurement. To this end, we applied the numerically exact superposition T-matrix method to model far-field scattering characteristics of fully ordered and quasi-randomly arranged rigid multiparticle groups in fixed and random orientations. We have shown that, in and of itself, averaging optical observables over movements of a rigid sample as a whole is insufficient unless it is combined with a quasi-random arrangement of the constituent particles in the sample. Otherwise, certain scattering effects typical of discrete random media (including some manifestations of coherent backscattering) may not be accurately replicated.

  8. A method for multi-codon scanning mutagenesis of proteins based on asymmetric transposons.

    PubMed

    Liu, Jia; Cropp, T Ashton

    2012-02-01

    Random mutagenesis followed by selection or screening is a commonly used strategy to improve protein function. Despite many available methods for random mutagenesis, nearly all generate mutations at the nucleotide level. An ideal mutagenesis method would allow for the generation of 'codon mutations' to change protein sequence with defined or mixed amino acids of choice. Herein we report a method that allows for mutations of one, two or three consecutive codons. Key to this method is the development of a Mu transposon variant with asymmetric terminal sequences. As a demonstration of the method, we performed multi-codon scanning on the gene encoding superfolder GFP (sfGFP). Characterization of 50 randomly chosen clones from each library showed that more than 40% of the mutants in these three libraries contained seamless, in-frame mutations with low site preference. By screening only 500 colonies from each library, we successfully identified several spectra-shift mutations, including a S205D variant that was found to bear a single excitation peak in the UV region.

  9. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages.

    PubMed

    Kim, Yoonsang; Choi, Young-Ku; Emery, Sherry

    2013-08-01

Several statistical packages are capable of estimating generalized linear mixed models, and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods' performance for the mixed-effects logistic regression model. However, these studies focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performance of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages, SAS GLIMMIX (Laplace) and SuperMix (Gaussian quadrature), perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes.

  10. A modified hybrid uncertain analysis method for dynamic response field of the LSOAAC with random and interval parameters

    NASA Astrophysics Data System (ADS)

    Zi, Bin; Zhou, Bin

    2016-07-01

For the prediction of the dynamic response field of the luffing system of an automobile crane (LSOAAC) with random and interval parameters, a hybrid uncertain model is introduced. In the hybrid uncertain model, the parameters with a known probability distribution are modeled as random variables, whereas the parameters with lower and upper bounds are modeled as interval variables rather than being given precise values. Based on the hybrid uncertain model, the hybrid uncertain dynamic response equilibrium equation, in which different random and interval parameters are simultaneously included in the input and output terms, is constructed. Then a modified hybrid uncertain analysis method (MHUAM) is proposed. In the MHUAM, the dynamic response expression of the LSOAAC is developed based on the random interval perturbation method, the first-order Taylor series expansion, and the first-order Neumann series. Moreover, the extrema of the bounds of the dynamic response are determined by the random interval moment method and a monotonic analysis technique. Compared with the hybrid Monte Carlo method (HMCM) and the interval perturbation method (IPM), numerical results show the feasibility and efficiency of the MHUAM for solving hybrid LSOAAC problems. The effects of different uncertain models and parameters on the LSOAAC response field are also investigated in depth. Numerical results indicate that the impact of the randomness in the thrust of the luffing cylinder F is larger than that of the gravity of the weight in suspension Q, and that the impact of the uncertainty in the displacement between the lower end of the lifting arm and the luffing cylinder a is larger than that of the length of the lifting arm L.
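The first-order Taylor step underlying interval methods of this kind can be sketched in isolation. The code below is a minimal illustration of interval propagation through a response function (not the paper's MHUAM; the function and numbers are made up, and the bounds are exact here only because the toy response is linear):

```python
def taylor_interval_bounds(f, grad, midpoints, radii):
    """Bounds on f from a first-order Taylor expansion about the interval
    midpoints: f(mid) +/- sum(|df/dx_i| * radius_i)."""
    f0 = f(midpoints)
    spread = sum(abs(g) * r for g, r in zip(grad(midpoints), radii))
    return f0 - spread, f0 + spread

# Toy response, linear in both interval parameters.
f = lambda x: 3.0 * x[0] - 2.0 * x[1]
grad = lambda x: (3.0, -2.0)
lo, hi = taylor_interval_bounds(f, grad, midpoints=(1.0, 2.0), radii=(0.1, 0.2))
print(lo, hi)   # f0 = -1.0, spread = 0.3 + 0.4 = 0.7, so bounds (-1.7, -0.3)
```

For nonlinear responses the same expansion gives approximate bounds, which is where the paper's monotonicity analysis and moment method come in.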

  11. Key-Generation Algorithms for Linear Piece In Hand Matrix Method

    NASA Astrophysics Data System (ADS)

    Tadaki, Kohtaro; Tsujii, Shigeo

The linear Piece In Hand (PH, for short) matrix method with random variables was proposed in our former work. It is a general prescription applicable to any type of multivariate public-key cryptosystem (MPKC) for the purpose of enhancing its security. We showed experimentally that the linear PH matrix method with random variables can enhance the security of HFE, one of the major variants of multivariate public-key cryptosystems, against the Gröbner basis attack. In 1998 Patarin, Goubin, and Courtois introduced the plus method as a general prescription which, like the linear PH matrix method with random variables, aims to enhance the security of any given MPKC. In this paper we prove the equivalence between the plus method and the primitive linear PH matrix method, which was introduced in our previous work to explain the notion of the PH matrix method in an illustrative manner, and not for practical use in enhancing the security of a given MPKC. Based on this equivalence, we show that the linear PH matrix method with random variables has a substantial advantage over the plus method with respect to security enhancement. In the linear PH matrix method with random variables, three matrices, including the PH matrix, play a central role in the secret key and public key. In this paper we clarify how to generate these matrices and present two probabilistic polynomial-time algorithms for doing so. The second algorithm, in particular, has a concise form and is obtained as a byproduct of the proof of the equivalence between the plus method and the primitive linear PH matrix method.

  12. Supercomputer optimizations for stochastic optimal control applications

    NASA Technical Reports Server (NTRS)

    Chung, Siu-Leung; Hanson, Floyd B.; Xu, Huihuang

    1991-01-01

Supercomputer optimizations for a computational method of solving stochastic, multibody, dynamic programming problems are presented. The computational method is valid for a general class of optimal control problems that are nonlinear, multibody dynamical systems, perturbed by general Markov noise in continuous time, i.e., nonsmooth Gaussian as well as jump Poisson random white noise. Optimization techniques for vector multiprocessors or vectorizing supercomputers include advanced data structures, loop restructuring, loop collapsing, blocking, and compiler directives. These advanced computing techniques and supercomputing hardware help alleviate Bellman's curse of dimensionality in dynamic programming computations by permitting the solution of large multibody problems. Possible applications include lumped flight dynamics models for uncertain environments, such as large scale and background random aerospace fluctuations.

  13. [Methodological quality and reporting quality evaluation of randomized controlled trials published in China Journal of Chinese Materia Medica].

    PubMed

    Yu, Dan-Dan; Xie, Yan-Ming; Liao, Xing; Zhi, Ying-Jie; Jiang, Jun-Jie; Chen, Wei

    2018-02-01

To evaluate the methodological quality and reporting quality of randomized controlled trials (RCTs) published in China Journal of Chinese Materia Medica, we searched CNKI and the journal's website to collect RCTs published since the establishment of the journal. The Cochrane risk of bias assessment tool was used to evaluate the methodological quality of the RCTs, and the CONSORT 2010 checklist was adopted as the reporting quality evaluation tool. Finally, 184 RCTs were included and evaluated methodologically, of which 97 were also evaluated for reporting quality. In the methodological evaluation, 62 trials (33.70%) reported the random sequence generation; 9 (4.89%) reported allocation concealment; 25 (13.59%) adopted some method of blinding; 30 (16.30%) reported the numbers of patients withdrawing, dropping out, or lost to follow-up; 2 (1.09%) reported trial registration; none reported a trial protocol; and only 8 (4.35%) reported the sample size estimation in detail. In the reporting quality appraisal, 3 of the 25 reporting items were rated high quality: abstract, participant eligibility criteria, and statistical methods. Four items were rated medium quality: purpose, intervention, random sequence method, and data collection sites and locations. Nine items were rated low quality: title, background, random sequence type, allocation concealment, blinding, recruitment of subjects, baseline data, harms, and funding. The remaining items were of extremely low quality (compliance rate below 10%). On the whole, the methodological and reporting quality of RCTs published in the journal is generally low, and further improvement in both is warranted. It is recommended that international standards and procedures for RCT design be strictly followed to conduct high-quality trials, and that the CONSORT statement be adopted in the preparation of reports and submissions to improve reporting quality. Copyright© by the Chinese Pharmaceutical Association.

  14. Absorption and scattering of light by nonspherical particles [in atmosphere]

    NASA Technical Reports Server (NTRS)

    Bohren, C. F.

    1986-01-01

Using the example of the polarization of scattered light, it is shown that the scattering matrices for identical, randomly oriented nonspherical particles and for spherical particles are unequal. The spherical assumption of Mie theory is therefore inconsistent with the random shapes and sizes of atmospheric particulates. The implications for corrections made to extinction measurements of forward-scattered light are discussed. Several analytical methods are examined as potential bases for developing more accurate models, including Rayleigh theory, Fraunhofer diffraction theory, anomalous diffraction theory, Rayleigh-Gans theory, the separation of variables technique, the Purcell-Pennypacker method, the T-matrix method, and finite difference calculations.

  15. CW-SSIM kernel based random forest for image classification

    NASA Astrophysics Data System (ADS)

    Fan, Guangzhe; Wang, Zhou; Wang, Jiheng

    2010-07-01

The complex wavelet structural similarity (CW-SSIM) index has been proposed as a powerful image similarity metric that is robust to translation, scaling, and rotation of images, but how to employ it in image classification applications has not been deeply investigated. In this paper, we incorporate CW-SSIM as a kernel function into a random forest learning algorithm. This leads to a novel image classification approach that does not require a feature extraction or dimension reduction stage at the front end. We use hand-written digit recognition as an example to demonstrate our algorithm. We compare the performance of the proposed approach with random forest learning based on other kernels, including the widely adopted Gaussian and inner product kernels. Empirical evidence shows that the proposed method is superior in its classification power. We also compare our approach with the direct random forest method without a kernel and with the popular kernel method, the support vector machine. Our test results, based on both simulated and real-world data, suggest that the proposed approach outperforms traditional methods without requiring a feature selection procedure.

  16. Tile Patterns with Logo--Part I: Laying Tile with Logo.

    ERIC Educational Resources Information Center

    Clason, Robert G.

    1990-01-01

Described is a method for drawing periodic tile patterns using LOGO. Squares, triangles, hexagons, shape filling, and random tile laying are included. These activities incorporate problem solving, programming methods, and the geometry of angles and polygons. (KR)
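The article works in Logo; as a rough stand-in, the hypothetical Python sketch below "lays" a pattern by choosing a tile at random for each cell of a grid and printing characters instead of drawing turtle graphics:

```python
import random

def lay_tiles(rows, cols, tiles="/\\", seed=0):
    """Fill a rows x cols grid, choosing each cell's tile at random."""
    rng = random.Random(seed)
    return ["".join(rng.choice(tiles) for _ in range(cols)) for _ in range(rows)]

# Print a small randomly laid tiling of diagonal tiles.
for row in lay_tiles(4, 8):
    print(row)
```

The same loop structure, with turtle moves in place of characters, is the shape of the Logo activity described.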

  17. Competitive Facility Location with Fuzzy Random Demands

    NASA Astrophysics Data System (ADS)

    Uno, Takeshi; Katagiri, Hideki; Kato, Kosuke

    2010-10-01

This paper proposes a new location problem for competitive facilities, e.g. shops, in a plane, in which the demands for the facilities involve both uncertainty and vagueness. By representing the demands as fuzzy random variables, the location problem can be formulated as a fuzzy random programming problem. To solve it, the α-level sets of the fuzzy numbers are first used to transform it into a stochastic programming problem, which is then reformulated as a deterministic programming problem by using expectations and variances. After showing that an optimal solution can be found by solving 0-1 programming problems, a solution method is proposed that improves the tabu search algorithm with strategic oscillation. The efficiency of the proposed method is shown by applying it to numerical examples of facility location problems.

  18. Inverse random source scattering for the Helmholtz equation in inhomogeneous media

    NASA Astrophysics Data System (ADS)

    Li, Ming; Chen, Chuchu; Li, Peijun

    2018-01-01

    This paper is concerned with an inverse random source scattering problem in an inhomogeneous background medium. The wave propagation is modeled by the stochastic Helmholtz equation with the source driven by additive white noise. The goal is to reconstruct the statistical properties of the random source such as the mean and variance from the boundary measurement of the radiated random wave field at multiple frequencies. Both the direct and inverse problems are considered. We show that the direct problem has a unique mild solution by a constructive proof. For the inverse problem, we derive Fredholm integral equations, which connect the boundary measurement of the radiated wave field with the unknown source function. A regularized block Kaczmarz method is developed to solve the ill-posed integral equations. Numerical experiments are included to demonstrate the effectiveness of the proposed method.
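The paper's regularized block Kaczmarz solver is specialized to its integral equations, but the underlying row-projection iteration can be sketched for a small consistent linear system (a plain, unregularized, single-row sketch of ours, not the paper's implementation):

```python
def kaczmarz(A, b, sweeps=200):
    """Kaczmarz iteration for Ax = b: cyclically project the iterate onto the
    hyperplane defined by each row."""
    x = [0.0] * len(A[0])
    for _ in range(sweeps):
        for row, bi in zip(A, b):
            dot = sum(r * xi for r, xi in zip(row, x))
            scale = (bi - dot) / sum(r * r for r in row)
            x = [xi + scale * r for xi, r in zip(x, row)]
    return x

A = [[2.0, 1.0], [1.0, 3.0]]
b = [5.0, 10.0]
print(kaczmarz(A, b))   # converges toward the exact solution [1.0, 3.0]
```

For ill-posed problems like the one above, regularization and block updates are added to stabilize exactly this iteration.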

  19. Selection of Variables in Cluster Analysis: An Empirical Comparison of Eight Procedures

    ERIC Educational Resources Information Center

    Steinley, Douglas; Brusco, Michael J.

    2008-01-01

    Eight different variable selection techniques for model-based and non-model-based clustering are evaluated across a wide range of cluster structures. It is shown that several methods have difficulties when non-informative variables (i.e., random noise) are included in the model. Furthermore, the distribution of the random noise greatly impacts the…

  20. General Retarded Contact Self-energies in and beyond the Non-equilibrium Green's Functions Method

    NASA Astrophysics Data System (ADS)

    Kubis, Tillmann; He, Yu; Andrawis, Robert; Klimeck, Gerhard

    2016-03-01

Retarded contact self-energies in the framework of nonequilibrium Green's functions make it possible to model the impact of lead structures on a device without explicitly including the leads in the actual device calculation. Most contact self-energy algorithms are limited to homogeneous or periodic, semi-infinite lead structures. In this work, the complex absorbing potential method is extended to solve retarded contact self-energies for arbitrary lead structures, including irregular and randomly disordered leads. The method is verified for regular leads against common approaches and on physically equivalent, but numerically different, irregular leads. Transmission results on randomly alloyed In0.5Ga0.5As structures show the importance of disorder in the leads. The concept of retarded contact self-energies is also expanded to model passivation of atomically resolved surfaces without explicitly enlarging the device Hamiltonian.

  1. 77 FR 40866 - Applications for New Awards; Innovative Approaches to Literacy Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-11

    ... supported by the methods that have been employed. The term includes, appropriate to the research being... observational methods that provide reliable data; (iv) making claims of causal relationships only in random...; and (vii) using research designs and methods appropriate to the research question posed...

  2. A matrix-based method of moments for fitting the multivariate random effects model for meta-analysis and meta-regression

    PubMed Central

    Jackson, Dan; White, Ian R; Riley, Richard D

    2013-01-01

    Multivariate meta-analysis is becoming more commonly used. Methods for fitting the multivariate random effects model include maximum likelihood, restricted maximum likelihood, Bayesian estimation and multivariate generalisations of the standard univariate method of moments. Here, we provide a new multivariate method of moments for estimating the between-study covariance matrix with the properties that (1) it allows for either complete or incomplete outcomes and (2) it allows for covariates through meta-regression. Further, for complete data, it is invariant to linear transformations. Our method reduces to the usual univariate method of moments, proposed by DerSimonian and Laird, in a single dimension. We illustrate our method and compare it with some of the alternatives using a simulation study and a real example. PMID:23401213
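The univariate DerSimonian and Laird method of moments that this estimator reduces to in one dimension can be sketched directly. The effect estimates and within-study variances below are made-up illustrative numbers:

```python
def dersimonian_laird(y, v):
    """DerSimonian-Laird moment estimator: effect estimates y_i with
    within-study variances v_i; returns (tau2, fixed-effect pooled mean)."""
    w = [1.0 / vi for vi in v]                       # inverse-variance weights
    sw = sum(w)
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw # weighted mean
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))  # Cochran's Q
    denom = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - (len(y) - 1)) / denom)      # truncated at zero
    return tau2, ybar

tau2, pooled = dersimonian_laird(y=[0.0, 0.2, 0.8, 1.0], v=[0.04] * 4)
print(tau2, pooled)   # heterogeneous toy data: tau2 = 14/75, pooled mean 0.5
```

The paper's matrix-based estimator generalizes this moment equation to a between-study covariance matrix, with Q replaced by a matrix quadratic form.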

  3. Using histograms to introduce randomization in the generation of ensembles of decision trees

    DOEpatents

    Kamath, Chandrika; Cantu-Paz, Erick; Littau, David

    2005-02-22

    A system for decision tree ensembles that includes a module to read the data, a module to create a histogram, a module to evaluate a potential split according to some criterion using the histogram, a module to select a split point randomly in an interval around the best split, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method includes the steps of reading the data; creating a histogram; evaluating a potential split according to some criterion using the histogram, selecting a split point randomly in an interval around the best split, splitting the data, and combining multiple decision trees in ensembles.
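The split-selection step can be sketched as follows. This is a hypothetical toy of ours (binary labels, Gini scoring, and a uniform draw in the interval are illustrative choices, not details taken from the patent):

```python
import random

def histogram_random_split(values, labels, n_bins=8, seed=0):
    """Score bin-boundary splits using a histogram, then draw the actual
    split point at random inside the interval around the best boundary."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins
    counts = [[0, 0] for _ in range(n_bins)]         # per-bin class counts
    for v, y in zip(values, labels):                 # binary labels 0/1 assumed
        b = min(int((v - lo) / width), n_bins - 1)
        counts[b][y] += 1

    def gini(c0, c1):
        n = c0 + c1
        return 0.0 if n == 0 else 1.0 - (c0 / n) ** 2 - (c1 / n) ** 2

    best, best_score = 1, float("inf")
    for cut in range(1, n_bins):                     # candidate bin boundaries
        l0 = sum(c[0] for c in counts[:cut]); l1 = sum(c[1] for c in counts[:cut])
        r0 = sum(c[0] for c in counts[cut:]); r1 = sum(c[1] for c in counts[cut:])
        score = (l0 + l1) * gini(l0, l1) + (r0 + r1) * gini(r0, r1)
        if score < best_score:
            best, best_score = cut, score
    # Randomize the split point within the interval around the best boundary.
    boundary = lo + best * width
    return random.Random(seed).uniform(boundary - width / 2, boundary + width / 2)

# Perfectly separated toy data: the randomized split lands near 0.5.
xs = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9]
ys = [0, 0, 0, 0, 1, 1, 1, 1]
split = histogram_random_split(xs, ys)
```

The random draw around the best boundary is what injects diversity into the trees of the ensemble.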

  4. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages

    PubMed Central

    Kim, Yoonsang; Emery, Sherry

    2013-01-01

Several statistical packages are capable of estimating generalized linear mixed models, and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods' performance for the mixed-effects logistic regression model. However, these studies focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performance of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages, SAS GLIMMIX (Laplace) and SuperMix (Gaussian quadrature), perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes. PMID:24288415

  5. Symplectic analysis of vertical random vibration for coupled vehicle track systems

    NASA Astrophysics Data System (ADS)

    Lu, F.; Kennedy, D.; Williams, F. W.; Lin, J. H.

    2008-10-01

A computational model for random vibration analysis of vehicle-track systems is proposed, with solutions using the pseudo excitation method (PEM) and the symplectic method. The vehicle is modelled as a mass, spring and damping system with 10 degrees of freedom (dofs), which consist of vertical and pitching motion for the vehicle body and its two bogies and vertical motion for the four wheelsets. The track is treated as an infinite Bernoulli-Euler beam connected to sleepers and hence to ballast, and is regarded as a periodic structure. Linear springs couple the vehicle and the track. Hence, the coupled vehicle-track system has only 26 dofs. A fixed excitation model is used, i.e. the vehicle does not move along the track but instead the track irregularity profile moves backwards at the vehicle velocity. This irregularity is assumed to be a stationary random process. Random vibration theory is used to obtain the response power spectral densities (PSDs), by using PEM to transform this random multiexcitation problem into a deterministic harmonic excitation one and then applying symplectic solution methodology. Numerical results for an example include verification of the proposed method by comparison with finite element method (FEM) results; comparison between the present model and the traditional rigid track model; and discussion of the influences of track damping and vehicle velocity.
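The core PEM step can be sketched for a single-degree-of-freedom system at one frequency (parameter values below are arbitrary; the paper applies the same idea to the 26-dof coupled system): replace the random excitation of PSD S_x(w) by the deterministic pseudo excitation sqrt(S_x) e^{iwt}, solve the harmonic problem, and read off the response PSD as the squared magnitude.

```python
import cmath

def response_psd_pem(m, c, k, w, s_x):
    """Response PSD of a 1-dof system via the pseudo excitation method."""
    h = 1.0 / (k - m * w**2 + 1j * c * w)   # frequency response function H(w)
    y = h * cmath.sqrt(s_x)                 # pseudo (harmonic) response amplitude
    return abs(y) ** 2                      # S_y(w) = |H(w)|^2 S_x(w)

s_y = response_psd_pem(m=1.0, c=0.1, k=4.0, w=1.5, s_x=2.0)
print(s_y)   # equals |H|^2 S_x = 2 / (1.75^2 + 0.15^2)
```

For multi-dof systems the payoff is that the random vibration problem reduces to a set of deterministic harmonic analyses, one per frequency.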

  6. A mathematical study of a random process proposed as an atmospheric turbulence model

    NASA Technical Reports Server (NTRS)

    Sidwell, K.

    1977-01-01

A random process is formed as the product of a local Gaussian process and a random amplitude process, summed with an independent mean-value process. The mathematical properties of the resulting process are developed, including the first- and second-order properties and the characteristic function of general order. An approximate method for the analysis of the response of linear dynamic systems to the process is developed. The transition properties of the process are also examined.
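The structure x(t) = a(t) g(t) + m(t) can be simulated directly. The sketch below is a toy of ours (white Gaussian g, a three-level amplitude process, and a constant mean, none of which are choices from the paper):

```python
import random

def simulate(n, seed=0):
    """Simulate x = a*g + m: Gaussian g, random amplitude a, mean process m."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        g = rng.gauss(0.0, 1.0)            # local Gaussian process (white here)
        a = rng.choice([0.5, 1.0, 2.0])    # random amplitude process
        m = 3.0                            # mean-value process (constant here)
        out.append(a * g + m)
    return out

x = simulate(10_000)
mean = sum(x) / len(x)   # sits near the mean-value level, 3.0
```

Because g has zero mean and is independent of a, the process mean is set entirely by m, which the sample average reflects.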

  7. Significance of Random Bladder Biopsies in Non-Muscle Invasive Bladder Cancer

    PubMed Central

    Kumano, Masafumi; Miyake, Hideaki; Nakano, Yuzo; Fujisawa, Masato

    2013-01-01

    Background/Aims To evaluate retrospectively the clinical outcome of random bladder biopsies in patients with non-muscle invasive bladder cancer (NMIBC) undergoing transurethral resection (TUR). Patients and Method This study included 234 consecutive patients with NMIBC who underwent random biopsies from normal-appearing urothelium of the bladder, including the anterior wall, posterior wall, right wall, left wall, dome, trigone and/or prostatic urethra, during TUR. Result Thirty-seven patients (15.8%) were diagnosed by random biopsies as having urothelial cancer. Among several factors available prior to TUR, preoperative urinary cytology appeared to be independently related to the detection of urothelial cancer in random biopsies on multivariate analysis. Urinary cytology prior to TUR gave 50.0% sensitivity, 91.7% specificity, 56.8% positive predictive value and 89.3% negative predictive value for predicting the findings of the random biopsies. Conclusion Biopsies of normal-appearing urothelium resulted in the additional detection of urothelial cancer in a definite proportion of NMIBC patients, and it remains difficult to find a reliable alternative to random biopsies. Collectively, these findings suggest that it would be beneficial to perform random biopsies as part of the routine management of NMIBC. PMID:24917759

  8. A New Method of Random Environmental Walking for Assessing Behavioral Preferences for Different Lighting Applications

    PubMed Central

    Patching, Geoffrey R.; Rahm, Johan; Jansson, Märit; Johansson, Maria

    2017-01-01

Accurate assessment of people's preferences for different outdoor lighting applications is increasingly considered important in the development of new urban environments. Here a new method of random environmental walking is proposed to complement current methods of assessing urban lighting applications, such as self-report questionnaires. The procedure involves participants repeatedly walking between different lighting applications, either by random selection of a lighting application combined with preferred choice or by random selection of a lighting application alone. In this manner, participants are exposed to all lighting applications of interest more than once, and participants' preferences for the different lighting applications are reflected in the number of times they walk to each lighting application. Following an initial simulation study to explore the feasibility of this approach, a comprehensive field test was undertaken. The field test included random environmental walking and collection of participants' subjective ratings of perceived pleasantness (PP), perceived quality, perceived strength, and perceived flicker of four lighting applications. The results indicate that random environmental walking can reveal participants' preferences for different lighting applications that, in the present study, conformed to participants' ratings of PP and perceived quality of the lighting applications. As a complement to subjectively stated environmental preferences, random environmental walking has the potential to expose behavioral preferences for different lighting applications. PMID:28337163

  9. Probabilistic analysis for fatigue strength degradation of materials

    NASA Technical Reports Server (NTRS)

    Royce, Lola

    1989-01-01

    This report presents the results of the first year of a research program conducted for NASA-LeRC by the University of Texas at San Antonio. The research included development of methodology that provides a probabilistic treatment of lifetime prediction of structural components of aerospace propulsion systems subjected to fatigue. Material strength degradation models, based on primitive variables, include both a fatigue strength reduction model and a fatigue crack growth model. Linear elastic fracture mechanics is utilized in the latter model. Probabilistic analysis is based on simulation, and both maximum entropy and maximum penalized likelihood methods are used for the generation of probability density functions. The resulting constitutive relationships are included in several computer programs, RANDOM2, RANDOM3, and RANDOM4. These programs determine the random lifetime of an engine component, in mechanical load cycles, to reach a critical fatigue strength or crack size. The material considered was a cast nickel base superalloy, one typical of those used in the Space Shuttle Main Engine.

  10. Correction of confounding bias in non-randomized studies by appropriate weighting.

    PubMed

    Schmoor, Claudia; Gall, Christine; Stampf, Susanne; Graf, Erika

    2011-03-01

    In non-randomized studies, the assessment of a causal effect of treatment or exposure on outcome is hampered by possible confounding. Applying multiple regression models including the effects of treatment and covariates on outcome is the well-known classical approach to adjust for confounding. In recent years other approaches have been promoted. One of them is based on the propensity score and considers the effect of possible confounders on treatment as a relevant criterion for adjustment. Another proposal is based on using an instrumental variable. Here inference relies on a factor, the instrument, which affects treatment but is thought to be otherwise unrelated to outcome, so that it mimics randomization. Each of these approaches can basically be interpreted as a simple reweighting scheme, designed to address confounding. The procedures will be compared with respect to their fundamental properties, namely, which bias they aim to eliminate, which effect they aim to estimate, and which parameter is modelled. We will expand our overview of methods for analysis of non-randomized studies to methods for analysis of randomized controlled trials and show that analyses of both study types may target different effects and different parameters. The considerations will be illustrated using a breast cancer study with a so-called Comprehensive Cohort Study design, including a randomized controlled trial and a non-randomized study in the same patient population as sub-cohorts. This design offers ideal opportunities to discuss and illustrate the properties of the different approaches. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
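The "simple reweighting scheme" view of the propensity score approach can be made concrete with inverse probability weighting: treated subjects get weight 1/p and controls 1/(1-p), where p is the propensity score. The data below are a made-up two-stratum illustration (not from the breast cancer study), constructed so the confounded naive contrast is biased while the weighted contrast recovers the true effect:

```python
def ipw_effect(records):
    """records: (treated, outcome, propensity) tuples; returns the
    inverse-probability-weighted difference in mean outcomes."""
    tw = ts = cw = cs = 0.0
    for t, y, p in records:
        w = 1.0 / p if t else 1.0 / (1.0 - p)
        if t:
            tw += w; ts += w * y
        else:
            cw += w; cs += w * y
    return ts / tw - cs / cw

# Stratum A: p = 0.8, baseline outcome 10; stratum B: p = 0.2, baseline 0.
# The true treatment effect is +2 in both strata.
data  = [(1, 12.0, 0.8)] * 8 + [(0, 10.0, 0.8)] * 2   # stratum A
data += [(1,  2.0, 0.2)] * 2 + [(0,  0.0, 0.2)] * 8   # stratum B
naive = (8 * 12.0 + 2 * 2.0) / 10 - (2 * 10.0 + 8 * 0.0) / 10  # = 8.0, biased
print(ipw_effect(data))   # = 2.0, the true effect
```

Weighting rebalances the strata across arms, which is exactly the sense in which the propensity score approach mimics the balance that randomization would have produced.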

  11. Dimension from covariance matrices.

    PubMed

    Carroll, T L; Byers, J M

    2017-02-01

    We describe a method to estimate embedding dimension from a time series. This method includes an estimate of the probability that the dimension estimate is valid. Such validity estimates are not common in algorithms for calculating the properties of dynamical systems. The algorithm described here compares the eigenvalues of covariance matrices created from an embedded signal to the eigenvalues for a covariance matrix of a Gaussian random process with the same dimension and number of points. A statistical test gives the probability that the eigenvalues for the embedded signal did not come from the Gaussian random process.
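A toy version of the comparison, reduced to a 2-dimensional delay embedding so the covariance eigenvalues have a closed form (the paper's method and its statistical test are more general), is sketched below. For a deterministic signal such as a sine with a short delay the two eigenvalues are very unequal, whereas an isotropic Gaussian cloud would give nearly equal ones:

```python
import math

def embed_cov_eigs(x, delay):
    """Eigenvalues of the 2x2 covariance matrix of a delay-embedded signal."""
    a, b = x[:-delay], x[delay:]
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    saa = sum((u - ma) ** 2 for u in a) / n
    sbb = sum((u - mb) ** 2 for u in b) / n
    sab = sum((u - ma) * (v - mb) for u, v in zip(a, b)) / n
    d = math.sqrt((saa - sbb) ** 2 + 4 * sab * sab)
    return (saa + sbb - d) / 2, (saa + sbb + d) / 2   # (small, large)

x = [math.sin(0.05 * t) for t in range(4000)]
small, large = embed_cov_eigs(x, delay=2)
# small << large: the embedded sine hugs a line, unlike Gaussian noise.
```

Comparing such eigenvalue spectra against those of a same-size Gaussian process is the essence of the validity test the abstract describes.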

  12. Effects of a Psychological Intervention in a Primary Health Care Center for Caregivers of Dependent Relatives: A Randomized Trial

    ERIC Educational Resources Information Center

    Rodriguez-Sanchez, Emiliano; Patino-Alonso, Maria C.; Mora-Simon, Sara; Gomez-Marcos, Manuel A.; Perez-Penaranda, Anibal; Losada-Baltar, Andres; Garcia-Ortiz, Luis

    2013-01-01

    Purpose: To assess, in the context of Primary Health Care (PHC), the effect of a psychological intervention in mental health among caregivers (CGs) of dependent relatives. Design and Methods: Randomized multicenter, controlled clinical trial. The 125 CGs included in the trial were receiving health care in PHC. Inclusion criteria: Identifying…

  13. A Randomized Trial of Extended Telephone-Based Continuing Care for Alcohol Dependence: Within-Treatment Substance Use Outcomes

    ERIC Educational Resources Information Center

    McKay, James R.; Van Horn, Deborah H. A.; Oslin, David W.; Lynch, Kevin G.; Ivey, Megan; Ward, Kathleen; Drapkin, Michelle L.; Becher, Julie R.; Coviello, Donna M.

    2010-01-01

    Objective: The study tested whether adding up to 18 months of telephone continuing care, either as monitoring and feedback (TM) or longer contacts that included counseling (TMC), to intensive outpatient programs (IOPs) improved outcomes for alcohol-dependent patients. Method: Participants (N = 252) who completed 3 weeks of IOP were randomized to…

  14. A School-Randomized Clinical Trial of an Integrated Social-Emotional Learning and Literacy Intervention: Impacts after 1 School Year

    ERIC Educational Resources Information Center

    Jones, Stephanie M.; Brown, Joshua L.; Hoglund, Wendy L. G.; Aber, J. Lawrence

    2010-01-01

    Objective: To report experimental impacts of a universal, integrated school-based intervention in social-emotional learning and literacy development on change over 1 school year in 3rd-grade children's social-emotional, behavioral, and academic outcomes. Method: This study employed a school-randomized, experimental design and included 942…

  15. Is using multiple imputation better than complete case analysis for estimating a prevalence (risk) difference in randomized controlled trials when binary outcome observations are missing?

    PubMed

    Mukaka, Mavuto; White, Sarah A; Terlouw, Dianne J; Mwapasa, Victor; Kalilani-Phiri, Linda; Faragher, E Brian

    2016-07-22

    Missing outcomes can seriously impair the ability to make correct inferences from randomized controlled trials (RCTs). Complete case (CC) analysis is commonly used, but it reduces sample size and is perceived to lead to reduced statistical efficiency of estimates while increasing the potential for bias. As multiple imputation (MI) methods preserve sample size, they are generally viewed as the preferred analytical approach. We examined this assumption, comparing the performance of CC and MI methods to determine risk difference (RD) estimates in the presence of missing binary outcomes. We conducted simulation studies of 5000 simulated data sets with 50 imputations of RCTs with one primary follow-up endpoint at different underlying levels of RD (3-25 %) and missing outcomes (5-30 %). For missing at random (MAR) or missing completely at random (MCAR) outcomes, CC method estimates generally remained unbiased and achieved precision similar to or better than MI methods, and high statistical coverage. Missing not at random (MNAR) scenarios yielded invalid inferences with both methods. Effect size estimate bias was reduced in MI methods by always including group membership even if this was unrelated to missingness. Surprisingly, under MAR and MCAR conditions in the assessed scenarios, MI offered no statistical advantage over CC methods. While MI must inherently accompany CC methods for intention-to-treat analyses, these findings endorse CC methods for per protocol risk difference analyses in these conditions. These findings provide an argument for the use of the CC approach to always complement MI analyses, with the usual caveat that the validity of the mechanism for missingness be thoroughly discussed. More importantly, researchers should strive to collect as much data as possible.

  16. All about Eve: Secret Sharing using Quantum Effects

    NASA Technical Reports Server (NTRS)

    Jackson, Deborah J.

    2005-01-01

    This document discusses the nature of light (including classical light and photons), encryption, quantum key distribution (QKD), light polarization, and beamsplitters, and their application to information communication. A quantum of light, called a photon, represents the smallest possible subdivision of radiant energy (light). The QKD key-generation sequence is outlined: the receiver broadcasts an initial signal indicating reception availability; the sender provides timing pulses as a reference for gated detection of photons; the sender generates photons with random polarization while the receiver detects photons with random polarization; and the two parties communicate via a data link to mutually establish random keys. The QKD network vision includes inter-SATCOM, point-to-point Gnd Fiber and SATCOM-fiber nodes. QKD offers an unconditionally secure method of exchanging encryption keys. Ongoing research will focus on how to increase the key generation rate.
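    The random-polarization step of the key-generation sequence can be illustrated with a toy BB84-style sifting sketch. This is a purely classical simulation with an assumed pulse count; real QKD uses polarized photons, gated detectors, and an authenticated classical channel, and must also handle loss, noise, and eavesdropping checks.

```python
import secrets

# Toy BB84-style key sifting (illustrative classical simulation only)
def sift(n_pulses=1000):
    bits = [secrets.randbelow(2) for _ in range(n_pulses)]      # sender's random bits
    s_basis = [secrets.randbelow(2) for _ in range(n_pulses)]   # sender's random bases
    r_basis = [secrets.randbelow(2) for _ in range(n_pulses)]   # receiver's random bases
    # Over the public data link the parties compare bases (never bits) and
    # keep only the pulses where the bases happened to match.
    return [b for b, sb, rb in zip(bits, s_basis, r_basis) if sb == rb]

key = sift()
print(len(key))  # roughly half the pulses survive sifting
```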

  17. Methodological Reporting of Randomized Trials in Five Leading Chinese Nursing Journals

    PubMed Central

    Shi, Chunhu; Tian, Jinhui; Ren, Dan; Wei, Hongli; Zhang, Lihuan; Wang, Quan; Yang, Kehu

    2014-01-01

    Background Randomized controlled trials (RCTs) are not always well reported, especially in terms of their methodological descriptions. This study aimed to investigate the adherence of methodological reporting complying with CONSORT and explore associated trial level variables in the Chinese nursing care field. Methods In June 2012, we identified RCTs published in five leading Chinese nursing journals and included trials with details of randomized methods. The quality of methodological reporting was measured through the methods section of the CONSORT checklist and the overall CONSORT methodological items score was calculated and expressed as a percentage. Meanwhile, we hypothesized that some general and methodological characteristics were associated with reporting quality and conducted a regression with these data to explore the correlation. The descriptive and regression statistics were calculated via SPSS 13.0. Results In total, 680 RCTs were included. The overall CONSORT methodological items score was 6.34±0.97 (Mean ± SD). No RCT reported descriptions and changes in “trial design,” changes in “outcomes” and “implementation,” or descriptions of the similarity of interventions for “blinding.” Poor reporting was found in detailing the “settings of participants” (13.1%), “type of randomization sequence generation” (1.8%), calculation methods of “sample size” (0.4%), explanation of any interim analyses and stopping guidelines for “sample size” (0.3%), “allocation concealment mechanism” (0.3%), additional analyses in “statistical methods” (2.1%), and targeted subjects and methods of “blinding” (5.9%). More than 50% of trials described randomization sequence generation, the eligibility criteria of “participants,” “interventions,” and definitions of the “outcomes” and “statistical methods.” The regression analysis found that publication year and ITT analysis were weakly associated with CONSORT score. 
Conclusions The completeness of methodological reporting of RCTs in the Chinese nursing care field is poor, especially with regard to the reporting of trial design, changes in outcomes, sample size calculation, allocation concealment, blinding, and statistical methods. PMID:25415382

  18. Analysis of the Interaction of Student Characteristics with Method in Micro-Teaching.

    ERIC Educational Resources Information Center

    Chavers, Katherine; And Others

    A study examined the comparative effects on microteaching performance of (1) eight different methods of teacher training and (2) the interaction of method with student characteristics. Subjects, 71 enrollees in an educational psychology course, were randomly assigned to eight treatment groups (including one control group). Treatments consisted of…

  19. Baseline Serum Estradiol and Fracture Reduction During Treatment With Hormone Therapy: The Women’s Health Initiative Randomized Trial

    PubMed Central

    Cauley, Jane A.; LaCroix, Andrea Z.; Robbins, John A.; Larson, Joseph; Wallace, Robert; Wactawski-Wende, Jean; Chen, Zhao; Bauer, Douglas C.; Cummings, Steven R.; Jackson, Rebecca

    2009-01-01

    Purpose To test the hypothesis that the reduction in fractures with hormone therapy (HT) is greater in women with lower estradiol levels. Methods We conducted a nested case-control study within the Women’s Health Initiative HT Trials. The sample included 231 hip fracture case-control pairs and a random sample of 519 all fracture case-control pairs. Cases and controls were matched for age, ethnicity, randomization date, fracture history and hysterectomy status. Hormones were measured prior to randomization. Incident cases of fracture were identified over an average follow-up of 6.53 years. Results There was no evidence that the effect of HT on fracture differed by baseline estradiol (E2) or sex hormone binding globulin (SHBG). Across all quartiles of E2 and SHBG, women randomized to HT had about a 50% lower risk of fracture including hip fracture, compared to placebo. Conclusion The effect of HT on fracture reduction is independent of estradiol and SHBG levels. PMID:19436934

  20. Speckle lithography for fabricating Gaussian, quasi-random 2D structures and black silicon structures.

    PubMed

    Bingi, Jayachandra; Murukeshan, Vadakke Matham

    2015-12-18

    A laser speckle pattern is a granular structure formed by the random interference of coherent wavelets and is generally considered noise in optical systems, including photolithography. Contrary to this, in this paper we use the speckle pattern to generate predictable and controlled Gaussian random structures and quasi-random structures photo-lithographically. The random structures made using this proposed speckle lithography technique are quantified based on speckle statistics, the radial distribution function (RDF) and the fast Fourier transform (FFT). Control over the speckle size, density and speckle clustering facilitates the successful fabrication of black silicon with different surface structures. The controllability and tunability of the randomness make this technique a robust method for fabricating predictable 2D Gaussian random structures and black silicon structures. These structures can significantly enhance light trapping in solar cells and hence enable improved energy harvesting. Further, this technique can enable efficient fabrication of disordered photonic structures and random-media-based devices.
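    The speckle statistics used for quantification can be sketched with the classic random-phasor-sum model of fully developed speckle: summing many unit-amplitude phasors with uniform random phases yields an intensity with a negative exponential distribution, so the contrast (standard deviation over mean) tends to 1. The phasor and sample counts below are arbitrary illustration values, not the paper's parameters.

```python
import cmath
import math
import random

random.seed(5)

# Fully developed speckle as a random phasor sum
def speckle_intensity(n_phasors=100):
    field = sum(cmath.exp(1j * random.uniform(0.0, 2.0 * math.pi))
                for _ in range(n_phasors))
    return abs(field) ** 2 / n_phasors   # normalized so the mean intensity is ~1

intensities = [speckle_intensity() for _ in range(5000)]
mean = sum(intensities) / len(intensities)
sd = (sum((x - mean) ** 2 for x in intensities) / len(intensities)) ** 0.5
print(round(sd / mean, 2))  # contrast near 1 signals fully developed speckle
```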

  1. Authentication via wavefront-shaped optical responses

    NASA Astrophysics Data System (ADS)

    Eilers, Hergen; Anderson, Benjamin R.; Gunawidjaja, Ray

    2018-02-01

    Authentication/tamper-indication is required in a wide range of applications, including nuclear materials management and product counterfeit detection. State-of-the-art techniques include reflective particle tags, laser speckle authentication, and birefringent seals. Each of these passive techniques has its own advantages and disadvantages, including the need for complex image comparisons, limited flexibility, sensitivity to environmental conditions, limited functionality, etc. We have developed a new active approach to address some of these shortcomings. The use of an active characterization technique adds more flexibility and additional layers of security over current techniques. Our approach uses randomly-distributed nanoparticles embedded in a polymer matrix (tag/seal) which is attached to the item to be secured. A spatial light modulator is used to adjust the wavefront of a laser which interacts with the tag/seal, and a detector is used to monitor this interaction. The interaction can occur in various ways, including transmittance, reflectance, fluorescence, random lasing, etc. For example, at the time of origination, the wavefront-shaped reflectance from a tag/seal can be adjusted to result in a specific pattern (symbols, words, etc.). Any tampering with the tag/seal would result in a disturbance of the random orientation of the nanoparticles and thus distort the reflectance pattern. A holographic waveplate could be inserted into the laser beam for verification. The absence/distortion of the original pattern would then indicate that tampering has occurred. We have tested the tag/seal's and authentication method's tamper-indicating ability using various attack methods, including mechanical, thermal, and chemical attacks, and have verified our material/method's robust tamper-indicating ability.

  2. A method for validating Rent's rule for technological and biological networks.

    PubMed

    Alcalde Cuesta, Fernando; González Sequeiros, Pablo; Lozano Rojo, Álvaro

    2017-07-14

    Rent's rule is an empirical power law introduced in an effort to describe and optimize the wiring complexity of computer logic graphs. It is known that brain and neuronal networks also obey Rent's rule, which is consistent with the idea that wiring costs play a fundamental role in brain evolution and development. Here we propose a method to validate this power law for a certain range of network partitions. The method is based on the bifurcation phenomenon that appears when the network is subjected to random alterations preserving its degree distribution. It has been tested on a set of VLSI circuits and real networks, including biological and technological ones. We also analyzed the effect of different types of random alterations on the Rentian scaling in order to test the influence of the degree distribution. Some network architectures are quite sensitive to these randomization procedures, with significant increases in the values of the Rent exponents.
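    Rent's rule states that the number of terminals T of a partition scales as T = t·G^p with the number of blocks G it contains, and the exponent p is conventionally estimated by log-log regression over partitions. A minimal sketch on noiseless synthetic data (the constants t = 4 and p = 0.6 are made up):

```python
import math

# Synthetic (G, T) pairs obeying T = 4 * G**0.6 exactly
data = [(g, 4.0 * g ** 0.6) for g in (2, 4, 8, 16, 32, 64, 128)]

xs = [math.log(g) for g, _ in data]
ys = [math.log(t) for _, t in data]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
p_hat = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))      # least-squares slope = Rent exponent
t_hat = math.exp(ybar - p_hat * xbar)             # intercept back-transformed
print(round(p_hat, 3), round(t_hat, 3))  # recovers p = 0.6 and t = 4.0
```

On real partition data the points scatter around the line, and the cited bifurcation analysis addresses which range of partitions the fit is valid over.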

  3. The competing risks Cox model with auxiliary case covariates under weaker missing-at-random cause of failure.

    PubMed

    Nevo, Daniel; Nishihara, Reiko; Ogino, Shuji; Wang, Molin

    2017-08-04

    In the analysis of time-to-event data with multiple causes using a competing risks Cox model, often the cause of failure is unknown for some of the cases. The probability of a missing cause is typically assumed to be independent of the cause given the time of the event and covariates measured before the event occurred. In practice, however, the underlying missing-at-random assumption does not necessarily hold. Motivated by colorectal cancer molecular pathological epidemiology analysis, we develop a method to conduct valid analysis when additional auxiliary variables are available for cases only. We consider a weaker missing-at-random assumption, with missing pattern depending on the observed quantities, which include the auxiliary covariates. We use an informative likelihood approach that will yield consistent estimates even when the underlying model for missing cause of failure is misspecified. The superiority of our method over naive methods in finite samples is demonstrated by simulation study results. We illustrate the use of our method in an analysis of colorectal cancer data from the Nurses' Health Study cohort, where, apparently, the traditional missing-at-random assumption fails to hold.

  4. Causal inference from observational data.

    PubMed

    Listl, Stefan; Jürges, Hendrik; Watt, Richard G

    2016-10-01

    Randomized controlled trials have long been considered the 'gold standard' for causal inference in clinical research. In the absence of randomized experiments, identification of reliable intervention points to improve oral health is often perceived as a challenge. But other fields of science, such as social science, have always been challenged by ethical constraints to conducting randomized controlled trials. Methods have been established to make causal inference using observational data, and these methods are becoming increasingly relevant in clinical medicine, health policy and public health research. This study provides an overview of state-of-the-art methods specifically designed for causal inference in observational data, including difference-in-differences (DiD) analyses, instrumental variables (IV), regression discontinuity designs (RDD) and fixed-effects panel data analysis. The described methods may be particularly useful in dental research, not least because of the increasing availability of routinely collected administrative data and electronic health records ('big data'). © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
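    Of the methods listed, difference-in-differences is the simplest to show numerically: the treated group's pre/post change minus the control group's change removes any common time trend. The outcome values below are invented purely for illustration.

```python
# Two-group, two-period difference-in-differences (DiD)
pre_treated, post_treated = 10.0, 16.0   # hypothetical mean outcomes, treated group
pre_control, post_control = 9.0, 11.0    # hypothetical mean outcomes, control group

# Subtracting the control group's change nets out the shared time trend
did = (post_treated - pre_treated) - (post_control - pre_control)
print(did)  # treated changed by 6, control by 2, so the DiD estimate is 4.0
```

The identifying assumption is parallel trends: absent treatment, both groups would have changed by the same amount.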

  5. Bayesian statistical inference enhances the interpretation of contemporary randomized controlled trials.

    PubMed

    Wijeysundera, Duminda N; Austin, Peter C; Hux, Janet E; Beattie, W Scott; Laupacis, Andreas

    2009-01-01

    Randomized trials generally use "frequentist" statistics based on P-values and 95% confidence intervals. Frequentist methods have limitations that might be overcome, in part, by Bayesian inference. To illustrate these advantages, we re-analyzed randomized trials published in four general medical journals during 2004. We used Medline to identify randomized superiority trials with two parallel arms, individual-level randomization and dichotomous or time-to-event primary outcomes. Studies with P<0.05 in favor of the intervention were deemed "positive"; otherwise, they were "negative." We used several prior distributions and exact conjugate analyses to calculate Bayesian posterior probabilities for clinically relevant effects. Of 88 included studies, 39 were positive using a frequentist analysis. Although the Bayesian posterior probabilities of any benefit (relative risk or hazard ratio<1) were high in positive studies, these probabilities were lower and variable for larger benefits. The positive studies had only moderate probabilities for exceeding the effects that were assumed for calculating the sample size. By comparison, there were moderate probabilities of any benefit in negative studies. Bayesian and frequentist analyses complement each other when interpreting the results of randomized trials. Future reports of randomized trials should include both.
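    The exact conjugate analysis mentioned above can be sketched for a dichotomous outcome: with Beta(1, 1) priors, each arm's event probability has a Beta posterior, and the posterior probability of any benefit (relative risk < 1) follows by Monte Carlo. The event counts here are hypothetical, not drawn from the reviewed trials.

```python
import random

random.seed(0)

# Hypothetical two-arm trial with a dichotomous outcome
events_t, n_t = 30, 200   # intervention arm
events_c, n_c = 45, 200   # control arm

draws = 20000
benefit = 0
for _ in range(draws):
    # Beta(1 + events, 1 + non-events) posteriors under uniform priors
    p_t = random.betavariate(1 + events_t, 1 + n_t - events_t)
    p_c = random.betavariate(1 + events_c, 1 + n_c - events_c)
    if p_t < p_c:          # this posterior draw implies relative risk < 1
        benefit += 1

prob_benefit = benefit / draws
print(round(prob_benefit, 3))
```

Replacing the threshold `p_t < p_c` with, say, `p_t < 0.8 * p_c` gives the posterior probability of a larger, clinically relevant benefit, which is the quantity the authors found to be only moderate in many "positive" trials.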

  6. Tubal anastomosis after previous sterilization: a systematic review.

    PubMed

    van Seeters, Jacoba A H; Chua, Su Jen; Mol, Ben W J; Koks, Carolien A M

    2017-05-01

    Female sterilization is one of the most common contraceptive methods. A small number of women, however, opt for reversal of sterilization procedures after they experience regret. Procedures can be performed by laparotomy or laparoscopy, with or without robotic assistance. Another commonly utilized alternative is IVF. The choice between surgery and IVF is often influenced by reimbursement politics for that particular geographic location. We evaluated the fertility outcomes of different surgical methods available for the reversal of female sterilization, compared these to IVF and assessed the prognostic factors for success. Two search strategies were employed. Firstly, we searched for randomized and non-randomized clinical studies presenting fertility outcomes of sterilization reversal up to July 2016. Data on the following outcomes were collected: pregnancy rate, ectopic pregnancy rate, cost of the procedure and operative time. Eligible study designs included prospective or retrospective studies, randomized controlled trials, cohort studies, case-control studies and case series. No age restriction was applied. Exclusion criteria were patients suffering from tubal infertility from any other reason (e.g. infection, endometriosis and adhesions from previous surgery) and studies including <10 participants. The following factors likely to influence the success of sterilization reversal procedures were then evaluated: female age, BMI and duration and method of sterilization. Secondly, we searched for randomized and non-randomized clinical studies that compared reversal of sterilization to IVF and evaluated them for pregnancy outcomes and cost effectiveness. We included 37 studies that investigated a total of 10 689 women. No randomized controlled trials were found. Most studies were retrospective cohort studies of a moderate quality. The pooled pregnancy rate after sterilization reversal was 42-69%, with heterogeneity seen from the different methods utilized. 
The reported ectopic pregnancy rate was 4-8%. The only prognostic factor affecting the chance of conception was female age. The surgical approach (i.e. laparotomy [microscopic], laparoscopy or robotic) had no impact on the outcome, with the exception of the macroscopic laparotomic technique, which had inferior results and is not currently utilized. For older women, IVF could be a more cost-effective alternative for the reversal of sterilization. However, direct comparative data are lacking and a cut-off age cannot be stated. In sterilized women who suffer regret, surgical tubal re-anastomosis is an effective treatment, especially in younger women. However, there is a need for randomized controlled trials comparing the success rates and costs of surgical reversal with IVF. © The Author 2017. Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  7. Safety assessment of a shallow foundation using the random finite element method

    NASA Astrophysics Data System (ADS)

    Zaskórski, Łukasz; Puła, Wojciech

    2015-04-01

    The complex structure of soil and its random character make soil modeling a cumbersome task. The heterogeneity of soil has to be considered even within a nominally homogeneous layer, so estimating the shear strength parameters of soil for a geotechnical analysis causes many problems. The applicable standard (Eurocode 7) presents no explicit method for evaluating characteristic values of soil parameters, only general guidelines on how these values should be estimated. Hence many approaches to assessing characteristic values of soil parameters are presented in the literature and can be applied in practice. In this paper, the reliability assessment of a shallow strip footing was conducted using a reliability index β. Several approaches to estimating characteristic values of soil properties were compared by evaluating the values of the reliability index β achievable with each of them: the method of Orr and Breysse, Duncan's method, Schneider's method, Schneider's method accounting for the influence of fluctuation scales, and the method included in Eurocode 7. Design values of the bearing capacity based on these approaches were compared with the stochastic bearing capacity estimated by the random finite element method (RFEM). Design values of the bearing capacity were computed for various widths and depths of the foundation in conjunction with the design approaches (DA) defined in the Eurocode. RFEM, introduced by Griffiths and Fenton (1993), combines the deterministic finite element method, random field theory and Monte Carlo simulation. Random field theory makes it possible to account for the random character of soil parameters within a homogeneous layer: each soil property is treated as a separate random variable in every element of the finite element mesh, with an appropriate correlation structure between points of the given area.
    RFEM was applied to determine which theoretical probability distribution best fits the empirical probability distribution of the bearing capacity, based on 3000 realizations. The fitted distribution was then used to compute design values of the bearing capacity and the related reliability indices β. The analyses were carried out for a cohesive soil; hence the friction angle and the cohesion were treated as random parameters characterized by two-dimensional random fields. The friction angle was described by a bounded distribution, since it varies within a limited range, while a lognormal distribution was applied to the cohesion. The remaining properties (Young's modulus, Poisson's ratio and unit weight) were assumed deterministic because they have a negligible influence on the stochastic bearing capacity. Griffiths D. V., & Fenton G. A. (1993). Seepage beneath water retaining structures founded on spatially random soil. Géotechnique, 43(6), 577-587.

  8. Computer methods for sampling from the gamma distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, M.E.; Tadikamalla, P.R.

    1978-01-01

    Considerable attention has recently been directed at developing ever faster algorithms for generating gamma random variates on digital computers. This paper surveys the current state of the art including the leading algorithms of Ahrens and Dieter, Atkinson, Cheng, Fishman, Marsaglia, Tadikamalla, and Wallace. General random variate generation techniques are explained with reference to these gamma algorithms. Computer simulation experiments on IBM and CDC computers are reported.
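    As a flavor of the rejection-method family this survey covers, here is a compact gamma sampler: the squeeze/rejection scheme later published by Marsaglia and Tsang. Note that this algorithm postdates the 1978 survey (which compares earlier methods by Ahrens and Dieter, Atkinson, Cheng, and others) and is shown purely as an illustration of the technique class.

```python
import math
import random

random.seed(1)

def gamma_mt(shape, rng=random):
    """Marsaglia-Tsang squeeze/rejection sampler for Gamma(shape, 1), shape >= 1."""
    d = shape - 1.0 / 3.0
    c = 1.0 / math.sqrt(9.0 * d)
    while True:
        x = rng.gauss(0.0, 1.0)
        v = (1.0 + c * x) ** 3
        if v <= 0.0:
            continue                      # reject draws outside the support
        u = rng.random()
        # Cheap "squeeze" acceptance first, exact log test as a fallback
        if (u < 1.0 - 0.0331 * x ** 4
                or math.log(u) < 0.5 * x * x + d * (1.0 - v + math.log(v))):
            return d * v

samples = [gamma_mt(4.0) for _ in range(20000)]
mean = sum(samples) / len(samples)
print(round(mean, 2))  # Gamma(4, 1) has mean 4
```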

  9. A Randomized, Double-Blind, Placebo-Controlled Study of Modafinil Film-Coated Tablets in Children and Adolescents with Attention-Deficit/Hyperactivity Disorder

    ERIC Educational Resources Information Center

    Greenhill, Laurence L.; Biederman, Joseph; Boellner, Samuel W.; Rugino, Thomas A.; Sangal, R. Bart; Earl, Craig Q.; Jiang, John G.; Swanson, James M.

    2006-01-01

    Objective: To evaluate the efficacy and tolerability of modafinil in children and adolescents, ages 7 to 17, with attention-deficit/hyperactivity disorder (ADHD). Method: In this 9-week, double-blind, flexible-dose study, patients were randomized to once-daily modafinil (170-425 mg) or placebo. Assessments included ADHD Rating Scale-IV…

  10. Random Fields

    NASA Astrophysics Data System (ADS)

    Vanmarcke, Erik

    1983-03-01

    Random variation over space and time is one of the few attributes that might safely be predicted as characterizing almost any given complex system. Random fields or "distributed disorder systems" confront astronomers, physicists, geologists, meteorologists, biologists, and other natural scientists. They appear in the artifacts developed by electrical, mechanical, civil, and other engineers. They even underlie the processes of social and economic change. The purpose of this book is to bring together existing and new methodologies of random field theory and indicate how they can be applied to these diverse areas where a "deterministic treatment is inefficient and conventional statistics insufficient." Many new results and methods are included. After outlining the extent and characteristics of the random field approach, the book reviews the classical theory of multidimensional random processes and introduces basic probability concepts and methods in the random field context. It next gives a concise account of the second-order analysis of homogeneous random fields, in both the space-time domain and the wave number-frequency domain. This is followed by a chapter on spectral moments and related measures of disorder and on level excursions and extremes of Gaussian and related random fields. After developing a new framework of analysis based on local averages of one-, two-, and n-dimensional processes, the book concludes with a chapter discussing ramifications in the important areas of estimation, prediction, and control. The mathematical prerequisite has been held to basic college-level calculus.
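    The local-averaging framework mentioned above rests on a simple computation: averaging a correlated field over a window reduces its variance, but more slowly than the 1/T rate of independent samples. A sketch for a discrete 1-D field with unit variance and an assumed correlation ρ^|i−j| between points i and j (the model and parameter are illustrative, not the book's notation):

```python
# Variance of the average of T consecutive points of a unit-variance field
# with correlation rho**|i - j|, by direct summation of the covariance matrix
def variance_of_local_average(T, rho):
    return sum(rho ** abs(i - j) for i in range(T) for j in range(T)) / T ** 2

for T in (1, 2, 5, 20):
    print(T, round(variance_of_local_average(T, 0.5), 4))
# The variance decays with window size T, but correlation makes it decay
# more slowly than the 1/T law that holds for independent samples.
```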

  11. Patient Satisfaction with Different Interpreting Methods: A Randomized Controlled Trial

    PubMed Central

    Leng, Jennifer; Shapiro, Ephraim; Abramson, David; Motola, Ivette; Shield, David C.; Changrani, Jyotsna

    2007-01-01

    Background Growth of the foreign-born population in the U.S. has led to increasing numbers of limited-English-proficient (LEP) patients. Innovative medical interpreting strategies, including remote simultaneous medical interpreting (RSMI), have arisen to address the language barrier. This study evaluates the impact of interpreting method on patient satisfaction. Methods 1,276 English-, Spanish-, Mandarin-, and Cantonese-speaking patients attending the primary care clinic and emergency department of a large New York City municipal hospital were screened for enrollment in a randomized controlled trial. Language-discordant patients were randomized to RSMI or usual and customary (U&C) interpreting. Patients with language-concordant providers received usual care. Demographic and patient satisfaction questionnaires were administered to all participants. Results 541 patients were language-concordant with their providers and not randomized; 371 were randomized to RSMI, 167 of whom were exposed to RSMI; and 364 were randomized to U&C, 198 of whom were exposed to U&C. Patients randomized to RSMI were more likely than those with U&C to think doctors treated them with respect (RSMI 71%, U&C 64%, p < 0.05), but they did not differ in other measures of physician communication/care. In a linear regression analysis, exposure to RSMI was significantly associated with an increase in overall satisfaction with physician communication/care (β 0.10, 95% CI 0.02–0.18, scale 0–1.0). Patients randomized to RSMI were more likely to think the interpreting method protected their privacy (RSMI 51%, U&C 38%, p < 0.05). Patients randomized to either arm of interpretation reported less comprehension and satisfaction than patients in language-concordant encounters. Conclusions While not a substitute for language-concordant providers, RSMI can improve patient satisfaction and privacy among LEP patients. 
Implementing RSMI should be considered an important component of a multipronged approach to addressing language barriers in health care. PMID:17957417

  12. A new compound control method for sine-on-random mixed vibration test

    NASA Astrophysics Data System (ADS)

    Zhang, Buyun; Wang, Ruochen; Zeng, Falin

    2017-09-01

    Vibration environmental testing (VET) is an important and effective way to support the strength design and the reliability and durability testing of mechanical products. A new separation control strategy is proposed for multiple-input multiple-output (MIMO) sine-on-random (SOR) mixed-mode vibration tests, an advanced and intensive type of VET. As the key element of the strategy, a correlation integral method is applied to separate the mixed signals into their random and sinusoidal components. The feedback control formula of the MIMO linear random vibration system is systematically deduced in the frequency domain, and a Jacobi control algorithm is proposed based on elements of the power spectral density (PSD) matrix such as the auto-spectra, coherence, and phase. To counter excessive correction of the excitation in the sine vibration test, a compression factor is introduced to reduce the excitation correction and avoid damage to the vibration table or other devices. The two methods are combined in the MIMO SOR vibration test system. Finally, a verification test system with the vibration of a cantilever beam as the control object was established to verify the reliability and effectiveness of the proposed methods. The test results show that the exceedance values can be controlled accurately within the tolerance range of the references, and the method provides theoretical and practical support for mechanical engineering.
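    The correlation-integral separation of the sinusoidal component from a sine-on-random signal can be sketched as a lock-in style estimate: correlating the mixed signal with quadrature references over an integer number of tone periods averages the random part away. The tone frequency, amplitude, phase, and noise level below are assumed values; the paper's actual algorithm runs inside a MIMO control loop.

```python
import math
import random

random.seed(3)

f0, fs, n = 50.0, 1000.0, 4000          # 50 Hz tone, 4 s record -> 200 full periods
amp, phase = 2.0, 0.7                   # assumed tone parameters to recover

# Sine-on-random signal: deterministic tone plus Gaussian random vibration
signal = [amp * math.sin(2 * math.pi * f0 * k / fs + phase) + random.gauss(0.0, 1.0)
          for k in range(n)]

# Correlation integrals against quadrature references
i_part = 2.0 / n * sum(s * math.sin(2 * math.pi * f0 * k / fs)
                       for k, s in enumerate(signal))
q_part = 2.0 / n * sum(s * math.cos(2 * math.pi * f0 * k / fs)
                       for k, s in enumerate(signal))

amp_est = math.hypot(i_part, q_part)    # the random component averages out
phase_est = math.atan2(q_part, i_part)
print(round(amp_est, 2), round(phase_est, 2))
```

Subtracting the recovered sinusoid from the mixed signal leaves the random component, which can then be controlled against its PSD reference separately.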

  13. Some practical problems in implementing randomization.

    PubMed

    Downs, Matt; Tucker, Kathryn; Christ-Schmidt, Heidi; Wittes, Janet

    2010-06-01

    While often theoretically simple, implementing randomization to treatment in a masked, but confirmable, fashion can prove difficult in practice. At least three categories of problems occur in randomization: (1) bad judgment in the choice of method, (2) design and programming errors in implementing the method, and (3) human error during the conduct of the trial. This article focuses on these latter two types of errors, dealing operationally with what can go wrong after trial designers have selected the allocation method. We offer several case studies and corresponding recommendations for lessening the frequency of problems in allocating treatment or for mitigating the consequences of errors. Recommendations include: (1) reviewing the randomization schedule before starting a trial, (2) being especially cautious of systems that use on-demand random number generators, (3) drafting unambiguous randomization specifications, (4) performing thorough testing before entering a randomization system into production, (5) maintaining a dataset that captures the values investigators used to randomize participants, thereby allowing the process of treatment allocation to be reproduced and verified, (6) resisting the urge to correct errors that occur in individual treatment assignments, (7) preventing inadvertent unmasking to treatment assignments in kit allocations, and (8) checking a sample of study drug kits to allow detection of errors in drug packaging and labeling. Although we performed a literature search of documented randomization errors, the examples that we provide and the resultant recommendations are based largely on our own experience in industry-sponsored clinical trials. We do not know how representative our experience is or how common errors of the type we have seen occur. Our experience underscores the importance of verifying the integrity of the treatment allocation process before and during a trial. Clinical Trials 2010; 7: 235-245. http://ctj.sagepub.com.
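    Recommendation (1), reviewing the randomization schedule before the trial, presupposes a list that is generated and frozen up front rather than produced on demand. A minimal sketch of one common way to build such a list, permuted-block randomization (the block size and arm labels are arbitrary choices for this illustration):

```python
import random

random.seed(7)

# Permuted-block randomization: the whole allocation list exists (and can be
# reviewed and later verified) before the first participant is enrolled, and
# every block is exactly balanced across arms.
def block_randomization(n_subjects, block_size=4, arms=("A", "B")):
    assert block_size % len(arms) == 0, "blocks must balance evenly across arms"
    schedule = []
    while len(schedule) < n_subjects:
        block = list(arms) * (block_size // len(arms))
        random.shuffle(block)             # random order within each block
        schedule.extend(block)
    return schedule[:n_subjects]

schedule = block_randomization(20)
print(schedule)
print(schedule.count("A"), schedule.count("B"))  # exactly 10 and 10
```

Because the list is reproducible from the seed, the treatment allocation process can be re-run and verified after the trial, addressing recommendation (5).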

  14. Analysis of training sample selection strategies for regression-based quantitative landslide susceptibility mapping methods

    NASA Astrophysics Data System (ADS)

    Erener, Arzu; Sivas, A. Abdullah; Selcuk-Kestel, A. Sevtap; Düzgün, H. Sebnem

    2017-07-01

    All quantitative landslide susceptibility mapping (QLSM) methods require two basic data types, namely, a landslide inventory and factors that influence landslide occurrence (landslide influencing factors, LIF). The accuracy of QLSM methods differs depending on the type of landslides, the nature of the triggers, and the LIF. Moreover, how to balance the number of 0's (non-occurrence) and 1's (occurrence) in the training set obtained from the landslide inventory, and how to select which of the 1's and 0's to include in QLSM models, play a critical role in the accuracy of the QLSM. Although the performance of various QLSM methods has been investigated extensively in the literature, the challenge of training set construction has not been adequately studied. In order to tackle this challenge, in this study three different training set selection strategies, along with the original data set, are used to test the performance of three different regression methods, namely Logistic Regression (LR), Bayesian Logistic Regression (BLR) and Fuzzy Logistic Regression (FLR). The first sampling strategy is proportional random sampling (PRS), which takes into account a weighted selection of landslide occurrences in the sample set. The second method, non-selective nearby sampling (NNS), includes randomly selected sites and their surrounding neighboring points at certain preselected distances to include the impact of clustering. Selective nearby sampling (SNS) is the third method, which concentrates on the group of 1's and their surrounding neighborhood; a randomly selected group of landslide sites and their neighborhood are considered in the analyses, similar to the NNS parameters. It is found that the LR-PRS, FLR-PRS and BLR-Whole Data set-ups, in that order, yield the best fits among the alternatives. The results indicate that in QLSM based on regression models, avoiding spatial correlation in the data set is critical for model performance.

  15. Impedance measurement using a two-microphone, random-excitation method

    NASA Technical Reports Server (NTRS)

    Seybert, A. F.; Parrott, T. L.

    1978-01-01

    The feasibility of using a two-microphone, random-excitation technique for the measurement of acoustic impedance was studied. Equations were developed, including the effect of mean flow, which show that acoustic impedance is related to the pressure ratio and phase difference between two points in a duct carrying plane waves only. The impedances of a honeycomb ceramic specimen and a Helmholtz resonator were measured and compared with impedances obtained using the conventional standing-wave method. Agreement between the two methods was generally good. A sensitivity analysis was performed to pinpoint possible error sources and recommendations were made for future study. The two-microphone approach evaluated in this study appears to have some advantages over other impedance measuring techniques.
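    The paper's central relation — impedance obtained from the pressure ratio and phase difference between two points in a duct carrying plane waves — can be sketched for the no-flow case. The decomposition below into incident and reflected plane waves is textbook duct acoustics, not the authors' exact formulation.

```python
import numpy as np

def normalized_impedance(p1, p2, x1, x2, k):
    """Recover the normalized surface impedance at x = 0 from complex
    pressures p1, p2 measured at duct positions x1, x2 (plane waves,
    no mean flow).  With p(x) = A exp(-1j k x) + B exp(+1j k x), the
    complex ratio p2/p1 carries exactly the amplitude ratio and phase
    difference the paper relates to impedance.  Illustrative sketch."""
    M = np.array([[np.exp(-1j * k * x1), np.exp(1j * k * x1)],
                  [np.exp(-1j * k * x2), np.exp(1j * k * x2)]])
    A, B = np.linalg.solve(M, np.array([p1, p2]))
    return (A + B) / (A - B)   # zeta = Z / (rho * c) at x = 0
```

The 2x2 system is singular when k(x2 - x1) is a multiple of pi, which is the familiar microphone-spacing restriction of two-microphone methods.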

  16. Effects of pilates on patients with chronic non-specific low back pain: a systematic review

    PubMed Central

    Lin, Hui-Ting; Hung, Wei-Ching; Hung, Jia-Ling; Wu, Pei-Shan; Liaw, Li-Jin; Chang, Jia-Hao

    2016-01-01

    [Purpose] To evaluate the effects of Pilates on patients with chronic low back pain through a systematic review of high-quality articles on randomized controlled trials. [Subjects and Methods] Keywords and synonyms for “Pilates” and “Chronic low back pain” were used in database searches. The databases included PubMed, Physiotherapy Evidence Database (PEDro), Medline, and the Cochrane Library. Articles involving randomized controlled trials with higher than 5 points on the PEDro scale were reviewed for suitability and inclusion. The methodological quality of the included randomized controlled trials was evaluated using the PEDro scale. Relevant information was extracted by 3 reviewers. [Results] Eight randomized controlled trial articles were included. Patients with chronic low back pain showed statistically significant improvement in pain relief and functional ability compared to patients who only performed usual or routine health care. However, other forms of exercise were similar to Pilates in the improvement of pain relief and functional capacity. [Conclusion] In patients with chronic low back pain, Pilates showed significant improvement in pain relief and functional enhancement. Other exercises showed effects similar to those of Pilates, if waist or torso movement was included and the exercises were performed for 20 cumulative hours. PMID:27821970

  17. The challenge for genetic epidemiologists: how to analyze large numbers of SNPs in relation to complex diseases.

    PubMed

    Heidema, A Geert; Boer, Jolanda M A; Nagelkerke, Nico; Mariman, Edwin C M; van der A, Daphne L; Feskens, Edith J M

    2006-04-21

    Genetic epidemiologists have taken on the challenge of identifying genetic polymorphisms involved in the development of diseases. Many have collected data on large numbers of genetic markers but are not familiar with available methods to assess their association with complex diseases. Statistical methods have been developed for analyzing the relation of large numbers of genetic and environmental predictors to disease or disease-related variables in genetic association studies. In this commentary we discuss logistic regression analysis; neural networks, including the parameter decreasing method (PDM) and genetic programming optimized neural networks (GPNN); and several non-parametric methods, which include the set association approach, the combinatorial partitioning method (CPM), the restricted partitioning method (RPM), the multifactor dimensionality reduction (MDR) method, and the random forests approach. The relative strengths and weaknesses of these methods are highlighted. Logistic regression and neural networks can handle only a limited number of predictor variables, depending on the number of observations in the dataset. They are therefore less useful than the non-parametric methods for association studies with large numbers of predictor variables. GPNN, on the other hand, may be a useful approach for selecting and modeling important predictors, but its ability to select the important effects in the presence of large numbers of predictors needs to be examined. Both the set association approach and the random forests approach can handle a large number of predictors and are useful in reducing these predictors to a subset with an important contribution to disease. The combinatorial methods give more insight into combination patterns for sets of genetic and/or environmental predictor variables that may be related to the outcome variable. As the non-parametric methods have different strengths and weaknesses, we conclude that for genetic association studies using the case-control design, applying a combination of several methods, including the set association approach, MDR, and the random forests approach, is likely to be a useful strategy for finding the important genes and interaction patterns involved in complex diseases.
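    As a concrete illustration of the random forests approach on SNP-style data, the sketch below simulates a small case-control set with one causal marker and ranks markers by impurity-based importance. scikit-learn stands in for the tools the commentary surveys; the disease model, effect size, and marker count are invented for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n, p = 400, 20                        # subjects, SNP markers
X = rng.integers(0, 3, size=(n, p))   # genotypes coded 0/1/2

# hypothetical disease model: only SNP 3 raises case probability
logit = -1.0 + 1.2 * X[:, 3]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
ranked = np.argsort(rf.feature_importances_)[::-1]
# the causal marker should rank near the top of `ranked`
```

With thousands of markers the same importance ranking is commonly used as a screening step before fitting more interpretable models to the retained subset.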

  18. A Multilevel AR(1) Model: Allowing for Inter-Individual Differences in Trait-Scores, Inertia, and Innovation Variance.

    PubMed

    Jongerling, Joran; Laurenceau, Jean-Philippe; Hamaker, Ellen L

    2015-01-01

    In this article we consider a multilevel first-order autoregressive [AR(1)] model with random intercepts, random autoregression, and random innovation variance (i.e., the level 1 residual variance). Including random innovation variance is an important extension of the multilevel AR(1) model for two reasons. First, between-person differences in innovation variance are important from a substantive point of view, in that they capture differences in sensitivity and/or exposure to unmeasured internal and external factors that influence the process. Second, using simulation methods we show that modeling the innovation variance as fixed across individuals, when it should be modeled as a random effect, leads to biased parameter estimates. Additionally, we use simulation methods to compare maximum likelihood estimation to Bayesian estimation of the multilevel AR(1) model and investigate the trade-off between the number of individuals and the number of time points. We provide an empirical illustration by applying the extended multilevel AR(1) model to daily positive affect ratings from 89 married women over the course of 42 consecutive days.
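    The model's three person-level random effects can be made concrete with a small simulator. The population values below are illustrative, not the article's estimates; the dimensions echo the empirical example of 89 women over 42 days.

```python
import numpy as np

def simulate_multilevel_ar1(n_persons=89, n_days=42, seed=1):
    """Simulate the extended multilevel AR(1) model: each person i has a
    random trait score mu_i (intercept), random inertia phi_i
    (autoregression), and a random log-normal innovation variance."""
    rng = np.random.default_rng(seed)
    mu = rng.normal(3.0, 0.5, n_persons)                          # random intercepts
    phi = np.clip(rng.normal(0.3, 0.1, n_persons), -0.95, 0.95)   # random AR(1)
    log_s2 = rng.normal(np.log(0.4), 0.3, n_persons)              # random innovation var
    y = np.empty((n_persons, n_days))
    for i in range(n_persons):
        sd = np.sqrt(np.exp(log_s2[i]))
        # start each series at its stationary distribution
        y[i, 0] = mu[i] + rng.normal(0, sd / np.sqrt(1 - phi[i] ** 2))
        for t in range(1, n_days):
            y[i, t] = mu[i] + phi[i] * (y[i, t - 1] - mu[i]) + rng.normal(0, sd)
    return y

y = simulate_multilevel_ar1()   # shape (89, 42): persons x days
```

Fixing `log_s2` to a single value for everyone reproduces the restricted model whose biased estimates the simulations in the article demonstrate.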

  19. Application of Poisson random effect models for highway network screening.

    PubMed

    Jiang, Ximiao; Abdel-Aty, Mohamed; Alamili, Samer

    2014-02-01

    In recent years, Bayesian random effect models that account for the temporal and spatial correlations of crash data have become popular in traffic safety research. This study employs random effect Poisson Log-Normal models for crash risk hotspot identification. Both the temporal and spatial correlations of crash data were considered. The Potential for Safety Improvement (PSI) was adopted as a measure of crash risk. Using the fatal and injury crashes that occurred on urban 4-lane divided arterials from 2006 to 2009 in the Central Florida area, the random effect approaches were compared to the traditional Empirical Bayesian (EB) method and the conventional Bayesian Poisson Log-Normal model. A series of method examination tests was conducted to evaluate the performance of the different approaches. These tests include the previously developed site consistency test, method consistency test, total rank difference test, and modified total score test, as well as the newly proposed total safety performance measure difference test. Results show that the Bayesian Poisson model accounting for both temporal and spatial random effects (PTSRE) outperforms the model with only a temporal random effect, and both are superior to the conventional Poisson Log-Normal model (PLN) and the EB model in fitting the crash data. Additionally, the method evaluation tests indicate that the PTSRE model is significantly superior to the PLN model and the EB model in consistently identifying hotspots during successive time periods. The results suggest that the PTSRE model is a superior alternative for road site crash risk hotspot identification. Copyright © 2013 Elsevier Ltd. All rights reserved.
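    The PSI screening measure, in its simplest empirical Bayes form, compares the EB-adjusted expected crash count at a site with the count predicted for similar sites. The sketch below is the textbook EB baseline the paper compares against, not its spatiotemporal random-effect models; the overdispersion-based shrinkage weight is the standard negative-binomial form.

```python
import numpy as np

def psi_empirical_bayes(obs, pred, overdispersion):
    """Potential for Safety Improvement for hotspot screening:
    EB-adjusted expected crashes minus the crashes predicted by a
    safety performance function for similar sites.  A positive PSI
    flags a candidate hotspot."""
    w = 1.0 / (1.0 + overdispersion * pred)   # EB shrinkage weight
    eb = w * pred + (1.0 - w) * obs           # EB expected crash count
    return eb - pred

# a site with 9 observed crashes where 5 are predicted
psi = psi_empirical_bayes(np.array([9.0]), np.array([5.0]), 0.2)
```

Ranking sites by PSI rather than by raw counts damps the regression-to-the-mean effect that makes naive rankings unstable across successive time periods.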

  20. Randomized trials published in some Chinese journals: how many are randomized?

    PubMed

    Wu, Taixiang; Li, Youping; Bian, Zhaoxiang; Liu, Guanjian; Moher, David

    2009-07-02

    The approximately 1100 medical journals now active in China are publishing a rapidly increasing number of research reports, including many studies identified by their authors as randomized controlled trials. It has been noticed that these reports mostly present positive results, and their quality and authenticity have consequently been called into question. We investigated the adequacy of randomization of clinical trials published in recent years in China to determine how many of them met acceptable standards for allocating participants to treatment groups. The China National Knowledge Infrastructure electronic database was searched for reports of randomized controlled trials on 20 common diseases published from January 1994 to June 2005. From this sample, a subset of trials that appeared to have used randomization methods was selected. Twenty-one investigators trained in the relevant knowledge, communication skills and quality control issues interviewed the original authors of these trials about the participant randomization methods and related quality-control features of their trials. From an initial sample of 37,313 articles identified in the China National Knowledge Infrastructure database, we found 3137 apparent randomized controlled trials. Of these, 1452 were studies of conventional medicine (published in 411 journals) and 1685 were studies of traditional Chinese medicine (published in 352 journals). Interviews with the authors of 2235 of these reports revealed that only 207 studies adhered to accepted methodology for randomization and could on those grounds be deemed authentic randomized controlled trials (6.8%, 95% confidence interval 5.9-7.7). There was no statistically significant difference in the rate of authenticity between randomized controlled trials of traditional interventions and those of conventional interventions. 
Randomized controlled trials conducted at hospitals affiliated to medical universities were more likely to be authentic than trials conducted at level 3 and level 2 hospitals (relative risk 1.58, 95% confidence interval 1.18-2.13, and relative risk 14.42, 95% confidence interval 9.40-22.10, respectively). The likelihood of authenticity was higher in level 3 hospitals than in level 2 hospitals (relative risk 9.32, 95% confidence interval 5.83-14.89). All randomized controlled trials of pre-market drugs were authentic by our criteria. Of the trials conducted at university-affiliated hospitals, 56.3% were authentic (95% confidence interval 32.0-81.0). Most reports of randomized controlled trials published in some Chinese journals lacked an adequate description of randomization. Similarly, most so-called 'randomized controlled trials' were not genuinely randomized, owing to a lack of adequate understanding of rigorous clinical trial design on the part of the authors. All randomized controlled trials of pre-market drugs included in this research were authentic. Randomized controlled trials conducted by authors at high-level hospitals, especially hospitals affiliated to medical universities, had a higher rate of authenticity. That so many non-randomized trials were published as randomized controlled trials reflects the fact that peer review needs to be improved; a good-practice guide for peer review, including how to verify the authenticity of a study, urgently needs to be developed.

  1. Complete Numerical Solution of the Diffusion Equation of Random Genetic Drift

    PubMed Central

    Zhao, Lei; Yue, Xingye; Waxman, David

    2013-01-01

    A numerical method is presented to solve the diffusion equation for the random genetic drift that occurs at a single unlinked locus with two alleles. The method was designed to conserve probability, and the resulting numerical solution represents a probability distribution whose total probability is unity. We describe solutions of the diffusion equation whose total probability is unity as complete. Thus the numerical method introduced in this work produces complete solutions, and such solutions have the property that whenever fixation and loss can occur, they are automatically included within the solution. This feature demonstrates that the diffusion approximation can describe not only internal allele frequencies, but also the boundary frequencies zero and one. The numerical approach presented here constitutes a single inclusive framework from which to perform calculations for random genetic drift. It has a straightforward implementation, allowing it to be applied to a wide variety of problems, including those with time-dependent parameters, such as changing population sizes. As tests and illustrations of the numerical method, it is used to determine: (i) the probability density and time-dependent probability of fixation for a neutral locus in a population of constant size; (ii) the probability of fixation in the presence of selection; and (iii) the probability of fixation in the presence of selection and demographic change, the latter in the form of a changing population size. PMID:23749318
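    A minimal probability-conserving scheme of the kind described — finite-volume fluxes so that total probability stays exactly 1 while mass accumulates at the absorbing frequencies 0 and 1 — can be sketched as follows. Grid sizes, the explicit time stepping, and the neutral (no-selection) case are our choices, not the paper's discretization.

```python
import numpy as np

def solve_drift(x0=0.5, N=50, M=200, T=None, safety=0.4):
    """Neutral random genetic drift: dp/dt = d^2/dx^2 [ x(1-x)/(4N) p ].
    Probability masses per cell are updated from fluxes at cell faces,
    so total probability is conserved exactly; mass reaching the end
    cells approximates loss (x=0) and fixation (x=1) probabilities."""
    dx = 1.0 / M
    x = np.linspace(0.0, 1.0, M + 1)           # cell centres incl. boundaries
    V = x * (1.0 - x) / (4.0 * N)              # diffusion coefficient
    dt = safety * dx**2 / V.max()              # explicit stability limit
    T = 2.0 * N if T is None else T
    m = np.zeros(M + 1)
    m[int(round(x0 * M))] = 1.0                # all mass at initial frequency
    for _ in range(int(T / dt)):
        q = V * m / dx                         # q = V p  (density p = mass/dx)
        J = -(q[1:] - q[:-1]) / dx             # flux at interior cell faces
        m[:-1] -= dt * J                       # flux out through right face
        m[1:] += dt * J                        # flux in through left face
    return m

m = solve_drift(T=4 * 50)   # run for 4N generations at x0 = 0.5
# m.sum() stays 1; m[0] and m[-1] hold the loss and fixation probabilities
```

Because every face flux is added to one cell and subtracted from its neighbour, conservation holds to machine precision by construction, which is the "complete solution" property the abstract emphasizes.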

  2. Enhanced Precision Geolocation in 4G Wireless Networks

    DTIC Science & Technology

    2013-03-01

    years has implemented a National Emergency Warning System using text messages delivered to cell phones [5]. The November 1999 FCC E911 regulations...statistical theory of passive geolocation of emitters may be found in [18]. Papers that survey methods of geolocation applied to cell phones include [4...where to put the tower % n: which tower to place %randomTowers(obj,dispersion, seperation ): generates % random towers for the network % obj: the network

  3. Comparison of Random Forest and Parametric Imputation Models for Imputing Missing Data Using MICE: A CALIBER Study

    PubMed Central

    Shah, Anoop D.; Bartlett, Jonathan W.; Carpenter, James; Nicholas, Owen; Hemingway, Harry

    2014-01-01

    Multivariate imputation by chained equations (MICE) is commonly used for imputing missing data in epidemiologic research. The “true” imputation model may contain nonlinearities which are not included in default imputation models. Random forest imputation is a machine learning technique which can accommodate nonlinearities and interactions and does not require a particular regression model to be specified. We compared parametric MICE with a random forest-based MICE algorithm in 2 simulation studies. The first study used 1,000 random samples of 2,000 persons drawn from the 10,128 stable angina patients in the CALIBER database (Cardiovascular Disease Research using Linked Bespoke Studies and Electronic Records; 2001–2010) with complete data on all covariates. Variables were artificially made “missing at random,” and the bias and efficiency of parameter estimates obtained using different imputation methods were compared. Both MICE methods produced unbiased estimates of (log) hazard ratios, but random forest was more efficient and produced narrower confidence intervals. The second study used simulated data in which the partially observed variable depended on the fully observed variables in a nonlinear way. Parameter estimates were less biased using random forest MICE, and confidence interval coverage was better. This suggests that random forest imputation may be useful for imputing complex epidemiologic data sets in which some patients have missing data. PMID:24589914
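    A rough analogue of the comparison — chained-equations imputation with a random forest as the conditional model, on data with a nonlinear dependence like the second simulation study — can be sketched with scikit-learn. The data-generating model, missingness rate, and forest settings below are invented for the example; the authors' own software and simulation designs are not reproduced here.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x2 = x1**2 + 0.3 * rng.normal(size=n)       # nonlinear "true" model
X_true = np.column_stack([x1, x2])
X_obs = X_true.copy()
miss = rng.random(n) < 0.3                  # x2 missing at random
X_obs[miss, 1] = np.nan

# chained-equations imputation with a random forest conditional model
rf_mice = IterativeImputer(
    estimator=RandomForestRegressor(n_estimators=50, random_state=0),
    max_iter=5, random_state=0)
X_rf = rf_mice.fit_transform(X_obs)

# naive mean imputation as a baseline
X_mean = X_obs.copy()
X_mean[miss, 1] = np.nanmean(X_obs[:, 1])

rmse = lambda Z: np.sqrt(np.mean((Z[miss, 1] - X_true[miss, 1]) ** 2))
# on this nonlinear data the forest-based imputations should beat the mean
```

A default linear imputation model would miss the quadratic dependence entirely, which is the failure mode the second simulation study targets.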

  5. Creating ensembles of decision trees through sampling

    DOEpatents

    Kamath, Chandrika; Cantu-Paz, Erick

    2005-08-30

    A system for decision tree ensembles that includes a module to read the data, a module to sort the data, a module to evaluate a potential split of the data according to some criterion using a random sample of the data, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method is based on statistical sampling techniques and includes the steps of reading the data; sorting the data; evaluating a potential split according to some criterion using a random sample of the data, splitting the data, and combining multiple decision trees in ensembles.
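    The core idea — evaluating a candidate split criterion on a random sample of the data rather than on the full set — can be sketched with a plain Gini-based split search. This is illustrative only, not the patented implementation; the threshold search and sample size are our choices.

```python
import numpy as np

def gini(y):
    """Gini impurity of a 0/1 label vector."""
    if len(y) == 0:
        return 0.0
    p = y.mean()
    return 2.0 * p * (1.0 - p)

def best_split(x, y):
    """Best threshold on feature x by weighted Gini impurity."""
    best_t, best_score = None, np.inf
    for t in np.unique(x)[:-1]:
        left, right = y[x <= t], y[x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if score < best_score:
            best_t, best_score = t, score
    return best_t

rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = (x > 0.5).astype(int)                     # true split at x = 0.5
sample = rng.choice(len(x), size=200, replace=False)
t_full = best_split(x, y)                     # split from all the data
t_samp = best_split(x[sample], y[sample])     # split from a random sample
# the sampled split lands close to the full-data split at far lower cost
```

Repeating the sampled search on bootstrap draws and combining the resulting trees gives the ensemble construction the patent describes.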

  6. Speckle lithography for fabricating Gaussian, quasi-random 2D structures and black silicon structures

    PubMed Central

    Bingi, Jayachandra; Murukeshan, Vadakke Matham

    2015-01-01

    A laser speckle pattern is a granular structure formed by the interference of random coherent wavelets and is generally treated as noise in optical systems, including photolithography. Contrary to this, in this paper we use the speckle pattern to generate predictable and controlled Gaussian random structures and quasi-random structures photo-lithographically. The random structures made using the proposed speckle lithography technique are quantified based on speckle statistics, the radial distribution function (RDF), and the fast Fourier transform (FFT). Control over the speckle size, density, and clustering facilitates the successful fabrication of black silicon with different surface structures. The controllability and tunability of the randomness make this technique a robust method for fabricating predictable 2D Gaussian random structures and black silicon structures. These structures can significantly enhance light trapping in solar cells and hence enable improved energy harvesting. Further, this technique can enable efficient fabrication of disordered photonic structures and random-media-based devices. PMID:26679513
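    Numerically, a fully developed speckle pattern of the kind exploited here can be generated by propagating a random-phase aperture to the far field; the aperture radius below controls the mean speckle size. The parameters are illustrative, not those of the reported lithography setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n, aperture = 256, 32
# random phase over a circular aperture, zero field outside it
yy, xx = np.mgrid[:n, :n]
pupil = (xx - n // 2) ** 2 + (yy - n // 2) ** 2 < aperture ** 2
field = np.where(pupil, np.exp(2j * np.pi * rng.random((n, n))), 0)

speckle = np.abs(np.fft.fft2(field)) ** 2     # far-field intensity
contrast = speckle.std() / speckle.mean()     # ~1 for fully developed speckle
```

Shrinking the aperture enlarges the speckle grains, which is the size/density control the abstract links to the different black-silicon surface structures.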

  7. Statistical parameters of random heterogeneity estimated by analysing coda waves based on finite difference method

    NASA Astrophysics Data System (ADS)

    Emoto, K.; Saito, T.; Shiomi, K.

    2017-12-01

    Short-period (<1 s) seismograms are strongly affected by small-scale (<10 km) heterogeneities in the lithosphere. In general, short-period seismograms are analysed with statistical methods that consider the interaction between seismic waves and randomly distributed small-scale heterogeneities. The statistical properties of these random heterogeneities have been estimated by analysing short-period seismograms. However, small-scale random heterogeneity is generally not taken into account when modelling long-period (>2 s) seismograms. We found that the energy of the coda of long-period seismograms shows a spatially flat distribution. This phenomenon is well known in short-period seismograms and results from scattering by small-scale heterogeneities. We estimate the statistical parameters that characterize the small-scale random heterogeneity by modelling the spatiotemporal energy distribution of long-period seismograms. We analyse three moderate-size earthquakes that occurred in southwest Japan. We calculate the spatial distribution of the energy density recorded by a dense seismograph network in Japan at the period bands of 8-16 s, 4-8 s and 2-4 s and model them using 3-D finite difference (FD) simulations. Compared to conventional methods based on statistical theories, the FD simulation yields more realistic synthetics: it is not necessary to assume a uniform background velocity, a separation into body or surface waves, or the scattering properties assumed in general scattering theories. By taking the ratio of the energy in the coda area to that in the entire area, we can separately estimate the scattering and the intrinsic absorption effects. Our result reveals the spectrum of the random inhomogeneity over a wide wavenumber range, including the intensity around the corner wavenumber, as P(m) = 8πε²a³/(1 + a²m²)², where ε = 0.05 and a = 3.1 km, even though past studies analysing higher-frequency records could not detect the corner.
Finally, we estimate the intrinsic attenuation by modelling the decay rate of the energy. The method proposed in this study is suitable for quantifying the statistical properties of long-wavelength subsurface random inhomogeneity, which leads the way to characterizing a wider wavenumber range of spectra, including the corner wavenumber.
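    The reported spectrum can be evaluated directly; the corner wavenumber sits at m ≈ 1/a, beyond which P(m) rolls off as m⁻⁴. The wavenumber grid below is our illustrative choice.

```python
import numpy as np

def P(m, eps=0.05, a=3.1):
    """Power spectrum of the random inhomogeneity reported in the
    abstract: P(m) = 8*pi*eps^2*a^3 / (1 + a^2 m^2)^2, m in 1/km."""
    return 8.0 * np.pi * eps**2 * a**3 / (1.0 + a**2 * m**2) ** 2

m = np.logspace(-2, 1, 200)   # wavenumbers spanning the corner at 1/a
spec = P(m)
# below the corner the spectrum is flat; well above it, P drops as m**-4
```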

  8. Evaluation of a Class of Simple and Effective Uncertainty Methods for Sparse Samples of Random Variables and Functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romero, Vicente; Bonney, Matthew; Schroeder, Benjamin

    When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: central 95% of response; and 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depends on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
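    The flavor of the report's trials — how reliably a sparse-sample bound covers the true central 95% of the response — can be reproduced in miniature. The naive bound below (the sample min/max of n draws) is our illustrative strawman demonstrating the under-estimation problem; the report's actual estimators are more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(0)
q_lo, q_hi = -1.959964, 1.959964      # central 95% of a standard normal
trials = 10_000                        # random trials, echoing the report

def range_coverage(n):
    """Fraction of trials in which the sample min/max of n draws bound
    the true central-95% interval of the source distribution."""
    x = rng.normal(size=(trials, n))
    return np.mean((x.min(axis=1) <= q_lo) & (x.max(axis=1) >= q_hi))

cov = {n: range_coverage(n) for n in (5, 10, 30, 100)}
# with sparse samples, coverage is far below any reasonable target
```

The steep climb of coverage with n makes concrete the reliability-versus-cost tradeoff the report quantifies with its multi-attribute metrics.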

  9. Simulating propagation of coherent light in random media using the Fredholm type integral equation

    NASA Astrophysics Data System (ADS)

    Kraszewski, Maciej; Pluciński, Jerzy

    2017-06-01

    Studying the propagation of light in random scattering materials is important for both basic and applied research. Such studies often require numerical methods for simulating the behavior of light beams in random media. However, if such simulations must account for the coherence properties of light, they may become complex numerical problems. There are well-established methods for simulating multiple scattering of light (e.g. radiative transfer theory and Monte Carlo methods), but they do not treat the coherence properties of light directly. Some variations of these methods allow the behavior of coherent light to be predicted, but only for an averaged realization of the scattering medium. This limits their application in studying many physical phenomena connected to a specific distribution of scattering particles (e.g. laser speckle). In general, numerical simulation of coherent light propagation in a specific realization of a random medium is a time- and memory-consuming problem. The goal of the presented research was to develop a new, efficient method for solving this problem. The method, presented in our earlier works, is based on solving the Fredholm-type integral equation that describes the multiple light scattering process. This equation can be discretized and solved numerically using various algorithms, e.g. by directly solving the corresponding linear system, or by using iterative or Monte Carlo solvers. Here we present recent developments of this method, including its comparison with well-known analytical results and finite-difference-type simulations. We also present an extension of the method to problems of multiple scattering of polarized light by large spherical particles, which joins the presented mathematical formalism with Mie theory.
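    The equation-solving step can be sketched with a Nyström discretization: a quadrature rule turns a Fredholm equation of the second kind into a linear system, solvable directly or iteratively, as the abstract notes. The kernel and forcing below are toy stand-ins for the scattering operator, not the authors' physical kernel.

```python
import numpy as np

# Solve u(x) = f(x) + lam * integral_0^1 K(x,t) u(t) dt  (2nd kind)
n = 200
x = np.linspace(0.0, 1.0, n)
w = np.full(n, 1.0 / n)                          # simple rectangle rule
lam = 0.5
K = np.exp(-np.abs(x[:, None] - x[None, :]))     # illustrative kernel
f = np.sin(np.pi * x)

# direct solve of the Nystrom system (I - lam * K * w) u = f
u = np.linalg.solve(np.eye(n) - lam * K * w, f)

# iterative (Neumann-series) alternative to the direct solve
u_it = f.copy()
for _ in range(100):
    u_it = f + lam * (K * w) @ u_it
```

The iteration converges here because the scaled operator is a contraction; for stronger scattering (larger `lam`), the direct or Monte Carlo solvers the abstract mentions become the practical options.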

  10. Combined Uncertainty and A-Posteriori Error Bound Estimates for General CFD Calculations: Theory and Software Implementation

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.

    2014-01-01

    This workshop presentation discusses the design and implementation of numerical methods for the quantification of statistical uncertainty, including a-posteriori error bounds, for output quantities computed using CFD methods. Hydrodynamic realizations often contain numerical error arising from finite-dimensional approximation (e.g. numerical methods using grids, basis functions, particles) and statistical uncertainty arising from incomplete information and/or statistical characterization of model parameters and random fields. The first task at hand is to derive formal error bounds for statistics given realizations containing finite-dimensional numerical error [1]. The error in computed output statistics contains contributions from both realization error and the error resulting from the calculation of statistics integrals using a numerical method. A second task is to devise computable a-posteriori error bounds by numerically approximating all terms arising in the error bound estimates. For the same reason that CFD calculations including error bounds but omitting uncertainty modeling are only of limited value, CFD calculations including uncertainty modeling but omitting error bounds are only of limited value. To gain maximum value from CFD calculations, a general software package for uncertainty quantification with quantified error bounds has been developed at NASA. The package provides implementations for a suite of numerical methods used in uncertainty quantification: Dense tensorization basis methods [3] and a subscale recovery variant [1] for non-smooth data, Sparse tensorization methods[2] utilizing node-nested hierarchies, Sampling methods[4] for high-dimensional random variable spaces.

  11. Randomized Multicenter Feasibility Trial of Myofascial Physical Therapy for Treatment of Urologic Chronic Pelvic Pain Syndrome

    PubMed Central

    FitzGerald, Mary P; Anderson, Rodney U; Potts, Jeannette; Payne, Christopher K; Peters, Kenneth M; Clemens, J Quentin; Kotarinos, Rhonda; Fraser, Laura; Cosby, Annamarie; Fortman, Carole; Neville, Cynthia; Badillo, Suzanne; Odabachian, Lisa; Sanfield, Anna; O’Dougherty, Betsy; Halle-Podell, Rick; Cen, Liyi; Chuai, Shannon; Landis, J Richard; Kusek, John W; Nyberg, Leroy M

    2010-01-01

    Objectives To determine the feasibility of conducting a randomized clinical trial designed to compare two methods of manual therapy (myofascial physical therapy (MPT) and global therapeutic massage (GTM)) among patients with urologic chronic pelvic pain syndromes. Materials and Methods Our goal was to recruit 48 subjects with chronic prostatitis/chronic pelvic pain syndrome or interstitial cystitis/painful bladder syndrome at six clinical centers. Eligible patients were randomized to either MPT or GTM and were scheduled to receive up to 10 weekly treatments, each 1 hour in duration. Criteria to assess feasibility included adherence of therapists to prescribed therapeutic protocol as determined by records of treatment, adverse events which occurred during study treatment, and rate of response to therapy as assessed by the Patient Global Response Assessment (GRA). Primary outcome analysis compared response rates between treatment arms using Mantel-Haenszel methods. Results Twenty-three (49%) men and 24 (51%) women were randomized over a six month period. Twenty-four (51%) patients were randomized to GTM, 23 (49%) to MPT; 44 (94%) patients completed the study. Therapist adherence to the treatment protocols was excellent. The GRA response rate of 57% in the MPT group was significantly higher than the rate of 21% in the GTM treatment group (p=0.03). Conclusions The goals to judge feasibility of conducting a full-scale trial of physical therapy methods were met. The preliminary findings of a beneficial effect of MPT warrants further study. PMID:19535099

  12. The feasibility of a randomized controlled trial of esophagectomy for esophageal cancer - the ROMIO (Randomized Oesophagectomy: Minimally Invasive or Open) study: protocol for a randomized controlled trial

    PubMed Central

    2014-01-01

    Background There is a need for evidence of the clinical effectiveness of minimally invasive surgery for the treatment of esophageal cancer, but randomized controlled trials in surgery are often difficult to conduct. The ROMIO (Randomized Open or Minimally Invasive Oesophagectomy) study will establish the feasibility of a main trial which will examine the clinical and cost-effectiveness of minimally invasive and open surgical procedures for the treatment of esophageal cancer. Methods/Design A pilot randomized controlled trial (RCT), in two centers (University Hospitals Bristol NHS Foundation Trust and Plymouth Hospitals NHS Trust) will examine numbers of incident and eligible patients who consent to participate in the ROMIO study. Interventions will include esophagectomy by: (1) open gastric mobilization and right thoracotomy, (2) laparoscopic gastric mobilization and right thoracotomy, and (3) totally minimally invasive surgery (in the Bristol center only). The primary outcomes of the feasibility study will be measures of recruitment, successful development of methods to monitor quality of surgery and fidelity to a surgical protocol, and development of a core outcome set to evaluate esophageal cancer surgery. The study will test patient-reported outcomes measures to assess recovery, methods to blind participants, assessments of surgical morbidity, and methods to capture cost and resource use. ROMIO will integrate methods to monitor and improve recruitment using audio recordings of consultations between recruiting surgeons, nurses, and patients to provide feedback for recruiting staff. Discussion The ROMIO study aims to establish efficient methods to undertake a main trial of minimally invasive surgery versus open surgery for esophageal cancer. 
Trial registration The pilot trial has Current Controlled Trials registration number ISRCTN59036820(25/02/2013) at http://www.controlled-trials.com; the ROMIO trial record at that site gives a link to the original version of the study protocol. PMID:24888266

  13. Generating and using truly random quantum states in Mathematica

    NASA Astrophysics Data System (ADS)

    Miszczak, Jarosław Adam

    2012-01-01

    The problem of generating random quantum states is of great interest from the quantum information theory point of view. In this paper we present a package for the Mathematica computing system harnessing a specific piece of hardware, namely the Quantis quantum random number generator (QRNG), for investigating statistical properties of quantum states. The described package implements a number of functions for generating random states, which use the Quantis QRNG as a source of randomness. It also provides procedures which can be used in simulations not related directly to quantum information processing.
    Program summary
    Program title: TRQS
    Catalogue identifier: AEKA_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKA_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 7924
    No. of bytes in distributed program, including test data, etc.: 88 651
    Distribution format: tar.gz
    Programming language: Mathematica, C
    Computer: Requires a Quantis quantum random number generator (QRNG, http://www.idquantique.com/true-random-number-generator/products-overview.html) and a recent version of Mathematica
    Operating system: Any platform supporting Mathematica; tested with GNU/Linux (32 and 64 bit)
    RAM: Case dependent
    Classification: 4.15
    Nature of problem: Generation of random density matrices.
    Solution method: Use of a physical quantum random number generator.
    Running time: Generating 100 random numbers takes about 1 second; generating 1000 random density matrices takes more than a minute.
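
The package draws its randomness from the Quantis hardware QRNG. As an illustration of the underlying recipe only (not the TRQS implementation), a random density matrix can be sketched in pure Python by normalizing a Ginibre matrix product G G†, with the pseudo-random `random.gauss` standing in for the hardware source:

```python
import random

def random_density_matrix(n, rng=random):
    """Sketch: rho = G G-dagger / Tr(G G-dagger), where G is an n x n
    Ginibre matrix (i.i.d. standard complex Gaussian entries).
    The result is Hermitian, positive semidefinite, and has unit trace.
    TRQS uses a hardware QRNG as the entropy source; Python's PRNG
    stands in here."""
    G = [[complex(rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0))
          for _ in range(n)] for _ in range(n)]
    # rho_ij = sum_k G_ik * conj(G_jk)
    rho = [[sum(G[i][k] * G[j][k].conjugate() for k in range(n))
            for j in range(n)] for i in range(n)]
    tr = sum(rho[i][i].real for i in range(n))
    return [[rho[i][j] / tr for j in range(n)] for i in range(n)]

rho = random_density_matrix(4)
```

Drawing G from the Ginibre ensemble and normalizing induces the Hilbert-Schmidt measure on density matrices, which is one of the standard choices for "uniformly random" mixed states.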

  14. Essential energy space random walk via energy space metadynamics method to accelerate molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Li, Hongzhi; Min, Donghong; Liu, Yusong; Yang, Wei

    2007-09-01

    To overcome the possible pseudoergodicity problem, molecular dynamic simulation can be accelerated via the realization of an energy space random walk. To achieve this, a biased free energy function (BFEF) needs to be priori obtained. Although the quality of BFEF is essential for sampling efficiency, its generation is usually tedious and nontrivial. In this work, we present an energy space metadynamics algorithm to efficiently and robustly obtain BFEFs. Moreover, in order to deal with the associated diffusion sampling problem caused by the random walk in the total energy space, the idea in the original umbrella sampling method is generalized to be the random walk in the essential energy space, which only includes the energy terms determining the conformation of a region of interest. This essential energy space generalization allows the realization of efficient localized enhanced sampling and also offers the possibility of further sampling efficiency improvement when high frequency energy terms irrelevant to the target events are free of activation. The energy space metadynamics method and its generalization in the essential energy space for the molecular dynamics acceleration are demonstrated in the simulation of a pentanelike system, the blocked alanine dipeptide model, and the leucine model.

  15. Solving large test-day models by iteration on data and preconditioned conjugate gradient.

    PubMed

    Lidauer, M; Strandén, I; Mäntysaari, E A; Pösö, J; Kettunen, A

    1999-12-01

    A preconditioned conjugate gradient method was implemented into an iteration-on-data program for estimation of breeding values, and its convergence characteristics were studied. An algorithm was used as a reference in which one fixed effect was solved by the Gauss-Seidel method, and other effects were solved by a second-order Jacobi method. Implementation of the preconditioned conjugate gradient required storing four vectors (size equal to number of unknowns in the mixed model equations) in random access memory and reading the data at each round of iteration. The preconditioner comprised diagonal blocks of the coefficient matrix. Comparison of algorithms was based on solutions of mixed model equations obtained by a single-trait animal model and a single-trait, random regression test-day model. Data sets for both models used milk yield records of primiparous Finnish dairy cows. Animal model data comprised 665,629 lactation milk yields, and random regression test-day model data comprised 6,732,765 test-day milk yields. Both models included pedigree information of 1,099,622 animals. The animal model [random regression test-day model] required 122 [305] rounds of iteration to converge with the reference algorithm, but only 88 [149] were required with the preconditioned conjugate gradient. To solve the random regression test-day model with the preconditioned conjugate gradient required 237 megabytes of random access memory and took 14% of the computation time needed by the reference algorithm.
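
The solver core described above can be sketched as follows, with a plain diagonal (Jacobi) preconditioner standing in for the block-diagonal preconditioner the abstract describes, and a tiny symmetric positive definite system in place of the mixed model equations:

```python
import math

def pcg(A, b, tol=1e-10, max_iter=200):
    """Preconditioned conjugate gradient for a symmetric positive
    definite system A x = b.  Sketch only: the preconditioner here is
    the diagonal of A (Jacobi), not the diagonal blocks used in the
    paper, and A is a dense toy matrix rather than data read per
    iteration round."""
    n = len(b)
    M_inv = [1.0 / A[i][i] for i in range(n)]  # diagonal preconditioner
    x = [0.0] * n
    r = b[:]                                   # residual b - A x, x = 0
    z = [M_inv[i] * r[i] for i in range(n)]    # preconditioned residual
    p = z[:]
    rz = sum(r[i] * z[i] for i in range(n))
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rz / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        if math.sqrt(sum(ri * ri for ri in r)) < tol:
            break
        z = [M_inv[i] * r[i] for i in range(n)]
        rz_new = sum(r[i] * z[i] for i in range(n))
        p = [z[i] + (rz_new / rz) * p[i] for i in range(n)]
        rz = rz_new
    return x

A = [[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]
b = [1.0, 2.0, 3.0]
x = pcg(A, b)
```

Only four working vectors (x, r, z, p) of length equal to the number of unknowns are kept, which matches the memory profile reported in the abstract.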

  16. Detector Design Considerations in High-Dimensional Artificial Immune Systems

    DTIC Science & Technology

    2012-03-22

    a method known as randomized RNS [15]. In this approach, Monte Carlo integration is used to determine the size of self and non-self within the given...feature space, then a number of randomly placed detectors are chosen according to Monte Carlo integration calculations. Simulated annealing is then...detector is only counted once). This value is termed ‘actual content’ because it does not include overlapping content, but only that content that is

  17. Omega-3/Omega-6 Fatty Acids for Attention Deficit Hyperactivity Disorder: A Randomized Placebo-Controlled Trial in Children and Adolescents

    ERIC Educational Resources Information Center

    Johnson, Mats; Ostlund, Sven; Fransson, Gunnar; Kadesjo, Bjorn; Gillberg, Christopher

    2009-01-01

    Objective: The aim of the study was to assess omega 3/6 fatty acids (eye q) in attention deficit hyperactivity disorder (ADHD). Method: The study included a randomized, 3-month, omega 3/6 placebo-controlled, one-way crossover trial with 75 children and adolescents (8-18 years), followed by 3 months with omega 3/6 for all. Investigator-rated ADHD…

  18. Research Methods in Healthcare Epidemiology and Antimicrobial Stewardship – Quasi-Experimental Designs

    PubMed Central

    Schweizer, Marin L.; Braun, Barbara I.; Milstone, Aaron M.

    2016-01-01

    Quasi-experimental studies evaluate the association between an intervention and an outcome using experiments in which the intervention is not randomly assigned. Quasi-experimental studies are often used to evaluate rapid responses to outbreaks or other patient safety problems requiring prompt non-randomized interventions. Quasi-experimental studies can be categorized into three major types: interrupted time series designs, designs with control groups, and designs without control groups. This methods paper highlights key considerations for quasi-experimental studies in healthcare epidemiology and antimicrobial stewardship including study design and analytic approaches to avoid selection bias and other common pitfalls of quasi-experimental studies. PMID:27267457

  19. Sparse sampling and reconstruction for electron and scanning probe microscope imaging

    DOEpatents

    Anderson, Hyrum; Helms, Jovana; Wheeler, Jason W.; Larson, Kurt W.; Rohrer, Brandon R.

    2015-07-28

    Systems and methods for conducting electron or scanning probe microscopy are provided herein. In a general embodiment, the systems and methods for conducting electron or scanning probe microscopy with an undersampled data set include: driving an electron beam or probe to scan across a sample and visit a subset of pixel locations of the sample that are randomly or pseudo-randomly designated; determining actual pixel locations on the sample that are visited by the electron beam or probe; and processing data collected by detectors from the visits of the electron beam or probe at the actual pixel locations and recovering a reconstructed image of the sample.
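
The sampling-and-recovery loop above can be sketched as follows. This is an illustration of the idea only, not the patented method: the pixel values, image size, and nearest-neighbour fill are all placeholders (a real system would use a sparse-reconstruction solver for the recovery step):

```python
import random

def sample_pixels(width, height, fraction, seed=0):
    """Pseudo-randomly designate a subset of pixel locations for the
    beam or probe to visit, as in undersampled scanning."""
    rng = random.Random(seed)
    all_pixels = [(x, y) for x in range(width) for y in range(height)]
    return rng.sample(all_pixels, int(width * height * fraction))

def reconstruct(width, height, measured):
    """Toy recovery step: nearest-neighbour fill from the visited
    pixels (brute force, for illustration only)."""
    img = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            px, py = min(measured, key=lambda p: (p[0] - x) ** 2 + (p[1] - y) ** 2)
            img[y][x] = measured[(px, py)]
    return img

truth = lambda x, y: float(x + y)     # stand-in for the specimen image
pts = sample_pixels(8, 8, 0.25)       # visit only 25% of the 64 pixels
measured = {p: truth(*p) for p in pts}
img = reconstruct(8, 8, measured)
```

Because only a fraction of pixel locations are visited, dose and acquisition time drop proportionally; the quality of the recovered image then depends entirely on the reconstruction step.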

  20. Multiple ECG Fiducial Points-Based Random Binary Sequence Generation for Securing Wireless Body Area Networks.

    PubMed

    Zheng, Guanglou; Fang, Gengfa; Shankaran, Rajan; Orgun, Mehmet A; Zhou, Jie; Qiao, Li; Saleem, Kashif

    2017-05-01

    Generating random binary sequences (BSes) is a fundamental requirement in cryptography. A BS is a sequence of N bits, and each bit has a value of 0 or 1. For securing sensors within wireless body area networks (WBANs), electrocardiogram (ECG)-based BS generation methods have been widely investigated in which interpulse intervals (IPIs) from each heartbeat cycle are processed to produce BSes. Using these IPI-based methods to generate a 128-bit BS in real time normally takes around half a minute. In order to improve the time efficiency of such methods, this paper presents an ECG multiple fiducial-points based binary sequence generation (MFBSG) algorithm. The technique of discrete wavelet transforms is employed to detect the arrival times of fiducial points such as the P, Q, R, S, and T peaks. Time intervals between them, including RR, RQ, RS, RP, and RT intervals, are then calculated from these arrival times and used as ECG features to generate random BSes with low latency. According to our analysis on real ECG data, these ECG feature values exhibit the property of randomness and, thus, can be utilized to generate random BSes. Compared with schemes that solely rely on IPIs to generate BSes, this MFBSG algorithm uses five feature values from one heartbeat cycle, and can be up to five times faster than the solely IPI-based methods. It thus achieves the design goal of low latency. According to our analysis, the complexity of the algorithm is comparable to that of fast Fourier transforms. These randomly generated ECG BSes can be used as security keys for encryption or authentication in a WBAN system.
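
The general idea of turning beat-to-beat interval features into bits can be sketched as follows. This is a simplified illustration, not the MFBSG encoding: the interval values are hypothetical, and keeping the least significant bits of a quantized interval is just one common way to harvest the physiological jitter:

```python
def intervals_to_bits(intervals_ms, bits_per_interval=4):
    """Quantize each inter-peak interval (in ms) and keep only its
    least significant bits, which fluctuate most from beat to beat.
    A sketch of the general approach; MFBSG's exact feature encoding
    differs."""
    bits = []
    for iv in intervals_ms:
        q = int(round(iv)) & ((1 << bits_per_interval) - 1)
        bits.extend((q >> k) & 1 for k in reversed(range(bits_per_interval)))
    return bits

# hypothetical RR, RQ, RS, RP, RT intervals (ms) from one heartbeat
bits = intervals_to_bits([812.0, 341.0, 57.0, 203.0, 446.0])  # 20 bits
```

With five intervals per heartbeat instead of one IPI, each beat contributes five times as many feature values, which is the source of the speed-up claimed in the abstract.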

  1. Strategies for efficient resolution analysis in full-waveform inversion

    NASA Astrophysics Data System (ADS)

    Fichtner, A.; van Leeuwen, T.; Trampert, J.

    2016-12-01

    Full-waveform inversion is developing into a standard method in the seismological toolbox. It combines numerical wave propagation for heterogeneous media with adjoint techniques in order to improve tomographic resolution. However, resolution becomes increasingly difficult to quantify because of the enormous computational requirements. Here we present two families of methods that can be used for efficient resolution analysis in full-waveform inversion. They are based on the targeted extraction of resolution proxies from the Hessian matrix, which is too large to store and to compute explicitly. Fourier methods rest on the application of the Hessian to Earth models with harmonic oscillations. This yields the Fourier spectrum of the Hessian for few selected wave numbers, from which we can extract properties of the tomographic point-spread function for any point in space. Random probing methods use uncorrelated, random test models instead of harmonic oscillations. Auto-correlating the Hessian-model applications for sufficiently many test models also characterises the point-spread function. Both Fourier and random probing methods provide a rich collection of resolution proxies. These include position- and direction-dependent resolution lengths, and the volume of point-spread functions as indicator of amplitude recovery and inter-parameter trade-offs. The computational requirements of these methods are equivalent to approximately 7 conjugate-gradient iterations in full-waveform inversion. This is significantly less than the optimisation itself, which may require tens to hundreds of iterations to reach convergence. In addition to the theoretical foundations of the Fourier and random probing methods, we show various illustrative examples from real-data full-waveform inversion for crustal and mantle structure.
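
The random probing idea, extracting properties of a matrix that is too large to form from matrix-vector products with random test vectors, can be illustrated with a Hutchinson-style diagonal estimator. This is a toy sketch of the principle, not the authors' procedure, and the 3x3 matrix stands in for a Hessian-vector product routine:

```python
import random

def probe_diagonal(apply_H, n, num_probes=200, seed=0):
    """Estimate diag(H) from matrix-vector products only:
    E[v * (H v)] = diag(H) when v has i.i.d. Rademacher (+/-1)
    entries, so H is never formed explicitly."""
    rng = random.Random(seed)
    acc = [0.0] * n
    for _ in range(num_probes):
        v = [rng.choice((-1.0, 1.0)) for _ in range(n)]
        Hv = apply_H(v)
        for i in range(n):
            acc[i] += v[i] * Hv[i]
    return [a / num_probes for a in acc]

# toy stand-in for the Hessian-vector product of an inversion code
H = [[3.0, 0.5, 0.0], [0.5, 2.0, 0.1], [0.0, 0.1, 1.0]]
apply_H = lambda v: [sum(H[i][j] * v[j] for j in range(3)) for i in range(3)]
diag_est = probe_diagonal(apply_H, 3)   # close to [3.0, 2.0, 1.0]
```

In full-waveform inversion each `apply_H` call costs a few wave simulations, which is why the abstract's budget of roughly 7 conjugate-gradient-iteration equivalents for the whole analysis is attractive.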

  2. MATIN: a random network coding based framework for high quality peer-to-peer live video streaming.

    PubMed

    Barekatain, Behrang; Khezrimotlagh, Dariush; Aizaini Maarof, Mohd; Ghaeini, Hamid Reza; Salleh, Shaharuddin; Quintana, Alfonso Ariza; Akbari, Behzad; Cabrera, Alicia Triviño

    2013-01-01

    In recent years, Random Network Coding (RNC) has emerged as a promising solution for efficient Peer-to-Peer (P2P) video multicasting over the Internet, probably because RNC noticeably increases the error resiliency and throughput of the network. However, the high transmission overhead arising from sending a large coefficients vector as the header has been the most important challenge of RNC. Moreover, because the Gauss-Jordan elimination method is employed, considerable computational complexity can be imposed on peers in decoding the encoded blocks and checking linear dependency among the coefficients vectors. In order to address these challenges, this study introduces MATIN, a random network coding based framework for efficient P2P video streaming. MATIN includes a novel coefficients matrix generation method which guarantees that there is no linear dependency in the generated coefficients matrix. Using the proposed framework, each peer encapsulates one coefficients entry instead of n into the generated encoded packet, which results in very low transmission overhead. It is also possible to obtain the inverted coefficients matrix using a small number of simple arithmetic operations. In this regard, peers incur very low computational complexity. As a result, MATIN permits random network coding to be more efficient in P2P video streaming systems. The results obtained from simulation using OMNET++ show that it substantially outperforms RNC based on the Gauss-Jordan elimination method by providing better video quality on peers in terms of four important performance metrics: video distortion, dependency distortion, end-to-end delay and initial startup delay.
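
For reference, the baseline that MATIN improves on, classic random linear network coding with Gauss-Jordan decoding, can be sketched over GF(2), where addition is XOR. This is a minimal illustration of the baseline scheme, not MATIN's single-entry encoding; blocks are represented as small integers:

```python
import random

def rank_gf2(rows, width):
    """Rank over GF(2) of a bit matrix; each row is an int of `width` bits."""
    rows, rank = list(rows), 0
    for col in reversed(range(width)):
        pivot = next((i for i in range(rank, len(rows))
                      if (rows[i] >> col) & 1), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for i in range(len(rows)):
            if i != rank and (rows[i] >> col) & 1:
                rows[i] ^= rows[rank]
        rank += 1
    return rank

def encode(blocks, seed=0):
    """Draw random coefficient rows until they are linearly independent
    over GF(2), then XOR-combine the source blocks accordingly."""
    rng, n = random.Random(seed), len(blocks)
    coeffs = []
    while rank_gf2(coeffs, n) < n:             # retry until invertible
        coeffs = [rng.randrange(1, 1 << n) for _ in range(n)]
    packets = []
    for row in coeffs:
        p = 0
        for j in range(n):
            if (row >> j) & 1:
                p ^= blocks[j]                 # XOR = addition in GF(2)
        packets.append(p)
    return coeffs, packets

def decode(coeffs, packets, n):
    """Gauss-Jordan elimination over GF(2) on the augmented rows
    [coefficients | payload]; this is the costly step MATIN avoids."""
    rows = [[c, p] for c, p in zip(coeffs, packets)]
    for col in range(n):
        pivot = next(i for i in range(col, n) if (rows[i][0] >> col) & 1)
        rows[col], rows[pivot] = rows[pivot], rows[col]
        for i in range(n):
            if i != col and (rows[i][0] >> col) & 1:
                rows[i][0] ^= rows[col][0]
                rows[i][1] ^= rows[col][1]
    return [rows[j][1] for j in range(n)]

blocks = [0b1010, 0b0111, 0b1100]              # three source blocks as ints
coeffs, packets = encode(blocks)
decoded = decode(coeffs, packets, len(blocks))
```

Note that each packet must carry its whole coefficient row as a header, which is exactly the transmission overhead, growing with n, that the abstract identifies.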

  3. An Evaluation of the Effectiveness of Recruitment Methods: The Staying Well after Depression Randomized Controlled Trial

    PubMed Central

    Krusche, Adele; Rudolf von Rohr, Isabelle; Muse, Kate; Duggan, Danielle; Crane, Catherine; Williams, J. Mark G.

    2014-01-01

    Background Randomized controlled trials (RCTs) are widely accepted as being the most efficient way of investigating the efficacy of psychological therapies. However, researchers conducting RCTs commonly report difficulties recruiting an adequate sample within planned timescales. In an effort to overcome recruitment difficulties, researchers often are forced to expand their recruitment criteria or extend the recruitment phase, thus increasing costs and delaying publication of results. Research investigating the effectiveness of recruitment strategies is limited and trials often fail to report sufficient details about the recruitment sources and resources utilised. Purpose We examined the efficacy of strategies implemented during the Staying Well after Depression RCT in Oxford to recruit participants with a history of recurrent depression. Methods We describe eight recruitment methods utilised and two further sources not initiated by the research team and examine their efficacy in terms of (i) the return, including the number of potential participants who contacted the trial and the number who were randomized into the trial, (ii) cost-effectiveness, comprising direct financial cost and manpower for initial contacts and randomized participants, and (iii) comparison of sociodemographic characteristics of individuals recruited from different sources. Results Poster advertising, web-based advertising and mental health worker referrals were the cheapest methods per randomized participant; however, the ratio of randomized participants to initial contacts differed markedly per source. Advertising online, via posters and on a local radio station were the most cost-effective recruitment methods for soliciting participants who subsequently were randomized into the trial. Advertising across many sources (saturation) was found to be important. 
Limitations It may not be feasible to employ all the recruitment methods used in this trial to obtain participation from other populations, such as those currently unwell, or in other geographical locations. Recruitment source was unavailable for participants who could not be reached after the initial contact. Thus, it is possible that the efficiency of certain methods of recruitment was poorer than estimated. Efficacy and costs of other recruitment initiatives, such as providing travel expenses to the in-person eligibility assessment and making follow-up telephone calls to candidates who contacted the recruitment team but could not be screened promptly, were not analysed. Conclusions Website advertising resulted in the highest number of randomized participants and was the second cheapest method of recruiting. Future research should evaluate the effectiveness of recruitment strategies for other samples to contribute to a comprehensive base of knowledge for future RCTs. PMID:24686105

  4. Reporting of participant flow diagrams in published reports of randomized trials

    PubMed Central

    2011-01-01

    Background Reporting of the flow of participants through each stage of a randomized trial is essential to assess the generalisability and validity of its results. We assessed the type and completeness of information reported in CONSORT (Consolidated Standards of Reporting Trials) flow diagrams published in current reports of randomized trials. Methods A cross sectional review of all primary reports of randomized trials which included a CONSORT flow diagram indexed in PubMed core clinical journals (2009). We assessed the proportion of parallel group trial publications reporting specific items recommended by CONSORT for inclusion in a flow diagram. Results Of 469 primary reports of randomized trials, 263 (56%) included a CONSORT flow diagram of which 89% (237/263) were published in a CONSORT endorsing journal. Reports published in CONSORT endorsing journals were more likely to include a flow diagram (62%; 237/380 versus 29%; 26/89). Ninety percent (236/263) of reports which included a flow diagram had a parallel group design, of which 49% (116/236) evaluated drug interventions, 58% (137/236) were multicentre, and 79% (187/236) compared two study groups, with a median sample size of 213 participants. Eighty-one percent (191/236) reported the overall number of participants assessed for eligibility, 71% (168/236) the number excluded prior to randomization and 98% (231/236) the overall number randomized. Reasons for exclusion prior to randomization were more poorly reported. Ninety-four percent (223/236) reported the number of participants allocated to each arm of the trial. However, only 40% (95/236) reported the number who actually received the allocated intervention, 67% (158/236) the number lost to follow up in each arm of the trial, 61% (145/236) whether participants discontinued the intervention during the trial and 54% (128/236) the number included in the main analysis. 
Conclusions Over half of published reports of randomized trials included a diagram showing the flow of participants through the trial. However, information was often missing from published flow diagrams, even in articles published in CONSORT endorsing journals. If important information is not reported it can be difficult and sometimes impossible to know if the conclusions of that trial are justified by the data presented. PMID:22141446

  5. Methods to Limit Attrition in Longitudinal Comparative Effectiveness Trials: Lessons from the Lithium Use for Bipolar Disorder (LiTMUS) Study

    PubMed Central

    Sylvia, Louisa G.; Reilly-Harrington, Noreen A.; Leon, Andrew C.; Kansky, Christine I.; Ketter, Terence A.; Calabrese, Joseph R.; Thase, Michael E.; Bowden, Charles L.; Friedman, Edward S.; Ostacher, Michael J.; Iosifescu, Dan V.; Severe, Joanne; Nierenberg, Andrew A.

    2013-01-01

    Background High attrition rates which occur frequently in longitudinal clinical trials of interventions for bipolar disorder limit the interpretation of results. Purpose The aim of this article is to present design approaches that limited attrition in the Lithium Use for Bipolar Disorder (LiTMUS) Study. Methods LiTMUS was a 6-month randomized, longitudinal multi-site comparative effectiveness trial that examined bipolar participants who were at least mildly ill. Participants were randomized to either low to moderate doses of lithium or no lithium, in addition to other treatments needed for mood stabilization administered in a guideline-informed, empirically supported, and personalized fashion (N=283). Results Components of the study design that may have contributed to the low attrition rate of the study included use of: (1) an intent-to-treat design; (2) a randomized adjunctive single-blind design; (3) participant reimbursement; (4) intent-to-attend the next study visit (includes a discussion of attendance obstacles when intention is low); (5) quality care with limited participant burden; and (6) target windows for study visits. Limitations Site differences and the effectiveness and tolerability data have not been analyzed yet. Conclusions These components of the LiTMUS study design may have reduced the probability of attrition which would inform the design of future randomized clinical effectiveness trials. PMID:22076437

  6. Patient satisfaction with different interpreting methods: a randomized controlled trial.

    PubMed

    Gany, Francesca; Leng, Jennifer; Shapiro, Ephraim; Abramson, David; Motola, Ivette; Shield, David C; Changrani, Jyotsna

    2007-11-01

    Growth of the foreign-born population in the U.S. has led to increasing numbers of limited-English-proficient (LEP) patients. Innovative medical interpreting strategies, including remote simultaneous medical interpreting (RSMI), have arisen to address the language barrier. This study evaluates the impact of interpreting method on patient satisfaction. 1,276 English-, Spanish-, Mandarin-, and Cantonese-speaking patients attending the primary care clinic and emergency department of a large New York City municipal hospital were screened for enrollment in a randomized controlled trial. Language-discordant patients were randomized to RSMI or usual and customary (U&C) interpreting. Patients with language-concordant providers received usual care. Demographic and patient satisfaction questionnaires were administered to all participants. 541 patients were language-concordant with their providers and not randomized; 371 were randomized to RSMI, 167 of whom were exposed to RSMI; and 364 were randomized to U&C, 198 of whom were exposed to U&C. Patients randomized to RSMI were more likely than those with U&C to think doctors treated them with respect (RSMI 71%, U&C 64%, p < 0.05), but they did not differ in other measures of physician communication/care. In a linear regression analysis, exposure to RSMI was significantly associated with an increase in overall satisfaction with physician communication/care (beta 0.10, 95% CI 0.02-0.18, scale 0-1.0). Patients randomized to RSMI were more likely to think the interpreting method protected their privacy (RSMI 51%, U&C 38%, p < 0.05). Patients randomized to either arm of interpretation reported less comprehension and satisfaction than patients in language-concordant encounters. While not a substitute for language-concordant providers, RSMI can improve patient satisfaction and privacy among LEP patients. 
Implementing RSMI should be considered an important component of a multipronged approach to addressing language barriers in health care.

  7. Marginalized zero-altered models for longitudinal count data.

    PubMed

    Tabb, Loni Philip; Tchetgen, Eric J Tchetgen; Wellenius, Greg A; Coull, Brent A

    2016-10-01

    Count data often exhibit more zeros than predicted by common count distributions like the Poisson or negative binomial. In recent years, there has been considerable interest in methods for analyzing zero-inflated count data in longitudinal or other correlated data settings. A common approach has been to extend zero-inflated Poisson models to include random effects that account for correlation among observations. However, these models have been shown to have a few drawbacks, including interpretability of regression coefficients and numerical instability of fitting algorithms even when the data arise from the assumed model. To address these issues, we propose a model that parameterizes the marginal associations between the count outcome and the covariates as easily interpretable log relative rates, while including random effects to account for correlation among observations. One of the main advantages of this marginal model is that it allows a basis upon which we can directly compare the performance of standard methods that ignore zero inflation with that of a method that explicitly takes zero inflation into account. We present simulations of these various model formulations in terms of bias and variance estimation. Finally, we apply the proposed approach to analyze toxicological data of the effect of emissions on cardiac arrhythmias.
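
The excess-zero phenomenon the paper addresses is easy to reproduce by simulation. A minimal sketch of the zero-inflated Poisson data-generating process (the proposed marginalized model itself is not implemented here):

```python
import math
import random

def poisson(lam, rng):
    """Knuth's multiplication method (the stdlib has no Poisson sampler)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def zip_sample(pi, lam, n, seed=0):
    """Zero-inflated Poisson: a structural zero with probability pi,
    otherwise an ordinary Poisson(lam) count."""
    rng = random.Random(seed)
    return [0 if rng.random() < pi else poisson(lam, rng) for _ in range(n)]

data = zip_sample(pi=0.3, lam=2.0, n=5000)
zero_frac = data.count(0) / len(data)
# expected zero fraction: pi + (1 - pi) * exp(-lam), about 0.395 here,
# versus exp(-2), about 0.135, for a plain Poisson(2): the "more zeros
# than predicted" the abstract describes
```

Fitting a plain Poisson model to such data underestimates the zero probability, which is what motivates the zero-inflated and zero-altered model families compared in the paper.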

  8. Marginalized zero-altered models for longitudinal count data

    PubMed Central

    Tabb, Loni Philip; Tchetgen, Eric J. Tchetgen; Wellenius, Greg A.; Coull, Brent A.

    2015-01-01

    Count data often exhibit more zeros than predicted by common count distributions like the Poisson or negative binomial. In recent years, there has been considerable interest in methods for analyzing zero-inflated count data in longitudinal or other correlated data settings. A common approach has been to extend zero-inflated Poisson models to include random effects that account for correlation among observations. However, these models have been shown to have a few drawbacks, including interpretability of regression coefficients and numerical instability of fitting algorithms even when the data arise from the assumed model. To address these issues, we propose a model that parameterizes the marginal associations between the count outcome and the covariates as easily interpretable log relative rates, while including random effects to account for correlation among observations. One of the main advantages of this marginal model is that it allows a basis upon which we can directly compare the performance of standard methods that ignore zero inflation with that of a method that explicitly takes zero inflation into account. We present simulations of these various model formulations in terms of bias and variance estimation. Finally, we apply the proposed approach to analyze toxicological data of the effect of emissions on cardiac arrhythmias. PMID:27867423

  9. Electronic Cigarettes for Smoking Cessation.

    PubMed

    Orellana-Barrios, Menfil A; Payne, Drew; Medrano-Juarez, Rita M; Yang, Shengping; Nugent, Kenneth

    2016-10-01

    The use of electronic cigarettes (e-cigarettes) is increasing, but their use as a smoking-cessation aid is controversial. The reporting of e-cigarette studies on cessation is variable and inconsistent. To date, only 1 randomized clinical trial has included an arm with other cessation methods (nicotine patches). The cessation rates for available clinical trials are difficult to compare given differing follow-up periods and broad ranges (4% at 12 months with non-nicotine e-cigarettes to 68% at 4 weeks with concomitant nicotine e-cigarettes and other cessation methods). The average combined abstinence rate for included prospective studies was 29.1% (combination of 6-18 months׳ rates). There are few comparable clinical trials and prospective studies related to e-cigarettes use for smoking cessation, despite an increasing number of citations. Larger randomized clinical trials are essential to determine whether e-cigarettes are effective smoking-cessation devices. Copyright © 2016 Southern Society for Clinical Investigation. Published by Elsevier Inc. All rights reserved.

  10. Investigation of random walks knee cartilage segmentation model using inter-observer reproducibility: Data from the osteoarthritis initiative.

    PubMed

    Hong-Seng, Gan; Sayuti, Khairil Amir; Karim, Ahmad Helmy Abdul

    2017-01-01

    Existing knee cartilage segmentation methods have several reported technical drawbacks. In essence, graph cuts remains highly susceptible to image noise despite extended research interest; the active shape model is often constrained by the selection of training data; and shortest-path methods exhibit a shortcut problem in the presence of weak boundaries, which are common in medical images. The aim of this study is to investigate the capability of random walks as a knee cartilage segmentation method. Experts scribble on the knee cartilage image to initialize random walks segmentation. Reproducibility of the method is then assessed against manual segmentation using the Dice Similarity Index. The evaluation covers normal cartilage and diseased cartilage sections, divided into whole and single cartilage categories. A total of 15 normal images and 10 osteoarthritic images were included. The results showed that the random walks method demonstrated high reproducibility in both normal cartilage (observer 1: 0.83±0.028 and observer 2: 0.82±0.026) and osteoarthritic cartilage (observer 1: 0.80±0.069 and observer 2: 0.83±0.029). Moreover, results from both experts were consistent with each other, suggesting that inter-observer variation is insignificant (normal: P=0.21; diseased: P=0.15). The proposed segmentation model overcomes technical problems reported by existing semi-automated techniques and demonstrated highly reproducible and consistent results against the manual segmentation method.
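
At its core, random-walker segmentation assigns each unlabeled pixel the probability that a random walk started there reaches one seed scribble before the other; these probabilities are harmonic and come from solving a linear (combinatorial Dirichlet) problem. A minimal sketch on a 1-D chain, far simpler than a pixel graph but showing the same computation:

```python
def random_walker_chain(n, sweeps=1000):
    """Random-walker probabilities on a 1-D chain of n nodes with unit
    edge weights: node 0 is one seed (p = 0), node n-1 the other seed
    (p = 1).  Unseeded probabilities are harmonic,
    p[i] = (p[i-1] + p[i+1]) / 2, solved here by Gauss-Seidel
    relaxation; on an image the same system is solved on the
    weighted pixel graph."""
    p = [0.0] * n
    p[-1] = 1.0
    for _ in range(sweeps):
        for i in range(1, n - 1):
            p[i] = 0.5 * (p[i - 1] + p[i + 1])
    return p

probs = random_walker_chain(5)                  # tends to [0, .25, .5, .75, 1]
labels = [1 if q > 0.5 else 0 for q in probs]   # assign the likelier seed
```

Because the walk's transition weights follow image gradients rather than a single shortest path, weak boundaries bleed probability gradually instead of producing the shortcut failures the abstract attributes to shortest-path methods.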

  11. Packet Randomized Experiments for Eliminating Classes of Confounders

    PubMed Central

    Pavela, Greg; Wiener, Howard; Fontaine, Kevin R.; Fields, David A.; Voss, Jameson D.; Allison, David B.

    2014-01-01

    Background Although randomization is considered essential for causal inference, it is often not possible to randomize in nutrition and obesity research. To address this, we develop a framework for an experimental design—packet randomized experiments (PREs), which improves causal inferences when randomization on a single treatment variable is not possible. This situation arises when subjects are randomly assigned to a condition (such as a new roommate) which varies in one characteristic of interest (such as weight), but also varies across many others. There has been no general discussion of this experimental design, including its strengths, limitations, and statistical properties. As such, researchers are left to develop and apply PREs on an ad hoc basis, limiting its potential to improve causal inferences among nutrition and obesity researchers. Methods We introduce PREs as an intermediary design between randomized controlled trials and observational studies. We review previous research that used the PRE design and describe its application in obesity-related research, including random roommate assignments, heterochronic parabiosis, and the quasi-random assignment of subjects to geographic areas. We then provide a statistical framework to control for potential packet-level confounders not accounted for by randomization. Results PREs have successfully been used to improve causal estimates of the effect of roommates, altitude, and breastfeeding on weight outcomes. When certain assumptions are met, PREs can asymptotically control for packet-level characteristics. This has the potential to statistically estimate the effect of a single treatment even when randomization to a single treatment did not occur. Conclusions Applying PREs to obesity-related research will improve decisions about clinical, public health, and policy actions insofar as it offers researchers new insight into cause and effect relationships among variables. PMID:25444088

  12. Quenched Large Deviations for Simple Random Walks on Percolation Clusters Including Long-Range Correlations

    NASA Astrophysics Data System (ADS)

    Berger, Noam; Mukherjee, Chiranjib; Okamura, Kazuki

    2018-03-01

    We prove a quenched large deviation principle (LDP) for a simple random walk on a supercritical percolation cluster (SRWPC) on Z^d (d ≥ 2). The models under interest include classical Bernoulli bond and site percolation as well as models that exhibit long range correlations, like the random cluster model, the random interlacement and the vacant set of random interlacements (for d ≥ 3) and the level sets of the Gaussian free field (d ≥ 3). Inspired by the methods developed by Kosygina et al. (Commun Pure Appl Math 59:1489-1521, 2006) for proving quenched LDP for elliptic diffusions with a random drift, and by Yilmaz (Commun Pure Appl Math 62(8):1033-1075, 2009) and Rosenbluth (Quenched large deviations for multidimensional random walks in a random environment: a variational formula. Ph.D. thesis, NYU, arXiv:0804.1444v1) for similar results regarding elliptic random walks in random environment, we take the point of view of the moving particle and prove a large deviation principle for the quenched distribution of the pair empirical measures of the environment Markov chain in the non-elliptic case of SRWPC. Via a contraction principle, this reduces easily to a quenched LDP for the distribution of the mean velocity of the random walk and both rate functions admit explicit variational formulas. The main difficulty in our set up lies in the inherent non-ellipticity as well as the lack of translation-invariance stemming from conditioning on the fact that the origin belongs to the infinite cluster. We develop a unifying approach for proving quenched large deviations for SRWPC based on exploiting coercivity properties of the relative entropies in the context of convex variational analysis, combined with input from ergodic theory and invoking geometric properties of the supercritical percolation cluster.

  14. Design and methods for a randomized clinical trial treating comorbid obesity and major depressive disorder

    PubMed Central

    Schneider, Kristin L; Bodenlos, Jamie S; Ma, Yunsheng; Olendzki, Barbara; Oleski, Jessica; Merriam, Philip; Crawford, Sybil; Ockene, Ira S; Pagoto, Sherry L

    2008-01-01

    Background Obesity is often comorbid with depression and individuals with this comorbidity fare worse in behavioral weight loss treatment. Treating depression directly prior to behavioral weight loss treatment might bolster weight loss outcomes in this population, but this has not yet been tested in a randomized clinical trial. Methods and design This randomized clinical trial will examine whether behavior therapy for depression administered prior to standard weight loss treatment produces greater weight loss than standard weight loss treatment alone. Obese women with major depressive disorder (N = 174) will be recruited from primary care clinics and the community and randomly assigned to one of the two treatment conditions. Treatment will last 2 years, and will include a 6-month intensive treatment phase followed by an 18-month maintenance phase. Follow-up assessment will occur at 6-months and 1- and 2 years following randomization. The primary outcome is weight loss. The study was designed to provide 90% power for detecting a weight change difference between conditions of 3.1 kg (standard deviation of 5.5 kg) at 1-year assuming a 25% rate of loss to follow-up. Secondary outcomes include depression, physical activity, dietary intake, psychosocial variables and cardiovascular risk factors. Potential mediators (e.g., adherence, depression, physical activity and caloric intake) of the intervention effect on weight change will also be examined. Discussion Treating depression before administering intensive health behavior interventions could potentially boost the impact on both mental and physical health outcomes. Trial registration NCT00572520 PMID:18793398
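
    The stated power calculation (90% power to detect a 3.1 kg difference, SD 5.5 kg, 25% loss to follow-up) can be reproduced approximately with the standard two-sample normal formula; the z-quantiles are hardcoded to keep the sketch dependency-free, and the trial's exact assumptions may differ slightly:

```python
# Two-arm sample size for a continuous outcome (normal approximation).
delta, sd = 3.1, 5.5          # detectable difference and SD from the paper
z_alpha = 1.959964            # two-sided alpha = 0.05
z_beta = 1.281552             # power = 0.90
attrition = 0.25

n_per_arm = 2 * (z_alpha + z_beta) ** 2 * (sd / delta) ** 2
n_total = 2 * n_per_arm / (1 - attrition)   # inflate for 25% dropout
print(round(n_per_arm), round(n_total))     # ~66 completers/arm, ~176 total
```

    The inflated total lands near the trial's planned N = 174, as expected for this design.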

  15. Review of Test Theory and Methods.

    DTIC Science & Technology

    1981-01-01

    literature, although some books, technical reports, and unpublished literature have been included where relevant. The focus of the review is on practical... 1977) and Abu-Sayf (1977) developed new versions of formula scores, and Molenaar (1977) took a Bayesian approach to correcting for random guessing. The... Snow's (1977) book on aptitude and instructional methods is a landmark review of the research on the interaction between instructional methods and

  16. On the predictivity of pore-scale simulations: Estimating uncertainties with multilevel Monte Carlo

    NASA Astrophysics Data System (ADS)

    Icardi, Matteo; Boccardo, Gianluca; Tempone, Raúl

    2016-09-01

    A fast method with tunable accuracy is proposed to estimate errors and uncertainties in pore-scale and Digital Rock Physics (DRP) problems. The overall predictivity of these studies can be, in fact, hindered by many factors including sample heterogeneity, computational and imaging limitations, model inadequacy and not perfectly known physical parameters. The typical objective of pore-scale studies is the estimation of macroscopic effective parameters such as permeability, effective diffusivity and hydrodynamic dispersion. However, these are often non-deterministic quantities (i.e., results obtained for a specific pore-scale sample and setup are not totally reproducible by another "equivalent" sample and setup). The stochastic nature can arise due to the multi-scale heterogeneity, the computational and experimental limitations in considering large samples, and the complexity of the physical models. These approximations, in fact, introduce an error that, being dependent on a large number of complex factors, can be modeled as random. We propose a general simulation tool, based on multilevel Monte Carlo, that can drastically reduce the computational cost needed for computing accurate statistics of effective parameters and other quantities of interest, under any of these random errors. This is, to our knowledge, the first attempt to include Uncertainty Quantification (UQ) in pore-scale physics and simulation. The method can also provide estimates of the discretization error, and it is tested on three-dimensional transport problems in heterogeneous materials, where the sampling procedure is done by generation algorithms able to reproduce realistic consolidated and unconsolidated random sphere and ellipsoid packings and arrangements. A fully automatic workflow is developed in an open-source code [1] that includes rigid-body physics and random packing algorithms, unstructured mesh discretization, finite volume solvers, extrapolation and post-processing techniques. 
The proposed method can be efficiently used in many porous media applications for problems such as stochastic homogenization/upscaling, propagation of uncertainty from microscopic fluid and rock properties to macro-scale parameters, robust estimation of Representative Elementary Volume size for arbitrary physics.
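
    The multilevel Monte Carlo idea can be sketched on a toy problem, with a one-parameter Euler "solver" standing in for a pore-scale simulation; the levels, sample counts, and model below are invented for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)

def q(a, nsteps):
    """Euler approximation of y(1) for y' = a*y, y(0) = 1: a cheap toy
    'solver' whose accuracy improves with the resolution nsteps."""
    y = np.ones_like(a)
    for _ in range(nsteps):
        y += a * y / nsteps
    return y

# Telescoping multilevel estimator: many cheap coarse samples, few fine ones.
levels = [1, 2, 4, 8, 16]                 # steps per level (refinement x2)
n_samples = [4000, 2000, 1000, 500, 250]  # decreasing sample counts
est = 0.0
for lvl, (nsteps, n) in enumerate(zip(levels, n_samples)):
    a = rng.normal(0.0, 0.2, n)           # random input parameter
    fine = q(a, nsteps)
    coarse = q(a, nsteps // 2) if lvl > 0 else 0.0
    est += float(np.mean(fine - coarse))
print(round(est, 2))   # close to E[exp(a)] = exp(0.02) ≈ 1.02
```

    The level corrections have small variance, so most of the work is spent on the cheap coarse level, which is the source of the cost reduction claimed above.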

  17. Promoting state health department evidence-based cancer and chronic disease prevention: a multi-phase dissemination study with a cluster randomized trial component

    PubMed Central

    2013-01-01

    Background Cancer and other chronic diseases reduce quality and length of life and productivity, and represent a significant financial burden to society. Evidence-based public health approaches to prevent cancer and other chronic diseases have been identified in recent decades and have the potential for high impact. Yet, barriers to implement prevention approaches persist as a result of multiple factors including lack of organizational support, limited resources, competing emerging priorities and crises, and limited skill among the public health workforce. The purpose of this study is to learn how best to promote the adoption of evidence based public health practice related to chronic disease prevention. Methods/design This paper describes the methods for a multi-phase dissemination study with a cluster randomized trial component that will evaluate the dissemination of public health knowledge about evidence-based prevention of cancer and other chronic diseases. Phase one involves development of measures of practitioner views on and organizational supports for evidence-based public health and data collection using a national online survey involving state health department chronic disease practitioners. In phase two, a cluster randomized trial design will be conducted to test receptivity and usefulness of dissemination strategies directed toward state health department chronic disease practitioners to enhance capacity and organizational support for evidence-based chronic disease prevention. Twelve state health department chronic disease units will be randomly selected and assigned to intervention or control. State health department staff and the university-based study team will jointly identify, refine, and select dissemination strategies within intervention units. Intervention (dissemination) strategies may include multi-day in-person training workshops, electronic information exchange modalities, and remote technical assistance. 
Evaluation methods include pre-post surveys, structured qualitative phone interviews, and abstraction of state-level chronic disease prevention program plans and progress reports. Trial registration clinicaltrials.gov: NCT01978054. PMID:24330729

  18. An evaluation of the effectiveness of recruitment methods: the staying well after depression randomized controlled trial.

    PubMed

    Krusche, Adele; Rudolf von Rohr, Isabelle; Muse, Kate; Duggan, Danielle; Crane, Catherine; Williams, J Mark G

    2014-04-01

    Randomized controlled trials (RCTs) are widely accepted as being the most efficient way of investigating the efficacy of psychological therapies. However, researchers conducting RCTs commonly report difficulties in recruiting an adequate sample within planned timescales. In an effort to overcome recruitment difficulties, researchers often are forced to expand their recruitment criteria or extend the recruitment phase, thus increasing costs and delaying publication of results. Research investigating the effectiveness of recruitment strategies is limited, and trials often fail to report sufficient details about the recruitment sources and resources utilized. We examined the efficacy of strategies implemented during the Staying Well after Depression RCT in Oxford to recruit participants with a history of recurrent depression. We describe eight recruitment methods utilized and two further sources not initiated by the research team and examine their efficacy in terms of (1) the return, including the number of potential participants who contacted the trial and the number who were randomized into the trial; (2) cost-effectiveness, comprising direct financial cost and manpower for initial contacts and randomized participants; and (3) comparison of sociodemographic characteristics of individuals recruited from different sources. Poster advertising, web-based advertising, and mental health worker referrals were the cheapest methods per randomized participant; however, the ratio of randomized participants to initial contacts differed markedly per source. Advertising online, via posters, and on a local radio station were the most cost-effective recruitment methods for soliciting participants who subsequently were randomized into the trial. Advertising across many sources (saturation) was found to be important. 
It may not be feasible to employ all the recruitment methods used in this trial to obtain participation from other populations, such as those currently unwell, or in other geographical locations. Recruitment source was unavailable for participants who could not be reached after the initial contact. Thus, it is possible that the efficiency of certain methods of recruitment was poorer than estimated. Efficacy and costs of other recruitment initiatives, such as providing travel expenses to the in-person eligibility assessment and making follow-up telephone calls to candidates who contacted the recruitment team but could not be screened promptly, were not analysed. Website advertising resulted in the highest number of randomized participants and was the second cheapest method of recruiting. Future research should evaluate the effectiveness of recruitment strategies for other samples to contribute to a comprehensive base of knowledge for future RCTs.
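
    The per-source bookkeeping described above reduces to two ratios per recruitment method. A sketch with invented figures (not the trial's actual costs or counts):

```python
# Invented per-source tallies (illustrative only, not the trial's data)
sources = {
    "website":   {"cost_gbp": 300.0, "contacts": 180, "randomized": 24},
    "posters":   {"cost_gbp": 150.0, "contacts": 60,  "randomized": 10},
    "referrals": {"cost_gbp": 900.0, "contacts": 45,  "randomized": 12},
}

for name, s in sources.items():
    yield_rate = s["randomized"] / s["contacts"]     # contacts -> randomized
    cost_each = s["cost_gbp"] / s["randomized"]      # cost per randomized
    print(f"{name}: yield {yield_rate:.0%}, £{cost_each:.2f} per randomized")
```

    As the abstract notes, a source can look cheap per randomized participant yet have a poor contact-to-randomization yield, so both ratios matter.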

  19. Double versus single cervical cerclage for patients with recurrent pregnancy loss: a randomized clinical trial.

    PubMed

    Zolghadri, Jaleh; Younesi, Masoumeh; Asadi, Nasrin; Khosravi, Dezire; Behdin, Shabnam; Tavana, Zohre; Ghaffarpasand, Fariborz

    2014-02-01

    To compare the effectiveness of the double cervical cerclage method versus the single method in women with recurrent second-trimester delivery. In this randomized clinical trial, we included 33 singleton pregnancies with recurrent second-trimester pregnancy loss (≥2 consecutive fetal losses during the second trimester, or a history of unsuccessful procedures utilizing the McDonald method) due to cervical incompetence. Patients were randomly assigned to undergo either the classic McDonald method (n = 14) or the double cerclage method (n = 19). The successful pregnancy rate and gestational age at delivery were also compared between the two groups. The two study groups were comparable regarding their baseline characteristics. The successful pregnancy rate did not differ significantly between those who underwent the double cerclage method and those who underwent the classic McDonald cerclage method (100% vs 85.7%; P = 0.172). Likewise, the preterm delivery rate (<34 weeks of gestation) was comparable between the two study groups (10.5% vs 35.7%; P = 0.106). Those undergoing the double cerclage method had a longer gestational duration (37.2 ± 2.6 vs 34.3 ± 3.8 weeks; P = 0.016). The double cervical cerclage method seems to provide better cervical support than the classic McDonald cerclage method in those suffering from recurrent pregnancy loss due to cervical incompetence. © 2013 The Authors. Journal of Obstetrics and Gynaecology Research © 2013 Japan Society of Obstetrics and Gynecology.

  20. Evolution of basic equations for nearshore wave field

    PubMed Central

    ISOBE, Masahiko

    2013-01-01

    In this paper, a systematic, overall view of theories for periodic waves of permanent form, such as Stokes and cnoidal waves, is described first with their validity ranges. To deal with random waves, a method for estimating directional spectra is given. Then, various wave equations are introduced according to the assumptions included in their derivations. The mild-slope equation is derived for combined refraction and diffraction of linear periodic waves. Various parabolic approximations and time-dependent forms are proposed to include randomness and nonlinearity of waves as well as to simplify numerical calculation. Boussinesq equations are the equations developed for calculating nonlinear wave transformations in shallow water. Nonlinear mild-slope equations are derived as a set of wave equations to predict transformation of nonlinear random waves in the nearshore region. Finally, wave equations are classified systematically for a clear theoretical understanding and appropriate selection for specific applications. PMID:23318680

  1. A Spectral Approach for Quenched Limit Theorems for Random Expanding Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Dragičević, D.; Froyland, G.; González-Tokman, C.; Vaienti, S.

    2018-06-01

    We prove quenched versions of (i) a large deviations principle (LDP), (ii) a central limit theorem (CLT), and (iii) a local central limit theorem for non-autonomous dynamical systems. A key advance is the extension of the spectral method, commonly used in limit laws for deterministic maps, to the general random setting. We achieve this via multiplicative ergodic theory and the development of a general framework to control the regularity of Lyapunov exponents of twisted transfer operator cocycles with respect to a twist parameter. While some versions of the LDP and CLT have previously been proved with other techniques, the local central limit theorem is, to our knowledge, a completely new result, and one that demonstrates the strength of our method. Applications include non-autonomous (piecewise) expanding maps, defined by random compositions of the form T_{σ^{n-1}ω} ∘ ⋯ ∘ T_{σω} ∘ T_ω. An important aspect of our results is that we only assume ergodicity and invertibility of the random driving σ: Ω → Ω; in particular, no expansivity or mixing properties are required.

  3. An uncertainty model of acoustic metamaterials with random parameters

    NASA Astrophysics Data System (ADS)

    He, Z. C.; Hu, J. Y.; Li, Eric

    2018-01-01

    Acoustic metamaterials (AMs) are man-made composite materials. In applications of AMs, random uncertainties are unavoidable because manufacturing and material errors lead to variance in their physical responses. In this paper, an uncertainty model based on the change of variable perturbation stochastic finite element method (CVPS-FEM) is formulated to predict the probability density functions of physical responses of AMs with random parameters. Three types of physical responses, including the band structure, mode shapes and frequency response function of AMs, are studied in the uncertainty model, which is of great interest in the design of AMs. In this computation, the physical responses of stochastic AMs are expressed as linear functions of the pre-defined random parameters by using the first-order Taylor series expansion and perturbation technique. Then, based on the linear function relationships between parameters and responses, the probability density functions of the responses can be calculated by the change-of-variable technique. Three numerical examples are employed to demonstrate the effectiveness of the CVPS-FEM for stochastic AMs, and the results are successfully validated against the Monte Carlo method.
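
    The first-order Taylor plus change-of-variable recipe can be sketched on a scalar toy response (a spring-mass natural frequency standing in for an AM band-structure quantity; all values are illustrative, not from the paper):

```python
import numpy as np

# Toy scalar response: natural frequency w(k) = sqrt(k/m) of a
# spring-mass system with a random stiffness k. Values are invented.
m, k0, sigma_k = 1.0, 100.0, 2.0

w0 = np.sqrt(k0 / m)              # nominal response
dwdk = 0.5 / np.sqrt(k0 * m)      # first-order Taylor sensitivity dw/dk

# Linearization w ≈ w0 + dwdk*(k - k0) with k ~ N(k0, sigma_k^2); the
# change-of-variable rule then gives a Gaussian response density:
mu_w, sd_w = w0, abs(dwdk) * sigma_k

def pdf_w(w):
    return np.exp(-0.5 * ((w - mu_w) / sd_w) ** 2) / (sd_w * np.sqrt(2.0 * np.pi))

# Cross-check against brute-force Monte Carlo, as done in the paper.
rng = np.random.default_rng(2)
samples = np.sqrt(rng.normal(k0, sigma_k, 200_000) / m)
print(round(float(mu_w), 2), round(sd_w, 3), round(float(np.std(samples)), 3))
```

    For this mildly nonlinear response the perturbation standard deviation matches the Monte Carlo one closely, which is the validation pattern the abstract describes.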

  4. A comparative study of restricted randomization procedures for multiarm trials with equal or unequal treatment allocation ratios.

    PubMed

    Ryeznik, Yevgen; Sverdlov, Oleksandr

    2018-06-04

    Randomization designs for multiarm clinical trials are increasingly used in practice, especially in phase II dose-ranging studies. Many new methods have been proposed in the literature; however, there is a lack of systematic, head-to-head comparison of the competing designs. In this paper, we systematically investigate statistical properties of various restricted randomization procedures for multiarm trials with fixed and possibly unequal allocation ratios. The design operating characteristics include measures of allocation balance, randomness of treatment assignments, variations in the allocation ratio, and statistical characteristics such as type I error rate and power. The results from the current paper should help clinical investigators select an appropriate randomization procedure for their clinical trial. We also provide a web-based R shiny application that can be used to reproduce all results in this paper and run simulations under additional user-defined experimental scenarios. Copyright © 2018 John Wiley & Sons, Ltd.
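
    One widely used restricted procedure of the kind compared in such studies is permuted-block randomization, sketched below for an unequal 2:1:1 allocation across three arms (illustrative only; the paper evaluates many more designs and metrics):

```python
import random

def permuted_block_schedule(ratios, n_blocks, seed=2018):
    """Permuted-block randomization for a multiarm trial with a fixed,
    possibly unequal allocation ratio: each block holds exactly one
    ratio's worth of assignments, in random order."""
    rng = random.Random(seed)
    schedule = []
    for _ in range(n_blocks):
        block = [arm for arm, r in enumerate(ratios) for _ in range(r)]
        rng.shuffle(block)
        schedule.extend(block)
    return schedule

# Three arms with 2:1:1 allocation; balance is exact at every block end.
seq = permuted_block_schedule([2, 1, 1], n_blocks=25)
print(len(seq), seq.count(0), seq.count(1), seq.count(2))  # 100 50 25 25
```

    Small blocks guarantee balance but make upcoming assignments more predictable; that balance-versus-randomness trade-off is exactly what the paper's operating characteristics quantify.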

  5. Stochastic differential equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sobczyk, K.

    1990-01-01

    This book provides a unified treatment of both regular (or random) and Ito stochastic differential equations. It focuses on solution methods, including some developed only recently. Applications are discussed; in particular, insight is given into both the mathematical structure and the most efficient solution methods (analytical as well as numerical). Starting from basic notions and results of the theory of stochastic processes and stochastic calculus (including Ito's stochastic integral), many principal mathematical problems and results related to stochastic differential equations are expounded here for the first time. Applications treated include those relating to road vehicles, earthquake excitations, and offshore structures.
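
    A minimal numerical sketch in the spirit of such solution methods: the Euler-Maruyama scheme for an Ito SDE, checked against the known mean of geometric Brownian motion (drift, volatility, and step counts are illustrative):

```python
import numpy as np

def euler_maruyama(mu, sigma, x0, t_end, n_steps, n_paths, seed=3):
    """Simulate dX = mu(X) dt + sigma(X) dW with the Euler-Maruyama scheme."""
    rng = np.random.default_rng(seed)
    dt = t_end / n_steps
    x = np.full(n_paths, x0, dtype=float)
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), n_paths)   # Brownian increments
        x += mu(x) * dt + sigma(x) * dw
    return x

# Geometric Brownian motion dX = 0.05 X dt + 0.2 X dW; E[X(1)] = exp(0.05).
xT = euler_maruyama(lambda x: 0.05 * x, lambda x: 0.2 * x,
                    x0=1.0, t_end=1.0, n_steps=200, n_paths=100_000)
print(round(float(xT.mean()), 2))   # near exp(0.05) ≈ 1.05
```
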

  6. Hydro-mechanical coupled simulation of hydraulic fracturing using the eXtended Finite Element Method (XFEM)

    NASA Astrophysics Data System (ADS)

    Youn, Dong Joon

    This thesis presents the development and validation of an advanced hydro-mechanical coupled finite element program analyzing hydraulic fracture propagation within unconventional hydrocarbon formations under various conditions. Realistic modeling of hydraulic fracturing is required to improve the understanding and efficiency of the stimulation technique. Such modeling remains highly challenging, however, due to factors including the complexity of fracture propagation mechanisms, the coupled behavior of fracture displacement and fluid pressure, the interactions between pre-existing natural and initiated hydraulic fractures, and the formation heterogeneity of the target reservoir. In this research, an eXtended Finite Element Method (XFEM) scheme is developed allowing for representation of single or multiple fracture propagations without any need for re-meshing. The coupled flows through the fracture are also considered in the program to account for their influence on stresses and deformations along the hydraulic fracture. A sequential coupling scheme is applied to estimate fracture aperture and fluid pressure with the XFEM. The coupled XFEM program is then used to estimate wellbore bottomhole pressure during fracture propagation, and the pressure variations are analyzed to determine the geometry and performance of the hydraulic fracturing, as in a pressure leak-off test. Finally, material heterogeneity is included in the XFEM program to assess the effect of random formation property distributions on the hydraulic fracture geometry. Random field theory is used to create random realizations of the material heterogeneity with consideration of the mean, standard deviation, and property correlation length. These analyses lead to probabilistic information on the response of unconventional reservoirs and offer a more scientific approach to risk management for unconventional reservoir stimulation. 
The new stochastic approach combining XFEM and random field is named as eXtended Random Finite Element Method (XRFEM). All the numerical analysis codes in this thesis are written in Fortran 2003, and these codes are applicable as a series of sub-modules within a suite of finite element codes developed by Smith and Griffiths (2004).
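
    The random-field ingredient can be sketched as follows: sampling a spatially correlated Gaussian property field with a given mean, standard deviation, and correlation length via Cholesky factorization of an exponential covariance (a 1-D sketch with invented values; the thesis works in higher dimensions):

```python
import numpy as np

def gaussian_random_field_1d(n, length, corr_len, mean, std, seed=4):
    """Sample a 1-D Gaussian random field with exponential covariance
    C(r) = std^2 * exp(-|r| / corr_len) via Cholesky factorization."""
    x = np.linspace(0.0, length, n)
    cov = std ** 2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))  # jitter for stability
    rng = np.random.default_rng(seed)
    return mean + L @ rng.standard_normal(n)

# e.g. a spatially correlated modulus (GPa) along a 1 m specimen
field = gaussian_random_field_1d(n=200, length=1.0, corr_len=0.2,
                                 mean=30.0, std=3.0)
print(field.shape)
```

    Each call produces one realization; repeating over many realizations is what yields the probabilistic reservoir-response information described above.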

  7. Probabilistic Fiber Composite Micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, Thomas A.

    1996-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. The variables in which uncertainties are accounted for include constituent and void volume ratios, constituent elastic properties and strengths, and fiber misalignment. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material property variations induced by random changes expected at the material micro level. Regression results are presented to show the relative correlation between predictor and response variables in the study. These computational procedures make possible a formal description of anticipated random processes at the intra-ply level, and the related effects of these on composite properties.
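
    A stripped-down version of such a Monte Carlo micromechanics loop, using the rule of mixtures for the longitudinal ply modulus (input means and scatter are invented for illustration, not taken from the report):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20_000

# Random micro-level inputs (illustrative means/scatter)
vf = rng.normal(0.60, 0.03, n).clip(0.0, 1.0)   # fiber volume ratio
ef = rng.normal(230.0, 10.0, n)                 # fiber modulus, GPa
em = rng.normal(3.5, 0.3, n)                    # matrix modulus, GPa

# Rule of mixtures for the longitudinal ply modulus, one value per draw
e11 = vf * ef + (1.0 - vf) * em

print(round(float(e11.mean()), 1), round(float(e11.std()), 1))
```

    Regressing e11 on the sampled inputs, as the report does, would show the fiber volume ratio and fiber modulus dominating the response scatter.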

  8. Mindfulness-based stress reduction for overweight/obese women with and without polycystic ovary syndrome: design and methods of a pilot randomized controlled trial.

    PubMed

    Raja-Khan, Nazia; Agito, Katrina; Shah, Julie; Stetter, Christy M; Gustafson, Theresa S; Socolow, Holly; Kunselman, Allen R; Reibel, Diane K; Legro, Richard S

    2015-03-01

    Mindfulness-based stress reduction (MBSR) may be beneficial for overweight/obese women, including women with polycystic ovary syndrome (PCOS), as it has been shown to reduce psychological distress and improve quality of life in other patient populations. Preliminary studies suggest that MBSR may also have salutary effects on blood pressure and blood glucose. This paper describes the design and methods of an ongoing pilot randomized controlled trial evaluating the feasibility and effects of MBSR in PCOS and non-PCOS women who are overweight or obese (NCT01464398). Eighty six (86) women with body mass index ≥ 25 kg/m(2), including 31 women with PCOS, have been randomized to 8 weeks of MBSR or health education control, and followed for 16 weeks. The primary outcome is mindfulness assessed with the Toronto Mindfulness Scale. Secondary outcomes include measures of blood pressure, blood glucose, quality of life, anxiety and depression. Our overall hypothesis is that MBSR will increase mindfulness and ultimately lead to favorable changes in blood pressure, blood glucose, psychological distress and quality of life in PCOS and non-PCOS women. This would support the integration of MBSR with conventional medical treatments to reduce psychological distress, cardiovascular disease and diabetes in PCOS and non-PCOS women who are overweight or obese. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Mindfulness-Based Stress Reduction for Overweight/Obese Women With and Without Polycystic Ovary Syndrome: Design and Methods of a Pilot Randomized Controlled Trial

    PubMed Central

    Raja-Khan, Nazia; Agito, Katrina; Shah, Julie; Stetter, Christy M.; Gustafson, Theresa S.; Socolow, Holly; Kunselman, Allen R.; Reibel, Diane K.; Legro, Richard S.

    2015-01-01

    Mindfulness-based stress reduction (MBSR) may be beneficial for overweight/obese women, including women with polycystic ovary syndrome (PCOS), as it has been shown to reduce psychological distress and improve quality of life in other patient populations. Preliminary studies suggest that MBSR may also have salutary effects on blood pressure and blood glucose. This paper describes the design and methods of an ongoing pilot randomized controlled trial evaluating the feasibility and effects of MBSR in PCOS and non-PCOS women who are overweight or obese. Eighty six (86) women with body mass index ≥25 kg/m2, including 31 women with PCOS, have been randomized to 8 weeks of MBSR or health education control, and followed for 16 weeks. The primary outcome is mindfulness assessed with the Toronto Mindfulness Scale. Secondary outcomes include measures of blood pressure, blood glucose, quality of life, anxiety and depression. Our overall hypothesis is that MBSR will increase mindfulness and ultimately lead to favorable changes in blood pressure, blood glucose, psychological distress and quality of life in PCOS and non-PCOS women. This would support the integration of MBSR with conventional medical treatments to reduce psychological distress, cardiovascular disease and diabetes in PCOS and non-PCOS women who are overweight or obese. PMID:25662105

  10. Multivariate normal maximum likelihood with both ordinal and continuous variables, and data missing at random.

    PubMed

    Pritikin, Joshua N; Brick, Timothy R; Neale, Michael C

    2018-04-01

    A novel method for the maximum likelihood estimation of structural equation models (SEM) with both ordinal and continuous indicators is introduced using a flexible multivariate probit model for the ordinal indicators. A full information approach ensures unbiased estimates for data missing at random. Exceeding the capability of prior methods, up to 13 ordinal variables can be included before integration time increases beyond 1 s per row. The method relies on the axiom of conditional probability to split apart the distribution of continuous and ordinal variables. Due to the symmetry of the axiom, two similar methods are available. A simulation study provides evidence that the two similar approaches offer equal accuracy. A further simulation is used to develop a heuristic to automatically select the most computationally efficient approach. Joint ordinal continuous SEM is implemented in OpenMx, free and open-source software.
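
    The conditional-probability split described above can be sketched in the smallest case: one continuous and one ordinal indicator behind a bivariate standard normal (the correlation and thresholds are invented; OpenMx handles the general multivariate case):

```python
import math

# Y is the observed continuous indicator; Z is ordinal, produced by
# cutting a latent Z* at invented thresholds tau (three categories).
rho = 0.5
tau = [-math.inf, -0.5, 0.5, math.inf]

def phi_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def loglik(y, z):
    # Conditional-probability split: f(y, z) = f(y) * P(Z = z | Y = y),
    # where Z* | Y=y ~ N(rho*y, 1 - rho^2) for the bivariate normal.
    cond_mean, cond_sd = rho * y, math.sqrt(1.0 - rho ** 2)
    p_cat = (phi_cdf((tau[z + 1] - cond_mean) / cond_sd)
             - phi_cdf((tau[z] - cond_mean) / cond_sd))
    log_pdf_y = -0.5 * y * y - 0.5 * math.log(2.0 * math.pi)
    return log_pdf_y + math.log(p_cat)

print(round(loglik(0.2, 1), 3))   # joint log-likelihood of one observation
```

    Summing the joint density over the ordinal categories recovers the continuous marginal, which is the symmetry of the axiom that gives rise to the two equivalent splits mentioned in the abstract.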

  11. Using Audit Information to Adjust Parameter Estimates for Data Errors in Clinical Trials

    PubMed Central

    Shepherd, Bryan E.; Shaw, Pamela A.; Dodd, Lori E.

    2013-01-01

    Background Audits are often performed to assess the quality of clinical trial data, but beyond detecting fraud or sloppiness, the audit data is generally ignored. In earlier work using data from a non-randomized study, Shepherd and Yu (2011) developed statistical methods to incorporate audit results into study estimates, and demonstrated that audit data could be used to eliminate bias. Purpose In this manuscript we examine the usefulness of audit-based error-correction methods in clinical trial settings where a continuous outcome is of primary interest. Methods We demonstrate the bias of multiple linear regression estimates in general settings with an outcome that may have errors and a set of covariates for which some may have errors and others, including treatment assignment, are recorded correctly for all subjects. We study this bias under different assumptions including independence between treatment assignment, covariates, and data errors (conceivable in a double-blinded randomized trial) and independence between treatment assignment and covariates but not data errors (possible in an unblinded randomized trial). We review moment-based estimators to incorporate the audit data and propose new multiple imputation estimators. The performance of estimators is studied in simulations. Results When treatment is randomized and unrelated to data errors, estimates of the treatment effect using the original error-prone data (i.e., ignoring the audit results) are unbiased. In this setting, both moment and multiple imputation estimators incorporating audit data are more variable than standard analyses using the original data. In contrast, in settings where treatment is randomized but correlated with data errors and in settings where treatment is not randomized, standard treatment effect estimates will be biased. And in all settings, parameter estimates for the original, error-prone covariates will be biased. 
Treatment and covariate effect estimates can be corrected by incorporating audit data using either the multiple imputation or moment-based approaches. Bias, precision, and coverage of confidence intervals improve as the audit size increases. Limitations The extent of bias and the performance of methods depend on the extent and nature of the error as well as the size of the audit. This work only considers methods for the linear model. Settings much different than those considered here need further study. Conclusions In randomized trials with continuous outcomes and treatment assignment independent of data errors, standard analyses of treatment effects will be unbiased and are recommended. However, if treatment assignment is correlated with data errors or other covariates, naive analyses may be biased. In these settings, and when covariate effects are of interest, approaches for incorporating audit results should be considered. PMID:22848072

  12. Robust, Adaptive Functional Regression in Functional Mixed Model Framework.

    PubMed

    Zhu, Hongxiao; Brown, Philip J; Morris, Jeffrey S

    2011-09-01

    Functional data are increasingly encountered in scientific studies, and their high dimensionality and complexity lead to many analytical challenges. Various methods for functional data analysis have been developed, including functional response regression methods that involve regression of a functional response on univariate/multivariate predictors with nonparametrically represented functional coefficients. In existing methods, however, the functional regression can be sensitive to outlying curves and outlying regions of curves, so is not robust. In this paper, we introduce a new Bayesian method, robust functional mixed models (R-FMM), for performing robust functional regression within the general functional mixed model framework, which includes multiple continuous or categorical predictors and random effect functions accommodating potential between-function correlation induced by the experimental design. The underlying model involves a hierarchical scale mixture model for the fixed effects, random effect and residual error functions. These modeling assumptions across curves result in robust nonparametric estimators of the fixed and random effect functions which down-weight outlying curves and regions of curves, and produce statistics that can be used to flag global and local outliers. These assumptions also lead to distributions across wavelet coefficients that have outstanding sparsity and adaptive shrinkage properties, with great flexibility for the data to determine the sparsity and the heaviness of the tails. Together with the down-weighting of outliers, these within-curve properties lead to fixed and random effect function estimates that appear in our simulations to be remarkably adaptive in their ability to remove spurious features yet retain true features of the functions. We have developed general code to implement this fully Bayesian method that is automatic, requiring the user to only provide the functional data and design matrices. 
It is efficient enough to handle large data sets, and yields posterior samples of all model parameters that can be used to perform desired Bayesian estimation and inference. Although we present details for a specific implementation of the R-FMM using specific distributional choices in the hierarchical model, 1D functions, and wavelet transforms, the method can be applied more generally using other heavy-tailed distributions, higher dimensional functions (e.g. images), and using other invertible transformations as alternatives to wavelets.

  13. Robust, Adaptive Functional Regression in Functional Mixed Model Framework

    PubMed Central

    Zhu, Hongxiao; Brown, Philip J.; Morris, Jeffrey S.

    2012-01-01

    Functional data are increasingly encountered in scientific studies, and their high dimensionality and complexity lead to many analytical challenges. Various methods for functional data analysis have been developed, including functional response regression methods that involve regression of a functional response on univariate/multivariate predictors with nonparametrically represented functional coefficients. In existing methods, however, the functional regression can be sensitive to outlying curves and outlying regions of curves, so is not robust. In this paper, we introduce a new Bayesian method, robust functional mixed models (R-FMM), for performing robust functional regression within the general functional mixed model framework, which includes multiple continuous or categorical predictors and random effect functions accommodating potential between-function correlation induced by the experimental design. The underlying model involves a hierarchical scale mixture model for the fixed effects, random effect and residual error functions. These modeling assumptions across curves result in robust nonparametric estimators of the fixed and random effect functions which down-weight outlying curves and regions of curves, and produce statistics that can be used to flag global and local outliers. These assumptions also lead to distributions across wavelet coefficients that have outstanding sparsity and adaptive shrinkage properties, with great flexibility for the data to determine the sparsity and the heaviness of the tails. Together with the down-weighting of outliers, these within-curve properties lead to fixed and random effect function estimates that appear in our simulations to be remarkably adaptive in their ability to remove spurious features yet retain true features of the functions. We have developed general code to implement this fully Bayesian method that is automatic, requiring the user to only provide the functional data and design matrices. 
It is efficient enough to handle large data sets, and yields posterior samples of all model parameters that can be used to perform desired Bayesian estimation and inference. Although we present details for a specific implementation of the R-FMM using specific distributional choices in the hierarchical model, 1D functions, and wavelet transforms, the method can be applied more generally using other heavy-tailed distributions, higher dimensional functions (e.g. images), and using other invertible transformations as alternatives to wavelets. PMID:22308015

  14. A new method for reconstruction of solar irradiance

    NASA Astrophysics Data System (ADS)

    Privalsky, Victor

    2018-07-01

    The purpose of this research is to show how time series should be reconstructed, using as an example data on the total solar irradiance (TSI) of the Earth and on sunspot numbers (SSN) since 1749. The traditional approach through regression equations is designed for time-invariant vectors of random variables and is not applicable to time series, which are random functions of time. The autoregressive reconstruction (ARR) method suggested here requires fitting a multivariate stochastic difference equation to the target/proxy time series. The reconstruction is done through the scalar equation for the target time series with the white-noise term excluded. The time series approach is shown to provide a better reconstruction of TSI than the correlation/regression method. A reconstruction criterion is introduced that allows one to define in advance the achievable level of success of the reconstruction. The conclusion is that time series, including the total solar irradiance, cannot be reconstructed properly unless the data are treated as sample records of random processes and analyzed in both the time and frequency domains.
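
    A minimal sketch of the ARR idea, under the simplifying assumption of a bivariate AR(1) model with invented coefficients (the paper fits a general multivariate stochastic difference equation to the real TSI/SSN series):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic target/proxy pair (stand-ins for TSI and SSN) coupled through a
# bivariate AR(1) process; the coefficient matrix A is illustrative only.
n = 500
x = np.zeros((n, 2))  # column 0: target, column 1: proxy
A = np.array([[0.6, 0.3], [0.1, 0.7]])
for t in range(1, n):
    x[t] = A @ x[t - 1] + rng.normal(0, [0.2, 0.5])

# Fit the bivariate AR(1) model by least squares: x_t ≈ A_hat @ x_{t-1}
X, Y = x[:-1], x[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T

# Autoregressive reconstruction of the target: apply its scalar equation
# with the white-noise innovation term excluded.
recon = X @ A_hat[0]
corr = np.corrcoef(recon, Y[:, 0])[0, 1]
print(A_hat.round(2), corr)
```

    The correlation between the noise-free reconstruction and the actual next-step target indicates how much of the target's variance is recoverable from the fitted difference equation.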

  15. Review of probabilistic analysis of dynamic response of systems with random parameters

    NASA Technical Reports Server (NTRS)

    Kozin, F.; Klosner, J. M.

    1989-01-01

    The various methods that have been studied in the past to allow probabilistic analysis of dynamic response for systems with random parameters are reviewed. Dynamic response could be obtained deterministically if the variations about the nominal values were small; however, for space structures that require precise pointing, the variations about the nominal values of the structural details and of the environmental conditions are too large to be considered negligible. These uncertainties are accounted for in terms of probability distributions about the nominal values. The quantities of concern for describing the response of the structure include displacements, velocities, and the distributions of natural frequencies. An exact statistical characterization of the response would yield joint probability distributions for the response variables. Since the random quantities appear as coefficients, determining the exact distributions is difficult at best, so certain approximations must be made. A number of available techniques are discussed, including the nonlinear case. The methods described are: (1) Liouville's equation; (2) perturbation methods; (3) mean-square approximate systems; and (4) nonlinear systems approximated by linear systems.

  16. Inference of median difference based on the Box-Cox model in randomized clinical trials.

    PubMed

    Maruo, K; Isogawa, N; Gosho, M

    2015-05-10

    In randomized clinical trials, many medical and biological measurements are not normally distributed and are often skewed. The Box-Cox transformation is a powerful procedure for comparing two treatment groups for skewed continuous variables in terms of a statistical test. However, it is difficult to directly estimate and interpret the location difference between the two groups on the original scale of the measurement. We propose a helpful method that infers the difference of the treatment effect on the original scale in a more easily interpretable form. We also provide statistical analysis packages that consistently include an estimate of the treatment effect, covariance adjustments, standard errors, and statistical hypothesis tests. The simulation study that focuses on randomized parallel group clinical trials with two treatment groups indicates that the performance of the proposed method is equivalent to or better than that of the existing non-parametric approaches in terms of the type-I error rate and power. We illustrate our method with cluster of differentiation 4 data in an acquired immune deficiency syndrome clinical trial. Copyright © 2015 John Wiley & Sons, Ltd.
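
    The core of the approach, estimating a common Box-Cox lambda, testing on the transformed scale, and back-transforming to the original scale, can be sketched as follows. The data are simulated, and the paper's covariance adjustments and median-difference inference are omitted; this is only the underlying transformation machinery.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Two skewed (log-normal) treatment groups; all parameters are illustrative.
g1 = rng.lognormal(mean=1.0, sigma=0.6, size=120)
g2 = rng.lognormal(mean=1.3, sigma=0.6, size=120)

# Estimate a common Box-Cox lambda from the pooled data, transform, then
# compare groups on the (approximately normal) transformed scale.
pooled = np.concatenate([g1, g2])
_, lam = stats.boxcox(pooled)
t1, t2 = stats.boxcox(g1, lmbda=lam), stats.boxcox(g2, lmbda=lam)
t_stat, p_val = stats.ttest_ind(t1, t2)

# Back-transform the transformed-scale group means to the original scale to
# express the location difference in interpretable units (inverse Box-Cox).
inv = lambda y: np.power(lam * y + 1.0, 1.0 / lam) if lam != 0 else np.exp(y)
diff = inv(t2.mean()) - inv(t1.mean())
print(lam, p_val, diff)
```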

  17. MATIN: A Random Network Coding Based Framework for High Quality Peer-to-Peer Live Video Streaming

    PubMed Central

    Barekatain, Behrang; Khezrimotlagh, Dariush; Aizaini Maarof, Mohd; Ghaeini, Hamid Reza; Salleh, Shaharuddin; Quintana, Alfonso Ariza; Akbari, Behzad; Cabrera, Alicia Triviño

    2013-01-01

    In recent years, Random Network Coding (RNC) has emerged as a promising solution for efficient Peer-to-Peer (P2P) video multicasting over the Internet, largely because RNC noticeably increases the error resiliency and throughput of the network. However, the high transmission overhead arising from sending a large coefficients vector as the header has been the most important challenge of RNC. Moreover, because decoding relies on the Gauss-Jordan elimination method, considerable computational complexity is imposed on peers when decoding the encoded blocks and checking linear dependency among the coefficients vectors. To address these challenges, this study introduces MATIN, a random network coding based framework for efficient P2P video streaming. MATIN includes a novel coefficients matrix generation method that guarantees no linear dependency in the generated coefficients matrix. Using the proposed framework, each peer encapsulates one coefficients entry instead of n into each encoded packet, which results in very low transmission overhead. It is also possible to obtain the inverted coefficients matrix using a small number of simple arithmetic operations, so peers sustain very low computational complexity. As a result, MATIN permits random network coding to be more efficient in P2P video streaming systems. The results obtained from simulation using OMNET++ show that it substantially outperforms RNC with Gauss-Jordan elimination by providing better video quality on peers in terms of four important performance metrics: video distortion, dependency distortion, end-to-end delay and initial startup delay. PMID:23940530
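
    The baseline decoding step that MATIN improves upon can be sketched in a few lines. This toy works over the reals rather than the finite field (e.g., GF(2^8)) a real RNC deployment would use, and the dimensions are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)

# Random linear network coding over the reals, for illustration only.
n, block_len = 4, 8
source = rng.integers(0, 256, size=(n, block_len)).astype(float)

# Each encoded packet carries a random coefficients vector (the header
# overhead discussed above) plus the coded payload.
coeffs = rng.random((n, n))
encoded = coeffs @ source

# A receiver holding n linearly independent packets inverts the coefficients
# matrix (Gauss-Jordan elimination in practice) to recover the source blocks.
assert np.linalg.matrix_rank(coeffs) == n
decoded = np.linalg.solve(coeffs, encoded)
print(np.allclose(decoded, source))  # True
```

    The rank check is the linear-dependency test peers must perform on every arriving packet; avoiding both it and the n-entry header is what reduces MATIN's overhead.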

  18. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    PubMed

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimens for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimens from porcine tissues is presented. 
The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical uniform random (VUR) sections.
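
    The volume-weighted systematic random sampling step can be illustrated with a Cavalieri-style point-counting estimate on a synthetic "organ" (a sphere). The section spacing and grid pitch below are arbitrary choices for the sketch, not values from the guidelines.

```python
import numpy as np

rng = np.random.default_rng(3)

# Cavalieri principle with point counting: V ≈ t * a_p * ΣP, where t is the
# section spacing, a_p the area associated with one grid point, and ΣP the
# total number of grid points hitting the tissue across all sections.
radius, t, grid = 10.0, 1.0, 0.5   # sphere radius (mm), spacing, grid pitch
a_p = grid * grid                  # area per test point (mm^2)

offset = rng.uniform(0, t)         # systematic random start for the sections
points = 0
for z in np.arange(-radius + offset, radius, t):
    r_sec = np.sqrt(max(radius**2 - z**2, 0.0))       # section radius
    xs = np.arange(-radius, radius, grid)
    xx, yy = np.meshgrid(xs, xs)
    points += np.count_nonzero(xx**2 + yy**2 <= r_sec**2)

v_est = t * a_p * points
v_true = 4.0 / 3.0 * np.pi * radius**3
print(v_est, v_true)  # estimate within a few percent of the true volume
```

    The random offset of the first section is what makes the estimator unbiased; finer grids and spacing trade counting effort for precision.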

  19. Statin therapy and plasma vitamin E concentrations: A systematic review and meta-analysis of randomized placebo-controlled trials.

    PubMed

    Sahebkar, Amirhossein; Simental-Mendía, Luis E; Ferretti, Gianna; Bacchetti, Tiziana; Golledge, Jonathan

    2015-12-01

    Vitamin E is one of the most important natural antioxidants, and its plasma levels are inversely associated with the progression of atherosclerosis. There have been reports suggesting a potential negative effect of statin therapy on plasma vitamin E levels. The aim of this meta-analysis was to determine the impact of statin therapy on plasma vitamin E concentrations. PubMed-Medline, SCOPUS, Web of Science and Google Scholar databases were searched to identify randomized placebo-controlled trials evaluating the impact of statins on plasma vitamin E concentrations from inception to February 27, 2015. A systematic assessment of bias in the included studies was performed using the Cochrane criteria. A random-effects model (using DerSimonian-Laird method) and the generic inverse variance method were used to examine the effect of statins on plasma vitamin E concentrations. Heterogeneity was quantitatively assessed using the I(2) index. Sensitivity analysis was conducted using the leave-one-out method. A meta-analysis of data from 8 randomized treatment arms including 504 participants indicated a significant reduction in plasma vitamin E concentrations following statin treatment (WMD: -16.30%, 95% CI: -16.93, -15.98, p < 0.001). However, cholesterol-adjusted vitamin E concentrations (defined as vitamin E:total cholesterol ratio) were found to be improved by statin therapy (WMD: 29.35%, 95% CI: 24.98, 33.72, p < 0.001). Statin therapy was not associated with any significant alteration in LDL vitamin E content (SMD: 0.003, 95% CI: -0.90, 0.90, p = 0.995). Findings of the present study suggest that statin therapy has no negative impact on plasma vitamin E concentrations or LDL vitamin E content. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
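
    The DerSimonian-Laird random-effects pooling named above follows a standard closed form; here is a sketch with invented study effects and variances (not the trial data from this meta-analysis).

```python
import numpy as np

# DerSimonian-Laird random-effects meta-analysis; illustrative inputs only.
y = np.array([-14.0, -18.5, -16.2, -15.0, -17.1])   # per-study effect sizes
v = np.array([4.0, 6.5, 3.2, 5.1, 4.4])             # within-study variances

w = 1.0 / v                                          # fixed-effect weights
y_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fixed) ** 2)                   # Cochran's Q
k = len(y)
# Method-of-moments between-study variance, truncated at zero.
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_star = 1.0 / (v + tau2)                            # random-effects weights
y_re = np.sum(w_star * y) / np.sum(w_star)
se_re = np.sqrt(1.0 / np.sum(w_star))
ci = (y_re - 1.96 * se_re, y_re + 1.96 * se_re)
print(y_re, tau2, ci)
```

    This is the generic-inverse-variance calculation the abstract refers to; heterogeneity would then be summarized as I² = max(0, (Q − (k − 1))/Q) · 100%.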

  20. Determination of Slope Safety Factor with Analytical Solution and Searching Critical Slip Surface with Genetic-Traversal Random Method

    PubMed Central

    2014-01-01

    In current practice, to determine the safety factor of a slope with a two-dimensional circular potential failure surface, a common method for searching the critical slip surface is the Genetic Algorithm (GA), while the slope safety factor is calculated with Fellenius' slices method. However, GA needs to be validated with more numerical tests, and Fellenius' slices method is only an approximate method, much like the finite element method. This paper proposes a new method to determine the minimum slope safety factor, in which the safety factor is computed from an analytical solution and the critical slip surface is searched with a Genetic-Traversal Random Method. The analytical solution is more accurate than Fellenius' slices method. The Genetic-Traversal Random Method uses random selection to implement mutation. A computer program was developed to automate the search. Comparison with other methods, such as the SLOPE/W software, indicates that the Genetic-Traversal Random Search Method can give a very low safety factor, about half that of the other methods. However, the minimum safety factor obtained with the Genetic-Traversal Random Search Method is very close to the lower-bound solutions of the slope safety factor given by the Ansys software. PMID:24782679

  1. Minimizing effects of methodological decisions on interpretation and prediction in species distribution studies: An example with background selection

    USGS Publications Warehouse

    Jarnevich, Catherine S.; Talbert, Marian; Morisette, Jeffrey T.; Aldridge, Cameron L.; Brown, Cynthia; Kumar, Sunil; Manier, Daniel; Talbert, Colin; Holcombe, Tracy R.

    2017-01-01

    Evaluating the conditions where a species can persist is an important question in ecology both to understand tolerances of organisms and to predict distributions across landscapes. Presence data combined with background or pseudo-absence locations are commonly used with species distribution modeling to develop these relationships. However, there is not a standard method to generate background or pseudo-absence locations, and method choice affects model outcomes. We evaluated combinations of both model algorithms (simple and complex generalized linear models, multivariate adaptive regression splines, Maxent, boosted regression trees, and random forest) and background methods (random, minimum convex polygon, and continuous and binary kernel density estimator (KDE)) to assess the sensitivity of model outcomes to choices made. We evaluated six questions related to model results, including five beyond the common comparison of model accuracy assessment metrics (biological interpretability of response curves, cross-validation robustness, independent data accuracy and robustness, and prediction consistency). For our case study with cheatgrass in the western US, random forest was least sensitive to background choice and the binary KDE method was least sensitive to model algorithm choice. While this outcome may not hold for other locations or species, the methods we used can be implemented to help determine appropriate methodologies for particular research questions.

  2. Missing data and multiple imputation in clinical epidemiological research.

    PubMed

    Pedersen, Alma B; Mikkelsen, Ellen M; Cronin-Fenton, Deirdre; Kristensen, Nickolaj R; Pham, Tra My; Pedersen, Lars; Petersen, Irene

    2017-01-01

    Missing data are ubiquitous in clinical epidemiological research. Individuals with missing data may differ from those with no missing data in terms of the outcome of interest and prognosis in general. Missing data are often categorized into the following three types: missing completely at random (MCAR), missing at random (MAR), and missing not at random (MNAR). In clinical epidemiological research, missing data are seldom MCAR. Missing data can constitute considerable challenges in the analyses and interpretation of results and can potentially weaken the validity of results and conclusions. A number of methods have been developed for dealing with missing data. These include complete-case analyses, missing indicator method, single value imputation, and sensitivity analyses incorporating worst-case and best-case scenarios. If applied under the MCAR assumption, some of these methods can provide unbiased but often less precise estimates. Multiple imputation is an alternative method to deal with missing data, which accounts for the uncertainty associated with missing data. Multiple imputation is implemented in most statistical software under the MAR assumption and provides unbiased and valid estimates of associations based on information from the available data. The method affects not only the coefficient estimates for variables with missing data but also the estimates for other variables with no missing data.
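
    A minimal illustration of multiple imputation under MAR with Rubin's-rules pooling, using simulated data. It is deliberately simplified: a proper implementation (as in standard statistical software) would also draw the regression parameters from their posterior between imputations rather than reusing fixed estimates.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated data: y depends on x; missingness in y depends only on the
# observed covariate x, i.e., missing at random (MAR).
n, m = 300, 20
x = rng.normal(size=n)
y = 2.0 + 1.5 * x + rng.normal(size=n)
miss = rng.random(n) < 0.3 * (x > 0)
y_obs = np.where(miss, np.nan, y)

obs = ~np.isnan(y_obs)
beta = np.polyfit(x[obs], y_obs[obs], 1)              # slope, intercept
resid_sd = np.std(y_obs[obs] - np.polyval(beta, x[obs]))

estimates, variances = [], []
for _ in range(m):
    y_imp = y_obs.copy()
    # Draw imputations from the predictive distribution (noise retained, so
    # the uncertainty of the missing values is propagated).
    y_imp[~obs] = np.polyval(beta, x[~obs]) + rng.normal(0, resid_sd, (~obs).sum())
    estimates.append(y_imp.mean())
    variances.append(y_imp.var(ddof=1) / n)

# Rubin's rules: total variance = within-imputation + (1 + 1/m) * between.
q_bar = np.mean(estimates)
t_var = np.mean(variances) + (1 + 1 / m) * np.var(estimates, ddof=1)
print(q_bar, np.sqrt(t_var))
```

    A complete-case mean of `y_obs` would be biased downward here because large-x (hence large-y) observations are preferentially missing; the imputation model recovers them through the regression on x.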

  3. Missing data and multiple imputation in clinical epidemiological research

    PubMed Central

    Pedersen, Alma B; Mikkelsen, Ellen M; Cronin-Fenton, Deirdre; Kristensen, Nickolaj R; Pham, Tra My; Pedersen, Lars; Petersen, Irene

    2017-01-01

    Missing data are ubiquitous in clinical epidemiological research. Individuals with missing data may differ from those with no missing data in terms of the outcome of interest and prognosis in general. Missing data are often categorized into the following three types: missing completely at random (MCAR), missing at random (MAR), and missing not at random (MNAR). In clinical epidemiological research, missing data are seldom MCAR. Missing data can constitute considerable challenges in the analyses and interpretation of results and can potentially weaken the validity of results and conclusions. A number of methods have been developed for dealing with missing data. These include complete-case analyses, missing indicator method, single value imputation, and sensitivity analyses incorporating worst-case and best-case scenarios. If applied under the MCAR assumption, some of these methods can provide unbiased but often less precise estimates. Multiple imputation is an alternative method to deal with missing data, which accounts for the uncertainty associated with missing data. Multiple imputation is implemented in most statistical software under the MAR assumption and provides unbiased and valid estimates of associations based on information from the available data. The method affects not only the coefficient estimates for variables with missing data but also the estimates for other variables with no missing data. PMID:28352203

  4. Local spatiotemporal time-frequency peak filtering method for seismic random noise reduction

    NASA Astrophysics Data System (ADS)

    Liu, Yanping; Dang, Bo; Li, Yue; Lin, Hongbo

    2014-12-01

    To achieve a higher level of seismic random noise suppression, the Radon transform was adopted to implement spatiotemporal time-frequency peak filtering (TFPF) in our previous studies, which performed TFPF in the full-aperture Radon domain, both linear and parabolic. Although the superiority of this method over conventional TFPF has been demonstrated on synthetic seismic models and field seismic data, the method still has limitations. Both full-aperture linear Radon and parabolic Radon are applicable and effective in relatively simple situations (e.g., curved reflection events with regular geometry) but fail in complicated situations such as reflection events with irregular shapes, or interlaced events with quite different slope or curvature parameters. A localized application of the Radon transform is therefore required: the filter is better served by adapting the transform to the local character of the data variations. In this article, we propose adopting a local Radon transform, referred to as piecewise full-aperture Radon, to realize spatiotemporal TFPF, called local spatiotemporal TFPF. Through experiments on synthetic seismic models and field seismic data, this study demonstrates the advantage of our method in seismic random noise reduction and reflection event recovery for relatively complicated seismic data.

  5. Realistic inversion of diffraction data for an amorphous solid: The case of amorphous silicon

    NASA Astrophysics Data System (ADS)

    Pandey, Anup; Biswas, Parthapratim; Bhattarai, Bishal; Drabold, D. A.

    2016-12-01

    We apply a method called "force-enhanced atomic refinement" (FEAR) to create a computer model of amorphous silicon (a-Si) based upon the highly precise x-ray diffraction experiments of Laaziri et al. [Phys. Rev. Lett. 82, 3460 (1999), 10.1103/PhysRevLett.82.3460]. The logic underlying our calculation is to estimate the structure of a real sample of a-Si using experimental data and chemical information included in an unbiased way, starting from random coordinates. The model is in close agreement with experiment and also sits at a suitable energy minimum according to density-functional calculations. In agreement with experiments, we find a small concentration of coordination defects, which we discuss, including their electronic consequences. The gap states in the FEAR model are delocalized compared to a continuous random network model. The method is more efficient and accurate, in the sense of fitting the diffraction data, than conventional melt-quench methods. We compute the vibrational density of states and the specific heat, and find that both compare favorably to experiments.

  6. OARSI Clinical Trials Recommendations: Design and conduct of clinical trials of lifestyle diet and exercise interventions for osteoarthritis.

    PubMed

    Messier, S P; Callahan, L F; Golightly, Y M; Keefe, F J

    2015-05-01

    The objective was to develop a set of "best practices" for use as a primer for those interested in entering the clinical trials field for lifestyle diet and/or exercise interventions in osteoarthritis (OA), and as a set of recommendations for experienced clinical trials investigators. A subcommittee of the non-pharmacologic therapies committee of the OARSI Clinical Trials Working Group was selected by the Steering Committee to develop a set of recommended principles for non-pharmacologic diet/exercise OA randomized clinical trials. Topics were identified for inclusion by co-authors and reviewed by the subcommittee. Resources included authors' expert opinions, traditional search methods including MEDLINE (via PubMed), and previously published guidelines. Suggested steps and considerations for study methods (e.g., recruitment and enrollment of participants, study design, intervention and assessment methods) were recommended. The recommendations set forth in this paper provide a guide from which a research group can design a lifestyle diet/exercise randomized clinical trial in patients with OA. Copyright © 2015 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  7. Quantifying the impact of fixed effects modeling of clusters in multiple imputation for cluster randomized trials

    PubMed Central

    Andridge, Rebecca R.

    2011-01-01

    In cluster randomized trials (CRTs), identifiable clusters rather than individuals are randomized to study groups. Resulting data often consist of a small number of clusters with correlated observations within a treatment group. Missing data often present a problem in the analysis of such trials, and multiple imputation (MI) has been used to create complete data sets, enabling subsequent analysis with well-established analysis methods for CRTs. We discuss strategies for accounting for clustering when multiply imputing a missing continuous outcome, focusing on estimation of the variance of group means as used in an adjusted t-test or ANOVA. These analysis procedures are congenial to (can be derived from) a mixed effects imputation model; however, this imputation procedure is not yet available in commercial statistical software. An alternative approach that is readily available and has been used in recent studies is to include fixed effects for cluster, but the impact of using this convenient method has not been studied. We show that under this imputation model the MI variance estimator is positively biased and that smaller ICCs lead to larger overestimation of the MI variance. Analytical expressions for the bias of the variance estimator are derived in the case of data missing completely at random (MCAR), and cases in which data are missing at random (MAR) are illustrated through simulation. Finally, various imputation methods are applied to data from the Detroit Middle School Asthma Project, a recent school-based CRT, and differences in inference are compared. PMID:21259309

  8. Colloidal polyaniline

    DOEpatents

    Armes, Steven P.; Aldissi, Mahmoud

    1990-01-01

    Processable electrically conductive latex polymer compositions including colloidal particles of an oxidized, polymerized amino-substituted aromatic monomer, a stabilizing effective amount of a random copolymer containing amino-benzene type moieties as side chain constituents, and dopant anions, and a method of preparing such polymer compositions are provided.

  9. Effectiveness of workplace weight management interventions: a systematic review

    USDA-ARS?s Scientific Manuscript database

    Background: A systematic review was conducted of randomized trials of workplace weight management interventions, including trials with dietary, physical activity, environmental, behavioral and incentive based components. Main outcomes were defined as change in weight-related measures. Methods: Key w...

  10. Methods for resistive switching of memristors

    DOEpatents

    Mickel, Patrick R.; James, Conrad D.; Lohn, Andrew; Marinella, Matthew; Hsia, Alexander H.

    2016-05-10

    The present invention is directed generally to resistive random-access memory (RRAM or ReRAM) devices and systems, as well as methods of employing a thermal resistive model to understand and determine switching of such devices. In a particular example, the method includes generating a power-resistance measurement for the memristor device and applying an isothermal model to the power-resistance measurement in order to determine one or more parameters of the device (e.g., filament state).

  11. Designing with fiber-reinforced plastics (planar random composites)

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1982-01-01

    The use of composite mechanics to predict the hygrothermomechanical behavior of planar random composites (PRC) is reviewed and described. These composites are usually made from chopped fiber reinforced resins (thermoplastics or thermosets). The hygrothermomechanical behavior includes mechanical properties, physical properties, thermal properties, fracture toughness, creep and creep rupture. Properties are presented in graphical form with sample calculations to illustrate their use. Concepts such as directional reinforcement and strip hybrids are described. Typical data that can be used for preliminary design for various PRCs are included. Several resins and molding compounds used to make PRCs are described briefly. Pertinent references are cited that cover analysis and design methods, materials, data, fabrication procedures and applications.

  12. 3D exemplar-based random walks for tooth segmentation from cone-beam computed tomography images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pei, Yuru, E-mail: peiyuru@cis.pku.edu.cn; Ai, Xin

    Purpose: Tooth segmentation is an essential step in acquiring patient-specific dental geometries from cone-beam computed tomography (CBCT) images. Tooth segmentation from CBCT images is still a challenging task considering the comparatively low image quality caused by the limited radiation dose, as well as structural ambiguities from intercuspation and nearby alveolar bones. The goal of this paper is to present and discuss the latest accomplishments in semisupervised tooth segmentation with adaptive 3D shape constraints. Methods: The authors propose a 3D exemplar-based random walk method of tooth segmentation from CBCT images. The proposed method integrates semisupervised label propagation and regularization by 3D exemplar registration. To begin with, the pure random walk method is used to obtain an initial segmentation of the teeth, which tends to be erroneous because of the structural ambiguity of CBCT images. Then, as an iterative refinement, the authors conduct a regularization by using 3D exemplar registration, as well as label propagation by random walks with soft constraints, to improve the tooth segmentation. In the first stage of the iteration, 3D exemplars with well-defined topologies are adapted to fit the tooth contours, which are obtained from the random walks based segmentation. The soft constraints on voxel labeling are defined by shape-based foreground dentine probability acquired by the exemplar registration, as well as the appearance-based probability from a support vector machine (SVM) classifier. In the second stage, the labels of the volume-of-interest (VOI) are updated by the random walks with soft constraints. The two stages are optimized iteratively. Instead of the one-shot label propagation in the VOI, an iterative refinement process can achieve a reliable tooth segmentation by virtue of exemplar-based random walks with adaptive soft constraints. 
Results: The proposed method was applied for tooth segmentation of twenty clinically captured CBCT images. Three metrics, including the Dice similarity coefficient (DSC), the Jaccard similarity coefficient (JSC), and the mean surface deviation (MSD), were used to quantitatively analyze the segmentation of anterior teeth including incisors and canines, premolars, and molars. The segmentation of the anterior teeth achieved a DSC up to 98%, a JSC of 97%, and an MSD of 0.11 mm compared with manual segmentation. For the premolars, the average values of DSC, JSC, and MSD were 98%, 96%, and 0.12 mm, respectively. The proposed method yielded a DSC of 95%, a JSC of 89%, and an MSD of 0.26 mm for molars. Aside from the interactive definition of label priors by the user, automatic tooth segmentation can be achieved in an average of 1.18 min. Conclusions: The proposed technique enables efficient and reliable tooth segmentation from CBCT images. This study makes it clinically practical to segment teeth from CBCT images, thus facilitating pre- and intraoperative uses of dental morphologies in maxillofacial and orthodontic treatments.
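The seeded label-propagation step described above can be sketched on a toy 1-D signal. This is a minimal sketch, assuming Gaussian intensity-difference edge weights on a chain graph; the hypothetical function `random_walker_1d` solves the standard random-walker Laplacian system and omits the paper's 3-D volumes, exemplar registration, and SVM soft constraints entirely:

```python
import numpy as np

def random_walker_1d(signal, seeds, beta=100.0):
    """Toy random-walker label propagation on a 1-D signal.
    seeds: dict {index: label in {0, 1}}. Returns a 0/1 label per sample."""
    n = len(signal)
    # Edge weights favour walks within regions of similar intensity.
    w = np.exp(-beta * np.diff(signal) ** 2)
    # Graph Laplacian of the chain graph.
    L = np.zeros((n, n))
    for i, wi in enumerate(w):
        L[i, i] += wi
        L[i + 1, i + 1] += wi
        L[i, i + 1] -= wi
        L[i + 1, i] -= wi
    labeled = sorted(seeds)
    unlabeled = [i for i in range(n) if i not in seeds]
    b = np.array([seeds[i] for i in labeled], dtype=float)
    # Probability of reaching a label-1 seed first: solve L_uu x = -L_ul b.
    x = np.linalg.solve(L[np.ix_(unlabeled, unlabeled)],
                        -L[np.ix_(unlabeled, labeled)] @ b)
    prob1 = np.zeros(n)
    prob1[unlabeled] = x
    for i in labeled:
        prob1[i] = seeds[i]
    return (prob1 > 0.5).astype(int)

# Two intensity plateaus; one seed per plateau propagates to its region.
sig = np.array([0.0, 0.1, 0.05, 0.9, 1.0, 0.95])
labels = random_walker_1d(sig, {0: 0, 5: 1})
```

The sharp intensity jump between samples 2 and 3 gets a near-zero edge weight, so each unlabeled sample is assigned the label of the seed on its own side of the jump.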

  13. Tailpulse signal generator

    DOEpatents

    Baker, John [Walnut Creek, CA; Archer, Daniel E [Knoxville, TN; Luke, Stanley John [Pleasanton, CA; Decman, Daniel J [Livermore, CA; White, Gregory K [Livermore, CA

    2009-06-23

    A tailpulse signal generating/simulating apparatus, system, and method designed to produce electronic pulses which simulate tailpulses produced by a gamma radiation detector, including the pileup effect caused by the characteristic exponential decay of the detector pulses, and the random Poisson-distributed pulse timing for radioactive materials. A digital signal processor (DSP) is programmed and configured to produce digital values corresponding to pseudo-randomly selected pulse amplitudes and pseudo-randomly selected Poisson timing intervals of the tailpulses. Pulse amplitude values are exponentially decayed while outputting the digital value to a digital-to-analog converter (DAC), and pulse amplitudes of new pulses are added to decaying pulses to simulate the pileup effect for enhanced realism in the simulation.
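The two ingredients of this simulation, exponential inter-arrival gaps (Poisson timing) and exponentially decaying amplitudes that sum when pulses overlap (pileup), can be sketched in software. A minimal sketch with illustrative parameter names and an illustrative uniform amplitude distribution, not taken from the patent:

```python
import math
import random

def simulate_tailpulses(rate_hz, decay_tau, duration_s, dt, seed=1):
    """Toy tailpulse trace: Poisson-timed pulse starts, each adding an
    exponentially decaying tail onto whatever earlier tails are still
    decaying (pileup)."""
    rng = random.Random(seed)
    n = int(duration_s / dt)
    trace = [0.0] * n
    t = rng.expovariate(rate_hz)          # first arrival (Poisson timing)
    while t < duration_s:
        amp = rng.uniform(0.5, 1.0)       # pseudo-random pulse height (illustrative)
        k0 = int(t / dt)
        for k in range(k0, n):
            # Exponential tail, summed onto earlier tails -> pileup.
            trace[k] += amp * math.exp(-(k - k0) * dt / decay_tau)
        t += rng.expovariate(rate_hz)     # next exponential inter-arrival gap
    return trace

# ~10 expected pulses in 1 ms at 10 kHz, sampled at 1 MHz.
trace = simulate_tailpulses(rate_hz=1e4, decay_tau=5e-5, duration_s=1e-3, dt=1e-6)
```

When a new pulse starts before the previous tail has decayed away, the trace momentarily exceeds the single-pulse amplitude, which is exactly the pileup effect the patent simulates.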

  14. Method For Determining And Modifying Protein/Peptide Solubility

    DOEpatents

    Waldo, Geoffrey S.

    2005-03-15

    A solubility reporter for measuring a protein's solubility in vivo or in vitro is described. The reporter, which can be used in a single living cell, gives a specific signal suitable for determining whether the cell bears a soluble version of the protein of interest. A pool of random mutants of an arbitrary protein, generated using error-prone in vitro recombination, may also be screened for more soluble versions using the reporter, and these versions may be recombined to yield variants having further-enhanced solubility. The method of the present invention includes "irrational" (random mutagenesis) methods, which do not require a priori knowledge of the three-dimensional structure of the protein of interest. Multiple sequences of mutation/genetic recombination and selection for improved solubility are demonstrated to yield versions of the protein which display enhanced solubility.

  15. Exploring MEDLINE Space with Random Indexing and Pathfinder Networks

    PubMed Central

    Cohen, Trevor

    2008-01-01

    The integration of disparate research domains is a prerequisite for the success of the translational science initiative. MEDLINE abstracts contain content from a broad range of disciplines, presenting an opportunity for the development of methods able to integrate the knowledge they contain. Latent Semantic Analysis (LSA) and related methods learn human-like associations between terms from unannotated text. However, their computational and memory demands limit their ability to address a corpus of this size. Furthermore, visualization methods previously used in conjunction with LSA have limited ability to define the local structure of the associative networks LSA learns. This paper explores these issues by (1) processing the entire MEDLINE corpus using Random Indexing, a variant of LSA, and (2) exploring learned associations using Pathfinder Networks. Meaningful associations are inferred from MEDLINE, including a drug-disease association undetected by PUBMED search. PMID:18999236

  16. Exploring MEDLINE space with random indexing and pathfinder networks.

    PubMed

    Cohen, Trevor

    2008-11-06

    The integration of disparate research domains is a prerequisite for the success of the translational science initiative. MEDLINE abstracts contain content from a broad range of disciplines, presenting an opportunity for the development of methods able to integrate the knowledge they contain. Latent Semantic Analysis (LSA) and related methods learn human-like associations between terms from unannotated text. However, their computational and memory demands limit their ability to address a corpus of this size. Furthermore, visualization methods previously used in conjunction with LSA have limited ability to define the local structure of the associative networks LSA learns. This paper explores these issues by (1) processing the entire MEDLINE corpus using Random Indexing, a variant of LSA, and (2) exploring learned associations using Pathfinder Networks. Meaningful associations are inferred from MEDLINE, including a drug-disease association undetected by PUBMED search.
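Random Indexing's appeal over plain LSA is that it never builds the full term-document matrix: each context gets a sparse random index vector, and a term's semantic vector is just the running sum of the index vectors of the contexts it occurs in. A minimal sketch with toy-sized dimensions and ternary index vectors; the paper's actual vector dimensionality and configuration are not reproduced here:

```python
import random
from collections import defaultdict

def random_indexing(docs, dim=64, nonzeros=4, seed=0):
    """Toy Random Indexing: sum sparse random document vectors into
    term vectors, incrementally and without a term-document matrix."""
    rng = random.Random(seed)
    def index_vector():
        v = [0] * dim
        for pos in rng.sample(range(dim), nonzeros):
            v[pos] = rng.choice((-1, 1))   # sparse ternary random projection
        return v
    term_vecs = defaultdict(lambda: [0] * dim)
    for doc in docs:
        iv = index_vector()                # one index vector per context
        for term in set(doc.split()):
            tv = term_vecs[term]
            for i in range(dim):
                tv[i] += iv[i]             # accumulate context vector
    return term_vecs

def cosine(a, b):
    num = sum(x * y for x, y in zip(a, b))
    den = (sum(x * x for x in a) * sum(y * y for y in b)) ** 0.5
    return num / den if den else 0.0

docs = ["aspirin treats headache", "aspirin treats fever", "guitar plays music"]
vecs = random_indexing(docs)
```

Terms that share contexts ("aspirin", "treats") accumulate overlapping index vectors and end up with high cosine similarity, while terms from disjoint contexts ("guitar") do not; this is the associative structure the paper then explores with Pathfinder Networks.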

  17. Effectiveness of anisodamine for the treatment of critically ill patients with septic shock (ACIdoSIS study): study protocol for randomized controlled trial

    PubMed Central

    Zhou, Jiancang; Shang, You; Wang, Xin’an; Yin, Rui; Zhu, Zhenhua; Chen, Wensen; Tian, Xin; Yu, Yuetian; Zuo, Xiangrong; Chen, Kun; Ji, Xuqing; Ni, Hongying

    2015-01-01

    Background Septic shock is an important contributor of mortality in the intensive care unit (ICU). Although strenuous effort has been made to improve its outcome, the mortality rate is only marginally decreased. The present study aimed to investigate the effectiveness of anisodamine in the treatment of septic shock, in the hope that the drug will provide alternatives to the treatment of septic shock. Methods The study is a multi-center randomized controlled clinical trial. Study population will include critically ill patients with septic shock requiring vasopressor use. Blocked randomization was performed where anisodamine and control treatments were allocated at random in a ratio of 1:1 in blocks of sizes 2, 4, 6, 8, and 10 to 354 subjects. Interim analysis will be performed. The primary study end point is the hospital mortality, and other secondary study endpoints include ICU mortality, length of stay in ICU and hospital, organ failure free days. Adverse events including new onset psychosis, urinary retention, significant hypotension and tachycardia will be reported. Discussion The study will provide new insight into the treatment of septic shock and can help to reduce mortality rate of septic shock. Trial registration NCT02442440 (https://register.clinicaltrials.gov/). PMID:26605292
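The allocation scheme described in this protocol, 1:1 permuted blocks with block sizes chosen at random from {2, 4, 6, 8, 10}, can be sketched as follows. A minimal sketch with an illustrative seed and hypothetical function name; note that truncating the list at exactly 354 subjects can leave a small imbalance if the final block is cut short:

```python
import random

def blocked_randomization(n_subjects, block_sizes=(2, 4, 6, 8, 10),
                          arms=("anisodamine", "control"), seed=42):
    """Generate a 1:1 allocation list using randomly permuted blocks
    of randomly chosen even sizes."""
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n_subjects:
        size = rng.choice(block_sizes)            # unpredictable block size
        block = [arms[i % 2] for i in range(size)]  # equal split within block
        rng.shuffle(block)                        # random order within block
        allocation.extend(block)
    return allocation[:n_subjects]

alloc = blocked_randomization(354)
```

Varying the block size keeps the next assignment unpredictable (a fixed block size of 2 would let investigators infer every second allocation), which is the point of using several block sizes in the protocol.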

  18. Components of effective randomized controlled trials of hydrotherapy programs for fibromyalgia syndrome: A systematic review

    PubMed Central

    Perraton, Luke; Machotka, Zuzana; Kumar, Saravana

    2009-01-01

    Aim Previous systematic reviews have found hydrotherapy to be an effective management strategy for fibromyalgia syndrome (FMS). The aim of this systematic review was to summarize the components of hydrotherapy programs used in randomized controlled trials. Method A systematic review of randomized controlled trials was conducted. Only trials that have reported significant FMS-related outcomes were included. Data relating to the components of hydrotherapy programs (exercise type, duration, frequency and intensity, environmental factors, and service delivery) were analyzed. Results Eleven randomized controlled trials were included in this review. Overall, the quality of trials was good. Aerobic exercise featured in all 11 trials and the majority of hydrotherapy programs included either a strengthening or flexibility component. Great variability was noted in both the environmental components of hydrotherapy programs and service delivery. Conclusions Aerobic exercise, warm up and cool-down periods and relaxation exercises are common features of hydrotherapy programs that report significant FMS-related outcomes. Treatment duration of 60 minutes, frequency of three sessions per week and an intensity equivalent to 60%–80% maximum heart rate were the most commonly reported exercise components. Exercise appears to be the most important component of an effective hydrotherapy program for FMS, particularly when considering mental health-related outcomes. PMID:21197303

  19. At convenience and systematic random sampling: effects on the prognostic value of nuclear area assessments in breast cancer patients.

    PubMed

    Jannink, I; Bennen, J N; Blaauw, J; van Diest, P J; Baak, J P

    1995-01-01

    This study compares the influence of two different nuclear sampling methods on the prognostic value of assessments of the mean and standard deviation of nuclear area (MNA, SDNA) in 191 consecutive invasive breast cancer patients with long-term follow-up. The first sampling method used was 'at convenience' sampling (ACS); the second, systematic random sampling (SRS). Both sampling methods were tested with a sample size of 50 nuclei (ACS-50 and SRS-50). To determine whether, besides the sampling method, sample size had an impact on prognostic value as well, the SRS method was also tested using a sample size of 100 nuclei (SRS-100). SDNA values were systematically lower for ACS, evidently due to (unconsciously) not including small and large nuclei. Testing the prognostic value of a series of cut-off points, MNA and SDNA values assessed by the SRS method were prognostically significantly stronger than the values obtained by the ACS method. This was confirmed in Cox regression analysis. For the MNA, the Mantel-Cox p-values from SRS-50 and SRS-100 measurements were not significantly different. However, for the SDNA, SRS-100 yielded significantly lower p-values than SRS-50. In conclusion, compared with the 'at convenience' nuclear sampling method, systematic random sampling of nuclei is not only superior with respect to reproducibility of results, but also provides better prognostic value in patients with invasive breast cancer.
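Systematic random sampling's advantage over convenience sampling is mechanical: a fixed step through the ordered population from a random start leaves no room to (un)consciously skip the smallest and largest nuclei. A minimal sketch with a hypothetical function name and an illustrative seed:

```python
import random

def systematic_random_sample(population, k, seed=3):
    """Systematic random sampling: take every (n/k)-th item starting
    from a random offset within the first step."""
    step = len(population) / k
    rng = random.Random(seed)
    start = rng.uniform(0, step)          # random start in [0, step)
    return [population[int(start + i * step)] for i in range(k)]

areas = list(range(1000))                 # stand-in for measured nuclear areas
sample = systematic_random_sample(areas, 50)
```

Because the selection rule touches the full range of the population, extreme values enter the sample at their true frequency, which is why the study found SDNA was not systematically underestimated under SRS.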

  20. What are the appropriate methods for analyzing patient-reported outcomes in randomized trials when data are missing?

    PubMed

    Hamel, J F; Sebille, V; Le Neel, T; Kubis, G; Boyer, F C; Hardouin, J B

    2017-12-01

    Subjective health measurements using Patient-Reported Outcomes (PRO) are increasingly used in randomized trials, particularly for patient group comparisons. Two main types of analytical strategies can be used for such data: Classical Test Theory (CTT) and Item Response Theory (IRT) models. These two strategies display very similar characteristics when data are complete, but in the common case when data are missing, whether IRT or CTT would be the most appropriate remains unknown and was investigated using simulations. We simulated PRO data such as quality-of-life data. Missing responses to items were simulated as being completely random, depending on an observable covariate, or depending on an unobserved latent trait. The CTT-based methods considered allowed comparing scores using complete-case analysis, personal mean imputation, or multiple imputation based on a two-way procedure. The IRT-based method was the Wald test on a Rasch model including a group covariate. The IRT-based method and the multiple-imputation-based CTT method displayed the highest observed power and were the only unbiased methods whatever the kind of missing data. Online software and Stata® modules compatible with the mi impute suite are provided for performing such analyses. Traditional procedures (listwise deletion and personal mean imputation) should be avoided, due to inevitable problems of bias and lack of power.

  1. Methods for increasing upper airway muscle tonus in treating obstructive sleep apnea: systematic review.

    PubMed

    Valbuza, Juliana Spelta; de Oliveira, Márcio Moysés; Conti, Cristiane Fiquene; Prado, Lucila Bizari F; de Carvalho, Luciane Bizari Coin; do Prado, Gilmar Fernandes

    2010-12-01

    Treatment of obstructive sleep apnea (OSA) using methods for increasing upper airway muscle tonus has been controversial and poorly reported. Thus, a review of the evidence is needed to evaluate the effectiveness of these methods. The design used was a systematic review of randomized controlled trials. Data sources are from the Cochrane Library, Medline, Embase and Scielo, registries of ongoing trials, theses indexed at Biblioteca Regional de Medicina/Pan-American Health Organization of the World Health Organization and the reference lists of all the trials retrieved. This was a review of randomized or quasi-randomized double-blind trials on OSA. Two reviewers independently applied eligibility criteria. One reviewer assessed study quality and extracted data, and these processes were checked by a second reviewer. The primary outcome was a decrease in the apnea/hypopnea index (AHI) of below five episodes per hour. Other outcomes were subjective sleep quality, sleep quality measured by night polysomnography, quality of life measured subjectively and adverse events associated with the treatments. Three eligible trials were included. Two studies showed improvements through the objective and subjective analyses, and one study showed improvement of snoring, but not of AHI while the subjective analyses showed no improvement. The adverse events were reported and they were not significant. There is no accepted scientific evidence that methods aiming to increase muscle tonus of the stomatognathic system are effective in reducing AHI to below five events per hour. Well-designed randomized controlled trials are needed to assess the efficacy of such methods.

  2. Probabilistic structural analysis methods for improving Space Shuttle engine reliability

    NASA Technical Reports Server (NTRS)

    Boyce, L.

    1989-01-01

    Probabilistic structural analysis methods are particularly useful in the design and analysis of critical structural components and systems that operate in very severe and uncertain environments. These methods have recently found application in space propulsion systems to improve the structural reliability of Space Shuttle Main Engine (SSME) components. A computer program, NESSUS, based on a deterministic finite-element program and a method of probabilistic analysis (fast probability integration) provides probabilistic structural analysis for selected SSME components. While computationally efficient, it considers both correlated and nonnormal random variables as well as an implicit functional relationship between independent and dependent variables. The program is used to determine the response of a nickel-based superalloy SSME turbopump blade. Results include blade tip displacement statistics due to the variability in blade thickness, modulus of elasticity, Poisson's ratio or density. Modulus of elasticity significantly contributed to blade tip variability while Poisson's ratio did not. Thus, a rational method for choosing parameters to be modeled as random is provided.

  3. Ship Detection Based on Multiple Features in Random Forest Model for Hyperspectral Images

    NASA Astrophysics Data System (ADS)

    Li, N.; Ding, L.; Zhao, H.; Shi, J.; Wang, D.; Gong, X.

    2018-04-01

    A novel ship detection method that aims to make full use of both the spatial and spectral information in hyperspectral images is proposed. First, a band with a high signal-to-noise ratio in the near-infrared or short-wave infrared range is used to segment land and sea with Otsu's threshold segmentation method. Second, multiple features, including spectral and texture features, are extracted from the hyperspectral images: principal components analysis (PCA) is used to extract spectral features, and the Grey Level Co-occurrence Matrix (GLCM) is used to extract texture features. Finally, a Random Forest (RF) model is introduced to detect ships based on the extracted features. To illustrate the effectiveness of the method, we carry out experiments on EO-1 data, comparing a single feature with different multiple features. Compared with the traditional single-feature method and a Support Vector Machine (SVM) model, the proposed method stably achieves target detection of ships against complex backgrounds and effectively improves ship detection accuracy.
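The first step of this pipeline, Otsu thresholding of a high-SNR band to split sea from land, can be sketched on synthetic data. A minimal sketch of Otsu's method only (the PCA, GLCM, and Random Forest stages are omitted); the two-cluster toy band and all parameter values are illustrative:

```python
import numpy as np

def otsu_threshold(band, nbins=256):
    """Otsu's method: pick the threshold that maximizes the
    between-class variance of the band's intensity histogram."""
    hist, edges = np.histogram(band, bins=nbins)
    p = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    best_k, best_var = 1, -1.0
    for k in range(1, nbins):
        w0, w1 = p[:k].sum(), p[k:].sum()     # class weights
        if w0 == 0 or w1 == 0:
            continue
        m0 = (p[:k] * centers[:k]).sum() / w0  # class means
        m1 = (p[k:] * centers[k:]).sum() / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_k = var_between, k
    return edges[best_k]

# Toy band: dark "sea" pixels near 0.1, bright "land" pixels near 0.8.
rng = np.random.default_rng(0)
band = np.concatenate([rng.normal(0.1, 0.02, 500), rng.normal(0.8, 0.05, 500)])
t = otsu_threshold(band)
mask = band > t   # True = land
```

With a clearly bimodal band the threshold lands in the gap between the sea and land intensity clusters, so the resulting mask restricts the later feature extraction and RF classification to sea pixels.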

  4. Differences in Reporting of Analyses in Internal Company Documents Versus Published Trial Reports: Comparisons in Industry-Sponsored Trials in Off-Label Uses of Gabapentin

    PubMed Central

    Vedula, S. Swaroop; Li, Tianjing; Dickersin, Kay

    2013-01-01

    Background Details about the type of analysis (e.g., intent to treat [ITT]) and definitions (i.e., criteria for including participants in the analysis) are necessary for interpreting a clinical trial's findings. Our objective was to compare the description of types of analyses and criteria for including participants in the publication (i.e., what was reported) with descriptions in the corresponding internal company documents (i.e., what was planned and what was done). Trials were for off-label uses of gabapentin sponsored by Pfizer and Parke-Davis, and documents were obtained through litigation. Methods and Findings For each trial, we compared internal company documents (protocols, statistical analysis plans, and research reports, all unpublished), with publications. One author extracted data and another verified, with a third person verifying discordant items and a sample of the rest. Extracted data included the number of participants randomized and analyzed for efficacy, and types of analyses for efficacy and safety and their definitions (i.e., criteria for including participants in each type of analysis). We identified 21 trials, 11 of which were published randomized controlled trials, and that provided the documents needed for planned comparisons. For three trials, there was disagreement on the number of randomized participants between the research report and publication. Seven types of efficacy analyses were described in the protocols, statistical analysis plans, and publications, including ITT and six others. The protocol or publication described ITT using six different definitions, resulting in frequent disagreements between the two documents (i.e., different numbers of participants were included in the analyses). Conclusions Descriptions of analyses conducted did not agree between internal company documents and what was publicly reported. 
Internal company documents provide extensive documentation of methods planned and used, and trial findings, and should be publicly accessible. Reporting standards for randomized controlled trials should recommend transparent descriptions and definitions of analyses performed and which study participants are excluded. Please see later in the article for the Editors' Summary PMID:23382656

  5. The Clinical Effects of Aromatherapy Massage on Reducing Pain for the Cancer Patients: Meta-Analysis of Randomized Controlled Trials.

    PubMed

    Chen, Ting-Hao; Tung, Tao-Hsin; Chen, Pei-Shih; Wang, Shu-Hui; Chao, Chuang-Min; Hsiung, Nan-Hsing; Chi, Ching-Chi

    2016-01-01

    Purpose. Aromatherapy massage is an alternative treatment for reducing pain in cancer patients. This study investigated whether aromatherapy massage could improve the pain of cancer patients. Methods. We searched PubMed and the Cochrane Library for relevant randomized controlled trials without language limitations between 1 January 1990 and 31 July 2015, with a priori defined inclusion and exclusion criteria. The search terms included aromatherapy, essential oil, pain, ache, cancer, tumor, and carcinoma. Of 63 eligible publications, 7 studies met the selection criteria and 3 studies were eventually included. Results. This meta-analysis included three randomized controlled trials with a total of 278 participants (135 participants in the massage-with-essential-oil group and 143 participants in the control (usual care) group). Compared with the control group, the massage-with-essential-oil group had a nonsignificant effect on reducing pain (standardized mean difference = 0.01; 95% CI [-0.23, 0.24]). Conclusion. Aromatherapy massage does not appear to reduce pain in cancer patients. Further rigorous studies should be conducted with more objective measures.

  6. Evaluation of some random effects methodology applicable to bird ringing data

    USGS Publications Warehouse

    Burnham, K.P.; White, Gary C.

    2002-01-01

    Existing models for ring recovery and recapture data analysis treat temporal variations in annual survival probability (S) as fixed effects. Often there is no explainable structure to the temporal variation in S_1, ..., S_k; random effects can then be a useful model: S_i = E(S) + ε_i. Here, the temporal variation in survival probability is treated as random, with variance E(ε_i²) = σ². This random effects model can now be fit in program MARK. Resultant inferences include point and interval estimation for the process variation σ², and estimation of E(S) and var(Ê(S)), where the latter includes a component for σ² as well as the traditional sampling-variance component. Furthermore, the random effects model leads to shrinkage estimates, S̃_i, as improved (in mean square error) estimators of S_i compared to the MLEs, Ŝ_i, from the unrestricted time-effects model. Appropriate confidence intervals based on the S̃_i are also provided. In addition, AIC has been generalized to random effects models. This paper presents results of a Monte Carlo evaluation of inference performance under the simple random effects model. Examined by simulation, under the simple one-group Cormack-Jolly-Seber (CJS) model, are issues such as bias of σ̂², confidence interval coverage on σ², coverage and mean square error comparisons for inference about S_i based on shrinkage versus maximum likelihood estimators, and performance of AIC model selection over three models: S_i ≡ S (no effects), S_i = E(S) + ε_i (random effects), and S_1, ..., S_k (fixed effects). For the cases simulated, the random effects methods performed well and were uniformly better than the fixed-effects MLEs for the S_i.
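The shrinkage idea in this record can be illustrated numerically. A minimal sketch, assuming the generic random-effects (empirical-Bayes) form in which each annual MLE is pulled toward the overall mean by the factor σ²/(σ² + se_i²); the survival values and standard errors are made up, and this is not MARK's exact implementation:

```python
def shrinkage_estimates(s_hat, se, sigma2):
    """Shrink each annual survival MLE toward the overall mean,
    with more shrinkage for imprecise estimates (large se)."""
    mean_s = sum(s_hat) / len(s_hat)
    return [mean_s + (sigma2 / (sigma2 + se_i ** 2)) * (si - mean_s)
            for si, se_i in zip(s_hat, se)]

s_hat = [0.55, 0.70, 0.62, 0.48]   # annual survival MLEs (illustrative)
se = [0.06, 0.08, 0.05, 0.07]      # their standard errors (illustrative)
shrunk = shrinkage_estimates(s_hat, se, sigma2=0.002)
```

When the process variation σ² is small relative to the sampling variances, the shrunken estimates sit well inside the spread of the raw MLEs, which is the mean-square-error improvement the Monte Carlo evaluation documents.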

  7. Logistic random effects regression models: a comparison of statistical packages for binary and ordinal outcomes

    PubMed Central

    2011-01-01

    Background Logistic random effects models are a popular tool to analyze multilevel (also called hierarchical) data with a binary or ordinal outcome. Here, we aim to compare different statistical software implementations of these models. Methods We used individual patient data from 8509 patients in 231 centers with moderate and severe Traumatic Brain Injury (TBI) enrolled in eight Randomized Controlled Trials (RCTs) and three observational studies. We fitted logistic random effects regression models with the 5-point Glasgow Outcome Scale (GOS) as outcome, both dichotomized as well as ordinal, with center and/or trial as random effects, and as covariates age, motor score, pupil reactivity or trial. We then compared the implementations of frequentist and Bayesian methods to estimate the fixed and random effects. Frequentist approaches included R (lme4), Stata (GLLAMM), SAS (GLIMMIX and NLMIXED), MLwiN ([R]IGLS) and MIXOR; Bayesian approaches included WinBUGS, MLwiN (MCMC), R package MCMCglmm and SAS experimental procedure MCMC. Three data sets (the full data set and two sub-datasets) were analysed using two logistic random effects models with either one random effect for the center or two random effects for center and trial. For the ordinal outcome in the full data set, a proportional odds model with a random center effect was also fitted. Results The packages gave similar parameter estimates for both the fixed and random effects and for the binary (and ordinal) models for the main study and when based on a relatively large number of level-1 (patient level) data compared to the number of level-2 (hospital level) data. However, when based on a relatively sparse data set, i.e. when the numbers of level-1 and level-2 data units were about the same, the frequentist and Bayesian approaches showed somewhat different results. The software implementations differ considerably in flexibility, computation time, and usability. 
There are also differences in the availability of additional tools for model evaluation, such as diagnostic plots. The experimental SAS (version 9.2) procedure MCMC appeared to be inefficient. Conclusions On relatively large data sets, the different software implementations of logistic random effects regression models produced similar results. Thus, for a large data set there seems to be no explicit preference (of course, if there is no preference from a philosophical point of view) for either a frequentist or Bayesian approach (if based on vague priors). The choice of a particular implementation may largely depend on the desired flexibility and the usability of the package. For small data sets the random effects variances are difficult to estimate. In the frequentist approaches the MLE of this variance was often estimated as zero, with a standard error that is either zero or could not be determined, while for Bayesian methods the estimates could depend on the chosen "non-informative" prior of the variance parameter. The starting value for the variance parameter may also be critical for the convergence of the Markov chain. PMID:21605357

  8. Quantum Random Number Generation Using a Quanta Image Sensor

    PubMed Central

    Amri, Emna; Felk, Yacine; Stucki, Damien; Ma, Jiaju; Fossum, Eric R.

    2016-01-01

    A new quantum random number generation method is proposed. The method is based on the randomness of the photon emission process and the single-photon counting capability of the Quanta Image Sensor (QIS). It has the potential to generate high-quality random numbers at a remarkable data output rate. In this paper, the principle of photon statistics and the theory of entropy are discussed. Sample data were collected with a QIS jot device, and their randomness quality was analyzed. The randomness assessment method and results are discussed. PMID:27367698
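The principle can be sketched with simulated photon statistics: per-exposure photon counts are Poisson distributed, and random bits can be extracted from their parities. This is a toy sketch under stated assumptions, Poisson counts standing in for real QIS jot measurements and von Neumann debiasing as one simple extraction scheme; the paper's actual extraction and assessment procedure may differ:

```python
import math
import random

def photon_counts(mean_rate, n, seed=7):
    """Simulate n per-exposure photon counts via Knuth's Poisson sampler
    (stand-in for real single-photon counts from a QIS jot)."""
    rng = random.Random(seed)
    counts = []
    for _ in range(n):
        threshold, k, p = math.exp(-mean_rate), 0, 1.0
        while p > threshold:
            k += 1
            p *= rng.random()
        counts.append(k - 1)
    return counts

def extract_bits(counts):
    """Von Neumann debiasing on count parities: 01 -> 0, 10 -> 1,
    00/11 discarded, removing the bias of the raw parity stream."""
    parities = [c & 1 for c in counts]
    bits = []
    for a, b in zip(parities[::2], parities[1::2]):
        if a != b:
            bits.append(a)
    return bits

bits = extract_bits(photon_counts(1.5, 4000))
```

The parity of a Poisson count is slightly biased toward even values, so some form of post-processing is needed; von Neumann pairing yields exactly unbiased output bits at the cost of discarding roughly half the pairs.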

  9. Two-dimensional hexagonally oriented CdCl2.H2O nanorod assembly: formation and replication.

    PubMed

    Deng, Zhaoxiang; Mao, Chengde

    2004-09-14

    This paper reports a simple bottom-up method that can controllably fabricate 2D hexagonally oriented and randomly distributed CdCl(2).H(2)O nanorods on mica surfaces. The as-formed nanorod assemblies have been successfully replicated into various matrixes, including gold, poly(dimethylsiloxane), and polyurethane. Thus, this method is compatible with soft-lithography towards further applications.

  10. Ground vibration test of the laminar flow control JStar airplane

    NASA Technical Reports Server (NTRS)

    Kehoe, M. W.; Cazier, F. W., Jr.; Ellison, J. F.

    1985-01-01

    A ground vibration test was conducted on a Lockheed JetStar airplane that had been modified for the purpose of conducting laminar flow control experiments. The test was performed prior to initial flight flutter tests. Both sine-dwell and single-point-random excitation methods were used. The data presented include frequency response functions and a comparison of mode frequencies and mode shapes from both methods.
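    The frequency response functions mentioned above are typically estimated from random-excitation data with the H1 estimator (averaged cross-spectrum over averaged input auto-spectrum). A minimal sketch, assuming a known two-tap FIR system in place of real test data:

```python
import numpy as np

# Simulated random excitation and the response of a known two-tap FIR
# "structure" (stand-in for measured shaker input and accelerometer output).
rng = np.random.default_rng(0)
x = rng.standard_normal(8192)
y = 0.5 * x + 0.5 * np.concatenate(([0.0], x[:-1]))

def frf_h1(x, y, nseg=16):
    """H1 FRF estimate: segment-averaged cross-spectrum / input auto-spectrum."""
    n = len(x) // nseg
    Sxy = np.zeros(n, dtype=complex)
    Sxx = np.zeros(n)
    for k in range(nseg):
        X = np.fft.fft(x[k * n:(k + 1) * n])
        Y = np.fft.fft(y[k * n:(k + 1) * n])
        Sxy += np.conj(X) * Y
        Sxx += np.abs(X) ** 2
    return Sxy / Sxx

H = frf_h1(x, y)
gain_dc = abs(H[0])   # the FIR above has unit gain at 0 Hz
```

    The segment averaging mirrors how single-point-random test data are reduced in practice; mode frequencies and shapes are then read off the peaks of such FRFs.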

  11. Impact of including or excluding both-armed zero-event studies on using standard meta-analysis methods for rare event outcome: a simulation study

    PubMed Central

    Cheng, Ji; Pullenayegum, Eleanor; Marshall, John K; Thabane, Lehana

    2016-01-01

    Objectives There is no consensus on whether studies with no observed events in the treatment and control arms, the so-called both-armed zero-event studies, should be included in a meta-analysis of randomised controlled trials (RCTs). Current analytic approaches handle them differently depending on the choice of effect measure and the authors' discretion. Our objective was to evaluate the impact of including or excluding both-armed zero-event (BA0E) studies in meta-analysis of RCTs with rare outcome events through a simulation study. Method We simulated 2500 data sets for different scenarios, varying the parameters of baseline event rate, treatment effect, number of patients in each trial, and between-study variance. We evaluated the performance of commonly used pooling methods in classical meta-analysis—namely, Peto, Mantel-Haenszel with fixed-effects and random-effects models, and the inverse variance method with fixed-effects and random-effects models—using bias, root mean square error, length of the 95% CI and coverage. Results The overall performance of the approaches of including or excluding BA0E studies in meta-analysis varied according to the magnitude of the true treatment effect. Including BA0E studies introduced very little bias, decreased the mean square error, narrowed the 95% CI and increased the coverage when no true treatment effect existed. However, when a true treatment effect existed, the approach of excluding BA0E studies led to smaller bias than including them. Among all evaluated methods, the Peto method excluding BA0E studies gave the least biased results when a true treatment effect existed. Conclusions We recommend including BA0E studies when treatment effects are unlikely, but excluding them when there is a decisive treatment effect.
Providing results of including and excluding BA0E studies to assess the robustness of the pooled estimated effect is a sensible way to communicate the results of a meta-analysis when the treatment effects are unclear. PMID:27531725
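    The method-dependence discussed above can be seen directly in the Peto one-step estimator, for which a BA0E study contributes zero to both the O-E sum and the variance sum, so including or excluding it changes nothing. A minimal sketch (the 2x2 tables are illustrative, not the paper's simulated data):

```python
def peto_pooled_log_or(tables):
    """Peto one-step pooled log odds ratio.
    tables: list of (events_treatment, n_treatment, events_control, n_control)."""
    o_minus_e = v_total = 0.0
    for a, n1, c, n2 in tables:
        N = n1 + n2
        m1 = a + c                                       # total events in the study
        E = m1 * n1 / N                                  # expected treatment-arm events
        V = m1 * (N - m1) * n1 * n2 / (N * N * (N - 1))  # hypergeometric variance
        o_minus_e += a - E
        v_total += V
    return o_minus_e / v_total

with_ba0e = [(1, 100, 4, 100), (0, 50, 0, 50)]   # second study has zero events in both arms
without_ba0e = [(1, 100, 4, 100)]
```

    Other estimators (e.g. Mantel-Haenszel or inverse variance with continuity corrections) do change when BA0E studies enter, which is why the paper's recommendation depends on the method and the true effect.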

  12. Cascade phenomenon against subsequent failures in complex networks

    NASA Astrophysics Data System (ADS)

    Jiang, Zhong-Yuan; Liu, Zhi-Quan; He, Xuan; Ma, Jian-Feng

    2018-06-01

    Cascade phenomena may lead to catastrophic disasters which severely imperil network safety or security in various complex systems such as communication networks, power grids, social networks and so on. In some flow-based networks, the load of failed nodes can be redistributed locally to their neighboring nodes to avoid traffic oscillations or large-scale cascading failures. However, in such a local flow redistribution model, a small set of key nodes attacked in sequence can result in network collapse. It is therefore a critical problem to effectively find the set of key nodes in the network. To the best of our knowledge, this work is the first to study this problem comprehensively. We first introduce an extra capacity for every node to absorb flow fluctuations from neighbors, and two extra capacity distributions, a degree-based distribution and an average distribution, are employed. Four heuristic key-node discovery methods are presented: High-Degree-First (HDF), Low-Degree-First (LDF), Random, and a Greedy Algorithm (GA). Extensive simulations are conducted on both scale-free networks and random networks. The results show that the greedy algorithm can efficiently find the set of key nodes in both scale-free and random networks. Our work studies network robustness against cascading failures from a novel perspective, and the methods and results are useful for network robustness evaluation and protection.
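    A toy sketch of a local load-redistribution cascade and a greedy key-node search (a simplified Motter-Lai-style model; the capacity rule, parameters, and example graph are illustrative, not the paper's exact formulation):

```python
def cascade_size(adj, seeds, alpha=0.2, extra=0.0):
    """Fail the seed nodes; each failed node's load is split equally among its
    surviving neighbors, and any overloaded neighbor fails in turn."""
    load = {v: float(len(nb)) for v, nb in adj.items()}   # initial load = degree
    cap = {v: (1 + alpha) * load[v] + extra for v in adj} # tolerance + extra capacity
    failed = set(seeds)
    frontier = list(seeds)
    while frontier:
        nxt = []
        for v in frontier:
            alive = [u for u in adj[v] if u not in failed]
            if not alive:
                continue
            share = load[v] / len(alive)
            for u in alive:
                load[u] += share
                if load[u] > cap[u] and u not in failed:
                    failed.add(u)
                    nxt.append(u)
        frontier = nxt
    return len(failed)

def greedy_key_nodes(adj, k, **kw):
    """Greedily pick, at each step, the node whose failure maximizes the cascade."""
    chosen = []
    for _ in range(k):
        best = max((v for v in adj if v not in chosen),
                   key=lambda v: cascade_size(adj, chosen + [v], **kw))
        chosen.append(best)
    return chosen

# a star graph: failing the hub overloads every leaf, failing a leaf does nothing
adj = {0: [1, 2, 3, 4, 5], 1: [0], 2: [0], 3: [0], 4: [0], 5: [0]}
key = greedy_key_nodes(adj, 1)
```

    On real topologies the greedy search is expensive (one cascade simulation per candidate per step), which is the trade-off the paper's comparison against HDF, LDF and Random heuristics explores.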

  13. Performance Enhancement of the RatCAP Awake Rat Brain PET System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vaska, P.; Woody, C.

    The first full prototype of the RatCAP PET system, designed to image the brain of a rat while conscious, has been completed. Initial results demonstrated excellent spatial resolution, 1.8 mm FWHM with filtered backprojection and <1.5 mm FWHM with a Monte Carlo based MLEM method. However, noise equivalent count rate studies indicated the need for better timing to mitigate the effect of randoms. Thus, the front-end ASIC has been redesigned to minimize time walk, an accurate coincidence time alignment method has been implemented, and a variance reduction technique for the randoms is being developed. To maximize the quantitative capabilities required for neuroscience, corrections are being implemented and validated for positron range and photon noncollinearity, scatter (including outside the field of view), attenuation, randoms, and detector efficiency (deadtime is negligible). In addition, a more robust and compact PCI-based optical data acquisition system has been built to replace the original VME-based system while retaining the Linux-based data processing and image reconstruction codes. Finally, a number of new animal imaging experiments have been carried out to demonstrate the performance of the RatCAP in real imaging situations, including an F-18 fluoride bone scan, a C-11 raclopride scan, and a dynamic C-11 methamphetamine scan.

  14. Impact of a mHealth intervention for peer health workers on AIDS care in rural Uganda: a mixed methods evaluation of a cluster-randomized trial.

    PubMed

    Chang, Larry W; Kagaayi, Joseph; Arem, Hannah; Nakigozi, Gertrude; Ssempijja, Victor; Serwadda, David; Quinn, Thomas C; Gray, Ronald H; Bollinger, Robert C; Reynolds, Steven J

    2011-11-01

    Mobile phone access in low- and middle-income countries is rapidly expanding and offers an opportunity to leverage limited human resources for health. We conducted a mixed methods evaluation of an exploratory substudy of a cluster-randomized trial on the impact of a mHealth (mobile phone) support intervention used by community-based peer health workers (PHWs) on AIDS care in rural Uganda. Twenty-nine PHWs at 10 clinics were randomized by clinic to receive the intervention or not. PHWs used phones to call and text higher-level providers with patient-specific clinical information. A total of 970 patients cared for by the PHWs were followed over a 26-month period. No significant differences were found in patients' risk of virologic failure. Qualitative analyses found improvements in patient care and logistics and broad support for the mHealth intervention among patients, clinic staff, and PHWs. Key challenges identified included variable patient phone access, privacy concerns, and phone maintenance.

  15. Interrelation Between Safety Factors and Reliability

    NASA Technical Reports Server (NTRS)

    Elishakoff, Isaac; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    An evaluation was performed to establish relationships between safety factors and reliability. Results obtained show that the use of the safety factor is not contradictory to the employment of the probabilistic methods. In many cases the safety factors can be directly expressed by the required reliability levels. However, there is a major difference that must be emphasized: whereas the safety factors are allocated in an ad hoc manner, the probabilistic approach offers a unified mathematical framework. The establishment of the interrelation between the concepts opens an avenue to specify safety factors based on reliability. In cases where there are several forms of failure, the allocation of safety factors should be based on having the same reliability associated with each failure mode. This immediately suggests that with probabilistic methods the existing over-design or under-design can be eliminated. The report includes three parts: Part 1-Random Actual Stress and Deterministic Yield Stress; Part 2-Deterministic Actual Stress and Random Yield Stress; Part 3-Both Actual Stress and Yield Stress Are Random.
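    Part 1 (random actual stress, deterministic yield stress) can be illustrated for a normally distributed stress, where the reliability implied by a central safety factor follows directly from the normal CDF. This is a generic sketch, not the report's derivation:

```python
import math

def reliability(mean_stress, cov, safety_factor):
    """P(stress < yield) for stress ~ N(mean, (cov*mean)^2), yield = n * mean.
    cov is the coefficient of variation of the stress."""
    sigma = cov * mean_stress
    y = safety_factor * mean_stress
    z = (y - mean_stress) / sigma          # reliability index
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF

# e.g. a central safety factor of 2 with 20% stress scatter:
R = reliability(100.0, 0.20, 2.0)
```

    The same safety factor yields a different reliability for a different scatter, which is exactly the ad hoc character of factor-based design the abstract points out.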

  16. Random walks on cubic lattices with bond disorder

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ernst, M.H.; van Velthoven, P.F.J.

    1986-12-01

    The authors consider diffusive systems with static disorder, such as Lorentz gases, lattice percolation, ants in a labyrinth, termite problems, random resistor networks, etc. In the case of diluted randomness the authors can apply the methods of kinetic theory to obtain systematic expansions of dc and ac transport properties in powers of the impurity concentration c. The method is applied to a hopping model on a d-dimensional cubic lattice having two types of bonds with conductivity sigma and sigma_0 = 1, with concentrations c and 1-c, respectively. For the square lattice the authors explicitly calculate the diffusion coefficient D(c, sigma) as a function of c, including terms to O(c^2), for different ratios of the bond conductivity sigma. The probability of return at long times is given by P_0(t) ~ (4*pi*D(c, sigma)*t)^(-d/2), which is determined by the diffusion coefficient of the disordered system.
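    The quantities above can be illustrated by a small Monte Carlo: a walker on a square lattice with a fraction c of bonds removed (the sigma = 0 limit of the weak bonds), with D estimated from the mean squared displacement. An illustrative simulation, not the authors' kinetic-theory expansion:

```python
import random

def estimate_D(c, steps=200, walkers=2000, seed=7):
    """Estimate the diffusion coefficient D = <r^2>/(4t) on a square lattice
    where each bond is removed with probability c (quenched disorder, sampled
    lazily so the lattice is effectively infinite)."""
    rng = random.Random(seed)
    bonds = {}   # bond state cache: True = removed
    def removed(a, b):
        key = (min(a, b), max(a, b))
        if key not in bonds:
            bonds[key] = rng.random() < c
        return bonds[key]
    msd = 0.0
    for _ in range(walkers):
        x = y = 0
        for _ in range(steps):
            dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            if not removed((x, y), (x + dx, y + dy)):   # blocked hops are rejected
                x, y = x + dx, y + dy
        msd += x * x + y * y
    return msd / walkers / (4.0 * steps)

D_clean = estimate_D(0.0)    # undiluted lattice: D = 1/4 exactly
D_dilute = estimate_D(0.3)   # dilution suppresses D, as in the expansion in c
```

    The long-time return probability then follows from the quoted form P_0(t) ~ (4*pi*D*t)^(-d/2) with the estimated D.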

  17. Text Detection and Translation from Natural Scenes

    DTIC Science & Technology

    2001-06-01

    There are no explicit tags around Chinese words, so a module for Chinese word segmentation is included in the system. This segmentor uses a word-frequency list to make segmentation decisions. We tested the EBMT-based method using 50 randomly selected signs from our database, assuming perfect sign
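    One common way to use such a word list is greedy forward maximum matching; a toy sketch with romanized placeholder entries (the lexicon and its frequency values are invented for illustration):

```python
# hypothetical word-frequency list (romanized stand-ins for Chinese words)
FREQ = {"beijing": 900, "daxue": 500, "da": 300, "xue": 200, "sheng": 100,
        "xuesheng": 400}

def forward_max_match(text, lexicon, max_len=8):
    """Scan left to right, always taking the longest lexicon match;
    fall back to a single character when nothing matches."""
    out, i = [], 0
    while i < len(text):
        for L in range(min(max_len, len(text) - i), 0, -1):
            w = text[i:i + L]
            if w in lexicon or L == 1:
                out.append(w)
                i += L
                break
    return out

tokens = forward_max_match("beijingdaxuexuesheng", FREQ)
```

    A frequency-aware segmentor would additionally use the counts to break ties between competing segmentations, e.g. by maximizing the product of word frequencies.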

  18. A stochastic method for stand-alone photovoltaic system sizing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cabral, Claudia Valeria Tavora; Filho, Delly Oliveira; Martins, Jose Helvecio

    Photovoltaic systems utilize solar energy to generate electrical energy to meet load demands. Optimal sizing of these systems includes the characterization of solar radiation. Solar radiation at the Earth's surface has random characteristics and has been the focus of various academic studies. The objective of this study was to stochastically analyze parameters involved in the sizing of photovoltaic generators and to develop a methodology for sizing of stand-alone photovoltaic systems. Energy storage for isolated systems and solar radiation were analyzed stochastically due to their random behavior. For the development of the proposed methodology, stochastic analyses were studied, including the Markov chain and the beta probability density function. The obtained results were compared with those from the deterministic Sandia method for stand-alone sizing; the stochastic model presented more reliable values. Both models present advantages and disadvantages; however, the stochastic one is more complex and provides more reliable and realistic results.
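    The flavor of the stochastic approach can be sketched by drawing daily solar input from a beta distribution (a common clearness-index model) and tracking battery state of charge by Monte Carlo; all parameter values here are illustrative, not the paper's:

```python
import random

def lolp(pv_kwh_clear, load_kwh, batt_kwh, days=5000, a=4.0, b=2.0, seed=3):
    """Loss-of-load probability: fraction of simulated days on which the
    battery cannot cover the daily deficit. pv_kwh_clear is clear-sky yield."""
    rng = random.Random(seed)
    soc = batt_kwh                          # start with a full battery
    shortfall_days = 0
    for _ in range(days):
        kt = rng.betavariate(a, b)          # random daily clearness index in (0, 1)
        soc += pv_kwh_clear * kt - load_kwh
        soc = min(soc, batt_kwh)            # battery cannot overcharge
        if soc < 0:
            shortfall_days += 1
            soc = 0.0
    return shortfall_days / days

tight = lolp(pv_kwh_clear=5.5, load_kwh=4.0, batt_kwh=4.0)       # undersized system
oversized = lolp(pv_kwh_clear=12.0, load_kwh=4.0, batt_kwh=12.0)  # generous sizing
```

    A deterministic sizing rule would return a single array/battery pair; the stochastic simulation instead exposes the reliability attached to each candidate sizing, which is the advantage the abstract claims.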

  19. Methodological challenges in evaluating the effectiveness of women's crisis houses compared with psychiatric wards: findings from a pilot patient preference RCT.

    PubMed

    Howard, Louise M; Leese, Morven; Byford, Sarah; Killaspy, Helen; Cole, Laura; Lawlor, Caroline; Johnson, Sonia

    2009-10-01

    There are several methodological difficulties to address when evaluating acute psychiatric services. This study explored potential methods for evaluating the effectiveness of women's crisis houses compared with psychiatric wards in a pilot patient preference randomized controlled trial. Women requiring voluntary admission to a psychiatric hospital or women's crisis house were asked to enter this pilot, and different options for recruitment were explored, including different recruitment sites in the pathway to admission and methods for including women without capacity. Forty-one percent (n = 42) of women entering the study agreed to be randomized and 59% (n = 61) entered the patient preference arms. Only 7% of women were recruited before admission, and 1 woman without capacity entered the study, despite procedures to facilitate this. Recruitment of patients with acute psychiatric crises is therefore challenging; researchers evaluating acute services should establish a consensus on how to recruit patients ethically and practically in this setting.

  20. Iterative Usage of Fixed and Random Effect Models for Powerful and Efficient Genome-Wide Association Studies

    PubMed Central

    Liu, Xiaolei; Huang, Meng; Fan, Bin; Buckler, Edward S.; Zhang, Zhiwu

    2016-01-01

    False positives in a Genome-Wide Association Study (GWAS) can be effectively controlled by a fixed effect and random effect Mixed Linear Model (MLM) that incorporates population structure and kinship among individuals to adjust association tests on markers; however, the adjustment also compromises true positives. The modified MLM method, Multiple Loci Linear Mixed Model (MLMM), incorporates multiple markers simultaneously as covariates in a stepwise MLM to partially remove the confounding between testing markers and kinship. To completely eliminate the confounding, we divided MLMM into two parts, a Fixed Effect Model (FEM) and a Random Effect Model (REM), and used them iteratively. FEM contains testing markers, one at a time, and multiple associated markers as covariates to control false positives. To avoid the over-fitting problem in FEM, the associated markers are estimated in REM by using them to define kinship. The P values of the testing markers and the associated markers are unified at each iteration. We named the new method Fixed and random model Circulating Probability Unification (FarmCPU). Both real and simulated data analyses demonstrated that FarmCPU improves statistical power compared to current methods. Additional benefits include an efficient computing time that is linear in both the number of individuals and the number of markers. Now, a dataset with half a million individuals and half a million markers can be analyzed within three days. PMID:26828793

  1. A randomized controlled trial of the different impression methods for the complete denture fabrication: Patient reported outcomes.

    PubMed

    Jo, Ayami; Kanazawa, Manabu; Sato, Yusuke; Iwaki, Maiko; Akiba, Norihisa; Minakuchi, Shunsuke

    2015-08-01

    To compare the effect of conventional complete dentures (CDs) fabricated using two different impression methods on patient-reported outcomes in a randomized controlled trial (RCT). A cross-over RCT was performed with edentulous patients who required maxillomandibular CDs. Mandibular CDs were fabricated using two different methods. The conventional method used a custom tray border moulded with impression compound and a silicone; the simplified method used a stock tray and an alginate. Participants were randomly divided into two groups. The C-S group had the conventional method used first, followed by the simplified; the S-C group was in the reverse order. Adjustment was performed four times. A wash-out period was set at 1 month. The primary outcome was general patient satisfaction, measured using visual analogue scales, and the secondary outcome was oral health-related quality of life, measured using the Japanese version of the Oral Health Impact Profile for edentulous patients (OHIP-EDENT-J) questionnaire. Twenty-four participants completed the trial. With regard to general patient satisfaction, the conventional method was significantly more acceptable than the simplified one. No significant differences were observed between the two methods in the OHIP-EDENT-J scores. This study showed that CDs fabricated with the conventional method were rated significantly higher for general patient satisfaction than those fabricated with the simplified one. CDs fabricated with the conventional method, which included a preliminary impression made using alginate in a stock tray and subsequently a final impression made using silicone in a border-moulded custom tray, resulted in higher general patient satisfaction. UMIN000009875. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. A Probabilistic Design Method Applied to Smart Composite Structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1995-01-01

    A probabilistic design method is described and demonstrated using a smart composite wing. Probabilistic structural design incorporates naturally occurring uncertainties including those in constituent (fiber/matrix) material properties, fabrication variables, structure geometry and control-related parameters. Probabilistic sensitivity factors are computed to identify those parameters that have a great influence on a specific structural reliability. Two performance criteria are used to demonstrate this design methodology. The first criterion requires that the actuated angle at the wing tip be bounded by upper and lower limits at a specified reliability. The second criterion requires that the probability of ply damage due to random impact load be smaller than an assigned value. When the relationship between reliability improvement and the sensitivity factors is assessed, the results show that a reduction in the scatter of the random variable with the largest sensitivity factor (absolute value) provides the lowest failure probability. An increase in the mean of the random variable with a negative sensitivity factor will reduce the failure probability. Therefore, the design can be improved by controlling or selecting distribution parameters associated with random variables. This can be implemented during the manufacturing process to obtain maximum benefit with minimum alterations.

  3. A novel image encryption algorithm based on synchronized random bit generated in cascade-coupled chaotic semiconductor ring lasers

    NASA Astrophysics Data System (ADS)

    Li, Jiafu; Xiang, Shuiying; Wang, Haoning; Gong, Junkai; Wen, Aijun

    2018-03-01

    In this paper, a novel image encryption algorithm based on synchronization of physical random bits generated in a cascade-coupled semiconductor ring laser (CCSRL) system is proposed, and its security analysis is performed. In both the transmitter and receiver parts, the CCSRL system is a master-slave configuration consisting of a master semiconductor ring laser (M-SRL) with cross-feedback and a solitary SRL (S-SRL). The proposed image encryption algorithm includes image preprocessing based on conventional chaotic maps, pixel confusion based on a control matrix extracted from the physical random bits, and pixel diffusion based on a random bit stream extracted from the physical random bits. Firstly, the preprocessing method is used to eliminate the correlation between adjacent pixels. Secondly, physical random bits with verified randomness are generated based on chaos in the CCSRL system, and are used to simultaneously generate the control matrix and the random bit stream. Finally, the control matrix and random bit stream are used in the encryption algorithm to change the positions and the values of pixels, respectively. Simulation results and security analysis demonstrate that the proposed algorithm is effective and able to resist various typical attacks, and thus is an excellent candidate for secure image communication applications.
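    The confusion and diffusion stages described above can be sketched with a keyed pseudorandom generator standing in for the physical random bits from the CCSRL system (an assumption made purely for illustration; real security would rest on the physical randomness):

```python
import random

def encrypt(pixels, key):
    rng = random.Random(key)
    perm = list(range(len(pixels)))
    rng.shuffle(perm)                                   # confusion: scramble positions
    confused = [pixels[perm[i]] for i in range(len(pixels))]
    stream = [rng.randrange(256) for _ in pixels]       # keystream from the "random bits"
    return [p ^ s for p, s in zip(confused, stream)]    # diffusion: change values

def decrypt(cipher, key):
    rng = random.Random(key)                            # replay the same key material
    perm = list(range(len(cipher)))
    rng.shuffle(perm)
    stream = [rng.randrange(256) for _ in cipher]
    confused = [c ^ s for c, s in zip(cipher, stream)]
    plain = [0] * len(cipher)
    for i in range(len(cipher)):                        # invert the permutation
        plain[perm[i]] = confused[i]
    return plain

img = [10, 20, 30, 40, 50, 60]
ct = encrypt(img, key=42)
```

    Both stages invert exactly given the same key, which is why the transmitter and receiver must hold synchronized random bit sequences.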

  4. Randomized controlled trial of a computer-based module to improve contraceptive method choice.

    PubMed

    Garbers, Samantha; Meserve, Allison; Kottke, Melissa; Hatcher, Robert; Ventura, Alicia; Chiasson, Mary Ann

    2012-10-01

    Unintended pregnancy is common in the United States, and interventions are needed to improve contraceptive use among women at higher risk of unintended pregnancy, including Latinas and women with low educational attainment. A three-arm randomized controlled trial was conducted at two family planning sites serving low-income, predominantly Latina populations. The trial tested the efficacy of a computer-based contraceptive assessment module in increasing the proportion of patients choosing an effective method of contraception (<10 pregnancies/100 women per year, typical use). Participants were randomized to complete the module and receive tailored health materials, to complete the module and receive generic health materials, or to a control condition. In intent-to-treat analyses adjusted for recruitment site (n=2231), family planning patients who used the module were significantly more likely to choose an effective contraceptive method: 75% among those who received tailored materials [odds ratio (OR)=1.56; 95% confidence interval (CI): 1.23-1.98] and 78% among those who received generic materials (OR=1.74; 95% CI: 1.35-2.25), compared to 65% among control arm participants. The findings support prior research suggesting that patient-centered interventions can positively influence contraceptive method choice. Copyright © 2012 Elsevier Inc. All rights reserved.

  5. Random-effects meta-analysis: the number of studies matters.

    PubMed

    Guolo, Annamaria; Varin, Cristiano

    2017-06-01

    This paper investigates the impact of the number of studies on meta-analysis and meta-regression within the random-effects model framework. It is frequently neglected that inference in random-effects models requires a substantial number of studies included in meta-analysis to guarantee reliable conclusions. Several authors warn about the risk of inaccurate results of the traditional DerSimonian and Laird approach especially in the common case of meta-analysis involving a limited number of studies. This paper presents a selection of likelihood and non-likelihood methods for inference in meta-analysis proposed to overcome the limitations of the DerSimonian and Laird procedure, with a focus on the effect of the number of studies. The applicability and the performance of the methods are investigated in terms of Type I error rates and empirical power to detect effects, according to scenarios of practical interest. Simulation studies and applications to real meta-analyses highlight that it is not possible to identify an approach uniformly superior to alternatives. The overall recommendation is to avoid the DerSimonian and Laird method when the number of meta-analysis studies is modest and prefer a more comprehensive procedure that compares alternative inferential approaches. R code for meta-analysis according to all of the inferential methods examined in the paper is provided.
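    For reference, the DerSimonian and Laird procedure discussed above is a simple moment estimator of the between-study variance followed by inverse-variance pooling; a minimal sketch (the effect sizes and variances are illustrative):

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooling with the DerSimonian-Laird tau^2 moment estimator."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    Q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    C = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (Q - df) / C)                 # truncated moment estimate
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2

pooled, se, tau2 = dersimonian_laird([0.0, 0.2, 0.8], [0.01, 0.01, 0.01])
```

    With only three studies, as here, tau2 and hence the standard error are poorly estimated, which is exactly the small-k fragility the paper warns about and why it recommends comparing alternative inferential approaches.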

  6. Role of Internal Mammary Node Radiation as a Part of Modern Breast Cancer Radiation Therapy: A Systematic Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verma, Vivek; Vicini, Frank; Tendulkar, Rahul D.

    Purpose: Despite data from multiple randomized trials, the role of internal mammary lymph node irradiation as a part of regional nodal irradiation (IMLN RT–RNI) remains unanswered. Recent noteworthy data and modern RT techniques might identify a subset of patients who will benefit from IMLN RT–RNI, lending insight into the balance between improved outcomes and acceptable toxicity. We evaluated the current role of IMLN RT–RNI by analyzing randomized, prospective, and retrospective data. Methods and Materials: In accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, a review of the published data was performed using PubMed to evaluate published studies from 1994 to 2015. The information evaluated included the number of patients, follow-up period, technical aspects of RT, and outcomes (clinical outcomes, complications/toxicity). Results: We included 16 studies (4 randomized, 4 nonrandomized, 7 retrospective, and 1 meta-analysis). Although older randomized trials failed to show differences in clinical outcomes or toxicity with IMLN RT–RNI, recent randomized data suggest the potential for improved outcomes, including overall survival, with IMLN RT–RNI. Furthermore, nonrandomized data have suggested a potential benefit of IMLN RT–RNI for central tumors. Although recent data have suggested a potential increase in pulmonary complications with IMLN RT–RNI with the use of advanced radiation techniques, toxicity rates remain low, with limited cardiac toxicity data available. Conclusions: Increasing data from recent randomized trials support the use of IMLN RT–RNI. IMLN RT can be considered based on the inclusion of IMLN RT as a part of RNI in recent trials and the inclusion criteria from IMLN RT–RNI trials, and for patients with central or medial tumors and axillary disease.

  7. Delivering successful randomized controlled trials in surgery: Methods to optimize collaboration and study design.

    PubMed

    Blencowe, Natalie S; Cook, Jonathan A; Pinkney, Thomas; Rogers, Chris; Reeves, Barnaby C; Blazeby, Jane M

    2017-04-01

    Randomized controlled trials in surgery are notoriously difficult to design and conduct due to numerous methodological and cultural challenges. Over the last 5 years, several UK-based surgical trial-related initiatives have been funded to address these issues. These include the development of Surgical Trials Centers and Surgical Specialty Leads (individual surgeons responsible for championing randomized controlled trials in their specialist fields), both funded by the Royal College of Surgeons of England; networks of research-active surgeons in training; and investment in methodological research relating to surgical randomized controlled trials (to address issues such as recruitment, blinding, and the selection and standardization of interventions). This article discusses these initiatives in more detail and provides exemplar cases to illustrate how the methodological challenges have been tackled. The initiatives have surpassed expectations, resulting in a renaissance in surgical research throughout the United Kingdom, such that the number of patients entering surgical randomized controlled trials has doubled.

  8. MLACP: machine-learning-based prediction of anticancer peptides

    PubMed Central

    Manavalan, Balachandran; Basith, Shaherin; Shin, Tae Hwan; Choi, Sun; Kim, Myeong Ok; Lee, Gwang

    2017-01-01

    Cancer is the second leading cause of death globally, and use of therapeutic peptides to target and kill cancer cells has received considerable attention in recent years. Identification of anticancer peptides (ACPs) through wet-lab experimentation is expensive and often time consuming; therefore, development of an efficient computational method is essential to identify potential ACP candidates prior to in vitro experimentation. In this study, we developed support vector machine- and random forest-based machine-learning methods for the prediction of ACPs using the features calculated from the amino acid sequence, including amino acid composition, dipeptide composition, atomic composition, and physicochemical properties. We trained our methods using the Tyagi-B dataset and determined the machine parameters by 10-fold cross-validation. Furthermore, we evaluated the performance of our methods on two benchmarking datasets, with our results showing that the random forest-based method outperformed the existing methods with an average accuracy and Matthews correlation coefficient value of 88.7% and 0.78, respectively. To assist the scientific community, we also developed a publicly accessible web server at www.thegleelab.org/MLACP.html. PMID:29100375
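    Two of the feature sets named above, amino acid composition and dipeptide composition, can be computed directly from a peptide sequence; in the study such vectors feed the SVM or random forest classifiers (not reproduced here; the peptide below is a made-up example):

```python
AMINO = "ACDEFGHIKLMNPQRSTVWY"   # the 20 standard amino acids

def aac(seq):
    """Amino acid composition: 20 relative residue frequencies."""
    return [seq.count(a) / len(seq) for a in AMINO]

def dpc(seq):
    """Dipeptide composition: 400 relative frequencies of adjacent residue pairs."""
    pairs = [seq[i:i + 2] for i in range(len(seq) - 1)]
    return [pairs.count(a + b) / len(pairs) for a in AMINO for b in AMINO]

v = aac("ACDKKK")    # illustrative 6-residue peptide
d = dpc("ACDKKK")
```

    Concatenating these with atomic-composition and physicochemical descriptors gives the fixed-length vectors on which the ensemble models are trained and cross-validated.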

  9. Assessing the germplasm of Laminaria (phaeophyceae) with random amplified polymorphic DNA (RAPD) method

    NASA Astrophysics Data System (ADS)

    He, Yingjun; Zou, Yuping; Wang, Xiaodong; Zheng, Zhiguo; Zhang, Daming; Duan, Delin

    2003-06-01

    Eighteen gametophytes, including L. japonica, L. ochotensis and L. longissima, were verified with the random amplified polymorphic DNA (RAPD) technique. Eighteen ten-base primers were chosen from 100 primers screened for the final amplification test. Among the total of 205 bands amplified, 181 (88.3%) were polymorphic. The genetic distance among different strains ranged from 0.072 to 0.391. The dendrogram constructed by the unweighted pair-group method with arithmetic mean (UPGMA) showed that the female and male gametophytes of the same cell lines could be grouped in pairs. This indicated that RAPD analysis could be used not only to distinguish different strains of Laminaria, but also to distinguish male and female gametophytes within the same cell lines. The systematic relationship remains ambiguous if judged merely by the present data; it seems that the use of the RAPD marker is of limited value for elucidating the phylogenetic relationship among the species of Laminaria.
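    The distance step behind such a dendrogram can be sketched with the Nei-Li (Dice) coefficient on presence/absence band profiles; the band identifiers below are hypothetical, and the resulting distance matrix would then feed UPGMA (average-linkage) clustering:

```python
def nei_li_distance(bands_a, bands_b):
    """Nei-Li distance from RAPD band profiles.
    bands_*: sets of band identifiers (primer + fragment size) scored per strain."""
    shared = len(bands_a & bands_b)
    return 1.0 - 2.0 * shared / (len(bands_a) + len(bands_b))

# hypothetical profiles: two strains sharing 3 of 4 scored bands
strain1 = {"p1_300bp", "p1_550bp", "p2_800bp", "p3_400bp"}
strain2 = {"p1_300bp", "p1_550bp", "p2_750bp", "p3_400bp"}
d12 = nei_li_distance(strain1, strain2)
```

    Computing this for all strain pairs yields the genetic distance matrix (0.072-0.391 in the study) from which the UPGMA tree is built.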

  10. Modeling and dynamic environment analysis technology for spacecraft

    NASA Astrophysics Data System (ADS)

    Fang, Ren; Zhaohong, Qin; Zhong, Zhang; Zhenhao, Liu; Kai, Yuan; Long, Wei

    Spacecraft sustain complex and severe vibration and acoustic environments during flight. Predicting the resulting structural responses, including numerical prediction of fluctuating pressure, model updating, and random vibration and acoustic analysis, plays an important role during the design, manufacture and ground testing of spacecraft. In this paper, Monotony Integrative Large Eddy Simulation (MILES) is introduced to predict the fluctuating pressure on the fairing. The exact flow structures of the fairing wall surface under different Mach numbers are obtained, and then a spacecraft model is constructed using the finite element method (FEM). According to the modal test data, the model is updated by the penalty method. On this basis, the random vibration and acoustic responses of the fairing and satellite are analyzed by different methods. The simulated results agree well with the experimental ones, which shows the validity of the modeling and dynamic environment analysis technology. This information can better support test planning, the definition of test conditions and the design of optimal structures.

  11. 3D exemplar-based random walks for tooth segmentation from cone-beam computed tomography images.

    PubMed

    Pei, Yuru; Ai, Xingsheng; Zha, Hongbin; Xu, Tianmin; Ma, Gengyu

    2016-09-01

    Tooth segmentation is an essential step in acquiring patient-specific dental geometries from cone-beam computed tomography (CBCT) images. Tooth segmentation from CBCT images is still a challenging task considering the comparatively low image quality caused by the limited radiation dose, as well as structural ambiguities from intercuspation and nearby alveolar bones. The goal of this paper is to present and discuss the latest accomplishments in semisupervised tooth segmentation with adaptive 3D shape constraints. The authors propose a 3D exemplar-based random walk method for tooth segmentation from CBCT images. The proposed method integrates semisupervised label propagation and regularization by 3D exemplar registration. To begin with, the pure random walk method is used to obtain an initial segmentation of the teeth, which tends to be erroneous because of the structural ambiguity of CBCT images. Then, as an iterative refinement, the authors conduct a regularization using 3D exemplar registration, as well as label propagation by random walks with soft constraints, to improve the tooth segmentation. In the first stage of the iteration, 3D exemplars with well-defined topologies are adapted to fit the tooth contours obtained from the random-walk-based segmentation. The soft constraints on voxel labeling are defined by a shape-based foreground dentine probability acquired by the exemplar registration, as well as an appearance-based probability from a support vector machine (SVM) classifier. In the second stage, the labels of the volume of interest (VOI) are updated by random walks with soft constraints. The two stages are optimized iteratively. Instead of one-shot label propagation in the VOI, an iterative refinement process can achieve a reliable tooth segmentation by virtue of exemplar-based random walks with adaptive soft constraints. The proposed method was applied to tooth segmentation of twenty clinically captured CBCT images.
Three metrics, including the Dice similarity coefficient (DSC), the Jaccard similarity coefficient (JSC), and the mean surface deviation (MSD), were used to quantitatively analyze the segmentation of anterior teeth including incisors and canines, premolars, and molars. The segmentation of the anterior teeth achieved a DSC up to 98%, a JSC of 97%, and an MSD of 0.11 mm compared with manual segmentation. For the premolars, the average values of DSC, JSC, and MSD were 98%, 96%, and 0.12 mm, respectively. The proposed method yielded a DSC of 95%, a JSC of 89%, and an MSD of 0.26 mm for molars. Aside from the interactive definition of label priors by the user, automatic tooth segmentation can be achieved in an average of 1.18 min. The proposed technique enables an efficient and reliable tooth segmentation from CBCT images. This study makes it clinically practical to segment teeth from CBCT images, thus facilitating pre- and interoperative uses of dental morphologies in maxillofacial and orthodontic treatments.
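    The random-walk label propagation at the core of this pipeline can be illustrated with a toy sketch. This is a minimal 1-D analogue of the general random-walker idea (seeded pixels held fixed, each unlabeled pixel converging to the harmonic average of its neighbors), not the authors' exemplar-based 3D method; the function name and data are illustrative.

```python
# Toy 1-D random-walker label propagation: p[i] is the probability that
# a random walk started at pixel i reaches a foreground seed before a
# background seed. Seeds are clamped; interior pixels are relaxed to the
# average of their neighbors (Gauss-Seidel iteration).

def random_walker_1d(n, fg_seeds, bg_seeds, iters=5000):
    p = [0.5] * n
    for s in fg_seeds:
        p[s] = 1.0
    for s in bg_seeds:
        p[s] = 0.0
    for _ in range(iters):
        for i in range(n):
            if i in fg_seeds or i in bg_seeds:
                continue  # seeds stay fixed
            left = p[max(i - 1, 0)]
            right = p[min(i + 1, n - 1)]
            p[i] = 0.5 * (left + right)
    return p

probs = random_walker_1d(9, fg_seeds={0}, bg_seeds={8})
labels = [1 if q >= 0.5 else 0 for q in probs]
```

    With one foreground seed at each end, the solution is the linear harmonic profile, and thresholding at 0.5 splits the "image" halfway between the seeds.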

  12. Certified randomness in quantum physics.

    PubMed

    Acín, Antonio; Masanes, Lluis

    2016-12-07

    The concept of randomness plays an important part in many disciplines. On the one hand, the question of whether random processes exist is fundamental for our understanding of nature. On the other, randomness is a resource for cryptography, algorithms and simulations. Standard methods for generating randomness rely on assumptions about the devices that are often not valid in practice. However, quantum technologies enable new methods for generating certified randomness, based on the violation of Bell inequalities. These methods are referred to as device-independent because they do not rely on any modelling of the devices. Here we review efforts to design device-independent randomness generators and the associated challenges.

  13. Outcomes of a Pilot Hand Hygiene Randomized Cluster Trial to Reduce Communicable Infections Among US Office-Based Employees

    PubMed Central

    DuBois, Cathy L.Z.; Grey, Scott F.; Kingsbury, Diana M.; Shakya, Sunita; Scofield, Jennifer; Slenkovich, Ken

    2015-01-01

    Objective: To determine the effectiveness of an office-based multimodal hand hygiene improvement intervention in reducing self-reported communicable infections and work-related absence. Methods: A randomized cluster trial including an electronic training video, hand sanitizer, and educational posters (n = 131, intervention; n = 193, control). Primary outcomes include (1) self-reported acute respiratory infections (ARIs)/influenza-like illness (ILI) and/or gastrointestinal (GI) infections during the prior 30 days; and (2) related lost work days. Incidence rate ratios calculated using generalized linear mixed models with a Poisson distribution, adjusted for confounders and random cluster effects. Results: A 31% relative reduction in self-reported combined ARI-ILI/GI infections (incidence rate ratio: 0.69; 95% confidence interval, 0.49 to 0.98). A 21% nonsignificant relative reduction in lost work days. Conclusions: An office-based multimodal hand hygiene improvement intervention demonstrated a substantive reduction in self-reported combined ARI-ILI/GI infections. PMID:25719534
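    The unadjusted form of the incidence rate ratio reported above can be sketched as follows. The counts and person-time here are made up for illustration; the study itself estimated IRRs from a Poisson generalized linear mixed model adjusted for confounders and random cluster effects.

```python
import math

# Unadjusted incidence rate ratio (IRR) with a Wald 95% CI on the log
# scale. All inputs are hypothetical illustration values, not study data.

def irr_wald_ci(events_tx, persontime_tx, events_ctl, persontime_ctl, z=1.96):
    irr = (events_tx / persontime_tx) / (events_ctl / persontime_ctl)
    se_log = math.sqrt(1 / events_tx + 1 / events_ctl)  # Poisson approx.
    lo = math.exp(math.log(irr) - z * se_log)
    hi = math.exp(math.log(irr) + z * se_log)
    return irr, lo, hi

# e.g. 45 infections over 131 person-months vs 95 over 193 person-months
irr, lo, hi = irr_wald_ci(45, 131 * 30, 95, 193 * 30)
```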

  14. Effect of H-wave polarization on laser radar detection of partially convex targets in random media.

    PubMed

    El-Ocla, Hosam

    2010-07-01

    The performance of the laser radar cross section (LRCS) of large conducting targets is investigated numerically in free space and in random media. The LRCS is calculated using a boundary value method with beam wave incidence and H-wave polarization. The elements that contribute to the LRCS problem are considered, including random medium strength, target configuration, and beam width. The effect of the creeping waves, stimulated by H-polarization, on the LRCS behavior is demonstrated. Target sizes of up to five wavelengths are sufficiently larger than the beam width and large enough to represent fairly complex targets. Scatterers are assumed to have analytical partially convex contours with inflection points.

  15. Objective assessment of image quality. IV. Application to adaptive optics

    PubMed Central

    Barrett, Harrison H.; Myers, Kyle J.; Devaney, Nicholas; Dainty, Christopher

    2008-01-01

    The methodology of objective assessment, which defines image quality in terms of the performance of specific observers on specific tasks of interest, is extended to temporal sequences of images with random point spread functions and applied to adaptive imaging in astronomy. The tasks considered include both detection and estimation, and the observers are the optimal linear discriminant (Hotelling observer) and the optimal linear estimator (Wiener). A general theory of first- and second-order spatiotemporal statistics in adaptive optics is developed. It is shown that the covariance matrix can be rigorously decomposed into three terms representing the effect of measurement noise, random point spread function, and random nature of the astronomical scene. Figures of merit are developed, and computational methods are discussed. PMID:17106464
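    The three-term covariance decomposition described above can be written schematically as follows; the symbols are generic placeholders rather than the paper's exact notation:

```latex
\mathbf{K}_{\mathbf{g}}
  = \mathbf{K}^{\mathrm{noise}}
  + \mathbf{K}^{\mathrm{PSF}}
  + \mathbf{K}^{\mathrm{scene}},
```

    where the three terms capture, respectively, measurement noise, the randomness of the point spread function, and the random nature of the astronomical scene.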

  16. The Use of Monte Carlo Techniques to Teach Probability.

    ERIC Educational Resources Information Center

    Newell, G. J.; MacFarlane, J. D.

    1985-01-01

    Presents sports-oriented examples (cricket and football) in which Monte Carlo methods are used on microcomputers to teach probability concepts. Both examples include computer programs (with listings) which utilize the microcomputer's random number generator. Instructional strategies, with further challenges to help students understand the role of…
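    A generator-driven classroom experiment of the kind the article describes might look like the following; this dice example is illustrative, not taken from the article's cricket and football programs.

```python
import random

# Monte Carlo estimate of P(sum of two dice = 7); the exact value is 1/6.
# A fixed seed makes the classroom demonstration reproducible.

def estimate_p_seven(trials, seed=42):
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(trials)
        if rng.randint(1, 6) + rng.randint(1, 6) == 7
    )
    return hits / trials

est = estimate_p_seven(100_000)
```

    Comparing the estimate against the exact probability, and watching it tighten as the number of trials grows, is the instructional point of such simulations.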

  17. Methods for identifying high collision concentrations for identifying potential safety improvements : development of advanced type 2 safety performance functions.

    DOT National Transportation Integrated Search

    2016-06-30

    This research developed advanced type 2 safety performance functions (SPF) for roadway segments, intersections and ramps on the entire Caltrans network. The advanced type 2 SPFs included geometrics, traffic volume and hierarchical random effects, whi...

  18. Teaching Probability in Intermediate Grades

    ERIC Educational Resources Information Center

    Engel, Arthur

    1971-01-01

    A discussion of the importance and procedures for including probability in the elementary through secondary mathematics curriculum is presented. Many examples and problems are presented which the author feels students can understand and will be motivated to do. Random digits, Monte Carlo methods, combinatorial theory, and Markov chains are…

  19. Core ADHD Symptom Improvement with Atomoxetine versus Methylphenidate: A Direct Comparison Meta-Analysis

    ERIC Educational Resources Information Center

    Hazell, Philip L.; Kohn, Michael R.; Dickson, Ruth; Walton, Richard J.; Granger, Renee E.; van Wyk, Gregory W.

    2011-01-01

    Objective: Previous studies comparing atomoxetine and methylphenidate to treat ADHD symptoms have been equivocal. This noninferiority meta-analysis compared core ADHD symptom response between atomoxetine and methylphenidate in children and adolescents. Method: Selection criteria included randomized, controlled design; duration 6 weeks; and…

  20. Association between Markers of Classroom Environmental Conditions and Teachers' Respiratory Health

    ERIC Educational Resources Information Center

    Claudio, Luz; Rivera, Glory A.; Ramirez, Olivia F.

    2016-01-01

    Background: Studies have assessed health in schoolchildren. Less is known about the environmental and occupational health of teachers. Methods: A cross-sectional survey of teachers was conducted in 24 randomly selected public elementary schools. Questionnaire included sociodemographic information, healthcare, school conditions, and health…

  1. Will Your Catalog Stand FTC Scrutiny?

    ERIC Educational Resources Information Center

    Bender, Louis W.

    1976-01-01

    In light of recent court rulings and Federal Trade Commission (FTC) hearings regarding unfair methods of competition and deceptive advertising, a content analysis was conducted of 20 randomly selected college catalogs from 2-year and 4-year, public and private institutions. Four types of misrepresentations were identified including institutional…

  2. Developing Appropriate Methods for Cost-Effectiveness Analysis of Cluster Randomized Trials

    PubMed Central

    Gomes, Manuel; Ng, Edmond S.-W.; Nixon, Richard; Carpenter, James; Thompson, Simon G.

    2012-01-01

    Aim. Cost-effectiveness analyses (CEAs) may use data from cluster randomized trials (CRTs), where the unit of randomization is the cluster, not the individual. However, most studies use analytical methods that ignore clustering. This article compares alternative statistical methods for accommodating clustering in CEAs of CRTs. Methods. Our simulation study compared the performance of statistical methods for CEAs of CRTs with 2 treatment arms. The study considered a method that ignored clustering—seemingly unrelated regression (SUR) without a robust standard error (SE)—and 4 methods that recognized clustering—SUR and generalized estimating equations (GEEs), both with robust SE, a “2-stage” nonparametric bootstrap (TSB) with shrinkage correction, and a multilevel model (MLM). The base case assumed CRTs with moderate numbers of balanced clusters (20 per arm) and normally distributed costs. Other scenarios included CRTs with few clusters, imbalanced cluster sizes, and skewed costs. Performance was reported as bias, root mean squared error (rMSE), and confidence interval (CI) coverage for estimating incremental net benefits (INBs). We also compared the methods in a case study. Results. Each method reported low levels of bias. Without the robust SE, SUR gave poor CI coverage (base case: 0.89 v. nominal level: 0.95). The MLM and TSB performed well in each scenario (CI coverage, 0.92–0.95). With few clusters, the GEE and SUR (with robust SE) had coverage below 0.90. In the case study, the mean INBs were similar across all methods, but ignoring clustering underestimated statistical uncertainty and the value of further research. Conclusions. MLMs and the TSB are appropriate analytical methods for CEAs of CRTs with the characteristics described. SUR and GEE are not recommended for studies with few clusters. PMID:22016450
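    The two-stage nonparametric bootstrap compared above can be sketched as follows: clusters are resampled with replacement, then individuals within each resampled cluster. Data, names, and the willingness-to-pay value are synthetic, and the shrinkage correction used in the paper's TSB is omitted for brevity.

```python
import random

# Two-stage cluster bootstrap for an incremental net benefit,
# INB = lambda * dE - dC, on synthetic two-arm CRT data.

rng = random.Random(0)

def make_arm(n_clusters, cost_mean, effect_mean):
    # each cluster is a list of (cost, effect) pairs for its members
    return [
        [(rng.gauss(cost_mean, 50), rng.gauss(effect_mean, 0.1))
         for _ in range(rng.randint(8, 15))]
        for _ in range(n_clusters)
    ]

def arm_means(arm):
    people = [p for cl in arm for p in cl]
    n = len(people)
    return (sum(c for c, _ in people) / n, sum(e for _, e in people) / n)

def boot_inb(arm_t, arm_c, lam=20000, reps=500):
    inbs = []
    for _ in range(reps):
        # stage 1: resample clusters; stage 2: resample members
        res_t = [rng.choices(cl, k=len(cl))
                 for cl in rng.choices(arm_t, k=len(arm_t))]
        res_c = [rng.choices(cl, k=len(cl))
                 for cl in rng.choices(arm_c, k=len(arm_c))]
        (ct, et), (cc, ec) = arm_means(res_t), arm_means(res_c)
        inbs.append(lam * (et - ec) - (ct - cc))
    inbs.sort()
    return (inbs[len(inbs) // 2],
            inbs[int(0.025 * len(inbs))],
            inbs[int(0.975 * len(inbs))])

treat = make_arm(20, cost_mean=1100, effect_mean=0.75)
control = make_arm(20, cost_mean=1000, effect_mean=0.70)
inb_med, lo, hi = boot_inb(treat, control)
```

    Resampling clusters first is what propagates within-cluster correlation into the interval; resampling individuals alone would understate the uncertainty, which is the failure mode the simulation study documents for methods that ignore clustering.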

  3. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion systems components

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Summarized here is the technical effort and computer code developed during the five year duration of the program for probabilistic structural analysis methods. The summary includes a brief description of the computer code manuals and a detailed description of code validation demonstration cases for random vibrations of a discharge duct, probabilistic material nonlinearities of a liquid oxygen post, and probabilistic buckling of a transfer tube liner.

  4. Comparison of Address-based Sampling and Random-digit Dialing Methods for Recruiting Young Men as Controls in a Case-Control Study of Testicular Cancer Susceptibility

    PubMed Central

    Clagett, Bartholt; Nathanson, Katherine L.; Ciosek, Stephanie L.; McDermoth, Monique; Vaughn, David J.; Mitra, Nandita; Weiss, Andrew; Martonik, Rachel; Kanetsky, Peter A.

    2013-01-01

    Random-digit dialing (RDD) using landline telephone numbers is the historical gold standard for control recruitment in population-based epidemiologic research. However, increasing cell-phone usage and diminishing response rates suggest that the effectiveness of RDD in recruiting a random sample of the general population, particularly for younger target populations, is decreasing. In this study, we compared landline RDD with alternative methods of control recruitment, including RDD using cell-phone numbers and address-based sampling (ABS), to recruit primarily white men aged 18–55 years into a study of testicular cancer susceptibility conducted in the Philadelphia, Pennsylvania, metropolitan area between 2009 and 2012. With few exceptions, eligible and enrolled controls recruited by means of RDD and ABS were similar with regard to characteristics for which data were collected on the screening survey. While we find ABS to be a comparably effective method of recruiting young males compared with landline RDD, we acknowledge the potential impact that selection bias may have had on our results because of poor overall response rates, which ranged from 11.4% for landline RDD to 1.7% for ABS. PMID:24008901

  5. Efficient Text Encryption and Hiding with Double-Random Phase-Encoding

    PubMed Central

    Sang, Jun; Ling, Shenggui; Alam, Mohammad S.

    2012-01-01

    In this paper, a double-random phase-encoding technique-based text encryption and hiding method is proposed. First, the secret text is transformed into a 2-dimensional array and the higher bits of the elements in the transformed array are used to store the bit stream of the secret text, while the lower bits are filled with specific values. Then, the transformed array is encoded with double-random phase-encoding technique. Finally, the encoded array is superimposed on an expanded host image to obtain the image embedded with hidden data. The performance of the proposed technique, including the hiding capacity, the recovery accuracy of the secret text, and the quality of the image embedded with hidden data, is tested via analytical modeling and test data stream. Experimental results show that the secret text can be recovered either accurately or almost accurately, while maintaining the quality of the host image embedded with hidden data by properly selecting the method of transforming the secret text into an array and the superimposition coefficient. By using optical information processing techniques, the proposed method has been found to significantly improve the security of text information transmission, while ensuring hiding capacity at a prescribed level. PMID:23202003
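    The encode/decode symmetry of double-random phase encoding can be shown on a short 1-D signal. This is a minimal numerical sketch with naive O(N²) transforms, not the paper's 2-D optical implementation or its host-image embedding; all names are illustrative.

```python
import cmath
import random

# 1-D double-random phase encoding: multiply by a random phase mask in
# the signal domain, transform, multiply by a second random phase mask
# in the Fourier domain, and transform back. Decryption reverses the
# steps; both masks act as keys.

def dft(x, sign=-1):
    n = len(x)
    return [sum(x[k] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
                for k in range(n)) / (n ** 0.5)
            for j in range(n)]

def idft(x):
    return dft(x, sign=+1)  # unitary pair: idft(dft(x)) == x

def drpe_encrypt(signal, mask1, mask2):
    stage1 = [s * m for s, m in zip(signal, mask1)]
    spectrum = dft(stage1)
    return idft([v * m for v, m in zip(spectrum, mask2)])

def drpe_decrypt(cipher, mask1, mask2):
    spectrum = dft(cipher)
    stage1 = idft([v / m for v, m in zip(spectrum, mask2)])
    return [v / m for v, m in zip(stage1, mask1)]

rng = random.Random(7)
n = 8
m1 = [cmath.exp(2j * cmath.pi * rng.random()) for _ in range(n)]
m2 = [cmath.exp(2j * cmath.pi * rng.random()) for _ in range(n)]
plain = [1.0, 0.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0]  # a secret bit stream
cipher = drpe_encrypt(plain, m1, m2)
recovered = [round(v.real) for v in drpe_decrypt(cipher, m1, m2)]
```

    Because the masks have unit modulus, decryption is exact division by each mask, and the bit stream is recovered up to floating-point error.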

  6. The effects of Sahaja Yoga meditation on mental health: a systematic review.

    PubMed

    Hendriks, Tom

    2018-05-30

    Objectives To determine the efficacy of Sahaja Yoga (SY) meditation on mental health among clinical and healthy populations. Methods All publications on SY were eligible. Databases were searched up to November 2017, namely PubMed, MEDLINE (NLM), PsycINFO, and Scopus. An internet search (Google Scholar) was also conducted. The quality of the randomized controlled trials was assessed using the Cochrane Risk Assessment for Bias. The quality of cross-sectional studies, a non-randomized controlled trial and a cohort study was assessed with the Newcastle-Ottawa Quality Assessment Scale. Results We included a total of eleven studies; four randomized controlled trials, one non-randomized controlled trial, five cross-sectional studies, and one prospective cohort study. The studies included a total of 910 participants. Significant findings were reported in relation to the following outcomes: anxiety, depression, stress, subjective well-being, and psychological well-being. Two randomized studies were rated as high-quality studies, two randomized studies as low-quality studies. The quality of the non-randomized trial, the cross-sectional studies and the cohort study was high. Effect sizes could not be calculated in five studies due to unclear or incomplete reporting. Conclusions After reviewing the articles and taking the quality of the studies into account, it appears that SY may reduce depression and possibly anxiety. In addition, the practice of SY is also associated with increased subjective well-being and psychological well-being. However, due to the limited number of publications, definite conclusions on the effects of SY cannot be made, and more high-quality randomized studies are needed to justify any firm conclusions on the beneficial effects of SY on mental health.

  7. Concurrent design of quasi-random photonic nanostructures

    PubMed Central

    Lee, Won-Kyu; Yu, Shuangcheng; Engel, Clifford J.; Reese, Thaddeus; Rhee, Dongjoon; Chen, Wei

    2017-01-01

    Nanostructured surfaces with quasi-random geometries can manipulate light over broadband wavelengths and wide ranges of angles. Optimization and realization of stochastic patterns have typically relied on serial, direct-write fabrication methods combined with real-space design. However, this approach is not suitable for customizable features or scalable nanomanufacturing. Moreover, trial-and-error processing cannot guarantee fabrication feasibility because processing–structure relations are not included in conventional designs. Here, we report wrinkle lithography integrated with concurrent design to produce quasi-random nanostructures in amorphous silicon at wafer scales that achieved over 160% light absorption enhancement from 800 to 1,200 nm. The quasi-periodicity of patterns, materials filling ratio, and feature depths could be independently controlled. We statistically represented the quasi-random patterns by Fourier spectral density functions (SDFs) that could bridge the processing–structure and structure–performance relations. Iterative search of the optimal structure via the SDF representation enabled concurrent design of nanostructures and processing. PMID:28760975

  8. Recruitment, screening, and baseline participant characteristics in the WALK 2.0 study: A randomized controlled trial using web 2.0 applications to promote physical activity.

    PubMed

    Caperchione, Cristina M; Duncan, Mitch J; Rosenkranz, Richard R; Vandelanotte, Corneel; Van Itallie, Anetta K; Savage, Trevor N; Hooker, Cindy; Maeder, Anthony J; Mummery, W Kerry; Kolt, Gregory S

    2016-04-15

    To describe in detail the recruitment methods and enrollment rates, the screening methods, and the baseline characteristics of a sample of adults participating in the Walk 2.0 Study, an 18-month, 3-arm randomized controlled trial of a Web 2.0 based physical activity intervention. A two-fold recruitment plan was developed and implemented, including a direct mail-out to an extract from the Australian Electoral Commission electoral roll, and other supplementary methods including email and telephone. Physical activity screening involved two steps: a validated single-item self-report instrument and the follow-up Active Australia Questionnaire. Readiness for physical activity participation was also based on a two-step process of administering the Physical Activity Readiness Questionnaire and, where needed, further clearance from a medical practitioner. Across all recruitment methods, a total of 1244 participants expressed interest in participating, of which 656 were deemed eligible. Of these, 504 were later enrolled in the Walk 2.0 trial (77% enrollment rate) and randomized to the Walk 1.0 group (n = 165), the Walk 2.0 group (n = 168), or the Logbook group (n = 171). Mean age of the total sample was 50.8 years, with 65.2% female and 79.1% born in Australia. The results of this recruitment process demonstrate the successful use of multiple strategies to obtain a diverse sample of adults eligible to take part in a web-based physical activity promotion intervention. The use of dual screening processes ensured safe participation in the intervention. This approach to recruitment and physical activity screening can be used as a model for further trials in this area.

  9. Crowdsourcing for Conducting Randomized Trials of Internet Delivered Interventions in People with Serious Mental Illness: A Systematic Review

    PubMed Central

    Naslund, John A.; Aschbrenner, Kelly A.; Marsch, Lisa A.; McHugo, Gregory J.; Bartels, Stephen J.

    2015-01-01

    Objective Online crowdsourcing refers to the process of obtaining needed services, ideas, or content by soliciting contributions from a large group of people over the Internet. We examined the potential for using online crowdsourcing methods for conducting behavioral health intervention research among people with serious mental illness (SMI). Methods Systematic review of randomized trials using online crowdsourcing methods for recruitment, intervention delivery, and data collection in people with SMI, including schizophrenia spectrum disorders and mood disorders. Included studies were completed entirely over the Internet without any face-to-face contact between participants and researchers. Databases and sources Medline, Cochrane Library, Web of Science, CINAHL, Scopus, PsycINFO, Google Scholar, and reference lists of relevant articles. Results We identified 7 randomized trials that enrolled N=1,214 participants (range: 39 to 419) with SMI. Participants were mostly female (72%) and had mood disorders (94%). Attrition ranged from 14% to 81%. Three studies had attrition rates below 25%. Most interventions were adapted from existing evidence-based programs, and consisted of self-directed education, psychoeducation, self-help, and illness self-management. Six studies collected self-reported mental health symptoms, quality of life, and illness severity. Three studies supported intervention effectiveness and two studies showed improvements in the intervention and comparison conditions over time. Peer support emerged as an important component of several interventions. Overall, studies were of medium to high methodological quality. Conclusion Online crowdsourcing methods appear feasible for conducting intervention research in people with SMI. Future efforts are needed to improve retention rates, collect objective outcome measures, and reach a broader demographic. PMID:26188164

  10. A cluster-randomized, placebo-controlled, maternal vitamin a or beta-carotene supplementation trial in bangladesh: design and methods

    PubMed Central

    2011-01-01

    Background We present the design, methods and population characteristics of a large community trial that assessed the efficacy of a weekly supplement containing vitamin A or beta-carotene, at recommended dietary levels, in reducing maternal mortality from early gestation through 12 weeks postpartum. We identify challenges faced and report solutions in implementing an intervention trial under low-resource, rural conditions, including the importance of population choice in promoting generalizability, maintaining rigorous data quality control to reduce inter- and intra- worker variation, and optimizing efficiencies in information and resources flow from and to the field. Methods This trial was a double-masked, cluster-randomized, dual intervention, placebo-controlled trial in a contiguous rural area of ~435 sq km with a population of ~650,000 in Gaibandha and Rangpur Districts of Northwestern Bangladesh. Approximately 120,000 married women of reproductive age underwent 5-weekly home surveillance, of whom ~60,000 were detected as pregnant, enrolled into the trial and gave birth to ~44,000 live-born infants. Upon enrollment, at ~9 weeks' gestation, pregnant women received a weekly oral supplement containing vitamin A (7000 µg retinol equivalents (RE)), beta-carotene (42 mg, or ~7000 µg RE) or a placebo through 12 weeks postpartum, according to prior randomized allocation of their cluster of residence. Systems described include enlistment and 5-weekly home surveillance for pregnancy based on menstrual history and urine testing, weekly supervised supplementation, periodic risk factor interviews, maternal and infant vital outcome monitoring, birth defect surveillance and clinical/biochemical substudies. Results The primary outcome was pregnancy-related mortality assessed for 3 months following parturition. 
Secondary outcomes included fetal loss due to miscarriage or stillbirth, infant mortality under three months of age, maternal obstetric and infectious morbidity, infant infectious morbidity, maternal and infant micronutrient status, fetal and infant growth and prematurity, external birth defects and postnatal infant growth to 3 months of age. Conclusion Aspects of study site selection and its "resonance" with national and rural qualities of Bangladesh, the trial's design, methods and allocation group comparability achieved by randomization, field procedures and innovative approaches to solving challenges in trial conduct are described and discussed. This trial is registered with http://Clinicaltrials.gov as protocol NCT00198822. PMID:21510905

  11. A quantitative and qualitative evaluation of reports of clinical trials published in six Brazilian dental journals indexed in the Scientific Electronic Library Online (SciELO)

    PubMed Central

    de SOUZA, Raphael Freitas; CHAVES, Carolina de Andrade Lima; NASSER, Mona; FEDOROWICZ, Zbys

    2010-01-01

    Open access publishing is becoming increasingly popular within the biomedical sciences. SciELO, the Scientific Electronic Library Online, is a digital library covering a selected collection of Brazilian scientific journals many of which provide open access to full-text articles. This library includes a number of dental journals some of which may include reports of clinical trials in English, Portuguese and/or Spanish. Thus, SciELO could play an important role as a source of evidence for dental healthcare interventions especially if it yields a sizeable number of high quality reports. Objective The aim of this study was to identify reports of clinical trials by handsearching of dental journals that are accessible through SciELO, and to assess the overall quality of these reports. Material and methods Electronic versions of six Brazilian dental Journals indexed in SciELO were handsearched at www.scielo.br in September 2008. Reports of clinical trials were identified and classified as controlled clinical trials (CCTs - prospective, experimental studies comparing 2 or more healthcare interventions in human beings) or randomized controlled trials (RCTs - a random allocation method is clearly reported), according to Cochrane eligibility criteria. Criteria to assess methodological quality included: method of randomization, concealment of treatment allocation, blinded outcome assessment, handling of withdrawals and losses, and whether an intention-to-treat analysis had been carried out. Results The search retrieved 33 CCTs and 43 RCTs. A majority of the reports provided no description of either the method of randomization (75.3%) or concealment of the allocation sequence (84.2%). Participants and outcome assessors were reported as blinded in only 31.2% of the reports. Withdrawals and losses were only clearly described in 6.5% of the reports and none mentioned an intention-to-treat analysis or any similar procedure. 
Conclusions The results of this study indicate that a substantial number of reports of trials and systematic reviews are available in the dental journals listed in SciELO, and that these could provide valuable evidence for clinical decision making. However, it is clear that the quality of a number of these reports is of some concern and that improvement in the conduct and reporting of these trials could be achieved if authors adhered to internationally accepted guidelines, e.g. the CONSORT statement. PMID:20485919

  12. Predictability of monthly temperature and precipitation using automatic time series forecasting methods

    NASA Astrophysics Data System (ADS)

    Papacharalampous, Georgia; Tyralis, Hristos; Koutsoyiannis, Demetris

    2018-02-01

    We investigate the predictability of monthly temperature and precipitation by applying automatic univariate time series forecasting methods to a sample of 985 40-year-long monthly temperature and 1552 40-year-long monthly precipitation time series. The methods include a naïve one based on the monthly values of the last year, as well as the random walk (with drift), AutoRegressive Fractionally Integrated Moving Average (ARFIMA), exponential smoothing state-space model with Box-Cox transformation, ARMA errors, Trend and Seasonal components (BATS), simple exponential smoothing, Theta and Prophet methods. Prophet is a recently introduced model inspired by the nature of time series forecasted at Facebook and has not been applied to hydrometeorological time series before, while the use of random walk, BATS, simple exponential smoothing and Theta is rare in hydrology. The methods are tested in performing multi-step ahead forecasts for the last 48 months of the data. We further investigate how different choices of handling the seasonality and non-normality affect the performance of the models. The results indicate that: (a) all the examined methods apart from the naïve and random walk ones are accurate enough to be used in long-term applications; (b) monthly temperature and precipitation can be forecasted to a level of accuracy which can barely be improved using other methods; (c) the externally applied classical seasonal decomposition results mostly in better forecasts compared to the automatic seasonal decomposition used by the BATS and Prophet methods; and (d) Prophet is competitive, especially when it is combined with externally applied classical seasonal decomposition.
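    Two of the simplest benchmarks named above can be sketched directly; the data here are a made-up monthly series with trend and seasonality, not the study's temperature or precipitation records.

```python
# Two benchmark forecasters from the comparison: the monthly naive
# method (repeat last year's monthly values) and the random walk with
# drift (extrapolate the average historical step).

def naive_seasonal(series, horizon, period=12):
    last_cycle = series[-period:]
    return [last_cycle[h % period] for h in range(horizon)]

def rw_with_drift(series, horizon):
    drift = (series[-1] - series[0]) / (len(series) - 1)
    return [series[-1] + (h + 1) * drift for h in range(horizon)]

# synthetic 10-year monthly series: linear trend plus a seasonal cycle
history = [10 + (t % 12) + 0.01 * t for t in range(120)]
f_naive = naive_seasonal(history, horizon=24)
f_drift = rw_with_drift(history, horizon=24)
```

    The naive forecast reproduces seasonality but no trend, while the drift forecast extrapolates trend but no seasonality, which is why, per the abstract, both serve only as baselines against the ARFIMA, BATS, Theta, and Prophet methods.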

  13. Verification of recursive probabilistic integration (RPI) method for fatigue life management using non-destructive inspections

    NASA Astrophysics Data System (ADS)

    Chen, Tzikang J.; Shiao, Michael

    2016-04-01

    This paper verified a generic and efficient assessment concept for probabilistic fatigue life management. The concept is developed based on an integration of damage tolerance methodology, simulation methods [1, 2], and a probabilistic algorithm, RPI (recursive probability integration) [3-9], considering maintenance for damage tolerance and risk-based fatigue life management. RPI is an efficient semi-analytical probabilistic method for risk assessment subjected to various uncertainties, such as the variability in material properties including crack growth rate, initial flaw size, repair quality, random-process modeling of flight loads for failure analysis, and inspection reliability represented by probability of detection (POD). In addition, unlike traditional Monte Carlo simulations (MCS), which require a rerun of MCS when the maintenance plan is changed, RPI can repeatedly use a small set of baseline random crack growth histories, excluding maintenance-related parameters, from a single MCS for various maintenance plans. In order to fully appreciate the RPI method, a verification procedure was performed. In this study, MC simulations on the order of several hundred billion runs were conducted for various flight conditions, material properties, inspection scheduling, POD, and repair/replacement strategies. Since MC simulations are time-consuming, they were conducted in parallel on DoD High Performance Computers (HPC) using a specialized random number generator for parallel computing. The study has shown that the RPI method is several orders of magnitude more efficient than traditional Monte Carlo simulations.

  14. Random forests for classification in ecology

    USGS Publications Warehouse

    Cutler, D.R.; Edwards, T.C.; Beard, K.H.; Cutler, A.; Hess, K.T.; Gibson, J.; Lawler, J.J.

    2007-01-01

    Classification procedures are some of the most widely used statistical methods in ecology. Random forests (RF) is a new and powerful statistical classifier that is well established in other disciplines but is relatively unknown in ecology. Advantages of RF compared to other statistical classifiers include (1) very high classification accuracy; (2) a novel method of determining variable importance; (3) ability to model complex interactions among predictor variables; (4) flexibility to perform several types of statistical data analysis, including regression, classification, survival analysis, and unsupervised learning; and (5) an algorithm for imputing missing values. We compared the accuracies of RF and four other commonly used statistical classifiers using data on invasive plant species presence in Lava Beds National Monument, California, USA, rare lichen species presence in the Pacific Northwest, USA, and nest sites for cavity nesting birds in the Uinta Mountains, Utah, USA. We observed high classification accuracy in all applications as measured by cross-validation and, in the case of the lichen data, by independent test data, when comparing RF to other common classification methods. We also observed that the variables that RF identified as most important for classifying invasive plant species coincided with expectations based on the literature. ?? 2007 by the Ecological Society of America.
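    The bootstrap-aggregation idea at the core of random forests can be shown with a toy ensemble of one-split "stumps" fit to bootstrap resamples and combined by majority vote. This is a deliberately simplified sketch on made-up 1-D data; a real random forest additionally grows full trees and samples a random feature subset at each split.

```python
import random

# Bagged decision stumps: each stump is fit to a bootstrap resample of
# the training data, and the ensemble predicts by majority vote.

rng = random.Random(1)

def fit_stump(xs, ys):
    """Pick the threshold and polarity with the lowest training error."""
    best = (None, None, len(ys) + 1)
    for t in sorted(set(xs)):
        for polarity in (1, -1):
            preds = [polarity if x >= t else -polarity for x in xs]
            err = sum(p != y for p, y in zip(preds, ys))
            if err < best[2]:
                best = (t, polarity, err)
    t, polarity, _ = best
    return lambda x: polarity if x >= t else -polarity

def fit_forest(xs, ys, n_trees=25):
    stumps = []
    for _ in range(n_trees):
        boot = [rng.randrange(len(xs)) for _ in range(len(xs))]
        stumps.append(fit_stump([xs[i] for i in boot],
                                [ys[i] for i in boot]))
    return lambda x: 1 if sum(s(x) for s in stumps) >= 0 else -1

xs = [0.1, 0.3, 0.4, 0.6, 0.7, 0.9]   # single toy feature
ys = [-1, -1, -1, 1, 1, 1]            # class labels
predict = fit_forest(xs, ys)
```

    Averaging many high-variance learners fit to perturbed copies of the data is what gives bagging-based ensembles such as RF their accuracy advantage over a single classifier.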

  15. Radiation Therapy Intensification for Solid Tumors: A Systematic Review of Randomized Trials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamoah, Kosj; Showalter, Timothy N.; Ohri, Nitin, E-mail: ohri.nitin@gmail.com

    Purpose: To systematically review the outcomes of randomized trials testing radiation therapy (RT) intensification, including both dose escalation and/or the use of altered fractionation, as a strategy to improve disease control for a number of malignancies. Methods and Materials: We performed a literature search to identify randomized trials testing RT intensification for cancers of the central nervous system, head and neck, breast, lung, esophagus, rectum, and prostate. Findings were described qualitatively. Where adequate data were available, pooled estimates for the effect of RT intensification on local control (LC) or overall survival (OS) were obtained using the inverse variance method. Results: In primary central nervous system tumors, esophageal cancer, and rectal cancer, randomized trials have not demonstrated that RT intensification improves clinical outcomes. In breast cancer and prostate cancer, dose escalation has been shown to improve LC or biochemical disease control but not OS. Radiation therapy intensification may improve LC and OS in head and neck and lung cancers, but these benefits have generally been limited to studies that did not incorporate concurrent chemotherapy. Conclusions: In randomized trials, the benefits of RT intensification have largely been restricted to trials in which concurrent chemotherapy was not used. Novel strategies to optimize the incorporation of RT in the multimodality treatment of solid tumors should be explored.
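The inverse variance method used for the pooled estimates weights each study's effect by the reciprocal of its variance. A minimal fixed-effect sketch follows; the log hazard ratios and standard errors are invented for illustration, not taken from the reviewed trials.

```python
import math

def inverse_variance_pool(effects, ses):
    """Fixed-effect inverse-variance pooling of study effect estimates.

    effects: per-study log hazard ratios; ses: their standard errors.
    Returns the pooled estimate and its standard error.
    """
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, se_pooled

# Hypothetical log hazard ratios for local control from three trials.
log_hrs = [-0.25, -0.10, -0.30]
ses = [0.12, 0.15, 0.20]
pooled, se = inverse_variance_pool(log_hrs, ses)
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
print(f"pooled HR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(lo):.2f}-{math.exp(hi):.2f})")
```

Precise studies (small standard errors) dominate the pooled estimate, which is why the method is the standard default when between-study heterogeneity is low.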

  16. Cluster randomized trials in comparative effectiveness research: randomizing hospitals to test methods for prevention of healthcare-associated infections.

    PubMed

    Platt, Richard; Takvorian, Samuel U; Septimus, Edward; Hickok, Jason; Moody, Julia; Perlin, Jonathan; Jernigan, John A; Kleinman, Ken; Huang, Susan S

    2010-06-01

    The need for evidence about the effectiveness of therapeutics and other medical practices has triggered new interest in methods for comparative effectiveness research. We describe an approach to comparative effectiveness research involving cluster randomized trials in networks of hospitals, health plans, or medical practices with centralized administrative and informatics capabilities. We discuss the example of an ongoing cluster randomized trial to prevent methicillin-resistant Staphylococcus aureus (MRSA) infection in intensive care units (ICUs). The trial randomizes 45 hospitals to: (a) screening cultures of ICU admissions, followed by Contact Precautions if MRSA-positive, (b) screening cultures of ICU admissions followed by decolonization if MRSA-positive, or (c) universal decolonization of ICU admissions without screening. All admissions to adult ICUs. The primary outcome is MRSA-positive clinical cultures occurring ≥2 days following ICU admission. Secondary outcomes include blood and urine infection caused by MRSA (and, separately, all pathogens), as well as the development of resistance to decolonizing agents. Recruitment of hospitals is complete. Data collection will end in Summer 2011. This trial takes advantage of existing personnel, procedures, infrastructure, and information systems in a large integrated hospital network to conduct a low-cost evaluation of prevention strategies under usual practice conditions. This approach is applicable to many comparative effectiveness topics in both inpatient and ambulatory settings.

  17. Identification by random forest method of HLA class I amino acid substitutions associated with lower survival at day 100 in unrelated donor hematopoietic cell transplantation.

    PubMed

    Marino, S R; Lin, S; Maiers, M; Haagenson, M; Spellman, S; Klein, J P; Binkowski, T A; Lee, S J; van Besien, K

    2012-02-01

    The identification of important amino acid substitutions associated with low survival in hematopoietic cell transplantation (HCT) is hampered by the large number of observed substitutions compared with the small number of patients available for analysis. Random forest analysis is designed to address these limitations. We studied 2107 HCT recipients with good or intermediate risk hematological malignancies to identify HLA class I amino acid substitutions associated with reduced survival at day 100 post transplant. Random forest analysis and traditional univariate and multivariate analyses were used. Random forest analysis identified amino acid substitutions in 33 positions that were associated with reduced 100-day survival, including HLA-A 9, 43, 62, 63, 76, 77, 95, 97, 114, 116, 152, 156, 166 and 167; HLA-B 97, 109, 116 and 156; and HLA-C 6, 9, 11, 14, 21, 66, 77, 80, 95, 97, 99, 116, 156, 163 and 173. In all, 13 had been previously reported by other investigators using classical biostatistical approaches. Using the same data set, traditional multivariate logistic regression identified only five amino acid substitutions associated with lower day 100 survival. Random forest analysis is a novel statistical methodology for analysis of HLA mismatching and outcome studies, capable of identifying important amino acid substitutions missed by other methods.

  18. A new time domain random walk method for solute transport in 1-D heterogeneous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Banton, O.; Delay, F.; Porel, G.

    A new method to simulate solute transport in 1-D heterogeneous media is presented. This time domain random walk method (TDRW), similar in concept to the classical random walk method, calculates the arrival time of a particle cloud at a given location (directly providing the solute breakthrough curve). The main advantage of the method is that the restrictions on the space increments and the time steps which exist with the finite differences and random walk methods are avoided. In a homogeneous zone, the breakthrough curve (BTC) can be calculated directly at a given distance using a few hundred particles or directly at the boundary of the zone. Comparisons with analytical solutions and with the classical random walk method show the reliability of this method. The velocity and dispersivity calculated from the simulated results agree within two percent with the values used as input in the model. For contrasted heterogeneous media, the random walk can generate high numerical dispersion, while the time domain approach does not.
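For a homogeneous zone, the idea of sampling arrival times directly can be sketched as follows: the mean travel time over a distance L is L/v, and Fickian dispersion gives the arrival-time spread (variance 2·α·L/v²), so a breakthrough curve is obtained without any spatial stepping. This is a simplified illustration under a Gaussian travel-time assumption, not the authors' full TDRW scheme; all parameters are invented.

```python
import math
import random

random.seed(2)

def arrival_times(n_particles, length, v, alpha):
    """Sample particle arrival times at x = length for one homogeneous zone.

    Mean travel time is length/v; longitudinal dispersivity alpha gives
    an arrival-time standard deviation of sqrt(2*alpha*length)/v.
    """
    mean_t = length / v
    sigma_t = math.sqrt(2.0 * alpha * length) / v
    return [max(0.0, random.gauss(mean_t, sigma_t)) for _ in range(n_particles)]

def breakthrough_curve(times, n_bins=20):
    """Cumulative fraction of particles arrived vs. time (the BTC)."""
    times = sorted(times)
    t_max = times[-1]
    curve = []
    for i in range(1, n_bins + 1):
        t = t_max * i / n_bins
        arrived = sum(1 for ti in times if ti <= t)
        curve.append((t, arrived / len(times)))
    return curve

# Illustrative parameters: 10 m column, v = 1 m/day, dispersivity 0.1 m.
times = arrival_times(5000, length=10.0, v=1.0, alpha=0.1)
btc = breakthrough_curve(times)
mean_arrival = sum(times) / len(times)
print(f"mean arrival time: {mean_arrival:.2f} days")
```

Because each particle contributes one arrival-time sample rather than a trajectory of spatial steps, a few thousand particles suffice for a smooth BTC, which is the efficiency the abstract highlights.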

  19. Turbulence and fire-spotting effects into wild-land fire simulators

    NASA Astrophysics Data System (ADS)

    Kaur, Inderpreet; Mentrelli, Andrea; Bosseur, Frédéric; Filippi, Jean-Baptiste; Pagnini, Gianni

    2016-10-01

    This paper presents a mathematical approach to model the effects and the role of phenomena of a random nature, such as turbulence and fire-spotting, in existing wildfire simulators. The formulation proposes that the propagation of the fire-front is the sum of a drifting component (obtained from an existing wildfire simulator without turbulence and fire-spotting) and a random fluctuating component. The modelling of the random effects is embodied in a probability density function accounting for the fluctuations around the fire perimeter which is given by the drifting component. In the past, this formulation has been applied to include these random effects into a wildfire simulator based on an Eulerian moving interface method, namely the Level Set Method (LSM), but in this paper the same formulation is adapted for a wildfire simulator based on a Lagrangian front tracking technique, namely the Discrete Event System Specification (DEVS). The main highlight of the present study is the comparison of the performance of a Lagrangian and an Eulerian moving interface method when applied to wild-land fire propagation. Simple idealised numerical experiments are used to investigate the potential applicability of the proposed formulation to DEVS and to compare its behaviour with respect to the LSM. The results show that the DEVS-based wildfire propagation model qualitatively improves its performance (e.g., reproducing flank and back fire, increase in fire spread due to pre-heating of the fuel by hot air and firebrands, fire propagation across no fuel zones, secondary fire generation, ...) when random effects are included according to the present formulation. The performance of the DEVS- and LSM-based wildfire models is comparable, and the only differences that arise between the two are due to the differences in the geometrical construction of the direction of propagation.
Though the results presented here lack a validation exercise and provide only a proof of concept, they indicate strong potential for operational use. Existing LSM- and DEVS-based operational simulators, such as WRF-SFIRE and ForeFire respectively, can serve as an ideal basis for this.
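The formulation's key step, smoothing the drifted fire perimeter with the probability density of the random fluctuations, can be sketched in one dimension. The Gaussian fluctuation PDF and the front-marker positions below are illustrative assumptions, not the paper's LSM/DEVS implementation.

```python
import math
import random

random.seed(5)

def effective_front(drift_positions, sigma, grid):
    """Effective fire presence on a 1-D grid: the drifted front markers
    smoothed by a Gaussian PDF of the random fluctuations."""
    norm = sigma * math.sqrt(2.0 * math.pi)
    return [sum(math.exp(-0.5 * ((x - xf) / sigma) ** 2) / norm
                for xf in drift_positions) / len(drift_positions)
            for x in grid]

drift = [10.0 + random.gauss(0, 0.2) for _ in range(50)]  # markers near x = 10
grid = [0.5 * i for i in range(60)]                        # x from 0 to 29.5
phi = effective_front(drift, sigma=1.5, grid=grid)
peak_x = grid[max(range(len(grid)), key=phi.__getitem__)]
print(f"effective front peaks near x = {peak_x:.1f}")
```

Because the smoothing assigns nonzero fire presence beyond the sharp drifted perimeter, it is this term that lets the model ignite fuel across firebreaks and ahead of the front, mimicking turbulence and fire-spotting.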

  20. Developing appropriate methods for cost-effectiveness analysis of cluster randomized trials.

    PubMed

    Gomes, Manuel; Ng, Edmond S-W; Grieve, Richard; Nixon, Richard; Carpenter, James; Thompson, Simon G

    2012-01-01

    Cost-effectiveness analyses (CEAs) may use data from cluster randomized trials (CRTs), where the unit of randomization is the cluster, not the individual. However, most studies use analytical methods that ignore clustering. This article compares alternative statistical methods for accommodating clustering in CEAs of CRTs. Our simulation study compared the performance of statistical methods for CEAs of CRTs with 2 treatment arms. The study considered a method that ignored clustering--seemingly unrelated regression (SUR) without a robust standard error (SE)--and 4 methods that recognized clustering--SUR and generalized estimating equations (GEEs), both with robust SE, a "2-stage" nonparametric bootstrap (TSB) with shrinkage correction, and a multilevel model (MLM). The base case assumed CRTs with moderate numbers of balanced clusters (20 per arm) and normally distributed costs. Other scenarios included CRTs with few clusters, imbalanced cluster sizes, and skewed costs. Performance was reported as bias, root mean squared error (rMSE), and confidence interval (CI) coverage for estimating incremental net benefits (INBs). We also compared the methods in a case study. Each method reported low levels of bias. Without the robust SE, SUR gave poor CI coverage (base case: 0.89 v. nominal level: 0.95). The MLM and TSB performed well in each scenario (CI coverage, 0.92-0.95). With few clusters, the GEE and SUR (with robust SE) had coverage below 0.90. In the case study, the mean INBs were similar across all methods, but ignoring clustering underestimated statistical uncertainty and the value of further research. MLMs and the TSB are appropriate analytical methods for CEAs of CRTs with the characteristics described. SUR and GEE are not recommended for studies with few clusters.
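Of the methods compared, the two-stage nonparametric bootstrap is the simplest to sketch: resample clusters with replacement, then individuals within each chosen cluster, and take percentile limits of the incremental net benefit. This sketch omits the shrinkage correction the study applied, and the toy cost/QALY data are invented.

```python
import random

random.seed(3)

def incremental_net_benefit(arm1, arm0, wtp=20_000):
    """Mean incremental net benefit, INB = wtp * d_effect - d_cost.

    Each arm is a list of clusters; each cluster is a list of
    (cost, effect) tuples, one per individual.
    """
    def arm_means(arm):
        people = [p for cluster in arm for p in cluster]
        n = len(people)
        return (sum(c for c, _ in people) / n,
                sum(e for _, e in people) / n)

    c1, e1 = arm_means(arm1)
    c0, e0 = arm_means(arm0)
    return wtp * (e1 - e0) - (c1 - c0)

def two_stage_bootstrap(arm1, arm0, n_boot=500):
    """Percentile CI from a two-stage bootstrap: resample clusters with
    replacement, then individuals within each resampled cluster."""
    def resample(arm):
        out = []
        for _ in arm:                                    # stage 1: clusters
            cl = random.choice(arm)
            out.append([random.choice(cl) for _ in cl])  # stage 2: individuals
        return out

    inbs = sorted(incremental_net_benefit(resample(arm1), resample(arm0))
                  for _ in range(n_boot))
    return inbs[int(0.025 * n_boot)], inbs[int(0.975 * n_boot)]

# Toy CRT: 5 clusters of 8 per arm; the intervention arm costs more but
# yields more effect (all values invented).
arm0 = [[(1000 + random.gauss(0, 100), 0.70 + random.gauss(0, 0.05))
         for _ in range(8)] for _ in range(5)]
arm1 = [[(1500 + random.gauss(0, 100), 0.75 + random.gauss(0, 0.05))
         for _ in range(8)] for _ in range(5)]
inb = incremental_net_benefit(arm1, arm0)
lo, hi = two_stage_bootstrap(arm1, arm0)
print(f"INB = {inb:.0f}, 95% percentile CI ({lo:.0f}, {hi:.0f})")
```

Resampling clusters in the first stage is what preserves the within-cluster correlation; a naive individual-level bootstrap would understate the uncertainty, which is exactly the failure mode the abstract attributes to ignoring clustering.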

  1. Mechanical and Pharmacologic Methods of Labor Induction: A Randomized Controlled Trial

    PubMed Central

    Levine, Lisa D.; Downes, Katheryne L.; Elovitz, Michal A.; Parry, Samuel; Sammel, Mary D.; Srinivas, Sindhu K

    2016-01-01

    Objective To evaluate the effectiveness of four commonly used induction methods. Methods This randomized trial compared four induction methods: Misoprostol alone, Foley alone, Misoprostol–cervical Foley concurrently, and Foley–oxytocin concurrently. Women undergoing labor induction with full term (≥37 weeks), singleton, vertex presenting gestations, with no contraindication to vaginal delivery, intact membranes, Bishop score ≤6, and cervical dilation ≤2cm were included. Women were enrolled only once during the study period. Our primary outcome was time to delivery. Neither patients nor providers were blinded to assigned treatment group since examinations are required for placement of all methods; however, research personnel were blinded during data abstraction. A sample size of 123 per group (N=492) was planned to compare the four groups pairwise (P≤.008), with a 4-hour reduction in delivery time considered clinically meaningful. Results From May 2013 through June 2015, 997 women were screened and 491 were randomized and analyzed. Demographic and clinical characteristics were similar among the four treatment groups. When comparing all induction method groups, combination methods achieved a faster median time to delivery than single-agent methods (misoprostol–Foley: 13.1 hours, Foley–oxytocin: 14.5 hours, misoprostol: 17.6 hours, Foley: 17.7 hours, p<0.001). When censored for cesarean and adjusting for parity, women who received misoprostol–Foley were almost twice as likely to deliver before women who received misoprostol alone (hazard ratio [HR] 1.92, 95% CI 1.42–2.59) or Foley alone (HR 1.87, 95% CI 1.39–2.52), whereas Foley–oxytocin was not statistically different from single-agent methods. Conclusion After censoring for cesarean and adjusting for parity, misoprostol–cervical Foley resulted in twice the chance of delivering before either single-agent method. PMID:27824758

  2. Perspective: Randomized Controlled Trials Are Not a Panacea for Diet-Related Research

    PubMed Central

    Hébert, James R; Frongillo, Edward A; Adams, Swann A; Turner-McGrievy, Gabrielle M; Hurley, Thomas G; Miller, Donald R; Ockene, Ira S

    2016-01-01

    Research into the role of diet in health faces a number of methodologic challenges in the choice of study design, measurement methods, and analytic options. Heavier reliance on randomized controlled trial (RCT) designs is suggested as a way to solve these challenges. We present and discuss 7 inherent and practical considerations with special relevance to RCTs designed to study diet: 1) the need for narrow focus; 2) the choice of subjects and exposures; 3) blinding of the intervention; 4) perceived asymmetry of treatment in relation to need; 5) temporal relations between dietary exposures and putative outcomes; 6) strict adherence to the intervention protocol, despite potential clinical counter-indications; and 7) the need to maintain methodologic rigor, including measuring diet carefully and frequently. Alternatives, including observational studies and adaptive intervention designs, are presented and discussed. Given high noise-to-signal ratios interjected by using inaccurate assessment methods in studies with weak or inappropriate study designs (including RCTs), it is conceivable and indeed likely that effects of diet are underestimated. No matter which designs are used, studies will require continued improvement in the assessment of dietary intake. As technology continues to improve, there is potential for enhanced accuracy and reduced user burden of dietary assessments that are applicable to a wide variety of study designs, including RCTs. PMID:27184269

  3. Building capacity for rigorous controlled trials in autism: the importance of measuring treatment adherence.

    PubMed

    McConachie, H; Fletcher-Watson, S

    2015-03-01

    Research groups across Europe have been networking to share information and ideas about research on preschool children with autism. The paper describes preliminary work to develop capacity for future multi-site randomized controlled trials of early intervention, with a specific focus on the need to measure treatment adherence where parents deliver therapy. The paper includes a review of randomized and controlled studies of parent-mediated early intervention from two sources, a recent Cochrane Collaboration review and a mapping of European early intervention studies in autism published since 2002. The data extracted focused on methods for describing parent adherence, that is, how and to what extent parents carry out the strategies taught them by therapists. Fewer than half of the 32 studies reviewed included any measure of parent adherence. Only seven included a direct assessment method. The challenges of developing pan-European early intervention evaluation studies are discussed, including the choice of intervention model and of important outcomes, the need for translation of measurement tools, and the achievement of joint training of assessors to reliability. Measurement of parent-child interaction style and of adherence to strategies taught needs further study. © 2014 The Authors. Child: Care, Health and Development published by John Wiley & Sons Ltd.

  4. Adsorption energies of benzene on close packed transition metal surfaces using the random phase approximation

    NASA Astrophysics Data System (ADS)

    Garrido Torres, José A.; Ramberger, Benjamin; Früchtl, Herbert A.; Schaub, Renald; Kresse, Georg

    2017-11-01

    The adsorption energy of benzene on various metal substrates is predicted using the random phase approximation (RPA) for the correlation energy. Agreement with available experimental data is systematically better than 10% for both coinage and reactive metals. The results are also compared with more approximate methods, including van der Waals density functional theory (DFT), as well as dispersion-corrected DFT functionals. Although dispersion-corrected DFT can yield accurate results, for instance, on coinage metals, the adsorption energies are clearly overestimated on more reactive transition metals. Furthermore, coverage dependent adsorption energies are well described by the RPA. This shows that for the description of aromatic molecules on metal surfaces further improvements in density functionals are necessary, or more involved many-body methods such as the RPA are required.

  5. Albumin in Burn Shock Resuscitation: A Meta-Analysis of Controlled Clinical Studies.

    PubMed

    Navickis, Roberta J; Greenhalgh, David G; Wilkes, Mahlon M

    2016-01-01

    Critical appraisal of outcomes after burn shock resuscitation with albumin has previously been restricted to small, relatively old randomized trials, some with high risk of bias. Extensive recent data from nonrandomized studies assessing the use of albumin can potentially reduce bias and add precision. The objective of this meta-analysis was to determine the effect of burn shock resuscitation with albumin on mortality and morbidity in adult patients. Randomized and nonrandomized controlled clinical studies evaluating mortality and morbidity in adult patients receiving albumin for burn shock resuscitation were identified by multiple methods, including computer database searches and examination of journal contents and reference lists. Extracted data were quantitatively combined by random-effects meta-analysis. Four randomized and four nonrandomized studies with 688 total adult patients were included. Treatment effects did not differ significantly between the included randomized and nonrandomized studies. Albumin infusion during the first 24 hours showed no significant overall effect on mortality. However, significant statistical heterogeneity was present, which could be abolished by excluding two studies at high risk of bias. After those exclusions, albumin infusion was associated with reduced mortality. The pooled odds ratio was 0.34 with a 95% confidence interval of 0.19 to 0.58 (P < .001). Albumin administration was also accompanied by decreased occurrence of compartment syndrome (pooled odds ratio, 0.19; 95% confidence interval, 0.07-0.50; P < .001). This meta-analysis suggests that albumin can improve outcomes of burn shock resuscitation. However, the scope and quality of current evidence are limited, and additional trials are needed.
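The random-effects pooling described can be sketched with the DerSimonian-Laird estimator, a standard choice for combining log odds ratios when between-study heterogeneity is expected. The study effects and standard errors below are invented for illustration, not the meta-analysis data.

```python
import math

def dersimonian_laird(effects, ses):
    """Random-effects pooling of log odds ratios (DerSimonian-Laird tau^2)."""
    w = [1.0 / se ** 2 for se in ses]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q measures between-study heterogeneity
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)          # between-study variance estimate
    w_re = [1.0 / (se ** 2 + tau2) for se in ses]
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se_pooled = math.sqrt(1.0 / sum(w_re))
    return pooled, se_pooled, tau2

# Hypothetical log odds ratios of mortality from four studies.
log_ors = [-1.2, -0.8, -1.5, -0.6]
ses = [0.45, 0.50, 0.60, 0.40]
pooled, se, tau2 = dersimonian_laird(log_ors, ses)
print(f"pooled OR = {math.exp(pooled):.2f}, tau^2 = {tau2:.3f}")
```

When tau² is positive, the random-effects weights are flatter than inverse-variance weights and the pooled confidence interval widens, which is how the method reflects the statistical heterogeneity the abstract reports.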

  6. Kindergarten Teachers' Experience with Reporting Child Abuse in Taiwan

    ERIC Educational Resources Information Center

    Feng, Jui-Ying; Huang, Tzu-Yi; Wang, Chi-Jen

    2010-01-01

    Objective: The objectives were to examine factors associated with reporting child abuse among kindergarten teachers in Taiwan based on the Theory of Planned Behavior (TPB). Method: A stratified quota sampling technique was used to randomly select kindergarten teachers in Taiwan. The Child Abuse Intention Report Scale, which includes demographics,…

  7. School Programs Targeting Stress Management in Children and Adolescents: A Meta-Analysis

    ERIC Educational Resources Information Center

    Kraag, Gerda; Zeegers, Maurice P.; Kok, Gerjo; Hosman, Clemens; Abu-Saad, Huda Huijer

    2006-01-01

    Introduction: This meta-analysis evaluates the effect of school programs targeting stress management or coping skills in school children. Methods: Articles were selected through a systematic literature search. Only randomized controlled trials or quasi-experimental studies were included. The standardized mean differences (SMDs) between baseline…

  8. Anhedonia Predicts Poorer Recovery among Youth with Selective Serotonin Reuptake Inhibitor Treatment-Resistant Depression

    ERIC Educational Resources Information Center

    McMakin, Dana L.; Olino, Thomas M.; Porta, Giovanna; Dietz, Laura J.; Emslie, Graham; Clarke, Gregory; Wagner, Karen Dineen; Asarnow, Joan R.; Ryan, Neal D.; Birmaher, Boris; Shamseddeen, Wael; Mayes, Taryn; Kennard, Betsy; Spirito, Anthony; Keller, Martin; Lynch, Frances L.; Dickerson, John F.; Brent, David A.

    2012-01-01

    Objective: To identify symptom dimensions of depression that predict recovery among selective serotonin reuptake inhibitor (SSRI) treatment-resistant adolescents undergoing second-step treatment. Method: The Treatment of Resistant Depression in Adolescents (TORDIA) trial included 334 SSRI treatment-resistant youth randomized to a medication…

  9. Decision-Tree, Rule-Based, and Random Forest Classification of High-Resolution Multispectral Imagery for Wetland Mapping and Inventory

    EPA Science Inventory

    Efforts are increasingly being made to classify the world’s wetland resources, an important ecosystem and habitat that is diminishing in abundance. There are multiple remote sensing classification methods, including a suite of nonparametric classifiers such as decision-tree...

  10. Allowing for Correlations between Correlations in Random-Effects Meta-Analysis of Correlation Matrices

    ERIC Educational Resources Information Center

    Prevost, A. Toby; Mason, Dan; Griffin, Simon; Kinmonth, Ann-Louise; Sutton, Stephen; Spiegelhalter, David

    2007-01-01

    Practical meta-analysis of correlation matrices generally ignores covariances (and hence correlations) between correlation estimates. The authors consider various methods for allowing for covariances, including generalized least squares, maximum marginal likelihood, and Bayesian approaches, illustrated using a 6-dimensional response in a series of…

  11. Urban College Student Self-Report of Hookah Use with Health Care Providers

    ERIC Educational Resources Information Center

    Jani, Samir Ranjit; Brown, Darryl; Berhane, Zekarias; Peter, Nadja; Solecki, Susan; Turchi, Renee

    2018-01-01

    Objective: This study's purpose was to describe urban college students' communication about hookah with health care providers. Participants: Participants included a random sample of undergraduate urban college students and health care providers. Methods: Students surveyed determined the epidemiology of hookah use in this population, how many…

  12. Problem-Solving Therapy for Depression in Adults: A Systematic Review

    ERIC Educational Resources Information Center

    Gellis, Zvi D.; Kenaley, Bonnie

    2008-01-01

    Objectives: This article presents a systematic review of the evidence on problem-solving therapy (PST) for depressive disorders in noninstitutionalized adults. Method: Intervention studies using randomized controlled designs are included and methodological quality is assessed using a standard set of criteria from the Cochrane Collaborative Review…

  13. Delivery Systems: "Saber Tooth" Effect in Counseling.

    ERIC Educational Resources Information Center

    Traylor, Elwood B.

    This study reported the role of counselors as perceived by black students in a secondary school. Observational and interview methods were employed to obtain data from 24 black students selected at random from the junior and senior classes of a large metropolitan secondary school. Findings include: counselors were essentially concerned with…

  14. Calorie Restriction on Drinking Days: An Examination of Drinking Consequences among College Students

    ERIC Educational Resources Information Center

    Giles, Steven M.; Champion, Heather; Sutfin, Erin L.; McCoy, Thomas P.; Wagoner, Kim

    2009-01-01

    Objective: This study examined the association between restricting calories on intended drinking days and drunkenness frequency and alcohol-related consequences among college students. Participants: Participants included a random sample of 4,271 undergraduate college students from 10 universities. Methods: Students completed a Web-based survey…

  15. Random function representation of stationary stochastic vector processes for probability density evolution analysis of wind-induced structures

    NASA Astrophysics Data System (ADS)

    Liu, Zhangjun; Liu, Zenghui

    2018-06-01

    This paper develops a hybrid approach of spectral representation and random function for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula can be effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. On this basis, a satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully captured with just several hundred sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), the approach enables dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulent wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies concerning the probability density evolution analysis of the wind-induced structure are conducted to illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
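The classical spectral representation that the random-function approach builds on can be sketched as a cosine series with random phases. The paper's contribution, reducing the many random variables to two elementary ones via random-function constraints, is not reproduced here, and the one-sided spectrum below is an illustrative placeholder rather than a real wind-turbulence spectrum.

```python
import math
import random

random.seed(4)

def spectral_sample(spectrum, omegas, d_omega, times):
    """One sample function of a zero-mean stationary process via the
    spectral representation method: cosine series with random phases."""
    phases = [random.uniform(0.0, 2.0 * math.pi) for _ in omegas]
    amps = [math.sqrt(2.0 * s * d_omega) for s in spectrum]
    return [sum(a * math.cos(w * t + p)
                for a, w, p in zip(amps, omegas, phases))
            for t in times]

# Illustrative one-sided spectrum (not a real wind-turbulence spectrum).
d_omega = 0.05
omegas = [d_omega * (k + 0.5) for k in range(200)]
spectrum = [1.0 / (1.0 + w ** 2) for w in omegas]

# The process variance should equal the spectral area sum(S)*d_omega;
# check it from an ensemble of samples evaluated at t = 0.
target_var = sum(spectrum) * d_omega
vals = [spectral_sample(spectrum, omegas, d_omega, [0.0])[0]
        for _ in range(400)]
est_var = sum(v * v for v in vals) / len(vals)
print(f"target variance {target_var:.3f}, ensemble estimate {est_var:.3f}")
```

In the OSR formula every frequency carries its own random phase; the paper's random-function constraint replaces this high-dimensional phase vector with deterministic functions of two elementary random variables, which is what makes a small representative point set sufficient for the PDEM.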

  16. Guidelines for randomized clinical trial protocol content: a systematic review

    PubMed Central

    2012-01-01

    Background All randomized clinical trials (RCTs) require a protocol; however, numerous studies have highlighted protocol deficiencies. Reporting guidelines may improve the content of research reports and, if developed using robust methods, may increase the utility of reports to stakeholders. The objective of this study was to systematically identify and review RCT protocol guidelines, to assess their characteristics and methods of development, and to compare recommendations. Methods We conducted a systematic review of indexed literature (MEDLINE, EMBASE and the Cochrane Methodology Register from inception to September 2010; reference lists; related article features; forward citation searching) and a targeted search of supplementary sources, including a survey of major trial funding agencies in six countries. Records were eligible if they described a content guideline in English or French relevant to RCT protocols. Guidelines were excluded if they specified content for protocols for trials of specific procedures or conditions or were intended to assess trial quality. We extracted guideline characteristics and methods. Content was mapped for a subset of guidelines that described development methods or had institutional endorsement. Results Forty guidelines published in journals, books and institutional reports were included in the review; seven were specific to RCT protocols. Only eight (20%) described development methods which included informal consensus methods, pilot testing and formal validation; no guideline described all of these methods. No guideline described formal consensus methods or a systematic retrieval of empirical evidence to inform its development. The guidelines included a median of 23 concepts per guideline (interquartile range (IQR) = 14 to 34; range = 7 to 109). 
Among the subset of guidelines (n = 23) for which content was mapped, approximately 380 concepts were explicitly addressed (median = 31 concepts per guideline, IQR = 24 to 80; range = 16 to 150); most concepts were addressed in a minority of guidelines. Conclusions Existing guidelines for RCT protocol content varied substantially in their recommendations. Few reports described the methods of guideline development, limiting comparisons of guideline validity. Given the importance of protocols to diverse stakeholders, we believe a systematically developed, evidence-informed guideline for clinical trial protocols is needed. PMID:23006870

  17. The Misgav Ladach method for cesarean section compared to the Pfannenstiel method.

    PubMed

    Darj, E; Nordström, M L

    1999-01-01

    The aim of the study was to evaluate the outcome of two different methods of cesarean section (CS). The study was designed as a prospective, randomized, controlled trial. All CS were performed at the University Hospital in Uppsala, Sweden. Fifty women admitted to hospital for a first elective CS were consecutively included in the study. They were randomly allocated to two groups. One group was operated on by the Misgav Ladach method for CS and the other group by the Pfannenstiel method. All operations were performed by the same surgeon. The outcome measures were duration of operation, amount of bleeding, analgesics required, scar appearance and length of hospitalization. Operating time was significantly different between the two methods, with an average of 12.5 minutes with the Misgav Ladach method and 26 minutes with the Pfannenstiel method (p<0.001). The amount of blood loss differed significantly, with 448 ml and 608 ml respectively (p=0.017). Significantly fewer analgesic injections and tablets (p=0.004) were needed after the Misgav Ladach method. The Misgav Ladach method of CS has advantages over the Pfannenstiel method by being significantly quicker to perform, with a reduced amount of bleeding and diminished postoperative pain. The women were satisfied with the appearance of their scars. In this study no negative effects of the new operation technique were discovered.

  18. Effects of different preservation methods on inter simple sequence repeat (ISSR) and random amplified polymorphic DNA (RAPD) molecular markers in botanic samples.

    PubMed

    Wang, Xiaolong; Li, Lin; Zhao, Jiaxin; Li, Fangliang; Guo, Wei; Chen, Xia

    2017-04-01

    To evaluate the effects of different preservation methods (stored in a -20°C ice chest, preserved in liquid nitrogen and dried in silica gel) on inter simple sequence repeat (ISSR) or random amplified polymorphic DNA (RAPD) analyses in various botanical specimens (including broad-leaved plants, needle-leaved plants and succulent plants) for different times (three weeks and three years), we used a statistical analysis based on the number of bands, genetic index and cluster analysis. The results demonstrate that methods used to preserve samples can provide sufficient amounts of genomic DNA for ISSR and RAPD analyses; however, the effect of different preservation methods on these analyses vary significantly, and the preservation time has little effect on these analyses. Our results provide a reference for researchers to select the most suitable preservation method depending on their study subject for the analysis of molecular markers based on genomic DNA. Copyright © 2017 Académie des sciences. Published by Elsevier Masson SAS. All rights reserved.

  19. Translational models of infection prevention and control: lessons from studying high risk aging populations.

    PubMed

    Mody, Lona

    2018-06-13

    The present review describes our research experiences and efforts in advancing the field of infection prevention and control in nursing facilities, including postacute and long-term care settings. There are over two million infections in postacute and long-term care settings each year in the United States and $4 billion in associated costs. To define a target group most amenable to infection prevention and control interventions, we sought to quantify the relation between indwelling device use and microbial colonization in nursing facility patients. Using various methodologies including survey methods, observational epidemiology, randomized controlled studies, and collaboratives, we showed that (1) indwelling device type is related to the site of multidrug-resistant organism (MDRO) colonization; (2) multianatomic site colonization with MDROs is common; (3) community-associated methicillin-resistant Staphylococcus aureus (MRSA) appeared in the nursing facility setting almost immediately following its emergence in acute care; (4) MDRO prevalence and catheter-associated infection rates can be reduced through a multimodal targeted infection prevention intervention; and (5) using a collaborative approach, such an intervention can be successfully scaled up. Our work advances the infection prevention field through translational research utilizing various methodologies, including quantitative and qualitative surveys, patient-oriented randomized controlled trials, and clinical microbiologic and molecular methods. The resulting interventions employ patient-oriented methods to reduce infections and antimicrobial resistance, and with partnerships from major national entities, can be implemented nationally.

  20. The Clinical Effects of Aromatherapy Massage on Reducing Pain for the Cancer Patients: Meta-Analysis of Randomized Controlled Trials

    PubMed Central

    Chen, Ting-Hao; Tung, Tao-Hsin; Chen, Pei-Shih; Wang, Shu-Hui; Chao, Chuang-Min; Hsiung, Nan-Hsing; Chi, Ching-Chi

    2016-01-01

    Purpose. Aromatherapy massage is an alternative treatment for reducing the pain of cancer patients. This study investigated whether aromatherapy massage could improve the pain of cancer patients. Methods. We searched PubMed and the Cochrane Library for relevant randomized controlled trials without language limitations published between 1 January 1990 and 31 July 2015, with a priori defined inclusion and exclusion criteria. The search terms included aromatherapy, essential oil, pain, ache, cancer, tumor, and carcinoma. Of 63 eligible publications, 7 studies met the selection criteria and 3 were eventually included. Results. This meta-analysis included three randomized controlled trials with a total of 278 participants (135 participants in the massage with essential oil group and 143 participants in the control (usual care) group). Compared with the control group, the massage with essential oil group showed no significant reduction in pain (standardized mean difference = 0.01; 95% CI [−0.23, 0.24]). Conclusion. Aromatherapy massage does not appear to reduce the pain of cancer patients. Further rigorous studies should be conducted with more objective measures. PMID:26884799

  1. Intention-to-treat analysis and accounting for missing data in orthopaedic randomized clinical trials.

    PubMed

    Herman, Amir; Botser, Itamar Busheri; Tenenbaum, Shay; Chechick, Ahron

    2009-09-01

    The intention-to-treat principle implies that all patients who are randomized in a clinical trial should be analyzed according to their original allocation. This means that patients crossing over to another treatment group and patients lost to follow-up should be included in the analysis as a part of their original group. This principle is important for preserving the randomization scheme, which is the basis for correct inference in any randomized trial. In this study, we examined the use of the intention-to-treat principle in recently published orthopaedic clinical trials. We surveyed eight leading orthopaedic journals for randomized clinical trials published between January 2005 and August 2008. We determined whether the intention-to-treat principle was implemented and, if so, how it was used in each trial. Specifically, we ascertained which methods were used to account for missing data. Our search yielded 274 randomized clinical trials, and the intention-to-treat principle was used in ninety-six (35%) of them. There were significant differences among the journals with regard to the use of the intention-to-treat principle. The relative number of trials in which the principle was used increased each year. The authors adhered to the strict definition of the intention-to-treat principle in forty-five of the ninety-six studies in which it was claimed that this principle had been used. In forty-four randomized trials, patients who had been lost to follow-up were excluded from the final analysis; this practice was most notable in studies of surgical interventions. The most popular method of adjusting for missing data was the "last observation carried forward" technique. In most of the randomized clinical trials published in the orthopaedic literature, the investigators did not adhere to the stringent use of the intention-to-treat principle, with the most conspicuous problem being a lack of accounting for patients lost to follow-up. This omission might introduce bias to orthopaedic randomized clinical trials and their analysis.
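    The "last observation carried forward" technique mentioned above can be sketched in a few lines. This is a minimal illustration of the imputation rule itself (the variable names and values are hypothetical), not an endorsement of the method, which is known to bias results under many missingness patterns.

```python
# Last observation carried forward (LOCF): replace each missing follow-up
# value with the most recent observed value. Illustrative sketch only.

def locf(series):
    """Fill missing entries (None) with the last observed value.
    Leading missing entries stay missing: there is nothing to carry forward."""
    filled, last = [], None
    for value in series:
        if value is not None:
            last = value
        filled.append(last)
    return filled

# One patient's outcome scores over five visits; None marks a missed visit.
scores = [72, 75, None, 81, None]
print(locf(scores))  # [72, 75, 75, 81, 81]
```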

  2. A Bayesian comparative effectiveness trial in action: developing a platform for multisite study adaptive randomization.

    PubMed

    Brown, Alexandra R; Gajewski, Byron J; Aaronson, Lauren S; Mudaranthakam, Dinesh Pal; Hunt, Suzanne L; Berry, Scott M; Quintana, Melanie; Pasnoor, Mamatha; Dimachkie, Mazen M; Jawdat, Omar; Herbelin, Laura; Barohn, Richard J

    2016-08-31

    In the last few decades, the number of trials using Bayesian methods has grown rapidly. Publications prior to 1990 included only three clinical trials that used Bayesian methods, but that number quickly jumped to 19 in the 1990s and to 99 from 2000 to 2012. While this literature provides many examples of Bayesian Adaptive Designs (BAD), none of the available papers walks the reader through the detailed process of conducting a BAD. This paper fills that gap by describing the BAD process used for one comparative effectiveness trial (Patient Assisted Intervention for Neuropathy: Comparison of Treatment in Real Life Situations) that can be generalized for use by others. A BAD was chosen with efficiency in mind: response-adaptive randomization allows the potential for substantially smaller sample sizes and can provide faster conclusions about which treatment or treatments are most effective. An Internet-based electronic data capture tool, which features a randomization module, facilitated data capture across study sites, and an in-house computation software program was developed to implement the response-adaptive randomization. A process for adapting randomization was developed in which a new randomization table can be generated quickly and seamlessly integrated into the data capture tool with minimal interruption to study sites. This manuscript is the first to detail the technical process used to evaluate a multisite comparative effectiveness trial using adaptive randomization. An important opportunity for the application of Bayesian trials is in comparative effectiveness trials. The specific case study presented in this paper can be used as a model for conducting future clinical trials using a combination of statistical software and a web-based application. ClinicalTrials.gov Identifier: NCT02260388, registered on 6 October 2014.
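    The core idea of response-adaptive randomization can be illustrated with Thompson sampling on Beta-Binomial posteriors: allocation probabilities are tilted toward the arm most likely to be best given interim data. This is a generic sketch under uniform Beta(1, 1) priors, not the algorithm or software used in the trial described above.

```python
# Response-adaptive randomization via Thompson sampling: estimate, for each
# arm, the posterior probability that it is the best arm, and use those
# probabilities as the new allocation ratio. Generic illustration only.
import random

def update_allocation(successes, failures, draws=10000, seed=0):
    """Per-arm allocation probabilities from Beta(1 + s, 1 + f) posteriors."""
    rng = random.Random(seed)
    arms = len(successes)
    wins = [0] * arms
    for _ in range(draws):
        samples = [rng.betavariate(1 + successes[a], 1 + failures[a])
                   for a in range(arms)]
        wins[samples.index(max(samples))] += 1
    return [w / draws for w in wins]

# Interim data: arm 1 (20/30 responses) looks better than arm 0 (12/30),
# so arm 1 receives the larger share of future allocations.
probs = update_allocation(successes=[12, 20], failures=[18, 10])
print(probs)
```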

  3. A randomized, controlled, multicenter contraceptive efficacy clinical trial of the intravas device, a nonocclusive surgical male sterilization.

    PubMed

    Lu, Wen-Hong; Liang, Xiao-Wei; Gu, Yi-Qun; Wu, Wei-Xiong; Bo, Li-Wei; Zheng, Tian-Gui; Chen, Zhen-Wen

    2014-01-01

    Because of unavoidable complications of vasectomy, this study was undertaken to assess the efficacy and safety of male sterilization with a nonobstructive intravas device (IVD) implanted into the vas lumen by a mini-surgical method compared with no-scalpel vasectomy (NSV). IVDs were categorized into two types: IVD-B has a tail used for fixing to the vas deferens (fixed wing) whereas IVD-A does not. A multicenter prospective randomized controlled clinical trial was conducted in China. The study comprised 1459 male volunteers seeking vasectomy who were randomly assigned to the IVD-A (n = 487), IVD-B (n = 485) or NSV (n = 487) groups and underwent operation. Follow-up included visits at the 3rd-6th and 12th postoperative months. The assessments of the subjects involved regular physical examinations (including general and andrological examinations) and semen analysis. The subjects' partners also underwent monitoring for pregnancy by monthly interviews regarding menstruation and, if necessary, urine tests. There were no significant differences in pregnancy rates (0.65% for IVD-A, 0 for IVD-B and 0.21% for NSV) among the three groups (P > 0.05). The cumulative rates of complications at the 12th postoperative month were zero, 0.9% and 1.7% in the three groups, respectively. In conclusion, IVD male sterilization exhibits a low risk of long-term adverse events and was found to be effective as a male sterilization method, similar to the NSV technique. IVD male sterilization is expected to become a novel contraceptive method.

  4. One-step random mutagenesis by error-prone rolling circle amplification

    PubMed Central

    Fujii, Ryota; Kitaoka, Motomitsu; Hayashi, Kiyoshi

    2004-01-01

    In vitro random mutagenesis is a powerful tool for altering properties of enzymes. We describe here a novel random mutagenesis method using rolling circle amplification, named error-prone RCA. This method consists of only one DNA amplification step followed by transformation of the host strain, without treatment with any restriction enzymes or DNA ligases, and results in a randomly mutated plasmid library with 3–4 mutations per kilobase. Specific primers or special equipment, such as a thermal-cycler, are not required. This method permits rapid preparation of randomly mutated plasmid libraries, enabling random mutagenesis to become a more commonly used technique. PMID:15507684
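    The reported mutation load of 3-4 mutations per kilobase can be mimicked in silico by mutating each base independently with a small probability. The sketch below simulates substitutions only, with an assumed per-base rate; real error-prone RCA mutation spectra depend on reaction conditions.

```python
# Simulating the mutation load of an error-prone amplification: each base
# mutates independently with a small per-base probability, chosen here to
# mimic the 3-4 mutations per kilobase reported for error-prone RCA.
# Purely illustrative; not a model of the chemistry.
import random

def mutate(seq, rate=0.0035, seed=0):
    """Return a copy of seq with random base substitutions at the given rate."""
    rng = random.Random(seed)
    bases = "ACGT"
    out = []
    for b in seq:
        if rng.random() < rate:
            out.append(rng.choice([x for x in bases if x != b]))  # substitution
        else:
            out.append(b)
    return "".join(out)

template = "ACGT" * 1000          # a 4 kb mock plasmid sequence
mutant = mutate(template)
diffs = sum(a != b for a, b in zip(template, mutant))
print(f"{diffs} substitutions in {len(template)} bases")
```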

  5. Comparing cluster-level dynamic treatment regimens using sequential, multiple assignment, randomized trials: Regression estimation and sample size considerations.

    PubMed

    NeCamp, Timothy; Kilbourne, Amy; Almirall, Daniel

    2017-08-01

    Cluster-level dynamic treatment regimens can be used to guide sequential treatment decision-making at the cluster level in order to improve outcomes at the individual or patient-level. In a cluster-level dynamic treatment regimen, the treatment is potentially adapted and re-adapted over time based on changes in the cluster that could be impacted by prior intervention, including aggregate measures of the individuals or patients that compose it. Cluster-randomized sequential multiple assignment randomized trials can be used to answer multiple open questions preventing scientists from developing high-quality cluster-level dynamic treatment regimens. In a cluster-randomized sequential multiple assignment randomized trial, sequential randomizations occur at the cluster level and outcomes are observed at the individual level. This manuscript makes two contributions to the design and analysis of cluster-randomized sequential multiple assignment randomized trials. First, a weighted least squares regression approach is proposed for comparing the mean of a patient-level outcome between the cluster-level dynamic treatment regimens embedded in a sequential multiple assignment randomized trial. The regression approach facilitates the use of baseline covariates which is often critical in the analysis of cluster-level trials. Second, sample size calculators are derived for two common cluster-randomized sequential multiple assignment randomized trial designs for use when the primary aim is a between-dynamic treatment regimen comparison of the mean of a continuous patient-level outcome. The methods are motivated by the Adaptive Implementation of Effective Programs Trial which is, to our knowledge, the first-ever cluster-randomized sequential multiple assignment randomized trial in psychiatry.
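    The estimator family the authors build on is weighted least squares, beta = (X'WX)^{-1} X'Wy. The sketch below shows only this generic step on simulated data; the paper's actual estimator adds design-specific weights for comparing the embedded dynamic treatment regimens, which are not reproduced here.

```python
# Generic weighted least squares: solve the weighted normal equations
# (X'WX) beta = X'Wy. Simulated data; weights stand in for the
# design-based (e.g., inverse-probability) weights a SMART analysis uses.
import numpy as np

def wls(X, y, w):
    """Weighted least squares coefficient estimates for diagonal weights w."""
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])          # intercept + covariate
y = 1.0 + 2.0 * x + rng.normal(scale=0.1, size=n)
w = rng.uniform(0.5, 2.0, size=n)             # hypothetical analysis weights
beta = wls(X, y, w)
print(beta)   # close to the true coefficients [1.0, 2.0]
```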

  6. Encoding plaintext by Fourier transform hologram in double random phase encoding using fingerprint keys

    NASA Astrophysics Data System (ADS)

    Takeda, Masafumi; Nakano, Kazuya; Suzuki, Hiroyuki; Yamaguchi, Masahiro

    2012-09-01

    It has been shown that biometric information can be used as a cipher key for binary data encryption by applying double random phase encoding. In such methods, binary data are encoded in a bit pattern image, and the decrypted image becomes a plain image when the key is genuine; otherwise, decrypted images become random images. In some cases, images decrypted by imposters may not be fully random, such that the blurred bit pattern can be partially observed. In this paper, we propose a novel bit coding method based on a Fourier transform hologram, which makes images decrypted by imposters more random. Computer experiments confirm that the method increases the randomness of images decrypted by imposters while keeping the false rejection rate as low as in the conventional method.
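    The textbook double random phase encoding scheme underlying these methods is easy to reproduce numerically: one random phase mask in the input plane and one in the Fourier plane. The sketch below shows the classic scheme with computer-generated masks, not the paper's fingerprint-key or Fourier-transform-hologram variant.

```python
# Classic double random phase encoding (DRPE): multiply the input by a random
# phase mask, Fourier transform, multiply by a second mask, inverse transform.
# Decryption with the genuine masks recovers the input; a wrong mask yields a
# noise-like image.
import numpy as np

rng = np.random.default_rng(42)
img = rng.random((32, 32))                       # stand-in for a bit-pattern image
m1 = np.exp(2j * np.pi * rng.random(img.shape))  # input-plane phase mask
m2 = np.exp(2j * np.pi * rng.random(img.shape))  # Fourier-plane phase mask

def encrypt(f):
    return np.fft.ifft2(np.fft.fft2(f * m1) * m2)

def decrypt(c):
    return np.abs(np.fft.ifft2(np.fft.fft2(c) * np.conj(m2)) * np.conj(m1))

cipher = encrypt(img)
recovered = decrypt(cipher)
print(np.allclose(recovered, img))   # True: genuine keys recover the image
```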

  7. Methodological Quality of Randomized Clinical Trials of Respiratory Physiotherapy in Coronary Artery Bypass Grafting Patients in the Intensive Care Unit: a Systematic Review

    PubMed Central

    Lorscheitter, Jaqueline; Stein, Cinara; Plentz, Rodrigo Della Méa

    2017-01-01

    Objective To assess methodological quality of the randomized controlled trials of physiotherapy in patients undergoing coronary artery bypass grafting in the intensive care unit. Methods The studies published until May 2015, in MEDLINE, Cochrane and PEDro were included. The primary outcome extracted was proper filling of the Cochrane Collaboration's tool's items and the secondary was suitability to the requirements of the CONSORT Statement and its extension. Results From 807 studies identified, 39 were included. Most at CONSORT items showed a better adequacy after the statement's publication. Studies with positive outcomes presented better methodological quality. Conclusion The methodological quality of the studies has been improving over the years. However, many aspects can still be better designed. PMID:28977205

  8. Unit Costs of Interlibrary Loans and Photocopies at a Regional Medical Library: Preliminary Report *

    PubMed Central

    Spencer, Carol C.

    1970-01-01

    Unit costs of providing interlibrary loans and photocopies were determined by a method not previously used for library cost studies: random time sampling with self-observation. The working time of all appropriate personnel was sampled using Random Alarm Mechanisms and a structured checklist of tasks. The total lender's unit cost per request received, including direct labor, materials, fringe benefits, and overhead, was $1.526 for originals mailed postpaid by lender, and $1.534 for photocopies mailed. Corresponding unit costs per request filled were: originals, $1.932, and photocopies, $1.763. PMID:5439910
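    Random time sampling with self-observation estimates the fraction of working time devoted to a task from observations taken at random instants; a unit cost then follows from labor cost and request volume. The numbers below are invented for illustration and are not the study's data.

```python
# Work sampling sketch: the share of "on-task" observations estimates the
# share of time spent on the task; multiplying by labor cost and dividing by
# request volume gives a unit labor cost. All figures are hypothetical.
import random

rng = random.Random(1)
true_fraction = 0.30          # (unknown in practice) share of time on ILL work
observations = [rng.random() < true_fraction for _ in range(2000)]
est_fraction = sum(observations) / len(observations)

labor_cost = 8000.0           # monthly labor cost of the sampled staff ($)
requests = 1600               # ILL requests handled that month
unit_labor_cost = labor_cost * est_fraction / requests
print(f"estimated on-task fraction {est_fraction:.3f}, "
      f"unit labor cost ${unit_labor_cost:.2f}")
```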

  9. Reproduction of exact solutions of Lipkin model by nonlinear higher random-phase approximation

    NASA Astrophysics Data System (ADS)

    Terasaki, J.; Smetana, A.; Šimkovic, F.; Krivoruchenko, M. I.

    2017-10-01

    It is shown that the random-phase approximation (RPA) method, with its nonlinear higher generalization, which was previously considered an approximation except in a very limited case, reproduces the exact solutions of the Lipkin model. The nonlinear higher RPA is based on an equation that is nonlinear in the eigenvectors and includes many-particle-many-hole components in the creation operator of the excited states. We demonstrate the exact character of the solutions analytically for particle number N = 2 and numerically for N = 8. This finding indicates that the nonlinear higher RPA is equivalent to the exact Schrödinger equation.
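    For N = 2, the exact solutions that the RPA must reproduce can be obtained by brute-force diagonalization. Assuming the standard Lipkin Hamiltonian H = eps*Jz + (V/2)(J+^2 + J-^2) in the quasispin basis |j=1, m⟩ (an assumption; the paper's conventions may differ), the exact spectrum is {-sqrt(eps^2 + V^2), 0, +sqrt(eps^2 + V^2)}:

```python
# Exact diagonalization of the Lipkin Hamiltonian for N = 2 particles
# (quasispin j = 1, basis |j,m> with m = -1, 0, 1). J+^2 couples m = -1 to
# m = +1 with matrix element 2, so (V/2)*J+^2 contributes V off-diagonally.
# This is the brute-force benchmark, not the RPA calculation itself.
import numpy as np

eps, V = 1.0, 0.5
H = np.array([[-eps, 0.0, V],
              [0.0,  0.0, 0.0],
              [V,    0.0, eps]])
evals = np.sort(np.linalg.eigvalsh(H))
exact = np.sort([-np.hypot(eps, V), 0.0, np.hypot(eps, V)])
print(evals, np.allclose(evals, exact))
```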

  10. A reanalysis of cluster randomized trials showed interrupted time-series studies were valuable in health system evaluation.

    PubMed

    Fretheim, Atle; Zhang, Fang; Ross-Degnan, Dennis; Oxman, Andrew D; Cheyne, Helen; Foy, Robbie; Goodacre, Steve; Herrin, Jeph; Kerse, Ngaire; McKinlay, R James; Wright, Adam; Soumerai, Stephen B

    2015-03-01

    There is often substantial uncertainty about the impacts of health system and policy interventions. Despite that, randomized controlled trials (RCTs) are uncommon in this field, partly because experiments can be difficult to carry out. An alternative method for impact evaluation is the interrupted time-series (ITS) design. Little is known, however, about how results from the two methods compare. Our aim was to explore whether ITS studies yield results that differ from those of randomized trials. We conducted single-arm ITS analyses (segmented regression) based on data from the intervention arm of cluster randomized trials (C-RCTs), that is, discarding control arm data. Secondarily, we included the control group data in the analyses, by subtracting control group data points from intervention group data points, thereby constructing a time series representing the difference between the intervention and control groups. We compared the results from the single-arm and controlled ITS analyses with results based on conventional aggregated analyses of trial data. The findings were largely concordant, yielding effect estimates with overlapping 95% confidence intervals (CI) across different analytical methods. However, our analyses revealed the importance of a concurrent control group and of taking baseline and follow-up trends into account in the analysis of C-RCTs. The ITS design is valuable for evaluation of health systems interventions, both when RCTs are not feasible and in the analysis and interpretation of data from C-RCTs. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
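    The single-arm ITS analysis described, segmented regression, regresses the outcome on time, a post-intervention level indicator, and a post-intervention slope term. Below is a minimal sketch on simulated monthly data using plain OLS; a real analysis would also address autocorrelation.

```python
# Segmented regression for an interrupted time series: the coefficients on
# the indicator and the post-intervention slope term estimate the level
# change and trend change at the interruption. Simulated data only.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(24)                  # 24 monthly data points
interrupt = 12                     # intervention at month 12
post = (t >= interrupt).astype(float)
# true model: level 10, slope 0.2, level drop -3, slope change -0.1
y = (10 + 0.2 * t - 3 * post - 0.1 * post * (t - interrupt)
     + rng.normal(scale=0.2, size=t.size))

X = np.column_stack([np.ones_like(t, dtype=float), t, post,
                     post * (t - interrupt)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)   # approximately [10, 0.2, -3, -0.1]
```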

  11. Bayesian randomized clinical trials: From fixed to adaptive design.

    PubMed

    Yin, Guosheng; Lam, Chi Kin; Shi, Haolun

    2017-08-01

    Randomized controlled studies are the gold standard for phase III clinical trials. Using α-spending functions to control the overall type I error rate, group sequential methods are well established and have been dominating phase III studies. Bayesian randomized design, on the other hand, can be viewed as a complement to, rather than a competitor of, the frequentist methods. For the fixed Bayesian design, hypothesis testing can be cast in the posterior probability or Bayes factor framework, which has a direct link to the frequentist type I error rate. Bayesian group sequential design relies upon Bayesian decision-theoretic approaches based on backward induction, which is often computationally intensive. Compared with the frequentist approaches, Bayesian methods have several advantages. The posterior predictive probability serves as a useful and convenient tool for trial monitoring, and can be updated at any time as the data accrue during the trial. The Bayesian decision-theoretic framework possesses a direct link to decision making in the practical setting, and can be modeled more realistically to reflect the actual cost-benefit analysis during the drug development process. Other merits include the possibility of hierarchical modeling and the use of informative priors, which would lead to a more comprehensive utilization of information from both historical and longitudinal data. Moving from fixed to adaptive design, we focus on Bayesian randomized controlled clinical trials and make extensive comparisons with frequentist counterparts through numerical studies. Copyright © 2017 Elsevier Inc. All rights reserved.
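    The trial-monitoring idea can be made concrete with the simplest conjugate case: Beta priors with binomial data, and a Monte Carlo estimate of the posterior probability that the treatment arm's response rate exceeds the control's. This is a generic sketch, not a full decision-theoretic or group sequential design.

```python
# Bayesian interim monitoring sketch: with Beta(1,1) priors and binomial
# data, each arm's posterior is Beta(1 + successes, 1 + failures); the
# probability that treatment beats control is estimated by sampling both
# posteriors. Counts below are hypothetical.
import random

def prob_treatment_better(s_t, f_t, s_c, f_c, draws=20000, seed=0):
    rng = random.Random(seed)
    wins = sum(rng.betavariate(1 + s_t, 1 + f_t) >
               rng.betavariate(1 + s_c, 1 + f_c)
               for _ in range(draws))
    return wins / draws

# Interim look: 24/40 responses on treatment vs 15/40 on control.
p = prob_treatment_better(24, 16, 15, 25)
print(f"P(treatment better | data) is approximately {p:.3f}")
```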

  12. Discrete-element modeling of nacre-like materials: Effects of random microstructures on strain localization and mechanical performance

    NASA Astrophysics Data System (ADS)

    Abid, Najmul; Mirkhalaf, Mohammad; Barthelat, Francois

    2018-03-01

    Natural materials such as nacre, collagen, and spider silk are composed of staggered stiff and strong inclusions in a softer matrix. This type of hybrid microstructure results in remarkable combinations of stiffness, strength, and toughness and it now inspires novel classes of high-performance composites. However, the analytical and numerical approaches used to predict and optimize the mechanics of staggered composites often neglect statistical variations and inhomogeneities, which may have significant impacts on modulus, strength, and toughness. Here we present an analysis of localization using small representative volume elements (RVEs) and large scale statistical volume elements (SVEs) based on the discrete element method (DEM). DEM is an efficient numerical method which enabled the evaluation of more than 10,000 microstructures in this study, each including about 5,000 inclusions. The models explore the combined effects of statistics, inclusion arrangement, and interface properties. We find that statistical variations have a negative effect on all properties, in particular on the ductility and energy absorption because randomness precipitates the localization of deformations. However, the results also show that the negative effects of random microstructures can be offset by interfaces with large strain at failure accompanied by strain hardening. More specifically, this quantitative study reveals an optimal range of interface properties where the interfaces are the most effective at delaying localization. These findings show how carefully designed interfaces in bioinspired staggered composites can offset the negative effects of microstructural randomness, which is inherent to most current fabrication methods.

  13. Guidelines for randomized clinical trial protocol content: a systematic review.

    PubMed

    Tetzlaff, Jennifer M; Chan, An-Wen; Kitchen, Jessica; Sampson, Margaret; Tricco, Andrea C; Moher, David

    2012-09-24

    All randomized clinical trials (RCTs) require a protocol; however, numerous studies have highlighted protocol deficiencies. Reporting guidelines may improve the content of research reports and, if developed using robust methods, may increase the utility of reports to stakeholders. The objective of this study was to systematically identify and review RCT protocol guidelines, to assess their characteristics and methods of development, and to compare recommendations. We conducted a systematic review of indexed literature (MEDLINE, EMBASE and the Cochrane Methodology Register from inception to September 2010; reference lists; related article features; forward citation searching) and a targeted search of supplementary sources, including a survey of major trial funding agencies in six countries. Records were eligible if they described a content guideline in English or French relevant to RCT protocols. Guidelines were excluded if they specified content for protocols for trials of specific procedures or conditions or were intended to assess trial quality. We extracted guideline characteristics and methods. Content was mapped for a subset of guidelines that described development methods or had institutional endorsement. Forty guidelines published in journals, books and institutional reports were included in the review; seven were specific to RCT protocols. Only eight (20%) described development methods, which included informal consensus methods, pilot testing and formal validation; no guideline described all of these methods. No guideline described formal consensus methods or a systematic retrieval of empirical evidence to inform its development. The guidelines included a median of 23 concepts per guideline (interquartile range (IQR) = 14 to 34; range = 7 to 109). Among the subset of guidelines (n = 23) for which content was mapped, approximately 380 concepts were explicitly addressed (median of 31 concepts per guideline; IQR = 24 to 80; range = 16 to 150); most concepts were addressed in a minority of guidelines. Existing guidelines for RCT protocol content varied substantially in their recommendations. Few reports described the methods of guideline development, limiting comparisons of guideline validity. Given the importance of protocols to diverse stakeholders, we believe a systematically developed, evidence-informed guideline for clinical trial protocols is needed.

  14. Chromatic refraction with global ozone monitoring by occultation of stars. I. Description and scintillation correction.

    PubMed

    Dalaudier, F; Kan, V; Gurvich, A S

    2001-02-20

    We describe refractive and chromatic effects, both regular and random, that occur during star occultations by the Earth's atmosphere. The scintillation that results from random density fluctuations, as well as the consequences of regular chromatic refraction, is qualitatively described. The resultant chromatic scintillation will produce random features on the Global Ozone Monitoring by Occultation of Stars (GOMOS) spectrometer, with an amplitude comparable with that of some of the real absorbing features that result from atmospheric constituents. A correction method that is based on the use of fast photometer signals is described, and its efficiency is discussed. We give a qualitative (although accurate) description of the phenomena, including numerical values when needed. Geometrical optics and the phase-screen approximation are used to keep the description simple.

  15. Quantifying Uncertainties in N2O Emission Due to N Fertilizer Application in Cultivated Areas

    PubMed Central

    Philibert, Aurore; Loyce, Chantal; Makowski, David

    2012-01-01

    Nitrous oxide (N2O) is a greenhouse gas with a global warming potential approximately 298 times greater than that of CO2. In 2006, the Intergovernmental Panel on Climate Change (IPCC) estimated N2O emission due to synthetic and organic nitrogen (N) fertilization at 1% of applied N. We investigated the uncertainty on this estimated value, by fitting 13 different models to a published dataset including 985 N2O measurements. These models were characterized by (i) the presence or absence of the explanatory variable “applied N”, (ii) the function relating N2O emission to applied N (exponential or linear function), (iii) fixed or random background (i.e. in the absence of N application) N2O emission and (iv) fixed or random applied N effect. We calculated ranges of uncertainty on N2O emissions from a subset of these models, and compared them with the uncertainty ranges currently used in the IPCC-Tier 1 method. The exponential models outperformed the linear models, and models including one or two random effects outperformed those including fixed effects only. The use of an exponential function rather than a linear function has an important practical consequence: the emission factor is not constant and increases as a function of applied N. Emission factors estimated using the exponential function were lower than 1% when the amount of N applied was below 160 kg N ha−1. Our uncertainty analysis shows that the uncertainty range currently used by the IPCC-Tier 1 method could be reduced. PMID:23226430
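    The practical consequence noted above, a non-constant emission factor under the exponential model, is easy to see numerically: with E(N) = E0·exp(b·N), the emitted-above-background fraction (E(N) − E0)/N grows with applied N, whereas a linear model implies a constant factor. The parameter values below are invented for illustration and are not the paper's fitted estimates.

```python
# Emission factor implied by an exponential emission model E(N) = E0*exp(b*N):
# EF(N) = (E(N) - E0) / N, the N2O-N emitted above background per kg of N
# applied. Hypothetical parameters; units kg N2O-N per ha.
import math

E0, b = 0.5, 0.006      # hypothetical background emission and exponential rate

def emission_factor(N):
    """Emitted-above-background fraction of applied N under the exponential model."""
    return (E0 * math.exp(b * N) - E0) / N

efs = [emission_factor(N) for N in (80, 160, 320)]
print([f"{100 * ef:.2f}%" for ef in efs])   # the factor rises with applied N
```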

  16. Stroke rehabilitation evidence and comorbidity: a systematic scoping review of randomized controlled trials.

    PubMed

    Nelson, Michelle L A; McKellar, Kaileah A; Yi, Juliana; Kelloway, Linda; Munce, Sarah; Cott, Cheryl; Hall, Ruth; Fortin, Martin; Teasell, Robert; Lyons, Renee

    2017-07-01

    Most strokes occur in the context of other medical diagnoses. Currently, stroke rehabilitation evidence reviews have not synthesized or presented evidence with a focus on comorbidities and correspondingly may not align with the current patient population. The purpose of this review was to determine the extent and nature of randomized controlled trial stroke rehabilitation evidence that included patients with multimorbidity. A systematic scoping review was conducted. Electronic databases were searched using a combination of terms related to "stroke" and "rehabilitation." Selection criteria captured inpatient rehabilitation studies. Methods were modified to account for the amount of literature, classified by study design, and randomized controlled trials (RCTs) were abstracted. The database search yielded 10771 unique articles. Screening resulted in 428 included RCTs. Three studies explicitly included patients with a comorbid condition. Fifteen percent of articles did not specify which additional conditions were excluded. Impaired cognition was the most commonly excluded condition. Approximately 37% of articles excluded patients who had experienced a previous stroke. Twenty-four percent excluded patients with one or more Charlson Index conditions, and 83% excluded patients with at least one other medical condition. This review represents a first attempt to map the literature on stroke rehabilitation related to co/multimorbidity and to identify gaps in existing research. Existing evidence on stroke rehabilitation often excluded individuals with comorbidities. This is problematic, as the evidence used to generate clinical guidelines may not match the patient typically seen in practice. The use of alternate research methods is therefore needed for studying the care of individuals with stroke and multimorbidity.

  17. Effects of daily iron supplementation in primary-school–aged children: systematic review and meta-analysis of randomized controlled trials

    PubMed Central

    Low, Michael; Farrell, Ann; Biggs, Beverley-Ann; Pasricha, Sant-Rayn

    2013-01-01

    Background: Anemia is an important public health and clinical problem. Observational studies have linked iron deficiency and anemia in children with many poor outcomes, including impaired cognitive development; however, iron supplementation, a widely used preventive and therapeutic strategy, is associated with adverse effects. Primary-school–aged children are at a critical stage in intellectual development, and optimization of their cognitive performance could have long-lasting individual and population benefits. In this study, we summarize the evidence for the benefits and safety of daily iron supplementation in primary-school–aged children. Methods: We searched electronic databases (including MEDLINE and Embase) and other sources (July 2013) for randomized and quasi-randomized controlled trials involving daily iron supplementation in children aged 5–12 years. We combined the data using random effects meta-analysis. Results: We identified 16,501 studies; of these, we evaluated 76 full-text papers and included 32 studies involving 7089 children. Of the included studies, 31 were conducted in low- or middle-income settings. Iron supplementation improved global cognitive scores (standardized mean difference 0.50, 95% confidence interval [CI] 0.11 to 0.90, p = 0.01), intelligence quotient among anemic children (mean difference 4.55, 95% CI 0.16 to 8.94, p = 0.04) and measures of attention and concentration. Iron supplementation also improved age-adjusted height among all children and age-adjusted weight among anemic children. Iron supplementation reduced the risk of anemia by 50% and the risk of iron deficiency by 79%. Adherence in the trial settings was generally high. Safety data were limited. Interpretation: Our analysis suggests that iron supplementation safely improves hematologic and nonhematologic outcomes among primary-school–aged children in low- or middle-income settings and is well tolerated. PMID:24130243
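    The random-effects pooling used in such meta-analyses is most commonly the DerSimonian-Laird estimator: a moment estimate of the between-study variance tau² re-weights the study effects. The effect sizes and variances below are hypothetical, not the review's data.

```python
# DerSimonian-Laird random-effects meta-analysis: estimate tau^2 from
# Cochran's Q, then pool with weights 1/(v_i + tau^2). Inputs are
# per-study effect estimates and their sampling variances (made up here).
import math

def dersimonian_laird(effects, variances):
    w = [1 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                  # between-study variance
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, se, tau2

effects = [0.6, 0.2, 0.9, 0.4]        # hypothetical standardized mean differences
variances = [0.04, 0.06, 0.09, 0.05]  # hypothetical sampling variances
pooled, se, tau2 = dersimonian_laird(effects, variances)
print(f"pooled SMD = {pooled:.2f} "
      f"(95% CI {pooled - 1.96 * se:.2f} to {pooled + 1.96 * se:.2f})")
```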

  18. Logistic random effects regression models: a comparison of statistical packages for binary and ordinal outcomes.

    PubMed

    Li, Baoyue; Lingsma, Hester F; Steyerberg, Ewout W; Lesaffre, Emmanuel

    2011-05-23

Logistic random effects models are a popular tool to analyze multilevel (also called hierarchical) data with a binary or ordinal outcome. Here, we aim to compare different statistical software implementations of these models. We used individual patient data from 8509 patients in 231 centers with moderate and severe Traumatic Brain Injury (TBI) enrolled in eight Randomized Controlled Trials (RCTs) and three observational studies. We fitted logistic random effects regression models with the 5-point Glasgow Outcome Scale (GOS) as outcome, both dichotomized and ordinal, with center and/or trial as random effects, and age, motor score, pupil reactivity, or trial as covariates. We then compared the implementations of frequentist and Bayesian methods to estimate the fixed and random effects. Frequentist approaches included R (lme4), Stata (GLLAMM), SAS (GLIMMIX and NLMIXED), MLwiN ([R]IGLS), and MIXOR; Bayesian approaches included WinBUGS, MLwiN (MCMC), the R package MCMCglmm, and the SAS experimental procedure MCMC. Three data sets (the full data set and two sub-datasets) were analyzed using two logistic random effects models with either one random effect for the center or two random effects for center and trial. For the ordinal outcome in the full data set, a proportional odds model with a random center effect was also fitted. The packages gave similar parameter estimates for both the fixed and random effects and for the binary (and ordinal) models for the main study, when based on a relatively large number of level-1 (patient level) data units compared with the number of level-2 (hospital level) units. However, when based on a relatively sparse data set, i.e., when the numbers of level-1 and level-2 data units were about the same, the frequentist and Bayesian approaches showed somewhat different results. The software implementations differ considerably in flexibility, computation time, and usability. There are also differences in the availability of additional tools for model evaluation, such as diagnostic plots. The experimental SAS (version 9.2) procedure MCMC appeared to be inefficient. On relatively large data sets, the different software implementations of logistic random effects regression models produced similar results. Thus, for a large data set there seems to be no clear preference (absent a philosophical preference) for either a frequentist or a Bayesian approach (if based on vague priors). The choice of a particular implementation may largely depend on the desired flexibility and the usability of the package. For small data sets the random effects variances are difficult to estimate. In the frequentist approaches, the MLE of this variance was often estimated as zero, with a standard error that was either zero or could not be determined, while for Bayesian methods the estimates could depend on the chosen "non-informative" prior of the variance parameter. The starting value for the variance parameter may also be critical for the convergence of the Markov chain.
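To make the model structure concrete, the logistic random-intercept data-generating process that these packages fit can be simulated in a few lines. This is a generic sketch with arbitrary parameter values, not the TBI data or any package's estimation code:

```python
import math
import random

def simulate_random_intercept_logistic(n_centers, n_per_center,
                                       beta0=-1.0, sigma_center=0.8, seed=42):
    """Simulate binary outcomes from a logistic random-intercept model:
    logit(p_ij) = beta0 + b_i, with center effects b_i ~ N(0, sigma_center^2).
    Returns a list of (center_id, outcome) records."""
    rng = random.Random(seed)
    records = []
    for i in range(n_centers):
        b_i = rng.gauss(0.0, sigma_center)            # center-level random intercept
        p = 1.0 / (1.0 + math.exp(-(beta0 + b_i)))    # inverse logit
        for _ in range(n_per_center):
            records.append((i, 1 if rng.random() < p else 0))
    return records

data = simulate_random_intercept_logistic(n_centers=20, n_per_center=50)
overall_rate = sum(y for _, y in data) / len(data)
```

Recovering beta0 and sigma_center from such data is exactly the estimation task that lme4, GLIMMIX, WinBUGS, and the other implementations compared above perform.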

  19. A natively paired antibody library yields drug leads with higher sensitivity and specificity than a randomly paired antibody library.

    PubMed

    Adler, Adam S; Bedinger, Daniel; Adams, Matthew S; Asensio, Michael A; Edgar, Robert C; Leong, Renee; Leong, Jackson; Mizrahi, Rena A; Spindler, Matthew J; Bandi, Srinivasa Rao; Huang, Haichun; Tawde, Pallavi; Brams, Peter; Johnson, David S

    2018-04-01

    Deep sequencing and single-chain variable fragment (scFv) yeast display methods are becoming more popular for discovery of therapeutic antibody candidates in mouse B cell repertoires. In this study, we compare a deep sequencing and scFv display method that retains native heavy and light chain pairing with a related method that randomly pairs heavy and light chain. We performed the studies in a humanized mouse, using interleukin 21 receptor (IL-21R) as a test immunogen. We identified 44 high-affinity binder scFv with the native pairing method and 100 high-affinity binder scFv with the random pairing method. 30% of the natively paired scFv binders were also discovered with the randomly paired method, and 13% of the randomly paired binders were also discovered with the natively paired method. Additionally, 33% of the scFv binders discovered only in the randomly paired library were initially present in the natively paired pre-sort library. Thus, a significant proportion of "randomly paired" scFv were actually natively paired. We synthesized and produced 46 of the candidates as full-length antibodies and subjected them to a panel of binding assays to characterize their therapeutic potential. 87% of the antibodies were verified as binding IL-21R by at least one assay. We found that antibodies with native light chains were more likely to bind IL-21R than antibodies with non-native light chains, suggesting a higher false positive rate for antibodies from the randomly paired library. Additionally, the randomly paired method failed to identify nearly half of the true natively paired binders, suggesting a higher false negative rate. We conclude that natively paired libraries have critical advantages in sensitivity and specificity for antibody discovery programs.

  20. A natively paired antibody library yields drug leads with higher sensitivity and specificity than a randomly paired antibody library

    PubMed Central

    Adler, Adam S.; Bedinger, Daniel; Adams, Matthew S.; Asensio, Michael A.; Edgar, Robert C.; Leong, Renee; Leong, Jackson; Mizrahi, Rena A.; Spindler, Matthew J.; Bandi, Srinivasa Rao; Huang, Haichun; Brams, Peter; Johnson, David S.

    2018-01-01

    ABSTRACT Deep sequencing and single-chain variable fragment (scFv) yeast display methods are becoming more popular for discovery of therapeutic antibody candidates in mouse B cell repertoires. In this study, we compare a deep sequencing and scFv display method that retains native heavy and light chain pairing with a related method that randomly pairs heavy and light chain. We performed the studies in a humanized mouse, using interleukin 21 receptor (IL-21R) as a test immunogen. We identified 44 high-affinity binder scFv with the native pairing method and 100 high-affinity binder scFv with the random pairing method. 30% of the natively paired scFv binders were also discovered with the randomly paired method, and 13% of the randomly paired binders were also discovered with the natively paired method. Additionally, 33% of the scFv binders discovered only in the randomly paired library were initially present in the natively paired pre-sort library. Thus, a significant proportion of “randomly paired” scFv were actually natively paired. We synthesized and produced 46 of the candidates as full-length antibodies and subjected them to a panel of binding assays to characterize their therapeutic potential. 87% of the antibodies were verified as binding IL-21R by at least one assay. We found that antibodies with native light chains were more likely to bind IL-21R than antibodies with non-native light chains, suggesting a higher false positive rate for antibodies from the randomly paired library. Additionally, the randomly paired method failed to identify nearly half of the true natively paired binders, suggesting a higher false negative rate. We conclude that natively paired libraries have critical advantages in sensitivity and specificity for antibody discovery programs. PMID:29376776

  1. Inventory of Amphibians and Reptiles in Southern Colorado Plateau National Parks

    USGS Publications Warehouse

    Persons, Trevor B.; Nowak, Erika M.

    2006-01-01

In fiscal year 2000, the National Park Service (NPS) initiated a nationwide program to inventory vertebrates and vascular plants within the National Parks, and an inventory plan was developed for the 19 park units in the Southern Colorado Plateau Inventory & Monitoring Network. We surveyed 12 parks in this network for reptiles and amphibians between 2001 and 2003. The overall goals of our herpetofaunal inventories were to document 90% of the species present, identify park-specific species of special concern, and, based on the inventory results, make recommendations for the development of an effective monitoring program. We used the following standardized herpetological methods to complete the inventories: time-area constrained searches, visual encounter ('general') surveys, and nighttime road cruising. We also recorded incidental species sightings and surveyed existing literature and museum specimen databases. We found 50 amphibian and reptile species during fieldwork. These included 1 salamander, 11 anurans, 21 lizards, and 17 snakes. Literature reviews, museum specimen data records, and personal communications with NPS staff added an additional eight species, including one salamander, one turtle, one lizard, and five snakes. It was necessary to use a variety of methods to detect all species in each park. Randomly generated 1-ha time-area constrained searches and night drives produced the fewest species and individuals of all the methods, while general surveys and randomly generated 10-ha time-area constrained searches produced the most. Inventory completeness was likely compromised by a severe drought across the region during our surveys. In most parks we did not come close to the goal of detecting 90% of the expected species present; however, we did document several species range extensions. Effective monitoring programs for herpetofauna on the Colorado Plateau should use a variety of methods to detect species and focus on taxa-specific methods. Randomly generated plots must take into account microhabitat and aquatic features to be effective at sampling for herpetofauna.

  2. CONSORT for Reporting Randomized Controlled Trials in Journal and Conference Abstracts: Explanation and Elaboration

    PubMed Central

    Hopewell, Sally; Clarke, Mike; Moher, David; Wager, Elizabeth; Middleton, Philippa; Altman, Douglas G; Schulz, Kenneth F

    2008-01-01

Background Clear, transparent, and sufficiently detailed abstracts of conferences and journal articles related to randomized controlled trials (RCTs) are important, because readers often base their assessment of a trial solely on information in the abstract. Here, we extend the CONSORT (Consolidated Standards of Reporting Trials) Statement to develop a minimum list of essential items that authors should consider when reporting the results of an RCT in any journal or conference abstract. Methods and Findings We generated a list of items from existing quality assessment tools and empirical evidence. A three-round, modified-Delphi process was used to select items. In all, 109 participants were invited to participate in an electronic survey; the response rate was 61%. Survey results were presented at a meeting of the CONSORT Group in Montebello, Canada, in January 2007, involving 26 participants, including clinical trialists, statisticians, epidemiologists, and biomedical editors. Checklist items were discussed for eligibility for the final checklist. The checklist was then revised to ensure that it reflected discussions held during and subsequent to the meeting. CONSORT for Abstracts recommends that abstracts relating to RCTs have a structured format. Items should include details of trial objectives; trial design (e.g., method of allocation, blinding/masking); trial participants (i.e., description, numbers randomized, and number analyzed); interventions intended for each randomized group and their impact on primary efficacy outcomes and harms; trial conclusions; trial registration name and number; and source of funding. We recommend the checklist be used in conjunction with this explanatory document, which includes examples of good reporting, rationale, and evidence, when available, for the inclusion of each item. Conclusions CONSORT for Abstracts aims to improve reporting of abstracts of RCTs published in journal articles and conference proceedings. It will help authors of abstracts of these trials provide the detail and clarity needed by readers wishing to assess a trial's validity and the applicability of its results. PMID:18215107

  3. Damage/fault diagnosis in an operating wind turbine under uncertainty via a vibration response Gaussian mixture random coefficient model based framework

    NASA Astrophysics Data System (ADS)

    Avendaño-Valencia, Luis David; Fassois, Spilios D.

    2017-07-01

The study focuses on vibration response based health monitoring for an operating wind turbine, which features time-dependent dynamics under environmental and operational uncertainty. A Gaussian Mixture Model Random Coefficient (GMM-RC) model based Structural Health Monitoring framework postulated in a companion paper is adopted and assessed. The assessment is based on vibration response signals obtained from a simulated offshore 5 MW wind turbine. The non-stationarity in the vibration signals originates from the inertial properties, which continually evolve due to blade rotation, as well as from the wind characteristics, while uncertainty is introduced by random variations of the wind speed within the range of 10-20 m/s. Monte Carlo simulations are performed using six distinct structural states, including the healthy state and five types of damage/fault in the tower, the blades, and the transmission, each characterized by four distinct levels. Random vibration response modeling and damage diagnosis are illustrated, along with pertinent comparisons with state-of-the-art diagnosis methods. The results demonstrate consistently good performance of the GMM-RC model based framework, offering significant performance improvements over state-of-the-art methods. Most damage types and levels are shown to be properly diagnosed using a single vibration sensor.

  4. Infrared Ship Target Segmentation Based on Spatial Information Improved FCM.

    PubMed

    Bai, Xiangzhi; Chen, Zhiguo; Zhang, Yu; Liu, Zhaoying; Lu, Yi

    2016-12-01

Segmentation of infrared (IR) ship images is always a challenging task because of intensity inhomogeneity and noise. Fuzzy C-means (FCM) clustering is a classical method widely used in image segmentation. However, it has some shortcomings, such as not considering spatial information and being sensitive to noise. In this paper, an improved FCM method based on spatial information is proposed for IR ship target segmentation. The improvements include two parts: 1) adding nonlocal spatial information based on the ship target and 2) using the spatial shape information of the contour of the ship target to refine the local spatial constraint by a Markov random field. In addition, the results of K-means are used to initialize the improved FCM method. Experimental results show that the improved method is effective and performs better than existing methods, including existing FCM methods, for segmentation of IR ship images.
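For reference, the baseline FCM clustering that the paper improves upon alternates two updates: memberships computed from distances to cluster centers, then centers recomputed as membership-weighted means. A minimal 1-D sketch of standard FCM, without the paper's spatial or nonlocal terms, on hypothetical pixel intensities:

```python
def fuzzy_c_means(data, c=2, m=2.0, iters=100, eps=1e-9):
    """Standard fuzzy C-means for 1-D data (Bezdek's algorithm).
    Returns (centers, memberships); memberships[k][i] is the degree to
    which point k belongs to cluster i."""
    # deterministic init: spread centers across the data range
    centers = [min(data) + (i + 0.5) * (max(data) - min(data)) / c
               for i in range(c)]
    power = 2.0 / (m - 1.0)
    u = []
    for _ in range(iters):
        # membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        u = []
        for x in data:
            d = [abs(x - v) + eps for v in centers]   # eps avoids divide-by-zero
            u.append([1.0 / sum((d[i] / d[j]) ** power for j in range(c))
                      for i in range(c)])
        # center update: mean weighted by u_ik^m
        centers = [sum((u[k][i] ** m) * data[k] for k in range(len(data))) /
                   sum(u[k][i] ** m for k in range(len(data)))
                   for i in range(c)]
    return centers, u

# two well-separated intensity groups (hypothetical pixel values)
pixels = [10, 11, 12, 13, 60, 61, 62, 63]
centers, memberships = fuzzy_c_means(pixels)
```

With well-separated groups the centers converge near the group means (about 11.5 and 61.5), and each point's memberships sum to one.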

  5. Mean first-passage times of non-Markovian random walkers in confinement.

    PubMed

    Guérin, T; Levernier, N; Bénichou, O; Voituriez, R

    2016-06-16

    The first-passage time, defined as the time a random walker takes to reach a target point in a confining domain, is a key quantity in the theory of stochastic processes. Its importance comes from its crucial role in quantifying the efficiency of processes as varied as diffusion-limited reactions, target search processes or the spread of diseases. Most methods of determining the properties of first-passage time in confined domains have been limited to Markovian (memoryless) processes. However, as soon as the random walker interacts with its environment, memory effects cannot be neglected: that is, the future motion of the random walker does not depend only on its current position, but also on its past trajectory. Examples of non-Markovian dynamics include single-file diffusion in narrow channels, or the motion of a tracer particle either attached to a polymeric chain or diffusing in simple or complex fluids such as nematics, dense soft colloids or viscoelastic solutions. Here we introduce an analytical approach to calculate, in the limit of a large confining volume, the mean first-passage time of a Gaussian non-Markovian random walker to a target. The non-Markovian features of the dynamics are encompassed by determining the statistical properties of the fictitious trajectory that the random walker would follow after the first-passage event takes place, which are shown to govern the first-passage time kinetics. This analysis is applicable to a broad range of stochastic processes, which may be correlated at long times. Our theoretical predictions are confirmed by numerical simulations for several examples of non-Markovian processes, including the case of fractional Brownian motion in one and higher dimensions. These results reveal, on the basis of Gaussian processes, the importance of memory effects in first-passage statistics of non-Markovian random walkers in confinement.
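As a point of contrast with the non-Markovian analysis above, the mean first-passage time of a memoryless walker in a confined geometry is straightforward to estimate by direct Monte Carlo. A sketch for a symmetric walk on a ring (a standard textbook case, not the paper's setting), where the exact answer d(N - d) is known:

```python
import random

def mean_fpt_ring(n_sites, start, n_trials=20000, seed=1):
    """Monte Carlo mean first-passage time of a symmetric random walk on a
    ring of n_sites nodes, from node `start` to the target at node 0.
    For this Markovian walk the exact answer is start * (n_sites - start)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_trials):
        pos, steps = start, 0
        while pos != 0:
            pos = (pos + rng.choice((-1, 1))) % n_sites   # unbiased step
            steps += 1
        total += steps
    return total / n_trials

estimate = mean_fpt_ring(n_sites=10, start=3)   # exact value: 3 * 7 = 21
```

For a non-Markovian walker such as fractional Brownian motion, the step distribution depends on the past trajectory, which is precisely why the paper's analytical approach is needed.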

  6. Some Aspects of the Investigation of Random Vibration Influence on Ride Comfort

    NASA Astrophysics Data System (ADS)

    DEMIĆ, M.; LUKIĆ, J.; MILIĆ, Ž.

    2002-05-01

Contemporary vehicles must satisfy high ride comfort criteria. This paper attempts to develop criteria for ride comfort improvement. The highest loading levels have been found to be in the vertical direction and the lowest in the lateral direction in passenger cars and trucks. These results have formed the basis for further laboratory and field investigations. An investigation of human body behaviour under random vibrations is reported in this paper. The research included two phases: biodynamic research and ride comfort investigation. A group of 30 subjects was tested. The influence of broadband random vibrations on the human body was examined through the seat-to-head transmissibility function (STHT). Initially, vertical and fore-and-aft vibrations were considered. Multi-directional vibration was also investigated. In the biodynamic research, subjects were exposed to 0.55, 1.75 and 2.25 m/s² r.m.s. vibration levels in the 0.5-40 Hz frequency domain. The influence of sitting position on human body behaviour under two-axial vibrations was also examined. Data analysis showed that human body behaviour under two-directional random vibrations could not be approximated by superposition of one-directional random vibrations. Non-linearity of the seated human body in the vertical and fore-and-aft directions was observed. The seat-backrest angle also influenced the STHT. In the second phase of the experimental research, a new method for the assessment of the influence of narrowband random vibration on the human body was formulated and tested. It included determination of equivalent comfort curves in the vertical and fore-and-aft directions under one- and two-directional narrowband random vibrations. Equivalent comfort curves for durations of 2.5, 4 and 8 h were determined.

  7. Mean first-passage times of non-Markovian random walkers in confinement

    NASA Astrophysics Data System (ADS)

    Guérin, T.; Levernier, N.; Bénichou, O.; Voituriez, R.

    2016-06-01

    The first-passage time, defined as the time a random walker takes to reach a target point in a confining domain, is a key quantity in the theory of stochastic processes. Its importance comes from its crucial role in quantifying the efficiency of processes as varied as diffusion-limited reactions, target search processes or the spread of diseases. Most methods of determining the properties of first-passage time in confined domains have been limited to Markovian (memoryless) processes. However, as soon as the random walker interacts with its environment, memory effects cannot be neglected: that is, the future motion of the random walker does not depend only on its current position, but also on its past trajectory. Examples of non-Markovian dynamics include single-file diffusion in narrow channels, or the motion of a tracer particle either attached to a polymeric chain or diffusing in simple or complex fluids such as nematics, dense soft colloids or viscoelastic solutions. Here we introduce an analytical approach to calculate, in the limit of a large confining volume, the mean first-passage time of a Gaussian non-Markovian random walker to a target. The non-Markovian features of the dynamics are encompassed by determining the statistical properties of the fictitious trajectory that the random walker would follow after the first-passage event takes place, which are shown to govern the first-passage time kinetics. This analysis is applicable to a broad range of stochastic processes, which may be correlated at long times. Our theoretical predictions are confirmed by numerical simulations for several examples of non-Markovian processes, including the case of fractional Brownian motion in one and higher dimensions. These results reveal, on the basis of Gaussian processes, the importance of memory effects in first-passage statistics of non-Markovian random walkers in confinement.

  8. Methods for sample size determination in cluster randomized trials

    PubMed Central

    Rutterford, Clare; Copas, Andrew; Eldridge, Sandra

    2015-01-01

    Background: The use of cluster randomized trials (CRTs) is increasing, along with the variety in their design and analysis. The simplest approach for their sample size calculation is to calculate the sample size assuming individual randomization and inflate this by a design effect to account for randomization by cluster. The assumptions of a simple design effect may not always be met; alternative or more complicated approaches are required. Methods: We summarise a wide range of sample size methods available for cluster randomized trials. For those familiar with sample size calculations for individually randomized trials but with less experience in the clustered case, this manuscript provides formulae for a wide range of scenarios with associated explanation and recommendations. For those with more experience, comprehensive summaries are provided that allow quick identification of methods for a given design, outcome and analysis method. Results: We present first those methods applicable to the simplest two-arm, parallel group, completely randomized design followed by methods that incorporate deviations from this design such as: variability in cluster sizes; attrition; non-compliance; or the inclusion of baseline covariates or repeated measures. The paper concludes with methods for alternative designs. Conclusions: There is a large amount of methodology available for sample size calculations in CRTs. This paper gives the most comprehensive description of published methodology for sample size calculation and provides an important resource for those designing these trials. PMID:26174515
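The simplest approach mentioned in the background, inflating an individually randomized sample size by the design effect 1 + (m - 1)ICC, can be written out directly. The parameter values below are illustrative, not taken from the paper:

```python
import math

def crt_sample_size(n_individual, cluster_size, icc):
    """Inflate an individually randomized sample size by the simple
    design effect deff = 1 + (m - 1) * ICC, where m is the cluster size."""
    deff = 1.0 + (cluster_size - 1) * icc
    # round before ceil to avoid float artifacts such as 195.00000000000003
    n_crt = math.ceil(round(n_individual * deff, 8))
    return deff, n_crt

# illustrative numbers: 100 per arm needed under individual randomization,
# clusters of 20, intraclass correlation 0.05
deff, n_crt = crt_sample_size(n_individual=100, cluster_size=20, icc=0.05)
```

Here the design effect is 1.95, so 195 participants per arm are needed instead of 100. The paper's point is that this simple inflation breaks down under variable cluster sizes, attrition, non-compliance, or covariate adjustment, which is where its more detailed formulae apply.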

  9. Generation of pseudo-random numbers

    NASA Technical Reports Server (NTRS)

    Howell, L. W.; Rheinfurth, M. H.

    1982-01-01

    Practical methods for generating acceptable random numbers from a variety of probability distributions which are frequently encountered in engineering applications are described. The speed, accuracy, and guarantee of statistical randomness of the various methods are discussed.
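A classical example of such a method is inverse transform sampling, shown here for the exponential distribution. This is an illustrative sketch of the general technique, not code from the report:

```python
import math
import random

def sample_exponential(lam, n, seed=123):
    """Inverse transform sampling: if U ~ Uniform(0,1), then
    -ln(1 - U) / lam follows an Exponential(lam) distribution."""
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]

samples = sample_exponential(lam=2.0, n=50000)
mean = sum(samples) / len(samples)   # should be close to 1/lam = 0.5
```

The same recipe works for any distribution whose inverse CDF is available in closed form; otherwise methods such as rejection sampling are used.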

  10. Propensity score to detect baseline imbalance in cluster randomized trials: the role of the c-statistic.

    PubMed

    Leyrat, Clémence; Caille, Agnès; Foucher, Yohann; Giraudeau, Bruno

    2016-01-22

Despite randomization, baseline imbalance and confounding bias may occur in cluster randomized trials (CRTs). Covariate imbalance may jeopardize the validity of statistical inferences if it occurs on prognostic factors. Thus, diagnosing such an imbalance is essential so that statistical analyses can be adjusted if required. We developed a tool based on the c-statistic of the propensity score (PS) model to detect global baseline covariate imbalance in CRTs and assess the risk of confounding bias. We performed a simulation study to assess the performance of the proposed tool and applied this method to analyze the data from 2 published CRTs. The proposed method had good performance for large sample sizes (n = 500 per arm) and when the number of unbalanced covariates was not too small compared with the total number of baseline covariates (≥40% of unbalanced covariates). We also provide a strategy for preselection of the covariates to include in the PS model to enhance imbalance detection. The proposed tool could be useful in deciding whether covariate adjustment is required before performing statistical analyses of CRTs.
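The c-statistic at the heart of the proposed tool is the probability that a randomly chosen treated unit receives a higher propensity score than a randomly chosen control unit. A minimal computation sketch with hypothetical scores (the paper's PS model itself is not reproduced here):

```python
def c_statistic(scores, labels):
    """Concordance (c-) statistic of predicted scores against binary labels:
    the proportion of (positive, negative) pairs the score ranks correctly,
    counting ties as half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    concordant = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return concordant / (len(pos) * len(neg))

# hypothetical propensity scores for 2 treated (1) and 2 control (0) units
auc = c_statistic([0.1, 0.4, 0.35, 0.8], [0, 0, 1, 1])   # 0.75
```

A c-statistic near 0.5 indicates that the baseline covariates cannot discriminate the arms (good balance), while values well above 0.5 flag global imbalance.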

  11. Learning spinal manipulation: A best-evidence synthesis of teaching methods.

    PubMed

    Stainsby, Brynne E; Clarke, Michelle C S; Egonia, Jade R

    2016-10-01

The purpose of this study was to evaluate the effectiveness of different reported methods used to teach spinal manipulative therapy to chiropractic students. For this best-evidence literature synthesis, 5 electronic databases were searched from 1900 to 2015. Eligible studies were critically appraised using the criteria of the Scottish Intercollegiate Guidelines Network. Scientifically admissible studies were synthesized following best-evidence synthesis principles. Twenty articles were critically appraised, including 9 randomized clinical trials, 9 cohort studies, and 2 systematic reviews/meta-analyses. Eleven articles were accepted as scientifically admissible. Teaching aids included a Thrust in Motion cervical manikin, an instrumented cardiopulmonary reanimation manikin, a padded contact with a load cell, an instrumented treatment table with a force sensor/transducer, and the Dynadjust instrument. Several different methods for teaching spinal manipulative therapy techniques exist in the literature; however, further research in this developing area of chiropractic education is proposed. It is suggested that various teaching methods be included in the regular curricula of chiropractic colleges to aid in developing manipulation skills, efficiency, and knowledge of performance.

  12. Reputation and Image: Some Connections to Resource Environments

    ERIC Educational Resources Information Center

    Hoagland, Steven R.

    2012-01-01

The purpose of this paper is to extend knowledge about the external environment in which educational organizations operate and the patterns by which their resource flows translate into sources of uncertainty. Methods include examination of cross-sectional and longitudinal data from secondary sources and a survey of a random sample of 80 public…

  13. Effects of a Conversation Intervention on the Expressive Vocabulary Development of Prekindergarten Children

    ERIC Educational Resources Information Center

    Ruston, Hilary P.; Schwanenflugel, Paula J.

    2010-01-01

    Purpose: The purpose of this study was to determine the effectiveness of a conversation intervention including 500 min of linguistically and cognitively complex talk on the expressive vocabulary growth of prekindergarten children. Method: Children (N = 73) were randomly assigned to control or a 10-week experimental intervention condition. Twice…

  14. Sudden Gains in Cognitive Therapy and Interpersonal Therapy for Social Anxiety Disorder

    ERIC Educational Resources Information Center

    Bohn, Christiane; Aderka, Idan M.; Schreiber, Franziska; Stangier, Ulrich; Hofmann, Stefan G.

    2013-01-01

    Objective: The present study examined the effects of sudden gains on treatment outcome in a randomized controlled trial including individual cognitive therapy (CT) and interpersonal therapy (IPT) for social anxiety disorder (SAD). Method: Participants were 67 individuals with SAD who received 16 treatment sessions. Symptom severity at each session…

  15. An Adapted Brief Strategic Family Therapy for Gang-Affiliated Mexican American Adolescents

    ERIC Educational Resources Information Center

    Valdez, Avelardo; Cepeda, Alice; Parrish, Danielle; Horowitz, Rosalind; Kaplan, Charles

    2013-01-01

    Objective: This study assessed the effectiveness of an adapted Brief Strategic Family Therapy (BSFT) intervention for gang-affiliated Mexican American adolescents and their parents. Methods: A total of 200 adolescents and their family caregivers were randomized to either a treatment or a control condition. Outcomes included adolescent substance…

  16. The Impact of Sample Size and Other Factors When Estimating Multilevel Logistic Models

    ERIC Educational Resources Information Center

    Schoeneberger, Jason A.

    2016-01-01

    The design of research studies utilizing binary multilevel models must necessarily incorporate knowledge of multiple factors, including estimation method, variance component size, or number of predictors, in addition to sample sizes. This Monte Carlo study examined the performance of random effect binary outcome multilevel models under varying…

  17. Efficacy and Safety of Dexmethylphenidate Extended-Release Capsules in Children with Attention-Deficit/Hyperactivity Disorder

    ERIC Educational Resources Information Center

    Greenhill, Laurence L.; Muniz, Rafael; Ball, Roberta R.; Levine, Alan; Pestreich, Linda; Jiang, Hai

    2006-01-01

    Objective: The efficacy and safety of dexmethylphenidate extended release (d-MPH-ER) was compared to placebo in pediatric patients with attention-deficit/hyperactivity disorder (ADHD). Method: This multicenter, randomized, double-blind, placebo-controlled, parallel-group, two-phase study included 97 patients (ages 6-17 years) with…

  18. The Sun Sense Study: An Intervention to Improve Sun Protection in Children

    ERIC Educational Resources Information Center

    Glasser, Alice; Shaheen, Magda; Glenn, Beth A.; Bastani, Roshan

    2010-01-01

    Objectives: To assess the effect of a multicomponent intervention on parental knowledge, sun avoidance behaviors, and sun protection practices in children 3-10 years. Methods: A randomized trial at a pediatric clinic recruited 197 caregiver-child pairs (90% parents). Intervention included a brief presentation and brochure for the parent and…

  19. Sleep Patterns of College Students at a Public University

    ERIC Educational Resources Information Center

    Forquer, LeAnne M.; Camden, Adrian E.; Gabriau, Krista M.; Johnson, C. Merle

    2008-01-01

    Objective: The authors' purpose in this study was to determine the sleep patterns of college students to identify problem areas and potential solutions. Participants: A total of 313 students returned completed surveys. Methods: A sleep survey was e-mailed to a random sample of students at a North Central university. Questions included individual…

  20. Determination of Factors Effected Dietary Glycemic Index in Turkish University Students

    ERIC Educational Resources Information Center

    Gumus, Huseyin; Akdevelioglu, Yasemin; Bulduk, Sidika

    2014-01-01

    We aimed to determine how factors such as smoking, regular activity, etc. affected dietary glycemic index in university students. Methods: This study was carried out at Gazi University, Ankara, Turkey. The participants were 577 randomly selected Turkish healthy female university students aged 17-32 years. The survey included a questionnaire that…

  1. A More Powerful Test in Three-Level Cluster Randomized Designs

    ERIC Educational Resources Information Center

    Konstantopoulos, Spyros

    2011-01-01

    Field experiments that involve nested structures frequently assign treatment conditions to entire groups (such as schools). A key aspect of the design of such experiments includes knowledge of the clustering effects that are often expressed via intraclass correlation. This study provides methods for constructing a more powerful test for the…

  2. Unsupervised learning on scientific ocean drilling datasets from the South China Sea

    NASA Astrophysics Data System (ADS)

    Tse, Kevin C.; Chiu, Hon-Chim; Tsang, Man-Yin; Li, Yiliang; Lam, Edmund Y.

    2018-06-01

    Unsupervised learning methods were applied to explore data patterns in multivariate geophysical datasets collected from ocean floor sediment core samples coming from scientific ocean drilling in the South China Sea. Compared to studies on similar datasets, but using supervised learning methods which are designed to make predictions based on sample training data, unsupervised learning methods require no a priori information and focus only on the input data. In this study, popular unsupervised learning methods including K-means, self-organizing maps, hierarchical clustering and random forest were coupled with different distance metrics to form exploratory data clusters. The resulting data clusters were externally validated with lithologic units and geologic time scales assigned to the datasets by conventional methods. Compact and connected data clusters displayed varying degrees of correspondence with existing classification by lithologic units and geologic time scales. K-means and self-organizing maps were observed to perform better with lithologic units while random forest corresponded best with geologic time scales. This study sets a pioneering example of how unsupervised machine learning methods can be used as an automatic processing tool for the increasingly high volume of scientific ocean drilling data.
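
    The workflow this record describes, clustering followed by external validation against known labels, can be sketched as follows. The data, feature meanings, and the minimal K-means routine below are illustrative stand-ins, not the study's datasets or code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for multivariate core measurements: two "lithologic
# units" with different means in a 2-feature space (e.g. density, magnetic
# susceptibility), well separated relative to their scatter.
unit_a = rng.normal(loc=[1.5, 0.4], scale=0.1, size=(50, 2))
unit_b = rng.normal(loc=[2.2, 1.1], scale=0.1, size=(50, 2))
X = np.vstack([unit_a, unit_b])
true_units = np.array([0] * 50 + [1] * 50)

def kmeans(X, k, n_iter=100, seed=0):
    """Minimal Lloyd's algorithm with Euclidean distance."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign each sample to its nearest center.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute each center as the mean of its assigned samples.
        new_centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers

labels, centers = kmeans(X, k=2)

# External validation against the known units: agreement up to label permutation.
agreement = max(np.mean(labels == true_units), np.mean(labels != true_units))
print(f"agreement with lithologic units: {agreement:.2f}")
```

    The same pattern extends to the study's other methods (self-organizing maps, hierarchical clustering, random forest proximities) by swapping the clustering step and keeping the external-validation step.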

  3. Dual Dynamically Orthogonal approximation of incompressible Navier Stokes equations with random boundary conditions

    NASA Astrophysics Data System (ADS)

    Musharbash, Eleonora; Nobile, Fabio

    2018-02-01

    In this paper we propose a method for the strong imposition of random Dirichlet boundary conditions in the Dynamical Low Rank (DLR) approximation of parabolic PDEs and, in particular, incompressible Navier Stokes equations. We show that the DLR variational principle can be set in the constrained manifold of all rank-S random fields with a prescribed value on the boundary, expressed in low rank format, with rank smaller than S. We characterize the tangent space to the constrained manifold by means of a Dual Dynamically Orthogonal (Dual DO) formulation, in which the stochastic modes are kept orthonormal and the deterministic modes satisfy suitable boundary conditions, consistent with the original problem. The Dual DO formulation is also convenient for including the incompressibility constraint when dealing with incompressible Navier Stokes equations. We show the performance of the proposed Dual DO approximation on two numerical test cases: the classical benchmark of a laminar flow around a cylinder with random inflow velocity, and a biomedical application simulating blood flow in a realistic carotid artery, reconstructed from MRI data, with random inflow conditions coming from Doppler measurements.

  4. Lay public's understanding of equipoise and randomisation in randomised controlled trials.

    PubMed

    Robinson, E J; Kerr, C E P; Stevens, A J; Lilford, R J; Braunholtz, D A; Edwards, S J; Beck, S R; Rowley, M G

    2005-03-01

    To research the lay public's understanding of equipoise and randomisation in randomised controlled trials (RCTs), to look at why information on this may not be taken in or remembered, and to assess the effects of providing information designed to overcome these barriers. Investigations were informed by an update of a systematic review on patients' understanding of consent information in clinical trials, and by relevant theory and evidence from experimental psychology. Nine investigations were conducted with nine participants. Access (return to education), leisure and vocational courses at Further Education Colleges in the Midlands, UK. Healthy adults with a wide range of educational backgrounds and ages. Participants read hypothetical scenarios and wrote brief answers to subsequent questions. Sub-samples of participants were interviewed individually to elaborate on their written answers. Participants' background assumptions concerning equipoise and randomisation were examined, and ways of helping participants recognise the scientific benefits of randomisation were explored. Judgments on allocation methods; treatment preferences; the acceptability of random allocation; whether or not individual doctors could be completely unsure about the best treatment; whether or not doctors should reveal treatment preferences under conditions of collective equipoise; and how sure experts would be about the best treatment following random allocation vs doctor/patient choice. Assessments of understanding hypothetical trial information. Recent literature continues to report trial participants' failure to understand or remember information about randomisation and equipoise, despite the provision of clear and readable trial information leaflets. In current best practice, written trial information describes what will happen without offering accessible explanations. 
As a consequence, patients may create their own incorrect interpretations and consent or refusal may be inadequately informed. In six investigations, most participants identified which methods of allocation were random, but judged the random allocation methods to be unacceptable in a trial context; the mere description of a treatment as new was insufficient to engender a preference for it over a standard treatment; around half of the participants denied that a doctor could be completely unsure about the best treatment. A majority of participants judged it unacceptable for a doctor to suggest letting chance decide when uncertain of the best treatment, and, in the absence of a justification for random allocation, participants did not recognise scientific benefits of random allocation over normal treatment allocation methods. The pattern of results across three intervention studies suggests that merely supplementing written trial information with an explanation is unlikely to be helpful. However, when people manage to focus on the trial's aim of increasing knowledge (as opposed to making treatment decisions about individuals), and process an explanation actively, they may be helped to understand the scientific reasons for random allocation. This research was not carried out in real healthcare settings. However, participants who could correctly identify random allocation methods, yet judged random allocation unacceptable, doubted the possibility of individual equipoise and saw no scientific benefits of random allocation over doctor/patient choice, are unlikely to draw upon contrasting views if invited to enter a real clinical trial. This suggests that many potential trial participants may have difficulty understanding and remembering trial information that conforms to current best practice in its descriptions of randomisation and equipoise. 
Given the extent of the disparity between the assumptions underlying trial design and the assumptions held by the lay public, the solution is unlikely to be simple. Nevertheless, the results suggest that including an accessible explanation of the scientific benefits of randomisation may be beneficial provided potential participants are also enabled to reflect on the trial's aim of advancing knowledge, and to think actively about the information presented. Further areas for consideration include: the identification of effective combinations of written and oral information; helping participants to reflect on the aim of advancing knowledge; and an evidence-based approach to leaflet construction.

  5. Randomly and Non-Randomly Missing Renal Function Data in the Strong Heart Study: A Comparison of Imputation Methods

    PubMed Central

    Shara, Nawar; Yassin, Sayf A.; Valaitis, Eduardas; Wang, Hong; Howard, Barbara V.; Wang, Wenyu; Lee, Elisa T.; Umans, Jason G.

    2015-01-01

    Kidney and cardiovascular disease are widespread among populations with high prevalence of diabetes, such as American Indians participating in the Strong Heart Study (SHS). Studying these conditions simultaneously in longitudinal studies is challenging, because the morbidity and mortality associated with these diseases result in missing data, and these data are likely not missing at random. When such data are merely excluded, study findings may be compromised. In this article, a subset of 2264 participants with complete renal function data from Strong Heart Exams 1 (1989–1991), 2 (1993–1995), and 3 (1998–1999) was used to examine the performance of five methods used to impute missing data: listwise deletion, mean of serial measures, adjacent value, multiple imputation, and pattern-mixture. Three missing at random models and one non-missing at random model were used to compare the performance of the imputation techniques on randomly and non-randomly missing data. The pattern-mixture method was found to perform best for imputing renal function data that were not missing at random. Determining whether data are missing at random or not can help in choosing the imputation method that will provide the most accurate results. PMID:26414328

  6. Randomly and Non-Randomly Missing Renal Function Data in the Strong Heart Study: A Comparison of Imputation Methods.

    PubMed

    Shara, Nawar; Yassin, Sayf A; Valaitis, Eduardas; Wang, Hong; Howard, Barbara V; Wang, Wenyu; Lee, Elisa T; Umans, Jason G

    2015-01-01

    Kidney and cardiovascular disease are widespread among populations with high prevalence of diabetes, such as American Indians participating in the Strong Heart Study (SHS). Studying these conditions simultaneously in longitudinal studies is challenging, because the morbidity and mortality associated with these diseases result in missing data, and these data are likely not missing at random. When such data are merely excluded, study findings may be compromised. In this article, a subset of 2264 participants with complete renal function data from Strong Heart Exams 1 (1989-1991), 2 (1993-1995), and 3 (1998-1999) was used to examine the performance of five methods used to impute missing data: listwise deletion, mean of serial measures, adjacent value, multiple imputation, and pattern-mixture. Three missing at random models and one non-missing at random model were used to compare the performance of the imputation techniques on randomly and non-randomly missing data. The pattern-mixture method was found to perform best for imputing renal function data that were not missing at random. Determining whether data are missing at random or not can help in choosing the imputation method that will provide the most accurate results.
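
    The distinction driving the imputation comparison in the two records above, data missing at random versus not at random, can be illustrated with a small simulation. All distributions, cut-offs, and variable names here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical renal-function-like measurements (e.g. an eGFR-style variable).
full = rng.normal(loc=90.0, scale=15.0, size=100_000)

# MCAR: every value has the same 30% chance of being missing.
mcar_mask = rng.random(full.size) < 0.30

# MNAR: low values (sicker participants) are more likely to be missing,
# mimicking dropout driven by morbidity and mortality.
p_miss = 1.0 / (1.0 + np.exp((full - 75.0) / 5.0))   # high prob. for low values
mnar_mask = rng.random(full.size) < p_miss

mean_true = full.mean()
mean_after_mcar_deletion = full[~mcar_mask].mean()   # listwise deletion, MCAR
mean_after_mnar_deletion = full[~mnar_mask].mean()   # listwise deletion, MNAR

print(f"true mean:               {mean_true:.1f}")
print(f"listwise deletion, MCAR: {mean_after_mcar_deletion:.1f}")  # ~unbiased
print(f"listwise deletion, MNAR: {mean_after_mnar_deletion:.1f}")  # biased up
```

    Under MCAR, simply excluding incomplete cases leaves estimates roughly unbiased; under MNAR it does not, which is why methods that model the missingness pattern, such as the pattern-mixture approach the study favours, become necessary.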

  7. The treatment of medial tibial stress syndrome in athletes; a randomized clinical trial

    PubMed Central

    2012-01-01

    Background The only three randomized trials on the treatment of MTSS were all performed in military populations. The treatment options investigated in this study were not previously examined in athletes. This study investigated whether the functional outcome of three common treatment options for medial tibial stress syndrome (MTSS) in athletes in a non-military setting was the same. Methods The study design was randomized and multi-centered. Physical therapists and sports physicians referred athletes with MTSS to the hospital for inclusion. 81 athletes were assessed for eligibility, of whom 74 were included and randomized to three treatment groups. Group one performed a graded running program, group two performed a graded running program with additional stretching and strengthening exercises for the calves, while group three performed a graded running program with an additional sports compression stocking. The primary outcome measure was time to complete a running program (able to run 18 minutes with high intensity) and the secondary outcome was general satisfaction with treatment. Results 74 athletes were randomized and included, of whom 14 did not complete the study due to a lack of progress (18.9%). The data were analyzed on an intention-to-treat basis. Time to complete a running program and general satisfaction with the treatment were not significantly different between the three treatment groups. Conclusion This was the first randomized trial on the treatment of MTSS in athletes in a non-military setting. No differences were found between the groups for the time to complete a running program. Trial registration CCMO; NL23471.098.08 PMID:22464032

  8. Calibrating random forests for probability estimation.

    PubMed

    Dankowski, Theresa; Ziegler, Andreas

    2016-09-30

    Probabilities can be consistently estimated using random forests. It is, however, unclear how random forests should be updated to make predictions for other centers or at different time points. In this work, we present two approaches for updating random forests for probability estimation. The first method has been proposed by Elkan and may be used for updating any machine learning approach yielding consistent probabilities, so-called probability machines. The second approach is a new strategy specifically developed for random forests. Using the terminal nodes, which represent conditional probabilities, the random forest is first translated to logistic regression models. These are, in turn, used for re-calibration. The two updating strategies were compared in a simulation study and are illustrated with data from the German Stroke Study Collaboration. In most simulation scenarios, both methods led to similar improvements. In the simulation scenario in which the stricter assumptions of Elkan's method were not met, the logistic regression-based re-calibration approach for random forests outperformed Elkan's method. It also performed better on the stroke data than Elkan's method. The strength of Elkan's method is its general applicability to any probability machine. However, if the strict assumptions underlying this approach are not met, the logistic regression-based approach is preferable for updating random forests for probability estimation. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
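
    The paper's own method translates terminal nodes into logistic regression models; the following is only a generic sketch of the underlying re-calibration idea, refitting a one-variable logistic model on the log-odds of a model's raw probabilities (Platt-style), using simulated data and a plain gradient-descent fit:

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical setting: a model's raw probabilities are systematically
# miscalibrated at a new center; the true risk there follows a shifted,
# rescaled logistic of the same underlying score.
score = rng.normal(size=5000)
p_true = sigmoid(1.5 * score - 0.5)       # actual risk at the new center
y = (rng.random(5000) < p_true).astype(float)
p_raw = sigmoid(score)                    # original, miscalibrated output

# Re-calibrate: logistic regression of the outcome on the log-odds of the
# raw probabilities, fitted by gradient descent on the logistic loss.
x = np.log(p_raw / (1.0 - p_raw))         # log-odds (equals `score` here)
a, b = 0.0, 0.0
for _ in range(2000):
    p_hat = sigmoid(a * x + b)
    a -= 0.5 * np.mean((p_hat - y) * x)   # gradient step for the slope
    b -= 0.5 * np.mean(p_hat - y)         # gradient step for the intercept

p_cal = sigmoid(a * x + b)                # re-calibrated probabilities
print(f"fitted slope a ≈ {a:.2f} (target 1.5), intercept b ≈ {b:.2f} (target -0.5)")
```

    The fitted slope and intercept recover the miscalibration, so `p_cal` tracks the true risk; the random-forest-specific version in the paper applies this idea per terminal-node structure rather than to a single score.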

  9. Data transmission system and method

    NASA Technical Reports Server (NTRS)

    Bruck, Jehoshua (Inventor); Langberg, Michael (Inventor); Sprintson, Alexander (Inventor)

    2010-01-01

    A method of transmitting data packets, where randomness is added to the schedule. Universal broadcast schedules using encoding and randomization techniques are also discussed, together with optimal randomized schedules and an approximation algorithm for finding near-optimal schedules.

  10. Effectiveness of the Dader Method for pharmaceutical care in patients with bipolar I disorder: EMDADER-TAB: study protocol for a randomized controlled trial

    PubMed Central

    2014-01-01

    Background Bipolar I disorder (BD-I) is a chronic mental illness characterized by the presence of one or more manic episodes, or both depressive and manic episodes, usually separated by asymptomatic intervals. Pharmacists can contribute to the management of BD-I, mainly with the use of effective and safe drugs, and improve the patient’s life quality through pharmaceutical care. Some studies have shown the effect of pharmaceutical care in the achievement of therapeutic goals in different illnesses; however, to our knowledge, there is a lack of randomized controlled trials designed to assess the effect of pharmacist intervention in patients with BD. The aim of this study is to assess the effectiveness of the Dader Method for pharmaceutical care in patients with BD-I. Methods/design Randomized, controlled, prospective, single-center clinical trial with duration of 12 months will be performed to compare the effect of Dader Method of pharmaceutical care with the usual care process of patients in a psychiatric clinic. Patients diagnosed with BD-I aged between 18 and 65 years who have been discharged or referred from outpatients service of the San Juan de Dios Clinic (Antioquia, Colombia) will be included. Patients will be randomized into the intervention group who will receive pharmaceutical care provided by pharmacists working in collaboration with psychiatrists, or into the control group who will receive usual care and verbal-written counseling regarding BD. Study outcomes will be assessed at baseline and at 3, 6, 9, and 12 months after randomization. The primary outcome will be to measure the number of hospitalizations, emergency service consultations, and unscheduled outpatient visits. Effectiveness, safety, adherence, and quality of life will be assessed as secondary outcomes. Statistical analyses will be performed using two-tailed McNemar tests, Pearson chi-square tests, and Student’s t-tests; a P value <0.05 will be considered as statistically significant. 
Discussion As far as we know, this is the first randomized controlled trial to assess the effect of the Dader Method for pharmaceutical care in patients with BD-I and it could generate valuable information and recommendations about the role of pharmacists in the improvement of therapeutic goals, solution of drug-related problems, and adherence. Trial registration Registration number NCT01750255 on August 6, 2012. First patient randomized on 24 November 2011. PMID:24885673

  11. Mapping Health Data: Improved Privacy Protection With Donut Method Geomasking

    PubMed Central

    Hampton, Kristen H.; Fitch, Molly K.; Allshouse, William B.; Doherty, Irene A.; Gesink, Dionne C.; Leone, Peter A.; Serre, Marc L.; Miller, William C.

    2010-01-01

    A major challenge in mapping health data is protecting patient privacy while maintaining the spatial resolution necessary for spatial surveillance and outbreak identification. A new adaptive geomasking technique, referred to as the donut method, extends current methods of random displacement by ensuring a user-defined minimum level of geoprivacy. In donut method geomasking, each geocoded address is relocated in a random direction by at least a minimum distance, but less than a maximum distance. The authors compared the donut method with current methods of random perturbation and aggregation regarding measures of privacy protection and cluster detection performance by masking multiple disease field simulations under a range of parameters. Both the donut method and random perturbation performed better than aggregation in cluster detection measures. The performance of the donut method in geoprivacy measures was at least 42.7% higher and in cluster detection measures was less than 4.8% lower than that of random perturbation. Results show that the donut method provides a consistently higher level of privacy protection with a minimal decrease in cluster detection performance, especially in areas where the risk to individual geoprivacy is greatest. PMID:20817785

  12. Mapping health data: improved privacy protection with donut method geomasking.

    PubMed

    Hampton, Kristen H; Fitch, Molly K; Allshouse, William B; Doherty, Irene A; Gesink, Dionne C; Leone, Peter A; Serre, Marc L; Miller, William C

    2010-11-01

    A major challenge in mapping health data is protecting patient privacy while maintaining the spatial resolution necessary for spatial surveillance and outbreak identification. A new adaptive geomasking technique, referred to as the donut method, extends current methods of random displacement by ensuring a user-defined minimum level of geoprivacy. In donut method geomasking, each geocoded address is relocated in a random direction by at least a minimum distance, but less than a maximum distance. The authors compared the donut method with current methods of random perturbation and aggregation regarding measures of privacy protection and cluster detection performance by masking multiple disease field simulations under a range of parameters. Both the donut method and random perturbation performed better than aggregation in cluster detection measures. The performance of the donut method in geoprivacy measures was at least 42.7% higher and in cluster detection measures was less than 4.8% lower than that of random perturbation. Results show that the donut method provides a consistently higher level of privacy protection with a minimal decrease in cluster detection performance, especially in areas where the risk to individual geoprivacy is greatest.
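
    The masking step described in both records, relocating each address in a random direction by at least a minimum but less than a maximum distance, is straightforward to sketch. Function and parameter names are illustrative:

```python
import math
import random

def donut_mask(x, y, r_min, r_max, rng=random):
    """Displace a point in a uniformly random direction by a distance of at
    least r_min but less than r_max (the 'donut' around the true location).

    Sampling r^2 uniformly between r_min^2 and r_max^2 makes the masked
    point uniform over the annulus area rather than clustered near r_min.
    """
    theta = rng.uniform(0.0, 2.0 * math.pi)
    r = math.sqrt(rng.uniform(r_min ** 2, r_max ** 2))
    return x + r * math.cos(theta), y + r * math.sin(theta)

random.seed(0)
masked = [donut_mask(0.0, 0.0, r_min=100.0, r_max=500.0) for _ in range(1000)]
dists = [math.hypot(mx, my) for mx, my in masked]
print(f"min displacement: {min(dists):.1f}, max displacement: {max(dists):.1f}")
```

    The guaranteed minimum displacement is what distinguishes the donut method from plain random perturbation, whose displacement can be arbitrarily small and thus leak the true location.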

  13. Comparison of cast materials for the treatment of congenital idiopathic clubfoot using the Ponseti method: a prospective randomized controlled trial

    PubMed Central

    Hui, Catherine; Joughin, Elaine; Nettel-Aguirre, Alberto; Goldstein, Simon; Harder, James; Kiefer, Gerhard; Parsons, David; Brauer, Carmen; Howard, Jason

    2014-01-01

    Background The Ponseti method of congenital idiopathic clubfoot correction has traditionally specified plaster of Paris (POP) as the cast material of choice; however, there are negative aspects to using POP. We sought to determine the influence of cast material (POP v. semirigid fibreglass [SRF]) on clubfoot correction using the Ponseti method. Methods Patients were randomized to POP or SRF before undergoing the Ponseti method. The primary outcome measure was the number of casts required for clubfoot correction. Secondary outcome measures included the number of casts by severity, ease of cast removal, need for Achilles tenotomy, brace compliance, deformity relapse, need for repeat casting and need for ancillary surgical procedures. Results We enrolled 30 patients: 12 randomized to POP and 18 to SRF. There was no difference in the number of casts required for clubfoot correction between the groups (p = 0.13). According to parents, removal of POP was more difficult (p < 0.001), more time consuming (p < 0.001) and required more than 1 method (p < 0.001). At a final follow-up of 30.8 months, the mean times to deformity relapse requiring repeat casting, surgery or both were 18.7 and 16.4 months for the SRF and POP groups, respectively. Conclusion There was no significant difference in the number of casts required for correction of clubfoot between the 2 materials, but SRF resulted in a more favourable parental experience, which cannot be ignored as it may have a positive impact on psychological well-being despite the associated increase in cost. PMID:25078929

  14. Development and computer implementation of design/analysis techniques for multilayered composite structures. Probabilistic fiber composite micromechanics. M.S. Thesis, Mar. 1987 Final Report, 1 Sep. 1984 - 1 Oct. 1990

    NASA Technical Reports Server (NTRS)

    Stock, Thomas A.

    1995-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. The variables in which uncertainties are accounted for include constituent and void volume ratios, constituent elastic properties and strengths, and fiber misalignment. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material property variations induced by random changes expected at the material micro level. Regression results are presented to show the relative correlation between predictor and response variables in the study. These computational procedures make possible a formal description of anticipated random processes at the intraply level, and the related effects of these on composite properties.
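
    The study's micromechanics equations are more elaborate, but the Monte Carlo idea, propagating random constituent properties through a deterministic model, can be sketched with the simple rule of mixtures for the longitudinal ply modulus. All numbers are hypothetical, of roughly graphite/epoxy magnitude (GPa):

```python
import random
import statistics

random.seed(0)

# Monte Carlo sketch: longitudinal modulus of a unidirectional ply via the
# rule of mixtures, E11 = Vf*Ef + (1 - Vf)*Em, with uncertainty in the fiber
# volume ratio and in both constituent moduli.
samples = []
for _ in range(10_000):
    vf = random.gauss(0.60, 0.03)    # fiber volume ratio
    ef = random.gauss(230.0, 10.0)   # fiber longitudinal modulus, GPa
    em = random.gauss(3.5, 0.3)      # matrix modulus, GPa
    samples.append(vf * ef + (1.0 - vf) * em)

mean_e = statistics.mean(samples)
sd_e = statistics.stdev(samples)
print(f"E11 ≈ {mean_e:.0f} GPa, scatter (1 s.d.) ≈ {sd_e:.1f} GPa")
```

    Repeating this over each property of interest yields the kind of distribution and sensitivity (regression) information the record describes for the intraply level.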

  15. Self-administration of intranasal influenza vaccine: Immunogenicity and volunteer acceptance

    PubMed Central

    Burgess, Timothy H.; Murray, Clinton K.; Bavaro, Mary F.; Landrum, Michael L.; O’Bryan, Thomas A.; Rosas, Jessica G.; Cammarata, Stephanie M.; Martin, Nicholas J.; Ewing, Daniel; Raviprakash, Kanakatte; Mor, Deepika; Zell, Elizabeth R.; Wilkins, Kenneth J.; Millar, Eugene V.

    2018-01-01

    Background In outbreak settings, mass vaccination strategies could maximize health protection of military personnel. Self-administration of live attenuated influenza vaccine (LAIV) may be a means to vaccinate large numbers of people and achieve deployment readiness while sparing the use of human resources. Methods A phase IV, open-label, randomized controlled trial evaluating the immunogenicity and acceptance of self-administered (SA) LAIV was conducted from 2012 to 2014. SA subjects were randomized to either individual self-administration or self-administration in a group setting. Control randomized subjects received healthcare worker-administered (HCWA) LAIV. Anti-hemagglutinin (HAI) antibody concentrations were measured pre- and post-vaccination. The primary endpoint was immunogenicity non-inferiority between SA and HCWA groups. Subjects were surveyed on preferred administration method. Results A total of 1077 subjects consented and were randomized (529 SA, 548 HCWA). Subject characteristics were very similar between groups, though SA subjects were younger, more likely to be white and on active duty. The per-protocol analysis included 1024 subjects (501 SA, 523 HCWA). Post-vaccination geometric mean titers by vaccine strain and by study group (HCWA vs. SA) were: A/H1N1 (45.8 vs. 48.7, respectively; p = 0.43), A/H3N2 (45.5 vs. 46.4; p = 0.80), B/Yamagata (17.2 vs. 17.8; p = 0.55). Seroresponses to A components were high (∼67%), while seroresponses to B components were lower (∼25%). Seroresponse did not differ by administration method. Baseline preference for administration method was similar between groups, with the majority in each group expressing no preference. At follow-up, the majority (64%) of SA subjects preferred SA vaccine. Conclusions LAIV immunogenicity was similar for HCWA and SA vaccines. SA was well-tolerated and preferred to HCWA among those who performed SA. PMID:26117150
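
    The geometric mean titers compared above are the standard summary for HAI antibody data, computed on the log scale. A minimal sketch with illustrative titers (not study data):

```python
import math

def geometric_mean_titer(titers):
    """GMT = exp(mean(log(titer))); appropriate because HAI titers come from
    serial two-fold dilutions and are log-normally distributed."""
    return math.exp(sum(math.log(t) for t in titers) / len(titers))

# Illustrative titers on the usual two-fold dilution scale.
titers = [10, 20, 40, 40, 80, 160]
print(f"GMT ≈ {geometric_mean_titer(titers):.1f}")
```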

  16. How to deal with missing longitudinal data in cost of illness analysis in Alzheimer's disease-suggestions from the GERAS observational study.

    PubMed

    Belger, Mark; Haro, Josep Maria; Reed, Catherine; Happich, Michael; Kahle-Wrobleski, Kristin; Argimon, Josep Maria; Bruno, Giuseppe; Dodel, Richard; Jones, Roy W; Vellas, Bruno; Wimo, Anders

    2016-07-18

    Missing data are a common problem in prospective studies with a long follow-up, and the volume, pattern and reasons for missing data may be relevant when estimating the cost of illness. We aimed to evaluate the effects of different methods for dealing with missing longitudinal cost data and for costing caregiver time on total societal costs in Alzheimer's disease (AD). GERAS is an 18-month observational study of costs associated with AD. Total societal costs included patient health and social care costs, and caregiver health and informal care costs. Missing data were classified as missing completely at random (MCAR), missing at random (MAR) or missing not at random (MNAR). Simulation datasets were generated from baseline data with 10-40 % missing total cost data for each missing data mechanism. Datasets were also simulated to reflect the missing cost data pattern at 18 months using MAR and MNAR assumptions. Naïve and multiple imputation (MI) methods were applied to each dataset and results compared with complete GERAS 18-month cost data. Opportunity and replacement cost approaches were used for caregiver time, which was costed with and without supervision included and with time for working caregivers only being costed. Total costs were available for 99.4 % of 1497 patients at baseline. For MCAR datasets, naïve methods performed as well as MI methods. For MAR, MI methods performed better than naïve methods. All imputation approaches were poor for MNAR data. For all approaches, percentage bias increased with missing data volume. For datasets reflecting 18-month patterns, a combination of imputation methods provided more accurate cost estimates (e.g. bias: -1 % vs -6 % for single MI method), although different approaches to costing caregiver time had a greater impact on estimated costs (29-43 % increase over base case estimate). 
Methods used to impute missing cost data in AD will impact on accuracy of cost estimates although varying approaches to costing informal caregiver time has the greatest impact on total costs. Tailoring imputation methods to the reason for missing data will further our understanding of the best analytical approach for studies involving cost outcomes.

  17. Quality Control for Interviews to Obtain Dietary Recalls from Children for Research Studies

    PubMed Central

    SHAFFER, NICOLE M.; THOMPSON, WILLIAM O.; BAGLIO, MICHELLE L.; GUINN, CAROLINE H.; FRYE, FRANCESCA H. A.

    2005-01-01

    Quality control is an important aspect of a study because the quality of data collected provides a foundation for the conclusions drawn from the study. For studies that include interviews, establishing quality control for interviews is critical in ascertaining whether interviews are conducted according to protocol. Despite the importance of quality control for interviews, few studies adequately document the quality control procedures used during data collection. This article reviews quality control for interviews and describes methods and results of quality control for interviews from two of our studies regarding the accuracy of children's dietary recalls; the focus is on quality control regarding interviewer performance during the interview, and examples are provided from studies with children. For our two studies, every interview was audio recorded and transcribed. The audio recording and typed transcript from one interview conducted by each research dietitian either weekly or daily were randomly selected and reviewed by another research dietitian, who completed a standardized quality control for interviews checklist. Major strengths of the methods of quality control for interviews in our two studies include: (a) interviews obtained for data collection were randomly selected for quality control for interviews, and (b) quality control for interviews was assessed on a regular basis throughout data collection. The methods of quality control for interviews described may help researchers design appropriate methods of quality control for interviews for future studies. PMID:15389417

  18. Difference method to search for the anisotropy of primary cosmic radiation

    NASA Astrophysics Data System (ADS)

    Pavlyuchenko, V. P.; Martirosov, R. M.; Nikolskaya, N. M.; Erlykin, A. D.

    2018-01-01

    The original difference method used in the search for an anisotropy of primary cosmic radiation at the knee region of its energy spectrum is considered. Its methodical features and properties are analyzed. It is shown that this method, in which properties of particle fluxes (rather than an intensity) are investigated, is stable against random experimental errors and allows one to separate anomalies connected with the laboratory coordinate system from anomalies in the celestial coordinate system. The method uses the multiple scattering of charged particles in the magnetic fields of the Galaxy to study the whole celestial sphere, including the regions outside the line of sight of the installation.

  19. Cluster-randomized Studies in Educational Research: Principles and Methodological Aspects.

    PubMed

    Dreyhaupt, Jens; Mayer, Benjamin; Keis, Oliver; Öchsner, Wolfgang; Muche, Rainer

    2017-01-01

    An increasing number of studies are being performed in educational research to evaluate new teaching methods and approaches. These studies could be performed more efficiently and deliver more convincing results if they more strictly applied and complied with recognized standards of scientific studies. Such an approach could substantially increase the quality in particular of prospective, two-arm (intervention) studies that aim to compare two different teaching methods. A key standard in such studies is randomization, which can minimize systematic bias in study findings; such bias may result if the two study arms are not structurally equivalent. If possible, educational research studies should also achieve this standard, although this is not yet generally the case. Some difficulties and concerns exist, particularly regarding organizational and methodological aspects. An important point to consider in educational research studies is that usually individuals cannot be randomized, because of the teaching situation, and instead whole groups have to be randomized (so-called "cluster randomization"). Compared with studies with individual randomization, studies with cluster randomization normally require (significantly) larger sample sizes and more complex methods for calculating sample size. Furthermore, cluster-randomized studies require more complex methods for statistical analysis. The consequence of the above is that a competent expert with respective special knowledge needs to be involved in all phases of cluster-randomized studies. Studies to evaluate new teaching methods need to make greater use of randomization in order to achieve scientifically convincing results. Therefore, in this article we describe the general principles of cluster randomization and how to implement these principles, and we also outline practical aspects of using cluster randomization in prospective, two-arm comparative educational research studies.
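
    The sample-size inflation described here is commonly quantified by the design effect DEFF = 1 + (m - 1) * ICC, where m is the cluster size and ICC the intraclass correlation. A minimal sketch with illustrative numbers:

```python
import math

def cluster_randomized_n(n_individual, cluster_size, icc):
    """Inflate an individually randomized per-arm sample size by the design
    effect, then round up to whole clusters per arm. Real planning should
    also allow for unequal cluster sizes and attrition.
    """
    deff = 1.0 + (cluster_size - 1) * icc
    n_total = math.ceil(n_individual * deff)
    clusters = math.ceil(n_individual * deff / cluster_size)
    return deff, n_total, clusters

# Example: 128 students per arm would suffice under individual randomization;
# with classes of 25 students and ICC = 0.05 the requirement more than doubles.
deff, n_total, clusters = cluster_randomized_n(128, 25, 0.05)
print(f"design effect = {deff:.2f}, n per arm = {n_total}, classes per arm = {clusters}")
```

    Even a modest ICC therefore changes the study's feasibility, which is one reason the record recommends involving a methodological expert in all phases of a cluster-randomized study.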

  20. Yoga for veterans with chronic low back pain: Design and methods of a randomized clinical trial.

    PubMed

    Groessl, Erik J; Schmalzl, Laura; Maiya, Meghan; Liu, Lin; Goodman, Debora; Chang, Douglas G; Wetherell, Julie L; Bormann, Jill E; Atkinson, J Hamp; Baxi, Sunita

    2016-05-01

    Chronic low back pain (CLBP) afflicts millions of people worldwide, with particularly high prevalence in military veterans. Many treatment options exist for CLBP, but most have limited effectiveness and some have significant side effects. In general populations with CLBP, yoga has been shown to improve health outcomes with few side effects. However, yoga has not been adequately studied in military veteran populations. In the current paper we describe the design and methods of a randomized clinical trial aimed at examining whether yoga can effectively reduce disability and pain in US military veterans with CLBP. A total of 144 US military veterans with CLBP will be randomized to either yoga or a delayed-treatment comparison group. The yoga intervention will consist of twice-weekly yoga classes for 12 weeks, complemented by regular home practice guided by a manual. The delayed-treatment group will receive the same intervention after six months. The primary outcome is the change in back pain-related disability measured with the Roland-Morris Disability Questionnaire at baseline and 12 weeks. Secondary outcomes include pain intensity, pain interference, depression, anxiety, fatigue/energy, quality of life, self-efficacy, sleep quality, and medication usage. Additional process and/or mediational factors will be measured to examine dose response and effect mechanisms. Assessments will be conducted at baseline, 6 weeks, 12 weeks, and 6 months. All randomized participants will be included in intention-to-treat analyses. Study results will provide much needed evidence on the feasibility and effectiveness of yoga as a therapeutic modality for the treatment of CLBP in US military veterans. Published by Elsevier Inc.

  1. Low or High Fractionation Dose β-Radiotherapy for Pterygium? A Randomized Clinical Trial

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Viani, Gustavo Arruda, E-mail: gusviani@gmail.com; De Fendi, Ligia Issa; Fonseca, Ellen Carrara

    2012-02-01

    Purpose: Postoperative adjuvant treatment using β-radiotherapy (RT) is a proven technique for reducing the recurrence of pterygium. A randomized trial was conducted to determine whether a low fractionation dose of 2 Gy within 10 fractions would provide local control similar to that after a high fractionation dose of 5 Gy within 7 fractions for surgically resected pterygium. Methods: A randomized trial was conducted in 200 patients (216 pterygia) between February 2006 and July 2007. Only patients with fresh pterygium resected using a bare sclera method and given RT within 3 days were included. Postoperative RT was delivered using a strontium-90 eye applicator. The pterygia were randomly treated using either 5 Gy within 7 fractions (Group 1) or 2 Gy within 10 fractions (Group 2). The local control rate was calculated from the date of surgery. Results: Of the 216 pterygia included, 112 were allocated to Group 1 and 104 to Group 2. The 3-year local control rate for Groups 1 and 2 was 93.8% and 92.3%, respectively (p = .616). A statistically significant difference for cosmetic effect (p = .034), photophobia (p = .02), irritation (p = .001), and scleromalacia (p = .017) was noted in favor of Group 2. Conclusions: No better local control rate for postoperative pterygium was obtained using high-dose fractionation vs. low-dose fractionation. However, a low-dose fractionation schedule produced better cosmetic effects and resulted in fewer symptoms than high-dose fractionation. Moreover, pterygia can be safely treated in terms of local recurrence using RT schedules with a biologically effective dose of 24-52.5 Gy₁₀.
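
    The biologically effective dose quoted in the conclusion follows the standard linear-quadratic formula BED = n·d·(1 + d/(α/β)); with α/β = 10 Gy (the "Gy₁₀" notation), the two trial schedules reproduce the quoted 24-52.5 range. A quick check:

```python
# Sketch: standard linear-quadratic biologically effective dose (BED),
# applied to the two fractionation schedules in the trial above.
def bed(n_fractions: int, dose_per_fraction: float, alpha_beta: float = 10.0) -> float:
    """BED = n * d * (1 + d / (alpha/beta)), in Gy_{alpha/beta}."""
    return n_fractions * dose_per_fraction * (1.0 + dose_per_fraction / alpha_beta)

print(bed(7, 5.0))   # 52.5 (Group 1: 5 Gy x 7)
print(bed(10, 2.0))  # 24.0 (Group 2: 2 Gy x 10)
```

The endpoints of the quoted 24-52.5 Gy₁₀ range are exactly the two trial arms.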

  2. A multi-level intervention in worksites to increase fruit and vegetable access and intake: Rationale, design and methods of the ‘Good to Go’ cluster randomized trial

    PubMed Central

    Risica, Patricia M.; Gorham, Gemma; Dionne, Laura; Nardi, William; Ng, Doug; Middler, Reese; Mello, Jennifer; Akpolat, Rahmet; Gettens, Katelyn; Gans, Kim M.

    2018-01-01

    Background: Fruit and vegetable (F&V) consumption is an important contributor to chronic disease prevention. However, most Americans do not eat adequate amounts. The worksite is an advantageous setting to reach large, diverse segments of the population with interventions to increase F&V intake, but research gaps exist. No studies have evaluated the implementation of mobile F&V markets at worksites nor compared the effectiveness of such markets with or without nutrition education. Methods: This paper describes the protocol for Good to Go (GTG), a cluster randomized trial to evaluate F&V intake change in employees from worksites randomized into three experimental arms: discounted fresh F&V markets (Access Only arm); markets plus educational components including campaigns, cooking demonstrations, videos, newsletters, and a web site (Access Plus arm); and an attention placebo comparison intervention on physical activity and stress reduction (Comparison). Secondary aims include: 1) Process evaluation to determine costs, reach, fidelity, and dose as well as the relationship of these variables with changes in F&V intake; 2) Applying a mediating variable framework to examine relationships of psychosocial factors/determinants with changes in F&V consumption; and 3) Cost effectiveness analysis of the different intervention arms. Discussion: The GTG study will fill important research gaps in the field by implementing a rigorous cluster randomized trial to evaluate the efficacy of an innovative environmental intervention providing access and availability to F&V at the worksite and whether this access intervention is further enhanced by accompanying educational interventions. GTG will provide an important contribution to public health research and practice. Trial registration: NCT02729675, ClinicalTrials.gov PMID:29242108

  3. Combining macula clinical signs and patient characteristics for age-related macular degeneration diagnosis: a machine learning approach.

    PubMed

    Fraccaro, Paolo; Nicolo, Massimo; Bonetto, Monica; Giacomini, Mauro; Weller, Peter; Traverso, Carlo Enrico; Prosperi, Mattia; O'Sullivan, Dympna

    2015-01-27

    To investigate machine learning methods, ranging from simpler interpretable techniques to complex (non-linear) "black-box" approaches, for automated diagnosis of Age-related Macular Degeneration (AMD). Data from healthy subjects and patients diagnosed with AMD or other retinal diseases were collected during routine visits via an Electronic Health Record (EHR) system. Patients' attributes included demographics and, for each eye, presence/absence of major AMD-related clinical signs (soft drusen, retinal pigment epithelium defects/pigment mottling, depigmentation area, subretinal haemorrhage, subretinal fluid, macula thickness, macular scar, subretinal fibrosis). Interpretable "white box" techniques, including logistic regression and decision trees, as well as less interpretable "black box" techniques, such as support vector machines (SVM), random forests and AdaBoost, were used to develop models (trained and validated on unseen data) to diagnose AMD. The gold standard was confirmed diagnosis of AMD by physicians. Sensitivity, specificity and area under the receiver operating characteristic curve (AUC) were used to assess performance. The study population included 487 patients (912 eyes). In terms of AUC, random forests, logistic regression and AdaBoost showed a mean performance of 0.92, followed by SVM and decision trees (0.90). All machine learning models identified soft drusen and age as the most discriminating variables in clinicians' decision pathways to diagnose AMD. Both black box and white box methods performed well in identifying diagnoses of AMD and their decision pathways. Machine learning models developed through the proposed approach, relying on clinical signs identified by retinal specialists, could be embedded into EHR systems to provide physicians with real-time (interpretable) support.
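
    The model comparison described above can be sketched with scikit-learn on synthetic data (the generated features only stand in for the study's EHR attributes; nothing below reproduces the actual dataset):

```python
# Sketch (synthetic data, not the study's EHR records): a "white box" logistic
# regression vs. a "black box" random forest, assessed by AUC on held-out data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# 912 "eyes" with 10 attributes, standing in for clinical signs plus demographics
X, y = make_classification(n_samples=912, n_features=10, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

aucs = {}
for name, model in [("logistic", LogisticRegression(max_iter=1000)),
                    ("random forest", RandomForestClassifier(random_state=0))]:
    model.fit(X_tr, y_tr)
    aucs[name] = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {aucs[name]:.2f}")
```

Validating on data the model has not seen, as the abstract emphasizes, is what makes the reported AUCs meaningful.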

  4. A Tutorial on Multilevel Survival Analysis: Methods, Models and Applications

    PubMed Central

    Austin, Peter C.

    2017-01-01

    Summary: Data that have a multilevel structure occur frequently across a range of disciplines, including epidemiology, health services research, public health, education and sociology. We describe three families of regression models for the analysis of multilevel survival data. First, Cox proportional hazards models with mixed effects incorporate cluster-specific random effects that modify the baseline hazard function. Second, piecewise exponential survival models partition the duration of follow-up into mutually exclusive intervals and fit a model that assumes that the hazard function is constant within each interval. This is equivalent to a Poisson regression model that incorporates the duration of exposure within each interval. By incorporating cluster-specific random effects, generalised linear mixed models can be used to analyse these data. Third, after partitioning the duration of follow-up into mutually exclusive intervals, one can use discrete time survival models that use a complementary log–log generalised linear model to model the occurrence of the outcome of interest within each interval. Random effects can be incorporated to account for within-cluster homogeneity in outcomes. We illustrate the application of these methods using data consisting of patients hospitalised with a heart attack. We illustrate the application of these methods using three statistical programming languages (R, SAS and Stata). PMID:29307954
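
    The equivalence between the piecewise exponential model and Poisson regression noted above rests on the fact that, within an interval, the maximum-likelihood hazard estimate is events divided by person-time. A numpy-only sketch with simulated data (the cut-points and parameters are invented for illustration, and no covariates or random effects are included):

```python
# Sketch of the piecewise exponential idea: within each interval, the hazard
# MLE equals events / person-time, i.e. a Poisson rate. Simulated data only.
import numpy as np

rng = np.random.default_rng(0)
n = 20000
t = rng.exponential(scale=2.0, size=n)       # true constant hazard = 0.5
censor_time = 5.0
time = np.minimum(t, censor_time)            # observed follow-up
event = t <= censor_time                     # True if the event was observed

cuts = [0.0, 1.0, 2.0, 5.0]                  # partition of follow-up
rates = []
for lo, hi in zip(cuts[:-1], cuts[1:]):
    at_risk = time > lo                                      # still observed at `lo`
    exposure = np.clip(time[at_risk], lo, hi) - lo           # person-time in (lo, hi]
    events = np.sum(event[at_risk] & (time[at_risk] <= hi))  # events in (lo, hi]
    rates.append(events / exposure.sum())

print(rates)                                 # each estimate is close to 0.5
```

In a full analysis, each (subject, interval) row would enter a Poisson GLM with log person-time as an offset, which is where covariates and cluster-specific random effects are added.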

  5. Network meta-analysis, electrical networks and graph theory.

    PubMed

    Rücker, Gerta

    2012-12-01

    Network meta-analysis is an active field of research in clinical biostatistics. It aims to combine information from all randomized comparisons among a set of treatments for a given medical condition. We show how graph-theoretical methods can be applied to network meta-analysis. A meta-analytic graph consists of vertices (treatments) and edges (randomized comparisons). We illustrate the correspondence between meta-analytic networks and electrical networks, where variance corresponds to resistance, treatment effects to voltage, and weighted treatment effects to current flows. Based thereon, we then show that graph-theoretical methods that have been routinely applied to electrical networks also work well in network meta-analysis. In more detail, the resulting consistent treatment effects induced in the edges can be estimated via the Moore-Penrose pseudoinverse of the Laplacian matrix. Moreover, the variances of the treatment effects are estimated in analogy to electrical effective resistances. It is shown that this method, being computationally simple, leads to the usual fixed effect model estimate when applied to pairwise meta-analysis and is consistent with published results when applied to network meta-analysis examples from the literature. Moreover, problems of heterogeneity and inconsistency, random effects modeling and including multi-armed trials are addressed. Copyright © 2012 John Wiley & Sons, Ltd.
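
    The estimator described above can be sketched directly with numpy (the three-treatment network and its effect estimates below are invented for illustration):

```python
# Sketch of the electrical-network estimator: consistent network effects via the
# Moore-Penrose pseudoinverse of the graph Laplacian. Invented toy network.
import numpy as np

# Edges: (A,B), (B,C), (A,C); row convention: +1 first treatment, -1 second
B = np.array([[1.0, -1.0, 0.0],
              [0.0, 1.0, -1.0],
              [1.0, 0.0, -1.0]])
y = np.array([0.5, 0.3, 0.9])        # observed direct effects (slightly inconsistent)
v = np.array([0.1, 0.2, 0.1])        # variances; inverse variances act as conductances
W = np.diag(1.0 / v)

L = B.T @ W @ B                      # graph Laplacian
p = np.linalg.pinv(L) @ B.T @ W @ y  # treatment-level "potentials"
y_hat = B @ p                        # consistent edge effects ("voltages")

print(y_hat)
# Consistency by construction: the fitted A-C effect equals (A-B) + (B-C).
print(y_hat[2] - (y_hat[0] + y_hat[1]))
```

The same pseudoinverse also yields the variances: the effective resistance between two treatment nodes plays the role of the variance of the corresponding network estimate.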

  6. A Tutorial on Multilevel Survival Analysis: Methods, Models and Applications.

    PubMed

    Austin, Peter C

    2017-08-01

    Data that have a multilevel structure occur frequently across a range of disciplines, including epidemiology, health services research, public health, education and sociology. We describe three families of regression models for the analysis of multilevel survival data. First, Cox proportional hazards models with mixed effects incorporate cluster-specific random effects that modify the baseline hazard function. Second, piecewise exponential survival models partition the duration of follow-up into mutually exclusive intervals and fit a model that assumes that the hazard function is constant within each interval. This is equivalent to a Poisson regression model that incorporates the duration of exposure within each interval. By incorporating cluster-specific random effects, generalised linear mixed models can be used to analyse these data. Third, after partitioning the duration of follow-up into mutually exclusive intervals, one can use discrete time survival models that use a complementary log-log generalised linear model to model the occurrence of the outcome of interest within each interval. Random effects can be incorporated to account for within-cluster homogeneity in outcomes. We illustrate the application of these methods using data consisting of patients hospitalised with a heart attack. We illustrate the application of these methods using three statistical programming languages (R, SAS and Stata).

  7. Randomized controlled trials in dentistry: common pitfalls and how to avoid them.

    PubMed

    Fleming, Padhraig S; Lynch, Christopher D; Pandis, Nikolaos

    2014-08-01

    Clinical trials are used to appraise the effectiveness of clinical interventions throughout medicine and dentistry. Randomized controlled trials (RCTs) are established as the optimal primary design and are published with increasing frequency within the biomedical sciences, including dentistry. This review outlines common pitfalls associated with the conduct of randomized controlled trials in dentistry. Common failings in RCT design leading to various types of bias, including selection, performance, detection and attrition bias, are discussed in this review. Moreover, methods of minimizing and eliminating bias are presented to ensure that maximal benefit is derived from RCTs within dentistry. Well-designed RCTs have both upstream and downstream uses, acting as a template for development and populating systematic reviews to permit more precise estimates of treatment efficacy and effectiveness. However, there is increasing awareness of waste in clinical research, whereby resource-intensive studies fail to provide a commensurate level of scientific evidence. Waste may stem either from inappropriate design or from inadequate reporting of RCTs; the importance of robust conduct of RCTs within dentistry is clear. Optimal reporting of randomized controlled trials within dentistry is necessary to ensure that trials are reliable and valid. Common shortcomings leading to important forms of bias are discussed and approaches to minimizing these issues are outlined. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Educational interventions in childhood obesity: a systematic review with meta-analysis of randomized clinical trials.

    PubMed

    Sbruzzi, Graciele; Eibel, Bruna; Barbiero, Sandra M; Petkowicz, Rosemary O; Ribeiro, Rodrigo A; Cesa, Claudia C; Martins, Carla C; Marobin, Roberta; Schaan, Camila W; Souza, Willian B; Schaan, Beatriz D; Pellanda, Lucia C

    2013-05-01

    To assess the effectiveness of educational interventions including behavioral modification, nutrition and physical activity to prevent or treat childhood obesity, through a systematic review and meta-analysis of randomized trials. A search of databases (PubMed, EMBASE and Cochrane CENTRAL) and references of published studies (from inception until May 2012) was conducted. Eligible studies were randomized trials enrolling children 6 to 12 years old and assessing the impact of educational interventions lasting 6 months or longer on waist circumference, body mass index (BMI), blood pressure and lipid profile to prevent or treat childhood obesity. Calculations were performed using a random effects method and pooled-effect estimates were obtained using the final values. Of 22,852 articles retrieved, 26 trials (23,617 participants) were included. There were no differences in outcomes assessed in prevention studies. However, in treatment studies, educational interventions were associated with a significant reduction in waist circumference [-3.21 cm (95%CI -6.34, -0.07)], BMI [-0.86 kg/m(2) (95%CI -1.59, -0.14)] and diastolic blood pressure [-3.68 mmHg (95%CI -5.48, -1.88)]. Educational interventions are effective in the treatment, but not prevention, of childhood obesity and its consequences. Copyright © 2013 Elsevier Inc. All rights reserved.
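
    The random-effects pooling used in such meta-analyses is commonly the DerSimonian-Laird method; the sketch below assumes that method and uses made-up trial effects, not the review's data:

```python
# Sketch: DerSimonian-Laird random-effects pooling (assumed method; the five
# trial effects and variances below are hypothetical, not the review's data).
import numpy as np

def dersimonian_laird(effects, variances):
    """Pooled effect and 95% CI under a random-effects model."""
    y, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)            # fixed-effect pooled mean
    q = np.sum(w * (y - y_fixed) ** 2)             # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)        # between-study variance
    w_re = 1.0 / (v + tau2)                        # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical BMI changes (kg/m^2) from five trials
pooled, ci = dersimonian_laird([-1.2, -0.5, -0.9, -0.3, -1.5],
                               [0.10, 0.08, 0.15, 0.05, 0.20])
print(pooled, ci)
```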

  9. Quality of radiotherapy reporting in randomized controlled trials of prostate cancer.

    PubMed

    Soon, Yu Yang; Chen, Desiree; Tan, Teng Hwee; Tey, Jeremy

    2018-06-07

    Good radiotherapy reporting in clinical trials of prostate radiotherapy is important because it allows accurate reproducibility of radiotherapy treatment and minimizes treatment variations that can affect patient outcomes. The aim of our study was to assess the quality of prostate radiotherapy (RT) treatment reporting in randomized controlled trials in prostate cancer. We searched MEDLINE for randomized trials of prostate cancer, published from 1996 to 2016, that included prostate RT as one of the intervention arms. We assessed whether the investigators reported ten criteria adequately in the trial reports: RT dose prescription method; RT dose-planning procedures; organs at risk (OAR) dose constraints; target volume definition; simulation procedures; treatment verification procedures; total RT dose; fractionation schedule; conduct of quality assurance (QA); and presence or absence of deviations in RT treatment planning and delivery. We performed multivariate logistic regression to determine the factors that may influence the quality of reporting. We found 59 eligible trials. There was significant variability in the quality of reporting. Target volume definition, total RT dose and fractionation schedule were reported adequately in 97% of included trials. OAR constraints, simulation procedures and presence or absence of deviations in RT treatment planning and delivery were reported adequately in 30% of included trials. Twenty-four trials (40%) reported seven or more criteria adequately. Multivariable logistic analysis showed that trials that published their quality assurance results and cooperative group trials were more likely to have adequate quality of reporting for at least seven criteria. There is significant variability in the quality of reporting on prostate radiotherapy treatment in randomized trials of prostate cancer. Consensus guidelines are needed to standardize the reporting of radiotherapy treatment in randomized trials.

  10. Biases and power for groups comparison on subjective health measurements.

    PubMed

    Hamel, Jean-François; Hardouin, Jean-Benoit; Le Neel, Tanguy; Kubis, Gildas; Roquelaure, Yves; Sébille, Véronique

    2012-01-01

    Subjective health measurements are increasingly used in clinical research, particularly for comparisons of patient groups. Two main types of analytical strategies can be used for such data: so-called classical test theory (CTT), relying on observed scores, and models from item response theory (IRT), relying on a response model relating the item responses to a latent parameter, often called the latent trait. Whether IRT or CTT would be the most appropriate method to compare two independent groups of patients on a patient-reported outcomes measurement remains unknown and was investigated using simulations. For CTT-based analyses, group comparison was performed using a t-test on the scores. For IRT-based analyses, several methods were compared, according to whether the Rasch model was considered with random effects or with fixed effects, and whether the group effect was included as a covariate or not. Individual latent trait values were estimated using either a deterministic method or stochastic approaches. Latent traits were then compared with a t-test. Finally, a two-step method was performed to compare the latent trait distributions, and a Wald test was performed to test the group effect in the Rasch model including group covariates. The only unbiased IRT-based method was the Wald test of the group covariate, performed on the random-effects Rasch model. This model displayed the highest observed power, which was similar to the power of the score t-test. These results need to be extended to the case frequently encountered in practice where data are missing and possibly informative.
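
    For readers unfamiliar with the response model mentioned above, a minimal Rasch-model simulation of two groups differing in their latent trait (the items, group sizes, and 0.5 shift are illustrative; the estimation strategies compared in the paper are far more involved):

```python
# Minimal Rasch-model sketch: simulate item responses for two groups whose
# latent-trait distributions differ by 0.5. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)

def rasch_prob(theta, b):
    """P(correct/endorsed response) under the Rasch model: logit = theta - b."""
    return 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))

b = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])    # item difficulties
theta_a = rng.normal(0.0, 1.0, 500)          # group A latent traits
theta_b = rng.normal(0.5, 1.0, 500)          # group B, shifted by 0.5

scores_a = (rng.random((500, 5)) < rasch_prob(theta_a, b)).sum(axis=1)
scores_b = (rng.random((500, 5)) < rasch_prob(theta_b, b)).sum(axis=1)
print(scores_a.mean(), scores_b.mean())      # group B scores higher on average
```

A CTT analysis would t-test these sum scores directly; the IRT analyses discussed above instead work with the latent parameters that generated them.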

  11. Measuring the volume of brain tumour and determining its location in T2-weighted MRI images using hidden Markov random field: expectation maximization algorithm

    NASA Astrophysics Data System (ADS)

    Mat Jafri, Mohd. Zubir; Abdulbaqi, Hayder Saad; Mutter, Kussay N.; Mustapha, Iskandar Shahrim; Omar, Ahmad Fairuz

    2017-06-01

    A brain tumour is an abnormal growth of tissue in the brain. Most tumour volume measurement processes are carried out manually by the radiographer and radiologist without relying on any automated program. This manual method is a time-consuming task and may give inaccurate results. Treatment, diagnosis, and signs and symptoms of brain tumours mainly depend on the tumour volume and its location. In this paper, an approach is proposed to improve volume measurement of brain tumours, along with a new method to determine the brain tumour location. The current study presents a hybrid method that combines two techniques. One is hidden Markov random field with expectation maximization (HMRF-EM), which provides an initial classification of the image. The other employs thresholding, which enables the final segmentation. In this method, the tumour volume is calculated using voxel dimension measurements. The brain tumour location was determined accurately in T2-weighted MRI images using a new algorithm. According to the results, this process proved more useful than the manual method. Thus, it provides the possibility of calculating the volume and determining the location of a brain tumour.
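
    The voxel-dimension volume calculation described above amounts to multiplying the segmented voxel count by the volume of one voxel, and a centroid gives a crude location. A sketch with an assumed voxel spacing (real spacing comes from the image header, and the mask below is synthetic):

```python
# Sketch: volume and crude location from a binary segmentation mask
# (assumed 1 x 1 x 5 mm voxel spacing; a real value comes from the header).
import numpy as np

def tumour_volume_mm3(mask: np.ndarray, spacing=(1.0, 1.0, 5.0)) -> float:
    """Volume = (number of segmented voxels) x (voxel volume in mm^3)."""
    voxel_volume = spacing[0] * spacing[1] * spacing[2]
    return float(np.count_nonzero(mask)) * voxel_volume

mask = np.zeros((64, 64, 10), dtype=bool)
mask[20:30, 20:30, 2:5] = True            # a 10 x 10 x 3 voxel "tumour"

print(tumour_volume_mm3(mask))            # 300 voxels x 5 mm^3 = 1500.0 mm^3
centre = np.argwhere(mask).mean(axis=0)   # voxel-space centroid as a crude location
print(centre)
```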

  12. Comparison of Time-to-First Event and Recurrent Event Methods in Randomized Clinical Trials.

    PubMed

    Claggett, Brian; Pocock, Stuart; Wei, L J; Pfeffer, Marc A; McMurray, John J V; Solomon, Scott D

    2018-03-27

    Background: Most Phase-3 trials feature time-to-first-event endpoints for their primary and/or secondary analyses. In chronic diseases where a clinical event can occur more than once, recurrent-event methods have been proposed to more fully capture disease burden and have been assumed to improve statistical precision and power compared to conventional "time-to-first" methods. Methods: To better characterize factors that influence the statistical properties of recurrent-event and time-to-first methods in the evaluation of randomized therapy, we repeatedly simulated trials with 1:1 randomization of 4000 patients to active vs. control therapy, with a true patient-level risk reduction of 20% (i.e., RR = 0.80). For patients who discontinued active therapy after a first event, we assumed their risk subsequently reverted to their original placebo-level risk. Through simulation, we varied a) the degree of between-patient heterogeneity of risk and b) the extent of treatment discontinuation. Findings were compared with those from actual randomized clinical trials. Results: As the degree of between-patient heterogeneity of risk was increased, both time-to-first and recurrent-event methods lost statistical power to detect a true risk reduction, and confidence intervals widened. The recurrent-event analyses continued to estimate the true RR = 0.80 as heterogeneity increased, while the Cox model produced estimates that were attenuated. The power of recurrent-event methods declined as the rate of study drug discontinuation post-event increased. Recurrent-event methods provided greater power than time-to-first methods in scenarios where drug discontinuation was ≤30% following a first event, lesser power with drug discontinuation rates of ≥60%, and comparable power otherwise. We confirmed in several actual trials in chronic heart failure that treatment effect estimates were attenuated when estimated via the Cox model and that the increased statistical power from recurrent-event methods was most pronounced in trials with lower treatment discontinuation rates. Conclusions: We find that the statistical power of both recurrent-event and time-to-first methods is reduced by increasing heterogeneity of patient risk, a parameter not included in conventional power and sample size formulas. Data from real clinical trials are consistent with the simulation studies, confirming that the greatest statistical gains from use of recurrent-event methods occur in the presence of high patient heterogeneity and low rates of study drug discontinuation.
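
    A simplified version of this simulation design can be sketched as follows (assumptions: gamma-distributed patient frailty for the between-patient heterogeneity, Poisson event counts, and no treatment discontinuation; this is not the authors' exact setup):

```python
# Simplified sketch of a recurrent-event trial simulation: gamma frailty for
# between-patient heterogeneity, Poisson counts, true rate ratio 0.80.
import numpy as np

rng = np.random.default_rng(2)
n = 20000                                   # patients per simulated arm
base_rate, true_rr = 0.8, 0.80              # control event rate, true risk reduction 20%
heterogeneity = 1.0                         # variance of the patient-level frailty

frailty = rng.gamma(1.0 / heterogeneity, heterogeneity, size=2 * n)  # mean 1
rate = base_rate * frailty
rate[:n] *= true_rr                         # first n patients receive active therapy

counts = rng.poisson(rate)                  # recurrent-event counts per patient
rr_hat = counts[:n].mean() / counts[n:].mean()   # crude rate-ratio estimate
print(rr_hat)                               # close to the true 0.80
```

Repeating such simulations while varying the frailty variance (and adding a discontinuation mechanism) is how power and bias comparisons like those above are built up.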

  13. Effects of a novel method for enteral nutrition infusion involving a viscosity-regulating pectin solution: A multicenter randomized controlled trial.

    PubMed

    Tabei, Isao; Tsuchida, Shigeru; Akashi, Tetsuro; Ookubo, Katsuichiro; Hosoda, Satoru; Furukawa, Yoshiyuki; Tanabe, Yoshiaki; Tamura, Yoshiko

    2018-02-01

    The initial complications associated with infusion of enteral nutrition (EN) for clinical and nutritional care are vomiting, aspiration pneumonia, and diarrhea, and there are many recommendations to prevent them. A novel method involving a viscosity-regulating pectin solution has been developed. In Japan, this method, along with other so-called "semi-solid EN" approaches, has been widely used in practice. However, there has been no randomized clinical trial to prove the efficiency and safety of a viscosity-regulating pectin solution in EN management. Therefore, we planned and initiated a multicenter randomized controlled trial to determine its efficiency and safety. This study included 34 patients from the 7 participating medical institutions. Institutional review board (IRB) approval was obtained from all participating institutions. Patients who required EN management were enrolled and randomly assigned to the viscosity regulation of enteral feeding (VREF) group or the control group. The VREF group (n = 15) was managed with the addition of a viscosity-regulating pectin solution. The control group (n = 12) was managed with conventional EN administration, usually in a gradual step-up manner. Daily clinical symptoms of pneumonia, fever, vomiting, and diarrhea; defecation frequency; and stool form were observed over the 2-week trial period. The dose of EN and duration of infusion were also examined. A favorable trend in clinical symptoms was noticed in the VREF group. No significant differences were observed in episodes of pneumonia, fever, vomiting, and diarrhea between the 2 groups. An apparent reduction in infusion duration and hardening of stool form were noted in the VREF group. The novel method involving a viscosity-regulating pectin solution with EN administration can be performed clinically as safely and efficiently as the conventional method. Moreover, there were benefits with the novel method, such as improvement in stool form, a shorter time for EN infusion, and a reduction in vomiting episodes. This indicates some potential advantages in quality of life among patients receiving this novel method. Copyright © 2017 European Society for Clinical Nutrition and Metabolism. Published by Elsevier Ltd. All rights reserved.

  14. The Enhancement of 3D Scans Depth Resolution Obtained by Confocal Scanning of Porous Materials

    NASA Astrophysics Data System (ADS)

    Martisek, Dalibor; Prochazkova, Jana

    2017-12-01

    The 3D reconstruction of simply structured materials using a confocal microscope is widely used in many different areas, including civil engineering. Nonetheless, scans of porous materials such as concrete or cement paste are highly problematic. A well-known problem of these scans is low depth resolution in comparison to the horizontal and vertical resolution. The degradation of the image depth resolution is caused by systematic errors and especially by various random events. Our method focuses on the elimination of such random events, mainly additive noise. We use an averaging method based on the Lindeberg-Lévy theorem that improves the final depth resolution to a level comparable with the horizontal and vertical resolution. Moreover, using the least squares method, we also precisely determine the limit value of the depth resolution. Therefore, we can continuously evaluate the difference between the current resolution and the optimal one. This substantially simplifies the scanning process because the operator can easily determine the required number of scans.
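
    The core of the averaging argument is the central limit theorem: averaging N independent scans shrinks the standard deviation of additive noise by a factor of √N. A sketch with a synthetic depth profile (the noise level and profile are illustrative):

```python
# Sketch of the averaging idea: residual additive noise falls as 1/sqrt(N)
# when N independent scans are averaged. Synthetic profile, illustrative sigma.
import numpy as np

rng = np.random.default_rng(3)
true_depth = np.linspace(0.0, 10.0, 1000)        # idealized depth profile

def noisy_scan():
    """One scan corrupted by additive Gaussian noise."""
    return true_depth + rng.normal(0.0, 0.5, true_depth.shape)

noise = {}
for n_scans in (1, 4, 16, 64):
    avg = np.mean([noisy_scan() for _ in range(n_scans)], axis=0)
    noise[n_scans] = np.sqrt(np.mean((avg - true_depth) ** 2))
    print(f"N={n_scans:3d}: residual noise RMS = {noise[n_scans]:.3f}")
```

Each fourfold increase in the number of scans roughly halves the residual noise, which is what lets the operator trade scanning time for depth resolution in a predictable way.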

  15. Land Covers Classification Based on Random Forest Method Using Features from Full-Waveform LIDAR Data

    NASA Astrophysics Data System (ADS)

    Ma, L.; Zhou, M.; Li, C.

    2017-09-01

    In this study, a Random Forest (RF) based land cover classification method is presented to predict the types of land cover in the Miyun area. The returned full waveforms, which were acquired by a LiteMapper 5600 airborne LiDAR system, were processed, including waveform filtering, waveform decomposition and feature extraction. The commonly used features, namely distance, intensity, Full Width at Half Maximum (FWHM), skewness and kurtosis, were extracted. These waveform features were used as attributes of training data for generating the RF prediction model. The RF prediction model was applied to predict the types of land cover in the Miyun area as trees, buildings, farmland and ground. The classification results for these four types of land cover were evaluated against ground truth information acquired from CCD image data of the same region. The RF classification results were compared with those of an SVM method and showed better performance. The RF classification accuracy reached 89.73% and the classification Kappa was 0.8631.
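
    The waveform features named above can be illustrated on a synthetic Gaussian return pulse (a real pulse would come from the waveform decomposition step; all numbers below are made up):

```python
# Sketch: the waveform features listed above (distance, intensity, FWHM,
# skewness, kurtosis) computed for a synthetic Gaussian return pulse.
import numpy as np

t = np.linspace(0, 100, 2001)                        # time axis, e.g. ns
sigma, mu, amp = 4.0, 50.0, 120.0
w = amp * np.exp(-0.5 * ((t - mu) / sigma) ** 2)     # one decomposed echo

# Intensity and distance (range) proxies
intensity = w.max()
distance = t[np.argmax(w)]

# FWHM: width where the pulse exceeds half its peak (~2.355 * sigma for a Gaussian)
above = t[w >= intensity / 2.0]
fwhm = above[-1] - above[0]

# Moment-based shape features, weighting time by normalized amplitude
p = w / w.sum()
mean = np.sum(p * t)
var = np.sum(p * (t - mean) ** 2)
skewness = np.sum(p * (t - mean) ** 3) / var ** 1.5  # ~0 for a symmetric pulse
kurtosis = np.sum(p * (t - mean) ** 4) / var ** 2    # ~3 for a Gaussian pulse
print(intensity, distance, fwhm, skewness, kurtosis)
```

Stacking one such feature vector per echo produces the attribute table that the RF model is trained on.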

  16. Seismic noise attenuation using an online subspace tracking algorithm

    NASA Astrophysics Data System (ADS)

    Zhou, Yatong; Li, Shuhua; Zhang, Dong; Chen, Yangkang

    2018-02-01

    We propose a new low-rank based noise attenuation method using an efficient algorithm for tracking subspaces from highly corrupted seismic observations. The subspace tracking algorithm requires only basic linear-algebraic manipulations. The algorithm is derived by analysing incremental gradient descent on the Grassmannian manifold of subspaces. When the multidimensional seismic data are mapped to a low-rank space, the subspace tracking algorithm can be applied directly to the input low-rank matrix to estimate the useful signals. Since the subspace tracking algorithm is an online algorithm, it is more robust to random noise than the traditional truncated singular value decomposition (TSVD) based subspace tracking algorithm. Compared with state-of-the-art algorithms, the proposed denoising method obtains better performance. More specifically, the proposed method outperforms the TSVD-based singular spectrum analysis method, leaving less residual noise while halving the computational cost. Several synthetic and field data examples with different levels of complexity demonstrate the effectiveness and robustness of the presented algorithm in rejecting different types of noise, including random noise, spiky noise, blending noise, and coherent noise.
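
    For contrast with the proposed online tracker, the TSVD baseline mentioned above can be sketched in a few lines: truncating the SVD of a noisy low-rank matrix removes most of the noise energy (the matrix, rank, and noise level are illustrative):

```python
# Sketch of the TSVD baseline (not the proposed online tracker): keep only the
# top singular components of a noisy low-rank matrix. Synthetic data.
import numpy as np

rng = np.random.default_rng(4)
rank = 3
signal = rng.standard_normal((200, rank)) @ rng.standard_normal((rank, 100))
noisy = signal + 0.5 * rng.standard_normal(signal.shape)

U, s, Vt = np.linalg.svd(noisy, full_matrices=False)
denoised = (U[:, :rank] * s[:rank]) @ Vt[:rank]      # keep the top `rank` components

err_before = np.linalg.norm(noisy - signal)
err_after = np.linalg.norm(denoised - signal)
print(err_before, err_after)                         # err_after is much smaller
```

An online tracker updates the estimated subspace sample by sample instead of recomputing a full SVD, which is where the robustness and cost advantages claimed above come from.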

  17. A general method for handling missing binary outcome data in randomized controlled trials

    PubMed Central

    Jackson, Dan; White, Ian R; Mason, Dan; Sutton, Stephen

    2014-01-01

    Aims: The analysis of randomized controlled trials with incomplete binary outcome data is challenging. We develop a general method for exploring the impact of missing data in such trials, with a focus on abstinence outcomes. Design: We propose a sensitivity analysis where standard analyses, which could include 'missing = smoking' and 'last observation carried forward', are embedded in a wider class of models. Setting: We apply our general method to data from two smoking cessation trials. Participants: A total of 489 and 1758 participants from two smoking cessation trials. Measurements: The abstinence outcomes were obtained using telephone interviews. Findings: The estimated intervention effects from both trials depend on the sensitivity parameters used. The findings differ considerably in magnitude and statistical significance under quite extreme assumptions about the missing data, but are reasonably consistent under more moderate assumptions. Conclusions: A new method for undertaking sensitivity analyses when handling missing data in trials with binary outcomes allows a wide range of assumptions about the missing data to be assessed. In two smoking cessation trials the results were insensitive to all but extreme assumptions. PMID:25171441
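
    One common way to embed "missing = smoking" in a wider class of models is an informative missingness odds ratio (IMOR); the sketch below uses that parameterization as an assumption, not necessarily the authors' exact model, with made-up counts:

```python
# Sketch: sensitivity of an estimated abstinence rate to an assumed informative
# missingness odds ratio (IMOR). Assumed parameterization, hypothetical counts.
def abstinence_rate(n_quit: int, n_smoke: int, n_missing: int, imor: float) -> float:
    """Abstinence rate when the odds of abstinence among missing participants
    are `imor` times the odds among observed participants."""
    p_obs = n_quit / (n_quit + n_smoke)
    odds_missing = imor * p_obs / (1.0 - p_obs)
    p_missing = odds_missing / (1.0 + odds_missing)
    n = n_quit + n_smoke + n_missing
    return (n_quit + n_missing * p_missing) / n

# Hypothetical arm: 60 abstinent, 140 smoking, 50 missing
for imor in (0.0, 0.5, 1.0):          # IMOR = 0 reproduces "missing = smoking"
    print(f"IMOR={imor}: rate = {abstinence_rate(60, 140, 50, imor):.3f}")
```

Sweeping the sensitivity parameter from extreme to moderate values, as above, is what lets one judge whether a trial's conclusion is robust to the missing-data assumptions.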

  18. Multilevel mixed effects parametric survival models using adaptive Gauss-Hermite quadrature with application to recurrent events and individual participant data meta-analysis.

    PubMed

    Crowther, Michael J; Look, Maxime P; Riley, Richard D

    2014-09-28

    Multilevel mixed effects survival models are used in the analysis of clustered survival data, such as repeated events, multicenter clinical trials, and individual participant data (IPD) meta-analyses, to investigate heterogeneity in baseline risk and covariate effects. In this paper, we extend parametric frailty models including the exponential, Weibull and Gompertz proportional hazards (PH) models and the log logistic, log normal, and generalized gamma accelerated failure time models to allow any number of normally distributed random effects. Furthermore, we extend the flexible parametric survival model of Royston and Parmar, modeled on the log-cumulative hazard scale using restricted cubic splines, to include random effects while also allowing for non-PH (time-dependent effects). Maximum likelihood is used to estimate the models utilizing adaptive or nonadaptive Gauss-Hermite quadrature. The methods are evaluated through simulation studies representing clinically plausible scenarios of a multicenter trial and IPD meta-analysis, showing good performance of the estimation method. The flexible parametric mixed effects model is illustrated using a dataset of patients with kidney disease and repeated times to infection and an IPD meta-analysis of prognostic factor studies in patients with breast cancer. User-friendly Stata software is provided to implement the methods. Copyright © 2014 John Wiley & Sons, Ltd.

  19. Force Limited Random Vibration Test of TESS Camera Mass Model

    NASA Technical Reports Server (NTRS)

    Karlicek, Alexandra; Hwang, James Ho-Jin; Rey, Justin J.

    2015-01-01

    The Transiting Exoplanet Survey Satellite (TESS) is a spaceborne instrument consisting of four wide-field-of-view CCD cameras dedicated to the discovery of exoplanets around the brightest stars. As part of the environmental testing campaign, force limiting was used to simulate a realistic random vibration launch environment. While the force-limited vibration test method is a standard approach used at multiple institutions including Jet Propulsion Laboratory (JPL), NASA Goddard Space Flight Center (GSFC), European Space Research and Technology Center (ESTEC), and Japan Aerospace Exploration Agency (JAXA), it is still difficult to find an actual implementation process in the literature. This paper describes the step-by-step process of how the force limit method was developed and applied to the TESS camera mass model. The process description includes the design of special fixtures to mount the test article for properly installing force transducers, development of the force spectral density using the semi-empirical method, estimation of the fuzzy factor (C2) based on the mass ratio between the supporting structure and the test article, subsequent validation of the C2 factor during the vibration test, and calculation of the C.G. accelerations using the Root Mean Square (RMS) reaction force in the spectral domain and the peak reaction force in the time domain.
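
    The semi-empirical force spectrum mentioned above has a simple closed form, roughly S_FF(f) = C2 * M^2 * S_AA(f) below the first resonance and rolled off above it, in the style of NASA-HDBK-7004 practice. The sketch below applies that form to a tabulated acceleration PSD; the C2 value, roll-off exponent, break frequency, and function name are illustrative inputs, not TESS numbers.

```python
def force_limit_psd(accel_psd, mass, c2, f0, rolloff=2.0):
    """Semi-empirical force-limit spectral density.

        S_FF(f) = C2 * M^2 * S_AA(f)                for f <= f0
        S_FF(f) = C2 * M^2 * S_AA(f) * (f0/f)^n     for f >  f0

    accel_psd maps frequency -> acceleration PSD, mass is the load
    mass, f0 the first resonance, and c2 the 'fuzzy factor' estimated
    from the mass ratio (all values here are illustrative).
    """
    return {f: c2 * mass ** 2 * saa
               * (1.0 if f <= f0 else (f0 / f) ** rolloff)
            for f, saa in accel_psd.items()}
```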

  20. Comparisons between physics-based, engineering, and statistical learning models for outdoor sound propagation.

    PubMed

    Hart, Carl R; Reznicek, Nathan J; Wilson, D Keith; Pettit, Chris L; Nykaza, Edward T

    2016-05-01

    Many outdoor sound propagation models exist, ranging from highly complex physics-based simulations to simplified engineering calculations, and more recently, highly flexible statistical learning methods. Several engineering and statistical learning models are evaluated by using a particular physics-based model, namely, a Crank-Nicholson parabolic equation (CNPE), as a benchmark. Narrowband transmission loss values predicted with the CNPE, based upon a simulated data set of meteorological, boundary, and source conditions, act as simulated observations. In the simulated data set sound propagation conditions span from downward refracting to upward refracting, for acoustically hard and soft boundaries, and low frequencies. Engineering models used in the comparisons include the ISO 9613-2 method, Harmonoise, and Nord2000 propagation models. Statistical learning methods used in the comparisons include bagged decision tree regression, random forest regression, boosting regression, and artificial neural network models. Computed skill scores are relative to sound propagation in a homogeneous atmosphere over a rigid ground. Overall skill scores for the engineering noise models are 0.6%, -7.1%, and 83.8% for the ISO 9613-2, Harmonoise, and Nord2000 models, respectively. Overall skill scores for the statistical learning models are 99.5%, 99.5%, 99.6%, and 99.6% for bagged decision tree, random forest, boosting, and artificial neural network regression models, respectively.
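
    The skill scores quoted above rate each model against a reference prediction (here, homogeneous-atmosphere propagation over rigid ground). One common mean-squared-error-based definition can be sketched as follows; the paper's exact metric is not reproduced here, so treat this form as an assumption.

```python
def skill_score(pred, obs, reference):
    """Skill score of a model relative to a reference prediction,
    in percent: 100% is perfect, 0% matches the reference, and
    negative values are worse than the reference."""
    mse = sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)
    mse_ref = sum((r - o) ** 2 for r, o in zip(reference, obs)) / len(obs)
    return 100.0 * (1.0 - mse / mse_ref)
```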

  1. Random forest regression for magnetic resonance image synthesis.

    PubMed

    Jog, Amod; Carass, Aaron; Roy, Snehashis; Pham, Dzung L; Prince, Jerry L

    2017-01-01

    By choosing different pulse sequences and their parameters, magnetic resonance imaging (MRI) can generate a large variety of tissue contrasts. This very flexibility, however, can yield inconsistencies with MRI acquisitions across datasets or scanning sessions that can in turn cause inconsistent automated image analysis. Although image synthesis of MR images has been shown to be helpful in addressing this problem, an inability to synthesize both T2-weighted brain images that include the skull and FLuid Attenuated Inversion Recovery (FLAIR) images has been reported. The method described herein, called REPLICA, addresses these limitations. REPLICA is a supervised random forest image synthesis approach that learns a nonlinear regression to predict intensities of alternate tissue contrasts given specific input tissue contrasts. Experimental results include direct image comparisons between synthetic and real images, results from image analysis tasks on both synthetic and real images, and comparison against other state-of-the-art image synthesis methods. REPLICA is computationally fast, and is shown to be comparable to other methods on tasks they are able to perform. Additionally, REPLICA has the capability to synthesize both T2-weighted images of the full head and FLAIR images, and perform intensity standardization between different imaging datasets. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. A general framework for the use of logistic regression models in meta-analysis.

    PubMed

    Simmonds, Mark C; Higgins, Julian PT

    2016-12-01

    Where individual participant data are available for every randomised trial in a meta-analysis of dichotomous event outcomes, "one-stage" random-effects logistic regression models have been proposed as a way to analyse these data. Such models can also be used even when individual participant data are not available and we have only summary contingency table data. One benefit of this one-stage regression model over conventional meta-analysis methods is that it maximises the correct binomial likelihood for the data and so does not require the common assumption that effect estimates are normally distributed. A second benefit of using this model is that it may be applied, with only minor modification, in a range of meta-analytic scenarios, including meta-regression, network meta-analyses and meta-analyses of diagnostic test accuracy. This single model can potentially replace the variety of often complex methods used in these areas. This paper considers, with a range of meta-analysis examples, how random-effects logistic regression models may be used in a number of different types of meta-analyses. This one-stage approach is compared with widely used meta-analysis methods including Bayesian network meta-analysis and the bivariate and hierarchical summary receiver operating characteristic (ROC) models for meta-analyses of diagnostic test accuracy. © The Author(s) 2014.

  3. Automated diagnosis of myositis from muscle ultrasound: Exploring the use of machine learning and deep learning methods

    PubMed Central

    Burlina, Philippe; Billings, Seth; Joshi, Neil

    2017-01-01

    Objective To evaluate the use of ultrasound coupled with machine learning (ML) and deep learning (DL) techniques for automated or semi-automated classification of myositis. Methods Eighty subjects (19 with inclusion body myositis (IBM), 14 with polymyositis (PM), 14 with dermatomyositis (DM), and 33 normal (N) subjects) were included in this study, in which 3214 muscle ultrasound images of 7 muscles (observed bilaterally) were acquired. We considered three classification problems: (A) normal vs. affected (DM, PM, IBM); (B) normal vs. IBM patients; and (C) IBM vs. other types of myositis (DM or PM). We studied the use of an automated DL method using deep convolutional neural networks (DL-DCNNs) for diagnostic classification and compared it with a semi-automated conventional ML method based on random forests (ML-RF) and “engineered” features. We used the known clinical diagnosis as the gold standard for evaluating performance of muscle classification. Results The performance of the DL-DCNN method resulted in accuracies ± standard deviation of 76.2% ± 3.1% for problem (A), 86.6% ± 2.4% for (B) and 74.8% ± 3.9% for (C), while the ML-RF method led to accuracies of 72.3% ± 3.3% for problem (A), 84.3% ± 2.3% for (B) and 68.9% ± 2.5% for (C). Conclusions This study demonstrates the application of machine learning methods for automatically or semi-automatically classifying inflammatory muscle disease using muscle ultrasound. Compared to the conventional random forest machine learning method used here, which has the drawback of requiring manual delineation of muscle/fat boundaries, DCNN-based classification by and large improved the accuracies in all classification problems while providing a fully automated approach to classification. PMID:28854220

  4. Model selection for logistic regression models

    NASA Astrophysics Data System (ADS)

    Duller, Christine

    2012-09-01

    Model selection for logistic regression models decides which of the candidate regressors have an effect and hence should be included in the final model. The second interesting question is whether a certain factor is heterogeneous among some subsets, i.e. whether the model should include a random intercept or not. In this paper these questions are answered with classical as well as Bayesian methods. The applications show results from recent research projects in medicine and business administration.

  5. Secure content objects

    DOEpatents

    Evans, William D [Cupertino, CA

    2009-02-24

    A secure content object protects electronic documents from unauthorized use. The secure content object includes an encrypted electronic document, a multi-key encryption table having at least one multi-key component, an encrypted header and a user interface device. The encrypted document is encrypted using a document encryption key associated with a multi-key encryption method. The encrypted header includes an encryption marker formed by a random number followed by a derivable variation of the same random number. The user interface device enables a user to input a user authorization. The user authorization is combined with each of the multi-key components in the multi-key encryption table and used to try to decrypt the encrypted header. If the encryption marker is successfully decrypted, the electronic document may be decrypted. Multiple electronic documents or a document and annotations may be protected by the secure content object.
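
    The marker check can be sketched as follows. The 'derivable variation' is taken here to be the bitwise complement, and the marker length and toy XOR stream are illustrative stand-ins for the patent's unspecified primitives; nothing below is secure cryptography.

```python
import os

MARKER_LEN = 8  # bytes of random marker (illustrative choice)

def xor_bytes(data, key):
    # Toy repeating-key XOR, for illustration only; not secure.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def make_header(payload):
    """Prepend the encryption marker: a random number followed by a
    derivable variation of it (here, its bitwise complement)."""
    nonce = os.urandom(MARKER_LEN)
    variation = bytes(b ^ 0xFF for b in nonce)
    return nonce + variation + payload

def try_decrypt(encrypted_header, candidate_key):
    """Decrypt the header with a candidate key; accept the key only if
    the marker's two halves stand in the expected relationship."""
    header = xor_bytes(encrypted_header, candidate_key)
    nonce = header[:MARKER_LEN]
    variation = header[MARKER_LEN:2 * MARKER_LEN]
    if all(a ^ b == 0xFF for a, b in zip(nonce, variation)):
        return header[2 * MARKER_LEN:]  # marker checks out
    return None  # wrong key; payload stays opaque
```

    The point of the marker is that a wrong authorization fails the self-consistency check without revealing anything about the document key.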

  6. 2GETHER - The Dual Protection Project: Design and rationale of a randomized controlled trial to increase dual protection strategy selection and adherence among African American adolescent females

    PubMed Central

    Ewing, Alexander C.; Kottke, Melissa J.; Kraft, Joan Marie; Sales, Jessica M.; Brown, Jennifer L.; Goedken, Peggy; Wiener, Jeffrey; Kourtis, Athena P.

    2018-01-01

    Background African American adolescent females are at elevated risk for unintended pregnancy and sexually transmitted infections (STIs). Dual protection (DP) is defined as concurrent prevention of pregnancy and STIs. This can be achieved by abstinence, consistent condom use, or the dual methods of condoms plus an effective non-barrier contraceptive. Previous clinic-based interventions showed short-term effects on increasing dual method use, but evidence of sustained effects on dual method use and decreased incident pregnancies and STIs are lacking. Methods/Design This manuscript describes the 2GETHER Project. 2GETHER is a randomized controlled trial of a multi-component intervention to increase dual protection use among sexually active African American females aged 14–19 years attending a Title X clinic in Atlanta, GA, who do not desire pregnancy. The intervention is clinic-based and includes a culturally tailored interactive multimedia component and counseling sessions, both to assist in selection of a DP method and to reinforce use of the DP method. The participants are randomized to the study intervention or the standard of care, and followed for 12 months to evaluate how the intervention influences DP method selection and adherence, pregnancy and STI incidence, and participants’ DP knowledge, intentions, and self-efficacy. Discussion The 2GETHER Project is a novel trial to reduce unintended pregnancies and STIs among African American adolescents. The intervention is unique in the comprehensive and complementary nature of its components and its individual tailoring of provider-patient interaction. If the trial interventions are shown to be effective, then it will be reasonable to assess their scalability and applicability in other populations. PMID:28007634

  7. Polymorphous computing fabric

    DOEpatents

    Wolinski, Christophe Czeslaw [Los Alamos, NM; Gokhale, Maya B [Los Alamos, NM; McCabe, Kevin Peter [Los Alamos, NM

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  8. SMSIM--Fortran programs for simulating ground motions from earthquakes: Version 2.0.--a revision of OFR 96-80-A

    USGS Publications Warehouse

    Boore, David M.

    2000-01-01

    A simple and powerful method for simulating ground motions is based on the assumption that the amplitude of ground motion at a site can be specified in a deterministic way, with a random phase spectrum modified such that the motion is distributed over a duration related to the earthquake magnitude and to distance from the source. This method of simulating ground motions often goes by the name "the stochastic method." It is particularly useful for simulating the higher-frequency ground motions of most interest to engineers, and it is widely used to predict ground motions for regions of the world in which recordings of motion from damaging earthquakes are not available. This simple method has been successful in matching a variety of ground-motion measures for earthquakes with seismic moments spanning more than 12 orders of magnitude. One of the essential characteristics of the method is that it distills what is known about the various factors affecting ground motions (source, path, and site) into simple functional forms that can be used to predict ground motions. SMSIM is a set of programs for simulating ground motions based on the stochastic method. This Open-File Report is a revision of an earlier report (Boore, 1996) describing a set of programs for simulating ground motions from earthquakes. The programs are based on modifications I have made to the stochastic method first introduced by Hanks and McGuire (1981). The report contains source codes, written in Fortran, and executables that can be used on a PC. Programs are included both for time-domain and for random vibration simulations. In addition, programs are included to produce Fourier amplitude spectra for the models used in the simulations and to convert shear velocity vs. depth into frequency-dependent amplification. The revision to the previous report is needed because the input and output files have changed significantly, and a number of new programs have been included in the set.
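
    The defining idea, a deterministic amplitude spectrum combined with a random phase spectrum, can be sketched with a toy inverse DFT. This omits the source/path/site spectral shaping and the duration windowing that SMSIM applies, and the function name is illustrative.

```python
import cmath
import math
import random

def random_phase_series(amplitude, seed=0):
    """Return a real time series whose Fourier amplitudes match the
    given deterministic spectrum |A_k| (k = 1..m) but whose phases are
    drawn at random, the defining idea of the stochastic method."""
    rng = random.Random(seed)
    m = len(amplitude)
    n = 2 * (m + 1)                  # DC and Nyquist bins left empty
    spec = [0j] * n
    for k, a in enumerate(amplitude, start=1):
        phase = rng.uniform(0.0, 2.0 * math.pi)
        spec[k] = a * cmath.exp(1j * phase)
        spec[n - k] = spec[k].conjugate()   # Hermitian symmetry -> real signal
    # Inverse DFT; O(n^2), which is fine for a sketch.
    return [sum(spec[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)).real / n
            for t in range(n)]
```

    Different seeds give different realizations with identical amplitude spectra, which is why measures of motion are typically averaged over many realizations (or obtained via random vibration theory).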

  9. Numerical simulation of a shear-thinning fluid through packed spheres

    NASA Astrophysics Data System (ADS)

    Liu, Hai Long; Moon, Jong Sin; Hwang, Wook Ryol

    2012-12-01

    Flow behaviors of a non-Newtonian fluid in spherical microstructures have been studied by a direct numerical simulation. A shear-thinning (power-law) fluid through both regular and randomly packed spheres has been numerically investigated in a representative unit cell with the tri-periodic boundary condition, employing a rigorous three-dimensional finite-element scheme combined with fictitious-domain mortar-element methods. The present scheme has been validated against published results for classical sphere-packing problems. The flow mobility of regular packing structures, including simple cubic (SC), body-centered cubic (BCC), face-centered cubic (FCC), as well as randomly packed spheres, has been investigated quantitatively by considering the amount of shear-thinning, the pressure gradient and the porosity as parameters. Furthermore, the mechanism leading to the main flow path in a highly shear-thinning fluid through randomly packed spheres has been discussed.

  10. Self-dual random-plaquette gauge model and the quantum toric code

    NASA Astrophysics Data System (ADS)

    Takeda, Koujin; Nishimori, Hidetoshi

    2004-05-01

    We study the four-dimensional Z2 random-plaquette lattice gauge theory as a model of topological quantum memory, the toric code in particular. In this model, the procedure of quantum error correction works properly in the ordered (Higgs) phase, and the phase boundary between the ordered (Higgs) and disordered (confinement) phases gives the accuracy threshold of error correction. Using self-duality of the model in conjunction with the replica method, we show that this model has exactly the same mathematical structure as that of the two-dimensional random-bond Ising model, which has been studied very extensively. This observation enables us to derive a conjecture on the exact location of the multicritical point (accuracy threshold) of the model, pc=0.889972…, and leads to several nontrivial results including bounds on the accuracy threshold in three dimensions.

  11. A Cluster-Randomized Trial of Restorative Practices: An Illustration to Spur High-Quality Research and Evaluation.

    PubMed

    Acosta, Joie D; Chinman, Matthew; Ebener, Patricia; Phillips, Andrea; Xenakis, Lea; Malone, Patrick S

    2016-01-01

    Restorative Practices in schools lack rigorous evaluation studies. As an example of rigorous school-based research, this paper describes the first randomized controlled trial of restorative practices to date, the Study of Restorative Practices. It is a 5-year, cluster-randomized controlled trial (RCT) of the Restorative Practices Intervention (RPI) in 14 middle schools in Maine to assess whether RPI impacts both positive developmental outcomes and problem behaviors and whether the effects persist during the transition from middle to high school. The two-year RPI intervention began in the 2014-2015 school year. The study's rationale and theoretical concerns are discussed along with methodological concerns including teacher professional development. The theoretical rationale and description of the methods from this study may be useful to others conducting rigorous research and evaluation in this area.

  12. Statistical analysis of loopy belief propagation in random fields

    NASA Astrophysics Data System (ADS)

    Yasuda, Muneki; Kataoka, Shun; Tanaka, Kazuyuki

    2015-10-01

    Loopy belief propagation (LBP), which is equivalent to the Bethe approximation in statistical mechanics, is a message-passing-type inference method that is widely used to analyze systems based on Markov random fields (MRFs). In this paper, we propose a message-passing-type method to analytically evaluate the quenched average of LBP in random fields by using the replica cluster variation method. The proposed analytical method is applicable to general pairwise MRFs with random fields whose distributions differ from each other and can give the quenched averages of the Bethe free energies over random fields, which are consistent with numerical results. The order of its computational cost is equivalent to that of standard LBP. In the latter part of this paper, we describe the application of the proposed method to Bayesian image restoration, in which we observed that our theoretical results are in good agreement with the numerical results for natural images.
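
    For reference, standard sum-product LBP on a binary pairwise MRF (the algorithm whose quenched average the paper analyzes) fits in a short routine. The flooding schedule and fixed iteration count below are arbitrary choices for illustration, not part of the paper's replica analysis.

```python
import math

def loopy_bp(nodes, edges, unary, pairwise, iters=50):
    """Sum-product loopy belief propagation on a binary pairwise MRF.

    unary[i][x] and pairwise[(i, j)][xi][xj] are positive potentials.
    Returns approximate marginals b[i][x]; exact when the graph is a tree.
    """
    nbrs = {i: [] for i in nodes}
    for a, b in edges:
        nbrs[a].append(b)
        nbrs[b].append(a)
    msgs = {(i, j): [0.5, 0.5] for a, b in edges for i, j in ((a, b), (b, a))}

    def pot(i, j, xi, xj):
        return (pairwise[(i, j)][xi][xj] if (i, j) in pairwise
                else pairwise[(j, i)][xj][xi])

    for _ in range(iters):
        new = {}
        for i, j in msgs:
            m = [sum(unary[i][xi] * pot(i, j, xi, xj)
                     * math.prod(msgs[(k, i)][xi] for k in nbrs[i] if k != j)
                     for xi in (0, 1))
                 for xj in (0, 1)]
            z = m[0] + m[1]
            new[(i, j)] = [m[0] / z, m[1] / z]   # normalize for stability
        msgs = new

    beliefs = {}
    for i in nodes:
        raw = [unary[i][x] * math.prod(msgs[(k, i)][x] for k in nbrs[i])
               for x in (0, 1)]
        z = raw[0] + raw[1]
        beliefs[i] = [raw[0] / z, raw[1] / z]
    return beliefs
```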

  13. Acupuncture for ankle sprain: systematic review and meta-analysis

    PubMed Central

    2013-01-01

    Background Ankle sprain is one of the most frequently encountered musculoskeletal injuries; however, the efficacy of acupuncture in treating ankle sprains remains uncertain. We therefore performed a systematic review to evaluate the evidence regarding acupuncture for ankle sprains. Methods We searched 15 data sources and two trial registries up to February 2012. Randomized controlled trials of acupuncture were included if they involved patients with ankle sprains and reported outcomes of symptom improvement, including pain. A Cochrane risk of bias assessment tool was used. Risk ratio (RR) or mean difference (MD) was calculated with 95% confidence intervals (CIs) in a random effects model. Subgroup analyses were performed based on acupuncture type, grade of sprain, and control type. Sensitivity analyses were also performed with respect to risk of bias, sample size, and outcomes reported. Results Seventeen trials involving 1820 participants were included. Trial quality was generally poor, with just three reporting adequate methods of randomization and only one a method of allocation concealment. Significantly more participants in acupuncture groups reported global symptom improvement compared with no acupuncture groups (RR of symptoms persisting with acupuncture = 0.56, 95% CI 0.42–0.77). However, this is probably an overestimate due to the heterogeneity (I2 = 51%) and high risk of bias of the included studies. Acupuncture as an add-on treatment also improved global symptoms compared with other treatments only, without significant variability (RR 0.61, 95% CI 0.51–0.73, I2 = 1%). The benefit of acupuncture remained significant when the analysis was limited to two studies with a low risk of bias. Acupuncture was more effective than various controls in relieving pain, facilitating return to normal activity, and promoting quality of life, but these analyses were based on only a small number of studies. Acupuncture did not appear to be associated with adverse events. Conclusions Given methodological shortcomings and the small number of high-quality primary studies, the available evidence is insufficient to recommend acupuncture as an evidence-based treatment option. This calls for further rigorous investigations. PMID:23496981
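
    Random-effects pooling of the kind behind the risk ratios and I2 values above can be sketched with the DerSimonian-Laird estimator. This is one standard estimator, not necessarily the one the review's software used, and the function name and toy inputs are illustrative.

```python
import math

def random_effects_pool(log_rr, var):
    """DerSimonian-Laird random-effects pooling of study log risk ratios.

    log_rr: per-study log RR estimates; var: their within-study variances.
    Returns (pooled RR, 95% CI, I^2 heterogeneity percentage).
    """
    w = [1.0 / v for v in var]
    fixed = sum(wi * yi for wi, yi in zip(w, log_rr)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_rr))
    df = len(log_rr) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                       # between-study variance
    i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
    w_re = [1.0 / (v + tau2) for v in var]
    pooled = sum(wi * yi for wi, yi in zip(w_re, log_rr)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
    return math.exp(pooled), ci, i2
```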

  14. Reducing Unintended Pregnancies Through Web-Based Reproductive Life Planning and Contraceptive Action Planning among Privately Insured Women: Study Protocol for the MyNewOptions Randomized, Controlled Trial.

    PubMed

    Chuang, Cynthia H; Velott, Diana L; Weisman, Carol S; Sciamanna, Christopher N; Legro, Richard S; Chinchilli, Vernon M; Moos, Merry-K; Francis, Erica B; Confer, Lindsay N; Lehman, Erik B; Armitage, Christopher J

    2015-01-01

    The Affordable Care Act mandates that most women of reproductive age with private health insurance have full contraceptive coverage with no out-of-pocket costs, creating an actionable time for women to evaluate their contraceptive choices without cost considerations. The MyNewOptions study is a three-arm, randomized, controlled trial testing web-based interventions aimed at assisting privately insured women with making contraceptive choices that are consistent with their reproductive goals. Privately insured women between the ages of 18 and 40 not intending pregnancy were randomly assigned to one of three groups: 1) a reproductive life planning (RLP) intervention, 2) a reproductive life planning enriched with contraceptive action planning (RLP+) intervention, or 3) an information only control group. Both the RLP and RLP+ guide women to identify their individualized reproductive goals and contraceptive method requirements. The RLP+ additionally includes a contraceptive action planning component, which uses if-then scenarios that allow the user to problem solve situations that make it difficult to be adherent to their contraceptive method. All three groups have access to a reproductive options library containing information about their contraceptive coverage and the attributes of alternative contraceptive methods. Women completed a baseline survey with follow-up surveys every 6 months for 2 years concurrent with intervention boosters. Study outcomes include contraceptive use and adherence. ClinicalTrials.gov identifier: NCT02100124. Results from the MyNewOptions study will demonstrate whether web-based reproductive life planning, with or without contraceptive action planning, helps insured women make patient-centered contraceptive choices compared with an information-only control condition. Copyright © 2015 Jacobs Institute of Women's Health. Published by Elsevier Inc. All rights reserved.

  15. Prediction of Detailed Enzyme Functions and Identification of Specificity Determining Residues by Random Forests

    PubMed Central

    Nagao, Chioko; Nagano, Nozomi; Mizuguchi, Kenji

    2014-01-01

    Determining enzyme functions is essential for a thorough understanding of cellular processes. Although many prediction methods have been developed, it remains a significant challenge to predict enzyme functions at the fourth-digit level of the Enzyme Commission numbers. Functional specificity of enzymes often changes drastically by mutations of a small number of residues and therefore, information about these critical residues can potentially help discriminate detailed functions. However, because these residues must be identified by mutagenesis experiments, the available information is limited, and the lack of experimentally verified specificity determining residues (SDRs) has hindered the development of detailed function prediction methods and computational identification of SDRs. Here we present a novel method for predicting enzyme functions by random forests, EFPrf, along with a set of putative SDRs, the random forests derived SDRs (rf-SDRs). EFPrf consists of a set of binary predictors for enzymes in each CATH superfamily and the rf-SDRs are the residue positions corresponding to the most highly contributing attributes obtained from each predictor. EFPrf showed a precision of 0.98 and a recall of 0.89 in a cross-validated benchmark assessment. The rf-SDRs included many residues, whose importance for specificity had been validated experimentally. The analysis of the rf-SDRs revealed both a general tendency that functionally diverged superfamilies tend to include more active site residues in their rf-SDRs than in less diverged superfamilies, and superfamily-specific conservation patterns of each functional residue. EFPrf and the rf-SDRs will be an effective tool for annotating enzyme functions and for understanding how enzyme functions have diverged within each superfamily. PMID:24416252

  16. Learning spinal manipulation: A best-evidence synthesis of teaching methods*

    PubMed Central

    Stainsby, Brynne E.; Clarke, Michelle C.S.; Egonia, Jade R.

    2016-01-01

    Objective: The purpose of this study was to evaluate the effectiveness of different reported methods used to teach spinal manipulative therapy to chiropractic students. Methods: For this best-evidence literature synthesis, 5 electronic databases were searched from 1900 to 2015. Eligible studies were critically appraised using the criteria of the Scottish Intercollegiate Guidelines Network. Scientifically admissible studies were synthesized following best-evidence synthesis principles. Results: Twenty articles were critically appraised, including 9 randomized clinical trials, 9 cohort studies, and 2 systematic reviews/meta-analyses. Eleven articles were accepted as scientifically admissible. Teaching aids included a Thrust in Motion cervical manikin, an instrumented cardiopulmonary reanimation manikin, padded contacts with a load cell, an instrumented treatment table with a force sensor/transducer, and the Dynadjust instrument. Conclusions: Several different methods exist in the literature for teaching spinal manipulative therapy techniques; however, future research in this developing area of chiropractic education is proposed. It is suggested that various teaching methods be included in the regular curricula of chiropractic colleges to aid in developing manipulation skills, efficiency, and knowledge of performance. PMID:26998630

  17. A study of active learning methods for named entity recognition in clinical text.

    PubMed

    Chen, Yukun; Lasko, Thomas A; Mei, Qiaozhu; Denny, Joshua C; Xu, Hua

    2015-12-01

    Named entity recognition (NER), a sequential labeling task, is one of the fundamental tasks for building clinical natural language processing (NLP) systems. Machine learning (ML) based approaches can achieve good performance, but they often require large amounts of annotated samples, which are expensive to build because annotation requires domain experts. Active learning (AL), a sample selection approach integrated with supervised ML, aims to minimize the annotation cost while maximizing the performance of ML-based models. In this study, our goal was to develop and evaluate both existing and new AL methods for a clinical NER task: identifying concepts of medical problems, treatments, and lab tests in clinical notes. Using the annotated NER corpus from the 2010 i2b2/VA NLP challenge, which contained 349 clinical documents with 20,423 unique sentences, we simulated AL experiments with a number of existing and novel algorithms in three categories: uncertainty-based, diversity-based, and baseline sampling strategies. These were compared with passive learning, which uses random sampling. Learning curves plotting the performance of the NER model against the estimated annotation cost (based on the number of sentences or words in the training set) were generated to evaluate the active and passive learning methods, and the area under the learning curve (ALC) score was computed. Based on the learning curves of F-measure vs. number of sentences, uncertainty sampling algorithms outperformed all other methods in ALC. Most diversity-based methods also performed better than random sampling in ALC. To achieve an F-measure of 0.80, the best uncertainty-sampling method saved 66% of sentence annotations compared with random sampling. For the learning curves of F-measure vs. number of words, uncertainty sampling methods again outperformed all other methods in ALC. To achieve an F-measure of 0.80, the best uncertainty-based method saved 42% of word annotations relative to random sampling, whereas the best diversity-based method reduced annotation effort by only 7%. In this simulated setting, AL methods, particularly uncertainty-sampling approaches, appeared to save substantial annotation cost for the clinical NER task. The actual benefit of active learning in clinical NER should be further evaluated in a real-time setting. Copyright © 2015 Elsevier Inc. All rights reserved.
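
As a hedged illustration of the uncertainty-based strategy that dominated in this study, the sketch below implements least-confidence sampling against a toy pool and contrasts it with the random-sampling baseline; the pool, probabilities, and function names are invented for illustration and are not from the study.

```python
import random

def least_confidence_sample(pool, probs, k):
    """Pick the k items whose most-likely label has the lowest
    predicted probability (least-confidence uncertainty sampling)."""
    return sorted(pool, key=lambda i: max(probs[i]))[:k]

def random_sample(pool, k, seed=0):
    """Passive-learning baseline: uniform random selection."""
    return random.Random(seed).sample(list(pool), k)

# Toy pool of five "sentences" with per-label probabilities.
probs = {
    0: [0.90, 0.10],  # model is confident here
    1: [0.55, 0.45],
    2: [0.80, 0.20],
    3: [0.50, 0.50],  # model is maximally uncertain here
    4: [0.70, 0.30],
}
picked = least_confidence_sample(probs.keys(), probs, 2)
print(picked)  # [3, 1]: the two least-confident sentences
```

In a real AL loop the selected sentences would be sent to annotators, added to the training set, and the model retrained before the next selection round.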

  18. Cluster-randomized Studies in Educational Research: Principles and Methodological Aspects

    PubMed Central

    Dreyhaupt, Jens; Mayer, Benjamin; Keis, Oliver; Öchsner, Wolfgang; Muche, Rainer

    2017-01-01

    An increasing number of studies in educational research evaluate new teaching methods and approaches. These studies could be performed more efficiently, and deliver more convincing results, if they complied more strictly with recognized standards for scientific studies. Such an approach could substantially increase the quality of prospective, two-arm (intervention) studies in particular, which aim to compare two different teaching methods. A key standard in such studies is randomization, which can minimize systematic bias in study findings; such bias may result if the two study arms are not structurally equivalent. If possible, educational research studies should also meet this standard, although this is not yet generally the case. Some difficulties and concerns exist, particularly regarding organizational and methodological aspects. An important point to consider is that, because of the teaching situation, individuals usually cannot be randomized; instead, whole groups have to be randomized (so-called “cluster randomization”). Compared with individually randomized studies, cluster-randomized studies normally require (significantly) larger sample sizes and more complex methods for calculating sample size, and they also require more complex methods for statistical analysis. Consequently, an expert with the relevant specialized knowledge needs to be involved in all phases of a cluster-randomized study. Studies evaluating new teaching methods need to make greater use of randomization in order to achieve scientifically convincing results. In this article, therefore, we describe the general principles of cluster randomization and how to implement them, and we outline practical aspects of using cluster randomization in prospective, two-arm comparative educational research studies. PMID:28584874
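
The cluster-randomization step described above can be sketched as follows; the cluster names, seed, and arm labels are illustrative assumptions, and a real trial would also handle stratification and odd cluster counts.

```python
import random

def cluster_randomize(clusters, seed=2017):
    """Assign whole clusters (e.g. school classes), not individuals,
    at random to two study arms, balancing clusters per arm."""
    rng = random.Random(seed)
    order = list(clusters)
    rng.shuffle(order)
    half = len(order) // 2
    return {c: ("intervention" if i < half else "control")
            for i, c in enumerate(order)}

arms = cluster_randomize(["class_A", "class_B", "class_C", "class_D"])
# Every student in a cluster then receives that cluster's teaching method.
```

Because outcomes of students within the same cluster are correlated, sample-size calculations must inflate the individually randomized sample size by a design effect, which is why the text above calls for specialist involvement.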

  19. Conic Sampling: An Efficient Method for Solving Linear and Quadratic Programming by Randomly Linking Constraints within the Interior

    PubMed Central

    Serang, Oliver

    2012-01-01

    Linear programming (LP) problems are commonly used in analysis and resource allocation, frequently surfacing as approximations to more difficult problems. Existing approaches to LP have been dominated by a small group of methods, and randomized algorithms have not enjoyed popularity in practice. This paper introduces a novel randomized method of solving LP problems by moving along the facets and within the interior of the polytope along rays randomly sampled from the polyhedral cones defined by the bounding constraints. This conic sampling method is then applied to randomly sampled LPs, and its runtime performance is shown to compare favorably to the simplex and primal affine-scaling algorithms, especially on polytopes with certain characteristics. The conic sampling method is then adapted and applied to solve a certain quadratic program, which computes a projection onto a polytope; the proposed method is shown to outperform the proprietary software Mathematica on large, sparse QP problems constructed from mass spectrometry-based proteomics. PMID:22952741

  20. Simulation of multivariate stationary stochastic processes using dimension-reduction representation methods

    NASA Astrophysics Data System (ADS)

    Liu, Zhangjun; Liu, Zenghui; Peng, Yongbo

    2018-03-01

    In view of the Fourier-Stieltjes integral formula for multivariate stationary stochastic processes, a unified formulation accommodating the spectral representation method (SRM) and proper orthogonal decomposition (POD) is deduced. By introducing random functions as constraints correlating the orthogonal random variables involved in the unified formulation, the dimension-reduction spectral representation method (DR-SRM) and the dimension-reduction proper orthogonal decomposition (DR-POD) are derived. The proposed schemes can represent a multivariate stationary stochastic process with just a few elementary random variables, bypassing the challenge of high-dimensional random variables inherent in conventional Monte Carlo methods. To accelerate the numerical simulation, the fast Fourier transform (FFT) technique is integrated with the proposed schemes. For illustrative purposes, the horizontal wind velocity field along the deck of a large-span bridge is simulated using the proposed methods with 2 and 3 elementary random variables. The numerical results demonstrate the usefulness of the dimension-reduction representation methods.
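
A minimal univariate sketch of the classical spectral representation idea underlying SRM: a cosine series with independent random phases whose amplitudes are matched to a target power spectral density. The PSD, discretization, and all names here are illustrative, and the paper's dimension-reduction and FFT refinements are omitted for clarity.

```python
import math
import random

def srm_path(psd, w_max, n_freq, times, seed=0):
    """Simulate one sample path of a zero-mean stationary process via
    x(t) = sum_k sqrt(2 * S(w_k) * dw) * cos(w_k * t + phi_k),
    with phases phi_k drawn uniformly and independently on [0, 2*pi)."""
    rng = random.Random(seed)
    dw = w_max / n_freq
    freqs = [(k + 0.5) * dw for k in range(n_freq)]          # midpoint rule
    amps = [math.sqrt(2.0 * psd(w) * dw) for w in freqs]
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in freqs]
    return [sum(a * math.cos(w * t + p)
                for a, w, p in zip(amps, freqs, phases))
            for t in times]

# Illustrative low-pass PSD: flat up to a cutoff frequency, zero above.
psd = lambda w: 1.0 if w < 2.0 else 0.0
path = srm_path(psd, w_max=4.0, n_freq=256, times=[0.1 * i for i in range(100)])
```

The dimension-reduction schemes in the paper replace the many independent phases with a few elementary random variables via constraining random functions; this plain SRM sketch keeps one phase per frequency.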

  1. Phase transition in nonuniform Josephson arrays: Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Lozovik, Yu. E.; Pomirchy, L. M.

    1994-01-01

    A disordered 2D system with Josephson interactions is considered. The disordered XY model describes granular films, Josephson arrays, etc. Two types of disorder are analyzed: (1) a randomly diluted system, in which the Josephson coupling constants J_ij are equal to J with probability p and to zero otherwise (the bond percolation problem); (2) coupling constants J_ij that are positive and distributed randomly and uniformly in some interval, either including the vicinity of zero or excluding it. These systems are simulated by the Monte Carlo method. The behaviour of the potential energy, specific heat, phase correlation function, and helicity modulus is analyzed. The phase diagram of the diluted system in the Tc-p plane is obtained.
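
A hedged sketch of such a simulation for disorder type (1), the bond-diluted XY model: couplings J_ij equal J with probability p and zero otherwise, and site phases are updated with the Metropolis rule. Lattice size, temperature, and seeds are illustrative, not taken from the paper.

```python
import math
import random

def diluted_bonds(L, J, p, rng):
    """Nearest-neighbour bonds on an L x L periodic lattice; each
    coupling is J with probability p, else 0 (bond dilution)."""
    site = lambda x, y: (x % L) * L + (y % L)
    bonds = []
    for x in range(L):
        for y in range(L):
            for nx, ny in ((x + 1, y), (x, y + 1)):
                bonds.append((site(x, y), site(nx, ny),
                              J if rng.random() < p else 0.0))
    return bonds

def energy(theta, bonds):
    """XY / Josephson energy: E = -sum_ij J_ij cos(theta_i - theta_j)."""
    return -sum(J * math.cos(theta[i] - theta[j]) for i, j, J in bonds)

def metropolis_sweep(theta, bonds, T, rng):
    """One Metropolis sweep (adjacency rebuilt each call for brevity)."""
    nbrs = {i: [] for i in range(len(theta))}
    for i, j, J in bonds:
        nbrs[i].append((j, J))
        nbrs[j].append((i, J))
    for i in range(len(theta)):
        new = rng.uniform(0.0, 2.0 * math.pi)
        dE = sum(J * (math.cos(theta[i] - theta[j]) - math.cos(new - theta[j]))
                 for j, J in nbrs[i])
        if dE <= 0.0 or rng.random() < math.exp(-dE / T):
            theta[i] = new

rng = random.Random(7)
bonds = diluted_bonds(L=8, J=1.0, p=0.6, rng=rng)
theta = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(64)]
for _ in range(50):
    metropolis_sweep(theta, bonds, T=0.5, rng=rng)
```

Quantities such as the specific heat or helicity modulus would be accumulated from configurations sampled after equilibration; at p = 1 the fully aligned configuration has energy -J times the number of bonds.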

  2. Applying under-sampling techniques and cost-sensitive learning methods on risk assessment of breast cancer.

    PubMed

    Hsu, Jia-Lien; Hung, Ping-Cheng; Lin, Hung-Yen; Hsieh, Chung-Ho

    2015-04-01

    Breast cancer is one of the most common causes of cancer mortality. Early detection through mammography screening could significantly reduce mortality from breast cancer. However, most screening methods consume large amounts of resources. We propose a computational model, based solely on personal health information, for breast cancer risk assessment. Our model can serve as a pre-screening program in low-cost settings. In our study, the data set, consisting of 3976 records, was collected from Taipei City Hospital between January 1 and December 31, 2008. Based on this dataset, we first applied sampling techniques and a dimension-reduction method to preprocess the data. Then, we constructed various kinds of classifiers (including basic classifiers, ensemble methods, and cost-sensitive methods) to predict the risk. The cost-sensitive method with a random forest classifier achieved a recall (sensitivity) of 100%. At a recall of 100%, the precision (positive predictive value, PPV) and specificity of the cost-sensitive method with a random forest classifier were 2.9% and 14.87%, respectively. In our study, we built a breast cancer risk assessment model using data mining techniques. Our model has the potential to serve as an assisting tool in breast cancer screening.
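
The under-sampling preprocessing step mentioned above can be sketched as below: the majority class is randomly down-sampled to the minority-class size before training. The record layout and label name are illustrative, not the paper's actual data schema.

```python
import random

def undersample(records, label_key="cancer", seed=0):
    """Randomly under-sample the majority class so that both
    classes have equal size in the training set."""
    rng = random.Random(seed)
    pos = [r for r in records if r[label_key] == 1]
    neg = [r for r in records if r[label_key] == 0]
    minor, major = sorted((pos, neg), key=len)
    return minor + rng.sample(major, len(minor))

# Illustrative imbalanced data set: 2 positive and 8 negative records.
data = [{"id": i, "cancer": 1 if i < 2 else 0} for i in range(10)]
balanced = undersample(data)
```

Cost-sensitive learning is the complementary strategy: instead of discarding majority-class records, misclassifying a positive case is given a higher penalty during training.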

  3. Ensemble of trees approaches to risk adjustment for evaluating a hospital's performance.

    PubMed

    Liu, Yang; Traskin, Mikhail; Lorch, Scott A; George, Edward I; Small, Dylan

    2015-03-01

    A commonly used method for evaluating a hospital's performance on an outcome is to compare the hospital's observed outcome rate to its expected outcome rate given its patient (case) mix and service. The process of calculating the hospital's expected outcome rate given its patient mix and service is called risk adjustment (Iezzoni 1997). Risk adjustment is critical for accurately evaluating and comparing hospitals' performance, since we would not want to unfairly penalize a hospital just because it treats sicker patients. The key to risk adjustment is accurately estimating the probability of an outcome given patient characteristics. For binary outcomes, the method commonly used in risk adjustment is logistic regression. In this paper, we consider ensemble-of-trees methods as alternatives for risk adjustment, including random forests and Bayesian additive regression trees (BART). Both random forests and BART are modern machine learning methods that have recently been shown to have excellent predictive performance in many settings. We apply these methods to risk adjustment for the performance of neonatal intensive care units (NICUs). We show that these ensemble-of-trees methods outperform logistic regression in predicting mortality among babies treated in NICUs, and provide a superior method of risk adjustment compared with logistic regression.
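
The observed-versus-expected comparison described above reduces to a simple ratio once a risk model has produced per-patient predicted probabilities; the sketch below is illustrative, with the hard-coded predictions standing in for the output of logistic regression, random forests, or BART.

```python
def oe_ratio(observed, expected):
    """Hospital performance as the observed outcome count divided by
    the risk-adjusted expected count (sum of predicted probabilities)."""
    return sum(observed) / sum(expected)

# Four patients: binary observed outcomes and model-predicted risks.
observed = [1, 0, 1, 0]
expected = [0.50, 0.25, 0.50, 0.25]
ratio = oe_ratio(observed, expected)  # 2 / 1.5: worse than expected
```

A ratio above 1 suggests more events than the case mix predicts; the quality of this judgment rests entirely on how well the risk model estimates the expected probabilities, which is the paper's motivation for comparing model families.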

  4. Peer Inclusion in Interventions for Children with ADHD: A Systematic Review and Meta-Analysis

    PubMed Central

    Vilaysack, Brandon; Doma, Kenji; Wilkes-Gillan, Sarah; Speyer, Renée

    2018-01-01

    Objective To assess the effectiveness of peer inclusion in interventions to improve the social functioning of children with ADHD. Methods We searched four electronic databases for randomized controlled trials and controlled quasi-experimental studies that investigated peer inclusion interventions alone or combined with pharmacological treatment. Data were collected from the included studies and methodologically assessed. Meta-analyses were conducted using a random-effects model. Results Seventeen studies met eligibility criteria. Studies investigated interventions consisting of peer involvement and peer proximity; no study included peer mediation. Most included studies had an unclear or high risk of bias regarding inadequate reporting of randomization, blinding, and control for confounders. Meta-analyses indicated improvements in pre-post measures of social functioning for participants in peer-inclusive treatment groups. Peer inclusion was advantageous compared to treatment as usual. The benefits of peer inclusion over other therapies or medication only could not be determined. Using parents as raters for outcome measurement significantly mediated the intervention effect. Conclusions The evidence to support or contest the efficacy of peer inclusion interventions for children with ADHD is lacking. Future studies need to reduce risks of bias, use appropriate sample sizes, and provide detailed results to investigate the efficacy of peer inclusion interventions for children with ADHD. PMID:29744363
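
The random-effects pooling used in such meta-analyses can be sketched with the DerSimonian-Laird estimator, a common choice; the review does not state its exact estimator or software, so this particular estimator and the toy numbers are illustrative assumptions.

```python
def dersimonian_laird(effects, variances):
    """Pool study effect sizes under a random-effects model: estimate
    the between-study variance tau^2 from Cochran's Q, then weight
    each study by the inverse of (within-variance + tau^2)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    ws = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(ws, effects)) / sum(ws)
    return pooled, tau2

pooled, tau2 = dersimonian_laird([0.2, 0.4], [0.04, 0.04])
# Here Q < k - 1, so tau^2 truncates to 0 and the pooled effect is 0.3.
```

When heterogeneity is present (tau^2 > 0), the random-effects weights are flatter than fixed-effect weights, so smaller studies contribute relatively more to the pooled estimate.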

  5. A cluster-randomized, placebo-controlled, maternal vitamin A or beta-carotene supplementation trial in Bangladesh: design and methods.

    PubMed

    Labrique, Alain B; Christian, Parul; Klemm, Rolf D W; Rashid, Mahbubur; Shamim, Abu Ahmed; Massie, Allan; Schulze, Kerry; Hackman, Andre; West, Keith P

    2011-04-21

    We present the design, methods and population characteristics of a large community trial that assessed the efficacy of a weekly supplement containing vitamin A or beta-carotene, at recommended dietary levels, in reducing maternal mortality from early gestation through 12 weeks postpartum. We identify challenges faced and report solutions in implementing an intervention trial under low-resource, rural conditions, including the importance of population choice in promoting generalizability, maintaining rigorous data quality control to reduce inter- and intra- worker variation, and optimizing efficiencies in information and resources flow from and to the field. This trial was a double-masked, cluster-randomized, dual intervention, placebo-controlled trial in a contiguous rural area of ~435 sq km with a population of ~650,000 in Gaibandha and Rangpur Districts of Northwestern Bangladesh. Approximately 120,000 married women of reproductive age underwent 5-weekly home surveillance, of whom ~60,000 were detected as pregnant, enrolled into the trial and gave birth to ~44,000 live-born infants. Upon enrollment, at ~ 9 weeks' gestation, pregnant women received a weekly oral supplement containing vitamin A (7000 ug retinol equivalents (RE)), beta-carotene (42 mg, or ~7000 ug RE) or a placebo through 12 weeks postpartum, according to prior randomized allocation of their cluster of residence. Systems described include enlistment and 5-weekly home surveillance for pregnancy based on menstrual history and urine testing, weekly supervised supplementation, periodic risk factor interviews, maternal and infant vital outcome monitoring, birth defect surveillance and clinical/biochemical substudies. The primary outcome was pregnancy-related mortality assessed for 3 months following parturition. 
Secondary outcomes included fetal loss due to miscarriage or stillbirth, infant mortality under three months of age, maternal obstetric and infectious morbidity, infant infectious morbidity, maternal and infant micronutrient status, fetal and infant growth and prematurity, external birth defects and postnatal infant growth to 3 months of age. Aspects of study site selection and its "resonance" with national and rural qualities of Bangladesh, the trial's design, methods and allocation group comparability achieved by randomization, field procedures and innovative approaches to solving challenges in trial conduct are described and discussed. This trial is registered with http://Clinicaltrials.gov as protocol NCT00198822.

  6. A quantitative and qualitative evaluation of reports of clinical trials published in six Brazilian dental journals indexed in the Scientific Electronic Library Online (SciELO).

    PubMed

    de Souza, Raphael Freitas; Chaves, Carolina de Andrade Lima; Nasser, Mona; Fedorowicz, Zbys

    2010-01-01

    Open access publishing is becoming increasingly popular within the biomedical sciences. SciELO, the Scientific Electronic Library Online, is a digital library covering a selected collection of Brazilian scientific journals, many of which provide open access to full-text articles. This library includes a number of dental journals, some of which may include reports of clinical trials in English, Portuguese and/or Spanish. Thus, SciELO could play an important role as a source of evidence for dental healthcare interventions, especially if it yields a sizeable number of high-quality reports. The aim of this study was to identify reports of clinical trials by handsearching dental journals that are accessible through SciELO, and to assess the overall quality of these reports. Electronic versions of six Brazilian dental journals indexed in SciELO were handsearched at www.scielo.br in September 2008. Reports of clinical trials were identified and classified as controlled clinical trials (CCTs - prospective, experimental studies comparing 2 or more healthcare interventions in human beings) or randomized controlled trials (RCTs - a random allocation method is clearly reported), according to Cochrane eligibility criteria. Criteria to assess methodological quality included: method of randomization, concealment of treatment allocation, blinded outcome assessment, handling of withdrawals and losses, and whether an intention-to-treat analysis had been carried out. The search retrieved 33 CCTs and 43 RCTs. A majority of the reports provided no description of either the method of randomization (75.3%) or concealment of the allocation sequence (84.2%). Participants and outcome assessors were reported as blinded in only 31.2% of the reports. Withdrawals and losses were clearly described in only 6.5% of the reports, and none mentioned an intention-to-treat analysis or any similar procedure.
The results of this study indicate that a substantial number of reports of trials and systematic reviews are available in the dental journals listed in SciELO, and that these could provide valuable evidence for clinical decision making. However, it is clear that the quality of a number of these reports is of some concern and that improvement in the conduct and reporting of these trials could be achieved if authors adhered to internationally accepted guidelines, e.g. the CONSORT statement.

  7. School-Based Education Programs for the Prevention of Child Sexual Abuse: A Cochrane Systematic Review and Meta-Analysis

    ERIC Educational Resources Information Center

    Walsh, Kerryann; Zwi, Karen; Woolfenden, Susan; Shlonsky, Aron

    2018-01-01

    Objective: To assess evidence of the effectiveness of school-based education programs for the prevention of child sexual abuse (CSA). The programs deliver information about CSA and strategies to help children avoid it and encourage help seeking. Methods: Systematic review including meta-analysis of randomized controlled trials (RCTs), cluster…

  8. Partners in Caregiving in a Special Care Environment: Cooperative Communication between Staff and Families on Dementia Units

    ERIC Educational Resources Information Center

    Robison, Julie; Curry, Leslie; Gruman, Cynthia; Porter, Martha; Henderson, Charles R., Jr.; Pillemer, Karl

    2007-01-01

    Purpose: This article reports the results of a randomized, controlled evaluation of Partners in Caregiving in a Special Care Environment, an intervention designed to improve communication and cooperation between staff and families of residents in nursing home dementia programs. Design and Methods: Participants included 388 family members and 384…

  9. A Randomized Trial Assessing the Impact of a Personal Printed Feedback Portrait on Statin Prescribing in Primary Care

    ERIC Educational Resources Information Center

    Dormuth, Colin R.; Carney, Greg; Taylor, Suzanne; Bassett, Ken; Maclure, Malcolm

    2012-01-01

    Introduction: Knowledge translation (KT) initiatives have the potential to improve prescribing quality and produce savings that exceed the cost of the KT program itself, including the cost of evaluation using pragmatic study methods. Our objective was to measure the impact and estimated savings resulting from the distribution of individualized…

  10. Validation of a Milk Consumption Stage of Change Algorithm among Adolescent Survivors of Childhood Cancer

    ERIC Educational Resources Information Center

    Mays, Darren; Gerfen, Elissa; Mosher, Revonda B.; Shad, Aziza T.; Tercyak, Kenneth P.

    2012-01-01

    Objective: To assess the construct validity of a milk consumption Stages of Change (SOC) algorithm among adolescent survivors of childhood cancer ages 11 to 21 years (n = 75). Methods: Baseline data from a randomized controlled trial designed to evaluate a health behavior intervention were analyzed. Assessments included a milk consumption SOC…

  11. Patient Adherence Predicts Outcome from Cognitive Behavioral Therapy in Obsessive-Compulsive Disorder

    ERIC Educational Resources Information Center

    Simpson, Helen Blair; Maher, Michael J.; Wang, Yuanjia; Bao, Yuanyuan; Foa, Edna B.; Franklin, Martin

    2011-01-01

    Objective: To examine the effects of patient adherence on outcome from exposure and response prevention (EX/RP) therapy in adults with obsessive-compulsive disorder (OCD). Method: Thirty adults with OCD were randomized to EX/RP (n = 15) or EX/RP augmented by motivational interviewing strategies (n = 15). Both treatments included 3 introductory…

  12. A Free-Recall Demonstration versus a Lecture-Only Control: Learning Benefits

    ERIC Educational Resources Information Center

    Balch, William R.

    2012-01-01

    On their first class day, introductory psychology students took a 14-question multiple-choice pretest on several principles of memory including primacy, recency, storage, retrieval, counterbalancing, and the free-recall method. I randomly preassigned students to come at one of two different times to the second class, 2 days later, when they either…

  13. Re-Visit to the School Nurse and Adolescents' Medicine Use

    ERIC Educational Resources Information Center

    Borup, Ina K.; Andersen, Anette; Holstein, Bjorn E.

    2011-01-01

    Objective: To examine if students who re-visit the school nurse use medicines differently than other students when exposed to aches and psychological problems. Methods: The study includes all 11-, 13- and 15-year-old students from a random sample of schools in Denmark, response rate 87 per cent, n = 5,205. The data collection followed the…

  14. Dietary fat and not calcium supplementation or dairy product consumption is associated with changes in anthropometrics during a randomized, placebo-controlled energy-restriction trial

    USDA-ARS?s Scientific Manuscript database

    Insufficient calcium intake has been proposed to cause unbalanced energy partitioning leading to obesity. However, weight loss interventions including dietary calcium or dairy product consumption have not reported changes in lipid metabolism measured by the plasma lipidome. Methods. The objective ...

  15. Predicting School Readiness from Neurodevelopmental Assessments at Age 2 Years after Respiratory Distress Syndrome in Infants Born Preterm

    ERIC Educational Resources Information Center

    Patrianakos-Hoobler, Athena I.; Msall, Michael E.; Huo, Dezheng; Marks, Jeremy D.; Plesha-Troyke, Susan; Schreiber, Michael D.

    2010-01-01

    Aim: To determine whether neurodevelopmental outcomes at the age of 2 years accurately predict school readiness in children who survived respiratory distress syndrome after preterm birth. Method: Our cohort included 121 preterm infants who received surfactant and ventilation and were enrolled in a randomized controlled study of inhaled nitric…

  16. The Contribution of Counseling Providers to the Success or Failure of Marriages

    ERIC Educational Resources Information Center

    Ansah-Hughes, Winifred

    2015-01-01

    This study is an investigation into the contribution of counseling providers to the success or failure of marriages. The purposive and the simple random sampling methods were used to select eight churches and 259 respondents (married people) in the Techiman Municipality. The instrument used to collect data was a 26-item questionnaire including a…

  17. Methods for a multicenter randomized trial for mixed urinary incontinence: rationale and patient-centeredness of the ESTEEM trial.

    PubMed

    Sung, Vivian W; Borello-France, Diane; Dunivan, Gena; Gantz, Marie; Lukacz, Emily S; Moalli, Pamela; Newman, Diane K; Richter, Holly E; Ridgeway, Beri; Smith, Ariana L; Weidner, Alison C; Meikle, Susan

    2016-10-01

    Mixed urinary incontinence (MUI) can be a challenging condition to manage. We describe the protocol design and rationale for the Effects of Surgical Treatment Enhanced with Exercise for Mixed Urinary Incontinence (ESTEEM) trial, designed to compare a combined conservative and surgical treatment approach versus surgery alone for improving patient-centered MUI outcomes at 12 months. ESTEEM is a multisite, prospective, randomized trial of female participants with MUI randomized to a standardized perioperative behavioral/pelvic floor exercise intervention plus midurethral sling versus midurethral sling alone. We describe our methods and four challenges encountered during the design phase: defining the study population, selecting relevant patient-centered outcomes, determining sample size estimates using a patient-reported outcome measure, and designing an analysis plan that accommodates MUI failure rates. A central theme in the design was patient centeredness, which guided many key decisions. Our primary outcome is patient-reported MUI symptoms measured using the Urogenital Distress Inventory (UDI) score at 12 months. Secondary outcomes include quality of life, sexual function, cost-effectiveness, time to failure, and need for additional treatment. The final study design was implemented in November 2013 across eight clinical sites in the Pelvic Floor Disorders Network. As of 27 February 2016, 433 of the 472 targeted participants had been randomized. We describe the ESTEEM protocol and our methods for reaching consensus on methodological challenges in designing a trial for MUI while maintaining the patient perspective at the core of key decisions. This trial will provide information that can directly impact patient care and clinical decision making.

  18. Methodology Series Module 5: Sampling Strategies.

    PubMed

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'sampling method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling - based on the researcher's choice or on the population that is accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when a convenience sample was actually used). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
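
The two probability-sampling methods named above can be contrasted in a short sketch; the population, strata, and sampling fraction here are illustrative.

```python
import random

def simple_random_sample(population, k, seed=0):
    """Simple random sample: every unit has an equal chance of selection."""
    return random.Random(seed).sample(list(population), k)

def stratified_random_sample(population, stratum_of, frac, seed=0):
    """Stratified random sample: draw the same sampling fraction
    independently within each stratum."""
    rng = random.Random(seed)
    strata = {}
    for unit in population:
        strata.setdefault(stratum_of(unit), []).append(unit)
    sample = []
    for members in strata.values():
        sample.extend(rng.sample(members, round(frac * len(members))))
    return sample

people = [{"id": i, "sex": "F" if i % 2 else "M"} for i in range(20)]
srs = simple_random_sample(people, 5)
sts = stratified_random_sample(people, lambda p: p["sex"], 0.5)
```

Stratification guarantees that each subgroup is represented in its population proportion, whereas a simple random sample only achieves this on average.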

  19. The Fibonacci Life-Chart Method (FLCM) as a foundation for Carl Jung's theory of synchronicity.

    PubMed

    Sacco, Robert G

    2016-04-01

    Since the scientific method requires events to be subject to controlled examination it would seem that synchronicities are not scientifically investigable. Jung speculated that because these incredible events are like the random sparks of a firefly they cannot be pinned down. However, doubting Jung's doubts, the author provides a possible method of elucidating these seemingly random and elusive events. The author draws on a new method, designated the Fibonacci Life-Chart Method (FLCM), which categorizes phase transitions and Phi fractal scaling in human development based on the occurrence of Fibonacci numbers in biological cell division and self-organizing systems. The FLCM offers an orientation towards psychological experience that may have relevance to Jung's theory of synchronicity in which connections are deemed to be intrinsically meaningful rather than demonstrable consequences of cause and effect. In such a model synchronistic events can be seen to be, as the self-organizing system enlarges, manifestations of self-organized critical moments and Phi fractal scaling. Recommendations for future studies include testing the results of the FLCM using case reports of synchronistic and spiritual experiences. © 2016, The Society of Analytical Psychology.

  20. Differences in reporting of analyses in internal company documents versus published trial reports: comparisons in industry-sponsored trials in off-label uses of gabapentin.

    PubMed

    Vedula, S Swaroop; Li, Tianjing; Dickersin, Kay

    2013-01-01

    Details about the type of analysis (e.g., intent to treat [ITT]) and definitions (i.e., criteria for including participants in the analysis) are necessary for interpreting a clinical trial's findings. Our objective was to compare the description of types of analyses and criteria for including participants in the publication (i.e., what was reported) with descriptions in the corresponding internal company documents (i.e., what was planned and what was done). Trials were for off-label uses of gabapentin sponsored by Pfizer and Parke-Davis, and documents were obtained through litigation. For each trial, we compared internal company documents (protocols, statistical analysis plans, and research reports, all unpublished), with publications. One author extracted data and another verified, with a third person verifying discordant items and a sample of the rest. Extracted data included the number of participants randomized and analyzed for efficacy, and types of analyses for efficacy and safety and their definitions (i.e., criteria for including participants in each type of analysis). We identified 21 trials, 11 of which were published randomized controlled trials, and that provided the documents needed for planned comparisons. For three trials, there was disagreement on the number of randomized participants between the research report and publication. Seven types of efficacy analyses were described in the protocols, statistical analysis plans, and publications, including ITT and six others. The protocol or publication described ITT using six different definitions, resulting in frequent disagreements between the two documents (i.e., different numbers of participants were included in the analyses). Descriptions of analyses conducted did not agree between internal company documents and what was publicly reported. Internal company documents provide extensive documentation of methods planned and used, and trial findings, and should be publicly accessible. 
Reporting standards for randomized controlled trials should recommend transparent descriptions and definitions of the analyses performed and of which study participants were excluded.

  1. Systematic Review of the Impact of Cancer Survivorship Care Plans on Health Outcomes and Health Care Delivery.

    PubMed

    Jacobsen, Paul B; DeRosa, Antonio P; Henderson, Tara O; Mayer, Deborah K; Moskowitz, Chaya S; Paskett, Electra D; Rowland, Julia H

    2018-05-18

    Purpose Numerous organizations recommend that patients with cancer receive a survivorship care plan (SCP) comprising a treatment summary and follow-up care plans. Among current barriers to implementation are providers' concerns about the strength of evidence that SCPs improve outcomes. This systematic review evaluates whether delivery of SCPs has a positive impact on health outcomes and health care delivery for cancer survivors. Methods Randomized and nonrandomized studies evaluating patient-reported outcomes, health care use, and disease outcomes after delivery of SCPs were identified by searching MEDLINE, Embase, PsycINFO, Cumulative Index to Nursing and Allied Health Literature, and Cochrane Library. Data extracted by independent raters were summarized on the basis of qualitative synthesis. Results Eleven nonrandomized and 13 randomized studies met inclusion criteria. Variability was evident across studies in cancer types, SCP delivery timing and method, SCP recipients and content, SCP-related counseling, and outcomes assessed. Nonrandomized study findings yielded descriptive information on satisfaction with care and reactions to SCPs. Randomized study findings were generally negative for the most commonly assessed outcomes (ie, physical, functional, and psychological well-being); findings were positive in single studies for other outcomes, including amount of information received, satisfaction with care, and physician implementation of recommended care. Conclusion Existing research provides little evidence that SCPs improve health outcomes and health care delivery. Possible explanations include heterogeneity in study designs and the low likelihood that SCP delivery alone would influence distal outcomes. Findings are limited but more positive for proximal outcomes (eg, information received) and for care delivery, particularly when SCPs are accompanied by counseling to prepare survivors for future clinical encounters. 
Recommendations for future research include focusing to a greater extent on evaluating ways to ensure SCP recommendations are subsequently acted on as part of ongoing care.

  2. Pain Management After Outpatient Shoulder Arthroscopy: A Systematic Review of Randomized Controlled Trials.

    PubMed

    Warrender, William J; Syed, Usman Ali M; Hammoud, Sommer; Emper, William; Ciccotti, Michael G; Abboud, Joseph A; Freedman, Kevin B

    2017-06-01

    Effective postoperative pain management after shoulder arthroscopy is a critical component of recovery, rehabilitation, and patient satisfaction. This systematic review provides a comprehensive overview of level 1 and level 2 evidence regarding postoperative pain management for outpatient arthroscopic shoulder surgery. Systematic review. We performed a systematic review of the various modalities reported in the literature for postoperative pain control after outpatient shoulder arthroscopy and analyzed their outcomes. Analgesic regimens reviewed include regional nerve blocks/infusions, subacromial/intra-articular injections or infusions, cryotherapy, and oral medications. Only randomized controlled trials with level 1 or level 2 evidence that compared 2 or more pain management modalities or placebo were included. We excluded studies that did not report objective measures quantifying postoperative pain within the first postoperative month, subjective pain scale measurements, or narcotic consumption as outcome measures. A combined total of 40 randomized controlled trials met our inclusion criteria. Of the 40 included studies, 15 examined nerve blocks, 4 studied oral medication regimens, 12 studied subacromial infusion, 8 compared multiple modalities, and 1 evaluated cryotherapy. Interscalene nerve blocks (ISBs) were found to be the most effective method of controlling postoperative pain after shoulder arthroscopy. Increasing concentrations, continuous infusions, and patient-controlled methods can be effective for more aggressively controlling pain. Dexamethasone, clonidine, intrabursal oxycodone, and magnesium have all been shown to improve the duration and adequacy of ISBs when used as adjuvants. Oral pregabalin and etoricoxib administered preoperatively have evidence supporting decreased postoperative pain and increased patient satisfaction. 
On the basis of the evidence in this review, we recommend the use of ISBs as the most effective analgesic for outpatient arthroscopic shoulder surgery.

  3. Droxidopa and Reduced Falls in a Trial of Parkinson Disease Patients With Neurogenic Orthostatic Hypotension

    PubMed Central

    Hauser, Robert A.; Heritier, Stephane; Rowse, Gerald J.; Hewitt, L. Arthur; Isaacson, Stuart H.

    2016-01-01

    Objectives Droxidopa is a prodrug of norepinephrine indicated for the treatment of orthostatic dizziness, lightheadedness, or the “feeling that you are about to black out” in adult patients with symptomatic neurogenic orthostatic hypotension caused by primary autonomic failure including Parkinson disease (PD). The objective of this study was to compare fall rates in PD patients with symptomatic neurogenic orthostatic hypotension randomized to droxidopa or placebo. Methods Study NOH306 was a 10-week, phase 3, randomized, placebo-controlled, double-blind trial of droxidopa in PD patients with symptomatic neurogenic orthostatic hypotension that included assessments of falls as a key secondary end point. In this report, the principal analysis consisted of a comparison of the rate of patient-reported falls from randomization to end of study in droxidopa versus placebo groups. Results A total of 225 patients were randomized; 222 patients were included in the safety analyses, and 197 patients provided efficacy data and were included in the falls analyses. The 92 droxidopa patients reported 308 falls, and the 105 placebo patients reported 908 falls. In the droxidopa group, the fall rate was 0.4 falls per patient-week; in the placebo group, the rate was 1.05 falls per patient-week (prespecified Wilcoxon rank sum P = 0.704; post hoc Poisson-inverse Gaussian test P = 0.014), yielding a relative risk reduction of 77% using the Poisson-inverse Gaussian model. Fall-related injuries occurred in 16.7% of droxidopa-treated patients and 26.9% of placebo-treated patients. Conclusions Treatment with droxidopa appears to reduce falls in PD patients with symptomatic neurogenic orthostatic hypotension, but this finding must be confirmed. PMID:27332626

  4. Accounting for Heterogeneity in Relative Treatment Effects for Use in Cost-Effectiveness Models and Value-of-Information Analyses

    PubMed Central

    Soares, Marta O.; Palmer, Stephen; Ades, Anthony E.; Harrison, David; Shankar-Hari, Manu; Rowan, Kathy M.

    2015-01-01

    Cost-effectiveness analysis (CEA) models are routinely used to inform health care policy. Key model inputs include relative effectiveness of competing treatments, typically informed by meta-analysis. Heterogeneity is ubiquitous in meta-analysis, and random effects models are usually used when there is variability in effects across studies. In the absence of observed treatment effect modifiers, various summaries from the random effects distribution (random effects mean, predictive distribution, random effects distribution, or study-specific estimate [shrunken or independent of other studies]) can be used depending on the relationship between the setting for the decision (population characteristics, treatment definitions, and other contextual factors) and the included studies. If covariates have been measured that could potentially explain the heterogeneity, then these can be included in a meta-regression model. We describe how covariates can be included in a network meta-analysis model and how the output from such an analysis can be used in a CEA model. We outline a model selection procedure to help choose between competing models and stress the importance of clinical input. We illustrate the approach with a health technology assessment of intravenous immunoglobulin for the management of adult patients with severe sepsis in an intensive care setting, which exemplifies how risk of bias information can be incorporated into CEA models. We show that the results of the CEA and value-of-information analyses are sensitive to the model and highlight the importance of sensitivity analyses when conducting CEA in the presence of heterogeneity. The methods presented extend naturally to heterogeneity in other model inputs, such as baseline risk. PMID:25712447

  5. Accounting for Heterogeneity in Relative Treatment Effects for Use in Cost-Effectiveness Models and Value-of-Information Analyses.

    PubMed

    Welton, Nicky J; Soares, Marta O; Palmer, Stephen; Ades, Anthony E; Harrison, David; Shankar-Hari, Manu; Rowan, Kathy M

    2015-07-01

    Cost-effectiveness analysis (CEA) models are routinely used to inform health care policy. Key model inputs include relative effectiveness of competing treatments, typically informed by meta-analysis. Heterogeneity is ubiquitous in meta-analysis, and random effects models are usually used when there is variability in effects across studies. In the absence of observed treatment effect modifiers, various summaries from the random effects distribution (random effects mean, predictive distribution, random effects distribution, or study-specific estimate [shrunken or independent of other studies]) can be used depending on the relationship between the setting for the decision (population characteristics, treatment definitions, and other contextual factors) and the included studies. If covariates have been measured that could potentially explain the heterogeneity, then these can be included in a meta-regression model. We describe how covariates can be included in a network meta-analysis model and how the output from such an analysis can be used in a CEA model. We outline a model selection procedure to help choose between competing models and stress the importance of clinical input. We illustrate the approach with a health technology assessment of intravenous immunoglobulin for the management of adult patients with severe sepsis in an intensive care setting, which exemplifies how risk of bias information can be incorporated into CEA models. We show that the results of the CEA and value-of-information analyses are sensitive to the model and highlight the importance of sensitivity analyses when conducting CEA in the presence of heterogeneity. The methods presented extend naturally to heterogeneity in other model inputs, such as baseline risk. © The Author(s) 2015.
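
    Two of the random-effects summaries named in this abstract (the random-effects mean and the predictive distribution) can be illustrated with a minimal pairwise meta-analysis sketch. The DerSimonian-Laird estimator and the data below are illustrative stand-ins, not the network meta-analysis model the authors describe:

```python
import numpy as np

def dersimonian_laird(y, v):
    """Random-effects meta-analysis (DerSimonian-Laird tau^2 estimator).

    y : per-study effect estimates (e.g. log relative risks)
    v : per-study sampling variances
    Returns the random-effects mean, its standard error, tau^2, and an
    approximate 95% predictive interval for the effect in a new setting.
    """
    y, v = np.asarray(y, float), np.asarray(v, float)
    w = 1.0 / v                                   # fixed-effect weights
    mu_fe = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - mu_fe) ** 2)              # Cochran's Q heterogeneity statistic
    k = len(y)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / c)            # between-study variance estimate
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    mu = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    half = 1.96 * np.sqrt(tau2 + se ** 2)         # predictive half-width (normal approx.)
    return mu, se, tau2, (mu - half, mu + half)

# Hypothetical per-study estimates and variances (illustration only)
y = [0.10, 0.35, -0.05, 0.42, 0.22]
v = [0.04, 0.02, 0.05, 0.03, 0.06]
mu, se, tau2, pred = dersimonian_laird(y, v)
```

    The predictive interval is wider than the confidence interval for the mean whenever tau^2 > 0, which is why the choice of summary matters for the decision model.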

  6. A New Method for Synthesizing Radiation Dose-Response Data From Multiple Trials Applied to Prostate Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diez, Patricia; Vogelius, Ivan S.; Department of Human Oncology, University of Wisconsin School of Medicine and Public Health, Madison, WI 53792

    2010-07-15

    Purpose: A new method is presented for synthesizing dose-response data for biochemical control of prostate cancer according to study design (randomized vs. nonrandomized) and risk group (low vs. intermediate-high). Methods and Materials: Nine published prostate cancer dose escalation studies including 6,539 patients were identified in the MEDLINE and CINAHL databases and reviewed to assess the relationship between dose and biochemical control. A novel method of analysis is presented in which the normalized dose-response gradient, γ50, is estimated for each study and subsequently synthesized across studies. Our method does not assume that biochemical control rates are directly comparable between studies. Results: Nonrandomized studies produced a statistically significantly higher γ50 than randomized studies for intermediate- to high-risk patients (γ50 = 1.63 vs. γ50 = 0.93, p = 0.03) and a borderline significantly higher γ50 (γ50 = 1.78 vs. γ50 = 0.56, p = 0.08) for low-risk patients. No statistically significant difference in γ50 was found between low- and intermediate- to high-risk patients (p = 0.31). From the pooled data of low and intermediate- to high-risk patients in randomized trials, we obtain the overall best estimate of γ50 = 0.84 with 95% confidence interval 0.54-1.15. Conclusions: Nonrandomized studies overestimate the steepness of the dose-response curve as compared with randomized trials. This is probably the result of stage migration, improved treatment techniques, and a shorter follow-up in higher-dose patients, who were typically entered more recently. This overestimation leads to inflated expectations regarding the benefit from dose escalation and could lead to underpowered clinical trials. There is no evidence of a steeper dose response for intermediate- to high-risk compared with low-risk patients.
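
    A simplified picture of synthesizing a dose-response gradient across studies: the sketch below applies plain inverse-variance pooling to hypothetical per-study γ50 estimates and standard errors. The paper's actual synthesis handles study design and comparability more carefully; this only illustrates the pooling step:

```python
import numpy as np

def pool_gamma50(gamma, se):
    """Inverse-variance (fixed-effect) pooling of per-study normalized
    dose-response gradients (gamma_50) into one summary with a 95% CI.
    Illustrative sketch only, not the paper's synthesis method."""
    gamma, se = np.asarray(gamma, float), np.asarray(se, float)
    w = 1.0 / se ** 2                         # weight each study by precision
    pooled = np.sum(w * gamma) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    return pooled, (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

# Hypothetical gamma_50 estimates and standard errors from three trials
pooled, ci = pool_gamma50([0.9, 0.7, 1.0], [0.30, 0.25, 0.40])
```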

  7. A fast ergodic algorithm for generating ensembles of equilateral random polygons

    NASA Astrophysics Data System (ADS)

    Varela, R.; Hinson, K.; Arsuaga, J.; Diao, Y.

    2009-03-01

    Knotted structures are commonly found in circular DNA and along the backbone of certain proteins. In order to properly estimate properties of these three-dimensional structures it is often necessary to generate large ensembles of simulated closed chains (i.e. polygons) of equal edge lengths (such polygons are called equilateral random polygons). However, finding efficient algorithms that properly sample the space of equilateral random polygons is a difficult problem. Currently there are no proven algorithms that generate equilateral random polygons according to their theoretical distribution. In this paper we propose a method that generates equilateral random polygons in a 'step-wise uniform' way. We prove that this method is ergodic in the sense that any given equilateral random polygon can be generated by this method, and we show that the time needed to generate an equilateral random polygon of length n is linear in n. These two properties make this algorithm a substantial improvement over existing generating methods. Detailed numerical comparisons of our algorithm with other widely used algorithms are provided.
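
    The closure and equal-edge-length constraints that make this sampling problem hard can be seen concretely in a small sketch. The crankshaft move below is a classic move on equilateral polygons that preserves both constraints; it is shown only to illustrate them and is not the step-wise uniform algorithm proposed in the paper:

```python
import numpy as np

def crankshaft(poly, i, j, theta):
    """One crankshaft move: rotate the vertices strictly between i and j
    about the axis through vertices i and j by angle theta. Because the
    endpoints stay fixed and rotations are rigid, closure and all edge
    lengths are preserved."""
    p = poly.copy()
    a, b = p[i], p[j]
    axis = (b - a) / np.linalg.norm(b - a)
    c, s = np.cos(theta), np.sin(theta)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    R = np.eye(3) + s * K + (1.0 - c) * (K @ K)   # Rodrigues rotation matrix
    p[i + 1:j] = (p[i + 1:j] - a) @ R.T + a
    return p

# A closed equilateral hexagon (unit edges) as a starting polygon
hexagon = np.array([[np.cos(k * np.pi / 3), np.sin(k * np.pi / 3), 0.0]
                    for k in range(6)])
moved = crankshaft(hexagon, 0, 3, 1.2)
edges = np.diff(np.vstack([moved, moved[:1]]), axis=0)   # include closing edge
lengths = np.linalg.norm(edges, axis=1)
```

    Repeated random crankshaft moves explore the space of equilateral polygons; the paper's contribution is an algorithm whose per-polygon cost is provably linear in n and which is provably ergodic.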

  8. Anxiety Outcomes after Physical Activity Interventions: Meta-Analysis Findings

    PubMed Central

    Conn, Vicki S.

    2011-01-01

    Background Although numerous primary studies have documented the mental health benefits of physical activity (PA), no previous quantitative synthesis has examined anxiety outcomes of interventions to increase PA. Objectives This meta-analysis integrates extant research about anxiety outcomes from interventions to increase PA among healthy adults. Method Extensive literature searching located published and unpublished PA intervention studies with anxiety outcomes. Eligible studies reported findings from interventions designed to increase PA delivered to healthy adults without anxiety disorders. Data were coded from primary studies. Random-effects meta-analytic procedures were completed. Exploratory moderator analyses using meta-analysis ANOVA and regression analogues were conducted to determine if report, methods, sample, or intervention characteristics were associated with differences in anxiety outcomes. Results Data were synthesized across 3,289 subjects from 19 eligible reports. The overall mean anxiety effect size (d-index) for two-group comparisons was 0.22 with significant heterogeneity (Q = 32.15). Exploratory moderator analyses found larger anxiety improvement effect sizes among studies that included larger samples, used random allocation of subjects to treatment and control conditions, targeted only PA behavior instead of multiple health behaviors, included supervised exercise (vs. home-based PA), used moderate or high-intensity instead of low-intensity PA, and suggested subjects exercise at a fitness facility (vs. home) following interventions. Discussion These findings document that some interventions can decrease anxiety symptoms among healthy adults. Exploratory moderator analyses suggest possible directions for future primary research to compare interventions in randomized trials to confirm causal relationships. PMID:20410849
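
    The d-index synthesized above is the standardized mean difference between treatment and control groups. A minimal sketch with invented anxiety scores (not study data; lower scores mean less anxiety):

```python
import numpy as np

def cohens_d(treat, control):
    """Two-group standardized mean difference (Cohen's d-index),
    using the pooled standard deviation. Sign convention here:
    positive d = lower (better) anxiety scores in the treatment group."""
    treat, control = np.asarray(treat, float), np.asarray(control, float)
    n1, n2 = len(treat), len(control)
    sp = np.sqrt(((n1 - 1) * treat.var(ddof=1) + (n2 - 1) * control.var(ddof=1))
                 / (n1 + n2 - 2))
    return (control.mean() - treat.mean()) / sp

# Hypothetical post-intervention anxiety scores for one two-group study
rng = np.random.default_rng(0)
d = cohens_d(rng.normal(48, 10, 40), rng.normal(52, 10, 40))
```

    In the meta-analysis, one such d per study is then pooled under a random-effects model, which is where the reported overall d of 0.22 and the Q heterogeneity statistic come from.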

  9. Employing online quantum random number generators for generating truly random quantum states in Mathematica

    NASA Astrophysics Data System (ADS)

    Miszczak, Jarosław Adam

    2013-01-01

    The presented package for the Mathematica computing system allows the harnessing of quantum random number generators (QRNG) for investigating the statistical properties of quantum states. The described package implements a number of functions for generating random states. The new version of the package adds the ability to use the on-line quantum random number generator service and implements new functions for retrieving lists of random numbers. Thanks to the introduced improvements, the new version provides faster access to high-quality sources of random numbers and can be used in simulations requiring large amounts of random data. New version program summary. Program title: TRQS Catalogue identifier: AEKA_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEKA_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 18 134 No. of bytes in distributed program, including test data, etc.: 2 520 49 Distribution format: tar.gz Programming language: Mathematica, C. Computer: Any supporting Mathematica in version 7 or higher. Operating system: Any platform supporting Mathematica; tested with GNU/Linux (32 and 64 bit). RAM: Case-dependent. Supplementary material: Fig. 1 mentioned below can be downloaded. Classification: 4.15. External routines: Quantis software library (http://www.idquantique.com/support/quantis-trng.html) Catalogue identifier of previous version: AEKA_v1_0 Journal reference of previous version: Comput. Phys. Comm. 183(2012)118 Does the new version supersede the previous version?: Yes Nature of problem: Generation of random density matrices and utilization of high-quality random numbers for the purpose of computer simulation. 
Solution method: Use of a physical quantum random number generator and an on-line service providing access to the source of true random numbers generated by a quantum random number generator. Reasons for new version: Added support for the high-speed on-line quantum random number generator and improved methods for retrieving lists of random numbers. Summary of revisions: The presented version provides two significant improvements. The first one is the ability to use the on-line Quantum Random Number Generation service developed by PicoQuant GmbH and the Nano-Optics groups at the Department of Physics of Humboldt University. The on-line service supported in version 2.0 of the TRQS package provides faster access to true randomness sources constructed using the laws of quantum physics. The service is freely available at https://qrng.physik.hu-berlin.de/. The use of this service allows using the presented package without the need for a physical quantum random number generator. The second improvement introduced in this version is the ability to retrieve arrays of random data directly from the source in use. This increases the speed of the random number generation, especially in the case of an on-line service, where it reduces the time necessary to establish the connection. Thanks to the speed improvement of the presented version, the package can now be used in simulations requiring larger amounts of random data. Moreover, the functions for generating random numbers provided by the current version of the package more closely follow the pattern of functions for generating pseudo-random numbers provided in Mathematica. Additional comments: Speed comparison: The implementation of the support for the QRNG on-line service provides a noticeable improvement in the speed of random number generation. For samples of real numbers of size 10^1, 10^2, …, 10^7, the times required to generate these samples using the Quantis USB device and the QRNG service are compared in Fig. 1. 
The presented results show that the use of the on-line service provides faster access to random numbers. One should note, however, that the speed gain can increase or decrease depending on the connection speed between the computer and the server providing random numbers. Running time: Depends on the source of randomness used and the amount of random data used in the experiment. References: [1] M. Wahl, M. Leifgen, M. Berlin, T. Röhlicke, H.-J. Rahn, O. Benson, An ultrafast quantum random number generator with provably bounded output bias based on photon arrival time measurements, Applied Physics Letters 98, 171105 (2011). http://dx.doi.org/10.1063/1.3578456.
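
    What "generating random states" means in practice can be sketched without the package: the snippet below draws a random density matrix from the Ginibre (Hilbert-Schmidt) ensemble, one of the ensembles such tools typically support, seeded here by NumPy's pseudo-random generator rather than a quantum randomness source:

```python
import numpy as np

def random_density_matrix(dim, rng):
    """Draw a random density matrix from the Hilbert-Schmidt ensemble:
    rho = G G^dagger / tr(G G^dagger) for a complex Ginibre matrix G.
    The result is Hermitian, positive semidefinite, and has unit trace."""
    g = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    rho = g @ g.conj().T
    return rho / np.trace(rho)

rho = random_density_matrix(4, np.random.default_rng(42))
```

    A quantum RNG source would replace only the stream of numbers feeding `rng`; the construction of the state is unchanged.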

  10. Assessment of the effect of population and diary sampling methods on estimation of school-age children exposure to fine particles.

    PubMed

    Che, W W; Frey, H Christopher; Lau, Alexis K H

    2014-12-01

    Population and diary sampling methods are employed in exposure models to sample simulated individuals and their daily activity on each simulation day. Different sampling methods may lead to variations in estimated human exposure. In this study, two population sampling methods (stratified-random and random-random) and three diary sampling methods (random resampling, diversity and autocorrelation, and Markov-chain cluster [MCC]) are evaluated. Their impacts on estimated children's exposure to ambient fine particulate matter (PM2.5) are quantified via case studies for children in Wake County, NC for July 2002. The estimated mean daily average exposure is 12.9 μg/m³ for simulated children using the stratified population sampling method, and 12.2 μg/m³ using the random sampling method. These minor differences are caused by the random sampling among ages within census tracts. Among the three diary sampling methods, there are differences in the estimated number of individuals with multiple days of exposures exceeding a benchmark of concern of 25 μg/m³ due to differences in how multiday longitudinal diaries are estimated. The MCC method is relatively more conservative. In case studies evaluated here, the MCC method led to 10% higher estimation of the number of individuals with repeated exposures exceeding the benchmark. The comparisons help to identify and contrast the capabilities of each method and to offer insight regarding implications of method choice. Exposure simulation results are robust to the two population sampling methods evaluated, and are sensitive to the choice of method for simulating longitudinal diaries, particularly when analyzing results for specific microenvironments or for exposures exceeding a benchmark of concern. © 2014 Society for Risk Analysis.
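
    The two population sampling methods compared above can be sketched as follows. The strata, sizes, and concentration values are invented for illustration and are not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical population: exposure concentration differs by stratum
# (e.g. census tract), loosely mimicking the study's setup.
strata = {"tract_A": rng.normal(14.0, 3.0, 5000),
          "tract_B": rng.normal(10.0, 3.0, 3000)}
population = np.concatenate(list(strata.values()))

def simple_random(pop, n, rng):
    """Random-random style: draw n individuals from the whole population."""
    return rng.choice(pop, size=n, replace=False)

def stratified_random(strata, n, rng):
    """Stratified-random style: sample each stratum in proportion to
    its share of the population, then sample randomly within it."""
    total = sum(len(v) for v in strata.values())
    parts = [rng.choice(v, size=round(n * len(v) / total), replace=False)
             for v in strata.values()]
    return np.concatenate(parts)

srs = simple_random(population, 400, rng)
strs = stratified_random(strata, 400, rng)
```

    Both estimators are unbiased for the population mean, which is consistent with the study's finding that results were robust to the population sampling method; stratification mainly reduces between-sample variance.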

  11. Long-acting reversible contraceptive acceptability and unintended pregnancy among women presenting for short-acting methods: a randomized patient preference trial.

    PubMed

    Hubacher, David; Spector, Hannah; Monteith, Charles; Chen, Pai-Lien; Hart, Catherine

    2017-02-01

    Measures of contraceptive effectiveness combine technology and user-related factors. Observational studies show higher effectiveness of long-acting reversible contraception compared with short-acting reversible contraception. Women who choose long-acting reversible contraception may differ in key ways from women who choose short-acting reversible contraception, and it may be these differences that are responsible for the high effectiveness of long-acting reversible contraception. Wider use of long-acting reversible contraception is recommended, but scientific evidence of acceptability and successful use is lacking in a population that typically opts for short-acting methods. The objective of the study was to reduce bias in measuring contraceptive effectiveness and better isolate the independent role that long-acting reversible contraception has in preventing unintended pregnancy relative to short-acting reversible contraception. We conducted a partially randomized patient preference trial and recruited women aged 18-29 years who were seeking a short-acting method (pills or injectable). Participants who agreed to randomization were assigned to 1 of 2 categories: long-acting reversible contraception or short-acting reversible contraception. Women who declined randomization but agreed to follow-up in the observational cohort chose their preferred method. Under randomization, participants chose a specific method in the category and received it for free, whereas participants in the preference cohort paid for the contraception in their usual fashion. Participants were followed up prospectively to measure primary outcomes of method continuation and unintended pregnancy at 12 months. Kaplan-Meier techniques were used to estimate method continuation probabilities. Intent-to-treat principles were applied after method initiation for comparing incidence of unintended pregnancy. We also measured acceptability in terms of level of happiness with the products. 
Of the 916 participants, 43% chose randomization and 57% chose the preference option. Complete loss to follow-up at 12 months was <2%. The 12-month method continuation probabilities were 63.3% (95% confidence interval, 58.9-67.3) (preference short-acting reversible contraception), 53.0% (95% confidence interval, 45.7-59.8) (randomized short-acting reversible contraception), and 77.8% (95% confidence interval, 71.0-83.2) (randomized long-acting reversible contraception) (P < .001 in the primary comparison involving randomized groups). The 12-month cumulative unintended pregnancy probabilities were 6.4% (95% confidence interval, 4.1-8.7) (preference short-acting reversible contraception), 7.7% (95% confidence interval, 3.3-12.1) (randomized short-acting reversible contraception), and 0.7% (95% confidence interval, 0.0-4.7) (randomized long-acting reversible contraception) (P = .01 when comparing randomized groups). In the secondary comparisons involving only short-acting reversible contraception users, the continuation probability was higher in the preference group compared with the randomized group (P = .04). However, the short-acting reversible contraception randomized group and short-acting reversible contraception preference group had statistically equivalent rates of unintended pregnancy (P = .77). Seventy-eight percent of randomized long-acting reversible contraception users were happy/neutral with their initial method, compared with 89% of randomized short-acting reversible contraception users (P < .05). However, among method continuers at 12 months, all groups were equally happy/neutral (>90%). Even in a typical population of women who presented to initiate or continue short-acting reversible contraception, long-acting reversible contraception proved highly acceptable. 
One year after initiation, women randomized to long-acting reversible contraception had high continuation rates and consequently experienced superior protection from unintended pregnancy compared with women using short-acting reversible contraception; these findings are attributable to the initial technology and not underlying factors that often bias observational estimates of effectiveness. The similarly patterned experiences of the 2 short-acting reversible contraception cohorts provide a bridge of generalizability between the randomized group and usual-care preference group. Benefits of increased voluntary uptake of long-acting reversible contraception may extend to wider populations than previously thought. Copyright © 2016 Elsevier Inc. All rights reserved.
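
    The 12-month continuation probabilities reported above come from Kaplan-Meier estimation, which accounts for censored follow-up. A minimal sketch with invented discontinuation times (not trial data):

```python
import numpy as np

def kaplan_meier(time, event, horizon):
    """Kaplan-Meier survival (here: method continuation) probability
    at `horizon`.
    time  : months of follow-up per participant
    event : 1 if the method was discontinued at `time`, 0 if censored
    """
    time, event = np.asarray(time, float), np.asarray(event, int)
    surv = 1.0
    for t in np.sort(np.unique(time[event == 1])):   # distinct event times
        if t > horizon:
            break
        at_risk = np.sum(time >= t)                  # still being followed
        d = np.sum((time == t) & (event == 1))       # discontinuations at t
        surv *= 1.0 - d / at_risk
    return surv

# Hypothetical cohort: discontinuation times (months) and event flags
t = [2, 5, 5, 8, 12, 12, 12, 12, 12, 12]
e = [1, 1, 0, 1, 0, 0, 0, 0, 0, 0]
p12 = kaplan_meier(t, e, horizon=12)
```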

  12. Flexible sampling large-scale social networks by self-adjustable random walk

    NASA Astrophysics Data System (ADS)

    Xu, Xiao-Ke; Zhu, Jonathan J. H.

    2016-12-01

    Online social networks (OSNs) have become an increasingly attractive gold mine for academic and commercial researchers. However, research on OSNs faces a number of difficult challenges. One bottleneck lies in the massive quantity and often unavailability of OSN population data. Sampling perhaps becomes the only feasible solution to the problems. How to draw samples that can represent the underlying OSNs has remained a formidable task for a number of conceptual and methodological reasons. In particular, most of the empirically driven studies on network sampling are confined to simulated data or sub-graph data, which are fundamentally different from real and complete-graph OSNs. In the current study, we propose a flexible sampling method, called Self-Adjustable Random Walk (SARW), and test it against the population data of a real large-scale OSN. We evaluate the strengths of the sampling method in comparison with four prevailing methods, including uniform, breadth-first search (BFS), random walk (RW), and revised RW (i.e., MHRW) sampling. We mix both induced-edge and external-edge information of sampled nodes together in the same sampling process. Our results show that the SARW sampling method generates unbiased samples of OSNs with maximal precision and minimal cost. The study is helpful for the practice of OSN research by providing a highly needed sampling tool, for the methodological development of large-scale network sampling through comparative evaluations of existing sampling methods, and for the theoretical understanding of human networks by highlighting discrepancies and contradictions in existing knowledge and assumptions about large-scale real OSN data.
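
    The degree bias that motivates revised random-walk designs such as MHRW can be demonstrated on a toy graph. The ring-plus-hub network below is an invented stand-in for an OSN with a skewed degree distribution, not the population data used in the study:

```python
import random

def make_graph(n=300):
    """Ring of n nodes plus one hub linked to everyone: a crude stand-in
    for a network with a highly skewed degree distribution."""
    adj = {u: {(u - 1) % n, (u + 1) % n} for u in range(n)}
    for u in range(1, n):
        adj[0].add(u)
        adj[u].add(0)
    return {u: sorted(vs) for u, vs in adj.items()}

def walk(adj, steps, mh, seed=1):
    """Plain random walk (mh=False) vs Metropolis-Hastings RW (mh=True).
    A plain RW's stationary distribution is proportional to node degree,
    so it oversamples hubs; the MH acceptance step, with probability
    min(1, deg(u)/deg(v)), makes the walk uniform over nodes."""
    rng = random.Random(seed)
    u, sample = 0, []
    for _ in range(steps):
        v = rng.choice(adj[u])
        if mh and rng.random() >= len(adj[u]) / len(adj[v]):
            v = u                      # reject the move, stay put
        u = v
        sample.append(u)
    return sample

adj = make_graph()
true_mean = sum(len(v) for v in adj.values()) / len(adj)
rw_mean = sum(len(adj[u]) for u in walk(adj, 20000, mh=False)) / 20000
mh_mean = sum(len(adj[u]) for u in walk(adj, 20000, mh=True)) / 20000
```

    On this graph the plain RW's sampled mean degree is far above the true mean degree, while MHRW's is close to it, which is the kind of bias/precision trade-off the SARW method aims to manage adaptively.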

  13. Automated diagnosis of myositis from muscle ultrasound: Exploring the use of machine learning and deep learning methods.

    PubMed

    Burlina, Philippe; Billings, Seth; Joshi, Neil; Albayda, Jemima

    2017-01-01

    To evaluate the use of ultrasound coupled with machine learning (ML) and deep learning (DL) techniques for automated or semi-automated classification of myositis. Eighty subjects, comprising 19 with inclusion body myositis (IBM), 14 with polymyositis (PM), 14 with dermatomyositis (DM), and 33 normal (N) subjects, were included in this study, in which 3214 muscle ultrasound images of 7 muscles (observed bilaterally) were acquired. We considered three classification problems: (A) normal vs. affected (DM, PM, IBM); (B) normal vs. IBM patients; and (C) IBM vs. other types of myositis (DM or PM). We studied the use of an automated DL method using deep convolutional neural networks (DL-DCNNs) for diagnostic classification and compared it with a semi-automated conventional ML method based on random forests (ML-RF) and "engineered" features. We used the known clinical diagnosis as the gold standard for evaluating performance of muscle classification. The performance of the DL-DCNN method resulted in accuracies ± standard deviation of 76.2% ± 3.1% for problem (A), 86.6% ± 2.4% for (B) and 74.8% ± 3.9% for (C), while the ML-RF method led to accuracies of 72.3% ± 3.3% for problem (A), 84.3% ± 2.3% for (B) and 68.9% ± 2.5% for (C). This study demonstrates the application of machine learning methods for automatically or semi-automatically classifying inflammatory muscle disease using muscle ultrasound. Compared to the conventional random forest machine learning method used here, which has the drawback of requiring manual delineation of muscle/fat boundaries, DCNN-based classification by and large improved the accuracies in all classification problems while providing a fully automated approach to classification.
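
    The random-forest idea behind the ML-RF branch — bootstrap resampling plus random feature subsets, combined by voting — can be caricatured from scratch. The features below are invented stand-ins (the study's engineered ultrasound features and models are far richer), and the "trees" are single-split stumps for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-class data standing in for engineered ultrasound
# features; class 1 loosely plays the "affected" role.
X = np.vstack([rng.normal(0.0, 1.0, (100, 5)),
               rng.normal(0.8, 1.0, (100, 5))])
y = np.array([0] * 100 + [1] * 100)

def fit_stump(X, y, rng):
    """Fit a one-split 'tree' on a random feature subset (the source of
    randomness in a random forest), choosing feature, threshold, and
    polarity by training accuracy."""
    best = None
    for j in rng.choice(X.shape[1], size=2, replace=False):
        for t in np.quantile(X[:, j], [0.25, 0.5, 0.75]):
            for sign in (1, -1):
                acc = np.mean((sign * (X[:, j] - t) > 0).astype(int) == y)
                if best is None or acc > best[0]:
                    best = (acc, j, t, sign)
    return best[1:]

def fit_forest(X, y, n_trees=25, seed=1):
    """Bagging: each stump is trained on a bootstrap resample."""
    rng = np.random.default_rng(seed)
    stumps = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(y), len(y))
        stumps.append(fit_stump(X[idx], y[idx], rng))
    return stumps

def predict(stumps, X):
    """Majority vote over the ensemble."""
    votes = np.mean([(s * (X[:, j] - t) > 0).astype(int)
                     for j, t, s in stumps], axis=0)
    return (votes > 0.5).astype(int)

stumps = fit_forest(X, y)
train_acc = np.mean(predict(stumps, X) == y)
```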

  14. Medial tibial stress syndrome: evidence-based prevention.

    PubMed

    Craig, Debbie I

    2008-01-01

    Thacker SB, Gilchrist J, Stroup DF, Kimsey CD. The prevention of shin splints in sports: a systematic review of literature. Med Sci Sports Exerc. 2002;34(1):32-40. Among physically active individuals, which medial tibial stress syndrome (MTSS) prevention methods are most effective to decrease injury rates? Studies were identified by searching MEDLINE (1966-2000), Current Contents (1996-2000), Biomedical Collection (1993-1999), and Dissertation Abstracts. Reference lists of identified studies were searched manually until no further studies were identified. Experts in the field were contacted, including first authors of randomized controlled trials addressing prevention of MTSS. The Cochrane Collaboration (early stage of Cochrane Database of Systematic Reviews) was contacted. Inclusion criteria included randomized controlled trials or clinical trials comparing different MTSS prevention methods with control groups. Excluded were studies that did not provide primary research data or that addressed treatment and rehabilitation rather than prevention of incident MTSS. A total of 199 citations were identified. Of these, 4 studies compared prevention methods for MTSS. Three reviewers independently scored the 4 studies. Reviewers were blinded to the authors' names and affiliations but not the results. Each study was evaluated independently for methodologic quality using a 100-point checklist. Final scores were averages of the 3 reviewers' scores. Prevention methods studied were shock-absorbent insoles, foam heel pads, Achilles tendon stretching, footwear, and graduated running programs. No statistically significant results were noted for any of the prevention methods. Median quality scores ranged from 29 to 47, revealing flaws in design, control for bias, and statistical methods. No current evidence supports any single prevention method for MTSS. The most promising outcomes support the use of shock-absorbing insoles. 
Well-designed and controlled trials are critically needed to decrease the incidence of this common injury.

  15. Comparison of three controllers applied to helicopter vibration

    NASA Technical Reports Server (NTRS)

    Leyland, Jane A.

    1992-01-01

    A comparison was made of the applicability and suitability of the deterministic controller, the cautious controller, and the dual controller for the reduction of helicopter vibration by using higher harmonic blade pitch control. A randomly generated linear plant model was assumed and the performance index was defined to be a quadratic output metric of this linear plant. A computer code, designed to check out and evaluate these controllers, was implemented and used to accomplish this comparison. The effects of random measurement noise, the initial estimate of the plant matrix, and the plant matrix propagation rate were determined for each of the controllers. With few exceptions, the deterministic controller yielded the greatest vibration reduction (as characterized by the quadratic output metric) and operated with the greatest reliability. Theoretical limitations of these controllers were defined and appropriate candidate alternative methods, including one method particularly suitable to the cockpit, were identified.

  16. Design space exploration for early identification of yield limiting patterns

    NASA Astrophysics Data System (ADS)

    Li, Helen; Zou, Elain; Lee, Robben; Hong, Sid; Liu, Square; Wang, JinYan; Du, Chunshan; Zhang, Recco; Madkour, Kareem; Ali, Hussein; Hsu, Danny; Kabeel, Aliaa; ElManhawy, Wael; Kwan, Joe

    2016-03-01

To resolve the causality dilemma of which comes first, accurate design rules or real designs, this paper presents a flow for exploring the layout design space to identify, early in development, problematic patterns that will negatively affect yield. A new random layout generation method called Layout Schema Generator (LSG) is reported; it generates realistic, design-like layouts without any design rule violation. Lithography simulation is then run on the generated layouts to discover potentially problematic patterns (hotspots). These hotspot patterns are further explored by randomly inducing feature and context variations through a flow called the Hotspot Variation Flow (HSV). Simulation is then performed on this expanded set of layout clips to identify further problematic patterns. The patterns are finally classified into forbidden patterns, which should be included in the design rule checker, and legal patterns, which need better handling in the RET recipes and processes.

  17. Multilevel Monte Carlo for two phase flow and Buckley–Leverett transport in random heterogeneous porous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Müller, Florian, E-mail: florian.mueller@sam.math.ethz.ch; Jenny, Patrick, E-mail: jenny@ifd.mavt.ethz.ch; Meyer, Daniel W., E-mail: meyerda@ethz.ch

    2013-10-01

    Monte Carlo (MC) is a well known method for quantifying uncertainty arising, for example, in subsurface flow problems. Although robust and easy to implement, MC suffers from slow convergence. Extending MC by means of multigrid techniques yields the multilevel Monte Carlo (MLMC) method. MLMC has proven to greatly accelerate MC for several applications, including stochastic ordinary differential equations in finance, elliptic stochastic partial differential equations, and also hyperbolic problems. In this study, MLMC is combined with a streamline-based solver to assess uncertain two-phase flow and Buckley–Leverett transport in random heterogeneous porous media. The performance of MLMC is compared to MC for a two-dimensional reservoir with a multi-point Gaussian logarithmic permeability field. The influence of the variance and the correlation length of the logarithmic permeability on the MLMC performance is studied.
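
The telescoping idea in this abstract can be illustrated with a minimal sketch (not the paper's streamline solver): estimating E[S(1)] for a geometric Brownian motion via Euler-Maruyama, where each level-l correction couples a fine and a coarse path through shared Brownian increments. The drift and volatility values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
MU, SIGMA = 0.05, 0.2  # illustrative drift and volatility

def euler_paths(dW):
    # Euler-Maruyama for dS = MU*S dt + SIGMA*S dW, with S(0) = 1.
    n_steps = dW.shape[1]
    dt = 1.0 / n_steps
    s = np.ones(dW.shape[0])
    for k in range(n_steps):
        s = s * (1.0 + MU * dt + SIGMA * dW[:, k])
    return s

def mlmc_estimate(max_level=4, n_paths=20000):
    # Telescoping sum: E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}].
    dW0 = rng.normal(0.0, 1.0, size=(n_paths, 1))
    est = euler_paths(dW0).mean()
    for level in range(1, max_level + 1):
        n_fine = 2 ** level
        dW = rng.normal(0.0, np.sqrt(1.0 / n_fine), size=(n_paths, n_fine))
        # The coarse path reuses the same Brownian motion: sum increment pairs.
        dW_coarse = dW[:, 0::2] + dW[:, 1::2]
        est += (euler_paths(dW) - euler_paths(dW_coarse)).mean()
    return est

estimate = mlmc_estimate()  # exact answer is exp(MU)
```

In expectation the estimator matches plain MC on the finest level, but most samples are spent on the cheap coarse levels, which is the source of MLMC's speedup.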

  18. Quantifying data retention of perpendicular spin-transfer-torque magnetic random access memory chips using an effective thermal stability factor method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, Luc, E-mail: luc.thomas@headway.com; Jan, Guenole; Le, Son

    The thermal stability of perpendicular Spin-Transfer-Torque Magnetic Random Access Memory (STT-MRAM) devices is investigated at chip level. Experimental data are analyzed in the framework of the Néel-Brown model, including distributions of the thermal stability factor Δ. We show that in the low-error-rate regime important for applications, the effect of distributions of Δ can be described by a single quantity, the effective thermal stability factor Δ_eff, which encompasses both the median and the standard deviation of the distributions. Data retention of memory chips can be assessed accurately by measuring Δ_eff as a function of device diameter and temperature. We apply this method to show that 54 nm devices based on our perpendicular STT-MRAM design meet our 10 year data retention target up to 120 °C.

  19. Using Friends as Sensors to Detect Global-Scale Contagious Outbreaks

    PubMed Central

    Garcia-Herranz, Manuel; Moro, Esteban; Cebrian, Manuel; Christakis, Nicholas A.; Fowler, James H.

    2014-01-01

    Recent research has focused on the monitoring of global-scale online data for improved detection of epidemics, mood patterns, movements in the stock market, political revolutions, box-office revenues, consumer behaviour, and many other important phenomena. However, privacy considerations and the sheer scale of data available online are quickly making global monitoring infeasible, and existing methods do not take full advantage of local network structure to identify key nodes for monitoring. Here, we develop a model of the contagious spread of information in a global-scale, publicly articulated social network and show that a simple method can yield not just early detection but advance warning of contagious outbreaks. In this method, we randomly choose a small fraction of nodes in the network and then randomly choose a friend of each node to include in a group for local monitoring. Using six months of data from most of the full Twittersphere, we show that this friend group is more central in the network and helps us to detect viral outbreaks of the use of novel hashtags about 7 days earlier than we could with an equal-sized randomly chosen group. Moreover, the method works better than expected from network structure alone, because highly central actors are both more active and exhibit increased diversity in the information they transmit to others. These results suggest that local monitoring is not just more efficient but also more effective, and it may be applied to monitor contagious processes in global-scale networks. PMID:24718030
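
The sensor-selection step (pick random nodes, then one random friend of each) exploits the friendship paradox: a randomly chosen friend has higher expected degree than a randomly chosen node. A minimal sketch on a synthetic preferential-attachment graph; the graph generator, sizes, and seed are all illustrative assumptions, not the paper's Twitter data.

```python
import random

random.seed(42)

def preferential_graph(n, m=2):
    # Growth with preferential attachment: each new node links to m
    # earlier nodes chosen roughly proportionally to their degree
    # (sampling from a pool of past edge endpoints).
    adj = {i: set() for i in range(n)}
    targets, endpoint_pool = list(range(m)), []
    for v in range(m, n):
        for u in set(targets):
            adj[v].add(u)
            adj[u].add(v)
            endpoint_pool.extend([u, v])
        targets = random.sample(endpoint_pool, m)
    return adj

def friend_group(adj, k):
    # Random nodes, then one random friend of each ("friends as sensors").
    nodes = random.sample(sorted(adj), k)
    friends = [random.choice(sorted(adj[v])) for v in nodes]
    return nodes, friends

adj = preferential_graph(2000)
nodes, friends = friend_group(adj, 200)
mean_degree = lambda group: sum(len(adj[v]) for v in group) / len(group)
# The friend group is typically much better connected than the node group,
# so it sees a spreading contagion earlier.
```

The same two-step sampling needs no global view of the network, which is what makes the method practical for local monitoring.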

  20. Using friends as sensors to detect global-scale contagious outbreaks.

    PubMed

    Garcia-Herranz, Manuel; Moro, Esteban; Cebrian, Manuel; Christakis, Nicholas A; Fowler, James H

    2014-01-01

    Recent research has focused on the monitoring of global-scale online data for improved detection of epidemics, mood patterns, movements in the stock market, political revolutions, box-office revenues, consumer behaviour, and many other important phenomena. However, privacy considerations and the sheer scale of data available online are quickly making global monitoring infeasible, and existing methods do not take full advantage of local network structure to identify key nodes for monitoring. Here, we develop a model of the contagious spread of information in a global-scale, publicly articulated social network and show that a simple method can yield not just early detection but advance warning of contagious outbreaks. In this method, we randomly choose a small fraction of nodes in the network and then randomly choose a friend of each node to include in a group for local monitoring. Using six months of data from most of the full Twittersphere, we show that this friend group is more central in the network and helps us to detect viral outbreaks of the use of novel hashtags about 7 days earlier than we could with an equal-sized randomly chosen group. Moreover, the method works better than expected from network structure alone, because highly central actors are both more active and exhibit increased diversity in the information they transmit to others. These results suggest that local monitoring is not just more efficient but also more effective, and it may be applied to monitor contagious processes in global-scale networks.

  1. On the efficiency of a randomized mirror descent algorithm in online optimization problems

    NASA Astrophysics Data System (ADS)

    Gasnikov, A. V.; Nesterov, Yu. E.; Spokoiny, V. G.

    2015-04-01

    A randomized online version of the mirror descent method is proposed. It differs from existing versions in the randomization method: randomization is performed at the stage of projecting a subgradient of the function being optimized onto the unit simplex, rather than at the stage of computing a subgradient, which is common practice. As a result, a componentwise subgradient descent with a randomly chosen component is obtained, which admits an online interpretation. This observation, for example, has made it possible to uniformly interpret results on weighting expert decisions and to propose the most efficient method for searching for an equilibrium in a zero-sum two-person matrix game with a sparse matrix.
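
For context, the non-randomized baseline that the paper modifies is entropic mirror descent on the unit simplex, whose update is the multiplicative-weights rule. A minimal sketch of that baseline follows; the step size and test problem are illustrative, and the paper's contribution (randomizing the projection step) is not reproduced here.

```python
import numpy as np

def entropic_mirror_descent(grad, x0, steps=500, eta=0.1):
    # Mirror descent on the probability simplex with the entropy mirror map:
    # multiplicative-weights update x <- x * exp(-eta * g) / Z.
    x = x0.copy()
    for _ in range(steps):
        g = grad(x)
        x = x * np.exp(-eta * g)
        x /= x.sum()
    return x

# Minimize the linear function f(x) = <c, x> over the simplex;
# the optimum puts all mass on the coordinate with the smallest cost.
c = np.array([0.3, 0.1, 0.5])
x = entropic_mirror_descent(lambda x: c, np.ones(3) / 3)
```

The iterate stays on the simplex by construction, which is exactly the structure the randomized projection step in the abstract exploits.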

  2. A Multivariate Randomization Test of Association Applied to Cognitive Test Results

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert; Beard, Bettina

    2009-01-01

    Randomization tests provide a conceptually simple, distribution-free way to implement significance testing. We have applied this method to the problem of evaluating the significance of the association among a number (k) of variables. The randomization method was the random re-ordering of k-1 of the variables. The criterion variable was the value of the largest eigenvalue of the correlation matrix.
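
A sketch of the procedure as described: the statistic is the largest eigenvalue of the correlation matrix, and the null distribution comes from randomly re-ordering k-1 of the k variables. The synthetic data and permutation count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def largest_corr_eig(X):
    # Test statistic: largest eigenvalue of the sample correlation matrix.
    return np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[-1]

def randomization_test(X, n_perm=500):
    # Break any association by randomly re-ordering k-1 of the k columns
    # (column 0 stays fixed), then compare the observed statistic with
    # the permutation distribution.
    observed = largest_corr_eig(X)
    exceed = 0
    for _ in range(n_perm):
        Xp = X.copy()
        for j in range(1, X.shape[1]):
            rng.shuffle(Xp[:, j])  # in-place shuffle of one column
        exceed += largest_corr_eig(Xp) >= observed
    p_value = (exceed + 1) / (n_perm + 1)
    return observed, p_value

# Columns sharing a latent factor are strongly associated.
z = rng.normal(size=200)
X = np.column_stack([z + 0.5 * rng.normal(size=200) for _ in range(3)])
observed, p_value = randomization_test(X)
```

Because the null distribution is built by permutation, the test makes no distributional assumption about the variables, which is the appeal noted in the abstract.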

  3. An Experimental Investigation of Hydrodynamic Forces on Circular Cylinders in Sinusoidal and Random Oscillating Flow

    NASA Astrophysics Data System (ADS)

    Longoria, Raul Gilberto

    An experimental apparatus has been developed that can generate a general time-dependent planar flow across a cylinder. A mass of water, enclosed with no free surface within a square cross-section tank and two spring pre-loaded pistons, is oscillated using a hydraulic actuator. A circular cylinder is suspended horizontally in the tank by two X-Y force transducers used to simultaneously measure the total in-line and transverse forces. Fluid motion is measured using a differential pressure transducer for instantaneous acceleration and an LVDT for displacement. This investigation provides measurements of forces on cylinders subjected to planar flow with a time (and frequency) dependence that more accurately represents the random conditions encountered in a natural ocean environment. The use of the same apparatus for both sinusoidal and random experiments provides a quantified assessment of the applicability of sinusoidal planar oscillatory flow data in offshore structure design methods. The drag and inertia coefficients for a Morison equation representation of the in-line force are presented for both sinusoidal and random flow. The sinusoidal results compare favorably with those of previous investigations. The results from the random experiments illustrate the difference in the force mechanism by contrasting the force transfer coefficients for the in-line and transverse forces. It is found that applying sinusoidal results to random hydrodynamic in-line force prediction with the Morison equation wrongly weights the drag and inertia components, and that the transverse force is overpredicted. The use of random planar oscillatory flow in the laboratory, contrasted with sinusoidal planar oscillatory flow, quantifies the accepted belief that force transfer coefficients from sinusoidal flow experiments are conservative for predicting forces on cylindrical structures subjected to random sea waves.
Further analysis of the data is conducted in the frequency domain to illustrate models used for predicting the power spectral density of the in-line force, including a nonlinear describing-function method. It is postulated that the large-scale vortex activity prominent in sinusoidal oscillatory flow is subdued in random flow conditions.

  4. Sleep Promotion Program for Improving Sleep Behaviors in Adolescents: A Randomized Controlled Pilot Study

    PubMed Central

    John, Bindu; Bellipady, Sumanth Shetty; Bhat, Shrinivasa Undaru

    2016-01-01

    Aims. The purpose of this pilot trial was to determine the efficacy of a sleep promotion program adapted for adolescents studying in various schools of Mangalore, India, and to evaluate feasibility issues before conducting a randomized controlled trial in a larger sample of adolescents. Methods. A randomized controlled trial design with a stratified random sampling method was used. Fifty-eight adolescents were selected (mean age: 14.02 ± 2.15 years; intervention group, n = 34; control group, n = 24). Self-report questionnaires, including a sociodemographic questionnaire with additional questions on sleep and activities, the Sleep Hygiene Index, the Pittsburgh Sleep Quality Index, the Cleveland Adolescent Sleepiness Questionnaire, and the PedsQL™ Present Functioning Visual Analogue Scale, were used. Results. Insufficient weekday-weekend sleep duration with increasing age of adolescents was observed. The program revealed a significant effect in the experimental group over the control group in overall sleep quality, sleep onset latency, sleep duration, daytime sleepiness, and emotional and overall distress. No significant effect was observed in sleep hygiene and other sleep parameters. All target variables showed significant correlations with each other. Conclusion. The intervention holds promise for improving sleep behaviors in healthy adolescents. However, the effect of the sleep promotion program has yet to be confirmed through future research. This trial is registered with ISRCTN13083118. PMID:27088040

  5. Elastic constants of random solid solutions by SQS and CPA approaches: the case of fcc Ti-Al.

    PubMed

    Tian, Li-Yun; Hu, Qing-Miao; Yang, Rui; Zhao, Jijun; Johansson, Börje; Vitos, Levente

    2015-08-12

    Special quasi-random structure (SQS) and coherent potential approximation (CPA) are techniques widely employed in first-principles calculations of random alloys. Here we scrutinize these approaches by focusing on the local lattice distortion (LLD) and crystal symmetry effects. We compare the elastic parameters obtained from SQS and CPA calculations, taking the random face-centered cubic (fcc) Ti(1-x)Al(x) (0 ≤ x ≤ 1) alloy as an example of systems with components showing different electronic structures and bonding characteristics. For the CPA and SQS calculations, we employ the Exact Muffin-Tin Orbitals (EMTO) method and the pseudopotential method as implemented in the Vienna Ab initio Simulation Package (VASP), respectively. We show that the predicted trends of the VASP-SQS and EMTO-CPA parameters against composition are in good agreement with each other. The energy associated with the LLD increases with x up to x = 0.625 ~ 0.750 and drops drastically thereafter. The influence of the LLD on the lattice constants and the C12 elastic constant is negligible. C11 and C44 decrease after atomic relaxation for alloys with large LLD; however, the trends of C11 and C44 are not significantly affected. In general, the uncertainties in the elastic parameters associated with the symmetry lowering turn out to be larger than the differences between the two techniques, including the effect of LLD.

  6. Multiple Imputation of a Randomly Censored Covariate Improves Logistic Regression Analysis.

    PubMed

    Atem, Folefac D; Qian, Jing; Maye, Jacqueline E; Johnson, Keith A; Betensky, Rebecca A

    2016-01-01

    Randomly censored covariates arise frequently in epidemiologic studies. The most commonly used methods, including complete case analysis and single imputation or substitution, suffer from inefficiency and bias: they make strong parametric assumptions or consider only limit-of-detection censoring. We employ multiple imputation, in conjunction with semi-parametric modeling of the censored covariate, to overcome these shortcomings and to facilitate robust estimation. We develop a multiple imputation approach for randomly censored covariates within the framework of a logistic regression model. We use the non-parametric estimate of the covariate distribution or the semi-parametric Cox model estimate in the presence of additional covariates in the model. We evaluate this procedure in simulations, and compare its operating characteristics to those of the complete case analysis and a survival regression approach. We apply the procedures to an Alzheimer's study of the association between amyloid positivity and maternal age of onset of dementia. Multiple imputation achieves lower standard errors and higher power than the complete case approach under heavy and moderate censoring and is comparable under light censoring. The survival regression approach achieves the highest power among all procedures, but does not produce interpretable estimates of association. Multiple imputation offers a favorable alternative to complete case analysis and ad hoc substitution methods in the presence of randomly censored covariates within the framework of logistic regression.

  7. Uncertain dynamic analysis for rigid-flexible mechanisms with random geometry and material properties

    NASA Astrophysics Data System (ADS)

    Wu, Jinglai; Luo, Zhen; Zhang, Nong; Zhang, Yunqing; Walker, Paul D.

    2017-02-01

    This paper proposes an uncertain modelling and computational method to analyze dynamic responses of rigid-flexible multibody systems (or mechanisms) with random geometry and material properties. Firstly, the deterministic model for the rigid-flexible multibody system is built with the absolute node coordinate formula (ANCF), in which the flexible parts are modeled by using ANCF elements, while the rigid parts are described by ANCF reference nodes (ANCF-RNs). Secondly, uncertainty for the geometry of rigid parts is expressed as uniform random variables, while the uncertainty for the material properties of flexible parts is modeled as a continuous random field, which is further discretized to Gaussian random variables using a series expansion method. Finally, a non-intrusive numerical method is developed to solve the dynamic equations of systems involving both types of random variables, which systematically integrates the deterministic generalized-α solver with Latin Hypercube sampling (LHS) and Polynomial Chaos (PC) expansion. The benchmark slider-crank mechanism is used as a numerical example to demonstrate the characteristics of the proposed method.
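
Of the ingredients named in this abstract, Latin Hypercube sampling is the easiest to sketch: each of the n sample points occupies a distinct equal-probability bin in every dimension, which spreads the samples more evenly than plain Monte Carlo. A minimal illustrative implementation (the dimensions, sample count, and seed are assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)

def latin_hypercube(n, d):
    # One point per equal-probability bin in each dimension; the bin
    # assignment is an independent random permutation per dimension.
    samples = np.empty((n, d))
    for j in range(d):
        bins = rng.permutation(n)                    # which bin each point gets
        samples[:, j] = (bins + rng.random(n)) / n   # jitter within the bin
    return samples

points = latin_hypercube(10, 2)  # 10 points on the unit square
```

To propagate uncertainty as in the paper, each row of `points` would be mapped through the inverse CDFs of the random inputs and fed to the deterministic solver.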

  8. Blocking for Sequential Political Experiments

    PubMed Central

    Moore, Sally A.

    2013-01-01

    In typical political experiments, researchers randomize a set of households, precincts, or individuals to treatments all at once, and characteristics of all units are known at the time of randomization. However, in many other experiments, subjects “trickle in” to be randomized to treatment conditions, usually via complete randomization. To take advantage of the rich background data that researchers often have (but underutilize) in these experiments, we develop methods that use continuous covariates to assign treatments sequentially. We build on biased coin and minimization procedures for discrete covariates and demonstrate that our methods outperform complete randomization, producing better covariate balance in simulated data. We then describe how we selected and deployed a sequential blocking method in a clinical trial and demonstrate the advantages of our having done so. Further, we show how that method would have performed in two larger sequential political trials. Finally, we compare causal effect estimates from differences in means, augmented inverse propensity weighted estimators, and randomization test inversion. PMID:24143061

  9. Reporting of embryo transfer methods in IVF research: a cross-sectional study.

    PubMed

    Gambadauro, Pietro; Navaratnarajah, Ramesan

    2015-02-01

    The reporting of embryo transfer methods in IVF research was assessed through a cross-sectional analysis of randomized controlled trials (RCTs) published between 2010 and 2011. A systematic search identified 325 abstracts; 122 RCTs were included in the study. Embryo transfer methods were described in 42 out of 122 articles (34%). Catheters (32/42 [76%]) or ultrasound guidance (31/42 [74%]) were most frequently mentioned. Performer 'blinding' (12%) or technique standardization (7%) were seldom reported. The description of embryo transfer methods was significantly more common in trials published by journals with lower impact factor (less than 3, 39.6%; 3 or greater, 21.5%; P = 0.037). Embryo transfer methods were reported more often in trials with pregnancy as the main end-point (33% versus 16%) or with positive outcomes (37.8% versus 25.0%), albeit not significantly. Multivariate logistic regression confirmed that RCTs published in higher impact factor journals are less likely to describe embryo transfer methods (OR 0.371; 95% CI 0.143 to 0.964). Registered trials, trials conducted in an academic setting, multi-centric studies or full-length articles were not positively associated with embryo transfer methods reporting rate. Recent reports of randomized IVF trials rarely describe embryo transfer methods. The under-reporting of research methods might compromise reproducibility and suitability for meta-analysis. Copyright © 2014 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.

  10. Methods to Evaluate the Effects of Internet-Based Digital Health Interventions for Citizens: Systematic Review of Reviews.

    PubMed

    Zanaboni, Paolo; Ngangue, Patrice; Mbemba, Gisele Irène Claudine; Schopf, Thomas Roger; Bergmo, Trine Strand; Gagnon, Marie-Pierre

    2018-06-07

    Digital health can empower citizens to manage their health and address health care system problems including poor access, uncoordinated care, and increasing costs. Digital health interventions are typically complex interventions; therefore, their evaluation presents methodological challenges. The objective of this study was to provide a systematic overview of the methods used to evaluate the effects of internet-based digital health interventions for citizens. Three research questions were addressed to explore methods regarding approaches (study design), effects, and indicators. We conducted a systematic review of reviews of the methods used to measure the effects of internet-based digital health interventions for citizens. The protocol was developed a priori according to Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols and the Cochrane Collaboration methodology for overviews of reviews. Qualitative, mixed-method, and quantitative reviews published in English or French from January 2010 to October 2016 were included. We searched for published reviews in PubMed, EMBASE, the Cochrane Database of Systematic Reviews, CINAHL, and Epistemonikos. We categorized the findings based on a thematic analysis of the reviews structured around study designs, indicators, types of interventions, effects, and perspectives. A total of 20 unique reviews were included. The most common digital health interventions for citizens were patient portals and patients' access to electronic health records, covered by 10/20 (50%) and 6/20 (30%) reviews, respectively. Quantitative approaches to study design included observational studies (15/20 reviews, 75%), randomized controlled trials (13/20 reviews, 65%), quasi-experimental designs (9/20 reviews, 45%), and pre-post studies (6/20 reviews, 30%). Qualitative studies or mixed methods were reported in 13/20 (65%) reviews.
Five main categories of effects were identified: (1) health and clinical outcomes, (2) psychological and behavioral outcomes, (3) health care utilization, (4) system adoption and use, and (5) system attributes. Health and clinical outcomes were measured with both general indicators and disease-specific indicators and reported in 11/20 (55%) reviews. Patient-provider communication and patient satisfaction were the most investigated psychological and behavioral outcomes, reported in 13/20 (65%) and 12/20 (60%) reviews, respectively. Evaluation of health care utilization was included in 8/20 (40%) reviews, most of which focused on the economic effects on the health care system. Although observational studies and surveys have provided evidence of benefits and satisfaction for patients, there is still little reliable evidence from randomized controlled trials of improved health outcomes. Future evaluations of digital health interventions for citizens should focus on specific populations or chronic conditions which are more likely to achieve clinically meaningful benefits and use high-quality approaches such as randomized controlled trials. Implementation research methods should also be considered. We identified a wide range of effects and indicators, most of which focused on patients as main end users. Implications for providers and the health system should also be included in evaluations or monitoring of digital health interventions. ©Paolo Zanaboni, Patrice Ngangue, Gisele Irène Claudine Mbemba, Thomas Roger Schopf, Trine Strand Bergmo, Marie-Pierre Gagnon. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 07.06.2018.

  11. Comprehensive T-matrix Reference Database: A 2009-2011 Update

    NASA Technical Reports Server (NTRS)

    Zakharova, Nadezhda T.; Videen, G.; Khlebtsov, Nikolai G.

    2012-01-01

    The T-matrix method is one of the most versatile and efficient theoretical techniques widely used for the computation of electromagnetic scattering by single and composite particles, discrete random media, and particles in the vicinity of an interface separating two half-spaces with different refractive indices. This paper presents an update to the comprehensive database of peer-reviewed T-matrix publications compiled by us previously and includes the publications that appeared since 2009. It also lists several earlier publications not included in the original database.

  12. Methodology Series Module 5: Sampling Strategies

    PubMed Central

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'sampling method'. There are essentially two types of sampling methods: 1) probability sampling, based on chance events (such as random numbers or flipping a coin); and 2) non-probability sampling, based on the researcher's choice or on the population that is accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling (such as a simple random sample or a stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when a convenience sample was actually used). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study. PMID:27688438
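
The two probability-sampling schemes named above (simple random and stratified random sampling) can be sketched in a few lines; the toy population and its 3:1 urban/rural split are illustrative assumptions.

```python
import random

random.seed(7)

# Toy population: 750 urban and 250 rural members.
population = [{"id": i, "stratum": "urban" if i % 4 else "rural"}
              for i in range(1000)]

def simple_random_sample(pop, n):
    # Every member has the same chance of selection.
    return random.sample(pop, n)

def stratified_sample(pop, n):
    # Proportional allocation: sample within each stratum in proportion
    # to its share of the population (rounding can shift totals slightly
    # when shares are not exact).
    strata = {}
    for member in pop:
        strata.setdefault(member["stratum"], []).append(member)
    out = []
    for members in strata.values():
        k = round(n * len(members) / len(pop))
        out.extend(random.sample(members, k))
    return out

srs = simple_random_sample(population, 100)
strat = stratified_sample(population, 100)
```

The stratified sample guarantees the 75/25 urban/rural composition by design, whereas the simple random sample only matches it in expectation.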

  13. Multiple Scattering in Planetary Regoliths Using Incoherent Interactions

    NASA Astrophysics Data System (ADS)

    Muinonen, K.; Markkanen, J.; Vaisanen, T.; Penttilä, A.

    2017-12-01

    We consider scattering of light by a planetary regolith using novel numerical methods for discrete random media of particles. Understanding the scattering process is of key importance for spectroscopic, photometric, and polarimetric modeling of airless planetary objects, including radar studies. In our modeling, the size of the spherical random medium can range from microscopic to macroscopic, whereas the particles are assumed to be of the order of the wavelength in size. We extend the radiative transfer and coherent backscattering method (RT-CB) to the case of dense packing of particles by adopting the ensemble-averaged first-order incoherent extinction, scattering, and absorption characteristics of a volume element of particles as input. In the radiative transfer part, at each absorption and scattering process, we account for absorption with the help of the single-scattering albedo and peel off the Stokes parameters of radiation emerging from the medium in predefined scattering angles. We then generate a new scattering direction using the joint probability density for the local polar and azimuthal scattering angles. In the coherent backscattering part, we utilize amplitude scattering matrices along the radiative-transfer path and the reciprocal path. Furthermore, we replace the far-field interactions of the RT-CB method with rigorous interactions facilitated by the Superposition T-matrix method (STMM). This gives rise to a new RT-RT method, radiative transfer with reciprocal interactions. For microscopic random media, we then compare the new results to asymptotically exact results computed using the STMM, succeeding in the numerical validation of the new methods. Acknowledgments: Research supported by the European Research Council with Advanced Grant No. 320773 SAEMPL, Scattering and Absorption of ElectroMagnetic waves in ParticuLate media. Computational resources provided by CSC - IT Centre for Science Ltd, Finland.

  14. 2GETHER - The Dual Protection Project: Design and rationale of a randomized controlled trial to increase dual protection strategy selection and adherence among African American adolescent females.

    PubMed

    Ewing, Alexander C; Kottke, Melissa J; Kraft, Joan Marie; Sales, Jessica M; Brown, Jennifer L; Goedken, Peggy; Wiener, Jeffrey; Kourtis, Athena P

    2017-03-01

    African American adolescent females are at elevated risk for unintended pregnancy and sexually transmitted infections (STIs). Dual protection (DP) is defined as concurrent prevention of pregnancy and STIs. This can be achieved by abstinence, consistent condom use, or the dual methods of condoms plus an effective non-barrier contraceptive. Previous clinic-based interventions showed short-term effects on increasing dual method use, but evidence of sustained effects on dual method use and decreased incident pregnancies and STIs is lacking. This manuscript describes the 2GETHER Project. 2GETHER is a randomized controlled trial of a multi-component intervention to increase dual protection use among sexually active African American females aged 14-19 years not desiring pregnancy at a Title X clinic in Atlanta, GA. The intervention is clinic-based and includes a culturally tailored interactive multimedia component and counseling sessions, both to assist in selection of a DP method and to reinforce use of the DP method. Participants are randomized to the study intervention or the standard of care and followed for 12 months to evaluate how the intervention influences DP method selection and adherence, pregnancy and STI incidence, and participants' DP knowledge, intentions, and self-efficacy. The 2GETHER Project is a novel trial to reduce unintended pregnancies and STIs among African American adolescents. The intervention is unique in the comprehensive and complementary nature of its components and its individual tailoring of provider-patient interaction. If the trial interventions are shown to be effective, it will be reasonable to assess their scalability and applicability in other populations. Published by Elsevier Inc.

  15. Marginal and Random Intercepts Models for Longitudinal Binary Data With Examples From Criminology.

    PubMed

    Long, Jeffrey D; Loeber, Rolf; Farrington, David P

    2009-01-01

    Two models for the analysis of longitudinal binary data are discussed: the marginal model and the random intercepts model. In contrast to the linear mixed model (LMM), the two models for binary data are not subsumed under a single hierarchical model. The marginal model provides group-level information whereas the random intercepts model provides individual-level information, including information about heterogeneity of growth. It is shown how a type of numerical averaging can be used with the random intercepts model to obtain group-level information, thus approximating individual and marginal aspects of the LMM. The types of inferences associated with each model are illustrated with longitudinal criminal offending data based on N = 506 males followed over a 22-year period. Violent offending indexed by official records and self-report was analyzed, with the marginal model estimated using generalized estimating equations and the random intercepts model estimated using maximum likelihood. The results show that numerical averaging based on the random intercepts can produce prediction curves almost identical to those obtained directly from the marginal model parameter estimates. The results provide a basis for contrasting the models and the estimation procedures, and key features are discussed to aid in selecting a method for empirical analysis.
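
    The numerical averaging described above can be illustrated with a small sketch (the coefficients and intercept variance below are hypothetical, not the study's estimates): subject-specific logistic curves are averaged over simulated random intercepts to recover a population-averaged (marginal) probability, which is attenuated relative to the curve of the "typical" subject with u = 0.

```python
import math
import random

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

def marginal_by_averaging(beta0, beta1, t, sigma_u, n_draws=20000, seed=1):
    """Approximate the population-averaged P(y=1 | t) by numerically
    averaging subject-specific curves over random intercepts u ~ N(0, sigma_u^2)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_draws):
        u = rng.gauss(0.0, sigma_u)
        total += logistic(beta0 + u + beta1 * t)
    return total / n_draws

# Subject-specific probability for the "typical" subject (u = 0)
# versus the marginal probability obtained by numerical averaging.
p_conditional = logistic(2.0)
p_marginal = marginal_by_averaging(2.0, 0.0, 0.0, sigma_u=1.5)
# The marginal curve is attenuated toward 0.5 relative to the u = 0 curve.
```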

  16. Sequential Multiple Assignment Randomized Trial (SMART) with Adaptive Randomization for Quality Improvement in Depression Treatment Program

    PubMed Central

    Chakraborty, Bibhas; Davidson, Karina W.

    2015-01-01

    Summary An implementation study is an important tool for deploying state-of-the-art treatments from clinical efficacy studies into a treatment program, with the dual goals of learning about the effectiveness of the treatments and improving the quality of care for patients enrolled in the program. In this article, we deal with the design of a treatment program of dynamic treatment regimens (DTRs) for patients with depression after acute coronary syndrome. We introduce a novel adaptive randomization scheme for a sequential multiple assignment randomized trial of DTRs. Our approach adapts the randomization probabilities to favor treatment sequences having comparatively superior Q-functions, as used in Q-learning. The proposed approach addresses three main concerns of an implementation study: it allows incorporation of historical data or opinions, it includes randomization for learning purposes, and it aims to improve care via adaptation throughout the program. We demonstrate how to apply our method to design a depression treatment program using data from a previous study. By simulation, we illustrate that inputs from historical data are important for program performance, measured by the expected outcomes of the enrollees, but also show that the adaptive randomization scheme is able to compensate for poorly specified historical inputs by improving patient outcomes within a reasonable horizon. The simulation results also confirm that the proposed design allows efficient learning of the treatments by alleviating the curse of dimensionality. PMID:25354029
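
    The core idea of favoring treatment sequences with superior Q-functions can be sketched as follows; the softmax weighting and the Q-values below are illustrative assumptions, not the scheme's published form.

```python
import math
import random

def adaptive_probs(q_values, temperature=1.0):
    """Softmax-style randomization: arms with larger estimated Q-functions
    receive larger randomization probabilities (a hypothetical scheme)."""
    m = max(q_values)
    w = [math.exp((q - m) / temperature) for q in q_values]
    s = sum(w)
    return [x / s for x in w]

def assign(q_values, rng):
    """Draw one arm according to the adaptive randomization probabilities."""
    probs = adaptive_probs(q_values)
    r, cum = rng.random(), 0.0
    for arm, p in enumerate(probs):
        cum += p
        if r < cum:
            return arm
    return len(probs) - 1

rng = random.Random(7)
q = [0.2, 0.5, 1.1]          # estimated Q-values for three treatment sequences
probs = adaptive_probs(q)
arms = [assign(q, rng) for _ in range(10000)]
```

Every arm retains a nonzero probability, so the design keeps randomizing for learning while steering enrollees toward the currently better sequences.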

  17. Probabilistic SSME blades structural response under random pulse loading

    NASA Technical Reports Server (NTRS)

    Shiao, Michael; Rubinstein, Robert; Nagpal, Vinod K.

    1987-01-01

    The purpose is to develop models of random impacts on a Space Shuttle Main Engine (SSME) turbopump blade and to predict the probabilistic structural response of the blade to these impacts. The random loading is caused by the impact of debris. The probabilistic structural response is characterized by distribution functions for stress and displacements as functions of the loading parameters which determine the random pulse model. These parameters include pulse arrival, amplitude, and location. The analysis can be extended to predict level crossing rates. This requires knowledge of the joint distribution of the response and its derivative. The model of random impacts chosen allows the pulse arrivals, pulse amplitudes, and pulse locations to be random. Specifically, the pulse arrivals are assumed to be governed by a Poisson process, which is characterized by a mean arrival rate. The pulse intensity is modelled as a normally distributed random variable with a zero mean chosen independently at each arrival. The standard deviation of the distribution is a measure of pulse intensity. Several different models were used for the pulse locations. For example, three points near the blade tip were chosen at which pulses were allowed to arrive with equal probability. Again, the locations were chosen independently at each arrival. The structural response was analyzed both by direct Monte Carlo simulation and by a semi-analytical method.
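
    A minimal Monte Carlo sketch of the pulse model described above (all parameter values are hypothetical): arrivals follow a Poisson process, amplitudes are zero-mean normal, and locations are chosen with equal probability from a few candidate points near the blade tip.

```python
import random

def simulate_pulses(rate, horizon, sigma, locations, seed=0):
    """Simulate random impacts: Poisson arrivals at mean rate `rate`
    over [0, horizon]; each pulse has a N(0, sigma^2) amplitude and a
    location drawn with equal probability from the candidate points."""
    rng = random.Random(seed)
    t, pulses = 0.0, []
    while True:
        t += rng.expovariate(rate)        # exponential inter-arrival times
        if t > horizon:
            break
        amplitude = rng.gauss(0.0, sigma) # zero-mean normal pulse intensity
        loc = rng.choice(locations)       # equal-probability tip locations
        pulses.append((t, amplitude, loc))
    return pulses

pulses = simulate_pulses(rate=5.0, horizon=10.0, sigma=2.0,
                         locations=["tip_1", "tip_2", "tip_3"])
```

Feeding each simulated pulse train into a structural model and collecting the stress and displacement responses yields the empirical distribution functions the abstract refers to.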

  18. Single molecule counting and assessment of random molecular tagging errors with transposable giga-scale error-correcting barcodes.

    PubMed

    Lau, Billy T; Ji, Hanlee P

    2017-09-21

    RNA-Seq measures gene expression by counting sequence reads belonging to unique cDNA fragments. Molecular barcodes, commonly in the form of random nucleotides, were recently introduced to improve gene expression measures by detecting amplification duplicates, but they are susceptible to errors generated during PCR and sequencing. This results in false positive counts, leading to inaccurate transcriptome quantification, especially at low-input and single-cell RNA amounts where the total number of molecules present is minuscule. To address this issue, we demonstrated the systematic identification of molecular species using transposable error-correcting barcodes that are exponentially expanded to tens of billions of unique labels. We experimentally showed that random-mer molecular barcodes suffer from substantial and persistent errors that are difficult to resolve. To assess our method's performance, we applied it to the analysis of known reference RNA standards. By including an inline random-mer molecular barcode, we systematically characterized the presence of sequence errors in random-mer molecular barcodes. We observed that such errors are extensive and become more dominant at low input amounts. We describe the first study to use transposable molecular barcodes and their use for studying random-mer molecular barcode errors. The extensive errors found in random-mer molecular barcodes may warrant the use of error-correcting barcodes for transcriptome analysis as input amounts decrease.
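
    The kind of barcode-error collapsing that motivates this work can be sketched as follows; the `collapse_counts` helper and its abundance threshold are an illustration of the widely used "directional" rule for random-mer barcodes, not the authors' transposable-barcode method.

```python
from collections import Counter

def hamming1(a, b):
    """True if two equal-length barcodes differ at exactly one position."""
    if len(a) != len(b):
        return False
    return sum(x != y for x, y in zip(a, b)) == 1

def collapse_counts(barcode_counts):
    """Merge a low-count barcode into an abundant neighbor one mismatch away,
    a common directional correction for random-mer molecular barcodes."""
    corrected = Counter()
    # Process most-abundant barcodes first so errors fold into true molecules.
    for bc, n in sorted(barcode_counts.items(), key=lambda kv: -kv[1]):
        parent = next((p for p in corrected
                       if hamming1(p, bc) and corrected[p] >= 2 * n - 1), None)
        corrected[parent if parent else bc] += n
    return corrected

raw = Counter({"ACGT": 100, "ACGA": 3, "TTTT": 40})  # ACGA: likely PCR/sequencing error of ACGT
clean = collapse_counts(raw)
```

Without such correction, the errored barcode "ACGA" would be counted as a separate molecule, inflating the molecule count exactly as the abstract describes.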

  19. Application of random survival forests in understanding the determinants of under-five child mortality in Uganda in the presence of covariates that satisfy the proportional and non-proportional hazards assumption.

    PubMed

    Nasejje, Justine B; Mwambi, Henry

    2017-09-07

    Uganda, like many other Sub-Saharan African countries, has a high under-five child mortality rate. To inform policy on intervention strategies, sound statistical methods are required to critically identify factors strongly associated with under-five child mortality rates. The Cox proportional hazards model has been a common choice for analysing such data to understand factors strongly associated with high child mortality rates, taking age as the time-to-event variable. However, due to its restrictive proportional hazards (PH) assumption, covariates of interest that do not satisfy the assumption are often excluded from the analysis to avoid mis-specifying the model; using covariates that clearly violate the assumption would yield invalid results. Survival trees and random survival forests are increasingly popular for analysing survival data, particularly large survey data, and are attractive alternatives to models with the restrictive PH assumption. In this article, we adopt random survival forests, which have not previously been used to study the factors affecting under-five child mortality rates in Uganda, using Demographic and Health Survey data. The first part of the analysis is based on the classical Cox PH model and the second part on random survival forests in the presence of covariates that do not necessarily satisfy the PH assumption. Random survival forests and the Cox proportional hazards model agree that the sex of the household head, the sex of the child, and the number of births in the past year are strongly associated with under-five child mortality in Uganda, given that all three covariates satisfy the PH assumption. Random survival forests further demonstrated that covariates originally excluded from the earlier analysis because they violate the PH assumption are important in explaining under-five child mortality rates. These covariates include the number of children under the age of five in a household, the number of births in the past 5 years, wealth index, total number of children ever born, and the child's birth order. The results further indicated that the predictive performance of random survival forests built using all covariates, including those that violate the PH assumption, was higher than that of random survival forests built using only covariates that satisfy the PH assumption. Random survival forests are appealing methods for analysing public health data to understand factors strongly associated with under-five child mortality rates, especially in the presence of covariates that violate the proportional hazards assumption.

  20. Outcomes Definitions and Statistical Tests in Oncology Studies: A Systematic Review of the Reporting Consistency

    PubMed Central

    Rivoirard, Romain; Duplay, Vianney; Oriol, Mathieu; Tinquaut, Fabien; Chauvin, Franck; Magne, Nicolas; Bourmaud, Aurelie

    2016-01-01

    Background Quality of reporting for Randomized Clinical Trials (RCTs) in oncology has been analyzed in several systematic reviews, but there is a paucity of data on outcome definitions and on the consistency of reporting of statistical tests in RCTs and Observational Studies (OBS). The objective of this review was to describe these two reporting aspects for OBS and RCTs in oncology. Methods From a list of 19 medical journals, three were retained for analysis after a random selection: British Medical Journal (BMJ), Annals of Oncology (AoO) and British Journal of Cancer (BJC). All original articles published between March 2009 and March 2014 were screened. Only studies whose main outcome was accompanied by a corresponding statistical test were included in the analysis. Studies based on censored data were excluded. The primary outcome was to assess the quality of reporting of the primary outcome measure in RCTs and of the variables of interest in OBS. A logistic regression was performed to identify covariates of studies potentially associated with concordance of tests between the Methods and Results sections. Results 826 studies were included in the review, of which 698 were OBS. Variables were described in the Methods section for all OBS, and the primary endpoint was clearly detailed in the Methods section for 109 RCTs (85.2%). 295 OBS (42.2%) and 43 RCTs (33.6%) had perfect agreement for the reported statistical test between the Methods and Results sections. In multivariable analysis, the variable "number of included patients in study" was associated with test consistency: the aOR (adjusted Odds Ratio) for the third group compared to the first group was aOR Grp3 = 0.52 [0.31–0.89] (P value = 0.009). Conclusion Variables in OBS and the primary endpoint in RCTs are reported and described with high frequency. However, consistency of statistical tests between the Methods and Results sections of OBS is not always achieved. Therefore, we encourage authors and peer reviewers to verify the consistency of statistical tests in oncology studies. PMID:27716793

  1. Self-balanced real-time photonic scheme for ultrafast random number generation

    NASA Astrophysics Data System (ADS)

    Li, Pu; Guo, Ya; Guo, Yanqiang; Fan, Yuanlong; Guo, Xiaomin; Liu, Xianglian; Shore, K. Alan; Dubrova, Elena; Xu, Bingjie; Wang, Yuncai; Wang, Anbang

    2018-06-01

    We propose a real-time self-balanced photonic method for extracting ultrafast random numbers from broadband randomness sources. In place of electronic analog-to-digital converters (ADCs), balanced photo-detection technology is used to directly quantize optically sampled chaotic pulses into a continuous random number stream. Benefitting from ultrafast photo-detection, our method can efficiently eliminate the generation-rate bottleneck imposed by the electronic ADCs required in nearly all available fast physical random number generators. A proof-of-principle experiment demonstrates that, using our approach, 10 Gb/s real-time, statistically unbiased random numbers are successfully extracted from a bandwidth-enhanced chaotic source. The generation rate achieved experimentally here is limited by the bandwidth of the chaotic source. The method described has the potential to attain a real-time rate of 100 Gb/s.
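
    A software analogue of the bias removal that balanced detection performs in hardware is the classic von Neumann extractor, sketched below on a deliberately biased bit stream; this illustrates unbiased extraction in general and is not the paper's photonic method.

```python
import random

def von_neumann_extract(bits):
    """Von Neumann extractor: consume bits in non-overlapping pairs;
    emit 0 for (0,1), 1 for (1,0), and discard (0,0) and (1,1).
    The output is unbiased whenever the input bits are i.i.d., however biased."""
    out = []
    for b0, b1 in zip(bits[0::2], bits[1::2]):
        if b0 != b1:
            out.append(b0)
    return out

rng = random.Random(42)
biased = [1 if rng.random() < 0.8 else 0 for _ in range(200000)]  # ~80% ones
unbiased = von_neumann_extract(biased)
mean = sum(unbiased) / len(unbiased)  # close to 0.5 despite the input bias
```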

  2. Design approaches to experimental mediation

    PubMed Central

    Pirlott, Angela G.; MacKinnon, David P.

    2016-01-01

    Identifying causal mechanisms has become a cornerstone of experimental social psychology, and editors in top social psychology journals champion the use of mediation methods, particularly innovative ones when possible (e.g. Halberstadt, 2010, Smith, 2012). Commonly, studies in experimental social psychology randomly assign participants to levels of the independent variable and measure the mediating and dependent variables, and the mediator is assumed to causally affect the dependent variable. However, participants are not randomly assigned to levels of the mediating variable(s), i.e., the relationship between the mediating and dependent variables is correlational. Although researchers likely know that correlational studies pose a risk of confounding, this problem seems forgotten when thinking about experimental designs randomly assigning participants to levels of the independent variable and measuring the mediator (i.e., “measurement-of-mediation” designs). Experimentally manipulating the mediator provides an approach to solving these problems, yet these methods contain their own set of challenges (e.g., Bullock, Green, & Ha, 2010). We describe types of experimental manipulations targeting the mediator (manipulations demonstrating a causal effect of the mediator on the dependent variable and manipulations targeting the strength of the causal effect of the mediator) and types of experimental designs (double randomization, concurrent double randomization, and parallel), provide published examples of the designs, and discuss the strengths and challenges of each design. Therefore, the goals of this paper include providing a practical guide to manipulation-of-mediator designs in light of their challenges and encouraging researchers to use more rigorous approaches to mediation because manipulation-of-mediator designs strengthen the ability to infer causality of the mediating variable on the dependent variable. PMID:27570259

  3. Design approaches to experimental mediation.

    PubMed

    Pirlott, Angela G; MacKinnon, David P

    2016-09-01

    Identifying causal mechanisms has become a cornerstone of experimental social psychology, and editors in top social psychology journals champion the use of mediation methods, particularly innovative ones when possible (e.g. Halberstadt, 2010, Smith, 2012). Commonly, studies in experimental social psychology randomly assign participants to levels of the independent variable and measure the mediating and dependent variables, and the mediator is assumed to causally affect the dependent variable. However, participants are not randomly assigned to levels of the mediating variable(s), i.e., the relationship between the mediating and dependent variables is correlational. Although researchers likely know that correlational studies pose a risk of confounding, this problem seems forgotten when thinking about experimental designs randomly assigning participants to levels of the independent variable and measuring the mediator (i.e., "measurement-of-mediation" designs). Experimentally manipulating the mediator provides an approach to solving these problems, yet these methods contain their own set of challenges (e.g., Bullock, Green, & Ha, 2010). We describe types of experimental manipulations targeting the mediator (manipulations demonstrating a causal effect of the mediator on the dependent variable and manipulations targeting the strength of the causal effect of the mediator) and types of experimental designs (double randomization, concurrent double randomization, and parallel), provide published examples of the designs, and discuss the strengths and challenges of each design. Therefore, the goals of this paper include providing a practical guide to manipulation-of-mediator designs in light of their challenges and encouraging researchers to use more rigorous approaches to mediation because manipulation-of-mediator designs strengthen the ability to infer causality of the mediating variable on the dependent variable.
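
    A double-randomization design can be sketched with a small simulation (all effect sizes and sample sizes are hypothetical): one experiment randomizes the independent variable and measures the mediator (the a path), and a second randomizes the mediator directly and measures the outcome (the b path), so both paths are experimentally identified.

```python
import random

def slope(x, y):
    """Least-squares slope of y on x (with intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

rng = random.Random(3)
a_true, b_true, n = 0.8, 1.5, 5000

# Study 1: randomize X, measure the mediator M (estimates the a path).
x1 = [rng.choice([0, 1]) for _ in range(n)]
m1 = [a_true * x + rng.gauss(0, 1) for x in x1]

# Study 2: randomize (manipulate) M directly, measure Y (estimates the b path).
m2 = [rng.choice([0, 1]) for _ in range(n)]
y2 = [b_true * m + rng.gauss(0, 1) for m in m2]

a_hat, b_hat = slope(x1, m1), slope(m2, y2)
indirect = a_hat * b_hat  # both paths rest on randomization, not correlation
```

In a measurement-of-mediation design, by contrast, the b path would be estimated from the observed (non-randomized) mediator and would remain vulnerable to confounding.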

  4. A Participatory Health Promotion Mobile App Addressing Alcohol Use Problems (The Daybreak Program): Protocol for a Randomized Controlled Trial

    PubMed Central

    Kirkman, Jessica J L; Schaub, Michael P

    2018-01-01

    Background At-risk patterns of alcohol use are prevalent in many countries, with significant costs to individuals, families, and society. Screening and brief interventions, including those with Web delivery, are effective but with limited translation into practice to date. Previous observational studies of the Hello Sunday Morning approach have found that their unique Web-based participatory health communication method resulted in a reduction of at-risk alcohol use between baseline and 3 months. The Hello Sunday Morning blog program asks participants to publicly set a personal goal to stop drinking or reduce their consumption for a set period of time, and to record their reflections and progress on blogs and social networks. Daybreak is Hello Sunday Morning’s evidence-based behavior change program, which is designed to support people looking to change their relationship with alcohol. Objective This study aims to systematically evaluate different versions of Hello Sunday Morning’s Daybreak program (with and without coaching support) in reducing at-risk alcohol use. Methods We will use a between-groups randomized controlled design. New participants enrolling in the Daybreak program will be eligible to be randomized to receive either (1) the Daybreak program, including peer support plus behavioral experiments (these encourage and guide participants in developing new skills in the areas of mindfulness, connectedness, resilience, situational strategies, and health), or (2) the Daybreak program, including the same peer support plus behavioral experiments, but with online coaching support. We will recruit 467 people per group to detect an effect size of f=0.10. To be eligible, participants must be resident in Australia, aged ≥18 years, score ≥8 on the alcohol use disorders identification test (AUDIT), and not report prior treatment for cardiovascular disease. Results The primary outcome measure will be reduction in AUDIT-Consumption (AUDIT-C) scores.
Secondary outcomes include mental health (Kessler’s K-10), days out of role (Kessler), alcohol consumed (measured with a 7-day drinking diary in standard 10 g drinks), and alcohol-related harms (CORE alcohol and drug survey). We will collect data at baseline and 1, 3, and 6 months and analyze them with random effects models, given the correlated data structure. Conclusions A randomized trial is required to provide robust evidence of the impact of the online coaching component of the Daybreak program, including over an extended period. Trial Registration Australian New Zealand Clinical Trials Registry ACTRN12618000010291; https://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?id=373110 (Archived by WebCite at http://www.webcitation.org/6zKRmp0aC) Registered Report Identifier RR1-10.2196/9982 PMID:29853435

  5. A Data Management System Integrating Web-based Training and Randomized Trials: Requirements, Experiences and Recommendations.

    PubMed

    Muroff, Jordana; Amodeo, Maryann; Larson, Mary Jo; Carey, Margaret; Loftin, Ralph D

    2011-01-01

    This article describes a data management system (DMS) developed to support a large-scale randomized study of an innovative web-course that was designed to improve substance abuse counselors' knowledge and skills in applying a substance abuse treatment method (i.e., cognitive behavioral therapy; CBT). The randomized trial compared the performance of web-course-trained participants (intervention group) and printed-manual-trained participants (comparison group) to determine the effectiveness of the web-course in teaching CBT skills. A single DMS was needed to support all aspects of the study: web-course delivery and management, as well as randomized trial management. The authors briefly reviewed several other systems that were described as built either to handle randomized trials or to deliver and evaluate web-based training. However it was clear that these systems fell short of meeting our needs for simultaneous, coordinated management of the web-course and the randomized trial. New England Research Institute's (NERI) proprietary Advanced Data Entry and Protocol Tracking (ADEPT) system was coupled with the web-programmed course and customized for our purposes. This article highlights the requirements for a DMS that operates at the intersection of web-based course management systems and randomized clinical trial systems, and the extent to which the coupled, customized ADEPT satisfied those requirements. Recommendations are included for institutions and individuals considering conducting randomized trials and web-based training programs, and seeking a DMS that can meet similar requirements.

  6. Assessment of the Reporting Quality of Placebo-controlled Randomized Trials on the Treatment of Type 2 Diabetes With Traditional Chinese Medicine in Mainland China: A PRISMA-Compliant Systematic Review.

    PubMed

    Zhao, Xiyan; Zhen, Zhong; Guo, Jing; Zhao, Tianyu; Ye, Ru; Guo, Yu; Chen, Hongdong; Lian, Fengmei; Tong, Xiaolin

    2016-01-01

    Placebo-controlled randomized trials are often used to evaluate the absolute effect of new treatments and are considered the gold standard for clinical trials. No studies, however, have yet evaluated the reporting quality of such trials. The current study aims to assess the reporting quality of placebo-controlled randomized trials on the treatment of diabetes with Traditional Chinese Medicine (TCM) in Mainland China and to provide recommendations for improvement. The China National Knowledge Infrastructure database, Wanfang database, China Biology Medicine database, and VIP database were searched for placebo-controlled randomized trials on the treatment of diabetes with TCM. Reviews, animal experiments, and randomized controlled trials without a placebo control were excluded. According to the Consolidated Standards of Reporting Trials (CONSORT) 2010 checklist, each item was scored yes or no depending on whether it was reported. A total of 68 articles were included. The reporting percentage in each article ranged from 24.3% to 73%, and 30.9% of articles reported more than 50% of the items. Seven of the 37 items were reported in more than 90% of the articles, whereas 7 items were not mentioned at all. The average reporting for "title and abstract," "introduction," "methods," "results," "discussion," and "other information" was 43.4%, 78.7%, 40.1%, 49.9%, 71.1%, and 17.2%, respectively. The percentage for each section increased after 2010. In addition, the reporting of multiple study centers, funding, placebo species, informed consent forms, and ethical approval was 14.7%, 50%, 36.85%, 33.8%, and 4.4%, respectively. Although a scoring system was created according to the CONSORT 2010 checklist, the checklist was not designed as an assessment tool. According to CONSORT 2010, the reporting quality of placebo-controlled randomized trials on the treatment of diabetes with TCM improved after 2010. Future improvements, however, are still needed, particularly in the methods sections.

  7. Impacts of an Enhanced Family Health and Sexuality Module of the HealthTeacher Middle School Curriculum: A Cluster Randomized Trial

    PubMed Central

    Scott, Mindy E.; Cook, Elizabeth

    2016-01-01

    Objectives. To evaluate the impacts of an enhanced version of the Family Life and Sexuality Module of the HealthTeacher middle school curriculum. Methods. We conducted a cluster randomized trial of Chicago, Illinois, middle schools. We randomly assigned schools to a treatment group that received the intervention during the 2010–2011 school year or a control group that did not. The primary analysis sample included 595 students (7 schools) in the treatment group and 594 students (7 schools) in the control group. Results. Students in the treatment schools reported greater exposure to information on reproductive health topics such as sexually transmitted infections (STIs; 78% vs 60%; P < .01), abstinence (64% vs 37%; P < .01), and birth control (45% vs 29%; P < .01). They also reported higher average scores on an index of knowledge of contraceptive methods and STI transmission (0.5 vs 0.3; P = .02). We found no statistically significant differences in rates of sexual intercourse (12% vs 12%; P = .99), oral sex (12% vs 9%; P = .18), or other intermediate outcomes. Conclusions. The program had modest effects when tested among Chicago middle school students. PMID:27689479
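
    Cluster randomization of this kind can be sketched in a few lines (the school names and seed below are hypothetical): whole schools, rather than individual students, are the units of random assignment.

```python
import random

def cluster_randomize(clusters, seed=2011):
    """Randomly assign whole clusters (here, schools) to treatment or control,
    half and half, as in a cluster randomized trial (hypothetical allocation)."""
    rng = random.Random(seed)
    shuffled = clusters[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"treatment": sorted(shuffled[:half]),
            "control": sorted(shuffled[half:])}

schools = [f"school_{i:02d}" for i in range(14)]  # 7 + 7 schools, as in the trial
arms = cluster_randomize(schools)
```

Because assignment happens at the school level, the analysis must account for within-school correlation rather than treating the 1,189 students as independently randomized.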

  8. Mixed-Poisson Point Process with Partially-Observed Covariates: Ecological Momentary Assessment of Smoking.

    PubMed

    Neustifter, Benjamin; Rathbun, Stephen L; Shiffman, Saul

    2012-01-01

    Ecological Momentary Assessment is an emerging method of data collection in behavioral research that may be used to capture the times of repeated behavioral events on electronic devices, and information on subjects' psychological states through the electronic administration of questionnaires at times selected from a probability-based design as well as the event times. A method for fitting a mixed Poisson point process model is proposed for the impact of partially-observed, time-varying covariates on the timing of repeated behavioral events. A random frailty is included in the point-process intensity to describe variation among subjects in baseline rates of event occurrence. Covariate coefficients are estimated using estimating equations constructed by replacing the integrated intensity in the Poisson score equations with a design-unbiased estimator. An estimator is also proposed for the variance of the random frailties. Our estimators are robust in the sense that no model assumptions are made regarding the distribution of the time-varying covariates or the distribution of the random effects. However, subject effects are estimated under gamma frailties using an approximate hierarchical likelihood. The proposed approach is illustrated using smoking data.

  9. Postoperative Nutritional Effects of Early Enteral Feeding Compared with Total Parenteral Nutrition in Pancreaticoduodenectomy Patients: A Prospective, Randomized Study

    PubMed Central

    Park, Joon Seong; Chung, Hye-Kyung; Hwang, Ho Kyoung; Kim, Jae Keun

    2012-01-01

    The benefits of early enteral feeding (EEN) have been demonstrated in gastrointestinal surgery, but the impact of EEN after pancreaticoduodenectomy (PD) has not yet been fully elucidated. We assessed the postoperative nutritional status of patients who had undergone PD according to the postoperative nutritional method and compared the clinical outcomes of the two methods. A prospective randomized trial was undertaken following PD. Patients were randomly divided into two groups; the EEN group received postoperative enteral feeding and the control group received postoperative total parenteral nutrition (TPN) management. Thirty-eight patients were included in our analyses. The first day of bowel movement and the time to tolerate a normal soft diet were significantly shorter in the EEN group than in the TPN group. Prealbumin and transferrin were significantly reduced on post-operative day (POD) 7 and recovered more slowly through POD 90 in the TPN group than in the EEN group. The EEN group rapidly recovered weight after POD 21, whereas weight gradually decreased in the TPN group until POD 90. EEN after PD is associated with better preservation of weight compared with TPN and with faster recovery of digestive function after PD. PMID:22379336

  10. The development of GPU-based parallel PRNG for Monte Carlo applications in CUDA Fortran

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kargaran, Hamed, E-mail: h-kargaran@sbu.ac.ir; Minuchehr, Abdolhamid; Zolfaghari, Ahmad

    The implementation of Monte Carlo simulation in CUDA Fortran requires fast random number generation with good statistical properties on the GPU. In this study, a GPU-based parallel pseudo random number generator (GPPRNG) has been proposed for use in high performance computing systems. According to the type of GPU memory usage, the GPU scheme is divided into two work modes: GLOBAL-MODE and SHARED-MODE. To generate parallel random numbers based on the independent sequence method, a combination of the middle-square method and a chaotic map, along with the Xorshift PRNG, has been employed. Implementation of our developed PPRNG on a single GPU showed a speedup of 150x and 470x (with respect to the speed of the PRNG on a single CPU core) for GLOBAL-MODE and SHARED-MODE, respectively. To evaluate the accuracy of our developed GPPRNG, its performance was compared to that of some other commercially available PPRNGs such as MATLAB, FORTRAN and the Miller-Park algorithm through employing specific standard tests. The results of this comparison showed that the GPPRNG developed in this study can be used as a fast and accurate tool for computational science applications.
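
    One of the building blocks named above, the xorshift generator, is compact enough to sketch directly; this is Marsaglia's xorshift32 with the common (13, 17, 5) shift triple, shown as a plain Python sketch rather than the authors' combined GPU generator.

```python
def xorshift32(seed):
    """Marsaglia's xorshift32 PRNG: yields a stream of 32-bit pseudo random
    integers; three xor-shift steps update the state at each draw."""
    state = seed & 0xFFFFFFFF
    assert state != 0, "xorshift requires a nonzero seed"
    while True:
        state ^= (state << 13) & 0xFFFFFFFF
        state ^= state >> 17
        state ^= (state << 5) & 0xFFFFFFFF
        yield state

gen = xorshift32(2463534242)   # Marsaglia's example seed
sample = [next(gen) for _ in range(5)]
# First output for this seed is 723471715.
```

On a GPU, each thread typically runs such a generator with its own independent state, which is the "independent sequence" parallelization the abstract describes.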

  11. Ensemble Methods for Classification of Physical Activities from Wrist Accelerometry.

    PubMed

    Chowdhury, Alok Kumar; Tjondronegoro, Dian; Chandran, Vinod; Trost, Stewart G

    2017-09-01

    To investigate whether ensemble learning algorithms improve physical activity recognition accuracy compared to single classifier algorithms, and to compare the classification accuracy achieved by three conventional ensemble machine learning methods (bagging, boosting, random forest) and a custom ensemble model comprising four algorithms commonly used for activity recognition (binary decision tree, k nearest neighbor, support vector machine, and neural network). The study used three independent data sets that included wrist-worn accelerometer data. For each data set, a four-step classification framework consisting of data preprocessing, feature extraction, normalization and feature selection, and classifier training and testing was implemented. For the custom ensemble, decisions from the single classifiers were aggregated using three decision fusion methods: weighted majority vote, naïve Bayes combination, and behavior knowledge space combination. Classifiers were cross-validated using leave-one-subject-out cross-validation and compared on the basis of average F1 scores. In all three data sets, ensemble learning methods consistently outperformed the individual classifiers. Among the conventional ensemble methods, random forest models provided consistently high activity recognition accuracy; however, the custom ensemble model using weighted majority voting demonstrated the highest classification accuracy in two of the three data sets. Combining multiple individual classifiers using conventional or custom ensemble learning methods can improve activity recognition accuracy from wrist-worn accelerometer data.
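
    The weighted majority vote fusion can be sketched as follows (the classifier decisions and F1-based weights below are hypothetical): each single classifier votes for its predicted activity with a weight, and the activity with the largest total weight is the ensemble decision.

```python
from collections import defaultdict

def weighted_majority_vote(predictions, weights):
    """Fuse single-classifier decisions: each classifier votes for its
    predicted label with a weight (e.g., its cross-validated F1 score);
    the label with the largest total weight wins."""
    totals = defaultdict(float)
    for label, w in zip(predictions, weights):
        totals[label] += w
    return max(totals, key=totals.get)

# Hypothetical decisions from four classifiers (tree, kNN, SVM, neural net)
preds = ["walking", "running", "walking", "sitting"]
f1_weights = [0.70, 0.90, 0.60, 0.55]
fused = weighted_majority_vote(preds, f1_weights)  # "walking": 0.70 + 0.60 = 1.30
```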

  12. Evaluating the statistical performance of less applied algorithms in classification of worldview-3 imagery data in an urbanized landscape

    NASA Astrophysics Data System (ADS)

    Ranaie, Mehrdad; Soffianian, Alireza; Pourmanafi, Saeid; Mirghaffari, Noorollah; Tarkesh, Mostafa

    2018-03-01

    In the recent decade, analysis of remotely sensed imagery has become one of the most common and widely used procedures in environmental studies, with supervised image classification techniques playing a central role. Hence, using a high-resolution WorldView-3 image over a mixed urbanized landscape in Iran, three less commonly applied classification methods, bagged CART, stochastic gradient boosting, and a neural network with feature extraction, were tested and compared with two prevalent methods: random forest and a support vector machine with a linear kernel. To do so, each method was run ten times, and three validation techniques were used to estimate the accuracy statistics: cross-validation, independent validation, and validation against the full training set. Moreover, statistically significant differences between the classification methods were assessed using ANOVA and Tukey's test. In general, the results showed that random forest, by a marginal difference over bagged CART and stochastic gradient boosting, was the best-performing method, although under independent validation there was no significant difference between the performances of the classification methods. It should finally be noted that the neural network with feature extraction and the linear support vector machine had better processing speed than the others.

  13. Lessons learned from a practice-based, multi-site intervention study with nurse participants

    PubMed Central

    Friese, Christopher R.; Mendelsohn-Victor, Kari; Ginex, Pamela; McMahon, Carol M.; Fauer, Alex J.; McCullagh, Marjorie C.

    2016-01-01

    Purpose To identify challenges and solutions to the efficient conduct of a multi-site, practice-based randomized controlled trial to improve nurses’ adherence to personal protective equipment use in ambulatory oncology settings. Design The Drug Exposure Feedback and Education for Nurses’ Safety (DEFENS) study is a clustered, randomized, controlled trial. Participating sites are randomized to web-based feedback on hazardous drug exposures in the sites plus tailored messages to address barriers versus a control intervention of a web-based continuing education video. Approach The study principal investigator, the study coordinator, and two site leaders identified challenges to study implementation and potential solutions, plus potential methods to prevent logistical challenges in future studies. Findings Noteworthy challenges included variation in human subjects protection policies, grants and contracts budgeting, infrastructure for nursing-led research, and information technology variation. Successful strategies included scheduled web conferences, site-based study champions, site visits by the principal investigator, and centrally-based document preparation. Strategies to improve efficiency in future studies include early and continued engagement with contract personnel in sites, and proposed changes to the common rule concerning human subjects. The DEFENS study successfully recruited 393 nurses across 12 sites. To date, 369 have completed surveys and 174 nurses have viewed educational materials. Conclusions Multi-site studies of nursing personnel are rare and challenging to existing infrastructure. These barriers can be overcome with strong engagement and planning. Clinical Relevance Leadership engagement, onsite staff support, and continuous communication can facilitate successful recruitment to a workplace-based randomized, controlled behavioral trial. PMID:28098951

  14. Engagement practices that join scientific methods with community wisdom: Designing a patient-centered, randomized control trial with a Pacific Islander Community

    PubMed Central

    Goulden, Peter A.; Bursac, Zoran; Hudson, Jonell; Purvis, Rachel S.; Yeary, Karen H. Kim; Aitaoto, Nia; Kohler, Peter O.

    2016-01-01

    This article illustrates how a collaborative research process can successfully engage an underserved minority community to address health disparities. Pacific Islanders, including the Marshallese, are one of the fastest growing US populations. They face significant health disparities, including extremely high rates of type 2 diabetes. This article describes the engagement process of designing patient-centered outcomes research with Marshallese stakeholders, highlighting the specific influences of their input on a randomized control trial to address diabetes. Over 18 months, an interdisciplinary research team used community-based participatory principles to conduct patient-engaged outcomes research that involved 31 stakeholders in all aspects of research design, from defining the research question to making decisions about budgets and staffing. This required academic researcher flexibility, but yielded a design linking scientific methodology with community wisdom. PMID:27325179

  15. A Cautious Note on Auxiliary Variables That Can Increase Bias in Missing Data Problems.

    PubMed

    Thoemmes, Felix; Rose, Norman

    2014-01-01

    The treatment of missing data in the social sciences has changed tremendously during the last decade. Modern missing data techniques such as multiple imputation and full-information maximum likelihood are used much more frequently. These methods assume that data are missing at random. One very common approach to increase the likelihood that missing at random is achieved consists of including many covariates as so-called auxiliary variables. These variables are either included based on data considerations or in an inclusive fashion; that is, taking all available auxiliary variables. In this article, we point out that there are some instances in which auxiliary variables exhibit the surprising property of increasing bias in missing data problems. In a series of focused simulation studies, we highlight some situations in which this type of biasing behavior can occur. We briefly discuss possible ways how one can avoid selecting bias-inducing covariates as auxiliary variables.

  16. A pilot randomized trial teaching mindfulness-based stress reduction to traumatized youth in foster care.

    PubMed

    Jee, Sandra H; Couderc, Jean-Philippe; Swanson, Dena; Gallegos, Autumn; Hilliard, Cammie; Blumkin, Aaron; Cunningham, Kendall; Heinert, Sara

    2015-08-01

    This article presents a pilot project implementing a mindfulness-based stress reduction program among traumatized youth in foster and kinship care over 10 weeks. Forty-two youth participated in this randomized controlled trial that used a mixed-methods (quantitative, qualitative, and physiologic) evaluation. Youth self-report measuring mental health problems, mindfulness, and stress were lower than anticipated, and the relatively short time-frame to teach these skills to traumatized youth may not have been sufficient to capture significant changes in stress as measured by electrocardiograms. Main themes from qualitative data included expressed competence in managing ongoing stress, enhanced self-awareness, and new strategies to manage stress. We share our experiences and recommendations for future research and practice, including focusing efforts on younger youth, and using community-based participatory research principles to promote engagement and co-learning. CLINICALTRIALS.GOV: Protocol Registration System ID NCT01708291. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. The role of critical self-reflection of assumptions in an online HIV intervention for men who have sex with men.

    PubMed

    Wilkerson, J Michael; Danilenko, Gene P; Smolenski, Derek J; Myer, Bryn B; Rosser, B R Simon

    2011-02-01

    The Men's INTernet Study II included a randomized controlled trial to develop and test an Internet-based HIV prevention intervention for U.S. men who use the Internet to seek sex with men. In 2008, participants (n = 560) were randomized to an online, interactive, sexual risk-reduction intervention or to a wait-list null control. After 3 months, participants in both conditions reported varying degrees of change in sexual beliefs or behaviors. Using content analysis and logistic regression, this mixed-methods study sought to understand why these changes occurred. Level of critical self-reflection of assumptions appeared to facilitate the labeling of sexual beliefs and behaviors as risky, which in turn encouraged men to commit to and enact change. New HIV prevention interventions should include activities in their curriculum that foster critical self-reflection on assumptions.

  18. HIV Prevention for Adults With Criminal Justice Involvement: A Systematic Review of HIV Risk-Reduction Interventions in Incarceration and Community Settings

    PubMed Central

    Dumont, Dora; Operario, Don

    2014-01-01

    We summarized and appraised evidence regarding HIV prevention interventions for adults with criminal justice involvement. We included randomized and quasi-randomized controlled trials that evaluated an HIV prevention intervention, enrolled participants with histories of criminal justice involvement, and reported biological or behavioral outcomes. We used Cochrane methods to screen 32 271 citations from 16 databases and gray literature. We included 37 trials enrolling n = 12 629 participants. Interventions were 27 psychosocial, 7 opioid substitution therapy, and 3 HIV-testing programs. Eleven programs significantly reduced sexual risk taking, 4 reduced injection drug risks, and 4 increased testing. Numerous interventions may reduce HIV-related risks among adults with criminal justice involvement. Future research should consider process evaluations, programs involving partners or families, and interventions integrating biomedical, psychosocial, and structural approaches. PMID:25211725

  19. The fast algorithm of spark in compressive sensing

    NASA Astrophysics Data System (ADS)

    Xie, Meihua; Yan, Fengxia

    2017-01-01

    Compressed Sensing (CS) is an advanced theory of signal sampling and reconstruction. In CS theory, the reconstruction condition for a signal is an important theoretical problem, and spark is a good index for studying it. However, computing the spark is NP-hard. In this paper, we study the problem of computing the spark. For some special matrices, such as Gaussian random matrices and 0-1 random matrices, we obtain several results. Furthermore, for a Gaussian random matrix with fewer rows than columns, we prove that its spark equals the number of its rows plus one with probability 1. For a general matrix, two methods are given to compute its spark: direct search and dual-tree search. By simulating 24 Gaussian random matrices and 18 0-1 random matrices, we measured the computation time of these two methods. Numerical results showed that the dual-tree search method is more efficient than direct search, especially for matrices with roughly as many rows as columns.

  20. Characterizing and Managing Missing Structured Data in Electronic Health Records: Data Analysis.

    PubMed

    Beaulieu-Jones, Brett K; Lavage, Daniel R; Snyder, John W; Moore, Jason H; Pendergrass, Sarah A; Bauer, Christopher R

    2018-02-23

    Missing data is a challenge for all studies; however, this is especially true for electronic health record (EHR)-based analyses. Failure to appropriately consider missing data can lead to biased results. While there has been extensive theoretical work on imputation, and many sophisticated methods are now available, it remains quite challenging for researchers to implement these methods appropriately. Here, we provide detailed procedures for when and how to conduct imputation of EHR laboratory results. The objective of this study was to demonstrate how the mechanism of missingness can be assessed, evaluate the performance of a variety of imputation methods, and describe some of the most frequent problems that can be encountered. We analyzed clinical laboratory measures from 602,366 patients in the EHR of Geisinger Health System in Pennsylvania, USA. Using these data, we constructed a representative set of complete cases and assessed the performance of 12 different imputation methods for missing data that was simulated based on 4 mechanisms of missingness (missing completely at random, missing not at random, missing at random, and real data modelling). Our results showed that several methods, including variations of Multivariate Imputation by Chained Equations (MICE) and softImpute, consistently imputed missing values with low error; however, only a subset of the MICE methods was suitable for multiple imputation. The analyses we describe provide an outline of considerations for dealing with missing EHR data, steps that researchers can perform to characterize missingness within their own data, and an evaluation of methods that can be applied to impute clinical data. While the performance of methods may vary between datasets, the process we describe can be generalized to the majority of structured data types that exist in EHRs, and all of our methods and code are publicly available. 
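
    The missingness mechanisms named in this abstract are easy to illustrate by simulation. A toy sketch (not the paper's pipeline): complete-case means stay roughly unbiased under MCAR but become biased under MAR when dropout depends on an observed, correlated covariate:

```python
import random
import statistics

def simulate(n=10000, seed=1):
    rng = random.Random(seed)
    x = [rng.gauss(0, 1) for _ in range(n)]       # fully observed covariate
    y = [xi + rng.gauss(0, 1) for xi in x]        # lab value correlated with x
    # MCAR: every y value is dropped with the same probability.
    y_mcar = [yi for yi in y if rng.random() > 0.3]
    # MAR: drop probability depends on the observed x (high x dropped more).
    y_mar = [yi for xi, yi in zip(x, y)
             if rng.random() > (0.6 if xi > 0 else 0.1)]
    return statistics.mean(y), statistics.mean(y_mcar), statistics.mean(y_mar)
```

    Under MAR the surviving cases over-represent low-x (hence low-y) patients, so the complete-case mean is pulled well below the true mean, which is the kind of bias imputation is meant to repair.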
©Brett K Beaulieu-Jones, Daniel R Lavage, John W Snyder, Jason H Moore, Sarah A Pendergrass, Christopher R Bauer. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 23.02.2018.

  1. Variances and uncertainties of the sample laboratory-to-laboratory variance (S(L)2) and standard deviation (S(L)) associated with an interlaboratory study.

    PubMed

    McClure, Foster D; Lee, Jung K

    2012-01-01

    The validation process for an analytical method usually employs an interlaboratory study conducted as a balanced completely randomized model involving a specified number of randomly chosen laboratories, each analyzing a specified number of randomly allocated replicates. For such studies, formulas to obtain approximate unbiased estimates of the variance and uncertainty of the sample laboratory-to-laboratory (lab-to-lab) STD (S(L)) have been developed primarily to account for the uncertainty of S(L) when there is a need to develop an uncertainty budget that includes the uncertainty of S(L). For the sake of completeness on this topic, formulas to estimate the variance and uncertainty of the sample lab-to-lab variance (S(L)2) were also developed. In some cases, it was necessary to derive the formulas based on an approximate distribution for S(L)2.
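
    The quantity S(L)2 itself comes from the standard one-way random-effects ANOVA decomposition for a balanced interlaboratory study. A generic sketch of that estimator (not the paper's variance and uncertainty formulas for S(L)):

```python
import statistics

def lab_to_lab_variance(data):
    # data: one equal-length list of replicate results per laboratory.
    p = len(data)                  # number of labs
    n = len(data[0])               # replicates per lab
    lab_means = [statistics.mean(lab) for lab in data]
    grand = statistics.mean(lab_means)
    # Between-lab and within-lab mean squares.
    msb = n * sum((m - grand) ** 2 for m in lab_means) / (p - 1)
    msw = sum(sum((x - m) ** 2 for x in lab)
              for lab, m in zip(data, lab_means)) / (p * (n - 1))
    s_l2 = max(0.0, (msb - msw) / n)  # truncate negative estimates at zero
    return s_l2, msw                  # (lab-to-lab, repeatability) variances
```

    The paper's contribution is the next step: attaching a variance and uncertainty to S(L) and S(L)2 themselves, since these sample estimates are random quantities too.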

  2. Computer-Aided Screening of Conjugated Polymers for Organic Solar Cell: Classification by Random Forest.

    PubMed

    Nagasawa, Shinji; Al-Naamani, Eman; Saeki, Akinori

    2018-05-17

    Owing to the diverse chemical structures, organic photovoltaic (OPV) applications with a bulk heterojunction framework have greatly evolved over the last two decades, which has produced numerous organic semiconductors exhibiting improved power conversion efficiencies (PCEs). Despite the recent fast progress in materials informatics and data science, data-driven molecular design of OPV materials remains challenging. We report a screening of conjugated molecules for polymer-fullerene OPV applications by supervised learning methods (artificial neural network (ANN) and random forest (RF)). Approximately 1000 experimental parameters including PCE, molecular weight, and electronic properties are manually collected from the literature and subjected to machine learning with digitized chemical structures. Contrary to the low correlation coefficient in ANN, RF yields an acceptable accuracy, which is twice that of random classification. We demonstrate the application of RF screening for the design, synthesis, and characterization of a conjugated polymer, which facilitates a rapid development of optoelectronic materials.

  3. A Cluster-Randomized Trial of Restorative Practices: An Illustration to Spur High-Quality Research and Evaluation

    PubMed Central

    Acosta, Joie D.; Chinman, Matthew; Ebener, Patricia; Phillips, Andrea; Xenakis, Lea; Malone, Patrick S.

    2017-01-01

    Restorative Practices in schools lack rigorous evaluation studies. As an example of rigorous school-based research, this paper describes the first randomized control trial of restorative practices to date, the Study of Restorative Practices. It is a 5-year, cluster-randomized controlled trial (RCT) of the Restorative Practices Intervention (RPI) in 14 middle schools in Maine to assess whether RPI impacts both positive developmental outcomes and problem behaviors and whether the effects persist during the transition from middle to high school. The two-year RPI intervention began in the 2014–2015 school year. The study’s rationale and theoretical concerns are discussed along with methodological concerns including teacher professional development. The theoretical rationale and description of the methods from this study may be useful to others conducting rigorous research and evaluation in this area. PMID:28936104

  4. Handling Correlations between Covariates and Random Slopes in Multilevel Models

    ERIC Educational Resources Information Center

    Bates, Michael David; Castellano, Katherine E.; Rabe-Hesketh, Sophia; Skrondal, Anders

    2014-01-01

    This article discusses estimation of multilevel/hierarchical linear models that include cluster-level random intercepts and random slopes. Viewing the models as structural, the random intercepts and slopes represent the effects of omitted cluster-level covariates that may be correlated with included covariates. The resulting correlations between…

  5. Biases and Power for Groups Comparison on Subjective Health Measurements

    PubMed Central

    Hamel, Jean-François; Hardouin, Jean-Benoit; Le Neel, Tanguy; Kubis, Gildas; Roquelaure, Yves; Sébille, Véronique

    2012-01-01

    Subjective health measurements are increasingly used in clinical research, particularly for comparisons between patient groups. Two main types of analytical strategies can be used for such data: so-called classical test theory (CTT), which relies on observed scores, and models from Item Response Theory (IRT), which relate the item responses to a latent parameter, often called the latent trait. Whether IRT or CTT is the more appropriate method for comparing two independent groups of patients on a patient-reported outcome measure remains unknown and was investigated using simulations. For CTT-based analyses, group comparison was performed using a t-test on the scores. For IRT-based analyses, several methods were compared, according to whether the Rasch model was considered with random effects or with fixed effects, and whether the group effect was included as a covariate. Individual latent trait values were estimated using either a deterministic method or stochastic approaches, and the latent traits were then compared with a t-test. Finally, a two-step method was performed to compare the latent trait distributions, and a Wald test was performed on the group effect in the Rasch model including group covariates. The only unbiased IRT-based method was the Wald test on the group covariate in the random-effects Rasch model. This model displayed the highest observed power, which was similar to the power of the score t-test. These results need to be extended to the case, frequently encountered in practice, in which data are missing and possibly informative. PMID:23115620

  6. Exposing Clinicians to Exposure: A Randomized Controlled Dissemination Trial of Exposure Therapy for Anxiety Disorders

    PubMed Central

    Harned, Melanie S.; Dimeff, Linda A.; Woodcock, Eric A.; Kelly, Tim; Zavertnik, Jake; Contreras, Ignacio; Danner, Sankirtana M.

    2014-01-01

    Objective The present study evaluated three technology-based methods of training mental health providers in exposure therapy (ET) for anxiety disorders. Training methods were designed to address common barriers to the dissemination of ET, including limited access to training, negative clinician attitudes toward ET, and lack of support during and following training. Method Clinicians naïve to ET (N=181, Mage = 37.4, 71.3% female, 72.1% Caucasian) were randomly assigned to: 1) an interactive, multimedia online training (OLT), 2) OLT plus a brief, computerized motivational enhancement intervention (OLT+ME), or 3) OLT + ME plus a web-based learning community (OLT+ME+LC). Assessments were completed at baseline, post-training, and 6 and 12 weeks following training. Outcomes include satisfaction, knowledge, self-efficacy, attitudes, self-reported clinical use, and observer-rated clinical proficiency. Results All three training methods led to large and comparable improvements in self-efficacy and clinical use of ET, indicating that OLT alone was sufficient for improving these outcomes. The addition of the ME intervention did not significantly improve outcomes in comparison to OLT alone. Supplementing the OLT with both the ME intervention and the LC significantly improved attitudes and clinical proficiency in comparison to OLT alone. The OLT+ME+LC condition was superior to both other conditions in increasing knowledge of ET. Conclusions Multi-component trainings that address multiple potential barriers to dissemination appear to be most effective in improving clinician outcomes. Technology-based training methods offer a satisfactory, effective, and scalable way to train mental health providers in evidence-based treatments such as ET. PMID:25311284

  7. Randomized Comparison of 3 Methods to Screen for Domestic Violence in Family Practice

    PubMed Central

    Chen, Ping-Hsin; Rovi, Sue; Washington, Judy; Jacobs, Abbie; Vega, Marielos; Pan, Ko-Yu; Johnson, Mark S.

    2007-01-01

    PURPOSE We undertook a study to compare 3 ways of administering brief domestic violence screening questionnaires: self-administered questionnaire, medical staff interview, and physician interview. METHODS We conducted a randomized trial of 3 screening protocols for domestic violence in 4 urban family medicine practices with mostly minority patients. We randomly assigned 523 female patients, aged 18 years or older and currently involved with a partner, to 1 of 3 screening protocols. Each included 2 brief screening tools: HITS and WAST-Short. Outcome measures were domestic violence disclosure, patient and clinician comfort with the screening, and time spent screening. RESULTS Overall prevalence of domestic violence was 14%. Most patients (93.4%) and clinicians (84.5%) were comfortable with the screening questions and method of administering them. Average time spent screening was 4.4 minutes. Disclosure rates, patient and clinician comfort with screening, and time spent screening were similar among the 3 protocols. In addition, WAST-Short was validated in this sample of minority women by comparison with HITS and with the 8-item WAST. CONCLUSIONS Domestic violence is common, and we found that most patients and clinicians are comfortable with domestic violence screening in urban family medicine settings. Patient self-administered domestic violence screening is as effective as clinician interview in terms of disclosure, comfort, and time spent screening. PMID:17893385

  8. The Minnesota Green Tea Trial (MGTT), a randomized controlled trial of the efficacy of green tea extract on biomarkers of breast cancer risk: Study rationale, design, methods, and participant characteristics

    PubMed Central

    Samavat, Hamed; Dostal, Allison M.; Wang, Renwei; Bedell, Sarah; Emory, Tim H.; Ursin, Giske; Torkelson, Carolyn J.; Gross, Myron D.; Le, Chap T.; Yu, Mimi C.; Yang, Chung S.; Yee, Douglas; Wu, Anna H.; Yuan, Jian-Min; Kurzer, Mindy S.

    2015-01-01

    Purpose The Minnesota Green Tea Trial (MGTT) was a randomized, placebo-controlled, double-blinded trial investigating the effect of daily green tea extract consumption for 12 months on biomarkers of breast cancer risk. Methods Participants were healthy postmenopausal women at high risk of breast cancer due to dense breast tissue with differing catechol-O-methyltransferase (COMT) genotypes. The intervention was a green tea catechin extract containing 843.0 ± 44.0 mg/day epigallocatechin gallate or placebo capsules for one year. Annual digital screening mammograms were obtained at baseline and month 12, and fasting blood and 24-hour urine samples were provided at baseline, months 6, and 12. Primary endpoints included changes in percent mammographic density, circulating endogenous sex hormones and insulin-like growth factor axis proteins; secondary endpoints were changes in urinary estrogens and estrogen metabolites and circulating F2-isoprostanes, a biomarker of oxidative stress. Results The MGTT screened more than 100,000 mammograms and randomized 1075 participants based on treatment (green tea extract vs. placebo), stratified by COMT genotype activity (high COMT vs. low/intermediate COMT genotype activity). 937 women successfully completed the study and 138 dropped out (overall dropout rate= 12.8%). Conclusions In this paper we report the rationale, design, recruitment, participant characteristics, and methods for biomarker and statistical analyses. PMID:26206423

  9. Baseline Predictors of Missed Visits in the Look AHEAD Study

    PubMed Central

    Fitzpatrick, Stephanie L.; Jeffery, Robert; Johnson, Karen C.; Roche, Cathy C.; Van Dorsten, Brent; Gee, Molly; Johnson, Ruby Ann; Charleston, Jeanne; Dotson, Kathy; Walkup, Michael P.; Hill-Briggs, Felicia; Brancati, Frederick L.

    2013-01-01

    Objective To identify baseline attributes associated with consecutively missed data collection visits during the first 48 months of Look AHEAD—a randomized, controlled trial in 5145 overweight/obese adults with type 2 diabetes designed to determine the long-term health benefits of weight loss achieved by lifestyle change. Design and Methods The analyzed sample consisted of 5016 participants who were alive at month 48 and enrolled at Look AHEAD sites. Demographic, baseline behavior, psychosocial factors, and treatment randomization were included as predictors of missed consecutive visits in proportional hazard models. Results In multivariate Cox proportional hazard models, baseline attributes of participants who missed consecutive visits (n=222) included: younger age ( Hazard Ratio [HR] 1.18 per 5 years younger; 95% Confidence Interval 1.05, 1.30), higher depression score (HR 1.04; 1.01, 1.06), non-married status (HR 1.37; 1.04, 1.82), never self-weighing prior to enrollment (HR 2.01; 1.25, 3.23), and randomization to minimal vs. intensive lifestyle intervention (HR 1.46; 1.11, 1.91). Conclusions Younger age, symptoms of depression, non-married status, never self-weighing, and randomization to minimal intervention were associated with a higher likelihood of missing consecutive data collection visits, even in a high-retention trial like Look AHEAD. Whether modifications to screening or retention efforts targeted to these attributes might enhance long-term retention in behavioral trials requires further investigation. PMID:23996977

  10. Chinese Herbal Bath Therapy for the Treatment of Knee Osteoarthritis: Meta-Analysis of Randomized Controlled Trials

    PubMed Central

    Chen, Bo; Zhan, Hongsheng; Chung, Mei; Lin, Xun; Zhang, Min; Pang, Jian; Wang, Chenchen

    2015-01-01

    Objective. Chinese herbal bath therapy (CHBT) has traditionally been considered to have analgesic and anti-inflammatory effects. We conducted the first meta-analysis evaluating its benefits for patients with knee osteoarthritis (OA). Methods. We searched three English and four Chinese databases through October, 2014. Randomized trials evaluating at least 2 weeks of CHBT for knee OA were selected. The effects of CHBT on clinical symptoms included both pain level (via the visual analog scale) and total effectiveness rate, which assessed pain, physical performance, and wellness. We performed random-effects meta-analyses using mean difference. Results. Fifteen studies totaling 1618 subjects met eligibility criteria. Bath prescription included, on average, 13 Chinese herbs with directions to steam and wash around the knee for 20–40 minutes once or twice daily. Mean treatment duration was 3 weeks. Results from meta-analysis showed superior pain improvement (mean difference = −0.59 points; 95% confidence intervals [CI], −0.83 to −0.36; p < 0.00001) and higher total effectiveness rate (risk ratio = 1.21; 95% CI, 1.15 to 1.28; p < 0.00001) when compared with standard western treatment. No serious adverse events were reported. Conclusion. Chinese herbal bath therapy may be a safe, effective, and simple alternative treatment modality for knee OA. Further rigorously designed, randomized trials are warranted. PMID:26483847
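
    The random-effects pooling of mean differences described here is commonly done with the DerSimonian-Laird estimator. A minimal sketch (the effect sizes and variances passed in below are invented for illustration, not the trial data from this meta-analysis):

```python
import math

def dersimonian_laird(effects, variances):
    # Fixed-effect (inverse-variance) weights and pooled estimate.
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    # Cochran's Q and the between-study variance tau^2.
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)
    # Random-effects weights add tau^2 to each study's variance.
    wr = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(wr, effects)) / sum(wr)
    se = math.sqrt(1.0 / sum(wr))
    return pooled, tau2, (pooled - 1.96 * se, pooled + 1.96 * se)
```

    When the studies are homogeneous, tau^2 collapses to zero and the random-effects result coincides with the fixed-effect inverse-variance mean.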

  11. Quasi-experimental designs in practice-based research settings: design and implementation considerations.

    PubMed

    Handley, Margaret A; Schillinger, Dean; Shiboski, Stephen

    2011-01-01

    Although randomized controlled trials are often the gold standard for determining intervention effects, in the area of practice-based research (PBR) there are many situations in which individual randomization is not possible. Alternative approaches to evaluating interventions have received increased attention, particularly those that retain elements of randomization such that they can be considered "controlled" trials. Methodological design elements and practical implementation considerations are presented for two quasi-experimental design approaches with considerable promise in PBR settings, the stepped-wedge design and a variant of it, the wait-list cross-over design, along with a case study from a recent PBR intervention for patients with diabetes. PBR-relevant design features include: creation of a cohort over time that collects control data but allows all participants (clusters or patients) to receive the intervention; staggered introduction of clusters; multiple data collection points; and one-way cross-over into the intervention arm. Practical considerations include randomization versus stratification, a phased training run-in, and an extended time period for overall study completion. Several design features of practice-based research studies can be adapted to local circumstances yet retain elements that improve methodological rigor. Studies that utilize these methods, such as the stepped-wedge design and the wait-list cross-over design, can increase the evidence base for controlled studies conducted within the complex environment of PBR.
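
    The stepped-wedge idea can be sketched directly: clusters are randomized to crossover times rather than to fixed arms, and every cluster eventually receives the intervention. A minimal sketch (the cluster names, step count, and even allocation are illustrative assumptions, not this study's protocol):

```python
import random

def stepped_wedge_schedule(clusters, n_steps, seed=7):
    # Randomize the order in which clusters cross over to the intervention;
    # all clusters contribute control periods first and end in the
    # intervention arm (one-way cross-over).
    rng = random.Random(seed)
    order = clusters[:]
    rng.shuffle(order)
    per_step = -(-len(order) // n_steps)  # ceil division for even groups
    # Map each cluster to the step at which it crosses over (1-based).
    return {cluster: 1 + i // per_step for i, cluster in enumerate(order)}
```

    Data collected from a cluster before its assigned step serve as control observations, which is how the design builds a concurrent control cohort without withholding the intervention from anyone permanently.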

  12. Evaluating the Use of Random Distribution Theory to Introduce Statistical Inference Concepts to Business Students

    ERIC Educational Resources Information Center

    Larwin, Karen H.; Larwin, David A.

    2011-01-01

    Bootstrapping methods and random distribution methods are increasingly recommended as better approaches for teaching students about statistical inference in introductory-level statistics courses. The authors examined the effect of teaching undergraduate business statistics students using random distribution and bootstrapping simulations. It is the…
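
    The bootstrapping approach recommended here is straightforward to demonstrate. A percentile-bootstrap sketch in Python (a generic classroom illustration, not the authors' course materials):

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=0):
    # Percentile bootstrap: resample with replacement, recompute the
    # statistic on each resample, take empirical quantiles of the replicates.
    rng = random.Random(seed)
    reps = sorted(stat([rng.choice(data) for _ in data])
                  for _ in range(n_boot))
    lo = reps[int((alpha / 2) * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

    Students can see inference emerge from resampling alone: the spread of the replicate means plays the role that a t-distribution would play in a classical interval, with no distributional formula required.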

  13. Can Dialectical Behavior Therapy Be Learned in Highly Structured Learning Environments? Results from a Randomized Controlled Dissemination Trial

    ERIC Educational Resources Information Center

    Dimeff, Linda A.; Woodcock, Eric A.; Harned, Melanie S.; Beadnell, Blair

    2011-01-01

    This study evaluated the efficacy of methods of training community mental health providers (N=132) in dialectical behavior therapy (DBT) distress tolerance skills, including (a) Linehan's (1993a) Skills Training Manual for Borderline Personality Disorder (Manual), (b) a multimedia e-Learning course covering the same content (e-DBT), and (c) a…

  14. Pathways Explaining the Reduction of Adult Criminal Behaviour by a Randomized Preventive Intervention for Disruptive Kindergarten Children

    ERIC Educational Resources Information Center

    Vitaro, Frank; Barker, Edward D.; Brendgen, Mara; Tremblay, Richard E.

    2012-01-01

    Objective: This study aimed to identify the pathways through which a preventive intervention targeting young low-SES disruptive boys could result in lower crime involvement during adulthood. Method: The preventive intervention was implemented when the children were between 7 and 9 years and included three components (i.e. social skills, parental…

  15. The Influence of Labeling the Vegetable Content of Snack Food on Children's Taste Preferences: A Pilot Study

    ERIC Educational Resources Information Center

    Pope, Lizzy; Wolf, Randi L.

    2012-01-01

    Objective: This pilot study examined whether informing children of the presence of vegetables in select snack food items alters taste preference. Methods: A random sample of 68 elementary and middle school children tasted identical pairs of 3 snack food items containing vegetables. In each pair, 1 sample's label included the food's vegetable (eg,…

  16. Scalability, Complexity and Reliability in Quantum Information Processing

    DTIC Science & Technology

    2007-03-01

    hidden subgroup framework to abelian groups which are not finitely generated. An extension of the basic algorithm breaks the Buchmann-Williams...finding short lattice vectors. In [2], we showed that the generalization of the standard method --- random coset state preparation followed by Fourier...sampling --- required exponential time for sufficiently non-abelian groups including the symmetric group, at least when the Fourier transforms are

  17. Comparison of Program Costs for Parent-Only and Family-Based Interventions for Pediatric Obesity in Medically Underserved Rural Settings

    ERIC Educational Resources Information Center

    Janicke, David M.; Sallinen, Bethany J.; Perri, Michael G.; Lutes, Lesley D.; Silverstein, Janet H.; Brumback, Babette

    2009-01-01

    Purpose: To compare the costs of parent-only and family-based group interventions for childhood obesity delivered through Cooperative Extension Services in rural communities. Methods: Ninety-three overweight or obese children (aged 8 to 14 years) and their parent(s) participated in this randomized controlled trial, which included a 4-month…

  18. Risk Factors for Nonelective Hospitalization in Frail and Older Adult, Inner-City Outpatients.

    ERIC Educational Resources Information Center

    Damush, Teresa M.; Smith, David M.; Perkins, Anthony J.; Dexter, Paul R.; Smith, Faye

    2004-01-01

    Purpose: In our study, we sought to improve the accuracy of predicting the risk of hospitalization and to identify older, inner-city patients who could be targeted for preventive interventions. Design and Methods: Participants (56% were African American) in a randomized trial were from a primary care practice and included 1,041 patients living in…

  19. AUPress: A Comparison of an Open Access University Press with Traditional Presses

    ERIC Educational Resources Information Center

    McGreal, Rory; Chen, Nian-Shing

    2011-01-01

    This study is a comparison of AUPress with three other traditional (non-open access) Canadian university presses. The analysis is based on the rankings that are correlated with book sales on Amazon.com and Amazon.ca. Statistical methods include the sampling of the sales ranking of randomly selected books from each press. The results of one-way…

  20. Evaluation of a Metric Booklet as a Supplement to Teaching the Metric System to Undergraduate Non-Science Majors.

    ERIC Educational Resources Information Center

    Exum, Kenith Gene

    Examined is the effectiveness of a method of teaching the metric system using the booklet, Metric Supplement to Mathematics, in combination with a physical science textbook. The participants in the study were randomly selected undergraduates in a non-science oriented program of study. Instruments used included the Metric Supplement to Mathematics…

  1. Social Involvement of Children with Autism Spectrum Disorders in Elementary School Classrooms

    ERIC Educational Resources Information Center

    Rotheram-Fuller, Erin; Kasari, Connie; Chamberlain, Brandt; Locke, Jill

    2010-01-01

    Background: Children with autism spectrum disorders (ASD) are increasingly included in general education classrooms in an effort to improve their social involvement. Methods: Seventy-nine children with ASD and 79 randomly selected, gender-matched peers (88.6% male) in 75 early (K-1), middle (2nd-3rd), and late (4th-5th) elementary classrooms…

  2. Test surfaces useful for calibration of surface profilometers

    DOEpatents

    Yashchuk, Valeriy V; McKinney, Wayne R; Takacs, Peter Z

    2013-12-31

    The present invention provides test surfaces and methods for the calibration of surface profilometers, including interferometric and atomic force microscopes. Calibration is performed using a specially designed test surface, the Binary Pseudo-random (BPR) grating (array). By using the BPR grating (array) to measure the power spectral density (PSD) spectrum, the profilometer is calibrated by determining the instrumental modulation transfer function.

  3. The Mediatory Role of Exercise Self-Regulation in the Relationship between Personality Traits and Anger Management of Athletes

    ERIC Educational Resources Information Center

    Shahbazzadeh, Somayeh; Beliad, Mohammad Reza

    2017-01-01

    This study investigates the mediatory role of exercise self-regulation in the relationship between personality traits and anger management among athletes. The statistical population of this study includes all athlete students of Shar-e Ghods College, from whom 260 people were selected as a sample using a random sampling method. In addition, the…

  4. Use of the "Intervention Selection Profile-Social Skills" to Identify Social Skill Acquisition Deficits: A Preliminary Validation Study

    ERIC Educational Resources Information Center

    Kilgus, Stephen P.; von der Embse, Nathaniel P.; Scott, Katherine; Paxton, Sara

    2015-01-01

    The purpose of this investigation was to develop and initially validate the "Intervention Selection Profile-Social Skills" (ISP-SS), a novel brief social skills assessment method intended for use at Tier 2. Participants included 54 elementary school teachers and their 243 randomly selected students. Teachers rated students on two rating…

  5. Effect of Multi Modal Representations on the Critical Thinking Skills of the Fifth Grade Students

    ERIC Educational Resources Information Center

    Öz, Muhittin; Memis, Esra Kabatas

    2018-01-01

    The purpose of this study was to explore the effects of multimodal representations within writing-to-learn activities on students' critical thinking. A mixed method was used. The participants included 32 fifth-grade students from an elementary school. One class was randomly selected as the control group and the other class was selected as the…

  6. Circular Data Images for Directional Data

    NASA Technical Reports Server (NTRS)

    Morpet, William J.

    2004-01-01

    Directional data includes vectors, points on a unit sphere, axis orientation, angular direction, and circular or periodic data. The theoretical statistics for circular data (random points on a unit circle) or spherical data (random points on a unit sphere) are a recent development. An overview of existing graphical methods for the display of directional data is given. Cross-over occurs when periodic data are measured on a scale for the measurement of linear variables. For example, if angle is represented by a linear color gradient changing uniformly from dark blue at -180 degrees to bright red at +180 degrees, the color image will be discontinuous at +180 degrees and -180 degrees, which are the same location. The resultant color would depend on the direction of approach to the cross-over point. A new graphical method for imaging directional data is described, which affords high resolution without color discontinuity from "cross-over". It is called the circular data image. The circular data image uses a circular color scale in which colors repeat periodically. Some examples of the circular data image include direction of earth winds on a global scale, rocket motor internal flow, earth global magnetic field direction, and rocket motor nozzle vector direction vs. time.
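    The circular color scale described above can be sketched with a cyclic hue mapping, so that -180 degrees and +180 degrees receive the same color and the cross-over discontinuity disappears. This is a minimal illustration of the idea, not the article's actual implementation:

```python
import colorsys

def circular_color(angle_deg):
    """Map an angle to an RGB color on a cyclic scale: hue wraps
    periodically, so -180 deg and +180 deg yield the same color
    (no cross-over seam)."""
    hue = (angle_deg % 360.0) / 360.0  # wraps at every full turn
    return colorsys.hsv_to_rgb(hue, 1.0, 1.0)

print(circular_color(-180.0))  # both print (0.0, 1.0, 1.0)
print(circular_color(180.0))
```

    A linear color gradient over the same range would give different colors at the two ends, which is exactly the cross-over artifact the circular data image avoids.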

  7. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    NASA Astrophysics Data System (ADS)

    WANG, P. T.

    2015-12-01

    Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Because of the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. A hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be constructed. Therefore, statistical sampling plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure of LHS with simulation by LU decomposition to form LULHS. Both conditional and unconditional simulations with LULHS were developed. The simulation efficiency and spatial correlation of LULHS are compared to three other simulation methods. The results show that, for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort: fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
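    The two ingredients of LULHS can be sketched for a 1-D grid: stratified standard-normal samples from LHS, colored by a Cholesky factor of the covariance matrix (the "LU" step). The exponential covariance, grid, and correlation length below are illustrative assumptions, not the paper's model:

```python
import numpy as np
from statistics import NormalDist

def lhs_standard_normal(n, rng):
    """Latin Hypercube Sample of n standard-normal values: one uniform
    draw per probability stratum, shuffled, then mapped through the
    inverse normal CDF."""
    u = (np.arange(n) + rng.random(n)) / n
    rng.shuffle(u)
    nd = NormalDist()
    return np.array([nd.inv_cdf(p) for p in u])

def lulhs_field(coords, corr_len, rng):
    """Unconditional Gaussian random field on a 1-D grid: exponential
    covariance factored by Cholesky, applied to LHS normals."""
    d = np.abs(coords[:, None] - coords[None, :])
    cov = np.exp(-d / corr_len)
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(coords)))
    return L @ lhs_standard_normal(len(coords), rng)

field = lulhs_field(np.linspace(0.0, 10.0, 50), corr_len=2.0,
                    rng=np.random.default_rng(7))
print(field.shape)  # (50,)
```

    The stratification is what buys the efficiency claimed in the abstract: each probability interval is sampled exactly once, so fewer realizations are needed to reproduce the target distribution.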

  8. Digital double random amplitude image encryption method based on the symmetry property of the parametric discrete Fourier transform

    NASA Astrophysics Data System (ADS)

    Bekkouche, Toufik; Bouguezel, Saad

    2018-03-01

    We propose a real-to-real image encryption method. It is a double random amplitude encryption method based on the parametric discrete Fourier transform coupled with chaotic maps to perform the scrambling. The main idea behind this method is the introduction of a complex-to-real conversion by exploiting the inherent symmetry property of the transform in the case of real-valued sequences. This conversion allows the encrypted image to be real-valued instead of being a complex-valued image as in all existing double random phase encryption methods. The advantage is to store or transmit only one image instead of two images (real and imaginary parts). Computer simulation results and comparisons with the existing double random amplitude encryption methods are provided for peak signal-to-noise ratio, correlation coefficient, histogram analysis, and key sensitivity.
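    The paper's contribution is a real-valued variant; the classic double random phase encoding it builds on can be sketched as follows, with hypothetical image data. Note the complex-valued ciphertext, which is exactly the drawback the proposed real-to-real method removes:

```python
import numpy as np

def drpe_encrypt(img, rng):
    """Classic double random phase encoding: random phase mask in the
    spatial domain, a second one in the Fourier domain. The output is
    complex-valued."""
    p1 = np.exp(2j * np.pi * rng.random(img.shape))
    p2 = np.exp(2j * np.pi * rng.random(img.shape))
    return np.fft.ifft2(np.fft.fft2(img * p1) * p2), (p1, p2)

def drpe_decrypt(cipher, keys):
    """Invert both phase masks to recover the original image."""
    p1, p2 = keys
    return np.fft.ifft2(np.fft.fft2(cipher) / p2) / p1

rng = np.random.default_rng(3)
img = rng.random((32, 32))          # hypothetical grayscale image
cipher, keys = drpe_encrypt(img, rng)
recovered = drpe_decrypt(cipher, keys)
print(np.allclose(recovered.real, img))  # True: lossless with the keys
```

    Storing `cipher` requires both its real and imaginary parts; encoding schemes like the one in this record aim to make the ciphertext a single real-valued image instead.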

  9. Spatial Light Modulators as Optical Crossbar Switches

    NASA Technical Reports Server (NTRS)

    Juday, Richard

    2003-01-01

    A proposed method of implementing cross connections in an optical communication network is based on the use of a spatial light modulator (SLM) to form controlled diffraction patterns that connect inputs (light sources) and outputs (light sinks). Sources would typically include optical fibers and/or light-emitting diodes; sinks would typically include optical fibers and/or photodetectors. The sources and/or sinks could be distributed in two dimensions; that is, on planes. Alternatively or in addition, sources and/or sinks could be distributed in three dimensions -- for example, on curved surfaces or in more complex (including random) three-dimensional patterns.

  10. Improving the outcome of infants born at <30 weeks' gestation - a randomized controlled trial of preventative care at home

    PubMed Central

    2009-01-01

    Background Early developmental interventions to prevent the high rate of neurodevelopmental problems in very preterm children, including cognitive, motor and behavioral impairments, are urgently needed. These interventions should be multi-faceted and include modules for caregivers given their high rates of mental health problems. Methods/Design We have designed a randomized controlled trial to assess the effectiveness of a preventative care program delivered at home over the first 12 months of life for infants born very preterm (<30 weeks of gestational age) and their families, compared with standard medical follow-up. The aim of the program, delivered over nine sessions by a team comprising a physiotherapist and psychologist, is to improve infant development (cognitive, motor and language), behavioral regulation, caregiver-child interactions and caregiver mental health at 24 months' corrected age. The infants will be stratified by severity of brain white matter injury (assessed by magnetic resonance imaging) at term equivalent age, and then randomized. At 12 months' corrected age interim outcome measures will include motor development assessed using the Alberta Infant Motor Scale and the Neurological Sensory Motor Developmental Assessment. Caregivers will also complete a questionnaire at this time to obtain information on behavior, parenting, caregiver mental health, and social support. The primary outcomes are at 24 months' corrected age and include cognitive, motor and language development assessed with the Bayley Scales of Infant and Toddler Development (Bayley-III). Secondary outcomes at 24 months include caregiver-child interaction measured using an observational task, and infant behavior, parenting, caregiver mental health and social support measured via standardized parental questionnaires. 
Discussion This paper presents the background, study design and protocol for a randomized controlled trial in very preterm infants utilizing a preventative care program in the first year after discharge home designed to improve cognitive, motor and behavioral outcomes of very preterm children and caregiver mental health at two-years' corrected age. Clinical Trial Registration Number ACTRN12605000492651 PMID:19954550

  11. Mindful Yoga for women with metastatic breast cancer: design of a randomized controlled trial.

    PubMed

    Carson, James W; Carson, Kimberly M; Olsen, Maren K; Sanders, Linda; Porter, Laura S

    2017-03-13

    Women with metastatic breast cancer (MBC) have average life expectancies of about 2 years, and report high levels of disease-related symptoms including pain, fatigue, sleep disturbance, psychological distress, and functional impairment. There is growing recognition of the limitations of medical approaches to managing such symptoms. Yoga is a mind-body discipline that has demonstrated a positive impact on psychological and functional health in early stage breast cancer patients and survivors, but has not been rigorously studied in advanced cancer samples. This randomized controlled trial examines the feasibility and initial efficacy of a Mindful Yoga program, compared with a social support condition that controls for attention, on measures of disease-related symptoms such as pain and fatigue. The study will be completed by December 2017. Sixty-five women with MBC age ≥ 18 are being identified and randomized with a 2:1 allocation to Mindful Yoga or a support group control intervention. The 120-min intervention sessions take place weekly for 8 weeks. The study is conducted at an urban tertiary care academic medical center located in Durham, North Carolina. The primary feasibility outcome is attendance at intervention sessions. Efficacy outcomes include pain, fatigue, sleep quality, psychological distress, mindfulness and functional capacity at post-intervention, 3-month follow-up, and 6-month follow-up. In this article, we present the challenges of designing a randomized controlled trial with long-term follow-up among women with MBC. These challenges include ensuring adequate recruitment including of minorities, limiting and controlling for selection bias, tailoring of the yoga intervention to address special needs, and maximizing adherence and retention. This project will provide important information regarding yoga as an intervention for women with advanced cancer, including preliminary data on the psychological and functional effects of yoga for MBC patients. 
This investigation will also establish rigorous methods for future research into yoga as an intervention for this population. ClinicalTrials.gov identifier: NCT01927081, registered August 16, 2013.

  12. Prophylaxis of Radiation-Induced Nausea and Vomiting Using 5-Hydroxytryptamine-3 Serotonin Receptor Antagonists: A Systematic Review of Randomized Trials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salvo, Nadia; Doble, Brett; Khan, Luluel

    Purpose: To systematically review the effectiveness and safety of 5-hydroxytryptamine-3 receptor antagonists (5-HT3 RAs) compared with other antiemetic medication or placebo for prophylaxis of radiation-induced nausea and vomiting. Methods and Materials: We searched the following electronic databases: MEDLINE, Embase, the Cochrane Central Register of Controlled Clinical Trials, and Web of Science. We also hand-searched reference lists of included studies. Randomized, controlled trials that compared a 5-HT3 RA with another antiemetic medication or placebo for preventing radiation-induced nausea and vomiting were included. We excluded studies recruiting patients receiving concomitant chemotherapy. When appropriate, meta-analysis was conducted using Review Manager (v5) software. Relative risks were calculated using inverse variance as the statistical method under a random-effects model. We assessed the quality of evidence by outcome using the Grading of Recommendations Assessment, Development, and Evaluation approach. Results: Eligibility screening of 47 articles resulted in 9 included in the review. The overall methodologic quality was moderate. Meta-analysis of 5-HT3 RAs vs. placebo showed significant benefit for 5-HT3 RAs (relative risk [RR] 0.70; 95% confidence interval [CI] 0.57-0.86 for emesis; RR 0.84, 95% CI 0.73-0.96 for nausea). Meta-analysis comparing 5-HT3 RAs vs. metoclopramide showed a significant benefit of the 5-HT3 RAs for emetic control (RR 0.27, 95% CI 0.15-0.47). Conclusion: 5-Hydroxytryptamine-3 RAs are superior to placebo and other antiemetics for prevention of emesis, but little benefit was identified for nausea prevention. 5-Hydroxytryptamine-3 RAs are suggested for prevention of emesis. Limited evidence was found regarding delayed emesis, adverse events, quality of life, or need for rescue medication. 
Future randomized, controlled trials should evaluate different 5-HT3 antiemetics and new agents with novel mechanisms of action, such as the NK1 receptor antagonists, to determine the most effective drug. Delayed nausea and vomiting should be a focus of future study, perhaps concentrating on the palliative cancer population.
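    The inverse-variance pooling under a random-effects model used in this review can be sketched with a standard DerSimonian-Laird estimator. The per-trial log relative risks and variances below are hypothetical, not the paper's data:

```python
import math

def pool_log_rr(log_rrs, variances):
    """Inverse-variance pooled relative risk under a DerSimonian-Laird
    random-effects model; returns the pooled RR and its 95% CI."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    # Cochran's Q heterogeneity statistic on the fixed-effect weights
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rrs))
    df = len(log_rrs) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)          # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_re, log_rrs)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
    return math.exp(pooled), ci

# Hypothetical per-trial log RRs and variances for three trials
rr, ci = pool_log_rr([-0.4, -0.3, -0.2], [0.02, 0.03, 0.05])
print(rr, ci)
```

    When the between-study variance `tau2` is zero the model collapses to the fixed-effect inverse-variance estimate; otherwise the weights shrink toward equality, widening the confidence interval.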

  13. Zhen gan xi feng decoction, a traditional chinese herbal formula, for the treatment of essential hypertension: a systematic review of randomized controlled trials.

    PubMed

    Xiong, Xingjiang; Yang, Xiaochen; Feng, Bo; Liu, Wei; Duan, Lian; Gao, Ao; Li, Haixia; Ma, Jizheng; Du, Xinliang; Li, Nan; Wang, Pengqian; Su, Kelei; Chu, Fuyong; Zhang, Guohao; Li, Xiaoke; Wang, Jie

    2013-01-01

    Objectives. To assess the clinical effectiveness and adverse effects of Zhen Gan Xi Feng Decoction (ZGXFD) for essential hypertension (EH). Methods. Five major electronic databases were searched up to August 2012 to retrieve any potential randomized controlled trials designed to evaluate the clinical effectiveness of ZGXFD for EH, reported in any language, with blood pressure (BP) as the main outcome measure. Results. Six randomized trials were included. The methodological quality of the trials was evaluated as generally low. Four trials compared prescriptions based on ZGXFD with antihypertensive drugs. Meta-analysis showed that ZGXFD was more effective than antihypertensive drugs for BP control and for TCM syndrome and symptom differentiation (TCM-SSD) scores. Two trials compared modified ZGXFD plus antihypertensive drugs with antihypertensive drugs alone. Meta-analysis showed a significant beneficial effect on TCM-SSD scores; however, no significant effect on BP was found. The safety of ZGXFD is still uncertain. Conclusions. ZGXFD appears to be effective in improving blood pressure and hypertension-related symptoms in EH. However, the evidence remains weak due to the poor methodological quality of the included studies. More rigorous trials are warranted to support its clinical use.

  14. Dry cupping for plantar fasciitis: a randomized controlled trial.

    PubMed

    Ge, Weiqing; Leson, Chelsea; Vukovic, Corey

    2017-05-01

    [Purpose] The purpose of this study was to determine the effects of dry cupping on pain and function in patients with plantar fasciitis. [Subjects and Methods] Twenty-nine subjects (aged 15 to 59 years; 20 females and 9 males), randomly assigned to two groups (dry cupping therapy and electrical stimulation therapy), participated in this study. The research design was a randomized controlled trial (RCT). Treatments were provided to the subjects twice a week for 4 weeks. Outcome measurements included the Visual Analogue Pain Scale (VAS) (at rest, first thing in the morning, and with activities), the Foot and Ankle Ability Measure (FAAM), the Lower Extremity Functional Scale (LEFS), and the pressure pain threshold. [Results] The data indicated that both dry cupping therapy and electrical stimulation therapy reduced pain and increased function significantly in the population tested, as all the 95% confidence intervals (CIs) excluded 0 except for the pressure pain threshold. There was no significant difference between the dry cupping therapy and electrical stimulation groups on any outcome measurement. [Conclusion] These results support that both dry cupping therapy and electrical stimulation therapy can reduce pain and increase function in the population tested.

  15. Dry cupping for plantar fasciitis: a randomized controlled trial

    PubMed Central

    Ge, Weiqing; Leson, Chelsea; Vukovic, Corey

    2017-01-01

    [Purpose] The purpose of this study was to determine the effects of dry cupping on pain and function in patients with plantar fasciitis. [Subjects and Methods] Twenty-nine subjects (aged 15 to 59 years; 20 females and 9 males), randomly assigned to two groups (dry cupping therapy and electrical stimulation therapy), participated in this study. The research design was a randomized controlled trial (RCT). Treatments were provided to the subjects twice a week for 4 weeks. Outcome measurements included the Visual Analogue Pain Scale (VAS) (at rest, first thing in the morning, and with activities), the Foot and Ankle Ability Measure (FAAM), the Lower Extremity Functional Scale (LEFS), and the pressure pain threshold. [Results] The data indicated that both dry cupping therapy and electrical stimulation therapy reduced pain and increased function significantly in the population tested, as all the 95% confidence intervals (CIs) excluded 0 except for the pressure pain threshold. There was no significant difference between the dry cupping therapy and electrical stimulation groups on any outcome measurement. [Conclusion] These results support that both dry cupping therapy and electrical stimulation therapy can reduce pain and increase function in the population tested. PMID:28603360

  16. Simultaneous multiplexing and encoding of multiple images based on a double random phase encryption system

    NASA Astrophysics Data System (ADS)

    Alfalou, Ayman; Mansour, Ali

    2009-09-01

    Nowadays, protecting information is a major issue in any transmission system, as shown by the increasing number of research papers related to this topic. Optical encoding methods, such as the Double Random Phase (DRP) encryption system, are widely used and cited in the literature. DRP systems have a very simple principle and are easily applicable to most images (black-and-white, gray-level, or color). Moreover, some applications require an enhanced encoding level based on a multi-encryption scheme that includes biometric keys (such as digital fingerprints). The enhancement should be achieved without increasing the amount of transmitted or stored information. To achieve that goal, a new approach for the simultaneous multiplexing and encoding of several target images is developed in this manuscript. By introducing two additional security levels, our approach enhances the security of a classic DRP system. Our first security level consists of using several independent image keys (random and structured) along with a new multiplexing algorithm. At this level, several target images are used (multi-encryption), which can reduce the amount of encoding information needed. At the second level, a standard DRP system is included. Finally, our approach can detect whether any vandalism attempt has been made on the transmitted encrypted images.

  17. Teaching Medical Ethics in Graduate and Undergraduate Medical Education: A Systematic Review of Effectiveness.

    PubMed

    de la Garza, Santiago; Phuoc, Vania; Throneberry, Steven; Blumenthal-Barby, Jennifer; McCullough, Laurence; Coverdale, John

    2017-08-01

    One objective was to identify and review studies on teaching medical ethics to psychiatry residents. In order to gain insights from other disciplines that have published research in this area, a second objective was to identify and review studies on teaching medical ethics to residents across all other specialties of training and on teaching medical students. PubMed, EMBASE, and PsycINFO were searched for controlled trials on teaching medical ethics with quantitative outcomes. Search terms included ethics, bioethics, medical ethics, medical students, residents/registrars, teaching, education, outcomes, and controlled trials. Nine studies were found that met inclusion criteria, including five randomized controlled trials and four controlled non-randomized trials. Subjects included medical students (5 studies), surgical residents (2 studies), internal medicine house officers (1 study), and family medicine preceptors and their medical students (1 study). Teaching methods, course content, and outcome measures varied considerably across studies. Common methodological issues included a lack of concealment of allocation, a lack of blinding, and generally low numbers of subjects as learners. One randomized controlled trial which taught surgical residents using a standardized patient was judged to be especially methodologically rigorous. None of the trials incorporated psychiatry residents. Ethics educators should undertake additional rigorously controlled trials in order to secure a strong evidence base for the design of medical ethics curricula. Psychiatry ethics educators can also benefit from the findings of trials in other disciplines and in undergraduate medical education.

  18. A hybrid flower pollination algorithm based modified randomized location for multi-threshold medical image segmentation.

    PubMed

    Wang, Rui; Zhou, Yongquan; Zhao, Chengyan; Wu, Haizhou

    2015-01-01

    Multi-threshold image segmentation is a powerful image processing technique that is used for the preprocessing of pattern recognition and computer vision. However, traditional multilevel thresholding methods are computationally expensive because they involve exhaustively searching the optimal thresholds to optimize the objective functions. To overcome this drawback, this paper proposes a flower pollination algorithm with a randomized location modification. The proposed algorithm is used to find optimal threshold values for maximizing Otsu's objective functions with regard to eight medical grayscale images. When benchmarked against other state-of-the-art evolutionary algorithms, the new algorithm proves itself to be robust and effective through numerical experimental results including Otsu's objective values and standard deviations.
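    The objective the flower pollination algorithm optimizes is Otsu's between-class variance. For a single threshold the exhaustive search the abstract calls expensive is still tractable, and sketching it shows what the metaheuristic must maximize for several thresholds at once. The synthetic bimodal image below is illustrative, not from the paper:

```python
import numpy as np

def otsu_threshold(gray):
    """Exhaustive single-threshold Otsu: pick the gray level that
    maximizes the between-class variance of the histogram."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    mu_total = (np.arange(256) * p).sum()
    best_t, best_var = 0, -1.0
    w0 = mu0 = 0.0
    for t in range(255):
        w0 += p[t]                  # class-0 probability mass
        mu0 += t * p[t]             # class-0 unnormalized mean
        w1 = 1.0 - w0
        if w0 == 0.0 or w1 == 0.0:
            continue
        mu1 = (mu_total - mu0) / w1
        var_between = w0 * w1 * (mu0 / w0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Hypothetical bimodal test image: two flat regions plus noise
rng = np.random.default_rng(0)
img = np.concatenate([rng.integers(40, 60, 500), rng.integers(180, 200, 500)])
print(otsu_threshold(img.astype(np.int64).reshape(25, 40)))
```

    For k thresholds the search space grows combinatorially, which is why population-based optimizers such as the proposed flower pollination variant are used instead of exhaustive enumeration.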

  19. Stochastic analysis of uncertain thermal parameters for random thermal regime of frozen soil around a single freezing pipe

    NASA Astrophysics Data System (ADS)

    Wang, Tao; Zhou, Guoqing; Wang, Jianzhou; Zhou, Lei

    2018-03-01

    The artificial ground freezing (AGF) method is widely used in civil and mining engineering, and the thermal regime of the frozen soil around the freezing pipe affects the safety of design and construction. The thermal parameters can be truly random owing to the heterogeneity of soil properties, which leads to randomness in the thermal regime of the frozen soil around the freezing pipe. The purpose of this paper is to study the one-dimensional (1D) random thermal regime problem on the basis of a stochastic analysis model and the Monte Carlo (MC) method. Treating the uncertain thermal parameters of frozen soil as random variables, stochastic processes, and random fields, the corresponding stochastic thermal regimes of the frozen soil around a single freezing pipe are obtained and analyzed. By accounting for the variability of each stochastic parameter individually, the influence of each stochastic thermal parameter on the stochastic thermal regime is investigated. The results show that the mean temperatures of the frozen soil around the single freezing pipe are the same under the three analogy methods, while the standard deviations differ. The distribution of the standard deviation differs greatly across radial locations, with the largest standard deviations mainly in the phase-change area. The data computed with the random-variable and stochastic-process methods differ greatly from the measured data, whereas the data computed with the random-field method agree well with the measured data. Each uncertain thermal parameter has a different effect on the standard deviation of the frozen soil temperature around the single freezing pipe. These results can provide a theoretical basis for the design and construction of AGF.
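    The Monte Carlo propagation of uncertain thermal parameters can be sketched on a much simpler proxy problem: steady 1-D conduction through two soil layers whose conductivities are random. The geometry, temperatures, and lognormal parameters below are illustrative assumptions, not the paper's soil data or its freezing-pipe model:

```python
import random
import statistics

def interface_temperature(k1, k2, t_pipe=-10.0, t_soil=5.0, l1=0.5, l2=0.5):
    """Steady 1-D conduction through two slabs in series: the
    interface temperature depends on both conductivities via the
    thermal resistances r = l / k."""
    r1, r2 = l1 / k1, l2 / k2
    return t_pipe + (t_soil - t_pipe) * r1 / (r1 + r2)

def monte_carlo(n, seed=0):
    """Treat the two conductivities as lognormal random variables and
    propagate the uncertainty by Monte Carlo sampling."""
    rng = random.Random(seed)
    samples = [interface_temperature(rng.lognormvariate(0.0, 0.2),
                                     rng.lognormvariate(0.2, 0.2))
               for _ in range(n)]
    return statistics.mean(samples), statistics.stdev(samples)

mean_t, std_t = monte_carlo(20000)
print(mean_t, std_t)
```

    The mean and standard deviation returned here play the roles of the mean temperature and temperature standard deviation analyzed in the abstract; a random-field treatment additionally imposes spatial correlation on the sampled parameters.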

  20. Effect of progestin vs. combined oral contraceptive pills on lactation: A double-blind randomized controlled trial

    PubMed Central

    Espey, Eve; Ogburn, Tony; Leeman, Larry; Singh, Rameet; Schrader, Ronald

    2013-01-01

    Objective To estimate the effect of progestin-only vs. combined hormonal contraceptive pills on rates of breastfeeding continuation in postpartum women. Secondary outcomes include infant growth parameters, contraceptive method continuation and patient satisfaction with breastfeeding and contraceptive method. Methods In this randomized controlled trial, postpartum breastfeeding women who desired oral contraceptives were assigned to progestin-only vs. combined hormonal contraceptive pills. At two and eight weeks postpartum, participants completed in-person questionnaires that assessed breastfeeding continuation and contraceptive use. Infant growth parameters including weight, length and head circumference were assessed at eight weeks postpartum. Telephone questionnaires assessing breastfeeding, contraceptive continuation and satisfaction were completed at 3-7 weeks and 4 and 6 months. Breastfeeding continuation was compared between groups using Cox proportional hazards regression. Differences in baseline demographic characteristics and in variables between the two intervention groups were compared using chi-square tests, Fisher’s Exact test, or two-sample t-tests as appropriate. Results Breastfeeding continuation rates, contraceptive continuation, and infant growth parameters did not differ between users of progestin-only and combined hormonal contraceptive pills. Infant formula supplementation and maternal perception of inadequate milk supply were associated with decreased rates of breastfeeding in both groups. Conclusions Choice of combined or progestin-only birth control pills administered two weeks postpartum did not adversely affect breastfeeding continuation. PMID:22143258

  1. [Meta-analysis of needle-knife treatment on cervical spondylosis].

    PubMed

    Kan, Li-Li; Wang, Hai-Dong; Liu, An-Guo

    2013-11-01

    To assess the efficacy of needle-knife treatment for cervical spondylosis on the basis of the relevant randomized controlled trial (RCT) literature, and to examine the advantages of needle-knife treatment. RCTs on needle-knife treatment of cervical spondylosis published from 2000 to 2012 were retrieved from the Chinese HowNet (CNKI) and Wanfang (WF) databases, and efficacy was analyzed with Review Manager 5.1 software. A total of 13 RCTs covering 1,419 patients were included. The methodological quality of the included studies was poor: large-sample, multi-center RCTs were lacking; the randomization methods were not described precisely; diagnostic criteria and efficacy evaluations varied; only four studies reported long-term efficacy; most did not report adverse events or dropouts; and none used blinding. The meta-analysis showed that the overall efficiency of needle-knife therapy was better than that of acupuncture and traction. Compared with acupuncture, the pooled RR was 0.19 with a 95% confidence interval of (0.15, 0.24), P < 0.00001; compared with traction, the pooled RR was 1.30 with a 95% confidence interval of (1.18, 1.42), P < 0.00001. Although needle-knife therapy shows higher overall effectiveness than both acupuncture and traction, the limited total sample size means that further high-quality RCTs are needed to confirm the result.
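    As a sketch of the pooling step behind such a meta-analysis, a fixed-effect inverse-variance combination of study-level risk ratios can be written in a few lines; the 2x2 counts below are invented for illustration, not the trial data from this review:

    ```python
    import math

    # (events_treat, n_treat, events_ctrl, n_ctrl) per study -- hypothetical counts
    studies = [(45, 50, 35, 50), (38, 40, 30, 42), (55, 60, 44, 58)]

    log_rrs, weights = [], []
    for a, n1, c, n2 in studies:
        rr = (a / n1) / (c / n2)
        var = 1 / a - 1 / n1 + 1 / c - 1 / n2   # variance of log RR
        log_rrs.append(math.log(rr))
        weights.append(1.0 / var)               # inverse-variance weight

    pooled = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    rr_pooled = math.exp(pooled)
    ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
    ```

    Software such as Review Manager performs this same combination (plus heterogeneity statistics and random-effects variants) from the per-study counts.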

  2. Sign language spotting with a threshold model based on conditional random fields.

    PubMed

    Yang, Hee-Deok; Sclaroff, Stan; Lee, Seong-Whan

    2009-07-01

    Sign language spotting is the task of detecting and recognizing signs in a signed utterance, in a set vocabulary. The difficulty of sign language spotting is that instances of signs vary in both motion and appearance. Moreover, signs appear within a continuous gesture stream, interspersed with transitional movements between signs in a vocabulary and nonsign patterns (which include out-of-vocabulary signs, epentheses, and other movements that do not correspond to signs). In this paper, a novel method for designing threshold models in a conditional random field (CRF) model is proposed which performs an adaptive threshold for distinguishing between signs in a vocabulary and nonsign patterns. A short-sign detector, a hand appearance-based sign verification method, and a subsign reasoning method are included to further improve sign language spotting accuracy. Experiments demonstrate that our system can spot signs from continuous data with an 87.0 percent spotting rate and can recognize signs from isolated data with a 93.5 percent recognition rate versus 73.5 percent and 85.4 percent, respectively, for CRFs without a threshold model, short-sign detection, subsign reasoning, and hand appearance-based sign verification. Our system can also achieve a 15.0 percent sign error rate (SER) from continuous data and a 6.4 percent SER from isolated data versus 76.2 percent and 14.5 percent, respectively, for conventional CRFs.

  3. Does Assessing Suicidality Frequently and Repeatedly Cause Harm? A Randomized Control Study

    PubMed Central

    Law, Mary Kate; Furr, R. Michael; Arnold, Elizabeth Mayfield; Mneimne, Malek; Jaquett, Caroline; Fleeson, William

    2015-01-01

    Assessing suicidality is common in mental health practice and is fundamental to suicide research. Although necessary, there is significant concern that such assessments have unintended harmful consequences. Using a longitudinal randomized control design, we evaluated whether repeated and frequent assessments of suicide-related thoughts and behaviors negatively affected individuals, including those at-risk for suicide-related outcomes. Adults (N = 282), including many diagnosed with Borderline Personality Disorder (BPD), were recruited through psychiatric outpatient clinics and from the community at large, and were randomly assigned to assessment groups. A Control Assessment group responded to questions regarding negative psychological experiences several times each day during a 2-week Main Observation phase. During the same observation period, an Intensive Suicide Assessment group responded to the same questions, along with questions regarding suicidal behavior and ideation. Negative psychological outcomes were measured during the Main Observation phase (for BPD symptoms unrelated to suicide and for BPD-relevant emotions) and/or at the end of each week during the Main Observation phase and monthly for 6 months thereafter (for all outcomes, including suicidal ideation and behavior). Results revealed little evidence that intensive suicide assessment triggered negative outcomes, including suicidal ideation or behavior, even among people with BPD. A handful of effects did reach or approach significance, though these were temporary and non-robust. However, given the seriousness of some outcomes, we recommend that researchers or clinicians who implement experience sampling methods including suicide-related items carefully consider the benefits of asking about suicide and inform participants about possible risks. PMID:25894705

  4. Does assessing suicidality frequently and repeatedly cause harm? A randomized control study.

    PubMed

    Law, Mary Kate; Furr, R Michael; Arnold, Elizabeth Mayfield; Mneimne, Malek; Jaquett, Caroline; Fleeson, William

    2015-12-01

    Assessing suicidality is common in mental health practice and is fundamental to suicide research. Although necessary, there is significant concern that such assessments have unintended harmful consequences. Using a longitudinal randomized control design, the authors evaluated whether repeated and frequent assessments of suicide-related thoughts and behaviors negatively affected individuals, including those at-risk for suicide-related outcomes. Adults (N = 282), including many diagnosed with borderline personality disorder (BPD), were recruited through psychiatric outpatient clinics and from the community at large, and were randomly assigned to assessment groups. A control assessment group responded to questions regarding negative psychological experiences several times each day during a 2-week main observation phase. During the same observation period, an intensive suicide assessment group responded to the same questions, along with questions regarding suicidal behavior and ideation. Negative psychological outcomes were measured during the main observation phase (for BPD symptoms unrelated to suicide and for BPD-relevant emotions) and/or at the end of each week during the main observation phase and monthly for 6 months thereafter (for all outcomes, including suicidal ideation and behavior). Results revealed little evidence that intensive suicide assessment triggered negative outcomes, including suicidal ideation or behavior, even among people with BPD. A handful of effects did reach or approach significance, though these were temporary and nonrobust. However, given the seriousness of some outcomes, the authors recommend that researchers or clinicians who implement experience sampling methods including suicide-related items carefully consider the benefits of asking about suicide and inform participants about possible risks. (c) 2015 APA, all rights reserved.

  5. Rehabilitation for the management of knee osteoarthritis using comprehensive traditional Chinese medicine in community health centers: study protocol for a randomized controlled trial

    PubMed Central

    2013-01-01

    Background It is becoming increasingly necessary for community health centers to make rehabilitation services available to patients with osteoarthritis of the knee. However, for a number of reasons, including a lack of expertise, the small size of community health centers and the availability of only simple medical equipment, conventional rehabilitation therapy has not been widely used in China. Consequently, most patients with knee osteoarthritis seek treatment in high-grade hospitals. However, many patients cannot manage the techniques that they were taught in the hospital. Methods such as acupuncture, tuina, Chinese medical herb fumigation-washing and t’ai chi are easy to do and have been reported to have curative effects in those with knee osteoarthritis. To date, there have been no randomized controlled trials validating comprehensive traditional Chinese medicine for the rehabilitation of knee osteoarthritis in a community health center. Furthermore, there is no standard rehabilitation protocol using traditional Chinese medicine for knee osteoarthritis. The aim of the current study is to develop a comprehensive rehabilitation protocol using traditional Chinese medicine for the management of knee osteoarthritis in a community health center. Method/design This will be a randomized controlled clinical trial with blinded assessment. There will be a 4-week intervention utilizing rehabilitation protocols from traditional Chinese medicine and conventional therapy. Follow-up will be conducted for a period of 12 weeks. A total of 722 participants with knee osteoarthritis will be recruited. Participants will be randomly divided into two groups: experimental and control. Primary outcomes will include range of motion, girth measurement, the visual analogue scale, and results from the manual muscle, six-minute walking and stair-climbing tests. Secondary outcomes will include average daily consumption of pain medication, ability to perform daily tasks and health-related quality-of-life assessments. Other outcomes will include rate of adverse events and economic effects. Relative cost-effectiveness will be determined from health service usage and outcome data. Discussion The primary aim of this trial is to develop a standard protocol for traditional Chinese medicine, which can be adopted by community health centers in China and worldwide, for the rehabilitation of patients with knee osteoarthritis. Trial registration Clinical Trials Registration: ChiCTR-TRC-12002538 PMID:24188276

  6. Design, methods, and baseline characteristics of the Kids’ Health Insurance by Educating Lots of Parents (Kids’ HELP) trial: A randomized, controlled trial of the effectiveness of parent mentors in insuring uninsured minority children

    PubMed Central

    Flores, Glenn; Walker, Candy; Lin, Hua; Lee, Michael; Fierro, Marco; Henry, Monica; Massey, Kenneth; Portillo, Alberto

    2014-01-01

    Background & objectives Six million US children have no health insurance, and substantial racial/ethnic disparities exist. The design, methods, and baseline characteristics are described for Kids’ Health Insurance by Educating Lots of Parents (Kids’ HELP), the first randomized, clinical trial of the effectiveness of Parent Mentors (PMs) in insuring uninsured minority children. Methods & research design Latino and African-American children eligible for but not enrolled in Medicaid/CHIP were randomized to PMs, or a control group receiving traditional Medicaid/CHIP outreach. PMs are experienced parents with ≥1 Medicaid/CHIP-covered child. PMs received two days of training, and provide intervention families with information on Medicaid/CHIP eligibility, assistance with application submission, and help maintaining coverage. Primary outcomes include obtaining health insurance, time interval to obtain coverage, and parental satisfaction. A blinded assessor contacts subjects monthly for one year to monitor outcomes. Results Of 49,361 candidates screened, 329 fulfilled eligibility criteria and were randomized. The mean age is seven years for children and 32 years for caregivers; 2/3 are Latino, 1/3 are African-American, and the mean annual family income is $21,857. Half of caregivers were unaware that their uninsured child is Medicaid/CHIP eligible, and 95% of uninsured children had prior insurance. Fifteen PMs completed two-day training sessions. All PMs are female and minority, 60% are unemployed, and the mean annual family income is $20,913. Post-PM-training, overall knowledge/skills test scores significantly increased, and 100% reported being very satisfied/satisfied with the training. Conclusions Kids’ HELP successfully reached target populations, met participant enrollment goals, and recruited and trained PMs. PMID:25476583

  7. Multiple imputation for assessment of exposures to drinking water contaminants: evaluation with the Atrazine Monitoring Program.

    PubMed

    Jones, Rachael M; Stayner, Leslie T; Demirtas, Hakan

    2014-10-01

    Drinking water may contain pollutants that harm human health. Depending on the pollutant, its concentration, and the community water system, monitoring may occur quarterly, annually, or less frequently. However, birth and other health outcomes are associated with narrow time-windows of exposure, so infrequent monitoring impedes linkage between water quality and health outcomes in epidemiological analyses. Our objective was to evaluate the performance of multiple imputation for filling in water quality values between measurements in community water systems (CWSs). The multiple imputation method was implemented in a simulated setting using data from the Atrazine Monitoring Program (AMP, 2006-2009, in five Midwestern states). Values were deleted from the AMP data to leave one measurement per month. Four patterns reflecting drinking water monitoring regulations were used to delete months of data in each CWS: three patterns were missing at random and one was missing not at random. Synthetic health outcome data were created using a linear and a Poisson exposure-response relationship, each with five levels of hypothesized association. The multiple imputation method was evaluated by comparing the exposure-response relationships estimated from the multiply imputed data with the hypothesized associations. The four patterns deleted 65-92% of the months of atrazine observations in the AMP data. Even with these high rates of missing information, our procedure recovered most of the missing information when the synthetic health outcome was included, for the missing at random patterns and for the missing not at random patterns with low-to-moderate exposure-response relationships. Multiple imputation appears to be an effective method for filling in water quality values between measurements. Copyright © 2014 Elsevier Inc. All rights reserved.
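    A toy version of the impute-and-pool idea, assuming a hypothetical monthly concentration series with gaps; a real multiple imputation would draw from a model conditional on covariates rather than from the marginal observed distribution, and would pool variances with Rubin's full rules:

    ```python
    import random
    import statistics

    random.seed(1)
    # hypothetical monthly atrazine concentrations (ug/L); None = unmonitored month
    series = [0.2, None, 0.4, None, 0.1, 0.5, None, 0.3]
    observed = [v for v in series if v is not None]

    m = 20                       # number of imputed datasets
    means = []
    for _ in range(m):
        # fill each gap with a random draw from the observed values
        completed = [random.choice(observed) if v is None else v for v in series]
        means.append(statistics.fmean(completed))

    pooled_mean = statistics.fmean(means)    # Rubin's rule: average the m estimates
    between_var = statistics.variance(means) # between-imputation variability
    ```

    The between-imputation variance is what distinguishes multiple imputation from single imputation: it propagates the uncertainty about the unmonitored months into downstream exposure-response estimates.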

  8. A random spatial sampling method in a rural developing nation

    Treesearch

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas

    2014-01-01

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...
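    The core of a stratified random sample with proportional allocation can be sketched in a few lines; the strata and unit labels below are hypothetical placeholders for a real sampling frame:

    ```python
    import random

    # Hypothetical sampling frame: household IDs keyed by geographic stratum
    frame = {
        "village_a": [f"a{i}" for i in range(120)],
        "village_b": [f"b{i}" for i in range(80)],
        "village_c": [f"c{i}" for i in range(40)],
    }

    random.seed(7)
    total_n = 60
    pop = sum(len(units) for units in frame.values())

    sample = {}
    for stratum, units in frame.items():
        k = round(total_n * len(units) / pop)   # proportional allocation
        sample[stratum] = random.sample(units, k)  # simple random sample within stratum
    ```

    The harder problem the abstract points to is constructing `frame` itself when the base population is incompletely enumerated; spatial methods substitute randomly selected map locations for a household list.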

  9. A general method for handling missing binary outcome data in randomized controlled trials.

    PubMed

    Jackson, Dan; White, Ian R; Mason, Dan; Sutton, Stephen

    2014-12-01

    The analysis of randomized controlled trials with incomplete binary outcome data is challenging. We develop a general method for exploring the impact of missing data in such trials, with a focus on abstinence outcomes. We propose a sensitivity analysis where standard analyses, which could include 'missing = smoking' and 'last observation carried forward', are embedded in a wider class of models. We apply our general method to data from two smoking cessation trials, which included 489 and 1758 participants, respectively. The abstinence outcomes were obtained using telephone interviews. The estimated intervention effects from both trials depend on the sensitivity parameters used. The findings differ considerably in magnitude and statistical significance under quite extreme assumptions about the missing data, but are reasonably consistent under more moderate assumptions. This new method for undertaking sensitivity analyses when handling missing data in trials with binary outcomes allows a wide range of assumptions about the missing data to be assessed. In the two smoking cessation trials, the results were insensitive to all but extreme assumptions. © 2014 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.
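    A minimal sketch of embedding 'missing = smoking' in a wider sensitivity family: a single parameter gives the assumed abstinence probability among non-responders, with 0 recovering the worst-case rule and 1 the best case. The outcome data below are invented for illustration:

    ```python
    # 1 = abstinent, 0 = smoking, None = missing (hypothetical outcomes for one arm)
    arm = [1, 0, 1, None, 1, 0, None, 1, 0, 1]

    def abstinence_rate(outcomes, p_missing_abstinent):
        """Expected abstinence rate if each missing participant is abstinent
        with probability p_missing_abstinent (the sensitivity parameter)."""
        observed = [o for o in outcomes if o is not None]
        n_missing = len(outcomes) - len(observed)
        return (sum(observed) + p_missing_abstinent * n_missing) / len(outcomes)

    worst = abstinence_rate(arm, 0.0)   # 'missing = smoking'
    mid = abstinence_rate(arm, 0.5)
    best = abstinence_rate(arm, 1.0)
    ```

    Sweeping the parameter over a plausible range per trial arm, and re-estimating the intervention effect at each value, shows how far the conclusions depend on the missing-data assumption.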

  10. Order-disorder effects in structure and color relation of photonic-crystal-type nanostructures in butterfly wing scales.

    PubMed

    Márk, Géza I; Vértesy, Zofia; Kertész, Krisztián; Bálint, Zsolt; Biró, László P

    2009-11-01

    In order to study local and global order in butterfly wing scales possessing structural colors, we have developed a direct space algorithm, based on averaging the local environment of the repetitive units building up the structure. The method provides the statistical distribution of the local environments, including the histogram of the nearest-neighbor distance and the number of nearest neighbors. We have analyzed how the different kinds of randomness present in the direct space structure influence the reciprocal space structure. It was found that the Fourier method is useful in the case of a structure randomly deviating from an ordered lattice. The direct space averaging method remains applicable even for structures lacking long-range order. Based on the first Born approximation, a link is established between the reciprocal space image and the optical reflectance spectrum. Results calculated within this framework agree well with measured reflectance spectra because of the small width and moderate refractive index contrast of butterfly scales. By the analysis of the wing scales of Cyanophrys remus and Albulina metallica butterflies, we tested the methods for structures having long-range order, medium-range order, and short-range order.
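    The direct-space statistic described here (the distribution of nearest-neighbor distances over the repetitive units) is straightforward to sketch; the point set below is a jittered square lattice standing in for a scale's nanostructure, with the jitter amplitude chosen arbitrarily:

    ```python
    import math
    import random
    from collections import Counter

    random.seed(3)
    # jittered square lattice: an ordered structure with random deviations
    pts = [(x + random.uniform(-0.1, 0.1), y + random.uniform(-0.1, 0.1))
           for x in range(10) for y in range(10)]

    def nearest_neighbor_distance(p, points):
        """Distance from p to its closest other point (brute force)."""
        return min(math.dist(p, q) for q in points if q is not p)

    dists = [nearest_neighbor_distance(p, pts) for p in pts]
    # coarse histogram of nearest-neighbor distances, binned to 0.1 lattice units
    hist = Counter(round(d, 1) for d in dists)
    ```

    For a structure with long-range order the histogram is sharply peaked at the lattice constant; increasing disorder broadens it, which is the kind of local-environment statistic the averaging method extracts.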

  11. Order-disorder effects in structure and color relation of photonic-crystal-type nanostructures in butterfly wing scales

    NASA Astrophysics Data System (ADS)

    Márk, Géza I.; Vértesy, Zofia; Kertész, Krisztián; Bálint, Zsolt; Biró, László P.

    2009-11-01

    In order to study local and global order in butterfly wing scales possessing structural colors, we have developed a direct space algorithm, based on averaging the local environment of the repetitive units building up the structure. The method provides the statistical distribution of the local environments, including the histogram of the nearest-neighbor distance and the number of nearest neighbors. We have analyzed how the different kinds of randomness present in the direct space structure influence the reciprocal space structure. It was found that the Fourier method is useful in the case of a structure randomly deviating from an ordered lattice. The direct space averaging method remains applicable even for structures lacking long-range order. Based on the first Born approximation, a link is established between the reciprocal space image and the optical reflectance spectrum. Results calculated within this framework agree well with measured reflectance spectra because of the small width and moderate refractive index contrast of butterfly scales. By the analysis of the wing scales of Cyanophrys remus and Albulina metallica butterflies, we tested the methods for structures having long-range order, medium-range order, and short-range order.

  12. Redshift data and statistical inference

    NASA Technical Reports Server (NTRS)

    Newman, William I.; Haynes, Martha P.; Terzian, Yervant

    1994-01-01

    Frequency histograms and the 'power spectrum analysis' (PSA) method, the latter developed by Yu & Peebles (1969), have been widely employed as techniques for establishing the existence of periodicities. We provide a formal analysis of these two classes of methods, including controlled numerical experiments, to better understand their proper use and application. In particular, we note that typical published applications of frequency histograms commonly employ far greater numbers of class intervals or bins than is advisable by statistical theory, sometimes giving rise to the appearance of spurious patterns. The PSA method generates a sequence of random numbers from observational data which, it is claimed, is exponentially distributed with unit mean and variance, essentially independent of the distribution of the original data. We show that the derived random process is nonstationary and produces a small but systematic bias in the usual estimate of the mean and variance. Although the derived variable may be reasonably described by an exponential distribution, the tail of the distribution is far removed from that of an exponential, thereby rendering statistical inference and confidence testing based on the tail of the distribution completely unreliable. Finally, we examine a number of astronomical examples wherein these methods have been used, giving rise to widespread acceptance of statistically unconfirmed conclusions.

  13. A randomized trial to identify accurate and cost-effective fidelity measurement methods for cognitive-behavioral therapy: project FACTS study protocol.

    PubMed

    Beidas, Rinad S; Maclean, Johanna Catherine; Fishman, Jessica; Dorsey, Shannon; Schoenwald, Sonja K; Mandell, David S; Shea, Judy A; McLeod, Bryce D; French, Michael T; Hogue, Aaron; Adams, Danielle R; Lieberman, Adina; Becker-Haimes, Emily M; Marcus, Steven C

    2016-09-15

    This randomized trial will compare three methods of assessing fidelity to cognitive-behavioral therapy (CBT) for youth to identify the most accurate and cost-effective method. The three methods include self-report (i.e., therapist completes a self-report measure on the CBT interventions used in session while circumventing some of the typical barriers to self-report), chart-stimulated recall (i.e., therapist reports on the CBT interventions used in session via an interview with a trained rater, and with the chart to assist him/her) and behavioral rehearsal (i.e., therapist demonstrates the CBT interventions used in session via a role-play with a trained rater). Direct observation will be used as the gold-standard comparison for each of the three methods. This trial will recruit 135 therapists in approximately 12 community agencies in the City of Philadelphia. Therapists will be randomized to one of the three conditions. Each therapist will provide data from three unique sessions, for a total of 405 sessions. All sessions will be audio-recorded and coded using the Therapy Process Observational Coding System for Child Psychotherapy-Revised Strategies scale. This will enable comparison of each measurement approach to direct observation of therapist session behavior to determine which most accurately assesses fidelity. Cost data associated with each method will be gathered. To gather stakeholder perspectives of each measurement method, we will use purposive sampling to recruit 12 therapists from each condition (total of 36 therapists) and 12 supervisors to participate in semi-structured qualitative interviews. Results will provide needed information on how to accurately and cost-effectively measure therapist fidelity to CBT for youth, as well as important information about stakeholder perspectives with regard to each measurement method. Findings will inform fidelity measurement practices in future implementation studies as well as in clinical practice. 
Trial registration: NCT02820623, June 3rd, 2016.

  14. Novel approaches to assess the quality of fertility data stored in dairy herd management software.

    PubMed

    Hermans, K; Waegeman, W; Opsomer, G; Van Ranst, B; De Koster, J; Van Eetvelde, M; Hostens, M

    2017-05-01

    Scientific journals and popular press magazines are littered with articles in which the authors use data from dairy herd management software. Almost none of such papers include data cleaning and data quality assessment in their study design despite this being a very critical step during data mining. This paper presents 2 novel data cleaning methods that permit identification of animals with good and bad data quality. The first method is a deterministic or rule-based data cleaning method. Reproduction and mutation or life-changing events such as birth and death were converted to a symbolic (alphabetical letter) representation and split into triplets (3-letter codes). The triplets were manually labeled as physiologically correct, suspicious, or impossible. The deterministic data cleaning method was applied to assess the quality of data stored in dairy herd management software from 26 farms enrolled in the herd health management program of the Faculty of Veterinary Medicine, Ghent University, Belgium. In total, 150,443 triplets were created; 65.4% were labeled as correct, 17.4% as suspicious, and 17.2% as impossible. The second method, a probabilistic method, uses a machine learning algorithm (random forests) to predict the correctness of fertility and mutation events in an early stage of data cleaning. The prediction accuracy of the random forests algorithm was compared with a classical linear statistical method (penalized logistic regression), outperforming the latter substantially, with a superior receiver operating characteristic curve and a higher accuracy (89 vs. 72%). From those results, we conclude that the triplet method can be used to assess the quality of reproduction data stored in dairy herd management software and that a machine learning technique such as random forests is capable of predicting the correctness of fertility data. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
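    A sketch of the deterministic triplet idea, with invented event codes and rule sets (in the paper the triplet labels were assigned manually by experts, not hard-coded):

    ```python
    # Hypothetical event codes: B = birth, I = insemination, C = calving, D = death
    CORRECT = {"BIC", "ICI", "CIC"}   # invented rule sets, for illustration only
    SUSPICIOUS = {"BCC"}

    def triplets(history):
        """Split a symbolic event history into overlapping 3-letter codes."""
        return [history[i:i + 3] for i in range(len(history) - 2)]

    def label(triplet):
        if triplet in CORRECT:
            return "correct"
        if triplet in SUSPICIOUS:
            return "suspicious"
        return "impossible"

    history = "BICIC"   # birth, insemination, calving, insemination, calving
    labels = [(t, label(t)) for t in triplets(history)]
    ```

    Animals whose histories contain impossible triplets can then be flagged or excluded before any downstream modeling, which is the data-cleaning role the method plays here.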

  15. Multilevel covariance regression with correlated random effects in the mean and variance structure.

    PubMed

    Quintero, Adrian; Lesaffre, Emmanuel

    2017-09-01

    Multivariate regression methods generally assume a constant covariance matrix for the observations. In case a heteroscedastic model is needed, the parametric and nonparametric covariance regression approaches can be restrictive in the literature. We propose a multilevel regression model for the mean and covariance structure, including random intercepts in both components and allowing for correlation between them. The implied conditional covariance function can be different across clusters as a result of the random effect in the variance structure. In addition, allowing for correlation between the random intercepts in the mean and covariance makes the model convenient for skewedly distributed responses. Furthermore, it permits us to analyse directly the relation between the mean response level and the variability in each cluster. Parameter estimation is carried out via Gibbs sampling. We compare the performance of our model to other covariance modelling approaches in a simulation study. Finally, the proposed model is applied to the RN4CAST dataset to identify the variables that impact burnout of nurses in Belgium. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Weight Control Intervention for Truck Drivers: The SHIFT Randomized Controlled Trial, United States

    PubMed Central

    Wipfli, Brad; Thompson, Sharon V.; Elliot, Diane L.; Anger, W. Kent; Bodner, Todd; Hammer, Leslie B.; Perrin, Nancy A.

    2016-01-01

    Objectives. To evaluate the effectiveness of the Safety and Health Involvement For Truckers (SHIFT) intervention with a randomized controlled design. Methods. The multicomponent intervention was a weight-loss competition supported with body weight and behavioral self-monitoring, computer-based training, and motivational interviewing. We evaluated intervention effectiveness with a cluster-randomized design involving 22 terminals from 5 companies in the United States in 2012 to 2014. Companies were required to provide interstate transportation services and operate at least 2 larger terminals. We randomly assigned terminals to intervention or usual practice control conditions. We assessed participating drivers (n = 452) at baseline and 6 months. Results. In an intent-to-treat analysis, the postintervention difference between groups in mean body mass index change was 1.00 kg/m² (P < .001; intervention = −0.73; control = +0.27). Behavioral changes included statistically significant improvements in fruit and vegetable consumption and physical activity. Conclusions. Results establish the effectiveness of a multicomponent and remotely administered intervention for producing significant weight loss among commercial truck drivers. PMID:27463067

  17. Directional Histogram Ratio at Random Probes: A Local Thresholding Criterion for Capillary Images

    PubMed Central

    Lu, Na; Silva, Jharon; Gu, Yu; Gerber, Scott; Wu, Hulin; Gelbard, Harris; Dewhurst, Stephen; Miao, Hongyu

    2013-01-01

    With the development of micron-scale imaging techniques, capillaries can be conveniently visualized using methods such as two-photon and whole mount microscopy. However, the presence of background staining, leaky vessels and the diffusion of small fluorescent molecules can lead to significant complexity in image analysis and loss of information necessary to accurately quantify vascular metrics. One solution to this problem is the development of accurate thresholding algorithms that reliably distinguish blood vessels from surrounding tissue. Although various thresholding algorithms have been proposed, our results suggest that without appropriate pre- or post-processing, the existing approaches may fail to obtain satisfactory results for capillary images that include areas of contamination. In this study, we propose a novel local thresholding algorithm, called directional histogram ratio at random probes (DHR-RP). This method explicitly considers the geometric features of tube-like objects in conducting image binarization, and has a reliable performance in distinguishing small vessels from either clean or contaminated background. Experimental and simulation studies suggest that our DHR-RP algorithm is superior over existing thresholding methods. PMID:23525856

  18. What qualitative research can contribute to a randomized controlled trial of a complex community intervention.

    PubMed

    Nelson, Geoffrey; Macnaughton, Eric; Goering, Paula

    2015-11-01

    Using the case of a large-scale, multi-site Canadian Housing First research demonstration project for homeless people with mental illness, At Home/Chez Soi, we illustrate the value of qualitative methods in a randomized controlled trial (RCT) of a complex community intervention. We argue that quantitative RCT research can neither capture the complexity nor tell the full story of a complex community intervention. We conceptualize complex community interventions as having multiple phases and dimensions that require both RCT and qualitative research components. Rather than assume that qualitative research and RCTs are incommensurate, a more pragmatic mixed methods approach was used, which included using both qualitative and quantitative methods to understand program implementation and outcomes. At the same time, qualitative research was used to examine aspects of the intervention that could not be understood through the RCT, such as its conception, planning, sustainability, and policy impacts. Through this example, we show how qualitative research can tell a more complete story about complex community interventions. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Computer-Aided Diagnosis for Breast Ultrasound Using Computerized BI-RADS Features and Machine Learning Methods.

    PubMed

    Shan, Juan; Alam, S Kaisar; Garra, Brian; Zhang, Yingtao; Ahmed, Tahira

    2016-04-01

    This work identifies effective computable features from the Breast Imaging Reporting and Data System (BI-RADS) to develop a computer-aided diagnosis (CAD) system for breast ultrasound. Computerized features corresponding to ultrasound BI-RADS categories were designed and tested using a database of 283 pathology-proven benign and malignant lesions. Features were selected based on classification performance using a "bottom-up" approach for different machine learning methods, including decision tree, artificial neural network, random forest and support vector machine. Using 10-fold cross-validation on the database of 283 cases, the highest area under the receiver operating characteristic (ROC) curve (AUC) was 0.84, from a support vector machine with 77.7% overall accuracy; the highest overall accuracy, 78.5%, was from a random forest with an AUC of 0.83. Lesion margin and orientation were the optimal features common to all of the machine learning methods. These features can be used in CAD systems to help distinguish benign from worrisome lesions. Copyright © 2016 World Federation for Ultrasound in Medicine & Biology. All rights reserved.
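The evaluation protocol described above (10-fold cross-validated AUC for a random forest versus a support vector machine) can be sketched as follows. Synthetic data stands in for the 283-lesion BI-RADS feature database, which is not described in enough detail to reproduce; the feature count, model hyperparameters, and resulting scores are assumptions for illustration only.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# 283 synthetic "lesions" with a handful of computable BI-RADS-like
# features (stand-ins for margin, orientation, etc.), binary labels.
X, y = make_classification(n_samples=283, n_features=8, n_informative=4,
                           random_state=0)

# Stratified 10-fold CV keeps the benign/malignant ratio in each fold.
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
models = {
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    # SVMs are scale-sensitive, so standardize first; probability=True
    # yields the continuous scores needed for ROC analysis.
    "SVM": make_pipeline(StandardScaler(),
                         SVC(probability=True, random_state=0)),
}
for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=cv, scoring="roc_auc").mean()
    print(f"{name}: mean AUC = {auc:.2f}")
```

Reporting the mean AUC across folds, as the paper does, is less sensitive to a lucky train/test split than a single hold-out evaluation on a database of this size.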

  20. PNF and manual therapy treatment results of patients with cervical spine osteoarthritis.

    PubMed

    Maicki, Tomasz; Bilski, Jan; Szczygieł, Elżbieta; Trąbka, Rafał

    2017-09-22

    The aim of this study was to evaluate the effectiveness of PNF and manual therapy methods in the treatment of patients with cervical spine osteoarthritis, especially their efficacy in reducing pain and improving functionality in everyday life. Long-term results were also compared in order to determine which method of treatment is more effective. Eighty randomly selected females aged 45-65 were included in the study and randomly divided into two groups of 40. One group received PNF treatment and the other received manual therapy (MAN.T). Functional capability was evaluated with the Functional Rating Index, and changes in pain with a shortened version of the McGill Questionnaire. The PNF group achieved a greater reduction in pain than the MAN.T group. The PNF group also showed a greater improvement in performing daily activities such as sleeping, personal care, travelling, work, recreation, lifting, walking and standing, as well as decreased intensity and frequency of pain, compared to the MAN.T group. The PNF method proved to be more effective in both the short term (after two weeks) and the long term (after three months).
