Lessons Learned from Large-Scale Randomized Experiments
ERIC Educational Resources Information Center
Slavin, Robert E.; Cheung, Alan C. K.
2017-01-01
Large-scale randomized studies provide the best means of evaluating practical, replicable approaches to improving educational outcomes. This article discusses the advantages, problems, and pitfalls of these evaluations, focusing on alternative methods of randomization, recruitment, ensuring high-quality implementation, dealing with attrition, and…
The Relationship of Class Size Effects and Teacher Salary
ERIC Educational Resources Information Center
Peevely, Gary; Hedges, Larry; Nye, Barbara A.
2005-01-01
The effects of class size on academic achievement have been studied for decades. Although the results of small-scale, randomized experiments and large-scale, econometric studies point to positive effects of small classes, some scholars see the evidence as ambiguous. Recent analyses from a 4-year, large-scale, randomized experiment on the effects…
Freak waves in random oceanic sea states.
Onorato, M; Osborne, A R; Serio, M; Bertone, S
2001-06-18
Freak waves are very large, rare events in a random ocean wave train. Here we study their generation in a random sea state characterized by the Joint North Sea Wave Project spectrum. We assume, to cubic order in nonlinearity, that the wave dynamics are governed by the nonlinear Schrödinger (NLS) equation. We show from extensive numerical simulations of the NLS equation how freak waves in a random sea state are more likely to occur for large values of the Phillips parameter alpha and the enhancement coefficient gamma. Comparison with linear simulations is also reported.
Osteoporosis therapies: evidence from health-care databases and observational population studies.
Silverman, Stuart L
2010-11-01
Osteoporosis is a well-recognized disease with severe consequences if left untreated. Randomized controlled trials are the most rigorous method for determining the efficacy and safety of therapies. Nevertheless, randomized controlled trials underrepresent the real-world patient population and are costly in both time and money. Modern technology has enabled researchers to use information gathered from large health-care or medical-claims databases to assess the practical utilization of available therapies in appropriate patients. Observational database studies lack randomization but, if carefully designed and successfully completed, can provide valuable information that complements results obtained from randomized controlled trials and extends our knowledge to real-world clinical patients. Randomized controlled trials comparing fracture outcomes among osteoporosis therapies are difficult to perform. In this regard, large observational database studies could be useful in identifying clinically important differences among therapeutic options. Database studies can also provide important information with regard to osteoporosis prevalence, health economics, and compliance and persistence with treatment. This article describes the strengths and limitations of both randomized controlled trials and observational database studies, discusses considerations for observational study design, and reviews a wealth of information generated by database studies in the field of osteoporosis.
Large-scale structure of randomly jammed spheres
NASA Astrophysics Data System (ADS)
Ikeda, Atsushi; Berthier, Ludovic; Parisi, Giorgio
2017-05-01
We numerically analyze the density field of three-dimensional randomly jammed packings of monodisperse soft frictionless spherical particles, paying special attention to fluctuations occurring at large length scales. We study in detail the two-point static structure factor at low wave vectors in Fourier space. We also analyze the nature of the density field in real space by studying the large-distance behavior of the two-point pair correlation function, of density fluctuations in subsystems of increasing sizes, and of the direct correlation function. We show that such real space analysis can be greatly improved by introducing a coarse-grained density field to disentangle genuine large-scale correlations from purely local effects. Our results confirm that both Fourier and real space signatures of vanishing density fluctuations at large scale are absent, indicating that randomly jammed packings are not hyperuniform. In addition, we establish that the pair correlation function displays a surprisingly complex structure at large distances, which is however not compatible with the long-range negative correlation of hyperuniform systems but fully compatible with an analytic form for the structure factor. This implies that the direct correlation function is short ranged, as we also demonstrate directly. Our results reveal that density fluctuations in jammed packings do not follow the behavior expected for random hyperuniform materials, but display instead a more complex behavior.
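The central quantity in this analysis, the structure factor at small wave vectors, is simple to estimate from particle coordinates. The sketch below is a minimal illustration (Python, with uniformly random points standing in for an actual jammed packing; box size and particle count are invented for the example): it computes S(k) = |Σ_j exp(ik·r_j)|²/N at wave vectors compatible with a periodic box. Ideal-gas points give S(k) ≈ 1, whereas a hyperuniform packing would show S(k) → 0 as k → 0.

```python
import numpy as np

rng = np.random.default_rng(1)
N, L = 4000, 10.0                     # particle count and periodic box size (toy values)
r = rng.uniform(0, L, size=(N, 3))    # stand-in coordinates; a real study would load a jammed packing

# wave vectors compatible with the periodic box: k = 2*pi*n/L
for n in range(1, 13):
    k = 2 * np.pi * n / L
    # average |rho_k|^2 / N over the three axis directions for each |k|
    s_vals = [np.abs(np.exp(1j * k * r[:, axis]).sum()) ** 2 / N for axis in range(3)]
    print(f"k = {k:6.3f}   S(k) = {np.mean(s_vals):.4f}")
```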
Comparison of evidence on harms of medical interventions in randomized and nonrandomized studies
Papanikolaou, Panagiotis N.; Christidi, Georgia D.; Ioannidis, John P.A.
2006-01-01
Background Information on major harms of medical interventions comes primarily from epidemiologic studies performed after licensing and marketing. Comparison with data from large-scale randomized trials is occasionally feasible. We compared evidence from randomized trials with that from epidemiologic studies to determine whether they give different estimates of risk for important harms of medical interventions. Methods We targeted well-defined, specific harms of various medical interventions for which data were already available from large-scale randomized trials (> 4000 subjects). Nonrandomized studies involving at least 4000 subjects addressing these same harms were retrieved through a search of MEDLINE. We compared the relative risks and absolute risk differences for specific harms in the randomized and nonrandomized studies. Results Eligible nonrandomized studies were found for 15 harms for which data were available from randomized trials addressing the same harms. Comparisons of relative risks between the study types were feasible for 13 of the 15 topics, and of absolute risk differences for 8 topics. The estimated increase in relative risk differed more than 2-fold between the randomized and nonrandomized studies for 7 (54%) of the 13 topics; the estimated increase in absolute risk differed more than 2-fold for 5 (62%) of the 8 topics. There was no clear predilection for randomized or nonrandomized studies to estimate greater relative risks, but usually (75% [6/8]) the randomized trials estimated larger absolute excess risks of harm than the nonrandomized studies did. Interpretation Nonrandomized studies are often conservative in estimating absolute risks of harms. It would be useful to compare and scrutinize the evidence on harms obtained from both randomized and nonrandomized studies. PMID:16505459
Screening large-scale association study data: exploiting interactions using random forests.
Lunetta, Kathryn L; Hayward, L Brooke; Segal, Jonathan; Van Eerdewegh, Paul
2004-12-10
Genome-wide association studies for complex diseases will produce genotypes on hundreds of thousands of single nucleotide polymorphisms (SNPs). A logical first approach to dealing with massive numbers of SNPs is to use some test to screen the SNPs, retaining only those that meet some criterion for further study. For example, SNPs can be ranked by p-value, and those with the lowest p-values retained. When SNPs have large interaction effects but small marginal effects in a population, they are unlikely to be retained when univariate tests are used for screening. However, model-based screens that pre-specify interactions are impractical for data sets with thousands of SNPs. Random forest analysis is an alternative method that produces a single measure of importance for each predictor variable that takes into account interactions among variables without requiring model specification. Interactions increase the importance for the individual interacting variables, making them more likely to be given high importance relative to other variables. We test the performance of random forests as a screening procedure to identify small numbers of risk-associated SNPs from among large numbers of unassociated SNPs using complex disease models with up to 32 loci, incorporating both genetic heterogeneity and multi-locus interaction. Keeping other factors constant, if risk SNPs interact, the random forest importance measure significantly outperforms the Fisher Exact test as a screening tool. As the number of interacting SNPs increases, the improvement in performance of random forest analysis relative to Fisher Exact test for screening also increases. Random forests perform similarly to the univariate Fisher Exact test as a screening tool when SNPs in the analysis do not interact. In the context of large-scale genetic association studies where unknown interactions exist among true risk-associated SNPs or SNPs and environmental covariates, screening SNPs using random forest analyses can significantly reduce the number of SNPs that need to be retained for further study compared to standard univariate screening methods.
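A minimal sketch of the comparison described above, on synthetic data (the genotype coding, effect sizes, and sample sizes are invented for illustration and are not the study's simulation design; scikit-learn and SciPy are assumed): two SNPs carry a mostly interactive risk effect, and all SNPs are ranked both by random forest importance and by univariate Fisher exact p-values.

```python
import numpy as np
from scipy.stats import fisher_exact
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n, p_snps = 1000, 100
X = rng.integers(0, 3, size=(n, p_snps))          # genotypes coded 0/1/2

# toy model: SNPs 0 and 1 interact to raise risk, with weak marginal effects
risk = 0.15 + 0.35 * ((X[:, 0] >= 1) & (X[:, 1] >= 1))
y = rng.random(n) < risk

rf = RandomForestClassifier(n_estimators=500, max_features="sqrt", random_state=0)
rf.fit(X, y)
rf_rank = np.argsort(rf.feature_importances_)[::-1]

# univariate screen: Fisher exact test on carrier status (>= 1 minor allele) vs outcome
pvals = []
for j in range(p_snps):
    carrier = X[:, j] >= 1
    table = [[np.sum(carrier & y), np.sum(carrier & ~y)],
             [np.sum(~carrier & y), np.sum(~carrier & ~y)]]
    pvals.append(fisher_exact(table)[1])
fisher_rank = np.argsort(pvals)

print("top 5 by random forest importance:", rf_rank[:5])
print("top 5 by Fisher exact p-value:   ", fisher_rank[:5])
```

With enough trees, the interacting SNPs tend to float to the top of the forest ranking even when their univariate p-values are unremarkable.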
Regional or general anesthesia for fast-track hip and knee replacement - what is the evidence?
Kehlet, Henrik; Aasvang, Eske Kvanner
2015-01-01
Regional anesthesia for knee and hip arthroplasty may have favorable outcome effects compared with general anesthesia by effectively blocking afferent input, providing initial postoperative analgesia, reducing endocrine metabolic responses, and providing sympathetic blockade with reduced bleeding and less risk of thromboembolic complications but with undesirable effects on lower limb motor and urinary bladder function. Old randomized studies supported the use of regional anesthesia with fewer postoperative pulmonary and thromboembolic complications, and this has been supported by recent large non-randomized epidemiological database cohort studies. In contrast, the data from newer randomized trials are conflicting, and recent studies using modern general anesthetic techniques may potentially support the use of general versus spinal anesthesia. In summary, the lack of properly designed large randomized controlled trials comparing modern general anesthesia and spinal anesthesia for knee and hip arthroplasty prevents final recommendations and calls for prospective detailed studies in this clinically important field. PMID:26918127
Chao, Ming; Wu, Hao; Jin, Kai; Li, Bin; Wu, Jianjun; Zhang, Guangqiang; Yang, Gong; Hu, Xun
2016-01-01
Study design: Previous work suggested that neutralizing intratumoral lactic acidosis combined with glucose deprivation may deliver an effective approach to control tumor. We did a pilot clinical investigation, comprising a nonrandomized controlled study (57 patients with large HCC) and a randomized controlled study (20 patients with large HCC). Methods: The patients were treated with transarterial chemoembolization (TACE) with or without bicarbonate local infusion into the tumor. Results: In the nonrandomized controlled study, the geometric mean of viable tumor residues (VTR) in TACE with bicarbonate was 6.4-fold lower than that in TACE without bicarbonate (7.1% [95% CI: 4.6%–10.9%] vs 45.6% [28.9%–72.0%]; p<0.0001). This difference was recapitulated by the subsequent randomized controlled study. TACE combined with bicarbonate yielded a 100% objective response rate (ORR), whereas the ORR with TACE alone was 44.4% (nonrandomized) and 63.6% (randomized). The survival data suggested that bicarbonate may bring survival benefit. Conclusion: Bicarbonate markedly enhances the anticancer activity of TACE. Clinical trial registration: ChiCTR-IOR-14005319. DOI: http://dx.doi.org/10.7554/eLife.15691.001 PMID:27481188
Balancing Participation across Students in Large College Classes via Randomized Participation Credit
ERIC Educational Resources Information Center
McCleary, Daniel F.; Aspiranti, Kathleen B.; Foster, Lisa N.; Blondin, Carolyn A.; Gaylon, Charles E.; Yaw, Jared S.; Forbes, Bethany N.; Williams, Robert L.
2011-01-01
The study examines the effects of randomized credit on the percentage of students participating at four predefined levels. Students recorded their comments on specially designed record cards, and days were randomly selected for participation credit. This arrangement balanced participation across students while cutting instructor time for recording…
A Data Management System Integrating Web-Based Training and Randomized Trials
ERIC Educational Resources Information Center
Muroff, Jordana; Amodeo, Maryann; Larson, Mary Jo; Carey, Margaret; Loftin, Ralph D.
2011-01-01
This article describes a data management system (DMS) developed to support a large-scale randomized study of an innovative web-course that was designed to improve substance abuse counselors' knowledge and skills in applying a substance abuse treatment method (i.e., cognitive behavioral therapy; CBT). The randomized trial compared the performance…
ERIC Educational Resources Information Center
McDonald, Lynn; Moberg, D. Paul; Brown, Roger; Rodriguez-Espiricueta, Ismael; Flores, Nydia I.; Burke, Melissa P.; Coover, Gail
2006-01-01
This randomized controlled trial evaluated a culturally representative parent engagement strategy with Latino parents of elementary school children. Ten urban schools serving low-income children from mixed cultural backgrounds participated in a large study. Classrooms were randomly assigned either to an after-school, multifamily support…
Housworth, E A; Martins, E P
2001-01-01
Statistical randomization tests in evolutionary biology often require a set of random, computer-generated trees. For example, earlier studies have shown how large numbers of computer-generated trees can be used to conduct phylogenetic comparative analyses even when the phylogeny is uncertain or unknown. These methods were limited, however, in that (in the absence of molecular sequence or other data) they allowed users to assume that no phylogenetic information was available or that all possible trees were known. Intermediate situations where only a taxonomy or other limited phylogenetic information (e.g., polytomies) are available are technically more difficult. The current study describes a procedure for generating random samples of phylogenies while incorporating limited phylogenetic information (e.g., four taxa belong together in a subclade). The procedure can be used to conduct comparative analyses when the phylogeny is only partially resolved or can be used in other randomization tests in which large numbers of possible phylogenies are needed.
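One simple way to impose such a constraint when generating trees is sketched below (a sketch of the general idea with invented helper names, not the authors' procedure; note that random sequential joining does not weight all topologies uniformly): build a random subtree on the constrained taxa first, then treat that subtree as a single leaf.

```python
import random

def random_join(leaves, rng):
    """Randomly join pairs of subtrees until a single rooted binary tree remains."""
    nodes = list(leaves)
    while len(nodes) > 1:
        i, j = rng.sample(range(len(nodes)), 2)
        merged = (nodes[i], nodes[j])
        nodes = [n for k, n in enumerate(nodes) if k not in (i, j)]
        nodes.append(merged)
    return nodes[0]

def random_tree_with_clade(taxa, clade, seed=0):
    """Random rooted binary tree on `taxa` in which `clade` is forced to be monophyletic."""
    rng = random.Random(seed)
    subtree = random_join(clade, rng)            # resolve the constrained subclade first
    rest = [t for t in taxa if t not in clade]
    return random_join(rest + [subtree], rng)    # then treat it as a single leaf

print(random_tree_with_clade(list("ABCDEFG"), list("ABCD")))
```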
NASA Technical Reports Server (NTRS)
Prasad, C. B.; Mei, Chuh
1988-01-01
The large deflection random response of symmetrically laminated cross-ply rectangular thin plates subjected to random excitation is studied. The out-of-plane boundary conditions are such that all the edges are rigidly supported against translation, but elastically restrained against rotation. The plate is also assumed to have a small initial imperfection. The assumed membrane boundary conditions are such that all the edges are free from normal and tangential forces in the plane of the plate. Mean-square deflections and mean-square strains are determined for a three-layered cross-ply laminate.
Convex hulls of random walks in higher dimensions: A large-deviation study
NASA Astrophysics Data System (ADS)
Schawe, Hendrik; Hartmann, Alexander K.; Majumdar, Satya N.
2017-12-01
The distributions of the hypervolume V and surface ∂V of convex hulls of (multiple) random walks in higher dimensions are determined numerically, especially containing probabilities far smaller than P = 10^{-1000} to estimate large-deviation properties. For arbitrary dimensions and large walk lengths T, we suggest a scaling behavior of the distribution with the length of the walk T similar to the two-dimensional case, and a behavior of the distributions in the tails. We underpin both with numerical data in d = 3 and d = 4 dimensions. Further, we confirm the analytically known means of those distributions and calculate their variances for large T.
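Plain Monte Carlo reproduces the bulk of these distributions easily; the sketch below (Python with SciPy, Gaussian steps as an assumed step distribution, toy sample counts) estimates the mean and variance of the hull volume in d = 3. The far tails quoted above, down to P = 10^{-1000}, are only reachable with dedicated large-deviation sampling, which this sketch does not attempt.

```python
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(0)
T, d, samples = 1000, 3, 200           # walk length, dimension, number of walks (toy values)

volumes = []
for _ in range(samples):
    walk = np.cumsum(rng.normal(size=(T, d)), axis=0)   # one random walk of T Gaussian steps
    volumes.append(ConvexHull(walk).volume)             # hypervolume V of its convex hull

volumes = np.array(volumes)
print(f"mean V = {volumes.mean():.1f}, var V = {volumes.var():.1f}")
# plain sampling resolves probabilities only down to ~1/samples; the tails near
# P = 10^-1000 require importance-sampling / Markov-chain large-deviation methods.
```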
Universal statistics of vortex tangles in three-dimensional random waves
NASA Astrophysics Data System (ADS)
Taylor, Alexander J.
2018-02-01
The tangled nodal lines (wave vortices) in random, three-dimensional wavefields are studied as an exemplar of a fractal loop soup. Their statistics are a three-dimensional counterpart to the characteristic random behaviour of nodal domains in quantum chaos, but in three dimensions the filaments can wind around one another to give distinctly different large scale behaviours. By tracing numerically the structure of the vortices, their conformations are shown to follow recent analytical predictions for random vortex tangles with periodic boundaries, where the local disorder of the model ‘averages out’ to produce large scale power law scaling relations whose universality classes do not depend on the local physics. These results explain previous numerical measurements in terms of an explicit effect of the periodic boundaries, where the statistics of the vortices are strongly affected by the large scale connectedness of the system even at arbitrarily high energies. The statistics are investigated primarily for static (monochromatic) wavefields, but the analytical results are further shown to directly describe the reconnection statistics of vortices evolving in certain dynamic systems, or occurring during random perturbations of the static configuration.
ERIC Educational Resources Information Center
Houston, J. Brian; First, Jennifer; Spialek, Matthew L.; Sorenson, Mary E.; Mills-Sandoval, Toby; Lockett, McKenzie; First, Nathan L.; Nitiéma, Pascal; Allen, Sandra F.; Pfefferbaum, Betty
2017-01-01
Objective: The purpose of this pilot study was to evaluate the Resilience and Coping Intervention (RCI) with college students. Participants: College students (aged 18-23) from a large Midwest US university who volunteered for a randomized controlled trial during the 2015 spring semester. Methods: College students were randomly assigned to an…
Li, Nicole; Yan, Lijing L.; Niu, Wenyi; Labarthe, Darwin; Feng, Xiangxian; Shi, Jingpu; Zhang, Jianxin; Zhang, Ruijuan; Zhang, Yuhong; Chu, Hongling; Neiman, Andrea; Engelgau, Michael; Elliott, Paul; Wu, Yangfeng; Neal, Bruce
2013-01-01
Background Cardiovascular diseases are the leading cause of death and disability in China. High blood pressure caused by excess intake of dietary sodium is widespread and an effective sodium reduction program has potential to improve cardiovascular health. Design This study is a large-scale, cluster-randomized, trial done in five Northern Chinese provinces. Two counties have been selected from each province and 12 townships in each county making a total of 120 clusters. Within each township one village has been selected for participation with 1:1 randomization stratified by county. The sodium reduction intervention comprises community health education and a food supply strategy based upon providing access to salt substitute. Subsidization of the price of salt substitute was done in 30 intervention villages selected at random. Control villages continued usual practices. The primary outcome for the study is dietary sodium intake level estimated from assays of 24 hour urine. Trial status The trial recruited and randomized 120 townships in April 2011. The sodium reduction program was commenced in the 60 intervention villages between May and June of that year with outcome surveys scheduled for October to December 2012. Baseline data collection shows that randomisation achieved good balance across groups. Discussion The establishment of the China Rural Health Initiative has enabled the launch of this large-scale trial designed to identify a novel, scalable strategy for reduction of dietary sodium and control of blood pressure. If proved effective, the intervention could plausibly be implemented at low cost in large parts of China and other countries worldwide. PMID:24176436
Impact of Probiotics on Necrotizing Enterocolitis
Underwood, Mark A.
2016-01-01
A large number of randomized placebo-controlled clinical trials and cohort studies have demonstrated a decrease in the incidence of necrotizing enterocolitis with administration of probiotic microbes. These studies have prompted many neonatologists to adopt routine prophylactic administration of probiotics while others await more definitive studies and/or probiotic products with demonstrated purity and stable numbers of live organisms. Cross-contamination and inadequate sample size limit the value of further traditional placebo-controlled randomized controlled trials. Key areas for future research include mechanisms of protection, optimum probiotic species or strains (or combinations thereof) and duration of treatment, interactions between diet and the administered probiotic, and the influence of genetic polymorphisms in the mother and infant on probiotic response. Next generation probiotics selected based on bacterial genetics rather than ease of production and large cluster-randomized clinical trials hold great promise for NEC prevention. PMID:27836423
Sampling large random knots in a confined space
NASA Astrophysics Data System (ADS)
Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.
2007-09-01
DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is of the order O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.
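The model itself is easy to experiment with: sample n vertices uniformly in a cube, close the polygon, project to a plane, and count crossings between non-adjacent edges. The sketch below (toy sizes and replicate counts, a standard orientation-based segment test) illustrates the O(n^2) growth of the average crossing number cited above.

```python
import numpy as np

rng = np.random.default_rng(0)

def orient(a, b, c):
    """Sign of the turn a -> b -> c in the plane."""
    return np.sign((b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0]))

def crossings(n):
    """Crossing count of the xy-projection of a uniform random polygon in the unit cube."""
    pts = rng.random((n, 3))[:, :2]                  # project vertices to the xy-plane
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            if j == i + 1 or (i == 0 and j == n - 1):
                continue                             # skip edges sharing a vertex
            p, q = pts[i], pts[(i + 1) % n]
            r, s = pts[j], pts[(j + 1) % n]
            if orient(p, q, r) != orient(p, q, s) and orient(r, s, p) != orient(r, s, q):
                count += 1
    return count

for n in (20, 40, 80):
    avg = np.mean([crossings(n) for _ in range(50)])
    print(f"n = {n:3d}: average crossings = {avg:6.1f}   (expected O(n^2) growth)")
```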
Udsen, Flemming Witt; Lilholt, Pernille Heyckendorff; Hejlesen, Ole; Ehlers, Lars Holger
2014-05-21
Several feasibility studies show promising results of telehealthcare on health outcomes and health-related quality of life for patients suffering from chronic obstructive pulmonary disease, and some of these studies show that telehealthcare may even lower healthcare costs. However, the only large-scale trial so far, the Whole System Demonstrator Project in England, has raised doubts about these results, since it concluded that telehealthcare as a supplement to usual care is not likely to be cost-effective compared with usual care alone. The present study, known as 'TeleCare North' in Denmark, seeks to address these doubts by implementing a large-scale, pragmatic, cluster-randomized trial with nested economic evaluation. The purpose of the study is to assess the effectiveness and the cost-effectiveness of a telehealth solution for patients suffering from chronic obstructive pulmonary disease compared to usual practice. General practitioners will be responsible for recruiting eligible participants (1,200 participants are expected) for the trial in the geographical area of the North Denmark Region. Twenty-six municipality districts in the region define the randomization clusters. The primary outcomes are changes in health-related quality of life and the incremental cost-effectiveness ratio measured from baseline to follow-up at 12 months. Secondary outcomes are changes in mortality and physiological indicators (diastolic and systolic blood pressure, pulse, oxygen saturation, and weight). There has been a call for large-scale clinical trials with rigorous cost-effectiveness assessments in telehealthcare research. This study is meant to improve the international evidence base for the effectiveness and cost-effectiveness of telehealthcare for patients suffering from chronic obstructive pulmonary disease by implementing a large-scale pragmatic cluster-randomized clinical trial. Clinicaltrials.gov, http://NCT01984840, November 14, 2013.
Complete convergence of randomly weighted END sequences and its application.
Li, Penghua; Li, Xiaoqin; Wu, Kehan
2017-01-01
We investigate the complete convergence of partial sums of randomly weighted extended negatively dependent (END) random variables. Some results of complete moment convergence, complete convergence and the strong law of large numbers for this dependent structure are obtained. As an application, we study the convergence of the state observers of linear-time-invariant systems. Our results extend the corresponding earlier ones.
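For orientation, the classical notion behind these results, stated here in its textbook (Hsu-Robbins, i.i.d.) form rather than for the paper's randomly weighted END arrays:

```latex
\[
  S_n/n \xrightarrow{\;\text{completely}\;} \mu
  \quad\Longleftrightarrow\quad
  \sum_{n=1}^{\infty} \mathbb{P}\bigl(|S_n/n-\mu|>\varepsilon\bigr)<\infty
  \quad\text{for every }\varepsilon>0.
\]
```

By the Borel-Cantelli lemma this summability forces S_n/n → μ almost surely, which is why complete convergence strengthens the strong law of large numbers.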
3D vector distribution of the electro-magnetic fields on a random gold film
NASA Astrophysics Data System (ADS)
Canneson, Damien; Berini, Bruno; Buil, Stéphanie; Hermier, Jean-Pierre; Quélin, Xavier
2018-05-01
The 3D vector distribution of the electro-magnetic fields at the very close vicinity of the surface of a random gold film is studied. Such films are well known for their properties of light confinement and large fluctuations of local density of optical states. Using Finite-Difference Time-Domain simulations, we show that it is possible to determine the local orientation of the electro-magnetic fields. This allows us to obtain a complete characterization of the fields. Large fluctuations of their amplitude are observed as previously shown. Here, we demonstrate large variations of their direction depending both on the position on the random gold film, and on the distance to it. Such characterization could be useful for a better understanding of applications like the coupling of point-like dipoles to such films.
Glutamine Randomized Studies in Early Life: The Unsolved Riddle of Experimental and Clinical Studies
Briassouli, Efrossini; Briassoulis, George
2012-01-01
Glutamine may have benefits during immaturity or critical illness in early life but its effects on hard outcome endpoints are controversial. Our aim was to review randomized studies on glutamine supplementation in pups, infants, and children examining whether glutamine affects outcome. Experimental work has proposed various mechanisms of glutamine action but none of the randomized studies in early life showed any effect on mortality and only a few showed some effect on inflammatory response, organ function, and a trend for infection control. Although apparently safe in animal models (pups), premature infants, and critically ill children, glutamine supplementation does not reduce mortality or late onset sepsis, and its routine use cannot be recommended in these sensitive populations. Large prospectively stratified trials are needed to better define the crucial interrelations of “glutamine-heat shock proteins-stress response” in critical illness and to identify the specific subgroups of premature neonates and critically ill infants or children who may have a greater need for glutamine and who may eventually benefit from its supplementation. The methodological problems noted in the reviewed randomized experimental and clinical trials should be seriously considered in any future well-designed large blinded randomized controlled trial involving glutamine supplementation in critical illness. PMID:23019424
The Random Forests Statistical Technique: An Examination of Its Value for the Study of Reading
ERIC Educational Resources Information Center
Matsuki, Kazunaga; Kuperman, Victor; Van Dyke, Julie A.
2016-01-01
Studies investigating individual differences in reading ability often involve data sets containing a large number of collinear predictors and a small number of observations. In this article, we discuss the method of Random Forests and demonstrate its suitability for addressing the statistical concerns raised by such data sets. The method is…
ERIC Educational Resources Information Center
Jackson, Dan; Bowden, Jack; Baker, Rose
2015-01-01
Moment-based estimators of the between-study variance are very popular when performing random effects meta-analyses. This type of estimation has many advantages including computational and conceptual simplicity. Furthermore, by using these estimators in large samples, valid meta-analyses can be performed without the assumption that the treatment…
Workforce Readiness: A Study of University Students' Fluency with Information Technology
ERIC Educational Resources Information Center
Kaminski, Karen; Switzer, Jamie; Gloeckner, Gene
2009-01-01
This study examined students' perceived FITness (fluency with Information Technology), drawing on data collected from a large sample of freshmen in 2001 and a stratified random sample of seniors in 2005. In the fall of 2001, freshmen at a medium-sized research-one institution completed a survey, and in spring 2005 a random sample of graduating seniors…
ERIC Educational Resources Information Center
Hedberg, E. C.; Hedges, Larry V.
2014-01-01
Randomized experiments are often considered the strongest designs to study the impact of educational interventions. Perhaps the most prevalent class of designs used in large scale education experiments is the cluster randomized design in which entire schools are assigned to treatments. In cluster randomized trials (CRTs) that assign schools to…
ARTS: automated randomization of multiple traits for study design.
Maienschein-Cline, Mark; Lei, Zhengdeng; Gardeux, Vincent; Abbasi, Taimur; Machado, Roberto F; Gordeuk, Victor; Desai, Ankit A; Saraf, Santosh; Bahroos, Neil; Lussier, Yves
2014-06-01
Collecting data from large studies on high-throughput platforms, such as microarray or next-generation sequencing, typically requires processing samples in batches. There are often systematic but unpredictable biases from batch-to-batch, so proper randomization of biologically relevant traits across batches is crucial for distinguishing true biological differences from experimental artifacts. When a large number of traits are biologically relevant, as is common for clinical studies of patients with varying sex, age, genotype and medical background, proper randomization can be extremely difficult to prepare by hand, especially because traits may affect biological inferences, such as differential expression, in a combinatorial manner. Here we present ARTS (automated randomization of multiple traits for study design), which aids researchers in study design by automatically optimizing batch assignment for any number of samples, any number of traits and any batch size. ARTS is implemented in Perl and is available at github.com/mmaiensc/ARTS. ARTS is also available in the Galaxy Tool Shed, and can be used at the Galaxy installation hosted by the UIC Center for Research Informatics (CRI) at galaxy.cri.uic.edu.
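The optimization that ARTS automates can be sketched in a few lines: score a candidate batch assignment by how far each trait's within-batch counts deviate from the pooled trait distribution, and keep the best of many random balanced assignments. The code below is an illustrative stand-in with invented function names and scoring, not the ARTS implementation.

```python
import random
from collections import Counter

def imbalance(assign, samples, traits):
    """Sum over traits and batches of squared deviation from the pooled trait distribution."""
    score = 0.0
    for t in traits:
        overall = Counter(s[t] for s in samples)
        for b in set(assign):
            sub = [s[t] for s, a in zip(samples, assign) if a == b]
            for level, cnt in overall.items():
                expected = cnt * len(sub) / len(samples)
                observed = sum(1 for v in sub if v == level)
                score += (observed - expected) ** 2
    return score

def randomize(samples, traits, n_batches, iters=2000, seed=0):
    """Random-restart search for a batch assignment balancing all traits at once."""
    rng = random.Random(seed)
    best, best_score = None, float("inf")
    for _ in range(iters):
        assign = [i % n_batches for i in range(len(samples))]   # equal batch sizes
        rng.shuffle(assign)
        score = imbalance(assign, samples, traits)
        if score < best_score:
            best, best_score = assign, score
    return best, best_score

samples = [{"sex": random.Random(i).choice("MF"),
            "age_group": random.Random(i + 99).choice(["young", "old"])}
           for i in range(24)]
assign, score = randomize(samples, ["sex", "age_group"], n_batches=3)
print("assignment:", assign, "imbalance:", round(score, 2))
```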
Mohamed, Somaia; Rosenheck, Robert A; Lin, Haiqun; Swartz, Marvin; McEvoy, Joseph; Stroup, Scott
2015-07-01
No large-scale randomized trial has compared the effect of different second-generation antipsychotic drugs and any first-generation drug on alcohol, drug and nicotine use in patients with schizophrenia. The Clinical Antipsychotic Trials of Intervention Effectiveness study randomly assigned 1432 patients formally diagnosed with schizophrenia to four second-generation antipsychotic drugs (olanzapine, risperidone, quetiapine, and ziprasidone) and one first-generation antipsychotic (perphenazine) and followed them for up to 18 months. Secondary outcome data documented cigarettes smoked in the past week and alcohol and drug use severity ratings. At baseline, 61% of patients smoked, 35% used alcohol, and 23% used illicit drugs. Although there were significant effects of time showing reduction in substance use over the 18 months (all p < 0.0001), this study found no evidence that any antipsychotic was robustly superior to any other in a secondary analysis of data on substance use outcomes from a large 18-month randomized schizophrenia trial.
Large deviations and mixing for dissipative PDEs with unbounded random kicks
NASA Astrophysics Data System (ADS)
Jakšić, V.; Nersesyan, V.; Pillet, C.-A.; Shirikyan, A.
2018-02-01
We study the problem of exponential mixing and large deviations for discrete-time Markov processes associated with a class of random dynamical systems. Under some dissipativity and regularisation hypotheses for the underlying deterministic dynamics and a non-degeneracy condition for the driving random force, we discuss the existence and uniqueness of a stationary measure and its exponential stability in the Kantorovich-Wasserstein metric. We next turn to the large deviations principle (LDP) and establish its validity for the occupation measures of the Markov processes in question. The proof is based on Kifer’s criterion for non-compact spaces, a result on large-time asymptotics for generalised Markov semigroup, and a coupling argument. These tools combined together constitute a new approach to LDP for infinite-dimensional processes without strong Feller property in a non-compact space. The results obtained can be applied to the two-dimensional Navier-Stokes system in a bounded domain and to the complex Ginzburg-Landau equation.
From randomized controlled trials to observational studies.
Silverman, Stuart L
2009-02-01
Randomized controlled trials are considered the gold standard in the hierarchy of research designs for evaluating the efficacy and safety of a treatment intervention. However, their results can have limited applicability to patients in clinical settings. Observational studies using large health care databases can complement findings from randomized controlled trials by assessing treatment effectiveness in patients encountered in day-to-day clinical practice. Results from these designs can expand upon outcomes of randomized controlled trials because of the use of larger and more diverse patient populations with common comorbidities and longer follow-up periods. Furthermore, well-designed observational studies can identify clinically important differences among therapeutic options and provide data on long-term drug effectiveness and safety.
ERIC Educational Resources Information Center
Chung, Gregory K. W. K.; Choi, Kilchan; Baker, Eva L.; Cai, Li
2014-01-01
A large-scale randomized controlled trial tested the effects of researcher-developed learning games on a transfer measure of fractions knowledge. The measure contained items similar to standardized assessments. Thirty treatment and 29 control classrooms (~1500 students, 9 districts, 26 schools) participated in the study. Students in treatment…
Mendelian randomization in nutritional epidemiology
Qi, Lu
2013-01-01
Nutritional epidemiology aims to identify dietary and lifestyle causes for human diseases. Causality inference in nutritional epidemiology is largely based on evidence from studies of observational design, and may be distorted by unmeasured or residual confounding and reverse causation. Mendelian randomization is a recently developed methodology that combines genetic and classical epidemiological analysis to infer causality for environmental exposures, based on the principle of Mendel’s law of independent assortment. Mendelian randomization uses genetic variants as proxies for environmental exposures of interest. Associations derived from Mendelian randomization analysis are less likely to be affected by confounding and reverse causation. During the past 5 years, a body of studies examined the causal effects of diet/lifestyle factors and biomarkers on a variety of diseases. The Mendelian randomization approach also holds considerable promise in the study of intrauterine influences on offspring health outcomes. However, the application of Mendelian randomization in nutritional epidemiology has some limitations. PMID:19674341
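The simplest version of the approach is worth writing down. With a single variant G instrumenting an exposure X for an outcome Y, the Wald ratio estimate of the causal effect is

```latex
\[
  \hat{\beta}_{\mathrm{IV}} \;=\; \frac{\hat{\beta}_{GY}}{\hat{\beta}_{GX}},
\]
```

where the numerator and denominator are the regression coefficients of the outcome and of the exposure on the variant; validity rests on the instrument assumptions the article alludes to (the variant is independent of confounders and affects Y only through X).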
Saville, Benjamin R.; Herring, Amy H.; Kaufman, Jay S.
2013-01-01
Racial/ethnic disparities in birthweight are a large source of differential morbidity and mortality worldwide and have remained largely unexplained in epidemiologic models. We assess the impact of maternal ancestry and census tract residence on infant birth weights in New York City and the modifying effects of race and nativity by incorporating random effects in a multilevel linear model. Evaluating the significance of these predictors involves the test of whether the variances of the random effects are equal to zero. This is problematic because the null hypothesis lies on the boundary of the parameter space. We generalize an approach for assessing random effects in the two-level linear model to a broader class of multilevel linear models by scaling the random effects to the residual variance and introducing parameters that control the relative contribution of the random effects. After integrating over the random effects and variance components, the resulting integrals needed to calculate the Bayes factor can be efficiently approximated with Laplace’s method. PMID:24082430
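For reference, the approximation invoked in the last sentence is the standard multivariate Laplace method: for a smooth integrand with an interior mode \hat{\theta} in d dimensions,

```latex
\[
  \int_{\mathbb{R}^d} e^{\,n\,h(\theta)}\,d\theta \;\approx\;
  e^{\,n\,h(\hat{\theta})}\left(\frac{2\pi}{n}\right)^{d/2}
  \bigl|\det\bigl(-\nabla^{2} h(\hat{\theta})\bigr)\bigr|^{-1/2}.
\]
```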
Cluster Tails for Critical Power-Law Inhomogeneous Random Graphs
NASA Astrophysics Data System (ADS)
van der Hofstad, Remco; Kliem, Sandra; van Leeuwaarden, Johan S. H.
2018-04-01
Recently, the scaling limit of cluster sizes for critical inhomogeneous random graphs of rank-1 type having finite variance but infinite third moment degrees was obtained in Bhamidi et al. (Ann Probab 40:2299-2361, 2012). It was proved that when the degrees obey a power law with exponent τ ∈ (3, 4), the sequence of clusters ordered in decreasing size and multiplied through by n^{-(τ-2)/(τ-1)} converges as n → ∞ to a sequence of decreasing non-degenerate random variables. Here, we study the tails of the limit of the rescaled largest cluster, i.e., the probability that the scaling limit of the largest cluster takes a large value u, as a function of u. This extends a related result of Pittel (J Combin Theory Ser B 82(2):237-269, 2001) for the Erdős-Rényi random graph to the setting of rank-1 inhomogeneous random graphs with infinite third moment degrees. We make use of delicate large deviations and weak convergence arguments.
Modeling pattern in collections of parameters
Link, W.A.
1999-01-01
Wildlife management is increasingly guided by analyses of large and complex datasets. The description of such datasets often requires a large number of parameters, among which certain patterns might be discernible. For example, one may consider a long-term study producing estimates of annual survival rates; of interest is the question whether these rates have declined through time. Several statistical methods exist for examining pattern in collections of parameters. Here, I argue for the superiority of 'random effects models' in which parameters are regarded as random variables, with distributions governed by 'hyperparameters' describing the patterns of interest. Unfortunately, implementation of random effects models is sometimes difficult. Ultrastructural models, in which the postulated pattern is built into the parameter structure of the original data analysis, are approximations to random effects models. However, this approximation is not completely satisfactory: failure to account for natural variation among parameters can lead to overstatement of the evidence for pattern among parameters. I describe quasi-likelihood methods that can be used to improve the approximation of random effects models by ultrastructural models.
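The survival-rate example can be made concrete. A random effects formulation of a possible decline in annual survival rates φ_t might read (an illustrative parameterization, not the paper's notation):

```latex
\[
  \operatorname{logit}(\phi_t) \;=\; \beta_0 + \beta_1\,t + \varepsilon_t,
  \qquad \varepsilon_t \sim \mathcal{N}(0,\sigma^2),
\]
```

with hyperparameters (β_0, β_1, σ²) governing the pattern of interest. The ultrastructural approximation amounts to dropping ε_t, and it is exactly this omission of natural among-year variation that can overstate the evidence for a trend.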
Strategies for Improving Power in School-Randomized Studies of Professional Development.
Kelcey, Ben; Phelps, Geoffrey
2013-12-01
Group-randomized designs are well suited for studies of professional development because they can accommodate programs that are delivered to intact groups (e.g., schools), the collaborative nature of professional development, and extant teacher/school assignments. Though group designs may be theoretically favorable, prior evidence has suggested that they may be challenging to conduct in professional development studies because well-powered designs will typically require large sample sizes or expect large effect sizes. Using teacher knowledge outcomes in mathematics, we investigated when and the extent to which there is evidence that covariance adjustment on a pretest, teacher certification, or demographic covariates can reduce the sample size necessary to achieve reasonable power. Our analyses drew on multilevel models and outcomes in five different content areas for over 4,000 teachers and 2,000 schools. Using these estimates, we assessed the minimum detectable effect sizes for several school-randomized designs with and without covariance adjustment. The analyses suggested that teachers' knowledge is substantially clustered within schools in each of the five content areas and that covariance adjustment for a pretest or, to a lesser extent, teacher certification, has the potential to transform designs that are unreasonably large for professional development studies into viable studies. © The Author(s) 2014.
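The arithmetic behind these design calculations follows the standard two-level cluster-randomized formula (stated here in generic notation as a sketch; the paper's exact expressions may differ):

```latex
\[
  \mathrm{MDES} \;=\; M_{J-2}\,
  \sqrt{\frac{\rho\,(1-R_B^{2})}{P(1-P)\,J}
        \;+\;\frac{(1-\rho)\,(1-R_W^{2})}{P(1-P)\,J\,n}}\,,
\]
```

with J schools of n teachers each, treatment fraction P, intraclass correlation ρ, multiplier M_{J-2} ≈ t_{α/2} + t_{1-β}, and R_B², R_W² the between- and within-school variance explained by covariates. Pretest adjustment acts through R_B², which is why it helps most when the clustering ρ is substantial, as the abstract reports for teacher knowledge.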
Open quantum random walks: Bistability on pure states and ballistically induced diffusion
NASA Astrophysics Data System (ADS)
Bauer, Michel; Bernard, Denis; Tilloy, Antoine
2013-12-01
Open quantum random walks (OQRWs) deal with quantum random motions on a line for systems with internal and orbital degrees of freedom. The internal system behaves as a quantum random gyroscope coding for the direction of the orbital moves. We reveal the existence of a transition, depending on OQRW moduli, in the internal system behaviors from simple oscillations to random flips between two unstable pure states. This induces a transition in the orbital motions from the usual diffusion to ballistically induced diffusion with a large mean free path and large effective diffusion constant at large times. We also show that mixed states of the internal system are converted into random pure states during the process. We touch upon possible experimental realizations.
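The OQRW update is direct to simulate: track an unnormalized internal density matrix ρ_x per site and apply ρ_{x±1} ← B_± ρ_x B_±†. In the sketch below the Kraus operators are scaled random unitaries, an assumed toy choice that satisfies the required normalization B_+†B_+ + B_-†B_- = I, not the operators studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def rand_unitary():
    """Random 2x2 unitary from the QR decomposition of a complex Gaussian matrix."""
    q, _ = np.linalg.qr(rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2)))
    return q

p = 0.5
B_plus, B_minus = np.sqrt(p) * rand_unitary(), np.sqrt(1 - p) * rand_unitary()

# position -> unnormalized internal density matrix; start at the origin in a pure state
state = {0: np.array([[1, 0], [0, 0]], dtype=complex)}
for _ in range(200):
    new = {}
    for x, rho in state.items():
        for step, B in ((+1, B_plus), (-1, B_minus)):
            new[x + step] = new.get(x + step, 0) + B @ rho @ B.conj().T
    state = new

probs = {x: rho.trace().real for x, rho in state.items()}   # trace gives site probabilities
mean = sum(x * w for x, w in probs.items())
var = sum(x * x * w for x, w in probs.items()) - mean ** 2
print(f"total probability = {sum(probs.values()):.6f}, mean = {mean:.2f}, variance = {var:.2f}")
```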
Freedman, Laurence S; Kipnis, Victor; Schatzkin, Arthur; Potischman, Nancy
2008-01-01
Results from several large cohort studies that were reported 10 to 20 years ago seemed to indicate that the hypothesized link between dietary fat intake and breast cancer risk was illusory. In this article, we review several strands of more recent evidence that have emerged. These include two studies comparing the performance of dietary instruments used to investigate the dietary fat–breast cancer hypothesis, a large randomized disease prevention trial, a more recent meta-analysis of nutritional cohort studies, and a very large nutritional cohort study. Each of the studies discussed in this article suggests that a modest but real association between fat intake and breast cancer is likely. If the association is causative, it would have important implications for public health strategies in reducing breast cancer incidence. The evidence is not yet conclusive, but additional follow-up in the randomized trial, as well as efforts to improve dietary assessment methodology for cohort studies, may be sufficient to provide a convincing answer.
Random packing of regular polygons and star polygons on a flat two-dimensional surface.
Cieśla, Michał; Barbasz, Jakub
2014-08-01
Random packing of unoriented regular polygons and star polygons on a two-dimensional flat continuous surface is studied numerically using random sequential adsorption algorithm. Obtained results are analyzed to determine the saturated random packing ratio as well as its density autocorrelation function. Additionally, the kinetics of packing growth and available surface function are measured. In general, stars give lower packing ratios than polygons, but when the number of vertexes is large enough, both shapes approach disks and, therefore, properties of their packing reproduce already known results for disks.
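Random sequential adsorption itself is a short algorithm: propose a uniformly random position, accept it if the new shape overlaps no earlier one, and repeat until placements become vanishingly rare. Since the abstract notes that many-vertex polygons and stars approach disks, the sketch below (toy box size and attempt counts) runs RSA for disks, whose saturated coverage is known to be about 0.547.

```python
import numpy as np

rng = np.random.default_rng(0)
L, r = 20.0, 1.0                          # box side and disk radius (toy values)
centers = np.empty((0, 2))

for t in range(1, 50_001):
    c = rng.uniform(r, L - r, size=2)     # candidate center kept away from the walls
    if centers.size == 0 or np.min(np.sum((centers - c) ** 2, axis=1)) >= (2 * r) ** 2:
        centers = np.vstack([centers, c]) # accept: no overlap with any earlier disk
    if t % 10_000 == 0:
        phi = len(centers) * np.pi * r ** 2 / L ** 2
        print(f"attempts = {t:6d}   coverage = {phi:.3f}")   # slowly approaches ~0.547
```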
Effect of H-wave polarization on laser radar detection of partially convex targets in random media.
El-Ocla, Hosam
2010-07-01
The performance of the laser radar cross section (LRCS) of conducting targets with large sizes is investigated numerically in free space and in random media. The LRCS is calculated using a boundary value method with beam wave incidence and H-wave polarization. The elements that contribute to the LRCS problem are considered, including random medium strength, target configuration, and beam width. The effect of the creeping waves, stimulated by H-polarization, on the LRCS behavior is manifested. Targets with sizes up to five wavelengths are sufficiently larger than the beam width and large enough to represent fairly complex targets. Scatterers are assumed to have analytical partially convex contours with inflection points.
Random-anisotropy model: Monotonic dependence of the coercive field on D/J
NASA Astrophysics Data System (ADS)
Saslow, W. M.; Koon, N. C.
1994-02-01
We present the results of a numerical study of the zero-temperature remanence and coercivity for the random anisotropy model (RAM), showing that, contrary to early calculations for this model, the coercive field increases monotonically with increases in the strength D of the random anisotropy relative to the strength J of the exchange field. Local-field adjustments with and without spin flips are considered. Convergence is difficult to obtain for small values of the anisotropy, suggesting that this is the likely source of the nonmonotonic behavior found in earlier studies. For both large and small anisotropy, each spin undergoes about one flip per hysteresis cycle, and about half of the spin flips occur in the vicinity of the coercive field. When only non-spin-flip adjustments are considered, at large anisotropy the coercivity is proportional to the anisotropy. At small anisotropy, the rate of convergence is comparable to that when spin flips are included.
Li, Sheng; Liu, Tong-Zu; Wang, Xing-Huan; Zeng, Xian-Tao; Zeng, Guang; Yang, Zhong-Hua; Weng, Hong; Meng, Zhe; Huang, Jing-Yu
2014-08-01
To evaluate the efficacy and safety of retroperitoneal laparoscopic pyelolithotomy (RLP) versus percutaneous nephrolithotomy (PCNL) for large renal pelvic calculi using a randomized controlled trial. Patients with large renal pelvic calculi were prospectively randomized using matched-pair analysis (1:1 scenario) into either the RLP group or the PCNL group. The patients in each group underwent the procedure accordingly. Treatment efficacy, safety, and complications were evaluated after surgery. Finally, 178 eligible patients were included and the demographics and mean stone size of two groups were similar. We found no significant differences in the mean postoperative hospital stay (4.5±2.3 vs. 4.3±1.3 days), rate of blood transfusion (0% vs. 1.1%), conversion rate (0% vs. 3.4%), and rate of total postoperative complication (p>0.05). The procedural duration and mean drop in hemoglobin levels were significantly lower in the RLP group as compared with the PCNL group (90.87±33.4 vs. 116.8±44.4 minutes, p<0.001; 0.9±0.5 vs. 1.7±1.3 g/dL, p<0.001, respectively). Significant differences were also observed in the stone-free rate (98% vs. 90%, p=0.03) and postoperative fever rate (3.4% vs. 13.5%, p=0.02). Current evidence suggests that PCNL and RLP are both effective and safe for the treatment of large renal pelvic calculi. Our study shows that, compared with the PCNL approach, RLP for large renal pelvic stone resulted in shorter operative time, less bleeding, less postoperative fever, and a higher stone-free rate. Data from larger, multicenter randomized controlled trials of high quality are needed to further confirm our findings.
ERIC Educational Resources Information Center
Steel, Jennifer L.; Herlitz, Claes A.
2005-01-01
Objective: Several studies with small and ''high risk'' samples have demonstrated that a history of childhood or adolescent sexual abuse (CASA) is associated with sexual risk behaviors (SRBs). However, few studies with large random samples from the general population have specifically examined the relationship between CASA and SRBs with a…
ERIC Educational Resources Information Center
Wheeler, Marc E.; Keller, Thomas E.; DuBois, David L.
2010-01-01
Between 2007 and 2009, reports were released on the results of three separate large-scale random assignment studies of the effectiveness of school-based mentoring programs for youth. The studies evaluated programs implemented by Big Brothers Big Sisters of America (BBBSA) affiliates (Herrera et al., 2007), Communities In Schools of San Antonio,…
Finite-range Coulomb gas models of banded random matrices and quantum kicked rotors
NASA Astrophysics Data System (ADS)
Pandey, Akhilesh; Kumar, Avanish; Puri, Sanjay
2017-11-01
Dyson demonstrated an equivalence between infinite-range Coulomb gas models and classical random matrix ensembles for the study of eigenvalue statistics. We introduce finite-range Coulomb gas (FRCG) models via a Brownian matrix process, and study them analytically and by Monte Carlo simulations. These models yield new universality classes, and provide a theoretical framework for the study of banded random matrices (BRMs) and quantum kicked rotors (QKRs). We demonstrate that, for a BRM of bandwidth b and a QKR of chaos parameter α, the appropriate FRCG model has the effective range d = b^2/N = α^2/N for large matrix dimensionality N. As d increases, there is a transition from Poisson to classical random matrix statistics.
NASA Astrophysics Data System (ADS)
Berger, Noam; Mukherjee, Chiranjib; Okamura, Kazuki
2018-03-01
We prove a quenched large deviation principle (LDP) for a simple random walk on a supercritical percolation cluster (SRWPC) on Z^d (d ≥ 2). The models under interest include classical Bernoulli bond and site percolation as well as models that exhibit long range correlations, like the random cluster model, the random interlacement and the vacant set of random interlacements (for d ≥ 3) and the level sets of the Gaussian free field (d ≥ 3). Inspired by the methods developed by Kosygina et al. (Commun Pure Appl Math 59:1489-1521, 2006) for proving quenched LDP for elliptic diffusions with a random drift, and by Yilmaz (Commun Pure Appl Math 62(8):1033-1075, 2009) and Rosenbluth (Quenched large deviations for multidimensional random walks in a random environment: a variational formula. Ph.D. thesis, NYU, arXiv:0804.1444v1) for similar results regarding elliptic random walks in random environment, we take the point of view of the moving particle and prove a large deviation principle for the quenched distribution of the pair empirical measures of the environment Markov chain in the non-elliptic case of SRWPC. Via a contraction principle, this reduces easily to a quenched LDP for the distribution of the mean velocity of the random walk and both rate functions admit explicit variational formulas. The main difficulty in our set up lies in the inherent non-ellipticity as well as the lack of translation-invariance stemming from conditioning on the fact that the origin belongs to the infinite cluster. We develop a unifying approach for proving quenched large deviations for SRWPC based on exploiting coercivity properties of the relative entropies in the context of convex variational analysis, combined with input from ergodic theory and invoking geometric properties of the supercritical percolation cluster.
NASA Astrophysics Data System (ADS)
Verma, Arjun; Privman, Vladimir
2018-02-01
We study the approach to the large-time jammed state of deposited particles in the model of random sequential adsorption. The convergence laws are usually derived from the argument of Pomeau, which rests on two assumptions: first, that at large enough times deposition is dominated by small landing regions, into each of which only a single particle can be deposited without overlapping earlier deposited particles, and which after a certain time are no longer created by depositions in larger gaps; and second, that the size distribution of gaps open for particle-center landing in this large-time small-gaps regime remains finite in the limit of zero gap size. We report numerical Monte Carlo studies of a recently introduced model of random sequential adsorption on patterned one-dimensional substrates which suggest that the second assumption must be generalized. We argue that a region exists in the parameter space of the studied model in which the gap-size distribution in the Pomeau large-time regime actually vanishes linearly at zero gap size. In another region, the distribution develops a threshold property, i.e., there are no gaps below a certain size. We discuss the implications of these findings for new asymptotic power-law and exponential-modified-by-a-power-law convergences to jamming in irreversible one-dimensional deposition.
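To make the deposition process concrete, here is a minimal Monte Carlo sketch of classical (unpatterned) one-dimensional random sequential adsorption, the baseline model whose jamming behavior the abstract generalizes; the line length, particle size, and attempt count are illustrative choices, not parameters from the paper.

```python
import bisect
import random

def rsa_1d(line_length=1000.0, particle=1.0, attempts=2_000_000, seed=1):
    """Irreversible random sequential adsorption of unit segments on a line.

    Segments land at uniformly random positions and are accepted only if
    they do not overlap any previously accepted segment. For long lines and
    many attempts the coverage should approach the Renyi jamming limit
    (about 0.7476).
    """
    rng = random.Random(seed)
    lefts = []  # sorted left endpoints of accepted segments
    for _ in range(attempts):
        x = rng.uniform(0.0, line_length - particle)
        i = bisect.bisect_left(lefts, x)
        # Reject if the previous segment extends past x ...
        if i > 0 and lefts[i - 1] + particle > x:
            continue
        # ... or if the next segment starts before x + particle.
        if i < len(lefts) and lefts[i] < x + particle:
            continue
        lefts.insert(i, x)
    return len(lefts) * particle / line_length

print(f"coverage at jamming ~ {rsa_1d():.4f}")
```

Tracking the gap sizes between consecutive accepted segments near jamming is the kind of measurement that the abstract's argument about vanishing or thresholded gap-size distributions concerns.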
Ko, Mi-Hwa
2018-01-01
In this paper, we obtain the Hájek-Rényi inequality and, as an application, we study the strong law of large numbers for H-valued m-asymptotically almost negatively associated random vectors with mixing coefficients [Formula: see text] such that [Formula: see text].
ERIC Educational Resources Information Center
Middleton, Kathryn R.; Perri, Michael G.
2014-01-01
Objective: The current study was a randomized controlled trial investigating the effect of an innovative, short-term lifestyle intervention on weight gain in female freshman college students. Participants: Ninety-five freshmen were recruited from a large public university in the United States. Methods: Participants completed baseline assessments…
A Bimodal Hybrid Model for Time-Dependent Probabilistic Seismic Hazard Analysis
NASA Astrophysics Data System (ADS)
Yaghmaei-Sabegh, Saman; Shoaeifar, Nasser; Shoaeifar, Parva
2018-03-01
The evaluation of evidence provided by geological studies and historical catalogs indicates that, in some seismic regions and faults, multiple large earthquakes occur in clusters. The occurrence of large earthquakes then gives way to quiescence, during which only small-to-moderate earthquakes take place. Clustering of large earthquakes is the most distinguishable departure from the assumption, made in conventional seismic hazard analysis, of a constant hazard of random earthquake occurrence. In the present study, a time-dependent recurrence model is proposed to describe series of large earthquakes that occur in clusters. The model is flexible enough to reflect the quasi-periodic behavior of large earthquakes with long-term clustering, and it can be used in time-dependent probabilistic seismic hazard analysis for engineering purposes. In this model, the time-dependent hazard is estimated by a hazard function comprising three parts: a decreasing hazard associated with the last large-earthquake cluster, an increasing hazard associated with the next large-earthquake cluster, and a constant hazard of random occurrence of small-to-moderate earthquakes. In the final part of the paper, the time-dependent seismic hazard of the New Madrid Seismic Zone at different time intervals is calculated for illustrative purposes.
Random number generators for large-scale parallel Monte Carlo simulations on FPGA
NASA Astrophysics Data System (ADS)
Lin, Y.; Wang, F.; Liu, B.
2018-05-01
Through parallelization, field programmable gate arrays (FPGAs) can achieve unprecedented speeds in large-scale parallel Monte Carlo (LPMC) simulations. FPGAs present both new constraints and new opportunities for the implementation of random number generators (RNGs), which are key elements of any Monte Carlo (MC) simulation system. Using empirical and application-based tests, this study evaluates all four RNGs used in previous FPGA-based MC studies, together with newly proposed FPGA implementations of two well-known high-quality RNGs suitable for LPMC studies on FPGA. One of the newly proposed implementations, a parallel version of the additive lagged Fibonacci generator (parallel ALFG), is found to be the best among the evaluated RNGs in fulfilling the needs of LPMC simulations on FPGA.
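As a point of reference for the generator the study recommends, the following is a hedged, software-only sketch of an additive lagged Fibonacci generator; the lags (24, 55) and 32-bit word size are classic textbook choices, not necessarily those of the paper's FPGA design, which would hold the lag buffer in on-chip RAM and run many such recurrences in parallel.

```python
from collections import deque
import random

class AdditiveLaggedFibonacci:
    """Additive lagged Fibonacci generator (ALFG):

        x[n] = (x[n - j] + x[n - k]) mod 2**m,  with lags 0 < j < k.
    """

    def __init__(self, seed_words, j=24, k=55, m=32):
        # At least one odd seed word is required for the maximal period.
        assert len(seed_words) == k and any(w % 2 for w in seed_words)
        self.j, self.k, self.mask = j, k, (1 << m) - 1
        self.state = deque(seed_words, maxlen=k)  # oldest entry is x[n - k]

    def next(self):
        x = (self.state[-self.j] + self.state[-self.k]) & self.mask
        self.state.append(x)  # maxlen=k silently drops the oldest word
        return x

seed = [random.getrandbits(32) | (i == 0) for i in range(55)]  # word 0 forced odd
gen = AdditiveLaggedFibonacci(seed)
print([gen.next() for _ in range(5)])
```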
Li, Baoyue; Lingsma, Hester F; Steyerberg, Ewout W; Lesaffre, Emmanuel
2011-05-23
Logistic random effects models are a popular tool to analyze multilevel (also called hierarchical) data with a binary or ordinal outcome. Here, we aim to compare different statistical software implementations of these models. We used individual patient data from 8509 patients in 231 centers with moderate and severe Traumatic Brain Injury (TBI) enrolled in eight Randomized Controlled Trials (RCTs) and three observational studies. We fitted logistic random effects regression models with the 5-point Glasgow Outcome Scale (GOS) as outcome, both dichotomized and ordinal, with center and/or trial as random effects, and with age, motor score, pupil reactivity or trial as covariates. We then compared the implementations of frequentist and Bayesian methods to estimate the fixed and random effects. Frequentist approaches included R (lme4), Stata (GLLAMM), SAS (GLIMMIX and NLMIXED), MLwiN ([R]IGLS) and MIXOR; Bayesian approaches included WinBUGS, MLwiN (MCMC), the R package MCMCglmm and the SAS experimental procedure MCMC. Three data sets (the full data set and two sub-datasets) were analysed using essentially two logistic random effects models, with either one random effect for the center or two random effects for center and trial. For the ordinal outcome in the full data set, a proportional odds model with a random center effect was also fitted. The packages gave similar parameter estimates for both the fixed and random effects and for the binary (and ordinal) models for the main study, when based on a relatively large number of level-1 (patient-level) data units compared to the number of level-2 (hospital-level) data units. However, when based on a relatively sparse data set, i.e., when the numbers of level-1 and level-2 data units were about the same, the frequentist and Bayesian approaches showed somewhat different results. The software implementations differ considerably in flexibility, computation time, and usability. There are also differences in the availability of additional tools for model evaluation, such as diagnostic plots. The experimental SAS (version 9.2) procedure MCMC appeared to be inefficient. On relatively large data sets, the different software implementations of logistic random effects regression models produced similar results. Thus, for a large data set there seems to be no explicit preference (assuming no preference from a philosophical point of view) for either a frequentist or Bayesian approach (if based on vague priors). The choice of a particular implementation may largely depend on the desired flexibility and the usability of the package. For small data sets the random effects variances are difficult to estimate. In the frequentist approaches, the MLE of this variance was often estimated as zero, with a standard error that was either zero or could not be determined, while for Bayesian methods the estimates could depend on the chosen "non-informative" prior of the variance parameter. The starting value for the variance parameter may also be critical for the convergence of the Markov chain.
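For readers unfamiliar with the model being fitted, here is a hedged simulation sketch of a two-level logistic random-intercept model of the kind compared in the study (binary outcome, random center effect); the sample sizes and coefficients are invented for illustration and are not the TBI data.

```python
import numpy as np

rng = np.random.default_rng(0)

# logit P(y = 1) = b0 + b1 * x + u[center],  u[center] ~ Normal(0, sigma_u^2)
n_centers, per_center = 231, 40
b0, b1, sigma_u = -0.5, 0.8, 0.6  # illustrative values only

center = np.repeat(np.arange(n_centers), per_center)
u = rng.normal(0.0, sigma_u, n_centers)      # random center intercepts
x = rng.normal(0.0, 1.0, center.size)        # a standardized covariate (e.g., age)
eta = b0 + b1 * x + u[center]                # linear predictor
y = rng.random(center.size) < 1.0 / (1.0 + np.exp(-eta))  # binary outcome

# The between-center spread of observed event rates reflects sigma_u.
rates = np.array([y[center == c].mean() for c in range(n_centers)])
print(f"overall event rate {y.mean():.3f}, SD of center rates {rates.std():.3f}")
```

Data simulated this way can be fed to any of the frequentist or Bayesian implementations the study compares, reproducing the sparse-versus-rich level-2 contrasts it describes by varying the number of centers and patients per center.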
Baseline adjustments for binary data in repeated cross-sectional cluster randomized trials.
Nixon, R M; Thompson, S G
2003-09-15
Analysis of covariance models, which adjust for a baseline covariate, are often used to compare treatment groups in a controlled trial in which individuals are randomized. Such analysis adjusts for any baseline imbalance and usually increases the precision of the treatment effect estimate. We assess the value of such adjustments in the context of a cluster randomized trial with repeated cross-sectional design and a binary outcome. In such a design, a new sample of individuals is taken from the clusters at each measurement occasion, so that baseline adjustment has to be at the cluster level. Logistic regression models are used to analyse the data, with cluster level random effects to allow for different outcome probabilities in each cluster. We compare the estimated treatment effect and its precision in models that incorporate a covariate measuring the cluster level probabilities at baseline and those that do not. In two data sets, taken from a cluster randomized trial in the treatment of menorrhagia, the value of baseline adjustment is only evident when the number of subjects per cluster is large. We assess the generalizability of these findings by undertaking a simulation study, and find that increased precision of the treatment effect requires both large cluster sizes and substantial heterogeneity between clusters at baseline, but baseline imbalance arising by chance in a randomized study can always be effectively adjusted for. Copyright 2003 John Wiley & Sons, Ltd.
Temprosa, M.; Otvos, J.; Brunzell, J.; Marcovina, S.; Mather, K.; Arakaki, R.; Watson, K.; Horton, E.; Barrett-Connor, E.
2013-01-01
Context: Although intensive lifestyle change (ILS) and metformin reduce diabetes incidence in subjects with impaired glucose tolerance (IGT), their effects on lipoprotein subfractions have not been studied. Objective: The objective of the study was to characterize the effects of ILS and metformin vs placebo interventions on lipoprotein subfractions in the Diabetes Prevention Program. Design: This was a randomized clinical trial, testing the effects of ILS, metformin, and placebo on diabetes development in subjects with IGT. Participants: Selected individuals with IGT randomized in the Diabetes Prevention Program participated in the study. Interventions: Interventions included randomization to metformin 850 mg or placebo twice daily or ILS aimed at a 7% weight loss using a low-fat diet with increased physical activity. Main Outcome Measures: Lipoprotein subfraction size, density, and concentration were measured by magnetic resonance and density gradient ultracentrifugation at baseline and 1 year. Results: ILS decreased large and buoyant very low-density lipoprotein, small and dense low-density lipoprotein (LDL), and small high-density lipoprotein (HDL) and raised large HDL. Metformin modestly reduced small and dense LDL and raised small and large HDL. Change in insulin resistance largely accounted for the intervention-associated decreases in large very low-density lipoprotein, whereas changes in body mass index (BMI) and adiponectin were strongly associated with changes in LDL. Baseline adiponectin and change in adiponectin were related to change in large HDL, and BMI change was associated with small HDL change. The effect of metformin to increase small HDL was independent of adiponectin, BMI, and insulin resistance. Conclusion: ILS and metformin treatment have favorable effects on lipoprotein subfractions that are primarily mediated by intervention-related changes in insulin resistance, BMI, and adiponectin. Interventions that slow the development of diabetes may also retard the progression of atherosclerosis. PMID:23979954
Metacognitive therapy versus cognitive behavioural therapy for depression: a randomized pilot study.
Jordan, Jennifer; Carter, Janet D; McIntosh, Virginia V W; Fernando, Kumari; Frampton, Christopher M A; Porter, Richard J; Mulder, Roger T; Lacey, Cameron; Joyce, Peter R
2014-10-01
Metacognitive therapy (MCT) is one of the newer developments within cognitive therapy. This randomized controlled pilot study compared independently applied MCT with cognitive behavioural therapy (CBT) in outpatients with depression, to explore the relative speed and efficacy of MCT ahead of a planned randomized controlled trial. A total of 48 participants referred for outpatient therapy were randomized to up to 12 weeks of MCT or CBT. Key outcomes were reduction in depressive symptoms at week 4 and week 12, measured using the independent-clinician-rated 16-item Quick Inventory of Depressive Symptomatology. Intention-to-treat and completer analyses, as well as additional methods of reporting depression outcomes, are presented. Both therapies were effective in producing clinically significant change in depressive symptoms, with moderate-to-large effect sizes obtained. No differences were detected between therapies in overall outcome or early change on clinician-rated or self-reported measures. Post-hoc analyses suggest that MCT may have been adversely affected by greater comorbidity. In this large pilot study conducted independently of MCT's developers, MCT was an effective treatment for outpatients with depression, with results overall similar to CBT. Insufficient power and imbalanced comorbidity limit conclusions regarding comparative efficacy, so further studies of MCT and CBT are required. © The Royal Australian and New Zealand College of Psychiatrists 2014.
Vrijheid, Martine; Deltour, Isabelle; Krewski, Daniel; Sanchez, Marie; Cardis, Elisabeth
2006-07-01
This paper examines the effects of systematic and random errors in recall and of selection bias in case-control studies of mobile phone use and cancer. These sensitivity analyses are based on Monte-Carlo computer simulations and were carried out within the INTERPHONE Study, an international collaborative case-control study in 13 countries. Recall error scenarios simulated plausible values of random and systematic, non-differential and differential recall errors in amount of mobile phone use reported by study subjects. Plausible values for the recall error were obtained from validation studies. Selection bias scenarios assumed varying selection probabilities for cases and controls, mobile phone users, and non-users. Where possible these selection probabilities were based on existing information from non-respondents in INTERPHONE. Simulations used exposure distributions based on existing INTERPHONE data and assumed varying levels of the true risk of brain cancer related to mobile phone use. Results suggest that random recall errors of plausible levels can lead to a large underestimation in the risk of brain cancer associated with mobile phone use. Random errors were found to have larger impact than plausible systematic errors. Differential errors in recall had very little additional impact in the presence of large random errors. Selection bias resulting from underselection of unexposed controls led to J-shaped exposure-response patterns, with risk apparently decreasing at low to moderate exposure levels. The present results, in conjunction with those of the validation studies conducted within the INTERPHONE study, will play an important role in the interpretation of existing and future case-control studies of mobile phone use and cancer risk, including the INTERPHONE study.
The feasibility and stability of large complex biological networks: a random matrix approach.
Stone, Lewi
2018-05-29
In the 1970s, Robert May demonstrated that complexity creates instability in generic models of ecological networks having random interaction matrices A. Similar random matrix models have since been applied in many disciplines. Central to assessing stability is the "circular law", since it describes the eigenvalue distribution for an important class of random matrices A. However, despite widespread adoption, the circular law does not apply for ecological systems in which density dependence operates (i.e., where a species' growth is determined by its density). Instead one needs to study the far more complicated eigenvalue distribution of the community matrix S = DA, where D is a diagonal matrix of population equilibrium values. Here we obtain this eigenvalue distribution. We show that if the random matrix A is locally stable, the community matrix S = DA will also be locally stable, provided the system is feasible (i.e., all species have positive equilibria, D > 0). This helps explain why, unusually, nearly all feasible systems studied here are locally stable. Large complex systems may thus be even more fragile than May predicted, given the difficulty of assembling a feasible system. It was also found that the degree of stability, or resilience, of a system depended on the minimum equilibrium population.
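The following is a hedged numerical sketch of the setting the abstract describes: a random interaction matrix A with a stabilizing diagonal, and the community matrix S = DA built from random positive equilibria; the dimension, variance, and self-regulation strength are illustrative, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n, sigma = 500, 1.0

# i.i.d. entries with variance sigma^2 / n: by the circular law the
# eigenvalues of the off-diagonal part fill a disk of radius sigma.
A = rng.normal(0.0, sigma / np.sqrt(n), (n, n))
np.fill_diagonal(A, -2.0 * sigma)  # self-regulation shifts the disk into Re < 0

D = np.diag(rng.uniform(0.1, 1.0, n))  # feasible: all equilibria positive
S = D @ A                              # the community matrix of the abstract

for name, M in (("A", A), ("S = DA", S)):
    re_max = np.linalg.eigvals(M).real.max()
    verdict = "locally stable" if re_max < 0 else "unstable"
    print(f"{name}: max Re(eigenvalue) = {re_max:+.3f} ({verdict})")
```

Consistent with the abstract's result, a stable A paired with positive D should leave S stable as well, even though the eigenvalue cloud of S is no longer circular.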
Renfrew, M J; Hannah, W; Albers, L; Floyd, E
1998-09-01
Trauma to the genital tract commonly occurs at birth, and can cause short- and long-term morbidity. Clinical measures to reduce its occurrence have not been fully identified. A systematic review of the English language literature was conducted to describe the current state of knowledge on reduction of genital tract trauma before planning a large randomized controlled trial of ways to prevent such trauma. Randomized trials and other published reports were identified from relevant databases and hand searches. Studies were reviewed and assessed using a structured format. A total of 77 papers and chapters were identified and placed into 5 categories after critical review: 25 randomized trials, 4 meta-analyses, 4 prospective studies, 36 retrospective studies, and 8 descriptions of practice from textbooks. The available evidence is conclusive in favor of restricted use of episiotomy. The contribution of maternal characteristics and attitudes to intact perineum has not been investigated. Several other topics warrant further study, including maternal position, style of pushing, and antenatal perineal massage. Strong opinions and sparse data exist regarding the role of hand maneuvers by the birth attendant for perineal management and birth of the baby. This became the topic of the planned randomized controlled trial, which was completed; results will be published soon. The case for restricting the use of episiotomy is conclusive. Several other clinical factors warrant investigation, including the role of hand maneuvers by the birth attendant in preventing birth trauma. A large randomized controlled trial will report on this topic.
ERIC Educational Resources Information Center
Wijekumar, Kausalai; Meyer, Bonnie J. F.; Lei, Pui-Wa; Lin, Yu-Chu; Johnson, Lori A.; Spielvogel, James A.; Shurmatz, Kathryn M.; Ray, Melissa; Cook, Michael
2014-01-01
This article reports on a large scale randomized controlled trial to study the efficacy of a web-based intelligent tutoring system for the structure strategy designed to improve content area reading comprehension. The research was conducted with 128 fifth-grade classrooms within 12 school districts in rural and suburban settings. Classrooms within…
Reducing Achievement Gaps in Academic Writing for Latinos and English Learners in Grades 7-12
ERIC Educational Resources Information Center
Olson, Carol Booth; Matuchniak, Tina; Chung, Huy Q.; Stumpf, Rachel; Farkas, George
2017-01-01
This study reports 2 years of findings from a randomized controlled trial designed to replicate and demonstrate the efficacy of an existing, successful professional development program, the Pathway Project, that uses a cognitive strategies approach to text-based analytical writing. Building on an earlier randomized field trial in a large, urban,…
Stability of the Markov operator and synchronization of Markovian random products
NASA Astrophysics Data System (ADS)
Díaz, Lorenzo J.; Matias, Edgar
2018-05-01
We study Markovian random products on a large class of ‘m-dimensional’ connected compact metric spaces (including products of closed intervals and trees). We introduce a splitting condition, generalizing the classical one by Dubins and Freedman, and prove that this condition implies the asymptotic stability of the corresponding Markov operator and (exponentially fast) synchronization.
An assessment of re-randomization methods in bark beetle (Scolytidae) trapping bioassays
Christopher J. Fettig; Christopher P. Dabney; Stephen R. McKelvey; Robert R. Borys
2006-01-01
Numerous studies have explored the role of semiochemicals in the behavior of bark beetles (Scolytidae). Multiple funnel traps are often used to elucidate these behavioral responses. Sufficient sample sizes are obtained by using large numbers of traps to which treatments are randomly assigned once, or by frequent collection of trap catches and subsequent re-...
On the statistical mechanics of the 2D stochastic Euler equation
NASA Astrophysics Data System (ADS)
Bouchet, Freddy; Laurie, Jason; Zaboronski, Oleg
2011-12-01
The dynamics of vortices and large scale structures is qualitatively very different in two-dimensional flows compared to their three-dimensional counterparts, due to the presence of multiple integrals of motion. These are believed to be responsible for a variety of phenomena observed in Euler flow, such as the formation of large scale coherent structures, the existence of meta-stable states, and random abrupt changes in the topology of the flow. In this paper we study the stochastic dynamics of the finite-dimensional approximation of the 2D Euler flow based on the Lie algebra su(N), which preserves all integrals of motion. In particular, we exploit the rich algebraic structure responsible for the existence of Euler's conservation laws to calculate the invariant measures, explore their properties, and study the approach to equilibrium. Unexpectedly, we find deep connections between equilibrium measures of finite-dimensional su(N) truncations of the stochastic Euler equations and random matrix models. Our work can be regarded as a preparation for addressing the questions of large scale structures, meta-stability and the dynamics of random transitions between different flow topologies in stochastic 2D Euler flows.
Quantifying Uncertainties in Land Surface Microwave Emissivity Retrievals
NASA Technical Reports Server (NTRS)
Tian, Yudong; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Prigent, Catherine; Norouzi, Hamidreza; Aires, Filipe; Boukabara, Sid-Ahmed; Furuzawa, Fumie A.; Masunaga, Hirohiko
2012-01-01
Uncertainties in the retrievals of microwave land surface emissivities were quantified over two types of land surfaces: desert and tropical rainforest. Retrievals from satellite-based microwave imagers, including SSM/I, TMI and AMSR-E, were studied. Our results show that there are considerable differences between the retrievals from different sensors and from different groups over these two land surface types. In addition, the mean emissivity values show different spectral behavior across the frequencies. With the true emissivity assumed largely constant over both of the two sites throughout the study period, the differences are largely attributed to the systematic and random errors in the retrievals. Generally these retrievals tend to agree better at lower frequencies than at higher ones, with systematic differences ranging 1%-4% (3-12 K) over desert and 1%-7% (3-20 K) over rainforest. The random errors within each retrieval dataset are in the range of 0.5%-2% (2-6 K). In particular, at 85.5/89.0 GHz, there are very large differences between the different retrieval datasets, and within each retrieval dataset itself. Further investigation reveals that these differences are most likely caused by rain/cloud contamination, which can lead to random errors up to 10-17 K under the most severe conditions.
Buffel du Vaure, Céline; Boutron, Isabelle; Perrodeau, Elodie; Ravaud, Philippe
2014-04-28
Systematic reporting of funding sources is recommended in the CONSORT Statement for abstracts. However, no specific recommendation relates to the reporting of conflicts of interest (CoI). The objective was to compare physicians' confidence in the conclusions of abstracts of randomized controlled trials of pharmaceutical treatments indexed in PubMed, according to whether funding sources and CoI were reported. We planned a three-arm parallel-group randomized trial. French general practitioners (GPs) were invited to participate and were blinded to the study's aim. We used a representative sample of 75 abstracts of pharmaceutical industry-funded randomized controlled trials published in 2010 and indexed in PubMed. Each abstract was standardized and reported in three formats: 1) no mention of the funding source or CoI; 2) reporting the funding source only; and 3) reporting the funding source and CoI. GPs were randomized by a computerized procedure on a secure Internet system, at a 1:1:1 ratio, to assess one abstract in one of the three formats. The primary outcome was GPs' confidence in the abstract conclusions (0, not at all, to 10, completely confident). The study was planned to detect a large difference, with an effect size of 0.5. Between October 2012 and June 2013, among 605 GPs contacted, 354 were randomized, 118 for each type of abstract. The mean difference (95% confidence interval) in GPs' confidence in abstract findings was 0.2 (-0.6; 1.0) (P = 0.84) for abstracts reporting the funding source only versus no funding source or CoI; -0.4 (-1.3; 0.4) (P = 0.39) for abstracts reporting the funding source and CoI versus no funding source and CoI; and -0.6 (-1.5; 0.2) (P = 0.15) for abstracts reporting the funding source and CoI versus the funding source only. We found no evidence of a large impact of trial report abstracts mentioning funding sources or CoI on GPs' confidence in the conclusions of the abstracts. ClinicalTrials.gov identifier: NCT01679873.
Mindfulness-based interventions for binge eating: a systematic review and meta-analysis.
Godfrey, Kathryn M; Gallo, Linda C; Afari, Niloofar
2015-04-01
Mindfulness-based interventions are increasingly used to treat binge eating. The effects of these interventions have not been reviewed comprehensively. This systematic review and meta-analysis sought to summarize the literature on mindfulness-based interventions and determine their impact on binge eating behavior. PubMed, Web of Science, and PsycINFO were searched using the keywords binge eating, overeating, objective bulimic episodes, acceptance and commitment therapy, dialectical behavior therapy, mindfulness, meditation, and mindful eating. Of 151 records screened, 19 studies met inclusion criteria. Most studies showed effects of large magnitude. Results of random effects meta-analyses supported large or medium-large effects of these interventions on binge eating (within-group random effects mean Hedges' g = -1.12, 95% CI -1.67, -0.80, k = 18; between-group mean Hedges' g = -0.70, 95% CI -1.16, -0.24, k = 7). However, there was high statistical heterogeneity among the studies (within-group I(2) = 93%; between-group I(2) = 90%). Limitations and future research directions are discussed.
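For context, pooled effects like those reported above are typically computed with a random-effects model; below is a hedged sketch of the standard DerSimonian-Laird estimator applied to made-up Hedges' g values (not the review's data).

```python
import numpy as np

def dersimonian_laird(g, v):
    """DerSimonian-Laird random-effects pooling.

    g : per-study effect sizes (e.g., Hedges' g); v : their variances.
    Returns the pooled effect, its standard error, and I^2 (%).
    """
    g, v = np.asarray(g, float), np.asarray(v, float)
    w = 1.0 / v                              # fixed-effect weights
    g_fe = (w * g).sum() / w.sum()
    Q = (w * (g - g_fe) ** 2).sum()          # Cochran's Q
    k = len(g)
    c = w.sum() - (w ** 2).sum() / w.sum()
    tau2 = max(0.0, (Q - (k - 1)) / c)       # between-study variance
    w_re = 1.0 / (v + tau2)                  # random-effects weights
    g_re = (w_re * g).sum() / w_re.sum()
    se = (1.0 / w_re.sum()) ** 0.5
    i2 = 100.0 * max(0.0, (Q - (k - 1)) / Q) if Q > 0 else 0.0
    return g_re, se, i2

# Invented within-group effects for four hypothetical studies:
g, se, i2 = dersimonian_laird([-1.4, -0.9, -1.2, -0.6], [0.10, 0.08, 0.15, 0.05])
print(f"pooled g = {g:.2f} (SE {se:.2f}), I^2 = {i2:.0f}%")
```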
Some limit theorems for ratios of order statistics from uniform random variables.
Xu, Shou-Fang; Miao, Yu
2017-01-01
In this paper, we study the ratios of order statistics based on samples drawn from uniform distribution and establish some limit properties such as the almost sure central limit theorem, the large deviation principle, the Marcinkiewicz-Zygmund law of large numbers and complete convergence.
El-Ocla, Hosam
2006-08-01
The characteristics of the radar cross section (RCS) of partially convex targets with large sizes, up to five wavelengths, in free space and random media are studied. The nature of the incident wave is an important factor in remote sensing and radar detection applications. I investigate the effects of beam wave incidence on the behavior of the RCS, drawing on the method I used in a previous study on plane-wave incidence. A beam wave can be considered a plane wave if the target size is smaller than the beam width. Therefore, to have a beam wave with a limited spot on the target, the target size should be larger than the beam width (assuming E-wave incidence polarization). The effects of the target configuration, the random medium parameters, and the beam width on the laser RCS and on the enhancement of the radar cross section are numerically analyzed, suggesting the possibility of some control over radar detection using beam wave incidence.
ERIC Educational Resources Information Center
Smith, Sherri L.; Saunders, Gabrielle H.; Chisolm, Theresa H.; Frederick, Melissa; Bailey, Beth A.
2016-01-01
Purpose: The purpose of this study was to determine if patient characteristics or clinical variables could predict who benefits from individual auditory training. Method: A retrospective series of analyses were performed using a data set from a large, multisite, randomized controlled clinical trial that compared the treatment effects of at-home…
ERIC Educational Resources Information Center
Gersten, Russell; Rolfhus, Eric; Clarke, Ben; Decker, Lauren E.; Wilkins, Chuck; Dimino, Joseph
2015-01-01
Replication studies are extremely rare in education. This randomized controlled trial (RCT) is a scale-up replication of Fuchs et al., which in a sample of 139 found a statistically significant positive impact for Number Rockets, a small-group intervention for at-risk first graders that focused on building understanding of number operations. The…
Real-time fast physical random number generator with a photonic integrated circuit.
Ugajin, Kazusa; Terashima, Yuta; Iwakawa, Kento; Uchida, Atsushi; Harayama, Takahisa; Yoshimura, Kazuyuki; Inubushi, Masanobu
2017-03-20
Random number generators are essential for applications in information security and numerical simulations. Most optical-chaos-based random number generators produce random bit sequences by offline post-processing with large optical components. We demonstrate a real-time hardware implementation of a fast physical random number generator with a photonic integrated circuit and a field programmable gate array (FPGA) electronic board. We generate 1-Tbit random bit sequences and evaluate their statistical randomness using NIST Special Publication 800-22 and TestU01. All of the BigCrush tests in TestU01 are passed using 410-Gbit random bit sequences. A maximum real-time generation rate of 21.1 Gb/s is achieved for random bit sequences in binary format stored in a computer, which can be directly used for applications involving secret keys in cryptography and random seeds in large-scale numerical simulations.
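As a small illustration of the statistical evaluation mentioned above, here is a sketch of the first test in NIST Special Publication 800-22, the frequency (monobit) test; a real evaluation like the paper's runs the full battery (and TestU01's BigCrush) on terabit-scale sequences rather than the short software-generated sample used here.

```python
import math
import random

def monobit_frequency_test(bits):
    """NIST SP 800-22 frequency (monobit) test.

    Maps bits to +/-1, sums them, and converts the normalized partial sum
    to a p-value; p >= 0.01 is the conventional pass criterion.
    """
    n = len(bits)
    s = sum(2 * b - 1 for b in bits)        # +1 for each 1-bit, -1 for each 0-bit
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2.0))

bits = [random.getrandbits(1) for _ in range(1_000_000)]
p = monobit_frequency_test(bits)
print(f"p-value = {p:.4f} -> {'pass' if p >= 0.01 else 'fail'}")
```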
van Staa, T-P; Klungel, O; Smeeth, L
2014-06-01
A solid foundation of evidence of the effects of an intervention is a prerequisite of evidence-based medicine. The best source of such evidence is considered to be randomized trials, which are able to avoid confounding. However, they may not always estimate effectiveness in clinical practice. Databases that collate anonymized electronic health records (EHRs) from different clinical centres have been widely used for many years in observational studies. Randomized point-of-care trials have been initiated recently to recruit and follow patients using the data from EHR databases. In this review, we describe how EHR databases can be used for conducting large-scale simple trials and discuss the advantages and disadvantages of their use. © 2014 The Association for the Publication of the Journal of Internal Medicine.
Corrected Mean-Field Model for Random Sequential Adsorption on Random Geometric Graphs
NASA Astrophysics Data System (ADS)
Dhara, Souvik; van Leeuwaarden, Johan S. H.; Mukherjee, Debankur
2018-03-01
A notorious problem in mathematics and physics is to create a solvable model for random sequential adsorption of non-overlapping congruent spheres in the d-dimensional Euclidean space with d ≥ 2. Spheres arrive sequentially at uniformly chosen locations in space and are accepted only when there is no overlap with previously deposited spheres. Due to spatial correlations, characterizing the fraction of accepted spheres remains largely intractable. We study this fraction by taking a novel approach that compares random sequential adsorption in Euclidean space to the nearest-neighbor blocking on a sequence of clustered random graphs. This random network model can be thought of as a corrected mean-field model for the interaction graph between the attempted spheres. Using functional limit theorems, we characterize the fraction of accepted spheres and its fluctuations.
A systematic review of randomized trials of mind-body interventions for PTSD.
Niles, Barbara L; Mori, DeAnna L; Polizzi, Craig; Pless Kaiser, Anica; Weinstein, Elizabeth S; Gershkovich, Marina; Wang, Chenchen
2018-05-10
To systematically review outcomes from randomized controlled trials (RCTs) of mind-body treatments for PTSD. Inclusion criteria based on guidelines for assessing risk of bias were used to evaluate articles identified through electronic literature searches. Twenty-two RCTs met inclusion standards. In most of the nine mindfulness and six yoga studies, significant between-group effects were found, indicating moderate to large effect-size advantages for these treatments. In all seven relaxation RCTs, relaxation was used as a control condition, and five studies reported significant between-group differences on relevant PTSD outcomes in favor of the target treatments. However, there were large within-group symptom improvements in the relaxation condition in the majority of studies. Although many studies are limited by methodologic weaknesses, recent studies have increased rigor and, in aggregate, the results for mindfulness, yoga, and relaxation are promising. Recommendations for the design of future mind-body trials are offered. © 2018 Wiley Periodicals, Inc.
A multicenter, randomized, controlled trial of osteopathic manipulative treatment on preterms.
Cerritelli, Francesco; Pizzolorusso, Gianfranco; Renzetti, Cinzia; Cozzolino, Vincenzo; D'Orazio, Marianna; Lupacchini, Mariacristina; Marinelli, Benedetta; Accorsi, Alessandro; Lucci, Chiara; Lancellotti, Jenny; Ballabio, Silvia; Castelli, Carola; Molteni, Daniela; Besana, Roberto; Tubaldi, Lucia; Perri, Francesco Paolo; Fusilli, Paola; D'Incecco, Carmine; Barlafante, Gina
2015-01-01
Despite some preliminary evidence, it is still largely unknown whether osteopathic manipulative treatment improves preterm clinical outcomes. The present multi-center, randomized, single-blind, parallel-group clinical trial enrolled newborns who met the criteria for gestational age between 29 and 37 weeks, without any congenital complication, from 3 different public neonatal intensive care units. Preterm infants were randomly assigned to usual care (control group) or osteopathic manipulative treatment (study group). The primary outcome was the mean difference in length of hospital stay between groups. A total of 695 newborns were randomly assigned to either the study group (n = 352) or the control group (n = 343). A statistically significant difference was observed between the two groups for the primary outcome (13.8 and 17.5 days for the study and control group, respectively; p < 0.001; effect size: 0.31). Multivariate analysis showed a reduction in the length of stay of 3.9 days (95% CI -5.5 to -2.3, p < 0.001). Furthermore, there were significant reductions with treatment as compared to usual care in cost (difference between study and control group: 1,586.01€; 95% CI 1,087.18 to 6,277.28; p < 0.001) but not in daily weight gain. There were no complications associated with the intervention. Osteopathic treatment significantly reduced the number of days of hospitalization and was cost-effective in a large cohort of preterm infants.
Entanglement transitions induced by large deviations
NASA Astrophysics Data System (ADS)
Bhosale, Udaysinh T.
2017-12-01
The probability of large deviations of the smallest Schmidt eigenvalue for random pure states of bipartite systems, denoted as A and B, is computed analytically using a Coulomb gas method. It is shown that this probability, for large N, goes as exp[-βN^2 Φ(ζ)], where the parameter β is the Dyson index of the ensemble, ζ is the large deviation parameter, while the rate function Φ(ζ) is calculated exactly. Corresponding equilibrium Coulomb charge density is derived for its large deviations. Effects of the large deviations of the extreme (largest and smallest) Schmidt eigenvalues on the bipartite entanglement are studied using the von Neumann entropy. Effect of these deviations is also studied on the entanglement between subsystems 1 and 2, obtained by further partitioning the subsystem A, using the properties of the density matrix's partial transpose ρ_{12}^{Γ}. The density of states of ρ_{12}^{Γ} is found to be close to the Wigner's semicircle law with these large deviations. The entanglement properties are captured very well by a simple random matrix model for the partial transpose. The model predicts the entanglement transition across a critical large deviation parameter ζ. Log negativity is used to quantify the entanglement between subsystems 1 and 2. Analytical formulas for it are derived using the simple model. Numerical simulations are in excellent agreement with the analytical results.
ERIC Educational Resources Information Center
Hasson, H.; Brown, C.; Hasson, D.
2010-01-01
In web-based health promotion programs, large variations in participant engagement are common. The aim was to investigate determinants of high use of a worksite self-help web-based program for stress management. Two versions of the program were offered to randomly selected departments in IT and media companies. A static version of the program…
Shen, Deqiang; Bai, Hao; Li, Zhaoping; Yu, Yue; Zhang, Huanhuan; Chen, Liyong
2017-03-01
Animal experimental studies have found that resistant starch can significantly improve bowel function, but the outcomes of human studies are mixed. Thus, we conducted a systematic review and meta-analysis of randomized controlled trials to evaluate the relationship between resistant starch supplementation and large intestinal function. Three electronic databases (PubMed, Embase, Scopus) were searched to identify eligible studies. The standardized mean difference (SMD) or weighted mean difference (WMD) was calculated using a fixed-effects model or a random-effects model. The pooled findings revealed that resistant starch significantly increased fecal wet weight (WMD 35.51 g/d, 95% CI 1.21, 69.82) and butyrate concentration (SMD 0.61, 95% CI 0.32, 0.89). It also significantly reduced fecal pH (WMD -0.19, 95% CI -0.35, -0.03), but the increase in defecation frequency was not statistically significant (WMD 0.04 stools/d, 95% CI -0.08, 0.16). To conclude, our study found that resistant starch elicited a beneficial effect on the function of the large bowel in healthy adults. [Formula: see text]
Mediterranean diet and life expectancy; beyond olive oil, fruits, and vegetables.
Martinez-Gonzalez, Miguel A; Martin-Calvo, Nerea
2016-11-01
This review summarizes recent evidence (2015 and the first months of 2016) on the effects of the Mediterranean diet (MedDiet) and lifestyle on health. Large observational prospective epidemiological studies with adequate control of confounding and two large randomized trials support the benefits of the Mediterranean dietary pattern in increasing life expectancy, reducing the risk of major chronic disease, and improving quality of life and well-being. Recently, 19 new studies from large prospective cohorts showed - with nearly perfect consistency - strong benefits of the MedDiet in reducing the risk of myocardial infarction, stroke, total mortality, heart failure, and disability. Interestingly, two large and well-conducted cohorts reported significant cardiovascular benefits after using repeated measurements of diet during a long follow-up period. In addition, Prevención con Dieta Mediterránea, the largest randomized trial of the MedDiet, recently reported benefits of this dietary pattern in preventing cognitive decline and breast cancer. In the era of evidence-based medicine, the MedDiet represents the gold standard in preventive medicine, probably because of the harmonic combination of many elements with antioxidant and anti-inflammatory properties, which overwhelms any single nutrient or food item. The whole seems more important than the sum of its parts.
BCH codes for large IC random-access memory systems
NASA Technical Reports Server (NTRS)
Lin, S.; Costello, D. J., Jr.
1983-01-01
In this report some shortened BCH codes for possible applications to large IC random-access memory systems are presented. These codes are given by their parity-check matrices. Encoding and decoding of these codes are discussed.
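The report's codes are specified by their parity-check matrices, and decoding amounts to computing a syndrome from such a matrix. As a hedged toy illustration of the idea (not the report's actual shortened BCH codes), here is single-error correction with the Hamming(7,4) code, the simplest binary BCH code:

```python
import numpy as np

# Columns of H are the binary representations of 1..7, so a nonzero
# syndrome directly encodes the (1-based) position of a single bit flip.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def correct_single_error(word):
    """Correct at most one flipped bit in a 7-bit word using H."""
    word = np.array(word) % 2
    s = H @ word % 2
    pos = 4 * int(s[0]) + 2 * int(s[1]) + int(s[2])
    if pos:                      # nonzero syndrome -> flip the faulty bit
        word[pos - 1] ^= 1
    return word

codeword = np.array([1, 0, 1, 1, 0, 1, 0])   # a valid codeword (H @ c = 0 mod 2)
assert not (H @ codeword % 2).any()
received = codeword.copy()
received[4] ^= 1                             # simulate a single memory bit flip
print("corrected ok:", (correct_single_error(received) == codeword).all())
```

Memory systems typically use stronger shortened codes of exactly this parity-check form, sized to the memory word width.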
A large-scale study of the random variability of a coding sequence: a study on the CFTR gene.
Modiano, Guido; Bombieri, Cristina; Ciminelli, Bianca Maria; Belpinati, Francesca; Giorgi, Silvia; Georges, Marie des; Scotet, Virginie; Pompei, Fiorenza; Ciccacci, Cinzia; Guittard, Caroline; Audrézet, Marie Pierre; Begnini, Angela; Toepfer, Michael; Macek, Milan; Ferec, Claude; Claustres, Mireille; Pignatti, Pier Franco
2005-02-01
Coding single nucleotide substitutions (cSNSs) have been studied on hundreds of genes using small samples (n(g) approximately 100-150 genes). In the present investigation, a large random European population sample (average n(g) approximately 1500) was studied for a single gene, the CFTR (Cystic Fibrosis Transmembrane conductance Regulator). The nonsynonymous (NS) substitutions exhibited, in accordance with previous reports, a mean probability of being polymorphic (q > 0.005), much lower than that of the synonymous (S) substitutions, but they showed a similar rate of subpolymorphic (q < 0.005) variability. This indicates that, in autosomal genes that may have harmful recessive alleles (nonduplicated genes with important functions), genetic drift overwhelms selection in the subpolymorphic range of variability, making disadvantageous alleles behave as neutral. These results imply that the majority of the subpolymorphic nonsynonymous alleles of these genes are selectively negative or even pathogenic.
Flexible sampling large-scale social networks by self-adjustable random walk
NASA Astrophysics Data System (ADS)
Xu, Xiao-Ke; Zhu, Jonathan J. H.
2016-12-01
Online social networks (OSNs) have become an increasingly attractive gold mine for academic and commercial researchers. However, research on OSNs faces a number of difficult challenges. One bottleneck lies in the massive quantity, and often the unavailability, of OSN population data. Sampling thus becomes perhaps the only feasible solution. How to draw samples that can represent the underlying OSNs has remained a formidable task, for a number of conceptual and methodological reasons. In particular, most empirically driven studies on network sampling are confined to simulated data or sub-graph data, which are fundamentally different from real, complete-graph OSNs. In the current study, we propose a flexible sampling method, called Self-Adjustable Random Walk (SARW), and test it against the population data of a real large-scale OSN. We evaluate the strengths of the sampling method in comparison with four prevailing methods: uniform, breadth-first search (BFS), random walk (RW), and revised RW (i.e., MHRW) sampling. We mix both induced-edge and external-edge information of sampled nodes in the same sampling process. Our results show that the SARW sampling method is able to generate unbiased samples of OSNs with maximal precision and minimal cost. The study is helpful for the practice of OSN research by providing a highly needed sampling tool, for the methodological development of large-scale network sampling through comparative evaluation of existing sampling methods, and for the theoretical understanding of human networks by highlighting discrepancies and contradictions between existing knowledge and assumptions about large-scale real OSN data.
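For contrast with the proposed SARW method (whose self-adjustment rule is not spelled out in the abstract), here is a hedged sketch of the classic random walk sampling baseline it is compared against; the toy graph is invented.

```python
import random

def random_walk_sample(adj, n_samples, start, rng=random):
    """Sample nodes by a simple random walk on a graph.

    adj : dict mapping node -> list of neighbors. Plain RW sampling is
    biased toward high-degree nodes, which corrections such as MHRW
    (or the paper's SARW) are designed to remove.
    """
    node, visited = start, []
    while len(visited) < n_samples:
        visited.append(node)
        node = rng.choice(adj[node])  # move to a uniformly random neighbor
    return visited

# Toy graph: hub node 0 linked to all of 1..5, which also form a ring.
adj = {0: [1, 2, 3, 4, 5]}
for i in range(1, 6):
    adj[i] = [0, 1 + i % 5, 1 + (i - 2) % 5]

sample = random_walk_sample(adj, 10_000, start=0)
print("fraction of visits to hub 0:", sample.count(0) / len(sample))
# Roughly degree(0) / total degree = 5/20 = 0.25, not 1/6: the degree bias.
```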
Quantifying Uncertainties in Land-Surface Microwave Emissivity Retrievals
NASA Technical Reports Server (NTRS)
Tian, Yudong; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Prigent, Catherine; Norouzi, Hamidreza; Aires, Filipe; Boukabara, Sid-Ahmed; Furuzawa, Fumie A.; Masunaga, Hirohiko
2013-01-01
Uncertainties in the retrievals of microwaveland-surface emissivities are quantified over two types of land surfaces: desert and tropical rainforest. Retrievals from satellite-based microwave imagers, including the Special Sensor Microwave Imager, the Tropical Rainfall Measuring Mission Microwave Imager, and the Advanced Microwave Scanning Radiometer for Earth Observing System, are studied. Our results show that there are considerable differences between the retrievals from different sensors and from different groups over these two land-surface types. In addition, the mean emissivity values show different spectral behavior across the frequencies. With the true emissivity assumed largely constant over both of the two sites throughout the study period, the differences are largely attributed to the systematic and random errors inthe retrievals. Generally, these retrievals tend to agree better at lower frequencies than at higher ones, with systematic differences ranging 1%-4% (3-12 K) over desert and 1%-7% (3-20 K) over rainforest. The random errors within each retrieval dataset are in the range of 0.5%-2% (2-6 K). In particular, at 85.5/89.0 GHz, there are very large differences between the different retrieval datasets, and within each retrieval dataset itself. Further investigation reveals that these differences are most likely caused by rain/cloud contamination, which can lead to random errors up to 10-17 K under the most severe conditions.
Sampling Large Graphs for Anticipatory Analytics
2015-05-15
Edwards, Lauren; Johnson, Luke; Milosavljevic, Maja; Gadepally, Vijay; Miller, Benjamin A.
Random area sampling [8] is a "snowball" sampling method in which a set of random seed vertices are selected and the areas around them are sampled. ... systems, greater human-in-the-loop involvement, or through complex algorithms. We are investigating the use of sampling to mitigate these challenges.
Effectiveness of Acupuncture Therapy on Stress in a Large Urban College Population.
Schroeder, Stefanie; Burnis, James; Denton, Antony; Krasnow, Aaron; Raghu, T S; Mathis, Kimberly
2017-06-01
This study is a randomized controlled clinical trial of the effectiveness of acupuncture on the perception of stress in patients who study or work on a large, urban college campus. The hypothesis was that verum acupuncture would demonstrate a significant positive impact on perceived stress as compared to sham acupuncture. This study included 111 participants with high self-reported stress levels who either studied or worked at a large, urban public university in the southwestern United States; 62 participants completed the study. The participants were randomized to a verum acupuncture or sham acupuncture group. Both groups received treatment once a week for 12 weeks. Cohen's global measure of perceived stress scale (PSS-14) was completed by each participant prior to treatment, at 6 weeks, at 12 weeks, and at 6 weeks and 12 weeks after treatment completion. While participants in both groups showed a substantial initial decrease in perceived stress scores, at 12 weeks post treatment the verum acupuncture group showed a significantly greater treatment effect than the sham acupuncture group. This study indicates that acupuncture may be successful in decreasing the perception of stress in students and staff at a large urban university, and that this effect persists for at least 3 months after the completion of treatment. Copyright © 2017. Published by Elsevier B.V.
Bourmaud, Aurelie; Soler-Michel, Patricia; Oriol, Mathieu; Regnier, Véronique; Tinquaut, Fabien; Nourissat, Alice; Bremond, Alain; Moumjid, Nora; Chauvin, Franck
2016-01-01
Controversies regarding the benefits of breast cancer screening programs have led to the promotion of new strategies taking individual preferences into account, such as decision aids. The aim of this study was to assess the impact of a decision aid leaflet on the participation of women invited to a national breast cancer screening program. This was a randomized, multicentre, controlled trial. Women aged 50 to 74 years were randomly assigned to receive either a decision aid or the usual invitation letter. The primary outcome was the participation rate 12 months after the invitation. 16,000 women were randomized and 15,844 included in the modified intention-to-treat analysis. The participation rate in the intervention group was 40.25% (3174/7885 women) compared with 42.13% (3353/7959) in the control group (p = 0.02). Previous attendance for screening (RR = 6.24; 95% CI 5.75-6.77; p < 0.0001) and medium household income (RR = 1.05; 95% CI 1.01-1.09; p = 0.0074) were independently associated with attendance for screening. This large-scale study demonstrates that the decision aid reduced the participation rate. The decision aid activated the decision-making process of women toward non-attendance at screening. These results show the importance of promoting informed patient choices, especially when those choices cannot be anticipated. PMID:26883201
Johansen, Anette; Denbæk, Anne Maj; Bonnesen, Camilla Thørring; Due, Pernille
2015-03-01
Infectious illnesses such as influenza and diarrhea are leading causes of absenteeism among Danish school children. Interventions in school settings addressing hand hygiene have been shown to reduce the number of infectious illnesses. However, most of these studies include small populations and almost none of them were conducted as randomized controlled trials. The overall aim of the Hi Five study was to develop, implement and evaluate a multi-component school-based intervention to improve hand hygiene and well-being and to reduce the prevalence of infections among school children in intervention schools by 20% compared to control schools. This paper describes the development and the evaluation design of Hi Five. The Hi Five study was designed as a three-armed cluster-randomized controlled trial. A national random sample of schools (n = 44) was randomized to one of two intervention groups (n = 29) or to a control group with no intervention (n = 15). A total of 8,438 six- to fifteen-year-old school children were enrolled in the study. The Hi Five intervention consisted of three components: 1) a curriculum component, 2) mandatory daily hand washing before lunch, and 3) extra cleaning of school toilets during the school day. Baseline data were collected from December 2011 to April 2012. The intervention period was August 2012 to June 2013. The follow-up data were collected from December 2012 to April 2013. The Hi Five study fills a gap in international research. This large randomized multi-component school-based hand hygiene intervention is the first to include education on healthy and appropriate toilet behavior as part of the curriculum. No previous studies have involved supplementary cleaning of the school toilets as an intervention component. The study will have the added value of providing new knowledge about the usability of the short message service (SMS, text message) for collecting data on infectious illness and absenteeism in large study populations. Current Controlled Trials ISRCTN19287682, 21 December 2012.
On the apparent insignificance of the randomness of flexible joints on large space truss dynamics
NASA Technical Reports Server (NTRS)
Koch, R. M.; Klosner, J. M.
1993-01-01
Deployable periodic large space structures have been shown to exhibit high dynamic sensitivity to period-breaking imperfections and uncertainties. These can be brought on by manufacturing or assembly errors, structural imperfections, as well as nonlinear and/or nonconservative joint behavior. In addition, the necessity of precise pointing and position capability can require the consideration of these usually negligible and unknown parametric uncertainties and their effect on the overall dynamic response of large space structures. This work describes the use of a new design approach for the global dynamic solution of beam-like periodic space structures possessing parametric uncertainties. Specifically, the effect of random flexible joints on the free vibrations of simply-supported periodic large space trusses is considered. The formulation is a hybrid approach in terms of an extended Timoshenko beam continuum model, Monte Carlo simulation scheme, and first-order perturbation methods. The mean and mean-square response statistics for a variety of free random vibration problems are derived for various input random joint stiffness probability distributions. The results of this effort show that, although joint flexibility has a substantial effect on the modal dynamic response of periodic large space trusses, the effect of any reasonable uncertainty or randomness associated with these joint flexibilities is insignificant.
Random Bits Forest: a Strong Classifier/Regressor for Big Data
NASA Astrophysics Data System (ADS)
Wang, Yi; Li, Yi; Pu, Weilin; Wen, Kathryn; Shugart, Yin Yao; Xiong, Momiao; Jin, Li
2016-07-01
Efficiency, memory consumption, and robustness are common problems with many popular methods for data analysis. As a solution, we present Random Bits Forest (RBF), a classification and regression algorithm that integrates neural networks (for depth), boosting (for width), and random forests (for prediction accuracy). Through a gradient boosting scheme, it first generates and selects ~10,000 small, 3-layer random neural networks. These networks are then fed into a modified random forest algorithm to obtain predictions. Testing with datasets from the UCI (University of California, Irvine) Machine Learning Repository shows that RBF outperforms other popular methods in both accuracy and robustness, especially with large datasets (N > 1000). The algorithm also performed well in testing with an independent data set, a real psoriasis genome-wide association study (GWAS).
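The description above is enough to sketch the general idea, though not the authors' implementation. The toy example below stands in for Random Bits Forest under stated simplifications: a handful of small random 3-layer tanh networks are used as fixed feature extractors (without the gradient-boosting selection step the paper describes), and their outputs are fed to an ordinary scikit-learn random forest; the dataset and all sizes are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

def random_net_features(X, n_nets=50, width=8):
    """Pass the data through small random 3-layer tanh networks and
    collect their scalar outputs as new features. The real RBF selects
    ~10,000 such networks by gradient boosting; here they are few and
    unselected, purely to illustrate the pipeline."""
    feats = []
    for _ in range(n_nets):
        W1 = rng.normal(size=(X.shape[1], width))
        W2 = rng.normal(size=(width, width))
        w3 = rng.normal(size=width)
        feats.append(np.tanh(np.tanh(X @ W1) @ W2) @ w3)
    return np.column_stack(feats)

Z = random_net_features(X)
Xtr, Xte, ytr, yte = train_test_split(Z, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
print("accuracy:", accuracy_score(yte, clf.predict(Xte)))
```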
ERIC Educational Resources Information Center
Sheridan, Susan M.; Witte, Amanda L.; Holmes, Shannon R.; Coutts, Michael J.; Dent, Amy L.; Kunz, Gina M.; Wu, ChaoRong
2017-01-01
The results of a large-scale randomized controlled trial of Conjoint Behavioral Consultation (CBC) on student outcomes and teacher-parent relationships in rural schools are presented. CBC is an indirect service delivery model that addresses concerns shared by teachers and parents about students. In the present study, the intervention was aimed at…
Grade, Stéphane; Badets, Arnaud; Pesenti, Mauro
2017-05-01
Numerical magnitude and specific grasping action processing have been shown to interfere with each other because some aspects of numerical meaning may be grounded in sensorimotor transformation mechanisms linked to finger grip control. However, how specific these interactions are to grasping actions is still unknown. The present study tested the specificity of the number-grip relationship by investigating how the observation of different closing-opening stimuli that might or might not refer to prehension-releasing actions was able to influence a random number generation task. Participants had to randomly produce numbers after they observed action stimuli representing either closure or aperture of the fingers, the hand or the mouth, or a colour change used as a control condition. Random number generation was influenced by the prior presentation of finger grip actions, whereby observing a closing finger grip led participants to produce small rather than large numbers, whereas observing an opening finger grip led them to produce large rather than small numbers. Hand actions had reduced or no influence on number production; mouth action influence was restricted to opening, with an overproduction of large numbers. Finally, colour changes did not influence number generation. These results show that some characteristics of observed finger, hand and mouth grip actions automatically prime number magnitude, with the strongest effect for finger grasping. The findings are discussed in terms of the functional and neural mechanisms shared between hand actions and number processing, but also between hand and mouth actions. The present study provides converging evidence that part of number semantics is grounded in sensorimotor mechanisms.
A topological analysis of large-scale structure, studied using the CMASS sample of SDSS-III
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parihar, Prachi; Gott, J. Richard III; Vogeley, Michael S.
2014-12-01
We study the three-dimensional genus topology of large-scale structure using the northern region of the CMASS Data Release 10 (DR10) sample of the SDSS-III Baryon Oscillation Spectroscopic Survey. We select galaxies with redshift 0.452 < z < 0.625 and with a stellar mass M_stellar > 10^11.56 M_⊙. We study the topology at two smoothing lengths: R_G = 21 h^-1 Mpc and R_G = 34 h^-1 Mpc. The genus topology studied at the R_G = 21 h^-1 Mpc scale results in the highest genus amplitude observed to date. The CMASS sample yields a genus curve that is characteristic of one produced by Gaussian random phase initial conditions. The data thus support the standard model of inflation, where random quantum fluctuations in the early universe produced Gaussian random phase initial conditions. Modest deviations in the observed genus from random phase are as expected from shot noise effects and the nonlinear evolution of structure. We suggest the use of a fitting formula motivated by perturbation theory to characterize the shift and asymmetries in the observed genus curve with a single parameter. We construct 54 mock SDSS CMASS surveys along the past light cone from the Horizon Run 3 (HR3) N-body simulations, where gravitationally bound dark matter subhalos are identified as the sites of galaxy formation. We study the genus topology of the HR3 mock surveys with the same geometry and sampling density as the observational sample and find the observed genus topology to be consistent with ΛCDM as simulated by the HR3 mock samples. We conclude that the topology of the large-scale structure in the SDSS CMASS sample is consistent with cosmological models having primordial Gaussian density fluctuations growing in accordance with general relativity to form galaxies in massive dark matter halos.
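For reference, the random-phase genus curve the abstract compares against has a standard analytic form, g(ν) ∝ (1 − ν²) exp(−ν²/2), where ν is the density threshold in units of the standard deviation of the smoothed field. A minimal sketch evaluating it; the overall amplitude is left as a free normalization, since in practice it depends on the smoothed power spectrum:

```python
import numpy as np

def gaussian_genus_curve(nu, amplitude=1.0):
    """Genus per unit volume for a Gaussian random-phase field as a
    function of the threshold nu (in units of the field's standard
    deviation). The amplitude is a free normalization here."""
    return amplitude * (1.0 - nu**2) * np.exp(-nu**2 / 2.0)

nu = np.linspace(-3, 3, 7)
print(np.round(gaussian_genus_curve(nu), 3))  # negative at |nu|>1: isolated clusters/voids
```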
Nonpharmacological treatments for patients with Parkinson's disease.
Bloem, Bastiaan R; de Vries, Nienke M; Ebersbach, Georg
2015-09-15
Since 2013, a number of studies have enhanced the literature and have guided clinicians on viable treatment interventions outside of pharmacotherapy and surgery. Thirty-three randomized controlled trials and one large observational study on exercise and physiotherapy were published in this period. Four randomized controlled trials focused on dance interventions, eight on treatment of cognition and behavior, two on occupational therapy, and two on speech and language therapy (the latter two specifically addressed dysphagia). Three randomized controlled trials focused on multidisciplinary care models, one study on telemedicine, and four studies on alternative interventions, including music therapy and mindfulness. These studies attest to the marked interest in these therapeutic approaches and the increasing evidence base that places nonpharmacological treatments firmly within the integrated repertoire of treatment options in Parkinson's disease. © 2015 International Parkinson and Movement Disorder Society.
Aortic Arch Plaques and Risk of Recurrent Stroke and Death
Di Tullio, Marco R.; Russo, Cesare; Jin, Zhezhen; Sacco, Ralph L.; Mohr, J.P.; Homma, Shunichi
2010-01-01
Background Aortic arch plaques are a risk factor for ischemic stroke. Although the stroke mechanism is conceivably thromboembolic, no randomized studies have evaluated the efficacy of antithrombotic therapies in preventing recurrent events. Methods and Results The relationship between arch plaques and recurrent events was studied in 516 patients with ischemic stroke, double-blindly randomized to treatment with warfarin or aspirin as part of the Patent Foramen Ovale in Cryptogenic Stroke Study (PICSS), based on the Warfarin-Aspirin Recurrent Stroke Study (WARSS). Plaque thickness and morphology were evaluated by transesophageal echocardiography. Endpoints were recurrent ischemic stroke or death over a 2-year follow-up. Large plaques (≥4 mm) were present in 19.6% of patients, and large complex plaques (those with ulcerations or mobile components) in 8.5%. During follow-up, large plaques were associated with a significantly increased risk of events (adjusted hazard ratio 2.12, 95% confidence interval 1.04-4.32), especially those with complex morphology (HR 2.55, CI 1.10-5.89). The risk was highest among cryptogenic stroke patients, both for large plaques (HR 6.42, CI 1.62-25.46) and large complex plaques (HR 9.50, CI 1.92-47.10). Event rates were similar in the warfarin and aspirin groups in the overall study population (16.4% vs. 15.8%; p=0.43). Conclusions In patients with stroke, and especially cryptogenic stroke, large aortic plaques remain associated with an increased risk of recurrent stroke and death at two years despite treatment with warfarin or aspirin. Complex plaque morphology confers a slight additional increase in risk. PMID:19380621
van den Broek, Frank J C; de Graaf, Eelco J R; Dijkgraaf, Marcel G W; Reitsma, Johannes B; Haringsma, Jelle; Timmer, Robin; Weusten, Bas L A M; Gerhards, Michael F; Consten, Esther C J; Schwartz, Matthijs P; Boom, Maarten J; Derksen, Erik J; Bijnen, A Bart; Davids, Paul H P; Hoff, Christiaan; van Dullemen, Hendrik M; Heine, G Dimitri N; van der Linde, Klaas; Jansen, Jeroen M; Mallant-Hent, Rosalie C H; Breumelhof, Ronald; Geldof, Han; Hardwick, James C H; Doornebosch, Pascal G; Depla, Annekatrien C T M; Ernst, Miranda F; van Munster, Ivo P; de Hingh, Ignace H J T; Schoon, Erik J; Bemelman, Willem A; Fockens, Paul; Dekker, Evelien
2009-03-13
Recent non-randomized studies suggest that extended endoscopic mucosal resection (EMR) is equally effective in removing large rectal adenomas as transanal endoscopic microsurgery (TEM). If equally effective, EMR might be a more cost-effective approach, as this strategy does not require expensive equipment, general anesthesia and hospital admission. Furthermore, EMR appears to be associated with fewer complications. The aim of this study is to compare the cost-effectiveness and cost-utility of TEM and EMR for the resection of large rectal adenomas. Multicenter randomized trial among 15 hospitals in the Netherlands. Patients with a rectal adenoma ≥ 3 cm, located between 1-15 cm ab ano, will be randomized to a TEM- or EMR-treatment strategy. For TEM, patients will be treated under general anesthesia, adenomas will be dissected en-bloc by a full-thickness excision, and patients will be admitted to the hospital. For EMR, no or conscious sedation is used, lesions will be resected through the submucosal plane in a piecemeal fashion, and patients will be discharged from the hospital. Residual adenoma that is visible during the first surveillance endoscopy at 3 months will be removed endoscopically in both treatment strategies and is considered as part of the primary treatment. The primary outcome measure is the proportion of patients with recurrence after 3 months. Secondary outcome measures are: 1) number of days not spent in hospital from initial treatment until 2 years afterwards; 2) major and minor morbidity; 3) disease-specific and general quality of life; 4) anorectal function; 5) health care utilization and costs. A cost-effectiveness and cost-utility analysis of EMR against TEM for large rectal adenomas will be performed from a societal perspective, with the costs per recurrence-free patient and the costs per quality-adjusted life year as the respective outcome measures. Based on comparable recurrence rates for TEM and EMR of 3.3% and considering an upper limit of 10% for EMR to be non-inferior (beta-error 0.2 and one-sided alpha-error 0.05), 89 patients are needed per group. The TREND study is the first randomized trial evaluating whether TEM or EMR is more cost-effective for the treatment of large rectal adenomas. (trialregister.nl) NTR1422.
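The stated sample size can be checked, at least approximately, with the usual normal-approximation formula for non-inferiority of two proportions. The sketch below is an assumption about which formula the authors used; it takes the common recurrence rate (3.3%), the 10% non-inferiority limit, and the stated error rates from the abstract and arrives at roughly the reported 89 patients per group.

```python
from math import ceil
from scipy.stats import norm

p_tem = p_emr = 0.033      # assumed common recurrence rate (3.3%)
margin = 0.10 - 0.033      # non-inferiority margin implied by the 10% upper limit
alpha, beta = 0.05, 0.20   # one-sided alpha and beta from the abstract

z_a, z_b = norm.ppf(1 - alpha), norm.ppf(1 - beta)
variance = p_tem * (1 - p_tem) + p_emr * (1 - p_emr)
n = (z_a + z_b) ** 2 * variance / margin ** 2
print(ceil(n), "patients per group")  # ~88, close to the 89 reported
```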
Kheur, Mohit G; Kheur, Supriya; Lakha, Tabrez; Jambhekar, Shantanu; Le, Bach; Jain, Vinay
2018-04-01
The absence of an adequate volume of bone at implant sites requires augmentation procedures before the placement of implants. The aim of the present study was to assess the ridge width gain with the use of allografts and biphasic β-tricalcium phosphate with hydroxyapatite (alloplast) in ridge split procedures, when each was used in small (0.25 to 1 mm) and large (1 to 2 mm) particle sizes. A randomized controlled trial of 23 subjects with severe atrophy of the mandible in the horizontal dimension was conducted in a private institute. The patients underwent placement of 49 dental implants after a staged ridge split procedure. The patients were randomly allocated to alloplast and allograft groups (predictor variable). In each group, the patients were randomly assigned to either small graft particle or large graft particle size (predictor variable). The gain in ridge width (outcome variable) was assessed before implant placement. A 2-way analysis of variance test and the Student unpaired t test were used for evaluation of the ridge width gain between the allograft and alloplast groups (predictor variable). Differences were considered significant if P values were < .05. The sample included 23 patients (14 men and 9 women). The patients were randomly allocated to the alloplast (n = 11) or allograft (n = 12) group before the ridge split procedure. In each group, they were assigned to a small graft particle or large graft particle size (alloplast group, small particle in 5 and large particle size in 6 patients; allograft group, small particle in 6 and large particle size in 6). A statistically significant difference was observed between the 2 graft types. The average ridge width gain was significantly greater in the alloplast group (large, 4.40 ± 0.24 mm; small, 3.52 ± 0.59 mm) than in the allograft group (large, 3.82 ± 0.19 mm; small, 2.57 ± 0.16 mm). For both graft types (alloplast and allograft), the large particle size graft resulted in a greater ridge width gain compared with the small particle size graft (P < .05). Within the limitations of the present study, we suggest the use of large particle alloplast as the graft material of choice for staged ridge split procedures in the posterior mandible. Copyright © 2017 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
A Multicenter, Randomized, Controlled Trial of Osteopathic Manipulative Treatment on Preterms
Cerritelli, Francesco; Pizzolorusso, Gianfranco; Renzetti, Cinzia; Cozzolino, Vincenzo; D’Orazio, Marianna; Lupacchini, Mariacristina; Marinelli, Benedetta; Accorsi, Alessandro; Lucci, Chiara; Lancellotti, Jenny; Ballabio, Silvia; Castelli, Carola; Molteni, Daniela; Besana, Roberto; Tubaldi, Lucia; Perri, Francesco Paolo; Fusilli, Paola; D’Incecco, Carmine; Barlafante, Gina
2015-01-01
Background Despite some preliminary evidence, it is still largely unknown whether osteopathic manipulative treatment improves preterm clinical outcomes. Materials and Methods The present multi-center randomized single-blind parallel-group clinical trial enrolled newborns who met the criteria for gestational age between 29 and 37 weeks, without any congenital complication, from 3 different public neonatal intensive care units. Preterm infants were randomly assigned to usual prenatal care (control group) or osteopathic manipulative treatment (study group). The primary outcome was the mean difference in length of hospital stay between groups. Results A total of 695 newborns were randomly assigned to either the study group (n=352) or the control group (n=343). A statistically significant difference was observed between the two groups for the primary outcome (13.8 and 17.5 days for the study and control group, respectively; p<0.001; effect size: 0.31). Multivariate analysis showed a reduction of the length of stay of 3.9 days (95% CI -5.5 to -2.3, p<0.001). Furthermore, there were significant reductions with treatment as compared to usual care in cost (difference between study and control group: 1,586.01€; 95% CI 1,087.18 to 6,277.28; p<0.001) but not in daily weight gain. There were no complications associated with the intervention. Conclusions Osteopathic treatment significantly reduced the number of days of hospitalization and is cost-effective in a large cohort of preterm infants. PMID:25974071
Doidge, James C
2018-02-01
Population-based cohort studies are invaluable to health research because of the breadth of data collection over time and the representativeness of their samples. However, they are especially prone to missing data, which can compromise the validity of analyses when data are not missing at random. Having many waves of data collection presents an opportunity for participants' responsiveness to be observed over time, which may be informative about missing data mechanisms and thus useful as an auxiliary variable. Modern approaches to handling missing data, such as multiple imputation and maximum likelihood, can be difficult to implement with the large numbers of auxiliary variables and large amounts of non-monotone missing data that occur in cohort studies. Inverse probability-weighting can be easier to implement, but conventional wisdom has stated that it cannot be applied to non-monotone missing data. This paper describes two methods of applying inverse probability-weighting to non-monotone missing data and explores the potential value of including measures of responsiveness in either inverse probability-weighting or multiple imputation. Simulation studies are used to compare methods and demonstrate that responsiveness in longitudinal studies can be used to mitigate bias induced by missing data, even when data are not missing at random.
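As a rough sketch of how responsiveness might enter an inverse probability-weighted analysis (this illustrates the general mechanics, not the paper's estimator; the variable names and data-generating process are invented):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000

# Simulated cohort (all names hypothetical): 'responsiveness' counts
# completed prior waves; here it is correlated with both the outcome
# and the chance of responding at the current wave, which is what
# makes it a useful auxiliary variable.
responsiveness = rng.integers(0, 6, size=n)
x = rng.normal(size=n)
y = 2.0 * x + 0.3 * responsiveness + rng.normal(size=n)
p_obs = 1 / (1 + np.exp(-(-1.5 + 0.6 * responsiveness)))
observed = rng.random(n) < p_obs

# Step 1: model the probability of responding from responsiveness.
ps = sm.Logit(observed.astype(float), sm.add_constant(responsiveness)).fit(disp=0)
w = 1.0 / ps.predict(sm.add_constant(responsiveness))

# Step 2: weight the complete cases by the inverse of that probability.
X = sm.add_constant(x[observed])
fit = sm.WLS(y[observed], X, weights=w[observed]).fit()
print(fit.params)  # intercept and slope on the reweighted sample
```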
Random variability explains apparent global clustering of large earthquakes
Michael, A.J.
2011-01-01
The occurrence of 5 Mw ≥ 8.5 earthquakes since 2004 has created a debate over whether or not we are in a global cluster of large earthquakes, temporarily raising risks above long-term levels. I use three classes of statistical tests to determine if the record of M ≥ 7 earthquakes since 1900 can reject a null hypothesis of independent random events with a constant rate plus localized aftershock sequences. The data cannot reject this null hypothesis. Thus, the temporal distribution of large global earthquakes is well-described by a random process, plus localized aftershocks, and apparent clustering is due to random variability. Therefore the risk of future events has not increased, except within ongoing aftershock sequences, and should be estimated from the longest possible record of events.
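In the same spirit as the simpler of these tests, a dispersion (variance-to-mean) statistic for annual earthquake counts can be compared against a Monte Carlo null distribution generated from a constant-rate Poisson process. The annual counts below are invented for illustration, not the real catalog:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical annual counts of great earthquakes over ~110 years
# (invented for illustration; a real analysis would use the catalog
# with aftershocks handled separately, as the paper does).
counts = rng.poisson(0.7, size=110)

# Dispersion statistic: variance/mean ratio; ~1 under a constant-rate
# Poisson process, >1 if events cluster in time.
obs_stat = counts.var(ddof=1) / counts.mean()

# Monte Carlo null distribution under the Poisson hypothesis.
sims = rng.poisson(counts.mean(), size=(10000, counts.size))
null_stats = sims.var(axis=1, ddof=1) / sims.mean(axis=1)
p_value = (null_stats >= obs_stat).mean()
print(f"dispersion={obs_stat:.2f}, one-sided p={p_value:.3f}")
```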
Epidemiologic methods in clinical trials.
Rothman, K J
1977-04-01
Epidemiologic methods developed to control confounding in non-experimental studies are equally applicable to experiments. In experiments, most confounding is usually controlled by random allocation of subjects to treatment groups, but randomization does not preclude confounding except in extremely large studies, the degree of confounding expected being inversely related to the size of the treatment groups. In experiments, as in non-experimental studies, the extent of confounding for each risk indicator should be assessed and, if sufficiently large, controlled. Confounding is properly assessed by comparing the unconfounded effect estimate to the crude effect estimate; a common error is to assess confounding by statistical tests of significance. Assessment of confounding involves its control as a prerequisite. Control is most readily and cogently achieved by stratification of the data, though with many factors to control simultaneously, multivariate analysis or a combination of multivariate analysis and stratification might be necessary.
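A small worked example of the recommended practice, with invented stratified 2x2 counts: confounding is judged by the gap between the crude risk ratio and a stratum-adjusted (here Mantel-Haenszel) risk ratio, not by a significance test.

```python
# Toy stratified data (invented counts): each stratum is
# (exposed cases, exposed total, unexposed cases, unexposed total).
strata = [
    (40, 100, 10, 50),   # stratum 1, e.g. older subjects
    (5, 50, 10, 100),    # stratum 2, e.g. younger subjects
]

# Crude risk ratio: collapse over strata.
a = sum(s[0] for s in strata); n1 = sum(s[1] for s in strata)
c = sum(s[2] for s in strata); n0 = sum(s[3] for s in strata)
rr_crude = (a / n1) / (c / n0)

# Mantel-Haenszel risk ratio: pool stratum-specific information.
num = sum(a_i * n0_i / (n1_i + n0_i) for a_i, n1_i, c_i, n0_i in strata)
den = sum(c_i * n1_i / (n1_i + n0_i) for a_i, n1_i, c_i, n0_i in strata)
rr_mh = num / den

# A large gap between the two estimates signals confounding.
print(f"crude RR = {rr_crude:.2f}, Mantel-Haenszel RR = {rr_mh:.2f}")
```

With these counts the crude risk ratio is 2.25 while the adjusted estimate is about 1.67, so the stratifying factor confounds the crude comparison even though no significance test was involved.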
On the number of infinite geodesics and ground states in disordered systems
NASA Astrophysics Data System (ADS)
Wehr, Jan
1997-04-01
We study first-passage percolation models and their higher dimensional analogs—models of surfaces with random weights. We prove that under very general conditions the number of lines or, in the second case, hypersurfaces which locally minimize the sum of the random weights is with probability one equal to 0 or with probability one equal to +∞. As corollaries we show that in any dimension d≥2 the number of ground states of an Ising ferromagnet with random coupling constants equals (with probability one) 2 or +∞. Proofs employ simple large-deviation estimates and ergodic arguments.
Evaluation of Bayesian Sequential Proportion Estimation Using Analyst Labels
NASA Technical Reports Server (NTRS)
Lennington, R. K.; Abotteen, K. M. (Principal Investigator)
1980-01-01
The author has identified the following significant results. A total of ten Large Area Crop Inventory Experiment Phase 3 blind sites and analyst-interpreter labels were used in a study to compare proportion estimates obtained by the Bayes sequential procedure with estimates obtained from simple random sampling and from Procedure 1. The analyst error rate using the Bayes technique was shown to be no greater than that for simple random sampling. Also, the segment proportion estimates produced using this technique had smaller bias and mean squared errors than the estimates produced using either simple random sampling or Procedure 1.
Projection correlation between two random vectors.
Zhu, Liping; Xu, Kai; Li, Runze; Zhong, Wei
2017-12-01
We propose the use of projection correlation to characterize dependence between two random vectors. Projection correlation has several appealing properties. It equals zero if and only if the two random vectors are independent, it is not sensitive to the dimensions of the two random vectors, it is invariant with respect to the group of orthogonal transformations, and its estimation is free of tuning parameters and does not require moment conditions on the random vectors. We show that the sample estimate of the projection correlation is n-consistent if the two random vectors are independent and root-n-consistent otherwise. Monte Carlo simulation studies indicate that the projection correlation has higher power than the distance correlation and the ranks of distances in tests of independence, especially when the dimensions are relatively large or the moment conditions required by the distance correlation are violated.
Antioxidant supplements and mortality.
Bjelakovic, Goran; Nikolova, Dimitrinka; Gluud, Christian
2014-01-01
Oxidative damage to cells and tissues is considered involved in the aging process and in the development of chronic diseases in humans, including cancer and cardiovascular diseases, the leading causes of death in high-income countries. This has stimulated interest in the preventive potential of antioxidant supplements. Today, more than one half of adults in high-income countries ingest antioxidant supplements hoping to improve their health, oppose unhealthy behaviors, and counteract the ravages of aging. Older observational studies and some randomized clinical trials with high risks of systematic errors ('bias') have suggested that antioxidant supplements may improve health and prolong life. A number of randomized clinical trials with adequate methodologies observed neutral or negative results of antioxidant supplements. Recently completed large randomized clinical trials with low risks of bias and systematic reviews of randomized clinical trials taking systematic errors ('bias') and risks of random errors ('play of chance') into account have shown that antioxidant supplements do not seem to prevent cancer, cardiovascular diseases, or death. Even more, beta-carotene, vitamin A, and vitamin E may increase mortality. Some recent large observational studies now support these findings. According to recent dietary guidelines, there is no evidence to support the use of antioxidant supplements in the primary prevention of chronic diseases or mortality. Antioxidant supplements do not possess preventive effects and may be harmful with unwanted consequences to our health, especially in well-nourished populations. The optimal source of antioxidants seems to come from our diet, not from antioxidant supplements in pills or tablets.
Moerbeek, Mirjam; van Schie, Sander
2016-07-11
The number of clusters in a cluster randomized trial is often low. It is therefore likely that random assignment of clusters to treatment conditions results in covariate imbalance. There are no studies that quantify the consequences of covariate imbalance in cluster randomized trials on parameter and standard error bias and on power to detect treatment effects. The consequences of covariate imbalance in unadjusted and adjusted linear mixed models are investigated by means of a simulation study. The factors in this study are the degree of imbalance, the covariate effect size, the cluster size and the intraclass correlation coefficient. The covariate is binary and measured at the cluster level; the outcome is continuous and measured at the individual level. The results show covariate imbalance results in negligible parameter bias and small standard error bias in adjusted linear mixed models. Ignoring the possibility of covariate imbalance while calculating the sample size at the cluster level may result in a loss in power of at most 25% in the adjusted linear mixed model. The results are more severe for the unadjusted linear mixed model: parameter biases up to 100% and standard error biases up to 200% may be observed. Power levels based on the unadjusted linear mixed model are often too low. The consequences are most severe for large clusters and/or small intraclass correlation coefficients, since then the required number of clusters to achieve a desired power level is smallest. The possibility of covariate imbalance should be taken into account while calculating the sample size of a cluster randomized trial. Otherwise, more sophisticated methods to randomize clusters to treatments should be used, such as stratification or balance algorithms. All relevant covariates should be carefully identified, be actually measured and included in the statistical model to avoid severe levels of parameter and standard error bias and insufficient power levels.
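A stripped-down version of this kind of simulation fits in a few lines. The sketch below (all parameter values are illustrative, and the fitting uses statsmodels rather than whatever software the authors used) generates clusters whose binary cluster-level covariate is imbalanced across arms and compares the treatment estimate from unadjusted and adjusted linear mixed models:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_clusters, cluster_size = 20, 30
treat_effect, cov_effect, icc = 0.3, 0.5, 0.1  # illustrative values

rows = []
for c in range(n_clusters):
    arm = int(c < n_clusters // 2)
    # Imbalance: the binary cluster-level covariate is more common
    # in the treatment arm than in the control arm.
    covariate = int(rng.random() < (0.7 if arm else 0.3))
    u = rng.normal(scale=np.sqrt(icc))            # cluster random effect
    e = rng.normal(scale=np.sqrt(1 - icc), size=cluster_size)
    y = treat_effect * arm + cov_effect * covariate + u + e
    rows += [{"y": v, "arm": arm, "cov": covariate, "cluster": c} for v in y]

df = pd.DataFrame(rows)
for formula in ["y ~ arm", "y ~ arm + cov"]:      # unadjusted vs adjusted
    m = smf.mixedlm(formula, df, groups=df["cluster"]).fit()
    print(formula, "-> arm estimate:", round(m.params["arm"], 3))
```

Repeating the loop over many simulated datasets and tabulating the two arm estimates reproduces, in miniature, the bias comparison the abstract describes.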
Bruintjes, Moira H D; Braat, Andries E; Dahan, Albert; Scheffer, Gert-Jan; Hilbrands, Luuk B; d'Ancona, Frank C H; Donders, Rogier A R T; van Laarhoven, Cornelis J H M; Warlé, Michiel C
2017-03-04
Postoperative recovery after live donor nephrectomy is largely determined by the consequences of postoperative pain and analgesic consumption. The use of deep neuromuscular blockade has been shown to reduce postoperative pain scores after laparoscopic surgery. In this study, we will investigate whether deep neuromuscular blockade also improves the early quality of recovery after live donor nephrectomy. The RELAX-study is a phase IV, multicenter, double-blinded, randomized controlled trial, in which 96 patients scheduled for living donor nephrectomy will be randomized into two groups: one with deep and one with moderate neuromuscular blockade. Deep neuromuscular blockade is defined as a post-tetanic count of 1-2. Our primary outcome measurement will be the Quality of Recovery-40 questionnaire (overall score) at 24 h after extubation. This study is, to our knowledge, the first randomized study to assess the effectiveness of deep neuromuscular blockade during laparoscopic donor nephrectomy in enhancing postoperative recovery. The study findings may also be applicable to other laparoscopic procedures. clinicaltrials.gov, NCT02838134. Registered on 29 June 2016.
NASA Astrophysics Data System (ADS)
Shemer, L.; Sergeeva, A.
2009-12-01
The statistics of a random water wave field determine the probability of appearance of extremely high (freak) waves. This probability is strongly related to the spectral characteristics of the wave field. Laboratory investigation of the spatial variation of the random wave-field statistics for various initial conditions is thus of substantial practical importance. Unidirectional nonlinear random wave groups are investigated experimentally in the 300 m long Large Wave Channel (GWK) in Hannover, Germany, the biggest facility of its kind in Europe. Numerous realizations of a wave field with the prescribed frequency power spectrum, yet randomly distributed initial phases of each harmonic, were generated by a computer-controlled piston-type wavemaker. Several initial spectral shapes with identical dominant wave length but different width were considered. For each spectral shape, the total duration of sampling in all realizations was long enough to yield a sufficient sample size for reliable statistics. Throughout the experiments, an effort was made to retain the characteristic wave height value and thus the degree of nonlinearity of the wave field. Spatial evolution of numerous statistical wave field parameters (skewness, kurtosis and probability distributions) is studied using about 25 wave gauges distributed along the tank. It is found that, depending on the initial spectral shape, the frequency spectrum of the wave field may undergo significant modification in the course of its evolution along the tank; the values of all statistical wave parameters are strongly related to the local spectral width. A sample of the measured wave height probability functions (scaled by the variance of surface elevation) is plotted in Fig. 1 for the initially narrow rectangular spectrum. The results in Fig. 1 resemble findings obtained in [1] for the initial Gaussian spectral shape. The probability of large waves notably surpasses that predicted by the Rayleigh distribution and is the highest at a distance of about 100 m. Acknowledgement This study is carried out in the framework of the EC-supported project "Transnational access to large-scale tests in the Large Wave Channel (GWK) of Forschungszentrum Küste" (Contract HYDRALAB III, No. 022441). [1] L. Shemer and A. Sergeeva, J. Geophys. Res. Oceans 114, C01015 (2009). Figure 1. Variation along the tank of the measured wave height distribution for the rectangular initial spectral shape; carrier wave period T0 = 1.5 s.
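The comparison against the Rayleigh distribution mentioned above can be sketched on a synthetic narrow-band record (the real analysis uses measured GWK gauge signals; the zero-up-crossing wave definition and all signal parameters here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic narrow-band record standing in for a wave-gauge signal.
t = np.arange(0, 2000, 0.1)
eta = sum(0.05 * np.cos(2 * np.pi * f * t + rng.uniform(0, 2 * np.pi))
          for f in np.linspace(0.6, 0.72, 20))

m0 = eta.var()  # zeroth spectral moment (variance of surface elevation)

# Crude wave heights: crest-to-trough range between zero up-crossings.
up = np.flatnonzero((eta[:-1] < 0) & (eta[1:] >= 0))
heights = np.array([eta[a:b].max() - eta[a:b].min()
                    for a, b in zip(up[:-1], up[1:])])

# Rayleigh reference for a narrow-band Gaussian sea:
# P(H > h) = exp(-h^2 / (8 m0)); excess above it flags freak-wave enhancement.
for h in np.quantile(heights, [0.5, 0.9, 0.99]):
    emp = (heights > h).mean()
    ray = np.exp(-h**2 / (8 * m0))
    print(f"h={h:.3f}  empirical={emp:.3f}  Rayleigh={ray:.3f}")
```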
Distinguishability of generic quantum states
NASA Astrophysics Data System (ADS)
Puchała, Zbigniew; Pawela, Łukasz; Życzkowski, Karol
2016-06-01
Properties of random mixed states of dimension N distributed uniformly with respect to the Hilbert-Schmidt measure are investigated. We show that for large N, due to the concentration of measure, the trace distance between two random states tends to a fixed number D̃ = 1/4 + 1/π, which yields the Helstrom bound on their distinguishability. To arrive at this result, we apply free random calculus and derive the symmetrized Marchenko-Pastur distribution, which is shown to describe numerical data for the model of coupled quantum kicked tops. The asymptotic value for the root fidelity between two random states, √F = 3/4, can serve as a universal reference value for further theoretical and experimental studies. Analogous results for the quantum relative entropy and the Chernoff quantity provide other bounds on the distinguishability of both states in a multiple measurement setup due to the quantum Sanov theorem. We also study the mean entropy of coherence of random pure and mixed states and the entanglement of a generic mixed state of a bipartite system.
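The limiting trace distance is easy to check numerically. The sketch below draws pairs of Hilbert-Schmidt-distributed random density matrices and compares the mean trace distance with the predicted 1/4 + 1/π ≈ 0.568 (dimension and trial count are illustrative):

```python
import numpy as np

rng = np.random.default_rng(11)

def random_hs_state(n):
    """Random density matrix from the Hilbert-Schmidt ensemble:
    normalized G G* with G a complex Ginibre matrix."""
    g = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    rho = g @ g.conj().T
    return rho / np.trace(rho).real

n, trials = 64, 50
dists = []
for _ in range(trials):
    r1, r2 = random_hs_state(n), random_hs_state(n)
    eigs = np.linalg.eigvalsh(r1 - r2)       # Hermitian difference
    dists.append(0.5 * np.abs(eigs).sum())   # trace distance

print("mean trace distance:", np.mean(dists))
print("predicted 1/4 + 1/pi:", 0.25 + 1 / np.pi)
```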
Comparing spatial regression to random forests for large environmental data sets
Environmental data may be “large” due to number of records, number of covariates, or both. Random forests has a reputation for good predictive performance when using many covariates, whereas spatial regression, when using reduced rank methods, has a reputatio...
Kassem, M I; El-Haddad, H M
2016-10-01
To compare polypropylene mesh positioned onlay supported by omentum and/or peritoneum versus inlay implantation of polypropylene-based composite mesh in patients with complicated wide-defect ventral hernias. This was a prospective randomized study carried out on 60 patients presenting with complicated large ventral hernias in the period from January 2012 to January 2016 in the Gastrointestinal Surgery and Surgical Emergency units of the Main Alexandria University Hospital, Egypt. A large hernia was defined by an abdominal wall defect that could not be closed. Patients were divided into two groups of 30 patients according to the type of mesh used to deal with the large abdominal wall defect. The study included 38 women (63.3%) and 22 men (36.7%); their mean age was 46.5 years (range, 25-70). Complicated incisional hernia was the commonest presentation (56.7%). The operative and mesh fixation times were longer in the polypropylene group. Seven wound infections and two recurrences were encountered in the polypropylene group. Mean follow-up was 28.7 months (2-48 months). Composite mesh provided, in one session, satisfactory results in patients with complicated large ventral hernias. The procedure is safe and effective in lowering operative time with a trend toward low wound complication and recurrence rates.
Large-N-approximated field theory for multipartite entanglement
NASA Astrophysics Data System (ADS)
Facchi, P.; Florio, G.; Parisi, G.; Pascazio, S.; Scardicchio, A.
2015-12-01
We try to characterize the statistics of multipartite entanglement of the random states of an n-qubit system. Unable to solve the problem exactly, we generalize it, replacing complex numbers with real vectors with N_c components (the original problem is recovered for N_c = 2). Studying the leading diagrams in the large-N_c approximation, we unearth the presence of a phase transition and, in an explicit example, show that the so-called entanglement frustration disappears in the large-N_c limit.
Nakamura, Kazuhiko; Ihara, Eikichi; Akiho, Hirotada; Akahoshi, Kazuya; Harada, Naohiko; Ochiai, Toshiaki; Nakamura, Norimoto; Ogino, Haruei; Iwasa, Tsutomu; Aso, Akira; Iboshi, Yoichiro; Takayanagi, Ryoichi
2016-11-15
The ability of endoscopic submucosal dissection (ESD) to resect large early gastric cancers (EGCs) results in the need to treat large artificial gastric ulcers. This study assessed whether the combination therapy of rebamipide plus a proton pump inhibitor (PPI) offered benefits over PPI monotherapy. In this prospective, randomized, multicenter, open-label, and comparative study, patients who had undergone ESD for EGC or gastric adenoma were randomized into groups receiving either rabeprazole monotherapy (10 mg/day, n=64) or a combination of rabeprazole plus rebamipide (300 mg/day, n=66). The Scar stage (S stage) ratio after treatment was compared, and factors independently associated with ulcer healing were identified by using multivariate analyses. The S stage rates at 4 and 8 weeks were similar in the two groups, even in the subgroups of patients with large amounts of tissue resected and regardless of CYP2C19 genotype. Independent factors for ulcer healing were circumferential location of the tumor and resected tissue size; the type of treatment did not affect ulcer healing. Combination therapy with rebamipide and PPI had limited benefits compared with PPI monotherapy in the treatment of post-ESD gastric ulcer (UMIN Clinical Trials Registry, UMIN000007435).
Estimating the Size of a Large Network and its Communities from a Random Sample
Chen, Lin; Karbasi, Amin; Crawford, Forrest W.
2017-01-01
Most real-world networks are too large to be measured or studied directly and there is substantial interest in estimating global network properties from smaller sub-samples. One of the most important global properties is the number of vertices/nodes in the network. Estimating the number of vertices in a large network is a major challenge in computer science, epidemiology, demography, and intelligence analysis. In this paper we consider a population random graph G = (V, E) from the stochastic block model (SBM) with K communities/blocks. A sample is obtained by randomly choosing a subset W ⊆ V and letting G(W) be the induced subgraph in G of the vertices in W. In addition to G(W), we observe the total degree of each sampled vertex and its block membership. Given this partial information, we propose an efficient PopULation Size Estimation algorithm, called PULSE, that accurately estimates the size of the whole population as well as the size of each community. To support our theoretical analysis, we perform an exhaustive set of experiments to study the effects of sample size, K, and SBM model parameters on the accuracy of the estimates. The experimental results also demonstrate that PULSE significantly outperforms a widely-used method called the network scale-up estimator in a wide variety of scenarios. PMID:28867924
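The observation model described above (though not the PULSE estimator itself) is straightforward to set up, for example with networkx; the block sizes and edge probabilities below are illustrative:

```python
import networkx as nx
import numpy as np

rng = np.random.default_rng(2)

# Population graph from a stochastic block model with K = 2 blocks.
sizes = [300, 200]
probs = [[0.05, 0.01], [0.01, 0.08]]
G = nx.stochastic_block_model(sizes, probs, seed=2)

# Observation model from the abstract: a uniform random subset W of
# vertices, the induced subgraph G(W), each sampled vertex's total
# degree in the full graph G, and its block membership.
W = [int(v) for v in rng.choice(G.number_of_nodes(), size=80, replace=False)]
G_W = G.subgraph(W)
total_degree = {v: G.degree(v) for v in W}
partition = G.graph["partition"]  # list of vertex sets, one per block
block = {v: next(k for k, s in enumerate(partition) if v in s) for v in W}

print(G_W.number_of_nodes(), "sampled vertices,",
      G_W.number_of_edges(), "edges observed in G(W)")
```

An estimator in the spirit of the paper would then use only `G_W`, `total_degree`, and `block` to infer the total population size and the block sizes.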
Mediterranean diet and life expectancy; beyond olive oil, fruits and vegetables
Martinez-Gonzalez, Miguel A.; Martín-Calvo, Nerea
2018-01-01
Purpose To review the recent relevant evidence of the effects of the Mediterranean diet and lifestyle on health (2015 and the first months of 2016). Recent findings Large observational prospective epidemiological studies with adequate control of confounding and two large randomized trials support the benefits of the Mediterranean dietary pattern to increase life expectancy, reduce the risk of major chronic disease, and improve quality of life and well-being. Recently, 19 new reports from large prospective studies showed, with nearly perfect consistency, strong benefits of the Mediterranean diet to reduce the risk of myocardial infarction, stroke, total mortality, heart failure and disability. Interestingly, two large and well-conducted cohorts reported significant cardiovascular benefits after using repeated measurements of diet during a long follow-up period. Besides, PREDIMED, the largest randomized trial with the Mediterranean diet, recently reported benefits of this dietary pattern to prevent cognitive decline and breast cancer. Summary In the era of evidence-based medicine, the Mediterranean diet represents the gold standard in preventive medicine, probably due to the harmonic combination of many elements with antioxidant and anti-inflammatory properties, which overwhelm any single nutrient or food item. The whole seems more important than the sum of its parts. PMID:27552476
Ko, Mi-Hwa
2018-01-01
In this paper, based on the Rosenthal-type inequality for asymptotically negatively associated random vectors with values in [Formula: see text], we establish results on [Formula: see text]-convergence and complete convergence of the maxima of partial sums. We also obtain weak laws of large numbers for coordinatewise asymptotically negatively associated random vectors with values in [Formula: see text].
Coburn, Brian W; Cheetham, T Craig; Rashid, Nazia; Chang, John M; Levy, Gerald D; Kerimian, Artak; Low, Kimberly J; Redden, David T; Bridges, S Louis; Saag, Kenneth G; Curtis, Jeffrey R; Mikuls, Ted R
2016-01-01
Background Despite the availability of effective therapies, most gout patients achieve suboptimal treatment outcomes. Current best practices suggest gradual dose-escalation of urate lowering therapy and serial serum urate (sUA) measurement to achieve sUA < 6.0 mg/dl. However, this strategy is not routinely used. Here we present the study design rationale and development for a pharmacist-led intervention to promote sUA goal attainment. Methods To overcome barriers in achieving optimal outcomes, we planned and implemented the Randomized Evaluation of an Ambulatory Care Pharmacist-Led Intervention to Optimize Urate Lowering Pathways (RAmP-UP) study. This is a large pragmatic cluster-randomized trial designed to assess a highly automated, pharmacist-led intervention to optimize allopurinol treatment in gout. Ambulatory clinics (n=101) from a large health system were randomized to deliver either the pharmacist-led intervention or usual care to gout patients over the age of 18 years newly initiating allopurinol. All participants received educational materials and could opt-out of the study. For intervention sites, pharmacists conducted outreach primarily via an automated telephone interactive voice recognition system. The outreach, guided by a gout care algorithm developed for this study, systematically promoted adherence assessment, facilitated sUA testing, provided education, and adjusted allopurinol dosing. The primary study outcomes are achievement of sUA < 6.0 mg/dl and treatment adherence determined after one year. With follow-up ongoing, study results will be reported subsequently. Conclusion Ambulatory care pharmacists and automated calling technology represent potentially important, underutilized resources for improving health outcomes for gout patients. PMID:27449546
Randomized Trials Built on Sand: Examples from COPD, Hormone Therapy, and Cancer
Suissa, Samy
2012-01-01
The randomized controlled trial is the fundamental study design to evaluate the effectiveness of medications and receive regulatory approval. Observational studies, on the other hand, are essential to address post-marketing drug safety issues but have also been used to uncover new indications or new benefits for already marketed drugs. Hormone replacement therapy (HRT), for instance, effective for menopausal symptoms, was reported in several observational studies during the 1980s and 1990s to also significantly reduce the incidence of coronary heart disease. This claim was refuted in 2002 by the large-scale Women’s Health Initiative randomized trial. An example of a new indication for an old drug is that of metformin, an anti-diabetic medication, which is being hailed as a potential anti-cancer agent, primarily on the basis of several recent observational studies that reported impressive reductions in cancer incidence and mortality with its use. These observational studies have now sparked the conduct of large-scale randomized controlled trials currently ongoing in cancer. We show in this paper that the spectacular effects on new indications or new outcomes reported in many observational studies in chronic obstructive pulmonary disease (COPD), HRT, and cancer are the result of time-related biases, such as immortal time bias, that tend to seriously exaggerate the benefits of a drug and that eventually disappear with the proper statistical analysis. In all, while observational studies are central to assessing the effects of drugs, their proper design and analysis are essential to avoid bias. The scientific evidence on the potential beneficial effects in new indications of existing drugs will need to be more carefully assessed before embarking on long and expensive unsubstantiated trials. PMID:23908838
NASA Astrophysics Data System (ADS)
Zhang, Yu; Li, Yan; Shao, Hao; Zhong, Yaozhao; Zhang, Sai; Zhao, Zongxi
2012-06-01
Band structure and wave localization are investigated for sea surface water waves over large-scale sand wave topography. Sand wave height, sand wave width, water depth, and water width between adjacent sand waves have significant impact on band gaps. Random fluctuations of sand wave height, sand wave width, and water depth induce water wave localization. However, random water width produces a perfect transmission tunnel of water waves at a certain frequency so that localization does not occur no matter how large a disorder level is applied. Together with theoretical results, the field experimental observations in the Taiwan Bank suggest band gap and wave localization as the physical mechanism of sea surface water wave propagating over natural large-scale sand waves.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-11
..., emotional, motor and sensory) for use in large longitudinal or epidemiological studies where functioning is... of establishing comparative norms. Existing recruitment databases will be randomly sampled and... [respondent-burden table omitted; it did not survive extraction]
Investigating the Randomness of Numbers
ERIC Educational Resources Information Center
Pendleton, Kenn L.
2009-01-01
The use of random numbers is pervasive in today's world. Random numbers have practical applications in such far-flung arenas as computer simulations, cryptography, gambling, the legal system, statistical sampling, and even the war on terrorism. Evaluating the randomness of extremely large samples is a complex, intricate process. However, the…
Quantum Entanglement in Random Physical States
NASA Astrophysics Data System (ADS)
Hamma, Alioscia; Santra, Siddhartha; Zanardi, Paolo
2012-07-01
Most states in the Hilbert space are maximally entangled. This fact has proven useful to investigate—among other things—the foundations of statistical mechanics. Unfortunately, most states in the Hilbert space of a quantum many-body system are not physically accessible. We define physical ensembles of states acting on random factorized states by a circuit of length k of random and independent unitaries with local support. We study the typicality of entanglement by means of the purity of the reduced state. We find that for a time k=O(1), the typical purity obeys the area law. Thus, the upper bounds for area law are actually saturated, on average, with a variance that goes to zero for large systems. Similarly, we prove that by means of local evolution a subsystem of linear dimensions L is typically entangled with a volume law when the time scales with the size of the subsystem. Moreover, we show that for large values of k the reduced state becomes very close to the completely mixed state.
Efficient design of clinical trials and epidemiological research: is it possible?
Lauer, Michael S; Gordon, David; Wei, Gina; Pearson, Gail
2017-08-01
Randomized clinical trials and large-scale cohort studies continue to have a critical role in generating evidence in cardiovascular medicine; however, there is increasing concern that ballooning costs threaten the clinical trial enterprise. In this Perspectives article, we discuss the changing landscape of clinical research, and of clinical trials in particular, focusing on the reasons for increasing costs and inefficiencies. These reasons include excessively complex design, overly restrictive inclusion and exclusion criteria, burdensome regulations, excessive source-data verification, and concerns about the effect of clinical research conduct on workflow. Thought leaders have called on the clinical research community to consider alternative, transformative business models, including models that focus on simplicity and the leveraging of digital resources. We present some examples of innovative approaches by which investigators have successfully conducted large-scale clinical trials at relatively low cost. These examples include randomized registry trials, cluster-randomized trials, adaptive trials, and trials that are fully embedded within digital clinical care or administrative platforms.
Low-order black-box models for control system design in large power systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamwa, I.; Trudel, G.; Gerin-Lajoie, L.
1996-02-01
The paper studies two multi-input multi-output (MIMO) procedures for the identification of low-order state-space models of power systems, by probing the network in open loop with low-energy pulses or random signals. Although such data may result from actual measurements, the development assumes simulated responses from a transient stability program, hence benefiting from the existing large base of stability models. While pulse data is processed using the eigensystem realization algorithm, the analysis of random responses is done by means of subspace identification methods. On a prototype Hydro-Quebec power system, including SVCs, DC lines, series compensation, and more than 1,100 buses, it is verified that the two approaches are equivalent only when strict requirements are imposed on the pulse length and magnitude. The 10th-order equivalent models derived by random-signal probing allow for effective tuning of decentralized power system stabilizers (PSSs) able to damp both local and very slow inter-area modes.
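The eigensystem realization algorithm mentioned above has a compact single-input single-output form that conveys the core step (the paper's MIMO, power-system setting is of course far richer). A sketch under those simplifications, identifying a single decaying oscillatory mode from its simulated pulse response:

```python
import numpy as np

def era(markov, order, rows=20, cols=20):
    """Minimal SISO eigensystem realization algorithm: build Hankel
    matrices from impulse-response samples (Markov parameters) and
    realize a state-space model of the requested order via a
    truncated SVD."""
    H0 = np.array([[markov[i + j] for j in range(cols)] for i in range(rows)])
    H1 = np.array([[markov[i + j + 1] for j in range(cols)] for i in range(rows)])
    U, s, Vt = np.linalg.svd(H0)
    U, s, Vt = U[:, :order], s[:order], Vt[:order]
    S_inv_sqrt = np.diag(1 / np.sqrt(s))
    A = S_inv_sqrt @ U.T @ H1 @ Vt.T @ S_inv_sqrt   # system matrix
    B = (np.diag(np.sqrt(s)) @ Vt)[:, :1]            # input matrix
    C = (U @ np.diag(np.sqrt(s)))[:1, :]             # output matrix
    return A, B, C

# Identify a decaying oscillation from its simulated pulse response.
k = np.arange(60)
markov = np.real(0.9 ** k * np.exp(1j * 0.5 * k))   # one 2nd-order mode
A, B, C = era(markov[1:], order=2)                   # markov[0] is the D term
print("identified eigenvalues:", np.linalg.eigvals(A))
print("true eigenvalues:", 0.9 * np.exp(0.5j), 0.9 * np.exp(-0.5j))
```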
Asymptotic shape of the region visited by an Eulerian walker.
Kapri, Rajeev; Dhar, Deepak
2009-11-01
We study an Eulerian walker on a square lattice, starting from an initial randomly oriented background using Monte Carlo simulations. We present evidence that, for a large number of steps N, the asymptotic shape of the set of sites visited by the walker is a perfect circle. The radius of the circle increases as N^{1/3}, for large N, and the width of the boundary region grows as N^{α/3}, with α = 0.40 ± 0.06. If we introduce stochasticity in the evolution rules, the mean-square displacement of the walker…
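The walker's rules are simple enough to simulate in a few lines. The following Python sketch is a plausible reading of the model (clockwise rotor sequence, arrows initialized at random on first visit); the rotation convention and step counts are my assumptions, not the authors' code.

```python
import random

# Rotor-router ("Eulerian walker"): on each visit the site's arrow is rotated
# to the next direction and the walker follows the new arrow.
DIRS = [(1, 0), (0, 1), (-1, 0), (0, -1)]  # E, N, W, S

def walk(n_steps, seed=0):
    rng = random.Random(seed)
    arrows = {}          # site -> direction index, set randomly on first visit
    x, y = 0, 0
    visited = {(0, 0)}
    for _ in range(n_steps):
        d = arrows.get((x, y))
        if d is None:
            d = rng.randrange(4)      # randomly oriented background
        d = (d + 1) % 4               # rotate the rotor
        arrows[(x, y)] = d
        dx, dy = DIRS[d]
        x, y = x + dx, y + dy
        visited.add((x, y))
    return visited

for N in (10**3, 10**4, 10**5):
    sites = walk(N)
    r_max = max((x * x + y * y) ** 0.5 for x, y in sites)
    print(f"N={N:>6}  visited={len(sites):>6}  "
          f"max radius={r_max:7.1f}  N^(1/3)={N ** (1 / 3):5.1f}")
```

Comparing the maximal radius against N^(1/3) across the three runs gives a quick, rough check of the scaling claimed in the abstract.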
Update on mechanical cardiopulmonary resuscitation devices.
Rubertsson, Sten
2016-06-01
The aim of this review is to update and discuss the use of mechanical chest compression devices in treatment of cardiac arrest. Three recently published large multicenter randomized trials have not been able to show any improved outcome in adult out-of-hospital cardiac arrest patients when compared with manual chest compressions. Mechanical chest compression devices have been developed to better deliver uninterrupted chest compressions of good quality. Prospective large randomized studies have not been able to prove a better outcome compared to manual chest compressions; however, latest guidelines support their use when high-quality manual chest compressions cannot be delivered. Mechanical chest compressions can also be preferred during transportation, in the cath-lab and as a bridge to more invasive support like extracorporeal membrane oxygenation.
Inflation in random Gaussian landscapes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masoumi, Ali; Vilenkin, Alexander; Yamada, Masaki, E-mail: ali@cosmos.phy.tufts.edu, E-mail: vilenkin@cosmos.phy.tufts.edu, E-mail: Masaki.Yamada@tufts.edu
2017-05-01
We develop analytic and numerical techniques for studying the statistics of slow-roll inflation in random Gaussian landscapes. As an illustration of these techniques, we analyze small-field inflation in a one-dimensional landscape. We calculate the probability distributions for the maximal number of e-folds and for the spectral index of density fluctuations n_s and its running α_s. These distributions have a universal form, insensitive to the correlation function of the Gaussian ensemble. We outline possible extensions of our methods to a large number of fields and to models of large-field inflation. These methods do not suffer from potential inconsistencies inherent in the Brownian motion technique, which has been used in most of the earlier treatments.
Comparative effectiveness research in cancer with observational data.
Giordano, Sharon H
2015-01-01
Observational studies are increasingly being used for comparative effectiveness research. These studies can have the greatest impact when randomized trials are not feasible or when randomized studies have not included the population or outcomes of interest. However, careful attention must be paid to study design to minimize the likelihood of selection biases. Analytic techniques, such as multivariable regression modeling, propensity score analysis, and instrumental variable analysis, can also be used to help address confounding. Oncology has many existing large and clinically rich observational databases that can be used for comparative effectiveness research. With careful study design, observational studies can produce valid results to assess the benefits and harms of a treatment or intervention in representative real-world populations.
Universality in chaos: Lyapunov spectrum and random matrix theory.
Hanada, Masanori; Shimada, Hidehiko; Tezuka, Masaki
2018-02-01
We propose the existence of a new universality in classical chaotic systems when the number of degrees of freedom is large: the statistical property of the Lyapunov spectrum is described by random matrix theory. We demonstrate it by studying the finite-time Lyapunov exponents of the matrix model of a stringy black hole and the mass-deformed models. The massless limit, which has a dual string theory interpretation, is special in that the universal behavior can be seen already at t=0, while in other cases it sets in at late time. The same pattern is demonstrated also in the product of random matrices.
Kolitsopoulos, Francesca M; Strom, Brian L; Faich, Gerald; Eng, Sybil M; Kane, John M; Reynolds, Robert F
2013-03-01
Large, "practical" or streamlined trials (LSTs) are used to study the effectiveness and/or safety of medicines in real world settings with minimal study imposed interventions. While LSTs have benefits over traditional randomized clinical trials and observational studies, there are inherent challenges to their conduct. Enrollment and follow-up of a large study sample of patients with mental illness pose a particular difficulty. To assist in overcoming operational barriers in future LSTs in psychiatry, this paper describes the recruitment and observational follow-up strategies used for the ZODIAC study, an international, open-label LST, which followed 18,239 persons randomly assigned to one of two treatments indicated for schizophrenia for 1 year. ZODIAC enrolled patients in 18 countries in North America, South America, Europe, and Asia using broad study entry criteria and required minimal clinical care intervention. Recruitment of adequate numbers and continued engagement of both study centers and subjects were significant challenges. Strategies implemented to mitigate these in ZODIAC include global study expansion, study branding, field coordinator and site relations programs, monthly site newsletters, collection of alternate contact information, conduct of national death index (NDI) searches, and frequent sponsor, contract research organization (CRO) and site interaction to share best practices and address recruitment challenges quickly. We conclude that conduct of large LSTs in psychiatric patient populations is feasible, but importantly, realistic site recruitment goals and maintaining site engagement are key factors that need to be considered in early study planning and conduct. Copyright © 2012 Elsevier Inc. All rights reserved.
Disordered quivers and cold horizons
Anninos, Dionysios; Anous, Tarek; Denef, Frederik
2016-12-15
We analyze the low temperature structure of a supersymmetric quiver quantum mechanics with randomized superpotential coefficients, treating them as quenched disorder. These theories describe features of the low energy dynamics of wrapped branes, which in large number backreact into extremal black holes. We show that the low temperature theory, in the limit of a large number of bifundamentals, exhibits a time reparametrization symmetry as well as a specific heat linear in the temperature. Both these features resemble the behavior of black hole horizons in the zero temperature limit. We demonstrate similarities between the low temperature physics of the random quiver model and a theory of large N free fermions with random masses.
Sample size determination for bibliographic retrieval studies
Yao, Xiaomei; Wilczynski, Nancy L; Walter, Stephen D; Haynes, R Brian
2008-01-01
Background Research for developing search strategies to retrieve high-quality clinical journal articles from MEDLINE is expensive and time-consuming. The objective of this study was to determine the minimal number of high-quality articles in a journal subset that would need to be hand-searched to update or create new MEDLINE search strategies for treatment, diagnosis, and prognosis studies. Methods The desired width of the 95% confidence intervals (W) for the lowest sensitivity among existing search strategies was used to calculate the number of high-quality articles needed to reliably update search strategies. New search strategies were derived in journal subsets formed by 2 approaches: random sampling of journals and top journals (having the most high-quality articles). The new strategies were tested in both the original large journal database and in a low-yielding journal (having few high-quality articles) subset. Results For treatment studies, if W was 10% or less for the lowest sensitivity among our existing search strategies, a subset of 15 randomly selected journals or 2 top journals were adequate for updating search strategies, based on each approach having at least 99 high-quality articles. The new strategies derived in 15 randomly selected journals or 2 top journals performed well in the original large journal database. Nevertheless, the new search strategies developed using the random sampling approach performed better than those developed using the top journal approach in a low-yielding journal subset. For studies of diagnosis and prognosis, no journal subset had enough high-quality articles to achieve the expected W (10%). Conclusion The approach of randomly sampling a small subset of journals that includes sufficient high-quality articles is an efficient way to update or create search strategies for high-quality articles on therapy in MEDLINE. The concentrations of diagnosis and prognosis articles are too low for this approach. PMID:18823538
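The driving calculation here, how many gold-standard articles are needed so that a 95% confidence interval of width W around a sensitivity estimate is achieved, follows from the usual normal approximation. A small sketch (Python standard library only; the sensitivity values are hypothetical, not figures from the study):

```python
from math import ceil

def articles_needed(sensitivity, width, z=1.96):
    """Number of high-quality ('gold standard') articles needed so that the
    95% CI for a search strategy's sensitivity has total width <= `width`.
    Solves width = 2 * z * sqrt(p * (1 - p) / n) for n (normal approximation)."""
    p = sensitivity
    return ceil((2 * z) ** 2 * p * (1 - p) / width ** 2)

# Illustration with hypothetical lowest sensitivities:
for p in (0.80, 0.90, 0.95):
    print(f"lowest sensitivity {p:.2f}: "
          f"need >= {articles_needed(p, 0.10)} high-quality articles")
```

The required count shrinks as the lowest sensitivity moves away from 0.5, which is consistent with a journal subset of roughly a hundred high-quality treatment articles sufficing for W = 10%.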
Maddison, Ralph; Foley, Louise; Ni Mhurchu, Cliona; Jull, Andrew; Jiang, Yannan; Prapavessis, Harry; Rodgers, Anthony; Vander Hoorn, Stephen; Hohepa, Maea; Schaaf, David
2009-01-01
Background Childhood obesity has reached epidemic proportions in developed countries. Sedentary screen-based activities such as video gaming are thought to displace active behaviors and are independently associated with obesity. Active video games, where players physically interact with images onscreen, may have utility as a novel intervention to increase physical activity and improve body composition in children. The aim of the Electronic Games to Aid Motivation to Exercise (eGAME) study is to determine the effects of an active video game intervention over 6 months on: body mass index (BMI), percent body fat, waist circumference, cardio-respiratory fitness, and physical activity levels in overweight children. Methods/Design Three hundred and thirty participants aged 10–14 years will be randomized to receive either an active video game upgrade package or to a control group (no intervention). Discussion An overview of the eGAME study is presented, providing an example of a large, pragmatic randomized controlled trial in a community setting. Reflection is offered on key issues encountered during the course of the study. In particular, investigation into the feasibility of the proposed intervention, as well as robust testing of proposed study procedures is a critical step prior to implementation of a large-scale trial. Trial registration Australian New Zealand Clinical Trials Registry ACTRN12607000632493 PMID:19450288
European Collaboration on Low-dose Aspirin in Polycythemia Vera (ECLAP): a randomized trial.
Landolfi, R; Marchioli, R
1997-01-01
Thrombotic complications characterize the clinical course of polycythemia vera (PV) and represent the main cause of morbidity and mortality. However, uncertainty still exists as to the benefit/risk ratio of aspirin prophylaxis in this setting. In vivo platelet biosynthesis of thromboxane A2 is enhanced and can be suppressed by low-dose aspirin in PV, thus providing a rationale for assessing the efficacy and safety of a low-dose aspirin regimen in these patients. The Gruppo Italiano Studio Policitemia Vera has recently performed a pilot study on 112 patients randomized to receive aspirin, 40 mg daily, or placebo and followed for 16 ± 6 months (mean ± SD). This study showed that low-dose aspirin is well tolerated in PV patients, and that a large-scale efficacy trial is feasible in this setting. In this article we report the protocol of the European Collaboration on Low-dose Aspirin in Polycythemia Vera (ECLAP) study, which is a randomized trial designed to assess the risk/benefit ratio of low-dose aspirin in PV. To estimate the size and the follow-up duration required for the ECLAP trial, a retrospective analysis of the clinical epidemiology of a large PV population has recently been completed by the Gruppo Italiano Studio Policitemia Vera. On this basis, approximately 3500 patients will be enrolled in the ECLAP study with a follow-up of 3 to 4 years. The uncertainty principle will be used as the main eligibility criterion: Polycythemic patients of any age, having no clear indication for or contraindication to aspirin treatment, will be randomized in a double-blind fashion to receive oral aspirin (100 mg daily) or placebo. According to current therapeutic recommendations, the basic treatment of randomized patients should be aimed at maintaining the hematocrit value ≤ 45% in subjects aged ≤ 50, and hematocrit < 45% as well as platelet count < 400 × 10^9/L in patients aged > 50. Randomization will be stratified by participating center. The study is funded by the European Union BIOMED 2 program.
NASA Astrophysics Data System (ADS)
Du, Shihong; Zhang, Fangli; Zhang, Xiuyuan
2015-07-01
While most existing studies have focused on extracting geometric information on buildings, only a few have concentrated on semantic information. The lack of semantic information cannot satisfy many demands on resolving environmental and social issues. This study presents an approach to semantically classify buildings into much finer categories than those of existing studies by learning a random forest (RF) classifier from a large number of imbalanced samples with high-dimensional features. First, a two-level segmentation mechanism combining GIS data and a VHR image produces single image objects at a large scale and intra-object components at a small scale. Second, a semi-supervised method chooses a large number of unbiased samples by considering the spatial proximity and intra-cluster similarity of buildings. Third, two important improvements in the RF classifier are made: a voting-distribution ranked rule for reducing the influence of imbalanced samples on classification accuracy, and a feature importance measurement for evaluating each feature's contribution to the recognition of each category. Fourth, the semantic classification of urban buildings is practically conducted in Beijing city, and the results demonstrate that the proposed approach is effective and accurate. The seven categories used in the study are finer than those in existing work and more helpful to studying many environmental and social problems.
Shah, Parag K; Narendran, V; Kalpana, N
2011-01-01
To compare structural and functional outcome and time efficiency between standard spot-sized conventional pulsed mode diode laser and continuous mode large spot transpupillary thermotherapy (LS TTT) for treatment of high-risk prethreshold retinopathy of prematurity (ROP). Ten eyes of five preterm babies having bilateral symmetrical high-risk prethreshold ROP were included in this study. One eye of each baby was randomized to receive either standard spot-sized conventional pulsed mode diode laser or continuous mode LS TTT. There was no significant difference in structural or functional outcome between the two groups. The mean time taken for conventional diode laser was 20.07 minutes, while that for LS TTT was 12.3 minutes. LS TTT was 40% more time efficient than the conventional laser. It may be better suited for the very small fragile premature infants as it is quicker than the conventional laser.
Kröger, Christoph; Kliem, Sören; Zimmermann, Peter; Kowalski, Jens
2018-04-01
This study examines the short-term effectiveness of a relationship education program designed for military couples. Distressed couples were randomly placed in either a wait-list control group or an intervention group. We conducted training sessions before a 3-month foreign assignment, and refresher courses approximately 6-week post-assignment. We analyzed the dyadic data of 32 couples, using hierarchical linear modeling in a two-level model. Reduction in unresolved conflicts was found in the intervention group, with large pre-post effects for both partners. Relationship satisfaction scores were improved, with moderate-to-large effects only for soldiers, rather than their partners. Post-follow-up effect sizes suggested further improvement in the intervention group. Future research should examine the long-term effectiveness of this treatment. © 2017 American Association for Marriage and Family Therapy.
ERIC Educational Resources Information Center
Andrade, Jeanette; Huang, Wen-Hao David; Bohn, Dawn M.
2015-01-01
The effective design of course materials is critical for student learning, especially for large lecture introductory courses. This quantitative study was designed to explore the effect multimedia and content difficulty has on students' cognitive load and learning outcomes. College students (n = 268) were randomized into 1 of 3 multimedia groups:…
Heidema, A Geert; Boer, Jolanda M A; Nagelkerke, Nico; Mariman, Edwin C M; van der A, Daphne L; Feskens, Edith J M
2006-04-21
Genetic epidemiologists have taken the challenge to identify genetic polymorphisms involved in the development of diseases. Many have collected data on large numbers of genetic markers but are not familiar with available methods to assess their association with complex diseases. Statistical methods have been developed for analyzing the relation of large numbers of genetic and environmental predictors to disease or disease-related variables in genetic association studies. In this commentary we discuss logistic regression analysis; neural networks, including the parameter decreasing method (PDM) and genetic programming optimized neural networks (GPNN); and several non-parametric methods, which include the set association approach, the combinatorial partitioning method (CPM), the restricted partitioning method (RPM), the multifactor dimensionality reduction (MDR) method and the random forests approach. The relative strengths and weaknesses of these methods are highlighted. Logistic regression and neural networks can handle only a limited number of predictor variables, depending on the number of observations in the dataset. Therefore, they are less useful than the non-parametric methods for association studies with large numbers of predictor variables. GPNN, on the other hand, may be a useful approach to select and model important predictors, but its performance in selecting the important effects in the presence of large numbers of predictors needs to be examined. Both the set association approach and the random forests approach are able to handle a large number of predictors and are useful in reducing these predictors to a subset with an important contribution to disease. The combinatorial methods give more insight into combination patterns for sets of genetic and/or environmental predictor variables that may be related to the outcome variable. As the non-parametric methods have different strengths and weaknesses, we conclude that to approach genetic association studies using the case-control design, the application of a combination of several methods, including the set association approach, MDR and the random forests approach, will likely be a useful strategy to find the important genes and interaction patterns involved in complex diseases.
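As a concrete illustration of the random forests approach named above, the sketch below (Python with scikit-learn, assumed available) ranks candidate SNPs by variable importance on synthetic case-control data; the genotype coding, effect sizes, and causal SNPs are invented for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n, p = 500, 100                      # many candidate SNPs, modest sample size
X = rng.integers(0, 3, size=(n, p))  # genotypes: 0/1/2 copies of the minor allele
# Hypothetical model: risk driven by SNP 3, SNP 7, and their interaction
logit = 0.8 * X[:, 3] + 0.8 * X[:, 7] + 0.6 * (X[:, 3] * X[:, 7] > 2) - 2.0
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
top = np.argsort(rf.feature_importances_)[::-1][:5]
print("top-ranked predictors:", top)  # SNPs 3 and 7 should dominate
```

This is exactly the dimension-reduction use described in the commentary: the forest handles all 100 predictors at once and surfaces a small subset worth follow-up.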
A random effects meta-analysis model with Box-Cox transformation.
Yamaguchi, Yusuke; Maruo, Kazushi; Partlett, Christopher; Riley, Richard D
2017-07-19
In a random effects meta-analysis model, true treatment effects for each study are routinely assumed to follow a normal distribution. However, normality is a restrictive assumption and the misspecification of the random effects distribution may result in a misleading estimate of the overall mean for the treatment effect, an inappropriate quantification of heterogeneity across studies and a wrongly symmetric prediction interval. We focus on problems caused by an inappropriate normality assumption of the random effects distribution, and propose a novel random effects meta-analysis model where a Box-Cox transformation is applied to the observed treatment effect estimates. The proposed model aims to normalise the overall distribution of observed treatment effect estimates, which is the sum of the within-study sampling distributions and the random effects distribution. When sampling distributions are approximately normal, non-normality in the overall distribution will be mainly due to the random effects distribution, especially when the between-study variation is large relative to the within-study variation. The Box-Cox transformation addresses this flexibly according to the observed departure from normality. We use a Bayesian approach for estimating parameters in the proposed model, and suggest summarising the meta-analysis results by an overall median, an interquartile range and a prediction interval. The model can be applied for any kind of variables once the treatment effect estimate is defined from the variable. A simulation study suggested that when the overall distribution of treatment effect estimates is skewed, the overall mean and conventional I² from the normal random effects model could be inappropriate summaries, and the proposed model helped reduce this issue. We illustrated the proposed model using two examples, which revealed some important differences on summary results, heterogeneity measures and prediction intervals from the normal random effects model. The random effects meta-analysis with the Box-Cox transformation may be an important tool for examining robustness of traditional meta-analysis results against skewness in the observed treatment effect estimates. Further critical evaluation of the method is needed.
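To make the transformation step concrete, here is a hedged sketch in Python/SciPy: maximum-likelihood Box-Cox normalisation of simulated positive study-level effect estimates, followed by a median and prediction-interval summary on the back-transformed scale. It deliberately ignores within-study variances and the Bayesian machinery of the proposed model, so it illustrates only the core idea.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Simulated skewed study-level effect estimates (ratio-type effects, all > 0)
theta = rng.lognormal(mean=0.3, sigma=0.6, size=25)

z, lam = stats.boxcox(theta)            # ML estimate of the transformation
print(f"estimated lambda = {lam:.2f}")  # near 0 here, since the data are lognormal

# Summaries on the (approximately normal) transformed scale, back-transformed;
# quantile-type summaries commute with the monotone Box-Cox transform
mu, sd = z.mean(), z.std(ddof=1)
def inv(y):
    return np.exp(y) if abs(lam) < 1e-8 else (lam * y + 1) ** (1 / lam)
print("overall median ~", round(float(inv(mu)), 3))
print("~95% prediction interval:",
      [round(float(inv(mu + q * sd)), 3) for q in (-1.96, 1.96)])
```

Because the back-transform is monotone, the resulting prediction interval is naturally asymmetric on the original scale, which is the point of the method.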
Evolution in a Test Tube: Exploring the Structure and Function of RNA Probes
2008-05-02
Bartel, D.P. and Szostak, J.W. (1993) Isolation of New Ribozymes from a Large Pool of Random Sequences. Science, New Series 261, 1411-1418.
Shahbazkhani, Bijan; Sadeghi, Amirsaeid; Malekzadeh, Reza; Khatavi, Fatima; Etemadi, Mehrnoosh; Kalantri, Ebrahim; Rostami-Nejad, Mohammad; Rostami, Kamran
2015-06-05
Several studies have shown that a large number of patients who fulfil the criteria for irritable bowel syndrome (IBS) are sensitive to gluten. The aim of this study was to evaluate the effect of a gluten-free diet on gastrointestinal symptoms in patients with IBS. In this double-blind, randomized, placebo-controlled trial, 148 IBS patients fulfilling the Rome III criteria were enrolled between 2011 and 2013. However, only 72 out of the 148 commenced on a gluten-free diet for up to six weeks and completed the study; clinical symptoms were recorded biweekly using a standard visual analogue scale (VAS). In the second stage, after six weeks, patients whose symptoms had improved to an acceptable level were randomly divided into two groups: patients received packages containing either powdered gluten (35 cases) or placebo (gluten-free powder; 37 cases). Overall, symptomatic improvement differed significantly between the groups: 9 patients (25.7%) in the gluten-containing group versus 31 patients (83.8%) in the placebo group (p < 0.001). A large number of patients labelled as having irritable bowel syndrome are sensitive to gluten. Using the term IBS can therefore be misleading and may delay the application of an effective and well-targeted treatment strategy in gluten-sensitive patients.
NASA Technical Reports Server (NTRS)
Ponomarev, A. L.; Brenner, D.; Hlatky, L. R.; Sachs, R. K.
2000-01-01
DNA double-strand breaks (DSBs) produced by densely ionizing radiation are not located randomly in the genome: recent data indicate DSB clustering along chromosomes. Stochastic DSB clustering at large scales, from > 100 Mbp down to < 0.01 Mbp, is modeled using computer simulations and analytic equations. A random-walk, coarse-grained polymer model for chromatin is combined with a simple track structure model in Monte Carlo software called DNAbreak and is applied to data on alpha-particle irradiation of V-79 cells. The chromatin model neglects molecular details but systematically incorporates an increase in average spatial separation between two DNA loci as the number of base-pairs between the loci increases. Fragment-size distributions obtained using DNAbreak match data on large fragments about as well as distributions previously obtained with a less mechanistic approach. Dose-response relations, linear at small doses of high linear energy transfer (LET) radiation, are obtained. They are found to be non-linear when the dose becomes so large that there is a significant probability of overlapping or close juxtaposition, along one chromosome, for different DSB clusters from different tracks. The non-linearity is more evident for large fragments than for small. The DNAbreak results furnish an example of the RLC (randomly located clusters) analytic formalism, which generalizes the broken-stick fragment-size distribution of the random-breakage model that is often applied to low-LET data.
ERIC Educational Resources Information Center
Roschelle, Jeremy; Shechtman, Nicole; Tatar, Deborah; Hegedus, Stephen; Hopkins, Bill; Empson, Susan; Knudsen, Jennifer; Gallagher, Lawrence P.
2010-01-01
The authors present three studies (two randomized controlled experiments and one embedded quasi-experiment) designed to evaluate the impact of replacement units targeting student learning of advanced middle school mathematics. The studies evaluated the SimCalc approach, which integrates an interactive representational technology, paper curriculum,…
NASA Technical Reports Server (NTRS)
Over, Thomas, M.; Gupta, Vijay K.
1994-01-01
Under the theory of independent and identically distributed random cascades, the probability distribution of the cascade generator determines the spatial and the ensemble properties of spatial rainfall. Three sets of radar-derived rainfall data in space and time are analyzed to estimate the probability distribution of the generator. A detailed comparison between instantaneous scans of spatial rainfall and simulated cascades using the scaling properties of the marginal moments is carried out. This comparison highlights important similarities and differences between the data and the random cascade theory. Differences are quantified and measured for the three datasets. Evidence is presented to show that the scaling properties of the rainfall can be captured to the first order by a random cascade with a single parameter. The dependence of this parameter on forcing by the large-scale meteorological conditions, as measured by the large-scale spatial average rain rate, is investigated for these three datasets. The data show that this dependence can be captured by a one-to-one function. Since the large-scale average rain rate can be diagnosed from the large-scale dynamics, this relationship demonstrates an important linkage between the large-scale atmospheric dynamics and the statistical cascade theory of mesoscale rainfall. Potential application of this research to parameterization of runoff from the land surface and regional flood frequency analysis is briefly discussed, and open problems for further research are presented.
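A discrete multiplicative cascade with a one-parameter generator is easy to simulate, which makes the "single parameter" claim testable in miniature. The sketch below (Python/NumPy) uses a lognormal generator with unit mean, with σ as the free parameter, and does a rough empirical check of moment scaling across coarse-graining levels; every detail is an assumption for illustration, not the authors' construction.

```python
import numpy as np

rng = np.random.default_rng(2)

def cascade(levels, b=2, sigma=0.4):
    """One realization of a 1-D multiplicative random cascade with a
    lognormal generator W, normalized so that E[W] = 1."""
    field = np.ones(1)
    for _ in range(levels):
        W = rng.lognormal(-sigma ** 2 / 2, sigma, size=field.size * b)
        field = np.repeat(field, b) * W
    return field

# Scaling of marginal moments: E[mass(box)^q] should vary as a power of the
# box size; we coarse-grain by factors of 2 and fit the log-log slope.
field = cascade(levels=14)
for q in (0.5, 1.0, 2.0, 3.0):
    logm, f = [], field.copy()
    for lev in range(6):
        mass = f / f.sum()                 # box measures at this scale
        logm.append(np.log(np.mean(mass ** q)))
        f = f.reshape(-1, 2).sum(axis=1)   # coarse-grain by a factor of 2
    slope = np.polyfit(np.arange(6) * np.log(2), logm, 1)[0]
    print(f"q={q}: empirical moment-scaling slope {slope:+.2f}")
```

Fitting the slopes as a function of q over many realizations would recover the generator's scaling exponents, which is the first-order behavior the paper matches to the radar data.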
Sánchez, R; Carreras, B A; van Milligen, B Ph
2005-01-01
The fluid limit of a recently introduced family of nonintegrable (nonlinear) continuous-time random walks is derived in terms of fractional differential equations. In this limit, it is shown that the formalism allows for the modeling of the interaction between multiple transport mechanisms with not only disparate spatial scales but also different temporal scales. For this reason, the resulting fluid equations may find application in the study of a large number of nonlinear multiscale transport problems, ranging from the study of self-organized criticality to the modeling of turbulent transport in fluids and plasmas.
Partial Identification of Treatment Effects: Applications to Generalizability
ERIC Educational Resources Information Center
Chan, Wendy
2016-01-01
Results from large-scale evaluation studies form the foundation of evidence-based policy. The randomized experiment is often considered the gold standard among study designs because the causal impact of a treatment or intervention can be assessed without threats of confounding from external variables. Policy-makers have become increasingly…
Nagel, Corey L; Kirby, Miles A; Zambrano, Laura D; Rosa, Ghislane; Barstow, Christina K; Thomas, Evan A; Clasen, Thomas F
2016-12-15
In Rwanda, pneumonia and diarrhea are the first and second leading causes of death, respectively, among children under five. Household air pollution (HAP) resulting from cooking indoors with biomass fuels on traditional stoves is a significant risk factor for pneumonia, while consumption of contaminated drinking water is a primary cause of diarrheal disease. To date, there have been no large-scale effectiveness trials of programmatic efforts to provide either improved cookstoves or household water filters at scale in a low-income country. In this paper we describe the design of a cluster-randomized trial to evaluate the impact of a national-level program to distribute and promote the use of improved cookstoves and advanced water filters to the poorest quarter of households in Rwanda. We randomly allocated 72 sectors (administratively defined units) in Western Province to the intervention, with the remaining 24 sectors in the province serving as controls. In the intervention sectors, roughly 100,000 households received improved cookstoves and household water filters through a government-sponsored program targeting the poorest quarter of households nationally. The primary outcome measures are the incidence of acute respiratory infection (ARI) and diarrhea among children under five years of age. Over a one-year surveillance period, all cases of ARI and diarrhea identified by health workers in the study area will be extracted from records maintained at health facilities and by community health workers (CHW). In addition, we are conducting intensive, longitudinal data collection among a random sample of households in the study area for in-depth assessment of coverage, use, environmental exposures, and additional health measures. Although previous research has examined the impact of providing household water treatment and improved cookstoves on child health, there have been no studies of national-level programs to deliver these interventions at scale in a developing country. The results of this study, the first RCT of a large-scale programmatic cookstove or household water filter intervention, will inform global efforts to reduce childhood morbidity and mortality from diarrheal disease and pneumonia. This trial is registered at Clinicaltrials.gov (NCT02239250).
Christen, William G.; Glynn, Robert J.; Gaziano, J. Michael; Darke, Amy K.; Crowley, John J.; Goodman, Phyllis J.; Lippman, Scott M.; Lad, Thomas E.; Bearden, James D.; Goodman, Gary E.; Minasian, Lori M.; Thompson, Ian M.; Blanke, Charles D.; Klein, Eric A.
2014-01-01
Importance Observational studies suggest a role for dietary nutrients such as vitamin E and selenium in cataract prevention. However, the results of randomized trials of vitamin E supplements and cataract have been disappointing, and are not yet available for selenium. Objective To test whether long-term supplementation with selenium and vitamin E affects the incidence of cataract in a large cohort of men. Design, Setting, and Participants The SELECT Eye Endpoints (SEE) study was an ancillary study of the SWOG-coordinated Selenium and Vitamin E Cancer Prevention Trial (SELECT), a randomized, placebo-controlled, four-arm trial of selenium and vitamin E conducted among 35,533 men aged 50 years and older for African Americans and 55 and older for all other men, at 427 participating sites in the US, Canada, and Puerto Rico. A total of 11,267 SELECT participants from 128 SELECT sites participated in the SEE ancillary study. Intervention Individual supplements of selenium (200 µg/d from L-selenomethionine) and vitamin E (400 IU/d of all-rac-α-tocopheryl acetate). Main Outcome Measures Incident cataract, defined as a lens opacity, age-related in origin, responsible for a reduction in best-corrected visual acuity to 20/30 or worse based on self-report confirmed by medical record review, and cataract extraction, defined as the surgical removal of an incident cataract. Results During a mean (SD) of 5.6 (1.2) years of treatment and follow-up, 389 cases of cataract were documented. There were 185 cataracts in the selenium group and 204 in the no selenium group (hazard ratio [HR], 0.91; 95 percent confidence interval [CI], 0.75 to 1.11; P=.37). For vitamin E, there were 197 cases in the treated group and 192 in the placebo group (HR, 1.02; CI, 0.84 to 1.25; P=.81). Similar results were observed for cataract extraction. Conclusions and Relevance These randomized trial data from a large cohort of apparently healthy men indicate that long-term daily supplementation with selenium and/or vitamin E is unlikely to have a large beneficial effect on age-related cataract. PMID:25232809
Topology Trivialization and Large Deviations for the Minimum in the Simplest Random Optimization
NASA Astrophysics Data System (ADS)
Fyodorov, Yan V.; Le Doussal, Pierre
2014-01-01
Finding the global minimum of a cost function given by the sum of a quadratic and a linear form in N real variables over the (N-1)-dimensional sphere is one of the simplest, yet paradigmatic problems in Optimization Theory, known as the “trust region subproblem” or “constraint least square problem”. When both terms in the cost function are random this amounts to studying the ground state energy of the simplest spherical spin glass in a random magnetic field. We first identify and study two distinct large-N scaling regimes in which the linear term (magnetic field) leads to a gradual topology trivialization, i.e. reduction in the total number N_tot of critical (stationary) points in the cost function landscape. In the first regime N_tot remains of the order N and the cost function (energy) has generically two almost degenerate minima with the Tracy-Widom (TW) statistics. In the second regime the number of critical points is of the order of unity with a finite probability for a single minimum. In that case the mean total number of extrema (minima and maxima) of the cost function is given by the Laplace transform of the TW density, and the distribution of the global minimum energy is expected to take a universal scaling form generalizing the TW law. Though the full form of that distribution is not yet known to us, one of its far tails can be inferred from the large deviation theory for the global minimum. In the rest of the paper we show how to use the replica method to obtain the probability density of the minimum energy in the large-deviation approximation by finding both the rate function and the leading pre-exponential factor.
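The constrained minimization itself is classical and can be coded directly. Below is a hedged NumPy sketch that minimizes H(x) = -(1/2) x·Jx - h·x on the sphere |x|² = N by bisection on the Lagrange multiplier (the "secular equation" route); the GOE-style normalization, field strength, and tolerances are my assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 200
G = rng.normal(size=(N, N))
J = (G + G.T) / np.sqrt(2 * N)           # GOE-like couplings, spectrum ~ [-2, 2]
h = 0.3 * rng.normal(size=N)             # random magnetic field

# Minimize H(x) = -x.J x / 2 - h.x subject to |x|^2 = N.
# Stationarity gives (lam*I - J) x = h with lam >= max eigenvalue of J;
# the multiplier solves the secular equation |x(lam)|^2 = N.
evals, V = np.linalg.eigh(J)
c = V.T @ h
norm2 = lambda lam: np.sum((c / (lam - evals)) ** 2)

lo, hi = evals[-1] + 1e-12, evals[-1] + 1.0
while norm2(hi) > N:                     # bracket the root from the right
    hi *= 2
for _ in range(100):                     # bisection (norm2 is decreasing in lam)
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if norm2(mid) > N else (lo, mid)
lam = (lo + hi) / 2
x = V @ (c / (lam - evals))
print("|x|^2 / N =", round(float(x @ x) / N, 6))   # should be ~1
print("energy per spin:", round(float(-0.5 * x @ J @ x - h @ x) / N, 4))
```

Repeating this over many disorder realizations, and over a range of field strengths, is one way to probe numerically the crossover between the two scaling regimes discussed in the abstract.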
Stochastic Fermi Energization of Coronal Plasma during Explosive Magnetic Energy Release
NASA Astrophysics Data System (ADS)
Pisokas, Theophilos; Vlahos, Loukas; Isliker, Heinz; Tsiolis, Vassilis; Anastasiadis, Anastasios
2017-02-01
The aim of this study is to analyze the interaction of charged particles (ions and electrons) with randomly formed particle scatterers (e.g., large-scale local “magnetic fluctuations” or “coherent magnetic irregularities”) using the setup proposed initially by Fermi. These scatterers are formed by the explosive magnetic energy release and propagate with the Alfvén speed along the irregular magnetic fields. They are large-scale local fluctuations (δB/B ≈ 1) randomly distributed inside the unstable magnetic topology and will here be called Alfvénic Scatterers (AS). We constructed a 3D grid on which a small fraction of randomly chosen grid points are acting as AS. In particular, we study how a large number of test particles evolves inside a collection of AS, analyzing the evolution of their energy distribution and their escape-time distribution. We use a well-established method to estimate the transport coefficients directly from the trajectories of the particles. Using the estimated transport coefficients and solving the Fokker-Planck equation numerically, we can recover the energy distribution of the particles. We have shown that the stochastic Fermi energization of mildly relativistic and relativistic plasma can heat and accelerate the tail of the ambient particle distribution as predicted by Parker & Tidman and Ramaty. The temperature of the hot plasma and the tail of the energetic particles depend on the mean free path (λ_sc) of the particles between the scatterers inside the energization volume.
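A toy version of this stochastic Fermi process fits in a few lines. The sketch below (Python/NumPy) treats scatterers as elastic magnetic mirrors moving at speed βc with flux-weighted encounter angles; the scatterer speed, the encounter counts, and the crude one-pass rejection resampling are all my assumptions, not the authors' 3D grid model.

```python
import numpy as np

rng = np.random.default_rng(4)
beta = 0.05                               # scatterer (Alfven) speed in units of c
gamma2 = 1.0 / (1.0 - beta ** 2)
n_part, n_scat = 20000, 300

E = np.ones(n_part)                       # test-particle energies, v ~ c
for _ in range(n_scat):
    # Incoming pitch cosine, flux-weighted p(mu) ~ (1 - beta*mu): head-on
    # encounters slightly outnumber overtaking ones. One cheap rejection
    # pass (approximate) instead of exact inverse-CDF sampling.
    mu_in = rng.uniform(-1, 1, n_part)
    bad = rng.uniform(0, 1 + beta, n_part) > (1 - beta * mu_in)
    mu_in[bad] = rng.uniform(-1, 1, bad.sum())
    mu_out = rng.uniform(-1, 1, n_part)   # isotropic re-emission in scatterer frame
    # Elastic scattering off a moving mirror, for ultra-relativistic particles
    E *= gamma2 * (1 - beta * mu_in) * (1 + beta * mu_out)

print("mean energy gain factor:", round(float(E.mean()), 2))
print("second-order estimate:", round((1 + 4 * beta ** 2 / 3) ** n_scat, 2))
```

The systematic gain per encounter is second order in β (of order 1 + 4β²/3), while individual encounters can lose energy, which is what heats the bulk and pulls out an energetic tail.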
Rautiainen, Susanne; Sesso, Howard D; Manson, JoAnn E
2017-12-29
Several bioactive compounds and nutrients in foods have physiological properties that are beneficial for human health. While nutrients typically have clear definitions with established levels of recommended intakes, bioactive compounds often lack such a definition. Although a food-based approach is often the optimal approach to ensure adequate intake of bioactives and nutrients, these components are also often produced as dietary supplements. However, many of these supplements are not sufficiently studied and have an unclear role in chronic disease prevention. Randomized trials are considered the gold standard of study designs, but have not been fully applied to understand the effects of bioactives and nutrients. We review the specific role of large-scale trials to test whether bioactives and nutrients have an effect on health outcomes through several crucial components of trial design, including selection of intervention, recruitment, compliance, outcome selection, and interpretation and generalizability of study findings. We will discuss these components in the context of two randomized clinical trials, the VITamin D and OmegA-3 TriaL (VITAL) and the COcoa Supplement and Multivitamin Outcomes Study (COSMOS). We will mainly focus on dietary supplements of bioactives and nutrients while also emphasizing the need for translation and integration with food-based trials that are of vital importance within nutritional research. Copyright © 2017. Published by Elsevier Ltd.
ERIC Educational Resources Information Center
Frees, Edward W.; Kim, Jee-Seon
2006-01-01
Multilevel models are proven tools in social research for modeling complex, hierarchical systems. In multilevel modeling, statistical inference is based largely on quantification of random variables. This paper distinguishes among three types of random variables in multilevel modeling--model disturbances, random coefficients, and future response…
NASA Astrophysics Data System (ADS)
Ordóñez Cabrera, Manuel; Volodin, Andrei I.
2005-05-01
From the classical notion of uniform integrability of a sequence of random variables, a new concept of integrability (called h-integrability) is introduced for an array of random variables, concerning an array of constants. We prove that this concept is weaker than other previous related notions of integrability, such as Cesàro uniform integrability [Chandra, Sankhya Ser. A 51 (1989) 309-317], uniform integrability concerning the weights [Ordóñez Cabrera, Collect. Math. 45 (1994) 121-132] and Cesàro α-integrability [Chandra and Goswami, J. Theoret. Probab. 16 (2003) 655-669]. Under this condition of integrability and appropriate conditions on the array of weights, mean convergence theorems and weak laws of large numbers for weighted sums of an array of random variables are obtained when the random variables are subject to some special kinds of dependence: (a) rowwise pairwise negative dependence, (b) rowwise pairwise non-positive correlation, (c) when the sequence of random variables in every row is φ-mixing. Finally, we consider the general weak law of large numbers in the sense of Gut [Statist. Probab. Lett. 14 (1992) 49-52] under this new condition of integrability for a Banach space setting.
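For readers who want the definition spelled out, here is one plausible LaTeX rendering of the h-integrability condition, reconstructed from the abstract's description; the exact notation and the form of the conditions are assumptions, not quoted from the paper.

```latex
% h-integrability of an array {X_{ni}} with respect to constants {a_{ni}},
% where h(n) is increasing with h(n) -> infinity (assumed notation):
\[
  \sup_{n}\sum_{i}|a_{ni}|\,\mathbb{E}|X_{ni}| < \infty ,
  \qquad
  \lim_{n\to\infty}\sum_{i}|a_{ni}|\,
    \mathbb{E}\bigl(|X_{ni}|\,\mathbf{1}\{|X_{ni}|>h(n)\}\bigr) = 0 .
\]
```

The weak laws then follow by truncating each X_{ni} at h(n) and controlling the truncated and tail parts separately, which is the standard route for results of this type.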
A random-sum Wilcoxon statistic and its application to analysis of ROC and LROC data.
Tang, Liansheng Larry; Balakrishnan, N
2011-01-01
The Wilcoxon-Mann-Whitney statistic is commonly used for a distribution-free comparison of two groups. One requirement for its use is that the sample sizes of the two groups are fixed. This is violated in some of the applications such as medical imaging studies and diagnostic marker studies; in the former, the violation occurs since the number of correctly localized abnormal images is random, while in the latter the violation is due to some subjects not having observable measurements. For this reason, we propose here a random-sum Wilcoxon statistic for comparing two groups in the presence of ties, and derive its variance as well as its asymptotic distribution for large sample sizes. The proposed statistic includes the regular Wilcoxon rank-sum statistic. Finally, we apply the proposed statistic for summarizing location response operating characteristic data from a liver computed tomography study, and also for summarizing diagnostic accuracy of biomarker data.
Study on the Vehicle Dynamic Load Considering the Vehicle-Pavement Coupled Effect
NASA Astrophysics Data System (ADS)
Xu, H. L.; He, L.; An, D.
2017-11-01
The vibration of the vehicle-pavement interaction system is a sophisticated random vibration process, and the vehicle-pavement coupled effect was not considered in previous studies. A new linear elastic model of the vehicle-pavement coupled system was established in this paper. The new model was verified against field measurements and can reflect the real vibration between vehicle and pavement. Using the new model, a study of the vehicle dynamic load considering the vehicle-pavement coupled effect showed that the random forces (centralization) between vehicle and pavement are strongly influenced by the coupled effect. Numerical calculation indicated that the maximum random force in the coupled model was 2.4 times that in the uncoupled model. Examining the reason, it was found that the main vibration frequency of the vehicle non-suspension system is similar to that of the vehicle suspension system in the coupled model, and the resulting resonance leads to a significant increase in the vehicle dynamic load.
Random walks exhibiting anomalous diffusion: elephants, urns and the limits of normality
NASA Astrophysics Data System (ADS)
Kearney, Michael J.; Martin, Richard J.
2018-01-01
A random walk model is presented which exhibits a transition from standard to anomalous diffusion as a parameter is varied. The model is a variant on the elephant random walk and differs in respect of the treatment of the initial state, which in the present work consists of a given number N of fixed steps. This also links the elephant random walk to other types of history-dependent random walk. As well as being amenable to direct analysis, the model is shown to be asymptotically equivalent to a non-linear urn process. This provides fresh insights into the limiting form of the distribution of the walker’s position at large times. Although the distribution is intrinsically non-Gaussian in the anomalous diffusion regime, it gradually reverts to normal form when N is large under quite general conditions.
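The variant is straightforward to simulate; the sketch below (Python/NumPy) implements an elephant-type walk whose first N steps are fixed to +1 and shows the change in diffusive behaviour around the memory parameter p = 3/4. The copy/reverse rule and all parameter values are assumptions based on the abstract, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(5)

def erw(n_steps, p, n_fixed=10, n_walkers=4000):
    """Elephant random walk: each new step copies a uniformly chosen earlier
    step with probability p and reverses it with probability 1 - p. The first
    n_fixed steps are fixed to +1 (the variant's deterministic initial state)."""
    steps = np.ones((n_walkers, n_steps))
    for t in range(n_fixed, n_steps):
        j = rng.integers(0, t, n_walkers)            # remembered earlier step
        remembered = steps[np.arange(n_walkers), j]
        copy = rng.random(n_walkers) < p
        steps[:, t] = np.where(copy, remembered, -remembered)
    return steps.sum(axis=1)                         # final positions

for p in (0.4, 0.75, 0.9):   # diffusive p < 3/4, marginal, superdiffusive p > 3/4
    X = erw(2000, p)
    print(f"p={p}: Var(X_n)/n = {X.var() / 2000:8.2f}")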
NASA Astrophysics Data System (ADS)
Long, Yin; Zhang, Xiao-Jun; Wang, Kui
2018-05-01
In this paper, convergence and approximate calculation of the average degree under different network sizes for decreasing random birth-and-death networks (RBDNs) are studied. First, we find and demonstrate that the average degree converges in the form of a power law. Meanwhile, we discover that the ratios of successive terms of the convergent remainder are independent of the number of network links for large network sizes, and we theoretically prove that the limit of this ratio is a constant. Moreover, since it is difficult to calculate the analytical solution of the average degree for large network sizes, we adopt a numerical method to obtain an approximate expression for the average degree in place of its analytical solution. Finally, simulations are presented to verify our theoretical results.
Scattering from a random layer of leaves in the physical optics limit
NASA Technical Reports Server (NTRS)
Lang, R. H.; Seker, S. S.; Le Vine, D. M.
1982-01-01
Backscatter of electromagnetic radiation from a layer of vegetation over flat lossy ground has been studied in collaborative research at the George Washington University and the Goddard Space Flight Center. In this work the vegetation is composed of leaves which are modeled by a random collection of lossy dielectric disks. Backscattering coefficients for the vegetation layer have been calculated in the case of disks whose diameter is large compared to wavelength. These backscattering coefficients are obtained in terms of the scattering amplitude of an individual disk by employing the distorted Born procedure. The scattering amplitude for a disk which is large compared to wavelength is then found by physical optics techniques. Computed results are interpreted in terms of dominant reflected and transmitted contributions from the disks and ground.
NASA Astrophysics Data System (ADS)
Sidorova, Mariia; Semenov, Alexej; Hübers, Heinz-Wilhelm; Charaev, Ilya; Kuzmin, Artem; Doerner, Steffen; Siegel, Michael
2017-11-01
We studied timing jitter in the appearance of photon counts in meandering nanowires with different fractional amounts of bends. Intrinsic timing jitter, which is the probability density function of the random time delay between photon absorption in a current-carrying superconducting nanowire and the appearance of the normal domain, reveals two different underlying physical mechanisms. In the deterministic regime, which is realized at large photon energies and large currents, jitter is controlled by the position-dependent detection threshold in straight parts of meanders. It decreases as the current increases. At small photon energies, jitter increases and its current dependence disappears. In this probabilistic regime jitter is controlled by a Poisson process in which magnetic vortices jump randomly across the wire in areas adjacent to the bends.
Uncertainty in Random Forests: What does it mean in a spatial context?
NASA Astrophysics Data System (ADS)
Klump, Jens; Fouedjio, Francky
2017-04-01
Geochemical surveys are an important part of exploration for mineral resources and in environmental studies. The samples and chemical analyses are often laborious and difficult to obtain and therefore come at a high cost. As a consequence, these surveys are characterised by datasets with large numbers of variables but relatively few data points when compared to conventional big data problems. With more remote sensing platforms and sensor networks being deployed, large volumes of auxiliary data of the surveyed areas are becoming available. The use of these auxiliary data has the potential to improve the prediction of chemical element concentrations over the whole study area. Kriging is a well-established geostatistical method for the prediction of spatial data but requires significant pre-processing and makes some basic assumptions about the underlying distribution of the data. Some machine learning algorithms, on the other hand, may require less data pre-processing and are non-parametric. In this study we used a dataset provided by Kirkwood et al. [1] to explore the potential use of Random Forest in geochemical mapping. We chose Random Forest because it is a well understood machine learning method and has the advantage that it provides us with a measure of uncertainty. By comparing Random Forest to Kriging we found that both methods produced comparable maps of estimated values for our variables of interest. Kriging outperformed Random Forest for variables of interest with relatively strong spatial correlation. The measure of uncertainty provided by Random Forest seems to be quite different to the measure of uncertainty provided by Kriging. In particular, the lack of spatial context can give misleading results in areas without ground truth data. In conclusion, our preliminary results show that the model-driven approach in geostatistics gives us more reliable estimates for our target variables than Random Forest for variables with relatively strong spatial correlation. However, in cases of weak spatial correlation Random Forest, as a non-parametric method, may give better results once we have a better understanding of the meaning of its uncertainty measures in a spatial context.
References: [1] Kirkwood, C., M. Cave, D. Beamish, S. Grebby, and A. Ferreira (2016), A machine learning approach to geochemical mapping, Journal of Geochemical Exploration, 163, 28-40, doi:10.1016/j.gexplo.2016.05.003.
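The uncertainty point can be illustrated directly: a common proxy for Random Forest predictive uncertainty is the spread of the individual tree predictions. A toy sketch (Python with scikit-learn; the synthetic "survey" and the transect are invented for the example):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(6)
# Toy "geochemical survey": 60 sampled points of a smooth 1-D spatial trend
x_train = rng.uniform(0, 10, size=(60, 1))            # easting only, for clarity
y_train = np.sin(x_train[:, 0]) + 0.1 * rng.normal(size=60)

rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(x_train, y_train)

# Predict on a transect that extends beyond the sampled area
grid = np.linspace(0, 14, 8).reshape(-1, 1)
per_tree = np.stack([tree.predict(grid) for tree in rf.estimators_])
for g, m, s in zip(grid[:, 0], rf.predict(grid), per_tree.std(axis=0)):
    flag = "  <- outside the data" if g > 10 else ""
    print(f"easting {g:5.1f}: prediction {m:+.2f}, tree spread {s:.2f}{flag}")
# Unlike a kriging variance, the tree spread need not grow with distance
# from the data; this is the "lack of spatial context" noted above.
```

Running this shows the forest extrapolating with a flat prediction and, often, a deceptively small tree spread beyond the sampled range, exactly where a kriging variance would inflate.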
The PLCO Cancer Screening Trial: Q and A
The Prostate, Lung, Colorectal and Ovarian (PLCO) Cancer Screening Trial is a large, randomized study to determine whether the use of certain screening tests will reduce the risk of dying of those four cancers. In addition to answering questions about the…
Duong, Manh Hong; Han, The Anh
2016-12-01
In this paper, we study the distribution and behaviour of internal equilibria in a d-player n-strategy random evolutionary game where the game payoff matrix is generated from normal distributions. The study of this paper reveals and exploits interesting connections between evolutionary game theory and random polynomial theory. The main contributions of the paper are some qualitative and quantitative results on the expected density, [Formula: see text], and the expected number, E(n, d), of (stable) internal equilibria. Firstly, we show that in multi-player two-strategy games, they behave asymptotically as [Formula: see text] as d is sufficiently large. Secondly, we prove that they are monotone functions of d. We also make a conjecture for games with more than two strategies. Thirdly, we provide numerical simulations for our analytical results and to support the conjecture. As consequences of our analysis, some qualitative and quantitative results on the distribution of zeros of a random Bernstein polynomial are also obtained.
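The connection to random polynomials can be made concrete: for a d-player two-strategy game, internal equilibria correspond to roots in (0, ∞) of a polynomial whose k-th coefficient is a payoff difference scaled by a binomial coefficient. A Monte Carlo sketch (Python/NumPy; i.i.d. standard normal coefficients are my assumption for the illustration):

```python
import numpy as np
from math import comb

rng = np.random.default_rng(7)
d = 5                                     # number of players, two strategies
counts = []
for _ in range(20000):
    beta = rng.normal(size=d)             # random payoff differences
    # Internal equilibria <=> roots y > 0 of sum_k beta_k * C(d-1, k) * y^k
    coeffs = [beta[k] * comb(d - 1, k) for k in range(d)][::-1]  # descending
    roots = np.roots(coeffs)
    counts.append(int(np.sum((np.abs(roots.imag) < 1e-9) & (roots.real > 0))))

print("estimated E(#internal equilibria) for d=5:", round(np.mean(counts), 3))
```

Sweeping d in this simulation is one way to check the monotonicity and asymptotic statements numerically.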
A new u-statistic with superior design sensitivity in matched observational studies.
Rosenbaum, Paul R
2011-09-01
In an observational or nonrandomized study of treatment effects, a sensitivity analysis indicates the magnitude of bias from unmeasured covariates that would need to be present to alter the conclusions of a naïve analysis that presumes adjustments for observed covariates suffice to remove all bias. The power of a sensitivity analysis is the probability that it will reject a false hypothesis about treatment effects allowing for a departure from random assignment of a specified magnitude; in particular, if this specified magnitude is "no departure" then this is the same as the power of a randomization test in a randomized experiment. A new family of u-statistics is proposed that includes Wilcoxon's signed rank statistic but also includes other statistics with substantially higher power when a sensitivity analysis is performed in an observational study. Wilcoxon's statistic has high power to detect small effects in large randomized experiments (that is, it often has good Pitman efficiency), but small effects are invariably sensitive to small unobserved biases. Members of this family of u-statistics that emphasize medium to large effects can have substantially higher power in a sensitivity analysis. For example, in one situation with 250 pair differences that are Normal with expectation 1/2 and variance 1, the power of a sensitivity analysis that uses Wilcoxon's statistic is 0.08 while the power of another member of the family of u-statistics is 0.66. The topic is examined by performing a sensitivity analysis in three observational studies, using an asymptotic measure called the design sensitivity, and by simulating power in finite samples. The three examples are drawn from epidemiology, clinical medicine, and genetic toxicology. © 2010, The International Biometric Society.
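To see what the power of a sensitivity analysis means operationally, here is a hedged sketch (Python/SciPy) of the standard Rosenbaum bound for Wilcoxon's signed rank statistic in the abstract's 250-pair Normal(1/2, 1) example; the normal approximation and the one-sided test are my assumptions, and the new u-statistics of the paper are not implemented here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
# 250 matched-pair differences ~ Normal(0.5, 1), as in the abstract's example
d = rng.normal(0.5, 1.0, size=250)

def wilcoxon_sensitivity(d, gamma):
    """Worst-case one-sided p-value for Wilcoxon's signed rank statistic when
    an unmeasured bias makes the odds of a positive pair at most gamma
    (Rosenbaum-style bound, normal approximation)."""
    r = stats.rankdata(np.abs(d))
    T = np.sum(r[d > 0])                  # signed rank statistic
    p_plus = gamma / (1 + gamma)          # worst-case sign probability
    mu = p_plus * r.sum()
    var = p_plus * (1 - p_plus) * np.sum(r ** 2)
    return 1 - stats.norm.cdf((T - mu) / np.sqrt(var))

for gamma in (1.0, 1.5, 2.0, 3.0):
    print(f"Gamma={gamma}: worst-case p ~ {wilcoxon_sensitivity(d, gamma):.4f}")
```

Gamma = 1 reproduces the usual randomization test; the power of the sensitivity analysis at a given Gamma is the chance, over repeated samples, that this worst-case p-value still falls below the significance level.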
Salem, Mohamed M.; Alturki, Abdulrahman Y.; Fusco, Matthew R.; Thomas, Ajith J.; Carter, Bob S.; Chen, Clark C.; Kasper, Ekkehard M.
2018-01-01
Background: Carotid artery stenosis, both symptomatic and asymptomatic, has been well studied with several multicenter randomized trials. The superiority of carotid endarterectomy (CEA) to medical therapy alone in both symptomatic and asymptomatic carotid artery stenosis has been well established in previous trials in the 1990s. The consequent era of endovascular carotid artery stenting (CAS) has offered another option for treating carotid artery stenosis. A series of randomized trials have now been conducted to compare CEA and CAS in the treatment of carotid artery disease. The large number of similar trials has created some confusion due to inconsistent results. Here, the authors review the trials that compare CEA and CAS in the management of carotid artery stenosis. Methods: The PubMed database was searched systematically for randomized controlled trials published in English that compared CEA and CAS. Only human studies on adult patients were assessed. The references of identified articles were reviewed for additional manuscripts to be included if inclusion criteria were met. The following terms were used during the search: carotid stenosis, endarterectomy, stenting. Retrospective or single-center studies were excluded from the review. Results: Thirteen reports of seven large-scale prospective multicenter studies, comparing both interventions for symptomatic or asymptomatic extracranial carotid artery stenosis, were identified. Conclusions: While the superiority of intervention to medical management for symptomatic patients has been well established in the literature, careful selection of asymptomatic patients for intervention should be undertaken and only be pursued after institution of appropriate medical therapy until further reports on trials comparing medical therapy to intervention in this patient group are available. PMID:29740506
ERIC Educational Resources Information Center
Westine, Carl D.
2016-01-01
Little is known empirically about intraclass correlations (ICCs) for multisite cluster randomized trial (MSCRT) designs, particularly in science education. In this study, ICCs suitable for science achievement studies using a three-level (students in schools in districts) MSCRT design that block on district are estimated and examined. Estimates of…
ERIC Educational Resources Information Center
National Center for Education Evaluation and Regional Assistance, 2014
2014-01-01
The Teacher Incentive Fund (TIF) provides grants to support performance-based compensation systems for teachers and principals in high-need schools. The study measures the impact of pay-for-performance bonuses as part of a comprehensive compensation system within a large, multisite random assignment study design. The treatment schools were to…
True Randomness from Big Data.
Papakonstantinou, Periklis A; Woodruff, David P; Yang, Guang
2016-09-26
Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomy and genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as a sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle sources of this size, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical-world empirical extractors as measured by standardized tests.
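For readers unfamiliar with randomness extraction, the classic von Neumann extractor below illustrates the basic idea in a few lines. Note that it needs exactly the kind of per-sample independence assumption that the method in this abstract is designed to avoid, so it is a baseline for contrast, not a stand-in for the paper's construction.

```python
# Von Neumann extractor: a baseline randomness extractor, shown for contrast.
# It assumes independent, identically biased input bits, an assumption the
# big-source method above deliberately does not make.
import random

def von_neumann_extract(bits):
    """Pairs (0,1) -> 0 and (1,0) -> 1; pairs (0,0) and (1,1) are discarded."""
    return [b0 for b0, b1 in zip(bits[::2], bits[1::2]) if b0 != b1]

random.seed(1)
raw = [1 if random.random() < 0.8 else 0 for _ in range(100_000)]  # biased coin
ext = von_neumann_extract(raw)
print(len(ext), sum(ext) / len(ext))  # output is shorter but close to unbiased
```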
Hoare, Jacqueline; Carey, Paul; Joska, John A; Carrara, Henri; Sorsdahl, Katherine; Stein, Dan J
2014-02-01
Depression can be a chronic and impairing illness in people with human immunodeficiency virus (HIV)/acquired immunodeficiency syndrome. Large randomized studies of newer selective serotonin reuptake inhibitors such as escitalopram, examining comparative treatment efficacy and safety in the treatment of depression, have yet to be done in HIV-positive patients. This was a fixed-dose, placebo-controlled, randomized, double-blind study to investigate the efficacy of escitalopram in HIV-seropositive subjects with Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, major depressive disorder. One hundred two participants were randomly assigned to either 10 mg of escitalopram or placebo for 6 weeks. An analysis of covariance of the completers found no advantage for escitalopram over placebo on the Montgomery-Asberg Depression Rating Scale (p = 0.93). Sixty-two percent responded to escitalopram and 59% responded to placebo on the Clinical Global Impression Scale. Given the relatively high placebo response, future trials in this area need to be selective in participant recruitment and adequately powered.
Parameters affecting the resilience of scale-free networks to random failures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Link, Hamilton E.; LaViolette, Randall A.; Lane, Terran
2005-09-01
It is commonly believed that scale-free networks are robust to massive numbers of random node deletions. For example, Cohen et al. in (1) study scale-free networks including some which approximate the measured degree distribution of the Internet. Their results suggest that if each node in this network failed independently with probability 0.99, most of the remaining nodes would still be connected in a giant component. In this paper, we show that a large and important subclass of scale-free networks are not robust to massive numbers of random node deletions. In particular, we study scale-free networks which have minimum node degree of 1 and a power-law degree distribution beginning with nodes of degree 1 (power-law networks). We show that, in a power-law network approximating the Internet's reported distribution, when the probability of deletion of each node is 0.5 only about 25% of the surviving nodes in the network remain connected in a giant component, and the giant component does not persist beyond a critical failure rate of 0.9. The new result is partially due to improved analytical accommodation of the large number of degree-0 nodes that result after node deletions. Our results apply to power-law networks with a wide range of power-law exponents, including Internet-like networks. We give both analytical and empirical evidence that such networks are not generally robust to massive random node deletions.
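The random-failure experiment is straightforward to reproduce in outline. The sketch below (Python with networkx; the exponent, network size, and failure probability are illustrative rather than the paper's exact values) builds a configuration-model network whose power-law degree sequence starts at degree 1, deletes each node with probability 0.5, and reports the fraction of survivors left in the giant component.

```python
# A rough sketch of the random-failure experiment described above, assuming a
# configuration-model network with a power-law degree sequence and minimum
# degree 1. Parameters are illustrative, not the paper's exact values.
import random
import networkx as nx

random.seed(0)
n, gamma, p_fail = 20_000, 2.3, 0.5

# Power-law degree sequence with minimum degree 1; pad to an even sum.
deg = [max(1, int(round(d))) for d in nx.utils.powerlaw_sequence(n, gamma)]
if sum(deg) % 2 == 1:
    deg[0] += 1

G = nx.configuration_model(deg, seed=0)
G = nx.Graph(G)                         # collapse multi-edges
G.remove_edges_from(nx.selfloop_edges(G))

survivors = [v for v in G if random.random() > p_fail]
H = G.subgraph(survivors)
giant = max(nx.connected_components(H), key=len)
print(f"{len(giant) / len(survivors):.2%} of surviving nodes in giant component")
```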
NASA Astrophysics Data System (ADS)
Abid, Najmul; Mirkhalaf, Mohammad; Barthelat, Francois
2018-03-01
Natural materials such as nacre, collagen, and spider silk are composed of staggered stiff and strong inclusions in a softer matrix. This type of hybrid microstructure results in remarkable combinations of stiffness, strength, and toughness, and it now inspires novel classes of high-performance composites. However, the analytical and numerical approaches used to predict and optimize the mechanics of staggered composites often neglect statistical variations and inhomogeneities, which may have significant impacts on modulus, strength, and toughness. Here we present an analysis of localization using small representative volume elements (RVEs) and large-scale statistical volume elements (SVEs) based on the discrete element method (DEM). DEM is an efficient numerical method which enabled the evaluation of more than 10,000 microstructures in this study, each including about 5,000 inclusions. The models explore the combined effects of statistics, inclusion arrangement, and interface properties. We find that statistical variations have a negative effect on all properties, in particular on the ductility and energy absorption, because randomness precipitates the localization of deformations. However, the results also show that the negative effects of random microstructures can be offset by interfaces with large strain at failure accompanied by strain hardening. More specifically, this quantitative study reveals an optimal range of interface properties where the interfaces are the most effective at delaying localization. These findings show how carefully designed interfaces in bioinspired staggered composites can offset the negative effects of microstructural randomness, which is inherent to most current fabrication methods.
NASA Astrophysics Data System (ADS)
Warrier, M.; Bhardwaj, U.; Hemani, H.; Schneider, R.; Mutzke, A.; Valsakumar, M. C.
2015-12-01
We report on molecular dynamics (MD) simulations carried out in fcc Cu and bcc W using the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) code to study (i) the statistical variations in the number of interstitials and vacancies produced by energetic primary knock-on atoms (PKA) (0.1-5 keV) directed in random directions and (ii) the in-cascade cluster size distributions. It is seen that around 60-80 random directions have to be explored for the average number of displaced atoms to become steady in the case of fcc Cu, whereas for bcc W around 50-60 random directions need to be explored. The number of Frenkel pairs produced in the MD simulations is compared with that from the Binary Collision Approximation Monte Carlo (BCA-MC) code SDTRIM-SP and the results from the NRT model. It is seen that a proper choice of the damage energy, i.e. the energy required to create a stable interstitial, is essential for the BCA-MC results to match the MD results. On the computational front, it is seen that in-situ processing avoids the input/output (I/O) of several terabytes of atomic position data when exploring a large number of random directions, with no difference in run-time because the extra run-time spent processing data is offset by the time saved in I/O.
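The direction-averaging bookkeeping described here is simple to sketch. In the fragment below, isotropic PKA directions are drawn by normalizing Gaussian vectors and a running mean is monitored for convergence; the per-direction defect counts are synthetic stand-ins (a Poisson placeholder), since the real numbers come from LAMMPS cascade runs.

```python
# A sketch of the direction-averaging bookkeeping only: uniformly random PKA
# directions plus a running mean monitored for convergence. The per-direction
# defect counts are synthetic placeholders, not MD output.
import numpy as np

rng = np.random.default_rng(42)

def random_unit_vector():
    v = rng.normal(size=3)        # isotropic directions via normalized Gaussians
    return v / np.linalg.norm(v)

counts, means = [], []
for k in range(1, 101):
    direction = random_unit_vector()
    n_frenkel = rng.poisson(30)   # placeholder for the MD-measured defect count
    counts.append(n_frenkel)
    means.append(np.mean(counts))

# Convergence criterion in the spirit of the study: the running average changes
# by less than 1% over the last 20 explored directions.
tail = np.array(means[-20:])
print("running mean:", round(means[-1], 2), " steady:", np.ptp(tail) / tail.mean() < 0.01)
```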
Large-deviation theory for diluted Wishart random matrices
NASA Astrophysics Data System (ADS)
Castillo, Isaac Pérez; Metz, Fernando L.
2018-03-01
Wishart random matrices with a sparse or diluted structure are ubiquitous in the processing of large datasets, with applications in physics, biology, and economics. In this work, we develop a theory for the eigenvalue fluctuations of diluted Wishart random matrices based on the replica approach of disordered systems. We derive an analytical expression for the cumulant generating function of the number of eigenvalues $I_N(x)$ smaller than $x \in \mathbb{R}^+$, from which all cumulants of $I_N(x)$ and the rate function $\Psi_x(k)$ controlling its large-deviation probability $\mathrm{Prob}[I_N(x) = kN] \asymp e^{-N\Psi_x(k)}$ follow. Explicit results for the mean value and the variance of $I_N(x)$, its rate function, and its third cumulant are discussed and thoroughly compared to numerical diagonalization, showing very good agreement. The present work establishes the theoretical framework put forward in a recent letter [Phys. Rev. Lett. 117, 104101 (2016), 10.1103/PhysRevLett.117.104101] as an exact and compelling approach to deal with eigenvalue fluctuations of sparse random matrices.
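The quantity being characterized can be sampled directly for small matrices. The sketch below (Python; the dilution scheme, keeping entries independently with probability c/N, is one simple convention and may differ from the paper's ensemble) estimates the mean and variance of the index $I_N(x)$ by brute-force diagonalization, which is the kind of numerics the replica predictions are compared against.

```python
# A numerical companion, under an assumed simple dilution scheme: entries of
# the M x N data matrix X are kept with probability c/N. We sample
# I_N(x) = #{eigenvalues of (1/M) X^T X below x} over many realizations.
import numpy as np

rng = np.random.default_rng(0)
N, M, c, x, trials = 100, 150, 4.0, 0.5, 500

counts = []
for _ in range(trials):
    mask = rng.random((M, N)) < c / N        # sparse support of the data matrix
    X = rng.normal(size=(M, N)) * mask
    W = X.T @ X / M                          # diluted Wishart matrix, N x N
    eig = np.linalg.eigvalsh(W)
    counts.append(np.sum(eig < x))           # the index I_N(x)

counts = np.array(counts)
print("mean I_N(x):", counts.mean(), " variance:", counts.var())
```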
NASA Technical Reports Server (NTRS)
Weinberg, David H.; Gott, J. Richard, III; Melott, Adrian L.
1987-01-01
Many models for the formation of galaxies and large-scale structure assume a spectrum of random phase (Gaussian), small-amplitude density fluctuations as initial conditions. In such scenarios, the topology of the galaxy distribution on large scales relates directly to the topology of the initial density fluctuations. Here a quantitative measure of topology - the genus of contours in a smoothed density distribution - is described and applied to numerical simulations of galaxy clustering, to a variety of three-dimensional toy models, and to a volume-limited sample of the CfA redshift survey. For random phase distributions the genus of density contours exhibits a universal dependence on threshold density. The clustering simulations show that a smoothing length of 2-3 times the mass correlation length is sufficient to recover the topology of the initial fluctuations from the evolved galaxy distribution. Cold dark matter and white noise models retain a random phase topology at shorter smoothing lengths, but massive neutrino models develop a cellular topology.
Analytical connection between thresholds and immunization strategies of SIS model in random networks
NASA Astrophysics Data System (ADS)
Zhou, Ming-Yang; Xiong, Wen-Man; Liao, Hao; Wang, Tong; Wei, Zong-Wen; Fu, Zhong-Qian
2018-05-01
Devising effective strategies for hindering the propagation of viruses and protecting the population against epidemics is critical for public security and health. Despite a number of studies based on the susceptible-infected-susceptible (SIS) model devoted to this topic, we still lack a general framework to compare different immunization strategies in completely random networks. Here, we address this problem by suggesting a novel method based on heterogeneous mean-field theory for the SIS model. Our method builds the relationship between the thresholds and different immunization strategies in completely random networks. In addition, we provide an analytical argument that the targeted large-degree strategy achieves the best performance in random networks with arbitrary degree distribution. Moreover, the experimental results demonstrate the effectiveness of the proposed method in both artificial and real-world networks.
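The heterogeneous mean-field threshold at the core of this kind of analysis is λ_c = ⟨k⟩/⟨k²⟩, so the benefit of degree-targeted immunization can be illustrated directly on a sampled degree distribution. The sketch below is schematic: it only tracks how immunization reshapes the degree moments, not the full network dynamics, and all parameters are illustrative.

```python
# A minimal sketch of the heterogeneous mean-field threshold lambda_c = <k>/<k^2>.
# Immunizing the largest-degree nodes truncates the degree distribution and
# raises the threshold. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
degrees = rng.zipf(2.5, size=100_000)        # heavy-tailed degree sample
degrees = degrees[degrees <= 1000]           # crude finite-size cutoff

def hmf_threshold(k):
    return k.mean() / (k ** 2).mean()

print("no immunization:      ", hmf_threshold(degrees))

# Targeted immunization: remove the top 5% largest-degree nodes.
cut = np.quantile(degrees, 0.95)
print("targeted (top 5% cut):", hmf_threshold(degrees[degrees <= cut]))

# Random immunization leaves the sampled degree distribution (and hence this
# ratio) essentially unchanged; the full mean-field treatment also rescales the
# threshold by 1/(1-g), which this degree-only sketch does not capture.
keep = rng.random(degrees.size) > 0.05
print("random 5% immunized:  ", hmf_threshold(degrees[keep]))
```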
NASA Astrophysics Data System (ADS)
Andresen, Juan Carlos; Katzgraber, Helmut G.; Schechter, Moshe
2017-12-01
Random fields disorder Ising ferromagnets by aligning single spins in the direction of the random field in three space dimensions, or by flipping large ferromagnetic domains at dimensions two and below. While the former requires random fields of typical magnitude similar to the interaction strength, the latter Imry-Ma mechanism only requires infinitesimal random fields. Recently, it has been shown that for dilute anisotropic dipolar systems a third mechanism exists, where the ferromagnetic phase is disordered by finite-size glassy domains at a random field of finite magnitude that is considerably smaller than the typical interaction strength. Using large-scale Monte Carlo simulations and zero-temperature numerical approaches, we show that this mechanism applies to disordered ferromagnets with competing short-range ferromagnetic and antiferromagnetic interactions, suggesting its generality in ferromagnetic systems with competing interactions and an underlying spin-glass phase. A finite-size-scaling analysis of the magnetization distribution suggests that the transition might be first order.
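For orientation, a toy version of the Monte Carlo machinery involved is sketched below: single-spin Metropolis updates for a small two-dimensional random-field Ising model. It is far smaller than the large-scale simulations in the abstract and omits the competing antiferromagnetic couplings that drive the mechanism studied there.

```python
# A toy Metropolis sketch for a 2D random-field Ising model; illustrative only.
import numpy as np

rng = np.random.default_rng(0)
L, J, h_sigma, T, sweeps = 32, 1.0, 0.5, 1.5, 200

spins = rng.choice([-1, 1], size=(L, L))
fields = rng.normal(0.0, h_sigma, size=(L, L))   # quenched random fields

def local_field(s, i, j):
    """Ferromagnetic neighbor sum (periodic boundaries) plus the random field."""
    return J * (s[(i+1) % L, j] + s[(i-1) % L, j] +
                s[i, (j+1) % L] + s[i, (j-1) % L]) + fields[i, j]

for _ in range(sweeps):
    for _ in range(L * L):
        i, j = rng.integers(L), rng.integers(L)
        dE = 2 * spins[i, j] * local_field(spins, i, j)  # energy cost of a flip
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1

print("magnetization per spin:", spins.mean())
```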
McEwen, Sara; Polatajko, Helene; Baum, Carolyn; Rios, Jorge; Cirone, Dianne; Doherty, Meghan; Wolf, Timothy
2014-01-01
Purpose: The purpose of this study was to estimate the effect of the Cognitive Orientation to daily Occupational Performance (CO-OP) approach compared to usual outpatient rehabilitation on activity and participation in people less than 3 months post stroke. Methods: An exploratory, single blind, randomized controlled trial with a usual care control arm was conducted. Participants referred to 2 stroke rehabilitation outpatient programs were randomized to receive either Usual Care or CO-OP. The primary outcome was actual performance of trained and untrained self-selected activities, measured using the Performance Quality Rating Scale (PQRS). Additional outcomes included the Canadian Occupational Performance Measure (COPM), the Stroke Impact Scale Participation Domain, the Community Participation Index, and the Self Efficacy Gauge. Results: Thirty-five (35) eligible participants were randomized; 26 completed the intervention. Post-intervention, PQRS change scores demonstrated CO-OP had a medium effect over Usual Care on trained self-selected activities (d=0.5) and a large effect on untrained (d=1.2). At a 3-month follow-up, PQRS change scores indicated a large effect of CO-OP on both trained (d=1.6) and untrained activities (d=1.1). CO-OP had a small effect on COPM and a medium effect on the Community Participation Index perceived control and the Self-Efficacy Gauge. Conclusion: CO-OP was associated with a large treatment effect on follow-up performances of self-selected activities, and demonstrated transfer to untrained activities. A larger trial is warranted. PMID:25416738
McEwen, Sara; Polatajko, Helene; Baum, Carolyn; Rios, Jorge; Cirone, Dianne; Doherty, Meghan; Wolf, Timothy
2015-07-01
The purpose of this study was to estimate the effect of the Cognitive Orientation to daily Occupational Performance (CO-OP) approach compared with usual outpatient rehabilitation on activity and participation in people <3 months poststroke. An exploratory, single-blind, randomized controlled trial, with a usual-care control arm, was conducted. Participants referred to 2 stroke rehabilitation outpatient programs were randomized to receive either usual care or CO-OP. The primary outcome was actual performance of trained and untrained self-selected activities, measured using the Performance Quality Rating Scale (PQRS). Additional outcomes included the Canadian Occupational Performance Measure (COPM), the Stroke Impact Scale Participation Domain, the Community Participation Index, and the Self-Efficacy Gauge. A total of 35 eligible participants were randomized; 26 completed the intervention. Post-intervention, PQRS change scores demonstrated that CO-OP had a medium effect over usual care on trained self-selected activities (d = 0.5) and a large effect on untrained activities (d = 1.2). At a 3-month follow-up, PQRS change scores indicated a large effect of CO-OP on both trained (d = 1.6) and untrained activities (d = 1.1). CO-OP had a small effect on COPM and a medium effect on the Community Participation Index perceived control and on the Self-Efficacy Gauge. CO-OP was associated with a large treatment effect on follow-up performances of self-selected activities and demonstrated transfer to untrained activities. A larger trial is warranted. © The Author(s) 2014.
Designing Large-Scale Multisite and Cluster-Randomized Studies of Professional Development
ERIC Educational Resources Information Center
Kelcey, Ben; Spybrook, Jessaca; Phelps, Geoffrey; Jones, Nathan; Zhang, Jiaqi
2017-01-01
We develop a theoretical and empirical basis for the design of teacher professional development studies. We build on previous work by (a) developing estimates of intraclass correlation coefficients for teacher outcomes using two- and three-level data structures, (b) developing estimates of the variance explained by covariates, and (c) modifying…
Predictors of Response to an Attention Modification Program in Generalized Social Phobia
ERIC Educational Resources Information Center
Amir, Nader; Taylor, Charles T.; Donohue, Michael C.
2011-01-01
Objective: At least 3 randomized, placebo-controlled, double-blind studies have supported the efficacy of computerized attention modification programs (AMPs) in reducing symptoms of anxiety in patients diagnosed with an anxiety disorder. In this study we examined patient characteristics that predicted response to AMP in a large sample of…
Replicating Experimental Impact Estimates Using a Regression Discontinuity Approach. NCEE 2012-4025
ERIC Educational Resources Information Center
Gleason, Philip M.; Resch, Alexandra M.; Berk, Jillian A.
2012-01-01
This NCEE Technical Methods Paper compares the estimated impacts of an educational intervention using experimental and regression discontinuity (RD) study designs. The analysis used data from two large-scale randomized controlled trials--the Education Technology Evaluation and the Teach for America Study--to provide evidence on the performance of…
ERIC Educational Resources Information Center
Smolkowski, Keith; Strycker, Lisa; Ward, Bryce
2016-01-01
This study evaluated the scale-up of a Safe & Civil Schools "Foundations: Establishing Positive Discipline Policies" positive behavioral interventions and supports initiative through 4 years of "real-world" implementation in a large urban school district. The study extends results from a previous randomized controlled trial…
Depressive Symptoms Negate the Beneficial Effects of Physical Activity on Mortality Risk
ERIC Educational Resources Information Center
Lee, Pai-Lin
2013-01-01
The aim of this study is to: (1) compare the association between various levels of physical activity (PA) and mortality; and (2) examine the potential modifying effect of depressive symptoms on the PA-mortality associations. Previous large scale randomized studies rarely assess the association in conjunction with modifying effects of depressive…
Topological analysis of the CfA redshift survey
NASA Technical Reports Server (NTRS)
Vogeley, Michael S.; Park, Changbom; Geller, Margaret J.; Huchra, John P.; Gott, J. Richard, III
1994-01-01
We study the topology of large-scale structure in the Center for Astrophysics Redshift Survey, which now includes approximately 12,000 galaxies with limiting magnitude m_B less than or equal to 15.5. The dense sampling and large volume of this survey allow us to compute the topology on smoothing scales from 6 to 20/h Mpc; we thus examine the topology of structure in both 'nonlinear' and 'linear' regimes. On smoothing scales less than or equal to 10/h Mpc this sample has 3 times the number of resolution elements of samples examined in previous studies. Isodensity surfaces of the smoothed galaxy density field demonstrate that coherent high-density structures and large voids dominate the galaxy distribution. We compute the genus-threshold density relation for isodensity surfaces of the CfA survey. To quantify phase correlation in these data, we compare the CfA genus with the genus of realizations of Gaussian random fields with the power spectrum measured for the CfA survey. On scales less than or equal to 10/h Mpc the observed genus amplitude is smaller than random phase (96% confidence level). This decrement reflects the degree of phase coherence in the observed galaxy distribution. In other words, the genus amplitude on these scales is not a good measure of the power spectrum slope. On scales greater than 10/h Mpc, where the galaxy distribution is roughly in the 'linear' regime, the genus amplitude is consistent with the random phase amplitude. The shape of the genus curve reflects the strong coherence in the observed structure; the observed genus curve appears broader than random phase (94% confidence level for smoothing scales less than or equal to 10/h Mpc) because the topology is spongelike over a very large range of density threshold. This departure from random phase is consistent with a distribution like a filamentary net of 'walls with holes.' On smoothing scales approaching approximately 20/h Mpc the shape of the CfA genus curve is consistent with random phase. There is very weak evidence for a shift of the genus toward a 'bubble-like' topology. To test cosmological models, we compute the genus for mock CfA surveys drawn from large (L greater than or approximately 400/h Mpc) N-body simulations of three variants of the cold dark matter (CDM) cosmogony. The genus amplitude of the 'standard' CDM model (omega h = 0.5, b = 1.5) differs from the observations (96% confidence level) on smoothing scales less than or approximately 10/h Mpc. An open CDM model (omega h = 0.2) and a CDM model with nonzero cosmological constant (omega h = 0.24, lambda_0 = 0.6) are consistent with the observed genus amplitude over the full range of smoothing scales. All of these models fail (97% confidence level) to match the broadness of the observed genus curve on smoothing scales less than or equal to 10/h Mpc.
Martinussen, Torben; Vansteelandt, Stijn; Tchetgen Tchetgen, Eric J; Zucker, David M
2017-12-01
The use of instrumental variables for estimating the effect of an exposure on an outcome is popular in econometrics, and increasingly so in epidemiology. This increasing popularity may be attributed to the natural occurrence of instrumental variables in observational studies that incorporate elements of randomization, either by design or by nature (e.g., random inheritance of genes). Instrumental variables estimation of exposure effects is well established for continuous outcomes and to some extent for binary outcomes. It is, however, largely lacking for time-to-event outcomes because of complications due to censoring and survivorship bias. In this article, we make a novel proposal under a class of structural cumulative survival models which parameterize time-varying effects of a point exposure directly on the scale of the survival function; these models are essentially equivalent with a semi-parametric variant of the instrumental variables additive hazards model. We propose a class of recursive instrumental variable estimators for these exposure effects, and derive their large sample properties along with inferential tools. We examine the performance of the proposed method in simulation studies and illustrate it in a Mendelian randomization study to evaluate the effect of diabetes on mortality using data from the Health and Retirement Study. We further use the proposed method to investigate potential benefit from breast cancer screening on subsequent breast cancer mortality based on the HIP-study. © 2017, The International Biometric Society.
2012-01-01
Background: With the current focus on personalized medicine, patient/subject-level inference is often of key interest in translational research. As a result, random effects models (REM) are becoming popular for patient-level inference. However, for very large data sets characterized by large sample size, it can be difficult to fit REM using commonly available statistical software such as SAS, since these require inordinate amounts of computer time and memory allocations beyond what is available, preventing model convergence. For example, in a retrospective cohort study of over 800,000 Veterans with type 2 diabetes with longitudinal data over 5 years, fitting REM via generalized linear mixed modeling using currently available standard procedures in SAS (e.g. PROC GLIMMIX) was very difficult, and the same problems exist in Stata's gllamm or R's lme packages. Thus, this study proposes a meta-regression approach, assesses its performance, and compares it with methods based on sampling of the full data. Data: We use both simulated and real data from a national cohort of Veterans with type 2 diabetes (n=890,394), which was created by linking multiple patient and administrative files, resulting in a cohort with longitudinal data collected over 5 years. Methods and results: The outcome of interest was mean annual HbA1c measured over a 5-year period. Using this outcome, we compared parameter estimates from the proposed random effects meta-regression (REMR) with estimates based on simple random sampling and VISN (Veterans Integrated Service Networks) based stratified sampling of the full data. Our results indicate that REMR provides parameter estimates that are less likely to be biased, with tighter confidence intervals, when the VISN-level estimates are homogeneous. Conclusion: When the interest is to fit REM in repeated measures data with very large sample size, REMR can be used as a good alternative. It leads to reasonable inference for both Gaussian and non-Gaussian responses if parameter estimates are homogeneous across VISNs. PMID:23095325
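The divide-and-combine idea behind REMR can be sketched compactly: fit the regression within each site, then pool the site-level coefficients with a standard DerSimonian-Laird random-effects meta-analysis. The data below are synthetic and the within-site model is a plain linear regression, so this illustrates the strategy rather than the paper's implementation.

```python
# A sketch of divide-and-combine estimation: per-site least squares followed by
# a DerSimonian-Laird random-effects pooling of the site-level slopes.
import numpy as np

rng = np.random.default_rng(0)
n_sites, n_per_site, true_beta = 20, 5000, 0.3

betas, variances = [], []
for _ in range(n_sites):
    x = rng.normal(size=n_per_site)
    site_effect = rng.normal(0, 0.05)                 # between-site heterogeneity
    y = (true_beta + site_effect) * x + rng.normal(size=n_per_site)
    X = np.column_stack([np.ones(n_per_site), x])
    coef, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = res[0] / (n_per_site - 2)                # residual variance
    betas.append(coef[1])
    variances.append(sigma2 * np.linalg.inv(X.T @ X)[1, 1])

b, v = np.array(betas), np.array(variances)
w = 1 / v
Q = np.sum(w * (b - np.sum(w * b) / w.sum()) ** 2)    # Cochran's Q
tau2 = max(0.0, (Q - (n_sites - 1)) / (w.sum() - np.sum(w**2) / w.sum()))
w_star = 1 / (v + tau2)                               # random-effects weights
pooled = np.sum(w_star * b) / w_star.sum()
se = np.sqrt(1 / w_star.sum())
print(f"pooled beta = {pooled:.3f} (SE {se:.3f}), tau^2 = {tau2:.5f}")
```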
Study of Nonlinear Dynamics of Intense Charged Particle Beams in the Paul Trap Simulator Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Hua
The Paul Trap Simulator Experiment (PTSX) is a compact laboratory device that simulates the nonlinear dynamics of intense charged particle beams propagating over a large distance in an alternating-gradient magnetic transport system. The radial quadrupole electric field forces on the charged particles in the Paul Trap are analogous to the radial forces on the charged particles in the quadrupole magnetic transport system. The amplitude of oscillating voltage applied to the cylindrical electrodes in PTSX is equivalent to the quadrupole magnetic field gradient in accelerators. The temporal periodicity in PTSX corresponds to the spatial periodicity in magnetic transport system. This thesis focuses on investigations of envelope instabilities and collective mode excitations, properties of high-intensity beams with significant space-charge effects, random noise-induced beam degradation and a laser-induced-fluorescence diagnostic. To better understand the nonlinear dynamics of the charged particle beams, it is critical to understand the collective processes of the charged particles. Charged particle beams support a variety of collective modes, among which the quadrupole mode and the dipole mode are of the greatest interest. We used quadrupole and dipole perturbations to excite the quadrupole and dipole mode respectively and study the effects of those collective modes on the charge bunch. The experimental and particle-in-cell (PIC) simulation results both show that when the frequency and the spatial structure of the external perturbation are matched with the corresponding collective mode, that mode will be excited to a large amplitude and resonates strongly with the external perturbation, usually causing expansion of the charge bunch and loss of particles. Machine imperfections are inevitable for accelerator systems, and we use random noise to simulate the effects of machine imperfection on the charged particle beams. The random noise can be Fourier decomposed into various frequency components and experimental results show that when the random noise has a large frequency component that matches a certain collective mode, the mode will also be excited and cause heating of the charge bunch. It is also noted that by rearranging the order of the random noise, the adverse effects of the random noise may be eliminated. As a non-destructive diagnostic method, a laser-induced-fluorescence (LIF) diagnostic is developed to study the transverse dynamics of the charged particle beams. The accompanying barium ion source and dye laser system are developed and tested.
Snell, Kym Ie; Ensor, Joie; Debray, Thomas Pa; Moons, Karel Gm; Riley, Richard D
2017-01-01
If individual participant data are available from multiple studies or clusters, then a prediction model can be externally validated multiple times. This allows the model's discrimination and calibration performance to be examined across different settings. Random-effects meta-analysis can then be used to quantify overall (average) performance and heterogeneity in performance. This typically assumes a normal distribution of 'true' performance across studies. We conducted a simulation study to examine this normality assumption for various performance measures relating to a logistic regression prediction model. We simulated data across multiple studies with varying degrees of variability in baseline risk or predictor effects and then evaluated the shape of the between-study distribution in the C-statistic, calibration slope, calibration-in-the-large, and E/O statistic, and possible transformations thereof. We found that a normal between-study distribution was usually reasonable for the calibration slope and calibration-in-the-large; however, the distributions of the C-statistic and E/O were often skewed across studies, particularly in settings with large variability in the predictor effects. Normality was vastly improved when using the logit transformation for the C-statistic and the log transformation for E/O, and we therefore recommend using these scales for meta-analysis. An illustrative example is given using a random-effects meta-analysis of the performance of QRISK2 across 25 general practices.
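The recommendation translates directly into practice: transform each study's C-statistic to the logit scale (with a delta-method standard error), pool with a random-effects model, and back-transform. The C-statistics and standard errors in the sketch below are invented for illustration.

```python
# A sketch of the recommended approach: meta-analyse the C-statistic on the
# logit scale, then back-transform the pooled value. Inputs are invented.
import numpy as np

c_stats = np.array([0.72, 0.68, 0.75, 0.80, 0.71])   # per-study C-statistics
se_c    = np.array([0.02, 0.03, 0.02, 0.04, 0.03])   # per-study standard errors

logit = np.log(c_stats / (1 - c_stats))
se_logit = se_c / (c_stats * (1 - c_stats))          # delta-method SE on logit scale

w = 1 / se_logit**2
fixed = np.sum(w * logit) / w.sum()
Q = np.sum(w * (logit - fixed) ** 2)
df = len(c_stats) - 1
tau2 = max(0.0, (Q - df) / (w.sum() - np.sum(w**2) / w.sum()))   # DerSimonian-Laird

w_star = 1 / (se_logit**2 + tau2)
pooled_logit = np.sum(w_star * logit) / w_star.sum()
pooled_c = 1 / (1 + np.exp(-pooled_logit))           # back-transform
print(f"pooled C-statistic: {pooled_c:.3f}, tau^2 (logit scale): {tau2:.4f}")
```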
Symmetry of interaction rules in incompletely connected random replicator ecosystems.
Kärenlampi, Petri P
2014-06-01
The evolution of an incompletely connected system of species with speciation and extinction is investigated in terms of random replicators. It is found that evolving random replicator systems with speciation do become large and complex, depending on speciation parameters. Antisymmetric interactions result in large systems, whereas systems with symmetric interactions remain small. A co-dominating feature is within-species interaction pressure: large within-species interaction increases species diversity. Average fitness evolves in all systems; however, symmetry and connectivity evolve in small systems only. Newcomers go extinct almost immediately in symmetric systems. The distribution of species lifetimes is determined for antisymmetric systems. The replicator systems investigated do not show any sign of self-organized criticality. The generalized Lotka-Volterra system is shown to be a tedious way of implementing the replicator system.
Magis, David
2014-11-01
In item response theory, the classical estimators of ability are highly sensitive to response disturbances and can return strongly biased estimates of the true underlying ability level. Robust methods were introduced to lessen the impact of such aberrant responses on the estimation process. The computation of asymptotic (i.e., large-sample) standard errors (ASE) for these robust estimators, however, has not yet been fully considered. This paper focuses on a broad class of robust ability estimators, defined by an appropriate selection of the weight function and the residual measure, for which the ASE is derived from the theory of estimating equations. The maximum likelihood (ML) and the robust estimators, together with their estimated ASEs, are then compared in a simulation study by generating random guessing disturbances. It is concluded that both the estimators and their ASE perform similarly in the absence of random guessing, while the robust estimator and its estimated ASE are less biased and outperform their ML counterparts in the presence of random guessing with large impact on the item response process. © 2013 The British Psychological Society.
Probability distribution of the entanglement across a cut at an infinite-randomness fixed point
NASA Astrophysics Data System (ADS)
Devakul, Trithep; Majumdar, Satya N.; Huse, David A.
2017-03-01
We calculate the probability distribution of entanglement entropy $S$ across a cut of a finite one-dimensional spin chain of length $L$ at an infinite-randomness fixed point using Fisher's strong randomness renormalization group (RG). Using the random transverse-field Ising model as an example, the distribution is shown to take the form $p(S|L) \sim L^{-\psi(k)}$, where $k \equiv S/\ln[L/L_0]$, the large deviation function $\psi(k)$ is found explicitly, and $L_0$ is a nonuniversal microscopic length. We discuss the implications of such a distribution on numerical techniques that rely on entanglement, such as matrix-product-state-based techniques. Our results are verified with numerical RG simulations, as well as the actual entanglement entropy distribution for the random transverse-field Ising model which we calculate for large $L$ via a mapping to Majorana fermions.
Large deviation approach to the generalized random energy model
NASA Astrophysics Data System (ADS)
Dorlas, T. C.; Dukes, W. M. B.
2002-05-01
The generalized random energy model is a generalization of the random energy model introduced by Derrida to mimic the ultrametric structure of the Parisi solution of the Sherrington-Kirkpatrick model of a spin glass. It was solved exactly in two special cases by Derrida and Gardner. A complete solution for the thermodynamics in the general case was given by Capocaccia et al. Here we use large deviation theory to analyse the model in a very straightforward way. We also show that the variational expression for the free energy can be evaluated easily using the Cauchy-Schwarz inequality.
Păsărelu, Costina Ruxandra; Andersson, Gerhard; Bergman Nordgren, Lise; Dobrean, Anca
2017-01-01
Anxiety and depressive disorders are often comorbid. Transdiagnostic and tailored treatments seem to be promising approaches in dealing with comorbidity. Although several primary studies have examined the effects of Internet-delivered cognitive behavior therapy (iCBT) for anxiety and depression, no meta-analysis including different types of iCBT that address comorbidity has been conducted so far. We conducted systematic searches in databases up to 1 July 2016. Only randomized trials comparing transdiagnostic/tailored iCBT for adult anxiety and/or depression with control groups were included. Nineteen randomized trials with a total of 2952 participants that met inclusion criteria were analyzed. The quality of the studies was high; however, the blinding criteria were not fulfilled. The uncontrolled effect size (Hedges' g) of transdiagnostic/tailored iCBT was large on anxiety and depression outcomes and medium on quality of life. The controlled effect size for iCBT was medium to large on anxiety and depression outcomes (anxiety: g = .82, 95% CI: .58-1.05; depression: g = .79, 95% CI: .59-1.00) and medium on quality of life (g = .56, 95% CI: .37-.73). Heterogeneity was small (quality of life) to moderate (anxiety, depression). There was a large effect on generic outcome measures and a moderate effect on comorbidities. When compared to disorder-specific treatments there were no differences on anxiety and quality of life outcomes; however, there were differences in depression outcomes. Transdiagnostic and tailored iCBT are effective interventions for anxiety disorders and depression. Future studies should investigate mechanisms of change and develop outcome measures for these interventions.
Effect of increasing disorder on domains of the 2d Coulomb glass.
Bhandari, Preeti; Malik, Vikas
2017-12-06
We have studied a two dimensional lattice model of Coulomb glass for a wide range of disorders at [Formula: see text]. The system was first annealed using Monte Carlo simulation. Further minimization of the total energy of the system was done using an algorithm developed by Baranovskii et al, followed by cluster flipping to obtain the pseudo-ground states. We have shown that the energy required to create a domain of linear size L in d dimensions is proportional to [Formula: see text]. Using Imry-Ma arguments given for the random field Ising model, one gets critical dimension [Formula: see text] for the Coulomb glass. The investigation of domains in the transition region shows a discontinuity in staggered magnetization, which is an indication of a first-order type transition from the charge-ordered phase to the disordered phase. The structure and nature of random field fluctuations of the second largest domain in the Coulomb glass are inconsistent with the assumptions of Imry and Ma, as was also reported for the random field Ising model. The study of domains showed that in the transition region there were mostly two large domains, and that as disorder was increased the two large domains remained, but a large number of small domains also opened up. We have also studied the properties of the second largest domain as a function of disorder. We furthermore analysed the effect of disorder on the density of states, and showed a transition from a hard gap at low disorders to a soft gap at higher disorders. At [Formula: see text], we have analysed the soft gap in detail, and found that the density of states deviates slightly ([Formula: see text]) from the linear behaviour in two dimensions. Analysis of local minima shows that the pseudo-ground states have similar structure.
NASA Astrophysics Data System (ADS)
Le Doussal, Pierre; Petković, Aleksandra; Wiese, Kay Jörg
2012-06-01
We study the motion of an elastic object driven in a disordered environment in presence of both dissipation and inertia. We consider random forces with the statistics of random walks and reduce the problem to a single degree of freedom. It is the extension of the mean-field Alessandro-Beatrice-Bertotti-Montorsi (ABBM) model in presence of an inertial mass m. While the ABBM model can be solved exactly, its extension to inertia exhibits complicated history dependence due to oscillations and backward motion. The characteristic scales for avalanche motion are studied from numerics and qualitative arguments. To make analytical progress, we consider two variants which coincide with the original model whenever the particle moves only forward. Using a combination of analytical and numerical methods together with simulations, we characterize the distributions of instantaneous acceleration and velocity, and compare them in these three models. We show that for large driving velocity, all three models share the same large-deviation function for positive velocities, which is obtained analytically for small and large m, as well as for m=6/25. The effect of small additional thermal and quantum fluctuations can be treated within an approximate method.
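A crude discretized version of the inertial dynamics conveys the setup: m·a = -γv + k(w(t) - u) + F(u), with the pinning force F a random walk in the position u, stored on a grid so that backward motion revisits the same force, which is the source of the history dependence mentioned above. All parameters and the discretization below are illustrative.

```python
# A toy discretization of the inertial ABBM-type dynamics described above.
# F(u) is a Brownian random force in u, memoized on a grid so that backward
# motion sees the same quenched force landscape.
import numpy as np

rng = np.random.default_rng(0)
m, gamma, k, vdrive = 0.5, 1.0, 0.2, 1.0
dt, du, sigma, steps = 1e-3, 1e-3, 1.0, 200_000

force = {0: 0.0}          # pinning force sampled on a grid in u
hi = lo = 0               # frontier indices of the generated landscape

def F(u):
    """Extend the random-walk force lazily to the grid cell containing u."""
    global hi, lo
    i = int(np.floor(u / du))
    while i > hi:
        hi += 1
        force[hi] = force[hi - 1] + sigma * np.sqrt(du) * rng.normal()
    while i < lo:
        lo -= 1
        force[lo] = force[lo + 1] + sigma * np.sqrt(du) * rng.normal()
    return force[i]

u = v = 0.0
vs = []
for n in range(steps):
    w = vdrive * n * dt                       # anchor of the driving spring
    a = (-gamma * v + k * (w - u) + F(u)) / m
    v += a * dt
    u += v * dt
    vs.append(v)

print("late-time mean velocity:", np.mean(vs[steps // 2:]))  # ~ vdrive for long runs
```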
What's new in stroke? The top 10 studies of 2009-2011: part II.
Hart, Robert G; Oczkowski, Wiesław J
2011-06-01
Five studies published between 2009 and 2011 are reviewed that importantly inform stroke prevention for patients with atrial fibrillation (AF) or with cervical carotid artery stenosis. Two large, phase III randomized trials tested novel oral anticoagulants for stroke prevention in patients with AF: the direct thrombin inhibitor dabigatran 150 mg twice daily was superior to adjusted-dose warfarin (RE-LY trial) and the direct factor Xa inhibitor apixaban was far superior to aspirin in patients deemed unsuitable for warfarin (AVERROES trial). For both novel anticoagulants, major bleeding rates were similar to the comparator treatment. Clopidogrel plus aspirin was more efficacious than aspirin alone for prevention of stroke in patients with AF deemed unsuitable for warfarin, but major bleeding was significantly increased with dual antiplatelet therapy (ACTIVE A trial). Two large randomized trials (CREST, ICSS) provide the best available data on the short-term risks of carotid artery stenting vs. endarterectomy. In both trials, periprocedural stroke was more frequent with stenting than with endarterectomy, but the increased risk was largely confined to patients >70 years old. For younger patients, periprocedural risks were comparable with stenting or endarterectomy, but long-term outcomes are required to assess the relative merits of the two procedures.
Polymer Dynamics from Synthetic to Biological Macromolecules
NASA Astrophysics Data System (ADS)
Richter, D.; Niedzwiedz, K.; Monkenbusch, M.; Wischnewski, A.; Biehl, R.; Hoffmann, B.; Merkel, R.
2008-02-01
High resolution neutron scattering, together with a meticulous choice of the contrast conditions, allows access to the large-scale dynamics of soft materials, including biological molecules, in space and time. In this contribution we present two examples: one from the world of synthetic polymers, the other from biomolecules. First, we address the peculiar dynamics of miscible polymer blends with very different component glass transition temperatures. Poly(methyl methacrylate) (PMMA) and poly(ethylene oxide) (PEO) are perfectly miscible but differ in glass transition temperature by 200 K. We present quasielastic neutron scattering investigations on the dynamics of the fast component in the range from ångströms to nanometers over a time frame of five orders of magnitude. All data may be consistently described in terms of a Rouse model with random friction, reflecting the random environment imposed by the nearly frozen PMMA matrix on the fast mobile PEO. In the second part we touch on some new developments relating to large-scale internal dynamics of proteins by neutron spin echo. We report results of some pioneering studies which show the feasibility of such experiments on large-scale protein motion, which will most likely initiate further studies in the future.
Multi-field inflation with a random potential
NASA Astrophysics Data System (ADS)
Tye, S.-H. Henry; Xu, Jiajun; Zhang, Yang
2009-04-01
Motivated by the possibility of inflation in the cosmic landscape, which may be approximated by a complicated potential, we study the density perturbations in multi-field inflation with a random potential. The random potential causes the inflaton to undergo a Brownian-like motion with a drift in the D-dimensional field space, allowing entropic perturbation modes to continuously and randomly feed into the adiabatic mode. To quantify such an effect, we employ a stochastic approach to evaluate the two-point and three-point functions of primordial perturbations. We find that in the weakly random scenario, where the stochastic scatterings are frequent but mild, the resulting power spectrum resembles that of the single field slow-roll case, with up to 2% more red tilt. The strongly random scenario, in which the coarse-grained motion of the inflaton is significantly slowed down by the scatterings, leads to rich phenomenologies. The power spectrum exhibits primordial fluctuations on all angular scales. Such features may already be hiding in the error bars of the observed CMB TT (as well as TE and EE) power spectra and have been smoothed out by binning of data points. With more data coming in the future, we expect these features to be detected or falsified. On the other hand, the tensor power spectrum itself is free of fluctuations, and the tensor-to-scalar ratio is enhanced by the large ratio of the Brownian-like motion speed over the drift speed. In addition, a large negative running of the power spectral index is possible. Non-Gaussianity is generically suppressed by the growth of adiabatic perturbations on super-horizon scales, and is negligible in the weakly random scenario. However, non-Gaussianity can possibly be enhanced by resonant effects in the strongly random scenario or arise from the entropic perturbations during the onset of (p)reheating if the background inflaton trajectory exhibits particular properties. The formalism developed in this paper can be applied to a wide class of multi-field inflation models including, e.g., the N-flation scenario.
Free Vibration of Uncertain Unsymmetrically Laminated Beams
NASA Technical Reports Server (NTRS)
Kapania, Rakesh K.; Goyal, Vijay K.
2001-01-01
Monte Carlo simulation and stochastic FEA are used to predict randomness in the free vibration response of thin unsymmetrically laminated beams. For the present study, it is assumed that randomness in the response is caused only by uncertainties in the ply orientations. The ply orientations may become random or uncertain during the manufacturing process. A new 16-dof beam element, based on the first-order shear deformation beam theory, is used to study the stochastic nature of the natural frequencies. Using variational principles, the element stiffness matrix and mass matrix are obtained through analytical integration. Using a random sequence, a large data set is generated containing possible random ply orientations. These data are assumed to be symmetric. The stochastic-based finite element model for free vibrations predicts the relation between the randomness in fundamental natural frequencies and the randomness in ply orientation. The sensitivity derivatives are calculated numerically through an exact formulation. The squared fundamental natural frequencies are expressed in terms of deterministic and probabilistic quantities, allowing one to determine how sensitive they are to variations in ply angles. The predicted mean-valued fundamental natural frequency squared and the variance of the present model are in good agreement with Monte Carlo simulation. Results also show that variations of plus or minus 5 degrees in ply angles can affect the free vibration response of unsymmetrically and symmetrically laminated beams.
Comparing spatial regression to random forests for large ...
Environmental data may be “large” due to number of records, number of covariates, or both. Random forests has a reputation for good predictive performance when using many covariates, whereas spatial regression, when using reduced rank methods, has a reputation for good predictive performance when using many records. In this study, we compare these two techniques using a data set containing the macroinvertebrate multimetric index (MMI) at 1859 stream sites with over 200 landscape covariates. Our primary goal is predicting MMI at over 1.1 million perennial stream reaches across the USA. For spatial regression modeling, we develop two new methods to accommodate large data: (1) a procedure that estimates optimal Box-Cox transformations to linearize covariate relationships; and (2) a computationally efficient covariate selection routine that takes into account spatial autocorrelation. We show that our new methods lead to cross-validated performance similar to random forests, but that there is an advantage for spatial regression when quantifying the uncertainty of the predictions. Simulations are used to clarify advantages for each method. This research investigates different approaches for modeling and mapping national stream condition. We use MMI data from the EPA's National Rivers and Streams Assessment and predictors from StreamCat (Hill et al., 2015). Previous studies have focused on modeling the MMI condition classes (i.e., good, fair, and poor).
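The random-forests side of such a comparison is a few lines in scikit-learn. The sketch below uses synthetic stand-ins for the MMI response and the StreamCat-like covariates (matching the abstract's dimensions of 1859 sites and roughly 200 covariates); the spatial-regression side, with its Box-Cox linearization and spatially aware covariate selection, is not reproduced here.

```python
# A compact sketch of the random-forests benchmark, on synthetic data whose
# dimensions echo the abstract (1859 sites, ~200 covariates).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_sites, n_covariates = 1859, 200

X = rng.normal(size=(n_sites, n_covariates))
# Response driven by a handful of covariates plus noise, standing in for MMI.
y = X[:, :5] @ rng.normal(size=5) + rng.normal(scale=0.5, size=n_sites)

rf = RandomForestRegressor(n_estimators=500, min_samples_leaf=5,
                           n_jobs=-1, random_state=0)
scores = cross_val_score(rf, X, y, cv=5, scoring="r2")
print("cross-validated R^2:", scores.mean().round(3))
```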
Muroff, Jordana; Amodeo, Maryann; Larson, Mary Jo; Carey, Margaret; Loftin, Ralph D
2011-01-01
This article describes a data management system (DMS) developed to support a large-scale randomized study of an innovative web-course that was designed to improve substance abuse counselors' knowledge and skills in applying a substance abuse treatment method (i.e., cognitive behavioral therapy; CBT). The randomized trial compared the performance of web-course-trained participants (intervention group) and printed-manual-trained participants (comparison group) to determine the effectiveness of the web-course in teaching CBT skills. A single DMS was needed to support all aspects of the study: web-course delivery and management, as well as randomized trial management. The authors briefly reviewed several other systems that were described as built either to handle randomized trials or to deliver and evaluate web-based training. However it was clear that these systems fell short of meeting our needs for simultaneous, coordinated management of the web-course and the randomized trial. New England Research Institute's (NERI) proprietary Advanced Data Entry and Protocol Tracking (ADEPT) system was coupled with the web-programmed course and customized for our purposes. This article highlights the requirements for a DMS that operates at the intersection of web-based course management systems and randomized clinical trial systems, and the extent to which the coupled, customized ADEPT satisfied those requirements. Recommendations are included for institutions and individuals considering conducting randomized trials and web-based training programs, and seeking a DMS that can meet similar requirements.
Cho, Hee Ju; Chung, Jae Hoon; Jo, Jung Ki; Kang, Dong Hyuk; Cho, Jeong Man; Yoo, Tag Keun; Lee, Seung Wook
2013-12-01
Randomized controlled trials are one of the most reliable resources for assessing the effectiveness and safety of medical treatments. Low-quality randomized controlled trials carry a large bias that can ultimately impair the reliability of their conclusions. The present study aimed to evaluate the quality of randomized controlled trials published in International Journal of Urology by using multiple quality assessment tools. Randomized controlled trial articles published in International Journal of Urology were found using the PubMed MEDLINE database, and quality assessment was carried out with three distinct assessment tools: the Jadad scale, the van Tulder scale and the Cochrane Collaboration Risk of Bias Tool. The quality of randomized controlled trials was analyzed by publication year, type of subjects, intervention, presence of funding and whether an institutional review board reviewed the study. A total of 68 randomized controlled trial articles were published among 1399 original articles in International Journal of Urology. Among these randomized controlled trials, 10 (2.70%) were from 1994 to 1999, 23 (4.10%) were from 2000 to 2005 and 35 (4.00%) were from 2006 to 2011 (P = 0.494). On assessment with the Jadad and van Tulder scales, the numbers and percentage of high-quality randomized controlled trials increased over time. The studies that had institutional review board reviews, funding resources or that were carried out in multiple institutions had an increased percentage of high-quality articles. The numbers and percentage of high-quality randomized controlled trials published in International Journal of Urology have increased over time. Furthermore, randomized controlled trials with funding resources, institutional review board reviews or those carried out in multiple institutions have been found to be of higher quality compared with others not presenting these features. © 2013 The Japanese Urological Association.
NASA Astrophysics Data System (ADS)
Livan, Giacomo; Alfarano, Simone; Scalas, Enrico
2011-07-01
We study some properties of eigenvalue spectra of financial correlation matrices. In particular, we investigate the nature of the large eigenvalue bulks which are observed empirically, and which have often been regarded as a consequence of the supposedly large amount of noise contained in financial data. We challenge this common knowledge by acting on the empirical correlation matrices of two data sets with a filtering procedure which highlights some of the cluster structure they contain, and we analyze the consequences of such filtering on eigenvalue spectra. We show that empirically observed eigenvalue bulks emerge as superpositions of smaller structures, which in turn emerge as a consequence of cross correlations between stocks. We interpret and corroborate these findings in terms of factor models, and we compare empirical spectra to those predicted by random matrix theory for such models.
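The standard random-matrix benchmark in this literature compares the eigenvalues of an empirical correlation matrix with the Marchenko-Pastur bulk [λ₋, λ₊] = (1 ∓ 1/√q)² expected for purely random returns, where q = T/N is the ratio of observations to assets. The sketch below simulates returns with a single common factor, so the top eigenvalue pops out of the bulk the way a "market mode" does in real data; sizes and the factor loading are illustrative.

```python
# Comparing correlation-matrix eigenvalues to the Marchenko-Pastur bulk,
# on simulated returns with one common "market" factor.
import numpy as np

rng = np.random.default_rng(0)
n_stocks, n_days = 100, 500
q = n_days / n_stocks

market = rng.normal(size=n_days)
returns = 0.3 * market[:, None] + rng.normal(size=(n_days, n_stocks))
corr = np.corrcoef(returns, rowvar=False)
eig = np.linalg.eigvalsh(corr)                    # ascending eigenvalues

lam_max = (1 + 1 / np.sqrt(q)) ** 2
lam_min = (1 - 1 / np.sqrt(q)) ** 2
outliers = eig[(eig > lam_max) | (eig < lam_min)]
print(f"MP bulk: [{lam_min:.2f}, {lam_max:.2f}], eigenvalues outside: {len(outliers)}")
print("largest eigenvalue (market mode):", eig[-1].round(2))
```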
Wang, Chenchen; Iversen, Maura D; McAlindon, Timothy; Harvey, William F; Wong, John B; Fielding, Roger A; Driban, Jeffrey B; Price, Lori Lyn; Rones, Ramel; Gamache, Tressa; Schmid, Christopher H
2014-09-08
Knee osteoarthritis (OA) causes pain and long-term disability with annual healthcare costs exceeding $185 billion in the United States. Few medical remedies effectively influence the course of the disease. Finding effective treatments to maintain function and quality of life in patients with knee OA is one of the national priorities identified by the Institute of Medicine. We are currently conducting the first comparative effectiveness and cost-effectiveness randomized trial of Tai Chi versus a physical-therapy regimen in a sample of patients with symptomatic and radiographically confirmed knee OA. This article describes the design and conduct of this trial. A single-center, 52-week, comparative effectiveness randomized controlled trial of Tai Chi versus a standardized physical-therapy regimen is being conducted at an urban tertiary medical center in Boston, Massachusetts. The study population consists of adults ≥ 40 years of age with symptomatic and radiographic knee OA (American College of Rheumatology criteria). Participants are randomly allocated to either 12 weeks of Tai Chi (2x/week) or Physical Therapy (2x/week for 6 weeks, followed by 6 weeks of rigorously monitored home exercise). The primary outcome measure is the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) pain subscale at 12 weeks. Secondary outcomes include WOMAC stiffness and function domain scores, lower extremity strength and power, functional balance, physical performance tests, psychological and psychosocial functioning, durability effects, health related quality of life, and healthcare utilization at 12, 24 and 52 weeks. This study will be the first randomized comparative-effectiveness and cost-effectiveness trial of Tai Chi versus Physical Therapy in a large symptomatic knee OA population with long-term follow up. We present here a robust and well-designed randomized comparative-effectiveness trial that also explores multiple outcomes to elucidate the potential mechanisms of mind-body effect for a major disabling disease with substantial health burdens and economic costs. Results of this study are expected to have important public health implications for the large and growing population with knee OA. ClinicalTrials.gov identifier: NCT01258985.
Generic features of the primary relaxation in glass-forming materials (Review Article)
NASA Astrophysics Data System (ADS)
Kokshenev, Valery B.
2017-08-01
We discuss structural relaxation in molecular and polymeric supercooled liquids, metallic alloys and orientational glass crystals. The study stresses the relationships between observables that arise from underlying constraints imposed on the degrees of freedom of vitrifying systems. A self-consistent parametrization of the α-timescale at the macroscopic level results in a material- and model-independent universal equation relating three fundamental temperatures characteristic of the primary relaxation, which is verified numerically in all studied glass formers. During the primary relaxation, the corresponding small and large mesoscopic clusters modify their size and structure in a self-similar way, regardless of the underlying microscopic realizations. We show that cluster-shape similarity, rather than a fictive divergence of the cluster size, gives rise to the universal features observed in primary relaxation. In all glass formers with structural disorder, including orientational-glass materials (with the exception of plastic crystals), structural relaxation is shown to be driven by local random fields. Within the dynamic stochastic approach, the universal subdiffusive dynamics corresponds to random walks on small and large fractals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ben-Naim, Eli; Krapivsky, Paul
Here we generalize the ordinary aggregation process to allow for choice. In ordinary aggregation, two random clusters merge and form a larger aggregate. In our implementation of choice, a target cluster and two candidate clusters are randomly selected and the target cluster merges with the larger of the two candidate clusters. We study the long-time asymptotic behavior and find that, as in ordinary aggregation, the size density adheres to the standard scaling form. However, aggregation with choice exhibits a number of different features. First, the density of the smallest clusters exhibits anomalous scaling. Second, both the small-size and the large-size tails of the density are overpopulated, at the expense of the density of moderate-size clusters. Finally, we also study the complementary case where the smaller candidate cluster participates in the aggregation process and find an abundance of moderate clusters at the expense of small and large clusters. Additionally, we investigate aggregation processes with choice among multiple candidate clusters and a symmetric implementation where the choice is between two pairs of clusters.
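The merging rule described above is easy to reproduce in a toy Monte Carlo. The sketch below (monomer count and step count are invented) repeatedly draws a target and two candidate clusters and merges the target with the larger candidate, per the abstract's implementation of choice.

import random

def aggregate_with_choice(n_monomers=10_000, steps=9_000, seed=1):
    # Toy Monte Carlo of aggregation with choice: a random target cluster
    # merges with the LARGER of two randomly chosen candidate clusters.
    random.seed(seed)
    clusters = [1] * n_monomers
    for _ in range(steps):
        if len(clusters) < 3:
            break
        i, j, k = random.sample(range(len(clusters)), 3)  # target + 2 candidates
        cand = j if clusters[j] >= clusters[k] else k
        clusters[i] += clusters[cand]                     # merge target with winner
        clusters.pop(cand)
    return clusters

sizes = aggregate_with_choice()
print("clusters left:", len(sizes), "largest:", max(sizes))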
Patil, Sumeet R; Arnold, Benjamin F; Salvatore, Alicia L; Briceno, Bertha; Ganguly, Sandipan; Colford, John M; Gertler, Paul J
2014-08-01
Poor sanitation is thought to be a major cause of enteric infections among young children. However, there are no previously published randomized trials to measure the health impacts of large-scale sanitation programs. India's Total Sanitation Campaign (TSC) is one such program that seeks to end the practice of open defecation by changing social norms and behaviors, and providing technical support and financial subsidies. The objective of this study was to measure the effect of the TSC implemented with capacity building support from the World Bank's Water and Sanitation Program in Madhya Pradesh on availability of individual household latrines (IHLs), defecation behaviors, and child health (diarrhea, highly credible gastrointestinal illness [HCGI], parasitic infections, anemia, growth). We conducted a cluster-randomized, controlled trial in 80 rural villages. Field staff collected baseline measures of sanitation conditions, behaviors, and child health (May-July 2009), and revisited households 21 months later (February-April 2011) after the program was delivered. The study enrolled a random sample of 5,209 children <5 years old from 3,039 households that had at least one child <24 months at the beginning of the study. A random subsample of 1,150 children <24 months at enrollment were tested for soil-transmitted helminth and protozoan infections in stool. The randomization successfully balanced intervention and control groups, and we estimated differences between groups in an intention-to-treat analysis. The intervention increased the percentage of households in a village with improved sanitation facilities as defined by the WHO/UNICEF Joint Monitoring Programme by an average of 19% (95% CI for difference: 12%-26%; group means: 22% control versus 41% intervention) and decreased open defecation among adults by an average of 10% (95% CI for difference: 4%-15%; group means: 73% intervention versus 84% control). However, the intervention did not improve child health measured in terms of multiple health outcomes (diarrhea, HCGI, helminth infections, anemia, growth). Limitations of the study included a relatively short follow-up period following implementation, evidence for contamination in ten of the 40 control villages, and possible bias in self-reported outcomes for diarrhea, HCGI, and open defecation behaviors. The intervention led to modest increases in availability of IHLs and even more modest reductions in open defecation. These improvements were insufficient to improve child health outcomes (diarrhea, HCGI, parasite infection, anemia, growth). The results underscore the difficulty of achieving adequately large improvements in sanitation levels to deliver expected health benefits within large-scale rural sanitation programs. ClinicalTrials.gov NCT01465204. Please see later in the article for the Editors' Summary.
Qigong Exercises for the Management of Type 2 Diabetes Mellitus
Close, Jacqueline R.; Lilly, Harold Ryan; Guillaume, Nathalie; Sun, Guan-Cheng
2017-01-01
Background: The purpose of this article is to clarify and define medical qigong and to identify an appropriate study design and methodology for a large-scale study looking at the effects of qigong in patients with type 2 diabetes mellitus (T2DM), specifically subject enrollment criteria, selection of the control group and study duration. Methods: A comprehensive literature review of English databases was used to locate articles from 1980–May 2017 involving qigong and T2DM. Control groups, subject criteria and the results of major diabetic markers were reviewed and compared within each study. Definitions of qigong and its differentiation from physical exercise were also considered. Results: After a thorough review, it was found that qigong shows positive effects on T2DM; however, there were inconsistencies in control groups, research subjects and diabetic markers analyzed. It was also discovered that there is a large variation in styles and definitions of qigong. Conclusions: Qigong exercise has shown promising results in clinical experience and in randomized, controlled pilot studies for affecting aspects of T2DM including blood glucose, triglycerides, total cholesterol, weight, BMI and insulin resistance. Due to the inconsistencies in study design and methods and the lack of large-scale studies, further well-designed randomized controlled trials (RCTs) are needed to evaluate the 'vital energy' or qi aspect of internal medical qigong in people who have been diagnosed with T2DM. PMID:28930273
A pilot cluster randomized controlled trial of structured goal-setting following stroke.
Taylor, William J; Brown, Melanie; Levack, William; McPherson, Kathryn M; Reed, Kirk; Dean, Sarah G; Weatherall, Mark
2012-04-01
To determine the feasibility, the cluster design effect and the variance and minimal clinically important difference of the primary outcome in a pilot study of a structured approach to goal-setting. A cluster randomized controlled trial. Inpatient rehabilitation facilities. People admitted to inpatient rehabilitation following stroke who had sufficient cognition to engage in structured goal-setting and complete the primary outcome measure. Structured goal elicitation using the Canadian Occupational Performance Measure. Quality of life at 12 weeks using the Schedule for Individualised Quality of Life (SEIQOL-DW), Functional Independence Measure, Short Form 36 and Patient Perception of Rehabilitation (measuring satisfaction with rehabilitation). Assessors were blinded to the intervention. Four rehabilitation services and 41 patients were randomized. We found high values of the intraclass correlation for the outcome measures (ranging from 0.03 to 0.40) and high variance of the SEIQOL-DW (SD 19.6) relative to the minimal clinically important difference of 2.1, leading to impractically large sample size requirements for a cluster randomized design. A cluster randomized design is not a practical means of avoiding contamination effects in studies of inpatient rehabilitation goal-setting. Other techniques for coping with contamination effects are necessary.
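The sample-size problem the authors ran into follows directly from the standard design-effect formula for cluster randomization, DEFF = 1 + (m - 1) * ICC for clusters of size m. A minimal sketch (the unclustered sample size of 128 and cluster size of 10 are invented illustrations, not numbers from the study) shows how the high ICCs reported here inflate requirements:

def cluster_trial_sample_size(n_individual, cluster_size, icc):
    # Inflate an individually randomized sample size by the design effect.
    deff = 1 + (cluster_size - 1) * icc
    return n_individual * deff

for icc in (0.03, 0.40):  # the low and high ends of the ICCs reported above
    n = cluster_trial_sample_size(n_individual=128, cluster_size=10, icc=icc)
    print(f"ICC={icc:.2f}: need ~{n:.0f} participants (vs 128 unclustered)")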
ERIC Educational Resources Information Center
Taylor, Joseph; Kowalski, Susan; Wilson, Christopher; Getty, Stephen; Carlson, Janet
2013-01-01
This paper focuses on the trade-offs that lie at the intersection of methodological requirements for causal effect studies and policies that affect how and to what extent schools engage in such studies. More specifically, current federal funding priorities encourage large-scale randomized studies of interventions in authentic settings. At the same…
ERIC Educational Resources Information Center
Jayanthi, Madhavi; Dimino, Joseph; Gersten, Russell; Taylor, Mary Jo; Haymond, Kelly; Smolkowski, Keith; Newman-Gonchar, Rebecca
2018-01-01
The purpose of this replication study was to examine the impact of the Teacher Study Group (TSG) professional development in vocabulary on first-grade teachers' knowledge of vocabulary instruction and observed teaching practice, and on students' vocabulary knowledge. Sixty-two schools from 16 districts in four states were randomly assigned to…
ERIC Educational Resources Information Center
What Works Clearinghouse, 2015
2015-01-01
The study authors examined the impact of "Responsive Classroom," a professional development program for teachers, on student achievement. This study took place in a large, ethnically and socioeconomically diverse district in a mid-Atlantic state. The intervention was implemented during 3 school years from 2008 to 2011. Study authors…
Braschel, Melissa C; Svec, Ivana; Darlington, Gerarda A; Donner, Allan
2016-04-01
Many investigators rely on previously published point estimates of the intraclass correlation coefficient rather than on their associated confidence intervals to determine the required size of a newly planned cluster randomized trial. Although confidence interval methods for the intraclass correlation coefficient that can be applied to community-based trials have been developed for a continuous outcome variable, fewer methods exist for a binary outcome variable. The aim of this study is to evaluate confidence interval methods for the intraclass correlation coefficient applied to binary outcomes in community intervention trials enrolling a small number of large clusters. Existing methods for confidence interval construction are examined and compared to a new ad hoc approach based on dividing clusters into a large number of smaller sub-clusters and subsequently applying existing methods to the resulting data. Monte Carlo simulation is used to assess the width and coverage of confidence intervals for the intraclass correlation coefficient based on Smith's large sample approximation of the standard error of the one-way analysis of variance estimator, an inverted modified Wald test for the Fleiss-Cuzick estimator, and intervals constructed using a bootstrap-t applied to a variance-stabilizing transformation of the intraclass correlation coefficient estimate. In addition, a new approach is applied in which clusters are randomly divided into a large number of smaller sub-clusters with the same methods applied to these data (with the exception of the bootstrap-t interval, which assumes large cluster sizes). These methods are also applied to a cluster randomized trial on adolescent tobacco use for illustration. When applied to a binary outcome variable in a small number of large clusters, existing confidence interval methods for the intraclass correlation coefficient provide poor coverage. However, confidence intervals constructed using the new approach combined with Smith's method provide nominal or close to nominal coverage when the intraclass correlation coefficient is small (<0.05), as is the case in most community intervention trials. This study concludes that when a binary outcome variable is measured in a small number of large clusters, confidence intervals for the intraclass correlation coefficient may be constructed by dividing existing clusters into sub-clusters (e.g. groups of 5) and using Smith's method. The resulting confidence intervals provide nominal or close to nominal coverage across a wide range of parameters when the intraclass correlation coefficient is small (<0.05). Application of this method should provide investigators with a better understanding of the uncertainty associated with a point estimator of the intraclass correlation coefficient used for determining the sample size needed for a newly designed community-based trial. © The Author(s) 2015.
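A rough sketch of the paper's recipe follows: split large clusters into sub-clusters (groups of 5), estimate the ICC by one-way ANOVA, and form a Wald interval from a Smith-type large-sample variance. The data are simulated, and the variance constant below is one common (Swiger-style) form that I am assuming here; check it against the paper's actual reference before use.

import numpy as np

def anova_icc(groups):
    # One-way ANOVA estimator of the ICC for equal-size clusters
    # (works on 0/1 outcomes as well as continuous ones).
    groups = [np.asarray(g, dtype=float) for g in groups]
    k, m = len(groups), len(groups[0])
    grand = np.mean(np.concatenate(groups))
    msb = m * sum((g.mean() - grand) ** 2 for g in groups) / (k - 1)
    msw = sum(((g - g.mean()) ** 2).sum() for g in groups) / (k * (m - 1))
    return (msb - msw) / (msb + (m - 1) * msw)

def smith_ci(rho, k, m, z=1.96):
    # Wald interval from a Smith-type large-sample variance of the ANOVA
    # estimator; the exact constant is an assumption (one common form).
    n = k * m
    var = (2 * (n - 1) * (1 - rho) ** 2 * (1 + (m - 1) * rho) ** 2
           / (m ** 2 * (n - k) * (k - 1)))
    half = z * np.sqrt(var)
    return rho - half, rho + half

rng = np.random.default_rng(2)
big = [rng.binomial(1, p, size=200) for p in rng.beta(2, 60, size=8)]  # 8 large clusters
subs = [c[i:i + 5] for c in big for i in range(0, 200, 5)]             # sub-clusters of 5
rho = anova_icc(subs)
print(rho, smith_ci(rho, k=len(subs), m=5))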
Marcus, Hani J; Seneci, Carlo A; Hughes-Hallett, Archie; Cundy, Thomas P; Nandi, Dipankar; Yang, Guang-Zhong; Darzi, Ara
2016-04-01
Surgical approaches such as transanal endoscopic microsurgery, which utilize small operative working spaces and are necessarily single-port, are particularly demanding with standard instruments and have not been widely adopted. The aim of this study was to simultaneously compare surgical performance in single-port versus multiport approaches, and in small versus large working spaces. Ten novice, 4 intermediate, and 1 expert surgeons were recruited from a university hospital. A preclinical randomized crossover study design was implemented, comparing performance under the following conditions: (1) multiport approach and large working space, (2) multiport approach and intermediate working space, (3) single-port approach and large working space, (4) single-port approach and intermediate working space, and (5) single-port approach and small working space. In each case, participants performed peg transfer and pattern cutting tasks, and each task repetition was scored. Intermediate and expert surgeons performed significantly better than novices in all conditions (P < .05). Performance in single-port surgery was significantly worse than in multiport surgery (P < .01). In multiport surgery, there was a nonsignificant trend toward worsened performance in the intermediate versus the large working space. In single-port surgery, there was a converse trend: performances in the intermediate and small working spaces were significantly better than in the large working space. Single-port approaches were significantly more technically challenging than multiport approaches, possibly reflecting loss of instrument triangulation. Surprisingly, in single-port approaches, in which triangulation was no longer a factor, performance in large working spaces was worse than in intermediate and small working spaces. © The Author(s) 2015.
Simple Emergent Power Spectra from Complex Inflationary Physics
NASA Astrophysics Data System (ADS)
Dias, Mafalda; Frazer, Jonathan; Marsh, M. C. David
2016-09-01
We construct ensembles of random scalar potentials for N_f interacting scalar fields using nonequilibrium random matrix theory, and use these to study the generation of observables during small-field inflation. For N_f = O(few), these heavily featured scalar potentials give rise to power spectra that are highly nonlinear, at odds with observations. For N_f ≫ 1, the superhorizon evolution of the perturbations is generically substantial, yet the power spectra simplify considerably and become more predictive, with most realizations being well approximated by a linear power spectrum. This provides proof of principle that complex inflationary physics can give rise to simple emergent power spectra. We explain how these results can be understood in terms of large-N_f universality of random matrix theory.
Cosmic Rays in Intermittent Magnetic Fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shukurov, Anvar; Seta, Amit; Bushby, Paul J.
The propagation of cosmic rays in turbulent magnetic fields is a diffusive process driven by the scattering of the charged particles by random magnetic fluctuations. Such fields are usually highly intermittent, consisting of intense magnetic filaments and ribbons surrounded by weaker, unstructured fluctuations. Studies of cosmic-ray propagation have largely overlooked intermittency, instead adopting Gaussian random magnetic fields. Using test particle simulations, we calculate cosmic-ray diffusivity in intermittent, dynamo-generated magnetic fields. The results are compared with those obtained from non-intermittent magnetic fields having identical power spectra. The presence of magnetic intermittency significantly enhances cosmic-ray diffusion over a wide range of particle energies. We demonstrate that the results can be interpreted in terms of a correlated random walk.
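Since the authors interpret their results in terms of a correlated random walk, a minimal sketch of that reduced picture may help: walkers with directional persistence, with the diffusion coefficient read off from the late-time mean-squared displacement. The persistence value and step counts below are invented for illustration.

import numpy as np

# Toy 2D correlated random walk: each step mostly keeps the previous
# direction (long directional memory) with small random turns.
rng = np.random.default_rng(3)
n_steps, n_walkers, persistence = 2000, 500, 0.9
theta = rng.uniform(0, 2 * np.pi, n_walkers)
pos = np.zeros((n_walkers, 2))
msd = np.empty(n_steps)
for t in range(n_steps):
    theta += rng.normal(0, 1 - persistence, n_walkers)  # small turns = long memory
    pos += np.column_stack((np.cos(theta), np.sin(theta)))
    msd[t] = np.mean((pos ** 2).sum(axis=1))

# At late times MSD ~ 2*d*D*t in d = 2 dimensions, so D ~ slope / 4.
slope = (msd[-1] - msd[n_steps // 2]) / (n_steps - n_steps // 2)
print("estimated diffusion coefficient D ~", slope / 4)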
Glyburide Advantage in Malignant Edema and Stroke (GAMES-RP) Trial: Rationale and Design.
Sheth, Kevin N; Elm, Jordan J; Beslow, Lauren A; Sze, Gordon K; Kimberly, W Taylor
2016-02-01
Patients with large territory infarction are at high risk of cerebral edema and neurological deterioration, including death. Preclinical studies have shown that a continuous infusion of glyburide blocks edema formation and improves outcome. We hypothesize that treatment with RP-1127 (Glyburide for Injection) reduces formation of brain edema in patients after large anterior circulation infarction. GAMES-RP is a prospective, randomized, double-blind, multicenter trial designed to evaluate RP-1127 in patients at high risk for the development of malignant cerebral edema. The study population consisted of subjects with a clinical diagnosis of acute severe anterior circulation ischemic stroke with a baseline diffusion-weighted image lesion between 82 and 300 cm³ who were 18-80 years of age. The target time from symptom onset to start of study infusion was ≤10 h. Subjects were randomized to RP-1127 (glyburide for injection) or placebo and treated with a continuous infusion for 72 h. The primary efficacy outcome was a composite of the modified Rankin Scale and the incidence of decompressive craniectomy, assessed at 90 days. Safety outcomes were the frequency and severity of adverse events, with a focus on cardiac- and glucose-related serious adverse events. GAMES-RP was designed to provide critical information regarding glyburide for injection in patients with large hemispheric stroke and will inform the design of future studies.
ERIC Educational Resources Information Center
Hegedus, Stephen J.; Dalton, Sara; Tapper, John R.
2015-01-01
We report on two large studies conducted in advanced algebra classrooms in the US, which evaluated the effect of replacing traditional algebra 2 curriculum with an integrated suite of dynamic interactive software, wireless networks and technology-enhanced curriculum on student learning. The first study was a cluster randomized trial and the second…
Hassel, Diana M; Smith, Phoebe A; Nieto, Jorge E; Beldomenico, Pablo; Spier, Sharon J
2009-11-01
The aim of this study was to evaluate the effects of a commercially available di-tri-octahedral (DTO) smectite product on clinical signs and prevalence of post-operative diarrhea in horses with colic associated with disease of the large intestine. Sixty-seven horses with surgical disease of the large intestine were randomly assigned to be treated with DTO smectite (n=37; 0.5 kg via nasogastric intubation every 24 h for 3 days post-operatively) or a placebo (n=30). The effect of treatment on fecal scores and clinical and hematological parameters, including heart rate, mucous membrane color, temperature, total white blood cell count, total neutrophil count and total plasma protein values, were determined. Horses treated with DTO smectite had a significant reduction in the prevalence of post-operative diarrhea (10.8%), compared with controls (41.4%). A significant improvement in mucous membrane color was observed 72 h post-operatively in horses receiving treatment, compared with placebo. Administration of DTO smectite to colic patients with disease of the large intestine reduced the occurrence of diarrhea in the early post-operative period.
Yin, J Kevin; Heywood, Anita E; Georgousakis, Melina; King, Catherine; Chiu, Clayton; Isaacs, David; Macartney, Kristine K
2017-09-01
Universal childhood vaccination is a potential solution to reduce seasonal influenza burden. We reviewed systematically the literature on "herd"/indirect protection from vaccinating children aged 6 months to 17 years against influenza. Of 30 studies included, 14 (including 1 cluster randomized controlled trial [cRCT]) used live attenuated influenza vaccine, 11 (7 cRCTs) used inactivated influenza vaccine, and 5 (1 cRCT) compared both vaccine types. Twenty of 30 studies reported statistically significant indirect protection effectiveness (IPE) with point estimates ranging from 4% to 66%. Meta-regression suggests that studies with high quality and/or sufficiently large sample size are more likely to report significant IPE. In meta-analyses of 6 cRCTs with full randomization (rated as moderate quality overall), significant IPE was found in 1 cRCT in closely connected communities where school-aged children were vaccinated: 60% (95% confidence interval [CI], 41%-72%; I2 = 0%; N = 2326) against laboratory-confirmed influenza, and 3 household cRCTs in which preschool-aged children were vaccinated: 22% (95% CI, 1%-38%; I2 = 0%; N = 1903) against acute respiratory infections or influenza-like illness. Significant IPE was also reported in a large-scale cRCT (N = 8510) that was not fully randomized, and 3 ecological studies (N > 10000) of moderate quality, including a 36% reduction in influenza-related mortality among the elderly in a Japanese school-based program. Data on IPE in other settings are heterogeneous and lack the power to support a firm conclusion. The available evidence suggests that influenza vaccination of children confers indirect protection in some but not all settings. Robust, large-scale studies are required to better quantify the indirect protection from vaccinating children for different settings/endpoints. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.
Reflective properties of randomly rough surfaces under large incidence angles.
Qiu, J; Zhang, W J; Liu, L H; Hsu, P-f; Liu, L J
2014-06-01
The reflective properties of randomly rough surfaces at large incidence angles have attracted attention owing to their potential applications in several areas of radiative heat transfer research. The main purpose of this work is to investigate the formation mechanism of the specular reflection peak of rough surfaces at large incidence angles. The bidirectional reflectance distribution function (BRDF) of rough aluminum surfaces with different roughnesses at different incident angles is measured by a three-axis automated scatterometer. This study used a validated and accurate computational model, the rigorous coupled-wave analysis (RCWA) method, to compare against and analyze the measured BRDF results. The RCWA results reproduce the measured trend of the specular peak. This paper focuses on relative roughness in the range 0.16 < σ/λ < 5.35. As the relative roughness decreases, the specular peak enhancement increases dramatically and the scattering region shrinks significantly, especially at large incidence angles. The RCWA and the Rayleigh criterion results have been compared, showing that the relative error of the total integrated scatter increases with surface roughness at large incidence angles. In addition, the zero-order diffractive power calculated by RCWA and the reflectance calculated by Fresnel equations are compared. The comparison shows that the relative error declines sharply when the incident angle is large and the roughness is small.
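The Rayleigh-criterion comparison mentioned above rests on the classic total-integrated-scatter estimate, TIS ≈ 1 - exp(-(4πσcosθ/λ)²). A small sketch (angles chosen for illustration) shows why the specular component strengthens at grazing incidence: the effective roughness σcosθ/λ shrinks as θ grows.

import numpy as np

def tis(sigma_over_lambda, theta_deg):
    # Rayleigh-criterion total integrated scatter for a randomly rough
    # surface; 1 - TIS is the fraction reflected specularly.
    theta = np.radians(theta_deg)
    return 1 - np.exp(-(4 * np.pi * sigma_over_lambda * np.cos(theta)) ** 2)

for angle in (0, 45, 75, 85):
    # sigma/lambda = 0.16 is the lower bound of the paper's roughness range
    print(f"theta={angle:2d} deg: TIS = {tis(0.16, angle):.3f}")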
Bakbergenuly, Ilyas; Morgenthaler, Stephan
2016-01-01
We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group-level studies or in meta-analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log-odds and arcsine transformations of the estimated probability p̂, both for single-group studies and in combining results from several groups or studies in meta-analysis. Our simulations confirm that these biases are linear in ρ, the intracluster correlation coefficient, for small values of ρ. These biases do not depend on the sample sizes or the number of studies K in a meta-analysis and result in abysmal coverage of the combined effect for large K. We also propose a bias correction for the arcsine transformation. Our simulations demonstrate that this bias correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta-analyses of prevalence. PMID:27192062
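The arcsine-transform bias is straightforward to reproduce by simulation. The sketch below draws overdispersed (beta-binomial) counts with intracluster correlation ρ and compares the mean transformed estimate with the transform of the true probability; all parameter values are illustrative, not taken from the paper.

import numpy as np

rng = np.random.default_rng(4)
p, n, reps = 0.1, 50, 200_000

for rho in (0.0, 0.05, 0.2):
    if rho == 0:
        x = rng.binomial(n, p, reps)               # plain binomial baseline
    else:
        # Beta mixing with mean p and ICC rho: a + b = (1 - rho) / rho
        a = p * (1 - rho) / rho
        b = (1 - p) * (1 - rho) / rho
        x = rng.binomial(n, rng.beta(a, b, reps))  # beta-binomial counts
    bias = np.mean(np.arcsin(np.sqrt(x / n))) - np.arcsin(np.sqrt(p))
    print(f"rho={rho:.2f}: arcsine-scale bias = {bias:+.4f}")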
Coupled continuous-time random walks in quenched random environment
NASA Astrophysics Data System (ADS)
Magdziarz, M.; Szczotka, W.
2018-02-01
We introduce a coupled continuous-time random walk with the coupling characteristic of Lévy walks. Additionally, we assume that the walker moves in a quenched random environment, i.e. the site disorder at each lattice point is fixed in time. We analyze the scaling limit of such a random walk. We show that for large times the behaviour of the analyzed process is exactly the same as in the case of the uncoupled quenched trap model for Lévy flights.
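For intuition about the quenched ingredient, here is a minimal quenched trap model on a 1D lattice: each site carries a fixed, power-law-distributed mean trapping time (the quenched disorder), and the walker draws an exponential wait with that site's mean on every visit. This sketch omits the Lévy-walk coupling studied in the paper, and the tail exponent and sizes are invented.

import numpy as np

rng = np.random.default_rng(5)
L, alpha = 10_001, 0.8                       # lattice size, tail exponent (< 1)
tau = rng.pareto(alpha, L) + 1.0             # quenched mean waiting times, fixed in time
x, t, origin = 0, 0.0, L // 2
for _ in range(50_000):
    t += rng.exponential(tau[(origin + x) % L])  # wait at the current site
    x += rng.choice((-1, 1))                     # unbiased nearest-neighbour step
print(f"displacement {x} after time {t:.1f}")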
[Studies on localized low-risk prostate cancer: Do we know enough?]
Weißbach, L; Roloff, C
2018-06-05
Treatment of localized low-risk prostate cancer (PCa) is undergoing a paradigm shift: invasive treatments such as surgery and radiation therapy are being replaced by defensive strategies such as active surveillance (AS) and watchful waiting (WW). The aim of this work is to evaluate the significance of current studies regarding defensive strategies (AS and WW). The best-known AS studies are critically evaluated for their significance in terms of input criteria, follow-up criteria, and statistical validity. The difficulties faced by randomized studies in answering the question of the best treatment for low-risk cancer in two or more study groups with known low tumor-specific mortality are clearly shown. Some studies fail because of the objective; others, like PIVOT, are underpowered. Even ProtecT, a renowned randomized controlled trial (RCT), exhibits systematic and statistical shortcomings in detail. The time and effort required for RCTs to answer the question of which therapy is best for localized low-risk cancer is very large, because the low specific mortality requires a large number of participants and a long study duration. In any case, RCTs create hand-picked cohorts for statistical evaluation that have little to do with care in daily clinical practice. The necessary randomization is also at odds with the decision-making of the informed patient. If further studies of low-risk PCa are needed, they will require real-world conditions that an RCT cannot provide. To obtain clinically relevant results, we need to rethink: when planning a study, biometricians and clinicians must recognize that the statistical methods used in RCTs are of limited use here and must select a method appropriate for health services research (e.g., propensity scores).
A randomized trial comparing concise and standard consent forms in the START trial
Touloumi, Giota; Walker, A. Sarah; Smolskis, Mary; Sharma, Shweta; Babiker, Abdel G.; Pantazis, Nikos; Tavel, Jorge; Florence, Eric; Sanchez, Adriana; Hudson, Fleur; Papadopoulos, Antonios; Emanuel, Ezekiel; Clewett, Megan; Munroe, David; Denning, Eileen
2017-01-01
Background Improving the effectiveness and efficiency of research informed consent is a high priority. Some express concern about longer, more complex, written consent forms creating barriers to participant understanding. A recent meta-analysis concluded that randomized comparisons were needed. Methods We conducted a cluster-randomized non-inferiority comparison of a standard versus concise consent form within a multinational trial studying the timing of starting antiretroviral therapy in HIV+ adults (START). Interested sites were randomized to standard or concise consent forms for all individuals signing START consent. Participants completed a survey measuring comprehension of study information and satisfaction with the consent process. Site personnel reported usual site consent practices. The primary outcome was comprehension of the purpose of randomization (pre-specified 7.5% non-inferiority margin). Results 77 sites (2429 participants) were randomly allocated to use the standard consent and 77 sites (2000 participants) the concise consent, for an evaluable cohort of 4229. Site and participant characteristics were similar for the two groups. The concise consent was non-inferior to the standard consent on comprehension of randomization (80.2% versus 82%, site-adjusted difference: 0.75% (95% CI -3.8%, +5.2%)), and the two groups did not differ significantly on total comprehension score, satisfaction, or voluntariness (p>0.1). Certain independent factors, such as education, influenced comprehension and satisfaction but not the differences between consent groups. Conclusions An easier-to-read, more concise consent form neither hindered nor improved comprehension of study information or satisfaction with the consent process among a large number of participants. This supports continued efforts to make consent forms more efficient. Trial registration The informed consent substudy was registered as part of the START study in clinicaltrials.gov #NCT00867048, and EudraCT #2008-006439-12. PMID:28445471
Collective relaxation dynamics of small-world networks
NASA Astrophysics Data System (ADS)
Grabow, Carsten; Grosskinsky, Stefan; Kurths, Jürgen; Timme, Marc
2015-05-01
Complex networks exhibit a wide range of collective dynamic phenomena, including synchronization, diffusion, relaxation, and coordination processes. Their asymptotic dynamics is generically characterized by the local Jacobian, graph Laplacian, or a similar linear operator. The structures of networks with regular, small-world, and random connectivities are reasonably well understood, but their collective dynamical properties remain largely unknown. Here we present a two-stage mean-field theory to derive analytic expressions for network spectra. A single formula covers the spectrum from regular via small-world to strongly randomized topologies in Watts-Strogatz networks, explaining the simultaneous dependencies on network size N, average degree k, and topological randomness q. We present simplified analytic predictions for the second-largest and smallest eigenvalue, and numerical checks confirm our theoretical predictions for zero, small, and moderate topological randomness q, including the entire small-world regime. For large q of the order of one, we apply standard random matrix theory, thereby overarching the full range from regular to randomized network topologies. These results may contribute to our analytic and mechanistic understanding of collective relaxation phenomena of network dynamical systems.
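A numerical check in the spirit of this analysis is easy to run with standard tools: build Watts-Strogatz networks at several rewiring probabilities q and inspect the smallest non-zero Laplacian eigenvalue, which sets the slowest relaxation rate. Sizes and seeds below are arbitrary choices, not the paper's parameters.

import numpy as np
import networkx as nx

for q in (0.0, 0.01, 0.1, 1.0):
    G = nx.watts_strogatz_graph(n=500, k=10, p=q, seed=42)
    # Dense diagonalization of the graph Laplacian (fine at this size)
    L = nx.laplacian_matrix(G).toarray().astype(float)
    lam = np.sort(np.linalg.eigvalsh(L))
    print(f"q={q:<4}: spectral gap lambda_2 = {lam[1]:.4f}")
# The gap grows with q: adding shortcuts speeds up the slowest
# relaxation mode relative to the regular ring lattice.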
Douglas, Pamela S; Hoffmann, Udo; Lee, Kerry L; Mark, Daniel B; Al-Khalidi, Hussein R; Anstrom, Kevin; Dolor, Rowena J; Kosinski, Andrzej; Krucoff, Mitchell W; Mudrick, Daniel W; Patel, Manesh R; Picard, Michael H; Udelson, James E; Velazquez, Eric J; Cooper, Lawton
2014-06-01
Suspected coronary artery disease (CAD) is one of the most common, potentially life-threatening diagnostic problems clinicians encounter. However, no large outcome-based randomized trials have been performed to guide the selection of diagnostic strategies for these patients. The PROMISE study is a prospective, randomized trial comparing the effectiveness of 2 initial diagnostic strategies in patients with symptoms suspicious for CAD. Patients are randomized to either (1) functional testing (exercise electrocardiogram, stress nuclear imaging, or stress echocardiogram) or (2) anatomical testing with ≥64-slice multidetector coronary computed tomographic angiography. Tests are interpreted locally in real time by subspecialty certified physicians, and all subsequent care decisions are made by the clinical care team. Sites are provided results of central core laboratory quality and completeness assessment. All subjects are followed up for ≥1 year. The primary end point is the time to occurrence of the composite of death, myocardial infarction, major procedural complications (stroke, major bleeding, anaphylaxis, and renal failure), or hospitalization for unstable angina. More than 10,000 symptomatic subjects were randomized in 3.2 years at 193 US and Canadian cardiology, radiology, primary care, urgent care, and anesthesiology sites. Multispecialty community practice enrollment into a large pragmatic trial of diagnostic testing strategies is both feasible and efficient. The PROMISE trial will compare the clinical effectiveness of an initial strategy of functional testing against an initial strategy of anatomical testing in symptomatic patients with suspected CAD. Quality of life, resource use, cost-effectiveness, and radiation exposure will be assessed. Copyright © 2014 Mosby, Inc. All rights reserved.
Stanley, Clayton; Byrne, Michael D
2016-12-01
The growth of social media and user-created content on online sites provides unique opportunities to study models of human declarative memory. By framing the task of choosing a hashtag for a tweet and tagging a post on Stack Overflow as a declarative memory retrieval problem, 2 cognitively plausible declarative memory models were applied to millions of posts and tweets and evaluated on how accurately they predict a user's chosen tags. An ACT-R based Bayesian model and a random permutation vector-based model were tested on the large data sets. The results show that past user behavior of tag use is a strong predictor of future behavior. Furthermore, past behavior was successfully incorporated into the random permutation model that previously used only context. Also, ACT-R's attentional weight term was linked to an entropy-weighting natural language processing method used to attenuate high-frequency words (e.g., articles and prepositions). Word order was not found to be a strong predictor of tag use, and the random permutation model performed comparably to the Bayesian model without including word order. This shows that the strength of the random permutation model is not in the ability to represent word order, but rather in the way in which context information is successfully compressed. The results of the large-scale exploration show how the architecture of the 2 memory models can be modified to significantly improve accuracy, and may suggest task-independent general modifications that can help improve model fit to human data in a much wider range of domains. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
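The ACT-R side of such models is anchored in the base-level learning equation, B = ln(sum_j (t_now - t_j)^(-d)), which rewards frequent and recent past use. Below is a hedged sketch with invented tag histories; this is the textbook equation with the conventional decay d = 0.5, not the authors' full model.

import math

def base_level_activation(use_times, now, decay=0.5):
    # ACT-R base-level learning: activation grows with the frequency and
    # recency of past retrievals of a memory chunk (here, a tag).
    return math.log(sum((now - t) ** -decay for t in use_times))

# Hypothetical tag histories (timestamps in days); rank tags by activation,
# mirroring the finding that past tag use strongly predicts the next choice.
history = {"python": [1, 3, 9, 9.5], "numpy": [2, 8], "regex": [0.5]}
now = 10.0
ranked = sorted(history, key=lambda tag: -base_level_activation(history[tag], now))
print(ranked)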
Chung, Eugene S; Fischer, Trent M; Kueffer, Fred; Anand, Inder S; Bax, Jeroen J; Gold, Michael R; Gorman, Robert C; Theres, Heinz; Udelson, James E; Stancak, Branislav; Svendsen, Jesper H; Stone, Gregg W; Leon, Angel
2015-07-01
Despite considerable improvements in the medical management of patients with myocardial infarction (MI), patients with large MI still have substantial risk of developing heart failure. In the early post-MI setting, implantable cardioverter defibrillators have reduced arrhythmic deaths but have no impact on overall mortality. Therefore, additional interventions are required to further reduce the overall morbidity and mortality of patients with large MI. The Pacing Remodeling Prevention Therapy (PRomPT) trial is designed to study the effects of peri-infarct pacing in preventing adverse post-MI remodeling. Up to 120 subjects with peak creatine phosphokinase >3,000 U/L (or troponin T >10 μg/L) at time of MI will be randomized to either dual-site or single-site biventricular pacing with the left ventricular lead implanted in a peri-infarct region or to a nonimplanted control group. Those randomized to a device will be blinded to the pacing mode, but randomization to a device or control cannot be blinded. Subjects randomized to pacing will have the device implanted within 10 days of MI. The primary objective is to assess the change in left ventricular end-diastolic volume from baseline to 18 months. Secondary objectives are to assess changes in clinical and mechanistic parameters between the groups, including rates of hospitalization for heart failure and cardiovascular events, the incidence of sudden cardiac death and all-cause mortality, New York Heart Association functional class, 6-minute walking distance, and quality of life. The PRomPT trial will provide important evidence regarding the potential of peri-infarct pacing to interrupt adverse remodeling in patients with large MI. Copyright © 2015 Elsevier Inc. All rights reserved.
Application of stochastic processes in random growth and evolutionary dynamics
NASA Astrophysics Data System (ADS)
Oikonomou, Panagiotis
We study the effect of power-law distributed randomness on the dynamical behavior of processes such as stochastic growth patterns and evolution. First, we examine the geometrical properties of random shapes produced by a generalized stochastic Loewner Evolution driven by a superposition of a Brownian motion and a stable Levy process. The situation is defined by the usual stochastic Loewner Evolution parameter, kappa, as well as alpha, which defines the power-law tail of the stable Levy distribution. We show that the properties of these patterns change qualitatively and singularly at critical values of kappa and alpha. It is reasonable to call such changes "phase transitions". These transitions occur as kappa passes through four and as alpha passes through one. Numerical simulations are used to explore the global scaling behavior of these patterns in each "phase". We show both analytically and numerically that the growth continues indefinitely in the vertical direction for alpha greater than 1, grows logarithmically with time for alpha equal to 1, and saturates for alpha smaller than 1. The probability density has two different scales corresponding to directions along and perpendicular to the boundary. Scaling functions for the probability density are given for various limiting cases. Second, we study the effect of the architecture of biological networks on their evolutionary dynamics. In recent years, studies of the architecture of large networks have unveiled a common topology, called scale-free, in which a majority of the elements are poorly connected except for a small fraction of highly connected components. We ask how networks with distinct topologies can evolve towards a pre-established target phenotype through a process of random mutations and selection. We use networks of Boolean components as a framework to model a large class of phenotypes. Within this approach, we find that homogeneous random networks and scale-free networks exhibit drastically different evolutionary paths. While homogeneous random networks accumulate neutral mutations and evolve by sparse punctuated steps, scale-free networks evolve rapidly and continuously towards the target phenotype. Moreover, we show that scale-free networks always evolve faster than homogeneous random networks; remarkably, this property does not depend on the precise value of the topological parameter. By contrast, homogeneous random networks require a specific tuning of their topological parameter in order to optimize their fitness. This model suggests that the evolutionary paths of biological networks, punctuated or continuous, may solely be determined by the network topology.
Mechanics of single cells: rheology, time dependence, and fluctuations.
Massiera, Gladys; Van Citters, Kathleen M; Biancaniello, Paul L; Crocker, John C
2007-11-15
The results of mechanical measurements on single cultured epithelial cells using both magnetic twisting cytometry (MTC) and laser tracking microrheology (LTM) are described. Our unique approach uses laser deflection for high-performance tracking of cell-adhered magnetic beads either in response to an oscillatory magnetic torque (MTC) or due to random Brownian or ATP-dependent forces (LTM). This approach is well suited for accurately determining the rheology of single cells, the study of temporal and cell-to-cell variations in the MTC signal amplitude, and assessing the statistical character of the tracers' random motion in detail. The temporal variation of the MTC rocking amplitude is surprisingly large and manifests as a frequency-independent multiplicative factor having a 1/f spectrum in living cells, which disappears upon ATP depletion. In the epithelial cells we study, random bead position fluctuations are Gaussian to the limits of detection both in the Brownian and ATP-dependent cases, unlike earlier studies on other cell types.
Autonomous Modeling, Statistical Complexity and Semi-annealed Treatment of Boolean Networks
NASA Astrophysics Data System (ADS)
Gong, Xinwei
This dissertation presents three studies on Boolean networks. Boolean networks are a class of mathematical systems consisting of interacting elements with binary state variables. Each element is a node with a Boolean logic gate, and the presence of interactions between any two nodes is represented by directed links. Boolean networks that implement the logic structures of real systems are studied as coarse-grained models of the real systems. Large random Boolean networks are studied with mean field approximations and used to provide a baseline of possible behaviors of large real systems. This dissertation presents one study of the former type, concerning the stable oscillation of a yeast cell-cycle oscillator, and two studies of the latter type, respectively concerning the statistical complexity of large random Boolean networks and an extension of traditional mean field techniques that accounts for the presence of short loops. In the cell-cycle oscillator study, a novel autonomous update scheme is introduced to study the stability of oscillations in small networks. A motif that corrects pulse-growing perturbations and a motif that grows pulses are identified. A combination of the two motifs is capable of sustaining stable oscillations. Examining a Boolean model of the yeast cell-cycle oscillator using an autonomous update scheme yields evidence that it is endowed with such a combination. Random Boolean networks are classified as ordered, critical or disordered based on their response to small perturbations. In the second study, random Boolean networks are taken as prototypical cases for the evaluation of two measures of complexity based on a criterion for optimal statistical prediction. One measure, defined for homogeneous systems, does not distinguish between the static spatial inhomogeneity in the ordered phase and the dynamical inhomogeneity in the disordered phase. A modification in which complexities of individual nodes are calculated yields vanishing complexity values for networks in the ordered and critical phases and for highly disordered networks, peaking somewhere in the disordered phase. Individual nodes with high complexity have, on average, a larger influence on the system dynamics. Lastly, a semi-annealed approximation that preserves the correlation between states at neighboring nodes is introduced to study a social game-inspired network model in which all links are bidirectional and all nodes have a self-input. The technique developed here is shown to yield accurate predictions of distribution of players' states, and accounts for some nontrivial collective behavior of game theoretic interest.
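The ordered/critical/disordered classification invoked above can be illustrated by damage spreading in a random Boolean network: flip one node and track the Hamming distance between two copies. Annealed mean-field theory predicts the phase from the sensitivity s = 2p(1-p)K (ordered for s < 1, critical at s = 1, disordered for s > 1). The sketch below uses invented sizes and a simple synchronous update.

import numpy as np

rng = np.random.default_rng(6)
N, p = 1000, 0.5

def damage_after(K, steps=50):
    inputs = rng.integers(0, N, size=(N, K))   # K random inputs per node
    tables = rng.random((N, 2 ** K)) < p       # random truth tables with bias p
    powers = 2 ** np.arange(K)
    def step(s):
        idx = (s[inputs] * powers).sum(axis=1)       # encode each node's inputs
        return tables[np.arange(N), idx]             # synchronous update
    a = rng.random(N) < 0.5
    b = a.copy(); b[0] = ~b[0]                       # perturb a single node
    for _ in range(steps):
        a, b = step(a), step(b)
    return (a != b).mean()                           # normalized Hamming distance

for K in (1, 2, 3):  # s = 0.5, 1.0, 1.5 at p = 0.5
    print(f"K={K}: final Hamming distance = {damage_after(K):.3f}")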
Meinich Petersen, Sandra; Zoffmann, Vibeke; Kjærgaard, Jesper; Graff Stensballe, Lone; Greisen, Gorm
2014-04-15
When a child participates in a clinical trial, informed consent has to be given by the parents. Parental motives for participation are complex, but the hope of getting a new and better treatment for the child is important. We wondered how parents react when their child is allocated to the control group of a randomized controlled trial, and how this affects their future engagement in the trial. We included parents of newborns randomized to the control arm in the Danish Calmette study at Rigshospitalet in Copenhagen. The Calmette study is a randomized clinical trial investigating the non-specific effects of early BCG vaccination in healthy neonates. Randomization is performed immediately after birth and parents are not blinded to the allocation. We set up a semi-structured focus group with six parents from four families. Afterwards, we telephone-interviewed another 19 mothers to achieve saturation. Thematic analysis was used to identify themes across the data sets. The parents reported good understanding of the randomization process. Their most common reaction to allocation was disappointment, though relief was also seen. A model of reactions to being allocated to the control group was developed based on the participants' different positions along two continua, from 'Our participation in the trial is not important' to 'Our participation in the trial is important', and from 'Vaccine not important to us' to 'Vaccine important to us'. Four very disappointed families had thought of getting the vaccine elsewhere, and one had actually had their child vaccinated. All parents involved in the focus group and the telephone interviews wanted to participate in the follow-ups planned for the Calmette study. This study identified an almost universal experience of disappointment among parents of newborns who were randomized to the control group, but also a broad expression of understanding and acceptance of the idea of randomization. The trial staff might use the model of reactions to understand the parents' disappointment and thereby support their motives for participation. A generalized version might be applicable across randomized controlled trials at large. The Calmette study is registered in EudraCT (https://eudract.ema.europa.eu/) with trial number 2010-021979-85.
Memory consolidation and contextual interference effects with computer games.
Shewokis, Patricia A
2003-10-01
Some investigators of the contextual interference effect contend that there is a direct relation between the amount of practice and the size of the effect: the learning advantage of a random practice schedule over a blocked practice schedule is predicted to grow in magnitude as the amount of practice during acquisition increases. Prior contextual interference studies using computer games, with totals of 36 and 72 acquisition trials respectively, yielded a large effect (f = .50) favoring a random practice schedule during transfer. The present study tested this prediction by having 72 college students, randomly assigned to a blocked or random practice schedule, practice 102 trials of three computer-game tasks across three days. After a 24-hr. interval, 6 retention and 5 transfer trials were performed. Dependent variables were time to complete an event in seconds and number of errors. No significant differences were found for retention and transfer. These results are discussed in terms of how the amount of practice, task-related factors, and memory consolidation mediate the contextual interference effect.
Grover, Sandeep; Del Greco M, Fabiola; Stein, Catherine M; Ziegler, Andreas
2017-01-01
Confounding and reverse causality have prevented us from drawing meaningful clinical interpretations even in well-powered observational studies. Confounding may be attributed to our inability to randomize the exposure variable in observational studies. Mendelian randomization (MR) is one approach to overcome confounding. It utilizes one or more genetic polymorphisms as a proxy for the exposure variable of interest. Polymorphisms are randomly distributed in a population and static throughout an individual's lifetime, and may thus help in inferring directionality in exposure-outcome associations. Genome-wide association studies (GWAS) or meta-analyses of GWAS are characterized by large sample sizes and the availability of many single nucleotide polymorphisms (SNPs), making GWAS-based MR an attractive approach. GWAS-based MR comes with specific challenges, including multiple causality. Despite these shortcomings, it remains one of the most powerful techniques for inferring causality. Because MR is still an evolving concept with complex statistical challenges, the literature provides few worked examples incorporating real datasets. In this chapter, we provide a step-by-step guide for causal inference based on the principles of MR with a real dataset, using both individual-level and summary data from unrelated individuals. We suggest best practices and give recommendations based on the current literature.
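For readers wanting the computational core of summary-data MR, the two standard building blocks are the per-SNP Wald ratio and their inverse-variance-weighted (IVW) average. The chapter's own worked example and dataset are not reproduced here, so the effect sizes below are invented for illustration.

import numpy as np

bx  = np.array([0.12, 0.08, 0.15])    # SNP -> exposure effects (per instrument)
by  = np.array([0.030, 0.018, 0.041]) # SNP -> outcome effects
sey = np.array([0.010, 0.009, 0.012]) # standard errors of the outcome effects

ratio = by / bx                        # Wald ratio: causal estimate per SNP
se    = sey / np.abs(bx)               # first-order delta-method SE
w     = 1 / se ** 2                    # inverse-variance weights
ivw   = np.sum(w * ratio) / np.sum(w)  # fixed-effect IVW pooled estimate
ivw_se = 1 / np.sqrt(np.sum(w))
print(f"IVW causal estimate: {ivw:.3f} (SE {ivw_se:.3f})")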
ERIC Educational Resources Information Center
National Center for Education Evaluation and Regional Assistance, 2015
2015-01-01
The Teacher Incentive Fund (TIF) provides grants to support performance-based compensation systems for teachers and principals in high-need schools. The study measures the impact of pay-for-performance bonuses as part of a comprehensive compensation system within a large, multisite random assignment study design. The treatment schools were to…
ERIC Educational Resources Information Center
Adjei, Augustine; Dontoh, Samuel; Baafi-Frimpong, Stephen
2017-01-01
The study investigated the extent to which College climate (Leadership roles/practices and Class size) impacts the academic work of Teacher-trainees. A survey research design was used because the study involved a relatively large population, sampled purposively and randomly. A sample size of 322 out of the…
Large-scale data analysis of power grid resilience across multiple US service regions
NASA Astrophysics Data System (ADS)
Ji, Chuanyi; Wei, Yun; Mei, Henry; Calzada, Jorge; Carey, Matthew; Church, Steve; Hayes, Timothy; Nugent, Brian; Stella, Gregory; Wallace, Matthew; White, Joe; Wilcox, Robert
2016-05-01
Severe weather events frequently result in large-scale power failures, affecting millions of people for extended durations. However, the lack of comprehensive, detailed failure and recovery data has impeded large-scale resilience studies. Here, we analyse data from four major service regions representing Upstate New York during Super Storm Sandy and daily operations. Using non-stationary spatiotemporal random processes that relate infrastructural failures to recoveries and cost, our data analysis shows that local power failures have a disproportionally large non-local impact on people (that is, the top 20% of failures interrupted 84% of services to customers). A large number (89%) of small failures, represented by the bottom 34% of customers and commonplace devices, resulted in 56% of the total cost of 28 million customer interruption hours. Our study shows that extreme weather does not cause, but rather exacerbates, existing vulnerabilities, which are obscured in daily operations.
Hamada, Hirokazu; Abe, Yuko; Nagane, Ryoichi; Fang, Ya-Yin; Lewis, Mark A.; Long, Eric C.; Chikira, Makoto
2007-01-01
DNA fiber EPR was used to investigate the DNA binding stabilities and orientations of Cu(II)•Gly-Gly-His-derived metallopeptides containing d- vs. l-amino acid substitutions in the first peptide position. This examination included studies of Cu(II)•d-Arg-Gly-His and Cu(II)•d-Lys-Gly-His for comparison to metallopeptides containing l-Arg/Lys substitutions, and also the diastereoisomeric pairs Cu(II)•d/l-Pro-Gly-His and Cu(II)•d/l-Pro-Lys-His. Results indicated that l-Arg/Lys to d-Arg/Lys substitutions considerably randomized the orientation of the metallopeptides on DNA, whereas the replacement of l-Pro by d-Pro in Cu(II)•l-Pro-Gly-His caused a decrease in randomness. The difference in the extent of randomness of the d- vs. l-Pro-Gly-His complexes was diminished by substituting Lys for Gly in the middle peptide position, supporting the notion that the ε-amino group of Lys triggered further randomization, likely through hydrogen bonding or electrostatic interactions that disrupt binding between the metallopeptide equatorial plane and the DNA. The relationship between the stereochemistry of amino acid residues and the binding and reaction of M(II)•Xaa-Xaa'-His metallopeptides with DNA is also discussed. PMID:17706784
Carter, Barry L; Clarke, William; Ardery, Gail; Weber, Cynthia A; James, Paul A; Vander Weg, Mark; Chrischilles, Elizabeth A; Vaughn, Thomas; Egan, Brent M
2010-07-01
Numerous studies have demonstrated the value of team-based care to improve blood pressure (BP) control, but there is limited information on whether these models would be adopted in diverse populations. The purpose of this study was to evaluate whether a collaborative model between physicians and pharmacists can improve BP control in multiple primary care medical offices with diverse geographic and patient characteristics and whether long-term BP control can be sustained. This study is a randomized prospective trial in 27 primary care offices first stratified by the percentage of underrepresented minorities and the level of clinical pharmacy services within the office. Each office is then randomized to either a 9- or 24-month intervention or a control group. Patients will be enrolled in this study until 2012. The results of this study should provide information on whether this model can be implemented in large numbers of diverse offices, if it is effective in diverse populations, and whether BP control can be sustained long term. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00935077.
Inquiry in the Physical Geology Classroom: Supporting Students' Conceptual Model Development
ERIC Educational Resources Information Center
Miller, Heather R.; McNeal, Karen S.; Herbert, Bruce E.
2010-01-01
This study characterizes the impact of an inquiry-based learning (IBL) module versus a traditionally structured laboratory exercise. Laboratory sections were randomized into experimental and control groups. The experimental group was taught using IBL pedagogical techniques and included manipulation of large-scale data-sets, use of multiple…
Delivery Systems: "Saber Tooth" Effect in Counseling.
ERIC Educational Resources Information Center
Traylor, Elwood B.
This study reported the role of counselors as perceived by black students in a secondary school. Observational and interview methods were employed to obtain data from 24 black students selected at random from the junior and senior classes of a large metropolitan secondary school. Findings include: counselors were essentially concerned with…
U.S. EPA/ORD LARGE BUILDINGS STUDY: RESULTS OF THE INITIAL SURVEY OF RANDOMLY SELECTED GSA BUILDINGS
The Atmospheric Research and Exposure Assessment Laboratory (AREAL), Office of Research and Development (ORD), U.S. Environmental Protection Agency (EPA), is initiating a research program to collect fundamental information on the key parameters and factors that influence indoor a...
Weight management using the internet: A randomized controlled trial
USDA-ARS?s Scientific Manuscript database
Most weight-loss research targets obese individuals who desire large weight reductions. However, evaluation of weight-gain prevention in overweight individuals is also critical as most Americans become obese as a result of a gradual gain of 1-2 pounds per year over many years. This study evaluated t...
A Program to Improve Student Engagement at Research-Focused Universities
ERIC Educational Resources Information Center
Whillans, Ashley V.; Hope, Sally E.; Wylie, Lauren J.; Zhao, Bob; Souza, Michael J.
2018-01-01
Promoting undergraduate engagement is an important challenge at large research-focused universities. Thus, the current study evaluated whether a peer-led program of student-geared events could improve engagement among a diverse group of psychology students early on in their degrees. We randomly assigned interested second-year…
Job Insecurity and Employee Well-Being.
ERIC Educational Resources Information Center
Vance, Robert J.; Kuhnert, Karl W.
This study explored the consequences of perceived job security and insecurity on the psychological and physical health of employees. Data were gathered from employees of a large midwestern manufacturing organization that produced products for material removal applications. Surveys were sent through company mail to a stratified random sample of 442…
Predicting protein functions from redundancies in large-scale protein interaction networks
NASA Technical Reports Server (NTRS)
Samanta, Manoj Pratim; Liang, Shoudan
2003-01-01
Interpreting data from large-scale protein interaction experiments has been a challenging task because of the widespread presence of random false positives. Here, we present a network-based statistical algorithm that overcomes this difficulty and allows us to derive functions of unannotated proteins from large-scale interaction data. Our algorithm uses the insight that if two proteins share a significantly larger number of common interaction partners than expected at random, they have close functional associations. Analysis of publicly available data from Saccharomyces cerevisiae reveals >2,800 reliable functional associations, 29% of which involve at least one unannotated protein. By further analyzing these associations, we derive tentative functions for 81 unannotated proteins with high certainty. Our method is not overly sensitive to the false positives present in the data. Even after adding 50% randomly generated interactions to the measured data set, we are able to recover almost all (approximately 89%) of the original associations.
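The abstract does not state the exact statistic used, but a standard way to test whether two proteins share more interaction partners than expected by chance is a hypergeometric tail probability. The sketch below is a minimal illustration under that assumption, with made-up network sizes and degrees.

```python
from scipy.stats import hypergeom

def shared_partner_pvalue(n_total, deg_a, deg_b, shared):
    """P(overlap >= shared) when two neighbor sets of sizes deg_a and
    deg_b are drawn at random from n_total proteins (hypergeometric)."""
    # sf(k - 1) gives P(X >= k) for the hypergeometric distribution
    return hypergeom.sf(shared - 1, n_total, deg_a, deg_b)

# Hypothetical example: ~6000 yeast proteins; proteins A and B have 30
# and 40 interaction partners respectively, 12 of which are shared.
print(f"{shared_partner_pvalue(6000, 30, 40, 12):.2e}")
```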
Extinction order and altered community structure rapidly disrupt ecosystem functioning.
Larsen, Trond H; Williams, Neal M; Kremen, Claire
2005-05-01
By causing extinctions and altering community structure, anthropogenic disturbances can disrupt processes that maintain ecosystem integrity. However, the relationship between community structure and ecosystem functioning in natural systems is poorly understood. Here we show that habitat loss appeared to disrupt ecosystem functioning by affecting extinction order, species richness and abundance. We studied pollination by bees in a mosaic of agricultural and natural habitats in California and dung burial by dung beetles on recently created islands in Venezuela. We found that large-bodied bee and beetle species tended to be both most extinction-prone and most functionally efficient, contributing to rapid functional loss. Simulations confirmed that extinction order led to greater disruption of function than predicted by random species loss. Total abundance declined with richness and also appeared to contribute to loss of function. We demonstrate conceptually and empirically how the non-random response of communities to disturbance can have unexpectedly large functional consequences.
Sun, Dennis L; Harris, Naftali; Walther, Guenther; Baiocchi, Michael
2015-01-01
Feedback has a powerful influence on learning, but it is also expensive to provide. In large classes it may even be impossible for instructors to provide individualized feedback. Peer assessment is one way to provide personalized feedback that scales to large classes. Besides these obvious logistical benefits, it has been conjectured that students also learn from the practice of peer assessment. However, this has never been conclusively demonstrated. Using an online educational platform that we developed, we conducted an in-class matched-set, randomized crossover experiment with high power to detect small effects. We establish that peer assessment causes a small but significant gain in student achievement. Our study also demonstrates the potential of web-based platforms to facilitate the design of high-quality experiments to identify small effects that were previously not detectable.
Large deviations in the random sieve
NASA Astrophysics Data System (ADS)
Grimmett, Geoffrey
1997-05-01
The proportion $\rho_k$ of gaps of length $k$ between square-free numbers is shown to satisfy $\log \rho_k = -(1+o(1))(6/\pi^2)\,k\log k$ as $k \to \infty$. Such asymptotics are consistent with Erdős's challenge to prove that the gap following the square-free number $t$ is smaller than $c\log t/\log\log t$, for all $t$ and some constant $c$ satisfying $c > \pi^2/12$. The results of this paper are achieved by studying the probabilities of large deviations in a certain 'random sieve', for which the proportions $\rho_k$ have representations as probabilities. The asymptotic form of $\rho_k$ may be obtained in situations of greater generality, when the squared primes are replaced by an arbitrary sequence $(s_r)$ of relatively prime integers satisfying $\sum_r 1/s_r < \infty$, subject to two further conditions of regularity on this sequence.
O'Farrell, Timothy J.; Murphy, Marie; Alter, Jane; Fals-Stewart, William
2008-01-01
Alcoholic patients in inpatient detoxification were randomized to treatment as usual (TAU) or to a brief family treatment (BFT) intervention to promote continuing care post-detox. BFT consisted of meeting with the patient and an adult family member (in person or over the phone) with whom the patient lived, to review and recommend potential continuing care plans for the patient. Results showed that BFT patients (N=24) were significantly more likely than TAU patients (N=21) to enter a continuing care program after detoxification. This was a medium to large effect size. In the 3 months after detoxification, days using alcohol or drugs (a) trended lower for treatment-exposed BFT patients who had an in-person family meeting than for TAU counterparts (medium effect), and (b) were significantly lower for patients who entered continuing care regardless of treatment condition (large effect). PMID:17614242
Moura, Lidia Mvr; Westover, M Brandon; Kwasnik, David; Cole, Andrew J; Hsu, John
2017-01-01
The elderly population faces an increasing number of cases of chronic neurological conditions, such as epilepsy and Alzheimer's disease. Because the elderly with epilepsy are commonly excluded from randomized controlled clinical trials, there are few rigorous studies to guide clinical practice. When the elderly are eligible for trials, they either rarely participate or frequently have poor adherence to therapy, thus limiting both generalizability and validity. In contrast, large observational data sets are increasingly available, but are susceptible to bias when using common analytic approaches. Recent developments in causal inference-analytic approaches also introduce the possibility of emulating randomized controlled trials to yield valid estimates. We provide a practical example of the application of the principles of causal inference to a large observational data set of patients with epilepsy. This review also provides a framework for comparative-effectiveness research in chronic neurological conditions.
Rigorous control conditions diminish treatment effects in weight loss randomized controlled trials
Dawson, John A.; Kaiser, Kathryn A.; Affuso, Olivia; Cutter, Gary R.; Allison, David B.
2015-01-01
Background: It has not been established whether control conditions with large weight losses (WLs) diminish expected treatment effects in WL or prevention of weight gain (PWG) randomized controlled trials (RCTs). Subjects/Methods: We performed a meta-analysis of 239 WL/PWG RCTs that include a control group and at least one treatment group. A maximum likelihood meta-analysis framework is used to model and understand the relationship between treatment effects and control group outcomes. Results: Under the informed model, an increase in control group WL of one kilogram corresponds to an expected shrinkage of the treatment effect by 0.309 kg [95% CI (−0.480, −0.138), p = 0.00081]; this result is robust against violations of the model assumptions. Conclusions: We find that control conditions with large weight losses diminish expected treatment effects. Our investigation may be helpful to clinicians as they design future WL/PWG studies. PMID:26449419
De Jager, N. R.; Pastor, J.
2009-01-01
Ungulate herbivores create patterns of forage availability, plant species composition, and soil fertility as they range across large landscapes and consume large quantities of plant material. Over time, herbivore populations fluctuate, producing great potential for spatio-temporal landscape dynamics. In this study, we extend the spatial and temporal extent of a long-term investigation of the relationship of landscape patterns to moose foraging behavior at Isle Royale National Park, MI. We examined how patterns of browse availability and consumption, plant basal area, and soil fertility changed during a recent decline in the moose population. We used geostatistics to examine changes in the nature of spatial patterns in two valleys over 18 years and across short-range and long-range distance scales. Landscape patterns of available and consumed browse changed from either repeated patches or randomly distributed patches in 1988-1992 to random point distributions by 2007, after a recent record high peak followed by a rapid decline in the moose population. Patterns of available and consumed browse became decoupled during the moose population low, in contrast to the coupled patterns during the earlier high moose population. Distributions of plant basal area and soil nitrogen availability also switched from repeated patches to randomly distributed patches in one valley and to random point distributions in the other valley. Rapid declines in moose population density may release vegetation and soil fertility from browsing pressure and in turn create random landscape patterns. © Springer Science+Business Media B.V. 2009.
Deschamps, Alain; Hall, Richard; Grocott, Hilary; Mazer, C David; Choi, Peter T; Turgeon, Alexis F; de Medicis, Etienne; Bussières, Jean S; Hudson, Christopher; Syed, Summer; Seal, Doug; Herd, Stuart; Lambert, Jean; Denault, André; Deschamps, Alain; Mutch, Alan; Turgeon, Alexis; Denault, Andre; Todd, Andrea; Jerath, Angela; Fayad, Ashraf; Finnegan, Barry; Kent, Blaine; Kennedy, Brent; Cuthbertson, Brian H; Kavanagh, Brian; Warriner, Brian; MacAdams, Charles; Lehmann, Christian; Fudorow, Christine; Hudson, Christopher; McCartney, Colin; McIsaac, Dan; Dubois, Daniel; Campbell, David; Mazer, David; Neilpovitz, David; Rosen, David; Cheng, Davy; Drapeau, Dennis; Dillane, Derek; Tran, Diem; Mckeen, Dolores; Wijeysundera, Duminda; Jacobsohn, Eric; Couture, Etienne; de Medicis, Etienne; Alam, Fahad; Abdallah, Faraj; Ralley, Fiona E; Chung, Frances; Lellouche, Francois; Dobson, Gary; Germain, Genevieve; Djaiani, George; Gilron, Ian; Hare, Gregory; Bryson, Gregory; Clarke, Hance; McDonald, Heather; Roman-Smith, Helen; Grocott, Hilary; Yang, Homer; Douketis, James; Paul, James; Beaubien, Jean; Bussières, Jean; Pridham, Jeremy; Armstrong, J N; Parlow, Joel; Murkin, John; Gamble, Jonathan; Duttchen, Kaylene; Karkouti, Keyvan; Turner, Kim; Baghirzada, Leyla; Szabo, Linda; Lalu, Manoj; Wasowicz, Marcin; Bautista, Michael; Jacka, Michael; Murphy, Michael; Schmidt, Michael; Verret, Michaël; Perrault, Michel-Antoine; Beaudet, Nicolas; Buckley, Norman; Choi, Peter; MacDougall, Peter; Jones, Philip; Drolet, Pierre; Beaulieu, Pierre; Taneja, Ravi; Martin, Rene; Hall, Richard; George, Ronald; Chun, Rosa; McMullen, Sarah; Beattie, Scott; Sampson, Sonia; Choi, Stephen; Kowalski, Stephen; McCluskey, Stuart; Syed, Summer; Boet, Sylvain; Ramsay, Tim; Saha, Tarit; Mutter, Thomas; Chowdhury, Tumul; Uppal, Vishal; Mckay, William
2016-04-01
Cerebral oxygen desaturation during cardiac surgery has been associated with adverse perioperative outcomes. Before a large multicenter randomized controlled trial (RCT) on the impact of preventing desaturations on perioperative outcomes, the authors undertook a randomized prospective, parallel-arm, multicenter feasibility RCT to determine whether an intervention algorithm could prevent desaturations. Eight Canadian sites randomized 201 patients between April 2012 and October 2013. The primary outcome was the success rate of reversing cerebral desaturations below 10% relative to baseline in the intervention group. Anesthesiologists were blinded to the cerebral saturation values in the control group. Intensive care unit personnel were blinded to cerebral saturation values for both groups. Secondary outcomes included the area under the curve of cerebral desaturation load, enrolment rates, and a 30-day follow-up for adverse events. Cerebral desaturations occurred in 71 (70%) of the 102 intervention group patients and 56 (57%) of the 99 control group patients (P = 0.04). Reversal was successful in 69 (97%) of the intervention group patients. The mean cerebral desaturation load (SD) in the operating room was smaller for intervention group patients compared with control group patients (104 [217] %.min vs. 398 [869] %.min, mean difference, -294; 95% CI, -562 to -26; P = 0.03). This was also true in the intensive care unit (P = 0.02). There were no differences in adverse events between the groups. Study sites were successful in reversal of desaturation, patient recruitment, randomization, and follow-up in cardiac surgery, supporting the feasibility of conducting a large multicenter RCT.
Zhang, Guo-Qiang; Tao, Shiqiang; Xing, Guangming; Mozes, Jeno; Zonjy, Bilal; Lhatoo, Samden D; Cui, Licong
2015-11-10
A unique study identifier serves as a key for linking research data about a study subject without revealing protected health information in the identifier. While sufficient for single-site and limited-scale studies, the use of common unique study identifiers has several drawbacks for large multicenter studies, where thousands of research participants may be recruited from multiple sites. An important property of study identifiers is error tolerance (or validatability), in that inadvertent editing mistakes during their transmission and use will most likely result in invalid study identifiers. This paper introduces a novel method called "Randomized N-gram Hashing (NHash)" for generating unique study identifiers in a distributed and validatable fashion in multicenter research. NHash has a unique set of properties: (1) it is a pseudonym serving the purpose of linking research data about a study participant for research purposes; (2) it can be generated automatically in a completely distributed fashion with virtually no risk of identifier collision; (3) it incorporates a set of cryptographic hash functions based on N-grams, with a combination of additional encryption techniques such as a shift cipher; (4) it is validatable (error tolerant) in the sense that inadvertent edit errors will mostly result in invalid identifiers. NHash consists of two phases. In the first phase, an intermediate string is generated using randomized N-gram hashing. This string consists of a collection of N-gram hashes f1, f2, ..., fk. The input for each function fi has three components: a random number r, an integer n, and input data m. The result, fi(r, n, m), is an n-gram of m with a starting position s, computed as (r mod |m|), where |m| represents the length of m. The output of the first phase is the concatenation of the sequence f1(r1, n1, m1), f2(r2, n2, m2), ..., fk(rk, nk, mk). In the second phase, the intermediate string generated in the first phase is encrypted using techniques such as a shift cipher. The result of the encryption, concatenated with the random number r, is the final NHash study identifier. We performed experiments using a large synthesized dataset comparing NHash with random strings, and demonstrated negligible probability of collision. We implemented NHash for the Center for SUDEP Research (CSR), a National Institute of Neurological Disorders and Stroke-funded Center Without Walls for Collaborative Research in the Epilepsies. This multicenter collaboration involves 14 institutions across the United States and Europe, bringing together extensive and diverse expertise to understand sudden unexpected death in epilepsy patients (SUDEP). The CSR Data Repository has successfully used NHash to link deidentified multimodal clinical data collected in participating CSR institutions, meeting all desired objectives of NHash.
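The two phases described above map naturally onto a few lines of code. The sketch below is a toy reconstruction from the abstract alone, not the authors' implementation: it reuses one random number r across all N-gram functions, wraps n-grams around the end of the input, and uses a simple alphanumeric shift cipher; all field values are invented.

```python
import random
import string

ALPHABET = string.ascii_uppercase + string.digits

def ngram(r: int, n: int, m: str) -> str:
    """n-gram of m starting at s = r mod |m| (wrapping is an assumption)."""
    s = r % len(m)
    return (m + m)[s:s + n]

def shift_cipher(text: str, shift: int) -> str:
    """Caesar-style shift over an alphanumeric alphabet."""
    idx = {c: i for i, c in enumerate(ALPHABET)}
    return "".join(ALPHABET[(idx[c] + shift) % len(ALPHABET)]
                   for c in text.upper() if c in idx)

def nhash(fields, seed=None):
    """fields: list of (n, m) pairs. Phase 1 concatenates randomized
    n-grams; Phase 2 encrypts and appends the random number r."""
    r = random.Random(seed).randrange(10**6)
    intermediate = "".join(ngram(r, n, m) for n, m in fields)
    return shift_cipher(intermediate, shift=r % len(ALPHABET)) + f"-{r:06d}"

# Hypothetical inputs: site code, enrollment date, visit code.
print(nhash([(3, "SITE014"), (4, "20150610"), (2, "V01")], seed=42))
```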
Levosimendan for Perioperative Cardioprotection: Myth or Reality?
Santillo, Elpidio; Migale, Monica; Massini, Carlo; Incalzi, Raffaele Antonelli
2018-03-21
Levosimendan is a calcium sensitizer drug causing increased contractility in the myocardium and vasodilation in the vascular system. It is mainly used for the therapy of acute decompensated heart failure. Several studies on animals and humans have provided evidence of the cardioprotective properties of levosimendan, including preconditioning and anti-apoptotic effects. In view of these favorable effects, levosimendan has been tested in patients undergoing cardiac surgery for the prevention or treatment of low cardiac output syndrome. However, initial positive results from small studies have not been confirmed in three recent large trials. Here we summarize levosimendan's mechanisms of action and clinical use and review the available evidence on its perioperative use in the cardiac surgery setting. We searched two electronic medical databases for randomized controlled trials studying levosimendan in cardiac surgery patients, ranging from January 2000 to August 2017. Meta-analyses, consensus documents, and retrospective studies were also reviewed. In the selected interval of time, 54 studies on the use of levosimendan in heart surgery have been performed. Early small studies and meta-analyses suggested that perioperative levosimendan infusion could diminish mortality and other adverse outcomes (i.e., intensive care unit stay and need for inotropic support). Instead, three recent large randomized controlled trials (LEVO-CTS, CHEETAH, and LICORN) showed no significant survival benefit from levosimendan. However, in the LEVO-CTS trial, prophylactic levosimendan administration significantly reduced the incidence of low cardiac output syndrome. Based on the most recent randomized controlled trials, levosimendan, although effective for the treatment of acute heart failure, cannot be recommended as standard therapy for the management of heart surgery patients. Further studies are needed to clarify whether selected subgroups of heart surgery patients may benefit from perioperative levosimendan infusion.
ERIC Educational Resources Information Center
Stanger-Hall, Kathrin F.; Shockley, Floyd W.; Wilson, Rachel E.
2011-01-01
We implemented a "how to study" workshop for small groups of students (6-12) for N = 93 consenting students, randomly assigned from a large introductory biology class. The goal of this workshop was to teach students self-regulating techniques with visualization-based exercises as a foundation for learning and critical thinking in two areas:…
The Role of Gender in Youth Mentoring Relationship Formation and Duration
ERIC Educational Resources Information Center
Rhodes, Jean; Lowe, Sarah R.; Litchfield, Leon; Walsh-Samp, Kathy
2008-01-01
The role of gender in shaping the course and quality of adult-youth mentoring relationships was examined. The study drew on data from a large, random assignment evaluation of Big Brothers Big Sisters of America (BBSA) programs [Grossman, J. B., & Tierney, J. P. (1998). Does mentoring work? An impact study of the Big Brothers Big Sisters program.…
Steve Zack; William F. Laudenslayer; Luke George; Carl Skinner; William Oliver
1999-01-01
At two different locations in northeast California, an interdisciplinary team of scientists is initiating long-term studies to quantify the effects of forest manipulations intended to accelerate and/or enhance late-successional structure of eastside pine forest ecosystems. One study, at Blacks Mountain Experimental Forest, uses a split-plot, factorial, randomized block...
ERIC Educational Resources Information Center
Kalet, A.; Ellaway, R. H.; Song, H. S.; Nick, M.; Sarpel, U.; Hopkins, M. A.; Hill, J.; Plass, J. L.; Pusic, M. V.
2013-01-01
Participant attrition may be a significant threat to the generalizability of the results of educational research studies if participants who do not persist in a study differ from those who do in ways that can affect the experimental outcomes. A multi-center trial of the efficacy of different computer-based instructional strategies gave us the…
ERIC Educational Resources Information Center
Max, Jeffrey; Constantine, Jill; Wellington, Alison; Hallgren, Kristin; Glazerman, Steven; Chiang, Hanley; Speroni, Cecilia
2014-01-01
The Teacher Incentive Fund (TIF) provides grants to support performance-based compensation systems for teachers and principals in high-need schools. The study measures the impact of pay-for-performance bonuses as part of a comprehensive compensation system within a large, multisite random assignment study design. The treatment schools were to…
ERIC Educational Resources Information Center
Chiang, Hanley; Wellington, Alison; Hallgren, Kristin; Speroni, Cecilia; Herrmann, Mariesa; Glazerman, Steven; Constantine, Jill
2015-01-01
The Teacher Incentive Fund (TIF) provides grants to support performance-based compensation systems for teachers and principals in high-need schools. The study measures the impact of pay-for-performance bonuses as part of a comprehensive compensation system within a large, multisite random assignment study design. The treatment schools were to…
ERIC Educational Resources Information Center
Dart, Evan H.; Radley, Keith C.; Briesch, Amy M.; Furlow, Christopher M.; Cavell, Hannah J.
2016-01-01
Two studies investigated the accuracy of eight different interval-based group observation methods that are commonly used to assess the effects of classwide interventions. In Study 1, a Microsoft Visual Basic program was created to simulate a large set of observational data. Binary data were randomly generated at the student level to represent…
Meteorite fractures and the behavior of meteoroids in the atmosphere
NASA Astrophysics Data System (ADS)
Bryson, K.; Ostrowski, D. R.; Sears, D. W. G.
2015-12-01
Arguably the major difficulty in modeling the atmospheric behavior of objects entering the atmosphere is that we know very little about the internal structure of these objects and how they fragment during fall. In a study of over a thousand meteorite fragments (mostly hand-sized, some 40 or 50 cm across) in the collections of the Natural History Museums in Vienna and London, we identified six kinds of fracturing behavior. (1) Chondrites usually showed random fractures with no particular sensitivity to meteorite texture. (2) Coarse irons fractured along kamacite grain boundaries, while (3) fine irons fragmented randomly, cf. chondrites. (4) Fine irons with large crystal boundaries (e.g., Arispe) fragmented along the crystal boundaries. (5) A few chondrites, three in the present study, have a distinct and strong network of fractures making a brickwork or chicken-wire structure. The Chelyabinsk meteorite has the chicken-wire structure of fractures, which explains the very large number of centimeter-sized fragments that showered the Earth. Finally, (6) previous work on Sutter's Mill showed that water-rich meteorites fracture around clasts. To scale the meteorite fractures to the fragmentation behavior of near-Earth asteroids, it has been suggested that the fracturing behavior follows a statistical prediction made in the 1930s, the Weibull distribution, where fractures are assumed to be randomly distributed through the target and the likelihood of encountering a fracture increases with distance. This results in the relationship $\sigma_l = \sigma_s (n_s/n_l)^{\alpha}$, where $\sigma_s$ and $\sigma_l$ refer to the stress in the small and large object and $n_s$ and $n_l$ refer to the number of cracks per unit volume of the small and large object. The value of $\alpha$, the Weibull coefficient, is unclear. The Ames meteorite laboratory is working to measure the density and length of fractures observed in these six types of fracture to determine values of the Weibull coefficient for each type of object.
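As a quick numerical illustration of the Weibull relation quoted above, the function below scales a hypothetical hand-sample failure stress to a larger body; the 10 MPa strength, the crack-count ratio, and the coefficient alpha = 0.1 are all invented for the example, since the abstract itself notes that alpha is unclear.

```python
def weibull_strength(sigma_s: float, n_s: float, n_l: float, alpha: float) -> float:
    """Scale failure stress from a small tested sample to a large body:
    sigma_l = sigma_s * (n_s / n_l) ** alpha."""
    return sigma_s * (n_s / n_l) ** alpha

# A 10 MPa hand sample and a body with 1e6 times as many cracks:
print(f"{weibull_strength(10.0, 1.0, 1e6, 0.1):.2f} MPa")  # ~2.51 MPa
```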
2012-01-01
This paper presents the rationale and methods for a randomized controlled evaluation of web-based training in motivational interviewing, goal setting, and behavioral task assignment. Web-based training may be a practical and cost-effective way to address the need for large-scale mental health training in evidence-based practice; however, there is a dearth of well-controlled outcome studies of these approaches. For the current trial, 168 mental health providers treating post-traumatic stress disorder (PTSD) were assigned to web-based training plus supervision, web-based training, or training-as-usual (control). A novel standardized patient (SP) assessment was developed and implemented for objective measurement of changes in clinical skills, while on-line self-report measures were used for assessing changes in knowledge, perceived self-efficacy, and practice related to cognitive behavioral therapy (CBT) techniques. Eligible participants were all actively involved in mental health treatment of veterans with PTSD. Study methodology illustrates ways of developing training content, recruiting participants, and assessing knowledge, perceived self-efficacy, and competency-based outcomes, and demonstrates the feasibility of conducting prospective studies of training efficacy or effectiveness in large healthcare systems. PMID:22583520
A simple model for pollen-parent fecundity distributions in bee-pollinated forage legume polycrosses
USDA-ARS?s Scientific Manuscript database
Random mating, or panmixis, is a fundamental assumption in quantitative genetic theory. Random mating is sometimes assumed to occur in practice, although a large body of empirical work shows that this is often not the case in nature. Models have been developed to describe many non-random mating phenome...
No Randomization? No Problem: Experimental Control and Random Assignment in Single Case Research
ERIC Educational Resources Information Center
Ledford, Jennifer R.
2018-01-01
Randomization of large numbers of participants to different treatment groups is often not a feasible or preferable way to answer questions of immediate interest to professional practice. Single case designs (SCDs) are a class of research designs that are experimental in nature but require only a few participants, all of whom receive the…
NASA Astrophysics Data System (ADS)
Takahashi, T.; Obana, K.; Yamamoto, Y.; Nakanishi, A.; Kodaira, S.; Kaneda, Y.
2011-12-01
In the Nankai trough, there are three seismogenic zones of megathrust earthquakes (the Tokai, Tonankai, and Nankai earthquakes). Lithospheric structures in and around these seismogenic zones are important for studies of the mutual interactions and synchronization of their fault ruptures. Recent studies of seismic wave scattering at high frequencies (>1 Hz) make it possible to estimate 3D distributions of random inhomogeneities (or scattering coefficient) in the lithosphere, and have clarified that random inhomogeneity is one of the important medium properties related to microseismicity and damaged structure near fault zones [Asano & Hasegawa, 2004; Takahashi et al., 2009]. This study estimates the spatial distribution of the power spectral density function (PSDF) of random inhomogeneities in the western part of the Nankai subduction zone and examines its relations with crustal velocity structure and seismic activity. Seismic waveform data used in this study are those recorded at seismic stations of Hi-net and F-net operated by NIED, and at 160 ocean bottom seismographs (OBSs) deployed in the Hyuga-nada region from Dec. 2008 to Jan. 2009. This OBS observation was conducted by JAMSTEC as a part of "Research concerning Interaction Between the Tokai, Tonankai and Nankai Earthquakes" funded by the Ministry of Education, Culture, Sports, Science and Technology, Japan. The spatial distribution of random inhomogeneities is estimated by inversion analysis of the peak delay times of small earthquakes [Takahashi et al., 2009], where the peak delay time is defined as the time lag from the S-wave onset to its maximal amplitude arrival. We assumed the von Karman type functional form for the PSDF. Peak delay times are measured from root mean squared envelopes at 4-8 Hz, 8-16 Hz, and 16-32 Hz. The inversion results can be summarized as follows. Random inhomogeneities beneath the Quaternary volcanoes are characterized by strong inhomogeneities at small spatial scales (~a few hundred meters) and a weak spectral gradient. Those in the Hyuga-nada region are characterized by strong inhomogeneities at large spatial wavelengths and a steep spectral gradient. Random inhomogeneities in the Hyuga-nada region are similar to those in the frontal arc high in the northern Izu-Bonin arc, which is thought to be a remnant arc that is presently inactive [Takahashi et al., 2011]. This coincidence implies the existence of the subducted Kyushu-Palau ridge in this anomaly of random inhomogeneities, which is also suggested by a seismic refraction survey in this region [Nakanishi et al., 2010 AGU Fall Mtg.]. Source rupture areas of large earthquakes (M>6) in the Hyuga-nada region tend to be located around this anomaly of inhomogeneities. We may say that this anomalously inhomogeneous region is a structural factor affecting the seismic activity in the Hyuga-nada region.
Random effects coefficient of determination for mixed and meta-analysis models
Demidenko, Eugene; Sargent, James; Onega, Tracy
2011-01-01
The key feature of a mixed model is the presence of random effects. We have developed a coefficient, called the random effects coefficient of determination, $R_r^2$, that estimates the proportion of the conditional variance of the dependent variable explained by random effects. This coefficient takes values from 0 to 1 and indicates how strong the random effects are. The difference from the earlier suggested fixed effects coefficient of determination is emphasized. If $R_r^2$ is close to 0, there is weak support for random effects in the model because the reduction of the variance of the dependent variable due to random effects is small; consequently, random effects may be ignored and the model simplifies to standard linear regression. A value of $R_r^2$ apart from 0 indicates evidence of variance reduction in support of the mixed model. If the random effects coefficient of determination is close to 1, the variance of the random effects is very large and the random effects turn into free fixed effects; the model can then be estimated using the dummy variable approach. We derive explicit formulas for $R_r^2$ in three special cases: the random intercept model, the growth curve model, and the meta-analysis model. Theoretical results are illustrated with three mixed model examples: (1) travel time to the nearest cancer center for women with breast cancer in the U.S., (2) cumulative time watching alcohol related scenes in movies among young U.S. teens, as a risk factor for early drinking onset, and (3) the classic example of the meta-analysis model for combination of 13 studies on tuberculosis vaccine. PMID:23750070
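The paper's explicit formulas are not quoted in this abstract. For the random intercept model, one natural reading of the definition (proportion of conditional variance explained by random effects) is the variance ratio sketched below; treat that reduction and the simulated numbers as illustrative assumptions, not the authors' derivation.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulated random-intercept data: 40 groups of 10 observations,
# random-intercept sd 1.5 and residual sd 1.0.
g = np.repeat(np.arange(40), 10)
y = 2.0 + rng.normal(0, 1.5, 40)[g] + rng.normal(0, 1.0, 400)
df = pd.DataFrame({"y": y, "g": g})

fit = smf.mixedlm("y ~ 1", df, groups=df["g"]).fit()
var_b = float(fit.cov_re.iloc[0, 0])   # estimated random-intercept variance
var_e = fit.scale                      # estimated residual variance
print(f"R_r^2 ~ {var_b / (var_b + var_e):.2f}")  # true value: 1.5^2/(1.5^2+1) = 0.69
```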
Applying a weighted random forests method to extract karst sinkholes from LiDAR data
NASA Astrophysics Data System (ADS)
Zhu, Junfeng; Pierskalla, William P.
2016-02-01
Detailed mapping of sinkholes provides critical information for mitigating sinkhole hazards and understanding groundwater and surface water interactions in karst terrains. LiDAR (Light Detection and Ranging) measures the earth's surface at high resolution and high density and has shown great potential to drastically improve the locating and delineating of sinkholes. However, processing LiDAR data to extract sinkholes requires separating sinkholes from other depressions, which can be laborious because of the sheer number of depressions commonly generated from LiDAR data. In this study, we applied random forests, a machine learning method, to automatically separate sinkholes from other depressions in a karst region in central Kentucky. The sinkhole-extraction random forest was grown on a training dataset built from an area where LiDAR-derived depressions were manually classified through a visual inspection and field verification process. Based on the geometry of depressions, as well as natural and human factors related to sinkholes, 11 parameters were selected as predictive variables to form the dataset. Because the training dataset was imbalanced, with the majority of depressions being non-sinkholes, a weighted random forests method was used to improve the accuracy of predicting sinkholes. The weighted random forest achieved an average accuracy of 89.95% for the training dataset, demonstrating that the random forest can be an effective sinkhole classifier. Testing of the random forest in another area, however, resulted in moderate success, with an average accuracy rate of 73.96%. This study suggests that an automatic sinkhole extraction procedure like the random forest classifier can significantly reduce time and labor costs and make it more tractable to map sinkholes using LiDAR data for large areas. However, the random forests method cannot totally replace manual procedures, such as visual inspection and field verification.
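Scikit-learn is not mentioned in the study, but its class-weighting option gives a compact stand-in for the weighted random forest idea; the synthetic data below merely mimics the imbalance described, with sinkholes as a small minority among depressions.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for the 11 depression predictors, with sinkholes
# (class 1) as a 10% minority; purely illustrative data.
X, y = make_classification(n_samples=2000, n_features=11,
                           weights=[0.9, 0.1], random_state=0)

clf = RandomForestClassifier(n_estimators=500,
                             class_weight="balanced",  # up-weight the rare class
                             oob_score=True, random_state=0)
clf.fit(X, y)
print(f"out-of-bag accuracy: {clf.oob_score_:.3f}")
```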
Some practical problems in implementing randomization.
Downs, Matt; Tucker, Kathryn; Christ-Schmidt, Heidi; Wittes, Janet
2010-06-01
While often theoretically simple, implementing randomization to treatment in a masked, but confirmable, fashion can prove difficult in practice. At least three categories of problems occur in randomization: (1) bad judgment in the choice of method, (2) design and programming errors in implementing the method, and (3) human error during the conduct of the trial. This article focuses on these latter two types of errors, dealing operationally with what can go wrong after trial designers have selected the allocation method. We offer several case studies and corresponding recommendations for lessening the frequency of problems in allocating treatment or for mitigating the consequences of errors. Recommendations include: (1) reviewing the randomization schedule before starting a trial, (2) being especially cautious of systems that use on-demand random number generators, (3) drafting unambiguous randomization specifications, (4) performing thorough testing before entering a randomization system into production, (5) maintaining a dataset that captures the values investigators used to randomize participants, thereby allowing the process of treatment allocation to be reproduced and verified, (6) resisting the urge to correct errors that occur in individual treatment assignments, (7) preventing inadvertent unmasking to treatment assignments in kit allocations, and (8) checking a sample of study drug kits to allow detection of errors in drug packaging and labeling. Although we performed a literature search of documented randomization errors, the examples that we provide and the resultant recommendations are based largely on our own experience in industry-sponsored clinical trials. We do not know how representative our experience is or how common errors of the type we have seen occur. Our experience underscores the importance of verifying the integrity of the treatment allocation process before and during a trial.
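Recommendations (1) and (5) amount to generating the whole schedule reproducibly before the trial starts, rather than calling a random number generator on demand. A minimal sketch of a reviewable permuted-block schedule, with an invented seed and block size, might look like this:

```python
import random

def permuted_block_schedule(n_blocks, block_size=4, arms=("A", "B"), seed=123456):
    """Pre-generate a reproducible permuted-block randomization list.
    A fixed seed lets the schedule be reviewed before the trial starts
    and re-created later to verify treatment allocation."""
    assert block_size % len(arms) == 0
    rng = random.Random(seed)
    schedule = []
    for _ in range(n_blocks):
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)   # each block stays balanced across arms
        schedule.extend(block)
    return schedule

print(permuted_block_schedule(3))  # 12 assignments, balanced within blocks
```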
Long, Jiang; Liu, Tie-Qiao; Liao, Yan-Hui; Qi, Chang; He, Hao-Yu; Chen, Shu-Bao; Billieux, Joël
2016-11-17
Smartphones are becoming a daily necessity for most undergraduates in Mainland China. Because problematic smartphone use (PSU) in this population is largely unexplored, in the current study we aimed to estimate the prevalence of PSU and to screen suitable predictors of PSU among Chinese undergraduates within the framework of stress-coping theory. A sample of 1062 undergraduate smartphone users was recruited by means of a stratified cluster random sampling strategy between April and May 2015. The Problematic Cellular Phone Use Questionnaire was used to identify PSU. We evaluated five candidate risk factors for PSU using logistic regression analysis while controlling for demographic characteristics and specific features of smartphone use. The prevalence of PSU among Chinese undergraduates was estimated to be 21.3%. The risk factors for PSU were majoring in the humanities, high monthly income from the family (≥1500 RMB), serious emotional symptoms, high perceived stress, and perfectionism-related factors (high doubts about actions, high parental expectations). PSU among undergraduates appears to be ubiquitous and thus constitutes a public health issue in Mainland China. Although further longitudinal studies are required to test whether PSU is a transient phenomenon or a chronic and progressive condition, our study successfully identified socio-demographic and psychological risk factors for PSU. These results, obtained from a random and thus representative sample of undergraduates, open up new avenues in terms of prevention and regulation policies.
Fidelity under isospectral perturbations: a random matrix study
NASA Astrophysics Data System (ADS)
Leyvraz, F.; García, A.; Kohler, H.; Seligman, T. H.
2013-07-01
The set of Hamiltonians generated from a single Hamiltonian by all unitary transformations is the largest set of isospectral Hamiltonians we can form. Taking advantage of the fact that the unitary group is generated by Hermitian matrices, we can take unitaries generated by Gaussian unitary ensemble matrices with a small parameter as small perturbations. Similarly, the transformations generated from orthogonal matrices by Hermitian antisymmetric matrices form isospectral transformations among symmetric matrices. Based on this concept we can obtain the fidelity decay of a system that decays under a random isospectral perturbation with well-defined properties regarding time-reversal invariance. If we choose the Hamiltonian itself also from a classical random matrix ensemble, then we obtain solutions in terms of form factors in the limit of large matrices.
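The construction in the abstract is concrete enough to simulate directly: draw H and a Hermitian generator W from the GUE, form the isospectral partner H' = U H U† with U = exp(i eps W), and track a fidelity amplitude. The sketch below uses one common definition of the fidelity amplitude and invented values of N and eps.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
N, eps = 100, 0.02

def gue(n):
    """GUE-like random Hermitian matrix."""
    a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return (a + a.conj().T) / 2

H = gue(N)
U = expm(1j * eps * gue(N))   # small unitary from a Hermitian generator
Hp = U @ H @ U.conj().T       # isospectral perturbation of H

# Fidelity amplitude f(t) = |Tr(exp(iHt) exp(-iH't))| / N
for t in (0.0, 1.0, 2.0, 4.0):
    f = abs(np.trace(expm(1j * H * t) @ expm(-1j * Hp * t))) / N
    print(f"t = {t:3.1f}   f = {f:.4f}")
```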
Bayesian exponential random graph modelling of interhospital patient referral networks.
Caimo, Alberto; Pallotti, Francesca; Lomi, Alessandro
2017-08-15
Using original data that we have collected on referral relations between 110 hospitals serving a large regional community, we show how recently derived Bayesian exponential random graph models may be adopted to illuminate core empirical issues in research on relational coordination among healthcare organisations. We show how a rigorous Bayesian computation approach supports a fully probabilistic analytical framework that alleviates well-known problems in the estimation of model parameters of exponential random graph models. We also show how the main structural features of interhospital patient referral networks that prior studies have described can be reproduced with accuracy by specifying the system of local dependencies that produce - but at the same time are induced by - decentralised collaborative arrangements between hospitals. Copyright © 2017 John Wiley & Sons, Ltd.
NASA Technical Reports Server (NTRS)
Weger, R. C.; Lee, J.; Zhu, Tianri; Welch, R. M.
1992-01-01
The current controversy regarding regularity vs. clustering in cloud fields is examined by means of analysis and simulation studies based upon nearest-neighbor cumulative distribution statistics. It is shown that the Poisson representation of random point processes is superior to pseudorandom-number-generated models, and that pseudorandom-number-generated models bias the observed nearest-neighbor statistics towards regularity. The interpretation of these nearest-neighbor statistics is discussed for many cases of superpositions of clustering, randomness, and regularity. A detailed analysis of cumulus cloud field spatial distributions is carried out based upon Landsat, AVHRR, and Skylab data, showing that, when both large and small clouds are included in the cloud field distributions, the cloud field always has a strong clustering signal.
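To make the comparison concrete, the toy simulation below contrasts mean nearest-neighbor distances for a uniform (Poisson-like) point field and a tightly clustered one; the cluster geometry and sample sizes are arbitrary choices, not those of the cited satellite analyses.

```python
import numpy as np

rng = np.random.default_rng(1)

def nn_distances(pts):
    """Nearest-neighbor distance of each point (brute force)."""
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return d.min(axis=1)

n = 500
uniform = rng.uniform(0, 1, (n, 2))    # Poisson-like field in the unit square
centers = rng.uniform(0, 1, (25, 2))
clustered = centers[rng.integers(0, 25, n)] + rng.normal(0, 0.01, (n, 2))

# Clustering pulls the mean NN distance far below the Poisson
# expectation of roughly 0.5 / sqrt(n) for the unit square.
print(f"uniform:   {nn_distances(uniform).mean():.4f}")
print(f"clustered: {nn_distances(clustered).mean():.4f}")
print(f"Poisson expectation ~ {0.5 / np.sqrt(n):.4f}")
```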
Brownian motion on random dynamical landscapes
NASA Astrophysics Data System (ADS)
Suñé Simon, Marc; Sancho, José María; Lindenberg, Katja
2016-03-01
We present a study of overdamped Brownian particles moving on a random landscape of dynamic and deformable obstacles (spatio-temporal disorder). The obstacles move randomly, assemble, and dissociate following their own dynamics. This landscape may account for a soft matter or liquid environment in which large obstacles, such as macromolecules and organelles in the cytoplasm of a living cell, or colloids or polymers in a liquid, move slowly leading to crowding effects. This representation also constitutes a novel approach to the macroscopic dynamics exhibited by active matter media. We present numerical results on the transport and diffusion properties of Brownian particles under this disorder biased by a constant external force. The landscape dynamics are characterized by a Gaussian spatio-temporal correlation, with fixed time and spatial scales, and controlled obstacle concentrations.
Solomon, Daniel H; Katz, Jeffrey N; Finkelstein, Joel S; Polinski, Jennifer M; Stedman, Margaret; Brookhart, M Alan; Arnold, Marilyn; Gauthier, Suzanne; Avorn, Jerry
2007-11-01
Fractures from osteoporosis are associated with substantial morbidity, mortality, and cost. However, only a minority of at-risk older adults receives screening and/or treatment for this condition. We evaluated the effect of educational interventions for osteoporosis targeting at-risk patients, primary care physicians, or both. We conducted a randomized controlled trial within the setting of a large drug benefit plan for Medicare beneficiaries. Primary care physicians and their patients were randomized to usual care, patient intervention only, physician intervention only, or both interventions. The at-risk patients were women ≥65 yr of age, men and women ≥65 yr of age with a prior fracture, and men and women ≥65 yr of age who used oral glucocorticoids. The primary outcome studied was a composite of either undergoing a BMD test or initiating a medication used for osteoporosis. The secondary outcome was a hip, humerus, spine, or wrist fracture. We randomized 828 primary care physicians and their 13,455 eligible at-risk patients into four study arms. Physician and patient characteristics were very similar across all four groups. Across all four groups, the rate of the composite outcome was 10.3 per 100 person-years and did not differ between the usual care and the combined intervention groups (p = 0.5). In adjusted Cox proportional hazards models, there was no difference in the probability of the primary composite endpoint (BMD test or osteoporosis medication) comparing the combined intervention group with usual care (risk ratio = 1.04; 95% CI, 0.85-1.26). There was also no difference in either of the components of the composite endpoint. The probability of fracture during follow-up was 4.2 per 100 person-years and did not differ by treatment assignment (p = 0.9). In this trial, a relatively brief program of patient and/or physician education did not improve the management of osteoporosis. More intensive efforts should be considered for future quality improvement programs for osteoporosis.
Aging in the three-dimensional random-field Ising model
NASA Astrophysics Data System (ADS)
von Ohr, Sebastian; Manssen, Markus; Hartmann, Alexander K.
2017-07-01
We studied the nonequilibrium aging behavior of the random-field Ising model in three dimensions for various values of the disorder strength. This allowed us to investigate how the aging behavior changes across the ferromagnetic-paramagnetic phase transition. We investigated a large system size of $N = 256^3$ spins and up to $10^8$ Monte Carlo sweeps. To reach these necessarily long simulation times, we employed an implementation running on Intel Xeon Phi coprocessors, reaching single-spin-flip times as short as 6 ps. We measured typical correlation functions in space and time to extract a growing length scale and corresponding exponents.
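The coprocessor-optimized implementation is far beyond an abstract-level sketch, but the underlying update is just single-spin-flip Metropolis dynamics on a cubic lattice with quenched Gaussian fields. A minimal version, with invented lattice size, temperature, and disorder strength, is:

```python
import numpy as np

rng = np.random.default_rng(0)
L, h_sigma, T = 32, 1.0, 0.8                  # lattice size, disorder, temperature
s = rng.choice([-1, 1], size=(L, L, L))        # random start (quench from T = infinity)
h = rng.normal(0.0, h_sigma, size=(L, L, L))   # quenched Gaussian random fields

def metropolis_sweep(s, h, T):
    """One sweep (L^3 attempted flips) of the 3D random-field Ising model."""
    for _ in range(s.size):
        i, j, k = rng.integers(0, L, 3)
        nn = (s[(i + 1) % L, j, k] + s[(i - 1) % L, j, k] +
              s[i, (j + 1) % L, k] + s[i, (j - 1) % L, k] +
              s[i, j, (k + 1) % L] + s[i, j, (k - 1) % L])
        dE = 2 * s[i, j, k] * (nn + h[i, j, k])  # energy cost of flipping this spin
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s[i, j, k] = -s[i, j, k]

metropolis_sweep(s, h, T)
print("magnetization after one sweep:", float(s.mean()))
```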
Universal energy distribution for interfaces in a random-field environment
NASA Astrophysics Data System (ADS)
Fedorenko, Andrei A.; Stepanow, Semjon
2003-11-01
We study the energy distribution function ρ(E) for interfaces in a random-field environment at zero temperature by summing the leading terms in the perturbation expansion of ρ(E) in powers of the disorder strength, and by taking into account the nonperturbational effects of the disorder using the functional renormalization group. We have found that the average and the variance of the energy for one-dimensional interface of length L behave as,
NASA Astrophysics Data System (ADS)
Braides, Andrea; Causin, Andrea; Piatnitski, Andrey; Solci, Margherita
2018-06-01
We consider randomly distributed mixtures of bonds of ferromagnetic and antiferromagnetic type in a two-dimensional square lattice with probability 1-p and p, respectively, according to an i.i.d. random variable. We study minimizers of the corresponding nearest-neighbour spin energy on large domains in Z^2. We prove that there exists p_0 such that for p≤ p_0 such minimizers are characterized by a majority phase; i.e., they take identically the value 1 or - 1 except for small disconnected sets. A deterministic analogue is also proved.
Strand-seq: a unifying tool for studies of chromosome segregation
Falconer, Ester; Lansdorp, Peter M.
2013-01-01
Non-random segregation of sister chromatids has been implicated to help specify daughter cell fate (the Silent Sister Hypothesis [1]) or to protect the genome of long-lived stem cells (the Immortal Strand Hypothesis [2]). The idea that sister chromatids are non-randomly segregated into specific daughter cells is only marginally supported by data from sporadic and often contradictory studies. As a result, the field has moved forward rather slowly. The advent of the ability to directly label and differentiate sister chromatids in vivo using fluorescence in situ hybridization [3] was a significant advance for such studies. However, this approach is limited by the need for large tracts of unidirectional repeats on chromosomes and the reliance on quantitative imaging of fluorescent probes and rigorous statistical analysis to discern between the two competing hypotheses. A novel method called Strand-seq, which uses next-generation sequencing to assay sister chromatid inheritance patterns independently for each chromosome [4], offers a comprehensive approach to test for non-random segregation. In addition, Strand-seq enables studies on the deposition of chromatin marks in relation to DNA replication. This method is expected to help unify the field by testing previous claims of non-random segregation in an unbiased way in many model systems in vitro and in vivo. PMID:23665005
Montgomery, John H; Byerly, Matthew; Carmody, Thomas; Li, Baitao; Miller, Daniel R; Varghese, Femina; Holland, Rhiannon
2004-12-01
The effect of funding source on the outcome of randomized controlled trials has been investigated in several medical disciplines; however, psychiatry has been largely excluded from such analyses. In this article, randomized controlled trials of second generation antipsychotics in schizophrenia are reviewed and analyzed with respect to funding source (industry vs. non-industry funding). A literature search was conducted for randomized, double-blind trials in which at least one of the tested treatments was a second generation antipsychotic. In each study, design quality and study outcome were assessed quantitatively according to rating scales. Mean quality and outcome scores were compared in the industry-funded studies and non-industry-funded studies. An analysis of the primary author's affiliation with industry was similarly performed. Results of industry-funded studies significantly favored second generation over first generation antipsychotics when compared to non-industry-funded studies. Non-industry-funded studies showed a trend toward higher quality than industry-funded studies; however, the difference between the two was not significant. Also, within the industry-funded studies, outcomes of trials involving first authors employed by industry sponsors demonstrated a trend toward second generation over first generation antipsychotics to a greater degree than did trials involving first authors employed outside the industry (p=0.05). While the retrospective design of the study limits the strength of the findings, the data suggest that industry bias may occur in randomized controlled trials in schizophrenia. There appears to be several sources by which bias may enter clinical research, including trial design, control of data analysis and multiplicity/redundancy of trials.
Treatment of Non-Tuberculous Mycobacterial Lung Disease.
Philley, Julie V; DeGroote, Mary Ann; Honda, Jennifer R; Chan, Michael M; Kasperbauer, Shannon; Walter, Nicholas D; Chan, Edward D
2016-12-01
Treatment of non-tuberculous mycobacterial lung disease (NTM-LD) is challenging for several reasons, including the relative resistance of NTM to currently available drugs and the difficulty of tolerating prolonged treatment with multiple drugs. The large, multicenter, prospective randomized studies still needed to establish the best regimens will also be arduous: multiple NTM species are known to cause human lung disease; differences in virulence and in response to treatment between species, and between strains within a species, will make randomization more difficult; relapse must be distinguished from new infection; and intolerance, toxicity, and/or drug-drug interactions often impair adherence and necessitate modification of therapeutic regimens. Furthermore, the out-of-state resident status of many patients seen at the relatively few centers that care for large numbers of NTM-LD patients poses logistical issues in monitoring response to treatment. Thus, current treatment regimens for NTM-LD are largely based on small case series, retrospective analyses, and guidelines based on expert opinion. It has been nearly 10 years since the publication of a consensus guideline for the treatment of NTM-LD. This review summarizes the available evidence on the treatment of the major NTM-LD until more definitive studies and guidelines become available.
Optimizing the LSST Dither Pattern for Survey Uniformity
NASA Astrophysics Data System (ADS)
Awan, Humna; Gawiser, Eric J.; Kurczynski, Peter; Carroll, Christopher M.; LSST Dark Energy Science Collaboration
2015-01-01
The Large Synoptic Survey Telescope (LSST) will gather detailed data on the southern sky, enabling unprecedented study of Baryonic Acoustic Oscillations, which are an important probe of dark energy. These studies require a survey with highly uniform depth, and we aim to find an observation strategy that optimizes this uniformity. We have shown that in the absence of dithering (large telescope-pointing offsets), the LSST survey will vary significantly in depth. Hence, we implemented various dithering strategies, including random and repulsive-random pointing offsets and spiral patterns with the spiral reaching completion in either a few months or the entire ten-year run. We employed three different implementations of dithering strategies: a single offset assigned to all fields observed on each night, offsets assigned to each field independently whenever the field is observed, and offsets assigned to each field only when the field is observed on a new night. Our analysis reveals that large dithers are crucial to guarantee survey uniformity and that assigning dithers to each field independently whenever the field is observed significantly increases this uniformity. These results suggest paths towards an optimal observation strategy that will enable LSST to achieve its science goals. We gratefully acknowledge support from the National Science Foundation REU program at Rutgers, PHY-1263280, and the Department of Energy, DE-SC0011636.
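As an illustration of the assignment schemes compared above, here is a minimal sketch of uniform random dithers within a circular field of view; the field-of-view radius and the scheduling loop are illustrative assumptions, not the LSST scheduler itself:

```python
import numpy as np

rng = np.random.default_rng(42)
FOV_RADIUS = 1.75  # deg; assumed field-of-view radius, for illustration only

def random_dither():
    """Draw one offset uniformly distributed over the field-of-view circle."""
    r = FOV_RADIUS * np.sqrt(rng.uniform())      # sqrt gives uniform areal density
    theta = rng.uniform(0.0, 2.0 * np.pi)
    return r * np.cos(theta), r * np.sin(theta)

n_nights, n_fields = 10, 5

# Scheme 1: a single offset per night, shared by every field observed that night.
per_night = [random_dither() for _ in range(n_nights)]

# Scheme 2: an independent offset for each field at every visit.
per_visit = [[random_dither() for _ in range(n_fields)] for _ in range(n_nights)]

# (Scheme 3, per field per new night, would cache Scheme 2 offsets by night.)
print(per_night[0], per_visit[0][0])
```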
Edgren, Gustaf; Anderson, Jacqueline; Dolk, Anders; Torgerson, Jarl; Nyberg, Svante; Skau, Tommy; Forsberg, Birger C; Werr, Joachim; Öhlen, Gunnar
2016-10-01
A small group of frequent visitors to Emergency Departments accounts for a disproportionately large fraction of healthcare consumption, including unplanned hospitalizations and overall healthcare costs. In response, several case and disease management programs aimed at reducing healthcare consumption in this group have been tested; however, results vary widely. Our aim was to investigate whether a telephone-based, nurse-led case management intervention can reduce healthcare consumption for frequent Emergency Department visitors in a large-scale setup. A total of 12 181 frequent Emergency Department users in three counties in Sweden were randomized, using Zelen's design or a traditional randomized design, to receive either a nurse-led case management intervention or no intervention, and were followed for healthcare consumption for up to 2 years. The traditional design showed an overall 12% (95% confidence interval 4-19%) decrease in the rate of hospitalization, mostly driven by effects in the last year. Similar results were achieved in the Zelen studies, with a significant reduction in hospitalization in the last year but mixed results early in the project. Our study provides evidence that a carefully designed telephone-based intervention, with accurate and systematic patient selection and appropriate staff training in a centralized setup, can lead to significant decreases in healthcare consumption and costs. Further, our results show that the effects are sensitive to the delivery model chosen.
Adalsteinsson, David; McMillen, David; Elston, Timothy C
2004-03-08
Intrinsic fluctuations due to the stochastic nature of biochemical reactions can have large effects on the response of biochemical networks. This is particularly true for pathways that involve transcriptional regulation, where generally there are two copies of each gene and the number of messenger RNA (mRNA) molecules can be small. Therefore, there is a need for computational tools for developing and investigating stochastic models of biochemical networks. We have developed the software package Biochemical Network Stochastic Simulator (BioNetS) for efficiently and accurately simulating stochastic models of biochemical networks. BioNetS has a graphical user interface that allows models to be entered in a straightforward manner and allows the user to specify the type of random variable (discrete or continuous) for each chemical species in the network. The discrete variables are simulated using an efficient implementation of the Gillespie algorithm. For the continuous random variables, BioNetS constructs and numerically solves the appropriate chemical Langevin equations. The software package has been developed to scale efficiently with network size, thereby allowing large systems to be studied. BioNetS runs as a BioSpice agent and can be downloaded from http://www.biospice.org. BioNetS can also be run as a stand-alone package. All the required files are accessible from http://x.amath.unc.edu/BioNetS. We have developed BioNetS to be a reliable tool for studying the stochastic dynamics of large biochemical networks. Important features of BioNetS are its ability to handle hybrid models that consist of both continuous and discrete random variables and its ability to model cell growth and division. We have verified the accuracy and efficiency of the numerical methods by considering several test systems.
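BioNetS itself must be obtained from the links above; as a sketch of the discrete-variable engine it implements, the following is a minimal Gillespie direct-method simulation of a one-species birth-death (mRNA) network. The rate constants are hypothetical placeholders:

```python
import math
import random

def gillespie_birth_death(k_tx=2.0, k_deg=0.1, x0=0, t_end=100.0, seed=1):
    """Direct-method SSA for mRNA birth (rate k_tx) and decay (rate k_deg * x)."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    times, counts = [t], [x]
    while t < t_end:
        a_birth, a_death = k_tx, k_deg * x
        a_total = a_birth + a_death
        if a_total == 0.0:
            break
        t += -math.log(1.0 - rng.random()) / a_total   # exponential waiting time
        x += 1 if rng.random() * a_total < a_birth else -1
        times.append(t)
        counts.append(x)
    return times, counts

times, counts = gillespie_birth_death()
print(f"final time {times[-1]:.1f}, mRNA copies {counts[-1]}")  # mean ~ k_tx/k_deg
```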
Kohn, Michael R.; Tsang, Tracey W.; Clarke, Simon D.
2012-01-01
Several non-stimulant medications have been used in the treatment of attention deficit hyperactivity disorder (ADHD). Atomoxetine was introduced in 2002. The safety and efficacy of atomoxetine in the treatment of ADHD in children, adolescents, and adults have been evaluated in over 4000 patients in randomized, double-blind controlled studies as well as in recent large longitudinal studies. This paper provides an updated summary of the literature on atomoxetine, particularly in relation to findings on the short- and long-term safety of atomoxetine in children and adolescents arising from recent large longitudinal cohort studies. Information is presented about the efficacy, safety, and tolerability of this medication. PMID:23641171
The Quality Improvement Demonstration Study: An example of evidence-based policy-making in practice
Shimkhada, Riti; Peabody, John W; Quimbo, Stella A; Solon, Orville
2008-01-01
Background: Randomized trials have long been the gold standard for evaluating clinical practice. There is growing recognition that rigorous studies are similarly needed to assess the effects of policy; however, such studies are rarely conducted. We report on the Quality Improvement Demonstration Study (QIDS), an example of a large randomized policy experiment introduced and conducted in a scientific manner to evaluate the impact of large-scale governmental policy interventions. Methods: In 1999 the Philippine government proposed sweeping reforms in the National Health Sector Reform Agenda. We recognized the unique opportunity to conduct a social experiment, and our ongoing goal has been to generate results that inform health policy. Early on we concentrated on developing a multi-institutional collaborative effort. The QIDS team then developed hypotheses that specifically evaluated the impact of two policy reforms on both the delivery of care and long-term health status in children. We formed the experimental design by randomizing matched blocks of three communities into one of the two policy interventions plus a control group. Based on the reform agenda, one arm of the experiment provided expanded insurance coverage for children; the other introduced performance-based payments to hospitals and physicians. Data were collected in household, hospital-based patient exit, and facility surveys, as well as clinical vignettes, which were used to assess physician practice. Delivery of services and health status were evaluated at baseline and after the interventions were put in place using difference-in-differences estimation. Results: We found and addressed numerous challenges in conducting this study, namely: formalizing the experimental design using the existing health infrastructure; securing funding to do research coincident with the policy reforms; recognizing biases and designing the study to account for these; putting in place a broad data collection effort to account for unanticipated findings; introducing sustainable policy interventions based on the reform agenda; and providing results in real time to policy makers through a combination of venues. Conclusion: QIDS demonstrates that a large, prospective, randomized controlled policy experiment can be successfully implemented at a national level as part of sectoral reform. While we believe policy experiments should be used to generate evidence-based health policy, doing so requires opportunity and trust, strong collaborative relationships, and timing. This study supports the growing view that translation of scientific findings from the bedside to the community can be done successfully and that we should raise the bar on project evaluation and the policy-making process. PMID:18364050
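For readers unfamiliar with the estimator mentioned in the Methods, the standard difference-in-differences contrast (our notation, not the authors') is:

```latex
\hat{\delta}_{\mathrm{DiD}}
  = \left(\bar{Y}^{\mathrm{T}}_{\mathrm{post}} - \bar{Y}^{\mathrm{T}}_{\mathrm{pre}}\right)
  - \left(\bar{Y}^{\mathrm{C}}_{\mathrm{post}} - \bar{Y}^{\mathrm{C}}_{\mathrm{pre}}\right)
```

where T denotes communities under an intervention arm (expanded insurance or performance-based payments), C the control communities, and pre/post the baseline and post-intervention surveys.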
Caponnetto, Pasquale; Polosa, Riccardo; Auditore, Roberta; Minutolo, Giuseppe; Signorelli, Maria; Maglia, Marilena; Alamo, Angela; Palermo, Filippo; Aguglia, Eugenio
2014-03-22
It is well established in studies across several countries that tobacco smoking is more prevalent among schizophrenic patients than in the general population. Electronic cigarettes are becoming increasingly popular with smokers worldwide, yet to date there are no large randomized trials of electronic cigarettes in schizophrenic smokers, and a well-designed trial is needed to compare the efficacy and safety of these products in this special population. We have designed a randomized controlled trial investigating the efficacy and safety of electronic cigarettes: a prospective 12-month randomized clinical study evaluating smoking reduction, smoking abstinence and adverse events in schizophrenic smokers not intending to quit. We will also monitor quality of life and neurocognitive functioning and measure participants' perception of and satisfaction with the product. A ≥50% reduction in the number of cigarettes/day from baseline will be assessed at each study visit ("reducers"), as will abstinence from smoking ("quitters"). Smokers who leave the study protocol before its completion and complete the Early Termination Visit, or who do not satisfy the criteria for "reducers" or "quitters", will be defined as "non-responders". Differences in continuous variables between the three groups will be evaluated with the Kruskal-Wallis test, followed by Dunn's multiple comparison test. Differences between the three groups for normally distributed data will be evaluated with one-way ANOVA, followed by the Newman-Keuls multiple comparison test. Normality of the distributions will be evaluated with the Kolmogorov-Smirnov test. Correlations between the variables under evaluation will be assessed by Spearman's r. Qualitative data will be compared using the chi-square test. The main strengths of the SCARIS study are the following: it is the first large RCT in schizophrenic patients, it involves both inpatients and outpatients, it evaluates effects in a three-arm design, and it has long-term follow-up (52 weeks). The goal is to propose an effective intervention to reduce the risk of tobacco smoking, as a complementary tool to treat tobacco addiction in schizophrenia. ClinicalTrials.gov, NCT01979796.
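A sketch of the pre-specified analysis plan in code may be useful; the following shows the named tests as they exist in SciPy, run on placeholder data (the post hoc Dunn and Newman-Keuls procedures are not part of SciPy and are omitted):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Placeholder outcome vectors for the three study arms; real data come from the trial.
arm_a, arm_b, arm_c = (rng.normal(size=30) for _ in range(3))

# Normality check: Kolmogorov-Smirnov test against a standard normal distribution.
print(stats.kstest(arm_a, "norm"))

# Non-normally distributed data: Kruskal-Wallis test across the three groups.
print(stats.kruskal(arm_a, arm_b, arm_c))

# Normally distributed data: one-way ANOVA across the three groups.
print(stats.f_oneway(arm_a, arm_b, arm_c))

# Correlations between continuous variables: Spearman's r.
print(stats.spearmanr(arm_a, arm_b))

# Qualitative data: chi-square test on a counts table (e.g. quitters per arm).
table = np.array([[20, 10], [15, 15], [12, 18]])
chi2, p, dof, expected = stats.chi2_contingency(table)
print(chi2, p, dof)
```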
High Angular Resolution Microwave Sensing with Large, Sparse, Random Arrays
1983-11-01
Final Scientific Report, Air Force Office of Scientific Research, grant AFOSR 82-0012. Valley Forge Research Center, The Moore School of Electrical Engineering, University of Pennsylvania, Philadelphia, November 1983.
Energetic Consistency and Coupling of the Mean and Covariance Dynamics
NASA Technical Reports Server (NTRS)
Cohn, Stephen E.
2008-01-01
The dynamical state of the ocean and atmosphere is taken to be a large-dimensional random vector in a range of large-scale computational applications, including data assimilation, ensemble prediction, sensitivity analysis, and predictability studies. In each of these applications, numerical evolution of the covariance matrix of the random state plays a central role, because this matrix is used to quantify uncertainty in the state of the dynamical system. Since atmospheric and ocean dynamics are nonlinear, there is no closed evolution equation for the covariance matrix, nor for the mean state; therefore approximate evolution equations must be used. This article studies theoretical properties of the evolution equations for the mean state and covariance matrix that arise in the second-moment closure approximation (third- and higher-order moment discard). This approximation was introduced by Epstein [1969] in an early effort to introduce a stochastic element into deterministic weather forecasting, and was studied further by Fleming [1971a,b], Epstein and Pitcher [1972], and Pitcher [1977], also in the context of atmospheric predictability. It has since fallen into disuse, and a simpler approximation is used in current large-scale applications. The theoretical results of this article make a case that the second-moment closure approximation should be reconsidered for large-scale applications, because its equations possess a property of energetic consistency that the approximate equations now in common use do not. A number of properties of solutions of the second-moment closure equations that result from this energetic consistency are established.
Estimation of Rice Crop Yields Using Random Forests in Taiwan
NASA Astrophysics Data System (ADS)
Chen, C. F.; Lin, H. S.; Nguyen, S. T.; Chen, C. R.
2017-12-01
Rice is globally one of the most important food crops, directly feeding more people than any other crop. Rice is not only the most important commodity, but also plays a critical role in the economy of Taiwan because it provides employment and income for large rural populations. Rice harvested area and production are thus monitored yearly under government initiatives. Agronomic planners need such information for more precise assessment of food production to tackle issues of national food security and policymaking. This study aimed to develop a machine-learning approach using physical parameters to estimate rice crop yields in Taiwan. We processed the data for the 2014 cropping seasons, following three main steps: (1) data pre-processing to construct input layers, including soil types and weather parameters (e.g., maxima and minima of air temperature, precipitation, and solar radiation) obtained from meteorological stations across the country; (2) crop yield estimation using random forests, chosen because the method can process thousands of variables, estimate missing data, maintain accuracy when a large proportion of the data is missing, resist most over-fitting problems, and run efficiently on large datasets; and (3) error verification. To execute the model, we separated the datasets into two groups of pixels: group 1 (70% of pixels) for training the model and group 2 (30% of pixels) for testing the model. Once the model is trained to produce a small and stable out-of-bag error (i.e., the mean squared error between predicted and actual values), it can be used for estimating rice yields of cropping seasons. Comparison of the random forests-based regression results with the actual yield statistics indicated root mean square error (RMSE) and mean absolute error (MAE) values of 6.2% and 2.7%, respectively, for the first rice crop, and 5.3% and 2.9% for the second. Although there are several uncertainties attributed to the data quality of input layers, our study demonstrates the promising application of random forests for estimating rice crop yields at the national level in Taiwan. This approach could be transferable to other regions of the world for improving large-scale estimation of rice crop yields.
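A minimal sketch of the described workflow, using scikit-learn's RandomForestRegressor on synthetic placeholder covariates (the feature names, sizes, and yield values are assumptions for illustration, not the study's data):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Placeholder pixel table: soil class plus weather covariates -> rice yield.
X = rng.random((1000, 5))                 # [soil, tmax, tmin, precip, solar]
y = 4.0 + 2.0 * X[:, 3] + rng.normal(scale=0.2, size=1000)  # synthetic yields

# 70% of pixels train the model, 30% test it, mirroring the study design.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.7, random_state=0)

rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
rf.fit(X_tr, y_tr)
print("OOB R^2:", rf.oob_score_)          # watch this stabilize while training

pred = rf.predict(X_te)
print("RMSE:", np.sqrt(mean_squared_error(y_te, pred)))
print("MAE:", mean_absolute_error(y_te, pred))
```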
NASA Astrophysics Data System (ADS)
Deng, Chengbin; Wu, Changshan
2013-12-01
Urban impervious surface information is essential for urban and environmental applications at the regional/national scales. As a popular image processing technique, spectral mixture analysis (SMA) has rarely been applied to coarse-resolution imagery due to the difficulty of deriving endmember spectra using traditional endmember selection methods, particularly within heterogeneous urban environments. To address this problem, we derived endmember signatures through a least squares solution (LSS) technique with known abundances of sample pixels, and integrated these endmember signatures into SMA for mapping large-scale impervious surface fraction. In addition, with the same sample set, we carried out objective comparative analyses among SMA (i.e. fully constrained and unconstrained SMA) and machine learning (i.e. Cubist regression tree and Random Forests) techniques. Analysis of results suggests three major conclusions. First, with the extrapolated endmember spectra from stratified random training samples, the SMA approaches performed relatively well, as indicated by small MAE values. Second, Random Forests yields more reliable results than Cubist regression tree, and its accuracy is improved with increased sample sizes. Finally, comparative analyses suggest a tentative guide for selecting an optimal approach for large-scale fractional imperviousness estimation: unconstrained SMA might be a favorable option with a small number of samples, while Random Forests might be preferred if a large number of samples are available.
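The LSS endmember-derivation step lends itself to a compact sketch: with known abundances A and observed reflectances R for the sample pixels, solving A E ≈ R in the least-squares sense recovers the endmember spectra E, which can then be used to unmix new pixels. All sizes and spectra below are synthetic placeholders:

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_end, n_bands = 200, 3, 6    # e.g. impervious / vegetation / soil

# Known fractional abundances of the sample pixels (rows sum to one).
A = rng.dirichlet(np.ones(n_end), size=n_samples)
E_true = rng.random((n_end, n_bands))              # hypothetical endmember spectra
R = A @ E_true + rng.normal(scale=0.01, size=(n_samples, n_bands))

# LSS step: estimate endmember spectra from the samples with known abundances.
E_hat, *_ = np.linalg.lstsq(A, R, rcond=None)

# Unconstrained SMA step: unmix a pixel against the derived endmembers.
pixel = R[0]
f_hat, *_ = np.linalg.lstsq(E_hat.T, pixel, rcond=None)
print("estimated fractions:", f_hat.round(3), "true:", A[0].round(3))
```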
Hoeffding Type Inequalities and their Applications in Statistics and Operations Research
NASA Astrophysics Data System (ADS)
Daras, Tryfon
2007-09-01
Large deviation theory is the branch of probability theory that deals with rare events. Sometimes these events can be described by a sum of random variables that deviates from its mean by more than a "normal" amount. A precise calculation of the probabilities of such events turns out to be crucial in a variety of different contexts (e.g., probability theory, statistics, operations research, statistical physics, financial mathematics, etc.). Recent applications of the theory deal with random walks in random environments, interacting diffusions, heat conduction, and polymer chains [1]. In this paper we prove an inequality of exponential type, theorem 2.1, which gives a large deviation upper bound for a specific sequence of random variables. Inequalities of this type have many applications in combinatorics [2]. The inequality generalizes already proven results of this type in the case of symmetric probability measures. As consequences of the inequality we obtain: (a) large deviation upper bounds for exchangeable Bernoulli sequences of random variables, generalizing results proven for independent and identically distributed Bernoulli sequences; and (b) a general form of Bernstein's inequality. We compare the inequality with large deviation results already proven by the author and examine its advantages. Finally, using the inequality, we solve one of the basic problems of operations research (the bin packing problem) in the case of exchangeable random variables.
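For orientation, the classical Hoeffding inequality that bounds of this type generalize (stated here in its standard independent-variable form; the exchangeable-sequence version is the paper's contribution) reads:

```latex
\Pr\left( S_n - \mathbb{E}[S_n] \ge t \right)
  \le \exp\!\left( \frac{-2 t^2}{\sum_{i=1}^{n} (b_i - a_i)^2} \right),
\qquad S_n = \sum_{i=1}^{n} X_i,\quad a_i \le X_i \le b_i,\quad t > 0,
```

for independent random variables X_1, ..., X_n.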
Meta-analysis in evidence-based healthcare: a paradigm shift away from random effects is overdue.
Doi, Suhail A R; Furuya-Kanamori, Luis; Thalib, Lukman; Barendregt, Jan J
2017-12-01
Each year up to 20 000 systematic reviews and meta-analyses are published whose results influence healthcare decisions, thus making the robustness and reliability of meta-analytic methods one of the world's top clinical and public health priorities. The evidence synthesis makes use of either fixed-effect or random-effects statistical methods. The fixed-effect method has largely been replaced by the random-effects method as heterogeneity of study effects led to poor error estimation. However, despite the widespread use and acceptance of the random-effects method to correct this, it too remains unsatisfactory and continues to suffer from defective error estimation, posing a serious threat to decision-making in evidence-based clinical and public health practice. We discuss here the problem with the random-effects approach and demonstrate that there exist better estimators under the fixed-effect model framework that can achieve optimal error estimation. We argue for an urgent return to the earlier framework with updates that address these problems and conclude that doing so can markedly improve the reliability of meta-analytical findings and thus decision-making in healthcare.
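For context, the classic inverse-variance fixed-effect estimator, the baseline from which the improved fixed-effect-framework estimators the authors advocate depart, is easy to state in code; this sketch is not the authors' proposed estimator, and the study effects and variances are placeholders:

```python
import numpy as np

def fixed_effect_meta(theta, var):
    """Classic inverse-variance fixed-effect pooling of study effect estimates."""
    theta, var = np.asarray(theta, float), np.asarray(var, float)
    w = 1.0 / var                              # inverse-variance weights
    pooled = np.sum(w * theta) / np.sum(w)     # weighted mean effect
    se = np.sqrt(1.0 / np.sum(w))              # standard error of the pooled effect
    return pooled, se

# Three hypothetical studies (e.g. log odds ratios) with their variances.
pooled, se = fixed_effect_meta([0.10, 0.25, -0.05], [0.04, 0.09, 0.02])
print(f"pooled effect = {pooled:.3f}, 95% CI halfwidth = {1.96 * se:.3f}")
```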
Anomalous diffusion on a random comblike structure
NASA Astrophysics Data System (ADS)
Havlin, Shlomo; Kiefer, James E.; Weiss, George H.
1987-08-01
We have recently studied a random walk on a comblike structure as an analog of diffusion on a fractal structure. In our earlier work, the comb was assumed to have a deterministic structure, with teeth of infinite length. In the present paper we study diffusion on a one-dimensional random comb, the lengths of whose teeth are random variables with an asymptotic stable-law distribution φ(L) ~ L^{-(1+γ)}, where 0 < γ ≤ 1. Two mean-field methods are used for the analysis, one based on the continuous-time random walk and the second a self-consistent scaling theory. Both lead to the same conclusions. We find that the diffusion exponent characterizing the mean-square displacement along the backbone of the comb is d_w = 4/(1+γ) for γ < 1 and d_w = 2 for γ ≥ 1. The probability of being at the origin at time t is P_0(t) ~ t^{-d_s/2} for large t, with d_s = (3-γ)/2 for γ < 1 and d_s = 1 for γ > 1. When a field is applied along the backbone of the comb the diffusion exponent is d_w = 2/(1+γ) for γ < 1 and d_w = 1 for γ ≥ 1. The theoretical results are confirmed using the exact enumeration method.
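A direct simulation can reproduce the flavor of these results; the following is a minimal Monte Carlo walk on a random comb with Pareto-tailed tooth lengths (our simplified dynamics, not the paper's exact-enumeration method; run sizes are kept small for illustration):

```python
import numpy as np

def comb_msd(gamma=0.5, n_sites=2001, n_steps=5000, n_walkers=100, seed=0):
    """Mean-square displacement along the backbone of a random comb."""
    rng = np.random.default_rng(seed)
    # Tooth lengths with tail P(L > l) ~ l^{-gamma}, i.e. density ~ l^{-(1+gamma)};
    # the +1 and floor make every tooth at least one site long.
    teeth = np.floor(rng.pareto(gamma, size=n_sites) + 1).astype(int)
    origin = n_sites // 2
    msd = 0.0
    for _ in range(n_walkers):
        x, y = origin, 0
        for _ in range(n_steps):
            if y == 0:                          # on the backbone
                move = rng.integers(3)          # left, right, or up the tooth
                if move == 0 and x > 0:
                    x -= 1
                elif move == 1 and x < n_sites - 1:
                    x += 1
                elif move == 2:
                    y += 1
            else:                               # inside a tooth, reflect at the tip
                y += 1 if (rng.integers(2) and y < teeth[x]) else -1
        msd += (x - origin) ** 2
    return msd / n_walkers

print(comb_msd())   # grows as t^{2/d_w} with d_w = 4/(1+gamma) for gamma < 1
```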
McClure, Erin A; Sonne, Susan C; Winhusen, Theresa; Carroll, Kathleen M; Ghitza, Udi E; McRae-Clark, Aimee L; Matthews, Abigail G; Sharma, Gaurav; Van Veldhuisen, Paul; Vandrey, Ryan G; Levin, Frances R; Weiss, Roger D; Lindblad, Robert; Allen, Colleen; Mooney, Larissa J; Haynes, Louise; Brigham, Gregory S; Sparenborg, Steve; Hasson, Albert L; Gray, Kevin M
2014-11-01
Despite recent advances in behavioral interventions for cannabis use disorders, effect sizes remain modest, and few individuals achieve long-term abstinence. One strategy to enhance outcomes is the addition of pharmacotherapy to complement behavioral treatment, but to date no efficacious medications targeting cannabis use disorders in adults through large, randomized controlled trials have been identified. The National Institute on Drug Abuse Clinical Trials Network (NIDA CTN) is currently conducting a study to test the efficacy of N-acetylcysteine (NAC) versus placebo (PBO), added to contingency management, for cannabis cessation in adults (ages 18-50). This study was designed to replicate positive findings from a study in cannabis-dependent adolescents that found greater odds of abstinence with NAC compared to PBO. This paper describes the design and implementation of an ongoing 12-week, intent-to-treat, double-blind, randomized, placebo-controlled study with one follow-up visit four weeks post-treatment. Approximately 300 treatment-seeking cannabis-dependent adults will be randomized to NAC or PBO across six study sites in the United States. The primary objective of this 12-week study is to evaluate the efficacy of twice-daily orally-administered NAC (1200 mg) versus matched PBO, added to contingency management, on cannabis abstinence. NAC is among the first medications to demonstrate increased odds of abstinence in a randomized controlled study among cannabis users in any age group. The current study will assess the cannabis cessation efficacy of NAC combined with a behavioral intervention in adults, providing a novel and timely contribution to the evidence base for the treatment of cannabis use disorders. PMID:25179587
Prevailing practices in the use of antibiotics by dairy farmers in Eastern Haryana region of India
Kumar, Vikash; Gupta, Jancy
2018-01-01
Aim: The aim of the study was to assess antibiotic use in dairy animals and to trace usage patterns among small, medium, and large dairy farmers in the Eastern Haryana region of India. Materials and Methods: Karnal and Kurukshetra districts in the Eastern region of Haryana state were purposively selected, and four villages from each district were selected randomly. From each village, 21 farmers were selected using stratified random sampling, categorized into small, medium, and large farmers, constituting a total of 168 respondents. An antibiotic usage index (AUI) was developed to assess the usage of antibiotics by dairy farmers. Results: The frequency of veterinary consultancy was high among large dairy farmers, and they mostly preferred veterinarians over para-veterinarians for the treatment of dairy animals. Small farmers demanded low-cost antibiotics from veterinarians, whereas large farmers rarely did so. Antibiotics were used mostly for therapeutic purposes by all categories of farmers. Completion of treatment schedules and follow-up were strictly practiced by the majority of large farmers. The AUI revealed that large farmers were more consistent in decision-making about the prudent use of antibiotics. Routine use of antibiotics after parturition to prevent disease, and the sale of milk without adhering to the withdrawal period, were responsible for aggravating antibiotic resistance. The extent of antibiotic use by small farmers depended on the severity of disease. Large farmers opted for prophylactic use of antibiotics at the herd level. Conclusion: Antibiotic usage practices were judicious among large dairy farmers, moderately prudent among medium dairy farmers, and faulty among small farmers. The frequency of veterinary consultancy promoted a better veterinarian-client relationship among large farmers. PMID:29657416
Large leptonic Dirac CP phase from broken democracy with random perturbations
NASA Astrophysics Data System (ADS)
Ge, Shao-Feng; Kusenko, Alexander; Yanagida, Tsutomu T.
2018-06-01
A large value of the leptonic Dirac CP phase can arise from broken democracy, where the mass matrices are democratic up to small random perturbations. Such perturbations are a natural consequence of broken residual S3 symmetries that dictate the democratic mass matrices at leading order. With random perturbations, the leptonic Dirac CP phase has a higher probability of attaining a value around ±π/2. Compared with the anarchy model, broken democracy benefits from residual S3 symmetries and can produce much better, realistic predictions for the mass hierarchy, mixing angles, and Dirac CP phase in both the quark and lepton sectors. Our approach provides a general framework for a class of models in which a residual symmetry determines the general features at leading order and, in the absence of other fundamental principles, the symmetry breaking appears in the form of random perturbations.
Festen, Dederieke A M; de Lind van Wijngaarden, Roderick; van Eekelen, Marielle; Otten, Barto J; Wit, Jan M; Duivenvoorden, Hugo J; Hokken-Koelega, Anita C S
2008-09-01
Prader-Willi syndrome (PWS) children have impaired growth and abnormal body composition. Previous 1-year controlled studies showed improvement of height and body composition during GH treatment. Our aim was to evaluate growth, body composition and body proportions during GH treatment in a large group of PWS children. We performed a randomized controlled GH trial in 91 prepubertal PWS children (42 infants; 49 children aged 3-14 years). After stratification for age, infants were randomized to GH treatment (GH group; 1 mg/m²/day; n = 20) or no treatment (control group; n = 22) for 1 year. In the second year all infants were treated with GH. After stratification for BMI, children > 3 years of age were randomized to GH treatment (GH group; 1 mg/m²/day; n = 27) or no treatment (control group; n = 22) for 2 years. Anthropometric parameters were assessed once every 3 months. Body composition was measured by dual-energy X-ray absorptiometry. Median (interquartile range, iqr) height SDS increased during 2 years of GH in infants from -2.3 (-2.8 to -0.7) to -0.4 (-1.1 to 0.0) and in prepubertal children from -2.0 (-3.1 to -1.7) to -0.6 (-1.1 to -0.1). In non-GH-treated children height SDS did not increase. Head circumference completely normalized during 1 and 2 years of GH in infants and children, respectively. Body fat percentage and body proportions improved in GH-treated children, but did not completely normalize. Lean body mass SDS improved compared with the control group. Serum IGF-I increased to levels above the normal range in most GH-treated children. Our randomized study shows that GH treatment in PWS children significantly improves height, BMI, head circumference, body composition and body proportions. PWS children are highly sensitive to GH, suggesting that monitoring of serum IGF-I is indicated.
Mobile access to virtual randomization for investigator-initiated trials.
Deserno, Thomas M; Keszei, András P
2017-08-01
Background/aims: Randomization is indispensable in clinical trials in order to provide unbiased treatment allocation and valid statistical inference. Improper handling of allocation lists can be avoided using central systems, for example, human-based services. However, central systems are unaffordable for investigator-initiated trials and might be inaccessible from some places where study subjects need allocations. We propose mobile access to virtual randomization, where the randomization lists are non-existent and the appropriate allocation is computed on demand. Methods: The core of the system architecture is an electronic data capture system or a clinical trial management system, which is extended by an R interface connecting to the R server using the Java R Interface. Mobile devices communicate via representational state transfer (REST) web services. Furthermore, a simple web-based setup allows non-statisticians to configure the appropriate statistics. Our comprehensive R script supports simple randomization, restricted randomization using a random allocation rule, block randomization, and stratified randomization for un-blinded, single-blinded, and double-blinded trials. For each trial, the electronic data capture system or the clinical trial management system stores the randomization parameters and the subject assignments. Results: Apps are provided for iOS and Android, and subjects are randomized using smartphones. After logging onto the system, the user selects the trial and the subject, and the allocation number and treatment arm are displayed instantaneously and stored in the core system. So far, 156 subjects have been allocated from mobile devices serving five investigator-initiated trials. Conclusion: Transforming pre-printed allocation lists into virtual ones ensures the correct conduct of trials and guarantees strictly sequential processing in all trial sites. Covering 88% of the randomization models used in recent trials, virtual randomization becomes available for investigator-initiated trials and potentially for large multi-center trials.
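The idea of a virtual, computed-on-demand allocation list can be illustrated compactly. The authors' implementation is an R script behind REST services; the sketch below is a hypothetical Python analogue in which each permuted block is regenerated deterministically from a per-trial secret, so no list ever needs to be stored:

```python
import hashlib
import random

def virtual_block_allocation(trial_secret: str, subject_index: int,
                             arms=("A", "B"), block_size=4):
    """Compute a block-randomized allocation on demand; no list is ever stored.

    The permuted block containing the subject is regenerated deterministically
    from a per-trial secret, so every site computes the identical assignment.
    block_size should be a multiple of len(arms) to keep each block balanced.
    """
    block_no, pos = divmod(subject_index, block_size)
    seed = hashlib.sha256(f"{trial_secret}:{block_no}".encode()).hexdigest()
    block = [arms[i % len(arms)] for i in range(block_size)]  # balanced block
    random.Random(seed).shuffle(block)                        # permute the block
    return block[pos]

# Subjects 0..7 of a hypothetical trial: each block of 4 is balanced 2:2.
print([virtual_block_allocation("trial-42-secret", i) for i in range(8)])
```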
Immune tolerance: critical issues of factor dose, purity and treatment complications.
DiMichele, D M
2006-12-01
The current practice of immune tolerance induction (ITI) therapy has been largely influenced by the results of small institutional studies and three large registries. However, many questions remain. Successful outcome predictors for ITI in haemophilia A have been suggested by the analyses of two of these registries. Among these predictors, factor VIII (FVIII) dose/dosing regimen remains a controversial outcome parameter, demonstrating a strong direct relationship to ITI success in the international registry and a weaker inverse relationship in the North American registry. There is an international multicentre prospective randomized trial underway to further study the role of FVIII dose in successful ITI induction in a good risk haemophilia A inhibitor patient cohort. FVIII purity also remains an unproved ITI outcome predictor. Institutional experience with von-Willebrand-factor-containing products has suggested its therapeutic advantage in both inhibitor development and eradication. The International ITI Study, although not designed to answer this particular question, may be able to determine an impact on outcome depending on the final distribution of investigator choice of product among the study subjects. Much less is known about the influence of factor IX (FIX) dose and purity on ITI success in haemophilia B. Importantly, nephrotic syndrome has been a major determinant of ITI failure in FIX inhibitor patients, particularly those with the allergic phenotype. Unfortunately, large prospective randomized trials in this group will not be feasible. Rather, we will have to rely on prospectively collected registry data to build our knowledge base of inhibitors and ITI in haemophilia B.
Testing a stepped care model for binge-eating disorder: a two-step randomized controlled trial.
Tasca, Giorgio A; Koszycki, Diana; Brugnera, Agostino; Chyurlia, Livia; Hammond, Nicole; Francis, Kylie; Ritchie, Kerri; Ivanova, Iryna; Proulx, Genevieve; Wilson, Brian; Beaulac, Julie; Bissada, Hany; Beasley, Erin; Mcquaid, Nancy; Grenon, Renee; Fortin-Langelier, Benjamin; Compare, Angelo; Balfour, Louise
2018-05-24
A stepped care approach involves patients first receiving low-intensity treatment followed by higher-intensity treatment. This two-step randomized controlled trial investigated the efficacy of a sequential stepped care approach for the psychological treatment of binge-eating disorder (BED). In the first step, all participants with BED (n = 135) received unguided self-help (USH) based on a cognitive-behavioral therapy model. In the second step, participants who remained in the trial were randomized either to 16 weeks of group psychodynamic-interpersonal psychotherapy (GPIP) (n = 39) or to a no-treatment control condition (n = 46). Outcomes were assessed for USH in step 1, and then for step 2 up to 6 months post-treatment, using multilevel regression slope discontinuity models. In the first step, USH resulted in large and statistically significant reductions in the frequency of binge eating. Statistically significant moderate to large reductions in eating disorder cognitions were also noted. In the second step, there was no difference in the change in frequency of binge eating between GPIP and the control condition. Compared with controls, GPIP resulted in significant and large improvements in attachment avoidance and interpersonal problems. The findings indicated that the second step of a stepped care approach did not significantly reduce binge-eating symptoms beyond the effects of USH alone. The study provided some evidence that the second step may reduce factors known to maintain binge eating in the long run, such as attachment avoidance and interpersonal problems.
Effects of unstratified and centre-stratified randomization in multi-centre clinical trials.
Anisimov, Vladimir V
2011-01-01
This paper deals with the analysis of randomization effects in multi-centre clinical trials. The two randomization schemes most often used in clinical trials are considered: unstratified and centre-stratified block-permuted randomization. The prediction of the number of patients randomized to different treatment arms in different regions during the recruitment period accounting for the stochastic nature of the recruitment and effects of multiple centres is investigated. A new analytic approach using a Poisson-gamma patient recruitment model (patients arrive at different centres according to Poisson processes with rates sampled from a gamma distributed population) and its further extensions is proposed. Closed-form expressions for corresponding distributions of the predicted number of the patients randomized in different regions are derived. In the case of two treatments, the properties of the total imbalance in the number of patients on treatment arms caused by using centre-stratified randomization are investigated and for a large number of centres a normal approximation of imbalance is proved. The impact of imbalance on the power of the study is considered. It is shown that the loss of statistical power is practically negligible and can be compensated by a minor increase in sample size. The influence of patient dropout is also investigated. The impact of randomization on predicted drug supply overage is discussed. Copyright © 2010 John Wiley & Sons, Ltd.
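A small simulation conveys the mechanism analysed in the paper: under centre-stratified block-permuted randomization, complete blocks are balanced, so total imbalance is driven by each centre's final partial block. All parameters below are illustrative assumptions, not the paper's closed-form results:

```python
import numpy as np

def simulate_imbalance(n_centres=100, shape=2.0, scale=0.5, t=12.0,
                       block=4, n_sims=2000, seed=0):
    """Total treatment imbalance under centre-stratified block randomization.

    Centre recruitment rates are Gamma(shape, scale); arrivals over time t are
    Poisson. Complete blocks are balanced within a centre, so imbalance comes
    only from each centre's final partial block (1:1 two-arm trial assumed).
    """
    rng = np.random.default_rng(seed)
    imbalances = np.empty(n_sims)
    for s in range(n_sims):
        rates = rng.gamma(shape, scale, size=n_centres)
        n = rng.poisson(rates * t)              # patients recruited per centre
        diff = 0
        for r in n % block:                     # size of each partial block
            if r:
                arm = rng.permutation([0, 1] * (block // 2))[:r]
                diff += 2 * arm.sum() - r       # (#arm1 - #arm0) in the block
        imbalances[s] = diff
    return imbalances

imb = simulate_imbalance()
print("mean:", imb.mean(), "sd:", imb.std())    # close to normal for many centres
```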
Random sphere packing model of heterogeneous propellants
NASA Astrophysics Data System (ADS)
Kochevets, Sergei Victorovich
It is well recognized that combustion of heterogeneous propellants is strongly dependent on the propellant morphology. Recent developments in computing systems make it possible to start three-dimensional modeling of heterogeneous propellant combustion. A key component of such large-scale computations is a realistic model of industrial propellants which retains the true morphology, a goal never achieved before. The research presented develops the Random Sphere Packing Model of heterogeneous propellants and generates numerical samples of actual industrial propellants. This is done by developing a sphere packing algorithm which randomly packs a large number of spheres with a polydisperse size distribution within a rectangular domain. First, the packing code is developed, optimized for performance, and parallelized using the OpenMP shared-memory architecture. Second, the morphology and packing fraction of two simple cases of unimodal and bimodal packs are investigated computationally and analytically. It is shown that both the Loose Random Packing and Dense Random Packing limits are not well defined, and the growth rate of the spheres is identified as the key parameter controlling the efficiency of the packing. For a properly chosen growth rate, computational results are found to be in excellent agreement with experimental data. Third, two strategies are developed to define numerical samples of polydisperse heterogeneous propellants: the Deterministic Strategy and the Random Selection Strategy. Using these strategies, numerical samples of industrial propellants are generated. The packing fraction is investigated and it is shown that the experimental values of the packing fraction can be achieved computationally. It is strongly believed that this Random Sphere Packing Model of propellants is a major step forward in the realistic computational modeling of heterogeneous propellant combustion. In addition, a method of analysis of the morphology of heterogeneous propellants is developed which uses the concept of multi-point correlation functions. A set of intrinsic length scales of local density fluctuations in random heterogeneous propellants is identified by performing a Monte-Carlo study of the correlation functions. This method of analysis shows great promise for understanding the origins of the combustion instability of heterogeneous propellants, and should become a valuable tool for the development of safe and reliable rocket engines.
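For contrast with the growth-based algorithm developed in the thesis, the simplest sphere-packing scheme, random sequential addition, fits in a few lines; it saturates at packing fractions far below what industrial propellants require, which is precisely why a more sophisticated algorithm is needed:

```python
import numpy as np

def pack_spheres(n_target=200, radius=0.05, box=1.0, max_tries=200_000, seed=0):
    """Random sequential addition of non-overlapping equal spheres in a box."""
    rng = np.random.default_rng(seed)
    centers = []
    tries = 0
    while len(centers) < n_target and tries < max_tries:
        tries += 1
        c = rng.uniform(radius, box - radius, size=3)   # keep sphere inside box
        if all(np.linalg.norm(c - q) >= 2 * radius for q in centers):
            centers.append(c)
    phi = len(centers) * (4.0 / 3.0) * np.pi * radius**3 / box**3
    return np.array(centers), phi

centers, phi = pack_spheres()
# RSA saturates near phi ~ 0.38 for equal spheres, well below propellant loadings.
print(f"packed {len(centers)} spheres, packing fraction = {phi:.3f}")
```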
SMERFS: Stochastic Markov Evaluation of Random Fields on the Sphere
NASA Astrophysics Data System (ADS)
Creasey, Peter; Lang, Annika
2018-04-01
SMERFS (Stochastic Markov Evaluation of Random Fields on the Sphere) creates large realizations of random fields on the sphere. It uses a fast algorithm based on Markov properties and fast Fourier transforms in 1D that generates samples on an n × n grid in O(n² log n) and efficiently derives the necessary conditional covariance matrices.
Population differentiation in Pacific salmon: local adaptation, genetic drift, or the environment?
Adkison, Milo D.
1995-01-01
Morphological, behavioral, and life-history differences between Pacific salmon (Oncorhynchus spp.) populations are commonly thought to reflect local adaptation, and it is likewise common to assume that salmon populations separated by small distances are locally adapted. Two alternatives to local adaptation exist: random genetic differentiation owing to genetic drift and founder events, and genetic homogeneity among populations, in which differences reflect differential trait expression in differing environments. Population genetics theory and simulations suggest that both alternatives are possible. With selectively neutral alleles, genetic drift can result in random differentiation despite many strays per generation. Even weak selection can prevent genetic drift in stable populations; however, founder effects can result in random differentiation despite selective pressures. Overlapping generations reduce the potential for random differentiation. Genetic homogeneity can occur despite differences in selective regimes when straying rates are high. In sum, localized differences in selection should not always result in local adaptation. Local adaptation is favored when population sizes are large and stable, selection is consistent over large areas, selective differentials are large, and straying rates are neither too high nor too low. Consideration of alternatives to local adaptation would improve both biological research and salmon conservation efforts.
NASA Astrophysics Data System (ADS)
Zou, Guang'an; Wang, Qiang; Mu, Mu
2016-09-01
Sensitive areas for prediction of the Kuroshio large meander using a 1.5-layer, shallow-water ocean model were investigated using the conditional nonlinear optimal perturbation (CNOP) and first singular vector (FSV) methods. A series of sensitivity experiments were designed to test the sensitivity of sensitive areas within the numerical model. The following results were obtained: (1) the effect of initial CNOP and FSV patterns in their sensitive areas is greater than that of the same patterns in randomly selected areas, with the effect of the initial CNOP patterns in CNOP sensitive areas being the greatest; (2) both CNOP- and FSV-type initial errors grow more quickly than random errors; (3) the effect of random errors superimposed on the sensitive areas is greater than that of random errors introduced into randomly selected areas, and initial errors in the CNOP sensitive areas have greater effects on final forecasts. These results reveal that the sensitive areas determined using the CNOP are more sensitive than those of FSV and other randomly selected areas. In addition, ideal hindcasting experiments were conducted to examine the validity of the sensitive areas. The results indicate that reduction (or elimination) of CNOP-type errors in CNOP sensitive areas at the initial time has a greater forecast benefit than the reduction (or elimination) of FSV-type errors in FSV sensitive areas. These results suggest that the CNOP method is suitable for determining sensitive areas in the prediction of the Kuroshio large-meander path.
Visser, Kirsten; Greaves-Lord, Kirstin; Tick, Nouchka T; Verhulst, Frank C; Maras, Athanasios; van der Vegt, Esther J M
2015-08-28
Previous research shows that adolescents with autism spectrum disorder (ASD) run several risks in their psychosexual development and can have limited access to reliable information on puberty and sexuality, emphasizing the need for specific guidance of adolescents with ASD in their psychosexual development. Few studies have investigated the effects of psychosexual training programs for adolescents with ASD, and to date no randomized controlled trials are available to study the effects of psychosexual interventions for this target group. The randomized controlled trial (RCT) described in this study protocol aims to investigate the effects of the Tackling Teenage Training (TTT) program on the psychosexual development of adolescents with ASD. This parallel clinical trial, conducted in the south-west of the Netherlands, has a simple equal randomization design with an intervention and a waiting-list control condition. Two hundred adolescents and their parents participate in this study. We assess the participants in both conditions using self-report as well as parent-report questionnaires at three time points during 1 year: at baseline (T1), post-treatment (T2), and at follow-up (T3). To our knowledge, the current study is the first to use a randomized controlled design to study the effects of a psychosexual training program for adolescents with ASD. It has a number of methodological strengths, namely a large sample size, a wide range of functionally relevant outcome measures, the use of multiple informants, and a standardized research and intervention protocol. Some limitations of the study are also identified, for instance the lack of a comparison between two active treatment conditions and the absence of blinded observational measures to investigate the ecological validity of the research results. Dutch Trial Register NTR2860. Registered on 20 April 2011.
Academic Specialisation and Returns to Education: Evidence from India
ERIC Educational Resources Information Center
Saha, Bibhas; Sensarma, Rudra
2011-01-01
We study returns to academic specialisation for Indian corporate sector workers by analysing cross-sectional data on male employees randomly selected from six large firms. Our analysis shows that going to college pays off, as it brings significant incremental returns over and above school education. However, the increase in returns is more…
Efficacy of Web-Based Personalized Normative Feedback: A Two-Year Randomized Controlled Trial
ERIC Educational Resources Information Center
Neighbors, Clayton; Lewis, Melissa A.; Atkins, David C.; Jensen, Megan M.; Walter, Theresa; Fossos, Nicole; Lee, Christine M.; Larimer, Mary E.
2010-01-01
Objective: Web-based brief alcohol interventions have the potential to reach a large number of individuals at low cost; however, few controlled evaluations have been conducted to date. The present study was designed to evaluate the efficacy of gender-specific versus gender-nonspecific personalized normative feedback (PNF) with single versus…
Effects of Infant Massage on Attachment Security: An Experimental Manipulation.
ERIC Educational Resources Information Center
Jump, Vonda K.
The formation of attachments is an important phenomenon occurring in the realm of socioemotional development. This study examined the impact of infant massage on infants' subsequent attachment security. Fifty-seven mother-infant dyads (48 dyads from Head Start, 9 from the community at large) were randomly assigned to a treatment or control group…
How Do Minnesota School Board Members Learn to Do Their Jobs?
ERIC Educational Resources Information Center
Conlon, Thomas Julius
2009-01-01
School boards in Minnesota largely function as volunteer or minimally compensated elected bodies whose members are not professionally trained for their jobs, yet the public demands accountability and results from their local public school districts. This descriptive study examined how a random sample of 322 Minnesota school board members learned to do…
Process and Learning Outcomes from Remotely-Operated, Simulated, and Hands-on Student Laboratories
ERIC Educational Resources Information Center
Corter, James E.; Esche, Sven K.; Chassapis, Constantin; Ma, Jing; Nickerson, Jeffrey V.
2011-01-01
A large-scale, multi-year, randomized study compared learning activities and outcomes for hands-on, remotely-operated, and simulation-based educational laboratories in an undergraduate engineering course. Students (N = 458) worked in small-group lab teams to perform two experiments involving stress on a cantilever beam. Each team conducted the…
Faculty Participation in Academic Decision Making. Report of a Study.
ERIC Educational Resources Information Center
Dykes, Archie R.
Personal interviews with a random sample of 106 faculty members of a large midwestern university dealt with the role of faculty in decision making on academic, financial, and student affairs, personnel matters, capital improvements, and public and alumni relations. While the faculty members interviewed indicated that faculty should have a strong,…
Identification of Infants with Major Cognitive Delay Using Parental Report
ERIC Educational Resources Information Center
Martin, Andrew J.; Darlow, Brian A.; Salt, Alison; Hague, Wendy; Sebastian, Lucille; Mann, Kristy; Tarnow-Mordi, William
2012-01-01
Aim: The collection of data on longer-term neurodevelopmental outcomes within large neonatal randomized controlled trials by trained assessors can greatly increase costs and present many operational difficulties. The aim of this study was to develop a more practical alternative for identifying major cognitive delay in infants at the age of 24…
Recreational Prescription Drug Use among College Students
ERIC Educational Resources Information Center
Kolek, Ethan A.
2009-01-01
The purpose of this study was to explore recreational prescription drug use among undergraduate students. Although anecdotal accounts on this subject abound, empirical research is extremely limited. Data from a survey of a random sample of 734 students at a large public research university in the Northeast were examined. Results indicate that a…
ERIC Educational Resources Information Center
Vaughan, Angela L.; Lalonde, Trent L.; Jenkins-Guarnieri, Michael A.
2014-01-01
Many researchers assessing the efficacy of educational programs face challenges due to issues with non-randomization and the likelihood of dependence between nested subjects. The purpose of the study was to demonstrate a rigorous research methodology using a hierarchical propensity score matching method that can be utilized in contexts where…
Correlates of Sexual Abuse and Smoking among French Adults
ERIC Educational Resources Information Center
King, Gary; Guilbert, Philippe; Ward, D. Gant; Arwidson, Pierre; Noubary, Farzad
2006-01-01
Objective: The goal of this study was to examine the association between sexual abuse (SA) and initiation, cessation, and current cigarette smoking among a large representative adult population in France. Method: A random sample size of 12,256 adults (18-75 years of age) was interviewed by telephone concerning demographic variables, health…
Mindful Music Listening Instruction Increases Listening Sensitivity and Enjoyment
ERIC Educational Resources Information Center
Anderson, William Todd
2016-01-01
The purpose of this study was to examine the effect of mindful listening instruction on music listening sensitivity and music listening enjoyment. A pretest--posttest control group design was used. Participants, fourth-grade students (N = 42) from an elementary school in a large city in the Northeastern United States, were randomly assigned to two…
Educational Research with Real-World Data: Reducing Selection Bias with Propensity Scores
ERIC Educational Resources Information Center
Adelson, Jill L.
2013-01-01
Often it is infeasible or unethical to use random assignment in educational settings to study important constructs and questions. Hence, educational research often uses observational data, such as large-scale secondary data sets and state and school district data, and quasi-experimental designs. One method of reducing selection bias in estimations…
Height as a Measure of Success in Academe.
ERIC Educational Resources Information Center
Hensley, Wayne E.
This paper presents the results of two studies at a large mid-Atlantic university that examined the height/success paradigm within the university setting. Specifically, are the trends observed among taller persons in police and sales work equally valid for university professors? A random sample of faculty (N=90) revealed that…
Health Literacy in College Students
ERIC Educational Resources Information Center
Ickes, Melinda J.; Cottrell, Randall
2010-01-01
Objective: The purpose of this study was to assess the health literacy levels, and the potential importance of health literacy, of college students. Participants: Courses were randomly selected from all upper-level undergraduate courses at a large Research I university to obtain a sample size of N = 399. Methods: During the 2007-2008 school year,…
Results and Implications of a Problem-Solving Treatment Program for Obesity.
ERIC Educational Resources Information Center
Mahoney, B. K.; And Others
Data are from a large-scale experimental study which was designed to evaluate a multimethod problem-solving approach to obesity. Obese adult volunteers (N=90) were randomly assigned to three groups: maximal treatment, minimal treatment, and no-treatment control. In the two treatment groups, subjects were exposed to bibliographic material and…
Mental Health and Clinical Correlates in Lesbian, Gay, Bisexual, and Queer Young Adults
ERIC Educational Resources Information Center
Grant, Jon E.; Odlaug, Brian L.; Derbyshire, Katherine; Schreiber, Liana R. N.; Lust, Katherine; Christenson, Gary
2014-01-01
Objective: This study examined the prevalence of mental health disorders and their clinical correlates in a university sample of lesbian, gay, bisexual, and queer (LGBQ) students. Participants: College students at a large public university. Methods: An anonymous, voluntary survey was distributed via random e-mail generation to university students…
It's a Girl! Random Numbers, Simulations, and the Law of Large Numbers
ERIC Educational Resources Information Center
Goodwin, Chris; Ortiz, Enrique
2015-01-01
Modeling using mathematics and making inferences about mathematical situations are becoming more prevalent in most fields of study. Descriptive statistics cannot be used to generalize about a population or make predictions of what can occur. Instead, inference must be used. Simulation and sampling are essential in building a foundation for…
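The article's core point is easy to demonstrate with a short simulation. The sketch below (Python; the birth probability and sample sizes are illustrative choices, not taken from the article) simulates independent births and shows the proportion of girls settling toward the underlying probability as the number of trials grows, the law-of-large-numbers behavior that justifies using simulation to build inferences.

import random

def proportion_girls(n_births, p_girl=0.5, seed=1):
    # Simulate n_births independent births; return the observed fraction of girls.
    rng = random.Random(seed)
    girls = sum(rng.random() < p_girl for _ in range(n_births))
    return girls / n_births

# The observed proportion stabilizes near p_girl as the sample grows.
for n in (10, 100, 1_000, 10_000, 100_000):
    print(f"{n:>7} births: proportion of girls = {proportion_girls(n):.4f}")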
Assessing Faculty Perspectives about Teaching and Working with Students with Disabilities
ERIC Educational Resources Information Center
Becker, Sandra; Palladino, John
2016-01-01
This study presents a unique assessment of faculty perspectives about teaching and working with students with disabilities against the backdrop of the Individuals with Disabilities Education Act (IDEA) and the Americans with Disabilities Act (ADA). A random sample of 127 faculty from a large Midwest comprehensive university completed the…
ERIC Educational Resources Information Center
Tipton, Elizabeth; Fellers, Lauren; Caverly, Sarah; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Ruiz de Castilla, Veronica
2016-01-01
Recently, statisticians have begun developing methods to improve the generalizability of results from large-scale experiments in education. This work has included the development of methods for improved site selection when random sampling is infeasible, including the use of stratification and targeted recruitment strategies. This article provides…
Doing More with Less: A Preliminary Study of the School District Investment.
ERIC Educational Resources Information Center
MacPhail-Wilcox, Bettye
1983-01-01
Changes in cash management practices from 1978 to 1981 were investigated in a random sample of 145 North Carolina school districts, stratified by attendance size. Analysis using chi-square tests indicated the level of investment sophistication (as measured by the proportion of cash invested) has increased, especially for large districts. (RW)
Dy, Jessica; Rainey, Jenna; Walker, Mark C; Fraser, William; Smith, Graeme N; White, Ruth Rennicks; Waddell, Patti; Janoudi, Ghayath; Corsi, Daniel J; Wei, Shu Qin
2018-06-01
The primary objective was to determine the feasibility of a large RCT assessing the effectiveness of an accelerated oxytocin titration (AOT) protocol compared with a standard gradual oxytocin titration (GOT) in reducing the risk of CS in nulliparous women diagnosed with dystocia in the first stage of labour. The secondary objective was to obtain preliminary data on the safety and efficacy of the foregoing AOT protocol. This was a multicentre, double-masked, parallel-group pilot RCT. This study was conducted in three Canadian birthing centres. A total of 79 term nulliparous women carrying a singleton pregnancy in spontaneous labour, with a diagnosis of labour dystocia, were randomized to receive either GOT (initial dose 2 mU/min with increments of 2 mU/min) or AOT (initial dose 4 mU/min with increments of 4 mU/min), in a 1:1 ratio. An intention-to-treat analysis was applied. A total of 252 women were screened and approached, 137 (54.4%) consented, and 79 (31.3%) were randomized. Overall protocol adherence was 76 of 79 (96.2%). Of the women randomized, 10 (25.6%) allocated to GOT had a CS compared with six (15.0%) allocated to AOT (Fisher exact test P = 0.27). This pilot study demonstrated that a large, multicentre RCT is not only feasible, but also necessary to assess the effectiveness and safety of an AOT protocol for labour augmentation with regard to CS rate and indicators of maternal and perinatal morbidities. Copyright © 2018 Society of Obstetricians and Gynaecologists of Canada. Published by Elsevier Inc. All rights reserved.
Kinetics of Aggregation with Choice
Ben-Naim, Eli; Krapivsky, Paul
2016-12-01
Here we generalize the ordinary aggregation process to allow for choice. In ordinary aggregation, two random clusters merge and form a larger aggregate. In our implementation of choice, a target cluster and two candidate clusters are randomly selected and the target cluster merges with the larger of the two candidate clusters. We study the long-time asymptotic behavior and find that as in ordinary aggregation, the size density adheres to the standard scaling form. However, aggregation with choice exhibits a number of different features. First, the density of the smallest clusters exhibits anomalous scaling. Second, both the small-size and the large-size tails of the density are overpopulated, at the expense of the density of moderate-size clusters. Finally, we also study the complementary case where the smaller candidate cluster participates in the aggregation process and find an abundance of moderate clusters at the expense of small and large clusters. Additionally, we investigate aggregation processes with choice among multiple candidate clusters and a symmetric implementation where the choice is between two pairs of clusters.
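The merging rule is simple enough to simulate directly. The sketch below (Python; system size and event count are illustrative assumptions) is a minimal Monte Carlo of aggregation with choice: at each event a target cluster and two candidate clusters are drawn at random and the target merges with the larger candidate.

import random

def aggregate_with_choice(n_monomers=50_000, n_events=45_000, seed=42):
    # Start from monomers; at each event pick a target i and candidates j, k
    # uniformly at random, then merge the target with the LARGER candidate.
    rng = random.Random(seed)
    clusters = [1] * n_monomers
    for _ in range(n_events):
        i, j, k = rng.sample(range(len(clusters)), 3)
        winner = j if clusters[j] >= clusters[k] else k
        clusters[i] += clusters[winner]
        clusters[winner] = clusters[-1]   # O(1) removal: overwrite, then pop last
        clusters.pop()
    return clusters

sizes = aggregate_with_choice()
print("clusters remaining:", len(sizes), "| largest:", max(sizes))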
NASA Technical Reports Server (NTRS)
Baker, B.; Brown, H.
1974-01-01
Advantages of the large time-bandwidth product of optical processing are presented. Experiments were performed to study the feasibility of using optical spectral analysis for detection of flaws in structural elements excited by random noise. Photographic and electronic methods of comparison of complex spectra were developed. Limitations were explored, and suggestions for further work are offered.
ERIC Educational Resources Information Center
Bettinger, Eric; Doss, Christopher; Loeb, Susanna; Taylor, Eric
2015-01-01
Class size is a first-order consideration in the study of education production and education costs. How larger or smaller classes affect student outcomes is especially relevant to the growth and design of online classes. We study a field experiment in which college students were quasi-randomly assigned to either a large or a small class. All…
ERIC Educational Resources Information Center
Chiang, Hanley; Speroni, Cecilia; Herrmann, Mariesa; Hallgren, Kristin; Burkander, Paul; Wellington, Alison
2017-01-01
The Teacher Incentive Fund (TIF) provides grants to support performance-based compensation systems for teachers and principals in high-need schools. The study measures the impact of pay-for-performance bonuses as part of a comprehensive compensation system within a large, multisite random assignment study design. The treatment schools were to…
ERIC Educational Resources Information Center
Max, Jeffrey; Constantine, Jill; Wellington, Alison; Hallgren, Kristin; Glazerman, Steven; Chiang, Hanley; Speroni, Cecilia
2014-01-01
The Teacher Incentive Fund (TIF) provides grants to support performance-based compensation systems for teachers and principals in high-need schools. The study measures the impact of pay-for-performance bonuses as part of a comprehensive compensation system within a large, multisite random assignment study design. The treatment schools were to…
Zhao, Lei; Guo, Yi; Wang, Wei; Yan, Li-juan
2011-08-01
To evaluate the effectiveness of acupuncture as a treatment for neurovascular headache and to analyze the current situation related to acupuncture treatment. PubMed database (1966-2010), EMBASE database (1986-2010), Cochrane Library (Issue 1, 2010), Chinese Biomedical Literature Database (1979-2010), China HowNet Knowledge Database (1979-2010), VIP Journals Database (1989-2010), and Wanfang database (1998-2010) were searched. Randomized or quasi-randomized controlled studies were included. The priority was given to high-quality randomized, controlled trials. Statistical outcome indicators were measured using RevMan 5.0.20 software. A total of 16 articles and 1535 cases were included. Meta-analysis showed a significant difference between the acupuncture therapy and Western medicine therapy [combined RR (random-effects model) = 1.46, 95% CI (1.21, 1.75), Z = 3.96, P < 0.0001], indicating an obvious superior effect of the acupuncture therapy; a significant difference also existed between the comprehensive acupuncture therapy and acupuncture therapy alone [combined RR (fixed-effects model) = 3.35, 95% CI (1.92, 5.82), Z = 4.28, P < 0.0001], indicating that acupuncture combined with other therapies, such as point injection, scalp acupuncture, auricular acupuncture, etc., was superior to conventional body acupuncture alone. The included clinical studies, though limited, support the efficacy of acupuncture in the treatment of neurovascular headache. Although acupuncture and its combined therapies provide certain advantages, most clinical studies are of small sample sizes. Large sample size, randomized, controlled trials are needed in the future for more definitive results.
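For readers who want to see the pooling step concretely, the sketch below (Python) implements DerSimonian-Laird random-effects combination of risk ratios, the standard random-effects method behind tools such as RevMan; the three trials in the usage line are hypothetical counts for illustration, not data from this review.

import math

def dersimonian_laird_rr(events_t, n_t, events_c, n_c):
    # Pool log risk ratios across trials with the DerSimonian-Laird
    # random-effects model; returns the pooled RR and its 95% CI.
    y, w = [], []
    for a, na, c, nc in zip(events_t, n_t, events_c, n_c):
        y.append(math.log((a / na) / (c / nc)))      # log risk ratio
        w.append(1.0 / (1/a - 1/na + 1/c - 1/nc))    # inverse-variance weight
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))   # heterogeneity Q
    c_dl = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c_dl)       # between-trial variance
    w_re = [1.0 / (1/wi + tau2) for wi in w]
    mu = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return math.exp(mu), (math.exp(mu - 1.96 * se), math.exp(mu + 1.96 * se))

rr, ci = dersimonian_laird_rr([30, 45, 28], [60, 90, 55], [20, 33, 20], [60, 90, 55])
print(f"pooled RR = {rr:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")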
Use of simulation to compare the performance of minimization with stratified blocked randomization.
Toorawa, Robert; Adena, Michael; Donovan, Mark; Jones, Steve; Conlon, John
2009-01-01
Minimization is an alternative method to stratified permuted block randomization, which may be more effective at balancing treatments when there are many strata. However, its use in the regulatory setting for industry trials remains controversial, primarily due to the difficulty in interpreting conventional asymptotic statistical tests under restricted methods of treatment allocation. We argue that the use of minimization should be critically evaluated when designing the study for which it is proposed. We demonstrate by example how simulation can be used to investigate whether minimization improves treatment balance compared with stratified randomization, and how much randomness can be incorporated into the minimization before any balance advantage is no longer retained. We also illustrate by example how the performance of the traditional model-based analysis can be assessed, by comparing the nominal test size with the observed test size over a large number of simulations. We recommend that the assignment probability for the minimization be selected using such simulations. Copyright (c) 2008 John Wiley & Sons, Ltd.
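The allocation rule being evaluated is easy to state in code. The sketch below (Python; the factors, two-arm setup, and p_assign value are illustrative assumptions) is a Pocock-Simon-style minimization in which the imbalance-minimizing arm is chosen only with probability p_assign, the random element whose size the authors recommend tuning by simulation.

import random

def minimize_assign(patient, allocated, factors, p_assign=0.8, rng=random):
    # Pocock-Simon-style minimization for two arms 'A' and 'B'.
    # `allocated` holds (arm, patient_dict) pairs already assigned;
    # `patient` maps each stratification factor to the new patient's level.
    imbalance = {}
    for arm in ('A', 'B'):
        total = 0
        for f in factors:
            counts = {'A': 0, 'B': 0}
            for prev_arm, prev in allocated:
                if prev[f] == patient[f]:
                    counts[prev_arm] += 1
            counts[arm] += 1                      # hypothetically assign here
            total += abs(counts['A'] - counts['B'])
        imbalance[arm] = total
    if imbalance['A'] == imbalance['B']:
        return rng.choice(('A', 'B'))
    best = min(imbalance, key=imbalance.get)
    other = 'B' if best == 'A' else 'A'
    return best if rng.random() < p_assign else other

allocated = []
for _ in range(20):
    pt = {'sex': random.choice('MF'), 'site': random.randint(1, 3)}
    allocated.append((minimize_assign(pt, allocated, ('sex', 'site')), pt))
print(''.join(arm for arm, _ in allocated))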
Towse, John N; Loetscher, Tobias; Brugger, Peter
2014-01-01
We investigate the number preferences of children and adults when generating random digit sequences. Previous research has shown convincingly that adults prefer smaller numbers when randomly choosing between responses 1-6. We analyze randomization choices made by both children and adults, considering a range of experimental studies and task configurations. Children - most of whom are between 8 and 11 years - show a preference for relatively large numbers when choosing numbers 1-10. Adults show a preference for small numbers with the same response set. We report a modest association between children's age and numerical bias. However, children also exhibit a small number bias with a smaller response set available, and they show a preference specifically for the numbers 1-3 across many datasets. We argue that number space demonstrates both continuities (numbers 1-3 have a distinct status) and change (a developmentally emerging bias toward the left side of representational space or lower numbers).
Bakbergenuly, Ilyas; Kulinskaya, Elena; Morgenthaler, Stephan
2016-07-01
We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group-level studies or in meta-analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log-odds and arcsine transformations of the estimated probability p̂, both for single-group studies and in combining results from several groups or studies in meta-analysis. Our simulations confirm that these biases are linear in ρ for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta-analysis and result in abysmal coverage of the combined effect for large K. We also propose a bias correction for the arcsine transformation. Our simulations demonstrate that this bias correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta-analyses of prevalence. © 2016 The Authors. Biometrical Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.
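The overdispersion-driven part of this bias can be reproduced with a short Monte Carlo. The sketch below (Python; a beta-binomial data-generating model and all parameter values are illustrative assumptions, not the authors' code) estimates how far the mean arcsine-transformed proportion drifts from its target as ρ grows; note the estimate also contains the ordinary finite-sample transformation bias visible even at very small ρ.

import math, random

def arcsine_bias(p=0.2, rho=0.1, n=50, reps=20_000, seed=7):
    # Draw overdispersed counts from a beta-binomial model with mean p and
    # intracluster correlation rho, and estimate
    # E[arcsin(sqrt(p_hat))] - arcsin(sqrt(p)).
    rng = random.Random(seed)
    a, b = p * (1 - rho) / rho, (1 - p) * (1 - rho) / rho
    acc = 0.0
    for _ in range(reps):
        pi = rng.betavariate(a, b)               # cluster-specific probability
        x = sum(rng.random() < pi for _ in range(n))
        acc += math.asin(math.sqrt(x / n))
    return acc / reps - math.asin(math.sqrt(p))

for rho in (0.01, 0.05, 0.1, 0.2):
    print(f"rho = {rho:<4}  estimated bias = {arcsine_bias(rho=rho):+.4f}")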
Li, Shenshen; Wu, Yangfeng; Du, Xin; Li, Xian; Patel, Anushka; Peterson, Eric D; Turnbull, Fiona; Lo, Serigne; Billot, Laurent; Laba, Tracey; Gao, Runlin
2015-03-01
Acute coronary syndromes (ACSs) are a major cause of morbidity and mortality, yet effective ACS treatments are frequently underused in clinical practice. Randomized trials including the CPACS-2 study suggest that quality improvement initiatives can increase the use of effective treatments, but whether such programs can impact hard clinical outcomes has never been demonstrated in a well-powered randomized controlled trial. The CPACS-3 study is a stepped-wedge cluster-randomized trial conducted in 104 remote level 2 hospitals without PCI facilities in China. All hospitalized ACS patients will be recruited consecutively over a 30-month period to an anticipated total study population of more than 25,000 patients. After a 6-month baseline period, hospitals will be randomized to 1 of 4 groups, and a 6-component quality improvement intervention will be implemented sequentially in each group every 6 months. These components include the following: establishment of a quality improvement team, implementation of a clinical pathway, training of physicians and nurses, hospital performance audit and feedback, online technical support, and patient education. All patients will be followed up for 6 months postdischarge. The primary outcome will be the incidence of in-hospital major adverse cardiovascular events comprising all-cause mortality, myocardial infarction or reinfarction, and nonfatal stroke. The CPACS-3 study will be the first large randomized trial with sufficient power to assess the effects of a multifaceted quality-of-care improvement initiative on hard clinical outcomes in patients with ACS. Copyright © 2014 Elsevier Inc. All rights reserved.
Jones, L A T; Lammertse, D P; Charlifue, S B; Kirshblum, S C; Apple, D F; Ragnarsson, K T; Poonian, D; Betz, R R; Knoller, N; Heary, R F; Choudhri, T F; Jenkins, A L; Falci, S P; Snyder, D A
2010-11-01
Post hoc analysis from a randomized controlled cellular therapy trial in acute, complete spinal cord injury (SCI). Description and quantitative review of study logistics, referral patterns, current practice patterns and subject demographics. Subjects were recruited to one of six international study centers. Data are presented from 1816 patients pre-screened, 75 participants screened and 50 randomized. Of the 1816 patients pre-screened, 53.7% did not meet initial study criteria, primarily due to an injury outside the time window (14 days) or failure to meet neurological criteria (complete SCI between C5 motor/C4 sensory and T11). MRIs were obtained on 339 patients; 51.0% were ineligible based on imaging criteria. Of the 75 participants enrolled, 25 failed screening (SF), leaving 50 randomized. The primary reason for SF was based on the neurological exam (51.9%), followed by failure to meet MRI criteria (22.2%). Of the 50 randomized subjects, there were no significant differences in demographics in the active versus control arms. In those participants for whom data was available, 93.8% (45 of 48) of randomized participants received steroids before study entry, whereas 94.0% (47 of 50) had spine surgery before study enrollment. The 'funnel effect' (large numbers of potentially eligible participants with a small number enrolled) impacts all trials, but was particularly challenging in this trial due to eligibility criteria and logistics. Data collected may provide information on current practice patterns and the issues encountered and addressed may facilitate design of future trials.
Koethe, John R; Westfall, Andrew O; Luhanga, Dora K; Clark, Gina M; Goldman, Jason D; Mulenga, Priscilla L; Cantrell, Ronald A; Chi, Benjamin H; Zulu, Isaac; Saag, Michael S; Stringer, Jeffrey S A
2010-03-12
The benefit of routine HIV-1 viral load (VL) monitoring of patients on antiretroviral therapy (ART) in resource-constrained settings is uncertain because of the high costs associated with the test and the limited treatment options. We designed a cluster randomized controlled trial to compare the use of routine VL testing at ART-initiation and at 3, 6, 12, and 18 months, versus our local standard of care (which uses immunological and clinical criteria to diagnose treatment failure, with discretionary VL testing when the two do not agree). Dedicated study personnel were integrated into public-sector ART clinics. We collected participant information in a dedicated research database. Twelve ART clinics in Lusaka, Zambia constituted the units of randomization. Study clinics were stratified into pairs according to matching criteria (historical mortality rate, size, and duration of operation) to limit the effect of clustering, and independently randomized to the intervention and control arms. The study was powered to detect a 36% reduction in mortality at 18 months. From December 2006 to May 2008, we completed enrollment of 1973 participants. Measured baseline characteristics did not differ significantly between the study arms. Enrollment was staggered by clinic pair and truncated at two matched sites. A large clinical trial of routine VL monitoring was successfully implemented in a dynamic and rapidly growing national ART program. Close collaboration with local health authorities and adequate reserve staff were critical to success. Randomized controlled trials such as this will likely prove valuable in determining long-term outcomes in resource-constrained settings. Clinicaltrials.gov NCT00929604.
Classroom management programs for deaf children in state residential and large public schools.
Wenkus, M; Rittenhouse, B; Dancer, J
1999-12-01
Personnel in 4 randomly selected state residential schools for the deaf and 3 randomly selected large public schools with programs for the deaf were surveyed to assess the types of management or disciplinary programs and strategies currently in use with deaf students and the rated effectiveness of such programs. Several behavioral management programs were identified by respondents, with Assertive Discipline most often listed. Ratings of program effectiveness were generally above average on a number of qualitative criteria.
NASA Astrophysics Data System (ADS)
Sycheva, Elena A.; Vasilev, Aleksandr S.; Lashmanov, Oleg U.; Korotaev, Valery V.
2017-06-01
The article is devoted to the optimization of optoelectronic systems for monitoring the spatial position of objects. Probabilistic characteristics of the detection of an active structured mark on a random noisy background are investigated. The developed computer model and the results of the study allow us to estimate the probabilistic characteristics of detection of a complex structured mark on a random gradient background and to estimate the error of the spatial coordinates. The results of the study make it possible to improve the accuracy of measuring the coordinates of the object. Based on the research, recommendations are given on the choice of parameters of the optimal mark structure for use in optoelectronic systems for monitoring the spatial position of large-sized structures.
Mariampillai, Julian E; Eskås, Per Anders; Heimark, Sondre; Kjeldsen, Sverre E; Narkiewicz, Krzysztof; Mancia, Giuseppe
Although high blood pressure (BP) is the leading risk factor for cardiovascular (CV) disease, the optimal BP treatment target for reducing CV risk is unclear in the aftermath of the SPRINT study. The aim of this review is to assess large randomized controlled trials on BP targets, as well as to review selected observational analyses from other large randomized BP trials, in order to evaluate the benefit of intense vs. standard BP control. None of the studies, except SPRINT, favored intense BP treatment. Some of the studies suggested favorable effects of lowering the treatment target in patients with diabetes or high risk of stroke. In SPRINT, a new BP measurement method was introduced, and the results must be interpreted in light of this. The results of the observational analyses indicated the best preventive effect when achieving early and sustained BP control rather than low targets. In conclusion, today's guidelines' recommended treatment target of <140/90 mmHg seems sufficient for most patients. Early and sustained BP control should be the main focus. Copyright © 2016 Elsevier Inc. All rights reserved.
Cotton, Bryan A; Podbielski, Jeanette; Camp, Elizabeth; Welch, Timothy; del Junco, Deborah; Bai, Yu; Hobbs, Rhonda; Scroggins, Jamie; Hartwell, Beth; Kozar, Rosemary A; Wade, Charles E; Holcomb, John B
2013-10-01
To determine whether resuscitation of severely injured patients with modified whole blood (mWB) resulted in fewer overall transfusions compared with component (COMP) therapy. For decades, whole blood (WB) was the primary product for resuscitating patients in hemorrhagic shock. After dramatic advances in blood banking in the 1970s, blood donor centers began supplying hospitals with individual components [red blood cell (RBC), plasma, platelets] and removed WB as an available product. However, no studies of efficacy or hemostatic potential in trauma patients were performed before doing so. Single-center, randomized trial of severely injured patients predicted to require large transfusion volumes. Pregnant patients, prisoners, those younger than 18 years, and those with more than 20% total body surface area (TBSA) burns were excluded. Patients were randomized to mWB (1 U mWB) or COMP therapy (1 U RBC + 1 U plasma) immediately on arrival. Each group also received 1 U platelets (apheresis or prepooled random donor) for every 6 U of mWB or 6 U of RBC + 6 U plasma. The study was performed under the Exception From Informed Consent (Food and Drug Administration 21 Code of Federal Regulations [CFR] 50.24). Primary outcome was 24-hour transfusion volumes. A total of 107 patients were randomized (55 mWB, 52 COMP therapy) over 14 months. There were no differences in demographics, arrival vitals or laboratory values, injury severity, or mechanism. Transfusions were similar between groups (intent-to-treat analysis). However, when excluding patients with severe brain injury (sensitivity analysis), the mWB group received less 24-hour RBC (median 3 vs 6 U, P = 0.02), plasma (4 vs 6 U, P = 0.02), platelets (0 vs 3 U, P = 0.09), and total products (11 vs 16 U, P = 0.02). Compared with COMP therapy, mWB did not reduce transfusion volumes in severely injured patients predicted to receive massive transfusion. However, in the sensitivity analysis (patients without severe brain injuries), use of mWB significantly reduced transfusion volumes, achieving the prespecified endpoint of this initial pilot study.
Wang, Chenchen; McAlindon, Timothy; Fielding, Roger A; Harvey, William F; Driban, Jeffrey B; Price, Lori Lyn; Kalish, Robert; Schmid, Anna; Scott, Tammy M; Schmid, Christopher H
2015-01-30
Fibromyalgia is a chronic musculoskeletal pain syndrome that causes substantial physical and psychological impairment and costs the US healthcare system over $25 billion annually. Current pharmacological therapies may cause serious adverse effects, are expensive, and fail to effectively improve pain and function. Finding new and effective non-pharmacological treatments for fibromyalgia patients is urgently needed. We are currently conducting the first comparative effectiveness randomized trial of Tai Chi versus aerobic exercise (a recommended component of the current standard of care) in a large fibromyalgia population. This article describes the design and conduct of this trial. A single-center, 52-week, randomized controlled trial of Tai Chi versus aerobic exercise is being conducted at an urban tertiary medical center in Boston, Massachusetts. We plan to recruit 216 patients with fibromyalgia. The study population consists of adults ≥21 years of age with fibromyalgia who meet American College of Rheumatology 1990 and 2010 diagnostic criteria. Participants are randomized to one of four Tai Chi intervention groups: 12 or 24 weeks of supervised Tai Chi held once or twice per week, or a supervised aerobic exercise control held twice per week for 24 weeks. The primary outcome is the change in Revised Fibromyalgia Impact Questionnaire total score from baseline to 24 weeks. Secondary outcomes include measures of widespread pain, symptom severity, functional performance, balance, muscle strength and power, psychological functioning, sleep quality, self-efficacy, durability effects, and health-related quality of life at 12, 24, and 52 week follow-up. This study is the first comparative effectiveness randomized trial of Tai Chi versus aerobic exercise in a large fibromyalgia population with long-term follow up. We present here a robust and well-designed trial to determine the optimal frequency and duration of a supervised Tai Chi intervention with regard to short- and long-term effectiveness. The trial also explores multiple outcomes to elucidate the potential mechanisms of Tai Chi and aerobic exercise and the generalizability of these interventions across instructors. Results of this study are expected to have important public health implications for patients with a major disabling disease that incurs substantial health burdens and economic costs. ClinicalTrials.gov identifier: NCT01420640 , registered 18 August 2011.
Random effects coefficient of determination for mixed and meta-analysis models.
Demidenko, Eugene; Sargent, James; Onega, Tracy
2012-01-01
The key feature of a mixed model is the presence of random effects. We have developed a coefficient, called the random effects coefficient of determination, R_r^2, that estimates the proportion of the conditional variance of the dependent variable explained by random effects. This coefficient takes values from 0 to 1 and indicates how strong the random effects are. The difference from the earlier suggested fixed effects coefficient of determination is emphasized. If R_r^2 is close to 0, there is weak support for random effects in the model because the reduction of the variance of the dependent variable due to random effects is small; consequently, random effects may be ignored and the model simplifies to standard linear regression. A value of R_r^2 away from 0 indicates evidence of the variance reduction in support of the mixed model. If R_r^2 is close to 1, the variance of random effects is very large and random effects turn into free fixed effects; the model can then be estimated using the dummy variable approach. We derive explicit formulas for R_r^2 in three special cases: the random intercept model, the growth curve model, and the meta-analysis model. Theoretical results are illustrated with three mixed model examples: (1) travel time to the nearest cancer center for women with breast cancer in the U.S., (2) cumulative time watching alcohol related scenes in movies among young U.S. teens, as a risk factor for early drinking onset, and (3) the classic example of the meta-analysis model for combination of 13 studies on tuberculosis vaccine.
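A rough illustration of the random-intercept special case follows (Python with statsmodels). The reduction of the coefficient to sigma_b^2 / (sigma_b^2 + sigma_e^2) for a pure random-intercept model is a simplifying assumption made here, not necessarily the authors' exact formula.

import numpy as np
import statsmodels.api as sm

# Simulate a random-intercept data set: y_ij = beta0 + b_i + e_ij.
rng = np.random.default_rng(0)
n_groups, n_per = 40, 25
sigma_b, sigma_e = 2.0, 1.0
groups = np.repeat(np.arange(n_groups), n_per)
y = 5.0 + rng.normal(0, sigma_b, n_groups)[groups] \
        + rng.normal(0, sigma_e, n_groups * n_per)

# Fit the mixed model; the share of outcome variance attributable to the
# random effects plays the role of the random effects R^2 here.
fit = sm.MixedLM(y, np.ones((len(y), 1)), groups=groups).fit()
var_b = float(fit.cov_re.iloc[0, 0])   # estimated random-intercept variance
var_e = fit.scale                      # estimated residual variance
r2_random = var_b / (var_b + var_e)
true_val = sigma_b**2 / (sigma_b**2 + sigma_e**2)
print(f"R_r^2 estimate = {r2_random:.3f} (simulation truth {true_val:.3f})")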
Radiation Transport in Random Media With Large Fluctuations
NASA Astrophysics Data System (ADS)
Olson, Aaron; Prinja, Anil; Franke, Brian
2017-09-01
Neutral particle transport in media exhibiting large and complex material property spatial variation is modeled by representing cross sections as lognormal random functions of space, generated through a nonlinear memory-less transformation of a Gaussian process with covariance uniquely determined by the covariance of the cross section. A Karhunen-Loève decomposition of the Gaussian process is implemented to efficiently generate realizations of the random cross sections, and Woodcock Monte Carlo is used to transport particles on each realization and generate benchmark solutions for the mean and variance of the particle flux as well as probability densities of the particle reflectance and transmittance. A computationally efficient stochastic collocation method is implemented to directly compute statistical moments such as the mean and variance, while a polynomial chaos expansion in conjunction with stochastic collocation provides a convenient surrogate model that also produces probability densities of output quantities of interest. Extensive numerical testing demonstrates that use of stochastic reduced-order modeling provides an accurate and cost-effective alternative to random sampling for particle transport in random media.
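The sampling step described above reduces to a few lines on a grid. The sketch below (Python/NumPy; the 1-D grid, exponential covariance, and truncation level are illustrative assumptions) builds a discrete Karhunen-Loève expansion from the eigendecomposition of the covariance matrix and exponentiates the Gaussian realizations to obtain lognormal cross sections.

import numpy as np

def lognormal_xs_realizations(n_pts=200, length=10.0, corr_len=1.0,
                              sigma_g=0.5, mu_g=0.0, n_terms=30,
                              n_real=5, seed=1):
    # Discrete KL expansion: eigendecompose an exponential covariance on a
    # 1-D grid, sample the Gaussian process from its leading modes, and
    # exponentiate to get lognormal cross-section realizations.
    x = np.linspace(0.0, length, n_pts)
    cov = sigma_g**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    eigval, eigvec = np.linalg.eigh(cov)          # ascending eigenvalues
    eigval = np.clip(eigval[::-1], 0.0, None)     # leading modes first
    eigvec = eigvec[:, ::-1]
    xi = np.random.default_rng(seed).standard_normal((n_real, n_terms))
    gauss = mu_g + xi @ (eigvec[:, :n_terms] * np.sqrt(eigval[:n_terms])).T
    return x, np.exp(gauss)

x, xs = lognormal_xs_realizations()
print("realizations:", xs.shape, "| cross-section range:", xs.min(), xs.max())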
Red blood cell transfusion triggers in acute leukemia: a randomized pilot study.
DeZern, Amy E; Williams, Katherine; Zahurak, Marianna; Hand, Wesley; Stephens, R Scott; King, Karen E; Frank, Steven M; Ness, Paul M
2016-07-01
Red blood cell (RBC) transfusion thresholds have yet to be examined in large randomized trials in hematologic malignancies. This pilot study in acute leukemia compares a restrictive with a liberal transfusion strategy. A randomized (2:1) study was conducted of a restrictive (LOW) hemoglobin (Hb) trigger (7 g/dL) compared with a higher (HIGH) Hb trigger (8 g/dL). The primary outcome was feasibility of conducting a larger trial. The four requirements for success required that more than 50% of the eligible patients could be consented, more than 75% of the patients randomized to the LOW arm tolerated the transfusion trigger, fewer than 15% of patients crossed over from the LOW arm to the HIGH arm, and no indication for the need to pause the study for safety concerns. Secondary outcomes included fatigue, bleeding, and RBCs and platelets transfused. Ninety patients were consented and randomly assigned to LOW or HIGH. The four criteria for the primary objective of feasibility were met. When the number of units transfused was compared, adjusting for baseline Hb, the LOW arm was transfused on average 8.0 (95% confidence interval [CI], 6.9-9.1) units/patient while the HIGH arm received 11.7 (95% CI, 10.1-13.2) units (p = 0.0003). There was no significant difference in bleeding events or neutropenic fevers between study arms. This study establishes feasibility for a trial of Hb thresholds in leukemia through demonstration of success in all primary outcome metrics and a favorable safety profile. This population requires further study to evaluate the equivalence of liberal and restrictive transfusion thresholds in this unique clinical setting. © 2016 AABB.
Montassier, Emmanuel; Hardouin, Jean-Benoît; Segard, Julien; Batard, Eric; Potel, Gilles; Planchon, Bernard; Trochu, Jean-Noël; Pottier, Pierre
2016-04-01
An ECG is pivotal for the diagnosis of coronary heart disease. Previous studies have reported deficiencies in ECG interpretation skills that have been responsible for misdiagnosis. However, the optimal way to acquire ECG interpretation skills is still under discussion. Thus, our objective was to compare the effectiveness of e-learning and lecture-based courses for learning ECG interpretation skills in a large randomized study. We conducted a prospective, randomized, controlled, noninferiority study. Participants were recruited from among fifth-year medical students and were assigned to the e-learning group or the lecture-based group using a computer-generated random allocation sequence. The e-learning and lecture-based groups were compared on a score of effectiveness, comparing the 95% unilateral confidence interval (95% UCI) of the score of effectiveness with the mean effectiveness in the lecture-based group, adjusted for a noninferiority margin. Ninety-eight students were enrolled. As compared with the lecture-based course, e-learning was noninferior with regard to the postcourse test score (15.1; 95% UCI 14.2; +∞), which can be compared with 12.5 [the mean effectiveness in the lecture-based group (15.0) minus the noninferiority margin (2.5)]. Furthermore, there was a significant increase in the test score points in both the e-learning and lecture-based groups during the study period (both P<0.0001). Our randomized study showed that the e-learning course is an effective tool for the acquisition of ECG interpretation skills by medical students. These preliminary results should be confirmed with further multicenter studies before the implementation of e-learning courses for learning ECG interpretation skills during medical school.
Saver, Jeffrey L; Goyal, Mayank; Bonafe, Alain; Diener, Hans-Christoph; Levy, Elad I; Pereira, Vitor M; Albers, Gregory W; Cognard, Christophe; Cohen, David J; Hacke, Werner; Jansen, Olav; Jovin, Tudor G; Mattle, Heinrich P; Nogueira, Raul G; Siddiqui, Adnan H; Yavagal, Dileep R; Devlin, Thomas G; Lopes, Demetrius K; Reddy, Vivek; du Mesnil de Rochemont, Richard; Jahan, Reza
2015-04-01
Early reperfusion in patients experiencing acute ischemic stroke is critical, especially for patients with large vessel occlusion who have poor prognosis without revascularization. Solitaire™ stent retriever devices have been shown to immediately restore vascular perfusion safely, rapidly, and effectively in acute ischemic stroke patients with large vessel occlusions. The aim of the study was to demonstrate that, among patients with large vessel, anterior circulation occlusion who have received intravenous tissue plasminogen activator, treatment with Solitaire revascularization devices reduces degree of disability 3 months post stroke. The study is a global multicenter, two-arm, prospective, randomized, open, blinded end-point trial comparing functional outcomes in acute ischemic stroke patients who are treated with either intravenous tissue plasminogen activator alone or intravenous tissue plasminogen activator in combination with the Solitaire device. Up to 833 patients will be enrolled. Patients who have received intravenous tissue plasminogen activator are randomized to either continue with intravenous tissue plasminogen activator alone or additionally proceed to neurothrombectomy using the Solitaire device within six-hours of symptom onset. The primary end-point is 90-day global disability, assessed with the modified Rankin Scale (mRS). Secondary outcomes include mortality at 90 days, functional independence (mRS ≤ 2) at 90 days, change in National Institutes of Health Stroke Scale at 27 h, reperfusion at 27 h, and thrombolysis in cerebral infarction 2b/3 flow at the end of the procedure. Statistical analysis will be conducted using simultaneous success criteria on the overall distribution of modified Rankin Scale (Rankin shift) and proportions of subjects achieving functional independence (mRS 0-2). © 2015 The Authors. International Journal of Stroke published by John Wiley & Sons Ltd on behalf of World Stroke Organization.
Camacho, Macario; Chang, Edward T; Song, Sungjin A; Abdullatif, Jose; Zaghi, Soroush; Pirelli, Paola; Certal, Victor; Guilleminault, Christian
2017-07-01
To perform a systematic review with meta-analysis for sleep study outcomes in children who have undergone rapid maxillary expansion (RME) as treatment for obstructive sleep apnea (OSA). PubMed/MEDLINE and eight additional databases. Three authors independently and systematically reviewed the international literature through February 21, 2016. Seventeen studies reported outcomes for 314 children (7.6 ± 2.0 years old) with high-arched and/or narrow hard palates (transverse maxillary deficiency) and OSA. Data were analyzed based on follow-up duration: ≤3 years (314 patients) and >3 years (52 patients). For ≤3-year follow-up, the pre- and post-RME apnea-hypopnea index (AHI) decreased from a mean ± standard deviation (M ± SD) of 8.9 ± 7.0/hr to 2.7 ± 3.3/hr (70% reduction). The cure rate (AHI <1/hr) for 90 patients for whom it could be calculated was 25.6%. Random effects modeling for AHI standardized mean difference (SMD) is -1.54 (large effect). Lowest oxygen saturation (LSAT) improved from 87.0 ± 9.1% to 96.0 ± 2.7%. Random effects modeling for LSAT SMD is 1.74 (large effect). AHI improved more in children with previous adenotonsillectomy or small tonsils (73-95% reduction) than in children with large tonsils (61% reduction). For >3-year follow-up (range = 6.5-12 years), the AHI was reduced from an M ± SD of 7.1 ± 5.7/hr to 1.5 ± 1.8/hr (79% reduction). Improvement in AHI and lowest oxygen saturation has consistently been seen in children undergoing RME, especially in the short term (≤3-year follow-up). Randomized trials and more studies reporting long-term data (>3-year follow-up) would help determine the effect of growth and spontaneous resolution of OSA. Laryngoscope, 127:1712-1719, 2017. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.
Community turnover of wood-inhabiting fungi across hierarchical spatial scales.
Abrego, Nerea; García-Baquero, Gonzalo; Halme, Panu; Ovaskainen, Otso; Salcedo, Isabel
2014-01-01
For efficient use of conservation resources it is important to determine how species diversity changes across spatial scales. In many poorly known species groups little is known about at which spatial scales the conservation efforts should be focused. Here we examined how the community turnover of wood-inhabiting fungi is realised at three hierarchical levels, and how much of community variation is explained by variation in resource composition and spatial proximity. The hierarchical study design consisted of management type (fixed factor), forest site (random factor, nested within management type) and study plots (randomly placed plots within each study site). To examine how species richness varied across the three hierarchical scales, randomized species accumulation curves and additive partitioning of species richness were applied. To analyse variation in wood-inhabiting species and dead wood composition at each scale, linear and Permanova modelling approaches were used. Wood-inhabiting fungal communities were dominated by rare and infrequent species. The similarity of fungal communities was higher within sites and within management categories than among sites or between the two management categories, and it decreased with increasing distance among the sampling plots and with decreasing similarity of dead wood resources. However, only a small part of community variation could be explained by these factors. The species present in managed forests were to a large extent a subset of those species present in natural forests. Our results suggest that in particular the protection of rare species requires a large total area. As managed forests have only little additional value complementing the diversity of natural forests, the conservation of natural forests is the key to ecologically effective conservation. As the dissimilarity of fungal communities increases with distance, the conserved natural forest sites should be broadly distributed in space, yet the individual conserved areas should be large enough to ensure local persistence.
On factoring RSA modulus using random-restart hill-climbing algorithm and Pollard’s rho algorithm
NASA Astrophysics Data System (ADS)
Budiman, M. A.; Rachmawati, D.
2017-12-01
The security of the widely-used RSA public key cryptography algorithm depends on the difficulty of factoring a big integer into two large prime numbers. For many years, the integer factorization problem has been intensively and extensively studied in the field of number theory. As a result, a lot of deterministic algorithms such as Euler's algorithm, Kraitchik's, and variants of Pollard's algorithms have been researched comprehensively. Our study takes a rather uncommon approach: rather than making use of intensive number theories, we attempt to factorize the RSA modulus n by using the random-restart hill-climbing algorithm, which belongs to the class of metaheuristic algorithms. The factorization time of RSA moduli with different lengths is recorded and compared with the factorization time of Pollard's rho algorithm, which is a deterministic algorithm. Our experimental results indicate that while the random-restart hill-climbing algorithm is an acceptable candidate to factorize smaller RSA moduli, its factorization speed is much slower than that of Pollard's rho algorithm.
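The deterministic baseline in the comparison is compact enough to quote in full. The sketch below (Python) is a textbook Pollard's rho with Floyd cycle detection applied to a toy modulus; real RSA moduli are, of course, vastly larger.

import math, random

def pollard_rho(n, seed=1):
    # Return a nontrivial factor of composite n using Pollard's rho
    # with Floyd cycle detection; restart with a new constant c on failure.
    if n % 2 == 0:
        return 2
    rng = random.Random(seed)
    while True:
        c = rng.randrange(1, n)
        f = lambda v: (v * v + c) % n
        x = y = rng.randrange(2, n)
        d = 1
        while d == 1:
            x = f(x)
            y = f(f(y))
            d = math.gcd(abs(x - y), n)
        if d != n:
            return d

n = 10403            # toy modulus: 101 * 103
p = pollard_rho(n)
print(n, "=", p, "*", n // p)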
Miller, Lucy Jane; Coll, Joseph R; Schoen, Sarah A
2007-01-01
A pilot randomized controlled trial (RCT) of the effectiveness of occupational therapy using a sensory integration approach (OT-SI) was conducted with children who had sensory modulation disorders (SMDs). This study evaluated the effectiveness of three treatment groups. In addition, sample size estimates for a large-scale, multisite RCT were calculated. Twenty-four children with SMD were randomly assigned to one of three treatment conditions: OT-SI, Activity Protocol, and No Treatment. Pretest and posttest measures of behavior, sensory and adaptive functioning, and physiology were administered. The OT-SI group, compared to the other two groups, made significant gains on goal attainment scaling and on the Attention subtest and the Cognitive/Social composite of the Leiter International Performance Scale-Revised. Compared to the control groups, OT-SI improvement trends on the Short Sensory Profile, Child Behavior Checklist, and electrodermal reactivity were in the hypothesized direction. Findings suggest that OT-SI may be effective in ameliorating difficulties of children with SMD.
Huang, Jia; Lin, Zhengkun; Wang, Qin; Liu, Feiwen; Liu, Jiao; Fang, Yunhua; Chen, Shanjia; Zhou, Xiaoxuan; Hong, Wenjun; Wu, Jinsong; Madrigal-Mora, Natalia; Zheng, Guohua; Yang, Shanli; Tao, Jing; Chen, Lidian
2015-06-16
Post-stroke cognitive impairment (PSCI) lessens quality of life, impedes rehabilitation after stroke, and increases the social and economic burden stroke imposes on patients and their families. Effective treatment is therefore of paramount importance, yet treatment options for PSCI are very limited. The primary aim of this protocol is to propose a lower-cost and more effective therapy, and to confirm the long-term effectiveness of a therapeutic regimen of Traditional Chinese Medicine (TCM) rehabilitation for PSCI. A prospective, multicenter, large-sample, randomized controlled trial will be conducted. A total of 416 eligible patients will be recruited from seven inpatient and outpatient stroke rehabilitation units and randomly allocated to a therapeutic regimen of TCM rehabilitation group or a cognitive training (CT) control group. The intervention period for both groups will last 12 weeks (30 minutes per day, five days per week). Primary and secondary outcomes will be measured at baseline, 12 weeks (at the end of the intervention), and 36 weeks (after the 24-week follow-up period). This protocol presents the design of a multicenter, large-sample, randomized controlled trial that aims to put forward a lower-cost and more effective therapy and to confirm the long-term effectiveness of a therapeutic regimen of TCM rehabilitation for PSCI through subjective and objective assessments, as well as to highlight its economic advantages. This trial was registered with the Chinese Clinical Trial Registry (identifier: ChiCTR-TRC-14004872 ) on 23 June 2014.
Resonance, criticality, and emergence in city traffic investigated in cellular automaton models.
Varas, A; Cornejo, M D; Toledo, B A; Muñoz, V; Rogan, J; Zarama, R; Valdivia, J A
2009-11-01
The complex behavior that occurs when traffic lights are synchronized is studied for a row of interacting cars. The system is modeled through a cellular automaton. Two strategies are considered: all lights in phase and a "green wave" with a propagating green signal. It is found that the mean velocity near the resonant condition follows a critical scaling law. For the green wave, it is shown that the mean velocity scaling law holds even for random separation between traffic lights and is not dependent on the density. This independence on car density is broken when random perturbations are considered in the car velocity. Random velocity perturbations also have the effect of leading the system to an emergent state, where cars move in clusters, but with an average velocity which is independent of traffic light switching for large injection rates.
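A stripped-down automaton already shows the two light strategies side by side. The sketch below (Python; a rule-184-style model with speeds 0/1, far simpler than the paper's automaton, and all parameters illustrative) compares the mean velocity under in-phase lights with a propagating green wave.

import random

def run_traffic_ca(road_len=200, n_cars=40, n_lights=4, period=20,
                   green_wave=True, steps=500, seed=3):
    # Single-lane ring road; a car advances one cell per step unless the next
    # cell is occupied or it sits at a red light. Lights alternate green/red
    # every `period` steps, optionally phase-shifted to form a green wave.
    rng = random.Random(seed)
    cars = set(rng.sample(range(road_len), n_cars))
    lights = [road_len * i // n_lights for i in range(n_lights)]
    offsets = [i * period // n_lights if green_wave else 0
               for i in range(n_lights)]
    moved = 0
    for t in range(steps):
        red = {L for L, o in zip(lights, offsets)
               if ((t - o) // period) % 2 == 1}
        nxt = set()
        for pos in cars:
            target = (pos + 1) % road_len
            if target not in cars and pos not in red:
                nxt.add(target)
                moved += 1
            else:
                nxt.add(pos)
        cars = nxt
    return moved / (n_cars * steps)

print("mean velocity, green wave:", run_traffic_ca(green_wave=True))
print("mean velocity, in phase:  ", run_traffic_ca(green_wave=False))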
Experiments in randomly agitated granular assemblies close to the jamming transition
NASA Astrophysics Data System (ADS)
Caballero, Gabriel; Lindner, Anke; Ovarlez, Guillaume; Reydellet, Guillaume; Lanuza, José; Clément, Eric
2004-11-01
We present the results obtained for two experiments on randomly agitated granular assemblies using a novel way of shaking. First we discuss the transport properties of a 2D model system undergoing classical shaking that show the importance of large scale dynamics for this type of agitation and offer a local view of the microscopic motions of a grain. We then develop a new way of vibrating the system allowing for random accelerations smaller than gravity. Using this method we study the evolution of the free surface as well as results from a light scattering method for a 3D model system. The final aim of these experiments is to investigate the ideas of effective temperature on the one hand as a function of inherent states and on the other hand using fluctuation dissipation relations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romero, Vicente; Bonney, Matthew; Schroeder, Benjamin
When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: central 95% of response; and 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depends on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large data base and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
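One classic member of this family of simple sparse-data bounds (not necessarily among the report's specific methods) is the order-statistic tolerance bound: with n samples, the sample maximum bounds the p-quantile of the unknown distribution with confidence 1 - p^n. The sketch below (Python/NumPy; the lognormal stand-in population is an illustrative assumption) checks that analytic confidence by simulation.

import numpy as np

def wilks_confidence(n, p):
    # Chance that the max of n i.i.d. samples exceeds the p-quantile: 1 - p**n.
    return 1.0 - p ** n

rng = np.random.default_rng(0)
n, p, trials = 10, 0.95, 50_000
q95 = np.quantile(rng.lognormal(0.0, 1.0, 1_000_000), p)   # "true" quantile
maxima = rng.lognormal(0.0, 1.0, (trials, n)).max(axis=1)
print(f"analytic confidence: {wilks_confidence(n, p):.3f}, "
      f"simulated coverage: {(maxima >= q95).mean():.3f}")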
He, Qiang; Hu, Xiangtao; Ren, Hong; Zhang, Hongqi
2015-11-01
A novel artificial fish swarm algorithm (NAFSA) is proposed for solving the large-scale reliability-redundancy allocation problem (RAP). In NAFSA, the social behaviors of the fish swarm are classified in three ways: foraging behavior, reproductive behavior, and random behavior. The foraging behavior designs two position-updating strategies, and the selection and crossover operators are applied to define the reproductive ability of an artificial fish. For the random behavior, which is essentially a mutation strategy, the basic cloud generator is used as the mutation operator. Finally, numerical results for four benchmark problems and a large-scale RAP are reported and compared. NAFSA shows good performance in terms of computational accuracy and computational efficiency for the large-scale RAP. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
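To fix ideas, the sketch below (Python) is a heavily simplified artificial-fish-swarm loop on a toy continuous objective, keeping only the foraging and random (mutation) behaviors; the paper's NAFSA additionally uses selection/crossover for reproduction and a cloud-model mutation operator, and targets reliability-redundancy allocation rather than this toy function.

import random

def afsa_sketch(obj, dim=5, n_fish=30, visual=0.5, step=0.2, iters=300, seed=0):
    # Each fish probes a random point within its visual range (foraging);
    # if the probe is no better, it takes a random mutation step instead.
    rng = random.Random(seed)
    fish = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_fish)]
    best = min(fish, key=obj)[:]
    for _ in range(iters):
        for i, x in enumerate(fish):
            probe = [xi + rng.uniform(-visual, visual) for xi in x]
            if obj(probe) < obj(x):
                fish[i] = [xi + step * (pi - xi) for xi, pi in zip(x, probe)]
            else:
                fish[i] = [xi + rng.uniform(-step, step) for xi in x]
        cand = min(fish, key=obj)
        if obj(cand) < obj(best):
            best = cand[:]
    return best

sphere = lambda x: sum(v * v for v in x)
print("best objective value found:", sphere(afsa_sketch(sphere)))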
Spatiotemporal property and predictability of large-scale human mobility
NASA Astrophysics Data System (ADS)
Zhang, Hai-Tao; Zhu, Tao; Fu, Dongfei; Xu, Bowen; Han, Xiao-Pu; Chen, Duxin
2018-04-01
Spatiotemporal characteristics of human mobility emerging from complexity on the individual scale have been extensively studied owing to their application potential in human behavior prediction and recommendation, and in the control of epidemic spreading. We collect and investigate a comprehensive data set of human activities on large geographical scales, including both website browsing and mobile tower visits. Numerical results show that the degree of activity decays as a power law, indicating that human behaviors are reminiscent of the scale-free random walks known as Lévy flights. More significantly, this study suggests that human activities on large geographical scales have specific non-Markovian characteristics, such as a two-segment power-law distribution of dwelling time and a high possibility for prediction. Furthermore, a scale-free mobility model with two essential ingredients, preferential return and exploration, and a Gaussian distribution assumption on the exploration tendency parameter is proposed; it outperforms existing human mobility models under scenarios of large geographical scales.
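The two ingredients named above correspond to the familiar exploration/preferential-return scheme. The sketch below (Python; the values rho = 0.6 and gamma = 0.21 are conventional illustrative choices, and the Gaussian spread of the exploration tendency across individuals is omitted) generates one synthetic trajectory.

import random
from collections import Counter

def epr_trajectory(n_steps=10_000, rho=0.6, gamma=0.21, seed=0):
    # With probability rho * S**(-gamma) (S = distinct locations so far)
    # the walker explores a brand-new location; otherwise it returns to a
    # visited location with probability proportional to past visit counts.
    rng = random.Random(seed)
    visits = Counter({0: 1})
    next_new = 1
    for _ in range(n_steps):
        if rng.random() < rho * len(visits) ** (-gamma):
            loc, next_new = next_new, next_new + 1     # exploration
        else:                                          # preferential return
            locs, counts = zip(*visits.items())
            loc = rng.choices(locs, weights=counts)[0]
        visits[loc] += 1
    return visits

v = epr_trajectory()
total = sum(v.values())
print("distinct locations:", len(v),
      "| top-3 visit shares:", [round(c / total, 3) for _, c in v.most_common(3)])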
Fottrell, Edward; Byass, Peter; Berhane, Yemane
2008-03-25
As in any measurement process, a certain amount of error may be expected in routine population surveillance operations such as those in demographic surveillance sites (DSSs). Vital events are likely to be missed and errors made no matter what method of data capture is used or what quality control procedures are in place. The extent to which random errors in large, longitudinal datasets affect overall health and demographic profiles has important implications for the role of DSSs as platforms for public health research and clinical trials. Such knowledge is also of particular importance if the outputs of DSSs are to be extrapolated and aggregated with realistic margins of error and validity. This study uses the first 10-year dataset from the Butajira Rural Health Project (BRHP) DSS, Ethiopia, covering approximately 336,000 person-years of data. Simple programmes were written to introduce random errors and omissions into new versions of the definitive 10-year Butajira dataset. Key parameters of sex, age, death, literacy and roof material (an indicator of poverty) were selected for the introduction of errors based on their obvious importance in demographic and health surveillance and their established significant associations with mortality. Defining the original 10-year dataset as the 'gold standard' for the purposes of this investigation, population, age and sex compositions and Poisson regression models of mortality rate ratios were compared between each of the intentionally erroneous datasets and the original 'gold standard' 10-year data. The composition of the Butajira population was well represented despite introducing random errors, and differences between population pyramids based on the derived datasets were subtle. Regression analyses of well-established mortality risk factors were largely unaffected even by relatively high levels of random errors in the data. The low sensitivity of parameter estimates and regression analyses to significant amounts of randomly introduced errors indicates a high level of robustness of the dataset. This apparent inertia of population parameter estimates to simulated errors is largely due to the size of the dataset. Tolerable margins of random error in DSS data may exceed 20%. While this is not an argument in favour of poor quality data, reducing the time and valuable resources spent on detecting and correcting random errors in routine DSS operations may be justifiable as the returns from such procedures diminish with increasing overall accuracy. The money and effort currently spent on endlessly correcting DSS datasets would perhaps be better spent on increasing the surveillance population size and geographic spread of DSSs and analysing and disseminating research findings.
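The perturbation step itself is straightforward to emulate. The sketch below (Python; the field, error rate, and corruption rule are illustrative stand-ins for the study's error-introducing programmes) corrupts a chosen field in a random fraction of records and compares a summary statistic before and after.

import random
import statistics

def inject_errors(records, field, error_rate, corrupt, seed=0):
    # Return a copy of `records` with `field` randomly corrupted in an
    # expected fraction `error_rate` of the records.
    rng = random.Random(seed)
    out = []
    for rec in records:
        rec = dict(rec)
        if rng.random() < error_rate:
            rec[field] = corrupt(rec[field], rng)
        out.append(rec)
    return out

rng = random.Random(1)
population = [{"age": rng.randint(0, 80), "sex": rng.choice("MF")}
              for _ in range(50_000)]
noisy = inject_errors(population, "age", 0.20,
                      lambda age, r: max(0, age + r.randint(-10, 10)))
print("mean age, original:", statistics.mean(r["age"] for r in population))
print("mean age, 20% errors:", statistics.mean(r["age"] for r in noisy))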
Buchbinder, Rachelle; Richards, Bethan; Harris, Ian
2014-03-01
Over the last decade, there has been increased recognition of the importance of high-quality randomized controlled trials in determining the role of surgery for knee osteoarthritis. This review highlights key findings from the best available studies, and considers whether or not this knowledge has resulted in better evidence-based care. Use of arthroscopy to treat knee osteoarthritis has not declined despite strong evidence-based recommendations that do not sanction its use. A large randomized controlled trial has demonstrated that arthroscopic partial meniscectomy followed by a standardized physical therapy program results in similar improvements in pain and function at 6 and 12 months in comparison to physical therapy alone in patients with knee osteoarthritis and a symptomatic meniscal tear, confirming the findings of two previous trials. Two recent randomized controlled trials have demonstrated that decision aids help people to reach better-informed decisions about total knee arthroplasty. A majority of studies have indicated that for people with obesity the positive results of total knee arthroplasty may be compromised by postoperative complications, particularly infection. More efforts are needed to overcome significant evidence-practice gaps in the surgical management of knee osteoarthritis, particularly arthroscopy. Decision aids are a promising tool.
Petruzzi, M; Grassi, F R; Nardi, G M; Martinelli, D; Serpico, R; Luglie, P F; Baldoni, E
2010-01-01
Candidiasis is a relevant problem in oral medicine practice. We compared the antimycotic activity of nystatin with that of a solution of sodium iodide associated with salicylic acid (SISA) in the topical management of chronic candidiasis. Consecutive patients affected by chronic candidiasis were randomly allocated to SISA (group A) or nystatin (group B). VAS and swab scores were recorded at the beginning and at the end of the study, while the healing index was evaluated at the end of the study only. Data were analyzed with STATA 10 MP. Forty patients (20 male, 20 female) were randomized. SISA was as effective as nystatin in improving the VAS score (p > 0.05) and the swab score (p > 0.05). A statistically significant reduction (p < 0.05) of the healing index was observed in both groups. No side effects were reported. Topical SISA application shows efficacy comparable to nystatin in the management of chronic oral candidiasis. Its use could represent an adequate alternative to nystatin, above all in cases of drug resistance. Further large-scale randomized trials are warranted to confirm these preliminary findings.
NASA Astrophysics Data System (ADS)
Chen, Jie; Brissette, François P.; Lucas-Picher, Philippe
2016-11-01
Given the ever increasing number of climate change simulations being carried out, it has become impractical to use all of them to cover the uncertainty of climate change impacts. Various methods have been proposed to optimally select subsets of a large ensemble of climate simulations for impact studies. However, the behaviour of optimally-selected subsets of climate simulations for climate change impacts is unknown, since the transfer process from climate projections to the impact study world is usually highly non-linear. Consequently, this study investigates the transferability of optimally-selected subsets of climate simulations in the case of hydrological impacts. Two different methods were used for the optimal selection of subsets of climate scenarios, and both were found to be capable of adequately representing the spread of selected climate model variables contained in the original large ensemble. However, in both cases, the optimal subsets had limited transferability to hydrological impacts. To capture a similar variability in the impact model world, many more simulations have to be used than those that are needed to simply cover variability from the climate model variables' perspective. Overall, both optimal subset selection methods were better than random selection when small subsets were selected from a large ensemble for impact studies. However, as the number of selected simulations increased, random selection often performed better than the two optimal methods. To ensure adequate uncertainty coverage, the results of this study imply that selecting as many climate change simulations as possible is the best avenue. Where this was not possible, the two optimal methods were found to perform adequately.
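The abstract does not name the two optimal selection methods; one common clustering-based strategy for picking representative simulations from a large ensemble is sketched below under that assumption, with simulations as rows and climate variables as columns.

```python
import numpy as np
from sklearn.cluster import KMeans

def select_subset(X, k, seed=0):
    """Pick k representative rows of X (simulations x climate
    variables) by k-means clustering: the member nearest each
    cluster centroid is retained. This is one common
    optimal-selection strategy; the study's own two methods
    may differ."""
    X = np.asarray(X, float)
    X = (X - X.mean(0)) / X.std(0)            # standardize variables
    km = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(X)
    chosen = {int(np.argmin(((X - c) ** 2).sum(1)))
              for c in km.cluster_centers_}   # deduplicated indices
    return sorted(chosen)
```

The study's central caveat applies regardless of the selection rule: a subset that spans the climate-variable space need not span the space of hydrological responses.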
Xu, Jiuping; Feng, Cuiying
2014-01-01
This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method.
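For the defuzzification step, one form of the expected value operator with an optimistic-pessimistic index that appears in the fuzzy-programming literature is sketched below for a triangular fuzzy number; the paper's exact operator may differ.

```python
def lambda_expected_value(a, b, c, lam=0.5):
    """Defuzzify a triangular fuzzy number (a, b, c) with an
    optimistic-pessimistic index lam in [0, 1].

    One common form in the fuzzy-programming literature:
        E_lam = ((1 - lam) * a + b + lam * c) / 2
    lam = 0.5 recovers the classical credibilistic expected value
    (a + 2b + c) / 4; larger lam weights the optimistic (upper)
    bound more heavily.
    """
    return ((1 - lam) * a + b + lam * c) / 2

# e.g. a fuzzy activity duration "about 10 days, between 8 and 14":
# lambda_expected_value(8, 10, 14, lam=0.6)  -> 10.8
```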
TemperSAT: A new efficient fair-sampling random k-SAT solver
NASA Astrophysics Data System (ADS)
Fang, Chao; Zhu, Zheng; Katzgraber, Helmut G.
The set membership problem is of great importance to many applications and, in particular, database searches for target groups. Recently, an approach to speed up set membership searches based on the NP-hard constraint-satisfaction problem (random k-SAT) has been developed. However, the bottleneck of the approach lies in finding the solution to a large SAT formula efficiently and, in particular, a large number of independent solutions is needed to reduce the probability of false positives. Unfortunately, traditional random k-SAT solvers such as WalkSAT are biased when seeking solutions to the Boolean formulas. By porting parallel tempering Monte Carlo to the sampling of binary optimization problems, we introduce a new algorithm (TemperSAT) whose performance is comparable to current state-of-the-art SAT solvers for large k with the added benefit that theoretically it can find many independent solutions quickly. We illustrate our results by comparing to the currently fastest implementation of WalkSAT, WalkSATlm.
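A minimal sketch of parallel tempering applied to k-SAT sampling, in the spirit of (but not reproducing) TemperSAT: the energy is the number of unsatisfied clauses, replicas run Metropolis sweeps at different inverse temperatures, and neighbouring replicas exchange configurations.

```python
import math, random

def energy(assign, clauses):
    """Number of unsatisfied clauses; literal v > 0 means variable
    v-1 is True, v < 0 means variable |v|-1 is False."""
    return sum(not any((lit > 0) == assign[abs(lit) - 1] for lit in cl)
               for cl in clauses)

def temper_sat(clauses, n_vars, betas, sweeps=1000, seed=0):
    """Minimal parallel-tempering sampler for k-SAT (a sketch of the
    idea, not the authors' implementation). Returns the set of
    distinct satisfying assignments found."""
    rng = random.Random(seed)
    reps = [[rng.random() < 0.5 for _ in range(n_vars)] for _ in betas]
    solutions = set()
    for _ in range(sweeps):
        for r, beta in zip(reps, betas):          # Metropolis sweeps
            for _ in range(n_vars):
                i = rng.randrange(n_vars)
                e0 = energy(r, clauses)
                r[i] = not r[i]
                dE = energy(r, clauses) - e0
                if dE > 0 and rng.random() >= math.exp(-beta * dE):
                    r[i] = not r[i]               # reject the flip
            if energy(r, clauses) == 0:
                solutions.add(tuple(r))
        for j in range(len(betas) - 1):           # replica exchange
            dB = betas[j + 1] - betas[j]
            dE = energy(reps[j + 1], clauses) - energy(reps[j], clauses)
            if rng.random() < min(1.0, math.exp(dB * dE)):
                reps[j], reps[j + 1] = reps[j + 1], reps[j]
    return solutions
```

The high-temperature replicas keep the chain mixing across the solution space, which is what lets a tempering scheme collect many independent solutions where a local solver like WalkSAT gets biased toward a few attractors.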
Follow-up of colorectal cancer patients after resection with curative intent-the GILDA trial.
Grossmann, Erik M; Johnson, Frank E; Virgo, Katherine S; Longo, Walter E; Fossati, Rolando
2004-01-01
Surgery remains the primary treatment of colorectal cancer. Data are lacking to delineate the optimal surveillance strategy following resection. A large-scale multicenter European study, the Gruppo Italiano di Lavoro per la Diagnosi Anticipata (GILDA) trial, is underway to address this issue. Following primary surgery with curative intent, and after stratification and randomization at GILDA headquarters, colon cancer patients are assigned to a more intensive or a less intensive surveillance regimen. Rectal cancer patients undergoing curative resection are similarly randomized, with their follow-up regimens placing more emphasis on detection of local recurrence. Target recruitment for the study is 1500 patients, to achieve a statistical power of 80% (assuming an alpha of 0.05 and a hazard-rate reduction of >24%). Since the trial opened in 1998, 985 patients had been randomized from 41 centers as of February 2004: 496 patients to the less intensive regimens and 489 to the more intensive regimens. The mean duration of follow-up is 14 months. As of February 2004, 75 relapses (15%) and 32 deaths (7%) had been observed in the two more intensive follow-up arms, and 64 relapses (13%) and 24 deaths (5%) in the two less intensive arms. This trial should provide the first evidence, based on an adequately powered randomized trial, on the optimal follow-up strategy for colorectal cancer patients. The trial is open to US centers, and recruitment continues.
NASA Astrophysics Data System (ADS)
Sato, Haruo; Fehler, Michael C.
2016-10-01
The envelope broadening and the peak delay of the S-wavelet of a small earthquake with increasing travel distance are results of scattering by random velocity inhomogeneities in the earth medium. As a simple mathematical model, Sato proposed a new stochastic synthesis of the scalar wavelet envelope in 3-D von Kármán type random media when the centre wavenumber of the wavelet is in the power-law spectral range of the random velocity fluctuation. The essential idea is to split the random medium spectrum into two components using the centre wavenumber as a reference: the long-scale (low-wavenumber spectral) component produces the peak delay and the envelope broadening by multiple scattering around the forward direction; the short-scale (high-wavenumber spectral) component attenuates wave amplitude by wide angle scattering. The former is calculated by the Markov approximation based on the parabolic approximation and the latter is calculated by the Born approximation. Here, we extend the theory for the envelope synthesis of a wavelet in 2-D random media, which makes it easy to compare with finite difference (FD) simulation results. The synthetic wavelet envelope is analytically written by using the random medium parameters in the angular frequency domain. For the case that the power spectral density function of the random velocity fluctuation has a steep roll-off at large wavenumbers, the envelope broadening is small and frequency independent, and scattering attenuation is weak. For the case of a small roll-off, however, the envelope broadening is large and increases with frequency, and the scattering attenuation is strong and increases with frequency. As a preliminary study, we compare synthetic wavelet envelopes with the average of FD simulation wavelet envelopes in 50 synthesized random media, which are characterized by the RMS fractional velocity fluctuation ε = 0.05, correlation scale a = 5 km and the background wave velocity V0 = 4 km s⁻¹. We use the radiation of a 2 Hz Ricker wavelet from a point source. For all the cases of von Kármán order κ = 0.1, 0.5 and 1, we find the synthetic wavelet envelopes are a good match to the characteristics of FD simulation wavelet envelopes in a time window starting from the onset through the maximum peak to the time when the amplitude decreases to half the peak amplitude.
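For reference, the von Kármán power spectral density of the fractional velocity fluctuation is usually written in the following textbook forms, quoted here as background; the paper's exact normalization may differ.

```latex
% von Karman power spectral densities of the fractional velocity
% fluctuation (standard textbook forms; the paper's normalization
% may differ)
P_{3\mathrm{D}}(m) =
  \frac{8\pi^{3/2}\,\Gamma(\kappa + 3/2)\,\varepsilon^{2}a^{3}}
       {\Gamma(\kappa)\,\bigl(1 + a^{2}m^{2}\bigr)^{\kappa + 3/2}},
\qquad
P_{2\mathrm{D}}(m) =
  \frac{4\pi\kappa\,\varepsilon^{2}a^{2}}
       {\bigl(1 + a^{2}m^{2}\bigr)^{\kappa + 1}}
```

For am ≫ 1 these spectra roll off as m^(-(2κ+d)) in d dimensions, which is exactly the roll-off steepness the abstract ties to envelope broadening and scattering attenuation: larger κ suppresses short-scale heterogeneity.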
Abis, Gabor S A; Stockmann, Hein B A C; van Egmond, Marjolein; Bonjer, Hendrik J; Vandenbroucke-Grauls, Christina M J E; Oosterling, Steven J
2013-12-01
Gastrointestinal surgery is associated with a high incidence of infectious complications. Selective decontamination of the digestive tract is an antimicrobial prophylaxis regimen that aims to eradicate gastrointestinal carriage of potentially pathogenic microorganisms and represents an adjunct to regular prophylaxis in surgery. Relevant studies were identified using bibliographic searches of MEDLINE, EMBASE, and the Cochrane database (period from 1970 to November 1, 2012). Only studies investigating selective decontamination of the digestive tract in gastrointestinal surgery were included. Two randomized clinical trials and one retrospective case-control trial showed significant benefit in terms of infectious complications and anastomotic leakage in colorectal surgery. Two randomized controlled trials in esophageal surgery and two randomized clinical trials in gastric surgery reported lower levels of infectious complications. Selective decontamination of the digestive tract reduces infections following esophageal, gastric, and colorectal surgeries and also appears to have beneficial effects on anastomotic leakage in colorectal surgery. We believe these results provide the basis for a large multicenter prospective study to investigate the role of selective decontamination of the digestive tract in colorectal surgery.
Wearn, Oliver R.; Rowcliffe, J. Marcus; Carbone, Chris; Bernard, Henry; Ewers, Robert M.
2013-01-01
The proliferation of camera-trapping studies has led to a spate of extensions in the known distributions of many wild cat species, not least in Borneo. However, we still do not have a clear picture of the spatial patterns of felid abundance in Southeast Asia, particularly with respect to the large areas of highly-disturbed habitat. An important obstacle to increasing the usefulness of camera trap data is the widespread practice of setting cameras at non-random locations. Non-random deployment interacts with non-random space-use by animals, causing biases in our inferences about relative abundance from detection frequencies alone. This may be a particular problem if surveys do not adequately sample the full range of habitat features present in a study region. Using camera-trapping records and incidental sightings from the Kalabakan Forest Reserve, Sabah, Malaysian Borneo, we aimed to assess the relative abundance of felid species in highly-disturbed forest, as well as investigate felid space-use and the potential for biases resulting from non-random sampling. Although the area has been intensively logged over three decades, it was found to still retain the full complement of Bornean felids, including the bay cat Pardofelis badia, a poorly known Bornean endemic. Camera-trapping using strictly random locations detected four of the five Bornean felid species and revealed inter- and intra-specific differences in space-use. We compare our results with an extensive dataset of >1,200 felid records from previous camera-trapping studies and show that the relative abundance of the bay cat, in particular, may have previously been underestimated due to the use of non-random survey locations. Further surveys for this species using random locations will be crucial in determining its conservation status. We advocate the more wide-spread use of random survey locations in future camera-trapping surveys in order to increase the robustness and generality of inferences that can be made. PMID:24223717
Magneto-transport properties of a random distribution of few-layer graphene patches
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iacovella, Fabrice; Mitioglu, Anatolie; Pierre, Mathieu
In this study, we address the electronic properties of conducting films constituted of an array of randomly distributed few-layer graphene patches and investigate their most salient galvanomagnetic features in the moderately and strongly disordered limits. We demonstrate that, in annealed devices, the ambipolar behaviour and the onset of Landau level quantization in high magnetic field constitute robust hallmarks of few-layer graphene films. In the strong disorder limit, however, the magneto-transport properties are best described by variable-range hopping behaviour. A large negative magneto-conductance is observed at the charge neutrality point, consistent with a localized transport regime.
Genetically determined schizophrenia is not associated with impaired glucose homeostasis.
Polimanti, Renato; Gelernter, Joel; Stein, Dan J
2018-05-01
Here, we used data from large genome-wide association studies to test for causal relationships, conducting a Mendelian randomization analysis, and for shared molecular mechanisms, calculating the genetic correlation, among schizophrenia, type 2 diabetes (T2D), and impaired glucose homeostasis. Although our Mendelian randomization analysis was well powered, no causal relationship was observed between schizophrenia and T2D or traits related to impaired glucose homeostasis. Similarly, we did not observe any global genetic overlap among these traits. These findings indicate that there are no causal relationships or shared mechanisms between schizophrenia and impaired glucose homeostasis. Copyright © 2017 Elsevier B.V. All rights reserved.
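For readers unfamiliar with the method, the core of a well-powered two-sample Mendelian randomization analysis is often the inverse-variance-weighted estimator, sketched here for illustration; the authors' pipeline is not described in the abstract.

```python
import math

def ivw_mr(beta_exposure, beta_outcome, se_outcome):
    """Inverse-variance-weighted Mendelian randomization estimate
    (standard textbook estimator). Each element corresponds to one
    genetic instrument (SNP): its effect on the exposure, its effect
    on the outcome, and the standard error of the latter."""
    w = [bx ** 2 / sy ** 2 for bx, sy in zip(beta_exposure, se_outcome)]
    ratios = [by / bx for bx, by in zip(beta_exposure, beta_outcome)]
    beta = sum(wi * ri for wi, ri in zip(w, ratios)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    z = beta / se            # ~ N(0, 1) under the null of no causal effect
    return beta, se, z
```

A null result like the one reported corresponds to z statistics that stay well inside the standard normal acceptance region despite many instruments.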
The Multi-Orientable Random Tensor Model, a Review
NASA Astrophysics Data System (ADS)
Tanasa, Adrian
2016-06-01
After its introduction (initially within a group field theory framework) in [Tanasa A., J. Phys. A: Math. Theor. 45 (2012), 165401, 19 pages, arXiv:1109.0694], the multi-orientable (MO) tensor model has grown over recent years into a solid alternative to the celebrated colored (and colored-like) random tensor models. In this paper we review the most important results of the study of this MO model: the implementation of the 1/N expansion and of the large N limit (N being the size of the tensor), the combinatorial analysis of the various terms of this expansion, and finally the recent implementation of a double scaling limit.
Multivitamin/multimineral supplements for cancer prevention: implications for primary care practice.
Hardy, Mary L; Duvall, Karen
2015-01-01
There is a popular belief that multivitamin and mineral (MVM) supplements can help prevent cancer and other chronic diseases. Studies evaluating the effects of MVM supplements on cancer risk have largely been observational, with considerable methodologic limitations, and with conflicting results. We review evidence from the few available randomized, controlled trials that assessed the effects of supplements containing individual vitamins, a combination of a few select vitamins, or complete MVM supplements, with a focus on the recent Physicians' Health Study II (PHS II). PHS II is a landmark trial that followed generally healthy middle-aged and older men (mean age 64 years) who were randomized to daily MVM supplementation for a mean duration of 11 years. Men taking MVMs experienced a statistically significant 8% reduction in incidence of total cancer (hazard ratio [HR]: 0.92; 95% confidence interval [CI]: 0.86-0.998; p = 0.04). Men with a history of cancer derived an even greater benefit: cancer incidence was 27% lower with MVM supplementation versus placebo in this subgroup (HR: 0.73; 95% CI: 0.56-0.96; p = 0.02). Positive results of PHS II contrast with randomized studies of individual vitamins or small combinations of vitamins, which have largely shown a neutral effect, and in some cases, an adverse effect, on cancer risk. The results of PHS II may have a considerable public health impact, potentially translating to prevention of approximately 68 000 cancers per year if all men were to use similar supplements, and to an even greater benefit with regard to secondary prevention of cancer.
Back, Christopher; Dearman, Bronwyn; Li, Amy; Neild, Tim; Greenwood, John E
2009-01-01
Randomized controlled trials investigating the efficacy of noncultured keratinocyte/melanocyte suspensions are scarce in the literature; however, advocates of such techniques press the value of their application based largely on case studies and anecdote. Caucasian patients with burn hypopigmentation seldom request cosmetic revision, which makes worthwhile clinical trials difficult to mount, so informal case treatments with new therapies generate only anecdotal results. A randomized, placebo-controlled trial was carried out to evaluate whether cosuspensions of noncultured skin cells are capable of (1) decreasing the time to reepithelialization and (2) reestablishing pigmentation in vitiligo leukoderma following epidermal/superficial dermal ablation (in the knowledge that a positive result would make the technique likely to succeed in burn hypopigmentation). Vitiligo is common and socially more debilitating, so suitable trial subjects for new therapies are more forthcoming from this pool. This study demonstrated that suspensions of noncultured keratinocytes and melanocytes do not decrease the time to epithelialization of superficial partial-thickness wounds compared with controls. It also suggested that the achievement, quality, and duration of any pigmentation were unpredictable and largely disappointing. Some pigmentation was recorded in placebo-treated areas, indicating an effect of the method of epidermal ablation in these patients. These findings have mandated a complete review of the use of these techniques in burn care at the Royal Adelaide Hospital; they have been omitted from surgical protocols where the aim of use was to speed reepithelialization. Their infrequent use in burn hypopigmentation will continue, contingent on the successful repigmentation of a test patch.
Carlson, Mike; Vigen, Cheryl Lp; Rubayi, Salah; Blanche, Erna Imperatore; Blanchard, Jeanine; Atkins, Michal; Bates-Jensen, Barbara; Garber, Susan L; Pyatak, Elizabeth A; Diaz, Jesus; Florindez, Lucia I; Hay, Joel W; Mallinson, Trudy; Unger, Jennifer B; Azen, Stanley Paul; Scott, Michael; Cogan, Alison; Clark, Florence
2017-04-17
Medically serious pressure injuries (MSPrIs), a common complication of spinal cord injury (SCI), have devastating consequences for health and well-being and are extremely expensive to treat. We aimed to test the efficacy of a lifestyle-based intervention designed to reduce the incidence of MSPrIs in adults with SCI. The design was a randomized controlled trial (RCT) plus a separate study wing involving a nonrandomized standard-care control group. The setting was Rancho Los Amigos National Rehabilitation Center, a large facility serving ethnically diverse, low-income residents of Los Angeles County. Participants were adults with SCI and a history of one or more MSPrIs over the past 5 years: N=166 for the RCT component, N=66 in the nonrandomized control group. The intervention was the Pressure Ulcer Prevention Program, a 12-month lifestyle-based treatment administered by healthcare professionals, largely via in-home visits and phone contacts. Outcomes were blinded assessments of annualized MSPrI incidence rates at 12 and 24 months, based on skin checks, quarterly phone interviews with participants, and review of medical charts and billing records. Secondary outcomes included number of surgeries and various quality-of-life measures. Annualized MSPrI rates did not differ significantly between study groups. At 12 months, rates were 0.56 for intervention recipients, 0.48 for randomized controls, and 0.65 for nonrandomized controls. At follow-up, rates were 0.44 and 0.39, respectively, for randomized intervention and control participants. Evidence for intervention efficacy was inconclusive. The intractable nature of MSPrI threat in high-risk SCI populations, and lack of statistical power, may have contributed to this inability to detect an effect. ClinicalTrials.gov NCT01999816.
Rietbergen, Charlotte; Stefansdottir, Gudrun; Leufkens, Hubert G; Knol, Mirjam J; De Bruin, Marie L; Klugkist, Irene
2017-01-01
The current system of harm assessment of medicines has been criticized for relying on intuitive expert judgment. There is a call for more quantitative approaches and transparency in decision-making. Illustrated with the case of cardiovascular safety concerns for rosiglitazone, we aimed to explore a structured procedure for the collection, quality assessment, and statistical modeling of safety data from observational and randomized studies. We distinguished five stages in the synthesis process. In Stage I, the general research question, population and outcome, and general inclusion and exclusion criteria are defined and a systematic search is performed. Stage II focuses on the identification of sub-questions examined in the included studies and the classification of the studies into the different categories of sub-questions. In Stage III, the quality of the identified studies is assessed. Coding and data extraction are performed in Stage IV. Finally, meta-analyses on the study results per sub-question are performed in Stage V. A Pubmed search identified 30 randomized and 14 observational studies meeting our search criteria. From these studies, we identified 4 higher-level sub-questions and 4 lower-level sub-questions. We were able to categorize 29 individual treatment comparisons into one or more of the sub-question categories, and selected study duration as an important covariate. We extracted covariate, outcome, and sample size information at the treatment-arm level of the studies. We extracted absolute numbers of myocardial infarctions from the randomized studies, and adjusted risk estimates with 95% confidence intervals from the observational studies. Overall, few events were observed in the randomized studies, which were frequently of relatively short duration. The large observational studies provided more information, since these were often of longer duration. A Bayesian random effects meta-analysis on these data showed no significant increase in risk for rosiglitazone for any of the sub-questions. The proposed procedure can be of additional value for drug safety assessment because it provides a stepwise approach that guides decision-making and increases process transparency. The procedure allows for the inclusion of results from both randomized and observational studies, which is especially relevant for this type of research.
Hiremath, Swapnil; Dangas, George; Mehran, Roxana; Brar, Simerjeet K.; Leon, Martin B.
2009-01-01
Background and objectives: Infusion of sodium bicarbonate has been suggested as a preventative strategy, but reports are conflicting on its efficacy. The aim of this study was to assess the effectiveness of hydration with sodium bicarbonate for the prevention of contrast-induced acute kidney injury (CI-AKI). Design, setting, participants, & measurements: Medline, EMBASE, the Cochrane Library, and the Internet were searched for randomized controlled trials comparing hydration with sodium bicarbonate versus sodium chloride for the prevention of CI-AKI between 1966 and November 2008. Fourteen trials that included 2290 patients were identified. There was significant heterogeneity between studies (P for heterogeneity = 0.02; I² = 47.8%), which was largely accounted for by trial size (P = 0.016). Trials were therefore classified by size. Results: Three trials were categorized as large (n = 1145) and 12 as small (n = 1145). Among the large trials, the incidence of CI-AKI for sodium bicarbonate and sodium chloride was 10.7% and 12.5%, respectively; the relative risk (RR) [95% confidence interval (CI)] was 0.85 (0.63 to 1.16) without evidence of heterogeneity (P = 0.89, I² = 0%). The pooled RR (95% CI) among the 12 small trials was 0.50 (0.27 to 0.93) with significant between-trial heterogeneity (P = 0.01; I² = 56%). The small trials were more likely to be of lower methodological quality. Conclusions: Significant clinical and statistical heterogeneity was observed that was largely explained by trial size and publication status. Among the large randomized trials there was no evidence of benefit for hydration with sodium bicarbonate compared with sodium chloride for the prevention of CI-AKI. The benefit of sodium bicarbonate was limited to small trials of lower methodological quality. PMID:19713291
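A frequentist sketch of the kind of random-effects pooling reported here (DerSimonian-Laird), computing the pooled RR, its 95% CI, and the I² heterogeneity statistic; the review's own software is not specified.

```python
import math

def dersimonian_laird(log_rr, se):
    """Random-effects pooling of per-study log relative risks.
    Returns the pooled RR, its 95% CI, and I² (percent of variation
    attributable to between-study heterogeneity)."""
    w = [1 / s ** 2 for s in se]                      # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)
    Q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rr))
    df = len(log_rr) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (Q - df) / c)                     # between-study variance
    w_re = [1 / (s ** 2 + tau2) for s in se]
    mu = sum(wi * y for wi, y in zip(w_re, log_rr)) / sum(w_re)
    se_mu = math.sqrt(1 / sum(w_re))
    i2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0
    return (math.exp(mu),
            (math.exp(mu - 1.96 * se_mu), math.exp(mu + 1.96 * se_mu)),
            i2)
```

Feeding such a routine the large and small trials separately reproduces the review's key move: stratifying the pooled estimate by trial size when I² flags substantial heterogeneity.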
Fleckenstein, Johannes; Lill, Christian; Lüdtke, Rainer; Gleditsch, Jochen; Rasp, Gerd; Irnich, Dominik
2009-09-01
One out of four patients visiting a general practitioner reports a sore throat associated with pain on swallowing. This study was established to examine the immediate pain-alleviating effect of a single-point acupuncture treatment applied to the large intestine meridian in patients with sore throat. Sixty patients with acute tonsillitis and pharyngitis were enrolled in this randomized placebo-controlled trial. They received either acupuncture or sham laser acupuncture, directed to the large intestine meridian section between acupuncture points LI 8 and LI 10. The main outcome measure was the change in pain intensity on swallowing a sip of water, evaluated by a visual analog scale 15 minutes after treatment. A credibility assessment regarding the respective treatment was performed. Pain intensity before and immediately after therapy was 5.6 ± 2.8 and 3.0 ± 3.0 for the acupuncture group, and 5.6 ± 2.5 and 3.8 ± 2.5 for the sham group, respectively. Despite a more pronounced improvement in the acupuncture group, there was no significant difference between groups (Δ = 0.9, confidence interval: -0.2 to 2.0; P = 0.12; analysis of covariance). Patients' satisfaction was high in both treatment groups. The study was terminated prematurely owing to a subsequent lack of suitable patients. A single acupuncture treatment applied to a selected area of the large intestine meridian was no more effective in alleviating pain associated with clinical sore throat than sham laser acupuncture applied to the same area. Nevertheless, clinically relevant improvement was achieved in both groups. Pain alleviation might partly be due to the intense palpation of the large intestine meridian. The benefit of a comprehensive acupuncture treatment protocol in this condition should be subject to further trials.
Zhang, Guo-Qiang; Tao, Shiqiang; Xing, Guangming; Mozes, Jeno; Zonjy, Bilal; Lhatoo, Samden D
2015-01-01
Background: A unique study identifier serves as a key for linking research data about a study subject without revealing protected health information in the identifier. While sufficient for single-site and limited-scale studies, the use of common unique study identifiers has several drawbacks for large multicenter studies, where thousands of research participants may be recruited from multiple sites. An important property of study identifiers is error tolerance (or validatability), in that inadvertent editing mistakes during their transmission and use will most likely result in invalid study identifiers. Objective: This paper introduces a novel method called Randomized N-gram Hashing (NHash) for generating unique study identifiers in a distributed and validatable fashion in multicenter research. NHash has a unique set of properties: (1) it is a pseudonym serving the purpose of linking research data about a study participant for research purposes; (2) it can be generated automatically in a completely distributed fashion with virtually no risk of identifier collision; (3) it incorporates a set of cryptographic hash functions based on N-grams, combined with additional encryption techniques such as a shift cipher; (4) it is validatable (error tolerant), in the sense that inadvertent edit errors will mostly result in invalid identifiers. Methods: NHash consists of two phases. First, an intermediate string is generated using randomized N-gram hashing. This string is a concatenation of N-gram hashes f1, f2, ..., fk. The input for each function fi has three components: a random number r, an integer n, and input data m. The result, fi(r, n, m), is an n-gram of m with starting position s, computed as (r mod |m|), where |m| denotes the length of m. The output of Phase 1 is the concatenation f1(r1, n1, m1), f2(r2, n2, m2), ..., fk(rk, nk, mk). In the second phase, the intermediate string generated in Phase 1 is encrypted using techniques such as a shift cipher. The result of the encryption, concatenated with the random number r, is the final NHash study identifier. Results: We performed experiments using a large synthesized dataset comparing NHash with random strings, and demonstrated a negligible probability of collision. We implemented NHash for the Center for SUDEP Research (CSR), a National Institute of Neurological Disorders and Stroke-funded Center Without Walls for Collaborative Research in the Epilepsies. This multicenter collaboration involves 14 institutions across the United States and Europe, bringing together extensive and diverse expertise to understand sudden unexpected death in epilepsy patients (SUDEP). Conclusions: The CSR Data Repository has successfully used NHash to link deidentified multimodal clinical data collected in participating CSR institutions, meeting all desired objectives of NHash. PMID:26554419
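A toy rendering of the two phases as described above. The field values, n-gram length, cipher alphabet, and key are all hypothetical, and the wrap-around slice used for n-grams that overrun the end of the string is an assumption the abstract does not settle.

```python
def ngram(r, n, m):
    """f(r, n, m): the n-gram of string m starting at position
    s = r mod len(m). Wrap-around past the end of m is assumed;
    requires n <= len(m)."""
    s = r % len(m)
    return (m + m)[s:s + n]

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"

def shift_cipher(text, key):
    # Fields are assumed to use only A-Z and 0-9.
    return "".join(ALPHABET[(ALPHABET.index(ch) + key) % len(ALPHABET)]
                   for ch in text)

def nhash(fields, r, n=3, key=7):
    """Toy NHash-style identifier (illustrative only; the paper's
    hash functions, field choices, and cipher parameters are not
    reproduced here). Phase 1 concatenates randomized n-grams of
    each input field; Phase 2 shift-ciphers the result and appends
    the random number r so the identifier can be validated."""
    phase1 = "".join(ngram(r, n, f.upper()) for f in fields)
    return shift_cipher(phase1, key) + str(r)

# e.g. nhash(["SITE04", "SUBJECT117", "EEGSESSION2"], r=4231)
```

The appended random number is what makes validation possible: a receiver can recompute the n-grams from r and reject identifiers whose ciphered body does not match.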
Aromatherapy for the treatment of PONV in children: a pilot RCT.
Kiberd, Mathew B; Clarke, Suzanne K; Chorney, Jill; d'Eon, Brandon; Wright, Stuart
2016-11-09
Postoperative nausea and vomiting (PONV) is one of the most common postoperative complications of general anesthesia in pediatrics. Aromatherapy has been shown to be effective in treating PONV in adults. Given the encouraging results of the adult studies, we aimed to determine the feasibility of a large-scale study in the pediatric population. Our group conducted a pilot randomized controlled trial examining the effect of aromatherapy on postoperative nausea and vomiting in patients aged 4-16 years undergoing ambulatory surgery at a single center. Nausea was defined as a score of 4/10 on the Baxter Retching Faces (BARF) scale. A clinically significant reduction was defined as a two-point reduction in nausea. Postoperatively, children were administered the BARF scale at 15-minute intervals until discharge home or until a nausea score of 4/10 or greater. Children with nausea were randomized to a saline placebo group or to aromatherapy with QueaseEase™ (Soothing Scents, Inc, Enterprise, AL: a blend of ginger, lavender, mint and spearmint). Nausea scores were recorded post-intervention. A total of 162 subjects were screened for inclusion in the study. Randomization occurred in 41 subjects, of whom 39 were included in the final analysis. For the primary outcome, 14/18 (78%) of controls reached the primary outcome compared to 19/21 (90%) in the aromatherapy group (p = 0.39, eta 0.175). Other outcomes included use of antiemetics in the PACU (control 44%, aromatherapy 52%, p = 0.75, eta 0.08) and emesis (control 11%, aromatherapy 9%, p = 0.87, eta 0.03). There was a statistically significant difference in whether subjects continued to use the intervention (control 28%, aromatherapy 66%, p = 0.048, eta 0.33). Aromatherapy had a small, non-significant effect size in treating postoperative nausea and vomiting compared with control. A large-scale randomized controlled trial would not be feasible at our institution and would be of doubtful utility. ClinicalTrials.gov NCT02663154.
Assessing the sustainable construction of large construction companies in Malaysia
NASA Astrophysics Data System (ADS)
Adewale, Bamgbade Jibril; Mohammed, Kamaruddeen Ahmed; Nasrun, Mohd Nawi Mohd
2016-08-01
Considering the increasing concern for sustainability issues in construction project delivery within the construction industry, this paper assesses the extent of sustainable construction among Malaysian large contractors, in order to ascertain the level of the industry's impacts on both the environment and society. Sustainable construction describes the construction industry's responsibility to use finite resources efficiently while reducing construction impacts on both humans and the environment throughout the phases of construction. This study used proportionate stratified random sampling; the field study yielded a sample of 172 contractor responses out of 708 administered questionnaires. Data were collected from large contractors in the eleven states of peninsular Malaysia. Using a five-level rating scale (1 = Very Low; 2 = Low; 3 = Moderate; 4 = High; 5 = Very High) to describe the level of sustainable construction of Malaysian contractors, based on previous studies, statistical analysis reveals that the environmental, social, and economic sustainability of Malaysian large contractors is high.
Strand-seq: a unifying tool for studies of chromosome segregation.
Falconer, Ester; Lansdorp, Peter M
2013-01-01
Non-random segregation of sister chromatids has been implicated in specifying daughter cell fate (the Silent Sister Hypothesis [1]) and in protecting the genome of long-lived stem cells (the Immortal Strand Hypothesis [2]). The idea that sister chromatids are non-randomly segregated into specific daughter cells is only marginally supported by data from sporadic and often contradictory studies. As a result, the field has moved forward rather slowly. The advent of the ability to directly label and differentiate sister chromatids in vivo using fluorescence in situ hybridization [3] was a significant advance for such studies. However, this approach is limited by the need for large tracts of unidirectional repeats on chromosomes and by its reliance on quantitative imaging of fluorescent probes and rigorous statistical analysis to discriminate between the two competing hypotheses. A novel method called Strand-seq, which uses next-generation sequencing to assay sister chromatid inheritance patterns independently for each chromosome [4], offers a comprehensive approach to test for non-random segregation. In addition, Strand-seq enables studies of the deposition of chromatin marks in relation to DNA replication. This method is expected to help unify the field by testing previous claims of non-random segregation in an unbiased way in many model systems in vitro and in vivo. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
On supervised graph Laplacian embedding CA model & kernel construction and its application
NASA Astrophysics Data System (ADS)
Zeng, Junwei; Qian, Yongsheng; Wang, Min; Yang, Yongzhong
2017-01-01
There are many methods to construct a kernel from given data attribute information; the Gaussian radial basis function (RBF) kernel is one of the most popular. The key observation is that real-world data carry, besides attribute information, label information indicating the data class. In order to make use of both data attribute information and data label information, we propose a supervised kernel construction method in this work: supervised information from the training data is integrated into the standard kernel construction process to improve the discriminative property of the resulting kernel. As a further key application, a supervised Laplacian embedding cellular automaton model is developed for two-lane heterogeneous traffic flow with safe-distance rules and large-scale trucks. Based on the properties of traffic flow in China, we re-calibrate the cell length, velocity, random slowing mechanism, and lane-change conditions, and use simulation tests to study the relationships among speed, density, and flux. The numerical results show that large-scale trucks have great effects on the traffic flow, which depend on the proportion of large-scale trucks, the random slowing rate, and the frequency of lane changes.
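One simple way to fold label information into RBF kernel construction is sketched below, under the assumption that same-class pairs should receive boosted similarity; this is illustrative and not necessarily the authors' formulation. The product form keeps the kernel positive semidefinite by the Schur product theorem.

```python
import numpy as np

def supervised_rbf_kernel(X, y, sigma=1.0, beta=0.5):
    """RBF kernel adjusted by label agreement:

        K[i, j] = exp(-||xi - xj||^2 / (2 sigma^2)) * (1 + beta * same_label)

    Same-class training pairs get boosted similarity, sharpening the
    discriminative structure the kernel passes to downstream methods."""
    X = np.asarray(X, float)
    y = np.asarray(y)
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise distances
    K = np.exp(-sq / (2 * sigma ** 2))                   # plain RBF kernel
    same = (y[:, None] == y[None, :]).astype(float)      # label agreement
    return K * (1.0 + beta * same)
```

Setting beta = 0 recovers the unsupervised RBF kernel, so the supervision strength is a single tunable knob.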
Dronedarone: a novel antiarrhythmic agent for the treatment of atrial fibrillation.
Duray, Gabor Z; Ehrlich, Joachim R; Hohnloser, Stefan H
2010-01-01
To describe the electrophysiological profile and the clinical portfolio of dronedarone, a new multichannel-blocking antiarrhythmic drug developed for the treatment of atrial fibrillation. Dronedarone is a derivative of amiodarone that is free of iodine and less lipophilic. Like its predecessor, the drug has multichannel-blocking efficacy and, in addition, vasodilating effects. It reduces the incidence of ventricular fibrillation in several experimental models. Dronedarone has undergone thorough clinical evaluation in various patient populations. In two large trials, the drug was shown to postpone the recurrence of atrial fibrillation after cardioversion relative to placebo. In a trial in unstable heart failure patients, there was excess mortality in the dronedarone arm; this trial was stopped prematurely and prompted the conduct of a large outcome study. The ATHENA trial demonstrated a significant reduction in cardiovascular hospitalizations and death in atrial fibrillation patients randomly assigned to receive dronedarone or placebo. This large trial of more than 4600 patients revealed no signs of excess mortality or morbidity in patients receiving dronedarone. On the basis of the results of five international, multicenter, randomized clinical trials involving nearly 6300 patients, dronedarone was approved by the FDA for the treatment of nonpermanent atrial fibrillation to reduce the risk of cardiovascular hospitalization.
ERIC Educational Resources Information Center
Anderson, Daniel; Alonzo, Julie; Tindal, Gerald
2011-01-01
In this technical report, we document the results of a cross-validation study designed to identify optimal cut-scores for the use of the easyCBM[R] mathematics test in the state of Washington. A large sample, randomly split into two groups of roughly equal size, was used for this study. Students' performance classification on the Washington state…
A Cross-Validation of easyCBM[R] Mathematics Cut Scores in Oregon: 2009-2010. Technical Report #1104
ERIC Educational Resources Information Center
Anderson, Daniel; Alonzo, Julie; Tindal, Gerald
2011-01-01
In this technical report, we document the results of a cross-validation study designed to identify optimal cut-scores for the use of the easyCBM[R] mathematics test in Oregon. A large sample, randomly split into two groups of roughly equal size, was used for this study. Students' performance classification on the Oregon state test was used as the…
Colen, Sascha; van den Bekerom, Michel P J; Bellemans, Johan; Mulier, Michiel
2010-11-16
Although intra-articular hyaluronic acid is well established as a treatment for osteoarthritis of the knee, its use in hip osteoarthritis is not based on large randomized controlled trials. There is a need for more rigorously designed studies of hip osteoarthritis treatment, as this subject is still very much under debate. This is a randomized controlled trial with a three-armed, parallel-group design. Approximately 315 patients meeting the inclusion and exclusion criteria will be randomized into one of the following treatment groups: infiltration of the hip joint with hyaluronic acid, with a corticosteroid, or with 0.125% bupivacaine. The following outcome measures will be assessed at baseline, i.e., before the intra-articular injection of one of the study products, and again at six weeks and at 3 and 6 months after the initial injection: pain (100 mm VAS), Harris Hip Score and HOOS, patient assessment of clinical status (worse, stable, or better than at the time of enrollment), and intake of pain rescue medication (number per week). In addition, patients will be asked about complications/adverse events. The six-month follow-up period for all patients will begin on the date the first injection is administered. This randomized, controlled, three-arm study will hopefully provide robust information on two of the intra-articular treatments used in hip osteoarthritis, in comparison to bupivacaine. NCT01079455.
Clinicians' Views on Treatment-Resistant Depression: 2016 Survey Reports.
Arandjelovic, Katarina; Eyre, Harris A; Lavretsky, Helen
2016-10-01
There is a relative paucity of information on both empirical and subjective treatment strategies for treatment-resistant depression (TRD), especially in late life. This paper reviews the findings of two 2016 surveys conducted through the American Psychiatric Association publication Psychiatric Times and via a member survey of the American Association for Geriatric Psychiatry (AAGP). We present the results of the two surveys as descriptive frequencies and percentages and discuss the strengths and weaknesses of various approaches to late-life TRD. The Psychiatric Times survey received 468 responses and the AAGP survey received 117, giving an overall sample of 585 responses. The majority (76.3%) of respondents from both groups believed that a large randomized study comparing the risks and benefits of augmentation and switching strategies for TRD in patients aged 60 years and older would be helpful, and 80% of clinicians believed their practice would benefit from the findings of such a study. Of the treatment strategies needing evidence of efficacy, the most popular options were augmentation/combination strategies, particularly augmentation with aripiprazole (58.7%), bupropion (55.0%), and lithium (50.9%). Late-life TRD constitutes a large proportion of clinical practice, particularly in geriatric psychiatry, yet evidence of efficacy is lacking for most treatment strategies. These surveys indicate a clear need for a large randomized study comparing the risks and benefits of augmentation and switching strategies. Copyright © 2016 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.
Yilmaz, T; Cordero-Coma, M; Gallagher, M J
2012-01-01
To assess the effectiveness of ketorolac vs control for the prevention of acute pseudophakic cystoid macular edema (CME). The following databases were searched: Medline (1950 to June 11, 2011), The Cochrane Library (Issue 2, 2011), and the TRIP Database (up to June 11, 2011), with no language or other limits. Randomized controlled clinical trials (RCTs) were included that enrolled patients with acute pseudophakic cystoid macular edema, compared ketorolac with control, and had a minimum follow-up of 28 days. In the four RCTs evaluating ketorolac vs control, treatment with ketorolac significantly reduced the risk of CME development at the end of treatment (approximately 4 weeks) compared to control (P = 0.008; 95% confidence interval 0.03-0.58). When analyzed individually, each study was statistically nonsignificant in its findings, with the exception of one study; the overall statistical significance of the pooled relative risk is attributable to the review's large combined sample size rather than to the individual studies themselves. In this systematic review of four RCTs, two of which compared ketorolac with no treatment and two of which evaluated ketorolac vs placebo drops, treatment with ketorolac significantly reduced the risk of developing CME at the end of approximately 4 weeks of treatment compared with controls. These results, however, should be interpreted with caution considering the paucity of large randomized clinical trials in the literature. PMID:22094296
Alcohol assessment among college students using wireless mobile technology.
Bernhardt, Jay M; Usdan, Stuart; Mays, Darren; Martin, Ryan; Cremeens, Jennifer; Arriola, Kimberly Jacob
2009-09-01
This study used a two-group randomized design to assess the validity of measuring self-reported alcohol consumption among college students using the Handheld Assisted Network Diary (HAND), a daily diary assessment administered using wireless mobile devices. A convenience sample of college students was recruited at a large, public university in the southeastern United States and randomized into two groups. A randomly assigned group of 86 students completed the daily HAND assessment during the 30-day study and a Timeline Followback (TLFB) at 30-day follow-up. A randomly assigned group of 82 students completed the paper-and-pencil Daily Social Diary (DSD) over the same study period. Data from the daily HAND assessment were compared with the TLFB completed at follow-up by participants who completed the HAND using 95% limits of agreement analysis. Furthermore, individual growth models were used to examine differences between the HAND and DSD by comparing the total drinks, drinking days, and drinks per drinking day captured by the two assessments over the study period. Results suggest that the HAND captured similar levels of alcohol use compared with the TLFB completed at follow-up by the same participants. In addition, comparisons of the two study groups suggest that, controlling for baseline alcohol use and demographics, the HAND assessment captured similar levels of total drinks, drinking days, and drinks per drinking day as the paper-and-pencil DSD. The study findings support the validity of wireless mobile devices as a daily assessment of alcohol use among college students.
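The 95% limits of agreement analysis mentioned here follows the Bland-Altman approach; a minimal sketch, with illustrative variable names:

```python
import numpy as np

def limits_of_agreement(a, b):
    """Bland-Altman 95% limits of agreement between two measures of
    the same quantity (e.g., per-participant drink totals from two
    assessment methods). Returns the mean difference (bias) and the
    interval within which roughly 95% of differences are expected
    to fall."""
    d = np.asarray(a, float) - np.asarray(b, float)
    bias = d.mean()
    half_width = 1.96 * d.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)

# e.g. bias, (lo, hi) = limits_of_agreement(hand_totals, tlfb_totals)
```

Narrow limits centered near zero are what supports the validity claim: the two instruments disagree no more than repeated use of either would.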
ERIC Educational Resources Information Center
Anglin, Linda; Anglin, Kenneth; Schumann, Paul L.; Kaliski, John A.
2008-01-01
This study tests the use of computer-assisted grading rubrics compared to other grading methods with respect to the efficiency and effectiveness of different grading processes for subjective assignments. The test was performed on a large Introduction to Business course. The students in this course were randomly assigned to four treatment groups…
Barrett A. Garrison; Christopher D. otahal; Matthew L. Triggs
2002-01-01
Age structure and growth of California black oak (Quercus kelloggii) were determined from tagged trees at four 26.1-acre study stands in Placer County, California. Stands were dominated by large-diameter (>20 inch dbh) California black oak and ponderosa pine (Pinus ponderosa). Randomly selected trees were tagged in June-August...
McDonald and Company Securities Library User Survey, 1996.
ERIC Educational Resources Information Center
Wolfgram, Derek E.
The library of McDonald and Company Securities is important to the success of the business and its employees. This study assesses the needs and expectations of the library users, and analyzes how well the current library services are meeting those needs and expectations. A questionnaire was distributed to a large random sample of the firm's…
USDA-ARS?s Scientific Manuscript database
Benefits of plant sterols (PS) for cholesterol lowering are compromised by large variability in efficacy across individuals. High fractional cholesterol synthesis measured by deuterium incorporation has been associated with non-response to PS consumption; however, prospective studies showing this as...
ERIC Educational Resources Information Center
Bradley, Dominique; Crawford, Evan; Dahill-Brown, Sara E.
2015-01-01
Several studies suggest that values-affirmation can serve as a simple, yet powerful, tool for dramatically reducing achievement gaps. Because subtle variations in implementation procedures may explain some of the variation in these findings, it is crucial for researchers to measure the fidelity with which interventions are implemented. The authors…
Public Attitude toward Optician Education as Human Capital.
ERIC Educational Resources Information Center
Gerardi, Steven J.; Woods, Thomas A.; White, Debra R.; Hill, Roger S.
A study sought to identify what the public thinks about the appropriate level of education and training for opticians. A 10% random sample of 1,510 New York State customers (n=151) of a large multinational opticianry corporation was surveyed. Two categories of data were social background (combined annual family income, age, marital status, race,…
ERIC Educational Resources Information Center
Hafner, Lawrence E.
A study developed a multiple regression prediction equation for each of six selected achievement variables in a popular standardized test of achievement. Subjects, 42 fourth-grade pupils randomly selected across several classes in a large elementary school in a north Florida city, were administered several standardized tests to determine predictor…
ERIC Educational Resources Information Center
Lara-Alecio, Rafael; Tong, Fuhui; Irby, Beverly J.; Mathes, Patricia
2009-01-01
Using a low-inference observational instrument, the authors empirically described and compared pedagogical behaviors in bilingual and structured English-immersion programs serving Spanish-speaking English language learners in a large urban school district in Southeast Texas. The two programs included both intervention/control of each type during…
Automated Scoring of L2 Spoken English with Random Forests
ERIC Educational Resources Information Center
Kobayashi, Yuichiro; Abe, Mariko
2016-01-01
The purpose of the present study is to assess second language (L2) spoken English using automated scoring techniques. Automated scoring aims to classify a large set of learners' oral performance data into a small number of discrete oral proficiency levels. In automated scoring, objectively measurable features such as the frequencies of lexical and…
ERIC Educational Resources Information Center
Minnesota Department of Education, 2004
2004-01-01
A large and growing body of research supports the critical relationship between early childhood experiences and successful life-long outcomes. Assessing the readiness of children as they enter school is a high priority. This report describes findings from Year Two of the assessment of school readiness with a larger random sample of children…
Risk Groups in Exposure to Terror: The Case of Israel's Citizens
ERIC Educational Resources Information Center
Feniger, Yariv; Yuchtman-Yaar, Ephraim
2010-01-01
This research addresses a largely ignored question in the study of terror: who are its likely victims? An answer was sought through analysis of comprehensive data on civilian victims of terror in Israel from 1993 through 2003. The chances of being killed in seemingly random terror attacks were found unequally distributed in Israeli society, but…
Monetary and Nonmonetary Student Incentives for Tutoring Services: A Randomized Controlled Trial
ERIC Educational Resources Information Center
Springer, Matthew G.; Rosenquist, Brooks A.; Swain, Walker A.
2015-01-01
In recent years, the largely punitive accountability measures imposed by the 2001 No Child Left Behind Act have given way to an emphasis on financial incentives. Although most policy interventions have focused primarily on linking teacher compensation to student test scores, several recent studies have examined the prospects for the use of…
Psychological Health in Midlife among Women Who Have Ever Lived with a Violent Partner or Spouse
ERIC Educational Resources Information Center
Loxton, Deborah; Schofield, Margot; Hussain, Rafat
2006-01-01
This study examines the psychological health correlates of domestic violence in a large random sample of mid-aged Australian women (N = 11,310, age 47 to 52 years). Logistic regressions were used to investigate the associations between domestic violence and depression, anxiety, and psychological wellbeing, after adjusting for demographic variables…
ERIC Educational Resources Information Center
Ngoma, Muhammed; Ntale, Peter Dithan; Abaho, Earnest
2017-01-01
This article evaluates the relationship between social-economic factors, students' factors, student academic goals and performance of students. The study adopts a cross-sectional survey, with largely quantitative approaches. A sample of 950 students was randomly and proportionately drawn from undergraduates in four institutions of higher learning.…
Human Capital Background and the Educational Attainment of Second-Generation Immigrants in France
ERIC Educational Resources Information Center
Dos Santos, Manon Domingues; Wolff, Francois-Charles
2011-01-01
In this paper, we study the impact of parental human capital background on ethnic educational gaps between second-generation immigrants using a large data set collected in France in 2003. Estimates from censored random-effect ordered probit regressions show that the skills of immigrants explain, for the most part, the ethnic educational gap between…
ERIC Educational Resources Information Center
Emmett, Joshua
2013-01-01
The purpose of this qualitative research study was to discover the influence of a student achievement program implemented at one large urban high school that employed extrinsic motivation to promote student achievement on state assessments. Using organismic integration theory as the theoretical framework, 19 randomly selected students participated…
Ownership and ecosystem as sources of spatial heterogeneity in a forested landscape, Wisconsin, USA
Thomas R. Crow; George E. Host; David J. Mladenoff
1999-01-01
The interaction between physical environment and land ownership in creating spatial heterogeneity was studied in largely forested landscapes of northern Wisconsin, USA. A stratified random approach was used in which 2500-ha plots representing two ownerships (National Forest and private non-industrial) were located within two regional ecosystems (extremely well-drained...
Influence of a Message's Reception as a Function of Perceived Feminist Authorship.
ERIC Educational Resources Information Center
Laux, John M.; Newman, Isadore
This paper discusses a study that was conducted in order to add to the body of literature that investigates the manner in which feminist psychology is accepted among education graduate students. Graduate students (N=69) at a large public mid-western university were recruited and randomly assigned to one of four treatment groups. Participants were…
Investigating the Relationship between School Level and a School Growth Mindset
ERIC Educational Resources Information Center
Hanson, Janet; Ruff, William; Bangert, Arthur
2016-01-01
This study explored the relationship between school level and the psychosocial construct of a growth mindset school culture. Data were collected with the What's My School Mindset (WMSM) Survey from a stratified random sample of PK-12 faculty and administrators (n = 347) in 30 schools across a large northwestern state. The overarching research…
Isonymy structure of four Venezuelan states.
Rodríguez-Larralde, A; Barrai, I; Alfonzo, J C
1993-01-01
The isonymy structure of four Venezuelan states (Falcón, Mérida, Nueva Esparta and Yaracuy) was studied using the surnames of the Venezuelan register of electors updated in 1984. The surname distributions of 155 counties were obtained and, for each county, estimates of consanguinity due to random isonymy and Fisher's alpha were calculated. It was shown that for large sample sizes the inverse of Fisher's alpha is identical to the unbiased estimate of within-population random isonymy. A three-dimensional isometric surface plot was obtained for each state, based on the counties' random isonymy estimates. The highest estimates of random consanguinity were found in the states of Nueva Esparta and Mérida, while the lowest were found in Yaracuy. Other microdifferentiation indicators from the same data gave similar results, and an interpretation was attempted, based on the particular economic and geographic characteristics of each state. Four different genetic distances between all possible pairs of counties were calculated within states; geographic distance showed the highest correlations with random isonymy and Euclidean distance, with the exception of the state of Nueva Esparta, where there is no correlation between geographic distance and random isonymy. It was possible to group counties in clusters, from dendrograms based on Euclidean distance. Isonymy clustering was also consistent with the socioeconomic and geographic characteristics of the counties.
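For readers unfamiliar with the quantities above, a minimal sketch of the two key estimators follows, in Python with invented surname counts (not data from the Venezuelan register): the unbiased within-population random isonymy, and Fisher's alpha recovered from its large-sample relation to the inverse of that estimate.

```python
from collections import Counter

def random_isonymy(surnames):
    """Unbiased within-population random isonymy: the probability that
    two electors drawn without replacement share a surname,
    I = sum_i n_i (n_i - 1) / (N (N - 1))."""
    counts = Counter(surnames)
    n = sum(counts.values())
    return sum(k * (k - 1) for k in counts.values()) / (n * (n - 1))

# Toy county register (hypothetical data, for illustration only).
county = ["Perez"] * 40 + ["Gomez"] * 25 + ["Rojas"] * 10 + ["Marin"] * 5
i_r = random_isonymy(county)
# For large samples, 1 / I approximates Fisher's alpha (per the abstract).
print(f"random isonymy I = {i_r:.4f}, Fisher's alpha ~ {1 / i_r:.1f}")
```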
A generator for unique quantum random numbers based on vacuum states
NASA Astrophysics Data System (ADS)
Gabriel, Christian; Wittmann, Christoffer; Sych, Denis; Dong, Ruifang; Mauerer, Wolfgang; Andersen, Ulrik L.; Marquardt, Christoph; Leuchs, Gerd
2010-10-01
Random numbers are a valuable component in diverse applications that range from simulations over gambling to cryptography. The quest for true randomness in these applications has engendered a large variety of different proposals for producing random numbers based on the foundational unpredictability of quantum mechanics. However, most approaches do not consider that a potential adversary could have knowledge about the generated numbers, so the numbers are not verifiably random and unique. Here we present a simple experimental setup based on homodyne measurements that uses the purity of a continuous-variable quantum vacuum state to generate unique random numbers. We use the intrinsic randomness in measuring the quadratures of a mode in the lowest energy vacuum state, which cannot be correlated to any other state. The simplicity of our source and its verifiably unique randomness are important attributes for achieving high-reliability, high-speed and low-cost quantum random number generators.
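A minimal numerical sketch of the bit-extraction step, under the assumption (as in the abstract) that ideal vacuum quadratures are Gaussian distributed; the sign-thresholding rule and sample count are illustrative choices of mine, and a practical generator would follow the raw bits with randomness extraction such as hashing.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for homodyne quadrature samples of the vacuum state:
# ideal vacuum quadratures are Gaussian, variance set by shot noise.
samples = rng.normal(loc=0.0, scale=1.0, size=10_000)

# Simplest binning: one raw bit per sample from the sign of the quadrature.
raw_bits = (samples > 0).astype(np.uint8)

# Real devices follow this with randomness extraction (e.g. hashing)
# to remove residual bias and classical-noise correlations.
print("raw bit bias:", abs(raw_bits.mean() - 0.5))
```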
Effective dynamics of a random walker on a heterogeneous ring: Exact results
NASA Astrophysics Data System (ADS)
Masharian, S. R.
2018-07-01
In this paper, by considering a biased random walker hopping on a one-dimensional lattice with a ring geometry, we investigate the fluctuations of the speed of the random walker. We assume that the lattice is heterogeneous, i.e. the hopping rate of the random walker between the first and the last lattice sites differs from the hopping rate between the other links of the lattice. Assuming that the average speed of the random walker in the steady state is v∗, we have been able to find the unconditional effective dynamics of the random walker whose average speed is -v∗. Using a perturbative method in the large system-size limit, we have also been able to show that the effective hopping rates of the random walker near the defective link are highly site-dependent.
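The setup lends itself to a quick Monte Carlo check. The sketch below, with rates and system size invented for illustration, simulates a continuous-time biased walker on a ring whose link between the last and first sites carries its own (defective) hopping rates, and estimates the steady-state speed from the winding number.

```python
import random

def ring_walker_speed(L=20, p=1.0, q=0.5, p_def=0.2, q_def=0.1,
                      hops=200_000, seed=0):
    """Estimate the steady-state speed of a biased walker on a ring of L
    sites; the link between site L-1 and site 0 has its own rates."""
    rng = random.Random(seed)
    site, winding, t = 0, 0, 0.0
    for _ in range(hops):
        fwd = p_def if site == L - 1 else p   # forward rate on this link
        bwd = q_def if site == 0 else q       # backward rate on this link
        total = fwd + bwd
        t += rng.expovariate(total)           # Gillespie waiting time
        if rng.random() < fwd / total:        # hop forward
            site += 1
            if site == L:
                site, winding = 0, winding + 1
        else:                                 # hop backward
            if site == 0:
                site, winding = L - 1, winding - 1
            else:
                site -= 1
    return winding * L / t                    # net sites per unit time

print(f"estimated v* ~ {ring_walker_speed():.3f}")
```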
Learning From Past Failures of Oral Insulin Trials.
Michels, Aaron W; Gottlieb, Peter A
2018-07-01
Very recently one of the largest type 1 diabetes prevention trials using daily administration of oral insulin or placebo was completed. After 9 years of study enrollment and follow-up, the randomized controlled trial failed to delay the onset of clinical type 1 diabetes, which was the primary end point. The unfortunate outcome follows the previous large-scale trial, the Diabetes Prevention Trial-Type 1 (DPT-1), which again failed to delay diabetes onset with oral insulin or low-dose subcutaneous insulin injections in a randomized controlled trial with relatives at risk for type 1 diabetes. These sobering results raise the important question, "Where does the type 1 diabetes prevention field move next?" In this Perspective, we advocate for a paradigm shift in which smaller mechanistic trials are conducted to define immune mechanisms and potentially identify treatment responders. The stage is set for these interventions in individuals at risk for type 1 diabetes as Type 1 Diabetes TrialNet has identified thousands of relatives with islet autoantibodies and general population screening for type 1 diabetes risk is under way. Mechanistic trials will allow for better trial design and patient selection based upon molecular markers prior to large randomized controlled trials, moving toward a personalized medicine approach for the prevention of type 1 diabetes. © 2018 by the American Diabetes Association.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, S.Y.; Tepikian, S.
1985-01-01
Nonlinear magnetic forces become more important for particles in modern large accelerators. These nonlinear elements are introduced either intentionally, to control beam dynamics, or by uncontrollable random errors. The equations of motion for the nonlinear Hamiltonian are usually non-integrable. Because of the nonlinear part of the Hamiltonian, the tune diagram of an accelerator is a jungle, and nonlinear magnet multipoles are important in keeping the accelerator operating point in the safe quarter of this hostile jungle of resonant tunes. Indeed, all modern accelerator designs have taken advantage of nonlinear mechanics. On the other hand, the effect of the uncontrollable random multipoles should be evaluated carefully. A powerful method for studying the effect of these nonlinear multipoles is the particle tracking calculation, in which a group of test particles is traced through the magnetic multipoles of the accelerator for hundreds to millions of turns in order to test the dynamical aperture of the machine. These methods are extremely useful in the design of large accelerators such as the SSC, LEP, HERA and RHIC, but the calculations unfortunately take a tremendous amount of computing time. In this review the method of determining chaotic orbits and its application to nonlinear problems in accelerator physics is discussed. We then discuss the scaling properties and the effect of random sextupoles.
NASA Astrophysics Data System (ADS)
Jones, A. W.; Bland-Hawthorn, J.; Kaiser, N.
1994-12-01
In the first half of 1995, the Anglo-Australian Observatory is due to commission a wide-field (2.1 deg), 400-fiber, double spectrograph system (2dF) at the f/3.3 prime focus of the AAT 3.9 m bi-national facility. The instrument should be able to measure ~4000 galaxy redshifts (assuming a magnitude limit of b_J ~ 20) in a single dark night and is therefore ideally suited to studies of large-scale structure. We have carried out simple 3D numerical simulations to judge the relative merits of sparse surveys and contiguous surveys. We generate a survey volume and fill it randomly with particles according to a selection function which mimics a magnitude-limited survey at b_J = 19.7. Each of the particles is perturbed by a Gaussian random field according to the dimensionless power spectrum k^3 P(k)/2π^2 determined by Feldman, Kaiser & Peacock (1994) from the IRAS QDOT survey. We introduce some redshift-space distortion as described by Kaiser (1987), a `thermal' component measured from pairwise velocities (Davis & Peebles 1983), and `fingers of god' due to rich clusters at random density enhancements. Our particular concern is to understand how the window function |W(k)|^2 of the survey geometry compromises the accuracy of statistical measures [e.g., P(k), ξ(r), ξ(r_σ, r_π)] commonly used in the study of large-scale structure. We also examine the reliability of various tools (e.g. genus) for describing the topological structure within a contiguous region of the survey.
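The step of filling the volume "randomly with particles according to a selection function" can be made concrete with rejection sampling. The sketch below uses a Gaussian-in-radius selection function with invented scales, not the b_J = 19.7 function of the actual simulations.

```python
import numpy as np

rng = np.random.default_rng(2)
r_max, n0, r_star = 600.0, 1.0, 250.0   # illustrative scales only

def phi(r):
    # Toy radial selection function: expected density falls with distance.
    return n0 * np.exp(-(r / r_star) ** 2)

# Rejection sampling: accept radii in proportion to phi(r) * r^2 (the
# volume element); n0 * r_max^2 bounds phi(r) * r^2 on [0, r_max].
r = rng.uniform(0.0, r_max, 200_000)
accept = rng.uniform(0.0, 1.0, r.size) < phi(r) * r**2 / (n0 * r_max**2)
r = r[accept]

# Isotropic random directions complete the mock catalogue.
mu = rng.uniform(-1.0, 1.0, r.size)                # cos(theta)
az = rng.uniform(0.0, 2.0 * np.pi, r.size)
s = np.sqrt(1.0 - mu**2)
xyz = r[:, None] * np.column_stack((s * np.cos(az), s * np.sin(az), mu))
print(xyz.shape[0], "mock galaxies in the survey volume")
```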
Anderson, Jacqueline; Dolk, Anders; Torgerson, Jarl; Nyberg, Svante; Skau, Tommy; Forsberg, Birger C.; Werr, Joachim; Öhlen, Gunnar
2016-01-01
Background A small group of frequent visitors to Emergency Departments accounts for a disproportionally large fraction of healthcare consumption including unplanned hospitalizations and overall healthcare costs. In response, several case and disease management programs aimed at reducing healthcare consumption in this group have been tested; however, results vary widely. Objectives To investigate whether a telephone-based, nurse-led case management intervention can reduce healthcare consumption for frequent Emergency Department visitors in a large-scale setup. Methods A total of 12 181 frequent Emergency Department users in three counties in Sweden were randomized using Zelen’s design or a traditional randomized design to receive either a nurse-led case management intervention or no intervention, and were followed for healthcare consumption for up to 2 years. Results The traditional design showed an overall 12% (95% confidence interval 4–19%) decreased rate of hospitalization, which was mostly driven by effects in the last year. Similar results were achieved in the Zelen studies, with a significant reduction in hospitalization in the last year, but mixed results in the early development of the project. Conclusion Our study provides evidence that a carefully designed telephone-based intervention with accurate and systematic patient selection and appropriate staff training in a centralized setup can lead to significant decreases in healthcare consumption and costs. Further, our results also show that the effects are sensitive to the delivery model chosen. PMID:25969342
Applying Active Learning to Assertion Classification of Concepts in Clinical Text
Chen, Yukun; Mani, Subramani; Xu, Hua
2012-01-01
Supervised machine learning methods for clinical natural language processing (NLP) research require a large number of annotated samples, which are very expensive to build because of the involvement of physicians. Active learning, an approach that actively samples from a large pool, provides an alternative solution. Its major goal in classification is to reduce the annotation effort while maintaining the quality of the predictive model. However, few studies have investigated its uses in clinical NLP. This paper reports an application of active learning to a clinical text classification task: to determine the assertion status of clinical concepts. The annotated corpus for the assertion classification task in the 2010 i2b2/VA Clinical NLP Challenge was used in this study. We implemented several existing and newly developed active learning algorithms and assessed their uses. The outcome is reported in the global ALC score, based on the Area under the average Learning Curve of the AUC (Area Under the Curve) score. Results showed that when the same number of annotated samples was used, active learning strategies could generate better classification models (best ALC – 0.7715) than the passive learning method (random sampling) (ALC – 0.7411). Moreover, to achieve the same classification performance, active learning strategies required fewer samples than the random sampling method. For example, to achieve an AUC of 0.79, the random sampling method used 32 samples, while our best active learning algorithm required only 12 samples, a reduction of 62.5% in manual annotation effort. PMID:22127105
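A pool-based sketch of the comparison, with synthetic data standing in for the i2b2/VA corpus: uncertainty sampling with a logistic-regression learner versus the passive random-sampling baseline, using mean test AUC over the query sequence as a crude stand-in for the ALC score. The authors' specific algorithms are not reproduced here.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
pool, test = np.arange(1500), np.arange(1500, 2000)

def run(strategy, n_queries=40):
    # Seed with 5 labeled samples per class so the first fit is valid.
    labeled = list(rng.choice(pool[y[pool] == 0], 5, replace=False)) \
            + list(rng.choice(pool[y[pool] == 1], 5, replace=False))
    aucs = []
    for _ in range(n_queries):
        clf = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])
        aucs.append(roc_auc_score(y[test], clf.predict_proba(X[test])[:, 1]))
        unlabeled = np.setdiff1d(pool, labeled)
        if strategy == "uncertainty":   # query the least confident sample
            p = clf.predict_proba(X[unlabeled])[:, 1]
            labeled.append(unlabeled[np.argmin(np.abs(p - 0.5))])
        else:                           # passive baseline: random sampling
            labeled.append(rng.choice(unlabeled))
    return np.mean(aucs)                # crude stand-in for the ALC score

print("active :", run("uncertainty"))
print("random :", run("random"))
```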
Miyauchi, Katsumi; Kimura, Takeshi; Shimokawa, Hiroaki; Daida, Hiroyuki; Iimuro, Satoshi; Iwata, Hiroshi; Ozaki, Yukio; Sakuma, Ichiro; Nakagawa, Yoshihisa; Hibi, Kiyoshi; Hiro, Takafumi; Fukumoto, Yoshihiro; Hokimoto, Seiji; Ohashi, Yasuo; Ohtsu, Hiroshi; Saito, Yasushi; Matsuzaki, Masunori; Nagai, Ryozo
2018-03-30
Large-scale clinical trials in patients in Western countries with coronary artery disease (CAD) have found that aggressive lipid-lowering therapy using high-dose statins reduces cardiovascular (CV) events further than low-dose statins. However, such evidence has not yet been fully established in Asian populations, including in Japan. The Randomized Evaluation of Aggressive or Moderate Lipid-Lowering Therapy with Pitavastatin in Coronary Artery Disease (REAL-CAD) study addresses whether intensification of statin therapy improves clinical outcomes in Japanese patients with CAD.REAL-CAD is a prospective, multicenter, randomized, open-label, blinded-endpoint, physician-initiated phase 4 trial in Japan. The study will recruit up to 12,600 patients with stable CAD. Patients are assigned to receive either pitavastatin 1 mg/day or pitavastatin 4 mg/day. LDL-C levels are expected to reach approximate mean values of 100 mg/dL in the low-dose pitavastatin group and 80 mg/dL in the high-dose group. The primary endpoint is the time to occurrence of a major CV event, including CV death, non-fatal myocardial infarction, non-fatal ischemic stroke, and unstable angina requiring emergency hospitalization during an average of 5 years. The large number of patients and the long follow-up period in the REAL-CAD study should ensure that there is adequate power to definitively determine if reducing LDL-C levels to approximately 80 mg/dL by high-dose statin can provide additional clinical benefit.After the study is completed, we will have categorical evidence on the optimal statin dose and target LDL-C level for secondary prevention in Japanese patients.
Green, Beverly B; Ralston, James D; Fishman, Paul A; Catz, Sheryl L; Cook, Andrea; Carlson, Jim; Tyll, Lynda; Carrell, David; Thompson, Robert S
2008-05-01
Randomized controlled trials have provided unequivocal evidence that treatment of hypertension decreases mortality and major disability from cardiovascular disease; however, blood pressure remains inadequately treated in most affected individuals. This large gap continues despite the fact that more than 90% of adults with hypertension have health insurance, and hypertension is the leading cause of visits to the doctor. New approaches are needed to improve hypertension care. The Electronic Communications and Home Blood Pressure Monitoring (e-BP) study is a three-arm randomized controlled trial designed to determine whether care based on the Chronic Care Model and delivered over the Internet improves hypertension care. The primary study outcomes are systolic and diastolic blood pressure and blood pressure control; secondary outcomes are medication adherence, patient self-efficacy, satisfaction and quality of life, and healthcare utilization and costs. Hypertensive patients receiving care at Group Health medical centers are eligible if they have uncontrolled blood pressure on two screening visits and access to the Web and an e-mail address. Study participants are randomly assigned to one of three intervention groups: (a) usual care; (b) receipt of a home blood pressure monitor plus proficiency training on its use and on the Group Health secure patient website (with secure e-mail access to their healthcare provider, access to a shared medical record, prescription refill and other services); or (c) this plus pharmacist care management (collaborative care management between the patient, the pharmacist, and the patient's physician via a secure patient website and the electronic medical record). We will determine whether a new model of patient-centered care that leverages Web communications, self-monitoring, and collaborative care management improves hypertension control. If this model proves successful and cost-effective, similar interventions could be used to improve the care of large numbers of patients with uncontrolled hypertension.
Sparks, Jeffrey A; Barbhaiya, Medha; Karlson, Elizabeth W; Ritter, Susan Y; Raychaudhuri, Soumya; Corrigan, Cassandra C; Lu, Fengxin; Selhub, Jacob; Chasman, Daniel I; Paynter, Nina P; Ridker, Paul M; Solomon, Daniel H
2017-08-01
The role of low-dose methotrexate (LDM) in potentially serious toxicities remains unclear despite its common use. Prior observational studies investigating LDM toxicity compared LDM to other active drugs. Prior placebo-controlled clinical trials of LDM in inflammatory conditions were not large enough to investigate toxicity. The Cardiovascular Inflammation Reduction Trial (CIRT) is an ongoing NIH-funded, randomized, double-blind, placebo-controlled trial of LDM in the secondary prevention of cardiovascular disease. We describe here the rationale and design of the CIRT-Adverse Events (CIRT-AE) ancillary study, which aims to investigate adverse events within CIRT. CIRT will randomize up to 7000 participants with cardiovascular disease and no systemic rheumatic disease to either LDM (target dose: 15-20mg/week) or placebo for an average follow-up period of 3-5 years; subjects in both treatment arms receive folic acid 1mg daily for 6 days each week. The primary endpoints of CIRT include recurrent cardiovascular events, incident diabetes, and all-cause mortality, and the ancillary CIRT-AE study has been designed to adjudicate other clinically important adverse events including hepatic, gastrointestinal, respiratory, hematologic, infectious, mucocutaneous, oncologic, renal, neurologic, and musculoskeletal outcomes. Methotrexate polyglutamate levels and genome-wide single nucleotide polymorphisms will be examined for association with adverse events. CIRT-AE will comprehensively evaluate potential LDM toxicities among subjects with cardiovascular disease within the context of a large, ongoing, double-blind, placebo-controlled trial. This information may lead to a personalized approach to monitoring LDM in clinical practice. Copyright © 2017 Elsevier Inc. All rights reserved.
Return probabilities and hitting times of random walks on sparse Erdös-Rényi graphs.
Martin, O C; Sulc, P
2010-03-01
We consider random walks on random graphs, focusing on return probabilities and hitting times for sparse Erdös-Rényi graphs. Using the tree approach, which is expected to be exact in the large graph limit, we show how to solve for the distribution of these quantities and we find that these distributions exhibit a form of self-similarity.
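Such predictions are easy to probe by direct simulation. The sketch below estimates the t-step return probability of a simple random walk on a sparse Erdos-Renyi graph with networkx; all parameters are illustrative, and the paper's tree-approach computation is analytical rather than simulation-based.

```python
import random
import networkx as nx

def return_probability(n=2000, c=3.0, t=10, walks=20_000, seed=0):
    """Monte Carlo estimate of P(X_t = X_0) for a simple random walk
    on a sparse Erdos-Renyi graph G(n, c/n)."""
    rng = random.Random(seed)
    G = nx.fast_gnp_random_graph(n, c / n, seed=seed)
    starts = [v for v in G if G.degree(v) > 0]   # skip isolated vertices
    hits = 0
    for _ in range(walks):
        v0 = v = rng.choice(starts)
        for _ in range(t):
            v = rng.choice(list(G[v]))           # uniform neighbor hop
        hits += (v == v0)
    return hits / walks

print(f"P(return at t=10) ~ {return_probability():.4f}")
```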
Marino, S R; Lin, S; Maiers, M; Haagenson, M; Spellman, S; Klein, J P; Binkowski, T A; Lee, S J; van Besien, K
2012-02-01
The identification of important amino acid substitutions associated with low survival in hematopoietic cell transplantation (HCT) is hampered by the large number of observed substitutions compared with the small number of patients available for analysis. Random forest analysis is designed to address these limitations. We studied 2107 HCT recipients with good or intermediate risk hematological malignancies to identify HLA class I amino acid substitutions associated with reduced survival at day 100 post transplant. Random forest analysis and traditional univariate and multivariate analyses were used. Random forest analysis identified amino acid substitutions in 33 positions that were associated with reduced 100 day survival, including HLA-A 9, 43, 62, 63, 76, 77, 95, 97, 114, 116, 152, 156, 166 and 167; HLA-B 97, 109, 116 and 156; and HLA-C 6, 9, 11, 14, 21, 66, 77, 80, 95, 97, 99, 116, 156, 163 and 173. In all, 13 of these had been previously reported by other investigators using classical biostatistical approaches. Using the same data set, traditional multivariate logistic regression identified only five amino acid substitutions associated with lower day 100 survival. Random forest analysis is a novel statistical methodology for analysis of HLA mismatching and outcome studies, capable of identifying important amino acid substitutions missed by other methods.
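The generic importance-ranking step that underlies this kind of analysis can be sketched with scikit-learn on synthetic data standing in for the HLA substitution matrix; the feature counts and effect structure below are invented, and the study's adjudicated day-100 survival outcome is not reproduced.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in: 2000 "recipients", 100 "substitution" features,
# a handful of which truly affect the binary day-100 outcome.
X, y = make_classification(n_samples=2000, n_features=100, n_informative=8,
                           n_redundant=0, random_state=0)
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)

# Rank positions by impurity-based importance; the top ranks should
# recover the informative features even where univariate tests are
# underpowered by the many-features/few-events problem described above.
top = np.argsort(rf.feature_importances_)[::-1][:10]
print("top-ranked feature indices:", top)
```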
Accurate prediction of personalized olfactory perception from large-scale chemoinformatic features.
Li, Hongyang; Panwar, Bharat; Omenn, Gilbert S; Guan, Yuanfang
2018-02-01
The olfactory stimulus-percept problem has been studied for more than a century, yet it is still hard to precisely predict the odor given the large-scale chemoinformatic features of an odorant molecule. A major challenge is that the perceived qualities vary greatly among individuals due to different genetic and cultural backgrounds. Moreover, the combinatorial interactions between multiple odorant receptors and diverse molecules significantly complicate the olfaction prediction. Many attempts have been made to establish structure-odor relationships for intensity and pleasantness, but no models are available to predict the personalized multi-odor attributes of molecules. In this study, we describe our winning algorithm for predicting individual and population perceptual responses to various odorants in the DREAM Olfaction Prediction Challenge. We find that a random forest model consisting of multiple decision trees is well suited to this prediction problem, given the large feature spaces and high variability of perceptual ratings among individuals. Integrating both population and individual perceptions into our model effectively reduces the influence of noise and outliers. By analyzing the importance of each chemical feature, we find that a small set of low- and nondegenerative features is sufficient for accurate prediction. Our random forest model successfully predicts personalized odor attributes of structurally diverse molecules. This model together with the top discriminative features has the potential to extend our understanding of olfactory perception mechanisms and provide an alternative for rational odorant design.
Graft Utilization in the Augmentation of Large-to-Massive Rotator Cuff Repairs: A Systematic Review.
Ferguson, Devin P; Lewington, Matthew R; Smith, T Duncan; Wong, Ivan H
2016-11-01
Current treatment options for symptomatic large-to-massive rotator cuff tears can reduce pain, but failure rates remain high. Surgeons have incorporated synthetic and biologic grafts to augment these repairs, with promising results. Multiple reviews exist that summarize these products; however, no systematic review has investigated the grafts' ability to maintain structural integrity after augmentation of large-to-massive rotator cuff repairs. To systematically review and evaluate the effectiveness of grafts in the augmentation of large-to-massive rotator cuff repairs. Systematic review. A comprehensive search of 4 reputable databases was completed. Inclusion criteria were (1) large-to-massive rotator cuff tear, (2) graft augmentation of primary repairs ± primary repair control group, and (3) minimum clinical and radiologic follow-up of 12 months. Two reviewers screened the titles, abstracts, and full articles and extracted the data from eligible studies. Results were summarized into evidence tables stratified by graft origin and level of evidence. Ten studies fit the inclusion criteria. Allograft augmentation was functionally and structurally superior to primary repair controls, with intact repairs in 85% versus 40% of patients (P < .01). This was supported by observational study data. Xenograft augmentation failed to demonstrate superiority to primary repair controls, with worse structural healing rates (27% vs 60%; P =.11). Both comparative studies supported this finding. There have also been many reports of inflammatory reactions with xenograft use. Polypropylene patches are associated with improved structural (83% vs 59% and 49%; P < .01) and functional outcomes when compared with controls and xenograft augmentation; however, randomized data are lacking. Augmentation of large-to-massive rotator cuff repairs with human dermal allografts is associated with superior functional and structural outcome when compared with conventional primary repair. Xenograft augmentation failed to demonstrate a statistically significant difference and may be associated with worse rerupture rates and occasional severe inflammatory reactions. Polypropylene patches have initial promising results. Research in this field is limited; future researchers should continue to develop prospective, randomized controlled trials to establish clear recommendations. © 2016 The Author(s).
Comonotonic bounds on the survival probabilities in the Lee-Carter model for mortality projection
NASA Astrophysics Data System (ADS)
Denuit, Michel; Dhaene, Jan
2007-06-01
In the Lee-Carter framework, future survival probabilities are random variables with an intricate distribution function. In large homogeneous portfolios of life annuities, value-at-risk or conditional tail expectation of the total yearly payout of the company are approximately equal to the corresponding quantities involving random survival probabilities. This paper aims to derive some bounds in the increasing convex (or stop-loss) sense on these random survival probabilities. These bounds are obtained with the help of comonotonic upper and lower bounds on sums of correlated random variables.
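In notation of my own choosing, the standard comonotonic construction behind such bounds replaces the sum of dependent random survival probabilities X_1, ..., X_n by the sum of their marginal quantile functions evaluated at a single uniform random variable, which dominates the original sum in the convex order:

```latex
S \;=\; \sum_{t=1}^{n} X_t
\;\le_{\mathrm{cx}}\;
S^{c} \;=\; \sum_{t=1}^{n} F_{X_t}^{-1}(U),
\qquad U \sim \mathrm{Uniform}(0,1).
```

Here each F_{X_t} is the marginal distribution of the corresponding random survival probability under the Lee-Carter model; a matching lower bound is built analogously by first conditioning each term on a suitable common factor.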
Scaling exponents for ordered maxima
Ben-Naim, E.; Krapivsky, P. L.; Lemons, N. W.
2015-12-22
We study extreme value statistics of multiple sequences of random variables. For each sequence with N variables, independently drawn from the same distribution, the running maximum is defined as the largest variable to date. We compare the running maxima of m independent sequences and investigate the probability S_N that the maxima are perfectly ordered, that is, the running maximum of the first sequence is always larger than that of the second sequence, which is always larger than the running maximum of the third sequence, and so on. The probability S_N is universal: it does not depend on the distribution from which the random variables are drawn. For two sequences, S_N ~ N^(-1/2), and in general the decay is algebraic, S_N ~ N^(-σ_m), for large N. We analytically obtain the exponent σ_3 ≈ 1.302931 as the root of a transcendental equation. Moreover, the exponents σ_m grow with m, and we show that σ_m ~ m for large m.
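The universality and the m = 2 decay are easy to check numerically. In the sketch below (trial counts and sequence lengths are arbitrary choices), S_N is estimated for two uniform sequences, and S_N √N should come out roughly constant.

```python
import numpy as np

rng = np.random.default_rng(0)

def ordered_maxima_prob(N, m=2, trials=20_000):
    """Estimate S_N: probability that the running maxima of m independent
    iid sequences stay strictly ordered through step N. The statistic is
    distribution-free; uniforms are used for convenience."""
    x = rng.random((trials, m, N))
    run_max = np.maximum.accumulate(x, axis=2)
    ordered = np.all(run_max[:, :-1, :] > run_max[:, 1:, :], axis=(1, 2))
    return ordered.mean()

for N in (10, 40, 160):
    s = ordered_maxima_prob(N)
    print(N, s, s * N**0.5)   # s * sqrt(N) ~ constant for m = 2
```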
Statistical mechanics of complex economies
NASA Astrophysics Data System (ADS)
Bardoscia, Marco; Livan, Giacomo; Marsili, Matteo
2017-04-01
In the pursuit of ever increasing efficiency and growth, our economies have evolved to remarkable degrees of complexity, with nested production processes feeding each other in order to create products of greater sophistication from less sophisticated ones, down to raw materials. The engine of such an expansion has been competitive markets that, according to general equilibrium theory (GET), achieve efficient allocations under specific conditions. We study large random economies within the GET framework, as templates of complex economies, and we find that a non-trivial phase transition occurs: the economy freezes in a state where all production processes collapse when either the number of primary goods or the number of available technologies falls below a critical threshold. As in other examples of phase transitions in large random systems, this is an unintended consequence of the growth in complexity. Our findings suggest that the Industrial Revolution can be regarded as a sharp transition between different phases, but also imply that well developed economies can collapse if too many intermediate goods are introduced.
Investigation of spectral analysis techniques for randomly sampled velocimetry data
NASA Technical Reports Server (NTRS)
Sree, Dave
1993-01-01
It is well known that laser velocimetry (LV) generates individual-realization velocity data that are randomly or unevenly sampled in time. Spectral analysis of such data to obtain the turbulence spectra, and hence turbulence scale information, requires special techniques. The 'slotting' technique of Mayo et al., also described by Roberts and Ajmani, and the 'direct transform' method of Gaster and Roberts are well known in the LV community. The slotting technique is computationally faster than the direct transform method. There are practical limitations, however, on how high in frequency an accurate estimate can be made for a given mean sampling rate. These high-frequency estimates are important in obtaining the microscale information of the turbulence structure. Previous studies found that reliable spectral estimates can be made up to about the mean sampling frequency (mean data rate) or less. If the data were evenly sampled, the usable frequency range would be half the sampling frequency (i.e., up to the Nyquist frequency); otherwise, aliasing problems would occur. The mean data rate and the sample size (total number of points) basically limit the frequency range. Also, large variabilities or errors are associated with the high-frequency estimates from randomly sampled signals. Roberts and Ajmani proposed certain pre-filtering techniques to reduce these variabilities, but at the cost of the low-frequency estimates; the prefiltering acts as a high-pass filter. Further, Shapiro and Silverman showed theoretically that, for Poisson-sampled signals, it is possible to obtain alias-free spectral estimates far beyond the mean sampling frequency. But the question is, how far? During his tenure in the 1993 NASA-ASEE Summer Faculty Fellowship Program, the author found from his studies of spectral analysis techniques for randomly sampled signals that the spectral estimates can be enhanced or improved up to about 4-5 times the mean sampling frequency by using a suitable prefiltering technique. This increased bandwidth, however, comes at the cost of the lower-frequency estimates. The studies further showed that large data sets of the order of 100,000 points or more, high data rates, and Poisson sampling are crucial for obtaining reliable spectral estimates from randomly sampled data, such as LV data. Some of the results of the current study are presented.
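A minimal rendering of the slotting idea with an invented test signal follows: products u_i u_j are accumulated into bins of the lag |t_j - t_i| and each slot is normalized by its pair count; a spectrum estimate then follows from the cosine transform of the slotted correlation. This is a simplified sketch, not the Mayo et al. implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Poisson-sampled signal: a 5 Hz tone observed at exponentially
# spaced times, mean data rate 50 samples/s (all values illustrative).
t = np.cumsum(rng.exponential(1 / 50.0, 4000))
u = np.sin(2 * np.pi * 5.0 * t) + 0.2 * rng.standard_normal(t.size)
u -= u.mean()

def slotted_autocorrelation(t, u, dt, n_lags):
    """Slotting: accumulate products u_i * u_j into lag bins of width dt,
    then normalize each slot by its pair count."""
    num = np.zeros(n_lags)
    cnt = np.zeros(n_lags)
    for i in range(t.size - 1):
        lag = t[i + 1:] - t[i]
        k = (lag / dt).astype(int)
        ok = k < n_lags
        np.add.at(num, k[ok], u[i] * u[i + 1:][ok])
        np.add.at(cnt, k[ok], 1)
    return num / np.maximum(cnt, 1)

R = slotted_autocorrelation(t, u, dt=0.005, n_lags=100)
# A spectrum estimate follows from the cosine transform of R.
```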
Phenomenological picture of fluctuations in branching random walks
NASA Astrophysics Data System (ADS)
Mueller, A. H.; Munier, S.
2014-10-01
We propose a picture of the fluctuations in branching random walks, which leads to predictions for the distribution of a random variable that characterizes the position of the bulk of the particles. We also interpret the 1/√t correction to the average position of the rightmost particle of a branching random walk for large times t ≫ 1, computed by Ebert and Van Saarloos, as fluctuations on top of the mean-field approximation of this process with a Brunet-Derrida cutoff at the tip that simulates discreteness. Our analytical formulas successfully compare to numerical simulations of a particular model of a branching random walk.
Abraham, William T; Burkhoff, Daniel; Nademanee, Koonlawee; Carson, Peter; Bourge, Robert; Ellenbogen, Kenneth A; Parides, Michael; Kadish, Alan
2008-10-01
Cardiac contractility modulation (CCM) signals are nonexcitatory electrical signals delivered during the cardiac absolute refractory period that enhance the strength of cardiac muscular contraction. Prior research in experimental and human heart failure has shown that CCM signals normalize phosphorylation of key proteins and expression of genes coding for proteins involved in regulation of calcium cycling and contraction. The results of prior clinical studies of CCM have supported its safety and efficacy. A large-scale clinical study, the FIX-HF-5 study, is currently underway to test the safety and efficacy of this treatment. In this article, we provide an overview of the system used to deliver CCM signals, the implant procedure, and the details and rationale of the FIX-HF-5 study design. Baseline characteristics for patients randomized in this trial are also presented.
[Laservaporization of the prostate: current status of the greenlight and diode laser].
Rieken, M; Bachmann, A; Gratzke, C
2013-03-01
In the last decade laser vaporization of the prostate has emerged as a safe and effective alternative to transurethral resection of the prostate (TURP). This was facilitated in particular by the introduction of photoselective vaporization of the prostate (PVP) with a 532 nm 80 W KTP laser in 2002. Prospective randomized trials comparing PVP and TURP with a maximum follow-up of 3 years mostly demonstrated comparable functional results. Cohort studies showed a safe application of PVP in patients under oral anticoagulation and with large prostates. Systems from various manufacturers with different maximum power output and wavelengths are now available for diode laser vaporization of the prostate. Prospective randomized trials comparing diode lasers and TURP are not yet available. In cohort studies and comparative studies PVP diode lasers are characterized by excellent hemostatic properties but functional results vary greatly with some studies reporting high reoperation rates.
Skingley, Ann; Bungay, Hilary; Clift, Stephen; Warden, June
2014-12-01
Existing randomized controlled trials within the health field suggest that the concept of randomization is not always well understood and that feelings of disappointment may occur when participants are not placed in their preferred arm. This may affect a study's rigour and ethical integrity if not addressed. We aimed to test whether these issues apply to a healthy volunteer sample within a health promotion trial of singing for older people. Written comments from control group participants at two points during the trial were analysed, together with individual semi-structured interviews with a small sample (n = 11) of this group. We found that motivation to participate in the trial was largely due to the appeal of singing and disappointment resulted from allocation to the control group. Understanding of randomization was generally good and feelings of disappointment lessened over time and with a post-research opportunity to sing. Findings suggest that measures should be put in place to minimize the potential negative impacts of randomized controlled trials in health promotion research. © The Author (2013). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Heckman, James; Moon, Seong Hyeok; Pinto, Rodrigo; Savelyev, Peter; Yavitz, Adam
2012-01-01
Social experiments are powerful sources of information about the effectiveness of interventions. In practice, initial randomization plans are almost always compromised. Multiple hypotheses are frequently tested. “Significant” effects are often reported with p-values that do not account for preliminary screening from a large candidate pool of possible effects. This paper develops tools for analyzing data from experiments as they are actually implemented. We apply these tools to analyze the influential HighScope Perry Preschool Program. The Perry program was a social experiment that provided preschool education and home visits to disadvantaged children during their preschool years. It was evaluated by the method of random assignment. Both treatments and controls have been followed from age 3 through age 40. Previous analyses of the Perry data assume that the planned randomization protocol was implemented. In fact, as in many social experiments, the intended randomization protocol was compromised. Accounting for compromised randomization, multiple-hypothesis testing, and small sample sizes, we find statistically significant and economically important program effects for both males and females. We also examine the representativeness of the Perry study. PMID:23255883
NASA Astrophysics Data System (ADS)
Mikami, Masato; Saputro, Herman; Seo, Takehiko; Oyagi, Hiroshi
2018-03-01
Stable operation of liquid-fueled combustors requires the group combustion of fuel spray. Our study employs a percolation approach to describe unsteady group-combustion excitation based on findings obtained from microgravity experiments on the flame spread of fuel droplets. We focus on droplet clouds distributed randomly in three-dimensional square lattices with a low-volatility fuel, such as n-decane in room-temperature air, where the pre-vaporization effect is negligible. We also focus on the flame spread in dilute droplet clouds near the group-combustion-excitation limit, where the droplet interactive effect is assumed negligible. The results show that the occurrence probability of group combustion decreases sharply as the mean droplet spacing increases past a specific value, which is termed the critical mean droplet spacing. If the lattice size is at least about ten times the flame-spread limit distance, the flame-spread characteristics are similar to those over an infinitely large cluster. The number density of unburned droplets remaining after completion of burning attains its maximum around the critical mean droplet spacing; the critical mean droplet spacing is therefore a good index for stable combustion and unburned hydrocarbons. In the critical condition, the flame spreads through complicated paths, so the characteristic time scale of flame spread over droplet clouds becomes very large. The overall flame-spread rate of randomly distributed droplet clouds is almost the same as the flame-spread rate of a linear droplet array, except near the flame-spread limit.
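The percolation reading of these results can be illustrated with a toy lattice model, entirely of my own construction: sites are occupied by droplets at random, nearest-neighbor clusters stand in for flame-spread paths, and group combustion is read off as a cluster spanning the domain. Site occupancy plays the role of an inverse mean droplet spacing.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)

def group_combustion_probability(occupancy, size=20, trials=200):
    """Toy percolation proxy: fraction of random 3D lattices in which an
    occupied cluster spans the cube along the first axis."""
    hits = 0
    for _ in range(trials):
        sites = rng.random((size, size, size)) < occupancy
        labels, _ = ndimage.label(sites)        # 6-connected clusters
        spanning = np.intersect1d(labels[0][labels[0] > 0],
                                  labels[-1][labels[-1] > 0])
        hits += spanning.size > 0
    return hits / trials

# The occurrence probability drops sharply near the site-percolation
# threshold of the cubic lattice (~0.3116), echoing the sharp drop
# around the critical mean droplet spacing.
for occ in (0.25, 0.31, 0.37):
    print(occ, group_combustion_probability(occ))
```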
Renal Denervation for Treatment of Hypertension: a Second Start and New Challenges.
Persu, Alexandre; Kjeldsen, Sverre; Staessen, Jan A; Azizi, Michel
2016-01-01
Following the publication of the randomized controlled but open-label trial Symplicity HTN-2, catheter-based renal sympathetic denervation was proposed as a novel treatment for drug-resistant hypertension. Thousands of procedures were routinely performed in Europe, Australia and Asia, and many observational studies were published. A sudden shift from overoptimistic views to radical scepticism occurred later, when the large US randomized sham-controlled trial Symplicity HTN-3 failed to meet its primary blood pressure lowering efficacy endpoint. Experts are divided on the reasons accounting for the large discrepancy between the results of initial studies and those of Symplicity HTN-3. Indeed, the blood pressure lowering effect associated with renal denervation was overestimated in initial trials due to various patient and physician-related biases, whereas it could have been underestimated in Symplicity HTN-3, which was well designed but not rigorously executed. Still, there is a large consensus on the need to further study catheter-based renal denervation in more controlled conditions, with particular emphasis on identification of predictors of blood pressure response. US and European experts have recently issued very similar recommendations on design of upcoming trials, procedural aspects, drug treatment, patient population and inclusion-exclusion criteria. Application of these new standards may represent a second chance for renal denervation to demonstrate, or fail to demonstrate, its efficacy and safety in various patient populations. With its highly standardized treatment regimen, the French trial DENERHTN paved the way for this new approach and may inspire upcoming studies testing novel renal denervation systems in different populations.
Engel, Dorothee; Schnitzer, Andreas; Brade, Joachim; Blank, Elena; Wenz, Frederik; Suetterlin, Marc; Schoenberg, Stefan; Wasser, Klaus
2013-01-01
Intraoperative radiotherapy (IORT) with low-energy x-rays is increasingly used in breast-conserving therapy (BCT). Previous non-randomized studies have observed mammographic changes in the tumor bed to be more pronounced after IORT. The purpose of this study was to reassess the postoperative changes in a randomized single-center subgroup of patients from a multicenter trial (TARGIT-A). In this subgroup (n = 48) 27 patients received BCT with IORT, 21 patients had BCT with standard whole-breast radiotherapy serving as controls. Overall 258 postoperative mammograms (median follow-up 4.3 years, range 3-8) were retrospectively evaluated by two radiologists in consensus focusing on changes in the tumor bed. Fat necroses were significantly more frequent (56% versus 24%) and larger (median 8.7 versus 1.6 sq cm) after IORT than in controls. Scar calcifications were also significantly more frequent after IORT (63% versus 19%). The high incidence of large fat necroses in our study confirms previous study findings. However, the overall higher incidence of calcifications in the tumor bed after IORT represents a new finding, requiring further attention. © 2012 Wiley Periodicals, Inc.
Botelho, Marco Antonio; Bezerra, José Gomes; Correa, Luciano Lima; Fonseca, Said Gonçalves da Cruz; Montenegro, Danusa; Gapski, Ricardo; Brito, Gerly Anne Castro; Heukelbach, Jörg
2007-01-01
Several different plant extracts have been evaluated with respect to their antimicrobial effects against oral pathogens and for reduction of gingivitis. Given that a large number of these substances have been associated with significant side effects that contraindicate their long-term use, new compounds need to be tested. The aim of this study was to assess the short-term safety and efficacy of a Lippia sidoides ("alecrim pimenta")-based essential oil mouthrinse on gingival inflammation and bacterial plaque. Fifty-five patients were enrolled into a pilot, double-blinded, randomized, parallel-armed study. Patients were randomly assigned to undergo a 7-day treatment regimen with either the L. sidoides-based mouthrinse or 0.12% chlorhexidine mouthrinse. The results demonstrated decreased plaque index, gingival index and gingival bleeding index scores at 7 days, as compared to baseline. There was no statistically significant difference (p>0.05) between test and control groups for any of the clinical parameters assessed throughout the study. Adverse events were mild and transient. The findings of this study demonstrated that the L. sidoides-based mouthrinse was safe and efficacious in reducing bacterial plaque and gingival inflammation. PMID:19089126
Melloni, Chiara; Washam, Jeffrey B; Jones, W Schuyler; Halim, Sharif A; Hasselblad, Victor; Mayer, Stephanie B; Heidenfelder, Brooke L; Dolor, Rowena J
2015-01-01
Discordant results have been reported on the effects of concomitant use of proton pump inhibitors (PPIs) and dual antiplatelet therapy (DAPT) for cardiovascular outcomes. We conducted a systematic review comparing the effectiveness and safety of concomitant use of PPIs and DAPT in the postdischarge treatment of unstable angina/non-ST-segment-elevation myocardial infarction patients. We searched for clinical studies in MEDLINE, EMBASE, and the Cochrane Database of Systematic Reviews, from 1995 to 2012. Reviewers screened and extracted data, assessed applicability and quality, and graded the strength of evidence. We performed meta-analyses of direct comparisons when outcomes and follow-up periods were comparable. Thirty-five studies were eligible. Five (4 randomized controlled trials and 1 observational) assessed the effect of omeprazole when added to DAPT; the other 30 (observational) assessed the effect of PPIs as a class when compared with no PPIs. Random-effects meta-analyses of the studies assessing PPIs as a class consistently reported higher event rates in patients receiving PPIs for various clinical outcomes at 1 year (composite ischemic end points, all-cause mortality, nonfatal MI, stroke, revascularization, and stent thrombosis). However, the results from randomized controlled trials evaluating omeprazole compared with placebo showed no difference in ischemic outcomes, despite a reduction in upper gastrointestinal bleeding with omeprazole. Large, well-conducted observational studies of PPIs and randomized controlled trials of omeprazole seem to provide conflicting results for the effect of PPIs on cardiovascular outcomes when coadministered with DAPT. Prospective trials that directly compare pharmacodynamic parameters and clinical events among specific PPI agents in patients with unstable angina/non-ST-segment-elevation myocardial infarction treated with DAPT are warranted. © 2015 American Heart Association, Inc.
Ruston, Teresa; Hunter, Kathleen; Cummings, Greta; Lazarescu, Adriana
2013-01-01
Opioid-induced constipation (OIC) is a side effect of opioid therapy that can affect quality of life, adherence to treatment, and morbidity and possibly mortality. To investigate whether docusate sodium, sennosides, and lactulose have equal efficacy and side effect profiles compared to PEG in the management of OIC in adults. A systematic review was undertaken. Randomized controlled trials of adults taking opioids for cancer or non-cancer pain were considered if they met inclusion criteria. Statistical pooling was not possible as no studies met inclusion criteria. Large, well-powered, randomized controlled trials are feasible. Standard definitions of OIC would assist with the execution of these studies and contribute to their internal and external validity. Further research is strongly encouraged.
Doros, Gheorghe; Massaro, Joseph M; Kandzari, David E; Waksman, Ron; Koolen, Jacques J; Cutlip, Donald E; Mauri, Laura
2017-11-01
The traditional study design submitted to the Food and Drug Administration to test newer drug-eluting stents (DES) for marketing approval is the prospective randomized controlled trial. However, several DES have extensive clinical data from trials conducted outside the United States that have led to utilization of a novel design using the Bayesian approach. This design was proposed for testing DES with bioresorbable polymer compared with DES most commonly in use today that use durable polymers for drug elution. This prospective, multicenter, randomized, controlled trial is designed to assess the safety and efficacy of the Orsiro bioresorbable polymer sirolimus-eluting stent (BP SES). Up to 1,334 subjects with up to 3 de novo or restenotic coronary artery lesions who qualify for percutaneous coronary intervention with stenting will be randomized 2:1 to the BP SES versus the Xience durable polymer everolimus-eluting stent (DP EES). Data from this trial will be combined with data from 2 similarly designed trials that also randomize subjects to BP SES and DP EES (BIOFLOW II, N=452 and BIOFLOW IV, N=579) by using a Bayesian approach. The primary end point is target lesion failure at 12 months post index procedure, defined as cardiac death, target vessel myocardial infarction, or clinically driven target lesion revascularization, and the primary analysis is a test of noninferiority of the BP SES versus DP EES on the primary end point according to a noninferiority delta of 3.85%. Secondary end points include stent thrombosis and the individual components of target lesion failure. Subjects will be followed for 5 years after randomization. The BIOFLOW V trial offers an opportunity to assess clinical outcomes in patients treated with coronary revascularization using the Orsiro BP SES relative to a commonly used DP EES. The use of a Bayesian analysis combines a large randomized cohort of patients with two smaller contributing randomized trials to augment the efficiency of the comparison. Copyright © 2017 Elsevier Inc. All rights reserved.
Narrow log-periodic modulations in non-Markovian random walks
NASA Astrophysics Data System (ADS)
Diniz, R. M. B.; Cressoni, J. C.; da Silva, M. A. A.; Mariz, A. M.; de Araújo, J. M.
2017-12-01
What are the necessary ingredients for log-periodicity to appear in the dynamics of a random walk model? Can they be subtle enough to be overlooked? Previous studies suggest that long-range damaged memory and negative feedback together are necessary conditions for the emergence of log-periodic oscillations. The role of negative feedback would then be crucial, forcing the system to change direction. In this paper we show that small-amplitude log-periodic oscillations can emerge when the system is driven by positive feedback. Due to their very small amplitude, these oscillations can easily be mistaken for numerical finite-size effects. The models we use consist of discrete-time random walks with strong memory correlations where the decision process is taken from memory profiles based either on a binomial distribution or on a delta distribution. Anomalous superdiffusive behavior and log-periodic modulations are shown to arise in the large time limit for convenient choices of the model parameters.
Elephant random walks and their connection to Pólya-type urns
NASA Astrophysics Data System (ADS)
Baur, Erich; Bertoin, Jean
2016-11-01
In this paper, we explain the connection between the elephant random walk (ERW) and an urn model à la Pólya and derive functional limit theorems for the former. The ERW model was introduced in [Phys. Rev. E 70, 045101 (2004), 10.1103/PhysRevE.70.045101] to study memory effects in a highly non-Markovian setting. More specifically, the ERW is a one-dimensional discrete-time random walk with a complete memory of its past. The influence of the memory is measured in terms of a memory parameter p between zero and one. In the past years, a considerable effort has been undertaken to understand the large-scale behavior of the ERW, depending on the choice of p . Here, we use known results on urns to explicitly solve the ERW in all memory regimes. The method works as well for ERWs in higher dimensions and is widely applicable to related models.
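A compact simulation of the ERW as defined above; the memory parameter and run length are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def elephant_walk(p=0.75, steps=10_000):
    """Elephant random walk: at each step the walker recalls one uniformly
    random past step and repeats it with probability p (reverses it with
    probability 1 - p); the first step is +1 or -1 with equal chance."""
    taken = np.empty(steps, dtype=np.int8)
    taken[0] = rng.choice((-1, 1))
    for n in range(1, steps):
        recalled = taken[rng.integers(n)]         # complete memory of past
        taken[n] = recalled if rng.random() < p else -recalled
    return taken.cumsum()

# For p > 3/4 the walk is superdiffusive; p = 1/2 recovers simple diffusion.
x = elephant_walk()
print("final position:", x[-1])
```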
Descriptive parameter for photon trajectories in a turbid medium
NASA Astrophysics Data System (ADS)
Gandjbakhche, Amir H.; Weiss, George H.
2000-06-01
In many applications of laser techniques for diagnostic or therapeutic purposes it is necessary to be able to characterize photon trajectories to know which parts of the tissue are being interrogated. In this paper, we consider the cw reflectance experiment on a semi-infinite medium with uniform optical parameters and a planar interface. The analysis is carried out in terms of a continuous-time random walk, and the relation between the occupancy of a plane parallel to the surface and the maximum depth reached by the random walker is studied. The first moment of the ratio of average depth to average maximum depth yields information about the volume of tissue interrogated as well as giving some indication of the region of tissue that gets the most light. We have also calculated the standard deviation of this random variable. It is not large enough to qualitatively affect information contained in the first moment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradonjic, Milan; Hagberg, Aric; Hengartner, Nick
We analyze component evolution in general random intersection graphs (RIGs) and give conditions for the existence and uniqueness of the giant component. Our techniques generalize the existing methods for the analysis of component evolution in RIGs: we analyze survival and extinction properties of a dependent, inhomogeneous Galton-Watson branching process on general RIGs. Our analysis relies on bounding the branching processes and inherits the fundamental concepts from the study of component evolution in Erdos-Renyi graphs. The main challenge comes from the underlying structure of RIGs, where the number of offspring follows a binomial distribution with a different number of nodes and a different rate at each step of the evolution. RIGs can be interpreted as a model for large, randomly formed non-metric data sets. Besides the mathematical analysis of component evolution, which we provide in this work, we perceive RIGs as an important random structure that has already found applications in social networks, epidemic networks, blog readership, and wireless sensor networks.
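For concreteness, a minimal generator for the binomial random intersection graph G(n, m, p) follows, with illustrative parameters: each node independently samples an attribute set, and two nodes are joined whenever their sets intersect. This is the standard construction only, not the authors' branching-process analysis.

```python
import itertools
import random

def random_intersection_graph(n=200, m=50, p=0.05, seed=0):
    """G(n, m, p): each of n nodes picks each of m attributes independently
    with probability p; nodes are adjacent iff their attribute sets meet."""
    rng = random.Random(seed)
    attrs = [{a for a in range(m) if rng.random() < p} for _ in range(n)]
    edges = [(u, v) for u, v in itertools.combinations(range(n), 2)
             if attrs[u] & attrs[v]]
    return edges

print(len(random_intersection_graph()), "edges")
```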
Revisiting sample size: are big trials the answer?
Lurati Buse, Giovanna A L; Botto, Fernando; Devereaux, P J
2012-07-18
The superiority of the evidence generated in randomized controlled trials over observational data is not conditional on randomization alone. Randomized controlled trials require proper design and implementation to provide a reliable effect estimate. Adequate random sequence generation, allocation implementation, analyses based on the intention-to-treat principle, and sufficient power are crucial to the quality of a randomized controlled trial. Power, or the probability that the trial will detect a difference when a real difference between treatments exists, strongly depends on sample size. The quality of orthopaedic randomized controlled trials is frequently threatened by a limited sample size. This paper reviews basic concepts and pitfalls in sample-size estimation and focuses on the importance of large trials in the generation of valid evidence.
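As a worked illustration of how the detectable difference drives sample size, the sketch below computes the per-arm size needed for a two-proportion comparison with statsmodels; the event rates are hypothetical.

```python
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

# Per-arm sample size to detect a drop in event rate from 20% to 15%
# with 80% power at two-sided alpha = 0.05 (illustrative numbers).
es = proportion_effectsize(0.20, 0.15)
n = NormalIndPower().solve_power(effect_size=es, alpha=0.05, power=0.8,
                                 alternative="two-sided")
print(round(n))   # ~450 patients per arm for this modest difference
```

Halving the absolute difference roughly quadruples the required size, which is why definitive trials of modest treatment effects must be large.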
Najafi, Zahra; Taghadosi, Mohsen; Sharifi, Khadijeh; Farrokhian, Alireza; Tagharrobi, Zahra
2014-01-01
Background: Anxiety is an important mental health problem in patients with cardiac disease. Anxiety reduces patients’ quality of life and increases the risk of different cardiac complications. Objectives: The aim of this study was to investigate the effects of inhalation aromatherapy on anxiety in patients with myocardial infarction. Patients and Methods: This was a randomized clinical trial conducted on 68 patients with myocardial infarction hospitalized in coronary care units of a large-scale teaching hospital affiliated to Kashan University of Medical Sciences, Kashan, Iran in 2013. By using the block randomization technique, patients were randomly assigned to experimental (33 patients receiving inhalation aromatherapy with lavender aroma twice a day for two subsequent days) and control (35 patients receiving routine care of the study setting, including no aromatherapy) groups. At the beginning of the study and twenty minutes after each aromatherapy session, the anxiety state of patients was assessed using Spielberger’s State Anxiety Inventory. Data were analyzed using SPSS v. 16.0. We used Chi-square, Fisher’s exact, independent-samples T-test and repeated measures analysis of variance to analyze the study data. Results: The study groups did not differ significantly regarding baseline anxiety mean and demographic characteristics. However, after the administration of aromatherapy, the anxiety mean in the experimental group was significantly lower than in the control group. Conclusions: Inhalation aromatherapy with lavender aroma can reduce anxiety in patients with myocardial infarction. Consequently, healthcare providers, particularly nurses, can use this strategy to improve post-myocardial infarction anxiety management. PMID:25389481
Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, e...
Effect of physical activity on frailty and associated negative outcomes: the LIFE randomized trial
USDA-ARS?s Scientific Manuscript database
Background: Limited evidence suggests that physical activity may prevent frailty and associated negative outcomes in older adults. Definitive data from large, long-term, randomized trials are lacking. Objective: To determine whether a long-term structured moderate-intensity physical activity (PA) p...
Investigation of estimators of probability density functions
NASA Technical Reports Server (NTRS)
Speed, F. M.
1972-01-01
Four research projects are summarized which include: (1) the generation of random numbers on the IBM 360/44, (2) statistical tests used to check out random number generators, (3) Specht density estimators, and (4) use of estimators of probability density functions in analyzing large amounts of data.
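As an example of item (2), one classic check of a random number generator is a chi-square test of uniformity. The Python sketch below is generic and is not the report's actual test suite.

import numpy as np
from scipy.stats import chisquare

draws = np.random.random(100_000)               # generator under test
observed, _ = np.histogram(draws, bins=20, range=(0.0, 1.0))
stat, p = chisquare(observed)                   # H0: all bins equally likely
print(f"chi2 = {stat:.1f}, p = {p:.3f}")        # a small p flags non-uniformity

A single test is not sufficient in practice; generators are normally run through batteries of such tests (equidistribution, serial correlation, runs) before being trusted.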
Muscedere, John; Maslove, David; Boyd, John Gordon; O'Callaghan, Nicole; Lamontagne, Francois; Reynolds, Steven; Albert, Martin; Hall, Rick; McGolrick, Danielle; Jiang, Xuran; Day, Andrew G
2016-09-29
Nosocomial infections remain an important source of morbidity, mortality, and increased health care costs in hospitalized patients. This is particularly problematic in intensive care units (ICUs) because of increased patient vulnerability due to the underlying severity of illness and increased susceptibility from utilization of invasive therapeutic and monitoring devices. Lactoferrin (LF) and the products of its breakdown have multiple biological effects, which make its utilization of interest for the prevention of nosocomial infections in the critically ill. This is a phase II randomized, multicenter, double-blinded trial to determine the effect of LF on antibiotic-free days in mechanically ventilated, critically ill, adult patients in the ICU. Eligible, consenting patients will be randomized to receive either LF or placebo. The treating clinician will remain blinded to allocation during the study; blinding will be maintained by using opaque syringes and containers. The primary outcome will be antibiotic-free days, defined as the number of days alive and free of antibiotics in the 28 days after randomization. Secondary outcomes will include: antibiotic utilization, adjudicated diagnosis of nosocomial infection (longer than 72 h of admission to ICU), hospital and ICU length of stay, change in organ function after randomization, hospital and 90-day mortality, incidence of tracheal colonization, changes in gastrointestinal permeability, and immune function. Outcomes to inform the conduct of a larger definitive trial will also be evaluated, including feasibility as determined by recruitment rates and protocol adherence. The results from this study are expected to provide insight into a potential novel therapeutic use for LF in critically ill adult patients. Further, analysis of study outcomes will inform a future, large-scale phase III randomized controlled trial powered on clinically important outcomes related to the use of LF. The trial was registered at www.ClinicalTrials.gov on 18 November 2013 (NCT01996579).
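The primary outcome has a simple arithmetic definition, which a few lines of Python make concrete. This is our illustration of one possible convention, not the trial's analysis code.

def antibiotic_free_days(death_day, antibiotic_days, horizon=28):
    # Days alive and free of antibiotics within the 28 days after
    # randomization. death_day is None if the patient survives the window;
    # antibiotic_days is the set of days on which antibiotics were given.
    # Counting the day of death as alive is an assumption; protocols vary.
    last = horizon if death_day is None else min(death_day, horizon)
    return sum(1 for day in range(1, last + 1) if day not in antibiotic_days)

print(antibiotic_free_days(None, {1, 2, 3}))  # 25
print(antibiotic_free_days(10, {1, 2, 3}))    # 7: died on day 10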
Jo, Chris Hyunchul; Shin, Ji Sun; Shin, Won Hyoung; Lee, Seung Yeon; Yoon, Kang Sup; Shin, Sue
2015-09-01
Two main questions about the use of platelet-rich plasma (PRP) for regeneration purposes are its effect on the speed of healing and the quality of healing. Despite numerous recent studies, evidence is still lacking in this area, especially in a representative patient population with medium to large rotator cuff tears. To assess the efficacy of PRP augmentation on the speed and quality of healing in patients undergoing arthroscopic repair for medium to large rotator cuff tears. Randomized controlled trial; Level of evidence, 1. A total of 74 patients scheduled for arthroscopic repair of medium to large rotator cuff tears were randomly assigned to undergo either PRP-augmented repair (PRP group) or conventional repair (conventional group). In the PRP group, 3 PRP gels (3 × 3 mL) were applied to each patient between the torn end and the greater tuberosity. The primary outcome was the Constant score at 3 months after surgery. Secondary outcome measures included the visual analog scale (VAS) for pain, range of motion (ROM), muscle strength, overall satisfaction and function, functional scores, retear rate, and change in the cross-sectional area (CSA) of the supraspinatus muscle. There was no difference between the 2 groups in the Constant score at 3 months (P > .05). The 2 groups had similar results on the VAS for pain, ROM, muscle strength, overall satisfaction and function, and other functional scores (all P > .05) except for the VAS for worst pain (P = .043). The retear rate of the PRP group (3.0%) was significantly lower than that of the conventional group (20.0%) (P = .032). The change in 1-year postoperative and immediately postoperative CSAs was significantly different between the 2 groups: -36.76 ± 45.31 mm² in the PRP group versus -67.47 ± 47.26 mm² in the conventional group (P = .014). Compared with repairs without PRP augmentation, the current PRP preparation and application methods for medium to large rotator cuff repairs significantly improved the quality, as evidenced by a decreased retear rate and increased CSA of the supraspinatus, but not the speed of healing. However, further studies may be needed to investigate the effects of PRP on the speed of healing without risking the quality.
2014-01-01
Background It is well established in studies across several countries that tobacco smoking is more prevalent among schizophrenic patients than in the general population. Electronic cigarettes are becoming increasingly popular with smokers worldwide. To date there are no large randomized trials of electronic cigarettes in schizophrenic smokers. A well-designed trial is needed to compare the efficacy and safety of these products in this special population. Methods/Design Intervention: We have designed a randomized controlled trial investigating the efficacy and safety of electronic cigarettes. The trial will take the form of a prospective 12-month randomized clinical study to evaluate smoking reduction, smoking abstinence and adverse events in schizophrenic smokers not intending to quit. We will also monitor quality of life and neurocognitive functioning, and measure participants’ perception and satisfaction with the product. Outcome measures: A ≥50% reduction in the number of cigarettes/day from baseline will be calculated at each study visit (“reducers”). Abstinence from smoking will be calculated at each study visit (“quitters”). Smokers who leave the study before its completion and carry out the Early Termination Visit, or who do not satisfy the criteria of “reducers” or “quitters”, will be defined as “non-responders”. Statistical analysis: The differences in continuous variables between the three groups will be evaluated with the Kruskal-Wallis test, followed by the Dunn multiple comparison test. The differences between the three groups for normally distributed data will be evaluated with one-way ANOVA, followed by the Newman-Keuls multiple comparison test. The normality of the distribution will be evaluated with the Kolmogorov-Smirnov test. Any correlations between the variables under evaluation will be assessed by Spearman r correlation. Qualitative data will be compared using the Chi-square test. Discussion The main strengths of the SCARIS study are the following: it is the first large RCT in schizophrenic patients, it involves both inpatients and outpatients, it evaluates a three-arm study design, and it has a long-term follow-up (52 weeks). The goal is to propose an effective intervention to reduce the risk of tobacco smoking, as a complementary tool to treat tobacco addiction in schizophrenia. Trial registration ClinicalTrials.gov, NCT01979796. PMID:24655473
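For orientation, the analysis plan maps directly onto standard scipy calls. The compact sketch below uses synthetic data; the variables are placeholders, not the SCARIS dataset, and post hoc procedures such as the Dunn and Newman-Keuls tests live in add-on packages (e.g. scikit-posthocs) rather than in scipy itself.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
g1, g2, g3 = (rng.normal(10, 2, 40) for _ in range(3))   # three study arms

print(stats.kstest((g1 - g1.mean()) / g1.std(ddof=1), "norm"))  # normality check
print(stats.f_oneway(g1, g2, g3))    # one-way ANOVA for normally distributed data
print(stats.kruskal(g1, g2, g3))     # Kruskal-Wallis test otherwise
print(stats.spearmanr(g1, g2))       # Spearman correlation between variables
print(stats.chi2_contingency(np.array([[20, 20], [15, 25]]))[1])  # Chi-square p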
ERIC Educational Resources Information Center
Jucovy, Linda; Herrera, Carla
2009-01-01
This issue of "Public/Private Ventures (P/PV) In Brief" is based on "High School Students as Mentors," a report that examined the efficacy of high school mentors using data from P/PV's large-scale random assignment impact study of Big Brothers Big Sisters school-based mentoring programs. The brief presents an overview of the findings, which…
ERIC Educational Resources Information Center
Pete, Judith; Mulder, Fred; Neto, Jose Dutra Oliveira
2017-01-01
In order to obtain a fair "OER picture" for the Global South a large-scale study has been carried out for a series of countries, including Kenya. In this paper we report on the Kenya study, run at four universities that have been selected with randomly sampled students and lecturers. Empirical data have been generated by the use of a…
Langford, R M; Mares, J; Novotna, A; Vachova, M; Novakova, I; Notcutt, W; Ratcliffe, S
2013-04-01
Central neuropathic pain (CNP) occurs in many multiple sclerosis (MS) patients. The provision of adequate pain relief to these patients can be very difficult. Here we report the first phase III placebo-controlled study of the efficacy of the endocannabinoid system modulator delta-9-tetrahydrocannabinol (THC)/cannabidiol (CBD) oromucosal spray (USAN name, nabiximols; Sativex, GW Pharmaceuticals, Salisbury, Wiltshire, UK) to alleviate CNP. Patients who had failed to gain adequate analgesia from existing medication were treated with THC/CBD spray or placebo as an add-on treatment, in a double-blind manner, for 14 weeks to investigate the efficacy of the medication in MS-induced neuropathic pain. This parallel-group phase of the study was then followed by an 18-week randomized-withdrawal study (14-week open-label treatment period plus a double-blind 4-week randomized-withdrawal phase) to investigate time to treatment failure and show maintenance of efficacy. A total of 339 patients were randomized to phase A (167 received THC/CBD spray and 172 received placebo). Of those who completed phase A, 58 entered the randomized-withdrawal phase. The primary endpoint of responder analysis at the 30% level at week 14 of phase A of the study was not met, with 50% of patients on THC/CBD spray classed as responders at the 30% level compared to 45% of patients on placebo (p = 0.234). However, an interim analysis at week 10 showed a statistically significant treatment difference in favor of THC/CBD spray at this time point (p = 0.046). During the randomized-withdrawal phase, the primary endpoint of time to treatment failure was statistically significant in favor of THC/CBD spray, with 57% of patients receiving placebo failing treatment versus 24% of patients from the THC/CBD spray group (p = 0.04). The mean changes from baseline in Pain Numerical Rating Scale (NRS) (p = 0.028) and sleep quality NRS (p = 0.015) scores, both secondary endpoints in phase B, were also statistically significant compared to placebo, with estimated treatment differences of -0.79 and 0.99 points, respectively, in favor of THC/CBD spray treatment. The results of the current investigation were equivocal, with conflicting findings in the two phases of the study. While there was a large proportion of responders to THC/CBD spray treatment during the phase A double-blind period, the primary endpoint was not met due to a similarly large number of placebo responders. In contrast, there was a marked effect in phase B of the study, with an increased time to treatment failure in the THC/CBD spray group compared to placebo. These findings suggest that further studies are required to explore the full potential of THC/CBD spray in these patients.
Shirazi, M; Zeinaloo, A A; Parikh, S V; Sadeghi, M; Taghva, A; Arbabi, M; Kashani, A Sabouri; Alaeddini, F; Lonka, K; Wahlström, R
2008-04-01
The Prochaska model of readiness to change has been proposed for use in educational interventions to improve medical care. The aim was to evaluate the impact on readiness to change of an educational intervention on the management of depressive disorders, based on a modified version of the Prochaska model, in comparison with a standard programme of continuing medical education (CME). This is a randomized controlled trial within primary care practices in southern Tehran, Iran. The participants were 192 general physicians working in primary care (GPs), recruited after random selection and randomized to intervention (n = 96) and control (n = 96) groups. The intervention consisted of interactive, learner-centred educational methods in large and small group settings, depending on the GPs' stages of readiness to change. Change in stage of readiness to change, measured by the modified version of the Prochaska questionnaire, was the primary outcome measure. The final number of participants was 78 (81%) in the intervention arm and 81 (84%) in the control arm. Significantly more GPs in the intervention group (57/96 = 59% versus 12/96 = 12%; P < 0.01) changed to higher stages of readiness to change. The intervention effect was 46 percentage points (P < 0.001) and 50 percentage points (P < 0.001) in the large and small group settings, respectively. Educational formats that suit different stages of learning can support primary care doctors in reaching higher stages of behavioural change on the topic of depressive disorders. Our findings have practical implications for conducting CME programmes in Iran and are possibly also applicable in other parts of the world.
Optimal Detection of a Localized Perturbation in Random Networks of Integrate-and-Fire Neurons.
Bernardi, Davide; Lindner, Benjamin
2017-06-30
Experimental and theoretical studies suggest that cortical networks are chaotic and coding relies on averages over large populations. However, there is evidence that rats can respond to the short stimulation of a single cortical cell, a theoretically unexplained fact. We study effects of single-cell stimulation on a large recurrent network of integrate-and-fire neurons and propose a simple way to detect the perturbation. Detection rates obtained from simulations and analytical estimates are similar to experimental response rates if the readout is slightly biased towards specific neurons. Near-optimal detection is attained for a broad range of intermediate values of the mean coupling between neurons.
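A toy version of such a simulation can convey the idea, although it is far simpler than the model in the paper: Euler integration, instantaneous synapses, no separate readout population, and all parameters chosen arbitrarily for illustration.

import numpy as np

def simulate(N=500, steps=1000, dt=0.1, tau=10.0, v_th=1.0,
             p=0.1, J=0.02, mu=1.05, stim=None, seed=0):
    # Toy recurrent network of integrate-and-fire neurons.
    # stim: index of a single neuron given extra drive, or None.
    rng = np.random.default_rng(seed)
    # sparse random coupling with mixed excitation and inhibition
    W = (rng.random((N, N)) < p) * J * rng.choice([-1.0, 1.0], (N, N))
    v = rng.random(N) * v_th
    count = np.zeros(N)
    for _ in range(steps):
        spikes = v >= v_th
        v[spikes] = 0.0                     # reset after a spike
        count += spikes
        drive = np.full(N, mu)
        if stim is not None:
            drive[stim] += 2.0              # perturb one cell
        v += dt / tau * (-v + drive) + W @ spikes
    return count

# crude "readout": total spike counts with and without the perturbation
# (same seed, so the difference isolates the single-cell stimulation)
print(simulate().sum(), simulate(stim=0).sum())

This only shows how one would probe whether a single-cell perturbation leaves a trace in population activity; the paper's detection rates rest on analytical estimates and a readout biased towards specific neurons.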
Bottiglione, F; Carbone, G
2015-01-14
The apparent contact angle of large 2D drops with randomly rough self-affine profiles is numerically investigated. The numerical approach is based upon the assumption of a large separation of length scales, i.e. it is assumed that the roughness length scales are much smaller than the drop size, thus making it possible to treat the problem through a mean-field-like approach relying on the large separation of scales. The apparent contact angle at equilibrium is calculated in all wetting regimes, from full wetting (Wenzel state) to partial wetting (Cassie state). It was found that for very large values of the Wenzel roughness parameter (r_W > -1/cos θ_Y, where θ_Y is Young's contact angle), the interface approaches the perfect non-wetting condition and the apparent contact angle is almost equal to 180°. The results are compared with the case of roughness on a single scale (a sinusoidal surface), and it is found that, for the same value of the Wenzel roughness parameter r_W, the apparent contact angle is much larger for the randomly rough surface, proving that the multi-scale character of randomly rough surfaces is a key factor in enhancing superhydrophobicity. Moreover, it is shown that for millimetre-sized drops the actual drop pressure at static equilibrium weakly affects the wetting regime, which instead seems to be dominated by the roughness parameter. For this reason a methodology to estimate the apparent contact angle is proposed, which relies only upon the micro-scale properties of the rough surface.
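The threshold quoted above follows in one line from the classical Wenzel relation; a short derivation in LaTeX, assuming the standard Wenzel model that the abstract's notation suggests:

% Wenzel relation for the apparent contact angle on a rough surface:
\cos\theta_W = r_W \cos\theta_Y
% For a hydrophobic surface, \theta_Y > 90^\circ and \cos\theta_Y < 0, so the
% right-hand side reaches its floor of -1 (that is, \theta_W = 180^\circ) at
r_W = -\frac{1}{\cos\theta_Y}
% Any rougher surface, r_W > -1/\cos\theta_Y, therefore sits at the perfect
% non-wetting limit described in the abstract.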
Dahabreh, Issa J.; Sheldrick, Radley C.; Paulus, Jessica K.; Chung, Mei; Varvarigou, Vasileia; Jafri, Haseeb; Rassen, Jeremy A.; Trikalinos, Thomas A.; Kitsios, Georgios D.
2012-01-01
Aims Randomized controlled trials (RCTs) are the gold standard for assessing the efficacy of therapeutic interventions because randomization protects from biases inherent in observational studies. Propensity score (PS) methods, proposed as a potential solution to confounding of the treatment–outcome association, are widely used in observational studies of therapeutic interventions for acute coronary syndromes (ACS). We aimed to systematically assess agreement between observational studies using PS methods and RCTs on therapeutic interventions for ACS. Methods and results We searched for observational studies of interventions for ACS that used PS methods to estimate treatment effects on short- or long-term mortality. Using a standardized algorithm, we matched observational studies to RCTs based on patients’ characteristics, interventions, and outcomes (‘topics’), and we compared estimates of treatment effect between the two designs. When multiple observational studies or RCTs were identified for the same topic, we performed a meta-analysis and used the summary relative risk for comparisons. We matched 21 observational studies investigating 17 distinct clinical topics to 63 RCTs (median = 3 RCTs per observational study) for short-term (7 topics) and long-term (10 topics) mortality. Estimates from PS analyses differed statistically significantly from randomized evidence in two instances; however, observational studies reported more extreme beneficial treatment effects compared with RCTs in 13 of 17 instances (P = 0.049). Sensitivity analyses limited to large RCTs, and using alternative meta-analysis models yielded similar results. Conclusion For the treatment of ACS, observational studies using PS methods produce treatment effect estimates that are of more extreme magnitude compared with those from RCTs, although the differences are rarely statistically significant. PMID:22711757
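A minimal sketch of the propensity-score workflow the review evaluates: a logistic model for treatment assignment, then 1:1 nearest-neighbor matching on the estimated score. This is generic scikit-learn usage on synthetic data, not the authors' procedure.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
X = rng.normal(size=(2000, 5))                             # measured confounders
treated = rng.random(2000) < 1 / (1 + np.exp(-X[:, 0]))    # confounded assignment

ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]
t_idx, c_idx = np.where(treated)[0], np.where(~treated)[0]
# nearest-neighbor match on the propensity score, with replacement
matches = c_idx[np.abs(ps[c_idx][None, :] - ps[t_idx][:, None]).argmin(axis=1)]

# crude balance check: confounder mean for treated, all controls, matched controls
print(X[t_idx, 0].mean(), X[c_idx, 0].mean(), X[matches, 0].mean())

Matching balances only the covariates that enter the model; unmeasured confounding remains, which is consistent with the more extreme treatment effects the review documents in observational studies.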
Antman, Karen
2002-01-01
High dose therapy for breast cancer remains controversial. Of the 15 randomized trials of high dose therapy in breast cancer reported to date, two South African studies have been discredited, leaving 13 remaining studies. Mortality was consistently low, in the 0 to 2.5% range, except for the BCNU-containing American Intergroup study, which had a 7.4% toxic mortality rate. Seven of the remaining 13 studies randomized fewer than 200 patients. Three of these small studies have significant differences in disease-free survival, and a fourth study has a trend in favor of high dose therapy. The other three small studies cannot exclude a survival difference of 20%. Of the 6 remaining moderately large trials of 219 to 885 randomized patients, 5 are adjuvant studies and one included patients with metastatic disease. Of the five adjuvant trials, four have significant differences in relapse rate favoring the high dose arm, and the remaining study has a trend (with a high dose sequential single-agent design rather than combination therapy as in the other studies). A planned subset analysis of the first 284 patients in the largest study, funded by the Dutch insurance industry, showed a significant advantage for high dose therapy. Given the 2-year median time to relapse and an additional 2-year median to death after relapse, the follow-up for survival of 3-5 years on these studies is still short. In the only moderately sized metastatic trial, from the National Cancer Institute of Canada, with a very short median follow-up of 19 months, a significant difference in disease-free survival has emerged, with no difference in survival. PMID:12053718
Sommer, Anders; Kronborg, Mads Brix; Poulsen, Steen Hvitfeldt; Böttcher, Morten; Nørgaard, Bjarne Linde; Bouchelouche, Kirsten; Mortensen, Peter Thomas; Gerdes, Christian; Nielsen, Jens Cosedis
2013-04-26
Cardiac resynchronization therapy (CRT) is an established treatment in heart failure patients. However, a large proportion of patients remain nonresponsive to this pacing strategy. Left ventricular (LV) lead position is one of the main determinants of response to CRT. This study aims to clarify whether multimodality imaging-guided LV lead placement improves clinical outcome after CRT. The ImagingCRT study is a prospective, randomized, patient- and assessor-blinded, two-armed trial. The study is designed to investigate the effect of imaging-guided left ventricular lead positioning on a clinical composite primary endpoint comprising all-cause mortality, hospitalization for heart failure, or unchanged or worsened functional capacity (no improvement in New York Heart Association class and <10% improvement in six-minute-walk test). Imaging-guided LV lead positioning is targeted to the latest activated non-scarred myocardial region by speckle tracking echocardiography, single-photon emission computed tomography, and cardiac computed tomography. Secondary endpoints include changes in LV dimensions, ejection fraction, and dyssynchrony. A total of 192 patients are included in the study. Despite tremendous advances in knowledge with CRT, the proportion of patients not responding to this treatment has remained stable since the introduction of CRT. ImagingCRT is a prospective, randomized study assessing the clinical and echocardiographic effects of multimodality imaging-guided LV lead placement in CRT. The results are expected to make an important contribution in the pursuit of increasing the response rate to CRT. Clinicaltrials.gov identifier NCT01323686. The trial was registered March 25, 2011, and the first study subject was randomized April 11, 2011.
Enama, Mary E.; Hu, Zonghui; Gordon, Ingelise; Costner, Pamela; Ledgerwood, Julie E.; Grady, Christine
2012-01-01
Background Consent to participate in research is an important component of the conduct of ethical clinical trials. Current consent practices are largely policy-driven. This study was conducted to assess comprehension of study information and satisfaction with the consent form between subjects randomized to concise or to standard informed consent forms, as one approach to developing evidence-based consent practices. Methods Participants (N=111) who enrolled into two Phase I investigational influenza vaccine protocols (VRC 306 and VRC 307) at the NIH Clinical Center were randomized to one of two IRB-approved consent forms, either a standard or a concise form. Concise consent forms had, on average, 63% fewer words. All other aspects of the consent process were the same. Questionnaires about the study and the consent process were completed at enrollment and at the last visit in both studies. Results Subjects using concise consent forms scored as well as those using standard-length consent forms on measures of comprehension (7 versus 7, p=0.79 and 20 versus 21, p=0.13); however, the trend was for the concise consent group to report feeling better informed. Both groups thought the length and detail of the consent form were appropriate. Conclusions Randomization of study subjects to different-length IRB-approved consent forms, as one method for developing evidence-based consent practices, resulted in no differences in study comprehension or satisfaction with the consent form. A concise consent form may be used ethically in the context of a consent process conducted by well-trained staff with opportunities for discussion and education throughout the study. PMID:22542645
ERIC Educational Resources Information Center
Cohen, Dale; Tracy, Ryan; Cohen, Jon
2017-01-01
This study examined the effectiveness and influence on validity of a computer-based pop-up English glossary accommodation for English learners (ELs) in grades 3 and 7. In a randomized controlled trial, we administered pop-up English glossaries with audio to students taking a statewide accountability English language arts (ELA) and mathematics…
Economic feasibility of products from inland West small-diameter timber
Spelter Henry; Rong Wang; Peter Ince
1996-01-01
A large part of the forests located in the Rocky Mountain region of the U.S. West (inland West) is characterized by densely packed, small-diameter stands. The purpose of this study was to examine the economic feasibility of using small-diameter material from this resource to manufacture various wood products: oriented strandboard (OSB), stud lumber, random-length...
Producing a functional eukaryotic messenger RNA (mRNA) requires the coordinated activity of several large protein complexes to initiate transcription, elongate nascent transcripts, splice together exons, and cleave and polyadenylate the 3’ end. Kinetic competition between these various processes has been proposed to regulate mRNA maturation, but this model could lead to
ERIC Educational Resources Information Center
Hua, Haiyan; Burchfield, Shirley
A large-scale longitudinal study in Bolivia examined the relationship between adult women's basic education and their social and economic well-being and development. A random sample of 1,600 participants and 600 nonparticipants, aged 15-45, was tracked for 3 years (the final sample included 717 participants and 224 controls). The four adult…
Applying Neural Networks in Optical Communication Systems: Possible Pitfalls
NASA Astrophysics Data System (ADS)
Eriksson, Tobias A.; Bulow, Henning; Leven, Andreas
2017-12-01
We investigate the risk of overestimating the performance gain when applying neural-network-based receivers in systems with pseudo-random bit sequences or with limited memory depths, resulting in repeated short patterns. We show that with such sequences a large artificial gain can be obtained, which comes from pattern prediction rather than from prediction or compensation of the studied channel/phenomena.
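The pitfall is easy to reproduce, because a pseudo-random bit sequence comes from a linear-feedback shift register and repeats deterministically. Below is a toy PRBS7 generator in Python (a common construction for the polynomial x^7 + x^6 + 1; the framing is our illustration, not the paper's code).

def prbs7(n_bits, state=0b1111111):
    # PRBS7 via a 7-bit linear-feedback shift register; period 127.
    out = []
    for _ in range(n_bits):
        new = ((state >> 6) ^ (state >> 5)) & 1   # feedback from the top two bits
        out.append(state & 1)
        state = ((state << 1) | new) & 0x7F
    return out

bits = prbs7(254)
print(bits[:127] == bits[127:254])  # True: the sequence repeats exactly

A learner whose input window covers enough of the register state can "predict" such a stream by memorizing the recurrence, which produces gains that do not transfer to truly random data.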
Course Shopping in Urban Community Colleges: An Analysis of Student Drop and Add Activities
ERIC Educational Resources Information Center
Hagedorn, Linda Serra; Maxwell, William E.; Cypers, Scott; Moon, Hye Sun; Lester, Jaime
2007-01-01
This study examined the course shopping behaviors among a sample of approximately 5,000 community college students enrolled across nine campuses of a large urban district. The sample was purposely designed as an analytic, rather than a random, sample that sought to obtain adequate numbers of students in course areas that were of theoretical and of…