Sample records for dimeticone randomized observer

  1. A highly efficacious pediculicide based on dimeticone: Randomized observer blinded comparative trial

    PubMed Central

    Heukelbach, Jorg; Pilger, Daniel; Oliveira, Fabíola A; Khakban, Adak; Ariza, Liana; Feldmeier, Hermann

    2008-01-01

    Background Infestation with the human head louse (Pediculus humanus capitis) occurs worldwide. Existing treatment options are limited, and reports of resistance to commonly used pediculicides have been increasing. In this trial we assessed the efficacy of a product containing a high (92%) concentration of the silicone oil dimeticone (identical in composition to NYDA®), as compared to a 1% permethrin lotion. Methods Randomized, controlled, observer blinded clinical trial. Participants were recruited from a poor urban neighbourhood in Brazil where pediculosis capitis was highly prevalent. To minimize reinfestation during the trial, participants (145 children aged 5–15 years with head lice infestations) were transferred to a holiday resort outside the endemic area for a period of 9 days. Two applications of dimeticone or 1% permethrin were given, seven days apart. Outcome measures were defined as cure (absence of vital head lice) after the first application and before and after the second application, degree of itching, cosmetic acceptability, and clinical pathology. Results Overall cure rates were: day 2 – dimeticone 94.5% (95% CI: 86.6% – 98.5%) and permethrin 66.7% (95% CI: 54.6% – 77.3%; p < 0.0001); day 7 – dimeticone 64.4% (95% CI: 53.3% – 75.3%) and permethrin 59.7% (95% CI: 47.5% – 71.1%; p = 0.5); day 9 – dimeticone 97.2% (95% CI: 90.3% – 99.7%) and permethrin 67.6% (95% CI: 55.4% – 78.2%; p < 0.0001). Itching was reduced similarly in both groups. Cosmetic acceptability was significantly better in the dimeticone group as compared to the permethrin group (p = 0.01). Two mild product-related incidents occurred in the dimeticone group. Conclusion The dimeticone product is a safe and highly efficacious pediculicide. Due to its physical mode of action (interruption of the oxygen supply to the louse's central nervous system), development of resistance is unlikely. Trial registration Current Controlled Trials ISRCTN15117709. PMID:18783606
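
    As a rough illustration of how exact confidence intervals like those quoted above can be computed, the sketch below calculates a Clopper-Pearson 95% interval for a cure rate in Python. The counts (69 cured of 73 treated) are assumed for illustration only; the abstract reports percentages rather than the underlying counts, and the authors' exact method is not stated.

        # Minimal sketch: exact (Clopper-Pearson) 95% CI for a binomial cure rate.
        # The counts below are hypothetical stand-ins, not the trial data.
        from scipy.stats import beta

        def clopper_pearson(k, n, alpha=0.05):
            """Exact two-sided (1 - alpha) confidence interval for a proportion k/n."""
            lower = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
            upper = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
            return lower, upper

        k, n = 69, 73                      # hypothetical cured / treated
        lo, hi = clopper_pearson(k, n)
        print(f"cure rate {k/n:.1%}, 95% CI {lo:.1%} - {hi:.1%}")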

  2. Lethal effects of treatment with a special dimeticone formula on head lice and house crickets (Orthoptera, Ensifera: Acheta domestica and Anoplura, Phthiraptera: Pediculus humanus). Insights into physical mechanisms.

    PubMed

    Richling, Ira; Böckeler, Wolfgang

    2008-01-01

    The present study provides the first convincing explanation of the mode of action of the medical device NYDA, a special dimeticone (CAS 9006-65-9) formula containing 92% of two dimeticones with different viscosities specifically designed for the physical treatment of head lice infestations (pediculosis capitis) by suffocation. Both lice (Pediculus humanus) and house crickets (Acheta domestica) treated with this anti-head lice product are knocked down to a state of "no major vital signs" in less than 1 min, a state that is irreversibly followed by the death of the insects. Scanning electron microscopic investigations have revealed that the cuticle is coated by a thin closed layer of the dimeticone formula that also enters the stigmata. In vivo observations and dissections of Acheta domestica have shown that application of the medical device to the thoracic stigmata invariably leads to rapid death; this is strongly correlated with the influx of the special dimeticone formula into the head trachea, whereby the solution effectively blocks the oxygen supply of the central nervous system. Dissections after application of the stained product show that it also enters the finest tracheal branches. Analogous in vivo observations in Pediculus humanus have confirmed the correlation between the disappearance of major vital signs and the displacement of air by the dimeticone formula in the tracheal system of the head. For both insect species, statistical data are provided for the chronological sequence of the filling of the tracheal system in relation to the respective vitality condition of the insects. On average, the special dimeticone formula reaches the insect's head tracheae within 0.5 min in house crickets and in less than 1 min in lice, with complete filling of the entire head tracheal system of lice within 3.5 min. In addition, a timed sequence of images illustrates this process for lice. The experiments clearly reveal the exclusive and purely physical mode of action of the tested dimeticone formula.

  3. Effect of dimeticone and pepsin on the bioavailability of metoclopramide in healthy volunteers.

    PubMed

    do Nascimento, D F; Silva Leite, A L A e; de Moraes, R A; Camarão, G C; Bezerra, F A F; de Moraes, M O; de Moraes, M E A

    2014-10-01

    To assess the effect of dimeticone and pepsin on the bioavailability of metoclopramide (CAS 7232-21-5) in healthy volunteers. The study was conducted using a randomized, open, 2-period crossover design. The volunteers received a single administration of a 7-mg conventional metoclopramide capsule and a formulation containing metoclopramide (7 mg) plus dimeticone (40 mg) and pepsin (50 mg), with a 7-day interval between treatments. Serial blood samples were collected before dosing and during 24 h post-treatment. Plasma metoclopramide concentrations were analyzed by liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS). The pharmacokinetic parameters AUC(last) and C(max) were obtained from the metoclopramide plasma concentration vs. time curves. The metoclopramide combination was bioequivalent to the conventional capsule; the 90% CIs for the geometric mean treatment ratios of C(max) [108.0% (90% CI, 100.4-116.3%)] and AUC(last) [103.3% (90% CI, 99.5-107.4%)] were within the predefined range. The metoclopramide formulations were well tolerated at the administered doses and no significant adverse reactions were observed. Thus, these results confirm the good bioavailability of metoclopramide in the new formulation and rule out any impaired absorption when the drugs are formulated in combination. © Georg Thieme Verlag KG Stuttgart · New York.
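
    The 90% confidence interval for a geometric mean ratio quoted above is conventionally obtained on the log scale. The sketch below shows that calculation in Python for a simple paired comparison, ignoring period and sequence effects for brevity; the Cmax values are invented and this is not the authors' analysis.

        # Bioequivalence-style 90% CI for the geometric mean ratio (test/reference)
        # of Cmax, computed from paired log-scale differences. Hypothetical data;
        # period and sequence effects of the crossover design are ignored here.
        import numpy as np
        from scipy import stats

        cmax_test = np.array([41.2, 35.7, 50.1, 44.8, 38.9, 47.3])   # hypothetical ng/mL
        cmax_ref = np.array([39.0, 34.1, 46.5, 43.2, 37.5, 44.0])

        d = np.log(cmax_test) - np.log(cmax_ref)          # within-subject log differences
        mean_d, se_d = d.mean(), d.std(ddof=1) / np.sqrt(len(d))
        t_crit = stats.t.ppf(0.95, df=len(d) - 1)         # two-sided 90% interval

        gmr = np.exp(mean_d)
        ci = np.exp([mean_d - t_crit * se_d, mean_d + t_crit * se_d])
        print(f"GMR {gmr:.3f}, 90% CI {ci[0]:.3f}-{ci[1]:.3f} (bioequivalence range 0.80-1.25)")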

  4. Treatment of head louse infestation with 4% dimeticone lotion: randomised controlled equivalence trial

    PubMed Central

    Burgess, Ian F; Brown, Christine M; Lee, Peter N

    2005-01-01

    Objective To evaluate the efficacy and safety of 4% dimeticone lotion for treatment of head louse infestation. Design Randomised controlled equivalence trial. Setting Community, with home visits. Participants 214 young people aged 4 to 18 years and 39 adults with active head louse infestation. Interventions Two applications seven days apart of either 4.0% dimeticone lotion, applied for eight hours or overnight, or 0.5% phenothrin liquid, applied for 12 hours or overnight. Outcome measures Cure of infestation (no evidence of head lice after second treatment) or reinfestation after cure. Results Cure or reinfestation after cure occurred in 89 of 127 (70%) participants treated with dimeticone and 94 of 125 (75%) treated with phenothrin (difference -5%, 95% confidence interval -16% to 6%). Per protocol analysis showed that 84 of 121 (69%) participants were cured with dimeticone and 90 of 116 (78%) were cured with phenothrin. Irritant reactions occurred significantly less with dimeticone (3/127, 2%) than with phenothrin (11/125, 9%; difference -6%, -12% to -1%). Per protocol this was 3 of 121 (3%) participants treated with dimeticone and 10 of 116 (9%) treated with phenothrin (difference -6%, -12% to -0.3%). Conclusion Dimeticone lotion cures head louse infestation. Dimeticone seems less irritant than existing treatments and has a physical action on lice that should not be affected by resistance to neurotoxic insecticides. PMID:15951310
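
    The confidence interval for the difference in cure rates reported above can be reproduced approximately from the intention-to-treat counts given in the abstract (89/127 vs 94/125). The sketch below uses the textbook normal-approximation (Wald) interval; the authors' exact method is not stated in the abstract.

        # Normal-approximation (Wald) 95% CI for a difference between two proportions,
        # using the counts quoted in the abstract (89/127 dimeticone vs 94/125 phenothrin).
        import math

        def diff_proportions_ci(k1, n1, k2, n2, z=1.96):
            p1, p2 = k1 / n1, k2 / n2
            diff = p1 - p2
            se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
            return diff, diff - z * se, diff + z * se

        diff, lo, hi = diff_proportions_ci(89, 127, 94, 125)
        print(f"difference {diff:+.1%} (95% CI {lo:+.1%} to {hi:+.1%})")
        # roughly -5% (-16% to 6%), in line with the figures reported above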

  5. The mode of action of dimeticone 4% lotion against head lice, Pediculus capitis

    PubMed Central

    Burgess, Ian F

    2009-01-01

    Background Treatment of head lice using physically acting preparations based on silicones is currently replacing insecticide use due to widespread resistance to neurotoxic agents. It has been postulated that some products act by asphyxiation, although the limited experimental evidence and the anatomy of the louse respiratory system suggest this is unlikely. Results Observation over several hours of lice treated using 4% high molecular weight dimeticone in a volatile silicone base showed that, although rapidly immobilised initially, the insects still exhibited small movements of extremities and death was delayed. One common effect of treatment is inhibition of the louse's ability to excrete water by transpiration through the spiracles. Inability to excrete water that is ingested as part of the louse blood meal appears to subject the louse gut to osmotic stress resulting in rupture. Scanning electron microscopy coupled with X-ray microanalysis to detect silicon showed dimeticone lotion is deposited in the spiracles and distal region of the tracheae of lice and in some cases blocks the lumen or opening entirely. Conclusion This work raises doubts that lice treated using dimeticone preparations die from anoxia despite blockage of the outer respiratory tract because movements can be observed for hours after exposure. However, the blockage inhibits water excretion, which causes physiological stress that leads to death either through prolonged immobilisation or, in some cases, disruption of internal organs such as the gut. PMID:19232080

  6. Single application of 4% dimeticone liquid gel versus two applications of 1% permethrin creme rinse for treatment of head louse infestation: a randomised controlled trial

    PubMed Central

    2013-01-01

    Background A previous study indicated that a single application of 4% dimeticone liquid gel was effective in treating head louse infestation. This study was designed to confirm this in comparison with two applications of 1% permethrin. Methods We have performed a single centre parallel group, randomised, controlled, open label, community based trial, with domiciliary visits, in Cambridgeshire, UK. Treatments were allocated through sealed instructions derived from a computer generated list. We enrolled 90 children and adults with confirmed head louse infestation analysed by intention to treat (80 per-protocol after 4 drop outs and 6 non-compliant). The comparison was between 4% dimeticone liquid gel applied once for 15 minutes and 1% permethrin creme rinse applied for 10 minutes, repeated after 7 days as per manufacturer’s directions. The outcome was elimination of louse infestation after completion of the treatment application regimen. Results Intention to treat comparison of a single dimeticone liquid gel treatment with two of permethrin gave success for 30/43 (69.8%) of the dimeticone liquid gel group and 7/47 (14.9%) of the permethrin creme rinse group (OR 13.19, 95% CI 4.69 to 37.07) (p < 0.001). Per protocol results were similar with 27/35 (77.1%) success for dimeticone versus 7/45 (15.6%) for permethrin. Analyses by household gave essentially similar outcomes. Conclusions The study showed one 15 minute application of 4% dimeticone liquid gel was superior to two applications of 1% permethrin creme rinse (p < 0.001). The low efficacy of permethrin suggests it should be withdrawn. Trial registration Current Controlled Trials ISRCTN88144046. PMID:23548062
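
    The odds ratio reported above follows directly from the intention-to-treat counts (30/43 successes with dimeticone vs 7/47 with permethrin). The sketch below computes it with a log-scale (Woolf) 95% confidence interval; it illustrates the arithmetic rather than the authors' statistical software.

        # Odds ratio with a log-scale (Woolf) 95% CI from a 2x2 table.
        import math

        def odds_ratio_ci(a, b, c, d, z=1.96):
            """a, b = successes/failures in group 1; c, d = successes/failures in group 2."""
            or_ = (a * d) / (b * c)
            se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
            return or_, math.exp(math.log(or_) - z * se_log), math.exp(math.log(or_) + z * se_log)

        print(odds_ratio_ci(30, 13, 7, 40))   # about (13.2, 4.7, 37.1), matching the abstract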

  7. Treatment of head lice with dimeticone 4% lotion: comparison of two formulations in a randomised controlled trial in rural Turkey

    PubMed Central

    2009-01-01

    Background Dimeticone 4% lotion was shown to be an effective treatment for head louse infestation in two randomised controlled trials in England. It is not affected by insecticide resistance but efficacy obtained (70-75%) was lower than expected. This study was designed to evaluate efficacy of dimeticone 4% lotion in a geographically, socially, and culturally different setting, in rural Turkey and, in order to achieve blinding, it was compared with a potential alternative formulation. Methods Children from two village schools were screened for head lice by detection combing. All infested students and family members could participate, giving access to treatment for the whole community. Two investigator applied treatments were given 7 days apart. Outcome was assessed by detection combing three times between treatments and twice the week following second treatment. Results In the intention to treat group 35/36 treated using dimeticone 4% had no lice after the second treatment but there were two protocol violators giving 91.7% treatment success. The alternative product gave 30/36 (83.3%) treatment success, a difference of 8.4% (95% CI -9.8% to 26.2%). The cure rates per-protocol were 33/34 (97.1%) and 30/35 (85.7%) respectively. We were unable to find any newly emerged louse nymphs on 77.8% of dimeticone 4% treated participants or on 66.7% of those treated with the alternative formulation. No adverse events were identified. Conclusion Our results confirm the efficacy of dimeticone 4% lotion against lice and eggs and we found no detectable difference between this product and dimeticone 4% lotion with nerolidol 2% added. We believe that the high cure rate was related to the lower intensity of infestation in Turkey, together with the level of community engagement, compared with previous studies in the UK. Trial Registration Current Controlled Trials ISRCTN10431107 PMID:19951427

  8. Size exclusion chromatography with evaporative light scattering detection as a method for speciation analysis of polydimethylsiloxanes. III. Identification and determination of dimeticone and simeticone in pharmaceutical formulations.

    PubMed

    Mojsiewicz-Pieńkowska, Krystyna

    2012-01-25

    The pharmaceutical industry is one of the more important sectors for the use of polydimethylsiloxanes (PDMS), which belong to the organosilicon polymers. In drugs for internal use, they are used as an active pharmaceutical ingredient (API) called dimeticone or simeticone. Due to their specific chemical nature, PDMS can have different degrees of polymerization, which determine the molecular weight and viscosity. The Pharmacopoeial monographs for dimeticone and simeticone only give the permitted polymerization and viscosity range. However, it is also essential to know the degree of polymerization or the specific molecular weight of the PDMS present in pharmaceutical formulations. In the literature there is information about the impact of particle size, and thus molecular weight, on toxicity, absorption and migration in living organisms. This study focused on the use of a previously developed method, size exclusion chromatography with evaporative light scattering detection (SEC-ELSD), for the identification and determination of dimeticone and simeticone in various pharmaceutical formulations. The method had a high degree of specificity and was suitable for speciation analysis of these polymers. So far the developed method has not been used in the control of medicinal products containing dimeticone or simeticone. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. Single-center, noninterventional clinical trial to assess the safety, efficacy, and tolerability of a dimeticone-based medical device in facilitating the removal of scales after topical application in patients with psoriasis corporis or psoriasis capitis.

    PubMed

    Hengge, Ulrich R; Röschmann, Kristina; Candler, Henning

    2017-01-01

    Psoriasis is a frequent inflammatory skin disease affecting ~2%-3% of the population in western countries. Scaling of the psoriatic lesions is the most impairing symptom in patients with psoriasis. In contrast to conventional keratolytic treatment concepts containing salicylic acid or urea, a dimeticone-based medical device (Loyon ® ) removes scales in a physical way without any pharmacological effect. To assess the efficacy and tolerability of a dimeticone-based medical device in removal of scales in patients with psoriasis corporis/capitis under real-life conditions. Forty patients with psoriasis capitis or corporis were included and received once-daily treatments for 7 days. Clinical assessment of the psoriasis area severity index score (psoriasis corporis) and the psoriasis scalp severity index score (psoriasis capitis) was performed and evaluated at baseline, after 3 and 7 days of treatment. Baseline scaling scores and redness scores were calculated for two target lesions of the scalp or the body on a 5-point scale each. For the primary efficacy variable scaling score, a statistically significant decrease was observed after treatment, with a relative reduction in scaling of 36.8% after 7 days of treatment within patients affected by psoriasis capitis. Treatment success was achieved in 76.8% of patients with psoriasis capitis, and time to treatment success was evaluated to be 4.14 days for these patients and 4.33 days for patients suffering from psoriasis corporis. In conclusion, this trial demonstrated that the dimeticone-based medical device is a safe, well-tolerated, practicable, and efficient keratolytic compound, which can be well implemented in and recommended for standard therapy of psoriasis.

  10. Single-center, noninterventional clinical trial to assess the safety, efficacy, and tolerability of a dimeticone-based medical device in facilitating the removal of scales after topical application in patients with psoriasis corporis or psoriasis capitis

    PubMed Central

    Hengge, Ulrich R; Röschmann, Kristina; Candler, Henning

    2017-01-01

    Introduction Psoriasis is a frequent inflammatory skin disease affecting ~2%–3% of the population in western countries. Scaling of the psoriatic lesions is the most impairing symptom in patients with psoriasis. In contrast to conventional keratolytic treatment concepts containing salicylic acid or urea, a dimeticone-based medical device (Loyon®) removes scales in a physical way without any pharmacological effect. Objective To assess the efficacy and tolerability of a dimeticone-based medical device in removal of scales in patients with psoriasis corporis/capitis under real-life conditions. Methods Forty patients with psoriasis capitis or corporis were included and received once-daily treatments for 7 days. Clinical assessment of the psoriasis area severity index score (psoriasis corporis) and the psoriasis scalp severity index score (psoriasis capitis) was performed and evaluated at baseline, after 3 and 7 days of treatment. Baseline scaling scores and redness scores were calculated for two target lesions of the scalp or the body on a 5-point scale each. Results For the primary efficacy variable scaling score, a statistically significant decrease was observed after treatment, with a relative reduction in scaling of 36.8% after 7 days of treatment within patients affected by psoriasis capitis. Treatment success was achieved in 76.8% of patients with psoriasis capitis, and time to treatment success was evaluated to be 4.14 days for these patients and 4.33 days for patients suffering from psoriasis corporis. Conclusion In conclusion, this trial demonstrated that the dimeticone-based medical device is a safe, well-tolerated, practicable, and efficient keratolytic compound, which can be well implemented in and recommended for standard therapy of psoriasis. PMID:29387607

  11. Comparing the Efficacy of Commercially Available Insecticide and Dimeticone based Solutions on Head Lice, Pediculus capitis: in vitro Trials.

    PubMed

    Balcıoğlu, I Cüneyt; Karakuş, Mehmet; Arserim, Suha K; Limoncu, M Emin; Töz, Seray; Baştemur, Serkan; Öncel, Koray; Özbel, Yusuf

    2015-12-01

    Head lice infestation is a public health and social problem for almost all countries worldwide. For its treatment, insecticide and dimeticone-based solutions are currently available in the markets in many countries. We aimed to compare the efficacy of commercially available anti-head lice shampoos containing insecticide and physically effective products with different percentages of dimeticone using an in vitro technique. Head lice specimens were collected from primary school children using special plastic and metal combs. Anti-head lice products were commercially purchased and used directly. The specimens were placed one by one in 5-cm Petri dishes containing a slightly wet filter paper and were kept in a plastic cage at 28±2°C and 50%±20% relative humidity. A standardized protocol was used for testing all the products, and mortality data were obtained after 24 h. Two control tests were performed with each batch of trials. For each product and control, 10-20 head lice specimens were used, and the results were statistically analyzed. Our study demonstrated that among all the tested products, two products containing mineral oils [5.5% dimeticone & silicone (patented product) and dimeticone (no percentage mentioned in the prospectus) & cyclopentasiloxane] were found to be more effective for killing head lice in vitro. Physically effective products can be repetitively used because they are non-toxic and resistance to them is not expected. To control the infestation at a public level, the use of these products needs to be encouraged with respect to their cost price.
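
    As a hedged illustration of how 24 h mortality in two such product groups might be compared with 10-20 lice per arm, the sketch below applies Fisher's exact test to invented counts; the abstract does not state which test the authors used, and these are not the study data.

        # Fisher's exact test comparing 24 h mortality between two products.
        # The counts are hypothetical, chosen only to show the calculation.
        from scipy.stats import fisher_exact

        dead_a, alive_a = 18, 2     # hypothetical dimeticone-based product
        dead_b, alive_b = 11, 9     # hypothetical insecticide shampoo
        odds_ratio, p_value = fisher_exact([[dead_a, alive_a], [dead_b, alive_b]])
        print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")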

  12. Treatment of pediculosis capitis: a critical appraisal of the current literature.

    PubMed

    Feldmeier, Hermann

    2014-10-01

    Pediculosis capitis is the most common ectoparasitic disease in children in industrialized countries and extremely common in resource-poor communities of the developing world. The extensive use of pediculicides with a neurotoxic mode of action has led to the development and spread of resistant head lice populations all over the world. This triggered the development of compounds with other modes of action. The current literature on treatment approaches of head lice infestation was searched, and published randomized controlled trials were critically analyzed. The following compounds/family of compounds were identified: spinosad, a novel compound with a new neurotoxic mode of action, isopropyl myristate, 1,2-octanediol, ivermectin, plant-based products, and dimeticones. The efficacy and safety of these compounds are reviewed and recommendations for the treatment of pediculosis capitis in individuals as well as the interruption of ongoing epidemics are provided.

  13. Head lice infestations: A clinical update.

    PubMed

    Cummings, Carl; Finlay, Jane C; MacDonald, Noni E

    2018-02-01

    Head lice (Pediculus humanus capitis) infestations are not a primary health hazard or a vector for disease, but they are a societal problem with substantial costs. Diagnosis of head lice infestation requires the detection of a living louse. Although pyrethrins and permethrin remain first-line treatments in Canada, isopropyl myristate/ST-cyclomethicone solution and dimeticone can be considered as second-line therapies when there is evidence of treatment failure.

  14. Head lice. Dimeticone is the pediculicide of choice.

    PubMed

    2014-07-01

    Infestation of the scalp by head lice, or pediculosis, is a common, unpleasant but harmless parasitosis. For patients with pediculosis, which topical treatment eradicates the parasites effectively while causing the least harm? We reviewed the available evidence using the standard Prescrire methodology. Lice can be eradicated by shaving the head or combing the hair several times a day for several weeks with a fine-toothed lice comb, although combing is only completely effective in about 50% of cases. Pyrethroids (permethrin, phenothrin and bioallethrin), often combined with piperonyl butoxide, are insecticides that are neurotoxic to lice. The lice eradication rates achieved in trials of these agents are highly variable, ranging from 13% to 75% depending on the country, probably due to the development of resistance. In five randomised trials, the organophosphorus insecticide malathion was more effective than permethrin or phenothrin, achieving eradication rates of 80% to 98%. Topical application of the insecticides ivermectin or spinosad was effective in 75% to 85% of patients in randomised trials. Insecticides have mainly local adverse effects: pruritus and irritation of the scalp. Cases of malathion poisoning have been reported following topical application or ingestion. The long-term toxicity of insecticides is unclear; it therefore appears preferable to minimise their use. Agents that kill lice through physical mechanisms have few known adverse effects. It seems unlikely that lice will develop resistance to them. Dimeticone, a silicone compound, is not absorbed through the skin and provokes very few adverse effects. It is one of the better evaluated agents: in three randomised trials, 70% to 97% of patients were lice-free after two weeks. Other agents with a physical action on lice have been evaluated, each in one randomised trial including a few dozen patients. One of these, 1,2-octanediol, applied in an alcoholic solution, seemed to eradicate lice effectively with no notable adverse effects. It is advisable to avoid aerosol formulations due to the risk of bronchospasm, products containing terpenes as these compounds can cause seizures in infants and young children, and products that lack a child-proof cap. In practice, as of early 2014, pyrethroids are no longer the first-choice treatment for head lice: they are losing effectiveness and may be toxic in the long-term. Dimeticone is a better choice, because it has few known adverse effects and proven efficacy.

  15. Do drowning and anoxia kill head lice?

    PubMed

    Candy, Kerdalidec; Brun, Sophie; Nicolas, Patrick; Durand, Rémy; Charrel, Remi N; Izri, Arezki

    2018-01-01

    Chemical, physical, and mechanical methods are used to control human lice. Attempts have been made to eradicate head lice Pediculus humanus capitis by hot air, soaking in various fluids or asphyxiation using occlusive treatments. In this study, we assessed the maximum time that head lice can survive anoxia (oxygen deprivation) and their ability to survive prolonged water immersion. We also observed the ingress of fluids across louse tracheae and spiracle characteristics contrasting with those described in the literature. We showed that 100% of lice can withstand 8 h of anoxia and 12.2% survived 14 h of anoxia; survival was 48.9% in the untreated control group at 14 h. However, all lice had died following 16 h of anoxia. In contrast, the survival rate of water-immersed lice was significantly higher when compared with non-immersed lice after 6 h (100% vs. 76.6%, p = 0.0037), and 24 h (50.9% vs. 15.9%, p = 0.0003). Although water-immersed lice did not close their spiracles, water did not penetrate into the respiratory system. In contrast, immersion in colored dimeticone/cyclomethicone or colored ethanol resulted in penetration through the spiracles and spreading to the entire respiratory system within 30 min, leading to death in 100% of the lice. © K. Candy et al., published by EDP Sciences, 2018.

  16. Pediculosis capitis: new insights into epidemiology, diagnosis and treatment.

    PubMed

    Feldmeier, H

    2012-09-01

    Pediculosis capitis is a ubiquitous parasitic skin disease caused by Pediculus humanus capitis. Head lice are highly specialised parasites which can propagate only on human scalp and hair. Transmission occurs by direct head-to-head contact. Head lice are vectors of important bacterial pathogens. Pediculosis capitis usually occurs in small epidemics in play groups, kindergartens and schools. Population-based studies in European countries show highly diverging prevalences, ranging from 1% to 20%. The diagnosis of head lice infestation is made through the visual inspection of hair and scalp or dry/wet combing. The optimal method for the diagnosis of active head lice infestation is dry/wet combing. Topical application of a pediculicide is the most common treatment. Compounds with a neurotoxic mode of action are widely used but are becoming less effective due to resistant parasite populations. Besides, their use is restricted by safety concerns. Dimeticones, silicone oils with a low surface tension and the propensity to perfectly coat surfaces, have a purely physical mode of action. This group of compounds is highly effective and safe, and there is no risk that head lice become resistant. The control of epidemics requires active contact tracing and synchronised treatment with an effective and safe pediculicide.

  17. Ovicidal Response of NYDA Formulations on the Human Head Louse (Anoplura: Pediculidae) Using a Hair Tuft Bioassay

    PubMed Central

    STRYCHARZ, JOSEPH P.; LAO, ALICE R.; ALVES, ANNA-MARIA; CLARK, J. MARSHALL

    2012-01-01

    Using the in vitro rearing system in conjunction with the hair tuft bioassay, NYDA and NYDA without fragrances formulations (92% wt:wt dimeticones) were 100% ovicidal (0% of treated eggs hatched) after an 8-h exposure of the eggs of the human head louse (Pediculus humanus capitis De Geer) following the manufacturer’s instructions. Comparatively, 78 and 66% of eggs similarly exposed hatched after distilled deionized water or Nix (1% permethrin) treatments, respectively. NYDA and NYDA without fragrances formulations were also statistically and substantially more ovicidal than either distilled deionized water or Nix treatments after 10, 30 min, and 1 h exposures. Only the 10 min exposure of eggs to NYDA and NYDA without fragrances formulations resulted in hatched lice that survived to adulthood (5–8% survival). Of the lice that hatched from eggs exposed to NYDA formulations for 10 min, there were no significant differences in the time it took them to become adults, female fecundity or the viability of eggs laid by surviving females. The longevity of adults, however, was reduced after the 10 min treatments of eggs with NYDA and NYDA without fragrances formulations compared with either the distilled deionized water or Nix treatments. PMID:22493852

  18. Clinical Knowledge from Observational Studies: Everything You Wanted to Know but Were Afraid to Ask.

    PubMed

    Gershon, Andrea S; Jafarzadeh, S Reza; Wilson, Kevin C; Walkey, Allan J

    2018-05-07

    Well-done randomized trials provide accurate estimates of treatment effect by producing groups that are similar on all measures except for the intervention of interest. However, inferences of efficacy in tightly-controlled experimental settings may not translate into similar effectiveness in real-world settings. Observational studies generally enable inferences over a wider range of patient characteristics and evaluation of a broader range of outcomes over a longer period than randomized trials. However, clinicians are often reluctant to incorporate the findings of observational studies into clinical practice. Reasons for uncertainty regarding observational studies include a lack of familiarity with observational research methods, occasional disagreements between results of observational studies and randomized trials, the perceived risk of spurious results from systematic bias, and prior teaching that randomized trials are the most reliable source of medical evidence. We propose that a better understanding of observational research will enhance clinicians' ability to distinguish reliable observational studies from those that are subject to bias and, therefore, provide more confidence to apply observational research results in clinical practice when appropriate. Herein, we explain why observational studies may be perceived as less conclusive than randomized trials, address situations in which observational research and randomized trials produced different findings, and provide information on observational study design so that quality can be evaluated. We conclude that observational research is a valuable source of medical evidence and that clinical action is strongest when supported by both high quality observational studies and randomized trials.

  19. A Mixed Effects Randomized Item Response Model

    ERIC Educational Resources Information Center

    Fox, J.-P.; Wyrick, Cheryl

    2008-01-01

    The randomized response technique ensures that individual item responses, denoted as true item responses, are randomized before observing them and so-called randomized item responses are observed. A relationship is specified between randomized item response data and true item response data. True item response data are modeled with a (non)linear…
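
    The record above concerns models for randomized item responses. The sketch below simulates the classical Warner randomized response design, the basic mechanism such models build on: the answer actually observed is randomized, and the true prevalence is recovered from the known randomization probability. It is an illustration only, not the mixed effects item response model of the paper.

        # Warner-style randomized response: each respondent answers truthfully with
        # probability p and gives the reversed answer otherwise; the true prevalence
        # pi is recovered from the observed yes-rate lambda = p*pi + (1-p)*(1-pi).
        import numpy as np

        rng = np.random.default_rng(0)
        n, true_pi, p = 10_000, 0.30, 0.7          # respondents, true prevalence, design probability

        truth = rng.random(n) < true_pi            # latent true item responses
        keep = rng.random(n) < p                   # randomization device
        observed = np.where(keep, truth, ~truth)   # randomized (observed) responses

        lam = observed.mean()
        pi_hat = (lam - (1 - p)) / (2 * p - 1)     # moment estimator of the true prevalence
        print(f"observed yes-rate {lam:.3f}, estimated prevalence {pi_hat:.3f}")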

  20. Medication administered to children from 0 to 7.5 years in the Avon Longitudinal Study of Parents and Children (ALSPAC)

    PubMed Central

    Northstone, K.

    2007-01-01

    Objective To present data on the parentally-reported use of all types of medicinal products in children from 0 to 7.5 years, in a large cohort in south-west England. Methods Participants in the population-based Avon Longitudinal Study of Parents and Children (ALSPAC) have been sent self-completion postal questionnaires since they were enrolled during pregnancy. The use of medicinal products has been obtained from questionnaires sent out when the study children were 4 weeks, 54, 78 and 91 months old. Results The data included prescription, over-the-counter and complementary and alternative medicines. Around three-quarters of study children were exposed to some form of medicinal product before 8 weeks of age. Dermatological preparations were the most commonly used products in young babies. Activated dimeticone, for treatment of colic and flatulence, was given to 16% of babies and gripe water was used by 13%. Other commonly reported products included oral and topical antifungals and ophthalmic antibiotics. Several OTC products, not licensed for use in this age group, were reported. At each of the older ages surveyed, over 95% of children had used some form of medicinal product within the previous 12–18 months. Use of several product categories was higher at 54 months than at 78 or 91 months, such as topical steroids, analgesics (mostly paracetamol) and systemic antibiotics (mainly amoxicillin). Conversely, use of other categories, such as asthma medication, throat preparations (sprays and lozenges) and anti-inflammatory products, increased with increasing age. Conclusion Parentally-reported use of medicinal products in the ALSPAC study children appears to be consistent with data from other sources. The use of medicinal products by young children is high and does not always conform to the product labelling. PMID:17200835

  1. Concordance of Results from Randomized and Observational Analyses within the Same Study: A Re-Analysis of the Women's Health Initiative Limited-Access Dataset.

    PubMed

    Bolland, Mark J; Grey, Andrew; Gamble, Greg D; Reid, Ian R

    2015-01-01

    Observational studies (OS) and randomized controlled trials (RCTs) often report discordant results. In the Women's Health Initiative Calcium and Vitamin D (WHI CaD) RCT, women were randomly assigned to CaD or placebo, but were permitted to use personal calcium and vitamin D supplements, creating a unique opportunity to compare results from randomized and observational analyses within the same study. WHI CaD was a 7-year RCT of 1 g calcium/400 IU vitamin D daily in 36,282 post-menopausal women. We assessed the effects of CaD on cardiovascular events, death, cancer and fracture in a randomized design (comparing CaD with placebo in the 43% of women not using personal calcium or vitamin D supplements) and in an observational design (comparing women in the placebo group (44%) who used personal calcium and vitamin D supplements with non-users). Incidence was assessed using Cox proportional hazards models, and results from the two study designs were deemed concordant if the absolute difference in hazard ratios was ≤0.15. We also compared results from WHI CaD to those from the WHI Observational Study (WHI OS), which used similar methodology for analyses and recruited from the same population. In WHI CaD, for myocardial infarction and stroke, results of unadjusted and 6/8 covariate-controlled observational analyses (age-adjusted, multivariate-adjusted, propensity-adjusted, propensity-matched) were not concordant with the randomized design results. For death, hip and total fracture, colorectal and total cancer, unadjusted and covariate-controlled observational results were concordant with randomized results. For breast cancer, unadjusted and age-adjusted observational results were concordant with randomized results, but only 1/3 other covariate-controlled observational results were concordant with randomized results. Multivariate-adjusted results from WHI OS were concordant with randomized WHI CaD results for only 4/8 endpoints. Results of randomized analyses in WHI CaD were concordant with observational analyses for 5/8 endpoints in WHI CaD and 4/8 endpoints in WHI OS.

  2. Concordance of Results from Randomized and Observational Analyses within the Same Study: A Re-Analysis of the Women’s Health Initiative Limited-Access Dataset

    PubMed Central

    Bolland, Mark J.; Grey, Andrew; Gamble, Greg D.; Reid, Ian R.

    2015-01-01

    Background Observational studies (OS) and randomized controlled trials (RCTs) often report discordant results. In the Women’s Health Initiative Calcium and Vitamin D (WHI CaD) RCT, women were randomly assigned to CaD or placebo, but were permitted to use personal calcium and vitamin D supplements, creating a unique opportunity to compare results from randomized and observational analyses within the same study. Methods WHI CaD was a 7-year RCT of 1 g calcium/400 IU vitamin D daily in 36,282 post-menopausal women. We assessed the effects of CaD on cardiovascular events, death, cancer and fracture in a randomized design (comparing CaD with placebo in the 43% of women not using personal calcium or vitamin D supplements) and in an observational design (comparing women in the placebo group (44%) who used personal calcium and vitamin D supplements with non-users). Incidence was assessed using Cox proportional hazards models, and results from the two study designs were deemed concordant if the absolute difference in hazard ratios was ≤0.15. We also compared results from WHI CaD to those from the WHI Observational Study (WHI OS), which used similar methodology for analyses and recruited from the same population. Results In WHI CaD, for myocardial infarction and stroke, results of unadjusted and 6/8 covariate-controlled observational analyses (age-adjusted, multivariate-adjusted, propensity-adjusted, propensity-matched) were not concordant with the randomized design results. For death, hip and total fracture, colorectal and total cancer, unadjusted and covariate-controlled observational results were concordant with randomized results. For breast cancer, unadjusted and age-adjusted observational results were concordant with randomized results, but only 1/3 other covariate-controlled observational results were concordant with randomized results. Multivariate-adjusted results from WHI OS were concordant with randomized WHI CaD results for only 4/8 endpoints. Conclusions Results of randomized analyses in WHI CaD were concordant with observational analyses for 5/8 endpoints in WHI CaD and 4/8 endpoints in WHI OS. PMID:26440516
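
    The concordance rule used in this re-analysis (hazard ratios from the randomized and observational comparisons differing by no more than 0.15 in absolute value) can be expressed compactly. The sketch below assumes the lifelines library and a hypothetical dataframe with 'time', 'event', 'treated' and 'randomized_subset' columns; it is a schematic of the comparison logic, not the WHI analysis code.

        # Fit a Cox model in the randomized and the observational subsets, then
        # apply the concordance criterion |HR_randomized - HR_observational| <= 0.15.
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        def hazard_ratio(subset: pd.DataFrame) -> float:
            cph = CoxPHFitter()
            cph.fit(subset[["time", "event", "treated"]], duration_col="time", event_col="event")
            return float(np.exp(cph.params_["treated"]))

        def concordant(df: pd.DataFrame, tol: float = 0.15) -> bool:
            hr_rct = hazard_ratio(df[df["randomized_subset"]])
            hr_obs = hazard_ratio(df[~df["randomized_subset"]])
            return abs(hr_rct - hr_obs) <= tol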

  3. The Analysis of Completely Randomized Factorial Experiments When Observations Are Lost at Random.

    ERIC Educational Resources Information Center

    Hummel, Thomas J.

    An investigation was conducted of the characteristics of two estimation procedures and corresponding test statistics used in the analysis of completely randomized factorial experiments when observations are lost at random. For one estimator, contrast coefficients for cell means did not involve the cell frequencies. For the other, contrast…

  4. Spatio-temporal Hotelling observer for signal detection from image sequences

    PubMed Central

    Caucci, Luca; Barrett, Harrison H.; Rodríguez, Jeffrey J.

    2010-01-01

    Detection of signals in noisy images is necessary in many applications, including astronomy and medical imaging. The optimal linear observer for performing a detection task, called the Hotelling observer in the medical literature, can be regarded as a generalization of the familiar prewhitening matched filter. Performance on the detection task is limited by randomness in the image data, which stems from randomness in the object, randomness in the imaging system, and randomness in the detector outputs due to photon and readout noise, and the Hotelling observer accounts for all of these effects in an optimal way. If multiple temporal frames of images are acquired, the resulting data set is a spatio-temporal random process, and the Hotelling observer becomes a spatio-temporal linear operator. This paper discusses the theory of the spatio-temporal Hotelling observer and estimation of the required spatio-temporal covariance matrices. It also presents a parallel implementation of the observer on a cluster of Sony PLAYSTATION 3 gaming consoles. As an example, we consider the use of the spatio-temporal Hotelling observer for exoplanet detection. PMID:19550494
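
    The Hotelling observer described above is the optimal linear discriminant: its template is the inverse data covariance applied to the difference of the class means, and its test statistic is the inner product of that template with the image. The sketch below is a small numpy illustration on synthetic data; in the spatio-temporal case the frames would simply be stacked into one long vector.

        # Hotelling observer on synthetic images: template w = K^-1 (s1_mean - s0_mean),
        # test statistic t = w . g. Data are random stand-ins, not a real imaging model.
        import numpy as np

        rng = np.random.default_rng(1)
        n_pix, n_train = 64, 5000

        signal = np.zeros(n_pix)
        signal[30:34] = 0.5                                      # hypothetical weak signal
        g0 = rng.normal(size=(n_train, n_pix))                   # signal-absent images
        g1 = rng.normal(size=(n_train, n_pix)) + signal          # signal-present images

        K = np.cov(np.vstack([g0, g1 - signal]).T)               # pooled noise covariance
        w = np.linalg.solve(K, g1.mean(axis=0) - g0.mean(axis=0))  # Hotelling template

        t0, t1 = g0 @ w, g1 @ w                                  # test statistics per image
        snr = (t1.mean() - t0.mean()) / np.sqrt(0.5 * (t1.var() + t0.var()))
        print(f"Hotelling observer SNR on the training images: {snr:.2f}")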

  5. Spatio-temporal Hotelling observer for signal detection from image sequences.

    PubMed

    Caucci, Luca; Barrett, Harrison H; Rodriguez, Jeffrey J

    2009-06-22

    Detection of signals in noisy images is necessary in many applications, including astronomy and medical imaging. The optimal linear observer for performing a detection task, called the Hotelling observer in the medical literature, can be regarded as a generalization of the familiar prewhitening matched filter. Performance on the detection task is limited by randomness in the image data, which stems from randomness in the object, randomness in the imaging system, and randomness in the detector outputs due to photon and readout noise, and the Hotelling observer accounts for all of these effects in an optimal way. If multiple temporal frames of images are acquired, the resulting data set is a spatio-temporal random process, and the Hotelling observer becomes a spatio-temporal linear operator. This paper discusses the theory of the spatio-temporal Hotelling observer and estimation of the required spatio-temporal covariance matrices. It also presents a parallel implementation of the observer on a cluster of Sony PLAYSTATION 3 gaming consoles. As an example, we consider the use of the spatio-temporal Hotelling observer for exoplanet detection.

  6. Randomized clinical trials and observational studies in the assessment of drug safety.

    PubMed

    Sawchik, J; Hamdani, J; Vanhaeverbeek, M

    2018-05-01

    Randomized clinical trials are considered as the preferred design to assess the potential causal relationships between drugs or other medical interventions and intended effects. For this reason, randomized clinical trials are generally the basis of development programs in the life cycle of drugs and the cornerstone of evidence-based medicine. Instead, randomized clinical trials are not the design of choice for the detection and assessment of rare, delayed and/or unexpected effects related to drug safety. Moreover, the highly homogeneous populations resulting from restrictive eligibility criteria make randomized clinical trials inappropriate to describe comprehensively the safety profile of drugs. In that context, observational studies have a key added value when evaluating the benefit-risk balance of the drugs. However, observational studies are more prone to bias than randomized clinical trials and they have to be designed, conducted and reported judiciously. In this article, we discuss the strengths and limitations of randomized clinical trials and of observational studies, more particularly regarding their contribution to the knowledge of medicines' safety profile. In addition, we present general recommendations for the sensible use of observational data. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  7. Stochastic space interval as a link between quantum randomness and macroscopic randomness?

    NASA Astrophysics Data System (ADS)

    Haug, Espen Gaarder; Hoff, Harald

    2018-03-01

    For many stochastic phenomena, we observe statistical distributions that have fat-tails and high-peaks compared to the Gaussian distribution. In this paper, we will explain how observable statistical distributions in the macroscopic world could be related to the randomness in the subatomic world. We show that fat-tailed (leptokurtic) phenomena in our everyday macroscopic world are ultimately rooted in Gaussian - or very close to Gaussian-distributed subatomic particle randomness, but they are not, in a strict sense, Gaussian distributions. By running a truly random experiment over a three and a half-year period, we observed a type of random behavior in trillions of photons. Combining our results with simple logic, we find that fat-tailed and high-peaked statistical distributions are exactly what we would expect to observe if the subatomic world is quantized and not continuously divisible. We extend our analysis to the fact that one typically observes fat-tails and high-peaks relative to the Gaussian distribution in stocks and commodity prices and many aspects of the natural world; these instances are all observable and documentable macro phenomena that strongly suggest that the ultimate building blocks of nature are discrete (e.g. they appear in quanta).

  8. Transabdominal amnioinfusion for preterm premature rupture of membranes: a systematic review and metaanalysis of randomized and observational studies.

    PubMed

    Porat, Shay; Amsalem, Hagai; Shah, Prakesh S; Murphy, Kellie E

    2012-11-01

    The purpose of this study was to review systematically the efficacy of transabdominal amnioinfusion (TA) in early preterm premature rupture of membranes (PPROM). We conducted a literature search of EMBASE, MEDLINE, and ClinicalTrials.gov databases and identified studies in which TA was used in cases of proven PPROM and oligohydramnios. Risk of bias was assessed for observational studies and randomized controlled trials. Primary outcomes were latency period and perinatal mortality rates. Four observational studies (n = 147) and 3 randomized controlled trials (n = 165) were eligible. Pooled latency period was 14.4 (range, 8.2-20.6) and 11.41 (range -3.4 to 26.2) days longer in the TA group in the observational and the randomized controlled trials, respectively. Perinatal mortality rates were reduced among the treatment groups in both the observational studies (odds ratio, 0.12; 95% confidence interval, 0.02-0.61) and the randomized controlled trials (odds ratio, 0.33; 95% confidence interval, 0.10-1.12). Serial TA for early PPROM may improve early PPROM-associated morbidity and mortality rates. Additional adequately powered randomized control trials are needed. Copyright © 2012 Mosby, Inc. All rights reserved.

  9. Osteoporosis therapies: evidence from health-care databases and observational population studies.

    PubMed

    Silverman, Stuart L

    2010-11-01

    Osteoporosis is a well-recognized disease with severe consequences if left untreated. Randomized controlled trials are the most rigorous method for determining the efficacy and safety of therapies. Nevertheless, randomized controlled trials underrepresent the real-world patient population and are costly in both time and money. Modern technology has enabled researchers to use information gathered from large health-care or medical-claims databases to assess the practical utilization of available therapies in appropriate patients. Observational database studies lack randomization but, if carefully designed and successfully completed, can provide valuable information that complements results obtained from randomized controlled trials and extends our knowledge to real-world clinical patients. Randomized controlled trials comparing fracture outcomes among osteoporosis therapies are difficult to perform. In this regard, large observational database studies could be useful in identifying clinically important differences among therapeutic options. Database studies can also provide important information with regard to osteoporosis prevalence, health economics, and compliance and persistence with treatment. This article describes the strengths and limitations of both randomized controlled trials and observational database studies, discusses considerations for observational study design, and reviews a wealth of information generated by database studies in the field of osteoporosis.

  10. Quantized vortices in the ideal bose gas: a physical realization of random polynomials.

    PubMed

    Castin, Yvan; Hadzibabic, Zoran; Stock, Sabine; Dalibard, Jean; Stringari, Sandro

    2006-02-03

    We propose a physical system allowing one to experimentally observe the distribution of the complex zeros of a random polynomial. We consider a degenerate, rotating, quasi-ideal atomic Bose gas prepared in the lowest Landau level. Thermal fluctuations provide the randomness of the bosonic field and of the locations of the vortex cores. These vortices can be mapped to zeros of random polynomials, and observed in the density profile of the gas.

  11. Nonlinear Estimation of Discrete-Time Signals Under Random Observation Delay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caballero-Aguila, R.; Jimenez-Lopez, J. D.; Hermoso-Carazo, A.

    2008-11-06

    This paper presents an approximation to the nonlinear least-squares estimation problem of discrete-time stochastic signals using nonlinear observations with additive white noise which can be randomly delayed by one sampling time. The observation delay is modelled by a sequence of independent Bernoulli random variables whose values, zero or one, indicate that the real observation arrives on time or it is delayed and, hence, the available measurement to estimate the signal is not up-to-date. Assuming that the state-space model generating the signal is unknown and only the covariance functions of the processes involved in the observation equation are ready for use, a filtering algorithm based on linear approximations of the real observations is proposed.
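
    The observation model described above, a one-step delay governed by independent Bernoulli indicators, can be written as y_k = (1 - g_k) z_k + g_k z_(k-1). The sketch below simulates such randomly delayed measurements; the paper's covariance-based filtering algorithm itself is not reproduced.

        # Simulate observations that arrive one sampling time late with probability p_delay.
        import numpy as np

        rng = np.random.default_rng(2)
        n_steps, p_delay = 200, 0.3

        x = np.cumsum(rng.normal(scale=0.1, size=n_steps))       # hypothetical signal (random walk)
        z = np.sin(x) + rng.normal(scale=0.05, size=n_steps)     # nonlinear, noisy observations
        gamma = rng.random(n_steps) < p_delay                    # Bernoulli delay indicators
        gamma[0] = False                                         # the first measurement cannot be delayed

        y = np.where(gamma, np.roll(z, 1), z)                    # available (possibly delayed) measurements
        print(f"{gamma.mean():.0%} of measurements arrived one step late")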

  12. Experimental nonlocality-based randomness generation with nonprojective measurements

    NASA Astrophysics Data System (ADS)

    Gómez, S.; Mattar, A.; Gómez, E. S.; Cavalcanti, D.; Farías, O. Jiménez; Acín, A.; Lima, G.

    2018-04-01

    We report on an optical setup generating more than one bit of randomness from one entangled bit (i.e., a maximally entangled state of two qubits). The amount of randomness is certified through the observation of Bell nonlocal correlations. To attain this result we implemented a high-purity entanglement source and a nonprojective three-outcome measurement. Our implementation achieves a gain of 27% of randomness as compared with the standard methods using projective measurements. Additionally, we estimate the amount of randomness certified in a one-sided device-independent scenario, through the observation of Einstein-Podolsky-Rosen steering. Our results prove that nonprojective quantum measurements allow extending the limits for nonlocality-based certified randomness generation using current technology.

  13. Tunable random lasing behavior in plasmonic nanostructures

    NASA Astrophysics Data System (ADS)

    Yadav, Ashish; Zhong, Liubiao; Sun, Jun; Jiang, Lin; Cheng, Gary J.; Chi, Lifeng

    2017-01-01

    Random lasing is desired in plasmonic nanostructures through surface plasmon amplification. In this study, tunable random lasing behavior was observed in dye molecules attached to Au nanorods (NRs), Au nanoparticles (NPs) and Au@Ag nanorods (NRs), respectively. Our experimental investigations showed that all nanostructures, i.e., Au@AgNRs, AuNRs and AuNPs, have intensive tunable spectral effects. Random lasing was observed at an excitation wavelength of 532 nm and varying pump powers. The best random lasing properties were observed in the Au@AgNRs structure, which exhibits a broad absorption spectrum, sufficiently overlapping with that of the dye Rhodamine B (RhB). Au@AgNRs significantly enhance the tunable spectral behavior through the localized electromagnetic field and scattering. The random lasing in Au@AgNRs provides efficient coherent feedback for random lasers.

  14. From randomized controlled trials to observational studies.

    PubMed

    Silverman, Stuart L

    2009-02-01

    Randomized controlled trials are considered the gold standard in the hierarchy of research designs for evaluating the efficacy and safety of a treatment intervention. However, their results can have limited applicability to patients in clinical settings. Observational studies using large health care databases can complement findings from randomized controlled trials by assessing treatment effectiveness in patients encountered in day-to-day clinical practice. Results from these designs can expand upon outcomes of randomized controlled trials because of the use of larger and more diverse patient populations with common comorbidities and longer follow-up periods. Furthermore, well-designed observational studies can identify clinically important differences among therapeutic options and provide data on long-term drug effectiveness and safety.

  15. Time-varying output performances of piezoelectric vibration energy harvesting under nonstationary random vibrations

    NASA Astrophysics Data System (ADS)

    Yoon, Heonjun; Kim, Miso; Park, Choon-Su; Youn, Byeng D.

    2018-01-01

    Piezoelectric vibration energy harvesting (PVEH) has received much attention as a potential solution that could ultimately realize self-powered wireless sensor networks. Since most ambient vibrations in nature are inherently random and nonstationary, the output performances of PVEH devices also randomly change with time. However, little attention has been paid to investigating the randomly time-varying electroelastic behaviors of PVEH systems both analytically and experimentally. The objective of this study is thus to make a step forward towards a deep understanding of the time-varying performances of PVEH devices under nonstationary random vibrations. Two typical cases of nonstationary random vibration signals are considered: (1) randomly-varying amplitude (amplitude modulation; AM) and (2) randomly-varying amplitude with randomly-varying instantaneous frequency (amplitude and frequency modulation; AM-FM). In both cases, this study pursues well-balanced correlations of analytical predictions and experimental observations to deduce the relationships between the time-varying output performances of the PVEH device and two primary input parameters, such as a central frequency and an external electrical resistance. We introduce three correlation metrics to quantitatively compare analytical prediction and experimental observation, including the normalized root mean square error, the correlation coefficient, and the weighted integrated factor. Analytical predictions are in an excellent agreement with experimental observations both mechanically and electrically. This study provides insightful guidelines for designing PVEH devices to reliably generate electric power under nonstationary random vibrations.
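
    Two of the three comparison metrics named above, the normalized root mean square error and the correlation coefficient, are standard and easy to state; the sketch below computes them for a synthetic predicted-versus-measured voltage trace. Normalizing the RMSE by the measured range is an assumed convention, and the paper's weighted integrated factor is not reproduced here.

        # NRMSE and Pearson correlation between predicted and measured time series (synthetic data).
        import numpy as np

        rng = np.random.default_rng(3)
        t = np.linspace(0, 10, 1000)
        measured = np.sin(2 * np.pi * t) * np.exp(-0.1 * t) + rng.normal(scale=0.05, size=t.size)
        predicted = np.sin(2 * np.pi * t) * np.exp(-0.1 * t)

        rmse = np.sqrt(np.mean((predicted - measured) ** 2))
        nrmse = rmse / (measured.max() - measured.min())
        corr = np.corrcoef(predicted, measured)[0, 1]
        print(f"NRMSE = {nrmse:.3f}, correlation coefficient = {corr:.3f}")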

  16. On Probability Domains IV

    NASA Astrophysics Data System (ADS)

    Frič, Roman; Papčo, Martin

    2017-12-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one —a quantum phenomenon and, dually, an observable can map a crisp random event to a genuine fuzzy random event —a fuzzy phenomenon. The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  17. Multipartite nonlocality and random measurements

    NASA Astrophysics Data System (ADS)

    de Rosier, Anna; Gruca, Jacek; Parisio, Fernando; Vértesi, Tamás; Laskowski, Wiesław

    2017-07-01

    We present an exhaustive numerical analysis of violations of local realism by families of multipartite quantum states. As an indicator of nonclassicality we employ the probability of violation for randomly sampled observables. Surprisingly, it rapidly increases with the number of parties or settings and even for relatively small values local realism is violated for almost all observables. We have observed this effect to be typical in the sense that it emerged for all investigated states including some with randomly drawn coefficients. We also present the probability of violation as a witness of genuine multipartite entanglement.
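
    The "probability of violation for randomly sampled observables" can be illustrated in the simplest bipartite setting. The sketch below assumes a two-qubit singlet state and the CHSH inequality (the paper itself treats general multipartite states and settings) and estimates the fraction of uniformly random measurement directions that violate the local-realism bound |S| <= 2.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def random_unit_vectors(n):
        """Draw n unit vectors uniformly on the sphere (random measurement axes)."""
        v = rng.standard_normal((n, 3))
        return v / np.linalg.norm(v, axis=1, keepdims=True)

    def correlation(x, y):
        # For the two-qubit singlet state, the spin correlation along unit
        # vectors a and b is E(a, b) = -a.b (standard quantum-mechanics result).
        return -np.sum(x * y, axis=1)

    n_trials = 200_000
    a, a2, b, b2 = (random_unit_vectors(n_trials) for _ in range(4))

    # CHSH combination; local realism bounds |S| <= 2.
    S = correlation(a, b) + correlation(a, b2) + correlation(a2, b) - correlation(a2, b2)
    prob_violation = np.mean(np.abs(S) > 2.0)
    print(f"Estimated probability of CHSH violation with random settings: {prob_violation:.3f}")
    ```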

  18. Exact intervals and tests for median when one sample value possibly an outlier

    NASA Technical Reports Server (NTRS)

    Keller, G. J.; Walsh, J. E.

    1973-01-01

    Available are independent observations (continuous data) that are believed to be a random sample. Desired are distribution-free confidence intervals and significance tests for the population median. However, there is the possibility that either the smallest or the largest observation is an outlier. Then, use of a procedure for rejection of an outlying observation might seem appropriate. Such a procedure would consider that two alternative situations are possible and would select one of them. Either (1) the n observations are truly a random sample, or (2) an outlier exists and its removal leaves a random sample of size n-1. For either situation, confidence intervals and tests are desired for the median of the population yielding the random sample. Unfortunately, satisfactory rejection procedures of a distribution-free nature do not seem to be available. Moreover, all rejection procedures impose undesirable conditional effects on the observations and can also select the wrong one of the two situations above. It is found that two-sided intervals and tests based on two symmetrically located order statistics (not the largest and smallest) of the n observations remain valid for either of the two situations.
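
    For context, the classical distribution-free confidence interval for a median built from two symmetrically located order statistics (the sign-test/binomial argument) can be sketched as follows. This is the standard construction that the paper builds on, not the paper's modified procedure for a possible outlier; the sample below is invented.

    ```python
    import numpy as np
    from scipy.stats import binom

    def median_ci_order_stats(x, alpha=0.05):
        """Distribution-free confidence interval for the population median based on
        two symmetrically located order statistics (sign-test / binomial argument)."""
        x = np.sort(np.asarray(x))
        n = x.size
        # Coverage of [x_(k), x_(n+1-k)] is 1 - 2 * P(Binomial(n, 1/2) <= k - 1).
        # Choose the largest k (innermost interval) whose coverage is still >= 1 - alpha.
        k = 0
        for cand in range(1, n // 2 + 1):
            if 1 - 2 * binom.cdf(cand - 1, n, 0.5) >= 1 - alpha:
                k = cand
            else:
                break
        if k == 0:
            raise ValueError("sample too small for the requested confidence level")
        coverage = 1 - 2 * binom.cdf(k - 1, n, 0.5)
        return x[k - 1], x[n - k], coverage  # 0-based indexing of the order statistics

    rng = np.random.default_rng(2)
    sample = rng.lognormal(size=25)          # hypothetical skewed data
    lo, hi, cov = median_ci_order_stats(sample)
    print(f"~{cov:.3f} confidence interval for the median: [{lo:.3f}, {hi:.3f}]")
    ```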

  19. Was RA Fisher Right?

    PubMed

    Srivastava, Ayush; Srivastava, Anurag; Pandey, Ravindra M

    2017-10-01

    Randomized controlled trials have become the most respected scientific tool to measure the effectiveness of a medical therapy. The design, conduct and analysis of randomized controlled trials were developed by Sir Ronald A. Fisher, a mathematician in Great Britain. Fisher propounded that the process of randomization would equally distribute all the known and even unknown covariates in the two or more comparison groups, so that any difference observed could be ascribed to the treatment effect. Today, we observe that in many situations this prediction of Fisher does not hold true; hence, adaptive randomization schedules have been designed to adjust for major imbalance in important covariates. The present essay unravels some weaknesses inherent in the Fisherian concept of the randomized controlled trial.

  20. A random effects meta-analysis model with Box-Cox transformation.

    PubMed

    Yamaguchi, Yusuke; Maruo, Kazushi; Partlett, Christopher; Riley, Richard D

    2017-07-19

    In a random effects meta-analysis model, true treatment effects for each study are routinely assumed to follow a normal distribution. However, normality is a restrictive assumption and the misspecification of the random effects distribution may result in a misleading estimate of the overall mean for the treatment effect, an inappropriate quantification of heterogeneity across studies and a wrongly symmetric prediction interval. We focus on problems caused by an inappropriate normality assumption of the random effects distribution, and propose a novel random effects meta-analysis model where a Box-Cox transformation is applied to the observed treatment effect estimates. The proposed model aims to normalise the overall distribution of observed treatment effect estimates, which is the sum of the within-study sampling distributions and the random effects distribution. When sampling distributions are approximately normal, non-normality in the overall distribution will be mainly due to the random effects distribution, especially when the between-study variation is large relative to the within-study variation. The Box-Cox transformation addresses this flexibly according to the observed departure from normality. We use a Bayesian approach for estimating parameters in the proposed model, and suggest summarising the meta-analysis results by an overall median, an interquartile range and a prediction interval. The model can be applied to any kind of variable once the treatment effect estimate is defined from the variable. A simulation study suggested that when the overall distribution of treatment effect estimates is skewed, the overall mean and conventional I² from the normal random effects model could be inappropriate summaries, and the proposed model helped reduce this issue. We illustrated the proposed model using two examples, which revealed some important differences in summary results, heterogeneity measures and prediction intervals from the normal random effects model. The random effects meta-analysis with the Box-Cox transformation may be an important tool for examining the robustness of traditional meta-analysis results against skewness in the observed treatment effect estimates. Further critical evaluation of the method is needed.
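
    A rough sketch of the core idea, under strong simplifying assumptions (the paper itself uses a Bayesian model): Box-Cox-transform the observed effect estimates, pool them with an ordinary DerSimonian-Laird random-effects model on the transformed scale, and back-transform the pooled centre as an overall median. All numbers are invented, and the within-study variances are not propagated through the transformation, which the published model handles properly.

    ```python
    import numpy as np
    from scipy import stats
    from scipy.special import inv_boxcox

    # Hypothetical observed treatment effect estimates and within-study variances
    # from eight studies (made-up numbers, assumed positive and right-skewed).
    y = np.array([0.4, 0.9, 1.6, 0.2, 3.1, 0.7, 1.1, 2.4])
    v = np.array([0.10, 0.15, 0.20, 0.08, 0.30, 0.12, 0.18, 0.25])

    # Box-Cox requires strictly positive data; shift if needed (a pragmatic choice).
    shift = max(0.0, 1e-6 - y.min())
    y_t, lam = stats.boxcox(y + shift)

    def dersimonian_laird(est, var):
        """Ordinary DerSimonian-Laird random-effects pooling."""
        w = 1.0 / var
        mu_fe = np.sum(w * est) / np.sum(w)
        Q = np.sum(w * (est - mu_fe) ** 2)
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (Q - (len(est) - 1)) / c)
        w_re = 1.0 / (var + tau2)
        return np.sum(w_re * est) / np.sum(w_re), tau2

    # Pool on the transformed scale (crude: the variances are left untransformed),
    # then back-transform the pooled centre as an overall "median" on the raw scale.
    mu_t, tau2_t = dersimonian_laird(y_t, v)
    overall_median = inv_boxcox(mu_t, lam) - shift
    print(f"lambda = {lam:.2f}, overall median = {overall_median:.2f}, "
          f"tau^2 on transformed scale = {tau2_t:.3f}")
    ```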

  1. Application of the Hotelling and ideal observers to detection and localization of exoplanets.

    PubMed

    Caucci, Luca; Barrett, Harrison H; Devaney, Nicholas; Rodríguez, Jeffrey J

    2007-12-01

    The ideal linear discriminant or Hotelling observer is widely used for detection tasks and image-quality assessment in medical imaging, but it has had little application in other imaging fields. We apply it to detection of planets outside of our solar system with long-exposure images obtained from ground-based or space-based telescopes. The statistical limitations in this problem include Poisson noise arising mainly from the host star, electronic noise in the image detector, randomness or uncertainty in the point-spread function (PSF) of the telescope, and possibly a random background. PSF randomness is reduced but not eliminated by the use of adaptive optics. We concentrate here on the effects of Poisson and electronic noise, but we also show how to extend the calculation to include a random PSF. For the case where the PSF is known exactly, we compare the Hotelling observer to other observers commonly used for planet detection; comparison is based on receiver operating characteristic (ROC) and localization ROC (LROC) curves.
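
    A minimal numpy sketch of the Hotelling observer for a known signal in correlated Gaussian noise, standing in for the planet-detection task; the signal shape, covariance model, and sample sizes below are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    npix = 256                      # images flattened to 256-pixel vectors

    # Hypothetical known planet signal (difference of mean images with/without planet).
    x = np.arange(npix)
    signal = 0.5 * np.exp(-0.5 * ((x - 100) / 3.0) ** 2)

    # Background covariance: white detector noise plus a smooth correlated component
    # standing in for residual speckle / Poisson background.
    i, j = np.meshgrid(x, x, indexing="ij")
    K = 0.2 * np.exp(-np.abs(i - j) / 5.0) + 1.0 * np.eye(npix)

    # Hotelling template: w = K^{-1} s; test statistic t(g) = w^T g.
    w = np.linalg.solve(K, signal)

    def simulate(n, planet):
        mean = signal if planet else np.zeros(npix)
        return rng.multivariate_normal(mean, K, size=n)

    t0 = simulate(2000, planet=False) @ w
    t1 = simulate(2000, planet=True) @ w

    # Empirical AUC: probability that a planet-present image scores above a planet-absent one.
    auc = np.mean(t1[:, None] > t0[None, :])
    print(f"Hotelling observer empirical AUC = {auc:.3f}")
    ```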

  2. Application of the Hotelling and ideal observers to detection and localization of exoplanets

    PubMed Central

    Caucci, Luca; Barrett, Harrison H.; Devaney, Nicholas; Rodríguez, Jeffrey J.

    2008-01-01

    The ideal linear discriminant or Hotelling observer is widely used for detection tasks and image-quality assessment in medical imaging, but it has had little application in other imaging fields. We apply it to detection of planets outside of our solar system with long-exposure images obtained from ground-based or space-based telescopes. The statistical limitations in this problem include Poisson noise arising mainly from the host star, electronic noise in the image detector, randomness or uncertainty in the point-spread function (PSF) of the telescope, and possibly a random background. PSF randomness is reduced but not eliminated by the use of adaptive optics. We concentrate here on the effects of Poisson and electronic noise, but we also show how to extend the calculation to include a random PSF. For the case where the PSF is known exactly, we compare the Hotelling observer to other observers commonly used for planet detection; comparison is based on receiver operating characteristic (ROC) and localization ROC (LROC) curves. PMID:18059905

  3. Teacher Effectiveness and Causal Inference in Observational Studies

    ERIC Educational Resources Information Center

    Rose, Roderick A.

    2013-01-01

    An important target of education policy is to improve overall teacher effectiveness using evidence-based policies. Randomized control trials (RCTs), which randomly assign study participants or groups of participants to treatment and control conditions, are not always practical or possible and observational studies using rigorous quasi-experimental…

  4. An introduction to chaotic and random time series analysis

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.

    1989-01-01

    The origin of chaotic behavior and the relation of chaos to randomness are explained. Two mathematical results are described: (1) a representation theorem guarantees the existence of a specific time-domain model for chaos and addresses the relation between chaotic, random, and strictly deterministic processes; (2) a theorem assures that information on the behavior of a physical system in its complete state space can be extracted from time-series data on a single observable. Focus is placed on an important connection between the dynamical state space and an observable time series. These two results lead to a practical deconvolution technique combining standard random process modeling methods with new embedded techniques.
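
    The second result, extracting state-space information from a single observable, is commonly realized by time-delay embedding. A minimal sketch, using an illustrative system and parameters that are not from the article:

    ```python
    import numpy as np

    def delay_embed(x, dim, tau):
        """Reconstruct state-space vectors from a scalar time series by time-delay
        embedding: y_i = (x_i, x_{i+tau}, ..., x_{i+(dim-1)*tau})."""
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[k * tau : k * tau + n] for k in range(dim)])

    # Example observable: a noisy time series from a simple nonlinear (logistic-map) system.
    rng = np.random.default_rng(4)
    x = np.empty(2000)
    x[0] = 0.3
    for k in range(1, x.size):
        x[k] = 3.9 * x[k - 1] * (1 - x[k - 1])

    states = delay_embed(x + 0.001 * rng.standard_normal(x.size), dim=3, tau=1)
    print(states.shape)   # (1998, 3) embedded state vectors
    ```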

  5. Erasure and reestablishment of random allelic expression imbalance after epigenetic reprogramming

    PubMed Central

    Jeffries, Aaron Richard; Uwanogho, Dafe Aghogho; Cocks, Graham; Perfect, Leo William; Dempster, Emma; Mill, Jonathan; Price, Jack

    2016-01-01

    Clonal level random allelic expression imbalance and random monoallelic expression provides cellular heterogeneity within tissues by modulating allelic dosage. Although such expression patterns have been observed in multiple cell types, little is known about when in development these stochastic allelic choices are made. We examine allelic expression patterns in human neural progenitor cells before and after epigenetic reprogramming to induced pluripotency, observing that loci previously characterized by random allelic expression imbalance (0.63% of expressed genes) are generally reset to a biallelic state in induced pluripotent stem cells (iPSCs). We subsequently neuralized the iPSCs and profiled isolated clonal neural stem cells, observing that significant random allelic expression imbalance is reestablished at 0.65% of expressed genes, including novel loci not found to show allelic expression imbalance in the original parental neural progenitor cells. Allelic expression imbalance was associated with altered DNA methylation across promoter regulatory regions, with clones characterized by skewed allelic expression being hypermethylated compared to their biallelic sister clones. Our results suggest that random allelic expression imbalance is established during lineage commitment and is associated with increased DNA methylation at the gene promoter. PMID:27539784

  6. Erasure and reestablishment of random allelic expression imbalance after epigenetic reprogramming.

    PubMed

    Jeffries, Aaron Richard; Uwanogho, Dafe Aghogho; Cocks, Graham; Perfect, Leo William; Dempster, Emma; Mill, Jonathan; Price, Jack

    2016-10-01

    Clonal level random allelic expression imbalance and random monoallelic expression provides cellular heterogeneity within tissues by modulating allelic dosage. Although such expression patterns have been observed in multiple cell types, little is known about when in development these stochastic allelic choices are made. We examine allelic expression patterns in human neural progenitor cells before and after epigenetic reprogramming to induced pluripotency, observing that loci previously characterized by random allelic expression imbalance (0.63% of expressed genes) are generally reset to a biallelic state in induced pluripotent stem cells (iPSCs). We subsequently neuralized the iPSCs and profiled isolated clonal neural stem cells, observing that significant random allelic expression imbalance is reestablished at 0.65% of expressed genes, including novel loci not found to show allelic expression imbalance in the original parental neural progenitor cells. Allelic expression imbalance was associated with altered DNA methylation across promoter regulatory regions, with clones characterized by skewed allelic expression being hypermethylated compared to their biallelic sister clones. Our results suggest that random allelic expression imbalance is established during lineage commitment and is associated with increased DNA methylation at the gene promoter. © 2016 Jeffries et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.

  7. Observation of random lasing in gold-silica nanoshell/water solution

    NASA Astrophysics Data System (ADS)

    Kang, Jin U.

    2006-11-01

    The author reports experimental observation of resonant surface plasmon enhanced random lasing in gold-silica nanoshells in de-ionized water. The gold-silica nanoshell/water solution, with a concentration of 8×10⁹ particles/ml, was pumped above the surface plasmon resonance frequency using a 514 nm argon-krypton laser. When the pumping power was above the lasing threshold, sharp random lasing peaks occurred near and below the plasmon peak, from 720 to 860 nm, with a lasing linewidth of less than 1 nm.

  8. Block randomization versus complete randomization of human perception stimuli: is there a difference?

    NASA Astrophysics Data System (ADS)

    Moyer, Steve; Uhl, Elizabeth R.

    2015-05-01

    For more than 50 years, the U.S. Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD) has been studying and modeling the human visual discrimination process as it pertains to military imaging systems. In order to develop sensor performance models, human observers are trained to expert levels in the identification of military vehicles. From 1998 until 2006, the experimental stimuli were block randomized, meaning that stimuli with similar difficulty levels (for example, in terms of distance from target, blur, noise, etc.) were presented together in blocks of approximately 24 images but the order of images within the block was random. Starting in 2006, complete randomization came into vogue, meaning that difficulty could change image to image. It was thought that this would provide a more statistically robust result. In this study we investigated the impact of the two types of randomization on performance in two groups of observers matched for skill to create equivalent groups. It is hypothesized that Soldiers in the Complete Randomized condition will have to shift their decision criterion more frequently than Soldiers in the Block Randomization group and this shifting is expected to impede performance so that Soldiers in the Block Randomized group perform better.
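
    A small sketch of the two presentation schemes compared in the study, assuming hypothetical stimuli tagged with a difficulty level (the actual experiments used blocks of roughly 24 images of military vehicles):

    ```python
    import random

    def complete_randomization(stimuli):
        """Shuffle all stimuli together: difficulty can change from image to image."""
        order = stimuli[:]
        random.shuffle(order)
        return order

    def block_randomization(stimuli, key):
        """Group stimuli of similar difficulty (same key, e.g. range or blur level),
        randomize the order within each block, then present block by block."""
        blocks = {}
        for s in stimuli:
            blocks.setdefault(key(s), []).append(s)
        order = []
        for level in sorted(blocks):
            random.shuffle(blocks[level])
            order.extend(blocks[level])
        return order

    # Hypothetical stimuli: (image id, difficulty level).
    stimuli = [(i, i % 4) for i in range(96)]
    print(block_randomization(stimuli, key=lambda s: s[1])[:6])
    print(complete_randomization(stimuli)[:6])
    ```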

  9. Emergence of Lévy Walks from Second-Order Stochastic Optimization

    NASA Astrophysics Data System (ADS)

    Kuśmierz, Łukasz; Toyoizumi, Taro

    2017-12-01

    In natural foraging, many organisms seem to perform two different types of motile search: directed search (taxis) and random search. The former is observed when the environment provides cues to guide motion towards a target. The latter involves no apparent memory or information processing and can be mathematically modeled by random walks. We show that both types of search can be generated by a common mechanism in which Lévy flights or Lévy walks emerge from a second-order gradient-based search with noisy observations. No explicit switching mechanism is required—instead, continuous transitions between the directed and random motions emerge depending on the Hessian matrix of the cost function. For a wide range of scenarios, the Lévy tail index is α =1 , consistent with previous observations in foraging organisms. These results suggest that adopting a second-order optimization method can be a useful strategy to combine efficient features of directed and random search.
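
    A one-dimensional caricature of the mechanism: near a flat region of the cost function, a noisy Newton-type step is approximately a ratio of two zero-mean Gaussian estimates and is therefore Cauchy distributed, i.e. it has a power-law tail with index α = 1. The sketch below only illustrates this ratio argument, not the paper's full analysis.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n = 1_000_000

    # Near a flat region the true gradient and curvature are ~0, so a noisy
    # second-order update dx = -g/h is a ratio of two zero-mean Gaussians,
    # which is Cauchy distributed (power-law tail with index alpha = 1).
    g = rng.normal(0.0, 1.0, n)   # noisy gradient estimate
    h = rng.normal(0.0, 1.0, n)   # noisy curvature (Hessian) estimate
    steps = -g / h

    # Empirical tail: P(|step| > s) should fall off roughly like 1/s.
    for s in (10.0, 100.0, 1000.0):
        empirical = np.mean(np.abs(steps) > s)
        cauchy = 2 / np.pi * np.arctan(1 / s)
        print(f"P(|step| > {s:>6}) = {empirical:.5f}  (standard Cauchy: {cauchy:.5f})")
    ```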

  10. Neuroendocrine recovery initiated by cognitive behavioral therapy in women with functional hypothalamic amenorrhea: a randomized controlled trial

    PubMed Central

    Michopoulos, Vasiliki; Mancini, Fulvia; Loucks, Tammy L.; Berga, Sarah L.

    2013-01-01

    Objective: To determine whether cognitive behavior therapy (CBT), which we previously showed restored ovarian function in women with functional hypothalamic amenorrhea (FHA), also ameliorated hypercortisolemia and improved other neuroendocrine and metabolic concomitants of FHA. Design: Randomized controlled trial. Intervention: CBT vs. observation. Setting: Clinical research center at an academic medical university. Patient(s): Seventeen women with FHA were randomized either to CBT or observation. Main Outcome Measure(s): Circulatory concentrations of cortisol, leptin, TSH, total and free thyronine (T3), and total and free thyroxine (T4) before and immediately after completion of CBT or observation. Each woman served as her own control. Results: CBT but not observation reduced cortisol levels in women with FHA. There were no changes in cortisol, leptin, TSH, T3, or T4 levels in women randomized to observation. Women treated with CBT showed increased levels of leptin and TSH, while levels of T3 and T4 remained unchanged. Conclusions: CBT ameliorated hypercortisolism and improved neuroendocrine and metabolic concomitants of FHA while observation did not. We conclude that a cognitive, nonpharmacological approach aimed at alleviating problematic attitudes not only restored ovarian activity but also improved neuroendocrine and metabolic function in women with FHA. PMID:23507474

  11. Usefulness of Mendelian Randomization in Observational Epidemiology

    PubMed Central

    Bochud, Murielle; Rousson, Valentin

    2010-01-01

    Mendelian randomization refers to the random allocation of alleles at the time of gamete formation. In observational epidemiology, this refers to the use of genetic variants to estimate a causal effect between a modifiable risk factor and an outcome of interest. In this review, we recall the principles of a “Mendelian randomization” approach in observational epidemiology, which is based on the technique of instrumental variables; we provide simulations and an example based on real data to demonstrate its implications; we present the results of a systematic search on original articles having used this approach; and we discuss some limitations of this approach in view of what has been found so far. PMID:20616999
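
    A toy simulation of the instrumental-variable logic behind Mendelian randomization (the Wald ratio estimator). The genotype, confounder, and effect sizes are invented; the point is that the genetic instrument recovers the causal effect while the naive regression is confounded.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    n = 50_000

    # Simulated data: genotype G (0/1/2 allele count), unobserved confounder U,
    # modifiable exposure X, and outcome Y. True causal effect of X on Y is 0.5.
    G = rng.binomial(2, 0.3, n)
    U = rng.standard_normal(n)
    X = 0.25 * G + 0.8 * U + rng.standard_normal(n)
    Y = 0.5 * X + 0.8 * U + rng.standard_normal(n)

    def slope(x, y):
        return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

    # Naive (confounded) regression of Y on X vs. the Wald ratio IV estimate:
    # beta_IV = (effect of G on Y) / (effect of G on X).
    naive = slope(X, Y)
    wald = slope(G, Y) / slope(G, X)
    print(f"naive estimate = {naive:.2f}, Mendelian randomization (Wald ratio) = {wald:.2f}")
    ```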

  12. Selenium and Preeclampsia: a Systematic Review and Meta-analysis.

    PubMed

    Xu, Min; Guo, Dan; Gu, Hao; Zhang, Li; Lv, Shuyan

    2016-06-01

    Conflicting results exist between selenium concentration and preeclampsia. The role of selenium in the development of preeclampsia is unclear. We conducted a meta-analysis to compare the blood selenium level in patients with preeclampsia and healthy pregnant women, and to determine the effectiveness of selenium supplementation in preventing preeclampsia. We searched PubMed, ScienceDirect, the Cochrane Library, and relevant references for English language literature up to November 25, 2014. Mean difference from observational studies and relative risk from randomized controlled trials were meta-analyzed by a random-effect model. Thirteen observational studies with 1515 participants and 3 randomized controlled trials with 439 participants were included in the meta-analysis. Using a random-effect model, a statistically significant difference in blood selenium concentration of -6.47 μg/l (95 % confidence interval (CI) -11.24 to -1.7, p = 0.008) was seen after comparing the mean difference of observational studies. In randomized controlled trials, using a random-effect model, the relative risk for preeclampsia was 0.28 (0.09 to 0.84) for selenium supplementation (p = 0.02). Evidence from observational studies indicates an inverse association of blood selenium level and the risk of preeclampsia. Supplementation with selenium significantly reduces the incidence of preeclampsia. However, more prospective clinical trials are required to assess the association between selenium supplementation and preeclampsia and to determine the dose, beginning time, and duration of selenium supplementation.

  13. [Realization of design regarding experimental research in the clinical real-world research].

    PubMed

    He, Q; Shi, J P

    2018-04-10

    Real world study (RWS), a further verification and supplement for explanatory randomized controlled trial to evaluate the effectiveness of intervention measures in real clinical environment, has increasingly become the focus in the field of research on medical and health care services. However, some people mistakenly equate real world study with observational research, and argue that intervention and randomization cannot be carried out in real world study. In fact, both observational and experimental design are the basic designs in real world study, while the latter usually refers to pragmatic randomized controlled trial and registry-based randomized controlled trial. Other nonrandomized controlled and adaptive designs can also be adopted in the RWS.

  14. Common Language Effect Size for Multiple Treatment Comparisons

    ERIC Educational Resources Information Center

    Liu, Xiaofeng Steven

    2015-01-01

    Researchers who need to explain treatment effects to laypeople can translate Cohen's effect size (standardized mean difference) to a common language effect size--a probability of a random observation from one population being larger than a random observation from the other population. This common language effect size can be extended to represent…
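
    Under normality and equal variances, the translation from Cohen's d to the common language effect size is CL = Φ(d/√2); a minimal sketch:

    ```python
    from scipy.stats import norm

    def common_language_effect_size(cohens_d):
        """Probability that a random observation from the treated population exceeds
        a random observation from the control population, assuming normal
        distributions with equal variances: CL = Phi(d / sqrt(2))."""
        return norm.cdf(cohens_d / 2 ** 0.5)

    for d in (0.2, 0.5, 0.8):
        print(f"d = {d}: P(X_treated > X_control) = {common_language_effect_size(d):.2f}")
    ```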

  15. Using Big Data to Emulate a Target Trial When a Randomized Trial Is Not Available.

    PubMed

    Hernán, Miguel A; Robins, James M

    2016-04-15

    Ideally, questions about comparative effectiveness or safety would be answered using an appropriately designed and conducted randomized experiment. When we cannot conduct a randomized experiment, we analyze observational data. Causal inference from large observational databases (big data) can be viewed as an attempt to emulate a randomized experiment-the target experiment or target trial-that would answer the question of interest. When the goal is to guide decisions among several strategies, causal analyses of observational data need to be evaluated with respect to how well they emulate a particular target trial. We outline a framework for comparative effectiveness research using big data that makes the target trial explicit. This framework channels counterfactual theory for comparing the effects of sustained treatment strategies, organizes analytic approaches, provides a structured process for the criticism of observational studies, and helps avoid common methodologic pitfalls. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  16. ADAPTIVE MATCHING IN RANDOMIZED TRIALS AND OBSERVATIONAL STUDIES

    PubMed Central

    van der Laan, Mark J.; Balzer, Laura B.; Petersen, Maya L.

    2014-01-01

    In many randomized and observational studies the allocation of treatment among a sample of n independent and identically distributed units is a function of the covariates of all sampled units. As a result, the treatment labels among the units are possibly dependent, complicating estimation and posing challenges for statistical inference. For example, cluster randomized trials frequently sample communities from some target population, construct matched pairs of communities from those included in the sample based on some metric of similarity in baseline community characteristics, and then randomly allocate a treatment and a control intervention within each matched pair. In this case, the observed data can neither be represented as the realization of n independent random variables, nor, contrary to current practice, as the realization of n/2 independent random variables (treating the matched pair as the independent sampling unit). In this paper we study estimation of the average causal effect of a treatment under experimental designs in which treatment allocation potentially depends on the pre-intervention covariates of all units included in the sample. We define efficient targeted minimum loss based estimators for this general design, present a theorem that establishes the desired asymptotic normality of these estimators and allows for asymptotically valid statistical inference, and discuss implementation of these estimators. We further investigate the relative asymptotic efficiency of this design compared with a design in which unit-specific treatment assignment depends only on the units’ covariates. Our findings have practical implications for the optimal design and analysis of pair matched cluster randomized trials, as well as for observational studies in which treatment decisions may depend on characteristics of the entire sample. PMID:25097298

  17. Thermally switchable photonic band-edge to random laser emission in dye-doped cholesteric liquid crystals

    NASA Astrophysics Data System (ADS)

    Ye, Lihua; Wang, Yan; Feng, Yangyang; Liu, Bo; Gu, Bing; Cui, Yiping; Lu, Yanqing

    2018-03-01

    By changing the doping concentration of the chiral agent to adjust the relative position of the reflection band of cholesteric liquid crystals and the fluorescence emission spectrum of the dye, photonic band-edge and random lasing were observed, respectively. The reflection band of the cholesteric phase liquid crystal can also be controlled by adjusting the temperature: the reflection band is blue-shifted with increasing temperature, and a reversible switch from photonic band-edge to random lasing is obtained. Furthermore, the laser line width can be thermally adjusted from 1.1 nm (at 27 °C) to 4.6 nm (at 32.1 °C). A thermally tunable polarization state of a random laser from dual cells was observed, broadening the field of application of liquid crystal random lasers.

  18. Do observational studies using propensity score methods agree with randomized trials? A systematic comparison of studies on acute coronary syndromes

    PubMed Central

    Dahabreh, Issa J.; Sheldrick, Radley C.; Paulus, Jessica K.; Chung, Mei; Varvarigou, Vasileia; Jafri, Haseeb; Rassen, Jeremy A.; Trikalinos, Thomas A.; Kitsios, Georgios D.

    2012-01-01

    Aims Randomized controlled trials (RCTs) are the gold standard for assessing the efficacy of therapeutic interventions because randomization protects from biases inherent in observational studies. Propensity score (PS) methods, proposed as a potential solution to confounding of the treatment–outcome association, are widely used in observational studies of therapeutic interventions for acute coronary syndromes (ACS). We aimed to systematically assess agreement between observational studies using PS methods and RCTs on therapeutic interventions for ACS. Methods and results We searched for observational studies of interventions for ACS that used PS methods to estimate treatment effects on short- or long-term mortality. Using a standardized algorithm, we matched observational studies to RCTs based on patients’ characteristics, interventions, and outcomes (‘topics’), and we compared estimates of treatment effect between the two designs. When multiple observational studies or RCTs were identified for the same topic, we performed a meta-analysis and used the summary relative risk for comparisons. We matched 21 observational studies investigating 17 distinct clinical topics to 63 RCTs (median = 3 RCTs per observational study) for short-term (7 topics) and long-term (10 topics) mortality. Estimates from PS analyses differed statistically significantly from randomized evidence in two instances; however, observational studies reported more extreme beneficial treatment effects compared with RCTs in 13 of 17 instances (P = 0.049). Sensitivity analyses limited to large RCTs, and using alternative meta-analysis models yielded similar results. Conclusion For the treatment of ACS, observational studies using PS methods produce treatment effect estimates that are of more extreme magnitude compared with those from RCTs, although the differences are rarely statistically significant. PMID:22711757

  19. Knowledge translation interventions for critically ill patients: a systematic review*.

    PubMed

    Sinuff, Tasnim; Muscedere, John; Adhikari, Neill K J; Stelfox, Henry T; Dodek, Peter; Heyland, Daren K; Rubenfeld, Gordon D; Cook, Deborah J; Pinto, Ruxandra; Manoharan, Venika; Currie, Jan; Cahill, Naomi; Friedrich, Jan O; Amaral, Andre; Piquette, Dominique; Scales, Damon C; Dhanani, Sonny; Garland, Allan

    2013-11-01

    We systematically reviewed ICU-based knowledge translation studies to assess the impact of knowledge translation interventions on processes and outcomes of care. We searched electronic databases (to July, 2010) without language restrictions and hand-searched reference lists of relevant studies and reviews. Two reviewers independently identified randomized controlled trials and observational studies comparing any ICU-based knowledge translation intervention (e.g., protocols, guidelines, and audit and feedback) to management without a knowledge translation intervention. We focused on clinical topics that were addressed in greater than or equal to five studies. Pairs of reviewers abstracted data on the clinical topic, knowledge translation intervention(s), process of care measures, and patient outcomes. For each individual or combination of knowledge translation intervention(s) addressed in greater than or equal to three studies, we summarized each study using median risk ratio for dichotomous and standardized mean difference for continuous process measures. We used random-effects models. Anticipating a small number of randomized controlled trials, our primary meta-analyses included randomized controlled trials and observational studies. In separate sensitivity analyses, we excluded randomized controlled trials and collapsed protocols, guidelines, and bundles into one category of intervention. We conducted meta-analyses for clinical outcomes (ICU and hospital mortality, ventilator-associated pneumonia, duration of mechanical ventilation, and ICU length of stay) related to interventions that were associated with improvements in processes of care. From 11,742 publications, we included 119 investigations (seven randomized controlled trials, 112 observational studies) on nine clinical topics. Interventions that included protocols with or without education improved continuous process measures (seven observational studies and one randomized controlled trial; standardized mean difference [95% CI]: 0.26 [0.1, 0.42]; p = 0.001 and four observational studies and one randomized controlled trial; 0.83 [0.37, 1.29]; p = 0.0004, respectively). Heterogeneity among studies within topics ranged from low to extreme. The exclusion of randomized controlled trials did not change our results. Single-intervention and lower-quality studies had higher standardized mean differences compared to multiple-intervention and higher-quality studies (p = 0.013 and 0.016, respectively). There were no associated improvements in clinical outcomes. Knowledge translation interventions in the ICU that include protocols with or without education are associated with the greatest improvements in processes of critical care.

  20. Observational Studies: Cohort and Case-Control Studies

    PubMed Central

    Song, Jae W.; Chung, Kevin C.

    2010-01-01

    Observational studies are an important category of study designs. To address some investigative questions in plastic surgery, randomized controlled trials are not always indicated or ethical to conduct. Instead, observational studies may be the next best method to address these types of questions. Well-designed observational studies have been shown to provide results similar to randomized controlled trials, challenging the belief that observational studies are second-rate. Cohort studies and case-control studies are two primary types of observational studies that aid in evaluating associations between diseases and exposures. In this review article, we describe these study designs, methodological issues, and provide examples from the plastic surgery literature. PMID:20697313

  1. New constraints on modelling the random magnetic field of the MW

    NASA Astrophysics Data System (ADS)

    Beck, Marcus C.; Beck, Alexander M.; Beck, Rainer; Dolag, Klaus; Strong, Andrew W.; Nielaba, Peter

    2016-05-01

    We extend the description of the isotropic and anisotropic random components of the small-scale magnetic field within the existing magnetic field model of the Milky Way from Jansson & Farrar, by including random realizations of the small-scale component. Using a magnetic-field power spectrum with Gaussian random fields, the NE2001 model for the thermal electrons and the Galactic cosmic-ray electron distribution from the current GALPROP model, we derive full-sky maps for the total and polarized synchrotron intensity as well as the Faraday rotation-measure distribution. While previous work either assumed that small-scale fluctuations average out along the line of sight or only computed ensemble averages of random fields, we show that these fluctuations need to be carefully taken into account. Comparing with observational data, we obtain not only good agreement with the 408 MHz total and WMAP7 22 GHz polarized intensity emission maps, but also an improved agreement with Galactic foreground rotation-measure maps and power spectra, whose amplitude and shape strongly depend on the parameters of the random field. We demonstrate that a correlation length of ≈22 pc (5 pc being a 5σ lower limit) is needed to match the slope of the observed power spectrum of Galactic foreground rotation-measure maps. Using multiple realizations also allows us to infer errors on individual observables. We find that previously used amplitudes for the random and anisotropic random magnetic field components need to be rescaled by factors of ≈0.3 and 0.6 to account for the new small-scale contributions. Our model predicts a rotation measure of -2.8±7.1 rad/m² and 4.4±11.0 rad/m² for the north and south Galactic poles respectively, in good agreement with observations. Applying our model to deflections of ultra-high-energy cosmic rays, we infer a mean deflection of ≈3.5±1.1 degrees for 60 EeV protons arriving from Cen A.

  2. Measurement Error Correction Formula for Cluster-Level Group Differences in Cluster Randomized and Observational Studies

    ERIC Educational Resources Information Center

    Cho, Sun-Joo; Preacher, Kristopher J.

    2016-01-01

    Multilevel modeling (MLM) is frequently used to detect cluster-level group differences in cluster randomized trial and observational studies. Group differences on the outcomes (posttest scores) are detected by controlling for the covariate (pretest scores) as a proxy variable for unobserved factors that predict future attributes. The pretest and…

  3. Objective assessment of image quality. IV. Application to adaptive optics

    PubMed Central

    Barrett, Harrison H.; Myers, Kyle J.; Devaney, Nicholas; Dainty, Christopher

    2008-01-01

    The methodology of objective assessment, which defines image quality in terms of the performance of specific observers on specific tasks of interest, is extended to temporal sequences of images with random point spread functions and applied to adaptive imaging in astronomy. The tasks considered include both detection and estimation, and the observers are the optimal linear discriminant (Hotelling observer) and the optimal linear estimator (Wiener). A general theory of first- and second-order spatiotemporal statistics in adaptive optics is developed. It is shown that the covariance matrix can be rigorously decomposed into three terms representing the effect of measurement noise, random point spread function, and random nature of the astronomical scene. Figures of merit are developed, and computational methods are discussed. PMID:17106464

  4. Neuroendocrine recovery initiated by cognitive behavioral therapy in women with functional hypothalamic amenorrhea: a randomized, controlled trial.

    PubMed

    Michopoulos, Vasiliki; Mancini, Fulvia; Loucks, Tammy L; Berga, Sarah L

    2013-06-01

    To determine whether cognitive behavior therapy (CBT), which we had shown in a previous study to restore ovarian function in women with functional hypothalamic amenorrhea (FHA), could also ameliorate hypercortisolemia and improve other neuroendocrine and metabolic concomitants of FHA. Randomized controlled trial. Clinical research center at an academic medical university. Seventeen women with FHA were randomized either to CBT or observation. CBT versus observation. Circulatory concentrations of cortisol, leptin, thyroid-stimulating hormone (TSH), total and free thyronine (T(3)), and total and free thyroxine (T(4)) before and immediately after completion of CBT or observation. (Each woman served as her own control.) Cognitive behavior therapy but not observation reduced cortisol levels in women with FHA. There were no changes in cortisol, leptin, TSH, T(3), or T(4) levels in women randomized to observation. Women treated with CBT showed increased levels of leptin and TSH, but their levels of T(3) and T(4) remained unchanged. In women with FHA, CBT ameliorated hypercortisolism and improved the neuroendocrine and metabolic concomitants of FHA while observation did not. We conclude that a cognitive, nonpharmacologic approach aimed at alleviating problematic attitudes not only can restore ovarian activity but also improve neuroendocrine and metabolic function in women with FHA. NCT01674426. Copyright © 2013 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  5. Retrocausation Or Extant Indefinite Reality?

    NASA Astrophysics Data System (ADS)

    Houtkooper, Joop M.

    2006-10-01

    The possibility of retrocausation has been considered to explain the occurrence of anomalous phenomena in which the ostensible effects are preceded by their causes. A scrutiny of both experimental methodology and the experimental data is called for. A review of experimental data reveals the existence of such effects to be a serious possibility. The experimental methodology entails some conceptual difficulties, these depending on the underlying assumptions about the effects. A major point is an ambiguity between anomalous acquisition of information and retrocausation in exerted influences. A unifying theory has been proposed, based upon the fundamental randomness of quantum mechanics. Quantum mechanical randomness may be regarded as a tenacious phenomenon, that apparently is only resolved by the human observer of the random variable in question. This has led to the "observational theory" of anomalous phenomena, which is based upon the assumption that the preference of a motivated observer is able to interact with the extant indefinite random variable that is being observed. This observational theory has led to a novel prediction, which has been corroborated in experiments. Moreover, different classes of anomalous phenomena can be explained by the same basic mechanism. This foregoes retroactive causation, but, instead, requires that macroscopic physical variables remain in a state of indefinite reality and thus remain influenceable by mental efforts until these are observed. More work is needed to discover the relevant psychological and neurophysiological variables involved in effective motivated observation. Besides these practicalities, the fundamentals still have some interesting loose ends.

  6. An analytic technique for statistically modeling random atomic clock errors in estimation

    NASA Technical Reports Server (NTRS)

    Fell, P. J.

    1981-01-01

    Minimum variance estimation requires that the statistics of random observation errors be modeled properly. If measurements are derived through the use of atomic frequency standards, then one source of error affecting the observable is random fluctuation in frequency. This is the case, for example, with range and integrated Doppler measurements from satellites of the Global Positioning System, and with baseline determination for geodynamic applications. An analytic method is presented which approximates the statistics of this random process. The procedure starts with a model of the Allan variance for a particular oscillator and develops the statistics of range and integrated Doppler measurements. A series of five first-order Markov processes is used to approximate the power spectral density obtained from the Allan variance.
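
    A hedged sketch of the building block described above: a first-order Gauss-Markov process, and a sum of several such processes with spread-out correlation times standing in for an approximated clock-noise spectrum. Parameter values are illustrative, not taken from the report.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def gauss_markov(n, dt, tau, sigma):
        """First-order Gauss-Markov process: x_{k+1} = phi * x_k + w_k,
        with phi = exp(-dt/tau) and stationary standard deviation sigma."""
        phi = np.exp(-dt / tau)
        q = sigma * np.sqrt(1.0 - phi ** 2)   # driving-noise std for stationarity
        x = np.zeros(n)
        for k in range(1, n):
            x[k] = phi * x[k - 1] + q * rng.standard_normal()
        return x

    # Approximate a broad clock-noise spectrum as a sum of five Markov processes
    # with spread-out correlation times (illustrative values, not a fitted model).
    dt, n = 1.0, 10_000
    taus = [3.0, 10.0, 30.0, 100.0, 300.0]
    clock_error = sum(gauss_markov(n, dt, tau, sigma=1.0) for tau in taus)
    print(clock_error[:5])
    ```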

  7. Parametric down-conversion with nonideal and random quasi-phase-matching

    NASA Astrophysics Data System (ADS)

    Yang, Chun-Yao; Lin, Chun; Liljestrand, Charlotte; Su, Wei-Min; Canalias, Carlota; Chuu, Chih-Sung

    2016-05-01

    Quasi-phase-matching (QPM) has enriched the capacity of parametric down-conversion (PDC) in generating biphotons for many fundamental tests and advanced applications. However, it is not clear how the nonidealities and randomness in the QPM grating of a parametric down-converter may affect the quantum properties of the biphotons. This paper intends to provide insights into the interplay between PDC and nonideal or random QPM structures. Using a periodically poled nonlinear crystal with short periodicity, we conduct experimental and theoretical studies of PDC subject to nonideal duty cycle and random errors in domain lengths. We report the observation of biphotons emerging through noncritical birefringent-phasematching, which is impossible to occur in PDC with an ideal QPM grating, and a biphoton spectrum determined by the details of nonidealities and randomness. We also observed QPM biphotons with a diminished strength. These features are both confirmed by our theory. Our work provides new perspectives for biphoton engineering with QPM.

  8. Quantifying randomness in real networks

    NASA Astrophysics Data System (ADS)

    Orsini, Chiara; Dankulov, Marija M.; Colomer-de-Simón, Pol; Jamakovic, Almerima; Mahadevan, Priya; Vahdat, Amin; Bassler, Kevin E.; Toroczkai, Zoltán; Boguñá, Marián; Caldarelli, Guido; Fortunato, Santo; Krioukov, Dmitri

    2015-10-01

    Represented as graphs, real networks are intricate combinations of order and disorder. Fixing some of the structural properties of network models to their values observed in real networks, many other properties appear as statistical consequences of these fixed observables, plus randomness in other respects. Here we employ the dk-series, a complete set of basic characteristics of the network structure, to study the statistical dependencies between different network properties. We consider six real networks--the Internet, US airport network, human protein interactions, technosocial web of trust, English word network, and an fMRI map of the human brain--and find that many important local and global structural properties of these networks are closely reproduced by dk-random graphs whose degree distributions, degree correlations and clustering are as in the corresponding real network. We discuss important conceptual, methodological, and practical implications of this evaluation of network randomness, and release software to generate dk-random graphs.
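
    A small sketch of one rung of this ladder, assuming networkx is available: a degree-preserving ("1k-random") rewiring of a graph, compared with the original on clustering. The example graph and swap count are arbitrary.

    ```python
    import networkx as nx

    # Stand-in "real" network; any nx.Graph loaded from data would do.
    G = nx.karate_club_graph()

    # Degree-preserving randomization: repeatedly swap pairs of edges so that every
    # node keeps its degree while other structure is randomized (cf. the
    # 1k-random null model in the dk-series hierarchy).
    G_random = G.copy()
    nx.double_edge_swap(G_random, nswap=10 * G.number_of_edges(), max_tries=10_000)

    print("original  clustering:", round(nx.average_clustering(G), 3))
    print("1k-random clustering:", round(nx.average_clustering(G_random), 3))
    ```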

  9. Canonical Naimark extension for generalized measurements involving sets of Pauli quantum observables chosen at random

    NASA Astrophysics Data System (ADS)

    Sparaciari, Carlo; Paris, Matteo G. A.

    2013-01-01

    We address measurement schemes where certain observables X_k are chosen at random within a set of nondegenerate isospectral observables and then measured on repeated preparations of a physical system. Each observable has a probability z_k to be measured, with Σ_k z_k = 1, and the statistics of this generalized measurement is described by a positive operator-valued measure. This kind of scheme is referred to as quantum roulettes, since each observable X_k is chosen at random, e.g., according to the fluctuating value of an external parameter. Here we focus on quantum roulettes for qubits involving the measurements of Pauli matrices, and we explicitly evaluate their canonical Naimark extensions, i.e., their implementation as indirect measurements involving an interaction scheme with a probe system. We thus provide a concrete model to realize the roulette without destroying the signal state, which can be measured again after the measurement or can be transmitted. Finally, we apply our results to the description of Stern-Gerlach-like experiments on a two-level system.

  10. Observational studies are complementary to randomized controlled trials.

    PubMed

    Grootendorst, Diana C; Jager, Kitty J; Zoccali, Carmine; Dekker, Friedo W

    2010-01-01

    Randomized controlled trials (RCTs) are considered the gold standard study design to investigate the effect of health interventions, including treatment. However, in some situations, it may be unnecessary, inappropriate, impossible, or inadequate to perform an RCT. In these special situations, well-designed observational studies, including cohort and case-control studies, may provide an alternative to doing nothing in order to obtain estimates of treatment effect. It should be noted that such studies should be performed with caution and correctly. The aims of this review are (1) to explain why RCTs are considered the optimal study design to evaluate treatment effects, (2) to describe the situations in which an RCT is not possible and observational studies are an adequate alternative, and (3) to explain when randomization is not needed and can be approximated in observational studies. Examples from the nephrology literature are used for illustration. Copyright 2009 S. Karger AG, Basel.

  11. Development of Novel Composite and Random Materials for Nonlinear Optics and Lasers

    NASA Technical Reports Server (NTRS)

    Noginov, Mikhail

    2002-01-01

    A qualitative model explaining sharp spectral peaks in emission of solid-state random laser materials with broad-band gain is proposed. The suggested mechanism of coherent emission relies on synchronization of phases in an ensemble of emitting centers, via time delays provided by a network of random scatterers, and amplification of spontaneous emission that supports the spontaneously organized coherent state. Laser-like emission from powders of solid-state luminophosphors, characterized by dramatic narrowing of the emission spectrum and shortening of emission pulses above the threshold, was first observed by Markushev et al. and further studied by a number of research groups. In particular, it has been shown that when the pumping energy significantly exceeds the threshold, one or several narrow emission lines can be observed in broad-band gain media with scatterers, such as films of ZnO nanoparticles, films of pi-conjugated polymers or infiltrated opals. The experimental features, commonly observed in various solid-state random laser materials characterized by different particle sizes, different values of the photon mean free path l*, different indexes of refraction, etc.. can be described as follows. (Liquid dye random lasers are not discussed here.)

  12. Modeling Achievement Trajectories when Attrition Is Informative

    ERIC Educational Resources Information Center

    Feldman, Betsy J.; Rabe-Hesketh, Sophia

    2012-01-01

    In longitudinal education studies, assuming that dropout and missing data occur completely at random is often unrealistic. When the probability of dropout depends on covariates and observed responses (called "missing at random" [MAR]), or on values of responses that are missing (called "informative" or "not missing at random" [NMAR]),…

  13. Dairy consumption and body mass index among adults: Mendelian randomization analysis of 184802 individuals from 25 studies

    USDA-ARS?s Scientific Manuscript database

    Background: Associations between dairy intake and body mass index (BMI) have been inconsistently observed in epidemiological studies; and the causal relationship remains ill defined. Using the Mendelian randomization (MR) approach and meta-analysis of selected randomized controlled trials (RCTs), we...

  14. Assessment of reporting quality of conference abstracts in sports injury prevention according to CONSORT and STROBE criteria and their subsequent publication rate as full papers.

    PubMed

    Yoon, Uzung; Knobloch, Karsten

    2012-04-11

    The preliminary results of a study are usually presented as an abstract in conference meetings. The reporting quality of those abstracts and the relationship between their study designs and full paper publication rate are unknown. We hypothesized that randomized controlled trials are more likely to be published as full papers than observational studies. 154 oral abstracts presented at the World Congress of Sports Injury Prevention 2005 Oslo and the corresponding full paper publications were identified and analysed. The main outcome measures were frequency of publication, time to publication, impact factor, CONSORT (for Consolidated Standards of Reporting Trials) score, STROBE (for Strengthening the Reporting of Observational Studies in Epidemiology) score, and minor and major inconsistencies between the abstract and the full paper publication. Overall, 76 of the 154 (49%) presented abstracts were published as full papers in a peer-reviewed journal with an impact factor of 1.946 ± 0.812. No significant difference existed between the impact factor for randomized controlled trials (2.122 ± 1.015) and observational studies (1.913 ± 0.765, p = 0.469). The full papers for the randomized controlled trials were published after an average (SD) of 17 months (± 13 months); for observational studies, the average (SD) was 12 months (± 14 months) (p = 0.323). A trend was observed in this study that a higher percentage of randomized controlled trial abstracts than observational study abstracts were published as full papers (71% vs. 47%, p = 0.078). The reporting quality of abstracts, published as full papers, significantly increased compared to conference abstracts both in randomized controlled studies (CONSORT: 5.7 ± 0.7 to 7.2 ± 1.3; p = 0.018, CI -2.7 to -0.32) and in observational studies (STROBE: 8.2 ± 1.3 to 8.6 ± 1.4; p = 0.007, CI -0.63 to -0.10). All of the published abstracts had at least one minor inconsistency (title, authors, research center, outcome presentation, conclusion), while 65% had at least one major inconsistency (study objective, hypothesis, study design, primary outcome measures, sample size, statistical analysis, results, SD/CI). Comparing the results of the conference abstract and the full paper, results changed in 90% vs. 68% (randomized controlled studies versus observational studies); data were added (full paper reported more result data) in 60% vs. 30%, and deleted (full paper reported fewer result data) in 40% vs. 30%. No significant differences with respect to type of study (randomized controlled versus observational), impact factor, and time to publication existed for the likelihood that a World Congress of Sports Injury conference abstract could be published as a full paper.

  15. Modeling of synchronization behavior of bursting neurons at nonlinearly coupled dynamical networks.

    PubMed

    Çakir, Yüksel

    2016-01-01

    Synchronization behaviors of bursting neurons coupled through electrical and dynamic chemical synapses are investigated. The Izhikevich model is used with random and small world network of bursting neurons. Various currents which consist of diffusive electrical and time-delayed dynamic chemical synapses are used in the simulations to investigate the influences of synaptic currents and couplings on synchronization behavior of bursting neurons. The effects of parameters, such as time delay, inhibitory synaptic strengths, and decay time on synchronization behavior are investigated. It is observed that in random networks with no delay, bursting synchrony is established with the electrical synapse alone, single spiking synchrony is observed with hybrid coupling. In small world network with no delay, periodic bursting behavior with multiple spikes is observed when only chemical and only electrical synapse exist. Single-spike and multiple-spike bursting are established with hybrid couplings. A decrease in the synchronization measure is observed with zero time delay, as the decay time is increased in random network. For synaptic delays which are above active phase period, synchronization measure increases with an increase in synaptic strength and time delay in small world network. However, in random network, it increases with only an increase in synaptic strength.
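
    For reference, a minimal single-neuron sketch of the Izhikevich model used in the study, with the intrinsically bursting parameter set from Izhikevich's 2003 paper; network coupling, synaptic currents, and delays are omitted, and the integration is a simple explicit Euler scheme.

    ```python
    import numpy as np

    def izhikevich(a, b, c, d, I, T=500.0, dt=0.25):
        """Single Izhikevich neuron: v' = 0.04 v^2 + 5 v + 140 - u + I, u' = a(b v - u);
        when v >= 30 mV, reset v <- c and u <- u + d."""
        steps = int(T / dt)
        v, u = -65.0, b * -65.0
        spikes, vs = [], np.empty(steps)
        for k in range(steps):
            v += dt * (0.04 * v * v + 5 * v + 140 - u + I)
            u += dt * a * (b * v - u)
            if v >= 30.0:
                vs[k] = 30.0          # record the spike peak
                v, u = c, u + d       # after-spike reset
                spikes.append(k * dt)
            else:
                vs[k] = v
        return np.array(spikes), vs

    # Intrinsically bursting parameter set (a, b, c, d) from Izhikevich (2003).
    spike_times, trace = izhikevich(a=0.02, b=0.2, c=-55.0, d=4.0, I=10.0)
    print(f"{len(spike_times)} spikes; first spike times (ms): {spike_times[:5]}")
    ```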

  16. Entanglement dynamics in critical random quantum Ising chain with perturbations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Yichen, E-mail: ychuang@caltech.edu

    We simulate the entanglement dynamics in a critical random quantum Ising chain with generic perturbations using the time-evolving block decimation algorithm. Starting from a product state, we observe super-logarithmic growth of entanglement entropy with time. The numerical result is consistent with the analytical prediction of Vosk and Altman using a real-space renormalization group technique. - Highlights: • We study the dynamical quantum phase transition between many-body localized phases. • We simulate the dynamics of a very long random spin chain with matrix product states. • We observe numerically super-logarithmic growth of entanglement entropy with time.

  17. Kalman filter data assimilation: targeting observations and parameter estimation.

    PubMed

    Bellsky, Thomas; Kostelich, Eric J; Mahalov, Alex

    2014-06-01

    This paper studies the effect of targeted observations on state and parameter estimates determined with Kalman filter data assimilation (DA) techniques. We first provide an analytical result demonstrating that targeting observations within the Kalman filter for a linear model can significantly reduce state estimation error as opposed to fixed or randomly located observations. We next conduct observing system simulation experiments for a chaotic model of meteorological interest, where we demonstrate that the local ensemble transform Kalman filter (LETKF) with targeted observations based on largest ensemble variance is skillful in providing more accurate state estimates than the LETKF with randomly located observations. Additionally, we find that a hybrid ensemble Kalman filter parameter estimation method accurately updates model parameters within the targeted observation context to further improve state estimation.
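
    A toy linear-Gaussian sketch of the targeting idea: at each cycle one scalar component of the state is observed, either the component with the largest prior variance ("targeted") or a uniformly random one, and the resulting posterior covariance traces are compared. The dynamics, noise levels, and dimensions are invented and far simpler than the chaotic model used in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    n, steps, r = 5, 50, 0.1          # state dimension, assimilation cycles, obs-noise variance
    F = 0.95 * np.eye(n)              # simple stable linear dynamics
    Q = 0.05 * np.eye(n)              # model-error covariance

    def run(targeted):
        P = np.eye(n)                 # prior covariance
        traces = []
        for _ in range(steps):
            P = F @ P @ F.T + Q       # forecast step
            # Choose which single state component to observe this cycle.
            idx = int(np.argmax(np.diag(P))) if targeted else int(rng.integers(n))
            H = np.zeros((1, n))
            H[0, idx] = 1.0
            S = H @ P @ H.T + r       # innovation covariance (scalar here)
            K = P @ H.T / S           # Kalman gain
            P = (np.eye(n) - K @ H) @ P
            traces.append(np.trace(P))
        return np.mean(traces)

    print("mean posterior trace, targeted obs:", round(run(True), 3))
    print("mean posterior trace, random obs:  ", round(run(False), 3))
    ```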

  18. Kalman filter data assimilation: Targeting observations and parameter estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bellsky, Thomas, E-mail: bellskyt@asu.edu; Kostelich, Eric J.; Mahalov, Alex

    2014-06-15

    This paper studies the effect of targeted observations on state and parameter estimates determined with Kalman filter data assimilation (DA) techniques. We first provide an analytical result demonstrating that targeting observations within the Kalman filter for a linear model can significantly reduce state estimation error as opposed to fixed or randomly located observations. We next conduct observing system simulation experiments for a chaotic model of meteorological interest, where we demonstrate that the local ensemble transform Kalman filter (LETKF) with targeted observations based on largest ensemble variance is skillful in providing more accurate state estimates than the LETKF with randomly located observations. Additionally, we find that a hybrid ensemble Kalman filter parameter estimation method accurately updates model parameters within the targeted observation context to further improve state estimation.

  19. Chaos and random matrices in supersymmetric SYK

    NASA Astrophysics Data System (ADS)

    Hunter-Jones, Nicholas; Liu, Junyu

    2018-05-01

    We use random matrix theory to explore late-time chaos in supersymmetric quantum mechanical systems. Motivated by the recent study of supersymmetric SYK models and their random matrix classification, we consider the Wishart-Laguerre unitary ensemble and compute the spectral form factors and frame potentials to quantify chaos and randomness. Compared to the Gaussian ensembles, we observe the absence of a dip regime in the form factor and a slower approach to Haar-random dynamics. We find agreement between our random matrix analysis and predictions from the supersymmetric SYK model, and discuss the implications for supersymmetric chaotic systems.

  20. On the Wigner law in dilute random matrices

    NASA Astrophysics Data System (ADS)

    Khorunzhy, A.; Rodgers, G. J.

    1998-12-01

    We consider ensembles of N × N symmetric matrices whose entries are weakly dependent random variables. We show that random dilution can change the limiting eigenvalue distribution of such matrices. We prove that under general and natural conditions the normalised eigenvalue counting function coincides with the semicircle (Wigner) distribution in the limit N → ∞. This can be explained by the observation that dilution (or more generally, random modulation) eliminates the weak dependence (or correlations) between random matrix entries. It also supports our earlier conjecture that the Wigner distribution is stable to random dilution and modulation.
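
    A quick numerical check of this kind of statement: build a symmetric Gaussian matrix (with i.i.d. entries here, for brevity, rather than weakly dependent ones), randomly keep each entry with probability q, rescale, and compare the eigenvalue histogram with the semicircle density. The size and dilution level are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(2)
        N, q = 2000, 0.05                         # matrix size and dilution (keep) probability

        X = rng.normal(size=(N, N))
        A = (X + X.T) / np.sqrt(2)                # symmetric Gaussian matrix
        keep = rng.random((N, N)) < q
        keep = np.triu(keep) | np.triu(keep).T    # symmetric dilution mask
        A = A * keep / np.sqrt(q * N)             # rescale so the bulk support stays [-2, 2]

        eig = np.linalg.eigvalsh(A)
        hist, edges = np.histogram(eig, bins=40, range=(-2.5, 2.5), density=True)
        centers = 0.5 * (edges[:-1] + edges[1:])
        semicircle = np.where(np.abs(centers) <= 2,
                              np.sqrt(np.clip(4 - centers ** 2, 0, None)) / (2 * np.pi), 0.0)

        for c, h, s in zip(centers[::5], hist[::5], semicircle[::5]):
            print(f"x = {c:6.2f}   empirical = {h:.3f}   semicircle = {s:.3f}")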

  1. Are parent-reported outcomes for self-directed or telephone-assisted behavioral family intervention enhanced if parents are observed?

    PubMed

    Morawska, Alina; Sanders, Matthew R

    2007-05-01

    The study examined the effects of conducting observations as part of a broader assessment of families participating in behavioral family intervention (BFI). It was designed to investigate whether the observations improve intervention outcomes. Families were randomly assigned to different levels of BFI or a waitlist control condition and subsequently randomly assigned to either observation or no-observation conditions. This study demonstrated significant intervention and observation effects. Mothers in more intensive BFI reported more improvement in their child's behavior and their own parenting. Observed mothers reported lower intensity of child behavior problems and more effective parenting styles. There was also a trend for less anger among mothers who were observed and evidence of an observation-intervention interaction for parental anger, with observed mothers in more intensive intervention reporting less anger compared to those not observed. Implications for clinical and research intervention contexts are discussed.

  2. Investigation of random walks knee cartilage segmentation model using inter-observer reproducibility: Data from the osteoarthritis initiative.

    PubMed

    Hong-Seng, Gan; Sayuti, Khairil Amir; Karim, Ahmad Helmy Abdul

    2017-01-01

    Existing knee cartilage segmentation methods suffer from several technical drawbacks. In essence, graph cuts remains highly susceptible to image noise despite extended research interest; active shape models are often constrained by the selection of training data; and shortest-path methods exhibit a shortcut problem in the presence of weak boundaries, which are common in medical images. The aim of this study is to investigate the capability of random walks as a knee cartilage segmentation method. Experts scribbled on knee cartilage images to initialize random walks segmentation. Reproducibility of the method was then assessed against manual segmentation using the Dice Similarity Index. The evaluation covered normal and diseased cartilage sections, each divided into whole- and single-cartilage categories. A total of 15 normal images and 10 osteoarthritic images were included. The results showed that the random walks method achieved high reproducibility in both normal cartilage (observer 1: 0.83±0.028 and observer 2: 0.82±0.026) and osteoarthritic cartilage (observer 1: 0.80±0.069 and observer 2: 0.83±0.029). In addition, results from both experts were consistent with each other, suggesting that inter-observer variation is insignificant (Normal: P=0.21; Diseased: P=0.15). The proposed segmentation model overcomes technical problems reported by existing semi-automated techniques and demonstrates highly reproducible and consistent results against manual segmentation.
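
    For readers unfamiliar with the metric, the Dice Similarity Index used above is simply twice the overlap divided by the total size of the two masks; scikit-image also ships a random-walker segmenter (skimage.segmentation.random_walker) that is seeded from scribbles in the same spirit. The toy image and scribbles below are synthetic placeholders, not OAI data.

        import numpy as np
        from skimage.segmentation import random_walker

        rng = np.random.default_rng(3)

        # Synthetic "cartilage" image: a bright band on a noisy background (placeholder data).
        img = rng.normal(0.0, 0.1, (80, 80))
        truth = np.zeros((80, 80), dtype=bool)
        truth[30:40, 10:70] = True
        img[truth] += 1.0

        # Expert "scribbles": label 1 inside the structure, label 2 in the background.
        seeds = np.zeros_like(img, dtype=int)
        seeds[34:36, 30:50] = 1
        seeds[5:10, 5:10] = 2

        labels = random_walker(img, seeds, beta=130)     # semi-automated segmentation
        auto_mask = labels == 1

        def dice(a, b):
            """Dice Similarity Index between two binary masks."""
            return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

        print("Dice against the manual reference:", round(dice(auto_mask, truth), 3))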

  3. On Measuring and Reducing Selection Bias with a Quasi-Doubly Randomized Preference Trial

    ERIC Educational Resources Information Center

    Joyce, Ted; Remler, Dahlia K.; Jaeger, David A.; Altindag, Onur; O'Connell, Stephen D.; Crockett, Sean

    2017-01-01

    Randomized experiments provide unbiased estimates of treatment effects, but are costly and time consuming. We demonstrate how a randomized experiment can be leveraged to measure selection bias by conducting a subsequent observational study that is identical in every way except that subjects choose their treatment--a quasi-doubly randomized…

  4. The Effects of Including Observed Means or Latent Means as Covariates in Multilevel Models for Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Aydin, Burak; Leite, Walter L.; Algina, James

    2016-01-01

    We investigated methods of including covariates in two-level models for cluster randomized trials to increase power to detect the treatment effect. We compared multilevel models that included either an observed cluster mean or a latent cluster mean as a covariate, as well as the effect of including Level 1 deviation scores in the model. A Monte…

  5. Impact of Cognitive Behavioral Therapy on Observed Autism Symptom Severity during School Recess: A Preliminary Randomized, Controlled Trial

    ERIC Educational Resources Information Center

    Wood, Jeffrey J.; Fujii, Cori; Renno, Patricia; Van Dyke, Marilyn

    2014-01-01

    This study compared cognitive behavioral therapy (CBT) and treatment-as-usual (TAU) in terms of effects on observed social communication-related autism symptom severity during unstructured play time at school for children with autism spectrum disorders (ASD). Thirteen children with ASD (7-11 years old) were randomly assigned to 32 sessions of CBT…

  6. Evidence Synthesis in Harm Assessment of Medicines Using the Example of Rosiglitazone and Myocardial Infarction.

    PubMed

    Rietbergen, Charlotte; Stefansdottir, Gudrun; Leufkens, Hubert G; Knol, Mirjam J; De Bruin, Marie L; Klugkist, Irene

    2017-01-01

    The current system of harm assessment of medicines has been criticized for relying on intuitive expert judgment. There is a call for more quantitative approaches and transparency in decision-making. Illustrated with the case of cardiovascular safety concerns for rosiglitazone, we aimed to explore a structured procedure for the collection, quality assessment, and statistical modeling of safety data from observational and randomized studies. We distinguished five stages in the synthesis process. In Stage I, the general research question, population and outcome, and general inclusion and exclusion criteria are defined and a systematic search is performed. Stage II focuses on the identification of sub-questions examined in the included studies and the classification of the studies into the different categories of sub-questions. In Stage III, the quality of the identified studies is assessed. Coding and data extraction are performed in Stage IV. Finally, meta-analyses on the study results per sub-question are performed in Stage V. A PubMed search identified 30 randomized and 14 observational studies meeting our search criteria. From these studies, we identified 4 higher-level sub-questions and 4 lower-level sub-questions. We were able to categorize 29 individual treatment comparisons into one or more of the sub-question categories, and selected study duration as an important covariate. We extracted covariate, outcome, and sample size information at the treatment-arm level of the studies. We extracted absolute numbers of myocardial infarctions from the randomized studies, and adjusted risk estimates with 95% confidence intervals from the observational studies. Overall, few events were observed in the randomized studies, which were frequently of relatively short duration. The large observational studies provided more information, since these were often of longer duration. A Bayesian random effects meta-analysis on these data showed no significant increase in risk with rosiglitazone for any of the sub-questions. The proposed procedure can be of additional value for drug safety assessment because it provides a stepwise approach that guides decision-making while increasing process transparency. The procedure allows for the inclusion of results from both randomized and observational studies, which is especially relevant for this type of research.
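
    Stage V above pools the per-study results for each sub-question. The sketch below shows the mechanics of a simple frequentist DerSimonian-Laird random-effects pooling of log risk ratios; the paper itself used a Bayesian random-effects model, and the numbers here are invented purely to illustrate the computation.

        import numpy as np

        # Hypothetical per-study log risk ratios and standard errors (illustrative only).
        log_rr = np.array([0.25, -0.10, 0.40, 0.05, 0.15])
        se = np.array([0.30, 0.25, 0.35, 0.20, 0.28])

        w = 1.0 / se ** 2                                     # fixed-effect weights
        mu_fe = np.sum(w * log_rr) / np.sum(w)
        Q = np.sum(w * (log_rr - mu_fe) ** 2)                 # Cochran's Q heterogeneity statistic
        df = len(log_rr) - 1
        tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))   # DerSimonian-Laird

        w_re = 1.0 / (se ** 2 + tau2)                         # random-effects weights
        mu_re = np.sum(w_re * log_rr) / np.sum(w_re)
        se_re = np.sqrt(1.0 / np.sum(w_re))
        lo, hi = mu_re - 1.96 * se_re, mu_re + 1.96 * se_re

        print("pooled RR:", round(np.exp(mu_re), 3),
              "95% CI:", (round(np.exp(lo), 3), round(np.exp(hi), 3)))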

  7. Analysis of causality from observational studies and its application in clinical research in Intensive Care Medicine.

    PubMed

    Coscia Requena, C; Muriel, A; Peñuelas, O

    2018-02-28

    Random allocation of treatment or intervention is the key feature of clinical trials: it divides patients into treatment groups that are approximately balanced in baseline covariates and therefore comparable except for the treatment under study. However, in observational studies, where treatment allocation is not random, patients in the treatment and control groups often differ in covariates that are related to the intervention. These imbalances in covariates can lead to biased estimates of the treatment effect. At the same time, randomized clinical trials are sometimes not feasible for ethical, logistical, economic or other reasons. In such situations, interest has grown in designing clinical research studies that resemble randomized experiments as closely as possible using observational (i.e. non-randomized) data. Observational studies using propensity score analysis have become increasingly common in the Intensive Care literature. Propensity score analyses attempt to control for confounding in non-experimental studies by adjusting for the likelihood that a given patient is exposed. However, studies using propensity scores can be confusing, and intensivists may not be familiar with the methodology or fully understand the importance of the technique. The objectives of this review are: to describe the fundamentals of propensity score methods; to present the techniques needed to adequately evaluate propensity score models; and to discuss the advantages and disadvantages of these techniques. Copyright © 2018 Elsevier España, S.L.U. y SEMICYUC. All rights reserved.
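
    As a concrete illustration of the idea described above, the sketch below fits a logistic model for the probability of exposure given covariates (the propensity score) and then uses inverse-probability-of-treatment weighting to recover a treatment effect that a naive comparison misses. Data, covariates and effect sizes are simulated and purely illustrative.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(4)
        n = 5000

        # Simulated confounded data: patients with higher x[:, 0] are more often treated
        # and also tend to have worse outcomes.
        x = rng.normal(size=(n, 2))
        p_treat = 1.0 / (1.0 + np.exp(-(0.8 * x[:, 0] - 0.5 * x[:, 1])))
        treated = rng.random(n) < p_treat
        outcome = 1.0 * treated - 1.2 * x[:, 0] + 0.7 * x[:, 1] + rng.normal(size=n)

        # Naive comparison of group means is biased by the confounding.
        naive = outcome[treated].mean() - outcome[~treated].mean()

        # Propensity score P(treated | x), then inverse-probability weighting.
        ps = LogisticRegression().fit(x, treated).predict_proba(x)[:, 1]
        w = np.where(treated, 1.0 / ps, 1.0 / (1.0 - ps))
        ipw = (np.average(outcome[treated], weights=w[treated])
               - np.average(outcome[~treated], weights=w[~treated]))

        print(f"true effect = 1.0, naive estimate = {naive:.2f}, IPW-adjusted estimate = {ipw:.2f}")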

  8. Random bits, true and unbiased, from atmospheric turbulence

    PubMed Central

    Marangon, Davide G.; Vallone, Giuseppe; Villoresi, Paolo

    2014-01-01

    Random numbers are a fundamental ingredient for secure communications and numerical simulation, as well as for games and, more generally, for information science. Physical processes with intrinsic unpredictability may be exploited to generate genuine random numbers. Here, optical propagation through strong atmospheric turbulence is exploited for this purpose by observing a laser beam after a 143 km free-space path. In addition, we developed an algorithm to extract the randomness of the beam images at the receiver without post-processing. The numbers passed highly selective randomness tests qualifying them as genuine random numbers. The extraction algorithm can easily be generalized to random images generated by other physical processes. PMID:24976499

  9. Exerting control over the helical chirality in the main chain of sergeants-and-soldiers-type poly(quinoxaline-2,3-diyl)s by changing from random to block copolymerization protocols.

    PubMed

    Nagata, Yuuya; Nishikawa, Tsuyoshi; Suginome, Michinori

    2015-04-01

    Chiral random poly(quinoxaline-2,3-diyl) polymers of the sergeants-and-soldiers-type (sergeant units bearing (S)-3-octyloxymethyl groups) adopt an M- or P-helical conformation in the presence of achiral units bearing propoxymethyl or butoxy groups (soldier units), respectively. Unusual bidirectional induction of the helical sense can be observed for a copolymer with butoxy soldier units upon changing the mole fraction of the sergeant units. In the presence of 16-20% of sergeant units, the selective induction of a P-helix was observed, while the selective induction of an M-helix was observed for a mole fraction of sergeant units of more than 60%. This phenomenon could be successfully employed to control the helical chirality of copolymers by applying either random or block copolymerization protocols. Random or block copolymerization of sergeant and soldier monomers in an 18:82 ratio resulted in the formation of 250mers with almost absolute P- or M-helical conformation, respectively (>99% ee). Incorporation of a small amount of coordination sites into the random and block copolymers resulted in chiral macromolecular ligands, which allowed the enantioselective synthesis of both enantiomers in the Pd-catalyzed asymmetric hydrosilylation of β-methylstyrene.

  10. Stochastic epidemic outbreaks: why epidemics are like lasers

    NASA Astrophysics Data System (ADS)

    Schwartz, Ira B.; Billings, Lora

    2004-05-01

    Many diseases, such as childhood diseases, dengue fever, and West Nile virus, appear to oscillate randomly as a function of seasonal environmental or social changes. Such oscillations appear to have a chaotic bursting character, although it is still uncertain how much is due to random fluctuations. Such bursting in the presence of noise is also observed in driven lasers. In this talk, I will show how noise can excite random outbreaks in simple models of seasonally driven epidemics, as well as in lasers. The models for both the population dynamics and the laser will be shown to share the same underlying class of topology, which plays a major role in causing the observed stochastic bursting.

  11. Synchronization properties of coupled chaotic neurons: The role of random shared input

    NASA Astrophysics Data System (ADS)

    Kumar, Rupesh; Bilal, Shakir; Ramaswamy, Ram

    2016-06-01

    Spike-time correlations of neighbouring neurons depend on their intrinsic firing properties as well as on the inputs they share. Studies have shown that periodically firing neurons, when subjected to random shared input, exhibit asynchronicity. Here, we study the effect of random shared input on the synchronization of weakly coupled chaotic neurons. The cases of so-called electrical and chemical coupling are both considered, and we observe a wide range of synchronization behaviour. When subjected to identical shared random input, there is a decrease in the threshold coupling strength needed for chaotic neurons to synchronize in-phase. The system also supports lag-synchronous states, and for these, we find that shared input can cause desynchronization. We carry out a master stability function analysis for a network of such neurons and show agreement with the numerical simulations. The contrasting role of shared random input for complete and lag synchronized neurons is useful in understanding spike-time correlations observed in many areas of the brain.

  12. Synchronization properties of coupled chaotic neurons: The role of random shared input

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Rupesh; Bilal, Shakir; Ramaswamy, Ram

    Spike-time correlations of neighbouring neurons depend on their intrinsic firing properties as well as on the inputs they share. Studies have shown that periodically firing neurons, when subjected to random shared input, exhibit asynchronicity. Here, we study the effect of random shared input on the synchronization of weakly coupled chaotic neurons. The cases of so-called electrical and chemical coupling are both considered, and we observe a wide range of synchronization behaviour. When subjected to identical shared random input, there is a decrease in the threshold coupling strength needed for chaotic neurons to synchronize in-phase. The system also supports lag-synchronous states, and for these, we find that shared input can cause desynchronization. We carry out a master stability function analysis for a network of such neurons and show agreement with the numerical simulations. The contrasting role of shared random input for complete and lag synchronized neurons is useful in understanding spike-time correlations observed in many areas of the brain.

  13. Size differences in migrant sandpiper flocks: ghosts in ephemeral guilds

    USGS Publications Warehouse

    Eldridge, J.L.; Johnson, D.H.

    1988-01-01

    Scolopacid sandpipers were studied from 1980 until 1984 during spring migration in North Dakota. Common species foraging together in mixed-species flocks differed in bill length most often by 20 to 30 percent (ratios from 1.2:1 to 1.3:1). Observed flocks were compared to computer generated flocks drawn from three source pools of Arctic-nesting sandpipers. The source pools included 51 migrant species from a global pool, 33 migrant species from a Western Hemisphere pool, and 13 species that migrated through North Dakota. The observed flocks formed randomly from the available species that used the North Dakota migration corridor but the North Dakota species were not a random selection from the Western Hemisphere and global pools of Arctic-nesting scolopacid sandpipers. In short, the ephemeral, mixed-species foraging flocks that we observed in North Dakota were random mixes from a non-random pool. The size-ratio distributions were consistent with the interpretation that use of this migration corridor by sandpipers has been influenced by some form of size-related selection such as competition.

  14. Decay of random correlation functions for unimodal maps

    NASA Astrophysics Data System (ADS)

    Baladi, Viviane; Benedicks, Michael; Maume-Deschamps, Véronique

    2000-10-01

    Since the pioneering results of Jakobson and subsequent work by Benedicks-Carleson and others, it is known that quadratic maps f_a(x) = a - x² admit a unique absolutely continuous invariant measure for a positive measure set of parameters a. For topologically mixing f_a, Young and Keller-Nowicki independently proved exponential decay of correlation functions for this a.c.i.m. and smooth observables. We consider random compositions of small perturbations f + ω_t, with f = f_a or another unimodal map satisfying certain nonuniform hyperbolicity axioms, and ω_t chosen independently and identically distributed in [-ε, ε]. Baladi-Viana showed exponential mixing of the associated Markov chain, i.e., averaging over all random itineraries. We obtain stretched exponential bounds for the random correlation functions of Lipschitz observables for the sample measure μ_ω of almost every itinerary.
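
    The random compositions described above are easy to simulate: iterate f(x) = a - x² with an i.i.d. perturbation ω_t drawn from [-ε, ε] at each step and estimate the autocorrelation of a simple observable along the random itinerary. The parameter a, the noise level and the observable below are illustrative choices only; they are not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(5)
        a, eps = 1.9, 1e-3                   # map parameter and perturbation size (illustrative)
        burn, steps, max_lag = 1000, 200_000, 20

        x = 0.1
        for _ in range(burn):                # discard the transient
            x = a - x * x + rng.uniform(-eps, eps)

        orbit = np.empty(steps)
        for t in range(steps):               # random composition: f(x) plus a small omega_t
            x = a - x * x + rng.uniform(-eps, eps)
            orbit[t] = x

        obs = orbit - orbit.mean()           # centered Lipschitz observable phi(x) = x
        for lag in range(max_lag):
            c = np.mean(obs[: steps - lag] * obs[lag:]) / np.var(orbit)
            print(f"lag {lag:2d}   correlation {c:+.4f}")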

  15. Random Telegraph Signal-Like Fluctuation Created by Fowler-Nordheim Stress in Gate Induced Drain Leakage Current of the Saddle Type Dynamic Random Access Memory Cell Transistor

    NASA Astrophysics Data System (ADS)

    Kim, Heesang; Oh, Byoungchan; Kim, Kyungdo; Cha, Seon-Yong; Jeong, Jae-Goan; Hong, Sung-Joo; Lee, Jong-Ho; Park, Byung-Gook; Shin, Hyungcheol

    2010-09-01

    We generated traps inside gate oxide in gate-drain overlap region of recess channel type dynamic random access memory (DRAM) cell transistor through Fowler-Nordheim (FN) stress, and observed gate induced drain leakage (GIDL) current both in time domain and in frequency domain. It was found that the trap inside gate oxide could generate random telegraph signal (RTS)-like fluctuation in GIDL current. The characteristics of that fluctuation were similar to those of RTS-like fluctuation in GIDL current observed in the non-stressed device. This result shows the possibility that the trap causing variable retention time (VRT) in DRAM data retention time can be located inside gate oxide like channel RTS of metal-oxide-semiconductor field-effect transistors (MOSFETs).

  16. Quantum-Classical Hybrid for Information Processing

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2011-01-01

    Based upon quantum-inspired entanglement in quantum-classical hybrids, a simple algorithm for instantaneous transmissions of non-intentional messages (chosen at random) to remote distances is proposed. The idea is to implement instantaneous transmission of conditional information on remote distances via a quantum-classical hybrid that preserves superposition of random solutions, while allowing one to measure its state variables using classical methods. Such a hybrid system reinforces the advantages, and minimizes the limitations, of both quantum and classical characteristics. Consider n observers, and assume that each of them gets a copy of the system and runs it separately. Although they run identical systems, the outcomes of even synchronized runs may be different because the solutions of these systems are random. However, the global constrain must be satisfied. Therefore, if the observer #1 (the sender) made a measurement of the acceleration v(sub 1) at t =T, then the receiver, by measuring the corresponding acceleration v(sub 1) at t =T, may get a wrong value because the accelerations are random, and only their ratios are deterministic. Obviously, the transmission of this knowledge is instantaneous as soon as the measurements have been performed. In addition to that, the distance between the observers is irrelevant because the x-coordinate does not enter the governing equations. However, the Shannon information transmitted is zero. None of the senders can control the outcomes of their measurements because they are random. The senders cannot transmit intentional messages. Nevertheless, based on the transmitted knowledge, they can coordinate their actions based on conditional information. If the observer #1 knows his own measurements, the measurements of the others can be fully determined. It is important to emphasize that the origin of entanglement of all the observers is the joint probability density that couples their actions. There is no centralized source, or a sender of the signal, because each receiver can become a sender as well. An observer receives a signal by performing certain measurements synchronized with the measurements of the others. This means that the signal is uniformly and simultaneously distributed over the observers in a decentralized way. The signals transmit no intentional information that would favor one agent over another. All the sequence of signals received by different observers are not only statistically equivalent, but are also point-by-point identical. It is important to assume that each agent knows that the other agent simultaneously receives the identical signals. The sequences of the signals are true random, so that no agent could predict the next step with the probability different from those described by the density. Under these quite general assumptions, the entangled observers-agents can perform non-trivial tasks that include transmission of conditional information from one agent to another, simple paradigm of cooperation, etc. The problem of behavior of intelligent agents correlated by identical random messages in a decentralized way has its own significance: it simulates evolutionary behavior of biological and social systems correlated only via simultaneous sensoring sequences of unexpected events.

  17. Sparsely sampling the sky: Regular vs. random sampling

    NASA Astrophysics Data System (ADS)

    Paykari, P.; Pires, S.; Starck, J.-L.; Jaffe, A. H.

    2015-09-01

    Aims: The next generation of galaxy surveys, aiming to observe millions of galaxies, are expensive both in time and money. This raises questions regarding the optimal investment of this time and money for future surveys. In a previous work, we have shown that a sparse sampling strategy could be a powerful substitute for the - usually favoured - contiguous observation of the sky. In our previous paper, regular sparse sampling was investigated, where the sparse observed patches were regularly distributed on the sky. The regularity of the mask introduces a periodic pattern in the window function, which induces periodic correlations at specific scales. Methods: In this paper, we use a Bayesian experimental design to investigate a "random" sparse sampling approach, where the observed patches are randomly distributed over the total sparsely sampled area. Results: We find that in this setting, the induced correlation is evenly distributed amongst all scales as there is no preferred scale in the window function. Conclusions: This is desirable when we are interested in any specific scale in the galaxy power spectrum, such as the matter-radiation equality scale. As the figure of merit shows, however, there is no preference between regular and random sampling to constrain the overall galaxy power spectrum and the cosmological parameters.
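
    The contrast between the two strategies can be seen in one dimension by comparing the power spectra (window functions) of a regular and a random sparse mask: the regular mask concentrates its power in a few periodic spikes, while the random one spreads it over all scales. Grid size, observed fraction and patch length below are arbitrary illustrative values.

        import numpy as np

        rng = np.random.default_rng(6)
        n, frac, patch = 4096, 0.25, 16            # grid points, observed fraction, patch length
        n_patches = int(n * frac / patch)

        def window_power(mask):
            w = np.abs(np.fft.rfft(mask - mask.mean())) ** 2
            return w / w.sum()

        # Regular sparse mask: patches on a periodic grid.
        regular = np.zeros(n)
        for start in np.linspace(0, n - patch, n_patches, dtype=int):
            regular[start:start + patch] = 1.0

        # Random sparse mask: the same number of patches at random positions.
        randomized = np.zeros(n)
        for start in rng.choice(n - patch, size=n_patches, replace=False):
            randomized[start:start + patch] = 1.0

        for name, mask in [("regular", regular), ("random", randomized)]:
            p = np.sort(window_power(mask))[::-1]
            top = p[: max(1, len(p) // 100)].sum()
            print(f"{name:8s} mask: fraction of window power in the top 1% of modes = {top:.2f}")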

  18. Comparison of Random Forest and Parametric Imputation Models for Imputing Missing Data Using MICE: A CALIBER Study

    PubMed Central

    Shah, Anoop D.; Bartlett, Jonathan W.; Carpenter, James; Nicholas, Owen; Hemingway, Harry

    2014-01-01

    Multivariate imputation by chained equations (MICE) is commonly used for imputing missing data in epidemiologic research. The “true” imputation model may contain nonlinearities which are not included in default imputation models. Random forest imputation is a machine learning technique which can accommodate nonlinearities and interactions and does not require a particular regression model to be specified. We compared parametric MICE with a random forest-based MICE algorithm in 2 simulation studies. The first study used 1,000 random samples of 2,000 persons drawn from the 10,128 stable angina patients in the CALIBER database (Cardiovascular Disease Research using Linked Bespoke Studies and Electronic Records; 2001–2010) with complete data on all covariates. Variables were artificially made “missing at random,” and the bias and efficiency of parameter estimates obtained using different imputation methods were compared. Both MICE methods produced unbiased estimates of (log) hazard ratios, but random forest was more efficient and produced narrower confidence intervals. The second study used simulated data in which the partially observed variable depended on the fully observed variables in a nonlinear way. Parameter estimates were less biased using random forest MICE, and confidence interval coverage was better. This suggests that random forest imputation may be useful for imputing complex epidemiologic data sets in which some patients have missing data. PMID:24589914
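
    In Python, a rough analogue of this comparison can be put together with scikit-learn's IterativeImputer (a chained-equations, MICE-style imputer), swapping its default linear estimator for a random forest. This is not the authors' implementation, the toy data are simulated, and only single-imputation accuracy is reported here for brevity (the study assessed bias and confidence-interval coverage across proper multiple imputations).

        import numpy as np
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.linear_model import BayesianRidge

        rng = np.random.default_rng(7)
        n = 1000

        # Simulated data with a nonlinear dependence between covariates.
        x1 = rng.normal(size=n)
        x2 = np.sin(2 * x1) + 0.3 * rng.normal(size=n)       # nonlinear in x1
        x3 = 0.5 * x1 - 0.2 * x2 + 0.3 * rng.normal(size=n)
        data = np.column_stack([x1, x2, x3])

        # Make roughly 30% of x2 "missing at random".
        missing = data.copy()
        mask = rng.random(n) < 0.3
        missing[mask, 1] = np.nan

        estimators = [("parametric (BayesianRidge)", BayesianRidge()),
                      ("random forest", RandomForestRegressor(n_estimators=50, random_state=0))]
        for name, est in estimators:
            imp = IterativeImputer(estimator=est, max_iter=10, random_state=0)
            filled = imp.fit_transform(missing)
            rmse = np.sqrt(np.mean((filled[mask, 1] - data[mask, 1]) ** 2))
            print(f"{name:28s} imputation RMSE: {rmse:.3f}")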

  19. Comparison of random forest and parametric imputation models for imputing missing data using MICE: a CALIBER study.

    PubMed

    Shah, Anoop D; Bartlett, Jonathan W; Carpenter, James; Nicholas, Owen; Hemingway, Harry

    2014-03-15

    Multivariate imputation by chained equations (MICE) is commonly used for imputing missing data in epidemiologic research. The "true" imputation model may contain nonlinearities which are not included in default imputation models. Random forest imputation is a machine learning technique which can accommodate nonlinearities and interactions and does not require a particular regression model to be specified. We compared parametric MICE with a random forest-based MICE algorithm in 2 simulation studies. The first study used 1,000 random samples of 2,000 persons drawn from the 10,128 stable angina patients in the CALIBER database (Cardiovascular Disease Research using Linked Bespoke Studies and Electronic Records; 2001-2010) with complete data on all covariates. Variables were artificially made "missing at random," and the bias and efficiency of parameter estimates obtained using different imputation methods were compared. Both MICE methods produced unbiased estimates of (log) hazard ratios, but random forest was more efficient and produced narrower confidence intervals. The second study used simulated data in which the partially observed variable depended on the fully observed variables in a nonlinear way. Parameter estimates were less biased using random forest MICE, and confidence interval coverage was better. This suggests that random forest imputation may be useful for imputing complex epidemiologic data sets in which some patients have missing data.

  20. Sunspot random walk and 22-year variation

    USGS Publications Warehouse

    Love, Jeffrey J.; Rigler, E. Joshua

    2012-01-01

    We examine two stochastic models for consistency with observed long-term secular trends in sunspot number and a faint, but semi-persistent, 22-yr signal: (1) a null hypothesis, a simple one-parameter random-walk model of sunspot-number cycle-to-cycle change, and, (2) an alternative hypothesis, a two-parameter random-walk model with an imposed 22-yr alternating amplitude. The observed secular trend in sunspots, seen from solar cycle 5 to 23, would not be an unlikely result of the accumulation of multiple random-walk steps. Statistical tests show that a 22-yr signal can be resolved in historical sunspot data; that is, the probability is low that it would be realized from random data. On the other hand, the 22-yr signal has a small amplitude compared to random variation, and so it has a relatively small effect on sunspot predictions. Many published predictions for cycle 24 sunspots fall within the dispersion of previous cycle-to-cycle sunspot differences. The probability is low that the Sun will, with the accumulation of random steps over the next few cycles, walk down to a Dalton-like minimum. Our models support published interpretations of sunspot secular variation and 22-yr variation resulting from cycle-to-cycle accumulation of dynamo-generated magnetic energy.
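
    Both hypotheses above can be written down in a few lines: cycle-to-cycle changes in peak sunspot number are random-walk steps, optionally with a small alternating (every-other-cycle, i.e. 22-yr) offset added. The step size and signal amplitude below are illustrative values, not the fitted parameters of the paper.

        import numpy as np

        rng = np.random.default_rng(8)
        n_cycles, sigma, alt_amp = 19, 20.0, 10.0    # number of cycles, step s.d., 22-yr amplitude
        n_sims = 10_000

        def cycle_steps(alternating):
            """Cycle-to-cycle changes: pure random walk, or with an alternating-sign term."""
            steps = sigma * rng.normal(size=(n_sims, n_cycles - 1))
            if alternating:
                steps += alt_amp * (-1.0) ** np.arange(n_cycles - 1)
            return steps

        for label, steps in [("null (pure random walk)", cycle_steps(False)),
                             ("with 22-yr alternating term", cycle_steps(True))]:
            # Lag-1 correlation of consecutive changes: near zero for the pure walk,
            # negative when an every-other-cycle component is imposed.
            corr = np.mean([np.corrcoef(s[:-1], s[1:])[0, 1] for s in steps])
            print(f"{label:28s} mean lag-1 correlation of changes: {corr:+.3f}")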

  1. Observational Study Designs for Comparative Effectiveness Research: An Alternative Approach to Close Evidence Gaps in Head-and-Neck Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goulart, Bernardo H.L., E-mail: bhg@uw.edu; University of Washington, Seattle, Washington; Ramsey, Scott D.

    Comparative effectiveness research (CER) has emerged as an approach to improve quality of care and patient outcomes while reducing healthcare costs by providing evidence to guide healthcare decisions. Randomized controlled trials (RCTs) have represented the ideal study design to support treatment decisions in head-and-neck (H and N) cancers. In RCTs, formal chance (randomization) determines treatment allocation, which prevents selection bias from distorting the measure of treatment effects. Despite this advantage, only a minority of patients qualify for inclusion in H and N RCTs, which limits the validity of their results to the broader H and N cancer patient population seen in clinical practice. Randomized controlled trials often do not address other knowledge gaps in the management of H and N cancer, including treatment comparisons for rare types of H and N cancers, monitoring of rare or late toxicity events (eg, osteoradionecrosis), or in some instances an RCT is simply not feasible. Observational studies, or studies in which treatment allocation occurs independently of investigators' choice or randomization, may address several of these gaps in knowledge, thereby complementing the role of RCTs. This critical review discusses how observational CER studies complement RCTs in generating the evidence to inform healthcare decisions and improve the quality of care and outcomes of H and N cancer patients. Review topics include a balanced discussion about the strengths and limitations of both RCT and observational CER study designs; a brief description of design and analytic techniques to handle selection bias in observational studies; examples of observational studies that inform current clinical practices and management of H and N cancers; and suggestions for relevant CER questions that could be addressed by an observational study design.

  2. Observational study designs for comparative effectiveness research: an alternative approach to close evidence gaps in head-and-neck cancer.

    PubMed

    Goulart, Bernardo H L; Ramsey, Scott D; Parvathaneni, Upendra

    2014-01-01

    Comparative effectiveness research (CER) has emerged as an approach to improve quality of care and patient outcomes while reducing healthcare costs by providing evidence to guide healthcare decisions. Randomized controlled trials (RCTs) have represented the ideal study design to support treatment decisions in head-and-neck (H&N) cancers. In RCTs, formal chance (randomization) determines treatment allocation, which prevents selection bias from distorting the measure of treatment effects. Despite this advantage, only a minority of patients qualify for inclusion in H&N RCTs, which limits the validity of their results to the broader H&N cancer patient population seen in clinical practice. Randomized controlled trials often do not address other knowledge gaps in the management of H&N cancer, including treatment comparisons for rare types of H&N cancers, monitoring of rare or late toxicity events (eg, osteoradionecrosis), or in some instances an RCT is simply not feasible. Observational studies, or studies in which treatment allocation occurs independently of investigators' choice or randomization, may address several of these gaps in knowledge, thereby complementing the role of RCTs. This critical review discusses how observational CER studies complement RCTs in generating the evidence to inform healthcare decisions and improve the quality of care and outcomes of H&N cancer patients. Review topics include a balanced discussion about the strengths and limitations of both RCT and observational CER study designs; a brief description of design and analytic techniques to handle selection bias in observational studies; examples of observational studies that inform current clinical practices and management of H&N cancers; and suggestions for relevant CER questions that could be addressed by an observational study design. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. Influence of finger and mouth action observation on random number generation: an instance of embodied cognition for abstract concepts.

    PubMed

    Grade, Stéphane; Badets, Arnaud; Pesenti, Mauro

    2017-05-01

    Numerical magnitude and specific grasping action processing have been shown to interfere with each other because some aspects of numerical meaning may be grounded in sensorimotor transformation mechanisms linked to finger grip control. However, how specific these interactions are to grasping actions is still unknown. The present study tested the specificity of the number-grip relationship by investigating how the observation of different closing-opening stimuli that might or not refer to prehension-releasing actions was able to influence a random number generation task. Participants had to randomly produce numbers after they observed action stimuli representing either closure or aperture of the fingers, the hand or the mouth, or a colour change used as a control condition. Random number generation was influenced by the prior presentation of finger grip actions, whereby observing a closing finger grip led participants to produce small rather than large numbers, whereas observing an opening finger grip led them to produce large rather than small numbers. Hand actions had reduced or no influence on number production; mouth action influence was restricted to opening, with an overproduction of large numbers. Finally, colour changes did not influence number generation. These results show that some characteristics of observed finger, hand and mouth grip actions automatically prime number magnitude, with the strongest effect for finger grasping. The findings are discussed in terms of the functional and neural mechanisms shared between hand actions and number processing, but also between hand and mouth actions. The present study provides converging evidence that part of number semantics is grounded in sensory-motor mechanisms.

  4. The role of vitamin D in the prevention of late-life depression.

    PubMed

    Okereke, Olivia I; Singh, Ankura

    2016-07-01

    In this article, we review current evidence regarding potential benefits of vitamin D for improving mood and reducing depression risk in older adults. We summarize gaps in knowledge and describe future efforts that may clarify the role of vitamin D in late-life depression prevention. MEDLINE and PsychINFO databases were searched for all articles on vitamin D and mood that had been published up to and including May 2015. Observational studies and randomized trials with 50 or more participants were included. We excluded studies that involved only younger adults and/or exclusively involved persons with current depression. Twenty observational (cross-sectional and prospective) studies and 10 randomized trials (nine were randomized placebo-controlled trials [RCTs]; one was a randomized blinded comparison trial) were reviewed. Inverse associations of vitamin D blood level or vitamin D intake with depression were found in 13 observational studies; three identified prospective relations. Results from all but one of the RCTs showed no statistically significant differences in depression outcomes between vitamin D and placebo groups. Observational studies were mostly cross-sectional and frequently lacked adequate control of confounding. RCTs often featured low treatment doses, suboptimal post-intervention changes in biochemical levels of vitamin D, and/or short trial durations. Vitamin D level-mood associations were observed in most, but not all, observational studies; results indicated that vitamin D deficiency may be a risk factor for late-life depression. However, additional data from well-designed RCTs are required to determine the impact of vitamin D in late-life depression prevention. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. The Shark Random Swim - (Lévy Flight with Memory)

    NASA Astrophysics Data System (ADS)

    Businger, Silvia

    2018-05-01

    The Elephant Random Walk (ERW), first introduced by Schütz and Trimper (Phys Rev E 70:045101, 2004), is a one-dimensional simple random walk on Z having a memory about the whole past. We study the Shark Random Swim, a random walk with memory about the whole past, whose steps are α-stable distributed with α ∈ (0, 2]. Our aim in this work is to study the impact of the heavy-tailed step distributions on the asymptotic behavior of the random walk. We shall see that, as for the ERW, the asymptotic behavior of the Shark Random Swim depends on its memory parameter p, and that a phase transition can be observed at the critical value p = 1/α.
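
    A minimal simulation of the memory mechanism is shown below for the classical ERW with ±1 steps (with probability p the walker repeats a uniformly chosen earlier step, otherwise it takes the opposite step); the Shark Random Swim replaces these unit steps with α-stable lengths, which is omitted here to keep the sketch short. Parameter values and the exponent estimator are illustrative.

        import numpy as np

        rng = np.random.default_rng(9)
        n_steps, n_walkers = 2000, 2000

        def elephant_walk(p):
            """ERW: with probability p repeat a uniformly chosen earlier step, else reverse it."""
            steps = np.zeros((n_walkers, n_steps))
            steps[:, 0] = rng.choice([-1.0, 1.0], size=n_walkers)     # symmetric first step
            for t in range(1, n_steps):
                k = rng.integers(0, t, size=n_walkers)                # remembered earlier step
                repeat = rng.random(n_walkers) < p
                steps[:, t] = np.where(repeat, 1.0, -1.0) * steps[np.arange(n_walkers), k]
            return np.cumsum(steps, axis=1)

        for p in (0.5, 0.6, 0.75, 0.9):
            pos = elephant_walk(p)
            n1, n2 = n_steps // 10, n_steps
            # effective exponent H from <x_n^2> ~ n^(2H), estimated over the last decade of times
            h = 0.5 * (np.log(np.mean(pos[:, n2 - 1] ** 2))
                       - np.log(np.mean(pos[:, n1 - 1] ** 2))) / np.log(n2 / n1)
            print(f"p = {p:4.2f}   estimated exponent H = {h:.2f}")

    For the ±1 ERW the diffusive/superdiffusive transition sits at p = 3/4; the abstract above states that heavy-tailed steps shift the critical memory parameter to p = 1/α.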

  6. Electromagnetic Scattering by Fully Ordered and Quasi-Random Rigid Particulate Samples

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael I.; Dlugach, Janna M.; Mackowski, Daniel W.

    2016-01-01

    In this paper we have analyzed circumstances under which a rigid particulate sample can behave optically as a true discrete random medium consisting of particles randomly moving relative to each other during measurement. To this end, we applied the numerically exact superposition T-matrix method to model far-field scattering characteristics of fully ordered and quasi-randomly arranged rigid multiparticle groups in fixed and random orientations. We have shown that, in and of itself, averaging optical observables over movements of a rigid sample as a whole is insufficient unless it is combined with a quasi-random arrangement of the constituent particles in the sample. Otherwise, certain scattering effects typical of discrete random media (including some manifestations of coherent backscattering) may not be accurately replicated.

  7. A mathematical model of case-ascertainment bias: Applied to case-control studies nested within a randomized screening trial.

    PubMed

    Jansen, Rick J; Alexander, Bruce H; Hayes, Richard B; Miller, Anthony B; Wacholder, Sholom; Church, Timothy R

    2018-01-01

    When some individuals are screen-detected before the beginning of the study, but otherwise would have been diagnosed symptomatically during the study, this results in different case-ascertainment probabilities among screened and unscreened participants, referred to here as lead-time-biased case-ascertainment (LTBCA). In fact, this issue can arise even in risk-factor studies nested within a randomized screening trial; even though the screening intervention is randomly allocated to trial arms, there is no randomization to potential risk-factors and uptake of screening can differ by risk-factor strata. Under the assumptions that neither screening nor the risk factor affects underlying incidence and no other forms of bias operate, we simulate and compare the underlying cumulative incidence and that observed in the study due to LTBCA. The example used will be constructed from the randomized Prostate, Lung, Colorectal, and Ovarian cancer screening trial. The derived mathematical model is applied to simulating two nested studies to evaluate the potential for screening bias in observational lung cancer studies. Because of differential screening under plausible assumptions about preclinical incidence and duration, the simulations presented here show that LTBCA due to chest x-ray screening can significantly increase the estimated risk of lung cancer due to smoking by 1% and 50%. Traditional adjustment methods cannot account for this bias, as the influence screening has on observational study estimates involves events outside of the study observation window (enrollment and follow-up) that change eligibility for potential participants, thus biasing case ascertainment.

  8. Noisy scale-free networks

    NASA Astrophysics Data System (ADS)

    Scholz, Jan; Dejori, Mathäus; Stetter, Martin; Greiner, Martin

    2005-05-01

    The impact of observational noise on the analysis of scale-free networks is studied. Various noise sources are modeled as random link removal, random link exchange and random link addition. Emphasis is on the resulting modifications for the node-degree distribution and for a functional ranking based on betweenness centrality. The implications for estimated gene-expressed networks for childhood acute lymphoblastic leukemia are discussed.
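
    The three noise sources above are straightforward to emulate on a synthetic scale-free graph with networkx: randomly remove links, exchange (remove and re-add) links, or add links, then look at how stable a betweenness-centrality ranking is. Graph size, noise level and the top-k cut-off are illustrative choices; this is not the leukemia expression-network data.

        import random
        import networkx as nx

        random.seed(10)
        G = nx.barabasi_albert_graph(500, 3, seed=10)        # synthetic scale-free network

        def perturb(graph, frac=0.1, mode="remove"):
            """Return a noisy copy: randomly remove, exchange (rewire), or add links."""
            H = graph.copy()
            k = int(frac * H.number_of_edges())
            if mode in ("remove", "exchange"):
                H.remove_edges_from(random.sample(list(H.edges()), k))
            if mode in ("add", "exchange"):
                nodes, to_add = list(H.nodes()), k
                while to_add > 0:
                    u, v = random.sample(nodes, 2)
                    if not H.has_edge(u, v):
                        H.add_edge(u, v)
                        to_add -= 1
            return H

        base_rank = sorted(G, key=nx.betweenness_centrality(G).get, reverse=True)[:20]
        for mode in ("remove", "exchange", "add"):
            H = perturb(G, 0.1, mode)
            noisy_rank = sorted(H, key=nx.betweenness_centrality(H).get, reverse=True)[:20]
            overlap = len(set(base_rank) & set(noisy_rank))
            print(f"{mode:8s}: top-20 betweenness overlap with the original network = {overlap}/20")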

  9. Complete convergence of randomly weighted END sequences and its application.

    PubMed

    Li, Penghua; Li, Xiaoqin; Wu, Kehan

    2017-01-01

    We investigate the complete convergence of partial sums of randomly weighted extended negatively dependent (END) random variables. Some results of complete moment convergence, complete convergence and the strong law of large numbers for this dependent structure are obtained. As an application, we study the convergence of the state observers of linear-time-invariant systems. Our results extend the corresponding earlier ones.

  10. Navigability of Random Geometric Graphs in the Universe and Other Spacetimes.

    PubMed

    Cunningham, William; Zuev, Konstantin; Krioukov, Dmitri

    2017-08-18

    Random geometric graphs in hyperbolic spaces explain many common structural and dynamical properties of real networks, yet they fail to predict the correct values of the exponents of power-law degree distributions observed in real networks. In that respect, random geometric graphs in asymptotically de Sitter spacetimes, such as the Lorentzian spacetime of our accelerating universe, are more attractive as their predictions are more consistent with observations in real networks. Yet another important property of hyperbolic graphs is their navigability, and it remains unclear if de Sitter graphs are as navigable as hyperbolic ones. Here we study the navigability of random geometric graphs in three Lorentzian manifolds corresponding to universes filled only with dark energy (de Sitter spacetime), only with matter, and with a mixture of dark energy and matter. We find these graphs are navigable only in the manifolds with dark energy. This result implies that, in terms of navigability, random geometric graphs in asymptotically de Sitter spacetimes are as good as random hyperbolic graphs. It also establishes a connection between the presence of dark energy and navigability of the discretized causal structure of spacetime, which provides a basis for a different approach to the dark energy problem in cosmology.

  11. A randomized, blinded study of canal wall up versus canal wall down mastoidectomy determining the differences in viewing middle ear anatomy and pathology.

    PubMed

    Hulka, G F; McElveen, J T

    1998-09-01

    Canal wall down and intact canal wall tympanomastoidectomy represent two surgical approaches to middle ear pathology. The authors hypothesize that there is a difference in the ability to view structures in the middle ear between these two methods. Depending on the individual, many surgeons have used the two different techniques of intact canal wall and canal wall down tympanomastoidectomy for approaching the middle ear. However, opinions conflict as to which approach provides the best visualization of different locations in the middle ear. This study prospectively evaluated temporal bones to determine the differences in visualizing structures of the middle ear using these two approaches. Twelve temporal bones underwent a standardized canal wall down tympanomastoidectomy using a reversible canal wall down technique. All bones were viewed in two dissections: intact canal wall and canal wall down preparations. Four points previously had been marked on each temporal bone in randomly assigned colors. These points include the sinus tympani, posterior crus of stapes, lateral epitympanum, and the Eustachian tube orifice. An observer blinded to the purpose of the study, color, and number of locations recorded the color and location of marks observed within the temporal bones. Randomized bones of two separate settings were viewed such that each bone was viewed in both the canal wall down and the intact canal wall preparations. A significant difference was noted in the ability to observe middle ear pathology between the intact canal wall versus canal wall down tympanomastoidectomy, with the latter showing superiority (p < 0.001). Of the four subsites, the sinus tympani, posterior crus of stapes, and lateral epitympanum were observed more frequently with the canal wall down. There was no significant difference in the ability to observe the Eustachian tube orifice between the two techniques. Statistical analysis shows good reproducibility and randomization of this study. The canal wall down tympanomastoidectomy allowed for superior viewing of the three locations, sinus tympanic, posterior crus of stapes, and lateral at the tympanum, as they were marked in the study. This study shows the potential for improved visualization via the canal wall down tympanomastoidectomy. A significant amount of literature written by individuals and otology group practices is available retrospectively comparing the advantages and disadvantages of intact canal wall versus canal wall down mastoidectomy procedures for approaching middle ear pathology. In the interest of objectively evaluating the differences between these two approaches, we have studied temporal bones in a prospective randomized, blinded study comparing the two. Twelve bones were used and observed twice, once in each of 2 sessions. All bones were viewed in two dissections: intact canal wall and canal wall down mastoidectomy. Four points were marked on each temporal bone in three different colors applied in a randomized order to eliminate observer expectation. The four points marked include sinus tympani, posterior crus of the stapes footplate, lateral epitympanum, and Eustachian tube orifice. Both intact canal wall and canal wall down bones were provided randomly to the observer at each viewing session. Before the observer was allowed to see the dissections, those requiring replacement of the canal for the first session of the study had this done in a method using native posterior bony canal. 
Temporal bones were presented to an expert otologist in a randomized fashion with each temporal bone being placed in a temporal bone bowl holder and specialized framework, allowing for rotation and repositioning approximating the experience in an operating room setting. For each temporal bone, the observer filled in a questionnaire describing his or her observations by denoting both location and color of marks observed. (ABSTRACT TRUNCATED)

  12. Bias in Observational Studies of Prevalent Users: Lessons for Comparative Effectiveness Research From a Meta-Analysis of Statins

    PubMed Central

    Danaei, Goodarz; Tavakkoli, Mohammad; Hernán, Miguel A.

    2012-01-01

    Randomized clinical trials (RCTs) are usually the preferred strategy with which to generate evidence of comparative effectiveness, but conducting an RCT is not always feasible. Though observational studies and RCTs often provide comparable estimates, the questioning of observational analyses has recently intensified because of randomized-observational discrepancies regarding the effect of postmenopausal hormone replacement therapy on coronary heart disease. Reanalyses of observational data that excluded prevalent users of hormone replacement therapy led to attenuated discrepancies, which begs the question of whether exclusion of prevalent users should be generally recommended. In the current study, the authors evaluated the effect of excluding prevalent users of statins in a meta-analysis of observational studies of persons with cardiovascular disease. The pooled, multivariate-adjusted mortality hazard ratio for statin use was 0.77 (95% confidence interval (CI): 0.65, 0.91) in 4 studies that compared incident users with nonusers, 0.70 (95% CI: 0.64, 0.78) in 13 studies that compared a combination of prevalent and incident users with nonusers, and 0.54 (95% CI: 0.45, 0.66) in 13 studies that compared prevalent users with nonusers. The corresponding hazard ratio from 18 RCTs was 0.84 (95% CI: 0.77, 0.91). It appears that the greater the proportion of prevalent statin users in observational studies, the larger the discrepancy between observational and randomized estimates. PMID:22223710

  13. Narrow-band generation in random distributed feedback fiber laser.

    PubMed

    Sugavanam, Srikanth; Tarasov, Nikita; Shu, Xuewen; Churkin, Dmitry V

    2013-07-15

    Narrow-band emission with a spectral width down to ~0.05 nm is achieved in a random distributed feedback fiber laser employing narrow-band fiber Bragg grating or fiber Fabry-Perot interferometer filters. The observed line-width is ~10 times narrower than that of other random distributed feedback fiber lasers demonstrated to date. The random DFB laser with a Fabry-Perot interferometer filter simultaneously provides multi-wavelength and narrow-band (within each line) generation, with the possibility of further wavelength tuning.

  14. Effectiveness of adjuvant radiotherapy in patients with oropharyngeal and floor of mouth squamous cell carcinoma and concomitant histological verification of singular ipsilateral cervical lymph node metastasis (pN1-state)--a prospective multicenter randomized controlled clinical trial using a comprehensive cohort design.

    PubMed

    Moergel, Maximilian; Jahn-Eimermacher, Antje; Krummenauer, Frank; Reichert, Torsten E; Wagner, Wilfried; Wendt, Thomas G; Werner, Jochen A; Al-Nawas, Bilal

    2009-12-23

    Modern radiotherapy plays an important role in therapy of advanced head and neck carcinomas. However, no clinical studies have been published addressing the effectiveness of postoperative radiotherapy in patients with small tumor (pT1, pT2) and concomitant ipsilateral metastasis of a single lymph node (pN1), which would provide a basis for a general treatment recommendation. The present study is a non-blinded, prospective, multi-center randomized controlled trial (RCT). As the primary clinical endpoint, overall-survival in patients receiving postoperative radiation therapy vs. patients without adjuvant therapy following curative intended surgery is compared. The aim of the study is to enroll 560 adult males and females for 1:1 randomization to one of the two treatment arms (irradiation/no irradiation). Since patients with small tumor (T1/T2) but singular lymph node metastasis are rare and the amount of patients consenting to randomization is not predictable in advance, all patients rejecting randomization will be treated as preferred and enrolled in a prospective observational study (comprehensive cohort design) after giving informed consent. This observational part of the trial will be performed with maximum consistency to the treatment and observation protocol of the RCT. Because the impact of patient preference for a certain treatment option is not calculable, parallel design of RCT and observational study may provide a maximum of evidence and efficacy for evaluation of treatment outcome. Secondary clinical endpoints are as follows: incidence and time to tumor relapse (locoregional relapse, lymph node involvement and distant metastatic spread), Quality of life as reported by EORTC (QLQ-C30 with H&N 35 module), and time from operation to orofacial rehabilitation. All tumors represent a homogeneous clinical state and therefore additional investigation of protein expression levels within resection specimen may serve for establishment of surrogate parameters of patient outcome. The inherent challenges of a rare clinical condition (pN1) and two substantially different therapy arms would limit the practicality of a classical randomized study. The concept of a Comprehensive Cohort Design combines the preference of a randomized study, with the option of careful data interpretation within an observational study. ClinicalTrials.gov: NCT00964977.

  15. [Randomized, controlled clinical trials with observational follow-up investigations for evaluating efficacy of antihyperglycaemic treatment. I. Main results of the studies].

    PubMed

    Jermendy, György

    2018-04-01

    The effect of antihyperglycaemic (antidiabetic) treatment on the late diabetic complications is one of the most important research areas in clinical diabetology. The relationship between glycaemic control and late micro- and macrovascular complications was highlighted by the results of the DCCT (Diabetes Control and Complications Trial) with type 1 and by the UKPDS (United Kingdom Prospective Diabetes Study) with type 2 diabetic patients. In these studies, observational follow-up investigations were also performed after the close-out of the randomized phase of the trial. In addition to these landmark studies, other randomized, controlled efficacy trials were also performed with observational follow-up investigations resulting in the development of the concept of metabolic memory or metabolic legacy. In this article, the main results of the studies are summarized. Orv Hetil. 2018; 159(15): 575-582.

  16. Are Randomized Controlled Trials the (G)old Standard? From Clinical Intelligence to Prescriptive Analytics.

    PubMed

    Van Poucke, Sven; Thomeer, Michiel; Heath, John; Vukicevic, Milan

    2016-07-06

    Despite the accelerating pace of scientific discovery, the current clinical research enterprise does not sufficiently address pressing clinical questions. Given the constraints on clinical trials, for a majority of clinical questions, the only relevant data available to aid in decision making are based on observation and experience. Our purpose here is 3-fold. First, we describe the classic context of medical research guided by Popper's scientific epistemology of "falsificationism." Second, we discuss challenges and shortcomings of randomized controlled trials and present the potential of observational studies based on big data. Third, we cover several obstacles related to the use of observational (retrospective) data in clinical studies. We conclude that randomized controlled trials are not at risk for extinction, but innovations in statistics, machine learning, and big data analytics may generate a completely new ecosystem for exploration and validation.

  17. Are Randomized Controlled Trials the (G)old Standard? From Clinical Intelligence to Prescriptive Analytics

    PubMed Central

    2016-01-01

    Despite the accelerating pace of scientific discovery, the current clinical research enterprise does not sufficiently address pressing clinical questions. Given the constraints on clinical trials, for a majority of clinical questions, the only relevant data available to aid in decision making are based on observation and experience. Our purpose here is 3-fold. First, we describe the classic context of medical research guided by Popper’s scientific epistemology of “falsificationism.” Second, we discuss challenges and shortcomings of randomized controlled trials and present the potential of observational studies based on big data. Third, we cover several obstacles related to the use of observational (retrospective) data in clinical studies. We conclude that randomized controlled trials are not at risk for extinction, but innovations in statistics, machine learning, and big data analytics may generate a completely new ecosystem for exploration and validation. PMID:27383622

  18. Time-resolved two-window measurement of Wigner functions for coherent backscatter from a turbid medium

    NASA Astrophysics Data System (ADS)

    Reil, Frank; Thomas, John E.

    2002-05-01

    For the first time we are able to observe the time-resolved Wigner function of enhanced backscatter from a random medium using a novel two-window technique. This technique enables us to directly verify the phase-conjugating properties of random media. An incident divergent beam displays a convergent enhanced backscatter cone. We measure the joint position and momentum (x, p) distributions of the light field as a function of propagation time in the medium. The two-window technique allows us to independently control the resolutions for position and momentum, thereby surpassing the uncertainty limit associated with Fourier transform pairs. By using a low-coherence light source in a heterodyne detection scheme, we observe enhanced backscattering resolved by path length in the random medium, providing information about the evolution of optical coherence as a function of penetration depth in the random medium.
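
    For orientation, the quantity being measured is the Wigner distribution; its standard textbook definition is shown below as background only (it is not quoted from the abstract):

```latex
% Standard definition of the Wigner distribution of a state \hat{\rho}
W(x,p) \;=\; \frac{1}{\pi\hbar}\int_{-\infty}^{\infty}
  \langle x+y \,|\, \hat{\rho} \,|\, x-y \rangle \, e^{-2ipy/\hbar}\, \mathrm{d}y .
```

    The two-window heterodyne scheme described above effectively measures a smoothed version of this distribution, with the position and momentum resolutions adjustable independently.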

  19. Comparative effectiveness research in cancer with observational data.

    PubMed

    Giordano, Sharon H

    2015-01-01

    Observational studies are increasingly being used for comparative effectiveness research. These studies can have the greatest impact when randomized trials are not feasible or when randomized studies have not included the population or outcomes of interest. However, careful attention must be paid to study design to minimize the likelihood of selection biases. Analytic techniques, such as multivariable regression modeling, propensity score analysis, and instrumental variable analysis, can also be used to help address confounding. Oncology has many existing large and clinically rich observational databases that can be used for comparative effectiveness research. With careful study design, observational studies can produce valid results to assess the benefits and harms of a treatment or intervention in representative real-world populations.
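
    To make one of the adjustment techniques named above concrete, here is a minimal sketch of propensity-score-based adjustment (inverse probability of treatment weighting) on simulated data; all variable names, coefficients, and the effect size are hypothetical and are not drawn from the article.

```python
# Minimal sketch of propensity-score (IPTW) adjustment on simulated data.
# Illustrative only; names and numbers are hypothetical, not from the article.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
age = rng.normal(60, 10, n)                      # confounder
stage = rng.integers(1, 4, n)                    # confounder
# Treatment assignment depends on the confounders (selection bias).
p_treat = 1 / (1 + np.exp(-(-4 + 0.05 * age + 0.3 * stage)))
treat = rng.binomial(1, p_treat)
# Outcome depends on the confounders and a true treatment effect of -0.5.
outcome = 2 + 0.03 * age + 0.4 * stage - 0.5 * treat + rng.normal(0, 1, n)

X = np.column_stack([age, stage])
ps = LogisticRegression().fit(X, treat).predict_proba(X)[:, 1]
w = np.where(treat == 1, 1 / ps, 1 / (1 - ps))   # IPTW weights

naive = outcome[treat == 1].mean() - outcome[treat == 0].mean()
weighted = (np.average(outcome[treat == 1], weights=w[treat == 1])
            - np.average(outcome[treat == 0], weights=w[treat == 0]))
print(f"naive difference:    {naive:+.2f}")      # biased by confounding
print(f"IPTW-adjusted diff.: {weighted:+.2f}")   # closer to the true -0.5
```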

  20. Random matrix approach to plasmon resonances in the random impedance network model of disordered nanocomposites

    NASA Astrophysics Data System (ADS)

    Olekhno, N. A.; Beltukov, Y. M.

    2018-05-01

    Random impedance networks are widely used as a model to describe plasmon resonances in disordered metal-dielectric and other two-component nanocomposites. In the present work, the spectral properties of resonances in random networks are studied within the framework of the random matrix theory. We have shown that the appropriate ensemble of random matrices for the considered problem is the Jacobi ensemble (the MANOVA ensemble). The obtained analytical expressions for the density of states in such resonant networks show a good agreement with the results of numerical simulations in a wide range of metal filling fractions 0

  1. Detecting targets hidden in random forests

    NASA Astrophysics Data System (ADS)

    Kouritzin, Michael A.; Luo, Dandan; Newton, Fraser; Wu, Biao

    2009-05-01

    Military tanks, cargo or troop carriers, missile carriers, and rocket launchers often hide from detection in forests, which complicates the problem of locating these hidden targets. An electro-optic camera mounted on a surveillance aircraft or unmanned aerial vehicle is used to capture images of the forests with possible hidden targets, e.g., rocket launchers. We consider random forests with longitudinal and latitudinal correlations. Specifically, foliage coverage is encoded with a binary representation (i.e., foliage or no foliage) and is correlated in adjacent regions. We address the detection problem of camouflaged targets hidden in random forests by building memory into the observations. In particular, we propose an efficient algorithm to generate random forests, ground, and camouflage of hidden targets with two-dimensional correlations. The observations are a sequence of snapshots consisting of foliage-obscured ground or target. Theoretically, detection is possible because there are subtle differences in the correlations of the ground and the camouflage of the rocket launcher; however, these differences are well beyond human perception. To detect the presence of hidden targets automatically, we develop a Markov representation for these sequences and modify the classical filtering equations to allow Markov chain observations. Particle filters are used to estimate the position of the targets in combination with a novel random weighting technique. Finally, we present proof-of-concept simulations.
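
    The particle-filtering machinery referred to above can be illustrated on a toy problem. The sketch below is a generic bootstrap particle filter for a one-dimensional hidden state, not the paper's forest/camouflage model; it only shows the predict-weight-resample cycle that such detectors build on.

```python
# Generic bootstrap particle filter on a toy 1-D tracking problem.
# Not the paper's model; parameters and the state process are invented.
import numpy as np

rng = np.random.default_rng(1)
T, N = 100, 1000                  # time steps, particles
q, r = 0.5, 1.0                   # process and observation noise std

# Simulate a hidden random-walk state and noisy observations of it.
x_true = np.cumsum(rng.normal(0, q, T))
y_obs = x_true + rng.normal(0, r, T)

particles = rng.normal(0, 1, N)
estimates = np.empty(T)
for t in range(T):
    particles += rng.normal(0, q, N)                       # predict
    w = np.exp(-0.5 * ((y_obs[t] - particles) / r) ** 2)   # weight by likelihood
    w /= w.sum()
    estimates[t] = np.dot(w, particles)                    # posterior mean
    particles = particles[rng.choice(N, size=N, p=w)]      # resample

print("RMS tracking error:", np.sqrt(np.mean((estimates - x_true) ** 2)))
```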

  2. The Random Telegraph Signal Behavior of Intermittently Stuck Bits in SDRAMs

    NASA Astrophysics Data System (ADS)

    Chugg, Andrew Michael; Burnell, Andrew J.; Duncan, Peter H.; Parker, Sarah; Ward, Jonathan J.

    2009-12-01

    This paper reports behavior analogous to the Random Telegraph Signal (RTS) seen in the leakage currents from radiation-induced hot pixels in Charge Coupled Devices (CCDs), but in the context of stuck bits in Synchronous Dynamic Random Access Memories (SDRAMs). Our analysis suggests that pseudo-random sticking and unsticking of the SDRAM bits is due to thermally induced fluctuations in leakage current through displacement damage complexes in depletion regions that were created by high-energy neutron and proton interactions. It is shown that the number of observed stuck bits increases exponentially with temperature, due to the general increase in the leakage currents through the damage centers with temperature. Nevertheless, some stuck bits are seen to pseudo-randomly stick and unstick in the context of a continuously rising trend of temperature, thus demonstrating that their damage centers can exist in multiple widely spaced, discrete levels of leakage current, which is highly consistent with RTS. This implies that these intermittently stuck bits (ISBs) are a displacement damage phenomenon and are unrelated to microdose issues, which is confirmed by the observation that they also occur in unbiased irradiation. Finally, we note that observed variations in the periodicity of the sticking and unsticking behavior on several timescales are most readily explained by multiple leakage current pathways through displacement damage complexes spontaneously and independently opening and closing under the influence of thermal vibrations.

  3. Cumulative Evidence of Randomized Controlled and Observational Studies on Catheter-Related Infection Risk of Central Venous Catheter Insertion Site in ICU Patients: A Pairwise and Network Meta-Analysis.

    PubMed

    Arvaniti, Kostoula; Lathyris, Dimitrios; Blot, Stijn; Apostolidou-Kiouti, Fani; Koulenti, Despoina; Haidich, Anna-Bettina

    2017-04-01

    Selection of the central venous catheter insertion site in ICU patients could help reduce catheter-related infections. Although the subclavian site was considered the most appropriate, its preferential use in ICU patients is not generalized and has been questioned by contradictory meta-analysis results. In addition, conflicting data exist on alternative site selection whenever the subclavian site is contraindicated. Our objective was to compare catheter-related bloodstream infection and colonization risk between the three sites (subclavian, internal jugular, and femoral) in adult ICU patients. We searched MEDLINE, EMBASE, the Cochrane Central Register of Controlled Trials, CINAHL, and ClinicalTrials.gov. Eligible studies were randomized controlled trials and observational ones. Extracted data were analyzed by pairwise and network meta-analysis. Twenty studies were included; 11 were observational, seven were randomized controlled trials for other outcomes, and two were randomized controlled trials for sites. We evaluated 18,554 central venous catheters: 9,331 from observational studies, 5,482 from randomized controlled trials for other outcomes, and 3,741 from randomized controlled trials for sites. Colonization risk was higher for internal jugular (relative risk, 2.25 [95% CI, 1.84-2.75]; I² = 0%) and femoral (relative risk, 2.92 [95% CI, 2.11-4.04]; I² = 24%), compared with subclavian. Catheter-related bloodstream infection risk was comparable for internal jugular and subclavian, higher for femoral than subclavian (relative risk, 2.44 [95% CI, 1.25-4.75]; I² = 61%), and lower for internal jugular than femoral (relative risk, 0.55 [95% CI, 0.34-0.89]; I² = 61%). When observational studies that did not control for baseline characteristics were excluded, catheter-related bloodstream infection risk was comparable between the sites. In ICU patients, internal jugular and subclavian sites may similarly decrease catheter-related bloodstream infection risk when compared with the femoral site. The subclavian site could be suggested as the most appropriate whenever colonization risk is considered and the site is not otherwise contraindicated. Current evidence on catheter-related bloodstream infection risk at the femoral site, compared with the other sites, is inconclusive.
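
    As a sketch of the kind of pairwise pooling underlying the relative risks quoted above, the following computes a relative risk with a 95% CI from 2x2 counts and combines studies with inverse-variance (fixed-effect) weights on the log scale; the counts are invented for illustration and are not the review's data.

```python
# Relative risk with 95% CI from 2x2 counts, pooled by inverse-variance
# (fixed-effect) weighting on the log scale. Counts are made up.
import numpy as np

def log_rr(events1, n1, events2, n2):
    """Return log relative risk and its standard error."""
    rr = (events1 / n1) / (events2 / n2)
    se = np.sqrt(1 / events1 - 1 / n1 + 1 / events2 - 1 / n2)
    return np.log(rr), se

studies = [(12, 400, 5, 410), (20, 900, 9, 880), (7, 300, 3, 310)]
logs, ses = zip(*(log_rr(*s) for s in studies))
logs, ses = np.array(logs), np.array(ses)

w = 1 / ses**2                                   # inverse-variance weights
pooled = np.sum(w * logs) / np.sum(w)
pooled_se = np.sqrt(1 / np.sum(w))
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled RR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")
```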

  4. Local dependence in random graph models: characterization, properties and statistical inference

    PubMed Central

    Schweinberger, Michael; Handcock, Mark S.

    2015-01-01

    Summary Dependent phenomena, such as relational, spatial and temporal phenomena, tend to be characterized by local dependence in the sense that units which are close in a well-defined sense are dependent. In contrast with spatial and temporal phenomena, though, relational phenomena tend to lack a natural neighbourhood structure in the sense that it is unknown which units are close and thus dependent. Owing to the challenge of characterizing local dependence and constructing random graph models with local dependence, many conventional exponential family random graph models induce strong dependence and are not amenable to statistical inference. We take first steps to characterize local dependence in random graph models, inspired by the notion of finite neighbourhoods in spatial statistics and M-dependence in time series, and we show that local dependence endows random graph models with desirable properties which make them amenable to statistical inference. We show that random graph models with local dependence satisfy a natural domain consistency condition which every model should satisfy, but conventional exponential family random graph models do not satisfy. In addition, we establish a central limit theorem for random graph models with local dependence, which suggests that random graph models with local dependence are amenable to statistical inference. We discuss how random graph models with local dependence can be constructed by exploiting either observed or unobserved neighbourhood structure. In the absence of observed neighbourhood structure, we take a Bayesian view and express the uncertainty about the neighbourhood structure by specifying a prior on a set of suitable neighbourhood structures. We present simulation results and applications to two real world networks with ‘ground truth’. PMID:26560142

  5. Autoshaping, random control, and omission training in the rat

    PubMed Central

    Locurto, Charles; Terrace, H. S.; Gibbon, John

    1976-01-01

    The role of the stimulus-reinforcer contingency in the development and maintenance of lever contact responding was studied in hooded rats. In Experiment I, three groups of experimentally naive rats were trained either on autoshaping, omission training, or a random-control procedure. Subjects trained by the autoshaping procedure responded more consistently than did either random-control or omission-trained subjects. The probability of at least one lever contact per trial was slightly higher in subjects trained by the omission procedure than by the random-control procedure. However, these differences were not maintained during extended training, nor were they evident in total lever-contact frequencies. When omission and random-control subjects were switched to the autoshaping condition, lever contacts increased in all animals, but a pronounced retardation was observed in omission subjects relative to the random-control subjects. In addition, subjects originally exposed to the random-control procedure, and later switched to autoshaping, acquired more rapidly than naive subjects that were exposed only to the autoshaping procedure. In Experiment II, subjects originally trained by an autoshaping procedure were exposed either to an omission, a random-control, or an extinction procedure. No differences were observed among the groups either in the rate at which lever contacts decreased or in the frequency of lever contacts at the end of training. These data implicate prior experience in the interpretation of omission-training effects and suggest limitations in the influence of stimulus-reinforcer relations in autoshaping. PMID:16811960

  6. Autoshaping, random control, and omission training in the rat.

    PubMed

    Locurto, C; Terrace, H S; Gibbon, J

    1976-11-01

    The role of the stimulus-reinforcer contingency in the development and maintenance of lever contact responding was studied in hooded rats. In Experiment I, three groups of experimentally naive rats were trained either on autoshaping, omission training, or a random-control procedure. Subjects trained by the autoshaping procedure responded more consistently than did either random-control or omission-trained subjects. The probability of at least one lever contact per trial was slightly higher in subjects trained by the omission procedure than by the random-control procedure. However, these differences were not maintained during extended training, nor were they evident in total lever-contact frequencies. When omission and random-control subjects were switched to the autoshaping condition, lever contacts increased in all animals, but a pronounced retardation was observed in omission subjects relative to the random-control subjects. In addition, subjects originally exposed to the random-control procedure, and later switched to autoshaping, acquired more rapidly than naive subjects that were exposed only to the autoshaping procedure. In Experiment II, subjects originally trained by an autoshaping procedure were exposed either to an omission, a random-control, or an extinction procedure. No differences were observed among the groups either in the rate at which lever contacts decreased or in the frequency of lever contacts at the end of training. These data implicate prior experience in the interpretation of omission-training effects and suggest limitations in the influence of stimulus-reinforcer relations in autoshaping.

  7. A direct observation method for auditing large urban centers using stratified sampling, mobile GIS technology and virtual environments.

    PubMed

    Lafontaine, Sean J V; Sawada, M; Kristjansson, Elizabeth

    2017-02-16

    With the expansion and growth of research on neighbourhood characteristics, there is an increased need for direct observational field audits. Herein, we introduce a novel direct observational audit method and systematic social observation instrument (SSOI) for efficiently assessing neighbourhood aesthetics over large urban areas. Our audit method uses spatial random sampling stratified by residential zoning and incorporates both mobile geographic information systems technology and virtual environments. The reliability of our method was tested in two ways: first, in 15 Ottawa neighbourhoods, we compared results at audited locations over two subsequent years; and second, we audited every residential block (167 blocks) in one neighbourhood and compared the distribution of SSOI aesthetics index scores with results from the randomly audited locations. Finally, we present interrater reliability and consistency results on all observed items. The observed neighbourhood average aesthetics index score estimated from four or five stratified random audit locations is sufficient to characterize the average neighbourhood aesthetics. The SSOI was internally consistent and demonstrated good to excellent interrater reliability. At the neighbourhood level, aesthetics is positively related to SES and physical activity and negatively correlated with BMI. The proposed approach to direct neighbourhood auditing performs sufficiently well and has the advantage of financial and temporal efficiency when auditing a large city.

  8. Thalamic reticular cells firing modes and its dependency on the frequency and amplitude ranges of the current stimulus.

    PubMed

    Hernandez, Oscar; Hernandez, Lilibeth; Vera, David; Santander, Alcides; Zurek, Eduardo

    2015-01-01

    The neurons of the Thalamic Reticular Nucleus (TRNn) respond to inputs in two activity modes called burst and tonic firing, and both can be observed in different physiological states. The functional states of the thalamus depend in part on the properties of synaptic transmission between the TRNn and the thalamocortical and corticothalamic neurons. A dendrite can receive inhibitory and excitatory postsynaptic potentials. The novelties presented in this paper can be summarized as follows: First, the paper shows, through a computational simulation, that the burst and tonic firings observed in the TRNn soma could be explained as a product of random synaptic inputs on the distal dendrites; the tonic firings are generated by random excitatory stimuli, and the burst firings are generated by two different types of stimuli: inhibitory random stimuli, and a combination of inhibitory (from TRNn) and excitatory (from corticothalamic and thalamocortical neurons) random stimuli. Second, according to in vivo recordings, we have found that the bursts observed in the TRNn soma have graded properties that are proportional to the stimulus frequency. Third, a novel method for quantitatively characterizing the accelerando-decelerando pattern is proposed.

  9. Randomized Trials Built on Sand: Examples from COPD, Hormone Therapy, and Cancer

    PubMed Central

    Suissa, Samy

    2012-01-01

    The randomized controlled trial is the fundamental study design to evaluate the effectiveness of medications and receive regulatory approval. Observational studies, on the other hand, are essential to address post-marketing drug safety issues but have also been used to uncover new indications or new benefits for already marketed drugs. Hormone replacement therapy (HRT), for instance, which is effective for menopausal symptoms, was reported in several observational studies during the 1980s and 1990s to also significantly reduce the incidence of coronary heart disease. This claim was refuted in 2002 by the large-scale Women’s Health Initiative randomized trial. An example of a new indication for an old drug is that of metformin, an anti-diabetic medication, which is being hailed as a potential anti-cancer agent, primarily on the basis of several recent observational studies that reported impressive reductions in cancer incidence and mortality with its use. These observational studies have now sparked the conduct of large-scale randomized controlled trials currently ongoing in cancer. We show in this paper that the spectacular effects on new indications or new outcomes reported in many observational studies in chronic obstructive pulmonary disease (COPD), HRT, and cancer are the result of time-related biases, such as immortal time bias, that tend to seriously exaggerate the benefits of a drug and that eventually disappear with the proper statistical analysis. In all, while observational studies are central to assessing the effects of drugs, their proper design and analysis are essential to avoid bias. The scientific evidence on the potential beneficial effects in new indications of existing drugs will need to be more carefully assessed before embarking on long and expensive unsubstantiated trials. PMID:23908838

  10. Patent foramen ovale closure and medical treatments for secondary stroke prevention: a systematic review of observational and randomized evidence.

    PubMed

    Kitsios, Georgios D; Dahabreh, Issa J; Abu Dabrh, Abd Moain; Thaler, David E; Kent, David M

    2012-02-01

    Patients discovered to have a patent foramen ovale in the setting of a cryptogenic stroke may be treated with percutaneous closure, antiplatelet therapy, or anticoagulants. A recent randomized trial (CLOSURE I) did not detect any benefit of closure over medical treatment alone; the optimal medical therapy is also unknown. We synthesized the available evidence on secondary stroke prevention in patients with patent foramen ovale and cryptogenic stroke. A MEDLINE search was performed to identify longitudinal studies investigating medical treatment or closure, and meta-analyses of incidence rates (IRs) and IR ratios of recurrent cerebrovascular events were conducted. Fifty-two single-arm studies, 7 comparative nonrandomized studies, and the CLOSURE I trial were reviewed. The summary IR of recurrent stroke was 0.36 events (95% confidence interval [CI], 0.24-0.56) per 100 person-years with closure versus 2.53 events (95% CI, 1.91-3.35) per 100 person-years with medical therapy. In comparative observational studies, closure was superior to medical therapy (IR ratio=0.19; 95% CI, 0.07-0.54). The IR for the closure arm of the CLOSURE I trial was higher than the summary estimate from observational studies, and the trial found no significant benefit of closure over medical treatment (P=0.002 comparing efficacy estimates between observational studies and the trial). Observational and randomized data (9 studies) comparing medical therapies were consistent and suggested that anticoagulants are superior to antiplatelets for preventing stroke recurrence (IR ratio=0.42; 95% CI, 0.18-0.98). Although further randomized trial data are needed to precisely determine the effects of closure on stroke recurrence, the results of CLOSURE I challenge the credibility of a substantial body of observational evidence strongly favoring mechanical closure over medical therapy.
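
    The incidence rates and IR ratios above can be reproduced in form (not in value) with a short calculation; the event counts and person-years below are illustrative only and are not the study's data.

```python
# Incidence rate per 100 person-years with a normal-approximation CI on the
# log scale, and an incidence rate ratio for two arms. Numbers are made up.
import numpy as np

def ir_per_100py(events, person_years):
    ir = 100 * events / person_years
    se_log = 1 / np.sqrt(events)              # SE of log(rate), Poisson approx.
    lo, hi = ir * np.exp(-1.96 * se_log), ir * np.exp(1.96 * se_log)
    return ir, lo, hi

ir_closure = ir_per_100py(events=8, person_years=2200)
ir_medical = ir_per_100py(events=45, person_years=1800)
irr = ir_closure[0] / ir_medical[0]
print("closure: %.2f (%.2f-%.2f) per 100 PY" % ir_closure)
print("medical: %.2f (%.2f-%.2f) per 100 PY" % ir_medical)
print("IR ratio: %.2f" % irr)
```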

  11. Efficiency of the human observer detecting random signals in random backgrounds

    PubMed Central

    Park, Subok; Clarkson, Eric; Kupinski, Matthew A.; Barrett, Harrison H.

    2008-01-01

    The efficiencies of the human observer and the channelized-Hotelling observer relative to the ideal observer for signal-detection tasks are discussed. Both signal-known-exactly (SKE) tasks and signal-known-statistically (SKS) tasks are considered. Signal location is uncertain for the SKS tasks, and lumpy backgrounds are used for background uncertainty in both cases. Markov chain Monte Carlo methods are employed to determine ideal-observer performance on the detection tasks. Psychophysical studies are conducted to compute human-observer performance on the same tasks. Efficiency is computed as the squared ratio of the detectabilities of the observer of interest to the ideal observer. Human efficiencies are approximately 2.1% and 24%, respectively, for the SKE and SKS tasks. The results imply that human observers are not affected as much as the ideal observer by signal-location uncertainty even though the ideal observer outperforms the human observer for both tasks. Three different simplified pinhole imaging systems are simulated, and the humans and the model observers rank the systems in the same order for both the SKE and the SKS tasks. PMID:15669610
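
    The efficiency referred to above is conventionally defined as the squared ratio of detectability indices; stating the standard convention (not a quotation from the paper), with the abstract's reported values inserted for the two tasks:

```latex
% Observer efficiency relative to the ideal observer
\eta \;=\; \left( \frac{d'_{\mathrm{obs}}}{d'_{\mathrm{ideal}}} \right)^{2},
\qquad
\eta_{\mathrm{SKE}} \approx 0.021, \quad \eta_{\mathrm{SKS}} \approx 0.24 .
```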

  12. Conflicting results between randomized trials and observational studies on the impact of proton pump inhibitors on cardiovascular events when coadministered with dual antiplatelet therapy: systematic review.

    PubMed

    Melloni, Chiara; Washam, Jeffrey B; Jones, W Schuyler; Halim, Sharif A; Hasselblad, Victor; Mayer, Stephanie B; Heidenfelder, Brooke L; Dolor, Rowena J

    2015-01-01

    Discordant results have been reported on the effects of concomitant use of proton pump inhibitors (PPIs) and dual antiplatelet therapy (DAPT) for cardiovascular outcomes. We conducted a systematic review comparing the effectiveness and safety of concomitant use of PPIs and DAPT in the postdischarge treatment of unstable angina/non-ST-segment-elevation myocardial infarction patients. We searched for clinical studies in MEDLINE, EMBASE, and the Cochrane Database of Systematic Reviews, from 1995 to 2012. Reviewers screened and extracted data, assessed applicability and quality, and graded the strength of evidence. We performed meta-analyses of direct comparisons when outcomes and follow-up periods were comparable. Thirty-five studies were eligible. Five (4 randomized controlled trials and 1 observational) assessed the effect of omeprazole when added to DAPT; the other 30 (observational) assessed the effect of PPIs as a class when compared with no PPIs. Random-effects meta-analyses of the studies assessing PPIs as a class consistently reported higher event rates in patients receiving PPIs for various clinical outcomes at 1 year (composite ischemic end points, all-cause mortality, nonfatal MI, stroke, revascularization, and stent thrombosis). However, the results from randomized controlled trials evaluating omeprazole compared with placebo showed no difference in ischemic outcomes, despite a reduction in upper gastrointestinal bleeding with omeprazole. Large, well-conducted observational studies of PPIs and randomized controlled trials of omeprazole seem to provide conflicting results for the effect of PPIs on cardiovascular outcomes when coadministered with DAPT. Prospective trials that directly compare pharmacodynamic parameters and clinical events among specific PPI agents in patients with unstable angina/non-ST-segment-elevation myocardial infarction treated with DAPT are warranted. © 2015 American Heart Association, Inc.

  13. German adjuvant intergroup node-positive study: a phase III trial to compare oral ibandronate versus observation in patients with high-risk early breast cancer.

    PubMed

    von Minckwitz, Gunter; Möbus, Volker; Schneeweiss, Andreas; Huober, Jens; Thomssen, Christoph; Untch, Michael; Jackisch, Christian; Diel, Ingo J; Elling, Dirk; Conrad, Bettina; Kreienberg, Rolf; Müller, Volkmar; Lück, Hans-Joachim; Bauerfeind, Ingo; Clemens, Michael; Schmidt, Marcus; Noeding, Stefanie; Forstbauer, Helmut; Barinoff, Jana; Belau, Antje; Nekljudova, Valentina; Harbeck, Nadia; Loibl, Sibylle

    2013-10-01

    Bisphosphonates prevent skeletal-related events in patients with metastatic breast cancer. Their effect in early breast cancer is controversial. Ibandronate is an orally and intravenously available amino-bisphosphonate with a favorable toxicity profile. It therefore qualifies as a potential agent for adjuvant use. The GAIN (German Adjuvant Intergroup Node-Positive) study was an open-label, randomized, controlled phase III trial with a 2 × 2 factorial design. Patients with node-positive early breast cancer were randomly assigned 1:1 to two different dose-dense chemotherapy regimens and 2:1 to ibandronate 50 mg per day orally for 2 years or observation. In all, 2,640 patients and 728 events were estimated to be required to demonstrate an increase in disease-free survival (DFS) by ibandronate from 75% to 79.5%, using a two-sided α = .05 and 1-β of 80%. We report here the efficacy analysis for ibandronate, which was released by the independent data monitoring committee because the futility boundary was not crossed after 50% of the required DFS events were observed. Between June 2004 and August 2008, 2,015 patients were randomly assigned to ibandronate and 1,008 to observation. Patients randomly assigned to ibandronate showed no superior DFS or overall survival (OS) compared with patients randomly assigned to observation (DFS: hazard ratio, 0.945; 95% CI, 0.768 to 1.161; P = .589; OS: HR, 1.040; 95% CI, 0.763 to 1.419; P = .803). DFS was numerically longer if ibandronate was used in patients younger than 40 years or older than 60 years compared with patients aged 40 to 59 years (test for interaction P = .093). Adjuvant treatment with oral ibandronate did not improve the outcome of patients with high-risk early breast cancer who received dose-dense chemotherapy.
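
    As a rough back-of-the-envelope illustration of where an event target of this order of magnitude comes from, the sketch below applies Schoenfeld's events formula under simplifying assumptions (exponential survival at the planning horizon, 2:1 allocation, no dropout). The protocol's exact assumptions are not given in the abstract, so the result is not expected to match the planned 728 events exactly.

```python
# Rough check of the required number of DFS events via Schoenfeld's formula.
# Simplifying assumptions only; not the trial's actual planning calculation.
import math
from scipy.stats import norm

s_control, s_treat = 0.75, 0.795          # assumed DFS at the planning horizon
alpha, power = 0.05, 0.80
p_treat, p_control = 2 / 3, 1 / 3         # 2:1 randomization

hr = math.log(s_treat) / math.log(s_control)   # hazard ratio under exponentials
z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
events = z**2 / (p_treat * p_control * math.log(hr) ** 2)
print(f"HR = {hr:.3f}, required events ≈ {events:.0f}")
```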

  14. Simulating intrafraction prostate motion with a random walk model.

    PubMed

    Pommer, Tobias; Oh, Jung Hun; Munck Af Rosenschöld, Per; Deasy, Joseph O

    2017-01-01

    Prostate motion during radiation therapy (i.e., intrafraction motion) can cause unwanted loss of radiation dose to the prostate and increased dose to the surrounding organs at risk. A compact but general statistical description of this motion could be useful for simulation of radiation therapy delivery or margin calculations. We investigated whether prostate motion could be modeled with a random walk model. Prostate motion recorded during 548 radiation therapy fractions in 17 patients was analyzed and used for input in a random walk prostate motion model. The recorded motion was categorized on the basis of whether any transient excursions (i.e., rapid prostate motion in the anterior and superior direction followed by a return) occurred in the trace, and transient motion was separately modeled as a large step in the anterior/superior direction followed by a returning large step. Random walk simulations were conducted with and without added artificial transient motion using either motion data from all observed traces or only traces without transient excursions as model input, respectively. A general estimate of motion was derived with reasonable agreement between simulated and observed traces, especially during the first 5 minutes of the excursion-free simulations. Simulated and observed diffusion coefficients agreed within 0.03, 0.2, and 0.3 mm²/min in the left/right, superior/inferior, and anterior/posterior directions, respectively. A rapid increase in variance at the start of observed traces was difficult to reproduce and seemed to represent the patient's need to adjust before treatment. This could be estimated somewhat using artificial transient motion. Random walk modeling is feasible and recreated the characteristics of the observed prostate motion. Introducing artificial transient motion did not improve the overall agreement, although the first 30 seconds of the traces were better reproduced. The model provides a simple estimate of prostate motion during delivery of radiation therapy.
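
    A minimal sketch of the underlying idea follows: simulate an isotropic random walk sampled at fixed intervals and recover per-axis diffusion coefficients from the step variance. The parameter values are made up and are not the study's estimates.

```python
# Simulate a 3-D random walk and recover D per axis from the step variance,
# using D = var(step) / (2 * dt). Illustrative parameters only.
import numpy as np

rng = np.random.default_rng(7)
dt = 1.0 / 60.0                        # minutes per sample (1-s sampling)
D_true = np.array([0.3, 0.5, 0.6])     # mm^2/min per axis (made-up values)
n_steps = 3000                         # about five 10-minute fractions

steps = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=(n_steps, 3))
trace = np.cumsum(steps, axis=0)       # simulated position trace (for plotting)

D_est = steps.var(axis=0) / (2 * dt)   # recover D from the observed steps
print("true D (mm^2/min):", D_true)
print("est. D (mm^2/min):", np.round(D_est, 3))
```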

  15. Simple Emergent Power Spectra from Complex Inflationary Physics

    NASA Astrophysics Data System (ADS)

    Dias, Mafalda; Frazer, Jonathan; Marsh, M. C. David

    2016-09-01

    We construct ensembles of random scalar potentials for Nf-interacting scalar fields using nonequilibrium random matrix theory, and use these to study the generation of observables during small-field inflation. For Nf = O(few), these heavily featured scalar potentials give rise to power spectra that are highly nonlinear, at odds with observations. For Nf ≫ 1, the superhorizon evolution of the perturbations is generically substantial, yet the power spectra simplify considerably and become more predictive, with most realizations being well approximated by a linear power spectrum. This provides proof of principle that complex inflationary physics can give rise to simple emergent power spectra. We explain how these results can be understood in terms of large Nf universality of random matrix theory.

  16. Simple Emergent Power Spectra from Complex Inflationary Physics.

    PubMed

    Dias, Mafalda; Frazer, Jonathan; Marsh, M C David

    2016-09-30

    We construct ensembles of random scalar potentials for N_{f}-interacting scalar fields using nonequilibrium random matrix theory, and use these to study the generation of observables during small-field inflation. For N_{f}=O(few), these heavily featured scalar potentials give rise to power spectra that are highly nonlinear, at odds with observations. For N_{f}≫1, the superhorizon evolution of the perturbations is generically substantial, yet the power spectra simplify considerably and become more predictive, with most realizations being well approximated by a linear power spectrum. This provides proof of principle that complex inflationary physics can give rise to simple emergent power spectra. We explain how these results can be understood in terms of large N_{f} universality of random matrix theory.

  17. Stochastic Epidemic Outbreaks, or Why Epidemics Behave Like Lasers

    NASA Astrophysics Data System (ADS)

    Schwartz, Ira; Billings, Lora; Bollt, Erik; Carr, Thomas

    2004-03-01

    Many diseases, such as childhood diseases, dengue fever, and West Nile virus, appear to oscillate randomly as a function of seasonal environmental or social changes. Such oscillations appear to have a chaotic bursting character, although it is still uncertain how much is due to random fluctuations. Such bursting in the presence of noise is also observed in driven lasers. In this talk, I will show how noise can excite random outbreaks in simple models of seasonally driven epidemics, as well as in lasers. The models for the population dynamics and the laser will be shown to share the same class of underlying topology, which plays a major role in causing the observed stochastic bursting. New tools for predicting stochastic outbreaks will be presented.

  18. Revisiting sample size: are big trials the answer?

    PubMed

    Lurati Buse, Giovanna A L; Botto, Fernando; Devereaux, P J

    2012-07-18

    The superiority of the evidence generated in randomized controlled trials over observational data is not conditional on randomization alone. Randomized controlled trials require proper design and implementation to provide a reliable effect estimate. Adequate random sequence generation, allocation implementation, analyses based on the intention-to-treat principle, and sufficient power are crucial to the quality of a randomized controlled trial. Power, the probability that a trial will detect a difference when a real difference between treatments exists, depends strongly on sample size. The quality of orthopaedic randomized controlled trials is frequently threatened by a limited sample size. This paper reviews basic concepts and pitfalls in sample-size estimation and focuses on the importance of large trials in the generation of valid evidence.
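
    The dependence of sample size on the targeted difference can be illustrated with the standard two-proportion formula; the rates below are hypothetical and chosen only to show the scale involved.

```python
# Standard per-group sample size for comparing two proportions.
# Rates, alpha, and power below are illustrative, not from the paper.
import math
from scipy.stats import norm

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return math.ceil(z**2 * (p1 * (1 - p1) + p2 * (1 - p2)) / (p1 - p2) ** 2)

# Detecting a 10% vs 15% complication rate already requires large groups:
print(n_per_group(0.10, 0.15))   # about 683 per group
```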

  19. Scaling Limit of Symmetric Random Walk in High-Contrast Periodic Environment

    NASA Astrophysics Data System (ADS)

    Piatnitski, A.; Zhizhina, E.

    2017-11-01

    The paper deals with the asymptotic properties of a symmetric random walk in a high-contrast periodic medium in Z^d, d≥1. From the existing homogenization results it follows that under diffusive scaling the limit behaviour of this random walk need not be Markovian. The goal of this work is to show that if, in addition to the coordinate of the random walk in Z^d, we introduce an extra variable that characterizes the position of the random walk inside the period, then the limit dynamics of this two-component process is Markov. We describe the limit process and observe that the components of the limit process are coupled. We also prove convergence in path space for this random walk.

  20. Random-order fractional bistable system and its stochastic resonance

    NASA Astrophysics Data System (ADS)

    Gao, Shilong; Zhang, Li; Liu, Hui; Kan, Bixia

    2017-01-01

    In this paper, the diffusive motion of Brownian particles in a viscous liquid subject to stochastic fluctuations of the external environment is modeled as a random-order fractional bistable equation, and the stochastic resonance phenomena in this system are investigated as a typical nonlinear dynamic behavior. First, the derivation of the random-order fractional bistable system is given. In particular, the random-power-law memory is discussed in depth to obtain a physical interpretation of the random-order fractional derivative. Second, the stochastic resonance evoked by the random order and an external periodic force is studied mainly by numerical simulation. In particular, frequency-shifting phenomena of the periodic output are observed in the stochastic resonance induced by the random-order excitation. Finally, the stochastic resonance of the system under the combined stochastic excitations of the random order and internal color noise is also investigated.

  1. Nonparametric tests for interaction and group differences in a two-way layout.

    PubMed

    Fisher, A C; Wallenstein, S

    1991-01-01

    Nonparametric tests of group differences and interaction across strata are developed in which the null hypotheses for these tests are expressed as functions of ρ_i = P(X > Y) + (1/2)P(X = Y), where X refers to a random observation from one group and Y refers to a random observation from the other group within stratum i. The estimator r of the parameter ρ is shown to be a useful way to summarize and examine ordinal and continuous data.
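
    A minimal sketch of the within-stratum estimator r, which equals the Mann-Whitney U statistic divided by the number of (X, Y) pairs, is shown below with invented data.

```python
# Estimate r for rho = P(X > Y) + 0.5 * P(X = Y) within one stratum.
# Data are invented for illustration only.
import numpy as np

def rho_hat(x, y):
    x, y = np.asarray(x), np.asarray(y)
    gt = (x[:, None] > y[None, :]).sum()
    eq = (x[:, None] == y[None, :]).sum()
    return (gt + 0.5 * eq) / (x.size * y.size)

x = [3, 5, 5, 7, 9]   # group 1 observations in stratum i
y = [2, 4, 5, 6]      # group 2 observations in stratum i
print(rho_hat(x, y))  # 0.5 would indicate no tendency for X to exceed Y
```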

  2. Nanostructuring on zinc phthalocyanine thin films for single-junction organic solar cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chaudhary, Dhirendra K.; Kumar, Lokendra, E-mail: lokendrakr@allduniv.ac.in

    2016-05-23

    Vertically aligned and randomly oriented crystalline molecular nanorods of the organic semiconductor zinc phthalocyanine (ZnPc) have been grown on ITO-coated glass substrates using a solvent volatilization method. Distinct changes in surface morphology were observed under different solvent treatments: vertically aligned ZnPc nanorods were observed in films treated with acetone, whereas randomly oriented nanorods were observed in films treated with chloroform. X-ray diffraction (XRD), scanning electron microscopy (SEM), and atomic force microscopy (AFM) were used to characterize the nanostructures. The optical properties of the nanorods were investigated by UV-Vis absorption spectroscopy.

  3. Common Randomness Principles of Secrecy

    ERIC Educational Resources Information Center

    Tyagi, Himanshu

    2013-01-01

    This dissertation concerns the secure processing of distributed data by multiple terminals, using interactive public communication among themselves, in order to accomplish a given computational task. In the setting of a probabilistic multiterminal source model in which several terminals observe correlated random signals, we analyze secure…

  4. A randomized phase II chemoprevention trial of 13-CIS retinoic acid with or without alpha tocopherol or observation in subjects at high risk for lung cancer.

    PubMed

    Kelly, Karen; Kittelson, John; Franklin, Wilbur A; Kennedy, Timothy C; Klein, Catherine E; Keith, Robert L; Dempsey, Edward C; Lewis, Marina; Jackson, Mary K; Hirsch, Fred R; Bunn, Paul A; Miller, York E

    2009-05-01

    No chemoprevention strategies have been proven effective for lung cancer. We evaluated the effect of 13-cis retinoic acid (13-cis RA), with or without alpha tocopherol, as a lung cancer chemoprevention agent in a phase II randomized controlled clinical trial of adult subjects at high risk for lung cancer as defined by the presence of sputum atypia, history of smoking, and airflow obstruction, or a prior surgically cured non-small cell lung cancer (disease free, >3 years). Subjects were randomly assigned to receive either 13-cis RA, 13-cis RA plus alpha tocopherol (13-cis RA/alpha toco), or observation for 12 months. Outcome measures were derived from histologic evaluation of bronchial biopsy specimens obtained by bronchoscopy at baseline and follow-up. The primary outcome measure was treatment "failure," defined as histologic progression (any increase in the maximum histologic score) or failure to return for follow-up bronchoscopy. Seventy-five subjects were randomized (27/22/26 to observation/13-cis RA/13-cis RA/alpha toco); 59 completed the trial; 55 had both baseline and follow-up bronchoscopy. The risk of treatment failure was 55.6% (15 of 27) and 50% (24 of 48) in the observation and combined (13-cis RA plus 13-cis RA/alpha toco) treatment arms, respectively (odds ratio adjusted for baseline histology, 0.97; 95% confidence interval, 0.36-2.66; P = 0.95). Among subjects with complete histology data, the maximum histology score increased by 0.37 units in the observation arm and by 0.03 units in the treated arms (difference adjusted for baseline, -0.18; 95% confidence interval, -1.16 to 0.81; P = 0.72). Similar (nonsignificant) results were observed for treatment effects on endobronchial proliferation as assessed by Ki-67 immunolabeling. Twelve-month treatment with 13-cis RA produced nonsignificant changes in bronchial histology, consistent with results in other trials. Agents advancing to phase III randomized trials should produce greater histologic changes. The addition of alpha tocopherol did not affect toxicity.

  5. Antioxidant supplements and mortality.

    PubMed

    Bjelakovic, Goran; Nikolova, Dimitrinka; Gluud, Christian

    2014-01-01

    Oxidative damage to cells and tissues is considered to be involved in the aging process and in the development of chronic diseases in humans, including cancer and cardiovascular diseases, the leading causes of death in high-income countries. This has stimulated interest in the preventive potential of antioxidant supplements. Today, more than one half of adults in high-income countries ingest antioxidant supplements hoping to improve their health, oppose unhealthy behaviors, and counteract the ravages of aging. Older observational studies and some randomized clinical trials with high risks of systematic errors ('bias') have suggested that antioxidant supplements may improve health and prolong life. A number of randomized clinical trials with adequate methodologies observed neutral or negative results of antioxidant supplements. Recently completed large randomized clinical trials with low risks of bias and systematic reviews of randomized clinical trials taking systematic errors ('bias') and risks of random errors ('play of chance') into account have shown that antioxidant supplements do not seem to prevent cancer, cardiovascular diseases, or death. Moreover, beta-carotene, vitamin A, and vitamin E may increase mortality. Some recent large observational studies now support these findings. According to recent dietary guidelines, there is no evidence to support the use of antioxidant supplements in the primary prevention of chronic diseases or mortality. Antioxidant supplements do not possess preventive effects and may be harmful, with unwanted consequences for our health, especially in well-nourished populations. The optimal source of antioxidants seems to come from our diet, not from antioxidant supplements in pills or tablets.

  6. Using Behavioral Analytics to Increase Exercise: A Randomized N-of-1 Study.

    PubMed

    Yoon, Sunmoo; Schwartz, Joseph E; Burg, Matthew M; Kronish, Ian M; Alcantara, Carmela; Julian, Jacob; Parsons, Faith; Davidson, Karina W; Diaz, Keith M

    2018-04-01

    This intervention study used mobile technologies to investigate whether those randomized to receive a personalized "activity fingerprint" (i.e., a one-time tailored message about personal predictors of exercise developed from 6 months of observational data) increased their physical activity levels relative to those not receiving the fingerprint. The design was a 12-month randomized intervention study. From 2014 to 2015, 79 intermittent exercisers had their daily physical activity assessed by accelerometry (Fitbit Flex), and daily stress experience, a potential predictor of exercise behavior, was assessed by smartphone. Data collected during the first 6 months of observation were used to develop a person-specific "activity fingerprint" (i.e., N-of-1) that was subsequently sent via email on a single occasion to randomized participants. Pre-post changes in the percentage of days exercised were analyzed within and between control and intervention groups. The control group significantly decreased their proportion of days exercised (10.5% decrease, p<0.0001) following randomization. By contrast, the intervention group showed a nonsignificant decrease in the proportion of days exercised (4.0% decrease, p=0.14). Relative to the decrease observed in the control group, receipt of the activity fingerprint significantly increased the likelihood of exercising in the intervention group (6.5%, p=0.04). This N-of-1 intervention study demonstrates that a one-time brief message conveying personalized exercise predictors had a beneficial effect on exercise behavior among urban adults. Copyright © 2018 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  7. Study design and "evidence" in patient-oriented research.

    PubMed

    Concato, John

    2013-06-01

    Individual studies in patient-oriented research, whether described as "comparative effectiveness" or using other terms, are based on underlying methodological designs. A simple taxonomy of study designs includes randomized controlled trials on the one hand, and observational studies (such as case series, cohort studies, and case-control studies) on the other. A rigid hierarchy of these design types is a fairly recent phenomenon, promoted as a tenet of "evidence-based medicine," with randomized controlled trials receiving gold-standard status in terms of producing valid results. Although randomized trials have many strengths, and contribute substantially to the evidence base in clinical care, making presumptions about the quality of a study based solely on category of research design is unscientific. Both the limitations of randomized trials as well as the strengths of observational studies tend to be overlooked when a priori assumptions are made. This essay presents an argument in support of a more balanced approach to evaluating evidence, and discusses representative examples from the general medical as well as pulmonary and critical care literature. The simultaneous consideration of validity (whether results are correct "internally") and generalizability (how well results apply to "external" populations) is warranted in assessing whether a study's results are accurate for patients likely to receive the intervention-examining the intersection of clinical and methodological issues in what can be called a medicine-based evidence approach. Examination of cause-effect associations in patient-oriented research should recognize both the strengths and limitations of randomized trials as well as observational studies.

  8. A random Q-switched fiber laser

    PubMed Central

    Tang, Yulong; Xu, Jianqiu

    2015-01-01

    Extensive studies have been performed on random lasers in which multiple-scattering feedback is used to generate coherent emission. Q-switching and mode-locking are well-known routes for achieving high peak power output in conventional lasers. However, in random lasers, the ubiquitous random cavities that are formed by multiple scattering inhibit energy storage, making Q-switching impossible. In this paper, widespread Rayleigh scattering arising from the intrinsic micro-scale refractive-index irregularities of fiber cores is used to form random cavities along the fiber. The Q-factor of the cavity is rapidly increased by stimulated Brillouin scattering just after the spontaneous emission is enhanced by random cavity resonances, resulting in random Q-switched pulses with high brightness and high peak power. This report is the first observation of high-brightness random Q-switched laser emission and is expected to stimulate new areas of scientific research and applications, including encryption, remote three-dimensional random imaging and the simulation of stellar lasing. PMID:25797520

  9. Effectiveness of adjuvant radiotherapy in patients with oropharyngeal and floor of mouth squamous cell carcinoma and concomitant histological verification of singular ipsilateral cervical lymph node metastasis (pN1-state) - A prospective multicenter randomized controlled clinical trial using a comprehensive cohort design

    PubMed Central

    2009-01-01

    Background Modern radiotherapy plays an important role in the therapy of advanced head and neck carcinomas. However, no clinical studies have been published addressing the effectiveness of postoperative radiotherapy in patients with a small tumor (pT1, pT2) and concomitant ipsilateral metastasis of a single lymph node (pN1), which would provide a basis for a general treatment recommendation. Methods/Design The present study is a non-blinded, prospective, multi-center randomized controlled trial (RCT). As the primary clinical endpoint, overall survival in patients receiving postoperative radiation therapy is compared with that of patients receiving no adjuvant therapy following surgery with curative intent. The aim of the study is to enroll 560 adult males and females for 1:1 randomization to one of the two treatment arms (irradiation/no irradiation). Since patients with a small tumor (T1/T2) but singular lymph node metastasis are rare and the number of patients consenting to randomization cannot be predicted in advance, all patients declining randomization will be treated according to their preference and enrolled in a prospective observational study (comprehensive cohort design) after giving informed consent. This observational part of the trial will be performed with maximum consistency with the treatment and observation protocol of the RCT. Because the impact of patient preference for a particular treatment option cannot be quantified, the parallel design of RCT and observational study may provide a maximum of evidence and efficacy for the evaluation of treatment outcome. Secondary clinical endpoints are as follows: incidence of and time to tumor relapse (locoregional relapse, lymph node involvement, and distant metastatic spread), quality of life as reported by the EORTC QLQ-C30 (with the H&N 35 module), and time from operation to orofacial rehabilitation. All tumors represent a homogeneous clinical state, and therefore additional investigation of protein expression levels within the resection specimens may serve to establish surrogate parameters of patient outcome. Conclusion The inherent challenges of a rare clinical condition (pN1) and two substantially different therapy arms would limit the practicality of a classical randomized study. The concept of a Comprehensive Cohort Design combines the strengths of a randomized study with the option of careful data interpretation within an observational study. Trial registration ClinicalTrials.gov: NCT00964977 PMID:20028566

  10. Causal inference from observational data.

    PubMed

    Listl, Stefan; Jürges, Hendrik; Watt, Richard G

    2016-10-01

    Randomized controlled trials have long been considered the 'gold standard' for causal inference in clinical research. In the absence of randomized experiments, identification of reliable intervention points to improve oral health is often perceived as a challenge. But other fields of science, such as social science, have always been challenged by ethical constraints to conducting randomized controlled trials. Methods have been established to make causal inference using observational data, and these methods are becoming increasingly relevant in clinical medicine, health policy and public health research. This study provides an overview of state-of-the-art methods specifically designed for causal inference in observational data, including difference-in-differences (DiD) analyses, instrumental variables (IV), regression discontinuity designs (RDD) and fixed-effects panel data analysis. The described methods may be particularly useful in dental research, not least because of the increasing availability of routinely collected administrative data and electronic health records ('big data'). © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
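
    As a minimal illustration of one of the methods listed above, the sketch below computes a difference-in-differences estimate from simulated two-group, two-period data; the outcome, group labels, and effect size are hypothetical.

```python
# Minimal difference-in-differences (DiD) sketch on simulated data:
# DiD = (treated_after - treated_before) - (control_after - control_before).
# All numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(3)
n = 2000
treated = rng.binomial(1, 0.5, n)
after = rng.binomial(1, 0.5, n)
true_effect = -0.8                       # e.g. change in a clinical score

# Outcome: group baseline + common time trend + effect only for treated-after.
y = (1.0 * treated + 0.5 * after + true_effect * treated * after
     + rng.normal(0, 1, n))

did = ((y[(treated == 1) & (after == 1)].mean()
        - y[(treated == 1) & (after == 0)].mean())
       - (y[(treated == 0) & (after == 1)].mean()
          - y[(treated == 0) & (after == 0)].mean()))
print(f"DiD estimate: {did:+.2f} (true effect {true_effect:+.2f})")
```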

  11. A randomized trial on folic acid supplementation and risk of recurrent colorectal adenoma

    USDA-ARS?s Scientific Manuscript database

    Background: Evidence from observational studies suggests that inadequate folate status enhances colorectal carcinogenesis, but results from some randomized trials do not support this hypothesis. Objective: To assess the effect of folic acid supplementation on recurrent colorectal adenoma, we conduc...

  12. Brownian Motion.

    ERIC Educational Resources Information Center

    Lavenda, Bernard H.

    1985-01-01

    Explains the phenomenon of Brownian motion, which serves as a mathematical model for random processes. Topics addressed include kinetic theory, Einstein's theory, particle displacement, and others. Points out that observations of the random course of a particle suspended in fluid led to the first accurate measurement of atomic mass. (DH)
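
    The Einstein result alluded to above can be stated compactly; this is the standard textbook form, not quoted from the article:

```latex
% Mean-squared displacement (one dimension) and the Stokes-Einstein relation
\langle x^{2}(t) \rangle = 2 D t,
\qquad
D = \frac{k_{B} T}{6 \pi \eta a}.
```

    Measuring D for particles of known radius a in a fluid of viscosity η at temperature T yields k_B, and hence Avogadro's number and atomic masses, which is the measurement the summary refers to.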

  13. Dairy consumption, systolic blood pressure, and risk of hypertension: Mendelian randomization study

    USDA-ARS?s Scientific Manuscript database

    Objective: To examine whether previous observed inverse associations of dairy intake with systolic blood pressure and risk of hypertension were causal. Design: Mendelian randomization study using the single nucleotide polymorphism rs4988235 related to lactase persistence as an instrumental variable...
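
    With a single genetic instrument such as rs4988235, the Mendelian randomization effect estimate is conventionally the Wald ratio; the standard form is shown below for reference (the truncated abstract does not state which estimator was used):

```latex
% Wald-ratio estimator with a single genetic instrument Z
\hat{\beta}_{\mathrm{IV}}
  \;=\; \frac{\hat{\beta}_{Z \rightarrow \mathrm{outcome}}}
             {\hat{\beta}_{Z \rightarrow \mathrm{exposure}}} .
```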

  14. The Locomotion of Mouse Fibroblasts in Tissue Culture

    PubMed Central

    Gail, Mitchell H.; Boone, Charles W.

    1970-01-01

    Time-lapse cinematography was used to investigate the motion of mouse fibroblasts in tissue culture. Observations over successive short time intervals revealed a tendency for the cells to persist in their direction of motion from one 2.5 hr time interval to the next. Over 5.0-hr time intervals, however, the direction of motion appeared random. This fact suggested that D, the diffusion constant of a random walk model, might serve to characterize cellular motility if suitably long observation times were used. We therefore investigated the effect of “persistence” on the pure random walk model, and we found theoretically and confirmed experimentally that the motility of a persisting cell could indeed be characterized by an augmented diffusion constant, D*. A method for determining confidence limits on D* was also developed. Thus a random walk model, modified to comprehend the persistence effect, was found to describe the motion of fibroblasts in tissue culture and to provide a numerical measure of cellular motility. PMID:5531614

  15. Professional opinion concerning the effectiveness of bracing relative to observation in adolescent idiopathic scoliosis.

    PubMed

    Dolan, Lori A; Donnelly, Melanie J; Spratt, Kevin F; Weinstein, Stuart L

    2007-01-01

    To determine if community equipoise exists concerning the effectiveness of bracing in adolescent idiopathic scoliosis. Bracing is the standard of care for adolescent idiopathic scoliosis despite the lack of strong research evidence concerning its effectiveness. Thus, some researchers support the idea of a randomized trial, whereas others think that randomization in the face of a standard of care would be unethical. A random sample of Scoliosis Research Society and Pediatric Orthopaedic Society of North America members was asked to consider 12 clinical profiles and to give their opinion concerning the radiographic outcomes after observation and bracing. An expert panel was created from the respondents. They expressed a wide array of opinions concerning the percentage of patients within each scenario who would benefit from bracing. Agreement was noted concerning the risk due to bracing for post-menarchal patients only. This study found a high degree of variability in opinion among clinicians concerning the effectiveness of bracing, suggesting that a randomized trial of bracing would be ethical.

  16. Local randomness: Examples and application

    NASA Astrophysics Data System (ADS)

    Fu, Honghao; Miller, Carl A.

    2018-03-01

    When two players achieve a superclassical score at a nonlocal game, their outputs must contain intrinsic randomness. This fact has many useful implications for quantum cryptography. Recently it has been observed [C. Miller and Y. Shi, Quantum Inf. Computat. 17, 0595 (2017)] that such scores also imply the existence of local randomness—that is, randomness known to one player but not to the other. This has potential implications for cryptographic tasks between two cooperating but mistrustful players. In the current paper we bring this notion toward practical realization, by offering near-optimal bounds on local randomness for the CHSH game, and also proving the security of a cryptographic application of local randomness (single-bit certified deletion).

  17. On the efficiency of a randomized mirror descent algorithm in online optimization problems

    NASA Astrophysics Data System (ADS)

    Gasnikov, A. V.; Nesterov, Yu. E.; Spokoiny, V. G.

    2015-04-01

    A randomized online version of the mirror descent method is proposed. It differs from the existing versions by the randomization method. Randomization is performed at the stage of the projection of a subgradient of the function being optimized onto the unit simplex rather than at the stage of the computation of a subgradient, which is common practice. As a result, a componentwise subgradient descent with a randomly chosen component is obtained, which admits an online interpretation. This observation, for example, has made it possible to uniformly interpret results on weighting expert decisions and propose the most efficient method for searching for an equilibrium in a zero-sum two-person matrix game with sparse matrix.
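
    A minimal sketch of the idea, under the assumption that entropic (multiplicative-weights) mirror descent on the unit simplex is used and that the randomization picks a single coordinate of the subgradient, rescaled to keep the update unbiased. The toy linear objective and step size are illustrative, not the authors' exact scheme.

```python
import numpy as np

rng = np.random.default_rng(1)

def randomized_mirror_descent(grad_fn, dim, n_iters, step=0.05):
    """Entropic mirror descent on the unit simplex where, at each step,
    only one randomly chosen component of the subgradient is used
    (scaled by dim so the coordinate estimate stays unbiased)."""
    x = np.full(dim, 1.0 / dim)
    avg = np.zeros(dim)
    for _ in range(n_iters):
        g = grad_fn(x)
        i = rng.integers(dim)                  # random coordinate
        g_hat = np.zeros(dim)
        g_hat[i] = dim * g[i]                  # unbiased estimate of g
        x = x * np.exp(-step * g_hat)          # multiplicative update
        x /= x.sum()                           # re-normalize onto the simplex
        avg += x
    return avg / n_iters

# Toy objective: f(x) = <c, x> over the simplex, minimized at the vertex
# with the smallest cost (index 1 here).
c = np.array([0.9, 0.2, 0.5, 0.7])
x_hat = randomized_mirror_descent(lambda x: c, dim=4, n_iters=5000)
print(np.round(x_hat, 3))
```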

  18. Multicenter Randomized Controlled Trial on Duration of Therapy for Thrombosis in Children and Young Adults (Kids-DOTT): Pilot/Feasibility Phase Findings

    PubMed Central

    Goldenberg, N.A.; Abshire, T.; Blatchford, P.J.; Fenton, L.Z.; Halperin, J.L.; Hiatt, W.R.; Kessler, C.M.; Kittelson, J.M.; Manco-Johnson, M.J.; Spyropoulos, A.C.; Steg, P.G.; Stence, N.V.; Turpie, A.G.G.; Schulman, S.

    2015-01-01

    BACKGROUND Randomized controlled trials (RCTs) in pediatric venous thromboembolism (VTE) treatment have been challenged by unsubstantiated design assumptions and/or poor accrual. Pilot/feasibility (P/F) studies are critical to future RCT success. METHODS Kids-DOTT is a multicenter RCT investigating non-inferiority of a 6-week (shortened) vs. 3-month (conventional) duration of anticoagulation in patients <21 years old with provoked venous thrombosis. Primary efficacy and safety endpoints are symptomatic recurrent VTE at 1 year and anticoagulant-related, clinically-relevant bleeding. In the P/F phase, 100 participants were enrolled in an open, blinded endpoint, parallel-cohort RCT design. RESULTS No eligibility violations or randomization errors occurred. Of enrolled patients, 69% were randomized, 3% missed the randomization window, and 28% were followed in pre-specified observational cohorts for completely occlusive thrombosis or persistent antiphospholipid antibodies. Retention at 1 year was 82%. Inter-observer agreement between local vs. blinded central determination of venous occlusion by imaging at 6 weeks post-diagnosis was strong (κ-statistic=0.75; 95% confidence interval [CI] 0.48–1.0). Primary efficacy and safety event rates were 3.3% (95% CI 0.3–11.5%) and 1.4% (0.03–7.4%). CONCLUSIONS The P/F phase of Kids-DOTT has demonstrated validity of vascular imaging findings of occlusion as a randomization criterion, and defined randomization, retention, and endpoint rates to inform the fully-powered RCT. PMID:26118944

  19. Encoding plaintext by Fourier transform hologram in double random phase encoding using fingerprint keys

    NASA Astrophysics Data System (ADS)

    Takeda, Masafumi; Nakano, Kazuya; Suzuki, Hiroyuki; Yamaguchi, Masahiro

    2012-09-01

    It has been shown that biometric information can be used as a cipher key for binary data encryption by applying double random phase encoding. In such methods, binary data are encoded in a bit pattern image, and the decrypted image becomes a plain image when the key is genuine; otherwise, decrypted images become random images. In some cases, images decrypted by imposters may not be fully random, such that the blurred bit pattern can be partially observed. In this paper, we propose a novel bit coding method based on a Fourier transform hologram, which makes images decrypted by imposters more random. Computer experiments confirm that the method increases the randomness of images decrypted by imposters while keeping the false rejection rate as low as in the conventional method.
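
    For orientation, classic double random phase encoding can be sketched in a few lines with NumPy FFTs. This is not the paper's Fourier-hologram bit coding, only the baseline DRPE scheme it builds on, with a toy bit-pattern image and hypothetical key masks k1 and k2.

```python
import numpy as np

rng = np.random.default_rng(2)

def drpe_encrypt(img, phase1, phase2):
    """Double random phase encoding: a random phase mask in the input
    plane, a Fourier transform, a second mask in the Fourier plane,
    and an inverse transform back to the output plane."""
    field = img * np.exp(1j * phase1)
    spectrum = np.fft.fft2(field) * np.exp(1j * phase2)
    return np.fft.ifft2(spectrum)

def drpe_decrypt(cipher, phase1, phase2):
    """Undo the two phase masks in reverse order and keep the magnitude."""
    spectrum = np.fft.fft2(cipher) * np.exp(-1j * phase2)
    return np.abs(np.fft.ifft2(spectrum) * np.exp(-1j * phase1))

img = np.zeros((64, 64)); img[16:48, 16:48] = 1.0            # toy bit-pattern image
k1, k2 = (rng.uniform(0, 2 * np.pi, img.shape) for _ in range(2))
cipher = drpe_encrypt(img, k1, k2)

good = drpe_decrypt(cipher, k1, k2)                           # genuine keys
bad = drpe_decrypt(cipher, k1, rng.uniform(0, 2 * np.pi, img.shape))  # impostor key
print("genuine-key error:", np.abs(good - img).mean())
print("impostor-key error:", np.abs(bad - img).mean())
```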

  20. Continuous-variable phase estimation with unitary and random linear disturbance

    NASA Astrophysics Data System (ADS)

    Delgado de Souza, Douglas; Genoni, Marco G.; Kim, M. S.

    2014-10-01

    We address the problem of continuous-variable quantum phase estimation in the presence of linear disturbance at the Hamiltonian level by means of Gaussian probe states. In particular we discuss both unitary and random disturbance by considering the parameter which characterizes the unwanted linear term present in the Hamiltonian as fixed (unitary disturbance) or random with a given probability distribution (random disturbance). We derive the optimal input Gaussian states at fixed energy, maximizing the quantum Fisher information over the squeezing angle and the squeezing energy fraction, and we discuss the scaling of the quantum Fisher information in terms of the output number of photons, nout. We observe that, in the case of unitary disturbance, the optimal state is a squeezed vacuum state and the quadratic scaling is conserved. As regards the random disturbance, we observe that the optimal squeezing fraction may not be equal to one and, for any nonzero value of the noise parameter, the quantum Fisher information scales linearly with the average number of photons. Finally, we discuss the performance of homodyne measurement by comparing the achievable precision with the ultimate limit imposed by the quantum Cramér-Rao bound.
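
    For reference, the precision bound invoked here is the quantum Cramér-Rao bound; a schematic statement consistent with the abstract's scaling claims (the constants c1 and c2 are unspecified placeholders, and M counts repeated probes) is:

```latex
\[
  \operatorname{Var}(\hat{\varphi}) \;\ge\; \frac{1}{M\,F_Q[\varrho_\varphi]},
  \qquad
  F_Q \;\sim\;
  \begin{cases}
    c_1\, n_{\mathrm{out}}^{2}, & \text{unitary disturbance (squeezed vacuum probe)}\\
    c_2\, n_{\mathrm{out}}, & \text{random disturbance}
  \end{cases}
\]
```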

  1. Simvastatin as an Adjunct to Conventional Therapy of Non-infectious Uveitis: A Randomized, Open-Label Pilot Study.

    PubMed

    Shirinsky, Ivan V; Biryukova, Anastasia A; Shirinsky, Valery S

    2017-12-01

    Statins have been shown to reduce ocular inflammation in animal models of uveitis and to prevent development of uveitis in observational studies. There have been no experimental human studies evaluating statins' efficacy and safety in uveitis. In this study, we aimed to investigate efficacy and safety of simvastatin in patients with uveitis. For this single-center, open-label, randomized study, we enrolled patients with acute non-infectious uveitis. The patients were randomized to receive 40 mg simvastatin per day for 2 months in addition to conventional treatment or conventional treatment alone. The studied outcomes were the rate of steroid-sparing control of ocular inflammation, measures of ocular inflammation, intraocular pressure, and visual acuity. Fifty patients were enrolled in the study. Twenty-five patients were randomly assigned to receive simvastatin with conventional treatment and 25 to conventional treatment alone. Simvastatin was associated with significantly higher rates of steroid-sparing ocular inflammation control, decrease in anterior chamber inflammation, and improvement in visual acuity. The treatment was well tolerated; no serious adverse effects were observed. Our findings suggest that statins may have therapeutic potential in uveitis. These results need to be confirmed in double-blind, randomized, controlled studies.

  2. Auto Regressive Moving Average (ARMA) Modeling Method for Gyro Random Noise Using a Robust Kalman Filter

    PubMed Central

    Huang, Lei

    2015-01-01

    To solve the problem in which the conventional ARMA modeling methods for gyro random noise require a large number of samples and converge slowly, an ARMA modeling method using a robust Kalman filter is developed. The ARMA model parameters are employed as state arguments. Unknown time-varying estimators of the observation noise are used to obtain the estimated mean and variance of the observation noise. Using robust Kalman filtering, the ARMA model parameters are estimated accurately. The developed ARMA modeling method has the advantages of rapid convergence and high accuracy. Thus, the required sample size is reduced. It can be applied to modeling applications for gyro random noise in which a fast and accurate ARMA modeling method is required. PMID:26437409
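
    A minimal sketch of the general idea, treating (for simplicity) the coefficients of an AR(2) noise model as the constant state of a standard Kalman filter. The robust, time-varying estimation of the observation-noise statistics described in the abstract is not reproduced, and the noise variance is assumed known here.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "gyro noise" from an AR(2) process y_t = a1*y_{t-1} + a2*y_{t-2} + w_t.
a_true = np.array([0.6, -0.2])
y = np.zeros(500)
for t in range(2, len(y)):
    y[t] = a_true @ np.array([y[t - 1], y[t - 2]]) + rng.normal(0.0, 0.1)

# Kalman filter with the AR coefficients as a constant state:
#   state:        theta_t = theta_{t-1}
#   observation:  y_t = [y_{t-1}, y_{t-2}] @ theta_t + v_t
theta = np.zeros(2)          # state estimate (the AR coefficients)
P = np.eye(2)                # state covariance
R = 0.1 ** 2                 # observation-noise variance (assumed known here)
for t in range(2, len(y)):
    H = np.array([y[t - 1], y[t - 2]])      # observation row
    S = H @ P @ H + R                       # innovation variance
    K = P @ H / S                           # Kalman gain
    theta = theta + K * (y[t] - H @ theta)  # measurement update
    P = P - np.outer(K, H @ P)
print("estimated AR coefficients:", np.round(theta, 3))
```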

  3. Continuous-Time Classical and Quantum Random Walk on Direct Product of Cayley Graphs

    NASA Astrophysics Data System (ADS)

    Salimi, S.; Jafarizadeh, M. A.

    2009-06-01

    In this paper we define the direct product of graphs and give a recipe for obtaining the probability of observing the particle on the vertices in continuous-time classical and quantum random walks. In the recipe, the probability of observing the particle on a direct product of graphs is obtained by multiplying the probabilities on the corresponding sub-graphs, which is useful for determining the probability of walks on complicated graphs. Using this method, we calculate the probability of continuous-time classical and quantum random walks on many finite direct-product Cayley graphs (complete cycle, complete Kn, charter and n-cube). Also, we observe that for the classical walk the stationary uniform distribution is reached as t → ∞, whereas for the quantum walk this is not always the case.
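
    A small numerical sketch of continuous-time walks on a graph, assuming the quantum walk is generated by the adjacency matrix and the classical walk by the graph Laplacian (conventions vary); the 4-cycle serves as a toy Cayley graph.

```python
import numpy as np
from scipy.linalg import expm

def ctqw_probabilities(adj, start, t):
    """Continuous-time quantum walk: amplitudes evolve under
    U(t) = exp(-i*A*t); probabilities are squared magnitudes."""
    U = expm(-1j * t * adj)
    return np.abs(U[:, start]) ** 2

def ctrw_probabilities(adj, start, t):
    """Continuous-time classical random walk driven by the graph Laplacian."""
    L = np.diag(adj.sum(axis=1)) - adj
    return expm(-t * L)[:, start]

# 4-cycle (complete cycle C4) as a small example graph.
C4 = np.array([[0, 1, 0, 1],
               [1, 0, 1, 0],
               [0, 1, 0, 1],
               [1, 0, 1, 0]], dtype=float)
print("quantum:  ", np.round(ctqw_probabilities(C4, 0, 1.0), 3))
print("classical:", np.round(ctrw_probabilities(C4, 0, 1.0), 3))
```

    If the direct product is taken in the Cartesian sense, the generator of the product graph splits as A1 ⊗ I + I ⊗ A2, so the evolution operator factorizes into a tensor product and the vertex probabilities multiply, which is the kind of factorization the recipe above relies on.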

  4. Observer-based sliding mode control of Markov jump systems with random sensor delays and partly unknown transition rates

    NASA Astrophysics Data System (ADS)

    Yao, Deyin; Lu, Renquan; Xu, Yong; Ren, Hongru

    2017-10-01

    In this paper, the sliding mode control problem of Markov jump systems (MJSs) with unmeasured state, partly unknown transition rates and random sensor delays is investigated. In practical engineering control, exact information on the transition rates is hard to obtain, and the measurement channel is assumed to be subject to random sensor delays. A Luenberger observer is designed to estimate the unmeasured system state, and an integral sliding mode surface is constructed to ensure the exponential stability of the MJS. An estimator-based sliding mode controller is proposed to drive the system state onto the sliding mode surface and render the sliding mode dynamics exponentially mean-square stable with an H∞ performance index. Finally, simulation results are provided to illustrate the effectiveness of the proposed results.
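
    A minimal discrete-time sketch of the observer idea only (no Markov jumps, sensor delays, or sliding mode surface), with an illustrative plant and a hand-picked gain L chosen so that A - LC is stable.

```python
import numpy as np

# Plant x_{k+1} = A x_k + B u_k, y_k = C x_k, with a Luenberger observer
# xhat_{k+1} = A xhat_k + B u_k + L (y_k - C xhat_k).
A = np.array([[1.0, 0.1],
              [0.0, 0.9]])
B = np.array([[0.0],
              [0.1]])
C = np.array([[1.0, 0.0]])
L = np.array([[0.8],
              [0.5]])            # observer gain (A - L C has eigenvalues inside the unit circle)

x = np.array([[1.0], [0.0]])     # true state (unknown to the controller)
xhat = np.zeros((2, 1))          # observer estimate
for k in range(50):
    u = np.array([[-0.5]]) @ (C @ xhat)        # simple output feedback from the estimate
    y = C @ x                                   # measured output
    xhat = A @ xhat + B @ u + L @ (y - C @ xhat)
    x = A @ x + B @ u
print("final estimation error:", np.linalg.norm(x - xhat))
```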

  5. A Randomized Controlled Study of Art Observation Training to Improve Medical Student Ophthalmology Skills.

    PubMed

    Gurwin, Jaclyn; Revere, Karen E; Niepold, Suzannah; Bassett, Barbara; Mitchell, Rebecca; Davidson, Stephanie; DeLisser, Horace; Binenbaum, Gil

    2018-01-01

    Observation and description are critical to the practice of medicine, and to ophthalmology in particular. However, medical education does not provide explicit training in these areas, and medical students are often criticized for deficiencies in these skills. We sought to evaluate the effects of formal observation training in the visual arts on the general and ophthalmologic observational skills of medical students. Randomized, single-masked, controlled trial. Thirty-six first-year medical students, randomized 1:1 into art-training and control groups. Students in the art-training group were taught by professional art educators at the Philadelphia Museum of Art, during 6 custom-designed, 1.5-hour art observation sessions over a 3-month period. All subjects completed pre- and posttesting, in which they described works of art, retinal pathology images, and external photographs of eye diseases. Grading of written descriptions for observational and descriptive abilities by reviewers using an a priori rubric and masked to group assignment and pretesting/posttesting status. Observational skills, as measured by description testing, improved significantly in the training group (mean change +19.1 points) compared with the control group (mean change -13.5 points), P = 0.001. There were significant improvements in the training vs. control group for each of the test subscores. In a poststudy questionnaire, students reported applying the skills they learned in the museum in clinically meaningful ways at medical school. Art observation training for first-year medical students can improve clinical ophthalmology observational skills. Principles from the field of visual arts, which is reputed to excel in teaching observation and descriptive abilities, can be successfully applied to medical training. Further studies can examine the impact of such training on clinical care. Copyright © 2017 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  6. PROMOTING SUPPORTIVE PARENTING IN NEW MOTHERS WITH SUBSTANCE-USE PROBLEMS: A PILOT RANDOMIZED TRIAL OF RESIDENTIAL TREATMENT PLUS AN ATTACHMENT-BASED PARENTING PROGRAM

    PubMed Central

    BERLIN, LISA J.; SHANAHAN, MEGHAN; CARMODY, KAREN APPLEYARD

    2015-01-01

    This pilot randomized trial tested the feasibility and efficacy of supplementing residential substance-abuse treatment for new mothers with a brief, yet rigorous, attachment-based parenting program. Twenty-one predominantly (86%) White mothers and their infants living together in residential substance-abuse treatment were randomly assigned to the program (n = 11) or control (n = 10) group. Program mothers received 10 home-based sessions of Dozier’s Attachment and Biobehavioral Catch-up (ABC) intervention. Postintervention observations revealed more supportive parenting behaviors among the randomly assigned ABC mothers. PMID:25424409

  7. Bias, Confounding, and Interaction: Lions and Tigers, and Bears, Oh My!

    PubMed

    Vetter, Thomas R; Mascha, Edward J

    2017-09-01

    Epidemiologists seek to make a valid inference about the causal effect between an exposure and a disease in a specific population, using representative sample data from a specific population. Clinical researchers likewise seek to make a valid inference about the association between an intervention and outcome(s) in a specific population, based upon their randomly collected, representative sample data. Both do so by using the available data about the sample variable to make a valid estimate about its corresponding or underlying, but unknown population parameter. Random error in an experiment can be due to the natural, periodic fluctuation or variation in the accuracy or precision of virtually any data sampling technique or health measurement tool or scale. In a clinical research study, random error can be due not only to innate human variability but also to pure chance. Systematic error in an experiment arises from an innate flaw in the data sampling technique or measurement instrument. In the clinical research setting, systematic error is more commonly referred to as systematic bias. The most commonly encountered types of bias in anesthesia, perioperative, critical care, and pain medicine research include recall bias, observational bias (Hawthorne effect), attrition bias, misclassification or informational bias, and selection bias. A confounding variable is a factor associated with both the exposure of interest and the outcome of interest. A confounding variable (confounding factor or confounder) is a variable that correlates (positively or negatively) with both the exposure and outcome. Confounding is typically not an issue in a randomized trial because the randomized groups are sufficiently balanced on all potential confounding variables, both observed and nonobserved. However, confounding can be a major problem with any observational (nonrandomized) study. Ignoring confounding in an observational study will often result in a "distorted" or incorrect estimate of the association or treatment effect. Interaction among variables, also known as effect modification, exists when the effect of 1 explanatory variable on the outcome depends on the particular level or value of another explanatory variable. Bias and confounding are common potential explanations for statistically significant associations between exposure and outcome when the true relationship is noncausal. Understanding interactions is vital to proper interpretation of treatment effects. These complex concepts should be consistently and appropriately considered whenever one is not only designing but also analyzing and interpreting data from a randomized trial or observational study.
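
    The confounding point can be made concrete with a small simulation: a hypothetical confounder U drives both exposure and outcome, so the naive observational contrast is biased away from the true (null) effect, while randomizing the exposure removes the distortion. All numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
U = rng.normal(size=n)                                  # unmeasured confounder

# Observational world: exposure depends on U; true treatment effect is 0.
exposed_obs = rng.random(n) < 1 / (1 + np.exp(-U))
y_obs = 2.0 * U + 0.0 * exposed_obs + rng.normal(size=n)
naive = y_obs[exposed_obs].mean() - y_obs[~exposed_obs].mean()

# Randomized world: exposure assigned by a coin flip, independent of U.
exposed_rct = rng.random(n) < 0.5
y_rct = 2.0 * U + 0.0 * exposed_rct + rng.normal(size=n)
rct = y_rct[exposed_rct].mean() - y_rct[~exposed_rct].mean()

print(f"naive observational estimate: {naive:.2f} (biased by confounding)")
print(f"randomized estimate:          {rct:.2f} (near the true effect of 0)")
```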

  8. Dairy consumption, systolic blood pressure, and risk of hypertension: Mendelian randomization study

    USDA-ARS?s Scientific Manuscript database

    This study examined whether previous observed inverse associations of dairy intake with systolic blood pressure and risk of hypertension were causal. A Mendelian randomization study was employed, using the single nucleotide polymorphism rs4988235 related to lactase persistence as an instrumental var...

  9. A new u-statistic with superior design sensitivity in matched observational studies.

    PubMed

    Rosenbaum, Paul R

    2011-09-01

    In an observational or nonrandomized study of treatment effects, a sensitivity analysis indicates the magnitude of bias from unmeasured covariates that would need to be present to alter the conclusions of a naïve analysis that presumes adjustments for observed covariates suffice to remove all bias. The power of sensitivity analysis is the probability that it will reject a false hypothesis about treatment effects allowing for a departure from random assignment of a specified magnitude; in particular, if this specified magnitude is "no departure" then this is the same as the power of a randomization test in a randomized experiment. A new family of u-statistics is proposed that includes Wilcoxon's signed rank statistic but also includes other statistics with substantially higher power when a sensitivity analysis is performed in an observational study. Wilcoxon's statistic has high power to detect small effects in large randomized experiments-that is, it often has good Pitman efficiency-but small effects are invariably sensitive to small unobserved biases. Members of this family of u-statistics that emphasize medium to large effects can have substantially higher power in a sensitivity analysis. For example, in one situation with 250 pair differences that are Normal with expectation 1/2 and variance 1, the power of a sensitivity analysis that uses Wilcoxon's statistic is 0.08 while the power of another member of the family of u-statistics is 0.66. The topic is examined by performing a sensitivity analysis in three observational studies, using an asymptotic measure called the design sensitivity, and by simulating power in finite samples. The three examples are drawn from epidemiology, clinical medicine, and genetic toxicology. © 2010, The International Biometric Society.
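
    A minimal sketch of the Gamma = 1 (pure randomization) end of such a power calculation, using SciPy's Wilcoxon signed-rank test on simulated matched-pair differences; the paper's sensitivity-analysis bounds for Gamma > 1 and its new family of u-statistics are not reproduced here.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(4)

def wilcoxon_power(n_pairs=250, effect=0.5, n_sims=500, alpha=0.05):
    """Monte Carlo power of Wilcoxon's signed-rank test for matched-pair
    differences drawn from Normal(effect, 1). This is the Gamma = 1
    (no hidden bias) case; a sensitivity analysis would instead bound
    the p-value over unmeasured biases up to a given Gamma."""
    rejections = 0
    for _ in range(n_sims):
        d = rng.normal(effect, 1.0, n_pairs)
        _, p = wilcoxon(d, alternative="greater")
        rejections += p < alpha
    return rejections / n_sims

print("power at Gamma = 1:", wilcoxon_power())
```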

  10. A new test statistic for climate models that includes field and spatial dependencies using Gaussian Markov random fields

    DOE PAGES

    Nosedal-Sanchez, Alvaro; Jackson, Charles S.; Huerta, Gabriel

    2016-07-20

    A new test statistic for climate model evaluation has been developed that potentially mitigates some of the limitations that exist for observing and representing field and space dependencies of climate phenomena. Traditionally such dependencies have been ignored when climate models have been evaluated against observational data, which makes it difficult to assess whether any given model is simulating observed climate for the right reasons. The new statistic uses Gaussian Markov random fields for estimating field and space dependencies within a first-order grid point neighborhood structure. We illustrate the ability of Gaussian Markov random fields to represent empirical estimates of field and space covariances using "witch hat" graphs. We further use the new statistic to evaluate the tropical response of a climate model (CAM3.1) to changes in two parameters important to its representation of cloud and precipitation physics. Overall, the inclusion of dependency information did not alter significantly the recognition of those regions of parameter space that best approximated observations. However, there were some qualitative differences in the shape of the response surface that suggest how such a measure could affect estimates of model uncertainty.
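
    A minimal sketch of a first-order GMRF on a regular grid, assuming the common rook-neighbour structure and a precision matrix of the form kappa*I plus the graph Laplacian; the paper's actual statistic involves fitting such fields to model-observation differences, which is not attempted here.

```python
import numpy as np

def gmrf_precision(nrows, ncols, kappa=0.1):
    """Precision matrix of a first-order (rook-neighbour) Gaussian Markov
    random field on a regular grid: Q = kappa*I + D - W, where W is the
    grid adjacency matrix and D its degree matrix."""
    n = nrows * ncols
    W = np.zeros((n, n))
    for r in range(nrows):
        for c in range(ncols):
            i = r * ncols + c
            for dr, dc in [(1, 0), (0, 1)]:     # right and down neighbours
                rr, cc = r + dr, c + dc
                if rr < nrows and cc < ncols:
                    j = rr * ncols + cc
                    W[i, j] = W[j, i] = 1.0
    D = np.diag(W.sum(axis=1))
    return kappa * np.eye(n) + D - W

Q = gmrf_precision(4, 5)
print("precision matrix shape:", Q.shape,
      "| positive definite:", bool(np.all(np.linalg.eigvalsh(Q) > 0)))
```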

  11. A new test statistic for climate models that includes field and spatial dependencies using Gaussian Markov random fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nosedal-Sanchez, Alvaro; Jackson, Charles S.; Huerta, Gabriel

    A new test statistic for climate model evaluation has been developed that potentially mitigates some of the limitations that exist for observing and representing field and space dependencies of climate phenomena. Traditionally such dependencies have been ignored when climate models have been evaluated against observational data, which makes it difficult to assess whether any given model is simulating observed climate for the right reasons. The new statistic uses Gaussian Markov random fields for estimating field and space dependencies within a first-order grid point neighborhood structure. We illustrate the ability of Gaussian Markov random fields to represent empirical estimates of field and space covariances using "witch hat" graphs. We further use the new statistic to evaluate the tropical response of a climate model (CAM3.1) to changes in two parameters important to its representation of cloud and precipitation physics. Overall, the inclusion of dependency information did not alter significantly the recognition of those regions of parameter space that best approximated observations. However, there were some qualitative differences in the shape of the response surface that suggest how such a measure could affect estimates of model uncertainty.

  12. Factorization of Observables

    NASA Astrophysics Data System (ADS)

    Eliaš, Peter; Frič, Roman

    2017-12-01

    Categorical approach to probability leads to better understanding of basic notions and constructions in generalized (fuzzy, operational, quantum) probability, where observables—dual notions to generalized random variables (statistical maps)—play a major role. First, to avoid inconsistencies, we introduce three categories L, S, and P, the objects and morphisms of which correspond to basic notions of fuzzy probability theory and operational probability theory, and describe their relationships. To illustrate the advantages of categorical approach, we show that two categorical constructions involving observables (related to the representation of generalized random variables via products, or smearing of sharp observables, respectively) can be described as factorizing a morphism into composition of two morphisms having desired properties. We close with a remark concerning products.

  13. The emergence of collective phenomena in systems with random interactions

    NASA Astrophysics Data System (ADS)

    Abramkina, Volha

    Emergent phenomena are one of the most profound topics in modern science, addressing the ways that collectivities and complex patterns appear due to multiplicity of components and simple interactions. Ensembles of random Hamiltonians allow one to explore emergent phenomena in a statistical way. In this work we adopt a shell model approach with a two-body interaction Hamiltonian. The sets of the two-body interaction strengths are selected at random, resulting in the two-body random ensemble (TBRE). Symmetries such as angular momentum, isospin, and parity entangled with complex many-body dynamics result in surprising order discovered in the spectrum of low-lying excitations. The statistical patterns exhibited in the TBRE are remarkably similar to those observed in real nuclei. Signs of almost every collective feature seen in nuclei, namely, pairing superconductivity, deformation, and vibration, have been observed in random ensembles [3, 4, 5, 6]. In what follows a systematic investigation of nuclear shape collectivities in random ensembles is conducted. The development of the mean field, its geometry, multipole collectivities and their dependence on the underlying two-body interaction are explored. Apart from the role of static symmetries such as SU(2) angular momentum and isospin groups, the emergence of dynamical symmetries including the seniority SU(2), rotational symmetry, as well as the Elliot SU(3) is shown to be an important precursor for the existence of geometric collectivities.

  14. Dairy consumption and body mass index among adults: Mendelian randomization analysis of 184802 individuals from 25 studies

    USDA-ARS?s Scientific Manuscript database

    Associations between dairy intake and body mass index (BMI) have been inconsistently observed in epidemiological studies, and the causal relationship remains ill defined. We performed Mendelian randomization (MR) analysis using an established dairy intake-associated genetic polymorphism located upst...

  15. Vaccine Effectiveness - How Well Does the Seasonal Flu Vaccine Work?

    MedlinePlus

    ... to determine the benefits of flu vaccination are “observational studies.” “Observational studies” compare the occurrence of flu illness in ... randomized. The measurement of vaccine effects in an observational study is referred to as “effectiveness.”

  16. Ultra-fast quantum randomness generation by accelerated phase diffusion in a pulsed laser diode.

    PubMed

    Abellán, C; Amaya, W; Jofre, M; Curty, M; Acín, A; Capmany, J; Pruneri, V; Mitchell, M W

    2014-01-27

    We demonstrate a high bit-rate quantum random number generator by interferometric detection of phase diffusion in a gain-switched DFB laser diode. Gain switching at few-GHz frequencies produces a train of bright pulses with nearly equal amplitudes and random phases. An unbalanced Mach-Zehnder interferometer is used to interfere subsequent pulses and thereby generate strong random-amplitude pulses, which are detected and digitized to produce a high-rate random bit string. Using established models of semiconductor laser field dynamics, we predict a regime of high visibility interference and nearly complete vacuum-fluctuation-induced phase diffusion between pulses. These are confirmed by measurement of pulse power statistics at the output of the interferometer. Using a 5.825 GHz excitation rate and 14-bit digitization, we observe 43 Gbps quantum randomness generation.

  17. Randomized Trial of Plaque-Identifying Toothpaste: Decreasing Plaque and Inflammation.

    PubMed

    Fasula, Kim; Evans, Carla A; Boyd, Linda; Giblin, Lori; Belavsky, Benjamin Z; Hetzel, Scott; McBride, Patrick; DeMets, David L; Hennekens, Charles H

    2017-06-01

    Randomized data are sparse about whether a plaque-identifying toothpaste reduces dental plaque and nonexistent for inflammation. Inflammation is intimately involved in the pathogenesis of atherosclerosis and is accurately measured by high-sensitivity C-reactive protein (hs-CRP), a sensitive marker for cardiovascular disease. The hypotheses that Plaque HD (TJA Health LLC, Joliet, Ill), a plaque-identifying toothpaste, produces statistically significant reductions in dental plaque and hs-CRP were tested in this randomized trial. Sixty-one apparently healthy subjects aged 19 to 44 years were assigned at random to this plaque-identifying (n = 31) or placebo toothpaste (n = 30) for 60 days. Changes from baseline to follow-up in dental plaque and hs-CRP were assessed. In an intention-to-treat analysis, the plaque-identifying toothpaste reduced mean plaque score by 49%, compared with a 24% reduction in placebo (P = .001). In a prespecified subgroup analysis of 38 subjects with baseline levels >0.5 mg/L, the plaque-identifying toothpaste reduced hs-CRP by 29%, compared with a 25% increase in placebo toothpaste (P = .041). This plaque-identifying toothpaste produced statistically significant reductions in dental plaque and hs-CRP. The observed reduction in dental plaque confirms and extends a previous observation. The observed reduction in inflammation supports the hypothesis of a reduction in risks of cardiovascular disease. The direct test of this hypothesis requires a large-scale randomized trial of sufficient size and duration designed a priori to do so. Such a finding would have major clinical and public health implications. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Calibration of Predictor Models Using Multiple Validation Experiments

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2015-01-01

    This paper presents a framework for calibrating computational models using data from several and possibly dissimilar validation experiments. The offset between model predictions and observations, which might be caused by measurement noise, model-form uncertainty, and numerical error, drives the process by which uncertainty in the model's parameters is characterized. The resulting description of uncertainty along with the computational model constitute a predictor model. Two types of predictor models are studied: Interval Predictor Models (IPMs) and Random Predictor Models (RPMs). IPMs use sets to characterize uncertainty, whereas RPMs use random vectors. The propagation of a set through a model makes the response an interval-valued function of the state, whereas the propagation of a random vector yields a random process. Optimization-based strategies for calculating both types of predictor models are proposed. Whereas the formulations used to calculate IPMs target solutions leading to the interval-valued function of minimal spread containing all observations, those for RPMs seek to maximize the models' ability to reproduce the distribution of observations. Regarding RPMs, we choose a structure for the random vector (i.e., the assignment of probability to points in the parameter space) solely dependent on the prediction error. As such, the probabilistic description of uncertainty is not a subjective assignment of belief, nor is it expected to asymptotically converge to a fixed value, but instead it casts the model's ability to reproduce the experimental data. This framework enables evaluating the spread and distribution of the predicted response of target applications depending on the same parameters beyond the validation domain.

  19. Effects of relationship education on couple communication and satisfaction: A randomized controlled trial with low-income couples.

    PubMed

    Williamson, Hannah C; Altman, Noemi; Hsueh, JoAnn; Bradbury, Thomas N

    2016-02-01

    Although preventive educational interventions for couples have been examined in more than 100 experimental studies, the value of this work is limited by reliance on economically advantaged populations and by an absence of data on proposed mediators and moderators. Data from the Supporting Healthy Marriage Project-a randomized, controlled trial of relationship education for couples living with low incomes-were therefore analyzed to test whether intervention effects on relationship satisfaction would be mediated by observational assessments of relationship communication and whether any such effects would be moderated by couples' pretreatment risk. Within the larger sample of Supporting Healthy Marriage Project couples randomized to a relationship education or no-treatment control condition, the present analyses focus on the 1,034 couples who provided (a) data on sociodemographic risk at baseline, (b) observational data on couple communication 12 months after randomization, and (c) reports of relationship satisfaction 30 months after randomization. Intervention couples reported higher satisfaction at 30 months than control couples, regardless of their level of pretreatment risk. Among higher risk couples, the intervention improved observed communication as well. Contrary to prediction, treatment effects on satisfaction were not mediated by improvements in communication, and improvements in communication did not translate into greater satisfaction. Relationship education programs produce small improvements in relationship satisfaction and communication, particularly for couples at elevated sociodemographic risk. The absence of behavioral effects on satisfaction indicates, however, that the mechanisms by which couples may benefit from relationship education are not yet well understood. (c) 2016 APA, all rights reserved).

  20. A Single Mechanism Can Account for Human Perception of Depth in Mixed Correlation Random Dot Stereograms

    PubMed Central

    Cumming, Bruce G.

    2016-01-01

    In order to extract retinal disparity from a visual scene, the brain must match corresponding points in the left and right retinae. This computationally demanding task is known as the stereo correspondence problem. The initial stage of the solution to the correspondence problem is generally thought to consist of a correlation-based computation. However, recent work by Doi et al suggests that human observers can see depth in a class of stimuli where the mean binocular correlation is 0 (half-matched random dot stereograms). Half-matched random dot stereograms are made up of an equal number of correlated and anticorrelated dots, and the binocular energy model—a well-known model of V1 binocular complex cells—fails to signal disparity here. This has led to the proposition that a second, match-based computation must be extracting disparity in these stimuli. Here we show that a straightforward modification to the binocular energy model—adding a point output nonlinearity—is by itself sufficient to produce cells that are disparity-tuned to half-matched random dot stereograms. We then show that a simple decision model using this single mechanism can reproduce psychometric functions generated by human observers, including reduced performance to large disparities and rapidly updating dot patterns. The model makes predictions about how performance should change with dot size in half-matched stereograms and temporal alternation in correlation, which we test in human observers. We conclude that a single correlation-based computation, based directly on already-known properties of V1 neurons, can account for the literature on mixed correlation random dot stereograms. PMID:27196696

  1. Analysis of Machine Learning Techniques for Heart Failure Readmissions.

    PubMed

    Mortazavi, Bobak J; Downing, Nicholas S; Bucholz, Emily M; Dharmarajan, Kumar; Manhapra, Ajay; Li, Shu-Xia; Negahban, Sahand N; Krumholz, Harlan M

    2016-11-01

    The current ability to predict readmissions in patients with heart failure is modest at best. It is unclear whether machine learning techniques that address higher dimensional, nonlinear relationships among variables would enhance prediction. We sought to compare the effectiveness of several machine learning algorithms for predicting readmissions. Using data from the Telemonitoring to Improve Heart Failure Outcomes trial, we compared the effectiveness of random forests, boosting, random forests combined hierarchically with support vector machines or logistic regression (LR), and Poisson regression against traditional LR to predict 30- and 180-day all-cause readmissions and readmissions because of heart failure. We randomly selected 50% of patients for a derivation set, and a validation set comprised the remaining patients, validated using 100 bootstrapped iterations. We compared C statistics for discrimination and distributions of observed outcomes in risk deciles for predictive range. In 30-day all-cause readmission prediction, the best performing machine learning model, random forests, provided a 17.8% improvement over LR (mean C statistics, 0.628 and 0.533, respectively). For readmissions because of heart failure, boosting improved the C statistic by 24.9% over LR (mean C statistic 0.678 and 0.543, respectively). For 30-day all-cause readmission, the observed readmission rates in the lowest and highest deciles of predicted risk with random forests (7.8% and 26.2%, respectively) showed a much wider separation than LR (14.2% and 16.4%, respectively). Machine learning methods improved the prediction of readmission after hospitalization for heart failure compared with LR and provided the greatest predictive range in observed readmission rates. © 2016 American Heart Association, Inc.
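
    A minimal sketch of the model comparison described above, using synthetic data and scikit-learn in place of the trial dataset; the 50/50 derivation/validation split and the C statistic (ROC AUC) mirror the abstract, while the features, class balance, and hyperparameters are placeholders.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for readmission data: binary outcome, imbalanced classes.
X, y = make_classification(n_samples=2000, n_features=40, n_informative=8,
                           weights=[0.8, 0.2], random_state=0)
X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.5, random_state=0)

for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                    ("random forest", RandomForestClassifier(n_estimators=300, random_state=0))]:
    model.fit(X_dev, y_dev)
    auc = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
    print(f"{name}: C statistic = {auc:.3f}")
```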

  2. Image discrimination models predict detection in fixed but not random noise

    NASA Technical Reports Server (NTRS)

    Ahumada, A. J. Jr; Beard, B. L.; Watson, A. B. (Principal Investigator)

    1997-01-01

    By means of a two-interval forced-choice procedure, contrast detection thresholds for an aircraft positioned on a simulated airport runway scene were measured with fixed and random white-noise masks. The term fixed noise refers to a constant, or unchanging, noise pattern for each stimulus presentation. The random noise was either the same or different in the two intervals. Contrary to simple image discrimination model predictions, the same random noise condition produced greater masking than the fixed noise. This suggests that observers seem unable to hold a new noisy image for comparison. Also, performance appeared limited by internal process variability rather than by external noise variability, since similar masking was obtained for both random noise types.

  3. A Comparison of Techniques for Scheduling Earth-Observing Satellites

    NASA Technical Reports Server (NTRS)

    Globus, Al; Crawford, James; Lohn, Jason; Pryor, Anna

    2004-01-01

    Scheduling observations by coordinated fleets of Earth Observing Satellites (EOS) involves large search spaces, complex constraints and poorly understood bottlenecks, conditions where evolutionary and related algorithms are often effective. However, there are many such algorithms and the best one to use is not clear. Here we compare multiple variants of the genetic algorithm with stochastic hill climbing, simulated annealing, squeaky wheel optimization and iterated sampling on ten realistically-sized EOS scheduling problems. Schedules are represented by a permutation (non-temporal ordering) of the observation requests. A simple deterministic scheduler assigns times and resources to each observation request in the order indicated by the permutation, discarding those that violate the constraints created by previously scheduled observations. Simulated annealing performs best. Random mutation outperforms a more 'intelligent' mutator. Furthermore, the best mutator, by a small margin, was a novel approach we call temperature dependent random sampling that makes large changes in the early stages of evolution and smaller changes towards the end of search.
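
    A toy sketch of the permutation-plus-greedy-decoder representation with a simulated annealing search; the single-instrument conflict model, objective, swap mutation, and cooling schedule are illustrative stand-ins for the real EOS problem.

```python
import math
import random

random.seed(0)

# Each "observation request" has a start time and a duration; the decoder
# schedules requests in permutation order, skipping any that conflict.
requests = [(random.uniform(0, 20), random.uniform(1, 3)) for _ in range(30)]

def decode(perm):
    """Greedy decoder: scan requests in the given order and keep each one
    whose interval does not overlap an already-scheduled interval."""
    scheduled = []
    for i in perm:
        start, dur = requests[i]
        if all(start + dur <= s or start >= s + d for s, d in scheduled):
            scheduled.append((start, dur))
    return len(scheduled)            # objective: number of scheduled observations

def simulated_annealing(n_iters=20000, t0=2.0):
    perm = list(range(len(requests)))
    best = score = decode(perm)
    for k in range(n_iters):
        temp = t0 * (1 - k / n_iters) + 1e-6
        i, j = random.sample(range(len(perm)), 2)
        perm[i], perm[j] = perm[j], perm[i]            # swap mutation
        new = decode(perm)
        if new >= score or random.random() < math.exp((new - score) / temp):
            score, best = new, max(best, new)          # accept the move
        else:
            perm[i], perm[j] = perm[j], perm[i]        # undo the swap
    return best

print("observations scheduled:", simulated_annealing())
```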

  4. Teacher coaching supported by formative assessment for improving classroom practices.

    PubMed

    Fabiano, Gregory A; Reddy, Linda A; Dudek, Christopher M

    2018-06-01

    The present study is a wait-list controlled, randomized study investigating a teacher coaching approach that emphasizes formative assessment and visual performance feedback to enhance elementary school teachers' classroom practices. The coaching model targeted instructional and behavioral management practices as measured by the Classroom Strategies Assessment System (CSAS) Observer and Teacher Forms. The sample included 89 general education teachers, stratified by grade level, and randomly assigned to 1 of 2 conditions: (a) immediate coaching, or (b) waitlist control. Results indicated that, relative to the waitlist control, teachers in immediate coaching demonstrated significantly greater improvements in observations of behavior management strategy use but not for observations of instructional strategy use. Observer- and teacher-completed ratings of behavioral management strategy use at postassessment were significantly improved by both raters; ratings of instructional strategy use were significantly improved for teacher but not observer ratings. A brief coaching intervention improved teachers' use of observed behavior management strategies and self-reported use of behavior management and instructional strategies. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  5. The origin of dispersion of magnetoresistance of a domain wall spin valve

    NASA Astrophysics Data System (ADS)

    Sato, Jun; Matsushita, Katsuyoshi; Imamura, Hiroshi

    2010-01-01

    We theoretically study the current-perpendicular-to-plane magnetoresistance of a domain wall confined in a nanocontact which is experimentally fabricated as current-confined-path (CCP) structure in a nano-oxide-layer (NOL). We solve the non-collinear spin diffusion equation by using the finite element method and calculate the MR ratio by evaluating the additional voltage drop due to the spin accumulation. We investigate the origin of dispersion of magnetoresistance by considering the effect of randomness of the size and distribution of the nanocontacts in the NOL. It is observed that the effect of randomness of the contact size is much larger than that of the contact distribution. Our results suggest that the origin of dispersion of magnetoresistance observed in the experiments is the randomness of the size of the nanocontacts in the NOL.

  6. A Randomized Phase II Chemoprevention Trial of 13-CIS Retinoic Acid with Or without α Tocopherol or Observation in Subjects at High Risk for Lung Cancer

    PubMed Central

    Kelly, Karen; Kittelson, John; Franklin, Wilbur A.; Kennedy, Timothy C.; Klein, Catherine E.; Keith, Robert L.; Dempsey, Edward C.; Lewis, Marina; Jackson, Mary K.; Hirsch, Fred R.; Bunn, Paul A.; Miller, York E.

    2011-01-01

    No chemoprevention strategies have been proven effective for lung cancer. We evaluated the effect of 13-cis retinoic acid (13-cis RA), with or without α tocopherol, as a lung cancer chemoprevention agent in a phase II randomized controlled clinical trial of adult subjects at high risk for lung cancer as defined by the presence of sputum atypia, history of smoking, and airflow obstruction, or a prior surgically cured nonsmall cell lung cancer (disease free, >3 years). Subjects were randomly assigned to receive either 13-cis RA, 13-cis RA plus α tocopherol (13-cis RA/α toco) or observation for 12 months. Outcome measures are derived from histologic evaluation of bronchial biopsy specimens obtained by bronchoscopy at baseline and follow-up. The primary outcome measure is treatment “failure” defined as histologic progression (any increase in the maximum histologic score) or failure to return for follow-up bronchoscopy. Seventy-five subjects were randomized (27/22/26 to observation/13-cis RA/13-cis RA/α toco); 59 completed the trial; 55 had both baseline and follow-up bronchoscopy. The risk of treatment failure was 55.6% (15 of 27) and 50% (24 of 48) in the observation and combined (13 cis RA plus 13 cis RA/α toco) treatment arms, respectively (odds ratio adjusted for baseline histology, 0.97; 95% confidence interval, 0.36–2.66; P = 0.95). Among subjects with complete histology data, maximum histology score in the observation arm increased by 0.37 units and by 0.03 units in the treated arms (difference adjusted for baseline, −0.18; 95% confidence interval, −1.16 to 0.81; P = 0.72). Similar (nonsignificant) results were observed for treatment effects on endobronchial proliferation as assessed by Ki-67 immunolabeling. Twelve-month treatment with 13-cis RA produced nonsignificant changes in bronchial histology, consistent with results in other trials. Agents advancing to phase III randomized trials should produce greater histologic changes. The addition of α tocopherol did not affect toxicity. PMID:19401528

  7. Record statistics of a strongly correlated time series: random walks and Lévy flights

    NASA Astrophysics Data System (ADS)

    Godrèche, Claude; Majumdar, Satya N.; Schehr, Grégory

    2017-08-01

    We review recent advances on the record statistics of strongly correlated time series, whose entries denote the positions of a random walk or a Lévy flight on a line. After a brief survey of the theory of records for independent and identically distributed random variables, we focus on random walks. During the last few years, it was indeed realized that random walks are a very useful ‘laboratory’ to test the effects of correlations on the record statistics. We start with the simple one-dimensional random walk with symmetric jumps (both continuous and discrete) and discuss in detail the statistics of the number of records, as well as of the ages of the records, i.e. the lapses of time between two successive record breaking events. Then we review the results that were obtained for a wide variety of random walk models, including random walks with a linear drift, continuous time random walks, constrained random walks (like the random walk bridge) and the case of multiple independent random walkers. Finally, we discuss further observables related to records, like the record increments, as well as some questions raised by physical applications of record statistics, like the effects of measurement error and noise.
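
    A short simulation of the simplest case discussed above: counting records (new running maxima) of a symmetric random walk with Gaussian jumps, compared against the known large-n mean of about sqrt(4n/pi).

```python
import numpy as np

rng = np.random.default_rng(5)

def count_records(walk):
    """Number of record-breaking events (new running maxima) in a walk,
    counting the initial position as the first record."""
    running_max = np.maximum.accumulate(walk)
    return 1 + np.count_nonzero(walk[1:] > running_max[:-1])

# Symmetric random walk with continuous (Gaussian) jump lengths.
n, n_walks = 10_000, 500
steps = rng.normal(size=(n_walks, n))
walks = np.concatenate([np.zeros((n_walks, 1)), np.cumsum(steps, axis=1)], axis=1)
mean_records = np.mean([count_records(w) for w in walks])
print(f"simulated mean records: {mean_records:.1f}, "
      f"large-n theory ~ {np.sqrt(4 * n / np.pi):.1f}")
```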

  8. Bayesian approach to non-Gaussian field statistics for diffusive broadband terahertz pulses.

    PubMed

    Pearce, Jeremy; Jian, Zhongping; Mittleman, Daniel M

    2005-11-01

    We develop a closed-form expression for the probability distribution function for the field components of a diffusive broadband wave propagating through a random medium. We consider each spectral component to provide an individual observation of a random variable, the configurationally averaged spectral intensity. Since the intensity determines the variance of the field distribution at each frequency, this random variable serves as the Bayesian prior that determines the form of the non-Gaussian field statistics. This model agrees well with experimental results.

  9. Estimating random errors due to shot noise in backscatter lidar observations.

    PubMed

    Liu, Zhaoyan; Hunt, William; Vaughan, Mark; Hostetler, Chris; McGill, Matthew; Powell, Kathleen; Winker, David; Hu, Yongxiang

    2006-06-20

    We discuss the estimation of random errors due to shot noise in backscatter lidar observations that use either photomultiplier tube (PMT) or avalanche photodiode (APD) detectors. The statistical characteristics of photodetection are reviewed, and photon count distributions of solar background signals and laser backscatter signals are examined using airborne lidar observations at 532 nm using a photon-counting mode APD. Both distributions appear to be Poisson, indicating that the arrival at the photodetector of photons for these signals is a Poisson stochastic process. For Poisson-distributed signals, a proportional, one-to-one relationship is known to exist between the mean of a distribution and its variance. Although the multiplied photocurrent no longer follows a strict Poisson distribution in analog-mode APD and PMT detectors, the proportionality still exists between the mean and the variance of the multiplied photocurrent. We make use of this relationship by introducing the noise scale factor (NSF), which quantifies the constant of proportionality that exists between the root mean square of the random noise in a measurement and the square root of the mean signal. Using the NSF to estimate random errors in lidar measurements due to shot noise provides a significant advantage over the conventional error estimation techniques, in that with the NSF, uncertainties can be reliably calculated from or for a single data sample. Methods for evaluating the NSF are presented. Algorithms to compute the NSF are developed for the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations lidar and tested using data from the Lidar In-space Technology Experiment.
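
    The proportionality behind the NSF can be illustrated with a toy detector model, assuming an ideal photon-counting process scaled by a fixed gain; the estimated ratio of RMS noise to the square root of the mean signal is then constant across signal levels.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy model: Poisson photon counts scaled by a fixed detector/digitizer gain,
# so RMS noise = NSF * sqrt(mean signal) with NSF = sqrt(gain).
gain = 3.0
mean_signals = np.array([10, 50, 200, 1000])   # mean photon counts per sample
for m in mean_signals:
    samples = gain * rng.poisson(m, 100_000)
    nsf_est = samples.std() / np.sqrt(samples.mean())
    print(f"mean signal {m:5d}: estimated NSF = {nsf_est:.3f} "
          f"(expected {np.sqrt(gain):.3f})")
```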

  10. Estimating Random Errors Due to Shot Noise in Backscatter Lidar Observations

    NASA Technical Reports Server (NTRS)

    Liu, Zhaoyan; Hunt, William; Vaughan, Mark A.; Hostetler, Chris A.; McGill, Matthew J.; Powell, Kathy; Winker, David M.; Hu, Yongxiang

    2006-01-01

    In this paper, we discuss the estimation of random errors due to shot noise in backscatter lidar observations that use either photomultiplier tube (PMT) or avalanche photodiode (APD) detectors. The statistical characteristics of photodetection are reviewed, and photon count distributions of solar background signals and laser backscatter signals are examined using airborne lidar observations at 532 nm using a photon-counting mode APD. Both distributions appear to be Poisson, indicating that the arrival at the photodetector of photons for these signals is a Poisson stochastic process. For Poisson-distributed signals, a proportional, one-to-one relationship is known to exist between the mean of a distribution and its variance. Although the multiplied photocurrent no longer follows a strict Poisson distribution in analog-mode APD and PMT detectors, the proportionality still exists between the mean and the variance of the multiplied photocurrent. We make use of this relationship by introducing the noise scale factor (NSF), which quantifies the constant of proportionality that exists between the root-mean-square of the random noise in a measurement and the square root of the mean signal. Using the NSF to estimate random errors in lidar measurements due to shot noise provides a significant advantage over the conventional error estimation techniques, in that with the NSF uncertainties can be reliably calculated from/for a single data sample. Methods for evaluating the NSF are presented. Algorithms to compute the NSF are developed for the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) lidar and tested using data from the Lidar In-space Technology Experiment (LITE).

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Douillard, Jean-Yves; Rosell, Rafael; De Lena, Mario

    Purpose: To study the impact of postoperative radiation therapy (PORT) on survival in the Adjuvant Navelbine International Trialist Association (ANITA) randomized study of adjuvant chemotherapy. Methods and Materials: ANITA is a randomized trial of adjuvant cisplatin and vinorelbine chemotherapy vs. observation in completely resected non-small-cell lung carcinoma (NSCLC) Stages IB to IIIA. Use of PORT was recommended for pN+ disease but was not randomized or mandatory. Each center decided whether to use PORT before initiation of the study. We describe here the survival of patients with and without PORT within each treatment group of ANITA. No statistical comparison of survival was performed because this was an unplanned subgroup analysis. Results: Overall, 232 of 840 patients received PORT (33.3% in the observation arm and 21.6% in the chemotherapy arm). In univariate analysis, PORT had a deleterious effect on the overall population survival. Patients with pN1 disease had an improved survival from PORT in the observation arm (median survival [MS] 25.9 vs. 50.2 months), whereas PORT had a detrimental effect in the chemotherapy group (MS 93.6 months and 46.6 months). In contrast, survival was improved in patients with pN2 disease who received PORT, both in the chemotherapy (MS 23.8 vs. 47.4 months) and observation arm (median 12.7 vs. 22.7 months). Conclusion: This retrospective evaluation suggests a positive effect of PORT in pN2 disease and a negative effect on pN1 disease when patients received adjuvant chemotherapy. The results support further evaluation of PORT in prospectively randomized studies in completely resected pN2 NSCLC.

  12. Targeting Prodromal Alzheimer Disease With Avagacestat: A Randomized Clinical Trial.

    PubMed

    Coric, Vladimir; Salloway, Stephen; van Dyck, Christopher H; Dubois, Bruno; Andreasen, Niels; Brody, Mark; Curtis, Craig; Soininen, Hilkka; Thein, Stephen; Shiovitz, Thomas; Pilcher, Gary; Ferris, Steven; Colby, Susan; Kerselaers, Wendy; Dockens, Randy; Soares, Holly; Kaplita, Stephen; Luo, Feng; Pachai, Chahin; Bracoud, Luc; Mintun, Mark; Grill, Joshua D; Marek, Ken; Seibyl, John; Cedarbaum, Jesse M; Albright, Charles; Feldman, Howard H; Berman, Robert M

    2015-11-01

    Early identification of Alzheimer disease (AD) is important for clinical management and affords the opportunity to assess potential disease-modifying agents in clinical trials. To our knowledge, this is the first report of a randomized trial to prospectively enrich a study population with prodromal AD (PDAD) defined by cerebrospinal fluid (CSF) biomarker criteria and mild cognitive impairment (MCI) symptoms. To assess the safety of the γ-secretase inhibitor avagacestat in PDAD and to determine whether CSF biomarkers can identify this patient population prior to clinical diagnosis of dementia. A randomized, placebo-controlled phase 2 clinical trial with a parallel, untreated, nonrandomized observational cohort of CSF biomarker-negative participants was conducted May 26, 2009, to July 9, 2013, in a multicenter global population. Of 1358 outpatients screened, 263 met MCI and CSF biomarker criteria for randomization into the treatment phase. One hundred two observational cohort participants who met MCI criteria but were CSF biomarker-negative were observed during the same study period to evaluate biomarker assay sensitivity. Oral avagacestat or placebo daily. Safety and tolerability of avagacestat. Of the 263 participants in the treatment phase, 132 were randomized to avagacestat and 131 to placebo; an additional 102 participants were observed in an untreated observational cohort. Avagacestat was relatively well tolerated with low discontinuation rates (19.6%) at a dose of 50 mg/d, whereas the dose of 125 mg/d had higher discontinuation rates (43%), primarily attributable to gastrointestinal tract adverse events. Increases in nonmelanoma skin cancer and nonprogressive, reversible renal tubule effects were observed with avagacestat. Serious adverse event rates were higher with avagacestat (49 participants [37.1%]) vs placebo (31 [23.7%]), attributable to the higher incidence of nonmelanoma skin cancer. At 2 years, progression to dementia was more frequent in the PDAD cohort (30.7%) vs the observational cohort (6.5%). Brain atrophy rate in PDAD participants was approximately double that of the observational cohort. Concordance between abnormal amyloid burden on positron emission tomography and pathologic CSF was approximately 87% (κ = 0.68; 95% CI, 0.48-0.87). No significant treatment differences were observed in the avagacestat vs placebo arm in key clinical outcome measures. Avagacestat did not demonstrate efficacy and was associated with adverse dose-limiting effects. This PDAD population receiving avagacestat or placebo had higher rates of clinical progression to dementia and greater brain atrophy compared with CSF biomarker-negative participants. The CSF biomarkers and amyloid positron emission tomography imaging were correlated, suggesting that either modality could be used to confirm the presence of cerebral amyloidopathy and identify PDAD. clinicaltrials.gov Identifier: NCT00890890.

  13. Improving parenting skills for families of young children in pediatric settings: a randomized clinical trial.

    PubMed

    Perrin, Ellen C; Sheldrick, R Christopher; McMenamy, Jannette M; Henson, Brandi S; Carter, Alice S

    2014-01-01

    Disruptive behavior disorders, such as attention-deficit/hyperactivity disorder and oppositional defiant disorder, are common and stable throughout childhood. These disorders cause long-term morbidity but benefit from early intervention. While symptoms are often evident before preschool, few children receive appropriate treatment during this period. Group parent training, such as the Incredible Years program, has been shown to be effective in improving parenting strategies and reducing children's disruptive behaviors. Because they already monitor young children's behavior and development, primary care pediatricians are in a good position to intervene early when indicated. To investigate the feasibility and effectiveness of parent-training groups delivered to parents of toddlers in pediatric primary care settings. This randomized clinical trial was conducted at 11 diverse pediatric practices in the Greater Boston area. A total of 273 parents of children between 2 and 4 years old who acknowledged disruptive behaviors on a 20-item checklist were included. A 10-week Incredible Years parent-training group co-led by a research clinician and a pediatric staff member. Self-reports and structured videotaped observations of parent and child behaviors conducted prior to, immediately after, and 12 months after the intervention. A total of 150 parents were randomly assigned to the intervention or the waiting-list group. An additional 123 parents were assigned to receive intervention without a randomly selected comparison group. Compared with the waiting-list group, greater improvement was observed in both intervention groups (P < .05). No differences were observed between the randomized and the nonrandomized intervention groups. Self-reports and structured observations provided evidence of improvements in parenting practices and child disruptive behaviors that were attributable to participation in the Incredible Years groups. This study demonstrated the feasibility and effectiveness of parent-training groups conducted in pediatric office settings to reduce disruptive behavior in toddlers. clinicaltrials.gov Identifier: NCT00402857.

  14. Random Telegraph Signal Amplitudes in Sub 100 nm (Decanano) MOSFETs: A 3D 'Atomistic' Simulation Study

    NASA Technical Reports Server (NTRS)

    Asenov, Asen; Balasubramaniam, R.; Brown, A. R.; Davies, J. H.; Saini, Subhash

    2000-01-01

    In this paper we use 3D simulations to study the amplitudes of random telegraph signals (RTS) associated with the trapping of a single carrier in interface states in the channel of sub 100 nm (decanano) MOSFETs. Both simulations using continuous doping charge and random discrete dopants in the active region of the MOSFETs are presented. We have studied the dependence of the RTS amplitudes on the position of the trapped charge in the channel and on the device design parameters. We have observed a significant increase in the maximum RTS amplitude when discrete random dopants are employed in the simulations.

  15. Is instillation of anesthetic gel necessary in flexible cystoscopic examination? A prospective randomized study.

    PubMed

    Kobayashi, Takashi; Nishizawa, Koji; Ogura, Keiji

    2003-01-01

    To determine whether urethral injection of anesthetic and lubricating agent before outpatient flexible cystoscopic examination is worthwhile regarding patient tolerance of pain. A randomized prospective study was conducted. A total of 133 consecutive men scheduled to undergo flexible cystoscopy were randomized to receive 11 mL of 0.2% oxybuprocaine hydrochloride gel (group 1), 11 mL of plain lubricating gel (group 2), or no gel injection (group 3). In every group, 2% lidocaine gel was applied to the fiberscope. Patients recorded the level of pain during gel instillation, scope insertion, and intravesical observation separately on a 100-mm visual analog self-assessment scale. Pain scores for gel instillation were approximately two thirds those for scope insertion and intravesical observation in groups 1 and 2. No significant difference was noted in the pain score of each group during either scope insertion or intravesical observation. Pain during intraurethral gel instillation is significant. Anesthetic gel instillation has no advantage compared with no-gel injection in men when lubricating gel is applied to a flexible fiberscope.

  16. Random potentials and cosmological attractors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linde, Andrei, E-mail: alinde@stanford.edu

    I show that the problem of realizing inflation in theories with random potentials of a limited number of fields can be solved, and agreement with the observational data can be naturally achieved if at least one of these fields has a non-minimal kinetic term of the type used in the theory of cosmological α-attractors.

  17. Unbiased Causal Inference from an Observational Study: Results of a Within-Study Comparison

    ERIC Educational Resources Information Center

    Pohl, Steffi; Steiner, Peter M.; Eisermann, Jens; Soellner, Renate; Cook, Thomas D.

    2009-01-01

    Adjustment methods such as propensity scores and analysis of covariance are often used for estimating treatment effects in nonexperimental data. Shadish, Clark, and Steiner used a within-study comparison to test how well these adjustments work in practice. They randomly assigned participating students to a randomized or nonrandomized experiment.…

  18. Professional Opinion Concerning the Effectiveness of Bracing Relative to Observation in Adolescent Idiopathic Scoliosis

    PubMed Central

    Dolan, Lori A.; Donnelly, Melanie J.; Spratt, Kevin F.; Weinstein, Stuart L.

    2015-01-01

    Objective To determine if community equipoise exists concerning the effectiveness of bracing in adolescent idiopathic scoliosis. Background Data Bracing is the standard of care for adolescent idiopathic scoliosis despite the lack of strong research evidence concerning its effectiveness. Thus, some researchers support the idea of a randomized trial, whereas others think that randomization in the face of a standard of care would be unethical. Methods A random sample of Scoliosis Research Society and Pediatric Orthopaedic Society of North America members was asked to consider 12 clinical profiles and to give their opinion concerning the radiographic outcomes after observation and bracing. Results An expert panel was created from the respondents. They expressed a wide array of opinions concerning the percentage of patients within each scenario who would benefit from bracing. Agreement was noted concerning the risk due to bracing for post-menarchal patients only. Conclusions This study found a high degree of variability in opinion among clinicians concerning the effectiveness of bracing, suggesting that a randomized trial of bracing would be ethical. PMID:17414008

  19. Multi-Sensory Intervention Observational Research

    ERIC Educational Resources Information Center

    Thompson, Carla J.

    2011-01-01

    An observational research study based on sensory integration theory was conducted to examine the observed impact of student selected multi-sensory experiences within a multi-sensory intervention center relative to the sustained focus levels of students with special needs. A stratified random sample of 50 students with severe developmental…

  20. A theory of eu-estrogenemia: a unifying concept

    PubMed Central

    Turner, Ralph J.; Kerber, Irwin J.

    2017-01-01

    Abstract Objective: The aim of the study was to propose a unifying theory for the role of estrogen in postmenopausal women through examples in basic science, randomized controlled trials, observational studies, and clinical practice. Methods: Review and evaluation of the literature relating to estrogen. Discussion: The role of hormone therapy and ubiquitous estrogen receptors after reproductive senescence gains insight from basic science models. Observational studies and individualized patient care in clinical practice may show outcomes that are not reproduced in randomized clinical trials. The understanding gained from the timing hypothesis for atherosclerosis, the critical window theory in neurosciences, randomized controlled trials, and numerous genomic and nongenomic actions of estrogen discovered in basic science provides new explanations to clinical challenges that practitioners face. Consequences of a hypo-estrogenemic duration in women's lives are poorly understood. The Study of Women Across the Nation suggests its magnitude is greater than was previously acknowledged. We propose that the healthy user bias was the result of surgical treatment (hysterectomy with oophorectomy) for many gynecological maladies followed by pharmacological and physiological doses of estrogen to optimize patient quality of life. The past decade of research has begun to demonstrate the role of estrogen in homeostasis. Conclusions: The theory of eu-estrogenemia provides a robust framework to unify the timing hypothesis, critical window theory, randomized controlled trials, the basic science of estrogen receptors, and clinical observations of patients over the past five decades. PMID:28562489

  1. A randomized and placebo-controlled study to compare the skin-lightening efficacy and safety of lignin peroxidase cream vs. 2% hydroquinone cream.

    PubMed

    Mauricio, Tess; Karmon, Yoram; Khaiat, Alain

    2011-12-01

      Historically, the most effective treatments for skin lightening have contained hydroquinone. However, there is a need for an effective alternative.   The purpose of this study was to evaluate the skin-lightening efficacy and safety of lignin peroxidase (LIP) creams using a regimen of both day and night products compared with twice-daily application of 2% hydroquinone cream and placebo in Asian women.   This was a randomized, double-blind, placebo-controlled, split-face, single-center study of 51 patients. Patients were randomized to receive day and night LIP cream on one randomly selected side of their face and either 2% hydroquinone cream or placebo on the other.   A statistically significant change from baseline in the melanin index was observed in LIP-treated skin, with a mean reduction of 7.6% (P < 0.001) on Day 31. Conversely, hydroquinone and placebo did not provide a statistically significant lightening effect when instrumentally measured. Dermatologist scoring demonstrated a significant improvement in overall fairness as early as 8 days after treatment initiation in the LIP-treated group, which was not observed in the other groups. Overall, patients preferred the LIP creams.   The application of day/night LIP cream provided a significantly more rapid and observable skin-lightening effect than hydroquinone 2% cream or placebo. © 2011 Wiley Periodicals, Inc.

  2. Investigating Cell Criticality

    NASA Astrophysics Data System (ADS)

    Serra, R.; Villani, M.; Damiani, C.; Graudenzi, A.; Ingrami, P.; Colacci, A.

    Random Boolean networks provide a way to give a precise meaning to the notion that living beings are in a critical state. Some phenomena which are observed in real biological systems (distribution of "avalanches" in gene knock-out experiments) can be modeled using random Boolean networks, and the results can be analytically proven to depend upon the Derrida parameter, which also determines whether the network is critical. By comparing observed and simulated data one can then draw inferences about the criticality of biological cells, although with some care because of the limited number of experimental observations. The relationship between the criticality of a single network and that of a set of interacting networks, which simulate a tissue or a bacterial colony, is also analyzed by computer simulations.
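
    As a concrete illustration of the modelling idea described above (not code from the authors), the following minimal Python sketch builds small random Boolean networks and estimates the Derrida-style one-step spread of a single-bit perturbation, which is near 1 for critical networks. The network size, connectivities, and sample counts are arbitrary illustrative choices.

      import random

      def make_rbn(n, k, rng):
          """Random Boolean network: each node gets k random inputs and a random truth table."""
          inputs = [rng.sample(range(n), k) for _ in range(n)]
          tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
          return inputs, tables

      def step(state, inputs, tables):
          new = []
          for inp, tab in zip(inputs, tables):
              idx = 0
              for j in inp:
                  idx = (idx << 1) | state[j]
              new.append(tab[idx])
          return new

      def derrida_spread(n=100, k=2, samples=1000, seed=0):
          """Average one-step Hamming spread of a single-bit perturbation (about 1 at criticality)."""
          rng = random.Random(seed)
          total = 0
          for _ in range(samples):
              inputs, tables = make_rbn(n, k, rng)
              s = [rng.randint(0, 1) for _ in range(n)]
              t = list(s)
              t[rng.randrange(n)] ^= 1
              s1, t1 = step(s, inputs, tables), step(t, inputs, tables)
              total += sum(a != b for a, b in zip(s1, t1))
          return total / samples

      if __name__ == "__main__":
          for k in (1, 2, 3):   # K=2 with unbiased rules is the classical critical point
              print(f"K={k}: average spread of a single-bit flip ~ {derrida_spread(k=k):.2f}")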

  3. Fast state estimation subject to random data loss in discrete-time nonlinear stochastic systems

    NASA Astrophysics Data System (ADS)

    Mahdi Alavi, S. M.; Saif, Mehrdad

    2013-12-01

    This paper focuses on the design of the standard observer in discrete-time nonlinear stochastic systems subject to random data loss. Under the assumption that the system response is incrementally bounded, two sufficient conditions are subsequently derived that guarantee exponential mean-square stability and fast convergence of the estimation error for the problem at hand. An efficient algorithm is also presented to obtain the observer gain. Finally, the proposed methodology is employed for monitoring the Continuous Stirred Tank Reactor (CSTR) via a wireless communication network. The effectiveness of the designed observer is extensively assessed by using an experimental test-bed that has been fabricated for performance evaluation of over-wireless-network estimation techniques under realistic radio channel conditions.
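
    The paper's own observer design is not reproduced here; as a hedged illustration of the general setting, the sketch below runs a simple linear Luenberger-style observer in which each measurement packet is lost with a fixed Bernoulli probability and the correction term is simply skipped on loss. The matrices A and C, the gain L, and the loss probability are invented for the example.

      import numpy as np

      # Minimal sketch: Luenberger-style observer for x[k+1] = A x[k] + w, y[k] = C x[k] + v,
      # where each measurement packet is lost independently with probability p_loss.
      # On a lost packet the correction term is skipped (pure open-loop prediction).

      rng = np.random.default_rng(1)

      A = np.array([[1.0, 0.1],
                    [0.0, 0.9]])        # simple plant (illustrative, not from the paper)
      C = np.array([[1.0, 0.0]])        # only the first state is measured
      L = np.array([[0.5],
                    [0.3]])             # observer gain, chosen by hand so A - L C is stable
      p_loss = 0.3

      x = np.array([1.0, -1.0])         # true state
      xhat = np.zeros(2)                # observer estimate
      errors = []

      for k in range(200):
          y = C @ x + 0.05 * rng.standard_normal(1)      # noisy measurement of the current state
          received = rng.random() > p_loss                # Bernoulli packet arrival
          innovation = (y - C @ xhat) if received else np.zeros(1)
          xhat = A @ xhat + (L @ innovation).ravel()      # predict + (possibly skipped) correction
          x = A @ x + 0.01 * rng.standard_normal(2)       # true state propagates with process noise
          errors.append(np.linalg.norm(x - xhat))

      print(f"mean estimation error over the last 50 steps: {np.mean(errors[-50:]):.3f}")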

  4. The topology of large-scale structure. V - Two-dimensional topology of sky maps

    NASA Astrophysics Data System (ADS)

    Gott, J. R., III; Mao, Shude; Park, Changbom; Lahav, Ofer

    1992-01-01

    A 2D algorithm is applied to observed sky maps and numerical simulations. It is found that when topology is studied on smoothing scales larger than the correlation length, the topology is approximately in agreement with the random phase formula for the 2D genus-threshold density relation, G_2(ν) ∝ ν exp(−ν²/2). Some samples show small 'meatball shifts' similar to those seen in corresponding 3D observational samples and similar to those produced by biasing in cold dark matter simulations. The observational results are thus consistent with the standard model in which the structure in the universe today has grown from small fluctuations caused by random quantum noise in the early universe.
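
    The random-phase relation quoted above can be evaluated directly; the short sketch below does so, leaving the overall amplitude as a free parameter because it depends on the field's power spectrum and smoothing scale (an assumption of this illustration, not a value taken from the paper).

      import numpy as np

      def genus2d_random_phase(nu, amplitude=1.0):
          """Random-phase form of the 2D genus curve, G2(nu) proportional to nu * exp(-nu**2 / 2).
          The amplitude depends on the field's power spectrum and smoothing length,
          so it is left here as a free parameter."""
          nu = np.asarray(nu, dtype=float)
          return amplitude * nu * np.exp(-nu ** 2 / 2.0)

      if __name__ == "__main__":
          for nu in np.linspace(-3, 3, 7):
              g = float(genus2d_random_phase(nu))
              print(f"nu = {nu:+.1f}   G2 = {g:+.3f}")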

  5. Mendelian randomization in nutritional epidemiology

    PubMed Central

    Qi, Lu

    2013-01-01

    Nutritional epidemiology aims to identify dietary and lifestyle causes for human diseases. Causality inference in nutritional epidemiology is largely based on evidence from studies of observational design, and may be distorted by unmeasured or residual confounding and reverse causation. Mendelian randomization is a recently developed methodology that combines genetic and classical epidemiological analysis to infer causality for environmental exposures, based on the principle of Mendel’s law of independent assortment. Mendelian randomization uses genetic variants as proxies for environmental exposures of interest. Associations derived from Mendelian randomization analysis are less likely to be affected by confounding and reverse causation. During the past 5 years, a body of studies examined the causal effects of diet/lifestyle factors and biomarkers on a variety of diseases. The Mendelian randomization approach also holds considerable promise in the study of intrauterine influences on offspring health outcomes. However, the application of Mendelian randomization in nutritional epidemiology has some limitations. PMID:19674341
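
    To make the logic of Mendelian randomization concrete, the following sketch (illustrative only, with invented effect sizes) simulates a genetic variant G that influences an exposure X, an exposure that influences an outcome Y, and an unmeasured confounder U; the naive X-Y regression is biased, while the Wald ratio of the G-Y and G-X associations recovers the causal slope.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 50_000

      # Simulated data: G -> X -> Y, with an unmeasured confounder U affecting both X and Y.
      g = rng.binomial(2, 0.3, n)                         # genotype: 0, 1 or 2 risk alleles
      u = rng.standard_normal(n)                          # unmeasured confounder
      x = 0.4 * g + 1.0 * u + rng.standard_normal(n)      # exposure
      y = 0.5 * x - 1.5 * u + rng.standard_normal(n)      # outcome; true causal effect of X on Y is 0.5

      def slope(a, b):
          """Ordinary least-squares slope of b regressed on a."""
          return np.cov(a, b)[0, 1] / np.var(a, ddof=1)

      naive = slope(x, y)                    # confounded observational estimate
      wald = slope(g, y) / slope(g, x)       # Mendelian randomization (Wald ratio) estimate

      print(f"naive X->Y regression slope : {naive:+.3f}  (biased by U)")
      print(f"Wald ratio (MR) estimate    : {wald:+.3f}  (true effect +0.500)")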

  6. Measures of Reliability in Behavioral Observation: The Advantage of "Real Time" Data Acquisition.

    ERIC Educational Resources Information Center

    Hollenbeck, Albert R.; Slaby, Ronald G.

    Two observers who were using an electronic digital data acquisition system were spot checked for reliability at random times over a four month period. Between-and within-observer reliability was assessed for frequency, duration, and duration-per-event measures of four infant behaviors. The results confirmed the problem of observer drift--the…

  7. Effects of Different Observational Systems and Time Sequences Upon Non-Participant Observers' Behavioral Ratings.

    ERIC Educational Resources Information Center

    Wodarski, John S.; And Others

    Four different observational systems and two time sequences were employed to determine the extent to which they would yield different incidences of anti-social behavior. Two videotapes, randomly chosen from a pool of 30 tapes, were utilized. These illustrated the behaviors of anti-social children in a natural setting. Six observers were reliably…

  8. Time series, correlation matrices and random matrix models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vinayak; Seligman, Thomas H.

    2014-01-08

    In this set of five lectures the authors have presented techniques to analyze open classical and quantum systems using correlation matrices. For diverse reasons we shall see that random matrices play an important role to describe a null hypothesis or a minimum-information hypothesis for the description of a quantum system or subsystem. In the former case we consider various forms of correlation matrices of time series associated with the classical observables of some system. The fact that such series are necessarily finite inevitably introduces noise, and this finite-time influence leads to a random or stochastic component in these time series. As a consequence, random correlation matrices have a random component, and corresponding ensembles are used. In the latter case we use random matrices to describe a high-temperature environment or uncontrolled perturbations, ensembles of differing chaotic systems, etc. The common theme of the lectures is thus the importance of random matrix theory in a wide range of fields in and around physics.
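
    A standard null-hypothesis construction of the kind discussed in these lectures can be sketched in a few lines: build the sample correlation matrix of independent Gaussian time series and compare its eigenvalue range with the Marchenko-Pastur bounds. The dimensions below are arbitrary, and the example does not reproduce any specific calculation from the lectures.

      import numpy as np

      rng = np.random.default_rng(0)
      N, T = 100, 400                         # N time series of length T
      q = N / T

      # Null hypothesis: independent Gaussian time series -> sample correlation matrix
      data = rng.standard_normal((N, T))
      eigvals = np.linalg.eigvalsh(np.corrcoef(data))

      # Marchenko-Pastur support expected for a pure-noise correlation matrix
      lam_minus, lam_plus = (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2

      print(f"Marchenko-Pastur support  : [{lam_minus:.3f}, {lam_plus:.3f}]")
      print(f"observed eigenvalue range : [{eigvals.min():.3f}, {eigvals.max():.3f}]")
      print("eigenvalues well outside the MP band would signal genuine correlations")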

  9. Balancing the Evidence: How to Reconcile the Results of Observational Studies vs. Randomized Clinical Trials in Dialysis.

    PubMed

    Shen, Jenny I; Lum, Erik L; Chang, Tara I

    2016-09-01

    Because large randomized clinical trials (RCTs) in dialysis have been relatively scarce, evidence-based dialysis care has depended heavily on the results of observational studies. However, when results from RCTs appear to contradict the findings of observational studies, nephrologists are left to wonder which type of study they should believe. In this editorial, we explore the key differences between observational studies and RCTs in the context of such seemingly conflicting studies in dialysis. Confounding is the major limitation of observational studies, whereas low statistical power and problems with external validity are more likely to limit the findings of RCTs. Differences in the specification of the population, exposure, and outcomes can also contribute to different results among RCTs and observational studies. Rigorous methods are required regardless of what type of study is conducted, and readers should not automatically assume that one type of study design is superior to the other. Ultimately, dialysis care requires both well-designed, well-conducted observational studies and RCTs to move the field forward. © 2016 Wiley Periodicals, Inc.

  10. Balancing the Evidence: How to Reconcile the Results of Observational Studies vs. Randomized Clinical Trials in Dialysis

    PubMed Central

    Shen, Jenny I.; Lum, Erik L.; Chang, Tara I.

    2016-01-01

    Because large randomized clinical trials (RCTs) in dialysis have been relatively scarce, evidence-based dialysis care has depended heavily on the results of observational studies. However, when results from RCTs appear to contradict the findings of observational studies, nephrologists are left to wonder which type of study they should believe. In this editorial we explore the key differences between observational studies and RCTs in the context of such seemingly conflicting studies in dialysis. Confounding is the major limitation of observational studies, while low statistical power and problems with external validity are more likely to limit the findings of RCTs. Differences in the specification of the population, exposure, and outcomes can also contribute to different results among RCTs and observational studies. Rigorous methods are required regardless of what type of study is conducted, and readers should not automatically assume that one type of study design is superior to the other. Ultimately, dialysis care requires both well-designed, well-conducted observational studies and RCTs to move the field forward. PMID:27207819

  11. Run-Reversal Equilibrium for Clinical Trial Randomization

    PubMed Central

    Grant, William C.

    2015-01-01

    In this paper, we describe a new restricted randomization method called run-reversal equilibrium (RRE), which is a Nash equilibrium of a game where (1) the clinical trial statistician chooses a sequence of medical treatments, and (2) clinical investigators make treatment predictions. RRE randomization counteracts how each investigator could observe treatment histories in order to forecast upcoming treatments. Computation of a run-reversal equilibrium reflects how the treatment history at a particular site is imperfectly correlated with the treatment imbalance for the overall trial. An attractive feature of RRE randomization is that treatment imbalance follows a random walk at each site, while treatment balance is tightly constrained and regularly restored for the overall trial. Less predictable and therefore more scientifically valid experiments can be facilitated by run-reversal equilibrium for multi-site clinical trials. PMID:26079608

  12. Fast generation of sparse random kernel graphs

    DOE PAGES

    Hagberg, Aric; Lemons, Nathan; Du, Wen-Bo

    2015-09-10

    The development of kernel-based inhomogeneous random graphs has provided models that are flexible enough to capture many observed characteristics of real networks, and that are also mathematically tractable. We specify a class of inhomogeneous random graph models, called random kernel graphs, that produces sparse graphs with tunable graph properties, and we develop an efficient generation algorithm to sample random instances from this model. As real-world networks are usually large, it is essential that the run-time of generation algorithms scales better than quadratically in the number of vertices n. We show that for many practical kernels our algorithm runs in time at most O(n (log n)²). As an example, we show how to generate samples of power-law degree distribution graphs with tunable assortativity.
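
    The fast algorithm itself is not reproduced here; as a point of reference, the sketch below implements the naive O(n²) baseline for an inhomogeneous (Chung-Lu-style) random graph, in which each edge appears independently with a probability given by a kernel of vertex weights. The weight sequence is an invented illustration of a heavy-tailed expected degree distribution.

      import random

      def chung_lu_graph(weights, rng):
          """Naive O(n^2) sampler for an inhomogeneous (Chung-Lu style) random graph:
          edge {i, j} is present with probability min(1, w_i * w_j / sum(w)).
          Fast kernel-graph generators improve on exactly this quadratic baseline."""
          n, total = len(weights), sum(weights)
          edges = []
          for i in range(n):
              for j in range(i + 1, n):
                  if rng.random() < min(1.0, weights[i] * weights[j] / total):
                      edges.append((i, j))
          return edges

      if __name__ == "__main__":
          rng = random.Random(7)
          # heavy-tailed weights give an approximately power-law expected degree sequence
          w = [min(50.0, 100.0 * (k + 1) ** -0.5) for k in range(2000)]
          g = chung_lu_graph(w, rng)
          print(f"{len(w)} vertices, {len(g)} edges, mean degree {2 * len(g) / len(w):.2f}")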

  13. Asymmetrically dominated choice problems, the isolation hypothesis and random incentive mechanisms.

    PubMed

    Cox, James C; Sadiraj, Vjollca; Schmidt, Ulrich

    2014-01-01

    This paper presents an experimental study of the random incentive mechanisms which are a standard procedure in economic and psychological experiments. Random incentive mechanisms have several advantages but are incentive-compatible only if responses to the single tasks are independent. This is true if either the independence axiom of expected utility theory or the isolation hypothesis of prospect theory holds. We present a simple test of this in the context of choice under risk. In the baseline (one task) treatment we observe risk behavior in a given choice problem. We show that by integrating a second, asymmetrically dominated choice problem in a random incentive mechanism risk behavior can be manipulated systematically. This implies that the isolation hypothesis is violated and the random incentive mechanism does not elicit true preferences in our example.

  14. Follow-up of colorectal cancer patients after resection with curative intent-the GILDA trial.

    PubMed

    Grossmann, Erik M; Johnson, Frank E; Virgo, Katherine S; Longo, Walter E; Fossati, Rolando

    2004-01-01

    Surgery remains the primary treatment of colorectal cancer. Data are lacking to delineate the optimal surveillance strategy following resection. A large-scale multi-center European study is underway to address this issue (Gruppo Italiano di Lavoro per la Diagnosi Anticipata-GILDA). Following primary surgery with curative intent, stratification, and randomization at GILDA headquarters, colon cancer patients are then assigned to a more intensive or less intensive surveillance regimen. Rectal cancer patients undergoing curative resection are similarly randomized, with their follow-up regimens placing more emphasis on detection of local recurrence. Target recruitment for the study will be 1500 patients to achieve a statistical power of 80% (assuming an alpha of 0.05 and a hazard-rate reduction of >24%). Since the trial opened in 1998, 985 patients have been randomized from 41 centers as of February 2004. There were 496 patients randomized to the less intensive regimens, and 489 randomized to the more intensive regimens. The mean duration of follow-up is 14 months. 75 relapses (15%) and 32 deaths (7%) had been observed in the two more intensive follow-up arms, while 64 relapses (13%) and 24 deaths (5%) had been observed in the two less intensive arms as of February 2004. This trial should provide the first evidence based on an adequately powered randomized trial to determine the optimal follow-up strategy for colorectal cancer patients. This trial is open to US centers, and recruitment continues.

  15. Behavioral family intervention for children with developmental disabilities and behavioral problems.

    PubMed

    Roberts, Clare; Mazzucchelli, Trevor; Studman, Lisa; Sanders, Matthew R

    2006-06-01

    The outcomes of a randomized clinical trial of a new behavioral family intervention, Stepping Stones Triple P, for preschoolers with developmental and behavior problems are presented. Forty-eight children with developmental disabilities participated, 27 randomly allocated to an intervention group and 20 to a wait-list control group. Parents completed measures of parenting style and stress, and independent observers assessed parent-child interactions. The intervention was associated with fewer child behavior problems reported by mothers and independent observers, improved maternal and paternal parenting style, and decreased maternal stress. All effects were maintained at 6-month follow-up.

  16. Long-range ordering effect in electrodeposition of zinc and zinc oxide.

    PubMed

    Liu, Tao; Wang, Sheng; Shi, Zi-Liang; Ma, Guo-Bin; Wang, Mu; Peng, Ru-Wen; Hao, Xi-Ping; Ming, Nai-Ben

    2007-05-01

    In this paper, we report the long-range ordering effect observed in the electro-crystallization of Zn and ZnO from an ultrathin aqueous electrolyte layer of ZnSO4 . The deposition branches are regularly angled, covered with random-looking, scalelike crystalline platelets of ZnO. Although the orientation of each crystalline platelet of ZnO appears random, transmission electron microscopy shows that they essentially possess the same crystallographic orientation as the single-crystalline zinc electrodeposit underneath. Based on the experimental observations, we suggest that this unique long-range ordering effect results from an epitaxial nucleation effect in electrocrystallization.

  17. Binocular interactions in random chromatic changes at isoluminance

    NASA Astrophysics Data System (ADS)

    Medina, José M.

    2006-02-01

    To examine the type of chromatic interactions at isoluminance in the phenomenon of binocular vision, I have determined simple visual reaction times (VRT) under three observational conditions (monocular left, monocular right, and binocular) for different chromatic stimuli along random color axes at isoluminance (simultaneous L-, M-, and S-cone variations). Upper and lower boundaries of probability summation as well as the binocular capacity coefficient were estimated with observed distributions of reaction times. The results were not consistent with the notion of independent chromatic channels between eyes, suggesting the existence of excitatory and inhibitory binocular interactions at suprathreshold isoluminance conditions.

  18. Reporting and methodological quality of sample size calculations in cluster randomized trials could be improved: a review.

    PubMed

    Rutterford, Clare; Taljaard, Monica; Dixon, Stephanie; Copas, Andrew; Eldridge, Sandra

    2015-06-01

    To assess the quality of reporting and accuracy of a priori estimates used in sample size calculations for cluster randomized trials (CRTs). We reviewed 300 CRTs published between 2000 and 2008. The prevalence of reporting sample size elements from the 2004 CONSORT recommendations was evaluated and a priori estimates compared with those observed in the trial. Of the 300 trials, 166 (55%) reported a sample size calculation. Only 36 of 166 (22%) reported all recommended descriptive elements. Elements specific to CRTs were the worst reported: a measure of within-cluster correlation was specified in only 58 of 166 (35%). Only 18 of 166 articles (11%) reported both a priori and observed within-cluster correlation values. Except in two cases, observed within-cluster correlation values were either close to or less than a priori values. Even with the CONSORT extension for cluster randomization, the reporting of sample size elements specific to these trials remains below that necessary for transparent reporting. Journal editors and peer reviewers should implement stricter requirements for authors to follow CONSORT recommendations. Authors should report observed and a priori within-cluster correlation values to enable comparisons between these over a wider range of trials. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
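
    The cluster-specific elements discussed above enter sample size calculations through the usual design effect 1 + (m − 1)·ICC for average cluster size m. The sketch below applies this inflation to a standard two-sample-means calculation; the effect size, cluster size, and ICC are illustrative values, not figures from the review.

      import math
      from statistics import NormalDist

      def n_per_arm_individual(delta, sd, alpha=0.05, power=0.8):
          """Two-sample comparison of means, individually randomized, per-arm sample size."""
          z_a = NormalDist().inv_cdf(1 - alpha / 2)
          z_b = NormalDist().inv_cdf(power)
          return 2 * ((z_a + z_b) * sd / delta) ** 2

      def n_per_arm_cluster(delta, sd, cluster_size, icc, alpha=0.05, power=0.8):
          """Inflate the individually randomized size by the design effect 1 + (m - 1) * ICC."""
          deff = 1 + (cluster_size - 1) * icc
          n = n_per_arm_individual(delta, sd, alpha, power) * deff
          return math.ceil(n), math.ceil(n / cluster_size)

      if __name__ == "__main__":
          # illustrative values: detect a 0.3 SD difference, clusters of 20, ICC = 0.05
          n_ind = math.ceil(n_per_arm_individual(0.3, 1.0))
          n_clu, k = n_per_arm_cluster(0.3, 1.0, cluster_size=20, icc=0.05)
          print(f"individual randomization: {n_ind} participants per arm")
          print(f"cluster randomization   : {n_clu} participants per arm in {k} clusters of 20")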

  19. Recovery of ovarian activity in women with functional hypothalamic amenorrhea who were treated with cognitive behavior therapy.

    PubMed

    Berga, Sarah L; Marcus, Marsha D; Loucks, Tammy L; Hlastala, Stefanie; Ringham, Rebecca; Krohn, Marijane A

    2003-10-01

    To determine whether cognitive behavior therapy (CBT) targeted to problematic attitudes common among women with functional hypothalamic amenorrhea would restore ovarian function. Randomized, prospective, controlled intervention. Clinical research center in an academic medical institution. Sixteen women participated who had functional hypothalamic amenorrhea; were of normal body weight; and did not report psychiatric conditions, eating disorders, or excessive exercise. Subjects were randomized to CBT or observation for 20 weeks. Serum levels of E(2) and P and vaginal bleeding were monitored. Of eight women treated with CBT, six resumed ovulating, one had partial recovery of ovarian function without evidence of ovulation, and one did not display return of ovarian function. Of those randomized to observation, one resumed ovulating, one had partial return of ovarian function, and six did not recover. Thus, CBT resulted in a higher rate of ovarian activity (87.5%) than did observation (25.0%), chi(2) = 7.14. A cognitive behavioral intervention designed to minimize problematic attitudes linked to hypothalamic allostasis was more likely to result in resumption of ovarian activity than observation. The prompt ovarian response to CBT suggests that a tailored behavioral intervention offers an efficacious treatment option that also avoids the pitfalls of pharmacological modalities.

  20. No evidence for inbreeding avoidance in a natural population of song sparrows (Melospiza melodia).

    PubMed

    Keller, L F; Arcese, P

    1998-09-01

    We studied mate choice and inbreeding avoidance in a natural population of song sparrows (Melospiza melodia) on Mandarte Island, Canada. Inbreeding occurred regularly: 59% of all matings were between known relatives. We tested for inbreeding avoidance by comparing the observed levels of inbreeding to those expected if mate choice had been random with respect to relatedness. Independent of our assumptions about the availability of mates in the random mating model, we found that the expected and observed distributions of inbreeding coefficients were similar, as was the expected and observed frequency of close (f ≥ 0.125) inbreeding. Furthermore, there was no difference in relatedness between observed pairs and those that would have resulted had birds mated instead with their nearest neighbors. The only evidence to suggest any inbreeding avoidance was a reduced rate of parent-offspring matings as compared to one random mating model but not the other. Hence, despite substantial inbreeding depression in this population, we found little evidence for inbreeding avoidance through mate choice. We present a simple model to suggest that variation in inbreeding avoidance behaviors in birds may arise from differences in survival rates: in species with low survival rates, the costs of forfeiting matings to avoid inbreeding may exceed the costs of inbreeding.

  1. On the estimation of intracluster correlation for time-to-event outcomes in cluster randomized trials.

    PubMed

    Kalia, Sumeet; Klar, Neil; Donner, Allan

    2016-12-30

    Cluster randomized trials (CRTs) involve the random assignment of intact social units rather than independent subjects to intervention groups. Time-to-event outcomes often are endpoints in CRTs. Analyses of such data need to account for the correlation among cluster members. The intracluster correlation coefficient (ICC) is used to assess the similarity among binary and continuous outcomes that belong to the same cluster. However, estimating the ICC in CRTs with time-to-event outcomes is a challenge because of the presence of censored observations. The literature suggests that the ICC may be estimated using either censoring indicators or observed event times. A simulation study explores the effect of administrative censoring on estimating the ICC. Results show that ICC estimators derived from censoring indicators or observed event times are negatively biased. Analytic work further supports these results. Observed event times are preferred to estimate the ICC under minimum frequency of administrative censoring. To our knowledge, the existing literature provides no practical guidance on the estimation of the ICC when a substantial amount of administrative censoring is present. The results from this study corroborate the need for further methodological research on estimating the ICC for correlated time-to-event outcomes. Copyright © 2016 John Wiley & Sons, Ltd.

  2. Ways of learning: Observational studies versus experiments

    USGS Publications Warehouse

    Shaffer, T.L.; Johnson, D.H.

    2008-01-01

    Manipulative experimentation that features random assignment of treatments, replication, and controls is an effective way to determine causal relationships. Wildlife ecologists, however, often must take a more passive approach to investigating causality. Their observational studies lack one or more of the 3 cornerstones of experimentation: controls, randomization, and replication. Although an observational study can be analyzed similarly to an experiment, one is less certain that the presumed treatment actually caused the observed response. Because the investigator does not actively manipulate the system, the chance that something other than the treatment caused the observed results is increased. We reviewed observational studies and contrasted them with experiments and, to a lesser extent, sample surveys. We identified features that distinguish each method of learning and illustrate or discuss some complications that may arise when analyzing results of observational studies. Findings from observational studies are prone to bias. Investigators can reduce the chance of reaching erroneous conclusions by formulating a priori hypotheses that can be pursued multiple ways and by evaluating the sensitivity of study conclusions to biases of various magnitudes. In the end, however, professional judgment that considers all available evidence is necessary to render a decision regarding causality based on observational studies.

  3. Empirically Driven Variable Selection for the Estimation of Causal Effects with Observational Data

    ERIC Educational Resources Information Center

    Keller, Bryan; Chen, Jianshen

    2016-01-01

    Observational studies are common in educational research, where subjects self-select or are otherwise non-randomly assigned to different interventions (e.g., educational programs, grade retention, special education). Unbiased estimation of a causal effect with observational data depends crucially on the assumption of ignorability, which specifies…

  4. Two Universality Classes for the Many-Body Localization Transition

    NASA Astrophysics Data System (ADS)

    Khemani, Vedika; Sheng, D. N.; Huse, David A.

    2017-08-01

    We provide a systematic comparison of the many-body localization (MBL) transition in spin chains with nonrandom quasiperiodic versus random fields. We find evidence suggesting that these belong to two separate universality classes: the first dominated by "intrinsic" intrasample randomness, and the second dominated by external intersample quenched randomness. We show that the effects of intersample quenched randomness are strongly growing, but not yet dominant, at the system sizes probed by exact-diagonalization studies on random models. Thus, the observed finite-size critical scaling collapses in such studies appear to be in a preasymptotic regime near the nonrandom universality class, but showing signs of the initial crossover towards the external-randomness-dominated universality class. Our results provide an explanation for why exact-diagonalization studies on random models see an apparent scaling near the transition while also obtaining finite-size scaling exponents that strongly violate Harris-Chayes bounds that apply to disorder-driven transitions. We also show that the MBL phase is more stable for the quasiperiodic model as compared to the random one, and the transition in the quasiperiodic model suffers less from certain finite-size effects.

  5. Topological analysis of the CfA redshift survey

    NASA Technical Reports Server (NTRS)

    Vogeley, Michael S.; Park, Changbom; Geller, Margaret J.; Huchra, John P.; Gott, J. Richard, III

    1994-01-01

    We study the topology of large-scale structure in the Center for Astrophysics Redshift Survey, which now includes approximately 12,000 galaxies with limiting magnitude m(sub B) less than or equal to 15.5. The dense sampling and large volume of this survey allow us to compute the topology on smoothing scales from 6 to 20/h Mpc; we thus examine the topology of structure in both 'nonlinear' and 'linear' regimes. On smoothing scales less than or equal to 10/h Mpc this sample has 3 times the number of resolution elements of samples examined in previous studies. Isodensity surfaces of the smoothed galaxy density field demonstrate that coherent high-density structures and large voids dominate the galaxy distribution. We compute the genus-threshold density relation for isodensity surfaces of the CfA survey. To quantify phase correlation in these data, we compare the CfA genus with the genus of realizations of Gaussian random fields with the power spectrum measured for the CfA survey. On scales less than or equal to 10/h Mpc the observed genus amplitude is smaller than random phase (96% confidence level). This decrement reflects the degree of phase coherence in the observed galaxy distribution. In other words, the genus amplitude on these scales is not a good measure of the power spectrum slope. On scales greater than 10/h Mpc, where the galaxy distribution is roughly in the 'linear' regime, the genus amplitude is consistent with the random phase amplitude. The shape of the genus curve reflects the strong coherence in the observed structure; the observed genus curve appears broader than random phase (94% confidence level for smoothing scales less than or equal to 10/h Mpc) because the topology is spongelike over a very large range of density threshold. This departure from random phase is consistent with a distribution like a filamentary net of 'walls with holes.' On smoothing scales approaching approximately 20/h Mpc the shape of the CfA genus curve is consistent with random phase. There is very weak evidence for a shift of the genus toward a 'bubble-like' topology. To test cosmological models, we compute the genus for mock CfA surveys drawn from large (L greater than or approximately 400/h Mpc) N-body simulations of three variants of the cold dark matter (CDM) cosmogony. The genus amplitude of the 'standard' CDM model (omega h = 0.5, b = 1.5) differs from the observations (96% confidence level) on smoothing scales less than or approximately 10/h Mpc. An open CDM model (omega h = 0.2) and a CDM model with nonzero cosmological constant (omega h = 0.24, lambda (sub 0) = 0.6) are consistent with the observed genus amplitude over the full range of smoothing scales. All of these models fail (97% confidence level) to match the broadness of the observed genus curve on smoothing scales less than or equal to 10/h Mpc.

  6. Impact of Free Glasses and a Teacher Incentive on Children's Use of Eyeglasses: A Cluster-Randomized Controlled Trial.

    PubMed

    Yi, Hongmei; Zhang, Haiqing; Ma, Xiaochen; Zhang, Linxiu; Wang, Xiuqin; Jin, Ling; Naidoo, Kovin; Minto, Hasan; Zou, Haidong; Lu, Lina; Rozelle, Scott; Congdon, Nathan

    2015-11-01

    To study the effect of free glasses combined with teacher incentives on in-school glasses wear among Chinese urban migrant children. Cluster-randomized controlled trial. Children with visual acuity (VA) ≤6/12 in either eye owing to refractive error in 94 randomly chosen primary schools underwent randomization by school to receive free glasses, education on their use, and a teacher incentive (Intervention), or glasses prescriptions only (Control). Intervention group teachers received a tablet computer if ≥80% of children given glasses wore them during unannounced visits 6 weeks and 6 months (main outcome) after intervention. Among 4376 children, 728 (16.7%, mean age 10.9 years, 51.0% boys) met enrollment criteria and were randomly allocated, 358 (49.2%, 47 schools) to Intervention and 370 (50.8%, 47 schools) to Control. Among these, 693 children (95.2%) completed the study and underwent analysis. Spectacle wear was significantly higher at 6 months among Intervention children (Observed [main outcome]: 68.3% vs 23.9%, adjusted odds ratio [OR] = 11.5, 95% confidence interval [CI] 5.91-22.5, P < .001; Self-reported: 90.6% vs 32.1%, OR = 43.7, 95% CI = 21.7-88.5, P < .001). Other predictors of observed wear at 6 months included baseline spectacle wear (P < .001), uncorrected VA <6/18 (P = .01), and parental spectacle wear (P = .02). The 6-month observed wear rate was only 41% among similar-aged children provided free glasses in our previous trial without teacher incentives. Free spectacles and teacher incentives maintain classroom wear in the large majority of children needing glasses over a school year. Low wear among Control children demonstrates the need for interventions. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. A randomized, phase II study of pazopanib in castrate-sensitive prostate cancer: a University of Chicago Phase II Consortium/Department of Defense Prostate Cancer Clinical Trials Consortium study.

    PubMed

    Ward, J E; Karrison, T; Chatta, G; Hussain, M; Shevrin, D; Szmulewitz, R Z; O'Donnell, P H; Stadler, W M; Posadas, E M

    2012-03-01

    Intermittent androgen suppression (IAS) is an increasingly popular treatment option for castrate-sensitive prostate cancer. On the basis of previous data with anti-angiogenic strategies, we hypothesized that pan-inhibition of the vascular endothelial growth factor receptor using pazopanib during the IAS off period would result in prolonged time to PSA failure. Men with biochemically recurrent prostate cancer, whose PSA was <0.5 ng ml(-1) after 6 months of androgen deprivation therapy were randomized to pazopanib 800 mg daily or observation. The planned primary outcome was time to PSA progression >4.0 ng ml(-1). Thirty-seven patients were randomized. Of 18 patients randomized to pazopanib, at the time of study closure, 4 had progressive disease, 1 remained on treatment and 13 (72%) electively disenrolled, the most common reason being patient request due to grade 1/2 toxicity (8 patients). Two additional patients were removed from treatment due to adverse events. Of 19 patients randomized to observation, at the time of study closure, 4 had progressive disease, 7 remained under protocol-defined observation and 8 (42%) had disenrolled, most commonly due to non-compliance with protocol visits (3 patients). Because of high dropout rates in both arms, the study was halted. IAS is a treatment approach that may facilitate investigation of novel agents in the hormone-sensitive state. This trial attempted to investigate the role of antiangiogenic therapy in this setting, but encountered several barriers, including toxicities and patient non-compliance, which can make implementation of such a study difficult. Future investigative efforts in this arena should carefully consider drug toxicity and employ a design that maximizes patient convenience to reduce the dropout rate.

  8. Statins as antiarrhythmics: a systematic review part I: effects on risk of atrial fibrillation.

    PubMed

    Abuissa, Hussam; O'Keefe, James H; Bybee, Kevin A

    2009-10-01

    Recent studies have demonstrated that statins may possess antiarrhythmic properties in addition to their lipid-lowering effects. Studies which reported the association of statins with the incidence of atrial arrhythmias were identified through a systematic review of published literature. One randomized, placebo-controlled trial of 200 patients undergoing cardiac surgery showed that atorvastatin decreased the incidence of postoperative atrial fibrillation by 61%. Observational studies in patients with stable coronary disease, left ventricular dysfunction, or those undergoing cardiac or noncardiac surgery show that statin therapy is associated with an approximately 50% lower rate of atrial fibrillation. Two small randomized trials reported conflicting results: one showing that atorvastatin reduced the recurrence of AF after electrical cardioversion and the other finding that pravastatin did not. Published data suggests that statins may possess antiarrhythmic properties that reduce the propensity for atrial fibrillation. Most of this data is observational; more randomized, placebo-controlled trials are needed.

  9. Testing Models for the Contributions of Genes and Environment to Developmental Change in Adolescent Depression

    PubMed Central

    Eaves, Lindon J.; Maes, Hermine; Silberg, Judy L.

    2015-01-01

    We tested two models to identify the genetic and environmental processes underlying longitudinal changes in depression among adolescents. The first assumes that observed changes in covariance structure result from the unfolding of inherent, random individual differences in the overall levels and rates of change in depression over time (random growth curves). The second assumes that observed changes are due to time-specific random effects (innovations) accumulating over time (autoregressive effects). We found little evidence of age-specific genetic effects or persistent genetic innovations. Instead, genetic effects are consistent with a gradual unfolding in the liability to depression and rates of change with increasing age. Likewise, the environment also creates significant individual differences in overall levels of depression and rates of change. However, there are also time-specific environmental experiences that persist with fidelity. The implications of these differing genetic and environmental mechanisms in the etiology of depression are considered. PMID:25894924

  10. Testing Models for the Contributions of Genes and Environment to Developmental Change in Adolescent Depression.

    PubMed

    Gillespie, Nathan A; Eaves, Lindon J; Maes, Hermine; Silberg, Judy L

    2015-07-01

    We tested two models to identify the genetic and environmental processes underlying longitudinal changes in depression among adolescents. The first assumes that observed changes in covariance structure result from the unfolding of inherent, random individual differences in the overall levels and rates of change in depression over time (random growth curves). The second assumes that observed changes are due to time-specific random effects (innovations) accumulating over time (autoregressive effects). We found little evidence of age-specific genetic effects or persistent genetic innovations. Instead, genetic effects are consistent with a gradual unfolding in the liability to depression and rates of change with increasing age. Likewise, the environment also creates significant individual differences in overall levels of depression and rates of change. However, there are also time-specific environmental experiences that persist with fidelity. The implications of these differing genetic and environmental mechanisms in the etiology of depression are considered.

  11. Observing random walks of atoms in buffer gas through resonant light absorption

    NASA Astrophysics Data System (ADS)

    Aoki, Kenichiro; Mitsui, Takahisa

    2016-07-01

    Using resonant light absorption, random-walk motions of rubidium atoms in nitrogen buffer gas are observed directly. The transmitted light intensity through atomic vapor is measured, and its spectrum is obtained, down to orders of magnitude below the shot-noise level to detect fluctuations caused by atomic motions. To understand the measured spectra, the spectrum for atoms performing random walks in a Gaussian light beam is computed, and its analytical form is obtained. The spectrum has 1/f² (f is frequency) behavior at higher frequencies, crossing over to a different, but well-defined, behavior at lower frequencies. The properties of this theoretical spectrum agree excellently with the measured spectrum. This understanding also enables us to obtain the diffusion constant, the photon cross section of atoms in buffer gas, and the atomic number density from a single spectral measurement. We further discuss other possible applications of our experimental method and analysis.
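
    The measurement idea can be mimicked numerically. The hedged sketch below lets many atoms perform 2D random walks through a Gaussian beam profile, sums their contribution to the absorption signal, and checks that the fluctuation spectrum falls roughly as 1/f² well above the beam-crossing frequency. All parameters are arbitrary simulation units, not the experimental values.

      import numpy as np

      rng = np.random.default_rng(3)

      n_atoms, n_steps, dt = 200, 4096, 1e-3     # arbitrary simulation units
      beam_waist, step_rms, box = 1.0, 0.05, 10.0

      # 2D random walks in a periodic box much larger than the beam
      pos = rng.uniform(-box / 2, box / 2, size=(n_atoms, 2))
      signal = np.empty(n_steps)
      for t in range(n_steps):
          pos = (pos + step_rms * rng.standard_normal((n_atoms, 2)) + box / 2) % box - box / 2
          # each atom absorbs in proportion to the local Gaussian beam intensity
          signal[t] = np.exp(-2 * np.sum(pos ** 2, axis=1) / beam_waist ** 2).sum()

      # power spectrum of the absorption fluctuations
      fluct = signal - signal.mean()
      spectrum = np.abs(np.fft.rfft(fluct)) ** 2
      freqs = np.fft.rfftfreq(n_steps, d=dt)

      # for a 1/f^2 spectrum, raising f by a factor of 10 should cut the power by about 100
      low = spectrum[(freqs > 10) & (freqs < 20)].mean()
      high = spectrum[(freqs > 100) & (freqs < 200)].mean()
      print(f"power ratio between the 10-20 and 100-200 bands: {low / high:.0f} (about 100 expected for 1/f^2)")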

  12. 'Mendelian randomization': an approach for exploring causal relations in epidemiology.

    PubMed

    Gupta, V; Walia, G K; Sachdeva, M P

    2017-04-01

    To assess the current status of Mendelian randomization (MR) approach in effectively influencing the observational epidemiology for examining causal relationships. Narrative review on studies related to principle, strengths, limitations, and achievements of MR approach. Observational epidemiological studies have repeatedly produced several beneficiary associations which were discarded when tested by standard randomized controlled trials (RCTs). The technique which is more feasible, highly similar to RCTs, and has the potential to establish a causal relationship between modifiable exposures and disease outcomes is known as MR. The technique uses genetic variants related to modifiable traits/exposures as instruments for detecting causal and directional associations with outcomes. In the last decade, the approach of MR has methodologically developed and progressed to a stage of high acceptance among the epidemiologists and is gradually expanding the landscape of causal relationships in non-communicable chronic diseases. Copyright © 2016 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  13. The dependence of carbide morphology on grain boundary character in the highly twinned Alloy 690

    NASA Astrophysics Data System (ADS)

    Li, Hui; Xia, Shuang; Zhou, Bangxin; Chen, Wenjue; Hu, Changliang

    2010-04-01

    The dependence of morphology of grain boundary carbides on grain boundary character in Alloy 690 (Ni-30Cr-10Fe, mass fraction, %) with high fraction of low Σ coincidence site lattice (CSL) grain boundaries was investigated by scanning electron microscopy (SEM) and transmission electron microscopy (TEM). Some of the surface grains were removed by means of deep etching. It was observed that carbides grow dendritically at grain boundaries. The carbide bars observed near incoherent twin boundaries and twin related Σ9 grain boundaries are actually secondary dendrites of the carbides on these boundaries. Higher order dendrites could be observed on random grain boundaries, however, no bar-like dendrites were observed near Σ27 grain boundaries and random grain boundaries. The morphology difference of carbides precipitated at grain boundaries with different characters is discussed based on the experimental results in this paper.

  14. Long-acting reversible contraceptive acceptability and unintended pregnancy among women presenting for short-acting methods: a randomized patient preference trial.

    PubMed

    Hubacher, David; Spector, Hannah; Monteith, Charles; Chen, Pai-Lien; Hart, Catherine

    2017-02-01

    Measures of contraceptive effectiveness combine technology and user-related factors. Observational studies show higher effectiveness of long-acting reversible contraception compared with short-acting reversible contraception. Women who choose long-acting reversible contraception may differ in key ways from women who choose short-acting reversible contraception, and it may be these differences that are responsible for the high effectiveness of long-acting reversible contraception. Wider use of long-acting reversible contraception is recommended, but scientific evidence of acceptability and successful use is lacking in a population that typically opts for short-acting methods. The objective of the study was to reduce bias in measuring contraceptive effectiveness and better isolate the independent role that long-acting reversible contraception has in preventing unintended pregnancy relative to short-acting reversible contraception. We conducted a partially randomized patient preference trial and recruited women aged 18-29 years who were seeking a short-acting method (pills or injectable). Participants who agreed to randomization were assigned to 1 of 2 categories: long-acting reversible contraception or short-acting reversible contraception. Women who declined randomization but agreed to follow-up in the observational cohort chose their preferred method. Under randomization, participants chose a specific method in the category and received it for free, whereas participants in the preference cohort paid for the contraception in their usual fashion. Participants were followed up prospectively to measure primary outcomes of method continuation and unintended pregnancy at 12 months. Kaplan-Meier techniques were used to estimate method continuation probabilities. Intent-to-treat principles were applied after method initiation for comparing incidence of unintended pregnancy. We also measured acceptability in terms of level of happiness with the products. Of the 916 participants, 43% chose randomization and 57% chose the preference option. Complete loss to follow-up at 12 months was <2%. The 12-month method continuation probabilities were 63.3% (95% confidence interval, 58.9-67.3) (preference short-acting reversible contraception), 53.0% (95% confidence interval, 45.7-59.8) (randomized short-acting reversible contraception), and 77.8% (95% confidence interval, 71.0-83.2) (randomized long-acting reversible contraception) (P < .001 in the primary comparison involving randomized groups). The 12-month cumulative unintended pregnancy probabilities were 6.4% (95% confidence interval, 4.1-8.7) (preference short-acting reversible contraception), 7.7% (95% confidence interval, 3.3-12.1) (randomized short-acting reversible contraception), and 0.7% (95% confidence interval, 0.0-4.7) (randomized long-acting reversible contraception) (P = .01 when comparing randomized groups). In the secondary comparisons involving only short-acting reversible contraception users, the continuation probability was higher in the preference group compared with the randomized group (P = .04). However, the short-acting reversible contraception randomized group and short-acting reversible contraception preference group had statistically equivalent rates of unintended pregnancy (P = .77). Seventy-eight percent of randomized long-acting reversible contraception users were happy/neutral with their initial method, compared with 89% of randomized short-acting reversible contraception users (P < .05). 
However, among method continuers at 12 months, all groups were equally happy/neutral (>90%). Even in a typical population of women who presented to initiate or continue short-acting reversible contraception, long-acting reversible contraception proved highly acceptable. One year after initiation, women randomized to long-acting reversible contraception had high continuation rates and consequently experienced superior protection from unintended pregnancy compared with women using short-acting reversible contraception; these findings are attributable to the initial technology and not underlying factors that often bias observational estimates of effectiveness. The similarly patterned experiences of the 2 short-acting reversible contraception cohorts provide a bridge of generalizability between the randomized group and usual-care preference group. Benefits of increased voluntary uptake of long-acting reversible contraception may extend to wider populations than previously thought. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Sampling large random knots in a confined space

    NASA Astrophysics Data System (ADS)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.

    2007-09-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (such as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n²}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is of the order O(n²). Therefore, two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.
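
    As a small illustration of the 2D uniform random polygon model (not the authors' code), the sketch below draws n uniform vertices in the unit square, closes the polygon in order, and counts crossings between non-adjacent edges by brute force, which shows the roughly quadratic growth in the average number of crossings mentioned above.

      import random

      def segments_cross(p, q, r, s):
          """True if segments pq and rs properly intersect (shared endpoints ignored)."""
          def orient(a, b, c):
              v = (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
              return (v > 0) - (v < 0)
          return (orient(p, q, r) * orient(p, q, s) < 0 and
                  orient(r, s, p) * orient(r, s, q) < 0)

      def crossings_in_uniform_polygon(n, rng):
          """Draw n uniform vertices in the unit square, close the polygon in order,
          and count crossings between non-adjacent edges."""
          pts = [(rng.random(), rng.random()) for _ in range(n)]
          edges = [(pts[i], pts[(i + 1) % n]) for i in range(n)]
          count = 0
          for i in range(n):
              for j in range(i + 2, n):
                  if i == 0 and j == n - 1:      # these two edges are adjacent around the closure
                      continue
                  if segments_cross(*edges[i], *edges[j]):
                      count += 1
          return count

      if __name__ == "__main__":
          rng = random.Random(11)
          for n in (50, 100, 200):
              avg = sum(crossings_in_uniform_polygon(n, rng) for _ in range(20)) / 20
              print(f"n = {n:3d}   average crossings ~ {avg:8.1f}   (expected growth ~ n^2)")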

  16. Golden Ratio Versus Pi as Random Sequence Sources for Monte Carlo Integration

    NASA Technical Reports Server (NTRS)

    Sen, S. K.; Agarwal, Ravi P.; Shaykhian, Gholam Ali

    2007-01-01

    We discuss here the relative merits of these numbers as possible random sequence sources. The quality of these sequences is not judged directly based on the outcome of all known tests for the randomness of a sequence. Instead, it is determined implicitly by the accuracy of the Monte Carlo integration in a statistical sense. Since our main motive in using a random sequence is to solve real-world problems, it is more desirable to compare the quality of the sequences based on their performance on these problems in terms of the quality/accuracy of the output. We also compare these sources against those generated by a popular pseudo-random generator, viz., the Matlab rand, and by the quasi-random Halton generator, both in terms of error and time complexity. Our study demonstrates that consecutive blocks of digits of each of these numbers produce a good random sequence source. It is observed that randomly chosen blocks of digits do not have any remarkable advantage over consecutive blocks for the accuracy of the Monte Carlo integration. Also, it reveals that pi is a better source of a random sequence than the golden ratio where the accuracy of the integration is concerned.
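    To make the idea concrete, here is a minimal sketch of Monte Carlo integration driven by consecutive blocks of digits of pi, as described above. The integrand, the block length of 8 digits, and the sample size are arbitrary illustrative choices, not the authors' settings.

```python
# Sketch: build a pseudo-uniform sequence from consecutive blocks of digits of
# pi and use it for Monte Carlo integration of f(x) = x**2 on [0, 1]
# (true value 1/3).  Block length and sample size are arbitrary.
from mpmath import mp

N_SAMPLES, BLOCK = 1000, 8                    # 8 digits of pi per uniform deviate
mp.dps = N_SAMPLES * BLOCK + 20               # working precision: enough digits of pi
digits = str(mp.pi).replace("3.", "", 1)[: N_SAMPLES * BLOCK]

uniforms = [int(digits[i * BLOCK:(i + 1) * BLOCK]) / 10**BLOCK
            for i in range(N_SAMPLES)]

estimate = sum(u * u for u in uniforms) / N_SAMPLES
print("MC estimate of the integral of x^2 over [0,1]:", estimate, "(exact = 1/3)")
```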

  17. The Random Forests Statistical Technique: An Examination of Its Value for the Study of Reading

    ERIC Educational Resources Information Center

    Matsuki, Kazunaga; Kuperman, Victor; Van Dyke, Julie A.

    2016-01-01

    Studies investigating individual differences in reading ability often involve data sets containing a large number of collinear predictors and a small number of observations. In this article, we discuss the method of Random Forests and demonstrate its suitability for addressing the statistical concerns raised by such data sets. The method is…

  18. Factors Influencing Hand Washing Behaviour in Primary Schools: Process Evaluation within a Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Chittleborough, Catherine R.; Nicholson, Alexandra L.; Basker, Elaine; Bell, Sarah; Campbell, Rona

    2012-01-01

    This article explores factors that may influence hand washing behaviour among pupils and staff in primary schools. A qualitative process evaluation within a cluster randomized controlled trial included pupil focus groups (n = 16, aged 6-11 years), semi-structured interviews (n = 16 teachers) and observations of hand washing facilities (n = 57).…

  19. The Dirichlet-Multinomial Model for Multivariate Randomized Response Data and Small Samples

    ERIC Educational Resources Information Center

    Avetisyan, Marianna; Fox, Jean-Paul

    2012-01-01

    In survey sampling the randomized response (RR) technique can be used to obtain truthful answers to sensitive questions. Although the individual answers are masked due to the RR technique, individual (sensitive) response rates can be estimated when observing multivariate response data. The beta-binomial model for binary RR data will be generalized…
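    For readers unfamiliar with randomized response, the classical Warner design illustrates the basic idea behind the models discussed above: answers are masked by a randomizing device, yet the population rate remains estimable. The sketch below is a minimal moment estimator under the Warner design with simulated data; it is not the Dirichlet-multinomial model of the record above, and all numbers are invented.

```python
# Sketch of Warner's (1965) randomized response design, a simple forerunner of
# the RR models discussed above.  With probability p a respondent answers the
# sensitive question, otherwise its complement, so individual answers are
# masked but the prevalence is still estimable.  Data are simulated.
import numpy as np

rng = np.random.default_rng(1)
n, p, true_pi = 2000, 0.7, 0.15             # sample size, design prob., true prevalence

trait   = rng.random(n) < true_pi           # latent sensitive trait
direct  = rng.random(n) < p                 # True -> asked the sensitive question
answers = np.where(direct, trait, ~trait)   # observed, masked "yes"/"no"

lam_hat = answers.mean()                                 # observed "yes" rate
pi_hat  = (lam_hat - (1 - p)) / (2 * p - 1)              # moment estimator of prevalence
se      = np.sqrt(lam_hat * (1 - lam_hat) / n) / abs(2 * p - 1)
print(f"estimated prevalence = {pi_hat:.3f} (true {true_pi}), s.e. ~ {se:.3f}")
```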

  20. Monte Carlo Simulation Using HyperCard and Lotus 1-2-3.

    ERIC Educational Resources Information Center

    Oulman, Charles S.; Lee, Motoko Y.

    Monte Carlo simulation is a computer modeling procedure for mimicking observations on a random variable. A random number generator is used in generating the outcome for the events that are being modeled. The simulation can be used to obtain results that otherwise require extensive testing or complicated computations. This paper describes how Monte…

  1. An Interactive Computer Model for Improved Student Understanding of Random Particle Motion and Osmosis

    ERIC Educational Resources Information Center

    Kottonau, Johannes

    2011-01-01

    Effectively teaching the concepts of osmosis to college-level students is a major obstacle in biological education. Therefore, a novel computer model is presented that allows students to observe the random nature of particle motion simultaneously with the seemingly directed net flow of water across a semipermeable membrane during osmotic…

  2. Molecular selection in a unified evolutionary sequence

    NASA Technical Reports Server (NTRS)

    Fox, S. W.

    1986-01-01

    With guidance from experiments and observations that indicate internally limited phenomena, an outline of a unified evolutionary sequence is inferred. Such unification is not visible in a context of random matrix and random mutation. The sequence proceeds from the Big Bang through prebiotic matter and protocells, through the evolving cell via molecular and natural selection, to mind, behavior, and society.

  3. The Effects of Attrition on Baseline Comparability in Randomized Experiments in Education: A Meta-Analysis

    ERIC Educational Resources Information Center

    Valentine, Jeffrey C.; McHugh, Cathleen M.

    2007-01-01

    Using meta-analysis, randomized experiments in education that either clearly did or clearly did not experience student attrition were examined for the baseline comparability of groups. Results from 35 studies suggested that after attrition, the observed measures of baseline comparability of groups did not differ more than would be expected given…

  4. Sensitivity Analysis for Multivalued Treatment Effects: An Example of a Cross-Country Study of Teacher Participation and Job Satisfaction

    ERIC Educational Resources Information Center

    Chang, Chi

    2015-01-01

    It is known that interventions are hard to assign randomly to subjects in social psychological studies, because randomized control is difficult to implement strictly and precisely. Thus, in nonexperimental studies and observational studies, controlling the impact of covariates on the dependent variables and addressing the robustness of the…

  5. Transition from nonresonant to resonant random lasers by the geometrical confinement of disorder.

    PubMed

    Ghofraniha, N; Viola, I; Zacheo, A; Arima, V; Gigli, G; Conti, C

    2013-12-01

    We report on a transition in random lasers that is induced by the geometrical confinement of the emitting material. Different dye doped paper devices with controlled geometry are fabricated by soft lithography and show two distinguished behaviors in the stimulated emission: in the absence of boundary constraints, the energy threshold decreases for larger laser volumes showing the typical trend of diffusive nonresonant random lasers, while when the same material is lithographed into channels, the walls act as cavity and the resonant behavior typical of standard lasers is observed. The experimental results are consistent with the general theories of random and standard lasers and a clear phase diagram of the transition is reported.

  6. Thermodynamics of strain-induced crystallization of random copolymers.

    PubMed

    Nie, Yijing; Gao, Huanhuan; Wu, Yixian; Hu, Wenbing

    2014-01-14

    Industrial semi-crystalline polymers contain various kinds of sequence defects, which behave like non-crystallizable comonomer units on random copolymers. We performed dynamic Monte Carlo simulations of strain-induced crystallization of random copolymers with various contents of comonomers at high temperatures. We observed that the onset strains of crystallization shift up with increasing comonomer content and temperature. This behavior is predicted well by a combination of Flory's theories on the melting-point shift-down of random copolymers and on the melting-point shift-up of strain-induced crystallization. Our thermodynamic results are fundamentally important for understanding rubber strain-hardening, plastic molding, film stretching, and fiber spinning.

  7. Using Random Forest Models to Predict Organizational Violence

    NASA Technical Reports Server (NTRS)

    Levine, Burton; Bobashev, Georgly

    2012-01-01

    We present a methodology to assess the proclivity of an organization to commit violence against nongovernment personnel. We fitted a Random Forest model using the Minority at Risk Organizational Behavior (MAROS) dataset. The MAROS data are longitudinal, so individual observations are not independent. We propose a modification to the standard Random Forest methodology to account for the violation of the independence assumption. We present the results of the model fit, an example of predicting violence for an organization, and finally a summary of the forest in a "meta-tree."
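    The record above does not spell out its modification of the Random Forest procedure. One generic way to respect clustered longitudinal observations, shown below purely as an illustration and not as the authors' method, is to bootstrap whole clusters (here, organizations) rather than individual rows when growing each tree. All data and names are synthetic.

```python
# Hedged sketch: a random forest whose per-tree bootstrap resamples whole
# clusters (organizations) instead of rows, one common way to acknowledge
# non-independent longitudinal observations.  Not the paper's method.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# synthetic longitudinal data: 50 organizations, several observations each
n_clusters = 50
cluster_id = np.repeat(np.arange(n_clusters), rng.integers(3, 8, n_clusters))
X = rng.normal(size=(len(cluster_id), 5))
y = (X[:, 0] + rng.normal(scale=0.5, size=len(cluster_id)) > 0).astype(int)

def cluster_bootstrap_forest(X, y, groups, n_trees=200, max_features="sqrt"):
    trees, ids = [], np.unique(groups)
    for _ in range(n_trees):
        boot = rng.choice(ids, size=len(ids), replace=True)    # resample clusters
        rows = np.concatenate([np.flatnonzero(groups == g) for g in boot])
        tree = DecisionTreeClassifier(max_features=max_features,
                                      random_state=int(rng.integers(1 << 31)))
        trees.append(tree.fit(X[rows], y[rows]))
    return trees

def forest_predict(trees, X):
    votes = np.mean([t.predict(X) for t in trees], axis=0)
    return (votes >= 0.5).astype(int)

forest = cluster_bootstrap_forest(X, y, cluster_id)
print("training accuracy:", (forest_predict(forest, X) == y).mean())
```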

  8. Monte Carlo computer simulations of Venus equilibrium and global resurfacing models

    NASA Technical Reports Server (NTRS)

    Dawson, D. D.; Strom, R. G.; Schaber, G. G.

    1992-01-01

    Two models have been proposed for the resurfacing history of Venus: (1) equilibrium resurfacing and (2) global resurfacing. The equilibrium model consists of two cases: in case 1, areas less than or equal to 0.03 percent of the planet are spatially randomly resurfaced at intervals of less than or equal to 150,000 yr to produce the observed spatially random distribution of impact craters and average surface age of about 500 m.y.; and in case 2, areas greater than or equal to 10 percent of the planet are resurfaced at intervals of greater than or equal to 50 m.y. The global resurfacing model proposes that the entire planet was resurfaced about 500 m.y. ago, destroying the preexisting crater population, followed by significantly reduced volcanism and tectonism. The present crater population has accumulated since then, with only 4 percent of the observed craters having been embayed by more recent lavas. To test the equilibrium resurfacing model we have run several Monte Carlo computer simulations for the two proposed cases. It is shown that the equilibrium resurfacing model is not a valid explanation of the observed crater population characteristics or of Venus' resurfacing history. The global resurfacing model is the most likely explanation for the characteristics of Venus' cratering record. The amount of resurfacing since that event, some 500 m.y. ago, can be estimated by a different type of Monte Carlo simulation. To date, our initial simulation has only considered the easiest case to implement. In this case, the volcanic events are randomly distributed across the entire planet and, therefore, contrary to observation, the flooded craters are also randomly distributed across the planet.

  9. Observed oil and gas field size distributions: A consequence of the discovery process and prices of oil and gas

    USGS Publications Warehouse

    Drew, L.J.; Attanasi, E.D.; Schuenemeyer, J.H.

    1988-01-01

    If observed oil and gas field size distributions are obtained by random sampling, the fitted distributions should approximate that of the parent population of oil and gas fields. However, empirical evidence strongly suggests that larger fields tend to be discovered earlier in the discovery process than they would be by random sampling. Economic factors also can limit the number of small fields that are developed and reported. This paper examines observed size distributions in state and federal waters of offshore Texas. Results of the analysis demonstrate how the shape of the observable size distributions changes with significant hydrocarbon price changes. Comparison of state and federal observed size distributions in the offshore area shows how production cost differences also affect the shape of the observed size distribution. Methods for modifying the discovery rate estimation procedures when economic factors significantly affect the discovery sequence are presented. A primary conclusion of the analysis is that, because hydrocarbon price changes can significantly affect the observed discovery size distribution, one should not be confident about inferring the form and specific parameters of the parent field size distribution from the observed distributions. © 1988 International Association for Mathematical Geology.

  10. Bias Correction and Random Error Characterization for the Assimilation of HRDI Line-of-Sight Wind Measurements

    NASA Technical Reports Server (NTRS)

    Tangborn, Andrew; Menard, Richard; Ortland, David; Einaudi, Franco (Technical Monitor)

    2001-01-01

    A new approach to the analysis of systematic and random observation errors is presented in which the error statistics are obtained using forecast data rather than observations from a different instrument type. The analysis is carried out at an intermediate retrieval level, instead of the more typical state variable space. This method is carried out on measurements made by the High Resolution Doppler Imager (HRDI) on board the Upper Atmosphere Research Satellite (UARS). HRDI, a limb sounder, is the only satellite instrument measuring winds in the stratosphere, and the only instrument of any kind making global wind measurements in the upper atmosphere. HRDI measures Doppler shifts in two different O2 absorption bands (alpha and B), and the retrieved products are the tangent-point line-of-sight wind component (level 2 retrieval) and UV winds (level 3 retrieval). This analysis is carried out on a level 1.9 retrieval, in which the contributions from different points along the line-of-sight have not been removed. Biases are calculated from O-F (observed minus forecast) LOS wind components and are separated into a measurement parameter space consisting of 16 different values. The bias dependence on these parameters (plus an altitude dependence) is used to create a bias correction scheme carried out on the level 1.9 retrieval. The random error component is analyzed by separating the gamma and B band observations and locating observation pairs where both bands are very nearly looking at the same location at the same time. It is shown that the two observation streams are uncorrelated and that this allows the forecast error variance to be estimated. The bias correction is found to cut the effective observation error variance in half.

  11. Random telegraph noise in 2D hexagonal boron nitride dielectric films

    NASA Astrophysics Data System (ADS)

    Ranjan, A.; Puglisi, F. M.; Raghavan, N.; O'Shea, S. J.; Shubhakar, K.; Pavan, P.; Padovani, A.; Larcher, L.; Pey, K. L.

    2018-03-01

    This study reports the observation of low frequency random telegraph noise (RTN) in a 2D layered hexagonal boron nitride dielectric film in the pre- and post-soft breakdown phases using conductive atomic force microscopy as a nanoscale spectroscopy tool. The RTN traces of the virgin and electrically stressed dielectric (after percolation breakdown) were compared, and the signal features were statistically analyzed using the Factorial Hidden Markov Model technique. We observe a combination of both two-level and multi-level RTN signals in h-BN, akin to the trends commonly observed for bulk oxides such as SiO2 and HfO2. Experimental evidence suggests frequent occurrence of unstable and anomalous RTN traces in 2D dielectrics which makes extraction of defect energetics challenging.

  12. Increased Patient Enrollment to a Randomized Surgical Trial Through Equipoise Polling of an Expert Surgeon Panel.

    PubMed

    Ghogawala, Zoher; Schwartz, J Sanford; Benzel, Edward C; Magge, Subu N; Coumans, Jean Valery; Harrington, J Fred; Gelbs, Jared C; Whitmore, Robert G; Butler, William E; Barker, Fred G

    2016-07-01

    To determine whether patients who learned an expert surgeon panel's assessment of equipoise between 2 alternative operative treatments had an increased likelihood of consenting to randomization. Difficulty obtaining patient consent to randomization is an important barrier to conducting surgical randomized clinical trials, the gold standard for generating clinical evidence. Observational study of the rate of patient acceptance of randomization within a 5-center randomized clinical trial comparing lumbar spinal decompression versus lumbar spinal decompression plus instrumented fusion for patients with symptomatic grade I degenerative lumbar spondylolisthesis with spinal stenosis. Eligible patients were enrolled in the trial and then asked to accept randomization. A panel of 10 expert spine surgeons was formed to review clinical information and images for individual patients and provide an assessment of suitability for randomization. The expert panel vote was disclosed to the patient by the patient's surgeon before the patient decided whether to accept randomization. Randomization acceptance among eligible patients without expert panel review was 40% (19/48), compared with 81% (47/58) among patients undergoing expert panel review (P < 0.001). Among expert-reviewed patients, randomization acceptance was 95% when all experts or all except 1 voted for randomization, 75% when 2 experts voted against randomization, and 20% with 3 or 4 votes against (P < 0.001 for trend). Patients provided with an expert panel's assessment of their own suitability for randomization were twice as likely to agree to randomization as patients receiving only their own surgeon's recommendation.

  13. A topological analysis of large-scale structure, studied using the CMASS sample of SDSS-III

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parihar, Prachi; Gott, J. Richard III; Vogeley, Michael S.

    2014-12-01

    We study the three-dimensional genus topology of large-scale structure using the northern region of the CMASS Data Release 10 (DR10) sample of the SDSS-III Baryon Oscillation Spectroscopic Survey. We select galaxies with redshift 0.452 < z < 0.625 and with a stellar mass M_stellar > 10^11.56 M_☉. We study the topology at two smoothing lengths: R_G = 21 h^-1 Mpc and R_G = 34 h^-1 Mpc. The genus topology studied at the R_G = 21 h^-1 Mpc scale results in the highest genus amplitude observed to date. The CMASS sample yields a genus curve that is characteristic of one produced by Gaussian random phase initial conditions. The data thus support the standard model of inflation where random quantum fluctuations in the early universe produced Gaussian random phase initial conditions. Modest deviations in the observed genus from random phase are as expected from shot noise effects and the nonlinear evolution of structure. We suggest the use of a fitting formula motivated by perturbation theory to characterize the shift and asymmetries in the observed genus curve with a single parameter. We construct 54 mock SDSS CMASS surveys along the past light cone from the Horizon Run 3 (HR3) N-body simulations, where gravitationally bound dark matter subhalos are identified as the sites of galaxy formation. We study the genus topology of the HR3 mock surveys with the same geometry and sampling density as the observational sample and find the observed genus topology to be consistent with ΛCDM as simulated by the HR3 mock samples. We conclude that the topology of the large-scale structure in the SDSS CMASS sample is consistent with cosmological models having primordial Gaussian density fluctuations growing in accordance with general relativity to form galaxies in massive dark matter halos.

  14. Statistical power in parallel group point exposure studies with time-to-event outcomes: an empirical comparison of the performance of randomized controlled trials and the inverse probability of treatment weighting (IPTW) approach.

    PubMed

    Austin, Peter C; Schuster, Tibor; Platt, Robert W

    2015-10-15

    Estimating statistical power is an important component of the design of both randomized controlled trials (RCTs) and observational studies. Methods for estimating statistical power in RCTs have been well described and can be implemented simply. In observational studies, statistical methods must be used to remove the effects of confounding that can occur due to non-random treatment assignment. Inverse probability of treatment weighting (IPTW) using the propensity score is an attractive method for estimating the effects of treatment using observational data. However, sample size and power calculations have not been adequately described for these methods. We used an extensive series of Monte Carlo simulations to compare the statistical power of an IPTW analysis of an observational study with time-to-event outcomes with that of an analysis of a similarly-structured RCT. We examined the impact of four factors on the statistical power function: number of observed events, prevalence of treatment, the marginal hazard ratio, and the strength of the treatment-selection process. We found that, on average, an IPTW analysis had lower statistical power compared to an analysis of a similarly-structured RCT. The difference in statistical power increased as the magnitude of the treatment-selection model increased. The statistical power of an IPTW analysis tended to be lower than the statistical power of a similarly-structured RCT.
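    The core of the IPTW analysis compared above is the construction of propensity-score weights. The sketch below shows that step on simulated data with a continuous outcome for brevity (the record above concerns time-to-event outcomes); the data-generating model and effect size are invented for illustration.

```python
# Minimal sketch of inverse probability of treatment weighting (IPTW):
# estimate the propensity score, weight treated subjects by 1/ps and controls
# by 1/(1-ps), and compare weighted group means.  Simulated data; a continuous
# outcome is used here for simplicity rather than a time-to-event outcome.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 5000
x = rng.normal(size=(n, 3))                                     # confounders
p_treat = 1 / (1 + np.exp(-(0.5 * x[:, 0] - 0.3 * x[:, 1])))    # treatment-selection model
z = rng.binomial(1, p_treat)                                    # non-random assignment
y = 1.0 * z + x[:, 0] + rng.normal(size=n)                      # outcome, true effect = 1.0

ps = LogisticRegression().fit(x, z).predict_proba(x)[:, 1]      # estimated propensity score
w = np.where(z == 1, 1 / ps, 1 / (1 - ps))                      # IPTW weights

naive = y[z == 1].mean() - y[z == 0].mean()
iptw  = (np.average(y[z == 1], weights=w[z == 1])
         - np.average(y[z == 0], weights=w[z == 0]))
print(f"naive difference = {naive:.2f}, IPTW estimate = {iptw:.2f} (true effect 1.0)")
```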

  15. Predicting structures in the Zone of Avoidance

    NASA Astrophysics Data System (ADS)

    Sorce, Jenny G.; Colless, Matthew; Kraan-Korteweg, Renée C.; Gottlöber, Stefan

    2017-11-01

    The Zone of Avoidance (ZOA), whose emptiness is an artefact of our Galaxy dust, has been challenging observers as well as theorists for many years. Multiple attempts have been made on the observational side to map this region in order to better understand the local flows. On the theoretical side, however, this region is often simply statistically populated with structures but no real attempt has been made to confront theoretical and observed matter distributions. This paper takes a step forward using constrained realizations (CRs) of the local Universe shown to be perfect substitutes of local Universe-like simulations for smoothed high-density peak studies. Far from generating completely `random' structures in the ZOA, the reconstruction technique arranges matter according to the surrounding environment of this region. More precisely, the mean distributions of structures in a series of constrained and random realizations (RRs) differ: while densities annihilate each other when averaging over 200 RRs, structures persist when summing 200 CRs. The probability distribution function of ZOA grid cells to be highly overdense is a Gaussian with a 15 per cent mean in the random case, while that of the constrained case exhibits large tails. This implies that areas with the largest probabilities host most likely a structure. Comparisons between these predictions and observations, like those of the Puppis 3 cluster, show a remarkable agreement and allow us to assert the presence of the, recently highlighted by observations, Vela supercluster at about 180 h-1 Mpc, right behind the thickest dust layers of our Galaxy.

  16. Efficient Bayesian hierarchical functional data analysis with basis function approximations using Gaussian-Wishart processes.

    PubMed

    Yang, Jingjing; Cox, Dennis D; Lee, Jong Soo; Ren, Peng; Choi, Taeryon

    2017-12-01

    Functional data are defined as realizations of random functions (mostly smooth functions) varying over a continuum, which are usually collected on discretized grids with measurement errors. In order to accurately smooth noisy functional observations and deal with the issue of high-dimensional observation grids, we propose a novel Bayesian method based on the Bayesian hierarchical model with a Gaussian-Wishart process prior and basis function representations. We first derive an induced model for the basis-function coefficients of the functional data, and then use this model to conduct posterior inference through Markov chain Monte Carlo methods. Compared to the standard Bayesian inference that suffers serious computational burden and instability in analyzing high-dimensional functional data, our method greatly improves the computational scalability and stability, while inheriting the advantage of simultaneously smoothing raw observations and estimating the mean-covariance functions in a nonparametric way. In addition, our method can naturally handle functional data observed on random or uncommon grids. Simulation and real studies demonstrate that our method produces similar results to those obtainable by the standard Bayesian inference with low-dimensional common grids, while efficiently smoothing and estimating functional data with random and high-dimensional observation grids when the standard Bayesian inference fails. In conclusion, our method can efficiently smooth and estimate high-dimensional functional data, providing one way to resolve the curse of dimensionality for Bayesian functional data analysis with Gaussian-Wishart processes. © 2017, The International Biometric Society.

  17. Observations of the Middle School Environment: The Context for Student Behavior beyond the Classroom

    ERIC Educational Resources Information Center

    Rusby, Julie C.; Crowley, Ryann; Sprague, Jeffrey; Biglan, Anthony

    2011-01-01

    This article describes the use of an observation system to measure middle school staff practices, environment characteristics, and student behavior in the school common areas. Data were collected at baseline from 18 middle schools participating in a randomized controlled trial of school-wide Positive Behavior Support. The observations were…

  18. Examining Teacher Effectiveness Using Classroom Observation Scores: Evidence from the Randomization of Teachers to Students

    ERIC Educational Resources Information Center

    Garrett, Rachel; Steinberg, Matthew P.

    2015-01-01

    Despite policy efforts to encourage multiple measures of performance in newly developing teacher evaluation systems, practical constraints often result in evaluations based predominantly on formal classroom observations. Yet there is limited knowledge of how these observational measures relate to student achievement. This article leverages the…

  19. Assessing the Generalizability of Randomized Trial Results to Target Populations

    PubMed Central

    Stuart, Elizabeth A.; Bradshaw, Catherine P.; Leaf, Philip J.

    2014-01-01

    Recent years have seen increasing interest in and attention to evidence-based practices, where the “evidence” generally comes from well-conducted randomized trials. However, while those trials yield accurate estimates of the effect of the intervention for the participants in the trial (known as “internal validity”), they do not always yield relevant information about the effects in a particular target population (known as “external validity”). This may be due to a lack of specification of a target population when designing the trial, difficulties recruiting a sample that is representative of a pre-specified target population, or to interest in considering a target population somewhat different from the population directly targeted by the trial. This paper first provides an overview of existing design and analysis methods for assessing and enhancing the ability of a randomized trial to estimate treatment effects in a target population. It then provides a case study using one particular method, which weights the subjects in a randomized trial to match the population on a set of observed characteristics. The case study uses data from a randomized trial of School-wide Positive Behavioral Interventions and Supports (PBIS); our interest is in generalizing the results to the state of Maryland. In the case of PBIS, after weighting, estimated effects in the target population were similar to those observed in the randomized trial. The paper illustrates that statistical methods can be used to assess and enhance the external validity of randomized trials, making the results more applicable to policy and clinical questions. However, there are also many open research questions; future research should focus on questions of treatment effect heterogeneity and further developing these methods for enhancing external validity. Researchers should think carefully about the external validity of randomized trials and be cautious about extrapolating results to specific populations unless they are confident of the similarity between the trial sample and that target population. PMID:25307417

  20. Assessing the generalizability of randomized trial results to target populations.

    PubMed

    Stuart, Elizabeth A; Bradshaw, Catherine P; Leaf, Philip J

    2015-04-01

    Recent years have seen increasing interest in and attention to evidence-based practices, where the "evidence" generally comes from well-conducted randomized trials. However, while those trials yield accurate estimates of the effect of the intervention for the participants in the trial (known as "internal validity"), they do not always yield relevant information about the effects in a particular target population (known as "external validity"). This may be due to a lack of specification of a target population when designing the trial, difficulties recruiting a sample that is representative of a prespecified target population, or to interest in considering a target population somewhat different from the population directly targeted by the trial. This paper first provides an overview of existing design and analysis methods for assessing and enhancing the ability of a randomized trial to estimate treatment effects in a target population. It then provides a case study using one particular method, which weights the subjects in a randomized trial to match the population on a set of observed characteristics. The case study uses data from a randomized trial of school-wide positive behavioral interventions and supports (PBIS); our interest is in generalizing the results to the state of Maryland. In the case of PBIS, after weighting, estimated effects in the target population were similar to those observed in the randomized trial. The paper illustrates that statistical methods can be used to assess and enhance the external validity of randomized trials, making the results more applicable to policy and clinical questions. However, there are also many open research questions; future research should focus on questions of treatment effect heterogeneity and further developing these methods for enhancing external validity. Researchers should think carefully about the external validity of randomized trials and be cautious about extrapolating results to specific populations unless they are confident of the similarity between the trial sample and that target population.
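    As a hedged illustration of the weighting approach described in the two records above, the sketch below models the probability of trial membership from stacked trial and target-population data and weights each trial participant by the inverse odds of membership, so the reweighted trial sample matches the target population on observed covariates. This is one common implementation of the general idea, not necessarily the authors' exact procedure, and all data are simulated.

```python
# Sketch: weight randomized-trial participants to resemble a target population
# on observed characteristics via inverse odds of trial membership.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
X_trial  = rng.normal(loc=0.5, size=(400, 2))      # trial sample (covariates shifted)
X_target = rng.normal(loc=0.0, size=(4000, 2))     # sample from the target population

X = np.vstack([X_trial, X_target])
s = np.r_[np.ones(len(X_trial)), np.zeros(len(X_target))]   # 1 = in the trial

p_trial = LogisticRegression().fit(X, s).predict_proba(X_trial)[:, 1]
w = (1 - p_trial) / p_trial                        # inverse-odds-of-selection weights
w *= len(w) / w.sum()                              # normalize to mean 1

# A covariate mean in the trial, before and after weighting, vs. the population:
print("trial mean        :", round(X_trial[:, 0].mean(), 3))
print("weighted trial    :", round(np.average(X_trial[:, 0], weights=w), 3))
print("target population :", round(X_target[:, 0].mean(), 3))
```

    In practice these weights would then be applied to the trial's outcome analysis to estimate the treatment effect in the target population.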

  1. Random Wiring, Ganglion Cell Mosaics, and the Functional Architecture of the Visual Cortex

    PubMed Central

    Coppola, David; White, Leonard E.; Wolf, Fred

    2015-01-01

    The architecture of iso-orientation domains in the primary visual cortex (V1) of placental carnivores and primates apparently follows species invariant quantitative laws. Dynamical optimization models assuming that neurons coordinate their stimulus preferences throughout cortical circuits linking millions of cells specifically predict these invariants. This might indicate that V1’s intrinsic connectome and its functional architecture adhere to a single optimization principle with high precision and robustness. To validate this hypothesis, it is critical to closely examine the quantitative predictions of alternative candidate theories. Random feedforward wiring within the retino-cortical pathway represents a conceptually appealing alternative to dynamical circuit optimization because random dimension-expanding projections are believed to generically exhibit computationally favorable properties for stimulus representations. Here, we ask whether the quantitative invariants of V1 architecture can be explained as a generic emergent property of random wiring. We generalize and examine the stochastic wiring model proposed by Ringach and coworkers, in which iso-orientation domains in the visual cortex arise through random feedforward connections between semi-regular mosaics of retinal ganglion cells (RGCs) and visual cortical neurons. We derive closed-form expressions for cortical receptive fields and domain layouts predicted by the model for perfectly hexagonal RGC mosaics. Including spatial disorder in the RGC positions considerably changes the domain layout properties as a function of disorder parameters such as position scatter and its correlations across the retina. However, independent of parameter choice, we find that the model predictions substantially deviate from the layout laws of iso-orientation domains observed experimentally. Considering random wiring with the currently most realistic model of RGC mosaic layouts, a pairwise interacting point process, the predicted layouts remain distinct from experimental observations and resemble Gaussian random fields. We conclude that V1 layout invariants are specific quantitative signatures of visual cortical optimization, which cannot be explained by generic random feedforward-wiring models. PMID:26575467

  2. High-flavanol and high-theobromine versus low-flavanol and low-theobromine chocolate to improve uterine artery pulsatility index: a double blind randomized clinical trial.

    PubMed

    Bujold, Emmanuel; Leblanc, Vicky; Lavoie-Lebel, Élise; Babar, Asma; Girard, Mario; Poungui, Lionel; Blanchet, Claudine; Marc, Isabelle; Lemieux, Simone; Belkacem, Abdous; Sidi, Elhadji Laouan; Dodin, Sylvie

    2017-09-01

    To evaluate the impact of high-flavanol and high-theobromine (HFHT) chocolate in women at risk of preeclampsia (PE). We conducted a single-center randomized controlled trial including women with singleton pregnancy between 11 and 14 weeks gestation who had bilateral abnormal uterine artery (UtA) waveforms (notching) and elevated pulsatility index (PI). Participants were randomized to either HFHT or low-flavanol and low-theobromine (LFLT) chocolate (30 grams daily for a total of 12 weeks). UtA PI, reported as multiple of medians (MoM) adjusted for gestational age, was assessed at baseline and 12 weeks after randomization. One hundred thirty-one women were randomized with mean gestational age of 12.4 ± 0.6 weeks and a mean UtA PI of 1.39 ± 0.31 MoM. UtA PI adjusted for gestational age significantly decreased from baseline to the second visit (12 weeks later) in the two groups (p < 0.0001) but no significant difference was observed between the groups (p = 0.16). Compared with LFLT chocolate, daily intake of HFHT chocolate was not associated with significant changes of UtA PI. Nevertheless, the improvement observed in both groups suggests that chocolate could improve placental function independently of flavanol and/or theobromine content.

  3. Changing friend selection in middle school: A social network analysis of a randomized intervention study designed to prevent adolescent problem behavior

    PubMed Central

    DeLay, Dawn; Ha, Thao; Van Ryzin, Mark; Winter, Charlotte; Dishion, Thomas J.

    2015-01-01

    Adolescent friendships that promote problem behavior are often chosen in middle school. The current study examines the unintended impact of a randomized school based intervention on the selection of friends in middle school, as well as on observations of deviant talk with friends five years later. Participants included 998 middle school students (526 boys and 472 girls) recruited at the onset of middle school (age 11-12 years) from three public middle schools participating in the Family Check-up model intervention. The current study focuses only on the effects of the SHAPe curriculum—one level of the Family Check-up model—on friendship choices. Participants nominated friends and completed measures of deviant peer affiliation. Approximately half of the sample (n=500) was randomly assigned to the intervention and the other half (n=498) comprised the control group within each school. The results indicate that the SHAPe curriculum affected friend selection within School 1, but not within Schools 2 or 3. The effects of friend selection in School 1 translated into reductions in observed deviancy training five years later (age 16-17 years). By coupling longitudinal social network analysis with a randomized intervention study the current findings provide initial evidence that a randomized public middle school intervention can disrupt the formation of deviant peer groups and diminish levels of adolescent deviance five years later. PMID:26377235

  4. Quantifying the value of redundant measurements at GCOS Reference Upper-Air Network sites

    DOE PAGES

    Madonna, F.; Rosoldi, M.; Güldner, J.; ...

    2014-11-19

    The potential for measurement redundancy to reduce uncertainty in atmospheric variables has not been investigated comprehensively for climate observations. We evaluated the usefulness of entropy and mutual correlation concepts, as defined in information theory, for quantifying random uncertainty and redundancy in time series of the integrated water vapour (IWV) and water vapour mixing ratio profiles provided by five highly instrumented GRUAN (GCOS, Global Climate Observing System, Reference Upper-Air Network) stations in 2010–2012. Results show that the random uncertainties on the IWV measured with radiosondes, global positioning system, microwave and infrared radiometers, and Raman lidar measurements differed by less than 8%. Comparisons of time series of IWV content from ground-based remote sensing instruments with in situ soundings showed that microwave radiometers have the highest redundancy with the IWV time series measured by radiosondes and therefore the highest potential to reduce the random uncertainty of the radiosonde time series. Moreover, the random uncertainty of a time series from one instrument can be reduced by ~ 60% by constraining the measurements with those from another instrument. The best reduction of random uncertainty is achieved by conditioning Raman lidar measurements with microwave radiometer measurements. In conclusion, specific instruments are recommended for atmospheric water vapour measurements at GRUAN sites. This approach can be applied to the study of redundant measurements for other climate variables.
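    To illustrate the information-theoretic quantities invoked above, the sketch below computes a histogram-based Shannon entropy for one discretized series and the mutual information between two co-located series, which is what quantifies their redundancy. The synthetic "radiosonde" and "radiometer" series, bin counts, and noise levels are assumptions for illustration, not GRUAN data.

```python
# Sketch: Shannon entropy of a discretized time series and mutual information
# between two noisy measurements of the same underlying signal (redundancy).
import numpy as np

def discretize(x, bins=20):
    return np.digitize(x, np.histogram_bin_edges(x, bins=bins))

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(a, b, bins=20):
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz]))

rng = np.random.default_rng(4)
truth = np.cumsum(rng.normal(size=2000))            # underlying "true" variation
radiosonde = truth + rng.normal(scale=1.0, size=2000)
radiometer = truth + rng.normal(scale=1.0, size=2000)

print("entropy (radiosonde, bits):", round(entropy(discretize(radiosonde)), 2))
print("mutual information (bits) :", round(mutual_information(radiosonde, radiometer), 2))
```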

  5. Random myosin loss along thick-filaments increases myosin attachment time and the proportion of bound myosin heads to mitigate force decline in skeletal muscle

    PubMed Central

    Tanner, Bertrand C.W.; McNabb, Mark; Palmer, Bradley M.; Toth, Michael J.; Miller, Mark S.

    2014-01-01

    Diminished skeletal muscle performance with aging, disuse, and disease may be partially attributed to the loss of myofilament proteins. Several laboratories have found a disproportionate loss of myosin protein content relative to other myofilament proteins, but due to methodological limitations, the structural manifestation of this protein loss is unknown. To investigate how variations in myosin content affect ensemble cross-bridge behavior and force production, we simulated muscle contraction in the half-sarcomere as myosin was removed either i) uniformly, from the Z-line end of thick-filaments, or ii) randomly, along the length of thick-filaments. Uniform myosin removal decreased force production, showing a slightly steeper force-to-myosin content relationship than the 1:1 relationship that would be expected from the loss of cross-bridges. Random myosin removal also decreased force production, but this decrease was less than observed with uniform myosin loss, largely due to increased myosin attachment time (ton) and fractional cross-bridge binding with random myosin loss. These findings support our prior observations that prolonged ton may augment force production in single fibers with randomly reduced myosin content from chronic heart failure patients. These simulations also illustrate that the pattern of myosin loss along thick-filaments influences ensemble cross-bridge behavior and maintenance of force throughout the sarcomere. PMID:24486373

  6. Exercise and Bone Mineral Density in Premenopausal Women: A Meta-Analysis of Randomized Controlled Trials

    PubMed Central

    Kelley, George A.; Kelley, Kristi S.; Kohrt, Wendy M.

    2013-01-01

    Objective. Examine the effects of exercise on femoral neck (FN) and lumbar spine (LS) bone mineral density (BMD) in premenopausal women. Methods. Meta-analysis of randomized controlled exercise trials ≥24 weeks in premenopausal women. Standardized effect sizes (g) were calculated for each result and pooled using random-effects models, Z score alpha values, 95% confidence intervals (CIs), and number needed to treat (NNT). Heterogeneity was examined using Q and I². Moderator and predictor analyses using mixed-effects ANOVA and simple metaregression were conducted. Statistical significance was set at P ≤ 0.05. Results. Statistically significant improvements were found for both FN (7 g's, 466 participants, g = 0.342, 95% CI = 0.132, 0.553, P = 0.001, Q = 10.8, P = 0.22, I² = 25.7%, NNT = 5) and LS (6 g's, 402 participants, g = 0.201, 95% CI = 0.009, 0.394, P = 0.04, Q = 3.3, P = 0.65, I² = 0%, NNT = 9) BMD. A trend for greater benefits in FN BMD was observed for studies published in countries other than the United States and for those who participated in home versus facility-based exercise. Statistically significant, or a trend for statistically significant, associations were observed for 7 different moderators and predictors, 6 for FN BMD and 1 for LS BMD. Conclusions. Exercise benefits FN and LS BMD in premenopausal women. The observed moderators and predictors deserve further investigation in well-designed randomized controlled trials. PMID:23401684

  7. Exercise and bone mineral density in premenopausal women: a meta-analysis of randomized controlled trials.

    PubMed

    Kelley, George A; Kelley, Kristi S; Kohrt, Wendy M

    2013-01-01

    Objective. Examine the effects of exercise on femoral neck (FN) and lumbar spine (LS) bone mineral density (BMD) in premenopausal women. Methods. Meta-analysis of randomized controlled exercise trials ≥24 weeks in premenopausal women. Standardized effect sizes (g) were calculated for each result and pooled using random-effects models, Z score alpha values, 95% confidence intervals (CIs), and number needed to treat (NNT). Heterogeneity was examined using Q and I². Moderator and predictor analyses using mixed-effects ANOVA and simple metaregression were conducted. Statistical significance was set at P ≤ 0.05. Results. Statistically significant improvements were found for both FN (7 g's, 466 participants, g = 0.342, 95% CI = 0.132, 0.553, P = 0.001, Q = 10.8, P = 0.22, I² = 25.7%, NNT = 5) and LS (6 g's, 402 participants, g = 0.201, 95% CI = 0.009, 0.394, P = 0.04, Q = 3.3, P = 0.65, I² = 0%, NNT = 9) BMD. A trend for greater benefits in FN BMD was observed for studies published in countries other than the United States and for those who participated in home versus facility-based exercise. Statistically significant, or a trend for statistically significant, associations were observed for 7 different moderators and predictors, 6 for FN BMD and 1 for LS BMD. Conclusions. Exercise benefits FN and LS BMD in premenopausal women. The observed moderators and predictors deserve further investigation in well-designed randomized controlled trials.
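    The random-effects pooling used in the two records above can be illustrated with the DerSimonian-Laird estimator. The effect sizes and variances in the sketch below are invented placeholders, not the study data; the pooling arithmetic itself is the standard procedure.

```python
# Minimal DerSimonian-Laird random-effects pooling of standardized effect
# sizes g, with Cochran's Q and I^2 heterogeneity statistics.
import numpy as np
from scipy import stats

g = np.array([0.40, 0.25, 0.10, 0.55, 0.30, 0.20, 0.45])   # hypothetical effect sizes
v = np.array([0.04, 0.05, 0.06, 0.03, 0.05, 0.04, 0.06])   # their sampling variances

w_fixed = 1 / v
q = np.sum(w_fixed * (g - np.average(g, weights=w_fixed)) ** 2)   # Cochran's Q
df = len(g) - 1
c = w_fixed.sum() - (w_fixed ** 2).sum() / w_fixed.sum()
tau2 = max(0.0, (q - df) / c)                  # between-study variance (DL estimator)
i2 = max(0.0, (q - df) / q) * 100              # I^2 statistic

w = 1 / (v + tau2)                             # random-effects weights
g_pooled = np.average(g, weights=w)
se = np.sqrt(1 / w.sum())
p = 2 * stats.norm.sf(abs(g_pooled / se))
ci = (g_pooled - 1.96 * se, g_pooled + 1.96 * se)
print(f"pooled g = {g_pooled:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f}), "
      f"p = {p:.3f}, I^2 = {i2:.1f}%")
```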

  8. An observational study showed that explaining randomization using gambling-related metaphors and computer-agency descriptions impeded randomized clinical trial recruitment.

    PubMed

    Jepson, Marcus; Elliott, Daisy; Conefrey, Carmel; Wade, Julia; Rooshenas, Leila; Wilson, Caroline; Beard, David; Blazeby, Jane M; Birtle, Alison; Halliday, Alison; Stein, Rob; Donovan, Jenny L

    2018-07-01

    To explore how the concept of randomization is described by clinicians and understood by patients in randomized controlled trials (RCTs) and how it contributes to patient understanding and recruitment. Qualitative analysis of 73 audio recordings of recruitment consultations from five, multicenter, UK-based RCTs with identified or anticipated recruitment difficulties. One in 10 appointments did not include any mention of randomization. Most included a description of the method or process of allocation. Descriptions often made reference to gambling-related metaphors or similes, or referred to allocation by a computer. Where reference was made to a computer, some patients assumed that they would receive the treatment that was "best for them". Descriptions of the rationale for randomization were rarely present and often only came about as a consequence of patients questioning the reason for a random allocation. The methods and processes of randomization were usually described by recruiters, but often without clarity, which could lead to patient misunderstanding. The rationale for randomization was rarely mentioned. Recruiters should avoid problematic gambling metaphors and illusions of agency in their explanations and instead focus on clearer descriptions of the rationale and method of randomization to ensure patients are better informed about randomization and RCT participation. Copyright © 2018 University of Bristol. Published by Elsevier Inc. All rights reserved.

  9. Reducing Playground Bullying and Supporting Beliefs: An Experimental Trial of the Steps to Respect Program

    ERIC Educational Resources Information Center

    Frey, Karin S.; Hirschstein, Miriam K.; Snell, Jennie L.; Edstrom, Leihua Van Schoiack; MacKenzie, Elizabeth P.; Broderick, Carole J.

    2005-01-01

    Six schools were randomly assigned to a multilevel bullying intervention or a control condition. Children in Grades 3-6 (N=1,023) completed pre- and posttest surveys of behaviors and beliefs and were rated by teachers. Observers coded playground behavior of a random subsample (n=544). Hierarchical analyses of changes in playground behavior…

  10. Validating Components of Teacher Effectiveness: A Random Assignment Study of Value-Added, Observation, and Survey Scores

    ERIC Educational Resources Information Center

    Bacher-Hicks, Andrew; Chin, Mark; Kane, Thomas J.; Staiger, Douglas O.

    2015-01-01

    Policy changes from the past decade have resulted in a growing interest in identifying effective teachers and their characteristics. This study is the third study to use data from a randomized experiment to test the validity of measures of teacher effectiveness. The authors collected effectiveness measures across three school years from three…

  11. Implementation of a Manualized Communication Intervention for School-Aged Children with Pragmatic and Social Communication Needs in a Randomized Controlled Trial: The Social Communication Intervention Project

    ERIC Educational Resources Information Center

    Adams, Catherine; Lockton, Elaine; Gaile, Jacqueline; Earl, Gillian; Freed, Jenny

    2012-01-01

    Background: Speech-language interventions are often complex in nature, involving multiple observations, variable outcomes and individualization in treatment delivery. The accepted procedure associated with randomized controlled trials (RCT) of such complex interventions is to develop and implement a manual of intervention in order that reliable…

  12. Midwives Performance in Early Detection of Growth and Development Irregularities of Children Based on Task Commitment

    ERIC Educational Resources Information Center

    Utami, Sri; Nursalam; Hargono, Rachmat; Susilaningrum, Rekawati

    2016-01-01

    The purpose of this study was to analyze the performance of midwives based on task commitment. This was an observational analytic study with a cross-sectional approach. Multistage random sampling was used to determine the public health centers, and proportional random sampling to select participants. The samples were 222 midwives in the public health…

  13. Bayesian random-effect model for predicting outcome fraught with heterogeneity--an illustration with episodes of 44 patients with intractable epilepsy.

    PubMed

    Yen, A M-F; Liou, H-H; Lin, H-L; Chen, T H-H

    2006-01-01

    The study aimed to develop a predictive model to deal with data fraught with heterogeneity that cannot be explained by sampling variation or measured covariates. The random-effect Poisson regression model was first proposed to deal with over-dispersion for data fraught with heterogeneity after making allowance for measured covariates. A Bayesian acyclic graphical model in conjunction with the Markov chain Monte Carlo (MCMC) technique was then applied to estimate the parameters of both the relevant covariates and the random effect. A predictive distribution was then generated to compare the predicted with the observed for the Bayesian model with and without random effect. Data from repeated measurement of episodes among 44 patients with intractable epilepsy were used as an illustration. Applying Poisson regression to the epilepsy data without taking heterogeneity into account yielded a large value of heterogeneity (heterogeneity factor = 17.90, deviance = 1485, degrees of freedom (df) = 83). After taking the random effect into account, the heterogeneity factor was greatly reduced (heterogeneity factor = 0.52, deviance = 42.5, df = 81). The Pearson χ² values for the comparison between expected and observed seizure frequencies at two and three months, for the model with and without random effect, were 34.27 (p = 1.00) and 1799.90 (p < 0.0001), respectively. The Bayesian acyclic model using the MCMC method was demonstrated to have great potential for disease prediction when data show over-dispersion attributable either to correlated properties or to subject-to-subject variability.
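    The heterogeneity factor quoted above is the Poisson deviance divided by its residual degrees of freedom; values far above 1 signal over-dispersion. The sketch below reproduces only that diagnostic on simulated over-dispersed counts using statsmodels; the Bayesian random-effect model itself would require an MCMC sampler and is not reproduced here.

```python
# Sketch of the over-dispersion diagnostic (heterogeneity factor = deviance/df)
# for a plain Poisson regression fitted to simulated over-dispersed counts.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 200
x = rng.normal(size=n)
subject_effect = rng.normal(scale=0.8, size=n)          # unmeasured heterogeneity
mu = np.exp(0.5 + 0.3 * x + subject_effect)
y = rng.poisson(mu)                                     # over-dispersed counts

X = sm.add_constant(x)
fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
heterogeneity_factor = fit.deviance / fit.df_resid      # >> 1 signals over-dispersion
print(f"deviance = {fit.deviance:.1f}, df = {fit.df_resid:.0f}, "
      f"heterogeneity factor = {heterogeneity_factor:.2f}")
```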

  14. Security and composability of randomness expansion from Bell inequalities

    NASA Astrophysics Data System (ADS)

    Fehr, Serge; Gelles, Ran; Schaffner, Christian

    2013-01-01

    The nonlocal behavior of quantum mechanics can be used to generate guaranteed fresh randomness from an untrusted device that consists of two nonsignalling components; since the generation process requires some initial fresh randomness to act as a catalyst, one also speaks of randomness expansion. R. Colbeck and A. Kent [J. Phys. A 44, 095305 (2011)] proposed the first method for generating randomness from untrusted devices, but without providing a rigorous analysis. This was addressed subsequently by S. Pironio [Nature (London) 464, 1021 (2010)], who aimed at deriving a lower bound on the min-entropy of the data extracted from an untrusted device based only on the observed nonlocal behavior of the device. Although that article succeeded in developing important tools for reaching the stated goal, the proof itself contained a bug, and the given formal claim on the guaranteed amount of min-entropy needs to be revisited. In this paper we build on the tools provided by Pironio and obtain a meaningful lower bound on the min-entropy of the data produced by an untrusted device based on the observed nonlocal behavior of the device. Our main result confirms the essence of the (improperly formulated) claims of Pironio and puts them on solid ground. We also address the question of composability and show that different untrusted devices can be composed in an alternating manner under the assumption that they are not entangled. This enables superpolynomial randomness expansion based on two untrusted yet unentangled devices.
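    For intuition about the kind of bound at issue above, the sketch below evaluates the asymptotic min-entropy rate per device use as a function of the observed CHSH value S, in the analytic form commonly quoted from the Pironio et al. line of work (finite-statistics corrections omitted). Treat the formula as an assumed illustration of the bound's shape, not as a substitute for the corrected analysis the abstract describes.

```python
# Sketch: min-entropy certified per run as a function of the observed CHSH
# value S (2 < S <= 2*sqrt(2)), using the commonly quoted asymptotic form
# f(S) = 1 - log2(1 + sqrt(2 - S^2/4)).  Illustration only.
import math

def min_entropy_rate(S):
    """Approximate lower bound on certified random bits per device use."""
    if S <= 2:
        return 0.0          # classically achievable value: no certified randomness
    return 1 - math.log2(1 + math.sqrt(2 - S * S / 4))

for S in (2.0, 2.2, 2.5, 2 * math.sqrt(2)):
    print(f"S = {S:.3f}  ->  certified bits per run >= {min_entropy_rate(S):.3f}")
```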

  15. Decreased resting-state brain activity complexity in schizophrenia characterized by both increased regularity and randomness.

    PubMed

    Yang, Albert C; Hong, Chen-Jee; Liou, Yin-Jay; Huang, Kai-Lin; Huang, Chu-Chung; Liu, Mu-En; Lo, Men-Tzung; Huang, Norden E; Peng, Chung-Kang; Lin, Ching-Po; Tsai, Shih-Jen

    2015-06-01

    Schizophrenia is characterized by heterogeneous pathophysiology. Using multiscale entropy (MSE) analysis, which enables capturing complex dynamics of time series, we characterized MSE patterns of blood-oxygen-level-dependent (BOLD) signals across different time scales and determined whether BOLD activity in patients with schizophrenia exhibits increased complexity (increased entropy in all time scales), decreased complexity toward regularity (decreased entropy in all time scales), or decreased complexity toward uncorrelated randomness (high entropy in short time scales followed by decayed entropy as the time scale increases). We recruited 105 patients with schizophrenia with an age of onset between 18 and 35 years and 210 age- and sex-matched healthy volunteers. Results showed that MSE of BOLD signals in patients with schizophrenia exhibited two routes of decreased BOLD complexity toward either regular or random patterns. Reduced BOLD complexity toward regular patterns was observed in the cerebellum and temporal, middle, and superior frontal regions, and reduced BOLD complexity toward randomness was observed extensively in the inferior frontal, occipital, and postcentral cortices as well as in the insula and middle cingulum. Furthermore, we determined that the two types of complexity change were associated differently with psychopathology; specifically, the regular type of BOLD complexity change was associated with positive symptoms of schizophrenia, whereas the randomness type of BOLD complexity was associated with negative symptoms of the illness. These results collectively suggested that resting-state dynamics in schizophrenia exhibit two routes of pathologic change toward regular or random patterns, which contribute to the differences in syndrome domains of psychosis in patients with schizophrenia. © 2015 Wiley Periodicals, Inc.
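    A hedged sketch of the multiscale entropy (MSE) analysis named above: coarse-grain the series at each scale and compute sample entropy. The parameters (m = 2, tolerance r = 0.15 times the original series' standard deviation) follow common practice rather than the study's exact settings, and the input here is synthetic noise, not BOLD data.

```python
# Sketch of multiscale entropy: sample entropy of coarse-grained series.
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy with embedding dimension m and tolerance r."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.15 * x.std()
    n = len(x)

    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(n - mm)])
        total = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            total += np.sum(d <= r)
        return total

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.nan

def coarse_grain(x, scale):
    n = (len(x) // scale) * scale
    return x[:n].reshape(-1, scale).mean(axis=1)

def multiscale_entropy(x, max_scale=5, m=2):
    r = 0.15 * np.std(x)            # tolerance fixed from the original series
    return [sample_entropy(coarse_grain(x, s), m=m, r=r)
            for s in range(1, max_scale + 1)]

rng = np.random.default_rng(6)
white = rng.normal(size=3000)       # uncorrelated noise as a stand-in signal
print("MSE of white noise:", np.round(multiscale_entropy(white), 2))
# High entropy at scale 1 that decays with scale is the signature of
# "decreased complexity toward uncorrelated randomness" described above.
```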

  16. Modeling and Predicting the Stress Relaxation of Composites with Short and Randomly Oriented Fibers

    PubMed Central

    Obaid, Numaira; Sain, Mohini

    2017-01-01

    The addition of short fibers has been experimentally observed to slow the stress relaxation of viscoelastic polymers, producing a change in the relaxation time constant. Our recent study attributed this effect of fibers on stress relaxation behavior to the interfacial shear stress transfer at the fiber-matrix interface. This model explained the effect of fiber addition on stress relaxation without the need to postulate structural changes at the interface. In our previous study, we developed an analytical model for the effect of fully aligned short fibers, and the model predictions were successfully compared to finite element simulations. However, in most industrial applications of short-fiber composites, fibers are not aligned, and hence it is necessary to examine the time dependence of viscoelastic polymers containing randomly oriented short fibers. In this study, we propose an analytical model to predict the stress relaxation behavior of short-fiber composites where the fibers are randomly oriented. The model predictions were compared to results obtained from Monte Carlo finite element simulations, and good agreement between the two was observed. The analytical model provides an excellent tool to accurately predict the stress relaxation behavior of randomly oriented short-fiber composites. PMID:29053601

  17. Six-day randomized safety trial of intravaginal lime juice.

    PubMed

    Mauck, Christine K; Ballagh, Susan A; Creinin, Mitchell D; Weiner, Debra H; Doncel, Gustavo F; Fichorova, Raina N; Schwartz, Jill L; Chandra, Neelima; Callahan, Marianne M

    2008-11-01

    Nigerian women reportedly apply lime juice intravaginally to protect themselves against HIV. In vitro data suggest that lime juice is virucidal, but only at cytotoxic concentrations. This is the first controlled, randomized safety trial of lime juice applied to the human vagina. Forty-seven women were randomized to apply water or lime juice (25%, 50%, or undiluted) intravaginally twice daily for two 6-day intervals, separated by a 3-week washout period. Product application also was randomized: during 1 interval, product was applied using a saturated tampon and in the other by douche. Vaginal pH, symptoms, signs of irritation observed via naked eye examination and colposcopy, microflora, and markers of inflammation in cervicovaginal lavages were evaluated after 1 hour and on days 3 and 7. The largest reduction in pH was about one-half a pH unit, seen 1 hour after douching with 100% lime juice. We observed a dose-dependent pattern of symptoms and clinical and laboratory findings that were consistent with a compromised vaginal barrier function. The brief reduction in pH after vaginal lime juice application is unlikely to be virucidal in the presence of semen. Lime juice is unlikely to protect against HIV and may actually be harmful.

  18. Mendelian Randomization.

    PubMed

    Grover, Sandeep; Del Greco M, Fabiola; Stein, Catherine M; Ziegler, Andreas

    2017-01-01

    Confounding and reverse causality have prevented us from drawing meaningful clinical interpretations even in well-powered observational studies. Confounding may be attributed to our inability to randomize the exposure variable in observational studies. Mendelian randomization (MR) is one approach to overcome confounding. It utilizes one or more genetic polymorphisms as a proxy for the exposure variable of interest. Polymorphisms are randomly distributed in a population, they are static throughout an individual's lifetime, and they may thus help in inferring directionality in exposure-outcome associations. Genome-wide association studies (GWAS) or meta-analyses of GWAS are characterized by large sample sizes and the availability of many single nucleotide polymorphisms (SNPs), making GWAS-based MR an attractive approach. GWAS-based MR comes with specific challenges, including multiple causality. Despite these shortcomings, it remains one of the most powerful techniques for inferring causality. With MR still an evolving concept with complex statistical challenges, the literature is relatively scarce in terms of working examples incorporating real datasets. In this chapter, we provide a step-by-step guide for causal inference based on the principles of MR with a real dataset, using both individual and summary data from unrelated individuals. We suggest best possible practices and give recommendations based on the current literature.
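    Two core summary-data MR estimators mentioned in this literature, the per-SNP Wald ratio and their inverse-variance-weighted (IVW) combination, can be sketched in a few lines. The per-SNP effect sizes below are invented for illustration and are not from any real GWAS.

```python
# Sketch: Wald ratio per SNP and inverse-variance-weighted (IVW) combination
# for summary-data Mendelian randomization.  All numbers are hypothetical.
import numpy as np

beta_gx = np.array([0.10, 0.08, 0.12, 0.09])      # SNP -> exposure associations
beta_gy = np.array([0.020, 0.015, 0.030, 0.010])  # SNP -> outcome associations
se_gy   = np.array([0.010, 0.012, 0.015, 0.011])  # SEs of the SNP-outcome betas

wald = beta_gy / beta_gx                          # per-SNP causal-effect estimates
se_wald = se_gy / np.abs(beta_gx)                 # first-order SE approximation

w = 1 / se_wald**2                                # inverse-variance weights
ivw = np.sum(w * wald) / np.sum(w)
ivw_se = np.sqrt(1 / np.sum(w))
print("per-SNP Wald ratios :", np.round(wald, 3))
print(f"IVW causal estimate = {ivw:.3f} (SE {ivw_se:.3f})")
```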

  19. Genetically determined schizophrenia is not associated with impaired glucose homeostasis.

    PubMed

    Polimanti, Renato; Gelernter, Joel; Stein, Dan J

    2018-05-01

    Here, we used data from large genome-wide association studies to test for causal relationships (by conducting a Mendelian randomization analysis) and for shared molecular mechanisms (by calculating the genetic correlation) among schizophrenia, type 2 diabetes (T2D), and impaired glucose homeostasis. Although our Mendelian randomization analysis was well powered, no causal relationship was observed between schizophrenia and T2D or traits related to impaired glucose homeostasis. Similarly, we did not observe any global genetic overlap among these traits. These findings indicate that there are no causal relationships or shared mechanisms between schizophrenia and impaired glucose homeostasis. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Effect of clonidine on the efficacy of lignocaine local anesthesia in dentistry: A systematic review and meta-analysis of randomized, controlled trials.

    PubMed

    Sivaramakrishnan, Gowri; Sridharan, Kannan

    2018-05-01

    Alternatives to adrenaline with lignocaine local anesthesia, such as clonidine, have been trialed in various randomized, controlled trials. Therefore, the aim of the present systematic review was to compile the available evidence on using clonidine with lignocaine for dental anesthesia. Electronic databases were searched for eligible studies. A data-extraction form was created, and extracted data were analyzed in non-Cochrane mode in RevMan 5.3 software. Heterogeneity between the studies was assessed using the forest plot, the I² statistic (where >50% was considered moderate-to-severe heterogeneity), and the χ²-test. Random-effects models were used because of moderate heterogeneity. Five studies were included in the final review. While clonidine was found to significantly shorten the onset of local anesthesia when measured subjectively, no significant difference was observed objectively. No significant difference was observed in duration or postoperative analgesia. Stable hemodynamic parameters within the safe range were observed postoperatively when clonidine was used. Clonidine could be considered as an alternative to adrenaline in cases of contraindications to adrenaline, such as cardiac abnormalities, hypertension, and diabetes. © 2017 John Wiley & Sons Australia, Ltd.
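
    Because the review pooled studies with a random-effects model on account of moderate heterogeneity, a compact way to see the underlying calculation is the DerSimonian-Laird procedure: Cochran's Q, the I² statistic, the between-study variance τ², and the re-weighted pooled estimate. The effect sizes and variances below are hypothetical, not extracted from the included trials.

    ```python
    import numpy as np

    # Hypothetical per-study effect sizes (e.g., mean differences in onset time)
    # and their variances; not data from the cited review.
    y = np.array([-0.8, -0.3, -1.1, -0.5, -0.2])
    v = np.array([0.10, 0.08, 0.15, 0.12, 0.09])

    # Fixed-effect weights and Cochran's Q statistic.
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)
    k = len(y)
    I2 = max(0.0, (Q - (k - 1)) / Q) * 100          # I^2 heterogeneity (%)

    # DerSimonian-Laird estimate of the between-study variance tau^2.
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

    # Random-effects pooled estimate with updated weights.
    w_re = 1.0 / (v + tau2)
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))

    print(f"I^2 = {I2:.1f}%, tau^2 = {tau2:.3f}")
    print(f"Random-effects estimate: {y_re:.2f} "
          f"(95% CI {y_re - 1.96 * se_re:.2f} to {y_re + 1.96 * se_re:.2f})")
    ```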

  1. Clinical outcomes research in gynecologic oncology.

    PubMed

    Melamed, Alexander; Rauh-Hain, J Alejandro; Schorge, John O

    2017-09-01

    Clinical outcomes research seeks to understand the real-world manifestations of clinical care. In particular, outcomes research seeks to reveal the effects of pharmaceutical, procedural, and structural aspects of healthcare on patient outcomes, including mortality, disease control, toxicity, cost, and quality of life. Although outcomes research can utilize interventional study designs, insightful use of observational data is a defining feature of this field. Many questions in gynecologic oncology are not amenable to investigation in randomized clinical trials due to cost, feasibility, or ethical concerns. When a randomized trial is not practical or has not yet been conducted, well-designed observational studies have the potential to provide the best available evidence about the effects of clinical care. Such studies may use surveys, medical records, disease registries, and a variety of administrative data sources. Even when a randomized trial has been conducted, observational studies can be used to estimate the real-world effect of an intervention, which may differ from the results obtained in the controlled setting of a clinical trial. This article reviews the goals, methodologies, data sources, and limitations of clinical outcomes research, with a focus on gynecologic oncology. Copyright © 2017. Published by Elsevier Inc.

  2. Rituximab after Autologous Stem-Cell Transplantation in Mantle-Cell Lymphoma.

    PubMed

    Le Gouill, Steven; Thieblemont, Catherine; Oberic, Lucie; Moreau, Anne; Bouabdallah, Krimo; Dartigeas, Caroline; Damaj, Gandhi; Gastinne, Thomas; Ribrag, Vincent; Feugier, Pierre; Casasnovas, Olivier; Zerazhi, Hacène; Haioun, Corinne; Maisonneuve, Hervé; Houot, Roch; Jardin, Fabrice; Van Den Neste, Eric; Tournilhac, Olivier; Le Dû, Katell; Morschhauser, Franck; Cartron, Guillaume; Fornecker, Luc-Matthieu; Canioni, Danielle; Callanan, Mary; Béné, Marie C; Salles, Gilles; Tilly, Hervé; Lamy, Thierry; Gressin, Remy; Hermine, Olivier

    2017-09-28

    Mantle-cell lymphoma is generally incurable. Despite high rates of complete response after initial immunochemotherapy followed by autologous stem-cell transplantation, patients have relapses. We investigated whether rituximab maintenance therapy at a dose of 375 mg per square meter of body-surface area administered every 2 months for 3 years after transplantation would prolong the duration of response. In a phase 3 trial involving 299 patients who were younger than 66 years of age at diagnosis, we randomly assigned 240 patients to receive rituximab maintenance therapy or to undergo observation after autologous stem-cell transplantation (120 patients per group); 59 patients did not undergo randomization. The primary end point was event-free survival (with an event defined as disease progression, relapse, death, allergy to rituximab, or severe infection) after transplantation among patients who underwent randomization. After four courses of immunochemotherapy induction (rituximab, dexamethasone, cytarabine, and a platinum derivative [R-DHAP]), the overall response rate was 89%, and the complete response rate 77%. Transplantation was performed in 257 patients. The median follow-up from randomization after transplantation was 50.2 months (range, 46.4 to 54.2). Starting from randomization, the rate of event-free survival at 4 years was 79% (95% confidence interval [CI], 70 to 86) in the rituximab group versus 61% (95% CI, 51 to 70) in the observation group (P=0.001). The rate of progression-free survival at 4 years was 83% (95% CI, 73 to 88) in the rituximab group versus 64% (95% CI, 55 to 73) in the observation group (P<0.001). The rate of overall survival was 89% (95% CI, 81 to 94) in the rituximab group versus 80% (95% CI, 72 to 88) in the observation group (P=0.04). According to a Cox regression unadjusted analysis, the rate of overall survival at 4 years was higher in the rituximab group than in the observation group (hazard ratio for death, 0.50; 95% CI, 0.26 to 0.99; P=0.04). Rituximab maintenance therapy after transplantation prolonged event-free survival, progression-free survival, and overall survival among patients with mantle-cell lymphoma who were younger than 66 years of age at diagnosis. (Funded by Roche and Amgen; LyMa ClinicalTrials.gov number, NCT00921414 .).

  3. Poractant alfa versus bovine lipid extract surfactant for infants 24+0 to 31+6 weeks gestational age: A randomized controlled trial.

    PubMed

    Lemyre, Brigitte; Fusch, Christoph; Schmölzer, Georg M; Rouvinez Bouali, Nicole; Reddy, Deepti; Barrowman, Nicholas; Huneault-Purney, Nicole; Lacaze-Masmonteil, Thierry

    2017-01-01

    To compare the efficacy and safety of poractant alfa and bovine lipid extract surfactant in preterm infants. Randomized, partially-blinded, multicenter trial. Infants <32 weeks needing surfactant before 48 hours were randomly assigned to receive poractant alfa or bovine lipid extract surfactant. The primary outcome was being alive and extubated at 48 hours post-randomization. Secondary outcomes included need for re-dosing, duration of respiratory support and oxygen, bronchopulmonary dysplasia, mortality and complications during administration. Three centers recruited 87 infants (mean 26.7 weeks and 906 grams) at a mean age of 5.9 hours, between March 2013 and December 2015. 21/42 (50%) were alive and extubated at 48 hours in the poractant alfa group vs 26/45 (57.8%) in the bovine lipid extract surfactant group; adjusted OR 0.76 (95% CI 0.30-1.93) (p = 0.56). No differences were observed in the need to re-dose. Duration of oxygen support (41.5 vs 62 days; adjusted OR 1.69 95% CI 1.02-2.80; p = 0.04) was reduced in infants who received poractant alfa. We observed a trend in bronchopulmonary dysplasia among survivors (51.5% vs 72.1%; adjusted OR 0.35 95% CI 0.12-1.04; p = 0.06) favoring poractant alfa. Twelve infants died before discharge, 9 in the poractant alfa group and 3 in the bovine lipid extract surfactant group. Severe airway obstruction following administration was observed in 0 (poractant alfa) and 5 (bovine lipid extract surfactant) infants (adjusted OR 0.09 95% CI <0.01-1.27; p = 0.07). No statistically significant difference was observed in the proportion of infants alive and extubated within 48 hours between the two study groups. Poractant alfa may be more beneficial and associated with fewer complications than bovine lipid extract surfactant. However, we observed a trend towards higher mortality in the poractant alfa group. Larger studies are needed to determine whether the observed possible benefits translate into shorter hospital admissions or other long-term benefits, and whether there is a difference in mortality.

  4. Block versus Random Amphiphilic Glycopolymer Nanoparticles as Glucose-Responsive Vehicles.

    PubMed

    Guo, Qianqian; Zhang, Tianqi; An, Jinxia; Wu, Zhongming; Zhao, Yu; Dai, Xiaomei; Zhang, Xinge; Li, Chaoxing

    2015-10-12

    To explore the effect of polymer structure on their self-assembled aggregates and their unique characteristics, this study was devoted to developing a series of amphiphilic block and random phenylboronic acid-based glycopolymers by RAFT polymerization. The amphiphilic glycopolymers were successfully self-assembled into spherically shaped nanoparticles with narrow size distribution in aqueous solution. For block and random copolymers with similar monomer compositions, block copolymer nanoparticles exhibited a more regular transmittance change with the increasing glucose level, while a more evident variation of size and quicker decreasing tendency in I/I0 behavior in different glucose media were observed for random copolymer nanoparticles. Cell viability of all the polymer nanoparticles investigated by MTT assay was higher than 80%, indicating that both block and random copolymers had good cytocompatibility. Insulin could be encapsulated into both nanoparticles, and insulin release rate for random glycopolymer was slightly quicker than that for the block ones. We speculate that different chain conformations between block and random glycopolymers play an important role in self-assembled nanoaggregates and underlying glucose-sensitive behavior.

  5. Modeling the Overalternating Bias with an Asymmetric Entropy Measure

    PubMed Central

    Gronchi, Giorgio; Raglianti, Marco; Noventa, Stefano; Lazzeri, Alessandro; Guazzini, Andrea

    2016-01-01

    Psychological research has found that human perception of randomness is biased. In particular, people consistently show the overalternating bias: they rate binary sequences of symbols (such as Heads and Tails in coin flipping) with an excess of alternation as more random than prescribed by the normative criteria of Shannon's entropy. Within data mining for medical applications, Marcellin proposed an asymmetric measure of entropy that can be ideal to account for such bias and to quantify subjective randomness. We fitted Marcellin's entropy and Renyi's entropy (a generalized form of uncertainty measure comprising many different kinds of entropies) to experimental data found in the literature with the Differential Evolution algorithm. We observed a better fit for Marcellin's entropy compared to Renyi's entropy. The fitted asymmetric entropy measure also showed good predictive properties when applied to different datasets of randomness-related tasks. We concluded that Marcellin's entropy can be a parsimonious and effective measure of subjective randomness that can be useful in psychological research about randomness perception. PMID:27458418
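
    As a rough sketch of the fitting procedure described above, the code below optimizes the peak location of an asymmetric entropy curve against hypothetical randomness ratings using SciPy's Differential Evolution. The functional form h_θ(p) = p(1-p)/((1-2θ)p + θ²), which attains its maximum at p = θ, is an assumed stand-in and may differ from Marcellin's exact definition; the data are invented.

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution

    # Hypothetical data: proportion of alternations in a binary sequence (p_alt)
    # and mean subjective randomness ratings (0-1); not the datasets from the paper.
    p_alt = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9])
    ratings = np.array([0.15, 0.30, 0.50, 0.70, 0.85, 0.95, 0.90, 0.70, 0.45])

    def asym_entropy(p, theta):
        """Illustrative asymmetric entropy that peaks at p = theta (assumed form)."""
        return p * (1 - p) / ((1 - 2 * theta) * p + theta ** 2)

    def loss(params):
        theta, scale = params
        return np.mean((ratings - scale * asym_entropy(p_alt, theta)) ** 2)

    # Differential Evolution searches the (theta, scale) box for the best fit.
    result = differential_evolution(loss, bounds=[(0.05, 0.95), (0.1, 2.0)], seed=0)
    theta_hat, scale_hat = result.x
    print(f"Fitted peak of subjective randomness: p_alt ~ {theta_hat:.2f} "
          f"(scale {scale_hat:.2f})")
    ```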

  6. DNA-based random number generation in security circuitry.

    PubMed

    Gearheart, Christy M; Arazi, Benjamin; Rouchka, Eric C

    2010-06-01

    DNA-based circuit design is an area of research in which traditional silicon-based technologies are replaced by naturally occurring phenomena taken from biochemistry and molecular biology. This research focuses on further developing DNA-based methodologies to mimic digital data manipulation. While exhibiting fundamental principles, this work was done in conjunction with the vision that DNA-based circuitry, when the technology matures, will form the basis for a tamper-proof security module, revolutionizing the meaning and concept of tamper-proofing and possibly preventing it altogether based on accurate scientific observations. A paramount part of such a solution would be self-generation of random numbers. A novel prototype schema employs solid phase synthesis of oligonucleotides for random construction of DNA sequences; temporary storage and retrieval is achieved through plasmid vectors. A discussion of how to evaluate sequence randomness is included, as well as how these techniques are applied to a simulation of the random number generation circuitry. Simulation results show generated sequences successfully pass three selected NIST random number generation tests specified for security applications.
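
    One concrete way to "evaluate sequence randomness", as the abstract puts it, is the NIST random number test suite referred to above. The sketch below implements its simplest member, the frequency (monobit) test, on a pseudorandom stand-in bit sequence; the mapping from synthesized DNA sequences to bits is not specified in the record and is left as an assumption.

    ```python
    import math
    import numpy as np

    def monobit_test(bits):
        """NIST SP 800-22 frequency (monobit) test: p-value for the null
        hypothesis that the bit sequence is random (balanced 0s and 1s)."""
        bits = np.asarray(bits)
        n = bits.size
        s = np.sum(2 * bits - 1)              # map {0,1} -> {-1,+1} and sum
        s_obs = abs(s) / math.sqrt(n)
        return math.erfc(s_obs / math.sqrt(2))

    # Example: a pseudorandom sequence standing in for bits read out of a
    # sequenced oligonucleotide library (the DNA-to-bit encoding is hypothetical).
    rng = np.random.default_rng(42)
    bits = rng.integers(0, 2, size=10_000)
    print(f"monobit p-value: {monobit_test(bits):.3f}")  # reject randomness if < 0.01
    ```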

  7. Should multiple imputation be the method of choice for handling missing data in randomized trials?

    PubMed Central

    Sullivan, Thomas R; White, Ian R; Salter, Amy B; Ryan, Philip; Lee, Katherine J

    2016-01-01

    The use of multiple imputation has increased markedly in recent years, and journal reviewers may expect to see multiple imputation used to handle missing data. However in randomized trials, where treatment group is always observed and independent of baseline covariates, other approaches may be preferable. Using data simulation we evaluated multiple imputation, performed both overall and separately by randomized group, across a range of commonly encountered scenarios. We considered both missing outcome and missing baseline data, with missing outcome data induced under missing at random mechanisms. Provided the analysis model was correctly specified, multiple imputation produced unbiased treatment effect estimates, but alternative unbiased approaches were often more efficient. When the analysis model overlooked an interaction effect involving randomized group, multiple imputation produced biased estimates of the average treatment effect when applied to missing outcome data, unless imputation was performed separately by randomized group. Based on these results, we conclude that multiple imputation should not be seen as the only acceptable way to handle missing data in randomized trials. In settings where multiple imputation is adopted, we recommend that imputation is carried out separately by randomized group. PMID:28034175

  8. Should multiple imputation be the method of choice for handling missing data in randomized trials?

    PubMed

    Sullivan, Thomas R; White, Ian R; Salter, Amy B; Ryan, Philip; Lee, Katherine J

    2016-01-01

    The use of multiple imputation has increased markedly in recent years, and journal reviewers may expect to see multiple imputation used to handle missing data. However in randomized trials, where treatment group is always observed and independent of baseline covariates, other approaches may be preferable. Using data simulation we evaluated multiple imputation, performed both overall and separately by randomized group, across a range of commonly encountered scenarios. We considered both missing outcome and missing baseline data, with missing outcome data induced under missing at random mechanisms. Provided the analysis model was correctly specified, multiple imputation produced unbiased treatment effect estimates, but alternative unbiased approaches were often more efficient. When the analysis model overlooked an interaction effect involving randomized group, multiple imputation produced biased estimates of the average treatment effect when applied to missing outcome data, unless imputation was performed separately by randomized group. Based on these results, we conclude that multiple imputation should not be seen as the only acceptable way to handle missing data in randomized trials. In settings where multiple imputation is adopted, we recommend that imputation is carried out separately by randomized group.
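
    The recommendation to impute separately by randomized group can be illustrated with a small simulation. In the sketch below (simulated data, an arbitrary missingness rate, and scikit-learn's IterativeImputer as the imputation engine, none of which come from the paper), each arm is imputed on its own and the per-imputation treatment effects are pooled with Rubin's rules.

    ```python
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer

    rng = np.random.default_rng(0)

    # Simulated trial (hypothetical): baseline covariate x, randomized group z,
    # outcome y with a group-by-covariate interaction, ~25% of outcomes missing.
    n = 400
    z = rng.integers(0, 2, n)
    x = rng.normal(size=n)
    y = 1.0 + 0.5 * x + 1.0 * z + 0.8 * z * x + rng.normal(size=n)
    y_obs = y.copy()
    y_obs[rng.random(n) < 0.25] = np.nan       # missing completely at random here

    def impute_by_group(x, y_obs, z, m):
        """One stochastic imputation, run separately within each randomized group."""
        y_imp = y_obs.copy()
        for g in (0, 1):
            idx = z == g
            data = np.column_stack([x[idx], y_obs[idx]])
            imp = IterativeImputer(sample_posterior=True, random_state=m)
            y_imp[idx] = imp.fit_transform(data)[:, 1]
        return y_imp

    # M imputations, treatment effect per completed dataset, Rubin's rules to pool.
    M = 20
    ests, wvars = [], []
    for m in range(M):
        y_imp = impute_by_group(x, y_obs, z, m)
        ests.append(y_imp[z == 1].mean() - y_imp[z == 0].mean())
        wvars.append(y_imp[z == 1].var(ddof=1) / (z == 1).sum()
                     + y_imp[z == 0].var(ddof=1) / (z == 0).sum())
    ests, wvars = np.array(ests), np.array(wvars)
    B = ests.var(ddof=1)                       # between-imputation variance
    T = wvars.mean() + (1 + 1 / M) * B         # Rubin's total variance
    print(f"Pooled treatment effect: {ests.mean():.2f} (SE {np.sqrt(T):.2f})")
    ```

    Imputing within each arm lets the imputation model reflect the group-by-covariate interaction without having to specify it explicitly, which is exactly the failure mode the paper highlights for a single overall imputation model.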

  9. Assessing the value of diagnostic imaging: the role of perception

    NASA Astrophysics Data System (ADS)

    Potchen, E. J.; Cooper, Thomas G.

    2000-04-01

    The value of diagnostic radiology rests in its ability to provide information. Information is defined as a reduction in randomness. Quality improvement in any system requires diminution in the variation in its performance. The major variation in performance of the system of diagnostic radiology occurs in observer performance and in the communication of information from the observer to someone who will apply that information to the benefit of the patient. The ability to provide information can be determined by observer performance studies using a receiver-operating characteristic (ROC) curve analysis. The amount of information provided by each observer can be measured in terms of the uncertainty they reduce. Using a set of standardized radiographs, some normal and some abnormal, sorting them randomly, and then asking an observer to redistribute them according to their probability of normality can measure the difference in the value added by different observers. By applying this observer performance measure, we have been able to characterize individual radiologists, groups of radiologists, and regions of the United States in their ability to add value in chest radiology. The use of these technologies in health care may improve upon the contribution of diagnostic imaging.
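
    A minimal version of the observer-performance measurement described here scores each observer's probability ratings for a standardized case set against the known truth with an ROC analysis. The ratings below are simulated for two hypothetical observers; only the use of the area under the ROC curve as the performance summary is taken from the abstract.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(3)

    # Hypothetical observer study: 1 = abnormal radiograph, 0 = normal; each
    # observer assigns every case a probability of abnormality (simulated here).
    truth = rng.integers(0, 2, 200)

    def simulated_ratings(truth, skill):
        """Probability-of-abnormality ratings from an observer of a given skill."""
        evidence = skill * (2 * truth - 1) + rng.normal(size=truth.size)
        return 1 / (1 + np.exp(-evidence))

    for name, skill in [("observer A", 2.0), ("observer B", 0.8)]:
        auc = roc_auc_score(truth, simulated_ratings(truth, skill))
        print(f"{name}: area under the ROC curve = {auc:.2f}")
    ```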

  10. Does Transfer Capacitive Resistive Energy Has a Therapeutic Effect on Peyronie's Disease? Randomized, Single-Blind, Sham-Controlled Study on 96 Patients: Fast Pain Relief.

    PubMed

    Pavone, Carlo; Romeo, Salvatore; D'Amato, Francesco; Usala, Manuela; Letizia Mauro, Giulia; Caruana, Giovanni

    2017-01-01

    Background/Aims/Objectives: We investigated the clinical and physiological effects of Transfer Capacitive Resistive Energy (TCARE) therapy in men with Peyronie's disease (PD). Ninety-six men with PD were randomized in a 2:1 ratio to receive 3 sessions of TCARE therapy or sham therapy. Pain, penile curvature and erectile function were assessed before the first treatment and up to 9 months after the end of treatment, using the Visual Analogue Scale for pain, a goniometer to measure the degree of curvature from at-home photography, and the International Index of Erectile Function (IIEF-5) questionnaire. A significant pain reduction at the end of treatment was observed in 51 (79.6%) patients of the treated group (p < 0.01). No significant improvement was observed in the sham group (p = 0.23). No statistical differences in the degree of curvature were observed in either group. No statistical improvements were observed in the IIEF-5 questionnaire. No adverse events were reported. This is, to our knowledge, the first randomized, single-blind, sham-controlled study showing that TCARE has a positive short-term clinical effect on pain in patients with PD. The feasibility and tolerability of this treatment make it an attractive new therapeutic option for men with PD. © 2017 S. Karger AG, Basel.

  11. Cuing effects for informational masking

    NASA Astrophysics Data System (ADS)

    Richards, Virginia M.; Neff, Donna L.

    2004-01-01

    The detection of a tone added to a random-frequency, multitone masker can be very poor even when the maskers have little energy in the frequency region of the signal. This paper examines the effects of adding a pretrial cue to reduce uncertainty for the masker or the signal. The first two experiments examined the effect of cuing a fixed-frequency signal as the number of masker components and presentation methods were manipulated. Cue effectiveness varied across observers, but could reduce thresholds by as much as 20 dB. Procedural comparisons indicated observers benefited more from having two masker samples to compare, with or without a signal cue, than having a single interval with one masker sample and a signal cue. The third experiment used random-frequency signals and compared no-cue, signal-cue, and masker-cue conditions, and also systematically varied the time interval between cue offset and trial onset. Thresholds with a cued random-frequency signal remained higher than for a cued fixed-frequency signal. For time intervals between the cue and trial of 50 ms or longer, thresholds were approximately the same with a signal or a masker cue and lower than when there was no cue. Without a cue or with a masker cue, analyses of possible decision strategies suggested observers attended to the potential signal frequencies, particularly the highest signal frequency. With a signal cue, observers appeared to attend to the frequency of the subsequent signal.

  12. Optimized hardware framework of MLP with random hidden layers for classification applications

    NASA Astrophysics Data System (ADS)

    Zyarah, Abdullah M.; Ramesh, Abhishek; Merkel, Cory; Kudithipudi, Dhireesha

    2016-05-01

    Multilayer perceptron networks with random hidden layers are very efficient at automatic feature extraction and offer significant performance improvements in the training process. They essentially employ a large collection of fixed, random features and are expedient for form-factor-constrained embedded platforms. In this work, a reconfigurable and scalable architecture is proposed for MLPs with random hidden layers, with a customized building block based on the CORDIC algorithm. The proposed architecture also exploits fixed-point operations for area efficiency. The design is validated for classification on two different datasets: an accuracy of ~90% was observed on the MNIST dataset and 75% for gender classification on the LFW dataset. The hardware achieves a 299× speed-up over the corresponding software realization.
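
    A software reference point for such a network is easy to state: a fixed random projection followed by a nonlinearity, with only the output layer trained by a regularized least-squares solve. The sketch below uses scikit-learn's small digits set as a stand-in for MNIST; the hidden-layer size, weight scale, and regularization are arbitrary choices, not the paper's configuration.

    ```python
    import numpy as np
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split

    # Small digits dataset as a stand-in for MNIST.
    X, y = load_digits(return_X_y=True)
    X = X / 16.0
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

    # Random, fixed hidden layer: weights are drawn once and never trained.
    rng = np.random.default_rng(0)
    n_hidden = 1000
    W = rng.normal(scale=1.0 / np.sqrt(X.shape[1]), size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)

    def hidden(A):
        return np.tanh(A @ W + b)

    # Only the output layer is trained, via a regularized least-squares solve.
    Y_tr = np.eye(10)[y_tr]                    # one-hot targets for the 10 classes
    H = hidden(X_tr)
    lam = 1e-3
    beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ Y_tr)

    # Predicted class is the argmax over the 10 output columns.
    acc = ((hidden(X_te) @ beta).argmax(axis=1) == y_te).mean()
    print(f"Test accuracy with a random hidden layer: {acc:.3f}")
    ```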

  13. Source-Device-Independent Ultrafast Quantum Random Number Generation.

    PubMed

    Marangon, Davide G; Vallone, Giuseppe; Villoresi, Paolo

    2017-02-10

    Secure random numbers are a fundamental element of many applications in science, statistics, cryptography and, more generally, in security protocols. We present a method that enables the generation of high-speed unpredictable random numbers from the quadratures of an electromagnetic field without any assumption on the input state. The method allows us to eliminate the numbers that can be predicted due to the presence of classical and quantum side information. In particular, we introduce a procedure to estimate a bound on the conditional min-entropy based on the entropic uncertainty principle for the position and momentum observables of infinite-dimensional quantum systems. Using this method, we experimentally demonstrated the generation of secure true random bits at a rate greater than 1.7 Gbit/s.

  14. Alternatives to the Randomized Controlled Trial

    PubMed Central

    West, Stephen G.; Duan, Naihua; Pequegnat, Willo; Gaist, Paul; Des Jarlais, Don C.; Holtgrave, David; Szapocznik, José; Fishbein, Martin; Rapkin, Bruce; Clatts, Michael; Mullen, Patricia Dolan

    2008-01-01

    Public health researchers are addressing new research questions (e.g., effects of environmental tobacco smoke, Hurricane Katrina) for which the randomized controlled trial (RCT) may not be a feasible option. Drawing on the potential outcomes framework (Rubin Causal Model) and Campbellian perspectives, we consider alternative research designs that permit relatively strong causal inferences. In randomized encouragement designs, participants are randomly invited to participate in one of the treatment conditions, but are allowed to decide whether to receive treatment. In quantitative assignment designs, treatment is assigned on the basis of a quantitative measure (e.g., need, merit, risk). In observational studies, treatment assignment is unknown and presumed to be nonrandom. Major threats to the validity of each design and statistical strategies for mitigating those threats are presented. PMID:18556609

  15. Adjuvant chemotherapy for rectal cancer patients treated with preoperative (chemo)radiotherapy and total mesorectal excision: a Dutch Colorectal Cancer Group (DCCG) randomized phase III trial.

    PubMed

    Breugom, A J; van Gijn, W; Muller, E W; Berglund, Å; van den Broek, C B M; Fokstuen, T; Gelderblom, H; Kapiteijn, E; Leer, J W H; Marijnen, C A M; Martijn, H; Meershoek-Klein Kranenbarg, E; Nagtegaal, I D; Påhlman, L; Punt, C J A; Putter, H; Roodvoets, A G H; Rutten, H J T; Steup, W H; Glimelius, B; van de Velde, C J H

    2015-04-01

    The discussion on the role of adjuvant chemotherapy for rectal cancer patients treated according to current guidelines is still ongoing. A multicentre, randomized phase III trial, PROCTOR-SCRIPT, was conducted to compare adjuvant chemotherapy with observation for rectal cancer patients treated with preoperative (chemo)radiotherapy and total mesorectal excision (TME). The PROCTOR-SCRIPT trial recruited patients from 52 hospitals. Patients with histologically proven stage II or III rectal adenocarcinoma were randomly assigned (1:1) to observation or adjuvant chemotherapy after preoperative (chemo)radiotherapy and TME. Radiotherapy consisted of 5 × 5 Gy. Chemoradiotherapy consisted of 25 × 1.8-2 Gy combined with 5-FU-based chemotherapy. Adjuvant chemotherapy consisted of 5-FU/LV (PROCTOR) or eight courses capecitabine (SCRIPT). Randomization was based on permuted blocks of six, stratified according to centre, residual tumour, time between last irradiation and surgery, and preoperative treatment. The primary end point was overall survival. Of 470 enrolled patients, 437 were eligible. The trial closed prematurely because of slow patient accrual. Patients were randomly assigned to observation (n = 221) or adjuvant chemotherapy (n = 216). After a median follow-up of 5.0 years, 5-year overall survival was 79.2% in the observation group and 80.4% in the chemotherapy group [hazard ratio (HR) 0.93, 95% confidence interval (CI) 0.62-1.39; P = 0.73]. The HR for disease-free survival was 0.80 (95% CI 0.60-1.07; P = 0.13). Five-year cumulative incidence for locoregional recurrences was 7.8% in both groups. Five-year cumulative incidence for distant recurrences was 38.5% and 34.7%, respectively (P = 0.39). The PROCTOR-SCRIPT trial could not demonstrate a significant benefit of adjuvant chemotherapy with fluoropyrimidine monotherapy after preoperative (chemo)radiotherapy and TME on overall survival, disease-free survival, and recurrence rate. However, this trial did not complete planned accrual. Dutch Colorectal Cancer group, CKTO 2003-16, ISRCTN36266738. © The Author 2014. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  16. A Classroom Observational Study of Qatar's Independent Schools: Instruction and School Reform

    ERIC Educational Resources Information Center

    Palmer, Douglas J.; Sadiq, Hissa M.; Lynch, Patricia; Parker, Dawn; Viruru, Radhika; Knight, Stephanie; Waxman, Hersh; Alford, Beverly; Brown, Danielle Bairrington; Rollins, Kayla; Stillisano, Jacqueline; Abu-Tineh, Abdullah M. Hamdan; Nasser, Ramzi; Allen, Nancy; Al-Binali, Hessa; Ellili, Maha; Al-Kateeb, Haithem; Al-Kubaisi, Huda

    2016-01-01

    Qatar initiated a K-12 national educational reform in 2001. However, there is limited information on the instructional practices of the teachers in the reform schools. This project was an observational study of classrooms with a stratified random sample of the first six cohorts of reform schools. Specifically, 156 classrooms were observed in 29…

  17. An Introduction to Propensity Score Methods for Reducing the Effects of Confounding in Observational Studies

    ERIC Educational Resources Information Center

    Austin, Peter C.

    2011-01-01

    The propensity score is the probability of treatment assignment conditional on observed baseline characteristics. The propensity score allows one to design and analyze an observational (nonrandomized) study so that it mimics some of the particular characteristics of a randomized controlled trial. In particular, the propensity score is a balancing…
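
    The record's definition (the probability of treatment given observed baseline characteristics, used to mimic a randomized trial) maps directly onto a short simulation. The sketch below invents two confounders and a known treatment effect, estimates the propensity score by logistic regression, and removes the confounding with inverse-probability weighting, one of several standard propensity-score techniques.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)

    # Simulated observational data (hypothetical): two confounders affect both
    # treatment assignment and outcome; the true treatment effect is 2.0.
    n = 5000
    x1, x2 = rng.normal(size=n), rng.normal(size=n)
    p_treat = 1 / (1 + np.exp(-(0.8 * x1 - 0.5 * x2)))
    t = rng.random(n) < p_treat
    y = 2.0 * t + 1.5 * x1 + 1.0 * x2 + rng.normal(size=n)

    # The naive comparison is confounded.
    print(f"Naive difference in means: {y[t].mean() - y[~t].mean():.2f}")

    # Propensity score: probability of treatment given the observed covariates.
    covars = np.column_stack([x1, x2])
    ps = LogisticRegression().fit(covars, t).predict_proba(covars)[:, 1]

    # Inverse-probability weighting re-creates the balance of a randomized trial.
    w = np.where(t, 1 / ps, 1 / (1 - ps))
    ate = (np.sum(w[t] * y[t]) / np.sum(w[t])
           - np.sum(w[~t] * y[~t]) / np.sum(w[~t]))
    print(f"IPW estimate of the treatment effect: {ate:.2f}")  # close to 2.0
    ```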

  18. Randomizing Genome-Scale Metabolic Networks

    PubMed Central

    Samal, Areejit; Martin, Olivier C.

    2011-01-01

    Networks coming from protein-protein interactions, transcriptional regulation, signaling, or metabolism may appear to have “unusual” properties. To quantify this, it is appropriate to randomize the network and test the hypothesis that the network is not statistically different from expected in a motivated ensemble. However, when dealing with metabolic networks, the randomization of the network using edge exchange generates fictitious reactions that are biochemically meaningless. Here we provide several natural ensembles of randomized metabolic networks. A first constraint is to use valid biochemical reactions. Further constraints correspond to imposing appropriate functional constraints. We explain how to perform these randomizations with the help of Markov Chain Monte Carlo (MCMC) and show that they allow one to approach the properties of biological metabolic networks. The implication of the present work is that the observed global structural properties of real metabolic networks are likely to be the consequence of simple biochemical and functional constraints. PMID:21779409
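
    The general recipe, randomize within an ensemble while rejecting moves that violate domain constraints, can be sketched generically. The code below performs degree-preserving double-edge swaps on a toy directed graph and rejects any swap failing a user-supplied validity check; it is a schematic stand-in for, not a reproduction of, the biochemically constrained MCMC ensembles described in the paper.

    ```python
    import random

    def randomize_network(edges, is_valid, n_steps=10000, seed=0):
        """Degree-preserving double-edge swaps, rejecting swaps that fail a
        domain-specific validity check (stand-in for 'biochemically meaningful').
        `edges` is a list of (u, v) tuples; `is_valid` takes a candidate edge set."""
        rng = random.Random(seed)
        edges = list(edges)
        edge_set = set(edges)
        for _ in range(n_steps):
            (a, b), (c, d) = rng.sample(edges, 2)
            new1, new2 = (a, d), (c, b)
            # Skip swaps that create self-loops or duplicate existing edges.
            if a == d or c == b or new1 in edge_set or new2 in edge_set:
                continue
            candidate = (edge_set - {(a, b), (c, d)}) | {new1, new2}
            if not is_valid(candidate):
                continue                      # reject a domain-invalid network
            edge_set = candidate
            edges = list(edge_set)
        return edges

    # Toy example: a small directed graph with a trivial (always-true) validity check.
    toy_edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2), (1, 3)]
    print(randomize_network(toy_edges, is_valid=lambda e: True, n_steps=1000))
    ```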

  19. An instrumental variable random-coefficients model for binary outcomes

    PubMed Central

    Chesher, Andrew; Rosen, Adam M

    2014-01-01

    In this paper, we study a random-coefficients model for a binary outcome. We allow for the possibility that some or even all of the explanatory variables are arbitrarily correlated with the random coefficients, thus permitting endogeneity. We assume the existence of observed instrumental variables Z that are jointly independent with the random coefficients, although we place no structure on the joint determination of the endogenous variable X and instruments Z, as would be required for a control function approach. The model fits within the spectrum of generalized instrumental variable models, and we thus apply identification results from our previous studies of such models to the present context, demonstrating their use. Specifically, we characterize the identified set for the distribution of random coefficients in the binary response model with endogeneity via a collection of conditional moment inequalities, and we investigate the structure of these sets by way of numerical illustration. PMID:25798048

  20. A stochastic model for stationary dynamics of prices in real estate markets. A case of random intensity for Poisson moments of prices changes

    NASA Astrophysics Data System (ADS)

    Rusakov, Oleg; Laskin, Michael

    2017-06-01

    We consider a stochastic model of price changes in real estate markets. We suppose that in a book of prices the changes happen at the jump points of a Poisson process with random intensity, i.e. the moments of change follow a random process of the Cox type. We calculate cumulative mathematical expectations and variances for the random intensity of this point process. In the case where the process of random intensity is a martingale, the cumulative variance grows linearly. We statistically process a number of observations of real estate prices and accept the hypothesis of linear growth for the estimates of both the cumulative average and the cumulative variance, for both the input and output prices recorded in the book of prices.

  1. Quantum correlation of fiber-based telecom-band photon pairs through standard loss and random media.

    PubMed

    Sua, Yong Meng; Malowicki, John; Lee, Kim Fook

    2014-08-15

    We study quantum correlation and interference of fiber-based telecom-band photon pairs with one photon of the pair experiencing multiple scattering in a random medium. We measure the joint probability of two-photon detection for the signal photon in a normal channel and the idler photon in a channel subjected to two independent conditions: standard loss (neutral density filter) and random media. We observe that both conditions degrade the correlation of signal and idler photons, and that depolarization of the idler photon in the random medium can enhance two-photon interference at certain relative polarization angles. Our theoretical calculation of two-photon polarization correlation and interference as a function of mean free path is in agreement with our experimental data. We conclude that the quantum correlation of a polarization-entangled photon pair is better preserved than that of a polarization-correlated photon pair as one photon of the pair scatters through a random medium.

  2. Sensitivity analysis for missing dichotomous outcome data in multi-visit randomized clinical trial with randomization-based covariance adjustment.

    PubMed

    Li, Siying; Koch, Gary G; Preisser, John S; Lam, Diana; Sanchez-Kam, Matilde

    2017-01-01

    Dichotomous endpoints in clinical trials have only two possible outcomes, either directly or via categorization of an ordinal or continuous observation. It is common to have missing data for one or more visits during a multi-visit study. This paper presents a closed form method for sensitivity analysis of a randomized multi-visit clinical trial that possibly has missing not at random (MNAR) dichotomous data. Counts of missing data are redistributed to the favorable and unfavorable outcomes mathematically to address possibly informative missing data. Adjusted proportion estimates and their closed form covariance matrix estimates are provided. Treatment comparisons over time are addressed with Mantel-Haenszel adjustment for a stratification factor and/or randomization-based adjustment for baseline covariables. The application of such sensitivity analyses is illustrated with an example. An appendix outlines an extension of the methodology to ordinal endpoints.
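
    The core of the closed-form sensitivity analysis, redistributing the missing counts between the favorable and unfavorable outcomes and recomputing adjusted proportions, can be shown with a toy single-visit example. The counts below are invented, and the sketch omits the covariance estimates, stratification, and covariate adjustment developed in the paper.

    ```python
    import numpy as np

    def adjusted_proportion(n_fav, n_unfav, n_miss, pi_fav):
        """Adjusted favorable proportion when a fraction pi_fav of the missing
        counts is redistributed to the favorable outcome (the rest unfavorable)."""
        n = n_fav + n_unfav + n_miss
        return (n_fav + pi_fav * n_miss) / n

    # Hypothetical single-visit counts for two arms (not from any real trial).
    arm_A = dict(n_fav=62, n_unfav=28, n_miss=10)
    arm_B = dict(n_fav=55, n_unfav=30, n_miss=15)

    # Tipping-point style table: vary how the missing data are allocated per arm.
    for pi_A in (0.0, 0.5, 1.0):
        for pi_B in (0.0, 0.5, 1.0):
            diff = (adjusted_proportion(**arm_A, pi_fav=pi_A)
                    - adjusted_proportion(**arm_B, pi_fav=pi_B))
            print(f"pi_A={pi_A:.1f}, pi_B={pi_B:.1f}: "
                  f"difference in proportions = {diff:+.3f}")
    ```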

  3. Statistical analysis of loopy belief propagation in random fields

    NASA Astrophysics Data System (ADS)

    Yasuda, Muneki; Kataoka, Shun; Tanaka, Kazuyuki

    2015-10-01

    Loopy belief propagation (LBP), which is equivalent to the Bethe approximation in statistical mechanics, is a message-passing-type inference method that is widely used to analyze systems based on Markov random fields (MRFs). In this paper, we propose a message-passing-type method to analytically evaluate the quenched average of LBP in random fields by using the replica cluster variation method. The proposed analytical method is applicable to general pairwise MRFs with random fields whose distributions differ from each other and can give the quenched averages of the Bethe free energies over random fields, which are consistent with numerical results. The order of its computational cost is equivalent to that of standard LBP. In the latter part of this paper, we describe the application of the proposed method to Bayesian image restoration, in which we observed that our theoretical results are in good agreement with the numerical results for natural images.
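
    As a concrete reference point for the standard LBP analyzed above, the sketch below runs sum-product message passing on a small loopy pairwise Markov random field with binary variables and compares the resulting Bethe marginals against brute-force enumeration. The graph, fields, and couplings are arbitrary illustrative values.

    ```python
    import itertools
    import numpy as np

    # A small loopy pairwise MRF: a 4-cycle of Ising-like variables in {-1, +1}
    # with local fields h_i and couplings J_ij.
    edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
    h = np.array([0.2, -0.1, 0.3, 0.0])
    J = {e: 0.5 for e in edges}
    states = np.array([-1, 1])

    def node_pot(i):
        return np.exp(h[i] * states)

    def edge_pot(e):
        return np.exp(J[e] * np.outer(states, states))   # indexed [x_i, x_j]

    # Messages m[(i, j)][x_j] from node i to node j, initialized uniform.
    msgs = {(i, j): np.ones(2) for e in edges for (i, j) in (e, e[::-1])}
    neighbors = {i: [j for e in edges for i2, j in (e, e[::-1]) if i2 == i]
                 for i in range(4)}

    for _ in range(50):                                   # iterate to convergence
        new = {}
        for (i, j) in msgs:
            psi = edge_pot((i, j)) if (i, j) in J else edge_pot((j, i)).T
            incoming = np.prod([msgs[(k, i)] for k in neighbors[i] if k != j], axis=0)
            m = psi.T @ (node_pot(i) * incoming)
            new[(i, j)] = m / m.sum()
        msgs = new

    # Approximate (Bethe) marginals from the converged messages.
    for i in range(4):
        b = node_pot(i) * np.prod([msgs[(k, i)] for k in neighbors[i]], axis=0)
        b /= b.sum()
        print(f"node {i}: LBP P(x=+1) ~ {b[1]:.3f}")

    # Exact marginals by brute force, for comparison on this small graph.
    Z, p1 = 0.0, np.zeros(4)
    for x in itertools.product([-1, 1], repeat=4):
        w = np.exp(sum(h[i] * x[i] for i in range(4))
                   + sum(J[e] * x[e[0]] * x[e[1]] for e in edges))
        Z += w
        for i in range(4):
            if x[i] == 1:
                p1[i] += w
    print("exact P(x=+1):", np.round(p1 / Z, 3))
    ```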

  4. Inflation with a graceful exit in a random landscape

    NASA Astrophysics Data System (ADS)

    Pedro, F. G.; Westphal, A.

    2017-03-01

    We develop a stochastic description of small-field inflationary histories with a graceful exit in a random potential whose Hessian is a Gaussian random matrix as a model of the unstructured part of the string landscape. The dynamical evolution in such a random potential from a small-field inflation region towards a viable late-time de Sitter (dS) minimum maps to the dynamics of Dyson Brownian motion describing the relaxation of non-equilibrium eigenvalue spectra in random matrix theory. We analytically compute the relaxation probability in a saddle point approximation of the partition function of the eigenvalue distribution of the Wigner ensemble describing the mass matrices of the critical points. When applied to small-field inflation in the landscape, this leads to an exponentially strong bias against small-field ranges and an upper bound N ≪ 10 on the number of light fields N participating during inflation from the non-observation of negative spatial curvature.

  5. Political science. Reverse-engineering censorship in China: randomized experimentation and participant observation.

    PubMed

    King, Gary; Pan, Jennifer; Roberts, Margaret E

    2014-08-22

    Existing research on the extensive Chinese censorship organization uses observational methods with well-known limitations. We conducted the first large-scale experimental study of censorship by creating accounts on numerous social media sites, randomly submitting different texts, and observing from a worldwide network of computers which texts were censored and which were not. We also supplemented interviews with confidential sources by creating our own social media site, contracting with Chinese firms to install the same censoring technologies as existing sites, and--with their software, documentation, and even customer support--reverse-engineering how it all works. Our results offer rigorous support for the recent hypothesis that criticisms of the state, its leaders, and their policies are published, whereas posts about real-world events with collective action potential are censored. Copyright © 2014, American Association for the Advancement of Science.

  6. Comparing cluster-level dynamic treatment regimens using sequential, multiple assignment, randomized trials: Regression estimation and sample size considerations.

    PubMed

    NeCamp, Timothy; Kilbourne, Amy; Almirall, Daniel

    2017-08-01

    Cluster-level dynamic treatment regimens can be used to guide sequential treatment decision-making at the cluster level in order to improve outcomes at the individual or patient-level. In a cluster-level dynamic treatment regimen, the treatment is potentially adapted and re-adapted over time based on changes in the cluster that could be impacted by prior intervention, including aggregate measures of the individuals or patients that compose it. Cluster-randomized sequential multiple assignment randomized trials can be used to answer multiple open questions preventing scientists from developing high-quality cluster-level dynamic treatment regimens. In a cluster-randomized sequential multiple assignment randomized trial, sequential randomizations occur at the cluster level and outcomes are observed at the individual level. This manuscript makes two contributions to the design and analysis of cluster-randomized sequential multiple assignment randomized trials. First, a weighted least squares regression approach is proposed for comparing the mean of a patient-level outcome between the cluster-level dynamic treatment regimens embedded in a sequential multiple assignment randomized trial. The regression approach facilitates the use of baseline covariates which is often critical in the analysis of cluster-level trials. Second, sample size calculators are derived for two common cluster-randomized sequential multiple assignment randomized trial designs for use when the primary aim is a between-dynamic treatment regimen comparison of the mean of a continuous patient-level outcome. The methods are motivated by the Adaptive Implementation of Effective Programs Trial which is, to our knowledge, the first-ever cluster-randomized sequential multiple assignment randomized trial in psychiatry.

  7. Private randomness expansion with untrusted devices

    NASA Astrophysics Data System (ADS)

    Colbeck, Roger; Kent, Adrian

    2011-03-01

    Randomness is an important resource for many applications, from gambling to secure communication. However, guaranteeing that the output from a candidate random source could not have been predicted by an outside party is a challenging task, and many supposedly random sources used today provide no such guarantee. Quantum solutions to this problem exist, for example a device which internally sends a photon through a beamsplitter and observes on which side it emerges, but, presently, such solutions require the user to trust the internal workings of the device. Here, we seek to go beyond this limitation by asking whether randomness can be generated using untrusted devices—even ones created by an adversarial agent—while providing a guarantee that no outside party (including the agent) can predict it. Since this is easily seen to be impossible unless the user has an initially private random string, the task we investigate here is private randomness expansion. We introduce a protocol for private randomness expansion with untrusted devices which is designed to take as input an initially private random string and produce as output a longer private random string. We point out that private randomness expansion protocols are generally vulnerable to attacks that can render the initial string partially insecure, even though that string is used only inside a secure laboratory; our protocol is designed to remove this previously unconsidered vulnerability by privacy amplification. We also discuss extensions of our protocol designed to generate an arbitrarily long random string from a finite initially private random string. The security of these protocols against the most general attacks is left as an open question.

  8. Optoenergy storage and random walks assisted broadband amplification in Er3+-doped (Pb,La)(Zr,Ti)O3 disordered ceramics.

    PubMed

    Xu, Long; Zhao, Hua; Xu, Caixia; Zhang, Siqi; Zou, Yingyin K; Zhang, Jingwen

    2014-02-01

    A broadband optical amplification was observed and investigated in Er3+-doped electrostrictive ceramics of lanthanum-modified lead zirconate titanate under a corona atmosphere. The ceramic structure change caused by UV light, the electric field, and random walks originating from the diffusive process in intrinsically disordered materials may all contribute to the optical amplification and the associated energy storage. A discussion based on optical energy storage and diffusion equations is given to explain the findings. The experiments performed made it possible to study random walks and optical amplification in transparent ceramic materials.

  9. The blocked-random effect in pictures and words.

    PubMed

    Toglia, M P; Hinman, P J; Dayton, B S; Catalano, J F

    1997-06-01

    Picture and word recall was examined in conjunction with list organization. 60 subjects studied a list of 30 items, either words or their pictorial equivalents. The 30 words/pictures, members of five conceptual categories, each represented by six exemplars, were presented either blocked by category or in a random order. While pictures were recalled better than words and a standard blocked-random effect was observed, the interaction indicated that the recall advantage of a blocked presentation was restricted to the word lists. A similar pattern emerged for clustering. These findings are discussed in terms of limitations upon the pictorial superiority effect.

  10. Using Non-experimental Data to Estimate Treatment Effects

    PubMed Central

    Stuart, Elizabeth A.; Marcus, Sue M.; Horvitz-Lennon, Marcela V.; Gibbons, Robert D.; Normand, Sharon-Lise T.

    2009-01-01

    While much psychiatric research is based on randomized controlled trials (RCTs), where patients are randomly assigned to treatments, sometimes RCTs are not feasible. This paper describes propensity score approaches, which are increasingly used for estimating treatment effects in non-experimental settings. The primary goal of propensity score methods is to create sets of treated and comparison subjects who look as similar as possible, in essence replicating a randomized experiment, at least with respect to observed patient characteristics. A study to estimate the metabolic effects of antipsychotic medication in a sample of Florida Medicaid beneficiaries with schizophrenia illustrates methods. PMID:20563313

  11. Designing clinical trials for amblyopia

    PubMed Central

    Holmes, Jonathan M.

    2015-01-01

    Randomized clinical trial (RCT) study design leads to one of the highest levels of evidence, and is a preferred study design over cohort studies, because randomization reduces bias and maximizes the chance that even unknown confounding factors will be balanced between treatment groups. Recent randomized clinical trials and observational studies in amblyopia can be taken together to formulate an evidence-based approach to amblyopia treatment, which is presented in this review. When designing future clinical studies of amblyopia treatment, issues such as regression to the mean, sample size and trial duration must be considered, since each may impact study results and conclusions. PMID:25752747

  12. The detection of problem analytes in a single proficiency test challenge in the absence of the Health Care Financing Administration rule violations.

    PubMed

    Cembrowski, G S; Hackney, J R; Carey, N

    1993-04-01

    The Clinical Laboratory Improvement Act of 1988 (CLIA 88) has dramatically changed proficiency testing (PT) practices by mandating (1) satisfactory PT for certain analytes as a condition of laboratory operation, (2) fixed PT limits for many of these "regulated" analytes, and (3) an increased number of PT specimens (n = 5) for each testing cycle. For many of these analytes, the fixed limits are much broader than the previously employed Standard Deviation Index (SDI) criteria. Paradoxically, there may be less incentive to identify and evaluate analytically significant outliers to improve the analytical process. Previously described "control rules" to evaluate these PT results are unworkable as they consider only two or three results. We used Monte Carlo simulations of Kodak Ektachem analyzers participating in PT to determine optimal control rules for the identification of PT results that are inconsistent with those from other laboratories using the same methods. The analysis of three representative analytes (potassium, creatine kinase, and iron) was simulated with varying intrainstrument and interinstrument standard deviations (si and sg, respectively) obtained from the College of American Pathologists (Northfield, Ill) Quality Assurance Services data and Proficiency Test data, respectively. Analytical errors were simulated in each of the analytes and evaluated in terms of multiples of the interlaboratory SDI. Simple control rules for detecting systematic and random error were evaluated with power function graphs, i.e., graphs of the probability of error detection versus the magnitude of error. Based on the simulation results, we recommend screening all analytes for the occurrence of two or more observations exceeding the same +/- 1 SDI limit. For any analyte satisfying this condition, the mean of the observations should be calculated. For analytes with sg/si ratios between 1.0 and 1.5, a significant systematic error is signaled by the mean exceeding 1.0 SDI. Significant random error is signaled by one observation exceeding the +/- 3-SDI limit or the range of the observations exceeding 4 SDIs. For analytes with higher sg/si, significant systematic or random error is signaled by violation of the screening rule (having at least two observations exceeding the same +/- 1 SDI limit). Random error can also be signaled by one observation exceeding the +/- 1.5-SDI limit or the range of the observations exceeding 3 SDIs. We present a practical approach to the workup of apparent PT errors.
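
    The simulation strategy can be reproduced in miniature: draw five proficiency-testing results per event in interlaboratory SDI units and estimate by Monte Carlo how often the proposed screening rule (two or more results beyond the same +/- 1 SDI limit) fires as the systematic error grows. The bias values and si/sg ratios below are illustrative, not the study's instrument data.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def detection_probability(bias_sdi, si_over_sg, n_events=20000):
        """Monte Carlo power of the screening rule (two or more of the five results
        beyond the same +/-1 SDI limit) for a given systematic error, expressed in
        interlaboratory SDI units, and a given intra/inter-laboratory SD ratio."""
        sdi = rng.normal(loc=bias_sdi, scale=si_over_sg, size=(n_events, 5))
        flagged = (np.sum(sdi > 1, axis=1) >= 2) | (np.sum(sdi < -1, axis=1) >= 2)
        return flagged.mean()

    # Power-function-style table: probability of flagging vs systematic error size.
    for ratio in (0.7, 1.0):                              # illustrative si/sg ratios
        row = [detection_probability(b, ratio) for b in (0.0, 0.5, 1.0, 1.5, 2.0)]
        print(f"si/sg = {ratio}: " + ", ".join(f"{p:.2f}" for p in row))
    ```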

  13. Student Sorting and Bias in Value Added Estimation: Selection on Observables and Unobservables. NBER Working Paper No. 14666

    ERIC Educational Resources Information Center

    Rothstein, Jesse

    2009-01-01

    Non-random assignment of students to teachers can bias value added estimates of teachers' causal effects. Rothstein (2008a, b) shows that typical value added models indicate large counter-factual effects of 5th grade teachers on students' 4th grade learning, indicating that classroom assignments are far from random. This paper quantifies the…

  14. Vitamin D3 supplementation increases spine bone mineral density in adolescents and young adults with HIV infection being treated with tenofovir disoproxil fumarate: a randomized, placebo controlled trial

    USDA-ARS?s Scientific Manuscript database

    Background: Tenofovir disoproxil fumarate (TDF) decreases bone mineral density (BMD). We hypothesized vitamin D3 (VITD3) would increase BMD in adolescents/young adults receiving TDF. Methods: Randomized double-blind placebo-controlled trial of directly observed VITD3 50,000 IU vs. placebo every 4 ...

  15. Symmetry Breaking in a random passive scalar

    NASA Astrophysics Data System (ADS)

    Kilic, Zeliha; McLaughlin, Richard; Camassa, Roberto

    2017-11-01

    We consider the evolution of a decaying passive scalar in the presence of a Gaussian white-noise fluctuating shear flow. We focus on deterministic initial data and establish the short-, intermediate-, and long-time symmetry properties of the evolving pointwise probability measure for the random passive scalar. Analytical results are compared directly to Monte Carlo simulations. Time permitting, we will compare the predictions to experimental observations.

  16. The Impact of Including Husbands in Antenatal Health Education Services on Maternal Health Practices in Urban Nepal: Results from a Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Mullany, Britta C.; Becker, S.; Hindin, M. J.

    2007-01-01

    Observational studies suggest that including men in reproductive health interventions can enhance positive health outcomes. A randomized controlled trial was designed to test the impact of involving male partners in antenatal health education on maternal health care utilization and birth preparedness in urban Nepal. In total, 442 women seeking…

  17. The impact of low-dose aspirin on preterm birth: secondary analysis of a randomized controlled trial.

    PubMed

    Allshouse, A A; Jessel, R H; Heyborne, K D

    2016-06-01

    The objective of this study was to determine whether low-dose aspirin (LDA) reduced the rate of preterm birth (PTB) in a cohort of women at high risk for preeclampsia. This was a secondary analysis of the Maternal-Fetal Medicine Units High-Risk Aspirin trial. Preterm births were categorized by phenotype: indicated, spontaneous, or due to preterm premature rupture of membranes (PPROM). Of 1789 randomized women, 30.5% delivered before 37 weeks (18.5% indicated, 5.8% spontaneous and 6.2% following PPROM). Among women randomized to LDA, we observed a trend favoring fewer PTBs due to spontaneous preterm labor and PPROM (odds ratio (OR) 0.826 (0.620, 1.099)); the incidence of indicated PTBs appeared unchanged (OR 0.999 (0.787, 1.268)). Although not reaching significance, the observed effect size was similar to that in other studies of both low- and high-risk women. These results support findings from other studies assessing LDA as a PTB prevention strategy.

  18. Topical Allium ampeloprasum subsp Iranicum (Leek) extract cream in patients with symptomatic hemorrhoids: a pilot randomized and controlled clinical trial.

    PubMed

    Mosavat, Seyed Hamdollah; Ghahramani, Leila; Sobhani, Zahra; Haghighi, Ehsan Rahmanian; Heydari, Mojtaba

    2015-04-01

    Allium ampeloprasum subsp iranicum (Leek) has been traditionally used in antihemorrhoidal topical herbal formulations. This study aimed to evaluate its safety and efficacy in a pilot randomized controlled clinical trial. Twenty patients with symptomatic hemorrhoids were randomly allocated to receive the topical leek extract cream or standard antihemorrhoid cream for 3 weeks. The patients were evaluated before and after the intervention in terms of pain, defecation discomfort, bleeding severity, anal itching severity, and reported adverse events. A significant decrease was observed in the grade of bleeding severity and defecation discomfort in both the leek and antihemorrhoid cream groups after the intervention, while no significant change was observed in pain scores. There was no significant difference between the leek and antihemorrhoid cream groups with regard to mean changes in outcome measures. This pilot study showed that the topical use of leek cream can be as effective as a standard antihemorrhoid cream. © The Author(s) 2015.

  19. FAST TRACK COMMUNICATION Local randomness in Hardy's correlations: implications from the information causality principle

    NASA Astrophysics Data System (ADS)

    Rajjak Gazi, MD.; Rai, Ashutosh; Kunkri, Samir; Rahaman, Ramij

    2010-11-01

    The study of non-local correlations in terms of Hardy's argument has been quite popular in quantum mechanics. Hardy's non-locality argument depends on some kind of asymmetry, but a two-qubit maximally entangled state, being symmetric, does not exhibit this kind of non-locality. Here we ask the following question: can this feature be explained by some principle outside quantum mechanics? The no-signaling condition does not provide a solution. But, interestingly, the information causality principle (Pawlowski et al 2009 Nature 461 1101) offers an explanation. It shows that any generalized probability theory which gives completely random results for local dichotomic observables cannot provide Hardy's non-local correlations if it is restricted by a necessary condition for respecting the information causality principle. In fact, the applied necessary condition imposes even more restrictions on the local randomness of the measured observables. Still, there are some restrictions imposed by quantum mechanics that are not reproduced from the considered information causality condition.

  20. Kindness in the blood: A randomized controlled trial of the gene regulatory impact of prosocial behavior.

    PubMed

    Nelson-Coffey, S Katherine; Fritz, Megan M; Lyubomirsky, Sonja; Cole, Steve W

    2017-07-01

    Prosocial behavior is linked to longevity, but few studies have experimentally manipulated prosocial behavior to identify the causal mechanisms underlying this association. One possible mediating pathway involves changes in gene expression that may subsequently influence disease development or resistance. In the current study, we examined changes in a leukocyte gene expression profile known as the Conserved Transcriptional Response to Adversity (CTRA) in 159 adults who were randomly assigned for 4 weeks to engage in prosocial behavior directed towards specific others, prosocial behavior directed towards the world in general, self-focused kindness, or a neutral control task. Those randomized to prosocial behavior towards specific others demonstrated improvements (i.e., reductions) in leukocyte expression of CTRA indicator genes. No significant changes in CTRA gene expression were observed in the other 3 conditions. These findings suggest that prosocial behavior can causally impact leukocyte gene expression profiles in ways that might potentially help explain the previously observed health advantages associated with social ties. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Rotational diffusion of a molecular cat

    NASA Astrophysics Data System (ADS)

    Katz-Saporta, Ori; Efrati, Efi

    We show that a simple isolated system can perform a rotational random walk on account of internal excitations alone. We consider the classical dynamics of a "molecular cat": a triatomic molecule connected by three harmonic springs with non-zero rest lengths, suspended in free space. In this system, much like for falling cats, the angular momentum constraint is non-holonomic, allowing for rotations with zero overall angular momentum. The geometric nonlinearities arising from the non-zero rest lengths of the springs suffice to break integrability and lead to chaotic dynamics. The coupling of the non-integrability of the system and its non-holonomic nature results in an angular random walk of the molecule. We study the properties and dynamics of this angular motion analytically and numerically. For low-energy excitations the system displays normal-mode-like motion, while for high enough excitation energy we observe a regular random walk. In between, at intermediate energies, we observe an angular Lévy-walk type motion associated with a fractional diffusion coefficient interpolating between the two regimes.

  2. Multi-photon excited coherent random laser emission in ZnO powders

    NASA Astrophysics Data System (ADS)

    Tolentino Dominguez, Christian; Gomes, Maria De A.; Macedo, Zélia S.; de Araújo, Cid B.; Gomes, Anderson S. L.

    2014-11-01

    We report the observation and analysis of anti-Stokes coherent random laser (RL) emission from zinc oxide (ZnO) powders excited by one-, two- or three-photon femtosecond laser radiation. The ZnO powders were produced via a novel proteic sol-gel, low-cost and environmentally friendly route using coconut water in the polymerization step of the metal precursor. One- and two-photon excitation at 354 nm and 710 nm, respectively, generated single-band emissions centred at about 387 nm. For three-photon excitation, the emission spectra showed a strong ultraviolet (UV) band (380-396 nm) attributed to direct three-photon absorption from the valence band to the conduction band. The presence of an intensity threshold and a bandwidth narrowing of the UV band from about 20 to 4 nm are clear evidence of RL action. The observation of multiple sub-nanometre narrow peaks in the emission spectra for excitation above the RL threshold is consistent with random lasing by coherent feedback.

  3. Random versus maximum entropy models of neural population activity

    NASA Astrophysics Data System (ADS)

    Ferrari, Ulisse; Obuchi, Tomoyuki; Mora, Thierry

    2017-04-01

    The principle of maximum entropy provides a useful method for inferring statistical mechanics models from observations in correlated systems, and is widely used in a variety of fields where accurate data are available. While the assumptions underlying maximum entropy are intuitive and appealing, its adequacy for describing complex empirical data has been little studied in comparison to alternative approaches. Here, data from the collective spiking activity of retinal neurons is reanalyzed. The accuracy of the maximum entropy distribution constrained by mean firing rates and pairwise correlations is compared to a random ensemble of distributions constrained by the same observables. For most of the tested networks, maximum entropy approximates the true distribution better than the typical or mean distribution from that ensemble. This advantage improves with population size, with groups as small as eight being almost always better described by maximum entropy. Failure of maximum entropy to outperform random models is found to be associated with strong correlations in the population.
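
    A minimal illustrative sketch (not from the paper): fitting a pairwise maximum-entropy model to binary spike data by exact enumeration and gradient ascent, for a population small enough to sum over all states. The toy data and the field/coupling names h and J are assumptions for illustration only.

        import itertools
        import numpy as np

        rng = np.random.default_rng(0)
        N = 5                                           # toy population size (exact enumeration feasible)
        spikes = (rng.random((2000, N)) < 0.2).astype(float)   # stand-in binary spike data

        # Empirical constraints: mean firing rates and pairwise moments <s_i s_j>
        emp_mean = spikes.mean(axis=0)
        emp_corr = (spikes.T @ spikes) / len(spikes)

        states = np.array(list(itertools.product([0, 1], repeat=N)), dtype=float)

        def model_moments(h, J):
            # Exact moments of P(s) proportional to exp(h.s + s.J.s) over all 2^N states
            energies = states @ h + np.einsum('si,ij,sj->s', states, J, states)
            p = np.exp(energies - energies.max())
            p /= p.sum()
            return p @ states, states.T @ (states * p[:, None])

        h = np.zeros(N)
        J = np.zeros((N, N))
        for _ in range(2000):                           # plain gradient ascent on the log-likelihood
            m, c = model_moments(h, J)
            h += 0.1 * (emp_mean - m)
            dJ = np.triu(emp_corr - c, 1)               # update only off-diagonal couplings
            J += 0.1 * (dJ + dJ.T)

        print("empirical rates:", np.round(emp_mean, 3))
        print("fitted rates   :", np.round(model_moments(h, J)[0], 3))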

  4. Computer simulations of melts of randomly branching polymers

    NASA Astrophysics Data System (ADS)

    Rosa, Angelo; Everaers, Ralf

    2016-10-01

    Randomly branching polymers with annealed connectivity are model systems for ring polymers and chromosomes. In this context, the branched structure represents transient folding induced by topological constraints. Here we present computer simulations of melts of annealed randomly branching polymers of 3 ≤ N ≤ 1800 segments in d = 2 and d = 3 dimensions. In all cases, we perform a detailed analysis of the observed tree connectivities and spatial conformations. Our results are in excellent agreement with an asymptotic scaling of the average tree size of R ˜ N1/d, suggesting that the trees behave as compact, territorial fractals. The observed swelling relative to the size of ideal trees, R ˜ N1/4, demonstrates that excluded volume interactions are only partially screened in melts of annealed trees. Overall, our results are in good qualitative agreement with the predictions of Flory theory. In particular, we find that the trees swell by the combination of modified branching and path stretching. However, the former effect is subdominant and difficult to detect in d = 3 dimensions.

  5. Observational Research Rigor Alone Does Not Justify Causal Inference

    PubMed Central

    Ejima, Keisuke; Li, Peng; Smith, Daniel L.; Nagy, Tim R.; Kadish, Inga; van Groen, Thomas; Dawson, John A.; Yang, Yongbin; Patki, Amit; Allison, David B.

    2016-01-01

    Background Differing opinions exist on whether associations obtained in observational studies can be reliable indicators of a causal effect if the observational study is sufficiently well controlled and executed. Materials and methods To test this, we conducted two animal observational studies that were rigorously controlled and executed beyond what is achieved in studies of humans. In study 1, we randomized 332 genetically identical C57BL/6J mice into three diet groups with differing food energy allotments and recorded individual self-selected daily energy intake and lifespan. In study 2, 60 male mice (CD1) were paired and divided into two groups for a 2-week feeding regimen. We evaluated the association between weight gain and food consumption. Within each pair, one animal was randomly assigned to an S group in which the animals had free access to food. The second paired animal (R group) was provided exactly the same diet that their S partner ate the day before. Results In study 1, across all three groups, we found a significant negative effect of energy intake on lifespan. However, we found a positive association between food intake and lifespan among the ad libitum feeding group: 29.99 (95% CI: 8.2 to 51.7) days per daily kcal. In study 2, we found a significant (P=0.003) group (randomized vs self-selected)-by-food consumption interaction effect on weight gain. Conclusions At least in nutrition research, associations derived from observational studies may not be reliable indicators of causal effects, even with the most rigorous study designs achievable. PMID:27711975

  6. [Drinking-water type fluorosis treated with acupuncture of reinforcing kidney and activating spleen: a randomized controlled trial].

    PubMed

    Zhao, Xiao-guang; Wu, Zhong-chao; Chen, Zhong-jie; Wang, Jing-jing; Zhou, Jin-cao; Pang, Li; Jiao, Yue; Hu, Jing; Cui, Cheng-bin

    2012-06-01

    To observe the effects of acupuncture for reinforcing the kidney and activating the spleen on urinary fluoride excretion and pain in patients with drinking-water type fluorosis. A randomized controlled, single-blind trial was adopted. Seventy-two cases were randomized into an observation group and a control group, 36 cases in each. In the observation group, acupuncture was applied at Pishu (BL 20), Shenshu (BL 23), Guanyuan (CV 4), Zusanli (ST 36), etc., three treatments a week. In the control group, Calcium Carbonate D3 tablets were prescribed for oral administration, 600 mg each time, twice a day. The duration of treatment was 2 months. The changes in urinary fluoride content and pain score (by VAS) before and after treatment were compared between the two groups. Urinary fluoride excretion increased markedly after treatment in the observation group (P < 0.01) and was significantly higher than in the control group [(11.06 +/- 4.54) mg/L vs. (8.30 +/- 4.14) mg/L, P < 0.05]. After treatment, the VAS score was reduced significantly in both groups (both P < 0.01), and the score in the observation group was markedly lower than that in the control group (1.93 +/- 1.30 vs. 3.47 +/- 2.29, P < 0.01). Acupuncture for reinforcing the kidney and activating the spleen achieves significant efficacy in promoting urinary fluoride excretion and relieving pain in patients with drinking-water type fluorosis, and is superior to oral administration of Calcium Carbonate D3 tablets.

  7. Cosmicflows Constrained Local UniversE Simulations

    NASA Astrophysics Data System (ADS)

    Sorce, Jenny G.; Gottlöber, Stefan; Yepes, Gustavo; Hoffman, Yehuda; Courtois, Helene M.; Steinmetz, Matthias; Tully, R. Brent; Pomarède, Daniel; Carlesi, Edoardo

    2016-01-01

    This paper combines observational data sets and cosmological simulations to generate realistic numerical replicas of the nearby Universe. The latter are excellent laboratories for studies of the non-linear process of structure formation in our neighbourhood. With measurements of radial peculiar velocities in the local Universe (cosmicflows-2) and a newly developed technique, we produce Constrained Local UniversE Simulations (CLUES). To assess the quality of these constrained simulations, we compare them with random simulations as well as with local observations. The cosmic variance, defined as the mean one-sigma scatter of cell-to-cell comparison between two fields, is significantly smaller for the constrained simulations than for the random simulations. Within the inner part of the box where most of the constraints are, the scatter is smaller by a factor of 2 to 3 on a 5 h-1 Mpc scale with respect to that found for random simulations. This one-sigma scatter obtained when comparing the simulated and the observation-reconstructed velocity fields is only 104 ± 4 km s-1, i.e. the linear theory threshold. These two results demonstrate that these simulations are in agreement with each other and with the observations of our neighbourhood. For the first time, simulations constrained with observational radial peculiar velocities resemble the local Universe up to a distance of 150 h-1 Mpc on a scale of a few tens of megaparsecs. When focusing on the inner part of the box, the resemblance with our cosmic neighbourhood extends to a few megaparsecs (<5 h-1 Mpc). The simulations provide a proper large-scale environment for studies of the formation of nearby objects.
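
    A minimal sketch (not from the paper) of the cell-to-cell comparison quoted above: the scatter is taken here as the one-sigma spread of the difference between two velocity fields evaluated on the same grid; the gridded fields below are random stand-ins, not CLUES data.

        import numpy as np

        rng = np.random.default_rng(1)
        # Stand-ins for two velocity fields on the same grid (km/s)
        v_constrained = rng.normal(0.0, 300.0, size=(32, 32, 32))
        v_observed = v_constrained + rng.normal(0.0, 100.0, size=(32, 32, 32))

        # One-sigma cell-to-cell scatter between the two fields
        scatter = np.std(v_constrained - v_observed)
        print(f"cell-to-cell scatter: {scatter:.0f} km/s")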

  8. Cost-efficiency of knowledge creation: randomized controlled trials vs. observational studies.

    PubMed

    Struck, Rafael; Baumgarten, Georg; Wittmann, Maria

    2014-04-01

    This article reviews traditional and current perspectives on randomized, controlled trials (RCTs) and observational studies relative to the economic implications for public healthcare stakeholders. It takes an average of 17 years to bring 14% of original research into clinical practice. Results from high-quality observational studies may complement limited RCTs in primary and secondary literature bases, and enhance the incorporation of sound evidence-based guidelines. Observational findings from comprehensive medical databases may offer valuable clues on the effectiveness and relevance of public healthcare interventions. Major expenditures associated with RCTs relate to recruitment, inappropriate site selection, conduct and reporting. Application of business strategies and economic evaluation tools, in addition to the planning and conduct of RCTs, may enhance clinical trial site performances. Considering the strengths and limitations of each study type, clinical researchers should explore the contextual worthiness of either design in promulgating knowledge. They should focus on quality of conduct and reporting that may allow for the liberation of limited public and private clinical research funding.

  9. Basic Diagnosis and Prediction of Persistent Contrail Occurrence using High-resolution Numerical Weather Analyses/Forecasts and Logistic Regression. Part I: Effects of Random Error

    NASA Technical Reports Server (NTRS)

    Duda, David P.; Minnis, Patrick

    2009-01-01

    Straightforward application of the Schmidt-Appleman contrail formation criteria to diagnose persistent contrail occurrence from numerical weather prediction data is hindered by significant bias errors in the upper tropospheric humidity. Logistic models of contrail occurrence have been proposed to overcome this problem, but basic questions remain about how random measurement error may affect their accuracy. A set of 5000 synthetic contrail observations is created to study the effects of random error in these probabilistic models. The simulated observations are based on distributions of temperature, humidity, and vertical velocity derived from Advanced Regional Prediction System (ARPS) weather analyses. The logistic models created from the simulated observations were evaluated using two common statistical measures of model accuracy, the percent correct (PC) and the Hanssen-Kuipers discriminant (HKD). To convert the probabilistic results of the logistic models into a dichotomous yes/no choice suitable for the statistical measures, two critical probability thresholds are considered. The HKD scores are higher when the climatological frequency of contrail occurrence is used as the critical threshold, while the PC scores are higher when the critical probability threshold is 0.5. For both thresholds, typical random errors in temperature, relative humidity, and vertical velocity are found to be small enough to allow for accurate logistic models of contrail occurrence. The accuracy of the models developed from synthetic data is over 85 percent for both the prediction of contrail occurrence and non-occurrence, although in practice, larger errors would be anticipated.
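
    An illustrative sketch (not the ARPS/logistic pipeline of the paper): computing the percent correct (PC) and Hanssen-Kuipers discriminant (HKD) after dichotomizing probabilistic forecasts at a chosen threshold; the observations and forecast probabilities below are synthetic stand-ins.

        import numpy as np

        def contingency(obs, prob, threshold):
            # Dichotomize probabilistic forecasts at `threshold` and tabulate against observations
            pred = prob >= threshold
            hits = np.sum(pred & obs)
            false_alarms = np.sum(pred & ~obs)
            misses = np.sum(~pred & obs)
            correct_neg = np.sum(~pred & ~obs)
            return hits, false_alarms, misses, correct_neg

        def percent_correct(obs, prob, threshold):
            h, f, m, c = contingency(obs, prob, threshold)
            return (h + c) / (h + f + m + c)

        def hanssen_kuipers(obs, prob, threshold):
            h, f, m, c = contingency(obs, prob, threshold)
            return h / (h + m) - f / (f + c)

        rng = np.random.default_rng(2)
        obs = rng.random(5000) < 0.15                          # stand-in occurrence, climatology ~15%
        prob = np.clip(0.15 + 0.5 * obs + rng.normal(0, 0.2, 5000), 0, 1)   # stand-in model output

        for thr in (0.5, 0.15):                                # 0.5 vs climatological frequency
            print(f"threshold {thr}: PC={percent_correct(obs, prob, thr):.2f}, "
                  f"HKD={hanssen_kuipers(obs, prob, thr):.2f}")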

  10. Random and externally controlled occurrences of Dansgaard-Oeschger events

    NASA Astrophysics Data System (ADS)

    Lohmann, Johannes; Ditlevsen, Peter D.

    2018-05-01

    Dansgaard-Oeschger (DO) events constitute the most pronounced mode of centennial to millennial climate variability of the last glacial period. Since their discovery, many decades of research have been devoted to understand the origin and nature of these rapid climate shifts. In recent years, a number of studies have appeared that report emergence of DO-type variability in fully coupled general circulation models via different mechanisms. These mechanisms result in the occurrence of DO events at varying degrees of regularity, ranging from periodic to random. When examining the full sequence of DO events as captured in the North Greenland Ice Core Project (NGRIP) ice core record, one can observe high irregularity in the timing of individual events at any stage within the last glacial period. In addition to the prevailing irregularity, certain properties of the DO event sequence, such as the average event frequency or the relative distribution of cold versus warm periods, appear to be changing throughout the glacial. By using statistical hypothesis tests on simple event models, we investigate whether the observed event sequence may have been generated by stationary random processes or rather was strongly modulated by external factors. We find that the sequence of DO warming events is consistent with a stationary random process, whereas dividing the event sequence into warming and cooling events leads to inconsistency with two independent event processes. As we include external forcing, we find a particularly good fit to the observed DO sequence in a model in which the average residence time in warm periods is controlled by global ice volume and that in cold periods by boreal summer insolation.

  11. How preview space/time translates into preview cost/benefit for fixation durations during reading.

    PubMed

    Kliegl, Reinhold; Hohenstein, Sven; Yan, Ming; McDonald, Scott A

    2013-01-01

    Eye-movement control during reading depends on foveal and parafoveal information. If the parafoveal preview of the next word is suppressed, reading is less efficient. A linear mixed model (LMM) reanalysis of McDonald (2006) confirmed his observation that preview benefit may be limited to parafoveal words that have been selected as the saccade target. Going beyond the original analyses, in the same LMM, we examined how the preview effect (i.e., the difference in single-fixation duration, SFD, between random-letter and identical preview) depends on the gaze duration on the pretarget word and on the amplitude of the saccade moving the eye onto the target word. There were two key results: (a) The shorter the saccade amplitude (i.e., the larger preview space), the shorter a subsequent SFD with an identical preview; this association was not observed with a random-letter preview. (b) However, the longer the gaze duration on the pretarget word, the longer the subsequent SFD on the target, with the difference between random-letter string and identical previews increasing with preview time. A third pattern, increasing cost of a random-letter string in the parafovea associated with shorter saccade amplitudes, was observed for target gaze durations. Thus, LMMs revealed that preview effects, which are typically summarized under "preview benefit", are a complex mixture of preview cost and preview benefit and vary with preview space and preview time. The consequence for reading is that parafoveal preview may not only facilitate, but also interfere with lexical access.

  12. Why the null matters: statistical tests, random walks and evolution.

    PubMed

    Sheets, H D; Mitchell, C E

    2001-01-01

    A number of statistical tests have been developed to determine what type of dynamics underlie observed changes in morphology in evolutionary time series, based on the pattern of change within the time series. The theory of the 'scaled maximum', the 'log-rate-interval' (LRI) method, and the Hurst exponent all operate on the same principle of comparing the maximum change, or rate of change, in the observed dataset to the maximum change expected of a random walk. Less change in a dataset than expected of a random walk has been interpreted as indicating stabilizing selection, while more change implies directional selection. The 'runs test' in contrast, operates on the sequencing of steps, rather than on excursion. Applications of these tests to computer generated, simulated time series of known dynamical form and various levels of additive noise indicate that there is a fundamental asymmetry in the rate of type II errors of the tests based on excursion: they are all highly sensitive to noise in models of directional selection that result in a linear trend within a time series, but are largely noise immune in the case of a simple model of stabilizing selection. Additionally, the LRI method has a lower sensitivity than originally claimed, due to the large range of LRI rates produced by random walks. Examination of the published results of these tests show that they have seldom produced a conclusion that an observed evolutionary time series was due to directional selection, a result which needs closer examination in light of the asymmetric response of these tests.
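
    A hedged sketch of the excursion-based idea described above (not the published tests themselves): compare the maximum excursion of an observed series with the distribution of maxima produced by unbiased random walks whose step variance matches the data; the series below is invented.

        import numpy as np

        rng = np.random.default_rng(3)

        def max_excursion(series):
            # Largest absolute departure from the starting value
            return np.max(np.abs(series - series[0]))

        # Stand-in observed series: a weak linear trend plus noise (directional-selection-like)
        n = 60
        observed = 0.05 * np.arange(n) + rng.normal(0, 0.3, n)

        # Null model: unbiased random walk with step variance matched to the observed differences
        step_sd = np.std(np.diff(observed))
        null_max = np.array([
            max_excursion(np.cumsum(rng.normal(0, step_sd, n)))
            for _ in range(5000)
        ])

        obs_max = max_excursion(observed)
        p_value = np.mean(null_max >= obs_max)   # fraction of random walks wandering at least as far
        print(f"observed max excursion {obs_max:.2f}, one-sided p = {p_value:.3f}")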

  13. Changing Friend Selection in Middle School: A Social Network Analysis of a Randomized Intervention Study Designed to Prevent Adolescent Problem Behavior.

    PubMed

    DeLay, Dawn; Ha, Thao; Van Ryzin, Mark; Winter, Charlotte; Dishion, Thomas J

    2016-04-01

    Adolescent friendships that promote problem behavior are often chosen in middle school. The current study examines the unintended impact of a randomized school-based intervention on the selection of friends in middle school, as well as on observations of deviant talk with friends 5 years later. Participants included 998 middle school students (526 boys and 472 girls) recruited at the onset of middle school (age 11-12 years) from three public middle schools participating in the Family Check-up model intervention. The current study focuses only on the effects of the SHAPe curriculum-one level of the Family Check-up model-on friendship choices. Participants nominated friends and completed measures of deviant peer affiliation. Approximately half of the sample (n = 500) was randomly assigned to the intervention, and the other half (n = 498) comprised the control group within each school. The results indicate that the SHAPe curriculum affected friend selection within school 1 but not within schools 2 or 3. The effects of friend selection in school 1 translated into reductions in observed deviancy training 5 years later (age 16-17 years). By coupling longitudinal social network analysis with a randomized intervention study, the current findings provide initial evidence that a randomized public middle school intervention can disrupt the formation of deviant peer groups and diminish levels of adolescent deviance 5 years later.

  14. Anterior inferior plating versus superior plating for clavicle fracture: a meta-analysis.

    PubMed

    Ai, Jie; Kan, Shun-Li; Li, Hai-Liang; Xu, Hong; Liu, Yang; Ning, Guang-Zhi; Feng, Shi-Qing

    2017-04-18

    The position of plate fixation for clavicle fracture remains controversial. Our objective was to perform a comprehensive review of the literature and quantify the surgical parameters and clinical indexes between the anterior inferior plating and superior plating for clavicle fracture. PubMed, EMBASE, and the Cochrane Library were searched for randomized and non-randomized studies that compared the anterior inferior plating with the superior plating for clavicle fracture. The relative risk or standardized mean difference with 95% confidence interval was calculated using either a fixed- or random-effects model. Four randomized controlled trials and eight observational studies were identified to compare the surgical parameters and clinical indexes. For the surgical parameters, the anterior inferior plating group was better than the superior plating group in operation time and blood loss (P < 0.05). Furthermore, in terms of clinical indexes, the anterior inferior plating was superior to the superior plating in reducing the union time, and the two kinds of plate fixation methods were comparable in Constant score, and the rate of infection, nonunion, and complications (P > 0.05). Based on the current evidence, the anterior inferior plating may reduce the blood loss, the operation and union time, but no differences were observed in Constant score, and the rate of infection, nonunion, and complications between the two groups. Given that some of the studies have low quality, more high-quality randomized controlled trials should be conducted to further verify the findings.
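
    An illustrative sketch of fixed- versus random-effects pooling of log risk ratios with the DerSimonian-Laird estimate of between-study variance, the generic machinery behind meta-analyses of this kind; the per-study values below are invented and are not the studies in this review.

        import numpy as np

        # Invented per-study log risk ratios and their variances (illustration only)
        log_rr = np.array([-0.20, -0.05, -0.30, 0.10, -0.15])
        var = np.array([0.02, 0.04, 0.03, 0.05, 0.02])

        # Fixed-effect (inverse-variance) pooling
        w = 1.0 / var
        fixed = np.sum(w * log_rr) / np.sum(w)

        # DerSimonian-Laird between-study variance tau^2, then random-effects pooling
        Q = np.sum(w * (log_rr - fixed) ** 2)
        df = len(log_rr) - 1
        tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
        w_re = 1.0 / (var + tau2)
        random_effects = np.sum(w_re * log_rr) / np.sum(w_re)

        i2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0   # I^2 heterogeneity
        print(f"pooled RR fixed = {np.exp(fixed):.2f}, random = {np.exp(random_effects):.2f}, "
              f"I^2 = {i2:.0f}%")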

  15. Polarization sensitivity of ordered and random antireflective surface structures in silica and spinel

    NASA Astrophysics Data System (ADS)

    Frantz, J. A.; Selby, J.; Busse, L. E.; Shaw, L. B.; Aggarwal, I. D.; Sanghera, J. S.

    2018-02-01

    Both ordered and random anti-reflective surface structures (ARSS) have been shown to increase the transmission of an optical surface to >99.9%. These structures are of great interest as an alternative to traditional thin film anti-reflection (AR) coatings for a variety of reasons. Unlike traditional AR coatings, they are patterned directly into the surface of an optic rather than deposited on its surface and are thus not prone to the delamination under thermal cycling that can occur with thin film coatings. Their laser-induced damage thresholds can also be considerably higher. In addition, they provide AR performance over a larger spectral and angular range. It has been previously demonstrated that random ARSSs in silica are remarkably insensitive to incident polarization, with nearly zero variation in transmittance with respect to polarization of the incident beam at fixed wavelength for angles of incidence up to at least 30°. In this work, we evaluate polarization sensitivity of ARSS as a function of wavelength for both random and ordered ARSS. We demonstrate that ordered ARSS is significantly more sensitive to polarization than random ARSS and explain the reason for this difference. In the case of ordered ARSS, we observe significant differences as a function of wavelength, with the transmittance of s- and p-polarized light diverging near the diffraction edge. We present results for both silica and spinel samples and discuss differences observed for these two sets of samples.

  16. Relating the variation of secondary structure of gelatin at fish oil-water interface to adsorption kinetics, dynamic interfacial tension and emulsion stability.

    PubMed

    Liu, Huihua; Wang, Bo; Barrow, Colin J; Adhikari, Benu

    2014-01-15

    The objectives of this study were to quantify the relationship between secondary structure of gelatin and its adsorption at the fish-oil/water interface and to quantify the implication of the adsorption on the dynamic interfacial tension (DST) and emulsion stability. The surface hydrophobicity of the gelatin solutions decreased when the pH increased from 4.0 to 6.0, while the opposite trend was observed for the viscosity of the solution. The DST values decreased as the pH increased from 4.0 to 6.0, indicating that higher positive charges (measured through zeta potential) in the gelatin solution tended to result in higher DST values. The adsorption kinetics of the gelatin solution was examined through the calculated diffusion coefficients (Deff). The addition of acid promoted the random coil and β-turn structures at the expense of α-helical structure. The addition of NaOH decreased the β-turn and increased the α-helix and random coil. The decrease in the random coil and triple helix structures in the gelatin solution resulted in increased Deff values. The highest diffusion coefficients, the highest emulsion stability and the lowest amount of random coil and triple helix structures were observed at pH=4.8. The lowest amount of random coil and triple helix structures in the interfacial protein layer correlated with the highest stability of the emulsion (highest ESI value). The lower amount of random coil and triple helix structures allowed higher coverage of the oil-water interface by relatively highly ordered secondary structure of gelatin. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Discovery of Non-random Spatial Distribution of Impacts in the Stardust Cometary Collector

    NASA Technical Reports Server (NTRS)

    Horz, Friedrich; Westphal, Andrew J.; Gainsforth, Zack; Borg, Janet; Djouadi, Zahia; Bridges, John; Franchi, Ian; Brownlee, Donald E.; Cheng, Andrew F.; Clark, Benton C.; et al.

    2007-01-01

    We report the discovery that impacts in the Stardust cometary collector are not distributed randomly in the collecting media, but appear to be clustered on scales smaller than 10 cm. We also report the discovery of at least two populations of oblique tracks. We evaluated several hypotheses that could explain the observations. No hypothesis was consistent with all the observations, but the preponderance of evidence points toward at least one impact on the central Whipple shield of the spacecraft as the origin of both clustering and low-angle oblique tracks. High-angle oblique tracks unambiguously originate from a non-cometary impact on the spacecraft bus just forward of the collector.

  18. Establishing the kinetics of ballistic-to-diffusive transition using directional statistics

    NASA Astrophysics Data System (ADS)

    Liu, Pai; Heinson, William R.; Sumlin, Benjamin J.; Shen, Kuan-Yu; Chakrabarty, Rajan K.

    2018-04-01

    We establish the kinetics of ballistic-to-diffusive (BD) transition observed in two-dimensional random walk using directional statistics. Directional correlation is parameterized using the walker's turning angle distribution, which follows the commonly adopted wrapped Cauchy distribution (WCD) function. During the BD transition, the concentration factor (ρ) governing the WCD shape is observed to decrease from its initial value. We next analytically derive the relationship between effective ρ and time, which essentially quantifies the BD transition rate. The prediction of our kinetic expression agrees well with the empirical datasets obtained from correlated random walk simulation. We further connect our formulation with the conventionally used scaling relationship between the walker's mean-square displacement and time.
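
    A minimal sketch of the setup described above: a two-dimensional correlated random walk whose turning angles follow a wrapped Cauchy distribution with concentration rho, showing the mean-square displacement crossing over from ballistic to diffusive scaling. The parameter values are assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(4)

        def wrapped_cauchy(rho, size):
            # Wrap a Cauchy variate with scale -ln(rho) onto the circle (rho = concentration factor)
            gamma = -np.log(rho)
            return (gamma * np.tan(np.pi * (rng.random(size) - 0.5))) % (2 * np.pi)

        def walk(rho, n_steps, n_walkers):
            turns = wrapped_cauchy(rho, (n_walkers, n_steps))
            headings = np.cumsum(turns, axis=1)          # heading = accumulated turning angles
            x = np.cumsum(np.cos(headings), axis=1)
            y = np.cumsum(np.sin(headings), axis=1)
            return x, y

        n_steps = 5000
        x, y = walk(rho=0.99, n_steps=n_steps, n_walkers=300)
        msd = np.mean(x ** 2 + y ** 2, axis=0)           # mean-square displacement vs time

        t = np.arange(1, n_steps + 1)
        early = np.polyfit(np.log(t[1:20]), np.log(msd[1:20]), 1)[0]      # ~2 means ballistic
        late = np.polyfit(np.log(t[-1000:]), np.log(msd[-1000:]), 1)[0]   # ~1 means diffusive
        print(f"MSD log-log slope, early steps: {early:.2f}; late steps: {late:.2f}")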

  19. On the existence, uniqueness, and asymptotic normality of a consistent solution of the likelihood equations for nonidentically distributed observations: Applications to missing data problems

    NASA Technical Reports Server (NTRS)

    Peters, C. (Principal Investigator)

    1980-01-01

    A general theorem is given which establishes the existence and uniqueness of a consistent solution of the likelihood equations given a sequence of independent random vectors whose distributions are not identical but have the same parameter set. In addition, it is shown that the consistent solution is an MLE and that it is asymptotically normal and efficient. Two applications are discussed: one in which independent observations of a normal random vector have missing components, and the other in which the parameters in a mixture from an exponential family are estimated using independent homogeneous sample blocks of different sizes.

  20. Laser-induced rocket force on a microparticle in a complex (dusty) plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nosenko, V.; Ivlev, A. V.; Morfill, G. E.

    2010-12-15

    The interaction of a focused powerful laser beam with micron-sized melamine formaldehyde (MF) particles was studied experimentally. The microspheres had a thin palladium coating on their surface and were suspended in a radio frequency argon plasma as a single layer (plasma crystal). A particle hit by the laser beam usually accelerated in the direction of the laser beam, consistent with the radiation pressure force mechanism. However, random-direction acceleration up to speeds on the order of 1 m/s was sometimes observed. A rocket-force mechanism is proposed to account for the random-direction acceleration. A similar, but much less pronounced, effect was also observed for MF particles without palladium coating.

  1. Error Sources in Asteroid Astrometry

    NASA Technical Reports Server (NTRS)

    Owen, William M., Jr.

    2000-01-01

    Asteroid astrometry, like any other scientific measurement process, is subject to both random and systematic errors, not all of which are under the observer's control. To design an astrometric observing program or to improve an existing one requires knowledge of the various sources of error, how different errors affect one's results, and how various errors may be minimized by careful observation or data reduction techniques.

  2. Identifying Conditions That Support Causal Inference in Observational Studies in Education: Empirical Evidence from within Study Comparisons

    ERIC Educational Resources Information Center

    Hallberg, Kelly

    2013-01-01

    This dissertation is a collection of three papers that employ empirical within study comparisons (WSCs) to identify conditions that support causal inference in observational studies. WSC studies empirically estimate the extent to which a given observational study reproduces the result of a randomized clinical trial (RCT) when both share the same…

  3. Assessing the Accuracy of Classwide Direct Observation Methods: Two Analyses Using Simulated and Naturalistic Data

    ERIC Educational Resources Information Center

    Dart, Evan H.; Radley, Keith C.; Briesch, Amy M.; Furlow, Christopher M.; Cavell, Hannah J.

    2016-01-01

    Two studies investigated the accuracy of eight different interval-based group observation methods that are commonly used to assess the effects of classwide interventions. In Study 1, a Microsoft Visual Basic program was created to simulate a large set of observational data. Binary data were randomly generated at the student level to represent…

  4. The random walk of a drilling laser beam

    NASA Technical Reports Server (NTRS)

    Anthony, T. R.

    1980-01-01

    The disregistry of holes drilled with a pulse laser beam in 330-micron-thick single-crystal silicon-on-sapphire wafers is examined. The exit positions of the holes were displaced from the hole entrance positions on the opposing face of the wafer, and this random displacement increased with the number of laser pulses required. A model in which the bottom of the drill hole experiences small random displacements during each laser pulse is used to describe the experimental observations. It is shown that the average random displacement caused by each pulse is only a few percent of the hole diameter and can be reduced by using as few laser pulses as necessary while avoiding the cracking and spalling of the wafer that occur with a hole drilled with a single pulse.
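
    A toy version of the model described above (parameter values are assumptions, not measurements from the paper): each pulse displaces the hole bottom by a small random lateral step, so the root-mean-square exit offset grows with the square root of the number of pulses.

        import numpy as np

        rng = np.random.default_rng(5)

        hole_diameter = 25.0          # micrometres, assumed for illustration
        step = 0.03 * hole_diameter   # "a few percent of the hole diameter" per pulse
        n_pulses = 200
        n_holes = 2000

        # Each pulse displaces the hole bottom by a small random 2D step
        dx = rng.normal(0, step, (n_holes, n_pulses))
        dy = rng.normal(0, step, (n_holes, n_pulses))
        exit_offset = np.hypot(dx.sum(axis=1), dy.sum(axis=1))

        rms = np.sqrt(np.mean(exit_offset ** 2))
        print(f"RMS exit offset after {n_pulses} pulses: {rms:.1f} um "
              f"(expected ~ step*sqrt(2*N) = {step * np.sqrt(2 * n_pulses):.1f} um)")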

  5. In Search of Meaning: Are School Rampage Shootings Random and Senseless Violence?

    PubMed

    Madfis, Eric

    2017-01-02

    This article discusses Joel Best's (1999) notion of random violence and applies his concepts of pointlessness, patternlessness, and deterioration to the reality about multiple-victim school shootings gleaned from empirical research about the phenomenon. Best describes how violence is rarely random, as scholarship reveals myriad observable patterns, lots of discernible motives and causes, and often far too much fear-mongering over how bad society is getting and how violent we are becoming. In contrast, it is vital that the media, scholars, and the public better understand crime patterns, criminal motivations, and the causes of fluctuating crime rates. As an effort toward such progress, this article reviews the academic literature on school rampage shootings and explores the extent to which these attacks are and are not random acts of violence.

  6. Fluoroquinolones or macrolides in combination with β-lactams in adult patients hospitalized with community acquired pneumonia: a systematic review and meta-analysis.

    PubMed

    Vardakas, K Z; Trigkidis, K K; Falagas, M E

    2017-04-01

    The best treatment option for hospitalized patients with community-acquired pneumonia (CAP) has not been defined. The effectiveness of β-lactam/fluoroquinolone (BLFQ) versus β-lactam/macrolide (BLM) combinations for the treatment of patients with CAP was evaluated. PubMed, Scopus and the Cochrane Library were searched for observational cohort studies, non-randomized and randomized controlled trials providing data for patients with CAP receiving BLM or BLFQ. Mortality was the primary outcome. A meta-analysis was performed. MINORS and GRADE were used for data quality assessment. Seventeen studies (16 684 patients) were included. Randomized trials were not identified. A variety of β-lactams, fluoroquinolones and macrolides were used within and between the studies. Mortality was reported at different time points. The available body of evidence had very low quality. In the analysis of unadjusted data, mortality with BLFQ was higher than with BLM (risk ratio 1.33, 95% CI 1.15-1.54, I² = 28%). BLFQ was associated with higher mortality regardless of the study design, mortality recording time, study period and study BLM group mortality. BLFQ was associated with higher mortality in American but not European studies. No difference was observed in patients with bacteraemia and septic shock. In the meta-analysis of adjusted mortality data, a non-significant difference between the two regimens was observed (eight studies, adjusted risk ratio 1.26, 95% CI 0.95-1.67, I² = 43%). In the absence of data from randomized controlled trials recommendations cannot be made for or against either of the studied regimens in this group of hospitalized patients with CAP. Well designed randomized controlled trials comparing the two regimens are warranted. Copyright © 2016 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.

  7. Safety of polyethylene glycol 3350 solution in chronic constipation: randomized, placebo-controlled trial.

    PubMed

    McGraw, Thomas

    2016-01-01

    To evaluate the safety and tolerability of aqueous solution concentrate (ASC) of polyethylene glycol (PEG) 3350 in patients with functional constipation. The patients who met Rome III diagnostic criteria for functional constipation were randomized in this multicenter, randomized, placebo-controlled, single-blind study to receive once daily dose of PEG 3350 (17 g) ASC or placebo solution for 14 days. The study comprised a screening period (visit 1), endoscopy procedure (visits 2 and 3), and followup telephone calls 30 days post-treatment. Safety end points included adverse events (AEs), clinical laboratory evaluations, vital signs, and others. The primary end points were the proportion of patients with abnormalities of the oral and esophageal mucosa, detected by visual and endoscopic examination of the oral cavity and esophagus, respectively, compared with placebo. A secondary objective was to compare the safety and tolerability of ASC by evaluating AEs or adverse drug reactions. A total of 65 patients were enrolled in this study, 31 were randomized to PEG 3350 ASC and 34 were randomized to placebo, of which 62 patients completed the study. No patients in either group showed abnormalities in inflammation of the oral mucosa during visit 2 (before treatment) or visit 3 (after treatment). Fewer abnormalities of the esophageal mucosa were observed in the PEG 3350 ASC group than in the placebo group on visit 3, with no significant difference in the proportion of abnormalities between the treatment groups. Overall, 40 treatment-emergent AEs were observed in 48.4% of patients treated with PEG 3350 ASC, and 41 treatment-emergent AEs were observed in 55.9% of patients treated with placebo - nonsignificant difference of -7.5% (95% CI: -21.3, 6.3) between treatment groups. No serious AEs or deaths were reported, and no patient discontinued because of an AE. PEG 3350 ASC is safe and well tolerated in patients with functional constipation (NCT01885104).

  8. Safety of polyethylene glycol 3350 solution in chronic constipation: randomized, placebo-controlled trial

    PubMed Central

    McGraw, Thomas

    2016-01-01

    Purpose To evaluate the safety and tolerability of aqueous solution concentrate (ASC) of polyethylene glycol (PEG) 3350 in patients with functional constipation. Patients and methods The patients who met Rome III diagnostic criteria for functional constipation were randomized in this multicenter, randomized, placebo-controlled, single-blind study to receive once daily dose of PEG 3350 (17 g) ASC or placebo solution for 14 days. The study comprised a screening period (visit 1), endoscopy procedure (visits 2 and 3), and followup telephone calls 30 days post-treatment. Safety end points included adverse events (AEs), clinical laboratory evaluations, vital signs, and others. The primary end points were the proportion of patients with abnormalities of the oral and esophageal mucosa, detected by visual and endoscopic examination of the oral cavity and esophagus, respectively, compared with placebo. A secondary objective was to compare the safety and tolerability of ASC by evaluating AEs or adverse drug reactions. Results A total of 65 patients were enrolled in this study, 31 were randomized to PEG 3350 ASC and 34 were randomized to placebo, of which 62 patients completed the study. No patients in either group showed abnormalities in inflammation of the oral mucosa during visit 2 (before treatment) or visit 3 (after treatment). Fewer abnormalities of the esophageal mucosa were observed in the PEG 3350 ASC group than in the placebo group on visit 3, with no significant difference in the proportion of abnormalities between the treatment groups. Overall, 40 treatment-emergent AEs were observed in 48.4% of patients treated with PEG 3350 ASC, and 41 treatment-emergent AEs were observed in 55.9% of patients treated with placebo – nonsignificant difference of −7.5% (95% CI: −21.3, 6.3) between treatment groups. No serious AEs or deaths were reported, and no patient discontinued because of an AE. Conclusion PEG 3350 ASC is safe and well tolerated in patients with functional constipation (NCT01885104). PMID:27486340

  9. Random covering of the circle: the configuration-space of the free deposition process

    NASA Astrophysics Data System (ADS)

    Huillet, Thierry

    2003-12-01

    Consider a circle of circumference 1. Throw at random n points, sequentially, on this circle and append clockwise an arc (or rod) of length s to each such point. The resulting random set (the free gas of rods) is a collection of a random number of clusters with random sizes. It models a free deposition process on a 1D substrate. For such processes, we shall consider the occurrence times (number of rods) and probabilities, as n grows, of the following configurations: those avoiding rod overlap (the hard-rod gas), those for which the largest gap is smaller than rod length s (the packing gas), those (parking configurations) for which hard rod and packing constraints are both fulfilled and covering configurations. Special attention is paid to the statistical properties of each such (rare) configuration in the asymptotic density domain when ns = ρ, for some finite density ρ of points. Using results from spacings in the random division of the circle, explicit large deviation rate functions can be computed in each case from state equations. Lastly, a process consisting of selecting at random one of these specific equilibrium configurations (called the observable) can be modelled. When particularized to the parking model, this system produces parking configurations differently from Rényi's random sequential adsorption model.
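
    A hedged Monte Carlo sketch of the free deposition process: throw n arc start points uniformly on a circle of circumference 1 and estimate the probabilities of the overlap-free (hard-rod) and covering configurations from the spacings; parameter values are illustrative, and the packing/parking constraints discussed in the paper are omitted for brevity.

        import numpy as np

        rng = np.random.default_rng(6)

        def spacings(n):
            # Clockwise spacings between n points thrown uniformly on a circle of circumference 1
            starts = np.sort(rng.random(n))
            return np.diff(np.append(starts, starts[0] + 1.0))

        def config_probs(n, s, trials=20000):
            hard_rod = covering = 0
            for _ in range(trials):
                g = spacings(n)
                hard_rod += np.all(g >= s)   # no two rods of length s overlap
                covering += np.all(g <= s)   # every point of the circle lies under some rod
            return hard_rod / trials, covering / trials

        p_hard, _ = config_probs(n=10, s=0.02)    # sparse regime (rho = 0.2): overlap-free is common
        _, p_cover = config_probs(n=10, s=0.20)   # dense regime (rho = 2.0): coverage becomes possible
        print(f"P(hard-rod) ~ {p_hard:.3f} at rho=0.2, P(covering) ~ {p_cover:.3f} at rho=2.0")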

  10. [Three-dimensional parallel collagen scaffold promotes tendon extracellular matrix formation].

    PubMed

    Zheng, Zefeng; Shen, Weiliang; Le, Huihui; Dai, Xuesong; Ouyang, Hongwei; Chen, Weishan

    2016-03-01

    To investigate the effects of a three-dimensional parallel collagen scaffold on the cell shape, arrangement and extracellular matrix formation of tendon stem cells. A parallel collagen scaffold was fabricated by a unidirectional freezing technique, while a random collagen scaffold was fabricated by a freeze-drying technique. The effects of the two scaffolds on cell shape and extracellular matrix formation were investigated in vitro by seeding tendon stem/progenitor cells and in vivo by ectopic implantation. Parallel and random collagen scaffolds were produced successfully. The parallel collagen scaffold was more akin to tendon than the random collagen scaffold. Tendon stem/progenitor cells were spindle-shaped and uniformly orientated in the parallel collagen scaffold, while cells on the random collagen scaffold had a disordered orientation. Two weeks after ectopic implantation, cells had nearly the same orientation as the collagen substance. In the parallel collagen scaffold, cells had a parallel arrangement, and more spindly cells were observed; by contrast, cells in the random collagen scaffold were disordered. The parallel collagen scaffold can induce cells to adopt a spindly shape and parallel arrangement, and promote parallel extracellular matrix formation, while the random collagen scaffold induces cells in a random arrangement. The results indicate that the parallel collagen scaffold is an ideal structure to promote tendon repair.

  11. Random lasing actions in self-assembled perovskite nanoparticles

    NASA Astrophysics Data System (ADS)

    Liu, Shuai; Sun, Wenzhao; Li, Jiankai; Gu, Zhiyuan; Wang, Kaiyang; Xiao, Shumin; Song, Qinghai

    2016-05-01

    Solution-based perovskite nanoparticles have been intensively studied in the past few years due to their applications in both photovoltaic and optoelectronic devices. Here, based on the common ground between solution-based perovskite and random lasers, we have studied the mirrorless lasing actions in self-assembled perovskite nanoparticles. After synthesis from a solution, discrete lasing peaks have been observed from optically pumped perovskites without any well-defined cavity boundaries. We have demonstrated that the origin of the random lasing emissions is the scattering between the nanostructures in the perovskite microplates. The obtained quality (Q) factors and thresholds of random lasers are around 500 and 60 μJ/cm2, respectively. Both values are comparable to the conventional perovskite microdisk lasers with polygon-shaped cavity boundaries. From the corresponding studies on laser spectra and fluorescence microscope images, the lasing actions are considered to be random lasing generated by strong multiple scattering in random gain media. In addition to conventional single-photon excitation, due to the strong nonlinear effects of perovskites, two-photon pumped random lasers have also been demonstrated for the first time. We believe this research will find its potential applications in low-cost coherent light sources and biomedical detection.

  12. Reduction of randomness in seismic noise as a short-term precursor to a volcanic eruption.

    PubMed

    Glynn, C C; Konstantinou, K I

    2016-11-24

    Ambient seismic noise is characterized by randomness incurred by the random position and strength of the noise sources as well as the heterogeneous properties of the medium through which it propagates. Here we use ambient noise data recorded prior to the 1996 Gjálp eruption in Iceland in order to show that a reduction of noise randomness can be a clear short-term precursor to volcanic activity. The eruption was preceded on 29 September 1996 by a Mw ~5.6 earthquake that occurred in the caldera rim of the Bárdarbunga volcano. A significant reduction of randomness started occurring 8 days before the earthquake and 10 days before the onset of the eruption. This reduction was observed even at stations more than 100 km away from the eruption site. Randomness increased to its previous levels 160 minutes after the Bárdarbunga earthquake, during which time aftershocks migrated from the Bárdarbunga caldera to a site near the Gjálp eruption fissure. We attribute this precursory reduction of randomness to the lack of higher frequencies (>1 Hz) in the noise wavefield caused by high absorption losses as hot magma ascended in the upper crust.

  13. Reduction of randomness in seismic noise as a short-term precursor to a volcanic eruption

    PubMed Central

    Glynn, C. C.; Konstantinou, K. I.

    2016-01-01

    Ambient seismic noise is characterized by randomness incurred by the random position and strength of the noise sources as well as the heterogeneous properties of the medium through which it propagates. Here we use ambient noise data recorded prior to the 1996 Gjálp eruption in Iceland in order to show that a reduction of noise randomness can be a clear short-term precursor to volcanic activity. The eruption was preceded on 29 September 1996 by a Mw ~5.6 earthquake that occurred in the caldera rim of the Bárdarbunga volcano. A significant reduction of randomness started occurring 8 days before the earthquake and 10 days before the onset of the eruption. This reduction was observed even at stations more than 100 km away from the eruption site. Randomness increased to its previous levels 160 minutes after the Bárdarbunga earthquake, during which time aftershocks migrated from the Bárdarbunga caldera to a site near the Gjálp eruption fissure. We attribute this precursory reduction of randomness to the lack of higher frequencies (>1 Hz) in the noise wavefield caused by high absorption losses as hot magma ascended in the upper crust. PMID:27883050

  14. Random numbers certified by Bell's theorem.

    PubMed

    Pironio, S; Acín, A; Massar, S; de la Giroday, A Boyer; Matsukevich, D N; Maunz, P; Olmschenk, S; Hayes, D; Luo, L; Manning, T A; Monroe, C

    2010-04-15

    Randomness is a fundamental feature of nature and a valuable resource for applications ranging from cryptography and gambling to numerical simulation of physical and biological systems. Random numbers, however, are difficult to characterize mathematically, and their generation must rely on an unpredictable physical process. Inaccuracies in the theoretical modelling of such processes or failures of the devices, possibly due to adversarial attacks, limit the reliability of random number generators in ways that are difficult to control and detect. Here, inspired by earlier work on non-locality-based and device-independent quantum information processing, we show that the non-local correlations of entangled quantum particles can be used to certify the presence of genuine randomness. It is thereby possible to design a cryptographically secure random number generator that does not require any assumption about the internal working of the device. Such a strong form of randomness generation is impossible classically and possible in quantum systems only if certified by a Bell inequality violation. We carry out a proof-of-concept demonstration of this proposal in a system of two entangled atoms separated by approximately one metre. The observed Bell inequality violation, featuring near perfect detection efficiency, guarantees that 42 new random numbers are generated with 99 per cent confidence. Our results lay the groundwork for future device-independent quantum information experiments and for addressing fundamental issues raised by the intrinsic randomness of quantum theory.
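
    An illustrative sketch (a simulation of the quantum prediction, not a device-independent experiment): estimate the CHSH Bell parameter S from simulated measurement outcomes of a maximally entangled pair; S above the classical bound of 2 is the kind of violation that certifies randomness.

        import numpy as np

        rng = np.random.default_rng(7)

        def corr(theta_a, theta_b, n_trials):
            # Simulate +/-1 outcomes for one setting pair; the quantum prediction used here
            # for a maximally entangled state is E = cos(theta_a - theta_b)
            E = np.cos(theta_a - theta_b)
            a = rng.choice([-1, 1], size=n_trials)
            same = rng.random(n_trials) < (1 + E) / 2
            b = np.where(same, a, -a)
            return np.mean(a * b)

        A = [0.0, np.pi / 2]          # Alice's two measurement angles
        B = [np.pi / 4, -np.pi / 4]   # Bob's two measurement angles
        n = 100_000

        S = (corr(A[0], B[0], n) + corr(A[0], B[1], n)
             + corr(A[1], B[0], n) - corr(A[1], B[1], n))
        print(f"estimated CHSH S = {S:.3f} (classical bound 2, quantum maximum ~2.828)")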

  15. Learning Bayesian Networks from Correlated Data

    NASA Astrophysics Data System (ADS)

    Bae, Harold; Monti, Stefano; Montano, Monty; Steinberg, Martin H.; Perls, Thomas T.; Sebastiani, Paola

    2016-05-01

    Bayesian networks are probabilistic models that represent complex distributions in a modular way and have become very popular in many fields. There are many methods to build Bayesian networks from a random sample of independent and identically distributed observations. However, many observational studies are designed using some form of clustered sampling that introduces correlations between observations within the same cluster and ignoring this correlation typically inflates the rate of false positive associations. We describe a novel parameterization of Bayesian networks that uses random effects to model the correlation within sample units and can be used for structure and parameter learning from correlated data without inflating the Type I error rate. We compare different learning metrics using simulations and illustrate the method in two real examples: an analysis of genetic and non-genetic factors associated with human longevity from a family-based study, and an example of risk factors for complications of sickle cell anemia from a longitudinal study with repeated measures.

  16. Real time visualization of quantum walk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miyazaki, Akihide; Hamada, Shinji; Sekino, Hideo

    2014-02-20

    Time evolution of quantum particles like electrons is described by the time-dependent Schrödinger equation (TDSE). The TDSE is regarded as the diffusion equation of electrons with an imaginary diffusion coefficient, and it is solved by a quantum walk (QW), which is regarded as a quantum version of a classical random walk. The diffusion equation is solved in discretized space/time as in the case of a classical random walk, with an additional unitary transformation of the internal degree of freedom typical for quantum particles. We call the QW used to solve the TDSE a Schrödinger walk (SW). To observe the evolution of one quantum particle under a given potential on the attosecond scale, we perform successive computation and visualization of the SW. Using Pure Data programming, we observe the correct behavior of the probability distribution under the given potential in real time for observers at the attosecond scale.
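
    A minimal sketch of a discrete-time quantum walk in one dimension (a standard Hadamard-coin walk, not the Pure Data implementation of the record): an internal coin rotation followed by a coin-conditioned shift, with the probability spread growing linearly in the number of steps rather than as its square root.

        import numpy as np

        steps = 100
        positions = 2 * steps + 1
        # psi[coin, position]; coin 0 = left-moving, coin 1 = right-moving
        psi = np.zeros((2, positions), dtype=complex)
        psi[:, steps] = np.array([1, 1j]) / np.sqrt(2)   # symmetric initial coin state at the origin

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard coin (unitary internal rotation)

        for _ in range(steps):
            psi = H @ psi                                # rotate the internal (coin) degree of freedom
            psi[0] = np.roll(psi[0], -1)                 # coin 0 shifts one site left
            psi[1] = np.roll(psi[1], +1)                 # coin 1 shifts one site right

        prob = np.sum(np.abs(psi) ** 2, axis=0)
        x = np.arange(positions) - steps
        sigma = np.sqrt(np.sum(prob * x ** 2))
        print(f"total probability = {prob.sum():.3f}, spread sigma = {sigma:.1f} "
              f"(grows ~linearly with steps, vs ~sqrt(steps) for a classical walk)")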

  17. Effect on mental health of a participatory intervention to improve psychosocial work environment: a cluster randomized controlled trial among nurses.

    PubMed

    Uchiyama, Ayako; Odagiri, Yuko; Ohya, Yumiko; Takamiya, Tomoko; Inoue, Shigeru; Shimomitsu, Teruichi

    2013-01-01

    Improvement of psychosocial work environment has proved to be valuable for workers' mental health. However, limited evidence is available for the effectiveness of participatory interventions. The purpose of this study was to investigate the effect on mental health among nurses of a participatory intervention to improve the psychosocial work environment. A cluster randomized controlled trial was conducted in hospital settings. A total of 434 nurses in 24 units were randomly allocated to 11 intervention units (n=183) and 13 control units (n=218). A participatory program was provided to the intervention units for 6 months. Depressive symptoms as mental health status and psychosocial work environment, assessed by the Job Content Questionnaire, the Effort-Reward Imbalance Questionnaire, and the Quality Work Competence questionnaire, were measured before and immediately after the 6-month intervention by a self-administered questionnaire. No significant intervention effect was observed for mental health status. However, significant intervention effects were observed in psychosocial work environment aspects, such as Coworker Support (p<0.01) and Goals (p<0.01), and borderline significance was observed for Job Control (p<0.10). It is suggested that a 6-month participatory intervention is effective in improving psychosocial work environment, but not mental health, among Japanese nurses.

  18. Methods for obtaining true particle size distributions from cross section measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lord, Kristina Alyse

    2013-01-01

    Sectioning methods are frequently used to measure grain sizes in materials. These methods do not provide accurate grain sizes for two reasons. First, the sizes of features observed on random sections are always smaller than the true sizes of solid spherical shaped objects, as noted by Wicksell [1]. This is the case because the section very rarely passes through the center of solid spherical shaped objects randomly dispersed throughout a material. The sizes of features observed on random sections are inversely related to the distance of the center of the solid object from the section [1]. Second, on a plane section through the solid material, larger sized features are more frequently observed than smaller ones due to the larger probability for a section to come into contact with the larger sized portion of the spheres than the smaller sized portion. As a result, it is necessary to find a method that takes into account these reasons for inaccurate particle size measurements, while providing a correction factor for accurately determining true particle size measurements. I present a method for deducing true grain size distributions from those determined from specimen cross sections, either by measurement of equivalent grain diameters or linear intercepts.
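
    An illustrative Monte Carlo sketch of the two biases described above, using an invented two-component sphere population: random planes rarely pass through sphere centres (so observed profile radii are too small), and larger spheres are intersected more often (probability proportional to diameter).

        import numpy as np

        rng = np.random.default_rng(8)

        # True (3D) sphere radii: a two-component population, invented for illustration
        true_radii = np.concatenate([rng.normal(5.0, 0.5, 5000), rng.normal(10.0, 1.0, 5000)])
        true_radii = true_radii[true_radii > 0]

        # A random plane intersects a sphere with probability proportional to its diameter
        hit = rng.random(len(true_radii)) < true_radii / true_radii.max()
        hit_radii = true_radii[hit]

        # For an intersected sphere, the cut plane sits at a uniform distance z < R from the centre,
        # so the observed profile radius is r = sqrt(R^2 - z^2)
        z = rng.random(len(hit_radii)) * hit_radii
        observed = np.sqrt(hit_radii ** 2 - z ** 2)

        print(f"true mean radius              : {true_radii.mean():.2f}")
        print(f"mean true radius of hit spheres: {hit_radii.mean():.2f}  (large spheres over-sampled)")
        print(f"observed mean profile radius   : {observed.mean():.2f}  (each profile biased low)")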

  19. Shock Interaction with Random Spherical Particle Beds

    NASA Astrophysics Data System (ADS)

    Neal, Chris; Mehta, Yash; Salari, Kambiz; Jackson, Thomas L.; Balachandar, S. "Bala"; Thakur, Siddharth

    2016-11-01

    In this talk we present results on fully resolved simulations of shock interaction with a randomly distributed bed of particles. Multiple simulations were carried out by varying the number of particles to isolate the effect of volume fraction. The major focus of these simulations was to understand 1) the effect of the shockwave and volume fraction on the forces experienced by the particles, 2) the effect of particles on the shock wave, and 3) fluid mediated particle-particle interactions. The peak drag force for particles at different volume fractions shows a downward trend as the depth of the bed increases. This can be attributed to dissipation of energy as the shockwave travels through the bed of particles. One of the fascinating observations from these simulations was the fluctuations in different quantities due to the presence of multiple particles and their random distribution. These are large simulations with hundreds of particles resulting in a large amount of data. We present statistical analysis of the data and make relevant observations. Average pressure in the computational domain is computed to characterize the strengths of the reflected and transmitted waves. We also present flow field contour plots to support our observations. U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program, as a Cooperative Agreement under the Predictive Science Academic Alliance Program, under Contract No. DE-NA0002378.

  20. Randomly iterated search and statistical competency as powerful inversion tools for deformation source modeling: Application to volcano interferometric synthetic aperture radar data

    NASA Astrophysics Data System (ADS)

    Shirzaei, M.; Walter, T. R.

    2009-10-01

    Modern geodetic techniques provide valuable and near real-time observations of volcanic activity. Characterizing the source of deformation based on these observations has become of major importance in related monitoring efforts. We investigate two random search approaches, simulated annealing (SA) and genetic algorithm (GA), and utilize them in an iterated manner. The iterated approach helps to prevent GA in general and SA in particular from getting trapped in local minima, and it also increases redundancy for exploring the search space. We apply a statistical competency test for estimating the confidence interval of the inversion source parameters, considering their internal interaction through the model, the effect of the model deficiency, and the observational error. Here, we present and test this new randomly iterated search and statistical competency (RISC) optimization method together with GA and SA for the modeling of data associated with volcanic deformations. Following synthetic and sensitivity tests, we apply the improved inversion techniques to two episodes of activity in the Campi Flegrei volcanic region in Italy, observed by the interferometric synthetic aperture radar technique. Inversion of these data allows derivation of deformation source parameters and their associated quality so that we can compare the two inversion methods. The RISC approach was found to be an efficient method in terms of computation time and search results and may be applied to other optimization problems in volcanic and tectonic environments.
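
    A generic sketch of randomly iterated simulated annealing with restarts, the core idea of repeating the search from random starting points and keeping the best misfit to avoid local minima; the two-parameter test misfit below is invented and is not a volcanic deformation source model, and the restart spread is only a crude stand-in for the statistical competency test.

        import numpy as np

        rng = np.random.default_rng(9)

        def misfit(p):
            # Invented multimodal test function standing in for a data-model misfit
            x, y = p
            return (x - 3) ** 2 + (y + 1) ** 2 + 5 * np.sin(3 * x) ** 2 + 5 * np.sin(3 * y) ** 2

        def anneal(start, n_iter=3000, T0=5.0):
            # One simulated-annealing run with a geometric cooling schedule
            p, f = np.array(start, float), misfit(start)
            best_p, best_f = p.copy(), f
            for k in range(n_iter):
                T = T0 * 0.999 ** k
                cand = p + rng.normal(0, 0.3, size=2)
                fc = misfit(cand)
                if fc < f or rng.random() < np.exp(-(fc - f) / T):   # Metropolis acceptance
                    p, f = cand, fc
                    if f < best_f:
                        best_p, best_f = p.copy(), f
            return best_p, best_f

        # Randomly iterated search: restart from random points, keep the overall best solution
        results = [anneal(rng.uniform(-10, 10, size=2)) for _ in range(20)]
        best_p, best_f = min(results, key=lambda r: r[1])
        spread = np.std([f for _, f in results])   # spread across restarts as a crude quality proxy
        print(f"best parameters {np.round(best_p, 2)}, misfit {best_f:.3f}, restart spread {spread:.3f}")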

  1. Modeling the nitrogen cycle one gene at a time

    NASA Astrophysics Data System (ADS)

    Coles, V.; Stukel, M. R.; Hood, R. R.; Moran, M. A.; Paul, J. H.; Satinsky, B.; Zielinski, B.; Yager, P. L.

    2016-02-01

    Marine ecosystem models are lagging the revolution in microbial oceanography. As a result, modeling of the nitrogen cycle has largely failed to leverage new genomic information on nitrogen cycling pathways and the organisms that mediate them. We developed a nitrogen based ecosystem model whose community is determined by randomly assigning functional genes to build each organism's "DNA". Microbes are assigned a size that sets their baseline environmental responses using allometric response curves. These responses are modified by the costs and benefits conferred by each gene in an organism's genome. The microbes are embedded in a general circulation model where environmental conditions shape the emergent population. This model is used to explore whether organisms constructed from randomized combinations of metabolic capability alone can self-organize to create realistic oceanic biogeochemical gradients. Community size spectra and chlorophyll-a concentrations emerge in the model with reasonable fidelity to observations. The model is run repeatedly with randomly-generated microbial communities and each time realistic gradients in community size spectra, chlorophyll-a, and forms of nitrogen develop. This supports the hypothesis that the metabolic potential of a community rather than the realized species composition is the primary factor setting vertical and horizontal environmental gradients. Vertical distributions of nitrogen and transcripts for genes involved in nitrification are broadly consistent with observations. Modeled gene and transcript abundance for nitrogen cycling and processing of land-derived organic material match observations along the extreme gradients in the Amazon River plume, and they help to explain the factors controlling observed variability.

  2. A randomized controlled trial of an intervention program to Brazilian mothers who use corporal punishment.

    PubMed

    Santini, Paolla Magioni; Williams, Lucia C A

    2017-09-01

    This study evaluated a positive parenting program for Brazilian mothers who used corporal punishment with their children. The intervention was conducted in four agencies serving vulnerable children, and at a home replica laboratory at the University. Mothers who admitted using corporal punishment were randomly assigned to an experimental (n=20) or a control group (n=20). The program consisted of 12 individual sessions using one unit from Projeto Parceria (Partnership Project), with specific guidelines and materials on positive parenting, followed by observational sessions of mother-child interaction with live coaching and a video feedback session in the lab. The study used an equivalent-group experimental design with pre/post-test and follow-up, as a randomized controlled trial. Measures involved: Initial Interview; Strengths and Difficulties Questionnaire (SDQ) - parent and child versions; Beck Depression Inventory (BDI); observational sessions with a protocol; and a Program Evaluation by participants. Analysis of mixed models for repeated measures revealed significant positive effects on the BDI and SDQ total scores, as well as lower Conduct Problems and Hyperactivity scores on the SDQ for the experimental group mothers, comparing pre- with post-test. Observational data also indicated significant improvement in positive interaction from the experimental group mothers at post-test, in comparison with controls. No significant results were found, however, in children's observational measures. Limitations of the study include the restricted sample, among others. Implications for future research are suggested. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Large-scale inverse model analyses employing fast randomized data reduction

    NASA Astrophysics Data System (ADS)

    Lin, Youzuo; Le, Ellen B.; O'Malley, Daniel; Vesselinov, Velimir V.; Bui-Thanh, Tan

    2017-08-01

    When the number of observations is large, it is computationally challenging to apply classical inverse modeling techniques. We have developed a new computationally efficient technique for solving inverse problems with a large number of observations (e.g., on the order of 10^7 or greater). Our method, which we call the randomized geostatistical approach (RGA), is built upon the principal component geostatistical approach (PCGA). We employ a data reduction technique combined with the PCGA to improve the computational efficiency and reduce the memory usage. Specifically, we employ a randomized numerical linear algebra technique based on a so-called "sketching" matrix to effectively reduce the dimension of the observations without losing the information content needed for the inverse analysis. In this way, the computational and memory costs for RGA scale with the information content rather than the size of the calibration data. Our algorithm is coded in Julia and implemented in the MADS open-source high-performance computational framework (http://mads.lanl.gov). We apply our new inverse modeling method to invert for a synthetic transmissivity field. Compared to a standard geostatistical approach (GA), our method is more efficient when the number of observations is large. Most importantly, our method is capable of solving larger inverse problems than the standard GA and PCGA approaches. Therefore, our new model inversion method is a powerful tool for solving large-scale inverse problems. The method can be applied in any field and is not limited to hydrogeological applications such as the characterization of aquifer heterogeneity.
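
    A minimal sketch of the sketching idea (not the RGA/MADS implementation, which is written in Julia): compress a tall linear least-squares problem with a random Gaussian sketching matrix and compare the reduced solution to the full one. All sizes below are made-up placeholders:

      import numpy as np

      rng = np.random.default_rng(1)

      # Hypothetical sizes: many observations, few parameters.
      n_obs, n_par = 20_000, 50
      H = rng.standard_normal((n_obs, n_par))          # stand-in linear forward model
      x_true = rng.standard_normal(n_par)
      y = H @ x_true + 0.01 * rng.standard_normal(n_obs)

      # Random Gaussian "sketching" matrix: compress the observations to k rows.
      k = 400
      S = rng.standard_normal((k, n_obs)) / np.sqrt(k)

      x_full, *_ = np.linalg.lstsq(H, y, rcond=None)
      x_sketch, *_ = np.linalg.lstsq(S @ H, S @ y, rcond=None)

      print("max |full - sketched| parameter difference:",
            np.abs(x_full - x_sketch).max())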

  4. Implications of crater distributions on Venus

    NASA Technical Reports Server (NTRS)

    Kaula, W. M.

    1993-01-01

    The horizontal locations of craters on Venus are consistent with randomness. However, (1) randomness does not make crater counts useless for age indications; (2) consistency does not imply necessity or optimality; and (3) horizontal location is not the only reference frame against which to test models. Re (1), the apparent smallness of resurfacing areas means that a region on the order of one percent of the planet with a typical number of craters, 5-15, will have a range of feature ages of several 100 My. Re (2), models of resurfacing somewhat similar to Earth's can be found that are also consistent and more optimal than random: i.e., resurfacing occurring in clusters, that arise and die away in time intervals on the order of 50 My. These agree with the observation that there are more areas of high crater density, and fewer of moderate density, than optimal for random. Re (3), 799 crater elevations were tested; there are more at low elevations and fewer at high elevations than optimal for random: i.e., 54.6 percent below the median. Only one of 40 random sets of 799 was as extreme.
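
    As a rough illustration of the elevation result (not the authors' test, which compared against 40 random sets of 799 craters), one can ask how unlikely the quoted 54.6 percent-below-median figure would be if crater locations sampled elevations evenly, using a simple one-sided binomial test:

      from scipy.stats import binomtest

      n_craters = 799
      frac_below = 0.546                        # value quoted in the abstract
      k_below = round(frac_below * n_craters)   # ~436 craters below the median elevation

      # Under pure randomness, ~50% of craters should lie below the median-elevation
      # contour (ignoring area weighting and other subtleties).
      result = binomtest(k_below, n_craters, p=0.5, alternative="greater")
      print(k_below, "of", n_craters, "below median; one-sided p =", result.pvalue)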

  5. Rigid Plate Fixation Versus Wire Cerclage for Sternotomy After Cardiac Surgery: A Meta-Analysis.

    PubMed

    Tam, Derrick Y; Nedadur, Rashmi; Yu, Monica; Yanagawa, Bobby; Fremes, Stephen E; Friedrich, Jan O

    2018-03-22

    Traditionally, wire cerclage has been used to reapproximate the sternum after sternotomy. Recent evidence suggests that rigid plate fixation for sternal closure may reduce the risk of sternal complications. The Medline and Embase databases were searched from inception to February 2017 for studies that compared rigid plate fixation with wire cerclage for cardiac surgery patients undergoing sternotomy. Random effects meta-analysis compared rates of sternal complications (primary outcome, defined as deep or superficial sternal wound infection, or sternal instability), early mortality, and length of stay (secondary outcomes). Three randomized controlled trials (n = 427) and five unadjusted observational studies (n = 1,025) met inclusion criteria. There was no significant difference in sternal complications with rigid plate fixation at a median of 6 months' follow-up (incidence rate ratio 0.51, 95% confidence interval [CI]: 0.20 to 1.29, p = 0.15) overall, but a decrease when including only patients at high risk for sternal complications (incidence rate ratio 0.23, 95% CI: 0.06 to 0.89, p = 0.03; two observational studies). Perioperative mortality was reduced favoring rigid plate fixation (relative risk 0.40, 95% CI: 0.28 to 0.97, p = 0.04; four observational studies and one randomized controlled trial). Length of stay was similar overall (mean difference -0.77 days, 95% CI: -1.65 to +0.12, p = 0.09), but significantly reduced with rigid plate fixation in the observational studies (mean difference -1.34 days, 95% CI: -2.05 to -0.63, p = 0.0002). This meta-analysis, driven by the results of unmatched observational studies, suggests that rigid plate fixation may lead to reduced sternal complications in patients at high risk for such events, improved perioperative survival, and decreased hospital length of stay. More randomized controlled trials are required to confirm the potential benefits of rigid plate fixation for primary sternotomy closure. Copyright © 2018 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  6. An efficient algorithm for generating random number pairs drawn from a bivariate normal distribution

    NASA Technical Reports Server (NTRS)

    Campbell, C. W.

    1983-01-01

    An efficient algorithm for generating random number pairs from a bivariate normal distribution was developed. Any desired value of the two means, two standard deviations, and correlation coefficient can be selected. Theoretically the technique is exact, and in practice its accuracy is limited only by the quality of the uniform-distribution random number generator, inaccuracies in computer function evaluation, and arithmetic. A FORTRAN routine was written to check the algorithm and good accuracy was obtained. Some small errors in the correlation coefficient were observed to vary in a surprisingly regular manner. A simple model was developed which explained the qualitative aspects of the errors.
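
    The record does not reproduce the algorithm itself; a standard construction with the same properties, exact up to the quality of the underlying uniform generator, combines a Box-Muller transform with the Cholesky factor of the 2 x 2 covariance. A minimal sketch:

      import math
      import random

      def bivariate_normal_pair(mu1, mu2, sigma1, sigma2, rho, rng=random):
          """One (x, y) draw from a bivariate normal with the given means,
          standard deviations and correlation coefficient rho."""
          # Box-Muller: two independent standard normals from two uniforms.
          u1 = 1.0 - rng.random()          # avoid log(0)
          u2 = rng.random()
          z1 = math.sqrt(-2.0 * math.log(u1)) * math.cos(2.0 * math.pi * u2)
          z2 = math.sqrt(-2.0 * math.log(u1)) * math.sin(2.0 * math.pi * u2)
          # Impose the correlation via the Cholesky factor of the covariance.
          x = mu1 + sigma1 * z1
          y = mu2 + sigma2 * (rho * z1 + math.sqrt(1.0 - rho * rho) * z2)
          return x, y

      pairs = [bivariate_normal_pair(0.0, 1.0, 1.0, 2.0, 0.7) for _ in range(10_000)]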

  7. Modelling nematode movement using time-fractional dynamics.

    PubMed

    Hapca, Simona; Crawford, John W; MacMillan, Keith; Wilson, Mike J; Young, Iain M

    2007-09-07

    We use a correlated random walk model in two dimensions to simulate the movement of the slug parasitic nematode Phasmarhabditis hermaphrodita in homogeneous environments. The model incorporates the observed statistical distributions of turning angle and speed derived from time-lapse studies of individual nematode trails. We identify strong temporal correlations between the turning angles and speed that preclude the case of a simple random walk in which successive steps are independent. These correlated random walks are appropriately modelled using an anomalous diffusion model, more precisely using a fractional sub-diffusion model for which the associated stochastic process is characterised by strong memory effects in the probability density function.
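
    A minimal sketch of a correlated random walk of this kind, with illustrative turning-angle (von Mises) and speed (AR(1)) models rather than the distributions fitted to the nematode trails in the study:

      import numpy as np

      rng = np.random.default_rng(2)

      def correlated_random_walk(n_steps, kappa=2.0, speed_rho=0.8):
          """2D correlated random walk: von Mises turning angles and
          AR(1)-correlated speeds (both illustrative choices)."""
          heading, speed = 0.0, 1.0
          pos = np.zeros((n_steps + 1, 2))
          for i in range(n_steps):
              heading += rng.vonmises(0.0, kappa)      # small turns are most likely
              speed = speed_rho * speed + (1.0 - speed_rho) * rng.exponential(1.0)
              pos[i + 1] = pos[i] + speed * np.array([np.cos(heading), np.sin(heading)])
          return pos

      track = correlated_random_walk(500)
      print("net displacement:", np.linalg.norm(track[-1] - track[0]))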

  8. Steroid treatment of acute graft-versus-host disease grade I: a randomized trial.

    PubMed

    Bacigalupo, Andrea; Milone, Giuseppe; Cupri, Alessandra; Severino, Antonio; Fagioli, Franca; Berger, Massimo; Santarone, Stella; Chiusolo, Patrizia; Sica, Simona; Mammoliti, Sonia; Sorasio, Roberto; Massi, Daniela; Van Lint, Maria Teresa; Raiola, Anna Maria; Gualandi, Francesca; Selleri, Carmine; Sormani, Maria Pia; Signori, Alessio; Risitano, Antonio; Bonifazi, Francesca

    2017-12-01

    Patients with acute graft-versus-host disease (GvHD) grade I were randomized to an observation arm (n=85) or to a treatment arm (n=86) consisting of 6-methylprednisolone 1 mg/kg/day, after stratification for age and donor type. The primary end point was development of grade II-IV GvHD. The cumulative incidence of grade II-IV GvHD was 50% in the observation arm and 33% in the treatment arm (P = 0.005). However, grade III-IV GvHD was comparable (13% vs 10%, respectively; P = 0.6), and this was true for sibling and alternative donor transplants. Moderate/severe chronic GvHD was also comparable (17% vs 9%). In multivariate analysis, an early interval between transplant and randomization (

  9. Alternating Renewal Process Models for Behavioral Observation: Simulation Methods, Software, and Validity Illustrations

    ERIC Educational Resources Information Center

    Pustejovsky, James E.; Runyon, Christopher

    2014-01-01

    Direct observation recording procedures produce reductive summary measurements of an underlying stream of behavior. Previous methodological studies of these recording procedures have employed simulation methods for generating random behavior streams, many of which amount to special cases of a statistical model known as the alternating renewal…

  10. On the Bias-Amplifying Effect of Near Instruments in Observational Studies

    ERIC Educational Resources Information Center

    Steiner, Peter M.; Kim, Yongnam

    2014-01-01

    In contrast to randomized experiments, the estimation of unbiased treatment effects from observational data requires an analysis that conditions on all confounding covariates. Conditioning on covariates can be done via standard parametric regression techniques or nonparametric matching like propensity score (PS) matching. The regression or…

  11. Within-Cluster and Across-Cluster Matching with Observational Multilevel Data

    ERIC Educational Resources Information Center

    Kim, Jee-Seon; Steiner, Peter M.; Hall, Courtney; Thoemmes, Felix

    2013-01-01

    When randomized experiments cannot be conducted in practice, propensity score (PS) techniques for matching treated and control units are frequently used for estimating causal treatment effects from observational data. Despite the popularity of PS techniques, they are not yet well studied for matching multilevel data where selection into treatment…

  12. Correlated Sources in Distributed Networks--Data Transmission, Common Information Characterization and Inferencing

    ERIC Educational Resources Information Center

    Liu, Wei

    2011-01-01

    Correlation is often present among observations in a distributed system. This thesis deals with various design issues when correlated data are observed at distributed terminals, including: communicating correlated sources over interference channels, characterizing the common information among dependent random variables, and testing the presence of…

  13. Selected Characteristics of Beginning Science and Mathematics Teachers in Georgia.

    ERIC Educational Resources Information Center

    Carter, Jack Caldwell

    One hundred fifty-seven first-year science and mathematics teachers were randomly selected from the population of beginning teachers in Georgia for the school years 1965-66 and 1966-67. Instruments used for data collection were the "Classroom Observation Record (COR)," "Pupil Observation Survey (POSR)," "Bills Index of…

  14. Smooth empirical Bayes estimation of observation error variances in linear systems

    NASA Technical Reports Server (NTRS)

    Martz, H. F., Jr.; Lian, M. W.

    1972-01-01

    A smooth empirical Bayes estimator was developed for estimating the unknown random scale component of each of a set of observation error variances. It is shown that the estimator possesses a smaller average squared error loss than other estimators for a discrete time linear system.

  15. Surgical Versus Nonsurgical Treatment for Midshaft Clavicle Fractures in Patients Aged 16 Years and Older: A Systematic Review, Meta-analysis, and Comparison of Randomized Controlled Trials and Observational Studies.

    PubMed

    Smeeing, Diederik P J; van der Ven, Denise J C; Hietbrink, Falco; Timmers, Tim K; van Heijl, Mark; Kruyt, Moyo C; Groenwold, Rolf H H; van der Meijden, Olivier A J; Houwert, Roderick M

    2017-07-01

    There is no consensus on the choice of treatment of midshaft clavicle fractures (MCFs). The aims of this systematic review and meta-analysis were (1) to compare fracture healing disorders and functional outcomes of surgical versus nonsurgical treatment of MCFs and (2) to compare effect estimates obtained from randomized controlled trials (RCTs) and observational studies. Systematic review and meta-analysis. The PubMed/MEDLINE, Embase, CENTRAL, and CINAHL databases were searched for both RCTs and observational studies. Using the MINORS instrument, all included studies were assessed on their methodological quality. The primary outcome was a nonunion. Effects of surgical versus nonsurgical treatment were estimated using random-effects meta-analysis models. A total of 20 studies were included, of which 8 were RCTs and 12 were observational studies including 1760 patients. Results were similar across the different study designs. A meta-analysis of 19 studies revealed that nonunions were significantly less common after surgical treatment than after nonsurgical treatment (odds ratio [OR], 0.18 [95% CI, 0.10-0.33]). The risk of malunions did not differ between surgical and nonsurgical treatment (OR, 0.38 [95% CI, 0.12-1.19]). Both the long-term Disabilities of the Arm, Shoulder and Hand (DASH) and Constant-Murley scores favored surgical treatment (DASH: mean difference [MD], -2.04 [95% CI, -3.56 to -0.52]; Constant-Murley: MD, 3.23 [95% CI, 1.52 to 4.95]). No differences were observed regarding revision surgery (OR, 0.85 [95% CI, 0.42-1.73]). Including only high-quality studies, both the number of malunions and days to return to work show significant differences in favor of surgical treatment (malunions: OR, 0.26 [95% CI, 0.07 to 0.92]; return to work: MD, -8.64 [95% CI, -16.22 to -1.05]). This meta-analysis of high-quality studies showed that surgical treatment of MCFs results in fewer nonunions, fewer malunions, and an accelerated return to work compared with nonsurgical treatment. A meta-analysis of surgical treatments need not be restricted to randomized trials, provided that the included observational studies are of high quality.

  16. Randomized controlled trial of a cognitive-behavioral intervention for HIV-positive persons: an investigation of treatment effects on psychosocial adjustment.

    PubMed

    Carrico, Adam W; Chesney, Margaret A; Johnson, Mallory O; Morin, Stephen F; Neilands, Torsten B; Remien, Robert H; Rotheram-Borus, Mary Jane; Lennie Wong, F

    2009-06-01

    Questions remain regarding the clinical utility of psychological interventions for HIV-positive persons because randomized controlled trials have utilized stringent inclusion criteria and focused extensively on gay men. The present randomized controlled trial examined the efficacy of a 15-session, individually delivered cognitive-behavioral intervention (n = 467) compared to a wait-list control (n = 469) in a diverse sample of HIV-positive persons who reported HIV transmission risk behavior. Five intervention sessions that dealt with executing effective coping responses were delivered between baseline and 5 months post-randomization. Additional assessments were completed through 25 months post-randomization. Despite previously documented reductions in HIV transmission risk, no intervention-related changes in psychosocial adjustment were observed across the 25-month investigation period. In addition, there were no intervention effects on psychosocial adjustment among individuals who presented with mild to moderate depressive symptoms. More intensive mental health interventions may be necessary to improve psychosocial adjustment among HIV-positive individuals.

  17. Gaussian free field in the background of correlated random clusters, formed by metallic nanoparticles

    NASA Astrophysics Data System (ADS)

    Cheraghalizadeh, Jafar; Najafi, Morteza N.; Mohammadzadeh, Hossein

    2018-05-01

    The effect of metallic nanoparticles (MNPs) on the electrostatic potential of a disordered 2D dielectric medium is considered. The disorder in the medium is assumed to be white-noise Coulomb impurities with a normal distribution. To realize the correlations between the MNPs we have used the Ising model with an artificial temperature T that controls the number of MNPs as well as their correlations. In the T → 0 limit, one retrieves the Gaussian free field (GFF), and at finite temperature the problem is equivalent to a GFF on iso-potential islands. The problem is argued to be equivalent to a scale-invariant random surface with critical exponents which vary with T and correspondingly are correlation-dependent. Two types of observables have been considered: local and global quantities. We have observed that the MNPs soften the random potential and reduce its statistical fluctuations. This softening is observed in the local as well as the geometrical quantities. The correlation function of the electrostatic potential and its total variance are observed to be logarithmic just like the GFF, i.e. the roughness exponent remains zero for all temperatures, whereas the proportionality constants scale with T − T_c. The fractal dimension of iso-potential lines (D_f), the exponent of the distribution function of the gyration radius (τ_r) and of the loop lengths (τ_l), and also the exponent of the loop Green function x_l change with T − T_c in a power-law fashion, with the corresponding critical exponents reported in the text. Importantly, we have observed that D_f(T) − D_f(T_c) ∝ 1/√ξ(T), in which ξ(T) is the spin correlation length in the Ising model.

  18. Observational research rigour alone does not justify causal inference.

    PubMed

    Ejima, Keisuke; Li, Peng; Smith, Daniel L; Nagy, Tim R; Kadish, Inga; van Groen, Thomas; Dawson, John A; Yang, Yongbin; Patki, Amit; Allison, David B

    2016-12-01

    Differing opinions exist on whether associations obtained in observational studies can be reliable indicators of a causal effect if the observational study is sufficiently well controlled and executed. To test this, we conducted two animal observational studies that were rigorously controlled and executed beyond what is achieved in studies of humans. In study 1, we randomized 332 genetically identical C57BL/6J mice into three diet groups with differing food energy allotments and recorded individual self-selected daily energy intake and lifespan. In study 2, 60 male mice (CD1) were paired and divided into two groups for a 2-week feeding regimen. We evaluated the association between weight gain and food consumption. Within each pair, one animal was randomly assigned to an S group in which the animals had free access to food. The second paired animal (R group) was provided exactly the same diet that their S partner ate the day before. In study 1, across all three groups, we found a significant negative effect of energy intake on lifespan. However, we found a positive association between food intake and lifespan among the ad libitum feeding group: 29·99 (95% CI: 8·2-51·7) days per daily kcal. In study 2, we found a significant (P = 0·003) group (randomized vs. self-selected)-by-food consumption interaction effect on weight gain. At least in nutrition research, associations derived from observational studies may not be reliable indicators of causal effects, even with the most rigorous study designs achievable. © 2016 Stichting European Society for Clinical Investigation Journal Foundation. Published by John Wiley & Sons Ltd.

  19. A randomized comparison between records made with an anesthesia information management system and by hand, and evaluation of the Hawthorne effect.

    PubMed

    Edwards, Kylie-Ellen; Hagen, Sander M; Hannam, Jacqueline; Kruger, Cornelis; Yu, Richard; Merry, Alan F

    2013-10-01

    Anesthesia information management system (AIMS) technology is designed to facilitate high-quality anesthetic recordkeeping. We examined the hypothesis that no difference exists between AIMS and handwritten anesthetic records in regard to the completeness of important information contained as text data. We also investigated the effect of observational research on the completeness of anesthesiologists' recordkeeping. As part of a larger randomized controlled trial, participants were randomized to produce 400 anesthetic records, either handwritten (n = 200) or using an AIMS (n = 200). Records were assessed against a 32-item checklist modified from a clinical guideline. Intravenous agent and bolus recordings were quantified, and data were compared between handwritten and AIMS records. Records produced with intensive research observation during the initial phase of the study (n = 200) were compared with records produced with reduced intensity observation during the final phase of the study (n = 200). The AIMS records were more complete than the handwritten records (mean difference 7.1%; 95% confidence interval [CI] 5.6 to 8.6%; P < 0.0001), with higher completion rates for six individual items on the checklist (P < 0.0001). Drug annotation data were equal between arms. The records completed early in the study, during a period of more intense observation, were more thorough than subsequent records (87.3% vs 81.6%, respectively; mean difference 5.7%; 95% CI 4.2 to 7.3%; P < 0.0001). The AIMS records were more complete than the handwritten records for 32 predefined items. The potential of observational research to influence professional behaviour in an anesthetic context was confirmed. This trial was registered at the Australian New Zealand Clinical Trials Registry No 12608000068369.

  20. Optimal random Lévy-loop searching: New insights into the searching behaviours of central-place foragers

    NASA Astrophysics Data System (ADS)

    Reynolds, A. M.

    2008-04-01

    A random Lévy-looping model of searching is devised and optimal random Lévy-looping searching strategies are identified for the location of a single target whose position is uncertain. An inverse-square power law distribution of loop lengths is shown to be optimal when the distance between the centre of the search and the target is much shorter than the size of the longest possible loop in the searching pattern. Optimal random Lévy-looping searching patterns have recently been observed in the flight patterns of honeybees (Apis mellifera) when attempting to locate their hive and when searching after a known food source becomes depleted. It is suggested that the searching patterns of desert ants (Cataglyphis) are consistent with the adoption of an optimal Lévy-looping searching strategy.
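
    Loop lengths with the inverse-square power law described above can be drawn by inverse-transform sampling from a truncated l^-2 density; the cutoffs below are arbitrary placeholders, not values from the study:

      import numpy as np

      rng = np.random.default_rng(3)

      def sample_loop_lengths(n, l_min=1.0, l_max=1000.0):
          """Sample loop lengths from a truncated inverse-square power law
          p(l) ~ l**-2 on [l_min, l_max] by inverse-transform sampling."""
          u = rng.uniform(size=n)
          # CDF on [a, b]: F(l) = (1/a - 1/l) / (1/a - 1/b), inverted below.
          inv = 1.0 / l_min - u * (1.0 / l_min - 1.0 / l_max)
          return 1.0 / inv

      lengths = sample_loop_lengths(100_000)
      print(lengths.min(), np.median(lengths), lengths.max())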

  1. Diverging conductance at the contact between random and pure quantum XX spin chains

    NASA Astrophysics Data System (ADS)

    Chatelain, Christophe

    2017-11-01

    A model consisting of two quantum XX spin chains, one homogeneous and the second with random couplings drawn from a binary distribution, is considered. The two chains are coupled to two different non-local thermal baths and their dynamics is governed by a Lindblad equation. In the steady state, a current J is induced between the two chains by coupling them together by their edges and imposing different chemical potentials μ to the two baths. While a regime of linear characteristics J versus Δμ is observed in the absence of randomness, a gap opens as the disorder strength is increased. In the infinite-randomness limit, this behavior is related to the density of states of the localized states contributing to the current. The conductance is shown to diverge in this limit.

  2. Cooperation evolution in random multiplicative environments

    NASA Astrophysics Data System (ADS)

    Yaari, G.; Solomon, S.

    2010-02-01

    Most real life systems have a random component: the multitude of endogenous and exogenous factors influencing them result in stochastic fluctuations of the parameters determining their dynamics. These empirical systems are in many cases subject to noise of multiplicative nature. The special properties of multiplicative noise as opposed to additive noise have been noticed for a long while. Even though apparently and formally the difference between free additive vs. multiplicative random walks consists in just a move from normal to log-normal distributions, in practice the implications are much more far reaching. While in an additive context the emergence and survival of cooperation requires special conditions (especially some level of reward, punishment, reciprocity), we find that in the multiplicative random context the emergence of cooperation is much more natural and effective. We study the various implications of this observation and its applications in various contexts.
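
    The practical difference between additive and multiplicative noise is easy to see numerically: with the same Gaussian increments, additive walkers stay roughly normally distributed while multiplicative walkers become log-normal, with a few walkers dominating the mean. A minimal sketch, unrelated to the cooperation model itself:

      import numpy as np

      rng = np.random.default_rng(4)
      n_walkers, n_steps = 10_000, 200

      additive = np.zeros(n_walkers)        # x_{t+1} = x_t + eta
      multiplicative = np.ones(n_walkers)   # x_{t+1} = x_t * exp(eta)

      for _ in range(n_steps):
          eta = rng.normal(0.0, 0.05, size=n_walkers)
          additive += eta
          multiplicative *= np.exp(eta)

      # Additive endpoints: roughly normal. Multiplicative endpoints: log-normal,
      # heavy right tail, mean well above the median.
      print("additive       mean/median:", additive.mean(), np.median(additive))
      print("multiplicative mean/median:", multiplicative.mean(), np.median(multiplicative))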

  3. Multiwavelength generation in a random distributed feedback fiber laser using an all fiber Lyot filter.

    PubMed

    Sugavanam, S; Yan, Z; Kamynin, V; Kurkov, A S; Zhang, L; Churkin, D V

    2014-02-10

    Multiwavelength lasing in the random distributed feedback fiber laser is demonstrated by employing an all fiber Lyot filter. Stable multiwavelength generation is obtained, with each line exhibiting sub-nanometer line-widths. A flat power distribution over multiple lines is obtained, which indicates that the power between lines is redistributed in nonlinear mixing processes. The multiwavelength generation is observed both in first and second Stokes waves.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zúñiga, Juan Pablo Álvarez; Lemarié, Gabriel; Laflorencie, Nicolas

    A spin-wave (SW) approach for hard-core bosons is presented to treat the problem of two dimensional boson localization in a random potential. After a short review of the method to compute 1/S-corrected observables, the case of random on-site energy is discussed. Whereas the mean-field solution does not display a Bose glass (BG) phase, 1/S corrections do capture BG physics. In particular, the localization of SW excitations is discussed through the inverse participation ratio.

  5. Narrow linewidth short cavity Brillouin random laser based on Bragg grating array fiber and dynamical population inversion gratings

    NASA Astrophysics Data System (ADS)

    Popov, S. M.; Butov, O. V.; Chamorovski, Y. K.; Isaev, V. A.; Mégret, P.; Korobko, D. A.; Zolotovskii, I. O.; Fotiadi, A. A.

    2018-06-01

    We report on random lasing observed with 100-m-long fiber comprising an array of weak FBGs inscribed in the fiber core and uniformly distributed over the fiber length. Extended fluctuation-free oscilloscope traces highlight power dynamics typical for lasing. An additional piece of Er-doped fiber included into the laser cavity enables a stable laser generation with a linewidth narrower than 10 kHz.

  6. Planform structure of turbulent Rayleigh-Benard convection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Theerthan, S.A.; Arakeri, J.H.

    The planform structure of turbulent Rayleigh-Benard convection is obtained from visualizing a liquid crystal sheet stuck to the bottom hot surface. The bottom plate of the convection cell is Plexiglas and the top plate is glass. Water is the test liquid and the Rayleigh number is 4 × 10^7. The planform pattern reveals randomly moving hot streaks surrounded by cold regions, suggesting that turbulent Rayleigh-Benard convection is dominated by quasi-two-dimensional randomly moving plumes. Simultaneous temperature traces from two vertically separated thermocouples indicate that these plumes may be inclined forward in the direction of horizontal motion. The periodic eruption of thermals observed by Sparrow et al. and which forms the basis of Howard's model is not observed.

  7. Comparative effects of 12 weeks of equipment based and mat Pilates in patients with Chronic Low Back Pain on pain, function and transversus abdominis activation. A randomized controlled trial.

    PubMed

    Cruz-Díaz, David; Bergamin, M; Gobbo, S; Martínez-Amat, Antonio; Hita-Contreras, Fidel

    2017-08-01

    The Pilates method has been recommended for patients with chronic low back pain (CLBP), and the activation of the transversus abdominis (TrA) has been deemed to play an important role in the improvement of these patients. Nevertheless, the evidence on TrA activation in Pilates practitioners remains unclear. The aim was to assess the effectiveness of 12 weeks of Pilates practice on disability, pain, kinesiophobia and transversus abdominis activation in patients with chronic nonspecific low back pain. A single-blind randomized controlled trial with repeated measures at 6 and 12 weeks was carried out. A total of ninety-eight patients with low back pain were included and randomly allocated to a Pilates mat group (PMG), an equipment-based (apparatus) Pilates group (PAG), or a control group (CG). The Roland Morris Disability Questionnaire (RMDQ), visual analog scale (VAS), Tampa Scale of Kinesiophobia (TSK), and TrA activation assessed by real-time ultrasound measurement (US) were used as outcome measures. Improvements were observed in both intervention groups in all the included variables at 6 and 12 weeks (p<0.001). Faster enhancement was observed in the equipment-based Pilates group (p=0.007). Equipment-based and mat Pilates modalities are both effective in improving TrA activation in patients with CLBP, with associated improvements in pain, function and kinesiophobia. Significant differences were observed after 12 weeks of intervention in PMG and PAG, with faster improvement in PAG, suggesting that the feedback provided by the equipment could help in the internalization of Pilates principles. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Simple CPR: A randomized, controlled trial of video self-instructional cardiopulmonary resuscitation training in an African American church congregation.

    PubMed

    Todd, K H; Heron, S L; Thompson, M; Dennis, R; O'Connor, J; Kellermann, A L

    1999-12-01

    Despite the proven efficacy of cardiopulmonary resuscitation (CPR), only a small fraction of the population knows how to perform it. As a result, rates of bystander CPR and rates of survival from cardiac arrest are low. Bystander CPR is particularly uncommon in the African American community. Successful development of a simplified approach to CPR training could boost rates of bystander CPR and save lives. We conducted the following randomized, controlled study to determine whether video self-instruction (VSI) in CPR results in comparable or better performance than traditional CPR training. This randomized, controlled trial was conducted among congregational volunteers in an African American church in Atlanta, GA. Subjects were randomly assigned to receive either 34 minutes of VSI or the 4-hour American Heart Association "Heartsaver" CPR course. Two months after training, blinded observers used explicit criteria to assess CPR performance in a simulated cardiac arrest setting. A recording manikin was used to measure ventilation and chest compression characteristics. Participants also completed a written test of CPR-related knowledge and attitudes. VSI trainees displayed a comparable level of performance to that achieved by traditional trainees. Observers scored 40% of VSI trainees competent or better in performing CPR, compared with only 16% of traditional trainees (absolute difference 24%, 95% confidence interval 8% to 40%). Data from the recording manikin confirmed these observations. VSI trainees and traditional trainees achieved comparable scores on tests of CPR-related knowledge and attitudes. Thirty-four minutes of VSI can produce CPR of comparable quality to that achieved by traditional training methods. VSI provides a simple, quick, consistent, and inexpensive alternative to traditional CPR instruction, and may be used to extend CPR training to historically underserved populations.

  9. Changes in Brain Volume and Cognition in a Randomized Trial of Exercise and Social Interaction in a Community-Based Sample of Non-Demented Chinese Elders

    PubMed Central

    Mortimer, James A.; Ding, Ding; Borenstein, Amy R.; DeCarli, Charles; Guo, Qihao; Wu, Yougui; Zhao, Qianhua; Chu, Shugang

    2013-01-01

    Physical exercise has been shown to increase brain volume and improve cognition in randomized trials of non-demented elderly. Although greater social engagement was found to reduce dementia risk in observational studies, randomized trials of social interventions have not been reported. A representative sample of 120 elderly from Shanghai, China was randomized to four groups (Tai Chi, Walking, Social Interaction, No Intervention) for 40 weeks. Two MRIs were obtained, one before the intervention period, the other after. A neuropsychological battery was administered at baseline, 20 weeks, and 40 weeks. Comparison of changes in brain volumes in intervention groups with the No Intervention group were assessed by t-tests. Time-intervention group interactions for neuropsychological measures were evaluated with repeated-measures mixed models. Compared to the No Intervention group, significant increases in brain volume were seen in the Tai Chi and Social Intervention groups (p < 0.05). Improvements also were observed in several neuropsychological measures in the Tai Chi group, including the Mattis Dementia Rating Scale score (p = 0.004), the Trailmaking Test A (p = 0.002) and B (p = 0.0002), the Auditory Verbal Learning Test (p = 0.009), and verbal fluency for animals (p = 0.01). The Social Interaction group showed improvement on some, but fewer neuropsychological indices. No differences were observed between the Walking and No Intervention groups. The findings differ from previous clinical trials in showing increases in brain volume and improvements in cognition with a largely non-aerobic exercise (Tai Chi). In addition, intellectual stimulation through social interaction was associated with increases in brain volume as well as with some cognitive improvements. PMID:22451320

  10. Amylase, Lipase, and Acute Pancreatitis in People With Type 2 Diabetes Treated With Liraglutide: Results From the LEADER Randomized Trial.

    PubMed

    Steinberg, William M; Buse, John B; Ghorbani, Marie Louise Muus; Ørsted, David D; Nauck, Michael A

    2017-07-01

    To evaluate serum amylase and lipase levels and the rate of acute pancreatitis in patients with type 2 diabetes and high cardiovascular risk randomized to liraglutide or placebo and observed for 3.5-5.0 years. A total of 9,340 patients with type 2 diabetes were randomized to either liraglutide or placebo (median observation time 3.84 years). Fasting serum lipase and amylase were monitored. Acute pancreatitis was adjudicated in a blinded manner. Compared with the placebo group, liraglutide-treated patients had increases in serum lipase and amylase of 28.0% and 7.0%, respectively. Levels were increased at 6 months and then remained stable. During the study, 18 (0.4% [1.1 events/1,000 patient-years of observation (PYO)]) liraglutide-treated and 23 (0.5% [1.7 events/1,000 PYO]) placebo patients had acute pancreatitis confirmed by adjudication. Most acute pancreatitis cases occurred ≥12 months after randomization. Liraglutide-treated patients with prior history of pancreatitis (n = 147) were not more likely to develop acute pancreatitis than similar patients in the placebo group (n = 120). Elevations of amylase and lipase levels did not predict future risk of acute pancreatitis (positive predictive value <1.0%) in patients treated with liraglutide. In a population with type 2 diabetes at high cardiovascular risk, there were numerically fewer events of acute pancreatitis among liraglutide-treated patients (regardless of previous history of pancreatitis) compared with the placebo group. Liraglutide was associated with increases in serum lipase and amylase, which were not predictive of an event of subsequent acute pancreatitis. © 2017 by the American Diabetes Association.

  11. Quantifying the placebo effect in psychological outcomes of exercise training: a meta-analysis of randomized trials.

    PubMed

    Lindheimer, Jacob B; O'Connor, Patrick J; Dishman, Rod K

    2015-05-01

    The placebo effect could account for some or all of the psychological benefits attributed to exercise training. The magnitude of the placebo effect in psychological outcomes of randomized controlled exercise training trials has not been quantified. The aim of this investigation was to estimate the magnitude of the population placebo effect in psychological outcomes from placebo conditions used in exercise training studies and compare it to the observed effect of exercise training. Articles published before 1 July 2013 were located using Google Scholar, MEDLINE, PsycINFO, and The Cochrane Library. To be included in the analysis, studies were required to have (1) a design that randomly assigned participants to exercise training, placebo, and control conditions and (2) an assessment of a subjective (i.e., anxiety, depression, energy, fatigue) or an objective (i.e., cognitive) psychological outcome. Meta-analytic and multi-level modeling techniques were used to analyze effects from nine studies involving 661 participants. Hedges' d effect sizes were calculated, and random effects models were used to estimate the overall magnitude of the placebo and exercise training effects. After adjusting for nesting effects, the placebo mean effect size was 0.20 (95% confidence interval [CI] -0.02, 0.41) and the observed effect of exercise training was 0.37 (95% CI 0.11, 0.63). A small body of research suggests both that (1) the placebo effect is approximately half of the observed psychological benefits of exercise training and (2) there is an urgent need for creative research specifically aimed at better understanding the role of the placebo effect in the mental health consequences of exercise training.
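
    The pooling step described above is commonly a DerSimonian-Laird random-effects combination of the per-study Hedges' d values. A minimal sketch with made-up effect sizes and sampling variances (not the study data):

      import numpy as np

      def dersimonian_laird(effects, variances):
          """Random-effects pooling of per-study effect sizes and their
          sampling variances; returns the pooled estimate and 95% CI."""
          effects = np.asarray(effects, dtype=float)
          variances = np.asarray(variances, dtype=float)
          w = 1.0 / variances                          # fixed-effect weights
          fixed = np.sum(w * effects) / np.sum(w)
          q = np.sum(w * (effects - fixed) ** 2)       # Cochran's Q
          df = len(effects) - 1
          c = np.sum(w) - np.sum(w**2) / np.sum(w)
          tau2 = max(0.0, (q - df) / c)                # between-study variance
          w_star = 1.0 / (variances + tau2)
          pooled = np.sum(w_star * effects) / np.sum(w_star)
          se = np.sqrt(1.0 / np.sum(w_star))
          return pooled, pooled - 1.96 * se, pooled + 1.96 * se

      # Illustrative numbers only.
      print(dersimonian_laird([0.1, 0.3, 0.25, 0.05], [0.02, 0.03, 0.015, 0.04]))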

  12. Changes in brain volume and cognition in a randomized trial of exercise and social interaction in a community-based sample of non-demented Chinese elders.

    PubMed

    Mortimer, James A; Ding, Ding; Borenstein, Amy R; DeCarli, Charles; Guo, Qihao; Wu, Yougui; Zhao, Qianhua; Chu, Shugang

    2012-01-01

    Physical exercise has been shown to increase brain volume and improve cognition in randomized trials of non-demented elderly. Although greater social engagement was found to reduce dementia risk in observational studies, randomized trials of social interventions have not been reported. A representative sample of 120 elderly from Shanghai, China was randomized to four groups (Tai Chi, Walking, Social Interaction, No Intervention) for 40 weeks. Two MRIs were obtained, one before the intervention period, the other after. A neuropsychological battery was administered at baseline, 20 weeks, and 40 weeks. Comparison of changes in brain volumes in intervention groups with the No Intervention group were assessed by t-tests. Time-intervention group interactions for neuropsychological measures were evaluated with repeated-measures mixed models. Compared to the No Intervention group, significant increases in brain volume were seen in the Tai Chi and Social Intervention groups (p < 0.05). Improvements also were observed in several neuropsychological measures in the Tai Chi group, including the Mattis Dementia Rating Scale score (p = 0.004), the Trailmaking Test A (p = 0.002) and B (p = 0.0002), the Auditory Verbal Learning Test (p = 0.009), and verbal fluency for animals (p = 0.01). The Social Interaction group showed improvement on some, but fewer neuropsychological indices. No differences were observed between the Walking and No Intervention groups. The findings differ from previous clinical trials in showing increases in brain volume and improvements in cognition with a largely non-aerobic exercise (Tai Chi). In addition, intellectual stimulation through social interaction was associated with increases in brain volume as well as with some cognitive improvements.

  13. Cluster-randomized, controlled trial of computer-based decision support for selecting long-term anti-thrombotic therapy after acute ischaemic stroke.

    PubMed

    Weir, C J; Lees, K R; MacWalter, R S; Muir, K W; Wallesch, C-W; McLelland, E V; Hendry, A

    2003-02-01

    Identifying the appropriate long-term anti-thrombotic therapy following acute ischaemic stroke is a challenging area in which computer-based decision support may provide assistance. To evaluate the influence on prescribing practice of a computer-based decision support system (CDSS) that provided patient-specific estimates of the expected ischaemic and haemorrhagic vascular event rates under each potential anti-thrombotic therapy. Cluster-randomized controlled trial. We recruited patients who presented for a first investigation of ischaemic stroke or TIA symptoms, excluding those with a poor prognosis or major contraindication to anticoagulation. After observation of routine prescribing practice (6 months) in each hospital, centres were randomized for 6 months to either control (routine practice observed) or intervention (practice observed while the CDSS provided patient-specific information). We compared, between control and intervention centres, the risk reduction (estimated by the CDSS) in ischaemic and haemorrhagic vascular events achieved by long-term anti-thrombotic therapy, and the proportions of subjects prescribed the optimal therapy identified by the CDSS. Sixteen hospitals recruited 1952 subjects. When the CDSS provided information, the mean relative risk reduction attained by prescribing increased by 2.7 percentage units (95%CI -0.3 to 5.7) and the odds ratio for the optimal therapy being prescribed was 1.32 (0.83 to 1.80). Some 55% (5/9) of clinicians believed the CDSS had influenced their prescribing. Cluster-randomized trials provide excellent frameworks for evaluating novel clinical management methods. Our CDSS was feasible to implement and acceptable to clinicians, but did not substantially influence prescribing practice for anti-thrombotic drugs after acute ischaemic stroke.

  14. A Monte Carlo analysis of breast screening randomized trials.

    PubMed

    Zamora, Luis I; Forastero, Cristina; Guirado, Damián; Lallena, Antonio M

    2016-12-01

    To analyze breast screening randomized trials with a Monte Carlo simulation tool. A simulation tool previously developed to simulate breast screening programmes was adapted for that purpose. The history of women participating in the trials was simulated, including a model for survival after local treatment of invasive cancers. Distributions of time gained due to screening detection against symptomatic detection and the overall screening sensitivity were used as inputs. Several randomized controlled trials were simulated. Except for the age range of women involved, all simulations used the same population characteristics, and this permitted analysis of their external validity. The relative risks obtained were compared to those quoted for the trials, whose internal validity was addressed by further investigating the reasons for the disagreements observed. The Monte Carlo simulations produce results that are in good agreement with most of the randomized trials analyzed, thus indicating their methodological quality and external validity. A reduction of breast cancer mortality of around 20% appears to be a reasonable value according to the results of the trials that are methodologically correct. Discrepancies observed with the Canada I and II trials may be attributed to low mammography quality and some methodological problems. The Kopparberg trial appears to show low methodological quality. Monte Carlo simulations are a powerful tool for investigating randomized controlled breast screening trials, helping to establish which results are reliable enough to be extrapolated to other populations, to design trial strategies and, eventually, to adapt them during their development. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  15. A Kalman Filter for SINS Self-Alignment Based on Vector Observation.

    PubMed

    Xu, Xiang; Xu, Xiaosu; Zhang, Tao; Li, Yao; Tong, Jinwu

    2017-01-29

    In this paper, a self-alignment method for strapdown inertial navigation systems based on the q-method is studied. In addition, an improved method based on integrating gravitational apparent motion to form apparent velocity is designed, which can reduce the random noises of the observation vectors. For further analysis, a novel self-alignment method using a Kalman filter based on adaptive filter technology is proposed, which transforms the self-alignment procedure into an attitude estimation using the observation vectors. In the proposed method, a linear pseudo-measurement equation is adopted by employing the transfer method between the quaternion and the observation vectors. Analysis and simulation indicate that the accuracy of the self-alignment is improved. Meanwhile, to improve the convergence rate of the proposed method, a new method based on parameter recognition and a reconstruction algorithm for apparent gravitation is devised, which can reduce the influence of the random noises of the observation vectors. Simulations and turntable tests are carried out, and the results indicate that the proposed method can acquire sound alignment results with lower standard variances, and can obtain higher alignment accuracy and a faster convergence rate.

  16. [Acupuncture at Baihui(GV 20) and Shenting(GV 24) combined with basic treatment and regular rehabilitation for post-stroke cognitive impairment:a randomized controlled trial].

    PubMed

    Zhan, Jie; Pan, Ruihuan; Guo, Youhua; Zhan, Lechang; He, Mingfeng; Wang, Qiuchun; Chen, Hongxia

    2016-08-12

    To observe the clinical effect of acupuncture at Baihui (GV 20) and Shenting (GV 24) combined with rehabilitation for post-stroke cognitive impairment (PSCI). Fifty patients with PSCI were randomly assigned to an observation group and a control group, 25 cases in each one. In the control group, basic treatment and regular rehabilitation were applied. In the observation group, acupuncture at Baihui (GV 20) and Shenting (GV 24) and the same therapies as the control group were used for four continuous weeks, once a day and five times a week. The Mini-Mental State Examination (MMSE) and the Montreal Cognitive Assessment (MoCA) were administered before and after treatment in the two groups. After treatment, the MMSE and MoCA scores were markedly improved (both P < 0.05), with better results in the observation group (both P < 0.05). Acupuncture at Baihui (GV 20) and Shenting (GV 24) combined with basic treatment and regular rehabilitation can clearly improve the cognitive function of patients with PSCI, and the effect is superior to that of basic treatment and regular rehabilitation alone.

  17. ARCADO - Adding random case analysis to direct observation in workplace-based formative assessment of general practice registrars.

    PubMed

    Ingham, Gerard; Fry, Jennifer; Morgan, Simon; Ward, Bernadette

    2015-12-10

    Workplace-based formative assessments using consultation observation are currently conducted during the Australian general practice training program. Assessment reliability is improved by using multiple assessment methods. The aim of this study was to explore experiences of general practice medical educator assessors and registrars (trainees) when adding random case analysis to direct observation (ARCADO) during formative workplace-based assessments. A sample of general practice medical educators and matched registrars were recruited. Following the ARCADO workplace assessment, semi-structured qualitative interviews were conducted. The data was analysed thematically. Ten registrars and eight medical educators participated. Four major themes emerged - formative versus summative assessment; strengths (acceptability, flexibility, time efficiency, complementarity and authenticity); weaknesses (reduced observation and integrity risks); and contextual factors (variation in assessment content, assessment timing, registrar-medical educator relationship, medical educator's approach and registrar ability). ARCADO is a well-accepted workplace-based formative assessment perceived by registrars and assessors to be valid and flexible. The use of ARCADO enabled complementary insights that would not have been achieved with direct observation alone. Whilst there are some contextual factors to be considered in its implementation, ARCADO appears to have utility as formative assessment and, subject to further evaluation, high-stakes assessment.

  18. A Kalman Filter for SINS Self-Alignment Based on Vector Observation

    PubMed Central

    Xu, Xiang; Xu, Xiaosu; Zhang, Tao; Li, Yao; Tong, Jinwu

    2017-01-01

    In this paper, a self-alignment method for strapdown inertial navigation systems based on the q-method is studied. In addition, an improved method based on integrating gravitational apparent motion to form apparent velocity is designed, which can reduce the random noises of the observation vectors. For further analysis, a novel self-alignment method using a Kalman filter based on adaptive filter technology is proposed, which transforms the self-alignment procedure into an attitude estimation using the observation vectors. In the proposed method, a linear pseudo-measurement equation is adopted by employing the transfer method between the quaternion and the observation vectors. Analysis and simulation indicate that the accuracy of the self-alignment is improved. Meanwhile, to improve the convergence rate of the proposed method, a new method based on parameter recognition and a reconstruction algorithm for apparent gravitation is devised, which can reduce the influence of the random noises of the observation vectors. Simulations and turntable tests are carried out, and the results indicate that the proposed method can acquire sound alignment results with lower standard variances, and can obtain higher alignment accuracy and a faster convergence rate. PMID:28146059

  19. Effects of physical randomness training on virtual and laboratory golf putting performance in novices.

    PubMed

    Pataky, T C; Lamb, P F

    2018-06-01

    External randomness exists in all sports but is perhaps most obvious in golf putting where robotic putters sink only 80% of 5 m putts due to unpredictable ball-green dynamics. The purpose of this study was to test whether physical randomness training can improve putting performance in novices. A virtual random-physics golf-putting game was developed based on controlled ball-roll data. Thirty-two subjects were assigned a unique randomness gain (RG) ranging from 0.1 to 2.0-times real-world randomness. Putter face kinematics were measured in 5 m laboratory putts before and after five days of virtual training. Performance was quantified using putt success rate and "miss-adjustment correlation" (MAC), the correlation between left-right miss magnitude and subsequent right-left kinematic adjustments. Results showed no RG-success correlation (r = -0.066, p = 0.719) but mildly stronger correlations with MAC for face angle (r = -0.168, p = 0.358) and clubhead path (r = -0.302, p = 0.093). The strongest RG-MAC correlation was observed during virtual training (r = -0.692, p < 0.001). These results suggest that subjects quickly adapt to physical randomness in virtual training, and also that this learning may weakly transfer to real golf putting kinematics. Adaptation to external physical randomness during virtual training may therefore help golfers adapt to external randomness in real-world environments.

  20. True randomness from an incoherent source

    NASA Astrophysics Data System (ADS)

    Qi, Bing

    2017-11-01

    Quantum random number generators (QRNGs) harness the intrinsic randomness in measurement processes: the measurement outputs are truly random, given the input state is a superposition of the eigenstates of the measurement operators. In the case of trusted devices, true randomness could be generated from a mixed state ρ so long as the system entangled with ρ is well protected. We propose a random number generation scheme based on measuring the quadrature fluctuations of a single mode thermal state using an optical homodyne detector. By mixing the output of a broadband amplified spontaneous emission (ASE) source with a single mode local oscillator (LO) at a beam splitter and performing differential photo-detection, we can selectively detect the quadrature fluctuation of a single mode output of the ASE source, thanks to the filtering function of the LO. Experimentally, a quadrature variance about three orders of magnitude larger than the vacuum noise has been observed, suggesting this scheme can tolerate much higher detector noise in comparison with QRNGs based on measuring the vacuum noise. The high quality of this entropy source is evidenced by the small correlation coefficients of the acquired data. A Toeplitz-hashing extractor is applied to generate unbiased random bits from the Gaussian distributed raw data, achieving an efficiency of 5.12 bits per sample. The output of the Toeplitz extractor successfully passes all the NIST statistical tests for random numbers.
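
    A Toeplitz-hashing extractor of the kind mentioned above multiplies the raw bit vector by a random binary Toeplitz matrix over GF(2). In the sketch below the input/output lengths and the seed source are placeholders; in practice the output length is set by the entropy estimate of the source:

      import numpy as np

      rng = np.random.default_rng(5)

      def toeplitz_extract(raw_bits, n_out, seed_bits):
          """Extract n_out bits by multiplying the raw bit vector with a random
          binary Toeplitz matrix over GF(2). The matrix is defined by its first
          column and first row, both taken from the seed."""
          n_in = len(raw_bits)
          assert len(seed_bits) == n_in + n_out - 1
          col, row = seed_bits[:n_out], seed_bits[n_out - 1:]
          T = np.empty((n_out, n_in), dtype=np.uint8)
          for i in range(n_out):
              for j in range(n_in):
                  T[i, j] = col[i - j] if i >= j else row[j - i]
          return (T.astype(int) @ raw_bits.astype(int)) % 2

      raw = rng.integers(0, 2, size=256, dtype=np.uint8)   # stand-in for digitized samples
      seed = rng.integers(0, 2, size=256 + 128 - 1, dtype=np.uint8)
      key = toeplitz_extract(raw, 128, seed)
      print(key[:16])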

  1. [Randomized, controlled clinical trials with observational follow-up investigations for evaluating efficacy of antihyperglycaemic treatment. II. Features of and lessons from the follow-up investigations].

    PubMed

    Jermendy, György

    2018-04-01

    Although the outcomes of the observational follow-up period of randomized clinical studies evaluating the efficacy of antihyperglycaemic treatments or antidiabetic drugs may be confounded or biased by several factors, the results are widely accepted by the diabetes community. In line with the theory of metabolic memory or metabolic legacy, early and intensive antihyperglycaemic treatment should be provided for all diabetic patients, as this strategy can yield beneficial effects even in the long run. The recent cardiovascular safety trials with new, innovative antidiabetic drugs differ in several respects from the former efficacy studies. Ten cardiovascular safety trials have been completed so far, making it possible to identify their unique and common features. It can be anticipated that the era of randomized, controlled efficacy studies with observational follow-up investigations has come to an end in diabetes research. Nowadays, cardiovascular safety trials are the focus of clinical research in diabetology, and the results of several ongoing studies are awaited with interest in the near future. Orv Hetil. 2018; 159(16): 615-619.

  2. On the Coupling Time of the Heat-Bath Process for the Fortuin-Kasteleyn Random-Cluster Model

    NASA Astrophysics Data System (ADS)

    Collevecchio, Andrea; Elçi, Eren Metin; Garoni, Timothy M.; Weigel, Martin

    2018-01-01

    We consider the coupling from the past implementation of the random-cluster heat-bath process, and study its random running time, or coupling time. We focus on hypercubic lattices embedded on tori, in dimensions one to three, with cluster fugacity at least one. We make a number of conjectures regarding the asymptotic behaviour of the coupling time, motivated by rigorous results in one dimension and Monte Carlo simulations in dimensions two and three. Amongst our findings, we observe that, for generic parameter values, the distribution of the appropriately standardized coupling time converges to a Gumbel distribution, and that the standard deviation of the coupling time is asymptotic to an explicit universal constant multiple of the relaxation time. Perhaps surprisingly, we observe these results to hold both off criticality, where the coupling time closely mimics the coupon collector's problem, and also at the critical point, provided the cluster fugacity is below the value at which the transition becomes discontinuous. Finally, we consider analogous questions for the single-spin Ising heat-bath process.

  3. Direct Simulation of Multiple Scattering by Discrete Random Media Illuminated by Gaussian Beams

    NASA Technical Reports Server (NTRS)

    Mackowski, Daniel W.; Mishchenko, Michael I.

    2011-01-01

    The conventional orientation-averaging procedure developed in the framework of the superposition T-matrix approach is generalized to include the case of illumination by a Gaussian beam (GB). The resulting computer code is parallelized and used to perform extensive numerically exact calculations of electromagnetic scattering by volumes of discrete random medium consisting of monodisperse spherical particles. The size parameters of the scattering volumes are 40, 50, and 60, while their packing density is fixed at 5%. We demonstrate that all scattering patterns observed in the far-field zone of a random multisphere target and their evolution with decreasing width of the incident GB can be interpreted in terms of idealized theoretical concepts such as forward-scattering interference, coherent backscattering (CB), and diffuse multiple scattering. It is shown that the increasing violation of electromagnetic reciprocity with decreasing GB width suppresses and eventually eradicates all observable manifestations of CB. This result supplements the previous demonstration of the effects of broken reciprocity in the case of magneto-optically active particles subjected to an external magnetic field.

  4. Sustainability of the Dissemination of an Occupational Sun Protection Program in a Randomized Trial

    PubMed Central

    Buller, David B.; Walkosz, Barbara J.; Andersen, Peter A.; Scott, Michael D.; Dignan, Mark B.; Cutter, Gary R.; Zhang, Xiao; Kane, Ilima L.

    2012-01-01

    The sustainability of an occupational sun safety program, Go Sun Smart (GSS), was explored in a randomized trial, testing dissemination strategies at 68 U.S. and Canadian ski areas in 2004-2007. All ski areas received GSS from the National Ski Areas Association through a Basic Dissemination Strategy (BDS) using conference presentations and free materials. Half of the ski areas were randomly assigned to a theory-based Enhanced Dissemination Strategy (EDS) with personal contact supporting GSS use. Use of GSS was assessed at immediate and long-term follow-up posttests by on-site observation. Use of GSS declined from the immediate (M=5.72) to the long-term follow-up (M=6.24), F[1,62]=6.95, p=.01, but EDS ski areas (M=6.53) continued to use GSS more than BDS ski areas (M=4.49), F(1,62)=5.75, p=0.02, regardless of observation, F(1,60)=0.05, p=.83. Despite declines over time, a group of ski areas had sustained high program use and active dissemination methods had sustained positive effects on GSS implementation. PMID:22102323

  5. Olive Oil, Sunflower Oil or no Oil for Baby Dry Skin or Massage: A Pilot, Assessor-blinded, Randomized Controlled Trial (the Oil in Baby SkincaRE [OBSeRvE] Study).

    PubMed

    Cooke, Alison; Cork, Michael J; Victor, Suresh; Campbell, Malcolm; Danby, Simon; Chittock, John; Lavender, Tina

    2016-03-01

    Topical oils on baby skin may contribute to development of childhood atopic eczema. A pilot, assessor-blinded, randomized controlled trial assessed feasibility of a definitive trial investigating their impact in neonates. One-hundred and fifteen healthy, full-term neonates were randomly assigned to olive oil, sunflower oil or no oil, twice daily for 4 weeks, stratified by family history of atopic eczema. We measured spectral profile of lipid lamellae, trans-epidermal water loss (TEWL), stratum corneum hydration and pH and recorded clinical observations, at baseline, and 4 weeks post-birth. Recruitment was challenging (recruitment 11.1%; retention 80%), protocol adherence reasonable (79-100%). Both oil groups had significantly improved hydration but significantly less improvement in lipid lamellae structure compared to the no oil group. There were no significant differences in TEWL, pH or erythema/skin scores. The study was not powered for clinical significance, but until further research is conducted, caution should be exercised when recommending oils for neonatal skin.

  6. The Effects of Twelve Weeks of Tai Chi Practice on Anxiety in Stressed But Healthy People Compared to Exercise and Wait-List Groups-A Randomized Controlled Trial.

    PubMed

    Zheng, Shuai; Kim, Christine; Lal, Sara; Meier, Peter; Sibbritt, David; Zaslawski, Chris

    2018-01-01

    This randomized controlled trial was undertaken to determine whether 12 weeks of Tai Chi (TC) practice can reduce anxiety in healthy but stressed people. Fifty participants were randomized into TC (n=17), exercise (n=17), and wait-list (WL) groups (n=16). Outcome measures used were State Trait Anxiety Inventory, Perceived Stress Scale 14 (PSS14), blood pressure and heart rate variability, visual analogue scale (VAS), and Short Form 36. Significant improvements were observed from baseline for both TC and exercise groups for both state (p <0.01) and trait (p <0.01) anxiety, PSS14 (p <0.01), VAS (p <0.01), mental health domain (p <0.01), and vitality domain (p <0.01). Superior outcomes were also observed for TC when compared with WL for state and trait anxiety (p <0.01) and mental health domain (p <0.05). TC reduces stress levels in healthy individuals and provides a safer, cost effective, and less physically vigorous alternative to exercise. © 2017 Wiley Periodicals, Inc.

  7. Bayesian Sensitivity Analysis of Statistical Models with Missing Data

    PubMed Central

    ZHU, HONGTU; IBRAHIM, JOSEPH G.; TANG, NIANSHENG

    2013-01-01

    Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures. PMID:24753718

  8. Evaluating the Performance of Repeated Measures Approaches in Replicating Experimental Benchmark Results

    ERIC Educational Resources Information Center

    McConeghy, Kevin; Wing, Coady; Wong, Vivian C.

    2015-01-01

    Randomized experiments have long been established as the gold standard for addressing causal questions. However, experiments are not always feasible or desired, so observational methods are also needed. When multiple observations on the same variable are available, a repeated measures design may be used to assess whether a treatment administered…

  9. Matching with Multiple Control Groups with Adjustment for Group Differences

    ERIC Educational Resources Information Center

    Stuart, Elizabeth A.; Rubin, Donald B.

    2008-01-01

    When estimating causal effects from observational data, it is desirable to approximate a randomized experiment as closely as possible. This goal can often be achieved by choosing a subsample from the original control group that matches the treatment group on the distribution of the observed covariates. However, sometimes the original control group…

  10. Observing and Deterring Social Cheating on College Exams

    ERIC Educational Resources Information Center

    Fendler, Richard J.; Yates, Michael C.; Godbey, Johnathan M.

    2018-01-01

    This research introduces a unique multiple choice exam design to observe and measure the degree to which students copy answers from their peers. Using data collected from the exam, an empirical experiment is conducted to determine whether random seat assignment deters cheating relative to a control group of students allowed to choose their seats.…

  11. Producing chondrules by recycling and volatile loss

    NASA Technical Reports Server (NTRS)

    Alexander, C. M. O.

    1994-01-01

    Interelement correlations observed in bulk chondrule INAA data, particularly between the refractory lithophiles, have led to the now generally accepted conclusion that the chondrule precursors were nebular condensates. However, it has been recently suggested that random sampling of fragments from a previous generation of chondrules could reproduce much of the observed range of bulk chondrule composition.

  12. Classwide Efficacy of INSIGHTS: Observed Teacher Practices and Student Behaviors in Kindergarten and First Grade

    ERIC Educational Resources Information Center

    Cappella, Elise; O'Connor, Erin E.; McCormick, Meghan P.; Turbeville, Ashley R.; Collins, Ashleigh J.; McClowry, Sandee G.

    2015-01-01

    We investigate the classwide efficacy of INSIGHTS, a universal social-emotional learning intervention for early elementary grades, on observed teacher practices and student behaviors. Twenty-two elementary schools (87% free/reduced lunch) were randomly assigned to INSIGHTS or an attention-control condition. Kindergarten and first-grade classrooms…

  13. Visualizing the Sample Standard Deviation

    ERIC Educational Resources Information Center

    Sarkar, Jyotirmoy; Rashid, Mamunur

    2017-01-01

    The standard deviation (SD) of a random sample is defined as the square-root of the sample variance, which is the "mean" squared deviation of the sample observations from the sample mean. Here, we interpret the sample SD as the square-root of twice the mean square of all pairwise half deviations between any two sample observations. This…

  14. The Characteristics and Quality of Pre-School Education in Spain

    ERIC Educational Resources Information Center

    Sandstrom, Heather

    2012-01-01

    We examined 25 four-year-old pre-school classrooms from a random sample of 15 schools within a large urban city in southern Spain. Observational measures of classroom quality included the Early Childhood Environment Rating Scale-Revised, the Classroom Assessment Scoring System and the Observation of Activities in Pre-school. Findings revealed…

  15. A Comparison of Four Approaches to Account for Method Effects in Latent State-Trait Analyses

    ERIC Educational Resources Information Center

    Geiser, Christian; Lockhart, Ginger

    2012-01-01

    Latent state-trait (LST) analysis is frequently applied in psychological research to determine the degree to which observed scores reflect stable person-specific effects, effects of situations and/or person-situation interactions, and random measurement error. Most LST applications use multiple repeatedly measured observed variables as indicators…

  16. Effect of Remote Internet Follow-Up on Postradiotherapy Compliance Among Patients with Esophageal Cancer: A Randomized Controlled Study.

    PubMed

    Wang, Ping; Yang, Lin; Hua, Zhongsheng

    2015-11-01

    To explore the effects of using remote Internet follow-up on postradiotherapy compliance with medical advice provided to patients with esophageal cancer. Between January 1 and August 1, 2013, in total, 128 patients with esophageal squamous cell cancer treated with radiotherapy were randomly assigned to either an observation group (n=64) or a control group (n=64). The control group received routine outpatient follow-up, whereas the observation group received additional remote Internet follow-up for 6 months after discharge from the hospital. The treatment effects and compliance were investigated using a questionnaire. At 3 months and 6 months after discharge, patients in the observation group had sought significantly more consultations and undergone more periodic re-examinations than patients in the control group (all p<0.001). Furthermore, both the disease-free survival rate and the symptom reduction rate were significantly higher in the observation group compared with the control group (all p<0.001). Remote Internet follow-up is an easy and fast method for improving postradiotherapy compliance with medical instructions and promoting normalization among patients with esophageal cancer.

  17. Efficient strategies for leave-one-out cross validation for genomic best linear unbiased prediction.

    PubMed

    Cheng, Hao; Garrick, Dorian J; Fernando, Rohan L

    2017-01-01

    A random multiple-regression model that simultaneously fit all allele substitution effects for additive markers or haplotypes as uncorrelated random effects was proposed for Best Linear Unbiased Prediction, using whole-genome data. Leave-one-out cross validation can be used to quantify the predictive ability of a statistical model. Naive application of Leave-one-out cross validation is computationally intensive because the training and validation analyses need to be repeated n times, once for each observation. Efficient Leave-one-out cross validation strategies are presented here, requiring little more effort than a single analysis. The efficient strategies are 786 times faster than the naive application for a simulated dataset with 1,000 observations and 10,000 markers, and 99 times faster with 1,000 observations and 100 markers. These efficiencies relative to the naive approach using the same model will increase with the number of observations.
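
    The efficiency gain described above comes from avoiding n separate refits. One standard identity that achieves this for ridge-type (GBLUP-like) linear predictors is the hat-matrix leverage correction; the sketch below uses that identity and is only a simplified stand-in for the strategies derived in the paper (it ignores fixed effects and assumes a known variance ratio lam).

        import numpy as np

        def loocv_residuals_ridge(X, y, lam):
            # Leave-one-out residuals for y_hat = X @ b_hat with
            # b_hat = (X'X + lam*I)^{-1} X'y, using e_i / (1 - h_ii),
            # where h_ii are the diagonal leverages of the hat matrix.
            n, p = X.shape
            A = X.T @ X + lam * np.eye(p)
            H = X @ np.linalg.solve(A, X.T)          # n x n hat matrix
            resid = y - H @ y
            return resid / (1.0 - np.diag(H))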

  18. Exploring When and Why to Use Arabic in the Saudi Arabian EFL Classroom: Viewing L1 Use as Eclectic Technique

    ERIC Educational Resources Information Center

    Khresheh, Asim

    2012-01-01

    This study aims to investigate when and why to use Arabic as L1 in the Saudi Arabian EFL classroom. For this purpose, 45 classroom observations were performed for beginning, intermediate, and advanced levels of students. 5 classes were chosen randomly for each level and each class was observed three times. Based on the classroom observations,…

  19. Evaluating an Art-Based Intervention to Improve Practicing Nurses' Observation, Description, and Problem Identification Skills.

    PubMed

    Nease, Beth M; Haney, Tina S

    Astute observation, description, and problem identification skills provide the underpinning for nursing assessment, surveillance, and prevention of failure to rescue events. Art-based education has been effective in nursing schools for improving observation, description, and problem identification. The authors describe a randomized controlled pilot study testing the effectiveness of an art-based educational intervention aimed at improving these skills in practicing nurses.

  20. Not a Copernican observer: biased peculiar velocity statistics in the local Universe

    NASA Astrophysics Data System (ADS)

    Hellwing, Wojciech A.; Nusser, Adi; Feix, Martin; Bilicki, Maciej

    2017-05-01

    We assess the effect of the local large-scale structure on the estimation of two-point statistics of the observed radial peculiar velocities of galaxies. A large N-body simulation is used to examine these statistics from the perspective of random observers as well as 'Local Group-like' observers conditioned to reside in an environment resembling the observed Universe within 20 Mpc. The local environment systematically distorts the shape and amplitude of velocity statistics with respect to ensemble-averaged measurements made by a Copernican (random) observer. The Virgo cluster has the most significant impact, introducing large systematic deviations in all the statistics. For a simple 'top-hat' selection function, an idealized survey extending to ~160 h^-1 Mpc or deeper is needed to completely mitigate the effects of the local environment. Using shallower catalogues leads to systematic deviations of the order of 50-200 per cent depending on the scale considered. For a flat redshift distribution similar to that of the CosmicFlows-3 survey, the deviations are even more prominent in both the shape and amplitude at all separations considered (≲100 h^-1 Mpc). Conclusions based on statistics calculated without taking into account the impact of the local environment should be revisited.

  1. Optimizing the LSST Dither Pattern for Survey Uniformity

    NASA Astrophysics Data System (ADS)

    Awan, Humna; Gawiser, Eric J.; Kurczynski, Peter; Carroll, Christopher M.; LSST Dark Energy Science Collaboration

    2015-01-01

    The Large Synoptic Survey Telescope (LSST) will gather detailed data of the southern sky, enabling unprecedented study of Baryonic Acoustic Oscillations, which are an important probe of dark energy. These studies require a survey with highly uniform depth, and we aim to find an observation strategy that optimizes this uniformity. We have shown that in the absence of dithering (large telescope-pointing offsets), the LSST survey will vary significantly in depth. Hence, we implemented various dithering strategies, including random and repulsive random pointing offsets and spiral patterns with the spiral reaching completion in either a few months or the entire ten-year run. We employed three different implementations of dithering strategies: a single offset assigned to all fields observed on each night, offsets assigned to each field independently whenever the field is observed, and offsets assigned to each field only when the field is observed on a new night. Our analysis reveals that large dithers are crucial to guarantee survey uniformity and that assigning dithers to each field independently whenever the field is observed significantly increases this uniformity. These results suggest paths towards an optimal observation strategy that will enable LSST to achieve its science goals. We gratefully acknowledge support from the National Science Foundation REU program at Rutgers, PHY-1263280, and the Department of Energy, DE-SC0011636.
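
    A toy version of the per-field, per-night random dithering strategy described above is sketched below; the 1.75 degree maximum offset (roughly the LSST field radius) and the uniform-in-disk sampling are assumptions made only for illustration.

        import numpy as np

        def random_dither(rng, max_offset_deg=1.75):
            # Draw one pointing offset uniformly over a disk of radius max_offset_deg.
            r = max_offset_deg * np.sqrt(rng.uniform())   # sqrt gives uniform areal density
            theta = rng.uniform(0.0, 2.0 * np.pi)
            return r * np.cos(theta), r * np.sin(theta)

        # One offset per (field, night); every visit to that field on that night reuses it.
        rng = np.random.default_rng(42)
        offsets = {(field, night): random_dither(rng)
                   for field in range(5) for night in range(3)}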

  2. Effects of music therapy and distraction cards on pain relief during phlebotomy in children.

    PubMed

    Aydin, Diler; Sahiner, Nejla Canbulat

    2017-02-01

    To investigate three different distraction methods (distraction cards, listening to music, and distraction cards + music) on pain and anxiety relief in children during phlebotomy. This study was a prospective, randomized, controlled trial. The sample consisted of children aged 7 to 12 years who required blood tests. The children were randomized into four groups, distraction cards, music, distraction cards + music, and controls. Data were obtained through face-to-face interviews with the children, their parents, and the observer before and after the procedure. The children's pain levels were assessed and reported by the parents and observers, and by the children themselves, who self-reported using the Wong-Baker FACES scale. The children's anxiety levels were also assessed using the Children's Fear Scale. Two hundred children (mean age: 9.01 ± 2.35 years) were included. No difference was found between the groups in the self-, parent-, and observer-reported procedural pain levels (p=0.72, p=0.23, p=0.15, respectively). Furthermore, no significant differences were observed between groups in procedural child anxiety levels according to the parents and observer (p=0.092, p=0.096, respectively). Pain and anxiety relief was seen with all three methods during phlebotomy; however, no statistically significant difference was observed. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Packet Randomized Experiments for Eliminating Classes of Confounders

    PubMed Central

    Pavela, Greg; Wiener, Howard; Fontaine, Kevin R.; Fields, David A.; Voss, Jameson D.; Allison, David B.

    2014-01-01

    Background Although randomization is considered essential for causal inference, it is often not possible to randomize in nutrition and obesity research. To address this, we develop a framework for an experimental design—packet randomized experiments (PREs), which improves causal inferences when randomization on a single treatment variable is not possible. This situation arises when subjects are randomly assigned to a condition (such as a new roommate) which varies in one characteristic of interest (such as weight), but also varies across many others. There has been no general discussion of this experimental design, including its strengths, limitations, and statistical properties. As such, researchers are left to develop and apply PREs on an ad hoc basis, limiting its potential to improve causal inferences among nutrition and obesity researchers. Methods We introduce PREs as an intermediary design between randomized controlled trials and observational studies. We review previous research that used the PRE design and describe its application in obesity-related research, including random roommate assignments, heterochronic parabiosis, and the quasi-random assignment of subjects to geographic areas. We then provide a statistical framework to control for potential packet-level confounders not accounted for by randomization. Results PREs have successfully been used to improve causal estimates of the effect of roommates, altitude, and breastfeeding on weight outcomes. When certain assumptions are met, PREs can asymptotically control for packet-level characteristics. This has the potential to statistically estimate the effect of a single treatment even when randomization to a single treatment did not occur. Conclusions Applying PREs to obesity-related research will improve decisions about clinical, public health, and policy actions insofar as it offers researchers new insight into cause and effect relationships among variables. PMID:25444088

  4. Statistical simulation of ensembles of precipitation fields for data assimilation applications

    NASA Astrophysics Data System (ADS)

    Haese, Barbara; Hörning, Sebastian; Chwala, Christian; Bárdossy, András; Schalge, Bernd; Kunstmann, Harald

    2017-04-01

    The simulation of the hydrological cycle by models is an indispensable tool for a variety of environmental challenges such as climate prediction, water resources management, or flood forecasting. One of the crucial variables within the hydrological system, and accordingly one of the main drivers for terrestrial hydrological processes, is precipitation. A correct reproduction of the spatio-temporal distribution of precipitation is crucial for the quality and performance of hydrological applications. In our approach we stochastically generate precipitation fields conditioned on various precipitation observations. Rain gauges provide high-quality information for a specific measurement point, but their spatial representativeness is often limited. Microwave links, e.g. from commercial cellular operators, on the other hand can be used to estimate line integrals of near-surface rainfall information. They provide a very dense observational system compared to rain gauges. A further prevalent source of precipitation information is weather radar, which provides information on rainfall patterns. In our approach we derive precipitation fields which are conditioned on combinations of these different observation types. As the method to generate precipitation fields we use the random mixing method. Following this method, a precipitation field is obtained as a linear combination of unconditional spatial random fields, where the spatial dependence structure is described by copulas. The weights of the linear combination are chosen such that the observations and the spatial structure of precipitation are reproduced. One main advantage of the random mixing method is the opportunity to consider linear and non-linear constraints. For a demonstration of the method we use virtual observations generated from a virtual reality of the Neckar catchment. These virtual observations mimic the advantages and disadvantages of real observations. This virtual data set allows us to evaluate simulated precipitation fields in a very detailed manner as well as to quantify uncertainties introduced by measurement inaccuracies. In a further step we use real observations as a basis for the generation of precipitation fields. The resulting ensembles of precipitation fields are used, for example, for data assimilation applications or as input data for hydrological models.
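
    To make the mixing idea concrete, the sketch below combines a set of unconditional correlated fields with weights chosen so that the combination reproduces a few point observations. It is only an illustration: the field generator, grid size, and observation values are made up, and the full random mixing method additionally constrains the weights so that the combined field keeps the prescribed copula-based spatial structure and can handle non-linear (e.g. line-integral) constraints.

        import numpy as np

        def unconditional_field(rng, nx, ny, length_scale):
            # Spatially correlated Gaussian field made by convolving white noise with an
            # exponential kernel (illustrative generator, not the one used in the paper).
            white = rng.normal(size=(nx, ny))
            x = np.arange(nx)[:, None] - nx // 2
            y = np.arange(ny)[None, :] - ny // 2
            kernel = np.exp(-np.sqrt(x**2 + y**2) / length_scale)
            field = np.real(np.fft.ifft2(np.fft.fft2(white) *
                                         np.fft.fft2(np.fft.ifftshift(kernel))))
            return (field - field.mean()) / field.std()

        rng = np.random.default_rng(0)
        nfields, nx, ny = 30, 64, 64
        U = np.array([unconditional_field(rng, nx, ny, 10.0) for _ in range(nfields)])

        obs_ij = [(10, 12), (40, 50), (25, 30)]          # observation grid cells (made up)
        obs_val = np.array([1.2, -0.4, 0.7])             # observed values (made up)

        # Minimum-norm weights that reproduce the point observations exactly.
        A = np.array([[U[k, i, j] for k in range(nfields)] for (i, j) in obs_ij])
        alpha, *_ = np.linalg.lstsq(A, obs_val, rcond=None)
        Z = np.tensordot(alpha, U, axes=1)               # conditioned field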

  5. False Operation of Static Random Access Memory Cells under Alternating Current Power Supply Voltage Variation

    NASA Astrophysics Data System (ADS)

    Sawada, Takuya; Takata, Hidehiro; Nii, Koji; Nagata, Makoto

    2013-04-01

    Static random access memory (SRAM) cores exhibit susceptibility against power supply voltage variation. False operation is investigated among SRAM cells under sinusoidal voltage variation on power lines introduced by direct RF power injection. A standard SRAM core of 16 kbyte in a 90 nm 1.5 V technology is diagnosed with built-in self test and on-die noise monitor techniques. The sensitivity of bit error rate is shown to be high against the frequency of injected voltage variation, while it is not greatly influenced by the difference in frequency and phase against SRAM clocking. It is also observed that the distribution of false bits is substantially random in a cell array.

  6. Random electric field instabilities of relaxor ferroelectrics

    NASA Astrophysics Data System (ADS)

    Arce-Gamboa, José R.; Guzmán-Verri, Gian G.

    2017-06-01

    Relaxor ferroelectrics are complex oxide materials which are rather unique to study the effects of compositional disorder on phase transitions. Here, we study the effects of quenched cubic random electric fields on the lattice instabilities that lead to a ferroelectric transition and show that, within a microscopic model and a statistical mechanical solution, even weak compositional disorder can prohibit the development of long-range order and that a random field state with anisotropic and power-law correlations of polarization emerges from the combined effect of their characteristic dipole forces and their inherent charge disorder. We compare and reproduce several key experimental observations in the well-studied relaxor PbMg1/3Nb2/3O3-PbTiO3.

  7. Errors in radial velocity variance from Doppler wind lidar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, H.; Barthelmie, R. J.; Doubrawa, P.

    A high-fidelity lidar turbulence measurement technique relies on accurate estimates of radial velocity variance that are subject to both systematic and random errors determined by the autocorrelation function of radial velocity, the sampling rate, and the sampling duration. Our paper quantifies the effect of the volumetric averaging in lidar radial velocity measurements on the autocorrelation function and the dependence of the systematic and random errors on the sampling duration, using both statistically simulated and observed data. For current-generation scanning lidars and sampling durations of about 30 min and longer, during which the stationarity assumption is valid for atmospheric flows, the systematic error is negligible but the random error exceeds about 10%.

  8. Dual operation characteristics of resistance random access memory in indium-gallium-zinc-oxide thin film transistors

    NASA Astrophysics Data System (ADS)

    Yang, Jyun-Bao; Chang, Ting-Chang; Huang, Jheng-Jie; Chen, Yu-Chun; Chen, Yu-Ting; Tseng, Hsueh-Chih; Chu, Ann-Kuo; Sze, Simon M.

    2014-04-01

    In this study, indium-gallium-zinc-oxide thin film transistors can be operated either as transistors or resistance random access memory devices. Before the forming process, current-voltage curve transfer characteristics are observed, and resistance switching characteristics are measured after a forming process. These resistance switching characteristics exhibit two behaviors, and are dominated by different mechanisms. The mode 1 resistance switching behavior is due to oxygen vacancies, while mode 2 is dominated by the formation of an oxygen-rich layer. Furthermore, an easy approach is proposed to reduce power consumption when using these resistance random access memory devices with the amorphous indium-gallium-zinc-oxide thin film transistor.

  9. Plasma fluctuations as Markovian noise.

    PubMed

    Li, B; Hazeltine, R D; Gentle, K W

    2007-12-01

    Noise theory is used to study the correlations of stationary Markovian fluctuations that are homogeneous and isotropic in space. The relaxation of the fluctuations is modeled by the diffusion equation. The spatial correlations of random fluctuations are modeled by the exponential decay. Based on these models, the temporal correlations of random fluctuations, such as the correlation function and the power spectrum, are calculated. We find that the diffusion process can give rise to the decay of the correlation function and a broad frequency spectrum of random fluctuations. We also find that the transport coefficients may be estimated by the correlation length and the correlation time. The theoretical results are compared with the observed plasma density fluctuations from the tokamak and helimak experiments.
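
    The scaling implied by the last sentence can be written compactly; the relation below is only a dimensional estimate consistent with the diffusive relaxation model, and the order-unity prefactor is not given in the abstract:

        C(r) \sim e^{-r/\ell_c}, \qquad
        \partial_t \, \delta n = D \, \nabla^2 \delta n
        \;\Longrightarrow\;
        D \sim \frac{\ell_c^{\,2}}{\tau_c},

    where \ell_c denotes the correlation length and \tau_c the correlation time of the fluctuations.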

  10. Models for the hotspot distribution

    NASA Technical Reports Server (NTRS)

    Jurdy, Donna M.; Stefanick, Michael

    1990-01-01

    Published hotspot catalogs all show a hemispheric concentration beyond what can be expected by chance. Cumulative distributions about the center of concentration are described by a power law with a fractal dimension closer to 1 than 2. Random sets of the corresponding sizes do not show this effect. A simple shift of the random sets away from a point would produce distributions similar to those of hotspot sets. The possible relation of the hotspots to the locations of ridges and subduction zones is tested using large sets of randomly-generated points to estimate areas within given distances of the plate boundaries. The probability of finding the observed number of hotspots within 10 deg of the ridges is about what is expected.
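
    The area estimate described in the last two sentences amounts to a Monte Carlo integration over the sphere; a sketch is given below, with randomly generated stand-ins for the digitized plate-boundary locations (the actual ridge and trench coordinates are not reproduced here).

        import numpy as np

        def random_points_on_sphere(rng, n):
            # Area-uniform random unit vectors: longitude uniform, sin(latitude) uniform.
            lon = rng.uniform(0.0, 2.0 * np.pi, n)
            z = rng.uniform(-1.0, 1.0, n)
            r = np.sqrt(1.0 - z**2)
            return np.column_stack([r * np.cos(lon), r * np.sin(lon), z])

        def fraction_within(points, boundary, angle_deg):
            # Fraction of points lying within angle_deg of the nearest boundary point.
            cos_min = np.cos(np.radians(angle_deg))
            return ((points @ boundary.T).max(axis=1) >= cos_min).mean()

        rng = np.random.default_rng(7)
        boundary = random_points_on_sphere(rng, 500)       # stand-in for ridge locations
        mc = random_points_on_sphere(rng, 100_000)
        area_frac = fraction_within(mc, boundary, 10.0)    # expected chance fraction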

  11. Errors in radial velocity variance from Doppler wind lidar

    DOE PAGES

    Wang, H.; Barthelmie, R. J.; Doubrawa, P.; ...

    2016-08-29

    A high-fidelity lidar turbulence measurement technique relies on accurate estimates of radial velocity variance that are subject to both systematic and random errors determined by the autocorrelation function of radial velocity, the sampling rate, and the sampling duration. Our paper quantifies the effect of the volumetric averaging in lidar radial velocity measurements on the autocorrelation function and the dependence of the systematic and random errors on the sampling duration, using both statistically simulated and observed data. For current-generation scanning lidars and sampling durations of about 30 min and longer, during which the stationarity assumption is valid for atmospheric flows, the systematic error is negligible but the random error exceeds about 10%.

  12. Use of a Linear Paul Trap to Study Random Noise-Induced Beam Degradation in High-Intensity Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chung, Moses; Gilson, Erik P.; Davidson, Ronald C.

    2009-04-10

    A random noise-induced beam degradation that can affect intense beam transport over long propagation distances has been experimentally studied by making use of the transverse beam dynamics equivalence between an alternating-gradient (AG) focusing system and a linear Paul trap system. For the present studies, machine imperfections in the quadrupole focusing lattice are considered, which are emulated by adding small random noise on the voltage waveform of the quadrupole electrodes in the Paul trap. It is observed that externally driven noise continuously produces a nonthermal tail of trapped ions, and increases the transverse emittance almost linearly with the duration of the noise.

  13. Random walker in temporally deforming higher-order potential forces observed in a financial crisis.

    PubMed

    Watanabe, Kota; Takayasu, Hideki; Takayasu, Misako

    2009-11-01

    Basic peculiarities of market price fluctuations are known to be well described by a recently developed random-walk model in a temporally deforming quadratic potential force whose center is given by a moving average of past price traces [M. Takayasu, T. Mizuno, and H. Takayasu, Physica A 370, 91 (2006)]. By analyzing high-frequency financial time series of exceptional events, such as bubbles and crashes, we confirm the appearance of higher-order potential force in the markets. We show statistical significance of its existence by applying the information criterion. This time series analysis is expected to be applied widely for detecting a nonstationary symptom in random phenomena.
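
    A toy discretization of the class of model referred to above (a random walk attracted toward the moving average of its own recent past, with an optional higher-order term standing in for the higher-order potential force) is sketched below; all coefficients are arbitrary illustrative values.

        import numpy as np

        def simulate_potential_walk(n_steps=10_000, window=20, b2=0.3, b3=0.0,
                                    noise=1.0, seed=0):
            # p[t+1] = p[t] - b2*(p[t] - m[t]) - b3*(p[t] - m[t])**3 + noise,
            # where m[t] is the moving average of the previous `window` values.
            rng = np.random.default_rng(seed)
            p = np.zeros(n_steps)
            for t in range(window, n_steps - 1):
                dx = p[t] - p[t - window:t].mean()
                p[t + 1] = p[t] - b2 * dx - b3 * dx**3 + noise * rng.normal()
            return p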

  14. Random crystal field effects on the integer and half-integer mixed-spin system

    NASA Astrophysics Data System (ADS)

    Yigit, Ali; Albayrak, Erhan

    2018-05-01

    In this work, we have focused on the random crystal field effects on the phase diagrams of the mixed spin-1 and spin-5/2 Ising system obtained by utilizing the exact recursion relations (ERR) on the Bethe lattice (BL). The distribution function P(D_i) = p δ[D_i - D(1 + α)] + (1 - p) δ[D_i - D(1 - α)] is used to randomize the crystal field. The phase diagrams are found to exhibit second- and first-order phase transitions depending on the values of α, D and p. It is also observed that the model displays a tricritical point, an isolated point, a critical end point and three compensation temperatures for suitable values of the system parameters.

  15. Sun compass error model

    NASA Technical Reports Server (NTRS)

    Blucker, T. J.; Ferry, W. W.

    1971-01-01

    An error model is described for the Apollo 15 sun compass, a contingency navigational device. Field test data are presented along with significant results of the test. The errors reported include a random error resulting from tilt in leveling the sun compass, a random error because of observer sighting inaccuracies, a bias error because of mean tilt in compass leveling, a bias error in the sun compass itself, and a bias error because the device is leveled to the local terrain slope.

  16. Continuous-time random-walk model for financial distributions

    NASA Astrophysics Data System (ADS)

    Masoliver, Jaume; Montero, Miquel; Weiss, George H.

    2003-02-01

    We apply the formalism of the continuous-time random walk to the study of financial data. The entire distribution of prices can be obtained once two auxiliary densities are known. These are the probability density for the pausing time between successive jumps and the corresponding probability density for the magnitude of a jump. We have applied the formalism to data on the U.S. dollar-Deutsche mark futures exchange, finding good agreement between theory and the observed data.
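
    A minimal simulation of the continuous-time random walk described above is sketched below, using an exponential pausing-time density and a Gaussian jump-size density purely as illustrative choices (the paper estimates both densities from market data rather than assuming parametric forms).

        import numpy as np

        def simulate_ctrw(t_max, mean_wait, jump_std, seed=0):
            # Alternate draws from the pausing-time density and the jump-size density
            # until the simulated clock passes t_max; returns event times and log-prices.
            rng = np.random.default_rng(seed)
            t, x = 0.0, 0.0
            times, prices = [0.0], [0.0]
            while t < t_max:
                t += rng.exponential(mean_wait)    # pausing time between successive jumps
                x += rng.normal(0.0, jump_std)     # magnitude of the jump
                times.append(t)
                prices.append(x)
            return np.array(times), np.array(prices)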

  17. Evaluation of the effect of Spiritual care on patients with generalized anxiety and depression: a randomized controlled study.

    PubMed

    Sankhe, A; Dalal, K; Save, D; Sarve, P

    2017-12-01

    The present study was conducted to assess the effect of spiritual care in patients with depression, anxiety or both in a randomized controlled design. The participants were randomized either to receive spiritual care or not, and the Hamilton anxiety rating scale (HAM-A), the Hamilton depression rating scale (HAM-D), the WHO quality of life-brief (WHOQOL-BREF) and the Functional Assessment of Chronic Illness Therapy - Spiritual Well-being (FACIT-Sp) were assessed before therapy and at two follow-ups at 3 and 6 weeks. In the spiritual care therapy group, statistically significant differences were observed in both the HAM-A and HAM-D scales between baseline and visit 2 (p < 0.001), indicating significantly reduced symptoms of anxiety and depression, respectively. No statistically significant differences were observed for any of the scales during the follow-up periods for the control group of participants. When the scores were compared between the study groups, HAM-A, HAM-D and FACIT-Sp 12 scores were significantly lower in the interventional group than in the control group at both the third and sixth weeks. This suggests a greater improvement in symptoms of anxiety and depression in the spiritual care therapy group than in the control group; however, large randomized controlled trials with robust design are needed to confirm this.

  18. Propagation of finite amplitude sound through turbulence: Modeling with geometrical acoustics and the parabolic approximation

    NASA Astrophysics Data System (ADS)

    Blanc-Benon, Philippe; Lipkens, Bart; Dallois, Laurent; Hamilton, Mark F.; Blackstock, David T.

    2002-01-01

    Sonic boom propagation can be affected by atmospheric turbulence. It has been shown that turbulence affects the perceived loudness of sonic booms, mainly by changing its peak pressure and rise time. The models reported here describe the nonlinear propagation of sound through turbulence. Turbulence is modeled as a set of individual realizations of a random temperature or velocity field. In the first model, linear geometrical acoustics is used to trace rays through each realization of the turbulent field. A nonlinear transport equation is then derived along each eigenray connecting the source and receiver. The transport equation is solved by a Pestorius algorithm. In the second model, the KZK equation is modified to account for the effect of a random temperature field and it is then solved numerically. Results from numerical experiments that simulate the propagation of spark-produced N waves through turbulence are presented. It is observed that turbulence decreases, on average, the peak pressure of the N waves and increases the rise time. Nonlinear distortion is less when turbulence is present than without it. The effects of random vector fields are stronger than those of random temperature fields. The location of the caustics and the deformation of the wave front are also presented. These observations confirm the results from the model experiment in which spark-produced N waves are used to simulate sonic boom propagation through a turbulent atmosphere.

  19. Propagation of finite amplitude sound through turbulence: modeling with geometrical acoustics and the parabolic approximation.

    PubMed

    Blanc-Benon, Philippe; Lipkens, Bart; Dallois, Laurent; Hamilton, Mark F; Blackstock, David T

    2002-01-01

    Sonic boom propagation can be affected by atmospheric turbulence. It has been shown that turbulence affects the perceived loudness of sonic booms, mainly by changing its peak pressure and rise time. The models reported here describe the nonlinear propagation of sound through turbulence. Turbulence is modeled as a set of individual realizations of a random temperature or velocity field. In the first model, linear geometrical acoustics is used to trace rays through each realization of the turbulent field. A nonlinear transport equation is then derived along each eigenray connecting the source and receiver. The transport equation is solved by a Pestorius algorithm. In the second model, the KZK equation is modified to account for the effect of a random temperature field and it is then solved numerically. Results from numerical experiments that simulate the propagation of spark-produced N waves through turbulence are presented. It is observed that turbulence decreases, on average, the peak pressure of the N waves and increases the rise time. Nonlinear distortion is less when turbulence is present than without it. The effects of random vector fields are stronger than those of random temperature fields. The location of the caustics and the deformation of the wave front are also presented. These observations confirm the results from the model experiment in which spark-produced N waves are used to simulate sonic boom propagation through a turbulent atmosphere.

  20. Sleep Promotion Program for Improving Sleep Behaviors in Adolescents: A Randomized Controlled Pilot Study

    PubMed Central

    John, Bindu; Bellipady, Sumanth Shetty; Bhat, Shrinivasa Undaru

    2016-01-01

    Aims. The purpose of this pilot trial was to determine the efficacy of a sleep promotion program, to adapt it for use by adolescents studying in various schools of Mangalore, India, and to evaluate feasibility issues before conducting a randomized controlled trial in a larger sample of adolescents. Methods. A randomized controlled trial design with a stratified random sampling method was used. Fifty-eight adolescents were selected (mean age: 14.02 ± 2.15 years; intervention group, n = 34; control group, n = 24). Self-report questionnaires were used, including a sociodemographic questionnaire with some additional questions on sleep and activities, the Sleep Hygiene Index, the Pittsburgh Sleep Quality Index, the Cleveland Adolescent Sleepiness Questionnaire, and the PedsQL™ Present Functioning Visual Analogue Scale. Results. Insufficient weekday and weekend sleep duration, which worsened with increasing age, was observed. The program showed a significant effect favoring the experimental group over the control group in overall sleep quality, sleep onset latency, sleep duration, daytime sleepiness, and emotional and overall distress. No significant effect was observed in sleep hygiene and other sleep parameters. All target variables showed significant correlations with each other. Conclusion. The intervention holds promise for improving sleep behaviors in healthy adolescents. However, the effect of the sleep promotion program has yet to be confirmed through future research. This trial is registered with ISRCTN13083118. PMID:27088040

  1. Characterization of cancer and normal tissue fluorescence through wavelet transform and singular value decomposition

    NASA Astrophysics Data System (ADS)

    Gharekhan, Anita H.; Biswal, Nrusingh C.; Gupta, Sharad; Pradhan, Asima; Sureshkumar, M. B.; Panigrahi, Prasanta K.

    2008-02-01

    The statistical and characteristic features of the polarized fluorescence spectra from cancer, normal and benign human breast tissues are studied through wavelet transform and singular value decomposition. The discrete wavelets enabled one to isolate high and low frequency spectral fluctuations, which revealed substantial randomization in the cancerous tissues, not present in the normal cases. In particular, the fluctuations fitted well with a Gaussian distribution for the cancerous tissues in the perpendicular component. One finds non-Gaussian behavior for normal and benign tissues' spectral variations. The study of the difference of intensities in parallel and perpendicular channels, which is free from the diffusive component, revealed weak fluorescence activity in the 630nm domain, for the cancerous tissues. This may be ascribable to porphyrin emission. The role of both scatterers and fluorophores in the observed minor intensity peak for the cancer case is experimentally confirmed through tissue-phantom experiments. Continuous Morlet wavelet also highlighted this domain for the cancerous tissue fluorescence spectra. Correlation in the spectral fluctuation is further studied in different tissue types through singular value decomposition. Apart from identifying different domains of spectral activity for diseased and non-diseased tissues, we found random matrix support for the spectral fluctuations. The small eigenvalues of the perpendicular polarized fluorescence spectra of cancerous tissues fitted remarkably well with random matrix prediction for Gaussian random variables, confirming our observations about spectral fluctuations in the wavelet domain.

  2. Effect of Spinal Manipulative Therapy on the Singing Voice.

    PubMed

    Fachinatto, Ana Paula A; Duprat, André de Campos; Silva, Marta Andrada E; Bracher, Eduardo Sawaya Botelho; Benedicto, Camila de Carvalho; Luz, Victor Botta Colangelo; Nogueira, Maruan Nogueira; Fonseca, Beatriz Suster Gomes

    2015-09-01

    This study investigated the effect of spinal manipulative therapy (SMT) on the singing voice of male individuals. Randomized, controlled, case-crossover trial. Twenty-nine subjects were selected among male members of the Heralds of the Gospel. This association was chosen because it is a group of persons with similar singing activities. Participants were randomly assigned to two groups: (A) chiropractic SMT procedure and (B) nontherapeutic transcutaneous electrical nerve stimulation (TENS) procedure. Recordings of the singing voice of each participant were taken immediately before and after the procedures. After a 14-day period, procedures were switched between groups: participants who underwent SMT on the first day were subjected to TENS and vice versa. Recordings were subjected to perceptual audio and acoustic evaluations. The same recording segment of each participant was selected. Perceptual audio evaluation was performed by a specialist panel (SP). Recordings of each participant were randomly presented thus making the SP blind to intervention type and recording session (before/after intervention). Recordings compiled in a randomized order were also subjected to acoustic evaluation. No differences in the quality of the singing on perceptual audio evaluation were observed between TENS and SMT. No differences in the quality of the singing voice of asymptomatic male singers were observed on perceptual audio evaluation or acoustic evaluation after a single spinal manipulative intervention of the thoracic and cervical spine. Copyright © 2015 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  3. Does Assessing Suicidality Frequently and Repeatedly Cause Harm? A Randomized Control Study

    PubMed Central

    Law, Mary Kate; Furr, R. Michael; Arnold, Elizabeth Mayfield; Mneimne, Malek; Jaquett, Caroline; Fleeson, William

    2015-01-01

    Assessing suicidality is common in mental health practice and is fundamental to suicide research. Although necessary, there is significant concern that such assessments have unintended harmful consequences. Using a longitudinal randomized control design, we evaluated whether repeated and frequent assessments of suicide-related thoughts and behaviors negatively affected individuals, including those at-risk for suicide-related outcomes. Adults (N = 282), including many diagnosed with Borderline Personality Disorder (BPD), were recruited through psychiatric outpatient clinics and from the community at large, and were randomly assigned to assessment groups. A Control Assessment group responded to questions regarding negative psychological experiences several times each day during a 2-week Main Observation phase. During the same observation period, an Intensive Suicide Assessment group responded to the same questions, along with questions regarding suicidal behavior and ideation. Negative psychological outcomes were measured during the Main Observation phase (for BPD symptoms unrelated to suicide and for BPD-relevant emotions) and/or at the end of each week during the Main Observation phase and monthly for 6 months thereafter (for all outcomes, including suicidal ideation and behavior). Results revealed little evidence that intensive suicide assessment triggered negative outcomes, including suicidal ideation or behavior, even among people with BPD. A handful of effects did reach or approach significance, though these were temporary and non-robust. However, given the seriousness of some outcomes, we recommend that researchers or clinicians who implement experience sampling methods including suicide-related items carefully consider the benefits of asking about suicide and to inform participants about possible risks. PMID:25894705

  4. Grouping by proximity and the visual impression of approximate number in random dot arrays.

    PubMed

    Im, Hee Yeon; Zhong, Sheng-Hua; Halberda, Justin

    2016-09-01

    We address the challenges of how to model human perceptual grouping in random dot arrays and how perceptual grouping affects human number estimation in these arrays. We introduce a modeling approach relying on a modified k-means clustering algorithm to formally describe human observers' grouping behavior. We found that a default grouping window size of approximately 4° of visual angle describes human grouping judgments across a range of random dot arrays (i.e., items within 4° are grouped together). This window size was highly consistent across observers and images, and was also stable across stimulus durations, suggesting that the k-means model captured a robust signature of perceptual grouping. Further, the k-means model outperformed other models (e.g., CODE) at describing human grouping behavior. Next, we found that the more the dots in a display are clustered together, the more human observers tend to underestimate the numerosity of the dots. We demonstrate that this effect is independent of density, and the modified k-means model can predict human observers' numerosity judgments and underestimation. Finally, we explored the robustness of the relationship between clustering and dot number underestimation and found that the effects of clustering remain, but are greatly reduced, when participants receive feedback on every trial. Together, this work suggests some promising avenues for formal models of human grouping behavior, and it highlights the importance of a 4° window of perceptual grouping. Lastly, it reveals a robust, somewhat plastic, relationship between perceptual grouping and number estimation. Copyright © 2015 Elsevier Ltd. All rights reserved.
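
    The grouping rule reported above ("items within 4° are grouped together") can be approximated with a simple distance-threshold clustering; the sketch below uses single-linkage clustering at a 4° cutoff as a stand-in for the paper's modified k-means algorithm, and the toy 20° x 20° random dot array is invented for illustration.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        def group_dots(positions_deg, window_deg=4.0):
            # Cluster dot positions (in degrees of visual angle) so that dots linked by
            # gaps smaller than window_deg fall into the same group.
            Z = linkage(positions_deg, method='single')
            return fcluster(Z, t=window_deg, criterion='distance')

        rng = np.random.default_rng(3)
        dots = rng.uniform(0.0, 20.0, size=(60, 2))     # toy random dot array
        labels = group_dots(dots)
        n_groups = len(np.unique(labels))               # number of perceived clusters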

  5. Does assessing suicidality frequently and repeatedly cause harm? A randomized control study.

    PubMed

    Law, Mary Kate; Furr, R Michael; Arnold, Elizabeth Mayfield; Mneimne, Malek; Jaquett, Caroline; Fleeson, William

    2015-12-01

    Assessing suicidality is common in mental health practice and is fundamental to suicide research. Although necessary, there is significant concern that such assessments have unintended harmful consequences. Using a longitudinal randomized control design, the authors evaluated whether repeated and frequent assessments of suicide-related thoughts and behaviors negatively affected individuals, including those at-risk for suicide-related outcomes. Adults (N = 282), including many diagnosed with borderline personality disorder (BPD), were recruited through psychiatric outpatient clinics and from the community at large, and were randomly assigned to assessment groups. A control assessment group responded to questions regarding negative psychological experiences several times each day during a 2-week main observation phase. During the same observation period, an intensive suicide assessment group responded to the same questions, along with questions regarding suicidal behavior and ideation. Negative psychological outcomes were measured during the main observation phase (for BPD symptoms unrelated to suicide and for BPD-relevant emotions) and/or at the end of each week during the main observation phase and monthly for 6 months thereafter (for all outcomes, including suicidal ideation and behavior). Results revealed little evidence that intensive suicide assessment triggered negative outcomes, including suicidal ideation or behavior, even among people with BPD. A handful of effects did reach or approach significance, though these were temporary and nonrobust. However, given the seriousness of some outcomes, the authors recommend that researchers or clinicians who implement experience sampling methods including suicide-related items carefully consider the benefits of asking about suicide and to inform participants about possible risks. (c) 2015 APA, all rights reserved).

  6. An agent-based model of dialect evolution in killer whales.

    PubMed

    Filatova, Olga A; Miller, Patrick J O

    2015-05-21

    The killer whale is one of the few animal species with vocal dialects that arise from socially learned group-specific call repertoires. We describe a new agent-based model of killer whale populations and test a set of vocal-learning rules to assess which mechanisms may lead to the formation of dialect groupings observed in the wild. We tested a null model with genetic transmission and no learning, and ten models with learning rules that differ by template source (mother or matriline), variation type (random errors or innovations) and type of call change (no divergence from kin vs. divergence from kin). The null model without vocal learning did not produce the pattern of group-specific call repertoires we observe in nature. Learning from either mother alone or the entire matriline with calls changing by random errors produced a graded distribution of the call phenotype, without the discrete call types observed in nature. Introducing occasional innovation or random error proportional to matriline variance yielded more or less discrete and stable call types. A tendency to diverge from the calls of related matrilines provided fast divergence of loose call clusters. A pattern resembling the dialect diversity observed in the wild arose only when rules were applied in combinations and similar outputs could arise from different learning rules and their combinations. Our results emphasize the lack of information on quantitative features of wild killer whale dialects and reveal a set of testable questions that can draw insights into the cultural evolution of killer whale dialects. Copyright © 2015 Elsevier Ltd. All rights reserved.
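
    A toy version of one of the learning rules tested above (learning from the mother with small random copying errors plus occasional innovations) is sketched below; reducing a call repertoire to a single number and the specific rates used are simplifications made only for illustration.

        import numpy as np

        def learn_call(mother_call, rng, error_sd=0.05,
                       innovation_rate=0.01, innovation_sd=1.0):
            # Offspring call = mother's call + small copying error, with rare larger innovations.
            call = mother_call + rng.normal(0.0, error_sd)
            if rng.uniform() < innovation_rate:
                call += rng.normal(0.0, innovation_sd)
            return call

        # Evolve one matriline's call value over successive generations (illustrative only).
        rng = np.random.default_rng(11)
        calls = [0.0]
        for _ in range(200):
            calls.append(learn_call(calls[-1], rng))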

  7. Comparison of Metal-Ceramic and All-Ceramic Three-Unit Posterior Fixed Dental Prostheses: A 3-Year Randomized Clinical Trial.

    PubMed

    Nicolaisen, Maj H; Bahrami, Golnosh; Schropp, Lars; Isidor, Flemming

    2016-01-01

    The aim of this randomized clinical study was to compare the 3-year clinical outcome of metal-ceramic fixed dental prostheses (MC-FDPs) and zirconia all-ceramic fixed dental prostheses (AC-FDPs) replacing a posterior tooth. A sample of 34 patients with a missing posterior tooth was randomly assigned to receive either an MC-FDP (n = 17) or an AC-FDP (n = 17). The FDPs were evaluated at baseline and yearly until 3 years after cementation. They were assessed using the California Dental Association assessment system. Periodontal parameters were measured at the abutment teeth, and the contralateral teeth served as controls. The statistical unit was the FDP/patient. The survival rates for MC-FDPs and AC-FDPs were 100%. The success rates were 76% and 71% for MC-FDPs and AC-FDPs, respectively. Three technical complications were observed in the MC-FDP group and five in the AC-FDP group, all chipping fractures of the ceramic veneer. Furthermore, one biologic complication in the MC-FDP group (an apical lesion) was observed. No framework fractures occurred. All patients had optimal oral hygiene and showed no bleeding on periodontal probing at any of the recalls. Only minor changes in the periodontal parameters were observed during the 3 years of observation. Three-unit posterior MC-FDPs and AC-FDPs showed similar high survival rates and acceptable success rates after 3 years of function, and ceramic veneer chipping fracture was the most frequent complication for both types of restorations.

  8. The role of observational investigations in comparative effectiveness research.

    PubMed

    Marko, Nicholas F; Weil, Robert J

    2010-12-01

    Comparative effectiveness research (CER) seeks to inform clinical decisions between alternate treatment strategies using data that reflects real patient populations and real-world clinical scenarios for the purpose of improving patient outcomes. There are multiple clinical situations where the unique characteristics of observational investigations can inform medical decision-making within the CER paradigm. Accordingly, it is critical for clinicians to appreciate the strengths and limitations of observational research, particularly as they apply to CER. This review focuses on the role of observational research in CER. We discuss the concept of evidence hierarchies as they relate to observational research and CER, review the scope and nature of observational research, present the rationale for its inclusion in CER investigations, discuss potential sources of bias in observational investigations as well as strategies used to compensate for these biases, and discuss a framework to implement observational research in CER. The CER paradigm recognizes the limitations of hierarchical models of evidence and favors application of a strength-of-evidence model. In this model, observational research fills gaps in randomized clinical trial data and is particularly valuable to investigate effectiveness, harms, prognosis, and infrequent outcomes as well as in circumstances where randomization is not possible and in studies of many surgical populations. Observational investigations must be designed with careful consideration of potential sources of bias and must incorporate strategies to control such bias prospectively, and their results must be reported in a uniform and transparent fashion. When these conditions can be achieved, observational research represents a valuable and critical component of modern CER. © 2010, International Society for Pharmacoeconomics and Outcomes Research (ISPOR).

  9. Merging Marine Ecosystem Models and Genomics

    NASA Astrophysics Data System (ADS)

    Coles, V.; Hood, R. R.; Stukel, M. R.; Moran, M. A.; Paul, J. H.; Satinsky, B.; Zielinski, B.; Yager, P. L.

    2015-12-01

    One of the grand challenges of oceanography is to develop model techniques to more effectively incorporate genomic information. As one approach, we developed an ecosystem model whose community is determined by randomly assigning functional genes to build each organism's "DNA". Microbes are assigned a size that sets their baseline environmental responses using allometric response curves. These responses are modified by the costs and benefits conferred by each gene in an organism's genome. The microbes are embedded in a general circulation model where environmental conditions shape the emergent population. This model is used to explore whether organisms constructed from randomized combinations of metabolic capability alone can self-organize to create realistic oceanic biogeochemical gradients. Realistic community size spectra and chlorophyll-a concentrations emerge in the model. The model is run repeatedly with randomly generated microbial communities, and each time realistic gradients in community size spectra, chlorophyll-a, and forms of nitrogen develop. This supports the hypothesis that the metabolic potential of a community rather than the realized species composition is the primary factor setting vertical and horizontal environmental gradients. Vertical distributions of nitrogen and transcripts for genes involved in nitrification are broadly consistent with observations. Modeled gene and transcript abundance for nitrogen cycling and processing of land-derived organic material match observations along the extreme gradients in the Amazon River plume, and they help to explain the factors controlling observed variability.
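    As a toy illustration of the modeling idea described above, the following hypothetical Python sketch assigns each simulated microbe a random complement of functional genes, each carrying a maintenance cost and a context-dependent benefit layered on an allometric size-based baseline, and lets a simple competition loop pick out the gene combinations favored in a given environment. The gene names, costs, benefits, and selection loop are illustrative assumptions, not the published model.

```python
# Hypothetical "random genomes" sketch: random gene complements compete.
import numpy as np

rng = np.random.default_rng(0)

GENE_POOL = {          # gene: (maintenance cost, benefit if its resource is present)
    "nitrate_uptake":   (0.02, 0.30),
    "ammonium_uptake":  (0.01, 0.20),
    "photosystem":      (0.05, 0.50),
    "protease":         (0.02, 0.25),
    "urease":           (0.01, 0.15),
}

def make_organism():
    size = rng.lognormal(mean=0.0, sigma=1.0)               # cell size, arbitrary units
    genome = [g for g in GENE_POOL if rng.random() < 0.5]   # random gene complement
    return {"size": size, "genome": genome}

def growth_rate(org, environment):
    base = org["size"] ** -0.25                              # allometric baseline
    gain = sum(GENE_POOL[g][1] for g in org["genome"] if g in environment)
    cost = sum(GENE_POOL[g][0] for g in org["genome"])
    return base * (1.0 + gain) - cost

# Surface-like environment: light and ammonium usable.
environment = {"photosystem", "ammonium_uptake"}
community = [make_organism() for _ in range(200)]
abundance = np.ones(len(community))
for _ in range(50):                                          # simple exponential competition
    rates = np.array([growth_rate(o, environment) for o in community])
    abundance *= np.exp(0.1 * rates)
    abundance /= abundance.sum()                             # fixed total biomass

for i in np.argsort(abundance)[-3:]:
    print(round(abundance[i], 3), sorted(community[i]["genome"]))
```

    In this toy version the dominant organisms are those whose random gene complements happen to match the imposed environment, which is the spirit of the "metabolic potential rather than species composition" result summarized above.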

  10. Modafinil for the Improvement of Patient Outcomes Following Traumatic Brain Injury.

    PubMed

    Borghol, Amne; Aucoin, Michael; Onor, Ifeanyichukwu; Jamero, Dana; Hawawini, Fadi

    2018-04-01

    Objective: The authors sought to assess the literature evidence on the efficacy of modafinil use in patients with fatigue or excessive daytime sleepiness (EDS) secondary to traumatic brain injury (TBI). Method of Research: A literature search of Medline and PubMed was performed using the EBSCOhost database. Primary literature, observational studies, meta-analyses, case reports, and systematic reviews were assessed for content regarding modafinil and psychostimulant use in patients with TBI. Of the 23 articles collected, three randomized, controlled studies, three observational studies, one case report, and two systematic reviews gave a description of modafinil use in TBI patients. Results and Conclusion: Modafinil is a central nervous system stimulant with well-established effectiveness in the treatment of narcolepsy and shift-work sleep disorder. There is conflicting evidence about the benefits of modafinil in the treatment of fatigue and EDS secondary to TBI. One randomized, controlled study states that modafinil does not significantly improve patient wakefulness, while another concludes that modafinil corrects EDS but not fatigue. An observational study provides evidence that modafinil increases alertness in fatigued patients with past medical history of brainstem diencephalic stroke or multiple sclerosis. Modafinil appears to have the potential to improve wakefulness in patients with TBI. A prospective, double-blinded, randomized, crossover trial of modafinil for the management of fatigue in ischemic stroke patients is currently being conducted, and further studies demonstrating consistent results are needed before making a conclusive decision.

  11. The Relative Importance of Random Error and Observation Frequency in Detecting Trends in Upper Tropospheric Water Vapor

    NASA Technical Reports Server (NTRS)

    Whiteman, David N.; Vermeesch, Kevin C.; Oman, Luke D.; Weatherhead, Elizabeth C.

    2011-01-01

    Recent published work assessed the amount of time to detect trends in atmospheric water vapor over the coming century. We address the same question and conclude that under the most optimistic scenarios and assuming perfect data (i.e., observations with no measurement uncertainty) the time to detect trends will be at least 12 years at approximately 200 hPa in the upper troposphere. Our times to detect trends are therefore shorter than those recently reported and this difference is affected by data sources used, method of processing the data, geographic location and pressure level in the atmosphere where the analyses were performed. We then consider the question of how instrumental uncertainty plays into the assessment of time to detect trends. We conclude that due to the high natural variability in atmospheric water vapor, the amount of time to detect trends in the upper troposphere is relatively insensitive to instrumental random uncertainty and that it is much more important to increase the frequency of measurement than to decrease the random error in the measurement. This is put in the context of international networks such as the Global Climate Observing System (GCOS) Reference Upper-Air Network (GRUAN) and the Network for the Detection of Atmospheric Composition Change (NDACC) that are tasked with developing time series of climate quality water vapor data.

  12. The relative importance of random error and observation frequency in detecting trends in upper tropospheric water vapor

    NASA Astrophysics Data System (ADS)

    Whiteman, David N.; Vermeesch, Kevin C.; Oman, Luke D.; Weatherhead, Elizabeth C.

    2011-11-01

    Recent published work assessed the amount of time to detect trends in atmospheric water vapor over the coming century. We address the same question and conclude that under the most optimistic scenarios and assuming perfect data (i.e., observations with no measurement uncertainty) the time to detect trends will be at least 12 years at approximately 200 hPa in the upper troposphere. Our times to detect trends are therefore shorter than those recently reported and this difference is affected by data sources used, method of processing the data, geographic location and pressure level in the atmosphere where the analyses were performed. We then consider the question of how instrumental uncertainty plays into the assessment of time to detect trends. We conclude that due to the high natural variability in atmospheric water vapor, the amount of time to detect trends in the upper troposphere is relatively insensitive to instrumental random uncertainty and that it is much more important to increase the frequency of measurement than to decrease the random error in the measurement. This is put in the context of international networks such as the Global Climate Observing System (GCOS) Reference Upper-Air Network (GRUAN) and the Network for the Detection of Atmospheric Composition Change (NDACC) that are tasked with developing time series of climate quality water vapor data.
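    The trend-detection timescales discussed above are commonly estimated with the approximation of Weatherhead et al. (1998), in which the number of years needed to detect a linear trend depends on the trend magnitude, the standard deviation of the natural variability, and its lag-1 autocorrelation. The sketch below applies that formula to illustrative inputs; the numbers are assumptions chosen for demonstration, not values taken from this paper.

```python
# Hedged illustration of a standard trend-detection-time approximation.
import math

def years_to_detect(trend_per_year, noise_sd, lag1_autocorr):
    """n* = [ (3.3 * sigma_N / |omega0|) * sqrt((1 + phi) / (1 - phi)) ] ** (2/3)"""
    return ((3.3 * noise_sd / abs(trend_per_year))
            * math.sqrt((1 + lag1_autocorr) / (1 - lag1_autocorr))) ** (2.0 / 3.0)

# Illustrative inputs: a 1% per year trend, 10% natural variability (same units),
# and modest month-to-month autocorrelation.
print(round(years_to_detect(trend_per_year=0.01, noise_sd=0.10, lag1_autocorr=0.3), 1))
```

    With these made-up inputs the formula returns roughly 13 years, the same order of magnitude as the minimum detection time quoted in the abstract; the point of the formula is that detection time grows with variability and autocorrelation and shrinks with trend size.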

  13. Evaluation of family intervention through unobtrusive audio recordings: experiences in "bugging" children.

    PubMed

    Johnson, S M; Christensen, A; Bellamy, G T

    1976-01-01

    Five children referred to a child-family intervention program wore a radio transmitter in the home during pre-intervention and termination assessments. The transmitter broadcast to a receiver-recording apparatus in the home (either activated by an interval timer at predetermined "random" times or by parents at predetermined "picked" times). "Picked" times were parent-selected situations during which problems typically occurred (e.g., bedtime). Parents activated the recorder regularly whether or not problems occurred. Child-deviant, parent-negative, and parent-commanding behaviors were significantly higher at the picked times during pretest than at random times. At posttest, behaviors in all three classes were substantially reduced at picked times, but not at random times. For individual subject data, reductions occurred in at least two of the three dependent variables for three of the five cases during random time assessments. In general, the behavioral outcome data corresponded to parent-attitude reports and parent-collected observation data.

  14. Dynamic speckle - Interferometry of micro-displacements

    NASA Astrophysics Data System (ADS)

    Vladimirov, A. P.

    2012-06-01

    The problem of the dynamics of speckles in the image plane of an object, caused by random movements of scattering centers, is solved. We consider three cases: 1) during the observation the points move at random but constant speeds; 2) the relative displacement of any pair of points is a continuous random process; and 3) the motion of the centers is the sum of a deterministic movement and a random displacement. For cases 1) and 2), the characteristics of the temporal and spectral autocorrelation functions of the radiation intensity can be used to determine the individual and average relative displacements of the centers, their dispersion, and the relaxation time. For case 3), it is shown that under certain conditions the optical signal contains a periodic component, with the number of periods proportional to the derivations of the deterministic displacements. Results of experiments conducted to test and apply the theory are given.
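    The temporal autocorrelation analysis referred to above can be illustrated in a few lines of Python: estimate the normalized autocorrelation of a recorded intensity signal and read a relaxation time off its decay. The synthetic signal and the 1/e criterion below are illustrative assumptions, not the paper's experimental procedure.

```python
# Minimal sketch: normalized temporal autocorrelation of an intensity signal.
import numpy as np

def autocorrelation(signal):
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[x.size - 1:]
    return acf / acf[0]

# Synthetic "speckle" intensity: slowly decorrelating noise plus a weak periodic part.
rng = np.random.default_rng(0)
t = np.arange(2000)
noise = np.convolve(rng.normal(size=t.size), np.ones(50) / 50, mode="same")
intensity = 1.0 + 0.5 * noise + 0.1 * np.sin(2 * np.pi * t / 300)

acf = autocorrelation(intensity)
relaxation_lag = int(np.argmax(acf < 1 / np.e))   # first lag where the ACF drops below 1/e
print("estimated relaxation lag:", relaxation_lag, "samples")
```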

  15. Micromechanical analysis of composites with fibers distributed randomly over the transverse cross-section

    NASA Astrophysics Data System (ADS)

    Weng, Jingmeng; Wen, Weidong; Cui, Haitao; Chen, Bo

    2018-06-01

    A new method to generate a random distribution of fibers in the transverse cross-section of fiber-reinforced composites with a high fiber volume fraction is presented in this paper. Based on microscopy observation of the transverse cross-sections of unidirectional composite laminates, a hexagonal arrangement is set as the initial configuration, each fiber is assigned an arbitrary initial velocity in an arbitrary direction, and the micro-scale representative volume element (RVE) is established by simulating perfectly elastic collisions. Combined with the proposed periodic boundary conditions, which are suitable for multi-axial loading, the effective elastic properties of the composite material can be predicted. The predicted properties show reasonable agreement with experimental results. A comparison of the stress fields of the RVE with randomly distributed fibers and the RVE with periodically distributed fibers shows that the predicted elastic modulus of the randomly distributed RVE is greater than that of the periodically distributed RVE.
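    A simplified two-dimensional sketch of the randomization idea described above is given below: fibers start on a hexagonal lattice, receive arbitrary velocities, and their positions are shuffled by perfectly elastic equal-mass collisions in a periodic cell. The parameters, the fixed time step, and the brute-force pairwise overlap check are illustrative simplifications, not the authors' implementation.

```python
# Hypothetical 2-D sketch: randomize a hexagonal packing by elastic collisions.
import numpy as np

def random_fiber_layout(n_cols=5, n_rows=4, fiber_radius=0.35,
                        steps=1500, dt=0.005, seed=0):
    rng = np.random.default_rng(seed)
    dx = 1.0
    dy = dx * np.sqrt(3) / 2
    box = np.array([n_cols * dx, n_rows * dy])              # periodic cell size
    pos = np.array([[(j + 0.5 * (i % 2)) * dx, i * dy]      # hexagonal initial positions
                    for i in range(n_rows) for j in range(n_cols)], dtype=float)
    vel = rng.uniform(-1.0, 1.0, size=pos.shape)             # arbitrary initial velocities
    n = len(pos)
    for _ in range(steps):
        pos = (pos + vel * dt) % box                          # advance with periodic wrapping
        for i in range(n):                                    # brute-force overlap check
            for j in range(i + 1, n):
                d = pos[i] - pos[j]
                d -= box * np.round(d / box)                  # minimum-image separation
                dist = np.hypot(*d)
                if 0.0 < dist < 2 * fiber_radius:
                    normal = d / dist
                    vn = (vel[i] - vel[j]) @ normal
                    if vn < 0:                                # approaching: elastic bounce
                        vel[i] -= vn * normal                 # equal masses exchange the
                        vel[j] += vn * normal                 # normal velocity components
    return pos, box

positions, cell = random_fiber_layout()
print(len(positions), "fibers randomized in a periodic cell of size", cell)
```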

  16. Method of model reduction and multifidelity models for solute transport in random layered porous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Zhijie; Tartakovsky, Alexandre M.

    This work presents a hierarchical model for solute transport in bounded layered porous media with random permeability. The model generalizes the Taylor-Aris dispersion theory to stochastic transport in random layered porous media with a known velocity covariance function. In the hierarchical model, we represent (random) concentration in terms of its cross-sectional average and a variation function. We derive a one-dimensional stochastic advection-dispersion-type equation for the average concentration and a stochastic Poisson equation for the variation function, as well as expressions for the effective velocity and dispersion coefficient. We observe that velocity fluctuations enhance dispersion in a non-monotonic fashion: the dispersion initially increases with correlation length λ, reaches a maximum, and decreases to zero at infinity. Maximum enhancement is obtained at a correlation length of about 0.25 times the size of the porous medium perpendicular to the flow.
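    As a concrete illustration of the averaged equation mentioned above, the following sketch integrates a one-dimensional advection-dispersion equation for the cross-sectionally averaged concentration, dc/dt + U dc/dx = D_eff d2c/dx2, with an explicit upwind/central finite-difference scheme. The effective velocity and dispersion coefficient are illustrative placeholders, not values obtained from the paper's closures.

```python
# Minimal 1-D advection-dispersion integrator for an averaged concentration.
import numpy as np

def advect_disperse(U=1.0, D_eff=0.05, L=10.0, nx=400, t_end=4.0):
    dx = L / nx
    dt = 0.4 * min(dx / U, dx**2 / (2 * D_eff))           # stability-limited time step
    x = np.linspace(0.0, L, nx)
    c = np.exp(-((x - 1.0) ** 2) / 0.05)                  # initial pulse near x = 1
    for _ in range(int(t_end / dt)):
        adv = -U * (c - np.roll(c, 1)) / dx                # first-order upwind (U > 0)
        dif = D_eff * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
        c = c + dt * (adv + dif)                           # periodic domain for simplicity
    return x, c

x, c = advect_disperse()
print("peak location ~", round(x[np.argmax(c)], 2), " peak value ~", round(c.max(), 3))
```

    The pulse translates at the effective velocity and spreads according to the effective dispersion coefficient, which is the role those two parameters play in the hierarchical model summarized above.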

  17. Random-access algorithms for multiuser computer communication networks. Doctoral thesis, 1 September 1986-31 August 1988

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Papantoni-Kazakos, P.; Paterakis, M.

    1988-07-01

    For many communication applications with time constraints (e.g., transmission of packetized voice messages), a critical performance measure is the percentage of messages transmitted within a given amount of time after their generation at the transmitting station. This report presents a random-access algorithm (RAA) suitable for time-constrained applications. Performance analysis demonstrates that significant message-delay improvement is attained at the expense of minimal traffic loss. Also considered is the case of noisy channels. The noise effect appears as erroneously observed channel feedback. Error sensitivity analysis shows that the proposed random-access algorithm is insensitive to feedback channel errors. Window Random-Access Algorithms (RAAs) are considered next. These algorithms constitute an important subclass of Multiple-Access Algorithms (MAAs); they are distributive, and they attain high throughput and low delays by controlling the number of simultaneously transmitting users.

  18. Enhancement of cooperation in the spatial prisoner's dilemma with a coherence-resonance effect through annealed randomness at a cooperator-defector boundary; comparison of two variant models

    NASA Astrophysics Data System (ADS)

    Tanimoto, Jun

    2016-11-01

    Inspired by the commonly observed real-world fact that people tend to behave in a somewhat random manner after facing interim equilibrium to break a stalemate situation whilst seeking a higher output, we established two models of the spatial prisoner's dilemma. One presumes that an agent commits action errors, while the other assumes that an agent refers to a payoff matrix with an added random noise instead of an original payoff matrix. A numerical simulation revealed that mechanisms based on the annealing of randomness due to either the action error or the payoff noise could significantly enhance the cooperation fraction. In this study, we explain the detailed enhancement mechanism behind the two models by referring to the concepts that we previously presented with respect to evolutionary dynamic processes under the names of enduring and expanding periods.
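    The payoff-noise variant described above can be sketched compactly. In the hypothetical Python example below, agents on a periodic lattice play a prisoner's dilemma whose payoff matrix receives freshly drawn (annealed) noise every round, and each agent copies the strategy of its highest-earning neighbor. The payoff values, noise amplitude, and "imitate the best neighbor" update rule are illustrative assumptions, not the paper's exact model.

```python
# Hypothetical spatial prisoner's dilemma with annealed payoff noise.
import numpy as np

rng = np.random.default_rng(0)
L = 50
R, P = 1.0, 0.0                   # reward, punishment
T, S = 1.4, -0.1                  # temptation, sucker's payoff (a typical PD ordering)
noise_amplitude = 0.2             # annealed noise added to the payoff matrix each round

grid = rng.integers(0, 2, size=(L, L))        # 1 = cooperator, 0 = defector

def neighbors(a):
    return [np.roll(a, 1, 0), np.roll(a, -1, 0), np.roll(a, 1, 1), np.roll(a, -1, 1)]

def payoffs(grid):
    eps = rng.uniform(-noise_amplitude, noise_amplitude, size=4)   # redrawn every round
    Rn, Tn, Sn, Pn = R + eps[0], T + eps[1], S + eps[2], P + eps[3]
    pay = np.zeros((L, L))
    for nb in neighbors(grid):
        pay += np.where(grid == 1, np.where(nb == 1, Rn, Sn),
                                   np.where(nb == 1, Tn, Pn))
    return pay

for step in range(200):
    pay = payoffs(grid)
    best_strategy, best_payoff = grid.copy(), pay.copy()
    for nb_s, nb_p in zip(neighbors(grid), neighbors(pay)):
        better = nb_p > best_payoff                     # copy the best-earning neighbor
        best_strategy = np.where(better, nb_s, best_strategy)
        best_payoff = np.where(better, nb_p, best_payoff)
    grid = best_strategy

print("final cooperation fraction:", round(grid.mean(), 3))
```

    Varying noise_amplitude over runs gives a rough feel for the coherence-resonance-like effect discussed above, in which an intermediate amount of annealed randomness can raise the cooperation fraction relative to the noiseless game.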

  19. Effect of Phase-Breaking Events on Electron Transport in Mesoscopic and Nanodevices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meunier, Vincent; Mintmire, John W; Thushari, Jayasekera

    2008-01-01

    Existing ballistic models for electron transport in mesoscopic and nanoscale systems break down as the size of the device becomes longer than the phase coherence length of electrons in the system. Krstic et al. experimentally observed that the current in single-wall carbon nanotube systems can be regarded as a combination of a coherent part and a noncoherent part. In this article, we discuss the use of the Buettiker phase-breaking technique to address partially coherent electron transport, generalize it to a multichannel problem, and then study the effect of phase-breaking events on electron transport in two-terminal graphene nanoribbon devices. We also investigate the difference between the pure-phase randomization and phase/momentum randomization boundary conditions. While momentum randomization adds an extra resistance caused by backward scattering, pure-phase randomization smooths the conductance oscillations caused by interference.

  20. Healthcare outcomes assessed with observational study designs compared with those assessed in randomized trials.

    PubMed

    Anglemyer, Andrew; Horvath, Hacsi T; Bero, Lisa

    2014-04-29

    Researchers and organizations often use evidence from randomized controlled trials (RCTs) to determine the efficacy of a treatment or intervention under ideal conditions. Studies of observational designs are often used to measure the effectiveness of an intervention in 'real world' scenarios. Numerous study designs and modifications of existing designs, including both randomized and observational, are used for comparative effectiveness research in an attempt to give an unbiased estimate of whether one treatment is more effective or safer than another for a particular population. A systematic analysis of study design features, risk of bias, parameter interpretation, and effect size for all types of randomized and non-experimental observational studies is needed to identify specific differences in design types and potential biases. This review summarizes the results of methodological reviews that compare the outcomes of observational studies with randomized trials addressing the same question, as well as methodological reviews that compare the outcomes of different types of observational studies. The objectives were to assess the impact of study design (including RCTs versus observational study designs) on the effect measures estimated, to explore methodological variables that might explain any differences identified, and to identify gaps in the existing research comparing study designs. We searched seven electronic databases, from January 1990 to December 2013. Along with MeSH terms and relevant keywords, we used the sensitivity-specificity balanced version of a validated strategy to identify reviews in PubMed, augmented with one term ("review" in article titles) so that it better targeted narrative reviews. No language restrictions were applied. We examined systematic reviews that were designed as methodological reviews to compare quantitative effect size estimates measuring efficacy or effectiveness of interventions tested in trials with those tested in observational studies. Comparisons included RCTs versus observational studies (including retrospective cohorts, prospective cohorts, case-control designs, and cross-sectional designs). Reviews were not eligible if they compared randomized trials with other studies that had used some form of concurrent allocation. In general, outcome measures included relative risks or rate ratios (RR), odds ratios (OR), and hazard ratios (HR). Using results from observational studies as the reference group, we examined the published estimates to see whether there was a relatively larger or smaller effect in the ratio of odds ratios (ROR). Within each identified review, if an estimate comparing results from observational studies with RCTs was not provided, we pooled the estimates for observational studies and RCTs. Then, we estimated the ratio of ratios (risk ratio or odds ratio) for each identified review using observational studies as the reference category. Across all reviews, we synthesized these ratios to get a pooled ROR comparing results from RCTs with results from observational studies. Our initial search yielded 4406 unique references. Fifteen reviews met our inclusion criteria, 14 of which were included in the quantitative analysis. The included reviews analyzed data from 1583 meta-analyses that covered 228 different medical conditions. The mean number of included studies per paper was 178 (range 19 to 530). Eleven (73%) reviews had a low risk of bias for explicit criteria for study selection, nine (60%) had a low risk of bias for investigators' agreement for study selection, five (33%) included a complete sample of studies, seven (47%) assessed the risk of bias of their included studies, seven (47%) controlled for methodological differences between studies, eight (53%) controlled for heterogeneity among studies, nine (60%) analyzed similar outcome measures, and four (27%) were judged to be at low risk of reporting bias. Our primary quantitative analysis, including 14 reviews, showed that the pooled ROR comparing effects from RCTs with effects from observational studies was 1.08 (95% confidence interval (CI) 0.96 to 1.22). Of the 14 reviews included in this analysis, 11 (79%) found no significant difference between observational studies and RCTs. One review suggested observational studies had larger effects of interest, and two reviews suggested observational studies had smaller effects of interest. Similar to the effect across all included reviews, effects from reviews comparing RCTs with cohort studies had a pooled ROR of 1.04 (95% CI 0.89 to 1.21), with substantial heterogeneity (I² = 68%). Three reviews compared effects of RCTs and case-control designs (pooled ROR: 1.11 (95% CI 0.91 to 1.35)). No significant difference in point estimates across heterogeneity, pharmacological intervention, or propensity score adjustment subgroups was noted. No reviews had compared RCTs with observational studies that used two of the most common causal inference methods, instrumental variables and marginal structural models. Our results across all reviews (pooled ROR 1.08) are very similar to results reported by similarly conducted reviews. As such, we have reached similar conclusions: on average, there is little evidence for significant effect estimate differences between observational studies and RCTs, regardless of specific observational study design, heterogeneity, or inclusion of studies of pharmacological interventions. Factors other than study design per se need to be considered when exploring reasons for a lack of agreement between results of RCTs and observational studies. Our results underscore that it is important for review authors to consider not only study design, but the level of heterogeneity in meta-analyses of RCTs or observational studies. A better understanding of how these factors influence study effects might yield estimates reflective of true effectiveness.
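    The pooling step described above, ratios of odds ratios combined across reviews with observational studies as the reference, can be illustrated with a small inverse-variance calculation. The inputs below are made-up numbers for demonstration only, and the calculation is a simplified fixed-effect version of the synthesis described.

```python
# Minimal sketch: pool ratios of odds ratios (ROR = OR_RCT / OR_observational).
import math

# (OR from RCTs, its SE on the log scale, OR from observational studies, its SE)
reviews = [
    (0.80, 0.10, 0.75, 0.08),
    (1.10, 0.15, 0.95, 0.12),
    (0.60, 0.20, 0.70, 0.18),
]

log_rors, weights = [], []
for or_rct, se_rct, or_obs, se_obs in reviews:
    log_ror = math.log(or_rct) - math.log(or_obs)     # observational studies as reference
    var = se_rct**2 + se_obs**2                       # variances add on the log scale
    log_rors.append(log_ror)
    weights.append(1.0 / var)

pooled_log = sum(w * x for w, x in zip(weights, log_rors)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
lo, hi = pooled_log - 1.96 * pooled_se, pooled_log + 1.96 * pooled_se
print(f"pooled ROR = {math.exp(pooled_log):.2f} "
      f"(95% CI {math.exp(lo):.2f} to {math.exp(hi):.2f})")
```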

  1. The incidence of kidney injury for patients treated with a high-potency versus moderate-potency statin regimen after an acute coronary syndrome.

    PubMed

    Sarma, Amy; Cannon, Christopher P; de Lemos, James; Rouleau, Jean L; Lewis, Eldrin F; Guo, Jianping; Mega, Jessica L; Sabatine, Marc S; O'Donoghue, Michelle L

    2014-05-01

    Observational studies have raised concerns that high-potency statins increase the risk of acute kidney injury. We therefore examined the incidence of kidney injury across 2 randomized trials of statin therapy. PROVE IT-TIMI 22 enrolled 4162 subjects after an acute coronary syndrome (ACS) and randomized them to atorvastatin 80 mg/day versus pravastatin 40 mg/day. A-to-Z enrolled 4497 subjects after ACS and randomized them to a high-potency statin strategy (simvastatin 40 mg/day × 1 month, then simvastatin 80 mg/day) versus a delayed moderate-potency strategy (placebo × 4 months, then simvastatin 20 mg/day). Serum creatinine was assessed centrally at serial time points. Adverse events (AEs) relating to kidney injury were identified through database review. Across both trials, mean serum creatinine was similar between treatment arms at baseline and throughout follow-up. In A-to-Z, the incidence of a 1.5-fold or ≥ 0.3 mg/dL rise in serum creatinine was 11.4% for subjects randomized to a high-potency statin regimen versus 12.4% for those on a delayed moderate-potency regimen (odds ratio [OR], 0.91; 95% confidence interval [CI], 0.76 to 1.10; P=0.33). In PROVE IT-TIMI 22, the incidence was 9.4% for subjects randomized to atorvastatin 80 mg/day and 10.6% for subjects randomized to pravastatin 40 mg/day (OR, 0.88; 95% CI, 0.71 to 1.09; P=0.25). Consistent results were observed for different kidney injury thresholds and in individuals with diabetes mellitus or with moderate renal dysfunction. The incidence of kidney injury-related AEs was not statistically different for patients on a high-potency versus moderate-potency statin regimen (OR, 1.06; 95% CI, 0.68 to 1.67; P=0.78). For patients enrolled in 2 large randomized trials of statin therapy after ACS, the use of a high-potency statin regimen did not increase the risk of kidney injury.

  2. Workshop on Incomplete Network Data Held at Sandia National Labs – Livermore

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soundarajan, Sucheta; Wendt, Jeremy D.

    2016-06-01

    While network analysis is applied in a broad variety of scientific fields (including physics, computer science, biology, and the social sciences), how networks are constructed and the resulting bias and incompleteness have drawn more limited attention. For example, in biology, gene networks are typically developed via experiment -- many actual interactions are likely yet to be discovered. In addition to this incompleteness, the data-collection processes can introduce significant bias into the observed network datasets. For instance, if you observe part of the World Wide Web network through a classic random walk, then high-degree nodes are more likely to be found than if you had selected nodes at random. Unfortunately, such incomplete and biasing data collection methods must often be used.
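    The degree bias mentioned above is easy to demonstrate. The following sketch (which assumes the networkx package is available) samples nodes of a scale-free graph with a simple random walk and compares their mean degree with that of a uniform random sample; the graph model and sample sizes are illustrative choices.

```python
# Minimal sketch: random-walk sampling over-represents high-degree nodes.
import random
import networkx as nx

random.seed(0)
G = nx.barabasi_albert_graph(n=5000, m=3, seed=0)

# Random-walk sample
node = random.choice(list(G.nodes))
walk_sample = []
for _ in range(2000):
    node = random.choice(list(G.neighbors(node)))
    walk_sample.append(node)

# Uniform sample of the same size
uniform_sample = random.sample(list(G.nodes), 2000)

mean_deg = lambda nodes: sum(G.degree(v) for v in nodes) / len(nodes)
print("mean degree, random-walk sample:", round(mean_deg(walk_sample), 1))
print("mean degree, uniform sample:    ", round(mean_deg(uniform_sample), 1))
```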

  3. Empirical Comparison of Two Psychological Therapies: Self Psychology and Cognitive Orientation in the Treatment of Anorexia and Bulimia

    PubMed Central

    Bachar, Eytan; Latzer, Yael; Kreitler, Shulamit; Berry, Elliot M.

    1999-01-01

    The authors investigated the applicability of self psychological treatment (SPT) and cognitive orientation treatment (COT) to the treatment of anorexia and bulimia. Thirty-three patients participated in this study. The bulimic patients (n = 25) were randomly assigned either to SPT, COT, or control/nutritional counseling only (C/NC). The anorexic patients (n = 8) were randomly assigned to either SPT or COT. Patients were administered a battery of outcome measures assessing eating disorders symptomatology, attitudes toward food, self structure, and general psychiatric symptoms. After SPT, significant improvement was observed. After COT, slight but nonsignificant improvement was observed. After C/NC, almost no changes could be detected. (The Journal of Psychotherapy Practice and Research 1999; 8:115–128) PMID:10079459

  4. Old models explain new observations of butterfly movement at patch edges.

    PubMed

    Crone, Elizabeth E; Schultz, Cheryl B

    2008-07-01

    Understanding movement in heterogeneous environments is central to predicting how landscape changes affect animal populations. Several recent studies point out an intriguing and distinctive looping behavior by butterflies at habitat patch edges and hypothesize that this behavior requires a new framework for analyzing animal movement. We show that this looping behavior could be caused by a longstanding movement model, biased correlated random walk, with bias toward habitat patches. The ability of this longstanding model to explain recent observations reinforces the point that butterflies respond to habitat heterogeneity and do not move randomly through heterogeneous environments. We discuss the implications of different movement models for predicting butterfly responses to landscape change, and our rationale for retaining longstanding movement models, rather than developing new modeling frameworks for looping behavior at patch edges.
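    For readers unfamiliar with the model class invoked above, the following is a minimal, hypothetical sketch of a biased correlated random walk: each step's heading is a compromise between the previous heading (correlation) and the direction toward a habitat patch (bias), plus angular noise. The weighting and noise level are illustrative assumptions, not parameters fitted to butterfly data.

```python
# Minimal biased correlated random walk (BCRW) sketch.
import math
import random

def bcrw(steps=500, bias_weight=0.3, turn_sd=0.4,
         start=(20.0, 0.0), patch_center=(0.0, 0.0), step_len=1.0, seed=1):
    random.seed(seed)
    x, y = start
    heading = random.uniform(-math.pi, math.pi)
    path = [(x, y)]
    for _ in range(steps):
        to_patch = math.atan2(patch_center[1] - y, patch_center[0] - x)
        # pull the heading partway toward the patch direction (bias),
        # keep the rest of the previous heading (correlation), add noise
        delta = math.atan2(math.sin(to_patch - heading), math.cos(to_patch - heading))
        heading = heading + bias_weight * delta + random.gauss(0.0, turn_sd)
        x += step_len * math.cos(heading)
        y += step_len * math.sin(heading)
        path.append((x, y))
    return path

final_x, final_y = bcrw()[-1]
print("final distance from patch:", round(math.hypot(final_x, final_y), 1))
```

    Trajectories from such a walk started just outside a patch tend to curve back toward it, which is the looping behavior at patch edges that the abstract says the longstanding model can already reproduce.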

  5. A systematic review of clinical outcomes in surgical treatment of adult isthmic spondylolisthesis.

    PubMed

    Noorian, Shaya; Sorensen, Karen; Cho, Woojin

    2018-05-07

    A variety of surgical methods are available for the treatment of adult isthmic spondylolisthesis, but there is no consensus regarding their relative effects on clinical outcomes. To compare the effects of different surgical techniques on clinical outcomes in adult isthmic spondylolisthesis. Study design: Systematic review. Patient sample: A total of 1,538 patients from six randomized clinical trials and nine observational studies comparing different surgical treatments in adult isthmic spondylolisthesis. Primary outcome measures of interest included differences in pre- versus post-surgical assessments of pain, functional disability, and overall health as assessed by validated pain rating scales and questionnaires. Secondary outcome measures of interest included intraoperative blood loss, length of hospital stay, surgery duration, reoperation rates, and complication rates. A search of the literature was performed in September 2017 for relevant comparative studies published in the prior 10-year period in the following databases: PubMed, Embase, Web of Science, and ClinicalTrials.Gov. PRISMA guidelines were followed and studies were included/excluded based on strict predetermined criteria. Quality appraisal was conducted using the Newcastle-Ottawa Scale (NOS) for observational studies and the Cochrane Collaboration's risk of bias assessment tool for randomized clinical trials. The authors received no funding support to conduct this review. A total of 15 studies (6 randomized clinical trials and 9 observational studies) were included for full text review, a majority of which only included cases of low-grade isthmic spondylolisthesis. One study examined the effects of adding pedicle screw fixation (PS) to posterolateral fusion (PLF) and 2 studies examined the effects of adding reduction to interbody fusion (IF) + PS on clinical outcomes. Five studies compared PLF, 4 with and 1 without PS, to IF + PS. Additionally, 3 studies compared circumferential fusion (IF + PS + PLF) to IF + PS and 1 study compared circumferential fusion to PLF + PS. Three studies compared clinical outcomes among different IF + PS techniques (ALIF + PS vs. PLIF + PS vs. TLIF + PS) without PLF. As per the Cochrane Collaboration's risk of bias assessment tool, 4 randomized clinical trials had an overall low risk of bias, 1 randomized clinical trial had an unclear risk of bias, and 1 randomized clinical trial had a high risk of bias. As per the Newcastle-Ottawa scale, 3 observational studies were of overall good quality, 4 observational studies were of fair quality, and 2 observational studies were of poor quality. Available studies provide strong evidence that the addition of reduction to fusion does not result in better clinical outcomes of pain and function in low-grade isthmic spondylolisthesis. Evidence also suggests that there is no significant difference between interbody fusion (IF + PS) and posterior fusion (PLF +/- PS) in outcomes of pain, function, and complication rates at follow-up points up to approximately 3 years in cases of low-grade slips. However, studies with longer follow-up points suggest that interbody fusion (IF + PS) may perform better in these same measures at later follow-up points. Available evidence also suggests no difference between circumferential fusion (IF + PS + PLF) and interbody fusion (IF + PS) in outcomes of pain and function in low-grade slips, but circumferential fusion has been associated with greater intraoperative blood loss, longer surgery duration, and longer hospital stays. 
In terms of clinical outcomes, insufficient evidence is available to assess the utility of adding PS to PLF, the relative efficacy of different interbody fusion (IF + PS) techniques (ALIF + PS vs. TLIF + PS vs. PLIF + PS), and the relative efficacy of circumferential fusion and posterior fusion (PLF + PS). Copyright © 2018. Published by Elsevier Inc.

  6. Perceived beauty of random texture patterns: A preference for complexity.

    PubMed

    Friedenberg, Jay; Liby, Bruce

    2016-07-01

    We report two experiments on the perceived aesthetic quality of random density texture patterns. In each experiment a square grid was filled with a progressively larger number of elements. Grid size in Experiment 1 was 10×10, with elements added to create a variety of textures ranging from 10% to 100% fill levels. Participants rated the beauty of the patterns. Average judgments across all observers showed an inverted U-shaped function that peaked near middle densities. In Experiment 2 grid size was increased to 15×15 to see if observers preferred patterns with a fixed density or a fixed number of elements. The results of the second experiment were nearly identical to those of the first, showing a preference for density over fixed element number. Ratings in both studies correlated positively with a GIF compression metric of complexity and with edge length. Within the range of stimuli used, observers judge more complex patterns to be more beautiful. Copyright © 2016 Elsevier B.V. All rights reserved.
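    A compression-based complexity measure of the kind used above can be approximated in a few lines. The sketch below substitutes zlib compressed size for the paper's GIF metric (an assumption made only to keep the example dependency-free) and shows that random binary grids are least compressible, and in that sense most complex, near 50% fill.

```python
# Minimal sketch: compressed size as a complexity proxy for random textures.
import zlib
import numpy as np

rng = np.random.default_rng(0)

def mean_compressed_size(density, grid=15, reps=50):
    sizes = []
    for _ in range(reps):
        pattern = (rng.random((grid, grid)) < density).astype(np.uint8)
        sizes.append(len(zlib.compress(pattern.tobytes())))
    return sum(sizes) / len(sizes)

for d in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"density {d:.1f}: mean compressed size {mean_compressed_size(d):.1f} bytes")
```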

  7. Randomized controlled trials vs. observational studies: why not just live together?

    PubMed

    Faraoni, David; Schaefer, Simon Thomas

    2016-10-21

    Randomized controlled trials (RCTs) are considered the gold standard for clinical research, thus having a high impact on clinical guidelines and our daily patient care. However, various treatment strategies which we consider "evidence based" have never been subject to a prospective RCT, as we would deem it unethical to withhold an established treatment from individuals in a placebo-controlled trial. In a recent BMC Anesthesiology publication, Trentino et al. analyzed the usefulness of observational studies in assessing benefit and risk of different transfusion strategies. The authors nicely reviewed and summarized the similarities and differences, advantages and limitations, between the different study types frequently used in transfusion medicine. In this interesting article, the authors conclude that 'when comparing the results of observational studies with RCTs assessing transfusion outcomes, it is important that one consider not only the study method, but also the key elements of the study design'. Thus, in this commentary we discuss the pros and cons of different study types, irrespective of transfusion medicine.

  8. Childhood consequences of maternal obesity and excessive weight gain during pregnancy.

    PubMed

    Gaillard, Romy; Felix, Janine F; Duijts, Liesbeth; Jaddoe, Vincent W V

    2014-11-01

    Obesity is a major public health concern. In western countries, the prevalence of obesity in pregnant women has strongly increased, with reported prevalence rates reaching 30%. Also, up to 40% of women gain an excessive amount of weight during pregnancy. Recent observational studies and meta-analyses strongly suggest long-term impact of maternal obesity and excessive weight gain during pregnancy on adiposity, cardiovascular and respiratory related health outcomes in their children. These observations suggest that maternal adiposity during pregnancy may program common health problems in the offspring. Currently, it remains unclear whether the observed associations are causal, or just reflect confounding by family-based sociodemographic or lifestyle-related factors. Parent-offspring studies, sibling comparison studies, Mendelian randomization studies and randomized trials can help to explore the causality and underlying mechanisms. Also, the potential for prevention of common diseases in future generations by reducing maternal obesity and excessive weight gain during pregnancy needs to be explored. © 2014 Nordic Federation of Societies of Obstetrics and Gynecology.

  9. Chiasma failures and chromosome association in Rhoeo spathacea var. variegata.

    PubMed

    Lin, Y J

    1982-01-01

    In Rhoeo spathacea var. variegata (2n = 2x = 12), the most frequent meiotic configuration was the chain-of-12 chromosomes (36%) and the second most frequent was the ring-of-12 chromosomes (25.6%). All six possible two-chain situations and eleven of the twelve possible three-chain situations were observed. A maximum of five chains was observed in four cells. The size of chains ranged from one through twelve chromosomes. The mean number of chiasma failures was 1.36 +/- 0.07 per cell and 0.1133 per pair of chromosome arms. Because the observed frequencies of the various configurations agree with the expected frequencies, which were calculated under the assumption that chiasma failure is equally likely at each of the twelve positions around the ring, it was concluded that chiasma failures occurred at random among the arm positions. Because the lengths of arm pairs in the ring vary considerably, this randomness may mean that chiasma formation was restricted to small terminal regions on all chromosomes.

  10. Sativex oromucosal spray as adjunctive therapy in advanced cancer patients with chronic pain unalleviated by optimized opioid therapy: two double-blind, randomized, placebo-controlled phase 3 studies.

    PubMed

    Fallon, Marie T; Albert Lux, Eberhard; McQuade, Robert; Rossetti, Sandro; Sanchez, Raymond; Sun, Wei; Wright, Stephen; Lichtman, Aron H; Kornyeyeva, Elena

    2017-08-01

    Opioids are critical for managing cancer pain, but may provide inadequate relief and/or unacceptable side effects in some cases. To assess the analgesic efficacy of adjunctive Sativex (Δ9-tetrahydrocannabinol (27 mg/mL): cannabidiol (25 mg/mL)) in advanced cancer patients with chronic pain unalleviated by optimized opioid therapy. This report describes two phase 3, double-blind, randomized, placebo-controlled trials. Eligible patients had advanced cancer and average pain numerical rating scale (NRS) scores ≥4 and ≤8 at baseline, despite optimized opioid therapy. In Study-1, patients were randomized to Sativex or placebo, and then self-titrated study medications over a 2-week period per effect and tolerability, followed by a 3-week treatment period. In Study-2, all patients self-titrated Sativex over a 2-week period. Patients with a ≥15% improvement from baseline in pain score were then randomized 1:1 to Sativex or placebo, followed by a 5-week treatment period (randomized withdrawal design). The primary efficacy endpoint (percent improvement (Study-1) and mean change (Study-2) in average daily pain NRS scores) was not met in either study. Post hoc analyses of the primary endpoints identified a statistically favourable treatment effect for Sativex in US patients <65 years (median treatment difference: 8.8; 95% confidence interval (CI): 0.00-17.95; p = 0.040) that was not observed in patients <65 years from the rest of the world (median treatment difference: 0.2; 95% CI: -5.00 to 7.74; p = 0.794). A treatment effect in favour of Sativex was observed on quality-of-life questionnaires, despite the fact that similar effects were not observed on NRS score. The safety profile of Sativex was consistent with earlier studies, and no evidence of abuse or misuse was identified. Sativex did not demonstrate superiority to placebo in reducing self-reported pain NRS scores in advanced cancer patients with chronic pain unalleviated by optimized opioid therapy, although further exploration of differences between United States patients and patients from the rest of the world is warranted.

  11. A Randomized Trial Evaluating Short-term Effectiveness of Overminus Lenses in Children 3 to 6 Years of Age with Intermittent Exotropia.

    PubMed

    Chen, Angela M; Holmes, Jonathan M; Chandler, Danielle L; Patel, Reena A; Gray, Michael E; Erzurum, S Ayse; Wallace, David K; Kraker, Raymond T; Jensen, Allison A

    2016-10-01

    To evaluate the short-term effectiveness of overminus spectacles in improving control of childhood intermittent exotropia (IXT). Randomized, clinical trial. A total of 58 children aged 3 to <7 years with IXT. Eligibility criteria included a distance control score of 2 or worse (mean of 3 measures during a single examination) on a scale of 0 (exophoria) to 5 (constant exotropia) and spherical equivalent refractive error between -6.00 diopters (D) and +1.00 D. Children were randomly assigned to overminus spectacles (-2.50 D over cycloplegic refraction) or observation (non-overminus spectacles if needed or no spectacles) for 8 weeks. The primary outcome was distance control score for each child (mean of 3 measures during a single examination) assessed by a masked examiner at 8 weeks. Outcome testing was conducted with children wearing their study spectacles or plano spectacles for the children in the observation group who did not need spectacles. The primary analysis compared mean 8-week distance control score between treatment groups using an analysis of covariance model that adjusted for baseline distance control, baseline near control, prestudy spectacle wear, and prior IXT treatment. Treatment side effects were evaluated using questionnaires completed by parents. At 8 weeks, mean distance control was better in the 27 children treated with overminus spectacles than in the 31 children who were observed without treatment (2.0 vs. 2.8 points, adjusted difference = -0.75 points favoring the overminus group; 2-sided 95% confidence interval, -1.42 to -0.07 points). Side effects of headaches, eyestrain, avoidance of near activities, and blur appeared similar between treatment groups. In a pilot randomized clinical trial, overminus spectacles improved distance control at 8 weeks in children aged 3 to <7 years with IXT. A larger and longer randomized trial is warranted to assess the effectiveness of overminus spectacles in treating IXT, particularly the effect on control after overminus treatment has been discontinued. Copyright © 2016 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  12. Observing Responses and Serial Stimuli: Searching for the Reinforcing Properties of the S-

    ERIC Educational Resources Information Center

    Escobar, Rogelio; Bruner, Carlos A.

    2009-01-01

    The control exerted by a stimulus associated with an extinction component (S-) on observing responses was determined as a function of its temporal relation with the onset of the reinforcement component (S+). Lever pressing by rats was reinforced on a mixed random-interval extinction schedule. Each press on a second lever produced stimuli…

  13. Working Group on Ice Forces on Structures. A State-of-the-Art Report.

    DTIC Science & Technology

    1980-06-01

    observed in Soviet Design Codes, but the randomness of ice properties is not directly observed anywhere. 3.3 Contact system: the mode of ice failure against... [remaining text consists of table-of-contents fragments: ice ride-ups; factors limiting ice ride-up; procedures for designing; contact system; damping]

  14. Redundant Information and the Quantum-Classical Transition

    ERIC Educational Resources Information Center

    Riedel, Charles Jess

    2012-01-01

    A state selected at random from the Hilbert space of a many-body system is overwhelmingly likely to exhibit highly non-classical correlations. For these typical states, half of the environment must be measured by an observer to determine the state of a given subsystem. The objectivity of classical reality--the fact that multiple observers can each…

  15. Resource Contention Management in Parallel Systems

    DTIC Science & Technology

    1989-04-01

    technical competence include communications, command and control, battle management, information processing, surveillance sensors, intelligence data... two-simulation approach since they require only a single simulation run. More importantly, since they involve only observed data, they may also be... we use the original, unobservable RAC of Section 2 and handle unobservable transitions by generating artificial events, when required, using a random

  16. Local ensemble transform Kalman filter for ionospheric data assimilation: Observation influence analysis during a geomagnetic storm event

    NASA Astrophysics Data System (ADS)

    Durazo, Juan A.; Kostelich, Eric J.; Mahalov, Alex

    2017-09-01

    We propose a targeted observation strategy, based on the influence matrix diagnostic, that optimally selects where additional observations may be placed to improve ionospheric forecasts. This strategy is applied in data assimilation observing system experiments, where synthetic electron density vertical profiles, which represent those of Constellation Observing System for Meteorology, Ionosphere, and Climate/Formosa satellite 3, are assimilated into the Thermosphere-Ionosphere-Electrodynamics General Circulation Model using the local ensemble transform Kalman filter during the 26 September 2011 geomagnetic storm. During each analysis step, the observation vector is augmented with five synthetic vertical profiles optimally placed to target electron density errors, using our targeted observation strategy. Forecast improvement due to assimilation of augmented vertical profiles is measured with the root-mean-square error (RMSE) of analyzed electron density, averaged over 600 km regions centered around the augmented vertical profile locations. Assimilating vertical profiles with targeted locations yields about 60%-80% reduction in electron density RMSE, compared to a 15% average reduction when assimilating randomly placed vertical profiles. Assimilating vertical profiles whose locations target the zonal component of neutral winds (Un) yields on average a 25% RMSE reduction in Un estimates, compared to a 2% average improvement obtained with randomly placed vertical profiles. These results demonstrate that our targeted strategy can improve data assimilation efforts during extreme events by detecting regions where additional observations would provide the largest benefit to the forecast.

  17. A randomized controlled pilot trial comparing the impact of access to clinical endocrinology video demonstrations with access to usual revision resources on medical student performance of clinical endocrinology skills

    PubMed Central

    2013-01-01

    Background Demonstrating competence in clinical skills is key to course completion for medical students. Methods of providing clinical instruction that foster immediate learning and potentially serve as longer-term repositories for on-demand revision, such as online videos demonstrating competent performance of clinical skills, are increasingly being used. However, their impact on learning has been little studied. The aim of this study was to determine the value of adjunctive on-demand video-based training for clinical skills acquisition by medical students in endocrinology. Methods Following an endocrinology clinical tutorial program, 2nd year medical students in the pre-assessment revision period were recruited and randomized to either a set of bespoke on-line clinical skills training videos (TV), or to revision as usual (RAU). The skills demonstrated on video were history taking in diabetes mellitus (DMH), examination for diabetes lower limb complications (LLE), and examination for signs of thyroid disease (TE). Students were assessed on these clinical skills in an observed structured clinical examination two weeks after randomization. Assessors were blinded to student randomization status. Results For both diabetes related clinical skills assessment tasks, students in the TV group performed significantly better than those in the RAU group. There were no between group differences in thyroid examination performance. For the LLE, 91.7% (n = 11/12) of students randomized to the video were rated globally as competent at the skill compared with 40% (n = 4/10) of students not randomized to the video (p = 0.024). For the DMH, 83.3% (n = 10/12) of students randomized to the video were rated globally as competent at the skill compared with 20% (n = 2/10) of students not randomized to the video (p = 0.007). Conclusion Exposure to high quality videos demonstrating clinical skills can significantly improve medical student skill performance in an observed structured clinical examination of these skills, when used as an adjunct to clinical skills face-to-face tutorials and deliberate practice of skills in a blended learning format. Video demonstrations can provide an enduring, on-demand, portable resource for revision, which can even be used at the bedside by learners. Such resources are cost-effectively scalable for large numbers of learners. PMID:24090039

  18. A randomized controlled pilot trial comparing the impact of access to clinical endocrinology video demonstrations with access to usual revision resources on medical student performance of clinical endocrinology skills.

    PubMed

    Hibbert, Emily J; Lambert, Tim; Carter, John N; Learoyd, Diana L; Twigg, Stephen; Clarke, Stephen

    2013-10-03

    Demonstrating competence in clinical skills is key to course completion for medical students. Methods of providing clinical instruction that foster immediate learning and potentially serve as longer-term repositories for on-demand revision, such as online videos demonstrating competent performance of clinical skills, are increasingly being used. However, their impact on learning has been little studied. The aim of this study was to determine the value of adjunctive on-demand video-based training for clinical skills acquisition by medical students in endocrinology. Following an endocrinology clinical tutorial program, 2nd year medical students in the pre-assessment revision period were recruited and randomized to either a set of bespoke on-line clinical skills training videos (TV), or to revision as usual (RAU). The skills demonstrated on video were history taking in diabetes mellitus (DMH), examination for diabetes lower limb complications (LLE), and examination for signs of thyroid disease (TE). Students were assessed on these clinical skills in an observed structured clinical examination two weeks after randomization. Assessors were blinded to student randomization status. For both diabetes related clinical skills assessment tasks, students in the TV group performed significantly better than those in the RAU group. There were no between group differences in thyroid examination performance. For the LLE, 91.7% (n = 11/12) of students randomized to the video were rated globally as competent at the skill compared with 40% (n = 4/10) of students not randomized to the video (p = 0.024). For the DMH, 83.3% (n = 10/12) of students randomized to the video were rated globally as competent at the skill compared with 20% (n = 2/10) of students not randomized to the video (p = 0.007). Exposure to high quality videos demonstrating clinical skills can significantly improve medical student skill performance in an observed structured clinical examination of these skills, when used as an adjunct to clinical skills face-to-face tutorials and deliberate practice of skills in a blended learning format. Video demonstrations can provide an enduring, on-demand, portable resource for revision, which can even be used at the bedside by learners. Such resources are cost-effectively scalable for large numbers of learners.

  19. Hydrolyzed infant formula and early β-cell autoimmunity: a randomized clinical trial.

    PubMed

    Knip, Mikael; Åkerblom, Hans K; Becker, Dorothy; Dosch, Hans-Michael; Dupre, John; Fraser, William; Howard, Neville; Ilonen, Jorma; Krischer, Jeffrey P; Kordonouri, Olga; Lawson, Margaret L; Palmer, Jerry P; Savilahti, Erkki; Vaarala, Outi; Virtanen, Suvi M

    2014-06-11

    The disease process leading to clinical type 1 diabetes often starts during the first years of life. Early exposure to complex dietary proteins may increase the risk of β-cell autoimmunity in children at genetic risk for type 1 diabetes. Extensively hydrolyzed formulas do not contain intact proteins. To test the hypothesis that weaning to an extensively hydrolyzed formula decreases the cumulative incidence of diabetes-associated autoantibodies in young children. A double-blind randomized clinical trial of 2159 infants with HLA-conferred disease susceptibility and a first-degree relative with type 1 diabetes recruited from May 2002 to January 2007 in 78 study centers in 15 countries; 1078 were randomized to be weaned to the extensively hydrolyzed casein formula and 1081 were randomized to be weaned to a conventional cows' milk-based formula. The participants were observed to April 16, 2013. The participants received either a casein hydrolysate or a conventional cows' milk formula supplemented with 20% of the casein hydrolysate. Main outcomes and measures: The primary outcome was positivity for at least 2 diabetes-associated autoantibodies out of 4 analyzed. Autoantibodies to insulin, glutamic acid decarboxylase, and the insulinoma-associated-2 (IA-2) molecule were analyzed using radiobinding assays and islet cell antibodies with immunofluorescence during a median observation period of 7.0 years (mean, 6.3 years). The absolute risk of positivity for 2 or more islet autoantibodies was 13.4% among those randomized to the casein hydrolysate formula (n = 139) vs 11.4% among those randomized to the conventional formula (n = 117). The unadjusted hazard ratio for positivity for 2 or more autoantibodies among those randomized to be weaned to the casein hydrolysate was 1.21 (95% CI, 0.94-1.54), compared with those randomized to the conventional formula, while the hazard ratio adjusted for HLA risk, duration of breastfeeding, vitamin D use, study formula duration and consumption, and region was 1.23 (95% CI, 0.96-1.58). There were no clinically significant differences in the rate of reported adverse events between the 2 groups. Among infants at risk for type 1 diabetes, the use of a hydrolyzed formula, when compared with a conventional formula, did not reduce the incidence of diabetes-associated autoantibodies after 7 years. These findings do not support a benefit from hydrolyzed formula. Trial registration: clinicaltrials.gov Identifier: NCT00179777.

  20. A comparison of observation-level random effect and Beta-Binomial models for modelling overdispersion in Binomial data in ecology & evolution.

    PubMed

    Harrison, Xavier A

    2015-01-01

    Overdispersion is a common feature of models of biological data, but researchers often fail to model the excess variation driving the overdispersion, resulting in biased parameter estimates and standard errors. Quantifying and modeling overdispersion when it is present is therefore critical for robust biological inference. One means to account for overdispersion is to add an observation-level random effect (OLRE) to a model, where each data point receives a unique level of a random effect that can absorb the extra-parametric variation in the data. Although some studies have investigated the utility of OLRE to model overdispersion in Poisson count data, studies doing so for Binomial proportion data are scarce. Here I use a simulation approach to investigate the ability of both OLRE models and Beta-Binomial models to recover unbiased parameter estimates in mixed effects models of Binomial data under various degrees of overdispersion. In addition, as ecologists often fit random intercept terms to models when the random effect sample size is low (<5 levels), I investigate the performance of both model types under a range of random effect sample sizes when overdispersion is present. Simulation results revealed that the efficacy of OLRE depends on the process that generated the overdispersion; OLRE failed to cope with overdispersion generated from a Beta-Binomial mixture model, leading to biased slope and intercept estimates, but performed well for overdispersion generated by adding random noise to the linear predictor. Comparison of parameter estimates from an OLRE model with those from its corresponding Beta-Binomial model readily identified when OLRE were performing poorly due to disagreement between effect sizes, and this strategy should be employed whenever OLRE are used for Binomial data to assess their reliability. Beta-Binomial models performed well across all contexts, but showed a tendency to underestimate effect sizes when modelling non-Beta-Binomial data. Finally, both OLRE and Beta-Binomial models performed poorly when models contained <5 levels of the random intercept term, especially for estimating variance components, and this effect appeared independent of total sample size. These results suggest that OLRE are a useful tool for modelling overdispersion in Binomial data, but that they do not perform well in all circumstances and researchers should take care to verify the robustness of parameter estimates of OLRE models.
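    As a rough illustration of the two overdispersion-generating mechanisms contrasted above, the sketch below (not taken from the paper; all parameter values are illustrative) simulates Binomial counts once from a Beta-Binomial mixture and once by adding random noise to the linear predictor, the situation an observation-level random effect is designed to absorb, and checks both with a simple Pearson dispersion statistic.

```python
import numpy as np

rng = np.random.default_rng(1)
n_obs, trials = 500, 20
x = rng.normal(size=n_obs)
eta = -0.5 + 0.8 * x                       # linear predictor (illustrative coefficients)
p_true = 1 / (1 + np.exp(-eta))

def pearson_dispersion(y, p, trials):
    """Pearson chi-square / residual df for binomial counts; close to 1 if no overdispersion."""
    expected = trials * p
    var = trials * p * (1 - p)
    return np.sum((y - expected) ** 2 / var) / (len(y) - 2)

# Mechanism 1: Beta-Binomial mixture (per-observation success probability drawn from a Beta)
rho = 0.1                                   # intra-class correlation controlling overdispersion
a, b = p_true * (1 - rho) / rho, (1 - p_true) * (1 - rho) / rho
y_bb = rng.binomial(trials, rng.beta(a, b))

# Mechanism 2: additive random noise on the linear predictor (what an OLRE mimics)
y_olre = rng.binomial(trials, 1 / (1 + np.exp(-(eta + rng.normal(0, 1.0, n_obs)))))

for label, y in [("beta-binomial", y_bb), ("logit-normal", y_olre)]:
    print(label, "dispersion ~", round(pearson_dispersion(y, p_true, trials), 2))
```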

  1. Modeling of chromosome intermingling by partially overlapping uniform random polygons.

    PubMed

    Blackstone, T; Scharein, R; Borgo, B; Varela, R; Diao, Y; Arsuaga, J

    2011-03-01

    During the early phase of the cell cycle the eukaryotic genome is organized into chromosome territories. The geometry of the interface between any two chromosomes remains a matter of debate and may have important functional consequences. The Interchromosomal Network model (introduced by Branco and Pombo) proposes that territories intermingle along their periphery. In order to partially quantify this concept we here investigate the probability that two chromosomes form an unsplittable link. We use the uniform random polygon as a crude model for chromosome territories and we model the interchromosomal network as the common spatial region of two overlapping uniform random polygons. This simple model allows us to derive some rigorous mathematical results as well as to perform computer simulations easily. We find that the probability that a uniform random polygon of length n partially overlapping a fixed polygon forms an unsplittable link with it is bounded below by 1 − O(1/√n). We use numerical simulations to estimate the dependence of the linking probability of two uniform random polygons (of lengths n and m, respectively) on the amount of overlapping. The degree of overlapping is parametrized by a parameter ε such that ε = 0 indicates no overlapping and ε = 1 indicates total overlapping. We propose that this dependence relation may be modeled as f(ε, m, n) = [Formula: see text]. Numerical evidence shows that this model works well when ε is relatively large (ε ≥ 0.5). We then use these results to model the data published by Branco and Pombo and observe that for the amount of overlapping observed experimentally the URPs have a non-zero probability of forming an unsplittable link.

  2. A randomized, controlled trial of aerobic exercise for treatment-related fatigue in men receiving radical external beam radiotherapy for localized prostate carcinoma.

    PubMed

    Windsor, Phyllis M; Nicol, Kathleen F; Potter, Joan

    2004-08-01

    Advice to rest and take things easy if patients become fatigued during radiotherapy may be detrimental. Aerobic walking improves physical functioning and has been an intervention for chemotherapy-related fatigue. A prospective, randomized, controlled trial was performed to determine whether aerobic exercise would reduce the incidence of fatigue and prevent deterioration in physical functioning during radiotherapy for localized prostate carcinoma. Sixty-six men were randomized before they received radical radiotherapy for localized prostate carcinoma, with 33 men randomized to an exercise group and 33 men randomized to a control group. Outcome measures were fatigue and distance walked in a modified shuttle test before and after radiotherapy. There were no significant between group differences noted with regard to fatigue scores at baseline (P = 0.55) or after 4 weeks of radiotherapy (P = 0.18). Men in the control group had significant increases in fatigue scores from baseline to the end of radiotherapy (P = 0.013), with no significant increases observed in the exercise group (P = 0.203). A nonsignificant reduction (2.4%) in shuttle test distance at the end of radiotherapy was observed in the control group; however, in the exercise group, there was a significant increase (13.2%) in distance walked (P = 0.0003). Men who followed advice to rest and take things easy if they became fatigued demonstrated a slight deterioration in physical functioning and a significant increase in fatigue at the end of radiotherapy. Home-based, moderate-intensity walking produced a significant improvement in physical functioning with no significant increase in fatigue. Improved physical functioning may be necessary to combat radiation fatigue.

  3. Brief Strategic Family Therapy Versus Treatment as Usual: Results of a Multisite Randomized Trial for Substance Using Adolescents

    PubMed Central

    Robbins, Michael S.; Feaster, Daniel J.; Horigian, Viviana E.; Rohrbaugh, Michael; Shoham, Varda; Bachrach, Ken; Miller, Michael; Burlew, Kathleen A.; Hodgkins, Candy; Carrion, Ibis; Vandermark, Nancy; Schindler, Eric; Werstlein, Robert; Szapocznik, José

    2012-01-01

    Objective To determine the effectiveness of brief strategic family therapy (BSFT; an evidence-based family therapy) compared to treatment as usual (TAU) as provided in community-based adolescent outpatient drug abuse programs. Method A randomized effectiveness trial in the National Drug Abuse Treatment Clinical Trials Network compared BSFT to TAU with a multiethnic sample of adolescents (213 Hispanic, 148 White, and 110 Black) referred for drug abuse treatment at 8 community treatment agencies nationwide. Randomization encompassed both adolescents’ families (n = 480) and the agency therapists (n = 49) who provided either TAU or BSFT services. The primary outcome was adolescent drug use, assessed monthly via adolescent self-report and urinalysis for up to 1 year post randomization. Secondary outcomes included treatment engagement (≥2 sessions), retention (≥8 sessions), and participants’ reports of family functioning 4, 8, and 12 months following randomization. Results No overall differences between conditions were observed in the trajectories of self-reports of adolescent drug use. However, the median number of days of self-reported drug use was significantly higher, χ2(1) = 5.40, p < .02, in TAU (Mdn = 3.5, interquartile range [IQR] = 11) than BSFT (Mdn = 2, IQR = 9) at the final observation point. BSFT was significantly more effective than TAU in engaging, χ2(1) = 11.33, p < .001, and retaining, χ2(1) = 5.66, p < .02, family members in treatment and in improving parent reports of family functioning, χ2(2) = 9.10, p < .011. Conclusions We discuss challenges in treatment implementation in community settings and provide recommendations for further research. PMID:21967492

  4. Donepezil treatment and Alzheimer disease: can the results of randomized clinical trials be applied to Alzheimer disease patients in clinical practice?

    PubMed

    Tinklenberg, Jared R; Kraemer, Helena C; Yaffe, Kristine; Ross, Leslie; Sheikh, Javaid; Ashford, John W; Yesavage, Jerome A; Taylor, Joy L

    2007-11-01

    To determine if results from randomized clinical trials of donepezil in Alzheimer disease (AD) patients can be applied to AD patients in clinical practice by comparing the findings from a Nordic one-year randomized AD donepezil trial with data from a one-year prospective, observational study of AD patients. AD patients from a consortium of California sites were systematically followed for at least one year. Their treatment regimens, including prescription of donepezil, were determined by their individual physician according to his or her usual criteria. The 148 California patients treated with donepezil had a one-year decline of 1.3 (3.5 SD) points on the Mini-Mental State Exam compared to a decline of 3.3 (4.4 SD) in the 158 AD patients who received no anti-Alzheimer drugs. The Mini-Mental State Exam decline in the Nordic sample was approximately 0.25 points for the 91 patients receiving donepezil and approximately 2.2 for the 98 placebo patients. The overall effect sizes were estimated at about 0.49 in both studies. The California data were further analyzed using propensity methods; after taking into account differences that could bias prescribing decisions, the benefits associated with taking donepezil remained. A comparison of a randomized clinical trial of donepezil in AD patients and this observational study indicates that, if appropriate methodological and statistical precautions are undertaken, results from randomized clinical trials can be predictive for AD patients in clinical practice. This California study supports the modest effectiveness of donepezil in AD patients having clinical characteristics similar to those of the Nordic study.

  5. Selective intra-dinucleotide interactions and periodicities of bases separated by K sites: a new vision and tool for phylogeny analyses.

    PubMed

    Valenzuela, Carlos Y

    2017-02-13

    Direct tests of the random or non-random distribution of nucleotides on genomes have been devised to test the hypothesis of neutral, nearly-neutral or selective evolution. These tests are based on the direct base distribution and are independent of the functional (coding or non-coding) or structural (repeated or unique sequences) properties of the DNA. The first approach described the longitudinal distribution of bases in tandem repeats under the Bose-Einstein statistics; a huge deviation from randomness was found. A second approach was the study of the base distribution within dinucleotides whose bases were separated by 0, 1, 2, …, K nucleotides. Again, an enormous difference from the random distribution was found, with significance values beyond the range of standard tables and programs. These test values were periodical and included the 16 dinucleotides. For example, a high "positive" (more observed than expected dinucleotides) value, found in dinucleotides whose bases were separated by (3K + 2) sites, was preceded by two smaller "negative" (fewer observed than expected dinucleotides) values, whose bases were separated by (3K) or (3K + 1) sites. We examined mtDNAs, prokaryote genomes and some eukaryote chromosomes and found that the significant non-random interactions and periodicities were present up to 1000 or more sites of base separation, and in human chromosome 21 up to separations of more than 10 million sites. Each nucleotide has its own significant value of its distance to neutrality; this yields 16 hierarchical significances. A three-dimensional table with the number of sites of separation between the bases and the 16 significances (the third dimension is the dinucleotide, individual or taxon involved) directly gives an evolutionary state of the analyzed genome that can be used to obtain phylogenies. An example is provided.
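    The counting idea behind these tests can be sketched as follows: for a separation of k sites, tally the pairs of bases at positions i and i + k + 1 and compare the observed counts with the expectation under independence of the two positions. The helper below is an illustrative reconstruction, not the author's program; on a toy random sequence the standardized residuals should hover near zero, whereas real genomes, per the abstract, show very large departures.

```python
import numpy as np

def dinucleotide_residuals(seq, k):
    """Observed vs. expected counts of base pairs separated by k intervening sites,
    with the expectation taken under independence of the two positions."""
    bases = "ACGT"
    first, second = seq[: len(seq) - k - 1], seq[k + 1 :]
    n = len(first)
    obs = np.zeros((4, 4))
    for b1, b2 in zip(first, second):
        obs[bases.index(b1), bases.index(b2)] += 1
    p1 = np.array([first.count(b) for b in bases]) / n
    p2 = np.array([second.count(b) for b in bases]) / n
    exp = n * np.outer(p1, p2)
    return (obs - exp) / np.sqrt(exp)          # standardized (Pearson) residuals

# toy illustration on a random sequence: residuals should stay small
rng = np.random.default_rng(0)
seq = "".join(rng.choice(list("ACGT"), size=20000))
for k in range(4):
    r = dinucleotide_residuals(seq, k)
    print(f"k={k}  max |residual| = {np.abs(r).max():.2f}")
```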

  6. Large Uncertainty in Estimating pCO2 From Carbonate Equilibria in Lakes

    NASA Astrophysics Data System (ADS)

    Golub, Malgorzata; Desai, Ankur R.; McKinley, Galen A.; Remucal, Christina K.; Stanley, Emily H.

    2017-11-01

    Most estimates of carbon dioxide (CO2) evasion from freshwaters rely on calculating partial pressure of aquatic CO2 (pCO2) from two out of three CO2-related parameters using carbonate equilibria. However, the pCO2 uncertainty has not been systematically evaluated across multiple lake types and equilibria. We quantified random errors in pH, dissolved inorganic carbon, alkalinity, and temperature from the North Temperate Lakes Long-Term Ecological Research site in four lake groups across a broad gradient of chemical composition. These errors were propagated onto pCO2 calculated from three carbonate equilibria, and for overlapping observations, compared against uncertainties in directly measured pCO2. The empirical random errors in CO2-related parameters were mostly below 2% of their median values. Resulting random pCO2 errors ranged from ±3.7% to ±31.5% of the median depending on alkalinity group and choice of input parameter pairs. Temperature uncertainty had a negligible effect on pCO2. When compared with direct pCO2 measurements, all parameter combinations produced biased pCO2 estimates with less than one third of total uncertainty explained by random pCO2 errors, indicating that systematic uncertainty dominates over random error. The multidecadal trend of pCO2 was difficult to reconstruct from uncertain historical observations of CO2-related parameters. Given poor precision and accuracy of pCO2 estimates derived from virtually any combination of two CO2-related parameters, we recommend direct pCO2 measurements where possible. To achieve consistently robust estimates of CO2 emissions from freshwater components of terrestrial carbon balances, future efforts should focus on improving accuracy and precision of CO2-related parameters (including direct pCO2) measurements and associated pCO2 calculations.
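    A minimal Monte Carlo sketch of the kind of error propagation described here, assuming pCO2 is computed from the pH-DIC pair via freshwater carbonate equilibria; the equilibrium constants are generic placeholder values near 25 °C, and the measurement values and uncertainties are made up for illustration rather than taken from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Placeholder freshwater equilibrium constants near 25 degC (illustrative, not from the study)
K0 = 10 ** -1.47    # Henry's law constant, mol L-1 atm-1
K1 = 10 ** -6.35    # first dissociation constant of carbonic acid
K2 = 10 ** -10.33   # second dissociation constant

def pco2_from_ph_dic(ph, dic):
    """pCO2 (uatm) from pH and DIC (mol L-1), assuming carbonate species dominate DIC."""
    h = 10.0 ** -ph
    co2_star = dic / (1.0 + K1 / h + K1 * K2 / h**2)   # dissolved CO2 fraction of DIC
    return co2_star / K0 * 1e6

# Illustrative measurements with small random errors (values are made up)
ph_obs, ph_sd = 7.60, 0.02
dic_obs, dic_sd = 2.0e-3, 2.0e-5

n = 100_000
ph_draws = rng.normal(ph_obs, ph_sd, n)
dic_draws = rng.normal(dic_obs, dic_sd, n)
pco2 = pco2_from_ph_dic(ph_draws, dic_draws)

print(f"median pCO2 ~ {np.median(pco2):.0f} uatm, "
      f"random error ~ +/-{100 * pco2.std() / np.median(pco2):.1f}%")
```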

  7. Percolation, sliding, localization and relaxation in topologically closed circuits

    NASA Astrophysics Data System (ADS)

    Hurowitz, Daniel; Cohen, Doron

    2016-03-01

    Considering a random walk in a random environment in a topologically closed circuit, we explore the implications of the percolation and sliding transitions for its relaxation modes. A complementary question regarding the “delocalization” of eigenstates of non-hermitian Hamiltonians has been addressed by Hatano, Nelson, and followers. But we show that for a conservative stochastic process the implied spectral properties are dramatically different. In particular we determine the threshold for under-damped relaxation, and observe “complexity saturation” as the bias is increased.

  8. Analysis of stationary and dynamic factors affecting highway accident occurrence: A dynamic correlated grouped random parameters binary logit approach.

    PubMed

    Fountas, Grigorios; Sarwar, Md Tawfiq; Anastasopoulos, Panagiotis Ch; Blatt, Alan; Majka, Kevin

    2018-04-01

    Traditional accident analysis typically explores non-time-varying (stationary) factors that affect accident occurrence on roadway segments. However, the impact of time-varying (dynamic) factors is not thoroughly investigated. This paper seeks to simultaneously identify pre-crash stationary and dynamic factors of accident occurrence, while accounting for unobserved heterogeneity. Using highly disaggregate information for the potential dynamic factors, and aggregate data for the traditional stationary elements, a dynamic binary random parameters (mixed) logit framework is employed. With this approach, the dynamic nature of weather-related, and driving- and pavement-condition information is jointly investigated with traditional roadway geometric and traffic characteristics. To additionally account for the combined effect of the dynamic and stationary factors on the accident occurrence, the developed random parameters logit framework allows for possible correlations among the random parameters. The analysis is based on crash and non-crash observations between 2011 and 2013, drawn from urban and rural highway segments in the state of Washington. The findings show that the proposed methodological framework can account for both stationary and dynamic factors affecting accident occurrence probabilities, for panel effects, for unobserved heterogeneity through the use of random parameters, and for possible correlation among the latter. The comparative evaluation among the correlated grouped random parameters, the uncorrelated random parameters logit models, and their fixed parameters logit counterpart, demonstrate the potential of the random parameters modeling, in general, and the benefits of the correlated grouped random parameters approach, specifically, in terms of statistical fit and explanatory power. Published by Elsevier Ltd.
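    The core of a random parameters (mixed) binary logit can be sketched with simulated maximum likelihood: one coefficient is treated as normally distributed across observations and the likelihood is averaged over simulation draws. The toy specification below is illustrative only and is not the paper's model, which uses correlated grouped random parameters and real crash data.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(7)

# Simulate crash/no-crash outcomes with one fixed and one random coefficient (made-up values)
n, R = 2000, 200                                   # observations, simulation draws
x1, x2 = rng.normal(size=n), rng.normal(size=n)
alpha, beta_mean, beta_sd, gamma = -1.0, 0.8, 0.6, -0.5
beta_i = rng.normal(beta_mean, beta_sd, n)         # random parameter varies across observations
y = rng.binomial(1, expit(alpha + beta_i * x1 + gamma * x2))

draws = rng.standard_normal(R)                     # fixed draws for the simulated likelihood

def neg_simulated_loglik(theta):
    a, b_mean, log_b_sd, g = theta
    b = b_mean + np.exp(log_b_sd) * draws                    # R draws of the random coefficient
    eta = a + np.outer(x1, b) + (g * x2)[:, None]            # n x R linear predictors
    pr = expit(eta)
    lik = np.where(y[:, None] == 1, pr, 1 - pr).mean(axis=1) # average over the draws
    return -np.sum(np.log(lik + 1e-12))

res = minimize(neg_simulated_loglik, x0=[0.0, 0.0, -1.0, 0.0], method="BFGS")
a_hat, b_hat, log_sd_hat, g_hat = res.x
print("alpha, beta_mean, beta_sd, gamma ~",
      np.round([a_hat, b_hat, np.exp(log_sd_hat), g_hat], 2))
```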

  9. Gossip and Distributed Kalman Filtering: Weak Consensus Under Weak Detectability

    NASA Astrophysics Data System (ADS)

    Kar, Soummya; Moura, José M. F.

    2011-04-01

    The paper presents the gossip interactive Kalman filter (GIKF) for distributed Kalman filtering for networked systems and sensor networks, where inter-sensor communication and observations occur at the same time-scale. The communication among sensors is random; each sensor occasionally exchanges its filtering state information with a neighbor depending on the availability of the appropriate network link. We show that under a weak distributed detectability condition: 1. the GIKF error process remains stochastically bounded, irrespective of the instability properties of the random process dynamics; and 2. the network achieves weak consensus, i.e., the conditional estimation error covariance at a (uniformly) randomly selected sensor converges in distribution to a unique invariant measure on the space of positive semi-definite matrices (independent of the initial state). To prove these results, we interpret the filtered states (estimates and error covariances) at each node in the GIKF as stochastic particles with local interactions. We analyze the asymptotic properties of the error process by studying as a random dynamical system the associated switched (random) Riccati equation, the switching being dictated by a non-stationary Markov chain on the network graph.
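    A toy two-sensor illustration of the gossip mechanism (not the GIKF algorithm itself): only one sensor observes the mildly unstable signal, mimicking weak local detectability, and with some probability per step the two sensors swap their filter states, which keeps the blind sensor's error bounded. All dynamics and noise values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
T, a, q, r = 2000, 1.02, 0.5, 4.0          # mildly unstable dynamics, process/observation noise
swap_prob = 0.3                             # probability of a gossip exchange per step

x = 0.0
est = np.zeros(2)                           # sensor 0 sees data, sensor 1 is blind
cov = np.ones(2) * 10.0
err = np.zeros((T, 2))

for t in range(T):
    x = a * x + rng.normal(0, np.sqrt(q))                   # signal evolves
    est, cov = a * est, a * a * cov + q                     # both sensors predict
    # only sensor 0 has a local observation
    z = x + rng.normal(0, np.sqrt(r))
    k = cov[0] / (cov[0] + r)
    est[0], cov[0] = est[0] + k * (z - est[0]), (1 - k) * cov[0]
    # gossip: with some probability the two sensors swap their filter states
    if rng.random() < swap_prob:
        est, cov = est[::-1].copy(), cov[::-1].copy()
    err[t] = est - x

print("RMS error (observing sensor, blind sensor):",
      np.round(np.sqrt((err ** 2).mean(axis=0)), 2))
```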

  10. DNA asymmetry in stem cells - immortal or mortal?

    PubMed

    Yadlapalli, Swathi; Yamashita, Yukiko M

    2013-09-15

    The immortal strand hypothesis proposes that stem cells retain a template copy of genomic DNA (i.e. an 'immortal strand') to avoid replication-induced mutations. An alternative hypothesis suggests that certain cells segregate sister chromatids non-randomly to transmit distinct epigenetic information. However, this area of research has been highly controversial, with conflicting data even from the same cell types. Moreover, historically, the same term of 'non-random sister chromatid segregation' or 'biased sister chromatid segregation' has been used to indicate distinct biological processes, generating a confusion in the biological significance and potential mechanism of each phenomenon. Here, we discuss the models of non-random sister chromatid segregation, and we explore the strengths and limitations of the various techniques and experimental model systems used to study this question. We also describe our recent study on Drosophila male germline stem cells, where sister chromatids of X and Y chromosomes are segregated non-randomly during cell division. We aim to integrate the existing evidence to speculate on the underlying mechanisms and biological relevance of this long-standing observation on non-random sister chromatid segregation.

  12. Design and analysis of randomized clinical trials requiring prolonged observation of each patient. II. analysis and examples.

    PubMed Central

    Peto, R.; Pike, M. C.; Armitage, P.; Breslow, N. E.; Cox, D. R.; Howard, S. V.; Mantel, N.; McPherson, K.; Peto, J.; Smith, P. G.

    1977-01-01

    Part I of this report appeared in the previous issue (Br. J. Cancer (1976) 34,585), and discussed the design of randomized clinical trials. Part II now describes efficient methods of analysis of randomized clinical trials in which we wish to compare the duration of survival (or the time until some other untoward event first occurs) among different groups of patients. It is intended to enable physicians without statistical training either to analyse such data themselves using life tables, the logrank test and retrospective stratification, or, when such analyses are presented, to appreciate them more critically, but the discussion may also be of interest to statisticians who have not yet specialized in clinical trial analyses. PMID:831755

  13. Bell experiments with random destination sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sciarrino, Fabio; Mataloni, Paolo; Istituto Nazionale di Ottica, Consiglio Nazionale delle Ricerche

    2011-03-15

    It is generally assumed that sources randomly sending two particles to one or two different observers, random destination sources (RDSs), cannot be used for genuine quantum nonlocality tests because of the postselection loophole. We demonstrate that Bell experiments not affected by the postselection loophole may be performed with (i) an RDS and local postselection using perfect detectors, (ii) an RDS, local postselection, and fair sampling assumption with any detection efficiency, and (iii) an RDS and a threshold detection efficiency required to avoid the detection loophole. These results allow the adoption of RDS setups which are simpler and more efficient for long-distance free-space Bell tests, and extend the range of physical systems which can be used for loophole-free Bell tests.

  14. Improved diagonal queue medical image steganography using Chaos theory, LFSR, and Rabin cryptosystem.

    PubMed

    Jain, Mamta; Kumar, Anil; Choudhary, Rishabh Charan

    2017-06-01

    In this article, we propose an improved diagonal queue medical image steganography scheme for the transmission of patients' secret medical data, using a chaotic standard map, a linear feedback shift register, and the Rabin cryptosystem, as an improvement on a previous technique (Jain and Lenka, Springer Brain Inform 3:39-51, 2016). The proposed algorithm comprises four stages: generation of pseudo-random sequences (produced by the linear feedback shift register and the standard chaotic map), permutation and XORing using these pseudo-random sequences, encryption using the Rabin cryptosystem, and steganography using the improved diagonal queues. A security analysis has been carried out, and performance is evaluated using MSE, PSNR, and maximum embedding capacity, as well as by histogram analysis between various brain-disease stego and cover images.
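    A minimal sketch of the two pseudo-random bit generators named in the scheme, a linear feedback shift register and the chaotic (Chirikov) standard map; the tap positions, seeds, chaos parameter, and the simple XOR combination are illustrative assumptions, not the authors' key schedule.

```python
import math

def lfsr_bits(seed, taps, n):
    """Fibonacci LFSR: XOR the tapped bits to form the feedback bit, output the LSB."""
    state, width = seed, max(taps)
    out = []
    for _ in range(n):
        bit = 0
        for t in taps:
            bit ^= (state >> (t - 1)) & 1
        out.append(state & 1)
        state = (state >> 1) | (bit << (width - 1))
    return out

def standard_map_bits(x, p, k, n):
    """Bits from the chaotic (Chirikov) standard map, thresholding the x coordinate."""
    out = []
    for _ in range(n):
        p = (p + k * math.sin(x)) % (2 * math.pi)
        x = (x + p) % (2 * math.pi)
        out.append(1 if x > math.pi else 0)
    return out

# Illustrative keys (not from the paper): a 16-bit LFSR and a chaotic seed (x0, p0, K)
stream_a = lfsr_bits(seed=0xACE1, taps=(16, 14, 13, 11), n=32)
stream_b = standard_map_bits(x=0.7, p=1.3, k=6.5, n=32)
keystream = [a ^ b for a, b in zip(stream_a, stream_b)]   # one simple way to combine the streams
print("".join(map(str, keystream)))
```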

  15. Clearing out a maze: A model of chemotactic motion in porous media

    NASA Astrophysics Data System (ADS)

    Schilling, Tanja; Voigtmann, Thomas

    2017-12-01

    We study the anomalous dynamics of a biased "hungry" (or "greedy") random walk on a percolating cluster. The model mimics chemotaxis in a porous medium: In close resemblance to the 1980s arcade game PAC-MAN®, the hungry random walker consumes food, which is initially distributed in the maze, and biases its movement towards food-filled sites. We observe that the mean-squared displacement of the process follows a power law with an exponent that is different from previously known exponents describing passive or active microswimmer dynamics. The change in dynamics is well described by a dynamical exponent that depends continuously on the propensity to move towards food. It results in slower differential growth when compared to the unbiased random walk.
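    A toy version of the hungry biased walk, run on a fully food-filled square lattice rather than a percolating cluster for brevity: the walker eats the food on each visited site and prefers neighbouring sites that still hold food. The bias strength, walk length, and crude exponent fit are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
steps, bias = 1000, 4.0                 # walk length and preference for food-filled sites
L = 2 * steps + 3                       # lattice large enough that the walker cannot reach the edge

food = np.ones((L, L), dtype=bool)      # food everywhere initially (no percolation disorder here)
pos = np.array([L // 2, L // 2])
start = pos.copy()
food[tuple(pos)] = False
msd = np.zeros(steps)

moves = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])
for t in range(steps):
    neighbours = pos + moves
    has_food = food[neighbours[:, 0], neighbours[:, 1]]
    weights = np.where(has_food, bias, 1.0)            # food-filled neighbours are `bias` times likelier
    pos = neighbours[rng.choice(4, p=weights / weights.sum())]
    food[tuple(pos)] = False                           # the walker consumes the food on its new site
    msd[t] = np.sum((pos - start) ** 2)

# crude estimate of the exponent alpha in MSD ~ t^alpha from the second half of one walk
t_ax = np.arange(1, steps + 1)
alpha = np.polyfit(np.log(t_ax[steps // 2:]), np.log(msd[steps // 2:] + 1e-9), 1)[0]
print(f"apparent MSD exponent from one realization ~ {alpha:.2f}")
```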

  16. Stochastic reduced order models for inverse problems under uncertainty

    PubMed Central

    Warner, James E.; Aquino, Wilkins; Grigoriu, Mircea D.

    2014-01-01

    This work presents a novel methodology for solving inverse problems under uncertainty using stochastic reduced order models (SROMs). Given statistical information about an observed state variable in a system, unknown parameters are estimated probabilistically through the solution of a model-constrained, stochastic optimization problem. The point of departure and crux of the proposed framework is the representation of a random quantity using a SROM - a low dimensional, discrete approximation to a continuous random element that permits efficient and non-intrusive stochastic computations. Characterizing the uncertainties with SROMs transforms the stochastic optimization problem into a deterministic one. The non-intrusive nature of SROMs facilitates efficient gradient computations for random vector unknowns and relies entirely on calls to existing deterministic solvers. Furthermore, the method is naturally extended to handle multiple sources of uncertainty in cases where state variable data, system parameters, and boundary conditions are all considered random. The new and widely-applicable SROM framework is formulated for a general stochastic optimization problem in terms of an abstract objective function and constraining model. For demonstration purposes, however, we study its performance in the specific case of inverse identification of random material parameters in elastodynamics. We demonstrate the ability to efficiently recover random shear moduli given material displacement statistics as input data. We also show that the approach remains effective for the case where the loading in the problem is random as well. PMID:25558115
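    The central SROM idea can be illustrated compactly (this is a simplified sketch, not the authors' formulation): a continuous random input is replaced by a few fixed sample points whose probabilities are optimized so that the discrete model matches the target's CDF and first two moments. The lognormal target, number of points, and objective weights below are arbitrary choices.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import lognorm

target = lognorm(s=0.4, scale=1.0)                  # "true" random input (illustrative choice)
m = 5                                               # SROM size
x = target.ppf(np.linspace(0.1, 0.9, m))            # fixed support points (simple quantile choice)

def srom_objective(p):
    # mismatch in the CDF at the support points plus mismatch in mean and variance
    cdf_err = np.sum((np.cumsum(p) - target.cdf(x)) ** 2)
    mean_err = (np.sum(p * x) - target.mean()) ** 2
    var_err = (np.sum(p * x ** 2) - np.sum(p * x) ** 2 - target.var()) ** 2
    return cdf_err + mean_err + var_err

cons = {"type": "eq", "fun": lambda p: np.sum(p) - 1.0}
res = minimize(srom_objective, x0=np.full(m, 1.0 / m), bounds=[(0, 1)] * m,
               constraints=[cons], method="SLSQP")
p_opt = res.x
print("SROM points:", np.round(x, 3))
print("SROM probabilities:", np.round(p_opt, 3))
print("SROM mean vs target mean:", round(float(np.sum(p_opt * x)), 3), round(target.mean(), 3))
```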

  17. Single molecule counting and assessment of random molecular tagging errors with transposable giga-scale error-correcting barcodes.

    PubMed

    Lau, Billy T; Ji, Hanlee P

    2017-09-21

    RNA-Seq measures gene expression by counting sequence reads belonging to unique cDNA fragments. Molecular barcodes, commonly in the form of random nucleotides, were recently introduced to improve gene expression measures by detecting amplification duplicates, but they are susceptible to errors generated during PCR and sequencing. This results in false positive counts, leading to inaccurate transcriptome quantification, especially at low input and single-cell RNA amounts where the total number of molecules present is minuscule. To address this issue, we demonstrated the systematic identification of molecular species using transposable error-correcting barcodes that are exponentially expanded to tens of billions of unique labels. We experimentally showed that random-mer molecular barcodes suffer from substantial and persistent errors that are difficult to resolve. To assess our method's performance, we applied it to the analysis of known reference RNA standards. By including an inline random-mer molecular barcode, we systematically characterized the presence of sequence errors in random-mer molecular barcodes. We observed that such errors are extensive and become more dominant at low input amounts. This is the first study to use transposable molecular barcodes and to apply them to the study of random-mer molecular barcode errors. The extensive errors found in random-mer molecular barcodes may warrant the use of error-correcting barcodes for transcriptome analysis as input amounts decrease.
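    A toy contrast between random-mer barcodes and an error-correcting set (not the authors' transposable barcode design): a set built with minimum pairwise Hamming distance 3 contains no pairs that a single substitution can confuse, so single errors are corrected by nearest-neighbour decoding, whereas randomly drawn barcodes typically contain close pairs. Barcode length and set size are illustrative.

```python
import itertools
import random

random.seed(1)
BASES = "ACGT"
LENGTH, N_BARCODES = 8, 96

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def random_barcodes(n):
    return ["".join(random.choice(BASES) for _ in range(LENGTH)) for _ in range(n)]

def distance_filtered_barcodes(n, min_dist=3):
    """Greedy construction of a barcode set with pairwise Hamming distance >= min_dist."""
    chosen = []
    for cand in ("".join(c) for c in itertools.product(BASES, repeat=LENGTH)):
        if all(hamming(cand, b) >= min_dist for b in chosen):
            chosen.append(cand)
            if len(chosen) == n:
                break
    return chosen

random_set = random_barcodes(N_BARCODES)
ecc_set = distance_filtered_barcodes(N_BARCODES)

# Pairs closer than distance 3 cannot all be distinguished once a single substitution occurs
for name, codes in [("random-mer", random_set), ("min distance 3", ecc_set)]:
    close = sum(hamming(a, b) < 3 for a, b in itertools.combinations(codes, 2))
    print(f"{name:>14}: barcode pairs closer than Hamming distance 3 = {close}")

# In the distance-3 set a single substitution is corrected by nearest-neighbour decoding
true_code = ecc_set[10]
read = ("T" if true_code[0] != "T" else "G") + true_code[1:]   # introduce one substitution
decoded = min(ecc_set, key=lambda c: hamming(read, c))
print("single error corrected:", decoded == true_code)
```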

  18. Statistics of peak overpressure and shock steepness for linear and nonlinear N-wave propagation in a kinematic turbulence.

    PubMed

    Yuldashev, Petr V; Ollivier, Sébastien; Karzova, Maria M; Khokhlova, Vera A; Blanc-Benon, Philippe

    2017-12-01

    Linear and nonlinear propagation of high amplitude acoustic pulses through a turbulent layer in air is investigated using a two-dimensional KZK-type (Khokhlov-Zabolotskaya-Kuznetsov) equation. Initial waves are symmetrical N-waves with shock fronts of finite width. A modified von Kármán spectrum model is used to generate random wind velocity fluctuations associated with the turbulence. Physical parameters in simulations correspond to previous laboratory scale experiments where N-waves with 1.4 cm wavelength propagated through a turbulence layer with the outer scale of about 16 cm. Mean value and standard deviation of peak overpressure and shock steepness, as well as cumulative probabilities to observe amplified peak overpressure and shock steepness, are analyzed. Nonlinear propagation effects are shown to enhance pressure level in random foci for moderate initial amplitudes of N-waves thus increasing the probability to observe highly peaked waveforms. Saturation of the pressure level is observed for stronger nonlinear effects. It is shown that in the linear propagation regime, the turbulence mainly leads to the smearing of shock fronts, thus decreasing the probability to observe high values of steepness, whereas nonlinear effects dramatically increase the probability to observe steep shocks.
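    The turbulence input for such simulations is often synthesized by shaping white noise with the desired spectrum. Below is a generic 1-D random-phase Fourier synthesis of velocity fluctuations with a von Kármán-like spectral shape; the spectral form, outer scale, grid, and rms level are illustrative assumptions, not the paper's 2-D modified von Kármán implementation.

```python
import numpy as np

rng = np.random.default_rng(11)
N, dx = 4096, 0.005            # samples and grid spacing (m), giving a ~20 m record
L_outer = 0.16                 # outer scale of the turbulence, m (the 16 cm quoted above)
sigma_u = 1.0                  # target rms velocity fluctuation, m/s (illustrative)

k = np.fft.rfftfreq(N, d=dx) * 2 * np.pi                                # angular wavenumbers
spectrum = (k * L_outer) ** 4 / (1 + (k * L_outer) ** 2) ** (17 / 6)    # von Karman-like shape
spectrum[0] = 0.0                                                       # no mean-flow component

# random-phase synthesis: complex Gaussian noise shaped by sqrt(E(k)), then inverse FFT
coeffs = (rng.normal(size=k.size) + 1j * rng.normal(size=k.size)) * np.sqrt(spectrum)
u = np.fft.irfft(coeffs, n=N)
u *= sigma_u / u.std()                                                  # normalize to the target rms

print("rms fluctuation:", round(float(u.std()), 3), "m/s over", N * dx, "m")
```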

  19. Handling target obscuration through Markov chain observations

    NASA Astrophysics Data System (ADS)

    Kouritzin, Michael A.; Wu, Biao

    2008-04-01

    Target Obscuration, including foliage or building obscuration of ground targets and landscape or horizon obscuration of airborne targets, plagues many real world filtering problems. In particular, ground moving target identification Doppler radar, mounted on a surveillance aircraft or unattended airborne vehicle, is used to detect motion consistent with targets of interest. However, these targets try to obscure themselves (at least partially) by, for example, traveling along the edge of a forest or around buildings. This has the effect of creating random blockages in the Doppler radar image that move dynamically and somewhat randomly through this image. Herein, we address tracking problems with target obscuration by building memory into the observations, eschewing the usual corrupted, distorted partial measurement assumptions of filtering in favor of dynamic Markov chain assumptions. In particular, we assume the observations are a Markov chain whose transition probabilities depend upon the signal. The state of the observation Markov chain attempts to depict the current obscuration and the Markov chain dynamics are used to handle the evolution of the partially obscured radar image. Modifications of the classical filtering equations that allow observation memory (in the form of a Markov chain) are given. We use particle filters to estimate the position of the moving targets. Moreover, positive proof-of-concept simulations are included.
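    A toy bootstrap particle filter with a Markov-chain observation model in the spirit described here: the observation is a binary visible/obscured symbol whose transition probabilities depend on the current target position, and particle weights use both that transition probability and, when visible, a noisy position measurement. The geometry, probabilities, and noise levels are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
T, N = 100, 2000                      # time steps, particles
q, r = 0.5, 1.0                       # process and measurement noise variances

def p_visible(prev_visible, x):
    """Transition probability of the observation chain: staying visible is harder
    inside the 'obscured zone' |x| < 3 (all numbers are illustrative)."""
    base = 0.9 if prev_visible else 0.3
    return base * (0.4 if abs(x) < 3 else 1.0)

# --- simulate a target and its partially obscured observations ---
x_true, vis = np.zeros(T), np.zeros(T, dtype=bool)
z = np.full(T, np.nan)
x, v = 5.0, True
for t in range(T):
    x += rng.normal(0, np.sqrt(q))
    v = rng.random() < p_visible(v, x)
    x_true[t], vis[t] = x, v
    if v:
        z[t] = x + rng.normal(0, np.sqrt(r))

# --- bootstrap particle filter using the Markov observation model ---
particles = rng.normal(5.0, 2.0, N)
prev_v, est = True, np.zeros(T)
for t in range(T):
    particles = particles + rng.normal(0, np.sqrt(q), N)           # propagate
    pv = np.array([p_visible(prev_v, p) for p in particles])
    w = pv if vis[t] else (1.0 - pv)                               # chain transition likelihood
    if vis[t]:                                                     # position likelihood when visible
        w = w * np.exp(-0.5 * (z[t] - particles) ** 2 / r)
    w /= w.sum()
    particles = particles[rng.choice(N, size=N, p=w)]              # resample
    est[t] = particles.mean()
    prev_v = vis[t]

print("RMS tracking error:", round(float(np.sqrt(np.mean((est - x_true) ** 2))), 2))
```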

  20. Asymmetry in serial femtosecond crystallography data.

    PubMed

    Sharma, Amit; Johansson, Linda; Dunevall, Elin; Wahlgren, Weixiao Y; Neutze, Richard; Katona, Gergely

    2017-03-01

    Serial crystallography is an increasingly important approach to protein crystallography that exploits both X-ray free-electron laser (XFEL) and synchrotron radiation. Serial crystallography recovers complete X-ray diffraction data by processing and merging diffraction images from thousands of randomly oriented non-uniform microcrystals, of which all observations are partial Bragg reflections. Random fluctuations in the XFEL pulse energy spectrum, variations in the size and shape of microcrystals, integrating over millions of weak partial observations and instabilities in the XFEL beam position lead to new types of experimental errors. The quality of Bragg intensity estimates deriving from serial crystallography is therefore contingent upon assumptions made while modeling these data. Here it is observed that serial femtosecond crystallography (SFX) Bragg reflections do not follow a unimodal Gaussian distribution and it is recommended that an idealized assumption of single Gaussian peak profiles be relaxed to incorporate apparent asymmetries when processing SFX data. The phenomenon is illustrated by re-analyzing data collected from microcrystals of the Blastochloris viridis photosynthetic reaction center and comparing these intensity observations with conventional synchrotron data. The results show that skewness in the SFX observations captures the essence of the Wilson plot and an empirical treatment is suggested that can help to separate the diffraction Bragg intensity from the background.
