Sørensen, Thorkild I. A; Boutin, Philippe; Taylor, Moira A; Larsen, Lesli H; Verdich, Camilla; Petersen, Liselotte; Holst, Claus; Echwald, Søren M; Dina, Christian; Toubro, Søren; Petersen, Martin; Polak, Jan; Clément, Karine; Martínez, J. Alfredo; Langin, Dominique; Oppert, Jean-Michel; Stich, Vladimir; Macdonald, Ian; Arner, Peter; Saris, Wim H. M; Pedersen, Oluf; Astrup, Arne; Froguel, Philippe
2006-01-01
Objectives: To study whether genes with common single nucleotide polymorphisms (SNPs) associated with obesity-related phenotypes influence weight loss (WL) in obese individuals treated by a hypo-energetic low-fat or high-fat diet. Design: Randomised, parallel, two-arm, open-label multi-centre trial. Setting: Eight clinical centres in seven European countries. Participants: 771 obese adult individuals. Interventions: 10-wk dietary intervention with hypo-energetic (−600 kcal/d) diets with a targeted fat energy of 20%–25% or 40%–45%, completed in 648 participants. Outcome Measures: WL during the 10 wk in relation to genotypes of 42 SNPs in 26 candidate genes, probably associated with hypothalamic regulation of appetite, efficiency of energy expenditure, regulation of adipocyte differentiation and function, lipid and glucose metabolism, or production of adipocytokines, determined in 642 participants. Results: Compared with the noncarriers of each of the SNPs, and after adjusting for gender, age, baseline weight and centre, heterozygotes showed WL differences that ranged from −0.6 to 0.8 kg, and homozygotes, from −0.7 to 3.1 kg. Genotype-dependent additional WL on the low-fat diet ranged from 1.9 to −1.6 kg in heterozygotes, and from 3.8 to −2.1 kg in homozygotes, relative to the noncarriers. Considering the multiple testing conducted, none of the associations was statistically significant. Conclusions: Polymorphisms in a panel of obesity-related candidate genes play a minor role, if any, in modulating weight changes induced by a moderate hypo-energetic low-fat or high-fat diet. PMID:16871334
Uterine rupture at 10 weeks of gestation after laparoscopic myomectomy.
Okada, Yoshiyuki; Hasegawa, Junichi; Mimura, Takashi; Arakaki, Tatsuya; Yoshikawa, Shinichiro; Yamashita, Yuka; Oba, Tomohiro; Nakamura, Masamitsu; Matsuoka, Ryu; Sekizawa, Akihiko
2016-01-01
The patient had a history of laparoscopic myomectomy. At 10 weeks of gestation, she visited our emergency center because of sudden abdominal pain. Ultrasound examination and MRI showed complete rupture of the uterine myometrium in the fundal wall and a gestational sac floating, with fluid, in Douglas' fossa. Emergency laparotomy was immediately performed for the diagnosis of uterine rupture. During surgery, a small defect of the myometrium was found in the posterior fundal wall of the uterus, and two-layer suturing was performed at the perforation site. The possibility of uterine rupture after uterine surgery should be considered even in the first trimester.
ERIC Educational Resources Information Center
Schmidt, Mirko; Valkanover, Stefan; Roebers, Claudia; Conzelmann, Achim
2013-01-01
Most physical education intervention studies on the positive effect of sports on self-concept development have attempted to "increase" schoolchildren's self-concept without taking the "veridicality" of the self-concept into account. The present study investigated whether a 10-week intervention in physical education would…
Puntumetakul, Rungthip; Areeudomwong, Pattanasin; Emasithi, Alongkot; Yamauchi, Junichiro
2013-01-01
Background and aims: Clinical lumbar instability causes pain and socioeconomic suffering; however, an appropriate treatment for this condition is unknown. This article examines the effect of a 10-week core stabilization exercise (CSE) program and 3-month follow-up on pain-related outcomes in patients with clinical lumbar instability. Methods: Forty-two participants with clinical lumbar instability of at least 3 months in duration were randomly allocated either to 10 weekly treatments with CSE or to a conventional group (CG) receiving trunk stretching exercises and a hot pack. Pain-related outcomes, including pain intensity during the instability catch sign, functional disability, patient satisfaction, and health-related quality of life, were measured at 10 weeks of intervention and at 1 and 3 months after the last intervention session (follow-up); trunk muscle activation patterns, measured by surface electromyography, were assessed at 10 weeks. Results: CSE showed significantly greater reductions in all pain-related outcomes after 10 weeks and over the course of the 3-month follow-up period than those seen in the CG (P<0.01). Furthermore, CSE enhanced deep abdominal muscle activation more than the CG (P<0.001), whereas the CG showed deterioration of deep back muscle activation compared with the CSE group (P<0.01). In within-group comparisons, CSE provided significant improvements in all pain-related outcomes over follow-up (P<0.01), whereas the CG demonstrated a reduction in pain intensity during the instability catch sign only at 10 weeks (P<0.01). In addition, CSE improved deep abdominal muscle activation (P<0.01), whereas the CG showed deterioration of deep abdominal and back muscle activation (P<0.05). Conclusion: Ten-week CSE provides greater training and retention effects on pain-related outcomes and induces activation of deep abdominal muscles in patients with clinical lumbar instability compared with conventional treatment. PMID:24399870
Yang, Wen-Wen; Liu, Ya-Chen; Lu, Lee-Chang; Chang, Hsiao-Yun; Chou, Paul Pei-Hsi; Liu, Chiang
2013-12-01
Compared with regulation-weight baseballs, lightweight baseballs generate lower torque on the shoulder and elbow joints without altering the pitching movement and timing. This study investigates the throwing accuracy, throwing velocity, arm swing velocity, and maximum shoulder external rotation (MSER) of adolescent players after 10 weeks of pitching training with appropriate lightweight baseballs. We assigned 24 adolescent players to a lightweight baseball group (group L) and a regulation-weight baseball group (group R) based on their pretraining throwing velocity. Both groups received pitching training 3 times per week for 10 weeks with 4.4- and 5-oz baseballs. The players' throwing accuracy, throwing velocity, arm swing velocity, and MSER were measured from 10 maximum-effort throws using a regulation-weight baseball before and after undergoing the pitching training. The results showed that the players in group L significantly increased their throwing velocity and arm swing velocity (p < 0.05) after 10 weeks of pitching training with the 4.4-oz baseball, whereas group R did not (p > 0.05). Furthermore, the percentage change in the throwing velocity and arm swing velocity of group L was significantly superior to that of group R (p < 0.05). Thus, we concluded that the 10 weeks of pitching training with an appropriate lightweight baseball substantially enhanced the arm swing velocity and throwing velocity of the adolescent baseball players. These findings suggest that using a lightweight baseball, which can reduce the risk of injury without altering pitching patterns, has positive training effects on players in the rapid physical growth and technique development stage. PMID:23603999
Comparison of upper body strength gains between men and women after 10 weeks of resistance training
Steele, James; Pereira, Maria C.; Castanheira, Rafael P.M.; Paoli, Antonio; Bottaro, Martim
2016-01-01
Resistance training (RT) offers benefits to both men and women. However, studies of the differences between men and women in response to an RT program are inconclusive, and few data are available on the upper body strength response. The aim of this study was to compare elbow flexor strength gains in men and women after 10 weeks of RT. Forty-four college-aged men (22.63 ± 2.34 years) and forty-seven college-aged women (21.62 ± 2.96 years) participated in the study. The RT program was performed two days a week for 10 weeks. Before and after the training period, peak torque (PT) of the elbow flexors was measured with an isokinetic dynamometer. PT values were higher in men than in women at both pre- and post-test (p < 0.01). Both men and women significantly increased elbow flexor strength (p < 0.05); however, strength changes did not differ between genders after the 10-week RT program (11.61% and 11.76% for men and women, respectively; p > 0.05). Effect sizes were 0.57 and 0.56 for men and women, respectively. In conclusion, the present study suggests that men and women have a similar upper body strength response to RT. PMID:26893958
Identifying the limitations for growth in low performing piglets from birth until 10 weeks of age.
Paredes, S P; Jansman, A J M; Verstegen, M W A; den Hartog, L A; van Hees, H M J; Bolhuis, J E; van Kempen, T A T G; Gerrits, W J J
2014-06-01
The evolution of hyper-prolific pig breeds has led to greater within-litter variation in birth weight and in BW gain during the nursery phase. Based on an algorithm developed in previous research, two populations were selected from a pool of 368 clinically healthy piglets at 6 weeks of age: a low-performing (LP) and a high-performing (HP) population, whose development was monitored until the end of the nursery phase (10 weeks of age). To understand the cause of the variation in growth between these populations, we characterized the LP and HP piglets in terms of body morphology, behaviour, voluntary feed intake, BW gain, and apparent total tract and ileal nutrient digestibility. Piglets were housed individually and fed a highly digestible diet. At selection (6 weeks of age), the BW of LP and HP piglets was 6.8±0.1 and 12.2±0.1 kg, respectively. Compared with the LP piglets, the HP piglets grew faster (203 g/day), ate more (275 g/day) from 6 to 10 weeks of age, and were heavier at 10 weeks (30.0 v. 18.8 kg; all P<0.01). Yet the differences in average daily gain and average daily feed intake disappeared when compared per kg BW^0.75. Assuming similar maintenance requirements per kg BW^0.75, the efficiency of feed utilization above maintenance was 0.1 g/g lower for the LP piglets (P=0.09). The gain : feed ratio was similar for both groups. LP piglets tended to take more time to touch a novel object (P=0.10) and spent more time eating (P<0.05). At 10 weeks, LP piglets had a greater body length and head circumference relative to BW (P<0.01). Relative to BW, LP piglets had a 21% higher small intestine weight and a 36% longer small intestine; relative to average feed intake, small intestinal weight was 4 g/kg higher (both P<0.01). Apparent total tract and ileal dry matter, N and gross energy digestibility were similar between groups (P>0.10). We concluded that the low performance of the LP piglets was due to their inability to engage compensatory gain or compensatory feed intake as efficiency of
Fisher, B E; Ross, K; Wilson, A
1994-06-01
In a 10-week longitudinal study, 29 parents and their children kept daily records of the children's sleep behaviors, excitement levels, and tiredness levels. Although the hypothesized increase in sleep behaviors such as sleepwalking and restlessness during the week of Christmas did not occur, children rated as more excitable by their parents and themselves exhibited a higher frequency of sleep behaviors. Positive associations were also found between averaged tiredness ratings and sleep scores. The results support previous findings of an association between arousal characteristics of children and their sleep behavior. Moderate validity coefficients were obtained for parents' and children's ratings of excitement, tiredness, and nocturnal waking.
Chung, Weiliang; Shaw, Greg; Anderson, Megan E; Pyne, David B; Saunders, Philo U; Bishop, David J; Burke, Louise M
2012-10-09
Although some laboratory-based studies show an ergogenic effect of beta-alanine supplementation, there is a lack of field-based research in training and competition settings. Elite and sub-elite swimmers (n = 23 males and 18 females; age = 21.7 ± 2.8 years, mean ± SD) were supplemented with either beta-alanine (4-week loading phase of 4.8 g/day and 3.2 g/day thereafter) or placebo for 10 weeks. Competition performance times were log-transformed, then evaluated before (National Championships) and after (international or national selection meet) supplementation. Swimmers also completed three standardized training sets at baseline and at 4 and 10 weeks of supplementation. Capillary blood was analyzed for pH, bicarbonate and lactate concentration in both competition and training. There was an unclear effect (0.4%; ±0.8%, mean ± 90% confidence limits) of beta-alanine on competition performance compared to placebo, with no meaningful changes in blood chemistry. While there was a transient improvement in training performance after 4 weeks with beta-alanine (-1.3%; ±1.0%), the effect at 10 weeks was unclear (-0.2%; ±1.5%), again with no meaningful changes in blood chemistry. Beta-alanine supplementation appears to have minimal effect on swimming performance in non-laboratory-controlled, real-world training and competition settings.
Dreyer, Lukas; Dreyer, Sonja; Rankin, Dean
2012-01-01
This study examined the effect of a 10-week physical exercise program on the health status of college staff. Eighty-one participants were pre-tested on 22 variables including physical fitness, biochemical status, psychological health, and morphological measures. Participants in an experimental group (n = 61) received a 10-week intervention…
Effects of a 10-week resistance exercise program on soccer kick biomechanics and muscle strength.
Manolopoulos, Evaggelos; Katis, Athanasios; Manolopoulos, Konstantinos; Kalapotharakos, Vasileios; Kellis, Eleftherios
2013-12-01
The purpose of the study was to examine the effects of a resistance exercise program on soccer kick biomechanics. Twenty male amateur soccer players were divided into an experimental group (EG) and a control group (CG), each consisting of 10 players. The EG followed a 10-week resistance exercise program mainly for the lower limb muscles. Maximal instep kick kinematics, electromyography, and ground reaction forces (GRFs), as well as maximum isometric leg strength, were recorded before and after training. A 2-way analysis of variance showed significantly higher ball speed values only for the EG (26.14 ± 1.17 m·s⁻¹ vs. 27.59 ± 1.49 m·s⁻¹ before and after training, respectively), whereas no significant differences were observed for the CG. The EG showed a decline in joint angular velocities and an increase in biceps femoris electromyography of the swinging leg during the backswing phase, followed by a significant increase in segmental and joint velocities and muscle activation of the same leg during the forward swing phase (p < 0.05). The EG also showed significantly higher vertical GRFs and rectus femoris and gastrocnemius activation of the support leg (p < 0.05). Similarly, maximum and explosive isometric force significantly increased after training only for the EG (p < 0.05). These results suggest that increases in soccer kicking performance after a 10-week resistance training program were accompanied by increases in maximum strength and an altered soccer kick movement pattern, characterized by a more explosive backward-forward swinging movement and higher muscle activation during the final kicking phase.
Wan Dali, Wan Putri Elena; Lua, Pei Lin
2013-01-01
The aim of the study was to evaluate the effectiveness of a multimodal nutrition education intervention (NEI) in improving dietary intake among university students. The study used a cluster randomised controlled design at four public universities on the East Coast of Malaysia. A total of 417 university students participated. They were randomly selected and assigned, by cluster, to either an intervention group (IG) or a control group (CG). The IG received a 10-week multimodal intervention using three modes (conventional lectures, brochures, and text messages), while the CG received no intervention. Dietary intake was assessed before and after the intervention, and outcomes were reported as nutrient intakes as well as average daily servings of food intake. Analysis of covariance (ANCOVA) and adjusted effect sizes were used to determine differences in dietary changes between groups and over time. Results showed that, compared with the CG, participants in the IG significantly improved their dietary intake by increasing their intake of energy, carbohydrate, calcium, vitamin C, thiamine, fruits and 100% fruit juice, fish, egg, and milk and dairy products, while significantly decreasing their processed food intake. In conclusion, a multimodal NEI focusing on healthy eating promotion is an effective approach to improving dietary intake among university students. PMID:24069535
Sensitivity of vergence responses of 5- to 10-week-old human infants.
Seemiller, Eric S; Wang, Jingyun; Candy, T Rowan
2016-01-01
Infants have been shown to make vergence eye movements by 1 month of age to stimulation with prisms or targets moving in depth. However, little is currently understood about the threshold sensitivity of the maturing visual system to such stimulation. In this study, 5- to 10-week-old human infants and adults viewed a target moving in depth as a triangle wave of three amplitudes (1.0, 0.5, and 0.25 meter angles). Their horizontal eye position and the refractive state of both eyes were measured simultaneously. The vergence responses of the infants and adults varied at the same frequency as the stimulus at the three tested modulation amplitudes. For a typical infant of this age, the smallest amplitude is equivalent to an interocular change of approximately 2° of retinal disparity, from nearest to farthest points. The infants' accommodation responses only modulated reliably to the largest stimulus, while adults responded to all three amplitudes. Although the accommodative system appears relatively insensitive, the sensitivity of the vergence responses suggests that subtle cues are available to drive vergence in the second month after birth. PMID:26891827
Impact of 10-weeks of yoga practice on flexibility and balance of college athletes
Polsgrove, M Jay; Eggleston, Brandon M; Lockyer, Roch J
2016-01-01
Background: With clearer evidence of its benefits, coaches and athletes may better see that yoga has a role in optimizing performance. Aims: To determine the impact of yoga on male college athletes (N = 26). Methods: Over a 10-week period, a yoga group (YG) of athletes (n = 14) took part in biweekly yoga sessions, while a nonyoga group (NYG) of athletes (n = 12) took part in no additional yoga activity. Performance measures were obtained immediately before and after this period. Measurements of flexibility and balance included sit-reach (SR), shoulder flexibility (SF), and stork stand (SS); dynamic measurements consisted of joint angles (JA) measured during the performance of three distinct yoga positions (downward dog [DD], right foot lunge [RFL], and chair [C]). Results: Significant gains were observed in the YG for flexibility (SR, P = 0.01; SF, P = 0.03) and balance (SS, P = 0.05). No significant differences were observed in the NYG for flexibility and balance. Significantly greater JA were observed in the YG for RFL (dorsiflexion, l-ankle, P = 0.04), DD (extension, r-knee, P = 0.04; r-hip, P = 0.01; flexion, r-shoulder, P = 0.01) and C (flexion, r-knee, P = 0.01). Significant JA differences were observed in the NYG for DD (flexion, r-knee, P = 0.01; r-hip, P = 0.05; r-shoulder, P = 0.03) and C (flexion, r-knee, P = 0.01; extension, r-shoulder, P = 0.05). A between-group comparison revealed significant differences for RFL (l-ankle, P = 0.01), DD (r-knee, P = 0.01; r-hip, P = 0.01), and C (r-shoulder, P = 0.02). Conclusions: Results suggest that a regular yoga practice may increase the flexibility and balance as well as whole-body measures of male college athletes and, therefore, may enhance athletic performances that require these characteristics. PMID:26865768
Ashburn, Jennifer; Ayers, Mary Jane; Born-Ozment, Susan; Karsten, Jayne; Maeda, Sheri
This 10-week middle school curriculum unit for grades 6-8, integrating concepts, materials, and content from language arts, music, and visual arts, provides a set of specific instructional plans relative to the study of myths (often a content area in middle school grades across the country). All the sample lessons and examples in the curriculum…
Phakomatous choristoma in a 10-week-old boy: a case report and review of the literature.
Romano, Ryan C; McDonough, Patrick; Salomao, Diva R; Fritchie, Karen J
2015-01-01
Phakomatous choristoma (PC) is a rare benign congenital lesion of lenticular anlage. It presents in young patients as a firm subcutaneous mass in the medial eyelid or orbit and may raise clinical concern for neoplasms such as rhabdomyosarcoma, but its histopathology is distinct, consisting of dense collagenous stroma and eosinophilic cuboidal epithelial cells forming nests, tubules, cords, or pseudoglands. We present a case of PC in a 10-week-old boy to illustrate the unique clinical, histopathologic, and immunophenotypic features of this condition and to reaffirm that familiarity with this rare entity aids accurate diagnosis.
Navalgund, Anand; Buford, John A.; Briggs, Mathew S.; Givens, Deborah L.
2013-01-01
Altered trunk muscle reflexes have been observed in patients with low back pain (LBP). Altered reflexes may contribute to impaired postural control and possibly to recurrence of LBP. Specific stabilization exercise (SSE) programs have been shown to decrease the risk of LBP recurrence in a select group of patients with acute, first-episode LBP. It is not known whether trunk muscle reflex responses improve with resolution of subacute, recurrent LBP when treated with an SSE program. A perturbation test was used to compare trunk muscle reflexes in patients with subacute, recurrent LBP, before and after 10 weeks of an SSE program, and in a group of matched control subjects (CNTL). Before therapy, the LBP group had delayed trunk muscle reflexes compared with the CNTL group. After therapy, reflex latencies remained delayed, but amplitudes increased. Increased reflex amplitudes could limit excessive movement of the spine when it is perturbed, potentially helping prevent recurrence. PMID:22964879
Tolnai, Nóra; Szabó, Zsófia; Köteles, Ferenc; Szabo, Attila
2016-09-01
Pilates exercises have several demonstrated physical and psychological benefits. To date, most research in this context has been conducted with symptomatic or elderly people and with few dependent measures. The current study examined the chronic, or longitudinal, effects of very-low-frequency (once a week) Pilates training on several physical and psychological measures over a 10-week intervention in young, healthy, sedentary women. Further, the study gauged the acute effects of Pilates exercises on positive and negative affect in 10 exercise sessions. Compared with a control group, the Pilates group exhibited significant improvements in skeletal muscle mass, flexibility, balance, core and abdominal muscle strength, body awareness, and negative affect. This group also showed favorable changes in positive (22.5% increase) and negative affect (12.2% decrease) in nine out of ten exercise sessions. This work clearly demonstrates the acute and chronic benefits of Pilates training on both physical and psychological measures. It also reveals that even once-a-week Pilates training is enough to trigger detectable benefits in young sedentary women. While this frequency is below the required levels of exercise for health, it may overcome the 'lack of time' excuse for not exercising, and its tangible benefits may subsequently encourage engagement in more physical activity. PMID:27195456
Effect of a 10-week yoga programme on the quality of life of women after breast cancer surgery
Merecz, Dorota; Wójcik, Aleksandra; Świątkowska, Beata; Sierocka, Kamilla; Najder, Anna
2014-01-01
Aim of the study: The following research aimed to determine the effect of yoga on the quality of life of women after breast cancer surgery. Material and methods: A 10-week yoga programme included 90-minute yoga lessons once a week. To estimate quality of life, questionnaires developed by the European Organisation for Research and Treatment of Cancer (QLQ-C30 and QLQ-BR23) were used. The experimental group consisted of 12 women who practised yoga; the control group, of 16 women who did not. There were no differences between groups in age, time from operation, or characteristics associated with disease, treatment and participation in rehabilitation. Results: Our results revealed an improvement in general health and quality of life, physical and social functioning, as well as a reduction of difficulties in daily activities among exercising women. Their future prospects also improved: they worried less about their health than they had before participating in the programme. Compared with baseline, fatigue, dyspnoea and discomfort (pain, swelling, sensitivity) in the arm and breast on the operated side decreased among exercising women. Conclusions: Participation in the exercise programme resulted in an improvement of physical functioning and a reduction of fatigue, dyspnoea, and discomfort in the area of the breast and arm on the operated side. Based on our results and those obtained in foreign studies, we conclude that rehabilitation with the use of yoga practice improves the quality of life of patients after breast cancer surgery. However, we recommend further research on this issue in Poland. PMID:26327853
Paulsen, G; Hamarsland, H; Cumming, K T; Johansen, R E; Hulmi, J J; Børsheim, E; Wiig, H; Garthe, I; Raastad, T
2014-12-15
This study investigated the effects of vitamin C and E supplementation on acute responses and adaptations to strength training. Thirty-two recreationally strength-trained men and women were randomly allocated to receive a vitamin C and E supplement (1000 mg day(-1) and 235 mg day(-1), respectively), or a placebo, for 10 weeks. During this period the participants' training involved heavy-load resistance exercise four times per week. Muscle biopsies from m. vastus lateralis were collected, and 1 repetition maximum (1RM) and maximal isometric voluntary contraction force, body composition (dual-energy X-ray absorptiometry), and muscle cross-sectional area (magnetic resonance imaging) were measured before and after the intervention. Furthermore, the cellular responses to a single exercise session were assessed midway through the training period by measurements of muscle protein fractional synthetic rate and phosphorylation of several hypertrophic signalling proteins. Muscle biopsies were obtained from m. vastus lateralis twice before, and 100 and 150 min after, the exercise session (4 × 8RM, leg press and knee-extension). The supplementation did not affect the increase in muscle mass or the acute change in protein synthesis, but it hampered certain strength increases (biceps curl). Moreover, increased phosphorylation of p38 mitogen-activated protein kinase, extracellular signal-regulated protein kinases 1 and 2, and p70S6 kinase after the exercise session was blunted by vitamin C and E supplementation. The total ubiquitination levels after the exercise session, however, were lower with vitamin C and E than placebo. We concluded that vitamin C and E supplementation interfered with the acute cellular response to heavy-load resistance exercise and demonstrated tentative long-term negative effects on adaptation to strength training. PMID:25384788
ERIC Educational Resources Information Center
Adams, Gilbert L.
2013-01-01
This ex post facto comparison study of a postsecondary apprenticeship program at a naval ship construction company examined 8 years of academic performance and program completion data for two curricular formats: a 15-week traditional group (1,259 apprentices) and a 10-week accelerated group (736 apprentices). The two groups were investigated to…
ERIC Educational Resources Information Center
Smith, Sue
2002-01-01
A 10-week stress management and relaxation course helped anxious students develop skills and strategies derived from self-awareness. Course included stress theory, organizational skills (time management, goal setting), personal transformation, tolerance for uncertainty, and metacognition, with an emphasis on self-efficacy and autonomy. (Contains…
Strzelecki, Dominik; Tabaszewska, Agnieszka; Barszcz, Zbigniew; Józefowicz, Olga; Kropiwnicki, Paweł; Rabe-Jabłońska, Jolanta
2013-12-01
Memantine and other glutamatergic agents are currently being investigated in some off-label indications because of glutamatergic involvement in several psychoneurological disorders. We assumed that memantine, similarly to ketamine, may positively influence mood, while also having the potential to improve cognition and general quality of life. We report a case of a 49-year-old male hospitalized during a manic and a subsequent moderate depressive episode. After ineffective use of lithium, olanzapine and antidepressive treatment with mianserin, memantine was added up to 20 mg per day for 10 weeks. The mental state was assessed using the Hamilton Depression Rating Scale, the Young Mania Rating Scale, the Hamilton Anxiety Scale, the Clinical Global Inventory, the World Health Organization Quality of Life Scale and psychological tests. After 10 weeks the patient achieved a partial symptomatic improvement in mood, anxiety and quality of sleep, but his activity remained insufficient. We also observed an improvement in the parameters of cognitive functioning and quality of life. There were no significant mood variations during memantine use, nor mood changes after its termination. No significant side effects were noted during the memantine treatment. We conclude that memantine in bipolar depression may improve mood, cognitive functioning and quality of life. PMID:24474993
Liakopoulou, A.; Buttar, H.S.; Nera, E.A.; Fernando, L.
1989-01-01
Offspring of mice treated with cyclophosphamide (Cy; 1, 2.5 or 5 mg/kg) during pregnancy (6-18 days of gestation) and tested for immunocompetence from 5 to 10 weeks of age were found to have defective reticuloendothelial clearance. The main effects were: (a) increased elimination half-time (T1/2) of ⁵¹Cr-labeled SRBC from circulation, (b) decreased liver uptake of ⁵¹Cr and (c) impaired ability of the spleen, mostly affecting the female pups, to compensate for decreased liver uptake. The highest dose group suffered the most pronounced effects. This group was also found to have increased IgG immunoglobulin levels at 7 weeks of age. IgG antibody production in response to specific antigenic stimulation and delayed hypersensitivity reactions to oxazolone did not appear to be affected by Cy treatment.
Chaubal, S A; Molina, J A; Ohlrichs, C L; Ferre, L B; Faber, D C; Bols, P E J; Riesen, J W; Tian, X; Yang, X
2006-05-01
The objective was to develop a simple and effective ovum pick-up (OPU) protocol for cows, optimised for oocyte harvest and subsequent in vitro embryo production (IVP). Five protocols differing in collection frequency, dominant follicle removal (DFR) and FSH stimulation were tested on groups of three cows each, over an interval of 10 consecutive weeks. Performance was evaluated on a per-OPU-session, per-week and pooled (3 cows × 10 weeks) basis. Among the non-stimulated groups, on a per cow per session basis, once- or twice-weekly OPU had no effect on the mean (± S.E.M.) number of follicles aspirated, oocytes retrieved and blastocysts produced (0.6 ± 0.8 and 0.7 ± 0.7, respectively). However, DFR 72 h prior to OPU almost doubled blastocyst production (1.2 ± 1.3). In stimulated groups, FSH treatment (80 mg IM and 120 mg SC) was given once weekly prior to OPU. Treatment with FSH, followed by twice-weekly OPU, failed to show any synergistic effect of FSH and increased aspiration frequency. When FSH was given 36 h after DFR, followed by OPU 48 h later, more (P < 0.05) follicles (16.0 ± 5.0), oocytes (10.6 ± 4.5) and embryos (2.1 ± 1.2) were obtained during each session, but not on a weekly basis. Pooled results over 10 weeks showed an overall improved performance for the treatment groups with twice-weekly OPU sessions, due to double the number of OPU sessions performed. However, the protocol that consisted of DFR, FSH treatment and a subsequent single OPU per week was the most productive and cost-effective, with potential commercial appeal.
Allah Yar, Razia; Akbar, Atif; Iqbal, Furhan
2015-01-21
Currently there are no uniform standard treatments for newborns suffering from cerebral hypoxia-ischemia (HI), and finding new and effective strategies for treating HI injury remains a key direction for future research. The present study was designed to determine the optimal dose (1% or 3%) of creatine monohydrate (Cr) for the treatment of neonatal HI in female albino mice. On postnatal day 10, animals were subjected to left carotid artery ligation followed by 8% hypoxia for 25 minutes. Following weaning on postnatal day 20, mice were divided into three treatments on the basis of diet supplementation (normal rodent diet, 1% and 3% creatine-supplemented diet) for 10 weeks. A battery of neurological tests (rota rod, open field and Morris water maze) was used to demonstrate the effect of Cr supplementation on neurofunction and infarct size following HI. Open field test results indicated that Cr supplementation significantly improved locomotory and exploratory behavior in subjects. It was observed that Cr-treated mice showed better neuromuscular coordination (rota rod) and improved spatial memory (Morris water maze test). A significant effect of creatine supplementation in reducing infarct size was also observed. Post hoc multiple comparisons revealed that mice supplemented with 3% Cr for 10 weeks performed better during the Morris water maze test, while 1% Cr supplementation improved exploratory behavior and gain in body weight relative to the control group, indicating that Cr supplementation has the potential to improve neurofunction following neonatal brain damage. This article is part of a Special Issue entitled SI: Brain and Memory.
Jensen, M.L.; Obling, K.; Nielsen, R.O.; Parner, E.T.; Rasmussen, S.
2013-01-01
Background/Purpose: There is a paucity of knowledge on the association between different foot postures, quantified by the Foot Posture Index (FPI), and Quadriceps angle (Q-angle) with the development of running-related injuries. Earlier studies investigating these associations did not include an objective measure of the amount of running performed. Therefore, the purpose of this study was to investigate if kilometers to running-related injury (RRI) differ among novice runners with different foot postures and Q-angles when running in a neutral running shoe. Methods: A 10-week study was conducted including healthy, novice runners. At baseline, foot posture was evaluated using the FPI and the Q-angle was measured. Based on the FPI and Q-angle, the right and left feet/knees of the runners were categorized into exposure groups. All participants received a Global Positioning System watch to allow them to quantify running volume and were instructed to run a minimum of two times per week in a conventional, neutral running shoe. The outcome was RRI. Results: Fifty-nine novice runners of mixed gender were included. Of these, 13 sustained a running-related injury. No significant difference in cumulative relative risk between persons with pronated feet and neutral feet was found after 125 km of running (cumulative relative risk = 1.65 [0.65; 4.17], p = 0.29). Similarly, no difference was found between low and neutral Q-angle (cumulative relative risk = 1.25 [0.49; 3.23], p = 0.63). Conclusion: Static foot posture as quantified by FPI and knee alignment as quantified by Q-angle do not seem to affect the risk of injury among novice runners taking up a running regimen wearing a conventional neutral running shoe. These results should be interpreted with caution due to a small sample size. Level of Evidence: 2a PMID:24175127
NASA Astrophysics Data System (ADS)
Myrbo, A.; Howes, T.; Thompson, R.; Drake, C.; Woods, P.; Schuldt, N.; Borkholder, B.; Marty, J.; Lafrancois, T.; Pellerin, H.
2012-12-01
The two-week "mini-REU" was designed to attract students with little or no independent research experience, who might be intimidated by applying for a ten-week internship away from home (but who might apply for one after completing a good mini-REU). The arc of research, from site selection to field work and lab work to data interpretation and poster presentation, must be encompassed in these brief projects, so group projects with clear goals are best suited for mini-REUs. The May 2012 project, with twelve students in four research proxy groups (charcoal, phytoliths, plant macrofossils, and zooplankton), demonstrated that a FDL lake, Rice Portage, had extensive wild rice habitat prior to early 20th-century Euroamerican ditching; this proof was required in order for FDL to gain a permit from the Army Corps of Engineers to raise the lake level as part of a wild rice restoration effort. Each proxy group had one research advisor (a graduate student or soft money researcher), plus one UMN über-advisor for the project as a whole, as well as the Fond du Lac resource manager. All of these advisors also work with the 10-week interns throughout the summer.
Erceg, David N.; Anderson, Lindsey J.; Nickles, Chun M.; Lane, Christianne J.; Weigensberg, Marc J.; Schroeder, E. Todd
2015-01-01
Purpose: With the childhood obesity epidemic, efficient methods of exercise are sought to improve health. We tested whether whole body vibration (WBV) exercise can positively affect bone metabolism and improve insulin/glucose dynamics in sedentary overweight Latino boys. Methods: Twenty Latino boys 8-10 years of age were randomly assigned to either a control (CON) or 3 days/wk WBV exercise (VIB) group for 10 wk. Results: Significant increases in BMC (4.5±3.2%; p=0.01) and BMD (1.3±1.3%; p<0.01) were observed for the VIB group when compared to baseline values. For the CON group BMC significantly increased (2.0±2.2%; p=0.02), with no change in BMD (0.8±1.3%; p=0.11). There were no significant between-group changes in BMC or BMD. No significant change was observed in osteocalcin and collagen type I C-telopeptide (CTx) for the VIB group. However, osteocalcin showed a decreasing trend (p=0.09) and CTx significantly increased (p<0.03) for the CON group. This increase in CTx was significantly different between groups (p<0.02) and the effect size of the between-group difference in change was large (-1.09). There were no significant correlations between osteocalcin and measures of fat mass or insulin resistance for collapsed data. Conclusion: Although bone metabolism was altered by WBV training, no associations were apparent between osteocalcin and insulin resistance. These findings suggest WBV exercise may positively increase BMC and BMD by decreasing bone resorption in overweight Latino boys. PMID:26078710
ERIC Educational Resources Information Center
Molina, Brooke S. G.; Flory, Kate; Bukstein, Oscar G.; Greiner, Andrew R.; Baker, Jennifer L.; Krug, Vicky; Evans, Steven W.
2008-01-01
Objective: This pilot study tests the feasibility and preliminary efficacy of an after-school treatment program for middle schoolers with ADHD using a randomized clinical trial design. Method: A total of 23 students with ADHD (25% female, 48% African American) from a large public middle school were randomly assigned to a 10-week program or to…
NASA Technical Reports Server (NTRS)
Messaro, Semma; Harrison, Phillip
2010-01-01
Ares I Zonal Random vibration environments due to acoustic impingement and combustion processes are developed for liftoff, ascent and reentry. Random Vibration test criteria for Ares I Upper Stage pyrotechnic components are developed by enveloping the applicable zonal environments where each component is located. Random vibration tests will be conducted to assure that these components will survive and function appropriately after exposure to the expected vibration environments. Methodology: Random Vibration test criteria for Ares I Upper Stage pyrotechnic components were desired that would envelope all the applicable environments where each component was located. Applicable Ares I Vehicle drawings and design information needed to be assessed to determine the location(s) for each component on the Ares I Upper Stage. Design and test criteria needed to be developed by plotting and enveloping the applicable environments using Microsoft Excel spreadsheet software and documenting them in a report using Microsoft Word word-processing software. Conclusion: Random vibration liftoff, ascent, and green run design and test criteria for the Upper Stage pyrotechnic components were developed by using Microsoft Excel to envelope zonal environments applicable to each component. Results were transferred from Excel into a report using Microsoft Word. After the report is reviewed and edited by my mentor it will be submitted for publication as an attachment to a memorandum. Pyrotechnic component designers will extract criteria from my report for incorporation into the design and test specifications for components. Eventually the hardware will be tested to the environments I developed to assure that the components will survive and function appropriately after exposure to the expected vibration environments.
NASA Astrophysics Data System (ADS)
Tapiero, Charles S.; Vallois, Pierre
2016-11-01
The premise of this paper is that a fractional probability distribution is based on fractional operators and the fractional (Hurst) index, which alters the classical setting of random variables. For example, a random variable defined by its density function might not have a fractional density function defined in its conventional sense. Practically, this implies that a distribution's granularity, defined by a fractional kernel, may have properties that differ due to the fractional index used and the fractional calculus applied to define it. The purpose of this paper is to consider an application of fractional calculus to define the fractional density function of a random variable. In addition, we provide and prove a number of results defining the functional forms of these distributions as well as their existence. In particular, we define fractional probability distributions for increasing and decreasing functions that are right continuous. Examples are used to motivate the usefulness of a statistical approach to fractional calculus and its application to economic and financial problems. In conclusion, this paper is a preliminary attempt to construct statistical fractional models; due to the breadth and extent of such problems, it should be considered an initial attempt.
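To make the idea concrete, here is a hedged sketch of one standard way a fractional density can be set up (our illustration using the Riemann–Liouville fractional derivative; the authors' exact constructions may differ):

```latex
% Ordinary density: the first derivative of the distribution function,
%   f(x) = \frac{d}{dx} F(x).
% A fractional analogue of order $\alpha \in (0,1)$ replaces $d/dx$ by a
% fractional derivative, e.g. Riemann--Liouville:
f_\alpha(x) \;=\; D^\alpha F(x)
\;=\; \frac{1}{\Gamma(1-\alpha)}\,\frac{d}{dx}\int_0^x (x-t)^{-\alpha}\,F(t)\,dt ,
```

which formally reduces to the classical density as α → 1, illustrating why a variable with a conventional density need not have a well-defined fractional density for every α.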
2016-01-01
Background The need for accessible and motivating treatment approaches within mental health has led to the development of an Internet-based serious game intervention (called “Plan-It Commander”) as an adjunct to treatment as usual for children with attention-deficit/hyperactivity disorder (ADHD). Objective The aim was to determine the effects of Plan-It Commander on daily life skills of children with ADHD in a multisite randomized controlled crossover open-label trial. Methods Participants (N=170) in this 20-week trial had a diagnosis of ADHD and ranged in age from 8 to 12 years (male: 80.6%, 137/170; female: 19.4%, 33/170). They were randomized to a serious game intervention group (group 1; n=88) or a treatment-as-usual crossover group (group 2; n=82). Participants randomized to group 1 received a serious game intervention in addition to treatment as usual for the first 10 weeks and then received treatment as usual for the next 10 weeks. Participants randomized to group 2 received treatment as usual for the first 10 weeks and crossed over to the serious game intervention in addition to treatment as usual for the subsequent 10 weeks. Primary (parent report) and secondary (parent, teacher, and child self-report) outcome measures were administered at baseline, 10 weeks, and 10-week follow-up. Results After 10 weeks, participants in group 1 compared to group 2 achieved significantly greater improvements on the primary outcome of time management skills (parent-reported; P=.004) and on secondary outcomes of the social skill of responsibility (parent-reported; P=.04), and working memory (parent-reported; P=.02). Parents and teachers reported that total social skills improved over time within groups, whereas effects on total social skills and teacher-reported planning/organizing skills were nonsignificant between groups. Within group 1, positive effects were maintained or further improved in the last 10 weeks of the study. Participants in group 2, who played the
ERIC Educational Resources Information Center
Geller, Daniel A.; Wagner, Karen Dineen; Emslie, Graham; Murphy, Tanya; Carpenter, David J.; Wetherhold, Erica; Perera, Phil; Machin, Andrea; Gardiner, Christel
2004-01-01
Objective: To assess the efficacy and safety of paroxetine for the treatment of pediatric obsessive-compulsive disorder. Method: Children (7-11 years of age) and adolescents (12-17 years of age) meeting DSM-IV criteria for obsessive-compulsive disorder were randomized to paroxetine (10-50 mg/day) or placebo for 10 weeks. The primary efficacy…
Is random access memory random?
NASA Technical Reports Server (NTRS)
Denning, P. J.
1986-01-01
Most software is constructed on the assumption that the programs and data are stored in random access memory (RAM). Physical limitations on the relative speeds of processor and memory elements lead to a variety of memory organizations that match processor addressing rate with memory service rate. These include interleaved and cached memory. A very high fraction of a processor's address requests can be satisfied from the cache without reference to the main memory. The cache requests information from main memory in blocks that can be transferred at the full memory speed. Programmers who organize algorithms for locality can realize the highest performance from these computers.
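The locality argument above can be made concrete with a toy direct-mapped cache simulation (an illustrative sketch: the cache geometry and access traces are invented for the example, not taken from the paper):

```python
def cache_hit_rate(addresses, num_lines=64, block_size=16):
    """Simulate a direct-mapped cache and return the fraction of hits."""
    tags = [None] * num_lines           # one stored tag per cache line
    hits = 0
    for addr in addresses:
        block = addr // block_size      # which memory block this address is in
        line = block % num_lines        # direct-mapped placement
        tag = block // num_lines
        if tags[line] == tag:
            hits += 1
        else:
            tags[line] = tag            # miss: fetch the block into the line
    return hits / len(addresses)

# A sequential scan has high spatial locality; a pseudo-random stride does not.
sequential = list(range(4096))
scattered = [(i * 7919) % 4096 for i in range(4096)]
print(cache_hit_rate(sequential), cache_hit_rate(scattered))
```

The sequential trace misses once per 16-address block and hits on the other 15 accesses (hit rate 0.9375), while the scattered trace defeats spatial locality and hits far less often.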
How to Do Random Allocation (Randomization)
Shin, Wonshik
2014-01-01
Purpose: To explain the concept and procedure of random allocation as used in a randomized controlled study. Methods: We explain the general concept of random allocation and demonstrate how to perform the procedure easily and how to report it in a paper. PMID:24605197
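As a concrete illustration of random allocation, here is a minimal permuted-block randomization sketch (the block size, seed and arm labels are arbitrary choices for the example, not taken from the paper):

```python
import random

def block_randomize(n_participants, block_size=4, seed=2014):
    """Permuted-block randomization into two arms, 'A' and 'B'."""
    assert block_size % 2 == 0          # each block must balance the two arms
    rng = random.Random(seed)           # fixed seed -> reproducible allocation list
    allocation = []
    while len(allocation) < n_participants:
        block = ['A'] * (block_size // 2) + ['B'] * (block_size // 2)
        rng.shuffle(block)              # random order within each block
        allocation.extend(block)
    return allocation[:n_participants]

alloc = block_randomize(20)
print(alloc.count('A'), alloc.count('B'))   # arms stay balanced
```

Blocking guarantees the arms never drift far out of balance, which is why this scheme is common in small trials.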
Jeon, Seon-Min; Kim, Ji-Eun; Shin, Su-Kyung; Kwon, Eun-Young; Jung, Un Ju; Baek, Nam-In; Lee, Kyung-Tae; Jeong, Tae-Sook; Chung, Hae-Gon; Choi, Myung-Sook
2013-02-01
We evaluated the effects of Brassica rapa ethanol extract (BREE) on body composition and plasma lipid profiles through a randomized, double-blind, placebo-controlled trial in overweight subjects. Fifty-eight overweight participants (age 20-50 years, body mass index 23.0-24.9) were randomly assigned to two groups and received BREE (2 g/day) or placebo (starch, 2 g/day) for 10 weeks. Body composition, nutrient intake, plasma lipids, adipocytokines, and hepatotoxicity biomarkers were assessed in all subjects at baseline and after 10 weeks of supplementation. The plasma total cholesterol (total-C) concentration was significantly increased after 10 weeks compared to baseline in both groups. However, BREE supplementation significantly increased the high-density lipoprotein cholesterol (HDL-C) concentration and significantly reduced the total-C/HDL-C ratio, free fatty acid, and adipsin levels after 10 weeks. No significant differences were observed in body composition, fasting blood glucose, plasma adipocytokines except adipsin, or aspartate aminotransferase and alanine aminotransferase activities between baseline and the end of the trial within groups, or between the two groups. Supplementation with BREE partially improves plasma lipid metabolism in overweight subjects without adverse effects.
Random broadcast on random geometric graphs
Bradonjic, Milan; Elsasser, Robert; Friedrich, Tobias
2009-01-01
In this work, we consider the random broadcast time on random geometric graphs (RGGs). The classic random broadcast model, also known as the push algorithm, is defined as follows: starting with one informed node, in each succeeding round every informed node chooses one of its neighbors uniformly at random and informs it. We consider the random broadcast time on RGGs, when with high probability: (i) the RGG is connected, (ii) there exists a giant component in the RGG. We show that the random broadcast time is bounded by O(√n + diam(component)), where diam(component) is the diameter of the entire graph, or of the giant component, for regimes (i) and (ii), respectively. In other words, for both regimes, we derive the broadcast time to be Θ(diam(G)), which is asymptotically optimal.
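The push model described above can be simulated directly (an illustrative sketch; the graph, size and seed are arbitrary choices for the example, not the RGG setting of the paper):

```python
import random

def push_broadcast_rounds(adjacency, start=0, seed=1):
    """Rounds until every node is informed under the push model."""
    rng = random.Random(seed)
    informed = {start}
    rounds = 0
    n = len(adjacency)
    while len(informed) < n:
        newly = set()
        for node in informed:
            neighbour = rng.choice(adjacency[node])  # uniform random neighbour
            newly.add(neighbour)                     # push the rumour to it
        informed |= newly
        rounds += 1
    return rounds

# Complete graph on 32 nodes: the informed set can at most double per round,
# so at least log2(32) = 5 rounds are needed.
n = 32
complete = [[j for j in range(n) if j != i] for i in range(n)]
rounds = push_broadcast_rounds(complete)
print(rounds)
```

On sparser graphs such as RGGs the broadcast time is instead dominated by the graph diameter, which is the content of the Θ(diam(G)) bound above.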
Quantumness, Randomness and Computability
NASA Astrophysics Data System (ADS)
Solis, Aldo; Hirsch, Jorge G.
2015-06-01
Randomness plays a central role in the quantum mechanical description of our interactions. We review the relationship between the violation of Bell inequalities, non-signaling and randomness. We discuss the challenge in defining a random string, and show that algorithmic information theory provides a necessary condition for randomness using Borel normality. We close with a view on incomputability and its implications in physics.
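The Borel-normality idea, that in a random string every k-bit block should appear with frequency close to 2^-k, can be sketched as a simple frequency check (an illustration only; the tolerance and string here are ad hoc choices, not Calude's exact bound):

```python
import random
from collections import Counter

def block_frequencies(bits, k):
    """Frequency of each k-bit block over non-overlapping windows."""
    blocks = [bits[i:i + k] for i in range(0, len(bits) - k + 1, k)]
    counts = Counter(blocks)
    return {b: c / len(blocks) for b, c in counts.items()}

# A pseudo-random bit string: every 2-bit block should occur with
# frequency near 2**-2 = 0.25 if the string "looks" Borel normal.
rng = random.Random(0)
bits = ''.join(rng.choice('01') for _ in range(2 ** 16))
freqs = block_frequencies(bits, 2)
print({b: round(f, 3) for b, f in sorted(freqs.items())})
```

Passing such a frequency test is necessary but not sufficient for algorithmic randomness, which is exactly the distinction the review draws.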
Directed random walk with random restarts: The Sisyphus random walk
NASA Astrophysics Data System (ADS)
Montero, Miquel; Villarroel, Javier
2016-09-01
In this paper we consider a particular version of the random walk with restarts: random reset events which suddenly bring the system to the starting value. We analyze its relevant statistical properties, like the transition probability, and show how an equilibrium state appears. Formulas for the first-passage time, high-water marks, and other extreme statistics are also derived; we consider counting problems naturally associated with the system. Finally we indicate feasible generalizations useful for interpreting different physical effects.
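A minimal simulation of such a walk with random restarts (our sketch; the reset probability and seed are arbitrary choices). With reset probability q per step, the stationary position distribution is geometric, P(X = k) = q(1-q)^k, so the long-run mean position is (1-q)/q:

```python
import random

def sisyphus_walk(steps, reset_prob=0.1, seed=7):
    """Directed walk that advances by 1 each step but is suddenly
    reset to the starting value 0 with probability reset_prob."""
    rng = random.Random(seed)
    position, path = 0, [0]
    for _ in range(steps):
        if rng.random() < reset_prob:
            position = 0                # reset event: back to the start
        else:
            position += 1               # otherwise advance one unit
        path.append(position)
    return path

path = sisyphus_walk(10_000)
# For q = 0.1 the stationary mean is (1 - q) / q = 9.
mean_pos = sum(path) / len(path)
print(round(mean_pos, 2))
```

The empirical mean settles near the stationary value, illustrating how the reset mechanism produces an equilibrium state despite the drift.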
NASA Technical Reports Server (NTRS)
Erdmann, Michael
1992-01-01
This paper investigates the role of randomization in the solution of robot manipulation tasks. One example of randomization is shown by the strategy of shaking a bin holding a part in order to orient the part in a desired stable state with some high probability. Randomization can be useful for mobile robot navigation and as a means of guiding the design process.
ERIC Educational Resources Information Center
De Boeck, Paul
2008-01-01
It is common practice in IRT to consider items as fixed and persons as random. Both continuous and categorical person parameters are most often random variables, whereas for items only continuous parameters are used and they are commonly of the fixed type, although exceptions occur. It is shown in the present article that random item parameters…
Quantum random number generation
Ma, Xiongfeng; Yuan, Xiao; Cao, Zhu; Zhang, Zhen; Qi, Bing
2016-06-28
Here, quantum physics can be exploited to generate true random numbers, which play important roles in many applications, especially in cryptography. Genuine randomness from the measurement of a quantum system reveals the inherent nature of quantumness -- coherence, an important feature that differentiates quantum mechanics from classical physics. The generation of genuine randomness is generally considered impossible with only classical means. Based on the degree of trustworthiness of devices, quantum random number generators (QRNGs) can be grouped into three categories. The first category, practical QRNG, is built on fully trusted and calibrated devices and typically can generate randomness at a high speed by properly modeling the devices. The second category is self-testing QRNG, where verifiable randomness can be generated without trusting the actual implementation. The third category, semi-self-testing QRNG, is an intermediate category which provides a tradeoff between the trustworthiness of the device and the random number generation speed.
NASA Astrophysics Data System (ADS)
Cappellini, Valerio; Sommers, Hans-Jürgen; Bruzda, Wojciech; Życzkowski, Karol
2009-09-01
Ensembles of random stochastic and bistochastic matrices are investigated. While all columns of a random stochastic matrix can be chosen independently, the rows and columns of a bistochastic matrix have to be correlated. We evaluate the probability measure induced into the Birkhoff polytope of bistochastic matrices by applying the Sinkhorn algorithm to a given ensemble of random stochastic matrices. For matrices of order N = 2 we derive explicit formulae for the probability distributions induced by random stochastic matrices with columns distributed according to the Dirichlet distribution. For arbitrary N we construct an initial ensemble of stochastic matrices which allows one to generate random bistochastic matrices according to a distribution locally flat at the center of the Birkhoff polytope. The value of the probability density at this point enables us to obtain an estimation of the volume of the Birkhoff polytope, consistent with recent asymptotic results.
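The Sinkhorn step used above, alternately normalizing rows and columns until the matrix becomes bistochastic, can be sketched in a few lines (an illustrative pure-Python version; matrix size, seed and iteration count are arbitrary choices):

```python
import random

def sinkhorn(matrix, iterations=1000):
    """Alternately normalize rows and columns of a positive matrix;
    the iteration converges to a bistochastic matrix."""
    m = [row[:] for row in matrix]
    n = len(m)
    for _ in range(iterations):
        for i in range(n):                           # row normalization
            s = sum(m[i])
            m[i] = [x / s for x in m[i]]
        for j in range(n):                           # column normalization
            s = sum(m[i][j] for i in range(n))
            for i in range(n):
                m[i][j] /= s
    return m

rng = random.Random(42)
stochastic = [[rng.random() for _ in range(3)] for _ in range(3)]
b = sinkhorn(stochastic)
print([round(sum(row), 6) for row in b])                          # row sums ~ 1
print([round(sum(b[i][j] for i in range(3)), 6) for j in range(3)])  # column sums ~ 1
```

Applying this map to an ensemble of random stochastic matrices is what induces the probability measure on the Birkhoff polytope studied in the paper.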
Generating random density matrices
NASA Astrophysics Data System (ADS)
Życzkowski, Karol; Penson, Karol A.; Nechita, Ion; Collins, Benoît
2011-06-01
We study various methods to generate ensembles of random density matrices of a fixed size N, obtained by partial trace of pure states on composite systems. Structured ensembles of random pure states, invariant with respect to local unitary transformations, are introduced. To analyze statistical properties of quantum entanglement in bipartite systems we analyze the distribution of Schmidt coefficients of random pure states. Such a distribution is derived in the case of a superposition of k random maximally entangled states. For another ensemble, obtained by performing selective measurements in a maximally entangled basis on a multi-partite system, we show that this distribution is given by the Fuss-Catalan law and find the average entanglement entropy. A more general class of structured ensembles is proposed, containing also the case of Bures; it forms an extension of the standard ensemble of structureless random pure states, described asymptotically, as N → ∞, by the Marchenko-Pastur distribution.
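The basic construction, a random density matrix as the partial trace of a random pure state on a composite system, can be sketched as follows (a minimal illustration of the induced-measure ensemble; the dimensions and seed are arbitrary choices):

```python
import numpy as np

def random_density_matrix(n, k, seed=0):
    """Random density matrix of size n, obtained as the partial trace
    of a Gaussian (Ginibre) random pure state on an n x k composite system."""
    rng = np.random.default_rng(seed)
    psi = rng.normal(size=(n, k)) + 1j * rng.normal(size=(n, k))
    psi /= np.linalg.norm(psi)          # normalize the pure state
    return psi @ psi.conj().T           # Tr_B |psi><psi| in matrix form

rho = random_density_matrix(4, 8)
print(np.trace(rho).real)               # unit trace
print(np.allclose(rho, rho.conj().T))   # Hermitian
```

The eigenvalues of rho are the squared Schmidt coefficients of the pure state, which is why their distribution is the central object of the analysis above.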
Randomness: Quantum versus classical
NASA Astrophysics Data System (ADS)
Khrennikov, Andrei
2016-05-01
The recent tremendous development of quantum information theory has led to a number of quantum technological projects, e.g. quantum random generators. This development has stimulated a new wave of interest in quantum foundations. One of the most intriguing problems of quantum foundations is the elaboration of a consistent and commonly accepted interpretation of a quantum state. A closely related problem is the clarification of the notion of quantum randomness and its interrelation with classical randomness. In this short review, we discuss basics of the classical theory of randomness (which by itself is very complex and characterized by a diversity of approaches) and compare it with irreducible quantum randomness. We also discuss briefly “digital philosophy”, its role in physics (classical and quantum) and its coupling to the information interpretation of quantum mechanics (QM).
Quantum random number generator
Pooser, Raphael C.
2016-05-10
A quantum random number generator (QRNG) and a photon generator for a QRNG are provided. The photon generator may be operated in a spontaneous mode below a lasing threshold to emit photons. Photons emitted from the photon generator may have at least one random characteristic, which may be monitored by the QRNG to generate a random number. In one embodiment, the photon generator may include a photon emitter and an amplifier coupled to the photon emitter. The amplifier may enable the photon generator to be used in the QRNG without introducing significant bias in the random number and may enable multiplexing of multiple random numbers. The amplifier may also desensitize the photon generator to fluctuations in power supplied thereto while operating in the spontaneous mode. In one embodiment, the photon emitter and amplifier may be a tapered diode amplifier.
Autonomous Byte Stream Randomizer
NASA Technical Reports Server (NTRS)
Paloulian, George K.; Woo, Simon S.; Chow, Edward T.
2013-01-01
Net-centric networking environments are often faced with limited resources and must utilize bandwidth as efficiently as possible. In networking environments that span wide areas, the data transmission has to be efficient without any redundant or exuberant metadata. The Autonomous Byte Stream Randomizer software provides an extra level of security on top of existing data encryption methods. Randomizing the data's byte stream adds an extra layer to existing data protection methods, thus making it harder for an attacker to decrypt protected data. Based on a generated cryptographically secure random seed, a random sequence of numbers is used to intelligently and efficiently swap the organization of bytes in data using the unbiased and memory-efficient in-place Fisher-Yates shuffle method. Swapping bytes and reorganizing the crucial structure of the byte data renders the data file unreadable and leaves the data in a deconstructed state. This deconstruction adds an extra level of security requiring the byte stream to be reconstructed with the random seed in order to be readable. Once the data byte stream has been randomized, the software enables the data to be distributed to N nodes in an environment. Each piece of the data in randomized and distributed form is a separate entity unreadable in its own right, but when combined with all N pieces, is able to be reconstructed back to one. Reconstruction requires possession of the key used for randomizing the bytes, leading to the generation of the same cryptographically secure random sequence of numbers used to randomize the data. This software is a cornerstone capability possessing the ability to generate the same cryptographically secure sequence on different machines and time intervals, thus allowing this software to be used more heavily in net-centric environments where data transfer bandwidth is limited.
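The core mechanism, a seeded Fisher-Yates shuffle whose swap sequence can be replayed to invert the permutation, can be sketched as follows (an illustrative sketch only: Python's random.Random is not cryptographically secure, unlike the seed generation the text describes):

```python
import random

def randomize_bytes(data: bytes, seed: int) -> bytes:
    """In-place Fisher-Yates shuffle of a byte string, driven by a seeded PRNG."""
    buf = bytearray(data)
    rng = random.Random(seed)
    for i in range(len(buf) - 1, 0, -1):
        j = rng.randint(0, i)               # unbiased swap index
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)

def restore_bytes(data: bytes, seed: int) -> bytes:
    """Replay the same random swap sequence and undo it in reverse order."""
    rng = random.Random(seed)
    swaps = [(i, rng.randint(0, i)) for i in range(len(data) - 1, 0, -1)]
    buf = bytearray(data)
    for i, j in reversed(swaps):            # inverse permutation
        buf[i], buf[j] = buf[j], buf[i]
    return bytes(buf)

scrambled = randomize_bytes(b"attack at dawn", seed=1234)
print(restore_bytes(scrambled, seed=1234))  # b'attack at dawn'
```

Because the same seed regenerates the same swap sequence on any machine, reconstruction needs only the key, mirroring the software's cross-machine reproducibility claim.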
NASA Astrophysics Data System (ADS)
Chatterjee, Krishnendu; Doyen, Laurent; Gimbert, Hugo; Henzinger, Thomas A.
We consider two-player zero-sum games on graphs. These games can be classified on the basis of the information of the players and on the mode of interaction between them. On the basis of information the classification is as follows: (a) partial-observation (both players have partial view of the game); (b) one-sided complete-observation (one player has complete observation); and (c) complete-observation (both players have complete view of the game). On the basis of mode of interaction we have the following classification: (a) concurrent (players interact simultaneously); and (b) turn-based (players interact in turn). The two sources of randomness in these games are randomness in transition function and randomness in strategies. In general, randomized strategies are more powerful than deterministic strategies, and randomness in transitions gives more general classes of games. We present a complete characterization for the classes of games where randomness is not helpful in: (a) the transition function (probabilistic transition can be simulated by deterministic transition); and (b) strategies (pure strategies are as powerful as randomized strategies). As a consequence of our characterization we obtain new undecidability results for these games.
Broome, John
1984-10-01
This article considers what justification can be found for selecting randomly and in what circumstances it applies, including that of selecting patients to be treated by a scarce medical procedure. The author demonstrates that balancing the merits of fairness, common good, equal rights, and equal chance as they apply in various situations frequently leads to the conclusion that random selection may not be the most appropriate mode of selection. Broome acknowledges that, in the end, we may be forced to conclude that the only merit of random selection is the political one of guarding against partiality and oppression.
NASA Astrophysics Data System (ADS)
Newman, M. E. J.; Martin, Travis
2014-11-01
Random graph models have played a dominant role in the theoretical study of networked systems. The Poisson random graph of Erdős and Rényi, in particular, as well as the so-called configuration model, have served as the starting point for numerous calculations. In this paper we describe another large class of random graph models, which we call equitable random graphs and which are flexible enough to represent networks with diverse degree distributions and many nontrivial types of structure, including community structure, bipartite structure, degree correlations, stratification, and others, yet are exactly solvable for a wide range of properties in the limit of large graph size, including percolation properties, complete spectral density, and the behavior of homogeneous dynamical systems, such as coupled oscillators or epidemic models.
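As a concrete point of reference, the Poisson random graph of Erdős and Rényi mentioned in the abstract can be sampled in a few lines. The sketch below (function name illustrative) draws G(n, p) by including each of the C(n, 2) possible edges independently with probability p, and checks that the mean degree concentrates near (n − 1)p:

```python
import itertools
import random

def gnp_random_graph(n: int, p: float, seed: int = 0):
    """Sample an Erdős-Rényi G(n, p) graph as an edge list: each of the
    C(n, 2) possible edges is kept independently with probability p."""
    rng = random.Random(seed)
    return [(u, v)
            for u, v in itertools.combinations(range(n), 2)
            if rng.random() < p]

# For n = 1000, p = 0.01, the mean degree should be close to (n-1)*p = 9.99.
edges = gnp_random_graph(1000, 0.01, seed=42)
mean_degree = 2 * len(edges) / 1000
```

The same edge-list representation is a convenient starting point for percolation experiments of the kind the equitable-graph framework solves analytically.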
NASA Astrophysics Data System (ADS)
Bruzda, Wojciech; Cappellini, Valerio; Sommers, Hans-Jürgen; Życzkowski, Karol
2009-01-01
We define a natural ensemble of trace preserving, completely positive quantum maps and present algorithms to generate them at random. Spectral properties of the superoperator Φ associated with a given quantum map are investigated and a quantum analogue of the Frobenius-Perron theorem is proved. We derive a general formula for the density of eigenvalues of Φ and show the connection with the Ginibre ensemble of real non-symmetric random matrices. Numerical investigations of the spectral gap imply that a generic state of the system iterated several times by a fixed generic map converges exponentially to an invariant state.
NASA Astrophysics Data System (ADS)
Donnelly, Isaac
Random walks on lattices are a well-used model for diffusion on a continuum. They have been used to model subdiffusive systems, systems with forcing, and reactions, as well as combinations of the three. We extend the traditional random walk framework to networks to obtain novel results. For example, owing to the small graph diameter, the early-time behaviour of subdiffusive dynamics dominates the observed system, which has implications for models of the brain or airline networks. I would like to thank the Australian American Fulbright Association.
Randomness Of Amoeba Movements
NASA Astrophysics Data System (ADS)
Hashiguchi, S.; Khadijah, Siti; Kuwajima, T.; Ohki, M.; Tacano, M.; Sikula, J.
2005-11-01
Movements of amoebas were automatically traced using the difference between two successive frames of the microscopic movie. It was observed that the movements were almost random in that the directions and the magnitudes of the successive two steps are not correlated, and that the distance from the origin was proportional to the square root of the step number.
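The square-root law reported here is the standard diffusive scaling for uncorrelated steps: E[r²] = n, so the root-mean-square distance grows as √n. A small simulation (a sketch, not the authors' tracking code; names illustrative) reproduces it for a unit-step 2D walk with uniformly random directions:

```python
import math
import random

def rms_displacement(n_steps: int, n_walkers: int, seed: int = 1) -> float:
    """Root-mean-square distance from the origin after n_steps of an
    uncorrelated unit-step 2D random walk, averaged over n_walkers."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_walkers):
        x = y = 0.0
        for _ in range(n_steps):
            theta = rng.uniform(0.0, 2.0 * math.pi)  # isotropic direction
            x += math.cos(theta)
            y += math.sin(theta)
        total += x * x + y * y
    return math.sqrt(total / n_walkers)
```

After 100 steps the rms displacement should be close to √100 = 10, matching the proportionality observed for the amoebas.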
NASA Astrophysics Data System (ADS)
Leonetti, Marco; López, Cefe
2012-06-01
A random laser is formed by a haphazard assembly of nondescript optical scatterers with optical gain. Multiple light scattering replaces the optical cavity of traditional lasers, and the interplay between gain, scattering and size determines its unique properties. Random lasers studied until recently consisted of irregularly shaped or polydisperse scatterers, with some average scattering strength constant across the gain frequency band. Photonic glasses can sustain scattering resonances that can be placed in the gain window, since they are formed by monodisperse spheres [1]. The unique resonant scattering of this novel material allows controlling the lasing color via the diameter of the particles and their refractive index. Thus a random laser with an a priori set lasing peak can be designed [2]. A special pumping scheme that selects the number of activated modes in a random laser makes it possible to prepare random lasers in two distinct regimes by controlling directionality through the shape of the pump [3]. When pumping is essentially unidirectional, few (barely interacting) modes are turned on, which appear as sharp, uncorrelated peaks in the spectrum. By increasing the angular span of the pump beams, many resonances intervene, generating a smooth emission spectrum with a high degree of correlation and a shorter lifetime. These are signs of a phase-locking transition, in which phases are clamped together so that modes oscillate synchronously.
ERIC Educational Resources Information Center
Griffiths, Martin
2011-01-01
One of the author's undergraduate students recently asked him whether it was possible to generate a random positive integer. After some thought, the author realised that there were plenty of interesting mathematical ideas inherent in her question. So much so in fact, that the author decided to organise a workshop, open both to undergraduates and…
Contouring randomly spaced data
NASA Technical Reports Server (NTRS)
Kibler, J. F.; Morris, W. D.; Hamm, R. W.
1977-01-01
Computer program using a triangulation contouring technique contours data points too numerous to fit into a rectangular grid. Using random-access procedures, the program can handle up to 56,000 data points and provides up to 20 contour intervals for multiple parameters.
Uniform random number generators
NASA Technical Reports Server (NTRS)
Farr, W. R.
1971-01-01
Methods are presented for the generation of random numbers with uniform and normal distributions. Subprogram listings of Fortran generators for the Univac 1108, SDS 930, and CDC 3200 digital computers are also included. The generators are of the mixed multiplicative type, and the mathematical method employed is that of Marsaglia and Bray.
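The generators described are machine-specific Fortran routines; the sketch below illustrates the two ingredients in Python with commonly cited constants (not the original Univac/SDS/CDC parameters, which the report does not reproduce here): a multiplicative congruential generator for uniform variates, and the Marsaglia polar method for normal variates.

```python
import math

def mcg_uniforms(seed, n, a=16807, m=2**31 - 1):
    """Multiplicative congruential generator x_{k+1} = a*x_k mod m,
    scaled to (0, 1). Constants are the Lewis-Goodman-Miller 'minimal
    standard', shown purely for illustration."""
    x = seed
    out = []
    for _ in range(n):
        x = (a * x) % m
        out.append(x / m)
    return out

def polar_normals(uniforms):
    """Marsaglia polar method: consume uniforms in pairs, keep pairs that
    fall inside the unit disk, and transform each accepted pair into two
    standard normal variates."""
    normals = []
    it = iter(uniforms)
    for u1 in it:
        u2 = next(it, None)
        if u2 is None:
            break
        v1, v2 = 2.0 * u1 - 1.0, 2.0 * u2 - 1.0
        s = v1 * v1 + v2 * v2
        if 0.0 < s < 1.0:              # accept ~ pi/4 of the pairs
            f = math.sqrt(-2.0 * math.log(s) / s)
            normals.extend((v1 * f, v2 * f))
    return normals
```

The rejection step is what makes the polar method free of trigonometric calls, which mattered on the hardware of that era.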
Randomization and sampling issues
Geissler, P.H.
1996-01-01
The need for randomly selected routes and other sampling issues have been debated by the Amphibian electronic discussion group. Many excellent comments have been made, pro and con, but we have not reached consensus yet. This paper brings those comments together and attempts a synthesis. I hope that the resulting discussion will bring us closer to a consensus.
ERIC Educational Resources Information Center
Ben-Ari, Morechai
2004-01-01
The term "random" is frequently used in discussion of the theory of evolution, even though the mathematical concept of randomness is problematic and of little relevance in the theory. Therefore, since the core concept of the theory of evolution is the non-random process of natural selection, the term random should not be used in teaching the…
2011-01-01
Background Natural food supplements with high flavonoid content are often claimed to promote weight loss and lower plasma cholesterol in animal studies, but human studies have been more equivocal. The aim of this study was firstly to determine the effectiveness of natural food supplements containing Glycine max leaves extract (EGML) or Garcinia cambogia extract (GCE) in promoting weight loss and lowering plasma cholesterol, and secondly to examine whether these supplements have any beneficial effect on lipid, adipocytokine or antioxidant profiles. Methods Eighty-six overweight subjects (Male:Female = 46:40, age: 20~50 yr, BMI > 23 and < 29) were randomly assigned to three groups and administered tablets containing EGML (2 g/day), GCE (2 g/day) or placebo (starch, 2 g/day) for 10 weeks. At baseline and after 10 weeks, body composition, plasma cholesterol and diet were assessed. Blood analysis was also conducted to examine plasma lipoproteins, triglycerides, adipocytokines and antioxidants. Results EGML and GCE supplementation failed to promote weight loss or any clinically significant change in %body fat. The EGML group had lower total cholesterol after 10 weeks compared to the placebo group (p < 0.05). EGML and GCE had no effect on triglycerides, non-HDL-C, adipocytokines or antioxidants when compared to placebo supplementation. However, HDL-C was higher in the EGML group (p < 0.001) after 10 weeks compared to the placebo group. Conclusions Ten weeks of EGML or GCE supplementation did not promote weight loss or lower total cholesterol in overweight individuals consuming their habitual diet, although EGML did increase plasma HDL-C levels, which are associated with a lower risk of atherosclerosis. PMID:21936892
Relativistic Weierstrass random walks.
Saa, Alberto; Venegeroles, Roberto
2010-08-01
The Weierstrass random walk is a paradigmatic Markov chain giving rise to a Lévy-type superdiffusive behavior. It is well known that special relativity prevents the arbitrarily high velocities necessary to establish a superdiffusive behavior in any process occurring in Minkowski spacetime, implying, in particular, that any relativistic Markov chain describing spacetime phenomena must be essentially Gaussian. Here, we introduce a simple relativistic extension of the Weierstrass random walk and show that there must exist a transition time t{c} delimiting two qualitatively distinct dynamical regimes: the (nonrelativistic) superdiffusive Lévy flights, for t
Interactions in random copolymers
NASA Astrophysics Data System (ADS)
Marinov, Toma; Luettmer-Strathmann, Jutta
2002-04-01
The description of thermodynamic properties of copolymers in terms of simple lattice models requires a value for the effective interaction strength between chain segments, in addition to parameters that can be derived from the properties of the corresponding homopolymers. If the monomers are chemically similar, Berthelot's geometric-mean combining rule provides a good first approximation for interactions between unlike segments. In earlier work on blends of polyolefins [1], we found that the small-scale architecture of the chains leads to corrections to the geometric-mean approximation that are important for the prediction of phase diagrams. In this work, we focus on the additional effects due to sequencing of the monomeric units. In order to estimate the effective interaction for random copolymers, the small-scale simulation approach developed in [1] is extended to allow for random sequencing of the monomeric units. The approach is applied here to random copolymers of ethylene and 1-butene. [1] J. Luettmer-Strathmann and J.E.G. Lipson. Phys. Rev. E 59, 2039 (1999) and Macromolecules 32, 1093 (1999).
FitzGerald, Mary P; Anderson, Rodney U; Potts, Jeannette; Payne, Christopher K; Peters, Kenneth M; Clemens, J Quentin; Kotarinos, Rhonda; Fraser, Laura; Cosby, Annamarie; Fortman, Carole; Neville, Cynthia; Badillo, Suzanne; Odabachian, Lisa; Sanfield, Anna; O’Dougherty, Betsy; Halle-Podell, Rick; Cen, Liyi; Chuai, Shannon; Landis, J Richard; Kusek, John W; Nyberg, Leroy M
2010-01-01
Objectives To determine the feasibility of conducting a randomized clinical trial designed to compare two methods of manual therapy (myofascial physical therapy (MPT) and global therapeutic massage (GTM)) among patients with urologic chronic pelvic pain syndromes. Materials and Methods Our goal was to recruit 48 subjects with chronic prostatitis/chronic pelvic pain syndrome or interstitial cystitis/painful bladder syndrome at six clinical centers. Eligible patients were randomized to either MPT or GTM and were scheduled to receive up to 10 weekly treatments, each 1 hour in duration. Criteria to assess feasibility included adherence of therapists to the prescribed therapeutic protocol as determined by treatment records, adverse events occurring during study treatment, and rate of response to therapy as assessed by the Patient Global Response Assessment (GRA). Primary outcome analysis compared response rates between treatment arms using Mantel-Haenszel methods. Results Twenty-three (49%) men and 24 (51%) women were randomized over a six-month period. Twenty-four (51%) patients were randomized to GTM, 23 (49%) to MPT; 44 (94%) patients completed the study. Therapist adherence to the treatment protocols was excellent. The GRA response rate of 57% in the MPT group was significantly higher than the rate of 21% in the GTM treatment group (p=0.03). Conclusions The goals to judge feasibility of conducting a full-scale trial of physical therapy methods were met. The preliminary finding of a beneficial effect of MPT warrants further study. PMID:19535099
Thomas, D R; Goode, P S; LaMaster, K; Tennyson, T
1998-10-01
Aloe vera has been used for centuries as a topical treatment for various conditions and as a cathartic. An amorphous hydrogel dressing derived from the aloe plant (Carrasyn Gel Wound Dressing, Carrington Laboratories, Inc., Irving, TX) is approved by the Food and Drug Administration for the management of Stages I through IV pressure ulcers. To evaluate effectiveness of this treatment, 30 patients were randomized to receive either daily topical application of the hydrogel study dressing (acemannan hydrogel wound dressing) or a moist saline gauze dressing. Complete healing of the study ulcer occurred in 19 of 30 subjects (63%) during the 10-week observation period. No difference was observed in complete healing between the experimental and the control groups (odds ratio 0.93, 95% CI 0.16, 5.2). This study indicates that the acemannan hydrogel dressing is as effective as, but is not superior to, a moist saline gauze wound dressing for the management of pressure ulcers.
Jee, Sandra H; Couderc, Jean-Philippe; Swanson, Dena; Gallegos, Autumn; Hilliard, Cammie; Blumkin, Aaron; Cunningham, Kendall; Heinert, Sara
2015-08-01
This article presents a pilot project implementing a mindfulness-based stress reduction program among traumatized youth in foster and kinship care over 10 weeks. Forty-two youth participated in this randomized controlled trial, which used a mixed-methods (quantitative, qualitative, and physiologic) evaluation. Youth self-reports measuring mental health problems, mindfulness, and stress were lower than anticipated, and the relatively short time frame for teaching these skills to traumatized youth may not have been sufficient to capture significant changes in stress as measured by electrocardiograms. Main themes from the qualitative data included expressed competence in managing ongoing stress, enhanced self-awareness, and new strategies to manage stress. We share our experiences and recommendations for future research and practice, including focusing efforts on younger youth and using community-based participatory research principles to promote engagement and co-learning. CLINICALTRIALS.GOV: Protocol Registration System ID NCT01708291.
A random number generator for continuous random variables
NASA Technical Reports Server (NTRS)
Guerra, V. M.; Tapia, R. A.; Thompson, J. R.
1972-01-01
A FORTRAN 4 routine is given which may be used to generate random observations of a continuous real valued random variable. Normal distribution of F(x), X, E(akimas), and E(linear) is presented in tabular form.
Talley, Nicholas J.; Locke, G. Richard; Saito, Yuri A.; Almazar, Ann E.; Bouras, Ernest P.; Howden, Colin W.; Lacy, Brian E.; DiBaise, John K.; Prather, Charlene M.; Abraham, Bincy P.; El-Serag, Hashem B.; Moayyedi, Paul; Herrick, Linda M.; Szarka, Lawrence A.; Camilleri, Michael; Hamilton, Frank A.; Schleck, Cathy D.; Tilkes, Katherine E.; Zinsmeister, Alan R.
2015-01-01
Background & Aims Anti-depressants are frequently prescribed to treat functional dyspepsia (FD), a common disorder characterized by upper abdominal symptoms, including discomfort or post-prandial fullness. However, there is little evidence for the efficacy of these drugs in patients with FD. We performed a randomized, double-blind, placebo-controlled trial to evaluate the effects of anti-depressant therapy on symptoms, gastric emptying (GE), and meal-induced satiety in patients with FD. Methods We performed a study at 8 North American sites of patients who met the Rome II criteria for FD and did not have depression or use anti-depressants. Subjects (n=292; 44±15 y old, 75% female, 70% with dysmotility-like FD, and 30% with ulcer-like FD) were randomly assigned to groups given placebo, 50 mg amitriptyline, or 10 mg escitalopram for 10 weeks. The primary endpoint was adequate relief of FD symptoms for ≥5 weeks of the last 10 weeks (out of 12). Secondary endpoints included GE time, maximum tolerated volume in a nutrient drink test, and FD-related quality of life. Results An adequate relief response was reported by 39 subjects given placebo (40%), 51 given amitriptyline (53%), and 37 given escitalopram (38%) (P=.05, following treatment, adjusted for baseline balancing factors including all subjects). Subjects with ulcer-like FD given amitriptyline were more than 3-fold more likely to report adequate relief than those given placebo (odds ratio=3.1; 95% confidence interval, 1.1–9.0). Neither amitriptyline nor escitalopram appeared to affect GE or meal-induced satiety after the 10-week period in any group. Subjects with delayed GE were less likely to report adequate relief than subjects with normal GE (odds ratio=0.4; 95% confidence interval, 0.2–0.8). Both anti-depressants improved overall quality of life. Conclusions Amitriptyline, but not escitalopram, appears to benefit some patients with FD, particularly those with ulcer-like (painful) FD. Patients
Randomizing Roaches: Exploring the "Bugs" of Randomization in Experimental Design
ERIC Educational Resources Information Center
Wagler, Amy; Wagler, Ron
2014-01-01
Understanding the roles of random selection and random assignment in experimental design is a central learning objective in most introductory statistics courses. This article describes an activity, appropriate for a high school or introductory statistics course, designed to teach the concepts, values and pitfalls of random selection and assignment…
Cluster Randomized Controlled Trial
Young, John; Chapman, Katie; Nixon, Jane; Patel, Anita; Holloway, Ivana; Mellish, Kirste; Anwar, Shamaila; Breen, Rachel; Knapp, Martin; Murray, Jenni; Farrin, Amanda
2015-01-01
Background and Purpose— We developed a new postdischarge system of care comprising a structured assessment covering longer-term problems experienced by patients with stroke and their carers, linked to evidence-based treatment algorithms and reference guides (the longer-term stroke care system of care) to address the poor longer-term recovery experienced by many patients with stroke. Methods— A pragmatic, multicentre, cluster randomized controlled trial of this system of care. Eligible patients referred to community-based Stroke Care Coordinators were randomized to receive the new system of care or usual practice. The primary outcome was improved patient psychological well-being (General Health Questionnaire-12) at 6 months; secondary outcomes included functional outcomes for patients, carer outcomes, and cost-effectiveness. Follow-up was through self-completed postal questionnaires at 6 and 12 months. Results— Thirty-two stroke services were randomized (29 participated); 800 patients (399 control; 401 intervention) and 208 carers (100 control; 108 intervention) were recruited. In intention to treat analysis, the adjusted difference in patient General Health Questionnaire-12 mean scores at 6 months was −0.6 points (95% confidence interval, −1.8 to 0.7; P=0.394) indicating no evidence of statistically significant difference between the groups. Costs of Stroke Care Coordinator inputs, total health and social care costs, and quality-adjusted life year gains at 6 months, 12 months, and over the year were similar between the groups. Conclusions— This robust trial demonstrated no benefit in clinical or cost-effectiveness outcomes associated with the new system of care compared with usual Stroke Care Coordinator practice. Clinical Trial Registration— URL: http://www.controlled-trials.com. Unique identifier: ISRCTN 67932305. PMID:26152298
Composite Random Fiber Networks
NASA Astrophysics Data System (ADS)
Picu, Catalin; Shahsavari, Ali
2013-03-01
Systems made from fibers are common in the biological and engineering worlds. In many instances, as for example in skin, where elastin and collagen fibers are present, the fiber network is composite, in the sense that it contains fibers of very different properties. The relationship between microstructural parameters and the elastic moduli of random fiber networks containing a single type of fiber is understood. In this work we address a similar target for the composite networks. We show that linear superposition of the contributions to stiffness of individual sub-networks does not apply and interesting non-linear effects are observed. A physical basis of these effects is proposed.
Can randomization be informative?
NASA Astrophysics Data System (ADS)
Pereira, Carlos A. B.; Campos, Thiago F.; Silva, Gustavo M.; Wechsler, Sergio
2012-10-01
In this paper the Pair of Siblings Paradox introduced by Pereira [1] is extended by considering more than two children and more than one child observed for gender. We follow the same lines as Wechsler et al. [2], who generalize the three prisoners' dilemma introduced by Gardner [3]. This paper's conjecture is that the Pair of Siblings and the Three Prisoners dilemma are dual paradoxes. Looking at possible likelihoods, the sure (randomized) selection for the former is non-informative (informative), while the opposite holds for the latter. This situation is maintained for the generalizations. A non-informative likelihood here means that prior and posterior are equal.
Random numbers from vacuum fluctuations
NASA Astrophysics Data System (ADS)
Shi, Yicheng; Chng, Brenda; Kurtsiefer, Christian
2016-07-01
We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read in a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.
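The linear feedback shift register used here for randomness extraction is a standard construction. The paper processes digitized vacuum noise with an LFSR-based extractor; the sketch below shows only the register itself, in Fibonacci form, with tap positions taken from the common 16-bit textbook example rather than the authors' hardware:

```python
def lfsr_bits(state, n, taps=(16, 14, 13, 11), width=16):
    """Fibonacci LFSR: each step outputs the register's low bit, then
    shifts right and feeds the XOR of the tapped bit positions (1-based
    from the LSB) back into the top bit."""
    out = []
    for _ in range(n):
        out.append(state & 1)
        fb = 0
        for t in taps:
            fb ^= (state >> (t - 1)) & 1
        state = (state >> 1) | (fb << (width - 1))
    return out
```

For a given polynomial (tap set) the output stream is fully determined by the initial state, which is why the same register can be reproduced in software for verification of a hardware extractor.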
Random recursive trees and the elephant random walk
NASA Astrophysics Data System (ADS)
Kürsten, Rüdiger
2016-03-01
One class of random walks with infinite memory, so-called elephant random walks, are simple models describing anomalous diffusion. We present a surprising connection between these models and bond percolation on random recursive trees. We use a coupling between the two models to translate results from elephant random walks to the percolation process. We calculate, besides other quantities, exact expressions for the first and the second moment of the root cluster size and of the number of nodes in child clusters of the first generation. We further introduce another model, the skew elephant random walk, and calculate the first and second moment of this process.
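The memory mechanism of the elephant random walk is easy to simulate. The sketch below follows the standard definition (with an unbiased first step; function name illustrative): at each step the walker recalls a uniformly chosen earlier step and repeats it with probability p, reversing it otherwise.

```python
import random

def elephant_walk(n_steps: int, p: float, seed: int = 0):
    """Elephant random walk: step n recalls a uniformly chosen earlier
    step and repeats it with probability p (reverses it otherwise).
    The first step is +1 or -1 with equal probability."""
    rng = random.Random(seed)
    steps = [rng.choice((-1, 1))]
    for _ in range(n_steps - 1):
        recalled = rng.choice(steps)               # uniform over the past
        steps.append(recalled if rng.random() < p else -recalled)
    return steps
```

At p = 1 the walk repeats its first step forever, the fully ordered limit; in the percolation picture of the paper, p plays the role of the bond-retention parameter on the random recursive tree.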
NASA Astrophysics Data System (ADS)
Mak, Chi H.; Pham, Phuong; Afif, Samir A.; Goodman, Myron F.
2015-09-01
Enzymes that rely on random walk to search for substrate targets in a heterogeneously dispersed medium can leave behind complex spatial profiles of their catalyzed conversions. The catalytic signatures of these random-walk enzymes are the result of two coupled stochastic processes: scanning and catalysis. Here we develop analytical models to understand the conversion profiles produced by these enzymes, comparing an intrusive model, in which scanning and catalysis are tightly coupled, against a loosely coupled passive model. Diagrammatic theory and path-integral solutions of these models revealed clearly distinct predictions. Comparison to experimental data from catalyzed deaminations deposited on single-stranded DNA by the enzyme activation-induced deoxycytidine deaminase (AID) demonstrates that catalysis and diffusion are strongly intertwined, where the chemical conversions give rise to new stochastic trajectories that were absent if the substrate DNA was homogeneous. The C →U deamination profiles in both analytical predictions and experiments exhibit a strong contextual dependence, where the conversion rate of each target site is strongly contingent on the identities of other surrounding targets, with the intrusive model showing an excellent fit to the data. These methods can be applied to deduce sequence-dependent catalytic signatures of other DNA modification enzymes, with potential applications to cancer, gene regulation, and epigenetics.
Random rough surface photofabrication
NASA Astrophysics Data System (ADS)
Brissonneau, Vincent; Escoubas, Ludovic; Flory, François; Berginc, Gérard
2011-10-01
Random rough surfaces are of primary interest for their optical properties: reducing reflection at the interface or obtaining a specific scattering diagram, for example. Thus controlling surface statistics during the fabrication process paves the way to original and specific behaviors of reflected optical waves. We detail an experimental method allowing the fabrication of random rough surfaces showing tuned statistical properties. A two-step photoresist exposure process was developed. In order to initiate photoresist polymerization, an energy threshold needs to be reached by light exposure. This energy is brought by a uniform exposure equipment comprising UV-LEDs. This pre-exposure is studied by varying parameters such as optical power and exposure time. The second step consists of an exposure based on the Gray method [1]. The speckle pattern of an enlarged scattered laser beam is used to expose the photoresist. A specific photofabrication bench using an argon ion laser was implemented. Parameters such as exposure time and distances between optical components are discussed. Then, we describe how we modify the speckle-based exposure bench to include a spatial light modulator (SLM). The SLM used is a micromirror matrix known as a Digital Micromirror Device (DMD), which allows spatial modulation by displaying binary images. Thus, the spatial beam shape can be tuned and so the speckle pattern on the photoresist is modified. As the photofabricated surface is correlated with the speckle pattern used to expose it, the roughness parameters can be adjusted.
Random Numbers and Quantum Computers
ERIC Educational Resources Information Center
McCartney, Mark; Glass, David
2002-01-01
The topic of random numbers is investigated in such a way as to illustrate links between mathematics, physics and computer science. First, the generation of random numbers by a classical computer using the linear congruential generator and logistic map is considered. It is noted that these procedures yield only pseudo-random numbers since…
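Both generators named in this abstract are compact enough to sketch. The constants below (the common Numerical Recipes LCG parameters and the fully chaotic logistic map at r = 4) are illustrative choices, not necessarily those used in the article.

```python
from itertools import islice

def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator: x_{n+1} = (a*x_n + c) mod m,
    yielding values normalised to [0, 1)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

def logistic(x0, r=4.0):
    """Logistic map x_{n+1} = r * x_n * (1 - x_n); chaotic for r = 4."""
    x = x0
    while True:
        x = r * x * (1.0 - x)
        yield x

# Deterministic ("pseudo-random") by construction: the same seed
# always reproduces the same sequence.
print(list(islice(lcg(42), 3)))
print(list(islice(logistic(0.3), 3)))
```

Because both maps are deterministic, repeating a seed replays the stream exactly, which is precisely why such outputs are only pseudo-random.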
Work analysis by random sampling.
Divilbiss, J L; Self, P C
1978-01-01
Random sampling of work activities using an electronic random alarm mechanism provided a simple and effective way to determine how time was divided between various activities. At each random alarm the subject simply recorded the time and the activity. Analysis of the data led to reassignment of staff functions and also resulted in additional support for certain critical activities. PMID:626793
Random Selection for Drug Screening
Center for Human Reliability Studies
2007-05-01
Simple random sampling is generally the starting point for a random sampling process. This sampling technique ensures that each individual within a group (population) has an equal chance of being selected. There are a variety of ways to implement random sampling in a practical situation.
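The equal-chance property described above can be demonstrated directly; the population below is a hypothetical group of ten individuals, and `random.sample` is one standard way to implement simple random sampling.

```python
import random
from collections import Counter

def simple_random_sample(population, k, rng=random):
    """Draw k members without replacement; every individual has the
    same probability k/N of being selected."""
    return rng.sample(population, k)

# Empirical check of the equal-chance property: over many draws,
# each of the 10 individuals should be chosen about k/N = 30% of the time.
population = list(range(10))
counts = Counter()
rng = random.Random(0)
for _ in range(10_000):
    counts.update(simple_random_sample(population, 3, rng))
for person in population:
    assert abs(counts[person] / 10_000 - 0.3) < 0.03
```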
Investigating the Randomness of Numbers
ERIC Educational Resources Information Center
Pendleton, Kenn L.
2009-01-01
The use of random numbers is pervasive in today's world. Random numbers have practical applications in such far-flung arenas as computer simulations, cryptography, gambling, the legal system, statistical sampling, and even the war on terrorism. Evaluating the randomness of extremely large samples is a complex, intricate process. However, the…
Randomized Response Analysis in Mplus
ERIC Educational Resources Information Center
Hox, Joop; Lensvelt-Mulders, Gerty
2004-01-01
This article describes a technique to analyze randomized response data using available structural equation modeling (SEM) software. The randomized response technique was developed to obtain estimates that are more valid when studying sensitive topics. The basic feature of all randomized response methods is that the data are deliberately…
HP-PRRSV challenge of 4 and 10-week-old pigs
Technology Transfer Automated Retrieval System (TEKTRAN)
In 2006 a unique syndrome was recognized in growing pigs in China with the predominant clinical signs being high fever, anorexia, listlessness, red discoloration of skin, and respiratory distress. The disease had a very high morbidity and mortality rate and became known as porcine high fever disease...
Jain, Sudhir R; Srivastava, Shashi C L
2008-09-01
We present a Gaussian ensemble of random cyclic matrices on the real field and study their spectral fluctuations. These cyclic matrices are shown to be pseudosymmetric with respect to generalized parity. We calculate the joint probability distribution function of eigenvalues and the spacing distributions analytically and numerically. For small spacings, the level spacing distribution exhibits either a Gaussian or a linear form. Furthermore, for the general case of two arbitrary complex eigenvalues, leaving out the spacings among real eigenvalues and among complex conjugate pairs, we find that the spacing distribution agrees completely with the Wigner distribution for a Poisson process on a plane. Cyclic matrices occur in a wide variety of physical situations, including disordered linear atomic chains and the Ising model in two dimensions. These exact results are also relevant to two-dimensional statistical mechanics and ν-parametrized quantum chromodynamics. PMID:18851127
Reinelt, Douglas A.; van Swol, Frank B.; Kraynik, Andrew Michael
2004-06-01
The Surface Evolver was used to compute the equilibrium microstructure of dry soap foams with random structure and a wide range of cell-size distributions. Topological and geometric properties of foams and individual cells were evaluated. The theory for isotropic Plateau polyhedra describes the dependence of cell geometric properties on their volume and number of faces. The surface area of all cells is about 10% greater than a sphere of equal volume; this leads to a simple but accurate theory for the surface free energy density of foam. A novel parameter based on the surface-volume mean bubble radius R32 is used to characterize foam polydispersity. The foam energy, total cell edge length, and average number of faces per cell all decrease with increasing polydispersity. Pentagonal faces are the most common in monodisperse foam but quadrilaterals take over in highly polydisperse structures.
CONTOURING RANDOMLY SPACED DATA
NASA Technical Reports Server (NTRS)
Hamm, R. W.
1994-01-01
This program prepares contour plots of three-dimensional randomly spaced data. The contouring techniques use a triangulation procedure developed by Dr. C. L. Lawson of the Jet Propulsion Laboratory which allows the contouring of randomly spaced input data without first fitting the data into a rectangular grid. The program also allows contour points to be fitted with a smooth curve using an interpolating spline under tension. The input data points to be contoured are read from a magnetic tape or disk file with one record for each data point. Each record contains the X and Y coordinates, the value to be contoured, and an alternate contour value (if applicable). The contour data is then partitioned by the program to reduce core storage requirements. Output consists of the contour plots and user messages. Several output options are available to the user, such as: controlling which value in the data record is to be contoured, whether contours are drawn by polygonal lines or by a spline under tension (smooth curves), and controlling the contour level labels, which may be suppressed if desired. The program can handle up to 56,000 data points and provide up to 20 contour intervals for multiple parameters. This program was written in FORTRAN IV for implementation on a CDC 6600 computer using CALCOMP plotting capabilities. The field length required is dependent upon the number of data points to be contoured. The program requires 42K octal storage locations plus the larger of: 24 times the maximum number of points in each data partition (defaults to a maximum of 1000 data points in each partition with 20 percent overlap) or 2K plus four times the total number of points to be plotted. This program was developed in 1975.
How random are random numbers generated using photons?
NASA Astrophysics Data System (ADS)
Solis, Aldo; Angulo Martínez, Alí M.; Ramírez Alarcón, Roberto; Cruz Ramírez, Hector; U'Ren, Alfred B.; Hirsch, Jorge G.
2015-06-01
Randomness is fundamental in quantum theory, with many philosophical and practical implications. In this paper we discuss the concept of algorithmic randomness, which provides a quantitative method to assess the Borel normality of a given sequence of numbers, a necessary condition for it to be considered random. We use Borel normality as a tool to investigate the randomness of ten sequences of bits generated from the differences between detection times of photon pairs generated by spontaneous parametric downconversion. These sequences are shown to fulfil the randomness criteria without difficulties. As deviations from Borel normality for photon-generated random number sequences have been reported in previous work, a strategy to understand these diverging findings is outlined.
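Borel normality requires every k-bit block to occur with asymptotic frequency 2^-k. A crude frequency check on non-overlapping blocks can be sketched as follows; the tolerance and block lengths are illustrative choices, not the Calude-style bound used in the paper.

```python
import random
from collections import Counter

def block_frequencies(bits, k):
    """Frequencies of non-overlapping k-bit blocks in a bit string."""
    blocks = [bits[i:i + k] for i in range(0, len(bits) - k + 1, k)]
    counts = Counter(blocks)
    n = len(blocks)
    return {b: counts[b] / n for b in counts}

def is_borel_normal(bits, max_k=3, tol=0.02):
    """Crude check: every k-bit block frequency within tol of 2**-k
    for k = 1 .. max_k."""
    for k in range(1, max_k + 1):
        freqs = block_frequencies(bits, k)
        expected = 2.0 ** -k
        for pattern in (format(i, f'0{k}b') for i in range(2 ** k)):
            if abs(freqs.get(pattern, 0.0) - expected) > tol:
                return False
    return True

rng = random.Random(7)
bits = ''.join(rng.choice('01') for _ in range(200_000))
print(is_borel_normal(bits))            # pseudo-random bits pass
print(is_borel_normal('01' * 100_000))  # periodic string fails at k = 2
```

The periodic string passes the single-bit test (equal numbers of 0s and 1s) but fails at k = 2, which is why normality must be checked at several block lengths.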
Quantum random walk polynomial and quantum random walk measure
NASA Astrophysics Data System (ADS)
Kang, Yuanbao; Wang, Caishi
2014-05-01
In the paper, we introduce a quantum random walk polynomial (QRWP), defined as a polynomial sequence that is orthogonal with respect to a quantum random walk measure (QRWM), with parameters appearing in the recurrence relations. We first obtain some results on QRWPs and QRWMs, in which case the correspondence between measures and orthogonal polynomial sequences is one-to-one. We show that any measure with respect to which a quantum random walk polynomial sequence is orthogonal is a quantum random walk measure. We next collect some properties of QRWMs; moreover, we extend Karlin and McGregor's representation formula for the transition probabilities of a quantum random walk (QRW) in the interacting Fock space, a result parallel to the CGMV method. Using these findings, we finally obtain some applications of QRWMs that are of interest in the study of quantum random walks, highlighting the role played by QRWPs and QRWMs.
Nonvolatile random access memory
NASA Technical Reports Server (NTRS)
Wu, Jiin-Chuan (Inventor); Stadler, Henry L. (Inventor); Katti, Romney R. (Inventor)
1994-01-01
A nonvolatile magnetic random access memory can be achieved by an array of magnet-Hall effect (M-H) elements. The storage function is realized with a rectangular thin-film ferromagnetic material having an in-plane, uniaxial anisotropy and in-plane bipolar remanent magnetization states. The thin-film magnetic element is magnetized by a local applied field, whose direction is used to form either a 0 or 1 state. The element remains in the 0 or 1 state until a switching field is applied to change its state. The stored information is detected by a Hall-effect sensor which senses the fringing field from the magnetic storage element. The circuit design for addressing each cell includes transistor switches for providing a current of selected polarity to store a binary digit through a separate conductor overlying the magnetic element of the cell. To read out a stored binary digit, transistor switches are employed to provide a current through a row of Hall-effect sensors connected in series and enabling a differential voltage amplifier connected to all Hall-effect sensors of a column in series. To avoid read-out voltage errors due to shunt currents through resistive loads of the Hall-effect sensors of other cells in the same column, at least one transistor switch is provided between every pair of adjacent cells in every row which are not turned on except in the row of the selected cell.
Spatially embedded random networks.
Barnett, L; Di Paolo, E; Bullock, S
2007-11-01
Many real-world networks analyzed in modern network theory have a natural spatial element; e.g., the Internet, social networks, neural networks, etc. Yet, aside from a comparatively small number of somewhat specialized and domain-specific studies, the spatial element is mostly ignored and, in particular, its relation to network structure disregarded. In this paper we introduce a model framework to analyze the mediation of network structure by spatial embedding; specifically, we model connectivity as dependent on the distance between network nodes. Our spatially embedded random networks construction is not primarily intended as an accurate model of any specific class of real-world networks, but rather to gain intuition for the effects of spatial embedding on network structure; nevertheless we are able to demonstrate, in a quite general setting, some constraints of spatial embedding on connectivity such as the effects of spatial symmetry, conditions for scale free degree distributions and the existence of small-world spatial networks. We also derive some standard structural statistics for spatially embedded networks and illustrate the application of our model framework with concrete examples. PMID:18233726
Spatially embedded random networks
NASA Astrophysics Data System (ADS)
Barnett, L.; di Paolo, E.; Bullock, S.
2007-11-01
Many real-world networks analyzed in modern network theory have a natural spatial element; e.g., the Internet, social networks, neural networks, etc. Yet, aside from a comparatively small number of somewhat specialized and domain-specific studies, the spatial element is mostly ignored and, in particular, its relation to network structure disregarded. In this paper we introduce a model framework to analyze the mediation of network structure by spatial embedding; specifically, we model connectivity as dependent on the distance between network nodes. Our spatially embedded random networks construction is not primarily intended as an accurate model of any specific class of real-world networks, but rather to gain intuition for the effects of spatial embedding on network structure; nevertheless we are able to demonstrate, in a quite general setting, some constraints of spatial embedding on connectivity such as the effects of spatial symmetry, conditions for scale free degree distributions and the existence of small-world spatial networks. We also derive some standard structural statistics for spatially embedded networks and illustrate the application of our model framework with concrete examples.
Does Random Dispersion Help Survival?
NASA Astrophysics Data System (ADS)
Schinazi, Rinaldo B.
2015-04-01
Many species live in colonies that prosper for a while and then collapse. After the collapse the colony survivors disperse randomly and found new colonies that may or may not make it depending on the new environment they find. We use birth and death chains in random environments to model such a population and to argue that random dispersion is a superior strategy for survival.
Leadership statistics in random structures
NASA Astrophysics Data System (ADS)
Ben-Naim, E.; Krapivsky, P. L.
2004-01-01
The largest component ("the leader") in evolving random structures often exhibits universal statistical properties. This phenomenon is demonstrated analytically for two ubiquitous structures: random trees and random graphs. In both cases, lead changes are rare: the average number of lead changes grows only quadratically with the logarithm of the system size. As a function of time, the number of lead changes is self-similar. Additionally, the probability that no lead change ever occurs decays exponentially with the average number of lead changes.
A Random Variable Transformation Process.
ERIC Educational Resources Information Center
Scheuermann, Larry
1989-01-01
Provides a short BASIC program, RANVAR, which generates random variates for various theoretical probability distributions. The seven variates include: uniform, exponential, normal, binomial, Poisson, Pascal, and triangular. (MVL)
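The original BASIC program RANVAR is not reproduced in the abstract; a sketch of the underlying inverse-transform idea for two of the seven listed variates (exponential and triangular) might look like this in Python, with the function names and signatures my own.

```python
import math
import random

def exponential_variate(rate, u=None, rng=random):
    """Inverse transform: if U ~ Uniform(0,1),
    then -ln(1-U)/rate ~ Exponential(rate)."""
    if u is None:
        u = rng.random()
    return -math.log(1.0 - u) / rate

def triangular_variate(a, c, b, u=None, rng=random):
    """Triangular distribution on [a, b] with mode c,
    sampled by inverting its piecewise-quadratic CDF."""
    if u is None:
        u = rng.random()
    split = (c - a) / (b - a)          # CDF value at the mode
    if u < split:
        return a + math.sqrt(u * (b - a) * (c - a))
    return b - math.sqrt((1.0 - u) * (b - a) * (b - c))
```

The same pattern (invert the CDF, feed it a uniform variate) extends to any distribution with a tractable inverse; discrete variates such as the binomial or Poisson are usually generated by table search or acceptance methods instead.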
Mazenko, Gene F
2008-09-01
We study the random diffusion model, a continuum model for a conserved scalar density field φ driven by diffusive dynamics. The interesting feature of the dynamics is that the bare diffusion coefficient D is density dependent. In the simplest case, D = D̄ + D₁δφ, where D̄ is the constant average diffusion constant. In the case where the driving effective Hamiltonian is quadratic, the model can be treated using perturbation theory in terms of the single nonlinear coupling D₁. We develop perturbation theory to fourth order in D₁. There are two ways of analyzing this perturbation theory. In one approach, developed by Kawasaki, at one-loop order one finds mode-coupling theory with an ergodic-nonergodic transition. An alternative, more direct interpretation at one-loop order leads to a slowing down as the nonlinear coupling increases. Eventually one hits a critical coupling where the time decay becomes algebraic. Near this critical coupling a weak peak develops at a wave number well above the peak at q = 0 associated with the conservation law. The width of this peak in Fourier space decreases with time and can be identified with a characteristic kinetic length which grows as a power law in time. For stronger coupling the system becomes metastable and then unstable. At two-loop order it is shown that the ergodic-nonergodic transition is not supported. It is demonstrated that the critical properties of the direct approach survive at higher order in perturbation theory.
NASA Astrophysics Data System (ADS)
Jung, P.; Talkner, P.
2010-09-01
A simple way to convert a purely random sequence of events into a signal with a strong periodic component is proposed. The signal consists of those instants of time at which the length of the random sequence exceeds an integer multiple of a given number. The larger this number the more pronounced the periodic behavior becomes.
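The construction described above can be paraphrased in code: accumulate random increments and emit a tick whenever the running total crosses the next integer multiple of a fixed threshold. The threshold and increment distribution below are illustrative choices.

```python
import random

def crossing_times(increments, threshold):
    """Indices at which the running total of `increments` first
    exceeds each successive integer multiple of `threshold`."""
    ticks = []
    total = 0.0
    next_level = threshold
    for t, inc in enumerate(increments):
        total += inc
        while total > next_level:   # one large increment may cross
            ticks.append(t)         # several levels at once
            next_level += threshold
    return ticks

rng = random.Random(1)
incs = [rng.expovariate(1.0) for _ in range(100_000)]
ticks = crossing_times(incs, threshold=50.0)
gaps = [b - a for a, b in zip(ticks, ticks[1:])]
mean_gap = sum(gaps) / len(gaps)
# By renewal arguments the mean tick spacing is about
# threshold / mean(increment) = 50 events, and the relative spread
# of the spacings shrinks as the threshold grows -- the "strong
# periodic component" of the abstract.
```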
Students' Misconceptions about Random Variables
ERIC Educational Resources Information Center
Kachapova, Farida; Kachapov, Ilias
2012-01-01
This article describes some misconceptions about random variables and related counter-examples, and makes suggestions about teaching initial topics on random variables in general form instead of doing it separately for discrete and continuous cases. The focus is on post-calculus probability courses. (Contains 2 figures.)
Randomness versus Nonlocality and Entanglement
NASA Astrophysics Data System (ADS)
Acín, Antonio; Massar, Serge; Pironio, Stefano
2012-03-01
The outcomes obtained in Bell tests involving two-outcome measurements on two subsystems can, in principle, generate up to 2 bits of randomness. However, the maximal violation of the Clauser-Horne-Shimony-Holt inequality guarantees the generation of only 1.23 bits of randomness. We prove here that quantum correlations with arbitrarily little nonlocality and states with arbitrarily little entanglement can be used to certify that close to the maximum of 2 bits of randomness are produced. Our results show that nonlocality, entanglement, and randomness are inequivalent quantities. They also imply that device-independent quantum key distribution with an optimal key generation rate is possible by using almost-local correlations and that device-independent randomness generation with an optimal rate is possible with almost-local correlations and with almost-unentangled states.
NASA Astrophysics Data System (ADS)
Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang
2016-09-01
Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.
Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang
2016-01-01
Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests. PMID:27666514
Random sequential adsorption on fractals.
Ciesla, Michal; Barbasz, Jakub
2012-07-28
Irreversible adsorption of spheres on flat collectors having dimension d < 2 is studied. Molecules are adsorbed on Sierpinski's triangle and carpet-like fractals (1 < d < 2), and on a general Cantor set (d < 1). The adsorption process is modeled numerically using the random sequential adsorption (RSA) algorithm. The paper concentrates on measuring fundamental properties of the coverages, i.e., the maximal random coverage ratio and the density autocorrelation function, as well as RSA kinetics. The obtained results allow us to improve the phenomenological relation between the maximal random coverage ratio and the collector dimension. Moreover, the simulations show that, in general, most of the known dimensional properties of adsorbed monolayers remain valid for non-integer dimensions.
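The paper applies RSA to fractal collectors; the core of the algorithm is easiest to see in a minimal one-dimensional version, dimer adsorption on a lattice. Processing all landing positions once, in uniformly random order, is equivalent to repeated uniform attempts until jamming, since a blocked position stays blocked.

```python
import random

def rsa_dimers(n_sites, rng=random):
    """Random sequential adsorption of dimers on a 1-D lattice:
    visit candidate positions in random arrival order; occupy a
    site and its right neighbour if both are empty; the lattice
    jams when no adjacent empty pair remains. Returns coverage."""
    occupied = [False] * n_sites
    candidates = list(range(n_sites - 1))
    rng.shuffle(candidates)            # random arrival order
    for i in candidates:
        if not occupied[i] and not occupied[i + 1]:
            occupied[i] = occupied[i + 1] = True
    return sum(occupied) / n_sites

coverage = rsa_dimers(100_000, random.Random(5))
# The known jamming limit for 1-D dimer RSA is 1 - e^{-2} ≈ 0.8647;
# the simulated coverage approaches it as n_sites grows.
```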
Spieker, Susan J.; Oxford, Monica L.; Kelly, Jean F.; Nelson, Elizabeth M.; Fleming, Charles B.
2013-01-01
We conducted a community based, randomized control trial of Promoting First Relationships (PFR; Kelly, Sandoval, Zuckerman, & Buehlman, 2008) to improve parenting and toddler outcomes for toddlers in state dependency. Toddlers (10 – 24 months; N = 210) with a recent placement disruption were randomized to 10-week PFR or a comparison condition. Community agency providers were trained to use PFR in the intervention for caregivers. From baseline to post-intervention follow-up, observational ratings of caregiver sensitivity improved more in the PFR condition than in the comparison condition, with an effect size for the difference in adjusted means post-intervention of d = .41. Caregiver understanding of toddlers’ social emotional needs and caregiver reports of child competence also differed by intervention condition post-intervention (d = .36 and d = .42) with caregivers in the PFR condition reporting more understanding of toddlers and child competence. Models of PFR effects on within-individual change were significant for caregiver sensitivity and understanding of toddlers. At the 6-month follow-up 61% of original sample dyads were still intact and there were no significant differences on caregiver or child outcomes, although caregivers in the PFR group did report marginally (p<.10) fewer child sleep problems (d = −.34). PMID:22949743
An Integrated Intervention in Pregnant African Americans Reduces Postpartum Risk: A Randomized Trial
El-Mohandes, Ayman A.E.; Kiely, Michele; Joseph, Jill G.; Subramanian, Siva; Johnson, Allan A.; Blake, Susan M.; Gantz, Marie G.; El-Khorazaty, M. Nabil
2010-01-01
Objective To evaluate the efficacy of an integrated multiple-risk intervention, delivered mainly during pregnancy, in reducing such risks (smoking, environmental tobacco smoke exposure, depression, and intimate partner violence) postpartum. Design Data from this randomized controlled trial were collected prenatally and on average 10 weeks postpartum in six prenatal care sites in the District of Columbia. African Americans were screened, recruited, and randomly assigned to the behavioral intervention or usual care. Clinic-based, individually tailored counseling was delivered to intervention women. The outcome measures were the number of risks reported postpartum and the reduction of these risks between baseline and postpartum. Results The intervention was effective in significantly reducing the number of risks reported in the postpartum period. In bivariate analyses, the intervention group was more successful in resolving all risks (47% compared with 35%, p=0.007; number needed to treat=9, 95% confidence interval [CI] 5-31) and in resolving some risks (63% compared with 54%, p=0.009; number needed to treat=11, 95% CI 7-43) as compared with the usual care group. In logistic regression analyses, women in the intervention group were more likely to resolve all risks (OR=1.86, 95% CI 1.25-2.75) and to resolve at least one risk (OR=1.6, 95% CI 1.15-2.22). Conclusions An integrated multiple-risk-factor intervention addressing psychosocial and behavioral risks, delivered mainly during pregnancy, can have beneficial effects in risk reduction postpartum. PMID:18757660
Control theory for random systems
NASA Technical Reports Server (NTRS)
Bryson, A. E., Jr.
1972-01-01
A survey is presented of the current knowledge available for designing and predicting the effectiveness of controllers for dynamic systems which can be modeled by ordinary differential equations. A short discussion of feedback control is followed by a description of deterministic controller design and the concept of system state. The need for more realistic disturbance models led to the use of stochastic process concepts, in particular the Gauss-Markov process. A compensator controlled system, with random forcing functions, random errors in the measurements, and random initial conditions, is treated as constituting a Gauss-Markov random process; hence the mean-square behavior of the controlled system is readily predicted. As an example, a compensator is designed for a helicopter to maintain it in hover in a gusty wind over a point on the ground.
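The Gauss-Markov disturbance model mentioned above is simple to simulate; a first-order discrete sketch (coefficients hypothetical, not from the survey) shows the key property that the mean-square behavior is predictable in closed form.

```python
import random

def simulate_gauss_markov(a, sigma_w, n, rng):
    """First-order Gauss-Markov process x_{k+1} = a*x_k + w_k with
    w_k ~ N(0, sigma_w^2). For |a| < 1 the stationary variance is
    sigma_w^2 / (1 - a^2)."""
    x, xs = 0.0, []
    for _ in range(n):
        x = a * x + rng.gauss(0.0, sigma_w)
        xs.append(x)
    return xs

rng = random.Random(2)
xs = simulate_gauss_markov(0.9, 1.0, 200_000, rng)
var = sum(v * v for v in xs) / len(xs)
# theory: 1 / (1 - 0.9**2) ≈ 5.26; the sample variance converges to it
```

This closed-form variance is exactly why, as the abstract notes, the mean-square behavior of a system driven by Gauss-Markov disturbances is readily predicted.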
Quantifying randomness in real networks
NASA Astrophysics Data System (ADS)
Orsini, Chiara; Dankulov, Marija M.; Colomer-de-Simón, Pol; Jamakovic, Almerima; Mahadevan, Priya; Vahdat, Amin; Bassler, Kevin E.; Toroczkai, Zoltán; Boguñá, Marián; Caldarelli, Guido; Fortunato, Santo; Krioukov, Dmitri
2015-10-01
Represented as graphs, real networks are intricate combinations of order and disorder. Fixing some of the structural properties of network models to their values observed in real networks, many other properties appear as statistical consequences of these fixed observables, plus randomness in other respects. Here we employ the dk-series, a complete set of basic characteristics of the network structure, to study the statistical dependencies between different network properties. We consider six real networks--the Internet, US airport network, human protein interactions, technosocial web of trust, English word network, and an fMRI map of the human brain--and find that many important local and global structural properties of these networks are closely reproduced by dk-random graphs whose degree distributions, degree correlations and clustering are as in the corresponding real network. We discuss important conceptual, methodological, and practical implications of this evaluation of network randomness, and release software to generate dk-random graphs.
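The dk-random graphs above fix certain structural properties and randomize the rest. This is not the authors' released software, but the simplest member of the family (a 1k-randomization, preserving every node's degree) can be sketched with double edge swaps:

```python
import random

def double_edge_swaps(edges, n_swaps, rng=random):
    """Randomize a simple undirected graph while preserving each
    node's degree: repeatedly pick two edges (a,b), (c,d) and rewire
    them to (a,d), (c,b), rejecting swaps that would create a
    self-loop or a duplicate edge."""
    edges = [tuple(e) for e in edges]
    edge_set = set(frozenset(e) for e in edges)
    done = 0
    while done < n_swaps:
        i = rng.randrange(len(edges))
        j = rng.randrange(len(edges))
        (a, b), (c, d) = edges[i], edges[j]
        if len({a, b, c, d}) < 4:          # shared endpoint: self-loop risk
            continue
        new1, new2 = frozenset((a, d)), frozenset((c, b))
        if new1 in edge_set or new2 in edge_set:   # would duplicate an edge
            continue
        edge_set -= {frozenset((a, b)), frozenset((c, d))}
        edge_set |= {new1, new2}
        edges[i], edges[j] = (a, d), (c, b)
        done += 1
    return edges
```

Higher members of the dk-series additionally preserve degree correlations (2k) and clustering, which requires more constrained rewiring than this sketch.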
Random Selection for Drug Screening
Center for Human Reliability Studies
2007-05-01
Sampling is the process of choosing some members out of a group or population. Probability sampling, or random sampling, is the process of selecting members by chance with a known probability of each individual being chosen.
Quantifying randomness in real networks.
Orsini, Chiara; Dankulov, Marija M; Colomer-de-Simón, Pol; Jamakovic, Almerima; Mahadevan, Priya; Vahdat, Amin; Bassler, Kevin E; Toroczkai, Zoltán; Boguñá, Marián; Caldarelli, Guido; Fortunato, Santo; Krioukov, Dmitri
2015-10-20
Represented as graphs, real networks are intricate combinations of order and disorder. Fixing some of the structural properties of network models to their values observed in real networks, many other properties appear as statistical consequences of these fixed observables, plus randomness in other respects. Here we employ the dk-series, a complete set of basic characteristics of the network structure, to study the statistical dependencies between different network properties. We consider six real networks--the Internet, US airport network, human protein interactions, technosocial web of trust, English word network, and an fMRI map of the human brain--and find that many important local and global structural properties of these networks are closely reproduced by dk-random graphs whose degree distributions, degree correlations and clustering are as in the corresponding real network. We discuss important conceptual, methodological, and practical implications of this evaluation of network randomness, and release software to generate dk-random graphs.
Quantifying randomness in real networks
Orsini, Chiara; Dankulov, Marija M.; Colomer-de-Simón, Pol; Jamakovic, Almerima; Mahadevan, Priya; Vahdat, Amin; Bassler, Kevin E.; Toroczkai, Zoltán; Boguñá, Marián; Caldarelli, Guido; Fortunato, Santo; Krioukov, Dmitri
2015-01-01
Represented as graphs, real networks are intricate combinations of order and disorder. Fixing some of the structural properties of network models to their values observed in real networks, many other properties appear as statistical consequences of these fixed observables, plus randomness in other respects. Here we employ the dk-series, a complete set of basic characteristics of the network structure, to study the statistical dependencies between different network properties. We consider six real networks—the Internet, US airport network, human protein interactions, technosocial web of trust, English word network, and an fMRI map of the human brain—and find that many important local and global structural properties of these networks are closely reproduced by dk-random graphs whose degree distributions, degree correlations and clustering are as in the corresponding real network. We discuss important conceptual, methodological, and practical implications of this evaluation of network randomness, and release software to generate dk-random graphs. PMID:26482121
Diffraction by random Ronchi gratings.
Torcal-Milla, Francisco Jose; Sanchez-Brea, Luis Miguel
2016-08-01
In this work, we obtain analytical expressions for the near- and far-field diffraction of random Ronchi diffraction gratings in which the slits of the grating are randomly displaced around their periodic positions. We theoretically show that randomness in the slit positions produces a decrease of the contrast, and even the disappearance of the self-images for high randomness levels, in the near field. In the far field, it cancels high-order harmonics, leaving only a few central diffraction orders. Numerical simulations by means of the Rayleigh-Sommerfeld diffraction formula are performed to corroborate the analytical results. These results are of interest for industrial and technological applications where manufacturing errors need to be considered.
Cluster randomization and political philosophy.
Chwang, Eric
2012-11-01
In this paper, I will argue that, while the ethical issues raised by cluster randomization can be challenging, they are not new. My thesis divides neatly into two parts. In the first, easier part I argue that many of the ethical challenges posed by cluster randomized human subjects research are clearly present in other types of human subjects research, and so are not novel. In the second, more difficult part I discuss the thorniest ethical challenge for cluster randomized research--cases where consent is genuinely impractical to obtain. I argue that once again these cases require no new analytic insight; instead, we should look to political philosophy for guidance. In other words, the most serious ethical problem that arises in cluster randomized research also arises in political philosophy.
Quantum entanglement from random measurements
NASA Astrophysics Data System (ADS)
Tran, Minh Cong; Dakić, Borivoje; Arnault, François; Laskowski, Wiesław; Paterek, Tomasz
2015-11-01
We show that the expectation value of squared correlations measured along random local directions is an identifier of quantum entanglement in pure states, which can be directly experimentally assessed if two copies of the state are available. Entanglement can therefore be detected by parties who do not share a common reference frame and whose local reference frames, such as polarizers or Stern-Gerlach magnets, remain unknown. Furthermore, we also show that in every experimental run, access to only one qubit from the macroscopic reference is sufficient to identify entanglement, violate a Bell inequality, and, in fact, observe all phenomena observable with macroscopic references. Finally, we provide a state-independent entanglement witness solely in terms of random correlations and emphasize how data gathered for a single random measurement setting per party reliably detects entanglement. This is only possible due to utilized randomness and should find practical applications in experimental confirmation of multiphoton entanglement or space experiments.
Jay, Kenneth; Brandt, Mikkel; Jakobsen, Markus Due; Sundstrup, Emil; Berthelsen, Kasper Gymoese; Schraefel, Mc; Sjøgaard, Gisela; Andersen, Lars L
2016-08-01
People with chronic musculoskeletal pain often experience pain-related fear of movement and avoidance behavior. The Fear-Avoidance model proposes a possible mechanism that at least partly explains the development and maintenance of chronic pain. People who interpret pain during movement as potentially harmful to the organism may initiate a vicious behavioral cycle by generating pain-related fear of movement accompanied by avoidance behavior and hyper-vigilance. This study investigates whether an individually adapted multifactorial approach comprising biopsychosocial elements, with a focus on physical exercise, mindfulness, and education on pain and behavior, can decrease work-related fear-avoidance beliefs. As part of a large-scale 10-week worksite randomized controlled intervention trial focusing on company initiatives to combat work-related musculoskeletal pain and stress, we evaluated fear-avoidance behavior in 112 female laboratory technicians with chronic neck, shoulder, upper back, lower back, elbow, and hand/wrist pain using the Fear-Avoidance Beliefs Questionnaire at baseline, before group allocation, and again at the post-intervention follow-up 10 weeks later. A significant group-by-time interaction was observed (P < 0.05) for work-related fear-avoidance beliefs. The between-group difference at follow-up was -2.2 (-4.0 to -0.5), corresponding to a small to medium effect size (Cohen's d = 0.30). Our study shows that work-related, but not leisure-time activity-related, fear-avoidance beliefs, as assessed by the Fear-Avoidance Beliefs Questionnaire, can be significantly reduced by 10 weeks of physical-cognitive-mindfulness training in female laboratory technicians with chronic pain. PMID:27559939
Armah, George; Lewis, Kristen D. C.; Cortese, Margaret M.; Parashar, Umesh D.; Ansah, Akosua; Gazley, Lauren; Victor, John C.; McNeal, Monica M.; Binka, Fred; Steele, A. Duncan
2016-01-01
Background. The recommended schedule for receipt of 2-dose human rotavirus vaccine (HRV) coincides with receipt of the first and second doses of diphtheria, pertussis, and tetanus vaccine (ie, 6 and 10 weeks of age, respectively). Alternative schedules and additional doses of HRV have been proposed and may improve vaccine performance in low-income countries. Methods. In this randomized trial in rural Ghana, HRV was administered at ages 6 and 10 weeks (group 1), 10 and 14 weeks (group 2), or 6, 10, and 14 weeks (group 3). We compared serum antirotavirus immunoglobulin A (IgA) seroconversion (≥20 U/mL) and geometric mean concentrations (GMCs) between group 1 and groups 2 and 3. Results. Ninety-three percent of participants (424 of 456) completed the study per protocol. In groups 1, 2, and 3, the IgA seroconversion frequencies among participants with IgA levels of <20 U/mL at baseline were 28.9%, 37.4%, and 43.4%, respectively (group 1 vs group 3, P = .014; group 1 vs group 2, P = .163). Postvaccination IgA GMCs were 22.1 U/mL, 26.5 U/mL, and 32.6 U/mL in groups 1, 2, and 3, respectively (group 1 vs group 3, P = .038; group 1 vs group 2, P = .304). Conclusions. A third dose of HRV resulted in increased seroconversion frequencies and GMCs, compared with 2 doses administered at 6 and 10 weeks of age. Since there is no correlate of protection, a postmarketing effectiveness study is required to determine whether the improvement in immune response translates into a public health benefit in low-income countries. Clinical Trials Registration. NCT015751. PMID:26823335
Quasi-Random Sequence Generators.
1994-03-01
Version 00 LPTAU generates quasi-random sequences. The sequences are uniformly distributed sets of L=2**30 points in the N-dimensional unit cube I**N=[0,1]**N. The sequences are used as nodes for multidimensional integration, as searching points in global optimization, as trial points in multicriteria decision making, and as quasi-random points for quasi-Monte Carlo algorithms.
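The integration use case can be sketched with a modern quasi-random generator. This assumes SciPy's Sobol sampler as a stand-in for LPTAU (both produce low-discrepancy LP_tau/Sobol-type sequences); the test integrand and dimensions are illustrative choices.

```python
import numpy as np
from scipy.stats import qmc

# Quasi-random nodes for multidimensional integration: estimate the
# integral of f(x) = prod(2*x_i) over [0,1]^5, whose exact value is 1.
dim = 5
sampler = qmc.Sobol(d=dim, scramble=True, seed=0)
x = sampler.random_base2(m=12)                  # 2**12 quasi-random points
estimate = np.mean(np.prod(2.0 * x, axis=1))
print(estimate)  # close to 1
```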
Staggered chiral random matrix theory
Osborn, James C.
2011-02-01
We present a random matrix theory for the staggered lattice QCD Dirac operator. The staggered random matrix theory is equivalent to the zero-momentum limit of the staggered chiral Lagrangian and includes all taste breaking terms at their leading order. This is an extension of previous work which only included some of the taste breaking terms. We will also present some results for the taste breaking contributions to the partition function and the Dirac eigenvalues.
On Pfaffian Random Point Fields
NASA Astrophysics Data System (ADS)
Kargin, V.
2014-02-01
We study Pfaffian random point fields by using the Moore-Dyson quaternion determinants. First, we give sufficient conditions that ensure that a self-dual quaternion kernel defines a valid random point field, and then we prove a CLT for Pfaffian point fields. The proofs are based on a new quaternion extension of the Cauchy-Binet determinantal identity. In addition, we derive the Fredholm determinantal formulas for the Pfaffian point fields which use the quaternion determinant.
Randomness and degrees of irregularity.
Pincus, S; Singer, B H
1996-01-01
The fundamental question "Are sequential data random?" arises in myriad contexts, often with severe data length constraints. Furthermore, there is frequently a critical need to delineate nonrandom sequences in terms of closeness to randomness--e.g., to evaluate the efficacy of therapy in medicine. We address both these issues from a computable framework via a quantification of regularity: ApEn (approximate entropy) defines maximal randomness for sequences of arbitrary length and is applicable to sequences as short as N = 5 points. An infinite sequence formulation of randomness is introduced that retains the operational (and computable) features of the finite case. In the infinite sequence setting, we indicate how the "foundational" definition of independence in probability theory, and the definition of normality in number theory, reduce to limit theorems without rates of convergence; we then utilize ApEn to address rates of convergence (of a deficit from maximal randomness), refining the aforementioned concepts in a computationally essential manner. Representative applications among many are indicated to assess (i) random number generation output; (ii) well-shuffled arrangements; and (iii) (the quality of) bootstrap replicates. PMID:11607637
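ApEn is short enough to implement directly. The following is a minimal sketch of Pincus's standard formulation (conventional window m = 2 and tolerance r; both are parameter choices, not values from this abstract): a perfectly regular sequence should score near zero.

```python
import numpy as np

def apen(u, m=2, r=0.2):
    """Approximate entropy ApEn(m, r) of a 1-D sequence u.
    ApEn = Phi(m) - Phi(m+1), where Phi(m) is the average log-frequency with
    which length-m templates repeat within tolerance r (max norm)."""
    u = np.asarray(u, dtype=float)

    def phi(m):
        n = len(u) - m + 1
        x = np.array([u[i:i + m] for i in range(n)])
        # C_i: fraction of templates within distance r of template i
        c = np.array([np.mean(np.max(np.abs(x - xi), axis=1) <= r)
                      for xi in x])
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

regular = [1.0, 2.0] * 50   # perfectly alternating: minimal irregularity
print(apen(regular))        # near 0
```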
Experimental evidence of quantum randomness incomputability
Calude, Cristian S.; Dinneen, Michael J.; Dumitrescu, Monica; Svozil, Karl
2010-08-15
In contrast with software-generated randomness (called pseudo-randomness), quantum randomness can be proven incomputable; that is, it is not exactly reproducible by any algorithm. We provide experimental evidence of incomputability--an asymptotic property--of quantum randomness by performing finite tests of randomness inspired by algorithmic information theory.
Phase Transitions on Random Lattices: How Random is Topological Disorder?
NASA Astrophysics Data System (ADS)
Barghathi, Hatem; Vojta, Thomas
2015-03-01
We study the effects of topological (connectivity) disorder on phase transitions. We identify a broad class of random lattices whose disorder fluctuations decay much faster with increasing length scale than those of generic random systems, yielding a wandering exponent of ω = (d - 1) / (2 d) in d dimensions. The stability of clean critical points is thus governed by the criterion (d + 1) ν > 2 rather than the usual Harris criterion dν > 2 , making topological disorder less relevant than generic randomness. The Imry-Ma criterion is also modified, allowing first-order transitions to survive in all dimensions d > 1 . These results explain a host of puzzling violations of the original criteria for equilibrium and nonequilibrium phase transitions on random lattices. We discuss applications, and we illustrate our theory by computer simulations of random Voronoi and other lattices. This work was supported by the NSF under Grant Nos. DMR-1205803 and PHYS-1066293. We acknowledge the hospitality of the Aspen Center for Physics.
Subset selection under random censorship
Kim, J.S.
1983-03-01
Suppose we want to model a situation commonly arising, for example, in industrial life-testing, in which a two-component series system is under study. The system functions if and only if both the Type A component and the Type B component are functioning. The distribution, or an unknown parameter in the distribution, of the Type A component is of interest. Let X_1, X_2, ..., X_n be independent and identically distributed random variables denoting the lifelengths of n Type A components with a continuous distribution function F, and let Y_1, Y_2, ..., Y_n be independent and identically distributed random variables denoting the lifelengths of n Type B components, also with a continuous distribution function H(.). Failure of the Type B component causes the system failure, thereby making it impossible to observe the failure time of the Type A component. The random variables Y_1, Y_2, ..., Y_n are referred to as time-to-censorship or censoring random variables, and the distribution function H(.) as the censoring distribution. We assume that (X_1, Y_1), (X_2, Y_2), ..., (X_n, Y_n) is an independent and identically distributed sequence of random pairs defined on a common probability space. Our observations consist of the minima Z_1 = min(X_1, Y_1), Z_2 = min(X_2, Y_2), ..., Z_n = min(X_n, Y_n), which are i.i.d. random variables. It is the objective of this paper to formulate a k-sample selection problem under random censorship.
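The censoring mechanism is easy to simulate. This sketch assumes exponential lifelengths with illustrative rates (not from the paper): for independent exponentials with rates a and b, the A-failure is observed with probability a/(a+b).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Lifelengths of Type A components (the distribution of interest) and
# Type B components (the censoring variables); the rates are assumptions.
x = rng.exponential(scale=1.0, size=n)   # X ~ Exp(rate 1.0)
y = rng.exponential(scale=0.5, size=n)   # Y ~ Exp(rate 2.0)

z = np.minimum(x, y)                     # observed minima Z_i
uncensored = x < y                       # whether the A-failure was seen

# For independent exponentials, P(X < Y) = 1.0 / (1.0 + 2.0) = 1/3,
# so about one third of the A-lifetimes are actually observed.
print(uncensored.mean())
```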
Virial expansion for almost diagonal random matrices
NASA Astrophysics Data System (ADS)
Yevtushenko, Oleg; Kravtsov, Vladimir E.
2003-08-01
Energy level statistics of Hermitian random matrices Ĥ with independent Gaussian random entries H_ij (i ≥ j) is studied for a generic ensemble of almost diagonal random matrices with ⟨|H_ii|^2⟩ ~ 1 and ⟨|H_i≠j|^2⟩ ≪ 1.
Full randomness from arbitrarily deterministic events
NASA Astrophysics Data System (ADS)
Gallego, Rodrigo; Masanes, Lluis; de la Torre, Gonzalo; Dhara, Chirag; Aolita, Leandro; Acín, Antonio
2013-10-01
Do completely unpredictable events exist? Classical physics excludes fundamental randomness. Although quantum theory makes probabilistic predictions, this does not imply that nature is random, as randomness should be certified without relying on the complete structure of the theory being used. Bell tests approach the question from this perspective. However, they require prior perfect randomness, falling into circular reasoning. A Bell test that generates perfect random bits from bits possessing high—but less than perfect—randomness has recently been obtained. Yet the main question remained open: does any initial randomness suffice to certify perfect randomness? Here we show that this is indeed the case. We provide a Bell test that uses arbitrarily imperfect random bits to produce bits that are, under the non-signalling principle assumption, perfectly random. This provides the first protocol attaining full randomness amplification. Our results have strong implications for the debate on whether there exist events that are fully random.
On Convergent Probability of a Random Walk
ERIC Educational Resources Information Center
Lee, Y.-F.; Ching, W.-K.
2006-01-01
This note introduces an interesting random walk on a straight path with cards of random numbers. The method of recurrent relations is used to obtain the convergent probability of the random walk with different initial positions.
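The recurrence-relation method mentioned here can be sketched on the classic case: a symmetric walk on {0, ..., N} with absorbing ends, where the absorption probability p[k] satisfies p[k] = (p[k-1] + p[k+1]) / 2 and has the closed form p[k] = k/N. (This is a standard textbook example, not the specific walk of the note.)

```python
import numpy as np

# Symmetric random walk on {0, 1, ..., N} with absorbing barriers at both
# ends. p[k], the probability of eventual absorption at N starting from k,
# satisfies the recurrence p[k] = (p[k-1] + p[k+1]) / 2, p[0]=0, p[N]=1.
N = 10
A = np.zeros((N + 1, N + 1))
b = np.zeros(N + 1)
A[0, 0] = 1.0                  # boundary: p[0] = 0
A[N, N] = 1.0; b[N] = 1.0      # boundary: p[N] = 1
for k in range(1, N):
    A[k, k] = 1.0
    A[k, k - 1] = A[k, k + 1] = -0.5
p = np.linalg.solve(A, b)

print(p)  # matches the closed form p[k] = k/N
```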
Wave propagation through a random medium - The random slab problem
NASA Technical Reports Server (NTRS)
Acquista, C.
1978-01-01
The first-order smoothing approximation yields integral equations for the mean and the two-point correlation function of a wave in a random medium. A method is presented for the approximate solution of these equations that combines features of the eiconal approximation and of the Born expansion. This method is applied to the problem of reflection and transmission of a plane wave by a slab of a random medium. Both the mean wave and the covariance are calculated to determine the reflected and transmitted amplitudes and intensities.
EDITORIAL: Nano and random lasers Nano and random lasers
NASA Astrophysics Data System (ADS)
Wiersma, Diederik S.; Noginov, Mikhail A.
2010-02-01
The field of extreme miniature sources of stimulated emission represented by random lasers and nanolasers has gone through an enormous development in recent years. Random lasers are disordered optical structures in which light waves are both multiply scattered and amplified. Multiple scattering is a process that we all know very well from daily experience. Many familiar materials are actually disordered dielectrics and owe their optical appearance to multiple light scattering. Examples are white marble, white painted walls, paper, white flowers, etc. Light waves inside such materials perform random walks, that is they are scattered several times in random directions before they leave the material, and this gives it an opaque white appearance. This multiple scattering process does not destroy the coherence of the light. It just creates a very complex interference pattern (also known as speckle). Random lasers can be made of basically any disordered dielectric material by adding an optical gain mechanism to the structure. In practice this can be achieved with, for instance, laser dye that is dissolved in the material and optically excited by a pump laser. Alternative routes to incorporate gain are achieved using rare-earth or transition metal doped solid-state laser materials or direct band gap semiconductors. The latter can potentially be pumped electrically. After excitation, the material is capable of scattering light and amplifying it, and these two ingredients form the basis for a random laser. Random laser emission can be highly coherent, even in the absence of an optical cavity. The reason is that random structures can sustain optical modes that are spectrally narrow. This provides a spectral selection mechanism that, together with gain saturation, leads to coherent emission. A random laser can have a large number of (randomly distributed) modes that are usually strongly coupled. This means that many modes compete for the gain that is available in a random
Cover times of random searches
NASA Astrophysics Data System (ADS)
Chupeau, Marie; Bénichou, Olivier; Voituriez, Raphaël
2015-10-01
How long must one undertake a random search to visit all sites of a given domain? This time, known as the cover time, is a key observable to quantify the efficiency of exhaustive searches, which require a complete exploration of an area and not only the discovery of a single target. Examples range from immune-system cells chasing pathogens to animals harvesting resources, from robotic exploration for cleaning or demining to the task of improving search algorithms. Despite its broad relevance, the cover time has remained elusive and so far explicit results have been scarce and mostly limited to regular random walks. Here we determine the full distribution of the cover time for a broad range of random search processes, including Lévy strategies, intermittent strategies, persistent random walks and random walks on complex networks, and reveal its universal features. We show that for all these examples the mean cover time can be minimized, and that the corresponding optimal strategies also minimize the mean search time for a single target, unambiguously pointing towards their robustness.
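For one of the simplest cases, the complete graph, the mean cover time reduces to the coupon-collector problem and can be checked by simulation. This is an illustrative sketch (graph size and trial count are assumptions), not the paper's general distributional result.

```python
import numpy as np

rng = np.random.default_rng(0)

def cover_time_complete(n):
    """Steps for a random walk on the complete graph K_n (uniform jump to
    one of the other n-1 vertices) to visit every vertex."""
    visited = {0}
    current, steps = 0, 0
    while len(visited) < n:
        current = (current + rng.integers(1, n)) % n  # uniform over others
        visited.add(current)
        steps += 1
    return steps

n = 10
mean_cover = np.mean([cover_time_complete(n) for _ in range(2000)])
# Coupon-collector prediction: (n-1) * H_{n-1} ≈ 25.5 for n = 10.
print(mean_cover)
```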
Random walk through fractal environments.
Isliker, H; Vlahos, L
2003-02-01
We analyze random walks through fractal environments embedded in three-dimensional, permeable space. Particles travel freely and are scattered off into random directions when they hit the fractal. The statistical distribution of the flight increments (i.e., of the displacements between two consecutive hittings) is analytically derived from a common, practical definition of fractal dimension, and it turns out to approximate quite well a power law in the case where the dimension D(F) of the fractal is less than 2; there is, though, always a finite rate of unaffected escape. Random walks through fractal sets with D(F) <= 2 can thus be considered as defective Levy walks. The distribution of jump increments for D(F) > 2 decays exponentially. The diffusive behavior of the random walk is analyzed in the frame of continuous-time random walk theory, which we generalize to include the case of defective distributions of walk increments. It is shown that the particles undergo anomalous, enhanced diffusion for D(F) < 2; the diffusion is dominated by the finite escape rate. Diffusion for D(F) > 2 is normal for large times, though enhanced for small and intermediate times. In particular, it follows that fractals generated by a particular class of self-organized criticality models give rise to enhanced diffusion. The analytical results are illustrated by Monte Carlo simulations.
Random root movements in weightlessness
NASA Technical Reports Server (NTRS)
Johnsson, A.; Karlsson, C.; Iversen, T. H.; Chapman, D. K.
1996-01-01
The dynamics of root growth was studied in weightlessness. In the absence of the gravitropic reference direction during weightlessness, root movements could be controlled by spontaneous growth processes, without any corrective growth induced by the gravitropic system. If truly random in nature, the bending behavior should follow so-called 'random walk' mathematics during weightlessness. Predictions from this hypothesis were critically tested. In a Spacelab ESA experiment, denoted RANDOM and carried out during the IML-2 Shuttle flight in July 1994, the growth of garden cress (Lepidium sativum) roots was followed by time-lapse photography at 1-h intervals. The growth pattern was recorded for about 20 h. Root growth was significantly smaller in weightlessness than under gravity (control) conditions. It was found that the roots performed spontaneous movements in weightlessness. The average direction of deviation of the plants consistently stayed equal to zero, despite these spontaneous movements. The average squared deviation increased linearly with time as predicted theoretically (but only for 8-10 h). Autocorrelation calculations showed that bendings of the roots, as determined from the 1-h photographs, were uncorrelated after about a 2-h interval. It is concluded that random processes play an important role in root growth. Predictions from a random walk hypothesis as to the growth dynamics could explain parts of the growth patterns recorded. This test of the hypothesis required microgravity conditions as provided for in a space experiment.
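The "average squared deviation increases linearly with time" prediction is the defining signature of an uncorrelated random walk and is easy to reproduce in simulation. This sketch uses a generic 2-D walk with unit steps and uniformly random directions (walker and step counts are illustrative assumptions, not the experiment's parameters).

```python
import numpy as np

rng = np.random.default_rng(0)

# 2-D random walk: each step has unit length and a uniformly random
# direction, mimicking uncorrelated spontaneous bending increments.
walkers, steps = 2000, 100
theta = rng.uniform(0.0, 2.0 * np.pi, size=(walkers, steps))
x = np.cumsum(np.cos(theta), axis=1)
y = np.cumsum(np.sin(theta), axis=1)
msd = np.mean(x**2 + y**2, axis=0)   # mean squared deviation vs time

# For uncorrelated unit steps, E[r^2(t)] = t: linear growth in time.
print(msd[-1])
```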
Visual Categorization with Random Projection.
Arriaga, Rosa I; Rutter, David; Cakmak, Maya; Vempala, Santosh S
2015-10-01
Humans learn categories of complex objects quickly and from a few examples. Random projection has been suggested as a means to learn and categorize efficiently. We investigate how random projection affects categorization by humans and by very simple neural networks on the same stimuli and categorization tasks, and how this relates to the robustness of categories. We find that (1) drastic reduction in stimulus complexity via random projection does not degrade performance in categorization tasks by either humans or simple neural networks, (2) human accuracy and neural network accuracy are remarkably correlated, even at the level of individual stimuli, and (3) the performance of both is strongly indicated by a natural notion of category robustness.
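The key property that makes random projection usable for categorization is that it approximately preserves pairwise distances (the Johnson-Lindenstrauss phenomenon). A minimal sketch, with illustrative dimensions rather than the study's stimuli:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random projection: map d-dimensional stimuli to k << d dimensions with a
# scaled random Gaussian matrix; pairwise distances are approximately kept.
d, k, n = 1000, 200, 20
points = rng.normal(size=(n, d))
projection = rng.normal(size=(d, k)) / np.sqrt(k)
low = points @ projection

def pairwise(z):
    diff = z[:, None, :] - z[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

orig, proj = pairwise(points), pairwise(low)
mask = ~np.eye(n, dtype=bool)
distortion = np.max(np.abs(proj[mask] / orig[mask] - 1.0))
print(distortion)  # small despite a 5x reduction in dimensionality
```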
Fast Randomized STDMA Link Scheduling
NASA Astrophysics Data System (ADS)
Gomez, Sergio; Gras, Oriol; Friderikos, Vasilis
In this paper a fast randomized parallel link-swap-based packing (RSP) algorithm for timeslot allocation in a spatial time division multiple access (STDMA) wireless mesh network is presented. The proposed randomized algorithm extends several greedy scheduling algorithms that utilize the physical interference model by applying a local search that leads to a substantial improvement in spatial timeslot reuse. Numerical simulations reveal that, compared to previously proposed scheduling schemes, the randomized algorithm can achieve a performance gain of up to 11%. A significant benefit of the proposed scheme is that the computations can be parallelized and can therefore efficiently utilize commoditized and emerging multi-core and/or multi-CPU processors.
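The greedy baseline that such schedulers extend can be sketched as graph coloring of a link-conflict graph. Note this uses a protocol-interference proxy (a plain conflict graph), not the physical SINR model of the paper, and the topology below is an illustrative assumption.

```python
import networkx as nx

# Links that interfere may not share a timeslot; model this as a conflict
# graph and assign timeslots with greedy coloring.
conflicts = nx.Graph()
conflicts.add_edges_from([
    ("L1", "L2"), ("L1", "L3"), ("L2", "L3"),   # mutually interfering links
    ("L3", "L4"), ("L4", "L5"),
])

slots = nx.coloring.greedy_color(conflicts, strategy="largest_first")
print(slots)

# Validity: no two conflicting links share a timeslot.
assert all(slots[u] != slots[v] for u, v in conflicts.edges)
```

A randomized local search, as in RSP, would then repeatedly try swapping links between timeslots to pack more links per slot.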
A Randomized Experiment Comparing Random and Cutoff-Based Assignment
ERIC Educational Resources Information Center
Shadish, William R.; Galindo, Rodolfo; Wong, Vivian C.; Steiner, Peter M.; Cook, Thomas D.
2011-01-01
In this article, we review past studies comparing randomized experiments to regression discontinuity designs, mostly finding similar results, but with significant exceptions. The latter might be due to potential confounds of study characteristics with assignment method or with failure to estimate the same parameter over methods. In this study, we…
Relatively Random: Context Effects on Perceived Randomness and Predicted Outcomes
ERIC Educational Resources Information Center
Matthews, William J.
2013-01-01
This article concerns the effect of context on people's judgments about sequences of chance outcomes. In Experiment 1, participants judged whether sequences were produced by random, mechanical processes (such as a roulette wheel) or skilled human action (such as basketball shots). Sequences with lower alternation rates were judged more likely to…
Relatively random: context effects on perceived randomness and predicted outcomes.
Matthews, William J
2013-09-01
This article concerns the effect of context on people's judgments about sequences of chance outcomes. In Experiment 1, participants judged whether sequences were produced by random, mechanical processes (such as a roulette wheel) or skilled human action (such as basketball shots). Sequences with lower alternation rates were judged more likely to result from human action. However, this effect was highly context-dependent: A moderate alternation rate was judged more likely to indicate a random physical process when encountered among sequences with lower alternation rates than when embedded among sequences with higher alternation rates. Experiment 2 found the same effect for predictions of the next outcome following a streak: A streak of 3 at the end of the sequence was judged less likely to continue by participants who had encountered shorter terminal streaks in previous trials than by those who had encountered longer ones. These contrast effects (a) help to explain variability in the types of sequences that are judged to be random and that elicit the gambler's fallacy, and urge caution about attempts to establish universal parameterizations of these effects; (b) are congruent with theories of sequence judgment that emphasize the importance of people's actual experiences with sequences of different kinds; (c) provide a link between models of sequence judgment and broader accounts of psychophysical/economic judgment; and (d) may offer new insight into individual differences in randomness judgments and sequence predictions.
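The alternation rate that drives these judgments is a one-line statistic: the fraction of adjacent outcome pairs that differ. A truly random binary sequence has an expected rate of 0.5, yet sequences with higher rates tend to be judged "more random."

```python
def alternation_rate(seq):
    """Fraction of adjacent pairs that differ, e.g. in a hit/miss series."""
    changes = sum(a != b for a, b in zip(seq, seq[1:]))
    return changes / (len(seq) - 1)

print(alternation_rate("HTHTHTHT"))  # 1.0: maximal alternation
print(alternation_rate("HHHHTTTT"))  # 1/7: a single switch in seven pairs
```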
Pooranfar, S; Shakoor, E; Shafahi, MJ; Salesi, M; Karimi, MH; Roozbeh, J; Hasheminasab, M
2014-01-01
Background: Patients undergoing renal transplantation take immunosuppressive drugs to prevent graft rejection. Cardiovascular complications and reduced sleep quality are among the side effects of these drugs. Studies have indicated that the use of non-therapeutic methods such as exercise is important to reduce these complications. Objective: To evaluate the effect of a period of exercise training, as a non-therapeutic method, on the quality and quantity of sleep and the lipid profile in renal transplant patients. Methods: 44 renal transplant recipients were selected to participate in the study and randomized into exercise (n=29) and control (n=15) groups. The exercise group participated in a cumulative exercise program 3 days a week for 10 weeks, in 60–90-minute exercise sessions. Control group subjects did not participate in any regular exercise activity during this period. Sleep quality was evaluated using the Pittsburgh Sleep Quality Index (PSQI) questionnaire; sleep quantity was assessed by recording the duration of convenient nocturnal sleep. Physiological sleep-related variables (serum triglyceride [TG], and total, high-density lipoprotein [HDL], and low-density lipoprotein [LDL] cholesterol) were measured before and after 10 weeks of exercise training. Results: In the exercise training group, sleep quality improved by 27% and sleep quantity increased by 30 minutes (p<0.05). TG, cholesterol, and LDL values were significantly (p<0.05) decreased after 10 weeks of exercise training in the exercise group compared to the control group; however, no change was observed in serum HDL levels. There was also a significant (p=0.05) difference in sleep quality and quantity between the control and exercise groups. However, there was no correlation between changes in the quality and quantity of sleep and the sleep-related physiological factors. Conclusion: 10 weeks of exercise activity improved
Random photonic crystal optical memory
NASA Astrophysics Data System (ADS)
Wirth Lima, A., Jr.; Sombra, A. S. B.
2012-10-01
Currently, optical cross-connects (OXCs) working in wavelength division multiplexing systems are based on optical fiber delay-line buffering. We designed and analyzed a novel photonic crystal optical memory, which replaces the fiber delay lines of the current optical cross-connect buffer. Optical buffering systems based on random photonic crystal optical memory behave similarly to electronic buffering systems based on electronic RAM. In this paper, we show that OXCs working with optical buffering based on random photonic crystal optical memories provide better performance than current optical cross-connects.
Truncations of random orthogonal matrices.
Khoruzhenko, Boris A; Sommers, Hans-Jürgen; Życzkowski, Karol
2010-10-01
Statistical properties of nonsymmetric real random matrices of size M, obtained as truncations of random orthogonal N×N matrices, are investigated. We derive an exact formula for the density of eigenvalues which consists of two components: finite fraction of eigenvalues are real, while the remaining part of the spectrum is located inside the unit disk symmetrically with respect to the real axis. In the case of strong nonorthogonality, M/N=const, the behavior typical to real Ginibre ensemble is found. In the case M=N-L with fixed L, a universal distribution of resonance widths is recovered.
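The basic setup is easy to reproduce numerically: draw a random orthogonal matrix (here via QR of a Gaussian matrix with the standard sign fix; sizes N and M are illustrative assumptions), truncate it, and observe that all eigenvalues of the truncation lie in the closed unit disk, with a fraction on the real axis.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_orthogonal(n):
    """Random orthogonal matrix from the QR decomposition of a Gaussian
    matrix, with the column-sign fix that makes the distribution uniform."""
    q, r = np.linalg.qr(rng.normal(size=(n, n)))
    return q * np.sign(np.diag(r))

N, M = 50, 20
O = random_orthogonal(N)
T = O[:M, :M]                    # M x M truncation

eigs = np.linalg.eigvals(T)
# Truncation shrinks the matrix (all singular values <= 1), so every
# eigenvalue lies inside the closed unit disk, symmetric about the real
# axis since T is real.
print(np.max(np.abs(eigs)))      # <= 1
```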
Neutron transport in random media
Makai, M.
1996-08-01
The survey reviews the methods available in the literature which allow a discussion of corium recriticality after a severe accident and a characterization of the corium. It appears that to date no one has considered the eigenvalue problem, though several approaches have been proposed for the source problem. The mathematical formulation of a random medium may be approached in different ways. Based on the review of the literature, we can draw three basic conclusions. The problem of static, random perturbations has been solved. The static case is tractable by the Monte Carlo method. There is a specific time-dependent case for which the average flux is given as a series expansion.
Molecular random tilings as glasses
Garrahan, Juan P.; Stannard, Andrew; Blunt, Matthew O.; Beton, Peter H.
2009-01-01
We have recently shown that p-terphenyl-3,5,3′,5′-tetracarboxylic acid adsorbed on graphite self-assembles into a two-dimensional rhombus random tiling. This tiling is close to ideal, displaying long-range correlations punctuated by sparse localized tiling defects. In this article we explore the analogy between dynamic arrest in this type of random tilings and that of structural glasses. We show that the structural relaxation of these systems is via the propagation–reaction of tiling defects, giving rise to dynamic heterogeneity. We study the scaling properties of the dynamics and discuss connections with kinetically constrained models of glasses. PMID:19720990
Synchronizability of random rectangular graphs
Estrada, Ernesto; Chen, Guanrong
2015-08-15
Random rectangular graphs (RRGs) represent a generalization of random geometric graphs in which the nodes are embedded in hyperrectangles instead of hypercubes. The synchronizability of the RRG model is studied. Both upper and lower bounds on the eigenratio of the network Laplacian matrix are determined analytically. It is proven that as the rectangular network becomes more elongated, the network becomes harder to synchronize. The synchronization behavior of an RRG network of chaotic Lorenz system nodes is investigated numerically, showing complete consistency with the theoretical results.
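The eigenratio criterion described in the abstract is easy to probe numerically. The sketch below is our own illustration, not the paper's code: the node count, connection radius, and side ratio are arbitrary choices. It builds a random rectangular graph of unit area and computes the Laplacian eigenratio λ_N/λ_2 with NumPy:

```python
import math
import random

import numpy as np

def random_rectangular_graph(n, a, radius, seed=0):
    # Nodes uniform in an a x (1/a) rectangle (unit area, so a = 1 gives
    # the usual unit square); connect pairs closer than `radius`.
    rng = random.Random(seed)
    pts = [(rng.random() * a, rng.random() / a) for _ in range(n)]
    adj = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            dx = pts[i][0] - pts[j][0]
            dy = pts[i][1] - pts[j][1]
            if math.hypot(dx, dy) < radius:
                adj[i, j] = adj[j, i] = 1.0
    return adj

def eigenratio(adj):
    # Laplacian L = D - A; synchronizability worsens as lambda_N/lambda_2 grows.
    lap = np.diag(adj.sum(axis=1)) - adj
    lam = np.linalg.eigvalsh(lap)  # ascending order
    return lam[-1] / lam[1]

square = eigenratio(random_rectangular_graph(40, 1.0, 0.5, seed=1))
elongated = eigenratio(random_rectangular_graph(40, 2.0, 0.5, seed=1))
```

Averaged over many realizations, the elongated rectangle should show a systematically larger eigenratio, i.e. poorer synchronizability, consistent with the analytical bounds of the paper.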
Rini, Christine; Porter, Laura S; Somers, Tamara J; McKee, Daphne C; DeVellis, Robert F; Smith, Meredith; Winkel, Gary; Ahern, David K; Goldman, Roberta; Stiller, Jamie L; Mariani, Cara; Patterson, Carol; Jordan, Joanne M; Caldwell, David S; Keefe, Francis J
2015-05-01
Osteoarthritis (OA) places a significant burden on worldwide public health because of the large and growing number of people affected by OA and its associated pain and disability. Pain coping skills training (PCST) is an evidence-based intervention targeting OA pain and disability. To reduce barriers that currently limit access to PCST, we developed an 8-week, automated, Internet-based PCST program called PainCOACH and evaluated its potential efficacy and acceptability in a small-scale, 2-arm randomized controlled feasibility trial. Participants were 113 men and women with clinically confirmed hip or knee OA and associated pain. They were randomized to a group completing PainCOACH or an assessment-only control group. Osteoarthritis pain, pain-related interference with functioning, pain-related anxiety, self-efficacy for pain management, and positive and negative affect were measured before intervention, midway through the intervention, and after intervention. Findings indicated high acceptability and adherence: 91% of participants randomized to complete PainCOACH finished all 8 modules over 8 to 10 weeks. Linear mixed models showed that, after treatment, women who received the PainCOACH intervention reported significantly lower pain than that in women in the control group (Cohen d = 0.33). Intervention effects could not be tested in men because of their low pain and small sample size. Additionally, both men and women demonstrated increases in self-efficacy from baseline to after intervention compared with the control group (d = 0.43). Smaller effects were observed for pain-related anxiety (d = 0.20), pain-related interference with functioning (d = 0.13), negative affect (d = 0.10), and positive affect (d = 0.24). Findings underscore the value of continuing to develop an automated Internet-based approach to disseminate this empirically supported intervention.
NASA Technical Reports Server (NTRS)
Katti, Romney R.
1995-01-01
Random-access memory (RAM) devices of proposed type exploit magneto-optical properties of magnetic garnets exhibiting perpendicular anisotropy. Magnetic writing and optical readout used. Provides nonvolatile storage and resists damage by ionizing radiation. Because of basic architecture and pinout requirements, most likely useful as small-capacity memory devices.
Common Randomness Principles of Secrecy
ERIC Educational Resources Information Center
Tyagi, Himanshu
2013-01-01
This dissertation concerns the secure processing of distributed data by multiple terminals, using interactive public communication among themselves, in order to accomplish a given computational task. In the setting of a probabilistic multiterminal source model in which several terminals observe correlated random signals, we analyze secure…
Randomized Item Response Theory Models
ERIC Educational Resources Information Center
Fox, Jean-Paul
2005-01-01
The randomized response (RR) technique is often used to obtain answers on sensitive questions. A new method is developed to measure latent variables using the RR technique because direct questioning leads to biased results. Within the RR technique is the probability of the true response modeled by an item response theory (IRT) model. The RR…
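As a concrete illustration of the randomized response technique that this model builds on, here is a minimal simulation of Warner's classic design (our sketch; the prevalence, design probability p, and sample size are arbitrary, and the IRT layer described in the abstract is not modeled):

```python
import random

def simulate_warner(true_prev, p, n, seed=0):
    # Warner's randomized-response design: with probability p the respondent
    # answers the sensitive question truthfully, otherwise answers its
    # negation. The interviewer never learns which question was answered.
    rng = random.Random(seed)
    yes = 0
    for _ in range(n):
        carrier = rng.random() < true_prev       # true sensitive status
        ask_direct = rng.random() < p            # private randomization
        answer = carrier if ask_direct else not carrier
        yes += answer
    lam = yes / n                                # observed "yes" rate
    # P(yes) = p*pi + (1-p)*(1-pi), so invert for the prevalence pi:
    return (lam - (1 - p)) / (2 * p - 1)

est = simulate_warner(true_prev=0.2, p=0.7, n=200_000, seed=42)
```

The estimator recovers the true prevalence without any individual response revealing the respondent's status, which is what makes the technique useful for sensitive questions.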
Universality in random quantum networks
NASA Astrophysics Data System (ADS)
Novotný, Jaroslav; Alber, Gernot; Jex, Igor
2015-12-01
Networks constitute efficient tools for assessing universal features of complex systems. In physical contexts, classical as well as quantum networks are used to describe a wide range of phenomena, such as phase transitions, intricate aspects of many-body quantum systems, or even characteristic features of a future quantum internet. Random quantum networks and their associated directed graphs are employed for capturing statistically dominant features of complex quantum systems. Here, we develop an efficient iterative method capable of evaluating the probability of a graph being strongly connected. It is proven that random directed graphs with constant edge-establishing probability are typically strongly connected, i.e., any ordered pair of vertices is connected by a directed path. This typical topological property of directed random graphs is exploited to demonstrate universal features of the asymptotic evolution of large random qubit networks. These results are independent of our knowledge of the details of the network topology. These findings suggest that other highly complex networks, such as a future quantum internet, may also exhibit similar universal properties.
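The strong-connectivity property central to the abstract can be checked directly by sampling. The sketch below is ours, with arbitrary size and edge probability; it does not reproduce the paper's iterative method for evaluating the connectivity probability. It samples digraphs with constant edge-establishing probability and verifies that nearly all of them are strongly connected:

```python
import random
from collections import deque

def random_digraph(n, p, rng):
    # Each ordered pair (i, j), i != j, gets a directed edge with probability p.
    return [[j for j in range(n) if j != i and rng.random() < p]
            for i in range(n)]

def reachable_from(adj, s):
    seen = {s}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                q.append(v)
    return seen

def strongly_connected(adj):
    # Strongly connected iff every vertex is reachable from vertex 0
    # in both the graph and its reverse.
    n = len(adj)
    if len(reachable_from(adj, 0)) < n:
        return False
    rev = [[] for _ in range(n)]
    for u, outs in enumerate(adj):
        for v in outs:
            rev[v].append(u)
    return len(reachable_from(rev, 0)) == n

rng = random.Random(7)
trials = 50
frac = sum(strongly_connected(random_digraph(60, 0.15, rng))
           for _ in range(trials)) / trials
```

With constant edge probability well above the ln(n)/n connectivity threshold, essentially every sampled digraph is strongly connected, matching the typical-topology result the abstract proves.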
Pseudo-Random Number Generators
NASA Technical Reports Server (NTRS)
Howell, L. W.; Rheinfurth, M. H.
1984-01-01
Package features comprehensive selection of probabilistic distributions. Monte Carlo simulations resorted to whenever systems studied not amenable to deterministic analyses or when direct experimentation not feasible. Random numbers having certain specified distribution characteristic integral part of simulations. Package consists of collector of "pseudorandom" number generators for use in Monte Carlo simulations.
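A minimal example of the kind of building block such a package provides: a linear congruential "pseudorandom" generator combined with inverse-transform sampling to produce a specified distribution. This is our sketch with textbook LCG constants, not the actual NASA package:

```python
import math

class LCG:
    # Minimal linear congruential generator (Numerical-Recipes-style
    # constants, modulus 2**32); illustrative only.
    def __init__(self, seed=1):
        self.state = seed

    def uniform(self):
        self.state = (1664525 * self.state + 1013904223) % 2**32
        return self.state / 2**32

    def exponential(self, rate):
        # Inverse-transform sampling: if U ~ Uniform(0,1),
        # then -ln(1-U)/rate ~ Exponential(rate).
        return -math.log(1.0 - self.uniform()) / rate

rng = LCG(seed=12345)
samples = [rng.exponential(rate=2.0) for _ in range(100_000)]
mean = sum(samples) / len(samples)  # should approach 1/rate = 0.5
```

Monte Carlo simulations consume streams like this one; the package described above simply collects many such generators covering a comprehensive set of distributions.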
Undecidability Theorem and Quantum Randomness
NASA Astrophysics Data System (ADS)
Berezin, Alexander A.
2005-04-01
As scientific folklore has it, Kurt Gödel was once annoyed by the question of whether he saw any link between his Undecidability Theorem (UT) and the Uncertainty Relationship. His reaction, however, may indicate that he probably felt such a hidden link could indeed exist but was unable to formulate it clearly. The informational version of UT (G. J. Chaitin) states the impossibility of ruling out algorithmic compressibility of an arbitrary digital string. Thus, (mathematical) randomness can only be disproven, not proven. Going from mathematical to physical (mainly quantum) randomness, we encounter seemingly random acts of radioactive decay of isotopes (such as C14), emission from excited atoms, tunneling effects, etc. However, our notion of quantum randomness (QR) may likely hit a similarly formidable wall in the form of a physical version of UT, leading to seemingly bizarre ideas such as the Everett many-worlds model (D. Deutsch) or backward causation (J. A. Wheeler). The resolution may potentially lie in admitting some form of Aristotelian final causation (AFC) as an ultimate foundational principle (G. W. Leibniz) connecting purely mathematical (Platonic) grounding aspects with their physically observable consequences, such as the plethora of QR effects. Thus, what we interpret as QR may eventually be a manifestation of AFC in which UT serves as the delivery vehicle. Another example of the UT/QR/AFC connection is the question of the identity (indistinguishability) of elementary particles (are all electrons exactly the same, or just approximately so to a very high degree?).
Plated wire random access memories
NASA Technical Reports Server (NTRS)
Gouldin, L. D.
1975-01-01
A program was conducted to construct 4096-word by 18-bit random-access, NDRO plated-wire memory units. The memory units were subjected to comprehensive functional and environmental tests at the end-item level to verify conformance with the specified requirements. A technical description of the unit is given, along with acceptance test data sheets.
A Mixed Effects Randomized Item Response Model
ERIC Educational Resources Information Center
Fox, J.-P.; Wyrick, Cheryl
2008-01-01
The randomized response technique ensures that individual item responses, denoted as true item responses, are randomized before observing them and so-called randomized item responses are observed. A relationship is specified between randomized item response data and true item response data. True item response data are modeled with a (non)linear…
Random trinomial tree models and vanilla options
NASA Astrophysics Data System (ADS)
Ganikhodjaev, Nasir; Bayram, Kamola
2013-09-01
In this paper we introduce and study the random trinomial model. The usual trinomial model is prescribed by a triple of numbers (u, d, m), which we call the environment of the trinomial model. A triple (Un, Dn, Mn), where {Un}, {Dn} and {Mn} are sequences of independent, identically distributed random variables with 0 < Dn < 1 < Un and Mn = 1 for all n, is called a random environment, and a trinomial tree model with a random environment is called a random trinomial model. The random trinomial model is considered to produce more accurate results than the random binomial model or the usual trinomial model.
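The random-environment construction can be sketched in a few lines. In this illustration (ours; the equal branch probabilities of 1/3, the ranges for U_n and D_n, and the zero interest rate are arbitrary assumptions, not the paper's calibration) we simulate paths of a random trinomial model and value a vanilla call by Monte Carlo:

```python
import random

def random_trinomial_paths(s0, n_steps, n_paths, seed=0):
    # Random environment: at each step draw fresh factors D_n < 1 < U_n,
    # with middle factor M_n = 1, then move up/middle/down with equal
    # probability 1/3 each.
    rng = random.Random(seed)
    finals = []
    for _ in range(n_paths):
        s = s0
        for _ in range(n_steps):
            u = 1.0 + 0.1 * rng.random()   # U_n drawn in (1, 1.1)
            d = 1.0 - 0.1 * rng.random()   # D_n drawn in (0.9, 1)
            s *= rng.choice((u, 1.0, d))
        finals.append(s)
    return finals

def vanilla_call_value(finals, strike):
    # Undiscounted Monte Carlo payoff average (zero rate assumed).
    return sum(max(s - strike, 0.0) for s in finals) / len(finals)

finals = random_trinomial_paths(s0=100.0, n_steps=20, n_paths=20_000, seed=3)
price = vanilla_call_value(finals, strike=100.0)
```

A fixed-environment trinomial tree is the special case where U_n and D_n are constants; redrawing them each step is exactly what distinguishes the random trinomial model.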
Random density matrices versus random evolution of open system
NASA Astrophysics Data System (ADS)
Pineda, Carlos; Seligman, Thomas H.
2015-10-01
We present and compare two families of ensembles of random density matrices. The first, static ensemble, is obtained foliating an unbiased ensemble of density matrices. As criterion we use fixed purity as the simplest example of a useful convex function. The second, dynamic ensemble, is inspired in random matrix models for decoherence where one evolves a separable pure state with a random Hamiltonian until a given value of purity in the central system is achieved. Several families of Hamiltonians, adequate for different physical situations, are studied. We focus on a two qubit central system, and obtain exact expressions for the static case. The ensemble displays a peak around Werner-like states, modulated by nodes on the degeneracies of the density matrices. For moderate and strong interactions good agreement between the static and the dynamic ensembles is found. Even in a model where one qubit does not interact with the environment excellent agreement is found, but only if there is maximal entanglement with the interacting one. The discussion is started recalling similar considerations for scattering theory. At the end, we comment on the reach of the results for other convex functions of the density matrix, and exemplify the situation with the von Neumann entropy.
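To make the "unbiased ensemble of density matrices" concrete, the sketch below (ours; dimension and sample count are arbitrary) samples the Hilbert-Schmidt, Ginibre-induced ensemble and computes purities, which for dimension d always lie in [1/d, 1]. Foliating the ensemble at fixed purity, as in the static construction above, would amount to retaining only samples near a target purity value:

```python
import numpy as np

def random_density_matrix(dim, rng):
    # Hilbert-Schmidt ensemble: rho = G G^dag / tr(G G^dag) with
    # G a complex Ginibre matrix.
    g = rng.standard_normal((dim, dim)) + 1j * rng.standard_normal((dim, dim))
    rho = g @ g.conj().T
    return rho / np.trace(rho).real

def purity(rho):
    # tr(rho^2); equals 1 for pure states and 1/dim for the maximally
    # mixed state.
    return np.trace(rho @ rho).real

rng = np.random.default_rng(5)
rhos = [random_density_matrix(4, rng) for _ in range(2000)]
purities = [purity(r) for r in rhos]
```

Purity is the convex function the authors use as the simplest foliation criterion; the same filtering idea applies to other convex functions such as the von Neumann entropy mentioned at the end of the abstract.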
Random walk with random resetting to the maximum position
NASA Astrophysics Data System (ADS)
Majumdar, Satya N.; Sabhapandit, Sanjib; Schehr, Grégory
2015-11-01
We study analytically a simple random walk model on a one-dimensional lattice, where at each time step the walker resets to the maximum of the already visited positions (to the rightmost visited site) with a probability r , and with probability (1 -r ) , it undergoes symmetric random walk, i.e., it hops to one of its neighboring sites, with equal probability (1 -r )/2 . For r =0 , it reduces to a standard random walk whose typical distance grows as √{n } for large n . In the presence of a nonzero resetting rate 0
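The resetting rule is simple to simulate. The sketch below (ours; the step count and resetting rate are arbitrary) contrasts r = 0, where the walker stays within O(√n) of the origin, with r > 0, where resetting to the rightmost visited site makes the motion ballistic:

```python
import random

def reset_to_max_walk(n_steps, r, seed=0):
    # At each step: with probability r jump to the rightmost visited site;
    # otherwise hop +1 or -1 with probability (1 - r)/2 each.
    rng = random.Random(seed)
    x = 0
    x_max = 0
    for _ in range(n_steps):
        if rng.random() < r:
            x = x_max
        else:
            x += 1 if rng.random() < 0.5 else -1
        x_max = max(x_max, x)
    return x, x_max

# r = 0: plain symmetric random walk, typical distance ~ sqrt(n).
pos0, _ = reset_to_max_walk(10_000, r=0.0, seed=1)
# r > 0: the maximum (and the walker pinned near it) advances linearly.
posr, maxr = reset_to_max_walk(10_000, r=0.2, seed=1)
```

The qualitative change is visible immediately: after 10,000 steps the r = 0 walker is typically within a few hundred sites of the origin, while the resetting walker's rightmost position grows proportionally to n.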
Mangione, Kathleen K.; Craik, Rebecca L.; Palombaro, Kerstin M.; Tomlinson, Susan S.; Hofmann, Mary T.
2010-01-01
Objectives Examine the effectiveness of a short term leg strengthening exercise program compared to attentional control on improving strength, walking abilities, and function one year after hip fracture. Design Randomized controlled pilot study. Setting Interventions occurred in patients’ homes. Participants Community-dwelling older adults (n=26) six months post hip fracture at baseline. Intervention Exercise and control participants received interventions by physical therapists twice weekly for 10 weeks. The exercise group received high intensity leg strengthening exercises. The control group received transcutaneous electrical nerve stimulation and mental imagery. Measurements Isometric force production of lower extremity muscles; usual and fast gait speed, six minute walk (6-MW) distance, modified physical performance test (mPPT), and SF-36 physical function. Results The primary endpoint was at one year post fracture. Isometric force production (p<.01), usual and fast gait speed (p=.02 & .03, respectively), 6-MW (p<.01), and mPPT (p<.01) improved at one year post fracture with exercise. Effect sizes were 0.79 for strength, 0.81 for mPPT scores, 0.56 for gait speed, 0.49 for 6-MW, and 0.30 for SF-36 scores. More patients in the exercise group made meaningful changes in gait speed and 6-MW distance than control patients (χ2: p=.004). Conclusion A 10-week home-based progressive resistance exercise program was sufficient to achieve moderate to large effects on physical performance and quality of life and may offer an alternative intervention mode for hip fracture patients who are unable to leave home at 6 months after the fracture. The effects were maintained at 3 months after completion of the training program. PMID:20929467
Markov random field surface reconstruction.
Paulsen, Rasmus R; Baerentzen, Jakob Andreas; Larsen, Rasmus
2010-01-01
A method for implicit surface reconstruction is proposed. The novelty in this paper is the adaptation of Markov Random Field regularization of a distance field. The Markov Random Field formulation allows us to integrate both knowledge about the type of surface we wish to reconstruct (the prior) and knowledge about data (the observation model) in an orthogonal fashion. Local models that account for both scene-specific knowledge and physical properties of the scanning device are described. Furthermore, it is demonstrated how the optimal distance field can be computed using conjugate gradients, sparse Cholesky factorization, and a multiscale iterative optimization scheme. The method is demonstrated on a set of scanned human heads and, both in terms of accuracy and the ability to close holes, the proposed method is shown to have similar or superior performance when compared to current state-of-the-art algorithms.
Knot probabilities in random diagrams
NASA Astrophysics Data System (ADS)
Cantarella, Jason; Chapman, Harrison; Mastin, Matt
2016-10-01
We consider a natural model of random knotting—choose a knot diagram at random from the finite set of diagrams with n crossings. We tabulate diagrams with 10 and fewer crossings and classify the diagrams by knot type, allowing us to compute exact probabilities for knots in this model. As expected, most diagrams with 10 and fewer crossings are unknots (about 78% of the roughly 1.6 billion 10 crossing diagrams). For these crossing numbers, the unknot fraction is mostly explained by the prevalence of ‘tree-like’ diagrams which are unknots for any assignment of over/under information at crossings. The data shows a roughly linear relationship between the log of knot type probability and the log of the frequency rank of the knot type, analogous to Zipf’s law for word frequency. The complete tabulation and all knot frequencies are included as supplementary data.
Optimal randomized scheduling by replacement
Saias, I.
1996-05-01
In the replacement scheduling problem, a system is composed of n processors drawn from a pool of p. The processors can become faulty while in operation and faulty processors never recover. A report is issued whenever a fault occurs. This report states only the existence of a fault but does not indicate its location. Based on this report, the scheduler can reconfigure the system and choose another set of n processors. The system operates satisfactorily as long as, upon report of a fault, the scheduler chooses n non-faulty processors. We provide a randomized protocol maximizing the expected number of faults the system can sustain before the occurrence of a crash. The optimality of the protocol is established by considering a closely related dual optimization problem. The game-theoretic technical difficulties that we solve in this paper are very general and encountered whenever proving the optimality of a randomized algorithm in parallel and distributed computation.
Percolation on correlated random networks
NASA Astrophysics Data System (ADS)
Agliari, E.; Cioli, C.; Guadagnini, E.
2011-09-01
We consider a class of random, weighted networks, obtained through a redefinition of patterns in an Hopfield-like model, and, by performing percolation processes, we get information about topology and resilience properties of the networks themselves. Given the weighted nature of the graphs, different kinds of bond percolation can be studied: stochastic (deleting links randomly) and deterministic (deleting links based on rank weights), each mimicking a different physical process. The evolution of the network is accordingly different, as evidenced by the behavior of the largest component size and of the distribution of cluster sizes. In particular, we can derive that weak ties are crucial in order to maintain the graph connected and that, when they are the most prone to failure, the giant component typically shrinks without abruptly breaking apart; these results have been recently evidenced in several kinds of social networks.
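Both percolation protocols are easy to mimic on a generic weighted random graph. The sketch below (ours; it uses an Erdős-Rényi graph with uniform random weights standing in for tie strength, rather than the Hopfield-like construction of the paper) deletes 60% of the links either at random or weakest-first and tracks the largest component with a union-find structure:

```python
import random

class DSU:
    # Minimal union-find (disjoint-set) structure with path halving.
    def __init__(self, n):
        self.parent = list(range(n))

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

def largest_component(n, edges):
    dsu = DSU(n)
    for u, v, _w in edges:
        dsu.union(u, v)
    sizes = {}
    for x in range(n):
        root = dsu.find(x)
        sizes[root] = sizes.get(root, 0) + 1
    return max(sizes.values())

rng = random.Random(11)
n = 200
edges = [(u, v, rng.random())            # weight ~ tie strength
         for u in range(n) for v in range(u + 1, n) if rng.random() < 0.03]

keep = int(0.4 * len(edges))
stochastic = rng.sample(edges, keep)                       # delete links at random
deterministic = sorted(edges, key=lambda e: e[2])[-keep:]  # delete weakest links first

g_stoch = largest_component(n, stochastic)
g_det = largest_component(n, deterministic)
```

On the correlated, Hopfield-derived weights of the paper the two protocols behave differently (weak ties hold the graph together); on this uncorrelated toy graph they are statistically equivalent, which is precisely why the correlation structure matters.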
Image segmentation using random features
NASA Astrophysics Data System (ADS)
Bull, Geoff; Gao, Junbin; Antolovich, Michael
2014-01-01
This paper presents a novel algorithm for selecting random features via compressed sensing to improve the performance of Normalized Cuts in image segmentation. Normalized Cuts is a clustering algorithm that has been widely applied to segmenting images, using features such as brightness, intervening contours and Gabor filter responses. Some drawbacks of Normalized Cuts are that computation times and memory usage can be excessive, and the obtained segmentations are often poor. This paper addresses the need to improve the processing time of Normalized Cuts while improving the segmentations. A significant proportion of the time in calculating Normalized Cuts is spent computing an affinity matrix. A new algorithm has been developed that selects random features using compressed sensing techniques to reduce the computation needed for the affinity matrix. The new algorithm, when compared to the standard implementation of Normalized Cuts for segmenting images from the BSDS500, produces better segmentations in significantly less time.
NASA Astrophysics Data System (ADS)
Korneta, W.; Pytel, Z.
1988-07-01
The random walk of a particle on a three-dimensional semi-infinite lattice is considered. In order to study the effect of the surface on the random walk, it is assumed that the velocity of the particle depends on the distance to the surface. Moreover it is assumed that at any point the particle may be absorbed with a certain probability. The probability of the return of the particle to the starting point and the average time of eventual return are calculated. The dependence of these quantities on the distance to the surface, the probability of absorption and the properties of the surface is discussed. The method of generating functions is used.
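A Monte Carlo version of this setup is straightforward. In our sketch the particle is absorbed with a constant probability per step, the region below z = 0 is perfectly absorbing, and the distance-dependent velocity of the paper is omitted for brevity:

```python
import random

def return_probability(z0, absorb_p, n_walkers, max_steps, seed=0):
    # Estimate the probability that a walker started at height z0 above the
    # surface returns to its starting point, given per-step absorption
    # probability absorb_p and death upon stepping below z = 0.
    rng = random.Random(seed)
    moves = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    returns = 0
    for _ in range(n_walkers):
        x = y = 0
        z = z0
        for _ in range(max_steps):
            if rng.random() < absorb_p:
                break                      # absorbed in the bulk
            dx, dy, dz = rng.choice(moves)
            x += dx; y += dy; z += dz
            if z < 0:
                break                      # stepped through the surface
            if (x, y, z) == (0, 0, z0):
                returns += 1
                break
    return returns / n_walkers

p_near = return_probability(z0=1, absorb_p=0.01, n_walkers=20_000,
                            max_steps=2000, seed=9)
p_far = return_probability(z0=10, absorb_p=0.01, n_walkers=20_000,
                           max_steps=2000, seed=9)
```

Starting closer to the surface lowers the return probability, since return paths that dip below the surface are killed, illustrating the distance dependence the paper computes analytically via generating functions.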
Random modelling of contagious diseases.
Demongeot, J; Hansen, O; Hessami, H; Jannot, A S; Mintsa, J; Rachdi, M; Taramasco, C
2013-03-01
Modelling contagious diseases needs to include a mechanistic knowledge about contacts between hosts and pathogens as specific as possible, e.g., by incorporating in the model information about social networks through which the disease spreads. The unknown part concerning the contact mechanism can be modelled using a stochastic approach. For that purpose, we revisit SIR models by introducing first a microscopic stochastic version of the contacts between individuals of different populations (namely Susceptible, Infective and Recovering), then by adding a random perturbation in the vicinity of the endemic fixed point of the SIR model and eventually by introducing the definition of various types of random social networks. We propose as example of application to contagious diseases the HIV, and we show that a micro-simulation of individual based modelling (IBM) type can reproduce the current stable incidence of the HIV epidemic in a population of HIV-positive men having sex with men (MSM). PMID:23525763
Equations of the Randomizer's Dynamics
NASA Astrophysics Data System (ADS)
Strzałko, Jarosław; Grabski, Juliusz; Perlikowski, Przemysław; Stefanski, Andrzej; Kapitaniak, Tomasz
Based on the Newton-Euler laws of mechanics, we derive the equations describing the dynamics of the coin toss, the die throw, and the roulette run. The equations for full 3D models and for lower-dimensional simplifications are given. The influence of air resistance and of energy dissipation at the impacts is described. The obtained equations allow for numerical simulation of the randomizer's dynamics and define the mapping of initial conditions onto the final outcome.
NASA Technical Reports Server (NTRS)
Hornstein, J.; Fainberg, J.
1981-01-01
We review ray-optical methods of analyzing short-wavelength propagation in random media. The advantages and limitations of ray methods are discussed, and results of the statistical theory of ray segment fluctuations pertinent to ray tracing are summarized. The standard method of Monte Carlo ray tracing is compared to a new method which takes into account recent results on the statistics of ray segment fluctuations.
Random drift and culture change.
Bentley, R. Alexander; Hahn, Matthew W.; Shennan, Stephen J.
2004-01-01
We show that the frequency distributions of cultural variants in three different real-world examples (first names, archaeological pottery, and applications for technology patents) follow power laws that can be explained by a simple model of random drift. We conclude that cultural and economic choices often reflect a decision process that is value-neutral; this result has far-reaching testable implications for social-science research. PMID:15306315
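The drift model invoked here is the neutral copying model: individuals copy variants from one another at random and occasionally innovate. A minimal simulation (ours; population size, innovation rate, and run length are arbitrary) shows how this value-neutral process alone produces a heavy-tailed variant-frequency distribution:

```python
import random
from collections import Counter

def neutral_drift(pop_size, mu, generations, seed=0):
    # Each generation, every individual copies the variant of a random
    # individual from the previous generation, or innovates a brand-new
    # variant with probability mu. No variant is intrinsically better.
    rng = random.Random(seed)
    pop = list(range(pop_size))          # start with all-distinct variants
    next_label = pop_size
    history = Counter()                  # cumulative popularity of each variant
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            if rng.random() < mu:
                new_pop.append(next_label)
                next_label += 1
            else:
                new_pop.append(rng.choice(pop))
        pop = new_pop
        history.update(pop)
    return history

counts = neutral_drift(pop_size=200, mu=0.01, generations=500, seed=2)
freqs = sorted(counts.values(), reverse=True)
# A few variants dominate while most appear only briefly: the signature
# heavy tail that, on log-log axes, approximates a power law.
```

Plotting log(frequency) against log(rank) for `freqs` gives the roughly linear relationship that the paper matches against names, pottery, and patents.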
Correlated randomness and switching phenomena
NASA Astrophysics Data System (ADS)
Stanley, H. E.; Buldyrev, S. V.; Franzese, G.; Havlin, S.; Mallamace, F.; Kumar, P.; Plerou, V.; Preis, T.
2010-08-01
One challenge of biology, medicine, and economics is that the systems treated by these serious scientific disciplines have no perfect metronome in time and no perfect spatial architecture, crystalline or otherwise. Nonetheless, as if by magic, out of nothing but randomness one finds remarkably fine-tuned processes in time and remarkably fine-tuned structures in space. Further, many of these processes and structures have the remarkable feature of “switching” from one behavior to another as if by magic. The past century has, philosophically, been concerned with placing aside the human tendency to see the universe as a fine-tuned machine. Here we will address the challenge of uncovering how, through randomness (albeit, as we shall see, strongly correlated randomness), one can arrive at some of the many spatial and temporal patterns in biology, medicine, and economics and even begin to characterize the switching phenomena that enable a system to pass from one state to another. Inspired by principles developed by A. Nihat Berker and scores of other statistical physicists in recent years, we discuss some applications of correlated randomness to understand switching phenomena in various fields. Specifically, we present evidence from experiments and from computer simulations supporting the hypothesis that water’s anomalies are related to a switching point (which is not unlike the “tipping point” immortalized by Malcolm Gladwell), and that the bubbles in economic phenomena that occur on all scales are not “outliers” (another Gladwell immortalization). Though more speculative, we support the idea of disease as arising from some kind of yet-to-be-understood complex switching phenomenon, by discussing data on selected examples, including heart disease and Alzheimer disease.
Joos, Leen; Goudriaan, Anna E; Schmaal, Lianne; van den Brink, Wim; Sabbe, Bernard G C; Dom, Geert
2013-11-01
Cognitive deficits are highly prevalent in alcohol-dependent (AD) patients and may have a detrimental impact on treatment response and treatment outcome. Enhancing cognitive functions may improve treatment success. Modafinil is a promising compound in this respect. Therefore, a randomized double-blind placebo-controlled trial was conducted with modafinil (300 mg/d) or placebo in 83 AD patients for 10 weeks. Various cognitive functions (digit span task, Tower of London task, Stroop task) were measured at baseline, during and after treatment. Compared to placebo, modafinil improved verbal short-term memory (number of forward digit spans) (p=0.030), but modafinil exerted a negative effect on the working memory score of the digit span task (p=0.003). However, subgroup analyses revealed that modafinil did improve both working memory and verbal short-term memory in AD patients with a poor working memory ability at baseline (25% worst performers), whereas no significant treatment effect of modafinil was found on these two dependent variables in patients with good working memory skills at baseline (25% best performers). No effect of modafinil was found on measures of planning (Tower of London task) and selective attention (Stroop task). Further research is needed to better understand the relationship between cognitive remediation and treatment outcome in order to design targeted treatments.
Armodafinil in binge eating disorder: a randomized, placebo-controlled trial.
McElroy, Susan L; Guerdjikova, Anna I; Mori, Nicole; Blom, Thomas J; Williams, Stephanie; Casuto, Leah S; Keck, Paul E
2015-07-01
This study evaluated the efficacy, tolerability, and safety of armodafinil in the treatment of binge eating disorder (BED). Sixty participants with BED were randomized to receive armodafinil (150-250 mg/day) (N = 30) or placebo (N = 30) in a 10-week, prospective, parallel-group, double-blind, flexible-dose, single-center trial. In the primary longitudinal analysis, armodafinil and placebo produced similar rates of improvement in binge eating day frequency (the primary outcome measure); however, armodafinil was associated with a statistically significantly higher rate of decrease in binge eating episode frequency. In the secondary baseline-to-endpoint analyses, armodafinil was associated with statistically significant reductions in obsessive-compulsive features of binge eating and BMI. The mean (SD) armodafinil daily dose at endpoint evaluation was 216.7 (43.9) mg. There were no serious adverse events, although one armodafinil recipient developed markedly increased blood pressure that resolved upon drug discontinuation. The small sample size may have limited the detection of important drug-placebo differences. As some of the observed effect sizes appeared clinically meaningful, larger studies of armodafinil in the treatment of BED are warranted.
Approximating random quantum optimization problems
NASA Astrophysics Data System (ADS)
Hsu, B.; Laumann, C. R.; Läuchli, A. M.; Moessner, R.; Sondhi, S. L.
2013-06-01
We report a cluster of results regarding the difficulty of finding approximate ground states to typical instances of the quantum satisfiability problem k-body quantum satisfiability (k-QSAT) on large random graphs. As an approximation strategy, we optimize the solution space over “classical” product states, which in turn introduces a novel autonomous classical optimization problem, PSAT, over a space of continuous degrees of freedom rather than discrete bits. Our central results are (i) the derivation of a set of bounds and approximations in various limits of the problem, several of which we believe may be amenable to a rigorous treatment; (ii) a demonstration that an approximation based on a greedy algorithm borrowed from the study of frustrated magnetism performs well over a wide range in parameter space, and its performance reflects the structure of the solution space of random k-QSAT. Simulated annealing exhibits metastability in similar “hard” regions of parameter space; and (iii) a generalization of belief propagation algorithms introduced for classical problems to the case of continuous spins. This yields both approximate solutions, as well as insights into the free energy “landscape” of the approximation problem, including a so-called dynamical transition near the satisfiability threshold. Taken together, these results allow us to elucidate the phase diagram of random k-QSAT in a two-dimensional energy-density-clause-density space.
Mixed interactions in random copolymers
NASA Astrophysics Data System (ADS)
Marinov, Toma; Luettmer-Strathmann, Jutta
2002-03-01
The description of thermodynamic properties of copolymers in terms of simple lattice models requires a value for the mixed interaction strength (ɛ_12) between unlike chain segments, in addition to parameters that can be derived from the properties of the corresponding homopolymers. If the monomers are chemically similar, Berthelot's geometric-mean combining rule provides a good first approximation for ɛ_12. In earlier work on blends of polyolefins [1], we found that the small-scale architecture of the chains leads to corrections to the geometric-mean approximation that are important for the prediction of phase diagrams. In this work, we focus on the additional effects due to sequencing of the monomeric units. In order to estimate the mixed interaction ɛ_12 for random copolymers, the small-scale simulation approach developed in [1] is extended to allow for random sequencing of the monomeric units. The approach is applied here to random copolymers of ethylene and 1-butene. [1] J. Luettmer-Strathmann and J.E.G. Lipson. Phys. Rev. E 59, 2039 (1999) and Macromolecules 32, 1093 (1999).
Resolution analysis by random probing
NASA Astrophysics Data System (ADS)
Simutė, S.; Fichtner, A.; van Leeuwen, T.
2015-12-01
We develop and apply methods for resolution analysis in tomography, based on stochastic probing of the Hessian or resolution operators. Key properties of our methods are (i) low algorithmic complexity and easy implementation, (ii) applicability to any tomographic technique, including full-waveform inversion and linearized ray tomography, (iii) applicability in any spatial dimension and to inversions with a large number of model parameters, (iv) low computational costs that are mostly a fraction of those required for synthetic recovery tests, and (v) the ability to quantify both spatial resolution and inter-parameter trade-offs. Using synthetic full-waveform inversions as benchmarks, we demonstrate that auto-correlations of random-model applications to the Hessian yield various resolution measures, including direction- and position-dependent resolution lengths, and the strength of inter-parameter mappings. We observe that the required number of random test models is around 5 in one, two and three dimensions. This means that the proposed resolution analyses are not only more meaningful than recovery tests but also computationally less expensive. We demonstrate the applicability of our method in 3D real-data full-waveform inversions for the western Mediterranean and Japan. In addition to tomographic problems, resolution analysis by random probing may be used in other inverse methods that constrain continuously distributed properties, including electromagnetic and potential-field inversions, as well as recently emerging geodynamic data assimilation.
Topological insulators in random potentials
NASA Astrophysics Data System (ADS)
Pieper, Andreas; Fehske, Holger
2016-01-01
We investigate the effects of magnetic and nonmagnetic impurities on the two-dimensional surface states of three-dimensional topological insulators (TIs). Modeling weak and strong TIs using a generic four-band Hamiltonian, which allows for a breaking of inversion and time-reversal symmetries and takes into account random local potentials as well as the Zeeman and orbital effects of external magnetic fields, we compute the local density of states, the single-particle spectral function, and the conductance for a (contacted) slab geometry by numerically exact techniques based on kernel polynomial expansion and Green's function approaches. We show that bulk disorder refills the surface-state Dirac gap induced by a homogeneous magnetic field with states, whereas orbital (Peierls-phase) disorder preserves the gap feature. The former effect is more pronounced in weak TIs than in strong TIs. At moderate randomness, disorder-induced conducting channels appear in the surface layer, promoting diffusive metallicity. Random Zeeman fields rapidly destroy any conducting surface states. Imprinting quantum dots on a TI's surface, we demonstrate that carrier transport can be easily tuned by varying the gate voltage, even to the point where quasibound dot states may appear.
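The gap-opening mechanism described above appears already in the continuum surface Hamiltonian H(k) = v(k_x σ_x + k_y σ_y) + m σ_z, where a Zeeman-like mass m opens a gap of 2|m| at the Dirac point. A minimal numerical illustration (not the four-band lattice model of the paper):

```python
import numpy as np

def dirac_gap(m, v=1.0, kmax=2.0, nk=401):
    """Gap of the surface Dirac Hamiltonian H(k) = v(kx*sx + ky*sy) + m*sz.

    The bands are +/- sqrt((v k)^2 + m^2); a Zeeman-like mass m opens a
    gap of 2|m| at the Dirac point."""
    k = np.linspace(-kmax, kmax, nk)   # cut along ky = 0
    return 2 * np.sqrt((v * k) ** 2 + m ** 2).min()

print(dirac_gap(0.0))  # gapless Dirac cone (numerically ~0)
print(dirac_gap(0.3))  # Zeeman mass opens a gap of about 2|m| = 0.6
```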
Nordin, Sara; Carlbring, Per; Cuijpers, Pim; Andersson, Gerhard
2010-09-01
Cognitive behavioral bibliotherapy for panic disorder has been found to be less effective without therapist support. In this study, participants were randomized to either unassisted bibliotherapy (n=20) with a scheduled follow-up telephone interview or to a waiting list control group (n=19). Following a structured psychiatric interview, participants in the treatment group were sent a self-help book consisting of 10 chapters based on cognitive behavioral strategies for the treatment of panic disorder. No therapist contact of any kind was provided during the treatment phase, which lasted for 10 weeks. Results showed that the treatment group had, in comparison to the control group, improved on all outcome measures at posttreatment and at 3-month follow-up. The tentative conclusion drawn from these results is that pure bibliotherapy with a clear deadline can be effective for people suffering from panic disorder with or without agoraphobia.
Ghaleiha, Ali; Ghyasvand, Mohammad; Mohammadi, Mohammad-Reza; Farokhnia, Mehdi; Yadegari, Noorollah; Tabrizi, Mina; Hajiaghaee, Reza; Yekehtaz, Habibeh; Akhondzadeh, Shahin
2014-07-01
The role of cholinergic abnormalities in autism has recently been demonstrated, and there is growing interest in cholinergic modulation as an emerging approach for targeting autistic symptoms. Galantamine is an acetylcholinesterase inhibitor and an allosteric potentiator of nicotinic receptors. This study aimed to evaluate the possible effects of galantamine as an augmentative therapy to risperidone in autistic children. In this randomized, double-blind, placebo-controlled, parallel-group study, 40 outpatients aged 4-12 years who had a diagnosis of autism (DSM IV-TR) and a score of 12 or higher on the Aberrant Behavior Checklist-Community (ABC-C) Irritability subscale were equally randomized to receive either galantamine (up to 24 mg/day) or placebo, in addition to risperidone (up to 2 mg/day), for 10 weeks. We rated participants by ABC-C and a side effects checklist at baseline and at weeks 5 and 10. By the study endpoint, the galantamine-treated patients showed significantly greater improvement in the Irritability (P = 0.017) and Lethargy/Social Withdrawal (P = 0.005) subscales than the placebo group. The difference between the two groups in the frequency of side effects was not significant. In conclusion, galantamine augmentation was shown to be a relatively effective and safe augmentative strategy for alleviating some of the autism-related symptoms.
Thøgersen-Ntoumani, C; Loughren, E A; Kinnafick, F-E; Taylor, I M; Duda, J L; Fox, K R
2015-12-01
Physical activity may regulate affective experiences at work, but controlled studies are needed and there has been a reliance on retrospective accounts of experience. The purpose of the present study was to examine the effect of lunchtime walks on momentary work affect at the individual and group levels. Physically inactive employees (N = 56; M age = 47.68; 92.86% female) from a large university in the UK were randomized to immediate treatment or delayed treatment (DT). The DT participants completed both a control and intervention period. During the intervention period, participants partook in three weekly 30-min lunchtime group-led walks for 10 weeks. They completed twice daily affective reports at work (morning and afternoon) using mobile phones on two randomly chosen days per week. Multilevel modeling was used to analyze the data. Lunchtime walks improved enthusiasm, relaxation, and nervousness at work, although the pattern of results differed depending on whether between-group or within-person analyses were conducted. The intervention was effective in changing some affective states and may have broader implications for public health and workplace performance. PMID:25559067
Morin, Mélanie; Dumoulin, Chantale; Bergeron, Sophie; Mayrand, Marie-Hélène; Khalifé, Samir; Waddell, Guy; Dubois, Marie-France
2016-01-01
Provoked vestibulodynia (PVD) is a highly prevalent and debilitating condition, yet its management relies mainly on non-empirically validated interventions. Among the many causes of PVD, there is growing evidence that pelvic floor muscle (PFM) dysfunctions play an important role in its pathophysiology. Multimodal physiotherapy, which addresses these dysfunctions, is judged by experts to be highly effective and is recommended as a first-line treatment. However, the effectiveness of this promising intervention has been evaluated through only two small uncontrolled trials. The proposed bi-center, single-blind, parallel group, randomized controlled trial (RCT) aims to evaluate the efficacy of multimodal physiotherapy and compare it to a frequently used first-line treatment, topical overnight application of lidocaine, in women with PVD. A total of 212 women diagnosed with PVD according to a standardized protocol were eligible for the study and were randomly assigned to either multimodal physiotherapy or lidocaine treatment for 10 weeks. The primary outcome measure is pain during intercourse (assessed with a numerical rating scale). Secondary measures include sexual function, pain quality, psychological factors (including pain catastrophizing, anxiety, depression and fear of pain), PFM morphology and function, and patients' global impression of change. Assessments are made at baseline, post-treatment and at the 6-month follow-up. This manuscript presents and discusses the rationale, design and methodology of the first RCT investigating physiotherapy in comparison to a commonly prescribed first-line treatment, overnight topical lidocaine, for women with PVD.
NASA Astrophysics Data System (ADS)
Chinh, Pham Duc
1998-11-01
The envelopes of the overall conductivities of effective medium intergranularly random and completely random polycrystalline aggregates are compared with the available bounds on the polycrystals' properties. The geometrically realizable models cover the major parts of the property ranges permitted by the bounds, hence the estimates represent the behaviour of realistic random aggregates well, given the uncertainty in the shapes of constituent crystals.
The HEART Pathway Randomized Trial
Mahler, Simon A.; Riley, Robert F.; Hiestand, Brian C.; Russell, Gregory B.; Hoekstra, James W.; Lefebvre, Cedric W.; Nicks, Bret A.; Cline, David M.; Askew, Kim L.; Elliott, Stephanie B.; Herrington, David M.; Burke, Gregory L.; Miller, Chadwick D.
2015-01-01
Background The HEART Pathway is a decision aid designed to identify emergency department patients with acute chest pain for early discharge. No randomized trials have compared the HEART Pathway with usual care. Methods and Results Adult emergency department patients with symptoms related to acute coronary syndrome without ST-elevation on ECG (n=282) were randomized to the HEART Pathway or usual care. In the HEART Pathway arm, emergency department providers used the HEART score, a validated decision aid, and troponin measures at 0 and 3 hours to identify patients for early discharge. Usual care was based on American College of Cardiology/American Heart Association guidelines. The primary outcome, objective cardiac testing (stress testing or angiography), and secondary outcomes, index length of stay, early discharge, and major adverse cardiac events (death, myocardial infarction, or coronary revascularization), were assessed at 30 days by phone interview and record review. Participants had a mean age of 53 years, 16% had previous myocardial infarction, and 6% (95% confidence interval, 3.6%–9.5%) had major adverse cardiac events within 30 days of randomization. Compared with usual care, use of the HEART Pathway decreased objective cardiac testing at 30 days by 12.1% (68.8% versus 56.7%; P=0.048) and length of stay by 12 hours (9.9 versus 21.9 hours; P=0.013) and increased early discharges by 21.3% (39.7% versus 18.4%; P<0.001). No patients identified for early discharge had major adverse cardiac events within 30 days. Conclusions The HEART Pathway reduces objective cardiac testing during 30 days, shortens length of stay, and increases early discharges. These important efficiency gains occurred without any patients identified for early discharge suffering MACE at 30 days. PMID:25737484
Random Matrix Theory and Econophysics
NASA Astrophysics Data System (ADS)
Rosenow, Bernd
2000-03-01
Random Matrix Theory (RMT) [1] is used in many branches of physics as a ``zero information hypothesis''. It describes generic behavior of different classes of systems, while deviations from its universal predictions allow the identification of system-specific properties. We use methods of RMT to analyze the cross-correlation matrix C of stock price changes [2] of the largest 1000 US companies. In addition to its scientific interest, the study of correlations between the returns of different stocks is also of practical relevance in quantifying the risk of a given stock portfolio. We find [3,4] that the statistics of most of the eigenvalues of the spectrum of C agree with the predictions of RMT, while there are deviations for some of the largest eigenvalues. We interpret these deviations as a system-specific property, i.e., as containing genuine information about correlations in the stock market. We demonstrate that C shares universal properties with the Gaussian orthogonal ensemble of random matrices. Furthermore, we analyze the eigenvectors of C through their inverse participation ratio and find eigenvectors with large ratios at both edges of the eigenvalue spectrum - a situation reminiscent of localization theory results. This work was done in collaboration with V. Plerou, P. Gopikrishnan, T. Guhr, L.A.N. Amaral, and H.E. Stanley and is related to recent work of Laloux et al. 1. T. Guhr, A. Müller Groeling, and H.A. Weidenmüller, ``Random Matrix Theories in Quantum Physics: Common Concepts'', Phys. Rep. 299, 190 (1998). 2. See, e.g. R.N. Mantegna and H.E. Stanley, Econophysics: Correlations and Complexity in Finance (Cambridge University Press, Cambridge, England, 1999). 3. V. Plerou, P. Gopikrishnan, B. Rosenow, L.A.N. Amaral, and H.E. Stanley, ``Universal and Nonuniversal Properties of Cross Correlations in Financial Time Series'', Phys. Rev. Lett. 83, 1471 (1999). 4. V. Plerou, P. Gopikrishnan, T. Guhr, B. Rosenow, L.A.N. Amaral, and H.E. Stanley, ``Random Matrix Theory
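The RMT comparison sketched above amounts to checking the empirical eigenvalues of C against the Marchenko-Pastur band λ± = (1 ± √(N/T))² expected for purely random returns, and probing eigenvector localization through the inverse participation ratio. A hedged sketch on synthetic, uncorrelated returns (not the market data of the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 100, 500                       # assets, time steps (synthetic)
returns = rng.standard_normal((N, T))
C = np.corrcoef(returns)              # empirical cross-correlation matrix
eigvals, vecs = np.linalg.eigh(C)

# Marchenko-Pastur band for purely random returns
q = N / T
lam_minus, lam_plus = (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2
inside = np.mean((eigvals >= lam_minus) & (eigvals <= lam_plus))
print(f"fraction of eigenvalues inside MP band: {inside:.2f}")

# Inverse participation ratio per eigenvector; large values mean localization
ipr = np.sum(vecs ** 4, axis=0)
```

For real market data, the interesting information sits in the few eigenvalues that escape the band, together with their eigenvectors' participation ratios.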
Randomized selection on the GPU
Monroe, Laura Marie; Wendelberger, Joanne R; Michalak, Sarah E
2011-01-13
We implement here a fast and memory-sparing probabilistic top-N selection algorithm on the GPU. To our knowledge, this is the first direct selection algorithm in the literature for the GPU. The algorithm proceeds via a probabilistic guess-and-check process searching for the Nth element. It always gives a correct result and always terminates. The use of randomization reduces the amount of data that needs heavy processing, and so reduces the average time required for the algorithm. Probabilistic Las Vegas algorithms of this kind are a form of stochastic optimization and can be well suited to more general parallel processors with limited amounts of fast memory.
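The probabilistic guess-and-check idea can be sketched on the CPU as a Las Vegas quickselect: guess a random pivot, check by partitioning, and recurse into the side that contains the Nth element. The answer is always exact; randomization only affects running time. This is an illustrative serial version, not the paper's GPU implementation:

```python
import random

def random_select(data, n):
    """Return the n-th smallest element (1-indexed) via random pivoting.

    A Las Vegas algorithm: randomization affects running time only;
    the returned answer is always correct."""
    assert 1 <= n <= len(data)
    items = list(data)
    while True:
        pivot = random.choice(items)           # probabilistic guess
        lo = [x for x in items if x < pivot]   # check: partition and count
        hi = [x for x in items if x > pivot]
        n_eq = len(items) - len(lo) - len(hi)
        if n <= len(lo):
            items = lo                         # answer is among the smaller
        elif n <= len(lo) + n_eq:
            return pivot                       # guess confirmed
        else:
            n -= len(lo) + n_eq
            items = hi                         # answer is among the larger

vals = [9, 1, 7, 3, 5, 8, 2]
print(random_select(vals, 3))  # 3rd smallest -> 3
```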
Quantum random walks without walking
Manouchehri, K.; Wang, J. B.
2009-12-15
Quantum random walks have received much interest due to their nonintuitive dynamics, which may hold the key to a new generation of quantum algorithms. What remains a major challenge is a physical realization that is experimentally viable and not limited to special connectivity criteria. We present a scheme for walking on arbitrarily complex graphs, which can be realized using a variety of quantum systems such as a Bose-Einstein condensate trapped inside an optical lattice. This scheme is particularly elegant since the walker is not required to physically step between the nodes; only flipping coins is sufficient.
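For contrast with the physical scheme above, the standard discrete-time coined quantum walk on a line (Hadamard coin plus conditional shift) can be simulated directly; its hallmark is ballistic spreading, with the standard deviation growing linearly in the number of steps rather than as its square root. A sketch, not the authors' optical-lattice realization:

```python
import numpy as np

def hadamard_walk(steps):
    """Discrete-time quantum walk on a line with a Hadamard coin.

    State: amplitudes psi[x, c] for position x and coin c in {0, 1}.
    Returns the position probability distribution after `steps` steps."""
    n = 2 * steps + 1
    psi = np.zeros((n, 2), dtype=complex)
    psi[steps, 0] = 1.0                      # walker starts at the origin
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    for _ in range(steps):
        psi = psi @ H.T                      # coin flip
        shifted = np.zeros_like(psi)
        shifted[1:, 0] = psi[:-1, 0]         # coin 0 moves right
        shifted[:-1, 1] = psi[1:, 1]         # coin 1 moves left
        psi = shifted
    return np.sum(np.abs(psi) ** 2, axis=1)

p = hadamard_walk(50)
x = np.arange(-50, 51)
spread = np.sqrt(np.sum(p * x ** 2) - np.sum(p * x) ** 2)
print(f"total probability {p.sum():.6f}, spread {spread:.1f}")
```

After 50 steps the spread is several times the classical value of √50 ≈ 7.1, illustrating the nonintuitive dynamics the abstract refers to.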
Local leaders in random networks
NASA Astrophysics Data System (ADS)
Blondel, Vincent D.; Guillaume, Jean-Loup; Hendrickx, Julien M.; de Kerchove, Cristobald; Lambiotte, Renaud
2008-03-01
We consider local leaders in random uncorrelated networks, i.e., nodes whose degree is higher than or equal to the degree of all their neighbors. An analytical expression is found for the probability for a node of degree k to be a local leader. This quantity is shown to exhibit a transition from a situation where high-degree nodes are local leaders to a situation where they are not, when the tail of the degree distribution behaves like the power law ˜k-γc with γc=3 . Theoretical results are verified by computer simulations, and the importance of finite-size effects is discussed.
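The definition of a local leader translates directly into code, which is also how the predicted transition around γ_c = 3 could be checked by simulation. A minimal sketch (the paper's analytical expression is not reproduced here):

```python
from collections import defaultdict

def local_leaders(edges):
    """Nodes whose degree is >= the degree of every one of their neighbors."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    deg = {u: len(nbrs) for u, nbrs in adj.items()}
    return {u for u in adj if all(deg[u] >= deg[v] for v in adj[u])}

# Star graph: the hub is the unique local leader
star = [(0, i) for i in range(1, 6)]
print(local_leaders(star))  # {0}

# Triangle: every node ties with its neighbors, so all are local leaders
print(local_leaders([(0, 1), (1, 2), (0, 2)]))  # {0, 1, 2}
```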
Random bearings and their stability.
Mahmoodi Baram, Reza; Herrmann, Hans J
2005-11-25
Self-similar space-filling bearings have been proposed some time ago as models for the motion of tectonic plates and the appearance of seismic gaps. These models have two features which, however, seem unrealistic, namely, high symmetry in the arrangement of the particles and the lack of a lower cutoff in the size of the particles. In this work, an algorithm for generating random bearings in both two and three dimensions is presented. Introducing a lower cutoff for the sizes of the particles, the instabilities of the bearing under an external force, such as gravity, are studied. PMID:16384225
Generation of pseudo-random numbers
NASA Technical Reports Server (NTRS)
Howell, L. W.; Rheinfurth, M. H.
1982-01-01
Practical methods for generating acceptable random numbers from a variety of probability distributions which are frequently encountered in engineering applications are described. The speed, accuracy, and guarantee of statistical randomness of the various methods are discussed.
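One textbook route of the kind such reports survey is inverse-transform sampling driven by a linear congruential generator: a uniform U on [0, 1) maps to an exponential variate via X = -ln(1 - U)/λ. A sketch using standard (Numerical Recipes) LCG constants, offered as an illustration rather than the report's specific methods:

```python
import math

def lcg(seed, a=1664525, c=1013904223, m=2 ** 32):
    """Linear congruential generator yielding uniforms on [0, 1)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

def exponential(uniforms, lam):
    """Inverse-transform sampling: X = -ln(1 - U) / lambda."""
    for u in uniforms:
        yield -math.log(1.0 - u) / lam

gen = exponential(lcg(seed=42), lam=2.0)
samples = [next(gen) for _ in range(10_000)]
mean_est = sum(samples) / len(samples)
print(mean_est)  # close to 1/lambda = 0.5
```

The same inverse-transform pattern works for any distribution with an invertible cumulative distribution function; other distributions typically call for rejection or composition methods.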
Self-correcting random number generator
Humble, Travis S.; Pooser, Raphael C.
2016-09-06
A system and method for generating random numbers. The system may include a random number generator (RNG), such as a quantum random number generator (QRNG) configured to self-correct or adapt in order to substantially achieve randomness from the output of the RNG. By adapting, the RNG may generate a random number that may be considered random regardless of whether the random number itself is tested as such. As an example, the RNG may include components to monitor one or more characteristics of the RNG during operation, and may use the monitored characteristics as a basis for adapting, or self-correcting, to provide a random number according to one or more performance criteria.
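A classical example of the self-correction idea, though not necessarily the mechanism of the patented system, is the von Neumann extractor: raw bits are taken in pairs, 01 emits 0, 10 emits 1, and 00/11 are discarded, yielding unbiased output whenever the raw bits are independent with a constant bias:

```python
def von_neumann(bits):
    """Debias an i.i.d. but possibly biased bit stream.

    Pairs (0,1) -> 0 and (1,0) -> 1 occur with equal probability for
    independent bits of any fixed bias; (0,0) and (1,1) are discarded."""
    out = []
    for b0, b1 in zip(bits[::2], bits[1::2]):
        if b0 != b1:
            out.append(b0)
    return out

print(von_neumann([0, 1, 1, 0, 0, 0, 1, 1, 1, 0]))  # [0, 1, 1]
```

The price of the correction is throughput: the more biased the source, the more pairs are discarded.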
Gabriels, Robin L.; Pan, Zhaoxing; Dechant, Briar; Agnew, John A.; Brim, Natalie; Mesibov, Gary
2015-01-01
Objective This study expands previous equine-assisted intervention research by evaluating the effectiveness of therapeutic horseback riding (THR) on self-regulation, socialization, communication, adaptive, and motor behaviors in children with autism spectrum disorder (ASD). Method Participants with ASD (ages 6–16 years; N=127) were stratified by nonverbal IQ standard scores (≤ 85 or > 85) and randomized to one of two groups for 10 weeks: THR intervention or a barn activity (BA) control group without horses that employed similar methods. The fidelity of the THR intervention was monitored. Participants were evaluated within one month pre- and post-intervention by raters blind to intervention conditions and unblinded caregiver questionnaires. During the intervention, caregivers rated participants’ behaviors weekly. Results Intent-to-treat analysis conducted on the 116 participants who completed a baseline assessment (THR n = 58; BA control n = 58) revealed significant improvements in the THR group compared to the control on measures of irritability (primary outcome) (p=.002; effect size [ES]=.50) and hyperactivity (p=.001; ES=0.53), beginning by week five of the intervention. Significant improvements in the THR group were also observed on a measure of social cognition (p=.05, ES=.41) and social communication (p=.003; ES =.63), along with the total number of words (p=.01; ES=.54) and new words (p=.01; ES=.54) spoken during a standardized language sample. Sensitivity analyses adjusting for age, IQ, and per-protocol analyses produced consistent results. Conclusion This is the first large-scale randomized, controlled trial demonstrating efficacy of THR for the ASD population, and findings are consistent with previous equine-assisted intervention studies. Clinical trial registration information Trial of Therapeutic Horseback Riding in Children and Adolescents With Autism Spectrum Disorder; http://clinicaltrials.gov/; NCT02301195. PMID:26088658
Grøndahl, Jan Robert; Rosvold, Elin Olaug
2008-01-01
Background Hypnosis treatment in general practice is a rather new concept. This pilot study was performed to evaluate the effect of a standardized hypnosis treatment used in general practice for patients with chronic widespread pain (CWP). Methods The study was designed as a randomized, control-group-controlled study. Sixteen patients were randomized into a treatment group or a control group of eight patients each. Seven patients in the treatment group completed the schedule. After the control period, five of the patients in the control group also received treatment, for a total of 12 patients who completed the treatment sessions. The intervention group went through a standardized hypnosis treatment with ten consecutive weekly therapeutic sessions, each lasting about 30 minutes and focusing on ego-strengthening, relaxation, releasing muscular tension and increasing self-efficacy. A questionnaire was developed to assess the symptoms before and after the 10-week period, and the results were interpolated onto a scale from 0 to 100, with increasing numbers representing increasing suffering. Data were analyzed by means of t-tests. Results The treatment group improved (change from 62.5 to 55.4), while the control group deteriorated (change from 37.2 to 45.1) (p = 0.045). The 12 patients who completed the treatment showed a mean improvement from 51.5 to 41.6 (p = 0.046). One year later the corresponding result was 41.3, indicating a persisting improvement. Conclusion The study indicates that hypnosis treatment may have a positive effect on pain and quality of life for patients with chronic muscular pain. Given the limited number of patients, more studies should be conducted to confirm the results. Trial Registration The study was registered in ClinicalTrials.gov and released 27.08.07, registration number NCT00521807, approval number 05032001. PMID:18801190
Bhat, Shripathy M.; Latha, K.S.
2016-01-01
Introduction Chronic Kidney Disease (CKD) is becoming a major public health problem worldwide. The very diagnosis of CKD brings a plethora of psychological problems that add to the agony of the debilitating illness. Financial difficulties, on top of the excruciating physical burden of the disease, give rise to a series of psychosocial issues. Anxiety and depression are two major concerns that need to be managed effectively to sustain the life of people undergoing haemodialysis. Aim The study aimed at finding the effect of Cognitive Behaviour Therapy (CBT) on anxiety and depression among people undergoing haemodialysis. Materials and Methods An experimental approach with a randomized controlled trial design was adopted for the study. The instruments used for data collection were a background proforma and the Hospital Anxiety and Depression Scale (HADS). A total of 150 subjects undergoing haemodialysis in a tertiary care hospital of South Karnataka were screened against the inclusion and exclusion criteria, and 80 participants were recruited for the study. Through computerized block randomization, 40 each were allotted to the experimental and control groups, of whom 33 and 34, respectively, completed the study. CBT, a structured individual therapy of cognitive, behavioural and didactic techniques, with 10 weekly sessions, was administered to the experimental group. Non-directed counseling, a psychological intervention with ten weekly sessions of individual counseling, was given to the control group. Results The findings of the study revealed that there was a significant reduction of mean anxiety (F=76.739, p=0.001) and depression (F=57.326, p=0.001) in the experimental group when compared with the control group. Conclusion The researchers concluded that CBT can be effectively utilized for people undergoing haemodialysis to help them gain control over their negative thoughts, thereby reducing anxiety and depression. PMID:27656536
Gudenkauf, Lisa M.; Antoni, Michael H.; Stagl, Jamie M.; Lechner, Suzanne C.; Jutagir, Devika R.; Bouchard, Laura C.; Blomberg, Bonnie B.; Glück, Stefan; Derhagopian, Robert P.; Giron, Gladys L.; Avisar, Eli; Torres-Salichs, Manuel A.; Carver, Charles S.
2015-01-01
Objective Women with breast cancer (BCa) report elevated distress post-surgery. Group-based cognitive-behavioral stress management (CBSM) following surgery improves psychological adaptation, though its key mechanisms remain speculative. This randomized controlled dismantling trial compared two interventions featuring elements thought to drive CBSM effects: a 5-week Cognitive-Behavioral Training (CBT) and 5-week Relaxation Training (RT) vs. a 5-week Health Education (HE) control group. Method Women with stage 0-III BCa (N = 183) were randomized to CBT, RT, or HE condition 2–10 weeks post-surgery. Psychosocial measures were collected at baseline (T1) and post-intervention (T2). Repeated-measures ANOVAs tested whether CBT and RT treatments improved primary measures of psychological adaptation and secondary measures of stress management resource perceptions from pre- to post-intervention relative to HE. Results Both CBT and RT groups reported reduced depressive affect. The CBT group reported improved emotional well-being/quality of life and less cancer-specific thought intrusions. The RT group reported improvements on illness-related social disruption. Regarding stress management resources, the CBT group reported increased reliability of social support networks, while the RT group reported increased confidence in relaxation skills. Psychological adaptation and stress management resource constructs were unchanged in the HE control group. Conclusions Non-metastatic breast cancer patients participating in two forms of brief, 5-week group-based stress management intervention after surgery showed improvements in psychological adaptation and stress management resources compared to an attention-matched control group. Findings provide preliminary support suggesting that using brief group-based stress management interventions may promote adaptation among non-metastatic breast cancer patients. PMID:25939017
Peabody, John W; Shimkhada, Riti; Quimbo, Stella; Solon, Orville; Javier, Xylee; McCulloch, Charles
2014-01-01
Improving clinical performance using measurement and payment incentives, including pay for performance (or P4P), has, so far, shown modest to no benefit on patient outcomes. Our objective was to assess the impact of a P4P programme on paediatric health outcomes in the Philippines. We used data from the Quality Improvement Demonstration Study. In this study, the P4P intervention, introduced in 2004, was randomly assigned to 10 community district hospitals, which were matched to 10 control sites. At all sites, physician quality was measured using Clinical Performance Vignettes (CPVs) among randomly selected physicians every 6 months over a 36-month period. In the hospitals randomized to the P4P intervention, physicians received bonus payments if they met qualifying scores on the CPV. We measured health outcomes 4–10 weeks after hospital discharge among children 5 years of age and under who had been hospitalized for diarrhoea and pneumonia (the two most common illnesses affecting this age cohort) and had been under the care of physicians participating in the study. Health outcomes data collection was done at baseline/pre-intervention and 2 years post-intervention on the following post-discharge outcomes: (1) age-adjusted wasting, (2) C-reactive protein in blood, (3) haemoglobin level and (4) parental assessment of child’s health using general self-reported health (GSRH) measure. To evaluate changes in health outcomes in the control vs intervention sites over time (baseline vs post-intervention), we used a difference-in-difference logistic regression analysis, controlling for potential confounders. We found an improvement of 7 and 9 percentage points in GSRH and wasting over time (post-intervention vs baseline) in the intervention sites relative to the control sites (P ≤ 0.001). The results from this randomized social experiment indicate that the introduction of a performance-based incentive programme, which included measurement and feedback, led to improvements
Stipčević, Mario
2016-03-01
In this work, a new type of elementary logic circuit, named random flip-flop (RFF), is proposed, experimentally realized, and studied. Unlike conventional Boolean logic circuits whose action is deterministic and highly reproducible, the action of a RFF is intentionally made maximally unpredictable and, in the proposed realization, derived from a fundamentally random process of emission and detection of light quanta. We demonstrate novel applications of RFF in randomness preserving frequency division, random frequency synthesis, and random number generation. Possible usages of these applications in the information and communication technology, cryptographic hardware, and testing equipment are discussed. PMID:27036825
Quantifying errors without random sampling
Phillips, Carl V; LaPole, Luwanna M
2003-01-01
Background All quantifications of mortality, morbidity, and other health measures involve numerous sources of error. The routine quantification of random sampling error makes it easy to forget that other sources of error can and should be quantified. When a quantification does not involve sampling, error is almost never quantified and results are often reported in ways that dramatically overstate their precision. Discussion We argue that the precision implicit in typical reporting is problematic and sketch methods for quantifying the various sources of error, building up from simple examples that can be solved analytically to more complex cases. There are straightforward ways to partially quantify the uncertainty surrounding a parameter that is not characterized by random sampling, such as limiting reported significant figures. We present simple methods for doing such quantifications, and for incorporating them into calculations. More complicated methods become necessary when multiple sources of uncertainty must be combined. We demonstrate that Monte Carlo simulation, using available software, can estimate the uncertainty resulting from complicated calculations with many sources of uncertainty. We apply the method to the current estimate of the annual incidence of foodborne illness in the United States. Summary Quantifying uncertainty from systematic errors is practical. Reporting this uncertainty would more honestly represent study results, help show the probability that estimated values fall within some critical range, and facilitate better targeting of further research. PMID:12892568
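The Monte Carlo approach described can be sketched in a few lines: represent each non-sampling source of uncertainty as a distribution, draw from all of them jointly, push the draws through the calculation, and summarize the spread of the output. The input numbers below are hypothetical, not the foodborne-illness figures from the paper:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Hypothetical inputs: a reported count with systematic (non-sampling)
# uncertainty, and an expert-elicited underreporting multiplier
cases_reported = rng.normal(50_000, 5_000, n)
underreport = rng.uniform(1.5, 3.0, n)

total = cases_reported * underreport          # push draws through the model
estimate = total.mean()
lo, hi = np.percentile(total, [2.5, 97.5])
print(f"estimate {estimate:,.0f}, 95% uncertainty interval {lo:,.0f}-{hi:,.0f}")
```

Reporting the full interval, rather than a single number with spurious significant figures, is exactly the honesty in precision the abstract argues for.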
Classical randomness in quantum measurements
NASA Astrophysics Data System (ADS)
Mauro D'Ariano, Giacomo; Lo Presti, Paoloplacido; Perinotti, Paolo
2005-07-01
Similarly to quantum states, also quantum measurements can be 'mixed', corresponding to a random choice within an ensemble of measuring apparatuses. Such mixing is equivalent to a sort of hidden variable, which produces a noise of purely classical nature. It is then natural to ask which apparatuses are indecomposable, i.e. do not correspond to any random choice of apparatuses. This problem is interesting not only for foundations, but also for applications, since most optimization strategies give optimal apparatuses that are indecomposable. Mathematically the problem is posed describing each measuring apparatus by a positive operator-valued measure (POVM), which gives the statistics of the outcomes for any input state. The POVMs form a convex set, and in this language the indecomposable apparatuses are represented by extremal points—the analogous of 'pure states' in the convex set of states. Differently from the case of states, however, indecomposable POVMs are not necessarily rank-one, e.g. von Neumann measurements. In this paper we give a complete classification of indecomposable apparatuses (for discrete spectrum), by providing different necessary and sufficient conditions for extremality of POVMs, along with a simple general algorithm for the decomposition of a POVM into extremals. As an interesting application, 'informationally complete' measurements are analysed in this respect. The convex set of POVMs is fully characterized by determining its border in terms of simple algebraic properties of the corresponding POVMs.
Evolutionary dynamics on random structures
Fraser, S.M.; Reidys, C.M.
1997-04-01
In this paper the authors consider the evolutionary dynamics of populations of sequences, under a process of selection at the phenotypic level of structures. They use a simple graph-theoretic representation of structures which captures well the properties of the mapping between RNA sequences and their molecular structure. Each sequence is assigned to a structure by means of a sequence-to-structure mapping. The authors make the basic assumption that every fitness landscape can be factorized through the structures. The set of all sequences that map into a particular random structure can then be modeled as a random graph in sequence space, the so-called neutral network. They analyze in detail how an evolving population searches for new structures, in particular how they switch from one neutral network to another. They verify that transitions occur directly between neutral networks, and study the effects of different population sizes and the influence of the relatedness of the structures on these transitions. In fitness landscapes where several structures exhibit high fitness, the authors then study evolutionary paths on the structural level taken by the population during its search. They present a new way of expressing structural similarities which are shown to have relevant implications for the time evolution of the population.
The wasteland of random supergravities
NASA Astrophysics Data System (ADS)
Marsh, David; McAllister, Liam; Wrase, Timm
2012-03-01
We show that in a general 𝒩 = 1 supergravity with N ≫ 1 scalar fields, an exponentially small fraction of the de Sitter critical points are metastable vacua. Taking the superpotential and Kähler potential to be random functions, we construct a random matrix model for the Hessian matrix, which is well-approximated by the sum of a Wigner matrix and two Wishart matrices. We compute the eigenvalue spectrum analytically from the free convolution of the constituent spectra and find that in typical configurations, a significant fraction of the eigenvalues are negative. Building on the Tracy-Widom law governing fluctuations of extreme eigenvalues, we determine the probability P of a large fluctuation in which all the eigenvalues become positive. Strong eigenvalue repulsion makes this extremely unlikely: we find P ∝ exp(−cN^p), with c, p being constants. For generic critical points we find p ≈ 1.5, while for approximately supersymmetric critical points, p ≈ 1.3. Our results have significant implications for the counting of de Sitter vacua in string theory, but the number of vacua remains vast.
Random sources for cusped beams.
Li, Jia; Wang, Fei; Korotkova, Olga
2016-08-01
We introduce two novel classes of partially coherent sources whose degrees of coherence are described by the rectangular Lorentz-correlated Schell-model (LSM) and rectangular fractional multi-Gaussian-correlated Schell-model (FMGSM) functions. Based on the generalized Collins formula, analytical expressions are derived for the spectral density distributions of these beams propagating through a stigmatic ABCD optical system. It is shown that beams belonging to both classes form the spectral density apex that is much higher and sharper than that generated by the Gaussian Schell-model (GSM) beam with a comparable coherence state. We experimentally generate these beams by using a nematic, transmissive spatial light modulator (SLM) that serves as a random phase screen controlled by a computer. The experimental data is consistent with theoretical predictions. Moreover, it is illustrated that the FMGSM beam generated in our experiments has a better focusing capacity than the GSM beam with the same coherence state. The applications that can potentially benefit from the use of novel beams range from material surface processing, to communications and sensing through random media. PMID:27505746
Randomized approximate nearest neighbors algorithm.
Jones, Peter Wilcox; Osipov, Andrei; Rokhlin, Vladimir
2011-09-20
We present a randomized algorithm for the approximate nearest neighbor problem in d-dimensional Euclidean space. Given N points {x(j)} in R(d), the algorithm attempts to find k nearest neighbors for each of x(j), where k is a user-specified integer parameter. The algorithm is iterative, and its running time requirements are proportional to T·N·(d·(log d) + k·(d + log k)·(log N)) + N·k(2)·(d + log k), with T the number of iterations performed. The memory requirements of the procedure are of the order N·(d + k). A by-product of the scheme is a data structure, permitting a rapid search for the k nearest neighbors among {x(j)} for an arbitrary point x ∈ R(d). The cost of each such query is proportional to T·(d·(log d) + log(N/k)·k·(d + log k)), and the memory requirements for the requisite data structure are of the order N·(d + k) + T·(d + N). The algorithm utilizes random rotations and a basic divide-and-conquer scheme, followed by a local graph search. We analyze the scheme's behavior for certain types of distributions of {x(j)} and illustrate its performance via several numerical examples.
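The core randomization step, randomly rotating the point set so that sorting along one axis proposes plausible neighbour candidates, can be sketched as follows. The full algorithm uses a divide-and-conquer split plus a local graph search; this toy version keeps only the random-rotation idea, and all parameter choices are illustrative.

```python
import numpy as np

def approx_knn(points, k, iters=8, window=None, seed=0):
    """Toy randomized k-nearest-neighbour search: in each iteration,
    apply a random rotation, sort along the first rotated axis, and
    propose points adjacent in that order as candidates; the union of
    candidates over iterations is then ranked by true distance."""
    n, d = points.shape
    window = window or k
    rng = np.random.default_rng(seed)
    cand = [set() for _ in range(n)]
    for _ in range(iters):
        # Random orthogonal matrix via QR of a Gaussian matrix.
        q, _ = np.linalg.qr(rng.standard_normal((d, d)))
        order = np.argsort(points @ q[:, 0])
        for pos, i in enumerate(order):
            for j in order[max(0, pos - window):pos + window + 1]:
                if j != i:
                    cand[int(i)].add(int(j))
    result = []
    for i in range(n):
        c = np.array(sorted(cand[i]))
        dist = np.linalg.norm(points[c] - points[i], axis=1)
        result.append(c[np.argsort(dist)][:k])  # k best candidates by true distance
    return result
```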
Randomness in Sequence Evolution Increases over Time.
Wang, Guangyu; Sun, Shixiang; Zhang, Zhang
2016-01-01
The second law of thermodynamics states that entropy, as a measure of randomness in a system, increases over time. Although studies have investigated biological sequence randomness from different aspects, it remains unknown whether sequence randomness changes over time and whether this change is consistent with the second law of thermodynamics. To capture the dynamics of randomness in molecular sequence evolution, here we detect sequence randomness based on a collection of eight statistical random tests and investigate the randomness variation of coding sequences with an application to Escherichia coli. Given that core/essential genes are more ancient than specific/non-essential genes, our results clearly show that core/essential genes are more random than specific/non-essential genes and accordingly indicate that sequence randomness indeed increases over time, consistent with the second law of thermodynamics. We further find that an increase in sequence randomness leads to increasing randomness of GC content and longer sequence length. Taken together, our study presents, for the first time, the finding that sequence randomness increases over time, which may provide profound insights for unveiling the underlying mechanisms of molecular sequence evolution. PMID:27224236
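One simple statistical proxy for sequence randomness is the Shannon entropy of the k-mer distribution: a periodic sequence scores low, a well-mixed one scores high. This is only an illustration of the idea; it is not one of the eight formal randomness tests used by the authors.

```python
from collections import Counter
from math import log2

def kmer_entropy(seq, k=3):
    """Shannon entropy (in bits) of the distribution of overlapping
    k-mers in a sequence; higher values indicate a more random,
    less repetitive sequence."""
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    counts = Counter(kmers)
    total = len(kmers)
    return -sum(c / total * log2(c / total) for c in counts.values())

# A strictly periodic sequence is less "random" than a mixed one.
low = kmer_entropy("ACGT" * 100)
high = kmer_entropy("ACGTTGCAAGTCGATCCGTA" * 20)
```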
Randomness, Its Meanings and Educational Implications.
ERIC Educational Resources Information Center
Batanero, Carmen; Green, David R.; Serrano, Luis Romero
1998-01-01
Presents an analysis of the different meanings associated with randomness throughout its historical evolution as well as a summary of research concerning the subjective perception of randomness by children and adolescents. Some teaching suggestions are included to help students gradually understand the characteristics of random phenomena. Contains…
Cluster randomization: a trap for the unwary.
Underwood, M; Barnett, A; Hajioff, S
1998-01-01
Controlled trials that randomize by practice can provide robust evidence to inform patient care. However, compared with randomizing by each individual patient, this approach may have substantial implications for sample size calculations and the interpretation of results. An increased awareness of these effects will improve the quality of research based on randomization by practice. PMID:9624757
Source-Independent Quantum Random Number Generation
NASA Astrophysics Data System (ADS)
Cao, Zhu; Zhou, Hongyi; Yuan, Xiao; Ma, Xiongfeng
2016-01-01
Quantum random number generators can provide genuine randomness by appealing to the fundamental principles of quantum mechanics. In general, a physical generator contains two parts—a randomness source and its readout. The source is essential to the quality of the resulting random numbers; hence, it needs to be carefully calibrated and modeled to achieve information-theoretical provable randomness. However, in practice, the source is a complicated physical system, such as a light source or an atomic ensemble, and any deviations in the real-life implementation from the theoretical model may affect the randomness of the output. To close this gap, we propose a source-independent scheme for quantum random number generation in which output randomness can be certified, even when the source is uncharacterized and untrusted. In our randomness analysis, we make no assumptions about the dimension of the source. For instance, multiphoton emissions are allowed in optical implementations. Our analysis takes into account the finite-key effect with the composable security definition. In the limit of large data size, the length of the input random seed is exponentially small compared to that of the output random bit. In addition, by modifying a quantum key distribution system, we experimentally demonstrate our scheme and achieve a randomness generation rate of over 5 × 10^3 bit/s.
Parabolic Anderson Model in a Dynamic Random Environment: Random Conductances
NASA Astrophysics Data System (ADS)
Erhard, D.; den Hollander, F.; Maillard, G.
2016-06-01
The parabolic Anderson model is defined as the partial differential equation ∂u(x,t)/∂t = κΔu(x,t) + ξ(x,t)u(x,t), x ∈ ℤ^d, t ≥ 0, where κ ∈ [0, ∞) is the diffusion constant, Δ is the discrete Laplacian, and ξ is a dynamic random environment that drives the equation. The initial condition u(x,0) = u_0(x), x ∈ ℤ^d, is typically taken to be non-negative and bounded. The solution of the parabolic Anderson equation describes the evolution of a field of particles performing independent simple random walks with binary branching: particles jump at rate 2dκ, split into two at rate ξ ∨ 0, and die at rate (−ξ) ∨ 0. In earlier work we looked at the Lyapunov exponents λ_p(κ) = lim_{t→∞} (1/t) log E([u(0,t)]^p)^{1/p}, p ∈ ℕ, and λ_0(κ) = lim_{t→∞} (1/t) log u(0,t). For the former we derived quantitative results on the κ-dependence for four choices of ξ: space-time white noise, independent simple random walks, the exclusion process and the voter model. For the latter we obtained qualitative results under certain space-time mixing conditions on ξ. In the present paper we investigate what happens when κΔ is replaced by Δ^𝓚, where 𝓚 = {𝓚(x, y) : x, y ∈ ℤ^d, x ∼ y} is a collection of random conductances between neighbouring sites replacing the constant conductances κ in the homogeneous model. We show that the associated annealed Lyapunov exponents λ_p(𝓚), p ∈ ℕ, are given by the formula λ_p(𝓚) = sup{λ_p(κ) : κ ∈ Supp(𝓚)}, where, for a fixed realisation of 𝓚, Supp(𝓚) is the set of values taken by the 𝓚-field. We also show that for the associated quenched Lyapunov exponent λ_0(𝓚) this formula only provides a lower bound, and we conjecture that an upper bound holds when Supp(𝓚) is replaced by its convex hull. Our proof is valid for three classes of reversible ξ, and for all 𝓚
Functional methods for waves in random media
NASA Technical Reports Server (NTRS)
Chow, P. L.
1981-01-01
Some basic ideas in functional methods for waves in random media are illustrated through a simple random differential equation. These methods are then generalized to solve certain random parabolic equations via an exponential representation given by the Feynman-Kac formula. It is shown that these functional methods are applicable to a number of problems in random wave propagation. They include the forward-scattering approximation in Gaussian white-noise media; the solution of the optical beam propagation problem by a phase-integral method; the high-frequency scattering by bounded random media; and a derivation of approximate moment equations from the functional integral representation.
Certifying Unpredictable Randomness from Quantum Nonlocality
NASA Astrophysics Data System (ADS)
Bierhorst, Peter
2015-03-01
A device-independent quantum randomness protocol takes an initial random seed as input and then expands it into a longer random string. It has been proven that if the initial random seed is trusted to be unpredictable, then the longer output string can also be certified to be unpredictable by an experimental violation of Bell's inequality. It has furthermore been argued that the initial random seed may not need to be truly unpredictable, but only uncorrelated to specific parts of the Bell experiment. In this work, we demonstrate rigorously that this is indeed true, under assumptions related to "no superdeterminism/no conspiracy" concepts along with the no-signaling assumption. So if we assume that superluminal signaling is impossible, then a loophole-free test of Bell's inequality would be able to generate provably unpredictable randomness from an input source of (potentially predictable) classical randomness.
NASA Technical Reports Server (NTRS)
Kester, DO; Bontekoe, Tj. Romke
1994-01-01
In order to make the best high resolution images of IRAS data it is necessary to incorporate any knowledge about the instrument into a model: the IRAS model. This is necessary since every remaining systematic effect will be amplified by any high resolution technique into spurious artifacts in the images. The search for random noise is in fact the never-ending quest for better quality results, and can only be obtained by better models. The Dutch high-resolution effort has resulted in HIRAS which drives the MEMSYS5 algorithm. It is specifically designed for IRAS image construction. A detailed description of HIRAS with many results is in preparation. In this paper we emphasize many of the instrumental effects incorporated in the IRAS model, including our improved 100 micron IRAS response functions.
Flow Through Randomly Curved Manifolds
Mendoza, M.; Succi, S.; Herrmann, H. J.
2013-01-01
We present a computational study of the transport properties of campylotic (intrinsically curved) media. It is found that the relation between the flow through a campylotic media, consisting of randomly located curvature perturbations, and the average Ricci scalar of the system, exhibits two distinct functional expressions, depending on whether the typical spatial extent of the curvature perturbation lies above or below the critical value maximizing the overall scalar of curvature. Furthermore, the flow through such systems as a function of the number of curvature perturbations is found to present a sublinear behavior for large concentrations, due to the interference between curvature perturbations leading to an overall less curved space. We have also characterized the flux through such media as a function of the local Reynolds number and the scale of interaction between impurities. For the purpose of this study, we have also developed and validated a new lattice Boltzmann model. PMID:24173367
Random walks for image segmentation.
Grady, Leo
2006-11-01
A novel method is proposed for performing multilabel, interactive image segmentation. Given a small number of pixels with user-defined (or predefined) labels, one can analytically and quickly determine the probability that a random walker starting at each unlabeled pixel will first reach one of the prelabeled pixels. By assigning each pixel to the label for which the greatest probability is calculated, a high-quality image segmentation may be obtained. Theoretical properties of this algorithm are developed along with the corresponding connections to discrete potential theory and electrical circuits. This algorithm is formulated in discrete space (i.e., on a graph) using combinatorial analogues of standard operators and principles from continuous potential theory, allowing it to be applied in arbitrary dimension on arbitrary graphs.
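The random-walker probabilities described above are the solution of a combinatorial Dirichlet problem: with the graph Laplacian L partitioned into seeded (M) and unlabeled (U) blocks, one solves L_UU x_U = −L_UM x_M. A minimal sketch on a 4-connected grid with uniform edge weights (a real implementation would weight edges by image intensity differences, which this sketch omits):

```python
import numpy as np

def random_walker_probs(h, w, seeds):
    """Probability that a random walker started at each pixel of an
    h-by-w 4-connected grid first reaches a seed labeled 1 rather
    than a seed labeled 0.  `seeds` maps pixel index -> label value.
    Uniform edge weights are used for simplicity."""
    n = h * w
    idx = lambda r, c: r * w + c
    L = np.zeros((n, n))  # combinatorial graph Laplacian
    for r in range(h):
        for c in range(w):
            i = idx(r, c)
            for dr, dc in ((1, 0), (0, 1)):
                rr, cc = r + dr, c + dc
                if rr < h and cc < w:
                    j = idx(rr, cc)
                    L[i, i] += 1; L[j, j] += 1
                    L[i, j] -= 1; L[j, i] -= 1
    seeded = np.array(sorted(seeds))
    free = np.array([i for i in range(n) if i not in seeds])
    xm = np.array([seeds[i] for i in seeded], dtype=float)
    # Dirichlet problem: L_UU x_U = -L_UM x_M
    xu = np.linalg.solve(L[np.ix_(free, free)], -L[np.ix_(free, seeded)] @ xm)
    probs = np.empty(n)
    probs[seeded] = xm
    probs[free] = xu
    return probs.reshape(h, w)

# Two seeds at the ends of a 1x5 strip: the harmonic solution
# interpolates linearly between the seed values.
p = random_walker_probs(1, 5, {0: 0.0, 4: 1.0})
```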
Random Telegraph Noise in Microstructures
Kogan, S.
1998-10-01
The theory of random current switchings in conductors with S-type current-voltage characteristic is presented. In the range of bistability, the mean time spent by the system in the low-current state before a transition to the high-current state occurs, τ̄_l, decreases with voltage, and that for the high-current state, τ̄_h, grows with voltage; both variations are exponential-like. τ̄_l = τ̄_h at a definite voltage in the bistability range. These results are in full accordance with experiments on microstructures. Because of the growth of both times with the size of the conductor, such noise is observable just in microstructures. © 1998 The American Physical Society
Structure of random bidisperse foam.
Reinelt, Douglas A.; van Swol, Frank B.; Kraynik, Andrew Michael
2005-02-01
The Surface Evolver was used to compute the equilibrium microstructure of random soap foams with bidisperse cell-size distributions and to evaluate topological and geometric properties of the foams and individual cells. The simulations agree with the experimental data of Matzke and Nestler for the probability ρ(F) of finding cells with F faces and its dependence on the fraction of large cells. The simulations also agree with the theory for isotropic Plateau polyhedra (IPP), which describes the F-dependence of cell geometric properties, such as surface area, edge length, and mean curvature (diffusive growth rate); this is consistent with results for polydisperse foams. Cell surface areas are about 10% greater than spheres of equal volume, which leads to a simple but accurate relation for the surface free energy density of foams. The Aboav-Weaire law is not valid for bidisperse foams.
Ergodic theory, randomness, and "chaos".
Ornstein, D S
1989-01-13
Ergodic theory is the theory of the long-term statistical behavior of dynamical systems. The baker's transformation is an object of ergodic theory that provides a paradigm for the possibility of deterministic chaos. It can now be shown that this connection is more than an analogy and that at some level of abstraction a large number of systems governed by Newton's laws are the same as the baker's transformation. Going to this level of abstraction helps to organize the possible kinds of random behavior. The theory also gives new concrete results. For example, one can show that the same process could be produced by a mechanism governed by Newton's laws or by a mechanism governed by coin tossing. It also gives a statistical analog of structural stability.
Random vibration of compliant wall
NASA Technical Reports Server (NTRS)
Yang, J.-N.; Heller, R. A.
1976-01-01
The paper is concerned with the realistic case of two-dimensional random motion of a membrane with bending stiffness supported on a viscoelastic spring substrate and on an elastic base plate under both subsonic and supersonic boundary layer turbulence. The cross-power spectral density of surface displacements is solved in terms of design variables of the compliant wall - such as the dimensions and material properties of the membrane (Mylar), substrate (PVC foam), and panel (aluminum) - so that a sensitivity analysis can be made to examine the influence of each design variable on the surface response statistics. Three numerical examples typical of compliant wall design are worked out and their response statistics in relation to wave drag and roughness drag are assessed. The results can serve as a guideline for experimental investigation of the drag reduction concept through the use of a compliant wall.
Modeling Randomness in Judging Rating Scales with a Random-Effects Rating Scale Model
ERIC Educational Resources Information Center
Wang, Wen-Chung; Wilson, Mark; Shih, Ching-Lin
2006-01-01
This study presents the random-effects rating scale model (RE-RSM) which takes into account randomness in the thresholds over persons by treating them as random-effects and adding a random variable for each threshold in the rating scale model (RSM) (Andrich, 1978). The RE-RSM turns out to be a special case of the multidimensional random…
Subramanian, Leena; Morris, Monica Busse; Brosnan, Meadhbh; Turner, Duncan L.; Morris, Huw R.; Linden, David E. J.
2016-01-01
Objective: Real-time functional magnetic resonance imaging (rt-fMRI) neurofeedback (NF) uses feedback of the patient’s own brain activity to self-regulate brain networks which in turn could lead to a change in behavior and clinical symptoms. The objective was to determine the effect of NF and motor training (MOT) alone on motor and non-motor functions in Parkinson’s Disease (PD) in a 10-week small Phase I randomized controlled trial. Methods: Thirty patients with Parkinson’s disease (PD; Hoehn and Yahr I-III) and no significant comorbidity took part in the trial with random allocation to two groups. Group 1 (NF: 15 patients) received rt-fMRI-NF with MOT. Group 2 (MOT: 15 patients) received MOT alone. The primary outcome measure was the Movement Disorder Society—Unified PD Rating Scale-Motor scale (MDS-UPDRS-MS), administered pre- and post-intervention “off-medication”. The secondary outcome measures were the “on-medication” MDS-UPDRS, the PD Questionnaire-39, and quantitative motor assessments after 4 and 10 weeks. Results: Patients in the NF group were able to upregulate activity in the supplementary motor area (SMA) by using motor imagery. They improved by an average of 4.5 points on the MDS-UPDRS-MS in the “off-medication” state (95% confidence interval: −2.5 to −6.6), whereas the MOT group improved only by 1.9 points (95% confidence interval +3.2 to −6.8). The improvement in the intervention group meets the minimal clinically important difference which is also on par with other non-invasive therapies such as repetitive Transcranial Magnetic Stimulation (rTMS). However, the improvement did not differ significantly between the groups. No adverse events were reported in either group. Interpretation: This Phase I study suggests that NF combined with MOT is safe and improves motor symptoms immediately after treatment, but larger trials are needed to explore its superiority over active control conditions. PMID:27375451
The influence of floor type before and after 10 weeks of age on osteochondrosis in growing gilts.
de Koning, D B; van Grevenhof, E M; Laurenssen, B F A; van Weeren, P R; Hazeleger, W; Kemp, B
2014-08-01
Osteochondrosis (OC) is a degenerative joint condition developing in a short time frame in young growing gilts that may cause lameness at an older age, affecting welfare and leading to premature culling of breeding sows. Causes of OC are multifactorial including both genetic and environmental factors. Floor type has been suggested to affect OC prevalence and effects might be age dependent during the rearing period. The aim of this study was to investigate possible age-dependent effects of floor type, conventional concrete partially slatted versus wood shavings as deep bedding, on OC prevalence in gilts (Dutch Large White × Dutch Landrace) at slaughter (24 wk of age; 106.5 [14.7 SD] kg of BW). At weaning (4 wk of age; 6.9 [1.3 SD] kg of BW), 212 gilts were subjected to 1 of 4 flooring regimens. Gilts were either subjected to a conventional floor from weaning until slaughter (CC), wood shavings as bedding from weaning until slaughter (WW), a conventional floor from weaning until 10 wk of age after which gilts were switched to wood shavings as bedding (CW), or wood shavings as bedding from weaning until 10 wk of age after which gilts were switched to a conventional floor (WC). After slaughter the elbow, hock, and knee joints were macroscopically examined for OC and scored on a 5-point scale where 0 indicates no OC and 4 indicates the severest form of OC. There was no significant difference (P > 0.4) between treatments on the overall OC prevalence for any joint assessed or at the animal level (all joints combined). At the animal level, however, gilts had greater odds to have OC scores 3 and 4 in the CW treatment (odds ratios [OR] = 2.3; P = 0.05), WC treatment (OR = 2.6; P = 0.02), and WW treatment (OR = 3.7; P < 0.001) compared with gilts in the CC treatment. The results indicate that there are no age-dependent effects of floor types on overall OC prevalence. However, wood shavings as bedding seems to increase the odds for severe OC and might affect animal welfare in the long term.
Váczi, Márk; Nagy, Szilvia A; Kőszegi, Tamás; Ambrus, Míra; Bogner, Péter; Perlaki, Gábor; Orsi, Gergely; Tóth, Katalin; Hortobágyi, Tibor
2014-10-01
The growth-promoting effects of eccentric (ECC) contractions are well documented but it is unknown if the rate of stretch per se plays a role in such muscular responses in healthy aging human skeletal muscle. We tested the hypothesis that exercise training of the quadriceps muscle with low-rate ECC and high-rate ECC contractions in the form of stretch-shortening cycles (SSCs) but at equal total mechanical work would produce rate-specific adaptations in healthy old males aged 60-70. Both training programs produced similar improvements in maximal voluntary isometric (6%) and ECC torque (23%) and stretch-shortening cycle function (reduced contraction duration [24%] and enhanced elastic energy storage [12%]) (p<0.05). The rate of torque development increased 30% only after SSC exercise (p<0.05). Resting testosterone and cortisol levels were unchanged but after each program the acute exercise-induced cortisol levels were 12-15% lower (p<0.05). Both programs increased quadriceps size 2.5% (p<0.05). It is concluded that both ECC and SSC exercise training produces favorable adaptations in healthy old males' quadriceps muscle. Although the rate of muscle tension during the SSC vs. ECC contractions was about 4-fold greater, the total mechanical work seems to regulate the hypertrophic, hormonal, and most of the mechanical adaptations. However, SSC exercise was uniquely effective in improving a key deficiency of aging muscle, i.e., its ability to produce force rapidly.
NASA Astrophysics Data System (ADS)
Zhou, Yu-Qian; Gao, Fei; Li, Dan-Dan; Li, Xin-Hui; Wen, Qiao-Yan
2016-09-01
We have proved that new randomness can be certified by partially free sources using the 2→1 quantum random access code (QRAC) in the framework of semi-device-independent (SDI) protocols [Y.-Q. Zhou, H.-W. Li, Y.-K. Wang, D.-D. Li, F. Gao, and Q.-Y. Wen, Phys. Rev. A 92, 022331 (2015), 10.1103/PhysRevA.92.022331]. To improve the effectiveness of the randomness generation, here we propose the SDI randomness expansion using the 3→1 QRAC and obtain the corresponding classical and quantum bounds of the two-dimensional quantum witness. Moreover, we get the condition which should be satisfied by the partially free sources to successfully certify new randomness, and the analytic relationship between the certified randomness and the two-dimensional quantum witness violation.
Marcus, Sue M; Stuart, Elizabeth A; Wang, Pei; Shadish, William R; Steiner, Peter M
2012-06-01
Although randomized studies have high internal validity, generalizability of the estimated causal effect from randomized clinical trials to real-world clinical or educational practice may be limited. We consider the implication of randomized assignment to treatment, as compared with choice of preferred treatment as it occurs in real-world conditions. Compliance, engagement, or motivation may be better with a preferred treatment, and this can complicate the generalizability of results from randomized trials. The doubly randomized preference trial (DRPT) is a hybrid randomized and nonrandomized design that allows for estimation of the causal effect of randomization versus treatment preference. In the DRPT, individuals are first randomized to either randomized assignment or choice assignment. Those in the randomized assignment group are then randomized to treatment or control, and those in the choice group receive their preference of treatment versus control. Using the potential outcomes framework, we apply the algebra of conditional independence to show how the DRPT can be used to derive an unbiased estimate of the causal effect of randomization versus preference for each of the treatment and comparison conditions. Also, we show how these results can be implemented using full matching on the propensity score. The methodology is illustrated with a DRPT of introductory psychology students who were randomized to randomized assignment or preference of mathematics versus vocabulary training. We found a small to moderate benefit of preference versus randomization with respect to the mathematics outcome for those who received mathematics training.
The MCNP5 Random number generator
Brown, F. B.; Nagaya, Y.
2002-01-01
MCNP and other Monte Carlo particle transport codes use random number generators to produce random variates from a uniform distribution on the unit interval. These random variates are then used in subsequent sampling from probability distributions to simulate the physical behavior of particles during the transport process. This paper describes the new random number generator developed for MCNP Version 5. The new generator will optionally preserve the exact random sequence of previous versions and is entirely conformant to the Fortran-90 standard, hence completely portable. In addition, skip-ahead algorithms have been implemented to efficiently initialize the generator for new histories, a capability that greatly simplifies parallel algorithms. Further, the precision of the generator has been increased, extending the period by a factor of 10^5. Finally, the new generator has been subjected to 3 different sets of rigorous and extensive statistical tests to verify that it produces a sufficiently random sequence.
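The skip-ahead capability described above can be illustrated with a short sketch. This is not the actual MCNP5 generator (its multiplier, increment, and stride are not given here); it is a generic 63-bit linear congruential generator whose jump is computed in O(log k) by repeated squaring of the affine update map, with illustrative constants only.

```python
M = 2 ** 63  # modulus of the illustrative 63-bit LCG

def lcg_step(state, g, c, m=M):
    """One step of the linear congruential recurrence x -> g*x + c (mod m)."""
    return (g * state + c) % m

def skip_ahead(state, k, g, c, m=M):
    """Advance the LCG by k steps in O(log k) by composing affine maps.

    The map x -> g*x + c is represented by the pair (G, C); conditional
    composition and squaring mirror fast modular exponentiation.
    """
    G, C = 1, 0                              # identity map
    while k:
        if k & 1:
            G, C = (g * G) % m, (g * C + c) % m
        g, c = (g * g) % m, (g * c + c) % m  # square the map
        k >>= 1
    return (G * state + C) % m
```

A parallel code can then seed history n directly with `skip_ahead(seed0, n * stride, g, c)` instead of stepping the generator n*stride times.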
Dynamic computing random access memory
NASA Astrophysics Data System (ADS)
Traversa, F. L.; Bonani, F.; Pershin, Y. V.; Di Ventra, M.
2014-07-01
The present von Neumann computing paradigm involves a significant amount of information transfer between a central processing unit and memory, with concomitant limitations in the actual execution speed. However, it has been recently argued that a different form of computation, dubbed memcomputing (Di Ventra and Pershin 2013 Nat. Phys. 9 200-2) and inspired by the operation of our brain, can resolve the intrinsic limitations of present day architectures by allowing for computing and storing of information on the same physical platform. Here we show a simple and practical realization of memcomputing that utilizes easy-to-build memcapacitive systems. We name this architecture dynamic computing random access memory (DCRAM). We show that DCRAM provides massively-parallel and polymorphic digital logic, namely it allows for different logic operations with the same architecture, by varying only the control signals. In addition, by taking into account realistic parameters, its energy expenditures can be as low as a few fJ per operation. DCRAM is fully compatible with CMOS technology, can be realized with current fabrication facilities, and therefore can really serve as an alternative to the present computing technology.
Migration in asymmetric, random environments
NASA Astrophysics Data System (ADS)
Deem, Michael; Wang, Dong
Migration is a key mechanism for expansion of communities. As a population migrates, it experiences a changing environment. In heterogeneous environments, rapid adaption is key to the evolutionary success of the population. In the case of human migration, environmental heterogeneity is naturally asymmetric in the North-South and East-West directions. We here consider migration in random, asymmetric, modularly correlated environments. Knowledge about the environment determines the fitness of each individual. We find that the speed of migration is proportional to the inverse of environmental change, and in particular we find that North-South migration rates are lower than East-West migration rates. Fast communication within the population of pieces of knowledge between individuals, similar to horizontal gene transfer in genetic systems, can help to spread beneficial knowledge among individuals. We show that increased modularity of the relation between knowledge and fitness enhances the rate of evolution. We investigate the relation between optimal information exchange rate and modularity of the dependence of fitness on knowledge. These results for the dependence of migration rate on heterogeneity, asymmetry, and modularity are consistent with existing archaeological facts.
Aggregated Recommendation through Random Forests
2014-01-01
Aggregated recommendation refers to the process of suggesting one kind of item to a group of users. Compared to user-oriented or item-oriented approaches, it is more general and, therefore, more appropriate for cold-start recommendation. In this paper, we propose a random forest approach to create aggregated recommender systems. The approach is used to predict the rating of a group of users for a kind of item. In the preprocessing stage, we merge user, item, and rating information to construct an aggregated decision table, where rating information serves as the decision attribute. We also model the data conversion process corresponding to the new-user, new-item, and both-new problems. In the training stage, a forest is built for the aggregated training set, where each leaf is assigned a distribution of discrete ratings. In the testing stage, we present four prediction approaches to compute evaluation values based on the distribution of each tree. Experimental results on the well-known MovieLens dataset show that the aggregated approach maintains an acceptable level of accuracy. PMID:25180204
Efficient robust conditional random fields.
Song, Dongjin; Liu, Wei; Zhou, Tianyi; Tao, Dacheng; Meyer, David A
2015-10-01
Conditional random fields (CRFs) are a flexible yet powerful probabilistic approach and have shown advantages for popular applications in various areas, including text analysis, bioinformatics, and computer vision. Traditional CRF models, however, are incapable of selecting relevant features as well as suppressing noise from noisy original features. Moreover, conventional optimization methods often converge slowly in solving the training procedure of CRFs, and will degrade significantly for tasks with a large number of samples and features. In this paper, we propose robust CRFs (RCRFs) to simultaneously select relevant features and suppress noise. An optimal gradient method (OGM) is further designed to train RCRFs efficiently. Specifically, the proposed RCRFs employ the l1 norm of the model parameters to regularize the objective used by traditional CRFs, therefore enabling discovery of the relevant unary features and pairwise features of CRFs. In each iteration of OGM, the gradient direction is determined jointly by the current gradient together with the historical gradients, and the Lipschitz constant is leveraged to specify the proper step size. We show that an OGM can tackle the RCRF model training very efficiently, achieving the optimal convergence rate O(1/k²) (where k is the number of iterations). This convergence rate is theoretically superior to the convergence rate O(1/k) of previous first-order optimization methods. Extensive experiments performed on three practical image segmentation tasks demonstrate the efficacy of OGM in training our proposed RCRFs.
Organization of growing random networks
Krapivsky, P. L.; Redner, S.
2001-06-01
The organizational development of growing random networks is investigated. These growing networks are built by adding nodes successively, and linking each to an earlier node of degree k with an attachment probability A_k. When A_k grows more slowly than linearly with k, the number of nodes with k links, N_k(t), decays faster than a power law in k, while for A_k growing faster than linearly in k, a single node emerges which connects to nearly all other nodes. When A_k is asymptotically linear, N_k(t) ~ t k^(-ν), with ν dependent on details of the attachment probability, but in the range 2 < ν < ∞. The combined age and degree distribution of nodes shows that old nodes typically have a large degree. There is also a significant correlation in the degrees of neighboring nodes, so that nodes of similar degree are more likely to be connected. The size distributions of the in and out components of the network with respect to a given node, namely its "descendants" and "ancestors", are also determined. The in component exhibits a robust s^(-2) power-law tail, where s is the component size. The out component has a typical size of order ln t, and it provides basic insights into the genealogy of the network.
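The growth rule summarized above (attach each new node to an earlier node of degree k with probability proportional to A_k) is straightforward to simulate. The sketch below assumes the common choice A_k = k^γ, so that γ < 1, γ = 1, and γ > 1 probe the three regimes the abstract describes; it tracks degrees only, not the full edge list.

```python
import random

def grow_network(n, gamma=1.0, seed=0):
    """Grow a random network of n nodes; each new node links to one existing
    node chosen with probability proportional to degree**gamma."""
    rng = random.Random(seed)
    degrees = [1, 1]                       # start from two linked nodes
    while len(degrees) < n:
        weights = [d ** gamma for d in degrees]
        target = rng.choices(range(len(degrees)), weights=weights)[0]
        degrees[target] += 1               # the chosen node gains a link
        degrees.append(1)                  # the newcomer has one link
    return degrees
```

For γ = 1 (strictly linear attachment), a histogram of the returned degrees should show the power-law tail expected in that regime.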
Hierarchy in directed random networks
NASA Astrophysics Data System (ADS)
Mones, Enys
2013-02-01
In recent years, the theory and application of complex networks have been developing quickly in a remarkable way, due to the increasing amount of data from real systems and the fruitful application of powerful methods used in statistical physics. Many important characteristics of social or biological systems can be described by the study of their underlying structure of interactions. Hierarchy is one of these features that can be formulated in the language of networks. In this paper we present some (qualitative) analytic results on the hierarchical properties of random network models with zero correlations and also investigate, mainly numerically, the effects of different types of correlations. The behavior of the hierarchy is different in the absence and the presence of giant components. We show that the hierarchical structure can be drastically different if there are one-point correlations in the network. We also show numerical results suggesting that the hierarchy does not change monotonically with the correlations and there is an optimal level of nonzero correlations maximizing the level of hierarchy.
All-optical fast random number generator.
Li, Pu; Wang, Yun-Cai; Zhang, Jian-Zhong
2010-09-13
We propose a scheme for an all-optical random number generator (RNG), which consists of an ultra-wide bandwidth (UWB) chaotic laser, an all-optical sampler and an all-optical comparator. Free from the bandwidth limitations of electronic devices, it can generate 10 Gbit/s random numbers in our simulation. The high-speed bit sequences can pass standard statistical tests for randomness after an all-optical exclusive-or (XOR) operation.
Nonstationary interference and scattering from random media
Nazikian, R.
1991-12-01
For the small angle scattering of coherent plane waves from inhomogeneous random media, the three-dimensional mean square distribution of random fluctuations may be recovered from the interferometric detection of the nonstationary modulational structure of the scattered field. Modulational properties of coherent waves scattered from random media are related to nonlocal correlations in the double sideband structure of the Fourier transform of the scattering potential. Such correlations may be expressed in terms of a suitably generalized spectral coherence function for analytic fields.
Private randomness expansion with untrusted devices
NASA Astrophysics Data System (ADS)
Colbeck, Roger; Kent, Adrian
2011-03-01
Randomness is an important resource for many applications, from gambling to secure communication. However, guaranteeing that the output from a candidate random source could not have been predicted by an outside party is a challenging task, and many supposedly random sources used today provide no such guarantee. Quantum solutions to this problem exist, for example a device which internally sends a photon through a beamsplitter and observes on which side it emerges, but, presently, such solutions require the user to trust the internal workings of the device. Here, we seek to go beyond this limitation by asking whether randomness can be generated using untrusted devices—even ones created by an adversarial agent—while providing a guarantee that no outside party (including the agent) can predict it. Since this is easily seen to be impossible unless the user has an initially private random string, the task we investigate here is private randomness expansion. We introduce a protocol for private randomness expansion with untrusted devices which is designed to take as input an initially private random string and produce as output a longer private random string. We point out that private randomness expansion protocols are generally vulnerable to attacks that can render the initial string partially insecure, even though that string is used only inside a secure laboratory; our protocol is designed to remove this previously unconsidered vulnerability by privacy amplification. We also discuss extensions of our protocol designed to generate an arbitrarily long random string from a finite initially private random string. The security of these protocols against the most general attacks is left as an open question.
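Privacy amplification, the step the protocol above uses to remove the adversary's partial information about the string, is typically implemented with a two-universal hash such as multiplication by a random Toeplitz matrix over GF(2). The sketch below illustrates that generic primitive, not the authors' specific protocol.

```python
def toeplitz_extract(raw, seed, m):
    """Compress n raw bits to m output bits via a Toeplitz matrix over GF(2).

    Matrix entry T[i][j] is taken from the (n + m - 1)-bit seed as
    seed[i - j + n - 1], so each diagonal is constant (Toeplitz structure).
    Output bit i is the GF(2) inner product of row i with the raw bits.
    """
    n = len(raw)
    assert len(seed) == n + m - 1
    out = []
    for i in range(m):
        bit = 0
        for j in range(n):
            bit ^= seed[i - j + n - 1] & raw[j]
        out.append(bit)
    return out
```

Because the family of Toeplitz matrices is two-universal, the leftover hash lemma bounds how close the m output bits are to uniform given the adversary's information; the seed itself can be public but must be random.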
Some physical applications of random hierarchical matrices
Avetisov, V. A.; Bikulov, A. Kh.; Vasilyev, O. A.; Nechaev, S. K.; Chertovich, A. V.
2009-09-15
The investigation of spectral properties of random block-hierarchical matrices as applied to dynamic and structural characteristics of complex hierarchical systems with disorder is proposed for the first time. Peculiarities of dynamics on random ultrametric energy landscapes are discussed and the statistical properties of scale-free and polyscale (depending on the topological characteristics under investigation) random hierarchical networks (graphs) obtained by multiple mapping are considered.
Random packing of spheres in Menger sponge.
Cieśla, Michał; Barbasz, Jakub
2013-06-01
Random packing of spheres inside fractal collectors of dimension 2 < d < 3 is studied numerically using the Random Sequential Adsorption (RSA) algorithm. The paper focuses mainly on the measurement of the random packing saturation limit. Additionally, scaling properties of density autocorrelations in the obtained packings are analyzed. The RSA kinetics coefficients are also measured. The obtained results allow testing of the phenomenological relation between random packing saturation density and collector dimension. Additionally, the performed simulations, together with previously obtained results, confirm that, in general, the known dimensional relations are obeyed by systems having non-integer dimension, at least for d < 3.
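Random Sequential Adsorption itself is simple to state: propose random positions one at a time and keep a sphere only if it does not overlap anything already placed. A minimal two-dimensional sketch (equal disks in a unit square, rather than the paper's fractal collector):

```python
import random

def rsa_disks(radius, attempts, seed=0):
    """Random Sequential Adsorption of equal disks in the unit square.

    Each trial draws a uniform center; the disk is accepted only if its
    center is at least 2*radius from every previously accepted center.
    """
    rng = random.Random(seed)
    placed = []
    min_d2 = (2 * radius) ** 2            # squared center distance at contact
    for _ in range(attempts):
        x, y = rng.random(), rng.random()
        if all((x - px) ** 2 + (y - py) ** 2 >= min_d2 for px, py in placed):
            placed.append((x, y))
    return placed
```

As the number of attempts grows, the covered fraction `len(placed) * pi * radius**2` approaches the RSA jamming limit for disks (about 0.547, up to boundary effects in this finite sketch).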
The Theory of Random Laser Systems
Xunya Jiang
2002-06-27
Studies of random laser systems are a new direction with promising potential applications and theoretical interest. The research is based on the theories of localization and laser physics. So far, the research shows that there are random lasing modes inside the systems, which is quite different from common laser systems. From the properties of the random lasing modes, one can understand the phenomena observed in the experiments, such as multi-peak and anisotropic spectra, lasing mode number saturation, mode competition and dynamic processes, etc. To summarize, this dissertation has contributed the following to the study of random laser systems: (1) by comparing the Lamb theory with the Letokhov theory, the general formulas of the threshold length or gain of random laser systems were obtained; (2) the vital weakness of previous time-independent methods in random laser research was pointed out; (3) a new model was developed which combines the FDTD method and semi-classical laser theory; the solutions of this model provided an explanation of the experimental results of multi-peak and anisotropic emission spectra, and predicted the saturation of the number of lasing modes and the length of localized lasing modes; (4) theoretical (Lamb theory) and numerical (FDTD and transfer-matrix calculation) studies of the origin of localized lasing modes in random laser systems; and (5) a proposal to use random lasing modes as a new path to study wave localization in random systems, and the prediction of a lasing threshold discontinuity at the mobility edge.
Random walks with similar transition probabilities
NASA Astrophysics Data System (ADS)
Schiefermayr, Klaus
2003-04-01
We consider random walks on the nonnegative integers with a possible absorbing state at -1. A random walk X̃ is called α-similar to a random walk X if there exist constants C_ij such that P̃_ij(n) = α^n C_ij P_ij(n) holds for the corresponding n-step transition probabilities, i, j ≥ 0. We give necessary and sufficient conditions for the α-similarity of two random walks, both in terms of the parameters and in terms of the corresponding spectral measures which appear in the spectral representation of the n-step transition probabilities developed by Karlin and McGregor.
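For any finite truncation of such a walk, the n-step transition probabilities can be checked numerically: a birth-death chain has a tridiagonal one-step matrix, and the n-step matrix is simply its n-th power. A small sketch with illustrative parameters (reflecting ends instead of the paper's absorbing state):

```python
import numpy as np

def birth_death_matrix(p, q, size):
    """One-step matrix of a walk on {0, ..., size-1}: up with probability p,
    down with probability q, otherwise stay (reflecting at both ends)."""
    P = np.zeros((size, size))
    for i in range(size):
        up = p if i < size - 1 else 0.0
        down = q if i > 0 else 0.0
        if i < size - 1:
            P[i, i + 1] = up
        if i > 0:
            P[i, i - 1] = down
        P[i, i] = 1.0 - up - down        # holding probability
    return P

def n_step(P, n):
    """n-step transition probabilities P_ij(n) as the matrix power P**n."""
    return np.linalg.matrix_power(P, n)
```

Two walks could then be compared entrywise: α-similarity predicts that the ratio of their n-step probabilities factors as α^n times a constant C_ij independent of n.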
Analysis and experiment of random ball test
NASA Astrophysics Data System (ADS)
Lu, Liming; Wu, Fan; Hou, Xi; Zhang, Can
2012-10-01
Robert E. Parks of the National Institute of Standards and Technology (NIST), USA, first reported the Random Ball Test (RBT), which is used to measure the absolute error of the reference surface of an interferometer. The basic procedure of this technique is as follows: first, mount the random ball at the confocal position of the interferometer system; then measure the surface of the ball and record the result; rotate the ball to another position, making sure that it remains at the confocal position; measure and record the surface again in the new position; after repeating these steps a sufficient number of times, calculate the mean of the measurement results, and this mean is the absolute error of the reference surface of the interferometer. Since 1998, other scholars have continued Robert E. Parks's research and created a new type of RBT in which the ball is supported by a high-pressure airflow, suspended in the air, and rotated about its sphere center. Because the ball rotates during measurement, this technique is called the Dynamic Random Ball Test (DRBT). This article mainly reports an experimental study of the DRBT.
Raman mode random lasing in ZnS-β-carotene random gain media
NASA Astrophysics Data System (ADS)
Bingi, Jayachandra; Warrier, Anita R.; Vijayan, C.
2013-06-01
Raman mode random lasing is demonstrated in ZnS-β-carotene random gain media at room temperature. A self-assembled random medium is prepared with ZnS sub-micron spheres synthesized by a homogeneous precipitation method. β-Carotene extracted from pale green leaves is embedded in this random medium. The emission band of the ZnS random medium (on excitation at 488 nm) overlaps considerably with that of β-carotene, which functions as a gain medium. Here, the random medium works as a cavity, leading to Raman mode lasing at 517 nm and 527 nm triggered by stimulated resonance Raman scattering.
2012-01-01
Background Luteal support with progesterone is necessary for successful implantation of the embryo following egg collection and embryo transfer in an in-vitro fertilization (IVF) cycle. Progesterone has been used for as little as 2 weeks and for as long as 12 weeks of gestation. The optimal length of treatment is unresolved at present and it remains unclear how long to treat women receiving luteal supplementation. Design The trial is a prospective, randomized, double-blind, placebo-controlled trial to investigate the effect of the duration of luteal support with progesterone in IVF cycles. Following 2 weeks standard treatment and a positive biochemical pregnancy test, this randomized control trial will allocate women to a supplementary 8 weeks treatment with vaginal progesterone or 8 weeks placebo. Further studies would be required to investigate whether additional supplementation with progesterone is beneficial in early pregnancy. Discussion Currently at the Hewitt Centre, approximately 32.5% of women have a positive biochemical pregnancy test 2 weeks after embryo transfer. It is this population that is eligible for trial entry and randomization. Once the patient has confirmed a positive urinary pregnancy test they will be invited to join the trial. Once the consent form has been completed by the patient a trial prescription sheet will be sent to pharmacy with a stated collection time. The patient can then be randomized and the drugs dispensed according to pharmacy protocol. A blood sample will then be drawn for measurement of baseline hormone levels (progesterone, estradiol, free beta-human chorionic gonadotrophin, pregnancy-associated plasma protein-A, Activin A, Inhibin A and Inhibin B). The primary outcome measure is the proportion of all randomized women that continue successfully to a viable pregnancy (at least one fetus with fetal heart rate >100 beats/minute) on transabdominal/transvaginal ultrasound at 10 weeks post embryo transfer/12 weeks gestation
Realization of high performance random laser diodes
NASA Astrophysics Data System (ADS)
Yu, S. F.
2011-03-01
For the past four decades, extensive studies have been concentrated on understanding the physics of random lasing phenomena in scattering media with optical gain. Although lasing modes can be excited from mirrorless scattering media, their high scattering loss, multiple-direction emission, and multiple-mode oscillation have prevented their use as practical laser cavities. Furthermore, due to the difficulty of achieving high optical gain under electrical excitation, electrical excitation of random lasing action was seldom reported. Hence, mirrorless random cavities have never been used to realize lasers for practical applications -- CD, DVD, pico-projector, etc. Nowadays, studies of random lasing are still limited to scientific research. Recently, the difficulty of achieving 'battery driven' random laser diodes has been overcome by using nano-structured ZnO as the random medium and the careful design of heterojunctions. This led to the first demonstration of room-temperature electrically pumped random lasing action under continuous-wave and pulsed operation. In this presentation, we propose to realize an array of quasi-one-dimensional ZnO random laser diodes. We show that if the laser array can be manipulated such that every individual random laser is coupled laterally to, and locked with a particular phase relationship to, its adjacent neighbor, the array can achieve coherent addition of random modes. Hence, output power can be multiplied and only one lasing mode will be supported, due to the repulsion characteristics of random modes. This work was supported by HK PolyU grant no. 1-ZV6X.
Cialdella-Kam, Lynn; Nieman, David C.; Knab, Amy M.; Shanely, R. Andrew; Meaney, Mary Pat; Jin, Fuxia; Sha, Wei; Ghosh, Sujoy
2016-01-01
Flavonoids and fish oils have anti-inflammatory and immune-modulating influences. The purpose of this study was to determine if a mixed flavonoid-fish oil supplement (Q-Mix; 1000 mg quercetin, 400 mg isoquercetin, 120 mg epigallocatechin (EGCG) from green tea extract, 400 mg n3-PUFAs (omega-3 polyunsaturated fatty acid) (220 mg eicosapentaenoic acid (EPA) and 180 mg docosahexaenoic acid (DHA)) from fish oil, 1000 mg vitamin C, 40 mg niacinamide, and 800 µg folic acid) would reduce complications associated with obesity; that is, reduce inflammatory and oxidative stress markers and alter genomic profiles in overweight women. Overweight and obese women (n = 48; age = 40–70 years) were assigned to Q-Mix or placebo groups using randomized double-blinded placebo-controlled procedures. Overnight fasted blood samples were collected at 0 and 10 weeks and analyzed for cytokines, C-reactive protein (CRP), F2-isoprostanes, and whole-blood-derived mRNA, which was assessed using Affymetrix HuGene-1_1 ST arrays. Statistical analysis included two-way ANOVA models for blood analytes and gene expression and pathway and network enrichment methods for gene expression. Plasma levels increased with Q-Mix supplementation by 388% for quercetin, 95% for EPA, 18% for DHA, and 20% for docosapentaenoic acid (DPA). Q-Mix did not alter plasma levels for CRP (p = 0.268), F2-isoprostanes (p = 0.273), and cytokines (p > 0.05). Gene set enrichment analysis revealed upregulation of pathways in Q-Mix vs. placebo related to interferon-induced antiviral mechanism (false discovery rate, FDR < 0.001). Overrepresentation analysis further disclosed an inhibition of phagocytosis-related inflammatory pathways in Q-Mix vs. placebo. Thus, a 10-week Q-Mix supplementation elicited a significant rise in plasma quercetin, EPA, DHA, and DPA, as well as stimulated an antiviral and inflammation whole-blood transcriptomic response in overweight women. PMID:27187447
Rouhani, M H; Kelishadi, R; Hashemipour, M; Esmaillzadeh, A; Surkan, P J; Keshavarz, A; Azadbakht, L
2016-04-01
Although the effects of dietary glycemic index (GI) on insulin resistance are well documented in adults, the complex interaction among glucose intolerance, inflammatory markers, and adipokine concentration has not been well studied, especially among adolescents. We investigated the effect of a low glycemic index (LGI) diet on insulin concentration, fasting blood sugar (FBS), inflammatory markers, and serum adiponectin concentration among healthy obese/overweight adolescent females. In this parallel randomized clinical trial, 2 different diets, an LGI diet and a healthy nutritional recommendation diet (HNRD) with similar macronutrient composition, were prescribed to 50 obese and overweight adolescent girls with the same pubertal status. Biochemical markers FBS, serum insulin concentration, high sensitivity C-reactive protein (hs-CRP), interleukin 6 (IL-6), and adiponectin were measured before and after a 10 week intervention. Using an intention-to-treat analysis, data from 50 subjects were analyzed. According to a dietary assessment, GI in the LGI group was 43.22±0.54. While the means for FBS, serum insulin concentration, the homeostasis model assessment (HOMA), the quantitative insulin sensitivity check index (QUICKI), and adiponectin concentration did not differ significantly within each group, the average hs-CRP and IL-6 decreased significantly in the LGI diet group after the 10 week intervention (p=0.009 and p=0.001, respectively). Comparing percent changes, we found a marginally significant decrease in hs-CRP in the LGI group compared with the HNRD group after adjusting for confounders. Compliance with an LGI diet may have a favorable effect on inflammation among overweight and obese adolescent girls. PMID:27065462
A random Q-switched fiber laser.
Tang, Yulong; Xu, Jianqiu
2015-01-01
Extensive studies have been performed on random lasers in which multiple-scattering feedback is used to generate coherent emission. Q-switching and mode-locking are well-known routes for achieving high peak power output in conventional lasers. However, in random lasers, the ubiquitous random cavities that are formed by multiple scattering inhibit energy storage, making Q-switching impossible. In this paper, widespread Rayleigh scattering arising from the intrinsic micro-scale refractive-index irregularities of fiber cores is used to form random cavities along the fiber. The Q-factor of the cavity is rapidly increased by stimulated Brillouin scattering just after the spontaneous emission is enhanced by random cavity resonances, resulting in random Q-switched pulses with high brightness and high peak power. This report presents the first observation of high-brightness random Q-switched laser emission and is expected to stimulate new areas of scientific research and applications, including encryption, remote three-dimensional random imaging and the simulation of stellar lasing. PMID:25797520
Statistical estimation of π using random vectors
NASA Astrophysics Data System (ADS)
Bloch, S. C.; Dressler, R.
1999-04-01
We present a simple application of a spreadsheet to estimate π using random vector ensembles. The worksheet, combining elements of the works of Archimedes, Liu Hui, and Buffon, can be used as a brief introductory tutorial on the Monte Carlo method, random number generators, and the logical IF function. Complete instructions are given for composing the worksheet.
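The spreadsheet exercise described above translates directly into a few lines of code: draw random vectors in the unit square and count the fraction whose endpoints fall inside the quarter circle. A minimal Monte Carlo sketch:

```python
import random

def estimate_pi(n, seed=1):
    """Monte Carlo estimate of pi: the fraction of uniform random points in
    the unit square satisfying x^2 + y^2 <= 1 converges to pi/4."""
    rng = random.Random(seed)
    inside = sum(1 for _ in range(n)
                 if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * inside / n
```

The statistical error shrinks like 1/sqrt(n), which is the same slow convergence the tutorial's IF-function worksheet exhibits as more rows of random vectors are added.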
The Design of Cluster Randomized Crossover Trials
ERIC Educational Resources Information Center
Rietbergen, Charlotte; Moerbeek, Mirjam
2011-01-01
The inefficiency induced by between-cluster variation in cluster randomized (CR) trials can be reduced by implementing a crossover (CO) design. In a simple CO trial, each subject receives each treatment in random order. A powerful characteristic of this design is that each subject serves as its own control. In a CR CO trial, clusters of subjects…
Random energy model at complex temperatures
Saakian
2000-06-01
The complete phase diagram of the random energy model is obtained for complex temperatures using the method proposed by Derrida. We find the density of zeroes for the statistical sum. Then the method is applied to the generalized random energy model. This allowed us to propose an analytical method for investigating zeroes of the statistical sum for finite-dimensional systems. PMID:11088286
Random numbers certified by Bell's theorem.
Pironio, S; Acín, A; Massar, S; de la Giroday, A Boyer; Matsukevich, D N; Maunz, P; Olmschenk, S; Hayes, D; Luo, L; Manning, T A; Monroe, C
2010-04-15
Randomness is a fundamental feature of nature and a valuable resource for applications ranging from cryptography and gambling to numerical simulation of physical and biological systems. Random numbers, however, are difficult to characterize mathematically, and their generation must rely on an unpredictable physical process. Inaccuracies in the theoretical modelling of such processes or failures of the devices, possibly due to adversarial attacks, limit the reliability of random number generators in ways that are difficult to control and detect. Here, inspired by earlier work on non-locality-based and device-independent quantum information processing, we show that the non-local correlations of entangled quantum particles can be used to certify the presence of genuine randomness. It is thereby possible to design a cryptographically secure random number generator that does not require any assumption about the internal working of the device. Such a strong form of randomness generation is impossible classically and possible in quantum systems only if certified by a Bell inequality violation. We carry out a proof-of-concept demonstration of this proposal in a system of two entangled atoms separated by approximately one metre. The observed Bell inequality violation, featuring near perfect detection efficiency, guarantees that 42 new random numbers are generated with 99 per cent confidence. Our results lay the groundwork for future device-independent quantum information experiments and for addressing fundamental issues raised by the intrinsic randomness of quantum theory. PMID:20393558
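The certification above rests on a Bell inequality violation: classical (local hidden variable) correlations bound the CHSH combination by 2, while quantum singlet correlations reach 2√2. A hedged sketch of that textbook calculation (measurement angles are the standard optimal choice, not taken from this experiment):

```python
import math

def E(a: float, b: float) -> float:
    """Singlet-state spin correlation for analyzer angles a and b:
    E(a, b) = -cos(a - b)."""
    return -math.cos(a - b)

# standard optimal CHSH settings: a, a2 for Alice; b, b2 for Bob
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, -math.pi / 4
S = E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)
print(abs(S))  # 2*sqrt(2) ~ 2.828, exceeding the classical bound of 2
```

Any observed |S| > 2 certifies that the outcomes cannot be a deterministic classical script, which is what makes the generated numbers provably random.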
Effect Sizes in Cluster-Randomized Designs
ERIC Educational Resources Information Center
Hedges, Larry V.
2007-01-01
Multisite research designs involving cluster randomization are becoming increasingly important in educational and behavioral research. Researchers would like to compute effect size indexes based on the standardized mean difference to compare the results of cluster-randomized studies (and corresponding quasi-experiments) with other studies and to…
Brownian Optimal Stopping and Random Walks
Lamberton, D.
2002-06-05
One way to compute the value function of an optimal stopping problem along Brownian paths consists of approximating Brownian motion by a random walk. We derive error estimates for this type of approximation under various assumptions on the distribution of the approximating random walk.
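The approximation described above can be sketched as backward dynamic programming on a recombining binomial tree with steps ±√(T/n). A minimal illustrative version (names are my own; this is the generic scheme, not the paper's error analysis):

```python
import math

def stopping_value(payoff, x0: float, T: float, n: int) -> float:
    """Approximate sup_tau E[g(B_tau)] for Brownian motion B started
    at x0 with horizon T, using a symmetric +-sqrt(T/n) random walk
    and backward induction on the recombining tree."""
    h = math.sqrt(T / n)
    # terminal layer: node k at step s sits at x0 + (2k - s) * h
    values = [payoff(x0 + (2 * k - n) * h) for k in range(n + 1)]
    for step in range(n - 1, -1, -1):
        values = [
            max(payoff(x0 + (2 * k - step) * h),        # stop now
                0.5 * (values[k] + values[k + 1]))      # or continue
            for k in range(step + 1)
        ]
    return values[0]

put = lambda x: max(1.0 - x, 0.0)
print(stopping_value(put, 1.0, 1.0, 500))  # ~ 0.399
```

For this convex payoff the continuation value dominates by Jensen's inequality, so stopping at the horizon is optimal and the value converges to E[(1 − B₁)⁺] = 1/√(2π) ≈ 0.399, a convenient sanity check on the scheme.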
Evaluation of the Randomized Multiple Choice Format.
ERIC Educational Resources Information Center
Harke, Douglas James
Each physics problem used in evaluating the effectiveness of Randomized Multiple Choice (RMC) tests was stated in the conventional manner and was followed by several multiple choice items corresponding to the steps in a written solution but presented in random order. Students were instructed to prepare a written answer and to use it to answer the…
Synchronization Properties of Random Piecewise Isometries
NASA Astrophysics Data System (ADS)
Gorodetski, Anton; Kleptsyn, Victor
2016-08-01
We study the synchronization properties of random double rotations on tori. We give a criterion that shows when synchronization is present in the case of random double rotations on the circle, and prove that it is always absent in dimensions two and higher.
Color Charts, Esthetics, and Subjective Randomness
ERIC Educational Resources Information Center
Sanderson, Yasmine B.
2012-01-01
Color charts, or grids of evenly spaced multicolored dots or squares, appear in the work of modern artists and designers. Often the artist/designer distributes the many colors in a way that could be described as "random," that is, without an obvious pattern. We conduct a statistical analysis of 125 "random-looking" art and design color charts and…
Non-Hermitian Euclidean random matrix theory.
Goetschy, A; Skipetrov, S E
2011-07-01
We develop a theory for the eigenvalue density of arbitrary non-Hermitian Euclidean matrices. Closed equations for the resolvent and the eigenvector correlator are derived. The theory is applied to the random Green's matrix relevant to wave propagation in an ensemble of pointlike scattering centers. This opens a new perspective in the study of wave diffusion, Anderson localization, and random lasing.
49 CFR 382.305 - Random testing.
Code of Federal Regulations, 2010 CFR
2010-10-01
... annual percentage rate for random controlled substances testing shall be 50 percent of the average number of driver positions. (c) The FMCSA Administrator's decision to increase or decrease the minimum... minimum annual percentage rate for random alcohol testing shall be 10 percent of the average number...
49 CFR 382.305 - Random testing.
Code of Federal Regulations, 2011 CFR
2011-10-01
... annual percentage rate for random controlled substances testing shall be 50 percent of the average number of driver positions. (c) The FMCSA Administrator's decision to increase or decrease the minimum... minimum annual percentage rate for random alcohol testing shall be 10 percent of the average number...
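The quoted rates translate into a simple head-count calculation: the minimum annual number of random tests is a percentage of the average number of driver positions. A small illustrative sketch (the function name and rounding-up convention are my own; the Administrator may adjust the rates, as the excerpt notes):

```python
def minimum_random_tests(avg_driver_positions: int,
                         drug_pct: int = 50,
                         alcohol_pct: int = 10) -> tuple:
    """Minimum annual random tests implied by the quoted 49 CFR
    382.305 rates: 50% (controlled substances) and 10% (alcohol)
    of the average number of driver positions, rounded up."""
    def ceil_pct(n: int, pct: int) -> int:
        # integer ceiling of n * pct / 100, avoiding float rounding
        return -(-n * pct // 100)
    return (ceil_pct(avg_driver_positions, drug_pct),
            ceil_pct(avg_driver_positions, alcohol_pct))

print(minimum_random_tests(120))  # (60, 12)
```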
Individual Differences Methods for Randomized Experiments
ERIC Educational Resources Information Center
Tucker-Drob, Elliot M.
2011-01-01
Experiments allow researchers to randomly vary the key manipulation, the instruments of measurement, and the sequences of the measurements and manipulations across participants. To date, however, the advantages of randomized experiments to manipulate both the aspects of interest and the aspects that threaten internal validity have been primarily…
Random vibration of mechanical and structural systems
NASA Astrophysics Data System (ADS)
Soong, T. T.; Grigoriu, Mircea
This book addresses random vibration of mechanical and structural systems commonly encountered in aerospace, mechanical, and civil engineering. Techniques are examined for determining probabilistic characteristics of the response of dynamic systems subjected to random loads or inputs and for calculating probabilities related to system performance or reliability. Emphasis is given to applications.
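A standard worked example of the probabilistic response calculations this book covers: a single-degree-of-freedom oscillator driven by white-noise force with two-sided PSD S₀ has mean-square displacement E[x²] = πS₀/(2m²ζωₙ³). The sketch below checks that closed form against direct numerical integration of S₀|H(ω)|² (parameter values are illustrative):

```python
import math

def msq_response_closed(S0, m, zeta, wn):
    """Closed-form mean-square displacement of an SDOF oscillator
    under white-noise force with two-sided PSD S0:
    E[x^2] = pi * S0 / (2 * m^2 * zeta * wn^3)."""
    return math.pi * S0 / (2.0 * m ** 2 * zeta * wn ** 3)

def msq_response_numeric(S0, m, zeta, wn, wmax=2000.0, n=400_000):
    """Numeric check: integrate S0 * |H(w)|^2 over frequency, with
    H(w) = 1 / (k - m w^2 + i c w), c = 2 m zeta wn, k = m wn^2."""
    c, k = 2.0 * m * zeta * wn, m * wn ** 2
    dw = 2.0 * wmax / n
    total = 0.0
    for i in range(n):
        w = -wmax + (i + 0.5) * dw          # midpoint rule
        total += S0 / ((k - m * w * w) ** 2 + (c * w) ** 2) * dw
    return total

print(msq_response_closed(1.0, 1.0, 0.05, 10.0))  # pi/100 ~ 0.0314
```

The agreement between the two confirms the familiar result that lightly damped systems (small ζ) accumulate large mean-square response from broadband random loading.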
Depp, Colin A; Ceglowski, Jenni; Wang, Vicki C; Yaghouti, Faraz; Mausbach, Brent T; Thompson, Wesley K; Granholm, Eric L
2014-01-01
Background Psychosocial interventions for bipolar disorder are frequently unavailable and resource intensive. Mobile technology may improve access to evidence-based interventions and may increase their efficacy. We evaluated the feasibility, acceptability and efficacy of an augmentative mobile ecological momentary intervention targeting self-management of mood symptoms. Methods This was a randomized single-blind controlled trial with 82 consumers diagnosed with bipolar disorder who completed a four-session psychoeducational intervention and were assigned to 10 weeks of either: 1) mobile device delivered interactive intervention linking patient-reported mood states with personalized self-management strategies, or 2) paper-and-pencil mood monitoring. Participants were assessed at baseline, 6 weeks (mid-point), 12 weeks (post-treatment), and 24 weeks (follow up) with clinician-rated depression and mania scales and self-reported functioning. Results Retention at 12 weeks was 93% and both conditions were associated with high satisfaction. Compared to the paper-and-pencil condition, participants in the augmented mobile intervention condition showed significantly greater reductions in depressive symptoms at 6 and 12 weeks (Cohen's d for both were d=0.48). However, these effects were not maintained at 24-week follow up. Conditions did not differ significantly in the impact on manic symptoms or functional impairment. Limitations This was not a definitive trial and was not powered to detect moderators and mediators. Conclusions Automated mobile-phone intervention is feasible, acceptable, and may enhance the impact of brief psychoeducation on depressive symptoms in bipolar disorder. However, sustainment of gains from symptom self-management mobile interventions, once stopped, may be limited. PMID:25479050
Hardy, Joseph L.; Nelson, Rolf A.; Thomason, Moriah E.; Sternberg, Daniel A.; Katovich, Kiefer; Farzin, Faraz; Scanlon, Michael
2015-01-01
Background A variety of studies have demonstrated gains in cognitive ability following cognitive training interventions. However, other studies have not shown such gains, and questions remain regarding the efficacy of specific cognitive training interventions. Cognitive training research often involves programs made up of just one or a few exercises, targeting limited and specific cognitive endpoints. In addition, cognitive training studies typically involve small samples that may be insufficient for reliable measurement of change. Other studies have utilized training periods that were too short to generate reliable gains in cognitive performance. Methods The present study evaluated an online cognitive training program comprised of 49 exercises targeting a variety of cognitive capacities. The cognitive training program was compared to an active control condition in which participants completed crossword puzzles. All participants were recruited, trained, and tested online (N = 4,715 fully evaluable participants). Participants in both groups were instructed to complete one approximately 15-minute session at least 5 days per week for 10 weeks. Results Participants randomly assigned to the treatment group improved significantly more on the primary outcome measure, an aggregate measure of neuropsychological performance, than did the active control group (Cohen’s d effect size = 0.255; 95% confidence interval = [0.198, 0.312]). Treatment participants showed greater improvements than controls on speed of processing, short-term memory, working memory, problem solving, and fluid reasoning assessments. Participants in the treatment group also showed greater improvements on self-reported measures of cognitive functioning, particularly on those items related to concentration compared to the control group (Cohen’s d = 0.249; 95% confidence interval = [0.191, 0.306]). Conclusion Taken together, these results indicate that a varied training program composed of a number of
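The Cohen's d effect sizes reported above are standardized mean differences. A minimal sketch of the usual pooled-standard-deviation form (illustrative data; this is the generic formula, not the study's exact estimator):

```python
import math

def cohens_d(x, y):
    """Cohen's d: difference in sample means divided by the pooled
    standard deviation of the two groups."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)   # sample variances
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    sp = math.sqrt(((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2))
    return (mx - my) / sp

print(cohens_d([2, 4, 6], [1, 3, 5]))  # 0.5
```

By common convention, d ≈ 0.2 is a small effect and d ≈ 0.5 a medium one, which puts the study's d = 0.255 at the small end.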
Sustained-Release Methylphenidate in a Randomized Trial of Treatment of Methamphetamine Use Disorder
Ling, Walter; Chang, Linda; Hillhouse, Maureen; Ang, Alfonso; Striebel, Joan; Jenkins, Jessica; Hernandez, Jasmin; Olaer, Mary; Mooney, Larissa; Reed, Susan; Fukaya, Erin; Kogachi, Shannon; Alicata, Daniel; Holmes, Nataliya; Esagoff, Asher
2014-01-01
Background and aims No effective pharmacotherapy for methamphetamine (MA) use disorder has yet been found. This study evaluated sustained-release methylphenidate (MPH-SR) compared with placebo (PLA) for treatment of MA use disorder in people also undergoing behavioural support and motivational incentives. Design This was a randomized, double-blind, placebo-controlled design with MPH-SR or PLA provided for 10 weeks (active phase) followed by 4 weeks of single-blind PLA. Twice-weekly clinic visits, weekly group counseling (CBT), and motivational incentives (MI) for MA-negative urine drug screens (UDS) were included. Setting Treatment sites were in Los Angeles, California (LA) and Honolulu, Hawaii (HH), USA. Participants 110 MA-dependent (via DSM-IV) participants (LA = 90; HH = 20). Measurements The primary outcome measure is self-reported days of MA use during the last 30 days of the active phase. Included in the current analyses are drug use (UDS and self-report), retention, craving, compliance (dosing, CBT, MI), adverse events, and treatment satisfaction. Findings No difference was found between treatment groups in self-reported days of MA use during the last 30 days of the active phase (p=0.22). In planned secondary outcomes analyses, however, the MPH group had fewer self-reported MA use days from baseline through the active phase compared with the PLA group (p=0.05). The MPH group also had lower craving scores and fewer marijuana-positive UDS than the PLA group in the last 30 days of the active phase. The two groups had similar retention, other drug use, adverse events, and treatment satisfaction. Conclusions Methylphenidate may lead to a reduction in concurrent methamphetamine use when provided as treatment for patients undergoing behavioural support for moderate to severe methamphetamine use disorder but this requires confirmation. PMID:24825486
2011-01-01
Background Randomized controlled trials (RCT) are required to test relationships between physical activity and cognition in children, but these must be informed by exploratory studies. This study aimed to inform future RCT by: conducting practical utility and reliability studies to identify appropriate cognitive outcome measures; piloting an RCT of a 10-week physical education (PE) intervention which involved 2 hours per week of aerobically intense PE compared to 2 hours of standard PE (control). Methods 64 healthy children (mean age 6.2 yrs, SD 0.3; 33 boys) were recruited from 6 primary schools. Outcome measures were the Cambridge Neuropsychological Test Battery (CANTAB), the Attention Network Test (ANT), the Cognitive Assessment System (CAS) and the short form of the Conners' Parent Rating Scale (CPRS:S). Physical activity was measured habitually and during PE sessions using the Actigraph accelerometer. Results Test-retest intraclass correlations for CANTAB Spatial Span (r = 0.51), Spatial Working Memory Errors (r = 0.59), ANT Reaction Time (r = 0.37) and ANT Accuracy (r = 0.60) were significant, but low. Physical activity was significantly higher during intervention vs. control PE sessions (p < 0.0001). There were no significant differences between intervention and control group changes in CAS scores. Differences between intervention and control groups favoring the intervention were observed for CANTAB Spatial Span, CANTAB Spatial Working Memory Errors, and ANT Accuracy. Conclusions The present study has identified practical and age-appropriate cognitive and behavioral outcome measures for future RCT, and identified that schools are willing to increase PE time. Trial registration number ISRCTN70853932 (http://www.controlled-trials.com) PMID:22034850
Schittkowski, Michael P.; Antal, Andrea; Ambrus, Géza Gergely; Paulus, Walter; Dannhauer, Moritz; Michalik, Romualda; Mante, Alf; Bola, Michal; Lux, Anke; Kropf, Siegfried; Brandt, Stephan A.; Sabel, Bernhard A.
2016-01-01
Background Vision loss after optic neuropathy is considered irreversible. Here, repetitive transorbital alternating current stimulation (rtACS) was applied in partially blind patients with the goal of activating their residual vision. Methods We conducted a multicenter, prospective, randomized, double-blind, sham-controlled trial in an ambulatory setting with daily application of rtACS (n = 45) or sham stimulation (n = 37) for 50 min on each of 10 weekdays. A volunteer sample of patients with optic nerve damage (mean age 59.1 yrs) was recruited. The primary outcome measure for efficacy was super-threshold visual fields, assessed within 48 hrs after the last treatment day and at 2-month follow-up. Secondary outcome measures were near-threshold visual fields, reaction time, visual acuity, and resting-state EEGs to assess changes in brain physiology. Results The rtACS-treated group had a mean visual field improvement of 24.0%, which was significantly greater than after sham stimulation (2.5%). This improvement persisted for at least 2 months in both within- and between-group comparisons. Secondary analyses revealed improvements of near-threshold visual fields in the central 5° and increased thresholds in static perimetry after rtACS, and improved reaction times, but visual acuity did not change compared to shams. Visual field improvement induced by rtACS was associated with alterations in EEG power spectra and coherence in visual cortical networks, which are interpreted as signs of neuromodulation. Current flow simulation indicates current in the frontal cortex, eye, and optic nerve and in subcortical but not cortical regions. Conclusion rtACS treatment is a safe and effective means to partially restore vision after optic nerve damage, probably by modulating brain plasticity. This class 1 evidence suggests that visual fields can be improved in a clinically meaningful way. Trial Registration ClinicalTrials.gov NCT01280877 PMID:27355577
Truly random bit generation based on a novel random Brillouin fiber laser.
Xiang, Dao; Lu, Ping; Xu, Yanping; Gao, Song; Chen, Liang; Bao, Xiaoyi
2015-11-15
We propose a novel dual-emission random Brillouin fiber laser (RBFL) with bidirectional pumping operation. Numerical simulations and experimental verification of the chaotic temporal and statistical properties of the RBFL are conducted, revealing intrinsic unpredictable intensity fluctuations and two completely uncorrelated laser outputs. A random bit generator based on quantum noise sources in the random Fabry-Perot resonator of the RBFL is realized at a bit rate of 5 Mbps with verified randomness. PMID:26565888
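A generic sketch of how bits are commonly extracted from an analog noise trace like the RBFL intensity fluctuations: threshold the samples, then remove residual bias with a von Neumann corrector. This is an illustrative post-processing scheme, not necessarily the extraction used in the paper:

```python
def to_raw_bits(samples, threshold):
    """Comparator stage: 1 if the sample exceeds the threshold."""
    return [1 if s > threshold else 0 for s in samples]

def von_neumann(bits):
    """Debias raw bits: map pairs 01 -> 0, 10 -> 1, discard 00/11.
    Output is unbiased if successive raw bits are independent."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

print(von_neumann(to_raw_bits([0.2, 0.9, 0.7, 0.1], 0.5)))  # [0, 1]
```

The corrector trades rate for quality: on average at most a quarter of the raw bits survive, which is one reason hardware generators quote their rate after post-processing.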
Bright emission from a random Raman laser
Hokr, Brett H.; Bixler, Joel N.; Cone, Michael T.; Mason, John D.; Beier, Hope T.; Noojin, Gary D.; Petrov, Georgi I.; Golovan, Leonid A.; Thomas, Robert J.; Rockwell, Benjamin A.; Yakovlev, Vladislav V.
2014-01-01
Random lasers are a developing class of light sources that utilize a highly disordered gain medium as opposed to a conventional optical cavity. Although traditional random lasers often have a relatively broad emission spectrum, a random laser that utilizes vibration transitions via Raman scattering allows for an extremely narrow bandwidth, on the order of 10 cm−1. Here we demonstrate the first experimental evidence of lasing via a Raman interaction in a bulk three-dimensional random medium, with conversion efficiencies on the order of a few percent. Furthermore, Monte Carlo simulations are used to study the complex spatial and temporal dynamics of nonlinear processes in turbid media. In addition to providing a large signal, characteristic of the Raman medium, the random Raman laser offers us an entirely new tool for studying the dynamics of gain in a turbid medium. PMID:25014073
ERIC Educational Resources Information Center
Marcus, Sue M.; Stuart, Elizabeth A.; Wang, Pei; Shadish, William R.; Steiner, Peter M.
2012-01-01
Although randomized studies have high internal validity, generalizability of the estimated causal effect from randomized clinical trials to real-world clinical or educational practice may be limited. We consider the implication of randomized assignment to treatment, as compared with choice of preferred treatment as it occurs in real-world…
Random nanolasing in the Anderson localized regime
NASA Astrophysics Data System (ADS)
Liu, J.; Garcia, P. D.; Ek, S.; Gregersen, N.; Suhr, T.; Schubert, M.; Mørk, J.; Stobbe, S.; Lodahl, P.
2014-04-01
The development of nanoscale optical devices for classical and quantum photonics is affected by unavoidable fabrication imperfections that often impose performance limitations. However, disorder may also enable new functionalities, for example in random lasers, where lasing relies on random multiple scattering. The applicability of random lasers has been limited due to multidirectional emission, lack of tunability, and strong mode competition with chaotic fluctuations due to a weak mode confinement. The regime of Anderson localization of light has been proposed for obtaining stable multimode random lasing, and initial work concerned macroscopic one-dimensional layered media. Here, we demonstrate on-chip random nanolasers where the cavity feedback is provided by the intrinsic disorder. The strong confinement achieved by Anderson localization reduces the spatial overlap between lasing modes, thus preventing mode competition and improving stability. This enables highly efficient, stable and broadband wavelength-controlled lasers with very small mode volumes. Furthermore, the complex interplay between gain, dispersion-controlled slow light, and disorder is demonstrated experimentally for a non-conservative random medium. The statistical analysis shows a way towards optimizing random-lasing performance by reducing the localization length, a universal parameter.
Reduction of display artifacts by random sampling
NASA Technical Reports Server (NTRS)
Ahumada, A. J., Jr.; Nagel, D. C.; Watson, A. B.; Yellott, J. I., Jr.
1983-01-01
The application of random-sampling techniques to remove visible artifacts (such as flicker, moire patterns, and paradoxical motion) introduced in TV-type displays by discrete sequential scanning is discussed and demonstrated. Sequential-scanning artifacts are described; the window of visibility defined in spatiotemporal frequency space by Watson and Ahumada (1982 and 1983) and Watson et al. (1983) is explained; the basic principles of random sampling are reviewed and illustrated by the case of the human retina; and it is proposed that the sampling artifacts can be replaced by random noise, which can then be shifted to frequency-space regions outside the window of visibility. Vertical sequential, single-random-sequence, and continuously renewed random-sequence plotting displays generating 128 points at update rates up to 130 Hz are applied to images of stationary and moving lines, and best results are obtained with the single random sequence for the stationary lines and with the renewed random sequence for the moving lines.
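The core effect described above can be sketched numerically: uniform sampling below the Nyquist rate makes a high-frequency signal exactly indistinguishable from its alias, while random sample times break that degeneracy and spread the alias energy into broadband noise. An illustrative sketch (frequencies and sample counts are my own choices):

```python
import cmath
import math
import random

def spectral_power(times, values, f):
    """|sum_n x_n exp(-2*pi*i*f*t_n)|^2 evaluated at frequency f,
    usable for arbitrary (non-uniform) sample times."""
    s = sum(x * cmath.exp(-2j * math.pi * f * t)
            for t, x in zip(times, values))
    return abs(s) ** 2

rng = random.Random(0)
f0 = 80.0   # signal frequency, above the 30 Hz Nyquist limit below

# uniform sampling at 60 Hz: 80 Hz aliases exactly onto 20 Hz
tu = [n / 60.0 for n in range(600)]
xu = [math.cos(2 * math.pi * f0 * t) for t in tu]

# random sampling at the same average rate over the same interval
tr = [rng.uniform(0.0, 10.0) for _ in range(600)]
xr = [math.cos(2 * math.pi * f0 * t) for t in tr]

print(spectral_power(tu, xu, 20.0), spectral_power(tu, xu, f0))  # equal
print(spectral_power(tr, xr, f0) > spectral_power(tr, xr, 20.0))  # True
```

With random times the true 80 Hz peak dominates while the would-be alias at 20 Hz is reduced to noise-level power, mirroring the paper's point that sampling artifacts can be converted into (and then shifted as) random noise.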
Duality analysis on random planar lattices.
Ohzeki, Masayuki; Fujii, Keisuke
2012-11-01
The conventional duality analysis is employed to identify a location of a critical point on a uniform lattice without any disorder in its structure. In the present study, we deal with the random planar lattice, which consists of the randomized structure based on the square lattice. We introduce the uniformly random modification by the bond dilution and contraction on a part of the unit square. The random planar lattice includes the triangular and hexagonal lattices in extreme cases of a parameter to control the structure. A modern duality analysis fashion with real-space renormalization is found to be available for estimating the location of the critical points with a wide range of the randomness parameter. As a simple test bed, we demonstrate that our method indeed gives several critical points for the cases of the Ising and Potts models and the bond-percolation thresholds on the random planar lattice. Our method leads to not only such an extension of the duality analyses on the classical statistical mechanics but also a fascinating result associated with optimal error thresholds for a class of quantum error correction code, the surface code on the random planar lattice, which is known as a skillful technique to protect the quantum state.
Wave propagation in random granular chains.
Manjunath, Mohith; Awasthi, Amnaya P; Geubelle, Philippe H
2012-03-01
The influence of randomness on wave propagation in one-dimensional chains of spherical granular media is investigated. The interaction between the elastic spheres is modeled using the classical Hertzian contact law. Randomness is introduced in the discrete model using random distributions of particle mass, Young's modulus, or radius. Of particular interest in this study is the quantification of the attenuation in the amplitude of the impulse associated with various levels of randomness: two distinct regimes of decay are observed, characterized by an exponential or a power law, respectively. The responses are normalized to represent a vast array of material parameters and impact conditions. The virial theorem is applied to investigate the transfer from potential to kinetic energy components in the system for different levels of randomness. The level of attenuation in the two decay regimes is compared for the three different sources of randomness and it is found that randomness in radius leads to the maximum rate of decay in the exponential regime of wave propagation. PMID:22587093
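The discrete model described above can be sketched directly: spheres interacting through the repulsive Hertzian contact law F = k·δ^(3/2) on positive overlap δ, integrated with velocity Verlet. A minimal homogeneous-chain version (randomized parameters would be drawn per particle; constants here are illustrative):

```python
def simulate_chain(n=20, k=1.0, m=1.0, v0=1.0, dt=1e-3, steps=20000):
    """Velocity-Verlet integration of a 1D chain of just-touching
    spheres with Hertzian contacts: F = k * overlap**1.5, repulsive
    only. An initial impulse v0 is given to the first sphere."""
    x = [float(i) for i in range(n)]   # unit spacing, just touching
    v = [0.0] * n
    v[0] = v0

    def forces(pos):
        f = [0.0] * n
        for i in range(n - 1):
            overlap = pos[i] + 1.0 - pos[i + 1]   # compression if > 0
            if overlap > 0.0:
                fc = k * overlap ** 1.5
                f[i] -= fc            # equal and opposite pair forces
                f[i + 1] += fc
        return f

    f = forces(x)
    for _ in range(steps):
        x = [xi + vi * dt + 0.5 * fi / m * dt * dt
             for xi, vi, fi in zip(x, v, f)]
        fn = forces(x)
        v = [vi + 0.5 * (fi + fni) / m * dt
             for vi, fi, fni in zip(v, f, fn)]
        f = fn
    return x, v

x, v = simulate_chain()
print(sum(v))  # ~ 1.0: total momentum is conserved
```

Replacing the scalar m, k, or the unit radius with per-particle random draws reproduces the three sources of randomness compared in the paper.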
Gajecki, Mikael; Johansson, Magnus; Blankers, Matthijs; Sinadinovic, Kristina; Stenlund-Gens, Erik; Berman, Anne H.
2016-01-01
Background The Internet has increasingly been studied as mode of delivery for interventions targeting problematic alcohol use. Most interventions have been fully automated, but some research suggests that adding counselor guidance may improve alcohol consumption outcomes. Methods An eight-module Internet-based self-help program based on cognitive behavioral therapy (CBT) was tested among Internet help-seekers. Eighty participants with problematic alcohol use according to the Alcohol Use Disorders Identification Test (AUDIT; scores of ≥ 6 for women and ≥ 8 for men) were recruited online from an open access website and randomized into three different groups. All groups were offered the same self-help program, but participants in two of the three groups received Internet-based counselor guidance in addition to the self-help program. One of the guidance groups was given a choice between guidance via asynchronous text messages or synchronous text-based chat, while the other guidance group received counselor guidance via asynchronous text messages only. Results In the choice group, 65% (13 of 20 participants) chose guidance via asynchronous text messages. At the 10-week post-treatment follow-up, an intention-to-treat (ITT) analysis showed that participants in the two guidance groups (choice and messages) reported significantly lower past week alcohol consumption compared to the group without guidance; 10.8 (SD = 12.1) versus 22.6 (SD = 18.4); p = 0.001; Cohen’s d = 0.77. Participants in both guidance groups reported significantly lower scores on the AUDIT at follow-up compared to the group without guidance, with a mean score of 14.4 (SD = 5.2) versus 18.2 (SD = 5.9); p = 0.003; Cohen’s d = 0.68. A higher proportion of participants in the guidance groups said that they would recommend the program compared to the group without guidance (81% for choice; 93% for messages versus 47% for self-help). Conclusion Self-help programs for problematic alcohol use can be more
2013-01-01
Background Chronic work-related stress is an independent risk factor for cardiometabolic diseases and associated mortality, particularly when compounded by a sedentary work environment. The purpose of this study was to determine if an office worksite-based hatha yoga program could improve physiological stress, evaluated via heart rate variability (HRV), and associated health-related outcomes in a cohort of office workers. Methods Thirty-seven adults employed in university-based office positions were randomized upon the completion of baseline testing to an experimental or control group. The experimental group completed a 10-week yoga program prescribed three sessions per week during lunch hour (50 min per session). An experienced instructor led the sessions, which emphasized asanas (postures) and vinyasa (exercises). The primary outcome was the high frequency (HF) power component of HRV. Secondary outcomes included additional HRV parameters, musculoskeletal fitness (i.e. push-up, side-bridge, and sit & reach tests) and psychological indices (i.e. state and trait anxiety, quality of life and job satisfaction). Results All measures of HRV failed to change in the experimental group versus the control group, except that the experimental group significantly increased LF:HF (p = 0.04) and reduced pNN50 (p = 0.04) versus control, contrary to our hypotheses. Flexibility, evaluated via sit & reach test increased in the experimental group versus the control group (p < 0.001). No other adaptations were noted. Post hoc analysis comparing participants who completed ≥70% of yoga sessions (n = 11) to control (n = 19) yielded the same findings, except that the high adherers also reduced state anxiety (p = 0.02) and RMSSD (p = 0.05), and tended to improve the push-up test (p = 0.07) versus control. Conclusions A 10-week hatha yoga intervention delivered at the office worksite during lunch hour did not improve HF power or other HRV parameters
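The time-domain HRV parameters named above (RMSSD, pNN50) have simple standard definitions over successive RR intervals. A minimal sketch with illustrative interval data:

```python
import math

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences (ms)."""
    d = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(x * x for x in d) / len(d))

def pnn50(rr_ms):
    """Fraction of successive RR differences exceeding 50 ms."""
    d = [abs(b - a) for a, b in zip(rr_ms, rr_ms[1:])]
    return sum(1 for x in d if x > 50) / len(d)

rr = [800, 810, 750, 820]           # illustrative RR intervals in ms
print(rmssd(rr), pnn50(rr))
```

Both indices track beat-to-beat variability and are conventionally read as markers of parasympathetic activity, which is why the study interprets their reduction as an unexpected direction of change.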
Low-noise Brillouin random fiber laser with a random grating-based resonator.
Xu, Yanping; Gao, Song; Lu, Ping; Mihailov, Stephen; Chen, Liang; Bao, Xiaoyi
2016-07-15
A novel Brillouin random fiber laser (BRFL) with the random grating-based Fabry-Perot (FP) resonator is proposed and demonstrated. Significantly enhanced random feedback from the femtosecond laser-fabricated random grating overwhelms the Rayleigh backscattering, which leads to efficient Brillouin gain for the lasing modes and reduced lasing threshold. Compared to the intensity and frequency noises of the Rayleigh feedback resonator, those of the proposed random laser are effectively suppressed due to the reduced resonating modes and mode competition resulting from the random grating-formed filters. Using the heterodyne technique, the linewidth of the coherent random lasing spike is measured to be ∼45.8 Hz. PMID:27420494
Localization length fluctuation in randomly layered media
NASA Astrophysics Data System (ADS)
Yuan, Haiming; Huang, Feng; Jiang, Xiangqian; Sun, Xiudong
2016-10-01
Localization properties of two-component randomly layered media (RLM) are studied in detail, both analytically and numerically. The localization length is found to fluctuate around the analytical result obtained in the high-frequency limit. The fluctuation amplitude approaches zero as the disorder increases, where disorder is characterized by the distribution width of the random layer thickness. It is also found that the localization length over the mean thickness varies periodically with the distribution center of the random thickness. For multi-component RLM structures, the arrangement of the materials must also be taken into account.
Fraunhofer diffraction by a random screen.
Malinka, Aleksey V
2011-08-01
The stochastic approach is applied to the problem of Fraunhofer diffraction by a random screen. The diffraction pattern is expressed through the random chord distribution. Two cases are considered: the sparse ensemble, where the interference between different obstacles can be neglected, and the densely packed ensemble, where this interference is to be taken into account. The solution is found for the general case and the analytical formulas are obtained for the Switzer model of a random screen, i.e., for the case of Markov statistics.
Quantum randomness certified by the uncertainty principle
NASA Astrophysics Data System (ADS)
Vallone, Giuseppe; Marangon, Davide G.; Tomasin, Marco; Villoresi, Paolo
2014-11-01
We present an efficient method to extract the amount of true randomness that can be obtained by a quantum random number generator (QRNG). By repeating the measurements of a quantum system and by swapping between two mutually unbiased bases, a lower bound on the achievable true randomness can be evaluated. The bound is obtained thanks to the uncertainty principle of complementary measurements applied to min-entropy and max-entropy. We tested our method with two different QRNGs, using trains of qubits or ququarts, and demonstrated the scalability toward practical applications.
A random matrix theory of decoherence
NASA Astrophysics Data System (ADS)
Gorin, T.; Pineda, C.; Kohler, H.; Seligman, T. H.
2008-11-01
Random matrix theory is used to represent generic loss of coherence of a fixed central system coupled to a quantum-chaotic environment, represented by a random matrix ensemble, via random interactions. We study the average density matrix arising from the ensemble induced, in contrast to previous studies where the average values of purity, concurrence and entropy were considered; we further discuss when one or the other approach is relevant. The two approaches agree in the limit of large environments. Analytic results for the average density matrix and its purity are presented in linear response approximation. The two-qubit system is analysed, mainly numerically, in more detail.
Cavity approach to the random solid state.
Mao, Xiaoming; Goldbart, Paul M; Mézard, Marc; Weigt, Martin
2005-09-30
The cavity approach is used to address the physical properties of random solids in equilibrium. Particular attention is paid to the fraction of localized particles and the distribution of localization lengths characterizing their thermal motion. This approach is of relevance to a wide class of random solids, including rubbery media (formed via the vulcanization of polymer fluids) and chemical gels (formed by the random covalent bonding of fluids of atoms or small molecules). The cavity approach confirms results that have been obtained previously via replica mean-field theory, doing so in a way that sheds new light on their physical origin. PMID:16241698
Self-assembly of Random Copolymers
Li, Longyu; Raghupathi, Kishore; Song, Cunfeng; Prasad, Priyaa; Thayumanavan, S.
2014-01-01
Self-assembly of random copolymers has attracted considerable attention recently. In this feature article, we highlight the use of random copolymers to prepare nanostructures with different morphologies and to prepare nanomaterials that are responsive to single or multiple stimuli. The synthesis of single-chain nanoparticles and their potential applications from random copolymers are also discussed in some detail. We aim to draw more attention to these easily accessible copolymers, which are likely to play an important role in translational polymer research. PMID:25036552
A random rule model of surface growth
NASA Astrophysics Data System (ADS)
Mello, Bernardo A.
2015-02-01
Stochastic models of surface growth are usually based on randomly choosing a substrate site at which to perform iterative steps, as in the etching model of Mello et al. (2001) [5]. In this paper I modify the etching model to perform a sequential, rather than random, scan of the substrate. Randomness is introduced not in the site selection but in the choice of the rule to be followed at each site. The change benefits the study of dynamic and asymptotic properties by reducing the finite-size effect and the short-time anomaly and by increasing the saturation time. It also has computational advantages: better use of cache memory and the possibility of parallel implementation.
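The scheme in the abstract above can be illustrated with a minimal, hypothetical sketch: a 1D etching-type rule is applied during a sequential sweep of the substrate, with the randomness placed in whether the rule fires at each site (probability `p` is an assumed parameter; the published model's exact rule set may differ).

```python
import random

def etching_sweep(h, p, rng):
    """One sequential sweep over a 1D substrate with periodic boundaries.
    The random choice is moved from site selection to rule selection:
    sites are visited in order, and the etching rule fires with
    probability p.  (Toy sketch, not the paper's actual code.)"""
    n = len(h)
    for i in range(n):
        if rng.random() < p:
            # etching rule: neighbors higher than site i are eroded
            # down to h[i], then site i itself is etched by one unit
            for j in ((i - 1) % n, (i + 1) % n):
                if h[j] > h[i]:
                    h[j] = h[i]
            h[i] -= 1
    return h

rng = random.Random(7)
surface = [0] * 64          # flat initial substrate
for _ in range(100):
    etching_sweep(surface, 0.5, rng)
```

Because every site is visited once per sweep, consecutive memory accesses are adjacent, which is the cache-friendliness the abstract alludes to.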
Garland, Eric L; Roberts-Lewis, Amelia; Tronnier, Christine D; Graves, Rebecca; Kelley, Karen
2016-02-01
In many clinical settings, there is a high comorbidity between substance use disorders, psychiatric disorders, and traumatic stress. Novel therapies are needed to address these co-occurring issues efficiently. The aim of the present study was to conduct a pragmatic randomized controlled trial comparing Mindfulness-Oriented Recovery Enhancement (MORE) to group Cognitive-Behavioral Therapy (CBT) and treatment-as-usual (TAU) for previously homeless men residing in a therapeutic community. Men with co-occurring substance use and psychiatric disorders, as well as extensive trauma histories, were randomly assigned to 10 weeks of group treatment with MORE (n = 64), CBT (n = 64), or TAU (n = 52). Study findings indicated that from pre-to post-treatment MORE was associated with modest yet significantly greater improvements in substance craving, post-traumatic stress, and negative affect than CBT, and greater improvements in post-traumatic stress and positive affect than TAU. A significant indirect effect of MORE on decreasing craving and post-traumatic stress by increasing dispositional mindfulness was observed, suggesting that MORE may target these issues via enhancing mindful awareness in everyday life. This pragmatic trial represents the first head-to-head comparison of MORE against an empirically-supported treatment for co-occurring disorders. Results suggest that MORE, as an integrative therapy designed to bolster self-regulatory capacity, may hold promise as a treatment for intersecting clinical conditions. PMID:26701171
Random Effects Diagonal Metric Multidimensional Scaling Models.
ERIC Educational Resources Information Center
Clarkson, Douglas B.; Gonzalez, Richard
2001-01-01
Defines a random effects diagonal metric multidimensional scaling model, gives its computational algorithms, describes researchers' experiences with these algorithms, and provides an illustration of the use of the model and algorithms. (Author/SLD)
Periodic behavior in a random environment.
Broadbent, H A
1994-04-01
Animals' tendency to search periodically in temporally random environments was studied by presenting rats with random-interval (RI) 60-s and 120-s schedules. Power spectra revealed a periodicity of responding of 20-50 s for all animals regardless of condition. A second periodicity of 5-10-s was strongest under the RI 60-s schedule. Optimality theory suggests that periodic responding is better than random responding in obtaining food sooner on average, but the theory does not account for multiple periodicities. These multiple periodicities also cannot be explained by a single-oscillator, information-processing version of scalar expectancy theory (J. Gibbon & R. M. Church, 1992) or by the behavioral theory of timing (P.R. Killeen & J. G. Fetterman, 1988). The periodicities are consistent with a connectionist version of scalar expectancy theory that has nonscalar emergent properties, including multiple periodicities that are not proportional to the rate of random events.
Randomizing Genome-Scale Metabolic Networks
Samal, Areejit; Martin, Olivier C.
2011-01-01
Networks coming from protein-protein interactions, transcriptional regulation, signaling, or metabolism may appear to have “unusual” properties. To quantify this, it is appropriate to randomize the network and test the hypothesis that the network is not statistically different from expected in a motivated ensemble. However, when dealing with metabolic networks, the randomization of the network using edge exchange generates fictitious reactions that are biochemically meaningless. Here we provide several natural ensembles of randomized metabolic networks. A first constraint is to use valid biochemical reactions. Further constraints correspond to imposing appropriate functional constraints. We explain how to perform these randomizations with the help of Markov Chain Monte Carlo (MCMC) and show that they allow one to approach the properties of biological metabolic networks. The implication of the present work is that the observed global structural properties of real metabolic networks are likely to be the consequence of simple biochemical and functional constraints. PMID:21779409
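For contrast with the constrained ensembles the abstract advocates, the following is a sketch of the generic degree-preserving edge-exchange null model, the baseline that, applied naively to metabolic networks, produces biochemically meaningless reactions. The function and its parameters are illustrative, not the paper's implementation.

```python
import random

def edge_swap_randomize(edges, n_swaps, seed=0):
    """Degree-preserving randomization by repeated edge exchange:
    pick two edges (a, b) and (c, d) and rewire them to (a, d) and
    (c, b), rejecting swaps that would create self-loops or duplicate
    edges.  For metabolic networks this generic move can generate
    fictitious reactions, motivating the constrained MCMC ensembles
    of the paper (not shown here)."""
    rng = random.Random(seed)
    edges = [tuple(e) for e in edges]
    edge_set = set(edges)
    for _ in range(n_swaps):
        i, j = rng.randrange(len(edges)), rng.randrange(len(edges))
        (a, b), (c, d) = edges[i], edges[j]
        if len({a, b, c, d}) < 4:
            continue                      # would create a self-loop
        if (a, d) in edge_set or (c, b) in edge_set:
            continue                      # would create a duplicate
        edge_set -= {(a, b), (c, d)}
        edge_set |= {(a, d), (c, b)}
        edges[i], edges[j] = (a, d), (c, b)
    return edges

cycle = [(i, (i + 1) % 10) for i in range(10)]
randomized = edge_swap_randomize(cycle, 200)
```

Each accepted swap preserves every vertex's in- and out-degree, which is exactly the property tested below.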
Random Life Curves for Common Engineering Materials
NASA Technical Reports Server (NTRS)
Hu, T.
1983-01-01
Program incorporates non-Rayleigh effects in evaluating structure life. RMS2 computer program converts constant amplitude fatigue allowables to random-loading allowables, with influence of peak distribution and mean stress considered. RMS2 written in FORTRAN IV.
Parametric models for samples of random functions
Grigoriu, M.
2015-09-15
A new class of parametric models, referred to as sample parametric models, is developed for random elements; these models match samples rather than the first two moments and/or other global properties of the elements. The models can be used to characterize, e.g., material properties at small scale, in which case their samples represent microstructures of material specimens selected at random from a population. The samples of the proposed models are elements of finite-dimensional vector spaces spanned by samples, eigenfunctions of Karhunen–Loève (KL) representations, or modes of singular value decompositions (SVDs). The implementation of sample parametric models requires knowledge of the probability laws of the target random elements. Numerical examples, including stochastic processes and random fields, are used to demonstrate the construction of sample parametric models, assess their accuracy, and illustrate how these models can be used to solve stochastic equations efficiently.
Fast generation of sparse random kernel graphs
Hagberg, Aric; Lemons, Nathan; Du, Wen -Bo
2015-09-10
The development of kernel-based inhomogeneous random graphs has provided models that are flexible enough to capture many observed characteristics of real networks, and that are also mathematically tractable. We specify a class of inhomogeneous random graph models, called random kernel graphs, that produces sparse graphs with tunable graph properties, and we develop an efficient generation algorithm to sample random instances from this model. As real-world networks are usually large, it is essential that the run-time of generation algorithms scales better than quadratically in the number of vertices n. We show that for many practical kernels our algorithm runs in time at most O(n (log n)²). As an example, we show how to generate samples of power-law degree distribution graphs with tunable assortativity.
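To fix the model the abstract refers to, here is a naive O(n²) sampler for an inhomogeneous (kernel) random graph: vertex i gets type x_i = i/n and edge {i, j} appears independently with probability min(1, κ(x_i, x_j)/n). The paper's contribution is a sub-quadratic sampler, which this sketch deliberately does not reproduce; the constant kernel below is an assumed example.

```python
import random

def kernel_graph(n, kappa, seed=0):
    """Direct sampler for a random kernel graph: edge {i, j} is
    included independently with probability min(1, kappa(x_i, x_j)/n),
    where x_i = i/n.  O(n^2); the paper's algorithm achieves
    roughly O(n (log n)^2) for many kernels."""
    rng = random.Random(seed)
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            p = min(1.0, kappa(i / n, j / n) / n)
            if rng.random() < p:
                edges.append((i, j))
    return edges

# constant kernel kappa = 4 gives a sparse Erdos-Renyi-like graph
edges = kernel_graph(200, lambda x, y: 4.0)
```

With a constant kernel κ = 4 the expected edge count is about κ(n-1)/2 ≈ 398 for n = 200, i.e. the graph is sparse as advertised.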
Products of Independent Elliptic Random Matrices
NASA Astrophysics Data System (ADS)
O'Rourke, Sean; Renfrew, David; Soshnikov, Alexander; Vu, Van
2015-07-01
For fixed m > 1, we study the product of m independent N × N elliptic random matrices as N tends to infinity. Our main result shows that the empirical spectral distribution of the product converges, with probability 1, to the m-th power of the circular law, regardless of the joint distribution of the mirror entries in each matrix. This leads to a new kind of universality phenomenon: the limit law for the product of independent random matrices is independent of the limit laws for the individual matrices themselves. Our result also generalizes earlier results of Götze-Tikhomirov (On the asymptotic spectrum of products of independent random matrices, available at http://arxiv.org/abs/1012.2710) and O'Rourke-Soshnikov (Electron J Probab 16(81):2219-2245, 2011) concerning the product of independent iid random matrices.
Computer Challenges: Random Walks in the Classroom.
ERIC Educational Resources Information Center
Gamble, Andy
1982-01-01
Discusses a short computer program used in teaching the random (RND) function in the BASIC programming language. Focuses on the mathematical concepts involved in the program related to elementary probability. (JN)
Random Variables: Simulations and Surprising Connections.
ERIC Educational Resources Information Center
Quinn, Robert J.; Tomlinson, Stephen
1999-01-01
Features activities for advanced second-year algebra students in grades 11 and 12. Introduces three random variables and considers an empirical and theoretical probability for each. Uses coins, regular dice, decahedral dice, and calculators. (ASK)
Random motion analysis of flexible satellite structures
NASA Technical Reports Server (NTRS)
Huang, T. C.; Das, A.
1978-01-01
A singular perturbation formulation is used to study the responses of a flexible satellite when random measurement errors can occur. The random variables, at different instants of time, are assumed to be uncorrelated. Procedures for obtaining maxima and minima are described, and a variation of the linear method is developed for the formal solution of the two-point boundary-value problems represented by the variational equations. Random and deterministic solutions for the structural position coordinates are studied, and an analytic algorithm for treating the force equation of motion is developed. Since the random system indicated by the variational equation will always be asymptotically unstable, any analysis of stability must be based on the deterministic system.
(Convertible) Undeniable Signatures Without Random Oracles
NASA Astrophysics Data System (ADS)
Yuen, Tsz Hon; Au, Man Ho; Liu, Joseph K.; Susilo, Willy
We propose a convertible undeniable signature scheme without random oracles. Our construction is based on Waters' and Kurosawa and Heng's schemes proposed in Eurocrypt 2005. The security of our scheme is based on the CDH and the decision linear assumptions. Considering only the undeniable-signature component, our scheme relies on more standard assumptions than the existing undeniable signatures without random oracles due to Laguillaumie and Vergnaud.
Managing numerical errors in random sequential adsorption
NASA Astrophysics Data System (ADS)
Cieśla, Michał; Nowak, Aleksandra
2016-09-01
The aim of this study is to examine the influence of a finite surface size and a finite simulation time on the packing fraction estimated from random sequential adsorption simulations. Of particular interest is providing guidance on the simulation setup needed to achieve a desired level of accuracy. The analysis is based on the properties of saturated random packings of disks on continuous, flat surfaces of different sizes.
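The simulation the abstract analyzes can be sketched in a few lines: disks are dropped at uniformly random positions and accepted only if they overlap no previously adsorbed disk, and the packing fraction is read off after a fixed number of attempts. Surface size, disk radius, and attempt count below are illustrative choices, precisely the finite-size and finite-time parameters whose influence the study examines.

```python
import math
import random

def rsa_disks(surface_size, radius, max_attempts, seed=0):
    """Random sequential adsorption of equal disks on a square surface.
    Trial centers are drawn uniformly (kept a radius away from the
    edges); a trial is accepted only if the disk overlaps no adsorbed
    disk.  Returns the packing fraction after max_attempts trials."""
    rng = random.Random(seed)
    placed = []
    min_dist2 = (2 * radius) ** 2
    for _ in range(max_attempts):
        x = rng.uniform(radius, surface_size - radius)
        y = rng.uniform(radius, surface_size - radius)
        if all((x - px) ** 2 + (y - py) ** 2 >= min_dist2
               for px, py in placed):
            placed.append((x, y))
    return len(placed) * math.pi * radius ** 2 / surface_size ** 2

phi = rsa_disks(20.0, 1.0, 2000)
```

Because the simulation is stopped after finitely many attempts on a finite surface, `phi` falls short of the RSA jamming limit for disks (≈ 0.547), which is the bias the paper quantifies.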
Random flights through spaces of different dimensions
NASA Astrophysics Data System (ADS)
Reimberg, Paulo H. F.; Abramo, L. Raul
2015-01-01
We study random flights that start in a space of one given dimension and, after performing a definite number of steps, continue to develop in a space of higher dimension. We show that if the difference between the dimensions of the spaces is even, then the probability density describing the composite flight can be expressed as a marginalization of the probability density associated with a random flight in the lower-dimensional space. This dimensional reduction is a consequence of the Gegenbauer addition theorem.
Nonlocal soliton scattering in random potentials
NASA Astrophysics Data System (ADS)
Piccardi, Armando; Residori, Stefania; Assanto, Gaetano
2016-07-01
We experimentally investigate the transport behaviour of nonlocal spatial optical solitons when launched in and interacting with propagation-invariant random potentials. The solitons are generated in nematic liquid crystals; the randomness is created by suitably engineered illumination of planar voltage-biased cells equipped with a photosensitive wall. We find that the fluctuations follow a super-diffusive trend, with the mean square displacement lowering for decreasing spatial correlation of the noise.
Spatial coherence of random laser emission.
Redding, Brandon; Choma, Michael A; Cao, Hui
2011-09-01
We experimentally studied the spatial coherence of random laser emission from dye solutions containing nanoparticles. The spatial coherence, measured in a double slit experiment, varied significantly with the density of scatterers and the size and shape of the excitation volume. A qualitative explanation is provided, illustrating the dramatic difference from the spatial coherence of a conventional laser. This work demonstrates that random lasers can be controlled to provide intense, spatially incoherent emission for applications in which spatial cross talk or speckle limit performance.
Quantum random walks using quantum accelerator modes
Ma, Z.-Y.; Burnett, K.; D'Arcy, M. B.; Gardiner, S. A.
2006-01-15
We discuss the use of high-order quantum accelerator modes to achieve an atom optical realization of a biased quantum random walk. We first discuss how one can create coexistent quantum accelerator modes, and hence how momentum transfer that depends on the atoms' internal state can be achieved. When combined with microwave driving of the transition between the states, a different type of atomic beam splitter results. This permits the realization of a biased quantum random walk through quantum accelerator modes.
Sunspot random walk and 22-year variation
Love, Jeffrey J.; Rigler, E. Joshua
2012-01-01
We examine two stochastic models for consistency with observed long-term secular trends in sunspot number and a faint, but semi-persistent, 22-yr signal: (1) a null hypothesis, a simple one-parameter random-walk model of sunspot-number cycle-to-cycle change, and, (2) an alternative hypothesis, a two-parameter random-walk model with an imposed 22-yr alternating amplitude. The observed secular trend in sunspots, seen from solar cycle 5 to 23, would not be an unlikely result of the accumulation of multiple random-walk steps. Statistical tests show that a 22-yr signal can be resolved in historical sunspot data; that is, the probability is low that it would be realized from random data. On the other hand, the 22-yr signal has a small amplitude compared to random variation, and so it has a relatively small effect on sunspot predictions. Many published predictions for cycle 24 sunspots fall within the dispersion of previous cycle-to-cycle sunspot differences. The probability is low that the Sun will, with the accumulation of random steps over the next few cycles, walk down to a Dalton-like minimum. Our models support published interpretations of sunspot secular variation and 22-yr variation resulting from cycle-to-cycle accumulation of dynamo-generated magnetic energy.
Soft random solids and their heterogeneous elasticity.
Mao, Xiaoming; Goldbart, Paul M; Xing, Xiangjun; Zippelius, Annette
2009-09-01
Spatial heterogeneity in the elastic properties of soft random solids is examined via vulcanization theory. The spatial heterogeneity in the structure of soft random solids is a result of the fluctuations locked-in at their synthesis, which also brings heterogeneity in their elastic properties. Vulcanization theory studies semimicroscopic models of random-solid-forming systems and applies replica field theory to deal with their quenched disorder and thermal fluctuations. The elastic deformations of soft random solids are argued to be described by the Goldstone sector of fluctuations contained in vulcanization theory, associated with a subtle form of spontaneous symmetry breaking that is associated with the liquid-to-random-solid transition. The resulting free energy of this Goldstone sector can be reinterpreted as arising from a phenomenological description of an elastic medium with quenched disorder. Through this comparison, we arrive at the statistics of the quenched disorder of the elasticity of soft random solids in terms of residual stress and Lamé-coefficient fields. In particular, there are large residual stresses in the equilibrium reference state, and the disorder correlators involving the residual stress are found to be long ranged and governed by a universal parameter that also gives the mean shear modulus. PMID:19905095
Maximally nonlocal theories cannot be maximally random.
de la Torre, Gonzalo; Hoban, Matty J; Dhara, Chirag; Prettico, Giuseppe; Acín, Antonio
2015-04-24
Correlations that violate a Bell inequality are said to be nonlocal; i.e., they do not admit a local and deterministic explanation. Great effort has been devoted to study how the amount of nonlocality (as measured by a Bell inequality violation) serves to quantify the amount of randomness present in observed correlations. In this work we reverse this research program and ask what do the randomness certification capabilities of a theory tell us about the nonlocality of that theory. We find that, contrary to initial intuition, maximal randomness certification cannot occur in maximally nonlocal theories. We go on and show that quantum theory, in contrast, permits certification of maximal randomness in all dichotomic scenarios. We hence pose the question of whether quantum theory is optimal for randomness; i.e., is it the most nonlocal theory that allows maximal randomness certification? We answer this question in the negative by identifying a larger-than-quantum set of correlations capable of this feat. Not only are these results relevant to understanding quantum mechanics' fundamental features, but also put fundamental restrictions on device-independent protocols based on the no-signaling principle. PMID:25955039
Randomization Does Not Help Much, Comparability Does
Saint-Mont, Uwe
2015-01-01
According to R.A. Fisher, randomization “relieves the experimenter from the anxiety of considering innumerable causes by which the data may be disturbed.” Since, in particular, it is said to control for known and unknown nuisance factors that may considerably challenge the validity of a result, it has become very popular. This contribution challenges the received view. First, looking for quantitative support, we study a number of straightforward, mathematically simple models. They all demonstrate that the optimism surrounding randomization is questionable: In small to medium-sized samples, random allocation of units to treatments typically yields a considerable imbalance between the groups, i.e., confounding due to randomization is the rule rather than the exception. In the second part of this contribution, the reasoning is extended to a number of traditional arguments in favour of randomization. This discussion is rather non-technical, and sometimes touches on the rather fundamental Frequentist/Bayesian debate. However, the result of this analysis turns out to be quite similar: While the contribution of randomization remains doubtful, comparability contributes much to a compelling conclusion. Summing up, classical experimentation based on sound background theory and the systematic construction of exchangeable groups seems to be advisable. PMID:26193621
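The paper's central quantitative point, that random allocation typically leaves a considerable imbalance in small samples, is easy to reproduce. The sketch below (hypothetical setup, not one of the paper's models) randomly splits units carrying a binary nuisance factor into two equal groups and reports the average absolute difference in the factor's prevalence.

```python
import random

def allocation_imbalance(n, n_trials, seed=0):
    """Average absolute between-group difference in the prevalence of
    a binary nuisance factor (present in half the units) after random
    allocation of n units into two equal groups -- a crude measure of
    confounding due to randomization."""
    rng = random.Random(seed)
    units = [1] * (n // 2) + [0] * (n - n // 2)
    total = 0.0
    for _ in range(n_trials):
        rng.shuffle(units)
        g1, g2 = units[: n // 2], units[n // 2:]
        total += abs(sum(g1) / len(g1) - sum(g2) / len(g2))
    return total / n_trials

small = allocation_imbalance(20, 2000)    # small trial
large = allocation_imbalance(500, 2000)   # large trial
```

With 20 units the groups typically differ by well over ten percentage points in prevalence; the imbalance shrinks roughly as 1/√n, consistent with the paper's claim that randomization helps mainly in large samples.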
How Well Do Random Walks Parallelize?
NASA Astrophysics Data System (ADS)
Efremenko, Klim; Reingold, Omer
A random walk on a graph is a process that explores the graph in a random way: at each step the walk is at a vertex of the graph, and at each step it moves to a uniformly selected neighbor of this vertex. Random walks are extremely useful in computer science and in other fields. A very natural problem that was recently raised by Alon, Avin, Koucky, Kozma, Lotker, and Tuttle (though it was implicit in several previous papers) is to analyze the behavior of k independent walks in comparison with the behavior of a single walk. In particular, Alon et al. showed that in various settings (e.g., for expander graphs), k random walks cover the graph (i.e., visit all its nodes), Ω(k)-times faster (in expectation) than a single walk. In other words, in such cases k random walks efficiently “parallelize” a single random walk. Alon et al. also demonstrated that, depending on the specific setting, this “speedup” can vary from logarithmic to exponential in k.
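The speedup question discussed above can be probed empirically. The following sketch (an illustration, not the analysis of Alon et al.) measures the cover time of k independent walks started at a common vertex, using the complete graph, a simple setting where k walks should cover roughly k times faster.

```python
import random

def cover_time(adj, k, seed):
    """Steps until k independent random walks, all started at vertex 0,
    have jointly visited every vertex of the graph given by adjacency
    lists adj."""
    rng = random.Random(seed)
    positions = [0] * k
    visited = {0}
    steps = 0
    while len(visited) < len(adj):
        steps += 1
        for w in range(k):
            positions[w] = rng.choice(adj[positions[w]])
            visited.add(positions[w])
    return steps

n = 30
adj = [[j for j in range(n) if j != i] for i in range(n)]  # complete graph
t1 = sum(cover_time(adj, 1, s) for s in range(10)) / 10
t4 = sum(cover_time(adj, 4, s + 100) for s in range(10)) / 10
```

On the complete graph the single-walk cover time is essentially coupon collecting (≈ n ln n steps), and four walks cut it by close to a factor of four; on a cycle the same experiment would show only a logarithmic speedup, matching the range reported by Alon et al.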
Sunspot random walk and 22-year variation
NASA Astrophysics Data System (ADS)
Love, Jeffrey J.; Rigler, E. Joshua
2012-05-01
We examine two stochastic models for consistency with observed long-term secular trends in sunspot number and a faint, but semi-persistent, 22-yr signal: (1) a null hypothesis, a simple one-parameter log-normal random-walk model of sunspot-number cycle-to-cycle change, and, (2) an alternative hypothesis, a two-parameter random-walk model with an imposed 22-yr alternating amplitude. The observed secular trend in sunspots, seen from solar cycle 5 to 23, would not be an unlikely result of the accumulation of multiple random-walk steps. Statistical tests show that a 22-yr signal can be resolved in historical sunspot data; that is, the probability is low that it would be realized from random data. On the other hand, the 22-yr signal has a small amplitude compared to random variation, and so it has a relatively small effect on sunspot predictions. Many published predictions for cycle 24 sunspots fall within the dispersion of previous cycle-to-cycle sunspot differences. The probability is low that the Sun will, with the accumulation of random steps over the next few cycles, walk down to a Dalton-like minimum. Our models support published interpretations of sunspot secular variation and 22-yr variation resulting from cycle-to-cycle accumulation of dynamo-generated magnetic energy.
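The two hypotheses in the abstract above can be caricatured in a few lines: cycle amplitude performs a random walk from cycle to cycle, optionally with an imposed every-other-cycle (22-yr) alternation. All numbers below (starting amplitude, Gaussian steps, step size) are illustrative assumptions, not the paper's fitted log-normal parameters.

```python
import random

def sunspot_walk(n_cycles, step_sigma, alt_amplitude=0.0, seed=1):
    """Toy version of the cycle-to-cycle random-walk models:
    model (1) is a plain random walk in cycle amplitude
    (alt_amplitude = 0); model (2) adds an imposed 22-yr
    (every-other-cycle) alternation.  Amplitudes are floored at
    zero, mimicking sunspot counts."""
    rng = random.Random(seed)
    amp = 100.0                # hypothetical starting cycle amplitude
    series = []
    for k in range(n_cycles):
        amp = max(0.0, amp + rng.gauss(0.0, step_sigma))
        sign = 1 if k % 2 == 0 else -1     # alternating 22-yr signal
        series.append(max(0.0, amp + sign * alt_amplitude))
    return series

# 19 cycles, roughly the span from solar cycle 5 to 23
series = sunspot_walk(19, 15.0, alt_amplitude=10.0)
```

Comparing many such synthetic runs against the historical record is the essence of the paper's statistical tests: a secular trend emerges from accumulated steps alone, while the 22-yr term is detectable only because it alternates coherently.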
Lighting up microscopy with random Raman lasing
NASA Astrophysics Data System (ADS)
Hokr, Brett H.; Nodurft, Dawson T.; Thompson, Jonathan V.; Bixler, Joel N.; Noojin, Gary D.; Redding, Brandon; Thomas, Robert J.; Cao, Hui; Rockwell, Benjamin A.; Scully, Marlan O.; Yakovlev, Vladislav V.
2016-03-01
Wide-field microscopy, where full images are obtained simultaneously, is limited by the power available from speckle-free light sources. Currently, the vast majority of wide-field microscopes use either mercury arc lamps or LEDs as the illumination source. The power available from these sources limits wide-field fluorescent microscopy to a temporal resolution of tens of microseconds. Lasers, while capable of producing high power and short pulses, have high spatial coherence; this leads to the formation of laser speckle that makes such sources unsuitable for wide-field imaging applications. Random Raman lasers offer the best of both worlds, producing laser-like intensities, short (nanosecond-scale) pulses, and low-spatial-coherence, speckle-free output. These qualities combine to make random Raman lasers 4 orders of magnitude brighter than traditional wide-field microscopy light sources. Furthermore, the unique properties of random Raman lasers open up entirely new possibilities, such as wide-field fluorescence lifetime imaging and wide-field Raman microscopy. We will introduce the relevant physics that gives rise to the unique properties of random Raman lasing, and present early proof-of-principle results demonstrating random Raman lasing emission used as an imaging light source. Finally, we will discuss future directions and elucidate the benefits of using random Raman lasers as a wide-field microscopy light source.
2013-01-01
Background Anxiety disorders affect approximately 10% to 20% of young people, can be enduring if left untreated, and have been associated with psychopathology in later life. Despite this, there is a paucity of empirical research to assist clinicians in determining appropriate treatment options. We describe a protocol for a randomized controlled trial in which we will examine the effectiveness of a group-based Acceptance and Commitment Therapy program for children and adolescents with a primary diagnosis of anxiety disorder. For the adolescent participants we will also evaluate the elements of the intervention that act as mechanisms for change. Methods/design We will recruit 150 young people (90 children and 60 adolescents) diagnosed with an anxiety disorder and their parent or caregiver. After completion of baseline assessment, participants will be randomized to one of three conditions (Acceptance and Commitment Therapy, Cognitive Behavior Therapy or waitlist control). Those in the Acceptance and Commitment Therapy and Cognitive Behavior Therapy groups will receive 10 × 1.5 hour weekly group-therapy sessions using a manualized treatment program, in accordance with the relevant therapy, to be delivered by psychologists. Controls will receive the Cognitive Behavior Therapy program after 10 weeks waitlisted. Repeated measures will be taken immediately post-therapy and at three months after therapy cessation. Discussion To the best of our knowledge, this study will be the largest trial of Acceptance and Commitment Therapy in the treatment of children and young people to date. It will provide comprehensive data on the use of Acceptance and Commitment Therapy for anxiety disorders and will offer evidence for mechanisms involved in the process of change. Furthermore, additional data will be obtained for the use of Cognitive Behavior Therapy in this population and this research will illustrate the comparative effectiveness of these two interventions, which are currently
Behavior Therapy for Children with Tourette Disorder: A Randomized Controlled Trial
Piacentini, John; Woods, Douglas W.; Scahill, Lawrence; Wilhelm, Sabine; Peterson, Alan L.; Chang, Susanna; Ginsburg, Golda S.; Deckersbach, Thilo; Dziura, James; Levi-Pearl, Sue; Walkup, John T.
2010-01-01
Context Tourette disorder is a chronic and typically impairing childhood-onset neurological condition. Antipsychotic medications, the first-line treatments for moderate to severe tics, are often associated with adverse effects. Behavioral interventions, although promising, have not been evaluated in large-scale controlled trials. Objective To determine the efficacy of a comprehensive behavioral intervention for reducing tic severity in children and adolescents. Design, Setting, Participants Randomized, observer-blind, controlled trial of 126 youngsters recruited from December 2004 through May 2007 and aged 9–17 years with impairing Tourette or chronic tic disorder as primary diagnosis, randomized to 8 sessions over 10 weeks of behavior therapy (n=61) or a control treatment consisting of supportive therapy and education (n=65). Responders received 3 monthly treatment booster sessions and were reassessed at 3 and 6 months post-treatment. Intervention Comprehensive behavioral intervention. Main Outcome Measures Yale Global Tic Severity Scale (range 0–40, score >15 indicating clinically significant tics), Clinical Global Impression-Improvement Scale (range, 1 [very much improved] to 8 [very much worse]). Results Behavioral intervention led to a significantly greater decrease on the Yale Global Tic Severity Scale, from 24.7 (95% CI, 23.1–26.3) to 17.1 (95% CI, 15.1–19.1), than the control treatment, from 24.6 (95% CI, 23.2–26.0) to 21.1 (95% CI, 19.2–23.0), from baseline to endpoint (P<.001; 95% CI for the difference between groups, 2.0–6.2; effect size=0.68). Compared with children in the control treatment, significantly more children receiving the behavioral intervention were rated as "very much improved" or "much improved" on the Clinical Global Impression-Improvement scale (52.5% vs 18.5%; P<0.001; number needed to treat=3). Attrition was low (12/126 or 9.5%); tic worsening was reported by 4% of children (5/126). Treatment gains were durable with 87% of available responders to behavior
Randomized Algorithms for Matrices and Data
NASA Astrophysics Data System (ADS)
Mahoney, Michael W.
2012-03-01
This chapter reviews recent work on randomized matrix algorithms. By “randomized matrix algorithms,” we refer to a class of recently developed random sampling and random projection algorithms for ubiquitous linear algebra problems such as least-squares (LS) regression and low-rank matrix approximation. These developments have been driven by applications in large-scale data analysis—applications which place very different demands on matrices than traditional scientific computing applications. Thus, in this review, we will focus on highlighting the simplicity and generality of several core ideas that underlie the usefulness of these randomized algorithms in scientific applications such as genetics (where these algorithms have already been applied) and astronomy (where, hopefully, in part due to this review they will soon be applied). The work we will review here had its origins within theoretical computer science (TCS). An important feature in the use of randomized algorithms in TCS more generally is that one must identify and then algorithmically deal with relevant “nonuniformity structure” in the data. For the randomized matrix algorithms to be reviewed here and that have proven useful recently in numerical linear algebra (NLA) and large-scale data analysis applications, the relevant nonuniformity structure is defined by the so-called statistical leverage scores. Defined more precisely below, these leverage scores are basically the diagonal elements of the projection matrix onto the dominant part of the spectrum of the input matrix. As such, they have a long history in statistical data analysis, where they have been used for outlier detection in regression diagnostics. More generally, these scores often have a very natural interpretation in terms of the data and processes generating the data. For example, they can be interpreted in terms of the leverage or influence that a given data point has on, say, the best low-rank matrix approximation; and this
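The statistical leverage scores described here are concrete to compute for a small matrix: they are the diagonal of the hat matrix H = X(XᵀX)⁻¹Xᵀ. A self-contained sketch for a two-column X (toy data; a real implementation would use a QR or SVD routine rather than the explicit normal-equations inverse):

```python
def leverage_scores(X):
    """Statistical leverage scores of an n x 2 matrix X: the diagonal
    of the projection (hat) matrix H = X (X^T X)^{-1} X^T."""
    # X^T X for a two-column matrix, inverted in closed form.
    a = sum(r[0] * r[0] for r in X)
    b = sum(r[0] * r[1] for r in X)
    d = sum(r[1] * r[1] for r in X)
    det = a * d - b * b
    inv = [[d / det, -b / det], [-b / det, a / det]]
    scores = []
    for r in X:
        # h_ii = r (X^T X)^{-1} r^T
        u = [inv[0][0] * r[0] + inv[0][1] * r[1],
             inv[1][0] * r[0] + inv[1][1] * r[1]]
        scores.append(r[0] * u[0] + r[1] * u[1])
    return scores

# Toy data: an intercept column plus one outlying x value.
X = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 10.0]]
h = leverage_scores(X)
print(h)  # the last row (x = 10) has the largest leverage
```

The scores sum to the rank of X (here 2), and the outlying row dominates, which is exactly the "influence" interpretation used in regression diagnostics.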
Rozental, Alexander
2013-01-01
Background Procrastination, to voluntarily delay an intended course of action despite expecting to be worse-off for the delay, is a persistent behavior pattern that can cause major psychological suffering. Approximately half of the student population and 15%-20% of the adult population are presumed to have substantial difficulties due to chronic and recurrent procrastination in their everyday life. However, preconceptions and a lack of knowledge restrict the availability of adequate care. Cognitive behavior therapy (CBT) is often considered treatment of choice, although no clinical trials have previously been carried out. Objective The aim of this study will be to test the effects of CBT for procrastination, and to investigate whether it can be delivered via the Internet. Methods Participants will be recruited through advertisements in newspapers, other media, and the Internet. Only people residing in Sweden with access to the Internet and suffering from procrastination will be included in the study. A randomized controlled trial with a sample size of 150 participants divided into three groups will be utilized. The treatment group will consist of 50 participants receiving a 10-week CBT intervention with weekly therapist contact. A second treatment group with 50 participants receiving the same treatment, but without therapist contact, will also be employed. The intervention being used for the current study is derived from a self-help book for procrastination written by one of the authors (AR). It includes several CBT techniques commonly used for the treatment of procrastination (eg, behavioral activation, behavioral experiments, stimulus control, and psychoeducation on motivation and different work methods). A control group consisting of 50 participants on a wait-list will be used to evaluate the effects of the CBT intervention. For ethical reasons, the participants in the control group will gain access to the same intervention following the 10-week treatment
Wang, Fei; Toselli, Italo; Korotkova, Olga
2016-02-10
An optical system consisting of a laser source and two independent consecutive phase-only spatial light modulators (SLMs) is shown to accurately simulate a generated random beam (first SLM) after interaction with a stationary random medium (second SLM). To illustrate the range of possibilities, a recently introduced class of random optical frames is examined on propagation in free space and several weak turbulent channels with Kolmogorov and non-Kolmogorov statistics. PMID:26906385
Random selection as a confidence building tool
Macarthur, Duncan W; Hauck, Danielle; Langner, Diana; Thron, Jonathan; Smith, Morag; Williams, Richard
2010-01-01
Any verification measurement performed on potentially classified nuclear material must satisfy two seemingly contradictory constraints. First and foremost, no classified information can be released. At the same time, the monitoring party must have confidence in the veracity of the measurement. The first concern can be addressed by performing the measurements within the host facility using instruments under the host's control. Because the data output in this measurement scenario is also under host control, it is difficult for the monitoring party to have confidence in that data. One technique for addressing this difficulty is random selection. The concept of random selection can be thought of as four steps: (1) The host presents several 'identical' copies of a component or system to the monitor. (2) One (or more) of these copies is randomly chosen by the monitors for use in the measurement system. (3) Similarly, one or more is randomly chosen to be validated further at a later date in a monitor-controlled facility. (4) Because the two components or systems are identical, validation of the 'validation copy' is equivalent to validation of the measurement system. This procedure sounds straightforward, but effective application may be quite difficult. Although random selection is often viewed as a panacea for confidence building, the amount of confidence generated depends on the monitor's continuity of knowledge for both validation and measurement systems. In this presentation, we will discuss the random selection technique, as well as where and how this technique might be applied to generate maximum confidence. In addition, we will discuss the role of modular measurement-system design in facilitating random selection and describe a simple modular measurement system incorporating six small ³He neutron detectors and a single high-purity germanium gamma detector.
Randomized central limit theorems: A unified theory.
Eliazar, Iddo; Klafter, Joseph
2010-08-01
The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles' aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles' extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic-scaling all ensemble components by a common deterministic scale. However, there are "random environment" settings in which the underlying scaling schemes are stochastic-scaling the ensemble components by different random scales. Examples of such settings include Holtsmark's law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs)-in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes-and present "randomized counterparts" to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.
Anomalous Anticipatory Responses in Networked Random Data
Nelson, Roger D.; Bancel, Peter A.
2006-10-16
We examine an 8-year archive of synchronized, parallel time series of random data from a world spanning network of physical random event generators (REGs). The archive is a publicly accessible matrix of normally distributed 200-bit sums recorded at 1 Hz which extends from August 1998 to the present. The primary question is whether these data show non-random structure associated with major events such as natural or man-made disasters, terrible accidents, or grand celebrations. Secondarily, we examine the time course of apparently correlated responses. Statistical analyses of the data reveal consistent evidence that events which strongly affect people engender small but significant effects. These include suggestions of anticipatory responses in some cases, leading to a series of specialized analyses to assess possible non-random structure preceding precisely timed events. A focused examination of data collected around the time of earthquakes with Richter magnitude 6 and greater reveals non-random structure with a number of intriguing, potentially important features. Anomalous effects in the REG data are seen only when the corresponding earthquakes occur in populated areas. No structure is found if they occur in the oceans. We infer that an important contributor to the effect is the relevance of the earthquake to humans. Epoch averaging reveals evidence for changes in the data some hours prior to the main temblor, suggestive of reverse causation.
Coupled continuous time random walks in finance
NASA Astrophysics Data System (ADS)
Meerschaert, Mark M.; Scalas, Enrico
2006-10-01
Continuous time random walks (CTRWs) are used in physics to model anomalous diffusion, by incorporating a random waiting time between particle jumps. In finance, the particle jumps are log-returns and the waiting times measure delay between transactions. These two random variables (log-return and waiting time) are typically not independent. For these coupled CTRW models, we can now compute the limiting stochastic process (just like Brownian motion is the limit of a simple random walk), even in the case of heavy-tailed (power-law) price jumps and/or waiting times. The probability density functions for this limit process solve fractional partial differential equations. In some cases, these equations can be explicitly solved to yield descriptions of long-term price changes, based on a high-resolution model of individual trades that includes the statistical dependence between waiting times and the subsequent log-returns. In the heavy-tailed case, this involves operator stable space-time random vectors that generalize the familiar stable models. In this paper, we will review the fundamental theory and present two applications with tick-by-tick stock and futures data.
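A coupled CTRW of the kind described here is straightforward to sample. In this sketch the waiting times are Pareto (heavy-tailed) and the log-return jump scale grows with the preceding waiting time; that specific coupling, and all parameter values, are illustrative assumptions rather than the paper's model:

```python
import random

rng = random.Random(1)

def coupled_ctrw(n_jumps, alpha=1.5):
    """Sample path of a coupled CTRW: Pareto(alpha) waiting times and
    Gaussian log-return jumps whose scale grows with the preceding
    waiting time (an assumed form of the coupling)."""
    t, x = 0.0, 0.0
    times, log_price = [0.0], [0.0]
    for _ in range(n_jumps):
        wait = rng.paretovariate(alpha)      # heavy-tailed waiting time
        jump = rng.gauss(0.0, wait ** 0.5)   # longer wait -> larger typical jump
        t += wait
        x += jump
        times.append(t)
        log_price.append(x)
    return times, log_price

times, log_price = coupled_ctrw(500)
print(times[-1], log_price[-1])
```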
Random walk centrality for temporal networks
NASA Astrophysics Data System (ADS)
Rocha, Luis E. C.; Masuda, Naoki
2014-06-01
Nodes can be ranked according to their relative importance within a network. Ranking algorithms based on random walks are particularly useful because they connect topological and diffusive properties of the network. Previous methods based on random walks, for example the PageRank, have focused on static structures. However, several realistic networks are indeed dynamic, meaning that their structure changes in time. In this paper, we propose a centrality measure for temporal networks based on random walks under periodic boundary conditions that we call TempoRank. It is known that, in static networks, the stationary density of the random walk is proportional to the degree or the strength of a node. In contrast, we find that, in temporal networks, the stationary density is proportional to the in-strength of the so-called effective network, a weighted and directed network explicitly constructed from the original sequence of transition matrices. The stationary density also depends on the sojourn probability q, which regulates the tendency of the walker to stay in the node, and on the temporal resolution of the data. We apply our method to human interaction networks and show that although it is important for a node to be connected to another node with many random walkers (one of the principles of the PageRank) at the right moment, this effect is negligible in practice when the time order of link activation is included.
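The construction can be sketched on a toy example: build a transition matrix for each snapshot (with sojourn probability q), multiply them over one period to get the effective one-period matrix, and extract the stationary density by power iteration. The 3-node network, snapshots, and q value below are invented for illustration:

```python
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def row_normalize(W, q=0.5):
    """Per-snapshot transition matrix with sojourn probability q: the
    walker stays put with probability q, otherwise moves along an
    active link chosen uniformly (and stays if the node is isolated)."""
    n = len(W)
    P = [[0.0] * n for _ in range(n)]
    for i in range(n):
        deg = sum(W[i])
        for j in range(n):
            if deg == 0:
                P[i][j] = 1.0 if i == j else 0.0
            else:
                P[i][j] = (q if i == j else 0.0) + (1 - q) * W[i][j] / deg
    return P

# A toy temporal network on 3 nodes with a period of two snapshots.
snap1 = [[0, 1, 0], [1, 0, 0], [0, 0, 0]]
snap2 = [[0, 0, 1], [0, 0, 0], [1, 0, 0]]
P = matmul(row_normalize(snap1), row_normalize(snap2))  # one full period

# Power iteration for the stationary density over one period.
pi = [1 / 3] * 3
for _ in range(200):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]
print(pi)
```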
Random subspaces in quantum information theory
NASA Astrophysics Data System (ADS)
Hayden, Patrick
2005-03-01
The selection of random unitary transformations plays a role in quantum information theory analogous to the role of random hash functions in classical information theory. Recent applications have included protocols achieving the quantum channel capacity and methods for extending superdense coding from bits to qubits. In addition, the corresponding random subspaces have proved useful for studying the structure of bipartite and multipartite entanglement. In quantum information theory, we're fond of saying that Hilbert space is a big place, the implication being that there's room for the unexpected to occur. The goal of this talk is to further bolster this homespun wisdom. I'm going to present a number of results in quantum information theory that stem from the initially counterintuitive geometry of high-dimensional vector spaces, where subspaces with highly extremal properties are the norm rather than the exception. Peter Shor has shown, for example, that randomly selected subspaces can be used to send quantum information through a noisy quantum channel at the highest possible rate, that is, the quantum channel capacity. More recently, Debbie Leung, Andreas Winter and I demonstrated that a randomly chosen subspace of a bipartite quantum system will likely contain nothing but nearly maximally entangled states, even if the subspace is nearly as large as the original system in qubit terms. This observation has implications for communication, especially superdense coding.
Helmhout, Pieter H; Harts, Chris C; Staal, J Bart; de Bie, Rob A
2004-01-01
Background Researchers from the Royal Netherlands Army are studying the potential of isolated lumbar extensor training in low back pain in their working population. Currently, a randomized controlled trial is carried out in five military health centers in The Netherlands and Germany, in which a 10-week program of not more than 2 training sessions (10–15 minutes) per week is studied in soldiers with nonspecific low back pain for more than 4 weeks. The purpose of the study is to investigate the efficacy of this 'minimal intervention program', compared to usual care. Moreover, attempts are made to identify subgroups of different responders to the intervention. Methods Besides a baseline measurement, follow-up data are gathered at two short-term intervals (5 and 10 weeks after randomization) and two long-term intervals (6 months and one year after the end of the intervention), respectively. At every test moment, participants fill out a compound questionnaire on a stand-alone PC, and they undergo an isometric back strength measurement on a lower back machine. Primary outcome measures in this study are: self-assessed degree of complaints and degree of handicap in daily activities due to back pain. In addition, our secondary measurements focus on: fear of movement/(re-) injury, mental and social health perception, individual back extension strength, and satisfaction of the patient with the treatment perceived. Finally, we assess a number of potential prognostic factors: demographic and job characteristics, overall health, the degree of physical activity, and the attitudes and beliefs of the physiotherapist towards chronic low back pain. Discussion Although a substantial number of trials have been conducted that included lumbar extension training in low back pain patients, hardly any study has emphasized a minimal intervention approach comparable to ours. For reasons of time efficiency and patient preferences, this minimal sports medicine approach of low back pain
Defect Detection Using Hidden Markov Random Fields
Dogandzic, Aleksandar; Eua-anant, Nawanat; Zhang Benhong
2005-04-09
We derive an approximate maximum a posteriori (MAP) method for detecting NDE defect signals using hidden Markov random fields (HMRFs). In the proposed HMRF framework, a set of spatially distributed NDE measurements is assumed to form a noisy realization of an underlying random field that has a simple structure with Markovian dependence. Here, the random field describes the defect signals to be estimated or detected. The HMRF models incorporate measurement locations into the statistical analysis, which is important in scenarios where the same defect affects measurements at multiple locations. We also discuss initialization of the proposed HMRF detector and apply to simulated eddy-current data and experimental ultrasonic C-scan data from an inspection of a cylindrical Ti 6-4 billet.
Regularity of nuclear structure under random interactions
Zhao, Y. M.
2011-05-06
In this contribution I present a brief introduction to simplicity out of complexity in nuclear structure, specifically, the regularity of nuclear structure under random interactions. I exemplify such simplicity by two examples: spin-zero ground state dominance and positive parity ground state dominance in even-even nuclei. Then I discuss two recent results of nuclear structure in the presence of random interactions, in collaboration with Prof. Arima. Firstly I discuss sd bosons under random interactions, with the focus on excited states in the yrast band. We find a few regular patterns in these excited levels. Secondly I discuss our recent efforts towards obtaining eigenvalues without diagonalizing the full matrices of the nuclear shell model Hamiltonian.
Operational conditions for random-number generation
NASA Astrophysics Data System (ADS)
Compagner, A.
1995-11-01
Ensemble theory is used to describe arbitrary sequences of integers, whether formed by the decimals of π or produced by a roulette or by any other means. Correlation coefficients of any range and order are defined as Fourier transforms of the ensemble weights. Competing definitions of random sequences are considered. Special attention is given to sequences of random numbers needed for Monte Carlo calculations. Different recipes for those sequences lead to correlations that vary in range and order, but the total amount of correlation is the same for all sequences of a given length (without internal periodicities). For maximum-length sequences produced by linear algorithms, most correlation coefficients are zero, but the remaining ones are of absolute value 1. In well-tempered sequences, these complete correlations are of high order or of very long range. General conditions to be obeyed by random-number generators are discussed and a qualitative method for comparing different recipes is given.
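The maximum-length sequences produced by linear algorithms, mentioned above, are typified by linear-feedback shift registers (LFSRs). A minimal sketch verifying the full period of two small registers whose tap positions correspond to standard primitive polynomials:

```python
def lfsr_period(taps, nbits, seed=1):
    """Period of a Fibonacci linear-feedback shift register over GF(2);
    taps are 1-indexed bit positions of the feedback polynomial."""
    state = seed
    count = 0
    while True:
        bit = 0
        for t in taps:
            bit ^= (state >> (nbits - t)) & 1
        state = (state >> 1) | (bit << (nbits - 1))
        count += 1
        if state == seed:
            return count

# x^4 + x^3 + 1 is primitive over GF(2), so the 4-bit register visits
# all 2^4 - 1 nonzero states before repeating; likewise x^3 + x^2 + 1.
print(lfsr_period((4, 3), 4))   # 15
print(lfsr_period((3, 2), 3))   # 7
```

Such maximum-length generators are exactly the case the abstract discusses: most low-order correlation coefficients vanish, but complete correlations of high order or long range remain.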
Component Evolution in General Random Intersection Graphs
NASA Astrophysics Data System (ADS)
Bradonjić, Milan; Hagberg, Aric; Hengartner, Nicolas W.; Percus, Allon G.
Random intersection graphs (RIGs) are an important random structure with algorithmic applications in social networks, epidemic networks, blog readership, and wireless sensor networks. RIGs can be interpreted as a model for large randomly formed non-metric data sets. We analyze the component evolution in general RIGs, giving conditions on the existence and uniqueness of the giant component. Our techniques generalize existing methods for analysis of component evolution: we analyze survival and extinction properties of a dependent, inhomogeneous Galton-Watson branching process on general RIGs. Our analysis relies on bounding the branching processes and inherits the fundamental concepts of the study of component evolution in Erdős-Rényi graphs. The major challenge comes from the underlying structure of RIGs, which involves both a set of nodes and a set of attributes, with different probabilities associated with each attribute.
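The two-level structure of nodes and attributes is easy to make concrete. A minimal generator for the uniform model G(n, m, p) (parameter values below are arbitrary illustrative choices):

```python
import random

rng = random.Random(3)

def random_intersection_graph(n, m, p):
    """G(n, m, p): each of n nodes holds each of m attributes
    independently with probability p; two nodes are adjacent iff
    their attribute sets intersect."""
    attrs = [{a for a in range(m) if rng.random() < p} for _ in range(n)]
    edges = {(i, j) for i in range(n) for j in range(i + 1, n)
             if attrs[i] & attrs[j]}
    return attrs, edges

attrs, edges = random_intersection_graph(n=30, m=10, p=0.2)
print(len(edges))
```

Note how the shared-attribute rule induces the edge dependencies that distinguish RIGs from Erdős-Rényi graphs: two edges sharing a node are correlated through that node's attribute set.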
Scale-invariant geometric random graphs.
Xie, Zheng; Rogers, Tim
2016-03-01
We introduce and analyze a class of growing geometric random graphs that are invariant under rescaling of space and time. Directed connections between nodes are drawn according to influence zones that depend on node position in space and time, mimicking the heterogeneity and increased specialization found in growing networks. Through calculations and numerical simulations we explore the consequences of scale invariance for geometric random graphs generated this way. Our analysis reveals a dichotomy between scale-free and Poisson distributions of in- and out-degree, the existence of a random number of hub nodes, high clustering, and unusual percolation behavior. These properties are similar to those of empirically observed web graphs. PMID:27078369
Randomized parallel speedups for list ranking
Vishkin, U.
1987-06-01
The following problem is considered: given a linked list of length n, compute the distance of each element of the linked list from the end of the list. The problem has two standard deterministic algorithms: a linear time serial algorithm, and an O((n log n)/rho + log n) time parallel algorithm using rho processors. The authors present a randomized parallel algorithm for the problem. The algorithm is designed for an exclusive-read exclusive-write parallel random access machine (EREW PRAM). It runs almost surely in time O(n/rho + log n log* n) using rho processors. Using a recently published parallel prefix sums algorithm the list-ranking algorithm can be adapted to run on a concurrent-read concurrent-write parallel random access machine (CRCW PRAM) almost surely in time O(n/rho + log n) using rho processors.
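The core list-ranking idea underlying such algorithms is pointer jumping: in each synchronous round every element adds its successor's rank to its own and then skips over that successor, halving the remaining distance. A sequential simulation of the synchronous rounds (the paper's randomized algorithm improves the processor bounds; this sketch shows only the basic deterministic idea):

```python
def list_ranking(next_ptr):
    """Distance of each element from the end of a linked list, computed
    by pointer jumping in O(log n) rounds; each round is what a PRAM
    would execute in parallel. next_ptr[i] is the successor index, or
    -1 at the tail."""
    n = len(next_ptr)
    rank = [0 if next_ptr[i] == -1 else 1 for i in range(n)]
    nxt = list(next_ptr)
    changed = True
    while changed:
        changed = False
        new_rank, new_nxt = list(rank), list(nxt)
        for i in range(n):        # one synchronous round
            if nxt[i] != -1:
                new_rank[i] = rank[i] + rank[nxt[i]]
                new_nxt[i] = nxt[nxt[i]]
                changed = True
        rank, nxt = new_rank, new_nxt
    return rank

# List 0 -> 1 -> 2 -> 3 -> 4; elements may also be stored in any order.
print(list_ranking([1, 2, 3, 4, -1]))  # [4, 3, 2, 1, 0]
```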
Random Numbers from a Delay Equation
NASA Astrophysics Data System (ADS)
Self, Julian; Mackey, Michael C.
2016-10-01
Delay differential equations can have "chaotic" solutions that can be used to mimic Brownian motion. Since a Brownian motion is random in its velocity, it is reasonable to think that a random number generator might be constructed from such a model. In this preliminary study, we consider one specific example of this and show that it satisfies criteria commonly employed in the testing of random number generators (from TestU01's very stringent "Big Crush" battery of tests). A technique termed digit discarding, commonly used in both this generator and physical RNGs using laser feedback systems, is discussed with regard to the maximal Lyapunov exponent. Also, we benchmark the generator to a contemporary common method: the multiple recursive generator, MRG32k3a. Although our method is about 7 times slower than MRG32k3a, there is in principle no apparent limit on the number of possible values that can be generated from the scheme we present here.
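The overall scheme can be sketched with the Mackey-Glass delay equation in its classic chaotic regime. The Euler discretization, parameter values, and the specific digit-discarding rule below are illustrative assumptions, not the generator studied in the paper:

```python
def mackey_glass_bits(n_bits, beta=0.2, gamma=0.1, tau=17.0, dt=0.1):
    """Euler integration of the Mackey-Glass delay equation
    x'(t) = beta*x(t-tau)/(1 + x(t-tau)**10) - gamma*x(t), followed by
    'digit discarding': only a low-order decimal digit of each sample
    is kept, stripping the predictable large-scale structure."""
    delay = int(tau / dt)
    hist = [1.2] * (delay + 1)              # constant initial history
    bits = []
    while len(bits) < n_bits:
        x, x_tau = hist[-1], hist[-1 - delay]
        x_new = x + dt * (beta * x_tau / (1.0 + x_tau ** 10) - gamma * x)
        hist.append(x_new)
        digit = int(x_new * 10 ** 6) % 10   # 6th decimal digit
        bits.append(digit & 1)              # its parity becomes one bit
    return bits

bits = mackey_glass_bits(1000)
print(sum(bits) / len(bits))
```

A production generator would, as the paper does, validate the output against statistical batteries such as TestU01 rather than trusting any single discarding rule.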
Entanglement-assisted random access codes
Pawlowski, Marcin; Zukowski, Marek
2010-04-15
An (n,m,p) random access code (RAC) makes it possible to encode n bits in an m-bit message in such a way that a receiver of the message can guess any of the original n bits with probability p greater than (1/2). In quantum RACs (QRACs), one transmits n qubits. The full set of primitive entanglement-assisted random access codes (EARACs) is introduced, in which parties are allowed to share a two-qubit singlet. It is shown that via a concatenation of these, one can build for any n an (n,1,p) EARAC. QRACs for n>3 exist only if parties also share classical randomness. We show that EARACs outperform the best of known QRACs not only in the success probabilities but also in the amount of communication needed in the preparatory stage of the protocol. Upper bounds on the performance of EARACs are given and shown to limit also QRACs.
Cooperation evolution in random multiplicative environments
NASA Astrophysics Data System (ADS)
Yaari, G.; Solomon, S.
2010-02-01
Most real life systems have a random component: the multitude of endogenous and exogenous factors influencing them result in stochastic fluctuations of the parameters determining their dynamics. These empirical systems are in many cases subject to noise of multiplicative nature. The special properties of multiplicative noise as opposed to additive noise have been noticed for a long while. Even though apparently and formally the difference between free additive vs. multiplicative random walks consists in just a move from normal to log-normal distributions, in practice the implications are much more far reaching. While in an additive context the emergence and survival of cooperation requires special conditions (especially some level of reward, punishment, reciprocity), we find that in the multiplicative random context the emergence of cooperation is much more natural and effective. We study the various implications of this observation and its applications in various contexts.
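The contrast can be demonstrated with a minimal simulation: a lone agent whose wealth is multiplied by a random factor each step versus a pair who pool and split their wealth every step. The factor values (1.5 or 0.6 with equal probability) are illustrative assumptions; pooling raises the long-run geometric growth rate by averaging away multiplicative fluctuations:

```python
import math
import random

rng = random.Random(11)

def growth_rates(steps=20000):
    """Per-step log growth of a loner vs. a cooperating pair in a
    multiplicative environment: wealth is multiplied by 1.5 or 0.6
    with equal probability each step; the pair pools and splits its
    wealth every step."""
    log_loner = log_coop = 0.0
    for _ in range(steps):
        log_loner += math.log(rng.choice((1.5, 0.6)))
        fa = rng.choice((1.5, 0.6))
        fb = rng.choice((1.5, 0.6))
        # after pooling and splitting, each partner grows by (fa + fb) / 2
        log_coop += math.log((fa + fb) / 2.0)
    return log_loner / steps, log_coop / steps

g_alone, g_coop = growth_rates()
print(g_alone, g_coop)
```

With these factors the loner's log growth rate is clearly negative while the cooperating pair's is close to zero, so cooperation emerges as advantageous without any reward or punishment mechanism.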
The Generation of Random Equilateral Polygons
NASA Astrophysics Data System (ADS)
Alvarado, Sotero; Calvo, Jorge Alberto; Millett, Kenneth C.
2011-04-01
Freely jointed random equilateral polygons serve as a common model for polymer rings, reflecting their statistical properties under theta conditions. To generate equilateral polygons, researchers employ many procedures that have been proved, or at least are believed, to be random with respect to the natural measure on the space of polygonal knots. As a result, the random selection of equilateral polygons, as well as the statistical robustness of this selection, is of particular interest. In this research, we study the key features of four popular methods: the Polygonal Folding, the Crankshaft Rotation, the Hedgehog, and the Triangle Methods. In particular, we compare the implementation and efficacy of these procedures, especially with regard to the population distribution of polygons in the space of polygonal knots, the distribution of edge vectors, the local curvature, and the local torsion. In addition, we give a rigorous proof that the Crankshaft Rotation Method is ergodic.
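A minimal sketch of the Crankshaft Rotation Method named above (our own illustrative implementation, not the authors' code): pick two vertices at random and rigidly rotate the arc between them about the chord joining them, which preserves both closure and the unit edge lengths.

```python
import math
import random

def regular_polygon(n):
    """Vertices of a planar regular n-gon scaled to unit edge length."""
    r = 0.5 / math.sin(math.pi / n)
    return [(r * math.cos(2 * math.pi * k / n),
             r * math.sin(2 * math.pi * k / n), 0.0) for k in range(n)]

def rotate_about_axis(p, a, b, theta):
    """Rodrigues rotation of point p about the line through a and b."""
    ax = tuple(bi - ai for ai, bi in zip(a, b))
    norm = math.sqrt(sum(c * c for c in ax))
    k = tuple(c / norm for c in ax)          # unit axis vector
    v = tuple(pi - ai for pi, ai in zip(p, a))
    kv = (k[1]*v[2] - k[2]*v[1], k[2]*v[0] - k[0]*v[2], k[0]*v[1] - k[1]*v[0])
    kdv = sum(ki * vi for ki, vi in zip(k, v))
    c, s = math.cos(theta), math.sin(theta)
    rv = tuple(v[i]*c + kv[i]*s + k[i]*kdv*(1 - c) for i in range(3))
    return tuple(ai + rvi for ai, rvi in zip(a, rv))

def crankshaft_step(poly):
    """One crankshaft move: rotate the arc between two random vertices
    about the chord joining them by a uniform random angle."""
    n = len(poly)
    i, j = sorted(random.sample(range(n), 2))
    if poly[i] == poly[j]:                   # degenerate chord; skip move
        return poly
    theta = random.uniform(0, 2 * math.pi)
    new = list(poly)
    for t in range(i + 1, j):
        new[t] = rotate_about_axis(poly[t], poly[i], poly[j], theta)
    return new

def edge_lengths(poly):
    n = len(poly)
    return [math.dist(poly[k], poly[(k + 1) % n]) for k in range(n)]
```

Because the endpoints of the rotated arc lie on the rotation axis, every edge is moved rigidly or left fixed, so the polygon stays closed and equilateral after arbitrarily many moves.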
Efficient broadcast on random geometric graphs
Bradonjic, Milan; Elsasser, Robert; Friedrich, Tobias; Sauerwald, Thomas
2009-01-01
A Random Geometric Graph (RGG) is constructed by distributing n nodes uniformly at random in the unit square and connecting two nodes if their Euclidean distance is at most r, for some prescribed r. They analyze the following randomized broadcast algorithm on RGGs. At the beginning, there is only one informed node. Then in each round, each informed node chooses a neighbor uniformly at random and informs it. They prove that this algorithm informs every node in the largest component of an RGG in O(√n/r) rounds with high probability. This holds for any value of r larger than the critical value for the emergence of a giant component. In particular, the result implies that the diameter of the giant component is Θ(√n/r).
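The push protocol described above is straightforward to simulate. The sketch below is a round-synchronous toy version restricted to the component of the source node; the parameters (n = 300, r = 0.15) are illustrative choices, not taken from the paper.

```python
import math
import random

def random_geometric_graph(n, r):
    """n uniform points in the unit square; edge iff distance <= r."""
    pts = [(random.random(), random.random()) for _ in range(n)]
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(pts[i], pts[j]) <= r:
                adj[i].append(j)
                adj[j].append(i)
    return adj

def component(adj, s):
    """Vertex set of the connected component containing s (DFS)."""
    seen, stack = {s}, [s]
    while stack:
        u = stack.pop()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

def push_broadcast(adj, s):
    """Round-synchronous push protocol: every informed node tells one
    uniformly random neighbour per round.  Returns the number of rounds
    until the whole component of s is informed."""
    target = component(adj, s)
    informed = {s}
    rounds = 0
    while informed != target:
        rounds += 1
        for u in list(informed):
            if adj[u]:
                informed.add(random.choice(adj[u]))
    return rounds
```

For r comfortably above the connectivity threshold the round count observed here stays far below n, consistent with the O(√n/r) bound.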
Taming Explosive Growth through Dynamic Random Links
NASA Astrophysics Data System (ADS)
Choudhary, Anshul; Kohar, Vivek; Sinha, Sudeshna
2014-03-01
We study the dynamics of a collection of nonlinearly coupled limit cycle oscillators relevant to a wide class of systems, ranging from neuronal populations to electrical circuits, over network topologies varying from a regular ring to a random network. We find that for sufficiently strong coupling strengths the trajectories of the system escape to infinity in the regular ring network. However when a fraction of the regular connections are dynamically randomized, the unbounded growth is suppressed and the system remains bounded. Further, we find a scaling relation between the critical fraction of random links necessary for successful prevention of explosive behavior and the network rewiring time-scale. These results suggest a mechanism by which blow-ups may be controlled in extended oscillator systems.
Discrete mechanics and special relativistic random walks.
Wall, F T
1988-05-01
Random walks with step lengths equal to the shortest possible physically meaningful distances are considered from the point of view of special relativity involving two observers moving uniformly with respect to each other. A requirement of statistical equivalence of the probability distributions seen by those observers leads to the Lorentz transformations, provided a randomly moving particle shifts from one submicroscopic cell of uncertainty to a neighbor with a speed equivalent to that of light. Ordinary smooth motion would appear to involve a tremendous amount of submicroscopic back and forth randomness subject to a statistical bias favoring a particular direction. The diffusive nature of the motion naturally leads to a spreading of the probability distribution.
Social patterns revealed through random matrix theory
NASA Astrophysics Data System (ADS)
Sarkar, Camellia; Jalan, Sarika
2014-11-01
Despite the tremendous advancements in the field of network theory, very few studies have taken into consideration the weights of interactions, which emerge naturally in all real-world systems. Using random matrix analysis of a weighted social network, we demonstrate the profound impact of interaction weights on emerging structural properties. The analysis reveals that randomness present in a particular time frame affects the decisions of individuals, giving them more freedom of choice in situations of financial security. While the structural organization of the networks remains the same across all datasets, random matrix theory provides insight into the interaction patterns of individuals in the society in situations of crisis. It has also been contemplated that individual accountability, in terms of weighted interactions, remains a key to success unless segregation of tasks comes into play.
Portfolio optimization and the random magnet problem
NASA Astrophysics Data System (ADS)
Rosenow, B.; Plerou, V.; Gopikrishnan, P.; Stanley, H. E.
2002-08-01
Diversification of an investment into independently fluctuating assets reduces its risk. In reality, movements of assets are mutually correlated and therefore knowledge of cross-correlations among asset price movements are of great importance. Our results support the possibility that the problem of finding an investment in stocks which exposes invested funds to a minimum level of risk is analogous to the problem of finding the magnetization of a random magnet. The interactions for this "random magnet problem" are given by the cross-correlation matrix C of stock returns. We find that random matrix theory allows us to make an estimate for C which outperforms the standard estimate in terms of constructing an investment which carries a minimum level of risk.
Random field estimation approach to robot dynamics
NASA Technical Reports Server (NTRS)
Rodriguez, Guillermo
1990-01-01
The difference equations of Kalman filtering and smoothing recursively factor and invert the covariance of the output of a linear state-space system driven by a white-noise process. Here it is shown that similar recursive techniques factor and invert the inertia matrix of a multibody robot system. The random field models are based on the assumption that all of the inertial (D'Alembert) forces in the system are represented by a spatially distributed white-noise model. They are easier to describe than the models based on classical mechanics, which typically require extensive derivation and manipulation of equations of motion for complex mechanical systems. With the spatially random models, more primitive locally specified computations result in a global collective system behavior equivalent to that obtained with deterministic models. The primary goal of applying random field estimation is to provide a concise analytical foundation for solving robot control and motion planning problems.
Random lasing with spatially nonuniform gain
NASA Astrophysics Data System (ADS)
Fan, Ting; Lü, Jiantao
2016-07-01
Spatial and spectral properties of random lasing with spatially nonuniform gain were investigated in a two-dimensional (2D) disordered medium. The pumping light was described by an individual electric field and coupled into the rate equations through the polarization equation. The spatially nonuniform gain arises from multiple scattering of this pumping light. Numerical simulations of the random system with uniform and nonuniform gain were performed in both the weak and strong scattering regimes. In the weakly scattering sample, all lasing modes correspond to those of the passive system whether or not the nonuniform gain is considered. In the strong scattering regime, however, new lasing modes appear with nonuniform gain as the localization area changes. Our results show that random lasing behavior is described more accurately when the nonuniform gain originating from multiple light scattering is taken into account.
Quantum walks on a random environment
Yin Yue; Katsanos, D. E.; Evangelou, S. N.
2008-02-15
Quantum walks are considered in a one-dimensional random medium characterized by static or dynamic disorder. Quantum interference for static disorder can lead to Anderson localization, which completely hinders the quantum walk; this is contrasted with the decoherence effect of dynamic disorder of strength W, where a quantum-to-classical crossover at time t_c ∝ W^(-2) transforms the quantum walk into an ordinary random walk with diffusive spreading. We demonstrate these localization and decoherence phenomena in quantum carpets of the observed time evolution, relate our results to previously studied models of decoherence for quantum walks, and examine in detail a dimer lattice, which corresponds to a single qubit subject to randomness.
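The quantum-to-classical crossover can be illustrated with a quantum-trajectory toy model (our own sketch, not the paper's model): decoherence is mimicked by measuring the coin of a Hadamard walk with some probability per step. A pure walk spreads ballistically (rms position ∝ t), while a fully decohered walk spreads diffusively (rms ∝ √t).

```python
import math
import random

def qwalk_rms(steps, p_measure, trajectories=50):
    """Root-mean-square position of a Hadamard walk on the line, with the
    coin measured each step with probability p_measure (a quantum-
    trajectory caricature of dynamic disorder)."""
    s = 1 / math.sqrt(2)
    total = 0.0
    for _ in range(trajectories):
        amp = {(0, 0): s + 0j, (0, 1): 1j * s}     # symmetric initial coin
        for _ in range(steps):
            # Hadamard on the coin, then coin-conditioned shift
            new = {}
            for (x, c), a in amp.items():
                h0 = a * s                           # amplitude to coin 0
                h1 = a * s * (1 if c == 0 else -1)   # amplitude to coin 1
                new[(x - 1, 0)] = new.get((x - 1, 0), 0) + h0
                new[(x + 1, 1)] = new.get((x + 1, 1), 0) + h1
            amp = new
            if random.random() < p_measure:          # decohere: measure coin
                p0 = sum(abs(a) ** 2 for (x, c), a in amp.items() if c == 0)
                keep = 0 if random.random() < p0 else 1
                norm = math.sqrt(p0 if keep == 0 else 1 - p0)
                amp = {(x, c): a / norm
                       for (x, c), a in amp.items() if c == keep}
        total += sum(abs(a) ** 2 * x * x for (x, c), a in amp.items())
    return math.sqrt(total / trajectories)
```

At 40 steps the undisturbed walk has rms position near 0.54 × 40 ≈ 22, while measuring the coin every step collapses the spreading to roughly √40 ≈ 6.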
Mesoscopic description of random walks on combs.
Méndez, Vicenç; Iomin, Alexander; Campos, Daniel; Horsthemke, Werner
2015-12-01
Combs are a simple caricature of various types of natural branched structures, which belong to the category of loopless graphs and consist of a backbone and branches. We study continuous time random walks on combs and present a generic method to obtain their transport properties. The random walk along the branches may be biased, and we account for the effect of the branches by renormalizing the waiting time probability distribution function for the motion along the backbone. We analyze the overall diffusion properties along the backbone and find normal diffusion, anomalous diffusion, and stochastic localization (diffusion failure), respectively, depending on the characteristics of the continuous time random walk along the branches, and compare our analytical results with stochastic simulations. PMID:26764637
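A direct stochastic simulation of a comb makes the trapping effect of the branches visible. The sketch below is a discrete-time, unbiased special case with infinite one-sided teeth (an illustration, not the paper's continuous-time formalism): the mean squared displacement along the backbone grows like √t rather than t.

```python
import random

def comb_msd(t_max, walkers=2000):
    """Mean squared backbone displacement of a random walk on a comb.

    At a backbone site (y == 0) the walker steps left, right, or into
    the tooth with equal probability; inside a tooth it moves up or
    down with probability 1/2.  Teeth trap the walker and slow
    transport along the backbone.
    """
    total = 0.0
    for _ in range(walkers):
        x = y = 0
        for _ in range(t_max):
            if y == 0:
                m = random.randrange(3)
                if m == 0:
                    x -= 1
                elif m == 1:
                    x += 1
                else:
                    y += 1
            else:
                y += random.choice((-1, 1))
        total += x * x
    return total / walkers
```

Quadrupling the time roughly doubles the backbone MSD here (exponent 1/2), whereas ordinary diffusion would quadruple it.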
Randomness in post-selected events
NASA Astrophysics Data System (ADS)
Phuc Thinh, Le; de la Torre, Gonzalo; Bancal, Jean-Daniel; Pironio, Stefano; Scarani, Valerio
2016-03-01
Bell inequality violations can be used to certify private randomness for use in cryptographic applications. In photonic Bell experiments, a large amount of the data that is generated comes from no-detection events and presumably contains little randomness. This raises the question as to whether randomness can be extracted only from the smaller post-selected subset corresponding to proper detection events, instead of from the entire set of data. This could in principle be feasible without opening an analogue of the detection loophole as long as the min-entropy of the post-selected data is evaluated by taking all the information into account, including no-detection events. The possibility of extracting randomness from a short string has a practical advantage, because it reduces the computational time of the extraction. Here, we investigate the above idea in a simple scenario, where the devices and the adversary behave according to i.i.d. strategies. We show that indeed almost all the randomness is present in the pair of outcomes for which at least one detection happened. We further show that in some cases applying a pre-processing on the data can capture features that an analysis based on global frequencies only misses, thus resulting in the certification of more randomness. We then briefly consider non-i.i.d. strategies and provide an explicit example of such a strategy that is more powerful than any i.i.d. one even in the asymptotic limit of infinitely many measurement rounds, something that was not reported before in the context of Bell inequalities.
ERIC Educational Resources Information Center
Bradshaw, Ceri A.; Reed, Phil
2012-01-01
In three experiments, human participants pressed the space bar on a computer keyboard to earn points on random-ratio (RR) and random-interval (RI) schedules of reinforcement. Verbalized contingency awareness (CA) for each schedule was measured after the entire task (Experiments 1 and 2), or after each RR-RI trial (Experiment 3). In all three…
Estimation of the Nonlinear Random Coefficient Model when Some Random Effects Are Separable
ERIC Educational Resources Information Center
du Toit, Stephen H. C.; Cudeck, Robert
2009-01-01
A method is presented for marginal maximum likelihood estimation of the nonlinear random coefficient model when the response function has some linear parameters. This is done by writing the marginal distribution of the repeated measures as a conditional distribution of the response given the nonlinear random effects. The resulting distribution…
A Digitally Addressable Random-Access Image Selector and Random-Access Audio System.
ERIC Educational Resources Information Center
Bitzer, Donald L.; And Others
The requirements of PLATO IV, a computer based education system at the University of Illinois, have led to the development of an improved, digitally addressable, random access image selector and a digitally addressable, random access audio device. Both devices utilize pneumatically controlled mechanical binary adders to position the mechanical…
The Analysis of Completely Randomized Factorial Experiments When Observations Are Lost at Random.
ERIC Educational Resources Information Center
Hummel, Thomas J.
An investigation was conducted of the characteristics of two estimation procedures and corresponding test statistics used in the analysis of completely randomized factorial experiments when observations are lost at random. For one estimator, contrast coefficients for cell means did not involve the cell frequencies. For the other, contrast…
Formulation and Application of the Hierarchical Generalized Random-Situation Random-Weight MIRID
ERIC Educational Resources Information Center
Hung, Lai-Fa
2011-01-01
The process-component approach has become quite popular for examining many psychological concepts. A typical example is the model with internal restrictions on item difficulty (MIRID) described by Butter (1994) and Butter, De Boeck, and Verhelst (1998). This study proposes a hierarchical generalized random-situation random-weight MIRID. The…
A Random Variable Related to the Inversion Vector of a Partial Random Permutation
ERIC Educational Resources Information Center
Laghate, Kavita; Deshpande, M. N.
2005-01-01
In this article, we define the inversion vector of a permutation of the integers 1, 2,..., n. We set up a particular kind of permutation, called a partial random permutation. The sum of the elements of the inversion vector of such a permutation is a random variable of interest.
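The inversion vector referred to above is straightforward to compute. In the standard definition (assumed here, since the abstract does not spell it out), entry b_k counts the elements preceding the value k that are larger than it, so the entries of b sum to the total number of inversions of the permutation:

```python
def inversion_vector(perm):
    """b[k-1] = number of elements preceding the value k in perm that
    exceed k; the entries sum to the inversion count of perm."""
    pos = {v: i for i, v in enumerate(perm)}
    n = len(perm)
    return [sum(1 for j in range(pos[k]) if perm[j] > k)
            for k in range(1, n + 1)]

def total_inversions(perm):
    """The random variable studied in the article: the sum of the
    entries of the inversion vector."""
    return sum(inversion_vector(perm))
```

For example, the permutation (3, 1, 2) has inversion vector (1, 1, 0) and two inversions in total.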
Polarization-modulated random fiber laser
NASA Astrophysics Data System (ADS)
Wu, Han; Wang, Zinan; He, Qiheng; Fan, Mengqiu; Li, Yunqi; Sun, Wei; Zhang, Li; Li, Yi; Rao, Yunjiang
2016-05-01
In this letter, we propose and experimentally demonstrate a polarization-modulated random fiber laser (RFL) for the first time. It is found that the output power of the half-opened RFL with polarized pumping is sensitive to the state of polarization (SOP) of the Stokes light in a fiber loop acting as a mirror. By inserting a polarization switch (PSW) in the loop mirror, the state of the random lasing can be switched between on/off states, thus such a polarization-modulated RFL can generate pulsed output with high extinction ratio.
Evolutionary Phase Transitions in Random Environments.
Skanata, Antun; Kussell, Edo
2016-07-15
We present analytical results for long-term growth rates of structured populations in randomly fluctuating environments, which we apply to predict how cellular response networks evolve. We show that networks which respond rapidly to a stimulus will evolve phenotypic memory exclusively under random (i.e., nonperiodic) environments. We identify the evolutionary phase diagram for simple response networks, which we show can exhibit both continuous and discontinuous transitions. Our approach enables exact analysis of diverse evolutionary systems, from viral epidemics to emergence of drug resistance. PMID:27472146
Random sequential adsorption of trimers and hexamers.
Cieśla, Michał; Barbasz, Jakub
2013-12-01
Adsorption of trimers and hexamers built of identical spheres was studied numerically using the random sequential adsorption (RSA) algorithm. Particles were adsorbed on a two-dimensional, flat and homogeneous surface. Numerical simulations allowed us to determine the maximal random coverage ratio, RSA kinetics as well as the available surface function (ASF), which is crucial for determining the kinetics of the adsorption process obtained experimentally. Additionally, the density autocorrelation function was measured. All the results were compared with previous results obtained for spheres, dimers and tetramers.
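The RSA algorithm itself is simple to sketch. The toy version below adsorbs single disks (monomers) rather than the paper's trimers or hexamers, on the unit square with illustrative parameters; each trial drops a disk at a uniform position and keeps it only if it overlaps no previously adsorbed disk.

```python
import math
import random

def rsa_disks(radius=0.03, attempts=20000):
    """Random sequential adsorption of equal disks on the unit square.
    Returns (centers, coverage), where coverage is the adsorbed area
    fraction (boundary clipping ignored)."""
    centers = []
    d2 = (2 * radius) ** 2                   # squared contact distance
    for _ in range(attempts):
        p = (random.random(), random.random())
        # accept only if the new disk overlaps no existing disk
        if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 >= d2
               for q in centers):
            centers.append(p)
    coverage = len(centers) * math.pi * radius ** 2
    return centers, coverage
```

For monodisperse disks the coverage approaches the known 2D jamming limit of about 0.547 as attempts grow; shaped particles such as trimers jam at different densities, which is the quantity the paper measures.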
System characterization in nonlinear random vibration
Paez, T.L.; Gregory, D.L.
1986-01-01
Linear structural models are frequently used for structural system characterization and analysis. In most situations they can provide satisfactory results, but under some circumstances they are insufficient for system definition. The present investigation proposes a model for nonlinear structure characterization, and demonstrates how the functions describing the model can be identified using a random vibration experiment. Further, it is shown that the model is sufficient to completely characterize the stationary random vibration response of a structure that has a harmonic frequency generating form of nonlinearity. An analytical example is presented to demonstrate the plausibility of the model.
Random grid fern for visual tracking
NASA Astrophysics Data System (ADS)
Cheng, Fei; Liu, Kai; Zhang, Jin; Li, YunSong
2014-05-01
Visual tracking is one of the significant research directions in computer vision. Although the standard random ferns tracking method performs well owing to the random spatial arrangement of its binary tests, it ignores the effect of image locality on the ferns' descriptive ability, which prevents them from describing the object more accurately and robustly. This paper proposes a novel spatial arrangement of binary tests that divides the bounding box into grids in order to retain more details of the image for visual tracking. Experimental results show that this method can improve tracking accuracy effectively.
Bootstrapped models for intrinsic random functions
Campbell, K.
1988-08-01
Use of intrinsic random function stochastic models as a basis for estimation in geostatistical work requires the identification of the generalized covariance function of the underlying process. The fact that this function has to be estimated from data introduces an additional source of error into predictions based on the model. This paper develops the sample reuse procedure called the bootstrap in the context of intrinsic random functions to obtain realistic estimates of these errors. Simulation results support the conclusion that bootstrap distributions of functionals of the process, as well as their kriging variance, provide a reasonable picture of variability introduced by imperfect estimation of the generalized covariance function.
Bootstrapped models for intrinsic random functions
Campbell, K.
1987-01-01
The use of intrinsic random function stochastic models as a basis for estimation in geostatistical work requires the identification of the generalized covariance function of the underlying process, and the fact that this function has to be estimated from the data introduces an additional source of error into predictions based on the model. This paper develops the sample reuse procedure called the ''bootstrap'' in the context of intrinsic random functions to obtain realistic estimates of these errors. Simulation results support the conclusion that bootstrap distributions of functionals of the process, as well as of their ''kriging variance,'' provide a reasonable picture of the variability introduced by imperfect estimation of the generalized covariance function.
Clinical Research Methodology 3: Randomized Controlled Trials.
Sessler, Daniel I; Imrey, Peter B
2015-10-01
Randomized assignment of treatment excludes reverse causation and selection bias and, in sufficiently large studies, effectively prevents confounding. Well-implemented blinding prevents measurement bias. Studies that include these protections are called randomized, blinded clinical trials and, when conducted with sufficient numbers of patients, provide the most valid results. Although conceptually straightforward, design of clinical trials requires thoughtful trade-offs among competing approaches-all of which influence the number of patients required, enrollment time, internal and external validity, ability to evaluate interactions among treatments, and cost.
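Randomized assignment is commonly implemented with permuted blocks, which keep the arm sizes balanced throughout accrual. The sketch below is a generic illustration of that technique (not tied to any particular trial discussed here):

```python
import random

def block_randomization(n_blocks=10, block_size=4, arms=("A", "B")):
    """Permuted-block allocation list: each block contains every arm
    equally often in random order, so group sizes never differ by more
    than half a block at any point during enrollment."""
    per_arm = block_size // len(arms)
    allocation = []
    for _ in range(n_blocks):
        block = list(arms) * per_arm
        random.shuffle(block)                # randomize order within block
        allocation.extend(block)
    return allocation
```

In blinded trials the block size itself is often concealed (or varied) so that investigators cannot predict upcoming assignments from the balance constraint.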
Searching for nodes in random graphs.
Lancaster, David
2011-11-01
We consider the problem of searching for a node on a labeled random graph according to a greedy algorithm that selects a route to the desired node using metric information on the graph. Motivated by peer-to-peer networks two types of random graph are proposed with properties particularly amenable to this kind of algorithm. We derive equations for the probability that the search is successful and also study the number of hops required, finding both numerical and analytic evidence of a transition as the number of links is varied.
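A greedy metric search of the kind described above can be sketched on a simple labeled graph. The toy model below (our own illustrative stand-in, not the paper's graph ensembles) places nodes on a ring, adds random long-range links, and always hops to the neighbour nearest the target in ring distance; the ring edges guarantee progress, so this particular search always succeeds.

```python
import random

def ring_distance(a, b, n):
    """Shortest distance between labels a and b on a ring of n nodes."""
    d = abs(a - b) % n
    return min(d, n - d)

def build_graph(n, extra_links=2):
    """Ring lattice plus `extra_links` uniform random links per node."""
    adj = [set() for _ in range(n)]
    for u in range(n):
        for v in ((u - 1) % n, (u + 1) % n):
            adj[u].add(v)
            adj[v].add(u)
        for _ in range(extra_links):
            v = random.randrange(n)
            if v != u:
                adj[u].add(v)
                adj[v].add(u)
    return [sorted(s) for s in adj]

def greedy_search(adj, start, target):
    """Greedy routing: move to the neighbour nearest the target in ring
    distance.  Returns the hop count."""
    n, u, hops = len(adj), start, 0
    while u != target:
        u = min(adj[u], key=lambda v: ring_distance(v, target, n))
        hops += 1
    return hops
```

Because some neighbour always strictly decreases the ring distance, the hop count is bounded by the initial distance; the interesting question studied in the paper is how much the random links shorten it.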
Attractors and Time Averages for Random Maps
NASA Astrophysics Data System (ADS)
Araujo, Vitor
2006-07-01
Considering random noise in finite dimensional parameterized families of diffeomorphisms of a compact finite dimensional boundaryless manifold M, we show the existence of time averages for almost every orbit of each point of M, imposing mild conditions on the families. Moreover these averages are given by a finite number of physical absolutely continuous stationary probability measures. We use this result to deduce that situations with infinitely many sinks and Henon-like attractors are not stable under random perturbations, e.g., Newhouse's and Colli's phenomena in the generic unfolding of a quadratic homoclinic tangency by a one-parameter family of diffeomorphisms.
Quantum Random Walks with General Particle States
NASA Astrophysics Data System (ADS)
Belton, Alexander C. R.
2014-06-01
A convergence theorem is obtained for quantum random walks with particles in an arbitrary normal state. This unifies and extends previous work on repeated-interactions models, including that of Attal and Pautrat (Ann Henri Poincaré 7:59-104, 2006) and Belton (J Lond Math Soc 81:412-434, 2010; Commun Math Phys 300:317-329, 2010). When the random-walk generator acts by ampliation and either multiplication or conjugation by a unitary operator, it is shown that the quantum stochastic cocycle which arises in the limit is driven by a unitary process.
Active remote sensing of random media
Zuniga, M.; Kong, J.A.
1980-01-01
Analytical results for the bistatic scattering coefficients and the backscattering cross sections have been derived for active remote sensing of earth terrain with the model of bounded random media, which accounts for volume-scattering effects. It is found that, as a result of the effect of the second boundary, the horizontally polarized return σ_hh can be greater than the vertically polarized return σ_vv, whereas for a half-space random medium σ_vv is always greater than σ_hh. We illustrate by matching the theoretical results with experimental data collected from a vegetation field.
Random Walk Weakly Attracted to a Wall
NASA Astrophysics Data System (ADS)
de Coninck, Joël; Dunlop, François; Huillet, Thierry
2008-10-01
We consider a random walk X_n in ℤ+, starting at X_0 = x ≥ 0, with transition probabilities P(X_{n+1} = y ± 1 | X_n = y ≥ 1) = 1/2 ∓ δ/(4y + 2δ), and X_{n+1} = 1 whenever X_n = 0. We prove E X_n ~ const · n^{1 - δ/2} as n ↗ ∞ when δ ∈ (1, 2). The proof is based upon the Karlin-McGregor spectral representation, which is made explicit for this random walk.
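The stated scaling of E X_n can be probed by direct Monte Carlo. The sketch below simulates the walk with its transition probabilities and compares the mean position at two times; with δ = 1.5 the predicted growth exponent is 1 − δ/2 = 1/4, so multiplying the time by 16 should roughly double the mean (parameters are our illustrative choices).

```python
import random

def wall_walk_mean(delta, checkpoints, walkers=600):
    """Monte Carlo estimate of E[X_n] at the given checkpoint times for
    the wall-attracted walk: from y >= 1 step up with probability
    1/2 - delta/(4y + 2*delta), down otherwise; from 0 always step to 1."""
    n_max = max(checkpoints)
    sums = {n: 0.0 for n in checkpoints}
    for _ in range(walkers):
        y = 0
        for n in range(1, n_max + 1):
            if y == 0:
                y = 1
            else:
                p_up = 0.5 - delta / (4 * y + 2 * delta)
                y += 1 if random.random() < p_up else -1
            if n in sums:
                sums[n] += y
        # each walker contributes its position at every checkpoint
    return {n: s / walkers for n, s in sums.items()}
```

A symmetric walk would instead scale like √n (mean-ratio 4 over a 16-fold time span), so the sub-√n growth seen here reflects the weak attraction to the wall.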
Correlations between outcomes of random measurements
NASA Astrophysics Data System (ADS)
Tran, Minh Cong; Dakić, Borivoje; Laskowski, Wiesław; Paterek, Tomasz
2016-10-01
We recently showed that multipartite correlations between outcomes of random observables detect quantum entanglement in all pure and some mixed states. In this follow-up article we further develop this approach, derive the maximal amount of such correlations, and show that they are not monotonic under local operations and classical communication. Nevertheless, we demonstrate their usefulness in entanglement detection with a single random observable per party. Finally we study the convex-roof extension of the correlations and provide a closed-form necessary and sufficient condition for entanglement in rank-2 mixed states and a witness in general.
A random matrix approach to credit risk.
Münnix, Michael C; Schäfer, Rudi; Guhr, Thomas
2014-01-01
We estimate generic statistical properties of a structural credit risk model by considering an ensemble of correlation matrices. This ensemble is set up by Random Matrix Theory. We demonstrate analytically that the presence of correlations severely limits the effect of diversification in a credit portfolio if the correlations are not identically zero. The existence of correlations alters the tails of the loss distribution considerably, even if their average is zero. Under the assumption of randomly fluctuating correlations, a lower bound for the estimation of the loss distribution is provided.
Small, Dylan S; Ten Have, Thomas R; Joffe, Marshall M; Cheng, Jing
2006-06-30
We present a random effects logistic approach for estimating the efficacy of treatment for compliers in a randomized trial with treatment non-adherence and longitudinal binary outcomes. We use our approach to analyse a primary care depression intervention trial. The use of a random effects model to estimate efficacy supplements intent-to-treat longitudinal analyses based on random effects logistic models that are commonly used in primary care depression research. Our estimation approach is an extension of Nagelkerke et al.'s instrumental variables approximation for cross-sectional binary outcomes. Our approach is easily implementable with standard random effects logistic regression software. We show through a simulation study that our approach provides reasonably accurate inferences for the setting of the depression trial under model assumptions. We also evaluate the sensitivity of our approach to model assumptions for the depression trial.
Lasser, Robert A; Dirks, Bryan; Nasrallah, Henry; Kirsch, Courtney; Gao, Joseph; Pucci, Michael L; Knesevich, Mary A; Lindenmayer, Jean-Pierre
2013-10-01
Negative symptoms of schizophrenia (NSS), related to hypodopaminergic activity in the mesocortical pathway and prefrontal cortex, are predictive of poor outcomes and have no effective treatment. Use of dopamine-enhancing drugs (eg, psychostimulants) has been limited by potential adverse effects. This multicenter study examined lisdexamfetamine dimesylate (LDX), a d-amphetamine prodrug, as adjunctive therapy to antipsychotics in adults with clinically stable schizophrenia and predominant NSS. Outpatients with stable schizophrenia, predominant NSS, limited positive symptoms, and maintained on stable atypical antipsychotic therapy underwent a 3-week screening, 10-week open-label adjunctive LDX (20-70 mg/day), and 4-week, double-blind, randomized, placebo-controlled withdrawal. Efficacy measures included a modified Scale for the Assessment of Negative Symptoms (SANS-18) and Positive and Negative Syndrome Scale (PANSS) total and subscale scores. Ninety-two participants received open-label LDX; 69 received double-blind therapy with placebo (n=35) or LDX (n=34). At week 10 (last observation carried forward; last open-label visit), mean (95% confidence interval) change in SANS-18 scores was -12.9 (-15.0, -10.8; P<0.0001). At week 10, 52.9% of participants demonstrated a minimum of 20% reduction from baseline in SANS-18 score. Open-label LDX was also associated with significant improvement in PANSS total and subscale scores. During the double-blind/randomized-withdrawal phase, no significant differences (change from randomization baseline) were found between placebo and LDX in SANS-18 or PANSS subscale scores. In adults with clinically stable schizophrenia, open-label LDX appeared to be associated with significant improvements in negative symptoms without positive symptom worsening. Abrupt LDX discontinuation was not associated with positive or negative symptom worsening. Confirmation with larger controlled trials is warranted. PMID:23756608
Wang, Gang; McIntyre, Alexander; Earley, Willie R.; Raines, Shane; Eriksson, Hans
2012-01-01
Objectives Evaluate the efficacy and tolerability of once-daily extended release quetiapine fumarate (quetiapine XR) monotherapy in patients with major depressive disorder (MDD). Methods 10-week (8-week active-treatment/2-week post-treatment), randomized, double-blind, placebo- and active-controlled study (D1448C00004). Patients received quetiapine XR 150 mg/day, escitalopram 10 mg/day, or placebo; patients with an inadequate response (<20% improvement in MADRS total score) at Week 2 had their treatment dose doubled. Primary endpoint: Week 8 change from randomization in MADRS total score. Secondary endpoints included: MADRS response (≥50% improvement) and remission (score ≤8), HAM-D total and Item 1, HAM-A total, psychic and somatic, CGI-S total, PSQI global, and Q-LES-Q-SF% maximum total scores; tolerability was assessed throughout. Results 471 patients were randomized. No significant improvements in MADRS total score were observed at Week 8 (LOCF) with either active treatment (quetiapine XR, −17.21 [p=0.174]; escitalopram, −16.73 [p=0.346]) versus placebo (−15.61). There were no significant differences in secondary endpoints versus placebo, with the exception of Week 8 change in PSQI global score (quetiapine XR, −4.96 [p < 0.01] versus placebo, −3.37). MMRM analysis of observed cases data suggested that the primary analysis may not be robust. Most commonly reported AEs included: dry mouth, somnolence, and dizziness for quetiapine XR; headache and nausea for escitalopram. Conclusions In this study, neither quetiapine XR (150/300 mg/day) nor escitalopram (10/20 mg/day) showed significant separation from placebo. Both compounds have been shown previously to be effective in the treatment of MDD; possible reasons for this failed study are discussed. Quetiapine XR was generally well tolerated with a profile similar to that reported previously.
Random walk in generalized quantum theory
Martin, Xavier; O'Connor, Denjoe; Sorkin, Rafael D.
2005-01-15
One can view quantum mechanics as a generalization of classical probability theory that provides for pairwise interference among alternatives. Adopting this perspective, we 'quantize' the classical random walk by finding, subject to a certain condition of 'strong positivity', the most general Markovian, translationally invariant 'decoherence functional' with nearest neighbor transitions.
Top-hat random fiber Bragg grating.
Yin, Hongwei; Gbadebo, Adenowo; Turitsyna, Elena G
2015-08-01
We examined the possibility of using noise or pseudo-random variations of the refractive index in the design of fiber Bragg gratings (FBGs). We demonstrated theoretically and experimentally that top-hat FBGs may be designed and fabricated using this approach. The reflectivity of the fabricated top-hat FBG matches quite well with that of the designed one. PMID:26258365
Sampled-Data Consensus Over Random Networks
NASA Astrophysics Data System (ADS)
Wu, Junfeng; Meng, Ziyang; Yang, Tao; Shi, Guodong; Johansson, Karl Henrik
2016-09-01
This paper considers the consensus problem for a network of nodes with random interactions and sampled-data control actions. We first show that consensus in expectation, in mean square, and almost surely are equivalent for a general random network model when the inter-sampling interval and network size satisfy a simple relation. The three types of consensus are shown to be simultaneously achieved over an independent or a Markovian random network defined on an underlying graph with a directed spanning tree. For both independent and Markovian random network models, necessary and sufficient conditions for mean-square consensus are derived in terms of the spectral radius of the corresponding state transition matrix. These conditions are then interpreted as the existence of a critical value of the inter-sampling interval, below which global mean-square consensus is achieved and above which the system diverges in the mean-square sense for some initial states. Finally, we establish an upper bound on the inter-sampling interval below which almost sure consensus is reached, and a lower bound on the inter-sampling interval above which almost sure divergence is reached. Some numerical simulations are given to validate the theoretical results and some discussions on the critical value of the inter-sampling intervals for the mean-square consensus are provided.
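The flavor of the sampled-data consensus result above can be illustrated with a toy simulation. A minimal sketch, assuming an independent Erdős-Rényi interaction graph redrawn at each sampling instant and a simple gradient-style update in which the gain `eps` plays the role of the inter-sampling interval; node count, edge probability, and gain are illustrative choices, not taken from the paper:

```python
import random

def consensus_step(x, p_edge, eps, rng):
    """One sampled-data update over an independent random (Erdos-Renyi)
    interaction graph: x_i <- x_i + eps * sum_j a_ij (x_j - x_i),
    with symmetric (undirected) edges drawn anew at each sample."""
    n = len(x)
    dx = [0.0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p_edge:      # edge {i, j} active this sample
                dx[i] += eps * (x[j] - x[i])
                dx[j] += eps * (x[i] - x[j])
    return [xi + di for xi, di in zip(x, dx)]

rng = random.Random(3)
x0 = [rng.uniform(-1.0, 1.0) for _ in range(20)]
x = x0[:]
for _ in range(200):
    x = consensus_step(x, p_edge=0.2, eps=0.05, rng=rng)
spread = max(x) - min(x)   # shrinks toward 0 when eps is small enough
```

Because the pairwise updates are symmetric, the state sum is conserved exactly, so the nodes agree on the initial average; a large `eps` (too long an inter-sampling interval) would instead make the update unstable, mirroring the critical-value phenomenon described in the abstract.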
Wave propagation on a random lattice
Sahlmann, Hanno
2010-09-15
Motivated by phenomenological questions in quantum gravity, we consider the propagation of a scalar field on a random lattice. We describe a procedure to calculate the dispersion relation for the field by taking a limit of a periodic lattice. We use this to calculate the lowest order coefficients of the dispersion relation for a specific one-dimensional model.
Reducing financial avalanches by random investments.
Biondo, Alessio Emanuele; Pluchino, Alessandro; Rapisarda, Andrea; Helbing, Dirk
2013-12-01
Building on similarities between earthquakes and extreme financial events, we use a self-organized criticality-generating model to study herding and avalanche dynamics in financial markets. We consider a community of interacting investors, distributed in a small-world network, who bet on the bullish (increasing) or bearish (decreasing) behavior of the market which has been specified according to the S&P 500 historical time series. Remarkably, we find that the size of herding-related avalanches in the community can be strongly reduced by the presence of a relatively small percentage of traders, randomly distributed inside the network, who adopt a random investment strategy. Our findings suggest a promising strategy to limit the size of financial bubbles and crashes. We also obtain that the resulting wealth distribution of all traders corresponds to the well-known Pareto power law, while that of random traders is exponential. In other words, for technical traders, the risk of losses is much greater than the probability of gains compared to those of random traders. PMID:24483518
Average Transmission Probability of a Random Stack
ERIC Educational Resources Information Center
Lu, Yin; Miniatura, Christian; Englert, Berthold-Georg
2010-01-01
The transmission through a stack of identical slabs that are separated by gaps with random widths is usually treated by calculating the average of the logarithm of the transmission probability. We show how to calculate the average of the transmission probability itself with the aid of a recurrence relation and derive analytical upper and lower…
Cooperation for volunteering and partially random partnerships
NASA Astrophysics Data System (ADS)
Szabó, György; Vukov, Jeromos
2004-03-01
Competition among cooperative, defective, and loner strategies is studied by considering an evolutionary prisoner’s dilemma game for different partnerships. In this game each player can adopt one of its coplayers' strategies with a probability depending on the difference of payoffs coming from games with the corresponding coplayers. Our attention is focused on the effects of annealed and quenched randomness in the partnership for a fixed number of coplayers. It is shown that only the loners survive if the four coplayers are chosen randomly (mean-field limit). On the contrary, on the square lattice all three strategies are maintained by cyclic invasions resulting in a self-organizing spatial pattern. If the fixed partnership is described by a regular small-world structure then a homogeneous oscillation occurs in the population dynamics when the measure of quenched randomness exceeds a threshold value. Similar behavior with higher sensitivity to the randomness is found if temporary partners are substituted for the standard ones with some probability at each step of iteration.
Random Walk Method for Potential Problems
NASA Technical Reports Server (NTRS)
Krishnamurthy, T.; Raju, I. S.
2002-01-01
A local Random Walk Method (RWM) for potential problems governed by Laplace's and Poisson's equations is developed for two- and three-dimensional problems. The RWM is implemented and demonstrated in a multiprocessor parallel environment on a Beowulf cluster of computers. A speed gain of 16 is achieved as the number of processors is increased from 1 to 23.
Detecting targets hidden in random forests
NASA Astrophysics Data System (ADS)
Kouritzin, Michael A.; Luo, Dandan; Newton, Fraser; Wu, Biao
2009-05-01
Military tanks, cargo or troop carriers, missile carriers or rocket launchers often hide from detection in forests, which complicates the problem of locating these targets. An electro-optic camera mounted on a surveillance aircraft or unmanned aerial vehicle is used to capture images of the forests with possible hidden targets, e.g., rocket launchers. We consider random forests with longitudinal and latitudinal correlations. Specifically, foliage coverage is encoded with a binary representation (i.e., foliage or no foliage), and is correlated in adjacent regions. We address the detection problem of camouflaged targets hidden in random forests by building memory into the observations. In particular, we propose an efficient algorithm to generate random forests, ground, and camouflage of hidden targets with two-dimensional correlations. The observations are a sequence of snapshots consisting of foliage-obscured ground or target. Theoretically, detection is possible because there are subtle differences in the correlations of the ground and the camouflage of the rocket launcher. However, these differences are well beyond human perception. To detect the presence of hidden targets automatically, we develop a Markov representation for these sequences and modify the classical filtering equations to allow the Markov chain observation. Particle filters are used to estimate the position of the targets in combination with a novel random weighting technique. Furthermore, we give positive proof-of-concept simulations.
A random matrix formulation of fidelity decay
NASA Astrophysics Data System (ADS)
Gorin, T.; Prosen, T.; Seligman, T. H.
2004-02-01
We propose to study echo dynamics in a random-matrix framework, where we assume that the perturbation is time-independent, random and orthogonally invariant. This allows us to use a basis in which the unperturbed Hamiltonian is diagonal and its properties are thus largely determined by its spectral statistics. We concentrate on the effect of spectral correlations usually associated with chaos and disregard secular variations in spectral density. We obtain analytical results for the fidelity decay in the linear-response regime. To extend the domain of validity, we heuristically exponentiate the linear-response result. The resulting expressions, exact in the perturbative limit, are accurate approximations in the transition region between the 'Fermi golden rule' and the perturbative regimes, as verified by example for a deterministic chaotic system. To sense the effect of spectral stiffness, we apply our model also to the extreme cases of random spectra and equidistant spectra. In our analytical approximations as well as in extensive Monte Carlo calculations, we find that fidelity decay is fastest for random spectra and slowest for equidistant ones, while the classical ensembles lie in between. We conclude that spectral stiffness systematically enhances fidelity.
Reporting Randomized Controlled Trials in Education
ERIC Educational Resources Information Center
Mayo-Wilson, Evan; Grant, Sean; Montgomery, Paul
2014-01-01
Randomized controlled trials (RCTs) are increasingly used to evaluate programs and interventions in order to inform education policy and practice. High quality reports of these RCTs are needed for interested readers to understand the rigor of the study, the interventions tested, and the context in which the evaluation took place (Mayo-Wilson et…
UNSTEADY DISPERSION IN RANDOM INTERMITTENT FLOW
The longitudinal dispersion coefficient of a conservative tracer was calculated from flow tests in a dead-end pipe loop system. Flow conditions for these tests ranged from laminar to transitional flow, and from steady to intermittent and random. Two static mixers linked in series...
Random-phase metasurfaces at optical wavelengths
NASA Astrophysics Data System (ADS)
Pors, Anders; Ding, Fei; Chen, Yiting; Radko, Ilya P.; Bozhevolnyi, Sergey I.
2016-06-01
Random-phase metasurfaces, in which the constituents scatter light with random phases, have the property that an incident plane wave will diffusely scatter, thereby leading to a complex far-field response that is most suitably described by statistical means. In this work, we present and exemplify the statistical description of the far-field response, particularly highlighting how the response for polarised and unpolarised light might be alike or different depending on the correlation of scattering phases for two orthogonal polarisations. By utilizing gap plasmon-based metasurfaces, consisting of an optically thick gold film overlaid by a subwavelength thin glass spacer and an array of gold nanobricks, we design and realize random-phase metasurfaces at a wavelength of 800 nm. Optical characterisation of the fabricated samples convincingly demonstrates the diffuse scattering of reflected light, with statistics obeying the theoretical predictions. We foresee the use of random-phase metasurfaces for camouflage applications and as high-quality reference structures in dark-field microscopy, while the control of the statistics for polarised and unpolarised light might find usage in security applications. Finally, by incorporating a certain correlation between scattering by neighbouring metasurface constituents new types of functionalities can be realised, such as a Lambertian reflector.
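The statistical description invoked above is the classic fully developed speckle picture: a coherent sum of many unit-amplitude fields with independent, uniformly distributed phases has Rayleigh-distributed amplitude, i.e. exponentially distributed intensity. A minimal Monte Carlo sketch (scatterer and sample counts are arbitrary illustrative values, not parameters from the paper):

```python
import cmath
import math
import random

def farfield_intensity(num_scatterers, rng):
    """Coherent sum of unit-amplitude scattered fields with iid uniform
    random phases; returns the intensity |E|^2 at one far-field point."""
    field = sum(cmath.exp(1j * rng.uniform(0.0, 2.0 * math.pi))
                for _ in range(num_scatterers))
    return abs(field) ** 2

rng = random.Random(42)
samples = [farfield_intensity(200, rng) for _ in range(4000)]
mean_I = sum(samples) / len(samples)
# For fully developed speckle the mean intensity equals the number of
# scatterers, and the intensity histogram is approximately exponential.
```

An exponential intensity distribution has about 63% of samples below the mean, which is a quick sanity check on the diffuse-scattering statistics.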
Routing in Networks with Random Topologies
NASA Technical Reports Server (NTRS)
Bambos, Nicholas
1997-01-01
We examine the problems of routing and server assignment in networks with random connectivities. In such a network the basic topology is fixed, but during each time slot and for each of tis input queues, each server (node) is either connected to or disconnected from each of its queues with some probability.
Randomness of Dengue Outbreaks on the Equator.
Chen, Yirong; Cook, Alex R; Lim, Alisa X L
2015-09-01
A simple mathematical model without seasonality indicated that the apparently chaotic dengue epidemics in Singapore have characteristics similar to epidemics resulting from chance. Randomness as a sufficient condition for patterns of dengue epidemics in equatorial regions calls into question existing explanations for dengue outbreaks there.
Object Recognition and Random Image Structure Evolution
ERIC Educational Resources Information Center
Sadr, Jvid; Sinha, Pawan
2004-01-01
We present a technique called Random Image Structure Evolution (RISE) for use in experimental investigations of high-level visual perception. Potential applications of RISE include the quantitative measurement of perceptual hysteresis and priming, the study of the neural substrates of object perception, and the assessment and detection of subtle…
A Model for Random Student Drug Testing
ERIC Educational Resources Information Center
Nelson, Judith A.; Rose, Nancy L.; Lutz, Danielle
2011-01-01
The purpose of this case study was to examine random student drug testing in one school district relevant to: (a) the perceptions of students participating in competitive extracurricular activities regarding drug use and abuse; (b) the attitudes and perceptions of parents, school staff, and community members regarding student drug involvement; (c)…
Recruiting Participants for Randomized Controlled Trials
ERIC Educational Resources Information Center
Gallagher, H. Alix; Roschelle, Jeremy; Feng, Mingyu
2014-01-01
The objective of this study was to look across strategies used in a wide range of studies to build a framework for researchers to use in conceptualizing the recruitment process. This paper harvests lessons learned across 19 randomized controlled trials in K-12 school settings conducted by a leading research organization to identify strategies that…
Random Convex Hulls and Extreme Value Statistics
NASA Astrophysics Data System (ADS)
Majumdar, Satya N.; Comtet, Alain; Randon-Furling, Julien
2009-12-01
In this paper we study the statistical properties of convex hulls of N random points in a plane chosen according to a given distribution. The points may be chosen independently or they may be correlated. After a non-exhaustive survey of the somewhat sporadic literature and diverse methods used in the random convex hull problem, we present a unifying approach, based on the notion of support function of a closed curve and the associated Cauchy’s formulae, that allows us to compute exactly the mean perimeter and the mean area enclosed by the convex polygon both in case of independent as well as correlated points. Our method demonstrates a beautiful link between the random convex hull problem and the subject of extreme value statistics. As an example of correlated points, we study here in detail the case when the points represent the vertices of n independent random walks. In the continuum time limit this reduces to n independent planar Brownian trajectories for which we compute exactly, for all n, the mean perimeter and the mean area of their global convex hull. Our results have relevant applications in ecology in estimating the home range of a herd of animals. Some of these results were announced recently in a short communication [Phys. Rev. Lett. 103:140602, 2009].
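Monte Carlo estimates of the hull observables discussed above are easy to generate. A self-contained sketch using Andrew's monotone chain algorithm for the convex hull of N independent uniform points in the unit square (an illustrative setup; the paper's exact results concern general distributions and planar Brownian paths):

```python
import math
import random

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in CCW order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def perimeter(hull):
    """Total edge length of the closed hull polygon."""
    return sum(math.dist(hull[i], hull[(i + 1) % len(hull)])
               for i in range(len(hull)))

rng = random.Random(0)
pts = [(rng.random(), rng.random()) for _ in range(500)]
hull = convex_hull(pts)
```

Averaging `perimeter(hull)` over many samples gives the Monte Carlo counterpart of the mean-perimeter formulas derived in the paper via Cauchy's support-function approach.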
Species selection and random drift in macroevolution.
Chevin, Luis-Miguel
2016-03-01
Species selection resulting from trait-dependent speciation and extinction is increasingly recognized as an important mechanism of phenotypic macroevolution. However, the recent bloom in statistical methods quantifying this process faces a scarcity of dynamical theory for their interpretation, notably regarding the relative contributions of deterministic versus stochastic evolutionary forces. I use simple diffusion approximations of birth-death processes to investigate how the expected and random components of macroevolutionary change depend on phenotype-dependent speciation and extinction rates, as can be estimated empirically. I show that the species selection coefficient for a binary trait, and selection differential for a quantitative trait, depend not only on differences in net diversification rates (speciation minus extinction), but also on differences in species turnover rates (speciation plus extinction), especially in small clades. The randomness in speciation and extinction events also produces a species-level equivalent to random genetic drift, which is stronger for higher turnover rates. I then show how microevolutionary processes including mutation, organismic selection, and random genetic drift cause state transitions at the species level, allowing comparison of evolutionary forces across levels. A key parameter that would be needed to apply this theory is the distribution and rate of origination of new optimum phenotypes along a phylogeny. PMID:26880617
A Sweet Tasting Demonstration of Random Occurrences.
ERIC Educational Resources Information Center
Christopher, Andrew N.; Marek, Pam
2002-01-01
Discusses a game in which students must guess the flavor of LifeSaver candy without the aid of sight and smell. Explains that this demonstration assists students to understand the phenomenon of random occurrences. Describes how the presentation is conducted as well as the outcomes of the demonstration. (CMK)
Effect of randomness in logistic maps
NASA Astrophysics Data System (ADS)
Khaleque, Abdul; Sen, Parongama
2015-01-01
We study a random logistic map x_{t+1} = a_t x_t [1 - x_t], where the a_t are bounded (q_1 ≤ a_t ≤ q_2) random variables independently drawn from a distribution. x_t does not show any regular behavior in time. We find that x_t shows fully ergodic behavior when the maximum allowed value of a_t is 4. However
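The random map above is straightforward to iterate numerically. A minimal sketch, assuming the a_t are drawn uniformly from [q1, q2] (the abstract only says they are drawn from some distribution); iterates stay in [0, 1] as long as q2 ≤ 4:

```python
import random

def random_logistic_trajectory(x0, q1, q2, steps, seed=0):
    """Iterate x_{t+1} = a_t * x_t * (1 - x_t) with a_t ~ Uniform[q1, q2]."""
    rng = random.Random(seed)
    x = x0
    traj = [x]
    for _ in range(steps):
        a = rng.uniform(q1, q2)
        x = a * x * (1.0 - x)
        traj.append(x)
    return traj

# Since a*x*(1-x) <= a/4, the unit interval is invariant for a_t <= 4.
traj = random_logistic_trajectory(0.3, 3.0, 4.0, 1000)
```

Plotting a histogram of `traj` for different (q1, q2) windows is the natural way to explore the ergodic behavior the abstract describes.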
Index statistical properties of sparse random graphs
NASA Astrophysics Data System (ADS)
Metz, F. L.; Stariolo, Daniel A.
2015-10-01
Using the replica method, we develop an analytical approach to compute the characteristic function for the probability P_N(K, λ) that a large N × N adjacency matrix of a sparse random graph has K eigenvalues below a threshold λ. The method allows one to determine, in principle, all moments of P_N(K, λ), from which the typical sample-to-sample fluctuations can be fully characterized. For random graph models with localized eigenvectors, we show that the index variance scales linearly with N ≫ 1 for |λ| > 0, with a model-dependent prefactor that can be exactly calculated. Explicit results are discussed for Erdős-Rényi and regular random graphs, both exhibiting a prefactor with a nonmonotonic behavior as a function of λ. These results contrast with rotationally invariant random matrices, where the index variance scales only as ln N, with a universal prefactor that is independent of λ. Numerical diagonalization results confirm the exactness of our approach and, in addition, strongly support the Gaussian nature of the index fluctuations.
A Random Walk on a Circular Path
ERIC Educational Resources Information Center
Ching, W.-K.; Lee, M. S.
2005-01-01
This short note introduces an interesting random walk on a circular path with cards of numbers. By using high school probability theory, it is proved that under some assumptions on the number of cards, the probability that a walker will return to a fixed position will tend to one as the length of the circular path tends to infinity.
ERIC Educational Resources Information Center
Huang, Hung-Yu; Wang, Wen-Chung
2014-01-01
The DINA (deterministic input, noisy, and gate) model has been widely used in cognitive diagnosis tests and in the process of test development. The outcomes known as slip and guess are included in the DINA model function representing the responses to the items. This study aimed to extend the DINA model by using the random-effect approach to allow…
Globally, unrelated protein sequences appear random
Lavelle, Daniel T.; Pearson, William R.
2010-01-01
Motivation: To test whether protein folding constraints and secondary structure sequence preferences significantly reduce the space of amino acid words in proteins, we compared the frequencies of four- and five-amino acid word clumps (independent words) in proteins to the frequencies predicted by four random sequence models. Results: While the human proteome has many overrepresented word clumps, these words come from large protein families with biased compositions (e.g. Zn-fingers). In contrast, in a non-redundant sample of Pfam-AB, only 1% of four-amino acid word clumps (4.7% of 5mer words) are 2-fold overrepresented compared with our simplest random model [MC(0)], and 0.1% (4mers) to 0.5% (5mers) are 2-fold overrepresented compared with a window-shuffled random model. Using a false discovery rate q-value analysis, the number of exceptional four- or five-letter words in real proteins is similar to the number found when comparing words from one random model to another. Consensus overrepresented words are not enriched in conserved regions of proteins, but four-letter words are enriched 1.18- to 1.56-fold in α-helical secondary structures (but not β-strands). Five-residue consensus exceptional words are enriched for α-helix 1.43- to 1.61-fold. Protein word preferences in regular secondary structure do not appear to significantly restrict the use of sequence words in unrelated proteins, although the consensus exceptional words have a secondary structure bias for α-helix. Globally, words in protein sequences appear to be under very few constraints; for the most part, they appear to be random. Contact: wrp@virginia.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19948773
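The word-clump comparison underlying this analysis can be sketched simply: count k-mer occurrences in a sequence and compare them against a composition-preserving shuffle, used here as a stand-in for the paper's simplest random model MC(0) (function names and the default 2-fold threshold are illustrative choices):

```python
import random
from collections import Counter

def kmer_counts(seq, k):
    """Counts of all overlapping k-letter words in seq."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def overrepresented(seq, k, fold=2.0, seed=0):
    """k-mers at least `fold` times more frequent in seq than in a
    composition-preserving shuffle of the same letters."""
    rng = random.Random(seed)
    shuffled = list(seq)
    rng.shuffle(shuffled)
    obs = kmer_counts(seq, k)
    exp = kmer_counts("".join(shuffled), k)
    return {w: c for w, c in obs.items()
            if c >= fold * max(exp.get(w, 0), 1)}

# A highly repetitive toy sequence: its repeat unit is flagged, while a
# shuffle with the same composition contains it only rarely.
words = overrepresented("ABCD" * 50, 4)
```

A single shuffle is a crude null model; the paper draws on several random models and a false-discovery-rate analysis, which this sketch does not reproduce.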
Randomization and the Gross-Pitaevskii Hierarchy
NASA Astrophysics Data System (ADS)
Sohinger, Vedran; Staffilani, Gigliola
2015-10-01
We study the Gross-Pitaevskii hierarchy on the spatial domain . By using an appropriate randomization of the Fourier coefficients in the collision operator, we prove an averaged form of the main estimate which is used in order to contract the Duhamel terms that occur in the study of the hierarchy. In the averaged estimate, we do not need to integrate in the time variable. An averaged spacetime estimate for this range of regularity exponents then follows as a direct corollary. The range of regularity exponents that we obtain is . It was shown in our previous joint work with Gressman (J Funct Anal 266(7):4705-4764, 2014) that the range is sharp in the corresponding deterministic spacetime estimate. This is in contrast to the non-periodic setting, which was studied by Klainerman and Machedon (Commun Math Phys 279(1):169-185, 2008), where the spacetime estimate is known to hold whenever . The goal of our paper is to extend the range of α in this class of estimates in a probabilistic sense. We use the new estimate and the ideas from its proof in order to study randomized forms of the Gross-Pitaevskii hierarchy. More precisely, we consider hierarchies similar to the Gross-Pitaevskii hierarchy, but in which the collision operator has been randomized. For these hierarchies, we show convergence to zero in low regularity Sobolev spaces of Duhamel expansions of fixed deterministic density matrices. We believe that the study of the randomized collision operators could be the first step in the understanding of a nonlinear form of randomization.
Molecular motors: thermodynamics and the random walk.
Thomas, N.; Imafuku, Y.; Tawada, K.
2001-01-01
The biochemical cycle of a molecular motor provides the essential link between its thermodynamics and kinetics. The thermodynamics of the cycle determine the motor's ability to perform mechanical work, whilst the kinetics of the cycle govern its stochastic behaviour. We concentrate here on tightly coupled, processive molecular motors, such as kinesin and myosin V, which hydrolyse one molecule of ATP per forward step. Thermodynamics require that, when such a motor pulls against a constant load f, the ratio of the forward and backward products of the rate constants for its cycle is exp[-(ΔG + u_0 f)/kT], where -ΔG is the free energy available from ATP hydrolysis and u_0 is the motor's step size. A hypothetical one-state motor can therefore act as a chemically driven ratchet executing a biased random walk. Treating this random walk as a diffusion problem, we calculate the forward velocity v and the diffusion coefficient D, and we find that its randomness parameter r is determined solely by thermodynamics. However, real molecular motors pass through several states at each attachment site. They satisfy a modified diffusion equation that follows directly from the rate equations for the biochemical cycle, and their effective diffusion coefficient is reduced to D - v²τ, where τ is the time constant for the motor to reach the steady state. Hence, the randomness of multistate motors is reduced compared with the one-state case and can be used for determining τ. Our analysis therefore demonstrates the intimate relationship between the biochemical cycle, the force-velocity relation and the random motion of molecular motors. PMID:11600075
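For the one-state motor discussed above, the standard continuous-time biased random walk gives v = u_0(k_+ - k_-), D = u_0²(k_+ + k_-)/2, and hence a randomness parameter r = 2D/(u_0 v) = (k_+ + k_-)/(k_+ - k_-), fixed by the ratio of the rates alone. A sketch that computes these textbook quantities and cross-checks the drift with a Gillespie simulation (rate values are arbitrary illustrative numbers, not fitted motor data):

```python
import math
import random

def one_state_motor_stats(k_fwd, k_bwd, u0):
    """Drift velocity, diffusion coefficient, and randomness parameter of a
    one-state biased continuous-time random walk with step size u0."""
    v = u0 * (k_fwd - k_bwd)
    D = u0 ** 2 * (k_fwd + k_bwd) / 2.0
    r = 2.0 * D / (u0 * v)          # randomness parameter
    return v, D, r

def simulate(k_fwd, k_bwd, u0, t_max, seed=1):
    """Gillespie simulation of the walk: exponential waiting times, then a
    forward or backward step chosen by the rate ratio."""
    rng = random.Random(seed)
    x, t = 0.0, 0.0
    while t < t_max:
        t += rng.expovariate(k_fwd + k_bwd)
        x += u0 if rng.random() < k_fwd / (k_fwd + k_bwd) else -u0
    return x

v, D, r = one_state_motor_stats(100.0, 5.0, 1.0)
x_end = simulate(100.0, 5.0, 1.0, 200.0)   # should be close to v * t_max
```

Note that r depends only on k_-/k_+, which the abstract's thermodynamic relation fixes via exp[-(ΔG + u_0 f)/kT], matching the claim that the one-state randomness is set solely by thermodynamics.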
Random Vibrations: Assessment of the State of the Art
Paez, T.L.
1999-02-23
Random vibration is the phenomenon wherein random excitation applied to a mechanical system induces random response. We summarize the state of the art in random vibration analysis and testing, commenting on history, linear and nonlinear analysis, the analysis of large-scale systems, and probabilistic structural testing.
76 FR 79204 - Random Drug Testing Rate for Covered Crewmembers
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-21
... SECURITY Coast Guard Random Drug Testing Rate for Covered Crewmembers AGENCY: Coast Guard, DHS. ACTION: Notice of minimum random drug testing rate. SUMMARY: The Coast Guard has set the calendar year 2012 minimum random drug testing rate at 50 percent of covered crewmembers. DATES: The minimum random...
76 FR 1448 - Random Drug Testing Rate for Covered Crewmembers
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-10
... SECURITY Coast Guard Random Drug Testing Rate for Covered Crewmembers AGENCY: Coast Guard, DHS. ACTION: Notice of minimum random drug testing rate. SUMMARY: The Coast Guard has set the calendar year 2011 minimum random drug testing rate at 50 percent of covered crewmembers. DATES: The minimum random...
On the pertinence to Physics of random walks induced by random dynamical systems: a survey
NASA Astrophysics Data System (ADS)
Petritis, Dimitri
2016-08-01
Let X be an abstract space and A a denumerable (finite or infinite) alphabet. Suppose that (p_a)_{a∈A} is a family of functions p_a: X → [0,1] such that for all x ∈ X we have Σ_{a∈A} p_a(x) = 1, and (S_a)_{a∈A} a family of transformations S_a: X → X. The pair ((S_a)_a, (p_a)_a) is termed an iterated function system with place-dependent probabilities. Such systems can be thought of as generalisations of random dynamical systems. As a matter of fact, suppose we start from a given x ∈ X; we then pick randomly, with probability p_a(x), the transformation S_a and evolve to S_a(x). We are interested in the behaviour of the system when the iteration continues indefinitely. Random walks of the above type are omnipresent in both classical and quantum Physics. To give a small sample of occurrences we mention: random walks on the affine group, random walks on Penrose lattices, random walks on partially directed lattices, evolution of density matrices induced by repeated quantum measurements, quantum channels, quantum random walks, etc. In this article, we review some basic properties of such systems and provide a pathfinder through the extensive bibliography (on both the mathematical and physical sides) where the main results were originally published.
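A minimal simulation of such an iterated function system with place-dependent probabilities, taking X = [0, 1] and the two-letter alphabet A = {0, 1}. The specific maps and probability functions below are hypothetical examples chosen for the sketch, not systems from the survey:

```python
import random

# Two contractions on X = [0, 1] and place-dependent probabilities
# p_0(x) + p_1(x) = 1 for every x (illustrative choices).
S = {0: lambda x: x / 2.0,           # contract toward 0
     1: lambda x: (x + 1.0) / 2.0}   # contract toward 1
p = {0: lambda x: (1.0 + x) / 2.0,
     1: lambda x: (1.0 - x) / 2.0}

def iterate(x0, n, rng):
    """From x0, repeatedly pick letter a with probability p_a(x)
    and evolve x -> S_a(x); return the state after n steps."""
    x = x0
    for _ in range(n):
        a = 0 if rng.random() < p[0](x) else 1
        x = S[a](x)
    return x

rng = random.Random(0)
xs = [iterate(0.5, 200, rng) for _ in range(1000)]
assert all(0.0 <= x <= 1.0 for x in xs)  # the walk stays in X
```

Iterating indefinitely and asking what the empirical distribution of x converges to is exactly the kind of question the surveyed literature addresses.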
NASA Astrophysics Data System (ADS)
Moyer, Steve; Uhl, Elizabeth R.
2015-05-01
For more than 50 years, the U.S. Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD) has been studying and modeling the human visual discrimination process as it pertains to military imaging systems. In order to develop sensor performance models, human observers are trained to expert levels in the identification of military vehicles. From 1998 until 2006, the experimental stimuli were block randomized, meaning that stimuli with similar difficulty levels (for example, in terms of distance from target, blur, noise, etc.) were presented together in blocks of approximately 24 images but the order of images within the block was random. Starting in 2006, complete randomization came into vogue, meaning that difficulty could change image to image. It was thought that this would provide a more statistically robust result. In this study we investigated the impact of the two types of randomization on performance in two groups of observers matched for skill to create equivalent groups. It is hypothesized that Soldiers in the Completely Randomized condition will have to shift their decision criterion more frequently than Soldiers in the Block Randomized condition, and that this shifting will impede performance, so that Soldiers in the Block Randomized group perform better.
A simplified method for random vibration analysis of structures with random parameters
NASA Astrophysics Data System (ADS)
Ghienne, Martin; Blanzé, Claude
2016-09-01
Piezoelectric patches with adapted electrical circuits or viscoelastic dissipative materials are two solutions particularly adapted to reduce vibration of light structures. To accurately design these solutions, it is necessary to describe precisely the dynamical behaviour of the structure. It may quickly become computationally intensive to describe this behaviour robustly for a structure with nonlinear phenomena, such as contact or friction for bolted structures, and uncertain variations of its parameters. The aim of this work is to propose a non-intrusive reduced stochastic method to characterize robustly the vibrational response of a structure with random parameters. Our goal is to characterize the eigenspace of linear systems whose dynamic properties are considered as random variables. This method is based on a separation of random aspects from deterministic aspects and allows us to estimate the first central moments of each random eigenfrequency with a single deterministic finite element computation. The method is applied to a frame with several Young's moduli modeled as random variables. This example could be expanded to a bolted structure including piezoelectric devices. The method needs to be enhanced when random eigenvalues are closely spaced. An indicator with no additional computational cost is proposed to characterize the “proximity” of two random eigenvalues.
Can Observed Randomness Be Certified to Be Fully Intrinsic?
NASA Astrophysics Data System (ADS)
Dhara, Chirag; de la Torre, Gonzalo; Acín, Antonio
2014-03-01
In general, any observed random process includes two qualitatively different forms of randomness: apparent randomness, which results from ignorance of, or lack of control over, degrees of freedom in the system, and intrinsic randomness, which is not ascribable to any such cause. While classical systems only possess the first kind of randomness, quantum systems may exhibit some intrinsic randomness. In this Letter, we provide quantum processes in which all the observed randomness is fully intrinsic. These results are derived under minimal assumptions: the validity of the no-signaling principle and an arbitrary (but not absolute) lack of freedom of choice. Our results prove that quantum predictions cannot be completed even in simple finite scenarios, for instance with three parties performing two dichotomic measurements. Moreover, the observed randomness tends to a perfect random bit when increasing the number of parties, thus defining an explicit process attaining full randomness amplification.
Existence of the Harmonic Measure for Random Walks on Graphs and in Random Environments
NASA Astrophysics Data System (ADS)
Boivin, Daniel; Rau, Clément
2013-01-01
We give a sufficient condition for the existence of the harmonic measure from infinity of transient random walks on weighted graphs. In particular, this condition is verified by the random conductance model on ℤ^d, d ≥ 3, when the conductances are i.i.d. and the bonds with positive conductance percolate. The harmonic measure from infinity also exists for random walks on supercritical clusters of ℤ^2. This is proved using results of Barlow (Ann. Probab. 32:3024-3084, 2004) and Barlow and Hambly (Electron. J. Probab. 14(1):1-27, 2009).
Mixing thermodynamics of block-random copolymers
NASA Astrophysics Data System (ADS)
Beckingham, Bryan Scott
Random copolymerization of A and B monomers represents a versatile method to tune interaction strengths between polymers, as ArB random copolymers will exhibit a smaller effective Flory interaction parameter χ (or interaction energy density X) upon mixing with A or B homopolymers than upon mixing A and B homopolymers with each other, and the ArB composition can be tuned continuously. Thus, the incorporation of a random copolymer block into the classical block copolymer architecture to yield "block-random" copolymers introduces an additional tuning mechanism for the control of structure-property relationships, as the interblock interactions and physical properties can be tuned continuously through the random block's composition. However, typical living or controlled polymerizations produce compositional gradients along the "random" block, which can in turn influence the phase behavior. This dissertation demonstrates a method by which narrow-distribution copolymers of styrene and isoprene of any desired composition, with no measurable down-chain gradient, are synthesized. This synthetic method is then utilized to incorporate random copolymers of styrene and isoprene as blocks into block-random copolymers in order to examine the resulting interblock mixing thermodynamics. A series of well-defined near-symmetric block and block-random copolymers (S-I, Bd-S, I-SrI, S-SrI and Bd-SrI diblocks, where S is polystyrene, I is polyisoprene and Bd is polybutadiene), with varying molecular weight and random-block composition are synthesized, and the mixing thermodynamics of their hydrogenated derivatives are examined, via comparison of their interaction energy densities X, through measurement of the order-disorder transition (ODT) temperature. Hydrogenated derivatives of I-SrI and S-SrI block-random copolymers, both wherein the styrene aromaticity is retained and derivatives wherein the styrene units are saturated to vinylcyclohexane (VCH), are found to hew closely to the
NASA Astrophysics Data System (ADS)
Miszczak, Jarosław Adam
2013-01-01
The presented package for the Mathematica computing system allows the harnessing of quantum random number generators (QRNG) for investigating the statistical properties of quantum states. The described package implements a number of functions for generating random states. The new version of the package adds the ability to use the on-line quantum random number generator service and implements new functions for retrieving lists of random numbers. Thanks to the introduced improvements, the new version provides faster access to high-quality sources of random numbers and can be used in simulations requiring large amount of random data. New version program summaryProgram title: TRQS Catalogue identifier: AEKA_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEKA_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 18 134 No. of bytes in distributed program, including test data, etc.: 2 520 49 Distribution format: tar.gz Programming language: Mathematica, C. Computer: Any supporting Mathematica in version 7 or higher. Operating system: Any platform supporting Mathematica; tested with GNU/Linux (32 and 64 bit). RAM: Case-dependent Supplementary material: Fig. 1 mentioned below can be downloaded. Classification: 4.15. External routines: Quantis software library (http://www.idquantique.com/support/quantis-trng.html) Catalogue identifier of previous version: AEKA_v1_0 Journal reference of previous version: Comput. Phys. Comm. 183(2012)118 Does the new version supersede the previous version?: Yes Nature of problem: Generation of random density matrices and utilization of high-quality random numbers for the purpose of computer simulation. Solution method: Use of a physical quantum random number generator and an on-line service providing access to the source of true random
Generalized Random Sequential Adsorption on Erdős-Rényi Random Graphs
NASA Astrophysics Data System (ADS)
Dhara, Souvik; van Leeuwaarden, Johan S. H.; Mukherjee, Debankur
2016-09-01
We investigate random sequential adsorption (RSA) on a random graph via the following greedy algorithm: Order the n vertices at random, and sequentially declare each vertex either active or frozen, depending on some local rule in terms of the state of the neighboring vertices. The classical RSA rule declares a vertex active if none of its neighbors is, in which case the set of active nodes forms an independent set of the graph. We generalize this nearest-neighbor blocking rule in three ways and apply it to the Erdős-Rényi random graph. We consider these generalizations in the large-graph limit n→ ∞ and characterize the jamming constant, the limiting proportion of active vertices in the maximal greedy set.
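The greedy algorithm described above translates directly into a short Monte Carlo estimate of the jamming constant. This is a minimal sketch under the classical nearest-neighbor blocking rule, not the paper's generalizations; graph size and mean degree are arbitrary choices:

```python
import random

def rsa_independent_set(n, p, rng):
    """Classical RSA rule on an Erdős-Rényi graph G(n, p): scan the
    vertices in random order and activate a vertex iff none of its
    neighbors is already active (greedy maximal independent set)."""
    adj = [set() for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].add(j)
                adj[j].add(i)
    order = list(range(n))
    rng.shuffle(order)
    active = set()
    for v in order:
        if not (adj[v] & active):   # no neighbor is active yet
            active.add(v)
    return active, adj

rng = random.Random(42)
n = 500
active, adj = rsa_independent_set(n, 8.0 / n, rng)   # mean degree ~8
assert all(not (adj[v] & active) for v in active)    # independent set
jamming_fraction = len(active) / n   # finite-n estimate of the jamming constant
assert 0.0 < jamming_fraction < 1.0
```

Averaging `jamming_fraction` over many samples and growing n approximates the large-graph limit the paper characterizes analytically.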
Christensen, Britt; Ludvigsen, Maja; Nellemann, Birgitte; Kopchick, John J.; Honoré, Bent; Jørgensen, Jens Otto L.
2015-01-01
Introduction Despite implementation of the biological passport to detect erythropoietin abuse, a need for additional biomarkers remains. We used a proteomic approach to identify novel serum biomarkers of prolonged erythropoiesis-stimulating agent (ESA) exposure (Darbepoietin-α) and/or aerobic training. Trial Design Thirty-six healthy young males were randomly assigned to the following groups: Sedentary-placebo (n = 9), Sedentary-ESA (n = 9), Training-placebo (n = 10), or Training-ESA (n = 8). They were treated with placebo/Darbepoietin-α subcutaneously once/week for 10 weeks followed by a 3-week washout period. Training consisted of supervised biking 3/week for 13 weeks at the highest possible intensity. Serum was collected at baseline, week 3 (high dose Darbepoietin-α), week 10 (reduced dose Darbepoietin-α), and after a 3-week washout period. Methods Serum proteins were separated according to charge and molecular mass (2D-gel electrophoresis). The identity of proteins from spots exhibiting altered intensity was determined by mass spectrometry. Results Six protein spots changed in response to Darbepoietin-α treatment. Comparing all 4 experimental groups, two protein spots (serotransferrin and haptoglobin/haptoglobin related protein) showed a significant response to Darbepoietin-α treatment. The haptoglobin/haptoglobin related protein spot showed a significantly lower intensity in all subjects in the training-ESA group during the treatment period and increased during the washout period. Conclusion An isoform of haptoglobin/haptoglobin related protein could be a new anti-doping marker and merits further research. Trial Registration ClinicalTrials.gov NCT01320449 PMID:25679398
Williams, Alishia D.; O’Moore, Kathleen; Blackwell, Simon E.; Smith, Jessica; Holmes, Emily A.; Andrews, Gavin
2015-01-01
Background Accruing evidence suggests that positive imagery-based cognitive bias modification (CBM) could have potential as a standalone targeted intervention for depressive symptoms or as an adjunct to existing treatments. We sought to establish the benefit of this form of CBM when delivered prior to Internet cognitive behavioral therapy (iCBT) for depression Methods A randomized controlled trial (RCT) of a 1-week Internet-delivered positive CBM vs. an active control condition for participants (N=75, 69% female, mean age=42) meeting diagnostic criteria for major depression; followed by a 10-week iCBT program for both groups. Results Modified intent-to-treat marginal and mixed effect models demonstrated no significant difference between conditions following the CBM intervention or the iCBT program. In both conditions there were significant reductions (Cohen's d .57–1.58, 95% CI=.12–2.07) in primary measures of depression and interpretation bias (PHQ9, BDI-II, AST-D). Large effect size reductions (Cohen's d .81–1.32, 95% CI=.31–1.79) were observed for secondary measures of distress, disability, anxiety and repetitive negative thinking (K10, WHODAS, STAI, RTQ). Per protocol analyses conducted in the sample of participants who completed all seven sessions of CBM indicated between-group superiority of the positive over control group on depression symptoms (PHQ9, BDI-II) and psychological distress (K10) following CBM (Hedges g .55–.88, 95% CI=−.03–1.46) and following iCBT (PHQ9, K10). The majority (>70%) no longer met diagnostic criteria for depression at 3-month follow-up. Limitations The control condition contained many active components and therefore may have represented a smaller ‘dose’ of the positive condition. Conclusions Results provide preliminary support for the successful integration of imagery-based CBM into an existing Internet-based treatment for depression. PMID:25805405
Delucchi, Kevin L.; Prochaska, Judith J.
2015-01-01
Introduction: In an ethnically-diverse, uninsured psychiatric sample with co-occurring drug/alcohol addiction, we evaluated the feasibility and reproducibility of a tobacco treatment intervention. The intervention previously demonstrated efficacy in insured psychiatric and nonpsychiatric samples with 20.0%–25.0% abstinence at 18 months. Methods: Daily smokers, recruited in 2009–2010 from psychiatric units at an urban public hospital, were randomized to usual care (on-unit nicotine replacement plus quit advice) or intervention, which added a Transtheoretical-model tailored, computer-assisted intervention, stage-matched manual, brief counseling, and 10-week post-hospitalization nicotine replacement. Results: The sample (N = 100, 69% recruitment rate, age M = 40) was 56% racial/ethnic minority, 65% male, 79% unemployed, and 48% unstably housed, diagnosed with unipolar (54%) and bipolar (14%) depression and psychotic disorders (46%); 77% reported past-month illicit drug use. Prior to hospitalization, participants averaged 19 (SD = 11) cigarettes/day for 23 (SD = 13) years; 80% smoked within 30 minutes of awakening; 25% were preparing to quit. Encouraging and comparable to effects in the general population, 7-day point prevalence abstinence for intervention versus control was 12.5% versus 7.3% at 3 months, 17.5% versus 8.5% at 6 months, and 26.2% versus 16.7% at 12 months. Retention exceeded 80% over 12 months. The odds of abstinence increased over time, predicted by higher self-efficacy, greater perceived social status, and diagnosis of psychotic disorder compared to unipolar depression. Conclusions: Findings indicate uninsured smokers with serious mental illness can engage in tobacco treatment research with quit rates comparable to the general population. A larger investigation is warranted. Inclusion of diverse smokers with mental illness in clinical trials is supported and encouraged. PMID:26180227
Casey, S. C.; Patterson, R. L.; Gross, M.; Lickliter, K.; Stein, J. S.
2003-02-25
The U.S. Department of Energy (DOE) is responsible for disposing of transuranic waste in the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. As part of that responsibility, DOE must comply with the U.S. Environmental Protection Agency's (EPA) radiation protection standards in Title 40 Code of Federal Regulations (CFR), Parts 191 and 194. This paper addresses compliance with the criteria of 40 CFR Section 194.24(d) and 194.24(f) that require DOE to either provide a waste loading scheme for the WIPP repository or to assume random emplacement in the mandated performance and compliance assessments. The DOE established a position on waste loading schemes during the process of obtaining the EPA's initial Certification in 1998. The justification for utilizing a random waste emplacement distribution within the WIPP repository was provided to the EPA. During the EPA rulemaking process for the initial certification, the EPA questioned DOE on whether waste would be loaded randomly as modeled in long-term performance assessment (PA) and the impact, if any, of nonrandom loading. In response, DOE conducted an impact assessment for non-random waste loading. The results of this assessment supported the contention that it does not matter whether random or non-random waste loading is assumed for the PA. The EPA determined that a waste loading plan was unnecessary because DOE had assumed random waste loading and evaluated the potential consequences of non-random loading for a very high activity waste stream. In other words, the EPA determined that DOE was not required to provide a waste loading scheme because compliance is not affected by the actual distribution of waste containers in the WIPP.
NASA Astrophysics Data System (ADS)
Mizutani, Tomoko; Saraya, Takuya; Takeuchi, Kiyoshi; Kobayashi, Masaharu; Hiramoto, Toshiro
2016-04-01
Bit failure events induced by random telegraph noise (RTN) for silicon-on-thin-buried-oxide (SOTB) static random access memory (SRAM) cells were characterized by directly monitoring the storage node voltage of individual cells, using a device-matrix-array (DMA) test element group (TEG). Correlating the cell-level RTN and failure waveforms with the RTN waveforms of individual transistors that constitute the same cell, RTN of a specific transistor that causes the cell failure was identified.
Markov speckle for efficient random bit generation.
Horstmeyer, Roarke; Chen, Richard Y; Judkewitz, Benjamin; Yang, Changhuei
2012-11-19
Optical speckle is commonly observed in measurements using coherent radiation. Previous work has often assumed, without experimental validation, that speckle's random spatial pattern follows a Markov process. Here, we present a derivation and experimental confirmation of conditions under which this assumption holds true. We demonstrate that a detected speckle field can be designed to obey the first-order Markov property by using a Cauchy attenuation mask to modulate scattered light. Creating Markov speckle enables the development of more accurate and efficient image post-processing algorithms, with applications including improved de-noising, segmentation and super-resolution. To show its versatility, we use the Cauchy mask to maximize the entropy of a detected speckle field with fixed average speckle size, allowing cryptographic applications to extract a maximum number of useful random bits from speckle images.
Conformational transitions in random heteropolymer models
NASA Astrophysics Data System (ADS)
Blavatska, Viktoria; Janke, Wolfhard
2014-01-01
We study the conformational properties of heteropolymers containing two types of monomers A and B, modeled as self-attracting self-avoiding random walks on a regular lattice. Such a model can describe in particular the sequences of hydrophobic and hydrophilic residues in proteins [K. F. Lau and K. A. Dill, Macromolecules 22, 3986 (1989)] and polyampholytes with oppositely charged groups [Y. Kantor and M. Kardar, Europhys. Lett. 28, 169 (1994)]. Treating the sequences of the two types of monomers as quenched random variables, we provide a systematic analysis of possible generalizations of this model. To this end we apply the pruned-enriched Rosenbluth chain-growth algorithm, which allows us to obtain the phase diagrams of extended and compact states coexistence as function of both the temperature and fraction of A and B monomers along the heteropolymer chain.
Residual Defect Density in Random Disks Deposits
NASA Astrophysics Data System (ADS)
Topic, Nikola; Pöschel, Thorsten; Gallas, Jason A. C.
2015-08-01
We investigate the residual distribution of structural defects in very tall packings of disks deposited randomly in large channels. By performing simulations involving the sedimentation of up to 50 × 10⁹ particles we find all deposits to consistently show a non-zero residual density of defects obeying a characteristic power-law as a function of the channel width. This remarkable finding corrects the widespread belief that the density of defects should vanish algebraically with growing height. A non-zero residual density of defects implies a type of long-range spatial order in the packing, as opposed to only local ordering. In addition, we find deposits of particles to involve considerably less randomness than generally presumed.
Area law for random graph states
NASA Astrophysics Data System (ADS)
Collins, Benoît; Nechita, Ion; Życzkowski, Karol
2013-08-01
Random pure states of multi-partite quantum systems, associated with arbitrary graphs, are investigated. Each vertex of the graph represents a generic interaction between subsystems, described by a random unitary matrix distributed according to the Haar measure, while each edge of the graph represents a bipartite, maximally entangled state. For any splitting of the graph into two parts we consider the corresponding partition of the quantum system and compute the average entropy of entanglement. First, in the special case where the partition does not cross any vertex of the graph, we show that the area law is satisfied exactly. In the general case, we show that the entropy of entanglement obeys an area law on average, this time with a correction term that depends on the topologies of the graph and of the partition. The results obtained are applied to the problem of distribution of quantum entanglement in a quantum network with prescribed topology.
Random Test Run Length and Effectiveness
NASA Technical Reports Server (NTRS)
Andrews, James H.; Groce, Alex; Weston, Melissa; Xu, Ru-Gang
2008-01-01
A poorly understood but important factor in many applications of random testing is the selection of a maximum length for test runs. Given a limited time for testing, it is seldom clear whether executing a small number of long runs or a large number of short runs maximizes utility. It is generally expected that longer runs are more likely to expose failures -- which is certainly true with respect to runs shorter than the shortest failing trace. However, longer runs produce longer failing traces, requiring more effort from humans in debugging or more resources for automated minimization. In testing with feedback, increasing ranges for parameters may also cause the probability of failure to decrease in longer runs. We show that the choice of test length dramatically impacts the effectiveness of random testing, and that the patterns observed in simple models and predicted by analysis are useful in understanding effects observed.
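The trade-off described above can be illustrated with a toy closed-form model. Assume each test step beyond the shortest failing trace fails independently with probability q; these assumptions are a simplification for illustration, not the paper's experimental setup:

```python
def expected_failures(budget, run_length, q, min_trace=0):
    """Toy model of random testing with a fixed step budget split into
    runs of run_length steps. A run can only fail on steps past the
    shortest failing trace (min_trace), each with probability q."""
    runs = budget // run_length
    effective_steps = max(0, run_length - min_trace)
    p_run_fails = 1.0 - (1.0 - q) ** effective_steps
    return runs * p_run_fails

budget, q = 100_000, 0.001

# Memoryless failures: many short runs expose more failures in total.
assert expected_failures(budget, 10, q) > expected_failures(budget, 1000, q)

# But if the shortest failing trace needs 50 steps, runs of 10 never fail,
# matching the observation that longer runs win below that threshold.
assert expected_failures(budget, 10, q, min_trace=50) == 0.0
assert expected_failures(budget, 1000, q, min_trace=50) > 0.0
```

Even this crude model reproduces the paper's qualitative point: run length choices interact with the failure structure, so neither extreme is uniformly best.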
Universal microbial diagnostics using random DNA probes
Aghazadeh, Amirali; Lin, Adam Y.; Sheikh, Mona A.; Chen, Allen L.; Atkins, Lisa M.; Johnson, Coreen L.; Petrosino, Joseph F.; Drezek, Rebekah A.; Baraniuk, Richard G.
2016-01-01
Early identification of pathogens is essential for limiting development of therapy-resistant pathogens and mitigating infectious disease outbreaks. Most bacterial detection schemes use target-specific probes to differentiate pathogen species, creating time and cost inefficiencies in identifying newly discovered organisms. We present a novel universal microbial diagnostics (UMD) platform to screen for microbial organisms in an infectious sample, using a small number of random DNA probes that are agnostic to the target DNA sequences. Our platform leverages the theory of sparse signal recovery (compressive sensing) to identify the composition of a microbial sample that potentially contains novel or mutant species. We validated the UMD platform in vitro using five random probes to recover 11 pathogenic bacteria. We further demonstrated in silico that UMD can be generalized to screen for common human pathogens in different taxonomy levels. UMD’s unorthodox sensing approach opens the door to more efficient and universal molecular diagnostics. PMID:27704040
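The sensing idea behind UMD can be caricatured in a few lines: random probes give each reference genome a signature vector, and a sample is identified by matching its measured responses against those signatures (here a noiseless, 1-sparse case decided by nearest signature). Everything below, including the k-mer "hybridization" score, is a hypothetical stand-in for the platform's real thermodynamic measurement model:

```python
import random

def hybridization(probe, genome, k=5):
    """Toy affinity score: fraction of the probe's k-mers that occur in
    the genome (a stand-in for a real probe-binding model)."""
    genome_kmers = {genome[i:i + k] for i in range(len(genome) - k + 1)}
    windows = len(probe) - k + 1
    hits = sum(probe[i:i + k] in genome_kmers for i in range(windows))
    return hits / windows

rng = random.Random(7)
bases = "ACGT"
genomes = ["".join(rng.choice(bases) for _ in range(400)) for _ in range(6)]
probes = ["".join(rng.choice(bases) for _ in range(30)) for _ in range(5)]

# Reference signatures: response of each genome to every random probe.
signatures = [[hybridization(p, g) for p in probes] for g in genomes]

# A sample containing only genome 3 yields (noiseless) measurements:
sample = signatures[3]

# 1-sparse recovery: choose the signature with the smallest residual.
best = min(range(len(genomes)),
           key=lambda j: sum((signatures[j][i] - sample[i]) ** 2
                             for i in range(len(probes))))
assert best == 3
```

The actual platform handles mixtures and noise via sparse-recovery solvers; the point of the sketch is only that a handful of target-agnostic random probes can separate many references.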
Subnoise detection of a fast random event.
Ataie, V; Esman, D; Kuo, B P-P; Alic, N; Radic, S
2015-12-11
Observation of random, nonrepetitive phenomena is of critical importance in astronomy, spectroscopy, biology, and remote sensing. Heralded by weak signals, hidden in noise, they pose basic detection challenges. In contrast to repetitive waveforms, a single-instance signal cannot be separated from noise through averaging. Here, we show that a fast, randomly occurring event can be detected and extracted from a noisy background without conventional averaging. An isolated 80-picosecond pulse was received with confidence level exceeding 99%, even when accompanied by noise. Our detector relies on instantaneous spectral cloning and a single-step, coherent field processor. The ability to extract fast, subnoise events is expected to increase detection sensitivity in multiple disciplines. Additionally, the new spectral-cloning receiver can potentially intercept communication signals that are presently considered secure. PMID:26659052
Directed Random Markets: Connectivity Determines Money
NASA Astrophysics Data System (ADS)
Martínez-Martínez, Ismael; López-Ruiz, Ricardo
2013-12-01
The Boltzmann-Gibbs (BG) distribution arises as the statistical equilibrium probability distribution of money among the agents of a closed economic system where random and undirected exchanges are allowed. When considering a model with uniform savings in the exchanges, the final distribution is close to the gamma family. In this paper, we implement these exchange rules on networks and we find that these stationary probability distributions are robust and are not affected by the topology of the underlying network. We introduce a new family of interactions: random but directed ones. In this case, the topology is found to be determinant, and the mean money per economic agent is related to the degree of the node representing the agent in the network. The relation between the mean money per economic agent and its degree is shown to be linear.
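The undirected, no-savings exchange rule that yields the BG distribution can be sketched as a mean-field simulation; agent count, step count, and the uniform split rule below are illustrative choices:

```python
import random

def random_undirected_exchange(n_agents, n_steps, rng):
    """Closed economy: repeatedly pick two agents at random and
    redistribute their pooled money by a uniform split (no savings).
    In the mean-field case the stationary distribution of money is
    Boltzmann-Gibbs, i.e. exponential with the mean as 'temperature'."""
    money = [1.0] * n_agents            # everyone starts with one unit
    for _ in range(n_steps):
        i, j = rng.sample(range(n_agents), 2)
        total = money[i] + money[j]
        eps = rng.random()              # uniform split of the pool
        money[i], money[j] = eps * total, (1.0 - eps) * total
    return money

rng = random.Random(1)
money = random_undirected_exchange(1000, 200_000, rng)
assert abs(sum(money) - 1000.0) < 1e-6   # total money is conserved
assert min(money) >= 0.0
# Exponential shape: most agents end up below the mean of 1 unit.
assert sum(m < 1.0 for m in money) > 500
```

Running the same kernel on a network, with exchanges restricted to graph edges (and optionally directed), is the setting whose degree dependence the paper analyzes.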
Random graphs containing arbitrary distributions of subgraphs
NASA Astrophysics Data System (ADS)
Karrer, Brian; Newman, M. E. J.
2010-12-01
Traditional random graph models of networks generate networks that are locally treelike, meaning that all local neighborhoods take the form of trees. In this respect such models are highly unrealistic, most real networks having strongly nontreelike neighborhoods that contain short loops, cliques, or other biconnected subgraphs. In this paper we propose and analyze a class of random graph models that incorporates general subgraphs, allowing for nontreelike neighborhoods while still remaining solvable for many fundamental network properties. Among other things we give solutions for the size of the giant component, the position of the phase transition at which the giant component appears, and percolation properties for both site and bond percolation on networks generated by the model.
Disorder by Random Crosslinking in Smectic Elastomers
NASA Astrophysics Data System (ADS)
Lambreva, Denitza M.; Ostrovskii, Boris I.; Finkelmann, Heino; de Jeu, Wim H.
2004-10-01
We present a high-resolution x-ray study of the effects of disorder due to random crosslinking on the one-dimensional translational ordering in smectic elastomers. At a small crosslink density of about 5%, the elastomer network stabilizes the smectic structure against layer-displacement fluctuations, and the algebraically decaying layer ordering extends up to several micrometers. With increasing concentration of crosslinks, the finite size of these domains is strongly reduced, indicating that disordering takes over. Finally, at a crosslink concentration of 20%, the structure factor can be described by a Lorentzian, which signals extended short-range correlations. The findings are discussed in terms of recent theories of randomly quenched disorder.
Informed Consent and Cluster-Randomized Trials
Dawson, Angus
2012-01-01
We argue that cluster-randomized trials are an important methodology, essential to the evaluation of many public health interventions. However, in the case of at least some cluster-randomized trials, it is not possible, or is incompatible with the aims of the study, to obtain individual informed consent. This should not necessarily be seen as an impediment to ethical approval, providing that sufficient justification is given for this omission. We further argue that it should be the institutional review board’s task to evaluate whether the protocol is sufficiently justified to proceed without consent and that this is preferable to any reliance on community consent or other means of proxy consent. PMID:22390511
Non-volatile magnetic random access memory
NASA Technical Reports Server (NTRS)
Katti, Romney R. (Inventor); Stadler, Henry L. (Inventor); Wu, Jiin-Chuan (Inventor)
1994-01-01
Improvements are made in a non-volatile magnetic random access memory. Such a memory is comprised of an array of unit cells, each having a Hall-effect sensor and a thin-film magnetic element made of material having an in-plane, uniaxial anisotropy and in-plane, bipolar remanent magnetization states. The Hall-effect sensor is made more sensitive by using a 1 μm thick molecular beam epitaxy grown InAs layer on a silicon substrate by employing a GaAs/AlGaAs/InAlAs superlattice buffering layer. One improvement avoids current shunting problems of matrix architecture. Another improvement reduces the required magnetizing current for the micromagnets. Another improvement relates to the use of GaAs technology wherein high electron-mobility GaAs MESFETs provide faster switching times. Still another improvement relates to a method for configuring the invention as a three-dimensional random access memory.
Brownian motion on random dynamical landscapes
NASA Astrophysics Data System (ADS)
Suñé Simon, Marc; Sancho, José María; Lindenberg, Katja
2016-03-01
We present a study of overdamped Brownian particles moving on a random landscape of dynamic and deformable obstacles (spatio-temporal disorder). The obstacles move randomly, assemble, and dissociate following their own dynamics. This landscape may account for a soft matter or liquid environment in which large obstacles, such as macromolecules and organelles in the cytoplasm of a living cell, or colloids or polymers in a liquid, move slowly leading to crowding effects. This representation also constitutes a novel approach to the macroscopic dynamics exhibited by active matter media. We present numerical results on the transport and diffusion properties of Brownian particles under this disorder biased by a constant external force. The landscape dynamics are characterized by a Gaussian spatio-temporal correlation, with fixed time and spatial scales, and controlled obstacle concentrations.
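The baseline dynamics described above can be made concrete with a minimal sketch of an overdamped Brownian particle biased by a constant external force; the dynamic obstacle landscape of the paper is omitted, and all parameter values (force, friction, temperature, time step) are illustrative assumptions, not values from the study.

```python
import numpy as np

def overdamped_langevin(n_steps, dt, force, gamma, temperature, rng):
    """Euler-Maruyama integration of dx/dt = F/gamma + sqrt(2D) xi(t),
    with D = kT/gamma (Einstein relation, k_B = 1). The deformable
    obstacle field is omitted; only the constant bias F remains."""
    D = temperature / gamma
    x = np.zeros(n_steps + 1)
    kicks = rng.normal(0.0, np.sqrt(2.0 * D * dt), size=n_steps)
    for i in range(n_steps):
        x[i + 1] = x[i] + (force / gamma) * dt + kicks[i]
    return x

rng = np.random.default_rng(0)
n_steps, dt = 10_000, 1e-3
traj = overdamped_langevin(n_steps, dt, force=2.0, gamma=1.0,
                           temperature=1.0, rng=rng)
drift_velocity = traj[-1] / (n_steps * dt)  # approaches F/gamma without obstacles
```

Reintroducing spatio-temporal disorder would amount to adding a position- and time-dependent force term inside the update.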
Product, generic, and random generic quantum satisfiability
Laumann, C. R.; Sondhi, S. L.; Laeuchli, A. M.; Moessner, R.; Scardicchio, A.
2010-06-15
We report a cluster of results on k-QSAT, the problem of quantum satisfiability for k-qubit projectors which generalizes classical satisfiability with k-bit clauses to the quantum setting. First we define the NP-complete problem of product satisfiability and give a geometrical criterion for deciding when a QSAT interaction graph is product satisfiable with positive probability. We show that the same criterion suffices to establish quantum satisfiability for all projectors. Second, we apply these results to the random graph ensemble with generic projectors and obtain improved lower bounds on the location of the SAT-unSAT transition. Third, we present numerical results on random, generic satisfiability which provide estimates for the location of the transition for k=3 and k=4 and mild evidence for the existence of a phase which is satisfiable by entangled states alone.
Fast Magnetic Micropropellers with Random Shapes
2015-01-01
Studying propulsion mechanisms in low Reynolds number fluid has implications for many fields, ranging from the biology of motile microorganisms and the physics of active matter to micromixing in catalysis and micro- and nanorobotics. The propulsion of magnetic micropropellers can be characterized by a dimensionless speed, which solely depends on the propeller geometry for a given axis of rotation. However, this dependence has so far been only investigated for helical propeller shapes, which were assumed to be optimal. In order to explore a larger variety of shapes, we experimentally studied the propulsion properties of randomly shaped magnetic micropropellers. Surprisingly, we found that their dimensionless speeds are high on average, comparable to previously reported nanofabricated helical micropropellers. The highest dimensionless speed we observed is higher than that of any previously reported propeller moving in a low Reynolds number fluid, proving that physical random shape generation can be a viable optimization strategy. PMID:26383225
Random interactions in higher order neural networks
NASA Technical Reports Server (NTRS)
Baldi, Pierre; Venkatesh, Santosh S.
1993-01-01
Recurrent networks of polynomial threshold elements with random symmetric interactions are studied. Precise asymptotic estimates are derived for the expected number of fixed points as a function of the margin of stability. In particular, it is shown that there is a critical range of margins of stability (depending on the degree of polynomial interaction) such that the expected number of fixed points with margins below the critical range grows exponentially with the number of nodes in the network, while the expected number of fixed points with margins above the critical range decreases exponentially with the number of nodes in the network. The random energy model is also briefly examined, and links with higher order neural networks and higher order spin glass models are made explicit.
Random walk centrality in interconnected multilayer networks
NASA Astrophysics Data System (ADS)
Solé-Ribalta, Albert; De Domenico, Manlio; Gómez, Sergio; Arenas, Alex
2016-06-01
Real-world complex systems exhibit multiple levels of relationships. In many cases they must be modeled as interconnected multilayer networks, characterizing interactions of several types simultaneously. It is of crucial importance in many fields, from economics to biology and from urban planning to social sciences, to identify the most (or the least) influential nodes in a network using centrality measures. However, defining the centrality of actors in interconnected complex networks is not trivial. In this paper, we rely on the tensorial formalism recently proposed to characterize and investigate this kind of complex topology, and extend two well-known random walk centrality measures, the random walk betweenness and closeness centrality, to interconnected multilayer networks. For each of the measures we provide analytical expressions that completely agree with numerical results.
Random-effects models for longitudinal data
Laird, N.M.; Ware, J.H.
1982-12-01
Models for the analysis of longitudinal data must recognize the relationship between serial observations on the same unit. Multivariate models with general covariance structure are often difficult to apply to highly unbalanced data, whereas two-stage random-effects models can be used easily. In two-stage models, the probability distributions for the response vectors of different individuals belong to a single family, but some random-effects parameters vary across individuals, with a distribution specified at the second stage. A general family of models is discussed, which includes both growth models and repeated-measures models as special cases. A unified approach to fitting these models, based on a combination of empirical Bayes and maximum likelihood estimation of model parameters and using the EM algorithm, is discussed. Two examples are taken from a current epidemiological study of the health effects of air pollution.
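The two-stage structure can be sketched numerically: stage one specifies each individual's linear growth curve, and stage two draws the individual intercepts and slopes from population distributions. For a balanced design, averaging per-subject least-squares fits recovers the population parameters; the EM-based fitting machinery of the paper is not reproduced here, and all numerical values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)
n_subjects = 200
times = np.arange(5.0)                    # balanced design: 5 visits per subject
beta0, beta1 = 10.0, 2.0                  # population intercept and slope

u = rng.normal(0.0, 1.0, n_subjects)      # stage 2: random intercepts
v = rng.normal(0.0, 0.5, n_subjects)      # stage 2: random slopes

X = np.column_stack([np.ones_like(times), times])
per_subject = []
for i in range(n_subjects):
    y = (beta0 + u[i]) + (beta1 + v[i]) * times + rng.normal(0.0, 0.3, times.size)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)   # stage 1: per-subject OLS
    per_subject.append(coef)

beta_hat = np.mean(per_subject, axis=0)   # population estimate from stage-1 fits
```

With unbalanced data this simple averaging breaks down, which is exactly where the empirical Bayes / EM approach of the paper earns its keep.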
Parameter adaptive estimation of random processes
NASA Technical Reports Server (NTRS)
Caglayan, A. K.; Vanlandingham, H. F.
1975-01-01
This paper is concerned with the parameter adaptive least squares estimation of random processes. The main result is a general representation theorem for the conditional expectation of a random variable on a product probability space. Using this theorem along with the general likelihood ratio expression, the least squares estimate of the process is found in terms of the parameter conditioned estimates. The stochastic differential for the a posteriori probability and the stochastic differential equation for the a posteriori density are found by using simple stochastic calculus on the representations obtained. The results are specialized to the case when the parameter has a discrete distribution. The results can be used to construct an implementable recursive estimator for certain types of nonlinear filtering problems. This is illustrated by some simple examples.
Bell experiments with random destination sources
Sciarrino, Fabio; Mataloni, Paolo; Vallone, Giuseppe; Cabello, Adan
2011-03-15
It is generally assumed that sources randomly sending two particles to one or two different observers, random destination sources (RDSs), cannot be used for genuine quantum nonlocality tests because of the postselection loophole. We demonstrate that Bell experiments not affected by the postselection loophole may be performed with (i) an RDS and local postselection using perfect detectors, (ii) an RDS, local postselection, and fair sampling assumption with any detection efficiency, and (iii) an RDS and a threshold detection efficiency required to avoid the detection loophole. These results allow the adoption of RDS setups which are simpler and more efficient for long-distance free-space Bell tests, and extend the range of physical systems which can be used for loophole-free Bell tests.
Randomness in quantum mechanics - nature's ultimate cryptogram?
NASA Astrophysics Data System (ADS)
Erber, T.; Putterman, S.
1985-11-01
The possibility that a single atom irradiated by coherent light will be equivalent to an infinite computer with regard to its ability to generate random numbers is addressed. A search for unexpected patterns of order by cryptanalysis of the telegraph signal generated by the on/off times of the atom's fluorescence is described. The results will provide new experimental tests of the fundamental principles of quantum theory.
Relativistic diffusive motion in random electromagnetic fields
NASA Astrophysics Data System (ADS)
Haba, Z.
2011-08-01
We show that the relativistic dynamics in a Gaussian random electromagnetic field can be approximated by the relativistic diffusion of Schay and Dudley. Lorentz invariant dynamics in the proper time leads to the diffusion in the proper time. The dynamics in the laboratory time gives the diffusive transport equation corresponding to the Jüttner equilibrium at the inverse temperature β⁻¹ = mc². The diffusion constant is expressed by the field strength correlation function (Kubo's formula).
Resilience of complex networks to random breakdown
Paul, Gerald; Sreenivasan, Sameet; Stanley, H. Eugene
2005-11-01
Using Monte Carlo simulations we calculate f_c, the fraction of nodes that are randomly removed before global connectivity is lost, for networks with scale-free and bimodal degree distributions. Our results differ from the results predicted by an equation for f_c proposed by Cohen et al. We discuss the reasons for this disagreement and clarify the domain for which the proposed equation is valid.
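A minimal Monte Carlo sketch of this kind of random-breakdown experiment: delete a fraction of nodes and track the largest connected component with a union-find structure. An Erdős–Rényi graph stands in for the scale-free and bimodal ensembles of the paper, and the mean degree is an illustrative choice.

```python
import random
from collections import defaultdict

def giant_component_fraction(n, edges, removed):
    """Largest connected component (as a fraction of n) after deleting
    the node set `removed`, via union-find with path halving."""
    parent = list(range(n))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a
    for a, b in edges:
        if a in removed or b in removed:
            continue
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb
    sizes = defaultdict(int)
    for node in range(n):
        if node not in removed:
            sizes[find(node)] += 1
    return max(sizes.values(), default=0) / n

random.seed(1)
n = 2000
p = 4.0 / n                               # Erdos-Renyi graph, mean degree ~4
edges = [(i, j) for i in range(n) for j in range(i + 1, n)
         if random.random() < p]
nodes = list(range(n))
random.shuffle(nodes)
g_intact = giant_component_fraction(n, edges, set())
g_half   = giant_component_fraction(n, edges, set(nodes[:n // 2]))
```

Sweeping the removed fraction and locating where the giant component vanishes gives a Monte Carlo estimate of f_c for whatever degree distribution is plugged in.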
Average fidelity between random quantum states
Zyczkowski, Karol; Sommers, Hans-Juergen
2005-03-01
We analyze mean fidelity between random density matrices of size N, generated with respect to various probability measures in the space of mixed quantum states: the Hilbert-Schmidt measure, the Bures (statistical) measure, the measure induced by the partial trace, and the natural measure on the space of pure states. In certain cases explicit probability distributions for the fidelity are derived. The results obtained may be used to gauge the quality of quantum-information-processing schemes.
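For the simplest of the measures above, the natural measure on pure states, the average fidelity can be checked directly: for Haar-random pure states in dimension N, the mean of F = |⟨ψ|φ⟩|² is 1/N. A short numerical sketch with an illustrative dimension and sample size:

```python
import numpy as np

def random_pure_state(dim, rng):
    """Haar-random pure state: a normalized complex Gaussian vector."""
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

rng = np.random.default_rng(7)
dim, trials = 4, 20_000
fids = np.empty(trials)
for t in range(trials):
    psi = random_pure_state(dim, rng)
    phi = random_pure_state(dim, rng)
    fids[t] = abs(np.vdot(psi, phi)) ** 2   # pure-state fidelity |<psi|phi>|^2

mean_fidelity = fids.mean()                 # theory: 1/dim for Haar-random pairs
```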
Quantum games on evolving random networks
NASA Astrophysics Data System (ADS)
Pawela, Łukasz
2016-09-01
We study the advantages of quantum strategies in evolutionary social dilemmas on evolving random networks. We focus our study on the two-player games: prisoner's dilemma, snowdrift and stag-hunt games. The obtained results show the benefits of quantum strategies for the prisoner's dilemma game. For the other two games, we obtain regions of parameters where the quantum strategies dominate, as well as regions where the classical strategies coexist.
Magnetic Analog Random-Access Memory
NASA Technical Reports Server (NTRS)
Katti, Romney R.; Wu, Jiin-Chuan; Stadler, Henry L.
1991-01-01
Proposed integrated, solid-state, analog random-access memory based on principles of magnetic writing and magnetoresistive reading. Current in writing conductor magnetizes storage layer. Remanent magnetization in storage layer penetrates readout layer and is detected by magnetoresistive effect or Hall effect. Memory cells are part of integrated circuit including associated reading and writing transistors. Intended to provide high storage density and rapid access; nonvolatile, consumes little power, and relatively invulnerable to ionizing radiation.
A Random Walk Picture of Basketball
NASA Astrophysics Data System (ADS)
Gabel, Alan; Redner, Sidney
2012-02-01
We analyze NBA basketball play-by-play data and find that scoring is well described by a weakly biased, anti-persistent, continuous-time random walk. The time between successive scoring events follows an exponential distribution, with little memory between events. We account for a wide variety of statistical properties of scoring, such as the distribution of the score difference between opponents and the fraction of game time that one team is in the lead.
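A minimal sketch of such a weakly biased continuous-time random walk: exponential waiting times between scoring events and a small bias in which team scores. The rate, bias, and fixed two-point increments are illustrative assumptions, not values fitted in the paper (and the anti-persistence of real scoring is omitted).

```python
import numpy as np

rng = np.random.default_rng(3)
rate = 1 / 30.0       # ~one scoring event per 30 s (assumed, not fitted)
game_len = 48 * 60.0  # regulation game length in seconds
bias = 0.02           # weak scoring bias toward team A

def simulate_game():
    """Continuous-time random walk for the score difference of one game."""
    t, diff, waits = 0.0, 0, []
    while True:
        w = rng.exponential(1 / rate)    # memoryless waiting time
        if t + w > game_len:
            break                        # clock runs out before the next score
        t += w
        waits.append(w)
        diff += 2 if rng.random() < 0.5 + bias else -2
    return diff, waits

diffs, all_waits = [], []
for _ in range(2000):
    d, w = simulate_game()
    diffs.append(d)
    all_waits.extend(w)

mean_wait = np.mean(all_waits)   # close to 1/rate = 30 s
mean_diff = np.mean(diffs)       # small positive drift from the bias
```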
Random transitions and cell cycle control.
Brooks, R F
1981-01-01
Differences between the cycle times of sister cells are exponentially distributed, which means that these differences can be explained entirely by the existence of a single critical step in the cell cycle which occurs at random. Cycle times as a whole are not exponentially distributed, indicating an additional source of variation in the cell cycle. It follows that this additional variation must affect sister cells identically; i.e., sister cell cycle times are correlated. This correlation and the overall distribution of cycle times can be predicted quantitatively by a model that was developed initially in order to explain certain problematic features of the response of quiescent cells to mitogenic stimulation, in particular the significance of the lag that almost invariably occurs between stimulation and the onset of DNA synthesis. This model proposes that each cell cycle depends not on one but two random transitions, one of which (at reasonably high growth rates) occurs in the mother cell, its effects being inherited equally by the two daughter cells. The fundamental timing element in the cell cycle is proposed to be a lengthy process, called L, which accounts for most of the lag on mitogenic stimulation and also for the minimum cycle time in growing cultures. One of the random transitions is concerned with the initiation of L, whereas the other becomes possible on completion of L. The latter transition has two consequences: the first is the initiation of a sequence of events which includes S, G2 and M; the second is the restoration of the state from which L may be initiated once more. As a result, L may begin (at random) at any stage of the conventional cycle, i.e., S, G2, M, or G1. There are marked similarities between the hypothetical process L and the biogenesis of mitotic centres, the structures responsible for organising the spindle poles. PMID:7312875
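The two-transition structure can be sketched numerically: sisters share the lag L and the mother-cell transition but undergo the second transition independently, so the absolute difference of their cycle times is exponentially distributed while the shared component correlates the times themselves. The duration of L and the transition rate below are illustrative, not fitted values.

```python
import numpy as np

rng = np.random.default_rng(11)
n_pairs = 50_000
lag_L = 8.0    # duration of the process L (hours; illustrative)
rate = 0.5     # rate of each random transition (per hour; illustrative)

# Shared by both sisters: the mother-cell transition plus the lag L.
shared = lag_L + rng.exponential(1 / rate, n_pairs)
# Independent second transitions give each sister her own cycle time.
t1 = shared + rng.exponential(1 / rate, n_pairs)
t2 = shared + rng.exponential(1 / rate, n_pairs)

sister_diff = np.abs(t1 - t2)     # |Exp - Exp| is again exponential, mean 1/rate
corr = np.corrcoef(t1, t2)[0, 1]  # shared component correlates sister cycle times
```

In this toy version the correlation is exactly the fraction of cycle-time variance carried by the shared component (0.5 for equal rates), echoing the abstract's point that the extra variation must affect sisters identically.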
Johansson, Robert; Björklund, Martin; Hornborg, Christoffer; Karlsson, Stina; Hesser, Hugo; Ljótsson, Brjánn; Rousseau, Andréas; Frederick, Ronald J; Andersson, Gerhard
2013-01-01
Background. Psychodynamic psychotherapy is a psychological treatment approach that has a growing empirical base. Research has indicated an association between therapist-facilitated affective experience and outcome in psychodynamic therapy. Affect-phobia therapy (APT), as outlined by McCullough et al., is a psychodynamic treatment that emphasizes a strong focus on expression and experience of affect. This model has neither been evaluated for depression nor anxiety disorders in a randomized controlled trial. While Internet-delivered psychodynamic treatments for depression and generalized anxiety disorder exist, they have not been based on APT. The aim of this randomized controlled trial was to investigate the efficacy of an Internet-based, psychodynamic, guided self-help treatment based on APT for depression and anxiety disorders. Methods. One hundred participants with diagnoses of mood and anxiety disorders participated in a randomized (1:1 ratio) controlled trial of an active group versus a control condition. The treatment group received a 10-week, psychodynamic, guided self-help treatment based on APT that was delivered through the Internet. The treatment consisted of eight text-based treatment modules and included therapist contact (9.5 min per client and week, on average) in a secure online environment. Participants in the control group also received online therapist support and clinical monitoring of symptoms, but received no treatment modules. Outcome measures were the 9-item Patient Health Questionnaire Depression Scale (PHQ-9) and the 7-item Generalized Anxiety Disorder Scale (GAD-7). Process measures were also included. All measures were administered weekly during the treatment period and at a 7-month follow-up. Results. Mixed models analyses using the full intention-to-treat sample revealed significant interaction effects of group and time on all outcome measures, when comparing treatment to the control group. A large between-group effect size of Cohen's d
Chan, Agnes S.; Han, Yvonne M. Y.; Sze, Sophia L.; Wong, Queenie Y.
2013-01-01
Our previous studies have reported the therapeutic effects of 10-session Chinese Chan-based Dejian mind-body interventions (DMBI) in reducing the intake of antidepressants, improving depressive symptoms, and enhancing the attentional abilities of patients with depression. This study aims to explore the possible neuroelectrophysiological mechanisms underlying the previously reported treatment effects of DMBI in comparison with those of cognitive behavioral therapy (CBT). Seventy-five age-, gender-, and education-matched participants with depression were randomly assigned to receive either CBT or DMBI or placed on a waitlist. Eyes-closed resting EEG data were obtained individually before and after 10 weeks. After intervention, the DMBI group demonstrated significantly enhanced frontal alpha asymmetry (an index of positive mood) and intra- and interhemispheric theta coherence in frontoposterior and posterior brain regions (an index of attention). In contrast, neither the CBT nor the waitlist group showed significant changes in EEG activity patterns. Furthermore, the asymmetry and coherence indices of the DMBI group were correlated with self-reported depression severity levels and performance on an attention test, respectively. The present findings provide support for the effects of a Chinese Chan-based mind-body intervention in fostering human brain states that can facilitate positive mood and an attentive mind. PMID:24489591
Lee, Seung-Hwan; Kwon, Hyuk-Sang; Park, Yong-Moon; Ko, Seung-Hyun; Choi, Yoon-Hee; Yoon, Kun-Ho
2014-01-01
Background This study investigated the rate of relapse of dyslipidemia and the factors which could predict relapse following a short-term statin discontinuation after achieving a target low density lipoprotein cholesterol (LDL-C) level in type 2 diabetic patients without cardiovascular disease (CVD). Methods Ninety-nine subjects on rosuvastatin treatment and whose LDL-C level was lower than 100 mg/dL were randomly assigned to discontinue or maintain statin treatment at a 2:1 ratio. The subjects were followed-up after 10 weeks. A relapse of dyslipidemia was defined as a reascent of LDL-C level to greater than 100 mg/dL. Results The statin discontinuation group had a significant rate of relapse compared to the maintenance group (79% vs. 3%, respectively). Pretreatment and baseline lipid levels, their ratios, and hemoglobin A1c level were significantly different between the relapse and nonrelapse groups. The pretreatment and baseline lipid profiles and their ratios were independently associated with relapse. The pretreatment LDL-C level was the most useful parameter for predicting a relapse, with a cutoff of 123 mg/dL. During the follow-up period, no CVD event was noted. Conclusion The relapse rate of dyslipidemia was high when statins were discontinued in type 2 diabetic patients without CVD. Statin discontinuation should be considered carefully based on the pretreatment lipid profiles of patients. PMID:24627830
Gunn, Rebecca; Russell, Jeremy K; Ary, Dennis V
2016-01-01
Background Worldwide, depression is rated as the fourth leading cause of disease burden and is projected to be the second leading cause of disability by 2020. Annual depression-related costs in the United States are estimated at US $210.5 billion, with employers bearing over 50% of these costs in productivity loss, absenteeism, and disability. Because most adults with depression never receive treatment, there is a need to develop effective interventions that can be more widely disseminated through new channels, such as employee assistance programs (EAPs), and directly to individuals who will not seek face-to-face care. Objective This study evaluated a self-guided intervention, using the MoodHacker mobile Web app to activate the use of cognitive behavioral therapy (CBT) skills in working adults with mild-to-moderate depression. It was hypothesized that MoodHacker users would experience reduced depression symptoms and negative cognitions, and increased behavioral activation, knowledge of depression, and functioning in the workplace. Methods A parallel two-group randomized controlled trial was conducted with 300 employed adults exhibiting mild-to-moderate depression. Participants were recruited from August 2012 through April 2013 in partnership with an EAP and with outreach through a variety of additional non-EAP organizations. Participants were blocked on race/ethnicity and then randomly assigned within each block to receive, without clinical support, either the MoodHacker intervention (n=150) or alternative care consisting of links to vetted websites on depression (n=150). Participants in both groups completed online self-assessment surveys at baseline, 6 weeks after baseline, and 10 weeks after baseline. Surveys assessed (1) depression symptoms, (2) behavioral activation, (3) negative thoughts, (4) worksite outcomes, (5) depression knowledge, and (6) user satisfaction and usability. After randomization, all interactions with subjects were automated with the
2012-01-01
Background Many postnatal women are insufficiently physically active in the year after childbirth and could benefit from interventions to increase activity levels. However, there is limited information about the efficacy, feasibility and acceptability of motivational and behavioral interventions promoting postnatal physical activity in the UK. Methods The MAMMiS study is a randomized, controlled trial, conducted within a large National Health Service (NHS) region in Scotland. Up to 76 postnatal women will be recruited to test the impact of two physical activity consultations and a 10-week group pram-walking program on physical activity behavior change. The intervention uses evidence-based motivational and behavioral techniques and will be systematically evaluated using objective measures (accelerometers) at three months, with a maintenance measure taken at a six-month follow-up. Secondary health and well-being measures and psychological mediators of physical activity change are included. Discussion The MAMMiS study will provide a test of a theoretical and evidence-based physical activity behavior change intervention for postnatal women and provide information to inform future intervention development and testing within this population. Trial registration Current Controlled Trials ISRCTN79011784 PMID:22818406
Nadeau, Stephen E; Davis, Sandra E; Wu, Samuel S; Dai, Yunfeng; Richards, Lorie G
2014-01-01
Background. Phase III trials of rehabilitation of paresis after stroke have proven the effectiveness of intensive and extended task practice, but they have also shown that many patients do not qualify, because of severity of impairment, and that many of those who are treated are left with clinically significant deficits. Objective. To test the value of 2 potential adjuvants to normal learning processes engaged in constraint-induced movement therapy (CIMT): greater distribution of treatment over time and the coadministration of d-cycloserine, a competitive agonist at the glycine site of the N-methyl-D-aspartate glutamate receptor. Methods. A prospective randomized single-blind parallel-group trial of more versus less condensed therapy (2 vs 10 weeks) and d-cycloserine (50 mg) each treatment day versus placebo (in a 2 × 2 design), as potential adjuvants to 60 hours of CIMT. Results. Twenty-four participants entered the study, and 22 completed it and were assessed at the completion of treatment and 3 months later. Neither greater distribution of treatment nor treatment with d-cycloserine significantly augmented retention of gains achieved with CIMT. Conclusions. Greater distribution of practice and treatment with d-cycloserine do not appear to augment retention of gains achieved with CIMT. However, concentration of CIMT over 2 weeks ("massed practice") appears to confer no advantage either.
Random complex dynamics and devil's coliseums
NASA Astrophysics Data System (ADS)
Sumi, Hiroki
2015-04-01
We investigate the random dynamics of polynomial maps on the Riemann sphere Ĉ and the dynamics of semigroups of polynomial maps on Ĉ. In particular, the dynamics of a semigroup G of polynomials whose planar postcritical set is bounded, and the associated random dynamics, are studied. In general, the Julia set of such a G may be disconnected. We show that if G is such a semigroup, then regarding the associated random dynamics, the chaos of the averaged system disappears in the C0 sense, and the function T∞ of the probability of tending to ∞ ∈ Ĉ is Hölder continuous on Ĉ and varies only on the Julia set of G. Moreover, the function T∞ has a kind of monotonicity. It turns out that T∞ is a complex analogue of the devil's staircase, and we call T∞ a 'devil's coliseum'. We investigate the details of T∞ when G is generated by two polynomials. In this case, T∞ varies precisely on the Julia set of G, which is a thin fractal set. Moreover, under this condition, we investigate the pointwise Hölder exponents of T∞.
Effect of noise correlations on randomized benchmarking
NASA Astrophysics Data System (ADS)
Ball, Harrison; Stace, Thomas M.; Flammia, Steven T.; Biercuk, Michael J.
2016-02-01
Among the most popular and well-studied quantum characterization, verification, and validation techniques is randomized benchmarking (RB), an important statistical tool used to characterize the performance of physical logic operations useful in quantum information processing. In this work we provide a detailed mathematical treatment of the effect of temporal noise correlations on the outcomes of RB protocols. We provide a fully analytic framework capturing the accumulation of error in RB expressed in terms of a three-dimensional random walk in "Pauli space." Using this framework we derive the probability density function describing RB outcomes (averaged over noise) for both Markovian and correlated errors, which we show is generally described by a Γ distribution with shape and scale parameters depending on the correlation structure. Long temporal correlations impart large nonvanishing variance and skew in the distribution towards high-fidelity outcomes—consistent with existing experimental data—highlighting potential finite-sampling pitfalls and the divergence of the mean RB outcome from worst-case errors in the presence of noise correlations. We use the filter-transfer function formalism to reveal the underlying reason for these differences in terms of effective coherent averaging of correlated errors in certain random sequences. We conclude by commenting on the impact of these calculations on the utility of single-metric approaches to quantum characterization, verification, and validation.
Optical wireless communication through random media
NASA Astrophysics Data System (ADS)
Arnon, Shlomi
2011-03-01
The growing need for high data-rate communication both through the atmosphere and the ocean (sub-sea) has stimulated considerable interest in optical wireless communication (OWC) technologies. The main advantages of OWC as compared with RF communication in the atmosphere and with acoustic communication in sub-sea applications are a) high achievable data-rate, b) small size of equipment and c) low power-consumption. On the other hand the characteristics of the communication channel in both scenarios are stochastic with high values of variance, which severely degrades OWC communication system performance. In this paper we present a tutorial discussing the effects of random media on OWC and expand on two examples: Monte-Carlo simulation for sub-sea communication and mathematical synthesis using Meijer G-function for OWC through atmospheric turbulence. These two examples demonstrate that it is possible to gain significant insights on the effects of the random channel on system performance. The results of the different analysis methods could also indicate solutions for the improvement of performance using adaptive solutions or for extending the communication range by applying a multi-hop concept. We summarize the paper with a brief review of two emerging research fields that could, surprisingly, benefit from the characteristics of light propagation through random media and its effect on the communication system performance. The first research field is trans-cutaneous OWC and the second is an unguided optical communication bus for next-generation computers.
A parallel algorithm for random searches
NASA Astrophysics Data System (ADS)
Wosniack, M. E.; Raposo, E. P.; Viswanathan, G. M.; da Luz, M. G. E.
2015-11-01
We discuss a parallelization procedure for a two-dimensional random search of a single individual, a typical sequential process. To assure the same features of the sequential random search in the parallel version, we analyze the former's spatial patterns of encountered targets for different search strategies and densities of homogeneously distributed targets. We identify a lognormal tendency for the distribution of distances between consecutively detected targets. Then, by assigning the distinct mean and standard deviation of this distribution for each corresponding configuration in the parallel simulations (constituted by parallel random walkers), we are able to recover important statistical properties, e.g., the target detection efficiency, of the original problem. The proposed parallel approach presents a speedup of nearly one order of magnitude compared with the sequential implementation. This algorithm can be easily adapted to different instances, such as searches in three dimensions. Its possible range of applicability covers problems in areas as diverse as automated computer searches in high-capacity databases and animal foraging.
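A rough sketch of the assignment step described above: each parallel walker draws its inter-detection distances from a lognormal whose parameters would, in the actual algorithm, be measured from the sequential runs for each strategy and target density. The mu and sigma used here are placeholders, and the detection efficiency is taken as targets found per unit distance travelled.

```python
import numpy as np

rng = np.random.default_rng(5)

# Lognormal parameters for the inter-detection distance distribution; in the
# real procedure these would come from the sequential search statistics for
# each configuration (placeholders here):
mu, sigma = 2.0, 0.6

n_walkers, detections_each = 8, 5_000
# Each parallel walker draws its own travel distances between target hits:
dists = rng.lognormal(mu, sigma, size=(n_walkers, detections_each))
total_path = dists.sum(axis=1)             # distance travelled by each walker
efficiency = detections_each / total_path  # targets found per unit distance
theory = 1.0 / np.exp(mu + sigma**2 / 2)   # 1 / E[lognormal distance]
```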
Variational Infinite Hidden Conditional Random Fields.
Bousmalis, Konstantinos; Zafeiriou, Stefanos; Morency, Louis-Philippe; Pantic, Maja; Ghahramani, Zoubin
2015-09-01
Hidden conditional random fields (HCRFs) are discriminative latent variable models which have been shown to successfully learn the hidden structure of a given classification problem. An infinite hidden conditional random field is a hidden conditional random field with a countably infinite number of hidden states, which rids us not only of the necessity to specify a priori a fixed number of available hidden states but also of the problem of overfitting. Markov chain Monte Carlo (MCMC) sampling algorithms are often employed for inference in such models. However, convergence of such algorithms is rather difficult to verify, and as the complexity of the task at hand increases the computational cost of such algorithms often becomes prohibitive. These limitations can be overcome by variational techniques. In this paper, we present a generalized framework for infinite HCRF models, and a novel variational inference approach on a model based on coupled Dirichlet Process Mixtures, the HCRF-DPM. We show that the variational HCRF-DPM is able to converge to a correct number of represented hidden states, and performs as well as the best parametric HCRFs (chosen via cross-validation) for the difficult tasks of recognizing instances of agreement, disagreement, and pain in audiovisual sequences. PMID:26353136
Hydrodynamical spectral evolution for random matrices
NASA Astrophysics Data System (ADS)
Forrester, Peter J.; Grela, Jacek
2016-02-01
The eigenvalues of the matrix structure X + X^(0), where X is a random Gaussian Hermitian matrix and X^(0) is non-random or random independent of X, are closely related to Dyson Brownian motion. Previous works have shown how an infinite hierarchy of equations satisfied by the dynamical correlations becomes triangular in the infinite density limit, and gives rise to the complex Burgers equation for the Green's function of the corresponding one-point density function. We show how this and analogous partial differential equations, for chiral, circular and Jacobi versions of Dyson Brownian motion, follow from a macroscopic hydrodynamical description involving the current density and continuity equation. The method of characteristics gives a systematic approach to solving the PDEs, and in the chiral case we show how this efficiently reclaims the characterization of the global eigenvalue density for non-central Wishart matrices due to Dozier and Silverstein. Collective variables provide another approach to deriving the complex Burgers equation in the Gaussian case, and we show that this approach applies equally well to chiral matrices. We relate both the Gaussian and chiral cases to the asymptotics of matrix integrals.
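The Gaussian case can be sampled directly: with X a suitably normalized GUE matrix (and X^(0) = 0 for simplicity), the one-point eigenvalue density converges to the semicircle law on [-2, 2] with mean 0 and variance 1, the stationary reference point for the hydrodynamic description. A minimal numerical check, with an illustrative matrix size:

```python
import numpy as np

rng = np.random.default_rng(9)
N = 400

# GUE sample: A has iid standard complex Gaussian entries, so
# X = (A + A^dagger)/(2 sqrt(N)) is Hermitian with spectrum converging
# to the semicircle law on [-2, 2].
A = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
X = (A + A.conj().T) / (2 * np.sqrt(N))
eigs = np.linalg.eigvalsh(X)

# Semicircle on [-2, 2]: mean 0, variance 1, spectral edges at +-2.
empirical_mean, empirical_var = eigs.mean(), eigs.var()
```

Adding a non-zero X^(0) shifts this density in the way the complex Burgers equation describes; the sketch above only verifies the unperturbed endpoint.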
Exploring number space by random digit generation.
Loetscher, Tobias; Brugger, Peter
2007-07-01
There is some evidence that human subjects preferentially select small numbers when asked to sample numbers from large intervals "at random". A retrospective analysis of single digit frequencies in 16 independent experiments with the Mental Dice Task (generation of digits 1-6 during 1 min) confirmed the occurrence of small-number biases (SNBs) in 488 healthy subjects. A subset of these experiments suggested a spatial nature of this bias in the sense of a "leftward" shift along the number line. First, individual SNBs were correlated with leftward deviations in a number line bisection task (but unrelated to the bisection of physical lines). Second, in 20 men, the magnitude of SNBs significantly correlated with leftward attentional biases in the judgment of chimeric faces. Finally, cognitive activation of the right hemisphere enhanced SNBs in 20 different men, while left hemisphere activation reduced them. Together, these findings provide support for a spatial component in random number generation. Specifically, they allow an interpretation of SNBs in terms of "pseudoneglect in number space." We recommend the use of random digit generation for future explorations of spatial-attentional asymmetries in numerical processing and discuss methodological issues relevant to prospective designs.
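A minimal sketch of the kind of frequency analysis described. We assume, purely for illustration, that the small-number bias (SNB) of a digit stream is quantified as the observed fraction of digits 1-3 minus the unbiased expectation of 0.5; the function and variable names are ours, not the paper's:

```python
import random

def small_number_bias(digits):
    """Fraction of 'small' digits (1-3) minus the unbiased expectation 0.5.

    Positive values indicate a small-number bias in a 1-6 digit stream,
    as produced in the Mental Dice Task.
    """
    small = sum(1 for d in digits if d in (1, 2, 3))
    return small / len(digits) - 0.5

rng = random.Random(42)

# A truly uniform generator shows (almost) no bias...
uniform = [rng.randint(1, 6) for _ in range(10000)]

# ...while a generator that favours small digits does.
biased = [rng.choice([1, 1, 2, 2, 3, 3, 4, 5, 6]) for _ in range(10000)]

print(round(small_number_bias(uniform), 3))
print(round(small_number_bias(biased), 3))
```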
Kronberg, James W.
1993-01-01
An apparatus for selecting at random one item of N items on the average comprising counter and reset elements for counting repeatedly between zero and N, a number selected by the user, a circuit for activating and deactivating the counter, a comparator to determine if the counter stopped at a count of zero, an output to indicate an item has been selected when the count is zero or not selected if the count is not zero. Randomness is provided by having the counter cycle very often while varying the relatively longer duration between activation and deactivation of the count. The passive circuit components of the activating/deactivating circuit and those of the counter are selected for the sensitivity of their response to variations in temperature and other physical characteristics of the environment so that the response time of the circuitry varies. Additionally, the items themselves, which may be people, may vary in shape or the time they press a pushbutton, so that, for example, an ultrasonic beam broken by the item or person passing through it will add to the duration of the count and thus to the randomness of the selection.
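The selection logic of the apparatus can be sketched in software. The cycle period and the million-tick stop-time range below are arbitrary stand-ins for the analog, temperature-sensitive delays that provide the physical randomness in the patent:

```python
import random

def counter_select(n, duration, cycle_period=1):
    """Simulate the patent's counter: it cycles 0..n-1 while active.

    `duration` is the (physically noisy) active time between activation
    and deactivation; the item is 'selected' exactly when the counter
    halts at zero, i.e. with probability 1/n for an unpredictable
    duration, so one item of n is selected on average.
    """
    return (duration // cycle_period) % n == 0

rng = random.Random(1)
n = 8
trials = 20000
# Unpredictable stop times stand in for the temperature-sensitive
# analog delays and variable pushbutton/beam-break durations.
hits = sum(counter_select(n, rng.randrange(10**6)) for _ in range(trials))
print(hits / trials)
```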
KASER: Knowledge Amplification by Structured Expert Randomization.
Rubin, Stuart H; Murthy, S N Jayaram; Smith, Michael H; Trajković, Ljiljana
2004-12-01
In this paper and attached video, we present a third-generation expert system named Knowledge Amplification by Structured Expert Randomization (KASER) for which a patent has been filed by the U.S. Navy's SPAWAR Systems Center, San Diego, CA (SSC SD). KASER is a creative expert system. It is capable of deductive, inductive, and mixed derivations. Its qualitative creativity is realized by using a tree-search mechanism. The system achieves creative reasoning by using a declarative representation of knowledge consisting of object trees and inheritance. KASER computes with words and phrases. It possesses a capability for metaphor-based explanations. This capability is useful in explaining its creative suggestions and serves to augment the capabilities provided by the explanation subsystems of conventional expert systems. KASER also exhibits an accelerated capability to learn. However, this capability depends on the particulars of the selected application domain. For example, application domains such as the game of chess exhibit a high degree of geometric symmetry. Conversely, application domains such as the game of craps played with two dice exhibit no predictable pattern, unless the dice are loaded. More generally, we say that domains whose informative content can be compressed to a significant degree without loss (or with relatively little loss) are symmetric. Incompressible domains are said to be asymmetric or random. The measure of symmetry plus the measure of randomness must always sum to unity.
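The abstract's compressibility-based notion of symmetry and randomness (the two measures summing to unity) can be illustrated with an off-the-shelf compressor. Here zlib is our stand-in for a domain-appropriate measure of informative content; the exact measure used by KASER is not specified in the abstract:

```python
import random
import zlib

def randomness_measure(data: bytes) -> float:
    """Crude proxy for incompressibility: compressed size relative to
    original size, clipped to [0, 1]. Incompressible domains are random."""
    if not data:
        return 0.0
    return min(1.0, len(zlib.compress(data, 9)) / len(data))

def symmetry_measure(data: bytes) -> float:
    # By the abstract's convention, symmetry and randomness sum to unity.
    return 1.0 - randomness_measure(data)

# A highly patterned (compressible) domain is symmetric, like chess...
patterned = b"abcd" * 1000

# ...while an incompressible stream is asymmetric/random, like fair dice.
rng = random.Random(0)
random_like = bytes(rng.randrange(256) for _ in range(4000))

print(symmetry_measure(patterned), randomness_measure(random_like))
```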
Fast phase randomization via two-folds
Jeffrey, M. R.
2016-01-01
A two-fold is a singular point on the discontinuity surface of a piecewise-smooth vector field, at which the vector field is tangent to the discontinuity surface on both sides. If an orbit passes through an invisible two-fold (also known as a Teixeira singularity) before settling to regular periodic motion, then the phase of that motion cannot be determined from initial conditions, and, in the presence of small noise, the asymptotic phase of a large number of sample solutions is highly random. In this paper, we show how the probability distribution of the asymptotic phase depends on the global nonlinear dynamics. We also show how the phase of a smooth oscillator can be randomized by applying a simple discontinuous control law that generates an invisible two-fold. We propose that such a control law can be used to desynchronize a collection of oscillators, and that this manner of phase randomization is fast compared with existing methods (which use fixed points as phase singularities), because there is no slowing of the dynamics near a two-fold. PMID:27118901
Asymptotic properties of a bold random walk
NASA Astrophysics Data System (ADS)
Serva, Maurizio
2014-08-01
In a recent paper we proposed a non-Markovian random walk model with memory of the maximum distance ever reached from the starting point (home). The behavior of the walker is different from the simple symmetric random walk only when she is at this maximum distance, where, having the choice to move either farther or closer, she decides with different probabilities. If the probability of a forward step is higher than the probability of a backward step, the walker is bold and her behavior turns out to be superdiffusive; otherwise she is timorous and her behavior turns out to be subdiffusive. The scaling behavior varies continuously from subdiffusive (timorous) to superdiffusive (bold) according to a single parameter γ ∈ ℝ. We investigate here the asymptotic properties of the bold case in the non-ballistic region γ ∈ [0, 1/2], a problem which was left partially unsolved previously. The exact results proved in this paper require new probabilistic tools which rely on the construction of appropriate martingales of the random walk and its hitting times.
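A quick simulation sketch of such a walk. We replace the paper's γ parametrisation with a plain forward-step probability at the frontier (an assumption for illustration only; the mapping to γ is not reproduced here):

```python
import random

def bold_walk(steps, p_forward, seed=0):
    """Walk on Z with memory of the farthest point reached (the frontier).

    Away from the frontier the walk is simple and symmetric; at the
    frontier it steps outward with probability p_forward (a stand-in for
    the paper's gamma parametrisation; p_forward > 1/2 is 'bold').
    Returns the frontier reached after `steps` steps.
    """
    rng = random.Random(seed)
    x, frontier = 0, 0
    for _ in range(steps):
        if x == frontier:
            x += 1 if rng.random() < p_forward else -1
        else:
            x += 1 if rng.random() < 0.5 else -1
        frontier = max(frontier, x)
    return frontier

# Bolder walkers reach farther frontiers on average.
runs = range(50)
timid = sum(bold_walk(2000, 0.3, seed=s) for s in runs) / 50
bold = sum(bold_walk(2000, 0.8, seed=s) for s in runs) / 50
print(timid, bold)
```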
The most parsimonious tree for random data.
Fischer, Mareike; Galla, Michelle; Herbst, Lina; Steel, Mike
2014-11-01
Applying a method to reconstruct a phylogenetic tree from random data provides a way to detect whether that method has an inherent bias towards certain tree 'shapes'. For maximum parsimony, applied to a sequence of random 2-state data, each possible binary phylogenetic tree has exactly the same distribution for its parsimony score. Despite this pleasing and slightly surprising symmetry, some binary phylogenetic trees are more likely than others to be a most parsimonious (MP) tree for a sequence of k such characters, as we show. For k=2, and unrooted binary trees on six taxa, any tree with a caterpillar shape has a higher chance of being an MP tree than any tree with a symmetric shape. On the other hand, if we take any two binary trees, on any number of taxa, we prove that this bias between the two trees vanishes as the number of characters k grows. However, again there is a twist: MP trees on six taxa for k=2 random binary characters are more likely to have certain shapes than a uniform distribution on binary phylogenetic trees predicts. Moreover, this shape bias appears, from simulations, to be more pronounced for larger values of k.
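The parsimony score mentioned above is computed by Fitch's small-parsimony algorithm; a minimal sketch for one binary character on an unrooted-style nested-tuple tree (the tree encoding and names are ours):

```python
def fitch_score(tree, states):
    """Fitch's small-parsimony algorithm for one character.

    `tree` is a nested tuple (left, right) with leaf names at the tips;
    `states` maps each leaf name to its state, e.g. 0 or 1.
    Returns the minimum number of state changes required on the tree.
    """
    def post(node):
        if isinstance(node, str):                      # leaf
            return {states[node]}, 0
        (ls, lc), (rs, rc) = post(node[0]), post(node[1])
        inter = ls & rs
        if inter:                                      # children agree
            return inter, lc + rc
        return ls | rs, lc + rc + 1                    # one extra change

    return post(tree)[1]

# Six-taxon caterpillar-shaped tree, one binary character.
cat = ((((("a", "b"), "c"), "d"), "e"), "f")
char = {"a": 0, "b": 0, "c": 1, "d": 1, "e": 0, "f": 1}
print(fitch_score(cat, char))  # prints 2
```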
Physical Principle for Generation of Randomness
NASA Technical Reports Server (NTRS)
Zak, Michail
2009-01-01
A physical principle (more precisely, a principle that incorporates mathematical models used in physics) has been conceived as the basis of a method of generating randomness in Monte Carlo simulations. The principle eliminates the need for conventional random-number generators. The Monte Carlo simulation method is among the most powerful computational methods for solving high-dimensional problems in physics, chemistry, economics, and information processing. The Monte Carlo simulation method is especially effective for solving problems in which computational complexity increases exponentially with dimensionality. The main advantage of the Monte Carlo simulation method over other methods is that the demand on computational resources becomes independent of dimensionality. As augmented by the present principle, the Monte Carlo simulation method becomes an even more powerful computational method that is especially useful for solving problems associated with dynamics of fluids, planning, scheduling, and combinatorial optimization. The present principle is based on coupling of dynamical equations with the corresponding Liouville equation. The randomness is generated by non-Lipschitz instability of dynamics triggered and controlled by feedback from the Liouville equation. (In non-Lipschitz dynamics, the derivatives of solutions of the dynamical equations are not required to be bounded.)
Grounding the randomness of quantum measurement.
Jaeger, Gregg
2016-05-28
Julian Schwinger provided to physics a mathematical reconstruction of quantum mechanics on the basis of the characteristics of sequences of measurements occurring at the atomic level of physical structure. The central component of this reconstruction is an algebra of symbols corresponding to quantum measurements, conceived of as discrete processes, which serve to relate experience to theory; collections of outcomes of identically circumscribed such measurements are attributed expectation values, which constitute the predictive content of the theory. The outcomes correspond to certain phase parameters appearing in the corresponding symbols, which are complex numbers, the algebra of which he finds by a process he refers to as 'induction'. Schwinger assumed these (individually unpredictable) phase parameters to take random, uniformly distributed definite values within a natural range. I have previously suggested that the 'principle of plenitude' may serve as a basis in principle for the occurrence of the definite measured values that are those members of the collections of measurement outcomes from which the corresponding observed statistics derive (Jaeger 2015Found. Phys.45, 806-819. (doi:10.1007/s10701-015-9893-6)). Here, I evaluate Schwinger's assumption in the context of recent critiques of the notion of randomness and explicitly relate the randomness of these phases with the principle of plenitude and, in this way, provide a fundamental grounding for the objective, physically irreducible probabilities, conceived of as graded possibilities, that are attributed to measurement outcomes by quantum mechanics. PMID:27091162
Entanglement negativity in random spin chains
NASA Astrophysics Data System (ADS)
Ruggiero, Paola; Alba, Vincenzo; Calabrese, Pasquale
2016-07-01
We investigate the logarithmic negativity in strongly disordered spin chains in the random-singlet phase. We focus on the spin-1/2 random Heisenberg chain and the random XX chain. We find that for two arbitrary intervals, the disorder-averaged negativity and the mutual information are proportional to the number of singlets shared between the two intervals. Using the strong-disorder renormalization group (SDRG), we prove that the negativity of two adjacent intervals grows logarithmically with the intervals' length. In particular, the scaling behavior is the same as in conformal field theory, but with a different prefactor. For two disjoint intervals the negativity is given by a universal simple function of the cross ratio, reflecting scale invariance. As a function of the distance of the two intervals, the negativity decays algebraically, in contrast with the exponential behavior in clean models. We confirm our predictions using a numerical implementation of the SDRG method. Finally, we also implement density matrix renormalization group simulations for the negativity in open spin chains. The chains accessible in the presence of strong disorder are not sufficiently long to provide a reliable confirmation of the SDRG results.
Metadisorder for designer light in random systems
Yu, Sunkyu; Piao, Xianji; Hong, Jiho; Park, Namkyoo
2016-01-01
Disorder plays a critical role in signal transport by controlling the correlation of a system, as demonstrated in various complex networks. In wave physics, disordered potentials suppress wave transport, because of their localized eigenstates, from the interference between multiple scattering paths. Although the variation of localization with tunable disorder has been intensively studied as a bridge between ordered and disordered media, the general trend of disorder-enhanced localization has remained unchanged, and the existence of complete delocalization in highly disordered potentials has not been explored. We propose the concept of “metadisorder”: randomly coupled optical systems in which eigenstates can be engineered to achieve unusual localization. We demonstrate that one of the eigenstates in a randomly coupled system can always be arbitrarily molded, regardless of the degree of disorder, by adjusting the self-energy of each element. Ordered waves with the desired form are then achieved in randomly coupled systems, including plane waves and globally collective resonances. We also devise counterintuitive functionalities in disordered systems, such as “small-world–like” transport from non–Anderson-type localization, phase-conserving disorder, and phase-controlled beam steering. PMID:27757414
Stretchable Random Lasers with Tunable Coherent Loops.
Sun, Tzu-Min; Wang, Cih-Su; Liao, Chi-Shiun; Lin, Shih-Yao; Perumal, Packiyaraj; Chiang, Chia-Wei; Chen, Yang-Fang
2015-12-22
Stretchability represents a key feature for the emerging world of realistic applications in areas including wearable gadgets, health monitors, and robotic skins. Many optical and electronic technologies that can respond to large strain deformations have been developed. Lasers have played a very important role in daily life since their invention, and laser sources are highly desirable for the development of stretchable devices. Herein, stretchable random lasers with tunable coherent loops are designed, fabricated, and demonstrated. To illustrate our working principle, the stretchable random laser is made possible by transferring unique ZnO nanobrushes on top of a polydimethylsiloxane (PDMS) elastomer substrate. In place of the traditional gain material of ZnO nanorods, ZnO nanobrushes were used as the optical gain material, since they can also serve as scattering centers and provide Fabry-Perot cavities to enhance laser action. The stretchable PDMS substrate gives the degree of freedom to mechanically tune the coherent loops of the random laser action by changing the density of ZnO nanobrushes. It is found that the number of laser modes increases with increasing external strain applied on the PDMS substrate due to the enhanced possibility for the formation of coherent loops. The device can be stretched by up to 30% strain and subjected to more than 100 cycles without loss in laser action. The result shows a major advance for the further development of man-made smart stretchable devices.
Component evolution in general random intersection graphs
Bradonjic, Milan; Hagberg, Aric; Hengartner, Nick; Percus, Allon G
2010-01-01
We analyze component evolution in general random intersection graphs (RIGs) and give conditions on existence and uniqueness of the giant component. Our techniques generalize the existing methods for analysis of component evolution in RIGs. That is, we analyze survival and extinction properties of a dependent, inhomogeneous Galton-Watson branching process on general RIGs. Our analysis relies on bounding the branching processes and inherits the fundamental concepts from the study of component evolution in Erdos-Renyi graphs. The main challenge comes from the underlying structure of RIGs, since the number of offspring follows a binomial distribution with a different number of nodes and a different rate at each step during the evolution. RIGs can be interpreted as a model for large randomly formed non-metric data sets. Besides the mathematical analysis of component evolution, which we provide in this work, we perceive RIGs as an important random structure which has already found applications in social networks, epidemic networks, blog readership, and wireless sensor networks.
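A small sketch of the uniform RIG model G(n, m, p) and its component structure: each node independently holds each of m attributes with probability p, and two nodes are adjacent iff they share an attribute. Parameter values below are illustrative only:

```python
import random
from itertools import combinations

def random_intersection_graph(n, m, p, seed=0):
    """G(n, m, p): node u holds attribute a with probability p,
    independently; u ~ v iff their attribute sets intersect."""
    rng = random.Random(seed)
    attrs = [{a for a in range(m) if rng.random() < p} for _ in range(n)]
    return {(u, v) for u, v in combinations(range(n), 2) if attrs[u] & attrs[v]}

def components(n, edges):
    """Connected components by iterative depth-first search."""
    adj = {u: set() for u in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, comps = set(), []
    for s in range(n):
        if s in seen:
            continue
        stack, comp = [s], {s}
        seen.add(s)
        while stack:
            u = stack.pop()
            for w in adj[u] - seen:
                seen.add(w)
                comp.add(w)
                stack.append(w)
        comps.append(comp)
    return comps

edges = random_intersection_graph(200, 50, 0.05, seed=3)
comps = components(200, edges)
giant = max(comps, key=len)
print(len(giant), len(comps))
```

With m·p² well above 1/n, as here, a giant component emerges, consistent with the branching-process picture described in the abstract.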
Adaptive Random Testing with Combinatorial Input Domain
Lu, Yansheng
2014-01-01
Random testing (RT) is a fundamental testing technique to assess software reliability, which simply selects test cases in a random manner from the whole input domain. As an enhancement of RT, adaptive random testing (ART) has better failure-detection capability and has been widely applied in different scenarios, such as numerical programs, some object-oriented programs, and mobile applications. However, not much work has been done on the effectiveness of ART for programs with combinatorial input domains (i.e., sets of categorical data). To extend these ideas to testing combinatorial input domains, we have adopted different similarity measures that are widely used for categorical data in data mining, and have proposed two similarity measures based on interaction coverage. We then propose a new version, named ART-CID, as an extension of ART to combinatorial input domains, which selects as the next test case the element of categorical data with the lowest similarity to the already generated test cases. Experimental results show that ART-CID generally performs better than RT, with respect to different evaluation metrics. PMID:24772036
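The core ART selection step (pick the candidate least similar to the already executed tests) can be sketched as follows. The concrete measure here is a simple Hamming-style similarity for categorical tuples, one plausible choice among the several the paper compares; the test-case domain is invented for illustration:

```python
import random

def hamming_similarity(a, b):
    """Fraction of positions where two categorical test cases agree."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def art_pick(executed, candidates):
    """ART-style choice: the candidate whose worst-case similarity to
    any executed test case is lowest (i.e. the most 'spread out' one)."""
    return min(candidates,
               key=lambda c: max(hamming_similarity(c, e) for e in executed))

rng = random.Random(7)
domain = [("red", "green", "blue"), ("small", "large"), ("on", "off")]

def rand_case():
    return tuple(rng.choice(vals) for vals in domain)

executed = [rand_case()]
for _ in range(5):
    executed.append(art_pick(executed, [rand_case() for _ in range(10)]))
print(executed)
```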
Electrokinetic transport in microchannels with random roughness
Wang, Moran; Kang, Qinjun
2008-01-01
We present a numerical framework to model the electrokinetic transport in microchannels with random roughness. The three-dimensional microstructure of the rough channel is generated by a random generation-growth method with three statistical parameters to control the number density, the total volume fraction, and the anisotropy characteristics of roughness elements. The governing equations for the electrokinetic transport are solved by a high-efficiency lattice Poisson-Boltzmann method in complex geometries. The effects of the geometric characteristics of roughness on the electrokinetic transport in microchannels are therefore modeled and analyzed. For a given total roughness volume fraction, a higher number density leads to a lower fluctuation because of the random factors. The electroosmotic flow rate increases with the roughness number density nearly logarithmically for a given volume fraction of roughness but decreases with the volume fraction for a given roughness number density. When both the volume fraction and the number density of roughness are given, the electroosmotic flow rate is enhanced by the increase of the characteristic length along the external electric field direction but is reduced by that in the direction across the channel. For a given microstructure of the rough microchannel, the electroosmotic flow rate decreases with the Debye length. It is found that the shape resistance of roughness is responsible for the flow rate reduction in the rough channel compared to the smooth channel even for very thin double layers, and hence plays an important role in microchannel electroosmotic flows.
Random bits, true and unbiased, from atmospheric turbulence.
Marangon, Davide G; Vallone, Giuseppe; Villoresi, Paolo
2014-06-30
Random numbers are a fundamental ingredient for secure communications and numerical simulation, as well as for games and, more generally, information science. Physical processes with intrinsic unpredictability may be exploited to generate genuine random numbers. Optical propagation through strong atmospheric turbulence is exploited here for this purpose, by observing a laser beam after a 143 km free-space path. In addition, we developed an algorithm to extract the randomness of the beam images at the receiver without post-processing. The numbers passed very selective randomness tests for qualification as genuine random numbers. The extraction algorithm can be easily generalized to random images generated by different physical processes.
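The paper's extractor works directly on beam images without post-processing; as a generic illustration of turning a biased physical bit stream into unbiased output, here is the classic von Neumann extractor (not the authors' algorithm):

```python
import random

def von_neumann_extract(bits):
    """Classic unbiasing: map bit pairs 01 -> 0, 10 -> 1, drop 00/11.

    Works for any independent, identically biased bit source; the output
    bits are unbiased at the cost of discarding most of the input.
    """
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out

rng = random.Random(5)
# A heavily biased physical source: 80% ones.
biased = [1 if rng.random() < 0.8 else 0 for _ in range(40000)]
unbiased = von_neumann_extract(biased)
print(sum(biased) / len(biased), sum(unbiased) / len(unbiased))
```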
Mean first return time for random walks on weighted networks
NASA Astrophysics Data System (ADS)
Jing, Xing-Li; Ling, Xiang; Long, Jiancheng; Shi, Qing; Hu, Mao-Bin
2015-11-01
Random walks on complex networks are of great importance to understand various types of phenomena in real world. In this paper, two types of biased random walks on nonassortative weighted networks are studied: edge-weight-based random walks and node-strength-based random walks, both of which are extended from the normal random walk model. Exact expressions for stationary distribution and mean first return time (MFRT) are derived and examined by simulation. The results will be helpful for understanding the influences of weights on the behavior of random walks.
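For the edge-weight-based walk, the stationary distribution is proportional to node strength (π_u = s_u / 2W, with s_u the sum of incident weights and W the total edge weight); a simulation sketch on a toy weighted graph (graph and names invented for illustration):

```python
import random

def edge_weight_walk(adj, steps, seed=0):
    """Edge-weight-biased walk: from u, move to neighbour v with
    probability w(u, v) / s(u), where s(u) is the strength of u."""
    rng = random.Random(seed)
    visits = {u: 0 for u in adj}
    u = next(iter(adj))
    for _ in range(steps):
        nbrs = list(adj[u])
        weights = [adj[u][v] for v in nbrs]
        u = rng.choices(nbrs, weights=weights)[0]
        visits[u] += 1
    return {v: c / steps for v, c in visits.items()}

# Small symmetric weighted graph; strengths s(u) = sum of incident weights.
adj = {
    "a": {"b": 3.0, "c": 1.0},
    "b": {"a": 3.0, "c": 2.0},
    "c": {"a": 1.0, "b": 2.0},
}
strength = {u: sum(adj[u].values()) for u in adj}
total = sum(strength.values())                      # equals 2W
stationary = {u: strength[u] / total for u in adj}  # s_u / (2W)

freq = edge_weight_walk(adj, 200000, seed=11)
print({u: round(freq[u], 3) for u in adj}, stationary)
```

The empirical visit frequencies converge to the strength-proportional stationary distribution, the behaviour the exact expressions in the paper describe.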
McLay, Robert N; Wood, Dennis P; Webb-Murphy, Jennifer A; Spira, James L; Wiederhold, Mark D; Pyne, Jeffrey M; Wiederhold, Brenda K
2011-04-01
Virtual reality (VR)-based therapy has emerged as a potentially useful means to treat post-traumatic stress disorder (PTSD), but randomized studies have been lacking for Service Members from Iraq or Afghanistan. This study documents a small, randomized, controlled trial of VR-graded exposure therapy (VR-GET) versus treatment as usual (TAU) for PTSD in Active Duty military personnel with combat-related PTSD. Success was gauged according to whether treatment resulted in a 30 percent or greater improvement in PTSD symptom severity as assessed by the Clinician Administered PTSD Scale (CAPS) after 10 weeks of treatment. Seven of 10 participants improved by 30 percent or greater while in VR-GET, whereas only 1 of the 9 returning participants in TAU showed similar improvement. This is a clinically and statistically significant result (χ² = 6.74, p < 0.01, relative risk 3.2). Participants in VR-GET improved an average of 35 points on the CAPS, whereas those in TAU averaged a 9-point improvement (p < 0.05). The results are limited by small size, lack of blinding, a single therapist, and comparison to a relatively uncontrolled usual care condition, but did show VR-GET to be a safe and effective treatment for combat-related PTSD. PMID:21332375
Courneya, Kerry S; Friedenreich, Christine M; Sela, Rami A; Quinney, H Arthur; Rhodes, Ryan E
2002-01-01
In this study, we examined correlates of adherence and contamination in a randomized controlled trial (RCT) of exercise in cancer survivors using the theory of planned behavior and the Five Factor Model of personality (FFM). We randomly assigned cancer survivors in group psychotherapy classes to either a waiting-list control group (n = 45) or a home-based, moderate intensity exercise program (n = 51). At baseline, participants completed measures of the theory of planned behavior, the FFM, past exercise, physical fitness, medical variables, and demographics. We then monitored exercise over a 10-week period by weekly self-reports. Hierarchical multiple regression analyses indicated that the independent predictors of overall RCT exercise across both conditions were past exercise (beta = .36, p < .001), assignment to experimental condition (beta = .34, p < .001), sex (beta = .30, p < .001), and intention (beta = .14, p < .10). For exercise adherence in the exercise condition, the independent predictors were sex (beta = .38, p < .01), extraversion (beta = .30, p < .05), normative beliefs (beta = -.27, p < .05), and perceived behavioral control (beta = .23, p < .10). Finally, the independent predictors of exercise contamination in the control condition were past exercise (beta = .70, p < .001), sex (beta = .20, p < .05), and intention (beta = .17, p < .10). We conclude that the correlates of exercise adherence and contamination differ in kind as well as in degree. Explanations for these findings and practical implications for conducting exercise RCTs in this population are offered.
Compositions, Random Sums and Continued Random Fractions of Poisson and Fractional Poisson Processes
NASA Astrophysics Data System (ADS)
Orsingher, Enzo; Polito, Federico
2012-08-01
In this paper we consider the relation between random sums and compositions of different processes. In particular, for independent Poisson processes N_α(t), N_β(t), t > 0, we have that N_α(N_β(t)) is equal in distribution to Σ_{j=1}^{N_β(t)} X_j, where the X_j are Poisson random variables. We present a series of similar cases, where the outer process is Poisson with different inner processes. We highlight generalisations of these results where the external process is infinitely divisible. A section of the paper concerns compositions of the form N_α(τ_k^ν), ν ∈ (0, 1], where τ_k^ν is the inverse of the fractional Poisson process, and we show how these compositions can be represented as random sums. Furthermore we study compositions of the form Θ(N(t)), t > 0, which can be represented as random products. The last section is devoted to studying continued fractions of Cauchy random variables with a Poisson number of levels. We evaluate the exact distribution and derive the scale parameter in terms of ratios of Fibonacci numbers.
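The basic identity, that the composed process N_α(N_β(t)) has the same distribution as a random sum of N_β(t) i.i.d. Poisson(α) variables (both with mean αβt), is easy to check by simulation; rates and sample sizes below are illustrative:

```python
import math
import random

def poisson(rng, lam):
    """Knuth's multiplication algorithm for a Poisson(lam) sample
    (adequate for the small rates used here)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

rng = random.Random(2)
alpha, beta, t, n = 2.0, 3.0, 1.0, 20000

# Composition: run the inner process, then the outer process up to it.
composed = [poisson(rng, alpha * poisson(rng, beta * t)) for _ in range(n)]

# Random sum: N_beta(t) i.i.d. Poisson(alpha) summands.
random_sum = [sum(poisson(rng, alpha) for _ in range(poisson(rng, beta * t)))
              for _ in range(n)]

def mean(xs):
    return sum(xs) / len(xs)

print(mean(composed), mean(random_sum), alpha * beta * t)
```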
Forest Fires in a Random Forest
NASA Astrophysics Data System (ADS)
Leuenberger, Michael; Kanevski, Mikhaïl; Vega Orozco, Carmen D.
2013-04-01
Forest fires in Canton Ticino (Switzerland) are very complex phenomena. Meteorological data can explain some occurrences of fires in time, but not necessarily in space. Using anthropogenic and geographical feature data with the random forest algorithm, this study tries to highlight the factors that most influence fire ignition and to identify areas at risk. The fundamental scientific problem considered in the present research deals with an application of random forest algorithms for the analysis and modeling of forest fire patterns in a high-dimensional input feature space. This study is focused on the 2,224 anthropogenic forest fires among the 2,401 forest fire ignition points that occurred in Canton Ticino from 1969 to 2008. Provided by the Swiss Federal Institute for Forest, Snow and Landscape Research (WSL), the database characterizes each fire by its location (x,y coordinates of the ignition point), start date, duration, burned area, and other information such as ignition cause and topographic features such as slope, aspect, altitude, etc. In addition, the database VECTOR25 from SwissTopo was used to extract information on the distances between fire ignition points and anthropogenic structures like buildings, the road network, the rail network, etc. Developed by L. Breiman and A. Cutler, the Random Forests (RF) algorithm provides an ensemble of classification and regression trees. By a pseudo-random variable selection at each split node, this method grows a variety of decision trees that do not return the same results, and thus, by a committee system, returns a value with better accuracy than other machine learning methods. The algorithm directly incorporates a measure of variable importance, which is used to identify the factors affecting forest fires. Using this measure, several models can be fitted, and thus a prediction can be made throughout the validity domain of Canton Ticino. Comprehensive RF analysis was carried out in order to 1
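A toy, stdlib-only stand-in for the random-forest idea used in the study: grow many simple trees (here, decision stumps) on bootstrap samples, each choosing its split from a random feature subset, and read off a crude variable importance as how often each feature is selected. This is not Breiman and Cutler's implementation or importance measure, and the 'ignition' data are synthetic:

```python
import random

def stump_error(xs, ys, f, thr):
    """Misclassification rate of the 1-rule: predict True iff x[f] < thr."""
    return sum((x[f] < thr) != y for x, y in zip(xs, ys)) / len(ys)

def forest_importance(xs, ys, n_trees=200, seed=0):
    """Toy forest-style importance: each stump is fit on a bootstrap
    sample, choosing the best split among 2 randomly chosen features;
    importance = fraction of stumps that picked each feature."""
    rng = random.Random(seed)
    n_feat = len(xs[0])
    counts = [0] * n_feat
    for _ in range(n_trees):
        boot = [rng.randrange(len(xs)) for _ in range(len(xs))]
        bx, by = [xs[i] for i in boot], [ys[i] for i in boot]
        feats = rng.sample(range(n_feat), 2)
        _, best_f = min((stump_error(bx, by, f, thr), f)
                        for f in feats for thr in (0.25, 0.5, 0.75))
        counts[best_f] += 1
    return [c / n_trees for c in counts]

# Synthetic 'ignition' data: fires occur close to buildings (feature 0);
# features 1 and 2 are irrelevant noise.
rng = random.Random(1)
xs = [(rng.random(), rng.random(), rng.random()) for _ in range(500)]
ys = [x[0] < 0.4 for x in xs]
imp = forest_importance(xs, ys)
print(imp)
```

The informative distance-to-buildings feature dominates the importance ranking, mirroring how the study uses variable importance to surface the factors driving ignition.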
Deterministic signal associated with a random field.
Kim, Taewoo; Zhu, Ruoyu; Nguyen, Tan H; Zhou, Renjie; Edwards, Chris; Goddard, Lynford L; Popescu, Gabriel
2013-09-01
Stochastic fields do not generally possess a Fourier transform. This makes the second-order statistics calculation very difficult, as it requires solving a fourth-order stochastic wave equation. This problem was alleviated by Wolf who introduced the coherent mode decomposition and, as a result, space-frequency statistics propagation of wide-sense stationary fields. In this paper we show that if, in addition to wide-sense stationarity, the fields are also wide-sense statistically homogeneous, then monochromatic plane waves can be used as an eigenfunction basis for the cross spectral density. Furthermore, the eigenvalue associated with a plane wave, exp[i(k · r-ωt)], is given by the spatiotemporal power spectrum evaluated at the frequency (k, ω). We show that the second-order statistics of these fields is fully described by the spatiotemporal power spectrum, a real, positive function. Thus, the second-order statistics can be efficiently propagated in the wavevector-frequency representation using a new framework of deterministic signals associated with random fields. Analogous to the complex analytic signal representation of a field, the deterministic signal is a mathematical construct meant to simplify calculations. Specifically, the deterministic signal associated with a random field is defined such that it has the identical autocorrelation as the actual random field. Calculations for propagating spatial and temporal correlations are simplified greatly because one only needs to solve a deterministic wave equation of second order. We illustrate the power of the wavevector-frequency representation with calculations of spatial coherence in the far zone of an incoherent source, as well as coherence effects induced by biological tissues.
CT detector evaluation with complex random backgrounds
NASA Astrophysics Data System (ADS)
Fan, Helen; Barrett, Harrison H.
2012-02-01
Modern computed tomography (CT) uses detector arrays consisting of large numbers of photodiodes with scintillator crystals. The number of pixels in the array can play an important role in system performance. Considerable research has been performed on signal detection in flat backgrounds under various conditions, but little has been done with complex, random backgrounds in CT; our work investigates in particular the effect of the number of detector elements on signal detection by a channelized Hotelling observer in a complex background. For this project, a simulated three-dimensional phantom is generated with its attenuation equal to that of water. The phantom contains a smaller central section with random variations to simulate random anatomical structures. Cone-beam projections of the phantom are acquired at different angles and used to calculate the covariance matrix of the raw projection data. Laguerre-Gauss channels are used to reduce the dimensionality of each 2D projection and hence the size of the covariance matrix, but the covariance is still a function of two projection angles. A strong cross-channel correlation is observed as a function of the difference between the angles. A signal with known location and size is used, and the performance of the observer is calculated from the channel outputs at multiple projection angles. A contrast-detail diagram is computed for different variables such as signal size, number of incident x-ray photons, pixel size, etc. At a fixed observer signal-to-noise ratio (SNR), the contrast required to detect a signal increases dramatically as the signal size decreases.
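The channelized Hotelling observer computes its template as the channel-space covariance inverse applied to the mean signal response, and its SNR follows from the template outputs. This sketch uses a made-up channel covariance and signal response (the channel count and all numbers are placeholders, not the paper's Laguerre-Gauss values):

```python
import numpy as np

rng = np.random.default_rng(2)
n_ch, n_samples = 5, 5000  # placeholder channel count and sample size

# Hypothetical channel-output covariance induced by the random background
# (diagonal 1.3, off-diagonal 0.3: correlated channels)
K_true = np.eye(n_ch) + 0.3
v0 = rng.multivariate_normal(np.zeros(n_ch), K_true, n_samples)  # signal absent

# Mean channel response to a known signal (illustrative values)
delta = np.linspace(0.5, 0.1, n_ch)
v1 = v0 + delta  # signal present, same background realizations

# Channelized Hotelling observer: template w = K^{-1} (mean difference)
K = np.cov(v0, rowvar=False)
w = np.linalg.solve(K, delta)

t0, t1 = v0 @ w, v1 @ w
snr = (t1.mean() - t0.mean()) / np.sqrt(0.5 * (t0.var() + t1.var()))
snr_theory = np.sqrt(delta @ np.linalg.solve(K, delta))
```

The empirical SNR of the template outputs matches the closed-form Hotelling value sqrt(delta' K^-1 delta), which is what a contrast-detail study holds fixed while varying signal size.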
Quantum key distribution protocol using random bases
NASA Astrophysics Data System (ADS)
Meslouhi, A.; Amellal, H.; Hassouni, Y.; El Baz, M.; El Allati, A.
2016-04-01
In order to enhance quantum key distribution (QKD) security, a new protocol, “QKDPRB”, based on random bases is proposed. It consists of using standard encoding bases that move circularly with a variable rotational angle α depending on an angular velocity ω(t); the traditional bases thus become relative ones. To prove the security and efficiency of the protocol, we present a universal demonstration showing that it maintains a high level of security, even in the presence of an intercept-and-resend attack. Finally, the QKDPRB may improve the security of QKD.
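A rotating-basis scheme of this kind can be sketched as a BB84-style exchange in which both parties offset their bases by a shared, time-varying angle α(t) = ωt. Everything below (the Malus-law measurement model, the value of ω, the sifting step) is an illustrative assumption, not the paper's exact protocol:

```python
import numpy as np

rng = np.random.default_rng(3)
n, omega = 2000, 0.1  # omega: assumed angular velocity of the shared rotation

key_a, key_b = [], []
for t in range(n):
    alpha = omega * t  # rotation angle alpha(t): the bases "move circularly"
    a_basis, a_bit = rng.integers(2), rng.integers(2)
    b_basis = rng.integers(2)

    # Alice prepares a polarization state in the rotated basis
    theta_prep = alpha + a_basis * np.pi / 4 + a_bit * np.pi / 2
    # Bob measures in his (also rotated) basis choice
    theta_meas = alpha + b_basis * np.pi / 4

    # Malus-law model: outcome 0 with probability cos^2(angle difference)
    outcome = 0 if rng.random() < np.cos(theta_prep - theta_meas) ** 2 else 1

    if a_basis == b_basis:  # sifting: keep rounds with matching basis choices
        key_a.append(a_bit)
        key_b.append(outcome)

qber = float(np.mean(np.array(key_a) != np.array(key_b)))
```

Because both parties track α(t), the sifted key is error-free; an interceptor who does not know α(t) measures against a drifting reference and necessarily introduces detectable errors.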
Statistical regimes of random laser fluctuations
Lepri, Stefano; Cavalieri, Stefano; Oppo, Gian-Luca; Wiersma, Diederik S.
2007-06-15
Statistical fluctuations of the light emitted from amplifying random media are studied theoretically and numerically. The characteristic scales of the diffusive motion of light lead to Gaussian or power-law (Levy) distributed fluctuations, depending on external control parameters. In the Levy regime, the output pulse is highly irregular, leading to huge deviations from a mean-field description. Monte Carlo simulations of a simplified model which includes the population of the medium demonstrate the two statistical regimes and provide a comparison with dynamical rate equations. The different statistics of the fluctuations help to explain recent experimental observations reported in the literature.
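The crossover between the two regimes can be demonstrated with a toy Monte Carlo model: exponentially distributed path lengths amplified by exp(gain x length) have a power-law tail with exponent 1/(gain x mean length), so large gain pushes pulse-energy sums out of the central-limit (Gaussian) regime into the Levy regime. This simplified model is an assumption for illustration, not the paper's simulation:

```python
import numpy as np

rng = np.random.default_rng(4)
n_paths, n_pulses, mean_len = 500, 2000, 1.0

def pulse_energies(gain):
    # Each photon path spends an exponentially distributed length L in the
    # amplifying medium and gains a factor exp(gain * L); one pulse sums
    # many such paths. exp(gain * L) has a power-law tail with exponent
    # 1 / (gain * mean_len).
    L = rng.exponential(mean_len, size=(n_pulses, n_paths))
    return np.exp(gain * L).sum(axis=1)

low_gain = pulse_energies(0.1)   # tail exponent 10: finite variance, Gaussian regime
high_gain = pulse_energies(0.9)  # tail exponent ~1.1: Levy regime, rare huge events

rel_fluct_low = low_gain.std() / low_gain.mean()
rel_fluct_high = high_gain.std() / high_gain.mean()
```

The relative pulse-to-pulse fluctuations are tiny in the low-gain regime and orders of magnitude larger in the high-gain regime, where a single long path can dominate an entire pulse.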
Random diffusion model with structure corrections
NASA Astrophysics Data System (ADS)
McCowan, David D.; Mazenko, Gene F.
2010-05-01
The random diffusion model is a continuum model for a conserved scalar density field ϕ driven by diffusive dynamics where the bare diffusion coefficient is density dependent. We generalize the model from one with a sharp wave-number cutoff to one with a more natural large wave-number cutoff. We investigate whether the features seen previously—namely, a slowing down of the system and the development of a prepeak in the dynamic structure factor at a wave number below the first structure peak—survive in this model. A method for extracting information about a hidden prepeak in experimental data is presented.
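The defining feature of the model, conserved diffusive dynamics with a density-dependent bare diffusion coefficient, can be sketched with a flux-form finite-difference update on a periodic grid. The specific form of D(phi) and all parameters below are illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(5)
n, dx, dt, steps = 128, 1.0, 0.1, 500

# Conserved scalar density field with small random initial fluctuations
phi = 1.0 + 0.1 * rng.standard_normal(n)
mass0 = phi.sum()

def D(p):
    # Hypothetical density-dependent bare diffusion coefficient
    return 0.5 + 0.5 * p

for _ in range(steps):
    # Flux form of d(phi)/dt = d/dx [ D(phi) d(phi)/dx ] on a periodic grid;
    # updating through interface fluxes conserves the total density exactly
    phi_mid = 0.5 * (phi + np.roll(phi, -1))            # density at interfaces
    flux = -D(phi_mid) * (np.roll(phi, -1) - phi) / dx  # J at i+1/2
    phi = phi - dt * (flux - np.roll(flux, 1)) / dx
```

The total density is conserved to machine precision while the fluctuations relax; studying how that relaxation slows down, and how structure appears in the dynamic structure factor, is the subject of the abstract above.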
Performance of random multiple access transmission system
NASA Technical Reports Server (NTRS)
Phinainitisart, N.; Wu, W. W.
1990-01-01
The performance of the Random Multiple Access (RMA) technique, applied to a direct terminal-to-terminal link with a large number of potential users, is determined. The average signal-to-noise ratio (SNR) is derived. Under a Gaussian assumption, an approximation of the probability of error is given. The analysis shows that the system performance is affected by the sequence length, the number of simultaneous users, and the number of cochannel symbols, but is not sensitive to the thermal noise. The performance when using very small aperture antennas for both transmitting and receiving, without a hub station, is given.
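The qualitative dependence on sequence length and user count can be illustrated with the standard Gaussian approximation for a spread-spectrum multiple-access link, where interference from the other users is treated as extra Gaussian noise. This textbook form is an assumption for illustration; the paper's exact SNR derivation may differ:

```python
import math

def q_function(x):
    # Gaussian tail probability Q(x) = P(N(0,1) > x)
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def average_snr(seq_len, users, ebn0):
    # Standard Gaussian approximation (illustrative, not the paper's exact
    # expression): interference from (users - 1) cochannel signals acts as
    # additional Gaussian noise, suppressed by the sequence length.
    return 1.0 / ((users - 1) / (3.0 * seq_len) + 1.0 / (2.0 * ebn0))

def prob_error(seq_len, users, ebn0=10.0):
    # Approximate bit error probability under the Gaussian assumption
    return q_function(math.sqrt(average_snr(seq_len, users, ebn0)))
```

Under this approximation the error rate improves with longer sequences and degrades with more simultaneous users, and once interference dominates, the thermal-noise term matters little, matching the qualitative findings stated above.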
Random Genetic Drift and Gamete Frequency
Mano, Shuhei
2005-01-01
An analytic expression of conditional expectation of transient gamete frequency, given that one of the two loci remains polymorphic, is obtained in terms of the diffusion process by calculating the moments of the distribution. Using this expression, a model where linkage disequilibrium is introduced by a single mutation is considered. The conditional expectation of the gamete frequency given that the locus with the mutant allele remains polymorphic is presented. The behavior is significantly different from the monotonic decrease observed in the deterministic model without random genetic drift. PMID:16371518
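The setting, a single mutation creating linkage disequilibrium, then drift and recombination, with expectations taken conditional on continued polymorphism, can be illustrated by direct two-locus Wright-Fisher simulation. The population size, recombination rate, and run length below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(6)
N, r, gens, reps = 100, 0.01, 50, 1000  # illustrative parameters

def simulate():
    # Gamete order: AB, Ab, aB, ab. A single mutant allele A arises on a
    # B-bearing chromosome, creating complete initial linkage disequilibrium.
    f = np.array([1.0 / (2 * N), 0.0, 0.5 - 1.0 / (2 * N), 0.5])
    for _ in range(gens):
        D = f[0] * f[3] - f[1] * f[2]                    # linkage disequilibrium
        exp_f = f + r * D * np.array([-1.0, 1.0, 1.0, -1.0])  # recombination
        f = rng.multinomial(2 * N, exp_f) / (2.0 * N)    # random genetic drift
    return f

final = np.array([simulate() for _ in range(reps)])
freq_A = final[:, 0] + final[:, 1]            # mutant-allele frequency
poly = (freq_A > 0) & (freq_A < 1)            # mutant locus still polymorphic

# Conditional expectation of the AB gamete frequency, given polymorphism
cond_gamete = final[poly, 0].mean()
```

Conditioning on the mutant locus remaining polymorphic weights the surviving trajectories upward, which is why the conditional expectation behaves differently from the monotonic decay of the deterministic model.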
Livermore Random I/O Testbench
2012-09-01
LRIOT is a test bench framework that is designed to generate sophisticated I/O rates that can stress high-performance memory and storage systems, such as non-volatile random access memories (NVRAM) and storage class memory. Furthermore, LRIOT provides the capabilities to mix multiple types of concurrency, namely threading and task parallelism, as well as distributed execution using Message Passing Interface (MPI) libraries. It will be used by algorithm designers to generate access patterns that mimic their application's behavior, and by system designers to test high-performance NVRAM storage.
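The core idea, concurrent workers issuing reproducible random-offset I/O against shared storage, can be sketched with threads in a few lines. The parameters and structure below are hypothetical and do not reflect LRIOT's actual interface:

```python
import os
import random
import tempfile
import threading

# Hypothetical workload parameters: a small file read at random block
# offsets from several concurrent threads
BLOCK, NBLOCKS, NTHREADS, OPS = 4096, 256, 4, 100

tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.write(os.urandom(BLOCK * NBLOCKS))
tmp.close()

completed = [0] * NTHREADS

def worker(tid):
    rng = random.Random(tid)  # per-thread seed makes the pattern reproducible
    with open(tmp.name, "rb") as f:
        for _ in range(OPS):
            f.seek(rng.randrange(NBLOCKS) * BLOCK)  # random-offset access
            if len(f.read(BLOCK)) == BLOCK:
                completed[tid] += 1

threads = [threading.Thread(target=worker, args=(i,)) for i in range(NTHREADS)]
for th in threads:
    th.start()
for th in threads:
    th.join()
os.unlink(tmp.name)
```

A full test bench would additionally time each operation, mix reads and writes, and scale out across nodes (e.g., with MPI), but the seeded per-thread generators already give the reproducible access patterns the abstract describes.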