Science.gov

Sample records for selective a1a-blocker randomized

  1. Blocked randomization with randomly selected block sizes.

    PubMed

    Efird, Jimmy

    2011-01-01

    When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes. PMID:21318011
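
    A minimal sketch of the technique in Python (the two-arm labels, the block-size menu of 4, 6, and 8, and the seed are illustrative choices, not values prescribed by the paper):

      import random

      def blocked_randomization(n_participants, arms=("treatment", "control"),
                                block_sizes=(4, 6, 8), seed=None):
          """Allocation sequence with randomly selected block sizes.

          Block sizes should be multiples of the number of arms so that
          each block balances the arms exactly.
          """
          rng = random.Random(seed)
          sequence = []
          while len(sequence) < n_participants:
              size = rng.choice(block_sizes)   # random block size defeats prediction
              block = list(arms) * (size // len(arms))
              rng.shuffle(block)               # balanced within the block
              sequence.extend(block)
          return sequence[:n_participants]

      print(blocked_randomization(10, seed=42))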

  2. Random Selection for Drug Screening

    SciTech Connect

    Center for Human Reliability Studies

    2007-05-01

    Simple random sampling is generally the starting point for a random sampling process. This sampling technique ensures that each individual within a group (population) has an equal chance of being selected. There are a variety of ways to implement random sampling in a practical situation.
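
    A one-line illustration in Python; the pool of 200 names and the draw of 10 are invented for the example:

      import random

      population = [f"employee_{i:03d}" for i in range(200)]  # hypothetical pool
      selected = random.sample(population, k=10)  # every member equally likely
      print(selected)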

  3. Random Selection for Drug Screening

    SciTech Connect

    Center for Human Reliability Studies

    2007-05-01

Sampling is the process of choosing some members out of a group or population. Probability sampling, or random sampling, is the process of selecting members by chance with a known probability of each individual being chosen.

  4. Randomized selection on the GPU

    SciTech Connect

    Monroe, Laura Marie; Wendelberger, Joanne R; Michalak, Sarah E

    2011-01-13

We implement here a fast and memory-sparing probabilistic top-N selection algorithm on the GPU. To our knowledge, this is the first direct selection algorithm in the literature for the GPU. The algorithm proceeds via a probabilistic guess-and-check process searching for the Nth element. It always gives a correct result and always terminates. The use of randomization reduces the amount of data that needs heavy processing, and so reduces the average time required for the algorithm. Probabilistic Las Vegas algorithms of this kind are a form of stochastic optimization and can be well suited to more general parallel processors with limited amounts of fast memory.
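
    A serial Python analogue of the guess-and-check idea (a sketch of the Las Vegas property only: the answer is always exact while the running time is random; this is not the paper's GPU implementation):

      import random

      def randomized_select(data, n):
          """Return the n-th smallest element (1-indexed) via random pivoting."""
          assert 1 <= n <= len(data)
          items = list(data)
          while True:
              pivot = random.choice(items)            # probabilistic "guess"
              lows = [x for x in items if x < pivot]
              highs = [x for x in items if x > pivot]
              pivots = len(items) - len(lows) - len(highs)
              if n <= len(lows):                      # "check", then narrow the search
                  items = lows
              elif n <= len(lows) + pivots:
                  return pivot
              else:
                  n -= len(lows) + pivots
                  items = highs

      print(randomized_select([9, 1, 7, 3, 5], 2))  # -> 3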

  5. Random selection as a confidence building tool

    SciTech Connect

    Macarthur, Duncan W; Hauck, Danielle; Langner, Diana; Thron, Jonathan; Smith, Morag; Williams, Richard

    2010-01-01

Any verification measurement performed on potentially classified nuclear material must satisfy two seemingly contradictory constraints. First and foremost, no classified information can be released. At the same time, the monitoring party must have confidence in the veracity of the measurement. The first concern can be addressed by performing the measurements within the host facility using instruments under the host's control. Because the data output in this measurement scenario is also under host control, it is difficult for the monitoring party to have confidence in that data. One technique for addressing this difficulty is random selection. The concept of random selection can be thought of as four steps: (1) The host presents several 'identical' copies of a component or system to the monitor. (2) One (or more) of these copies is randomly chosen by the monitors for use in the measurement system. (3) Similarly, one or more is randomly chosen to be validated further at a later date in a monitor-controlled facility. (4) Because the two components or systems are identical, validation of the 'validation copy' is equivalent to validation of the measurement system. This procedure sounds straightforward, but effective application may be quite difficult. Although random selection is often viewed as a panacea for confidence building, the amount of confidence generated depends on the monitor's continuity of knowledge for both validation and measurement systems. In this presentation, we will discuss the random selection technique, as well as where and how this technique might be applied to generate maximum confidence. In addition, we will discuss the role of modular measurement-system design in facilitating random selection and describe a simple modular measurement system incorporating six small ³He neutron detectors and a single high-purity germanium gamma detector.

  6. Species selection and random drift in macroevolution.

    PubMed

    Chevin, Luis-Miguel

    2016-03-01

    Species selection resulting from trait-dependent speciation and extinction is increasingly recognized as an important mechanism of phenotypic macroevolution. However, the recent bloom in statistical methods quantifying this process faces a scarcity of dynamical theory for their interpretation, notably regarding the relative contributions of deterministic versus stochastic evolutionary forces. I use simple diffusion approximations of birth-death processes to investigate how the expected and random components of macroevolutionary change depend on phenotype-dependent speciation and extinction rates, as can be estimated empirically. I show that the species selection coefficient for a binary trait, and selection differential for a quantitative trait, depend not only on differences in net diversification rates (speciation minus extinction), but also on differences in species turnover rates (speciation plus extinction), especially in small clades. The randomness in speciation and extinction events also produces a species-level equivalent to random genetic drift, which is stronger for higher turnover rates. I then show how microevolutionary processes including mutation, organismic selection, and random genetic drift cause state transitions at the species level, allowing comparison of evolutionary forces across levels. A key parameter that would be needed to apply this theory is the distribution and rate of origination of new optimum phenotypes along a phylogeny. PMID:26880617
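
    For orientation, the textbook diffusion approximation of a birth-death process with speciation rate \lambda and extinction rate \mu already shows why net diversification governs the expected change while turnover governs the random component (a standard sketch, not the paper's exact equations):

      \mathbb{E}[\mathrm{d}N \mid N] = (\lambda - \mu)\, N \,\mathrm{d}t   % drift: net diversification
      \operatorname{Var}[\mathrm{d}N \mid N] = (\lambda + \mu)\, N \,\mathrm{d}t   % noise: turnover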

  7. Randomness in post-selected events

    NASA Astrophysics Data System (ADS)

    Phuc Thinh, Le; de la Torre, Gonzalo; Bancal, Jean-Daniel; Pironio, Stefano; Scarani, Valerio

    2016-03-01

    Bell inequality violations can be used to certify private randomness for use in cryptographic applications. In photonic Bell experiments, a large amount of the data that is generated comes from no-detection events and presumably contains little randomness. This raises the question as to whether randomness can be extracted only from the smaller post-selected subset corresponding to proper detection events, instead of from the entire set of data. This could in principle be feasible without opening an analogue of the detection loophole as long as the min-entropy of the post-selected data is evaluated by taking all the information into account, including no-detection events. The possibility of extracting randomness from a short string has a practical advantage, because it reduces the computational time of the extraction. Here, we investigate the above idea in a simple scenario, where the devices and the adversary behave according to i.i.d. strategies. We show that indeed almost all the randomness is present in the pair of outcomes for which at least one detection happened. We further show that in some cases applying a pre-processing on the data can capture features that an analysis based on global frequencies only misses, thus resulting in the certification of more randomness. We then briefly consider non-i.i.d strategies and provide an explicit example of such a strategy that is more powerful than any i.i.d. one even in the asymptotic limit of infinitely many measurement rounds, something that was not reported before in the context of Bell inequalities.
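
    The certified randomness per round is measured by the min-entropy, H_min = -log2 max_x p(x); a toy Python illustration with an invented outcome distribution dominated by no-detection events:

      import math

      p = {"no_detection": 0.90, "(+,+)": 0.04, "(+,-)": 0.03, "(-,+)": 0.03}
      h_min = -math.log2(max(p.values()))  # min-entropy of a single round
      print(round(h_min, 3))               # ~0.152 bits: little randomness per round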

  8. 32 CFR 1624.1 - Random selection procedures for induction.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Random selection procedures for induction. 1624... SYSTEM INDUCTIONS § 1624.1 Random selection procedures for induction. (a) The Director of Selective Service shall from time to time establish a random selection sequence for induction by a drawing to...

  9. 47 CFR 1.1603 - Conduct of random selection.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Conduct of random selection. 1.1603 Section 1.1603 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1603 Conduct of random selection....

  10. 47 CFR 1.1603 - Conduct of random selection.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Conduct of random selection. 1.1603 Section 1.1603 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1603...

  11. 47 CFR 1.1602 - Designation for random selection.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Designation for random selection. 1.1602 Section 1.1602 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures §...

  12. 47 CFR 1.1603 - Conduct of random selection.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Conduct of random selection. 1.1603 Section 1.1603 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1603 Conduct of random selection....

  13. 47 CFR 1.1602 - Designation for random selection.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Designation for random selection. 1.1602 Section 1.1602 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1602 Designation for random...

  14. 47 CFR 1.1602 - Designation for random selection.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Designation for random selection. 1.1602 Section 1.1602 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1602 Designation for random...

  15. Inference for blocked randomization under a selection bias model.

    PubMed

    Kennes, Lieven N; Rosenberger, William F; Hilgers, Ralf-Dieter

    2015-12-01

    We provide an asymptotic test to analyze randomized clinical trials that may be subject to selection bias. For normally distributed responses, and under permuted block randomization, we derive a likelihood ratio test of the treatment effect under a selection bias model. A likelihood ratio test of the presence of selection bias arises from the same formulation. We prove that the test is asymptotically chi-square on one degree of freedom. These results correlate well with the likelihood ratio test of Ivanova et al. (2005, Statistics in Medicine 24, 1537-1546) for binary responses, for which they established by simulation that the asymptotic distribution is chi-square. Simulations also show that the test is robust to departures from normality and under another randomization procedure. We illustrate the test by reanalyzing a clinical trial on retinal detachment. PMID:26099068
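
    Given the asymptotic chi-square distribution on one degree of freedom, the p-value of the test is a one-liner; a sketch in which the likelihood-ratio statistic 5.3 is invented for illustration:

      from scipy.stats import chi2

      # 2 * (log-likelihood of full model - log-likelihood of null model)
      lrt_statistic = 5.3
      p_value = chi2.sf(lrt_statistic, df=1)  # upper-tail probability
      print(f"p = {p_value:.4f}")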

  16. Intra-cluster correlation selection for cluster randomized trials.

    PubMed

    Westgate, Philip M

    2016-08-30

In this paper, we focus on cluster randomized trials, also known as group randomized trials, which randomize clusters, or groups, of subjects to different trial arms, such as intervention or control. Outcomes from subjects within the same cluster tend to exhibit an exchangeable correlation measured by the intra-cluster correlation coefficient (ICC). Our primary interest is to test whether the intervention has an impact on the marginal mean of an outcome. Using recently developed methods, we propose how to select a working ICC structure with the goal of choosing the structure that results in the smallest standard errors for regression parameter estimates and thus the greatest power for this test. Specifically, we utilize small-sample corrections for the estimation of the covariance matrix of regression parameter estimates. This matrix is incorporated within correlation selection criteria proposed in the generalized estimating equations literature to choose one of multiple working ICC structures under consideration. We demonstrate the potential power and utility of this approach in cluster randomized trial settings via a simulation study and an application example, and we discuss practical considerations for its use. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26924419
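
    For reference, a minimal one-way ANOVA estimator of the ICC in Python (equal cluster sizes are assumed for brevity, and the cluster count, size, and effect spread below are invented):

      import numpy as np

      def anova_icc(clusters):
          """ICC from one-way ANOVA; `clusters` is a list of equal-size 1-D arrays."""
          m, k = len(clusters[0]), len(clusters)
          means = np.array([c.mean() for c in clusters])
          grand = np.concatenate(clusters).mean()
          msb = m * np.sum((means - grand) ** 2) / (k - 1)
          msw = sum(np.sum((c - c.mean()) ** 2) for c in clusters) / (k * (m - 1))
          return (msb - msw) / (msb + (m - 1) * msw)

      rng = np.random.default_rng(0)
      data = [rng.normal(loc=u, size=20) for u in rng.normal(0, 0.5, 12)]
      print(round(anova_icc(data), 3))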

  17. Alternative Modal Basis Selection Procedures for Nonlinear Random Response Simulation

    NASA Technical Reports Server (NTRS)

    Przekop, Adam; Guo, Xinyun; Rizzi, Stephen A.

    2010-01-01

    Three procedures to guide selection of an efficient modal basis in a nonlinear random response analysis are examined. One method is based only on proper orthogonal decomposition, while the other two additionally involve smooth orthogonal decomposition. Acoustic random response problems are employed to assess the performance of the three modal basis selection approaches. A thermally post-buckled beam exhibiting snap-through behavior, a shallowly curved arch in the auto-parametric response regime and a plate structure are used as numerical test articles. The results of the three reduced-order analyses are compared with the results of the computationally taxing simulation in the physical degrees of freedom. For the cases considered, all three methods are shown to produce modal bases resulting in accurate and computationally efficient reduced-order nonlinear simulations.
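
    A generic sketch of the proper-orthogonal-decomposition ingredient via the SVD of a snapshot matrix (the 0.999 energy cutoff and the toy snapshot sizes are assumptions; the smooth orthogonal decomposition variants are not shown):

      import numpy as np

      def pod_basis(snapshots, energy=0.999):
          """Left singular vectors capturing `energy` of the signal; dofs x time input."""
          U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
          cumulative = np.cumsum(s ** 2) / np.sum(s ** 2)
          k = int(np.searchsorted(cumulative, energy)) + 1
          return U[:, :k]

      rng = np.random.default_rng(1)
      signal = rng.normal(size=(500, 5)) @ rng.normal(size=(5, 200))  # 5 true modes
      snapshots = signal + 0.01 * rng.normal(size=(500, 200))
      print(pod_basis(snapshots).shape)  # expected: (500, 5)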

  18. Fixation probabilities of random mutants under frequency dependent selection.

    PubMed

    Huang, Weini; Traulsen, Arne

    2010-03-21

    Evolutionary game dynamics describes frequency dependent selection in asexual, haploid populations. It typically considers predefined strategies and fixed payoff matrices. Mutations occur between these known types only. Here, we consider a situation in which a mutation has produced an entirely new type which is characterized by a random payoff matrix that does not change during the fixation or extinction of the mutant. Based on the probability distribution underlying the payoff values, we address the fixation probability of the new mutant. It turns out that for weak selection, only the first moments of the distribution matter. For strong selection, the probability that a new payoff entry is larger than the wild type's payoff against itself is the crucial quantity. PMID:19995564

  19. Selective randomized load balancing and mesh networks with changing demands

    NASA Astrophysics Data System (ADS)

    Shepherd, F. B.; Winzer, P. J.

    2006-05-01

    We consider the problem of building cost-effective networks that are robust to dynamic changes in demand patterns. We compare several architectures using demand-oblivious routing strategies. Traditional approaches include single-hop architectures based on a (static or dynamic) circuit-switched core infrastructure and multihop (packet-switched) architectures based on point-to-point circuits in the core. To address demand uncertainty, we seek minimum cost networks that can carry the class of hose demand matrices. Apart from shortest-path routing, Valiant's randomized load balancing (RLB), and virtual private network (VPN) tree routing, we propose a third, highly attractive approach: selective randomized load balancing (SRLB). This is a blend of dual-hop hub routing and randomized load balancing that combines the advantages of both architectures in terms of network cost, delay, and delay jitter. In particular, we give empirical analyses for the cost (in terms of transport and switching equipment) for the discussed architectures, based on three representative carrier networks. Of these three networks, SRLB maintains the resilience properties of RLB while achieving significant cost reduction over all other architectures, including RLB and multihop Internet protocol/multiprotocol label switching (IP/MPLS) networks using VPN-tree routing.

  20. Hierarchy and extremes in selections from pools of randomized proteins.

    PubMed

    Boyer, Sébastien; Biswas, Dipanwita; Kumar Soshee, Ananda; Scaramozzino, Natale; Nizak, Clément; Rivoire, Olivier

    2016-03-29

    Variation and selection are the core principles of Darwinian evolution, but quantitatively relating the diversity of a population to its capacity to respond to selection is challenging. Here, we examine this problem at a molecular level in the context of populations of partially randomized proteins selected for binding to well-defined targets. We built several minimal protein libraries, screened them in vitro by phage display, and analyzed their response to selection by high-throughput sequencing. A statistical analysis of the results reveals two main findings. First, libraries with the same sequence diversity but built around different "frameworks" typically have vastly different responses; second, the distribution of responses of the best binders in a library follows a simple scaling law. We show how an elementary probabilistic model based on extreme value theory rationalizes the latter finding. Our results have implications for designing synthetic protein libraries, estimating the density of functional biomolecules in sequence space, characterizing diversity in natural populations, and experimentally investigating evolvability (i.e., the potential for future evolution). PMID:26969726

  1. Materials selection for oxide-based resistive random access memories

    SciTech Connect

    Guo, Yuzheng; Robertson, John

    2014-12-01

The energies of atomic processes in resistive random access memories (RRAMs) are calculated for four typical oxides, HfO₂, TiO₂, Ta₂O₅, and Al₂O₃, to define a materials selection process. O vacancies have the lowest defect formation energy in the O-poor limit and dominate the processes. A band diagram defines the operating Fermi energy and O chemical potential range. It is shown how the scavenger metal can be used to vary the O vacancy formation energy, via controlling the O chemical potential, and the mean Fermi energy. The high endurance of Ta₂O₅ RRAM is related to its more stable amorphous phase and the adaptive lattice rearrangements of its O vacancy.

  2. A Bayesian random effects discrete-choice model for resource selection: Population-level selection inference

    USGS Publications Warehouse

    Thomas, D.L.; Johnson, D.; Griffith, B.

    2006-01-01

    Modeling the probability of use of land units characterized by discrete and continuous measures, we present a Bayesian random-effects model to assess resource selection. This model provides simultaneous estimation of both individual- and population-level selection. Deviance information criterion (DIC), a Bayesian alternative to AIC that is sample-size specific, is used for model selection. Aerial radiolocation data from 76 adult female caribou (Rangifer tarandus) and calf pairs during 1 year on an Arctic coastal plain calving ground were used to illustrate models and assess population-level selection of landscape attributes, as well as individual heterogeneity of selection. Landscape attributes included elevation, NDVI (a measure of forage greenness), and land cover-type classification. Results from the first of a 2-stage model-selection procedure indicated that there is substantial heterogeneity among cow-calf pairs with respect to selection of the landscape attributes. In the second stage, selection of models with heterogeneity included indicated that at the population-level, NDVI and land cover class were significant attributes for selection of different landscapes by pairs on the calving ground. Population-level selection coefficients indicate that the pairs generally select landscapes with higher levels of NDVI, but the relationship is quadratic. The highest rate of selection occurs at values of NDVI less than the maximum observed. Results for land cover-class selections coefficients indicate that wet sedge, moist sedge, herbaceous tussock tundra, and shrub tussock tundra are selected at approximately the same rate, while alpine and sparsely vegetated landscapes are selected at a lower rate. Furthermore, the variability in selection by individual caribou for moist sedge and sparsely vegetated landscapes is large relative to the variability in selection of other land cover types. The example analysis illustrates that, while sometimes computationally intense, a

  3. Study on MAX-MIN Ant System with Random Selection in Quadratic Assignment Problem

    NASA Astrophysics Data System (ADS)

    Iimura, Ichiro; Yoshida, Kenji; Ishibashi, Ken; Nakayama, Shigeru

Ant Colony Optimization (ACO), a type of swarm intelligence inspired by ants' foraging behavior, has been studied extensively, and its effectiveness has been shown by many researchers. Previous studies have reported that MAX-MIN Ant System (MMAS) is one of the effective ACO algorithms. MMAS maintains the balance between intensification and diversification of pheromone by limiting the quantity of pheromone to a range between minimum and maximum values. In this paper, we propose MAX-MIN Ant System with Random Selection (MMASRS) to improve the search performance even further. MMASRS is a new ACO algorithm that extends MMAS by introducing random selection, one of the edge-choosing methods used by agents (ants). In our experimental evaluation using ten quadratic assignment problems, we show that the proposed MMASRS with random selection is superior to the conventional MMAS without it in terms of search performance.
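
    A hedged sketch of what such an edge-choosing rule can look like in Python (the 0.1 probability and the pheromone-weighted roulette fallback are illustrative, not the paper's exact rule or tuning):

      import random

      def choose_next_city(current, unvisited, pheromone, p_random=0.1):
          """With probability p_random pick uniformly; otherwise roulette wheel."""
          candidates = list(unvisited)
          if random.random() < p_random:
              return random.choice(candidates)  # the "random selection" step
          weights = [pheromone[current][c] for c in candidates]
          return random.choices(candidates, weights=weights, k=1)[0]

      pher = {0: {1: 2.0, 2: 0.5}}
      print(choose_next_city(0, {1, 2}, pher))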

  4. 32 CFR 1624.1 - Random selection procedures for induction.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., during the specified calendar year(s) attain their 18th year of birth. The drawing, commencing with the... date of birth of the registrant that appears on his Selective Service Registration Record on the day... date of birth in all matters pertaining to his relations with the Selective Service System....

  5. 32 CFR 1624.1 - Random selection procedures for induction.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., during the specified calendar year(s) attain their 18th year of birth. The drawing, commencing with the... date of birth of the registrant that appears on his Selective Service Registration Record on the day... date of birth in all matters pertaining to his relations with the Selective Service System....

  6. 32 CFR 1624.1 - Random selection procedures for induction.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., during the specified calendar year(s) attain their 18th year of birth. The drawing, commencing with the... date of birth of the registrant that appears on his Selective Service Registration Record on the day... date of birth in all matters pertaining to his relations with the Selective Service System....

  7. MOMENT-BASED METHOD FOR RANDOM EFFECTS SELECTION IN LINEAR MIXED MODELS

    PubMed Central

    Ahn, Mihye; Lu, Wenbin

    2012-01-01

    The selection of random effects in linear mixed models is an important yet challenging problem in practice. We propose a robust and unified framework for automatically selecting random effects and estimating covariance components in linear mixed models. A moment-based loss function is first constructed for estimating the covariance matrix of random effects. Two types of shrinkage penalties, a hard thresholding operator and a new sandwich-type soft-thresholding penalty, are then imposed for sparse estimation and random effects selection. Compared with existing approaches, the new procedure does not require any distributional assumption on the random effects and error terms. We establish the asymptotic properties of the resulting estimator in terms of its consistency in both random effects selection and variance component estimation. Optimization strategies are suggested to tackle the computational challenges involved in estimating the sparse variance-covariance matrix. Furthermore, we extend the procedure to incorporate the selection of fixed effects as well. Numerical results show promising performance of the new approach in selecting both random and fixed effects and, consequently, improving the efficiency of estimating model parameters. Finally, we apply the approach to a data set from the Amsterdam Growth and Health study. PMID:23105913

  8. Application of random effects to the study of resource selection by animals.

    PubMed

    Gillies, Cameron S; Hebblewhite, Mark; Nielsen, Scott E; Krawchuk, Meg A; Aldridge, Cameron L; Frair, Jacqueline L; Saher, D Joanne; Stevens, Cameron E; Jerde, Christopher L

    2006-07-01

    1. Resource selection estimated by logistic regression is used increasingly in studies to identify critical resources for animal populations and to predict species occurrence. 2. Most frequently, individual animals are monitored and pooled to estimate population-level effects without regard to group or individual-level variation. Pooling assumes that both observations and their errors are independent, and resource selection is constant given individual variation in resource availability. 3. Although researchers have identified ways to minimize autocorrelation, variation between individuals caused by differences in selection or available resources, including functional responses in resource selection, have not been well addressed. 4. Here we review random-effects models and their application to resource selection modelling to overcome these common limitations. We present a simple case study of an analysis of resource selection by grizzly bears in the foothills of the Canadian Rocky Mountains with and without random effects. 5. Both categorical and continuous variables in the grizzly bear model differed in interpretation, both in statistical significance and coefficient sign, depending on how a random effect was included. We used a simulation approach to clarify the application of random effects under three common situations for telemetry studies: (a) discrepancies in sample sizes among individuals; (b) differences among individuals in selection where availability is constant; and (c) differences in availability with and without a functional response in resource selection. 6. We found that random intercepts accounted for unbalanced sample designs, and models with random intercepts and coefficients improved model fit given the variation in selection among individuals and functional responses in selection. Our empirical example and simulations demonstrate how including random effects in resource selection models can aid interpretation and address difficult assumptions

  9. Sample Selection in Randomized Experiments: A New Method Using Propensity Score Stratified Sampling

    ERIC Educational Resources Information Center

    Tipton, Elizabeth; Hedges, Larry; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Caverly, Sarah

    2014-01-01

    Randomized experiments are often seen as the "gold standard" for causal research. Despite the fact that experiments use random assignment to treatment conditions, units are seldom selected into the experiment using probability sampling. Very little research on experimental design has focused on how to make generalizations to well-defined…

  10. Selective advantage for sexual replication with random haploid fusion

    NASA Astrophysics Data System (ADS)

    Tannenbaum, Emmanuel

    2008-03-01

    This talk develops a simplified set of models describing asexual and sexual replication in unicellular diploid organisms. The models assume organisms whose genomes consist of two chromosomes, where each chromosome is assumed to be functional if and only if it is equal to some master sequence. The fitness of an organism is determined by the number of functional chromosomes in its genome. For a population replicating asexually, a cell replicates both of its chromosomes, and then divides and splits its genetic material evenly between the two cells. For a population replicating sexually, a given cell first divides into two haploids, which enter a haploid pool. Within the haploid pool, haploids fuse into diploids, which then divide via the normal mitotic process. When the cost for sex is small, as measured by the ratio of the characteristic haploid fusion time to the characteristic growth time, we find that sexual replication with random haploid fusion leads to a greater mean fitness for the population than a purely asexual strategy. The results of this talk are consistent with previous studies suggesting that sex is favored at intermediate mutation rates, for slowly replicating organisms, and at high population densities.

  11. Acceptance sampling using judgmental and randomly selected samples

    SciTech Connect

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems and, in particular, to environmental sampling, where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
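
    A deliberately simplified one-group sketch of the Bayesian calculation in Python (the uniform Beta(1, 1) prior, n = 30, and the 95% threshold are assumptions; the paper's model additionally handles the judgmentally sampled high-risk group):

      from scipy.stats import beta

      n = 30                      # randomly sampled items, all observed acceptable
      posterior = beta(1 + n, 1)  # Beta-binomial update of a uniform prior
      print(posterior.sf(0.95))   # P(at least 95% of unsampled items acceptable)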

  12. Drugs in oral fluid in randomly selected drivers.

    PubMed

    Drummer, Olaf H; Gerostamoulos, Dimitri; Chu, Mark; Swann, Philip; Boorman, Martin; Cairns, Ian

    2007-08-01

There were 13,176 roadside drug tests performed in the first year of the random drug-testing program conducted in the state of Victoria. Drugs targeted in the testing were methamphetamines and Delta(9)-tetrahydrocannabinol (THC). On-site screening was conducted by the police using DrugWipe, while the driver was still in the vehicle, and if positive, a second test on collected oral fluid, using the Rapiscan, was performed in a specially outfitted "drug bus" located adjacent to the testing area. Oral fluid on presumptive positive cases was sent to the laboratory for confirmation with limits of quantification of 5, 5, and 2 ng/mL for methamphetamine (MA), methylenedioxy-methamphetamine (MDMA), and THC, respectively. Recovery experiments conducted in the laboratory showed quantitative recovery of analytes from the collector. When oral fluid could not be collected, blood was taken from the driver and sent to the laboratory for confirmation. These roadside tests gave 313 positive cases following GC-MS confirmation. These comprised 269, 118, and 87 cases positive to MA, MDMA, and THC, respectively. The median oral fluid concentrations (undiluted) of MA, MDMA, and THC were 1136, 2724, and 81 ng/mL, respectively. The overall drug positive rate was 2.4% of the screened population. This rate was highest in drivers of cars (2.8%). The average age of drivers detected with a positive drug reading was 28 years. Large vehicle (trucks over 4.5 t) drivers were older, on average 38 years. Females accounted for 19% of all positives, although none of the positive truck drivers were female. There was one false positive to cannabis when the results of both on-site devices were considered and four to methamphetamines. PMID:17658711

  13. THE SELECTION OF A NATIONAL RANDOM SAMPLE OF TEACHERS FOR EXPERIMENTAL CURRICULUM EVALUATION.

    ERIC Educational Resources Information Center

    WELCH, WAYNE W.; AND OTHERS

    MEMBERS OF THE EVALUATION SECTION OF HARVARD PROJECT PHYSICS, DESCRIBING WHAT IS SAID TO BE THE FIRST ATTEMPT TO SELECT A NATIONAL RANDOM SAMPLE OF (HIGH SCHOOL PHYSICS) TEACHERS, LIST THE STEPS AS (1) PURCHASE OF A LIST OF PHYSICS TEACHERS FROM THE NATIONAL SCIENCE TEACHERS ASSOCIATION (MOST COMPLETE AVAILABLE), (2) SELECTION OF 136 NAMES BY A…

  14. Random Forest (RF) Wrappers for Waveband Selection and Classification of Hyperspectral Data.

    PubMed

    Poona, Nitesh Keshavelal; van Niekerk, Adriaan; Nadel, Ryan Leslie; Ismail, Riyad

    2016-02-01

    Hyperspectral data collected using a field spectroradiometer was used to model asymptomatic stress in Pinus radiata and Pinus patula seedlings infected with the pathogen Fusarium circinatum. Spectral data were analyzed using the random forest algorithm. To improve the classification accuracy of the model, subsets of wavebands were selected using three feature selection algorithms: (1) Boruta; (2) recursive feature elimination (RFE); and (3) area under the receiver operating characteristic curve of the random forest (AUC-RF). Results highlighted the robustness of the above feature selection methods when used in conjunction with the random forest algorithm for analyzing hyperspectral data. Overall, the Boruta feature selection algorithm provided the best results. When discriminating F. circinatum stress in Pinus radiata seedlings, Boruta selected wavebands (n = 69) yielded the best overall classification accuracies (training error of 17.00%, independent test error of 17.00% and an AUC value of 0.91). Classification results were, however, significantly lower for P. patula seedlings, with a training error of 24.00%, independent test error of 38.00%, and an AUC value of 0.65. A hybrid selection method that utilizes combinations of wavebands selected from the three feature selection algorithms was also tested. The hybrid method showed an improvement in classification accuracies for P. patula, and no improvement for P. radiata. The results of this study provide impetus towards implementing a hyperspectral framework for detecting stress within nursery environments. PMID:26903567
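
    A sketch of one wrapper of this kind, recursive feature elimination around a random forest in scikit-learn, on synthetic stand-in data (all sizes and hyperparameters below are invented; the study used field-spectroradiometer wavebands and also compared Boruta and AUC-RF):

      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.feature_selection import RFE

      X, y = make_classification(n_samples=200, n_features=100,
                                 n_informative=10, random_state=0)
      selector = RFE(RandomForestClassifier(n_estimators=200, random_state=0),
                     n_features_to_select=20, step=0.1)  # drop 10% per round
      selector.fit(X, y)
      print("selected wavebands:", selector.support_.nonzero()[0])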

  15. Propensity scores used for analysis of cluster randomized trials with selection bias: a simulation study.

    PubMed

    Leyrat, C; Caille, A; Donner, A; Giraudeau, B

    2013-08-30

    Cluster randomized trials (CRTs) are often prone to selection bias despite randomization. Using a simulation study, we investigated the use of propensity score (PS) based methods in estimating treatment effects in CRTs with selection bias when the outcome is quantitative. Of four PS-based methods (adjustment on PS, inverse weighting, stratification, and optimal full matching method), three successfully corrected the bias, as did an approach using classical multivariable regression. However, they showed poorer statistical efficiency than classical methods, with higher standard error for the treatment effect, and type I error much smaller than the 5% nominal level. PMID:23553813
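
    A minimal sketch of the inverse-weighting estimator, one of the four PS-based methods, in Python with statsmodels (the cluster structure is ignored here for brevity, which a real CRT analysis would not permit):

      import numpy as np
      import statsmodels.api as sm

      def ipw_effect(X, treated, y):
          """IPW estimate of the treatment effect from a logistic propensity model."""
          ps = sm.Logit(treated, sm.add_constant(X)).fit(disp=0).predict(sm.add_constant(X))
          w = treated / ps + (1 - treated) / (1 - ps)
          return (np.average(y[treated == 1], weights=w[treated == 1])
                  - np.average(y[treated == 0], weights=w[treated == 0]))

      rng = np.random.default_rng(0)
      X = rng.normal(size=(500, 3))
      treated = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))
      y = 2.0 * treated + X @ np.array([1.0, 0.5, 0.0]) + rng.normal(size=500)
      print(round(ipw_effect(X, treated, y), 2))  # close to the true effect of 2.0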

  16. A controlled, randomized trial of highly selective vagotomy versus selective vagotomy and pyloroplasty in the treatment of duodenal ulcer.

    PubMed Central

    Kronborg, O; Madsen, P

    1975-01-01

    The results of highly selective vagotomy without drainage and selective vagotomy with pyloroplasty for duodenal ulcer were compared in a randomized, controlled trial of a series of 100 patients. The frequency of dumping, diarrhoea, and epigastric fullness was significantly lower after highly selective (6, 6, and 8 percent) than after selective vagotomy (30, 20, and 28 percent) one year after the operations. Recurrent and persisting duodenal ulcers appearing from one to four years after the operations were significantly more frequent after highly selective (22 percent) than after selective vagotomy (8 percent). No significant relationships were found between recurrent ulceration and gastric acid secretion measurements after the two operations. The Hollander response was early positive in 28 percent and late positive in 30 percent of the patients subjected to highly selective vagotomy, while the corresponding figures after selective vagotomy were 26 and 32 percent. The overall clinical results of the two operations were not different according to the classification of Visick. Excluding the patients with recurrence resulted in significantly better clinical results after highly selective vagotomy. PMID:1093947

  17. COMPARISON OF RANDOM AND SYSTEMATIC SITE SELECTION FOR ASSESSING ATTAINMENT OF AQUATIC LIFE USES IN SEGMENTS OF THE OHIO RIVER

    EPA Science Inventory

    This report is a description of field work and data analysis results comparing a design comparable to systematic site selection with one based on random selection of sites. The report is expected to validate the use of random site selection in the bioassessment program for the O...

  18. Unbiased Feature Selection in Learning Random Forests for High-Dimensional Data

    PubMed Central

    Nguyen, Thanh-Tung; Huang, Joshua Zhexue; Nguyen, Thuy Thi

    2015-01-01

    Random forests (RFs) have been widely used as a powerful classification method. However, with the randomization in both bagging samples and feature selection, the trees in the forest tend to select uninformative features for node splitting. This makes RFs have poor accuracy when working with high-dimensional data. Besides that, RFs have bias in the feature selection process where multivalued features are favored. Aiming at debiasing feature selection in RFs, we propose a new RF algorithm, called xRF, to select good features in learning RFs for high-dimensional data. We first remove the uninformative features using p-value assessment, and the subset of unbiased features is then selected based on some statistical measures. This feature subset is then partitioned into two subsets. A feature weighting sampling technique is used to sample features from these two subsets for building trees. This approach enables one to generate more accurate trees, while allowing one to reduce dimensionality and the amount of data needed for learning RFs. An extensive set of experiments has been conducted on 47 high-dimensional real-world datasets including image datasets. The experimental results have shown that RFs with the proposed approach outperformed the existing random forests in increasing the accuracy and the AUC measures. PMID:25879059

  19. Optimization Of Mean-Semivariance-Skewness Portfolio Selection Model In Fuzzy Random Environment

    NASA Astrophysics Data System (ADS)

    Chatterjee, Amitava; Bhattacharyya, Rupak; Mukherjee, Supratim; Kar, Samarjit

    2010-10-01

The purpose of this paper is to construct a mean-semivariance-skewness portfolio selection model in a fuzzy random environment. The objective is to maximize the skewness with a predefined maximum risk tolerance and minimum expected return. Here the security returns in the objectives and constraints are assumed to be fuzzy random variables, and the vagueness of these fuzzy random variables is transformed into fuzzy variables similar to trapezoidal numbers. The newly formed fuzzy model is then converted into a deterministic optimization model. The feasibility and effectiveness of the proposed method are verified by a numerical example extracted from the Bombay Stock Exchange (BSE). The exact parameters of the fuzzy membership function and probability density function are obtained through fuzzy random simulation of past data.

  20. An Evaluation of Information Criteria Use for Correct Cross-Classified Random Effects Model Selection

    ERIC Educational Resources Information Center

    Beretvas, S. Natasha; Murphy, Daniel L.

    2013-01-01

    The authors assessed correct model identification rates of Akaike's information criterion (AIC), corrected criterion (AICC), consistent AIC (CAIC), Hannon and Quinn's information criterion (HQIC), and Bayesian information criterion (BIC) for selecting among cross-classified random effects models. Performance of default values for the 5…

  1. 47 CFR 1.824 - Random selection procedures for Multichannel Multipoint Distribution Service and Multipoint...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Random selection procedures for Multichannel Multipoint Distribution Service and Multipoint Distribution Service H-Channel stations. 1.824 Section 1.824... for Multichannel Multipoint Distribution Service and Multipoint Distribution Service...

  2. Variable selection with random forest: Balancing stability, performance, and interpretation in ecological and environmental modeling

    EPA Science Inventory

    Random forest (RF) is popular in ecological and environmental modeling, in part, because of its insensitivity to correlated predictors and resistance to overfitting. Although variable selection has been proposed to improve both performance and interpretation of RF models, it is u...

  3. Adaptive consensus of scale-free multi-agent system by randomly selecting links

    NASA Astrophysics Data System (ADS)

    Mou, Jinping; Ge, Huafeng

    2016-06-01

This paper investigates an adaptive consensus problem for distributed scale-free multi-agent systems (SFMASs) with randomly selected links, where the degree of each node follows a power-law distribution. The random selection of links rests on the assumption that every agent decides, with a certain probability, which links to its neighbours to use according to the received data. Accordingly, a novel consensus protocol based on the range of the received data is developed, and each node updates its state according to this protocol. Using an iterative method and the Cauchy inequality, the theoretical analysis shows that all errors among agents converge to zero, and several criteria for consensus are obtained along the way. A numerical example demonstrates the reliability of the proposed method.

  4. Continuous-Time Mean-Variance Portfolio Selection with Random Horizon

    SciTech Connect

    Yu, Zhiyong

    2013-12-15

    This paper examines the continuous-time mean-variance optimal portfolio selection problem with random market parameters and random time horizon. Treating this problem as a linearly constrained stochastic linear-quadratic optimal control problem, I explicitly derive the efficient portfolios and efficient frontier in closed forms based on the solutions of two backward stochastic differential equations. Some related issues such as a minimum variance portfolio and a mutual fund theorem are also addressed. All the results are markedly different from those in the problem with deterministic exit time. A key part of my analysis involves proving the global solvability of a stochastic Riccati equation, which is interesting in its own right.

  5. Emergence of multilevel selection in the prisoner's dilemma game on coevolving random networks

    NASA Astrophysics Data System (ADS)

    Szolnoki, Attila; Perc, Matjaž

    2009-09-01

    We study the evolution of cooperation in the prisoner's dilemma game, whereby a coevolutionary rule is introduced that molds the random topology of the interaction network in two ways. First, existing links are deleted whenever a player adopts a new strategy or its degree exceeds a threshold value; second, new links are added randomly after a given number of game iterations. These coevolutionary processes correspond to the generic formation of new links and deletion of existing links that, especially in human societies, appear frequently as a consequence of ongoing socialization, change of lifestyle or death. Due to the counteraction of deletions and additions of links the initial heterogeneity of the interaction network is qualitatively preserved, and thus cannot be held responsible for the observed promotion of cooperation. Indeed, the coevolutionary rule evokes the spontaneous emergence of a powerful multilevel selection mechanism, which despite the sustained random topology of the evolving network, maintains cooperation across the whole span of defection temptation values.

  6. Antibiotic Selection Pressure and Macrolide Resistance in Nasopharyngeal Streptococcus pneumoniae: A Cluster-Randomized Clinical Trial

    PubMed Central

    Skalet, Alison H.; Cevallos, Vicky; Ayele, Berhan; Gebre, Teshome; Zhou, Zhaoxia; Jorgensen, James H.; Zerihun, Mulat; Habte, Dereje; Assefa, Yared; Emerson, Paul M.; Gaynor, Bruce D.; Porco, Travis C.; Lietman, Thomas M.; Keenan, Jeremy D.

    2010-01-01

    Background It is widely thought that widespread antibiotic use selects for community antibiotic resistance, though this has been difficult to prove in the setting of a community-randomized clinical trial. In this study, we used a randomized clinical trial design to assess whether macrolide resistance was higher in communities treated with mass azithromycin for trachoma, compared to untreated control communities. Methods and Findings In a cluster-randomized trial for trachoma control in Ethiopia, 12 communities were randomized to receive mass azithromycin treatment of children aged 1–10 years at months 0, 3, 6, and 9. Twelve control communities were randomized to receive no antibiotic treatments until the conclusion of the study. Nasopharyngeal swabs were collected from randomly selected children in the treated group at baseline and month 12, and in the control group at month 12. Antibiotic susceptibility testing was performed on Streptococcus pneumoniae isolated from the swabs using Etest strips. In the treated group, the mean prevalence of azithromycin resistance among all monitored children increased from 3.6% (95% confidence interval [CI] 0.8%–8.9%) at baseline, to 46.9% (37.5%–57.5%) at month 12 (p = 0.003). In control communities, azithromycin resistance was 9.2% (95% CI 6.7%–13.3%) at month 12, significantly lower than the treated group (p<0.0001). Penicillin resistance was identified in 0.8% (95% CI 0%–4.2%) of isolates in the control group at 1 year, and in no isolates in the children-treated group at baseline or 1 year. Conclusions This cluster-randomized clinical trial demonstrated that compared to untreated control communities, nasopharyngeal pneumococcal resistance to macrolides was significantly higher in communities randomized to intensive azithromycin treatment. Mass azithromycin distributions were given more frequently than currently recommended by the World Health Organization's trachoma program. Azithromycin use in this setting did

  7. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sample selection by random number generation on any two-dimensional square grid. 761.308 Section 761.308 Protection of Environment... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional...

  8. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 31 2011-07-01 2011-07-01 false Sample selection by random number generation on any two-dimensional square grid. 761.308 Section 761.308 Protection of Environment... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional...

  9. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 32 2012-07-01 2012-07-01 false Sample selection by random number generation on any two-dimensional square grid. 761.308 Section 761.308 Protection of Environment... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional...

  10. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 32 2013-07-01 2013-07-01 false Sample selection by random number generation on any two-dimensional square grid. 761.308 Section 761.308 Protection of Environment... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional...

  11. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 31 2014-07-01 2014-07-01 false Sample selection by random number generation on any two-dimensional square grid. 761.308 Section 761.308 Protection of Environment... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional...

  12. Alternative Modal Basis Selection Procedures For Reduced-Order Nonlinear Random Response Simulation

    NASA Technical Reports Server (NTRS)

Przekop, Adam; Guo, Xinyun; Rizzi, Stephen A.

    2012-01-01

    Three procedures to guide selection of an efficient modal basis in a nonlinear random response analysis are examined. One method is based only on proper orthogonal decomposition, while the other two additionally involve smooth orthogonal decomposition. Acoustic random response problems are employed to assess the performance of the three modal basis selection approaches. A thermally post-buckled beam exhibiting snap-through behavior, a shallowly curved arch in the auto-parametric response regime and a plate structure are used as numerical test articles. The results of a computationally taxing full-order analysis in physical degrees of freedom are taken as the benchmark for comparison with the results from the three reduced-order analyses. For the cases considered, all three methods are shown to produce modal bases resulting in accurate and computationally efficient reduced-order nonlinear simulations.

  13. Alternative modal basis selection procedures for reduced-order nonlinear random response simulation

    NASA Astrophysics Data System (ADS)

    Przekop, Adam; Guo, Xinyun; Rizzi, Stephen A.

    2012-08-01

    Three procedures to guide selection of an efficient modal basis in a nonlinear random response analysis are examined. One method is based only on proper orthogonal decomposition, while the other two additionally involve smooth orthogonal decomposition. Acoustic random response problems are employed to assess the performance of the three modal basis selection approaches. A thermally post-buckled beam exhibiting snap-through behavior, a shallowly curved arch in the auto-parametric response regime and a plate structure are used as numerical test articles. The results of a computationally taxing full-order analysis in physical degrees of freedom are taken as the benchmark for comparison with the results from the three reduced-order analyses. For the cases considered, all three methods are shown to produce modal bases resulting in accurate and computationally efficient reduced-order nonlinear simulations.

  14. Topology-selective jamming of fully-connected, code-division random-access networks

    NASA Technical Reports Server (NTRS)

    Polydoros, Andreas; Cheng, Unjeng

    1990-01-01

    The purpose is to introduce certain models of topology selective stochastic jamming and examine its impact on a class of fully-connected, spread-spectrum, slotted ALOHA-type random access networks. The theory covers dedicated as well as half-duplex units. The dominant role of the spatial duty factor is established, and connections with the dual concept of time selective jamming are discussed. The optimal choices of coding rate and link access parameters (from the users' side) and the jamming spatial fraction are numerically established for DS and FH spreading.

  15. SnIPRE: selection inference using a Poisson random effects model.

    PubMed

    Eilertson, Kirsten E; Booth, James G; Bustamante, Carlos D

    2012-01-01

We present an approach for identifying genes under natural selection using polymorphism and divergence data from synonymous and non-synonymous sites within genes. A generalized linear mixed model is used to model the genome-wide variability among categories of mutations and estimate its functional consequence. We demonstrate how the model's estimated fixed and random effects can be used to identify genes under selection. The parameter estimates from our generalized linear model can be transformed to yield population genetic parameter estimates for quantities including the average selection coefficient for new mutations at a locus, the synonymous and non-synonymous mutation rates, and species divergence times. Furthermore, our approach incorporates stochastic variation due to the evolutionary process and can be fit using standard statistical software. The model is fit in both the empirical Bayes and Bayesian settings using the lme4 package in R, and Markov chain Monte Carlo methods in WinBUGS. Using simulated data we compare our method to existing approaches for detecting genes under selection: the McDonald-Kreitman test, and two versions of the Poisson random field based method MKprf. Overall, we find our method universally outperforms existing methods for detecting genes subject to selection using polymorphism and divergence data. PMID:23236270
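
    For comparison, the McDonald-Kreitman baseline mentioned in the abstract reduces to a 2x2 contingency test; a sketch with invented counts:

      from scipy.stats import fisher_exact

      #                 polymorphism  divergence
      # nonsynonymous        Pn = 2      Dn = 15
      # synonymous           Ps = 10     Ds = 12
      odds_ratio, p = fisher_exact([[2, 15], [10, 12]])
      print(f"OR = {odds_ratio:.2f}, p = {p:.3f}")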

  16. SnIPRE: Selection Inference Using a Poisson Random Effects Model

    PubMed Central

    Eilertson, Kirsten E.; Booth, James G.; Bustamante, Carlos D.

    2012-01-01

We present an approach for identifying genes under natural selection using polymorphism and divergence data from synonymous and non-synonymous sites within genes. A generalized linear mixed model is used to model the genome-wide variability among categories of mutations and estimate its functional consequence. We demonstrate how the model's estimated fixed and random effects can be used to identify genes under selection. The parameter estimates from our generalized linear model can be transformed to yield population genetic parameter estimates for quantities including the average selection coefficient for new mutations at a locus, the synonymous and non-synonymous mutation rates, and species divergence times. Furthermore, our approach incorporates stochastic variation due to the evolutionary process and can be fit using standard statistical software. The model is fit in both the empirical Bayes and Bayesian settings using the lme4 package in R, and Markov chain Monte Carlo methods in WinBUGS. Using simulated data we compare our method to existing approaches for detecting genes under selection: the McDonald-Kreitman test, and two versions of the Poisson random field based method MKprf. Overall, we find our method universally outperforms existing methods for detecting genes subject to selection using polymorphism and divergence data. PMID:23236270

  17. Recursive Random Forests Enable Better Predictive Performance and Model Interpretation than Variable Selection by LASSO.

    PubMed

    Zhu, Xiang-Wei; Xin, Yan-Jun; Ge, Hui-Lin

    2015-04-27

    Variable selection is of crucial significance in QSAR modeling since it increases the model's predictive ability and reduces noise. The selection of the right variables is far more complicated than the development of predictive models. In this study, eight continuous and categorical data sets were employed to explore the applicability of two distinct variable selection methods: random forests (RF) and the least absolute shrinkage and selection operator (LASSO). Variable selection was performed (1) by using recursive random forests to rule out a quarter of the least important descriptors at each iteration and (2) by using LASSO modeling with 10-fold inner cross-validation to tune its penalty λ for each data set. Along with regular statistical parameters of model performance, we proposed the highest pairwise correlation rate, the average pairwise Pearson's correlation coefficient, and the Tanimoto coefficient to evaluate the variable subsets selected by RF and LASSO in an extensive way. Results showed that variable selection allowed a tremendous reduction of noisy descriptors (at most 96% with the RF method in this study) and appreciably enhanced the models' predictive performance as well. Furthermore, random forests showed the property of gathering important predictors without restricting their pairwise correlation, which is contrary to LASSO. The mutual exclusion of highly correlated variables in LASSO modeling tends to skip important variables that are highly related to response endpoints and thus undermines the model's predictive performance. The optimal variables selected by RF share low similarity with those selected by LASSO (e.g., the Tanimoto coefficients were smaller than 0.20 in seven out of eight data sets). We found that the differences between RF and LASSO predictive performances mainly resulted from the variables selected by the different strategies rather than from the learning algorithms. Our study showed that the right selection of variables is more important than the learning algorithm for modeling. We hope
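
    A minimal sketch of the recursive elimination step described above, assuming a scikit-learn random forest and a synthetic regression data set; the stopping size of eight descriptors is an arbitrary illustration, not the paper's criterion.

        import numpy as np
        from sklearn.datasets import make_regression
        from sklearn.ensemble import RandomForestRegressor

        X, y = make_regression(n_samples=200, n_features=64, noise=0.5, random_state=0)
        kept = np.arange(X.shape[1])

        # Recursive elimination: rule out the least important quarter of the
        # remaining descriptors at each iteration.
        while len(kept) > 8:
            rf = RandomForestRegressor(n_estimators=200, random_state=0)
            rf.fit(X[:, kept], y)
            order = np.argsort(rf.feature_importances_)   # least important first
            n_drop = max(1, len(kept) // 4)
            kept = kept[order[n_drop:]]                   # retain the top 3/4
            kept.sort()

        print("selected descriptor indices:", kept)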

  18. From Protocols to Publications: A Study in Selective Reporting of Outcomes in Randomized Trials in Oncology

    PubMed Central

    Raghav, Kanwal Pratap Singh; Mahajan, Sminil; Yao, James C.; Hobbs, Brian P.; Berry, Donald A.; Pentz, Rebecca D.; Tam, Alda; Hong, Waun K.; Ellis, Lee M.; Abbruzzese, James; Overman, Michael J.

    2015-01-01

    Purpose The decision by journals to append protocols to published reports of randomized trials was a landmark event in clinical trial reporting. However, limited information is available on how this initiative affected transparency and selective reporting of clinical trial data. Methods We analyzed 74 oncology-based randomized trials published in Journal of Clinical Oncology, the New England Journal of Medicine, and The Lancet in 2012. To ascertain integrity of reporting, we compared published reports with their respective appended protocols with regard to primary end points, nonprimary end points, unplanned end points, and unplanned analyses. Results A total of 86 primary end points were reported in 74 randomized trials; nine trials had more than one primary end point. Nine trials (12.2%) had some discrepancy between their planned and published primary end points. A total of 579 nonprimary end points (median, seven per trial) were planned, of which 373 (64.4%; median, five per trial) were reported. A significant positive correlation was found between the number of planned and nonreported nonprimary end points (Spearman r = 0.66; P < .001). Twenty-eight studies (37.8%) reported a total of 65 unplanned end points, 52 (80.0%) of which were not identified as unplanned. Thirty-one (41.9%) and 19 (25.7%) of 74 trials reported a total of 52 unplanned analyses involving primary end points and 33 unplanned analyses involving nonprimary end points, respectively. Studies reported positive unplanned end points and unplanned analyses more frequently than negative outcomes in abstracts (unplanned end points odds ratio, 6.8; P = .002; unplanned analyses odds ratio, 8.4; P = .007). Conclusion Despite public and reviewer access to protocols, selective outcome reporting persists and is a major concern in the reporting of randomized clinical trials. To foster credible evidence-based medicine, additional initiatives are needed to minimize selective reporting. PMID:26304898

  19. Statistical Inference of Selection and Divergence from a Time-Dependent Poisson Random Field Model

    PubMed Central

    Amei, Amei; Sawyer, Stanley

    2012-01-01

    We apply a recently developed time-dependent Poisson random field model to aligned DNA sequences from two related biological species to estimate selection coefficients and divergence time. We use Markov chain Monte Carlo methods to estimate species divergence time and selection coefficients for each locus. The model assumes that the selective effects of non-synonymous mutations are normally distributed across genetic loci but constant within loci, and that synonymous mutations are selectively neutral. In contrast with previous models, we do not assume that the individual species are at population equilibrium after divergence. Using a data set of 91 genes in two Drosophila species, D. melanogaster and D. simulans, we estimate the species divergence time (about 1.68 million years, under an assumed haploid effective population size) and a small positive mean selection coefficient per generation. Although the average selection coefficient is positive, the magnitude of the selection is quite small. Results from numerical simulations are also presented as an accuracy check for the time-dependent model. PMID:22509300

  20. Fuzzy Random λ-Mean SAD Portfolio Selection Problem: An Ant Colony Optimization Approach

    NASA Astrophysics Data System (ADS)

    Thakur, Gour Sundar Mitra; Bhattacharyya, Rupak; Mitra, Swapan Kumar

    2010-10-01

    To reach the investment goal, one has to select a combination of securities from different portfolios containing large numbers of securities. The past records of each security alone do not guarantee the future return. As there are many uncertain factors which directly or indirectly influence the stock market, and there are also some newer stock markets which do not have enough historical data, experts' expectations and experience must be combined with the past records to generate an effective portfolio selection model. In this paper the return of a security is assumed to be a Fuzzy Random Variable Set (FRVS), where returns are sets of random numbers which are in turn fuzzy numbers. A new λ-Mean Semi Absolute Deviation (λ-MSAD) portfolio selection model is developed. The subjective opinions of the investors on the rate of return of each security are taken into consideration by introducing a pessimistic-optimistic parameter vector λ. The λ-MSAD model is preferred as it uses the semi-absolute deviation of the rate of returns of a portfolio, instead of the variance, as the measure of risk. As this model can be reduced to a Linear Programming Problem (LPP), it can be solved much faster than quadratic programming problems. Ant Colony Optimization (ACO) is used for solving the portfolio selection problem. ACO is a paradigm for designing meta-heuristic algorithms for combinatorial optimization problems. Data from the BSE are used for illustration.
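
    To illustrate why the reduction to an LP matters, here is a plain mean semi-absolute-deviation portfolio (no fuzziness, no λ vector, no ACO) solved with scipy.optimize.linprog on synthetic returns; every number below is made up.

        import numpy as np
        from scipy.optimize import linprog

        rng = np.random.default_rng(0)
        T, n = 120, 5                         # periods of synthetic data, securities
        returns = rng.normal(0.01, 0.05, size=(T, n))
        mu = returns.mean(axis=0)
        target = 0.005                        # required mean portfolio return

        # Variables: [w_1..w_n, d_1..d_T]; minimise the mean semi-absolute deviation.
        c = np.concatenate([np.zeros(n), np.full(T, 1.0 / T)])

        # d_t >= (mu - r_t)^T w   <=>   (mu - r_t)^T w - d_t <= 0
        A_ub = np.hstack([mu[None, :] - returns, -np.eye(T)])
        b_ub = np.zeros(T)
        # Mean-return constraint: -mu^T w <= -target
        A_ub = np.vstack([A_ub, np.concatenate([-mu, np.zeros(T)])])
        b_ub = np.append(b_ub, -target)

        A_eq = np.concatenate([np.ones(n), np.zeros(T)])[None, :]  # weights sum to 1
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, None)] * (n + T))
        print("portfolio weights:", res.x[:n].round(3))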

  1. Predicting protein-RNA interaction amino acids using random forest based on submodularity subset selection.

    PubMed

    Pan, Xiaoyong; Zhu, Lin; Fan, Yong-Xian; Yan, Junchi

    2014-11-13

    Protein-RNA interaction plays a crucial role in many biological processes, such as protein synthesis, transcriptional and post-transcriptional regulation of gene expression, and the pathogenesis of disease. In particular, RNAs often function through binding to proteins. Identification of the binding interface region is especially useful for cellular pathway analysis and drug design. In this study, we proposed a novel approach for binding site identification in proteins, which not only integrates local and global features from the protein sequence directly, but also constructs a balanced training dataset using sub-sampling based on submodularity subset selection. First, we extracted local and global features from the protein sequence, such as evolutionary information and molecular weight. Second, the number of non-interaction sites is much larger than the number of interaction sites, which leads to a sample imbalance problem and hence a biased machine learning model with a preference for non-interaction sites. To better resolve this problem, instead of the previous random sub-sampling of over-represented non-interaction sites, a novel sampling approach based on submodularity subset selection was employed, which can select a more representative data subset. Finally, random forests were trained on the optimally selected training subsets to predict interaction sites. Our results showed that the proposed method is very promising for predicting protein-RNA interaction residues: it achieved an accuracy of 0.863, which is better than other state-of-the-art methods. Furthermore, random forest feature importance analysis indicated that the extracted global features have very strong discriminative ability for identifying interaction residues. PMID:25462339
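
    For orientation, a sketch of the random-undersampling baseline that the submodular selection is meant to improve upon, on synthetic stand-in features; all names and sizes here are hypothetical.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        # Synthetic stand-in for sequence-derived features: many non-interaction
        # residues (label 0) and few interaction residues (label 1).
        X = rng.normal(size=(5000, 20))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 2, 5000) > 3.5).astype(int)

        pos = np.flatnonzero(y == 1)
        neg = np.flatnonzero(y == 0)
        # Baseline the paper improves on: balance the classes by randomly
        # undersampling the over-represented negatives (the paper instead picks
        # a representative subset via submodular selection).
        neg_sub = rng.choice(neg, size=len(pos), replace=False)
        idx = np.concatenate([pos, neg_sub])

        rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X[idx], y[idx])
        print("top features by importance:", np.argsort(rf.feature_importances_)[::-1][:5])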

  2. Optimization of the Dutch Matrix Test by Random Selection of Sentences From a Preselected Subset

    PubMed Central

    Dreschler, Wouter A.

    2015-01-01

    Matrix tests are available for speech recognition testing in many languages. For an accurate measurement, a steep psychometric function of the speech materials is required. For existing tests, it would be beneficial if it were possible to further optimize the available materials by increasing the function's steepness. The objective is to show whether the steepness of the psychometric function of an existing matrix test can be increased by selecting a homogeneous subset of recordings with the steepest sentence-based psychometric functions. We took data from a previous multicenter evaluation of the Dutch matrix test (45 normal-hearing listeners). Based on half of the data set, the sentences (140 out of 311) with a similar speech reception threshold and with the steepest psychometric functions (≥9.7%/dB) were first selected. Subsequently, the steepness of the psychometric function for this selection was calculated from the remaining (unused) second half of the data set. The calculation showed that the slope increased from 10.2%/dB to 13.7%/dB. The resulting subset did not allow the construction of enough balanced test lists. Therefore, the measurement procedure was changed to randomly select the sentences during testing. Random selection may interfere with a representative occurrence of phonemes. However, in our material, the median phonemic occurrence remained close to that of the original test. This finding indicates that phonemic occurrence is not a critical factor. The work highlights the possibility that existing speech tests might be improved by selecting sentences with a steep psychometric function. PMID:25964195
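
    A hedged sketch of where slope figures like those above come from: fit a two-parameter logistic psychometric function to intelligibility scores and convert the width parameter to %/dB. The data points are synthetic and the fit is illustrative, not the study's procedure.

        import numpy as np
        from scipy.optimize import curve_fit

        def psychometric(snr, srt, width):
            """Logistic psychometric function; the slope at the SRT is 1/(4*width)."""
            return 1.0 / (1.0 + np.exp(-(snr - srt) / width))

        # Synthetic proportion-correct scores versus SNR in dB.
        snr = np.array([-14, -12, -10, -8, -6, -4])
        score = np.array([0.05, 0.18, 0.45, 0.78, 0.93, 0.99])

        (srt, width), _ = curve_fit(psychometric, snr, score, p0=(-9.0, 1.5))
        slope_pct_per_db = 100.0 / (4.0 * width)
        print(f"SRT = {srt:.1f} dB SNR, slope = {slope_pct_per_db:.1f} %/dB")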

  3. Developmental contributions to macronutrient selection: a randomized controlled trial in adult survivors of malnutrition

    PubMed Central

    Campbell, Claudia P.; Raubenheimer, David; Badaloo, Asha V.; Gluckman, Peter D.; Martinez, Claudia; Gosby, Alison; Simpson, Stephen J.; Osmond, Clive; Boyne, Michael S.; Forrester, Terrence E.

    2016-01-01

    Background and objectives: Birthweight differences between kwashiorkor and marasmus suggest that intrauterine factors influence the development of these syndromes of malnutrition and may modulate the risk of obesity through dietary intake. We tested the hypotheses that the target protein intake in adulthood is associated with birthweight, and that protein leveraging to maintain this target protein intake would influence energy intake (EI) and body weight in adult survivors of malnutrition. Methodology: Sixty-three adult survivors of marasmus and kwashiorkor could freely compose a diet from foods containing 10%, 15% and 25% of energy from protein (percentage of energy derived from protein, PEP; Phase 1) for 3 days. Participants were then randomized in Phase 2 (5 days) to diets with PEP fixed at 10%, 15% or 25%. Results: Self-selected PEP was similar in both groups. In the groups combined, selected PEP was 14.7%, which differed significantly (P < 0.0001) from the null expectation (16.7%) of no selection. Self-selected PEP was inversely related to birthweight, the effect disappearing after adjusting for sex and current body weight. In Phase 2, PEP correlated inversely with EI (P = 0.002) and with weight change from Phase 1 to 2 (P = 0.002). Protein intake increased with increasing PEP, but to a lesser extent than energy intake increased with decreasing PEP. Conclusions and implications: Macronutrient intakes were not independently related to birthweight or diagnosis. In a free-choice situation (Phase 1), subjects selected a dietary PEP significantly lower than random. Lower-PEP diets induce increased energy and decreased protein intake, and are associated with weight gain. PMID:26817484

  4. Partial sequence analysis of 130 randomly selected maize cDNA clones.

    PubMed Central

    Keith, C S; Hoang, D O; Barrett, B M; Feigelman, B; Nelson, M C; Thai, H; Baysdorfer, C

    1993-01-01

    As part of a project to identify novel maize (Zea mays L. cv B73) genes functionally, we have partially sequenced 130 randomly selected clones from a maize leaf cDNA library. Data base comparisons revealed seven previously sequenced maize cDNAs and 18 cDNAs with sequence similarity to related maize genes or to genes from other organisms. One hundred five cDNAs show little or no similarity to previously sequenced genes. Our results also establish the suitability of this library for large-scale sequencing in terms of its large insert size, proper insert orientation, and low duplication rate. PMID:8278499

  5. Selection of informative metabolites using random forests based on model population analysis.

    PubMed

    Huang, Jian-Hua; Yan, Jun; Wu, Qing-Hua; Duarte Ferro, Miguel; Yi, Lun-Zhao; Lu, Hong-Mei; Xu, Qing-Song; Liang, Yi-Zeng

    2013-12-15

    One of the main goals of metabolomics studies is to discover informative metabolites or biomarkers, which may be used to diagnose diseases and to elucidate pathology. Sophisticated feature selection approaches are required to extract the information hidden in such complex 'omics' data. In this study, we propose a new and robust selection method, combining random forests (RF) with model population analysis (MPA), for selecting informative metabolites from three metabolomic datasets. According to their contribution to the classification accuracy, the metabolites were classified into three kinds: informative, non-informative, and interfering metabolites. Based on the proposed method, informative metabolites were selected for the three datasets; further analyses of these metabolites between healthy and diseased groups were then performed, with t-tests showing that the P values for all selected metabolites were below 0.05. Moreover, the informative metabolites identified by the current method were demonstrated to be correlated with the clinical outcome under investigation. The source code of MPA-RF in Matlab can be freely downloaded from http://code.google.com/p/my-research-list/downloads/list. PMID:24209380

  6. Biotin binders selected from a random peptide library expressed on phage.

    PubMed Central

    Saggio, I; Laufer, R

    1993-01-01

    Recombinant biotin-binding phages were affinity-selected from a random peptide library expressed on the surface of filamentous phage. Phage binding to biotinylated proteins was half-maximally inhibited by micromolar concentrations of a monobiotinylated molecule. Sequencing of the peptide inserts of selected phages led to the identification of a previously unknown biotin-binding motif, CXWXPPF(K or R)XXC. A synthetic peptide containing this sequence motif inhibited streptavidin binding to biotinylated BSA with an IC50 of 50 microM. This compound represents the shortest non-avidin biotin-binding peptide identified to date. Our results illustrate that phage display technology can be used to identify novel ligands for a small non-proteinaceous molecule. PMID:8352728

  8. Selection and evolution of enzymes from a partially randomized non-catalytic scaffold.

    PubMed

    Seelig, Burckhard; Szostak, Jack W

    2007-08-16

    Enzymes are exceptional catalysts that facilitate a wide variety of reactions under mild conditions, achieving high rate-enhancements with excellent chemo-, regio- and stereoselectivities. There is considerable interest in developing new enzymes for the synthesis of chemicals and pharmaceuticals and as tools for molecular biology. Methods have been developed for modifying and improving existing enzymes through screening, selection and directed evolution. However, the design and evolution of truly novel enzymes has relied on extensive knowledge of the mechanism of the reaction. Here we show that genuinely new enzymatic activities can be created de novo without the need for prior mechanistic information by selection from a naive protein library of very high diversity, with product formation as the sole selection criterion. We used messenger RNA display, in which proteins are covalently linked to their encoding mRNA, to select for functional proteins from an in vitro translated protein library of >10^12 independent sequences without the constraints imposed by any in vivo step. This technique has been used to evolve new peptides and proteins that can bind a specific ligand, from both random-sequence libraries and libraries based on a known protein fold. We now describe the isolation of novel RNA ligases from a library that is based on a zinc finger scaffold, followed by in vitro directed evolution to further optimize these enzymes. The resulting ligases exhibit multiple turnover with rate enhancements of more than two-million-fold. PMID:17700701

  9. A Comparison of Dietary Habits between Recreational Runners and a Randomly Selected Adult Population in Slovenia

    PubMed Central

    ŠKOF, Branko; ROTOVNIK KOZJEK, Nada

    2015-01-01

    Introduction The aim of the study was to compare the dietary habits of recreational runners with those of a random sample of the general population. We also wanted to determine the influence of gender, age and sports performance of recreational runners on their basic diet and compliance with recommendations in sports nutrition. Methods The study population consisted of 1,212 adult Slovenian recreational runners and 774 randomly selected residents of Slovenia between the ages of 18 and 65 years. The data on the dietary habits of our subjects were gathered by means of two questionnaires. The following parameters were evaluated: the type of diet, the food pattern, the frequency of consumption of individual food groups, the use of dietary supplements, fluid intake, and alcohol consumption. Results Recreational runners had better compliance with recommendations for healthy nutrition than the general population. This pattern increased with the runner's age and performance level. Compared to male runners, female runners ate more regularly and consumed food groups associated with a healthy diet (fruit, vegetables, whole grain foods, and low-fat dairy products) more frequently. The consumption of simple sugars and the use of nutritional supplements by well-trained runners fell short of the values recommended for physically active individuals. Conclusion Recreational runners are an exemplary population group that actively seeks to adopt a healthier lifestyle.

  10. Pornography in Usenet: a study of 9,800 randomly selected images.

    PubMed

    Mehta, M D

    2001-12-01

    This paper builds on an earlier study by Mehta and Plaza, from 1997, by analyzing 9,800 randomly selected images taken from 32 Usenet newsgroups between July 1995 and July 1996. The study concludes that an increasing percentage of pornographic images in Usenet come from commercially oriented sources and that commercial sources are more likely to post explicit images. Pornographic images containing themes that fall under most obscenity statutes are more likely to be posted by noncommercial sources. By examining the themes most commonly found in the sample, it is concluded that the vast majority of images contain legally permissible content. Only a small fraction of images contain pedophilic, bestiality, coprophilic/urophilic, amputation and mutilation, and necrophilic themes. PMID:11800177

  11. Multilabel learning via random label selection for protein subcellular multilocations prediction.

    PubMed

    Wang, Xiao; Li, Guo-Zheng

    2013-01-01

    Prediction of protein subcellular localization is an important but challenging problem, particularly when proteins may simultaneously exist at, or move between, two or more different subcellular location sites. Most of the existing protein subcellular localization methods deal only with single-location proteins. In the past few years, only a few methods have been proposed to tackle proteins with multiple locations. However, they adopt only a simple strategy, that is, transforming the multilocation proteins into multiple proteins with a single location, which does not take correlations among different subcellular locations into account. In this paper, a novel method named random label selection (RALS), which extends the simple binary relevance (BR) method, is proposed to learn from multilocation proteins in an effective and efficient way. RALS does not explicitly find the correlations among labels, but rather implicitly attempts to learn the label correlations from data by augmenting the original feature space with randomly selected labels as additional input features. Through a fivefold cross-validation test on a benchmark data set, we demonstrate that our proposed method, with its consideration of label correlations, clearly outperforms the baseline BR method without consideration of label correlations, indicating that correlations among different subcellular locations really exist and contribute to the improvement of prediction performance. Experimental results on two benchmark data sets also show that our proposed methods achieve significantly higher performance than some other state-of-the-art methods in predicting subcellular multilocations of proteins. The prediction web server is available at http://levis.tongji.edu.cn:8080/bioinfo/MLPred-Euk/ for public use. PMID:23929867
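
    The label-augmentation idea can be sketched in a few lines on synthetic multilabel data. The two-stage scheme below (a plain binary-relevance pass supplies the appended label values at prediction time) is one plausible reading of the approach, not necessarily the authors' exact protocol; all data and parameters are hypothetical.

        import numpy as np
        from sklearn.datasets import make_multilabel_classification
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        X, Y = make_multilabel_classification(n_samples=400, n_features=20,
                                              n_classes=6, random_state=0)

        # Stage 1: plain binary relevance, one classifier per label.
        stage1 = [LogisticRegression(max_iter=1000).fit(X, Y[:, j])
                  for j in range(Y.shape[1])]

        # Stage 2 (RALS idea): for each label, append k randomly selected other
        # labels as extra input features, so label correlations are learned implicitly.
        k = 2
        stage2, picks = [], []
        for j in range(Y.shape[1]):
            others = [i for i in range(Y.shape[1]) if i != j]
            sel = rng.choice(others, size=k, replace=False)
            picks.append(sel)
            Xa = np.hstack([X, Y[:, sel]])           # true labels at training time
            stage2.append(LogisticRegression(max_iter=1000).fit(Xa, Y[:, j]))

        def predict(Xnew):
            # At prediction time, the appended labels come from the stage-1 models.
            Y1 = np.column_stack([m.predict(Xnew) for m in stage1])
            return np.column_stack(
                [stage2[j].predict(np.hstack([Xnew, Y1[:, picks[j]]]))
                 for j in range(len(stage2))])

        print(predict(X[:3]))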

  12. Implications of structural genomics target selection strategies: Pfam5000, whole genome, and random approaches

    SciTech Connect

    Chandonia, John-Marc; Brenner, Steven E.

    2004-07-14

    The structural genomics project is an international effort to determine the three-dimensional shapes of all important biological macromolecules, with a primary focus on proteins. Target proteins should be selected according to a strategy which is medically and biologically relevant, of good value, and tractable. As an option to consider, we present the Pfam5000 strategy, which involves selecting the 5000 most important families from the Pfam database as sources for targets. We compare the Pfam5000 strategy to several other proposed strategies that would require similar numbers of targets. These include complete solution of several small to moderately sized bacterial proteomes, partial coverage of the human proteome, and random selection of approximately 5000 targets from sequenced genomes. We measure the impact that successful implementation of these strategies would have upon structural interpretation of the proteins in Swiss-Prot, TrEMBL, and 131 complete proteomes (including 10 from eukaryotes) from the Proteome Analysis database at EBI. Solving the structures of proteins from the 5000 largest Pfam families would allow accurate fold assignment for approximately 68 percent of all prokaryotic proteins (covering 59 percent of residues) and 61 percent of eukaryotic proteins (40 percent of residues). More fine-grained coverage which would allow accurate modeling of these proteins would require an order of magnitude more targets. The Pfam5000 strategy may be modified in several ways, for example to focus on larger families, bacterial sequences, or eukaryotic sequences; as long as secondary consideration is given to large families within Pfam, coverage results vary only slightly. In contrast, focusing structural genomics on a single tractable genome would have only a limited impact on structural knowledge of other proteomes: a significant fraction (about 30-40 percent of the proteins, and 40-60 percent of the residues) of each proteome is classified in small

  13. A novel, efficient, randomized selection trial comparing combinations of drug therapy for ALS

    PubMed Central

    GORDON, PAUL H.; CHEUNG, YING-KUEN; LEVIN, BRUCE; ANDREWS, HOWARD; DOORISH, CAROLYN; MACARTHUR, ROBERT B.; MONTES, JACQUELINE; BEDNARZ, KATE; FLORENCE, JULAINE; ROWIN, JULIE; BOYLAN, KEVIN; MOZAFFAR, TAHSEEN; TANDAN, RUP; MITSUMOTO, HIROSHI; KELVIN, ELIZABETH A.; CHAPIN, JOHN; BEDLACK, RICHARD; RIVNER, MICHAEL; MCCLUSKEY, LEO F.; PESTRONK, ALAN; GRAVES, MICHAEL; SORENSON, ERIC J.; BAROHN, RICHARD J.; BELSH, JERRY M.; LOU, JAU-SHIN; LEVINE, TODD; SAPERSTEIN, DAVID; MILLER, ROBERT G.; SCELSA, STEPHEN N.

    2015-01-01

    Combining agents with different mechanisms of action may be necessary for meaningful results in treating ALS. The combinations of minocycline-creatine and celecoxib-creatine have additive effects in the murine model. New trial designs are needed to efficiently screen the growing number of potential neuroprotective agents. Our objective was to assess two drug combinations in ALS using a novel phase II trial design. We conducted a randomized, double-blind selection trial in sequential pools of 60 patients. Participants received minocycline (100 mg)-creatine (10 g) twice daily or celecoxib (400 mg)-creatine (10 g) twice daily for six months. The primary objective was treatment selection based on which combination best slowed deterioration in the ALS Functional Rating Scale-Revised (ALSFRS-R); the trial could be stopped after one pool if the difference between the two arms was adequately large. At trial conclusion, each arm was compared to a historical control group in a futility analysis. Safety measures were also examined. After the first patient pool, the mean six-month decline in ALSFRS-R was 5.27 (SD=5.54) in the celecoxib-creatine group and 6.47 (SD=9.14) in the minocycline-creatine group. The corresponding decline was 5.82 (SD=6.77) in the historical controls. The difference between the two sample means exceeded the stopping criterion. The null hypothesis of superiority was not rejected in the futility analysis. Skin rash occurred more frequently in the celecoxib-creatine group. In conclusion, the celecoxib-creatine combination was selected as preferable to the minocycline-creatine combination for further evaluation. This phase II design was efficient, leading to treatment selection after just 60 patients, and can be used in other phase II trials to assess different agents. PMID:18608093

  14. Random removal of inserts from an RNA genome: selection against single-stranded RNA.

    PubMed Central

    Olsthoorn, R C; van Duin, J

    1996-01-01

    We have monitored the evolution of insertions in two MS2 RNA regions of known secondary structure where coding pressure is negligible or absent. Base changes and shortening of the inserts proceed until the excess nucleotides can be accommodated in the original structure. The stems of hairpins can be dramatically extended but the loops cannot, revealing natural selection against single-stranded RNA. The 3' end of the MS2 A-protein gene forms a small hairpin with an XbaI sequence in the loop. This site was used to insert XbaI fragments of various sizes. Phages produced by these MS2 cDNA clones were not wild type, nor had they retained the full insert. Instead, every revertant phage had trimmed the insert in a different way to leave a four- to seven-membered loop on the now extended stem. Similar results were obtained with inserts in the 5' untranslated region. The great number of different revertants obtained from a single starting mutant, as well as sequence inspection of the crossover points, suggests that the removal of redundant RNA occurs randomly. The only common feature among all revertants appears to be the potential to form a hairpin with a short loop, suggesting that single-stranded RNA negatively affects the viability of the phage. To test this hypothesis, we introduced XbaI fragments of 34 nucleotides that could form either a long stem with a small loop or a short stem with a large loop (26 nucleotides). The base-paired inserts were perfectly maintained for many generations, whereas the unpaired versions were quickly trimmed back to reduce the size of the loop. These data confirm that single-stranded RNA adversely affects phage fitness and is strongly selected against. The repair of the RNA genome that we describe here appears to be the result of random recombination. Of the plethora of recombinants, only those able to adopt a base-paired structure survive. The frequency with which our inserts are removed seems higher than measured by others for small inserts in a

  15. Task-Dependent Band-Selection of Hyperspectral Images by Projection-Based Random Forests

    NASA Astrophysics Data System (ADS)

    Hänsch, R.; Hellwich, O.

    2016-06-01

    The automatic classification of land cover types from hyperspectral images is a challenging problem due to (among others) the large number of spectral bands and their high spatial and spectral correlation. The extraction of meaningful features, which enables a subsequent classifier to distinguish between different land cover classes, is often limited to a subset of all available data dimensions found by band selection techniques or other methods of dimensionality reduction. This work applies Projection-Based Random Forests to hyperspectral images, which not only overcome the need for explicit feature extraction, but also provide mechanisms to automatically select spectral bands that contain original (i.e. non-redundant) as well as highly meaningful information for the given classification task. The proposed method is applied to four challenging hyperspectral datasets, and it is shown that the effective number of spectral bands can be considerably limited without losing too much classification performance, e.g. a loss of 1% accuracy if roughly 13% of all available bands are used.

  16. Confidence intervals for the selected population in randomized trials that adapt the population enrolled

    PubMed Central

    Rosenblum, Michael

    2014-01-01

    It is a challenge to design randomized trials when it is suspected that a treatment may benefit only certain subsets of the target population. In such situations, trial designs have been proposed that modify the population enrolled based on an interim analysis, in a preplanned manner. For example, if there is early evidence during the trial that the treatment only benefits a certain subset of the population, enrollment may then be restricted to this subset. At the end of such a trial, it is desirable to draw inferences about the selected population. We focus on constructing confidence intervals for the average treatment effect in the selected population. Confidence interval methods that fail to account for the adaptive nature of the design may fail to have the desired coverage probability. We provide a new procedure for constructing confidence intervals having at least 95% coverage probability, uniformly over a large class Q of possible data generating distributions. Our method involves computing the minimum factor c by which a standard confidence interval must be expanded in order to have, asymptotically, at least 95% coverage probability, uniformly over Q. Computing the expansion factor c is not trivial, since it is not a priori clear, for a given decision rule, which data generating distribution leads to the worst-case coverage probability. We give an algorithm that computes c, and prove an optimality property for the resulting confidence interval procedure. PMID:23553577

  17. Selective oropharyngeal decontamination versus selective digestive decontamination in critically ill patients: a meta-analysis of randomized controlled trials

    PubMed Central

    Zhao, Di; Song, Jian; Gao, Xuan; Gao, Fei; Wu, Yupeng; Lu, Yingying; Hou, Kai

    2015-01-01

    Background Selective digestive decontamination (SDD) and selective oropharyngeal decontamination (SOD) are associated with reduced mortality and infection rates among patients in intensive care units (ICUs); however, whether SOD has a superior effect than SDD remains uncertain. Hence, we conducted a meta-analysis of randomized controlled trials (RCTs) to compare SOD with SDD in terms of clinical outcomes and antimicrobial resistance rates in patients who were critically ill. Methods RCTs published in PubMed, Embase, and Web of Science were systematically reviewed to compare the effects of SOD and SDD in patients who were critically ill. Outcomes included day-28 mortality, length of ICU stay, length of hospital stay, duration of mechanical ventilation, ICU-acquired bacteremia, and prevalence of antibiotic-resistant Gram-negative bacteria. Results were expressed as risk ratio (RR) with 95% confidence intervals (CIs), and weighted mean differences (WMDs) with 95% CIs. Pooled estimates were performed using a fixed-effects model or random-effects model, depending on the heterogeneity among studies. Results A total of four RCTs involving 23,822 patients met the inclusion criteria and were included in this meta-analysis. Among patients whose admitting specialty was surgery, cardiothoracic surgery (57.3%) and neurosurgery (29.7%) were the two main types of surgery being performed. Pooled results showed that SOD had similar effects as SDD in day-28 mortality (RR =1.03; 95% CI: 0.98, 1.08; P=0.253), length of ICU stay (WMD =0.00 days; 95% CI: −0.2, 0.2; P=1.00), length of hospital stay (WMD =0.00 days; 95% CI: −0.65, 0.65; P=1.00), and duration of mechanical ventilation (WMD =1.01 days; 95% CI: −0.01, 2.02; P=0.053). On the other hand, compared with SOD, SDD had a lower day-28 mortality in surgical patients (RR =1.11; 95% CI: 1.00, 1.22; P=0.050), lower incidence of ICU-acquired bacteremia (RR =1.38; 95% CI: 1.24, 1.54; P=0.000), and lower rectal carriage of

  18. Propensity score methods for estimating relative risks in cluster randomized trials with low-incidence binary outcomes and selection bias.

    PubMed

    Leyrat, Clémence; Caille, Agnès; Donner, Allan; Giraudeau, Bruno

    2014-09-10

    Despite randomization, selection bias may occur in cluster randomized trials. Classical multivariable regression usually allows treatment effect estimates to be adjusted for unbalanced covariates. However, for binary outcomes with low incidence, such a method may fail because of separation problems. This simulation study focused on the performance of propensity score (PS)-based methods to estimate relative risks from cluster randomized trials with binary outcomes of low incidence. The results suggested that among the different approaches used (multivariable regression, direct adjustment on the PS, inverse weighting on the PS, and stratification on the PS), only direct adjustment on the PS fully corrected the bias, and it moreover had the best statistical properties. PMID:24771662
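
    A hedged sketch of "direct adjustment on the PS" for a relative risk on simulated data: the PS is estimated by logistic regression on baseline covariates and then entered as a covariate in a log-link Poisson working model, so that exponentiating the treatment coefficient approximates the RR. The cluster structure and the paper's exact model specification are deliberately omitted, and all simulated effect sizes are arbitrary.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 4000
        x1, x2 = rng.normal(size=n), rng.normal(size=n)
        treat = rng.binomial(1, 1 / (1 + np.exp(-(0.4 * x1 - 0.2 * x2))))  # imbalance
        p_event = 0.02 * np.exp(0.5 * treat + 0.3 * x1)                    # low incidence
        y = rng.binomial(1, np.clip(p_event, 0, 1))

        # Step 1: propensity score from the baseline covariates.
        ps_fit = sm.Logit(treat, sm.add_constant(np.column_stack([x1, x2]))).fit(disp=0)
        ps = ps_fit.predict()

        # Step 2: direct adjustment - include the PS as a covariate in a log-link
        # working model; exp(treatment coefficient) estimates the relative risk.
        X = sm.add_constant(np.column_stack([treat, ps]))
        rr_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
        print("estimated RR:", np.exp(rr_fit.params[1]))   # true RR is exp(0.5) ~ 1.65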

  19. Application of random coherence order selection in gradient-enhanced multidimensional NMR

    NASA Astrophysics Data System (ADS)

    Bostock, Mark J.; Nietlispach, Daniel

    2016-03-01

    Development of multidimensional NMR is essential to many applications, for example in high resolution structural studies of biomolecules. Multidimensional techniques enable separation of NMR signals over several dimensions, improving signal resolution, whilst also allowing identification of new connectivities. However, these advantages come at a significant cost. The Fourier transform theorem requires acquisition of a grid of regularly spaced points to satisfy the Nyquist criterion, while frequency discrimination and acquisition of a pure phase spectrum require acquisition of both quadrature components for each time point in every indirect (non-acquisition) dimension, adding a factor of 2^(N-1) to the number of free-induction decays which must be acquired, where N is the number of dimensions. Compressed sensing (CS) ℓ1-norm minimisation in combination with non-uniform sampling (NUS) has been shown to be extremely successful in overcoming the Nyquist criterion. Previously, maximum entropy reconstruction has also been used to overcome the limitation of frequency discrimination, processing data acquired with only one quadrature component at a given time interval, known as random phase detection (RPD), allowing a factor of two reduction in the number of points for each indirect dimension (Maciejewski et al. 2011 PNAS 108 16640). However, whilst this approach can be easily applied in situations where the quadrature components are acquired as amplitude modulated data, the same principle is not easily extended to phase modulated (P-/N-type) experiments where data is acquired in the form exp(iωt) or exp(-iωt), and which make up many of the multidimensional experiments used in modern NMR. Here we demonstrate a modification of the CS ℓ1-norm approach to allow random coherence order selection (RCS) for phase modulated experiments; we generalise the nomenclature for RCS and RPD as random quadrature detection (RQD). With this method, the power of RQD can be extended

  20. Computerized stratified random site-selection approaches for design of a ground-water-quality sampling network

    USGS Publications Warehouse

    Scott, J.C.

    1990-01-01

    Computer software was written to randomly select sites for a ground-water-quality sampling network. The software uses digital cartographic techniques and subroutines from a proprietary geographic information system. The report presents the approaches, computer software, and sample applications. It is often desirable to collect ground-water-quality samples from various areas in a study region that have different values of a spatial characteristic, such as land-use or hydrogeologic setting. A stratified network can be used for testing hypotheses about relations between spatial characteristics and water quality, or for calculating statistical descriptions of water-quality data that account for variations that correspond to the spatial characteristic. In the software described, a study region is subdivided into areal subsets that have a common spatial characteristic to stratify the population into several categories from which sampling sites are selected. Different numbers of sites may be selected from each category of areal subsets. A population of potential sampling sites may be defined by either specifying a fixed population of existing sites, or by preparing an equally spaced population of potential sites. In either case, each site is identified with a single category, depending on the value of the spatial characteristic of the areal subset in which the site is located. Sites are selected from one category at a time. One of two approaches may be used to select sites. Sites may be selected randomly, or the areal subsets in the category can be grouped into cells and sites selected randomly from each cell.
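
    The core site-selection step lends itself to a few lines of code; a hedged sketch in Python rather than the report's GIS subroutines, with hypothetical category names and per-category counts.

        import random
        from collections import defaultdict

        def stratified_site_selection(sites, n_per_category, seed=42):
            """Randomly select sampling sites within each category of areal
            subsets (e.g. land-use or hydrogeologic setting)."""
            rng = random.Random(seed)
            by_cat = defaultdict(list)
            for site_id, category in sites:
                by_cat[category].append(site_id)
            return {cat: rng.sample(members, min(n_per_category.get(cat, 0), len(members)))
                    for cat, members in by_cat.items()}

        # Hypothetical wells, each tagged with a land-use category.
        tag = random.Random(7)
        sites = [(f"well-{i}", tag.choice(["urban", "agricultural", "forest"]))
                 for i in range(200)]
        print(stratified_site_selection(sites, {"urban": 5, "agricultural": 5, "forest": 3}))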

  1. Improving well-being at work: A randomized controlled intervention based on selection, optimization, and compensation.

    PubMed

    Müller, Andreas; Heiden, Barbara; Herbig, Britta; Poppe, Franziska; Angerer, Peter

    2016-04-01

    This study aimed to develop, implement, and evaluate an occupational health intervention that is based on the theoretical model of selection, optimization, and compensation (SOC). We conducted a stratified randomized controlled intervention with 70 nurses of a community hospital in Germany (94% women; mean age 43.7 years). Altogether, the training consisted of 6 sessions (16.5 hours) over a period of 9 months. The training took place in groups of 6-8 employees. Participants were familiarized with the SOC model and developed and implemented a personal project based on SOC to cope effectively with 1 important job demand or to activate a job resource. Consistent with our hypotheses, we observed a meaningful trend that the proposed SOC training enhanced mental well-being, particularly in employees with a strong commitment to the intervention. While highly committed training participants reported higher levels of job control at follow-up, the effects were not statistically significant. Additional analyses of moderation effects showed that the training is particularly effective in enhancing mental well-being when job control is low. Contrary to our assumptions, perceived work ability was not improved by the training. Our study provides first indications that SOC training might be a promising approach to occupational health and stress prevention. Moreover, it identifies critical success factors of occupational interventions based on SOC. However, additional studies are needed to corroborate the effectiveness of SOC training in occupational contexts. PMID:26322438

  2. A compilation of partial sequences of randomly selected cDNA clones from the rat incisor.

    PubMed

    Matsuki, Y; Nakashima, M; Amizuka, N; Warshawsky, H; Goltzman, D; Yamada, K M; Yamada, Y

    1995-01-01

    The formation of tooth organs is regulated by a series of developmental programs. We have initiated a genome project with the ultimate goal of identifying novel genes important for tooth development. As an initial approach, we constructed a unidirectional cDNA library from the non-calcified portion of incisors of 3- to 4-week-old rats, sequenced cDNA clones, and classified their sequences by homology searches against the GenBank database and the PIR protein database. Here, we report partial DNA sequences obtained by automated DNA sequencing of 400 cDNA clones randomly selected from the library. Of the sequences determined, 51% represented sequences of new genes that were not related to any previously reported gene. Twenty-six percent of the clones strongly matched genes and proteins in the databases, including amelogenin, alpha 1(I) and alpha 2(I) collagen chains, osteonectin, and decorin. Nine percent of clones revealed partial sequence homology to known genes such as transcription factors and cell surface receptors. A significant number of the previously identified genes were expressed redundantly and were found to encode extracellular matrix proteins. Identification and cataloging of cDNA clones in these tissues are the first step toward identification of markers expressed in a tissue- or stage-specific manner, as well as toward genetic linkage studies of tooth anomalies. Further characterization of the clones described in this paper should lead to the discovery of novel genes important for tooth development. PMID:7876422

  3. Melanocytic Hyperplasia in the Epidermis Overlying Trichoblastomas in 100 Randomly Selected Cases.

    PubMed

    Al Omoush, Tahseen M M; Michal, Michael; Konstantinova, Anastasia M; Michal, Michal; Kutzner, Heinz; Kazakov, Dmitry V

    2016-04-01

    One hundred cases of trichoblastomas (large nodular, small nodular, cribriform, lymphadenoma, and columnar) were randomly selected and studied for the presence of melanocytic hyperplasia in the epidermis overlying the tumors, which was defined as foci of increased melanocytes in the basal layer of the epidermis (more than 1 per 4 basal keratinocytes). Focal melanocytic hyperplasia was detected in a total of 22 cases of trichoblastoma (22%), and this phenomenon was most frequently seen in columnar trichoblastoma (7 cases), followed by large nodular trichoblastoma (5 cases). The mechanism of epidermal melanocytic hyperplasia overlying trichoblastoma is unclear. Ultraviolet exposure may be a contributing factor, as focal melanocytic hyperplasia was also detected in one-third of cases in the epidermis overlying uninvolved skin, usually associated with solar elastosis. This is further corroborated by the occurrence of the lesions predominantly on the face. Melanocytic hyperplasia overlying trichoblastoma appears to have no impact on the clinical appearance of the lesion and is recognized only microscopically. In an adequate biopsy specimen containing at least part of a trichoblastoma, it should not cause any diagnostic problems. PMID:26885602

  4. A National Survey of Chief Student Personnel Officers at Randomly Selected Institutions of Postsecondary Education in the United States.

    ERIC Educational Resources Information Center

    Thomas, Henry B.; Kaplan, E. Joseph

    A national survey was conducted of randomly selected chief student personnel officers as listed in the 1979 "Education Directory of Colleges and Universities." The survey addressed specific institutional demographics, policy-making authority, reporting structure, and areas of responsibility of the administrators. Over 93 percent of the respondents…

  5. Darwinian Dynamics of Intratumoral Heterogeneity: Not Solely Random Mutations but Also Variable Environmental Selection Forces.

    PubMed

    Lloyd, Mark C; Cunningham, Jessica J; Bui, Marilyn M; Gillies, Robert J; Brown, Joel S; Gatenby, Robert A

    2016-06-01

    The findings indicate that at least some of the molecular heterogeneity in cancer cells in tumors is governed by predictable regional variations in environmental selection forces, arguing against the assumption that cancer cells can evolve toward a local fitness maximum by random accumulation of mutations. Cancer Res; 76(11); 3136-44. ©2016 AACR. PMID:27009166

  6. Most Undirected Random Graphs Are Amplifiers of Selection for Birth-Death Dynamics, but Suppressors of Selection for Death-Birth Dynamics.

    PubMed

    Hindersin, Laura; Traulsen, Arne

    2015-11-01

    We analyze evolutionary dynamics on graphs, where the nodes represent individuals of a population. The links of a node describe which other individuals can be displaced by the offspring of the individual on that node. Amplifiers of selection are graphs for which the fixation probability is increased for advantageous mutants and decreased for disadvantageous mutants. A few examples of such amplifiers have been developed, but so far it is unclear how many such structures exist and how to construct them. Here, we show that almost any undirected random graph is an amplifier of selection for Birth-death updating, where an individual is selected to reproduce with probability proportional to its fitness and one of its neighbors is replaced by that offspring at random. If we instead focus on death-Birth updating, in which a random individual is removed and its neighbors compete for the empty spot, then the same ensemble of graphs consists of almost only suppressors of selection for which the fixation probability is decreased for advantageous mutants and increased for disadvantageous mutants. Thus, the impact of population structure on evolutionary dynamics is a subtle issue that will depend on seemingly minor details of the underlying evolutionary process. PMID:26544962
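
    The Birth-death update rule described above is simple to simulate; a hedged Monte Carlo sketch follows, estimating the fixation probability of a single advantageous mutant on an Erdős–Rényi graph and comparing it with the well-mixed Moran expectation. It assumes networkx is available, and all parameters are illustrative.

        import random
        import networkx as nx

        def fixation_probability(G, r=1.1, runs=2000, seed=0):
            """Estimate the fixation probability of one mutant of fitness r
            (residents have fitness 1) under Birth-death updating on graph G."""
            rng = random.Random(seed)
            nodes = list(G.nodes)
            fixed = 0
            for _ in range(runs):
                mutants = {rng.choice(nodes)}
                while 0 < len(mutants) < len(nodes):
                    # Birth: pick a reproducer proportional to fitness.
                    weights = [r if v in mutants else 1.0 for v in nodes]
                    parent = rng.choices(nodes, weights=weights)[0]
                    # death: a random neighbour is replaced by the offspring.
                    child = rng.choice(list(G.neighbors(parent)))
                    if parent in mutants:
                        mutants.add(child)
                    else:
                        mutants.discard(child)
                fixed += len(mutants) == len(nodes)
            return fixed / runs

        seed = 1
        G = nx.erdos_renyi_graph(20, 0.3, seed=seed)
        while not nx.is_connected(G):          # isolated nodes would break the update
            seed += 1
            G = nx.erdos_renyi_graph(20, 0.3, seed=seed)

        n, r = G.number_of_nodes(), 1.1
        moran = (1 - 1 / r) / (1 - r ** -n)    # well-mixed Moran baseline
        print("graph:", fixation_probability(G, r), " well-mixed:", round(moran, 4))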

  7. Model of Heat and Mass Transfer in Random Packing Layer of Powder Particles in Selective Laser Melting

    NASA Astrophysics Data System (ADS)

    Kovaleva, I.; Kovalev, O.; Smurov, I.

    A discrete-grid model of heat transfer in a granular porous medium is developed to describe the processes of selective laser melting of powders. Heat conduction in this medium takes place through the contact surfaces between the particles. A method for calculating the morphology of a randomly packed powder layer, accounting for the adhesive interaction between particles, is proposed. The internal structure of the resulting loose powder layer is a granular medium in which spherical particles of different sizes are arranged in contact with each other at random. Analytical models of the powder balling process and of the formation of the remelted track are proposed.

  8. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    The regulation describes sampling a 1-meter-square surface (sides 1 meter long) by random selection of halves: divide the area into two halves, assign each half to one face of a coin, and flip the coin to select a half (e.g., a coin flip selects the left half); then continue selecting progressively smaller halves by dividing the previously selected half and flipping again, recording the dimensions of the selected sub-area at each step.
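
    A toy sketch of that halving procedure; the stopping size and the strictly alternating split axis are illustrative assumptions, not the regulation's exact prescription.

        import random

        def select_subarea(side_cm=100.0, min_side_cm=12.5, seed=None):
            """Pick a sampling sub-area of a 1 m x 1 m square by repeatedly
            selecting one half at random (the coin flip), alternating the
            axis that is split, until both sides reach the minimum size."""
            rng = random.Random(seed)
            x0, y0, w, h = 0.0, 0.0, side_cm, side_cm
            split_vertical = True
            while max(w, h) > min_side_cm:
                if split_vertical:            # choose the left or right half
                    w /= 2
                    x0 += w if rng.random() < 0.5 else 0.0
                else:                         # choose the top or bottom half
                    h /= 2
                    y0 += h if rng.random() < 0.5 else 0.0
                split_vertical = not split_vertical
            return (x0, y0, w, h)             # offset and dimensions in cm

        print(select_subarea(seed=7))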

  9. Automatised selection of load paths to construct reduced-order models in computational damage micromechanics: from dissipation-driven random selection to Bayesian optimization

    NASA Astrophysics Data System (ADS)

    Goury, Olivier; Amsallem, David; Bordas, Stéphane Pierre Alain; Liu, Wing Kam; Kerfriden, Pierre

    2016-04-01

    In this paper, we present new reliable model order reduction strategies for computational micromechanics. The difficulties stem mainly from the high dimensionality of the parameter space represented by any load path applied onto the representative volume element. We take special care of the challenge of selecting an exhaustive snapshot set. This is treated by first using a random sampling of energy-dissipating load paths and then, in a more advanced way, using Bayesian optimization associated with an interlocked division of the parameter space. Results show that we can ensure the selection of an exhaustive snapshot set from which a reliable reduced-order model can be built.

  11. Selecting a Sample for Your Experiment: A Non-Random Stratified Sampling Approach

    ERIC Educational Resources Information Center

    Tipton, Elizabeth

    2012-01-01

    The purpose of this paper is to develop a more general method for sample recruitment in experiments that is purposive (not random) and that results in a sample that is compositionally similar to the generalization population. This work builds on Tipton et al. (2011) by offering solutions to a larger class of problems than the non-overlapping…

  12. Prevalence and Severity of College Student Bereavement Examined in a Randomly Selected Sample

    ERIC Educational Resources Information Center

    Balk, David E.; Walker, Andrea C.; Baker, Ardith

    2010-01-01

    The authors used stratified random sampling to assess the prevalence and severity of bereavement in college undergraduates, providing an advance over findings that emerge from convenience sampling methods or from anecdotal observations. Prior research using convenience sampling indicated that 22% to 30% of college students are within 12 months of…

  13. The Implications of Teacher Selection and Teacher Effects in Individually Randomized Group Treatment Trials

    ERIC Educational Resources Information Center

    Weiss, Michael J.

    2010-01-01

    Randomized experiments have become an increasingly popular design to evaluate the effectiveness of interventions in education (Spybrook, 2008). Many of the interventions evaluated in education are delivered to groups of students, rather than to individuals. Experiments designed to evaluate programs delivered at the group level often…

  14. The Jackprot Simulation Couples Mutation Rate with Natural Selection to Illustrate How Protein Evolution Is Not Random

    PubMed Central

    Espinosa, Avelina; Bai, Chunyan Y.

    2016-01-01

    Protein evolution is not a random process. Views which attribute randomness to molecular change, deleterious nature to single-gene mutations, insufficient geological time, or population size for molecular improvements to occur, or invoke “design creationism” to account for complexity in molecular structures and biological processes, are unfounded. Scientific evidence suggests that natural selection tinkers with molecular improvements by retaining adaptive peptide sequence. We used slot-machine probabilities and ion channels to show biological directionality in molecular change. Because ion channels reside in the lipid bilayer of cell membranes, their residue location must be in balance with the membrane's hydrophobic/hydrophilic nature; a selective “pore” for ion passage is located within the hydrophobic region. We contrasted the random generation of DNA sequence for KcsA, a bacterial two-transmembrane-domain (2TM) potassium channel from Streptomyces lividans, with an under-selection scenario, the “jackprot,” which predicted much faster evolution than by chance. We wrote a computer program in JAVA APPLET version 1.0 and designed an online interface, The Jackprot Simulation http://faculty.rwu.edu/cbai/JackprotSimulation.htm, to model a numerical interaction between mutation rate and natural selection during a scenario of polypeptide evolution. Winning the “jackprot,” or highest-fitness complete-peptide sequence, required cumulative smaller “wins” (rewarded by selection) at the first, second, and third positions in each of the 161 KcsA codons (“jackdons” that led to “jackacids” that led to the “jackprot”). The “jackprot” is a didactic tool to demonstrate how mutation rate coupled with natural selection suffices to explain the evolution of specialized proteins, such as the complex six-transmembrane (6TM) domain potassium, sodium, or calcium channels. Ancestral DNA sequences coding for 2TM-like proteins underwent nucleotide
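
    The core didactic point, that cumulative selection reaches a target sequence vastly faster than all-at-once chance, can be reproduced with a minimal analogue; the target string below is a placeholder, not the KcsA codons, and the per-position retention rule is a simplification of the "jackdon/jackacid" scheme.

        import random

        def cumulative_selection(target, alphabet="ACGT", seed=0):
            """Mutate unmatched positions at random each generation but retain
            matches (selection keeps each per-position 'win'); count generations."""
            rng = random.Random(seed)
            seq = [rng.choice(alphabet) for _ in target]
            generations = 0
            while "".join(seq) != target:
                generations += 1
                for i, base in enumerate(target):
                    if seq[i] != base:
                        seq[i] = rng.choice(alphabet)
            return generations

        target = "ATGGCTCTGCACTGG"   # placeholder 15-nt sequence
        print("generations with cumulative selection:", cumulative_selection(target))
        # By contrast, hitting all 15 positions at once by pure chance needs about
        # 4**15 ~ 1.07e9 draws on average.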

  15. The Effect of Basis Selection on Static and Random Acoustic Response Prediction Using a Nonlinear Modal Simulation

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Przekop, Adam

    2005-01-01

    An investigation of the effect of basis selection on geometric nonlinear response prediction using a reduced-order nonlinear modal simulation is presented. The accuracy is dictated by the selection of the basis used to determine the nonlinear modal stiffness. This study considers a suite of available bases including bending modes only, bending and membrane modes, coupled bending and companion modes, and uncoupled bending and companion modes. The nonlinear modal simulation presented is broadly applicable and is demonstrated for nonlinear quasi-static and random acoustic response of flat beam and plate structures with isotropic material properties. Reduced-order analysis predictions are compared with those made using a numerical simulation in physical degrees-of-freedom to quantify the error associated with the selected modal bases. Bending and membrane responses are separately presented to help differentiate the bases.

  16. Natural Selection VS. Random Drift: Evidence from Temporal Variation in Allele Frequencies in Nature

    PubMed Central

    Mueller, Laurence D.; Barr, Lorraine G.; Ayala, Francisco J.

    1985-01-01

    We have obtained monthly samples of two species, Drosophila pseudoobscura and Drosophila persimilis, in a natural population from Napa County, California. In each species, about 300 genes have been assayed by electrophoresis for each of seven enzyme loci in each monthly sample from March 1972 to June 1975. Using statistical methods developed for the purpose, we have examined whether the allele frequencies at different loci vary in a correlated fashion. The methods used do not detect natural selection when it is deterministic (e.g., overdominance or directional selection), but only when alleles at different loci vary simultaneously in response to the same environmental variations. Moreover, only relatively large fitness differences (of the order of 15%) are detectable. We have found strong evidence of correlated allele frequency variation in 13–20% of the cases examined. We interpret this as evidence that natural selection plays a major role in the evolution of protein polymorphisms in nature. PMID:4054608

  17. Differences between MyoD DNA binding and activation site requirements revealed by functional random sequence selection.

    PubMed Central

    Huang, J; Blackwell, T K; Kedes, L; Weintraub, H

    1996-01-01

    A method has been developed for selecting functional enhancer/promoter sites from random DNA sequences in higher eukaryotic cells. Of the sequences thus selected for transcriptional activation by the muscle-specific basic helix-loop-helix protein MyoD, only a subset is similar to the preferred in vitro binding consensus, and in the same promoter context an optimal in vitro binding site was inactive. Other sequences with full transcriptional activity instead exhibit sequence preferences that, remarkably, are generally either identical or very similar to those found in naturally occurring muscle-specific promoters. This first systematic examination of the relation between DNA binding and transcriptional activation by basic helix-loop-helix proteins indicates that binding per se is necessary but not sufficient for transcriptional activation by MyoD and implies a requirement for other DNA sequence-dependent interactions or conformations at its binding site. PMID:8668207

  18. Human IgA-binding Peptides Selected from Random Peptide Libraries

    PubMed Central

    Hatanaka, Takaaki; Ohzono, Shinji; Park, Mirae; Sakamoto, Kotaro; Tsukamoto, Shogo; Sugita, Ryohei; Ishitobi, Hiroyuki; Mori, Toshiyuki; Ito, Osamu; Sorajo, Koichi; Sugimura, Kazuhisa; Ham, Sihyun; Ito, Yuji

    2012-01-01

    The phage display system is a powerful tool for designing specific ligands for target molecules. Here, we used disulfide-constrained random peptide libraries constructed with the T7 phage display system to isolate peptides specific to human IgA. The binding clones (A1–A4) isolated by biopanning exhibited clear specificity to human IgA, but the synthetic peptide derived from the A2 clone exhibited a low specificity/affinity (Kd = 1.3 μM). Therefore, we tried to improve the peptide using a partially randomized phage display library and mutational studies on the synthetic peptides. The designed Opt-1 peptide exhibited a 39-fold higher affinity (Kd = 33 nM) than the A2 peptide. An Opt-1 peptide-conjugated column was used to purify IgA from human plasma. However, the recovered IgA fraction was contaminated with other proteins, indicating nonspecific binding. To design a peptide with increased binding specificity, we examined the structural features of Opt-1 and the Opt-1–IgA complex using all-atom molecular dynamics simulations with explicit water. The simulation results revealed that the Opt-1 peptide displayed partial helicity in the N-terminal region and possessed a hydrophobic cluster that played a significant role in tight binding with IgA-Fc. However, these hydrophobic residues of Opt-1 may contribute to nonspecific binding with other proteins. To increase binding specificity, we introduced several mutations in the hydrophobic residues of Opt-1. The resultant Opt-3 peptide exhibited high specificity and high binding affinity for IgA, leading to successful isolation of IgA without contamination. PMID:23076147

  19. Moral hazard and selection among the poor: evidence from a randomized experiment.

    PubMed

    Spenkuch, Jörg L

    2012-01-01

    Not only does economic theory predict that high-risk individuals will be more likely to purchase insurance, but insurance coverage is also thought to crowd out precautionary activities. In spite of stark theoretical predictions, there is conflicting empirical evidence on adverse selection, and evidence on ex ante moral hazard is very scarce. Using data from the Seguro Popular Experiment in Mexico, this paper documents patterns of selection on observables into health insurance as well as the existence of non-negligible ex ante moral hazard. More specifically, the findings indicate that (i) agents in poor self-assessed health prior to the intervention have, all else equal, a higher propensity to take up insurance; and (ii) insurance coverage reduces the demand for self-protection in the form of preventive care. Curiously, however, individuals do not sort based on objective measures of their health. PMID:22307034

  20. Code to generate random identifiers and select QA/QC samples

    USGS Publications Warehouse

    Mehnert, Edward

    1992-01-01

    SAMPLID is a PC-based FORTRAN-77 code which generates unique numbers for sample identification, selects QA/QC samples, and generates labels. These procedures are tedious when done by hand, but using a computer code such as SAMPLID can increase efficiency and reduce or eliminate errors and bias. The algorithm used in SAMPLID for generating pseudorandom numbers is free of statistical flaws present in commonly available algorithms.
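
    SAMPLID itself is a FORTRAN-77 program; the procedure it automates is straightforward, as this hypothetical Python sketch shows (the ID format and QA/QC fraction are illustrative, not SAMPLID's):

```python
import random

random.seed(42)  # fixed seed for a reproducible label set

n_samples, qa_fraction = 50, 0.1
sample_ids = random.sample(range(10000, 100000), n_samples)  # unique 5-digit identifiers
qa_subset = set(random.sample(sample_ids, max(1, int(qa_fraction * n_samples))))

for sid in sample_ids:
    suffix = "  <- QA/QC duplicate" if sid in qa_subset else ""
    print(f"SAMPLE-{sid}{suffix}")
```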

  1. Phenotypic evolution by distance in fluctuating environments: The contribution of dispersal, selection and random genetic drift.

    PubMed

    Engen, Steinar; Sæther, Bernt-Erik

    2016-06-01

    Here we analyze how dispersal, genetic drift, and adaptation to the local environment affect the geographical differentiation of a quantitative character under natural selection, using a spatial dynamic model for the evolution of the distribution of mean breeding values in space and time. The variation in the optimal phenotype is described by local Ornstein-Uhlenbeck processes with a given spatial autocorrelation. Selection and drift are assumed to be governed by phenotypic variation within areas with a given mean breeding value and constant additive genetic variance. Between neighboring areas there will be white-noise variation in mean breeding values, while the variation at larger distances has a spatial structure and a spatial scale that we investigate. The model is analyzed by solving balance equations for the stationary distribution of mean breeding values. We also present scaling results for the spatial autocovariance function of mean breeding values, as well as for the covariance between the mean breeding value and the optimal phenotype expressing local adaptation. Our results show in particular how these spatial scales depend on population density. For large densities, the spatial scale of fluctuations in mean breeding values has similarities with corresponding results in population dynamics, where the effect of migration on spatial scales may be large if the local strength of density regulation is small. In our evolutionary model, the strength of density regulation corresponds to the strength of local selection, so that weak local selection may produce large spatial scales of autocovariances. Genetic drift and stochastic migration are shown to act through the population size within a characteristic area with much smaller variation in optimal phenotypes than in the whole population. PMID:26855423
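
    For reference, a generic (non-spatial) Ornstein-Uhlenbeck process for a local optimum θ(t), with return rate α, long-run mean μ and noise intensity σ, can be written as below; the paper's model additionally imposes spatial autocorrelation between locations, which this one-dimensional form omits.

```latex
d\theta_t = -\alpha\,(\theta_t - \mu)\,dt + \sigma\,dB_t,
\qquad
\operatorname{Cov}(\theta_t,\theta_{t+s}) \to \frac{\sigma^2}{2\alpha}\,e^{-\alpha|s|}
\quad (t \to \infty).
```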

  2. Phenotypic Screening for Friedreich Ataxia Using Random shRNA Selection.

    PubMed

    Cotticelli, M Grazia; Acquaviva, Fabio; Xia, Shujuan; Kaur, Avinash; Wang, Yongping; Wilson, Robert B

    2015-10-01

    Friedreich ataxia (FRDA) is an autosomal recessive neuro- and cardio-degenerative disorder for which there are no proven effective treatments. FRDA is caused by decreased expression and/or function of the protein frataxin. Frataxin chaperones iron in the mitochondrial matrix and regulates the iron-sulfur cluster (ISC) assembly complex. ISCs are prosthetic groups critical for the function of the Krebs cycle and the mitochondrial electron transport chain. Decreased expression of frataxin is associated with decreased ISC assembly, mitochondrial iron accumulation, and increased oxidative stress, all of which contribute to mitochondrial dysfunction. In media with beta-hydroxybutyrate (BHB) as carbon source, primary FRDA fibroblasts grow poorly and/or lose viability over several days. We screened a random, short-hairpin-RNA (shRNA)-expressing library in primary FRDA fibroblasts and identified two shRNAs that reverse the growth/viability defect in BHB media. One of these two clones increases frataxin expression in primary FRDA fibroblasts, either as a vector-expressed shRNA or as a transfected short-interfering RNA (siRNA). PMID:26286937

  3. Benefits of Selected Physical Exercise Programs in Detention: A Randomized Controlled Study

    PubMed Central

    Battaglia, Claudia; di Cagno, Alessandra; Fiorilli, Giovanni; Giombini, Arrigo; Fagnani, Federica; Borrione, Paolo; Marchetti, Marco; Pigozzi, Fabio

    2013-01-01

    The aim of the study was to determine which kind of physical activity could be useful to inmate populations to improve their health status and fitness levels. A repeated-measures design was used to evaluate the effects of two different training protocols on subjects in a state of detention, tested pre- and post-experimental protocol. Seventy-five male subjects were enrolled in the study and randomly allocated to three groups: the cardiovascular plus resistance training protocol group (CRT) (n = 25; mean age 30.9 ± 8.9 years), the high-intensity strength training protocol group (HIST) (n = 25; mean age 33.9 ± 6.8 years), and a control group (C) (n = 25; mean age 32.9 ± 8.9 years) receiving no treatment. All subjects underwent a clinical assessment and fitness tests. MANOVA revealed significant multivariate effects of group (p < 0.01) and the group-training interaction (p < 0.05). The CRT protocol proved the most effective for achieving the best outcomes in the fitness tests. Both the CRT and HIST protocols produced significant gains in the functional capacity (cardio-respiratory capacity and decreased cardiovascular disease risk) of incarcerated males. The significant gains obtained in functional capacity reflect the great potential of supervised exercise interventions for improving the health status of incarcerated people. PMID:24185842

  4. Content analysis of a stratified random selection of JVME articles: 1974-2004.

    PubMed

    Olson, Lynne E

    2011-01-01

    A content analysis was performed on a random sample (N = 168) of 25% of the articles published in the Journal of Veterinary Medical Education (JVME) per year from 1974 through 2004. Over time, there were increased numbers of authors per paper, more cross-institutional collaborations, greater prevalence of references or endnotes, and lengthier articles, which could indicate a trend toward publications describing more complex or complete work. The number of first authors that could be identified as female was greatest for the most recent time period studied (2000-2004). Two different categorization schemes were created to assess the content of the publications. The first categorization scheme identified the most frequently published topics as admissions, descriptions of courses, the effect of changing teaching methods, issues facing the profession, and examples of uses of technology. The second categorization scheme identified the subset of articles that described medical education research on the basis of the purpose of the research, which represented only 14% of the sample articles (24 of 168). Of that group, only three of 24, or 12%, represented studies based on a firm conceptual framework that could be confirmed or refuted by the study's results. The results indicate that JVME is meeting its broadly based mission and that publications in the veterinary medical education literature have features common to publications in medicine and medical education. PMID:21805934

  5. Simple Random Sampling-Based Probe Station Selection for Fault Detection in Wireless Sensor Networks

    PubMed Central

    Huang, Rimao; Qiu, Xuesong; Rui, Lanlan

    2011-01-01

    Fault detection for wireless sensor networks (WSNs) has been studied intensively in recent years. Most existing works statically choose the manager nodes as probe stations and probe the network at a fixed frequency. This straightforward solution has several deficiencies. Firstly, assigning the fault detection task only to the manager node leaves the whole network out of balance and quickly overloads the already heavily burdened manager node, which in turn ultimately shortens the lifetime of the whole network. Secondly, probing at a fixed frequency often generates too much useless network traffic, which wastes the limited network energy. Thirdly, the traditional algorithm for choosing a probing node is too complicated to be used in energy-critical wireless sensor networks. In this paper, we study the distribution characteristics of faulty nodes in wireless sensor networks and validate the Pareto principle that a small number of clusters contain most of the faults. We then present a simple-random-sampling-based algorithm to dynamically choose sensor nodes as probe stations. A dynamic adjustment rule for the probing frequency is also proposed to reduce the number of useless probing packets. The simulation experiments demonstrate that the algorithm and adjustment rule we present can effectively prolong the lifetime of a wireless sensor network without decreasing the fault detection rate. PMID:22163789
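
    The selection step itself is plain simple random sampling over the deployed nodes. A minimal sketch with hypothetical names (the paper's algorithm additionally weights clusters by their fault history and adapts the probing frequency, which is omitted here):

```python
import random

def choose_probe_stations(clusters, k, seed=None):
    """Draw k probe stations by simple random sampling so the probing burden
    is spread across ordinary nodes instead of always falling on the manager."""
    rng = random.Random(seed)
    members = [node for group in clusters.values() for node in group]
    return rng.sample(members, min(k, len(members)))

clusters = {"c1": ["n1", "n2", "n3"], "c2": ["n4", "n5"], "c3": ["n6", "n7"]}
print(choose_probe_stations(clusters, k=3, seed=7))
```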

  6. Conflicts of Interest, Selective Inertia, and Research Malpractice in Randomized Clinical Trials: An Unholy Trinity

    PubMed Central

    Berger, Vance W.

    2014-01-01

    Recently a great deal of attention has been paid to conflicts of interest in medical research, and the Institute of Medicine has called for more research into this important area. One research question that has not received sufficient attention concerns the mechanisms of action by which conflicts of interest can result in biased and/or flawed research. What discretion do conflicted researchers have to sway the results one way or the other? We address this issue from the perspective of selective inertia, or an unnatural selection of research methods based on which are most likely to establish the preferred conclusions, rather than on which are most valid. In many cases it is abundantly clear that a method that is not being used in practice is superior to the one that is being used in practice, at least from the perspective of validity, and that it is only inertia, as opposed to any serious suggestion that the incumbent method is superior (or even comparable), that keeps the inferior procedure in use, to the exclusion of the superior one. By focusing on these flawed research methods we can go beyond statements of potential harm from real conflicts of interest, and can more directly assess actual (not potential) harm. PMID:25150846

  7. Predicting the continuum between corridors and barriers to animal movements using Step Selection Functions and Randomized Shortest Paths.

    PubMed

    Panzacchi, Manuela; Van Moorter, Bram; Strand, Olav; Saerens, Marco; Kivimäki, Ilkka; St Clair, Colleen C; Herfindal, Ivar; Boitani, Luigi

    2016-01-01

    The loss, fragmentation and degradation of habitat everywhere on Earth prompt increasing attention to identifying landscape features that support animal movement (corridors) or impede it (barriers). Most algorithms used to predict corridors assume that animals move through preferred habitat either optimally (e.g. least-cost path) or as random walkers (e.g. current models), but neither extreme is realistic. We propose that corridors and barriers are two sides of the same coin and that animals experience landscapes as spatiotemporally dynamic corridor-barrier continua connecting (separating) functional areas where individuals fulfil specific ecological processes. Based on this conceptual framework, we propose a novel methodological approach that uses high-resolution individual-based movement data to predict corridor-barrier continua with increased realism. Our approach consists of two innovations. First, we use step selection functions (SSF) to predict friction maps quantifying corridor-barrier continua for tactical steps between consecutive locations. Secondly, we introduce to movement ecology the randomized shortest path algorithm (RSP), which operates on friction maps to predict the corridor-barrier continuum for strategic movements between functional areas. By modulating the parameter θ, which controls the trade-off between exploration and optimal exploitation of the environment, RSP bridges the gap between algorithms assuming optimal movement (as θ approaches infinity, RSP converges to the least-cost path) and random walks (as θ → 0, RSP converges to current models). Using this approach, we identify migration corridors for GPS-monitored wild reindeer (Rangifer t. tarandus) in Norway. We demonstrate that reindeer movement is best predicted by an intermediate value of θ, indicative of a movement trade-off between optimization and exploration. Model calibration allows identification of a corridor-barrier continuum that closely fits empirical data and demonstrates that RSP
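
    The θ-interpolation can be made concrete with the standard randomized-shortest-paths construction, in which each path is weighted by a Boltzmann factor on its cumulative cost. A minimal sketch, assuming a row-stochastic reference transition matrix P_ref (the unconstrained random walk) and a nonnegative step-cost matrix C; the full RSP machinery (e.g. making the destination absorbing) is omitted, and this is illustrative rather than the authors' implementation:

```python
import numpy as np

def rsp_fundamental_matrix(P_ref, C, theta):
    """W weights each step of the reference walk by exp(-theta * cost);
    Z = (I - W)^{-1} then sums Boltzmann-weighted contributions of all paths.
    theta -> 0 recovers the random walk (current-flow behaviour), while large
    theta concentrates the walk on least-cost paths."""
    W = P_ref * np.exp(-theta * C)
    return np.linalg.inv(np.eye(P_ref.shape[0]) - W)

# Toy 3-node landscape: moving directly between nodes 0 and 1 is costly (3.0),
# while detouring through node 2 is cheap (1.0 per step).
P_ref = np.array([[0.0, 0.5, 0.5], [0.5, 0.0, 0.5], [0.5, 0.5, 0.0]])
C = np.array([[0.0, 3.0, 1.0], [3.0, 0.0, 1.0], [1.0, 1.0, 0.0]])
for theta in (0.01, 1.0, 10.0):
    Z = rsp_fundamental_matrix(P_ref, C, theta)
    print(theta, Z[0, 1] / Z[0, 0])  # relative weight of reaching node 1 from node 0
```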

  8. The Effect of Basis Selection on Thermal-Acoustic Random Response Prediction Using Nonlinear Modal Simulation

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Przekop, Adam

    2004-01-01

    The goal of this investigation is to further develop nonlinear modal numerical simulation methods for prediction of geometrically nonlinear response due to combined thermal-acoustic loadings. As with any such method, the accuracy of the solution is dictated by the selection of the modal basis, through which the nonlinear modal stiffness is determined. In this study, a suite of available bases is considered, including (i) bending modes only; (ii) coupled bending and companion modes; (iii) uncoupled bending and companion modes; and (iv) bending and membrane modes. Comparison of these solutions with numerical simulation in physical degrees-of-freedom indicates that inclusion of any of the membrane mode variants (ii-iv) in the basis affects the bending displacement and stress response predictions. The most significant effect is on the membrane displacement, where it is shown that only the type (iv) basis accurately predicts its behavior. Results are presented for beam and plate structures in the thermally pre-buckled regime.

  9. Pregnancy is not a risk factor for gallstone disease: Results of a randomly selected population sample

    PubMed Central

    Walcher, Thomas; Haenle, Mark Martin; Kron, Martina; Hay, Birgit; Mason, Richard Andrew; von Schmiesing, Alexa Friederike Alice; Imhof, Armin; Koenig, Wolfgang; Kern, Peter; Boehm, Bernhard Otto; Kratzer, Wolfgang

    2005-01-01

    AIM: To investigate the prevalence of, risk factors for, and selection of the study population for cholecystolithiasis in an urban population in Germany, in relation to our own findings and to the results in the international literature. METHODS: A total of 2,147 persons (1,111 females, age 42.8 ± 12.7 years; 1,036 males, age 42.3 ± 13.1 years) participating in an investigation on the prevalence of Echinococcus multilocularis were studied for risk factors and prevalence of gallbladder stone disease. Risk factors were assessed by means of a standardized interview and calculation of body mass index (BMI). A diagnostic ultrasound examination of the gallbladder was performed. Data were analyzed by multiple logistic regression, using the SAS statistical software package. RESULTS: Gallbladder stones were detected in 171 study participants (8.0%, n = 2,147). Risk factors for the development of gallbladder stone disease included age, sex, BMI, and positive family history. In a separate analysis of female study participants, pregnancy (yes/no) and number of pregnancies did not exert any influence. CONCLUSION: Findings of the present study confirm that age, female sex, BMI, and positive family history are risk factors for the development of gallbladder stone disease. Pregnancy and the number of pregnancies, however, could not be shown to be risk factors. There seem to be no differences in the respective prevalence of gallbladder stone disease between urban and rural populations. PMID:16425387

  10. Variable selection in covariate dependent random partition models: an application to urinary tract infection.

    PubMed

    Barcella, William; Iorio, Maria De; Baio, Gianluca; Malone-Lee, James

    2016-04-15

    Lower urinary tract symptoms can indicate the presence of urinary tract infection (UTI), a condition that, if it becomes chronic, requires expensive and time-consuming care and leads to reduced quality of life. Detecting the presence and severity of an infection from the earliest symptoms is therefore highly valuable. Typically, the white blood cell (WBC) count measured in a sample of urine is used to assess UTI. We consider clinical data from 1,341 patients at their first visit, at which UTI (i.e. WBC ≥ 1) was diagnosed. In addition, for each patient, a clinical profile of 34 symptoms was recorded. In this paper, we propose a Bayesian nonparametric regression model based on the Dirichlet process prior, aimed at providing the clinicians with a meaningful clustering of the patients based on both the WBC (response variable) and possible patterns within the symptom profiles (covariates). This is achieved by assuming a probability model for the symptoms as well as for the response variable. To identify the symptoms most associated with UTI, we specify a spike-and-slab base measure for the regression coefficients: this induces dependence of symptom selection on cluster assignment. Posterior inference is performed through Markov chain Monte Carlo methods. PMID:26536840
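
    In its generic form (the paper embeds it in the covariate-dependent random partition model, which is not reproduced here), a spike-and-slab prior on a regression coefficient mixes a point mass at zero with a diffuse slab:

```latex
\beta_j \mid \gamma_j \;\sim\; (1-\gamma_j)\,\delta_0 \;+\; \gamma_j\,\mathcal{N}(0,\tau^2),
\qquad
\gamma_j \sim \mathrm{Bernoulli}(\pi), \quad j = 1,\dots,34,
```

    so that γ_j = 1 flags symptom j as relevant to the WBC response within a cluster.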

  11. A Laser Technique for State-Selected Time-of-Flight Analysis by Pseudo-Random Modulation

    NASA Astrophysics Data System (ADS)

    Baba, Hiroshi; Horiguchi, Hiroyuki; Kondo, Masamichi; Sakurai, Katsumi; Tsuchiya, Soji

    1983-11-01

    A laser technique has been developed for the time-of-flight (TOF) analysis of state-selected atomic or molecular beams. The technique was applied to TOF measurements of Na atoms seeded in supersonic rare gas beams. When a dye laser excited Na atoms in one of the hyperfine levels of the 3²S₁/₂ state, this level was completely depopulated as a result of the optical pumping effect. This depopulation could be detected at a downstream position by the same laser light, since the optically pumped atoms were transparent, and thus the TOF spectrum could be derived by taking the time correlation between the pseudo-randomly modulated pump laser light and the depopulation detected by LIF. A preliminary scattering experiment of Na by CO₂ and SF₆ was carried out to confirm the effectiveness of this method.
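
    The recovery step is a cross-correlation: a pseudo-random gating sequence has a nearly delta-shaped autocorrelation, so correlating the detected signal with the sequence returns the TOF distribution. A numerical sketch with illustrative values (not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 512
gate = rng.integers(0, 2, n).astype(float)               # pseudo-random on/off modulation
gate -= gate.mean()                                       # remove DC so autocorrelation ~ delta
tof = np.exp(-0.5 * ((np.arange(n) - 120) / 10.0) ** 2)   # "true" time-of-flight profile

signal = np.real(np.fft.ifft(np.fft.fft(gate) * np.fft.fft(tof)))  # detector output (circular convolution)
recovered = np.real(np.fft.ifft(np.fft.fft(signal) * np.conj(np.fft.fft(gate))))

print(np.argmax(recovered), np.argmax(tof))               # both peak near channel 120
```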

  12. Desquamated epithelial cells covered with a polymicrobial biofilm typical for bacterial vaginosis are present in randomly selected cryopreserved donor semen.

    PubMed

    Swidsinski, Alexander; Dörffel, Yvonne; Loening-Baucke, Vera; Mendling, Werner; Verstraelen, Hans; Dieterle, Stefan; Schilling, Johannes

    2010-08-01

    We tested whether the bacterial biofilm typical for bacterial vaginosis (BV) can be found on desquamated epithelial cells in cryopreserved donor semen. Bacteria were detected by fluorescence in situ hybridization (FISH). The bacterial biofilm covering the epithelial layer in vaginal biopsies of 20 women with BV was evaluated on desquamated epithelial cells found in the urine of these same women and their male partners (N=20) and compared with the bacterial biofilm found on desquamated epithelial cells in randomly selected cryopreserved semen samples (N=20). Urine from 20 healthy women of laboratory and clinic personnel and urine from their partners were used as controls. Desquamated epithelial cells covered with a polymicrobial Gardnerella biofilm were identified in urine samples from all women with BV and from 13 of their male partners, and in none of the female controls or their partners. A Gardnerella biofilm typical for BV was found in the semen of three of the 20 donors. Donor semen might be a vector for BV. PMID:20497224

  13. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness

    PubMed Central

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia’s marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy, and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods, namely variable importance (VI), averaged variable importance (AVI), knowledge-informed AVI (KIAVI), Boruta and regularized RF (RRF), were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to ‘small p and large n’ problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and
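
    The importance-driven selection loop behind VI/AVI-style methods can be sketched with scikit-learn as a generic backward elimination on averaged importances. This is an illustration of the idea only (KIAVI's knowledge-informed step and the paper's exact accuracy criteria are more involved), and all names and settings here are assumptions:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def avi_backward_selection(X, y, feature_names, min_features=2, n_repeats=5):
    """Repeatedly fit RFs, average variable importance over repeats (AVI),
    drop the least important predictor, and keep the feature set with the
    best cross-validated accuracy."""
    features = list(feature_names)
    best = (0.0, list(features))
    while len(features) >= min_features:
        cols = [feature_names.index(f) for f in features]
        imp = np.zeros(len(cols))
        for seed in range(n_repeats):
            rf = RandomForestClassifier(n_estimators=200, random_state=seed)
            rf.fit(X[:, cols], y)
            imp += rf.feature_importances_
        score = cross_val_score(RandomForestClassifier(n_estimators=200, random_state=0),
                                X[:, cols], y, cv=5).mean()
        if score > best[0]:
            best = (score, list(features))
        features.pop(int(np.argmin(imp)))  # drop the least important predictor
    return best  # (accuracy, selected feature subset)
```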

  15. Does pulmonary rehabilitation work in clinical practice? A review on selection and dropout in randomized controlled trials on pulmonary rehabilitation

    PubMed Central

    Bjoernshave, Bodil; Korsgaard, Jens; Nielsen, Claus Vinther

    2010-01-01

    Aim: To analyze randomized controlled trials (RCTs) on pulmonary rehabilitation (PR) to determine whether the patients who complete PR form a representative subset of the chronic obstructive pulmonary disease (COPD) target population, and to discuss what impact this may have for the generalizability and implementation of PR in practice. Material and methods: A review of 26 RCTs included in a 2007 Cochrane review. We analyzed the selection at three different levels: 1) sampling; 2) inclusion and exclusion; and 3) dropout. Results: Of the 26 studies, only 3 (12%) described the sampling, i.e. the number of patients contacted. In these studies 28% completed PR. In all, we found that 75% of the patients suitable for PR programs were omitted due to sampling, exclusion and dropout. Most of the study populations are thus not representative of the target population. Conclusion: The RCTs selected for the Cochrane review gave sparse information about the sampling procedure. The demand for high internal validity in studies on PR reduced their external validity. The patients completing PR programs in RCTs were not drawn from a representative subset of the target population. The ability to draw conclusions relevant to clinical practice from the results of the RCTs on PR is thereby impaired. PMID:20865106

  16. Polarimetric SAR decomposition parameter subset selection and their optimal dynamic range evaluation for urban area classification using Random Forest

    NASA Astrophysics Data System (ADS)

    Hariharan, Siddharth; Tirodkar, Siddhesh; Bhattacharya, Avik

    2016-02-01

    Urban area classification is important for monitoring ever-increasing urbanization and studying its environmental impact. Two L-band (wavelength: 23 cm) UAVSAR datasets from NASA JPL were used in this study for urban area classification. The two datasets differ in terms of urban area structures, building patterns, and the geometric shapes and sizes of the buildings. In these datasets, some urban areas appear oriented about the radar line of sight (LOS) while other areas appear non-oriented. In this study, roll-invariant polarimetric SAR decomposition parameters were used to classify these urban areas. Random Forest (RF), an ensemble decision tree learning technique, was used in this study. RF performs parameter subset selection as part of its classification procedure. In this study, parameter subsets were obtained and analyzed to infer scattering mechanisms useful for urban area classification. The Cloude-Pottier α, the Touzi dominant scattering amplitude αs1, and the anisotropy A were among the top six important parameters selected for both datasets. However, these parameters were ranked differently for the two datasets. The urban area classification using RF was compared with the Support Vector Machine (SVM) and the Maximum Likelihood Classifier (MLC) for both datasets. RF outperforms SVM by 4% and MLC by 12% on Dataset 1. It also outperforms SVM and MLC by 3.5% and 11%, respectively, on Dataset 2.

  17. Genome-wide association data classification and SNPs selection using two-stage quality-based Random Forests

    PubMed Central

    2015-01-01

    Background Single-nucleotide polymorphism (SNP) selection and identification are the most important tasks in genome-wide association data analysis. The problem is difficult because genome-wide association data are very high dimensional and a large portion of the SNPs in the data are irrelevant to the disease. Advanced machine learning methods have been used successfully in genome-wide association studies (GWAS) for identification of genetic variants that have relatively big effects in some common, complex diseases. Among them, the most successful one is Random Forests (RF). Despite performing well in terms of prediction accuracy on some data sets of moderate size, RF still struggles in GWAS settings with selecting informative SNPs and building accurate prediction models. In this paper, we propose a new two-stage quality-based sampling method in random forests, named ts-RF, for SNP subspace selection in GWAS. The method first applies a p-value assessment to find a cut-off point that separates the SNPs into informative and irrelevant groups. The informative group is further divided into two sub-groups: highly informative and weakly informative SNPs. When sampling the SNP subspace for building the trees of the forest, only SNPs from these two sub-groups are taken into account, so the feature subspaces always contain highly informative SNPs when used to split a node of a tree. Results This approach enables one to generate more accurate trees with a lower prediction error, meanwhile possibly avoiding overfitting. It allows one to detect interactions of multiple SNPs with the diseases, and to reduce the dimensionality and the amount of genome-wide association data needed for learning the RF model. Extensive experiments on two genome-wide SNP data sets (Parkinson case-control data comprising 408,803 SNPs and Alzheimer case-control data comprising 380,157 SNPs) and 10 gene data sets have demonstrated that the proposed model significantly reduced prediction
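
    A minimal sketch of the two-stage sampling idea, with an illustrative p-value cutoff and group split (the paper's exact thresholds and weighting are not reproduced here):

```python
import numpy as np

def ts_rf_subspace(pvalues, mtry, cutoff=1e-3, top_frac=0.5, rng=None):
    """Stage 1: a p-value cutoff separates informative from irrelevant SNPs.
    Stage 2: informative SNPs are split into highly and weakly informative
    groups, and each tree-node subspace of size mtry is drawn from those two
    groups only, so it always contains some highly informative SNPs."""
    rng = rng or np.random.default_rng(0)
    idx = np.where(pvalues < cutoff)[0]
    idx = idx[np.argsort(pvalues[idx])]            # informative SNPs, strongest first
    k = max(1, int(top_frac * len(idx)))
    strong, weak = idx[:k], idx[k:]
    n_strong = min(len(strong), max(1, mtry // 2))
    picks = list(rng.choice(strong, n_strong, replace=False)) if n_strong else []
    n_weak = min(len(weak), mtry - n_strong)
    if n_weak > 0:
        picks += list(rng.choice(weak, n_weak, replace=False))
    return np.asarray(picks)

toy_pvalues = np.random.default_rng(1).uniform(size=100000) ** 3  # skewed toward 0
print(ts_rf_subspace(toy_pvalues, mtry=50)[:10])
```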

  18. Cross-match-compatible platelets improve corrected count increments in patients who are refractory to randomly selected platelets

    PubMed Central

    Elhence, Priti; Chaudhary, Rajendra K.; Nityanand, Soniya

    2014-01-01

    Background Cross-match-compatible platelets are used for the management of thrombocytopenic patients who are refractory to transfusions of randomly selected platelets. Data supporting the effectiveness of platelets that are compatible according to cross-matching with a modified antigen capture enzyme-linked immunosorbent assay (MAC-ELISA or MACE) are limited. This study aimed to determine the effectiveness of cross-match-compatible platelets in an unselected group of refractory patients. Materials and methods One hundred ABO-compatible single donor platelet transfusions given to 31 refractory patients were studied. Patients were defined as refractory if their 24-hour corrected count increment (CCI) was <5×10⁹/L following two consecutive platelet transfusions. Platelets were cross-matched by MACE and the CCI was determined to monitor the effectiveness of platelet transfusions. Results The clinical sensitivity, specificity, positive predictive value and negative predictive value of the MACE-cross-matched platelets for post-transfusion CCI were 88%, 54.6%, 39.3% and 93.2%, respectively. The difference between adequate and inadequate post-transfusion 24-hour CCI for MACE cross-match-compatible vs incompatible single donor platelet transfusions was statistically significant (p < 0.001). The 24-hour CCI (mean±SD) was significantly higher for cross-match-compatible platelets (9,250±026.6) than for incompatible ones (6,757.94±2,656.5) (p<0.0001). Most of the incompatible cross-matches (73.2%) were due to anti-HLA antibodies, alone (55.3% of cases) or together with anti-platelet glycoprotein antibodies (17.9%). Discussion The clinical sensitivity and negative predictive value of platelet cross-matching by MACE were high in this study and such tests may, therefore, be used to select compatible platelets for refractory patients. A high negative predictive value demonstrates the greater chance of an adequate response with cross-matched-compatible platelets. PMID
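
    For reference, the corrected count increment used throughout is conventionally defined as follows (standard definition; the abstract itself does not restate it):

```latex
\mathrm{CCI}
= \frac{\bigl(\text{post-transfusion count} - \text{pre-transfusion count}\bigr)\,[\mu\mathrm{L}^{-1}]
        \times \text{body surface area}\,[\mathrm{m}^2]}
       {\text{number of platelets transfused} / 10^{11}}
```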

  19. Prevalence of respiratory diseases and their association with growth rate and space in randomly selected swine herds.

    PubMed Central

    Wilson, M R; Takov, R; Friendship, R M; Martin, S W; McMillan, I; Hacker, R R; Swaminathan, S

    1986-01-01

    The prevalence and extent of respiratory tract lesions were measured in 1425 pigs from 27 randomly selected herds in the summer of 1982 and winter of 1983. About 75% of pigs had lesions of enzootic pneumonia, approximately 60% had atrophic rhinitis and approximately 11% had pleuropneumonia and/or pleuritis. Individual pig growth rate was recorded on two of the farms, and it was found that the correlations between growth rate and severity of enzootic pneumonia lesions were positive on one farm and negative on the other. Negative correlations between severity of turbinate atrophy and growth rate existed in one of the two herds. Extent of pneumonia and severity of turbinate atrophy were poorly related in individual pigs but herd averages were moderately and positively correlated. Prevalence of diffuse pleuritis and of pleuropneumonia were positively related, as were the extent of pneumonia and prevalence of localized pleuritis. Prevalence of pleuropneumonia was strongly correlated with increased days-to-market. A method of estimating the average days-to-market using weekly herd data (inventory) was developed. PMID:3756676

  20. A preliminary investigation of the jack-bean urease inhibition by randomly selected traditionally used herbal medicine.

    PubMed

    Biglar, Mahmood; Soltani, Khadijeh; Nabati, Farzaneh; Bazl, Roya; Mojab, Faraz; Amanlou, Massoud

    2012-01-01

    Helicobacter pylori (H. pylori) infection leads to different clinical and pathological outcomes in humans, including chronic gastritis, peptic ulcer disease, gastric neoplasia and even gastric cancer, and its eradication depends upon multi-drug therapy. The most effective therapy is still unknown, which prompts great efforts to find better and more modern natural or synthetic anti-H. pylori agents. In this report 21 randomly selected herbal methanolic extracts were evaluated for their inhibition of jack-bean urease, using the indophenol method as described by Weatherburn. The inhibition potency was measured by UV spectroscopy at 630 nm, which corresponds to the released ammonium. Among these extracts, five showed potent inhibitory activities with IC50 values in the range of 18-35 μg/mL. These plants are Matricaria disciforme (IC50: 35 μg/mL), Nasturtium officinale (IC50: 18 μg/mL), Punica granatum (IC50: 30 μg/mL), Camellia sinensis (IC50: 35 μg/mL), and Citrus aurantifolia (IC50: 28 μg/mL). PMID:24250509

  1. Acute Hemodynamic Effects of a Selective Serotonin Reuptake Inhibitor in Postural Tachycardia Syndrome: A Randomized, Crossover Trial

    PubMed Central

    Mar, Philip L; Raj, Vidya; Black, Bonnie K; Biaggioni, Italo; Shibao, Cyndya A; Paranjape, Sachin Y; Dupont, William D; Robertson, David; Raj, Satish R

    2014-01-01

    Background Selective serotonin reuptake inhibitors (SSRIs) are often prescribed in patients with postural tachycardia syndrome (POTS), and act at synaptic terminals to increase monoamine neurotransmitters. We hypothesized that they act to increase blood pressure (BP) and attenuate reflex tachycardia, thereby improving symptoms. Acute hemodynamic profiles after SSRI administration in POTS patients have not previously been reported. Methods Patients with POTS (n=39; F=37; age 39 ± 9 years) underwent a randomized crossover trial with sertraline 50 mg and placebo. Heart rate (HR) and systolic, diastolic, and mean BP were measured with the patient seated and standing for 10 minutes prior to drug or placebo administration, and then hourly for 4 hours. The primary endpoint was standing HR at 4 hours. Results At 4 hours, standing HR and systolic BP were not significantly different between sertraline and placebo. Seated systolic (106±12 mmHg vs. 101±8 mmHg; P=0.041), diastolic (72±8 mmHg vs. 69±8 mmHg; P=0.022), and mean BP (86±9 mmHg vs. 81±9 mmHg; P=0.007) were significantly higher after sertraline administration than placebo. At 4 hours, symptoms were worse with sertraline than placebo. Conclusions Sertraline had a modest pressor effect in POTS patients, but this did not translate into a reduced HR or improved symptoms. PMID:24227635

  3. Enumeration of Escherichia coli cells on chicken carcasses as a potential measure of microbial process control in a random selection of slaughter establishments in the United States

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The purpose of this study was to evaluate whether the measurement of Escherichia coli levels at two points during the chicken slaughter process has utility as a measure of quality control. A one year long survey was conducted during 2004 and 2005 in 20 randomly selected United States chicken slaught...

  4. Remission With Venlafaxine Extended Release or Selective Serotonin Reuptake Inhibitors in Depressed Patients: A Randomized, Open-Label Study

    PubMed Central

    Thase, Michael E.; Ninan, Philip T.; Musgnung, Jeff J.; Trivedi, Madhukar H.

    2011-01-01

    Background: This randomized, open-label, rater-blinded, multicenter study compared treatment outcomes with the serotonin-norepinephrine reuptake inhibitor (SNRI) venlafaxine extended release (ER) with selective serotonin reuptake inhibitors (SSRIs) in primary care patients with major depressive disorder. Method: Study data were collected from November 29, 2000, to March 4, 2003. Outpatients who met diagnostic criteria for major depressive disorder according to the Mental Health Screener, a computer-administered telephone interview program that screens for the most common mental disorders, and had a total score on the 17-item Hamilton Depression Rating Scale (HDRS17) ≥ 20 were randomly assigned to receive up to 6 months of open-label venlafaxine ER 75−225 mg/d (n = 688) or an SSRI (n = 697): fluoxetine 20−80 mg/d, paroxetine 20−50 mg/d, citalopram 20−40 mg/d, or sertraline 50−200 mg/d. The primary outcome was remission (HDRS17 score ≤ 7) at study end point, using the last-observation-carried-forward method to account for early termination. A mixed-effects model for repeated measures (MMRM) analysis evaluated secondary outcome measures. Results: Fifty-one percent of patients completed the study. Month 6 remission rates did not differ significantly for venlafaxine ER and the SSRIs (35.5% vs 32.0%, respectively; P = .195). The MMRM analysis of HDRS17 scores likewise showed no significant difference (P = .0538). Significant treatment effects favoring the venlafaxine ER group were observed for remission rates at days 30, 60, 90, and 135, in a survival analysis of time to remission (P = .006), and on the Clinical Global Impressions-severity of illness scale (P = .0002); the Hospital Anxiety and Depression Scale-Anxiety subscale (P = .03); the 6-item Hamilton Depression Rating Scale, Bech version (P = .009); and the Quick Inventory of Depressive Symptomatology–Self-Report (P = .0003). Conclusions: Remission rates for patients treated with venlafaxine ER or an SSRI did not

  5. Estimating the efficacy of Alcoholics Anonymous without self-selection bias: An instrumental variables re-analysis of randomized clinical trials

    PubMed Central

    Humphreys, Keith; Blodgett, Janet C.; Wagner, Todd H.

    2014-01-01

    Background Observational studies of Alcoholics Anonymous’ (AA) effectiveness are vulnerable to self-selection bias because individuals choose whether or not to attend AA. The present study therefore employed an innovative statistical technique to derive a selection bias-free estimate of AA’s impact. Methods Six datasets from 5 National Institutes of Health-funded randomized trials (one with two independent parallel arms) of AA facilitation interventions were analyzed using instrumental variables models. Alcohol dependent individuals in one of the datasets (n = 774) were analyzed separately from the rest of sample (n = 1582 individuals pooled from 5 datasets) because of heterogeneity in sample parameters. Randomization itself was used as the instrumental variable. Results Randomization was a good instrument in both samples, effectively predicting increased AA attendance that could not be attributed to self-selection. In five of the six data sets, which were pooled for analysis, increased AA attendance that was attributable to randomization (i.e., free of self-selection bias) was effective at increasing days of abstinence at 3-month (B = .38, p = .001) and 15-month (B = 0.42, p = .04) follow-up. However, in the remaining dataset, in which pre-existing AA attendance was much higher, further increases in AA involvement caused by the randomly assigned facilitation intervention did not affect drinking outcome. Conclusions For most individuals seeking help for alcohol problems, increasing AA attendance leads to short and long term decreases in alcohol consumption that cannot be attributed to self-selection. However, for populations with high pre-existing AA involvement, further increases in AA attendance may have little impact. PMID:25421504
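
    The statistical move is standard instrumental-variables estimation: randomization shifts AA attendance but is, by construction, unrelated to self-selection, so it can serve as the instrument. A minimal two-stage least squares sketch with hypothetical variable names (the published models are more elaborate):

```python
import numpy as np

def two_stage_least_squares(z, x, y):
    """2SLS with one instrument: z = randomization arm (0/1), x = AA attendance,
    y = drinking outcome. Stage 1 projects x onto z; stage 2 regresses y on the
    projected attendance, so the estimate uses only the variation in attendance
    induced by randomization, which is free of self-selection."""
    Z = np.column_stack([np.ones_like(z, dtype=float), z])
    x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]   # stage 1 fitted values
    X = np.column_stack([np.ones_like(x_hat), x_hat])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]        # stage 2
    return beta[1]                                     # effect of attendance on outcome

rng = np.random.default_rng(0)
n = 2000
confound = rng.normal(size=n)                  # unobserved motivation (self-selection)
z = rng.integers(0, 2, n)                      # randomized AA facilitation
x = 2.0 * z + confound + rng.normal(size=n)    # attendance rises with both
y = 0.5 * x - confound + rng.normal(size=n)    # true effect of attendance is 0.5
print(two_stage_least_squares(z, x, y))        # near 0.5; naive OLS would be biased
```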

  6. Water chemistry in 179 randomly selected Swedish headwater streams related to forest production, clear-felling and climate.

    PubMed

    Löfgren, Stefan; Fröberg, Mats; Yu, Jun; Nisell, Jakob; Ranneby, Bo

    2014-12-01

    From a policy perspective, it is important to understand the effects of forestry on surface waters at the landscape scale. The EU Water Framework Directive demands remedial actions if good ecological status is not achieved. In Sweden, 44% of the surface water bodies have moderate ecological status or worse. Many of these drain catchments with a mosaic of managed forests. It is important for the forestry sector and water authorities to be able to identify where, in the forested landscape, special precautions are necessary. The aim of this study was to quantify the relations between forestry parameters and headwater stream concentrations of nutrients, organic matter and acid-base chemistry. The results are put into the context of regional climate, sulphur and nitrogen deposition, as well as marine influences. Water chemistry was measured in 179 randomly selected headwater streams from two regions in southwest and central Sweden, corresponding to 10% of the Swedish land area. Forest status was determined from satellite images and Swedish National Forest Inventory data using the probabilistic classifier method, which was used to model stream water chemistry with Bayesian model averaging. The results indicate that concentrations of e.g. nitrogen, phosphorus and organic matter are related to factors associated with forest production, but that it is not forestry per se that causes the excess losses. Instead, factors simultaneously affecting forest production and stream water chemistry, such as climate, extensive soil pools and nitrogen deposition, are the most likely candidates. The relationships with clear-felled and wetland areas are likely to be direct effects. PMID:25260924

  7. The Ecological Effects of Universal and Selective Violence Prevention Programs for Middle School Students: A Randomized Trial

    ERIC Educational Resources Information Center

    Simon, Thomas R.; Ikeda, Robin M.; Smith, Emilie Phillips; Reese, Le'Roy E.; Rabiner, David L.; Miller, Shari; Winn, Donna-Marie; Dodge, Kenneth A.; Asher, Steven R.; Horne, Arthur M.; Orpinas, Pamela; Martin, Roy; Quinn, William H.; Tolan, Patrick H.; Gorman-Smith, Deborah; Henry, David B.; Gay, Franklin N.; Schoeny, Michael; Farrell, Albert D.; Meyer, Aleta L.; Sullivan, Terri N.; Allison, Kevin W.

    2009-01-01

    This study reports the findings of a multisite randomized trial evaluating the separate and combined effects of 2 school-based approaches to reduce violence among early adolescents. A total of 37 schools at 4 sites were randomized to 4 conditions: (1) a universal intervention that involved implementing a student curriculum and teacher training…

  8. Age- and sex-related reference ranges for eight plasma constituents derived from randomly selected adults in a Scottish new town.

    PubMed Central

    Gardner, M D; Scott, R

    1980-01-01

    The results of analysis of blood specimens from randomly selected adults aged 19-88 years in the new town of Cumbernauld were used to establish age- and sex-related reference ranges by the centile method (central 95%) for plasma calcium, phosphate, total protein, albumin, globulins, urea, creatinine, and urate. The possible existence of a subpopulation with a higher reference range for urea is mooted. PMID:7400337

  9. Selection of a potential diagnostic biomarker for HIV infection from a random library of non-biological synthetic peptoid oligomers.

    PubMed

    Gearhart, Tricia L; Montelaro, Ronald C; Schurdak, Mark E; Pilcher, Chris D; Rinaldo, Charles R; Kodadek, Thomas; Park, Yongseok; Islam, Kazi; Yurko, Raymond; Marques, Ernesto T A; Burke, Donald S

    2016-08-01

    Non-biological synthetic oligomers can serve as ligands for antibodies. We hypothesized that a random combinatorial library of synthetic poly-N-substituted glycine oligomers, or peptoids, could represent a random "shape library" in antigen space, and that some of these peptoids would be recognized by the antigen-binding pocket of disease-specific antibodies. We synthesized and screened a one-bead-one-compound combinatorial library of peptoids, in which each bead displayed an 8-mer peptoid with ten possible different amines at each position (10⁸ theoretical variants). By screening one million peptoid/beads we found 112 (approximately 1 in 10,000) that preferentially bound immunoglobulins from human sera known to be positive for anti-HIV antibodies. Reactive peptoids were then re-synthesized and rigorously evaluated in plate-based ELISAs. Four peptoids showed very good, and one showed excellent, properties for establishing a sero-diagnosis of HIV. These results demonstrate the feasibility of constructing sero-diagnostic assays for infectious diseases from libraries of random molecular shapes. In this study we sought a proof-of-principle that we could identify a potential diagnostic antibody ligand biomarker for an infectious disease in a random combinatorial library of 100 million peptoids. We believe that this is the first evidence that it is possible to develop sero-diagnostic assays - for any infectious disease - based on screening random libraries of non-biological molecular shapes. PMID:27182050

  10. The effect of random field errors on the radiation spectra of selected APS (Advanced Photon Source) undulators

    SciTech Connect

    Alp, E.E.; Viccaro, P.J.

    1987-08-01

    The effect of random magnetic field errors is introduced into the calculations of the spectral characteristics of tunable undulators for the proposed 7 GeV Advanced Photon Source (APS). Single-electron calculations are made for an undulator with first harmonic radiation tunable between 3.5 and 13 keV. Using the universal curves developed by Kincaid, the effect of randomly distributed field errors on the first and third harmonics of two proposed typical undulators is calculated. It is found that limiting field errors to 0.5% is more than sufficient for the successful operation of the undulators planned for the APS.

  11. RANDOM LASSO.

    PubMed

    Wang, Sijian; Nan, Bin; Rosset, Saharon; Zhu, Ji

    2011-03-01

    We propose a computationally intensive method, the random lasso method, for variable selection in linear models. The method consists of two major steps. In step 1, the lasso method is applied to many bootstrap samples, each using a set of randomly selected covariates. A measure of importance is yielded from this step for each covariate. In step 2, a similar procedure to the first step is implemented with the exception that for each bootstrap sample, a subset of covariates is randomly selected with unequal selection probabilities determined by the covariates' importance. Adaptive lasso may be used in the second step with weights determined by the importance measures. The final set of covariates and their coefficients are determined by averaging bootstrap results obtained from step 2. The proposed method alleviates some of the limitations of lasso, elastic-net and related methods noted especially in the context of microarray data analysis: it tends to remove highly correlated variables altogether or select them all, and maintains maximal flexibility in estimating their coefficients, particularly with different signs; the number of selected variables is no longer limited by the sample size; and the resulting prediction accuracy is competitive or superior compared to the alternatives. We illustrate the proposed method by extensive simulation studies. The proposed method is also applied to a Glioblastoma microarray data analysis. PMID:22997542
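
    A condensed sketch of the two steps with scikit-learn's Lasso; the bootstrap count, subset size q and penalty here are illustrative, and the paper's tuning and the adaptive-lasso variant of step 2 are omitted:

```python
import numpy as np
from sklearn.linear_model import Lasso

def random_lasso(X, y, n_boot=200, q=20, alpha=0.1, seed=0):
    """Step 1: lasso on bootstrap samples, each restricted to q covariates
    drawn uniformly, yields an importance measure per covariate. Step 2:
    repeat, but draw covariates with probabilities proportional to their
    importance; final coefficients are averages over the step-2 fits."""
    rng = np.random.default_rng(seed)
    n_obs, n_feat = X.shape

    def one_pass(probs):
        coefs = np.zeros(n_feat)
        for _ in range(n_boot):
            rows = rng.integers(0, n_obs, n_obs)                 # bootstrap the observations
            cols = rng.choice(n_feat, size=min(q, n_feat), replace=False, p=probs)
            fit = Lasso(alpha=alpha).fit(X[np.ix_(rows, cols)], y[rows])
            coefs[cols] += fit.coef_
        return coefs / n_boot

    importance = np.abs(one_pass(None))          # step 1: uniform covariate sampling
    probs = importance / importance.sum()        # assumes some covariates carry signal
    return one_pass(probs)                       # step 2: importance-weighted sampling
```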

  12. Bias in the prediction of genetic gain due to mass and half-sib selection in random mating populations

    PubMed Central

    2009-01-01

    The prediction of gains from selection allows the comparison of breeding methods and selection strategies, although these estimates may be biased. The objective of this study was to investigate the extent of such bias in predicting genetic gain. For this, we simulated 10 cycles of a hypothetical breeding program that involved seven traits, three population classes, three experimental conditions and two breeding methods (mass and half-sib selection). Each combination of trait, population, heritability, method and cycle was repeated 10 times. The predicted gains were biased, even when the genetic parameters were estimated without error. Gain from selection in both genders is twice the gain from selection in a single gender only in the absence of dominance. The use of genotypic variance or broad sense heritability in the predictions represented an additional source of bias. Predictions based on additive variance and narrow sense heritability were equivalent, as were predictions based on genotypic variance and broad sense heritability. The predictions based on mass and family selection were suitable for comparing selection strategies, whereas those based on selection within progenies showed the largest bias and lower association with the realized gain. PMID:21637512
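
    The factor-of-two relation mentioned above follows, in the purely additive case, from the classical breeder's equation that underlies these gain predictions; with R the response, S the selection differential and h² the narrow-sense heritability:

```latex
R = h^2 S,
\qquad
R_{\text{one sex}} = \tfrac{1}{2}\,h^2 S,
```

    since only half of the selected parents' superiority is transmitted when selection is practiced in a single sex.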

  13. Purification of polyclonal anti-conformational antibodies for use in affinity selection from random peptide phage display libraries: A study using the hydatid vaccine EG95

    PubMed Central

    Read, A.J.; Gauci, C.G.; Lightowlers, M.W.

    2009-01-01

    The use of polyclonal antibodies to screen random peptide phage display libraries often results in the recognition of a large number of peptides that mimic linear epitopes on various proteins. There appears to be a bias in the use of this technology toward the selection of peptides that mimic linear epitopes. In many circumstances the correct folding of a protein immunogen is required for conferring protection. The use of random peptide phage display libraries to identify peptide mimics of conformational epitopes in these cases requires a strategy for overcoming this bias. Conformational epitopes on the hydatid vaccine EG95 have been shown to result in protective immunity in sheep, whereas linear epitopes are not protective. In this paper we describe a strategy that results in the purification of polyclonal antibodies directed against conformational epitopes while eliminating antibodies directed against linear epitopes. These affinity purified antibodies were then used to select a peptide from a random peptide phage display library that has the capacity to mimic conformational epitopes on EG95. This peptide was subsequently used to affinity purify monospecific antibodies against EG95. PMID:19349218

  14. Free variable selection QSPR study to predict 19F chemical shifts of some fluorinated organic compounds using Random Forest and RBF-PLS methods

    NASA Astrophysics Data System (ADS)

    Goudarzi, Nasser

    2016-04-01

    In this work, two new and powerful chemometrics methods are applied to the modeling and prediction of the 19F chemical shift values of some fluorinated organic compounds. The radial basis function-partial least squares (RBF-PLS) and random forest (RF) methods are employed to construct models to predict the 19F chemical shifts. No separate variable selection method was used, since the RF method can serve as both a variable selection and a modeling technique. The effects of important parameters on the RF prediction power, such as the number of trees (nt) and the number of randomly selected variables used to split each node (m), were investigated. The root-mean-square errors of prediction (RMSEP) for the training set and the prediction set for the RBF-PLS and RF models were 44.70, 23.86, 29.77, and 23.69, respectively. Also, the correlation coefficients of the prediction set for the RBF-PLS and RF models were 0.8684 and 0.9313, respectively. The results reveal that the RF model can be used as a powerful chemometrics tool for quantitative structure-property relationship (QSPR) studies.
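
    A minimal sketch of this kind of RF workflow, assuming scikit-learn's RandomForestRegressor and a synthetic stand-in for the 19F descriptor matrix; the grids for nt and m, the data shapes, and the train/test split are illustrative:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(1)
    X = rng.normal(size=(300, 50))                      # molecular descriptors
    y = X[:, 0] * 3 - X[:, 1] * 2 + rng.normal(size=300)  # surrogate shifts
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

    best = None
    for nt in (100, 300, 500):                          # number of trees
        for m in (5, 10, 25):                           # variables per split
            rf = RandomForestRegressor(n_estimators=nt, max_features=m,
                                       random_state=1).fit(X_tr, y_tr)
            rmsep = mean_squared_error(y_te, rf.predict(X_te)) ** 0.5
            if best is None or rmsep < best[0]:
                best = (rmsep, nt, m, rf)

    rmsep, nt, m, rf = best
    print(f"best RMSEP={rmsep:.3f} at nt={nt}, m={m}")
    # Impurity-based importances double as a built-in variable ranking.
    print("top descriptors:", np.argsort(rf.feature_importances_)[::-1][:5])
    ```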

  15. Varying levels of difficulty index of skills-test items randomly selected by examinees on the Korean emergency medical technician licensing examination

    PubMed Central

    2016-01-01

    Purpose: The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE), which requires examinees to select items randomly. Methods: The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. Results: In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01), as did 4 of the 5 items on the advanced skills test (P<0.05). In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty indices (P<0.01), as did all 3 of the advanced skills test items (P<0.01). Conclusion: In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items, in order to improve the reliability of the national licensing examination. PMID:26883810
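
    The difficulty index of a skills-test item is simply the proportion of examinees who pass it; a sketch of the item comparison with invented pass/fail counts, using scipy's chi-square test as a stand-in for the study's analysis:

    ```python
    from scipy.stats import chi2_contingency

    # rows = items, columns = (passed, failed) examinee counts (invented)
    counts = [[310, 25],
              [290, 48],
              [305, 31],
              [270, 66]]

    for i, (passed, failed) in enumerate(counts, 1):
        print(f"item {i}: difficulty index = {passed / (passed + failed):.3f}")

    # Test whether pass rates differ across the randomly assigned items.
    chi2, pval, dof, _ = chi2_contingency(counts)
    print(f"chi2={chi2:.2f}, dof={dof}, P={pval:.4f}")  # small P -> unequal difficulty
    ```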

  16. Outcomes of an automated procedure for the selection of effective platelets for patients refractory to random donors based on cross-matching locally available platelet products.

    PubMed

    Rebulla, Paolo; Morelati, Fernanda; Revelli, Nicoletta; Villa, Maria Antonietta; Paccapelo, Cinzia; Nocco, Angela; Greppi, Noemi; Marconi, Maurizio; Cortelezzi, Agostino; Fracchiolla, Nicola; Martinelli, Giovanni; Deliliers, Giorgio Lambertenghi

    2004-04-01

    In 1999, we implemented an automated platelet cross-matching (XM) programme to select compatible platelets from the local inventory for patients refractory to random donor platelets. In this study, we evaluated platelet count increments in 40 consecutive refractory patients (8.3% of 480 consecutive platelet recipients) given 569 cross-match-negative platelets between April 1999 and December 2001. XM was performed automatically with a commercially available immunoadherence assay. Pre-, 1- and 24-h post-transfusion platelet counts (mean ± SD) for the 569 XM-negative platelet transfusions containing 302 ± 71 × 10⁹ platelets were 7.7 ± 5.5, 32.0 ± 21.0 and 16.8 ± 15.5 × 10⁹/l, respectively. Increments were significantly higher (P < 0.05, t-test) than those observed in the same patients given 303 random platelet pools (dose = 318 ± 52 × 10⁹ platelets) during the month before refractoriness was detected, when pre-, 1- and 24-h post-transfusion counts were 7.0 ± 8.6, 15.9 ± 16.1 and 9.6 ± 12.8 × 10⁹/l, respectively. The cost of the platelet XM disposable kit per transfusion to produce 1-h post-transfusion platelet count increments >10 × 10⁹/l was €447. This programme enabled the rapid selection of effective platelets for refractory patients from the local inventory. PMID:15015974

  17. Transpapillary selective bile duct cannulation technique: Review of Japanese randomized controlled trials since 2010 and an overview of clinical results in precut sphincterotomy since 2004.

    PubMed

    Kawakami, Hiroshi; Kubota, Yoshimasa; Kawahata, Shuhei; Kubo, Kimitoshi; Kawakubo, Kazumichi; Kuwatani, Masaki; Sakamoto, Naoya

    2016-04-01

    In 1970, a Japanese group reported the first use of endoscopic retrograde cholangiopancreatography (ERCP), which is now carried out worldwide. Selective bile duct cannulation is a mandatory technique for diagnostic and therapeutic ERCP. Development of the endoscope and other devices has contributed to the extended use of ERCP, which has become a basic procedure for diagnosing and treating pancreaticobiliary diseases. Various techniques related to selective bile duct cannulation have been widely applied. Although the classical contrast medium injection cannulation technique remains valuable, use of wire-guided cannulation has expanded since the early 2000s, and the technique is now widely carried out in the USA and Europe. Endoscopists must pay particular attention to a patient's condition and choose the most effective technique for selective bile duct cannulation accordingly. Some techniques have the potential to shorten procedure time and reduce the incidence of adverse events, particularly post-ERCP pancreatitis. However, a great deal of experience is required, and endoscopists must be skilled in a variety of techniques. Although the development of the transpapillary biliary cannulation approach is remarkable, it is important to note that, to date, there have been no reports of transpapillary cannulation preventing post-ERCP pancreatitis. In the present article, selective bile duct cannulation techniques in the context of recent Japanese randomized controlled trials and cases of precut sphincterotomy are reviewed and discussed. PMID:26825609

  18. Selection of an adjuvant for seasonal influenza vaccine in elderly people: modelling immunogenicity from a randomized trial

    PubMed Central

    2013-01-01

    Background Improved influenza vaccines are needed to reduce influenza-associated complications in older adults. The aim of this study was to identify the optimal formulation of adjuvanted seasonal influenza vaccine for use in elderly people. Methods This observer-blind, randomized study assessed the optimal formulation of adjuvanted seasonal influenza vaccine based on immunogenicity and safety in participants aged ≥65 years. Participants were randomized (~200 per group) to receive one dose of non-adjuvanted vaccine or one of eight formulations of vaccine formulated with a squalene and tocopherol oil-in-water emulsion-based Adjuvant System (AS03C, AS03B or AS03A, with 2.97, 5.93 and 11.86 mg tocopherol, respectively) together with the immunostimulant monophosphoryl lipid A (MPL, at doses of 0, 25 or 50 µg). Hemagglutination-inhibition (HI) antibody responses and T-cell responses were assessed on Day 0 and 21 days post-vaccination. The ratios of HI-based geometric mean titers in adjuvanted versus non-adjuvanted vaccine groups were calculated, and the lower limit of the 90% confidence interval was transformed into a desirability index (a value between 0 and 1) in an experimental domain for each vaccine strain and plotted in relation to the AS03 and MPL dose combination in the formulation. This model was used to assess the optimal formulation based on HI antibody titers. Reactogenicity and safety were also assessed. The immunogenicity and safety analyses were used to evaluate the optimal formulation of adjuvanted vaccine. Results In the HI antibody-based model, an AS03 dose–response was evident; responses against the A/H1N1 and A/H3N2 strains were higher for all adjuvanted formulations versus non-adjuvanted vaccine, and for the AS03A-MPL25, AS03B-MPL25 and AS03B-MPL50 formulations against the B strain. Modelling using more stringent criteria (post hoc) showed a clear dose-range effect for the AS03 component against all strains, whereas MPL showed a limited effect

  19. The post-pollination ethylene burst and the continuation of floral advertisement are harbingers of non-random mate selection in Nicotiana attenuata.

    PubMed

    Bhattacharya, Samik; Baldwin, Ian T

    2012-08-01

    The self-compatible plant Nicotiana attenuata grows in genetically diverse populations after fires, and produces flowers that remain open for 3 days and are visited by assorted pollinators. To determine whether and when post-pollination non-random mate selection occurs among self and non-self pollen, seed paternity and semi-in vivo pollen tube growth were determined in controlled single/mixed pollinations. Despite all pollen sources being equally proficient in siring seeds in single-genotype pollinations, self pollen was consistently selected in mixed pollinations, irrespective of maternal genotype. However, clear patterns of mate discrimination occurred amongst non-self pollen when mixed pollinations were performed soon after corollas open, including selection against hygromycin B resistance (transformation selectable marker) in wild-type styles and for it in transformed styles. However, mate choice among pollen genotypes was completely shut down in plants transformed to be unable to produce (irACO) or perceive (ETR1) ethylene. The post-pollination ethylene burst, which originates primarily from the stigma and upper style, was strongly correlated with mate selection in single and mixed hand-pollinations using eight pollen donors in two maternal ecotypes. The post-pollination ethylene burst was also negatively correlated with the continuation of emission of benzylacetone, the most abundant pollinator-attracting corolla-derived floral volatile. We conclude that ethylene signaling plays a pivotal role in mate choice, and the post-pollination ethylene burst and the termination of benzylacetone release are accurate predictors, both qualitatively and quantitatively, of pre-zygotic mate selection and seed paternity. PMID:22458597

  20. Recruitment strategies should not be randomly selected: empirically improving recruitment success and diversity in developmental psychology research

    PubMed Central

    Sugden, Nicole A.; Moulson, Margaret C.

    2015-01-01

    Psychological and developmental research have been critiqued for the lack of diversity of research samples. Because differences in culture, race, and ethnicity can influence participant behavior, limited diversity limits the generalizability of the findings. These differences may also impact how participants behave in response to recruitment attempts, which suggests that recruitment itself may be leveraged to increase sample diversity. The goal of the current study was to determine what factors, within a recruitment interaction, could be leveraged to increase success and diversity when recruiting families with children for developmental research. Study 1 found three factors influenced success: (1) recruitment was more successful when other potential participants were also interested (i.e., recruiters were busy), (2) recruiters of particular races were more successful than recruiters of other races, and (3) differences in success were related to what the recruiter said to engage the potential participant (i.e., the script). The latter two factors interacted, suggesting some recruiters were using less optimal scripts. To improve success rates, study 2 randomly assigned scripts to recruiters and encouraged them to recruit more vigorously during busy periods. Study 2 found that two factors influenced success: (1) some scripts were more successful than others and (2) we were more successful at recruiting non-White potential participants than White participants. These two interacted, with some scripts being more successful with White and other scripts being more successful with non-White families. This intervention significantly increased recruitment success rate by 8.1% and the overall number of families recruited by 15.3%. These findings reveal that empirically evaluating and tailoring recruitment efforts based on the most successful strategies is effective in boosting diversity through increased participation of children from non-White families. PMID:25972829

  1. Impact of random and systematic recall errors and selection bias in case-control studies on mobile phone use and brain tumors in adolescents (CEFALO study).

    PubMed

    Aydin, Denis; Feychting, Maria; Schüz, Joachim; Andersen, Tina Veje; Poulsen, Aslak Harbo; Prochazka, Michaela; Klaeboe, Lars; Kuehni, Claudia E; Tynes, Tore; Röösli, Martin

    2011-07-01

    Whether the use of mobile phones is a risk factor for brain tumors in adolescents is currently being studied. Case-control studies investigating this possible relationship are prone to recall error and selection bias. We assessed the potential impact of random and systematic recall error and selection bias on odds ratios (ORs) by performing simulations based on real data from an ongoing case-control study of mobile phones and brain tumor risk in children and adolescents (CEFALO study). Simulations were conducted for two mobile phone exposure categories: regular and heavy use. Our choice of levels of recall error was guided by a validation study that compared objective network operator data with the self-reported amount of mobile phone use in CEFALO. In our validation study, cases overestimated their number of calls by 9% on average and controls by 34%. Cases also overestimated their duration of calls by 52% on average and controls by 163%. The participation rates in CEFALO were 83% for cases and 71% for controls. In a variety of scenarios, the combined impact of recall error and selection bias on the estimated ORs was complex. These simulations are useful for the interpretation of previous case-control studies on brain tumor and mobile phone use in adults as well as for the interpretation of future studies on adolescents. PMID:21294138
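
    A minimal sketch of this kind of simulation, with illustrative distributions and cutoffs: true phone use is generated with no case-control difference, the average differential recall factors reported above are applied, and the resulting bias in the odds ratio for heavy use is observed:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 5000
    # True calls/month, identical for cases and controls (true OR = 1).
    true_cases = rng.lognormal(mean=3.0, sigma=1.0, size=n)
    true_controls = rng.lognormal(mean=3.0, sigma=1.0, size=n)

    def odds_ratio(cases, controls, cutoff):
        a, b = np.sum(cases > cutoff), np.sum(cases <= cutoff)
        c, d = np.sum(controls > cutoff), np.sum(controls <= cutoff)
        return (a * d) / (b * c)

    # "Heavy use" = top decile of true use (an illustrative cutoff).
    cutoff = np.quantile(np.concatenate([true_cases, true_controls]), 0.9)
    print("OR, no recall error:  ", odds_ratio(true_cases, true_controls, cutoff))

    # Differential recall: cases overestimate calls by 9% on average,
    # controls by 34%, each with random individual-level scatter.
    rep_cases = true_cases * 1.09 * rng.lognormal(0, 0.3, n)
    rep_controls = true_controls * 1.34 * rng.lognormal(0, 0.3, n)
    print("OR, with recall error:", odds_ratio(rep_cases, rep_controls, cutoff))
    ```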

  2. A theory for the origin of a self-replicating chemical system. I - Natural selection of the autogen from short, random oligomers

    NASA Technical Reports Server (NTRS)

    White, D. H.

    1980-01-01

    A general theory is presented for the origin of a self-replicating chemical system, termed an autogen, which is capable of both crude replication and translation (protein synthesis). The theory requires the availability of free energy and monomers to the system, a significant background low-yield synthesis of kinetically stable oligopeptides and oligonucleotides, the localization of the oligomers, crude oligonucleotide selectivity of amino acids during oligopeptide synthesis, crude oligonucleotide replication, and two short peptide families which catalyze replication and translation, to produce a localized group of at least one copy each of two protogenes and two protoenzymes. The model posits a process of random oligomerization, followed by the random nucleation of functional components and the rapid autocatalytic growth of the functioning autogen to macroscopic amounts, to account for the origin of the first self-replicating system. Such a process contains steps of such high probability and short time periods that it is suggested that the emergence of an autogen in a laboratory experiment of reasonable time scale may be possible.

  3. A comparison of the effects of random and selective mass extinctions on erosion of evolutionary history in communities of digital organisms.

    PubMed

    Yedid, Gabriel; Stredwick, Jason; Ofria, Charles A; Agapow, Paul-Michael

    2012-01-01

    The effect of mass extinctions on phylogenetic diversity and branching history of clades remains poorly understood in paleobiology. We examined the phylogenies of communities of digital organisms undergoing open-ended evolution as we subjected them to instantaneous "pulse" extinctions, choosing survivors at random, and to prolonged "press" extinctions involving a period of low resource availability. We measured age of the phylogenetic root and tree stemminess, and evaluated how branching history of the phylogenetic trees was affected by the extinction treatments. We found that strong random (pulse) and strong selective extinction (press) both left clear long-term signatures in root age distribution and tree stemminess, and eroded deep branching history to a greater degree than did weak extinction and control treatments. The widely-used Pybus-Harvey gamma statistic showed a clear short-term response to extinction and recovery, but differences between treatments diminished over time and did not show a long-term signature. The characteristics of post-extinction phylogenies were often affected as much by the recovery interval as by the extinction episode itself. PMID:22693570

  4. Selection of single blastocysts for fresh transfer via standard morphology assessment alone and with array CGH for good prognosis IVF patients: results from a randomized pilot study

    PubMed Central

    2012-01-01

    Background Single embryo transfer (SET) remains underutilized as a strategy to reduce multiple gestation risk in IVF, and its overall lower pregnancy rate underscores the need for improved techniques to select one embryo for fresh transfer. This study explored use of comprehensive chromosomal screening by array CGH (aCGH) to provide this advantage and improve pregnancy rate from SET. Methods First-time IVF patients with a good prognosis (age <35, no prior miscarriage) and normal karyotype seeking elective SET were prospectively randomized into two groups: In Group A, embryos were selected on the basis of morphology and comprehensive chromosomal screening via aCGH (from d5 trophectoderm biopsy) while Group B embryos were assessed by morphology only. All patients had a single fresh blastocyst transferred on d6. Laboratory parameters and clinical pregnancy rates were compared between the two groups. Results For patients in Group A (n = 55), 425 blastocysts were biopsied and analyzed via aCGH (7.7 blastocysts/patient). Aneuploidy was detected in 191/425 (44.9%) of blastocysts in this group. For patients in Group B (n = 48), 389 blastocysts were microscopically examined (8.1 blastocysts/patient). Clinical pregnancy rate was significantly higher in the morphology + aCGH group compared to the morphology-only group (70.9 and 45.8%, respectively; p = 0.017); ongoing pregnancy rate for Groups A and B were 69.1 vs. 41.7%, respectively (p = 0.009). There were no twin pregnancies. Conclusion Although aCGH followed by frozen embryo transfer has been used to screen at risk embryos (e.g., known parental chromosomal translocation or history of recurrent pregnancy loss), this is the first description of aCGH fully integrated with a clinical IVF program to select single blastocysts for fresh SET in good prognosis patients. The observed aneuploidy rate (44.9%) among biopsied blastocysts highlights the inherent imprecision of SET when conventional morphology is used

  5. Zeta Sperm Selection Improves Pregnancy Rate and Alters Sex Ratio in Male Factor Infertility Patients: A Double-Blind, Randomized Clinical Trial

    PubMed Central

    Nasr Esfahani, Mohammad Hossein; Deemeh, Mohammad Reza; Tavalaee, Marziyeh; Sekhavati, Mohammad Hadi; Gourabi, Hamid

    2016-01-01

    Background Selection of sperm for intra-cytoplasmic sperm injection (ICSI) is usually considered the ultimate technique to alleviate male-factor infertility. In routine ICSI, selection is based on morphology and viability, which does not necessarily preclude the chance injection of DNA-damaged or apoptotic sperm into the oocyte. Sperm with a high negative surface electrical charge, named “Zeta potential”, are mature and more likely to have intact chromatin. In addition, X-bearing spermatozoa carry more negative charge. Therefore, we aimed to compare the clinical outcomes of the Zeta procedure with routine sperm selection in infertile men who were candidates for ICSI. Materials and Methods From a total of 203 ICSI cycles studied, 101 cycles were allocated to the density gradient centrifugation (DGC)/Zeta group and the remaining 102 were included in the DGC group in this prospective study. Clinical outcomes were compared between the two groups. The ratios of X- and Y-bearing sperm were assessed by fluorescence in situ hybridization (FISH) and quantitative polymerase chain reaction (qPCR) methods in 17 independent semen samples. Results In the present double-blind randomized clinical trial, a significant increase in top-quality embryos and pregnancy rate was observed in the DGC/Zeta group compared to the DGC group. Moreover, the sex ratio (XY/XX) at birth was significantly lower in the DGC/Zeta group compared to the DGC group, despite a similar ratio of X/Y-bearing spermatozoa following Zeta selection. Conclusion The Zeta method not only improves the percentage of top-quality embryos and the pregnancy outcome but also alters the sex ratio compared to the conventional DGC method, despite no significant change in the ratio of X- and Y-bearing sperm populations (Registration number: IRCT201108047223N1). PMID:27441060

  6. Self-selection effects and modulation of TaOx resistive switching random access memory with bottom electrode of highly doped Si

    NASA Astrophysics Data System (ADS)

    Yu, Muxi; Fang, Yichen; Wang, Zongwei; Pan, Yue; Li, Ming; Cai, Yimao; Huang, Ru

    2016-05-01

    In this paper, we propose a TaOx resistive switching random access memory (RRAM) device with an operation-polarity-dependent self-selection effect, obtained by introducing a highly doped silicon (Si) electrode, which is promising for large-scale integration. It is observed that with highly doped Si as the bottom electrode (BE), the RRAM devices show a non-linear (>10³) I-V characteristic during negative Forming/Set operation and linear behavior during positive Forming/Set operation. The underlying mechanisms for the linear and non-linear behaviors at low resistance states of the proposed device are extensively investigated by varying operation modes, metal electrodes, and Si doping type. Experimental data and theoretical analysis demonstrate that the operation-polarity-dependent self-selection effect in our devices originates from the Schottky barrier between the TaOx layer and the interfacial SiOx formed by reaction between the highly doped Si BE and migrated oxygen ions in the conductive filament area.

  7. Randomizing Roaches: Exploring the "Bugs" of Randomization in Experimental Design

    ERIC Educational Resources Information Center

    Wagler, Amy; Wagler, Ron

    2014-01-01

    Understanding the roles of random selection and random assignment in experimental design is a central learning objective in most introductory statistics courses. This article describes an activity, appropriate for a high school or introductory statistics course, designed to teach the concepts, values and pitfalls of random selection and assignment…

  8. On Random Numbers and Design

    ERIC Educational Resources Information Center

    Ben-Ari, Morechai

    2004-01-01

    The term "random" is frequently used in discussion of the theory of evolution, even though the mathematical concept of randomness is problematic and of little relevance in the theory. Therefore, since the core concept of the theory of evolution is the non-random process of natural selection, the term random should not be used in teaching the…

  9. Impact of retreatment with an artemisinin-based combination on malaria incidence and its potential selection of resistant strains: study protocol for a randomized controlled clinical trial

    PubMed Central

    2013-01-01

    Background Artemisinin-based combination therapy is currently recommended by the World Health Organization as first-line treatment of uncomplicated malaria. Recommendations were adapted in 2010 regarding rescue treatment in case of treatment failure: instead of quinine monotherapy, quinine should be combined with an antibiotic with antimalarial properties; alternatively, another artemisinin-based combination therapy may be used. However, no clear evidence is yet available to inform these policy changes, and the need to provide policy makers with hard data on the appropriate rescue therapy is obvious. We hypothesize that the same artemisinin-based combination therapy used as rescue treatment is as efficacious as quinine + clindamycin or an alternative artemisinin-based combination therapy, without the risk of selecting drug-resistant strains. Design We embed a randomized, open-label, three-arm clinical trial in a longitudinal cohort design, following up children with uncomplicated malaria until they are malaria parasite free for 4 weeks. The study is conducted in both the Democratic Republic of Congo and Uganda and performed in three steps. In the first step, the pre-randomized controlled trial (RCT) phase, children aged 12 to 59 months with uncomplicated malaria are treated with the recommended first-line drug and constitute a cohort that is passively followed up for 42 days. If the patients experience an uncomplicated malaria episode between days 14 and 42 of follow-up, they are randomized to quinine + clindamycin, an alternative artemisinin-based combination therapy, or the same first-line artemisinin-based combination therapy, and followed up for 28 additional days. If between days 14 and 28 the patients experience a recurrent parasitemia, they are retreated with the recommended first-line regimen and actively followed up for another 28 additional days (step three; post-RCT phase). The same methodology is followed for each subsequent

  10. Rock magnetic evidence of non-random raw material selection criteria in Cerro Toledo Obsidian Artifacts from Valles Caldera, New Mexico

    NASA Astrophysics Data System (ADS)

    Gregovich, A.; Feinberg, J. M.; Steffen, A.; Sternberg, R. S.

    2014-12-01

    Stone tools are one of the most enduring forms of ancient human behavior available to anthropologists. The geologic materials that comprise stone tools are a reflection of the rocks that were available locally or through trade, as are the intended use of the tools and the knapping technology needed to produce them. Investigation of the rock magnetic and geochemical characteristics of the artifacts and the geological source materials provides a baseline to explore these past behaviors. This study uses rock magnetic properties to explore the raw material selection criteria involved in the production of obsidian tools in the region around Valles Caldera in northern New Mexico. Obsidian is locally abundant and was traded by tribes across the central United States. Here we compare the rock magnetic properties of a sample of obsidian projectile points (N = 25) that have been geochemically sourced to the Cerro Toledo obsidian flow with geological samples collected from four sites within the same flow (N = 135). This collection of archaeological artifacts, albeit small, contains representatives of at least 8 different point styles that were used over 6000 years from the Archaic into the Late Prehistoric. Bulk rock hysteresis parameters (Mr, Ms, Bc, and Bcr) and low-field susceptibility (χ) measurements show that the projectile points generally contain a lower concentration of magnetic minerals than the geologic samples. For example, the artifacts' median Ms value is 2.9 × 10⁻³ Am²kg⁻¹, while that of the geological samples is 6.5 × 10⁻³ Am²kg⁻¹. The concentration of magnetic minerals in obsidian is a proxy for the concentration of microlites in general, and this relationship suggests that although obsidian was locally abundant, toolmakers employed non-random selection criteria resulting in generally lower concentrations of microlites in their obsidian tools.

  11. Prevalence of skeletal and eye malformations in frogs from north-central United States: estimations based on collections from randomly selected sites

    USGS Publications Warehouse

    Schoff, P.K.; Johnson, C.M.; Schotthoefer, A.M.; Murphy, J.E.; Lieske, C.; Cole, R.A.; Johnson, L.B.; Beasley, V.R.

    2003-01-01

    Skeletal malformation rates for several frog species were determined in a set of randomly selected wetlands in the north-central USA over three consecutive years. In 1998, 62 sites yielded 389 metamorphic frogs, nine (2.3%) of which had skeletal or eye malformations. A subset of the original sites was surveyed in the following 2 yr. In 1999, 1,085 metamorphic frogs were collected from 36 sites and 17 (1.6%) had skeletal or eye malformations, while in 2000, examination of 1,131 metamorphs yielded 16 (1.4%) with skeletal or eye malformations. Hindlimb malformations predominated in all three years, but other abnormalities, involving forelimb, eye, and pelvis were also found. Northern leopard frogs (Rana pipiens) constituted the majority of collected metamorphs as well as most of the malformed specimens. However, malformations were also noted in mink frogs (R. septentrionalis), wood frogs (R. sylvatica), and gray tree frogs (Hyla spp.). The malformed specimens were found in clustered sites in all three years but the cluster locations were not the same in any year. The malformation rates reported here are higher than the 0.3% rate determined for metamorphic frogs collected from similar sites in Minnesota in the 1960s, and thus, appear to represent an elevation of an earlier baseline malformation rate.

  12. Exploring the Parameter Space of the Coarse-Grained UNRES Force Field by Random Search: Selecting a Transferable Medium-Resolution Force Field

    PubMed Central

    HE, YI; XIAO, YI; LIWO, ADAM; SCHERAGA, HAROLD A.

    2009-01-01

    We explored the energy-parameter space of our coarse-grained UNRES force field for large-scale ab initio simulations of protein folding, to obtain good initial approximations for hierarchical optimization of the force field with new virtual-bond-angle bending and side-chain-rotamer potentials which we recently introduced to replace the statistical potentials. 100 sets of energy-term weights were generated randomly, and good sets were selected by carrying out replica-exchange molecular dynamics simulations of two peptides with a minimal α-helical and a minimal β-hairpin fold, respectively: the tryptophan cage (PDB code: 1L2Y) and tryptophan zipper (PDB code: 1LE1). Eight sets of parameters produced native-like structures of these two peptides. These eight sets were tested on two larger proteins: the engrailed homeodomain (PDB code: 1ENH) and FBP WW domain (PDB code: 1E0L); two sets were found to produce native-like conformations of these proteins. These two sets were tested further on a larger set of nine proteins with α or α + β structure and found to locate native-like structures of most of them. These results demonstrate that, in addition to finding reasonable initial starting points for optimization, an extensive search of parameter space is a powerful method to produce a transferable force field. PMID:19242966
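
    A minimal sketch of this random parameter-space search strategy: draw candidate weight vectors at random, score each, and keep the top performers. The scoring function below is a placeholder for the expensive replica-exchange MD benchmark on the two test peptides, and all dimensions, ranges, and counts are illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_terms, n_sets, n_keep = 8, 100, 8   # e.g. 8 energy-term weights, 100 draws

    def score(weights):
        """Placeholder for the expensive benchmark (REMD folding of 1L2Y and
        1LE1 in the study); here just a smooth synthetic objective."""
        target = np.linspace(0.2, 1.6, n_terms)
        return -np.sum((weights - target) ** 2)

    candidates = rng.uniform(0.0, 2.0, size=(n_sets, n_terms))
    scores = np.array([score(w) for w in candidates])
    best = candidates[np.argsort(scores)[::-1][:n_keep]]   # keep the top sets
    print("best weight set:", np.round(best[0], 2))
    ```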

  13. Selective processing of auditory evoked responses with iterative-randomized stimulation and averaging: A strategy for evaluating the time-invariant assumption.

    PubMed

    Valderrama, Joaquin T; de la Torre, Angel; Medina, Carlos; Segura, Jose C; Thornton, A Roger D

    2016-03-01

    The recording of auditory evoked potentials (AEPs) at fast rates allows the study of neural adaptation, improves accuracy in estimating hearing threshold and may help in diagnosing certain pathologies. Stimulation sequences used to record AEPs at fast rates must be designed with a certain jitter, i.e., they cannot be strictly periodic. Some authors believe that stimuli from wide-jittered sequences may evoke auditory responses of different morphology, in which case the time-invariant assumption would not hold. This paper describes a methodology that can be used to analyze the time-invariant assumption in jittered stimulation sequences. The proposed method [Split-IRSA] is based on an extended version of the iterative randomized stimulation and averaging (IRSA) technique, including selective processing of sweeps according to a predefined criterion. The fundamentals, the mathematical basis and relevant implementation guidelines of this technique are presented in this paper. The results of this study show that Split-IRSA performs adequately and that both fast and slow mechanisms of adaptation influence the evoked-response morphology; thus both mechanisms should be considered when time-invariance is assumed. The significance of these findings is discussed. PMID:26778545
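
    A minimal sketch, in the spirit of IRSA rather than the authors' implementation, of recovering an evoked response from a jittered, overlapping stimulation sequence by iteratively subtracting the interference from neighboring stimuli; the response shape, jitter range, noise level and iteration count are all illustrative assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    fs, L, n_stim = 1000, 300, 400        # sample rate, response length, stimuli
    true_r = np.sin(2 * np.pi * 7 * np.arange(L) / fs) * np.hanning(L)

    # Jittered inter-stimulus intervals shorter than the response -> overlap.
    isi = rng.integers(120, 220, size=n_stim)
    onsets = np.cumsum(isi)
    y = np.zeros(onsets[-1] + L)
    for t in onsets:
        y[t:t + L] += true_r
    y += rng.normal(0, 0.5, size=y.size)  # additive noise

    r_est = np.zeros(L)
    for _ in range(20):                   # IRSA-style fixed-point iterations
        # Reconstruct the recording implied by the current estimate ...
        synth = np.zeros_like(y)
        for t in onsets:
            synth[t:t + L] += r_est
        # ... subtract each sweep's interference from neighbors, then average.
        sweeps = [y[t:t + L] - (synth[t:t + L] - r_est) for t in onsets]
        r_est = np.mean(sweeps, axis=0)

    print("correlation with true response:", np.corrcoef(r_est, true_r)[0, 1])
    ```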

  14. Comparing MTI randomization procedures to blocked randomization.

    PubMed

    Berger, Vance W; Bejleri, Klejda; Agnor, Rebecca

    2016-02-28

    Randomization is one of the cornerstones of the randomized clinical trial, and there is no shortage of methods one can use to randomize patients to treatment groups. When deciding which one to use, researchers must bear in mind that not all randomization procedures are equally adept at achieving the objective of randomization, namely, balanced treatment groups. One threat is chronological bias, and permuted blocks randomization does such a good job at controlling chronological bias that it has become the standard randomization procedure in clinical trials. But permuted blocks randomization is especially vulnerable to selection bias, so as a result, the maximum tolerated imbalance (MTI) procedures were proposed as better alternatives. In comparing the procedures, we have somewhat of a false controversy, in that actual practice goes uniformly one way (permuted blocks), whereas scientific arguments go uniformly the other way (MTI procedures). There is no argument in the literature to suggest that the permuted block design is better than or even as good as the MTI procedures, but this dearth is matched by an equivalent one regarding actual trials using the MTI procedures. So the 'controversy', if we are to call it that, pits misguided precedent against sound advice that tends to be ignored in practice. We shall review the issues to determine scientifically which of the procedures is better and, therefore, should be used. PMID:26337607
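
    A minimal sketch contrasting permuted-block randomization with one MTI procedure, the "big stick" (randomize 1:1, but force the lagging arm whenever the imbalance reaches the maximum tolerated value); the block size and MTI value are illustrative choices:

    ```python
    import random

    def permuted_blocks(n_patients, block_size=4, seed=0):
        """Permuted blocks: every block of block_size is exactly balanced."""
        rng, out = random.Random(seed), []
        while len(out) < n_patients:
            block = ["A"] * (block_size // 2) + ["B"] * (block_size // 2)
            rng.shuffle(block)
            out.extend(block)
        return out[:n_patients]

    def big_stick(n_patients, mti=3, seed=0):
        """Big stick MTI: fair coin unless imbalance hits the boundary."""
        rng, out = random.Random(seed), []
        for _ in range(n_patients):
            imbalance = out.count("A") - out.count("B")
            if imbalance >= mti:          # boundary: next assignment forced
                out.append("B")
            elif imbalance <= -mti:
                out.append("A")
            else:                         # otherwise a fair coin toss
                out.append(rng.choice("AB"))
        return out

    # The last slot of every permuted block is predictable to an unblinded
    # observer; the big stick forces an assignment only at the MTI boundary.
    print("".join(permuted_blocks(20)))
    print("".join(big_stick(20)))
    ```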

  15. Combining a dopamine agonist and selective serotonin reuptake inhibitor for the treatment of depression: A double-blind, randomized pilot study

    PubMed Central

    Franco-Chaves, Jose A.; Mateus, Camilo F.; Luckenbaugh, David A.; Martinez, Pedro E.; Mallinger, Alan G.; Zarate, Carlos A.

    2013-01-01

    Background Antidepressants that act on two or more amine neurotransmitters may confer higher remission rates when first-line agents affecting a single neurotransmitter have failed. Pramipexole, a dopamine agonist, has antidepressant effects in patients with major depressive disorder (MDD). This pilot study examined the efficacy and safety of combination therapy with pramipexole and the selective serotonin reuptake inhibitor (SSRI) escitalopram in MDD. Methods In this double-blind, controlled, pilot study, 39 patients with DSM-IV MDD who had failed to respond to a standard antidepressant treatment trial were randomized to receive pramipexole (n=13), escitalopram (n=13), or their combination (n=13) for six weeks. Pramipexole was started at 0.375 mg/day and titrated weekly up to 2.25 mg/day; escitalopram dosage remained at 10 mg/day. The primary outcome measure was the Montgomery–Asberg Depression Rating Scale (MADRS). Results Subjects receiving pramipexole monotherapy had significantly lower MADRS scores than the combination group (p=0.01); no other primary drug comparisons were significant. The combination group had a substantially higher dropout rate than the escitalopram and pramipexole groups (69%, 15%, 15%, respectively). Only 15% of patients in the combination group tolerated regularly scheduled increases of pramipexole throughout the study, compared with 46% of patients in the pramipexole group. Limitations Group size was small and the treatment phase lasted for only six weeks. Conclusions The combination of an SSRI and a dopamine agonist was not more effective than either agent alone, nor did it produce a more rapid onset of antidepressant action. Combination therapy with escitalopram and pramipexole may not be well-tolerated. PMID:23517885

  16. An assessment of the quality of care for children in eighteen randomly selected district and sub-district hospitals in Bangladesh

    PubMed Central

    2012-01-01

    Background Quality hospital care is important in ensuring that the needs of severely ill children are met to avert child mortality. However, the quality of hospital care for children in developing countries has often been found poor. As the first step of a country road map for improving hospital care for children, we assessed the baseline situation with respect to the quality of care provided to children under-five years age in district and sub-district level hospitals in Bangladesh. Methods Using adapted World Health Organization (WHO) hospital assessment tools and standards, an assessment of 18 randomly selected district (n=6) and sub-district (n=12) hospitals was undertaken. Teams of trained assessors used direct case observation, record review, interviews, and Management Information System (MIS) data to assess the quality of clinical case management and monitoring; infrastructure, processes and hospital administration; essential hospital and laboratory supports, drugs and equipment. Results Findings demonstrate that the overall quality of care provided in these hospitals was poor. No hospital had a functioning triage system to prioritise those children most in need of immediate care. Laboratory supports and essential equipment were deficient. Only one hospital had all of the essential drugs for paediatric care. Less than a third of hospitals had a back-up power supply, and just under half had functioning arrangements for safe-drinking water. Clinical case management was found to be sub-optimal for prevalent illnesses, as was the quality of neonatal care. Conclusion Action is needed to improve the quality of paediatric care in hospital settings in Bangladesh, with a particular need to invest in improving newborn care. PMID:23268650

  17. Descriptive analysis of the prevalence of anemia in a randomly selected sample of elderly people living at home: some results of an Italian multicentric study.

    PubMed

    Inelmen, E M; D'Alessio, M; Gatto, M R; Baggio, M B; Jimenez, G; Bizzotto, M G; Enzi, G

    1994-04-01

    We studied hematological indexes (RBC, HB, HT, MCV), serum iron and serum ferritin values in 1784 randomly selected subjects aged 65 and over (725 males and 1059 females) divided into five age groups (65-69, 70-74, 75-79, 80-84, ≥85 years). The subjects were classified as anemic and normochromic according to the criteria for a "geriatric" level of anemia (HB ≤ 12 g/dL in both sexes) as well as "W.H.O." levels for anemia (HB < 13 g/dL in males and < 12 g/dL in females). Macrocytosis (MCV > 100 fl) and low serum ferritin level (≤ 12 ng/dL) were classified according to MCV and serum ferritin values. Mean HB values in males were 14.85 ± 1.33, 14.82 ± 1.40, 14.77 ± 1.43, 14.59 ± 1.47 and 13.83 ± 1.13 in the five age groups (65-69, 70-74, 75-79, 80-84 and ≥85 years), respectively; in females, they were 13.77 ± 1.15, 13.75 ± 1.27, 13.44 ± 1.39, 13.44 ± 1.52 and 13.34 ± 1.61, respectively. There was a low frequency of anemia in the entire sample: 2.9% in males and 9.9% in females according to the "geriatric" level, and 9.4% in males and 8.8% in females according to the "W.H.O." level. There was a higher prevalence of macrocytosis in males (6.3%) than in females (3.3%). We conclude that red cell parameters tend to decrease in aging, and further investigations are needed that exclude persons with existing chronic conditions and incorporate data on nutritional status. PMID:7918735
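
    The two anemia definitions above reduce to simple threshold rules; a sketch with invented records:

    ```python
    def anemic_geriatric(sex, hb):
        return hb <= 12.0                          # same cutoff for both sexes

    def anemic_who(sex, hb):
        return hb < (13.0 if sex == "M" else 12.0)  # sex-specific cutoffs

    # (sex, hemoglobin in g/dL) records, invented for illustration
    subjects = [("M", 12.5), ("F", 11.8), ("M", 14.1), ("F", 13.0)]
    for sex, hb in subjects:
        print(sex, hb, "geriatric:", anemic_geriatric(sex, hb),
              "WHO:", anemic_who(sex, hb))
    ```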

  18. Embolization of the Gastroduodenal Artery Before Selective Internal Radiotherapy: A Prospectively Randomized Trial Comparing Standard Pushable Coils with Fibered Interlock Detachable Coils

    SciTech Connect

    Dudeck, Oliver Bulla, Karsten; Wieners, Gero; Ruehl, Ricarda; Ulrich, Gerd; Amthauer, Holger; Ricke, Jens; Pech, Maciej

    2011-02-15

    The purpose of this study was to compare embolization of the gastroduodenal artery (GDA) using standard pushable coils with the Interlock detachable coil (IDC), a novel fibered mechanically detachable long microcoil, in patients scheduled for selective internal radiotherapy (SIRT). Fifty patients (31 male and 19 female; median age 66.6 ± 8.1 years) were prospectively randomized for embolization using either standard coils or IDCs. Procedure time, radiation dose, number of embolization devices, complications, and durability of vessel occlusion at follow-up angiography were recorded. The procedures differed significantly in time (14:32 ± 5:56 min for standard coils vs. 2:13 ± 1:04 min for IDCs; p < 0.001), radiation dose for coil deployment (2479 ± 1237 cGy·cm² for standard coils vs. 275 ± 268 cGy·cm² for IDCs; p < 0.001), and time to vessel occlusion (17:18 ± 6:39 min for standard coils vs. 11:19 ± 7:54 min for IDCs; p = 0.002). A mean of 6.2 ± 1.8 coils (n = 27) were used in the standard coil group, and 1.3 ± 0.9 coils (p < 0.0001) were used in the IDC group (n = 23), because additional pushable coils were required to achieve GDA occlusion in 4 patients. In 2 patients, the IDC could not be deployed through a Soft-VU catheter. One standard coil dislodged in the hepatic artery and was retrieved. Vessel reperfusion was noted in only 1 patient in the standard coil group. Controlled embolization of the GDA with fibered IDCs was achieved more rapidly than with pushable coils. However, vessel occlusion may not be obtained using a single device only, and the use of sharply angled guiding catheters hampered coil pushability.

  19. Random Walks on Random Graphs

    NASA Astrophysics Data System (ADS)

    Cooper, Colin; Frieze, Alan

    The aim of this article is to discuss some of the notions and applications of random walks on finite graphs, especially as they apply to random graphs. In this section we give some basic definitions, in Section 2 we review applications of random walks in computer science, and in Section 3 we focus on walks in random graphs.

  20. The RANDOM computer program: A linear congruential random number generator

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

    The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) the RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) the RANCYCLE and ARITH Computer Programs, which provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
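
    The report's programs are FORTRAN; as a quick illustration in Python, a minimal LCG of the form x_{n+1} = (a*x_n + c) mod m, using the classic Numerical Recipes constants rather than parameters from the report:

    ```python
    def lcg(seed, a=1664525, c=1013904223, m=2**32):
        """Generator yielding uniform floats in [0, 1) from an LCG."""
        x = seed
        while True:
            x = (a * x + c) % m
            yield x / m

    gen = lcg(seed=42)
    print([round(next(gen), 6) for _ in range(5)])
    ```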

  1. A prospective randomized multicenter trial of amnioreduction versus selective fetoscopic laser photocoagulation for the treatment of severe twin–twin transfusion syndrome

    PubMed Central

    Crombleholme, Timothy M.; Shera, David; Lee, Hanmin; Johnson, Mark; D’Alton, Mary; Porter, Flint; Chyu, Jacquelyn; Silver, Richard; Abuhamad, Alfred; Saade, George; Shields, Laurence; Kauffman, David; Stone, Joanne; Albanese, Craig T.; Bahado-Singh, Ray; Ball, Robert H.; Bilaniuk, Larissa; Coleman, Beverly; Farmer, Diana; Feldstein, Vickie; Harrison, Michael R.; Hedrick, Holly; Livingston, Jeffrey; Lorenz, Robert P.; Miller, David A.; Norton, Mary E.; Polzin, William J.; Robinson, Julian N.; Rychik, Jack; Sandberg, Per L.; Seri, Istvan; Simon, Erin; Simpson, Lynn L.; Yedigarova, Larisa; Wilson, R. Douglas; Young, Bruce

    2009-01-01

    Objective To examine the effect of selective fetoscopic laser photocoagulation (SFLP) versus serial amnioreduction (AR) on perinatal mortality in severe twin-twin transfusion syndrome (TTTS). Study Design 5-year multicenter prospective randomized controlled trial. The primary outcome variable was 30-day postnatal survival of donors and recipients. Results There is no statistically significant difference in 30-day postnatal survival between SFLP or AR treatment for donors at 55% (11/20) vs 55% (11/20) (p=1, OR=1, 95%CI=0.242 to 4.14) or recipients at 30% (6/20) vs 45% (9/20) (p=0.51, OR=1.88, 95%CI=0.44 to 8.64). There is no difference in 30-day survival of one or both twins on a per-pregnancy basis between AR at 75% (15/20) and SFLP at 65% (13/20) (p=0.73, OR=1.62, 95%CI=0.34 to 8.09). Overall survival (newborns divided by the number of fetuses treated) is not statistically significantly different for AR at 60% (24/40) vs SFLP at 45% (18/40) (p=0.18, OR=2.01, 95%CI=0.76 to 5.44). There is a statistically significant increase in fetal recipient mortality in the SFLP arm at 70% (14/20) versus the AR arm at 35% (7/20) (p=0.025, OR=5.31, 95%CI=1.19 to 27.6). This is offset by increased recipient neonatal mortality of 30% (6/20) in the AR arm. Echocardiographic abnormality in the recipient twin Cardiovascular Profile Score is the most significant predictor of recipient mortality (p=0.055, OR=3.025/point) by logistic regression analysis. Conclusions The outcome of the trial does not conclusively determine whether AR or SFLP is the superior treatment modality. TTTS cardiomyopathy appears to be an important factor in recipient survival in TTTS. PMID:17904975

  2. Randomization and sampling issues

    USGS Publications Warehouse

    Geissler, P.H.

    1996-01-01

    The need for randomly selected routes and other sampling issues have been debated by the Amphibian electronic discussion group. Many excellent comments have been made, pro and con, but we have not reached consensus yet. This paper brings those comments together and attempts a synthesis. I hope that the resulting discussion will bring us closer to a consensus.

  3. Random thoughts

    NASA Astrophysics Data System (ADS)

    ajansen; kwhitefoot; panteltje1; edprochak; sudhakar, the

    2014-07-01

    In reply to the physicsworld.com news story “How to make a quantum random-number generator from a mobile phone” (16 May, http://ow.ly/xFiYc, see also p5), which describes a way of delivering random numbers by counting the number of photons that impinge on each of the individual pixels in the camera of a Nokia N9 smartphone.

  4. Selection of Patients and Anesthetic Types for Endovascular Treatment in Acute Ischemic Stroke: A Meta-Analysis of Randomized Controlled Trials

    PubMed Central

    Ouyang, Fubing; Chen, Yicong; Zhao, Yuhui; Dang, Ge; Liang, Jiahui; Zeng, Jinsheng

    2016-01-01

    Background and Purpose Recent randomized controlled trials have demonstrated the consistent effectiveness of endovascular treatment (EVT) for acute ischemic stroke, leading to updates of stroke management guidelines. We conducted this meta-analysis to assess the efficacy and safety of EVT overall and in subgroups stratified by age, baseline stroke severity, brain imaging feature, and anesthetic type. Methods Published randomized controlled trials comparing EVT and standard medical care alone were evaluated. The measured outcomes were 90-day functional independence (modified Rankin Scale ≤2), all-cause mortality, and symptomatic intracranial hemorrhage. Results Nine trials enrolling 2476 patients were included (1338 EVT, 1138 standard medical care alone). For patients with large vessel occlusions confirmed by noninvasive vessel imaging, EVT yielded improved functional outcome (pooled odds ratio [OR], 2.02; 95% confidence interval [CI], 1.64–2.50), lower mortality (OR, 0.75; 95% CI, 0.58–0.97), and a similar symptomatic intracranial hemorrhage rate (OR, 1.12; 95% CI, 0.72–1.76) compared with standard medical care. A higher proportion of functional independence was seen in patients with terminus intracranial artery occlusion (±M1) (OR, 3.16; 95% CI, 1.64–6.06), baseline Alberta Stroke Program Early CT score of 8–10 (OR, 2.11; 95% CI, 1.25–3.57) and age ≤70 years (OR, 3.01; 95% CI, 1.73–5.24). EVT performed under conscious sedation had better functional outcomes (OR, 2.08; 95% CI, 1.47–2.96) without increased risk of symptomatic intracranial hemorrhage or short-term mortality compared with general anesthesia. Conclusions Vessel-imaging-proven large vessel occlusion, a favorable scan, and younger age are useful predictors to identify anterior circulation stroke patients who may benefit from EVT. Conscious sedation is feasible and safe in EVT based on available data. However, firm conclusion on the choice of anesthetic types should be drawn from more
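
    A minimal sketch of inverse-variance (fixed-effect) pooling of odds ratios on the log scale, the standard machinery behind pooled ORs like those above; the per-trial counts are invented for illustration:

    ```python
    import math

    # (events_evt, n_evt, events_ctrl, n_ctrl) per trial, invented numbers
    trials = [(60, 100, 40, 100), (55, 120, 35, 110), (70, 150, 50, 140)]

    num = den = 0.0
    for a, n1, c, n2 in trials:
        b, d = n1 - a, n2 - c
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d      # Woolf variance of log OR
        num += log_or / var                      # weight = 1 / variance
        den += 1 / var

    pooled = num / den
    se = math.sqrt(1 / den)
    lo, hi = math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)
    print(f"pooled OR = {math.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```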

  5. Metabolomic Profiling of Urine: Response to a Randomized, Controlled Feeding Study of Select Fruits and Vegetables, and Application to an Observational Study

    PubMed Central

    May, Damon H.; Navarro, Sandi L.; Ruczinski, Ingo; Hogan, Jason; Ogata, Yuko; Schwarz, Yvonne; Levy, Lisa; Holzman, Ted; McIntosh, Martin W.; Lampe, Johanna W.

    2013-01-01

    Metabolomic profiles were used to characterize the effects of consuming a high-phytochemical diet compared to a diet devoid of fruits and vegetables in a randomized trial and cross-sectional study. In the trial, 8 h fasting urine from healthy men (n=5) and women (n=5) was collected after a 2-week randomized, controlled trial of 2 diet periods: a diet rich in cruciferous vegetables, citrus and soy (F&V), and a fruit- and vegetable-free (basal) diet. Among the ions found to differentiate the diets, 176 were putatively annotated with compound identifications, with 46 supported by MS/MS fragment evidence. Metabolites more abundant in the F&V diet included markers of dietary intervention (e.g., crucifers, citrus and soy), fatty acids and niacin metabolites. Ions more abundant in the basal diet included riboflavin, several acylcarnitines, and amino acid metabolites. In the cross-sectional study, we compared participants based on tertiles of crucifers, citrus and soy from 3-day food records (3DFR; n=36) and food frequency questionnaires (FFQ; n=57); for the FFQ, intake was separately divided into tertiles of total fruit and vegetable intake. As a group, the ions individually differential between the experimental diets differentiated the observational study participants. However, only 4 ions were individually significant, differentiating the third vs. first tertile of crucifer, citrus and soy intake based on the 3DFR. One of these was putatively annotated: proline betaine, a marker of citrus consumption. There were no ions significantly distinguishing tertiles by FFQ. Metabolomics assessment of controlled dietary interventions provides a more accurate and stronger characterization of diet than observational data. PMID:23657156

  6. Randomized Comparison of Selective Internal Radiotherapy (SIRT) Versus Drug-Eluting Bead Transarterial Chemoembolization (DEB-TACE) for the Treatment of Hepatocellular Carcinoma

    SciTech Connect

    Pitton, Michael B. Kloeckner, Roman; Ruckes, Christian; Wirth, Gesine M.; Eichhorn, Waltraud; Wörns, Marcus A.; Weinmann, Arndt; Schreckenberger, Mathias; Galle, Peter R.; Otto, Gerd; Dueber, Christoph

    2015-04-15

    Purpose To prospectively compare SIRT and DEB-TACE for treating hepatocellular carcinoma (HCC). Methods From 04/2010 to 07/2012, 24 patients with histologically proven unresectable N0, M0 HCCs were randomized 1:1 to receive SIRT or DEB-TACE. SIRT could be repeated once in case of recurrence, while TACE was repeated every 6 weeks until no viable tumor tissue was detected by MRI or contraindications prohibited further treatment. Patients were followed up by MRI every 3 months; the final evaluation was in 05/2013. Results Both groups were comparable in demographics (SIRT: 8 males/4 females, mean age 72 ± 7 years; TACE: 10 males/2 females, mean age 71 ± 9 years), initial tumor load (1 patient ≥25% in each group), and BCLC (Barcelona Clinic Liver Cancer) stage (SIRT: 12×B; TACE: 1×A, 11×B). Median progression-free survival (PFS) was 180 days for SIRT versus 216 days for TACE patients (p = 0.6193), with a median TTP of 371 days versus 336 days, respectively (p = 0.5764). Median OS was 592 days for SIRT versus 788 days for TACE patients (p = 0.9271). Seven patients died in each group. Causes of death were liver failure (n = 4, SIRT group), tumor progression (n = 4, TACE group), cardiovascular events, and inconclusive (n = 1 in each group). Conclusions No significant differences were found in median PFS, OS, and TTP. The lower rate of tumor progression in the SIRT group was nullified by a greater incidence of liver failure. This pilot study is the first prospective randomized trial comparing SIRT and TACE for treating HCC, and results can be used for sample size calculations of future studies.

  7. A non-randomized confirmatory study regarding selection of fertility-sparing surgery for patients with epithelial ovarian cancer: Japan Clinical Oncology Group Study (JCOG1203).

    PubMed

    Satoh, Toyomi; Tsuda, Hitoshi; Kanato, Keisuke; Nakamura, Kenichi; Shibata, Taro; Takano, Masashi; Baba, Tsukasa; Ishikawa, Mitsuya; Ushijima, Kimio; Yaegashi, Nobuo; Yoshikawa, Hiroyuki

    2015-06-01

    Fertility-sparing treatment has been accepted as a standard treatment for epithelial ovarian cancer in stage IA non-clear cell histology grade 1/grade 2. In order to expand an indication of fertility-sparing treatment, we have started a non-randomized confirmatory trial for stage IA clear cell histology and stage IC unilateral non-clear cell histology grade 1/grade 2. The protocol-defined fertility-sparing surgery is optimal staging laparotomy including unilateral salpingo-oophorectomy, omentectomy, peritoneal cytology and pelvic and para-aortic lymph node dissection or biopsy. After fertility-sparing surgery, four to six cycles of adjuvant chemotherapy with paclitaxel and carboplatin are administered. We plan to enroll 250 patients with an indication of fertility-sparing surgery, and then the primary analysis is to be conducted for 63 operated patients with pathologically confirmed stage IA clear cell histology and stage IC unilateral non-clear cell histology grade 1/grade 2. The primary endpoint is 5-year overall survival. Secondary endpoints are other survival endpoints and factors related to reproduction. This trial has been registered at the UMIN Clinical Trials Registry as UMIN000013380. PMID:26059697

  8. A phase 2 randomized dose-ranging study of the JAK2-selective inhibitor fedratinib (SAR302503) in patients with myelofibrosis

    PubMed Central

    Pardanani, A; Tefferi, A; Jamieson, C; Gabrail, N Y; Lebedinsky, C; Gao, G; Liu, F; Xu, C; Cao, H; Talpaz, M

    2015-01-01

    In this phase 2 open-label randomized study, 31 patients with intermediate-2 or high-risk myelofibrosis received fedratinib 300, 400 or 500 mg once daily in consecutive 4-week cycles. Mean spleen volume reductions at 12 weeks (primary end point) were 30.3% (300 mg), 33.1% (400 mg) and 43.3% (500 mg). Spleen response rates (patients achieving ⩾35% spleen reduction) at 12/24 weeks were 30%/30% (300 mg), 50%/60% (400 mg) and 64%/55% (500 mg), respectively. By 4 weeks, improvements in myelofibrosis (MF)-associated symptoms were observed. At 48 weeks, 68% of patients remained on fedratinib and 16% had discontinued because of adverse events (AEs). Common grade 3/4 AEs were anemia (58%), fatigue (13%), diarrhea (13%), vomiting (10%) and nausea (6%). Serious AEs included one case of reversible hepatic failure and one case of Wernicke's encephalopathy (after analysis cutoff). Fedratinib treatment led to reduced STAT3 phosphorylation but no meaningful change in JAK2V617F allele burden. Significant modulation (P<0.05, adjusted for multiple comparisons) of 28 cytokines was observed, many of which correlated with spleen reduction. These data confirm the clinical activity of fedratinib in MF. After the analysis cutoff date, additional reports of Wernicke's encephalopathy in other fedratinib trials led to discontinuation of the sponsored clinical development program. PMID:26252788

  9. Random Vibrations

    NASA Technical Reports Server (NTRS)

    Messaro, Semma; Harrison, Phillip

    2010-01-01

    Ares I Zonal Random vibration environments due to acoustic impingement and combustion processes are developed for liftoff, ascent, and reentry. Random Vibration test criteria for Ares I Upper Stage pyrotechnic components are developed by enveloping the applicable zonal environments where each component is located. Random vibration tests will be conducted to assure that these components will survive and function appropriately after exposure to the expected vibration environments. Methodology: Random Vibration test criteria for Ares I Upper Stage pyrotechnic components were needed that would envelope all the applicable environments where each component was located. Applicable Ares I Vehicle drawings and design information needed to be assessed to determine the location(s) for each component on the Ares I Upper Stage. Design and test criteria needed to be developed by plotting and enveloping the applicable environments using Microsoft Excel Spreadsheet Software and documenting them in a report using Microsoft Word Processing Software. Conclusion: Random vibration liftoff, ascent, and green run design & test criteria for the Upper Stage Pyrotechnic Components were developed by using Microsoft Excel to envelope zonal environments applicable to each component. Results were transferred from Excel into a report using Microsoft Word. After the report is reviewed and edited by my mentor, it will be submitted for publication as an attachment to a memorandum. Pyrotechnic component designers will extract criteria from my report for incorporation into the design and test specifications for components. Eventually the hardware will be tested to the environments I developed to assure that the components will survive and function appropriately after exposure to the expected vibration environments.

  10. Embolization of the Gastroduodenal Artery Before Selective Internal Radiotherapy: A Prospectively Randomized Trial Comparing Platinum-Fibered Microcoils with the Amplatzer Vascular Plug II

    SciTech Connect

    Pech, Maciej; Kraetsch, Annett; Wieners, Gero; Redlich, Ulf; Gaffke, Gunnar; Ricke, Jens; Dudeck, Oliver

    2009-05-15

    The Amplatzer Vascular Plug II (AVP II) is a novel device for transcatheter vessel occlusion, for which only limited comparative data exist. Embolotherapy of the gastroduodenal artery (GDA) is essential before selective internal radiotherapy (SIRT) in order to prevent radiation-induced peptic ulcerations due to migration of yttrium-90 microspheres. The purpose of this study was to compare the vascular anatomical limitations, procedure time, effectiveness, and safety of embolization of the GDA with coils versus the AVP II. Fifty patients stratified for SIRT were prospectively randomized for embolization of the GDA with either coils or the AVP II. The angle between the aorta and the celiac trunk, diameter of the GDA, fluoroscopy time and total time for embolization, number of embolization devices, complications, and durability of vessel occlusion at follow-up angiography for SIRT were recorded. A t-test was used for statistical analysis. Embolizations with either coils or the AVP II were technically feasible in all but two patients scheduled for embolization of the GDA with the AVP II. In both cases the plug could not be positioned due to the small celiac trunk outlet angles of 17° and 21°. The mean diameter of the GDA was 3.7 mm (range, 2.2-4.8 mm) for both groups. The procedures differed significantly in fluoroscopy time (7.8 min for coils vs. 2.6 min for the AVP II; P < 0.001) and embolization time (23.1 min for coils vs. 8.8 min for the AVP II; P < 0.001). A mean of 6.0 ± 3.2 coils were used for GDA embolization, while no more than one AVP II was needed for successful vessel occlusion (P < 0.001). One coil migration occurred during coil embolization, whereas no procedural complication was encountered with the use of the AVP II. Vessel reperfusion was noted in only one patient, in whom coil embolization was performed. In conclusion, embolization of the GDA with the AVP II is safe, easy, rapid, and highly effective; only an extremely sharp-angled celiac trunk

  11. Promoting mobility after hip fracture (ProMo): study protocol and selected baseline results of a year-long randomized controlled trial among community-dwelling older people

    PubMed Central

    2011-01-01

    Background To cope at home, community-dwelling older people surviving a hip fracture need a sufficient amount of functional ability and mobility. There is a lack of evidence on the best practices supporting recovery after hip fracture. The purpose of this article is to describe the design, intervention and demographic baseline results of a study investigating the effects of a rehabilitation program aiming to restore mobility and functional capacity among community-dwelling participants after hip fracture. Methods/Design A population-based sample of community-dwelling men and women over 60 years of age operated on for hip fracture (n = 81, mean age 79 years, 78% women) participated in this study and was randomly allocated into control (Standard Care) and ProMo intervention groups on average 10 weeks post fracture and 6 weeks after discharge home. Standard Care included a written home exercise program with 5-7 exercises for the lower limbs. Of all participants, 12 received a referral to physiotherapy. After discharge home, only 50% adhered to Standard Care. None of the participants were followed up for Standard Care or mobility recovery. The ProMo intervention included Standard Care and a year-long program including evaluation/modification of environmental hazards, guidance for safe walking, pain management, a progressive home exercise program, and physical activity counseling. Measurements included a comprehensive battery of laboratory tests and self-report on mobility limitation, disability, physical functional capacity and health, as well as assessments for the key prerequisites for mobility, disability and functional capacity. All assessments were performed blinded at the research laboratory. No significant differences were observed between intervention and control groups in any of the demographic variables. Discussion Ten weeks post hip fracture, only half of the participants were compliant with Standard Care. No follow-up for Standard Care or mobility recovery occurred

  12. 49 CFR 655.45 - Random testing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... is notified of selection for random drug or random alcohol testing proceed to the test site..., DEPARTMENT OF TRANSPORTATION PREVENTION OF ALCOHOL MISUSE AND PROHIBITED DRUG USE IN TRANSIT OPERATIONS Types... section, the minimum annual percentage rate for random drug testing shall be 50 percent of...

  13. UGT1A6 and UGT2B15 Polymorphisms and Acetaminophen Conjugation in Response to a Randomized, Controlled Diet of Select Fruits and Vegetables

    PubMed Central

    Navarro, Sandi L.; Chen, Yu; Li, Lin; Li, Shuying S.; Chang, Jyh-Lurn; Schwarz, Yvonne; King, Irena B.; Potter, John D.; Bigler, Jeannette

    2011-01-01

    Acetaminophen (APAP) glucuronidation is thought to occur mainly by UDP-glucuronosyltransferases (UGT) in the UGT1A family. Interindividual variation in APAP glucuronidation is attributed in part to polymorphisms in UGT1As. However, evidence suggests that UGT2B15 may also be important. We evaluated, in a controlled feeding trial, whether APAP conjugation differed by UGT1A6 and UGT2B15 genotypes and whether supplementation of known dietary inducers of UGT (crucifers, soy, and citrus) modulated APAP glucuronidation compared with a diet devoid of fruits and vegetables (F&V). Healthy adults (n = 66) received 1000 mg of APAP orally on days 7 and 14 of each 2-week feeding period and collected saliva and urine over 12 h. Urinary recovery of the percentage of the APAP dose as free APAP was higher (P = 0.02), and the percentage as APAP glucuronide (APAPG) was lower (P = 0.004) in women. The percentage of APAP was higher among UGT1A6*1/*1 genotypes, relative to *1/*2 and *2/*2 genotypes (P = 0.045). For UGT2B15, the percentage of APAPG decreased (P < 0.0001) and that of APAP sulfate increased (P = 0.002) in an allelic dose-dependent manner across genotypes from *1/*1 to *2/*2. There was a significant diet × UGT2B15 genotype interaction for the APAPG ratio (APAPG/total metabolites × 100) (P = 0.03), with *1/*1 genotypes having an approximately 2-fold higher F&V to basal diet difference in response compared with *1/*2 and *2/*2 genotypes. Salivary APAP maximum concentration (Cmax) was significantly higher in women (P = 0.0003), with F&V (P = 0.003), and among UGT1A6*2/*2 and UGT2B15*1/*2 genotypes (P = 0.02 and 0.002, respectively). APAP half-life was longer in UGT2B15*2/*2 genotypes with F&V (P = 0.009). APAP glucuronidation was significantly influenced by the UGT2B15*2 polymorphism, supporting a role in vivo for UGT2B15 in APAP glucuronidation, whereas the contribution of UGT1A6*2 was modest. Selected F&V known to affect UGT activity led to greater glucuronidation and less

  14. Is random access memory random?

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    Most software is constructed on the assumption that the programs and data are stored in random access memory (RAM). Physical limitations on the relative speeds of processor and memory elements lead to a variety of memory organizations that match processor addressing rate with memory service rate. These include interleaved and cached memory. A very high fraction of a processor's address requests can be satisfied from the cache without reference to the main memory. The cache requests information from main memory in blocks that can be transferred at the full memory speed. Programmers who organize algorithms for locality can realize the highest performance from these computers.
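
    The locality effect described above is easy to observe even from a high-level language. The sketch below (NumPy is assumed; the array size and timings are illustrative only) sums the same C-order matrix twice: the row-by-row sweep reads memory sequentially and stays in cache, while the column-by-column sweep strides across the array and misses far more often.

    ```python
    import time
    import numpy as np

    # Illustrative only: traversal order changes how well the cache is used.
    # A C-order numpy array stores each row contiguously.
    a = np.random.rand(2000, 2000)

    t0 = time.perf_counter()
    row_total = sum(a[i, :].sum() for i in range(a.shape[0]))   # cache-friendly
    t1 = time.perf_counter()
    col_total = sum(a[:, j].sum() for j in range(a.shape[1]))   # cache-hostile
    t2 = time.perf_counter()

    print(f"row-wise {t1 - t0:.3f}s   column-wise {t2 - t1:.3f}s")
    ```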

  15. The moral importance of selecting people randomly.

    PubMed

    Peterson, Martin

    2008-07-01

    This article discusses some ethical principles for distributing pandemic influenza vaccine and other indivisible goods. I argue that a number of principles for distributing pandemic influenza vaccine recently adopted by several national governments are morally unacceptable because they put too much emphasis on utilitarian considerations, such as the ability of the individual to contribute to society. Instead, it would be better to distribute vaccine by setting up a lottery. The argument for this view is based on a purely consequentialist account of morality; i.e. an action is right if and only if its outcome is optimal. However, unlike utilitarians I do not believe that alternatives should be ranked strictly according to the amount of happiness or preference satisfaction they bring about. Even a mere chance to get some vaccine matters morally, even if it is never realized. PMID:18445094

  16. Selected Vegetables/Sun's Soup (PDQ)

    MedlinePlus

    ... Selected Vegetables/Sun’s Soup along with other treatments. Randomized controlled trials, enrolling larger numbers of people, are ... treatments, or both. None of the trials were randomized or controlled. Randomized clinical trials give the highest ...

  17. Can randomization be informative?

    NASA Astrophysics Data System (ADS)

    Pereira, Carlos A. B.; Campos, Thiago F.; Silva, Gustavo M.; Wechsler, Sergio

    2012-10-01

    In this paper, the Pair of Siblings Paradox introduced by Pereira [1] is extended by considering more than two children and more than one child observed for gender. We follow the same lines as Wechsler et al. [2], who generalize the three prisoners' dilemma introduced by Gardner [3]. This paper's conjecture is that the Pair of Siblings and the Three Prisoners dilemma are dual paradoxes. Looking at possible likelihoods, the sure (randomized) selection for the former is non-informative (informative), while the opposite holds for the latter. This situation is maintained for generalizations. A non-informative likelihood here means that prior and posterior are equal.

  18. Impact of patient-selected care buddies on adherence to HIV care, disease progression and conduct of daily life among pre-antiretroviral HIV-infected patients in Rakai, Uganda: a randomized controlled trial

    PubMed Central

    Nakigozi, Gertrude; Makumbi, Fredrick E.; Bwanika, John Baptist; Atuyambe, Lynn; Reynolds, Steven J.; Kigozi, Godfrey; Nalugoda, Fred; Chang, Larry W.; Kiggundu, Valerian; Serwadda, David; Wawer, Maria J.; Gray, Ronald H.; Kamya, Moses R.

    2015-01-01

    Background Data are limited on effects of household or community support persons (“care buddies”) on enrolment into and adherence to pre-antiretroviral HIV care. We assessed the impact of care buddies on adherence to HIV clinic appointments, HIV progression and conduct of daily life among pre-ART HIV-infected individuals in Rakai, Uganda. Methods 1209 HIV-infected pre-ART patients aged ≥15 years were randomized to standard of care (SOC) (n = 604) or patient-selected care buddy (PSCB) (n = 605) and followed at 6 and 12 months. Outcomes were adherence to clinic visits, HIV disease progression and self-reported conduct of daily life. Incidence and prevalence rate ratios and 95% confidence intervals (95% CI) were used to assess outcomes in the intent-to-treat and as-treated analyses. Results Baseline characteristics were comparable. In the ITT analysis, both arms were comparable with respect to adherence to CD4 monitoring visits (adjPRR 0.98, 95% CI 0.93-1.04, p=0.529) and HIV progression (adjPRR=1.00, 95% CI 0.77-1.31, p=0.946). Good conduct of daily life was significantly higher in the PSCB than the SOC arm (adjPRR 1.08, 95% CI 1.03-1.13, p=0.001). More men (61%) than women (30%) selected spouses/partners as buddies (p < 0.0001). Of the PSCB arm participants, 22% discontinued use of buddies. Conclusion In pre-ART persons, having care buddies improved the conduct of daily life of the HIV-infected patients but had no effect on HIV disease progression and only limited effect on clinic appointment adherence. PMID:26039929

  19. Evaluation of the effect of aromatherapy with Rosa damascena Mill. on postoperative pain intensity in hospitalized children in selected hospitals affiliated to Isfahan University of Medical Sciences in 2013: A randomized clinical trial

    PubMed Central

    Marofi, Maryam; Sirousfard, Motahareh; Moeini, Mahin; Ghanadi, Alireza

    2015-01-01

    Background: Pain is a common complication after surgery. The aim of this study was to evaluate the effect of aromatherapy with Rosa damascena Mill. on postoperative pain in children. Materials and Methods: In a double-blind, placebo-controlled clinical trial, we selected 64 children of 3–6 years of age through convenience sampling and divided them randomly into two groups. Patients in group A were given inhalation aromatherapy with R. damascena Mill., and in group B, the patients were given almond oil as a placebo. Inhalation aromatherapy was used at the first time of subjects’ arrival to the ward and then at 3, 6, 9, and 12 h afterward. Common palliative treatments to relieve pain were used in both groups. Thirty minutes after aromatherapy, the postoperative pain in children was evaluated with the Toddler Preschooler Postoperative Pain Scale (TPPPS). Data were statistically analyzed using Chi-square test, one-way analysis of variance (ANOVA), and repeated measures ANOVA. Results: There was no significant difference in pain scores at the first time of subjects’ arrival to the ward (before receiving any aromatherapy or palliative care) between the two groups. After each time of aromatherapy and at the end of treatment, the pain score was significantly reduced in the aromatherapy group with R. damascena Mill. compared to the placebo group. Conclusions: According to our results, aromatherapy with R. damascena Mill. can be used for postoperative pain in children, together with other common treatments, without any significant side effects. PMID:25878704

  20. Assessing the prevalence of the Metabolic Syndrome according to NCEP ATP III in Germany: feasibility and quality aspects of a two step approach in 1550 randomly selected primary health care practices

    PubMed Central

    Moebus, Susanne; Hanisch, Jens Ulrich; Neuhäuser, Markus; Aidelsburger, Pamela; Wasem, Jürgen; Jöckel, Karl-Heinz

    2006-01-01

    Objective: Metabolic Syndrome (MetSyn) describes a cluster of metabolic disorders and is considered a risk factor for the development of cardiovascular disease. Although a high prevalence is commonly assumed in Germany, data about the degree of its occurrence in the population and in subgroups are still missing. The aim of this study was to assess the prevalence of the MetSyn according to the NCEP ATP-III (National Cholesterol Education Program Adult Treatment Panel III) criteria in persons aged ≥18 years attending a general practitioner in Germany. Here we describe in detail the methods used and the feasibility of determining the MetSyn in a primary health care setting. Research design and methods: The Germany-wide cross-sectional study was performed during two weeks in October 2005. Blood samples were analyzed in a central laboratory. Waist circumference and blood pressure were assessed, and data on smoking, lifestyle, fasting status, socio-demographic characteristics and core information from non-participants were collected. Quality control procedures included telephone monitoring and random on-site visits. In order to achieve a maximal number of fasting blood samples with a minimal need for follow-up appointments, a stepwise approach was developed. Basic descriptive statistics were calculated, and the Taylor expansion method was used to estimate the standard errors needed for calculation of confidence intervals for clustered observations. Results: In total, 1511 randomly selected general practices from 397 out of 438 German cities and administrative districts enrolled 35,869 patients (age range: 18-99, women 61.1%). More than 50,000 blood samples were taken. Fasting blood samples were available for 49% of the participants. Of the participating patients, 99.3% returned questionnaires to the GP; only 12% were not filled out completely. The overall prevalence of the MetSyn (NCEP/ATP III 2001) was found to be 19.8%, with men showing higher prevalence rates than women (22.7% respective 18

  1. Randomized, Double-Blind, Placebo-Controlled, Multicenter Phase II Study of the Efficacy and Safety of Apricoxib in Combination With Either Docetaxel or Pemetrexed in Patients With Biomarker-Selected Non–Small-Cell Lung Cancer

    PubMed Central

    Edelman, Martin J.; Tan, Ming T.; Fidler, Mary J.; Sanborn, Rachel E.; Otterson, Greg; Sequist, Lecia V.; Evans, Tracey L.; Schneider, Bryan J.; Keresztes, Roger; Rogers, John S.; de Mayolo, Jorge Antunez; Feliciano, Josephine; Yang, Yang; Medeiros, Michelle; Zaknoen, Sara L.

    2015-01-01

    Purpose Overexpression of COX-2 correlates with advanced stage and worse outcomes in non–small-cell lung cancer (NSCLC), possibly as a result of elevated levels of COX-2–dependent prostaglandin E2 (PGE2). Exploratory analyses of studies that used COX-2 inhibitors have demonstrated potentially superior outcome in patients in whom the urinary metabolite of PGE2 (PGE-M) is suppressed. We hypothesized that patients with disease defined by PGE-M suppression would benefit from the addition of apricoxib to second-line docetaxel or pemetrexed. Patients and Methods Patients with NSCLC who had disease progression after one line of platinum-based therapy, performance status of 0 to 2, and normal organ function were potentially eligible. Only patients with a ≥ 50% decrease in urinary PGE-M after 5 days of treatment with apricoxib could enroll. Docetaxel 75 mg/m2 or pemetrexed 500 mg/m2, per the investigator's choice, was administered once every 21 days with apricoxib or placebo 400 mg once per day. The primary end point was progression-free survival (PFS). Exploratory analysis was performed regarding baseline urinary PGE-M and outcomes. Results In all, 101 patients completed screening, and 72 of the 80 who demonstrated ≥ 50% suppression were randomly assigned to apricoxib or placebo. Toxicity was similar between the arms. No improvement in PFS was seen with apricoxib versus placebo. The median PFS for the control arm was 97 days (95% CI, 52 to 193 days) versus 85 days (95% CI, 67 to 142 days) for the experimental arm (P = .91). Conclusion Apricoxib did not improve PFS, despite biomarker-driven patient selection. PMID:25452446

  2. Selective Transurethral Resection of the Prostate Combined with Transurethral Incision of the Bladder Neck for Bladder Outlet Obstruction in Patients with Small Volume Benign Prostate Hyperplasia (BPH): A Prospective Randomized Study

    PubMed Central

    Li, Xin; Pan, Jin-hong; Liu, Qi-gui; He, Peng; Song, Si-ji; Jiang, Tao; Zhou, Zhan-song

    2013-01-01

    Purpose Transurethral resection of the prostate (TURP) has a high failure rate in patients with small volume benign prostate hyperplasia (BPH) with bladder outlet obstruction (BOO). We describe and report the results of an alternative surgical method, selective transurethral resection of the prostate (STURP) in combination with transurethral incision of the bladder neck (TUIBN). Methods Patients were randomized to receive either TURP or STURP+TUIBN. Maximum urinary flow rate (Qmax), voided volume, and post-voiding residual volume (PVR) were assessed at baseline and at 1, 3, and 6 months after surgery. Efficacy of treatment was assessed by lower urinary tract symptoms and IPSS. Results Sixty-three patients received STURP+TUIBN and 61 received TURP. Surgical time, amount of prostate tissue resected, and blood loss were similar in both groups (all, p>0.05). The mean duration of follow-up was 9.02 and 8.53 months in patients receiving TURP and STURP+TUIBN, respectively. At 6 months postoperatively, IPSS was 4.26±1.22 and 4.18±1.47 in patients receiving TURP and STURP+TUIBN, respectively (p>0.05), and the Qmax in patients receiving STURP+TUIBN was markedly higher than in those receiving TURP (28.28±6.46 mL/s vs. 21.59±7.14 mL/s; p<0.05). Bladder neck contracture and urinary tract infections were observed in 3 and 5 patients receiving TURP, respectively, and in none receiving STURP+TUIBN. Conclusions STURP+TUIBN may offer a more effective and safer alternative to TURP for small volume BPH patients. PMID:23691002

  3. Efficacy of bupropion and the selective serotonin reuptake inhibitors in the treatment of anxiety symptoms in major depressive disorder: a meta-analysis of individual patient data from 10 double-blind, randomized clinical trials.

    PubMed

    Papakostas, George I; Trivedi, Madhukar H; Alpert, Jonathan E; Seifert, Cheryl A; Krishen, Alok; Goodale, Elizabeth P; Tucker, Vivian L

    2008-01-01

    The goal of this work was to compare the efficacy of the norepinephrine-dopamine reuptake inhibitor bupropion with the selective serotonin reuptake inhibitors (SSRIs) in the treatment of anxiety symptoms in major depressive disorder (MDD). Ten double-blind, randomized studies, involving a total of 2890 bupropion-, SSRI-, or placebo-treated patients, were pooled. Anxiety symptoms of depression were defined using the Hamilton depression rating scale (HDRS) Anxiety-Somatization factor (HDRS-AS) score, as well as the Hamilton anxiety scale (HAM-A) score. Both bupropion and the SSRIs led to a comparable degree of improvement in anxiety symptoms, defined using the HDRS-AS score (-3.8±2.8 vs. -3.9±2.8, p=0.130) or HAM-A score (-8.8±7.2 vs. -9.1±7.0, p=0.177). There was no consistent difference in the time to anxiolysis between the two treatment groups. In addition, there was no difference in the proportion of bupropion- and SSRI-treated remitters who continued to experience residual anxiety, defined as a HDRS-AS score >0 at endpoint (69.2% vs. 74.7%, p=0.081) or a HAM-A score >7 at endpoint (9.5% vs. 8.4%, p=0.284). Finally, there was no statistically significant difference in the severity of residual anxiety symptoms between bupropion- or SSRI-treated patients with remitted depression, defined using the HDRS-AS (1.15±1.14 vs. 1.25±1.09, p=0.569) or HAM-A scores at endpoint (3.30±2.89 vs. 3.31±2.89, p=0.552). Contrary to clinician impression, there does not appear to be any difference in the anxiolytic efficacy of bupropion and the SSRIs when used to treat MDD. PMID:17631898

  4. Covariate-based constrained randomization of group-randomized trials.

    PubMed

    Moulton, Lawrence H

    2004-01-01

    Group-randomized study designs are useful when individually randomized designs are either not possible, or will not be able to estimate the parameters of interest. Blocked and/or stratified (for example, pair-matched) designs have been used, and their properties statistically evaluated by many researchers. Group-randomized trials often have small numbers of experimental units, and strong, geographically induced between-unit correlation, which increase the chance of obtaining a "bad" randomization outcome. This article describes a procedure--random selection from a list of acceptable allocations--to allocate treatment conditions in a way that ensures balance on relevant covariates. Numerous individual- and group-level covariates can be balanced using exact or caliper criteria. Simulation results indicate that this method has good frequency properties, but some care may be needed not to overly constrain the randomization. There is a trade-off between achieving good balance through a highly constrained design, and jeopardizing the appearance of impartiality of the investigator and potentially departing from the nominal Type I error. PMID:16279255
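
    A minimal sketch of the procedure described above, i.e. random selection from a list of acceptable allocations. The eight communities, their covariate values, and the caliper on group means are all hypothetical stand-ins; the article itself allows many covariates and exact or caliper criteria.

    ```python
    import itertools
    import random

    # Hypothetical example: 8 communities, each with one known covariate value.
    covariate = {0: 3.1, 1: 2.7, 2: 4.0, 3: 3.5, 4: 2.9, 5: 3.8, 6: 3.3, 7: 3.6}
    units = list(covariate)
    caliper = 0.2  # maximum tolerated difference in group means (illustrative)

    def balanced(treat):
        control = [u for u in units if u not in treat]
        mean = lambda g: sum(covariate[u] for u in g) / len(g)
        return abs(mean(treat) - mean(control)) <= caliper

    # Enumerate all 4-vs-4 splits, keep the acceptable ones, pick one at random.
    acceptable = [set(c) for c in itertools.combinations(units, 4) if balanced(set(c))]
    chosen = random.choice(acceptable)
    print(f"{len(acceptable)} acceptable allocations; treatment arm: {sorted(chosen)}")
    ```

    Constraining too tightly (a very small caliper) shrinks the list of acceptable allocations, which is exactly the trade-off against impartiality and nominal Type I error that the abstract warns about.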

  5. Random broadcast on random geometric graphs

    SciTech Connect

    Bradonjic, Milan; Elsasser, Robert; Friedrich, Tobias

    2009-01-01

    In this work, we consider the random broadcast time on random geometric graphs (RGGs). The classic random broadcast model, also known as the push algorithm, is defined as follows: starting with one informed node, in each succeeding round every informed node chooses one of its neighbors uniformly at random and informs it. We consider the random broadcast time on RGGs, when with high probability: (i) the RGG is connected, or (ii) there exists a giant component in the RGG. We show that the random broadcast time is bounded by O(√n + diam(component)), where diam(component) is the diameter of the entire graph, or of the giant component, for regimes (i) and (ii), respectively. In other words, for both regimes, we derive the broadcast time to be Θ(diam(G)), which is asymptotically optimal.
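
    A simulation sketch of the push model as defined above. It assumes the networkx library is available; the node count and connection radius are arbitrary choices intended to keep the graph connected with high probability.

    ```python
    import random
    import networkx as nx

    # Push broadcast on a random geometric graph (parameters are illustrative).
    n, radius = 500, 0.1
    G = nx.random_geometric_graph(n, radius)

    informed = {0}                      # start with one informed node
    rounds = 0
    while len(informed) < n and rounds < 10 * n:   # guard against disconnection
        new = set()
        for u in informed:
            neighbors = list(G[u])
            if neighbors:               # each informed node pushes to one
                new.add(random.choice(neighbors))   # uniformly random neighbor
        informed |= new
        rounds += 1

    print(f"informed {len(informed)}/{n} nodes in {rounds} rounds")
    ```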

  6. Quantumness, Randomness and Computability

    NASA Astrophysics Data System (ADS)

    Solis, Aldo; Hirsch, Jorge G.

    2015-06-01

    Randomness plays a central role in the quantum mechanical description of our interactions. We review the relationship between the violation of Bell inequalities, non-signaling and randomness. We discuss the challenge in defining a random string, and show that algorithmic information theory provides a necessary condition for randomness using Borel normality. We close with a view on incomputability and its implications in physics.
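
    As a rough illustration of Borel normality as a necessary condition for randomness: in a random bit string, every block of length k should occur with frequency close to 2^-k. The sketch below only reports the worst deviation; the tolerance against which one would actually test is Calude's precise bound, not reproduced here.

    ```python
    from collections import Counter
    import random

    # Count non-overlapping k-bit block frequencies in a bit string.
    def block_frequencies(bits: str, k: int) -> dict:
        blocks = [bits[i:i + k] for i in range(0, len(bits) - k + 1, k)]
        total = len(blocks)
        return {b: c / total for b, c in Counter(blocks).items()}

    bits = "".join(random.choice("01") for _ in range(100_000))
    for k in (1, 2, 3):
        freqs = block_frequencies(bits, k)
        worst = max(abs(f - 2 ** -k) for f in freqs.values())
        print(f"k={k}: worst deviation from 2^-{k} is {worst:.4f}")
    ```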

  7. How random is a random vector?

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo

    2015-12-01

    Over 80 years ago Samuel Wilks proposed that the "generalized variance" of a random vector is the determinant of its covariance matrix. To date, the notion and use of the generalized variance is confined only to very specific niches in statistics. In this paper we establish that the "Wilks standard deviation" (the square root of the generalized variance) is indeed the standard deviation of a random vector. We further establish that the "uncorrelation index" (a derivative of the Wilks standard deviation) is a measure of the overall correlation between the components of a random vector. Both the Wilks standard deviation and the uncorrelation index are, respectively, special cases of two general notions that we introduce: "randomness measures" and "independence indices" of random vectors. In turn, these general notions give rise to "randomness diagrams": tangible planar visualizations that answer the question: How random is a random vector? The notion of "independence indices" yields a novel measure of correlation for Lévy laws. In general, the concepts and results presented in this paper are applicable to any field of science and engineering with random-vector empirical data.
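
    A minimal numerical sketch following only the definitions stated in the abstract: the generalized variance is the determinant of the covariance matrix, and the Wilks standard deviation is its square root. The example distribution is arbitrary and NumPy is assumed.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.multivariate_normal(mean=[0, 0, 0],
                                cov=[[1.0, 0.3, 0.0],
                                     [0.3, 2.0, 0.5],
                                     [0.0, 0.5, 1.5]],
                                size=10_000)

    cov = np.cov(X, rowvar=False)               # sample covariance matrix
    generalized_variance = np.linalg.det(cov)   # Wilks' generalized variance
    wilks_sd = np.sqrt(generalized_variance)    # its square root, per the abstract
    print(f"generalized variance: {generalized_variance:.3f}, Wilks SD: {wilks_sd:.3f}")
    ```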

  8. Image segmentation using random features

    NASA Astrophysics Data System (ADS)

    Bull, Geoff; Gao, Junbin; Antolovich, Michael

    2014-01-01

    This paper presents a novel algorithm for selecting random features via compressed sensing to improve the performance of Normalized Cuts in image segmentation. Normalized Cuts is a clustering algorithm that has been widely applied to segmenting images, using features such as brightness, intervening contours and Gabor filter responses. Some drawbacks of Normalized Cuts are that computation times and memory usage can be excessive, and the obtained segmentations are often poor. This paper addresses the need to improve the processing time of Normalized Cuts while improving the segmentations. A significant proportion of the time in calculating Normalized Cuts is spent computing an affinity matrix. A new algorithm has been developed that selects random features using compressed sensing techniques to reduce the computation needed for the affinity matrix. The new algorithm, when compared to the standard implementation of Normalized Cuts for segmenting images from the BSDS500, produces better segmentations in significantly less time.
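
    One plausible reading of the idea, sketched under explicit assumptions: compress per-pixel feature vectors with a random projection (in the spirit of compressed sensing) before building the Gaussian affinity matrix that Normalized Cuts consumes, so pairwise distances are computed in a much lower dimension. This is not the authors' exact construction; the feature dimensions, the bandwidth sigma, and the stand-in features are all hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_pixels, d_full, d_small = 500, 64, 8

    features = rng.normal(size=(n_pixels, d_full))             # stand-in features
    P = rng.normal(size=(d_full, d_small)) / np.sqrt(d_small)  # random projection
    compressed = features @ P

    # Gaussian affinity on the compressed features (sigma = 1.0 is arbitrary).
    sq_dists = ((compressed[:, None, :] - compressed[None, :, :]) ** 2).sum(axis=-1)
    affinity = np.exp(-sq_dists / (2 * 1.0 ** 2))
    print(affinity.shape)   # (500, 500), ready for spectral clustering
    ```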

  9. Nonvolatile random access memory

    NASA Technical Reports Server (NTRS)

    Wu, Jiin-Chuan (Inventor); Stadler, Henry L. (Inventor); Katti, Romney R. (Inventor)

    1994-01-01

    A nonvolatile magnetic random access memory can be achieved by an array of magnet-Hall effect (M-H) elements. The storage function is realized with a rectangular thin-film ferromagnetic material having an in-plane, uniaxial anisotropy and in-plane bipolar remanent magnetization states. The thin-film magnetic element is magnetized by a local applied field, whose direction is used to form either a 0 or 1 state. The element remains in the 0 or 1 state until a switching field is applied to change its state. The stored information is detected by a Hall-effect sensor which senses the fringing field from the magnetic storage element. The circuit design for addressing each cell includes transistor switches for providing a current of selected polarity to store a binary digit through a separate conductor overlying the magnetic element of the cell. To read out a stored binary digit, transistor switches are employed to provide a current through a row of Hall-effect sensors connected in series and enabling a differential voltage amplifier connected to all Hall-effect sensors of a column in series. To avoid read-out voltage errors due to shunt currents through resistive loads of the Hall-effect sensors of other cells in the same column, at least one transistor switch is provided between every pair of adjacent cells in every row which are not turned on except in the row of the selected cell.

  10. Random Item IRT Models

    ERIC Educational Resources Information Center

    De Boeck, Paul

    2008-01-01

    It is common practice in IRT to consider items as fixed and persons as random. Both continuous and categorical person parameters are most often random variables, whereas for items only continuous parameters are used and they are commonly of the fixed type, although exceptions occur. It is shown in the present article that random item parameters…

  11. Randomization in robot tasks

    NASA Technical Reports Server (NTRS)

    Erdmann, Michael

    1992-01-01

    This paper investigates the role of randomization in the solution of robot manipulation tasks. One example of randomization is shown by the strategy of shaking a bin holding a part in order to orient the part in a desired stable state with some high probability. Randomization can be useful for mobile robot navigation and as a means of guiding the design process.

  12. Signal Detection Models with Random Participant and Item Effects

    ERIC Educational Resources Information Center

    Rouder, Jeffrey N.; Lu, Jun; Sun, Dongchu; Speckman, Paul; Morey, Richard; Naveh-Benjamin, Moshe

    2007-01-01

    The theory of signal detection is convenient for measuring mnemonic ability in recognition memory paradigms. In these paradigms, randomly selected participants are asked to study randomly selected items. In practice, researchers aggregate data across items or participants or both. The signal detection model is nonlinear; consequently, analysis…

  13. Quantum random number generation

    SciTech Connect

    Ma, Xiongfeng; Yuan, Xiao; Cao, Zhu; Zhang, Zhen; Qi, Bing

    2016-01-01

    Here, quantum physics can be exploited to generate true random numbers, which play important roles in many applications, especially in cryptography. Genuine randomness from the measurement of a quantum system reveals the inherent nature of quantumness — coherence, an important feature that differentiates quantum mechanics from classical physics. The generation of genuine randomness is generally considered impossible with only classical means. Based on the degree of trustworthiness on devices, quantum random number generators (QRNGs) can be grouped into three categories. The first category, practical QRNG, is built on fully trusted and calibrated devices and typically can generate randomness at a high speed by properly modeling the devices. The second category is self-testing QRNG, where verifiable randomness can be generated without trusting the actual implementation. The third category, semi-self-testing QRNG, is an intermediate category which provides a tradeoff between the trustworthiness on the device and the random number generation speed.

  14. Quantum random number generation

    DOE PAGESBeta

    Ma, Xiongfeng; Yuan, Xiao; Cao, Zhu; Zhang, Zhen; Qi, Bing

    2016-06-28

    Here, quantum physics can be exploited to generate true random numbers, which play important roles in many applications, especially in cryptography. Genuine randomness from the measurement of a quantum system reveals the inherent nature of quantumness -- coherence, an important feature that differentiates quantum mechanics from classical physics. The generation of genuine randomness is generally considered impossible with only classical means. Based on the degree of trustworthiness on devices, quantum random number generators (QRNGs) can be grouped into three categories. The first category, practical QRNG, is built on fully trusted and calibrated devices and typically can generate randomness at a high speed by properly modeling the devices. The second category is self-testing QRNG, where verifiable randomness can be generated without trusting the actual implementation. The third category, semi-self-testing QRNG, is an intermediate category which provides a tradeoff between the trustworthiness on the device and the random number generation speed.

  15. Random pulse generator

    NASA Technical Reports Server (NTRS)

    Lindsey, R. S., Jr. (Inventor)

    1975-01-01

    An exemplary embodiment of the present invention provides a source of random-width and randomly spaced rectangular voltage pulses whose mean or average frequency of operation is controllable within prescribed limits of about 10 hertz to 1 megahertz. A pair of thin-film metal resistors are used to provide a differential white noise voltage pulse source. Pulse shaping and amplification circuitry provide relatively short-duration pulses of constant amplitude which are applied to anti-bounce logic circuitry to prevent ringing effects. The pulse outputs from the anti-bounce circuits are then used to control two one-shot multivibrators whose output comprises the random-length and randomly spaced rectangular pulses. Means are provided for monitoring, calibrating and evaluating the relative randomness of the generator.

  16. Reflecting Random Flights

    NASA Astrophysics Data System (ADS)

    De Gregorio, Alessandro; Orsingher, Enzo

    2015-09-01

    We consider random flights reflecting on the surface of a sphere with center at the origin and radius R, where reflection is performed by means of circular inversion. The random flights studied in this paper are motions where the orientations of the deviations are uniformly distributed on the unit-radius sphere. We obtain the explicit probability distributions of the position of the moving particle when the number of changes of direction is fixed and equal to n. We show that these distributions involve functions which are solutions of the Euler-Poisson-Darboux equation. The unconditional probability distributions of the reflecting random flights are obtained by suitably randomizing n by means of a fractional-type Poisson process. Random flights reflecting on hyperplanes according to the optical reflection form are also considered and the related distributional properties derived.

  17. Quantum random number generator

    DOEpatents

    Pooser, Raphael C.

    2016-05-10

    A quantum random number generator (QRNG) and a photon generator for a QRNG are provided. The photon generator may be operated in a spontaneous mode below a lasing threshold to emit photons. Photons emitted from the photon generator may have at least one random characteristic, which may be monitored by the QRNG to generate a random number. In one embodiment, the photon generator may include a photon emitter and an amplifier coupled to the photon emitter. The amplifier may enable the photon generator to be used in the QRNG without introducing significant bias in the random number and may enable multiplexing of multiple random numbers. The amplifier may also desensitize the photon generator to fluctuations in power supplied thereto while operating in the spontaneous mode. In one embodiment, the photon emitter and amplifier may be a tapered diode amplifier.

  18. Autonomous Byte Stream Randomizer

    NASA Technical Reports Server (NTRS)

    Paloulian, George K.; Woo, Simon S.; Chow, Edward T.

    2013-01-01

    Net-centric networking environments are often faced with limited resources and must utilize bandwidth as efficiently as possible. In networking environments that span wide areas, the data transmission has to be efficient without any redundant or exuberant metadata. The Autonomous Byte Stream Randomizer software provides an extra level of security on top of existing data encryption methods. Randomizing the data's byte stream adds an extra layer to existing data protection methods, thus making it harder for an attacker to decrypt protected data. Based on a generated cryptographically secure random seed, a random sequence of numbers is used to intelligently and efficiently swap the organization of bytes in data using the unbiased and memory-efficient in-place Fisher-Yates shuffle method. Swapping bytes and reorganizing the crucial structure of the byte data renders the data file unreadable and leaves the data in a deconstructed state. This deconstruction adds an extra level of security requiring the byte stream to be reconstructed with the random seed in order to be readable. Once the data byte stream has been randomized, the software enables the data to be distributed to N nodes in an environment. Each piece of the data in randomized and distributed form is a separate entity unreadable in its own right, but when combined with all N pieces, can be reconstructed back into one. Reconstruction requires possession of the key used for randomizing the bytes, leading to the generation of the same cryptographically secure random sequence of numbers used to randomize the data. This software is a cornerstone capability possessing the ability to generate the same cryptographically secure sequence on different machines and time intervals, thus allowing this software to be used more heavily in net-centric environments where data transfer bandwidth is limited.
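
    A minimal sketch of the core mechanism described above: a seeded Fisher-Yates shuffle scrambles a byte stream in place, and replaying the same seed lets the receiver undo the permutation. Python's PRNG stands in for the cryptographically secure generator the article calls for.

    ```python
    import random

    def shuffle_bytes(data: bytearray, seed: int) -> list:
        rng = random.Random(seed)
        swaps = []
        for i in range(len(data) - 1, 0, -1):      # Fisher-Yates, in place
            j = rng.randint(0, i)
            data[i], data[j] = data[j], data[i]
            swaps.append((i, j))
        return swaps

    def unshuffle_bytes(data: bytearray, seed: int) -> None:
        # Regenerate the same swap sequence from the seed, then undo it in reverse.
        swaps = shuffle_bytes(bytearray(len(data)), seed)
        for i, j in reversed(swaps):
            data[i], data[j] = data[j], data[i]

    msg = bytearray(b"attack at dawn")
    shuffle_bytes(msg, seed=42)        # deconstructed, unreadable form
    unshuffle_bytes(msg, seed=42)      # same seed restores the original
    assert msg == b"attack at dawn"
    ```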

  19. Phase III, Randomized, Open-Label Study of Daily Imatinib Mesylate 400 mg Versus 800 mg in Patients With Newly Diagnosed, Previously Untreated Chronic Myeloid Leukemia in Chronic Phase Using Molecular End Points: Tyrosine Kinase Inhibitor Optimization and Selectivity Study

    PubMed Central

    Cortes, Jorge E.; Baccarani, Michele; Guilhot, François; Druker, Brian J.; Branford, Susan; Kim, Dong-Wook; Pane, Fabrizio; Pasquini, Ricardo; Goldberg, Stuart L.; Kalaycio, Matt; Moiraghi, Beatriz; Rowe, Jacob M.; Tothova, Elena; De Souza, Carmino; Rudoltz, Marc; Yu, Richard; Krahnke, Tillmann; Kantarjian, Hagop M.; Radich, Jerald P.; Hughes, Timothy P.

    2010-01-01

    Purpose To evaluate the safety and efficacy of initial treatment with imatinib mesylate 800 mg/d (400 mg twice daily) versus 400 mg/d in patients with newly diagnosed chronic myeloid leukemia in chronic phase. Patients and Methods A total of 476 patients were randomly assigned 2:1 to imatinib 800 mg (n = 319) or 400 mg (n = 157) daily. The primary end point was the major molecular response (MMR) rate at 12 months. Results At 12 months, differences in MMR and complete cytogenetic response (CCyR) rates were not statistically significant (MMR, 46% v 40%; P = .2035; CCyR, 70% v 66%; P = .3470). However, MMR occurred faster among patients randomly assigned to imatinib 800 mg/d, who had higher rates of MMR at 3 and 6 months compared with those in the imatinib 400-mg/d arm (P = .0035 by log-rank test). CCyR also occurred faster in the 800-mg/d arm (CCyR at 6 months, 57% v 45%; P = .0146). The most common adverse events were edema, gastrointestinal problems, and rash, and all were more common in patients in the 800-mg/d arm. Grades 3 to 4 hematologic toxicity also occurred more frequently in patients receiving imatinib 800 mg/d. Conclusion MMR rates at 1 year were similar with imatinib 800 mg/d and 400 mg/d, but MMR and CCyR occurred earlier in patients treated with 800 mg/d. Continued follow-up is needed to determine the clinical significance of earlier responses on high-dose imatinib. PMID:20008622

  20. Statistical properties of randomization in clinical trials.

    PubMed

    Lachin, J M

    1988-12-01

    This is the first of five articles on the properties of different randomization procedures used in clinical trials. This paper presents definitions and discussions of the statistical properties of randomization procedures as they relate to both the design of a clinical trial and the statistical analysis of trial results. The subsequent papers consider, respectively, the properties of simple (complete), permuted-block (i.e., blocked), and urn (adaptive biased-coin) randomization. The properties described herein are the probabilities of treatment imbalances and the potential effects on the power of statistical tests; the permutational basis for statistical tests; and the potential for experimental biases in the assessment of treatment effects due either to the predictability of the random allocations (selection bias) or the susceptibility of the randomization procedure to covariate imbalances (accidental bias). For most randomization procedures, the probabilities of overall treatment imbalances are readily computed, even when a stratified randomization is used. This is important because treatment imbalance may affect statistical power. It is shown, however, that treatment imbalance must be substantial before power is more than trivially affected. The differences between a population versus a permutation model as a basis for a statistical test are reviewed. It is argued that a population model can only be invoked in clinical trials as an untestable assumption, rather than being formally based on sampling at random from a population. On the other hand, a permutational analysis based on the randomization actually employed requires no assumptions regarding the origin of the samples of patients studied. The large sample permutational distribution of the family of linear rank tests is described as a basis for easily conducting a variety of permutation tests. Subgroup (stratified) analyses, analyses when some data are missing, and regression model analyses are also
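
    The imbalance properties discussed above are easy to explore by simulation. The sketch below (trial size, block size, and replication count are arbitrary) contrasts the maximum treatment imbalance under simple (complete) randomization with permuted blocks, which cap the running imbalance at half the block size.

    ```python
    import random

    def simple_randomization(n):
        return [random.randint(0, 1) for _ in range(n)]

    def permuted_blocks(n, block_size=4):
        arms = []
        while len(arms) < n:
            block = [0] * (block_size // 2) + [1] * (block_size // 2)
            random.shuffle(block)      # balanced within every block
            arms.extend(block)
        return arms[:n]

    def max_imbalance(arms):
        diff, worst = 0, 0
        for a in arms:
            diff += 1 if a else -1
            worst = max(worst, abs(diff))
        return worst

    n, reps = 100, 2000
    print("simple :", sum(max_imbalance(simple_randomization(n)) for _ in range(reps)) / reps)
    print("blocked:", sum(max_imbalance(permuted_blocks(n)) for _ in range(reps)) / reps)
    ```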

  1. Random one-of-N selector

    DOEpatents

    Kronberg, J.W.

    1993-04-20

    An apparatus for selecting at random one item of N items on the average comprising counter and reset elements for counting repeatedly between zero and N, a number selected by the user, a circuit for activating and deactivating the counter, a comparator to determine if the counter stopped at a count of zero, an output to indicate an item has been selected when the count is zero or not selected if the count is not zero. Randomness is provided by having the counter cycle very often while varying the relatively longer duration between activation and deactivation of the count. The passive circuit components of the activating/deactivating circuit and those of the counter are selected for the sensitivity of their response to variations in temperature and other physical characteristics of the environment so that the response time of the circuitry varies. Additionally, the items themselves, which may be people, may vary in shape or the time they press a pushbutton, so that, for example, an ultrasonic beam broken by the item or person passing through it will add to the duration of the count and thus to the randomness of the selection.

  2. Random one-of-N selector

    DOEpatents

    Kronberg, James W.

    1993-01-01

    An apparatus for selecting at random one item of N items on the average comprising counter and reset elements for counting repeatedly between zero and N, a number selected by the user, a circuit for activating and deactivating the counter, a comparator to determine if the counter stopped at a count of zero, an output to indicate an item has been selected when the count is zero or not selected if the count is not zero. Randomness is provided by having the counter cycle very often while varying the relatively longer duration between activation and deactivation of the count. The passive circuit components of the activating/deactivating circuit and those of the counter are selected for the sensitivity of their response to variations in temperature and other physical characteristics of the environment so that the response time of the circuitry varies. Additionally, the items themselves, which may be people, may vary in shape or the time they press a pushbutton, so that, for example, an ultrasonic beam broken by the item or person passing through it will add to the duration of the count and thus to the randomness of the selection.
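
    A software simulation of the selection principle these two records describe: a counter cycles rapidly through 0..N-1, the activation window ends at an effectively unpredictable moment, and the item is selected only if the counter stopped at zero. The jitter model below is a hypothetical stand-in for the analog timing variation the patent relies on.

    ```python
    import random

    def one_of_n_selected(n: int) -> bool:
        # Duration of the activation window varies unpredictably (jitter model).
        elapsed_ticks = random.randint(10_000, 20_000) + random.randint(0, n - 1)
        counter = elapsed_ticks % n      # where the cycling counter stopped
        return counter == 0              # selected iff it stopped on zero

    trials = 100_000
    hits = sum(one_of_n_selected(10) for _ in range(trials))
    print(f"selected {hits}/{trials} (expected about {trials // 10})")
    ```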

  3. Randomness for Free

    NASA Astrophysics Data System (ADS)

    Chatterjee, Krishnendu; Doyen, Laurent; Gimbert, Hugo; Henzinger, Thomas A.

    We consider two-player zero-sum games on graphs. These games can be classified on the basis of the information of the players and on the mode of interaction between them. On the basis of information the classification is as follows: (a) partial-observation (both players have partial view of the game); (b) one-sided complete-observation (one player has complete observation); and (c) complete-observation (both players have complete view of the game). On the basis of mode of interaction we have the following classification: (a) concurrent (players interact simultaneously); and (b) turn-based (players interact in turn). The two sources of randomness in these games are randomness in transition function and randomness in strategies. In general, randomized strategies are more powerful than deterministic strategies, and randomness in transitions gives more general classes of games. We present a complete characterization for the classes of games where randomness is not helpful in: (a) the transition function (probabilistic transition can be simulated by deterministic transition); and (b) strategies (pure strategies are as powerful as randomized strategies). As consequence of our characterization we obtain new undecidability results for these games.

  4. Correlated randomness and switching phenomena

    NASA Astrophysics Data System (ADS)

    Stanley, H. E.; Buldyrev, S. V.; Franzese, G.; Havlin, S.; Mallamace, F.; Kumar, P.; Plerou, V.; Preis, T.

    2010-08-01

    One challenge of biology, medicine, and economics is that the systems treated by these serious scientific disciplines have no perfect metronome in time and no perfect spatial architecture, crystalline or otherwise. Nonetheless, as if by magic, out of nothing but randomness one finds remarkably fine-tuned processes in time and remarkably fine-tuned structures in space. Further, many of these processes and structures have the remarkable feature of “switching” from one behavior to another as if by magic. The past century has, philosophically, been concerned with placing aside the human tendency to see the universe as a fine-tuned machine. Here we will address the challenge of uncovering how, through randomness (albeit, as we shall see, strongly correlated randomness), one can arrive at some of the many spatial and temporal patterns in biology, medicine, and economics, and even begin to characterize the switching phenomena that enable a system to pass from one state to another. Inspired by principles developed by A. Nihat Berker and scores of other statistical physicists in recent years, we discuss some applications of correlated randomness to understand switching phenomena in various fields. Specifically, we present evidence from experiments and from computer simulations supporting the hypothesis that water’s anomalies are related to a switching point (which is not unlike the “tipping point” immortalized by Malcolm Gladwell), and that the bubbles in economic phenomena that occur on all scales are not “outliers” (another Gladwell immortalization). Though more speculative, we support the idea of disease as arising from some kind of yet-to-be-understood complex switching phenomenon, by discussing data on selected examples, including heart disease and Alzheimer disease.

  5. Parametric models for samples of random functions

    SciTech Connect

    Grigoriu, M.

    2015-09-15

    A new class of parametric models, referred to as sample parametric models, is developed for random elements that match samples rather than the first two moments and/or other global properties of these elements. The models can be used to characterize, e.g., material properties at small scale, in which case their samples represent microstructures of material specimens selected at random from a population. The samples of the proposed models are elements of finite-dimensional vector spaces spanned by samples, eigenfunctions of Karhunen–Loève (KL) representations, or modes of singular value decompositions (SVDs). The implementation of sample parametric models requires knowledge of the probability laws of target random elements. Numerical examples including stochastic processes and random fields are used to demonstrate the construction of sample parametric models, assess their accuracy, and illustrate how these models can be used to solve stochastic equations efficiently.
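
    A toy sketch of the SVD variant mentioned above: new samples are generated in the finite-dimensional space spanned by the leading modes of an observed ensemble. The ensemble itself, the number of retained modes, and the Gaussian resampling of mode coefficients are illustrative assumptions, not the paper's construction.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    t = np.linspace(0, 1, 200)
    # Toy ensemble: 50 random-amplitude smooth signals plus noise.
    samples = np.array([a * np.sin(2 * np.pi * t) + b * t + 0.05 * rng.normal(size=t.size)
                        for a, b in rng.normal(size=(50, 2))])

    # Modes of the centered ensemble via SVD.
    mean = samples.mean(axis=0)
    U, s, Vt = np.linalg.svd(samples - mean, full_matrices=False)
    k = 3                                   # keep the leading k modes
    modes, coeffs = Vt[:k], U[:, :k] * s[:k]

    # New synthetic samples: resample mode coefficients from their empirical
    # mean/covariance and recombine (a Gaussian assumption, for illustration).
    new_coeffs = rng.multivariate_normal(coeffs.mean(axis=0),
                                         np.cov(coeffs, rowvar=False), size=5)
    new_samples = mean + new_coeffs @ modes
    print(new_samples.shape)   # (5, 200)
    ```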

  6. The Generation of Random Equilateral Polygons

    NASA Astrophysics Data System (ADS)

    Alvarado, Sotero; Calvo, Jorge Alberto; Millett, Kenneth C.

    2011-04-01

    Freely jointed random equilateral polygons serve as a common model for polymer rings, reflecting their statistical properties under theta conditions. To generate equilateral polygons, researchers employ many procedures that have been proved, or at least are believed, to be random with respect to the natural measure on the space of polygonal knots. As a result, the random selection of equilateral polygons, as well as the statistical robustness of this selection, is of particular interest. In this research, we study the key features of four popular methods: the Polygonal Folding, the Crankshaft Rotation, the Hedgehog, and the Triangle Methods. In particular, we compare the implementation and efficacy of these procedures, especially with regard to the population distribution of polygons in the space of polygonal knots, the distribution of edge vectors, the local curvature, and the local torsion. In addition, we give a rigorous proof that the Crankshaft Rotation Method is ergodic.
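
    A sketch of a single Crankshaft Rotation move, one of the four methods compared above: two vertices are chosen as pivots, and the sub-chain between them is rigidly rotated about the axis joining them, which preserves both closure and edge lengths. The starting polygon and RNG seed are arbitrary; NumPy is assumed.

    ```python
    import numpy as np

    def crankshaft_move(verts: np.ndarray, rng: np.random.Generator) -> np.ndarray:
        n = len(verts)
        i, j = sorted(rng.choice(n, size=2, replace=False))
        axis = verts[j] - verts[i]
        axis /= np.linalg.norm(axis)
        theta = rng.uniform(0, 2 * np.pi)

        # Rodrigues' rotation formula about the pivot axis.
        K = np.array([[0, -axis[2], axis[1]],
                      [axis[2], 0, -axis[0]],
                      [-axis[1], axis[0], 0]])
        R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

        out = verts.copy()
        out[i + 1:j] = verts[i] + (verts[i + 1:j] - verts[i]) @ R.T
        return out

    # Start from a regular (hence equilateral) hexagon in the plane.
    angles = np.linspace(0, 2 * np.pi, 6, endpoint=False)
    poly = np.stack([np.cos(angles), np.sin(angles), np.zeros(6)], axis=1)

    rng = np.random.default_rng(7)
    moved = crankshaft_move(poly, rng)
    edges = np.linalg.norm(np.roll(moved, -1, axis=0) - moved, axis=1)
    print(np.allclose(edges, edges[0]))   # edge lengths preserved -> True
    ```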

  7. Optofluidic random laser

    NASA Astrophysics Data System (ADS)

    Shivakiran Bhaktha, B. N.; Bachelard, Nicolas; Noblin, Xavier; Sebbah, Patrick

    2012-10-01

    Random lasing is reported in a dye-circulated structured polymeric microfluidic channel. The role of disorder, which results from limited accuracy of photolithographic process, is demonstrated by the variation of the emission spectrum with local-pump position and by the extreme sensitivity to a local perturbation of the structure. Thresholds comparable to those of conventional microfluidic lasers are achieved, without the hurdle of state-of-the-art cavity fabrication. Potential applications of optofluidic random lasers for on-chip sensors are discussed. Introduction of random lasers in the field of optofluidics is a promising alternative to on-chip laser integration with light and fluidic functionalities.

  8. Random array grid collimator

    DOEpatents

    Fenimore, E.E.

    1980-08-22

    A hexagonally shaped quasi-random no-two-holes-touching grid collimator. The quasi-random array grid collimator eliminates contamination from small-angle off-axis rays by using a no-two-holes-touching pattern which simultaneously provides for a self-supporting array, increasing throughput by elimination of a substrate. The present invention also provides maximum throughput using hexagonally shaped holes in a hexagonal lattice pattern for diffraction-limited applications. Mosaicking is also disclosed for reducing fabrication effort.

  9. Trends in the selection of insecticide resistance in Anopheles gambiae s.l. mosquitoes in northwest Tanzania during a community randomized trial of longlasting insecticidal nets and indoor residual spraying.

    PubMed

    Matowo, J; Kitau, J; Kaaya, R; Kavishe, R; Wright, A; Kisinza, W; Kleinschmidt, I; Mosha, F; Rowland, M; Protopopoff, N

    2015-03-01

    Anopheles gambiae s.l. (Diptera: Culicidae) in Muleba, Tanzania has developed high levels of resistance to most insecticides currently advocated for malaria control. The kdr mutation has almost reached fixation in An. gambiae s.s. in Muleba. This change has the potential to jeopardize malaria control interventions carried out in the region. Trends in insecticide resistance were monitored in two intervention villages using World Health Organization (WHO) susceptibility test kits. Additional mechanisms contributing to observed phenotypic resistance were investigated using Centers for Disease Control (CDC) bottle bioassays with piperonylbutoxide (PBO) and S,S,S-tributyl phosphorotrithioate (DEF) synergists. Resistance genotyping for kdr and Ace-1 alleles was conducted using quantitative polymerase chain reaction (qPCR). In both study villages, high phenotypic resistance to several pyrethroids and DDT was observed, with mortality in the range of 12-23%. There was a sharp decrease in mortality in An. gambiae s.l. exposed to bendiocarb (carbamate) from 84% in November 2011 to 31% in December 2012 after two rounds of bendiocarb-based indoor residual spraying (IRS). Anopheles gambiae s.l. remained susceptible to pirimiphos-methyl (organophosphate). Bendiocarb-based IRS did not lead to the reversion of pyrethroid resistance. There was no evidence for selection for Ace-1 resistance alleles. The need to investigate the operational impact of the observed resistance selection on the effectiveness of longlasting insecticidal nets and IRS for malaria control is urgent. PMID:25537754

  11. Functional proteins from a random-sequence library

    PubMed Central

    Keefe, Anthony D; Szostak, Jack W.

    2015-01-01

    Functional primordial proteins presumably originated from random sequences, but it is not known how frequently functional, or even folded, proteins occur in collections of random sequences. Here we have used in vitro selection of messenger RNA displayed proteins, in which each protein is covalently linked through its carboxy terminus to the 3′ end of its encoding mRNA [1], to sample a large number of distinct random sequences. Starting from a library of 6 × 10^12 proteins each containing 80 contiguous random amino acids, we selected functional proteins by enriching for those that bind to ATP. This selection yielded four new ATP-binding proteins that appear to be unrelated to each other or to anything found in the current databases of biological proteins. The frequency of occurrence of functional proteins in random-sequence libraries appears to be similar to that observed for equivalent RNA libraries [2,3]. PMID:11287961

  12. 47 CFR 1.1604 - Post-selection hearings.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Post-selection hearings. 1.1604 Section 1.1604 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1604 Post-selection hearings....

  13. Assessment of non-BDNF neurotrophins and GDNF levels after depression treatment with sertraline and transcranial direct current stimulation in a factorial, randomized, sham-controlled trial (SELECT-TDCS): An exploratory analysis

    PubMed Central

    Brunoni, André R.; Machado-Vieira, Rodrigo; Zarate, Carlos A.; Vieira, Erica L. M.; Valiengo, Leandro; Benseñor, Isabela M.; Lotufo, Paulo A.; Gattaz, Wagner F.; Teixeira, Antonio L.

    2014-01-01

    The neurotrophic hypothesis of depression states that the major depressive episode is associated with lower neurotrophic factors levels, which increase with amelioration of depressive symptoms. However, this hypothesis has not been extended to investigate neurotrophic factors other than the brain-derived neurotrophic factor (BDNF). We therefore explored whether plasma levels of neurotrophins 3 (NT-3) and 4 (NT-4), nerve growth factor (NGF) and glial cell line derived neurotrophic factor (GDNF) changed after antidepressant treatment and correlated with treatment response. Seventy-three patients with moderate-to-severe, antidepressant-free unipolar depression were assigned to a pharmacological (sertraline) and a non-pharmacological (transcranial direct current stimulation, tDCS) intervention in a randomized, 2 × 2, placebo-controlled design. The plasma levels of NT-3, NT-4, NGF and GDNF were determined by enzyme-linked immunosorbent assay before and after a 6-week treatment course and analyzed according to clinical response and allocation group. We found that tDCS and sertraline (separately and combined) produced significant improvement in depressive symptoms. Plasma levels of all neurotrophic factors were similar across groups at baseline and remained significantly unchanged regardless of the intervention and of clinical response. Also, baseline plasma levels were not associated with clinical response. To conclude, in this 6-week placebo-controlled trial, NT-3, NT-4, NGF and GDNF plasma levels did not significantly change with sertraline or tDCS. These data suggest that these neurotrophic factors are not surrogate biomarkers of treatment response or involved in the antidepressant mechanisms of tDCS. PMID:25172025

  14. First-dose analgesic effect of the cyclo-oxygenase-2 selective inhibitor lumiracoxib in osteoarthritis of the knee: a randomized, double-blind, placebo-controlled comparison with celecoxib [NCT00267215]

    PubMed Central

    Wittenberg, Ralf H; Schell, Ernest; Krehan, Gerhard; Maeumbaed, Roland; Runge, Hans; Schlüter, Peter; Fashola, Taiwo OA; Thurston, Helen J; Burger, Klaus J; Trechsel, Ulrich

    2006-01-01

    Cyclo-oxygenase-2 selective inhibitors are frequently used to manage osteoarthritis. We compared the analgesic efficacy of the novel cyclo-oxygenase-2 selective inhibitor lumiracoxib (Prexige®) versus placebo and celecoxib in patients with knee osteoarthritis. This seven day, double-blind, placebo and active comparator controlled, parallel group study included 364 patients aged ≥50 years with moderate-to-severe symptomatic knee osteoarthritis. Patients received lumiracoxib 400 mg/day (four times the recommended chronic dose in osteoarthritis; n = 144), placebo (n = 75), or celecoxib 200 mg twice daily (n = 145). The primary variable was actual pain intensity difference (100 mm visual–analogue scale) between baseline and the mean of three hour and five hour assessments after the first dose. Actual pain intensity difference, average and worst pain, pain relief and functional status (Western Ontario and McMaster Universities Osteoarthritis Index [WOMAC™]) were measured over seven days. Patients also completed a global evaluation of treatment effect at study end or premature discontinuation. For the primary variable, the superiority of lumiracoxib versus placebo, the noninferiority of lumiracoxib versus celecoxib, and the superiority of lumiracoxib versus celecoxib were assessed by closed test procedure adjusting for multiplicity, thereby maintaining the overall 5% significance level. In addition, celecoxib was assessed versus placebo in a predefined exploratory manner to assess trial sensitivity. Lumiracoxib provided better analgesia than placebo 3–5 hours after the first dose (P = 0.004) through to study end. The estimated difference between lumiracoxib and celecoxib 3–5 hours after the first dose was not significant (P = 0.185). Celecoxib was not significantly different from placebo in this analysis (P = 0.069). At study end 13.9% of lumiracoxib-treated patients reported complete pain relief versus 5.5% and 5.3% of celecoxib and placebo recipients

  15. 9 CFR 590.350 - Procedures for selecting appeal samples.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... containers plus an equal number of containers selected at random. When the original sample containers cannot be located, the appeal sample shall consist of product taken at random from double the number of... the original sample containers plus an equal number of containers selected at random. A...

  16. Random walks on networks

    NASA Astrophysics Data System (ADS)

    Donnelly, Isaac

    Random walks on lattices are a well-used model for diffusion on a continuum. They have been used to model subdiffusive systems, systems with forcing, and systems with reactions, as well as combinations of the three. We extend the traditional random walk framework to networks and obtain novel results. As an example, due to the small graph diameter, the early-time behaviour of subdiffusive dynamics dominates the observed system, which has implications for models of the brain or airline networks. I would like to thank the Australian American Fulbright Association.

  17. Intermittency and random matrices

    NASA Astrophysics Data System (ADS)

    Sokoloff, Dmitry; Illarionov, E. A.

    2015-08-01

    A spectacular phenomenon of intermittency, i.e. a progressive growth of higher statistical moments of a physical field excited by an instability in a random medium, attracted the attention of Zeldovich in the last years of his life. At that time, the mathematical aspects underlying the physical description of this phenomenon were still under development and relations between various findings in the field remained obscure. Contemporary results from the theory of the product of independent random matrices (the Furstenberg theory) allowed the elaboration of the phenomenon of intermittency in a systematic way. We consider applications of the Furstenberg theory to some problems in cosmology and dynamo theory.

  18. A discrete fractional random transform

    NASA Astrophysics Data System (ADS)

    Liu, Zhengjun; Zhao, Haifa; Liu, Shutian

    2005-11-01

    We propose a discrete fractional random transform based on a generalization of the discrete fractional Fourier transform with an intrinsic randomness. This discrete fractional random transform inherits the excellent mathematical properties of the fractional Fourier transform along with some remarkable features of its own. As a primary application, the discrete fractional random transform has been used for image encryption and decryption.
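
    A minimal sketch of one way to realize such a transform, assuming an eigendecomposition construction: the eigenvectors come from a seeded random symmetric matrix (the intrinsic randomness), while fractional-power eigenvalues supply unitarity and the index additivity R(a)R(b) = R(a+b). The eigenvalue choice and seeding below are illustrative assumptions, not necessarily the authors' exact recipe.

```python
import numpy as np

def dfrnt_matrix(n, alpha, seed=0):
    """Discrete fractional random transform of order alpha (a sketch).

    Build a random symmetric matrix, take its orthonormal eigenvectors V,
    and attach fractional-Fourier-like eigenvalues exp(-i*pi*alpha*k/2).
    R(alpha) = V diag(...) V^T is unitary and satisfies
    R(a) @ R(b) == R(a + b), the key property of the transform.
    """
    rng = np.random.default_rng(seed)      # fixed seed -> reproducible randomness
    s = rng.standard_normal((n, n))
    _, v = np.linalg.eigh(s + s.T)         # orthonormal eigenvectors (real V)
    d = np.exp(-1j * np.pi * alpha * np.arange(n) / 2)
    return v @ np.diag(d) @ v.T

# Encryption/decryption demo: R(-alpha) inverts R(alpha).
signal = np.random.rand(64)
enc = dfrnt_matrix(64, 0.7) @ signal
dec = dfrnt_matrix(64, -0.7) @ enc
assert np.allclose(dec, signal)
```

    For a 2D image one would apply the matrix to the rows and then the columns; the seed plays the role of the secret key.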

  19. Uniform random number generators

    NASA Technical Reports Server (NTRS)

    Farr, W. R.

    1971-01-01

    Methods are presented for the generation of random numbers with uniform and normal distributions. Subprogram listings of Fortran generators for the Univac 1108, SDS 930, and CDC 3200 digital computers are also included. The generators are of the mixed multiplicative type, and the mathematical method employed is that of Marsaglia and Bray.
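
    As a rough modern analogue of such a generator, the sketch below pairs a mixed (multiplicative-with-increment) linear congruential generator with the Marsaglia-Bray polar method for normal deviates. The specific constants are common textbook choices, not those of the Univac, SDS, or CDC listings.

```python
import math

def lcg(seed, a=69069, c=1, m=2**32):
    """Mixed multiplicative (linear congruential) generator:
    x_{n+1} = (a*x_n + c) mod m, yielded as uniforms on [0, 1)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

def normals(uniforms):
    """Marsaglia-Bray polar method: turn pairs of uniforms into
    pairs of independent standard normal deviates."""
    while True:
        u = 2.0 * next(uniforms) - 1.0
        v = 2.0 * next(uniforms) - 1.0
        s = u * u + v * v
        if 0.0 < s < 1.0:                 # accept points inside the unit disc
            f = math.sqrt(-2.0 * math.log(s) / s)
            yield u * f
            yield v * f

g = lcg(seed=12345)
z = normals(g)
sample = [next(z) for _ in range(5)]
```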

  20. Generating "Random" Integers

    ERIC Educational Resources Information Center

    Griffiths, Martin

    2011-01-01

    One of the author's undergraduate students recently asked him whether it was possible to generate a random positive integer. After some thought, the author realised that there were plenty of interesting mathematical ideas inherent in her question. So much so in fact, that the author decided to organise a workshop, open both to undergraduates and…

  1. Random lattice superstrings

    SciTech Connect

    Feng Haidong; Siegel, Warren

    2006-08-15

    We propose some new simplifying ingredients for Feynman diagrams that seem necessary for random lattice formulations of superstrings. In particular, half the fermionic variables appear only in particle loops (similarly to loop momenta), reducing the supersymmetry of the constituents of the type IIB superstring to N=1, as expected from their interpretation in the 1/N expansion as super Yang-Mills.

  2. Efficient robust conditional random fields.

    PubMed

    Song, Dongjin; Liu, Wei; Zhou, Tianyi; Tao, Dacheng; Meyer, David A

    2015-10-01

    Conditional random fields (CRFs) are a flexible yet powerful probabilistic approach and have shown advantages for popular applications in various areas, including text analysis, bioinformatics, and computer vision. Traditional CRF models, however, are incapable of selecting relevant features or suppressing noise in the original features. Moreover, conventional optimization methods often converge slowly when training CRFs and degrade significantly for tasks with a large number of samples and features. In this paper, we propose robust CRFs (RCRFs) that simultaneously select relevant features. An optimal gradient method (OGM) is further designed to train RCRFs efficiently. Specifically, the proposed RCRFs employ the l1 norm of the model parameters to regularize the objective used by traditional CRFs, thereby enabling discovery of the relevant unary and pairwise features of CRFs. In each iteration of the OGM, the gradient direction is determined jointly by the current gradient and the historical gradients, and the Lipschitz constant is leveraged to specify the proper step size. We show that the OGM can tackle RCRF model training very efficiently, achieving the optimal convergence rate O(1/k^2) (where k is the number of iterations). This convergence rate is theoretically superior to the convergence rate O(1/k) of previous first-order optimization methods. Extensive experiments performed on three practical image segmentation tasks demonstrate the efficacy of the OGM in training our proposed RCRFs. PMID:26080050
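
    The abstract describes the OGM only loosely: a direction mixing current and historical gradients, with a Lipschitz-scaled step. A generic Nesterov-style accelerated scheme with the same O(1/k^2) guarantee, sketched below on a smooth convex quadratic, illustrates those two ingredients; it is not the paper's exact algorithm.

```python
import numpy as np

def accelerated_gradient(grad, x0, L, iters=100):
    """Nesterov-style accelerated gradient descent (a sketch).

    The step size 1/L comes from the Lipschitz constant of the
    gradient, and the momentum extrapolation carries information from
    previous iterates, yielding the O(1/k^2) rate on smooth convex
    objectives versus O(1/k) for plain gradient descent.
    """
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_new = y - grad(y) / L                       # gradient step at lookahead point
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)   # momentum extrapolation
        x, t = x_new, t_new
    return x

# Example: minimize 0.5 * ||A x - b||^2.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A.T @ (A @ x - b)
L = np.linalg.norm(A.T @ A, 2)        # Lipschitz constant of the gradient
x_star = accelerated_gradient(grad, np.zeros(2), L)
```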

  3. Random bits, true and unbiased, from atmospheric turbulence.

    PubMed

    Marangon, Davide G; Vallone, Giuseppe; Villoresi, Paolo

    2014-01-01

    Random numbers represent a fundamental ingredient for secure communications and numerical simulation, as well as for games and, more generally, for information science. Physical processes with intrinsic unpredictability may be exploited to generate genuine random numbers. Optical propagation through strong atmospheric turbulence is used here for this purpose, by observing a laser beam after a 143 km free-space path. In addition, we developed an algorithm to extract the randomness of the beam images at the receiver without post-processing. The numbers passed very selective randomness tests for qualification as genuine random numbers. The extracting algorithm can be easily generalized to random images generated by different physical processes. PMID:24976499

  4. Relativistic Weierstrass random walks.

    PubMed

    Saa, Alberto; Venegeroles, Roberto

    2010-08-01

    The Weierstrass random walk is a paradigmatic Markov chain giving rise to a Lévy-type superdiffusive behavior. It is well known that special relativity prevents the arbitrarily high velocities necessary to establish superdiffusive behavior in any process occurring in Minkowski spacetime, implying, in particular, that any relativistic Markov chain describing spacetime phenomena must be essentially Gaussian. Here, we introduce a simple relativistic extension of the Weierstrass random walk and show that there must exist a transition time t_c delimiting two qualitatively distinct dynamical regimes: the (nonrelativistic) superdiffusive Lévy flights for t << t_c, and ordinary (relativistic) Gaussian diffusion for t >> t_c. Implications of this crossover between different diffusion regimes are discussed for some explicit examples. The study of such an explicit and simple Markov chain can shed some light on several results obtained in much more involved contexts. PMID:20866862

  5. Random very loose packings.

    PubMed

    Ciamarra, Massimo Pica; Coniglio, Antonio

    2008-09-19

    We measure the number Ω(φ) of mechanically stable states of volume fraction φ of a granular assembly under gravity. The granular entropy S(φ) = log Ω(φ) vanishes both at high density, at φ ≈ φ_rcp, and at low density, at φ ≈ φ_rvlp, where φ_rvlp is a new lower bound we call the random very loose packing; φ_rlp is the volume fraction where the entropy is maximal. These findings allow for a clear explanation of compaction experiments and provide the first first-principles definition of the random loose volume fraction. In the context of the statistical mechanics approach to static granular materials, states with φ

  6. A random number generator for continuous random variables

    NASA Technical Reports Server (NTRS)

    Guerra, V. M.; Tapia, R. A.; Thompson, J. R.

    1972-01-01

    A FORTRAN IV routine is given which may be used to generate random observations of a continuous real-valued random variable. Tabulated values of F(x) and X, together with the interpolation errors E(Akima) and E(linear), are presented for the normal distribution.
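
    The abstract suggests the routine tabulates the distribution function and inverts it by interpolation (hence the Akima versus linear comparison). Under that assumption, a minimal Python counterpart of such a continuous-variable generator might look like this; only the linear-interpolation variant is sketched.

```python
import numpy as np
from math import erf

def make_sampler(x_grid, cdf_values):
    """Inverse-transform sampler for a continuous random variable from a
    tabulated CDF: draw U ~ Uniform and return X = F^{-1}(U), inverting
    F by linear interpolation of the table (swap the axes in interp).
    """
    rng = np.random.default_rng()
    def sample(size=1):
        u = rng.uniform(cdf_values[0], cdf_values[-1], size)
        return np.interp(u, cdf_values, x_grid)
    return sample

# Example: standard normal deviates via a tabulated CDF.
xs = np.linspace(-5.0, 5.0, 1001)
cdf = 0.5 * (1 + np.vectorize(erf)(xs / np.sqrt(2)))
draw = make_sampler(xs, cdf)
obs = draw(1000)
```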

  7. Selection Sources.

    ERIC Educational Resources Information Center

    Kerby, Ramona

    2002-01-01

    Discusses library collection development by school library media specialists and describes selection sources for new books and materials; retrospective selection sources for materials published in preceding years; and an acquisition source. Provides an overview of the selection process and includes 10 suggestions for selection. (LRW)

  8. Mechanical Selection.

    ERIC Educational Resources Information Center

    Brownson, Charles W.

    1988-01-01

    Defines mechanical and expert selection of library materials and investigates the usefulness of mechanical selection in three test cases involving popular novels, German literature, and contemporary poetry. The cost of mechanical selection is examined and more quantification in selection practice is recommended. Data are presented in eight tables,…

  9. 49 CFR 219.608 - FRA Administrator's determination of random alcohol testing rate.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... (Continued) FEDERAL RAILROAD ADMINISTRATION, DEPARTMENT OF TRANSPORTATION CONTROL OF ALCOHOL AND DRUG USE Random Alcohol and Drug Testing Programs § 219.608 FRA Administrator's determination of random alcohol... random selection, with each pool containing the covered employees who are subject to testing at the...

  10. Generation of kth-order random toposequences

    NASA Astrophysics Data System (ADS)

    Odgers, Nathan P.; McBratney, Alex. B.; Minasny, Budiman

    2008-05-01

    The model presented in this paper derives toposequences from a digital elevation model (DEM). It is written in ArcInfo Macro Language (AML). The toposequences are called kth-order random toposequences, because they take a random path uphill to the top of a hill and downhill to a stream or valley bottom from a randomly selected seed point, and they are located in a streamshed of order k according to a particular stream-ordering system. We define a kth-order streamshed as the area of land that drains directly to a stream segment of stream order k. The model attempts to optimise the spatial configuration of a set of derived toposequences iteratively by using simulated annealing to maximise the total sum of distances between each toposequence hilltop in the set. The user is able to select the order, k, of the derived toposequences. Toposequences are useful for determining soil sampling locations for use in collecting soil data for digital soil mapping applications. Sampling locations can be allocated according to equal elevation or equal-distance intervals along the length of the toposequence, for example. We demonstrate the use of this model for a study area in the Hunter Valley of New South Wales, Australia. Of the 64 toposequences derived, 32 were first-order random toposequences according to Strahler's stream-ordering system, and 32 were second-order random toposequences. The model that we present in this paper is an efficient method for sampling soil along soil toposequences. The soils along a toposequence are related to each other by the topography they are found in, so soil data collected by this method is useful for establishing soil-landscape rules for the preparation of digital soil maps.
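
    The spatial-optimization step described above, maximizing the total pairwise distance between toposequence hilltops by simulated annealing, can be sketched independently of the DEM processing. The candidate pool, cooling schedule, and swap move below are illustrative assumptions, not the AML implementation.

```python
import numpy as np

def anneal_hilltops(candidates, k, iters=5000, t0=1.0, seed=0):
    """Select k hilltop locations from an (n, 2) array of candidate map
    coordinates so that the total sum of pairwise distances is
    (approximately) maximized, via simulated annealing with a swap move.
    The DEM path-tracing that produces the candidates is not shown.
    """
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(candidates), size=k, replace=False)

    def energy(sel):  # negative total pairwise distance (to be minimized)
        pts = candidates[sel]
        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        return -d.sum() / 2

    e = energy(idx)
    for step in range(iters):
        t = t0 * (1 - step / iters) + 1e-9            # linear cooling schedule
        new = idx.copy()
        new[rng.integers(k)] = rng.integers(len(candidates))  # swap one seed
        if len(set(new)) < k:
            continue                                  # keep selections distinct
        e_new = energy(new)
        if e_new < e or rng.random() < np.exp((e - e_new) / t):
            idx, e = new, e_new                       # Metropolis acceptance
    return idx
```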

  11. Weighted Hybrid Decision Tree Model for Random Forest Classifier

    NASA Astrophysics Data System (ADS)

    Kulkarni, Vrushali Y.; Sinha, Pradeep K.; Petare, Manisha C.

    2016-06-01

    Random Forest is an ensemble, supervised machine-learning algorithm. An ensemble generates many classifiers and combines their results by majority voting. Random Forest uses the decision tree as its base classifier. In decision tree induction, an attribute split/evaluation measure is used to decide the best split at each node of the tree. The generalization error of a forest of tree classifiers depends on the strength of the individual trees in the forest and the correlation among them. The work presented in this paper concerns attribute split measures and is a two-step process. First, a theoretical study of five selected split measures is carried out and a comparison matrix is generated to understand the pros and cons of each measure. These theoretical results are then verified by empirical analysis, in which a random forest is generated using each of the five selected split measures, chosen one at a time (i.e., a random forest using information gain, a random forest using gain ratio, etc.). Next, based on this theoretical and empirical analysis, a new hybrid decision tree model for the random forest classifier is proposed, in which the individual decision trees in the forest are generated using different split measures. This model is augmented by weighted voting based on the strength of each individual tree. The new approach shows a notable increase in the accuracy of random forest.
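
    A rough scikit-learn sketch of such a hybrid forest: trees are induced with different split measures and their votes are weighted by a simple strength estimate (out-of-bag accuracy here, an assumption). scikit-learn exposes only a subset of the five measures the paper studies, so only gini and entropy are alternated below.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
rng = np.random.default_rng(0)
measures = ["gini", "entropy"]   # stand-ins for the paper's five split measures

trees, weights = [], []
for i in range(50):
    boot = rng.integers(0, len(X), len(X))        # bootstrap sample indices
    oob = np.setdiff1d(np.arange(len(X)), boot)   # out-of-bag indices
    t = DecisionTreeClassifier(criterion=measures[i % len(measures)],
                               max_features="sqrt", random_state=i)
    t.fit(X[boot], y[boot])
    trees.append(t)
    weights.append(t.score(X[oob], y[oob]))       # tree strength = OOB accuracy

def predict(x):
    votes = np.zeros(2)
    for t, w in zip(trees, weights):
        votes[int(t.predict(x.reshape(1, -1))[0])] += w   # weighted vote
    return votes.argmax()

acc = np.mean([predict(x) == label for x, label in zip(X, y)])
```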

  12. Improved Compressive Sensing of Natural Scenes Using Localized Random Sampling

    PubMed Central

    Barranca, Victor J.; Kovačič, Gregor; Zhou, Douglas; Cai, David

    2016-01-01

    Compressive sensing (CS) theory demonstrates that by using uniformly-random sampling, rather than uniformly-spaced sampling, higher quality image reconstructions are often achievable. Considering that the structure of sampling protocols has such a profound impact on the quality of image reconstructions, we formulate a new sampling scheme motivated by physiological receptive field structure, localized random sampling, which yields significantly improved CS image reconstructions. For each set of localized image measurements, our sampling method first randomly selects an image pixel and then measures its nearby pixels with probability depending on their distance from the initially selected pixel. We compare the uniformly-random and localized random sampling methods over a large space of sampling parameters, and show that, for the optimal parameter choices, higher quality image reconstructions can be consistently obtained by using localized random sampling. In addition, we argue that the localized random CS optimal parameter choice is stable with respect to diverse natural images, and scales with the number of samples used for reconstruction. We expect that the localized random sampling protocol helps to explain the evolutionarily advantageous nature of receptive field structure in visual systems and suggests several future research areas in CS theory and its application to brain imaging. PMID:27555464
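
    The sampling scheme itself is easy to prototype. The sketch below draws localized random masks: a center pixel is chosen uniformly at random, and nearby pixels are included with probability decaying in distance from it. The Gaussian decay profile is an assumed example rather than the paper's exact law.

```python
import numpy as np

def localized_random_masks(image, n_centers, sigma=3.0, seed=0):
    """Localized random sampling (sketch): for each measurement, pick a
    center pixel uniformly at random, then include every other pixel
    with probability that decays with its distance from the center.
    Returns one boolean mask per localized measurement.
    """
    rng = np.random.default_rng(seed)
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    masks = []
    for _ in range(n_centers):
        cy, cx = rng.integers(h), rng.integers(w)
        d2 = (yy - cy) ** 2 + (xx - cx) ** 2
        p = np.exp(-d2 / (2 * sigma ** 2))   # inclusion probability vs distance
        masks.append(rng.random((h, w)) < p)
    return masks

# Each mask yields one compressive measurement, e.g. (image * mask).sum().
```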

  14. Random numbers from vacuum fluctuations

    NASA Astrophysics Data System (ADS)

    Shi, Yicheng; Chng, Brenda; Kurtsiefer, Christian

    2016-07-01

    We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read in a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.
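
    As an illustration of LFSR-based post-processing, the sketch below XORs a raw bit stream with the output of a standard maximal-length 16-bit Fibonacci LFSR. This is a deliberately simplified stand-in: the paper's extractor construction and tap polynomial are not specified in the abstract.

```python
def lfsr_whiten(raw_bits, state=0xACE1):
    """XOR a raw (possibly biased) bit stream with a 16-bit Fibonacci
    LFSR using the maximal-length polynomial x^16 + x^14 + x^13 + x^11 + 1.
    Real extractors fold the raw entropy into the register rather than
    merely masking it; this is only a sketch of the idea.
    """
    out = []
    for b in raw_bits:
        fb = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
        state = (state >> 1) | (fb << 15)   # shift and feed back
        out.append(b ^ (state & 1))
    return out

whitened = lfsr_whiten([1, 1, 1, 0, 1, 1, 0, 1] * 4)
```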

  15. Cluster Randomized Controlled Trial

    PubMed Central

    Young, John; Chapman, Katie; Nixon, Jane; Patel, Anita; Holloway, Ivana; Mellish, Kirste; Anwar, Shamaila; Breen, Rachel; Knapp, Martin; Murray, Jenni; Farrin, Amanda

    2015-01-01

    Background and Purpose— We developed a new postdischarge system of care comprising a structured assessment covering longer-term problems experienced by patients with stroke and their carers, linked to evidence-based treatment algorithms and reference guides (the longer-term stroke care system of care) to address the poor longer-term recovery experienced by many patients with stroke. Methods— A pragmatic, multicentre, cluster randomized controlled trial of this system of care. Eligible patients referred to community-based Stroke Care Coordinators were randomized to receive the new system of care or usual practice. The primary outcome was improved patient psychological well-being (General Health Questionnaire-12) at 6 months; secondary outcomes included functional outcomes for patients, carer outcomes, and cost-effectiveness. Follow-up was through self-completed postal questionnaires at 6 and 12 months. Results— Thirty-two stroke services were randomized (29 participated); 800 patients (399 control; 401 intervention) and 208 carers (100 control; 108 intervention) were recruited. In intention to treat analysis, the adjusted difference in patient General Health Questionnaire-12 mean scores at 6 months was −0.6 points (95% confidence interval, −1.8 to 0.7; P=0.394) indicating no evidence of statistically significant difference between the groups. Costs of Stroke Care Coordinator inputs, total health and social care costs, and quality-adjusted life year gains at 6 months, 12 months, and over the year were similar between the groups. Conclusions— This robust trial demonstrated no benefit in clinical or cost-effectiveness outcomes associated with the new system of care compared with usual Stroke Care Coordinator practice. Clinical Trial Registration— URL: http://www.controlled-trials.com. Unique identifier: ISRCTN 67932305. PMID:26152298

  16. Random recursive trees and the elephant random walk

    NASA Astrophysics Data System (ADS)

    Kürsten, Rüdiger

    2016-03-01

    One class of random walks with infinite memory, so-called elephant random walks, are simple models describing anomalous diffusion. We present a surprising connection between these models and bond percolation on random recursive trees. We use a coupling between the two models to translate results from elephant random walks to the percolation process. We calculate, besides other quantities, exact expressions for the first and the second moment of the root cluster size and of the number of nodes in child clusters of the first generation. We further introduce another model, the skew elephant random walk, and calculate the first and second moment of this process.
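
    For readers unfamiliar with the model, a minimal simulation of the elephant random walk follows: each step repeats a uniformly recalled past step with probability p and reverses it otherwise. The well-known threshold p = 3/4 separates diffusive from superdiffusive behaviour.

```python
import numpy as np

def elephant_walk(n_steps, p, q=0.5, seed=0):
    """Elephant random walk: at each step the walker recalls one of its
    own past steps uniformly at random and repeats it with probability p
    (reverses it with probability 1 - p). The first step is +1 with
    probability q. For p > 3/4 the walk is superdiffusive.
    """
    rng = np.random.default_rng(seed)
    steps = np.empty(n_steps, dtype=int)
    steps[0] = 1 if rng.random() < q else -1
    for t in range(1, n_steps):
        past = steps[rng.integers(t)]              # uniformly recalled memory
        steps[t] = past if rng.random() < p else -past
    return np.cumsum(steps)

traj = elephant_walk(10_000, p=0.9)
```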

  17. Randomly Hyperbranched Polymers

    NASA Astrophysics Data System (ADS)

    Konkolewicz, Dominik; Gilbert, Robert G.; Gray-Weale, Angus

    2007-06-01

    We describe a model for the structures of randomly hyperbranched polymers in solution, and find a logarithmic growth of radius with polymer mass. We include segmental overcrowding, which puts an upper limit on the density. The model is tested against simulations, against data on amylopectin, a major component of starch, on glycogen, and on polyglycerols. For samples of synthetic polyglycerol and glycogen, our model holds well for all the available data. The model reveals higher-level scaling structure in glycogen, related to the β particles seen in electron microscopy.

  18. Coloring random graphs.

    PubMed

    Mulet, R; Pagnani, A; Weigt, M; Zecchina, R

    2002-12-23

    We study the graph coloring problem over random graphs of finite average connectivity c. Given a number q of available colors, we find that graphs with low connectivity admit almost always a proper coloring, whereas graphs with high connectivity are uncolorable. Depending on q, we find the precise value of the critical average connectivity c(q). Moreover, we show that below c(q) there exists a clustering phase, for c in the interval [c_d, c(q)], in which ground states spontaneously divide into an exponential number of clusters and where the proliferation of metastable states is responsible for the onset of complexity in local search algorithms. PMID:12484862

  19. Randomized Response Analysis in Mplus

    ERIC Educational Resources Information Center

    Hox, Joop; Lensvelt-Mulders, Gerty

    2004-01-01

    This article describes a technique to analyze randomized response data using available structural equation modeling (SEM) software. The randomized response technique was developed to obtain estimates that are more valid when studying sensitive topics. The basic feature of all randomized response methods is that the data are deliberately…
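
    The classic design behind randomized response data is Warner's model, whose moment estimator is simple enough to sketch; the SEM treatment in Mplus builds on data of exactly this form. The parameter values below are illustrative.

```python
import numpy as np

def warner_estimate(answers, p):
    """Warner's randomized response: a private randomizer tells each
    respondent to answer either "do you belong to group X?" (with
    probability p) or "do you NOT belong to X?" (with 1 - p). The
    observed "yes" rate lam satisfies lam = p*pi + (1-p)*(1-pi), so the
    sensitive proportion pi is recovered as below (requires p != 1/2).
    """
    lam = np.mean(answers)
    return (lam - (1 - p)) / (2 * p - 1)

# Simulate 10,000 respondents with a true sensitive proportion of 0.23.
rng = np.random.default_rng(1)
pi_true, p = 0.23, 0.7
member = rng.random(10_000) < pi_true
direct_q = rng.random(10_000) < p        # which question the spinner picked
answers = np.where(direct_q, member, ~member)
pi_hat = warner_estimate(answers, p)     # close to 0.23
```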

  20. Random Numbers and Quantum Computers

    ERIC Educational Resources Information Center

    McCartney, Mark; Glass, David

    2002-01-01

    The topic of random numbers is investigated in such a way as to illustrate links between mathematics, physics and computer science. First, the generation of random numbers by a classical computer using the linear congruential generator and logistic map is considered. It is noted that these procedures yield only pseudo-random numbers since…

  1. Investigating the Randomness of Numbers

    ERIC Educational Resources Information Center

    Pendleton, Kenn L.

    2009-01-01

    The use of random numbers is pervasive in today's world. Random numbers have practical applications in such far-flung arenas as computer simulations, cryptography, gambling, the legal system, statistical sampling, and even the war on terrorism. Evaluating the randomness of extremely large samples is a complex, intricate process. However, the…

  2. Random-walk enzymes

    NASA Astrophysics Data System (ADS)

    Mak, Chi H.; Pham, Phuong; Afif, Samir A.; Goodman, Myron F.

    2015-09-01

    Enzymes that rely on random walk to search for substrate targets in a heterogeneously dispersed medium can leave behind complex spatial profiles of their catalyzed conversions. The catalytic signatures of these random-walk enzymes are the result of two coupled stochastic processes: scanning and catalysis. Here we develop analytical models to understand the conversion profiles produced by these enzymes, comparing an intrusive model, in which scanning and catalysis are tightly coupled, against a loosely coupled passive model. Diagrammatic theory and path-integral solutions of these models revealed clearly distinct predictions. Comparison to experimental data from catalyzed deaminations deposited on single-stranded DNA by the enzyme activation-induced deoxycytidine deaminase (AID) demonstrates that catalysis and diffusion are strongly intertwined, where the chemical conversions give rise to new stochastic trajectories that were absent if the substrate DNA was homogeneous. The C →U deamination profiles in both analytical predictions and experiments exhibit a strong contextual dependence, where the conversion rate of each target site is strongly contingent on the identities of other surrounding targets, with the intrusive model showing an excellent fit to the data. These methods can be applied to deduce sequence-dependent catalytic signatures of other DNA modification enzymes, with potential applications to cancer, gene regulation, and epigenetics.

  3. Randomness in Competitions

    NASA Astrophysics Data System (ADS)

    Ben-Naim, E.; Hengartner, N. W.; Redner, S.; Vazquez, F.

    2013-05-01

    We study the effects of randomness on competitions based on an elementary random process in which there is a finite probability that a weaker team upsets a stronger team. We apply this model to sports leagues and sports tournaments, and compare the theoretical results with empirical data. Our model shows that single-elimination tournaments are efficient but unfair: the number of games is proportional to the number of teams N, but the probability that the weakest team wins decays only algebraically with N. In contrast, leagues, where every team plays every other team, are fair but inefficient: the top √N teams remain in contention for the championship, while the probability that the weakest team becomes champion is exponentially small. We also propose a gradual elimination schedule that consists of a preliminary round and a championship round. Initially, teams play a small number of preliminary games, and subsequently, a few teams qualify for the championship round. This algorithm is fair and efficient: the best team wins with a high probability and the number of games scales as N^{9/5}, whereas traditional leagues require N^3 games to fairly determine a champion.
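
    The single-elimination result is easy to probe numerically. The sketch below simulates tournaments in which the weaker team wins any game with a fixed upset probability, then estimates how often the weakest of N teams takes the title; the bracket seeding and parameter values are illustrative.

```python
import numpy as np

def single_elimination(n_teams, upset_prob, rng):
    """One single-elimination tournament among teams ranked 0 (best) to
    n-1 (worst): in every game the weaker team wins with probability
    `upset_prob`. n_teams must be a power of two. Returns the champion.
    """
    teams = list(rng.permutation(n_teams))        # random bracket seeding
    while len(teams) > 1:
        winners = []
        for a, b in zip(teams[::2], teams[1::2]):
            strong, weak = (a, b) if a < b else (b, a)
            winners.append(weak if rng.random() < upset_prob else strong)
        teams = winners
    return teams[0]

# Estimate how often the weakest of 64 teams becomes champion.
rng = np.random.default_rng(0)
wins = sum(single_elimination(64, 0.3, rng) == 63 for _ in range(20_000))
```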

  4. Random rough surface photofabrication

    NASA Astrophysics Data System (ADS)

    Brissonneau, Vincent; Escoubas, Ludovic; Flory, François; Berginc, Gérard

    2011-10-01

    Random rough surfaces are of primary interest for their optical properties: reducing reflection at the interface or obtaining a specific scattering diagram, for example. Thus, controlling surface statistics during the fabrication process paves the way to original and specific behaviors of reflected optical waves. We detail an experimental method allowing the fabrication of random rough surfaces showing tuned statistical properties. A two-step photoresist exposure process was developed. In order to initiate photoresist polymerization, an energy threshold needs to be reached by light exposure. This energy is brought by a uniform exposure equipment comprising UV-LEDs. This pre-exposure is studied by varying parameters such as optical power and exposure time. The second step consists in an exposure based on the Gray method [1]. The speckle pattern of an enlarged scattered laser beam is used to expose the photoresist. A specific photofabrication bench using an argon ion laser was implemented. Parameters such as exposure time and distances between optical components are discussed. Then, we describe how we modify the speckle-based exposure bench to include a spatial light modulator (SLM). The SLM used is a micromirror matrix known as a Digital Micromirror Device (DMD), which allows spatial modulation by displaying binary images. Thus, the spatial beam shape can be tuned and so the speckle pattern on the photoresist is modified. As the photofabricated surface is correlated to the speckle pattern used for exposure, the roughness parameters can be adjusted.

  6. The Coalescent Process in Models with Selection

    PubMed Central

    Kaplan, N. L.; Darden, T.; Hudson, R. R.

    1988-01-01

    Statistical properties of the process describing the genealogical history of a random sample of genes are obtained for a class of population genetics models with selection. For models with selection, in contrast to models without selection, the distribution of this process, the coalescent process, depends on the distribution of the frequencies of alleles in the ancestral generations. If the ancestral frequency process can be approximated by a diffusion, then the mean and the variance of the number of segregating sites due to selectively neutral mutations in random samples can be numerically calculated. The calculations are greatly simplified if the frequencies of the alleles are tightly regulated. If the mutation rates between alleles maintained by balancing selection are low, then the number of selectively neutral segregating sites in a random sample of genes is expected to substantially exceed the number predicted under a neutral model. PMID:3066685

  7. A New Combinational Selection Operator in Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Rafsanjani, Marjan Kuchaki; Eskandari, Sadegh

    2011-09-01

    In this paper, a new Random Combinational Selection Operator (RCSO) is presented. Three existing selection operators and our proposed selection method are applied to traveling salesman problems using MATLAB. The tours obtained using our selection method are shorter than those obtained with existing selection operators for large numbers of cities.

  8. Molecular selection in a unified evolutionary sequence

    NASA Technical Reports Server (NTRS)

    Fox, S. W.

    1986-01-01

    With guidance from experiments and observations that indicate internally limited phenomena, an outline of a unified evolutionary sequence is inferred. Such unification is not visible in a context of random matrix and random mutation. The sequence proceeds from the Big Bang through prebiotic matter and protocells, through the evolving cell via molecular and natural selection, to mind, behavior, and society.

  9. Mapping in random-structures

    SciTech Connect

    Reidys, C.M.

    1996-06-01

    A mapping in random-structures is defined on the vertices of a generalized hypercube Q_α^n. A random-structure consists of (1) a random contact graph and (2) a family of relations imposed on adjacent vertices. The vertex set of a random contact graph is the set of all coordinates of a vertex P ∈ Q_α^n. Its edge set is the union of the edge sets of two random graphs: the first is a random 1-regular graph on 2m vertices (coordinates), and the second is a random graph G_p with p = c_2/n on all n vertices (coordinates). The structure of the random contact graphs is investigated, and it is shown that for certain values of m and c_2 the mapping in random-structures allows searching the set of random-structures. This is applied to mappings in RNA secondary structures. Also, the results on random-structures might be helpful for designing 3D-folding algorithms for RNA.

  10. Adaptive Random Testing with Combinatorial Input Domain

    PubMed Central

    Lu, Yansheng

    2014-01-01

    Random testing (RT) is a fundamental testing technique to assess software reliability, by simply selecting test cases at random from the whole input domain. As an enhancement of RT, adaptive random testing (ART) has better failure-detection capability and has been widely applied in different scenarios, such as numerical programs, some object-oriented programs, and mobile applications. However, not much work has been done on the effectiveness of ART for programs with a combinatorial input domain (i.e., categorical data). To extend the idea to testing over combinatorial input domains, we have adopted different similarity measures that are widely used for categorical data in data mining and have proposed two similarity measures based on interaction coverage. We then propose a new version named ART-CID as an extension of ART to combinatorial input domains, which selects as the next test case the element of the categorical data with the lowest similarity to the already generated test cases. Experimental results show that ART-CID generally performs better than RT with respect to different evaluation metrics. PMID:24772036
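
    A bare-bones sketch of the selection rule (with hypothetical helper names): generate a pool of random candidate test cases over the categorical domain and keep the one least similar to the tests already executed. The Hamming-style measure below is one simple choice; the paper's interaction-coverage-based measures are not reproduced here.

```python
import random

def art_cid_next(candidates, executed, similarity):
    """ART-CID-style selection (sketch): from a pool of random candidate
    test cases, pick the one whose maximum similarity to the
    already-executed tests is lowest."""
    return min(candidates,
               key=lambda c: max(similarity(c, e) for e in executed))

def hamming_similarity(a, b):
    """Fraction of positions where two categorical test cases agree."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

# Hypothetical combinatorial input domain: OS x browser x version.
domain = [("linux", "win"), ("ff", "chrome", "safari"), ("v1", "v2")]
rand_case = lambda: tuple(random.choice(d) for d in domain)

executed = [rand_case()]
for _ in range(5):
    pool = [rand_case() for _ in range(10)]
    executed.append(art_cid_next(pool, executed, hamming_similarity))
```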

  11. Structure of random foam.

    SciTech Connect

    Reinelt, Douglas A.; van Swol, Frank B.; Kraynik, Andrew Michael

    2004-06-01

    The Surface Evolver was used to compute the equilibrium microstructure of dry soap foams with random structure and a wide range of cell-size distributions. Topological and geometric properties of foams and individual cells were evaluated. The theory for isotropic Plateau polyhedra describes the dependence of cell geometric properties on their volume and number of faces. The surface area of all cells is about 10% greater than a sphere of equal volume; this leads to a simple but accurate theory for the surface free energy density of foam. A novel parameter based on the surface-volume mean bubble radius R32 is used to characterize foam polydispersity. The foam energy, total cell edge length, and average number of faces per cell all decrease with increasing polydispersity. Pentagonal faces are the most common in monodisperse foam but quadrilaterals take over in highly polydisperse structures.

  12. Accelerated randomized benchmarking

    NASA Astrophysics Data System (ADS)

    Granade, Christopher; Ferrie, Christopher; Cory, D. G.

    2015-01-01

    Quantum information processing offers promising advances for a wide range of fields and applications, provided that we can efficiently assess the performance of the control applied in candidate systems. That is, we must be able to determine whether we have implemented a desired gate, and refine accordingly. Randomized benchmarking reduces the difficulty of this task by exploiting symmetries in quantum operations. Here, we bound the resources required for benchmarking and show that, with prior information, we can achieve several orders of magnitude better accuracy than in traditional approaches to benchmarking. Moreover, by building on state-of-the-art classical algorithms, we reach these accuracies with near-optimal resources. Our approach requires an order of magnitude less data to achieve the same accuracies and to provide online estimates of the errors in the reported fidelities. We also show that our approach is useful for physical devices by comparing to simulations.

  13. Nature of Random Variation in the Nutrient Composition of Meals

    PubMed Central

    Balintfy, Joseph L.; Prekopa, Andras

    1966-01-01

    The mathematical formulation of nutrient variation in meals is presented by means of random vectors. The primary sources of nutrient variation in unit portions of menu items are identified and expressed in terms of random food-nutrient, random portion size, and random ingredient composition variations. A secondary source of nutrient variation can be traced to the random selection process of combining menu items into individual meals from multiple-choice menus. The separate as well as the joint effect of these sources on the total variation of the nutrient content of meals is described with the aid of variance-covariance matrices. The investigation is concluded with the formulation of multivariate probability statements concerning the adequacy of the nutrient content of meals relative to the distribution of the nutrient requirements over a given population. PMID:5971545

  14. Pseudo-random number generator for the Sigma 5 computer

    NASA Technical Reports Server (NTRS)

    Carroll, S. N.

    1983-01-01

    A technique is presented for developing a pseudo-random number generator based on the linear congruential form. The two numbers used for the generator are a prime number and a corresponding primitive root, where the prime is the largest prime number that can be accurately represented on a particular computer. The primitive root is selected by applying Marsaglia's lattice test. The technique presented was applied to write a random number program for the Sigma 5 computer. The new program, named S:RANDOM1, is judged to be superior to the older program named S:RANDOM. For applications requiring several independent random number generators, a table is included showing several acceptable primitive roots. The technique and programs described can be applied to any computer having word length different from that of the Sigma 5.
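
    The recipe in the abstract, largest representable prime as modulus and a lattice-tested primitive root as multiplier, is the classic Lehmer (multiplicative congruential) generator. For a 32-bit word the standard instance is the Park-Miller pair below; the Sigma 5 constants themselves are not given in the abstract.

```python
def lehmer(seed, a=16807, m=2**31 - 1):
    """Multiplicative congruential generator x_{n+1} = a * x_n mod m,
    with m the Mersenne prime 2^31 - 1 and a = 7^5, a primitive root
    of m (the Park-Miller "minimal standard"). Yields uniforms on (0, 1).
    """
    x = seed % m
    if x == 0:
        x = 1                 # the seed must be nonzero modulo m
    while True:
        x = (a * x) % m
        yield x / m

g = lehmer(42)
u = [next(g) for _ in range(3)]
```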

  15. How random are random numbers generated using photons?

    NASA Astrophysics Data System (ADS)

    Solis, Aldo; Angulo Martínez, Alí M.; Ramírez Alarcón, Roberto; Cruz Ramírez, Hector; U'Ren, Alfred B.; Hirsch, Jorge G.

    2015-06-01

    Randomness is fundamental in quantum theory, with many philosophical and practical implications. In this paper we discuss the concept of algorithmic randomness, which provides a quantitative method to assess the Borel normality of a given sequence of numbers, a necessary condition for it to be considered random. We use Borel normality as a tool to investigate the randomness of ten sequences of bits generated from the differences between detection times of photon pairs generated by spontaneous parametric downconversion. These sequences are shown to fulfil the randomness criteria without difficulties. As deviations from Borel normality for photon-generated random number sequences have been reported in previous work, a strategy to understand these diverging findings is outlined.
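
    A minimal check of Borel normality counts the non-overlapping k-bit words of a sequence and compares each empirical frequency with 2^-k. The tolerance used below, sqrt(log2(n)/n) following Calude's formulation, should be treated as an assumption of this sketch.

```python
import math
import random

def borel_normal(bits, max_k=4):
    """Approximate Borel normality test: for each word length k up to
    max_k, every k-bit word should appear among the non-overlapping
    blocks with frequency within sqrt(log2(n)/n) of 2^-k.
    """
    n = len(bits)
    bound = math.sqrt(math.log2(n) / n)
    for k in range(1, max_k + 1):
        blocks = [tuple(bits[i:i + k]) for i in range(0, n - k + 1, k)]
        total = len(blocks)
        for w in range(2 ** k):                       # enumerate all k-bit words
            word = tuple((w >> j) & 1 for j in range(k))
            if abs(blocks.count(word) / total - 2 ** -k) > bound:
                return False
    return True

ok = borel_normal([random.getrandbits(1) for _ in range(100_000)])
```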

  16. Random Test Run Length and Effectiveness

    NASA Technical Reports Server (NTRS)

    Andrews, James H.; Groce, Alex; Weston, Melissa; Xu, Ru-Gang

    2008-01-01

    A poorly understood but important factor in many applications of random testing is the selection of a maximum length for test runs. Given a limited time for testing, it is seldom clear whether executing a small number of long runs or a large number of short runs maximizes utility. It is generally expected that longer runs are more likely to expose failures -- which is certainly true with respect to runs shorter than the shortest failing trace. However, longer runs produce longer failing traces, requiring more effort from humans in debugging or more resources for automated minimization. In testing with feedback, increasing ranges for parameters may also cause the probability of failure to decrease in longer runs. We show that the choice of test length dramatically impacts the effectiveness of random testing, and that the patterns observed in simple models and predicted by analysis are useful in understanding effects observed.

  17. Resolving social dilemmas on evolving random networks

    NASA Astrophysics Data System (ADS)

    Szolnoki, Attila; Perc, Matjaž

    2009-05-01

    We show that strategy-independent adaptations of random interaction networks can induce powerful mechanisms, ranging from the Red Queen to group selection, which promote cooperation in evolutionary social dilemmas. These two mechanisms emerge spontaneously as dynamical processes due to deletions and additions of links, which are performed whenever players adopt new strategies and after a certain number of game iterations, respectively. The potency of cooperation promotion, as well as the mechanism responsible for it, can thereby be tuned via a single parameter determining the frequency of link additions. We thus demonstrate that coevolving random networks may evoke an appropriate mechanism for each social dilemma, such that cooperation prevails even in highly unfavorable conditions.

  18. Cluster randomized trials for pharmacy practice research.

    PubMed

    Gums, Tyler; Carter, Barry; Foster, Eric

    2016-06-01

    Introduction: Cluster randomized trials (CRTs) are now the gold standard in health services research, including pharmacy-based interventions. Studies of behaviour, epidemiology, lifestyle modifications, educational programs, and health care models are utilizing the strengths of cluster randomized analyses. Methodology: The key property of CRTs is the unit of randomization (clusters), which may be different from the unit of analysis (individual). Subject sample size and, ideally, the number of clusters are determined by the relationship of between-cluster and within-cluster variability. The correlation among participants recruited from the same cluster is known as the intraclass correlation coefficient (ICC). Generally, having more clusters with smaller ICC values will lead to smaller sample sizes. When selecting clusters, stratification before randomization may be useful in decreasing imbalances between study arms. Participant recruitment methods can differ from other types of randomized trials, as blinding a behavioural intervention cannot always be done. When to use: CRTs can yield results that are relevant for making "real world" decisions. CRTs are often used in non-therapeutic intervention studies (e.g. change in practice guidelines). The advantages of the CRT design in pharmacy research have been avoiding contamination and the generalizability of the results. A large CRT that studied physician-pharmacist collaborative management of hypertension is used in this manuscript as a CRT example. The trial, entitled Collaboration Among Pharmacists and physicians To Improve Outcomes Now (CAPTION), was implemented in primary care offices in the United States for hypertensive patients. Limitations: CRT design limitations include the need for a large number of clusters, high costs, increased training, increased monitoring, and statistical complexity. PMID:26715549
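
    The sample-size point can be made concrete with the standard design effect, DEFF = 1 + (m - 1) × ICC, which inflates an individually randomized sample size to account for clustering; the numbers below are illustrative.

```python
def crt_sample_size(n_individual, cluster_size, icc):
    """Inflate an individually-randomized sample size for a cluster
    randomized trial using the design effect DEFF = 1 + (m - 1) * ICC,
    where m is the average cluster size. Smaller ICCs (and hence more,
    smaller clusters) need fewer subjects overall.
    """
    deff = 1 + (cluster_size - 1) * icc
    return n_individual * deff

# 400 subjects needed under individual randomization, 20 per cluster:
print(crt_sample_size(400, 20, icc=0.01))   # 476.0
print(crt_sample_size(400, 20, icc=0.05))   # 780.0
```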

  19. Does Random Dispersion Help Survival?

    NASA Astrophysics Data System (ADS)

    Schinazi, Rinaldo B.

    2015-04-01

    Many species live in colonies that prosper for a while and then collapse. After the collapse the colony survivors disperse randomly and found new colonies that may or may not make it depending on the new environment they find. We use birth and death chains in random environments to model such a population and to argue that random dispersion is a superior strategy for survival.

  20. A Random Variable Transformation Process.

    ERIC Educational Resources Information Center

    Scheuermann, Larry

    1989-01-01

    Provides a short BASIC program, RANVAR, which generates random variates for various theoretical probability distributions. The seven variates include: uniform, exponential, normal, binomial, Poisson, Pascal, and triangular. (MVL)
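
    The original BASIC listing is not reproduced in the record; a NumPy line per variate covers the same seven distributions (Pascal being the negative binomial). The parameter values are arbitrary examples.

```python
import numpy as np

rng = np.random.default_rng(7)

variates = {
    "uniform":     rng.uniform(0.0, 1.0, 5),
    "exponential": rng.exponential(scale=2.0, size=5),
    "normal":      rng.normal(loc=0.0, scale=1.0, size=5),
    "binomial":    rng.binomial(n=10, p=0.3, size=5),
    "poisson":     rng.poisson(lam=4.0, size=5),
    "pascal":      rng.negative_binomial(n=3, p=0.4, size=5),
    "triangular":  rng.triangular(left=0.0, mode=0.5, right=1.0, size=5),
}
```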

  1. Selecting Software.

    ERIC Educational Resources Information Center

    Pereus, Steven C.

    2002-01-01

    Describes a comprehensive computer software selection and evaluation process, including documenting district needs, evaluating software packages, weighing the alternatives, and making the purchase. (PKP)

  2. Diffusion in random networks

    NASA Astrophysics Data System (ADS)

    Padrino, Juan C.; Zhang, Duan Z.

    2015-11-01

    The ensemble phase averaging technique is applied to model mass transport in a porous medium. The porous material is idealized as an ensemble of random networks, where each network consists of a set of junction points representing the pores and tortuous channels connecting them. Inside a channel, fluid transport is assumed to be governed by the one-dimensional diffusion equation. Mass balance leads to an integro-differential equation for the pore mass density. Instead of attempting to solve this equation, an equivalent set of partial differential equations is derived whose solution is sought numerically. As a test problem, we consider the one-dimensional diffusion of a substance from one end to the other in a bounded domain. For a statistically homogeneous and isotropic material, results show that for relatively large times the pore mass density evolution from the new theory is significantly delayed in comparison with the solution from the classical diffusion equation. In the short-time case, when the solution evolves with time as if the domain were semi-infinite, numerical results indicate that the pore mass density becomes a function of the similarity variable x t^(-1/4) rather than x t^(-1/2), the variable characteristic of classical diffusion. This result was verified analytically. Possible applications of this framework include flow in gas shales. Work supported by the LDRD project of LANL.

  3. Ferroelectric random access memories.

    PubMed

    Ishiwara, Hiroshi

    2012-10-01

    Ferroelectric random access memory (FeRAM) is a nonvolatile memory, in which data are stored using hysteretic P-E (polarization vs. electric field) characteristics in a ferroelectric film. In this review, history and characteristics of FeRAMs are first introduced. It is described that there are two types of FeRAMs, capacitor-type and FET-type, and that only the capacitor-type FeRAM is now commercially available. In chapter 2, properties of ferroelectric films are discussed from a viewpoint of FeRAM application, in which particular attention is paid to those of Pb(Zr,Ti)O3, SrBi2Ta2O9, and BiFeO3. Then, cell structures and operation principle of the capacitor-type FeRAMs are discussed in chapter 3. It is described that the stacked technology of ferroelectric capacitors and development of new materials with large remanent polarization are important for fabricating high-density memories. Finally, in chapter 4, the optimized gate structure in ferroelectric-gate field-effect transistors is discussed and experimental results showing excellent data retention characteristics are presented. PMID:23421123

  4. Randomized parcellation based inference.

    PubMed

    Da Mota, Benoit; Fritsch, Virgile; Varoquaux, Gaël; Banaschewski, Tobias; Barker, Gareth J; Bokde, Arun L W; Bromberg, Uli; Conrod, Patricia; Gallinat, Jürgen; Garavan, Hugh; Martinot, Jean-Luc; Nees, Frauke; Paus, Tomas; Pausova, Zdenka; Rietschel, Marcella; Smolka, Michael N; Ströhle, Andreas; Frouin, Vincent; Poline, Jean-Baptiste; Thirion, Bertrand

    2014-04-01

    Neuroimaging group analyses are used to relate inter-subject signal differences observed in brain imaging with behavioral or genetic variables and to assess risks factors of brain diseases. The lack of stability and of sensitivity of current voxel-based analysis schemes may however lead to non-reproducible results. We introduce a new approach to overcome the limitations of standard methods, in which active voxels are detected according to a consensus on several random parcellations of the brain images, while a permutation test controls the false positive risk. Both on synthetic and real data, this approach shows higher sensitivity, better accuracy and higher reproducibility than state-of-the-art methods. In a neuroimaging-genetic application, we find that it succeeds in detecting a significant association between a genetic variant next to the COMT gene and the BOLD signal in the left thalamus for a functional Magnetic Resonance Imaging contrast associated with incorrect responses of the subjects from a Stop Signal Task protocol. PMID:24262376

  5. Random sphere packing model of heterogeneous propellants

    NASA Astrophysics Data System (ADS)

    Kochevets, Sergei Victorovich

    It is well recognized that combustion of heterogeneous propellants is strongly dependent on the propellant morphology. Recent developments in computing systems make it possible to start three-dimensional modeling of heterogeneous propellant combustion. A key component of such large scale computations is a realistic model of industrial propellants which retains the true morphology---a goal never achieved before. The research presented develops the Random Sphere Packing Model of heterogeneous propellants and generates numerical samples of actual industrial propellants. This is done by developing a sphere packing algorithm which randomly packs a large number of spheres with a polydisperse size distribution within a rectangular domain. First, the packing code is developed, optimized for performance, and parallelized using the OpenMP shared memory architecture. Second, the morphology and packing fraction of two simple cases of unimodal and bimodal packs are investigated computationally and analytically. It is shown that both the Loose Random Packing and Dense Random Packing limits are not well defined and the growth rate of the spheres is identified as the key parameter controlling the efficiency of the packing. For a properly chosen growth rate, computational results are found to be in excellent agreement with experimental data. Third, two strategies are developed to define numerical samples of polydisperse heterogeneous propellants: the Deterministic Strategy and the Random Selection Strategy. Using these strategies, numerical samples of industrial propellants are generated. The packing fraction is investigated and it is shown that the experimental values of the packing fraction can be achieved computationally. It is strongly believed that this Random Sphere Packing Model of propellants is a major step forward in the realistic computational modeling of heterogeneous propellant combustion. In addition, a method of analysis of the morphology of heterogeneous
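
    A toy version of random packing conveys the flavor of such algorithms. The sketch below does a minimal 2-D random-sequential placement of polydisperse disks in Python; it is an illustrative stand-in, not the author's parallel 3-D growth-based code, and the bimodal size distribution and all parameters are assumptions.

        import math
        import random

        def pack_disks(box=1.0, radii=(0.05, 0.02), weights=(0.3, 0.7),
                       attempts=100000):
            """Randomly place non-overlapping disks of mixed sizes in a unit box."""
            placed = []  # accepted disks stored as (x, y, r)
            for _ in range(attempts):
                r = random.choices(radii, weights=weights)[0]
                x = random.uniform(r, box - r)
                y = random.uniform(r, box - r)
                # accept the trial disk only if it overlaps no accepted disk
                if all((x - px) ** 2 + (y - py) ** 2 >= (r + pr) ** 2
                       for px, py, pr in placed):
                    placed.append((x, y, r))
            return placed

        disks = pack_disks()
        fraction = sum(math.pi * r * r for _, _, r in disks)  # box area = 1
        print(f"placed {len(disks)} disks, packing fraction ~ {fraction:.3f}")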

  6. Phase transitions on random lattices: how random is topological disorder?

    PubMed

    Barghathi, Hatem; Vojta, Thomas

    2014-09-19

    We study the effects of topological (connectivity) disorder on phase transitions. We identify a broad class of random lattices whose disorder fluctuations decay much faster with increasing length scale than those of generic random systems, yielding a wandering exponent of ω=(d-1)/(2d) in d dimensions. The stability of clean critical points is thus governed by the criterion (d+1)ν>2 rather than the usual Harris criterion dν>2, making topological disorder less relevant than generic randomness. The Imry-Ma criterion is also modified, allowing first-order transitions to survive in all dimensions d>1. These results explain a host of puzzling violations of the original criteria for equilibrium and nonequilibrium phase transitions on random lattices. We discuss applications, and we illustrate our theory by computer simulations of random Voronoi and other lattices. PMID:25279615

  7. Random distributed feedback fibre lasers

    NASA Astrophysics Data System (ADS)

    Turitsyn, Sergei K.; Babin, Sergey A.; Churkin, Dmitry V.; Vatnik, Ilya D.; Nikulin, Maxim; Podivilov, Evgenii V.

    2014-09-01

    The concept of random lasers exploiting multiple scattering of photons in an amplifying disordered medium in order to generate coherent light without a traditional laser resonator has attracted a great deal of attention in recent years. This research area lies at the interface of the fundamental theory of disordered systems and laser science. The idea was originally proposed in the context of astrophysics in the 1960s by V.S. Letokhov, who studied scattering with “negative absorption” of the interstellar molecular clouds. Research on random lasers has since developed into a mature experimental and theoretical field. A simple design of such lasers would be promising for potential applications. However, in traditional random lasers the properties of the output radiation are typically characterized by complex features in the spatial, spectral and time domains, making them less attractive than standard laser systems in terms of practical applications. Recently, an interesting and novel type of one-dimensional random laser that operates in a conventional telecommunication fibre without any pre-designed resonator mirrors, the random distributed feedback fibre laser, was demonstrated. The positive feedback required for laser generation in random fibre lasers is provided by the Rayleigh scattering from the inhomogeneities of the refractive index that are naturally present in silica glass. In the proposed laser concept, the randomly backscattered light is amplified through the Raman effect, providing distributed gain over distances up to 100 km. Although the effective reflection due to Rayleigh scattering is extremely small (~0.1%), the lasing threshold may be exceeded when a sufficiently large distributed Raman gain is provided. Such a random distributed feedback fibre laser has a number of interesting and attractive features. The fibre waveguide geometry provides transverse confinement, and effectively one-dimensional random distributed feedback leads to the generation

  8. Teacher Selection.

    ERIC Educational Resources Information Center

    Heynderickx, James J.

    1987-01-01

    A one-page introduction is followed by three pages containing summaries of three journal articles and two documents on teacher selection. Mary Cihak Jensen argues that final selection decisions should be based on multiple information sources, since teaching requires proficiency in many interrelated skills. Superintendent Richard J. Caliendo…

  9. 40 CFR 761.355 - Third level of sample selection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... of sample selection further reduces the size of the subsample to 100 grams which is suitable for the... procedures in § 761.353 of this part into 100 gram portions. (b) Use a random number generator or random number table to select one 100 gram size portion as a sample for a procedure used to simulate...

  10. 40 CFR 761.355 - Third level of sample selection.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... of sample selection further reduces the size of the subsample to 100 grams which is suitable for the... procedures in § 761.353 of this part into 100 gram portions. (b) Use a random number generator or random number table to select one 100 gram size portion as a sample for a procedure used to simulate...

  11. 40 CFR 761.355 - Third level of sample selection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... of sample selection further reduces the size of the subsample to 100 grams which is suitable for the... procedures in § 761.353 of this part into 100 gram portions. (b) Use a random number generator or random number table to select one 100 gram size portion as a sample for a procedure used to simulate...

  12. 40 CFR 761.355 - Third level of sample selection.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... of sample selection further reduces the size of the subsample to 100 grams which is suitable for the... procedures in § 761.353 of this part into 100 gram portions. (b) Use a random number generator or random number table to select one 100 gram size portion as a sample for a procedure used to simulate...
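
    The random draw these rules require is operationally simple; the sketch below is a minimal illustration with hypothetical portion labels, using a software random number generator in place of a random number table.

        import random

        # hypothetical labels for the 100 gram portions prepared under Sec. 761.353
        portions = [f"portion-{i:02d}" for i in range(1, 11)]
        selected = random.choice(portions)  # each portion equally likely
        print("portion selected for the follow-on procedure:", selected)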

  13. Model Selection with the Linear Mixed Model for Longitudinal Data

    ERIC Educational Resources Information Center

    Ryoo, Ji Hoon

    2011-01-01

    Model building or model selection with linear mixed models (LMMs) is complicated by the presence of both fixed effects and random effects. The fixed effects structure and random effects structure are codependent, so selection of one influences the other. Most presentations of LMM in psychology and education are based on a multilevel or…

  14. Students' Misconceptions about Random Variables

    ERIC Educational Resources Information Center

    Kachapova, Farida; Kachapov, Ilias

    2012-01-01

    This article describes some misconceptions about random variables and related counter-examples, and makes suggestions about teaching initial topics on random variables in general form instead of doing it separately for discrete and continuous cases. The focus is on post-calculus probability courses. (Contains 2 figures.)

  15. Ticks of a Random clock

    NASA Astrophysics Data System (ADS)

    Jung, P.; Talkner, P.

    2010-09-01

    A simple way to convert a purely random sequence of events into a signal with a strong periodic component is proposed. The signal consists of those instants of time at which the length of the random sequence exceeds an integer multiple of a given number. The larger this number the more pronounced the periodic behavior becomes.
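
    A minimal simulation of the scheme, under the added assumption (not stated in the abstract) that the random events form a Poisson process: a tick is emitted whenever the running event count crosses another integer multiple of M, and the tick intervals become more nearly periodic as M grows.

        import random

        rate, M = 1.0, 50                  # event rate; one tick per M-th event
        t, next_level, ticks = 0.0, M, []
        for count in range(1, 100001):
            t += random.expovariate(rate)  # waiting time to the next random event
            if count >= next_level:        # count crossed a multiple of M
                ticks.append(t)
                next_level += M

        gaps = [b - a for a, b in zip(ticks, ticks[1:])]
        mean = sum(gaps) / len(gaps)
        sd = (sum((g - mean) ** 2 for g in gaps) / len(gaps)) ** 0.5
        print(f"mean tick period {mean:.2f}, relative spread {sd / mean:.3f}")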

  16. Computer generation of random deviates.

    PubMed

    Cormack, J; Shuter, B

    1991-06-01

    The need for random deviates arises in many scientific applications, such as the simulation of physical processes, numerical evaluation of complex mathematical formulae and the modeling of decision processes. In medical physics, Monte Carlo simulations have been used in radiology, radiation therapy and nuclear medicine. Specific instances include the modelling of x-ray scattering processes and the addition of random noise to images or curves in order to assess the effects of various processing procedures. Reliable sources of random deviates with statistical properties indistinguishable from true random deviates are a fundamental necessity for such tasks. This paper provides a review of computer algorithms which can be used to generate uniform random deviates and other distributions of interest to medical physicists, along with a few caveats relating to various problems and pitfalls which can occur. Source code listings for the generators discussed (in FORTRAN, Turbo-PASCAL and Data General ASSEMBLER) are available on request from the authors. PMID:1747086
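
    One workhorse technique reviews of this kind cover is inverse transform sampling: a uniform deviate pushed through the inverse cumulative distribution function of the target distribution. A minimal Python sketch for exponential deviates follows (the review's own listings were in FORTRAN, Turbo-PASCAL and assembler).

        import math
        import random

        def exponential_deviate(lam=2.0):
            """Invert the exponential CDF F(x) = 1 - exp(-lam*x) at u ~ U[0,1)."""
            u = random.random()
            return -math.log(1.0 - u) / lam

        draws = [exponential_deviate() for _ in range(100000)]
        print("sample mean:", sum(draws) / len(draws), "(theory: 0.5)")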

  17. Randomness versus nonlocality and entanglement.

    PubMed

    Acín, Antonio; Massar, Serge; Pironio, Stefano

    2012-03-01

    The outcomes obtained in Bell tests involving two-outcome measurements on two subsystems can, in principle, generate up to 2 bits of randomness. However, the maximal violation of the Clauser-Horne-Shimony-Holt inequality guarantees the generation of only 1.23 bits of randomness. We prove here that quantum correlations with arbitrarily little nonlocality and states with arbitrarily little entanglement can be used to certify that close to the maximum of 2 bits of randomness are produced. Our results show that nonlocality, entanglement, and randomness are inequivalent quantities. They also imply that device-independent quantum key distribution with an optimal key generation rate is possible by using almost-local correlations and that device-independent randomness generation with an optimal rate is possible with almost-local correlations and with almost-unentangled states. PMID:22463395

  18. 40 CFR 211.212-2 - Test hearing protector selection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Test hearing protector selection. 211... selection. (a) The test request will specify the number of test protectors which will be selected for... of the test request. (b) If random selection is specified, it must be achieved by...

  19. Dynamic response of random parametered structures with random excitation. [DYNAMO

    SciTech Connect

    Branstetter, L.J.; Paez, T.L.

    1986-02-01

    A Taylor series expansion technique is used for numerical evaluation of the statistical response moments of a linear multidegree of freedom (MDF) system having random stiffness characteristics, when excited by either stationary or nonstationary random load components. Equations are developed for the cases of white noise loading and single step memory loading, and a method is presented to extend the solution to multistep memory loading. The equations are greatly simplified by the assumption that all random quantities are normally distributed. A computer program is developed to calculate the response moments of example systems. A program user's manual and listing (DYNAMO) are included. Future extensions of the work and potential applications are discussed.

  20. Selected Health Practices Among Ohio's Rural Residents.

    ERIC Educational Resources Information Center

    Phillips, G. Howard; Pugh, Albert

    Using a stratified random sample of 12 of Ohio's 88 counties, this 1967 study had as its objectives (1) to measure the level of participation in selected health practices by Ohio's rural residents, (2) to compare the level of participation in selected health practices of farm and rural nonfarm residents, and (3) to examine levels of participation…

  1. Selective mutism

    MedlinePlus

    ... may need to continue therapy for shyness and social anxiety into the teenage years, and possibly into adulthood. ... Call your health care provider if your child has symptoms of selective mutism, and it is affecting school and social activities.

  2. The in vitro selection world.

    PubMed

    Jijakli, Kenan; Khraiwesh, Basel; Fu, Weiqi; Luo, Liming; Alzahmi, Amnah; Koussa, Joseph; Chaiboonchoe, Amphun; Kirmizialtin, Serdal; Yen, Laising; Salehi-Ashtiani, Kourosh

    2016-08-15

    Through iterative cycles of selection, amplification, and mutagenesis, in vitro selection provides the ability to isolate molecules of desired properties and function from large pools (libraries) of random molecules with as many as 10(16) distinct species. This review, in recognition of a quarter century of scientific discoveries made through in vitro selection, starts with a brief overview of the method and its history. It further covers recent developments in in vitro selection with a focus on tools that enhance the capabilities of in vitro selection and its expansion from being purely a nucleic acids selection to that of polypeptides and proteins. In addition, we cover how next generation sequencing and modern biological computational tools are being used to complement in vitro selection experiments. At the very least, sequencing and computational tools can translate the large volume of information associated with in vitro selection experiments into manageable, analyzable, and exploitable information. Finally, in vivo selection is briefly compared and contrasted to in vitro selection to highlight the unique capabilities of each method. PMID:27312879

  3. When Is Selection Effective?

    PubMed

    Gravel, Simon

    2016-05-01

    Deleterious alleles can reach high frequency in small populations because of random fluctuations in allele frequency. This may lead, over time, to reduced average fitness. In this sense, selection is more "effective" in larger populations. Recent studies have considered whether the different demographic histories across human populations have resulted in differences in the number, distribution, and severity of deleterious variants, leading to an animated debate. This article first seeks to clarify some terms of the debate by identifying differences in definitions and assumptions used in recent studies. We argue that variants of Morton, Crow, and Muller's "total mutational damage" provide the soundest and most practical basis for such comparisons. Using simulations, analytical calculations, and 1000 Genomes Project data, we provide an intuitive and quantitative explanation for the observed similarity in genetic load across populations. We show that recent demography has likely modulated the effect of selection and still affects it, but the net result of the accumulated differences is small. Direct observation of differential efficacy of selection for specific allele classes is nevertheless possible with contemporary data sets. By contrast, identifying average genome-wide differences in the efficacy of selection across populations will require many modeling assumptions and is unlikely to provide much biological insight about human populations. PMID:27010021

  4. Random Forest Classification for Surficial Material Mapping in Northern Canada

    NASA Astrophysics Data System (ADS)

    Parkinson, William

    There is a need at the Geological Survey of Canada to apply improved accuracy assessments of satellite image classification and to support remote predictive mapping techniques for geological map production and field operations. Most existing image classification algorithms, however, lack any robust capabilities for assessing classification accuracy and its variability throughout the landscape. In this study, a random forest classification workflow is introduced to improve understanding of overall image classification accuracy and to better describe its spatial variability across a heterogeneous landscape in Northern Canada. The Random Forest model is a stochastic implementation of classification and regression trees, which is computationally efficient, effectively handles outlier bias, and can be used on non-parametric data sources. A variable selection methodology and a stochastic accuracy assessment for Random Forest are introduced. Random forest provides enhanced classification compared to standard maximum likelihood algorithms, improving the predictive capacity of satellite imagery for surficial material mapping.
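
    The workflow is straightforward to reproduce in outline. The sketch below trains a random forest with scikit-learn on synthetic stand-in data (not the study's imagery or code); the out-of-bag score gives the kind of internal accuracy estimate discussed, and the importance scores support the variable selection step.

        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        # synthetic stand-in for per-pixel band values and surficial classes
        X, y = make_classification(n_samples=2000, n_features=12,
                                   n_informative=6, n_classes=4, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        rf = RandomForestClassifier(n_estimators=300, oob_score=True,
                                    random_state=0)
        rf.fit(X_tr, y_tr)
        print("out-of-bag accuracy:", rf.oob_score_)
        print("held-out accuracy:  ", rf.score(X_te, y_te))
        print("top importances:", sorted(rf.feature_importances_)[-3:])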

  5. Record statistics of financial time series and geometric random walks.

    PubMed

    Sabir, Behlool; Santhanam, M S

    2014-09-01

    The study of record statistics of correlated series in physics, such as random walks, is gaining momentum, and several analytical results have been obtained in the past few years. In this work, we study the record statistics of correlated empirical data for which random walk models have relevance. We obtain results for the record statistics of select stock market data and the geometric random walk, primarily through simulations. We show that the distribution of the age of records is a power law with the exponent α lying in the range 1.5≤α≤1.8. Further, the longest record ages follow the Fréchet distribution of extreme value theory. The record statistics of geometric random walk series are in good agreement with those obtained from empirical stock data. PMID:25314414

  6. Record statistics of financial time series and geometric random walks

    NASA Astrophysics Data System (ADS)

    Sabir, Behlool; Santhanam, M. S.

    2014-09-01

    The study of record statistics of correlated series in physics, such as random walks, is gaining momentum, and several analytical results have been obtained in the past few years. In this work, we study the record statistics of correlated empirical data for which random walk models have relevance. We obtain results for the record statistics of select stock market data and the geometric random walk, primarily through simulations. We show that the distribution of the age of records is a power law with the exponent α lying in the range 1.5≤α≤1.8. Further, the longest record ages follow the Fréchet distribution of extreme value theory. The record statistics of geometric random walk series are in good agreement with those obtained from empirical stock data.
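
    The central quantity is simple to reproduce by simulation. The sketch below generates a geometric random walk (with an assumed 1% Gaussian step volatility, an illustrative choice) and collects the ages of its records, i.e. the waiting times between successive running maxima.

        import random

        random.seed(1)
        price, record, record_time, ages = 1.0, 1.0, 0, []
        for t in range(1, 200001):
            price *= 1.0 + random.gauss(0.0, 0.01)  # geometric random walk step
            if price > record:
                ages.append(t - record_time)  # age of the record just broken
                record, record_time = price, t
        print(f"{len(ages)} records; longest record age {max(ages)}")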

  7. Performance of Random Effects Model Estimators under Complex Sampling Designs

    ERIC Educational Resources Information Center

    Jia, Yue; Stokes, Lynne; Harris, Ian; Wang, Yan

    2011-01-01

    In this article, we consider estimation of parameters of random effects models from samples collected via complex multistage designs. Incorporation of sampling weights is one way to reduce estimation bias due to unequal probabilities of selection. Several weighting methods have been proposed in the literature for estimating the parameters of…

  8. A random spatial sampling method in a rural developing nation

    PubMed Central

    2014-01-01

    Background: Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. Methods: We describe a stratified random sampling method using geographical information system (GIS) software and global positioning system (GPS) technology for application in a health survey in a rural region of Guatemala, as well as a qualitative study of the enumeration process. Results: This method offers an alternative sampling technique that could reduce opportunities for bias in household selection compared to cluster methods. However, its use is subject to issues surrounding survey preparation, technological limitations and in-the-field household selection. Application of this method in remote areas will raise challenges surrounding the boundary delineation process, use and translation of satellite imagery between GIS and GPS, and household selection at each survey point in varying field conditions. This method favors household selection in denser urban areas and in new residential developments. Conclusions: Random spatial sampling methodology can be used to survey a random sample of population in a remote region of a developing nation. Although this method should be further validated and compared with more established methods to determine its utility in social survey applications, it shows promise for use in developing nations with resource-challenged environments where detailed geographic and human census data are less available. PMID:24716473

  9. Lowest eigenvalues of random Hamiltonians

    SciTech Connect

    Shen, J. J.; Zhao, Y. M.; Arima, A.; Yoshinaga, N.

    2008-05-15

    In this article we study the lowest eigenvalues of random Hamiltonians for both fermion and boson systems. We show that an empirical formula of evaluating the lowest eigenvalues of random Hamiltonians in terms of energy centroids and widths of eigenvalues is applicable to many different systems. We improve the accuracy of the formula by considering the third central moment. We show that these formulas are applicable not only to the evaluation of the lowest energy but also to the evaluation of excited energies of systems under random two-body interactions.

  10. Random graphs with hidden color.

    PubMed

    Söderberg, Bo

    2003-07-01

    We propose and investigate a unifying class of sparse random graph models, based on a hidden coloring of edge-vertex incidences, extending an existing approach, random graphs with a given degree distribution, in a way that admits a nontrivial correlation structure in the resulting graphs. The approach unifies a number of existing random graph ensembles within a common general formalism, and allows for the analytic calculation of observable graph characteristics. In particular, generating function techniques are used to derive the size distribution of connected components (clusters) as well as the location of the percolation threshold where a giant component appears. PMID:12935185

  11. Random sequential adsorption on fractals

    NASA Astrophysics Data System (ADS)

    Ciesla, Michal; Barbasz, Jakub

    2012-07-01

    Irreversible adsorption of spheres on flat collectors having dimension d < 2 is studied. Molecules are adsorbed on Sierpinski's triangle and carpet-like fractals (1 < d < 2), and on the general Cantor set (d < 1). The adsorption process is modeled numerically using the random sequential adsorption (RSA) algorithm. The paper concentrates on measurement of fundamental properties of the coverages, i.e., the maximal random coverage ratio and the density autocorrelation function, as well as the RSA kinetics. The obtained results allow us to improve the phenomenological relation between the maximal random coverage ratio and the collector dimension. Moreover, simulations show that, in general, most of the known dimensional properties of adsorbed monolayers remain valid for non-integer dimensions.

  12. Random sequential adsorption on fractals.

    PubMed

    Ciesla, Michal; Barbasz, Jakub

    2012-07-28

    Irreversible adsorption of spheres on flat collectors having dimension d < 2 is studied. Molecules are adsorbed on Sierpinski's triangle and carpet-like fractals (1 < d < 2), and on the general Cantor set (d < 1). The adsorption process is modeled numerically using the random sequential adsorption (RSA) algorithm. The paper concentrates on measurement of fundamental properties of the coverages, i.e., the maximal random coverage ratio and the density autocorrelation function, as well as the RSA kinetics. The obtained results allow us to improve the phenomenological relation between the maximal random coverage ratio and the collector dimension. Moreover, simulations show that, in general, most of the known dimensional properties of adsorbed monolayers remain valid for non-integer dimensions. PMID:22852643
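
    The RSA algorithm itself is easiest to see in one dimension, the classical car-parking problem, rather than on a fractal. In the minimal sketch below, unit segments adsorb irreversibly at uniformly random positions, overlapping attempts are rejected, and the coverage approaches the Renyi jamming limit of about 0.7476.

        import random

        L, placed = 100.0, []    # line length; left ends of adsorbed segments
        for _ in range(100000):  # adsorption attempts
            x = random.uniform(0.0, L - 1.0)
            # reject the attempt if the trial unit segment overlaps an adsorbed one
            if all(x + 1.0 <= a or x >= a + 1.0 for a in placed):
                placed.append(x)
        print("coverage:", len(placed) / L)  # approaches ~0.7476 at jamming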

  13. A random forest classifier for lymph diseases.

    PubMed

    Azar, Ahmad Taher; Elshazly, Hanaa Ismail; Hassanien, Aboul Ella; Elkorany, Abeer Mohamed

    2014-02-01

    Machine learning-based classification techniques provide support for the decision-making process in many areas of health care, including diagnosis, prognosis, screening, etc. Feature selection (FS) is expected to improve classification performance, particularly in situations characterized by the high data dimensionality problem caused by relatively few training examples compared to a large number of measured features. In this paper, a random forest classifier (RFC) approach is proposed to diagnose lymph diseases. Focusing on feature selection, the first stage of the proposed system aims at constructing diverse feature selection algorithms such as genetic algorithm (GA), Principal Component Analysis (PCA), Relief-F, Fisher, Sequential Forward Floating Search (SFFS) and the Sequential Backward Floating Search (SBFS) for reducing the dimension of lymph diseases dataset. Switching from feature selection to model construction, in the second stage, the obtained feature subsets are fed into the RFC for efficient classification. It was observed that GA-RFC achieved the highest classification accuracy of 92.2%. The dimension of input feature space is reduced from eighteen to six features by using GA. PMID:24290902

  14. Quantifying randomness in real networks

    PubMed Central

    Orsini, Chiara; Dankulov, Marija M.; Colomer-de-Simón, Pol; Jamakovic, Almerima; Mahadevan, Priya; Vahdat, Amin; Bassler, Kevin E.; Toroczkai, Zoltán; Boguñá, Marián; Caldarelli, Guido; Fortunato, Santo; Krioukov, Dmitri

    2015-01-01

    Represented as graphs, real networks are intricate combinations of order and disorder. Fixing some of the structural properties of network models to their values observed in real networks, many other properties appear as statistical consequences of these fixed observables, plus randomness in other respects. Here we employ the dk-series, a complete set of basic characteristics of the network structure, to study the statistical dependencies between different network properties. We consider six real networks—the Internet, US airport network, human protein interactions, technosocial web of trust, English word network, and an fMRI map of the human brain—and find that many important local and global structural properties of these networks are closely reproduced by dk-random graphs whose degree distributions, degree correlations and clustering are as in the corresponding real network. We discuss important conceptual, methodological, and practical implications of this evaluation of network randomness, and release software to generate dk-random graphs. PMID:26482121

  15. Scattering from a random surface

    SciTech Connect

    Abarbanel, H.D.I.

    1980-11-01

    We give a formulation of the problem of propagation of scalar waves over a random surface. By a judicious choice of variables we are able to show that this situation is equivalent to propagation of these waves through a medium of random fluctuations with fluctuating source and receiver. The wave equation in the new coordinates has an additional term, the fluctuation operator, which depends on derivatives of the surface in space and time. An expansion in the fluctuation operator is given which guarantees the desired boundary conditions at every order. We treat both the cases where the surface is time dependent, such as the sea surface, or fixed in time. Also discussed is the situation where the source and receiver lie between the random surface and another, possibly also random, surface. In detail we consider acoustic waves for which the surfaces are pressure release. The method is directly applicable to electromagnetic waves and other boundary conditions.

  16. A Randomized Central Limit Theorem

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Klafter, Joseph

    2010-05-01

    The Central Limit Theorem (CLT), one of the most elemental pillars of Probability Theory and Statistical Physics, asserts that: the universal probability law of large aggregates of independent and identically distributed random summands with zero mean and finite variance, scaled by the square root of the aggregate-size (√n), is Gaussian. The scaling scheme of the CLT is deterministic and uniform - scaling all aggregate-summands by the common and deterministic factor √n. This Letter considers scaling schemes which are stochastic and non-uniform, and presents a "Randomized Central Limit Theorem" (RCLT): we establish a class of random scaling schemes which yields universal probability laws of large aggregates of independent and identically distributed random summands. The RCLT universal probability laws, in turn, are the one-sided and the symmetric Lévy laws.

  17. Quantifying randomness in real networks

    NASA Astrophysics Data System (ADS)

    Orsini, Chiara; Dankulov, Marija M.; Colomer-de-Simón, Pol; Jamakovic, Almerima; Mahadevan, Priya; Vahdat, Amin; Bassler, Kevin E.; Toroczkai, Zoltán; Boguñá, Marián; Caldarelli, Guido; Fortunato, Santo; Krioukov, Dmitri

    2015-10-01

    Represented as graphs, real networks are intricate combinations of order and disorder. Fixing some of the structural properties of network models to their values observed in real networks, many other properties appear as statistical consequences of these fixed observables, plus randomness in other respects. Here we employ the dk-series, a complete set of basic characteristics of the network structure, to study the statistical dependencies between different network properties. We consider six real networks--the Internet, US airport network, human protein interactions, technosocial web of trust, English word network, and an fMRI map of the human brain--and find that many important local and global structural properties of these networks are closely reproduced by dk-random graphs whose degree distributions, degree correlations and clustering are as in the corresponding real network. We discuss important conceptual, methodological, and practical implications of this evaluation of network randomness, and release software to generate dk-random graphs.

  18. Quantifying randomness in real networks.

    PubMed

    Orsini, Chiara; Dankulov, Marija M; Colomer-de-Simón, Pol; Jamakovic, Almerima; Mahadevan, Priya; Vahdat, Amin; Bassler, Kevin E; Toroczkai, Zoltán; Boguñá, Marián; Caldarelli, Guido; Fortunato, Santo; Krioukov, Dmitri

    2015-01-01

    Represented as graphs, real networks are intricate combinations of order and disorder. Fixing some of the structural properties of network models to their values observed in real networks, many other properties appear as statistical consequences of these fixed observables, plus randomness in other respects. Here we employ the dk-series, a complete set of basic characteristics of the network structure, to study the statistical dependencies between different network properties. We consider six real networks--the Internet, US airport network, human protein interactions, technosocial web of trust, English word network, and an fMRI map of the human brain--and find that many important local and global structural properties of these networks are closely reproduced by dk-random graphs whose degree distributions, degree correlations and clustering are as in the corresponding real network. We discuss important conceptual, methodological, and practical implications of this evaluation of network randomness, and release software to generate dk-random graphs. PMID:26482121

  19. Quantum-noise randomized ciphers

    NASA Astrophysics Data System (ADS)

    Nair, Ranjith; Yuen, Horace P.; Corndorf, Eric; Eguchi, Takami; Kumar, Prem

    2006-11-01

    We review the notion of a classical random cipher and its advantages. We sharpen the usual description of random ciphers to a particular mathematical characterization suggested by the salient feature responsible for their increased security. We describe a concrete system known as αη and show that it is equivalent to a random cipher in which the required randomization is effected by coherent-state quantum noise. We describe the currently known security features of αη and similar systems, including lower bounds on the unicity distances against ciphertext-only and known-plaintext attacks. We show how αη used in conjunction with any standard stream cipher such as the Advanced Encryption Standard provides an additional, qualitatively different layer of security from physical encryption against known-plaintext attacks on the key. We refute some claims in the literature that αη is equivalent to a nonrandom stream cipher.

  20. Quantum-noise randomized ciphers

    SciTech Connect

    Nair, Ranjith; Yuen, Horace P.; Kumar, Prem; Corndorf, Eric; Eguchi, Takami

    2006-11-15

    We review the notion of a classical random cipher and its advantages. We sharpen the usual description of random ciphers to a particular mathematical characterization suggested by the salient feature responsible for their increased security. We describe a concrete system known as αη and show that it is equivalent to a random cipher in which the required randomization is effected by coherent-state quantum noise. We describe the currently known security features of αη and similar systems, including lower bounds on the unicity distances against ciphertext-only and known-plaintext attacks. We show how αη used in conjunction with any standard stream cipher such as the Advanced Encryption Standard provides an additional, qualitatively different layer of security from physical encryption against known-plaintext attacks on the key. We refute some claims in the literature that αη is equivalent to a nonrandom stream cipher.

  1. Control theory for random systems

    NASA Technical Reports Server (NTRS)

    Bryson, A. E., Jr.

    1972-01-01

    A survey is presented of the current knowledge available for designing and predicting the effectiveness of controllers for dynamic systems which can be modeled by ordinary differential equations. A short discussion of feedback control is followed by a description of deterministic controller design and the concept of system state. The need for more realistic disturbance models led to the use of stochastic process concepts, in particular the Gauss-Markov process. A compensator controlled system, with random forcing functions, random errors in the measurements, and random initial conditions, is treated as constituting a Gauss-Markov random process; hence the mean-square behavior of the controlled system is readily predicted. As an example, a compensator is designed for a helicopter to maintain it in hover in a gusty wind over a point on the ground.

  2. Diffraction by random Ronchi gratings.

    PubMed

    Torcal-Milla, Francisco Jose; Sanchez-Brea, Luis Miguel

    2016-08-01

    In this work, we obtain analytical expressions for the near- and far-field diffraction of random Ronchi diffraction gratings where the slits of the grating are randomly displaced around their periodic positions. We theoretically show that the effect of randomness in the position of the slits of the grating produces a decrease of the contrast and even the disappearance of the self-images for high randomness levels in the near field. On the other hand, it cancels high-order harmonics in the far field, resulting in only a few central diffraction orders. Numerical simulations by means of the Rayleigh-Sommerfeld diffraction formula are performed in order to corroborate the analytical results. These results are of interest for industrial and technological applications where manufacturing errors need to be considered. PMID:27505363

  3. Random and non-random mating populations: Evolutionary dynamics in meiotic drive.

    PubMed

    Sarkar, Bijan

    2016-01-01

    Game theoretic tools are utilized to analyze a one-locus continuous selection model of sex-specific meiotic drive by considering nonequivalence of the viabilities of reciprocal heterozygotes that might be noticed at an imprinted locus. The model draws attention to the role of viability selections of different types to examine the stable nature of polymorphic equilibrium. A bridge between population genetics and evolutionary game theory has been built up by applying the concept of the Fundamental Theorem of Natural Selection. In addition to pointing out the influences of male and female segregation ratios on selection, the configuration structure reveals some notable results, e.g., Hardy-Weinberg frequencies hold in replicator dynamics, faster evolution occurs at maximized variance fitness, mixed Evolutionarily Stable Strategies (ESS) exist in asymmetric games, and evolution tends to follow not only a 1:1 sex ratio but also a 1:1 ratio of different alleles at a particular gene locus. Through construction of replicator dynamics in the group selection framework, our selection model provides a redefined basis for game theory that incorporates non-random mating, where a mating parameter associated with population structure depends on the social structure. The model also exposes the fact that the number of polymorphic equilibria depends on the algebraic expression of population structure. PMID:26524140

  4. Experimental evidence of quantum randomness incomputability

    SciTech Connect

    Calude, Cristian S.; Dinneen, Michael J.; Dumitrescu, Monica; Svozil, Karl

    2010-08-15

    In contrast with software-generated randomness (called pseudo-randomness), quantum randomness can be proven incomputable; that is, it is not exactly reproducible by any algorithm. We provide experimental evidence of incomputability--an asymptotic property--of quantum randomness by performing finite tests of randomness inspired by algorithmic information theory.

  5. On Pfaffian Random Point Fields

    NASA Astrophysics Data System (ADS)

    Kargin, V.

    2014-02-01

    We study Pfaffian random point fields by using the Moore-Dyson quaternion determinants. First, we give sufficient conditions that ensure that a self-dual quaternion kernel defines a valid random point field, and then we prove a CLT for Pfaffian point fields. The proofs are based on a new quaternion extension of the Cauchy-Binet determinantal identity. In addition, we derive the Fredholm determinantal formulas for the Pfaffian point fields which use the quaternion determinant.

  6. Digital random-number generator

    NASA Technical Reports Server (NTRS)

    Brocker, D. H.

    1973-01-01

    For a binary digit array of N bits, N noise sources feed N nonlinear operators; each flip-flop in the digit array is set by its nonlinear operator to reflect whether the amplitude of the generator which feeds it is above or below the mean value of the generated noise. The fixed-point uniform-distribution random number generation method can also be used to generate random numbers with distributions other than uniform.
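
    A software analogue of the scheme, with simulated Gaussian noise standing in for the N hardware noise sources (an assumption of this sketch): each bit records whether one generator's sample lies above the noise mean.

        import random

        def random_word(n_bits=16, mean=0.0, sigma=1.0):
            """Set each bit by comparing one noise sample against the noise mean."""
            word = 0
            for _ in range(n_bits):
                sample = random.gauss(mean, sigma)  # one noise source's amplitude
                word = (word << 1) | (sample > mean)
            return word

        print(f"{random_word():016b}")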

  7. Quasi-Random Sequence Generators.

    Energy Science and Technology Software Center (ESTSC)

    1994-03-01

    Version 00 LPTAU generates quasi-random sequences. The sequences are uniformly distributed sets of L=2**30 points in the N-dimensional unit cube: I**N=[0,1]. The sequences are used as nodes for multidimensional integration, as searching points in global optimization, as trial points in multicriteria decision making, as quasi-random points for quasi Monte Carlo algorithms.
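
    LPTAU's sequences are Sobol (LP-tau) points, and an equivalent low-discrepancy generator is available in scipy. The sketch below, a minimal quasi Monte Carlo integration over the unit square using scipy.stats.qmc rather than the original FORTRAN code, estimates the integral of x*y (exactly 0.25).

        import numpy as np
        from scipy.stats import qmc

        sobol = qmc.Sobol(d=2, scramble=False)  # quasi-random points in [0,1]^2
        pts = sobol.random_base2(m=10)          # 2**10 integration nodes
        print("QMC estimate:", np.prod(pts, axis=1).mean())  # exact value: 0.25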

  8. Randomness and degrees of irregularity.

    PubMed Central

    Pincus, S; Singer, B H

    1996-01-01

    The fundamental question "Are sequential data random?" arises in myriad contexts, often with severe data length constraints. Furthermore, there is frequently a critical need to delineate nonrandom sequences in terms of closeness to randomness--e.g., to evaluate the efficacy of therapy in medicine. We address both these issues from a computable framework via a quantification of regularity: ApEn (approximate entropy) defines maximal randomness for sequences of arbitrary length and applies to sequences as short as N = 5 points. An infinite sequence formulation of randomness is introduced that retains the operational (and computable) features of the finite case. In the infinite sequence setting, we indicate how the "foundational" definition of independence in probability theory, and the definition of normality in number theory, reduce to limit theorems without rates of convergence, from which we utilize ApEn to address rates of convergence (of a deficit from maximal randomness), refining the aforementioned concepts in a computationally essential manner. Representative applications among many are indicated to assess (i) random number generation output; (ii) well-shuffled arrangements; and (iii) (the quality of) bootstrap replicates. PMID:11607637
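
    A minimal implementation of ApEn(m, r) following the standard definition; here the tolerance r is fixed in absolute units, whereas applications often take r relative to the standard deviation of the series.

        import math
        import random

        def apen(series, m=2, r=0.2):
            """ApEn = phi(m) - phi(m+1): how often length-m matches persist."""
            def phi(mm):
                pats = [series[i:i + mm] for i in range(len(series) - mm + 1)]
                total = 0.0
                for p in pats:
                    c = sum(1 for q in pats
                            if max(abs(a - b) for a, b in zip(p, q)) <= r)
                    total += math.log(c / len(pats))
                return total / len(pats)
            return phi(m) - phi(m + 1)

        random.seed(0)
        print(apen([i % 2 for i in range(60)]))            # regular: near 0
        print(apen([random.random() for _ in range(60)]))  # irregular: larger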

  9. Phase Transitions on Random Lattices: How Random is Topological Disorder?

    NASA Astrophysics Data System (ADS)

    Barghathi, Hatem; Vojta, Thomas

    2015-03-01

    We study the effects of topological (connectivity) disorder on phase transitions. We identify a broad class of random lattices whose disorder fluctuations decay much faster with increasing length scale than those of generic random systems, yielding a wandering exponent of ω = (d - 1) / (2 d) in d dimensions. The stability of clean critical points is thus governed by the criterion (d + 1) ν > 2 rather than the usual Harris criterion dν > 2 , making topological disorder less relevant than generic randomness. The Imry-Ma criterion is also modified, allowing first-order transitions to survive in all dimensions d > 1 . These results explain a host of puzzling violations of the original criteria for equilibrium and nonequilibrium phase transitions on random lattices. We discuss applications, and we illustrate our theory by computer simulations of random Voronoi and other lattices. This work was supported by the NSF under Grant Nos. DMR-1205803 and PHYS-1066293. We acknowledge the hospitality of the Aspen Center for Physics.

  10. Full randomness from arbitrarily deterministic events.

    PubMed

    Gallego, Rodrigo; Masanes, Lluis; De La Torre, Gonzalo; Dhara, Chirag; Aolita, Leandro; Acín, Antonio

    2013-01-01

    Do completely unpredictable events exist? Classical physics excludes fundamental randomness. Although quantum theory makes probabilistic predictions, this does not imply that nature is random, as randomness should be certified without relying on the complete structure of the theory being used. Bell tests approach the question from this perspective. However, they require prior perfect randomness, falling into circular reasoning. A Bell test that generates perfect random bits from bits possessing high (but less than perfect) randomness has recently been obtained. Yet the main question remained open: does any initial randomness suffice to certify perfect randomness? Here we show that this is indeed the case. We provide a Bell test that uses arbitrarily imperfect random bits to produce bits that are, under the non-signalling principle assumption, perfectly random. This provides the first protocol attaining full randomness amplification. Our results have strong implications for the debate of whether there exist events that are fully random. PMID:24173040

  11. On Convergent Probability of a Random Walk

    ERIC Educational Resources Information Center

    Lee, Y.-F.; Ching, W.-K.

    2006-01-01

    This note introduces an interesting random walk on a straight path with cards of random numbers. The method of recurrent relations is used to obtain the convergent probability of the random walk with different initial positions.

  12. Variable Selection for Qualitative Interactions

    PubMed Central

    Gunter, L.; Zhu, J.; Murphy, S. A.

    2009-01-01

    In this article we discuss variable selection for decision making with focus on decisions regarding when to provide treatment and which treatment to provide. Current variable selection techniques were developed for use in a supervised learning setting where the goal is prediction of the response. These techniques often downplay the importance of interaction variables that have small predictive ability but that are critical when the ultimate goal is decision making rather than prediction. We propose two new techniques designed specifically to find variables that aid in decision making. Simulation results are given along with an application of the methods on data from a randomized controlled trial for the treatment of depression. PMID:21179592

  13. The Random-Threshold Generalized Unfolding Model and Its Application of Computerized Adaptive Testing

    ERIC Educational Resources Information Center

    Wang, Wen-Chung; Liu, Chen-Wei; Wu, Shiu-Lien

    2013-01-01

    The random-threshold generalized unfolding model (RTGUM) was developed by treating the thresholds in the generalized unfolding model as random effects rather than fixed effects to account for the subjective nature of the selection of categories in Likert items. The parameters of the new model can be estimated with the JAGS (Just Another Gibbs…

  14. Analysis of Quantitative Traits in Two Long-Term Randomly Mated Soybean Populations I. Genetic Variances

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The genetic effects of long term random mating and natural selection aided by genetic male sterility were evaluated in two soybean [Glycine max (L.) Merr.] populations: RSII and RSIII. Population means, variances, and heritabilities were estimated to determine the effects of 26 generations of random...

  15. Balancing Participation across Students in Large College Classes via Randomized Participation Credit

    ERIC Educational Resources Information Center

    McCleary, Daniel F.; Aspiranti, Kathleen B.; Foster, Lisa N.; Blondin, Carolyn A.; Gaylon, Charles E.; Yaw, Jared S.; Forbes, Bethany N.; Williams, Robert L.

    2011-01-01

    The study examines the effects of randomized credit on the percentage of students participating at four predefined levels. Students recorded their comments on specially designed record cards, and days were randomly selected for participation credit. This arrangement balanced participation across students while cutting instructor time for recording…

  16. Wave propagation through a random medium - The random slab problem

    NASA Technical Reports Server (NTRS)

    Acquista, C.

    1978-01-01

    The first-order smoothing approximation yields integral equations for the mean and the two-point correlation function of a wave in a random medium. A method is presented for the approximate solution of these equations that combines features of the eiconal approximation and of the Born expansion. This method is applied to the problem of reflection and transmission of a plane wave by a slab of a random medium. Both the mean wave and the covariance are calculated to determine the reflected and transmitted amplitudes and intensities.

  17. Cover times of random searches

    NASA Astrophysics Data System (ADS)

    Chupeau, Marie; Bénichou, Olivier; Voituriez, Raphaël

    2015-10-01

    How long must one undertake a random search to visit all sites of a given domain? This time, known as the cover time, is a key observable to quantify the efficiency of exhaustive searches, which require a complete exploration of an area and not only the discovery of a single target. Examples range from immune-system cells chasing pathogens to animals harvesting resources, from robotic exploration for cleaning or demining to the task of improving search algorithms. Despite its broad relevance, the cover time has remained elusive and so far explicit results have been scarce and mostly limited to regular random walks. Here we determine the full distribution of the cover time for a broad range of random search processes, including Lévy strategies, intermittent strategies, persistent random walks and random walks on complex networks, and reveal its universal features. We show that for all these examples the mean cover time can be minimized, and that the corresponding optimal strategies also minimize the mean search time for a single target, unambiguously pointing towards their robustness.
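
    For the simplest case mentioned, a regular random walk, the cover time is easy to estimate by simulation. The sketch below does so on a cycle of N sites, where the mean cover time is known to be N(N-1)/2 steps.

        import random

        def cover_time(n=50):
            """Steps a symmetric walk needs to visit every site of an n-cycle."""
            pos, visited, steps = 0, {0}, 0
            while len(visited) < n:
                pos = (pos + random.choice((-1, 1))) % n
                visited.add(pos)
                steps += 1
            return steps

        runs = [cover_time() for _ in range(200)]
        print("mean cover time ~", sum(runs) / len(runs), "(theory: 1225)")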

  18. Random root movements in weightlessness.

    PubMed

    Johnsson, A; Karlsson, C; Iversen, T H; Chapman, D K

    1996-02-01

    The dynamics of root growth was studied in weightlessness. In the absence of the gravitropic reference direction during weightlessness, root movements could be controlled by spontaneous growth processes, without any corrective growth induced by the gravitropic system. If truly random in nature, the bending behavior should follow so-called 'random walk' mathematics during weightlessness. Predictions from this hypothesis were critically tested. In a Spacelab ESA experiment, denoted RANDOM and carried out during the IML-2 Shuttle flight in July 1994, the growth of garden cress (Lepidium sativum) roots was followed by time-lapse photography at 1-h intervals. The growth pattern was recorded for about 20 h. Root growth was significantly smaller in weightlessness as compared to gravity (control) conditions. It was found that the roots performed spontaneous movements in weightlessness. The average direction of deviation of the plants consistently stayed equal to zero, despite these spontaneous movements. The average squared deviation increased linearly with time as predicted theoretically (but only for 8-10 h). Autocorrelation calculations showed that bendings of the roots, as determined from the 1-h photographs, were uncorrelated after about a 2-h interval. It is concluded that random processes play an important role in root growth. Predictions from a random walk hypothesis as to the growth dynamics could explain parts of the growth patterns recorded. This test of the hypothesis required the microgravity conditions provided by a space experiment. PMID:11541141

  19. Using Psychokinesis to Explore the Nature of Quantum Randomness

    SciTech Connect

    Burns, Jean E.

    2011-11-29

    In retrocausation different causal events can produce different successor events, yet a successor event reflecting a particular cause occurs before the causal event does. It is sometimes proposed that the successor event is determined by propagation of the causal effect backwards in time via the dynamical equations governing the events. However, because dynamical equations are time reversible, the evolution of the system is not subject to change. Therefore, the backward propagation hypothesis implies that what may have seemed to be an arbitrary selection of a causal factor was in reality predetermined.Yet quantum randomness can be used to determine the causal factor, and a quantum random event is ordinarily thought of as being arbitrarily generated. So we must ask, when quantum random events occur, are they arbitrary (subject to their probabilistic constraints) or are they predetermined?Because psychokinesis (PK) can act on quantum random events, it can be used as a probe to explore questions such as the above. It is found that if quantum random events are predetermined (aside from the action of PK), certain types of experimental design can show enhanced PK through the use of precognition. Actual experiments are examined and compared, and most of those for which the design is especially suitable for showing this effect had unusually low p values for the number of trials. It is concluded that either the experimenter produced a remarkably strong experimenter effect or quantum random events are predetermined, thereby enabling enhanced PK in suitable experimental designs.

  20. A New Random Walk for Replica Detection in WSNs

    PubMed Central

    Aalsalem, Mohammed Y.; Saad, N. M.; Hossain, Md. Shohrab; Atiquzzaman, Mohammed; Khan, Muhammad Khurram

    2016-01-01

    Wireless Sensor Networks (WSNs) are vulnerable to Node Replication attacks or Clone attacks. Among all the existing clone detection protocols in WSNs, RAWL shows the most promising results by employing Simple Random Walk (SRW). More recently, RAND outperforms RAWL by incorporating Network Division with SRW. Both RAND and RAWL have used SRW for random selection of witness nodes, which is problematic because the walk frequently revisits previously passed nodes, leading to longer delays and high energy expenditure, with a lower probability that witness nodes intersect. To circumvent this problem, we propose to employ a new kind of constrained random walk, namely Single Stage Memory Random Walk, and present a distributed technique called SSRWND (Single Stage Memory Random Walk with Network Division). In SSRWND, single stage memory random walk is combined with network division, aiming to decrease the communication and memory costs while keeping the detection probability higher. Through intensive simulations it is verified that SSRWND guarantees higher witness node security with moderate communication and memory overheads. SSRWND is expedient for security-oriented application fields of WSNs like military and medical. PMID:27409082

  1. A New Random Walk for Replica Detection in WSNs.

    PubMed

    Aalsalem, Mohammed Y; Khan, Wazir Zada; Saad, N M; Hossain, Md Shohrab; Atiquzzaman, Mohammed; Khan, Muhammad Khurram

    2016-01-01

    Wireless Sensor Networks (WSNs) are vulnerable to Node Replication attacks or Clone attacks. Among all the existing clone detection protocols in WSNs, RAWL shows the most promising results by employing Simple Random Walk (SRW). More recently, RAND outperforms RAWL by incorporating Network Division with SRW. Both RAND and RAWL have used SRW for random selection of witness nodes, which is problematic because the walk frequently revisits previously passed nodes, leading to longer delays and high energy expenditure, with a lower probability that witness nodes intersect. To circumvent this problem, we propose to employ a new kind of constrained random walk, namely Single Stage Memory Random Walk, and present a distributed technique called SSRWND (Single Stage Memory Random Walk with Network Division). In SSRWND, single stage memory random walk is combined with network division, aiming to decrease the communication and memory costs while keeping the detection probability higher. Through intensive simulations it is verified that SSRWND guarantees higher witness node security with moderate communication and memory overheads. SSRWND is expedient for security-oriented application fields of WSNs like military and medical. PMID:27409082
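
    The key ingredient, a single stage memory random walk, simply refuses to step straight back to the node it came from. The sketch below is a toy comparison on a ring with random shortcut edges, an illustrative stand-in for a WSN topology rather than the SSRWND protocol itself; the memory walk typically visits more distinct nodes in the same number of steps.

        import random

        def ring_with_shortcuts(n=100, extra=50, seed=0):
            """Ring lattice plus random shortcut edges (every degree >= 2)."""
            rng = random.Random(seed)
            adj = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}
            for _ in range(extra):
                a, b = rng.sample(range(n), 2)
                adj[a].add(b)
                adj[b].add(a)
            return adj

        def nodes_visited(adj, steps=500, memory=False):
            node, prev, seen = 0, None, {0}
            for _ in range(steps):
                # with single stage memory, never step back to the previous node
                options = [v for v in adj[node] if not (memory and v == prev)]
                prev, node = node, random.choice(options)
                seen.add(node)
            return len(seen)

        random.seed(1)
        g = ring_with_shortcuts()
        print("simple random walk visited:", nodes_visited(g))
        print("memory random walk visited:", nodes_visited(g, memory=True))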

  2. Mixing rates and limit theorems for random intermittent maps

    NASA Astrophysics Data System (ADS)

    Bahsoun, Wael; Bose, Christopher

    2016-04-01

    We study random transformations built from intermittent maps on the unit interval that share a common neutral fixed point. We focus mainly on random selections of Pomeau-Manneville-type maps Tα using the full parameter range 0 < α < ∞, in general. We derive a number of results around a common theme that illustrates in detail how the constituent map that is fastest mixing (i.e. smallest α), combined with details of the randomizing process, determines the asymptotic properties of the random transformation. Our key result (theorem 1.1) establishes sharp estimates on the position of return time intervals for the quenched dynamics. The main applications of this estimate are to limit laws (in particular, CLT and stable laws, depending on the parameters chosen in the range 0 < α < 1) for the associated skew product; these are detailed in theorem 3.2. Since our estimates in theorem 1.1 also hold for 1 ≤ α < ∞, we study a second class of random transformations derived from piecewise affine Gaspard-Wang maps, prove existence of an infinite (σ-finite) invariant measure and study the corresponding correlation asymptotics. To the best of our knowledge, this latter kind of result is completely new in the setting of random transformations.

  3. Enhanced view random access ability for multiview video coding

    NASA Astrophysics Data System (ADS)

    Elmesloul Nasri, Seif Allah; Khelil, Khaled; Doghmane, Noureddine

    2016-03-01

    Apart from the efficient compression, reducing the complexity of the view random access is one of the most important requirements that should be considered in multiview video coding. In order to obtain an efficient compression, both temporal and inter-view correlations are exploited in the multiview video coding schemes, introducing higher complexity in the temporal and view random access. We propose an inter-view prediction structure that aims to lower the cost of randomly accessing any picture at any position and instant, with respect to the multiview reference model JMVM and other recent relevant works. The proposed scheme is mainly based on the use of two base views (I-views) in the structure with selected positions instead of a single reference view as in the standard structures. This will, therefore, provide a direct inter-view prediction for all the remaining views and will ensure a low-delay view random access ability while maintaining a very competitive bit-rate performance with a similar video quality measured in peak signal-to-noise ratio. In addition to a new evaluation method of the random access ability, the obtained results show a significant improvement in the view random accessibility with respect to other reported works.

  4. Forest Fires in a Random Forest

    NASA Astrophysics Data System (ADS)

    Leuenberger, Michael; Kanevski, Mikhaïl; Vega Orozco, Carmen D.

    2013-04-01

    Forest fires in Canton Ticino (Switzerland) are very complex phenomena. Meteorological data can explain some occurrences of fires in time, but not necessarily in space. Using anthropogenic and geographical feature data with the random forest algorithm, this study tries to highlight the factors that most influence fire ignition and to identify areas at risk. The fundamental scientific problem considered in the present research is an application of random forest algorithms to the analysis and modeling of forest fire patterns in a high-dimensional input feature space. This study focuses on the 2,224 anthropogenic forest fires among the 2,401 forest fire ignition points that occurred in Canton Ticino from 1969 to 2008. Provided by the Swiss Federal Institute for Forest, Snow and Landscape Research (WSL), the database characterizes each fire by its location (x,y coordinates of the ignition point), start date, duration, burned area, and other information such as ignition cause and topographic features (slope, aspect, altitude, etc.). In addition, the database VECTOR25 from SwissTopo was used to extract the distances between fire ignition points and anthropogenic structures such as buildings, the road network, the rail network, etc. Developed by L. Breiman and A. Cutler, the Random Forests (RF) algorithm provides an ensemble of classification and regression trees. By pseudo-randomly selecting variables at each split node, this method grows a variety of decision trees that do not return the same results, and thus, by a committee system, returns a value with better accuracy than other machine learning methods. The algorithm directly incorporates a measure of variable importance, which is used to display the factors affecting forest fires. Using this parameter, several models can be fitted, and thus a prediction can be made throughout the validity domain of Canton Ticino. Comprehensive RF analysis was carried out in order to 1
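
    As context for the variable-importance measure mentioned above, the sketch below trains a random forest on synthetic fire/no-fire data and reads off the feature importances; the feature names and the generating rule are invented stand-ins for the Ticino predictors, not the study's actual dataset.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        n = 1000
        # Hypothetical predictors: distances to buildings and roads (m), altitude (m), slope (deg)
        X = np.column_stack([rng.uniform(0, 5000, n), rng.uniform(0, 3000, n),
                             rng.uniform(200, 2200, n), rng.uniform(0, 45, n)])
        # Synthetic rule: ignitions cluster near anthropogenic structures
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 500, n) < 2500).astype(int)

        rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
        for name, imp in zip(["d_buildings", "d_roads", "altitude", "slope"],
                             rf.feature_importances_):
            print(f"{name}: {imp:.3f}")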

  5. Mechanisms and consequences of widespread random monoallelic expression.

    PubMed

    Chess, Andrew

    2012-06-01

    Although random monoallelic expression has been known for decades to affect genes on the X chromosome in female placental mammals, until a few years ago it was thought that there were few autosomal genes that were regulated in this manner. New tools for assaying gene expression genome-wide are now revealing that there are perhaps more genes that are subject to random monoallelic expression on mammalian autosomes than there are on the X chromosome and that these expression properties are achieved by diverse molecular mechanisms. This mode of expression has the potential to have an impact on natural selection and on the evolution of gene families. PMID:22585065

  6. RANDOMIZED CONTROLLED CLINICAL TRIALS IN ORTHOPEDICS: DIFFICULTIES AND LIMITATIONS

    PubMed Central

    Malavolta, Eduardo Angeli; Demange, Marco Kawamura; Gobbi, Riccardo Gomes; Imamura, Marta; Fregni, Felipe

    2015-01-01

    Randomized controlled clinical trials (RCTs) are nowadays considered the gold standard of evidence-based medicine and are important for directing medical practice through consistent scientific observations. Steps such as patient selection, randomization and blinding are fundamental for conducting an RCT, but additional difficulties arise in trials that involve surgical procedures, as is common in orthopedics. The aim of this article is to highlight and discuss some difficulties and possible limitations of RCTs within the field of surgery. PMID:27027037

  7. Selective Emitters

    NASA Technical Reports Server (NTRS)

    Chubb, Donald L. (Inventor)

    1992-01-01

    This invention relates to a small particle selective emitter for converting thermal energy into narrow band radiation with high efficiency. The small particle selective emitter is used in combination with a photovoltaic array to provide a thermal to electrical energy conversion device. An energy conversion apparatus of this type is called a thermo-photovoltaic device. In the first embodiment, small diameter particles of a rare earth oxide are suspended in an inert gas enclosed between concentric cylinders. The rare earth oxides are used because they have the desired property of large emittance in a narrow wavelength band and small emittance outside the band. However, it should be emphasized that it is the smallness of the particles that enhances the radiation property. The small particle selective emitter is surrounded by a photovoltaic array. In an alternate embodiment, the small particle gas mixture is circulated through a thermal energy source. This thermal energy source can be a nuclear reactor, solar receiver, or combustor of a fossil fuel.

  8. Propagation in multiscale random media

    NASA Astrophysics Data System (ADS)

    Balk, Alexander M.

    2003-10-01

    Many studies consider media with microstructure, i.e. variations on some microscale, while the macroproperties are under investigation. Sometimes the medium has several microscales, all of them much smaller than the macroscale. Sometimes variations on the macroscale are also included, taken into account by procedures such as WKB or geometric optics. What if the medium has variations on all scales, from microscale to macroscale? This situation occurs in several practical problems. The talk is about such situations, in particular a passive tracer in a random velocity field, wave propagation in a random medium, and the Schrödinger equation with a random potential. To treat such problems we have developed the statistical near-identity transformation. We find anomalous attenuation of a pulse propagating in a multiscale medium.

  9. Relatively Random: Context Effects on Perceived Randomness and Predicted Outcomes

    ERIC Educational Resources Information Center

    Matthews, William J.

    2013-01-01

    This article concerns the effect of context on people's judgments about sequences of chance outcomes. In Experiment 1, participants judged whether sequences were produced by random, mechanical processes (such as a roulette wheel) or skilled human action (such as basketball shots). Sequences with lower alternation rates were judged more likely to…

  10. A Randomized Experiment Comparing Random and Cutoff-Based Assignment

    ERIC Educational Resources Information Center

    Shadish, William R.; Galindo, Rodolfo; Wong, Vivian C.; Steiner, Peter M.; Cook, Thomas D.

    2011-01-01

    In this article, we review past studies comparing randomized experiments to regression discontinuity designs, mostly finding similar results, but with significant exceptions. The latter might be due to potential confounds of study characteristics with assignment method or with failure to estimate the same parameter over methods. In this study, we…

  11. Nanotechnological selection.

    PubMed

    Demming, Anna

    2013-01-18

    At the nanoscale, measures can move from mass-scale analogue calibration to counters of discrete units. The shift redefines the possible levels of control that can be achieved in a system if adequate selectivity can be imposed. As an example, as ionic substances pass through nanoscale pores, the quantity of ions is low enough that the pore can contain either negative or positive ions. Yet precise control over this selectivity still raises difficulties. In this issue researchers address the challenge of how to regulate the ionic selectivity of negative and positive charges with the use of an external charge. The approach may be useful for controlling the behaviour, properties and chemical composition of liquids, and has possible technical applications for nanofluidic field effect transistors [1]. Selectivity is a critical advantage in the administration of drugs. Nanoparticles functionalized with targeting moieties can allow delivery of anti-cancer drugs to tumour cells, whilst avoiding healthy cells and hence reducing some of the debilitating side effects of cancer treatments [2]. Researchers in Belarus and the US developed a new theranostic approach (combining therapy and diagnosis) to support the evident benefits of cellular selectivity that can be achieved when nanoparticles are applied in medicine [3]. Their process uses nanobubbles of photothermal vapour, referred to as plasmonic nanobubbles, generated by plasmonic excitations in gold nanoparticles conjugated to diagnosis-specific antibodies. The intracellular plasmonic nanobubbles are controlled by laser fluence so that the response can be tuned in individual living cells. Lower fluence allows non-invasive, highly sensitive imaging for diagnosis, and higher fluence can disrupt the cellular membrane for treatment. The selective response of carbon nanotubes to different gases has led to their use within various types of sensors, as summarized in a review by researchers at the University of California

  12. Abdominal lymphadenopathy detection using random forest

    NASA Astrophysics Data System (ADS)

    Cherry, Kevin M.; Wang, Shijun; Turkbey, Evrim B.; Summers, Ronald M.

    2014-03-01

    We propose a new method for detecting abdominal lymphadenopathy by utilizing a random forest statistical classifier to create voxel-level lymph node predictions, i.e., initial detection of enlarged lymph nodes. The framework permits the combination of multiple statistical lymph node descriptors and appropriate feature selection in order to improve lesion detection beyond traditional enhancement filters. We show that Hessian blobness measurements alone are inadequate for detecting lymph nodes in the abdominal cavity. Of the features tested here, intensity proved to be the most important predictor for lymph node classification. For initial detection, candidate lesions were extracted from the 3D prediction map generated by the random forest. Statistical features describing intensity distribution, shape, and texture were calculated for each enlarged lymph node candidate. In the last step, a support vector machine (SVM) was trained and tested on the calculated features from the candidates and labels determined by two experienced radiologists. The computer-aided detection (CAD) system was tested on a dataset containing 30 patients with 119 enlarged lymph nodes. Our method achieved an AUC of 0.762±0.022 and a sensitivity of 79.8% with 15 false positives, suggesting it can aid radiologists in finding enlarged lymph nodes.

  13. Two-Stage Modelling Of Random Phenomena

    NASA Astrophysics Data System (ADS)

    Barańska, Anna

    2015-12-01

    The main objective of this publication is to present a two-stage algorithm for modelling random phenomena, based on multidimensional function modelling, using as examples the modelling of the real estate market for the purpose of real estate valuation and the estimation of model parameters of vertical displacements of foundations. The first stage of the presented algorithm is the selection of a suitable form of the function model. In classical algorithms based on function modelling, the prediction of the dependent variable is the value obtained directly from the model: the better the model reflects the relationship between the independent variables and their effect on the dependent variable, the more reliable the model value is. This paper proposes an algorithm in which the value obtained from the model is adjusted by a random correction determined from the residuals of the model for those cases which, in a separate analysis, were found to be the most similar to the object for which the dependent variable is being modelled. The effect of the developed quantitative procedures for calculating the corrections, and of the qualitative methods for assessing similarity, on the final outcome of the prediction and its accuracy was examined by statistical methods, mainly using appropriate parametric tests of significance. The idea of the presented algorithm is to bring the modelled value of the dependent variable close to its real value while having it "smoothed out" by a well-fitted modelling function.
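
    A minimal sketch of the two-stage idea, under illustrative assumptions (a least-squares linear model for stage one, Euclidean distance as the similarity measure, and the k = 5 most similar cases; none of these choices are taken from the paper):

        import numpy as np

        rng = np.random.default_rng(1)
        X = rng.uniform(0, 1, (200, 3))              # attributes of the training objects
        y = 3 * X[:, 0] + np.sin(6 * X[:, 1]) + rng.normal(0, 0.1, 200)

        # Stage 1: function model (least-squares linear model with intercept)
        A = np.column_stack([X, np.ones(len(X))])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        residuals = y - A @ coef

        def predict_two_stage(x, k=5):
            base = np.append(x, 1.0) @ coef          # value straight from the model
            dist = np.linalg.norm(X - x, axis=1)     # similarity: Euclidean distance
            correction = residuals[np.argsort(dist)[:k]].mean()
            return base + correction                 # model value + random correction

        print(predict_two_stage(np.array([0.2, 0.7, 0.5])))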

  14. [Randomized clinical trials and real clinical practice].

    PubMed

    Heerlein, Andrés

    2009-01-01

    One of the emerging problems in modern medicine is that some of its highly efficacious treatments do not show significant effectiveness in real-world systems of care. Efficacy studies address appropriate dosages, short-term response and feasibility of treatments in carefully selected populations, but they do not necessarily provide information for decisions in clinical practice. This review aims to present the strengths and limitations of different methodological types of trials and to offer an overview of how knowledge from clinical trials can be used in clinical practice. The important effect of funding source on the outcome of randomized controlled trials is discussed. Some key questions in the treatment assessment of depression, schizophrenia and different medical conditions are discussed, with a focus on the possibilities and restrictions of translating clinical trial results into real-world settings. Empirical evidence shows that although randomized controlled trials are the gold standard for proving the efficacy of a therapeutic procedure, they often suffer from funding-source bias and from lack of generalizability. Effectiveness studies evaluate the effects of treatments under conditions approximating usual care. Another key area that can be addressed by effectiveness studies is the impact on important health policy measures such as disability days, days of work or medical costs, etc. In conclusion, the future assessment of treatment regimes for clinical utility requires less biased efficacy studies and more effectiveness studies addressing major issues from all relevant perspectives. PMID:19543562

  15. Extremal paths on a random Cayley tree

    NASA Astrophysics Data System (ADS)

    Majumdar, Satya N.; Krapivsky, P. L.

    2000-12-01

    We investigate the statistics of extremal paths (both the shortest and the longest) from the root to the bottom of a Cayley tree. The lengths of the edges are assumed to be independent identically distributed random variables drawn from a distribution ρ(l); besides, the number of branches from any node is also random. Exact results are derived for arbitrary distribution ρ(l). In particular, for the binary {0,1} distribution ρ(l) = p δ_{l,1} + (1-p) δ_{l,0}, we show that as p increases, the minimal length undergoes an unbinding transition from a "localized" phase to a "moving" phase at the critical value p = p_c = 1 - b^{-1}, where b is the average branch number of the tree. As the height n of the tree increases, the minimal length saturates to a finite constant in the localized phase (p < p_c) but grows linearly as v_min(p) n in the moving phase (p > p_c), where the velocity v_min(p) is determined via a front selection mechanism. At p = p_c, the minimal length grows with n in an extremely slow double-logarithmic fashion. The length of the maximal path, on the other hand, increases linearly as v_max(p) n for all p. The maximal and minimal velocities satisfy a general duality relation, v_min(p) + v_max(1-p) = 1, which is also valid for directed paths on finite-dimensional lattices.
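
    For readability, the key quantities quoted above can be restated in display form:

        \rho(l) = p\,\delta_{l,1} + (1-p)\,\delta_{l,0},
        \qquad p_c = 1 - \frac{1}{b},
        \qquad v_{\min}(p) + v_{\max}(1-p) = 1 .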

  16. EDITORIAL: Nanotechnological selection

    NASA Astrophysics Data System (ADS)

    Demming, Anna

    2013-01-01

    At the nanoscale, measures can move from mass-scale analogue calibration to counters of discrete units. The shift redefines the possible levels of control that can be achieved in a system if adequate selectivity can be imposed. As an example, as ionic substances pass through nanoscale pores, the quantity of ions is low enough that the pore can contain either negative or positive ions. Yet precise control over this selectivity still raises difficulties. In this issue researchers address the challenge of how to regulate the ionic selectivity of negative and positive charges with the use of an external charge. The approach may be useful for controlling the behaviour, properties and chemical composition of liquids, and has possible technical applications for nanofluidic field effect transistors [1]. Selectivity is a critical advantage in the administration of drugs. Nanoparticles functionalized with targeting moieties can allow delivery of anti-cancer drugs to tumour cells, whilst avoiding healthy cells and hence reducing some of the debilitating side effects of cancer treatments [2]. Researchers in Belarus and the US developed a new theranostic approach (combining therapy and diagnosis) to support the evident benefits of cellular selectivity that can be achieved when nanoparticles are applied in medicine [3]. Their process uses nanobubbles of photothermal vapour, referred to as plasmonic nanobubbles, generated by plasmonic excitations in gold nanoparticles conjugated to diagnosis-specific antibodies. The intracellular plasmonic nanobubbles are controlled by laser fluence so that the response can be tuned in individual living cells. Lower fluence allows non-invasive, highly sensitive imaging for diagnosis, and higher fluence can disrupt the cellular membrane for treatment. The selective response of carbon nanotubes to different gases has led to their use within various types of sensors, as summarized in a review by researchers at the University of

  17. Random walks on the mental number line.

    PubMed

    Shaki, Samuel; Fischer, Martin H

    2014-01-01

    The direction of influence between conceptual and motor activation, and its relevance for real-life activities, is still unclear. Here, we use the frequently reported association between small/large numbers and left/right space to investigate this issue during walking. We asked healthy adults to generate random numbers as they made lateral turns and found that (1) lateral turn decisions are predicted by the last few numbers generated prior to turning; (2) the intention to turn left/right makes small/large numbers more accessible; and (3) magnitude but not order of auditorily presented numbers influences the listener's turn selection. Our findings document a bidirectional influence between conceptual and motor activation and point to a hierarchically organized conceptual-motor activation. PMID:24091774

  18. Molecular random tilings as glasses

    PubMed Central

    Garrahan, Juan P.; Stannard, Andrew; Blunt, Matthew O.; Beton, Peter H.

    2009-01-01

    We have recently shown that p-terphenyl-3,5,3′,5′-tetracarboxylic acid adsorbed on graphite self-assembles into a two-dimensional rhombus random tiling. This tiling is close to ideal, displaying long-range correlations punctuated by sparse localized tiling defects. In this article we explore the analogy between dynamic arrest in this type of random tilings and that of structural glasses. We show that the structural relaxation of these systems is via the propagation–reaction of tiling defects, giving rise to dynamic heterogeneity. We study the scaling properties of the dynamics and discuss connections with kinetically constrained models of glasses. PMID:19720990

  19. Mode statistics in random lasers

    SciTech Connect

    Zaitsev, Oleg

    2006-12-15

    Representing an ensemble of random lasers with an ensemble of random matrices, we compute the average number of lasing modes and its fluctuations. The regimes of weak and strong coupling of the passive resonator to the environment are considered. In the latter case, contrary to an earlier claim in the literature, we do not find a power-law dependence of the average mode number on the pump strength. For the relative fluctuations, however, a power law can be established. It is shown that, due to the mode competition, the distribution of the number of excited modes over an ensemble of lasers is not binomial.

  20. Neutron transport in random media

    SciTech Connect

    Makai, M.

    1996-08-01

    The survey reviews the methods available in the literature which allow a discussion of corium recriticality after a severe accident and a characterization of the corium. It appears that to date no one has considered the eigenvalue problem, though for the source problem several approaches have been proposed. The mathematical formulation of a random medium may be approached in different ways. Based on the review of the literature, we can draw three basic conclusions. The problem of static, random perturbations has been solved. The static case is tractable by the Monte Carlo method. There is a specific time dependent case for which the average flux is given as a series expansion.

  1. Synchronizability of random rectangular graphs

    SciTech Connect

    Estrada, Ernesto; Chen, Guanrong

    2015-08-15

    Random rectangular graphs (RRGs) represent a generalization of random geometric graphs in which the nodes are embedded into hyperrectangles instead of hypercubes. The synchronizability of the RRG model is studied. Both upper and lower bounds for the eigenratio of the network Laplacian matrix are determined analytically. It is proven that the more elongated the rectangular network, the harder the network is to synchronize. The synchronization behavior of an RRG network of chaotic Lorenz system nodes is investigated numerically, showing complete consistency with the theoretical results.
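
    The elongation effect can be probed numerically: embed N nodes uniformly in a rectangle of fixed area, connect pairs closer than a radius r, and compute the Laplacian eigenratio λ_N/λ_2 (larger means harder to synchronize). The parameters below are illustrative, not those of the paper.

        import numpy as np

        def eigenratio(a, n=300, r=0.15, seed=0):
            """Laplacian eigenratio of a random rectangular graph on [0, a] x [0, 1/a]."""
            rng = np.random.default_rng(seed)
            pts = rng.uniform(0, 1, (n, 2)) * np.array([a, 1 / a])
            d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
            A = ((d < r) & (d > 0)).astype(float)    # connect nodes closer than r
            L = np.diag(A.sum(axis=1)) - A
            lam = np.sort(np.linalg.eigvalsh(L))
            return np.inf if lam[1] < 1e-10 else lam[-1] / lam[1]

        for a in (1.0, 1.5, 2.0):                    # increasingly elongated rectangles
            print(a, round(eigenratio(a), 1))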

  2. Random organization and plastic depinning

    SciTech Connect

    Reichhardt, Charles; Reichhardt, Cynthia

    2008-01-01

    We provide evidence that the general phenomenon of plastic depinning can be described as an absorbing phase transition, and shows the same features as the random organization which was recently studied in periodically driven particle systems [L. Corte, Nature Phys. 4, 420 (2008)]. In the plastic flow system, the pinned regime corresponds to the absorbing state and the moving state corresponds to the fluctuating state. When an external force is suddenly applied, the system eventually organizes into one of these two states with a time scale that diverges as a power law at a nonequilibrium transition. We propose a simple experiment to test for this transition in systems with random disorder.

  3. Random sequential adsorption of tetramers

    NASA Astrophysics Data System (ADS)

    Cieśla, Michał

    2013-07-01

    Adsorption of a tetramer built of four identical spheres was studied numerically using the random sequential adsorption (RSA) algorithm. Tetramers were adsorbed on a two-dimensional, flat and homogeneous surface. Two different models of the adsorbate were investigated, a rhomboid one and a square one: monomer centres were placed on the vertices of rhomboids and squares, respectively. Numerical simulations allow us to establish the maximal random coverage ratio as well as the available surface function (ASF), which is crucial for determining the kinetics of the adsorption process. These results were compared with data obtained experimentally for KfrA plasmid adsorption. Additionally, the density autocorrelation function was measured.
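
    The RSA kinetics can be illustrated with a simplified adsorbate (single discs rather than four-sphere tetramers): propose uniformly random positions, accept only those that do not overlap previously adsorbed particles, and track the coverage. Disc radius and attempt budget are illustrative choices.

        import numpy as np

        def rsa_discs(radius=0.05, box=1.0, attempts=20_000, seed=0):
            """Random sequential adsorption of equal discs on a square surface."""
            rng = np.random.default_rng(seed)
            centres = np.empty((0, 2))
            for _ in range(attempts):
                p = rng.uniform(radius, box - radius, 2)   # propose a random position
                if len(centres) == 0 or np.min(np.linalg.norm(centres - p, axis=1)) >= 2 * radius:
                    centres = np.vstack([centres, p])      # accept: no overlap
            coverage = len(centres) * np.pi * radius**2 / box**2
            return coverage, len(centres)

        cov, n = rsa_discs()
        print(f"{n} discs adsorbed, coverage = {cov:.3f}")  # tends towards the jamming limit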

  4. 47 CFR 1.1604 - Post-selection hearings.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Post-selection hearings. 1.1604 Section 1.1604 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1604 Post-selection hearings. (a) Following the...

  5. Bridging Emergent Attributes and Darwinian Principles in Teaching Natural Selection

    ERIC Educational Resources Information Center

    Xu, Dongchen; Chi, Michelene T. H.

    2016-01-01

    Students often have misconceptions about natural selection as they misuse a direct causal schema to explain the process. Natural selection is in fact an emergent process where random interactions lead to changes in a population. The misconceptions stem from students' lack of emergent schema for natural selection. In order to help students…

  6. 47 CFR 1.1604 - Post-selection hearings.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Post-selection hearings. 1.1604 Section 1.1604 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1604 Post-selection hearings. (a) Following the...

  7. A Mixed Effects Randomized Item Response Model

    ERIC Educational Resources Information Center

    Fox, J.-P.; Wyrick, Cheryl

    2008-01-01

    The randomized response technique ensures that individual item responses, denoted as true item responses, are randomized before observing them and so-called randomized item responses are observed. A relationship is specified between randomized item response data and true item response data. True item response data are modeled with a (non)linear…

  8. Are quasar redshifts randomly distributed

    NASA Technical Reports Server (NTRS)

    Weymann, R. J.; Boroson, T.; Scargle, J. D.

    1978-01-01

    A statistical analysis of possible clumping (not periodicity) of emission line redshifts of QSO's shows the available data to be compatible with random fluctuations of a smooth, non-clumped distribution. This result is demonstrated with Monte Carlo simulations as well as with the Kolmogorov-Smirnov test. It is in complete disagreement with the analysis by Varshni, which is shown to be incorrect.
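
    The kind of clumping test used in this record can be mimicked with a Kolmogorov-Smirnov comparison of a redshift sample against a smooth reference distribution; the uniform reference and synthetic sample below are illustrative stand-ins, not the authors' emission-line catalogue.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        z = rng.uniform(0.1, 2.5, 150)               # synthetic QSO redshift sample

        # KS test of the sample against the smooth (here uniform) reference distribution
        stat, pvalue = stats.kstest(z, stats.uniform(loc=0.1, scale=2.4).cdf)
        print(f"KS statistic = {stat:.3f}, p = {pvalue:.3f}")  # large p: no evidence of clumping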

  9. Randomized Item Response Theory Models

    ERIC Educational Resources Information Center

    Fox, Jean-Paul

    2005-01-01

    The randomized response (RR) technique is often used to obtain answers on sensitive questions. A new method is developed to measure latent variables using the RR technique because direct questioning leads to biased results. Within the RR technique is the probability of the true response modeled by an item response theory (IRT) model. The RR…

  10. Common Randomness Principles of Secrecy

    ERIC Educational Resources Information Center

    Tyagi, Himanshu

    2013-01-01

    This dissertation concerns the secure processing of distributed data by multiple terminals, using interactive public communication among themselves, in order to accomplish a given computational task. In the setting of a probabilistic multiterminal source model in which several terminals observe correlated random signals, we analyze secure…

  11. Entropy of random entangling surfaces

    NASA Astrophysics Data System (ADS)

    Solodukhin, Sergey N.

    2012-09-01

    We consider the situation when a globally defined four-dimensional field system is separated into two entangled subsystems by a dynamical (random) two-dimensional surface. The reduced density matrix averaged over an ensemble of random surfaces of fixed area and the corresponding average entropy are introduced. The average entanglement entropy is analyzed for a generic conformal field theory in four dimensions. Two important particular cases are considered. In the first, both the intrinsic metric on the entangling surface and the spacetime metric fluctuate. An important example of this type is when the entangling surface is a black hole horizon, whose fluctuations necessarily cause fluctuations in the spacetime geometry. In the second case, the spacetime is considered to be fixed. The detailed analysis is carried out for random entangling surfaces embedded in flat Minkowski spacetime. In all cases, the problem reduces to an effectively two-dimensional problem of random surfaces which can be treated by means of the well-known conformal methods. Focusing on the logarithmic terms in the entropy, we predict the appearance of a new ln ln(A) term. This article is part of a special issue of Journal of Physics A: Mathematical and Theoretical in honour of Stuart Dowker's 75th birthday devoted to ‘Applications of zeta functions and other spectral functions in mathematics and physics’.

  12. Garnet Random-Access Memory

    NASA Technical Reports Server (NTRS)

    Katti, Romney R.

    1995-01-01

    Random-access memory (RAM) devices of proposed type exploit magneto-optical properties of magnetic garnets exhibiting perpendicular anisotropy. Magnetic writing and optical readout used. Provides nonvolatile storage and resists damage by ionizing radiation. Because of basic architecture and pinout requirements, most likely useful as small-capacity memory devices.

  13. Undecidability Theorem and Quantum Randomness

    NASA Astrophysics Data System (ADS)

    Berezin, Alexander A.

    2005-04-01

    As scientific folklore has it, Kurt Gödel was once annoyed by the question of whether he saw any link between his Undecidability Theorem (UT) and the Uncertainty Relationship. His reaction, however, may indicate that he felt such a hidden link could indeed exist but was unable to formulate it clearly. The informational version of UT (G. J. Chaitin) states the impossibility of ruling out the algorithmic compressibility of an arbitrary digital string; thus, (mathematical) randomness can only be disproven, not proven. Going from mathematical to physical (mainly quantum) randomness, we encounter seemingly random acts of radioactive decay of isotopes (such as C14), emission from excited atoms, tunneling effects, etc. However, our notion of quantum randomness (QR) may well hit a similarly formidable wall in a physical version of UT, leading to seemingly bizarre ideas such as the Everett many-worlds model (D. Deutsch) or backward causation (J. A. Wheeler). A resolution may lie in admitting some form of Aristotelian final causation (AFC) as an ultimate foundational principle (G. W. Leibniz) connecting purely mathematical (Platonic) grounding aspects with its physically observable consequences, such as the plethora of QR effects. Thus, what we interpret as QR may eventually be a manifestation of AFC, with UT serving as the delivery vehicle. Another example of the UT/QR/AFC connection is the question of the identity (indistinguishability) of elementary particles (are all electrons exactly the same, or just approximately so to a very high degree?).

  14. Plated wire random access memories

    NASA Technical Reports Server (NTRS)

    Gouldin, L. D.

    1975-01-01

    A program was conducted to construct 4096-word by 18-bit random-access, NDRO plated-wire memory units. The memory units were subjected to comprehensive functional and environmental tests at the end-item level to verify conformance with the specified requirements. A technical description of the unit is given, along with acceptance test data sheets.

  15. Models of random graph hierarchies

    NASA Astrophysics Data System (ADS)

    Paluch, Robert; Suchecki, Krzysztof; Hołyst, Janusz A.

    2015-10-01

    We introduce two models of inclusion hierarchies: random graph hierarchy (RGH) and limited random graph hierarchy (LRGH). In both models a set of nodes at a given hierarchy level is connected randomly, as in the Erdős-Rényi random graph, with a fixed average degree equal to a system parameter c. Clusters of the resulting network are treated as nodes at the next hierarchy level and they are connected again at this level, and so on, until the process cannot continue. In the RGH model we use all clusters, including those of size 1, when building the next hierarchy level, while in the LRGH model clusters of size 1 stop participating in further steps. We find that in both models the number of nodes at a given hierarchy level h decreases approximately exponentially with h. The height of the hierarchy H, i.e. the number of all hierarchy levels, increases logarithmically with the system size N, i.e. with the number of nodes at the first level. The height H decreases monotonically with the connectivity parameter c in the RGH model and reaches a maximum for a certain c_max in the LRGH model. The distribution of separate cluster sizes in the LRGH model is a power law with an exponent of about −1.25. The above results follow from approximate analytical calculations and have been confirmed by numerical simulations.
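
    A sketch of the RGH construction, assuming networkx for the component step (parameters illustrative): connect the current level's nodes as an Erdős-Rényi graph with mean degree c, collapse connected components into the nodes of the next level, and repeat.

        import networkx as nx

        def random_graph_hierarchy(n, c, seed=0):
            """Number of nodes at each level of an RGH with average degree c."""
            levels = [n]
            level = 0
            while n > 1:
                p = min(c / (n - 1), 1.0)            # ER edge probability for mean degree c
                g = nx.gnp_random_graph(n, p, seed=seed + level)
                clusters = list(nx.connected_components(g))
                if len(clusters) == n:               # nothing merged: process cannot continue
                    break
                n = len(clusters)                    # clusters become the next level's nodes
                levels.append(n)
                level += 1
            return levels

        print(random_graph_hierarchy(10_000, c=1.2)) # roughly exponential decay with level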

  16. Universality in random quantum networks

    NASA Astrophysics Data System (ADS)

    Novotný, Jaroslav; Alber, Gernot; Jex, Igor

    2015-12-01

    Networks constitute efficient tools for assessing universal features of complex systems. In physical contexts, classical as well as quantum networks are used to describe a wide range of phenomena, such as phase transitions, intricate aspects of many-body quantum systems, or even characteristic features of a future quantum internet. Random quantum networks and their associated directed graphs are employed for capturing statistically dominant features of complex quantum systems. Here, we develop an efficient iterative method capable of evaluating the probability of a graph being strongly connected. It is proven that random directed graphs with constant edge-establishing probability are typically strongly connected, i.e., any ordered pair of vertices is connected by a directed path. This typical topological property of directed random graphs is exploited to demonstrate universal features of the asymptotic evolution of large random qubit networks. These results are independent of our knowledge of the details of the network topology. These findings suggest that other highly complex networks, such as a future quantum internet, may also exhibit similar universal properties.

  17. Entanglement generation of nearly random operators.

    PubMed

    Weinstein, Yaakov S; Hellberg, C Stephen

    2005-07-15

    We study the entanglement generation of operators whose statistical properties approach those of random matrices but are restricted in some way. These include interpolating ensemble matrices, where the intervals of the independent random parameters are restricted; pseudorandom operators, where there are far fewer random parameters than required for random matrices; and quantum chaotic evolution. Restricting randomness in different ways allows us to probe connections between entanglement and randomness. We comment on which properties affect entanglement generation and discuss ways of efficiently producing random states on a quantum computer. PMID:16090726

  18. Random trinomial tree models and vanilla options

    NASA Astrophysics Data System (ADS)

    Ganikhodjaev, Nasir; Bayram, Kamola

    2013-09-01

    In this paper we introduce and study a random trinomial model. The usual trinomial model is prescribed by a triple of numbers (u, d, m); we call the triple (u, d, m) the environment of the trinomial model. A triple (Un, Dn, Mn), where {Un}, {Dn} and {Mn} are sequences of independent, identically distributed random variables with 0 < Dn < 1 < Un and Mn = 1 for all n, is called a random environment, and a trinomial tree model with a random environment is called a random trinomial model. The random trinomial model is considered to produce more accurate results than the random binomial model or the usual trinomial model.
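
    A Monte Carlo sketch of pricing a vanilla call under a random environment: at every step the factors (U_n, D_n) are drawn i.i.d. with 0 < D_n < 1 < U_n and M_n = 1, and one-step risk-neutral probabilities are obtained by fixing the middle probability and imposing the martingale condition. The distributions, the rate, the fixed middle probability and the payoff are all illustrative assumptions, not the paper's specification.

        import numpy as np

        def random_trinomial_call(S0=100.0, K=100.0, r=0.01, steps=50, paths=20_000, seed=3):
            """Monte Carlo call price on a trinomial tree with a random environment."""
            rng = np.random.default_rng(seed)
            growth = np.exp(r / steps)                   # one-step risk-free growth
            S = np.full(paths, S0)
            for _ in range(steps):
                u = rng.uniform(1.01, 1.05, paths)       # random up factors,   U_n > 1
                d = rng.uniform(0.95, 0.99, paths)       # random down factors, D_n < 1
                pm = 1.0 / 3.0                           # fix the middle probability (M_n = 1)
                pu = (growth - pm - (1 - pm) * d) / (u - d)   # martingale condition
                z = rng.uniform(0, 1, paths)
                S *= np.where(z < pu, u, np.where(z < pu + pm, 1.0, d))
            return np.exp(-r) * np.maximum(S - K, 0).mean()

        print(random_trinomial_call())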

  19. Random density matrices versus random evolution of open system

    NASA Astrophysics Data System (ADS)

    Pineda, Carlos; Seligman, Thomas H.

    2015-10-01

    We present and compare two families of ensembles of random density matrices. The first, static, ensemble is obtained by foliating an unbiased ensemble of density matrices; as criterion we use fixed purity, the simplest example of a useful convex function. The second, dynamic, ensemble is inspired by random matrix models for decoherence, in which one evolves a separable pure state with a random Hamiltonian until a given value of purity in the central system is achieved. Several families of Hamiltonians, adequate for different physical situations, are studied. We focus on a two-qubit central system and obtain exact expressions for the static case. The ensemble displays a peak around Werner-like states, modulated by nodes on the degeneracies of the density matrices. For moderate and strong interactions, good agreement between the static and the dynamic ensembles is found. Even in a model where one qubit does not interact with the environment, excellent agreement is found, but only if there is maximal entanglement with the interacting one. The discussion begins by recalling similar considerations for scattering theory. At the end, we comment on the reach of the results for other convex functions of the density matrix, and exemplify the situation with the von Neumann entropy.

  20. Selective monitoring

    NASA Astrophysics Data System (ADS)

    Homem-de-Mello, Luiz S.

    1992-04-01

    While in NASA's earlier space missions, such as Voyager, the number of sensors was in the hundreds, future platforms such as Space Station Freedom will have tens of thousands of sensors. For these planned missions it will be impossible to use the comprehensive monitoring strategy of the past, in which human operators monitored all sensors all the time. A selective monitoring strategy must replace the current comprehensive strategy. This selective strategy uses computer tools to preprocess the incoming data and direct the operators' attention to the most critical parts of the physical system at any given time. There are several techniques that can be used to preprocess the incoming information. This paper presents an approach that uses diagnostic reasoning techniques to preprocess the sensor data and detect which parts of the physical system require more attention because components have failed or are most likely to have failed. Given the sensor readings and a model of the physical system, a number of assertions are generated and expressed as Boolean equations. The resulting system of Boolean equations is solved symbolically. Using a priori probabilities of component failure and Bayes' rule, revised probabilities of failure can be computed; these indicate which components have failed or are the most likely to have failed. This approach is suitable for systems that are well understood and for which the correctness of the assertions can be guaranteed. The system must also be such that assertions can be made from instantaneous measurements, and that changes are slow enough to allow the computation.
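
    A toy sketch of the Bayes step described above: given an a priori failure probability for each component and the detection characteristics of a Boolean assertion, the revised (posterior) failure probability follows from Bayes' rule. All numbers are invented for illustration.

        def posterior_failure(prior, p_violation_given_fail=0.95, p_violation_given_ok=0.02):
            """P(component failed | assertion violated), by Bayes' rule."""
            num = p_violation_given_fail * prior
            return num / (num + p_violation_given_ok * (1 - prior))

        # A priori failure probabilities for three hypothetical components
        for name, prior in [("pump", 0.001), ("valve", 0.01), ("sensor", 0.05)]:
            print(name, round(posterior_failure(prior), 4))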

  1. Accelerated Mini-batch Randomized Block Coordinate Descent Method

    PubMed Central

    Zhao, Tuo; Yu, Mo; Wang, Yiming; Arora, Raman; Liu, Han

    2014-01-01

    We consider regularized empirical risk minimization problems. In particular, we minimize the sum of a smooth empirical risk function and a nonsmooth regularization function. When the regularization function is block separable, we can solve the minimization problem in a randomized block coordinate descent (RBCD) manner. Existing RBCD methods usually decrease the objective value by exploiting the partial gradient of a randomly selected block of coordinates in each iteration. Thus they need all data to be accessible so that the partial gradient of the selected block can be obtained exactly. However, such a “batch” setting may be computationally expensive in practice. In this paper, we propose a mini-batch randomized block coordinate descent (MRBCD) method, which estimates the partial gradient of the selected block based on a mini-batch of randomly sampled data in each iteration. We further accelerate the MRBCD method by exploiting a semi-stochastic optimization scheme, which effectively reduces the variance of the partial gradient estimators. Theoretically, we show that for strongly convex functions, the MRBCD method attains lower overall iteration complexity than existing RBCD methods. As an application, we further tailor the MRBCD method to solve regularized sparse learning problems. Our numerical experiments show that the MRBCD method naturally exploits the sparsity structure and achieves better computational performance than existing methods. PMID:25620860
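
    A compact sketch of the MRBCD idea for an l1-regularized least-squares problem: at each iteration, pick a random block of coordinates, estimate its partial gradient from a random mini-batch of samples, and apply a proximal (soft-thresholding) update. Block size, batch size, step size and the omission of the paper's semi-stochastic variance-reduction scheme are all simplifications.

        import numpy as np

        rng = np.random.default_rng(4)
        n, d, block, batch, lam, step = 500, 40, 8, 32, 0.05, 0.01
        A = rng.normal(size=(n, d))
        x_true = np.zeros(d)
        x_true[:5] = rng.normal(size=5)                         # sparse ground truth
        b = A @ x_true + 0.01 * rng.normal(size=n)

        def soft_threshold(v, t):
            return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

        x = np.zeros(d)
        blocks = np.array_split(rng.permutation(d), d // block)
        for _ in range(20_000):
            blk = blocks[rng.integers(len(blocks))]             # random coordinate block
            idx = rng.integers(0, n, batch)                     # random mini-batch
            g = A[idx][:, blk].T @ (A[idx] @ x - b[idx]) / batch    # partial gradient estimate
            x[blk] = soft_threshold(x[blk] - step * g, step * lam)  # proximal update
        print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))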

  2. Randomized interpolative decomposition of separated representations

    NASA Astrophysics Data System (ADS)

    Biagioni, David J.; Beylkin, Daniel; Beylkin, Gregory

    2015-01-01

    We introduce an algorithm to compute tensor interpolative decomposition (dubbed CTD-ID) for the reduction of the separation rank of Canonical Tensor Decompositions (CTDs). Tensor ID selects, for a user-defined accuracy ɛ, a near optimal subset of terms of a CTD to represent the remaining terms via a linear combination of the selected terms. CTD-ID can be used as an alternative to or in combination with the Alternating Least Squares (ALS) algorithm. We present examples of its use within a convergent iteration to compute inverse operators in high dimensions. We also briefly discuss the spectral norm as a computational alternative to the Frobenius norm in estimating approximation errors of tensor ID. We reduce the problem of finding tensor IDs to that of constructing interpolative decompositions of certain matrices. These matrices are generated via randomized projection of the terms of the given tensor. We provide cost estimates and several examples of the new approach to the reduction of separation rank.

  3. 47 CFR 1.822 - General selection procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ....822 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Complaints... service. Following the random selection, the Commission shall determine whether the applicant is qualified... pleadings properly filed against it, the Commission determines that a substantial and material question...

  4. 47 CFR 1.822 - General selection procedures.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ....822 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Complaints... service. Following the random selection, the Commission shall determine whether the applicant is qualified... pleadings properly filed against it, the Commission determines that a substantial and material question...

  5. Participant Informed Consent in Cluster Randomized Trials: Review

    PubMed Central

    Giraudeau, Bruno; Caille, Agnès; Le Gouge, Amélie; Ravaud, Philippe

    2012-01-01

    Background The Nuremberg code defines the general ethical framework of medical research with participant consent as its cornerstone. In cluster randomized trials (CRT), obtaining participant informed consent raises logistic and methodologic concerns. First, with randomization of large clusters such as geographical areas, obtaining individual informed consent may be impossible. Second, participants in randomized clusters cannot avoid certain interventions, which implies that participant informed consent refers only to data collection, not administration of an intervention. Third, complete participant information may be a source of selection bias, which then raises methodological concerns. We assessed whether participant informed consent was required in such trials, which type of consent was required, and whether the trial was at risk of selection bias because of the very nature of participant information. Methods and Findings We systematically reviewed all reports of CRT published in MEDLINE in 2008 and surveyed corresponding authors regarding the nature of the informed consent and the process of participant inclusion. We identified 173 reports and obtained an answer from 113 authors (65.3%). In total, 23.7% of the reports lacked information on ethics committee approval or participant consent, 53.1% of authors declared that participant consent was for data collection only and 58.5% that the group allocation was not specified for participants. The process of recruitment (chronology of participant recruitment with regard to cluster randomization) was rarely reported, and we estimated that only 56.6% of the trials were free of potential selection bias. Conclusions For CRTs, the reporting of ethics committee approval and participant informed consent is less than optimal. Reports should describe whether participants consented for administration of an intervention and/or data collection. Finally, the process of participant recruitment should be fully described (namely

  6. All optical mode controllable Er-doped random fiber laser with distributed Bragg gratings.

    PubMed

    Zhang, W L; Ma, R; Tang, C H; Rao, Y J; Zeng, X P; Yang, Z J; Wang, Z N; Gong, Y; Wang, Y S

    2015-07-01

    An all-optical method to control the lasing modes of Er-doped random fiber lasers (RFLs) is proposed and demonstrated. In the RFL, an Er-doped fiber (EDF) recorded with randomly separated fiber Bragg gratings (FBGs) is used as the gain medium and as the randomly distributed reflectors, as well as the controllable element. By combining the random feedback of the FBG array with the Fresnel feedback of a cleaved fiber end, multi-mode coherent random lasing is obtained with a threshold of 14 mW and a power efficiency of 14.4%. Moreover, a laterally injected control light is used to induce local gain perturbation, providing additional gain for certain random resonance modes. As a result, active mode selection of the RFL is realized by changing the locations of the laser cavity that are exposed to the control light. PMID:26125397

  7. In vitro selection of catalytic RNAs

    NASA Technical Reports Server (NTRS)

    Chapman, K. B.; Szostak, J. W.

    1994-01-01

    In vitro selection techniques are poised to allow a rapid expansion of the study of catalysis by RNA enzymes (ribozymes). This truly molecular version of genetics has already been applied to the study of the structures of known ribozymes and to the tailoring of their catalytic activity to meet specific requirements of substrate specificity or reaction conditions. During the past year, in vitro selection has been successfully used to isolate novel RNA catalysts from random sequence pools.

  8. Optimal randomized scheduling by replacement

    SciTech Connect

    Saias, I.

    1996-05-01

    In the replacement scheduling problem, a system is composed of n processors drawn from a pool of p. The processors can become faulty while in operation and faulty processors never recover. A report is issued whenever a fault occurs. This report states only the existence of a fault but does not indicate its location. Based on this report, the scheduler can reconfigure the system and choose another set of n processors. The system operates satisfactorily as long as, upon report of a fault, the scheduler chooses n non-faulty processors. We provide a randomized protocol maximizing the expected number of faults the system can sustain before the occurrence of a crash. The optimality of the protocol is established by considering a closely related dual optimization problem. The game-theoretic technical difficulties that we solve in this paper are very general and encountered whenever proving the optimality of a randomized algorithm in parallel and distributed computation.

  9. Random errors in egocentric networks.

    PubMed

    Almquist, Zack W

    2012-10-01

    The systematic errors that are induced by a combination of human memory limitations and common survey design and implementation have long been studied in the context of egocentric networks. Despite this, little if any work exists in the area of random error analysis on these same networks; this paper offers a perspective on the effects of random errors on egonet analysis, as well as the effects of using egonet measures as independent predictors in linear models. We explore the effects of false-positive and false-negative error in egocentric networks on both standard network measures and on linear models through simulation analysis on a ground truth egocentric network sample based on facebook-friendships. Results show that 5-20% error rates, which are consistent with error rates known to occur in ego network data, can cause serious misestimation of network properties and regression parameters. PMID:23878412

  10. Random errors in egocentric networks

    PubMed Central

    Almquist, Zack W.

    2013-01-01

    The systematic errors that are induced by a combination of human memory limitations and common survey design and implementation have long been studied in the context of egocentric networks. Despite this, little if any work exists in the area of random error analysis on these same networks; this paper offers a perspective on the effects of random errors on egonet analysis, as well as the effects of using egonet measures as independent predictors in linear models. We explore the effects of false-positive and false-negative error in egocentric networks on both standard network measures and on linear models through simulation analysis on a ground truth egocentric network sample based on facebook-friendships. Results show that 5–20% error rates, which are consistent with error rates known to occur in ego network data, can cause serious misestimation of network properties and regression parameters. PMID:23878412

  11. Percolation on correlated random networks

    NASA Astrophysics Data System (ADS)

    Agliari, E.; Cioli, C.; Guadagnini, E.

    2011-09-01

    We consider a class of random, weighted networks, obtained through a redefinition of patterns in a Hopfield-like model, and, by performing percolation processes, we obtain information about the topology and resilience properties of the networks themselves. Given the weighted nature of the graphs, different kinds of bond percolation can be studied: stochastic (deleting links randomly) and deterministic (deleting links based on rank weights), each mimicking a different physical process. The evolution of the network is accordingly different, as evidenced by the behavior of the largest component size and of the distribution of cluster sizes. In particular, we find that weak ties are crucial to keeping the graph connected and that, when they are the most prone to failure, the giant component typically shrinks without abruptly breaking apart; these results have recently been evidenced in several kinds of social networks.

  12. Random modelling of contagious diseases.

    PubMed

    Demongeot, J; Hansen, O; Hessami, H; Jannot, A S; Mintsa, J; Rachdi, M; Taramasco, C

    2013-03-01

    Modelling contagious diseases needs to include mechanistic knowledge, as specific as possible, about the contacts between hosts and pathogens, e.g., by incorporating into the model information about the social networks through which the disease spreads. The unknown part of the contact mechanism can be modelled using a stochastic approach. For that purpose, we revisit SIR models, first by introducing a microscopic stochastic version of the contacts between individuals of the different populations (namely Susceptible, Infective and Recovering), then by adding a random perturbation in the vicinity of the endemic fixed point of the SIR model, and eventually by introducing definitions of various types of random social networks. As an example of application to contagious diseases we consider HIV, and we show that a micro-simulation of individual-based modelling (IBM) type can reproduce the current stable incidence of the HIV epidemic in a population of HIV-positive men who have sex with men (MSM). PMID:23525763
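
    A minimal stochastic micro-simulation in the spirit of the IBM approach described above: individuals move from S to I through random contacts and from I to R at a constant rate. Population size and rates are illustrative, not calibrated to any epidemic.

        import numpy as np

        rng = np.random.default_rng(5)
        N, beta, gamma, steps = 1000, 0.3, 0.1, 200
        state = np.zeros(N, dtype=int)                   # 0 = S, 1 = I, 2 = R
        state[rng.choice(N, 10, replace=False)] = 1      # seed ten infectives

        peak = 0
        for _ in range(steps):
            s, i = state == 0, state == 1
            p_inf = 1 - np.exp(-beta * i.sum() / N)      # per-susceptible infection probability
            new_inf = s & (rng.uniform(size=N) < p_inf)  # random contacts
            new_rec = i & (rng.uniform(size=N) < gamma)  # random recoveries
            state[new_inf] = 1
            state[new_rec] = 2
            peak = max(peak, (state == 1).sum())
        print("peak number of infectives:", peak)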

  13. The weighted random graph model

    NASA Astrophysics Data System (ADS)

    Garlaschelli, Diego

    2009-07-01

    We introduce the weighted random graph (WRG) model, which represents the weighted counterpart of the Erdős-Rényi random graph and provides fundamental insights into more complicated weighted networks. We find analytically that the WRG is characterized by a geometric weight distribution, a binomial degree distribution and a negative binomial strength distribution. We also characterize exactly the percolation phase transitions associated with edge removal and with the appearance of weighted subgraphs of any order and intensity. We find that even this completely null model displays a percolation behaviour similar to what is observed in real weighted networks, implying that edge removal cannot be used to detect community structure empirically. By contrast, the analysis of clustering successfully reveals different patterns between the WRG and real networks.
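
    A sketch of sampling from the WRG model as read from the abstract: each node pair independently receives an integer weight w with geometric distribution P(w) = p^w (1 - p), where w = 0 means no edge, giving the weighted analogue of Erdős-Rényi.

        import numpy as np

        def weighted_random_graph(n, p, seed=6):
            """Sample a WRG: i.i.d. geometric weights P(w) = p**w * (1 - p) per node pair."""
            rng = np.random.default_rng(seed)
            W = np.zeros((n, n), dtype=int)
            iu = np.triu_indices(n, k=1)
            # numpy's geometric counts trials >= 1; subtract 1 so the support starts at w = 0
            W[iu] = rng.geometric(1 - p, size=len(iu[0])) - 1
            return W + W.T

        W = weighted_random_graph(200, p=0.1)
        print("mean pair weight:", W[np.triu_indices(200, k=1)].mean())  # expect p / (1 - p)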

  14. Random drift and culture change.

    PubMed

    Bentley, R Alexander; Hahn, Matthew W; Shennan, Stephen J

    2004-07-22

    We show that the frequency distributions of cultural variants, in three different real-world examples--first names, archaeological pottery and applications for technology patents--follow power laws that can be explained by a simple model of random drift. We conclude that cultural and economic choices often reflect a decision process that is value-neutral; this result has far-reaching testable implications for social-science research. PMID:15306315

  15. Random drift and culture change.

    PubMed Central

    Bentley, R. Alexander; Hahn, Matthew W.; Shennan, Stephen J.

    2004-01-01

    We show that the frequency distributions of cultural variants, in three different real-world examples--first names, archaeological pottery and applications for technology patents--follow power laws that can be explained by a simple model of random drift. We conclude that cultural and economic choices often reflect a decision process that is value-neutral; this result has far-reaching testable implications for social-science research. PMID:15306315

  16. Randomized gap and amplitude estimation

    NASA Astrophysics Data System (ADS)

    Zintchenko, Ilia; Wiebe, Nathan

    2016-06-01

    We provide a method for estimating spectral gaps in low-dimensional systems. Unlike traditional phase estimation, our approach does not require ancillary qubits nor does it require well-characterized gates. Instead, it only requires the ability to perform approximate Haar random unitary operations, applying the unitary whose eigenspectrum is sought and performing measurements in the computational basis. We discuss application of these ideas to in-place amplitude estimation and quantum device calibration.

  17. Resolution analysis by random probing

    NASA Astrophysics Data System (ADS)

    Simutė, S.; Fichtner, A.; van Leeuwen, T.

    2015-12-01

    We develop and apply methods for resolution analysis in tomography, based on stochastic probing of the Hessian or resolution operators. Key properties of our methods are (i) low algorithmic complexity and easy implementation, (ii) applicability to any tomographic technique, including full-waveform inversion and linearized ray tomography, (iii) applicability in any spatial dimension and to inversions with a large number of model parameters, (iv) low computational costs that are mostly a fraction of those required for synthetic recovery tests, and (v) the ability to quantify both spatial resolution and inter-parameter trade-offs. Using synthetic full-waveform inversions as benchmarks, we demonstrate that auto-correlations of random-model applications to the Hessian yield various resolution measures, including direction- and position-dependent resolution lengths, and the strength of inter-parameter mappings. We observe that the required number of random test models is around 5 in one, two and three dimensions. This means that the proposed resolution analyses are not only more meaningful than recovery tests but also computationally less expensive. We demonstrate the applicability of our method in 3D real-data full-waveform inversions for the western Mediterranean and Japan. In addition to tomographic problems, resolution analysis by random probing may be used in other inverse methods that constrain continuously distributed properties, including electromagnetic and potential-field inversions, as well as recently emerging geodynamic data assimilation.

  18. Resolution analysis by random probing

    NASA Astrophysics Data System (ADS)

    Fichtner, Andreas; Leeuwen, Tristan van

    2015-08-01

    We develop and apply methods for resolution analysis in tomography, based on stochastic probing of the Hessian or resolution operators. Key properties of our methods are (i) low algorithmic complexity and easy implementation, (ii) applicability to any tomographic technique, including full-waveform inversion and linearized ray tomography, (iii) applicability in any spatial dimension and to inversions with a large number of model parameters, (iv) low computational costs that are mostly a fraction of those required for synthetic recovery tests, and (v) the ability to quantify both spatial resolution and interparameter trade-offs. Using synthetic full-waveform inversions as benchmarks, we demonstrate that autocorrelations of random-model applications to the Hessian yield various resolution measures, including direction- and position-dependent resolution lengths and the strength of interparameter mappings. We observe that the required number of random test models is around five in one, two, and three dimensions. This means that the proposed resolution analyses are not only more meaningful than recovery tests but also computationally less expensive. We demonstrate the applicability of our method in a 3-D real-data full-waveform inversion for the western Mediterranean. In addition to tomographic problems, resolution analysis by random probing may be used in other inverse methods that constrain continuously distributed properties, including electromagnetic and potential-field inversions, as well as recently emerging geodynamic data assimilation.

  19. Topological insulators in random potentials

    NASA Astrophysics Data System (ADS)

    Pieper, Andreas; Fehske, Holger

    2016-01-01

    We investigate the effects of magnetic and nonmagnetic impurities on the two-dimensional surface states of three-dimensional topological insulators (TIs). Modeling weak and strong TIs using a generic four-band Hamiltonian, which allows for a breaking of inversion and time-reversal symmetries and takes into account random local potentials as well as the Zeeman and orbital effects of external magnetic fields, we compute the local density of states, the single-particle spectral function, and the conductance for a (contacted) slab geometry by numerically exact techniques based on kernel polynomial expansion and Green's function approaches. We show that bulk disorder refills the surface-state Dirac gap induced by a homogeneous magnetic field with states, whereas orbital (Peierls-phase) disorder preserves the gap feature. The former effect is more pronounced in weak TIs than in strong TIs. At moderate randomness, disorder-induced conducting channels appear in the surface layer, promoting diffusive metallicity. Random Zeeman fields rapidly destroy any conducting surface states. Imprinting quantum dots on a TI's surface, we demonstrate that carrier transport can be easily tuned by varying the gate voltage, even to the point where quasibound dot states may appear.

  20. Approximating random quantum optimization problems

    NASA Astrophysics Data System (ADS)

    Hsu, B.; Laumann, C. R.; Läuchli, A. M.; Moessner, R.; Sondhi, S. L.

    2013-06-01

    We report a cluster of results regarding the difficulty of finding approximate ground states to typical instances of the quantum satisfiability problem k-body quantum satisfiability (k-QSAT) on large random graphs. As an approximation strategy, we optimize the solution space over “classical” product states, which in turn introduces a novel autonomous classical optimization problem, PSAT, over a space of continuous degrees of freedom rather than discrete bits. Our central results are (i) the derivation of a set of bounds and approximations in various limits of the problem, several of which we believe may be amenable to a rigorous treatment; (ii) a demonstration that an approximation based on a greedy algorithm borrowed from the study of frustrated magnetism performs well over a wide range in parameter space, and its performance reflects the structure of the solution space of random k-QSAT. Simulated annealing exhibits metastability in similar “hard” regions of parameter space; and (iii) a generalization of belief propagation algorithms introduced for classical problems to the case of continuous spins. This yields both approximate solutions, as well as insights into the free energy “landscape” of the approximation problem, including a so-called dynamical transition near the satisfiability threshold. Taken together, these results allow us to elucidate the phase diagram of random k-QSAT in a two-dimensional energy-density-clause-density space.

  1. Decoupling with Random Quantum Circuits

    NASA Astrophysics Data System (ADS)

    Brown, Winton; Fawzi, Omar

    2015-12-01

    Decoupling has become a central concept in quantum information theory, with applications including proving coding theorems, randomness extraction and the study of conditions for reaching thermal equilibrium. However, our understanding of the dynamics that lead to decoupling is limited. In fact, the only families of transformations that are known to lead to decoupling are (approximate) unitary two-designs, i.e., measures over the unitary group that behave like the Haar measure as far as the first two moments are concerned. Such families include for example random quantum circuits with O(n²) gates, where n is the number of qubits in the system under consideration. In fact, all known constructions of decoupling circuits use Ω(n²) gates. Here, we prove that random quantum circuits with O(n log² n) gates satisfy an essentially optimal decoupling theorem. In addition, these circuits can be implemented in depth O(log³ n). This proves that decoupling can happen in a time that scales polylogarithmically in the number of particles in the system, provided all the particles are allowed to interact. Our proof does not proceed by showing that such circuits are approximate two-designs in the usual sense, but rather we directly analyze the decoupling property.

  2. Random Time Identity Based Firewall In Mobile Ad hoc Networks

    NASA Astrophysics Data System (ADS)

    Suman, Patel, R. B.; Singh, Parvinder

    2010-11-01

    A mobile ad hoc network (MANET) is a self-organizing network of mobile routers and associated hosts connected by wireless links. MANETs are highly flexible and adaptable but at the same time are highly prone to security risks due to the open medium, dynamically changing network topology, cooperative algorithms, and lack of centralized control. A firewall is an effective means of protecting a local network from network-based security threats and forms a key component in MANET security architecture. This paper presents a review of firewall implementation techniques in MANETs and their relative merits and demerits. A new approach is proposed that selects MANET nodes at random for firewall implementation: after a fixed time interval, a new firewall node is randomly selected, subject to critical values of certain parameters such as remaining power backup. This approach effectively balances power and resource utilization across the entire MANET, because the responsibility of implementing the firewall is shared equally among all nodes. At the same time, it improves MANET security against outside attacks, since an intruder cannot determine the entry point into the MANET due to the random selection of firewall nodes.
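
    A hedged sketch of the rotation scheme described above: after each fixed interval, a new firewall node is drawn at random from the nodes whose remaining power backup clears a threshold. The node fields, the threshold value, and the power-drain model are illustrative assumptions, not the authors' exact criteria.

        # Sketch of random firewall rotation with a power-backup criterion
        # (all node fields and thresholds are illustrative assumptions).
        import random

        def select_firewall(nodes, min_power=0.3, rng=random):
            eligible = [n for n in nodes if n["power"] >= min_power]
            if not eligible:             # fall back to the best-powered node
                return max(nodes, key=lambda n: n["power"])
            return rng.choice(eligible)

        nodes = [{"id": i, "power": random.random()} for i in range(10)]
        for epoch in range(3):           # each epoch = one fixed time interval
            fw = select_firewall(nodes)
            print(f"epoch {epoch}: firewall node {fw['id']} (power {fw['power']:.2f})")
            fw["power"] -= 0.1           # firewall duty drains the node's backup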

  3. 49 CFR 219.602 - FRA Administrator's determination of random drug testing rate.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... (Continued) FEDERAL RAILROAD ADMINISTRATION, DEPARTMENT OF TRANSPORTATION CONTROL OF ALCOHOL AND DRUG USE... of the selection process. (f) The railroad must randomly select a sufficient number of covered... objective, neutral criteria which ensures that every covered employee has a substantially equal...

  4. 49 CFR 219.602 - FRA Administrator's determination of random drug testing rate.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... (Continued) FEDERAL RAILROAD ADMINISTRATION, DEPARTMENT OF TRANSPORTATION CONTROL OF ALCOHOL AND DRUG USE... of the selection process. (f) The railroad must randomly select a sufficient number of covered... objective, neutral criteria which ensures that every covered employee has a substantially equal...

  5. Selection of informative parameters of vibroacoustic processes

    NASA Technical Reports Server (NTRS)

    Koshek, L. N.

    1973-01-01

    The problem of selecting informative parameters of vibroacoustic processes, and the construction of apparatus for determining them, is discussed. It is assumed that the processes under investigation are structurally uniform and are either purely random or contain only a small number of deterministic components.

  6. In silico selection of RNA aptamers

    PubMed Central

    Chushak, Yaroslav; Stone, Morley O.

    2009-01-01

    In vitro selection of RNA aptamers that bind to a specific ligand usually begins with a random pool of RNA sequences. We propose a computational approach for designing a starting pool of RNA sequences for the selection of RNA aptamers for specific analyte binding. Our approach consists of three steps: (i) selection of RNA sequences based on their secondary structure, (ii) generating a library of three-dimensional (3D) structures of RNA molecules and (iii) high-throughput virtual screening of this library to select aptamers with binding affinity to a desired small molecule. We developed a set of criteria that allows one to select a sequence with potential binding affinity from a pool of random sequences and developed a protocol for RNA 3D structure prediction. As verification, we tested the performance of in silico selection on a set of six known aptamer–ligand complexes. The structures of the native sequences for the ligands in the testing set were among the top 5% of the selected structures. The proposed approach reduces the RNA sequences search space by four to five orders of magnitude—significantly accelerating the experimental screening and selection of high-affinity aptamers. PMID:19465396

  7. In silico selection of RNA aptamers.

    PubMed

    Chushak, Yaroslav; Stone, Morley O

    2009-07-01

    In vitro selection of RNA aptamers that bind to a specific ligand usually begins with a random pool of RNA sequences. We propose a computational approach for designing a starting pool of RNA sequences for the selection of RNA aptamers for specific analyte binding. Our approach consists of three steps: (i) selection of RNA sequences based on their secondary structure, (ii) generating a library of three-dimensional (3D) structures of RNA molecules and (iii) high-throughput virtual screening of this library to select aptamers with binding affinity to a desired small molecule. We developed a set of criteria that allows one to select a sequence with potential binding affinity from a pool of random sequences and developed a protocol for RNA 3D structure prediction. As verification, we tested the performance of in silico selection on a set of six known aptamer-ligand complexes. The structures of the native sequences for the ligands in the testing set were among the top 5% of the selected structures. The proposed approach reduces the RNA sequences search space by four to five orders of magnitude--significantly accelerating the experimental screening and selection of high-affinity aptamers. PMID:19465396
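
    The following toy sketch illustrates only step (i) of the pipeline above: drawing a random RNA pool and screening it with a simple structural criterion. The screen used here (GC content plus a crude self-complementary-stem check) is a stand-in assumption; the authors use genuine secondary-structure prediction.

        # Toy sketch of step (i): generate a random RNA pool, then keep
        # sequences passing a crude structural screen (a stand-in for real
        # secondary-structure prediction).
        import random

        COMP = str.maketrans("AUGC", "UACG")

        def has_stem(seq, stem_len=6):
            """Crude hairpin test: does any window pair with a later region?"""
            rc = seq.translate(COMP)[::-1]
            return any(seq[i:i + stem_len] in rc for i in range(len(seq) - stem_len))

        def random_pool(n=10000, length=40, seed=1):
            rng = random.Random(seed)
            return ["".join(rng.choice("AUGC") for _ in range(length)) for _ in range(n)]

        pool = random_pool()
        screened = [s for s in pool
                    if 0.45 <= sum(c in "GC" for c in s) / len(s) <= 0.65
                    and has_stem(s)]
        print(f"{len(screened)} of {len(pool)} sequences pass the structural screen")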

  8. Self-correcting random number generator

    DOEpatents

    Humble, Travis S.; Pooser, Raphael C.

    2016-09-06

    A system and method for generating random numbers. The system may include a random number generator (RNG), such as a quantum random number generator (QRNG) configured to self-correct or adapt in order to substantially achieve randomness from the output of the RNG. By adapting, the RNG may generate a random number that may be considered random regardless of whether the random number itself is tested as such. As an example, the RNG may include components to monitor one or more characteristics of the RNG during operation, and may use the monitored characteristics as a basis for adapting, or self-correcting, to provide a random number according to one or more performance criteria.
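
    By way of a software analogy only (the patent describes hardware-level monitoring and adaptation), the sketch below corrects a skewed bit source with von Neumann debiasing, one classical example of turning a monitored imperfection into output that meets a performance criterion. The bias value and sample size are arbitrary assumptions.

        # Software analogy of self-correction: debias a skewed bit source with
        # the von Neumann procedure (bias and sample size are assumptions).
        import random

        def biased_source(p=0.55, rng=random):
            while True:
                yield 1 if rng.random() < p else 0

        def von_neumann(bits):
            while True:
                a, b = next(bits), next(bits)
                if a != b:               # 01 -> 0, 10 -> 1; discard 00 and 11
                    yield a

        corrected = von_neumann(biased_source())
        sample = [next(corrected) for _ in range(20000)]
        print("corrected bias:", sum(sample) / len(sample))   # ~0.5 despite p=0.55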

  9. Generation of pseudo-random numbers

    NASA Technical Reports Server (NTRS)

    Howell, L. W.; Rheinfurth, M. H.

    1982-01-01

    Practical methods for generating acceptable random numbers from a variety of probability distributions which are frequently encountered in engineering applications are described. The speed, accuracy, and guarantee of statistical randomness of the various methods are discussed.
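
    One standard method of the kind such reports survey is inverse transform sampling, which pushes uniform variates through the inverse cumulative distribution function. A minimal sketch for the exponential distribution, whose inverse CDF has a closed form:

        # Inverse transform sampling for the exponential distribution:
        # if U ~ Uniform(0,1), then -ln(1-U)/lambda ~ Exp(lambda).
        import math
        import random

        def exponential_variate(lam, rng=random):
            u = rng.random()
            return -math.log(1.0 - u) / lam

        samples = [exponential_variate(2.0) for _ in range(100000)]
        print("sample mean (expect 1/lambda = 0.5):", sum(samples) / len(samples))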

  10. Localization for random and quasiperiodic potentials

    NASA Astrophysics Data System (ADS)

    Spencer, Thomas

    1988-06-01

    A survey is made of some recent mathematical results and techniques for Schrödinger operators with random and quasiperiodic potentials. A new proof of localization for random potentials, established in collaboration with H. von Dreifus, is sketched.

  11. High speed optical quantum random number generation.

    PubMed

    Fürst, Martin; Weier, Henning; Nauerth, Sebastian; Marangon, Davide G; Kurtsiefer, Christian; Weinfurter, Harald

    2010-06-01

    We present a fully integrated, ready-for-use quantum random number generator (QRNG) whose stochastic model is based on the randomness of detecting single photons in attenuated light. We show that often annoying deadtime effects associated with photomultiplier tubes (PMT) can be utilized to avoid postprocessing for bias or correlations. The random numbers directly delivered to a PC, generated at a rate of up to 50 Mbit/s, clearly pass all tests relevant for (physical) random number generators. PMID:20588431

  12. The HEART Pathway Randomized Trial

    PubMed Central

    Mahler, Simon A.; Riley, Robert F.; Hiestand, Brian C.; Russell, Gregory B.; Hoekstra, James W.; Lefebvre, Cedric W.; Nicks, Bret A.; Cline, David M.; Askew, Kim L.; Elliott, Stephanie B.; Herrington, David M.; Burke, Gregory L.; Miller, Chadwick D.

    2015-01-01

    Background The HEART Pathway is a decision aid designed to identify emergency department patients with acute chest pain for early discharge. No randomized trials have compared the HEART Pathway with usual care. Methods and Results Adult emergency department patients with symptoms related to acute coronary syndrome without ST-elevation on ECG (n=282) were randomized to the HEART Pathway or usual care. In the HEART Pathway arm, emergency department providers used the HEART score, a validated decision aid, and troponin measures at 0 and 3 hours to identify patients for early discharge. Usual care was based on American College of Cardiology/American Heart Association guidelines. The primary outcome, objective cardiac testing (stress testing or angiography), and secondary outcomes, index length of stay, early discharge, and major adverse cardiac events (MACE: death, myocardial infarction, or coronary revascularization), were assessed at 30 days by phone interview and record review. Participants had a mean age of 53 years, 16% had previous myocardial infarction, and 6% (95% confidence interval, 3.6%–9.5%) had major adverse cardiac events within 30 days of randomization. Compared with usual care, use of the HEART Pathway decreased objective cardiac testing at 30 days by 12.1% (68.8% versus 56.7%; P=0.048) and length of stay by 12 hours (9.9 versus 21.9 hours; P=0.013) and increased early discharges by 21.3% (39.7% versus 18.4%; P<0.001). No patients identified for early discharge had major adverse cardiac events within 30 days. Conclusions The HEART Pathway reduces objective cardiac testing over 30 days, shortens length of stay, and increases early discharges. These important efficiency gains occurred without any patients identified for early discharge suffering MACE at 30 days. PMID:25737484

  13. Random bearings and their stability.

    PubMed

    Mahmoodi Baram, Reza; Herrmann, Hans J

    2005-11-25

    Self-similar space-filling bearings have been proposed some time ago as models for the motion of tectonic plates and appearance of seismic gaps. These models have two features which, however, seem unrealistic, namely, high symmetry in the arrangement of the particles, and lack of a lower cutoff in the size of the particles. In this work, an algorithm for generating random bearings in both two and three dimensions is presented. Introducing a lower cutoff for the sizes of the particles, the instabilities of the bearing under an external force such as gravity, are studied. PMID:16384225

  14. Random Matrix Theory and Econophysics

    NASA Astrophysics Data System (ADS)

    Rosenow, Bernd

    2000-03-01

    Random Matrix Theory (RMT) [1] is used in many branches of physics as a ``zero information hypothesis''. It describes generic behavior of different classes of systems, while deviations from its universal predictions allow one to identify system-specific properties. We use methods of RMT to analyze the cross-correlation matrix C of stock price changes [2] of the largest 1000 US companies. In addition to its scientific interest, the study of correlations between the returns of different stocks is also of practical relevance in quantifying the risk of a given stock portfolio. We find [3,4] that the statistics of most of the eigenvalues of the spectrum of C agree with the predictions of RMT, while there are deviations for some of the largest eigenvalues. We interpret these deviations as a system-specific property, e.g., containing genuine information about correlations in the stock market. We demonstrate that C shares universal properties with the Gaussian orthogonal ensemble of random matrices. Furthermore, we analyze the eigenvectors of C through their inverse participation ratio and find eigenvectors with large ratios at both edges of the eigenvalue spectrum - a situation reminiscent of localization theory results. This work was done in collaboration with V. Plerou, P. Gopikrishnan, T. Guhr, L.A.N. Amaral, and H.E. Stanley and is related to recent work of Laloux et al. 1. T. Guhr, A. Müller Groeling, and H.A. Weidenmüller, ``Random Matrix Theories in Quantum Physics: Common Concepts'', Phys. Rep. 299, 190 (1998). 2. See, e.g. R.N. Mantegna and H.E. Stanley, Econophysics: Correlations and Complexity in Finance (Cambridge University Press, Cambridge, England, 1999). 3. V. Plerou, P. Gopikrishnan, B. Rosenow, L.A.N. Amaral, and H.E. Stanley, ``Universal and Nonuniversal Properties of Cross Correlations in Financial Time Series'', Phys. Rev. Lett. 83, 1471 (1999). 4. V. Plerou, P. Gopikrishnan, T. Guhr, B. Rosenow, L.A.N. Amaral, and H.E. Stanley, ``Random Matrix Theory

  15. RANDOM FORESTS FOR PHOTOMETRIC REDSHIFTS

    SciTech Connect

    Carliles, Samuel; Szalay, Alexander S.; Budavari, Tamas; Heinis, Sebastien; Priebe, Carey

    2010-03-20

    The main challenge today in photometric redshift estimation is not in the accuracy but in understanding the uncertainties. We introduce an empirical method based on Random Forests to address these issues. The training algorithm builds a set of optimal decision trees on subsets of the available spectroscopic sample, which provide independent constraints on the redshift of each galaxy. The combined forest estimates have intriguing statistical properties, notable among which are Gaussian errors. We demonstrate the power of our approach on multi-color measurements of the Sloan Digital Sky Survey.
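
    A hedged sketch of the general approach using scikit-learn: a random forest regressor maps galaxy colors to redshift, and the spread of per-tree predictions supplies an empirical error estimate. The synthetic inputs here merely stand in for SDSS photometry and do not reproduce the paper's data or results.

        # Random forest photo-z sketch with per-tree scatter as an error
        # estimate (synthetic stand-in data, not the paper's training set).
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)
        colors = rng.normal(size=(5000, 4))              # e.g. u-g, g-r, r-i, i-z
        z = 0.4 + 0.1 * colors[:, 0] - 0.05 * colors[:, 2] + rng.normal(0, 0.02, 5000)

        forest = RandomForestRegressor(n_estimators=100, random_state=0)
        forest.fit(colors[:4000], z[:4000])

        # Per-tree predictions give an empirical error distribution per galaxy.
        per_tree = np.stack([t.predict(colors[4000:]) for t in forest.estimators_])
        print("mean prediction error:", np.mean(np.abs(per_tree.mean(0) - z[4000:])))
        print("mean per-galaxy tree scatter:", per_tree.std(0).mean())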

  16. Quantum random walks without walking

    SciTech Connect

    Manouchehri, K.; Wang, J. B.

    2009-12-15

    Quantum random walks have received much interest due to their nonintuitive dynamics, which may hold the key to a new generation of quantum algorithms. What remains a major challenge is a physical realization that is experimentally viable and not limited to special connectivity criteria. We present a scheme for walking on arbitrarily complex graphs, which can be realized using a variety of quantum systems such as a Bose-Einstein condensate trapped inside an optical lattice. This scheme is particularly elegant since the walker is not required to physically step between the nodes; only flipping coins is sufficient.

  17. Quantum random flip-flop and its applications in random frequency synthesis and true random number generation.

    PubMed

    Stipčević, Mario

    2016-03-01

    In this work, a new type of elementary logic circuit, named random flip-flop (RFF), is proposed, experimentally realized, and studied. Unlike conventional Boolean logic circuits whose action is deterministic and highly reproducible, the action of an RFF is intentionally made maximally unpredictable and, in the proposed realization, derived from a fundamentally random process of emission and detection of light quanta. We demonstrate novel applications of the RFF in randomness-preserving frequency division, random frequency synthesis, and random number generation. Possible uses of these applications in information and communication technology, cryptographic hardware, and testing equipment are discussed. PMID:27036825

  18. Quantum random flip-flop and its applications in random frequency synthesis and true random number generation

    NASA Astrophysics Data System (ADS)

    Stipčević, Mario

    2016-03-01

    In this work, a new type of elementary logic circuit, named random flip-flop (RFF), is proposed, experimentally realized, and studied. Unlike conventional Boolean logic circuits whose action is deterministic and highly reproducible, the action of an RFF is intentionally made maximally unpredictable and, in the proposed realization, derived from a fundamentally random process of emission and detection of light quanta. We demonstrate novel applications of the RFF in randomness-preserving frequency division, random frequency synthesis, and random number generation. Possible uses of these applications in information and communication technology, cryptographic hardware, and testing equipment are discussed.

  19. Randomness in Sequence Evolution Increases over Time

    PubMed Central

    Wang, Guangyu; Sun, Shixiang; Zhang, Zhang

    2016-01-01

    The second law of thermodynamics states that entropy, as a measure of randomness in a system, increases over time. Although studies have investigated biological sequence randomness from different aspects, it remains unknown whether sequence randomness changes over time and whether this change is consistent with the second law of thermodynamics. To capture the dynamics of randomness in molecular sequence evolution, here we measure sequence randomness based on a collection of eight statistical randomness tests and investigate the randomness variation of coding sequences with an application to Escherichia coli. Given that core/essential genes are more ancient than specific/non-essential genes, our results clearly show that core/essential genes are more random than specific/non-essential genes and accordingly indicate that sequence randomness indeed increases over time, consistent with the second law of thermodynamics. We further find that an increase in sequence randomness leads to increasing randomness of GC content and longer sequence length. Taken together, our study presents, for the first time, the finding that sequence randomness increases over time, which may provide profound insights for unveiling the underlying mechanisms of molecular sequence evolution. PMID:27224236

  20. Cluster randomization: a trap for the unwary.

    PubMed Central

    Underwood, M; Barnett, A; Hajioff, S

    1998-01-01

    Controlled trials that randomize by practice can provide robust evidence to inform patient care. However, compared with randomizing by each individual patient, this approach may have substantial implications for sample size calculations and the interpretation of results. An increased awareness of these effects will improve the quality of research based on randomization by practice. PMID:9624757

  1. Source-Independent Quantum Random Number Generation

    NASA Astrophysics Data System (ADS)

    Cao, Zhu; Zhou, Hongyi; Yuan, Xiao; Ma, Xiongfeng

    2016-01-01

    Quantum random number generators can provide genuine randomness by appealing to the fundamental principles of quantum mechanics. In general, a physical generator contains two parts—a randomness source and its readout. The source is essential to the quality of the resulting random numbers; hence, it needs to be carefully calibrated and modeled to achieve information-theoretically provable randomness. However, in practice, the source is a complicated physical system, such as a light source or an atomic ensemble, and any deviations in the real-life implementation from the theoretical model may affect the randomness of the output. To close this gap, we propose a source-independent scheme for quantum random number generation in which output randomness can be certified, even when the source is uncharacterized and untrusted. In our randomness analysis, we make no assumptions about the dimension of the source. For instance, multiphoton emissions are allowed in optical implementations. Our analysis takes into account the finite-key effect with the composable security definition. In the limit of large data size, the length of the input random seed is exponentially small compared to that of the output random bit. In addition, by modifying a quantum key distribution system, we experimentally demonstrate our scheme and achieve a randomness generation rate of over 5 × 10³ bits/s.

  2. Randomness in Sequence Evolution Increases over Time.

    PubMed

    Wang, Guangyu; Sun, Shixiang; Zhang, Zhang

    2016-01-01

    The second law of thermodynamics states that entropy, as a measure of randomness in a system, increases over time. Although studies have investigated biological sequence randomness from different aspects, it remains unknown whether sequence randomness changes over time and whether this change is consistent with the second law of thermodynamics. To capture the dynamics of randomness in molecular sequence evolution, here we measure sequence randomness based on a collection of eight statistical randomness tests and investigate the randomness variation of coding sequences with an application to Escherichia coli. Given that core/essential genes are more ancient than specific/non-essential genes, our results clearly show that core/essential genes are more random than specific/non-essential genes and accordingly indicate that sequence randomness indeed increases over time, consistent with the second law of thermodynamics. We further find that an increase in sequence randomness leads to increasing randomness of GC content and longer sequence length. Taken together, our study presents, for the first time, the finding that sequence randomness increases over time, which may provide profound insights for unveiling the underlying mechanisms of molecular sequence evolution. PMID:27224236
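
    For illustration only, the sketch below computes two simple randomness measures of the general kind applied in such studies: Shannon entropy and a Wald-Wolfowitz runs-test statistic on a purine/pyrimidine reduction of an arbitrary example sequence. The eight specific tests used in the paper are not reproduced here.

        # Two toy randomness measures for a nucleotide sequence (the example
        # sequence is arbitrary; these are not the paper's eight tests).
        import math

        def shannon_entropy(seq):
            counts = {c: seq.count(c) for c in set(seq)}
            n = len(seq)
            return -sum(k / n * math.log2(k / n) for k in counts.values())

        def runs_z_score(bits):
            """Wald-Wolfowitz runs-test statistic on a 0/1 sequence."""
            n1, n0 = sum(bits), len(bits) - sum(bits)
            runs = 1 + sum(b != a for a, b in zip(bits, bits[1:]))
            mu = 2 * n1 * n0 / (n1 + n0) + 1
            var = (mu - 1) * (mu - 2) / (n1 + n0 - 1)
            return (runs - mu) / math.sqrt(var)

        seq = "ATGGCGAATCCGTTACAGGTAGCGTTGACCGGTATGCCAAT"
        bits = [1 if c in "AG" else 0 for c in seq]      # purine = 1
        print("entropy (bits/symbol):", round(shannon_entropy(seq), 3))
        print("runs-test z:", round(runs_z_score(bits), 2))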

  3. Instructive selection and immunological theory.

    PubMed

    Lederberg, Joshua

    2002-07-01

    The turning point of modern immunological theory was the advent of the clonal selection theory (Burnet, Talmage - 1957). A useful heuristic in the classification of theoretical models was the contrast of 'instructive' with 'selective' models of the acquisition of information by biological systems. The neo-Darwinian synthesis of the 1940s had consolidated biologists' model of evolution based on prior random variation and natural selection, viz. differential fecundity. While evolution in the large was by then pretty well settled, controversy remained about examples of cellular adaptation to chemical challenges, like induced drug-resistance, enzyme formation and the antibody response. While instructive theories have been on the decline, some clear-cut examples can be found of molecular imprinting in the abiotic world, leading, e.g., to the production of specific sorbents. Template-driven assembly, as in DNA synthesis, has remained a paradigm of instructive specification. Nevertheless, the classification may break down with more microscopic scrutiny of the processes of molecular fit of substrates with enzymes, of monomers to an elongating polymer chain, as the reactants often traverse a state space from which activated components are appropriately selected. The same process may be 'instructive' from a holistic, 'selective' from an atomic perspective. PMID:12190921

  4. Random sources for cusped beams.

    PubMed

    Li, Jia; Wang, Fei; Korotkova, Olga

    2016-08-01

    We introduce two novel classes of partially coherent sources whose degrees of coherence are described by the rectangular Lorentz-correlated Schell-model (LSM) and rectangular fractional multi-Gaussian-correlated Schell-model (FMGSM) functions. Based on the generalized Collins formula, analytical expressions are derived for the spectral density distributions of these beams propagating through a stigmatic ABCD optical system. It is shown that beams belonging to both classes form a spectral density apex that is much higher and sharper than that generated by the Gaussian Schell-model (GSM) beam with a comparable coherence state. We experimentally generate these beams by using a nematic, transmissive spatial light modulator (SLM) that serves as a random phase screen controlled by a computer. The experimental data is consistent with theoretical predictions. Moreover, it is illustrated that the FMGSM beam generated in our experiments has a better focusing capacity than the GSM beam with the same coherence state. The applications that can potentially benefit from the use of these novel beams range from material surface processing to communications and sensing through random media. PMID:27505746

  5. Random Tensors and Planted Cliques

    NASA Astrophysics Data System (ADS)

    Brubaker, S. Charles; Vempala, Santosh S.

    The r-parity tensor of a graph is a generalization of the adjacency matrix, where the tensor's entries denote the parity of the number of edges in subgraphs induced by r distinct vertices. For r = 2, it is the adjacency matrix with +1's for edges and −1's for nonedges. It is well-known that the 2-norm of the adjacency matrix of a random graph is O(√n). Here we show that the 2-norm of the r-parity tensor is at most f(r)√n log^{O(r)} n, answering a question of Frieze and Kannan [1], who proved this for r = 3. As a consequence, we get a tight connection between the planted clique problem and the problem of finding a vector that approximates the 2-norm of the r-parity tensor of a random graph. Our proof method is based on an inductive application of concentration of measure.

  6. The wasteland of random supergravities

    NASA Astrophysics Data System (ADS)

    Marsh, David; McAllister, Liam; Wrase, Timm

    2012-03-01

    We show that in a general 𝒩 = 1 supergravity with N ≫ 1 scalar fields, an exponentially small fraction of the de Sitter critical points are metastable vacua. Taking the superpotential and Kähler potential to be random functions, we construct a random matrix model for the Hessian matrix, which is well-approximated by the sum of a Wigner matrix and two Wishart matrices. We compute the eigenvalue spectrum analytically from the free convolution of the constituent spectra and find that in typical configurations, a significant fraction of the eigenvalues are negative. Building on the Tracy-Widom law governing fluctuations of extreme eigenvalues, we determine the probability P of a large fluctuation in which all the eigenvalues become positive. Strong eigenvalue repulsion makes this extremely unlikely: we find P ∝ exp(−c N^p), with c and p constants. For generic critical points we find p ≈ 1.5, while for approximately supersymmetric critical points, p ≈ 1.3. Our results have significant implications for the counting of de Sitter vacua in string theory, but the number of vacua remains vast.

  7. Parabolic Anderson Model in a Dynamic Random Environment: Random Conductances

    NASA Astrophysics Data System (ADS)

    Erhard, D.; den Hollander, F.; Maillard, G.

    2016-06-01

    The parabolic Anderson model is defined as the partial differential equation ∂u(x, t)/∂t = κΔu(x, t) + ξ(x, t)u(x, t), x ∈ ℤ^d, t ≥ 0, where κ ∈ [0, ∞) is the diffusion constant, Δ is the discrete Laplacian, and ξ is a dynamic random environment that drives the equation. The initial condition u(x, 0) = u_0(x), x ∈ ℤ^d, is typically taken to be non-negative and bounded. The solution of the parabolic Anderson equation describes the evolution of a field of particles performing independent simple random walks with binary branching: particles jump at rate 2dκ, split into two at rate ξ ∨ 0, and die at rate (−ξ) ∨ 0. In earlier work we looked at the Lyapunov exponents λ_p(κ) = lim_{t→∞} (1/t) log E([u(0,t)]^p)^{1/p}, p ∈ ℕ, and λ_0(κ) = lim_{t→∞} (1/t) log u(0,t). For the former we derived quantitative results on the κ-dependence for four choices of ξ: space-time white noise, independent simple random walks, the exclusion process and the voter model. For the latter we obtained qualitative results under certain space-time mixing conditions on ξ. In the present paper we investigate what happens when κΔ is replaced by Δ^𝓚, where 𝓚 = {𝓚(x, y) : x, y ∈ ℤ^d, x ∼ y} is a collection of random conductances between neighbouring sites replacing the constant conductances κ in the homogeneous model. We show that the associated annealed Lyapunov exponents λ_p(𝓚), p ∈ ℕ, are given by the formula λ_p(𝓚) = sup{λ_p(κ) : κ ∈ Supp(𝓚)}, where, for a fixed realisation of 𝓚, Supp(𝓚) is the set of values taken by the 𝓚-field. We also show that for the associated quenched Lyapunov exponent λ_0(𝓚) this formula only provides a lower bound, and we conjecture that an upper bound holds when Supp(𝓚) is replaced by its convex hull. Our proof is valid for three classes of reversible ξ, and for all 𝓚

  8. Properties of permuted-block randomization in clinical trials.

    PubMed

    Matts, J P; Lachin, J M

    1988-12-01

    This article describes some of the important statistical properties of the commonly used permuted-block design, also known simply as blocked-randomization. Under a permutation model for statistical tests, proper analyses should employ tests that incorporate the blocking used in the randomization. These include the block-stratified Mantel-Haenszel chi-square test for binary data, the blocked analysis of variance F test, and the blocked nonparametric linear rank test. It is common, however, to ignore the blocking in the analysis. For these tests, it is shown that the size of a test obtained from an analysis incorporating the blocking (say T), versus an analysis ignoring the blocking (say TI), is related to the intrablock correlation coefficient (R) as TI = T(1-R). For blocks of common length 2m, the range of R is from -1/(2m-1) to 1. Thus, if there is a positive intrablock correlation, which is more likely than not for m greater than 1, an analysis ignoring blocking will be unduly conservative. Permutation tests are also presented for the case of stratified analyses within one or more subgroups of patients defined post hoc on the basis of a covariate. This provides a basis for the analysis when responses from some patients are assumed to be missing-at-random. An alternative strategy that requires no assumptions is to perform the analysis using only the subset of complete blocks in which no observations are missing. The Blackwell-Hodges model is used to assess the potential for selection bias induced by investigator attempts to guess which treatment is more likely to be assigned to each incoming patient. In an unmasked trial, the permuted-block design provides substantial potential for selection bias in the comparison of treatments due to the predictability of the assignments that is induced by the requirement of balance within blocks. Further, this bias is not eliminated by the use of random block sizes. We also modify the Blackwell-Hodges model to allow for
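
    A minimal sketch of the design under study: permuted blocks with block sizes chosen at random, which the article shows does not by itself eliminate selection bias in unmasked trials. The arm labels and the allowed block sizes are illustrative assumptions.

        # Permuted-block allocation with randomly chosen block sizes
        # (arm labels and block sizes are illustrative assumptions).
        import random

        def blocked_allocation(n, arms=("A", "B"), block_multiples=(1, 2), seed=42):
            rng = random.Random(seed)
            sequence = []
            while len(sequence) < n:
                m = rng.choice(block_multiples)   # block size: m copies per arm
                block = list(arms) * m            # balanced block, e.g. A,B,A,B
                rng.shuffle(block)                # permute within the block
                sequence.extend(block)
            return sequence[:n]

        print("".join(blocked_allocation(20)))    # e.g. ABBABAABBA...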

  9. Truly random number generation: an example

    NASA Astrophysics Data System (ADS)

    Frauchiger, Daniela; Renner, Renato

    2013-10-01

    Randomness is crucial for a variety of applications, ranging from gambling to computer simulations, and from cryptography to statistics. However, many of the currently used methods for generating randomness do not meet the criteria that are necessary for these applications to work properly and safely. A common problem is that a sequence of numbers may look random but nevertheless not be truly random. In fact, the sequence may pass all standard statistical tests and yet be perfectly predictable. This renders it useless for many applications. For example, in cryptography, the predictability of a "randomly" chosen password is obviously undesirable. Here, we review a recently developed approach to generating true -- and hence unpredictable -- randomness.

  10. On fatigue crack growth under random loading

    NASA Astrophysics Data System (ADS)

    Zhu, W. Q.; Lin, Y. K.; Lei, Y.

    1992-09-01

    A probabilistic analysis of the fatigue crack growth, fatigue life and reliability of a structural or mechanical component is presented on the basis of fracture mechanics and theory of random processes. The material resistance to fatigue crack growth and the time-history of the stress are assumed to be random. Analytical expressions are obtained for the special case in which the random stress is a stationary narrow-band Gaussian random process, and a randomized Paris-Erdogan law is applicable. As an example, the analytical method is applied to a plate with a central crack, and the results are compared with those obtained from digital Monte Carlo simulations.
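
    In the spirit of the paper's comparison between analytical results and Monte Carlo simulation, the sketch below draws fatigue lives from a Paris-Erdogan growth law with a random (lognormally scattered) coefficient C, integrated in closed form for exponent m > 2. All numerical values and the scatter model are assumptions, not the paper's.

        # Monte Carlo fatigue lives under a randomized Paris-Erdogan law
        # da/dN = C*(Y*dsigma*sqrt(pi*a))**m (units: MPa, metres; all values
        # are illustrative assumptions).
        import math
        import random

        def paris_life(a0, ac, C, m, dsigma, Y=1.0):
            """Closed-form cycle count from initial crack a0 to critical ac, m > 2."""
            B = (Y * dsigma * math.sqrt(math.pi)) ** m
            return 2.0 * (a0 ** (1 - m / 2) - ac ** (1 - m / 2)) / ((m - 2) * C * B)

        def random_life(rng=random):
            C = 1e-11 * math.exp(rng.gauss(0.0, 0.3))   # random material resistance
            return paris_life(a0=1e-3, ac=2e-2, C=C, m=3.0, dsigma=100.0)

        lives = sorted(random_life() for _ in range(10000))
        print("median life:", f"{lives[len(lives) // 2]:.3g}", "cycles")
        print("10th percentile:", f"{lives[len(lives) // 10]:.3g}", "cycles")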

  11. Modeling Randomness in Judging Rating Scales with a Random-Effects Rating Scale Model

    ERIC Educational Resources Information Center

    Wang, Wen-Chung; Wilson, Mark; Shih, Ching-Lin

    2006-01-01

    This study presents the random-effects rating scale model (RE-RSM) which takes into account randomness in the thresholds over persons by treating them as random-effects and adding a random variable for each threshold in the rating scale model (RSM) (Andrich, 1978). The RE-RSM turns out to be a special case of the multidimensional random…

  12. Effects of zinc supplementation on subscales of anorexia in children: A randomized controlled trial

    PubMed Central

    Khademian, Majid; Farhangpajouh, Neda; Shahsanaee, Armindokht; Bahreynian, Maryam; Mirshamsi, Mehran; Kelishadi, Roya

    2014-01-01

    Objectives: This study aims to assess the effects of zinc supplementation on improving appetite and its subscales in children. Methods: This study was conducted in 2013 in Isfahan, Iran, in two phases. In the first phase, after validation of the Child Eating Behaviour Questionnaire (CEBQ), the questionnaire was completed for 300 randomly selected preschool children. The second phase was conducted as a randomized controlled trial. Eighty of these children were randomly selected, and were randomly assigned to two groups of equal number receiving zinc (10 mg/day) or placebo for 12 weeks. Results: Overall, 77 children completed the trial (39 in the zinc group and 38 in the control group). The results showed that zinc supplementation can improve calorie intake in children by affecting some CEBQ subscales, such as Emotional Overeating and Food Responsiveness. Conclusion: Zinc supplementation had a positive impact on calorie intake and some subscales of anorexia. PMID:25674110

  13. Causal Mediation Analyses for Randomized Trials.

    PubMed

    Lynch, Kevin G; Cary, Mark; Gallop, Robert; Ten Have, Thomas R

    2008-01-01

    In the context of randomized intervention trials, we describe causal methods for analyzing how post-randomization factors constitute the process through which randomized baseline interventions act on outcomes. Traditionally, such mediation analyses have been undertaken with great caution, because they assume that the mediating factor is also randomly assigned to individuals in addition to the randomized baseline intervention (i.e., sequential ignorability). Because the mediating factors are typically not randomized, such analyses are unprotected from unmeasured confounders that may lead to biased inference. We review several causal approaches that attempt to reduce such bias without assuming that the mediating factor is randomized. However, these causal approaches require certain interaction assumptions that may be assessed if there is enough treatment heterogeneity with respect to the mediator. We describe available estimation procedures in the context of several examples from the literature and provide resources for software code. PMID:19484136

  14. Certifying Unpredictable Randomness from Quantum Nonlocality

    NASA Astrophysics Data System (ADS)

    Bierhorst, Peter

    2015-03-01

    A device-independent quantum randomness protocol takes an initial random seed as input and then expands it into a longer random string. It has been proven that if the initial random seed is trusted to be unpredictable, then the longer output string can also be certified to be unpredictable by an experimental violation of Bell's inequality. It has furthermore been argued that the initial random seed may not need to be truly unpredictable, but only uncorrelated to specific parts of the Bell experiment. In this work, we demonstrate rigorously that this is indeed true, under assumptions related to ``no superdeterminism/no conspiracy'' concepts along with the no-signaling assumption. So if we assume that superluminal signaling is impossible, then a loophole-free test of Bell's inequality would be able to generate provably unpredictable randomness from an input source of (potentially predictable) classical randomness.

  15. Solving the accuracy-diversity dilemma via directed random walks

    NASA Astrophysics Data System (ADS)

    Liu, Jian-Guo; Shi, Kerui; Guo, Qiang

    2012-01-01

    Random walks have been successfully used to measure user or object similarities in collaborative filtering (CF) recommender systems, which achieve high accuracy but low diversity. A key challenge for a CF system is that reliably accurate results are obtained with the help of peers' recommendations, but the most useful individual recommendations are hard to find among diverse niche objects. In this paper we investigate the effect of the direction of the random walk on user similarity measurements and find that the user similarity calculated by directed random walks is inversely related to the initial node's degree. Since the ratio of small-degree users to large-degree users is very large in real data sets, the large-degree users' selections are recommended extensively by traditional CF algorithms. By tuning the direction of the user-similarity walk from neighbors to the target user, we introduce a new algorithm specifically to address the challenge of diversity in CF and show how it can be used to solve the accuracy-diversity dilemma. Without relying on any context-specific information, we are able to obtain accurate and diverse recommendations, outperforming state-of-the-art CF methods. This work suggests that the random-walk direction is an important factor in improving personalized recommendation performance.

  16. Solving the accuracy-diversity dilemma via directed random walks.

    PubMed

    Liu, Jian-Guo; Shi, Kerui; Guo, Qiang

    2012-01-01

    Random walks have been successfully used to measure user or object similarities in collaborative filtering (CF) recommender systems, which achieve high accuracy but low diversity. A key challenge for a CF system is that reliably accurate results are obtained with the help of peers' recommendations, but the most useful individual recommendations are hard to find among diverse niche objects. In this paper we investigate the effect of the direction of the random walk on user similarity measurements and find that the user similarity calculated by directed random walks is inversely related to the initial node's degree. Since the ratio of small-degree users to large-degree users is very large in real data sets, the large-degree users' selections are recommended extensively by traditional CF algorithms. By tuning the direction of the user-similarity walk from neighbors to the target user, we introduce a new algorithm specifically to address the challenge of diversity in CF and show how it can be used to solve the accuracy-diversity dilemma. Without relying on any context-specific information, we are able to obtain accurate and diverse recommendations, outperforming state-of-the-art CF methods. This work suggests that the random-walk direction is an important factor in improving personalized recommendation performance. PMID:22400636

  17. Weight distributions for turbo codes using random and nonrandom permutations

    NASA Technical Reports Server (NTRS)

    Dolinar, S.; Divsalar, D.

    1995-01-01

    This article takes a preliminary look at the weight distributions achievable for turbo codes using random, nonrandom, and semirandom permutations. Due to the recursiveness of the encoders, it is important to distinguish between self-terminating and non-self-terminating input sequences. The non-self-terminating sequences have little effect on decoder performance, because they accumulate high encoded weight until they are artificially terminated at the end of the block. From probabilistic arguments based on selecting the permutations randomly, it is concluded that the self-terminating weight-2 data sequences are the most important consideration in the design of constituent codes; higher-weight self-terminating sequences have successively decreasing importance. Also, increasing the number of codes and, correspondingly, the number of permutations makes it more and more likely that the bad input sequences will be broken up by one or more of the permuters. It is possible to design nonrandom permutations that ensure that the minimum distance due to weight-2 input sequences grows roughly as the square root of (2N), where N is the block length. However, these nonrandom permutations amplify the bad effects of higher-weight inputs, and as a result they are inferior in performance to randomly selected permutations. But there are 'semirandom' permutations that perform nearly as well as the designed nonrandom permutations with respect to weight-2 input sequences and are not as susceptible to being foiled by higher-weight inputs.
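
    A sketch of one common construction in the 'semirandom' family discussed above, an S-random (spread) interleaver: each new permuted index must differ by more than S from the indices chosen in the previous S steps, which keeps the images of weight-2 input sequences apart. The retry and restart logic is an implementation assumption.

        # S-random (spread) interleaver sketch: successive outputs must differ
        # by more than S from the last S chosen indices (retry/restart logic
        # is an implementation assumption).
        import random

        def s_random_permutation(N, S, seed=0, max_restarts=1000):
            rng = random.Random(seed)
            for _ in range(max_restarts):
                pool, out = list(range(N)), []
                ok = True
                while pool and ok:
                    for _ in range(100):             # bounded retries per slot
                        cand = rng.choice(pool)
                        if all(abs(cand - prev) > S for prev in out[-S:]):
                            out.append(cand)
                            pool.remove(cand)
                            break
                    else:
                        ok = False                   # dead end: restart
                if ok:
                    return out
            raise RuntimeError("no S-random permutation found; try a smaller S")

        perm = s_random_permutation(64, S=4)
        print(perm[:16])

    A common rule of thumb is to choose S near √(N/2); larger values rapidly make the construction infeasible.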

  18. Lasso adjustments of treatment effect estimates in randomized experiments

    PubMed Central

    Bloniarz, Adam; Liu, Hanzhong; Zhang, Cun-Hui; Sekhon, Jasjeet S.; Yu, Bin

    2016-01-01

    We provide a principled way for investigators to analyze randomized experiments when the number of covariates is large. Investigators often use linear multivariate regression to analyze randomized experiments instead of simply reporting the difference of means between treatment and control groups. Their aim is to reduce the variance of the estimated treatment effect by adjusting for covariates. If there are a large number of covariates relative to the number of observations, regression may perform poorly because of overfitting. In such cases, the least absolute shrinkage and selection operator (Lasso) may be helpful. We study the resulting Lasso-based treatment effect estimator under the Neyman–Rubin model of randomized experiments. We present theoretical conditions that guarantee that the estimator is more efficient than the simple difference-of-means estimator, and we provide a conservative estimator of the asymptotic variance, which can yield tighter confidence intervals than the difference-of-means estimator. Simulation and data examples show that Lasso-based adjustment can be advantageous even when the number of covariates is less than the number of observations. Specifically, a variant using Lasso for selection and ordinary least squares (OLS) for estimation performs particularly well, and it chooses a smoothing parameter based on combined performance of Lasso and OLS. PMID:27382153

  19. Lasso adjustments of treatment effect estimates in randomized experiments.

    PubMed

    Bloniarz, Adam; Liu, Hanzhong; Zhang, Cun-Hui; Sekhon, Jasjeet S; Yu, Bin

    2016-07-01

    We provide a principled way for investigators to analyze randomized experiments when the number of covariates is large. Investigators often use linear multivariate regression to analyze randomized experiments instead of simply reporting the difference of means between treatment and control groups. Their aim is to reduce the variance of the estimated treatment effect by adjusting for covariates. If there are a large number of covariates relative to the number of observations, regression may perform poorly because of overfitting. In such cases, the least absolute shrinkage and selection operator (Lasso) may be helpful. We study the resulting Lasso-based treatment effect estimator under the Neyman-Rubin model of randomized experiments. We present theoretical conditions that guarantee that the estimator is more efficient than the simple difference-of-means estimator, and we provide a conservative estimator of the asymptotic variance, which can yield tighter confidence intervals than the difference-of-means estimator. Simulation and data examples show that Lasso-based adjustment can be advantageous even when the number of covariates is less than the number of observations. Specifically, a variant using Lasso for selection and ordinary least squares (OLS) for estimation performs particularly well, and it chooses a smoothing parameter based on combined performance of Lasso and OLS. PMID:27382153
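
    A hedged sketch of the Lasso-then-OLS variant highlighted above: the Lasso selects covariates, and OLS refits the treatment effect on the selected set. The synthetic data and the fixed penalty are assumptions; the paper chooses the smoothing parameter from the combined performance of Lasso and OLS.

        # Lasso for covariate selection, then OLS for the adjusted treatment
        # effect (synthetic data; fixed penalty alpha is an assumption).
        import numpy as np
        from sklearn.linear_model import Lasso, LinearRegression

        rng = np.random.default_rng(0)
        n, p = 200, 500                               # more covariates than units
        X = rng.normal(size=(n, p))
        T = rng.integers(0, 2, size=n)                # randomized treatment
        y = 2.0 * T + X[:, 0] - 0.5 * X[:, 1] + rng.normal(size=n)

        lasso = Lasso(alpha=0.1).fit(X, y - y.mean())
        selected = np.flatnonzero(lasso.coef_)        # covariates kept by the Lasso
        design = np.column_stack([T, X[:, selected]])
        ols = LinearRegression().fit(design, y)
        print("unadjusted estimate:", y[T == 1].mean() - y[T == 0].mean())
        print("Lasso+OLS adjusted estimate:", ols.coef_[0])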

  20. Security of practical private randomness generation

    NASA Astrophysics Data System (ADS)

    Pironio, Stefano; Massar, Serge

    2013-01-01

    Measurements on entangled quantum systems necessarily yield outcomes that are intrinsically unpredictable if they violate a Bell inequality. This property can be used to generate certified randomness in a device-independent way, i.e., without making detailed assumptions about the internal workings of the quantum devices used to generate the random numbers. Furthermore, these numbers are also private; i.e., they appear random not only to the user but also to any adversary that might possess a perfect description of the devices. Since this process requires a small initial random seed to sample the behavior of the quantum devices and to extract uniform randomness from the raw outputs of the devices, one usually speaks of device-independent randomness expansion. The purpose of this paper is twofold. First, we point out that in most real, practical situations, where the concept of device independence is used as a protection against unintentional flaws or failures of the quantum apparatuses, it is sufficient to show that the generated string is random with respect to an adversary that holds only classical side information; i.e., proving randomness against quantum side information is not necessary. Furthermore, the initial random seed does not need to be private with respect to the adversary, provided that it is generated in a way that is independent from the measured systems. The devices, however, will generate cryptographically secure randomness that cannot be predicted by the adversary, and thus one can, given access to free public randomness, talk about private randomness generation. The theoretical tools to quantify the generated randomness according to these criteria were already introduced in S. Pironio et al. [Nature (London) 464, 1021 (2010)], but the final results were improperly formulated. The second aim of this paper is to correct this inaccurate formulation and therefore lay out a precise theoretical framework for practical device-independent randomness generation.

  1. Postprocessing for quantum random-number generators: Entropy evaluation and randomness extraction

    NASA Astrophysics Data System (ADS)

    Ma, Xiongfeng; Xu, Feihu; Xu, He; Tan, Xiaoqing; Qi, Bing; Lo, Hoi-Kwong

    2013-06-01

    Quantum random-number generators (QRNGs) can offer a means to generate information-theoretically provable random numbers, in principle. In practice, unfortunately, the quantum randomness is inevitably mixed with classical randomness due to classical noises. To distill this quantum randomness, one needs to quantify the randomness of the source and apply a randomness extractor. Here, we propose a generic framework for evaluating quantum randomness of real-life QRNGs by min-entropy, and apply it to two different existing quantum random-number systems in the literature. Moreover, we provide a guideline of QRNG data postprocessing for which we implement two information-theoretically provable randomness extractors: Toeplitz-hashing extractor and Trevisan's extractor.
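
    A minimal sketch of Toeplitz hashing, one of the two extractors the paper implements: the raw bit vector is multiplied over GF(2) by a random binary Toeplitz matrix built from a seed. In practice the output length would be set by the evaluated min-entropy of the source; here it is a fixed assumption.

        # Toeplitz-hashing randomness extractor sketch: output = T.raw mod 2,
        # where T is a binary Toeplitz matrix built from a seed (output length
        # fixed here; in practice it follows from the min-entropy evaluation).
        import numpy as np

        def toeplitz_extract(raw_bits, out_len, seed_bits):
            """seed_bits must have length out_len + len(raw_bits) - 1."""
            n = len(raw_bits)
            assert len(seed_bits) == out_len + n - 1
            # Row i is a shifted window of the seed; entries are constant along
            # diagonals, which is exactly the Toeplitz structure.
            T = np.empty((out_len, n), dtype=np.int64)
            for i in range(out_len):
                T[i] = seed_bits[i:i + n][::-1]
            return (T @ raw_bits.astype(np.int64)) % 2

        rng = np.random.default_rng(7)
        raw = rng.integers(0, 2, size=256, dtype=np.uint8)   # raw, partly random bits
        seed = rng.integers(0, 2, size=128 + 256 - 1, dtype=np.uint8)
        out = toeplitz_extract(raw, 128, seed)
        print(out[:32])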

  2. 76 FR 51056 - Notice of Random Assignment Study To Evaluate the YouthBuild Program; Request for Comment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-17

    ... Employment and Training Administration Notice of Random Assignment Study To Evaluate the YouthBuild Program... methodology for the study. In the DOL-funded and CNCS-funded sites randomly selected to participate in this... enrollment period will be required to participate in the study in order to be considered for services...

  3. A computer model allowing maintenance of large amounts of genetic variability in Mendelian populations. II. The balance of forces between linkage and random assortment.

    PubMed

    Wills, C; Miller, C

    1976-02-01

    It is shown, through theory and computer simulations of outbreeding Mendelian populations, that there may be conditions under which a balance is struck between two factors. The first is the advantage of random assortment, which will, when multilocus selection is for intermediate equilibrium values, lead to higher average heterozygosity than when linkage is introduced. There is some indication that random assortment is also advantageous when selection is toward a uniform distribution of equilibrium values. The second factor is the advantage of linkage between loci having positive epistatic interactions. When multilocus selection is for a bimodal distribution of equilibrium values, an early advantage of random assortment is replaced by a later disadvantage. Linkage disequilibrium, which in finite populations is increased only by random or selective sampling, may hinder the movement of alleles to their selective equilibria, thus leading to the advantage of random assortment. Some consequences of this approach to the structure of natural populations are discussed. PMID:1261798

  4. A Markov Chain Model for evaluating the effectiveness of randomized surveillance procedures

    SciTech Connect

    Edmunds, T.A.

    1994-01-01

    A Markov Chain Model has been developed to evaluate the effectiveness of randomized surveillance procedures. The model is applicable for surveillance systems that monitor a collection of assets by randomly selecting and inspecting the assets. The model provides an estimate of the detection probability as a function of the amount of time that an adversary would require to steal or sabotage the asset. An interactive computer code has been written to perform the necessary computations.
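
    The simplest version of this idea admits a closed form: if k of N assets are inspected uniformly at random each period, the probability that an adversary needing t periods at one asset escapes detection is (1 − k/N)^t. The sketch below evaluates that geometric formula; it is a toy stand-in, not the report's actual Markov chain model.

        # Toy detection-probability curve for k-of-N random inspection
        # (a geometric simplification, not the report's Markov chain model).
        def detection_probability(N, k, t):
            """P(target asset inspected at least once in t periods)."""
            return 1.0 - (1.0 - k / N) ** t

        for t in (1, 5, 10, 20):
            print(f"t={t:2d}: P(detect) = {detection_probability(100, 10, t):.3f}")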

  5. Flow Through Randomly Curved Manifolds

    PubMed Central

    Mendoza, M.; Succi, S.; Herrmann, H. J.

    2013-01-01

    We present a computational study of the transport properties of campylotic (intrinsically curved) media. It is found that the relation between the flow through a campylotic medium, consisting of randomly located curvature perturbations, and the average Ricci scalar of the system exhibits two distinct functional expressions, depending on whether the typical spatial extent of the curvature perturbation lies above or below the critical value maximizing the overall scalar of curvature. Furthermore, the flow through such systems as a function of the number of curvature perturbations is found to exhibit sublinear behavior for large concentrations, due to the interference between curvature perturbations leading to an overall less curved space. We have also characterized the flux through such media as a function of the local Reynolds number and the scale of interaction between impurities. For the purpose of this study, we have also developed and validated a new lattice Boltzmann model. PMID:24173367

  6. Clique percolation in random networks.

    PubMed

    Derényi, Imre; Palla, Gergely; Vicsek, Tamás

    2005-04-29

    The notion of k-clique percolation in random graphs is introduced, where k is the size of the complete subgraphs whose large scale organizations are analytically and numerically investigated. For the Erdős-Rényi graph of N vertices we obtain that the percolation transition of k-cliques takes place when the probability of two vertices being connected by an edge reaches the threshold p_c(k) = [(k-1)N]^(-1/(k-1)). At the transition point the scaling of the giant component with N is highly nontrivial and depends on k. We discuss why clique percolation is a novel and efficient approach to the identification of overlapping communities in large real networks. PMID:15904198
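    As a concrete reading of the threshold formula, the snippet below (an illustrative evaluation, not the authors' code) computes p_c(k) for a few clique sizes:

```python
def p_c(k, N):
    # k-clique percolation threshold in G(N, p): p_c(k) = [(k-1) N]^(-1/(k-1))
    return ((k - 1) * N) ** (-1.0 / (k - 1))

N = 10_000
for k in (2, 3, 4, 5):
    print(k, p_c(k, N))
# k = 2 recovers the classical giant-component threshold p_c = 1/N.
```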

  7. Ergodic theory, randomness, and "chaos".

    PubMed

    Ornstein, D S

    1989-01-13

    Ergodic theory is the theory of the long-term statistical behavior of dynamical systems. The baker's transformation is an object of ergodic theory that provides a paradigm for the possibility of deterministic chaos. It can now be shown that this connection is more than an analogy and that at some level of abstraction a large number of systems governed by Newton's laws are the same as the baker's transformation. Going to this level of abstraction helps to organize the possible kinds of random behavior. The theory also gives new concrete results. For example, one can show that the same process could be produced by a mechanism governed by Newton's laws or by a mechanism governed by coin tossing. It also gives a statistical analog of structural stability. PMID:17747421

  8. Extremal properties of random trees

    NASA Astrophysics Data System (ADS)

    Ben-Naim, E.; Krapivsky, P. L.; Majumdar, Satya N.

    2001-09-01

    We investigate extremal statistical properties such as the maximal and the minimal heights of randomly generated binary trees. By analyzing the master evolution equations we show that the cumulative distribution of extremal heights approaches a traveling wave form. The wave front in the minimal case is governed by the small-extremal-height tail of the distribution, and conversely, the front in the maximal case is governed by the large-extremal-height tail of the distribution. We determine several statistical characteristics of the extremal height distribution analytically. In particular, the expected minimal and maximal heights grow logarithmically with the tree size N: h_min ≈ v_min ln N and h_max ≈ v_max ln N, with v_min = 0.373365... and v_max = 4.31107..., respectively. Corrections to this asymptotic behavior are of order O(ln ln N).
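    The v_max constant can be probed numerically with binary search trees built from random insertion orders, a common random-binary-tree ensemble whose expected height is known to grow like ~4.311 ln N, matching v_max above; that this ensemble coincides with the paper's is an assumption of the sketch, and convergence of the ratio is logarithmically slow.

```python
import math, random

def random_bst_height(n):
    """Height of a binary search tree built by inserting a random permutation
    of n keys; the tree is stored as parallel arrays (key, left, right)."""
    keys, left, right = [], [], []
    max_depth = 0
    for x in random.sample(range(n), n):
        if not keys:
            keys.append(x); left.append(-1); right.append(-1)
            continue
        i, d = 0, 0
        while True:
            d += 1
            side = left if x < keys[i] else right
            if side[i] < 0:
                side[i] = len(keys)   # attach new node here
                break
            i = side[i]
        keys.append(x); left.append(-1); right.append(-1)
        max_depth = max(max_depth, d)
    return max_depth

n = 20_000
ratio = sum(random_bst_height(n) for _ in range(5)) / 5 / math.log(n)
print(ratio)  # drifts toward ~4.311 only for very large n
```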

  9. Clique Percolation in Random Networks

    NASA Astrophysics Data System (ADS)

    Derényi, Imre; Palla, Gergely; Vicsek, Tamás

    2005-04-01

    The notion of k-clique percolation in random graphs is introduced, where k is the size of the complete subgraphs whose large scale organizations are analytically and numerically investigated. For the Erdős-Rényi graph of N vertices we obtain that the percolation transition of k-cliques takes place when the probability of two vertices being connected by an edge reaches the threshold pc(k)=[(k-1)N]-1/(k-1). At the transition point the scaling of the giant component with N is highly nontrivial and depends on k. We discuss why clique percolation is a novel and efficient approach to the identification of overlapping communities in large real networks.

  10. Structure of random bidisperse foam.

    SciTech Connect

    Reinelt, Douglas A.; van Swol, Frank B.; Kraynik, Andrew Michael

    2005-02-01

    The Surface Evolver was used to compute the equilibrium microstructure of random soap foams with bidisperse cell-size distributions and to evaluate topological and geometric properties of the foams and individual cells. The simulations agree with the experimental data of Matzke and Nestler for the probability ρ(F) of finding cells with F faces and its dependence on the fraction of large cells. The simulations also agree with the theory for isotropic Plateau polyhedra (IPP), which describes the F-dependence of cell geometric properties, such as surface area, edge length, and mean curvature (diffusive growth rate); this is consistent with results for polydisperse foams. Cell surface areas are about 10% greater than spheres of equal volume, which leads to a simple but accurate relation for the surface free energy density of foams. The Aboav-Weaire law is not valid for bidisperse foams.

  11. Random vibration of compliant wall

    NASA Technical Reports Server (NTRS)

    Yang, J.-N.; Heller, R. A.

    1976-01-01

    The paper is concerned with the realistic case of two-dimensional random motion of a membrane with bending stiffness supported on a viscoelastic spring substrate and on an elastic base plate under both subsonic and supersonic boundary layer turbulence. The cross-power spectral density of surface displacements is solved in terms of design variables of the compliant wall - such as the dimensions and material properties of the membrane (Mylar), substrate (PVC foam), and panel (aluminum) - so that a sensitivity analysis can be made to examine the influence of each design variable on the surface response statistics. Three numerical examples typical of compliant wall design are worked out and their response statistics in relation to wave drag and roughness drag are assessed. The results can serve as a guideline for experimental investigation of the drag reduction concept through the use of a compliant wall.

  12. Strategic use of number representation is independent of test instruction in random number generation.

    PubMed

    Strenge, Hans; Rogge, Carolin

    2010-04-01

    The effects of different instructions on verbal random number generation were examined in 40 healthy students who attempted to generate random sequences of the digits 1 to 6. Two groups of 20 received different instructions with alternative numerical representations. The Symbolic group (Arabic digits) was instructed to randomize while continuously using the analogy of selecting and replacing numbered balls from a hat, whereas the Nonsymbolic group (arrays of dots) was instructed to imagine repeatedly throwing a die. When asked for self-reports on their strategies, participants reported spontaneously occurring visuospatial imagination of a mental number line (42%) or imagining throwing a die (23%). Individual number representation was not affected by the initial instruction. There were no differences in randomization performance by group. Comprehensive understanding of the nature of the randomization task requires considering individual differences in the construction of mental models. PMID:20499555

  13. In vitro selection of optimal DNA substrates for T4 RNA ligase

    NASA Technical Reports Server (NTRS)

    Harada, Kazuo; Orgel, Leslie E.

    1993-01-01

    We have used in vitro selection techniques to characterize DNA sequences that are ligated efficiently by T4 RNA ligase. We find that the ensemble of selected sequences ligated about 10 times as efficiently as the random mixture of sequences used as the input for selection. Surprisingly, the majority of the selected sequences approximated a well-defined consensus sequence.

  14. Selective Exposure and Retention of Political Advertising: A Regional Comparison.

    ERIC Educational Resources Information Center

    Surlin, Stuart H.; Gordon, Thomas F.

    The results presented in this article are but a portion of the information gathered in a larger survey examining the relative roles of "selective exposure" to and "selective retention" of political advertising during the 1972 presidential election. Random samples in two metropolitan areas in different regions of the country (Atlanta, Ga., n=281;…

  15. The Effect of Speed Alterations on Tempo Note Selection.

    ERIC Educational Resources Information Center

    Madsen, Clifford K.; And Others

    1986-01-01

    Investigated the tempo note preferences of 100 randomly selected college-level musicians using familiar orchestral music as stimuli. Subjects heard selections at increased, decreased, and unaltered tempi. Results showed musicians were not accurate in estimating original tempo and showed consistent preference for faster than actual tempo.…

  16. Quantifying consistent individual differences in habitat selection.

    PubMed

    Leclerc, Martin; Vander Wal, Eric; Zedrosser, Andreas; Swenson, Jon E; Kindberg, Jonas; Pelletier, Fanie

    2016-03-01

    Habitat selection is a fundamental behaviour that links individuals to the resources required for survival and reproduction. Although natural selection acts on an individual's phenotype, research on habitat selection often pools inter-individual patterns to provide inferences on the population scale. Here, we expanded a traditional approach of quantifying habitat selection at the individual level to explore the potential for consistent individual differences of habitat selection. We used random coefficients in resource selection functions (RSFs) and repeatability estimates to test for variability in habitat selection. We applied our method to a detailed dataset of GPS relocations of brown bears (Ursus arctos) taken over a period of 6 years, and assessed whether they displayed repeatable individual differences in habitat selection toward two habitat types: bogs and recent timber-harvest cut blocks. In our analyses, we controlled for the availability of habitat, i.e. the functional response in habitat selection. Repeatability estimates of habitat selection toward bogs and cut blocks were 0.304 and 0.420, respectively. Therefore, 30.4 and 42.0 % of the population-scale habitat selection variability for bogs and cut blocks, respectively, was due to differences among individuals, suggesting that consistent individual variation in habitat selection exists in brown bears. Using simulations, we posit that repeatability values of habitat selection are not related to the value and significance of β estimates in RSFs. Although individual differences in habitat selection could be the results of non-exclusive factors, our results illustrate the evolutionary potential of habitat selection. PMID:26597548

  17. Lower bounds for randomized Exclusive Write PRAMs

    SciTech Connect

    MacKenzie, P.D.

    1995-05-02

    In this paper we study the question: How useful is randomization in speeding up Exclusive Write PRAM computations? Our results give further evidence that randomization is of limited use in these types of computations. First we examine a compaction problem on both the CREW and EREW PRAM models, and we present randomized lower bounds which match the best deterministic lower bounds known. (For the CREW PRAM model, the lower bound is asymptotically optimal.) These are the first non-trivial randomized lower bounds known for the compaction problem on these models. We show that our lower bounds also apply to the problem of approximate compaction. Next we examine the problem of computing boolean functions on the CREW PRAM model, and we present a randomized lower bound, which improves on the previous best randomized lower bound for many boolean functions, including the OR function. (The previous lower bounds for these functions were asymptotically optimal, but we improve the constant multiplicative factor.) We also give an alternate proof for the randomized lower bound on PARITY, which was already optimal to within a constant additive factor. Lastly, we give a randomized lower bound for integer merging on an EREW PRAM which matches the best deterministic lower bound known. In all our proofs, we use the Random Adversary method, which has previously only been used for proving lower bounds on models with Concurrent Write capabilities. Thus this paper also serves to illustrate the power and generality of this method for proving parallel randomized lower bounds.

  18. Rare attributes in finite universe: Hypotheses testing specification and exact randomized upper confidence bounds

    SciTech Connect

    Wright, T.

    1993-03-01

    When attributes are rare and few or none are observed in the selected sample from a finite universe, sampling statisticians are increasingly being challenged to use whatever methods are available to declare with high probability or confidence that the universe is near or completely attribute-free. This is especially true when the attribute is undesirable. Approximations such as those based on normal theory are frequently inadequate with rare attributes. For simple random sampling without replacement, an appropriate probability distribution for statistical inference is the hypergeometric distribution. But even with the hypergeometric distribution, the investigator is limited from making claims of attribute-free with high confidence unless the sample size is quite large using nonrandomized techniques. In the hypergeometric setting with rare attributes, exact randomized tests of hypothesis are investigated to determine the effect on power of how one specifies the null hypothesis. In particular, specifying the null hypothesis as zero attributes does not always yield maximum possible power. We also consider the hypothesis specification question under complex sampling designs including stratified random sampling and two-stage cluster sampling (one case involves random selection at first stage and another case involves probability proportional to size without replacement selection at first stage). Also under simple random sampling, this article defines and presents a simple algorithm for the construction of exact "randomized" upper confidence bounds which permit one to possibly report tighter bounds than those exact bounds obtained using "nonrandomized" methods.
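    A minimal sketch of the kind of exact bound discussed here, using scipy's hypergeometric distribution; the linear search and function names are illustrative, not the article's algorithm. Passing a Uniform(0,1) draw u interpolates between adjacent nonrandomized bounds, which is what lets the randomized bound be tighter on average.

```python
from scipy.stats import hypergeom
import random

def exact_upper_bound(N, n, x, alpha=0.05, u=1.0):
    """Exact upper (1 - alpha) confidence bound on the number D of attribute
    items in a universe of size N, given x attributes observed in a simple
    random sample (without replacement) of size n. u = 1 gives the usual
    nonrandomized bound; u ~ Uniform(0,1) gives the randomized bound."""
    best = x
    for D in range(x, N - n + x + 1):
        # Keep D in the confidence set while P(X < x) + u * P(X = x) > alpha.
        p = hypergeom.cdf(x - 1, N, D, n) + u * hypergeom.pmf(x, N, D, n)
        if p > alpha:
            best = D
    return best

N, n, x = 1000, 50, 0
print(exact_upper_bound(N, n, x))                     # nonrandomized
print(exact_upper_bound(N, n, x, u=random.random()))  # randomized, never larger
```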

  19. Non-local MRI denoising using random sampling.

    PubMed

    Hu, Jinrong; Zhou, Jiliu; Wu, Xi

    2016-09-01

    In this paper, we propose a random sampling non-local means (SNLM) algorithm to eliminate noise in 3D MRI datasets. Non-local means (NLM) algorithms have been implemented efficiently for MRI denoising, but are always limited by high computational complexity. Compared to conventional methods, which raster through the entire search window when computing similarity weights, the proposed SNLM algorithm randomly selects a small subset of voxels, which dramatically decreases the computational burden while achieving competitive denoising results. Moreover, a structure tensor encapsulating high-order information was introduced as an optimal sampling pattern for further improvement. Numerical experiments demonstrated that the proposed SNLM method achieves a good balance between denoising quality and computational efficiency. At a relative sampling ratio of ξ = 0.05, SNLM removes noise as effectively as full NLM, while the running time is reduced to 1/20 of NLM's. PMID:27114338
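    The core trick, random sampling of the search window rather than an exhaustive raster, is easy to sketch. The 2D single-pixel version below is an illustration under assumed parameter names, not the paper's 3D implementation, and it omits the structure-tensor sampling pattern:

```python
import numpy as np

def snlm_pixel(img, i, j, patch=3, search=10, n_samples=40, h=0.1, rng=None):
    """Denoise one pixel by non-local means, comparing its patch against a
    random subset of patches in the search window instead of all of them."""
    rng = np.random.default_rng() if rng is None else rng
    r = patch // 2
    pad = np.pad(img, r + search, mode="reflect")
    ci, cj = i + r + search, j + r + search
    ref = pad[ci - r:ci + r + 1, cj - r:cj + r + 1]
    # Draw random offsets within the search window (the SNLM sampling step).
    offs = rng.integers(-search, search + 1, size=(n_samples, 2))
    num = den = 0.0
    for di, dj in offs:
        cand = pad[ci + di - r:ci + di + r + 1, cj + dj - r:cj + dj + r + 1]
        w = np.exp(-np.mean((ref - cand) ** 2) / h ** 2)  # patch-similarity weight
        num += w * pad[ci + di, cj + dj]
        den += w
    return num / den

img = np.random.default_rng(0).random((64, 64))
noisy = img + 0.05 * np.random.default_rng(1).normal(size=img.shape)
print(snlm_pixel(noisy, 32, 32))
```

    Applying snlm_pixel over every voxel reproduces the flavor of the method at a fraction of the full NLM cost.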

  1. Diffusive random laser modes under a spatiotemporal scope.

    PubMed

    García-Revilla, Sara; Fernández, Joaquín; Barredo-Zuriarrain, Macarena; Carlos, Luís D; Pecoraro, Edison; Iparraguirre, Ignacio; Azkargorta, Jon; Balda, Rolindes

    2015-01-26

    At present the prediction and characterization of the emission output of a diffusive random laser remains a challenge, despite the variety of investigated materials and theoretical interpretations given up to now. Here, a new mode selection method, based on spatial filtering and ultrafast detection, which allows to separate individual lasing modes and follow their temporal evolution is presented. In particular, the work explores the random laser behavior of a ground powder of an organic-inorganic hybrid compound based on Rhodamine B incorporated into a di-ureasil host. The experimental approach gives direct access to the mode structure and dynamics, shows clear modal relaxation oscillations, and illustrates the lasing modes stochastic behavior of this diffusive scattering system. The effect of the excitation energy on its modal density is also investigated. Finally, imaging measurements reveal the dominant role of diffusion over amplification processes in this kind of unconventional lasers. PMID:25835903

  2. Random geometric graph description of connectedness percolation in rod systems

    NASA Astrophysics Data System (ADS)

    Chatterjee, Avik P.; Grimaldi, Claudio

    2015-09-01

    The problem of continuum percolation in dispersions of rods is reformulated in terms of weighted random geometric graphs. Nodes (or sites or vertices) in the graph represent spatial locations occupied by the centers of the rods. The probability that an edge (or link) connects any randomly selected pair of nodes depends upon the rod volume fraction as well as the distribution over their sizes and shapes, and also upon quantities that characterize their state of dispersion (such as the orientational distribution function). We employ the observation that contributions from closed loops of connected rods are negligible in the limit of large aspect ratios to obtain percolation thresholds that are fully equivalent to those calculated within the second-virial approximation of the connectedness Ornstein-Zernike equation. Our formulation can account for effects due to interactions between the rods, and many-body features can be partially addressed by suitable choices for the edge probabilities.
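    In code, the reformulation amounts to placing nodes at random positions and linking pairs with a kernel probability. The exponential kernel below is a placeholder for the rod-contact probability, which in the paper depends on volume fraction, size/shape distribution, and orientation:

```python
import numpy as np

def weighted_rgg(n=200, box=1.0, p_edge=lambda d: np.exp(-d / 0.1), seed=0):
    """Toy weighted random geometric graph: nodes are uniform random points in
    a box, and each pair is linked with a distance-dependent probability."""
    rng = np.random.default_rng(seed)
    pts = rng.random((n, 3)) * box
    edges = []
    for i in range(n):
        d = np.linalg.norm(pts[i + 1:] - pts[i], axis=1)   # distances to later nodes
        hit = rng.random(d.size) < p_edge(d)               # Bernoulli link placement
        edges += [(i, i + 1 + j) for j in np.flatnonzero(hit)]
    return pts, edges

pts, edges = weighted_rgg()
print(len(edges))
```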

  3. Matter-wave analog of an optical random laser

    SciTech Connect

    Plodzien, Marcin; Sacha, Krzysztof

    2011-08-15

    The accumulation of atoms in the lowest energy level of a trap and the subsequent outcoupling of these atoms is a realization of a matter-wave analog of a conventional optical laser. Optical random lasers require materials that provide optical gain but, contrary to conventional lasers, the modes are determined by multiple scattering and not a cavity. We show that a Bose-Einstein condensate can be loaded in a spatially correlated disorder potential prepared in such a way that the Anderson localization phenomenon operates as a bandpass filter. A multiple scattering process selects atoms with certain momenta and determines the laser modes; this represents a matter-wave analog of an optical random laser.

  4. Randomized Controlled Trials of Add-On Antidepressants in Schizophrenia

    PubMed Central

    Joffe, Grigori; Stenberg, Jan-Henry

    2015-01-01

    Background: Despite adequate treatment with antipsychotics, a substantial number of patients with schizophrenia demonstrate only suboptimal clinical outcome. To overcome this challenge, various psychopharmacological combination strategies have been used, including antidepressants added to antipsychotics. Methods: To analyze the efficacy of add-on antidepressants for the treatment of negative, positive, cognitive, depressive, and antipsychotic-induced extrapyramidal symptoms in schizophrenia, published randomized controlled trials assessing the efficacy of adjunctive antidepressants in schizophrenia were reviewed using the following parameters: baseline clinical characteristics and number of patients, their on-going antipsychotic treatment, dosage of the add-on antidepressants, duration of the trial, efficacy measures, and outcomes. Results: There were 36 randomized controlled trials reported in 41 journal publications (n=1582). The antidepressants used were the selective serotonin reuptake inhibitors, duloxetine, imipramine, mianserin, mirtazapine, nefazodone, reboxetine, trazodone, and bupropion. Mirtazapine and mianserin showed somewhat consistent efficacy for negative symptoms and both seemed to enhance neurocognition. Trazodone and nefazodone appeared to improve the antipsychotics-induced extrapyramidal symptoms. Imipramine and duloxetine tended to improve depressive symptoms. No clear evidence supporting selective serotonin reuptake inhibitors’ efficacy on any clinical domain of schizophrenia was found. Add-on antidepressants did not worsen psychosis. Conclusions: Despite a substantial number of randomized controlled trials, the overall efficacy of add-on antidepressants in schizophrenia remains uncertain mainly due to methodological issues. Some differences in efficacy on several schizophrenia domains seem, however, to exist and to vary by the antidepressant subgroups—plausibly due to differences in the mechanisms of action. Antidepressants may not worsen psychosis.

  5. Error Threshold of Fully Random Eigen Model

    NASA Astrophysics Data System (ADS)

    Li, Duo-Fang; Cao, Tian-Guang; Geng, Jin-Peng; Qiao, Li-Hua; Gu, Jian-Zhong; Zhan, Yong

    2015-01-01

    Species evolution is essentially a random process of interaction between biological populations and their environments. As a result, some physical parameters in evolution models are subject to statistical fluctuations. In this work, two important parameters in the Eigen model, the fitness and mutation rate, are treated as Gaussian distributed random variables simultaneously to examine the property of the error threshold. Numerical simulation results show that the error threshold in the fully random model appears as a crossover region instead of a phase transition point, and as the fluctuation strength increases the crossover region becomes smoother and smoother. Furthermore, it is shown that the randomization of the mutation rate plays a dominant role in changing the error threshold in the fully random model, which is consistent with the existing experimental data. The implication of the threshold change due to the randomization for antiviral strategies is discussed.

  6. The MCNP5 Random number generator

    SciTech Connect

    Brown, F. B.; Nagaya, Y.

    2002-01-01

    MCNP and other Monte Carlo particle transport codes use random number generators to produce random variates from a uniform distribution on the interval (0,1). These random variates are then used in subsequent sampling from probability distributions to simulate the physical behavior of particles during the transport process. This paper describes the new random number generator developed for MCNP Version 5. The new generator will optionally preserve the exact random sequence of previous versions and is entirely conformant to the Fortran-90 standard, hence completely portable. In addition, skip-ahead algorithms have been implemented to efficiently initialize the generator for new histories, a capability that greatly simplifies parallel algorithms. Further, the precision of the generator has been increased, extending the period by a factor of 10^5. Finally, the new generator has been subjected to 3 different sets of rigorous and extensive statistical tests to verify that it produces a sufficiently random sequence.
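    The skip-ahead capability mentioned here is the standard O(log n) jump for linear congruential generators, sketched below with Knuth's MMIX constants as stand-ins (not MCNP5's actual parameters):

```python
def lcg_skip(x, g, c, m, n):
    """Advance the LCG x -> (g*x + c) mod m by n steps in O(log n), using
    x_{k+n} = G*x_k + C (mod m), where the affine map (G, C) is built by
    repeated squaring. This is how each particle history can be handed its
    own stride into the random sequence without stepping one at a time."""
    G, C = 1, 0                                   # identity affine map
    while n:
        if n & 1:
            G, C = (G * g) % m, (C * g + c) % m   # compose accumulated map with step
        g, c = (g * g) % m, (c * g + c) % m       # square the step map
        n >>= 1
    return (G * x + C) % m

# Check against naive stepping (Knuth MMIX constants, for illustration only):
g, c, m = 6364136223846793005, 1442695040888963407, 2**64
x = 12345
for _ in range(1000):
    x = (g * x + c) % m
print(x == lcg_skip(12345, g, c, m, 1000))  # True
```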

  7. Some physical applications of random hierarchical matrices

    SciTech Connect

    Avetisov, V. A.; Bikulov, A. Kh.; Vasilyev, O. A.; Nechaev, S. K.; Chertovich, A. V.

    2009-09-15

    The investigation of spectral properties of random block-hierarchical matrices as applied to dynamic and structural characteristics of complex hierarchical systems with disorder is proposed for the first time. Peculiarities of dynamics on random ultrametric energy landscapes are discussed and the statistical properties of scale-free and polyscale (depending on the topological characteristics under investigation) random hierarchical networks (graphs) obtained by multiple mapping are considered.

  8. Private randomness expansion with untrusted devices

    NASA Astrophysics Data System (ADS)

    Colbeck, Roger; Kent, Adrian

    2011-03-01

    Randomness is an important resource for many applications, from gambling to secure communication. However, guaranteeing that the output from a candidate random source could not have been predicted by an outside party is a challenging task, and many supposedly random sources used today provide no such guarantee. Quantum solutions to this problem exist, for example a device which internally sends a photon through a beamsplitter and observes on which side it emerges, but, presently, such solutions require the user to trust the internal workings of the device. Here, we seek to go beyond this limitation by asking whether randomness can be generated using untrusted devices—even ones created by an adversarial agent—while providing a guarantee that no outside party (including the agent) can predict it. Since this is easily seen to be impossible unless the user has an initially private random string, the task we investigate here is private randomness expansion. We introduce a protocol for private randomness expansion with untrusted devices which is designed to take as input an initially private random string and produce as output a longer private random string. We point out that private randomness expansion protocols are generally vulnerable to attacks that can render the initial string partially insecure, even though that string is used only inside a secure laboratory; our protocol is designed to remove this previously unconsidered vulnerability by privacy amplification. We also discuss extensions of our protocol designed to generate an arbitrarily long random string from a finite initially private random string. The security of these protocols against the most general attacks is left as an open question.

  9. Social Selection and Religiously Selective Faith Schools

    ERIC Educational Resources Information Center

    Pettinger, Paul

    2014-01-01

    This article reviews recent research looking at the socio-economic profile of pupils at faith schools and the contribution religiously selective admission arrangements make. It finds that selection by faith leads to greater social segregation and is open to manipulation. It urges that such selection should end, making the state-funded school…

  10. Using histograms to introduce randomization in the generation of ensembles of decision trees

    DOEpatents

    Kamath, Chandrika; Cantu-Paz, Erick; Littau, David

    2005-02-22

    A system for decision tree ensembles that includes a module to read the data, a module to create a histogram, a module to evaluate a potential split according to some criterion using the histogram, a module to select a split point randomly in an interval around the best split, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method includes the steps of reading the data; creating a histogram; evaluating a potential split according to some criterion using the histogram, selecting a split point randomly in an interval around the best split, splitting the data, and combining multiple decision trees in ensembles.
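    A compact sketch of the split-selection idea, under assumed details (variance reduction as the criterion, one numeric feature); the patent's modules cover the full ensemble pipeline:

```python
import numpy as np

def histogram_random_split(x, y, bins=32, rng=None):
    """Pick a split threshold for one feature: build a histogram, score each
    bin boundary by variance reduction (an assumed criterion), then draw the
    threshold uniformly at random in an interval around the best boundary."""
    rng = np.random.default_rng() if rng is None else rng
    edges = np.histogram_bin_edges(x, bins=bins)
    best, best_score = None, -np.inf
    for t in edges[1:-1]:                       # interior bin boundaries
        l, r = y[x <= t], y[x > t]
        if len(l) and len(r):
            score = y.var() - (len(l) * l.var() + len(r) * r.var()) / len(y)
            if score > best_score:
                best, best_score = t, score
    w = edges[1] - edges[0]                     # one bin width
    return rng.uniform(best - w / 2, best + w / 2)

rng = np.random.default_rng(0)
x = rng.random(1000)
y = (x > 0.6).astype(float) + rng.normal(0, 0.1, 1000)
print(histogram_random_split(x, y, rng=rng))    # near 0.6, jittered within a bin
```

    The jitter is the point: trees in the ensemble that see similar data still split at slightly different thresholds, which decorrelates their errors.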

  11. Disturbing the random-energy landscape

    NASA Astrophysics Data System (ADS)

    Halpin-Healy, Timothy; Herbert, Devorah

    1993-09-01

    We examine the effects of correlated perturbations upon globally optimal paths through a random-energy landscape. Motivated by Zhang's early numerical investigations [Phys. Rev. Lett. 59, 2125 (1987)] into ground-state instabilities of disordered systems, as well as the work of Shapir [Phys. Rev. Lett. 66, 1473 (1991)] on random perturbations of roughened manifolds, we have studied the specific case of random bond interfaces unsettled by small random fields, confirming recent predictions for the instability exponents. Implications for disordered magnets and growing surfaces are discussed.

  12. The Theory of Random Laser Systems

    SciTech Connect

    Xunya Jiang

    2002-06-27

    Studies of random laser systems are a new direction with promising potential applications and theoretical interest. The research is based on the theories of localization and laser physics. So far, the research shows that there are random lasing modes inside the systems, which are quite different from those of common laser systems. From the properties of the random lasing modes, one can understand the phenomena observed in experiments, such as multi-peak and anisotropic spectra, lasing-mode number saturation, mode competition, and dynamic processes. To summarize, this dissertation has contributed the following to the study of random laser systems: (1) by comparing the Lamb theory with the Letokhov theory, general formulas for the threshold length or gain of random laser systems were obtained; (2) the vital weakness of previous time-independent methods in random laser research was pointed out; (3) a new model combining the FDTD method with semi-classical laser theory was developed. The solutions of this model explained the experimental results of multi-peak and anisotropic emission spectra, and predicted the saturation of the lasing-mode number and the length of localized lasing modes; (4) theoretical (Lamb theory) and numerical (FDTD and transfer-matrix) studies of the origin of localized lasing modes in random laser systems were carried out; and (5) random lasing modes were proposed as a new path to study wave localization in random systems, with a prediction of the lasing-threshold discontinuity at the mobility edge.

  13. True random numbers from amplified quantum vacuum.

    PubMed

    Jofre, M; Curty, M; Steinlechner, F; Anzolin, G; Torres, J P; Mitchell, M W; Pruneri, V

    2011-10-10

    Random numbers are essential for applications ranging from secure communications to numerical simulation and quantitative finance. Algorithms can rapidly produce pseudo-random outcomes, series of numbers that mimic most properties of true random numbers while quantum random number generators (QRNGs) exploit intrinsic quantum randomness to produce true random numbers. Single-photon QRNGs are conceptually simple but produce few random bits per detection. In contrast, vacuum fluctuations are a vast resource for QRNGs: they are broad-band and thus can encode many random bits per second. Direct recording of vacuum fluctuations is possible, but requires shot-noise-limited detectors, at the cost of bandwidth. We demonstrate efficient conversion of vacuum fluctuations to true random bits using optical amplification of vacuum and interferometry. Using commercially-available optical components we demonstrate a QRNG at a bit rate of 1.11 Gbps. The proposed scheme has the potential to be extended to 10 Gbps and even up to 100 Gbps by taking advantage of high speed modulation sources and detectors for optical fiber telecommunication devices. PMID:21997077

  14. Random packing of spheres in Menger sponge

    NASA Astrophysics Data System (ADS)

    Cieśla, Michał; Barbasz, Jakub

    2013-06-01

    Random packing of spheres inside fractal collectors of dimension 2 < d < 3 is studied numerically using the Random Sequential Adsorption (RSA) algorithm. The paper focuses mainly on the measurement of the random packing saturation limit. Additionally, scaling properties of density autocorrelations in the obtained packing are analyzed. The RSA kinetics coefficients are also measured. The obtained results allow testing of the phenomenological relation between random packing saturation density and collector dimension. Additionally, the performed simulations together with previously obtained results confirm that, in general, the known dimensional relations are obeyed by systems having non-integer dimension, at least for d < 3.
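    The RSA algorithm itself is short; here is a 2D toy version for disks in the unit square (the paper's setting, spheres inside a Menger sponge, replaces this uniform proposal domain with the fractal collector):

```python
import numpy as np

def rsa_packing(radius=0.03, attempts=50_000, seed=0):
    """Random sequential adsorption of equal disks in the unit square:
    propose uniform random centers, accept when there is no overlap with
    disks already placed, and never move or remove a disk."""
    rng = np.random.default_rng(seed)
    centers = np.empty((0, 2))
    for _ in range(attempts):
        p = rng.random(2)
        if centers.size == 0 or np.min(np.sum((centers - p) ** 2, axis=1)) >= (2 * radius) ** 2:
            centers = np.vstack([centers, p])
    return centers

c = rsa_packing()
print(len(c), len(c) * np.pi * 0.03 ** 2)  # disk count and covered area fraction
```

    With enough attempts the covered fraction approaches the RSA jamming limit for the chosen geometry; boundary effects are ignored in this sketch.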

  15. Random packing of spheres in Menger sponge.

    PubMed

    Cieśla, Michał; Barbasz, Jakub

    2013-06-01

    Random packing of spheres inside fractal collectors of dimension 2 < d < 3 is studied numerically using the Random Sequential Adsorption (RSA) algorithm. The paper focuses mainly on the measurement of the random packing saturation limit. Additionally, scaling properties of density autocorrelations in the obtained packing are analyzed. The RSA kinetics coefficients are also measured. The obtained results allow testing of the phenomenological relation between random packing saturation density and collector dimension. Additionally, the performed simulations together with previously obtained results confirm that, in general, the known dimensional relations are obeyed by systems having non-integer dimension, at least for d < 3. PMID:23758392

  16. Random Copolymer: Gaussian Variational Approach

    NASA Astrophysics Data System (ADS)

    Moskalenko, A.; Kuznetsov, Yu. A.; Dawson, K. A.

    1997-03-01

    We study the phase transitions of a random copolymer chain with quenched disorder. We calculate the average over the quenched disorder in replica space and apply a Gaussian variational approach based on a generic quadratic trial Hamiltonian in terms of the correlation functions of monomer Fourier coordinates. This has the advantage that it allows us to incorporate fluctuations of the density, determined self-consistently, and to study collapse, phase separation transitions and the onset of the freezing transition within the same mean field theory. The effective free energy of the system is derived analytically and analyzed numerically in the one-step Parisi scheme. Such quantities as the radius of gyration, end-to-end distance or the average value of the overlap between different replicas are treated as observables and evaluated by introducing appropriate external fields to the Hamiltonian. As a result we obtain the phase diagram in terms of model parameters, scaling for the freezing transition and the dependence of correlation functions on the chain index.

  17. Random hypergraphs and their applications

    NASA Astrophysics Data System (ADS)

    Ghoshal, Gourab; Zlatić, Vinko; Caldarelli, Guido; Newman, M. E. J.

    2009-06-01

    In the last few years we have witnessed the emergence, primarily in online communities, of new types of social networks that require for their representation more complex graph structures than have been employed in the past. One example is the folksonomy, a tripartite structure of users, resources, and tags—labels collaboratively applied by the users to the resources in order to impart meaningful structure on an otherwise undifferentiated database. Here we propose a mathematical model of such tripartite structures that represents them as random hypergraphs. We show that it is possible to calculate many properties of this model exactly in the limit of large network size and we compare the results against observations of a real folksonomy, that of the online photography website Flickr. We show that in some cases the model matches the properties of the observed network well, while in others there are significant differences, which we find to be attributable to the practice of multiple tagging, i.e., the application by a single user of many tags to one resource or one tag to many resources.

  18. Random hypergraphs and their applications.

    PubMed

    Ghoshal, Gourab; Zlatić, Vinko; Caldarelli, Guido; Newman, M E J

    2009-06-01

    In the last few years we have witnessed the emergence, primarily in online communities, of new types of social networks that require for their representation more complex graph structures than have been employed in the past. One example is the folksonomy, a tripartite structure of users, resources, and tags-labels collaboratively applied by the users to the resources in order to impart meaningful structure on an otherwise undifferentiated database. Here we propose a mathematical model of such tripartite structures that represents them as random hypergraphs. We show that it is possible to calculate many properties of this model exactly in the limit of large network size and we compare the results against observations of a real folksonomy, that of the online photography website Flickr. We show that in some cases the model matches the properties of the observed network well, while in others there are significant differences, which we find to be attributable to the practice of multiple tagging, i.e., the application by a single user of many tags to one resource or one tag to many resources. PMID:19658575

  19. Dynamic computing random access memory

    NASA Astrophysics Data System (ADS)

    Traversa, F. L.; Bonani, F.; Pershin, Y. V.; Di Ventra, M.

    2014-07-01

    The present von Neumann computing paradigm involves a significant amount of information transfer between a central processing unit and memory, with concomitant limitations in the actual execution speed. However, it has been recently argued that a different form of computation, dubbed memcomputing (Di Ventra and Pershin 2013 Nat. Phys. 9 200-2) and inspired by the operation of our brain, can resolve the intrinsic limitations of present day architectures by allowing for computing and storing of information on the same physical platform. Here we show a simple and practical realization of memcomputing that utilizes easy-to-build memcapacitive systems. We name this architecture dynamic computing random access memory (DCRAM). We show that DCRAM provides massively-parallel and polymorphic digital logic, namely it allows for different logic operations with the same architecture, by varying only the control signals. In addition, by taking into account realistic parameters, its energy expenditures can be as low as a few fJ per operation. DCRAM is fully compatible with CMOS technology, can be realized with current fabrication facilities, and therefore can really serve as an alternative to the present computing technology.

  20. Dynamic computing random access memory.

    PubMed

    Traversa, F L; Bonani, F; Pershin, Y V; Di Ventra, M

    2014-07-18

    The present von Neumann computing paradigm involves a significant amount of information transfer between a central processing unit and memory, with concomitant limitations in the actual execution speed. However, it has been recently argued that a different form of computation, dubbed memcomputing (Di Ventra and Pershin 2013 Nat. Phys. 9 200-2) and inspired by the operation of our brain, can resolve the intrinsic limitations of present day architectures by allowing for computing and storing of information on the same physical platform. Here we show a simple and practical realization of memcomputing that utilizes easy-to-build memcapacitive systems. We name this architecture dynamic computing random access memory (DCRAM). We show that DCRAM provides massively-parallel and polymorphic digital logic, namely it allows for different logic operations with the same architecture, by varying only the control signals. In addition, by taking into account realistic parameters, its energy expenditures can be as low as a few fJ per operation. DCRAM is fully compatible with CMOS technology, can be realized with current fabrication facilities, and therefore can really serve as an alternative to the present computing technology. PMID:24972387

  1. Migration in asymmetric, random environments

    NASA Astrophysics Data System (ADS)

    Deem, Michael; Wang, Dong

    Migration is a key mechanism for expansion of communities. As a population migrates, it experiences a changing environment. In heterogeneous environments, rapid adaption is key to the evolutionary success of the population. In the case of human migration, environmental heterogeneity is naturally asymmetric in the North-South and East-West directions. We here consider migration in random, asymmetric, modularly correlated environments. Knowledge about the environment determines the fitness of each individual. We find that the speed of migration is proportional to the inverse of environmental change, and in particular we find that North-South migration rates are lower than East-West migration rates. Fast communication within the population of pieces of knowledge between individuals, similar to horizontal gene transfer in genetic systems, can help to spread beneficial knowledge among individuals. We show that increased modularity of the relation between knowledge and fitness enhances the rate of evolution. We investigate the relation between optimal information exchange rate and modularity of the dependence of fitness on knowledge. These results for the dependence of migration rate on heterogeneity, asymmetry, and modularity are consistent with existing archaeological facts.

  2. Aggregated Recommendation through Random Forests

    PubMed Central

    2014-01-01

    Aggregated recommendation refers to the process of suggesting one kind of items to a group of users. Compared to user-oriented or item-oriented approaches, it is more general and, therefore, more appropriate for cold-start recommendation. In this paper, we propose a random forest approach to create aggregated recommender systems. The approach is used to predict the rating of a group of users to a kind of items. In the preprocessing stage, we merge user, item, and rating information to construct an aggregated decision table, where rating information serves as the decision attribute. We also model the data conversion process corresponding to the new user, new item, and both new problems. In the training stage, a forest is built for the aggregated training set, where each leaf is assigned a distribution of discrete rating. In the testing stage, we present four predicting approaches to compute evaluation values based on the distribution of each tree. Experimental results on the well-known MovieLens dataset show that the aggregated approach maintains an acceptable level of accuracy. PMID:25180204
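    An approximation of the pipeline with scikit-learn on synthetic data: the leaf-distribution averaging is what predict_proba exposes, and the expected-value readout stands in for one of the paper's four predicting approaches (which one is shown here is our choice):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Toy aggregated decision table: each row merges user-group and item-kind
# attributes; the decision attribute is a discrete rating 1-5 (synthetic).
rng = np.random.default_rng(0)
X = rng.random((500, 6))
y = rng.integers(1, 6, size=500)

forest = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Each tree's leaves hold a distribution over discrete ratings;
# predict_proba averages them, and taking the expected value turns the
# distribution into a single evaluation value per query.
proba = forest.predict_proba(X[:3])
print(proba @ forest.classes_)  # expected rating for three queries
```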

  3. Hierarchy in directed random networks

    NASA Astrophysics Data System (ADS)

    Mones, Enys

    2013-02-01

    In recent years, the theory and application of complex networks have been quickly developing in a remarkable way due to the increasing amount of data from real systems and the fruitful application of powerful methods used in statistical physics. Many important characteristics of social or biological systems can be described by the study of their underlying structure of interactions. Hierarchy is one of these features that can be formulated in the language of networks. In this paper we present some (qualitative) analytic results on the hierarchical properties of random network models with zero correlations and also investigate, mainly numerically, the effects of different types of correlations. The behavior of the hierarchy is different in the absence and the presence of giant components. We show that the hierarchical structure can be drastically different if there are one-point correlations in the network. We also show numerical results suggesting that the hierarchy does not change monotonically with the correlations and there is an optimal level of nonzero correlations maximizing the level of hierarchy.

  4. Organization of growing random networks

    SciTech Connect

    Krapivsky, P. L.; Redner, S.

    2001-06-01

    The organizational development of growing random networks is investigated. These growing networks are built by adding nodes successively, and linking each to an earlier node of degree k with an attachment probability A_k. When A_k grows more slowly than linearly with k, the number of nodes with k links, N_k(t), decays faster than a power law in k, while for A_k growing faster than linearly in k, a single node emerges which connects to nearly all other nodes. When A_k is asymptotically linear, N_k(t) ∼ t k^(−ν), with ν dependent on details of the attachment probability, but in the range 2 < ν < ∞. The combined age and degree distribution of nodes shows that old nodes typically have a large degree. There is also a significant correlation in the degrees of neighboring nodes, so that nodes of similar degree are more likely to be connected. The size distributions of the in and out components of the network with respect to a given node, namely its "descendants" and "ancestors", are also determined. The in component exhibits a robust s^(−2) power-law tail, where s is the component size. The out component has a typical size of order ln t, and it provides basic insights into the genealogy of the network.

  5. Aggregated recommendation through random forests.

    PubMed

    Zhang, Heng-Ru; Min, Fan; He, Xu

    2014-01-01

    Aggregated recommendation refers to the process of suggesting one kind of items to a group of users. Compared to user-oriented or item-oriented approaches, it is more general and, therefore, more appropriate for cold-start recommendation. In this paper, we propose a random forest approach to create aggregated recommender systems. The approach is used to predict the rating of a group of users to a kind of items. In the preprocessing stage, we merge user, item, and rating information to construct an aggregated decision table, where rating information serves as the decision attribute. We also model the data conversion process corresponding to the new user, new item, and both new problems. In the training stage, a forest is built for the aggregated training set, where each leaf is assigned a distribution of discrete rating. In the testing stage, we present four predicting approaches to compute evaluation values based on the distribution of each tree. Experimental results on the well-known MovieLens dataset show that the aggregated approach maintains an acceptable level of accuracy. PMID:25180204

  6. Physical randomness sources for loophole-free Bell tests

    NASA Astrophysics Data System (ADS)

    Mitchell, Morgan W.

    2016-05-01

    We describe the strategy and physics used to select unpredictable measurement settings in the loophole-free Bell tests reported in [Hensen et al. Nature 2015, Giustina et al. PRL 2015, and Shalm et al. PRL 2015]. We demonstrate direct measurements of laser phase diffusion, a process driven by spontaneous emission, rigorous bounds on the effect of other, less-trusted contributions, and exponential predictability reduction by randomness extraction. As required for the cited experiments, we show the six-sigma bound for the predictability of the basis choices is below 0.001%. C. Abellan et al. PRL 2015.

  7. Random generalized linear model: a highly accurate and interpretable ensemble predictor

    PubMed Central

    2013-01-01

    Background Ensemble predictors such as the random forest are known to have superior accuracy but their black-box predictions are difficult to interpret. In contrast, a generalized linear model (GLM) is very interpretable especially when forward feature selection is used to construct the model. However, forward feature selection tends to overfit the data and leads to low predictive accuracy. Therefore, it remains an important research goal to combine the advantages of ensemble predictors (high accuracy) with the advantages of forward regression modeling (interpretability). To address this goal several articles have explored GLM based ensemble predictors. Since limited evaluations suggested that these ensemble predictors were less accurate than alternative predictors, they have found little attention in the literature. Results Comprehensive evaluations involving hundreds of genomic data sets, the UCI machine learning benchmark data, and simulations are used to give GLM based ensemble predictors a new and careful look. A novel bootstrap aggregated (bagged) GLM predictor that incorporates several elements of randomness and instability (random subspace method, optional interaction terms, forward variable selection) often outperforms a host of alternative prediction methods including random forests and penalized regression models (ridge regression, elastic net, lasso). This random generalized linear model (RGLM) predictor provides variable importance measures that can be used to define a “thinned” ensemble predictor (involving few features) that retains excellent predictive accuracy. Conclusion RGLM is a state of the art predictor that shares the advantages of a random forest (excellent predictive accuracy, feature importance measures, out-of-bag estimates of accuracy) with those of a forward selected generalized linear model (interpretability). These methods are implemented in the freely available R software package randomGLM. PMID:23323760
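    The authors provide the R package randomGLM; as a rough Python stand-in, bagging logistic regressions over random feature subspaces reproduces two of the RGLM randomness ingredients (the forward variable selection inside each bag is omitted):

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.linear_model import LogisticRegression

# Synthetic binary outcome driven by a few informative features.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 20))
y = (X[:, 0] - 2 * X[:, 3] + rng.normal(size=300) > 0).astype(int)

# Bootstrap aggregation of GLMs over random feature subspaces -- two of
# RGLM's randomness ingredients; RGLM additionally performs forward
# variable selection inside each bag, which this sketch leaves out.
model = BaggingClassifier(LogisticRegression(max_iter=1000),
                          n_estimators=100, max_features=0.5,
                          bootstrap=True, random_state=0).fit(X, y)
print(model.score(X, y))
```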

  8. High-Tg Polynorbornene-Based Block and Random Copolymers for Butanol Pervaporation Membranes

    NASA Astrophysics Data System (ADS)

    Register, Richard A.; Kim, Dong-Gyun; Takigawa, Tamami; Kashino, Tomomasa; Burtovyy, Oleksandr; Bell, Andrew

    Vinyl addition polymers of substituted norbornene (NB) monomers possess desirably high glass transition temperatures (Tg); however, until very recently, the lack of an applicable living polymerization chemistry has precluded the synthesis of such polymers with controlled architecture, or copolymers with controlled sequence distribution. We have recently synthesized block and random copolymers of NB monomers bearing hydroxyhexafluoroisopropyl and n-butyl substituents (HFANB and BuNB) via living vinyl addition polymerization with Pd-based catalysts. Both series of polymers were cast into the selective skin layers of thin film composite (TFC) membranes, and these organophilic membranes investigated for the isolation of n-butanol from dilute aqueous solution (model fermentation broth) via pervaporation. The block copolymers show well-defined microphase-separated morphologies, both in bulk and as the selective skin layers on TFC membranes, while the random copolymers are homogeneous. Both block and random vinyl addition copolymers are effective as n-butanol pervaporation membranes, with the block copolymers showing a better flux-selectivity balance. While polyHFANB has much higher permeability and n-butanol selectivity than polyBuNB, incorporating BuNB units into the polymer (in either a block or random sequence) limits the swelling of the polyHFANB and thereby improves the n-butanol pervaporation selectivity.

  9. A random interacting network model for complex networks

    NASA Astrophysics Data System (ADS)

    Goswami, Bedartha; Shekatkar, Snehal M.; Rheinwalt, Aljoscha; Ambika, G.; Kurths, Jürgen

    2015-12-01

    We propose a RAndom Interacting Network (RAIN) model to study the interactions between a pair of complex networks. The model involves two major steps: (i) the selection of a pair of nodes, one from each network, based on intra-network node-based characteristics, and (ii) the placement of a link between selected nodes based on the similarity of their relative importance in their respective networks. Node selection is based on a selection fitness function and node linkage is based on a linkage probability defined on the linkage scores of nodes. The model allows us to relate within-network characteristics to between-network structure. We apply the model to the interaction between the USA and Schengen airline transportation networks (ATNs). Our results indicate that two mechanisms: degree-based preferential node selection and degree-assortative link placement are necessary to replicate the observed inter-network degree distributions as well as the observed inter-network assortativity. The RAIN model offers the possibility to test multiple hypotheses regarding the mechanisms underlying network interactions. It can also incorporate complex interaction topologies. Furthermore, the framework of the RAIN model is general and can be potentially adapted to various real-world complex systems.
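    A schematic implementation of the two steps, with degree standing in for the selection fitness and rank similarity for the linkage score (both choices are assumptions for illustration):

```python
import numpy as np

def rain_links(deg_a, deg_b, n_links, beta=1.0, seed=0):
    """Sketch of the RAIN model's two steps: (i) pick one node from each
    network with probability proportional to degree, then (ii) keep the
    link with a probability that favors similar relative importance."""
    rng = np.random.default_rng(seed)
    pa, pb = deg_a / deg_a.sum(), deg_b / deg_b.sum()
    ra = deg_a.argsort().argsort() / (len(deg_a) - 1)   # normalized degree ranks
    rb = deg_b.argsort().argsort() / (len(deg_b) - 1)
    links = set()
    while len(links) < n_links:
        i = rng.choice(len(deg_a), p=pa)                # fitness-based selection
        j = rng.choice(len(deg_b), p=pb)
        if rng.random() < np.exp(-beta * abs(ra[i] - rb[j])):  # assortative placement
            links.add((i, j))
    return links

deg_a = np.array([1, 2, 2, 3, 8])
deg_b = np.array([1, 1, 2, 5, 9, 9])
print(rain_links(deg_a, deg_b, n_links=5))
```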

  10. A random interacting network model for complex networks

    PubMed Central

    Goswami, Bedartha; Shekatkar, Snehal M.; Rheinwalt, Aljoscha; Ambika, G.; Kurths, Jürgen

    2015-01-01

    We propose a RAndom Interacting Network (RAIN) model to study the interactions between a pair of complex networks. The model involves two major steps: (i) the selection of a pair of nodes, one from each network, based on intra-network node-based characteristics, and (ii) the placement of a link between selected nodes based on the similarity of their relative importance in their respective networks. Node selection is based on a selection fitness function and node linkage is based on a linkage probability defined on the linkage scores of nodes. The model allows us to relate within-network characteristics to between-network structure. We apply the model to the interaction between the USA and Schengen airline transportation networks (ATNs). Our results indicate that two mechanisms: degree-based preferential node selection and degree-assortative link placement are necessary to replicate the observed inter-network degree distributions as well as the observed inter-network assortativity. The RAIN model offers the possibility to test multiple hypotheses regarding the mechanisms underlying network interactions. It can also incorporate complex interaction topologies. Furthermore, the framework of the RAIN model is general and can be potentially adapted to various real-world complex systems. PMID:26657032

  11. A random interacting network model for complex networks.

    PubMed

    Goswami, Bedartha; Shekatkar, Snehal M; Rheinwalt, Aljoscha; Ambika, G; Kurths, Jürgen

    2015-01-01

    We propose a RAndom Interacting Network (RAIN) model to study the interactions between a pair of complex networks. The model involves two major steps: (i) the selection of a pair of nodes, one from each network, based on intra-network node-based characteristics, and (ii) the placement of a link between selected nodes based on the similarity of their relative importance in their respective networks. Node selection is based on a selection fitness function and node linkage is based on a linkage probability defined on the linkage scores of nodes. The model allows us to relate within-network characteristics to between-network structure. We apply the model to the interaction between the USA and Schengen airline transportation networks (ATNs). Our results indicate that two mechanisms: degree-based preferential node selection and degree-assortative link placement are necessary to replicate the observed inter-network degree distributions as well as the observed inter-network assortativity. The RAIN model offers the possibility to test multiple hypotheses regarding the mechanisms underlying network interactions. It can also incorporate complex interaction topologies. Furthermore, the framework of the RAIN model is general and can be potentially adapted to various real-world complex systems. PMID:26657032

  12. Dual genetic selection of synthetic riboswitches in Escherichia coli.

    PubMed

    Nomura, Yoko; Yokobayashi, Yohei

    2014-01-01

    This chapter describes a genetic selection strategy to engineer synthetic riboswitches that can chemically regulate gene expression in Escherichia coli. Riboswitch libraries are constructed by randomizing the nucleotides that potentially comprise an expression platform and fused to the hybrid selection/screening marker tetA-gfpuv. Iterative ON and OFF selections are performed under appropriate conditions that favor the survival or the growth of the cells harboring the desired riboswitches. After the selection, rapid screening of individual riboswitch clones is performed by measuring GFPuv fluorescence without subcloning. This optimized dual genetic selection strategy can be used to rapidly develop synthetic riboswitches without detailed computational design or structural knowledge. PMID:24549616

  13. Ewens sampling formulae with and without selection

    NASA Astrophysics Data System (ADS)

    Huillet, Thierry

    2007-09-01

    We shall first consider the random Dirichlet partitioning of the interval into n fragments at temperature θ > 0. Using calculus for Dirichlet integrals, pre-asymptotic versions of the Ewens sampling formulae from finite Dirichlet partitions follow. From these preliminaries, straightforward proofs of the usual sampling formulae from random proportions with Poisson-Dirichlet PD(γ) distribution can be obtained by considering the Kingman limit n ↗ ∞, θ ↘ 0, with nθ = γ > 0. In this manuscript, the Gibbs version of the Dirichlet partition with symmetric selection is considered. By use of similar series-expansion calculus for Dirichlet integrals, closed-form expressions of the Ewens sampling formulae in the presence of selection are obtained; special types of Bell polynomials are shown to be involved.
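
    For orientation, the neutral (selection-free) Ewens sampling formula that these pre-asymptotic and selection-modified versions build on can be stated as follows; this is the standard textbook formula, not a result specific to the paper.

```latex
% Ewens sampling formula (neutral case): probability that a sample of
% size n contains a_j allelic types represented exactly j times each,
% subject to sum_j j a_j = n; theta^{(n)} is the rising factorial.
\[
  \Pr(a_1,\dots,a_n)
    = \frac{n!}{\theta^{(n)}} \prod_{j=1}^{n} \frac{\theta^{a_j}}{j^{a_j}\, a_j!},
  \qquad
  \theta^{(n)} = \theta(\theta+1)\cdots(\theta+n-1).
\]
```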

  14. Selective mutism - resources

    MedlinePlus

    Resources - selective mutism ... The following organizations are good resources for information on selective mutism: American Speech-Language-Hearing Association -- www.asha.org/public/speech/disorders/selectivemutism.htm Selective Mutism and ...

  15. Individual Differences Methods for Randomized Experiments

    ERIC Educational Resources Information Center

    Tucker-Drob, Elliot M.

    2011-01-01

    Experiments allow researchers to randomly vary the key manipulation, the instruments of measurement, and the sequences of the measurements and manipulations across participants. To date, however, the advantages of randomized experiments to manipulate both the aspects of interest and the aspects that threaten internal validity have been primarily…

  16. Brownian Optimal Stopping and Random Walks

    SciTech Connect

    Lamberton, D.

    2002-06-05

    One way to compute the value function of an optimal stopping problem along Brownian paths consists of approximating Brownian motion by a random walk. We derive error estimates for this type of approximation under various assumptions on the distribution of the approximating random walk.
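
    The approximation idea can be illustrated with a short backward-induction sketch on a symmetric ±√Δt random walk; the payoff function, horizon, and step count below are illustrative assumptions, and the paper's error estimates are not reproduced here.

```python
import math

def optimal_stopping_value(payoff, x0=0.0, T=1.0, n_steps=500):
    """Approximate sup_tau E[payoff(B_tau)], tau <= T, by backward dynamic
    programming on a symmetric +/- sqrt(dt) walk approximating Brownian motion."""
    dt = T / n_steps
    h = math.sqrt(dt)
    # States at step k are x0 + (2*j - k) * h for j = 0..k.
    values = [payoff(x0 + (2 * j - n_steps) * h) for j in range(n_steps + 1)]
    for k in range(n_steps - 1, -1, -1):
        values = [
            max(payoff(x0 + (2 * j - k) * h),       # stop now
                0.5 * (values[j] + values[j + 1]))  # continue one more step
            for j in range(k + 1)
        ]
    return values[0]

if __name__ == "__main__":
    # Example: an American-put-like payoff g(x) = max(1 - e^x, 0).
    v = optimal_stopping_value(lambda x: max(1.0 - math.exp(x), 0.0))
    print(f"approximate value: {v:.4f}")
```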

  17. The Design of Cluster Randomized Crossover Trials

    ERIC Educational Resources Information Center

    Rietbergen, Charlotte; Moerbeek, Mirjam

    2011-01-01

    The inefficiency induced by between-cluster variation in cluster randomized (CR) trials can be reduced by implementing a crossover (CO) design. In a simple CO trial, each subject receives each treatment in random order. A powerful characteristic of this design is that each subject serves as its own control. In a CR CO trial, clusters of subjects…

  18. Color Charts, Esthetics, and Subjective Randomness

    ERIC Educational Resources Information Center

    Sanderson, Yasmine B.

    2012-01-01

    Color charts, or grids of evenly spaced multicolored dots or squares, appear in the work of modern artists and designers. Often the artist/designer distributes the many colors in a way that could be described as "random," that is, without an obvious pattern. We conduct a statistical analysis of 125 "random-looking" art and design color charts and…

  19. Random numbers certified by Bell's theorem.

    PubMed

    Pironio, S; Acín, A; Massar, S; de la Giroday, A Boyer; Matsukevich, D N; Maunz, P; Olmschenk, S; Hayes, D; Luo, L; Manning, T A; Monroe, C

    2010-04-15

    Randomness is a fundamental feature of nature and a valuable resource for applications ranging from cryptography and gambling to numerical simulation of physical and biological systems. Random numbers, however, are difficult to characterize mathematically, and their generation must rely on an unpredictable physical process. Inaccuracies in the theoretical modelling of such processes or failures of the devices, possibly due to adversarial attacks, limit the reliability of random number generators in ways that are difficult to control and detect. Here, inspired by earlier work on non-locality-based and device-independent quantum information processing, we show that the non-local correlations of entangled quantum particles can be used to certify the presence of genuine randomness. It is thereby possible to design a cryptographically secure random number generator that does not require any assumption about the internal working of the device. Such a strong form of randomness generation is impossible classically and possible in quantum systems only if certified by a Bell inequality violation. We carry out a proof-of-concept demonstration of this proposal in a system of two entangled atoms separated by approximately one metre. The observed Bell inequality violation, featuring near perfect detection efficiency, guarantees that 42 new random numbers are generated with 99 per cent confidence. Our results lay the groundwork for future device-independent quantum information experiments and for addressing fundamental issues raised by the intrinsic randomness of quantum theory. PMID:20393558
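
    The certification criterion itself is simple to state: from four measured correlators one forms the CHSH value S, and any S > 2 is impossible for predetermined outcomes, so observing it certifies genuine randomness. A toy calculation follows, using the ideal correlator E(x, y) = cos(2(x − y)) for a maximally entangled polarization pair at the standard CHSH angles; this is an illustrative model, not the paper's atomic data.

```python
import math

def chsh(E, a, a2, b, b2):
    """CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b')."""
    return E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

# Ideal correlator for a maximally entangled pair (toy model).
E = lambda x, y: math.cos(2 * (x - y))

S = chsh(E, a=0.0, a2=math.pi / 4, b=math.pi / 8, b2=3 * math.pi / 8)
print(f"S = {S:.4f}; classical bound 2; quantum max {2 * math.sqrt(2):.4f}")
# Any S > 2 rules out predetermined outcomes, certifying fresh randomness.
```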

  20. Random ambience using high fidelity images

    NASA Astrophysics Data System (ADS)

    Abu, Nur Azman; Sahib, Shahrin

    2011-06-01

    Most secure communication nowadays mandates truly random keys as input. These operations are mostly designed and managed by the developers of the cryptosystem. Due to the nature of confidential cryptographic development today, pseudorandom keys are typically designed and still preferred by those developers. However, pseudorandom keys are predictable, periodic, and repeatable, and hence carry minimal entropy. Truly random keys are believed to be obtainable only via hardware random number generators, and careful statistical analysis is still required to have any confidence that the process and apparatus generate numbers sufficiently random for cryptographic use. In this research, each moment in life is considered unique in itself, and a random key is generated for the given moment whenever the user needs one in practical secure communication. The ambience captured in a high-fidelity digital image is tested for randomness according to the NIST Statistical Test Suite, and a recommendation for generating simple random cryptographic keys live at 4 megabits per second is reported.
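
    The pipeline the abstract sketches — harvest entropy from a freshly captured image, then test the bitstream — can be illustrated as follows. This is a hedged sketch, not the authors' method: extracting the least significant bit of each pixel and applying only the NIST monobit (frequency) test are illustrative simplifications of the full test suite.

```python
import math

def lsb_bits(pixels):
    """Harvest candidate random bits: least significant bit of each sample."""
    return [p & 1 for p in pixels]

def monobit_p_value(bits):
    """NIST SP 800-22 frequency (monobit) test: p >= 0.01 passes."""
    n = len(bits)
    s = sum(2 * b - 1 for b in bits)      # map {0,1} -> {-1,+1} and sum
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

if __name__ == "__main__":
    import random
    # Stand-in for a captured high-fidelity image: random 8-bit samples.
    pixels = [random.getrandbits(8) for _ in range(10**5)]
    p = monobit_p_value(lsb_bits(pixels))
    print(f"monobit p-value = {p:.3f} ->", "pass" if p >= 0.01 else "fail")
```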

  1. A random Q-switched fiber laser.

    PubMed

    Tang, Yulong; Xu, Jianqiu

    2015-01-01

    Extensive studies have been performed on random lasers in which multiple-scattering feedback is used to generate coherent emission. Q-switching and mode-locking are well-known routes for achieving high peak power output in conventional lasers. However, in random lasers, the ubiquitous random cavities that are formed by multiple scattering inhibit energy storage, making Q-switching impossible. In this paper, widespread Rayleigh scattering arising from the intrinsic micro-scale refractive-index irregularities of fiber cores is used to form random cavities along the fiber. The Q-factor of the cavity is rapidly increased by stimulated Brillouin scattering just after the spontaneous emission is enhanced by random cavity resonances, resulting in random Q-switched pulses with high brightness and high peak power. This is the first reported observation of high-brightness random Q-switched laser emission, and it is expected to stimulate new areas of scientific research and applications, including encryption, remote three-dimensional random imaging and the simulation of stellar lasing. PMID:25797520

  2. Evaluation of the Randomized Multiple Choice Format.

    ERIC Educational Resources Information Center

    Harke, Douglas James

    Each physics problem used in evaluating the effectiveness of Randomized Multiple Choice (RMC) tests was stated in the conventional manner and was followed by several multiple choice items corresponding to the steps in a written solution but presented in random order. Students were instructed to prepare a written answer and to use it to answer the…

  3. Effect Sizes in Cluster-Randomized Designs

    ERIC Educational Resources Information Center

    Hedges, Larry V.

    2007-01-01

    Multisite research designs involving cluster randomization are becoming increasingly important in educational and behavioral research. Researchers would like to compute effect size indexes based on the standardized mean difference to compare the results of cluster-randomized studies (and corresponding quasi-experiments) with other studies and to…
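
    For context, the standard quantities involved in such effect sizes are the following (textbook definitions, not taken from the truncated abstract): with between-cluster variance σ_B², within-cluster variance σ_W², and clusters of size m,

```latex
% Total-variance standardized mean difference, intraclass correlation,
% and the design effect for clusters of size m (standard definitions).
\[
  \delta_T = \frac{\mu_T - \mu_C}{\sqrt{\sigma_B^2 + \sigma_W^2}},
  \qquad
  \rho = \frac{\sigma_B^2}{\sigma_B^2 + \sigma_W^2},
  \qquad
  \mathrm{Deff} = 1 + (m - 1)\,\rho .
\]
```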

  4. Synchronization Properties of Random Piecewise Isometries

    NASA Astrophysics Data System (ADS)

    Gorodetski, Anton; Kleptsyn, Victor

    2016-08-01

    We study the synchronization properties of random double rotations on tori. We give a criterion that shows when synchronization is present in the case of random double rotations on the circle and prove that it is always absent in dimensions two and higher.
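
    Synchronization here means that two trajectories driven by the same random sequence of maps converge to each other. A quick numerical probe of this on the circle follows; the specific double-rotation parameters are arbitrary illustrations, not taken from the paper.

```python
import random

def double_rotation(x, c, alpha, beta):
    """Piecewise isometry of the circle [0,1): rotate by alpha on [0,c),
    by beta on [c,1)."""
    return (x + (alpha if x < c else beta)) % 1.0

def circle_dist(x, y):
    d = abs(x - y) % 1.0
    return min(d, 1.0 - d)

def track_synchronization(n_steps=10**5, seed=0):
    rng = random.Random(seed)
    # Two double rotations to choose from at random (illustrative parameters).
    maps = [(0.5, 0.2137, 0.6180), (0.3, 0.3333, 0.7071)]
    x, y = rng.random(), rng.random()   # two independent initial points
    for _ in range(n_steps):
        c, a, b = rng.choice(maps)      # the SAME random map drives both points
        x, y = double_rotation(x, c, a, b), double_rotation(y, c, a, b)
    return circle_dist(x, y)

if __name__ == "__main__":
    # A distance near 0 after many steps suggests synchronization.
    print("final distance between trajectories:", track_synchronization())
```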

  5. Model selection for logistic regression models

    NASA Astrophysics Data System (ADS)

    Duller, Christine

    2012-09-01

    Model selection for logistic regression models decides which of some given potential regressors have an effect and hence should be included in the final model. The second interesting question is whether a certain factor is heterogeneous among some subsets, i.e. whether the model should include a random intercept or not. In this paper these questions will be answered with classical as well as with Bayesian methods. The application shows some results of recent research projects in medicine and business administration.
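
    A minimal classical version of the first question — which regressors to keep — can be scripted as an exhaustive information-criterion search. The synthetic data and the use of statsmodels below are illustrative assumptions; the second question (random intercept or not) would additionally require a mixed model and is not sketched here.

```python
import itertools
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 3))                    # candidate regressors x1..x3
logit_p = 0.8 * X[:, 0] - 1.2 * X[:, 1]        # x3 is truly irrelevant
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

best = None
for k in range(4):                             # all subsets of the 3 regressors
    for subset in itertools.combinations(range(3), k):
        design = sm.add_constant(X[:, subset]) if subset else np.ones((n, 1))
        fit = sm.Logit(y, design).fit(disp=0)
        if best is None or fit.aic < best[0]:
            best = (fit.aic, subset)

print(f"AIC-best subset: {[f'x{i+1}' for i in best[1]]} (AIC = {best[0]:.1f})")
```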

  6. Truly random bit generation based on a novel random Brillouin fiber laser.

    PubMed

    Xiang, Dao; Lu, Ping; Xu, Yanping; Gao, Song; Chen, Liang; Bao, Xiaoyi

    2015-11-15

    We propose a novel dual-emission random Brillouin fiber laser (RBFL) with bidirectional pumping operation. Numerical simulations and experimental verification of the chaotic temporal and statistical properties of the RBFL are conducted, revealing intrinsic unpredictable intensity fluctuations and two completely uncorrelated laser outputs. A random bit generator based on quantum noise sources in the random Fabry-Perot resonator of the RBFL is realized at a bit rate of 5 Mbps with verified randomness. PMID:26565888

  7. Theories and Quantification of Thymic Selection

    PubMed Central

    Yates, Andrew J.

    2013-01-01

    The peripheral T cell repertoire is sculpted from prototypic T cells in the thymus, which bear randomly generated T cell receptors (TCRs), by a series of developmental and selection steps that remove cells that are unresponsive or overly reactive to self-peptide–MHC complexes. The challenge of understanding how the kinetics of T cell development and the statistics of the selection processes combine to provide a diverse but self-tolerant T cell repertoire has invited quantitative modeling approaches, which are reviewed here. PMID:24550908

  8. Enhancing superconducting critical current by randomness

    NASA Astrophysics Data System (ADS)

    Wang, Y. L.; Thoutam, L. R.; Xiao, Z. L.; Shen, B.; Pearson, J. E.; Divan, R.; Ocola, L. E.; Crabtree, G. W.; Kwok, W. K.

    2016-01-01

    The key ingredient of high critical currents in a type-II superconductor is defect sites that pin vortices. Contrary to earlier understanding of nanopatterned artificial pinning, here we show unequivocally the advantages of a random pinscape over an ordered array in a wide magnetic-field range. We reveal that the better performance of a random pinscape is due to the variation of its local density of pinning sites (LDOPS), which mitigates the motion of vortices. This is confirmed by achieving an even higher enhancement of the critical current through a conformally mapped random pinscape, where the distribution of the LDOPS is further enlarged. The demonstrated key role of LDOPS in enhancing superconducting critical currents gets at the heart of random versus commensurate pinning. Our findings highlight the importance of random pinscapes in enhancing the superconducting critical currents of applied superconductors.

  9. Self-testing quantum random number generator.

    PubMed

    Lunghi, Tommaso; Brask, Jonatan Bohr; Lim, Charles Ci Wen; Lavigne, Quentin; Bowles, Joseph; Martin, Anthony; Zbinden, Hugo; Brunner, Nicolas

    2015-04-17

    The generation of random numbers is a task of paramount importance in modern science. A central problem for both classical and quantum randomness generation is to estimate the entropy of the data generated by a given device. Here we present a protocol for self-testing quantum random number generation, in which the user can monitor the entropy in real time. Based on a few general assumptions, our protocol guarantees continuous generation of high quality randomness, without the need for a detailed characterization of the devices. Using a fully optical setup, we implement our protocol and illustrate its self-testing capacity. Our work thus provides a practical approach to quantum randomness generation in a scenario of trusted but error-prone devices. PMID:25933297
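
    The core bookkeeping — monitoring entropy in real time from observed statistics — can be caricatured in a few lines. This is only a schematic stand-in: a frequency-based min-entropy estimate on a sliding window, not the paper's protocol, which derives its bound from general physical assumptions rather than raw symbol frequencies.

```python
import math
from collections import Counter, deque

class EntropyMonitor:
    """Sliding-window min-entropy estimate H_min = -log2(max_x p(x))."""

    def __init__(self, window=4096):
        self.window = deque(maxlen=window)

    def push(self, symbol):
        self.window.append(symbol)

    def min_entropy_per_symbol(self):
        counts = Counter(self.window)
        p_max = max(counts.values()) / len(self.window)
        return -math.log2(p_max)

if __name__ == "__main__":
    import random
    mon = EntropyMonitor()
    for _ in range(10000):
        mon.push(random.getrandbits(1))   # stand-in for device output bits
    print(f"estimated H_min per bit = {mon.min_entropy_per_symbol():.3f}")
```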

  10. Generation of Random Numbers by Micromechanism

    NASA Astrophysics Data System (ADS)

    Mita, Makoto; Toshiyoshi, Hiroshi; Ataka, Manabu; Fujita, Hiroyuki

    We have successfully developed a novel micromechanism for random number generation (RNG) using silicon micromachining techniques. The MEM (Micro Electro Mechanical) RNG produces a series of random numbers by exploiting the pull-in instability of electrostatic actuation, operated at a typical dc voltage of 150 V. The MEM RNG is made by deep reactive ion etching of a silicon-on-insulator (SOI) wafer and is very small compared with conventional RNG hardware based on the randomness of thermal noise or isotope radiation. The quality of randomness has been experimentally confirmed by a self-correlation study of the generated series of numbers. The MEM RNG proposed here would be a true random number generator, which is needed for the highly secure encryption systems of today's information technology.
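
    The quality check mentioned at the end — a self-correlation study — amounts to verifying that the lag-k autocorrelation of the output sequence is near zero for every k ≥ 1. A simple version of such a check is sketched below (illustrative, not the authors' code).

```python
def autocorrelation(bits, lag):
    """Sample autocorrelation of a 0/1 sequence at the given lag."""
    n = len(bits) - lag
    mean = sum(bits) / len(bits)
    var = sum((b - mean) ** 2 for b in bits) / len(bits)
    cov = sum((bits[i] - mean) * (bits[i + lag] - mean) for i in range(n)) / n
    return cov / var

if __name__ == "__main__":
    import random
    bits = [random.getrandbits(1) for _ in range(10**5)]  # stand-in RNG output
    for lag in (1, 2, 5, 10):
        print(f"lag {lag:2d}: r = {autocorrelation(bits, lag):+.4f}")
    # For a good generator every r should be near 0 (within ~2/sqrt(n) noise).
```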

  11. Randomization in clinical trials: conclusions and recommendations.

    PubMed

    Lachin, J M; Matts, J P; Wei, L J

    1988-12-01

    The statistical properties of simple (complete) randomization, permuted-block (or simply blocked) randomization, and the urn adaptive biased-coin randomization are summarized. These procedures are contrasted to covariate adaptive procedures such as minimization and to response adaptive procedures such as the play-the-winner rule. General recommendations are offered regarding the use of complete, permuted-block, or urn randomization. In a large double-masked trial, any of these procedures may be acceptable. For a given trial, the relative merits of each procedure should be carefully weighed in relation to the characteristics of the trial. Important considerations are the size of the trial, overall as well as within the smallest subgroup to be employed in a subgroup-specific analysis, whether or not the trial is to be masked, and the resources needed to perform the proper randomization-based permutational analysis. PMID:3203526
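
    Of the procedures summarized, the urn adaptive biased-coin design is perhaps the least familiar; below is a compact sketch of the classical UD(α, β) urn rule alongside permuted blocks. The parameter values, seeds, and two-arm labels are illustrative choices, not recommendations from the paper.

```python
import random

def urn_randomization(n, alpha=1, beta=1, seed=0):
    """UD(alpha, beta) urn design: start with alpha balls per arm; draw a
    ball to assign the arm, then add beta balls of the opposite colour,
    which biases later draws toward the under-represented arm."""
    rng = random.Random(seed)
    urn = {"A": alpha, "B": alpha}
    out = []
    for _ in range(n):
        arm = rng.choices(["A", "B"], weights=[urn["A"], urn["B"]])[0]
        urn["B" if arm == "A" else "A"] += beta
        out.append(arm)
    return out

def permuted_blocks(n, block_size=4, seed=0):
    """Permuted-block randomization with a fixed block size."""
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        block = ["A", "B"] * (block_size // 2)
        rng.shuffle(block)
        out.extend(block)
    return out[:n]

if __name__ == "__main__":
    print("urn:    ", "".join(urn_randomization(20)))
    print("blocks: ", "".join(permuted_blocks(20)))
```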

  12. Bright emission from a random Raman laser

    PubMed Central

    Hokr, Brett H.; Bixler, Joel N.; Cone, Michael T.; Mason, John D.; Beier, Hope T.; Noojin, Gary D.; Petrov, Georgi I.; Golovan, Leonid A.; Thomas, Robert J.; Rockwell, Benjamin A.; Yakovlev, Vladislav V.

    2014-01-01

    Random lasers are a developing class of light sources that utilize a highly disordered gain medium as opposed to a conventional optical cavity. Although traditional random lasers often have a relatively broad emission spectrum, a random laser that utilizes vibrational transitions via Raman scattering allows for an extremely narrow bandwidth, on the order of 10 cm⁻¹. Here we present the first experimental evidence of lasing via a Raman interaction in a bulk three-dimensional random medium, with conversion efficiencies on the order of a few percent. Furthermore, Monte Carlo simulations are used to study the complex spatial and temporal dynamics of nonlinear processes in turbid media. In addition to providing a large signal characteristic of the Raman medium, the random Raman laser offers an entirely new tool for studying the dynamics of gain in a turbid medium. PMID:25014073

  13. Gold nanostars for random lasing enhancement.

    PubMed

    Ziegler, Johannes; Djiango, Martin; Vidal, Cynthia; Hrelescu, Calin; Klar, Thomas A

    2015-06-15

    We demonstrate random lasing with star-shaped gold nanoparticles ("nanostars") as scattering centers embedded in a dye-doped gain medium. It is experimentally shown that star-shaped gold nanoparticles outperform those of conventional shapes, such as spherical or prolate nanoparticles. The nanoparticles are randomly distributed within a thin film of gain medium, forming resonators which support coherent laser modes. Driven by single-pulsed excitation, the random lasers exhibit coherent lasing thresholds on the order of 0.9 mJ/cm² and spectrally narrow emission peaks with linewidths of less than 0.2 nm. The distinguished performance of the random laser comprising nanostars likely stems from the high plasmonic field enhancements localized at the spiky tips of the nanostars, which improve the feedback mechanism for lasing and increase the emission intensity of the random laser. PMID:26193498

  14. Self-Testing Quantum Random Number Generator

    NASA Astrophysics Data System (ADS)

    Lunghi, Tommaso; Brask, Jonatan Bohr; Lim, Charles Ci Wen; Lavigne, Quentin; Bowles, Joseph; Martin, Anthony; Zbinden, Hugo; Brunner, Nicolas

    2015-04-01

    The generation of random numbers is a task of paramount importance in modern science. A central problem for both classical and quantum randomness generation is to estimate the entropy of the data generated by a given device. Here we present a protocol for self-testing quantum random number generation, in which the user can monitor the entropy in real time. Based on a few general assumptions, our protocol guarantees continuous generation of high quality randomness, without the need for a detailed characterization of the devices. Using a fully optical setup, we implement our protocol and illustrate its self-testing capacity. Our work thus provides a practical approach to quantum randomness generation in a scenario of trusted but error-prone devices.

  15. Estimating the Causal Effect of Randomization versus Treatment Preference in a Doubly Randomized Preference Trial

    ERIC Educational Resources Information Center

    Marcus, Sue M.; Stuart, Elizabeth A.; Wang, Pei; Shadish, William R.; Steiner, Peter M.

    2012-01-01

    Although randomized studies have high internal validity, generalizability of the estimated causal effect from randomized clinical trials to real-world clinical or educational practice may be limited. We consider the implication of randomized assignment to treatment, as compared with choice of preferred treatment as it occurs in real-world…

  16. Use of group-randomized trials in pet population research.

    PubMed

    Lord, L K; Wittum, T E; Scarlett, J M

    2007-12-14

    Communities invest considerable resources to address the animal welfare and public health concerns resulting from unwanted pet animals. Traditionally, research in this area has enumerated the pet-owning population, described pet population dynamics in individual communities, and estimated national euthanasia figures. Recent research has investigated the human-animal bond and explored the community implications of managed feral cat colonies. These reports have utilized traditional epidemiologic study designs to generate observational data to describe populations and measure associations. However, rigorous scientific evaluations of potential interventions at the group level have been lacking. Group-randomized trials have been used extensively in public health research to evaluate interventions that change a population's behavior, not just the behavior of selected individuals. We briefly describe the strengths and limitations of group-randomized trials as they are used to evaluate interventions that promote social and behavioral changes in the human public health field. We extend these examples to suggest the appropriate application of group-randomized trials for pet population dynamics research. PMID:17707934

  17. Robustness of optimal random searches in fragmented environments

    NASA Astrophysics Data System (ADS)

    Wosniack, M. E.; Santos, M. C.; Raposo, E. P.; Viswanathan, G. M.; da Luz, M. G. E.

    2015-05-01

    The random search problem is a challenging and interdisciplinary topic of research in statistical physics. Realistic searches usually take place in nonuniform heterogeneous distributions of targets, e.g., patchy environments and fragmented habitats in ecological systems. Here we present a comprehensive numerical study of search efficiency in arbitrarily fragmented landscapes with unlimited visits to targets that can only be found within patches. We assume a random walker selecting uniformly distributed turning angles and step lengths from an inverse power-law tailed distribution with exponent μ. Our main finding is that for a large class of fragmented environments the optimal strategy corresponds approximately to the same value μ_opt ≈ 2. Moreover, this exponent is indistinguishable from the well-known exact optimal value μ_opt = 2 for the low-density limit of homogeneously distributed revisitable targets. Surprisingly, the best search strategies do not depend (or depend only weakly) on the specific details of the fragmentation. Finally, we discuss the mechanisms behind this observed robustness and comment on the relevance of our results both to random search theory in general and to the foraging problem in the biological context.
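
    The searcher described — uniform turning angles plus inverse power-law step lengths with exponent μ — is straightforward to simulate. In the sketch below, the target layout, detection radius, truncation bounds, and the endpoint-only detection rule are all illustrative simplifications, not the paper's setup.

```python
import math
import random

def powerlaw_step(mu, l_min=1.0, l_max=1e3, rng=random):
    """Sample a step length with density ~ l^(-mu) on [l_min, l_max]
    by inverting the truncated power-law CDF."""
    u = rng.random()
    a = 1.0 - mu
    return (l_min**a + u * (l_max**a - l_min**a)) ** (1.0 / a)

def search_efficiency(mu, n_steps=20000, n_targets=200, size=1000.0,
                      radius=1.0, seed=0):
    """Targets encountered per unit distance travelled (crude: detection
    is checked only at step endpoints, and targets are revisitable)."""
    rng = random.Random(seed)
    targets = [(rng.uniform(0, size), rng.uniform(0, size))
               for _ in range(n_targets)]
    x, y, travelled, found = size / 2, size / 2, 0.0, 0
    for _ in range(n_steps):
        theta = rng.uniform(0, 2 * math.pi)      # uniform turning angle
        l = powerlaw_step(mu, rng=rng)
        x = (x + l * math.cos(theta)) % size     # periodic boundaries
        y = (y + l * math.sin(theta)) % size
        travelled += l
        found += sum(1 for tx, ty in targets
                     if math.hypot(x - tx, y - ty) < radius)
    return found / travelled

if __name__ == "__main__":
    for mu in (1.5, 2.0, 2.5, 3.0):
        print(f"mu = {mu}: efficiency = {search_efficiency(mu):.5f}")
```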

  18. Repeated randomization and matching in multi-arm trials.

    PubMed

    Xu, Zhenzhen; Kalbfleisch, John D

    2013-12-01

    Cluster randomized trials with relatively few clusters have been widely used in recent years for the evaluation of health-care strategies. The balance match weighted (BMW) design, introduced in Xu and Kalbfleisch (2010, Biometrics 66, 813-823), applies the optimal full matching with constraints technique to a prospective randomized design with the aim of minimizing the mean squared error (MSE) of the treatment effect estimator. This is accomplished by considering M independent randomizations of the experimental units and then selecting the one that provides the most balance, evaluated by matching on the estimated propensity scores. Often in practice, clinical trials may involve more than two treatment arms, and multiple treatment options need to be evaluated. Therefore, we consider extensions of the BMW propensity score matching method to allow for studies with three or more arms. In this article, we propose three approaches to extend the BMW design to clinical trials with more than two arms and evaluate the properties of the extended designs in simulation studies. PMID:24134592
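
    The core of the BMW idea — generate M candidate randomizations and keep the most balanced one — is easy to sketch. Below, balance is scored by the maximum absolute standardized mean difference across covariates, a common balance metric used here as an illustrative stand-in for the propensity-score matching criterion of the actual design.

```python
import numpy as np

def balance_score(X, assignment):
    """Maximum absolute standardized mean difference across covariates."""
    t, c = X[assignment == 1], X[assignment == 0]
    pooled_sd = np.sqrt((t.var(axis=0, ddof=1) + c.var(axis=0, ddof=1)) / 2)
    return np.max(np.abs(t.mean(axis=0) - c.mean(axis=0)) / pooled_sd)

def best_of_m_randomizations(X, M=100, seed=0):
    """Generate M independent 1:1 randomizations; keep the most balanced."""
    rng = np.random.default_rng(seed)
    n = len(X)
    base = np.array([1] * (n // 2) + [0] * (n - n // 2))
    best_assign, best_score = None, np.inf
    for _ in range(M):
        assign = rng.permutation(base)
        score = balance_score(X, assign)
        if score < best_score:
            best_assign, best_score = assign, score
    return best_assign, best_score

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(40, 3))    # 40 units, 3 baseline covariates
    assign, score = best_of_m_randomizations(X)
    print(f"selected randomization: max |SMD| = {score:.3f}")
```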

  19. Health Screening and Random Recruitment for Cognitive Aging Research

    PubMed Central

    Christensen, Kathy J.; Moye, Jennifer; Armson, Rossana Rae; Kern, Thomas M.

    2016-01-01

    A survey of 197 cognitive aging studies revealed infrequent use of structured health assessments and random recruitment. In this study, a health screening questionnaire developed to identify subjects with medical problems that might impair cognition was administered to 315 adults aged 60 and older who were recruited by random digit dialing. On the basis of self-reported medical problems, 35% of the subjects were excluded. Those excluded were older (p < .001) and tended to be male but did not differ in education from those who passed the screening. Subjects who passed the screening and decided to participate in a neuropsychological research project were younger (p < .001), better educated (p < .001), and more likely to be male (p < .001) than nonparticipants. These findings suggest that careful assessment, selection, and description of subjects, with particular attention to health status, are needed to aid interpretation of cognitive aging research. Although random recruitment of the elderly is feasible, obtaining representative samples may require stratification on demographic variables. PMID:1610509

  20. The social selection alternative to sexual selection

    PubMed Central

    Roughgarden, Joan

    2012-01-01

    Social selection offers an alternative to sexual selection by reversing its logic. Social selection starts with offspring production and works back to mating, and starts with behavioural dynamics and works up to gene pool dynamics. In social selection, courtship can potentially be deduced as a negotiation, leading to an optimal allocation of tasks during offspring rearing. Ornaments facilitate this negotiation and also comprise ‘admission tickets’ to cliques. Mating pairs may form ‘teams’ based on the reciprocal sharing of pleasure. The parent–offspring relation can be managed by the parent considered as the owner of a ‘family firm’ whose product is offspring. The cooperation in reproductive social behaviour evolves as a mutual direct benefit through individual selection rather than as some form of altruism requiring kin or multi-level selection. PMID:22777017